diff --git a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Farming Simulator 17 KUHN-RELOADED Free Download.md b/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Farming Simulator 17 KUHN-RELOADED Free Download.md deleted file mode 100644 index d2e9112f2c8835c5ca996794a465c6785585061a..0000000000000000000000000000000000000000 --- a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Farming Simulator 17 KUHN-RELOADED Free Download.md +++ /dev/null @@ -1,47 +0,0 @@ - -

Farming Simulator 17 KUHN-RELOADED Free Download: A Review of the New Official Extension

- -

If you are a fan of Farming Simulator 17, you might be interested in the new official extension that adds 18 new implements from KUHN, a leading manufacturer of agricultural machinery. The KUHN-RELOADED DLC is available for download on PC and Mac platforms, and it requires the base game Farming Simulator 17 (Update 1.4 or higher) to play.

- -

In this article, we will review the features and benefits of the KUHN-RELOADED DLC, and show you how to download it for free.

-

Farming Simulator 17 KUHN-RELOADED Free Download


Download >>> https://byltly.com/2uKwp7



- -

What's Included in the KUHN-RELOADED DLC?

- -

The KUHN-RELOADED DLC contains a variety of equipment that will enhance your farming experience and productivity. You will find cultivators, sowing machines, sprayers, fertilizer spreaders, mowers, tedders, windrowers, balers, and bale wrappers from KUHN. Here is a list of the equipment included in the DLC:

- - - -

Each piece of equipment has its own specifications and features that will help you perform various tasks on your farm. You can check out the details of each item on the official website[^1^] or on Steam[^2^]. You can also watch the official trailer for the DLC below:

- - - -

How to Download Farming Simulator 17 KUHN-RELOADED Free?

- -

If you want to download Farming Simulator 17 KUHN-RELOADED free, you will need a torrent client such as BitTorrent or uTorrent. You can download the torrent file from various sources online, such as LaptrinhX[^3^]. Once you have downloaded the torrent file, you can open it with your torrent client and start downloading the game files.

- -

After the download is complete, you will need to extract the files using a program such as WinRAR or 7-Zip. You will find a folder named Farming.Simulator.17.KUHN-RELOADED that contains an ISO file and a folder named Crack. You will need to mount the ISO file using a program such as Daemon Tools or PowerISO. Then, you will need to run the setup.exe file and follow the instructions to install the game.

- -

Once the installation is done, you will need to copy the contents of the Crack folder and paste them into the game directory where you installed Farming Simulator 17. This will overwrite some files and crack the game. Now, you can launch the game

-

7b8c122e87
-
-
\ No newline at end of file diff --git a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Google Earth Pro 7.3.0 Portable Free ((NEW)) Download.md b/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Google Earth Pro 7.3.0 Portable Free ((NEW)) Download.md deleted file mode 100644 index a77d51bc0ed58712445e158313b0f582cd269d97..0000000000000000000000000000000000000000 --- a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/Google Earth Pro 7.3.0 Portable Free ((NEW)) Download.md +++ /dev/null @@ -1,21 +0,0 @@ -
-

Google Earth Pro 7.3.0 Portable Free Download

-

Google Earth Pro is a powerful application that lets you explore the world from the comfort of your computer. You can view satellite imagery, maps, terrain, 3D buildings, and even galaxies in outer space. You can also save places you have toured, share them with others, and import and export GIS data.

-

Google Earth Pro 7.3.0 Portable Free Download


Download ⚹⚹⚹ https://byltly.com/2uKxoj



-

Google Earth Pro is now free and available for Windows, macOS, Android and Linux[^2^]. However, if you want to use it without installing anything on your system, you can download Google Earth Pro Portable from PortableApps.com[^1^]. This version is based on Google Earth Pro 7.3.0 and allows you to enable caching and associate *.kml and *.kmz files with the app.

-

To download Google Earth Pro Portable, follow these steps:

-
    -
  1. Go to https://portableapps.com/node/57406
  2. Click on the link that says "Download Google Earth Pro Portable 7.3.0.3832 Dev Test 1 [1.06+54.9MB download / 179.0MB installed]"
  3. Save the file to your desired location and run it
  4. Follow the instructions to extract and launch Google Earth Pro Portable
  5. Enjoy exploring the world!
-

Note: This is a development test version and may not work properly or be stable. Use it at your own risk.

-

Google Earth Pro Portable has some features that are not available in the regular Google Earth. For example, you can measure distances and areas using lines and polygons, print high-resolution images for presentations and reports, record HD movies of your virtual flights, and import large vector image files.

-

Google Earth Pro Portable also lets you customize your experience by changing the settings in the GoogleEarthProPortable.ini file. You can find this file in the Other\Source folder of the app. You can edit this file with any text editor and change the values of different parameters. For example, you can enable caching by setting EnableCache=true. This will allow Google Earth Pro Portable to store some data locally and reduce the loading time and bandwidth usage.
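The caching tweak described above amounts to changing one `EnableCache` line in that INI file. As a hypothetical illustration (the helper name and the flat `key=value` layout are assumptions for this sketch, not documented behavior of the launcher), a small Python function could toggle the flag in the file's text:

```python
import re

def set_enable_cache(ini_text: str, enabled: bool = True) -> str:
    """Return INI text with EnableCache set to the given value.

    Assumes the launcher stores the flag as a plain `EnableCache=...` line,
    as described above; if the key is missing, it is appended.
    """
    value = "true" if enabled else "false"
    # (?im): case-insensitive, multiline match on a whole EnableCache line
    if re.search(r"(?im)^\s*EnableCache\s*=", ini_text):
        return re.sub(r"(?im)^\s*EnableCache\s*=.*$", f"EnableCache={value}", ini_text)
    # Key not present yet: append it on its own line
    return ini_text.rstrip("\n") + f"\nEnableCache={value}\n"
```

You would run this on the contents of `Other\Source\GoogleEarthProPortable.ini`, write the result back, and restart the portable launcher for the change to take effect.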

-

Google Earth Pro Portable is a great way to explore the world without installing anything on your system. However, you should be aware of some limitations and drawbacks of using this version. For instance, you may encounter some errors or crashes due to compatibility issues or bugs. You may also miss some updates or features that are available in the official Google Earth Pro. Moreover, you should respect the terms of service and privacy policy of Google when using this app.

To update Google Earth Pro Portable, you can check the PortableApps.com website for any new releases. You can also use the PortableApps.com Platform to automatically update your apps. Alternatively, you can download the latest version of Google Earth Pro from the official website and replace the files in the App\GoogleEarthPro folder of your portable app.

-

To uninstall Google Earth Pro Portable, you can simply delete the folder where you extracted the app. You can also use the PortableApps.com Platform to uninstall your apps. However, you should note that this will not remove any data or settings that Google Earth Pro Portable may have stored on your system or online.

-

Google Earth Pro Portable requires an internet connection to download and display the imagery and data from Google servers. You cannot use it offline unless you have enabled caching and have previously viewed the areas that you want to see. Even then, you may not be able to access all the features and functions of the app.

81aa517590
-
-
\ No newline at end of file diff --git a/spaces/1gistliPinn/ChatGPT4/Examples/Adobe Media Encoder Cs6 Activation Crack NEW.md b/spaces/1gistliPinn/ChatGPT4/Examples/Adobe Media Encoder Cs6 Activation Crack NEW.md deleted file mode 100644 index 5bee0fa91150fd2ffb564baa9baa69a52cfdca45..0000000000000000000000000000000000000000 --- a/spaces/1gistliPinn/ChatGPT4/Examples/Adobe Media Encoder Cs6 Activation Crack NEW.md +++ /dev/null @@ -1,90 +0,0 @@ -
-

How to Download and Install Adobe Media Encoder CS6 Activation Crack

-

Adobe Media Encoder CS6 is a software that allows you to encode, transcode, and compress video and audio files for various formats and platforms. It is a part of the Adobe Creative Suite 6, which includes other popular applications such as Photoshop, Premiere Pro, After Effects, etc. However, to use Adobe Media Encoder CS6, you need to activate it with a valid serial number or a crack file. In this article, we will show you how to download and install Adobe Media Encoder CS6 activation crack for free.

-

What is Adobe Media Encoder CS6?

-

Adobe Media Encoder CS6 is a software that allows you to encode, transcode, and compress video and audio files for various formats and platforms. It can help you convert your video clips to different formats such as MP4, MOV, AVI, FLV, etc. It can also help you adjust the quality and size of your video files according to your needs. You can use Adobe Media Encoder CS6 to export your video files from other Adobe applications such as After Effects or Premiere Pro. You can also use it to upload your video files to social media platforms such as YouTube or Facebook.

-

adobe media encoder cs6 activation crack


DOWNLOADhttps://imgfil.com/2uxXxD



-

What is Adobe Media Encoder CS6 Activation Crack?

-

Adobe Media Encoder CS6 activation crack is a file that can bypass the activation process of the software and make it work without a serial number. It is usually a modified version of the original file "amtlib.dll" that is located in the installation folder of Adobe Media Encoder CS6. By replacing the original file with the crack file, you can trick the software into thinking that it is activated and use it without any limitations.

-

Where to Download Adobe Media Encoder CS6 Activation Crack?

-

There are many websites that offer Adobe Media Encoder CS6 activation crack for download. However, not all of them are reliable or safe. Some of them may have fake or corrupted files, or may contain malicious ads or pop-ups. Therefore, you should be careful when choosing a website and check the reviews and ratings of the files before downloading them. Here are some of the websites that offer Adobe Media Encoder CS6 activation crack:

- -
How to Install Adobe Media Encoder CS6 Activation Crack?
-

After you have downloaded Adobe Media Encoder CS6 activation crack from one of the websites above, you need to follow these steps to install it:

-
    -
  1. Run Block Adobe Activation.app: This is an application that prevents the Adobe activation program from starting. You need to run this application before installing Adobe Media Encoder CS6.
  2. -
  3. Run the Adobe CS6 installer: This is an application that installs Adobe Media Encoder CS6 on your computer. You need to run this application and select "Trial" as the installation option.
  4. -
  5. Register a free adobe account: This is an account that allows you to access various Adobe services and products. You need to register a free adobe account and sign in with it during the installation process.
  6. -
  7. Copy amtlib.dll file and replace with existing one: This is the crack file that activates Adobe Media Encoder CS6. You need to copy this file from the downloaded folder and replace it with the existing one in the installation folder of Adobe Media Encoder CS6. The default location of the installation folder is C:\Program Files\Adobe\Adobe Media Encoder CS6.
  8. -
  9. Enjoy using Adobe Media Encoder CS6: After you have replaced the amtlib.dll file with the crack file, you can enjoy using Adobe Media Encoder CS6 without any restrictions.
  10. -
-
Benefits of Using Adobe Media Encoder CS6 Activation Crack
-

Using Adobe Media Encoder CS6 activation crack has many benefits for you, such as:

-
Tips for Using Adobe Media Encoder CS6 Activation Crack
-

Besides installing Adobe Media Encoder CS6 activation crack, there are some tips that can help you use the software more effectively:

- - - -Conclusion - -

Adobe Media Encoder CS6 is a powerful software that allows you to encode, transcode, and compress video and audio files for various formats and platforms. However, to use it, you need to activate it with a valid serial number or a crack file. In this article, we showed you how to download and install Adobe Media Encoder CS6 activation crack for free. We also gave you some tips on how to use the software more effectively. By using this crack file and these tips, you can enjoy using Adobe Media Encoder CS6 without any restrictions.

3cee63e6c2
-
-
\ No newline at end of file diff --git a/spaces/1gistliPinn/ChatGPT4/Examples/Free Download Maps Blaupunkt Travelpilot Ex.rar [PATCHED].md b/spaces/1gistliPinn/ChatGPT4/Examples/Free Download Maps Blaupunkt Travelpilot Ex.rar [PATCHED].md deleted file mode 100644 index 2d8ee965a20eb20c9afe0729e58082b40f0a27d3..0000000000000000000000000000000000000000 --- a/spaces/1gistliPinn/ChatGPT4/Examples/Free Download Maps Blaupunkt Travelpilot Ex.rar [PATCHED].md +++ /dev/null @@ -1,6 +0,0 @@ -

Free Download Maps Blaupunkt Travelpilot Ex.rar


Download File ►►►►► https://imgfil.com/2uxY5W



- -3B0 035 191G 7 612 001 025 24C32 new.rar. File type: .rar. Downloaded: 0 times. Size: 1.4s. Blaupunkt Travelpilot E (NOT EX!) (analogue from Sony) - 2 pcs. (for 2 pieces - 2,800 rubles). 8a78ff9644
-
-
-

diff --git a/spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Download 730 precompilato 2023 i vantaggi i requisiti e i passaggi da seguire.md b/spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Download 730 precompilato 2023 i vantaggi i requisiti e i passaggi da seguire.md deleted file mode 100644 index e7a5d30afd5312f020363603a0247bc9ab4d1d3c..0000000000000000000000000000000000000000 --- a/spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Download 730 precompilato 2023 i vantaggi i requisiti e i passaggi da seguire.md +++ /dev/null @@ -1,160 +0,0 @@ - -

How to Download 730 Precompilato 2023: A Complete Guide

-

If you are a resident taxpayer in Italy, you may be eligible to use the 730 Precompilato service to file your income tax return online. This service allows you to access a pre-filled tax form with your personal and income data, verify its accuracy, and submit it electronically. In this article, we will explain what 730 Precompilato is, how to access it online, how to submit it online, and how to get help with it online.

-

What is 730 Precompilato?

-

730 Precompilato is a service provided by the Agenzia delle Entrate (the Italian Revenue Agency) that allows you to file your income tax return online without having to fill in any data manually. The service uses the information that the Agenzia delle Entrate collects from various sources, such as your employer, your bank, your health provider, etc., to pre-fill your tax form with your personal and income data. You can then check and edit your data, add any deductions or tax credits that you are entitled to, and send your declaration electronically.

-

download 730 precompilato 2023


Download Zip >>>>> https://urlin.us/2uSRYH



-

What are the benefits of using 730 Precompilato?

-

Using 730 Precompilato has several advantages, such as:

- -

Who can use 730 Precompilato?

-

You can use 730 Precompilato if you meet the following conditions:

- -

How to access 730 Precompilato online

-

To access 730 Precompilato online, you need to visit the Agenzia delle Entrate website and log in with one of the following methods:

-


- -

What are the requirements for accessing 730 Precompilato online?

-

To access 730 Precompilato online, you need to have:

- -

How to log in to the Agenzia delle Entrate website with SPID, CIE, or CNS

-

To log in to the Agenzia delle Entrate website with SPID, CIE, or CNS, you need to follow these steps:

-
    -
  1. Go to the Agenzia delle Entrate website and click on the "730 Precompilato" button.
  2. Select the option "Accesso con SPID, CIE o CNS" and choose the method that you prefer.
  3. Follow the instructions on the screen to enter your credentials and authenticate yourself.
  4. Once you are logged in, you will see your personal page with your 730 Precompilato form.
-

How to check and edit your personal and income data

-

Once you access your 730 Precompilato form online, you need to check and edit your personal and income data. You can do this by following these steps:

-
    -
  1. Click on the "Dati Anagrafici e Reddituali" section to see your personal and income data.
  2. Verify that your personal data (such as name, surname, date of birth, place of residence, etc.) are correct and complete. If you notice any errors or omissions, you can edit them by clicking on the "Modifica" button.
  3. Verify that your income data (such as wages, pensions, dividends, interest, etc.) are correct and complete. If you notice any errors or omissions, you can edit them by clicking on the "Modifica" button.
  4. If you have received any income from abroad, you need to declare it by clicking on the "Dichiarazione dei redditi esteri" section and filling in the required fields.
  5. If you have any other income that is not pre-filled in your form, you need to declare it by clicking on the "Altri redditi" section and filling in the required fields.
  6. Once you have checked and edited your personal and income data, you can proceed to the next section by clicking on the "Continua" button.
-

How to submit 730 Precompilato online

-

After you have verified and edited your personal and income data, you need to submit your 730 Precompilato form online. You can do this by following these steps:

-

How to confirm or modify your deductions and tax credits

-

The 730 Precompilato form also includes some deductions and tax credits that you may be entitled to, such as expenses for health, education, mortgage interest, donations, etc. You can confirm or modify these deductions and tax credits by following these steps:

-
    -
  1. Click on the "Detrazioni e Credito d'Imposta" section to see your deductions and tax credits.
  2. Verify that the deductions and tax credits are correct and complete. If you notice any errors or omissions, you can edit them by clicking on the "Modifica" button.
  3. If you have any other deductions or tax credits that are not pre-filled in your form, you can add them by clicking on the "Aggiungi" button and filling in the required fields.
  4. Once you have confirmed or modified your deductions and tax credits, you can proceed to the next section by clicking on the "Continua" button.
-

How to choose your payment method and send your declaration

-

The final step of submitting your 730 Precompilato form online is to choose your payment method and send your declaration. You can do this by following these steps:

-
    -
  1. Click on the "Metodo di Pagamento e Invio" section to see your payment method and declaration.
  2. If you have a tax due, you can choose how to pay it. You can either pay it online with a credit card or a bank transfer, or pay it offline with an F24 form at a bank or a post office. You can also choose to pay it in installments if you meet certain conditions.
  3. If you have a tax refund, you can choose how to receive it. You can either receive it by bank transfer or by postal order. You need to provide your bank account details or your postal address accordingly.
  4. Once you have chosen your payment method, you can send your declaration by clicking on the "Invia" button. You will receive a confirmation code and a receipt that you can download and print.
-

How to track the status of your submission and receive your refund

-

After you have sent your 730 Precompilato form online, you can track the status of your submission and receive your refund by following these steps:

-
    -
  1. Go to the Agenzia delle Entrate website and log in with your SPID, CIE, or CNS.
  2. Click on the "730 Precompilato" button and then on the "Verifica lo stato della tua dichiarazione" section.
  3. You will see the current status of your submission.
  4. If you have a refund, you will receive it within a few weeks after your declaration has been finalized. You will receive it by the payment method that you have chosen (bank transfer or postal order).
-

How to get help with 730 Precompilato online

-

If you have any questions or problems with 730 Precompilato online, you can get help by following these methods:

-

How to contact the Agenzia delle Entrate for assistance

-

You can contact the Agenzia delle Entrate for assistance by phone, email, or chat. You can do this by following these steps:

-
    -
  1. Go to the Agenzia delle Entrate website and click on the "Contatti" button.
  2. Select the option "Assistenza per il 730 Precompilato" and choose the method that you prefer (phone, email, or chat).
  3. Follow the instructions on the screen to enter your details and your request.
  4. You will be connected to an operator who will assist you with your issue.
-

How to use the online guides and FAQs

-

You can also use the online guides and FAQs that are available on the Agenzia delle Entrate website. You can do this by following these steps:

-
    -
  1. Go to the Agenzia delle Entrate website and click on the "Guida e FAQ" button.
  2. Select the option "Guida al 730 Precompilato" or "FAQ sul 730 Precompilato" depending on what you need.
  3. You will see a list of topics and questions that cover various aspects of 730 Precompilato online.
  4. Click on the topic or question that interests you and read the answer or explanation.
-

Conclusion

-

In conclusion, 730 Precompilato online is a convenient and easy way to file your income tax return in Italy. It allows you to access a pre-filled tax form with your personal and income data, verify its accuracy, and submit it electronically. It also allows you to receive your refund faster and avoid errors and omissions. To use 730 Precompilato online, you need to visit the Agenzia delle Entrate website and log in with your SPID, CIE, or CNS. You can then check and edit your data, add any deductions or tax credits that you are entitled to, choose your payment method, and send your declaration. You can also track the status of your submission and receive your refund online. If you need any help with 730 Precompilato online, you can contact the Agenzia delle Entrate for assistance or use the online guides and FAQs.

-

FAQs

-

Here are some frequently asked questions about 730 Precompilato online:

-

When can I access 730 Precompilato online?

-

You can access 730 Precompilato online from April 1st to July 31st of each year. However, if you want to use a tax assistance center (CAF) or a qualified professional (such as an accountant) to submit your declaration, you need to access it by June 30th.

-

What if I don't agree with the data pre-filled in my form?

-

If you don't agree with the data pre-filled in your form, you can edit it by clicking on the "Modifica" button. You can also add any data that is missing or not pre-filled by clicking on the "Aggiungi" button. However, you need to have supporting documents or receipts that prove your data. You are responsible for the accuracy and completeness of your data.

-

What if I don't want to use 730 Precompilato online?

-

If you don't want to use 730 Precompilato online, you have other options to file your income tax return. You can either use the traditional 730 form, which you need to fill in manually and submit through a tax assistance center (CAF) or a qualified professional (such as an accountant), or use the Unico form, which you need to fill in manually and submit online or by mail. However, these options may be more time-consuming, costly, and prone to errors than 730 Precompilato online.

-

What if I have already submitted my 730 Precompilato online and I want to change something?

-

If you have already submitted your 730 Precompilato online and you want to change something, you can do so by submitting a new declaration with the correct data. You can do this by following the same steps as before, but you need to select the option "Invia una nuova dichiarazione" instead of "Invia". You can submit a new declaration until July 31st of each year. However, if you have already received your refund or paid your tax due, you may need to adjust your payment or request a new refund accordingly.

-

How secure is 730 Precompilato online?

-

730 Precompilato online is a secure service that uses encryption and authentication methods to protect your data and privacy. The Agenzia delle Entrate guarantees that your data is treated in accordance with the law and that only you can access and modify it. You can also check the security certificate of the website by clicking on the padlock icon in your browser.

-

Where can I find more information about 730 Precompilato online?

-

You can find more information about 730 Precompilato online on the Agenzia delle Entrate website, where you can also access the service and get help. You can also consult the official guide and FAQ that are available on the website. Alternatively, you can contact the Agenzia delle Entrate by phone, email, or chat for assistance.

197e85843d
-
-
\ No newline at end of file diff --git a/spaces/1toTree/lora_test/ppdiffusers/models/ema.py b/spaces/1toTree/lora_test/ppdiffusers/models/ema.py deleted file mode 100644 index f5ce3ad6407a106cc5f9f863af51ef7977c03e23..0000000000000000000000000000000000000000 --- a/spaces/1toTree/lora_test/ppdiffusers/models/ema.py +++ /dev/null @@ -1,103 +0,0 @@ -# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved. -# Copyright 2022 The HuggingFace Team. All rights reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import paddle -from paddle import nn - - -class LitEma(nn.Layer): - """ - Exponential Moving Average (EMA) of model updates - - Parameters: - model: The model architecture for apply EMA. - decay: The exponential decay. Default 0.9999. - use_num_updates: Whether to use number of updates when computing - averages. 
- """ - - def __init__(self, model, decay=0.9999, use_num_upates=True): - super().__init__() - if decay < 0.0 or decay > 1.0: - raise ValueError("Decay must be between 0 and 1") - - self.m_name2s_name = {} - self.register_buffer("decay", paddle.to_tensor(decay, dtype=paddle.float32)) - self.register_buffer( - "num_updates", - paddle.to_tensor(0, dtype=paddle.int64) if use_num_upates else paddle.to_tensor(-1, dtype=paddle.int64), - ) - - for name, p in model.named_parameters(): - if not p.stop_gradient: - # remove as '.'-character is not allowed in buffers - s_name = name.replace(".", "") - self.m_name2s_name.update({name: s_name}) - self.register_buffer(s_name, p.clone().detach()) - - self.collected_params = [] - - def forward(self, model): - decay = self.decay - - if self.num_updates >= 0: - self.num_updates += 1 - decay = min(self.decay, (1 + self.num_updates) / (10 + self.num_updates)) - - one_minus_decay = 1.0 - decay - - with paddle.no_grad(): - m_param = dict(model.named_parameters()) - shadow_params = dict(self.named_buffers()) - - for key in m_param: - if not m_param[key].stop_gradient: - sname = self.m_name2s_name[key] - shadow_params[sname].scale_(decay) - shadow_params[sname].add_(m_param[key] * one_minus_decay) - else: - assert key not in self.m_name2s_name - - def copy_to(self, model): - m_param = dict(model.named_parameters()) - shadow_params = dict(self.named_buffers()) - for key in m_param: - if not m_param[key].stop_gradient: - m_param[key].copy_(shadow_params[self.m_name2s_name[key]], True) - else: - assert key not in self.m_name2s_name - - def store(self, parameters): - """ - Save the current parameters for restoring later. - Args: - parameters: Iterable of `EagerParamBase`; the parameters to be - temporarily stored. - """ - self.collected_params = [param.clone() for param in parameters] - - def restore(self, parameters): - """ - Restore the parameters stored with the `store` method. 
- Useful to validate the model with EMA parameters without affecting the - original optimization process. Store the parameters before the - `copy_to` method. After validation (or model saving), use this to - restore the former parameters. - Args: - parameters: Iterable of `EagerParamBase`; the parameters to be - updated with the stored parameters. - """ - for c_param, param in zip(self.collected_params, parameters): - param.copy_(c_param, True) diff --git a/spaces/4com/4com-license/README.md b/spaces/4com/4com-license/README.md deleted file mode 100644 index cf950474e69a18911a0cbaa68fd90211492b7bcb..0000000000000000000000000000000000000000 --- a/spaces/4com/4com-license/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: 4COM License -emoji: ⚖ -colorFrom: pink -colorTo: red -sdk: gradio -sdk_version: 3.41.2 -app_file: app.py -pinned: false -license: creativeml-openrail-m ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference \ No newline at end of file diff --git a/spaces/52Hz/SRMNet_AWGN_denoising/predict.py b/spaces/52Hz/SRMNet_AWGN_denoising/predict.py deleted file mode 100644 index 092ecaf8ef531ce11d77aaa68a3cf3b82ebf5a07..0000000000000000000000000000000000000000 --- a/spaces/52Hz/SRMNet_AWGN_denoising/predict.py +++ /dev/null @@ -1,82 +0,0 @@ -import cog -import tempfile -from pathlib import Path -import argparse -import shutil -import os -import glob -import torch -from skimage import img_as_ubyte -from PIL import Image -from model.SRMNet import SRMNet -from main_test_SRMNet import save_img, setup -import torchvision.transforms.functional as TF -import torch.nn.functional as F - - -class Predictor(cog.Predictor): - def setup(self): - model_dir = 'experiments/pretrained_models/AWGN_denoising_SRMNet.pth' - - parser = argparse.ArgumentParser(description='Demo Image Denoising') - parser.add_argument('--input_dir', default='./test/', type=str, help='Input images') - parser.add_argument('--result_dir', 
default='./result/', type=str, help='Directory for results') - parser.add_argument('--weights', - default='./checkpoints/SRMNet_real_denoise/models/model_bestPSNR.pth', type=str, - help='Path to weights') - - self.args = parser.parse_args() - - self.device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') - - @cog.input("image", type=Path, help="input image") - def predict(self, image): - # set input folder - input_dir = 'input_cog_temp' - os.makedirs(input_dir, exist_ok=True) - input_path = os.path.join(input_dir, os.path.basename(image)) - shutil.copy(str(image), input_path) - - # Load corresponding models architecture and weights - model = SRMNet() - model.eval() - model = model.to(self.device) - - folder, save_dir = setup(self.args) - os.makedirs(save_dir, exist_ok=True) - - out_path = Path(tempfile.mkdtemp()) / "out.png" - mul = 16 - for file_ in sorted(glob.glob(os.path.join(folder, '*.PNG'))): - img = Image.open(file_).convert('RGB') - input_ = TF.to_tensor(img).unsqueeze(0).cuda() - - # Pad the input if not_multiple_of 8 - h, w = input_.shape[2], input_.shape[3] - H, W = ((h + mul) // mul) * mul, ((w + mul) // mul) * mul - padh = H - h if h % mul != 0 else 0 - padw = W - w if w % mul != 0 else 0 - input_ = F.pad(input_, (0, padw, 0, padh), 'reflect') - with torch.no_grad(): - restored = model(input_) - - restored = torch.clamp(restored, 0, 1) - restored = restored[:, :, :h, :w] - restored = restored.permute(0, 2, 3, 1).cpu().detach().numpy() - restored = img_as_ubyte(restored[0]) - - save_img(str(out_path), restored) - clean_folder(input_dir) - return out_path - - -def clean_folder(folder): - for filename in os.listdir(folder): - file_path = os.path.join(folder, filename) - try: - if os.path.isfile(file_path) or os.path.islink(file_path): - os.unlink(file_path) - elif os.path.isdir(file_path): - shutil.rmtree(file_path) - except Exception as e: - print('Failed to delete %s. 
Reason: %s' % (file_path, e)) diff --git a/spaces/AEUPH/AethericGPT/app.py b/spaces/AEUPH/AethericGPT/app.py deleted file mode 100644 index 1864d8dc8e67cb899d074ff3431f884d924803f2..0000000000000000000000000000000000000000 --- a/spaces/AEUPH/AethericGPT/app.py +++ /dev/null @@ -1,32 +0,0 @@ -import gradio as gr -import requests - -def interact_with_server(prompt): - server_url = "http://xzyorb.servemp3.com:80" - response = requests.post(server_url, data={"prompt": prompt}) - response_text = response.text - - # Split the response text into individual messages - conversation_messages = response_text.split("\n") - num_messages = len(conversation_messages) - - return response_text, num_messages - -def print_session_data(data): - response_text, num_messages = data - session = gr.capture_session() - ip_address = session["ip"] - user_agent = session["user_agent"] - - print("IP Address:", ip_address) - print("User Agent:", user_agent) - print("Number of Messages:", num_messages) - -iface = gr.Interface( - fn=interact_with_server, - inputs=gr.inputs.Textbox(), - outputs=[gr.outputs.HTML(), gr.outputs.Label()], # Use Label for displaying the number of messages - capture_session=True # Automatically captures IP address and user agent -) - -iface.launch(print_session_data) # Pass the function to print session data to the launch method diff --git a/spaces/AILab-CVC/SEED-LLaMA/models/seed_qformer/utils.py b/spaces/AILab-CVC/SEED-LLaMA/models/seed_qformer/utils.py deleted file mode 100644 index 9110d0353b458ee1516a192f8a0dac6c6f316e5c..0000000000000000000000000000000000000000 --- a/spaces/AILab-CVC/SEED-LLaMA/models/seed_qformer/utils.py +++ /dev/null @@ -1,138 +0,0 @@ -""" - Copyright (c) 2022, salesforce.com, inc. - All rights reserved. 
- SPDX-License-Identifier: BSD-3-Clause - For full license text, see the LICENSE file in the repo root or https://opensource.org/licenses/BSD-3-Clause -""" - -import datetime -import functools -import os - -import torch -import torch.distributed as dist -import timm.models.hub as timm_hub -from urllib.parse import urlparse - - -def setup_for_distributed(is_master): - """ - This function disables printing when not in master process - """ - import builtins as __builtin__ - - builtin_print = __builtin__.print - - def print(*args, **kwargs): - force = kwargs.pop("force", False) - if is_master or force: - builtin_print(*args, **kwargs) - - __builtin__.print = print - - -def is_dist_avail_and_initialized(): - if not dist.is_available(): - return False - if not dist.is_initialized(): - return False - return True - - -def get_world_size(): - if not is_dist_avail_and_initialized(): - return 1 - return dist.get_world_size() - - -def get_rank(): - if not is_dist_avail_and_initialized(): - return 0 - return dist.get_rank() - - -def is_main_process(): - return get_rank() == 0 - - -def init_distributed_mode(args): - if "RANK" in os.environ and "WORLD_SIZE" in os.environ: - args.rank = int(os.environ["RANK"]) - args.world_size = int(os.environ["WORLD_SIZE"]) - args.gpu = int(os.environ["LOCAL_RANK"]) - elif "SLURM_PROCID" in os.environ: - args.rank = int(os.environ["SLURM_PROCID"]) - args.gpu = args.rank % torch.cuda.device_count() - else: - print("Not using distributed mode") - args.distributed = False - return - - args.distributed = True - - torch.cuda.set_device(args.gpu) - args.dist_backend = "nccl" - print( - "| distributed init (rank {}, world {}): {}".format(args.rank, args.world_size, args.dist_url), - flush=True, - ) - torch.distributed.init_process_group( - backend=args.dist_backend, - init_method=args.dist_url, - world_size=args.world_size, - rank=args.rank, - timeout=datetime.timedelta(days=365), # allow auto-downloading and de-compressing - ) - 
torch.distributed.barrier() - setup_for_distributed(args.rank == 0) - - -def get_dist_info(): - if torch.__version__ < "1.0": - initialized = dist._initialized - else: - initialized = dist.is_initialized() - if initialized: - rank = dist.get_rank() - world_size = dist.get_world_size() - else: # non-distributed training - rank = 0 - world_size = 1 - return rank, world_size - - -def main_process(func): - @functools.wraps(func) - def wrapper(*args, **kwargs): - rank, _ = get_dist_info() - if rank == 0: - return func(*args, **kwargs) - - return wrapper - - -def download_cached_file(url, check_hash=True, progress=False): - """ - Download a file from a URL and cache it locally. If the file already exists, it is not downloaded again. - If distributed, only the main process downloads the file, and the other processes wait for the file to be downloaded. - """ - def get_cached_file_path(): - # a hack to sync the file path across processes - parts = torch.hub.urlparse(url) - filename = os.path.basename(parts.path) - cached_file = os.path.join(timm_hub.get_cache_dir(), filename) - - return cached_file - - if is_main_process(): - timm_hub.download_cached_file(url, check_hash, progress) - - if is_dist_avail_and_initialized(): - dist.barrier() - - return get_cached_file_path() - - -def is_url(url_or_filename): - parsed = urlparse(url_or_filename) - return parsed.scheme in ("http", "https") diff --git a/spaces/AchyuthGamer/OpenGPT-Chat-UI/src/lib/server/models.ts b/spaces/AchyuthGamer/OpenGPT-Chat-UI/src/lib/server/models.ts deleted file mode 100644 index 74a46c090a3b9280690aff8722f7ac79fd7dda01..0000000000000000000000000000000000000000 --- a/spaces/AchyuthGamer/OpenGPT-Chat-UI/src/lib/server/models.ts +++ /dev/null @@ -1,178 +0,0 @@ -import { HF_ACCESS_TOKEN, MODELS, OLD_MODELS } from "$env/static/private"; -import type { - ChatTemplateInput, - WebSearchQueryTemplateInput, - WebSearchSummaryTemplateInput, -} from "$lib/types/Template"; -import { compileTemplate } from 
"$lib/utils/template"; -import { z } from "zod"; - -type Optional = Pick, K> & Omit; - -const sagemakerEndpoint = z.object({ - host: z.literal("sagemaker"), - url: z.string().url(), - accessKey: z.string().min(1), - secretKey: z.string().min(1), - sessionToken: z.string().optional(), -}); - -const tgiEndpoint = z.object({ - host: z.union([z.literal("tgi"), z.undefined()]), - url: z.string().url(), - authorization: z.string().min(1).default(`Bearer ${HF_ACCESS_TOKEN}`), -}); - -const localEndpoint = z.object({ - host: z.union([z.literal("local"), z.undefined()]), - model: z.string(), - url: z.string().url(), - authorization: z.string().min(1).default(`Bearer ${HF_ACCESS_TOKEN}`), -}); - -const commonEndpoint = z.object({ - weight: z.number().int().positive().default(1), -}); - -const endpoint = z.lazy(() => - z.union([ - sagemakerEndpoint.merge(commonEndpoint), - tgiEndpoint.merge(commonEndpoint), - localEndpoint.merge(commonEndpoint), - ]) -); - -const combinedEndpoint = endpoint.transform((data) => { - if (data.host === "tgi" || data.host === undefined) { - return tgiEndpoint.merge(commonEndpoint).parse(data); - } else if (data.host === "sagemaker") { - return sagemakerEndpoint.merge(commonEndpoint).parse(data); - } else if (data.host === "local") { - return localEndpoint.merge(commonEndpoint).parse(data); - } else { - throw new Error(`Invalid host: ${data.host}`); - } -}); - -const modelsRaw = z - .array( - z.object({ - /** Used as an identifier in DB */ - id: z.string().optional(), - /** Used to link to the model page, and for inference */ - name: z.string().min(1), - displayName: z.string().min(1).optional(), - description: z.string().min(1).optional(), - is_local: z.boolean().optional(), - is_code: z.boolean().optional(), - is_phi: z.boolean().optional(), - type: z.string().min(1), - websiteUrl: z.string().url().optional(), - modelUrl: z.string().url().optional(), - datasetName: z.string().min(1).optional(), - datasetUrl: z.string().url().optional(), - 
userMessageToken: z.string().default(""), - userMessageEndToken: z.string().default(""), - assistantMessageToken: z.string().default(""), - assistantMessageEndToken: z.string().default(""), - messageEndToken: z.string().default(""), - preprompt: z.string().default(""), - prepromptUrl: z.string().url().optional(), - chatPromptTemplate: z - .string() - .default( - "{{preprompt}}" + - "{{#each messages}}" + - "{{#ifUser}}{{@root.userMessageToken}}{{content}}{{@root.userMessageEndToken}}{{/ifUser}}" + - "{{#ifAssistant}}{{@root.assistantMessageToken}}{{content}}{{@root.assistantMessageEndToken}}{{/ifAssistant}}" + - "{{/each}}" + - "{{assistantMessageToken}}" - ), - webSearchSummaryPromptTemplate: z - .string() - .default( - "{{userMessageToken}}{{answer}}{{userMessageEndToken}}" + - "{{userMessageToken}}" + - "The text above should be summarized to best answer the query: {{query}}." + - "{{userMessageEndToken}}" + - "{{assistantMessageToken}}Summary: " - ), - webSearchQueryPromptTemplate: z - .string() - .default( - "{{userMessageToken}}" + - "The following messages were written by a user, trying to answer a question." + - "{{userMessageEndToken}}" + - "{{#each messages}}" + - "{{#ifUser}}{{@root.userMessageToken}}{{content}}{{@root.userMessageEndToken}}{{/ifUser}}" + - "{{/each}}" + - "{{userMessageToken}}" + - "What plain-text english sentence would you input into Google to answer the last question? Answer with a short (10 words max) simple sentence." 
+ - "{{userMessageEndToken}}" + - "{{assistantMessageToken}}Query: " - ), - promptExamples: z - .array( - z.object({ - title: z.string().min(1), - prompt: z.string().min(1), - }) - ) - .optional(), - endpoints: z.array(combinedEndpoint).optional(), - parameters: z - .object({ - temperature: z.number().min(0).max(1), - truncate: z.number().int().positive(), - max_new_tokens: z.number().int().positive(), - stop: z.array(z.string()).optional(), - }) - .passthrough() - .optional(), - }) - ) - .parse(JSON.parse(MODELS)); - -export const models = await Promise.all( - modelsRaw.map(async (m) => ({ - ...m, - userMessageEndToken: m?.userMessageEndToken || m?.messageEndToken, - assistantMessageEndToken: m?.assistantMessageEndToken || m?.messageEndToken, - chatPromptRender: compileTemplate(m.chatPromptTemplate, m), - webSearchSummaryPromptRender: compileTemplate( - m.webSearchSummaryPromptTemplate, - m - ), - webSearchQueryPromptRender: compileTemplate( - m.webSearchQueryPromptTemplate, - m - ), - id: m.id || m.name, - displayName: m.displayName || m.name, - preprompt: m.prepromptUrl ? await fetch(m.prepromptUrl).then((r) => r.text()) : m.preprompt, - })) -); - -// Models that have been deprecated -export const oldModels = OLD_MODELS - ? 
z - .array( - z.object({ - id: z.string().optional(), - name: z.string().min(1), - displayName: z.string().min(1).optional(), - }) - ) - .parse(JSON.parse(OLD_MODELS)) - .map((m) => ({ ...m, id: m.id || m.name, displayName: m.displayName || m.name })) - : []; - -export type BackendModel = Optional<(typeof models)[0], "preprompt">; -export type Endpoint = z.infer; - -export const defaultModel = models[0]; - -export const validateModel = (_models: BackendModel[]) => { - // Zod enum function requires 2 parameters - return z.enum([_models[0].id, ..._models.slice(1).map((m) => m.id)]); -}; diff --git a/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/examples/text_to_image/train_text_to_image_lora.py b/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/examples/text_to_image/train_text_to_image_lora.py deleted file mode 100644 index fd5aef558f0345a384d020920b0a64252272ff6e..0000000000000000000000000000000000000000 --- a/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/examples/text_to_image/train_text_to_image_lora.py +++ /dev/null @@ -1,949 +0,0 @@ -# coding=utf-8 -# Copyright 2023 The HuggingFace Inc. team. All rights reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
-"""Fine-tuning script for Stable Diffusion for text2image with support for LoRA.""" - -import argparse -import logging -import math -import os -import random -import shutil -from pathlib import Path - -import datasets -import numpy as np -import torch -import torch.nn.functional as F -import torch.utils.checkpoint -import transformers -from accelerate import Accelerator -from accelerate.logging import get_logger -from accelerate.utils import ProjectConfiguration, set_seed -from datasets import load_dataset -from huggingface_hub import create_repo, upload_folder -from packaging import version -from torchvision import transforms -from tqdm.auto import tqdm -from transformers import CLIPTextModel, CLIPTokenizer - -import diffusers -from diffusers import AutoencoderKL, DDPMScheduler, DiffusionPipeline, UNet2DConditionModel -from diffusers.loaders import AttnProcsLayers -from diffusers.models.attention_processor import LoRAAttnProcessor -from diffusers.optimization import get_scheduler -from diffusers.utils import check_min_version, is_wandb_available -from diffusers.utils.import_utils import is_xformers_available - - -# Will error if the minimal version of diffusers is not installed. Remove at your own risks. -check_min_version("0.19.0") - -logger = get_logger(__name__, log_level="INFO") - - -def save_model_card(repo_id: str, images=None, base_model=str, dataset_name=str, repo_folder=None): - img_str = "" - for i, image in enumerate(images): - image.save(os.path.join(repo_folder, f"image_{i}.png")) - img_str += f"![img_{i}](./image_{i}.png)\n" - - yaml = f""" ---- -license: creativeml-openrail-m -base_model: {base_model} -tags: -- stable-diffusion -- stable-diffusion-diffusers -- text-to-image -- diffusers -- lora -inference: true ---- - """ - model_card = f""" -# LoRA text2image fine-tuning - {repo_id} -These are LoRA adaption weights for {base_model}. The weights were fine-tuned on the {dataset_name} dataset. You can find some example images in the following. 
\n -{img_str} -""" - with open(os.path.join(repo_folder, "README.md"), "w") as f: - f.write(yaml + model_card) - - -def parse_args(): - parser = argparse.ArgumentParser(description="Simple example of a training script.") - parser.add_argument( - "--pretrained_model_name_or_path", - type=str, - default=None, - required=True, - help="Path to pretrained model or model identifier from huggingface.co/models.", - ) - parser.add_argument( - "--revision", - type=str, - default=None, - required=False, - help="Revision of pretrained model identifier from huggingface.co/models.", - ) - parser.add_argument( - "--dataset_name", - type=str, - default=None, - help=( - "The name of the Dataset (from the HuggingFace hub) to train on (could be your own, possibly private," - " dataset). It can also be a path pointing to a local copy of a dataset in your filesystem," - " or to a folder containing files that 🤗 Datasets can understand." - ), - ) - parser.add_argument( - "--dataset_config_name", - type=str, - default=None, - help="The config of the Dataset, leave as None if there's only one config.", - ) - parser.add_argument( - "--train_data_dir", - type=str, - default=None, - help=( - "A folder containing the training data. Folder contents must follow the structure described in" - " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" - " must exist to provide the captions for the images. Ignored if `dataset_name` is specified." - ), - ) - parser.add_argument( - "--image_column", type=str, default="image", help="The column of the dataset containing an image." - ) - parser.add_argument( - "--caption_column", - type=str, - default="text", - help="The column of the dataset containing a caption or a list of captions.", - ) - parser.add_argument( - "--validation_prompt", type=str, default=None, help="A prompt that is sampled during training for inference." 
- ) - parser.add_argument( - "--num_validation_images", - type=int, - default=4, - help="Number of images that should be generated during validation with `validation_prompt`.", - ) - parser.add_argument( - "--validation_epochs", - type=int, - default=1, - help=( - "Run fine-tuning validation every X epochs. The validation process consists of running the prompt" - " `args.validation_prompt` multiple times: `args.num_validation_images`." - ), - ) - parser.add_argument( - "--max_train_samples", - type=int, - default=None, - help=( - "For debugging purposes or quicker training, truncate the number of training examples to this " - "value if set." - ), - ) - parser.add_argument( - "--output_dir", - type=str, - default="sd-model-finetuned-lora", - help="The output directory where the model predictions and checkpoints will be written.", - ) - parser.add_argument( - "--cache_dir", - type=str, - default=None, - help="The directory where the downloaded models and datasets will be stored.", - ) - parser.add_argument("--seed", type=int, default=None, help="A seed for reproducible training.") - parser.add_argument( - "--resolution", - type=int, - default=512, - help=( - "The resolution for input images, all the images in the train/validation dataset will be resized to this" - " resolution" - ), - ) - parser.add_argument( - "--center_crop", - default=False, - action="store_true", - help=( - "Whether to center crop the input images to the resolution. If not set, the images will be randomly" - " cropped. The images will be resized to the resolution first before cropping." - ), - ) - parser.add_argument( - "--random_flip", - action="store_true", - help="whether to randomly flip images horizontally", - ) - parser.add_argument( - "--train_batch_size", type=int, default=16, help="Batch size (per device) for the training dataloader." 
- ) - parser.add_argument("--num_train_epochs", type=int, default=100) - parser.add_argument( - "--max_train_steps", - type=int, - default=None, - help="Total number of training steps to perform. If provided, overrides num_train_epochs.", - ) - parser.add_argument( - "--gradient_accumulation_steps", - type=int, - default=1, - help="Number of updates steps to accumulate before performing a backward/update pass.", - ) - parser.add_argument( - "--gradient_checkpointing", - action="store_true", - help="Whether or not to use gradient checkpointing to save memory at the expense of slower backward pass.", - ) - parser.add_argument( - "--learning_rate", - type=float, - default=1e-4, - help="Initial learning rate (after the potential warmup period) to use.", - ) - parser.add_argument( - "--scale_lr", - action="store_true", - default=False, - help="Scale the learning rate by the number of GPUs, gradient accumulation steps, and batch size.", - ) - parser.add_argument( - "--lr_scheduler", - type=str, - default="constant", - help=( - 'The scheduler type to use. Choose between ["linear", "cosine", "cosine_with_restarts", "polynomial",' - ' "constant", "constant_with_warmup"]' - ), - ) - parser.add_argument( - "--lr_warmup_steps", type=int, default=500, help="Number of steps for the warmup in the lr scheduler." - ) - parser.add_argument( - "--snr_gamma", - type=float, - default=None, - help="SNR weighting gamma to be used if rebalancing the loss. Recommended value is 5.0. " - "More details here: https://arxiv.org/abs/2303.09556.", - ) - parser.add_argument( - "--use_8bit_adam", action="store_true", help="Whether or not to use 8-bit Adam from bitsandbytes." - ) - parser.add_argument( - "--allow_tf32", - action="store_true", - help=( - "Whether or not to allow TF32 on Ampere GPUs. Can be used to speed up training. 
For more information, see" - " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" - ), - ) - parser.add_argument( - "--dataloader_num_workers", - type=int, - default=0, - help=( - "Number of subprocesses to use for data loading. 0 means that the data will be loaded in the main process." - ), - ) - parser.add_argument("--adam_beta1", type=float, default=0.9, help="The beta1 parameter for the Adam optimizer.") - parser.add_argument("--adam_beta2", type=float, default=0.999, help="The beta2 parameter for the Adam optimizer.") - parser.add_argument("--adam_weight_decay", type=float, default=1e-2, help="Weight decay to use.") - parser.add_argument("--adam_epsilon", type=float, default=1e-08, help="Epsilon value for the Adam optimizer") - parser.add_argument("--max_grad_norm", default=1.0, type=float, help="Max gradient norm.") - parser.add_argument("--push_to_hub", action="store_true", help="Whether or not to push the model to the Hub.") - parser.add_argument("--hub_token", type=str, default=None, help="The token to use to push to the Model Hub.") - parser.add_argument( - "--prediction_type", - type=str, - default=None, - help="The prediction_type that shall be used for training. Choose between 'epsilon' or 'v_prediction' or leave `None`. If left to `None` the default prediction type of the scheduler: `noise_scheduler.config.prediciton_type` is chosen.", - ) - parser.add_argument( - "--hub_model_id", - type=str, - default=None, - help="The name of the repository to keep in sync with the local `output_dir`.", - ) - parser.add_argument( - "--logging_dir", - type=str, - default="logs", - help=( - "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" - " *output_dir/runs/**CURRENT_DATETIME_HOSTNAME***." - ), - ) - parser.add_argument( - "--mixed_precision", - type=str, - default=None, - choices=["no", "fp16", "bf16"], - help=( - "Whether to use mixed precision. Choose between fp16 and bf16 (bfloat16). 
Bf16 requires PyTorch >=" - " 1.10.and an Nvidia Ampere GPU. Default to the value of accelerate config of the current system or the" - " flag passed with the `accelerate.launch` command. Use this argument to override the accelerate config." - ), - ) - parser.add_argument( - "--report_to", - type=str, - default="tensorboard", - help=( - 'The integration to report the results and logs to. Supported platforms are `"tensorboard"`' - ' (default), `"wandb"` and `"comet_ml"`. Use `"all"` to report to all integrations.' - ), - ) - parser.add_argument("--local_rank", type=int, default=-1, help="For distributed training: local_rank") - parser.add_argument( - "--checkpointing_steps", - type=int, - default=500, - help=( - "Save a checkpoint of the training state every X updates. These checkpoints are only suitable for resuming" - " training using `--resume_from_checkpoint`." - ), - ) - parser.add_argument( - "--checkpoints_total_limit", - type=int, - default=None, - help=("Max number of checkpoints to store."), - ) - parser.add_argument( - "--resume_from_checkpoint", - type=str, - default=None, - help=( - "Whether training should be resumed from a previous checkpoint. Use a path saved by" - ' `--checkpointing_steps`, or `"latest"` to automatically select the last available checkpoint.' - ), - ) - parser.add_argument( - "--enable_xformers_memory_efficient_attention", action="store_true", help="Whether or not to use xformers." 
- ) - parser.add_argument("--noise_offset", type=float, default=0, help="The scale of noise offset.") - parser.add_argument( - "--rank", - type=int, - default=4, - help=("The dimension of the LoRA update matrices."), - ) - - args = parser.parse_args() - env_local_rank = int(os.environ.get("LOCAL_RANK", -1)) - if env_local_rank != -1 and env_local_rank != args.local_rank: - args.local_rank = env_local_rank - - # Sanity checks - if args.dataset_name is None and args.train_data_dir is None: - raise ValueError("Need either a dataset name or a training folder.") - - return args - - -DATASET_NAME_MAPPING = { - "lambdalabs/pokemon-blip-captions": ("image", "text"), -} - - -def main(): - args = parse_args() - logging_dir = Path(args.output_dir, args.logging_dir) - - accelerator_project_config = ProjectConfiguration(project_dir=args.output_dir, logging_dir=logging_dir) - - accelerator = Accelerator( - gradient_accumulation_steps=args.gradient_accumulation_steps, - mixed_precision=args.mixed_precision, - log_with=args.report_to, - project_config=accelerator_project_config, - ) - if args.report_to == "wandb": - if not is_wandb_available(): - raise ImportError("Make sure to install wandb if you want to use it for logging during training.") - import wandb - - # Make one log on every process with the configuration for debugging. - logging.basicConfig( - format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", - datefmt="%m/%d/%Y %H:%M:%S", - level=logging.INFO, - ) - logger.info(accelerator.state, main_process_only=False) - if accelerator.is_local_main_process: - datasets.utils.logging.set_verbosity_warning() - transformers.utils.logging.set_verbosity_warning() - diffusers.utils.logging.set_verbosity_info() - else: - datasets.utils.logging.set_verbosity_error() - transformers.utils.logging.set_verbosity_error() - diffusers.utils.logging.set_verbosity_error() - - # If passed along, set the training seed now. 
- if args.seed is not None: - set_seed(args.seed) - - # Handle the repository creation - if accelerator.is_main_process: - if args.output_dir is not None: - os.makedirs(args.output_dir, exist_ok=True) - - if args.push_to_hub: - repo_id = create_repo( - repo_id=args.hub_model_id or Path(args.output_dir).name, exist_ok=True, token=args.hub_token - ).repo_id - # Load scheduler, tokenizer and models. - noise_scheduler = DDPMScheduler.from_pretrained(args.pretrained_model_name_or_path, subfolder="scheduler") - tokenizer = CLIPTokenizer.from_pretrained( - args.pretrained_model_name_or_path, subfolder="tokenizer", revision=args.revision - ) - text_encoder = CLIPTextModel.from_pretrained( - args.pretrained_model_name_or_path, subfolder="text_encoder", revision=args.revision - ) - vae = AutoencoderKL.from_pretrained(args.pretrained_model_name_or_path, subfolder="vae", revision=args.revision) - unet = UNet2DConditionModel.from_pretrained( - args.pretrained_model_name_or_path, subfolder="unet", revision=args.revision - ) - # freeze parameters of models to save more memory - unet.requires_grad_(False) - vae.requires_grad_(False) - - text_encoder.requires_grad_(False) - - # For mixed precision training we cast all non-trainable weigths (vae, non-lora text_encoder and non-lora unet) to half-precision - # as these weights are only used for inference, keeping weights in full precision is not required. 
-    weight_dtype = torch.float32
-    if accelerator.mixed_precision == "fp16":
-        weight_dtype = torch.float16
-    elif accelerator.mixed_precision == "bf16":
-        weight_dtype = torch.bfloat16
-
-    # Move unet, vae and text_encoder to device and cast to weight_dtype
-    unet.to(accelerator.device, dtype=weight_dtype)
-    vae.to(accelerator.device, dtype=weight_dtype)
-    text_encoder.to(accelerator.device, dtype=weight_dtype)
-
-    # now we will add new LoRA weights to the attention layers
-    # It's important to realize here how many attention weights will be added and of which sizes
-    # The sizes of the attention layers consist only of two different variables:
-    # 1) - the "hidden_size", which is increased according to `unet.config.block_out_channels`.
-    # 2) - the "cross attention size", which is set to `unet.config.cross_attention_dim`.
-
-    # Let's first see how many attention processors we will have to set.
-    # For Stable Diffusion, it should be equal to:
-    # - down blocks (2x attention layers) * (2x transformer layers) * (3x down blocks) = 12
-    # - mid blocks (2x attention layers) * (1x transformer layers) * (1x mid blocks) = 2
-    # - up blocks (2x attention layers) * (3x transformer layers) * (3x up blocks) = 18
-    # => 32 layers
-
-    # Set correct lora layers
-    lora_attn_procs = {}
-    for name in unet.attn_processors.keys():
-        cross_attention_dim = None if name.endswith("attn1.processor") else unet.config.cross_attention_dim
-        if name.startswith("mid_block"):
-            hidden_size = unet.config.block_out_channels[-1]
-        elif name.startswith("up_blocks"):
-            block_id = int(name[len("up_blocks.")])
-            hidden_size = list(reversed(unet.config.block_out_channels))[block_id]
-        elif name.startswith("down_blocks"):
-            block_id = int(name[len("down_blocks.")])
-            hidden_size = unet.config.block_out_channels[block_id]
-
-        lora_attn_procs[name] = LoRAAttnProcessor(
-            hidden_size=hidden_size,
-            cross_attention_dim=cross_attention_dim,
-            rank=args.rank,
-        )
-
-    unet.set_attn_processor(lora_attn_procs)
-
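The loop above attaches a rank-`r` LoRA processor per attention layer, which is why the adapter is so small compared to full fine-tuning: each adapted `d_out x d_in` weight gains only `r * (d_in + d_out)` parameters. A quick sketch with illustrative sizes (320 and 768 are assumed here to stand for a first-block `hidden_size` and the `cross_attention_dim` of a SD 1.x UNet):

```python
def lora_param_count(d_out, d_in, rank):
    """Parameters added by one LoRA pair: a (rank x d_in) down-projection
    plus a (d_out x rank) up-projection, versus d_out * d_in for full tuning."""
    return rank * d_in + d_out * rank

full = 320 * 768                               # full weight matrix
adapted = lora_param_count(320, 768, rank=4)   # LoRA update at the default rank
print(full, adapted)                           # 245760 vs 4352
```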
- if args.enable_xformers_memory_efficient_attention: - if is_xformers_available(): - import xformers - - xformers_version = version.parse(xformers.__version__) - if xformers_version == version.parse("0.0.16"): - logger.warn( - "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." - ) - unet.enable_xformers_memory_efficient_attention() - else: - raise ValueError("xformers is not available. Make sure it is installed correctly") - - def compute_snr(timesteps): - """ - Computes SNR as per https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 - """ - alphas_cumprod = noise_scheduler.alphas_cumprod - sqrt_alphas_cumprod = alphas_cumprod**0.5 - sqrt_one_minus_alphas_cumprod = (1.0 - alphas_cumprod) ** 0.5 - - # Expand the tensors. - # Adapted from https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 - sqrt_alphas_cumprod = sqrt_alphas_cumprod.to(device=timesteps.device)[timesteps].float() - while len(sqrt_alphas_cumprod.shape) < len(timesteps.shape): - sqrt_alphas_cumprod = sqrt_alphas_cumprod[..., None] - alpha = sqrt_alphas_cumprod.expand(timesteps.shape) - - sqrt_one_minus_alphas_cumprod = sqrt_one_minus_alphas_cumprod.to(device=timesteps.device)[timesteps].float() - while len(sqrt_one_minus_alphas_cumprod.shape) < len(timesteps.shape): - sqrt_one_minus_alphas_cumprod = sqrt_one_minus_alphas_cumprod[..., None] - sigma = sqrt_one_minus_alphas_cumprod.expand(timesteps.shape) - - # Compute SNR. 
- snr = (alpha / sigma) ** 2 - return snr - - lora_layers = AttnProcsLayers(unet.attn_processors) - - # Enable TF32 for faster training on Ampere GPUs, - # cf https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices - if args.allow_tf32: - torch.backends.cuda.matmul.allow_tf32 = True - - if args.scale_lr: - args.learning_rate = ( - args.learning_rate * args.gradient_accumulation_steps * args.train_batch_size * accelerator.num_processes - ) - - # Initialize the optimizer - if args.use_8bit_adam: - try: - import bitsandbytes as bnb - except ImportError: - raise ImportError( - "Please install bitsandbytes to use 8-bit Adam. You can do so by running `pip install bitsandbytes`" - ) - - optimizer_cls = bnb.optim.AdamW8bit - else: - optimizer_cls = torch.optim.AdamW - - optimizer = optimizer_cls( - lora_layers.parameters(), - lr=args.learning_rate, - betas=(args.adam_beta1, args.adam_beta2), - weight_decay=args.adam_weight_decay, - eps=args.adam_epsilon, - ) - - # Get the datasets: you can either provide your own training and evaluation files (see below) - # or specify a Dataset from the hub (the dataset will be downloaded automatically from the datasets Hub). - - # In distributed training, the load_dataset function guarantees that only one local process can concurrently - # download the dataset. - if args.dataset_name is not None: - # Downloading and loading a dataset from the hub. - dataset = load_dataset( - args.dataset_name, - args.dataset_config_name, - cache_dir=args.cache_dir, - ) - else: - data_files = {} - if args.train_data_dir is not None: - data_files["train"] = os.path.join(args.train_data_dir, "**") - dataset = load_dataset( - "imagefolder", - data_files=data_files, - cache_dir=args.cache_dir, - ) - # See more about loading custom images at - # https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder - - # Preprocessing the datasets. - # We need to tokenize inputs and targets. 
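`compute_snr` above reduces to `alphas_cumprod / (1 - alphas_cumprod)`, since `(sqrt(ac) / sqrt(1 - ac)) ** 2 = ac / (1 - ac)`. A dependency-free sketch with an illustrative linear beta schedule (Stable Diffusion itself uses a scaled-linear schedule, so the exact numbers differ):

```python
def linear_alphas_cumprod(num_steps=1000, beta_start=0.00085, beta_end=0.012):
    # DDPM-style schedule: cumulative product of (1 - beta_t).
    alphas_cumprod, prod = [], 1.0
    for t in range(num_steps):
        beta = beta_start + (beta_end - beta_start) * t / (num_steps - 1)
        prod *= 1.0 - beta
        alphas_cumprod.append(prod)
    return alphas_cumprod

def snr(alphas_cumprod, t):
    # Same quantity as compute_snr above: (alpha / sigma) ** 2 = ac / (1 - ac).
    ac = alphas_cumprod[t]
    return ac / (1.0 - ac)

acp = linear_alphas_cumprod()
# SNR falls monotonically: early timesteps are barely noised, late ones mostly noise.
print(snr(acp, 0), snr(acp, 500), snr(acp, 999))
```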
-    column_names = dataset["train"].column_names
-
-    # 6. Get the column names for input/target.
-    dataset_columns = DATASET_NAME_MAPPING.get(args.dataset_name, None)
-    if args.image_column is None:
-        image_column = dataset_columns[0] if dataset_columns is not None else column_names[0]
-    else:
-        image_column = args.image_column
-        if image_column not in column_names:
-            raise ValueError(
-                f"'--image_column' value '{args.image_column}' needs to be one of: {', '.join(column_names)}"
-            )
-    if args.caption_column is None:
-        caption_column = dataset_columns[1] if dataset_columns is not None else column_names[1]
-    else:
-        caption_column = args.caption_column
-        if caption_column not in column_names:
-            raise ValueError(
-                f"'--caption_column' value '{args.caption_column}' needs to be one of: {', '.join(column_names)}"
-            )
-
-    # Preprocessing the datasets.
-    # We need to tokenize input captions and transform the images.
-    def tokenize_captions(examples, is_train=True):
-        captions = []
-        for caption in examples[caption_column]:
-            if isinstance(caption, str):
-                captions.append(caption)
-            elif isinstance(caption, (list, np.ndarray)):
-                # take a random caption if there are multiple
-                captions.append(random.choice(caption) if is_train else caption[0])
-            else:
-                raise ValueError(
-                    f"Caption column `{caption_column}` should contain either strings or lists of strings."
-                )
-        inputs = tokenizer(
-            captions, max_length=tokenizer.model_max_length, padding="max_length", truncation=True, return_tensors="pt"
-        )
-        return inputs.input_ids
-
-    # Preprocessing the datasets.
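The column-resolution logic above has three tiers: explicit CLI value, then the `DATASET_NAME_MAPPING` entry for known datasets, then positional fallback to the first/second column. A condensed sketch (slightly stricter than the original in that it validates the mapped/default columns too):

```python
DATASET_NAME_MAPPING = {"lambdalabs/pokemon-blip-captions": ("image", "text")}

def resolve_columns(dataset_name, column_names, image_column=None, caption_column=None):
    """Explicit argument > known-dataset mapping > column order."""
    mapped = DATASET_NAME_MAPPING.get(dataset_name)
    image = image_column or (mapped[0] if mapped else column_names[0])
    caption = caption_column or (mapped[1] if mapped else column_names[1])
    for col in (image, caption):
        if col not in column_names:
            raise ValueError(f"'{col}' needs to be one of: {', '.join(column_names)}")
    return image, caption

print(resolve_columns("lambdalabs/pokemon-blip-captions", ["image", "text"]))  # ('image', 'text')
print(resolve_columns(None, ["img", "caption"]))                               # ('img', 'caption')
```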
- train_transforms = transforms.Compose( - [ - transforms.Resize(args.resolution, interpolation=transforms.InterpolationMode.BILINEAR), - transforms.CenterCrop(args.resolution) if args.center_crop else transforms.RandomCrop(args.resolution), - transforms.RandomHorizontalFlip() if args.random_flip else transforms.Lambda(lambda x: x), - transforms.ToTensor(), - transforms.Normalize([0.5], [0.5]), - ] - ) - - def preprocess_train(examples): - images = [image.convert("RGB") for image in examples[image_column]] - examples["pixel_values"] = [train_transforms(image) for image in images] - examples["input_ids"] = tokenize_captions(examples) - return examples - - with accelerator.main_process_first(): - if args.max_train_samples is not None: - dataset["train"] = dataset["train"].shuffle(seed=args.seed).select(range(args.max_train_samples)) - # Set the training transforms - train_dataset = dataset["train"].with_transform(preprocess_train) - - def collate_fn(examples): - pixel_values = torch.stack([example["pixel_values"] for example in examples]) - pixel_values = pixel_values.to(memory_format=torch.contiguous_format).float() - input_ids = torch.stack([example["input_ids"] for example in examples]) - return {"pixel_values": pixel_values, "input_ids": input_ids} - - # DataLoaders creation: - train_dataloader = torch.utils.data.DataLoader( - train_dataset, - shuffle=True, - collate_fn=collate_fn, - batch_size=args.train_batch_size, - num_workers=args.dataloader_num_workers, - ) - - # Scheduler and math around the number of training steps. 
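The `transforms.Normalize([0.5], [0.5])` step in `train_transforms` above maps `ToTensor`'s `[0, 1]` pixel range onto `[-1, 1]`, the range the VAE encoder expects. The arithmetic in isolation:

```python
def normalize(pixel, mean=0.5, std=0.5):
    # Same per-channel formula Normalize applies: (x - mean) / std.
    return (pixel - mean) / std

print(normalize(0.0), normalize(0.5), normalize(1.0))  # -1.0 0.0 1.0
```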
- overrode_max_train_steps = False - num_update_steps_per_epoch = math.ceil(len(train_dataloader) / args.gradient_accumulation_steps) - if args.max_train_steps is None: - args.max_train_steps = args.num_train_epochs * num_update_steps_per_epoch - overrode_max_train_steps = True - - lr_scheduler = get_scheduler( - args.lr_scheduler, - optimizer=optimizer, - num_warmup_steps=args.lr_warmup_steps * accelerator.num_processes, - num_training_steps=args.max_train_steps * accelerator.num_processes, - ) - - # Prepare everything with our `accelerator`. - lora_layers, optimizer, train_dataloader, lr_scheduler = accelerator.prepare( - lora_layers, optimizer, train_dataloader, lr_scheduler - ) - - # We need to recalculate our total training steps as the size of the training dataloader may have changed. - num_update_steps_per_epoch = math.ceil(len(train_dataloader) / args.gradient_accumulation_steps) - if overrode_max_train_steps: - args.max_train_steps = args.num_train_epochs * num_update_steps_per_epoch - # Afterwards we recalculate our number of training epochs - args.num_train_epochs = math.ceil(args.max_train_steps / num_update_steps_per_epoch) - - # We need to initialize the trackers we use, and also store our configuration. - # The trackers initializes automatically on the main process. - if accelerator.is_main_process: - accelerator.init_trackers("text2image-fine-tune", config=vars(args)) - - # Train! - total_batch_size = args.train_batch_size * accelerator.num_processes * args.gradient_accumulation_steps - - logger.info("***** Running training *****") - logger.info(f" Num examples = {len(train_dataset)}") - logger.info(f" Num Epochs = {args.num_train_epochs}") - logger.info(f" Instantaneous batch size per device = {args.train_batch_size}") - logger.info(f" Total train batch size (w. 
parallel, distributed & accumulation) = {total_batch_size}") - logger.info(f" Gradient Accumulation steps = {args.gradient_accumulation_steps}") - logger.info(f" Total optimization steps = {args.max_train_steps}") - global_step = 0 - first_epoch = 0 - - # Potentially load in the weights and states from a previous save - if args.resume_from_checkpoint: - if args.resume_from_checkpoint != "latest": - path = os.path.basename(args.resume_from_checkpoint) - else: - # Get the most recent checkpoint - dirs = os.listdir(args.output_dir) - dirs = [d for d in dirs if d.startswith("checkpoint")] - dirs = sorted(dirs, key=lambda x: int(x.split("-")[1])) - path = dirs[-1] if len(dirs) > 0 else None - - if path is None: - accelerator.print( - f"Checkpoint '{args.resume_from_checkpoint}' does not exist. Starting a new training run." - ) - args.resume_from_checkpoint = None - else: - accelerator.print(f"Resuming from checkpoint {path}") - accelerator.load_state(os.path.join(args.output_dir, path)) - global_step = int(path.split("-")[1]) - - resume_global_step = global_step * args.gradient_accumulation_steps - first_epoch = global_step // num_update_steps_per_epoch - resume_step = resume_global_step % (num_update_steps_per_epoch * args.gradient_accumulation_steps) - - # Only show the progress bar once on each machine. 
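The step bookkeeping above (optimizer updates per epoch, total steps, and the resume offsets derived from a `checkpoint-N` directory name) is plain integer arithmetic and can be checked in isolation:

```python
import math

def training_step_math(num_batches, grad_accum, num_epochs, global_step):
    """Recompute the quantities from the script: optimizer steps per epoch,
    total steps, and where a resumed run lands inside its epoch."""
    updates_per_epoch = math.ceil(num_batches / grad_accum)
    max_train_steps = num_epochs * updates_per_epoch
    first_epoch = global_step // updates_per_epoch
    # resume_global_step counts micro-batches, not optimizer steps.
    resume_step = (global_step * grad_accum) % (updates_per_epoch * grad_accum)
    return updates_per_epoch, max_train_steps, first_epoch, resume_step

# 10 batches, accumulation of 4, 3 epochs, resuming from checkpoint-5.
print(training_step_math(num_batches=10, grad_accum=4, num_epochs=3, global_step=5))  # (3, 9, 1, 8)
```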
- progress_bar = tqdm(range(global_step, args.max_train_steps), disable=not accelerator.is_local_main_process) - progress_bar.set_description("Steps") - - for epoch in range(first_epoch, args.num_train_epochs): - unet.train() - train_loss = 0.0 - for step, batch in enumerate(train_dataloader): - # Skip steps until we reach the resumed step - if args.resume_from_checkpoint and epoch == first_epoch and step < resume_step: - if step % args.gradient_accumulation_steps == 0: - progress_bar.update(1) - continue - - with accelerator.accumulate(unet): - # Convert images to latent space - latents = vae.encode(batch["pixel_values"].to(dtype=weight_dtype)).latent_dist.sample() - latents = latents * vae.config.scaling_factor - - # Sample noise that we'll add to the latents - noise = torch.randn_like(latents) - if args.noise_offset: - # https://www.crosslabs.org//blog/diffusion-with-offset-noise - noise += args.noise_offset * torch.randn( - (latents.shape[0], latents.shape[1], 1, 1), device=latents.device - ) - - bsz = latents.shape[0] - # Sample a random timestep for each image - timesteps = torch.randint(0, noise_scheduler.config.num_train_timesteps, (bsz,), device=latents.device) - timesteps = timesteps.long() - - # Add noise to the latents according to the noise magnitude at each timestep - # (this is the forward diffusion process) - noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps) - - # Get the text embedding for conditioning - encoder_hidden_states = text_encoder(batch["input_ids"])[0] - - # Get the target for loss depending on the prediction type - if args.prediction_type is not None: - # set prediction_type of scheduler if defined - noise_scheduler.register_to_config(prediction_type=args.prediction_type) - - if noise_scheduler.config.prediction_type == "epsilon": - target = noise - elif noise_scheduler.config.prediction_type == "v_prediction": - target = noise_scheduler.get_velocity(latents, noise, timesteps) - else: - raise ValueError(f"Unknown 
prediction type {noise_scheduler.config.prediction_type}") - - # Predict the noise residual and compute loss - model_pred = unet(noisy_latents, timesteps, encoder_hidden_states).sample - - if args.snr_gamma is None: - loss = F.mse_loss(model_pred.float(), target.float(), reduction="mean") - else: - # Compute loss-weights as per Section 3.4 of https://arxiv.org/abs/2303.09556. - # Since we predict the noise instead of x_0, the original formulation is slightly changed. - # This is discussed in Section 4.2 of the same paper. - snr = compute_snr(timesteps) - mse_loss_weights = ( - torch.stack([snr, args.snr_gamma * torch.ones_like(timesteps)], dim=1).min(dim=1)[0] / snr - ) - # We first calculate the original loss. Then we mean over the non-batch dimensions and - # rebalance the sample-wise losses with their respective loss weights. - # Finally, we take the mean of the rebalanced loss. - loss = F.mse_loss(model_pred.float(), target.float(), reduction="none") - loss = loss.mean(dim=list(range(1, len(loss.shape)))) * mse_loss_weights - loss = loss.mean() - - # Gather the losses across all processes for logging (if we use distributed training). 
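The Min-SNR branch above weights each sample's MSE by `min(snr, gamma) / snr`, so already-easy (high-SNR, lightly noised) timesteps are down-weighted while noisy timesteps keep weight 1. The per-timestep weight on its own:

```python
def min_snr_weight(snr, gamma=5.0):
    # Same math as torch.stack([snr, gamma * ones]).min(dim=1)[0] / snr above.
    return min(snr, gamma) / snr

print(min_snr_weight(20.0))  # 0.25 -- easy timestep, strongly down-weighted
print(min_snr_weight(2.0))   # 1.0  -- snr below gamma, weight unchanged
```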
- avg_loss = accelerator.gather(loss.repeat(args.train_batch_size)).mean() - train_loss += avg_loss.item() / args.gradient_accumulation_steps - - # Backpropagate - accelerator.backward(loss) - if accelerator.sync_gradients: - params_to_clip = lora_layers.parameters() - accelerator.clip_grad_norm_(params_to_clip, args.max_grad_norm) - optimizer.step() - lr_scheduler.step() - optimizer.zero_grad() - - # Checks if the accelerator has performed an optimization step behind the scenes - if accelerator.sync_gradients: - progress_bar.update(1) - global_step += 1 - accelerator.log({"train_loss": train_loss}, step=global_step) - train_loss = 0.0 - - if global_step % args.checkpointing_steps == 0: - if accelerator.is_main_process: - # _before_ saving state, check if this save would set us over the `checkpoints_total_limit` - if args.checkpoints_total_limit is not None: - checkpoints = os.listdir(args.output_dir) - checkpoints = [d for d in checkpoints if d.startswith("checkpoint")] - checkpoints = sorted(checkpoints, key=lambda x: int(x.split("-")[1])) - - # before we save the new checkpoint, we need to have at _most_ `checkpoints_total_limit - 1` checkpoints - if len(checkpoints) >= args.checkpoints_total_limit: - num_to_remove = len(checkpoints) - args.checkpoints_total_limit + 1 - removing_checkpoints = checkpoints[0:num_to_remove] - - logger.info( - f"{len(checkpoints)} checkpoints already exist, removing {len(removing_checkpoints)} checkpoints" - ) - logger.info(f"removing checkpoints: {', '.join(removing_checkpoints)}") - - for removing_checkpoint in removing_checkpoints: - removing_checkpoint = os.path.join(args.output_dir, removing_checkpoint) - shutil.rmtree(removing_checkpoint) - - save_path = os.path.join(args.output_dir, f"checkpoint-{global_step}") - accelerator.save_state(save_path) - logger.info(f"Saved state to {save_path}") - - logs = {"step_loss": loss.detach().item(), "lr": lr_scheduler.get_last_lr()[0]} - progress_bar.set_postfix(**logs) - - if global_step 
>= args.max_train_steps: - break - - if accelerator.is_main_process: - if args.validation_prompt is not None and epoch % args.validation_epochs == 0: - logger.info( - f"Running validation... \n Generating {args.num_validation_images} images with prompt:" - f" {args.validation_prompt}." - ) - # create pipeline - pipeline = DiffusionPipeline.from_pretrained( - args.pretrained_model_name_or_path, - unet=accelerator.unwrap_model(unet), - revision=args.revision, - torch_dtype=weight_dtype, - ) - pipeline = pipeline.to(accelerator.device) - pipeline.set_progress_bar_config(disable=True) - - # run inference - generator = torch.Generator(device=accelerator.device) - if args.seed is not None: - generator = generator.manual_seed(args.seed) - images = [] - for _ in range(args.num_validation_images): - images.append( - pipeline(args.validation_prompt, num_inference_steps=30, generator=generator).images[0] - ) - - for tracker in accelerator.trackers: - if tracker.name == "tensorboard": - np_images = np.stack([np.asarray(img) for img in images]) - tracker.writer.add_images("validation", np_images, epoch, dataformats="NHWC") - if tracker.name == "wandb": - tracker.log( - { - "validation": [ - wandb.Image(image, caption=f"{i}: {args.validation_prompt}") - for i, image in enumerate(images) - ] - } - ) - - del pipeline - torch.cuda.empty_cache() - - # Save the lora layers - accelerator.wait_for_everyone() - if accelerator.is_main_process: - unet = unet.to(torch.float32) - unet.save_attn_procs(args.output_dir) - - if args.push_to_hub: - save_model_card( - repo_id, - images=images, - base_model=args.pretrained_model_name_or_path, - dataset_name=args.dataset_name, - repo_folder=args.output_dir, - ) - upload_folder( - repo_id=repo_id, - folder_path=args.output_dir, - commit_message="End of training", - ignore_patterns=["step_*", "epoch_*"], - ) - - # Final inference - # Load previous pipeline - pipeline = DiffusionPipeline.from_pretrained( - args.pretrained_model_name_or_path, 
revision=args.revision, torch_dtype=weight_dtype - ) - pipeline = pipeline.to(accelerator.device) - - # load attention processors - pipeline.unet.load_attn_procs(args.output_dir) - - # run inference - generator = torch.Generator(device=accelerator.device) - if args.seed is not None: - generator = generator.manual_seed(args.seed) - images = [] - for _ in range(args.num_validation_images): - images.append(pipeline(args.validation_prompt, num_inference_steps=30, generator=generator).images[0]) - - if accelerator.is_main_process: - for tracker in accelerator.trackers: - if len(images) != 0: - if tracker.name == "tensorboard": - np_images = np.stack([np.asarray(img) for img in images]) - tracker.writer.add_images("test", np_images, epoch, dataformats="NHWC") - if tracker.name == "wandb": - tracker.log( - { - "test": [ - wandb.Image(image, caption=f"{i}: {args.validation_prompt}") - for i, image in enumerate(images) - ] - } - ) - - accelerator.end_training() - - -if __name__ == "__main__": - main() diff --git a/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/tests/fixtures/custom_pipeline/what_ever.py b/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/tests/fixtures/custom_pipeline/what_ever.py deleted file mode 100644 index 494c5a1a4e95158dafb0959fa65c228461a2e069..0000000000000000000000000000000000000000 --- a/spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/tests/fixtures/custom_pipeline/what_ever.py +++ /dev/null @@ -1,101 +0,0 @@ -# Copyright 2023 The HuggingFace Team. All rights reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
-# See the License for the specific language governing permissions and - -# limitations under the License. - - -from typing import Optional, Tuple, Union - -import torch - -from diffusers.pipeline_utils import DiffusionPipeline, ImagePipelineOutput - - -class CustomLocalPipeline(DiffusionPipeline): - r""" - This model inherits from [`DiffusionPipeline`]. Check the superclass documentation for the generic methods the - library implements for all the pipelines (such as downloading or saving, running on a particular device, etc.) - - Parameters: - unet ([`UNet2DModel`]): U-Net architecture to denoise the encoded image. - scheduler ([`SchedulerMixin`]): - A scheduler to be used in combination with `unet` to denoise the encoded image. Can be one of - [`DDPMScheduler`], or [`DDIMScheduler`]. - """ - - def __init__(self, unet, scheduler): - super().__init__() - self.register_modules(unet=unet, scheduler=scheduler) - - @torch.no_grad() - def __call__( - self, - batch_size: int = 1, - generator: Optional[torch.Generator] = None, - num_inference_steps: int = 50, - output_type: Optional[str] = "pil", - return_dict: bool = True, - **kwargs, - ) -> Union[ImagePipelineOutput, Tuple]: - r""" - Args: - batch_size (`int`, *optional*, defaults to 1): - The number of images to generate. - generator (`torch.Generator`, *optional*): - A [torch generator](https://pytorch.org/docs/stable/generated/torch.Generator.html) to make generation - deterministic. - eta (`float`, *optional*, defaults to 0.0): - The eta parameter which controls the scale of the variance (0 is DDIM and 1 is one type of DDPM). - num_inference_steps (`int`, *optional*, defaults to 50): - The number of denoising steps. More denoising steps usually lead to a higher quality image at the - expense of slower inference. - output_type (`str`, *optional*, defaults to `"pil"`): - The output format of the generate image. Choose between - [PIL](https://pillow.readthedocs.io/en/stable/): `PIL.Image.Image` or `np.array`. 
- return_dict (`bool`, *optional*, defaults to `True`): - Whether or not to return a [`~pipeline_utils.ImagePipelineOutput`] instead of a plain tuple. - - Returns: - [`~pipeline_utils.ImagePipelineOutput`] or `tuple`: [`~pipelines.utils.ImagePipelineOutput`] if - `return_dict` is True, otherwise a `tuple. When returning a tuple, the first element is a list with the - generated images. - """ - - # Sample gaussian noise to begin loop - image = torch.randn( - (batch_size, self.unet.config.in_channels, self.unet.config.sample_size, self.unet.config.sample_size), - generator=generator, - ) - image = image.to(self.device) - - # set step values - self.scheduler.set_timesteps(num_inference_steps) - - for t in self.progress_bar(self.scheduler.timesteps): - # 1. predict noise model_output - model_output = self.unet(image, t).sample - - # 2. predict previous mean of image x_t-1 and add variance depending on eta - # eta corresponds to η in paper and should be between [0, 1] - # do x_t -> x_t-1 - image = self.scheduler.step(model_output, t, image).prev_sample - - image = (image / 2 + 0.5).clamp(0, 1) - image = image.cpu().permute(0, 2, 3, 1).numpy() - if output_type == "pil": - image = self.numpy_to_pil(image) - - if not return_dict: - return (image,), "This is a local test" - - return ImagePipelineOutput(images=image), "This is a local test" diff --git a/spaces/Andy1621/uniformer_image_detection/configs/_base_/models/cascade_rcnn_r50_fpn.py b/spaces/Andy1621/uniformer_image_detection/configs/_base_/models/cascade_rcnn_r50_fpn.py deleted file mode 100644 index cde2a96c41b49d463a480e505ff5abe04a68db32..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/configs/_base_/models/cascade_rcnn_r50_fpn.py +++ /dev/null @@ -1,179 +0,0 @@ -# model settings -model = dict( - type='CascadeRCNN', - pretrained='torchvision://resnet50', - backbone=dict( - type='ResNet', - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - 
norm_cfg=dict(type='BN', requires_grad=True), - norm_eval=True, - style='pytorch'), - neck=dict( - type='FPN', - in_channels=[256, 512, 1024, 2048], - out_channels=256, - num_outs=5), - rpn_head=dict( - type='RPNHead', - in_channels=256, - feat_channels=256, - anchor_generator=dict( - type='AnchorGenerator', - scales=[8], - ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64]), - bbox_coder=dict( - type='DeltaXYWHBBoxCoder', - target_means=[.0, .0, .0, .0], - target_stds=[1.0, 1.0, 1.0, 1.0]), - loss_cls=dict( - type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0), - loss_bbox=dict(type='SmoothL1Loss', beta=1.0 / 9.0, loss_weight=1.0)), - roi_head=dict( - type='CascadeRoIHead', - num_stages=3, - stage_loss_weights=[1, 0.5, 0.25], - bbox_roi_extractor=dict( - type='SingleRoIExtractor', - roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32]), - bbox_head=[ - dict( - type='Shared2FCBBoxHead', - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type='DeltaXYWHBBoxCoder', - target_means=[0., 0., 0., 0.], - target_stds=[0.1, 0.1, 0.2, 0.2]), - reg_class_agnostic=True, - loss_cls=dict( - type='CrossEntropyLoss', - use_sigmoid=False, - loss_weight=1.0), - loss_bbox=dict(type='SmoothL1Loss', beta=1.0, - loss_weight=1.0)), - dict( - type='Shared2FCBBoxHead', - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type='DeltaXYWHBBoxCoder', - target_means=[0., 0., 0., 0.], - target_stds=[0.05, 0.05, 0.1, 0.1]), - reg_class_agnostic=True, - loss_cls=dict( - type='CrossEntropyLoss', - use_sigmoid=False, - loss_weight=1.0), - loss_bbox=dict(type='SmoothL1Loss', beta=1.0, - loss_weight=1.0)), - dict( - type='Shared2FCBBoxHead', - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type='DeltaXYWHBBoxCoder', - target_means=[0., 0., 0., 0.], - target_stds=[0.033, 0.033, 
0.067, 0.067]), - reg_class_agnostic=True, - loss_cls=dict( - type='CrossEntropyLoss', - use_sigmoid=False, - loss_weight=1.0), - loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0)) - ]), - # model training and testing settings - train_cfg=dict( - rpn=dict( - assigner=dict( - type='MaxIoUAssigner', - pos_iou_thr=0.7, - neg_iou_thr=0.3, - min_pos_iou=0.3, - match_low_quality=True, - ignore_iof_thr=-1), - sampler=dict( - type='RandomSampler', - num=256, - pos_fraction=0.5, - neg_pos_ub=-1, - add_gt_as_proposals=False), - allowed_border=0, - pos_weight=-1, - debug=False), - rpn_proposal=dict( - nms_pre=2000, - max_per_img=2000, - nms=dict(type='nms', iou_threshold=0.7), - min_bbox_size=0), - rcnn=[ - dict( - assigner=dict( - type='MaxIoUAssigner', - pos_iou_thr=0.5, - neg_iou_thr=0.5, - min_pos_iou=0.5, - match_low_quality=False, - ignore_iof_thr=-1), - sampler=dict( - type='RandomSampler', - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True), - pos_weight=-1, - debug=False), - dict( - assigner=dict( - type='MaxIoUAssigner', - pos_iou_thr=0.6, - neg_iou_thr=0.6, - min_pos_iou=0.6, - match_low_quality=False, - ignore_iof_thr=-1), - sampler=dict( - type='RandomSampler', - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True), - pos_weight=-1, - debug=False), - dict( - assigner=dict( - type='MaxIoUAssigner', - pos_iou_thr=0.7, - neg_iou_thr=0.7, - min_pos_iou=0.7, - match_low_quality=False, - ignore_iof_thr=-1), - sampler=dict( - type='RandomSampler', - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True), - pos_weight=-1, - debug=False) - ]), - test_cfg=dict( - rpn=dict( - nms_pre=1000, - max_per_img=1000, - nms=dict(type='nms', iou_threshold=0.7), - min_bbox_size=0), - rcnn=dict( - score_thr=0.05, - nms=dict(type='nms', iou_threshold=0.5), - max_per_img=100))) diff --git a/spaces/Andy1621/uniformer_image_detection/configs/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco.py 
b/spaces/Andy1621/uniformer_image_detection/configs/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco.py deleted file mode 100644 index 4535034efa3f4c4a09064a753a2bbde68b6cd2f2..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/configs/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco.py +++ /dev/null @@ -1,9 +0,0 @@ -_base_ = 'faster_rcnn_r50_fpg_crop640_50e_coco.py' - -norm_cfg = dict(type='BN', requires_grad=True) -model = dict( - neck=dict(out_channels=128, inter_channels=128), - rpn_head=dict(in_channels=128), - roi_head=dict( - bbox_roi_extractor=dict(out_channels=128), - bbox_head=dict(in_channels=128))) diff --git a/spaces/Andy1621/uniformer_image_detection/mmdet/core/bbox/coder/base_bbox_coder.py b/spaces/Andy1621/uniformer_image_detection/mmdet/core/bbox/coder/base_bbox_coder.py deleted file mode 100644 index cf0b34c7cc2fe561718b0c884990beb40a993643..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/mmdet/core/bbox/coder/base_bbox_coder.py +++ /dev/null @@ -1,17 +0,0 @@ -from abc import ABCMeta, abstractmethod - - -class BaseBBoxCoder(metaclass=ABCMeta): - """Base bounding box coder.""" - - def __init__(self, **kwargs): - pass - - @abstractmethod - def encode(self, bboxes, gt_bboxes): - """Encode deltas between bboxes and ground truth boxes.""" - - @abstractmethod - def decode(self, bboxes, bboxes_pred): - """Decode the predicted bboxes according to prediction and base - boxes.""" diff --git a/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/__init__.py b/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/__init__.py deleted file mode 100644 index c6f424debd1623e7511dd77da464a6639d816745..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/__init__.py +++ /dev/null @@ -1,25 +0,0 @@ -from .auto_augment import (AutoAugment, BrightnessTransform, ColorTransform, - ContrastTransform, 
EqualizeTransform, Rotate, Shear, - Translate) -from .compose import Compose -from .formating import (Collect, DefaultFormatBundle, ImageToTensor, - ToDataContainer, ToTensor, Transpose, to_tensor) -from .instaboost import InstaBoost -from .loading import (LoadAnnotations, LoadImageFromFile, LoadImageFromWebcam, - LoadMultiChannelImageFromFiles, LoadProposals) -from .test_time_aug import MultiScaleFlipAug -from .transforms import (Albu, CutOut, Expand, MinIoURandomCrop, Normalize, - Pad, PhotoMetricDistortion, RandomCenterCropPad, - RandomCrop, RandomFlip, Resize, SegRescale) - -__all__ = [ - 'Compose', 'to_tensor', 'ToTensor', 'ImageToTensor', 'ToDataContainer', - 'Transpose', 'Collect', 'DefaultFormatBundle', 'LoadAnnotations', - 'LoadImageFromFile', 'LoadImageFromWebcam', - 'LoadMultiChannelImageFromFiles', 'LoadProposals', 'MultiScaleFlipAug', - 'Resize', 'RandomFlip', 'Pad', 'RandomCrop', 'Normalize', 'SegRescale', - 'MinIoURandomCrop', 'Expand', 'PhotoMetricDistortion', 'Albu', - 'InstaBoost', 'RandomCenterCropPad', 'AutoAugment', 'CutOut', 'Shear', - 'Rotate', 'ColorTransform', 'EqualizeTransform', 'BrightnessTransform', - 'ContrastTransform', 'Translate' -] diff --git a/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/transforms.py b/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/transforms.py deleted file mode 100644 index caed51d89ffc1259d0b086954f03c3d4c0749cf2..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/mmdet/datasets/pipelines/transforms.py +++ /dev/null @@ -1,1811 +0,0 @@ -import copy -import inspect - -import mmcv -import numpy as np -from numpy import random - -from mmdet.core import PolygonMasks -from mmdet.core.evaluation.bbox_overlaps import bbox_overlaps -from ..builder import PIPELINES - -try: - from imagecorruptions import corrupt -except ImportError: - corrupt = None - -try: - import albumentations - from albumentations import Compose -except ImportError: - 
albumentations = None - Compose = None - - -@PIPELINES.register_module() -class Resize(object): - """Resize images & bbox & mask. - - This transform resizes the input image to some scale. Bboxes and masks are - then resized with the same scale factor. If the input dict contains the key - "scale", then the scale in the input dict is used, otherwise the specified - scale in the init method is used. If the input dict contains the key - "scale_factor" (if MultiScaleFlipAug does not give img_scale but - scale_factor), the actual scale will be computed by image shape and - scale_factor. - - `img_scale` can either be a tuple (single-scale) or a list of tuple - (multi-scale). There are 3 multiscale modes: - - - ``ratio_range is not None``: randomly sample a ratio from the ratio \ - range and multiply it with the image scale. - - ``ratio_range is None`` and ``multiscale_mode == "range"``: randomly \ - sample a scale from the multiscale range. - - ``ratio_range is None`` and ``multiscale_mode == "value"``: randomly \ - sample a scale from multiple scales. - - Args: - img_scale (tuple or list[tuple]): Image scales for resizing. - multiscale_mode (str): Either "range" or "value". - ratio_range (tuple[float]): (min_ratio, max_ratio) - keep_ratio (bool): Whether to keep the aspect ratio when resizing the - image. - bbox_clip_border (bool, optional): Whether to clip objects outside - the border of the image. Defaults to True. - backend (str): Image resize backend, choices are 'cv2' and 'pillow'. - The two backends generate slightly different results. Defaults - to 'cv2'. - override (bool, optional): Whether to override `scale` and - `scale_factor` so as to call resize twice. Defaults to False. - If True, after the first resizing, the existing `scale` and - `scale_factor` will be ignored so that a second resizing can - be performed. This option is a workaround for resizing multiple - times in DETR. 
- """ - - def __init__(self, - img_scale=None, - multiscale_mode='range', - ratio_range=None, - keep_ratio=True, - bbox_clip_border=True, - backend='cv2', - override=False): - if img_scale is None: - self.img_scale = None - else: - if isinstance(img_scale, list): - self.img_scale = img_scale - else: - self.img_scale = [img_scale] - assert mmcv.is_list_of(self.img_scale, tuple) - - if ratio_range is not None: - # mode 1: given a scale and a range of image ratio - assert len(self.img_scale) == 1 - else: - # mode 2: given multiple scales or a range of scales - assert multiscale_mode in ['value', 'range'] - - self.backend = backend - self.multiscale_mode = multiscale_mode - self.ratio_range = ratio_range - self.keep_ratio = keep_ratio - # TODO: refactor the override option in Resize - self.override = override - self.bbox_clip_border = bbox_clip_border - - @staticmethod - def random_select(img_scales): - """Randomly select an img_scale from given candidates. - - Args: - img_scales (list[tuple]): Images scales for selection. - - Returns: - (tuple, int): Returns a tuple ``(img_scale, scale_dix)``, \ - where ``img_scale`` is the selected image scale and \ - ``scale_idx`` is the selected index in the given candidates. - """ - - assert mmcv.is_list_of(img_scales, tuple) - scale_idx = np.random.randint(len(img_scales)) - img_scale = img_scales[scale_idx] - return img_scale, scale_idx - - @staticmethod - def random_sample(img_scales): - """Randomly sample an img_scale when ``multiscale_mode=='range'``. - - Args: - img_scales (list[tuple]): Images scale range for sampling. - There must be two tuples in img_scales, which specify the lower - and upper bound of image scales. - - Returns: - (tuple, None): Returns a tuple ``(img_scale, None)``, where \ - ``img_scale`` is sampled scale and None is just a placeholder \ - to be consistent with :func:`random_select`. 
- """ - - assert mmcv.is_list_of(img_scales, tuple) and len(img_scales) == 2 - img_scale_long = [max(s) for s in img_scales] - img_scale_short = [min(s) for s in img_scales] - long_edge = np.random.randint( - min(img_scale_long), - max(img_scale_long) + 1) - short_edge = np.random.randint( - min(img_scale_short), - max(img_scale_short) + 1) - img_scale = (long_edge, short_edge) - return img_scale, None - - @staticmethod - def random_sample_ratio(img_scale, ratio_range): - """Randomly sample an img_scale when ``ratio_range`` is specified. - - A ratio will be randomly sampled from the range specified by - ``ratio_range``. Then it would be multiplied with ``img_scale`` to - generate sampled scale. - - Args: - img_scale (tuple): Images scale base to multiply with ratio. - ratio_range (tuple[float]): The minimum and maximum ratio to scale - the ``img_scale``. - - Returns: - (tuple, None): Returns a tuple ``(scale, None)``, where \ - ``scale`` is sampled ratio multiplied with ``img_scale`` and \ - None is just a placeholder to be consistent with \ - :func:`random_select`. - """ - - assert isinstance(img_scale, tuple) and len(img_scale) == 2 - min_ratio, max_ratio = ratio_range - assert min_ratio <= max_ratio - ratio = np.random.random_sample() * (max_ratio - min_ratio) + min_ratio - scale = int(img_scale[0] * ratio), int(img_scale[1] * ratio) - return scale, None - - def _random_scale(self, results): - """Randomly sample an img_scale according to ``ratio_range`` and - ``multiscale_mode``. - - If ``ratio_range`` is specified, a ratio will be sampled and be - multiplied with ``img_scale``. - If multiple scales are specified by ``img_scale``, a scale will be - sampled according to ``multiscale_mode``. - Otherwise, single scale will be used. - - Args: - results (dict): Result dict from :obj:`dataset`. - - Returns: - dict: Two new keys 'scale` and 'scale_idx` are added into \ - ``results``, which would be used by subsequent pipelines. 
- """ - - if self.ratio_range is not None: - scale, scale_idx = self.random_sample_ratio( - self.img_scale[0], self.ratio_range) - elif len(self.img_scale) == 1: - scale, scale_idx = self.img_scale[0], 0 - elif self.multiscale_mode == 'range': - scale, scale_idx = self.random_sample(self.img_scale) - elif self.multiscale_mode == 'value': - scale, scale_idx = self.random_select(self.img_scale) - else: - raise NotImplementedError - - results['scale'] = scale - results['scale_idx'] = scale_idx - - def _resize_img(self, results): - """Resize images with ``results['scale']``.""" - for key in results.get('img_fields', ['img']): - if self.keep_ratio: - img, scale_factor = mmcv.imrescale( - results[key], - results['scale'], - return_scale=True, - backend=self.backend) - # the w_scale and h_scale has minor difference - # a real fix should be done in the mmcv.imrescale in the future - new_h, new_w = img.shape[:2] - h, w = results[key].shape[:2] - w_scale = new_w / w - h_scale = new_h / h - else: - img, w_scale, h_scale = mmcv.imresize( - results[key], - results['scale'], - return_scale=True, - backend=self.backend) - results[key] = img - - scale_factor = np.array([w_scale, h_scale, w_scale, h_scale], - dtype=np.float32) - results['img_shape'] = img.shape - # in case that there is no padding - results['pad_shape'] = img.shape - results['scale_factor'] = scale_factor - results['keep_ratio'] = self.keep_ratio - - def _resize_bboxes(self, results): - """Resize bounding boxes with ``results['scale_factor']``.""" - for key in results.get('bbox_fields', []): - bboxes = results[key] * results['scale_factor'] - if self.bbox_clip_border: - img_shape = results['img_shape'] - bboxes[:, 0::2] = np.clip(bboxes[:, 0::2], 0, img_shape[1]) - bboxes[:, 1::2] = np.clip(bboxes[:, 1::2], 0, img_shape[0]) - results[key] = bboxes - - def _resize_masks(self, results): - """Resize masks with ``results['scale']``""" - for key in results.get('mask_fields', []): - if results[key] is None: - continue - 
if self.keep_ratio: - results[key] = results[key].rescale(results['scale']) - else: - results[key] = results[key].resize(results['img_shape'][:2]) - - def _resize_seg(self, results): - """Resize semantic segmentation map with ``results['scale']``.""" - for key in results.get('seg_fields', []): - if self.keep_ratio: - gt_seg = mmcv.imrescale( - results[key], - results['scale'], - interpolation='nearest', - backend=self.backend) - else: - gt_seg = mmcv.imresize( - results[key], - results['scale'], - interpolation='nearest', - backend=self.backend) - # write back to the field being iterated, not a hard-coded key - results[key] = gt_seg - - def __call__(self, results): - """Call function to resize images, bounding boxes, masks, semantic - segmentation map. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Resized results, 'img_shape', 'pad_shape', 'scale_factor', \ - 'keep_ratio' keys are added into result dict. - """ - - if 'scale' not in results: - if 'scale_factor' in results: - img_shape = results['img'].shape[:2] - scale_factor = results['scale_factor'] - assert isinstance(scale_factor, float) - results['scale'] = tuple( - [int(x * scale_factor) for x in img_shape][::-1]) - else: - self._random_scale(results) - else: - if not self.override: - assert 'scale_factor' not in results, ( - 'scale and scale_factor cannot both be set.') - else: - results.pop('scale') - if 'scale_factor' in results: - results.pop('scale_factor') - self._random_scale(results) - - self._resize_img(results) - self._resize_bboxes(results) - self._resize_masks(results) - self._resize_seg(results) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(img_scale={self.img_scale}, ' - repr_str += f'multiscale_mode={self.multiscale_mode}, ' - repr_str += f'ratio_range={self.ratio_range}, ' - repr_str += f'keep_ratio={self.keep_ratio}, ' - repr_str += f'bbox_clip_border={self.bbox_clip_border})' - return repr_str - - -@PIPELINES.register_module() -class RandomFlip(object): - 
"""Flip the image & bbox & mask. - - If the input dict contains the key "flip", then the flag will be used, - otherwise it will be randomly decided by a ratio specified in the init - method. - - When random flip is enabled, ``flip_ratio``/``direction`` can either be a - float/string or tuple of float/string. There are 3 flip modes: - - - ``flip_ratio`` is float, ``direction`` is string: the image will be - ``direction``ly flipped with probability of ``flip_ratio`` . - E.g., ``flip_ratio=0.5``, ``direction='horizontal'``, - then image will be horizontally flipped with probability of 0.5. - - ``flip_ratio`` is float, ``direction`` is list of string: the image wil - be ``direction[i]``ly flipped with probability of - ``flip_ratio/len(direction)``. - E.g., ``flip_ratio=0.5``, ``direction=['horizontal', 'vertical']``, - then image will be horizontally flipped with probability of 0.25, - vertically with probability of 0.25. - - ``flip_ratio`` is list of float, ``direction`` is list of string: - given ``len(flip_ratio) == len(direction)``, the image wil - be ``direction[i]``ly flipped with probability of ``flip_ratio[i]``. - E.g., ``flip_ratio=[0.3, 0.5]``, ``direction=['horizontal', - 'vertical']``, then image will be horizontally flipped with probability - of 0.3, vertically with probability of 0.5 - - Args: - flip_ratio (float | list[float], optional): The flipping probability. - Default: None. - direction(str | list[str], optional): The flipping direction. Options - are 'horizontal', 'vertical', 'diagonal'. Default: 'horizontal'. - If input is a list, the length must equal ``flip_ratio``. Each - element in ``flip_ratio`` indicates the flip probability of - corresponding direction. 
- """ - - def __init__(self, flip_ratio=None, direction='horizontal'): - if isinstance(flip_ratio, list): - assert mmcv.is_list_of(flip_ratio, float) - assert 0 <= sum(flip_ratio) <= 1 - elif isinstance(flip_ratio, float): - assert 0 <= flip_ratio <= 1 - elif flip_ratio is None: - pass - else: - raise ValueError('flip_ratios must be None, float, ' - 'or list of float') - self.flip_ratio = flip_ratio - - valid_directions = ['horizontal', 'vertical', 'diagonal'] - if isinstance(direction, str): - assert direction in valid_directions - elif isinstance(direction, list): - assert mmcv.is_list_of(direction, str) - assert set(direction).issubset(set(valid_directions)) - else: - raise ValueError('direction must be either str or list of str') - self.direction = direction - - if isinstance(flip_ratio, list): - assert len(self.flip_ratio) == len(self.direction) - - def bbox_flip(self, bboxes, img_shape, direction): - """Flip bboxes horizontally. - - Args: - bboxes (numpy.ndarray): Bounding boxes, shape (..., 4*k) - img_shape (tuple[int]): Image shape (height, width) - direction (str): Flip direction. Options are 'horizontal', - 'vertical'. - - Returns: - numpy.ndarray: Flipped bounding boxes. 
- """ - - assert bboxes.shape[-1] % 4 == 0 - flipped = bboxes.copy() - if direction == 'horizontal': - w = img_shape[1] - flipped[..., 0::4] = w - bboxes[..., 2::4] - flipped[..., 2::4] = w - bboxes[..., 0::4] - elif direction == 'vertical': - h = img_shape[0] - flipped[..., 1::4] = h - bboxes[..., 3::4] - flipped[..., 3::4] = h - bboxes[..., 1::4] - elif direction == 'diagonal': - w = img_shape[1] - h = img_shape[0] - flipped[..., 0::4] = w - bboxes[..., 2::4] - flipped[..., 1::4] = h - bboxes[..., 3::4] - flipped[..., 2::4] = w - bboxes[..., 0::4] - flipped[..., 3::4] = h - bboxes[..., 1::4] - else: - raise ValueError(f"Invalid flipping direction '{direction}'") - return flipped - - def __call__(self, results): - """Call function to flip bounding boxes, masks, semantic segmentation - maps. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Flipped results, 'flip', 'flip_direction' keys are added \ - into result dict. - """ - - if 'flip' not in results: - if isinstance(self.direction, list): - # None means non-flip - direction_list = self.direction + [None] - else: - # None means non-flip - direction_list = [self.direction, None] - - if isinstance(self.flip_ratio, list): - non_flip_ratio = 1 - sum(self.flip_ratio) - flip_ratio_list = self.flip_ratio + [non_flip_ratio] - else: - non_flip_ratio = 1 - self.flip_ratio - # exclude non-flip - single_ratio = self.flip_ratio / (len(direction_list) - 1) - flip_ratio_list = [single_ratio] * (len(direction_list) - - 1) + [non_flip_ratio] - - cur_dir = np.random.choice(direction_list, p=flip_ratio_list) - - results['flip'] = cur_dir is not None - if 'flip_direction' not in results: - results['flip_direction'] = cur_dir - if results['flip']: - # flip image - for key in results.get('img_fields', ['img']): - results[key] = mmcv.imflip( - results[key], direction=results['flip_direction']) - # flip bboxes - for key in results.get('bbox_fields', []): - results[key] = self.bbox_flip(results[key], - 
results['img_shape'], - results['flip_direction']) - # flip masks - for key in results.get('mask_fields', []): - results[key] = results[key].flip(results['flip_direction']) - - # flip segs - for key in results.get('seg_fields', []): - results[key] = mmcv.imflip( - results[key], direction=results['flip_direction']) - return results - - def __repr__(self): - return self.__class__.__name__ + f'(flip_ratio={self.flip_ratio})' - - -@PIPELINES.register_module() -class Pad(object): - """Pad the image & mask. - - There are two padding modes: (1) pad to a fixed size and (2) pad to the - minimum size that is divisible by some number. - Added keys are "pad_shape", "pad_fixed_size", "pad_size_divisor", - - Args: - size (tuple, optional): Fixed padding size. - size_divisor (int, optional): The divisor of padded size. - pad_val (float, optional): Padding value, 0 by default. - """ - - def __init__(self, size=None, size_divisor=None, pad_val=0): - self.size = size - self.size_divisor = size_divisor - self.pad_val = pad_val - # only one of size and size_divisor should be valid - assert size is not None or size_divisor is not None - assert size is None or size_divisor is None - - def _pad_img(self, results): - """Pad images according to ``self.size``.""" - for key in results.get('img_fields', ['img']): - if self.size is not None: - padded_img = mmcv.impad( - results[key], shape=self.size, pad_val=self.pad_val) - elif self.size_divisor is not None: - padded_img = mmcv.impad_to_multiple( - results[key], self.size_divisor, pad_val=self.pad_val) - results[key] = padded_img - results['pad_shape'] = padded_img.shape - results['pad_fixed_size'] = self.size - results['pad_size_divisor'] = self.size_divisor - - def _pad_masks(self, results): - """Pad masks according to ``results['pad_shape']``.""" - pad_shape = results['pad_shape'][:2] - for key in results.get('mask_fields', []): - results[key] = results[key].pad(pad_shape, pad_val=self.pad_val) - - def _pad_seg(self, results): - """Pad 
semantic segmentation map according to - ``results['pad_shape']``.""" - for key in results.get('seg_fields', []): - results[key] = mmcv.impad( - results[key], shape=results['pad_shape'][:2]) - - def __call__(self, results): - """Call function to pad images, masks, semantic segmentation maps. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Updated result dict. - """ - self._pad_img(results) - self._pad_masks(results) - self._pad_seg(results) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(size={self.size}, ' - repr_str += f'size_divisor={self.size_divisor}, ' - repr_str += f'pad_val={self.pad_val})' - return repr_str - - -@PIPELINES.register_module() -class Normalize(object): - """Normalize the image. - - Added key is "img_norm_cfg". - - Args: - mean (sequence): Mean values of 3 channels. - std (sequence): Std values of 3 channels. - to_rgb (bool): Whether to convert the image from BGR to RGB, - default is true. - """ - - def __init__(self, mean, std, to_rgb=True): - self.mean = np.array(mean, dtype=np.float32) - self.std = np.array(std, dtype=np.float32) - self.to_rgb = to_rgb - - def __call__(self, results): - """Call function to normalize images. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Normalized results, 'img_norm_cfg' key is added into - result dict. - """ - for key in results.get('img_fields', ['img']): - results[key] = mmcv.imnormalize(results[key], self.mean, self.std, - self.to_rgb) - results['img_norm_cfg'] = dict( - mean=self.mean, std=self.std, to_rgb=self.to_rgb) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(mean={self.mean}, std={self.std}, to_rgb={self.to_rgb})' - return repr_str - - -@PIPELINES.register_module() -class RandomCrop(object): - """Random crop the image & bboxes & masks. 
- - The absolute `crop_size` is sampled based on `crop_type` and `image_size`, - then the cropped results are generated. - - Args: - crop_size (tuple): The relative ratio or absolute pixels of - height and width. - crop_type (str, optional): one of "relative_range", "relative", - "absolute", "absolute_range". "relative" randomly crops - (h * crop_size[0], w * crop_size[1]) part from an input of size - (h, w). "relative_range" uniformly samples relative crop size from - range [crop_size[0], 1] and [crop_size[1], 1] for height and width - respectively. "absolute" crops from an input with absolute size - (crop_size[0], crop_size[1]). "absolute_range" uniformly samples - crop_h in range [crop_size[0], min(h, crop_size[1])] and crop_w - in range [crop_size[0], min(w, crop_size[1])]. Default "absolute". - allow_negative_crop (bool, optional): Whether to allow a crop that does - not contain any bbox area. Default False. - bbox_clip_border (bool, optional): Whether clip the objects outside - the border of the image. Defaults to True. - - Note: - - If the image is smaller than the absolute crop size, return the - original image. - - The keys for bboxes, labels and masks must be aligned. That is, - `gt_bboxes` corresponds to `gt_labels` and `gt_masks`, and - `gt_bboxes_ignore` corresponds to `gt_labels_ignore` and - `gt_masks_ignore`. - - If the crop does not contain any gt-bbox region and - `allow_negative_crop` is set to False, skip this image. 
- """ - - def __init__(self, - crop_size, - crop_type='absolute', - allow_negative_crop=False, - bbox_clip_border=True): - if crop_type not in [ - 'relative_range', 'relative', 'absolute', 'absolute_range' - ]: - raise ValueError(f'Invalid crop_type {crop_type}.') - if crop_type in ['absolute', 'absolute_range']: - assert crop_size[0] > 0 and crop_size[1] > 0 - assert isinstance(crop_size[0], int) and isinstance( - crop_size[1], int) - else: - assert 0 < crop_size[0] <= 1 and 0 < crop_size[1] <= 1 - self.crop_size = crop_size - self.crop_type = crop_type - self.allow_negative_crop = allow_negative_crop - self.bbox_clip_border = bbox_clip_border - # The key correspondence from bboxes to labels and masks. - self.bbox2label = { - 'gt_bboxes': 'gt_labels', - 'gt_bboxes_ignore': 'gt_labels_ignore' - } - self.bbox2mask = { - 'gt_bboxes': 'gt_masks', - 'gt_bboxes_ignore': 'gt_masks_ignore' - } - - def _crop_data(self, results, crop_size, allow_negative_crop): - """Function to randomly crop images, bounding boxes, masks, semantic - segmentation maps. - - Args: - results (dict): Result dict from loading pipeline. - crop_size (tuple): Expected absolute size after cropping, (h, w). - allow_negative_crop (bool): Whether to allow a crop that does not - contain any bbox area. Default to False. - - Returns: - dict: Randomly cropped results, 'img_shape' key in result dict is - updated according to crop size. - """ - assert crop_size[0] > 0 and crop_size[1] > 0 - for key in results.get('img_fields', ['img']): - img = results[key] - margin_h = max(img.shape[0] - crop_size[0], 0) - margin_w = max(img.shape[1] - crop_size[1], 0) - offset_h = np.random.randint(0, margin_h + 1) - offset_w = np.random.randint(0, margin_w + 1) - crop_y1, crop_y2 = offset_h, offset_h + crop_size[0] - crop_x1, crop_x2 = offset_w, offset_w + crop_size[1] - - # crop the image - img = img[crop_y1:crop_y2, crop_x1:crop_x2, ...] 
- img_shape = img.shape - results[key] = img - results['img_shape'] = img_shape - - # crop bboxes accordingly and clip to the image boundary - for key in results.get('bbox_fields', []): - # e.g. gt_bboxes and gt_bboxes_ignore - bbox_offset = np.array([offset_w, offset_h, offset_w, offset_h], - dtype=np.float32) - bboxes = results[key] - bbox_offset - if self.bbox_clip_border: - bboxes[:, 0::2] = np.clip(bboxes[:, 0::2], 0, img_shape[1]) - bboxes[:, 1::2] = np.clip(bboxes[:, 1::2], 0, img_shape[0]) - valid_inds = (bboxes[:, 2] > bboxes[:, 0]) & ( - bboxes[:, 3] > bboxes[:, 1]) - # If the crop does not contain any gt-bbox area and - # allow_negative_crop is False, skip this image. - if (key == 'gt_bboxes' and not valid_inds.any() - and not allow_negative_crop): - return None - results[key] = bboxes[valid_inds, :] - # label fields. e.g. gt_labels and gt_labels_ignore - label_key = self.bbox2label.get(key) - if label_key in results: - results[label_key] = results[label_key][valid_inds] - - # mask fields, e.g. gt_masks and gt_masks_ignore - mask_key = self.bbox2mask.get(key) - if mask_key in results: - results[mask_key] = results[mask_key][ - valid_inds.nonzero()[0]].crop( - np.asarray([crop_x1, crop_y1, crop_x2, crop_y2])) - - # crop semantic seg - for key in results.get('seg_fields', []): - results[key] = results[key][crop_y1:crop_y2, crop_x1:crop_x2] - - return results - - def _get_crop_size(self, image_size): - """Randomly generates the absolute crop size based on `crop_type` and - `image_size`. - - Args: - image_size (tuple): (h, w). - - Returns: - crop_size (tuple): (crop_h, crop_w) in absolute pixels. 
- """ - h, w = image_size - if self.crop_type == 'absolute': - return (min(self.crop_size[0], h), min(self.crop_size[1], w)) - elif self.crop_type == 'absolute_range': - assert self.crop_size[0] <= self.crop_size[1] - crop_h = np.random.randint( - min(h, self.crop_size[0]), - min(h, self.crop_size[1]) + 1) - crop_w = np.random.randint( - min(w, self.crop_size[0]), - min(w, self.crop_size[1]) + 1) - return crop_h, crop_w - elif self.crop_type == 'relative': - crop_h, crop_w = self.crop_size - return int(h * crop_h + 0.5), int(w * crop_w + 0.5) - elif self.crop_type == 'relative_range': - crop_size = np.asarray(self.crop_size, dtype=np.float32) - crop_h, crop_w = crop_size + np.random.rand(2) * (1 - crop_size) - return int(h * crop_h + 0.5), int(w * crop_w + 0.5) - - def __call__(self, results): - """Call function to randomly crop images, bounding boxes, masks, - semantic segmentation maps. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Randomly cropped results, 'img_shape' key in result dict is - updated according to crop size. - """ - image_size = results['img'].shape[:2] - crop_size = self._get_crop_size(image_size) - results = self._crop_data(results, crop_size, self.allow_negative_crop) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(crop_size={self.crop_size}, ' - repr_str += f'crop_type={self.crop_type}, ' - repr_str += f'allow_negative_crop={self.allow_negative_crop}, ' - repr_str += f'bbox_clip_border={self.bbox_clip_border})' - return repr_str - - -@PIPELINES.register_module() -class SegRescale(object): - """Rescale semantic segmentation maps. - - Args: - scale_factor (float): The scale factor of the final output. - backend (str): Image rescale backend, choices are 'cv2' and 'pillow'. - These two backends generates slightly different results. Defaults - to 'cv2'. 
- """ - - def __init__(self, scale_factor=1, backend='cv2'): - self.scale_factor = scale_factor - self.backend = backend - - def __call__(self, results): - """Call function to scale the semantic segmentation map. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Result dict with semantic segmentation map scaled. - """ - - for key in results.get('seg_fields', []): - if self.scale_factor != 1: - results[key] = mmcv.imrescale( - results[key], - self.scale_factor, - interpolation='nearest', - backend=self.backend) - return results - - def __repr__(self): - return self.__class__.__name__ + f'(scale_factor={self.scale_factor})' - - -@PIPELINES.register_module() -class PhotoMetricDistortion(object): - """Apply photometric distortion to image sequentially, every transformation - is applied with a probability of 0.5. The position of random contrast is in - second or second to last. - - 1. random brightness - 2. random contrast (mode 0) - 3. convert color from BGR to HSV - 4. random saturation - 5. random hue - 6. convert color from HSV to BGR - 7. random contrast (mode 1) - 8. randomly swap channels - - Args: - brightness_delta (int): delta of brightness. - contrast_range (tuple): range of contrast. - saturation_range (tuple): range of saturation. - hue_delta (int): delta of hue. - """ - - def __init__(self, - brightness_delta=32, - contrast_range=(0.5, 1.5), - saturation_range=(0.5, 1.5), - hue_delta=18): - self.brightness_delta = brightness_delta - self.contrast_lower, self.contrast_upper = contrast_range - self.saturation_lower, self.saturation_upper = saturation_range - self.hue_delta = hue_delta - - def __call__(self, results): - """Call function to perform photometric distortion on images. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Result dict with images distorted. 
- """ - - if 'img_fields' in results: - assert results['img_fields'] == ['img'], \ - 'Only single img_fields is allowed' - img = results['img'] - assert img.dtype == np.float32, \ - 'PhotoMetricDistortion needs the input image of dtype np.float32,'\ - ' please set "to_float32=True" in "LoadImageFromFile" pipeline' - # random brightness - if random.randint(2): - delta = random.uniform(-self.brightness_delta, - self.brightness_delta) - img += delta - - # mode == 0 --> do random contrast first - # mode == 1 --> do random contrast last - mode = random.randint(2) - if mode == 1: - if random.randint(2): - alpha = random.uniform(self.contrast_lower, - self.contrast_upper) - img *= alpha - - # convert color from BGR to HSV - img = mmcv.bgr2hsv(img) - - # random saturation - if random.randint(2): - img[..., 1] *= random.uniform(self.saturation_lower, - self.saturation_upper) - - # random hue - if random.randint(2): - img[..., 0] += random.uniform(-self.hue_delta, self.hue_delta) - img[..., 0][img[..., 0] > 360] -= 360 - img[..., 0][img[..., 0] < 0] += 360 - - # convert color from HSV to BGR - img = mmcv.hsv2bgr(img) - - # random contrast - if mode == 0: - if random.randint(2): - alpha = random.uniform(self.contrast_lower, - self.contrast_upper) - img *= alpha - - # randomly swap channels - if random.randint(2): - img = img[..., random.permutation(3)] - - results['img'] = img - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(\nbrightness_delta={self.brightness_delta},\n' - repr_str += 'contrast_range=' - repr_str += f'{(self.contrast_lower, self.contrast_upper)},\n' - repr_str += 'saturation_range=' - repr_str += f'{(self.saturation_lower, self.saturation_upper)},\n' - repr_str += f'hue_delta={self.hue_delta})' - return repr_str - - -@PIPELINES.register_module() -class Expand(object): - """Random expand the image & bboxes. - - Randomly place the original image on a canvas of 'ratio' x original image - size filled with mean values. 
The ratio is in the range of ratio_range. - - Args: - mean (tuple): mean value of dataset. - to_rgb (bool): if need to convert the order of mean to align with RGB. - ratio_range (tuple): range of expand ratio. - prob (float): probability of applying this transformation - """ - - def __init__(self, - mean=(0, 0, 0), - to_rgb=True, - ratio_range=(1, 4), - seg_ignore_label=None, - prob=0.5): - self.to_rgb = to_rgb - self.ratio_range = ratio_range - if to_rgb: - self.mean = mean[::-1] - else: - self.mean = mean - self.min_ratio, self.max_ratio = ratio_range - self.seg_ignore_label = seg_ignore_label - self.prob = prob - - def __call__(self, results): - """Call function to expand images, bounding boxes. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Result dict with images, bounding boxes expanded - """ - - if random.uniform(0, 1) > self.prob: - return results - - if 'img_fields' in results: - assert results['img_fields'] == ['img'], \ - 'Only single img_fields is allowed' - img = results['img'] - - h, w, c = img.shape - ratio = random.uniform(self.min_ratio, self.max_ratio) - # speedup expand when meets large image - if np.all(self.mean == self.mean[0]): - expand_img = np.empty((int(h * ratio), int(w * ratio), c), - img.dtype) - expand_img.fill(self.mean[0]) - else: - expand_img = np.full((int(h * ratio), int(w * ratio), c), - self.mean, - dtype=img.dtype) - left = int(random.uniform(0, w * ratio - w)) - top = int(random.uniform(0, h * ratio - h)) - expand_img[top:top + h, left:left + w] = img - - results['img'] = expand_img - # expand bboxes - for key in results.get('bbox_fields', []): - results[key] = results[key] + np.tile( - (left, top), 2).astype(results[key].dtype) - - # expand masks - for key in results.get('mask_fields', []): - results[key] = results[key].expand( - int(h * ratio), int(w * ratio), top, left) - - # expand segs - for key in results.get('seg_fields', []): - gt_seg = results[key] - expand_gt_seg = np.full((int(h 
* ratio), int(w * ratio)), - self.seg_ignore_label, - dtype=gt_seg.dtype) - expand_gt_seg[top:top + h, left:left + w] = gt_seg - results[key] = expand_gt_seg - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(mean={self.mean}, to_rgb={self.to_rgb}, ' - repr_str += f'ratio_range={self.ratio_range}, ' - repr_str += f'seg_ignore_label={self.seg_ignore_label})' - return repr_str - - -@PIPELINES.register_module() -class MinIoURandomCrop(object): - """Random crop the image & bboxes, the cropped patches have minimum IoU - requirement with original image & bboxes, the IoU threshold is randomly - selected from min_ious. - - Args: - min_ious (tuple): minimum IoU threshold for all intersections with - bounding boxes - min_crop_size (float): minimum crop's size (i.e. h,w := a*h, a*w, - where a >= min_crop_size). - bbox_clip_border (bool, optional): Whether clip the objects outside - the border of the image. Defaults to True. - - Note: - The keys for bboxes, labels and masks should be paired. That is, \ - `gt_bboxes` corresponds to `gt_labels` and `gt_masks`, and \ - `gt_bboxes_ignore` to `gt_labels_ignore` and `gt_masks_ignore`. - """ - - def __init__(self, - min_ious=(0.1, 0.3, 0.5, 0.7, 0.9), - min_crop_size=0.3, - bbox_clip_border=True): - # 1: return ori img - self.min_ious = min_ious - self.sample_mode = (1, *min_ious, 0) - self.min_crop_size = min_crop_size - self.bbox_clip_border = bbox_clip_border - self.bbox2label = { - 'gt_bboxes': 'gt_labels', - 'gt_bboxes_ignore': 'gt_labels_ignore' - } - self.bbox2mask = { - 'gt_bboxes': 'gt_masks', - 'gt_bboxes_ignore': 'gt_masks_ignore' - } - - def __call__(self, results): - """Call function to crop images and bounding boxes with minimum IoU - constraint. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Result dict with images and bounding boxes cropped, \ - 'img_shape' key is updated. 
- """ - - if 'img_fields' in results: - assert results['img_fields'] == ['img'], \ - 'Only single img_fields is allowed' - img = results['img'] - assert 'bbox_fields' in results - boxes = [results[key] for key in results['bbox_fields']] - boxes = np.concatenate(boxes, 0) - h, w, c = img.shape - while True: - mode = random.choice(self.sample_mode) - self.mode = mode - if mode == 1: - return results - - min_iou = mode - for i in range(50): - new_w = random.uniform(self.min_crop_size * w, w) - new_h = random.uniform(self.min_crop_size * h, h) - - # h / w in [0.5, 2] - if new_h / new_w < 0.5 or new_h / new_w > 2: - continue - - left = random.uniform(w - new_w) - top = random.uniform(h - new_h) - - patch = np.array( - (int(left), int(top), int(left + new_w), int(top + new_h))) - # Line or point crop is not allowed - if patch[2] == patch[0] or patch[3] == patch[1]: - continue - overlaps = bbox_overlaps( - patch.reshape(-1, 4), boxes.reshape(-1, 4)).reshape(-1) - if len(overlaps) > 0 and overlaps.min() < min_iou: - continue - - # center of boxes should inside the crop img - # only adjust boxes and instance masks when the gt is not empty - if len(overlaps) > 0: - # adjust boxes - def is_center_of_bboxes_in_patch(boxes, patch): - center = (boxes[:, :2] + boxes[:, 2:]) / 2 - mask = ((center[:, 0] > patch[0]) * - (center[:, 1] > patch[1]) * - (center[:, 0] < patch[2]) * - (center[:, 1] < patch[3])) - return mask - - mask = is_center_of_bboxes_in_patch(boxes, patch) - if not mask.any(): - continue - for key in results.get('bbox_fields', []): - boxes = results[key].copy() - mask = is_center_of_bboxes_in_patch(boxes, patch) - boxes = boxes[mask] - if self.bbox_clip_border: - boxes[:, 2:] = boxes[:, 2:].clip(max=patch[2:]) - boxes[:, :2] = boxes[:, :2].clip(min=patch[:2]) - boxes -= np.tile(patch[:2], 2) - - results[key] = boxes - # labels - label_key = self.bbox2label.get(key) - if label_key in results: - results[label_key] = results[label_key][mask] - - # mask fields - mask_key 
= self.bbox2mask.get(key) - if mask_key in results: - results[mask_key] = results[mask_key][ - mask.nonzero()[0]].crop(patch) - # adjust the img no matter whether the gt is empty before crop - img = img[patch[1]:patch[3], patch[0]:patch[2]] - results['img'] = img - results['img_shape'] = img.shape - - # seg fields - for key in results.get('seg_fields', []): - results[key] = results[key][patch[1]:patch[3], - patch[0]:patch[2]] - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(min_ious={self.min_ious}, ' - repr_str += f'min_crop_size={self.min_crop_size}, ' - repr_str += f'bbox_clip_border={self.bbox_clip_border})' - return repr_str - - -@PIPELINES.register_module() -class Corrupt(object): - """Corruption augmentation. - - Corruption transforms implemented based on - `imagecorruptions `_. - - Args: - corruption (str): Corruption name. - severity (int, optional): The severity of corruption. Default: 1. - """ - - def __init__(self, corruption, severity=1): - self.corruption = corruption - self.severity = severity - - def __call__(self, results): - """Call function to corrupt image. - - Args: - results (dict): Result dict from loading pipeline. - - Returns: - dict: Result dict with images corrupted. - """ - - if corrupt is None: - raise RuntimeError('imagecorruptions is not installed') - if 'img_fields' in results: - assert results['img_fields'] == ['img'], \ - 'Only single img_fields is allowed' - results['img'] = corrupt( - results['img'].astype(np.uint8), - corruption_name=self.corruption, - severity=self.severity) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(corruption={self.corruption}, ' - repr_str += f'severity={self.severity})' - return repr_str - - -@PIPELINES.register_module() -class Albu(object): - """Albumentation augmentation. - - Adds custom transformations from Albumentations library. 
- Please, visit `https://albumentations.readthedocs.io` - to get more information. - - An example of ``transforms`` is as followed: - - .. code-block:: - - [ - dict( - type='ShiftScaleRotate', - shift_limit=0.0625, - scale_limit=0.0, - rotate_limit=0, - interpolation=1, - p=0.5), - dict( - type='RandomBrightnessContrast', - brightness_limit=[0.1, 0.3], - contrast_limit=[0.1, 0.3], - p=0.2), - dict(type='ChannelShuffle', p=0.1), - dict( - type='OneOf', - transforms=[ - dict(type='Blur', blur_limit=3, p=1.0), - dict(type='MedianBlur', blur_limit=3, p=1.0) - ], - p=0.1), - ] - - Args: - transforms (list[dict]): A list of albu transformations - bbox_params (dict): Bbox_params for albumentation `Compose` - keymap (dict): Contains {'input key':'albumentation-style key'} - skip_img_without_anno (bool): Whether to skip the image if no ann left - after aug - """ - - def __init__(self, - transforms, - bbox_params=None, - keymap=None, - update_pad_shape=False, - skip_img_without_anno=False): - if Compose is None: - raise RuntimeError('albumentations is not installed') - - # Args will be modified later, copying it will be safer - transforms = copy.deepcopy(transforms) - if bbox_params is not None: - bbox_params = copy.deepcopy(bbox_params) - if keymap is not None: - keymap = copy.deepcopy(keymap) - self.transforms = transforms - self.filter_lost_elements = False - self.update_pad_shape = update_pad_shape - self.skip_img_without_anno = skip_img_without_anno - - # A simple workaround to remove masks without boxes - if (isinstance(bbox_params, dict) and 'label_fields' in bbox_params - and 'filter_lost_elements' in bbox_params): - self.filter_lost_elements = True - self.origin_label_fields = bbox_params['label_fields'] - bbox_params['label_fields'] = ['idx_mapper'] - del bbox_params['filter_lost_elements'] - - self.bbox_params = ( - self.albu_builder(bbox_params) if bbox_params else None) - self.aug = Compose([self.albu_builder(t) for t in self.transforms], - 
bbox_params=self.bbox_params) - - if not keymap: - self.keymap_to_albu = { - 'img': 'image', - 'gt_masks': 'masks', - 'gt_bboxes': 'bboxes' - } - else: - self.keymap_to_albu = keymap - self.keymap_back = {v: k for k, v in self.keymap_to_albu.items()} - - def albu_builder(self, cfg): - """Import a module from albumentations. - - It inherits some of :func:`build_from_cfg` logic. - - Args: - cfg (dict): Config dict. It should at least contain the key "type". - - Returns: - obj: The constructed object. - """ - - assert isinstance(cfg, dict) and 'type' in cfg - args = cfg.copy() - - obj_type = args.pop('type') - if mmcv.is_str(obj_type): - if albumentations is None: - raise RuntimeError('albumentations is not installed') - obj_cls = getattr(albumentations, obj_type) - elif inspect.isclass(obj_type): - obj_cls = obj_type - else: - raise TypeError( - f'type must be a str or valid type, but got {type(obj_type)}') - - if 'transforms' in args: - args['transforms'] = [ - self.albu_builder(transform) - for transform in args['transforms'] - ] - - return obj_cls(**args) - - @staticmethod - def mapper(d, keymap): - """Dictionary mapper. Renames keys according to keymap provided. - - Args: - d (dict): old dict - keymap (dict): {'old_key':'new_key'} - Returns: - dict: new dict. 
- """ - - updated_dict = {} - for k, v in zip(d.keys(), d.values()): - new_k = keymap.get(k, k) - updated_dict[new_k] = d[k] - return updated_dict - - def __call__(self, results): - # dict to albumentations format - results = self.mapper(results, self.keymap_to_albu) - # TODO: add bbox_fields - if 'bboxes' in results: - # to list of boxes - if isinstance(results['bboxes'], np.ndarray): - results['bboxes'] = [x for x in results['bboxes']] - # add pseudo-field for filtration - if self.filter_lost_elements: - results['idx_mapper'] = np.arange(len(results['bboxes'])) - - # TODO: Support mask structure in albu - if 'masks' in results: - if isinstance(results['masks'], PolygonMasks): - raise NotImplementedError( - 'Albu only supports BitMap masks now') - ori_masks = results['masks'] - if albumentations.__version__ < '0.5': - results['masks'] = results['masks'].masks - else: - results['masks'] = [mask for mask in results['masks'].masks] - - results = self.aug(**results) - - if 'bboxes' in results: - if isinstance(results['bboxes'], list): - results['bboxes'] = np.array( - results['bboxes'], dtype=np.float32) - results['bboxes'] = results['bboxes'].reshape(-1, 4) - - # filter label_fields - if self.filter_lost_elements: - - for label in self.origin_label_fields: - results[label] = np.array( - [results[label][i] for i in results['idx_mapper']]) - if 'masks' in results: - results['masks'] = np.array( - [results['masks'][i] for i in results['idx_mapper']]) - results['masks'] = ori_masks.__class__( - results['masks'], results['image'].shape[0], - results['image'].shape[1]) - - if (not len(results['idx_mapper']) - and self.skip_img_without_anno): - return None - - if 'gt_labels' in results: - if isinstance(results['gt_labels'], list): - results['gt_labels'] = np.array(results['gt_labels']) - results['gt_labels'] = results['gt_labels'].astype(np.int64) - - # back to the original format - results = self.mapper(results, self.keymap_back) - - # update final shape - if 
self.update_pad_shape: - results['pad_shape'] = results['img'].shape - - return results - - def __repr__(self): - repr_str = self.__class__.__name__ + f'(transforms={self.transforms})' - return repr_str - - -@PIPELINES.register_module() -class RandomCenterCropPad(object): - """Random center crop and random around padding for CornerNet. - - This operation generates randomly cropped image from the original image and - pads it simultaneously. Different from :class:`RandomCrop`, the output - shape may not equal to ``crop_size`` strictly. We choose a random value - from ``ratios`` and the output shape could be larger or smaller than - ``crop_size``. The padding operation is also different from :class:`Pad`, - here we use around padding instead of right-bottom padding. - - The relation between output image (padding image) and original image: - - .. code:: text - - output image - - +----------------------------+ - | padded area | - +------|----------------------------|----------+ - | | cropped area | | - | | +---------------+ | | - | | | . center | | | original image - | | | range | | | - | | +---------------+ | | - +------|----------------------------|----------+ - | padded area | - +----------------------------+ - - There are 5 main areas in the figure: - - - output image: output image of this operation, also called padding - image in following instruction. - - original image: input image of this operation. - - padded area: non-intersect area of output image and original image. - - cropped area: the overlap of output image and original image. - - center range: a smaller area where random center chosen from. - center range is computed by ``border`` and original image's shape - to avoid our random center is too close to original image's border. - - Also this operation act differently in train and test mode, the summary - pipeline is listed below. - - Train pipeline: - - 1. 
Choose a ``random_ratio`` from ``ratios``, the shape of padding image - will be ``random_ratio * crop_size``. - 2. Choose a ``random_center`` in center range. - 3. Generate padding image with center matches the ``random_center``. - 4. Initialize the padding image with pixel value equals to ``mean``. - 5. Copy the cropped area to padding image. - 6. Refine annotations. - - Test pipeline: - - 1. Compute output shape according to ``test_pad_mode``. - 2. Generate padding image with center matches the original image - center. - 3. Initialize the padding image with pixel value equals to ``mean``. - 4. Copy the ``cropped area`` to padding image. - - Args: - crop_size (tuple | None): expected size after crop, final size will - computed according to ratio. Requires (h, w) in train mode, and - None in test mode. - ratios (tuple): random select a ratio from tuple and crop image to - (crop_size[0] * ratio) * (crop_size[1] * ratio). - Only available in train mode. - border (int): max distance from center select area to image border. - Only available in train mode. - mean (sequence): Mean values of 3 channels. - std (sequence): Std values of 3 channels. - to_rgb (bool): Whether to convert the image from BGR to RGB. - test_mode (bool): whether involve random variables in transform. - In train mode, crop_size is fixed, center coords and ratio is - random selected from predefined lists. In test mode, crop_size - is image's original shape, center coords and ratio is fixed. - test_pad_mode (tuple): padding method and padding shape value, only - available in test mode. Default is using 'logical_or' with - 127 as padding shape value. - - - 'logical_or': final_shape = input_shape | padding_shape_value - - 'size_divisor': final_shape = int( - ceil(input_shape / padding_shape_value) * padding_shape_value) - bbox_clip_border (bool, optional): Whether clip the objects outside - the border of the image. Defaults to True. 
- """ - - def __init__(self, - crop_size=None, - ratios=(0.9, 1.0, 1.1), - border=128, - mean=None, - std=None, - to_rgb=None, - test_mode=False, - test_pad_mode=('logical_or', 127), - bbox_clip_border=True): - if test_mode: - assert crop_size is None, 'crop_size must be None in test mode' - assert ratios is None, 'ratios must be None in test mode' - assert border is None, 'border must be None in test mode' - assert isinstance(test_pad_mode, (list, tuple)) - assert test_pad_mode[0] in ['logical_or', 'size_divisor'] - else: - assert isinstance(crop_size, (list, tuple)) - assert crop_size[0] > 0 and crop_size[1] > 0, ( - 'crop_size must > 0 in train mode') - assert isinstance(ratios, (list, tuple)) - assert test_pad_mode is None, ( - 'test_pad_mode must be None in train mode') - - self.crop_size = crop_size - self.ratios = ratios - self.border = border - # We do not set default value to mean, std and to_rgb because these - # hyper-parameters are easy to forget but could affect the performance. - # Please use the same setting as Normalize for performance assurance. - assert mean is not None and std is not None and to_rgb is not None - self.to_rgb = to_rgb - self.input_mean = mean - self.input_std = std - if to_rgb: - self.mean = mean[::-1] - self.std = std[::-1] - else: - self.mean = mean - self.std = std - self.test_mode = test_mode - self.test_pad_mode = test_pad_mode - self.bbox_clip_border = bbox_clip_border - - def _get_border(self, border, size): - """Get final border for the target size. - - This function generates a ``final_border`` according to image's shape. - The area between ``final_border`` and ``size - final_border`` is the - ``center range``. We randomly choose center from the ``center range`` - to avoid our random center is too close to original image's border. - Also ``center range`` should be larger than 0. - - Args: - border (int): The initial border, default is 128. - size (int): The width or height of original image. 
- Returns: - int: The final border. - """ - k = 2 * border / size - i = pow(2, np.ceil(np.log2(np.ceil(k))) + (k == int(k))) - return border // i - - def _filter_boxes(self, patch, boxes): - """Check whether the center of each box is in the patch. - - Args: - patch (list[int]): The cropped area, [left, top, right, bottom]. - boxes (numpy array, (N x 4)): Ground truth boxes. - - Returns: - mask (numpy array, (N,)): Each box is inside or outside the patch. - """ - center = (boxes[:, :2] + boxes[:, 2:]) / 2 - mask = (center[:, 0] > patch[0]) * (center[:, 1] > patch[1]) * ( - center[:, 0] < patch[2]) * ( - center[:, 1] < patch[3]) - return mask - - def _crop_image_and_paste(self, image, center, size): - """Crop image with a given center and size, then paste the cropped - image to a blank image with two centers align. - - This function is equivalent to generating a blank image with ``size`` - as its shape. Then cover it on the original image with two centers ( - the center of blank image and the random center of original image) - aligned. The overlap area is paste from the original image and the - outside area is filled with ``mean pixel``. - - Args: - image (np array, H x W x C): Original image. - center (list[int]): Target crop center coord. - size (list[int]): Target crop size. [target_h, target_w] - - Returns: - cropped_img (np array, target_h x target_w x C): Cropped image. - border (np array, 4): The distance of four border of - ``cropped_img`` to the original image area, [top, bottom, - left, right] - patch (list[int]): The cropped area, [left, top, right, bottom]. 
- """ - center_y, center_x = center - target_h, target_w = size - img_h, img_w, img_c = image.shape - - x0 = max(0, center_x - target_w // 2) - x1 = min(center_x + target_w // 2, img_w) - y0 = max(0, center_y - target_h // 2) - y1 = min(center_y + target_h // 2, img_h) - patch = np.array((int(x0), int(y0), int(x1), int(y1))) - - left, right = center_x - x0, x1 - center_x - top, bottom = center_y - y0, y1 - center_y - - cropped_center_y, cropped_center_x = target_h // 2, target_w // 2 - cropped_img = np.zeros((target_h, target_w, img_c), dtype=image.dtype) - for i in range(img_c): - cropped_img[:, :, i] += self.mean[i] - y_slice = slice(cropped_center_y - top, cropped_center_y + bottom) - x_slice = slice(cropped_center_x - left, cropped_center_x + right) - cropped_img[y_slice, x_slice, :] = image[y0:y1, x0:x1, :] - - border = np.array([ - cropped_center_y - top, cropped_center_y + bottom, - cropped_center_x - left, cropped_center_x + right - ], - dtype=np.float32) - - return cropped_img, border, patch - - def _train_aug(self, results): - """Random crop and around padding the original image. - - Args: - results (dict): Image information in the augmentation pipeline. - - Returns: - results (dict): The updated dict. - """ - img = results['img'] - h, w, c = img.shape - boxes = results['gt_bboxes'] - while True: - scale = random.choice(self.ratios) - new_h = int(self.crop_size[0] * scale) - new_w = int(self.crop_size[1] * scale) - h_border = self._get_border(self.border, h) - w_border = self._get_border(self.border, w) - - for i in range(50): - center_x = random.randint(low=w_border, high=w - w_border) - center_y = random.randint(low=h_border, high=h - h_border) - - cropped_img, border, patch = self._crop_image_and_paste( - img, [center_y, center_x], [new_h, new_w]) - - mask = self._filter_boxes(patch, boxes) - # if the image does not have a valid bbox, any crop patch is valid. 
- if not mask.any() and len(boxes) > 0: - continue - - results['img'] = cropped_img - results['img_shape'] = cropped_img.shape - results['pad_shape'] = cropped_img.shape - - x0, y0, x1, y1 = patch - - left_w, top_h = center_x - x0, center_y - y0 - cropped_center_x, cropped_center_y = new_w // 2, new_h // 2 - - # crop bboxes accordingly and clip to the image boundary - for key in results.get('bbox_fields', []): - mask = self._filter_boxes(patch, results[key]) - bboxes = results[key][mask] - bboxes[:, 0:4:2] += cropped_center_x - left_w - x0 - bboxes[:, 1:4:2] += cropped_center_y - top_h - y0 - if self.bbox_clip_border: - bboxes[:, 0:4:2] = np.clip(bboxes[:, 0:4:2], 0, new_w) - bboxes[:, 1:4:2] = np.clip(bboxes[:, 1:4:2], 0, new_h) - keep = (bboxes[:, 2] > bboxes[:, 0]) & ( - bboxes[:, 3] > bboxes[:, 1]) - bboxes = bboxes[keep] - results[key] = bboxes - if key in ['gt_bboxes']: - if 'gt_labels' in results: - labels = results['gt_labels'][mask] - labels = labels[keep] - results['gt_labels'] = labels - if 'gt_masks' in results: - raise NotImplementedError( - 'RandomCenterCropPad only supports bbox.') - - # crop semantic seg - for key in results.get('seg_fields', []): - raise NotImplementedError( - 'RandomCenterCropPad only supports bbox.') - return results - - def _test_aug(self, results): - """Around padding the original image without cropping. - - The padding mode and value are from ``test_pad_mode``. - - Args: - results (dict): Image information in the augmentation pipeline. - - Returns: - results (dict): The updated dict. 
- """ - img = results['img'] - h, w, c = img.shape - results['img_shape'] = img.shape - if self.test_pad_mode[0] in ['logical_or']: - target_h = h | self.test_pad_mode[1] - target_w = w | self.test_pad_mode[1] - elif self.test_pad_mode[0] in ['size_divisor']: - divisor = self.test_pad_mode[1] - target_h = int(np.ceil(h / divisor)) * divisor - target_w = int(np.ceil(w / divisor)) * divisor - else: - raise NotImplementedError( - 'RandomCenterCropPad only support two testing pad mode:' - 'logical-or and size_divisor.') - - cropped_img, border, _ = self._crop_image_and_paste( - img, [h // 2, w // 2], [target_h, target_w]) - results['img'] = cropped_img - results['pad_shape'] = cropped_img.shape - results['border'] = border - return results - - def __call__(self, results): - img = results['img'] - assert img.dtype == np.float32, ( - 'RandomCenterCropPad needs the input image of dtype np.float32,' - ' please set "to_float32=True" in "LoadImageFromFile" pipeline') - h, w, c = img.shape - assert c == len(self.mean) - if self.test_mode: - return self._test_aug(results) - else: - return self._train_aug(results) - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(crop_size={self.crop_size}, ' - repr_str += f'ratios={self.ratios}, ' - repr_str += f'border={self.border}, ' - repr_str += f'mean={self.input_mean}, ' - repr_str += f'std={self.input_std}, ' - repr_str += f'to_rgb={self.to_rgb}, ' - repr_str += f'test_mode={self.test_mode}, ' - repr_str += f'test_pad_mode={self.test_pad_mode}, ' - repr_str += f'bbox_clip_border={self.bbox_clip_border})' - return repr_str - - -@PIPELINES.register_module() -class CutOut(object): - """CutOut operation. - - Randomly drop some regions of image used in - `Cutout `_. - - Args: - n_holes (int | tuple[int, int]): Number of regions to be dropped. - If it is given as a list, number of holes will be randomly - selected from the closed interval [`n_holes[0]`, `n_holes[1]`]. 
- cutout_shape (tuple[int, int] | list[tuple[int, int]]): The candidate - shape of dropped regions. It can be `tuple[int, int]` to use a - fixed cutout shape, or `list[tuple[int, int]]` to randomly choose - shape from the list. - cutout_ratio (tuple[float, float] | list[tuple[float, float]]): The - candidate ratio of dropped regions. It can be `tuple[float, float]` - to use a fixed ratio or `list[tuple[float, float]]` to randomly - choose ratio from the list. Please note that `cutout_shape` - and `cutout_ratio` cannot be both given at the same time. - fill_in (tuple[float, float, float] | tuple[int, int, int]): The value - of pixel to fill in the dropped regions. Default: (0, 0, 0). - """ - - def __init__(self, - n_holes, - cutout_shape=None, - cutout_ratio=None, - fill_in=(0, 0, 0)): - - assert (cutout_shape is None) ^ (cutout_ratio is None), \ - 'Either cutout_shape or cutout_ratio should be specified.' - assert (isinstance(cutout_shape, (list, tuple)) - or isinstance(cutout_ratio, (list, tuple))) - if isinstance(n_holes, tuple): - assert len(n_holes) == 2 and 0 <= n_holes[0] < n_holes[1] - else: - n_holes = (n_holes, n_holes) - self.n_holes = n_holes - self.fill_in = fill_in - self.with_ratio = cutout_ratio is not None - self.candidates = cutout_ratio if self.with_ratio else cutout_shape - if not isinstance(self.candidates, list): - self.candidates = [self.candidates] - - def __call__(self, results): - """Call function to drop some regions of image.""" - h, w, c = results['img'].shape - n_holes = np.random.randint(self.n_holes[0], self.n_holes[1] + 1) - for _ in range(n_holes): - x1 = np.random.randint(0, w) - y1 = np.random.randint(0, h) - index = np.random.randint(0, len(self.candidates)) - if not self.with_ratio: - cutout_w, cutout_h = self.candidates[index] - else: - cutout_w = int(self.candidates[index][0] * w) - cutout_h = int(self.candidates[index][1] * h) - - x2 = np.clip(x1 + cutout_w, 0, w) - y2 = np.clip(y1 + cutout_h, 0, h) - results['img'][y1:y2, 
x1:x2, :] = self.fill_in - - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(n_holes={self.n_holes}, ' - repr_str += (f'cutout_ratio={self.candidates}, ' if self.with_ratio - else f'cutout_shape={self.candidates}, ') - repr_str += f'fill_in={self.fill_in})' - return repr_str diff --git a/spaces/Andy1621/uniformer_image_detection/mmdet/models/detectors/htc.py b/spaces/Andy1621/uniformer_image_detection/mmdet/models/detectors/htc.py deleted file mode 100644 index d9efdf420fa7373f7f1d116f8d97836d73b457bf..0000000000000000000000000000000000000000 --- a/spaces/Andy1621/uniformer_image_detection/mmdet/models/detectors/htc.py +++ /dev/null @@ -1,15 +0,0 @@ -from ..builder import DETECTORS -from .cascade_rcnn import CascadeRCNN - - -@DETECTORS.register_module() -class HybridTaskCascade(CascadeRCNN): - """Implementation of `HTC `_""" - - def __init__(self, **kwargs): - super(HybridTaskCascade, self).__init__(**kwargs) - - @property - def with_semantic(self): - """bool: whether the detector has a semantic head""" - return self.roi_head.with_semantic diff --git a/spaces/AnishKumbhar/ChatBot/text-generation-webui-main/extensions/multimodal/abstract_pipeline.py b/spaces/AnishKumbhar/ChatBot/text-generation-webui-main/extensions/multimodal/abstract_pipeline.py deleted file mode 100644 index 584219419d256e7743fd4d5120c56bcfa8f2a9f9..0000000000000000000000000000000000000000 --- a/spaces/AnishKumbhar/ChatBot/text-generation-webui-main/extensions/multimodal/abstract_pipeline.py +++ /dev/null @@ -1,62 +0,0 @@ -from abc import ABC, abstractmethod -from typing import List, Optional - -import torch -from PIL import Image - - -class AbstractMultimodalPipeline(ABC): - @staticmethod - @abstractmethod - def name() -> str: - 'name of the pipeline, should be same as in --multimodal-pipeline' - pass - - @staticmethod - @abstractmethod - def image_start() -> Optional[str]: - 'return image start string, string representation of image start token, or 
None if not applicable' - pass - - @staticmethod - @abstractmethod - def image_end() -> Optional[str]: - 'return image end string, string representation of image end token, or None if not applicable' - pass - - @staticmethod - @abstractmethod - def placeholder_token_id() -> int: - 'return placeholder token id' - pass - - @staticmethod - @abstractmethod - def num_image_embeds() -> int: - 'return the number of embeds used by a single image (for example: 256 for LLaVA)' - pass - - @abstractmethod - def embed_images(self, images: List[Image.Image]) -> torch.Tensor: - 'forward the images through vision pipeline, and return their embeddings' - pass - - @staticmethod - @abstractmethod - def embed_tokens(input_ids: torch.Tensor) -> torch.Tensor: - 'embed tokens, the exact function varies by LLM, for LLaMA it is `shared.model.model.embed_tokens`' - pass - - @staticmethod - @abstractmethod - def placeholder_embeddings() -> torch.Tensor: - 'get placeholder embeddings if there are multiple images, and `add_all_images_to_prompt` is False' - pass - - def _get_device(self, setting_name: str, params: dict): - if params[setting_name] is None: - return torch.device("cuda:0" if torch.cuda.is_available() else "cpu") - return torch.device(params[setting_name]) - - def _get_dtype(self, setting_name: str, params: dict): - return torch.float32 if int(params[setting_name]) == 32 else torch.float16 diff --git a/spaces/Anonymous-sub/Rerender/ControlNet/ldm/modules/midas/midas/base_model.py b/spaces/Anonymous-sub/Rerender/ControlNet/ldm/modules/midas/midas/base_model.py deleted file mode 100644 index 5cf430239b47ec5ec07531263f26f5c24a2311cd..0000000000000000000000000000000000000000 --- a/spaces/Anonymous-sub/Rerender/ControlNet/ldm/modules/midas/midas/base_model.py +++ /dev/null @@ -1,16 +0,0 @@ -import torch - - -class BaseModel(torch.nn.Module): - def load(self, path): - """Load model from file. 
- - Args: - path (str): file path - """ - parameters = torch.load(path, map_location=torch.device('cpu')) - - if "optimizer" in parameters: - parameters = parameters["model"] - - self.load_state_dict(parameters) diff --git a/spaces/Artrajz/vits-simple-api/vits/bert/__init__.py b/spaces/Artrajz/vits-simple-api/vits/bert/__init__.py deleted file mode 100644 index 6f70b7372c156d3969a2f87328737f63ed4bef71..0000000000000000000000000000000000000000 --- a/spaces/Artrajz/vits-simple-api/vits/bert/__init__.py +++ /dev/null @@ -1,15 +0,0 @@ -""" from https://github.com/PlayVoice/vits_chinese """ -import os - -import config -from utils.download import download_and_verify -from .ProsodyModel import TTSProsody - -URLS = [ - "https://huggingface.co/spaces/maxmax20160403/vits_chinese/resolve/main/bert/prosody_model.pt", -] -TARGET_PATH = os.path.join(config.ABS_PATH, "vits/bert/prosody_model.pt") -EXPECTED_MD5 = None - -if not os.path.exists(TARGET_PATH): - success, message = download_and_verify(URLS, TARGET_PATH, EXPECTED_MD5) \ No newline at end of file diff --git a/spaces/Audio-AGI/WavJourney/scripts/start_services.sh b/spaces/Audio-AGI/WavJourney/scripts/start_services.sh deleted file mode 100644 index 2347f327760ad9442f14eff8bb28924021dcfaba..0000000000000000000000000000000000000000 --- a/spaces/Audio-AGI/WavJourney/scripts/start_services.sh +++ /dev/null @@ -1 +0,0 @@ -nohup conda run --live-stream -n WavJourney python services.py > services_logs/service.out 2>&1 & diff --git a/spaces/Awiny/Image2Paragraph/models/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py b/spaces/Awiny/Image2Paragraph/models/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py deleted file mode 100644 index 5213faf4d859bb109a03bcd2721a02d63d2f89ce..0000000000000000000000000000000000000000 --- a/spaces/Awiny/Image2Paragraph/models/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py +++ /dev/null @@ -1,185 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved -import argparse -import glob -import multiprocessing as mp -import os -import time -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -from predictor import VisualizationDemo -from centernet.config import add_centernet_config -# constants -WINDOW_NAME = "CenterNet2 detections" - -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer -from detectron2.data import MetadataCatalog - -def setup_cfg(args): - # load config from file and command-line arguments - cfg = get_cfg() - add_centernet_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - # Set score_threshold for builtin models - cfg.MODEL.RETINANET.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args.confidence_threshold - if cfg.MODEL.META_ARCHITECTURE in ['ProposalNetwork', 'CenterNetDetector']: - cfg.MODEL.CENTERNET.INFERENCE_TH = args.confidence_threshold - cfg.MODEL.CENTERNET.NMS_TH = cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args.confidence_threshold - cfg.freeze() - return cfg - - -def get_parser(): - parser = argparse.ArgumentParser(description="Detectron2 demo for builtin models") - parser.add_argument( - "--config-file", - default="configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml", - metavar="FILE", - help="path to config file", - ) - parser.add_argument("--webcam", action="store_true", help="Take inputs from webcam.") - parser.add_argument("--video-input", help="Path to video file.") - parser.add_argument("--input", nargs="+", help="A list of space separated input images") - parser.add_argument( - "--output", - help="A file or directory to save output visualizations. 
" - "If not given, will show output in an OpenCV window.", - ) - - parser.add_argument( - "--confidence-threshold", - type=float, - default=0.3, - help="Minimum score for instance predictions to be shown", - ) - parser.add_argument( - "--opts", - help="Modify config options using the command-line 'KEY VALUE' pairs", - default=[], - nargs=argparse.REMAINDER, - ) - return parser - - -if __name__ == "__main__": - mp.set_start_method("spawn", force=True) - args = get_parser().parse_args() - logger = setup_logger() - logger.info("Arguments: " + str(args)) - - cfg = setup_cfg(args) - - demo = VisualizationDemo(cfg) - output_file = None - if args.input: - if len(args.input) == 1: - args.input = glob.glob(os.path.expanduser(args.input[0])) - files = os.listdir(args.input[0]) - args.input = [args.input[0] + x for x in files] - assert args.input, "The input path(s) was not found" - visualizer = VideoVisualizer( - MetadataCatalog.get( - cfg.DATASETS.TEST[0] if len(cfg.DATASETS.TEST) else "__unused" - ), - instance_mode=ColorMode.IMAGE) - for path in tqdm.tqdm(args.input, disable=not args.output): - # use PIL, to be consistent with evaluation - img = read_image(path, format="BGR") - start_time = time.time() - predictions, visualized_output = demo.run_on_image( - img, visualizer=visualizer) - if 'instances' in predictions: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["instances"]), time.time() - start_time - ) - ) - else: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["proposals"]), time.time() - start_time - ) - ) - - if args.output: - if os.path.isdir(args.output): - assert os.path.isdir(args.output), args.output - out_filename = os.path.join(args.output, os.path.basename(path)) - visualized_output.save(out_filename) - else: - # assert len(args.input) == 1, "Please specify a directory with args.output" - # out_filename = args.output - if output_file is None: - width = 
visualized_output.get_image().shape[1] - height = visualized_output.get_image().shape[0] - frames_per_second = 15 - output_file = cv2.VideoWriter( - filename=args.output, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*"x264"), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - output_file.write(visualized_output.get_image()[:, :, ::-1]) - else: - # cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, visualized_output.get_image()[:, :, ::-1]) - if cv2.waitKey(1 ) == 27: - break # esc to quit - elif args.webcam: - assert args.input is None, "Cannot have both --input and --webcam!" - cam = cv2.VideoCapture(0) - for vis in tqdm.tqdm(demo.run_on_video(cam)): - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, vis) - if cv2.waitKey(1) == 27: - break # esc to quit - cv2.destroyAllWindows() - elif args.video_input: - video = cv2.VideoCapture(args.video_input) - width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH)) - height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT)) - frames_per_second = 15 # video.get(cv2.CAP_PROP_FPS) - num_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT)) - basename = os.path.basename(args.video_input) - - if args.output: - if os.path.isdir(args.output): - output_fname = os.path.join(args.output, basename) - output_fname = os.path.splitext(output_fname)[0] + ".mkv" - else: - output_fname = args.output - # assert not os.path.isfile(output_fname), output_fname - output_file = cv2.VideoWriter( - filename=output_fname, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. 
MPEG) - fourcc=cv2.VideoWriter_fourcc(*"x264"), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - assert os.path.isfile(args.video_input) - for vis_frame in tqdm.tqdm(demo.run_on_video(video), total=num_frames): - if args.output: - output_file.write(vis_frame) - - cv2.namedWindow(basename, cv2.WINDOW_NORMAL) - cv2.imshow(basename, vis_frame) - if cv2.waitKey(1) == 27: - break # esc to quit - video.release() - if args.output: - output_file.release() - else: - cv2.destroyAllWindows() diff --git a/spaces/Benson/text-generation/Examples/3d Solitario Descargar Gratis.md b/spaces/Benson/text-generation/Examples/3d Solitario Descargar Gratis.md deleted file mode 100644 index 46a27afa92bf8434e7707dbfd23454d7469db281..0000000000000000000000000000000000000000 --- a/spaces/Benson/text-generation/Examples/3d Solitario Descargar Gratis.md +++ /dev/null @@ -1,61 +0,0 @@ -
-

Download 3D Solitaire for Free: The Ultimate Guide

-

If you are a fan of classic card games, you may have heard of 3D Solitaire. It is a modern take on the traditional solitaire game, also known as Klondike or Patience, with stunning 3D graphics, customizable settings, and addictive gameplay. But what exactly is 3D Solitaire, and how can you download and play it for free? In this guide, we will answer these questions and more so you can enjoy this amazing game on your device. Let's get started!

-

What Is 3D Solitaire?

-

3D Solitaire is a solitaire game that uses three-dimensional graphics to create a realistic, immersive experience. Unlike the flat, dull cards of standard solitaire, 3D Solitaire's cards have depth and a life-size perspective, making them appear to fly off your screen. You can also customize the background and the card backs with your own images, or choose from a variety of preset options. Playing 3D Solitaire is like playing with real cards, only better.

-

3d solitaire free download


Download: https://bltlly.com/2v6MSJ



-

The History and Evolution of Solitaire

-

Solitaire is one of the oldest and most popular card games in the world. It originated in Europe in the late 18th century and was originally played with a single deck of cards. The goal was to arrange the cards into four piles, one per suit, in ascending order from Ace to King. The game was also known as Patience, because it demanded considerable concentration and skill.

-

Over time, solitaire evolved into different variations, such as Spider Solitaire, FreeCell, Mahjong, and more. Some of these variations added more decks, more rules, or more challenges, but the basic principle remained the same: sort the cards in a particular order.

- -

The Features and Benefits of 3D Solitaire

-

3D Solitaire is one of the most advanced and innovative versions of solitaire ever created. It offers traditional solitaire gameplay with a full undo and redo history, automatic animations, and custom settings. You can choose to play with normal or Vegas scoring for an authentic card-game experience.

-

What sets 3D Solitaire apart from other solitaire games, though, is its striking visuals. The game features immersive 3D graphics that make you feel as if you were playing with real cards. You can customize the background and the card backs with your own images or choose from a variety of preset options, and you can adjust the speed, motion blur, shadows, color reflections, auto-play, and auto-complete options to suit your preferences.

-

3D Solitaire is not just visually stunning; it also challenges your logic with fun puzzles across several difficulty levels. You can use interactive hints that show every possible move, ordered from most to least likely to solve the game. You can also use solutions that show the exact moves to solve the game, or let the game play itself automatically. You can track your progress and achievements with statistics and leaderboards, and compare your scores with other players around the world.

-

How to Download and Play 3D Solitaire for Free

-

Now that you know what 3D Solitaire is and why it is great, you may be wondering how to get it on your device. The good news is that 3D Solitaire is available for free on several platforms and from several sources, so you can enjoy it anytime, anywhere. Here are some of the best options for downloading and playing 3D Solitaire for free:

-

-

The Best Platforms and Sources for 3D Solitaire

-

Depending on your device and preference, you can choose from different platforms and sources for 3D Solitaire. Here are some of the most popular:

- -

This is one of the best 3D solitaire games for Android devices. It has more than 10 million downloads and a 4.5-star rating on the Google Play Store. It offers more than 100 different solitaire games, including Klondike, Spider, FreeCell, Pyramid, TriPeaks, Golf, and more. You can also customize the cards, backgrounds, animations, sounds, and settings to your liking. You can download it for free from the Google Play Store or from the official website, or upgrade to the premium version for $2.99 to remove ads and unlock more features.

-

3D Solitaire by GrassGames

-

This is another great 3D solitaire game, available on Windows, Mac, Linux, iOS, and Android devices. It has more than 1 million downloads and a 4.6-star rating on the App Store. It offers more than 250 different solitaire games, including Klondike, Spider, FreeCell, Pyramid, TriPeaks, Golf, and more. You can also customize the cards, backgrounds, animations, sounds, and settings to your liking. You can download it for free from the App Store, the Google Play Store, or the official website, or upgrade to the premium version for $4.99 to remove ads and unlock more features.

-

Tips and Tricks for Playing 3D Solitaire Like a Pro

-

Whether you are new to solitaire or an experienced player, you can always improve your skills and strategies with a few tips and tricks. Here are some of the best for playing 3D Solitaire like a pro:

-

How to Customize Your Settings and Preferences

-

One of the best things about 3D Solitaire is that you can tailor it to your taste. You can change the card size, style, color, font, and backs with your own images or choose from a variety of preset options. You can likewise change the background image or color, and adjust the speed, motion blur, shadows, color reflections, auto-play, and auto-complete options to suit your preferences.

- -

How to Use Hints and Solutions

-

If you are stuck or need some guidance, you can use the hints and solutions features in 3D Solitaire. Hints show you every possible move, ordered from most to least likely to solve the game. Solutions show you the exact moves to solve the game, or let the game play itself automatically. You can access these features by tapping the light-bulb icon in the top-right corner of the screen, and you can adjust the hint and solution settings in the game options menu.

-

How to Reach Goals and Rank on the Leaderboard

-

If you are looking for some extra motivation and challenge, you can try reaching goals and ranking on the leaderboard in 3D Solitaire. Goals are specific tasks you need to complete in a game, such as clearing all the cards, scoring a certain number of points, or finishing within a time limit. You can see your current goals by tapping the trophy icon in the top-right corner of the screen, and you can review your completed goals and achievements in the statistics menu.

-

The leaderboard lets you compare your scores and rankings with other players around the world. You can see your current position by tapping the star icon in the top-right corner of the screen. You can also view your best scores and rankings across categories such as games played, games won, games lost, average score, highest score, lowest score, fastest time, slowest time, and longest streak, and you can filter the leaderboard by game type, difficulty level, scoring mode, and time period.

-

Why Should You Try 3D Solitaire Today?

-

By now, you are probably convinced that 3D Solitaire is a great game to play. But if you need more reasons to try it today, here are some of the advantages of playing 3D Solitaire over other card games:

-

The Advantages of Playing 3D Solitaire over Other Card Games

- -

It Is Fun, Challenging, and Addictive

-

3D Solitaire is a game that never gets old or boring. It offers endless variations and challenges that keep you entertained and engaged. You can choose from more than 100 different solitaire games, each with its own rules and strategies, and you can customize your settings and preferences to make the game more fun and challenging. You can also compete against yourself or against players around the world through goals and leaderboards. Playing 3D Solitaire is a great way to have fun, challenge yourself, and sharpen your skills.

-

It Is Visually Stunning and Immersive

-

3D Solitaire is a game that appeals to your senses and imagination. It uses realistic, immersive 3D graphics that make you feel as if you were playing with real cards, and you can customize the background and the card backs with your own images or choose from a variety of preset options. Playing 3D Solitaire is like playing in a virtual-reality environment where you can enjoy the beauty and realism of card games.

-

It Is Good for Your Brain and Mental Health

-

3D Solitaire is a game that stimulates your brain and supports your mental health. It calls on logic, concentration, memory, and problem-solving skills that keep your mind sharp and active. It also helps you relax, reduce stress, and improve your mood. Playing 3D Solitaire is an excellent way to exercise your brain and care for your mental health.

-

Conclusion

-

3D Solitaire is a modern take on the traditional solitaire game, with stunning 3D graphics, customizable settings, and addictive gameplay. It offers more than 100 different solitaire games, including Klondike, Spider, FreeCell, Pyramid, TriPeaks, Golf, and more. You can use hints and solutions to help you solve a game, or let it play itself automatically, and you can track your progress and achievements with statistics and leaderboards.

- -

3D Solitaire is fun, challenging, and addictive. It is also visually stunning and immersive, and good for your brain and mental health. Whether you are a fan of classic card games or a newcomer to solitaire, you should try 3D Solitaire today and see for yourself why it is one of the best games around.

-

Frequently Asked Questions

-

Here are some of the most frequently asked questions about 3D Solitaire:

-

Q: What are the system requirements for 3D Solitaire?

-

A: 3D Solitaire is compatible with most devices and operating systems. For the best performance and graphics, however, you should have at least 1 GB of RAM, 100 MB of free storage space, and a decent graphics card and processor.

-
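As a quick sanity check of the storage requirement mentioned above, a short script can report whether the target drive has enough free space before you install. This is a minimal sketch, not part of the game itself; the 100 MB figure comes from the FAQ answer, and the path to check is an assumption you should adapt to your own device.

```python
import shutil

REQUIRED_FREE_MB = 100  # free storage the FAQ answer above mentions

def has_enough_storage(path="/", required_mb=REQUIRED_FREE_MB):
    # disk_usage reports totals for the filesystem containing `path`
    free_mb = shutil.disk_usage(path).free / (1024 * 1024)
    return free_mb >= required_mb
```

If this returns False, free up space before attempting the install.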

Q: How do I change the game type or difficulty level in 3D Solitaire?

-

A: To change the game type or difficulty level in 3D Solitaire, open the menu by tapping the three horizontal lines in the top-left corner of the screen, then select the game type option from the list of categories. You can choose from more than 100 different solitaire games, each with its own rules and strategies, and you can adjust the difficulty level by changing the number of suits, decks, or cards in play.

-

Q: How do I save or load a game in 3D Solitaire?

-

A: To save or load a game in 3D Solitaire, open the menu by tapping the three horizontal lines in the top-left corner of the screen, then select the Save Game or Load Game option from the list of categories and choose a slot. You can also use the auto-save feature, which saves your game automatically whenever you quit or switch apps.

-

Q: How do I share my scores or achievements in 3D Solitaire?

- -

Q: How do I contact the developers or report a bug in 3D Solitaire?

-

A: To contact the developers or report a bug in 3D Solitaire, open the menu by tapping the three horizontal lines in the top-left corner of the screen, then select the Help option from the list of categories. From there you can email the developers, visit their website, rate the app, or report a bug.

64aa2da5cf
-
-
\ No newline at end of file diff --git a/spaces/Benson/text-generation/Examples/Ai Apk.md b/spaces/Benson/text-generation/Examples/Ai Apk.md deleted file mode 100644 index 514fc6d811d2862d10a15c75c7d09dd32265a1c4..0000000000000000000000000000000000000000 --- a/spaces/Benson/text-generation/Examples/Ai Apk.md +++ /dev/null @@ -1,76 +0,0 @@ - -

Dead City Zombie Invasion APK: A Survival Game for Android

-

If you are a fan of zombie games, you may want to check out Dead City Zombie Invasion APK, an exciting survival game for Android devices. In this game, you must fight hordes of zombies that have overrun the city after a virus outbreak. You will use various weapons and gear to defend yourself and your allies, explore different locations, and complete missions. Here is everything you need to know about Dead City Zombie Invasion APK, including its features, how to download and install it, and some tips and tricks for playing it.

-

ai apk


Download Zip: https://bltlly.com/2v6K3c



-

Introduction

-

What Is Dead City Zombie Invasion APK?

-

Dead City Zombie Invasion APK is an action-packed zombie shooter developed by Charm Tech. It is available for free on the Google Play Store, but you can also download the APK file from other sources if you want access to some additional features. The game has a 4.4 out of 5 star rating on the Play Store, with more than 10 million downloads and positive reviews from players.

-

Why Should You Play Dead City Zombie Invasion APK?

-

Dead City Zombie Invasion APK will keep you on the edge of your seat as you face endless waves of zombies hungry for your flesh. You will need skill and strategy to survive in this post-apocalyptic world, where every decision matters. You will also enjoy the following benefits when you play Dead City Zombie Invasion APK:

-
    -
  • You will experience realistic, immersive gameplay, with stunning graphics and sound effects that make you feel as if you were in the middle of a zombie apocalypse.
  • -
  • You will have access to a variety of weapons and gear, such as pistols, rifles, shotguns, grenades, knives, armor, and more. You can also upgrade your weapons and skills to improve your performance.
  • - -
  • You will have fun exploring different locations and scenarios, such as streets, buildings, subways, factories, and more. You will also encounter different types of zombies, such as walkers, runners, jumpers, bombers, and more.
  • -
-

Features of Dead City Zombie Invasion APK

-

Various Weapons and Gear

-

One of the main features of Dead City Zombie Invasion APK is the wide range of weapons and gear you can use to fight the zombies. You can choose from pistols, rifles, shotguns, grenades, knives, armor, and more. Each weapon has its own strengths and weaknesses, such as damage, range, accuracy, reload speed, and ammo capacity. You can also upgrade your weapons and skills to make them more powerful and effective.

-

Different Modes and Levels

-

Another feature of Dead City Zombie Invasion APK is the different modes and levels you can play. You can choose from story mode, survival mode, boss mode, and challenge mode. Each mode has its own objectives and difficulties, such as killing a certain number of zombies, surviving for a set time, defeating a boss zombie, or completing a specific task. You can also play with your friends online or offline in multiplayer mode.

-

Stunning Graphics and Sound Effects

-

A third feature of Dead City Zombie Invasion APK is the stunning graphics and sound effects that make the game more realistic and immersive. You will feel as if you were in the middle of a zombie apocalypse, with detailed environments, lifelike animations, and blood effects. You will also hear gunfire, explosions, zombie groans, and screams that add to the tension and excitement of the game.

-

-

How to Download and Install Dead City Zombie Invasion APK

-

Download the APK File from a Trusted Source

- -

Download Dead City Zombie Invasion APK here

-

Enable Unknown Sources on Your Device

-

Before you can install the APK file, you will need to enable unknown sources on your device. This allows you to install apps that do not come from the Google Play Store. To enable unknown sources, follow these steps:

-
    -
  1. Go to your device's settings and tap Security or Privacy.
  2. Find the option labeled Unknown Sources or Install Unknown Apps and toggle it on.
  3. Confirm your choice by tapping OK or Allow.
-

Install the APK File and Launch the Game

-

After you have enabled unknown sources, you can install the APK file and launch the game. To do this, follow these steps:

-
    -
  1. Locate the APK file you downloaded and tap it.
  2. Follow the on-screen instructions and tap Install.
  3. Wait for the installation to finish, then tap Open.
  4. Enjoy playing Dead City Zombie Invasion APK!
-
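Since the article recommends downloading the APK only from trusted sources, one practical safeguard before step 1 above is to compare the file's SHA-256 checksum against the one published by the download site. This is a minimal, hypothetical sketch: the expected checksum would have to come from the download page, and no such value is published in this article.

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    # Stream the file in chunks so large APKs need not fit in memory
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def is_untampered(path, expected_hex):
    # Compare against the checksum the download page publishes (if any)
    return sha256_of_file(path) == expected_hex.lower()
```

If the checksums do not match, the download is incomplete or has been altered and should not be installed.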

Tips and Tricks for Playing Dead City Zombie Invasion APK

-

Upgrade Your Weapons and Skills

-

One tip for playing Dead City Zombie Invasion APK is to upgrade your weapons and skills as much as possible. This helps you deal more damage, survive longer, and complete missions faster. You can upgrade your weapons and skills using coins and gems, which you can earn by playing the game or watching ads. You can also buy coins and gems with real money if you want to speed up the process.

-

Aim for the Head and Use Grenades

-

Another tip is to aim for the head and use grenades when fighting zombies. Headshots deal more damage and save ammo, while grenades cause area damage and stun the zombies. You can also use other items, such as med kits, ammo boxes, and turrets, to help you in combat.

-

Collect Resources and Items

- -

Conclusion

-

Dead City Zombie Invasion APK is a survival game for Android devices that challenges you to fight hordes of zombies in a post-apocalyptic world. You will use various weapons and gear to defend yourself and your allies, explore different locations, and complete missions. You will also enjoy realistic, immersive gameplay with stunning graphics and sound effects. If you are looking for an exciting zombie shooter, download and install Dead City Zombie Invasion APK today!

-

Frequently Asked Questions

-

Here are some of the most frequently asked questions about Dead City Zombie Invasion APK:

-

Is Dead City Zombie Invasion APK safe to download?

-

Yes, Dead City Zombie Invasion APK is safe to download as long as you get it from a trusted source. However, you should always be careful when downloading any app from unknown sources, as they may contain viruses or malware that can harm your device.

-

Is Dead City Zombie Invasion APK free to play?

-

Yes, Dead City Zombie Invasion APK is free to play, but it contains ads and in-app purchases that can enhance your gaming experience. You can disable the ads by turning off your internet connection or by buying the ad-free version of the game. You can also buy coins and gems with real money if you want to upgrade your weapons and skills faster.

-

How can I play with my friends online or offline in Dead City Zombie Invasion APK?

- -

What are the minimum requirements to play Dead City Zombie Invasion APK?

-

The minimum requirements to play Dead City Zombie Invasion APK are as follows:

-
    -
  • Android version: 4.4 or higher
  • RAM: 2 GB or more
  • Storage space: 300 MB or more
  • Internet connection: optional (required for online multiplayer and ads)
-

How can I contact the developer of Dead City Zombie Invasion APK?

-

If you have any questions, feedback, or suggestions about Dead City Zombie Invasion APK, you can contact the developer through any of the following channels:

-
    -
  • Email: charmtech@gmail.com
  • Facebook: https://www.facebook.com/charmtechgames
  • Twitter: https://twitter.com/charmtechgames

64aa2da5cf
-
-
\ No newline at end of file diff --git a/spaces/Benson/text-generation/Examples/Descargar Gratis Fuego Mx Apk 45 Mb.md b/spaces/Benson/text-generation/Examples/Descargar Gratis Fuego Mx Apk 45 Mb.md deleted file mode 100644 index a0096a9136efaace43b69054bc419a0f83cd1039..0000000000000000000000000000000000000000 --- a/spaces/Benson/text-generation/Examples/Descargar Gratis Fuego Mx Apk 45 Mb.md +++ /dev/null @@ -1,79 +0,0 @@ -
-

Free Fire MAX: How to Download the APK File in 45 MB

-

If you are a fan of battle royale games, you may have heard of Free Fire MAX, a new version of the popular game Free Fire that offers enhanced graphics, gameplay, and compatibility. In this article, we will tell you everything you need to know about Free Fire MAX, including how to download the APK file in just 45 MB. Read on to learn more!

-

What Is Free Fire MAX?

-

Free Fire MAX is a shooter developed by Garena International, designed specifically to deliver a premium battle royale experience. It is built on the same core gameplay as Free Fire, but with upgraded features such as:

-

free fire max apk 45 mb free download


Download ✓✓✓ https://bltlly.com/2v6IOK



-
    -
  • Ultra HD resolutions that make the game more realistic and immersive
  • Striking effects that enhance the combat experience
  • Exclusive content available only in Free Fire MAX
  • Firelink technology, which lets you play with other Free Fire players through cross-platform compatibility
-

Free Fire MAX is currently available only in select regions, but you can download it from anywhere using an APK file. We will show you how in the next section.

-

What Are the Benefits of Playing Free Fire MAX?

-

If you are wondering why you should play Free Fire MAX instead of Free Fire, here are some of the benefits you can enjoy:

-
    -
  • Better graphics: Free Fire MAX has HD graphics that are more detailed and vibrant than Free Fire's. You can see every texture, shadow, and reflection in the game world, and you can tune the graphics settings to match your device's specs and your preferences.
  • - -
  • Better compatibility: Free Fire MAX runs on more devices than Free Fire. You can play it on low-end devices as well as high-end ones with better performance, and on a PC or Mac using an Android emulator. You can also play with other Free Fire players using Firelink technology, which we will explain later.
  • -
-

As you can see, Free Fire MAX has many advantages over Free Fire. If you want to experience the game for yourself, you will need to download the APK file, a small file containing the game data. Here is how you can do it in just 45 MB.

-

How to Download the Free Fire MAX APK File in 45 MB

-

To download the Free Fire MAX APK file in 45 MB, follow these simple steps:

-

Step 1: Enable Unknown Sources on Your Device

-

Before you can install the APK file, you need to allow your device to install apps from sources other than the Google Play Store. To do this, go to your device's settings and look for the security or privacy option, then enable the unknown sources setting. This allows you to install apps from third-party websites.

-

Step 2: Download the APK File from a Trusted Source

-

Next, you need to find and download the APK file from a reliable website. Many websites offer APK files for various games and apps, but not all of them are safe. Some may contain malware or viruses that can harm your device or steal your data. You should therefore download the APK file only from a trusted source, such as APKPure or APKMirror. These sites are verified and tested by millions of users and provide the latest, most up-to-date versions of APK files.

-

- -

Step 3: Install the APK File on Your Device

-

Once you have downloaded the APK file, you need to install it on your device. To do this, locate the APK file in your device's storage and tap it. You will see a prompt asking you to confirm the installation; tap Install and wait a few seconds for the installation to complete.

-
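Because the whole point of this build is the small 45 MB download, a quick size check can catch a truncated or corrupted file before you install it. This is a minimal sketch under stated assumptions: the 45 MB figure comes from this article's title, and the tolerance is an arbitrary allowance for version-to-version differences.

```python
import os

ADVERTISED_MB = 45   # download size claimed by the article
TOLERANCE_MB = 5     # arbitrary slack for updated builds

def plausible_download_size(path, advertised_mb=ADVERTISED_MB, tolerance_mb=TOLERANCE_MB):
    # A file far from the advertised size is likely truncated or the wrong file
    size_mb = os.path.getsize(path) / (1024 * 1024)
    return abs(size_mb - advertised_mb) <= tolerance_mb
```

A False result suggests re-downloading the file before proceeding to the install step.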

Step 4: Launch the Game and Enjoy

-

After installing the APK file, you can launch the game by tapping its icon on the home screen or in the app drawer. You will see a loading screen that downloads the additional game data, which may take some time depending on your internet speed. Once the download is complete, you can log in with your existing Free Fire account or create a new one. You can also link your Facebook or Google account to save your progress and sync your data across devices.

-

Now you can enjoy playing Free Fire MAX with enhanced graphics, gameplay, and compatibility. You can also play with other Free Fire players using Firelink technology, which we explain in the next section.

-

How to Play Free Fire MAX with Other Free Fire Players

-

One of the best features of Free Fire MAX is that it lets you play with other Free Fire players using Firelink technology. This cross-platform technology connects players across both versions of the game, which means you can play with friends and teammates who are using either Free Fire or Free Fire MAX without any problems.

-

To use this feature, you need to enable it in the game settings. Go to settings and look for the Firelink option, then turn it on and scan the QR code that appears on your screen. This links your account with your device and lets you play with other players on either version of the game.

- -

How to Optimize Your Free Fire MAX Experience

-

If you want to get the most out of your Free Fire MAX experience, here are some tips and tricks you can use:

-

Tip 1: Adjust Your Graphics Settings to Match Your Device's Specs

-

Free Fire MAX has amazing graphics that make the game more realistic and immersive. However, not every device can handle the high graphics settings without hurting performance or battery life, so you should adjust the graphics settings to match your device's specs and your preferences.

-

To do this, go to settings and look for the graphics option. You will see a slider that lets you choose between low, medium, high, and ultra graphics settings. You can also fine-tune the settings by changing the resolution, frame rate, shadows, anti-aliasing, and texture quality, and you can turn HDR mode on or off to enhance the game's color and contrast.

-

Experiment with different graphics settings until you find the one that suits your device and your taste. You can also keep an eye on your device's temperature and battery level to see how much impact the graphics settings have.

-

Tip 2: Use a stable internet connection and a good VPN service

Free Fire MAX is an online game that requires a stable internet connection to play smoothly and without interruptions. With a slow or unstable connection you may experience lag, disconnections, or other issues that can ruin your gaming experience, so use a fast and reliable internet connection to play Free Fire MAX.

A VPN service is software that encrypts your internet traffic and routes it through a server in another country. This lets you bypass geographic restrictions or censorship that might prevent you from playing Free Fire MAX, and it can also make your internet access faster and more secure.

There are many VPN services available online, but not all of them are safe and effective. Some may contain malware or spyware that can damage your device or steal your data, and some may slow down your internet speed or leak your IP address. You should only use a trusted, reputable VPN service such as NordVPN or ExpressVPN; these services have been vetted by millions of users and provide fast, secure VPN servers in multiple countries.

To use a VPN service to play Free Fire MAX, download and install the VPN app on your device, create an account, and choose a server location that is compatible with the game. For example, if you want to play Free Fire MAX in India, pick a server in India. Then connect to the VPN server and launch the game; you will be able to access the game's servers and features without any problems.

Tip 3: Use BlueStacks to play Free Fire MAX on PC or Mac

If you want to play Free Fire MAX on a bigger screen with better controls and performance, you can use BlueStacks, an Android emulator that lets you play Android games on PC or Mac. BlueStacks is one of the best Android emulators available, with many features that make it well suited to Free Fire MAX, such as:
  • Smart controls: BlueStacks automatically detects whether you are in combat or in a menu and switches between keyboard and mouse control schemes accordingly, so you can play the game easily and efficiently.
  • Macro recorder: BlueStacks can record and replay any action or sequence of actions in the game with a single keypress, letting you automate tasks such as looting, healing, reloading, or using abilities.
  • Eco mode: BlueStacks can reduce its CPU and RAM usage while running in the background, saving your computer's resources and battery life while you play Free Fire MAX.

To use BlueStacks to play Free Fire MAX on PC or Mac, download and install BlueStacks on your computer from its official website here. Then sign in with your Google account and search for Free Fire MAX in the Google Play Store, or download the APK file from the websites mentioned above and install it in BlueStacks. Launch the game and sign in with your Free Fire account or create a new one. You can also customize the keyboard and mouse controls to your preferences.

Now you can enjoy playing Free Fire MAX on PC or Mac with BlueStacks and experience the game in a whole new way.

Conclusion

Free Fire MAX is a great game for anyone who loves battle royale games and wants to experience them with better graphics, gameplay, and compatibility. You can download the APK file at just 45 MB and install it on your device easily. You can also play with other Free Fire players using Firelink technology and switch between both versions of the game without losing your progress or data. In addition, you can use a VPN service to access the game from any region and use BlueStacks to play the game on PC or Mac.

Frequently Asked Questions

Here are some of the most frequently asked questions and answers about Free Fire MAX:
  • Q: Is Free Fire MAX free to play?
  • A: Yes, Free Fire MAX is free to play and download. However, it contains in-app purchases that require real money.
  • Q: Is Free Fire MAX safe to download and install?
  • A: Yes, Free Fire MAX is safe to download and install as long as you use a trusted source such as the Google Play Store or APKPure. You also need to enable unknown sources on your device before installing the APK file.
  • Q: What are the minimum requirements to play Free Fire MAX?
  • A: The minimum requirements to play Free Fire MAX are Android 4.4 or higher, 2 GB of RAM, and 1.5 GB of free storage space.
  • Q: Can I play Free Fire MAX offline?
  • A: No, Free Fire MAX is an online game that requires an internet connection to play. You should also use a stable internet connection and a good VPN service to avoid lag, disconnections, or geographic restrictions.
  • Q: Can I play Free Fire MAX with my friends?
  • A: Yes, you can play Free Fire MAX with your friends using Firelink technology, which connects players on both versions of the game. You can also use BlueStacks to play Free Fire MAX on PC or Mac with your friends.
\ No newline at end of file diff --git a/spaces/Big-Web/MMSD/env/Lib/site-packages/boto3/dynamodb/__init__.py b/spaces/Big-Web/MMSD/env/Lib/site-packages/boto3/dynamodb/__init__.py deleted file mode 100644 index 6001b27b37430efbf22057efde79637b340fa1db..0000000000000000000000000000000000000000 --- a/spaces/Big-Web/MMSD/env/Lib/site-packages/boto3/dynamodb/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"). You -# may not use this file except in compliance with the License. A copy of -# the License is located at -# -# https://aws.amazon.com/apache2.0/ -# -# or in the "license" file accompanying this file. This file is -# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF -# ANY KIND, either express or implied. See the License for the specific -# language governing permissions and limitations under the License. diff --git a/spaces/Big-Web/MMSD/env/Lib/site-packages/pip/_vendor/rich/pager.py b/spaces/Big-Web/MMSD/env/Lib/site-packages/pip/_vendor/rich/pager.py deleted file mode 100644 index a3f7aa62af1ee2690e1e17ee41f3c368953625b8..0000000000000000000000000000000000000000 --- a/spaces/Big-Web/MMSD/env/Lib/site-packages/pip/_vendor/rich/pager.py +++ /dev/null @@ -1,34 +0,0 @@ -from abc import ABC, abstractmethod -from typing import Any - - -class Pager(ABC): - """Base class for a pager.""" - - @abstractmethod - def show(self, content: str) -> None: - """Show content in pager. - - Args: - content (str): Content to be displayed. 
- """ - - -class SystemPager(Pager): - """Uses the pager installed on the system.""" - - def _pager(self, content: str) -> Any: #  pragma: no cover - return __import__("pydoc").pager(content) - - def show(self, content: str) -> None: - """Use the same pager used by pydoc.""" - self._pager(content) - - -if __name__ == "__main__": # pragma: no cover - from .__main__ import make_test_card - from .console import Console - - console = Console() - with console.pager(styles=True): - console.print(make_test_card()) diff --git a/spaces/Big-Web/MMSD/env/Lib/site-packages/pkg_resources/_vendor/packaging/__about__.py b/spaces/Big-Web/MMSD/env/Lib/site-packages/pkg_resources/_vendor/packaging/__about__.py deleted file mode 100644 index 3551bc2d29846441299cf57b397b02fc164c99b9..0000000000000000000000000000000000000000 --- a/spaces/Big-Web/MMSD/env/Lib/site-packages/pkg_resources/_vendor/packaging/__about__.py +++ /dev/null @@ -1,26 +0,0 @@ -# This file is dual licensed under the terms of the Apache License, Version -# 2.0, and the BSD License. See the LICENSE file in the root of this repository -# for complete details. 
- -__all__ = [ - "__title__", - "__summary__", - "__uri__", - "__version__", - "__author__", - "__email__", - "__license__", - "__copyright__", -] - -__title__ = "packaging" -__summary__ = "Core utilities for Python packages" -__uri__ = "https://github.com/pypa/packaging" - -__version__ = "21.3" - -__author__ = "Donald Stufft and individual contributors" -__email__ = "donald@stufft.io" - -__license__ = "BSD-2-Clause or Apache-2.0" -__copyright__ = "2014-2019 %s" % __author__ diff --git a/spaces/Bl1tzie/Jam/server.js b/spaces/Bl1tzie/Jam/server.js deleted file mode 100644 index 04a48b7a429c4d0ad0b772ba1edf503e349eda21..0000000000000000000000000000000000000000 --- a/spaces/Bl1tzie/Jam/server.js +++ /dev/null @@ -1,32 +0,0 @@ -const express = require('express'); -const proxy = require('express-http-proxy'); -const app = express(); -const targetUrl = 'https://api.openai.com'; -const openaiKey = process.env.OPENAI_KEY -const port = 7860; -const baseUrl = getExternalUrl(process.env.SPACE_ID); - -app.use('/api', proxy(targetUrl, { - proxyReqOptDecorator: (proxyReqOpts, srcReq) => { - // Modify the request headers if necessary - proxyReqOpts.headers['Authorization'] = 'Bearer '+openaiKey; - return proxyReqOpts; - }, -})); - -app.get("/", (req, res) => { - res.send(`This is your OpenAI Reverse Proxy URL: ${baseUrl}`); -}); - -function getExternalUrl(spaceId) { - try { - const [username, spacename] = spaceId.split("/"); - return `https://${username}-${spacename.replace(/_/g, "-")}.hf.space/api/v1`; - } catch (e) { - return ""; - } -} - -app.listen(port, () => { - console.log(`Reverse proxy server running on ${baseUrl}`); -}); \ No newline at end of file diff --git a/spaces/CVPR/Dual-Key_Backdoor_Attacks/datagen/detectron2/tools/deploy/caffe2_mask_rcnn.cpp b/spaces/CVPR/Dual-Key_Backdoor_Attacks/datagen/detectron2/tools/deploy/caffe2_mask_rcnn.cpp deleted file mode 100644 index ba2e7726ee13a467539773091670eae6ea5befb4..0000000000000000000000000000000000000000 --- 
a/spaces/CVPR/Dual-Key_Backdoor_Attacks/datagen/detectron2/tools/deploy/caffe2_mask_rcnn.cpp +++ /dev/null @@ -1,116 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved - -#include -#include -#include -#include -#include -#include -#include - -#include -#include -#include -#include -#include - -C10_DEFINE_string(predict_net, "", "path to model.pb"); -C10_DEFINE_string(init_net, "", "path to model_init.pb"); -C10_DEFINE_string(input, "", "path to input image"); - -using namespace std; -using namespace caffe2; - -int main(int argc, char** argv) { - caffe2::GlobalInit(&argc, &argv); - string predictNetPath = FLAGS_predict_net; - string initNetPath = FLAGS_init_net; - cv::Mat input = cv::imread(FLAGS_input, cv::IMREAD_COLOR); - - const int height = input.rows; - const int width = input.cols; - // FPN models require divisibility of 32 - assert(height % 32 == 0 && width % 32 == 0); - const int batch = 1; - const int channels = 3; - - // initialize Net and Workspace - caffe2::NetDef initNet_, predictNet_; - CAFFE_ENFORCE(ReadProtoFromFile(initNetPath, &initNet_)); - CAFFE_ENFORCE(ReadProtoFromFile(predictNetPath, &predictNet_)); - - Workspace workSpace; - for (auto& str : predictNet_.external_input()) { - workSpace.CreateBlob(str); - } - CAFFE_ENFORCE(workSpace.CreateNet(predictNet_)); - CAFFE_ENFORCE(workSpace.RunNetOnce(initNet_)); - - // setup inputs - auto data = BlobGetMutableTensor(workSpace.GetBlob("data"), caffe2::CPU); - data->Resize(batch, channels, height, width); - float* ptr = data->mutable_data(); - // HWC to CHW - for (int c = 0; c < 3; ++c) { - for (int i = 0; i < height * width; ++i) { - ptr[c * height * width + i] = static_cast(input.data[3 * i + c]); - } - } - - auto im_info = - BlobGetMutableTensor(workSpace.GetBlob("im_info"), caffe2::CPU); - im_info->Resize(batch, 3); - float* im_info_ptr = im_info->mutable_data(); - im_info_ptr[0] = height; - im_info_ptr[1] = width; - im_info_ptr[2] = 1.0; - - // run the network - 
CAFFE_ENFORCE(workSpace.RunNet(predictNet_.name())); - - // run 3 more times to benchmark - int N_benchmark = 3; - auto start_time = chrono::high_resolution_clock::now(); - for (int i = 0; i < N_benchmark; ++i) { - CAFFE_ENFORCE(workSpace.RunNet(predictNet_.name())); - } - auto end_time = chrono::high_resolution_clock::now(); - auto ms = std::chrono::duration_cast( - end_time - start_time) - .count(); - cout << "Latency: " << ms * 1.0 / 1e6 / N_benchmark << " seconds" << endl; - - // parse Mask R-CNN outputs - auto& bbox = BlobGetTensor(*workSpace.GetBlob("bbox_nms"), caffe2::CPU); - auto& scores = BlobGetTensor(*workSpace.GetBlob("score_nms"), caffe2::CPU); - auto& labels = BlobGetTensor(*workSpace.GetBlob("class_nms"), caffe2::CPU); - auto& mask_probs = - BlobGetTensor(*workSpace.GetBlob("mask_fcn_probs"), caffe2::CPU); - cout << "bbox:" << bbox.DebugString() << endl; - cout << "scores:" << scores.DebugString() << endl; - cout << "labels:" << labels.DebugString() << endl; - cout << "mask_probs: " << mask_probs.DebugString() << endl; - - int num_instances = bbox.sizes()[0]; - for (int i = 0; i < num_instances; ++i) { - float score = scores.data()[i]; - if (score < 0.6) - continue; // skip them - - const float* box = bbox.data() + i * 4; - int label = labels.data()[i]; - - cout << "Prediction " << i << ", xyxy=("; - cout << box[0] << ", " << box[1] << ", " << box[2] << ", " << box[3] - << "); score=" << score << "; label=" << label << endl; - - const float* mask = mask_probs.data() + - i * mask_probs.size_from_dim(1) + label * mask_probs.size_from_dim(2); - - // save the 28x28 mask - cv::Mat cv_mask(28, 28, CV_32FC1); - memcpy(cv_mask.data, mask, 28 * 28 * sizeof(float)); - cv::imwrite("mask" + std::to_string(i) + ".png", cv_mask * 255.); - } - return 0; -} diff --git a/spaces/CVPR/GFPGAN-example/tests/test_gfpgan_arch.py b/spaces/CVPR/GFPGAN-example/tests/test_gfpgan_arch.py deleted file mode 100644 index 
cef14a435aa824a1b7c4baaf2d1fe0a2f6cc4441..0000000000000000000000000000000000000000 --- a/spaces/CVPR/GFPGAN-example/tests/test_gfpgan_arch.py +++ /dev/null @@ -1,203 +0,0 @@ -import torch - -from gfpgan.archs.gfpganv1_arch import FacialComponentDiscriminator, GFPGANv1, StyleGAN2GeneratorSFT -from gfpgan.archs.gfpganv1_clean_arch import GFPGANv1Clean, StyleGAN2GeneratorCSFT - - -def test_stylegan2generatorsft(): - """Test arch: StyleGAN2GeneratorSFT.""" - - # model init and forward (gpu) - if torch.cuda.is_available(): - net = StyleGAN2GeneratorSFT( - out_size=32, - num_style_feat=512, - num_mlp=8, - channel_multiplier=1, - resample_kernel=(1, 3, 3, 1), - lr_mlp=0.01, - narrow=1, - sft_half=False).cuda().eval() - style = torch.rand((1, 512), dtype=torch.float32).cuda() - condition1 = torch.rand((1, 512, 8, 8), dtype=torch.float32).cuda() - condition2 = torch.rand((1, 512, 16, 16), dtype=torch.float32).cuda() - condition3 = torch.rand((1, 512, 32, 32), dtype=torch.float32).cuda() - conditions = [condition1, condition1, condition2, condition2, condition3, condition3] - output = net([style], conditions) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - # -------------------- with return_latents ----------------------- # - output = net([style], conditions, return_latents=True) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 1 - # check latent - assert output[1][0].shape == (8, 512) - - # -------------------- with randomize_noise = False ----------------------- # - output = net([style], conditions, randomize_noise=False) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - # -------------------- with truncation = 0.5 and mixing----------------------- # - output = net([style, style], conditions, truncation=0.5, truncation_latent=style) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - -def test_gfpganv1(): - """Test arch: GFPGANv1.""" - - # model init and forward (gpu) - if 
torch.cuda.is_available(): - net = GFPGANv1( - out_size=32, - num_style_feat=512, - channel_multiplier=1, - resample_kernel=(1, 3, 3, 1), - decoder_load_path=None, - fix_decoder=True, - # for stylegan decoder - num_mlp=8, - lr_mlp=0.01, - input_is_latent=False, - different_w=False, - narrow=1, - sft_half=True).cuda().eval() - img = torch.rand((1, 3, 32, 32), dtype=torch.float32).cuda() - output = net(img) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 3 - # check out_rgbs for intermediate loss - assert output[1][0].shape == (1, 3, 8, 8) - assert output[1][1].shape == (1, 3, 16, 16) - assert output[1][2].shape == (1, 3, 32, 32) - - # -------------------- with different_w = True ----------------------- # - net = GFPGANv1( - out_size=32, - num_style_feat=512, - channel_multiplier=1, - resample_kernel=(1, 3, 3, 1), - decoder_load_path=None, - fix_decoder=True, - # for stylegan decoder - num_mlp=8, - lr_mlp=0.01, - input_is_latent=False, - different_w=True, - narrow=1, - sft_half=True).cuda().eval() - img = torch.rand((1, 3, 32, 32), dtype=torch.float32).cuda() - output = net(img) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 3 - # check out_rgbs for intermediate loss - assert output[1][0].shape == (1, 3, 8, 8) - assert output[1][1].shape == (1, 3, 16, 16) - assert output[1][2].shape == (1, 3, 32, 32) - - -def test_facialcomponentdiscriminator(): - """Test arch: FacialComponentDiscriminator.""" - - # model init and forward (gpu) - if torch.cuda.is_available(): - net = FacialComponentDiscriminator().cuda().eval() - img = torch.rand((1, 3, 32, 32), dtype=torch.float32).cuda() - output = net(img) - assert len(output) == 2 - assert output[0].shape == (1, 1, 8, 8) - assert output[1] is None - - # -------------------- return intermediate features ----------------------- # - output = net(img, return_feats=True) - assert len(output) == 2 - assert output[0].shape == (1, 1, 8, 8) - assert len(output[1]) == 2 - assert 
output[1][0].shape == (1, 128, 16, 16) - assert output[1][1].shape == (1, 256, 8, 8) - - -def test_stylegan2generatorcsft(): - """Test arch: StyleGAN2GeneratorCSFT.""" - - # model init and forward (gpu) - if torch.cuda.is_available(): - net = StyleGAN2GeneratorCSFT( - out_size=32, num_style_feat=512, num_mlp=8, channel_multiplier=1, narrow=1, sft_half=False).cuda().eval() - style = torch.rand((1, 512), dtype=torch.float32).cuda() - condition1 = torch.rand((1, 512, 8, 8), dtype=torch.float32).cuda() - condition2 = torch.rand((1, 512, 16, 16), dtype=torch.float32).cuda() - condition3 = torch.rand((1, 512, 32, 32), dtype=torch.float32).cuda() - conditions = [condition1, condition1, condition2, condition2, condition3, condition3] - output = net([style], conditions) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - # -------------------- with return_latents ----------------------- # - output = net([style], conditions, return_latents=True) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 1 - # check latent - assert output[1][0].shape == (8, 512) - - # -------------------- with randomize_noise = False ----------------------- # - output = net([style], conditions, randomize_noise=False) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - # -------------------- with truncation = 0.5 and mixing----------------------- # - output = net([style, style], conditions, truncation=0.5, truncation_latent=style) - assert output[0].shape == (1, 3, 32, 32) - assert output[1] is None - - -def test_gfpganv1clean(): - """Test arch: GFPGANv1Clean.""" - - # model init and forward (gpu) - if torch.cuda.is_available(): - net = GFPGANv1Clean( - out_size=32, - num_style_feat=512, - channel_multiplier=1, - decoder_load_path=None, - fix_decoder=True, - # for stylegan decoder - num_mlp=8, - input_is_latent=False, - different_w=False, - narrow=1, - sft_half=True).cuda().eval() - - img = torch.rand((1, 3, 32, 32), 
dtype=torch.float32).cuda() - output = net(img) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 3 - # check out_rgbs for intermediate loss - assert output[1][0].shape == (1, 3, 8, 8) - assert output[1][1].shape == (1, 3, 16, 16) - assert output[1][2].shape == (1, 3, 32, 32) - - # -------------------- with different_w = True ----------------------- # - net = GFPGANv1Clean( - out_size=32, - num_style_feat=512, - channel_multiplier=1, - decoder_load_path=None, - fix_decoder=True, - # for stylegan decoder - num_mlp=8, - input_is_latent=False, - different_w=True, - narrow=1, - sft_half=True).cuda().eval() - img = torch.rand((1, 3, 32, 32), dtype=torch.float32).cuda() - output = net(img) - assert output[0].shape == (1, 3, 32, 32) - assert len(output[1]) == 3 - # check out_rgbs for intermediate loss - assert output[1][0].shape == (1, 3, 8, 8) - assert output[1][1].shape == (1, 3, 16, 16) - assert output[1][2].shape == (1, 3, 32, 32) diff --git a/spaces/CVPR/LIVE/thrust/thrust/system/detail/adl/set_operations.h b/spaces/CVPR/LIVE/thrust/thrust/system/detail/adl/set_operations.h deleted file mode 100644 index 7d09355e12ad1beaf3ad6af34558d5a8692bd62a..0000000000000000000000000000000000000000 --- a/spaces/CVPR/LIVE/thrust/thrust/system/detail/adl/set_operations.h +++ /dev/null @@ -1,44 +0,0 @@ -/* - * Copyright 2008-2013 NVIDIA Corporation - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a fill of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -#pragma once - -#include - -// the purpose of this header is to #include the set_operations.h header -// of the sequential, host, and device systems. It should be #included in any -// code which uses adl to dispatch set_operations - -#include - -// SCons can't see through the #defines below to figure out what this header -// includes, so we fake it out by specifying all possible files we might end up -// including inside an #if 0. -#if 0 -#include -#include -#include -#include -#endif - -#define __THRUST_HOST_SYSTEM_SET_OPERATIONS_HEADER <__THRUST_HOST_SYSTEM_ROOT/detail/set_operations.h> -#include __THRUST_HOST_SYSTEM_SET_OPERATIONS_HEADER -#undef __THRUST_HOST_SYSTEM_SET_OPERATIONS_HEADER - -#define __THRUST_DEVICE_SYSTEM_SET_OPERATIONS_HEADER <__THRUST_DEVICE_SYSTEM_ROOT/detail/set_operations.h> -#include __THRUST_DEVICE_SYSTEM_SET_OPERATIONS_HEADER -#undef __THRUST_DEVICE_SYSTEM_SET_OPERATIONS_HEADER - diff --git a/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/assign_value.h b/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/assign_value.h deleted file mode 100644 index cf244a02193211b9b4e4f07a6bc9b975d50e5388..0000000000000000000000000000000000000000 --- a/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/assign_value.h +++ /dev/null @@ -1,23 +0,0 @@ -/* - * Copyright 2008-2013 NVIDIA Corporation - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -#pragma once - -#include - -// this system inherits assign_value -#include - diff --git a/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/transform_reduce.h b/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/transform_reduce.h deleted file mode 100644 index a8736bd75d06e54d9158baeb2504162d75312885..0000000000000000000000000000000000000000 --- a/spaces/CVPR/LIVE/thrust/thrust/system/tbb/detail/transform_reduce.h +++ /dev/null @@ -1,23 +0,0 @@ -/* - * Copyright 2008-2013 NVIDIA Corporation - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -#pragma once - -#include - -// this system inherits transform_reduce -#include - diff --git a/spaces/CVPR/WALT/mmdet/datasets/pipelines/__init__.py b/spaces/CVPR/WALT/mmdet/datasets/pipelines/__init__.py deleted file mode 100644 index c6f424debd1623e7511dd77da464a6639d816745..0000000000000000000000000000000000000000 --- a/spaces/CVPR/WALT/mmdet/datasets/pipelines/__init__.py +++ /dev/null @@ -1,25 +0,0 @@ -from .auto_augment import (AutoAugment, BrightnessTransform, ColorTransform, - ContrastTransform, EqualizeTransform, Rotate, Shear, - Translate) -from .compose import Compose -from .formating import (Collect, DefaultFormatBundle, ImageToTensor, - ToDataContainer, ToTensor, Transpose, to_tensor) -from .instaboost import InstaBoost -from .loading import (LoadAnnotations, LoadImageFromFile, LoadImageFromWebcam, - LoadMultiChannelImageFromFiles, LoadProposals) -from .test_time_aug import MultiScaleFlipAug -from .transforms import (Albu, CutOut, Expand, MinIoURandomCrop, Normalize, - Pad, PhotoMetricDistortion, RandomCenterCropPad, - RandomCrop, RandomFlip, Resize, SegRescale) - -__all__ = [ - 'Compose', 'to_tensor', 'ToTensor', 'ImageToTensor', 'ToDataContainer', - 'Transpose', 'Collect', 'DefaultFormatBundle', 'LoadAnnotations', - 'LoadImageFromFile', 'LoadImageFromWebcam', - 'LoadMultiChannelImageFromFiles', 'LoadProposals', 'MultiScaleFlipAug', - 'Resize', 'RandomFlip', 'Pad', 'RandomCrop', 'Normalize', 'SegRescale', - 'MinIoURandomCrop', 'Expand', 'PhotoMetricDistortion', 'Albu', - 'InstaBoost', 'RandomCenterCropPad', 'AutoAugment', 'CutOut', 'Shear', - 'Rotate', 'ColorTransform', 'EqualizeTransform', 'BrightnessTransform', - 'ContrastTransform', 'Translate' -] diff --git a/spaces/CVPR/WALT/mmdet/models/roi_heads/pisa_roi_head.py b/spaces/CVPR/WALT/mmdet/models/roi_heads/pisa_roi_head.py deleted file mode 100644 index e01113629837eb9c065ba40cd4025899b7bd0172..0000000000000000000000000000000000000000 --- 
a/spaces/CVPR/WALT/mmdet/models/roi_heads/pisa_roi_head.py +++ /dev/null @@ -1,159 +0,0 @@ -from mmdet.core import bbox2roi -from ..builder import HEADS -from ..losses.pisa_loss import carl_loss, isr_p -from .standard_roi_head import StandardRoIHead - - -@HEADS.register_module() -class PISARoIHead(StandardRoIHead): - r"""The RoI head for `Prime Sample Attention in Object Detection - `_.""" - - def forward_train(self, - x, - img_metas, - proposal_list, - gt_bboxes, - gt_labels, - gt_bboxes_ignore=None, - gt_masks=None): - """Forward function for training. - - Args: - x (list[Tensor]): List of multi-level img features. - img_metas (list[dict]): List of image info dict where each dict - has: 'img_shape', 'scale_factor', 'flip', and may also contain - 'filename', 'ori_shape', 'pad_shape', and 'img_norm_cfg'. - For details on the values of these keys see - `mmdet/datasets/pipelines/formatting.py:Collect`. - proposals (list[Tensors]): List of region proposals. - gt_bboxes (list[Tensor]): Each item are the truth boxes for each - image in [tl_x, tl_y, br_x, br_y] format. - gt_labels (list[Tensor]): Class indices corresponding to each box - gt_bboxes_ignore (list[Tensor], optional): Specify which bounding - boxes can be ignored when computing the loss. - gt_masks (None | Tensor) : True segmentation masks for each box - used if the architecture supports a segmentation task. 
- - Returns: - dict[str, Tensor]: a dictionary of loss components - """ - # assign gts and sample proposals - if self.with_bbox or self.with_mask: - num_imgs = len(img_metas) - if gt_bboxes_ignore is None: - gt_bboxes_ignore = [None for _ in range(num_imgs)] - sampling_results = [] - neg_label_weights = [] - for i in range(num_imgs): - assign_result = self.bbox_assigner.assign( - proposal_list[i], gt_bboxes[i], gt_bboxes_ignore[i], - gt_labels[i]) - sampling_result = self.bbox_sampler.sample( - assign_result, - proposal_list[i], - gt_bboxes[i], - gt_labels[i], - feats=[lvl_feat[i][None] for lvl_feat in x]) - # neg label weight is obtained by sampling when using ISR-N - neg_label_weight = None - if isinstance(sampling_result, tuple): - sampling_result, neg_label_weight = sampling_result - sampling_results.append(sampling_result) - neg_label_weights.append(neg_label_weight) - - losses = dict() - # bbox head forward and loss - if self.with_bbox: - bbox_results = self._bbox_forward_train( - x, - sampling_results, - gt_bboxes, - gt_labels, - img_metas, - neg_label_weights=neg_label_weights) - losses.update(bbox_results['loss_bbox']) - - # mask head forward and loss - if self.with_mask: - mask_results = self._mask_forward_train(x, sampling_results, - bbox_results['bbox_feats'], - gt_masks, img_metas) - losses.update(mask_results['loss_mask']) - - return losses - - def _bbox_forward(self, x, rois): - """Box forward function used in both training and testing.""" - # TODO: a more flexible way to decide which feature maps to use - bbox_feats = self.bbox_roi_extractor( - x[:self.bbox_roi_extractor.num_inputs], rois) - if self.with_shared_head: - bbox_feats = self.shared_head(bbox_feats) - cls_score, bbox_pred = self.bbox_head(bbox_feats) - - bbox_results = dict( - cls_score=cls_score, bbox_pred=bbox_pred, bbox_feats=bbox_feats) - return bbox_results - - def _bbox_forward_train(self, - x, - sampling_results, - gt_bboxes, - gt_labels, - img_metas, - neg_label_weights=None): - 
"""Run forward function and calculate loss for box head in training.""" - rois = bbox2roi([res.bboxes for res in sampling_results]) - - bbox_results = self._bbox_forward(x, rois) - - bbox_targets = self.bbox_head.get_targets(sampling_results, gt_bboxes, - gt_labels, self.train_cfg) - - # neg_label_weights obtained by sampler is image-wise, mapping back to - # the corresponding location in label weights - if neg_label_weights[0] is not None: - label_weights = bbox_targets[1] - cur_num_rois = 0 - for i in range(len(sampling_results)): - num_pos = sampling_results[i].pos_inds.size(0) - num_neg = sampling_results[i].neg_inds.size(0) - label_weights[cur_num_rois + num_pos:cur_num_rois + num_pos + - num_neg] = neg_label_weights[i] - cur_num_rois += num_pos + num_neg - - cls_score = bbox_results['cls_score'] - bbox_pred = bbox_results['bbox_pred'] - - # Apply ISR-P - isr_cfg = self.train_cfg.get('isr', None) - if isr_cfg is not None: - bbox_targets = isr_p( - cls_score, - bbox_pred, - bbox_targets, - rois, - sampling_results, - self.bbox_head.loss_cls, - self.bbox_head.bbox_coder, - **isr_cfg, - num_class=self.bbox_head.num_classes) - loss_bbox = self.bbox_head.loss(cls_score, bbox_pred, rois, - *bbox_targets) - - # Add CARL Loss - carl_cfg = self.train_cfg.get('carl', None) - if carl_cfg is not None: - loss_carl = carl_loss( - cls_score, - bbox_targets[0], - bbox_pred, - bbox_targets[2], - self.bbox_head.loss_bbox, - **carl_cfg, - num_class=self.bbox_head.num_classes) - loss_bbox.update(loss_carl) - - bbox_results.update(loss_bbox=loss_bbox) - return bbox_results diff --git a/spaces/Catmeow/Face2Painting_From_Photo/app.py b/spaces/Catmeow/Face2Painting_From_Photo/app.py deleted file mode 100644 index 4c0cb26ab900f67896e49ffce86bca5338d9c4fe..0000000000000000000000000000000000000000 --- a/spaces/Catmeow/Face2Painting_From_Photo/app.py +++ /dev/null @@ -1,22 +0,0 @@ -import gradio as gr -from paintingface import generate - -title = "Face from Photo into Handed-paint" 
-description = "Upload a photo; this AI will detect and transfer only the main face into a cartoon/anime hand-painted style. (If a face cannot be detected, try the edit button in the right corner of the picture to crop the photo manually.)" -article = "Examples are from the Internet" - -Example=[['Example01.jpg'],['Example02.jpg'],['Example03.jpg']] - -demo = gr.Interface( - generate, - inputs = [gr.Image(type="pil", label="Upload a photo")], - outputs= [gr.Image(type="pil", label="Output")], - title=title, - description=description, - article=article, - examples=Example, - allow_flagging='never' - ) - -demo.launch() - diff --git a/spaces/CikeyQI/Yunzai/Yunzai/plugins/ws-plugin/apps/message/message.js b/spaces/CikeyQI/Yunzai/Yunzai/plugins/ws-plugin/apps/message/message.js deleted file mode 100644 index e1908bec0f81100cf6bac154e435b9ced4586b6e..0000000000000000000000000000000000000000 --- a/spaces/CikeyQI/Yunzai/Yunzai/plugins/ws-plugin/apps/message/message.js +++ /dev/null @@ -1,277 +0,0 @@ -import { sendSocketList, Config, Version } from '../../components/index.js' -import { makeOneBotReportMsg, makeGSUidReportMsg, setGuildLatestMsgId, setMsgMap } from '../../model/index.js' -import _ from 'lodash' -import cfg from '../../../../lib/config/config.js' - - -Bot.on('message', async e => { - if (e.self_id == '88888'){ - if (e.group?.bot?.uin) { - e.self_id = e.group.bot.uin - } else if (e.friend?.bot?.uin) { - e.self_id = e.friend.bot.uin - } - e.bot = Bot[e.self_id] - } - // Skip if the bot is muted or the whole group is muted - if (Config.muteStop && (e.group?.mute_left > 0 || e.group?.all_muted)) return false - // Skip if there are no connected WebSockets - if (sendSocketList.length == 0) return false - if (e.group_id) { - // Check the Yunzai group whitelist - const whiteGroup = Config.whiteGroup - if (Array.isArray(whiteGroup) && whiteGroup.length > 0) { - if (!whiteGroup.some(i => i == e.group_id)) return false - } - // Check the plugin group whitelist - const yesGroup = Config.yesGroup - if (Array.isArray(yesGroup) && yesGroup.length > 0) { - if (!yesGroup.some(i => i == e.group_id))
return false - } - // Check the Yunzai group blacklist - const blackGroup = Config.blackGroup - if (Array.isArray(blackGroup) && blackGroup.length > 0) { - if (blackGroup.some(i => i == e.group_id)) return false - } - // Check the plugin group blacklist - const noGroup = Config.noGroup - if (Array.isArray(noGroup) && noGroup.length > 0) { - if (noGroup.some(i => i == e.group_id)) return false - } - } - // Check message prefixes ignored by the plugin - if (Array.isArray(Config.noMsgStart) && Config.noMsgStart.length > 0) { - if (e.message?.[0]?.type === 'text') { - if (Config.noMsgStart.some(i => e.message[0].text.startsWith(i))) return false - } - } - let isMaster = e.isMaster - if (Version.isTrss) { - if (e.user_id && cfg.master[e.self_id]?.includes(String(e.user_id))) { - isMaster = true - } - } - const message_id = Math.floor(Math.random() * Math.pow(2, 32)) | 0 - let msg = { - time: e.time, - message_id: e.message_id, - message: _.cloneDeep(e.message), - rand: e.rand, - seq: e.seq, - source: e.source, - user_id: e.user_id, - self_id: e.self_id, - isMaster, - sender: e.sender, - param: { - time: e.time, - self_id: e.self_id, - post_type: e.post_type, - message_type: e.message_type, - sub_type: e.sub_type, - message_id, - user_id: e.user_id, - font: 0, - sender: e.sender, - anonymous: e.anonymous ?
{ - id: e.anonymous.id, - name: e.anonymous.name, - flag: e.anonymous.flag - } : null - } - } - let message = [] - // Set the isGroup / e.isPrivate flags - if (e.guild_id) { - setGuildLatestMsgId(e.message_id) - // Convert the content into message segments - if (e.content) { - let content = toMsg(e.content) - message.push(...content) - } - if (e.attachments) { - e.attachments.forEach(i => { - if (i.content_type.startsWith('image')) { - message.push({ - type: 'image', - file: i.filename, - url: i.url - }) - } - }) - } - msg.message = message - - msg.isGuild = true - msg.param = { - time: Math.floor(new Date(msg.timestamp).getTime() / 1000), - post_type: 'message', - message_type: 'guild', - sub_type: 'channel', - guild_id: e.guild_id, - channel_id: e.channel_id, - user_id: e.author.id, - message_id: e.message_id, - self_id: e.bot.appID, - sender: { - user_id: e.author.id, - nickname: e.author.username, - tiny_id: e.author.id, - }, - self_tiny_id: e.bot.appID, - } - } else if (e.message_type == 'group') { - msg.isGroup = true - msg.group_id = e.group_id - msg.param.group_id = e.group_id - msg.self_id = e.group?.bot?.uin || msg.self_id - } else if (e.message_type == 'private') { - msg.isPrivate = true - msg.self_id = e.friend?.bot?.uin || msg.self_id - } else { - return false - } - // Check the Yunzai prefix / reply-at setting - msg = onlyReplyAt(msg) - if (!msg) return false - for (const i of sendSocketList) { - if (i.status == 1) { - let reportMsg = null - switch (Number(i.type)) { - case 1: - case 2: - case 6: - if (Version.isTrss) { - if (i.uin != e.self_id) continue - if (!Version.protocol.some(i => i == e.bot?.version?.name)) continue - } - e.reply = reply(e) - reportMsg = await makeOneBotReportMsg(msg) - break; - case 3: - reportMsg = await makeGSUidReportMsg(msg) - break - default: - break; - } - if (reportMsg) i.ws.send(reportMsg) - } - } -}) - -function reply(e) { - if (!Version.isTrss) { - const replyNew = e.reply - return async function (message, quote = false, data = {}) { - const ret = await replyNew(message, quote, data) - if (ret) {
- setMsgMap({ - message_id: ret.message_id, - time: ret.time, - seq: ret.seq, - rand: ret.rand, - user_id: e.user_id, - group_id: e.group_id, - onebot_id: Math.floor(Math.random() * Math.pow(2, 32)) | 0, - }) - } - return ret - } - } else { - if (e.bot?.version?.name == 'ICQQ') { - return async function (message, quote = false) { - let ret - if (e.isGroup) { - if (e.group?.sendMsg) { - ret = await e.group.sendMsg(message, quote) - } else { - ret = await e.bot.pickGroup(e.group_id).sendMsg(message, quote) - } - } else { - if (e.friend?.sendMsg) { - ret = await e.friend.sendMsg(message, quote) - } else { - ret = await e.bot.pickFriend(e.user_id).sendMsg(message, quote) - } - } - if (ret) { - setMsgMap({ - message_id: ret.message_id, - time: ret.time, - seq: ret.seq, - rand: ret.rand, - user_id: e.user_id, - group_id: e.group_id, - onebot_id: Math.floor(Math.random() * Math.pow(2, 32)) | 0, - }) - } - return ret - } - } - return e.reply - } -} - -function onlyReplyAt(e) { - if (!e.message) return false - - let groupCfg = Version.isTrss ?
cfg.getGroup(e.self_id, e.group_id) : cfg.getGroup(e.group_id) - if (groupCfg.onlyReplyAt != 1 || !groupCfg.botAlias || e.isPrivate) return e - - let at = atBot(e) - if (at) return e - e = hasAlias(e, groupCfg) - if (e) return e - - return false -} - -function atBot(e) { - for (const i of e.message) { - if (i.type === 'at') { - if (i.qq == e.self_id) return true - } - } - return false -} - -function hasAlias(e, groupCfg) { - if (e.message[0].type === 'text') { - let alias = groupCfg.botAlias - if (!Array.isArray(alias)) { - alias = [alias] - } - for (let name of alias) { - if (e.message[0].text.startsWith(name)) { - e.message[0].text = _.trimStart(e.message[0].text, name).trim() - return e - } - } - } - return false -} - -function toMsg(content) { - // Match guild at-mentions, emoji tags, and plain text segments - const regex = /<@!(\d+)>|<emoji:(\d+)>|([^<]+)/g; - let match; - const result = []; - while ((match = regex.exec(content)) !== null) { - if (match[1]) { - result.push({ - type: 'at', - qq: match[1] - }); - } else if (match[2]) { - result.push({ - type: 'face', - id: parseInt(match[2]) - }); - } else if (match[3]) { - result.push({ - type: 'text', - text: match[3] - }); - } - } - return result; -} \ No newline at end of file diff --git a/spaces/Cyril666/ContourNet-ABI/maskrcnn_benchmark/layers/scale.py b/spaces/Cyril666/ContourNet-ABI/maskrcnn_benchmark/layers/scale.py deleted file mode 100644 index 2c25622e939b6cc19e07c485a6910c1a5ff8da3c..0000000000000000000000000000000000000000 --- a/spaces/Cyril666/ContourNet-ABI/maskrcnn_benchmark/layers/scale.py +++ /dev/null @@ -1,11 +0,0 @@ -import torch -from torch import nn - - -class Scale(nn.Module): - def __init__(self, init_value=1.0): - super(Scale, self).__init__() - self.scale = nn.Parameter(torch.FloatTensor([init_value])) - - def forward(self, input): - return input * self.scale diff --git a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/dateutil/zoneinfo/rebuild.py b/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/dateutil/zoneinfo/rebuild.py deleted file mode
100644 index 684c6586f091350c347f2b6150935f5214ffec27..0000000000000000000000000000000000000000 --- a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/dateutil/zoneinfo/rebuild.py +++ /dev/null @@ -1,75 +0,0 @@ -import logging -import os -import tempfile -import shutil -import json -from subprocess import check_call, check_output -from tarfile import TarFile - -from dateutil.zoneinfo import METADATA_FN, ZONEFILENAME - - -def rebuild(filename, tag=None, format="gz", zonegroups=[], metadata=None): - """Rebuild the internal timezone info in dateutil/zoneinfo/zoneinfo*tar* - - filename is the timezone tarball from ``ftp.iana.org/tz``. - - """ - tmpdir = tempfile.mkdtemp() - zonedir = os.path.join(tmpdir, "zoneinfo") - moduledir = os.path.dirname(__file__) - try: - with TarFile.open(filename) as tf: - for name in zonegroups: - tf.extract(name, tmpdir) - filepaths = [os.path.join(tmpdir, n) for n in zonegroups] - - _run_zic(zonedir, filepaths) - - # write metadata file - with open(os.path.join(zonedir, METADATA_FN), 'w') as f: - json.dump(metadata, f, indent=4, sort_keys=True) - target = os.path.join(moduledir, ZONEFILENAME) - with TarFile.open(target, "w:%s" % format) as tf: - for entry in os.listdir(zonedir): - entrypath = os.path.join(zonedir, entry) - tf.add(entrypath, entry) - finally: - shutil.rmtree(tmpdir) - - -def _run_zic(zonedir, filepaths): - """Calls the ``zic`` compiler in a compatible way to get a "fat" binary. - - Recent versions of ``zic`` default to ``-b slim``, while older versions - don't even have the ``-b`` option (but default to "fat" binaries). The - current version of dateutil does not support Version 2+ TZif files, which - causes problems when used in conjunction with "slim" binaries, so this - function is used to ensure that we always get a "fat" binary. 
- """ - - try: - help_text = check_output(["zic", "--help"]) - except OSError as e: - _print_on_nosuchfile(e) - raise - - if b"-b " in help_text: - bloat_args = ["-b", "fat"] - else: - bloat_args = [] - - check_call(["zic"] + bloat_args + ["-d", zonedir] + filepaths) - - -def _print_on_nosuchfile(e): - """Print helpful troubleshooting message - - e is an exception raised by subprocess.check_call() - - """ - if e.errno == 2: - logging.error( - "Could not find zic. Perhaps you need to install " - "libc-bin or some other package that provides it, " - "or it's not in your PATH?") diff --git a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/components/highlighted_text.py b/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/components/highlighted_text.py deleted file mode 100644 index 8d8c9b0d0b34e290f5f86052372587b9a6b7cc4f..0000000000000000000000000000000000000000 --- a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/components/highlighted_text.py +++ /dev/null @@ -1,205 +0,0 @@ -"""gr.HighlightedText() component.""" - -from __future__ import annotations - -from typing import Callable, Literal - -from gradio_client.documentation import document, set_documentation_group -from gradio_client.serializing import ( - JSONSerializable, -) - -from gradio.components.base import IOComponent, _Keywords -from gradio.deprecation import warn_style_method_deprecation -from gradio.events import ( - Changeable, - EventListenerMethod, - Selectable, -) - -set_documentation_group("component") - - -@document() -class HighlightedText(Changeable, Selectable, IOComponent, JSONSerializable): - """ - Displays text that contains spans that are highlighted by category or numerical value. - Preprocessing: this component does *not* accept input. 
- Postprocessing: expects a {List[Tuple[str, float | str]]} consisting of spans of text and their associated labels, or a {Dict} with two keys: (1) "text" whose value is the complete text, and (2) "entities", which is a list of dictionaries, each of which has the keys: "entity" (consisting of the entity label), "start" (the character index where the label starts), and "end" (the character index where the label ends). Entities should not overlap. - - Demos: diff_texts, text_analysis - Guides: named-entity-recognition - """ - - def __init__( - self, - value: list[tuple[str, str | float | None]] | dict | Callable | None = None, - *, - color_map: dict[str, str] - | None = None, # Parameter moved to HighlightedText.style() - show_legend: bool = False, - combine_adjacent: bool = False, - adjacent_separator: str = "", - label: str | None = None, - every: float | None = None, - show_label: bool | None = None, - container: bool = True, - scale: int | None = None, - min_width: int = 160, - visible: bool = True, - elem_id: str | None = None, - elem_classes: list[str] | str | None = None, - **kwargs, - ): - """ - Parameters: - value: Default value to show. If callable, the function will be called whenever the app loads to set the initial value of the component. - show_legend: whether to show span categories in a separate legend or inline. - combine_adjacent: If True, will merge the labels of adjacent tokens belonging to the same category. - adjacent_separator: Specifies the separator to be used between tokens if combine_adjacent is True. - label: component name in interface. - every: If `value` is a callable, run the function 'every' number of seconds while the client connection is open. Has no effect otherwise. Queue must be enabled. The event can be accessed (e.g. to cancel it) via this component's .load_event attribute. - show_label: if True, will display label. - container: If True, will place the component in a container - providing some extra padding around the border.
- scale: relative width compared to adjacent Components in a Row. For example, if Component A has scale=2, and Component B has scale=1, A will be twice as wide as B. Should be an integer. - min_width: minimum pixel width, will wrap if not sufficient screen space to satisfy this value. If a certain scale value results in this Component being narrower than min_width, the min_width parameter will be respected first. - visible: If False, component will be hidden. - elem_id: An optional string that is assigned as the id of this component in the HTML DOM. Can be used for targeting CSS styles. - elem_classes: An optional list of strings that are assigned as the classes of this component in the HTML DOM. Can be used for targeting CSS styles. - """ - self.color_map = color_map - self.show_legend = show_legend - self.combine_adjacent = combine_adjacent - self.adjacent_separator = adjacent_separator - self.select: EventListenerMethod - """ - Event listener for when the user selects Highlighted text span. - Uses event data gradio.SelectData to carry `value` referring to selected [text, label] tuple, and `index` to refer to span index. - See EventData documentation on how to use this event data. 
- """ - IOComponent.__init__( - self, - label=label, - every=every, - show_label=show_label, - container=container, - scale=scale, - min_width=min_width, - visible=visible, - elem_id=elem_id, - elem_classes=elem_classes, - value=value, - **kwargs, - ) - - def get_config(self): - return { - "color_map": self.color_map, - "show_legend": self.show_legend, - "value": self.value, - "selectable": self.selectable, - **IOComponent.get_config(self), - } - - @staticmethod - def update( - value: list[tuple[str, str | float | None]] - | dict - | Literal[_Keywords.NO_VALUE] - | None = _Keywords.NO_VALUE, - color_map: dict[str, str] | None = None, - show_legend: bool | None = None, - label: str | None = None, - show_label: bool | None = None, - container: bool | None = None, - scale: int | None = None, - min_width: int | None = None, - visible: bool | None = None, - ): - updated_config = { - "color_map": color_map, - "show_legend": show_legend, - "label": label, - "show_label": show_label, - "container": container, - "scale": scale, - "min_width": min_width, - "visible": visible, - "value": value, - "__type__": "update", - } - return updated_config - - def postprocess( - self, y: list[tuple[str, str | float | None]] | dict | None - ) -> list[tuple[str, str | float | None]] | None: - """ - Parameters: - y: List of (word, category) tuples - Returns: - List of (word, category) tuples - """ - if y is None: - return None - if isinstance(y, dict): - try: - text = y["text"] - entities = y["entities"] - except KeyError as ke: - raise ValueError( - "Expected a dictionary with keys 'text' and 'entities' " - "for the value of the HighlightedText component." 
- ) from ke - if len(entities) == 0: - y = [(text, None)] - else: - list_format = [] - index = 0 - entities = sorted(entities, key=lambda x: x["start"]) - for entity in entities: - list_format.append((text[index : entity["start"]], None)) - list_format.append( - (text[entity["start"] : entity["end"]], entity["entity"]) - ) - index = entity["end"] - list_format.append((text[index:], None)) - y = list_format - if self.combine_adjacent: - output = [] - running_text, running_category = None, None - for text, category in y: - if running_text is None: - running_text = text - running_category = category - elif category == running_category: - running_text += self.adjacent_separator + text - elif not text: - # Skip fully empty item, these get added in processing - # of dictionaries. - pass - else: - output.append((running_text, running_category)) - running_text = text - running_category = category - if running_text is not None: - output.append((running_text, running_category)) - return output - else: - return y - - def style( - self, - *, - color_map: dict[str, str] | None = None, - container: bool | None = None, - **kwargs, - ): - """ - This method is deprecated. Please set these arguments in the constructor instead. 
- """ - warn_style_method_deprecation() - if container is not None: - self.container = container - if color_map is not None: - self.color_map = color_map - return self diff --git a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/interpretation.py b/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/interpretation.py deleted file mode 100644 index 767ad641b99a51c08b4efadec350c7170bdc734b..0000000000000000000000000000000000000000 --- a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/interpretation.py +++ /dev/null @@ -1,328 +0,0 @@ -"""Contains classes and methods related to interpretation for components in Gradio.""" - -from __future__ import annotations - -import copy -import math -from abc import ABC, abstractmethod -from typing import TYPE_CHECKING, Any - -import numpy as np -from gradio_client import utils as client_utils - -from gradio import components - -if TYPE_CHECKING: # Only import for type checking (is False at runtime). - from gradio import Interface - - -class Interpretable(ABC): # noqa: B024 - def __init__(self) -> None: - self.set_interpret_parameters() - - def set_interpret_parameters(self): # noqa: B027 - """ - Set any parameters for interpretation. Properties can be set here to be - used in get_interpretation_neighbors and get_interpretation_scores. - """ - pass - - def get_interpretation_scores( - self, x: Any, neighbors: list[Any] | None, scores: list[float], **kwargs - ) -> list: - """ - Arrange the output values from the neighbors into interpretation scores for the interface to render. - Parameters: - x: Input to interface - neighbors: Neighboring values to input x used for interpretation. - scores: Output value corresponding to each neighbor in neighbors - Returns: - Arrangement of interpretation scores for interfaces to render. 
- """ - return scores - - -class TokenInterpretable(Interpretable, ABC): - @abstractmethod - def tokenize(self, x: Any) -> tuple[list, list, None]: - """ - Interprets an input data point x by splitting it into a list of tokens (e.g - a string into words or an image into super-pixels). - """ - return [], [], None - - @abstractmethod - def get_masked_inputs(self, tokens: list, binary_mask_matrix: list[list]) -> list: - return [] - - -class NeighborInterpretable(Interpretable, ABC): - @abstractmethod - def get_interpretation_neighbors(self, x: Any) -> tuple[list, dict]: - """ - Generates values similar to input to be used to interpret the significance of the input in the final output. - Parameters: - x: Input to interface - Returns: (neighbor_values, interpret_kwargs, interpret_by_removal) - neighbor_values: Neighboring values to input x to compute for interpretation - interpret_kwargs: Keyword arguments to be passed to get_interpretation_scores - """ - return [], {} - - -async def run_interpret(interface: Interface, raw_input: list): - """ - Runs the interpretation command for the machine learning model. Handles both the "default" out-of-the-box - interpretation for a certain set of UI component types, as well as the custom interpretation case. - Parameters: - raw_input: a list of raw inputs to apply the interpretation(s) on. 
- """ - if isinstance(interface.interpretation, list): # Either "default" or "shap" - processed_input = [ - input_component.preprocess(raw_input[i]) - for i, input_component in enumerate(interface.input_components) - ] - original_output = await interface.call_function(0, processed_input) - original_output = original_output["prediction"] - - if len(interface.output_components) == 1: - original_output = [original_output] - - scores, alternative_outputs = [], [] - - for i, (x, interp) in enumerate(zip(raw_input, interface.interpretation)): - if interp == "default": - input_component = interface.input_components[i] - neighbor_raw_input = list(raw_input) - if isinstance(input_component, TokenInterpretable): - tokens, neighbor_values, masks = input_component.tokenize(x) - interface_scores = [] - alternative_output = [] - for neighbor_input in neighbor_values: - neighbor_raw_input[i] = neighbor_input - processed_neighbor_input = [ - input_component.preprocess(neighbor_raw_input[i]) - for i, input_component in enumerate( - interface.input_components - ) - ] - - neighbor_output = await interface.call_function( - 0, processed_neighbor_input - ) - neighbor_output = neighbor_output["prediction"] - if len(interface.output_components) == 1: - neighbor_output = [neighbor_output] - processed_neighbor_output = [ - output_component.postprocess(neighbor_output[i]) - for i, output_component in enumerate( - interface.output_components - ) - ] - - alternative_output.append(processed_neighbor_output) - interface_scores.append( - quantify_difference_in_label( - interface, original_output, neighbor_output - ) - ) - alternative_outputs.append(alternative_output) - scores.append( - input_component.get_interpretation_scores( - raw_input[i], - neighbor_values, - interface_scores, - masks=masks, - tokens=tokens, - ) - ) - elif isinstance(input_component, NeighborInterpretable): - ( - neighbor_values, - interpret_kwargs, - ) = input_component.get_interpretation_neighbors( - x - ) # type: ignore 
- interface_scores = [] - alternative_output = [] - for neighbor_input in neighbor_values: - neighbor_raw_input[i] = neighbor_input - processed_neighbor_input = [ - input_component.preprocess(neighbor_raw_input[i]) - for i, input_component in enumerate( - interface.input_components - ) - ] - neighbor_output = await interface.call_function( - 0, processed_neighbor_input - ) - neighbor_output = neighbor_output["prediction"] - if len(interface.output_components) == 1: - neighbor_output = [neighbor_output] - processed_neighbor_output = [ - output_component.postprocess(neighbor_output[i]) - for i, output_component in enumerate( - interface.output_components - ) - ] - - alternative_output.append(processed_neighbor_output) - interface_scores.append( - quantify_difference_in_label( - interface, original_output, neighbor_output - ) - ) - alternative_outputs.append(alternative_output) - interface_scores = [-score for score in interface_scores] - scores.append( - input_component.get_interpretation_scores( - raw_input[i], - neighbor_values, - interface_scores, - **interpret_kwargs, - ) - ) - else: - raise ValueError( - f"Component {input_component} does not support interpretation" - ) - elif interp == "shap" or interp == "shapley": - try: - import shap # type: ignore - except (ImportError, ModuleNotFoundError) as err: - raise ValueError( - "The package `shap` is required for this interpretation method. 
Try: `pip install shap`" - ) from err - input_component = interface.input_components[i] - if not isinstance(input_component, TokenInterpretable): - raise ValueError( - f"Input component {input_component} does not support `shap` interpretation" - ) - - tokens, _, masks = input_component.tokenize(x) - - # construct a masked version of the input - def get_masked_prediction(binary_mask): - assert isinstance(input_component, TokenInterpretable) - masked_xs = input_component.get_masked_inputs(tokens, binary_mask) - preds = [] - for masked_x in masked_xs: - processed_masked_input = copy.deepcopy(processed_input) - processed_masked_input[i] = input_component.preprocess(masked_x) - new_output = client_utils.synchronize_async( - interface.call_function, 0, processed_masked_input - ) - new_output = new_output["prediction"] - if len(interface.output_components) == 1: - new_output = [new_output] - pred = get_regression_or_classification_value( - interface, original_output, new_output - ) - preds.append(pred) - return np.array(preds) - - num_total_segments = len(tokens) - explainer = shap.KernelExplainer( - get_masked_prediction, np.zeros((1, num_total_segments)) - ) - shap_values = explainer.shap_values( - np.ones((1, num_total_segments)), - nsamples=int(interface.num_shap * num_total_segments), - silent=True, - ) - assert shap_values is not None, "SHAP values could not be calculated" - scores.append( - input_component.get_interpretation_scores( - raw_input[i], - None, - shap_values[0].tolist(), - masks=masks, - tokens=tokens, - ) - ) - alternative_outputs.append([]) - elif interp is None: - scores.append(None) - alternative_outputs.append([]) - else: - raise ValueError(f"Unknown interpretation method: {interp}") - return scores, alternative_outputs - elif interface.interpretation: # custom interpretation function - processed_input = [ - input_component.preprocess(raw_input[i]) - for i, input_component in enumerate(interface.input_components) - ] - interpreter = 
interface.interpretation - interpretation = interpreter(*processed_input) - if len(raw_input) == 1: - interpretation = [interpretation] - return interpretation, [] - else: - raise ValueError("No interpretation method specified.") - - -def diff(original: Any, perturbed: Any) -> int | float: - try: # try computing numerical difference - score = float(original) - float(perturbed) - except ValueError: # otherwise, look at strict difference in label - score = int(original != perturbed) - return score - - -def quantify_difference_in_label( - interface: Interface, original_output: list, perturbed_output: list -) -> int | float: - output_component = interface.output_components[0] - post_original_output = output_component.postprocess(original_output[0]) - post_perturbed_output = output_component.postprocess(perturbed_output[0]) - - if isinstance(output_component, components.Label): - original_label = post_original_output["label"] - perturbed_label = post_perturbed_output["label"] - - # Handle different return types of Label interface - if "confidences" in post_original_output: - original_confidence = original_output[0][original_label] - perturbed_confidence = perturbed_output[0][original_label] - score = original_confidence - perturbed_confidence - else: - score = diff(original_label, perturbed_label) - return score - - elif isinstance(output_component, components.Number): - score = diff(post_original_output, post_perturbed_output) - return score - - else: - raise ValueError( - f"This interpretation method doesn't support the Output component: {output_component}" - ) - - -def get_regression_or_classification_value( - interface: Interface, original_output: list, perturbed_output: list -) -> int | float: - """Used to combine regression/classification for Shap interpretation method.""" - output_component = interface.output_components[0] - post_original_output = output_component.postprocess(original_output[0]) - post_perturbed_output = 
output_component.postprocess(perturbed_output[0]) - - if isinstance(output_component, components.Label): - original_label = post_original_output["label"] - perturbed_label = post_perturbed_output["label"] - - # Handle different return types of Label interface - if "confidences" in post_original_output: - if math.isnan(perturbed_output[0][original_label]): - return 0 - return perturbed_output[0][original_label] - else: - score = diff( - perturbed_label, original_label - ) # Intentionally inverted order of arguments. - return score - - else: - raise ValueError( - f"This interpretation method doesn't support the Output component: {output_component}" - ) diff --git a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio_client/media_data.py b/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio_client/media_data.py deleted file mode 100644 index ecbb7442a6c7c1a13f24418bea01e74aeee4d033..0000000000000000000000000000000000000000 --- a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio_client/media_data.py +++ /dev/null @@ -1,8655 +0,0 @@ -BASE64_IMAGE = ( # test/test_files/bus.png - "data:image/png;base64," - 
"R0lGODlhPQBEAPeoAJosM//AwO/AwHVYZ/z595kzAP/s7P+goOXMv8+fhw/v739/f+8PD98fH/8mJl+fn/9ZWb8/PzWlwv///6wWGbImAPgTEMImIN9gUFCEm/gDALULDN8PAD6atYdCTX9gUNKlj8wZAKUsAOzZz+UMAOsJAP/Z2ccMDA8PD/95eX5NWvsJCOVNQPtfX/8zM8+QePLl38MGBr8JCP+zs9myn/8GBqwpAP/GxgwJCPny78lzYLgjAJ8vAP9fX/+MjMUcAN8zM/9wcM8ZGcATEL+QePdZWf/29uc/P9cmJu9MTDImIN+/r7+/vz8/P8VNQGNugV8AAF9fX8swMNgTAFlDOICAgPNSUnNWSMQ5MBAQEJE3QPIGAM9AQMqGcG9vb6MhJsEdGM8vLx8fH98AANIWAMuQeL8fABkTEPPQ0OM5OSYdGFl5jo+Pj/+pqcsTE78wMFNGQLYmID4dGPvd3UBAQJmTkP+8vH9QUK+vr8ZWSHpzcJMmILdwcLOGcHRQUHxwcK9PT9DQ0O/v70w5MLypoG8wKOuwsP/g4P/Q0IcwKEswKMl8aJ9fX2xjdOtGRs/Pz+Dg4GImIP8gIH0sKEAwKKmTiKZ8aB/f39Wsl+LFt8dgUE9PT5x5aHBwcP+AgP+WltdgYMyZfyywz78AAAAAAAD///8AAP9mZv///wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACH5BAEAAKgALAAAAAA9AEQAAAj/AFEJHEiwoMGDCBMqXMiwocAbBww4nEhxoYkUpzJGrMixogkfGUNqlNixJEIDB0SqHGmyJSojM1bKZOmyop0gM3Oe2liTISKMOoPy7GnwY9CjIYcSRYm0aVKSLmE6nfq05QycVLPuhDrxBlCtYJUqNAq2bNWEBj6ZXRuyxZyDRtqwnXvkhACDV+euTeJm1Ki7A73qNWtFiF+/gA95Gly2CJLDhwEHMOUAAuOpLYDEgBxZ4GRTlC1fDnpkM+fOqD6DDj1aZpITp0dtGCDhr+fVuCu3zlg49ijaokTZTo27uG7Gjn2P+hI8+PDPERoUB318bWbfAJ5sUNFcuGRTYUqV/3ogfXp1rWlMc6awJjiAAd2fm4ogXjz56aypOoIde4OE5u/F9x199dlXnnGiHZWEYbGpsAEA3QXYnHwEFliKAgswgJ8LPeiUXGwedCAKABACCN+EA1pYIIYaFlcDhytd51sGAJbo3onOpajiihlO92KHGaUXGwWjUBChjSPiWJuOO/LYIm4v1tXfE6J4gCSJEZ7YgRYUNrkji9P55sF/ogxw5ZkSqIDaZBV6aSGYq/lGZplndkckZ98xoICbTcIJGQAZcNmdmUc210hs35nCyJ58fgmIKX5RQGOZowxaZwYA+JaoKQwswGijBV4C6SiTUmpphMspJx9unX4KaimjDv9aaXOEBteBqmuuxgEHoLX6Kqx+yXqqBANsgCtit4FWQAEkrNbpq7HSOmtwag5w57GrmlJBASEU18ADjUYb3ADTinIttsgSB1oJFfA63bduimuqKB1keqwUhoCSK374wbujvOSu4QG6UvxBRydcpKsav++Ca6G8A6Pr1x2kVMyHwsVxUALDq/krnrhPSOzXG1lUTIoffqGR7Goi2MAxbv6O2kEG56I7CSlRsEFKFVyovDJoIRTg7sugNRD
GqCJzJgcKE0ywc0ELm6KBCCJo8DIPFeCWNGcyqNFE06ToAfV0HBRgxsvLThHn1oddQMrXj5DyAQgjEHSAJMWZwS3HPxT/QMbabI/iBCliMLEJKX2EEkomBAUCxRi42VDADxyTYDVogV+wSChqmKxEKCDAYFDFj4OmwbY7bDGdBhtrnTQYOigeChUmc1K3QTnAUfEgGFgAWt88hKA6aCRIXhxnQ1yg3BCayK44EWdkUQcBByEQChFXfCB776aQsG0BIlQgQgE8qO26X1h8cEUep8ngRBnOy74E9QgRgEAC8SvOfQkh7FDBDmS43PmGoIiKUUEGkMEC/PJHgxw0xH74yx/3XnaYRJgMB8obxQW6kL9QYEJ0FIFgByfIL7/IQAlvQwEpnAC7DtLNJCKUoO/w45c44GwCXiAFB/OXAATQryUxdN4LfFiwgjCNYg+kYMIEFkCKDs6PKAIJouyGWMS1FSKJOMRB/BoIxYJIUXFUxNwoIkEKPAgCBZSQHQ1A2EWDfDEUVLyADj5AChSIQW6gu10bE/JG2VnCZGfo4R4d0sdQoBAHhPjhIB94v/wRoRKQWGRHgrhGSQJxCS+0pCZbEhAAOw==" -) -BASE64_AUDIO = { - "name": "test/test_files/audio_sample.wav", - "data": "data:audio/wav;base64,UklGRuI/AABXQVZFZm10IBAAAAABAAEAQB8AAIA+AAACABAAZGF0Ydw+AACO/w//5P6R/9D/SgDJAGIAegA3ALkAPAC8/zEA4/+G/8X/3//f/+n/jv+d/87/mP+p/7v/jv/C/ygAogB+AOQAHADX/1EAQwCz//T/kv/B/oD/rf8VABUAKAA3ANv/4P/o/8T/5/8o/6P/dgDDADcBUwCu/w3/+f5Z/5L/YQCfAMsAaAGxAXgAg//m/lT+Rf6k/lQA8wAXAR0BtwD1AF4Amf8g/xX/Tf/8/rb/FQDc/6sA6wAJAeIABQEyADn/af7D/b7+Mv8nALwAdAFAAooBswAKAEz/4v66/nb/KAAlAEoAQwBIAM//qf85AGAAeP+z/5f/n/8rAOL/MwBkAMsACwHxANUAjP8B/w7/2/7X/vj+TgDp/0MA5wDRAOMA5v+Q/+n/1/+C/zL/qf/y/yMAhQBEAEAAyf9A/23/JQCZ/5EArgDkAGMAmP/o/9b+Hv9O/8f/mQCdAIwAYwDX/3T/5v7//8r/PQCNAMIAvADq/4//SP8yAMP/1v/t/67/AgBaADwAAQD+/4YAZQDmAHAAgf+S/0D/D/94/7oA1QDaAMoAQgEFAX0A+v+S/i3+lP4o/ycACQBlAMQALAHxAJb/ZQBV/4T/z/8HAMUADgEuASQANwCCAD8A2/9e/wz/O/8u//T/+////ysATABVACABbQAwAMX/tf44/93+vf8IAHEAJAGnATYBoQCn/3j/VP65/vz///83AE8AeQDD//X/b/9RAMz/vwBmANP/dQAaAKT/vP/X/57/xP9B/1H/Bv+nAPgALwF3AY8BFQDe/9f+tv73/qT+hgBPAPcAOgAoAC8Akv/C/3YAaP/3/1//d/+6/6b/TQCAAPMAtgC5AN7/dv/s/fj+Ov/6/+8AfAGQAagB1gBV//3+kf7R/oH+jv/H/3AAdgCYABAAowDK/97/uwAEAJEA3v8SAJ3/b/8vAO3/8f+QAFT/OgCCAEkAKwAFAKL/Qv/S/4//yP/s/2wAPQB3AF4AlAAXAAsAZP+a//b/rv8ZAOb/EgCt//z/sQAlAC0AJwHs/1D/G/68/k3/z/+TAfgAewE7AvwA8v+Y/nn+7P7E/YMAmwDQAIABYwBxAEYAHwBrAIP/Rv9m/9f+GwBH/7j/0wCVAfgBCAHJ/8f/s/7+/rb/BP+v/zMAzgDa/+T/twAfAKD+7f91/+f/sQDq/6H/AACZANAAfgD1/+n/aP6h/9X+uP4CAHkAqAGBAT8BkgHZ/33/Df9j/
jD/PP/HAI4AIwChAKsApv+3/yD/kv/+/x8A+/8v/xsASgBbAIcAdADy/4YAaP/w/8v/T//U/zkA2P+dADQBdAAqAP3+bP/P//r/i/+M/in/bQAaAEQBhwDsAJcAXf+o/+T+TP/A/1cANgCIAI0AJQHK/53/AwCqAEQBWAD6/8X/dv/L/83/q/9rAFsA/ABPAMf/xf5K/+7+Sf9nAPwAjAGYAA8Ar/+b/5L/kf8m/z8Ad/83AVgA2P/cAJn/VwDG/6P/gP8Z/z7/XP/P/oUA7P9XAK4AKwCNAKn/Iv9YAAUA3P8DACoAPgC8/moAFgA1ANEA9P/r/7IAxP/c/kD/vv9cAEoArAFmAVEAagBJABj/yf+X/z8AGABY/2kA2f85AC4APP+c/+f/yf8T/+r+bgCu/x8AJgKUAbMBTAI6AGv/TP7//X7+vv7sAL//bAEnAoYATgCt/+n/Uv9w/tP+j/6i/0YAUAA8AXgBIQJEAfL/Cf6a/if/iP9bADsBugLiAiMBVv/e/r3+EP7s/Xr/qP9z/4AAQwCk/7MAlwDoAOgA6f+A/+n+D/9E/if/BwHTABIC2gGEADMAUf9P/3D+lv7F/sv/6QBPACQAWwDgANn/2f8I/z7/7P96/lr+vABgAWYBEgJaAT8Asf/N/3n+FP6N/kP/mADsARIB7AC4AIX/kv54/v3/BQDf/0sAKQCqAGEATP8jAMr/7ADtALL/9f6k/pT+vv7t/84AyAG7AQECJwDG/7n+d/2X/uD/6QBKAZ8BOgGbAAwACv/f/goAsP+d/2z/QQFJAML/uP/Z/xABmf8LAE8AEgCM/wn/c/99/04AgQHG/5IBOwFrAGABOAC+/+/+5v6W/j/+qf/mAGX/9AC/AHb/i/8g/6z/n//J/2wAiABZAZABiADBAMP//f8PAE4AEgAvAPH+jv7A/+n/OgDk/4wAKAAVAJUAj/99/tP+Mf4AAMgBGAFZAZUBhwCh/2b/Y/+C/2f/6v8X/3n/+v7A/mkAr/8ZAF8B/wDBAPH/8P/o/9j/TACr/wwAZgC8////3f+4/mz/XgCF/9D/XwA2/6v/pv/3/1YA1QDmAFQAnABDALX/NQDx/zEAewFfALsAVwCH/77/7/5m/9D/Qv/k/4n/7v7S/n79tv/DACEALAHaAacBugDfAJIA7v+x/+X/EP+d/+j/2P8LAMH/Iv8PABcAlP/I//D+VwDS/mT/jwB4APUAwAC5AD0BAP+PAGsAIP8gAaT/sAAqAL8A9AAG//n/SABU/nX/uv/p/37/gP85AMX/aQBMAMn/Mf9vAOb//QBHAPn/hgDi/ykAGv9h/kAAqwCU/wAAZQBgART/i/+F/5D+YP9wABoAUABNAe8AcwCbAK4A8f+oALYAkP89/8f/7f7+/8b+Tf+yAPX/CAEHAaz/ywAbAXv/Kf/R/5EA2f9uAQAANf+5AKkAZf9T/xABLwB0/yoAIgAKACsAGP+B/93/mf+6/+r/bP9s/in/fwB5APAAKgEvAdIBTgBsAFMAMf+3/s/+GAAWAL0AQAEFAH3/cf8aAMj/tP9+/+D+lwDsANP/mP+DALH/pf+MALQAwgDlAAwAbf/5/00A5/99/1AAZv9q/8H/0P6+/vj+4/9hAdb/xwDQAIX/zP7e/uD/I/+T/0QBOQCtAE8B3v6DANb/Dv9T/1YA2P9p/4QAngF0AfcARwBD/9wAGP8u/yv/z/7T//b/yf9vAKIBlAALAHEB3v+8/s7/H/70/LD+FAGGALcBZwIeAbkA2gBB/2H+0P5V/93/ZwC2AVL/uP+o/yj/r/+6/p//hf/K/qYBKwIoAUIA8wD8/zD/ggDC/tr+2v7d/9r/RQE5AgEA7f+TAcn/Xv8AAB0AlP65/hUB5v8nAU4CBwAI/xgAU/5i/oz+6v6u/7sBCgKuAQ0BkAD1/rT/R/8+/mkA0f1n/4cA9gDLAKgB3gBg/1cA6wCX/lT+AQAG/m7/FgGo/xAAeAExALcAbf+//
x7/Uf8pANf/QgCbABcB8QCyABD/rQDQ/gH/9f9F/mcAbQC4/14AtQA1AW7/LP+OAGT+9gDsAEb/BwEbAMoABAHS//z/g/9i//T+qv0AAOv/b/+QAKj/2gDKAScAdQHl/0YAEQDn/+kAzf6xAEgANwAGAGYAOf+D/zUAdP6R/6r/W/8oALz/UQErAKEAGQHv/jQAQf/B/2X/CAA6ALcAjAGAAHD/NwGsAHQAAP++/r//Yv6J/+j+zv9T/0YARgFHARgA7wAdAIT/RwCe/yEAQgAuALT/FwCYARMAV/9pATf/XwD+//f/F//V/yb/fv8FAPf/dQCP/xsAMv/mAOH/lAA5AXT/Vv4/Avb/n/8mAcEAhP9i/+3/4P24/8H/JP+g/iQCZf/wAD4B1P88AJgAXQDY/oj/QQCQANn+UwCd/5gB//9o/w8Apv8n/4X/t//j/4sA1P+oAMf/UQFv/zn/sgAtAFMAogDm/4oAkADBALD+5P4qAWz+bwCI//P/0/5n/1v/R/7R/5gAqQCvAGsBpQDyAAP/JQDr/9H/4P/8AB0A2ACBAGz/xv7U//H/cv/PATD/6/5p/44Aef+c/0gAhQBOALYAif/O/0YB3QD7/4IBggBKANcAhP5CAF79qf9H/4n/yQKd/2sAMQC2/uf/y/79/yAAh/+oAF8B5QCG/5L/b/8YAB7/pgEV/xn/3gD9/sf/TP+M/0oB0AAUACX/Af97AQL/Sv/F/3UAqwDbACMAWQEGAPP/LgGe/3MAcf+7/ZP9X/7t/f7+0v6lAiQBhwI1Az4A0v4//3v/Vv97ABQAKwFw/+8B+f5m/y3/Vv6vALwAHwG6/qb9VP8y/lj+WwBOAWcDfAGiAAsAFf8+/SL/of7l/5UC0gLHATwBYQCU/oT/GP67/sr/SwLI/3D+GAA1/13/uv81/iYBygHA/+L/tf/IAFD/EwHVALEA6wDbAM//fwAdAJr/3P86APf/DQEvAZn/NgBv/sH/Bf4YADL/d/7BAOD+3v95AmABEQAOAIf/5f+0/SUARwKy/zMBrgGz/1QBW/5g/6L/Gf9wAEr+GwEeAP79af9v/9D+4wAI/yEBwwAb/7MAC/8pAEUChwDwACQBnP8oAKH9mf/k/uL/MQFsAN0AQADV/yT/7P27//f+pf9NAPYA/QBcANgBgf7jAaf+7v+V/4v+cwBo/nMApAJtAV0AMf+zACQAAP4tAFT/oQCX/8MBLQEpAboAhv8Z/oj/H/+6/9n/mP8MAcL/PAIeAQQBMgHIAOP8xv5c/lf+dv36ASQCQQE0BJUANAH8/zEABP3t/yP/Tv9NANYA5v4CAEcAuP8EAQMAx/36/BwAwvwfAC8BOgOmAF8CCQGvAJ0A0/1J/Pv9mgCN/8cCHQHNAWMAKwH7/Yv/mv3W/nz8K/4QACIAUgKNAI8B6QE3A4r/JgD8/Ef/Gf2AAVsA2v6lAT4CDQHY/xwALv8s/uP85v/K/OUB1QCMAHoA1AOlAqX/uP+h/cP92v2a/qgA8P+PAZwEvv6QAsr9r/4d/lL+OACL/jEB2AESALH/3gIEACsBnwCbAf7+5/6q/u/+/v0VARcCNAEYApT/1gCu/Z7+CP7U/c7/bQH0/zwCFQH9AKYAh//YAPD+nf+3AO3/aP90AQAAwwJG/6QBz/9N/OT/Gv3a/HH/pv6jAOwBkwEtA37/YgF+/gz+hQBaALAABwME/58AVQGT/kQA5P2s//z+yf+UAIH/hgBKAFX+FALh/3UAK/+O//v8cP4WAkAAkQIyAQsDbwFMAhv/c/2J/Vr+qv2BAWUAJQAyAOL/WwDL/OUBGP50/r8AzwCOAPsDDgIXAX7/WwBt/7j7X/+b/Ab/pf/pACgB5AL4AL3/KwCJACoAwP5v/8n/YABF/rQAn/8iAgYAAQKZAFj+6QCI/q/85P8jAQcB4QDTANoCr/3F/7b8r/wv/8P/kADhAa0CTAKlAGsBvwHk/TP/6/83/sj+Cv+X/9oB5P+GA
gEACP+5AEP9uPvy/p//lQF8AfoCjgNP/woCov4F/ff9R/+8/rcA2AAFA9cAKwDIAP39zgD//q/+l/26/2L+wQAkAX0DAwIGABID0/6r/QL+m/19/z//wP+UBIX+xQHv/qz/1ADT/jMCB/9VAKsAz/43/xYCu/7AAN//lgCY/u7+ov36/NYAtgKeAekArwSP/3j/zP65/hb+Zv+S//P/6v9iArkAhf5xAIz/NgH1AAYA9v7W/zL/GADn/sYDZf8tAXoCnf3+/5b95P6A/xL+rQDnAQQDrgHy/qgB6P0W/5T+ov5z/4ECAQGeAKABawG7/zz/IAE1/Yj/AQEq/vX/NQFh/5gBIQD7ATb8lQCnAHL80//UANcAbAAEAkIA1v9j/wD/M/4iAZv+agF6ACsA0P9dAdUABQAEAZr/CwI4/hb9q/qT/zz+xf8UArUElQCZAO8CA/7K/+z9RP+k/r8CsgE9ANn/HwJr/ff+1P70AUf/Jv0CAaf8+AIa/9AAUgCjALr/IAAP/zICav9t/20AiP9qAWb+2AFT/Rz+vgDiAY/7fgA3Adz+9QDsAJ4C9v/uAUUAeP8gAKb9Hfw3/wT/QwEqAVoBiQGlAO0AwQBk/s7+Uf8P/noBnv8jAwMBB/4aAYv9N//JACn9zwL8/kcB9wJo/5EC6/4w/joBWQDFAAUAVvy6AKz9Xv5K/8D+YAICArH/AgRj/db/GP7//ZQC8P3YBZ8A7/+jALP/t/27/gL9vAAJAKQCAQEC/sQASv9R/vX+OAEA/3wDhP4mAgX9XwJw/6/+YQDW/gADK/4cAST+hP+6/UUDZgBr/z8AfQJC//MA7/8u/xH+P/76ATr8tgKG/tEAWgDOAu//m/9CAYv/5vzGAdcCMf8v/2wASwF//c4Ahvx0AFv9agLmACsAwAFEAjUA//6EAJD/PAAnARcCq/wTABIAA/1C/BsBnP10AlICegPz/wIAPAL4/N3/MQB2/REB5QFV/70A5PxpAwX+8/65ADgC8f4VAEX/xQF1AVn+6AEf/XwBxv5mAH4AE//k/YwC3P6eAG/9iP8XAwz/fgCvAvkBWABKAbP7AQGv+zoCWv9x/ywDa/2FACMB2PzzADUBAABmApn9HgNv/Jn+RAA+/bf/hQPk/jwDjAFE/0oBRPy1Af36b//AAggBeQAyAd7+6wFk/g7+ov8H/1sBZv5+AFoATwE8/m0CJf2VAen/jf87Auz8sP+U/6AA+v+bADQD9v/+/tcCgv1L/pL+Xf+X/WQBdf8FACMBMAGH/wD/qAIG/1H+7P+yARoBrwEW/xACMP8eASL+Ff7W/IX9UQHF/xwDkwNgAbEAuACn/cL+CABXAX/87ACUAesBxf5MAX//aP2ZAcf/6/9G/jkC/vwsAF0AswGK/00D4QBK/RAC+/2L/o398v6lAnsC7v/HAwf/RwGL/C4Be/5c/L4Asv/cAXYBvAA5/h8CY/4oAXH9XAHE/iL/YwAtAZL+2gJrAcT+VQMg/zYC/P04/+38ev9p/jX+mP2JA0ABXgBwAYf/CP8WAA3/3P8xANH/OgKc/Q4EcP7Z/pX/Ff/Q/d4Aov8WAZj/L/2wAQT/jwGD/x0BvgGH/1kANQJO/pv/i/0c/vcA+/6YAfsCJQGWAcT/JP8RAWf6RwAj/4f9YQJA/yYBkwAg/6sDjwDAANAAkfyfBKf9NP5CAeP9lv81AOb/PQI8/6z+DgCk/hgCWf5ZAG4BaADMAEgAP/7/AZb8qv83APT+tANT/6cBAQGT/1wAwwHl/AYAkwI3AL39pv2v/jX9Pf9i/6cBpwWCAw0DAQXDAKsBgP9T/UkCjP6b/hP+mf5A/0z5ifxmAEj7z/hr/mX5of6fBODxZwTiC/n7KgmSBAAKDQhb+3sKrgdg/Y4CiwEp/mz9oPzB+P/88ve/9OX9yvqZ+xH+Nv4GASgATQA0A0gC7QPoAVUEkgMWBK0BlwR/Az4CTwTAAdMARf+kBBr9KgDW/6QCoP/DA
NH/Yf5yAKb4e/zI+Vb4Dvvm+vz2cAOV/Cj7VQaJ/JQHgAgB+ikO5QUC/GgMxQOWBq8Fsfy/Clv/ge7vAhn5XfWI9FHxqQOC+GrxRgAOBFj+SgDCC84MkQhUCJEIOxAICGoBIAoeBjD/Iv+v/J39Evho9gL5rPVw/M33svZe+s36Zvqb+az+uPy7/k8AsgCQ/rgD8wNvAQcHagWmCOYEIATIBkEAcQK/AqkEvgGSA3QFLAEWAyL+oQC6+Xb9qP/D+Ir4Gf+/+Qn2lgBt+vD9PQC7/lEFEAR0//kI9QZyBogDwAPPCp8BgPVHAPMDlvIA9FP4Svy/9Ez0I/3r+2j7ePqBAFEEiQJ4BgoIkAyLC04Nqwz/Cw0JoQEqBfgBagAZ+1z9Hf0d+KD6Qvs19nv59vrk+B/6Wfrt/Bz4HP0d/b7/8ALY/jUDKASfA6kE2ADzA3ECNgE4B0gD1ASMBUIBNwLcB7r/kwFgBIL/oP/p/MT5oP7t+ivxu/2m/tf6BvqT/boDvv6i+gAJ0wfZAtMABQd5CjsD3v8YApsJkfqR/bj8KP8I9hbySvkW+v74s/Lx/Mf5UvvN/ywENAU1CVQJagoUEO0Lsgb3ByoI6QRmA/4CAgDT+jL8kfi5+lL3xft1+sb4QfsI+wH80/nM+2/9bf4y/BMErv2j/CwDsgMs/nAHywObAeQGJgLpBncBngMvB0ADRP+PBvgB5gAU/Wf+PgSBAhH6bfsWA074Avas+WH/rfki9o79xQTh/tT8/gS/COMDLQZMCe4JTgRM/s8Cx/4t/hH7yfs6/uv4mfWH9zv1V/Zp88/4kv7f/xoIugWpCX8LUQpHDVULDQnIClAFjwPBAiACKv8r/pX7N/+J/Zn2y/098wf1bPpn+DT6Mvtk/fX+//+i/WX/1ALO/fcBNQTT/5kDrQWKA5MCVgSnBnwFqPvDBMcGYAEa/7EEOAax/4T8hgDbA2z61PnQ+xwBtPeT9rH62v/5+BT5ggIGBR4EpgFgB8wGmwWMAwcGUAIFBXr/4QKs/V38n/ta94X2SPYR9+f1kvtb9Zj95/3QAK4CSQZNCLwLbQdJEugM+wPxDXgElgLKACYCVPxW/Sv6ZP1s+V35+/rz+Ln2lP2E/BL39/4y/AX+V/1WAisBEwHn+9D+QwXkAWz/2wTlB/sB+/7OBp0KowAHAPsFGgkvAJb9EAHlAWL7Y/o9AcoDBP9N+xz77/3D+Hj0bvyu+lv+Sv/bBXcD1ARmBOkF5QUQAzoGwQFEBb7+swDL/OX5APyW9371IvuC8x/5u/pu8cD/4P4t/90HwQVADVsO8AlNEHEIkQQiBG4EFv8fAjEBBQCq/Rb/yf3R+BT94vYz+iz2MPgHACT5F/WGAYUAUv8V++7/WAWK/OT/swK7BaQE2AHcBMQLpgAt/+cDywZzAcz94gckBf79nf07AqoAKf6k/E8BZf1k+6D5+Pcl+0r89/qk/TwE5P4zA/cBowEgB5cBPwYnB2ECJQhRA7b9v/6Z/kb77fho95n6H/bp87X5MPcw+5/7uwKZAlMDgAn9B/0JFQzjBzML8ws7Bi8G7AK1/5EAZP21+Cn+MPwh+vD0y/cYAUP2MfWkAI/+Sf5g94oBfwKg9xAAY/+VBg8Cx/47B2QGBAFB/yoCUAjlBKf92wU6BU7+TgN+/yoEgAAw/hwHDv+U/qf8CfuU+J/5KfnT+oL91vvZ+9gBwAAeA/0DqAMEBhMFDAfPAkkDeQAvCPUA5P4z/rL9+/uD9EL3sfXs9mz2evmD+Zv9+QN+BcYDCAsvCRoICwhVCpkISwKsCHMFSwVLAJoCRAKi+SD4DvmB/cb3mfV0/Kz/Sfzh+G0AE/0M+mb2ov7rAY797f9+AtkKY/4rAt8AoAXqBsv+uQQfBakB5wTPA6EE1gPN/y8Cmv9GAf77hACK+oD8xv3B/BH+uvsw+XT5kPkI/OD+jfxsAU8EVgmKAwYIMweyBmYB3gKx/gQBB/6B+6v/xfgU/
gD27fly9S/18feL+GP7cwNNAOgDCwuID8cK7QeWDSELGwc0/gwHfwIEAov4bQGtAgT7Bfk0+s/9Fvai96b8kv10+UD8AfvZAM37qvp5/s0Fzv0dAJEE2wIIBo//twToA4UBDQJDBtICDQT9BOwDCP8HBNoBeQDl/wT+oAB6/F7///nb/nv4KPyP+Xf93P2N+UwANf/1AUYCYwcCB34HIQZ/BqkCOAH3/mb/U/6l/uj8P/zv+F745PXA72L6Hvzy+lT5GwKoDJMDkgC+C6sKTwbNBUQHUAyNBRcBBgUcBP3/Afyr/OH/3PiK89n9bf3297f4Xf3g/or74fsP+/D/Q/46/T3/UARk/0YB/QPEAJwEGgAvBvkDcADRBMkDvgG4ALcCBAV4AAgHwAL3AIf/TQD+/S751/r/9S7/RPY9/0P8Sfqu/Rj+zgCiABkFpQbuBQIGkAiLAzUItQFbAwwBNABW+9n/6vbo72H1Avr890ryTPsvAmsAp/u9BucHqwrWBEEKrQwxDCsD8whkB64BaQHK/7gBnvgd/FH3ngDf+JH4B/9p/ej5z/vp+637tPv1/PgBuv1m/yn+gAGP/vcAyQBpBaIAZgX4BYEBzQY9AYgE6wBCAfsEqAK1AZoCmP/fAzv9Wf29/Lz69fxD+4z79/pb+rf60fs//Ff9IwLpAm0ClwmZCOEFKQYhCE3/Y//SAQ8DFv7X+937C/7H+q3yy/aV+pP2j/EW/soFhQEKAgAJgwgpC/gFbAeNDGIGIwWnBNIHqwGV/ev97/0//mz6c/12/Qj5tPo8/A77o/iA/Db/1vfZ/rEA9/jx/LAD7P9lANgCLgX9BDr+0AOkADkE4gBTABsJ/QOVBeIETQOUA7P/mv+C//n/YAEoAej97vc9/Xz3BfgL92n4Z/0T+wsAqAIsCOQCSQblCbYECgKOBn4DBwKk/YYATwLv/Xv4Evow/CDzl/Mh9DD/tfUa/RIDGwFTBh0E2wc+CdEIjwnqBNcLKQbLAC4Fqv3jABUAqANX+/z/nPwd+Wf4cvZf/mv5evgJ/kj/IABC/pAAUv58/CcABv4oANf79AFyAxoEFQLKBScHXwR4AYQDjwSuAvACJwOp//IDSAZ7/CADvf7yAp74JPpH/Cf1YfuM9M35lwJp/7f9MQW3Bm4BKv7cA7oHPQPNAU8IVwQQBTP+JwA//yb6Zfob9aD7+/ON9/z3Cfsz/G798gWfBlcEWQkqBs4KZwesBLMIggE+BoMAlwTMAO8C+P4n/PD7Kvue++T31/qn+xQAtPx4/a8B5P2d+6H65/2f/xX5GwObAXr98gP9/7IBYQJfBUABvgI9BNkDsQTb/wwHKwPJAlABqQPZBz7+zAAr/3D6DP4p8qH3ofuj9qn3kv7OAjkA9ABCA9UJkP8wBu4DmQPuB639ZQXpA9kBi/6u+yv+H/UO9c35jPIg+Tj7gfsH/zf/pAfZAWgIkAasCsYDywnXBqYDGwl0AJwH7wBAApD6B/1N/qH5Qfe8/+b4b/yW/T7/3PwB/FT/Ifu8/jv6fQEm+7MC/f7jAfIBYgF8BF370AHNAoj+hQTHANIDlgn0A4kG7wFkB2gBaP4iBQgAcf7C/IT8lvts+2r2efso/cz5JPyO/iQGHP4YAL4E5gEcBlEDjAJmBdAFUwOsAkwAF/y2/EX4cfgX+VD2wPqc+Bf42/5n/4UDFf+GCJsESAJeDXoGQwb6BB0KXggmBdf/DAMU/b/9//pK+0z99Psy/U/5wf6q/xT/3/eO/zb5gP5g/Mv8Zf5y/vsAogFPBGn/cAMQ/McFdv6o/4kEYAXPA4IBlgWSCu8AUAKhB+L+UwS4+yoAdP1A/wX7R/tp+6/+j/Xi+wEAgPY9AJ0AOQFOAhAELABsBxMF0wq8AJQJaAQG/ocAgfhn+UP6gfqt95v8mvTg/WP3vf60/Q/7lASuBGsJewn6BhEM6QfE//gJpwNSAD0AKQIC/SsDMwH5/Xv8jvzt/aT3gvwB+
U34AfyX+LD/pQNy+ysDvvuiAOf6Vf/O/nr9YAOdAOMC2waAB3AAUQYa/5AC6//gBPMAmwJVArAEBQS0A6wAlvzu/dP8cvuu9xv7hfef/Vz40v7B/BQEGgEbBVYGMwnjBOoBigOHAnQC9/l6BUL/Nf4R+9b+U/aI+Gv2Ivvc9gH9tvvj+5wHzQJ9BMAGIQqWAgsK6wTaCckC9QRh/+sAEAHZ/Vn+gQCd/Yj7MwE0+zkBBPYP+yD9Gv96+uX6NwCjAbD46/0hAtj8dwJg/Un8DAQ0BxT+GAh3ANQDMQA7Bl4Gmv4SCNoB7wImAoECigRKBwz7RQFy/av8lPbd9jH58fRi+37+7vsv/EoFU/xTBs0E7QKyBwkHMgOKBtoDeQbr/WkB0P4m92X8y/Sj9p/zffiG+Bf/mPz6BLP/KATOBRsEfgRCBW0IqAfwBlUHigvS/7kCH/9CARv8Wf3Y+jr+Zvq4/MD5/v6t/v/3lgLh/Oz/+/fg/mX5K/0J/SMDVwExAyIEsgLbA6/9jALI/B4DygDIBaMDxgU4BYwDhgSyBjoC5wW5/9H6yPvE/DP6QvRW/T/9L/nR/ukCS/lYAtr6DgHF/9kH9gMKA1YJsgR8BskEhgac+cL9SP0T+lj7yPed9kH7UvYZ++j5BQJMADr/QQPMBJAIrgdbAwAFRhBmAEADgwWjBMn/Rv7xAQz9zvul/931IfzB/uj12gAz/Tr/d/tg/6X/uPuN+cX/cfxd/kUBOf4KA4/+1gGyB6wAFwQoBEr+nwWe/FwHg/4vBvQGegJcBuIGuAAx/8UAFvgd/9j8g/dQ9V382/gU/HT6CgOk/F8HmwOaArEDIQK2BnMChQmrAQQH6f3/A4v6JP1792X3sPqS8oj77/qS/s/8BQCa/GQE8wGfAUsEywqQBSMFegp7B78GYAJ6BGn+PAIeAJ/8WgBD+wH52/+O/DH/jfku/Wz79fy/+vP7yvyf/kECav3tBDr7QAaeAOz/KvwxASsCqP7kA4IEwP6QBV4GXAA+BcYCXgQK/VQGuP7kAsf9Zf43+aT9x/63+F34Rvw9/F/7+gIq/AADXP3MCMX/oQbYAKMGgATyA2wG+gHaAfv8sgN88Wb8q/kD+Z3ywPv/98r9CPymAGkCUQR5CLUCfAwGBXwLfQMsCbgAlARw9+cD8/+2+oj/4QJUBR/5NgEH+bL+4/iD/hb5Cf8BBPf6afntAMP3zAD4AVr87ACAA3MDqPutBiAEvAMWA5IFKfw/CHoBr/5ZAYACOgRVCFsE/QGcAir6AgP182z+E/Sv+pf8wfqK+gwATf+vAA0A1f+cBzr+iwmS/JkG5Ae1BwEFSwKe+WcEkve8+2T3lvMj/Er4cfuv9jIFS/lqAwAAQAgjAwAHW/+rBbkB2Ab/BDIF7wicAZMKhfqCBUT3X/2o+mf7mfreAvX3ZwLO/pj9pPw5+5MAlfiTA8T2EAWL+m8FJ/2bBTf//AAJAikA1/6cAa8D4f/UBzUAnAvBAJ0NFvvqAzwFsv6L/xUEN/WEAMT90fOz+4j2c/4a9ycGaf3zBCH9DAhz/ZwEN//gAeYIXQOIBFgCVwbh/QP/T/T9/4zzGQD78HP8UPvA9pQAoP7y/+kE2QZiBMUJNQL5DAABcAsLAM0D1AWaAl36CgYs92r8oABI9XwDzPc1A338eP8T+I0BMfkRBRT4BgADAO/5zgO/+1H/xwGKAGj/Cwic9mQP9PWeB1kB3fy4Cb3/TgIPABUE0wIuA/IBLgmB+CcMCfcu+aj8x/hw+O77Y/tC/j4CJftQBH76LgVoApUE7ATHAp4HpwnE/yYFdQGj/8b/k/jB9O/1VvdZ92f4J/0VAO78qAfq/QkCEQX5BSQErwchBFkKKgZwCOUAGwFCBRf8IwNR9YMBsPW8/v326/66+wH7gAEz+3H/dfwPAzr9GP17AGcGePvpBpD8bAHH/FoFk/yCAAADovzpA6MB3wMr/KoHxQJ2A0sAvguE/kgLtvltA
xb90ft4/wXzuP4z+Zf83ftNAtH3dAhl9g8MKf6RAyQFdv5tAZoBgwQqB10GvvwgCLDwFwbt7qv5B/iz9WT7Uv49/YkCFgA8BH8M8PwrB08AgwkmAKsHzAKDDxv9ugjR/6f8wQHk9N0B6/ln/OX8v/1j+pAFgfPgCwH4NgDd/qP6VP4q9rP73gHSAWf79Qdc/oILAfvxBEX5swPXApf8r/4CDJ//2QFOCXIASgar/sEFM/w1Ac78+gFb9FwCWPmZ/fL+4vtN+RIAkAB9/iD68AHvCLn5fxAvAnkKNf/8BOf+G/vW+vb+EvK3AEv/9fi4AZD19wCZ8/MCVvvHAdABmAkCAQgKjgVTCSoB4ALXCrf+wwXbAdYA5vrTBZj0Ewfs9S/+kPvA+hb9yfwp+0X9Bv3V/vADePsGDMT1IwrN+1YCUf7L/pr8/ACU+mgAGQUg/dQMUvfWCjMEYgJBAJEIcAH0CQH8Kgey/H36HgF+9X8B//YW/0b6Iv569XkGQ/ZABi7+4QHoCScD1gLPB1gDrwDMAH79pwTg88AFE/WqAdr1+/0o/Wf7ofiv/msBhADxBgz9mQcAAmAEOgGNCTsC9gc1AswLOQFaAM4CpPiP/HH7GvlI/n78/fsuAJL8OQf6+CoCzv1EAsz9j/0W/HX7yP3s+iwBiP2bA24B8ASOAdUBQv4CCeD9qgFuA/EGsALQAh8J9AQ5BZr8IwEt/CQBDPu6/rb3EQDZ+Hj/y/kp/b75cQEM/q39EwMB/hIIiPwYC8L8gAh+AagHN/0TA+j3Jfxe89/1GfvG+TH+KvkTBaT8rQzp+owF4v0hCAEDrwWAB3wIj/8bBdr8mwNwBfP4ngYk+Y4HOPT7BEj5o/4S+vb8u/0P/6f/dPmL/+77HgDy9y4C9v0gAlb4GQ7T9WkIEAbcA0IApPwYBaD5BAHu++kELQLQDYH6txIS/bwEsAES/d35Iv5o+Ab7qP5f958AOfaCCzP4IQph/PADtQCzBGT8tQcKA9UA/go4/vMEzPkrAary/vzu9R3/yPTNAov0l/5bA4H9dgSu/AgOyP+kB3UB/wTCAR4JmvptC4kBiAsf/zj6yAFu8/L/6fKV+oD87QIl+3gEMPrQB2z1SATz+Y78/AV2+VMBC//3/hoCAgULAdUIBv0HCPTwzQd7+F0Ba/9CBLcGTgNmCngBVAajACwBRfXkCAr9L//F+mABg/l6Arv6QvvK/M7/L/tT+0EEevwVDCQErgjtALUIIwCd/y/4jQH59wz56/629y3//fxV/Xz83/o8/R8GZ/rZCMf8lgn5CNkGtgjXArIGzgW//aYBsAHk+dwE6PmC+x4BzPyT/08DzvEQB5n2bgFp/nXzaAKD/PgC+v9Q/XQCrwMb/8sEiPf3BGv8gfx5/R//yPpfDH0GJv6GCL3+hgrt+NEAQgGL/GEI6f9V+L0L3vvN/zEBPvt4Afv1qP2N95H7Nv+SBSECugtSAnEFQAei/OUBgPq2AEIDvPxS++z7X/pw/hP08P9Y+o37kQBD+zgCAP/R/zMNSgX5BUIMugL4BVj+IAFI/40Ep/90/EsAhP/p/Uj9+//m+08BqP8o/wj96Pz3+6f+1vzD+9/9XAO+ABH5SQQ+/g8GIgLq/iUDWv/CAKX+NQA//ToBkgULBAsDBwgEBkD+JwJS/Xb/vgCN+Tj8k//8/tYBy/ej/aADJvW2BWz9MwSaA+sDu/60AvwCOv2dBaX+uwSl+0j/s/vi+R/++vhi+Av+SPiwBbH3wQBDBMj76QjX/QsHwASXBLAIGQLn/ZYGWv76ArYCxwLLAmT+pwOhBPn6B/wGAHbtX/7/92787v7gAMsEdv2RApAGtP7c/moD7/iEB6L5/v7I/VH6hQPYAKQBqAU9A/n/5v24AN0Azv2HApQI6QcBBjcHcvoKAPD7A/sm+d38FPt0APsBBP45ABwGV/rf/ogIGfwkDvv8Uf+sAZX/cwRg/ET5NgEW+VsARPzN9YX/IfiX/
iT4tQYz+RgFp/mFB3QCYwiDCMoDvBGZ+1ULiP5M/2L6RwSr9QwIlfZpBXf6XvsDAIr4Q/6SAMwA0vqACwLw4gUY+m0FI/cqBW8Efv5vADYGoPcGBu380/4+BH32GQ4B+RUB+f5TBIf4dxCB+s0KSAJtAIkEhAID/4QDewA4/qIBt+35CUv1ugIR9lgHDvzyBEz/eQAWA1ACCQTp/i4KOPtJAsv9Bv/k8YAAz/TC/zzyrgCh+g4AcgUw/rP93wnCAfv+fwnV924NcfyYDsr+TQ/MA1UADAAgAHX1oQJj+HX7wP8F9dgL9PdMA436ZgBm+aMFl++/CDr2UgGCBTT+dAViAqoEKfzEAiTzswd49QcGSPczCAEFkQKH/x8EigDABMgE5P6/AnP/jAc88bsIg/cKA038lQI4/PIDlvnPAib//flABtr+GAsq/BAFNgIDAU769QcN7hQJlPqn/kL/k/cy+BX8Pv7U+Wb/Cv+bC6z0ngwN/EwGkAkmBgT8tQ/P+gwJBv0M/z8E5veNEODzDAYl++oCHPt8AkT0v/8O+TADpPht/WUBSPorA178Nv7J/egHg/JJD/70DwiOAZ4F2gAZ+s8C2gFQ/QUBrP6DAB8L7fm/CVH52wuW+fEJKfsBAkr5UQRJ/Br80QVn96AOPvO2AZr79gIJ/O8DKfmNDP//Zwh3AYz7AQJk++wBp/Sa//zxbwXd7wsBrPhMBdr/f/73A5z77Ait/v0EV/8VCuwAvxNB/uQItwOGAdj4gQB0+T72vgm09VQJAvKrCNj8Of3B+fEEBPk1AyT+EwCy/fD2pAd5+nME5PoTCFf71gpS7pcIk/QZB/T6oPtqDCr5kgv+Ah8HhvyYCJz6OQr38BoHTf9qARwGav13/kcGXfvu/XH52PhGBZjyQwyH+RoFUAQWBhH+IQmw9TQJbfuQ+NQCqvPWA+zyLAAo+hn9SwCTBrnvqAlK/h0B7wBY/y8CWQpyCmAFdwXW/7AHtfYAChX2OgAtBbIALfr5BtL9hf8/900F6PvB/B8Fuvg3AIX54wHV+IsDNPw1A938LQV+ACEBqPmDCKj0rgI2BI/6eQRSAV8ItPWYBWj+MwaA++YF4QN4COQAJgP5+uIAO/1A+7f/FPm8A4/9bQru9ykJA/jpBrr7j/+YBAMBmQRR9ggIMfgo/ssAQ/WI+3ABUfXsCp3sOQh4/Kn4Lgaj/1AEfv6YCpMCkAQQ/XcU/PoNCk37cgRvBrP+ZvlV/FQEGvTyAVn8iwGc9xcC6/56Arf7egWr/ef+Bf2r++QASP5wA6j/ZARLA/r84PjdCIPzxAKQ/T79gwDpAdwGH/iMA7IFzPzXBYEIT/9QCHr7wAS+/K/+T//B/PgCX/18AtgHX/kT/F4FV/nIBsQAKPdOCA//2vhJBs3+lQNC+WcFiwH08mH/7fcg8az3zP3e/PcFdAUCCkEFVAOzAAX9XQeuAsf/YApyBy8Dngl198MDe/cQ+Z8A9Piy/Q8AIQWa+UUB7v/fAcf5xghO+OX7HAB8A+j3ff7ZA3wDBwWDAZcHCvFaCWTyVPpCAekDQ/rWBhgAbv/OAV3/owhS+SoHMwMHA6IAaQCC+3gGjPz9Ba0A4/fCBan8uPYfAzv8xQsQAZn8GQpW99UOjvlD+RsDIft3+kv8Q/q4//b2sf6VADj83gVg/VQCMv1mA8n8uwQ3/f0OMQE0AvEKS/1aBDsAUgGFAB4FLf2tAEv1ywUU/Sj7of1XAzT9Zf5L/WP3kvz3/7sCt/hwCov/gP1VA4j5eQDoAMT/fAYC99gL5/sd/hD8TfzUABH/PgcSABkILwCNAbv7ZAdW/nwBbAEaBdgAdQPw+FX/yf1Q+agG7/sKCvAAvQIg/BH/4QG7/rn6BAnY+7kBLAUr9Gf62fmR+nj6FQIi/+ECnfXIB4z20PwxB2L+KAW4BMQEKQY1CMEABAYK+u4LL/+f/9kBpvaQBRn98PYXBBH54wRK/qP1GQd3+ur96AB7A
EX3LgcY/a39SAOa//4Aiv1cACH/O/tZA0oCfvokBiP4/Ahw/WT/rgGA/nAF+gYQ/wMAPQPQ/0YFMPdZB6b8U/8K/JP6x/7YBhr57gaL/6EDdwRf+z8GEvncCP358AT1+0kBk/wa/n/5pfqF/HD25wPH/db5xwYJ9+3/HwTW/HII5v2pDuz+bQnzBQYC6ASLA7f76gSGA4799v22+ygCv/ex/378nABy/RQDbvo3A2f6dvxNAbv7NwPy+xgEQ/+i/h4AqgCT/rQBHPKsAbD+MPyBBBsClwGaAQUBpwFQBcH5CQZs+o4JwwO4AK4KZ/ujAEz8C/fiAgn9WgGxAZEDpv1cBF4BqP6wAmP9Mgyq+MsHSvjQ/nT6q/rZ+RD8vfzg/Yr2rPqyA071FwfQ/oIGFwMLBf8AcglY+GsMS/+S/wQF+/6Y/YQDOQEpAYYA3/0G/xj4Jwgk7U0OtfpgCyX+gAAjBYwA5/d+/hcAG/eKC6f44wHw/iAGp/MtACoCkAIW+p8BH/X9/ez82QAvAEP9dAsXBIcIKwWDAzr7igg5+bUPTPVCBDoASvpEA3D+xPvvA7AFtfscCan3wg8Q6ygKt/g3/D4A1/cM9tYATP0I+5sGUOzlEOvzsgQj/5D8xvYPDXz7cwYo/gsKqgqL9H8MXfibByH9WwZK9xYNq//nAgX46AXD/Oz2HQYc/Mr+BwVL/yEFAAk59IUO5feeCZ/3CAMGALX6EvpR/xv5l/nTAIj0VQTq+k8AMf69AYL0LArD80kLk/5mAS8JIPzSDmL8awffANIDDwA7BeD6LAjd/04EfgD5/t8C4f4Q/M0CyP1f/UQMXfRAA9X1PwVF9D3+1/2x+7v9Afyo/pn4ZAT2+P8EeP5gArn+qQjP/H3+zAAhD8r13Qo3Al369AiM9wwA+f23BbT2Qwnf+pQJtPe9B6L9FP7g/1AFv/4OBkwBRfnOCHP4Kgsx8NwGVvWPBPT1RwCT/zsFxvk9/VH+2vWI/8b1IAAM+asGOAKbDSz/PA4w92YTP/m0/1YBVgBSCMP4NgViBCYB1wMnAyT4hBHZ718I9/hXAkX9/AGt+aIGWvzk/TD9NPJxBKTu8Pvm+JMBJ/EfDqT0ZxBZ+lQGwAf+ALEGJ/49AiADigRD+bAM1PSREXfu6wtC81L8wvxqAD38Xv88BIwDow9W+wER7PPgBvr8TQA0918HYvakBJX0BwZ+/mf85AMk9FAAp/3//1Hx0AXf8NUI5PQUDxb+FASwCVYDKv1XDLUAdvzkDN/2agwG+SAIJf31Aa39ZwbK9YIPcvNVBvkB9fziCbv8/Qiq+FEDafMKBvnpcQNh9Hz8Yvfw/QwEDP5/+/QE0P4xAGoGsfcYDvn09RCM+SoIJwXKBaj+pAJU/Jn5bP4N+VMB8ffYBqX/OQSuBM0JyPMTEE33mgE+Avf6rwtS9jYG6/5R/J4JxfkD+4IB5/Sq+0r3g/5t9aQDhPTWBvf4af2BBTH85gTq/G4JkgXbA8gJYAOA/P4NKvvzCEf2cA7EAXX+GwUB+4ECf/7V/0j9Zghk8z4FnPYZAMz9rPQ7B8v7PvxrAaH01Qg591/4+QK59wkEhwD6AKMCRwph/ZEFUP2UBZoCgPwJB3j+yP4EBRb9zgAm9b0BygIY++sKu/sXBkMC5wPv/64D4wPKAjf+d/uFAQ/69QB4/wP3Awgf9gIHjQC88PsMK/JGARD3//h6AkH0EwKm/04CtATLAPQERP88BHoE/AIJAz8B+QjCAKAJjABF/yoEa/78Aer9bAWv+XADUgPW+g4HdfxD/5j7f/dv/jf2Kf8x9MD87PqX+X8DdfoqBnX7LglBAcQBLggFAy8A0gT+AIEFzgD3AhABwfitBJ700QA9/HAB5fnbCeAA9QHQBiAAcAaH+6UFIfo+Bsz8KARr/+UCDQSh+Zj+4/gZ/CH9+/hm+YQA8/sd/nT5NgRz9ogDov0v/6wJu/nX/T8FSQaR/sAFbgZhBQb7uwrH+
B3+mwsd+eIBEBLd+toGUgJb/yME/PVlDa/w0gFwBgD0BABAAEnxR/vX/Sr8mfomAI/+DgGf+9j+WQAF/ssGg/ogCZr+TwVaAl8DSwS2Cdr9iAVS+7z8Rfuh9Uf+pvqPCBIEngS7/94K7PWmCND5IgLeAO/7agcB/1ACiAUU/LsEw/xY+qEDLvJTA9ztsQQD9iD+2/uzBMMAB/6f+cn8GAZX+lsDiv6rD/wBRBAoBVEH+gMr/9L6gQdQ/ScCT/n0/88GOfvvCGT8lAei+w8EmP8n+1X2z/iM9Uz7KPcHAYP7Rv/8AhP44wQc+mAFsPpOAgMD0QQXBKUKbgGQAxYHFv6O/cL6LQAs9OoDUwLyAd79IwfN+18Gsv/EAIcD5QKuBPD6uQJx/+8AQvckCQr7HgWo+0L/E/t+/Qz6rQFA9aj/lAHo/R8GBu/+CDvz9gLG/eIA0/ujDX/7sgTN/1UE/wPs+O4Nkf2BBf8EoAeu+oMNff0JDi/6jQ2v/Ez7UgPk9SAAffiWAc34UwAv+OMAh/EFA5X2FgGe8LoIZ/jPAZgBEfkwD+7yoxEf+ggGXwPIBzP2+wqw+gkHTfrFBOoCTfSCC2H8S/1a/KIJEftJC6jzWBIa8WcMMgA3+IYJHv1bBJj4dgHO/YX76P8CBvTukAggAE8A//NbBCzvd/03/nf03gPg+9MLo/ylAW4MSwL//9gDvv/YBc37AgaoAGcBAgdJB9wAUgj6/KgA5QHa/VwK+fRnA4ECuvj8Adb3Y/gx+ZX2dgMx9EL/aAKf9pQCJP/P+5gI3f8uBHoGwvz3CiL+8wETBUH4fwSzBmH3yASv+mz+dwLl+F4AwALqAkT+Uwfg+6gEFgFV/+YGIvyUDXj4mQB5ByjxBglW9RP9yv1J9qAHUPqp/moB2fY4AGX7T/k8AfL7aABH/ZsG5AXa/0L/yQFMBNkALQIjBWMDEQIyBPUJYwPq+0UEVgJQ//4MB/s1BHAD4vjHBHDwaQu/82b5yQMy8FIGUPdL+K/4fv9hBFX6KfmBAmX6K/+uBEX9JwUaBoUErwYuBVH+nwnD+goAkPl4AR7+1/+U/zf1jg79/5cLY/j5BSoFrfuNBIT8gwobAWwAoQTr+LgFs/1Q7/sCcPj0AJ72QQC9/Qr4NgOE+NYFt/hRABEDm/xzAH/9DQLyAyT/PwQVAiP/wwUdAtb/DAFnATn/owdP/1UJe/pSDFEGUPt/Ezz4GAVj/BX7nP+P9W0CB/ti8ogAp/OF/yr3e/hJ/5z4egiV/uQEswHyAaEEmPpMCMEDwQKBAlL/Ygc78fwCQwVN9sr+KAXbAMgEyPwOAuQB5fnZDOv3mQrbCSP7yAlg94MHpvVc+LIIr/izAcb+pgKMAY75vfz3/Lf30foJ/bj5k/vBBcvytARRBOcE4AMVAFoJzfnMAVYADwC7ABsDtgRCCB4CgwtoAmn/cAX+/0AAMgWiAdP9JQIF+oEDHPYZAaD6x/VCAWvwUQG88hcAIgFp+8oC5v63BKD9pgKI/a8ErvxHBx4DgwGnBF4EN/w3BQH8ff86ASj8qQdr9JgLz/mwCI74EgOvAtn+cwaw/o4GK/1eAZEJ8/diAAYDjfX4CInqdBIE87YBjwFu9A0H2/Y1/3b3M/nF+SgAH/nQCt787Qa3/68E7wQG/sQA8gSv/dYBsgsQ+ogLtABkBxwDSAcLBXADVP4LBDcA6vPZBQr1NwLV+zn8IwKt9Kz88PzO7QsHa/wz/r0HqPi/Cn35yASb/7MBuv0cDPsCYQMiAD75GwVk830GfflZ/3MI3wFH+MkH0/xpBT37ZQadBgv8DAlO+7gDCPyTBrr3awvc+AMDDP+n+gcF0/fj/Mn7cwFM//787fTeA0/z3wLn9HX/uQSb/dwDcf1QAMsEDAKL/oAJO/vBB9cFuf5D/1EDZAEBBs7+qQof/hgNAwO4/dcDm/zUBw/4Gv+m9nX9wvbl9RT22//D/HwCPfnF/7/7oQJXA6D5ywdRA
UIHMgA+Ayf9FwQBBi39M/6YAxX97ACJ/Zb73QAsAaMF2v/8AnADgwMpAj//SvyNB2UBl/tMBGT8ggVD+4MHQPzC/2gDCv1p+ov9Zv9x85cF/PJt+p4BCP1n/eb8x/ypCiXzgAqT/xX7jAhq+tYFN/tACMAA3QL8BDAK+P6LBuIE6ATBBL8DegTMBOT6WQbx/ED1UQS07z3/cvdE/Ib76fppAfj4jfdMSVNUYgAAAElORk9JTkFNEAAAAEltcGFjdCBNb2RlcmF0bwBJUFJEFgAAAFlvdVR1YmUgQXVkaW8gTGlicmFyeQBJQVJUDgAAAEtldmluIE1hY0xlb2QASUdOUgoAAABDaW5lbWF0aWMAaWQzIHAAAABJRDMDAAAAAABmVElUMgAAABAAAABJbXBhY3QgTW9kZXJhdG9UQUxCAAAAFgAAAFlvdVR1YmUgQXVkaW8gTGlicmFyeVRQRTEAAAAOAAAAS2V2aW4gTWFjTGVvZFRDT04AAAAKAAAAQ2luZW1hdGlj", -} - -BASE64_AUDIO_DUPLICATE = { - "name": "test/test_files/audio_sample.wav", - "data": "data:audio/wav;base64,UklGRuI/AABXQVZFZm10IBAAAAABAAEAQB8AAIA+AAACABAAZGF0Ydw+AACO/w//5P6R/9D/SgDJAGIAegA3ALkAPAC8/zEA4/+G/8X/3//f/+n/jv+d/87/mP+p/7v/jv/C/ygAogB+AOQAHADX/1EAQwCz//T/kv/B/oD/rf8VABUAKAA3ANv/4P/o/8T/5/8o/6P/dgDDADcBUwCu/w3/+f5Z/5L/YQCfAMsAaAGxAXgAg//m/lT+Rf6k/lQA8wAXAR0BtwD1AF4Amf8g/xX/Tf/8/rb/FQDc/6sA6wAJAeIABQEyADn/af7D/b7+Mv8nALwAdAFAAooBswAKAEz/4v66/nb/KAAlAEoAQwBIAM//qf85AGAAeP+z/5f/n/8rAOL/MwBkAMsACwHxANUAjP8B/w7/2/7X/vj+TgDp/0MA5wDRAOMA5v+Q/+n/1/+C/zL/qf/y/yMAhQBEAEAAyf9A/23/JQCZ/5EArgDkAGMAmP/o/9b+Hv9O/8f/mQCdAIwAYwDX/3T/5v7//8r/PQCNAMIAvADq/4//SP8yAMP/1v/t/67/AgBaADwAAQD+/4YAZQDmAHAAgf+S/0D/D/94/7oA1QDaAMoAQgEFAX0A+v+S/i3+lP4o/ycACQBlAMQALAHxAJb/ZQBV/4T/z/8HAMUADgEuASQANwCCAD8A2/9e/wz/O/8u//T/+////ysATABVACABbQAwAMX/tf44/93+vf8IAHEAJAGnATYBoQCn/3j/VP65/vz///83AE8AeQDD//X/b/9RAMz/vwBmANP/dQAaAKT/vP/X/57/xP9B/1H/Bv+nAPgALwF3AY8BFQDe/9f+tv73/qT+hgBPAPcAOgAoAC8Akv/C/3YAaP/3/1//d/+6/6b/TQCAAPMAtgC5AN7/dv/s/fj+Ov/6/+8AfAGQAagB1gBV//3+kf7R/oH+jv/H/3AAdgCYABAAowDK/97/uwAEAJEA3v8SAJ3/b/8vAO3/8f+QAFT/OgCCAEkAKwAFAKL/Qv/S/4//yP/s/2wAPQB3AF4AlAAXAAsAZP+a//b/rv8ZAOb/EgCt//z/sQAlAC0AJwHs/1D/G/68/k3/z/+TAfgAewE7AvwA8v+Y/nn+7P7E/YMAmwDQAIABYwBxAEYAHwBrAIP/Rv9m/9f+GwBH/7j/0wCVAfgBCAHJ/8f/s/7+/rb/BP+v/zMAzgDa/+T/twAfAKD+7f91/+f/sQDq/6H/AACZANAAfgD1/+n/aP6h/9X+uP4CAHkAqAGBAT8BkgHZ/33/Df9j/jD/PP/HAI4AIwChAKsApv+3/yD/kv/+/x8A+/8v/xsASgBbAIcAdADy/4YAaP/w/8v/T/
/U/zkA2P+dADQBdAAqAP3+bP/P//r/i/+M/in/bQAaAEQBhwDsAJcAXf+o/+T+TP/A/1cANgCIAI0AJQHK/53/AwCqAEQBWAD6/8X/dv/L/83/q/9rAFsA/ABPAMf/xf5K/+7+Sf9nAPwAjAGYAA8Ar/+b/5L/kf8m/z8Ad/83AVgA2P/cAJn/VwDG/6P/gP8Z/z7/XP/P/oUA7P9XAK4AKwCNAKn/Iv9YAAUA3P8DACoAPgC8/moAFgA1ANEA9P/r/7IAxP/c/kD/vv9cAEoArAFmAVEAagBJABj/yf+X/z8AGABY/2kA2f85AC4APP+c/+f/yf8T/+r+bgCu/x8AJgKUAbMBTAI6AGv/TP7//X7+vv7sAL//bAEnAoYATgCt/+n/Uv9w/tP+j/6i/0YAUAA8AXgBIQJEAfL/Cf6a/if/iP9bADsBugLiAiMBVv/e/r3+EP7s/Xr/qP9z/4AAQwCk/7MAlwDoAOgA6f+A/+n+D/9E/if/BwHTABIC2gGEADMAUf9P/3D+lv7F/sv/6QBPACQAWwDgANn/2f8I/z7/7P96/lr+vABgAWYBEgJaAT8Asf/N/3n+FP6N/kP/mADsARIB7AC4AIX/kv54/v3/BQDf/0sAKQCqAGEATP8jAMr/7ADtALL/9f6k/pT+vv7t/84AyAG7AQECJwDG/7n+d/2X/uD/6QBKAZ8BOgGbAAwACv/f/goAsP+d/2z/QQFJAML/uP/Z/xABmf8LAE8AEgCM/wn/c/99/04AgQHG/5IBOwFrAGABOAC+/+/+5v6W/j/+qf/mAGX/9AC/AHb/i/8g/6z/n//J/2wAiABZAZABiADBAMP//f8PAE4AEgAvAPH+jv7A/+n/OgDk/4wAKAAVAJUAj/99/tP+Mf4AAMgBGAFZAZUBhwCh/2b/Y/+C/2f/6v8X/3n/+v7A/mkAr/8ZAF8B/wDBAPH/8P/o/9j/TACr/wwAZgC8////3f+4/mz/XgCF/9D/XwA2/6v/pv/3/1YA1QDmAFQAnABDALX/NQDx/zEAewFfALsAVwCH/77/7/5m/9D/Qv/k/4n/7v7S/n79tv/DACEALAHaAacBugDfAJIA7v+x/+X/EP+d/+j/2P8LAMH/Iv8PABcAlP/I//D+VwDS/mT/jwB4APUAwAC5AD0BAP+PAGsAIP8gAaT/sAAqAL8A9AAG//n/SABU/nX/uv/p/37/gP85AMX/aQBMAMn/Mf9vAOb//QBHAPn/hgDi/ykAGv9h/kAAqwCU/wAAZQBgART/i/+F/5D+YP9wABoAUABNAe8AcwCbAK4A8f+oALYAkP89/8f/7f7+/8b+Tf+yAPX/CAEHAaz/ywAbAXv/Kf/R/5EA2f9uAQAANf+5AKkAZf9T/xABLwB0/yoAIgAKACsAGP+B/93/mf+6/+r/bP9s/in/fwB5APAAKgEvAdIBTgBsAFMAMf+3/s/+GAAWAL0AQAEFAH3/cf8aAMj/tP9+/+D+lwDsANP/mP+DALH/pf+MALQAwgDlAAwAbf/5/00A5/99/1AAZv9q/8H/0P6+/vj+4/9hAdb/xwDQAIX/zP7e/uD/I/+T/0QBOQCtAE8B3v6DANb/Dv9T/1YA2P9p/4QAngF0AfcARwBD/9wAGP8u/yv/z/7T//b/yf9vAKIBlAALAHEB3v+8/s7/H/70/LD+FAGGALcBZwIeAbkA2gBB/2H+0P5V/93/ZwC2AVL/uP+o/yj/r/+6/p//hf/K/qYBKwIoAUIA8wD8/zD/ggDC/tr+2v7d/9r/RQE5AgEA7f+TAcn/Xv8AAB0AlP65/hUB5v8nAU4CBwAI/xgAU/5i/oz+6v6u/7sBCgKuAQ0BkAD1/rT/R/8+/mkA0f1n/4cA9gDLAKgB3gBg/1cA6wCX/lT+AQAG/m7/FgGo/xAAeAExALcAbf+//x7/Uf8pANf/QgCbABcB8QCyABD/rQDQ/gH/9f9F/mcAbQC4/14AtQA1AW7/LP+OAGT+9g
DsAEb/BwEbAMoABAHS//z/g/9i//T+qv0AAOv/b/+QAKj/2gDKAScAdQHl/0YAEQDn/+kAzf6xAEgANwAGAGYAOf+D/zUAdP6R/6r/W/8oALz/UQErAKEAGQHv/jQAQf/B/2X/CAA6ALcAjAGAAHD/NwGsAHQAAP++/r//Yv6J/+j+zv9T/0YARgFHARgA7wAdAIT/RwCe/yEAQgAuALT/FwCYARMAV/9pATf/XwD+//f/F//V/yb/fv8FAPf/dQCP/xsAMv/mAOH/lAA5AXT/Vv4/Avb/n/8mAcEAhP9i/+3/4P24/8H/JP+g/iQCZf/wAD4B1P88AJgAXQDY/oj/QQCQANn+UwCd/5gB//9o/w8Apv8n/4X/t//j/4sA1P+oAMf/UQFv/zn/sgAtAFMAogDm/4oAkADBALD+5P4qAWz+bwCI//P/0/5n/1v/R/7R/5gAqQCvAGsBpQDyAAP/JQDr/9H/4P/8AB0A2ACBAGz/xv7U//H/cv/PATD/6/5p/44Aef+c/0gAhQBOALYAif/O/0YB3QD7/4IBggBKANcAhP5CAF79qf9H/4n/yQKd/2sAMQC2/uf/y/79/yAAh/+oAF8B5QCG/5L/b/8YAB7/pgEV/xn/3gD9/sf/TP+M/0oB0AAUACX/Af97AQL/Sv/F/3UAqwDbACMAWQEGAPP/LgGe/3MAcf+7/ZP9X/7t/f7+0v6lAiQBhwI1Az4A0v4//3v/Vv97ABQAKwFw/+8B+f5m/y3/Vv6vALwAHwG6/qb9VP8y/lj+WwBOAWcDfAGiAAsAFf8+/SL/of7l/5UC0gLHATwBYQCU/oT/GP67/sr/SwLI/3D+GAA1/13/uv81/iYBygHA/+L/tf/IAFD/EwHVALEA6wDbAM//fwAdAJr/3P86APf/DQEvAZn/NgBv/sH/Bf4YADL/d/7BAOD+3v95AmABEQAOAIf/5f+0/SUARwKy/zMBrgGz/1QBW/5g/6L/Gf9wAEr+GwEeAP79af9v/9D+4wAI/yEBwwAb/7MAC/8pAEUChwDwACQBnP8oAKH9mf/k/uL/MQFsAN0AQADV/yT/7P27//f+pf9NAPYA/QBcANgBgf7jAaf+7v+V/4v+cwBo/nMApAJtAV0AMf+zACQAAP4tAFT/oQCX/8MBLQEpAboAhv8Z/oj/H/+6/9n/mP8MAcL/PAIeAQQBMgHIAOP8xv5c/lf+dv36ASQCQQE0BJUANAH8/zEABP3t/yP/Tv9NANYA5v4CAEcAuP8EAQMAx/36/BwAwvwfAC8BOgOmAF8CCQGvAJ0A0/1J/Pv9mgCN/8cCHQHNAWMAKwH7/Yv/mv3W/nz8K/4QACIAUgKNAI8B6QE3A4r/JgD8/Ef/Gf2AAVsA2v6lAT4CDQHY/xwALv8s/uP85v/K/OUB1QCMAHoA1AOlAqX/uP+h/cP92v2a/qgA8P+PAZwEvv6QAsr9r/4d/lL+OACL/jEB2AESALH/3gIEACsBnwCbAf7+5/6q/u/+/v0VARcCNAEYApT/1gCu/Z7+CP7U/c7/bQH0/zwCFQH9AKYAh//YAPD+nf+3AO3/aP90AQAAwwJG/6QBz/9N/OT/Gv3a/HH/pv6jAOwBkwEtA37/YgF+/gz+hQBaALAABwME/58AVQGT/kQA5P2s//z+yf+UAIH/hgBKAFX+FALh/3UAK/+O//v8cP4WAkAAkQIyAQsDbwFMAhv/c/2J/Vr+qv2BAWUAJQAyAOL/WwDL/OUBGP50/r8AzwCOAPsDDgIXAX7/WwBt/7j7X/+b/Ab/pf/pACgB5AL4AL3/KwCJACoAwP5v/8n/YABF/rQAn/8iAgYAAQKZAFj+6QCI/q/85P8jAQcB4QDTANoCr/3F/7b8r/wv/8P/kADhAa0CTAKlAGsBvwHk/TP/6/83/sj+Cv+X/9oB5P+GAgEACP+5AEP9uPvy/p//lQF8AfoCjgNP/woCov4F/ff9R/+8/rcA2AAFA9cAKwDIAP39zg
D//q/+l/26/2L+wQAkAX0DAwIGABID0/6r/QL+m/19/z//wP+UBIX+xQHv/qz/1ADT/jMCB/9VAKsAz/43/xYCu/7AAN//lgCY/u7+ov36/NYAtgKeAekArwSP/3j/zP65/hb+Zv+S//P/6v9iArkAhf5xAIz/NgH1AAYA9v7W/zL/GADn/sYDZf8tAXoCnf3+/5b95P6A/xL+rQDnAQQDrgHy/qgB6P0W/5T+ov5z/4ECAQGeAKABawG7/zz/IAE1/Yj/AQEq/vX/NQFh/5gBIQD7ATb8lQCnAHL80//UANcAbAAEAkIA1v9j/wD/M/4iAZv+agF6ACsA0P9dAdUABQAEAZr/CwI4/hb9q/qT/zz+xf8UArUElQCZAO8CA/7K/+z9RP+k/r8CsgE9ANn/HwJr/ff+1P70AUf/Jv0CAaf8+AIa/9AAUgCjALr/IAAP/zICav9t/20AiP9qAWb+2AFT/Rz+vgDiAY/7fgA3Adz+9QDsAJ4C9v/uAUUAeP8gAKb9Hfw3/wT/QwEqAVoBiQGlAO0AwQBk/s7+Uf8P/noBnv8jAwMBB/4aAYv9N//JACn9zwL8/kcB9wJo/5EC6/4w/joBWQDFAAUAVvy6AKz9Xv5K/8D+YAICArH/AgRj/db/GP7//ZQC8P3YBZ8A7/+jALP/t/27/gL9vAAJAKQCAQEC/sQASv9R/vX+OAEA/3wDhP4mAgX9XwJw/6/+YQDW/gADK/4cAST+hP+6/UUDZgBr/z8AfQJC//MA7/8u/xH+P/76ATr8tgKG/tEAWgDOAu//m/9CAYv/5vzGAdcCMf8v/2wASwF//c4Ahvx0AFv9agLmACsAwAFEAjUA//6EAJD/PAAnARcCq/wTABIAA/1C/BsBnP10AlICegPz/wIAPAL4/N3/MQB2/REB5QFV/70A5PxpAwX+8/65ADgC8f4VAEX/xQF1AVn+6AEf/XwBxv5mAH4AE//k/YwC3P6eAG/9iP8XAwz/fgCvAvkBWABKAbP7AQGv+zoCWv9x/ywDa/2FACMB2PzzADUBAABmApn9HgNv/Jn+RAA+/bf/hQPk/jwDjAFE/0oBRPy1Af36b//AAggBeQAyAd7+6wFk/g7+ov8H/1sBZv5+AFoATwE8/m0CJf2VAen/jf87Auz8sP+U/6AA+v+bADQD9v/+/tcCgv1L/pL+Xf+X/WQBdf8FACMBMAGH/wD/qAIG/1H+7P+yARoBrwEW/xACMP8eASL+Ff7W/IX9UQHF/xwDkwNgAbEAuACn/cL+CABXAX/87ACUAesBxf5MAX//aP2ZAcf/6/9G/jkC/vwsAF0AswGK/00D4QBK/RAC+/2L/o398v6lAnsC7v/HAwf/RwGL/C4Be/5c/L4Asv/cAXYBvAA5/h8CY/4oAXH9XAHE/iL/YwAtAZL+2gJrAcT+VQMg/zYC/P04/+38ev9p/jX+mP2JA0ABXgBwAYf/CP8WAA3/3P8xANH/OgKc/Q4EcP7Z/pX/Ff/Q/d4Aov8WAZj/L/2wAQT/jwGD/x0BvgGH/1kANQJO/pv/i/0c/vcA+/6YAfsCJQGWAcT/JP8RAWf6RwAj/4f9YQJA/yYBkwAg/6sDjwDAANAAkfyfBKf9NP5CAeP9lv81AOb/PQI8/6z+DgCk/hgCWf5ZAG4BaADMAEgAP/7/AZb8qv83APT+tANT/6cBAQGT/1wAwwHl/AYAkwI3AL39pv2v/jX9Pf9i/6cBpwWCAw0DAQXDAKsBgP9T/UkCjP6b/hP+mf5A/0z5ifxmAEj7z/hr/mX5of6fBODxZwTiC/n7KgmSBAAKDQhb+3sKrgdg/Y4CiwEp/mz9oPzB+P/88ve/9OX9yvqZ+xH+Nv4GASgATQA0A0gC7QPoAVUEkgMWBK0BlwR/Az4CTwTAAdMARf+kBBr9KgDW/6QCoP/DANH/Yf5yAKb4e/zI+Vb4Dvvm+vz2cAOV/Cj7VQaJ/JQHgAgB+ikO5QUC/GgMxQOWBq8Fsf
y/Clv/ge7vAhn5XfWI9FHxqQOC+GrxRgAOBFj+SgDCC84MkQhUCJEIOxAICGoBIAoeBjD/Iv+v/J39Evho9gL5rPVw/M33svZe+s36Zvqb+az+uPy7/k8AsgCQ/rgD8wNvAQcHagWmCOYEIATIBkEAcQK/AqkEvgGSA3QFLAEWAyL+oQC6+Xb9qP/D+Ir4Gf+/+Qn2lgBt+vD9PQC7/lEFEAR0//kI9QZyBogDwAPPCp8BgPVHAPMDlvIA9FP4Svy/9Ez0I/3r+2j7ePqBAFEEiQJ4BgoIkAyLC04Nqwz/Cw0JoQEqBfgBagAZ+1z9Hf0d+KD6Qvs19nv59vrk+B/6Wfrt/Bz4HP0d/b7/8ALY/jUDKASfA6kE2ADzA3ECNgE4B0gD1ASMBUIBNwLcB7r/kwFgBIL/oP/p/MT5oP7t+ivxu/2m/tf6BvqT/boDvv6i+gAJ0wfZAtMABQd5CjsD3v8YApsJkfqR/bj8KP8I9hbySvkW+v74s/Lx/Mf5UvvN/ywENAU1CVQJagoUEO0Lsgb3ByoI6QRmA/4CAgDT+jL8kfi5+lL3xft1+sb4QfsI+wH80/nM+2/9bf4y/BMErv2j/CwDsgMs/nAHywObAeQGJgLpBncBngMvB0ADRP+PBvgB5gAU/Wf+PgSBAhH6bfsWA074Avas+WH/rfki9o79xQTh/tT8/gS/COMDLQZMCe4JTgRM/s8Cx/4t/hH7yfs6/uv4mfWH9zv1V/Zp88/4kv7f/xoIugWpCX8LUQpHDVULDQnIClAFjwPBAiACKv8r/pX7N/+J/Zn2y/098wf1bPpn+DT6Mvtk/fX+//+i/WX/1ALO/fcBNQTT/5kDrQWKA5MCVgSnBnwFqPvDBMcGYAEa/7EEOAax/4T8hgDbA2z61PnQ+xwBtPeT9rH62v/5+BT5ggIGBR4EpgFgB8wGmwWMAwcGUAIFBXr/4QKs/V38n/ta94X2SPYR9+f1kvtb9Zj95/3QAK4CSQZNCLwLbQdJEugM+wPxDXgElgLKACYCVPxW/Sv6ZP1s+V35+/rz+Ln2lP2E/BL39/4y/AX+V/1WAisBEwHn+9D+QwXkAWz/2wTlB/sB+/7OBp0KowAHAPsFGgkvAJb9EAHlAWL7Y/o9AcoDBP9N+xz77/3D+Hj0bvyu+lv+Sv/bBXcD1ARmBOkF5QUQAzoGwQFEBb7+swDL/OX5APyW9371IvuC8x/5u/pu8cD/4P4t/90HwQVADVsO8AlNEHEIkQQiBG4EFv8fAjEBBQCq/Rb/yf3R+BT94vYz+iz2MPgHACT5F/WGAYUAUv8V++7/WAWK/OT/swK7BaQE2AHcBMQLpgAt/+cDywZzAcz94gckBf79nf07AqoAKf6k/E8BZf1k+6D5+Pcl+0r89/qk/TwE5P4zA/cBowEgB5cBPwYnB2ECJQhRA7b9v/6Z/kb77fho95n6H/bp87X5MPcw+5/7uwKZAlMDgAn9B/0JFQzjBzML8ws7Bi8G7AK1/5EAZP21+Cn+MPwh+vD0y/cYAUP2MfWkAI/+Sf5g94oBfwKg9xAAY/+VBg8Cx/47B2QGBAFB/yoCUAjlBKf92wU6BU7+TgN+/yoEgAAw/hwHDv+U/qf8CfuU+J/5KfnT+oL91vvZ+9gBwAAeA/0DqAMEBhMFDAfPAkkDeQAvCPUA5P4z/rL9+/uD9EL3sfXs9mz2evmD+Zv9+QN+BcYDCAsvCRoICwhVCpkISwKsCHMFSwVLAJoCRAKi+SD4DvmB/cb3mfV0/Kz/Sfzh+G0AE/0M+mb2ov7rAY797f9+AtkKY/4rAt8AoAXqBsv+uQQfBakB5wTPA6EE1gPN/y8Cmv9GAf77hACK+oD8xv3B/BH+uvsw+XT5kPkI/OD+jfxsAU8EVgmKAwYIMweyBmYB3gKx/gQBB/6B+6v/xfgU/gD27fly9S/18feL+GP7cwNNAOgDCwuID8cK7QeWDSELGwc0/gwHfwIEAov4bQGtAgT7Bf
k0+s/9Fvai96b8kv10+UD8AfvZAM37qvp5/s0Fzv0dAJEE2wIIBo//twToA4UBDQJDBtICDQT9BOwDCP8HBNoBeQDl/wT+oAB6/F7///nb/nv4KPyP+Xf93P2N+UwANf/1AUYCYwcCB34HIQZ/BqkCOAH3/mb/U/6l/uj8P/zv+F745PXA72L6Hvzy+lT5GwKoDJMDkgC+C6sKTwbNBUQHUAyNBRcBBgUcBP3/Afyr/OH/3PiK89n9bf3297f4Xf3g/or74fsP+/D/Q/46/T3/UARk/0YB/QPEAJwEGgAvBvkDcADRBMkDvgG4ALcCBAV4AAgHwAL3AIf/TQD+/S751/r/9S7/RPY9/0P8Sfqu/Rj+zgCiABkFpQbuBQIGkAiLAzUItQFbAwwBNABW+9n/6vbo72H1Avr890ryTPsvAmsAp/u9BucHqwrWBEEKrQwxDCsD8whkB64BaQHK/7gBnvgd/FH3ngDf+JH4B/9p/ej5z/vp+637tPv1/PgBuv1m/yn+gAGP/vcAyQBpBaIAZgX4BYEBzQY9AYgE6wBCAfsEqAK1AZoCmP/fAzv9Wf29/Lz69fxD+4z79/pb+rf60fs//Ff9IwLpAm0ClwmZCOEFKQYhCE3/Y//SAQ8DFv7X+937C/7H+q3yy/aV+pP2j/EW/soFhQEKAgAJgwgpC/gFbAeNDGIGIwWnBNIHqwGV/ev97/0//mz6c/12/Qj5tPo8/A77o/iA/Db/1vfZ/rEA9/jx/LAD7P9lANgCLgX9BDr+0AOkADkE4gBTABsJ/QOVBeIETQOUA7P/mv+C//n/YAEoAej97vc9/Xz3BfgL92n4Z/0T+wsAqAIsCOQCSQblCbYECgKOBn4DBwKk/YYATwLv/Xv4Evow/CDzl/Mh9DD/tfUa/RIDGwFTBh0E2wc+CdEIjwnqBNcLKQbLAC4Fqv3jABUAqANX+/z/nPwd+Wf4cvZf/mv5evgJ/kj/IABC/pAAUv58/CcABv4oANf79AFyAxoEFQLKBScHXwR4AYQDjwSuAvACJwOp//IDSAZ7/CADvf7yAp74JPpH/Cf1YfuM9M35lwJp/7f9MQW3Bm4BKv7cA7oHPQPNAU8IVwQQBTP+JwA//yb6Zfob9aD7+/ON9/z3Cfsz/G798gWfBlcEWQkqBs4KZwesBLMIggE+BoMAlwTMAO8C+P4n/PD7Kvue++T31/qn+xQAtPx4/a8B5P2d+6H65/2f/xX5GwObAXr98gP9/7IBYQJfBUABvgI9BNkDsQTb/wwHKwPJAlABqQPZBz7+zAAr/3D6DP4p8qH3ofuj9qn3kv7OAjkA9ABCA9UJkP8wBu4DmQPuB639ZQXpA9kBi/6u+yv+H/UO9c35jPIg+Tj7gfsH/zf/pAfZAWgIkAasCsYDywnXBqYDGwl0AJwH7wBAApD6B/1N/qH5Qfe8/+b4b/yW/T7/3PwB/FT/Ifu8/jv6fQEm+7MC/f7jAfIBYgF8BF370AHNAoj+hQTHANIDlgn0A4kG7wFkB2gBaP4iBQgAcf7C/IT8lvts+2r2efso/cz5JPyO/iQGHP4YAL4E5gEcBlEDjAJmBdAFUwOsAkwAF/y2/EX4cfgX+VD2wPqc+Bf42/5n/4UDFf+GCJsESAJeDXoGQwb6BB0KXggmBdf/DAMU/b/9//pK+0z99Psy/U/5wf6q/xT/3/eO/zb5gP5g/Mv8Zf5y/vsAogFPBGn/cAMQ/McFdv6o/4kEYAXPA4IBlgWSCu8AUAKhB+L+UwS4+yoAdP1A/wX7R/tp+6/+j/Xi+wEAgPY9AJ0AOQFOAhAELABsBxMF0wq8AJQJaAQG/ocAgfhn+UP6gfqt95v8mvTg/WP3vf60/Q/7lASuBGsJewn6BhEM6QfE//gJpwNSAD0AKQIC/SsDMwH5/Xv8jvzt/aT3gvwB+U34AfyX+LD/pQNy+ysDvvuiAOf6Vf/O/nr9YAOdAOMC2waAB3AAUQYa/5AC6//gBPMAmw
JVArAEBQS0A6wAlvzu/dP8cvuu9xv7hfef/Vz40v7B/BQEGgEbBVYGMwnjBOoBigOHAnQC9/l6BUL/Nf4R+9b+U/aI+Gv2Ivvc9gH9tvvj+5wHzQJ9BMAGIQqWAgsK6wTaCckC9QRh/+sAEAHZ/Vn+gQCd/Yj7MwE0+zkBBPYP+yD9Gv96+uX6NwCjAbD46/0hAtj8dwJg/Un8DAQ0BxT+GAh3ANQDMQA7Bl4Gmv4SCNoB7wImAoECigRKBwz7RQFy/av8lPbd9jH58fRi+37+7vsv/EoFU/xTBs0E7QKyBwkHMgOKBtoDeQbr/WkB0P4m92X8y/Sj9p/zffiG+Bf/mPz6BLP/KATOBRsEfgRCBW0IqAfwBlUHigvS/7kCH/9CARv8Wf3Y+jr+Zvq4/MD5/v6t/v/3lgLh/Oz/+/fg/mX5K/0J/SMDVwExAyIEsgLbA6/9jALI/B4DygDIBaMDxgU4BYwDhgSyBjoC5wW5/9H6yPvE/DP6QvRW/T/9L/nR/ukCS/lYAtr6DgHF/9kH9gMKA1YJsgR8BskEhgac+cL9SP0T+lj7yPed9kH7UvYZ++j5BQJMADr/QQPMBJAIrgdbAwAFRhBmAEADgwWjBMn/Rv7xAQz9zvul/931IfzB/uj12gAz/Tr/d/tg/6X/uPuN+cX/cfxd/kUBOf4KA4/+1gGyB6wAFwQoBEr+nwWe/FwHg/4vBvQGegJcBuIGuAAx/8UAFvgd/9j8g/dQ9V382/gU/HT6CgOk/F8HmwOaArEDIQK2BnMChQmrAQQH6f3/A4v6JP1792X3sPqS8oj77/qS/s/8BQCa/GQE8wGfAUsEywqQBSMFegp7B78GYAJ6BGn+PAIeAJ/8WgBD+wH52/+O/DH/jfku/Wz79fy/+vP7yvyf/kECav3tBDr7QAaeAOz/KvwxASsCqP7kA4IEwP6QBV4GXAA+BcYCXgQK/VQGuP7kAsf9Zf43+aT9x/63+F34Rvw9/F/7+gIq/AADXP3MCMX/oQbYAKMGgATyA2wG+gHaAfv8sgN88Wb8q/kD+Z3ywPv/98r9CPymAGkCUQR5CLUCfAwGBXwLfQMsCbgAlARw9+cD8/+2+oj/4QJUBR/5NgEH+bL+4/iD/hb5Cf8BBPf6afntAMP3zAD4AVr87ACAA3MDqPutBiAEvAMWA5IFKfw/CHoBr/5ZAYACOgRVCFsE/QGcAir6AgP182z+E/Sv+pf8wfqK+gwATf+vAA0A1f+cBzr+iwmS/JkG5Ae1BwEFSwKe+WcEkve8+2T3lvMj/Er4cfuv9jIFS/lqAwAAQAgjAwAHW/+rBbkB2Ab/BDIF7wicAZMKhfqCBUT3X/2o+mf7mfreAvX3ZwLO/pj9pPw5+5MAlfiTA8T2EAWL+m8FJ/2bBTf//AAJAikA1/6cAa8D4f/UBzUAnAvBAJ0NFvvqAzwFsv6L/xUEN/WEAMT90fOz+4j2c/4a9ycGaf3zBCH9DAhz/ZwEN//gAeYIXQOIBFgCVwbh/QP/T/T9/4zzGQD78HP8UPvA9pQAoP7y/+kE2QZiBMUJNQL5DAABcAsLAM0D1AWaAl36CgYs92r8oABI9XwDzPc1A338eP8T+I0BMfkRBRT4BgADAO/5zgO/+1H/xwGKAGj/Cwic9mQP9PWeB1kB3fy4Cb3/TgIPABUE0wIuA/IBLgmB+CcMCfcu+aj8x/hw+O77Y/tC/j4CJftQBH76LgVoApUE7ATHAp4HpwnE/yYFdQGj/8b/k/jB9O/1VvdZ92f4J/0VAO78qAfq/QkCEQX5BSQErwchBFkKKgZwCOUAGwFCBRf8IwNR9YMBsPW8/v326/66+wH7gAEz+3H/dfwPAzr9GP17AGcGePvpBpD8bAHH/FoFk/yCAAADovzpA6MB3wMr/KoHxQJ2A0sAvguE/kgLtvltAxb90ft4/wXzuP4z+Zf83ftNAtH3dAhl9g8MKf6RAyQFdv5tAZoBgwQqB10GvvwgCLDwFw
bt7qv5B/iz9WT7Uv49/YkCFgA8BH8M8PwrB08AgwkmAKsHzAKDDxv9ugjR/6f8wQHk9N0B6/ln/OX8v/1j+pAFgfPgCwH4NgDd/qP6VP4q9rP73gHSAWf79Qdc/oILAfvxBEX5swPXApf8r/4CDJ//2QFOCXIASgar/sEFM/w1Ac78+gFb9FwCWPmZ/fL+4vtN+RIAkAB9/iD68AHvCLn5fxAvAnkKNf/8BOf+G/vW+vb+EvK3AEv/9fi4AZD19wCZ8/MCVvvHAdABmAkCAQgKjgVTCSoB4ALXCrf+wwXbAdYA5vrTBZj0Ewfs9S/+kPvA+hb9yfwp+0X9Bv3V/vADePsGDMT1IwrN+1YCUf7L/pr8/ACU+mgAGQUg/dQMUvfWCjMEYgJBAJEIcAH0CQH8Kgey/H36HgF+9X8B//YW/0b6Iv569XkGQ/ZABi7+4QHoCScD1gLPB1gDrwDMAH79pwTg88AFE/WqAdr1+/0o/Wf7ofiv/msBhADxBgz9mQcAAmAEOgGNCTsC9gc1AswLOQFaAM4CpPiP/HH7GvlI/n78/fsuAJL8OQf6+CoCzv1EAsz9j/0W/HX7yP3s+iwBiP2bA24B8ASOAdUBQv4CCeD9qgFuA/EGsALQAh8J9AQ5BZr8IwEt/CQBDPu6/rb3EQDZ+Hj/y/kp/b75cQEM/q39EwMB/hIIiPwYC8L8gAh+AagHN/0TA+j3Jfxe89/1GfvG+TH+KvkTBaT8rQzp+owF4v0hCAEDrwWAB3wIj/8bBdr8mwNwBfP4ngYk+Y4HOPT7BEj5o/4S+vb8u/0P/6f/dPmL/+77HgDy9y4C9v0gAlb4GQ7T9WkIEAbcA0IApPwYBaD5BAHu++kELQLQDYH6txIS/bwEsAES/d35Iv5o+Ab7qP5f958AOfaCCzP4IQph/PADtQCzBGT8tQcKA9UA/go4/vMEzPkrAary/vzu9R3/yPTNAov0l/5bA4H9dgSu/AgOyP+kB3UB/wTCAR4JmvptC4kBiAsf/zj6yAFu8/L/6fKV+oD87QIl+3gEMPrQB2z1SATz+Y78/AV2+VMBC//3/hoCAgULAdUIBv0HCPTwzQd7+F0Ba/9CBLcGTgNmCngBVAajACwBRfXkCAr9L//F+mABg/l6Arv6QvvK/M7/L/tT+0EEevwVDCQErgjtALUIIwCd/y/4jQH59wz56/629y3//fxV/Xz83/o8/R8GZ/rZCMf8lgn5CNkGtgjXArIGzgW//aYBsAHk+dwE6PmC+x4BzPyT/08DzvEQB5n2bgFp/nXzaAKD/PgC+v9Q/XQCrwMb/8sEiPf3BGv8gfx5/R//yPpfDH0GJv6GCL3+hgrt+NEAQgGL/GEI6f9V+L0L3vvN/zEBPvt4Afv1qP2N95H7Nv+SBSECugtSAnEFQAei/OUBgPq2AEIDvPxS++z7X/pw/hP08P9Y+o37kQBD+zgCAP/R/zMNSgX5BUIMugL4BVj+IAFI/40Ep/90/EsAhP/p/Uj9+//m+08BqP8o/wj96Pz3+6f+1vzD+9/9XAO+ABH5SQQ+/g8GIgLq/iUDWv/CAKX+NQA//ToBkgULBAsDBwgEBkD+JwJS/Xb/vgCN+Tj8k//8/tYBy/ej/aADJvW2BWz9MwSaA+sDu/60AvwCOv2dBaX+uwSl+0j/s/vi+R/++vhi+Av+SPiwBbH3wQBDBMj76QjX/QsHwASXBLAIGQLn/ZYGWv76ArYCxwLLAmT+pwOhBPn6B/wGAHbtX/7/92787v7gAMsEdv2RApAGtP7c/moD7/iEB6L5/v7I/VH6hQPYAKQBqAU9A/n/5v24AN0Azv2HApQI6QcBBjcHcvoKAPD7A/sm+d38FPt0APsBBP45ABwGV/rf/ogIGfwkDvv8Uf+sAZX/cwRg/ET5NgEW+VsARPzN9YX/IfiX/iT4tQYz+RgFp/mFB3QCYwiDCMoDvBGZ+1ULiP5M/2L6RwSr9QwIlfZpBXf6XvsDAIr4Q/
6SAMwA0vqACwLw4gUY+m0FI/cqBW8Efv5vADYGoPcGBu380/4+BH32GQ4B+RUB+f5TBIf4dxCB+s0KSAJtAIkEhAID/4QDewA4/qIBt+35CUv1ugIR9lgHDvzyBEz/eQAWA1ACCQTp/i4KOPtJAsv9Bv/k8YAAz/TC/zzyrgCh+g4AcgUw/rP93wnCAfv+fwnV924NcfyYDsr+TQ/MA1UADAAgAHX1oQJj+HX7wP8F9dgL9PdMA436ZgBm+aMFl++/CDr2UgGCBTT+dAViAqoEKfzEAiTzswd49QcGSPczCAEFkQKH/x8EigDABMgE5P6/AnP/jAc88bsIg/cKA038lQI4/PIDlvnPAib//flABtr+GAsq/BAFNgIDAU769QcN7hQJlPqn/kL/k/cy+BX8Pv7U+Wb/Cv+bC6z0ngwN/EwGkAkmBgT8tQ/P+gwJBv0M/z8E5veNEODzDAYl++oCHPt8AkT0v/8O+TADpPht/WUBSPorA178Nv7J/egHg/JJD/70DwiOAZ4F2gAZ+s8C2gFQ/QUBrP6DAB8L7fm/CVH52wuW+fEJKfsBAkr5UQRJ/Br80QVn96AOPvO2AZr79gIJ/O8DKfmNDP//Zwh3AYz7AQJk++wBp/Sa//zxbwXd7wsBrPhMBdr/f/73A5z77Ait/v0EV/8VCuwAvxNB/uQItwOGAdj4gQB0+T72vgm09VQJAvKrCNj8Of3B+fEEBPk1AyT+EwCy/fD2pAd5+nME5PoTCFf71gpS7pcIk/QZB/T6oPtqDCr5kgv+Ah8HhvyYCJz6OQr38BoHTf9qARwGav13/kcGXfvu/XH52PhGBZjyQwyH+RoFUAQWBhH+IQmw9TQJbfuQ+NQCqvPWA+zyLAAo+hn9SwCTBrnvqAlK/h0B7wBY/y8CWQpyCmAFdwXW/7AHtfYAChX2OgAtBbIALfr5BtL9hf8/900F6PvB/B8Fuvg3AIX54wHV+IsDNPw1A938LQV+ACEBqPmDCKj0rgI2BI/6eQRSAV8ItPWYBWj+MwaA++YF4QN4COQAJgP5+uIAO/1A+7f/FPm8A4/9bQru9ykJA/jpBrr7j/+YBAMBmQRR9ggIMfgo/ssAQ/WI+3ABUfXsCp3sOQh4/Kn4Lgaj/1AEfv6YCpMCkAQQ/XcU/PoNCk37cgRvBrP+ZvlV/FQEGvTyAVn8iwGc9xcC6/56Arf7egWr/ef+Bf2r++QASP5wA6j/ZARLA/r84PjdCIPzxAKQ/T79gwDpAdwGH/iMA7IFzPzXBYEIT/9QCHr7wAS+/K/+T//B/PgCX/18AtgHX/kT/F4FV/nIBsQAKPdOCA//2vhJBs3+lQNC+WcFiwH08mH/7fcg8az3zP3e/PcFdAUCCkEFVAOzAAX9XQeuAsf/YApyBy8Dngl198MDe/cQ+Z8A9Piy/Q8AIQWa+UUB7v/fAcf5xghO+OX7HAB8A+j3ff7ZA3wDBwWDAZcHCvFaCWTyVPpCAekDQ/rWBhgAbv/OAV3/owhS+SoHMwMHA6IAaQCC+3gGjPz9Ba0A4/fCBan8uPYfAzv8xQsQAZn8GQpW99UOjvlD+RsDIft3+kv8Q/q4//b2sf6VADj83gVg/VQCMv1mA8n8uwQ3/f0OMQE0AvEKS/1aBDsAUgGFAB4FLf2tAEv1ywUU/Sj7of1XAzT9Zf5L/WP3kvz3/7sCt/hwCov/gP1VA4j5eQDoAMT/fAYC99gL5/sd/hD8TfzUABH/PgcSABkILwCNAbv7ZAdW/nwBbAEaBdgAdQPw+FX/yf1Q+agG7/sKCvAAvQIg/BH/4QG7/rn6BAnY+7kBLAUr9Gf62fmR+nj6FQIi/+ECnfXIB4z20PwxB2L+KAW4BMQEKQY1CMEABAYK+u4LL/+f/9kBpvaQBRn98PYXBBH54wRK/qP1GQd3+ur96AB7AEX3LgcY/a39SAOa//4Aiv1cACH/O/tZA0oCfvokBiP4/Ahw/WT/rgGA/nAF+gYQ/wMAPQ
PQ/0YFMPdZB6b8U/8K/JP6x/7YBhr57gaL/6EDdwRf+z8GEvncCP358AT1+0kBk/wa/n/5pfqF/HD25wPH/db5xwYJ9+3/HwTW/HII5v2pDuz+bQnzBQYC6ASLA7f76gSGA4799v22+ygCv/ex/378nABy/RQDbvo3A2f6dvxNAbv7NwPy+xgEQ/+i/h4AqgCT/rQBHPKsAbD+MPyBBBsClwGaAQUBpwFQBcH5CQZs+o4JwwO4AK4KZ/ujAEz8C/fiAgn9WgGxAZEDpv1cBF4BqP6wAmP9Mgyq+MsHSvjQ/nT6q/rZ+RD8vfzg/Yr2rPqyA071FwfQ/oIGFwMLBf8AcglY+GsMS/+S/wQF+/6Y/YQDOQEpAYYA3/0G/xj4Jwgk7U0OtfpgCyX+gAAjBYwA5/d+/hcAG/eKC6f44wHw/iAGp/MtACoCkAIW+p8BH/X9/ez82QAvAEP9dAsXBIcIKwWDAzr7igg5+bUPTPVCBDoASvpEA3D+xPvvA7AFtfscCan3wg8Q6ygKt/g3/D4A1/cM9tYATP0I+5sGUOzlEOvzsgQj/5D8xvYPDXz7cwYo/gsKqgqL9H8MXfibByH9WwZK9xYNq//nAgX46AXD/Oz2HQYc/Mr+BwVL/yEFAAk59IUO5feeCZ/3CAMGALX6EvpR/xv5l/nTAIj0VQTq+k8AMf69AYL0LArD80kLk/5mAS8JIPzSDmL8awffANIDDwA7BeD6LAjd/04EfgD5/t8C4f4Q/M0CyP1f/UQMXfRAA9X1PwVF9D3+1/2x+7v9Afyo/pn4ZAT2+P8EeP5gArn+qQjP/H3+zAAhD8r13Qo3Al369AiM9wwA+f23BbT2Qwnf+pQJtPe9B6L9FP7g/1AFv/4OBkwBRfnOCHP4Kgsx8NwGVvWPBPT1RwCT/zsFxvk9/VH+2vWI/8b1IAAM+asGOAKbDSz/PA4w92YTP/m0/1YBVgBSCMP4NgViBCYB1wMnAyT4hBHZ718I9/hXAkX9/AGt+aIGWvzk/TD9NPJxBKTu8Pvm+JMBJ/EfDqT0ZxBZ+lQGwAf+ALEGJ/49AiADigRD+bAM1PSREXfu6wtC81L8wvxqAD38Xv88BIwDow9W+wER7PPgBvr8TQA0918HYvakBJX0BwZ+/mf85AMk9FAAp/3//1Hx0AXf8NUI5PQUDxb+FASwCVYDKv1XDLUAdvzkDN/2agwG+SAIJf31Aa39ZwbK9YIPcvNVBvkB9fziCbv8/Qiq+FEDafMKBvnpcQNh9Hz8Yvfw/QwEDP5/+/QE0P4xAGoGsfcYDvn09RCM+SoIJwXKBaj+pAJU/Jn5bP4N+VMB8ffYBqX/OQSuBM0JyPMTEE33mgE+Avf6rwtS9jYG6/5R/J4JxfkD+4IB5/Sq+0r3g/5t9aQDhPTWBvf4af2BBTH85gTq/G4JkgXbA8gJYAOA/P4NKvvzCEf2cA7EAXX+GwUB+4ECf/7V/0j9Zghk8z4FnPYZAMz9rPQ7B8v7PvxrAaH01Qg591/4+QK59wkEhwD6AKMCRwph/ZEFUP2UBZoCgPwJB3j+yP4EBRb9zgAm9b0BygIY++sKu/sXBkMC5wPv/64D4wPKAjf+d/uFAQ/69QB4/wP3Awgf9gIHjQC88PsMK/JGARD3//h6AkH0EwKm/04CtATLAPQERP88BHoE/AIJAz8B+QjCAKAJjABF/yoEa/78Aer9bAWv+XADUgPW+g4HdfxD/5j7f/dv/jf2Kf8x9MD87PqX+X8DdfoqBnX7LglBAcQBLggFAy8A0gT+AIEFzgD3AhABwfitBJ700QA9/HAB5fnbCeAA9QHQBiAAcAaH+6UFIfo+Bsz8KARr/+UCDQSh+Zj+4/gZ/CH9+/hm+YQA8/sd/nT5NgRz9ogDov0v/6wJu/nX/T8FSQaR/sAFbgZhBQb7uwrH+B3+mwsd+eIBEBLd+toGUgJb/yME/PVlDa/w0gFwBgD0BABAAEnxR/vX/Sr8mfomAI/+Dg
Gf+9j+WQAF/ssGg/ogCZr+TwVaAl8DSwS2Cdr9iAVS+7z8Rfuh9Uf+pvqPCBIEngS7/94K7PWmCND5IgLeAO/7agcB/1ACiAUU/LsEw/xY+qEDLvJTA9ztsQQD9iD+2/uzBMMAB/6f+cn8GAZX+lsDiv6rD/wBRBAoBVEH+gMr/9L6gQdQ/ScCT/n0/88GOfvvCGT8lAei+w8EmP8n+1X2z/iM9Uz7KPcHAYP7Rv/8AhP44wQc+mAFsPpOAgMD0QQXBKUKbgGQAxYHFv6O/cL6LQAs9OoDUwLyAd79IwfN+18Gsv/EAIcD5QKuBPD6uQJx/+8AQvckCQr7HgWo+0L/E/t+/Qz6rQFA9aj/lAHo/R8GBu/+CDvz9gLG/eIA0/ujDX/7sgTN/1UE/wPs+O4Nkf2BBf8EoAeu+oMNff0JDi/6jQ2v/Ez7UgPk9SAAffiWAc34UwAv+OMAh/EFA5X2FgGe8LoIZ/jPAZgBEfkwD+7yoxEf+ggGXwPIBzP2+wqw+gkHTfrFBOoCTfSCC2H8S/1a/KIJEftJC6jzWBIa8WcMMgA3+IYJHv1bBJj4dgHO/YX76P8CBvTukAggAE8A//NbBCzvd/03/nf03gPg+9MLo/ylAW4MSwL//9gDvv/YBc37AgaoAGcBAgdJB9wAUgj6/KgA5QHa/VwK+fRnA4ECuvj8Adb3Y/gx+ZX2dgMx9EL/aAKf9pQCJP/P+5gI3f8uBHoGwvz3CiL+8wETBUH4fwSzBmH3yASv+mz+dwLl+F4AwALqAkT+Uwfg+6gEFgFV/+YGIvyUDXj4mQB5ByjxBglW9RP9yv1J9qAHUPqp/moB2fY4AGX7T/k8AfL7aABH/ZsG5AXa/0L/yQFMBNkALQIjBWMDEQIyBPUJYwPq+0UEVgJQ//4MB/s1BHAD4vjHBHDwaQu/82b5yQMy8FIGUPdL+K/4fv9hBFX6KfmBAmX6K/+uBEX9JwUaBoUErwYuBVH+nwnD+goAkPl4AR7+1/+U/zf1jg79/5cLY/j5BSoFrfuNBIT8gwobAWwAoQTr+LgFs/1Q7/sCcPj0AJ72QQC9/Qr4NgOE+NYFt/hRABEDm/xzAH/9DQLyAyT/PwQVAiP/wwUdAtb/DAFnATn/owdP/1UJe/pSDFEGUPt/Ezz4GAVj/BX7nP+P9W0CB/ti8ogAp/OF/yr3e/hJ/5z4egiV/uQEswHyAaEEmPpMCMEDwQKBAlL/Ygc78fwCQwVN9sr+KAXbAMgEyPwOAuQB5fnZDOv3mQrbCSP7yAlg94MHpvVc+LIIr/izAcb+pgKMAY75vfz3/Lf30foJ/bj5k/vBBcvytARRBOcE4AMVAFoJzfnMAVYADwC7ABsDtgRCCB4CgwtoAmn/cAX+/0AAMgWiAdP9JQIF+oEDHPYZAaD6x/VCAWvwUQG88hcAIgFp+8oC5v63BKD9pgKI/a8ErvxHBx4DgwGnBF4EN/w3BQH8ff86ASj8qQdr9JgLz/mwCI74EgOvAtn+cwaw/o4GK/1eAZEJ8/diAAYDjfX4CInqdBIE87YBjwFu9A0H2/Y1/3b3M/nF+SgAH/nQCt787Qa3/68E7wQG/sQA8gSv/dYBsgsQ+ogLtABkBxwDSAcLBXADVP4LBDcA6vPZBQr1NwLV+zn8IwKt9Kz88PzO7QsHa/wz/r0HqPi/Cn35yASb/7MBuv0cDPsCYQMiAD75GwVk830GfflZ/3MI3wFH+MkH0/xpBT37ZQadBgv8DAlO+7gDCPyTBrr3awvc+AMDDP+n+gcF0/fj/Mn7cwFM//787fTeA0/z3wLn9HX/uQSb/dwDcf1QAMsEDAKL/oAJO/vBB9cFuf5D/1EDZAEBBs7+qQof/hgNAwO4/dcDm/zUBw/4Gv+m9nX9wvbl9RT22//D/HwCPfnF/7/7oQJXA6D5ywdRAUIHMgA+Ayf9FwQBBi39M/6YAxX97ACJ/Zb73QAsAaMF2v/8AnADgwMpAj//SvyNB2UBl/
tMBGT8ggVD+4MHQPzC/2gDCv1p+ov9Zv9x85cF/PJt+p4BCP1n/eb8x/ypCiXzgAqT/xX7jAhq+tYFN/tACMAA3QL8BDAK+P6LBuIE6ATBBL8DegTMBOT6WQbx/ED1UQS07z3/cvdE/Ib76fppAfj4jfdMSVNUYgAAAElORk9JTkFNEAAAAEltcGFjdCBNb2RlcmF0bwBJUFJEFgAAAFlvdVR1YmUgQXVkaW8gTGlicmFyeQBJQVJUDgAAAEtldmluIE1hY0xlb2QASUdOUgoAAABDaW5lbWF0aWMAaWQzIHAAAABJRDMDAAAAAABmVElUMgAAABAAAABJbXBhY3QgTW9kZXJhdG9UQUxCAAAAFgAAAFlvdVR1YmUgQXVkaW8gTGlicmFyeVRQRTEAAAAOAAAAS2V2aW4gTWFjTGVvZFRDT04AAAAKAAAAQ2luZW1hdGlj", -} -BASE64_VIDEO = { - "is_file": True, - "name": "test/test_files/video_sample.mp4", - "data": "data:video/mp4;base64,AAAAHGZ0eXBtcDQyAAAAAWlzb21tcDQxbXA0MgAAAAFtZGF0AAAAAAAD8BohEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
l7BKQza7CAgX2Yis6eVqIGH/W/5ZQhyIAygaEoIIBwiZZohEB6o44SHo7aEL0zP6I2Q0QAQOyAmvg4+YP1IbIBVz1jSQMWvnn4bOBwDcTn4olT9QsLpw8/isHLfdMnwi+6/+uRDJFFHU46/DNOj6SAxRIJBqKmglQcdfh4JngJuEvwPBBPoz+Jfn+IWgSSNn22b4JZSpcxxFXvv1KA/X5z7qshSYfD+hej8DVp3zXEwXS0RRZVdgASSnTz6gkLSU6ztWvtFAVA6hO9GSIpb/s94GZjdnF6ufjsus/Usu77v/6DVMzqIPG/M325+W+VzwJb58DAxwOGstBgy0HvXyj5nki/XXXXXXXXXXXXXXXXXXXXXXXXXXS111111111111114AAABEEGaOAr4EjiwlgIWl/YYFzo2lePuE59t7yhXcX2lFiIg96v3ov+/CxffexP/JXwCPp6e/7jih+Q69WO/DGHGQXbtXH/BGaHy86Xdl+HUY/vvy0pBXubyvr1TfCW99pZf/WFC/+2LHcTEXezPt0xN8y+QbOFhD+FW24y9u92MEfalw7vBKqS2bjcpbY1aOT6X9/EFCY3y904sbx4iGu7zGjAcr0pHuOK3lXMEm3Bdq5gyQFZztvD9eOhM5PzOx6iEq/w+iZHl/3wV8Ewnp8C4YLrauFdhSU0n7Akdf6hV+KEO7u7vvxF7wDP2Jj09mBKL998TL1pQlurV3hrxFqFX3TbeHHpwQeHvEyYW8EFZJsCRAAAA10GaVAK+CvzDsttLd82e8NF/3bFkAg+P2u667hKlKvB5f3vH30+HoNn3DaU5nFzJl6ERcNdwDDc2v++fvl3l7lm+Fl7ZRgo3Fb17U9wr5jPhlKE0Nu4rfUd3CcmVa/hN14q3bw4k+4LRveXcf3ywoX/33F4Sgl9++7u7755UbkIfk/uor9KE3fV6H4iOuJhPWKn6oR8TrWtb5cUd73fUXWXPPzwj5q136QT3u932Ur39qJnhLyEV5f296RGrRVwtPLH8l37SO0Ku0X05rhqep+S9P+Hvh74uAAABekGab0pAK+CzxYrF5i9Ge/lIp3evffrDPiyT6ATepuPAT1aZrf/z+/gP7jITus1O7uHodNcOJJPxlBJ+lbn4Jpi2LkqjQ3EFTovrBEVz7mbVb5Pr6a9fhO6PNry8LF/vLIEHf7b996t+/J6bl/9PeWQ9794TfthEUKN3fgI9bS2+yqdHYmJB+78eZJVi+T3S7fqi9NL98K+bEKCF6ZunxhH7ef2+3sqByqWPapttJ8ntNfv09V9P0/br3tCTzKUll0me8uES/l7gnijsS+3G0Dkqg7aW4wgr3co45a6dvbvfJ7p+79fT1RfXk9JL36S8t3Rj3uEX85DX/pXJEbd3e+k+Tp+uqfr6+v2ruoRplZHv1E9svXNVdcK+UTP77Xruhuqvq5H1wj4jGqfk9UmW8Vvpu7OunLxrT0/XXJTSdKEl4kJ1TyUvTyqqkhR181DYShnB88v76hUv6+Yl305ZZROK/Juuye20/1C5fJ/rIuva71dQ4X8lVh/42AAAAbpBmo9KQCvgr8o7h5fIn5YvjTwusyFLXL22gx5iYKP/Mv/uXgj27ASm9NPd+4Qh9H3d9737hG76RcG3VhGeHW/y8n6XieimbssJ0r+eHcVd3u47pry8IHlYVX2LCDu4KO7D/+4ks8vvBH4dsejvXdZP6TVxeqErrXLol33S1ynFfCZf72wiOe/ke+yqZfp9+mzefO9UJ1vC3m1nQcsZCNMlLstuWTBqu3ew0kkQEBDdrzD4bsa3hiU/1WVdC3035fQ3qv2kLVgp3lWyfxwSu/eLF0kNIsJ+CfyPixwfs1pcsdFatCLZbcFUaCz3bveT0t/+nJrDckWnNIelWlp7sThLy5//GXd3qld977q66uXyYGtkh3Q2YTe9NZDbev0oRX7FXu6Hl6auh+Sr2mpcIl+L/CUn9a3xOCcr3e2/uz6ruvrqha6L64S8
EkzE39v3H3e7tvd3f0Ur3qz4hpP/cI+KIZivk/2Zu3v0kd/SnoWunhPyHm/9kLzMboXkob6rrMe99EVeyivhQv5fkHbv1YskfVC919iHpcRhhV8l/UkNZpDiOL1viOXMu9YjiIV5vhDgQNuMreZ9JwVwAAAB8kGaoBXwV+Ydgj3LY2/iy7RkBqVO/5SJTi6ov+/DPhEmMxe4CRcsngjl5Jrj/fzPL9xoO3Qe4+G14v0OlGYCcixrpZe2qtsFctMgaKr+Iysg+A+7XgRLo+cm6/zJS89lpHJejxFJ383k/Xf8vv1lu+Fy+920CUZvHRO9LGD629rp/Xs/Sa/Za5Qt5hA3h+4TWXW8sTd33s5iu6e1fd9E6S2iiT/1dF9iUK7u7+yOEi//Yo16J2++8IkP8okG1oEA3dE2hkMOpNx73fZ1qZEX9PmQC9h/vVUfaV/SW4TJnX7Zsk9psvfeXHj0n5MJv3BZFbvrV3d+9oFN0Tl1E6YlveksgaP9e9u7Oit0yHcdFrmzk+kq3fJ6S/ku9Hk+m8saWkVzosmG/e2ht+vv6ve1JS+oR8hI2k7vvBNvXmhFdP1XX19ZMRj6S6b8irUEJbvlCL9BEoh7/PVaPXmVWJfdLXf0/Xa/Ju7hH2QdpwZvf0CQrn97VX2Pff0IWnydtf1XZ1dlSL0I+C2RmT+b239GI6TvrEluK4o4o+q7T7eqL6+vvtr5JfP4R8ERFTW3iQj0MSv2vf39fk1bffVlWT1/9nd8K532lZP0tX+iSmRhibO6owl3wq/lyfaftKW8uJblontP3/TX3Wn2sMWJ9Edl/9YEiAAAAlpBmsAV8FhfyfFis+hN13Ehlr8WTfrmp3rJcpOW/lLcscMeYmiD+T4INupihWmX6BveltN/vue5siDj3WapUH9+2MuiwktmvPPDzlNEi9+4elNdxnt9W+9nCQfbB6SP2fVie6z68pzykNQqX+3KwiFHsMpBUz47lhb/uY+HWydK2/d9fXqi/sTXKnl5O7y/7oSU77hMv/lYKBmTgFXr3+ne8t4t2X1vy28otL/uV003KunrVf+qPl/RfkNZuEvMS24bp5L95U40g0d03a3vsyB93t4CeVV63rXHytZv+i2ecqMu/Xvdiem1wXdok8Y3tpvwmTOvGcebPj+ZMEOafPDH8Jzim3ot9MC7RKqXvCT9IVP29xPvvZ9l3Pr3VuM2NlftLJ+RZ6YsWmdmGH+1ev3ffVJ70qKGu9iyhHe+5fk/bThPf0/uR3Tfovd5vDLO6bctwEcqzlr/03rk9q/Ffq6yLf8Il/vscQ/Q0Jt7Kxdy+7pcd3OMvu3095BN77ft+iLrNlyZ1p7iDbsjJ8j7bVZLFohX3010ZR/kEU5f7HY7173nZnf7OE8jL737+36+va5utcEZ4b779fXZZN779uqKbFbwkT1XKaVMJS9xDS5dn9tukuWT2k19+r7Llo76S2nvvfWoR8E8nlQbqn+sv29rmV+LJw+j8/RP1fL/J7O0eKskmxnxfQI+Cc12OpPX/4IyPL/U18g19y1/mJzqKSE1L3cK1vErOn43I28/r90TtJUtZJBPFYW1ExyT5/es3xPsTqva/C+SQRENOsX/Wq+y+qhzxB55bnlgsgAAAoBBmu9KQCvgr8w7cJHnZLJcXRk5AV8zfi+a0Fl8NL3BSThTOawq0+8gQXgzuzf015k/Y3pIq9x88TZk3pw0ko0pX02W4u9fLwm+nN19ffZeX8vylt3vJayenRaTdZP68ssIU3PVlY9j2WXeIEcKeYdxOFfYkkN6O728ntp24l4Lj/L3s6dOhOvqi4Vf4LCD846unHfSDtar9g1R+26aUIbdyLn+vpkHpi4bk/vIsxXjI6PoSUvBQ+pyWeCTPGA38HuXO1eq95dDmH6vSBDy/qE/FEpytlU17YUIG3S7Lbu6r9u9wCd6+tDtLBEc620WVApO0aBG8dSygz98/uwny2luPjP/p2bPw9Fy3KrV8tCg4jRd
/8v29KL0CeVcqmMCRYnLrsaf9pibj7ZXwJLRLPzzOl5zpL6mjSV2XanRGCnMajP2VkoS89/+t9ZT5PCPihVjvCS1nPl/e3BTcMpHXvfQ5oT7SSdb3y7cM2Au7ajPf/b+rocObVCO071Wte6LW+lcCH2er9/+KEO/w1JeljL9ie3kcEgm933t4JOX5Qi/yGe31tEghu76uz+/uzCeGEkqs9iir/QmIEk/d/X5KvSesxL3CXkvf6rT3tLRPu3/7+6tpC0vpQi/cOzP+TyMhvoTXGF97rfPf4Ii8uRj2+hr92Wk9XfQmbe+qLd9U17wl4JDG/Ta4QwpcVu7ob8/uHjpc4tdnZd3fZ6LXdX5TcesXWnWXvrhPwvUm/M1v1mnL7oS6iSc/vvehxkjt3c9CeT0kqVUqf4ku73dwjWTDcmOShL+JuhPabmkNP/UlE/X+byYUdcfk9KvCC+6vSppWRIveRm3f6hpd1pW+Hq3VLBXAAAB/kGbABXwV+LHccj9Uy14vuzzzMfXl2Zcwx5iTphN5q5jKX/7BR4wAxC06P/045bqgp1L+9uO8tsvGxd5fpMs8XwDvWKfZ+H3KXtK3CfdXu924npcnfl15T4QO6MKv3CQUu3a5Y35Uzv5JaqisnjTvoq22VPdCdZZN2nCfm8NJMS/+WjPl++2tqcvLYuCTwwkj7WVX2dFLKVjNV1pLqFH20CgQ9zpZlnLixH4JjwRaJeUj2jUW5PF9BOUqd/KENyKsTBFy2YPPtMlxRHj59ta92ZvCj9xUVu93d+0Mjq9y1If13gYk/ZcHp7ve/RWGeAVLB9n1Ge/8n0lnv7TsbBLSrz04ZPd/ssEZ51zOatF1ZZc31uBH7Vcf9OKESb6mCUnrX7iSk9C4BI7uqve8tFaEvBUTbbhN287tbGvsEfh+tfkt8ntOuWKXX1kJHaemhu2edqZf3u9+iyle+1ka9oQR93d3CPKLEN2/N+7LZ2/5Raxresurohsyz717yV7v6L68vv37ptkhJfKCq7xDjvbtE12Rg9tXtVZWUr777svrtpF8I+ySZ35IJ/M+P47ClLa6fqu7onrlar9/XdWbe9tfCPojry6J7f3/ckm16e2t0foVfn1yVRdWL6pM3TRWRfq8LeCO9315dV2t1WtLkwzp3m9ZClW3DWaQkma8l3wWQAAAolBmy9KQCvgsflmGWCOXaRr1NQMbffr3/F5LDq4h1cpcg38vOjaDBf/cEBsdGXMdcwoAc7Ilnj4Q++9Lfwn++Jv3GlMFM8zXwQ/OWZCw4FUv2ZstqkrMtPl9N7oFZtxZu3IqRxFmAwI9LZy5L5U1r1QnpovBIXisjZPd7umoWXlgmCEwse2aJH24fXxNrdMUeCZ8/5FwSw0iN3399b3/LWU7cW4T8KDOHHtfU5jwmtbtCuMM+6ibeXiYrf7s/8aCCdlr2vevBJ4bi8tUvoxbvCl4rO+PnCYslINyaumzlGGDahsEn3H8t9+77KyK3sSr7cFJawwlphsnt+y72KqgSXFrat1Iv71gqhfqgSUpKX2vgqkTBPUpR5awGmtspyhoM4juzwVYcu/ScrrUNCOR6bwggq4mH1k93X8JSjkZl5kL4UL/eWK3tdvFdu/x5r94Bjex9H65l+T203ztwQlwZsHzHQD2Ohcf4c6ZoFr331guq6mmj3w+lyp1+0nUhJK3N3RIn390hPbe/2gQ731CPhIkkeVjvvBFe48XNtpd9fZPf3SPFvusv/7EGXfu8qZQxJ/9ZN6UIeCYzGZ7H5/vWJ1RfT1Z0hMu+1vSv2WCLKx9tISiQT3d+79CPYgRP9a1baPl19P2bV/f3+R1T+2vQj4qT1t1GKX5YTy/e79st79aLbv+qoT7/aVu2+i1b3Eb3vfqTe4R8E5qis32yest4kIm6zOlfJ98l/e/f39DV19m/o9b6y5e7+FFn8jr+3y+M6pgnNDsy5yhh92F+yftq2WvkmLe8n9X4RDBC8ZXkwp5hXN/aPLsrqh
Pquvaf1RMLZkQl3+/Ee/sR79dV1sWf8N+yB4zZV734K4AAACwkGbT0pAK+CvzDsljMWX9dy+hGiHC/75SBChwhepY3Ee4UKdr3t/mQMKEfpn8o7eP2Bfe4s3Z4BxyNTavfLCZ+TUQ5L/LUUp0JjqSd73kS1+Czx3TEOW81XO2/oZev73feOTNO4V8w7hvhL+nsoJCH9/8v274o5eRSH2F/Hf/vf2Ku5i2++jyX3a/Nme/XJwq/bCJhI+3fZDE6/fuOiu/d3IdIv2ke5uGkX39fRchPey0vKUMoN1z2evbSGkNU6bX3d8Jvehhnfzn7C6vZZbu6R3bniGvQJsah+oEx9hd6socL/sF9OsHVVQ2Ce2NC7Lfkpcyfr+4KuUPp6MwRDMpCE2vvDo/QoNKL4U/LmOJtw772YKOBI//BqfcW0qsEuU3BJ9ahCV7F9dkwmX/3BP4erb7vWt8OkPQ1+23XaSXgOGimB6d03gxzf/7povZw9a9raCXvk/tZXEylui3cgnt8Ve9VTbIMPH/FdpCZO6aEkR26JCe7G97hHscR7eO0973vYLqfdbsEL7VCvvdCy6rc2795T7vvCZOMtzdKrPcBxOiXf9r3Z9L01f0mNedeP8wQm/8EO52Xs9reEQQ3vd2JLSWi+i+7IXhBw8XTSngoK+46dfI3bghvvWT7cyr6dlTW3Usvd+0FLvdN93e79Qi/kBWItjNNvN5uT2e4u2yRRXiu9+2lb09t7YTKHZP+737x2773af3XTXa1l6axPe7ZPL6dslcoS0iGe5vd99CYIT3u3Tl3vuu780Ehs93VQ9VUhBL31aiP5LCcqLCPkEKq15NaXLYtZcdUXRP6fcslyTtKiNGKm0q+k+5YT8RHFNb9ahfZ+6+vrsn0qtLiyXnLv2roU2xxwgeZL7vnNu4KXugnd+X3C6TsWy9N9at19dVZCD7v2+9ehlL31fXTT3MW9w35I90+T1zMbGRXwhJLjWCyAAAAJaQZtvSkAr4LPFjM6o8jXfxmL8XSrJF06tuHgvLeiNny+WYX8WbmjPg9MtHfhHknIFnwAk9oqb/q9Du0s3uCkoC628IcUuiXT9uu0fFM7ZC7PCBrCHhdNpwi4HXuVmMhL4fHd74w9913v/3klu8/oTNq5kVeXvfE3e+WHuW93CvgpCVEq85lrJEfnLbcrcIncKkfx+Zv3nm8gqNPn0Ur+QuMNfvre9ViWW766/ZXd96EYqlDXWvx/8KP3FCnH0J7ufoun8IS2rQiw31fMP7su39VT6rrVFWr+2CPmf7s9F6FPMSZe32woQ+O72kSavdvyLw0k87Oz3f1cx9/o7Ne2WlGHpqzSF5fpzGoRKeyQRTG2UdFfXbgq2G/XXyqQ8RZ4we9/vLywRSL5GeuuEfBFrY+yf1u2cgKe7hD5nF2LzEXal54Rf6tagvXeE5tg+t1QYe75PdvXf3BNswenH3vYaGvrBGc7U+urEkmr47f6aehRIMaT80tHiaRep4HyCSay/LR339Eu9wj4JSEf6dPLL6uR2C29vz7i3utdVv6y2Xk/a/wR33YhLLBLfe989WV1Ynv/3RX/XpX/CT9RZOXvvoWXtv3vVX7P7qie9ly/9b/RehF+2CfcsbbTeYbL66+T0qJLLLBNzSkjfXdAhvv/XrukxOvBITD+x71QtdJ2vIwRFl9+hHwTkp05tmYbXjw75JLpf+heR1r4jr09epS3voSRa9ot7wqlzlksvSQtKYg+29+q9V0xOr05Rmyj+PhXXsv26vteI/JDC66vRoteT5NfDd7nmS9/D3xUAAAK4QZuAFfBX4sdsgR7nqa00NtF/9y2DuyvL/y8En4bDHmJKFSAZ917jfDy9feLGgCDhnFr718WTGCxnx2ALgkH69/rdwoULrL57uYNPhrAQ/9ePv521dnYw1zGpa34oaQPNcKKSbwXcn00X5Yw66rY+Sb3b7LUYg6PBLuXJ84tTd0Jj7vs3lpe9e5rvcLP8FoSfdtWk
v8vru4895zZb8ZzsXDsRz3/1WmuxHcrn78lYq3+jFd3ryxu5u0zwl5jVnO/FkptO5EYBPl+aqs7zL/vYLN0Pr0CxNI3SzdHe13qqrwTl52OG5Y2/BXw5fdsDCTo92Zwc/raR6+XCmWFBDhE/o92W4+7u77gri0LGq2+sglo4r9bSZ4uiNpPwXedcqAhyUrWmz8FXOpubbDhGQxzQKZ7Kb8t1YIpHzT/2pP5f5ROC77zhEv/tIRXqCm4cHI+glw4W9ZoQpLwT/d3d+9h9TNxaukJ0/u+obArB9tFD48qxbVhana9HgjOMur9eK6f9PVCRAZJp9AtzJfa3tCzu93fnYXhFbcoLg49Q7z/v0+01+q00d4gr2d75PSS/frBJIv+7GxBpn5L8ntt7fs4fvRf/J7/5d9An3vd7kI+CYzHP5VM/3fvN/RfZX2/bXXk9Nrcl0Jfo1e6+xuqKTflQi993L/s7HZ4QfuCEcp/u9fKkfUm2r9av5Nu0jtld/W7361LQj6I/4qfxXcfXFXZa9W2uX119OE7vd4SvX1aj2NkzL965bu/ydFKYt76/wQ7z5UI+CU038uq9P8vVdCSlEu/8mfHtWX091iTO/lDT7aLu764S8mNL/iO5n5P0uINkzexfJ7a+SKmu/txMDfhY/DNntW871q+3fhbUt7yUq93RN3fZFpNwgyT98L5IrPlqt+oJtp9315NUqKVNLkm7lz0Xd4a93fBdAAACtUGbr0pAK+CzxYzOiJ4++35f98XkfLog/vpcdKxeWa9g/i+TQ7qOSjwY8IG5bILuBUAIp/VZm7tXC4WTl/dvDxQneK6rIPlBvLhTapOfOCd7kLNT/4QMX/gUbCheFoqBDZ577x5ePN3axqlrz8v/uqVVTiYJN3M9KpP1btUnyS3tZSjujcuMKv7BYFHd7VRnnTfJvt7gqPw4YJgbcD3LPKNCuQo7abKndIj+tyzn99lqVO/vy/7eQr37xG933Chf/bMKd78rBVFH352CNe3d+ECyby7vKPaokX6Gy7309JrE172gtxktqPhJZf46vLT/ff/7vahLzEvDsiX2ofIfjpy+725JAhbHvx+Wec0P24ul2/Yt5YVdFiRLQ2Fr5I+4bIVE99xlmvq30kr7S3GbyxDkFpaZy7Ou/D0WLYFL6wQVaCO65kPwnOXOwUfWO0//Bbw3BUXMm0MpIjs8KFH8gIt3lW7yxu20aSSwxJ/oEkXbu93q71n/+CaJg+WrMFXvhhfsN+tXBDlVQFG86hcKTqmLPH3j732XpN9m516tuhJE3yhqc47s6CQl93e4SL9PeCQmfrFvloRH8du96Wk1d2IO7+76PMQ6enzWZ17UvZ49E+X/L7ftQkt4sEJHvqaz0dv1rq+s136rSov/pAo7ve/Qj7EFf/ii7tuPzrt1bovdlfbXurd6tWu/r7p/SvCPh7NLMxbyd1lE36zQvtEBVl+W2sve7t2dgm7vit5fb3d9trl30T7/taL7XvXr1ZK9CPgnJXXF6zdN4kIiSa1L6/xC9cnveQn992f30vowl7unyYmN5X+fhHz1+O91t+Zdyy3iycVBmKO8cvQ1FK73k9eu79QSbv70SFHfE7sy3P5/0Jav0/SXWvdVRP11yQQ73eGFieidJk+TDa/8REHNfZPoRCsFUAAAAtpBm89KQCvgr8w7HHJtJf1+l7hje2XB33bD82N/LnPYZL/7YRJlizbRwBImSW/rb+8vvaThQoJ38fWU5d83tf15hFxamvxpsEFsdsuX5j9u5VTokXQfXxCslty6zvd/UWeifk3J+mXl4iaHOZtvr6BRd7u60y+EOGWh23fnlCvmHZ3j9de4JiT7vso12t3Z3yHr37vcwYnj3CRZaTE+be4nbmXbfj/68Ft38/pfL10WS4ZaLmQo/xQoS9E6g+e5Qt8Fe7e9G7xBXn7+4Li5I3lHzjT628vUSkvd+t9/RZzx0w1Lz5PtvyyrSlokKbYUEBl+u3LC7HREO95M
5Jgslbft8vur465bpGFA1mB9H04W6DrvBUJKoEQ1mvpv9u+XQuGCFAvGy3vdxlmvvcmZM4a7c/c4Xf7gnhtLSw7oRJHrRoej4J9jrdO09UEqThuGq3Jmz/ddlwl4JO5V6/BTPtvBO+thCVWHrJix00Pct9Y7VaVtngye25X6QK91D4PcY8eB8YaX3/pB05cJHH2P6cE+9zJd2Cw6o2jv9F7obBNgQ/vu3+79VC4kQlIdZ5pBnDdrJoWxWma73vk4R8Nm3fL4r/cIbvd3dLVNbFeyRZZ17yg6X7py1mAvViLvvf8JZhx2gndPZ5BDEPCV97PBALyJVy+fWky3d7vwSb3c232iMQivlKYr5WMv+uEpfvp3V6Ozuu/txF33f3q3etd9WJ9CfRIIt79CL/Ma9+oJCuyFfu16PBGW92NJLaxdOvbSt1fada+717eSmTe9KLT5Pbr3bXuyFXv1aEfFS7+Vq/BER738rBJd93SYIyu/br8YumtvZpSO95Pb/+rfvRy3eTy7TkIIglyb3vlCPhcy5PKgovZYSQmhd5eT1X+gQne/K7qqE1bVVRSXvzOFPcnL10/Cvr6+kSq/FeyoRHS4/u8MZclfrWoI+7zQyT+v/Vf30X/SJ9WLDgsz4EeAIRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vAAAApBBm+AV8FflHbut/y6uQc+Ll8JP42VpkGPDhJYCEe14fo//GdI3lYDjUlzY0aygoAhHq7fTVvU+sv725CzQv8PmzAUxTD0G6MeVkER9Xt4LKSs9r8vve+tyTHz89b3aNwTfr/6CW8PLs3JneFvCgQh7M+XQrAx1Wx87QpdizvhxV7jS7l6jcTveSuxAj+nx/S3kh/dWPcYXywjE77bnRcg4
t9yTBu/4giaWmSWPrfJ+3Sn6KcEK+KGT7beEfCuHhy9tsEu4PO32motHv16fJCfhF/Dv1tevKRehPcKCHcbNvvfu8doc2ji69m+7HUc5qa7xfiVaj21eUpSv17QIyhBx+M1RcZRWrxt93k9psTfxF9jlhLO18ValYIGuE/m3L/6gohmWl/SfIKlh6XCT+Ze9sKXP3t4dYOsBKNJwdxtMRXezJ9Ll+HqdFzr3rfQfyY8bT2l8g+/h8tb6+vwYaJz3OHZx5fGV/e29etdv2rI7/0Jf8EV7/N9Pr3NoZn2EfZqunX26tMW/3sS79lkaUfqW4rlXlXu/WV/girW50RR/deQxP16PHa9671WdLnmJ/XXItQh5M3/gmNI3Y25/Lf2Ccr5dz5qvRXN5eCPe+WT7vHvf6BDe7t2Pq2m117tasaQEl7v29qlr8FF373eEfBPXTk9tt/QJyPd3d3b8mnfuCUt3u//o9YvcEd737/Ge3q5afeCTe/b+16EfBIRVm7vEhPzcv3/1yUXk/p39Ffb0mbu9a5L3+2Xd70spLl+EvBERa2urKV33fVFgkMzcw7H3ezlL3/FZf7v0gT73d31C1L12117shHQ+T20+8i2u5t374WSkLq7X31L5NWZeKmEnzr4ZX5CPfWuSyRUifD3xUAAAAq1BmgAV8Ffix3CPz7zX3Q4l8Xy2Qs1VFktI/i9XP0DxW/BJw89mFC/mJI6Gbty/+2GMNQQ1IAZjT1qf7jfgyLDrUl/17QeLYShm4OfDsWlLMJRr54NPD2P1l3ljzZLEj9vhqR0oQ+xNFLa3el7ve9flst+5SvG4wthV/YLAkf1uYRbqaR4Pur7yx5bfdW8/uLBtmvojEmH6SsqBDzv2KWnfvrrIV5c6PUdvff5KUz3l/yvLd1cJeYRLGaV+2Ccglx3dy0f7L9O+r9Zi47Ed5HhO9+m5C/+WCEvLjddvk1qEvManG8r28YR3d3u2727yMtfBbBI+f1Xx5dLBesoNJXaPFk9V99Yy7GxXG0/3jTdbeo/Q2jBc/50emEPElbXQSM7uOpvIe9J7TQlFvguLcu7vmePs7z/yViy/5fCPgpNKx4Oqt9e4dhlWXbiJli/jo/D6LqDST+3Pf3NyP/tSvBDE3885uX3JLSV3T5Pbc/xFkrfuUpV3otH7T/txVPc1Z/7dke/tkp1yeTCS+XfZV3fd/fWhL6XMxnd8w2U7eZXb1vrIIGK3jf1ssEu0/PJ6bfy9r0lYhDohJ2ZV/jrco9N6btfKxvZafdu7lk70eLoi6+qMV766dGwqq6u7OQmX3CJf/f8El5Ye6OvN3VfkX5T7v1RH/WpPXu9fQi/1f0QhH3tPOxJd0TxW7say33fi77HcaErLVAiuX/Qj4J7ZoUy78vwRmurvXsbd/UJ5++9+77+j+sERE7zA1+9pWqjol70T1y8Qt73CPkx7H+iPfs73+WuTXJ701FGyg0NypPWNz/qyl4CD1Pj/pbVrhbbF93e77G+5O/vWvUJay13f0VEu5f9wU3ve/d4PZoYuUpdJLRPpGvafZ0Er3d3vyMnisNeII586pVpkuN9+TBXAAAAC30GaIBXwWeERnNmabRB/zEu4Q5tpYb2Z3tyv/xPnpHflFY5f9S4Y8WZngBi95TfZGBL6M/cFJQnejxvj9u7q/7RSfCBugIrGQ8GkRRzBd+MQW59gjO9TOWduz9ef0WInzeife+JisOD8/cOOlhXyDOG/fG5bRK95w8kt3lqjSakKNKXggbu01Z6vDjHL3vYi/FOHV8E0C/uMLKWj7QcN7l/3lLSf1e52S+wywk9tJt/fd+tU+T1yVeta5bvy+tfCvhQVqicx52myeqngQf1/sNX9nXlr2wp3tMb5OK+h68Q2f228u2U2/V2Ji72d4f6nu6Kgj3fd7M1fVku/q1f1X+ki3E3fdwEX/Wfz71Zowd/tf/BHM39Cb7woIcvcw3r3Sf7vpsnw0KZWqzwU8gVOWNcP/P8YJiHq
xlG+gRHnBlzJ0dgm3eYuNhqXZVF79yZ/PLYuKIZKGH5CejmXnAtrZI7lBP+5MfV+bapSoEvO/nGj/P5PXCb23CUgOrIL15l+tz5Q4+0Iwkktw1WPtLmFPJ/7p9932WLK07U4/vT0IWT2/1wpvemU49d5g1tPrT1sRGafl7/7FoEQm7v28+RXhHxJCbQ8m+rGT93vd3d3e9V9l9EyWL3bgiLe79KCLMy+xCHmNWb+3u+tZ1f8WL3RPvqxF77v296giM77dl9tXdb1zbvCK/ZpGU+vzFy99jZS5n9U+771y7ddXbrXdG7vvLe+T7p230Wsnu2790vFYR8E5i+5/L/vL60W3WCO79eqv6ffp8dBHyAQdu9k9tpPpSXe+jLt+1BJvftvnrqEfISbrPwRkdyfarPlVb5Ppet+7BYJ3d7Pe/dGfZFvaqEvBCQnaW1LvIJutWL98nQgkOQ67299JbyFtZQsT1Uvny0tyWL0q3XsvreI9MRC2aCKkF235nkgr2l2nSffDW/3+445b6W+71qIve/J7JZXvDK/8RZhHJc/c8VGP92w5o4K4AAAAvBBmk9KQCvgr8w7MMu/Lpka80u4X8xLute4K/LIIEN7rtKOh0jP3R5l/fw8V8xU8L3e5cqRev6abOxxndu5XLt7BK8bZ3bZXSosT1Vf5iu5o/ylGmWt0wqvcFgSXI9333ub28eWEln+/yrykXgioFcd7e79P1mu96q2vKqJ/b7077+hBbfd3hR/gsFB9aDlQnS4XnacbveD2xMVu/Db1+iFe39oJb0+MNte9E+n/L3l4JO0YXr8ucNAbUHC6817/rUJ5Y0QK7P0DvJRR7vbucput3d1r8E3cYry4EMpkPpezguYbc+xB4RZ7D7ipA7/BHu96y92S+T3a/LIQwb37PCF8XMHfOrNsicrFNJFQJePZcjQmed/z0X/5IS8Emte28uHbu5+HNp8FcXeGneMmUEdfh377sRRyt/x+cHWv73cEhSr+svu66J3qLK8hIiqQs/y/X8EUqvKe7J6TR1aTYkw0seilgzVifrBMJe8gazdtJmogI93weWta9IEm50dtCHm8N+l/9d7uzXerP7oxee63cJ93u/qvdn95DE6FeT0m6+mUkUyS76s2l639Ai3N7c6WEF7ZTPfW+7u79KlkuxNKdArlIVrfeje826EkXq6Lu8Iv8EZJ/dO+mbdzxT239wTiw2hyX93i6Gt3u+66EsT3edA+qHoVu/d9OW7u8nr/vTU6oFl3vd3e7shF/QolOTpzN/QIpfy5OJffVS8udngnLu+726UEd38u8vljyfxZj/8N6XU3LNPTkO7+kxPLTd7L5G9ioLbG7u73lCPmNGse/lBLvbvd95snSTEmp9331Qn2SiV1iDpJZCL94rgrI93emUNPb9dDSFlh1bMOtXve94S8NEErNu/w377ouq+vr6ol4sECfy+PWviSufBdmIz7lmhgDKbRSBQIwp4IeT2rJEb3d/ZfZfao6dSeviOl1lFPf8EJX0ntTzeFslEjrX/k1ksTdYY8RjS9n8v+Tkpuv7ujfL+P+p08kloxNngsgAAAlZBmmAV8Ffix3DFp7iRLv3L3IV+L2cqBXt/l5dJD5eXO1hfzEyCQzM/BBwR7K5P2jfIIXSMM9dkbSydfLUG5eX2nuw6UlOEHJNHmvc/2NuXqhxyAS/+4RMjv4JtnrRoaXUH3s7BEVbKdsvu+frzeiydXk+/EiWsRzS45/+Upz0CH8d1/Cr9wShKUUrduwmNbuCHf37fuCUTmLF5z9YUVnT8Vd0ngpwVErelfLuTfcEeeFUEK+54eT6//6LedsKv7FDHLGVl87AIX2Sl/3cVl35bqttld++9ZSevZPSy/1g6S8uEy//QIssH633Y824mluXe3e3Su4dsj4EuKnPZBR4/2dyYR3Us37hnq7b/9e/RfX0/Y2SG+R6TPOr8RyP8NyT+CHqRMcu3YtOGOj9LFCby8YId93fP6FcV9Fjd
PBMWYGwQ+W917XP4usERQT9rPxYGf8tLaQW9snh6VqtH2PDvd+svo8mRehfBaQgVnZnO359pNpBSBN6y/Xwguaj4fD9LYOzoaRBbS1JBG/8vprJwk/sVBRxZh7N73DmP9Ye55CEfkLxnlhRjHdECfvOw7XO/f9fcsK2rmQ3gBxgi1A/ZENtagSwJfdJ7o8Up669jPYG4OGY48vb04KymTf3aucc22m0LYJDDKXT+XqHTvKKPu/KlIF9axzH6csbVxFyv7u7+xaBZH/Pkx2Qq/wm6FvovMwVkwQ9nXJp0lCG2KCHmAqW1zsMTl5RlWWuHyAkLc7twEn9b90Tx1q45l1YIrpcOq+ta9zmWNe/0kdFoS/6p9ciRiu6b3y6I8IebL52L28FZHd5bct3v7f4TO7u736wVXvfe78ChS+T6sXd9XY+ESHv3d3ezk9VfV61qkymDSS7p/l077GkJtv0VZPV97wnP++9fUIL8RFGeARJ+ndy/L5PthK97kf3vglO85Jy9+Rt6tE4++PrBHavSr4ve771vqnUvur4SKWHnlyeln7qKlMpvBIvPDX5PrEa2gSEsaun2vgk6ltFtE+tvrBYV4IW/Vxuv8YzVsoq1hDxpi/qRn5/SWQXysSCL/EEe65H799+okS+/NesZjrt7cP3SPmlveqfCN8d/ex9yi9V0Ft8gKya3hN3/6kvfWqRefMn10t2EDbJ8w+FO7pZJo7VpfCRXv181yeuq9m3ve+OK73lhu7/QQnth9kaOXP0oR6BOOu3XfvKxwlzLvd13duT2kxdUvTZ20yyoX7S85XPlxeq9kd/mlPLT+W79btBLsld5j+X/koJ6YfRvcv7aEosIGy/bJohwhUcDBvtJVH7vmmft93LlchHCPqij8eSsurZ5vJnzhPcvf3fk/qYqY09lI3+xI0Y7lZG+swiWafSRd9eT9y4Q7y+0z6Ie59vJ+mKEUEUAkd4DQ8Hkk9YPcBJv5X71GHaDKoG75/h003H82+mvGsoZN9+vXCuxInqszxXT2VBe9w++1Wu6n/73d36sLniOq1IV99iYJzXhNylzK2qdWr0XcyP00Cv2jsJXeXHtPtNdynCbhbFURMaURxdM7ZSc8eqJ/cgl6e3KxV73pWiwnKit3worzQzu/CmbE/9cL+iJERa4gslYzCNXo8t1ciIiDxMfD/xcIRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8Qpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vAAABVBBm6AV8FflHZg+nL/vhAvNg2QvDjRKR3+YmEX9v8u7hmUxfqaGFlcI99YX8EhNIJPAqly/+4Q87zGan2ajZB8pUzejQG9xpQRrZR3fTe7vkHz59o8JF5pgMluTcv73hEmdfIPXQXQvB7FJPptzzutX9LLa1/+U5iJZlO0gp4odzNhjZTzWDXX2NywLR0NG8CRnPqfwp2WyjtlLofjjzDhEzn/lgv2Q60cwvOJYfvjX0Y/z93nYIZl5o1zLrvatU5f4krvdJ31p/tkK7v+Xl5kQTL/7gsGMnvaLkWMi13cb9TPuJvUv1xNIgrVTil8v3dbKc3fsSwRQEzKz/b//xVvVn9Hao7k/SuvBTD3ty/P4Bduuv3vcL57bnI4S2350D3tboHvhLxWtOTb37gqM3Rl63ljfxWaLxAYtZO2hrjN0mdidOuFI8+9Ducf1jSzAuHb/LuRtG6fEfteaQm5Sn+5JO4D5zOH+1fGl4R685UGMvXu+7fSNN/5Przt7CFvuUavIHMJcA19UrYk90Tlhv+Qm77L6woZtsZGfDDihPZZYetJc01eJteuc3Czx+zv250RjabH+R3Z9y5w2OCQB2Xb6PGZbM+C9gEtHwGyhfEQu2RobbOf9gp6jA0QIr63HICcZKH4PSpT+1k1zvJy/5ehKeEfQpmX/ywS92jqeVRWT6Vr8PWdwyzK4bSch/GPA+i4WTeevfWaF/rInCkjyTzNrFafVbGtFTwq9G+G5/6u+3pjB0rPT1VO36v+CCuDHaDrnDNhuzWh591j6JrPZKU/lKEvC2bz5SXwxu5Vyr91xlf6azwRTzvSq1KjMFuE5x7IRL3cpV0pbaluFDelv993u8q8pWB1zp6p3lIvbWThHwTbZ/m8rOLdVhe+7u9n1desEk2/x2LYLZdvKXfi7ro7BEU+RbdYJdO+PFz8usEnc6eXWCK7v7rIIhyj+mzqwQid2d+1V/ynveshMEZnd22hHwRkP76/GCR6v5+raG5oP3fysJdJ3st7vrSplu+T0uqXMZ39ZCny/Xk9qxP9EKDS5pcNQa36ylJUpTVFQRu93d7nTfk+umugV93z9/OmXgOoR7FECnP6Zm/cZe6T6dmq/LBO73rorCgnPnH2VTuXLPYcJ0w0rkQe3vD7Ubejpl+bSVpex1lBd6DMSf3JLThG+yz+zv17yc0dEi4lh1wrfLvlGbXcXJKz4J/rHLumsZfdOZDu9/0NU4q9x24eW8I9CZl47SMz5zxudvZ1MITLzMXdROETu72n72OT1utRbDO31n2vrw8WE/hM8T38+XpHQDv3dZffVN5do5PSr/dFJ/UOX3Mc/pjCy/v4sQc6YlZJtfRAk89F0p4493eQ+PR++lhPdrWOve86Fqm73W+T3cv6BNnSXeaM9CPh4lr49WrJGtGidjw2uBfVHf4JzN6Vzb3N7VBA73lxJ3e78tC0U0/bHYrfeVJPy5+gkSbbvtdwRFyZRWtH5ff8JaBvB5N/J/VSNngl6WS3W1AaXiPaTUsImd98I0h+1h8hw137LuQlrJI2W5JYSL6NXjxCnXOEuEf41Tr1+O3vY7xX9MIFOwG5P30UFo+fp5/7iBuq6ZLvdfcwjPevJ9S92+qpO8eIlwrG42cmPg/jFvwykG1N2eg70JKEO/v7GgouYPJeVYdcV
TpKu3KSgnLD3fvY8Pus+FMkgWbJ68kWNTbdIRNfezr9ogouxcQXNsq/0L95CF794jeuyQU2z+YHPuqTyYWyQiRoZ7yfjPf4S3fFc0ul3sTmvJ7T0I47tczBFfc7Zf+pIYr6+sJE3ZKn6/ESF5cBZAAAE8UGbz0pAK+CzwiKw9m/Ihu7Jm+/ykW5iVe5eTXl/3y733wv5iTzMg/jeYeDj2Jt/T9h7E2/s8Mr5dy+a60xYPNSr+HIlrl/bdRhW8I7x59MzTr4muN3eWMIKPQ+vw2koVTyjppdgpSkdL+/hi8w49sITxGYu5P9x9z/7mK+neqW8r3au/xnhI/NbK3u9uHRysK+YZeOm2X93xnMcKTd63JlX9AJcRvtPy/v2FdXThxbdgacw92Sf/uItv0udxEey1139/vl/vfVl1Cnm8ZvS/vthQllO91frZOTu+W2qHvvsZ7uwyiX3vd0WQ8VFaYllhAp57bnJc/9v5J78n6XjdhK6Xy00JZsj/SlK+X14mKkeu+UF3u0Mru5IP3LduO4CLzqaox962VnIC3OM3P5P7dpy/y3NrWE3+NEKWVlb7e4I82VIptWG7PXh23/3TTQKpZNHHpPEUTk0UlVNF12FG8yBbNfyLHl9+CI8OL0U//6/9+vpr8KbHBXFpI5HI9b6q/M3Bls+/g7t6gtMYIq/ln34VLKgOGXKg0U3qYzVR/v9fIYS94S9CHb+wT1q71F7lH+3eWHso5CN5mYZEjtSJPkBH9m7rwX1U0L/7BXEB7MVbp3zs1xdga/2dXzof7QXsAl2l0ML6/9iYJOWd8v37gnlWJxtECNur+kulaAvx/w+Wl3TO9vu9+p1f9r15fV9QjMJPLjYZL3IHb5FpPdM+hqxhoaUzfIoua2jz13B+4n0JjxOG5Pd3vdOl6mJd3CHmys0vRiH/v5PcFcDykzilTJ2uv3fLoX9uvWLXuixW6osF0DV9Xfe9M9qvbvpldk+T7/8ERN3eEPBFl/WX7XiAVzpyaZS7OM2WMvv3uFDisVu8ao1G90i52fpQ22HdWuGaD8fSv81U6zJeMIiH4KRY8besJ223AjuQXi+NVs8EJrT0veJNDqL1aPd0lFvCp89+XqX3/8NGdy7rCS7G/3+Pu/d3vfp6vPLffuS9/sFhXKw9ux93SIL8s/3H+TGcbr6BSIb4HrSLCs6W3oxC/wTd3cbG/v3lOePS1T9SVOe+r5zr90VOrycxBoSXfZ+X9p819+myyB0/9lITu9qJJUI+EiZGU4+md/GEddOT3N+a79PftCxKNjfLr+zsIz/58ufC05P3t+nvOtXeE7re7f4SveQvfvye3+bizbgjV7+LRj/szZP0ksalElczvAJ3p5v7WxK+kyb33Y8rKPf8r3d5YfH2j+dYN5z+P0wj4ITS+6cPQITO/48rEnLh+5bCrcstvzJi20uT21Wqly/SklYyf99y0977PKTmy1F9nyVyfSYbl9BG5E5yl0mruz6giJ5aIx3+J97Bve1J/brbjCW77ub8At/Y7+/y+T221a0h0sPDiTF8vdLD6euSoS8IEn3G8m7w7TNVTvspwnve9e2OK9KynpP3Lb0opuxfF9+6FSl6+8puf21japAqnSFGkm5liuMCM1mn02L71XlLbh2UX2D1fA/PqDnu+RJcXX0b18a3+loRBLzb3udIVWi+T7+URJKUs3Q+7CF9+HHqPemqLFUqXNfiJbzz62V89OzIxt3k+qGVbreZ5LzhTWmS5ZU4WL9N/rZMJkKjt3u77xB577fvEdTpqt/JBFyynBtImtr1C/qkXyWmn/ET55LvL/+Tz/iIJYwy+fvnTdPvefA/8ZAAAAE3UGb70pAK+CvzDuHEfxZZLOcs5/v3MQN5b78vksL+CQlouMemy/+4R3dmQcaIL8T42vYQJe8iHZf9ywRF1zpp3J/TrnYQI6+t43AeRNi3KXqnw/nLkQdyBMyu4dv12RuOYv6/Elxs/vVHq8W3mu1V5kF62e9
3P2i/r3dnHFj+CPpjuhRQp5hk7VRRK79wpnSTxTjTG9uyNlvEOqOa85cuG8iPx0QzSpQZfu3oEu9cXcq33g7UnJS17jJA/3d5Qy0HOV0SGjv6acydNgvWpr77Lyff/0X4zaC+FC//YJyR3ZDizXn7M/j3dsGX3d8Tq6ZR5oaYd9aWCoqKhO+51s5KcChsrEZ63D9/v681yJ5LPBNYeaZZPsJvYu2ZPTacy1ef4UW2WMEH7u93Hdu75g3r/wpMP3vnfnPvc8dFZGPE+fvDpZh8+uN7Ydn+upIlkiH8ty6tfc2e3EFaPym2FYmHSVumqPHx9+fJF/DMGT1l14KicfEgcNaad359N7hTphBs/DezL1qtzdiDtfhH582l/90Fbt5B4ArdfrM9P18/XXRRN2zRCNYKDCH36m7N5uC0gPa/resn23bpuPyiE6/c0+Y2Eij9WZYK6j2xW5NgTVgTJoHfYUanYC8XWKO5X3yj7fdGzeos46GlNIbkF+9p2XSHn2Ji5c7MqDq3wS3f3MOTmn8FmNBZsKDSbe93KqY0kVOMNoJz+8Z7zkndMduNSGQ0zn+WJEtPTy/J6151giJe7i+TJOvuEPN5f8STI9SslNT7/BHtv1+C7huS/1JmHf22CIs0qcG7zwQ3Iy7F37wR+HV0KMd6xd7vvbVWCc0OPyFkQ39tawQlWFNv7h2VEK5fcIeTeXy/7eCMjxW9b6xwm7u7vKxW+/dT37FhH3gv77wVEcOOG7y5xlki4v3MIui7cYUxpfMl988d73d+a7v27RmCWleHmeNHBurM2rzwV5ZXhvZtVMWfkF9BDu77N7vSQvYK+Q+25U7u+UCu3E0xNOuR7/oOXvb7t1CPYKjOWRIorSn+2m3dtdthKX8+F3fuYr712mEqKcfl39nvRvt6y3IcXrX7sxX30eakBdUU/9tZ/aQ/n5w6+z984sleIVhOkvu+T7dz8n6GXKoe7uRS7vuw3CPQdjJjb9OZNFmTf5YSzRLyz+/eqUdHlorv1Glt33cvyCofSWJ/4rk2191jWEJD/MNK0UG6u77dTfwXzlCPua2FxT62n/qCAQ7vmc0Qv2Kn4g3F0oOG8DP/YRve9ITlKhKfvTtYot3d75f3096WL7vh12MI+KJjc7qQRHn8v7+jXPwgd7u93RO7cnt2/kKDAWRz9p2jxSs/Z/3IRw/dv15JO590PYSzPvd/VlfZd/cX03kRT3d7gl1JklvzxT0WrqLNw3FGkUuI/////wj47CEjcvxkSF5fyYR8mCVtNf4UItrapfB8TeT6cgEmOyRnJzC1xOSrfX2YuX+T1fqJLl/P1T++hPyRRqwg40w4e1KOQs1vr4Wh6LT8VvO05f6J9+Ld9DDi/9/gj3cvIkKeTLuifWv+tzutevpImf+sIlq+733vXkgp2/dx4T+XpxvS3up9hb2abtddYmt5TOc0rfvCIt0ezfKOL7pEODWWS8Zxwxf+IyQ+XZfNJ/J67bioqJ7qfP4qpEgsgAABUNBmgAV8FnihWYEIdyxG6T9xJNfv1Nd/F85o0pcgl9I58sL+MJy5JOYtDSBNsx4MUzUbH/GljL6tFl/mPYXTyOcmz3qcVPbnr/3ClEURvb8Em0nbhCsak13qti9tsWI33joW34k136vzRzuU4z1/4qnvqz+Er5Z8E7adcvyX4nx4Z02z/qMUhTw4aEH+mYBrekvrfJQhyl3kuEA9Or3G4azaCb+qsb4EX8Xf4fXuWcPUQeSU0s23VFwEM0t6rQ8xfe4spi9faaC9kt4tdyzJl933CkfU2d03+CT6OfKVJVw+lw07CdtEK2CdkLRv9ypvtsKejMmZfaqr61zDlp69C9QKq79IwUP+wykEq7hQoRcbhSOxjLG2lebTPzR3uct8bI5abHsMKHbSRbjeEH9W8O+cmN0QSf6FcVuDJZi69a0zlQrTeffk9JdfG0GQTjcjnCRarYRUkXEjvvjRafnSagxr/y++9BS4QMOXW/vwtwy
wOjvDq6Lfrghgt8sEs7c8xfb0j2kyywVz+/3l2xwwvaNwV344oim+N9xkk817mjtr3LG8Gb7Rwb7CfgsCWn/XXT50BjvCbbquOlYt72J98ZQv3uHxsve4kuifPy/9ZPpOXs827aPWa8yBh6LqizeNepp6CHKoAgPToy834YrBv0Lb3+bhVTPe1wm/seIt3e+AJNy4dn+/xnswEe+bJwxsrgjgeNO25fr3ChQWdYw/ekUeKV0menniA7K+rfcXuQX4gvHB6wqimaOfcIbIi3znyvrJhfTWzn/STxtGOmmz6BSTZjx/Wmx6ho+YksLPtk+nI9cKUVh0F8dsb+5QKWrNuDaWo+51JgvKJJ1ze+T9aCBhzHlhL1dl/3bBPDSThGjUI9t24OjwvhL3y0Y2Uk0uv6y0PDlmvJ9qo0r4U03FndwoBfyz/fDV/15p/d5w5LfK7J31amuhyqMbPoZWD+Z+EyuMa3yhONiPsWwRGoOp23y+pZOCg6Nzg14Ve8OYOnBHHBJeke7yaTbye293ikKwDH6Lp2kknhmXLUnC5jlIZRLOki5OOKXV28PLcfSVtBMS09N6e+nRNb3qEl3ghI5+/Wkt1w7E9e4Ijp98NP7JOu9/XOda5d4s7yxz8sWL7oIincK8/zhve6dMEJQ8kk+x2eFCu7923303bpvargr3d3e72+ZCHo3fgrJzMPve/t+WhPda8q5PrvfBLmf88v/q9erh6/S99G8vCL+wpTzoCa327fdsvffWLyqT/3eT1//3IXd6rP/N3fQmE5Q0+8/6WrF9X/R1pJU99uTe4R8TL/H6bv1BES532/soQE4Vcb+4rd7vsWw9z++yyelE3v/WK7vGP+/erG/dU6k9PJ1wiR3e+E+HvfEDf6lfL0cRfTRahE8xR7e979NKVN6UgqeKdIqDbhHxBoyY/P/wRGp0/HqJOK3FbitxWK5PT1y613Lr5N1ne01Vix3f5Pt8XXNLvpSokEvb9sqK/x2pyw3vc/e7XMxZnR+CSlEdVlQR3JtjrLb3d4h/kZSfuEfPFV+Nd/hc0nXhzsKgpnNV2SJvliy9vdMV9IcXdGYL6ol+5d6QbGvvUOtX/68sgh6t7EyybdvVKmGml/UvZrJ9vi1dAnM5RWEWHZRbksh/a3dOqo1xMJeYwLhhv9+78uoOe9akbu7wtWJLpviHH5oIsrny21JEXa/T6JVP5fki+X7u9uSmOxvHjoz/favrBPxuZ350ghgn1WSTeklrJ6dv+pBdajsq+bw4PWkUivCypcvmcL+qRLERE8PuP+v9778l135LjnR8FcAAAWuQZogFfBZ4sUkGs4a7h79ykJf1+Xz8M+Ykh8Ee3MQWvi8PsOhJfxfGxpLE6NFu4qcI9woUfZW/8J32b+fCCi8fhk90Rx+N56bZWiDwmGgj/dd0vdLfJ9lSYXW/3bn6TfLJX1u+T95MTLHXfNmvLlfn9ezP/CvizckNR6Z+FMfnHIXmOgkv43u1CbXq35KHEdlAE2+dNKz2Q/wxkMt+I9/7du1+7Jhwjb23G79y/flEKYY2vMj77iNK7BDt/ktTr7/m4bijF4SaY7/GbbUZi6tsMw6G4O48MjIMr/2BW/uIKNifQ8OTnrXr8dIDjM3uayLuaeZpSh+X9fCUoi7TjAZ/KbrXBhixSjh+9vCtU/+FpDuj8j64338vyeoRy8d9zInp3feliSny06vZBPzCuNyX7hQk7jGUyUhmnzuyPy0yMIux6VsnNEjv7i+8umxt2IiOpvAW1dd/csj7EvezgI/47yvvNlPktsZQ1xbUrf3VFYeKGmetLejR3p4y7m2np952UfdYS7n/WMvHZlUCF6OtP+6c8I325H+EPHonkeWNJEy+KvDbJb0O46j2LifLgyJ93Rf18FF4R+Hz6GbB29wpmgWCl23eHccE8U4s0LXB/AJwm9zptJpsKdDHxULt97eYk4SOHoZb8Fla3go1hcvzghPLGigyOImYgt2W7lx3dzxEZbQ
uf6TXhX3/G/mkG3akD950G1fw1LBkifqxHrjXf7BbukjSIvoqE3VUWXx91Qg8WHINM4y6U6TvY9StkjMO9IbZe95E3OdTOlGqLRUut1ITdH8EpLHw61JQZAcgnuui+6yxtoZUHtdfNdpVPr6EEP1LbdYd7x1dHk+lTp8Evcg6ndrprs+19b0uEn+CexPXZbvMmlWisb2jSmfluMQnUp3AjflnP+ObaP/6D9Eym4EHp+ogvj1kz465t1dVc0drB9dy8wzlJyZ2/+mCQzNdeexkx8n0ll3mLKIw63fL9v4JTP3gqlFW/BcV9u75GdXWX7kwyt95f31FZRKX5gc+mwiQddH3hqhkCEycgWqukJODJ64T/onfZb7hHw/3Ddbs8FMmPx7oP45i+UgJ7sY6Zf8g2luHamAOKEsvO972tQ4Kef+T1fc/E93MPsH7/IUj+gr7VkZl1/R4LZB/fNHnNLmgh0RX/aVf8FPdy61ZoRRRXV9320dYfokPL+u97u/oTv2+GPv120r9F+0C7e93f/5xfXnh8I5ILBG6u3Xd671O4qjuq+vrWONfXVVtayXfCK7wU2g337u9G+5pYZk/29sFRN2nd7710XtL9LRpRMvva3iL3u969p937WT6T/E95NCt73L+l8Vml3n+xPqyZf9o13wj4JvLw26PK8lrlwWEuHMJdk93em2T6zW85ARieSQtH4R7ujvvPuxsTdy/vvr1qpY6vWTdjfX6izPhL4sPmlyLJIpVvKXAj17ffvQ0W8V3p2be99b3uEfC8iEkw8k70jceCftNC749100ZnQM+zaxITbBQIf3vdzL5O+ET3Hct3c/5c37QKBeDrg5cL+L00uQhCTR9ZLvOtqiUIz+T2m/lJyf15JPWT2aPVZIQLaZ1y32976d1lmEPA1eK+/sXPvvL+kz5Iwk/UICCxDFaG7n5/vd7yiZx9+43VVxW9ekUrxW/J7GiTHWu6Ebd9/qjFLzdze/TGm4c4djYqEemf7iDXzv55jPWv39DYQ8fWwg5unMNd3utXtwfCPuTt3uO0PLxZhzK/qvBZe4r4cZ1P7zpCmSE+SFaTL4gydZtHqe/ExImHWSX6mu9hWnlm4rdWJ7p2gvvDctLu/Z7X/Sm0nLleSCr3fPy8dbPpxuiQSWUYldD4V8QIVfSX4qbXJlqaXJ6qVmiOLtj3vuNTVO26veT5Pr/JRzMy5ove85iVg/deSQl7hj1Tr74YxX5KnqyRBSbmZIco8FYLIAAAReQZpAFfBZ4sVy4/cN+HBHHI5L/iaWX/2wjtkOZqlkwumzu9vX4IdZiyXW7Q/e9oOCSjC2TtPhuL+Se7Ynvh7d5cfxq5v/LVe+9SxRUi/3b+CPuNtqd/fPbhXzGwwvdmfKYl/d7G4SeQNTp23LJ2IJEiXNrstqBfx4cvXosOwlPocDnZ6aB8x+sF9IOjRFNH/4U24NsFbsAYR+vP7dlTBv1VB+H2eOoqV9x1tGjYX3d91tJ7jANLAu4M+9bChQl9tby4Xsq/d3e+V5Z0zFqlLCHI3ODzSvgm1PX9Hu9/LGZ37B+N9uUOtL9CY+NxnDlO0zj8OLqPJ/XTliru8yKCP4GWPtBEpeNrLbsyqN3/l5nkaE3+KCFqzgK3Jvu/xnu7HYiYXfvOFzjr/XthMrFfcJ3gNakhUfcJ0V+dl77aF3z7hN0v4Qvve9jvf4SKY3ecR76o0/u/cFG1HRL0eAgEvv5//El9wgQcE962BK/itl+/E2vCsj+/aCfdPGxAbv97u4TXtggEZjju7vgpOnCPk+PX+O9L730Cu0BDumDZwwxlfZ/udmDRkPXRYU4yL73DMW/1o/qSkqN0JU/ZP38byHKKLBPXv8EFBi/ynIQvbBc6AZ8NqNH36OwyJ3d/6ab9xd33rqiXMIve0j1CksFjOpL/768wjLj7M7GSBZWJV/e9INVuMN6BHi4eh2IpcGeG5J0lTXpXwS5oxDmLTRTNknDO4675Mot6UI+CMRPvZ+
C2GK/+d3d7iT6r9wWZVwyvlyE5d4I9L3/mMLolwVkeXbAO1xRCsVffCueVp65S+HuYTzL1TngiERurudYKy8JeUXq2j3W7xzpzFPQ72P2eCIke3/94IsI/NO02/VAgpTFbcEvDqgQ8zvMj5RJYdsc9JLZvum9mHM85m/pUVKqp0xG97v7Qy5aPz+97u9wkufBWR3d73Xbr8E0yCHYs9yy0T/s/Lc5vvvCZyRe93f4skOIs/xvf6xb9S035PbS9UumySNDNSw7zMEJK1p/P8PwSHe9qf4Iu7tCK9wViuVcV73vrfuFBLivef7p7v7sW1z1Qn1S12Pq/f2quN139krl2kvtesI+CEj7b79Ne0/Iiw23eTs/v6EPqu/v7T035C3vuveEfRH/Rn/KLe/RPilVfX15PSXrNMbI3xVe6SrkhHzHL30/y3e9asokWmZY53d+T0ndzyLL6/EKnfXrZFt+SipBtIskK9TrTd9fvKxJ+rk5P4oQ+qwKKwIr2pUi71iIR8hZf/CZNARGEieTy/r4jeXn5ffdldgmLuYNn+5JD7T4QRBeb06Xus5mPzd+T+mnkcUasYEO5vy1T+kJhLw8w8/F9Af/U8F8EH1Z+T17rJU4oW8pabf2Xd9fIGJYbu70q9U7y/kKiydiZe71o5UCW73czOLbosIXhC00/z/d+oLrps9sm6QenC662Z71ejnTVYj0oK8/e73d3pVVavQx2TDHqkSzRE5E/5ryCYRgqgAAAQiQZpgFfBX5h2Ohbi+CAubOX0a3X+LC+LJlxLDjqL2Wex/Uvq/Fy9swrxL9eUfh8srC3hwJY5k5EvDuT/hTzSNF39rZhECT+Xb395mEysYsG/LET94Z7fR+7y/veCefLwH3Fr/JK/i6KGRBYK8AEFCcjjy2UhachZ80T93xNzyp4yv3vhDY7p73DNwu8v/yllybQqyBTzGlo0Y4vx19jekErFAYu+u6geGSzf4C1VSNRae6VEve09rboz+zxvxQOea3kVmz7YU4gnu7f34b85EGdyVlpxjvn/KUMLW20PptJWJLuhc6ZkwTvm3LCEiL7QZKaN5eUx8Sw91uPpJH/75a3Cf3u+nEXPry7Zw36wS42nlX1b3H0rqvFTxlGXhxZn+a+YQ1+JLy/u4U8EAxzBetxVzkTQKeYpOgAjq9rLz01jml631uVjLtXtyxADI1/VIhtrvlLs2KQILsOIrAQKv+0M6WX5S7fg+6HRfdyvx0Cv4IDgjv14RP+vfn6bVckmdL+MJ+JTRre+IJ8g6fy7/FZacv/lKYPFPkOduJiIJbvngEnrXOmuyxneY5ZNmvA303ScS2uDfSk/Xz6BMQbE/w+3zQylp5C/SqT9dT2zR8v63rR3tZJ5cJrbLGCnsVu7/u4lzcBe3U/Txef0N8OGL00XDCTsjv/vpfd4aa190zKqfwQeETgjPD0meA75fnc0hxjepUka//L7fuOPKGkkUGldpDBX+6cvrt4KomEyBvW1w4g4OlpCUPjhOeNp8kng6v17gpGMXYBSKva86lWcYefd+qCmJsLv09h+CMbXLpkXR8MO4uBLubq9/lqnwSz3wh4q7DSjASlf/XzAt+WtNxIvJX/d69FCe73mjCPgiGZP+3+CLTuUffX4bh+Lpat1au7/0dhAkO1P/z9XBPvjAl/kHiLadCX7F/cEJnvy70V+/vJl2Nx/IwUGzPhJw9T52WOrQl6evCWUlV+ZL34gnMu5f/fRK1k/UR/9TTh93elfEQm5D/3vr7OSEfBPl/y+v2I5unRXoSy916kvosVUdBK78q/oq7LXL9X6yS+7hHwSiJn67t7db5P6+z6X0xN77G/dXrX6XqspASXf6EfBN4/TXd2/sEJnvrf2JG3d7T9P5JCO/1RTgy+i91Vif4JBBbhjFV6L3R4I+dVJd2LRDu+m/kk/CRfv3IKe7fwVne70nduf2/EC7ve/v9Zf3dOqE/J6r8EgjEQjlCj+/wXb3fS1+OLl17o73
dO2hOmv+I9jf5sFneof+JtmEgHV5HPhSseSTMfneE/Jh61mlyIt97+UERsueqnKJu79Sd3usr8T9apom9wu6zMtV6pc3LnqW3m26yLdCNBETNupfucp3RLQom58fvdLhK5JvYrd4Y8EV29Ok/Ncblf+6Ov5Llb2usFcAAAP4QZqAFfBYX/3MKcKvtrymy916lSF/BJ47ImMv/uCDKHy0eLR9QM192ludUjeK8+9xZwlFxt7NOc+X99QtjpGHm54YeBWT2nb8cix+X033GzalTwn8EVvSmnBfoyWCL6Z2SM9fl/23Fkf1T+JENPXuGTsJxifi9MuHr/y96tD+je8+SkT07yd54YV8Ehrm4n12kxr3GY/OVcSEG93+vIFy5uplP7SetBNdyPczaG4us/audFW7gty5btmKaKn8ZCvjfoOjyA0KxStp7gvGJnR7lQXlTlaqsJlOJTXcBH7kb6/tS8F2fZKJHeTc6eoKcjn3l79mNH6dVvl5aZf3dy3DjbAyJr6/hPzEk29/Y0k0Ebpk4CrtcB4++HfYlUWK3XyKS9n/G7IlYNlTwclitbw9mnszG70a60W5h8/2755RNfeavthCWXMCyHGHd2Ocf+CUpSZS+ErTHbs86bvOwSbmQO06PqyXDG5BZbQ7g6nm0/r3X9YKKsr7T3L5W1uWCHjeRPxfhtJH0YMYHr42iwwluVsXaodgOZDRxcQCAztub29F8cmpCTc3nzTRU4fu9vj03hP5761omWV6uf/VdBC78vu7u4S8E4p28nb6y/9Nh4mTe74BTy6/f14btx/bBTnFIbuND8p3/Tvv1Sd8n1WXTgt4aQxnMDi6Am17f/Z+IEucP5rRFT+4LiFDpd0t+9asTZXMg9HglIZdoOlzjL/+q0gwTjJN37XWWnHu+CLLWLxeSYe+4S8EYq763+Cre5VwfLXBPda3WWCDjLpyOmnnh+EPj9f+PVV3eNYUJwQ4EWB/ieQdDcOa3+4gkuIibl2BY/VEw47sN4TFnpMe3cwSk99/cEQqau50e5/fVS6E6V0/IUj31Zu4520t02Mly7iv7qIF8N0xyCGkHqDovULRhdHuFC6ZVd7uX73d2hPxIjKxk9b1yXv2Lbu+YqqPR+6tkKP76tddfXtPwQ+Xt3QIZLiOqcqf/RSK8IeCLy/X4szj1Ho/P7/CAm772z69vX/Y0vVdifiPX1qw1yckI9gmNbt5ffXzi6b5t+j+/t+/r6L7E0JerF+0l6nrrUI9AmmZdCbn8MpEZTvk7+9ddfVq/fXhe9+SwQb+1eNr9dZfERZU5Cvff5ru8I+CUlbT8KtO9PzwUb3e99Vvt6FdriPd7dJ/J6P7ErvNxpO7T6BMXd7u6KdrjMJeMl2/JInTe+RIlOo0E5D5e+utfElOIcT91k+T230K9fk9ubIUfpvJHEKF7kH7e6stdrt5tQsT6q8Y6qi9NCf+jluvJ6pNEfITuWFURmTt6/ThfJKZ7urE6L/kiOXyeie1hf0JSJZLkjJCi+aT5C54wWQAAAO7QZqgFfBX4YHaQ24zU68c7/iyh3NHScVkJN33FkJhrYRs2f825Ub2/F70uHM58FEt3meWamSGPCBOO9UB4Qyzu9qS8f6yP3YN/spf7d9buFL34bk/c9bzRYN37inTOX7/CPd7joGzlwEjfzd3oBO/wmV4bQwWu/Lpalgj054koV8xqTQJPD+6l/d7G5Nci/J9hYVvsJ/NYEvjkqi5KeDx53l58iHp/3irvZBAbRVbMOITiUY8WZkn7GUruP7d8swn7R2H4Z66SkKzZ7j2VRTR2WMhPpj73BnTGA1tfZ2e4dgryfbaYnbqdPcQVhqeK55Xrgp0ZkDishMD+EvBM+ZzpcjfjNEvPuVi9VPX4rn9z51W0W7js719iSmQbZ/lISf5bDljCfgsCEidiepSn/ytTxef/fhTv86Bv96xqu3q4cT3Op1dGniT30fvoExaFuY0QWttMtk+kres3HZnk9JLL3Cf
mvK2Vq9BbCenSBN+x+uHuV/J+lqyWbjomXOCfgnp1O53tLCz2/sFRkM1ciB33PABAG7L9duOXrL721jeHqBz+6WyD6RIz7b3qL/6Mg/tHOf09pKe/VhSH2GPYqZBjYYTDj854PwFd5NaOmELZVP7vysaoTyIeJIPhFpnuNNcvwl+sXiXc2lha70m41hUiInv/uQfuiv5b31QJhbnve9FfY2KpNNOt/yGcwkCB922LSfbeLVtDDHXd6zZ0BDJ44ed9910+1vCnQPzBhCLi/pigfk3etQNpugHcBD6a/fhDfC67f6TG0noO5tHTr2JP56RDeC8xNIixHT4oYdEP/L68S1r+Ei/+XXYKxR8qkTnlPLpaEjt7ZfODTjuG96QuH394JBHD8nDvcx93VtOeT9NrizZcG8WPD8lnlEI779rXKpt7hJ/iScsdN/qzy+8EZ3v7on8KXd0D2dm3vd3s179H5PTS/wREx2T7vZOMzW/BEWndupCi8Zywh5QkTZf5YJRL3vfhJXrV7rui/zeXhHyEbdN5fJ+NN3TJ1djdbXXCfnEu+imet/ijU6dN95MvYn2/eYXe8v/qbd+/omrFoJGd9424a9NbXZIS8E8z+MmGxwq3PVlgpK7l9t3ufuc9t/fa1r3++vJ00uxO795hF2I/7m7veqlPefhGsmEu529fv0wXG9Ny973fOCUj3d78N9VaXJqxPtcn3+vt+gTFuNiwLB9g/z9t1cxb5f0oVf5T1ky8Rl17T3vv7+8WV39tVkkIfy+GHt/TXZPjv6phvqob8l1/s45KngsgAAAA2RBmsAV8Fj9xYqTC4rXsV/KTHemvdUw/L3TDHgkJDfg6uvf2lNe4R0RYZghgiROqW+ZvyvrhmV3vjoLrqXco+PtDvqlXiTeXl/sraFXcPO5cGkfcXj4UOjxe7QvgJffPtd6Abk/S71/KVJ5b0WJ0y+FkqlJ9eT5OM54V8xtQT7xf47ILk1IzcY3OUk5bfw78b9ITPctKDA/uV76FuVk2/GcePYIcNNQ9w7cSVO0iHi6l5UO/oLV3mrj8eR77nBHa48e77rt7nDa57pptpzCz9pueCgoTct3fxyuxTdFjKbYrkXQz5u/VpUuRC977lFq+zUZ2Lrw2XIH3XHKfhPxgyGdxQXv8DsV6yvDO/fJQSYO72Q7YqI7WGXVNcfjQXttjJ5EuDsvbbAx5XJuhWl+a9k/MITm+s+Yvi9xnHGuUFp9p8i5CPjIeHRWCkpgzTpjcw4eilv3AblbJ+u7qP3t3viUb0u74f3ddJbvkn17SdSTl1dPqhvCXkpXlGit26SKmw4y1/n33X8MuS5PSSTWsEd70/uO54Ht8blhvCb9oeOs3d/d/cwW6t2hn4I4VWzXwm3yfr5Xgpp4YS/OPTfYZvi4aScu/ZPvc/lIJx099Wronet+r/gqJtIw1OOGof1+kT4fiLmLlByhxQb7CfguLI1bn37f2C26IS8x/HRTfS7rcb8/NGhzReGcGdo6Rnj9LpIdDq7ol+18cSsa7OTI/xf092/DbfMHnkcWjsFIl5QvMN3n2GVDg6rVN4YMGscQ+UNvVi89P1+Upc77Hy0N7ye6pZFhO7/DM+d74QMUFvfdzq3OLsKoEkKqioEYlbF8lS8TEvNN98vCJf3iSSCr3vIrRXMn0l6uJ5gvbh9J/9nZe4/7eCO99dfuCPe6f1d5kfV3YlUqc8l3wh5Mv7/KIdv7sTe+8l37TNk6u65L+0tqvCHkx2n9DiWdKx1h/3xukfl/Lyx297vit+mTquxJbr60JaqOuqBFm3rohVfaSogIbvfoR7ZJn3dWtV4IS7v3T6yVV/J6tG7svukftvk7S5IR9CL0/PV7pNCXuxfxFe/X/k6Uul6FMp99Jr6etXhfslU+uvrqxfr3r9bPhHwfId5Mnkwst/J/TJiOUkz+T0/7JFnuk7v7+vVKSbd9eSJ3e+4YvgwgAAADikGa4BXwWeLFLKSktSP+5iZyxxpF/2st
prDJf/cJEWiZyiw6SkeKS/vso0sxIpczbA5XqLTCG9fWcsp6J7KvDIb/KLfqNf/jZyNEM430LY33bgCdq7fDr3BWuJl7i7+HR3FynZ5jDbK3Z1b7PhvB5zkv+JTl7vXpFvf8ISwzyl1thmZwS9xVyyvlhtqPqK8KeCQ1I+pq2ZMv924zeEy4+QIV/8bUtqwZf3CrnAtZUEdiBGbWfksZrZDd/5Pi3Y65PbUpSroZfFoe+WGy8eLineELgy+vn19jYI//WoUsec3cInnoRu/8Ymv3D96hmdd/9N24MChu937FxovOXi//L4yy+SFMzSnLa/c9Td/btHM5S6U6w5bIgIW19uqy+SlebneQev3E3e5f0L2jcVmTv/8pZXnSsBQTf4LAk7WHakdcqlv75faq8E2/vUHc7ftXX5j5+XSbapVXeCu+KijYkg6kq6J7Hlbs6FXveXwoX7rbGmPhGXU4Dy9o6fC66x8ATLcV/rO+OkGU/NssKSPUbsPV2h3JFL7vK6H+e6kvLXldv/uMvXKItMCR/6/f+T1X3wQTLQTPHbwh/lmY3KHQRPWpRc2uf24I7gy5qiVX/8Ep0pu8F3SaYiN7hMgaSTWlf3kKvW39H9e0tQp5rKnaOmXu5i7U+7OqGkyCw6rR2Z6fHSjLvc36rssZOjtjRS+T00v7BLbed6BJYTIkn+0lpBMbLe98n73iSZfLwk/arwgEjL5k64WKESO0f+TqHqRFd7lxb/o79i3Xi+de76sSgQyy9jJ7TX9mM8iEdTGvq4S8t1qvZY+gQ31T3kRepa+97FEvd79H9vk9q/Gfs8xXd32kvQl2LFbm6eN76sFolK9N9q3mo9e7fvJvfRF5SQls/yzVfbm3d9v2/yq0JeCbe3P4aVlffWUSf0z/6EdXghLu9dj6t1+T35Pbb9VKZ5XWvJIW942F4viyEPCoYonvXPCz8vlnr9FBHu4aegu5evsTR6ql7vtI2ZNyAbMziqoyWL2aEfOVY73+/sPkbhl1nhG+0+WNu/4cX7J3qvFEPjby+/qUSF3EHek/6XsnomqI9tyObN37TNSHEFc50Oyzm/cP3Ay+09Ls1jRinwnksUtbpes296TywRHyoTSKm8kR6fWb7pPHZf3hm3lp+1kJhXyGXVeXbvy+hPSRZPrWRsWt/4ayUQ6SfD3xUAAABEJBmwAV8FngoFGln8d0fFYrzGxrTDPmyCGE59m+M8M7umNGXJagvb/AIPmZqypoGVddXvd/jL9Q5BF8fMU/BoPln6XcdBO/xWFh/fj5vxpl/uEeMnhrbtwI1X4ftL/L6vOxZM6IwHKboFLaf2JhM673svvNJLr3CHjMraO98v/0C7liad7iu/L4eXAFV7gtHFNeZSGnWit7xa3cQXeH/d/cRAT+sL3dravOrdeqY6L96y4V8KEtFxzoDVX/y1LLGThl2qnGrodNJtiSgrljPl7WE344sC+f8Y2EuI62k7hMo/l3KGr9UKxpn8ssn6t8TRiu6YTczVhK8j320ekm6CezvdIo1sTBHu86dHQLNz/3CPz1zNZJWlVJEaYKDWg97e8qatpre+Xd4TflgnEP3cNv7tAr/hL5VwYz8GrcN9q5EMp4pQ7nXGRfdgs5+xRfcG8n0/Y3ho+VM6V5y5f/yJEyuy8nt64q4ItApQ+VGwl8nf+wS2x26gJ/edEdoV15MJepLf2KuuFRdvfvrNhx70niYwhe7Xc9hejQYBzd+p3RuF/0Yeiwhirysnpb5NFO9XXuPr6oEJjpwzus9tX0fu9dd+7UWgVkKPsjBHxs73KFyFvvyyD8PUzTzpIEh5n+oQ83hx70JNk2Xsex3VWS99NJyBDyry7fhl06a21f6BcXd7yu1tJ36+rFb3e77LyfdCvlQSzlDEXte/eyZaFEnj8vtbpCD7vd/61vkkNe5eEe2Ix+jWfgrPL+XLuypv02r+pTkf9Ua93povIQ7BwnfX02rjaq/bQJLlb920ELu73fu9v9F/eV
OEPIXhp6/bGkOwHn+2ty9MtkZZy9uNiketvlWL/ukcwkFGt7gl+lt/Pcr1VSiT5/RZ81hCMq6alldV3feT0kpdPDG6T3LCyqHzdPpK6HXDaSN/d3e77xPd7v0bp80Zuc7u+G5aV277PeEN3d3e7u3tIJXZvxzW9FhLe+79II73Ivd7u9bpwh4JiPXeRfLbqVxRnwQ+/D/3N7yMqKNzR20XkM7+kwlvfd6TcWlrt1hutWta4LN73hBpiXcFNb9P9dbLd/wlvd75f+iEH73e3CjY+EfBOYuxoTG7QR/+UwuQ3CywhT3FZce8V7Sl+/TooU6b8gl399WNRb37TBHO68V26qhRHvEvj9B6H7Xst9v8tXTcI+GizU6XTr47v36gqNc4rzNuolhl7J6614oj3u33dkKJEp0rvL3dl3vtaL/4heSS8xtK/x0cE9Ei9QKf10RLv9Xqdh7rZD+e4aI8JnLh5G7n5fhahMvFfRS9093fqUps/foTNu/RoISF947V7KpNwr54CxbL0d/pZ8eR5ZFubXlWq010L275cIa0IghOfJS9OtiemsRIW7+IiyXhVueHHuaOufb7uK9wx6pF91Dn4O+/4ifQu5Jet/iMZ7Nt+hF1JL+IjubN4/j+CuAAAGREGbIBXwWeKFc2LmNe5SF2Pki73zczQ3H814btX6l80wwX/3CRLjOmrlulL+94KyzI385Y/cNt78otOfh6ZBOgtNndgkbEoBpqqdnrnP2gQeq15q3bls2Qu9B9cv5e4TEmvY8yJF3Nur17lIcx716ZS6bhTwuI4b0ObXshnZmz1Hm8qeEXsXDuAV7Y3CTqbdTF5fb7AWviTzf878V3f7uqbzGSTc7iHNsv5JxGdNkaGj5WUhLaF8b7aLbsdh3oN5OG3MNvBF50zDSPYvtja92XlfO+Ntn37DcVkYNIZxtQsbut4xE/TZ7jeGmlXhL3/tfTIbK/zuV9Rky4f9NhAsJ/ZKMT/y2Bnm75x8yvyfaq7uOhO5bC+8ILRwg5NDAP6SLcTysFw8aXSXgk4fXMaV2m5Uby3vH3s+5x/dK2yIhQkV585eN2fl4Tdy2GE/BYENpHW179JaJnfxwF1Icf2y/Vbh2W6WFtTLlLGgV3AXvZk6u/YeP9+T27EvtwgV6nlnvmB5C95tEwmw+b+xMVu/KF0Uv+/l/3yyrhI5Jt0rv5On8IXM+CFv842JJEzGC+J/2Vuvlwn5KbTTdbtjYhgd1kZn2CoqrpcW034XHLf2nVmATXpdtrDvd3v+VZZdp/SR2o0mBFufO9Ub978wlRXM7n5O/qAySv2L6+CLcULjbvGcXBAQt919Q53NdruNvP5wxLMJsK+1EHcUCFt1b0Efz5KHoN7ZlW36NfIf+ylDkfFqY+ycl92uNEjkTiE6ue9xP3UxQv+3HfnTDBW2/9i3/SZLQeNKLZx6wkNML2URMnXeYl0tf9i4k8W/CFwf0P1qjxhp/G+IyMiK41b28xW9mfB+3SS42LvDN3eOTpb7CgRipoRXh8aBkuyAvgIXa14swyOvry6m8dQfaWkNoaxqmyFbUYH5ZKqfJf6/S7D73HvmJ6CYzCw7xqfrbZt7XCPgoh6tthHq68z6G3t7oxWh6DQy7Yd+/dmL/3TuC0hbdEBXxJyHg2SvyZCeLDaaPxuoBDkXdejm/P9ZoW7up8W8Am6+N9WMBf5ucfZGC+5z9KzFd+/ibHIbpmQ2/6m7J5I9uJKmnc69tvUSwa877/dfokFhMOfvc4/b2VLB7je7hijve12kFigeQWl8I9ukrdVkPKN8M0b+hMOFS1X8ax8sFO73fhuJg+XZ5pZ+qNNChMIOKZ2FtCwX+1RvMtJx0ujF4xtJ50CiHoP5zqhzzvfqyWdx1eEfBWYnpsdak39z1CHl+F9WXNj2LYJtBcyOyaGOaV9m7Swl6T88dvioTLqw77u/seR2JvuH7SQt31jTzC90qBefZ5PhMsepuXZwatM1XbbbLusFJILuZt22
R/JvsdYQLck93vm3a5oojucs4LO0GvdL0DC5H+kht5dCS+blXd9Q1nxn4zKr/xxbvnzSu+ixdp2t72sRIC695Nd6fL9dZRJZTJx/gnHRy7czHad/woRfTfvGKOy25dUU43l7gqE7u4hzvO3bT54IbW7jbRShzWfXglJd33eL6BNIW+y3d28uxPrBLd+8uRfYISPeF9eu68Sd37ny9UEbvd3fd3tt0QZd3fd33Y969yw2rXwi/IzCHvr6RXvqr+ul+OLJ0+933+9ronu7/hG+73d7wj4Wy7I2OnCQs/snipX/fysz7d5hmhZ03rmI7/p5/7GsFt33JbOzJ24q7vfemsQ+/o9E78Fe7vjCPww4U+3+OEu7y/uyeE/DxrV868JbarexsmBYLDt9ZoMqy/O7yI2t+4KD3oqTpt2LL3iBM7X3vrGJwRXe6dfH733d96WlBH3c6dfv9Ah2+nXoEhrw4CB+lvsYV3vw3T2MP3cvd73pIU4e2n4nCPqkG/xxqjqousqibnO+HfZOtqhxRurvtB6ew98n3rjTxJT5vl+j+el7L2JXkf3rTSCAiY4VMLbH8IG4N7odz27+X+SuE/IUdwgy39IIea8sebK8okIGFd/F1d/jhAKxYl3vcummY5n1Kuy68hjkZ9Na0kWpStb6XsXCZT2r6Ut31TNxkv5PSp/mBF3IXSqX03XhfsE/c9mrsiSO36pFJ7f34kqZy21fdGqelX5tJtLkhm76YXST/WnRCcZ8GK/JJJrzW8kQU8xj/dEbhHXUPfGQAAAF5UGbQBXwWeERWW9KXG4Sl/xZDAo4zSmJBx7+Xmv81KXIY8xOG8XL/9i8PQ/ET75dyB0/UVhO1jWCEdlL+740rw98x8f49HqViLd1h72kbuPns6Pl+8tw9xmrczwDkXbDSX2kaT/ct56O6wk3v6rPBWalu6WwNuOlj4+JYtHRYQPMwykjfl+vcJZ98scnpKX6hnlnivbb/4QpValw/8vhXzG41ES/u+NyASCH4HpQS2r6Qe1aACPx5JuzldgYe4TBQ23200XdX1D/Y/xveu/96ZfveYKAlfORx4byP/f4yZccktuvkh9LMKPafPL8OZpw1Kt8gXbrLyelpOuEy8/aPsk63xU/Pgh/y/YmJ3u+/xheeAe9/kfaIzhPzGjtJpZGX+NJLGBG0l/uEikM/bi2KUl2z+x1WZPu9uTd/zSecfeY9zwsrkbYf7G3KYsTVelzsPWfZKjQ03fXCRpUrYkS7xBa9t/h/z/dCVhL3dTrOkj/vpsaWsypzdw/WXF5DDNN6r2PrwwHQVQrMb7yWkJMr/lSRLNFwrUiRz1xaRO3yfS5XtjbVlfluIu4D3L/5dLdu9KA7hnSDD7n7WsFfSOxXsnSDcnywNSdfBXvPF7x6OLje55dKXgouUeetBDOvOnX1hfe+odi/2E/yV+mmqG46EQX2R/l3T8sLmXFZcBVUZ8tL8+wEQpVuvl1dgwvlJeGm6CduftQWXl59dEEl6SGe3+Uu7SIzL2tOm3ksdvcnz34oS8gjdu/wpownfoXZnu25VLF+w+Y+dEkwdvaBBc2H1wncBGq0aalW4M3Pv1U5+Yhabvnn+8M3W6bMPZs/jbSI78CBf++H5di/dvHWg1ISVh4/V931XrBctFgDh9JW23bVE3UbEfpZseZYPVQcl3ZT/JYfub/iC99w6i4HAW/3V1/5H7XuNINFLzizXX73fNV/l6f3ElPnvIXZ9hexcl39/kjZVwCffQ3h2mgOtpJL5FvSZF+gN8dX9pTPIauy0zFXsKYcuGW4/402Or7fVQ8WQfn+D6znmYPP3+CGprWw/jE3Mt/Gwk1a13eJP5u0aJ/I3hCPUiv3m/6B6Pc755v/3j9Jr/J9JJedMTz3rliCz4+EfQr+9bBFdoZeN/sMbw/b+et78c7/+FDTov78sbuv7kEZnRC/65/f5Rlbgcvfc2XuC0SUffGZWrvXSd/wTbu+p1eFP1BJePov29y1/k9
3G9yzbvrckKEod6Snr/CT75gG6sfpKqY64/0CbunmdlyaC9kt9b5S9bZIJz3um9OUJ+CQyWGnn9y3eDt/iDvftORbqCIQnHu/b+RCdn72z/CPojflIycvf7/BId7313q+/dld230V5P6fysVPRllJX/e5mrWi6zwjszi+jve/WCK+5i62QvsDnWJpXmEr71cUV3e+6fbwiu8FJnRqw6W9Ne5du3+Mn8/vduX9v6E+mum+vqiHP5w1tczyelX/1u+jk9Ut+i3kd7hG77yhq761NvevGYQ8GHjdFrX8RUZf/xRnP3uif5WLTd+sxHf1id33vzIKF3fd93u7pV7BRve7uQTdj4Tvd33tJ/J68giWoJBD3DKSxvpfBXve08Lo4cliDfXakm3pNBEXu933vSRSShPdK997i1CPgnMfrHLhJthdKvhEzt96q1yeu8hLYUPLhYlt94tyw1/aWmu4SEuMV97v3DV71Pi/19hLMv3d9KJl7v572X3i93d77p6F5+f3u/sxuC+Jz6GZj79uX3P9Pu8n11MnQIy3n6eEfPX4Zqv8aYMvbAlbtrz3Whp63NT9ZRLYJyPwbq95/30MEzJtLOgc2c4+f/opqq+8t0Ttbzb1VP0kDAQM8RHzcBiT0XXd79Wa/yeu66fTUKZJNR1bvOyewmJFjXufbSk2T7xDbLXolkpp5PdLkyfUSVt9IltfUSZ3Ku0HPfXZITj5/7yb8jhdLWEiPpUb90kSU/G/0v+pC3Sf4JyOWny4lXxW9xW+GPVIlmiN+Ll31+93LnJ3dCxDuuCuAAABSlBm2AV8Fz3zBDHSThnz0VeHGm/4IMaDQhHy9rG7IId4FaLnzZWb/mm6C6NacW6OZ6fy/27YrOUvK/7gsvlx57hLwcszl/2n1bngoJzChcygQi4CC05bs9Tv6TcvJ+/rT3vL/l8LP7GiFDj1shK/lL85CRQ5Q6GArrQG345K0ySH9MiXruLbEraz7CShl3WB7/+2NKkZ72HbEP43B2MF60a8OxFnPI0RHGthDrpMP/H9ZfafsZBHL7M2cVCuOIf5ptAhO3HvOVHyDrXdqbewlosEOzJx4rGqXd0Euh9F6aXUWjJ+tvWpC19Zf8tv8p+eQTL/7Y0VaNLhHyeIX2UeFHjhsNpISHQePZbFb/TWtbtDdZZIquFYq8Il6Q1t3+vbv3CtpB2Pk4XD8Fr2lbfXEay+13hArEb4RKPEEHnDoS7bvT7I+x+ifBfURPY3t/JJOi899aZfdXrtwoWyiyL3MRIW4S5a93uEXjobPOJXa6jd5zbjMRe9reSkScSZ8avWBLgL/DbtYV+o2jUPuc4nswQ/Wbs+91HTitwi2y/veg9OdLhYQo/ce5c4nTkrttm3rI7G1VffPnPBLtwQ/Vom3j/Pc+pFk1C9x4iF4cvMMpPGpF3kV8XLrXBJ4ei7pU3+JKZFb7xwl4oQ99Im13+M4ZEWfyxn7k7iVtDtCrTsD9aeCCEXerG9AHivkAGI92RkYp2C42t+gP9C1JV9tP1HvONGNXnr/e+NklMgjrS0G1s5vx8RwaO984+7XUBG1M2cLsrAz54D1gTv/z+qx2LnHfrfEFtWe29fv/m93gkH7+tfjyTkU7vDDuu/vwTv/fThsr4G8flnp1jPf+vqKvaMDWr72sFRJWJTdXkbvJDd3MCENYJ48vxtoqHZYK9BoAoP/n/xwHTIhb4JW+1siW7zJ8rFwT3a+3LRst+w/PI+kmnDsAMrp2/vduFlc28Fo3ET0O8sbmsC4336j5TnWP/qEvRKy/2vk9VK68uOvjSMVk9UjtvwmaS3DYHx0rtA2V9dpAjOH4j2gvj21ViavtLcExM6+Oh4JZI0snH3WUrzBqrFxOHLzt4YcPTje+bqmfzdKo/w4JIoel5mT3RsHn5VH8n1RPuWZGBpODQXsPR/b3rUJdgkmR3VjJ6Sb3Z4Jeubvce300fQRvvuU9jsC+qfJ+lk9mLLD1ojnX1gnyjqXKF791Y
gmzDbwMgbJ3tKlhN+4J93e98R6QJS3u97dav71uQsq9AyamJKtf0Xe/0VnRU97+gll797/fP4QL+/hkxn7991f7fP/TKXd7/Md3SfY115IexnN2NW6z0RyqpXbVN3vcI+Cbe8rFOX4K8/ve7zb1rpoEZ3kQ0o70Q6d+6UtlGfby2X6Vd8vfqJz/e/RfWC413uY45r56K1aZUCiGXu9MytmelSF1pJzpsWJufuL3d+RkvfyRxHja7e9yoIR8Lky/Ha7AzDyQuWrzDiXtic/Ll3ftQgd993vcq9/fXkXSZiSoIS/0T12e9wl4dMULxnvKybMfGTGQf4cUNwmbfgnIXu+963b5RLu+T3/NUEJZsy1O8X191S6bFq8n6qrREHTTB8An9/a7QTHs/Tpb1xfGyux109Mgcc9/fqJ7l77v7YskZ/CqxJdGTJUVhKVhWnd/kXYvp/KVK3tuQi0uMKUh9/JZbDHp3C3iBGq8u+3d/SpKjp1fk+Mjizz6u3cl9qKjkiHRRq8SgntXENH94ZOFbTzXx/vhf0ITrkTo12T11J8QWrVGfvgsgIRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwAAAVpQZuAFfBYvcLCufQwhiMlSQ56/hd/vykzaYz8X3fSaDBf/cIkY5YY3KCPz0UVZpWWTzmjXFZf3txpYeZoKLBA9dbj3P/7HWVx33Kk2wy4v5PbTbfKgjlq00GvgeL2tLik3vQLZcsE5nu8NEPsKAyWK06b0yxJ2bzXd2Svx/H2zPRr0iSry1GsVnwj5Z3caIZr7SCvmMTUgTfq+3qkv7vY3xvATCzce90+Ep/QphRiVWxUEz2jV8cn7VSO7bd9PuEnzn8v/uMnu+PffY6HEUvbHLYR+vxsKvHHuInbCbWGEGfqo5Y2B9gOYvcpeqaUx/7vbGFgivV/vYH8N3+cMJVFsHrzOe5Pu8/sJVztOeaOSHRL5Ppv3UKXq9IEubceAe/g+2ijfsS4+UDhVlUg/e4IzZfD/u8jqXsfdp+5de31eWM+PaO+YOd3dTctBd7WCYrv74yCoL0hR/hEcWz3Hf+d72W3l+/cKaFpvfnDR1tZObwzTs6cYek435bSK3GFMG7yCz3YBDbwSaM9X71hLDqefu4Q+fuqT9JrpsJlXpju9w+lvi0WETXDK8i9VR4V1MZ6U3lv5b3bCXl06y/+VjJmRG+sdm9ujFk0+Q7e26porBASUPhy5eAHPbeU/dkb5nZNv+Ro7TP+4SculRfpwjlhKyXnEYIvC3kH2zk+ixc+5k+kiJt6GiftHJFX4+HWrQyfZd3r83b6VsiBUSAdlVsJcKvuWtIjxW7l7BdKq/LwkcTK7UssJnbfu+8lxVzlvd9OMM5Q1hqk1Oi3z2H3IL168tWdwfJChLIHx8uQP+59lZsQ5PVoNl725GwSAUpxSY/wT5QxggdvAf75+ipnK6T/J+vrlK93CRPrr7BEKdt+wpcsEJFH2iNgb2NNnuFJHNgX111/QcYWDBcPAnPi+17BL5BVnS1wdeQUiDxS7dX/VvwW93cij1us8pTD0oYL7PXsnvXuWCrH5/m+dd9/xZHzLzBkEfu/+jwW3d4cSvuHuTOW0n/4QO7733P8I9AjNFWiN5Zfx2QtFiZ+gXEmlGVPHzt9tAoK8oLezu/eTe+3BESxuVRB+JOnvkMnYl/30RjJ9/9FOZHdlb7gtJshud5X67FoEJrLGSy7bWoTE273vujPBPd979diUCXe+9x2hHxIg/rP47uf32coKD3P4y5u9XeT2095/8El3d+1rr2X9fXvUER3fTLuPES8irfdyhb6gi3vLJ7VfuY977Ognu+yfbZqhHL3tJ7yyuEfBMIuf5or3FeWLuXiu3u9Ovk+nHSxtowvdr2jd0eW76oTXul6cEed//pJe0mpX0kCTLv2msmEfEl4cdFIplv5DgkI4f6X51dlMLIJEj0Tun7vtwWU39zIbtuL7rD0iEv7ZW92d/pkvr3G933PDbavdKi+aP+4q+aIbXa8/1ll/7E8npO64qCI2VikfNG3uf3vjthcHnboyhFTmNB7PXye103qJEvefvveQsgIubafL91kIYhfwj4JyH8ZO9bR/xLDKntnQu93ef6dU2d7vSWZGu/q3e77EvrLfesISylP/L4f1mqW324I93ot6cUId+Pc92tJi9Eie78v3MnlOxvhHyYJm+3+IEP+WNSu9FKKIXuX24rvXkgkE3en7FtFNd110vN9fX8YaswFMg7HygdcR+3+/JhPyFYQP9js4l3o2GT038VJ3jr1YnrPJYvd6/d5c/HeMitvb223yetFIRXMIz3mjd/ufZtmOFi/5iOJM79UvNGZNuN47vusuENU1iYIjnvp1oR3lkgnK2++4AC1llgg3eVhvZXd0WL28m6ZBxG2mnsi0sKPX+Szhih3C/sQ7/xHkiQjf+IvkL3tAugAAAR4QZug
FfBX5h0N6l/iy6q6Ri4QcXb/iyLG5K8aOuslwUd3cuPisZf/cvMfY9cL+YkxuBA9dNL/7hHmRHgU09uDqCD9fSNmL/X4wvLZCpPjWZaSgx8z7XHN19gn9w06wN6s6p+Pljztr6Fm4dg3CQtnyTBe2KO7O6uUrL/u+Xy3JUNcuXP0lLcv/Tvsw7ufyxnumyCnoxk37jc7oScHbAT9zOlO5w1/gSey9bdwdUqxy25t0LV+uw1SXvQCnTi7hrDOZ4xwQvA/+/sO5PxniJu/f8F/zK2PDy+reIDjqhUq7T8v/eNhF3JdgQXzilx3xKn29zL/b4//05fp3x5bQditGUbu+qV/haJd1U7StLq+HIOC9/idUf3FwmxeSnSUbI55yb8Efdzg9xeXHvSK6tuieX19m3J+vyld3NoTftgqHUUZzFBuWMLKYCrQtlsS4lfxvvgxRhWGGGIvJ9/Xicbr+hIv28f/r3lj/7gj3e78pcZfu6Lqj+qBYQI+SpcZ08IeWgeLl+0HXdnUKP7BVgo9s2rlu+5dvsa22gXkwolLAVtdwQWZy5S+B+p8X8eq9FYLOmV+4aXGQ0lzmCo4YWIXEPh0TwAsd0qc1d/yIwDqu+/B54SyTT/8JknXyg7hyJ/aR5YmPNP9OkVH859dsu/Xl/jyRYpVRAuD4MUsXpSpX43sI1iLGfWCsm/Zu2buZGrQZrcIfia19reFIJzScR7wze5MW8GamZBvgr2db64R8LdtpUo+vuPaf3+CIrlXfMJh1glN2d4Y7X4ZEtfY3k4zNRUx+B2+03SqotXEoCPaiq7qQYudtIsoy4/SeMA67x/C3Zh3nCAkdf7+k/e8ayoEdBBaejphOzpwWcd6xoib/baXYWqHRa7ktxY3dHlaa6Nn/UEZTTkHn1T+gVXDVp+3IF+8xdsn0t74ICYSXDG3sC+3Aj9i9m8M3d/VkiUFt1RfMKsavxfXV7H/9hM97yR7/CchvckOES/++T+vUXBQbmif7fLVVYIIZotha0+TotPuqeH/cFO396Sn7vbXTVOuut+PtF+/qtb9Skd/6Py/S9vky73rb4Q8IiM7ckud/glPJG93t+Xd9dTPd6L/umJK7vffX1iO7vuvRDtJqvflOT04Q7Eir35/f4+973vfesqLUhPSr10WLrptj0pdp+973k9ttL8vLLJ7v1+X2/y6dwj0Xe9+oJyXrz+qeXaLB967/yleVhv4iqfe60tfYsQEuXy5jtWtd98I+CcoZWjveVJs36/YQ7pu7uXd+kzlj1Y321r11/vIR9677v8Jkvdsdjpq0yPZd7y+unlPne4Sf2IEOvSLnfys17+hIm97b65PJ+kxM1k5Pr17ZWt9i/v0/WCfu93lbqh5H0ZA3+89L+SFF1iCp6vvyTGvfr+xu76L7yke/Xq86CR0793pPxRr3e+9bCMl/Py0z4/evCvj72E65dJBq1PqLIkkYe1WvJLc1uW/ghPtkxItPqlUxU0r8kVmcOZfUm1pk+TDSxvZjXHffEWU3SwWQAAABI9Bm8AV8FnixVxl8bMUR5a/KR+683LkMl/+wiSwo6RmmQmE3upN+b8578y3cfD739ykd7+4K9zgzuc24ZQfLtP7h/xnK8boaPicqZB+lbIgIlO2Huj/hMqTxkTGU6+i3mFr15YJ9zAvu8wvwnKLovn4V8EhpfAQ/5J8VNe4UybHYQAGj6X72Up6K5j90LltrOdgowv3yM/91a6x1r1ldqdbA+gP7jOeVdRErPK/c4a/MPw5tdullfUv/uNsYIv+gWrGz3fuXfaF2g7qODdSOnVr+jxxTCaFvy0/2e/fG1Sbun1T47p0rXsl9pZI+ixclU+efST4LZa/c/FqUvzZ2oO7y/7KVVN3yCXmER3E5hf4wktYetUELn2PG+9/X43S2Xp3ZL4wt3s5RkKu2lCL75I73bBDY289BKfCkj9+Bd+p3/cGo4HmQ/LYg5BvWT20mtaGFD/DRHskssfaoOi3EZiS1EeS
0V7jcT1hKW+eoawL7QbSW4iCRsbr/n+k7PLwL94U+pc69Hs8pXlMp230boQ6DQaaUkIiHch7u32Rfq/d4if7XaZSShIoYOT+SsJk/d9XGGcOpMn7EMA+Mmm5N20d9tPjSXd0Vuk+XYO4Arp3F8S1UjfeDPzF1wE2rqtZ9PtPOwWU75ykKs5sXMWw80cfhJc9X2nxqCQR6aLr6TxrEZtQoNY4DlTsDu8RlC1NgbVzXcdcHR96by/02tUVLtczCJsaV9khM76xhM0tYXuksvplZ8hMvTPXkdDpOAm9XuY7T+Z5QNsF/bg/abS0M7Ly/E78I+t8n97pUKK8Lq/dKT18ncFJiL5Kve75l8voYSOy+5+G5dCsgZIH6QnHvv9/xvLeCMpAXeB8evNe/qvdP5oIcjbv+CvOPSnOGnRo6aeB31eLQm0rj4n2wM47L6ydF79ehHxZp2bem6L+/Qq1ToRJcn3fvou9eaXke/o8hXZI9buiMVQn24IiMOX9k9NSXxFHfJ9fv5PX5ETNd8IeCIl7v+CYg/Tf7e7/goPNE/3vf+qPqvf691lPu+qRjtk9pL+1i3lqCMr3peir3BPdje5l7dQj5BCdv3E3ve7utHrpzXf2dgn7ve8ffqv+vS5qsZP1NyvJu+X//r7KgSb3fsQSEPBNlY6cerffpir3vaqr6vpfXp6p9K0aCi00KyW1NKhqYuKh3T9bPu9viMI+F7DNoyYicmvMSDsUyO/nE4akUuUIcb0y273C/w6t6SLt3fV9UYr3elzSc7ar6k3qb04gl72n6whu7lObzw95cjLjJQWEi/v40gfcmY9wK1L2Vmq9q2EXPz1PB2v96jPckIms+Oe8ndzHFcvk9JbSicSJt9ymzL7lbqVQhys7yDuqdEMW95fXkXJ+vl4sj3n/1IL3e1kr89/1dPv1gpEXGT9bYMKERvDrwE9NSHx9F5VSyCt5fd4V8QSXeb305RJ/Eve+Vl0tkqnUnuou7fW4TLPfe9dERjXfWnRL3/ZaQ3bhZ+4g2rW5dtfNy5SSpfEfzCT+fs+Rgm3eHHCq6Z62VZKwzkkED+U9va3K5dfD3xcAAASXQZvgFfBZ4sUNRHullTl/1cxOMzFauX5N+WWDjX/5ePfwwvcEBIcdIEm9LwUQbtHuK/JcrcC4DdfqS3f40uM+vBiRPrb54INy8uzrdG2/astOMfozBaafK/tsrLGSYzcolbtZeH9cwCi2kB93MXcn7WnlRjOPQG0ck+3Vy3BGd31UTHRYZ3V0FJ+Tv8VrXV5f8qotopuO4l/+y8Y7CngkEUiBtpoyb/GeEvBLA6yxuuwxPdZ4+Al80buGBJecPgRxJRvDXy089csutfiPEv+9hDl9+qDnjFpK732d5BL+NtzDbmsrc4Z3B7eXjYg/1NOT/69xBfG7mVbKo1TlgkvCRh6/SrRYSn6BYegxK2//LJj9PVZeT9PLy4V8YaYSd5nBFvkQ9NBpBNDMN8g6qRcNM++4KyF0+7sknlxK2kwZckL0nftZOXzi+Y1+MOdlKaVdmDpCve3h0p7rsrCWaR7xumsDdlddG7vyxRA9Lz8q+UNlTv319CS7u92QS8E5A4899scuthfpiLiT8MV/3DiIwi6O42yM3AQ+rmfS9yjrXsBMJOaVGUyt18CXrhsGO8kYstF/GyUylJFbjbMhDjd3xgl3IoQj8ru3uytGwjwM3cddhG17yUO242UL8vXYmCmjdGUu7A8E9i/BB5Ye9q/8PwVYLctAjWf+SOgpYQfgmnm+/HohrSZ41D833+GH8E+AgedX3pZ/J+16njJkDhyTz0Csh1zPszL7e0pAJvSTm5qrSGwAmvuV/1tQzwSDtwlxNjk9fuGhd767/e/QIt2ucrm3eZSNmBk39tnzK268WJwmWbvNsJdAiEO/r8EJtgviouIawpuHyO1TZLGIWZTg/vDjRg0QezNiv+4Z0BuNZdGAu1n2Hx2k6q/TQ/MeZ+Y84cks/1699tZnrNbC
eYNyLyxNieKR9gjgdcI3V+sWC3gp8lAkPK9+oS5QR3t4sv/4JDDKd+XZ2yu79tLLvMSXH+Ziy5/pnr/BPMLTi7vd3O217WkV7cXyEAXf9L3j/4Iyve1eS5fhBdapPwSGHqc7eWCEr3du/oy7b6Mvb3k61Wr3eqOile9LJUT533uEPJvMz8aR3tNjPp2WbsZ0BM7qrsv+UJ93d3P6fwSFNLxd/ikO3e7819L5b7+la6Pyeqr+buf6zXvtVd93v8Ftu/d2hHw/5fL7pzPpvrND0UQS93vtMqyndIrHt+/utNZvbVLS3u+2xqUpihSa/Myicdgen/CPiry+ewTPp6p5aGumyir3v2gXHrXd5ZPaojMRk+22JGfX6fChCXafbuORb7236nFtr9Lk7T3pXiSBElyJtSmnIvfvyWUHH+hQlpDSDc5ON62lPTROT8Kf6kUJnHK/v4sKGTrnzzfem29GKgiJ3cZnPu/shGXBfi+PT5UQTmvd2SzPfeOuby9rkJ8jGCBvBIpFwwk2+HecC1xeCB84tUiQsoYGnseu39T0Ke6zIr+lSl9ifWUTeqfqQllfpwr4ilfJzw0vzELH6VHTreXl+2iaBEV9RY3tFJe9dKa7/Jhf1In5KhS2f7qOduS/iLK6V0qL6r8PfGwAAATqQZoAFfBZ4sVHNlxuMqYlk1l9d8xNsxsMl/9wiTnfKFnWlkNOHW5Mf3HFD49iDWZZsDWDMXjBe+5YTYvJ/b7nh7a+c/jKTEDUZaGl/UslPk92xPvFG5+7S6GQXpJJ757ro4rHqEK35cXl2vaBZuXN9V70r+/LYV8xrl5q0ul/t3G5UM6QC4vaHl3b4UcvMOguQCO7ZWW0STLpq+/R6XRlK35k/ImPFpQQ0UL0/aRW4zbzdFpJu97CPhsxunAz5efTRXjuGr5qCPc4PRTrdyt/hK7jLvlnYFpXycu3S+S+YNSe1Qll77SdIed2OXu/z87EntJFR7agol707l6cilNuJ4q+9V/LehtBMv77Y8Vw7sxWW7cS0WwgV3VWWMILXo7QKeKRJOCpQvC773adtL4mXKC/osYctfzgwNTj2sfZav/J6S5eRE576U3knXuL4R/LU33u8b1dfX1gggj6O/24TC6JmA9BBrKDdNXVSY2N919JJ2U8dyB1n220oy80efWmfmZvgrkBkt7ye6rrie93nkyCXgiNdrj1vjrnx7t3OXROryeq6T4IJrMFkmCoMXmItbXglGuVDeiqqvdbv8x7tiExq/+EZwfDK/LgkNnS24b45BvfjCfp37gp4QccMMfMavnAvpNH36//BH2bv39/kr2l8fwRmYt2vRqWsbQjoYI9ibXkajbMnKow5LmQpSo4PdFbr5Dfz66+8s9uda+a94R8GGd8j4t/h+WCpJ48oK+nWthNzhc/vhybNINti240lzEsggomzvTpoIkm7DazB4fFlT3dNK7q/Zw//EPrJ/3h8txK/vV7CdmvFGbQUcGoWAzjPO8PXtfDbdv8Fu972YdbT+ECOWq8+9yXS3hE7vjz10amJ+lxvV7go52IbvfXvhr3CeO07v6OgpduGm3FZ60qXLXsrG1xo8oIrK2zBtY6TGzMeluFPD17Bqu69VZizqv0Bptq/r39uUHB3dWMWenzd/iuZAh5Xwi+8cbYz+tXfzu5PXjvEzHciA/92KIml9aTXxhpEfv7cIFMM93zj90w7ezp7EIFPGwiOz2icV7uPhHH5EbxF93fX+C7Bz089zCWzbtjIJbB3mPThsoXYH2O1BHulZpLbFXd93yfbZGJXYg3h1ClZltBd3S7Pfv1Rb731IsoQ8NZV9byw/0Ck137buXHLf98s4KC3dXvfaqKILKeFO736lJO+63ZRv37TEvRJd6+GtntHh80EJi/KEX6PE3fKrK76giK93OBHrJ3cJP3KIfbvSIQIlEP3ty4eHnj2gQ33Sgt17rEX3e+mnSX51ii3u929b7vrFXfSf+/M21P3e7y/30Xd/ZB
VN7u7l4R8EhFl2/2CTblf3otL9f36v2frLS9QSEe5B7sv/0UTd37QT3u9N5f5VIYt3wm9tQRmffpb9adVr8mX1XI16E/ECQ+i4Hb3y8P7Y8oLDXvjaTx1e+uRkBGJuK5J17ZXuPFy0onl5cqv1Bdd97xbfYUNOljInhuKQUOT2ctajjp4NL/hPx9Jlsvrqo7edzvuSPf7EvP4Tfs4kQ5sgKtFz5kjdvFnyi3u+sxXvtoSyym5JaxN3vlzJ+vJkXTRTu/q0Y4NLQjDCUs7JzV1iOCMu5CJQW0tI3d/knhdwxX5l5Ii1eP49FL/kkkp5KckRPoezr2Sf0Iby7grgAABHxBmiAV8FnixUPM92knGknNYC9OUnHffNpGuoM+Yl1DsKZL/7hHpHGJBV0t/IjXK5juvwxF2pf37BAUw+8sh4Oi/1zl6J3d4JRtST/+FOcXL397bTcAN9x3iPtXtGuvcPk8uVsPZ8jPa3CWT+7TMZLw92PpT9llCt+W76y+f/l9IRc8J3fPB2iA35eSEKr3GiuEnGEAxVilLcz06zkt9Z4KzJbEVYjvkfPsNZ0XrZcGStf7hAu9OA8R9vPPZAcQfDKHA3AxBFzdw/rETbbA6esEn97rf+YsGYr9+gvjlf+myssJcZq3DwYlcPVdPGI71X9njL3eRfe/c6/xxV0lZeSa16gl6pZHhuLHTv1CfmNxXf43w/BWguOC3UTe0EgqELhY/NhbMu8fuxYvP1s+E56R/CBiiO658I19pyPMPgn0r3/8eeXO8gbSu41dk+3/yyuoYupmye3W715JeV/Tid7u91RUqVV7hGZYpWxcEA2rchDh+opkpy8l5BjX/9u1b4Tn/zZW+8JVrztcJeCc0+547wZfu5Cxk/NLC6Xy24rOdlp3bv3CJE7Gfh1pyFZ6J02vwY1pvPorBYXhyK8eMw29w7YOBGYiJ1VqeJo54u4NFIhj2br4b6+xrBLtMGfnH0u/e5vs++ifX2v1gn7nDsIvafL9r4RtDB3BRD8FnXfcV3e/aCkdFylEoTm760vowJahN6Zo6oCFcwXrT7/bc6wQ2Ovtl/y+EfW2YLM0IbT8ol8nd83P43reBnIAF7/+C4hQjWH8n0w4RX3f+PV40oUJhl7AME6aV/eFzJUkjiJcvXFuQXn41VGH9zqNu/dihtdtAm7nl3cEslbYVtjd1gkO6CKp8NPikQ22Sdf62vTQmWCwodkb+8gm7/73C8GY7398EOX5L/rLvUTKvAc2Yo3hzsQURXZf2r0syCzz3C55PJCRcnz5SCPmEE8n37aL/0gVZX5G0SaB0qozQe5fgk3r/9Xqj8n9e56v7r/L4j161wkQxuV+cL3vpvsSq9j3pwh4IwhTvSj8FJ5m0xXvNr7fkzU/gku/lS9yFvRUuoITFPP4uxbRcQS7wRE3dsvru0Xe9b4TPu7u+7/opAnm+9/3efwh4QEXcr5lqR4/yxhXvfd3L73Mvf64dHiSu+Tb6S76xF77v735oJPm4V+vxW95f9P8m9+mJ7u94R8FZLy+O0uZrvX21y4Jcv3/fXtFjdXb6vqil5/sT79arRfJJy8n7aVDagjEBuf70vuuxsJC3cvu7vsiNenveoR8E4g/l+N3Lf+tRN9u29VfVVQn+hZ7tPoiJvfS/lMUOz/S0TCT9MaVy5Scz42cY7OfxLCn+dWjN78oLTO980nb7MJmj6/J6/J70dMO13eX0mFBTvgh+fL5k4t3LzkvvcdvkCBUOfuUL3zS4T8mlNvIQF2e1lukeyJCO0rv4mYfyesld3/jJrv7S3Ri5DPDrZLtaIXX8LaiiZpyX7+/wUCX3eidIpPq29/Xk6d3Ld325ERlpP8jhf1In4i2lUupZ6AvgAAABNhBmkAV8F/mHXGob+bpNBjwiTd5SpenXqnjHT6qQN0V7gpLcwu4ccaZNDWl5C7TRb9MFfjee/TOnd8PwBFhJnyeSb17gnJq8JmHmbFKVbE1Tqq9lXf4vx3ifl2X
92lhXzGmt6pf3tsbPOXjIEAhcF0GC7xGGmX0hDeZ/RK8W3b9RgHwKfUhbDtjJv2PU/crgt+sZ+BvnMdJgJtqPDnCd622hm4zdfbOJ/6N6CmqeX9Ydf5YhHhz375r0x9b16UdZ3qXa0MX7CJ/daeuvfcYV1soIhTKGroqaE+8DOQ6LNV4lfJ92vbjbgrnsGenvQTbFUf65o/0N8j9/6xt3a05g/BN4+EP6MQmLE9Ps92vp8kFN5i/LhrGFO/yxnuixHfkmlL0r1VqE97ro7UpEzbscKF/vbCIp3cdpKptluxW972N9srEgDWXHd/uHBKa07KPD1d1Lsfv8i9/YINLRnOHzFbbovKc9uMov37Q8rGva9usoUIPj5o19sCt9iWEr3hPp/+c96ccEjS+nE3OVfe+q8XfctbL6cpd1pJpwT3d3pSkkqk+vfxYjKGg3IxA3ZLkST/grXcffd3d7tb6sTu/LCE1viCCstHu3Kx9wwQkA96rMOhuvDcs9i36LBIVmcaDE3yPpOaXSEyj2ASfnOf0yBu4A9HEV/guLKudejpU4Ne4vlHQm4+ud93uCeUBOUkQNx2m9xJ6VHTrQ4msMtF1HnOFoZe3TRVgpp8+SyGA43n334IaORz7/lEtpvCNYIzG+9OKrBVGwRG4EZ0e5Suf8fTgsIGqk092J7I+yt8APbGYUt+RWjt8PBk/SV/BNtS1Pzn/VunDNjNH2PpxZcExRrf73jHdCiLkgR0KfLpxLBQdw4lUfyzphtKlJ3JK/aC8Els+DV94EP8dnXu2P64fyelvrY3udIOileF3bvuk8y4JlVHcd6Y+Cr7SfEkNMjz6Ge+SoqxqDkpJ51Wt1EHfef3CPixE/t6b/BQXdp2N7assIlBOQ60dpORf7pwUCePRPjhdI700jfq2utXrwUFe+967F2bPMoM015flr0I9grn+7T7t1ub3LGXb3ve7b3/ECWnd331mI7+y+iMpX319YJeWk+bZga1f9Gf8uHmOX+og7vc//5cve+sI8wgQVjfcS/7Qword3ckn3d3NJK3uyz1fxC61fyenCJVfdru+lzaWu+7qxsdR3u7uf/RSE3uEvBhl+P0z3+vFf+Q+6KsTSTeivVUat/UVl/d+l6+t8N9HepevVCO7oJCbu9729PqqwlcvvlyEfC8Ousl7TK/XSou5Q8sE5J3Bi2ky+l/kMIda/CAl2zwfG7cVvFG+xcSeifP1um71QW7et829ZJ6WHpsTUJSL5j73K3ZPv5L3fVZYQlbvveaR/20S0Cndma8+QY8su00XA/ry6vnsn1VEzbu7uEvGEG7lcIfMTmcJbE22/Hbn+9skERpfd8n3n/lEhkaH0LHd4y7K8hVlzetUg2n/fv39MVlx6V76oeZy/Ee99N9Nx8VksZJjcnpekaone3b/9OFCf2bfszv0p8rFjWt9V96kFp8Z/VOuSKvHX9Fd9OJRoI5efHs6Qt5MXR14gmb3P+n2wRHk+nHalvfs3ySX3l/JJ/2ULTH/wt6MdJMllj2nvyb+HvioAAAExEGaYBXwV+LHbkPjXlTRxdwiXhJrysBne4lGCj+YxSWHbl15ev4Y8X40ZTg+0Zwr3COUVAkeliXj+AQaeDM1VT5bF2LvNNJwdlwPaUfcOxr/Uvll5iLzf9z39wTzt3wXYCpKWy2stRlM46/inXXnDsbWEXBswthC/gxHUEfp55cEo9VTquR8vvrX5S5zbhVe4SGAi3Ja3qYEzu07ekPnOxwidiQcar75vgw7h6/u44J7m9JW/+4ibhL/Mb6IOdxSvUSW7op0Idh3f3dVJD/cuHoN1+4is6+dR1vRi7vW6l8/Xl5cD2gJeYlKM5fGE4BH+7IY373CfeuoR/0Mz2elfMf/xsIXvj4GU9er2McEdlrlmpGImfNj+jJhKw7F82YL/X1Tdnf2EuQHkv6oob+CgoT+TSJft4fGaXyp0WJmOHm/5WNfYvjX49yOvfCV7uM7/V4uEymZe7yFCsSe
7nudJgk6QD3ZKUo00e4sR2QlAI9y3HfgtfqmlF/u9E7dqVY7t+T8z4S8E8q9joyN/VMWJvtsZDagEQ9uu9+rJ4x9T3ma3+V7mTm7GbL5fV/Gmp9pFC2bpTbuTtAB166np8alBMXEkNSnPF+jCF/U9akitxpV5KaOd3O4Ay6xyv8CLuMzPtKr9M4cbr+v3Bq/vzqtjScS3rpmpoVV4mrabcgMPehNFxfyO3S5YJe0dw60cRXlT8N7mCLwyo7AeP/dnOFiXrJBUbctWKCsfe6dAGH4NJ4tDd2qULYGHOKXBs3Vko+K32h3xH9Pt62ZaY98n2qZ64eI0Nqc7VtEYD5Q0/By0f9Or69owt7wm1K8KCB2n51SfKvbvhhTXbrBAabpqtxTHRfx8UvxDuAmf/q7H1/qFCq+xG9nyf2J3u0vnWYveeg5y5hj6ufmnUBi/3cHOoIykPZ1r4zRi7Fw15c7jK/17+4oieTz98JWOPW04I4CFVZ6/sGZb5T7DbPdOCEuVh5b9EhDyeXov3SRaxbovBEZu6uyfqZd+9PWuxcERZC06V+j3nXHkTk9tsbd/ola3eLQRK3ve7v1a2IEPna30SdLt6q+kRehFb4Tu7u2Rmnf4JS7vl3B2NL3+VdWUr76sIm5td97d+T110CO5Xbm/UEZ03dztVZLqnCL7sWI5/u95CYQ6R/e8/e6UmwUF3e7/qrBJd/uvqlrr61fr6SX+/6d7UJVir3bvPiaXXWjnBtcyy+r/rvHXamChfgXlnr7LNd37ogh990U5aQuxeT1TOvzd2V1JITL79KEPC8OrWceN3Nr8VyKPQSM/fhl7l+VycIHtvjd1dkr+mU73LW8qT9okud+cup61/W6gi6popVKZlQm7vuZvzP7KYSgUfmx2Ad65cQ+fhLxtGk2ZFbSl5VwSd/vavF1vICHe729f+C00vdvsnBmo3449zcl8/907Ky3PhtwLAv8Micv6vL//BObTdO9OOxd7v+qYbrc3kzrtI9IIiAD/+ZpwftQ+o88z/eB779YU9y767CR7vef9N9/fk+ncqJ1SDV4nk/WVdQjPFDu77dXvyf3avCzrtiNV+W93t8VCQk/736+iTFve1cQjctTD1emEiFdy5lh5LOZUsUMYiQQpN5fETfEdEyUdW4deQpMhYLIAAAEm0GagBXwWeLFblxLfuUy6QbL/7i+HoEYi9m5uM4wO6aCa5fy3TGXSsJgi0KRcuG9nR2KYa/ft+8v73gq5yJ4+8OxdcMbgUnNbvPKTeVBvBEUl86OpP6cvLBHfcqa9VR3L/l4u774rCvhwkwUc34a+X+Obl/e2xsqAJO2TKwRgomXCRe1Tq5vej9wF1c7KXhy4Bdzm0SWuHjcH7y7EGv4txpwUFDc3Hyekk3W4f5hZyh+trELv4QatbufYk9tWe4NB+3/gSbadw+o30nu2V9uN9DPwL6UfJhSvrSl95go9r4Za8/3TRYnnDb47YNW6S3eQefpL1Sq1ej3/dU66bEnalhHyD37sj31+bkj+U9zLsXCb9wiMCoD48ZYxLQiHadb5z3/GRP/ktl+h0b/22B+oZRwbpacPx5PHLAnXVnOQsyAC8rbShy+v1TvegR9itpaQW0ugVlhiX5asbDPa273eVNLkiITfbsFveQGckIzHjx8NOt85Zv5qFMgIP1p4LS3m2X6dS/kl4Tuf/LMdFx+CARMnP9qwdkpjYlI81PRP+5lmuv3ChXoHZYRcnZf7dgW7VKpfk2pwW87dwTdzHnoTf4gVMv3blBPcb7/znzQRAay4h8A0e6jHHbEZqSaDj3Fk/PqOGSa/L/QUn/MJ43X4aUxF4AraoTty1+CbwRfVuGFtIwSYt2t8aXHiu84xDbKfelu6m1L/R4bu4duigExG//97gih3bNpJ/bNOn4UuBzgJgpyfMO7WgfScVXdCeteBRsGPwOPwoZsYmPw3501zOiRFcgW7udnSvCfo+HRWCsnu1P7xkuisdZPtuxb3CJC
e7p782WQ33+ZtQhd/f9YuYZTs9u+CkqZIz25ff5q9BlFgQ/LnnPm77HOm17p+sWUMy+TAc8bevlDRzqdFZZJpNX1mJuBAuTak9JpLPwTeO1eBDvG9/ZRbIXuyn+tXv+Ezu+OmfahHxYiVhvar+CQsBJv15q//qixbBOThP2+bI7aa8FYt+4fv+9mH35eRO3FCA7e093v5O9J8nr7rr3qveSjnF25BEvR9GaOVPaL5f8E5L3unqEfD5Hv5fU7OpIV/8Fh93P7338ZPS/yyFng76btEwye6kLXDLG3falTyW6HP3+3ra0Uva3wSZ/PFwEfGGdvff74/S19p/4KSm0/unu7Wwrz1rv8hKo9a6/dW1ZIua++8El93rzFRJy+X/yS3fhHwuR4dVm/z+/1zwVOW51ymPvyft+XmLd6rCXkmTb1eid4r24KybvAdVy1s25zl+qikMJ3e9PhPb3RZN1y5OT0u/k9YS8V4Qdv3lb+FNPTvL7ve8gvxh7klb6ZO0y2XHf1SvRP013GpFRqlT+rvf1/DwgxXKDQN04bQqa9+XXWI5ALCXx5awzt/++nCHXoODxXd+tizoLwmX9arfUoIh131vMYyCAnP8/8V7s5f7FdoaMdU+oQvu++71W+k3zFdv+ESYEtWB/1j+f/dZPVS9IkTvc8KXl9Ql93fCvkx+zi+EDOj8u6r1pZuXMvqkuxO76+vdkshO70l0Cndz0cvux3S7nv2cqzsfgR4AAABINBmq9KQCvgsXuLFBB57Hj13OWeL5TTUw15smkIUq7fwQUNIeimLgiCAUvQSwyD86JhHn5uNC2TBHaXmdcbiUm1/Xu/UMh77hDP7vRO4bi//0JgrmHjjZcpKxeeNqfojb60udy2T9X7UJd3xM9o/P5fei8EdvVIvmlkeHXuXu9fZb7QU8OEpSoii8b38v7vjfLwnTzVkGdIvUO9RISPPeJH8pmqbpdra3xr6y2mvgHAwGfeuv/G3TANLUBrYOLXp2pLO2v1Gaf5e0NNYcs/qz+z+vcO7uATi+4a9lwwoODX5EPQ90f+T3ztuqFdyLyL/oYXluHrEqg2A26UuFHgytvyHqV1Bd3u5Sfh90eiQH2tFjKU6K3aacsHuy5KDu4rEw3nf7pToT8/ej5PSoVJLPF33x5P/lLUvwmX/3GDLzU53uaPab5OqErXt+Le4pL/7jO/U2SgQJzO85tiRL0G6SJLOga1/sei43jq03bOvEQAvNduxDdtT9nYKZb941foNyF8gvwoU8EikbjXdwke2n4Zl9gRtva+9m7TmT1xL+wTYcStfoaI5xMNe0ErLfudRSvl7vtQkW7Pe3b7hPPX4eh3ezwwI5OMwR3J/oR6n5UA7eGdu+vsIleCnhy5+jzYRak4sY8vBL7N6lvBGuJSwntjBj3d+zdz+7n97w3Rh7Fa4TufjhkyH7wSSjkzQjcWDl0LYn4mPwxeX6VzIERTlMv7qniV29QS07XHD0qr94LLTKbI7bBH7/vkF+EfHhkN7jTYclV7l+R1EYq1uUa/T6tsg07L9F/y+E23uCEpuvpp/0luCAxl+OMhwTxDC+r2lkWh6utf04KygTf7n2Irb2kL0XBfaREv9Hq/jdtLiki5ZPSSf3IYg/daXFezwSHLMyL7VZYIoLs5/36GxRDOxwYcgY/7QEK5SPllDIN4PXh2Ynp9VCPvdva5aK7pQRELqUtXaoZsSJjIlc4/c60nqvl9WeY3D3fbyX33ghnf/ZPbb0usn7iq6kJvvHQd/9lJ9ovQk/wUZYvvd7mvLZ7v2ld6NB39/mgtu4ZRd9vugLT+u7XtvWva1eEOwT20isVG3t3Zf/K1bXfXgjLe8X2Td+hNa8lWye7tVarM3tYJ+7w1Eaxu1rgi07xtUotKynySCt73fL5PfCHgwy+ZluNoI1fJD/ZDPfywRldid708mhRX33fWSt+tUvebu+j+s3d1ZUUhiUMXG/v5CX3CPgnoEiefysQl6TKfygrub96b
5UB5ctR29xF7420+T7vOtwiS99ShSYDwCHfu4rpsbbci/Flgo3AJ9DF7/h8k3VuF3W5G8reLUGl7lFQ7fn9eoI7kXtwRPQnuFBT3ve97u+sv2u4Rp97y97ToBf56K/uCOVD2CEMeb6CncJe4n5+ik26SI54Rd+w9sn1e6i2MEjb+3bzzGUfzN7d5PX/wmZyTzl+djY2znh2ki2gXG5alcZ/2T9fXHkfLYLyIPcnMHcdKUYT9Fqi/+2CKUL3O8JoJe593jSi81zrzFJgbuggbK6Ih2Ymceo8P58IGzzLuGxI26POka4z3/xV63OXb2/2RG+38pTbMGyL3tJrW07LBDhjffc9xpBve1S7k2L/gQ+e55QiFkS/XCW3HOd/0eLPe+5Y/BTe8z9DfeoR8FBibfoauET+ssrfJ9uT7gmyD8wXd5Tc0+4Jy3I8fonexH7Oj3sQa5Osn7Gxd33a7rPW3WKz/bf04Svu9/da9X4A5mAt11kv27/9KEcsFNMQPdXk+73vWRL369l/8MyHc229qMspdx02a2i/aM6+y0I7r62fLDJ7VC0+VGI5dvsauyIRe999UEynh59dJPpQg+1BaZ75/kvJJd79sOCX31qqfy/3qQh//sr0X4Q3j/7tmkyevakiLvfS+CO7z8oOvxEJ7t3fpLS1kqMvfBE2LZ4y6N3hxoulbUqZPuv3Bbz+dCeGehHw4SVifv/UkK8QOHavULOLnznefjS3Rj6I/7Khf7Fbu0kUzrxnvk97GrDAJ560x1lFedi9xXk+tT+xsqM1Or9lJVy53PttrWuulJu/2CEz712Lgruc3hSA/3GWNsKX3TlPu/y3v8kIeQkbuf4IzHQ3SfywTFe7u73+95V1uQTq/XXv76p/zesUTdw8wjt/u7lD9hLx+8vxnPssZxHpixEbnf5oy/eblEu/VHXn6q1ul6ae97yPRu8hYTX4gS7vZsjf4i7tS+1v8ccuxXL7uf+6UnaXKYuT7zNd+SyPf33X6Vo0ZZ13P7vd/v3/Cyy9iHafyIu27695v5ogXN/Sfkmufy5qh933l7u+GvITkxeILK3N+hMfD/xcAAAVUQZqAFfBZ5hSu9eYz5Y/MV5GYY8OZwKOhUITawoKnji/L/7hElYcW4mHQIH/kcWEXL3r3CkCd23jv3cxJ+Y6HtFVudeyg6LCBQn8wjaZ5IB6s1fk/b8Tsbua86Hdo+CIflTPPYF8EV8bZQHcmhn7xLeregWkOLb9zKSp+Qs7B4eix88Xz6/NG8v/JgwlvmnDVuYwS1u6f/Lu8KeYm4QsPcl/d8EBCUecsAEX2aClC+VYoxWQea3YxP3o7a2CWqwQX6xen6ftYL+jRBNtHxnO+P3K69dDhB+i5iC0Of3Re8NM83TbYq8Am/qcedkfWULXvpwUl4Q/tE6riVR076zwwZPPm6S3L5ecG1siHTCTWcL9r2gAn977/65uevf8f4karfOLTh+HIuvduLlhO7OyilmSWssEmdM0tKHwnyt5evw2fNqS4r+Ey/3tjRUdkseZrGFUeejGZy98duH1U4Txzs94b7BTE6VWXBwkZlcwNJT2MI5ems/naDODxHrfY9Of+t1pb6VDISmn87dIzehxDCdiTu7XiEby75zH8vv3hQsVvlzL/YHsE3D8H1Lkbve0htKX9dY3Po+MBpTOX6vxk60gCnLOHcq/ODg4uLOY5L9e4RKkOLcKmBz7uv0tLfCftvnSGCkn9/jZeQbGta9xcdV6+w5c14s17vC8HoInXH3uO4yYJw07xvOukHbFyabULc0f8A9tZPym9fWH+73sBHt9Tz+OrxpE/CXgnMaQUbDv58qP/ffuN3XZ0BbPe/gSP3vj2n5UD/UcW7jGM/W1jf+ysaX8cD/x8o+AK92KLGcvcCrTS430npJl3nh3uOmNifWGGYj3H/BAKnOW8EuNFLI/yfpXZ9De2cLyEqhqIq4OIcxKxJMghcbztpxzlUFQfL++o4uHoFXvh5sPRA9+4
ghxwsAK++f3f4koRul/wR/HjtO8SuiobMvY7q1m9y9mPDdTfuREBGvv0rKZevdqdjDv55/TQnYwR6RXVqw06wV+cy1reuHXkhpe2kt2B/Kv2pehp6EvHCJ47Ipa3609YIr6PGTAh/X0+27cFkegO7eHJOloXoEP4838hVXG8ICKxC/l/3qNPH65zaA9wycZOUWyt5TjgXd8uj+sLU2+7zLw3LyYtfLd6sXElffIvk9pur0ngLVdvvto6oJiH7uceNDrNcok6/vfhDzacvS4RLKR95PtfE/toP3cBLXd3+zcrHnOC32dTw/il9glwjwHyQH93fXTQJd75Q279dUSLK992PpIEEvfe7yqzV1rNP9Yy5L933IFvn9pZF6sz7hHkCma43Sb+rr3Xu/kgjuV/ltsfVHc2pFa66Pdm93rl/93u7rxGxO979SyPQ+X9et7vCPijT+7uV/flKEZ87a976RTCXonpLS6ryXe+8r63+gR7vLZfr068J8vu9mEvFaV7xY/on/Kxh8/d33uzmXf+zrJLqxAi9336Ru73v/lK+8n1WVJWaGr6fqjtG1q8qBBl+II/yB3eLhD7nv9iSb14R8mViOr8Sbf5//ZSvd9J9OIO73Ov+vqutFShr6L6JJfTtPIXqjGHafk/a9eEvCeHVrOnvXbjjXtvq4QvHD3d5+2q+urL8Z22WT5PVCju/e9LmQ6Nl8Z+/jge7iB4JvnvP/hTURP/LIBBqhv35WKLH6PissOT1pSIiwUHPC5Iu3074i5f+R6SLrJ6aSyLhPniw3FfxEI7nQbv3SprooJrUhN9xWLxC7+wmKtvdby/5PZPT03UcxQvl+RqKX8v8n6ReUkg/y9XsGchD8f+T6Un9lKk8dUM4ixCsK1X7vcuc0Qe6rUuCyAAAAUsQZqgFfBV5jcfnfmNaS/F5qaST5f/c2W3/mIzyNl/94Y8284vL/7hHe5BONRzYQ8bsbCplYwZn5L/uVhTlGVSyh9XyYcGu1JIYgWrp33FZ8fLl/cIXvcrt7mQSfpbl4vDyJado5aXr2sv05Nhgr3LEjT16cf4S/TfCd45U0/ZrCvmJhq8d37jSbIIrOwXFTa1ldmfGzM3W/phPx5IARSnydR3+2JX0W0T8Tw2Zmr/jDG3J1Tz+t3G8a7F9tC+3aAJ/b+b/IOkbSSMUu69fnCtarfAfc8w/kbHLh04+ucvH9Lv3Du5xS2JTkSfYX0oqnIycd61jif/7jCh1w7KKfGeng31cHuMJcO09EQJB+T7r8RdMJeebkD5zw/dUp4cnBpwb9Puv1SeCnw0ksasxaFkifltJH4iUTvvvJ+rX6pXL79f5Tp5RIJv3FDj3LFYrCy0FvDEOnrdxtH3QZbBMH1bALaSUP3frO6cu0woiIa/Vayf4ed8K9EvSs/Y3VsrCR7ztUpXvn/L6b9ifw0pjD9i/0lhvOu/wgVP2LmDJjyfW08oFvnLEC2iwS+a6WTd4YpVKFLqio7k9VzXfauTuqSFXd972rbZhGMIYM9UUtSpoFcOoiW5dzB9im2vqakwBq1TjuzpvtsJc/vKuGpOQl4KCXlg11bfqFN5VJ+yT8WzZeqZd3vrL6dvQIIzJ/DEkn8rQRI6lZeG+Z9/h2YJz2QtkFCSRJYKQZH+n2+0+sDWnbpaHhDccfxt60HRH3NJzK0WZsUH9/hb45SCN6Lb3MX08QEXcnP6xyPXOu91uOKROlYMb5gLzx46vOx5MEvzxUbZ675pXSuJKZj4dgqvZ/TWX/fCliHz/x+AUesqmYS4873Bf7uYX4eELRrTS/7gDV42IWu+bNxsqGNYadx9fwl6sVRZL+4b7K+nBJjTPwZP2/GvG9C1OhW8vDmyBXb6ouQQfNq/W7YDSH8vpR/jbz33liZJT7hvv2v493oi9why/dPL39/qCMsgmNn334JcfTPMO8G3lPv0VArM9gp43U9ZjfrBMdztfff/69CPghI8/39oVe97+2Ehs+42ur9yfba30CUZ
Pt+a8Q0T+re2reoKTkdOp2T2KeNujy72WsJ3L0Fuu8tQnve99ngpnTTILvX77e4+O62A3w7SJdv7lI9/oos/n8I+C4ZdJ3vjd6O51rd+Lnp7v7s+79PrQqvNXuveq6d8J3vfe0lUWW7W96SVN6K4S7BYR93vu+XfSSP0XL76vaVP1rBpcnrNd/WIpXuefpTXf1m3uEfBPL+pWLwdENuhUqGHdI/t6V2Tvcsd1QQmG7y71304VlvlJS461dP9ZPPknqr+pdDcubFom7vvBEa6LWT18nUZu7veIRl9X08OCPH340vpoQfM/eY92x9z/u93P3sI+CI06jc/CRLy98bxL6bkWCstx+ot3nife7ZPVT/Wq/I2d96TxvTXku/SmYgnaXk5PTSy7TKTlyT13X4S8Pdyyz+OrfpwcN/dcs+Uow3NKmnhbvxX2oymUSntcjNc29NCYnk9WlK/005/X5fTLk9X3cnf4IST8g3eUKeQpYApXeX+/BQUtJdNlsr2tJKCpISUvY78Z7iFrvIaf9ZFiSmr0ZOl7Evb4iKp76b1tSEu/elwr4IoYu0SpaXkFl9dJxBmSkNU5ba1KLskR/5Ny/2kTXhITPPtKklqSSzy8MeGsmZEpiO/9RHPk9XSJf6glqPabUckr3x0p5IZobL+x8m2/Jfd38PfFwAABQNBmsAV8FflGZQqktzdXXi+bmju4Y83LgQ3aivcYTRj8rqHTAQIGKadvXxH5Kqocd4Ol/LvClO5UWFzxofLhnT4VfMAo3p4nj/dkWf+Czd5cKEfhpfjb0XqT0nLotMN5wOFkIQgRZ4IU2lb/4QLlsnzirvP9e4rnxKYLP1WWEepSyQ5Gv3VcK+Yk9ZHS/u3ggIJH0F34OAYq6tt/+B3DUtk6qWU0Cb66h6kJDitfSsPzYSOG1bVMdZrd+0M3hnVp14/EjZACf9adPsJJPUfCNvlFpQ3ZVuM8RaL0rtM8YWCT6wWc3tfY2Xppc2ideLdi3CcYCVF/9y/dAEt8k/afLLGaI+W7vZwm4KiuwvFRoK9YNBbT3VPQ7dmeUTqa4Jfj889K+Kvu2GhF3v1RbhbkgUMx39l7/2peEM/fcs5Ul3etcZ3bzyfdG5f2pVve4U8IiC1IXyjw1BL/NIP5uf0tzxAZavdPkv/djcf2d1cN/HZK24Lg0EYhmq5HaxWz1md9eFYzvZVuPJ6C6Cz7WhdoH5JPRtGvVyvC9PZH2kVuWBI99VUSuai+HrXJGFgu1PwbP/XO3ucrZrIlY/pK6BFlTjbYVPwgXa4cd3neUG6S3MS99e1r1q5iZUwh4VnW+MjAtoK97neBhUS7hNCPHlbteoK+HYnj8P2R3ulQS8E5LlYt6HfL9N24Ksi0Movg33Zx9937J/RaSRWDDd/cHGLrw3Sn6cO2uQtuSUEfZ+NCNVzhBf93b7N/UKJ476cKcFfkgEljgYzK2IEmqkII2h/nnzW9h4CBWuRe60E6h3LST2MKe5pd0vMPvfZV/GBPRd3jWN0QR7ErRaGEBn7rv7d3diOfUPTpf+1zXXx30vmU6aUucZc0fVluNQnd8JRq14bWJKfqquhpkbtTMUOhTflZS+03iQNHqP/YTdUDSOreCF+cM+sETfuvDvcaKghTvaReG9XP9Wb3y+TzYQF82Xc8E72YRfqC0Ytb0P3VdHYrzPj6f9OG/kuCSuv9ngqJiH0j8GH/6rZx4eTAY1m+7ylIDovwdGPT+/Y/75PVI/6Nvek88FW0PUscpDBNUPf5F+2Nryrr2wief3d33l4RfaYIzOZr/hfQn6Xe8OoXVc3eCUpgW/ZhLY7dkoxkOv1q2T7/3Rf95d3IfulfxEEXn7/ha58eWWPF2a/3z6/e1r29rhJ+WCM1N033qqO/drc7vdhZnL1avpPErql7a7q7o6eN40Xyx8S/9NX3/l9/2d4zTCHhAVy9NsNu+/URrI138v/eUTnZfRl39F934yIhv/9T/1vO/qiffSlJTfV42rwn4JP
Gaf/RH+ry9vrk/Xy9CzCyetZq/r3gkM9/baxEZd7u+YF7x05eT61XoR8hbfHafBGYr5n2HwVl3d3d3uWKVfKd7qdCeqXBFd8gtfkny+6cvL7qS+v/fJ6e+I/RUCzxXcvwhLG+dr2NljzXW5Y7CioSL+JH+CrDLrNHxXhE628gPsICnd3t7dJ5nonpJJ8rRRO71uTb8nron126yAitnH3YMgGtqThAjVB3MD1tve79pwmtcRfdxrsn/l8mX9NDSHPXfYnvV0JILyEE83u+kk5F6ULYiEySvy0StV/pJtIFIlKWHDsuu26VFa3l7acnr//qFCtO8xF8uW3nyCMzcFNtSIiBTueW5lpXssKsn41TJ9L/uWKXDS/ZMH+JcREdVyZfw98XAAABNtBmuAV8FfmGVJnXuLjL6+Yq9l8XOXxPuHdhXuYmXH+LLmwZtR2JFDPhEiRwYUizSD0Iu96MftII/y4XzleUNHcdCda/89uknteoRCPgcO/LCWfn+r8n23Z7uCORpGT3pV8Ze934CD3rfR8ab7xvyylLVFrL770a7rl9/Ihc5d1q+Fn7hEc7uCR2rPuATetvj3hxbQDG++x8vt/gmKc30pL8k/OYNbhqR/rC3VdP7hLSU/vdk9c//oshamDut9fl3PmE/Ns49dl/vbGG4YFcCWwSQaxnXTMe/TtmVXINns4dl97acbRtg0j9L8WqWypwuhAd7bWuq+XBqxelo9QY/+L2tLuZFsUHh1Ov/whtLnW1wnHpe/HB7VAGPeebO6f074S6eLXTlYKijQu3a8OOCsZDzudcMIfm96Z0nnhrD9pSL8sy/H18vv+DA9ln/d2/1L6vG2Qi7P1lxsSBY6VZULJl74ZRb8KVgeb1D5C7OwGPEDIUeGT81O+Cfa5G9YRcq6/L7V1YfK98QhFoNNlkqXJXGv4WUjCb/BEMe3DSA6r83jq6Kwl5/vGeMnpK/mQu2QLlKp1FKF8i23EiT1mHqdJw/WpYjOuGW/rxW51/8TPX4agtlRCfW5V4JBC4bUjkH7p0iNwl4KS3L/J9t9zXuCeTD/e+t+5MD/qFRClIf3bTc8Z+pqeHxl9+Xx8fEjHS6/eCojJD3jR66n3tpj8cejA9hjpdvFLBMUMqF9yvIv5Ne+qJLNnHQ9K29jfQmWYLzL9jZStSy7SCmRZ+7ve7ornb7CJpJFo/w/boiAOpBL4PdnuCbe4NrvVv/2+sOCeaOF7r4RXJaEXHxeVqxpX8T7wQy9Gul7te4wScL30zvO3zBrh0i38v6fIINk3C+7qsEnjZ5fr7dlGe3N3QiWCfMRKxvezVdiYJfr4QtQbjWi2kZk+3uto19+mII97t79cqJCHgmJPsQsS/t7YJ73vfl6lE8u266sa1y737hIXLJ939eT6yfXt9NZEhTsn136u++zrosEW8uHT8Zd7vu7+79pnKxuEPFCrz4+mh+xnP585/0nd0XwmJTvyigaST9a1+v1BHy45BZf/l36vjBK5Prrdst3f8L7pOGEOQu37pn/9USF17XThHw8Wpmcka4XcW/F0WH5+svpP4IjKOq+35RJYQqNz2Jfbgu3u73SKT129clz497GsIy5GVt7vmtq7dr6PcsN9fuIJu7v6aFwJtZot7T9wls/Xuju6IsSdoJH+felak5CO7wj5t71t0JEZdt72f3vjC3vd5fzw3enr7yndLq25Xum+32++6/r8R272YqN/4s2qZh8DX0y/kvTG53CXhPL/hD/HSGGfu+XXe4ftr291OVArF3uSX3vOmnnNcufikWUq99CF19nu98ntagtiv2Wa8/dF/pTRxncwL7u5WL9bHk/hMvrfkFQJWvp7esLpJixdb3eFvXqyDj3e8buSK0ko4VbE90M/X1lu7+pL37wSXe6ZU0kMYLruXd0j/SLqymkOxnCvkMXVOz39Q6Tiy6G+lo5eR2SywBLUu/0+4KNm4YWsawamTdakO7on3gjpaad1fjvFad3mzk+q7UaIBVu7
3drcODwtGtSQReMqRMn7k/wx5Id2CD3y/v7rG7/XyRGQ2XMf74LIAAAFCkGbABXwWP3CIrYMdakAR5vaROly+rG1ieGJrDeWeJewZS0a6m314u8x3e/y8YJshfzGsUEezelXuM5jwIvxtKlk2+7AINeDf1BY9djvr1K9mPL5LhaX9/COd7vJ3DbCVu9APiob2VjPcoHZt298vtR8dFH1sntNO7Ti5NDcXRffb0HsTBfM37hLw6aHbaU+/VdBEuTy1T6NO/LBhve09OP5cal/3ot7v8tpRuNwp5hWaIfwfL/bthEgXFe7ARdfgVn6uMfT0RAQ80j+W8pf7psaXXEulWtlwHwTny042Fj+v8HCmq/pItodcDb1vXYC9lMnmfIu9JVjr8DvflFB+ram9Jfk9pJ7/J6pWv9HghxlP2n3TnQIylbzozXl4dzaheCfijNJt6LFkCSs8QHF1cv1PYUI6YhpjZFVj35QdvWqL6h3rs0m+y8n6N4uOT/QUw33h/cIFyFMz8BerDkZAAg+/m/6+Tyu0vjJcv/wgW0W1X+GWAmA85zet8Vj12yPMv1Ze5Q425bhLu5t/xRnZeHpaSOkSe1YlF+CsxB56UA7VnpNAb3LeZsfL79Yst3DbZLrRJ4S8FhOViVftlhs3tUFLvZWavpAhXsJi9bZmz9Fvt+X22hLLBXv3bIS8oqCX5nhv1h+vLcfnE32Mgq36id4j34SjF4x6v9i2CznTIXRtUTy/MIQZXfmrDeig8N/IcdWT+uduJhhLs/Tr/3/uMLcbFaw9MpD6X+3v/yP2NhaCQXJv4ufDBQ9H+4/q3Fd3d721RUPuNO5B/1Lq5bLBEvnzm8npJa9hQ2Pml8wFjol/jE2PeYxhCv0PINKqYIaLo371HiEv+V8Jar3WCfIuxv9mrl5GpOPryIX3UIb0pnR+3CZvMozKPmSs34Ji7pZSJlkEt821PwRFe4VYzDvFml+1S2mJlhM+7vft9N8it1gsJQXO29AQ7jLU2w1YJtt3CX5Xb9v9H1CPguNnzmZOz4JLbnGt91rd+YYzkb+Hxec2+4WfXw6nzTp1s/r3BCKPFOk5795PP7af6xBdzk5L7WhHyNX9oJ/D6LoPd+vpGu7/lEn/CHgsHJz3vn//4kqekfva7mNCS5f/tHTrR4mfVgZ+58jM0qJJffmo0GT1rJS0WLeuJ3u+/dYPUI70kN9910rD5T4zTCBftdsFIhOma7nZ3vfeyFgltxU3kOOxk+v9xYk/n+8oL0yiiO+73vIt8K68s8Jf0f1/k2bvfThC73vc8e+iyd3rkc175f/x2793e/7E3OvCHhERCf/342g+nHafRHyfVdE/V9Zi3v0xZbvu7qxP3U6Hfr8STka93tsSlMQNu/YB35fTCIlzSvn73vXyQj4ey+iih4R04T/xnEYhFeUuHfv9CGPLCG43p3d3v1lE7v2lSh9U5V/qbeirtQgfd523b37evkC/Lm3bf6nPtcn0ltPQSNlh5f0h0BNr5L9xe8/3eEvD2GnDvZhdMbUQ2vf7IZ64cv7BYIc/8Ji9La2BtQN9CImT1zJTKwRiyf05tS1Ccz++/JLP99miBJaflpJ6qSbieT7Wt/Vki35JyP9OmbbvKkYUjJ6VXkZodJDTODDT+992p0lu3/7HmS7hPyDnAH1tyWeT6+P97vl99FzHe/bffu0xPvem9JRMkEe70qk9PIlLJe9wzT9fWp01ub6XkbKf/8L+GrSWnXh+/z/4jNRvJhXvyfX//ujRN/uMM/v4e+KgAAABQ1BmyAV8FfixnD+UWx0j+X/3L1el7mJd6euUup8C/mJw41SX/3CPHSyjwk/6LNC9gkbLVb0HcbD490zL0Xmj1zWhZMuNdRnx87xcf0Zfy/5+Cm7+feN41JDX4ztkDzSUQcWsgxgMm2F/d6TLPBb4I/06hxutKpfy1tl7WstECEr5ru7GMtC/wnvJjVYWL+9uCAcK7CrxIlyK3KFzKwZ0/yY3r61
mKrMYb5mH994JvVwjqY4LlSFF9kG19houC3sAgXu/dLgR/s+//cf8haUdzh5qaTxFly+3p4S5Hz9g/5ef6cT9OvW1Chf73CmHP9bIarthr6nI+c2eGmf9QQvsbxvLzD2h6q9GRsn200d6jTPQ3N5dqqN2xSwSf4akrLcT+84KiYWwJW5u1iYXe//uMKkcldyLse7tLtYxJPGOBB+HeXzaR9BOY0ct2EW3i1kzIKyerV0d/RbLz3SefSoss3B2GivvUsEwg4XuUpD6XlASHXM4I7svv9AoKRBD3nB+4J9j372coS8EZJpXdv8K7uHZP7h2CjvHWHvv9K7gu+3ZYS3OOaV3D9LItY/bftoIzrDb9I6P+YYH3+Dh3ub+nPwQdw8RX8qwAtOuQ9PeHJQG789tDdflUcI0Wkqwep5/SbuJkP3whd8b/35fbTuILMGoZlc3O83afrUSrNcvDaW1YGd9eaOxH+4owZcXwBN7+R/M0+jzZwx5s5ATVZuViIekji2nhk/XzrMLw33+EfBcIcZnb9TkbgV3xyvLvorFa0f9gaebyelXkTmz/3jSbsHJq43gig8F7+63jJf3QQfZro9DegE49E7t0juZn0+KQ0q+I/ck7PZQ0Ib8sDzwN1Fv1w/2X8nvVW5YKyjr8+9n4bbyVPwYGg20a2IodnDxhSWmTc9Vbyer+NuED3d1s52V9zJulJ6Slllmi8tvpP6UKYzXxK/85jfH9zlntCBPs5JF6daSD5tJM4K746uvadhVwN3S+1zQVlhC6QevS9d/cDnd3ecyevXqYpu+EfBMII1sj1y63PG47gou+dfy8swtnCHlP7tPvL4xd4eKE/vskaHp2l/wZ0PR28NKycy7NzYfafwSjrcowQahbrsL2lF57L+298vd79xJXc43hH/T/UJY+4Nno/8M73U3T/+CO+7eojWlexSnlqP2vlvfL/l4fvfz5e6/dXCS3wgS7vzcufSpiYSPCy5Ls8OkTwdpXu4Toymoe+9bWJGvs4fu8/xApouTH5Gv6PEGjgn9e8vrwmV7932Lsm5faZ0eWVN36xhX3u13e79IgSy/d7+hJz5Q4zT4Q9DG9wT7v3dz8glq9WP936j9G5+729mM45ffVXxvfo92BRgVvJ67+uT7/8Fm96T7vOmsnBIR73env7hEv8nclCPVlLu3ponKSf+CPd5hZf/o3js5o+qKhIh3xnFXRPr6TWElnCeJk8uXzpPib3d3d2qxurH+utwRDX1RSuTutoxnIP/WEvHlYbf4/jb/eU6gsM7Y/S45t9aFxsrcgxm+2xwu6bu7txO+UWidKImvuCc2r0/Kd75P61zw0TGt6+0e/ubmi3VO5u5q9QQ09xqD8EZJjcq6Tf44kAna7SgYQIeh8uUdvH6PYUf4yN1fjd32rR/fL8E5N07rb1QtF5eXLr67GwREl+UGvSe69if0JK++rVeZip/uHbJf5MLLbwmIu7yZ3q+tpKnrQIT7uRPMyFS3tVckVu7gSIhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gYhS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeBiFLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3gghS0tLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpa
WlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4GIUtLS0tLS0tLS0tLS0tLS0vCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXeCCFLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd4IIUtLS0tLS0tLS0tLS0tLS0tLwAADdCZYiCAV8mKAAOKMnJycnJycnJycnJycnJycnJycnJycnJycnJycnJ11111111111///wXeAAmFEcIGm63wChQOU1oQEeEj84Ba2lPP/u7vw6IgATyCOF2cOKFpqPMANIgQwqMkPMjTYR88zMDUXZnDIT/2wMBVFRFyQ8o8AmLIC7aUBdUP35c9jxADMgEdpYBJkHUp2iKlZXx94+K8ARMxEOpH4OpLDpP8aQdXIO5Ma+G8bVvsdADZ7iv8LXiuvHTinnkOOSBqz8zMW1xGr/eFB8gIenZ4C5xzag4f/q41UV6poBZeFcZqiuRdDHjGTxnGOzAUtE/G5RrdEamCJMVoVuaNj5YBeBUMdlkzL6n4J6JOpRpzx3/+qKSk74CvjuAV6HcQVTmGubuUZ/6mc1NVdYHpiT8oycn5Q4VL+vkx059tzV9dbuJuoL8F4sWgiDljbTNEnc/O1Vo5ZuG7DhMLZqSekbVNZRCesGjSHjpt2av/VfVWHShCJT6W7sJpWbFyK5XyVEIJNEBu8UFh2PBz8dUdE//CW7gt/rrrrrrrrrrrrrrrrrrrrr//ycnhAZwAIZJMa9CI7pwAEwBGoBU/XeoAY4AmIXwADqQIUT+vhLxqtXTgZsAzKLikgD3Q9pN3Bbh02FpyhbZRrGreUxjALt4KbEEAiIfgTGbDQU+GvbADbWbIgko8BIfdfJ3c9y3YicADsgIKL3PR4gJqVk4ELdrEyCiTdPSZzjT/vTzDHw0mP2gCjgVAVgHiDIO6tBZMg4R7nmZsofTAMYE/9OA7MDFLquduiDdNt0REgRpiGNQB0y/zIB/59OImSEtPGHr0rqG59X7+1z9yfQ/l4MPmiivtxCMZypsVLCj46Og3AoGAATwOokmpYUdr1mlHfgCSO6I+bdsc2+1H3bN
sPoNLRhUZJciNTXMNTAGvV03xtdnO324BZ5/b7tUrNgF2YiC3j8g/D/WRdIWi+0D6yJFaQa4dl+KoUnp+Hjf/bRrZKDgOx88pXe/QSvH3bIHTk4nkBJW/q23PArASAAUgcDXr/jGkv8uyNxi6+BA2l7s4nGzgokTK03J+aULFZoz7QGHZyon1Rub5723yDfCeQ7UR7UEOt7H5LerG2Zd+E7pTxNTV1mzfUc8VIbxQz3zBW3ghoQrWNsX6l+HuCtcG66xcXu4i7iKrhiXeUYz5NBpvihuKrC4crxvhE/nJh6CI76KY/dKpyOgOBo35EA9riQR/jEOPVFRm9pj/arfk3q7cbY+AXOjmcOSiXkEXuhbYuYq1BwmQr1lUz4e0O8p/uJNBEnZyJ49z4lUXl2OrFZaiNOmuiogCxpc/M5IZohALHsixY5nBUjPgMeq98hP+9h6bFNY6m38vEV1D2ezJGW37iNb+xX0Cw2AZbszCKEiDz9k8wF/SvXeoRUJPQSUsBbJ/M2EctiQCoThulKkyOyQS5Gm9U6g+G6L3yhPNSnNuZjoTY3XQ2Qh4lus91duGb4eM4zmLqV30xWIi61w9c0/jznlx0f4KpPiyUMXPo61A6n1LUG/5j0DtTi3isJCyygaQHAzk4xmsa5TErfpXvqqzqZq1lVORv1pdMj3RiHwyHzZUGsBMkPknX4tbLTIOeh4+knsHsDTZmDQHGEsaHIwrhbjRI63//mfnYLYOrJpakG5lgS0gBeVFd8vqJrrrrrrrrrrrrrrrrrr//yIsIDuAA4UcAX0ggxmBqACY1OG4ADsIazftaQoEMDUStHwALiCKCxNcO6qf+7PNtj+KXW0AMBg3xsN7XFvdv8FGQ/gppwQTikBAAhjBcBmM1AT1NgkqHbFUAgsfZgQA3qZ5EY/pALSm+RE272AAFWyAAIqYDeBHR8WWkzmzYNb7BIYiv+028+oLIzfsjUIBctGX7//8Wj4gGEg4/JYOP7BgmYRXp0Fpoj0T5JWkefoEPq0Y+/WLOQF2b8Lz/NhSzNzIjL393AVgdr1QiJ+9+JXLyKfgMTARimyMhFBGoEkGABEJhIABhxid77BG1YPkM+0FxIDHKAll6nukFrTdG3KDKX0TwC5yOj1nPs5RKIMws5F4Y6yBP+3q+KWyK9b102HjAk8Ul2w0lPOEPNb/Ogz9RMN5ePqWzG2XqIoBqSHe6Toq58df1wUPXF7jUJTU9Cf6mWzz3ie9v6MUogABARqx4Bn5pO6m2B3OfYuhr3b2EAbECEo3vOCqdarPkfyhwAnWfZc8YybePzm0uPXNRV7iF4OWVRVfPWyYycl8HPeUaUv+4mnfyvlfO9ztTdhAVB5sIlUL3WAhkIbST/D/iSImr7RlKdPrgKvZmwUMgR1pPdJTDi730nP+YBQ3YqUyscELYkVrnG/fH9UCORgsngLto8EmF0bIvCxIofzyl86MzmrELoAGrkvBjBq+ZeFI9cZlGCe03ut09sU7KW/qxHnKcHujV1v4NOTX1nrtNV7vo7X++8dsKd/wGamTPCd/5zbOn06nrIg4VOvpkyj7FF7/f1MABV4FjhR516wl0MGDNw0WrWu06A8I27Yxu0MooE6hcol600dAa7RJM3bWVVEwdFtHGzwRZJgPd3eqiPdALeYaM64hl/277Z5o6EkrOX5jmIbpM05oh1UPSmr58iPLcFQ3D+AbLKhT2gdCdI21eBiGaIEfhMnIP6rseOSY/TafXcXVmgTirqRrx/mGpCUoEDH+701tEDrz4bz65mJvvhS2rGUb3Dir9/sLstPPkngjrpGyPz4Mcp7nX5uFmnmL0MABoInMlZT88CXeXhMugJWYMJIS+/2PB3jWaYIpUde9zCDkRxd3d8yuyOZj2ZPZVM2g6Ce7enV5gvYO4AcT2kVWJ970+4X/tu9ITZD2sF/u9ao22MjXjZRSsGe4uJDfDkHA6J6id7y1Vagx8lcBlqKGj4ehObUfUr0Kswk
8e5oIqiCZuVWDrtdAFbIjPWAp9XQqzdFCKHJGbiHeZ8sQeTSuVylVqyhmthde/7euSYtW3V5kZGCvnHAhO0GwvPwScQVJvbBqwCBiiuxZvEHwOWTW8p7F0VLmpzZSWgvrmJ2gydnb5TOXMwkpt2gWUIv9sACbV2/cj+XCTQ/91UKwX2yXV1JjO9J2/N5h8lBg5ZW/R2+Plyy9Z12UTmkBV2dEWbXwwnxobileWik0utGMwYpP9XVo05voiEct0mVN6DVfUl7ENHWVwRMIcBciT45y58/qac7H88NQEQ3W/KbBC/mK+hOn7o4YQmRBU3g1Ix472ouuuuuuuuuuuuuuuuv//s2HBnAAzb3ym59QEBw2W1nMmh6E7D8yAA7kJS/Nv/f+CQhycoB4JD7pBrH8EATnMEACI5/AJ8hHBftuADsUzjGIKO/WJmZrnYHcByAAdEm4pfkDtMtrHCQKEiH4cTJtFT94HVCHVKukWJx18MIj0WAAf5T4rfGH/2+eCGbTf7djk3PIkwN/396/2uKWICAGvWNgAgGj6lfTVBm58LMbbaJ/Al+trjH92Cao9p53NGBTkpG7AGKZUygnCX/uhABCGPCAAQChag0mOqrZolSo/9oE3P497gUrMRmMnbN8cw6LkRixVfFatMC2dLtuSS6pZgB4ZBfvPa5NhpxlXdIrY3ttb+etHCKoJybLKJWQk7qAprokWcbgZOjENqynt7AKowXfzow4xMXX/SsuNY0rnqXY+bJw9P+md1BL/n21NS/wfyO5A/v9SYJikB/dIp09c+NI1iBf/jdDGBouXHxYA6TgQ7f4EoYOEjMme37/O+eZgZY9l0E1XvezjmSZEFQ/sFFYTKbFNoqY5eWTmQS89FZC/6S5gz/9ZkH71dy2t7/YcVMbO4aSHu86/jWrN/g/prPrpFTeBDF//D+CRgy/6tOsgi1qsWv9xomG3FVyaMPd//+LZ7QCYLOLlLCpBlqHmjffsyynAcovpvMZdTafAsIRLY2k0/zaBF7ygZkAoKw6czu0gqmCU+xZTqifcFBksMVI1flqHfj5Gfc+syQlJsisHf+6m2eEIhf/fJRF+Xgvb2a9Be59HbTvq8ileOl6YfHBYD8X2kptQJpiv/dJjj7gc25Y8qq5erpY8Xv6S+pmc79A2QuQvwvKb31rmC8l8fZxZhe0uH/wNZoUMqpCVWaxbEw/I+e45wk4R6d0zWsBYEEkMZ3nt5lLwJ1PQbumRnX32i2/XsLRlwnHmnMJW6/VlVtGEf7dFGI9Q7maEkkNC4TCeyEhr21fk7wz/eq7smfUJR+TpkXwjJ+W/tYUwwgKAFeA4x1zSuawtoNqY/o//nMc7isRI+miMfiWco5hYnzlvWsuABhFXmQbNQw/Ceawh7g70J7XUbLQi6v9gJ6My9YKREEs7f0Nbp6eIc9DyEYjPuLvxU19uLz5f7okye/7bTl22Xc+N/gXzz6BcmAI0hqsEzK/jEnD1lwWcE1Jxxsee/xKHjBxQbg1jWKgRpu2H6qDhN3e2wOPj+n5WdGukImWamk7ol+TmnNSyT4ewETvX3aDOii/PrnnGu1I6thcPWUvYsXmZyq7M2/Jv3FcBkaiZCxgMKrDBS+tfyqAIXCPmy7uxANZ/9YpR/QNZcVD2ks/UTZN9vOWsbzyRdICN5ZWX8PAkFBx8vNfbq+TuKmaD6V7W/J0u4B+h5s5q7NIPSEa9XfQ3gEJz4TuxSt3PwKlhlI+4uhn3WIpKMZSbPvBiF0BgGUq/KudQT4KXZlctAcf8j0AmU+ZJrvU94uJt9X08rF3954awG/Zh5MZnTd21knA1HkQ33kTqqX8crBoDz9SFJ5aaMPko+GzYA7HZMhM/Tw/Lo735yruzkeArIsUOCLGyiIwEEFErhPS7sS+LVjICI5sW7QcfO0YPscFs31RDoBICKyB5QZXa5kFl/E1Kbf7rwfql2tZxpQc/k2hI0BvX4e/Efe7X0fG2BBK44g1bQCuN/s
uRrWFjfn6PA8gmPUdn4vUB0QBBdEFzzaXxTqNFU0zKBZQNutheuqV/6U+wVDUq5qxt9T2uuuuuuuuuuuuuuuv9rbW/HcABmwTbnDrMFYXdWAtGd0qAbJs9n/hrbiOKgP+hfv5ThXkTCprlABnJSIBnUP398AERelNpzihMb1CgDrrJE/5cADgi6B6YWWNy0LCCyzwIR0IGQg6CRzYlBLb2QTFTA/bJgkoR42jsoI0vfgH7N8gC9jtP22zAcJM4H5D/FyOvNsCMxZDRZn8/P8bU9k/1RWOn5SW//NEYAvEqB8G3ikyxX3Hr5/JKxetpB1A8JLDakd3x7qjppLPPgYEWQICxR6nRt7+VvORx33nV0VCUzuJOZe3N8+PgNCZGPca+mQmnudXq7Sr+xi9dtGuSEyu8D/vDrQot7+q2Y/b0AANfeqpLjGojvvwYhOq44P/+wGgpZWOO04AaiDVgJqwhZEyVwElsGdF/9Qov4BvmPN553w54Rybz1UUonIVZsn0biHwuwVm+dan6fC3fsmPiyOMkwgyT/2v1uT2T/Ldm0iC8u+2NGD3hghFhNPaXDVKOrmc2AjYtqcXeYr3kcegBC1OWJfGa3Ro8zDKY0GIrAl3D4p8io05NzY+qrTzYqAAem9/GbMApgScRgw4IA7qEPoFXyCb5RIfBtwyCokj7voois+MDODHwgfPHE0xKWf1m18UhfvonNUBQnJwN9xnRkC/6Tz34h1pnIxw5paw4sfcMuEHqf+3qt2DJqzfEDrDWoNL7aRadHgoR/c6ID+MvSD/X/gUqkgFCU6+0kBFKcOsyXGJW8Y/TlnKPMFB9SyCgFc28vlJAyG6zIQ6phUImNYmEwHZtoMT9dsOWYOMgSpbeB4aBQZP5T4M2Prfo7beASf7K/udV70pfYqObNyoH9ZVfSuaOUxVwVzUn4bNoLqabC/dalHs4ON5/ujWLn9zYev/7amH1SUzf4H6Cn2gA7WP8pkeuriExQRqS9lpYQ5z/VFVpFf//z0ZbAndw7D+QPLaE0OmxSDyocvIpq3cj9w4EQ45y7tzuec5543eJaw2fLrSSSTeGJQBDQHtm35eX7wMnbSoBffoFg7ZgE3tMmUBQCvf129ZiB1BDe2DWQ5GgH+97IpTEodriw4hSa9XtDYU61L25gZSd/Wu/6iS8BJ8HlJzvXsL8Xsl3YCztGZPXgiAe4gFWAdx9yI7xlD6WZ3X2lyA13f7we358HdxYxNjIaqayfy0SBCkUkkbsjb/e/hlKLmMQuCjG073YeyOeq9avq9L8UwZ98j3/+eUi3rK9aQIOKGxH29o84rlC6WZX53oG8MiFovWx5Jxuz+wbDoNguxhMtg3MPzDjwChCXo3kYiwTzzXA+FCt7PTr9+EB8nNiphxRohJUf43i3J3Wrnj7ckaKc8DXCLVfXJv+K1UUMMLyf2mq41PfIAHzu3iuAVeO9MBpFOQkfbSy4Fh62dRL4D0F5uqrGTQ7/UEMYC4uZu5Pr0kBM/WQQXJtYz+ArqF/HpYf2/k8O28JTKuvxOL1Jlti2ogmeuXNf9UW8YwVGTj7D0soRdyr94obEGWTtWET/Se9/79EfbEfkXf/+/mBnFylEJELc0tD8fQpOJQYiffhodiFRH1PexdKRjsDc6ZqTGw8Ww36TbMT9BbsSURHuiqhJsZdb0Wokp/9+hAu1x/wUlhWn3QoFvNz+0mH4hixxG/fa0nBJDBmZqMrBymKUIv3pRDqDJtJ7a5dDQdLkI8xTPZZOl03hxDGWhF5j7BtEwWORB562GzMhRMYBx6l/1lkG2oBUac/+60pHD1JGedTjQbOBZrQOLggAY5AgzFm5M6owjareZO/QDIDXMJyQ//WV/CT3VI4hz00aN3oxA1CNgt43zTLjgNzOEdVnAMmh6GTTdYyVCp32qO1SfFcZMsfpxqqv/33UH5yuZkzQ0w/5qN32dixcuBBdhAnggQUpXlPY7es28a/uzlFio4+5R7ogHpBAT
48XwDZw2tMK1wpoueQVSNmwzCs/CQWOBjmwkgz/DfynHJ4Jia6666666666666///hBdgke7QAebACRO4BwkWXSJyl0AcJSfAjKzz3AHaGuJnrWws1oW4QAk0GyqSEorW8wIv/t+8kVmICMBFDCtRfGBBCCAut/K0dCMzEkKouHidQS1zJeSgEl5AhIN5SAR7EBBM/zNxhneH5mSfAqDYAFaeHHYWQfDUMBGKhkAYLggPq+d9dI6xLQSI1z4CiB57sLEgCJY1iRQsRBCx9h84cMXcLYQNUoSDkw5rSe8kArC2e3P6rSIT48GvvTTXNsgzCypRdXWluzODwrX+sh25bHMJv9p/+XZYJJQqe6bj+9L6w2TC0qtS1LwAmYH+ryHzglzS/bM7lb0lZHEkDqoh9iUw8UemQFvqakeOL9+r+WmsJx6zrufAmJjoR+Jb0Z2YAfUSCLl5f8IvF5sz6N/gsUiAXyGxJRCtJkV//z+UpnjN7Ha2rTIygzH5F7/049Wuv8JDrW0SytbZhuXCKZn9V1P6zAgAIZRAQJ+4Z3bNsz+Nz9P2K/LdnaNSpgO4ebOZ2aT60QhJIUvT0VW4wux4EXVSe3gl/svCYmDniXYDjv21BzFGnNQz7/1mE9FE9aRsev+Wrx6Be3JHRl4AnQmQzwVaWnqHV1q9DN+y9tlwvGb33QFzYVGnyfTB37nIF8qBaO8k8z/tuP2wAnJwKJ/bLBJfUBSymw3feKFjR3C6NorFyAQAVIcIBjkfrfWWOVU1kYGYIO+NI11L2osuG4QEdzQjwCavQXTz7dm+kAit5ialNScceH0mFFQWE6q7AM9jUpobf/z+8evFxL3tsazQCiDXQ275JlDfyIIScis3YsSMzyn/v/000Aw/+c5Dy0LosCIY6AOSFttGP9HvuzfSLLyXsEe3I72qRwYUZBiaY2ckYZ7wse2bLgWPo9EyUxIFmsRs4XOCs2Tw0Lmbk79PVr801vwgHqMBZoMjP9Xn3Fzpu6mFS9UUrSZ0fA3RdAM+MY7gHz7GbFjd7/vUqWTZ45wQpYf3u3hEtJfXNTcRf3XIIyUUnQ0BAfJFTElnRHVLTBbAHWOMA/WQoOtqsg6+sfkYACvf9aUh75uGYfXx9627yGSVTrh+koM6YVIQv9aeFL5y053M3VvAplVUq1Raaw3OiK217Zs3AUEgj33ccLzSkEmFJakkEWa2CQMEDQoRwQdHsiAh/CuOdjB9x/5DhkgHrY5ubGP55PGM2YsmDr+FreSjIDSJqhc5NWJrAnkT/PDM0qYMD0i5flwKFj8mEuk1YQroGWvf/ZuI5JlIGm9t8WwJi0VN3/vw3wvhb5v6Qz/KL3SFoKiOfhnv+50+KaAbv4MkoM5VOhyRtTKVcCcaCeNXdGp/4F7Mo1J3RUNYgquA8okStWJ9RnLKN5G65ahtZSNtQpYwIpkTRG6ELvHaqdoqnDdwVE0oFPABsHdkaJ1cqf/twWMNe8fHCBzL3yvOwYtDQYVpZsBWJLkKKuPG2FxdInDswxgA5lCUJTSzez+UrESToAQ43cCOEbKBb/Cp0swqvpiAwri1/1yhOuIL6RyPtZBLmjpiYeP74FvAZpqb275yd61rI/Wgbj8yZPDjug3egqAy85mEUtLFXfaB2GDWgIfesj7vebyGHXG6Qh5tYxz/MpImUBNiYkza21jeWKNsRay4L5y5VMq3VpnBlwve/2O0iefvbeu3Ri0ZHLas1vo/n6u2YQS326fGtrI3cjb4u0JvMqbBPtyMMzHUJ/Ob+abd/CD1zzMoPQ7SLJAL+0AFcB88AYvtM3voweRf/vBgoyCv/su7jAQ13ASOf6+4CTj+t1WmynQJkYExJanbBNzas82kvU52sQ+0khDUZont5b6QqBlZfe+9exkFTEjYyS1IF9V33no0AyS/bpgv8pyHjGF//jcgA/k43UR0KlMn7RfU0giTEEjRTKtMnuXt6O1Gx3rnXAQXNZvbfiY69oAiWb4eAgv
DE3sqoQ8hCVTYF2VrxhnhkaWnWLRuK70rNWLN4yMyXUNCANGn8lDh/LAxHnDzvOeOY///o/v11111111111111///sODuOacGOHPOhsi3+gXAHZYGDCelSYxv+hDYAQYDa9kS+/QcFaYKE5ygxq8FYMwQkBC0djTQ0MoVNNBNBB6CgNgI7v+zfvfYBNjAcP0cCfMnIgMPHabaFU8qUIPEyhT+XsBN4b8ZKZH3moItJOSuGoeOZab2YThIOESHYcj9xkCP1Ksd7X5jIWwAuwuJ1qJxZDfyy9Ti39nvU4b8K7IBp9marp3kSfC3RArAzZY9uu+e/9qBnydajlLwlP5p8cRjJFwGLA8R9rqBdO7/gyopokKJu4rv+fo2BT3amCAdZymjUhQtCiin/F31lcwhrh7wXVzpP963voBhG7mWNPHi6yZcwBjP+2vn4H4RY7/cModGjW/safgmIaGECKDQKgMw+IEmaQ//4trhRZTUkJ0zQLwZ6vuonxZpHIKj4mHelpX5sn0gszM4dl6ziHryC7kwpfX+bNRIAtX1mmjEqHYxSBhl9T9rpv+gFHnpYiKuLKM8AKIiUorsTPAOAAZ4YDBgV36x1Rv2OMk+gsfBFgsGEnkKaio0lKmQIJ5YeP0CejpiJqjmUnq5Y7cBZTGFl1hm/3TsgyX0I1ru79+eqe6f+1awvR/WlJGf4tU28qmNhxsUqY1JJ4AnleHfiv/CHUBCFmdR3jxM3iXz+0Xjn8ICYlCEMuNocoIF/HhcbB1K00lv+yKJi6EOAsGVF95asDpy8beE1ivs6rOciDv+qEfgE1O53F/gYldbQcacJtA3W22/DkGuxRqwoQwBoqlet06m7KMryl1kZNHgpnBOGob2Wvo/q6CiO0FomW1xBtcrwqmBi8o//3ttPAac6OU9keuoWt3wJoliaq89uftXBjz/yG9vuef3/kS7/lzthSprmnnSdN3uvOZiY47s/glnAahkiql39vujMW/U1dreNiNaixeDqZwOJM/mXsF/BH3pKkSraU01Hpa6jlmNXDT/r2y8sZ9/nKXDnl4fPO2Vbfw2uv7ZZmhS8ytyA5MrZdkqTuzqY9dcRHVu6p++F7nKijcD5fGOpq8TPziLvwMgqGME6NRdJC+XRVOrrNs10mTuuRYqIEboTMRIKeKSOQC/aeFqACxtPht5i0S/W+dAk/3MnXTZq6adcFzLKavQ0t6xRY69AkGu4G5lsnUY9qb+to9NAv59hlVkCizAdPro6tw9p7cf8v82cjdkRwFBVJOu9p8dqiMln6yBjBmMZ5ATCp+/95HaEioGLlmEdvjLBAHduk4MMGOxJ1T9VNw14Gt0tCv1rVNfdgmg675u8NRsvTZB8qy2Kb30k9I5pk9eGE3+62wnlMoq2Rv2LaIKM2QeD2ESNnEkZXv/xMZrAxWkp5+C/DdZTn5jKArH1cEzW6C0yZHk//1jK2lIV4jqsuc6dub7Mnor4uK+9PVtevA6AYoEARhNi3D4/A/5ywdkPYTjJVEu5zjdvAiijwagd5fQVtuyydQlTAJXIABARC3LOPISZZy6Vom7qjK+rB7+x2oHECQZIDkl8Wsscj1lTguNsNmh0gm16UTLxpwIAu8H0+H/ceigD4799vfhx+m79E5tzlRzyoLSBxC2kP4MpV3IWrQf//2GxY09khfe9oCOb/mPxmkZJ/ZDj9zUtddddddddddddf//7DgQh5QBnfhkR6LTxY8gLcpQ7HoeC4jO4l8eEVLkr0t5r5f9GZyLodSkRvrI+IOqKdH1Rw1MbLBi9Xng8tHVWKTXn+zzlAHyl2UCWJAZdP90ifil/bREJDCg3coUVOUUqb3m0kgA6jrVq43lNh63zMq+DPfEIUgl61AaKAhEsrN1T6q6ihgilBhygQtf22iBedgzedevsXot0i3lLOqk8tLVlexthZurnhYJnyX3HmQMLQqpiH3bNx1ZkZjAXMVWYKsBfkeSYT3F9ro9NMg9S45ZnsH
97nhD6N35Pvl3UcTT0Jx0hdiGnbk4LBNy7u/Az/PQl2CcU3P77vvzwiV0T09d0t0Qr36cvmjsXJu/dPzx+ykN66+zrd1ghnzim7/JBII4yhJ3f5bN43CRfr8hYaHD+gkS3XnhJ7SV/hC97n4rTJX7t9pdGeT+xnWyicr6r7TNyb0IqmPL2pXQSICb9eaBtyzcd3fksSglwvhPyCDwzs32WKMUUdb258Sp/Fi8t7yv2LXkKST9NJfdDOE7aelvXTkonvyII4wJ/cvueGh+rpulfCvSptaWCEjoncwipXu0Czmc6V3YjCV050eyxvvfr1ku/J66RGtAuK+aSO6aNfUlI+M8M5JCGpfxEJ8mT56/fLj8R+oe+NgAAAFIkGawBXwV+CQdHHIbJeUVeWek15f/cX45Bdv8XkLmG/l/bhgv/uGMthm64AxPSrbXhDKJv7WLeNf/uFITp4rzw3fISr8FVflPDolI4q5FT7h25pBG8m5YbNF7XFpUpVwUimyIf5f38Xvc4hAC9U3bvrl7p9wmTdvLgdg2XxxXc4vFd9JHcf7PLfbk/f0s1k0ihZfYRNH5x20Evd3KZeWz5L5brY28g1terfaPaTH9XDj23tfGBA08+9q29Q1EH8LVCrFbieqGHf3CJgIb/KshyVHkmMOOdY/r6GFr6bhHjyFrAtdbiKwaXo5ZPlx4wWPIgvy68IZQExL/cbSMNzQgT1qmf1UorUesEIRxgn8MxWWglH/zLNNGx1E4/cKYe3ZhbUbgzTD/C2Vu5bVwxcbsm28jfjLkDWN4EN+B93n/fn+OA5vyfVZb5p0CCd+SCbh6T+N4747gXjJ62VeeK7vu3e+H7orl974CIa44VL12w1H8v/WW42r2Ey/vlhEUKN3fjKn6vxbV+l96dxhD9qhIvsQpFg7tRl/5n7M/HS3Bc3ZpG2i0npKV7pB2Tt8fdXoEW68L9tOrZI9EzhdQj+a/h3Uvyeu/qLLhK4uXu+iwSyTyx+WJ0k3ye9NVZxJDu96vOgWCO8MpP3pvYOhl0oUvcI33zaGETuYP7hLzSa5+mev8FVNGta93n7yNPjgaVNd4U73DS1BvQ72dnueF0xe5U6wRR4HNfR098v/bgpJ5w1MP90MxFoXlAMGk3FsJdyIRqTUvvrboExz1e4I+3/Xe52Nj8k7MyZF1tObnXq/6LJ3fXvI2hxONifkvPLYQ/+oK+2g7GbJZ3QbiInd4S8uja/h5Re+EfCmXxk4P7Iv/fy/1rJ9+MDAIt797bvv0hRoZlIFUT+FrY6UtwTlwn8Ct4eTJH1t/Chso5eXTJx4RfMDGd7ttRefx9K7o8Ex0bfNuF6MMLqe8v4wTG1JFei/UJFfc+fYlE3fvBPcPLke9zBWtNFQiHRFvfEp65ytUrcz//bKd3l87C8IdAkDFXv+bn38p7zhpJ9wTinc4/hys8h5hdLghE1pan/1ghEKyhVgy+/m1bfSlPu/EMEWSV48J+ExDv3l/LBaer7sfXR2xO7qjxWRTHYV9LqzGu/V27+ujd3vto13fL9Lp/IxOfwi/x+eL7u9z976ibbfd+ryetWT/RV1myEZRBk9Ur/6P63Iez/4S3enf8FF7u75Qfu+/x3Td7u5e+EfeRmLdZePIWDuJWHYW4JHxdntY33RRLmMvCl90ndUj6lJuf1vijvu9305SFr/NFTC28x9+6/J/Rvn9CUSlv60peONTAKPeZ5e734iZf48Td+7u7nwJLOE/ycnqk322O3DsJTrG0bHbjJzontUL7xYnctN9yVfShO1Olu/pVPZPruXVmd3eT+nTSoSW7hDf9Hb5NhMvsvUokmK7u73UiRbjOR+nSSyev2qlFoc/02NpgjM95k7UQV9M9fvfKd2vY/1szu/pgkt6aXq1Lb6chHlb9wnkhrwwv4yy+39d44jv9txyczvR09pu6yoIC3A0sS3uV3cvu09RkfEa25vd1vXY/o12973vv7wiSX966byeml5JIIfFZUyeqVa
WCEr2nTm/wnvd3fyOFfREuT601VQ93fDA0K2NomG78X22380P8K2VpFs/xnLY+nN9NfughNIUV/S78RL0iEuaTnjDOSFZzKzLyPDL06+O+/xGyCuOedT3ur4yyi4i5/wWQAABQJBmuAV8FfgoGZsCNqLWOSkYqX8nbFy2HZ6jDUeT5q7K/4vObO1hpcI1F4vd+aCNdwuX/3CJLRNQJvZy5rRl3Lt2/X4Qzs3KRKDGg68Ilks7hG/MBgR7Nyia6PhwFucPsEpfL/FXalonY4Cf9r84k/TcrLLG5x4ffG2TJ7F/0CqQHo0WYRT/TCj2xMlpek0RKF37ggIKDEMA+PXuX2QXX4SKaW7v/wpX5bgkQvc3jG6puOOYVHcsL4pJaMMBd36/+7Vb90rW+bD9vS1qNllxAQNEFPB0ebW+1ytrexlgQW6qBZsMb/7ALeD/OLS6YSLn75fdusOxyj08EQ2PaV34bkjpLCVnUaRP/VWVDCuNtmpmfzOw7TfMmiP/p4N4B9yG4dDKGYGqS8P35Yc8uN47j+n+T7Ul/NOedd2m4JeMhacqd7lTeW/uKxn/eNByv5S5xEI9IE39ihgfFoM7KcC2Ci/NWCvtwJv6yV7TjSZ5GfsOmZUNfVSt2Ggy5sPfTXeaIjbou+iNsOWqTqfvzavW9jZN5/HtW1OEZ9PzN6rZCTd+1j+dUc/anLdAH7md4dq1JH9P/jTq+sJfffWtRrDcF0T+cE/ZvuhJ/M1lFuKpZ2MSy+beNK3a7w/fKEPgGjQ62/Haikrphzc357Jq5b1gs8En1gkzl++nHBKHbSeJghLwo5JVrD/mvZLkmdGp+v+sFkiB+cHB+95vunXV3fk90z6KqBYIzB10aSDvd3B4En0/Piye7TNKfjfdGdjCHOXdFsIRC9wm9of+5tcOziUPwm/bFEft3c03tOqKwWzmx26W7DrK7tk/S23opLcA/+D53YtlzBQw/T4yCMps/+97lx5KT131Ve9Ipd39kysUrVszBKaa/RBT6304MMupskaN67h378J+tfglKSXldLrG66IKhnmWL2tkQKzvO++TERnTKF7ENxnLJ7ddZGKEY5U0WcFOVfuU4R4aT1DYYgof2V3vV42KM73mDt9H/hM7vd9+vuETXBN5KZ6Rg8U73bivv5zHz+E/8v1FG5xl71+CETCRgPFfpEqPJN/rRH17gi0n9pS0uqBd3e94tKpmvZf9ZoSywV3d58u7t7kzTxOqL1T4ISu+zTn2vb61nZf6fNL7wlkjCvhBpoJvXn7y7d7v23u/2Yu77Gs0/10JkNd/fJ9pvvZj8cbJPVS91MR39YjKGofZgO/6Snt1lit7u95PaaFy63d8J7Yy7uXp7vbu7vl9JIllBCdu8oujoE8ien4/IZOkykffWY777EyEfXmn9l8/+T1XdV61RSV2onL7nf9pdAhstz3SYw2CFoz3733d3R9qlwkuXBJGadTuZPryVVWPEevzPu1c6xY1q7rvr6+n60Q4vUUa76yneZuXyqIS8Mnd9z8WX8FZB+nc+BItL5NLGo1ujtMP2p/J3R2WBLclnOviF31w3nz+mjtsJC47lfhGb92CY0V+Xu2ZH03yCyjPJ+pag1G/yetq/s7pevSyCEJNny9PWMdqndd1rTkBUTdz7eCDmnh5d56FPJL31gKf7061xxr1Fu7xoH8Puo7c35PvZZFsPjUDle51ow01G9pjSr5ttyfeKJpWP2ivy5Kdt6W9933l3JH9+0Cgma3n72ct6da7Ut3S8mFfBFivTZfJ/OTXPqfqhHv/CZc+ea5fNfzT/15mUuePZTQxkkIPNH3miCnlnH0UG6/7IeyZgrgAAAE4EGbABXwV+UZzZ8XUyDTVfxcaaym42dufF5dRz2GfFkcZ9AM36lsu+fbCt27QMruMhOul97Moo2gR/Hd1xhC84n/CXCf67wVW6olUFVyoMZ5hQ95YTqr63SLSDxKdI5cJdXlgq8bo7+GpoOBTHgz4VFU
n7v6icR/6VJu8vBgV3l+Msd/yyv8IVQIJuzNzaW72ws/sEAgLip77uROXkrRMPIXR2Uv55B7sVtlQUiUA1u4i6fBXq2/1JEFg/fzVdEn+fb+/JaqThdmZDq3uTo+f5xcjnr52ydS/27jNB+EClSy/0T2j43u84n5QKdIxOBvcFMefFO4RIjqePSwU+xGRtHlkjYbBRM6rLFlSH7Buw4wWaYb7rwpKZjRXw/tm5EO+hi6/6kWOmT20nXwluQ7u+38FWd5xNE7iQkfc+xleBzJq3Kg/svfbm75WH673+q3NwReOpfL/vlLKfIaJFICb9xQ4S4KxW4rGyQJetq9hQj7EHZrqDLI/G/0upcJS0W3r8KQxXadnkH++F5gcTYiUddxixnE6+S3al4fLieeKDc4t35S4E+s93dT/1i/Jw86MhPr/0JTmryoFgq39jcgRILmnzP9+Lw72VIlK3wwpoJeCTPxm5uL9ofnx8txvl1pQi0k24LJFIhXZrt+7fHy57KCZZP6/OwU/DKl9Aivql0QKsxzu50djecsk9ythuHfVHBPufTAkv+P32Tmn+urgffhulPGdGV7jecuOpIzGtMbtuiM9i6EPHd/+vfeHeFtzyvfY1gmn/ge8u6CX+7nR+rz1i/MW7uiekku+FCZmnDG2zXil78NJG6b6fZPas6aTwU+cPvxlNuwRZS+Q1OPMHTtPEkKUXLxxcI+CMIXtPTXbRWGxMgqGcjvDcnpN9PghOxT3URbL8FhjqLAi9F/vl8xXyLyhSWtcpcYNYPT5PbX8VNh5bH3gjLA589Va9tdkITOwsvze4o42u/5fhLyEe/abOr+2lfJ6pErehUqrVt5K9uYr39oTffl+nyfTd5+CIsfY//ZU71IQ8G/e9mQJb33vKEfMIl/2xx3L7u25vc/9X39UCE97zGmyldkvfXVv2oTv3vfbaud10Rrz9Ka7vCPgmK6W27elUn1/LQUvd3ity9qDt9usSW93bnhJ6rrr1XX1mK7N305SXvb47rp3UNRddgX0vWXd9ZNit3vp6xW7vd38VCL9seSIe/hPz1mK8q7e6kyid35kW+/pkMnd6zyxQkr7vDfx4VVkkYIzNp507yzZ7fo18/6YR7vy9Df1rib3NB7/u7v04QNcCN6f+/7vSRu96s8KLc7qitidt93t1GRJXPktvu9t6WkxP1Wb2cgJ7zwmM7opNVk/XonKIO97hLyFm3+CEh/CH+ml9QjJPKySQ75329fMd5Tf0HBsFWpbce8n9MJiMtOX018QLuSWfGvblI+8nroIiKqa735Ize73mz3u+vUt0rUn29FuRgtM+FVI03KnZsJ+Sh4fXM13iDPdvauE77bL5PiWCkb2r7la/ZcFNryJyBrvsX8R6f1ZHzoPqn0+SI3vu+jYV8EW90u/sFRG5obpOXHbCXlTo7FxW5bFeWHX5Iorvzt6SpMEnms6fj82973vyRReTS5/3jC/8L+QhT5t+SIyUk2ozp1/5Igr5z5S728nw98VAAABFRBmyAV8FfixmN5uzE6DhC+Eea82RdE7/l3KtNeXqtoXVqWh8H/Xhkv/lhEkw8PzbgHRJgR5e+ql2pLPL/bti9odiDDg8742+WErVGYLmFKNt3V5f4vx/uwS7GppP11LcE3HcnHby3XpfwgVVSvd+G63L/5Yy5slJY1O5HbZEdqd2Fi/+WMNuO0hF5sX2K3zms13XWPRD1+424vbwVmGWdTZ8u3xM1QmRCH9D5iE1uU6zdry59WvkdpGhQj7Tic2LZE+/txSqD9LBTQwzhJ+0p2KVV7jatB43eMaWK0W3xy3zdkMj3/z7QOUu+YmW2iPHLKm28bT4x8MCa3Cm+gw6ktOGZza1GnH1g+D+5F1Qo4mP9YI5h84h5U3upt3Mdqs8J9zA8fmHgX9wRb18qIN/m5f7QKJb9t3Sv5T3PBxsJl/vbCI4VuKwkcEjOTRWrcrVkLlgpJYTOKYErdnrmF5eOJBkzL6ffuPqet
b71sKYH/xL8k/czw9hPvPwEL3L/E/MGY9eqLqW8s9dFlLjQZnC1LSifNclOnX9fWTu8nq3q1mEtPeT0m96TBOKvefAk55oIisZPVb6oFGOH9Ikxc2SC6WfoS8E/nYu9hf4LL7923vrX2Myc7nDpI5A6ZuPFz+6vBFDC/8tC12/pCzXgtesbdn/goLEKmAUvm7/b92y/+4I4ITZN/+OxvvBDHO/3X5n+CLROTHhPoq37gpgLv5yLvv91D+8TYYkrOJ9flF47vCPghFcQwaqLRUzu8bIMgmdLbi0JWmdjWNOilf7ySi/e7TysYZUbUPre/+T3bp22wiQooV5wy0f0awJnsfPxquD+b4UKPij215ir5Jlv1ByNbi3Yq5ZPbv3cQV5a313ihG5zyfeT163wSCdyh2tfgt7uxM8fdW5+C2cM73fW0ipQkZqXgzO80e1XZ97hF8QmCEiqLpt7rVbYJD5ZW2+KQJxHDI334sCcCzY/rdk+3v8ERR1y++LvBKSAY26qy3fvpx/bokglyC+vBIZ367FqTYuEPMaOb8v9VgjPbLi6EurcEJ73ll9qif9lDDL/1dL6CQhz/y++gRZ/100Xd30bCHr3qCLu//RFrtPt60+x6hfyCWnfL+V7jyTv7n9nHLd69WJjCn/oktyX6XWIiZDvX/dOXpc0Egi7unG18EXnmkXovn/KyZY271sYbDfnm7hqL23nzIHdXYXx/0MEnjvd30O7R+FHn8I+Q10/xRsvV3v8EZ1l2jvZ7Ljb19Nmk/ulPyfSuX4J8n7lzTqu8h2iikN82q66yGe+k8vf6nuqCPd7vc+nVuVXSKoJsbo7fjOHrTMKLEzsWR7glrWv0a2GCfViyH79eT0l+zSkK3tp7RKb/S9Pf4JyO3l93eZPViUUy+E/II4eX23+KJlAz2FH129KmShKfS77Veq9OWZUvN6kcYX4XXvVif2CXLj84s+RNdm+bDOoieHbXRPrfVX3fms5U9wWQAAAErUGbQBXwWeERW5iQQvAOY8sy/0ZKl5b3hkv/2EdQ7FiIL0Y0WZh4y9pcW0pl/t2hthI2a/5rXDt7ZC9KoxF+wXcG5eGGyfL+/hGgsZYlYdT77vduJhC/SPricC5ZEcwCyerll+K1u+HIszJ+l58gm+nqxe3fuMLd973DEsF0ntRkH/iZqXF8NZR/hZ/YoYJHHd3uX9oKbrgi3JrnNClw6hSDP+6aRQQ/Pj88wXEj2X3yX3CFb72SuE835fPznRiBpNOWQt517X6Sl78qQi/L16gou+jfTMKP3CJBI8UYRT7CT+VnPAt1bbD3pv8rexkNokCH9ZtJ+nNWuBc7dTGPHYyQqUpq/hZueTLqPi3SDbFh/uNhXryNaemKynyt/VPzh0xZ2Uiy1Yzv51IWtXpC2w9XOu1dG1b+2zdAQ+S/CWWBXdWucI14CC8inZs8dPWMESfjuVdYLXraPtWSN/Be4ek+q37GlMVFrXQjkoelsdhtAOE6UQ+j9jIHCeOtwW5aUbwjiuX2nZ2N27hvC3OvgE41XcnuVH9Tc/d1h7Wf9/lu5TnL+VuuX9XwRy6ccqe9kGF3dz98N3+70b3yfpvllQLBT3MGuPtgblMEHc5NI458WX623BZw6l5b9IPotJhPYdIND+rFQm/wVEjePaiGfd3Yu19D+VwfokwiyPPeYz8KePr1wPSmnn9wc8IvQ3uKMcvT4Zi23qgkVg3jwtOE3vupZL9YI4QeO/lUa99a619GPe+knu+T0ktNugRGfY31k/uqzwQ/LnflF5PCL9HCwqTe7u91ZVkhViyou56giFQh9d3x9/wsWUs/1PGYB118MUn/4ICCxD5cIdw42Wel3prDlp5/4kr3IRu/y2ATWH+vJJF/2C0ru8Nsn+7EoJby0l6F6wlMIbu7vtOsImyLlIY1/vQ33yYsTiu8S+EfBCRDlZz/gsvfe739/td9aqCIhdnzL9HevElLHd3fo9CH6wTlMHcw+99duW78npNf
XpclfC1XBJd/oRf6NLywgd3dxv12eQycO3+UpNvzoSUEmleObXdU4S/+qInVfsTJJ/v1oQ/m93k9pvqSiR9w99OJ51F0+98EO7vPCT/BWWfzsu2LpXnFbf296rZY3j3voz+WU73369fX17a8hL31gm7vd3c2qr8kIvtR+pX2fvYrdxta7lECXvbZ32LZ6+b6onql4IZTguEnklZ06iy3WXql2KPn16SrE0933vNaJeK/J2uSCox547l7uTPOtl/uiMonl4T8Qa79XvpMKFw8vi3d9gneWw23/d0UPr0HO1zRF3d8/+hh3iu7u598u8n6d542Ql76x93zte73+hG93vvbTyevebRiblDfUTe+5NsJl/XxxHvnt+fx1fQKO4S9LTXMaUZPtqxe/qxZad93k/t1UbKV91Yv2b5NPk/ThPUhHe/sYR7u3CBK393d5cGfXVK/Ytd/f30qy/owl0sLZEqKsv2JXgn3CzVa2KQ/7NkTJpq7CHG1gfaVlxK3DS4Qm8+Upv0oIyk3UraaE6CN3d3Lj93Dj3IxM19336TOWRz8/DHgiEJJd29O/gl3JhNVktGl/Jw8y39lPeSav4e+LgAABH9Bm29KQCvgr8wzPM5b4vNj+79y63XlrJkMeiC6/CO0lXAI/qcTS1OK9BJ65cvcsFcF1qbpcw3gKFM9twwbTHR4nmLnHzg438/zR3b/k/pXzzYdIIH5P6fzy95/3LwWwKj19DC1XWs37uj7h67uGl8WpDAfFZZa4bmP/yy7NsKl/8sEBo2MhUpl3YXhN0l66gnrdJj8l15Y2jnLaRuk/91O8fbH5+vR3eswGvbZbTAATGfmR/xKW2Ta7eNWtOunA+tV/NO19/eeht5y0eZR95blx5S2zpaTtsbUk7yDvOcGgg89di/jBDra8Wo9XoZ3fXBbDu2d29Tuvye7fvjOMYpGK53jc+7aAs4u/BJ6YttZtb/psvCN94EWrr+/0htCHYscn01+WEplyL7S9ocyek11uLlY8zp1qSvBDywGrUoawT5NyXTlTpIKdO7n8sGrdG/hJ57Oy2X/awRnc0nJUrEWQm/sUOEuO7iH1uHkJmX206cFJCinTzo5VjsWOvfYmvhfQKbd5qXdv169tRoGJMumeD8OL8QCg+vo7/L6TGoWFWF0W/El03FqLQMdt+CHc17U91Z3vpvcvlYuzfWauvcWIzJnShIuf7UfaPf5PVWttwUZQ0Jn5+1PfF6Ruwgk7Tbb8704R8EhMvuL/J3fuaCNeYD/+COa7d/sxM4WeqsbRW6GwTRwPnwHMs13yoj15PdJZPqzk3btAlxtwfxs/66cZzr/gyVpeoloYf7u4S8Ecvpt3fQIsv+yfutrEIrGqcbIIgx4mOwb2WFqwSv5XJr4QY1+T1vFLLCNBHYDMsQma2nlVsFEkGtcSCx27xpO3CHePq5Zov0Tw9lzskou5Ke8goNa9v6EoFB1MVLrK5g+VZ1dLW3yoSaXvzZ3rkEvQ4R6BGQrKe1VfTe1XMRzMw/sN4LSgfQav73uw+8FViXou8eci088/92bsu7N6axPa24Lbt0hbd3Mel0N/u791ZYjILN13fSS4LLv4ru/0I6L5YSIfpYYczX4vv1q5vpe8EUw/W90/ZOt9Fc3+C2GUjkfveUJ+CSu9b8kFfe3Da4Pvfuy1fqT2zH3dpxLykpXvFaNd++0/Ty/vLJ9Kn31lpO8JPtxV3C6Qbt3eT6vlRxMol9vqCKzvRYtC2LM+Yfz/J/UnjXW9aiu7u93ZYV3L3AzYIunWh6iXIfy7T6jyvvTu94SsSEUKn31ZT5/9dJ1ZfqxfL9nsl6HWlYvPnmX00JQuP3vd7t+7yfdaXmJissJPa00yPE93EstB7Ceowjx1brXG7n1FeT6padQUXd+am7bmuJFu/n+nsn8feXvPeG3c/607L9XWp01RE5r32m5EU17/Z2fhPyGNrcZ6tcFxHKxd81xfiRJSt7tlt9le3y/L6E9JcgLue54
cqdPp0hk1p/ZMLLJx8Vz4K90j5wl6OVhHpHyXEKO+XH0VEmn6olMpSQv1gpq+a2DzNktStrHJMOVr6mT/8qJ3frDGaCWicH/6pEtI/glpFrq3P6bWJ4VvB7xIl/KbWv+Iu+6+HvjIAAABLFBm4AV8FfixnNmQfHJXzdSy+WwR6J2XZFeGJ87utP/junz1xpf4YL/5YYJeEpxsI6F9y7hJx+PfUy69QV7mH9XJ2cDkLbSSm6LKwt2QalCSK1NG55c03+qdoVe7wDN8PHmfzJ6TZeW4KNTt3XMmmz1CN5/zt3t27TjCuFxU/97v9wJ8ZFUvKUgjCR7ZQwqX/2wWCoceLCKVujVH0zJiq9xvXxoYz3BVP7H7PACEa5h3nslL+x/R/R6vqyFvVx+tbjxB+vjFgifahp7yinGoLybSP+X/dxkEv3z6+JubeeguXv0GZ8IKnxhv/3d07yThH66+rfbQ2vbHE/tHHwEOv3W4/BeaniU+l79GDVUs8bOepB6dzpf8FE4XHxaHk8q1apBseNJ1btmhKQLseCH49cfSb+CorH5+55bnOKZGeMUy/6ThSlk4KZ1aJOYS13fVuN7zR0wy/WWWCnyQjNTYdOvbe+8yb3wh7dx5Datx2Hj8cInv6fwo/cUMEvFYrAIvewx0+tAltm4pf3txsLso0cSalgV9f33hzusw3pHf+SQ+zMtc4f7hIdn+P89ay/t3hQmUiRzLM/WwvIoGqX76ubYib7h1AY+vBHFnyfSVl7iyvwQPmeZ/taoT60NezV+tYMnqnLufVF6r9K1hf42CRSO8DVLzaGBIjPro7NSvy/7VCyW+xdyF7mTrhLwSbpDeMwl/TiCwVYSO1T/3d932kW42vX6kOvuPifrWuB9ho/Vu56/SW4JZYfMNt8vfSXmNk1+SC0t3mAvDcn769yzLGf7GyZ6nv3ghhl1t+qr1XQIpYfL3y/7e6F4dZLVvhHwRRXv79a3lbghLdwxezTPZ4IhEInnt1vtPPC1s0ucLTr+hO5t8Y0/b+CCPiQtuuQXhmJhZDttAq8N0l9PeXI4E744vk9v72xI25wlp72mTmGXllT4qCMqb/un2l0Ccrvn88HfJ96ntpgnEPQ7RF/7asjEhMTunn8I+FyVeP0+9av/wR1rl/VP+FuBpOGXsh+H7Y26Ycff17n1h9xf6/ZX31ojVXpUQkIFd/d6K/wQyaEbjoJ3qMt/gkpunqEtwoR3dsu7lze98HR/Wv+xb32uvVnhmV39eT6r90R1V1RUTL76UEl32hHwRdN+yfWv48rlx2vty28Von1YI+f0+7J8npNiP4KC3fu6UK9QQme9h1Z332J+bvV/oxEr9nRLvMEeRmvfS+Cre93d3dyp+xY+mfhF9lgnGO5/bnUPc/pCRNnlvfo8lmfOT216od1tX3fRvW936U299su7EVHSQWTcsr31dYUEcb7t7M2Z7tuUnCT8oTBPdj+X+8QtpLbEpP1UlG3K+7tFrpdrm01oyrZdpq4ve+5du2sTPB3d3gmwn4TMeFzs2w03Lk9KpVLwnavxCEX7IUXcEA8Z+P1+SLveHF97jvtMvqq6ly+TuIZe709UOM794MnQvvCnk5chxuL+wTkxWYPvbunVamQIRL5ekLxH19VMmkpFNd+9cEfdxXQv4Kp1solPHM8xfKnklvFbrxRWnfzXL6XkmkzWXy1zXnlyf1iPw0XyTfES17JbfJG1XT5HS/avc+3JvkiI1n6pagsgAAAEqkGboBXwV+YZmgvxe0FPsN5bv8vkyvExZC26XVZf/f8We460+bHDBf/sYbi8hY83gkqHgv/pL2q9oFcoKyVJuYBiRvK9wh9DBLzI0xrewYbx/xKvr+fCek3PBJlDDVFqqzcMQyhsXlYqVXElX8MPW57QI+deVPwQF5suQt1UbP3fWrcv+1IEK1msCsdXvu4WL/7hc0J+nEOdvcrC8KOf1l2NjX4uC3K6bEVpnr+e3ZN7Nh8Ey4/VplFNBY/d
cx8CWh3/r24YsK6LsV+RSS28cP/jvvuM+ySRJJA9c7+L/YXO+5BV1ulOx4eMh5Kc3K/jayFzHcIrEyNfstbur814TvMp8nrVk+MjOn/ciZJDj9JNkYmlvJ7VNqROWcL71ReFy7pHAyib4CiMQPe3/cdtwG/wZh0SRGWyadCWdL/BES90qr6y/75SxuctgE/FDJfAxLBCOb9vlwuqp8vt7WMnoHUlyB2mOcdo1pnaXEZ/te0SF+Ilf2/tB0k8uWB1v2k3QqE+ofcQhrFudDjlzH1qm4+BZkC0E/oenK2P/+JLqFr/Gd02Jbm5AKafcnlXq8SzHff4RunRYdUXceLe6SzwmTZuPz/eqf3+CQg7yGcWjgJ/wWT5d8N94sC7GGsIYs01iNCb9NaRdTEEumjrwSEsgJO89L8N27hMt58zozl9EQm1dG46FJh/S917dYz0Wzu/XVgiMQfTyWjmuyRtbpDN9tMTvd5l4YwZvy/5fCPgjma96rfo7BFlH7s7FsEpAKrRQ74+RuTn+0+KQrudswVf3hDjERxwRyjDYxPBCbL3t9xI18kiASqqvBaIu8PRejVnnXqaznl+CU93lxqdXNpykoFghih5bw4bckLhD7w+rUjv7vUJnfeXy8I+yHyOmqXu/xVu5gRff8FsEet1QXHxN7AxvvtRGn1srb+Ofxtf7Az2To6BOWf/GYBrsbERwur3M3TX/ir781a/BDmn4b1ZAW3b97+hLL71fvX9ePsTvlfSd3dbkvvsbBFiO/tpOvdAj3f2rqne+93E3dvu8vuq4Ld7ve/pwi+XGFPj0Ny493uNxlwm/Jus5fchbz6i+osttFvd5f10xRHfu/SfZZJK/eTd3pqzRV3cpJ7vJ7Sal6RLu+9pkVOgj7063+Cch+3viXu4r235YkSeTPLHaexb6KUxHnN6p+rG+/tMJ7vvfd6yWO3SZWC4nDOLISd7a5STCSscI+Yl6f0Ivv5RIluvG1naglLd+6mTzEEXv5/a5GpCp6rrcsJbvp33WmqI1LdkW/zdwsrL77Sha7y9u9+4hz4UuwVkTveK3sIF6sdmBu8W+pNLZpDvvVOQ9XXcunxBDbz+9kyEdpX1JCfh477UfMVZCTRD3WN6/61xpNzVYmVsJ7c+zuEf7te6X38uMPcm/rysPndwxf76nIC2O3RKpMuf63Pyhn6KTN176+5ejRl6e7ve73LlpUoK5X/u7vpNcsgRp3lwlxM06X1K4f66Fwp5JHr/EaIsqj4Glc5k/Ui/Hmxlm/3eO6e6UkFl793fFdMOv2ggd93n6Zf+lBbnz3NqKcn9m+vkfbs7k6vDGqEJw5rvpLJZZ/cFkAAABTtBm8AV8FfixmWzZhyXkuX/3Hea93Lbh5l+yxc9T2zclPnryzRIWCb/PDHmzyJOl8KE48vIJvGeKzR9EdF/6iNtec+QXuFIXdPy3Z/dvCAi/o/iN7+6335lZQEBp28ZvD2dtS5uENuYt3/d1F3PmrbaBFdJ3QUqlLoFG4xdV3GneJNYKL3HZt73KBKeh1+Cgu54XdxWPxuQdlx9uw+SbvBquXtaZOFl9hE0hkIXRqwJHHFbryYslREOGu42tsVkHQreeZkHv5aUj24B2tPHbV7q3yI3bD7Y66iMyZo9xq7KPx404EuShxv/jeFeRHXLYd7ofSzHB+J32cMYm8yaty7YYN3v9fZ8tjGA24nNaw6lCI49NRE7dt6rPG3g9OoXfAlM/4sMHEePits8qefdQUP/Opw+4k6QY/VT2MvuSM54484aiwpQ0oamn6yhb+FJ9puJ04iwdYDfCYu/3Cd6B+GMGvL/7qO3RUPvfe93eT9IvugXab5+9pX8pXmMuE37QoUJHisVhCtRWmYUEPfO+3BSQI3mp+G2kfKoUtWywIl3JHL7ur2mhLKxt9GMo+WaH+Puyq8KbEfU+Qs0rcrvskv+t2dxXYNtYcrY1yHpysmn3/caXLTZ+ETnT+tDI9OhidEJgJv14bU38wYb
7MN8b+lvCHj4dSHaK3KFN5PS6fwS+RWnw6lujppstsJnyPS1m+68pLs9VlghK747ZPSv1KgWCrBuHLutx6NdSSO39wp33T3ooJfDQ0cpIYSKHhKnbSzPCXokTH4Ip4XftJK/Tir733k9tMW1yoFnafMPO2CRFe807qXh+L8dodeon3J9+2+CbcP7rwQdZ8xQbeTBL5cfr2qxNe8kSJn++/cl9+4IRDsT30rWFKLDvBkXao90fPwyx5uWsPmEn94R8FIjl5/93bx1X1gqOYTcwrcBC1VZ/r3afvguT0235XBMKnmlR+wcRsCV/zTVkiuUccMa87ff8FO7wlu+8fcy+5l/dcw299ieT3v/243gkJIcMc+ryCONpi/wVCb7vvlY9CPiSZWL3qzvf6Fpe3IMh3p1xc61f8RPEoRyAYgW9Hs5wY5n9p9AiI776t2wXFffl8HqOyqX7dOnt0lKv1Hd3u52Hd38kf4Isv319giET999e4UEu43isQ/0opFnuUbne9uJPaevEQTFRvc7PuRO1RIxk9u1pKyFd3yer074IiQ6gxH7r6yFd780Ze9xwT93e7vk9u6r2S79pgku/4rwQ+HpJv7IC6Wl33doR6BPJ/k16+gpvd95WHu92Fpvq7v8UurNd78lnz+vFEu999tC73vftoR3e72/lLysmQeSESfqSV+Ce4rEj7gDYlb4z+VZ6iT3f27vve8vk9LPc3BCTd0oVklmGu9lsaxAznt33VF5PST/OjX31Tvek0JIVAlJuATarXvvw46hx8eJ1xf973FcI+Qmr/BOIveVj6kp52Ya7+vdqpt79MEZLu6dJOJ+k9La/S2SmV3feQuJlwpx8I1/xvTJIP0JeKyR5YfwgYveJfEP3nh6i7n73gb8RNjGlEibZ/5Ri+mwmR33v2LhEr7veUSu+0mcmv9YR3dtz527vtMhnvqsR9vL+tyEI5aPe0mJFHFl/ChfJfLFDL3d7vemWIOsMqHd+0ncifxnv8VqtR282vhVa4i9p7bkNXl4nd7pO6X4Ij7ZeKRk/p/wpywG/X35r7ukdZEaC+1d73GKby/78sdu73dKXP7OfGXXwtkgiEQ3+6QOSCXn/ZsJ0uvyXmb5Igt3vv9w+98nw98TAAAEiEGb4BXwV+LGalvATP0ZDqyv3LhqOd+L5dGY9oN5aqwx5t1y/+4LCLIlWUUhN18Nzi9whBGOyi/pvYRQIZOCCCTz2dN7McF+5shIj+qcrCHd73mbG3BJ6uW5bgjoIpXMNlRVosTXRBmIeOK4AhqtyyzcrL+1VkK98v+7wsX/yxRpSc4dP5+2NnlenrQX/rfmAyyI8pS1JYCVaIKReFoHWudX16d0dZNw3GK7hVSd/xzyLhLwwt4//cbWFrct+ik3bIOIi+Lx8V/R07EV7ri2E4bkHpJJV/Htf4yJtQGF/0Rg0Hn5gG1xpU3cN3YC/xdQxKN+RM6bAL8aU/k6k9iQ0GOk9u0s8Hwrxv24plJhUmOv9OH+J13nhBWOxlkFwR73U8l7/GOvIzSXQzeE3Ua1aNfMuB1/B96T0mk+T6vkaPFXcIn3RVh14oLYOS/9Zef4TftgnGCQcEjgo7FGHUsJyH2V2NvrCkVhlZQanb9zmRkoU/AVlbCoTFNR/sPW0pD6y+1fQ0hzcJKfax/S5J7bj5ApeqRrc9QvhtWk96CxhTVk7J/iywx0ma0f3t/gt5NzkkFFjPwTF55XHrGK4DKVZ/ThC9z5ec0YFvb2t+73CJoT+4dwvl46LF2DDkH+4LLhl7ZAuCb684Sxe/2siX2dEHxZbLxoS9Ey1p4IcvnjW0junZbzn6Lnp37hwlO68PW7+rxbHlx2xvlKlCmVGUf+WXRnfPda/WreuC3h6c/5V3bc6aZIsvvS2N03CVGHIR1E/X3dHw7xwwCW5Sd3slw5NW+SPk9733BFYFc4dY/BHckzhd4bb8QNbeUP7uvEiE5l5Fe/wSl3fjedZPrp/V8npq56mgvNuR93
MHVrPD/YIzvfUI+EKbKRmmRne63+TP32/xVwxJN2lbHf4d2Ya4Eu5+M7R+Q6yrMC42WT/3vkIQGp9+2j9r6Rm1Sn07xMInfe93funL3ltQh6tWWELu7uKzwz+9Za5z7ZXu9dtAhIit5E7GskYff3qCe+7vdlL1JhRseT6pO/3rq8I+vb5cJ3EsIzzu7paZZT55VeYl78j6P2lrXi77Pd+10VAkvedPzTL8Il/n9lczEsb5rBOYub3uK3p+0YSZv0VAlwyh1uu0tKGlHi1v3d35PVb2j2MvfZ4ju8+el/EXd77/Fbvd3p0moUNw23fdre39vfJVXpjBbvu8sHu76UJ0pRD36OzHd30spJbv1iNF3vJ9aX5iXvrZRic/rqj+vVKZbrxO3LzrW7hPxJnx07u3+Cju+aw4pJP2LiRL3veDvXe6hO+25A1vrv6J9RBJfdkVThTyHng7ynJGktwteSjKOuK20rvrfCF/G8W5lZS6VR7v6LBSXgJ2u8Oxte7vy+zFOfM6V5S3P5l9Fzc1vXqiCm2keoSz78uXRNApLsMJrci9uQ5reizf4ylTczDLkEe7V+ikdkm8LPzTGcJrZz9a4KMTulpFwgHVU7gil/Sr+lJ2UsC1S7zMRJIUWtZv5MMaiJIq+1azCnJBdMjvJHLTfTa/BdOSjLKnPPTLkvPPBZAAAFM0GaABXwWeGBWXFdY7IlX45b/KTfqry8uHFwx5tX/G83yg2HotFpQl80gQD+MIaKOf1saM/l12s2dw/D49fZ447IEXB0id9ht1NEM5WBU7YdSNYO1M6aUtyE7JaXwQFKPzlrI5wq/h2LA8tPtSmftsTsbObd7V5u4cHZybmJFTFefbTJTk+ny6w3CNh3u/6V4hp/lhLsnkv+Is5bdar8pW5GRk1CvgsES45n5a9zweaWvvJVW5Y3QpuLMfdHBan0EonGsvVVkzaSnYNZb8etkUvZv12rHK7Pi5UcQURIMFrp2xlwgyvS2qppv9O5tU87akP4/2P34PKjBl89cf9jHwLbBdS4Y1uXVW4zC+vEwpFv131AI7X6Re7tnxxdNPQy93OguaUKyn1bfkjCmTsw4/CtFFj8EqqP61/RIoj77Y60r5syxhFyOtcGHNENwbqQXqx91X3BCR3O/Sq/wh8sHq8C3wsPFflOWT1BuWeE/BAM4EguYExwzxIeO+bfVrgv/63t3X728FJHDd+iefFcdpWD+Jc80X3YfseLvO7b/5PVJvdxsNw3q7JFehtJoupXvDVu+ITC2QzBsYhmKDtf5/8SzkWX+svPQEvzY3VAsLmvBZYJkQ3F4+ip1pzXf0r4f3TvfL6u9AkJCAXMBJp/Df0N7u93sihak9u2dz8OOoRTc+Eieq/4JLkaFFX+zXpAihK9V3V2uPt0+N/OvROPPvBLt/82r2myD7tNr5Nt5PaUv1EwxK4Rsj8vCTSckKGQuNyFx9+fe98bBvYdoKFcpb7hxJaZM4b6yd9PJdfh8P3KNPhxkIgp6JWR/W5LfX9OCPMHb1pPPWtuN4sW9736oSR7u79qW4ow7AXJsCi2vh2l/xu6Do5UjUEpU345yrnNkAmH86vP98KtE/f/ixOanGc8Iv8ForWt7sFWUr76de2ni44UHciPtfaFbzhKXr8I9Ccs70rY/V0vgr5Q7MZOjw0l7buROi1k/SJU/6JyfpfVghLl32q0gkW97w62HwSWb7e4JzTBpGgSQNd8vaBOfP7u95QnWYuHbo+7JlRUt1kGIIaf31ghKvY/7r2nl9Obe7J/X7ku+EspRN7uf1HxeY+Yt36OlzlF/3wUb3d3dhXuXd3rvIWyeq+ywVEz4+73fWT3froEFzhd973fWG4/+ix+Myt2P2632+4U93e7u77vaEfWt+ozRrrfcQ49O9u7y/e+S99J42QpZ7v3e/RYnem+SGvrBFGTw287I7dYu+zb33ib2n3fZCI39pPUI+U8VffkQozuf4lodQ/SvgmF4l+4hwmkTtMm
Oq+m8SJiD3fyw19bFJ3fX2JQu93u29p+a7v3iCvTd39OC4UX4SvP473OmX0yWYmEn5wutvFEF90rb7TJ0JI3Re0vJe/WYl77E9p6fXpcQTv3iGi71xO7ve4T8FUvV423fP3dzg35YLCueO+acpkOLjhn3omFqyxrBGUbufG6fVrkI8vfWC4orvYh0T+l6xIkdjFJcWzeS3XL8kuaF8S7L8ur2nPvyfeIIEdKCcz33FcY27lkFy/d36s8hfxhHMivPSGhJ4uRY9xy2W+lYkfPP8J+CIRG7nI2/x5OPmP3ScZ8912oJinrka83V+9yJ1+Ijil/y/lx5PaTL5c1vXSiiXvul19TFe7y+tLj7v7uQvtbr4V8NYz3KVs3/6ijM3Mdid/4I7xW6RfNz/pdNCeMPufNp325XdnSliru/dE0TVrJ67r7Ez/DOSIFGjkwgqrNhyfX/qlXchVM+oLIAAAF8EGaIBXwV+LGZiZszcc98ubFDJsO4uPkdOg3Lf17/i+MB9kpKjzQ7Ax5t1y/+2ESVHtNc10rtRV+CjRGDRIwMwtkvOmmz2wp5iwQvDu9KCL5tz/KJPkJD07kF65fJ3LLGxLt6d7EwtZnDSycPr8EFrjqJ/J/f4nunShYv/ljdMsuQ+Ht7M/pTLSKkrWtiL2UVsN+sblXvjSH7/wyJ6XtAN2ZY2MCiaijPXjMxUHdd+hPNHqA69qbzyRgO91JdQ+638n29buEL8PwWlOhlfUM/Pq+AvznpPf38NcxIM7taD14ddX6aEssYUz7UHKeKhO6UenVZjvSgp3GCtw1K/NC/z8G8RGe6cIUEn69fnXMg5I/wVTwxoNQ+PB0Tpe9x23VHY69KfO7/oZd70ru7kinxslV+LKXmC93lyEy/3tggEBdpb+Q21fZIGqnjmQP5bAje9lB47hN5jaul9btDYJHN2i/Z0IMf/7fu03txl3NjXnOl1zJ29l7/04FXWvc711z/rwr+60vIMCgJ2VQhvHUrR8y86RBgTczZ13cfxV3GmE+mM/c99FPvsVUgNPKqH+/a6v/Sls0WKsqNHuKHbVle5g6m1hEPWcfmfqLtOk/e7VS1MIf7v938aW4isrLrIv80WMweaHsZOS3QrhbtLiQ/HbxwXiraT6T1/L9u9jIJXx7zcDvd3uXVs23GEYr1zG5PSV9zQoVEijNSeHI1sLPcMTvx9+/fHMVpVJ7bYm/ha8UkJ0L9b6R8Onwxmz/LCFzp+ZE7K4dStGjwh5GCvcW3HQNHEnL+xMEd30pk9prL3Gmu90NvNsbUNpJA6vlNODWztYHs4T7opt7Y2Y9fgptP3sS/zfZcizsZ/UUXHamid4Hv4pBNGavQ36hp17hvaZQr2eEl3gqDD3c9cwb7vacyby8PchdIbp+r+63ds7LIZsMPx6v/kW/L/3jNlz8/UsnkD/HCOQlV17mMYFmPH2OtcaXKPvq+X6FZ0SRTE2o68ZufJ741/l4IPR72C23BHiwfXetaxEtjWTpe+tcUM54GM5RN+4U6tAVQ7+QWxWW8gSpZzgT8kuP7HtwiX9egzvLi/dXXQISjYut/sLU5XESlNyhGHzPPk9NsrFvwSinESi+hofy030KvAk3w9fn/D931H3O0MSS5bH0oxS8d+5XE/7wRiw2tz+WT22/drugTGCN92Wtp1kp+ov19Ir7SegSGAh+w//8wsnqqxboJGPL6cvr1wh0yDNP/Lk9a88wl97/6NIInDcdJunBN1AZ9wXc0mhXP22CKNlo1NgatjeK2CEo8I0UeuiURjb25Cnp9jUE8hKdt77xd3d7v04Jc4Otu9+2mp0TL+u4QhF7eFLvveOsuwk/3Fct26OUtt/Z65zrKVKeGlzb6JtO6fbJ6KF8NYZ73r8d79SE3Xpgove+7/JCFa1l9fx+pOHt3d4dHc3d8oor31p8Uy931hMrs939F9jX9mJNP0dku+ut6yfoluzkBFd/uqhFbePiv3cA7XkiC67tVu7fHCbwbpt/
bv+E7WfL31YTu0+SJA70+MiBK358fb9bFJv68nr6v73d89N6VMsWSHurKgoK4zM7BYos/d75jaSzhMWc3X3rehwj4JzZn1w15N+if9Mvdv11WT3SCKF5ZOvyexfvILd31WKJToncv+8nKQbQXcJrrCGflEi2xu1Shl0mS9hHDsFt3u5kHBADfJqk/k9PylzIp77hVuSe7chvel8sby/1pZq+yeuX77KKCU/CB39C/GNC7RcCngi0ZjfLInbf4wkUDHV6Wv9u/VjqityxlzqCsrih+IBtaXe/y7vve7KdO8k7/7KVw2qHz6l3d6/RC2X92kwnmS3NIw7yetSl6jClm+Rve/d5P1S2sXe7MMXT0t+rgiH1ZujuFfRC2T1zEL8PQh7OtTg1W1M0m0N1Vma/P26w8+5f1COnLhbctl8uv/FS++556frvbo0/95KY4obq8WWh3tgwlnf8v9fDGZEIXQjZel+Iz2XLbzEuSymLtTXw98ZAAAAFb0GaQBXwV+LGcda9QfzZ826J/llJkJKQwZ+YhUD68p7jrdhev8ppNsq9wh6lhYThza72yrLw7bvw+y0vkHheOMY/k9Ki3ywSWQnSmlVU6jvLS9wH/l3XvV55ZxSUPr2hh93d3fobyK5C4W9GKJh+CDl5fNu0aNPjibDc8LJmlSf+WNJDq5kPSK4wJlS983x+XJ/8AjxuR7jM9i7+J0tCudum75Xl3v3+t0XcbNKA0ubnnLbwb6zpbB9mjkEv5n0j7/P5QTkvcLJOf9+4+GIMCwFpFFLohM+cB6ye+bGlDX/GF6RsG+obgx9Q2pEsob6Y1DFULpv47g1gMXBeYy/huDddHT+K5n4R8p/5PTzs68E0/c+ai0RG0Kzp5YQnjfOWvRS3L7/mvf8pcNZMJl/9xQzgExut/yHk/ST7/tjJ+E/H1fCTy1HOzls0JlXn2chex0Sv8P8MpIV7hQibRFq5813xIuUB/3mn84yVjodmHq0u16XX4eDxpZLJhw4ui962fhAqzw77AmOsngRUJzE2OB3CU9qNF/L3BGXK50yfi753vmn7hw0NxF7XUl/fVBT3d9+HUmGvdzC9Ve1oS8Em9z9l/vxGMrtXAIr9wpSUsiaM7wPo9DTtgcu+EPnh7xm1Lx1JCW/vkPx3fYXj5cN7bh4mLk1q970D2Y2G3KzvQLnwYzWBfAvDN+D6vFsaVgslYfl1yB3CPk8yM+XYz5fb3ZJ1DF+d6fVu44C5/xuzAtwtoTj8Zom8Ec7pi5icd10frsehvW2qLh/+EcxIsVIeYZRdHzX17guzhNWGbxUQXgbsfiRKe+Pr38F1LdS+p4IVSTTgnFFnuUZuQop5/wVXtpUYT3vTDiXI3MOzn3RsP5l/5lEiY3uaNocK4R8gosKzs69X9LvHmCVrkb3SpaWXN/0z0k+3ELdRhco+93eG4mD8xk5d/7grN5kA0C+qL7kPy924kkzL7f4KRZfZ33zr+ovv7ghNSu/v1gjOX/+T7pzq3FG4+8DP32/hMvoi95fCL9RNJoZpLf/zXv9kK7/IcEYl7ul7ogyYHQl5qu6ORryz/XuCLCfD63wXYuQ7M8Cov5N3fq+7/BLvcby2BrW38EVOm+oRL/8T5YKbu7veP9W/uDv/rysFHdyR+MeT9vH0HtMsb6Pon2/lW5Q1d+q/qioEXduW2nLBHd3shG9a3k2FNq738vd3mN90eY7t3q1PefH+16ku7vrBJu897ql6PL3ekl8n9O6uFN33fe5BZ+CEZPr3ywlffd5PTbe9oExLlRW8vzGrckV3d3vqaEPFFnyt1e+8UZ4rd3d/QkS97mOBhytJJvEXvfWT188ZG9Nid94Zq+x9c/trXVeK7l/DzsVuvr1gsEXaDNuX3e9jeZplE3uEfFXn2fdZfsnw0Z77fV11xKlj9O0uRAqEvd5tk0fnLZekdU2eEKW92eeMsbaFzMWXnoN1d6Sk9LKOTM930uT0kmvxxd32lz/a9F5VG1PwUXd9
3pj8Zu932tybwy/3p4kk/dvInaB2H8Jl/t7CZJ/CH+h/Nrfxd79oEb+t2tklEnl3DxD2kusssPz6a5G972ltbSUnf4onl939YT8ERZRe5G1+EzXd51+X8tSCQUnhyUGaX587niRNJc4ws3euWB83lRZ8P7pyzRG4+vz/f+P7vudLH/Uy0V6h6UlePrcZsEy6OMZ7QGH5Fp/J/X1iI7C1L6RF7+T0lroxRMqNdy0+RuVr4V9ENeSCIk7bnvsO8vKS7fe7ufpm/1y/Zbqjo1UntKLv4qu+7o68RCuVlO48zX+3WX9af5ML+S7/JEYbWVcPyQFZp8kRCUzUvnSJvfqP1byE3xWW/q+5D4LIAAAE8EGaYBXwV+YZjUqnd54S4R+A+WzkZQTX3LrdL3LcS4O1X8u5g9hgv/lhImoevnY4xO16hSZES3mzDVfRp7+e+QHhHyuTfcibovESNw07lotgOv5bq8sl36t8swlO/J+/6huYCxvliv/D/3J9v0JuED3dzypZZvy/7WJM9ohp7eFa1KJhl/9wTk0in27eQvcbPiwPaK6L3j0Eec5Rt66MqBMUeZ6R1ab8KNbvnPXERiuxegbB5bQe5EYnI+X/exu0rYC/9KrXwxJHJ+d5/YRm/vVDIm2+591J2Yd/w7eMVcfzzWT99RPG2Z+DZ+5QubKnv7WZPBSGHKWhxiyjkRW/uMLjMrXGDnM+3Fw3Le6rKQ8okm7WCWdang2TA9AjGPMv76XJ7rl+C7ug7pyjtOFWmeCLd3KmtcJ1dcZsaVVli8mF95wccfCnihHAdU+cFds9RIxeOuOWMnwCb+egR+xxMbNZ/Wdx96fI2nypt7GMYoNtXad2NU3P7/GmWzTW1d/mv8gtpQQdcrvXjYnWliHbM4TCoesHr1mV/J6SlfkQkurH5xfMHQQq7nSek3Wdvk9JK+nFlw+w3vvrfGJ/r6y3ey0XpaJMapHOS16gr5PcEbQcNT6dzhwl6vv1GF3KkFZ73ze70vjb8rb0avrbTR1FuOyL4WeGm17PwCVl+t3i6uHM0PjdivsZG9/jDUoazfh2/uZcXfQZlfKvaFGKAw+vrDi+7UF4IbhcqGLcPcaWtFb2ToXEVD1c6Dd3/HL30mS/kqeEvq/h/hyJeW6F5fXjgbrZG1+OK8yfXK/vcsfYmPxk6DaHQfWE7+kGaUtKTEiU3fLL1ohUTI37hQRDy4RBS1jhepaSSa+ve4gN6EctWbfuC4m1z1kNaA+xIe9jffCPgo3ufPqvBKUifnW6lve+GT0qLfrbQtlYVFGLgjbVFZMWwvzih2sQ0l//BAJLqcvfeasuuUWRiDtGv9L4iqZf8v6+NFMedIGZG6yA/jI1Gk8CD2PMUG9UFi9s/2mfmF5J94LSVD4/hZPMHRShnaV9TqIyi/7FrtIJHhxvJrvpo0Qw+Isr7uCX6jPUJ1rG6f8Ind03d+Xy8I+iSy/ibrTurKESf+0MS7qjspofc9oLpPF1g6wQlDF8f+6wRXj4kHLvrNcENPd8vrTT0T0m3zrene1pwSb36Enlygmu7u7ucP2qhJUWMe7hpJ/+X0tcF13D+Svw7S21L9fX0omzfeXPcmS39OCfUkd3fL3BTu73Tve6EfRO3+CHHV9uxpeqBIW94NrfVp9v0/VAjvk2C9NE/v+EtsFV3u7cETtOj6pHtt2XiRerMacdjf34gju76pxM3hF9VaaoTF7vntl8R1t6nTsX6LU9pVJBEIvNadyeuupGQS94S0kI+OmMKKz8/3feFW4/7t3bjf7K86elP+xOt9dEXeXd366+8vrrid7n+fIR8mNL0vx5N5fW43T902Vm8I+Gy6Segkd39Bc5xe1uzX3rfNl30X5O7Pvvyf1XTRs/5P15JSMJXNr3y8KeCLNudtfjzO+9PhtCb110WUqglfXJk/btcbonr4y1gkLd3FQ3WWpbyF/cZnJ4VJ/X/k/Un8eR9+XY6u+SLtzeka70/4REuQkndTw2e/wnY3MR
619DI3q0X1Yfq0rvGfPz17QRmBuIcdLSvDhfUnyeXPw98bAAAAVYQZqAFfBX5RmWxdbmrd6/MTd15T1eGC/+4QNnVKaPHt3i19grruMTfhN4/ZVNPOnuEb2e4JGY/G3V8v2eCXc08uelVvlljg37QyVBJ6tlWXIC297PAmeh2xKq8sIHu04TfWJYSLh5RG68t5tcKeCQ2Nm7SjL/7YXso/OC/4t8cBKyV6mj9btDSPcTYTvPXMS3zIpOjCQQKbunlUqOrsjEf1udL8q123/39/ZgLmHnkSta/HaMTNUIP0LGG/q3AvhmQ5bdAi125+uX9/FeWIblI4+4NesafCDgJjfnYcKr75IjOE3N8OvZv55jLl3+4c8Wpz6BO/f9PyeqX+OnhkZhwhEh3f9bnftfCHqxOl4zBSZ5PVOts8FfKCQo8ggdfE1JvnSpzTnWQmf7fLCN7vn/htFpL1JKdxtbgQTL+ntihngLNMlCvsVv2xhD51CVCw7sZA31d78+YporGxgyE2Q25afP/3NwvKWfgZBPHfc1oLazc7nbn+KcU1FVP9YvmCgyy0fS/gtKNBodkpcno5bJ6re6l4Haa+y3TjEx1kv0+1rZbv24QEXveVmQCXk/qvwUSKQ82piv3CAcLHKE3+PLO+M7sr7t7+4yumgL1V+P9iNFBP8fwxlLb2srrDcqdcKl6DuUB3uZCDtbW95D0W81kpS+q3Hmq8TFpfg+cHkK+j2ZYV+EeQWTyfAVbyOzsJvCurWsFZbMs/4Sds2tNfBw/pRus8JeHopS4y7/qHqPmmf98cG9on9j/f6UuhIlO3uEr43eKfl/TJcUIy8BVvE5x5K5Prt9ISTsbaLQ9PnyiyeEWRhHwWiC++RlNXV5v7WvoSWNjT5+ve6gm20s3vShbnuHRUCyqxUkjq+iUi8392yLqPUh//jT3fBNURlrMELdY/8Evh7/dJVM5HHNWsv3+NMhEV9wnqvjtm0Cb6zfvcUMrD+jhXQ74dxJv0A5nZLNfbjuno8FYnD8PsyJ927UtH5J69wWEzU+Mi1TffHvxaD+1Fxd/QpGZOqf7g7mJKG+38YRN1fLV3489yKveaaqgSd/FCJl4cp+BfS0VueO70sJn7TBeqm8+hhrhhLymfWT+/xpejuvBTOgg3hZu4vD0G5vAvdYLZCuNm/u/U9cEm7u3R+T0kkt7d99ZC3vJ66/YIo/3E+xv2lZCWVhSK73n8fiNOPW7/9duCGldt/oEfd3qtcPxEqDdKSXoXIJciF+T+38TIaSfr7K3Zv6xWS33vk9J9ckF3d3e+X0CS7y/UJP0h/Aie6f43BJzno8WmV0u3VHbveq1yqifXS0/WtdncJ+Ey2y/J/fPYJzPFd3d33rhITd3LCRXasrVjWSRC+q9d0CraLPNT5cdABpI/Cfk8jeqS+sVd743TtVlBJe89k+k2yueiGo3MX6hA71T5933eEfNVMrKycutQUEe7Ll75Ba5SGFz+X8n/FHdrONP3WdBm+RdZdX+6vd36WzMRveeXr80EWnc4OsJz9/G+8np6RUILk1X7c279RArJ+9F7Du7sJ5/G5tThKNktjh1YZpH4S8OSRrb6nl/HmnhG8tmhe6WXLTsTF934JD6nnbvQk4fn1/2QHVtWak79WL3pQ3Bku2rt7cbSElve9u1xgn5BBL3llM3qxJozfu/+FNsYOLm987Q+79p0qfaHHyg80iBtec+Or7vpdJ0J1RPiJt31dFQ4t73e28vv+F8kQYp+/G16VIsKFenz848N83Cxljz92iiL9b743OXSriCt3vvrHSC9osYlLLk5hb6gkJu528QxxeXC0MavHl/qoYyRBCn75M+SIh7R801Of6/8kQWfFy4j38PfFwAAAEmkGaoBXwV+YYUWqvy4JHrBL/4vKwGmhyeq/KTcwLXiz3Oy0KclQwX/7BOYj5AEUGn3cbHikPcPw+PbBQwMdooxzAJptP/Dsjr1dJlnh/pGH7hLyxsi+xv+e2
+r/V+WOzm1bbYmke/8ex/GHM+73rILmjd51fcUblx7HbFhX1TDL/7iuSrhGui1bou9vBATYSn0xtdSgSzYhvrdMMgfNti2+b+xDjg/X/9e3vUyE9Q/lPp8n6b7Z2M5QM4i1JhlcA6JDzdy3u/DwYPYyIwNtYW2wD7Suhmpg7IQifpvUurkC3fQE273f/jStZlL+/VZY5XGs1wofRpYnwksjXQIwatgWB3DXhdsGlei/J9Jau43lHQcZnvvznhP/amn/VOoItymcXje+N5IhuC/eBNuHLvsHn9nuYndYrZk/3vQghEG9vf5uGh37lUv/WU+PSFsJl/8sUM4I68Cjo/axD7WO1vjiCt6ywYh7GcPpO3foL1LRqB0luN001+tuBJuwLjs0mF2orEoZFjnLqFw+H99yf/KjpT+M4+kC6ppf8NSrvtJ4tgqKP+tuX4SO0asBz3Lbrc279YnhB4OH4e3tOs8EmE/tlzg0qmlLor7EkCYi97RF+1LcFl90wS/m6mT1DTQuKfFbe0JebeK/gilhflk+klV3HX3fGXT5qyFZPSX9RPaCn+EXDu/T6TzxZAn1i/WMzUn73Vb5n7UvReyfpPiagk5l/WT6reRwS334y9v7Qg12qecJFo0/X8I+CItz/uyfWq/5LLDFxL/rvBKKgletNQw9WtPRa8Ugod855sWcP6r8m3Znu2/CBoKVcwrh9Jof4d7n+j9S/BNblxSvgtn4I5n3SZv33WWCe4zDVAowRK1frPkQV698ewv91wmLSxfnCPgrufvRx2svezb31XkPe+i+i9V2CYUGpYdvbyheYHRbrPBDIGk3y66TY1WOOcNve7yp3/BDZvwVRYJe6SdJlxPFtegRb3b4iEX+FO2WL7LApEQP38+gmT477t+tvdW7El3ZIuCW87HvFk93zXUlkdf7lFlx9UXqukK6qwRR2V86Zfbt6X8I+tb1sE97u9/+jvtbTeqvVfWvdeuSylffVZPX/2S7vpWPxbhF/QJx13cVuK3b8Sd5g1UJuQPj0e1CZXsnmb6LJOBbvvp+oIfLE6VWEqT48W/eT6U38xHv9kLe9LePETSvh1ijX5r6UskeV3M3u45fELBf4Tf4JobtnruRqV0ztSRBN2XlOmOpi0P3jC5dzX9RUobeOrtrsbZNle31Zkuvrd4kg493ve79u7lJe99Y7eKyP8dTDYcJe4j/f4KjRlfbY/Tt3G5EgS/5OLvq3gT791qtRrKcgq/nD+j+tUga/HdVR7u/fv167UJXx+cueH1DIkJNrP6+Md8J+QVOYdZeEzXd46+fJ90+mWCETDDsUHeSC1ZNdl99JegRxW+RMn1k9Zjz3hjJYiaVq8twUFxWf5clT16OwmV7880r8kVvJhYbF0Vgk7iHBfbWTHWsuZOyPvF+yk/hjJECFT6f2dH8kRMwGH6nynitmuskcUlu57mvPvgsgAAAE/UGawBXwV+LGc2Y6XL3rcXlntX/MTbDWa17mn7/ynujI4MF/9wRG2h6ZTmvwQa4r4eRNhjwOA5OD1u8PCXx8f3GXvuz4Td5qgjgR4IqSt7E7ZAd299nhCcmf5zUO+q5o21uHcdX7CNfnSgl8e+nfzTe/LCh7mc3Hfcepb1y3ezRaV6Qk2bWVuQfwr6pRv3BPEg8UYkfhKvadhhT5FmX+9xpG7DGRyB6fb62H5CiTBbfM5oWj+y9TZ2Tn0xipfL9rZrrD3lkHJqS/743WllRyO6mvj4rU8Fpqv9kKJZ9LcASPz/62V/yhwoI9NfG/+4U86TgRf1rP8Ev+nfgl8vcWbSRSVeKYpKposIFQFzBu/IvvOSMFPjvIpdULreH73rVnDt/decMoiOO1LsFN093yDxxXFtl+9Jw/hO97kKHUc/Th71eX9ELoVlzxWPk4KF/9wTiuGcEisxXlOrph7YUn5SWG5RqJq/yBaCEdX+chXBclv2h6v9+4H0d3Z0iddwoQO7xjXXXOF+ncaPUi4M56ijF8X93b5c7k
vv8E0g5fLR154a17qx+UpA9ff/ZyArve773/ye7rTdgovfL/vxI0nnXvyRmEXyngsGPP7l8fuPUd7aaElbKW4fzX3QJLviwelXj81//HsjalF67Z87/EY6JDTsInKkbW/0EiO793rqwoV33pbu+7p9N9Cy0jXu4eXCurdNCOhJ+IRivva+CapaHLXvucQCbSUnpvqvbCe9E97sahfdO76aU0EN7y231Fbve722VupQbVNwnfd7uEX9/Qre+eGT0q3t3e+lSJCYmk1ve0k8Ec/7pwvMqCkf7cd4S8fuV+Hf+TUu2hqE0ROtEmvsqryaV6raDm93+SK6Tqrl7/FR4nGcvMhfcND9S5GIGFDxg1fbu4R82CX26qNLlT/Nef76ZCi05z7Ppm7b/Lbe6J9at04SuVI+xon82uh9k/GfO7Zce71sO2/vWe3PV89qpP9eoRl971z5y/L+J7RQrbHvhL2ZvVdlvd6toXBJz8o+ayfrbtb8uKj8n6XqT16r9Ui6Ke/CPiH7u7+K8KrMbBKKe77uxp88MFCHBhrfByj7VuslD0/+EM6sw+tel3d97Sf7L79p5/X6wt5zsC2XeX2+aFyOiKf6vyZ5fX/Zbu701dBI5e1HPo3n9N6ljZo34fJfCF1zdJLuKp53ZpE2+sdy9x2v7ss+fYsOwWH3DGaQYQYkj/ERuTqPFR7P14go1o5Q7Zqr5Ln/BXAAAMBW1vb3YAAABsbXZoZAAAAADdnh1c3Z4dXAAAAlgAAAu4AAEAAAEAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMAAAV1dHJhawAAAFx0a2hkAAAAAd2eHVzdnh1cAAAAAQAAAAAAAAu4AAAAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAQAAAAAHgAAABDgAAAAAAJGVkdHMAAAAcZWxzdAAAAAAAAAABAAALuAAAAAAAAQAAAAAE7W1kaWEAAAAgbWRoZAAAAADdnh1c3Z4dXAAAAB4AAACWVcQAAAAAADFoZGxyAAAAAAAAAAB2aWRlAAAAAAAAAAAAAAAAQ29yZSBNZWRpYSBWaWRlbwAAAASUbWluZgAAABR2bWhkAAAAAQAAAAAAAAAAAAAAJGRpbmYAAAAcZHJlZgAAAAAAAAABAAAADHVybCAAAAABAAAEVHN0YmwAAACmc3RzZAAAAAAAAAABAAAAlmF2YzEAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAB4AEOAEgAAABIAAAAAAAAAAEKQVZDIENvZGluZwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA//8AAAAwYXZjQwFCwB7/4QAZZ0LAHtkB4I/rARAAAAMAEAAAAwPA8WLkgAEABGjLjLIAAAAQcGFzcAAAAAEAAAABAAAAGHN0dHMAAAAAAAAAAQAAAJYAAAABAAAAGHN0c3MAAAAAAAAAAgAAAAEAAABbAAAAonNkdHAAAAAAJhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWJhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWFhYWAAAANHN0c2MAAAAAAAAAAwAAAAEAAAAPAAAAAQAAAAgAAAAeAAAAAQAAAAkAAAAPAAAAAQAAAmxzdHN6AAAAAAAAAAAAAACWAAA1rAAAARQAAADbAAABfgAAAb4AAAH2AAACXgAAAoQAAAICAAACjQAAAsYAAAJeAAACvAAAArkAAALeAAAClAAAArEAAALjAAAC9AAAAloAAALZAAACiQAAAr0AAAK6AAADTAAAApsA
AAL+AAADEQAAAtMAAANpAAACjgAAAuQAAAJbAAAC+wAAAzEAAAMjAAAFBAAABJUAAAVVAAAFCQAABTQAAATYAAAFEgAABYsAAAS9AAAFVAAABPUAAAThAAAFRwAABbIAAARiAAAEJgAAA/wAAAO/AAADaAAAA44AAARGAAAGSAAABekAAAUtAAAFbQAABHwAAASTAAAEmwAABO4AAASAAAAE3AAABMgAAASfAAAEhwAABKYAAASfAAAEZwAABFgAAARlAAAEjwAABHEAAAVpAAAFZwAABYkAAAWGAAAFzQAABQMAAAUyAAAFWAAABTAAAAUHAAAE3wAABQ4AAAURAAA3RgAAAesAAALYAAAC9wAABAMAAALwAAADmwAAA8IAAAP9AAAELQAABA4AAAPfAAADtgAAA9cAAAQZAAAEUgAABMgAAASdAAAEvwAABF8AAASUAAAE6wAABSYAAAUGAAAE5AAABFgAAASxAAAEgwAABLUAAASuAAAFPwAABIwAAAU3AAAF9AAABXMAAAT0AAAFXAAABJ4AAAUBAAAErwAABSAAAATeAAAFoQAABScAAATOAAAE7QAABN0AAAThAAAFnAAABRsAAAT3AAAEuwAABIcAAAS/AAAE7wAABOEAAATAAAAFBwAABRsAAATZAAAANHN0Y28AAAAAAAAACQAAbdcAAOgMAAEi4gABh9sAAd8wAAJLaAACqSwAAxNjAAOmUQAABhx0cmFrAAAAXHRraGQAAAAB3Z4dXN2eHVwAAAACAAAAAAAAC64AAAAAAAAAAAAAAAABAAAAAAEAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAkZWR0cwAAABxlbHN0AAAAAAAAAAEAAAuuAAAAAAABAAAAAAWUbWRpYQAAACBtZGhkAAAAAN2eHVzdnh1cAAC7gAADsABVxAAAAAAAMWhkbHIAAAAAAAAAAHNvdW4AAAAAAAAAAAAAAABDb3JlIE1lZGlhIEF1ZGlvAAAABTttaW5mAAAAEHNtaGQAAAAAAAAAAAAAACRkaW5mAAAAHGRyZWYAAAAAAAAAAQAAAAx1cmwgAAAAAQAABP9zdGJsAAAAZ3N0c2QAAAAAAAAAAQAAAFdtcDRhAAAAAAAAAAEAAAAAAAAAAAACABAAAAAAu4AAAAAAADNlc2RzAAAAAAOAgIAiAAAABICAgBRAFQABKwABwAAAAAAABYCAgAIRkAaAgIABAgAAABhzdHRzAAAAAAAAAAEAAADsAAAEAAAAAHxzdHNjAAAAAAAAAAkAAAABAAAALgAAAAEAAAADAAAAAgAAAAEAAAAEAAAAIQAAAAEAAAAFAAAADgAAAAEAAAAGAAAAIQAAAAEAAAAHAAAADgAAAAEAAAAIAAAAIQAAAAEAAAAJAAAADgAAAAEAAAAKAAAAAQAAAAEAAAPEc3RzegAAAAAAAAAAAAAA7AAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsA
AAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAABKwAAASsAAAEqAAAAOHN0Y28AAAAAAAAACgAAACwAADXXAABrgQAAwYwAARKNAAFhWwABztsAAiToAAKY1gADEjk=", -} -BASE64_FILE = { - "name": "test/test_files/sample_file.pdf", - "data": "data:@file/pdf;base64,JVBERi0xLjQKJdPr6eEKMSAwIG9iago8PC9UaXRsZSAoVW50aXRsZWQgZG9jdW1lbnQpCi9Qcm9kdWNlciAoU2tpYS9QREYgbTk3IEdvb2dsZSBEb2NzIFJlbmRlcmVyKT4+CmVuZG9iagozIDAgb2JqCjw8L2NhIDEKL0JNIC9Ob3JtYWw+PgplbmRvYmoKNSAwIG9iago8PC9GaWx0ZXIgL0ZsYXRlRGVjb2RlCi9MZW5ndGggMjM2Pj4gc3RyZWFtCnicjZDfakMhDMbvfYpcD2bzTxNhFFZYe90h7AG2tTDoYO37w9S1O1A4cIyo5Bc/80mALR6pLVYY3k/hJ/RMJh6J82d4e4Dvlo2WRu1tb6UEPV538Hc4H8NqJ3C8DAWnDIQpd4lD2LdYomzcZ9O+Km1qWG0VSCRKG+xQD4FuTZeWdTcR0CiZiqtAPYXOGKOhEBnUD3hC5M0a6lcoObInwdIErsAHcI+F3cknsB3ANFJCU54Byf6B8AAvdZi9s8WokcXNFrvLEj0n0gXu5Hm8TJyiK6nm+54Ipd3IXnQiae5H5vyxTf724RdvlHTtCmVuZHN0cmVhbQplbmRvYmoKMiAwIG9iago8PC9UeXBlIC9QYWdlCi9SZXNvdXJjZXMgPDwvUHJvY1NldCBbL1BERiAvVGV4dCAvSW1hZ2VCIC9JbWFnZUMgL0ltYWdlSV0KL0V4dEdTdGF0ZSA8PC9HMyAzIDAgUj4+Ci9Gb250IDw8L0Y0IDQgMCBSPj4+PgovTWVkaWFCb3ggWzAgMCA2MTIgNzkyXQovQ29udGVudHMgNSAwIFIKL1N0cnVjdFBhcmVudHMgMAovUGFyZW50IDYgMCBSPj4KZW5kb2JqCjYgMCBvYmoKPDwvVHlwZSAvUGFnZXMKL0NvdW50IDEKL0tpZHMgWzIgMCBSXT4+CmVuZG9iago3IDAgb2JqCjw8L1R5cGUgL0NhdGFsb2cKL1BhZ2VzIDYgMCBSPj4KZW5kb2JqCjggMCBvYmoKPDwvTGVuZ3RoMSAxNjgwOAovRmlsdGVyIC9GbGF0ZURlY29kZQovTGVuZ3R
oIDgzOTM+PiBzdHJlYW0KeJztegl4VEX276m6t/dOeiFJd9a+nU4aSQOBsAaQdDZAI3uABIkkQCQoyBJQcCPOiGBwHweVccRd1EE7i0yCjjDAuCAIo4y7grg7Iui4ovT9/6q6wzLqvHzvfe97z++bezm/OnXqnFpOnXtuXdLEiKgHQKV+o8vKR7HBrDcR90I6bPSE8ZNXVmy4k0hZg3rz6MlTSqxPm64jYhHU+42fnF+wfOjmfdAX9dqpZWOrJtxywddoSiJy3Tp7Qd0idjf7Eu2VaJ8x++Kl2j0Zr/6TyH4OkbHy/EVzF+xeUb2eyH036hfNrWtcRF6yoP8R0HfOnb/i/LWfPPI+UaqTyFbSMGfB8ttq5/aAbhnI3FBfN+dg0jPojx2E/uAGCNwDLCrqmCPlNCxYurzv++ptWBzmQ5/NXzi7LrV3+h6sB/3R8gV1yxcZ1iU0QR/zIe2iugX1ntr+bxMZUGVlixY2LtXzaB34+aJ90ZL6RbmvjN2KrrEe29OQKWQmTi5iug5e+HI4fUkj6I9kgtxJ+TQVo/8JugbUFZKX3lP0+TMX7E0jo+Oo1EnHHj92qVNKTruGS4mV+uI21C2pm0Xa7BVL5pM2d0n9haQ11M9aQtr8uqUXkXayTzKkrn94ZvmKmY4RX5vTzVJ873s980T5woThm489fnyuk8x2VC0nRhSlPc5zrCYm60lnAEO4GdaWDyzAzWgQbkbDcLO4BcnVJsW9koT4GoMyUfrLSOWonUPjaRJNg+eIyk6t6++dvH/iAUVZw26CN82G9YYBmFJ6rFT+Tudzt9nAbUaVi0ulf/Pe2PHjxlMYI00zvBydyAaYRrLWsNg4jK8GDU+KHSb1Z/fl/+R6muXLe3fs5hnyfkvcav+u23BPfF9LaAYpckd7x3ZU7mVSbF6YKYP3TvLsFB4uuLB+CXRPxbgPhB6H55mkRGnFKYNSZH/5sb3T35TYgCfrJ07//+cyPEt3GabSvafU7z+1XW08+WwZC2n2KXr3/HtfpuspVRQ0XUSpirxDF1BTnGfYjYvjPIfPGuK8ghg6I86rp+gYKA1aMd4IjqiYltA8qqP5NJYqkQfqUW+EZCGJp3MQnuD+1A/tY6VkIS2lFbQIWhqdRQsgnwvdi4Aa9QGd7E3DU1IP+TLwdZCeXjup9zA0CzBCf9waZtAg+/7paKWoLQEvsA7y2Az7yjHnx8ebhxEa0NYYH71RruZi4BxoEon3RdhmNXdvE01GkhkFhTnGwZFINzZL9+wtZpGppKUlxpENlJBg7aa95YS9NW6fAHI4bN2zt1ljEzbLFCmNHCCnw/6f7bouuy1mZTnd3uVK+N+2d4F69Ejsnn1iQmzBNjmuNMJLlZKTnd2zdyTGrDC4MzZ1SgZ5Pe7u2bucsQknyHEFRx5QekZS9+yT3LEJO+S40igDlJmV0j375B6xCTvluIKjLJCmebtn70mOTRjTSI1x8nXrz07tnr03JfbEwD4txlE2KDeY0T37dIyTTnLmmTGOgqC8PK179lkZsQVj5v4YR+Iw0LdvoHv2fp80FJPPiXEyCRQUBLtnn+OXhmLTesY4JCoc4Ab36p59zxxpKGaeF+NoMGjYsN7ds8/rGVuwRkitksPBhai0pKB79v1g1Q9lLtHAGIcXN1FFxdDu2Q8uiE04T44rOKoATZ48snv2I4aASDq9OMbRZNCMc8u7Z19yZmzCODeNiXF0LmjO7Iru2Y8plYaE5Y6LcfJFa9hCqaA0w0OUqgZFXOsfgT4WZXSe/rFoFyX/FModcSLaSJvYPNpEW2k7Owqrx6mT2uk5RGcZ3UmX0620Gm+K6ZBci3fPJLxpy+hWlqq34+RyD96499Ae6E6jK2kLpTCv/gmtpFXKy7BahQyTDRdNwBvtenaOvgynqwPqb2kIzpoX0SLWpFfpN+i36PfTA9SpPKcfR0ZMw1Jm0x79c8Nr+lsIjxn0e7qDDrBbLE/gzT8N54NO5Y94961XalSmz9W
PYQZ+ugRzUPFu3cO28RB6r6ePmJddrpSil/v0iL4TWhlUg3foetrCBrHR3G+YoY/V9+AM1oeWo9c7qJU24+6gv9AbzG44qt+vH0V66Y3TwEr440W2TYkevypaJBwNL/WiQrQspKfpWdrHAuyvfKHBbigwhA2X6vuRE/vTFMz2IVh+yL7lV+JeqTyjjtJLkLlX0c3C2/Q3epel4Ww6nk3lvfhCfpeyBO+03vLEMAfv/GvpdvT+DguxzdzO9yr3qY+qPxgzowf1ROxIkP6A75y/sgSsVGON7DfsFfYeL+Uz+R/4IeVW9WH1JVMdVn0eTjPX06P0LXOzoWwiO5c1sMvZanYzu4PtYfvYx7yYV/IL+RGlQVms/EUtwT1ZbVR/a7jGsNb4cbQqujP69+i3eoF+DU1EPFyF2f+e7sLKOmkvvY77AB1iBmZjibg15mdT2GW4r2TXs3vZRvYwa8co+9gh9gn7kn3NfuA40HEjT+d+no07wJfwS/it/E6+F/c+/hn/XvEo2UpIGaSMUKqVhZjVauUm3E8o76pp6l5Vh58LDOsMGwwbDY8athuOGu2m35jJvPvH+47nHX8nStE10XXR1mi7/i5ydCpiKoN8eE4n4nxVhzPmcpxRH0Ccv8zs8F0ay2Mj2TnwzEx2AVvMlsOTV7P17AE598fYU/DSq+wI5pyALwcx5758EC/h43Gfx+v5Yn4Tv4W381f4McWk2BSHkqzkKaOVGqVeWaqsUNYpEWW38rZySPlG+RG3rlpVn5qtBtWQOlqdqS5T71I/Uj8yzDC8YPjAaDUuMF5j7DB+YRpsGmmaYJpoqjHdaNps2m+uRXTuoCfoz6emAnZQuUopV56gG/gANZW/yF9EPM+kOcpYjkjlG9kafgVr5zmG5cbhfDgbR0fVIHz9DN/Av+HDlbGsgk2mC3j/WG/GJPURkd/UHXRYfQprexE9Lzfa2ZX8iNFOrfhcKcSYf1P6qSHlBXpDOcBM6j30pmplHnaYP6RMQBT8RR1pqCK/cic9pixmV9ATHGnR+oP5OsTxOPYI8kIlK2DfKfhi5+MQRUOU9+i3dCF/jQ7jOV5Dt7E56ly6gQawy+kjehBPRS/DRcY8YzJ7ns9Tm3kP1k5cfRirK2Q5TDEk0dWsRllvPMJfxyl8r2qld5Q/YfZ7+WPKWPWoYRJrwBNwBV1Di/WraIWhSn2JzSWFTaVc9SCy2+VKgepHuRJZZQZy2mY83VuQB4qVsZB4ETnnIC6mIEOsx3078oSKCJqHZ3wastiL1G6s5B0015DIkHXwBfRCdBJN1x+kO/S5dJF+C/VBPlitX44eN9IHdCNtZKuil+G8n4Un5x12jmEU32sYpffhzfx1PpmvO31/4e1c5qVPcT+Gykh8Jzerr+J1U6Rfp/8D0X0GMuwdNIvOpvexys8xwhhlGw2IjuMt+ihlEdZ7gCbqD+k+ZqUGfT6+8Z+iB0wGqjOFsMcR9hLWexnV80n6UqU+Og9+uBFeCMNby5B/rg2XTqksDheNPHPE8GGFQ4cMGjigoH+//L59eofyep3RM5ibE8j2a76szIz0tFSvJyU5qYfb5XQkJthtVovZZDSoCmfUuzwwqlaLBGsjajAwZkwfUQ/UQVB3iqA2okE06nSdiFYr1bTTNcPQPP/fNMMxzfAJTebURtCIPr218oAW2VMW0DrY9IlV4K8vC1RrkcOSHyv5mySfAN7vh4FW7m0o0yKsViuPjLq4obm8tgzdtdispYHSemuf3tRitYG1gYt4AotamGckkwz3lA9rwZd+AiYVSQuUlUdSA2ViBhElt7xuTmTCxKrysnS/v7pP7wgrnR2YFaFAScQRkipUKoeJGEsjJjmMNk+shtZqLb23NV/X4aRZtSH7nMCcuhlVEaWuWozhCmHcsojn0ve9J6vo3F1atfrU1nSludw7TxPV5ubVWuTuiVWntvoFVlejD9jy3FG1zaMw9HVwYsVkDaPxVdVVEbYKQ2piJWJVsfXVB8q
FpPYCLWIJlAQami+oxdakNUdo0gp/a1pauFM/SGnlWnNlVcAfKUoPVNeVZbQkUfOkFW2pYS319JY+vVucrphjWxIdccaecCpTf6JNclJdcBWTTniWiRkFzkJARLTZGmZSFcCahgqoH0rNs4dCDVc1g1VkDnZkXsRSWtvsHCbkwj5iyHUGtOavCREQOPzZ6ZK6uMSY6/yaBCvi5ESoob2Lj4RCkbw8ESKmUuwp5jhS1gf16X1xBw8EFjk1FHAfTYBv66qH5cP9fr/Y4LUdYZqFSqRpYlWsrtGs9FYK54eqI7xWtGzrakmeIlqaulpOmNcGEMnt8n+SkiPm4Il/DmdKj/KGYRGW8h+a62PtFZMDFROnV2nlzbVx31ZUnlaLtQ890RbnIj1Kq5R0Hud4uiJbEZQzTiiLSpU9oubin1EG9ZwOkxlRKSVMGxVx1o6JYbXV7++mUYd+VFjJ4qRZfJqRYaHT68NPq582PXuzggnjVVlROb252XpaG0ItNuBZ8QIRT5VVfq00QlPwZObiX4e+baig6vRIGC4rFQqIv5goXj1NMT3OV+MS0dmn9ygkuubmUQFtVHNtc12H3jQroDkDzZ18O9/evKi8titwOvQta9Mjo66rhq8a2DA8FJxKWgJszcSWMFszeXpVJz6ztTWVVa2c8dLakuqWHLRVdWpEYSnlQiqEoqKJClUwLLKVm6V+emeYqEm2qlIg67M7GEmZuUvGaHYHj8mcXTIOmRqThaVMXCLHlFZWnRo98pGsxmcdLO7CAXs6vlUclMlSw27Nx0rNGZlZmL3LmeUgs6dDj7bb7SVTwHzZbrNJ5ptwtj0BXFCzMF84IYFPsWhOJ9DqcAC9UtKhfxXuabcbp1jSfJnORGHqtCbAzGkX/Tk1pmEV0o7QZbswlYywBnMMw0rm23bRC5jvwrAHV5M1fIY35PwmJK+aEceBI+LVmsMAKhpxfISg/v1KV4QHK+kms9FsMKtm1ZjqTfNyo81qtyZYFWNySlJKjxTFmK54/MydCPCaM/wsxeryUyjEQqE8XFexmgEuf4EnxZPiTk7iiTyQ6y8YPGTw4EEDgz2DAf9d7PtHp19ZvbRx3KU371kVbWGFNz/Qv3zsbfPHbYruNmxJzjxnVnTvzoei0YfrCjYN7l/+yYMffpuXhbXfi/OL+E60UXs42WjIMptNJlJU4XyrJctGZhOCNJzvdA80VSpna1YtgVvTElQLF/6zSI9arGIjLN325bF2i+WERDr1aJdT7cPP9YbGOb8Kdbl1rPTrOOc3NWO/ev+kT92F+SOcwrVwSrI/TveqOT/epYR+/IdytWHLpmjRn6IJmzCj+xFd2WKFzN5JCVhMSo/kgaqSZbHebd1n5VYD5zYzdqYryMxdQWYWQWYRazNrJpOxQ/9crgnMl2GbWJTRKVaE+sFwns1mnGJkYj3GmqYElsBt0kM26SGb9JAt5iHhTyum8J9cFbZJX5lFr6dHX0rcUVoC0xImJNQmLEpQh1d7QzWLu2LxZDTWxCTwlEA4r2hEYU2+DEkWGuCC70AB4P3b+bHt248bDVuOP8inHxvF246PxUzXITby4DkD/SZsZxw+M5BZU5nawR8K+01ckUtU5BIVuUSl20HwzU8eKOPPPVAf1sT2XOy02Ot12/lLhi3H/rVJ5I3Z+keGtw378X2dzlLCFWkOluRMSkr3pKerqlNNsnls6erDns2JzyQqHo83nWuZYdf4HuM94bQqQ5VlmnOKa2aP6Z6Z3qlp09LXeu7gztQsRXFn2SzJXbGQ3BULySIW5BKTg5qJ4aH4SsrBfNwuVmsS4SEWCeaoXCSYT9vFBkplsUaT2NkisW5TWlMmy3RI/zmk/xyyc0dQuM8sBGQXAjLKEDBKZ6VmzJ5x4vGoGSuyzLiuTe4SUNHhosPY35rFVFNTs7iHk/wFqkgZaiA7hw9x0oACcg3kwUA2zWZr2OAX2KhH26Obt+6Nbtn4HMt89U2WvuKTm1+Mvsp3sQX
sj9ujD7x1IHr3E8+x6U9Hv43uZQNZehuz/S76Afx/D56sTYgPL2XzYWG/25bI3IMzpvvONy/wqRanWLJZokliDiJfeiZBOEQw9i7G1sW4O/RDbe60gSiPtmX3HOgS9cyeA53x0hEv0f5aW2Yw1g59Z7wU7eGzwOQmnp1xtjbZNiNjQcYSy/LEFY5V1jWO2xIednQ4Pk78yOFMtNs1lyPJ5XK4HHaLO53701KsRnzLJNgNXoslxZOWmuURM46/b7aFk8VWeDzkzxbZkbxehyPRnNUVKlldoZJ1Im1kBRPvNIoAiaeN2FMg88VAmTmMwi3GGi1nUU5TjpKT7ZUB4ZUB4ZUB4f1fPlDxVGH8aaqIP1eB4Rt/LqfGAyf1fW/8beXEHc+todBxVArz3Z5C5vIUrk7sGzJc4dwpwip06kWiP5zQwlZz2FHocA5zuYdBVM0WQ9hJifo74bTUQld2aqEblBjOKHRmJ4F8oOTCeCfV4sWWgg9JowlvN0+PgNKX45UWcEEs328B/z28eefuS3e9PPaMKefoX22fctG0Pv6Kd9k9q9aNu+2+aD/DlvHPrbjzlczcnHHLootZ/6uvG2ozHV+mDBiyYnTDNeKvsalEpotFpPLLO8mhR4XTSqZw6e7EWFbCw9ehH483KCca5LMpjhG9BKcaY7lOIJfbpMqDhCKR2+Nm4nGXZp92MV/JERDv+9ttkBjA4J0BrhcFXb3cQW8hDXYVugd7z6LRrrPco71VNM1V5Z7mdd5uvt3BW4zi/BQe4GRpqaHkgYaB9jJDmb0iudJQaT83eY5hjv3C5KWGpfbLkh2GZLtCzG0mswMnNcRpkbhc2Moa5nIXFqaHsxTVYOBGE156VizXkpDocNjxGe9OTvF4vch0I9oM5NVEaXe7RBmenmy2aIQ3pcYoiTHyGszmrGRvUnKy1223WLKS3WDdLrvDoTldSU6ny22xm73JBofLaSeOKRkUr9PhsFjMZo45ed1ul4vMaR5PmrPYwiaSRnZgMihMBjZxs6YxxlJTO9jalljw1qSljj2e5j1+PC31uHdceX3Zhyci1hm/RbBifa4uKixcPbZvaPUVO1f39f60QOCtTnTu3AkYsbOLOxVYRcQxuSLiwoG11W314pEbOrQawlwI8yDsJBKnd6qI2CBJhKTNHjaEoVSN52RJTezsdvrlZwN6pHgGD0HhRtFjAAuwYE+jibG7opc9eyAnbaiVeT59aXwgo8+HO6IXPRl9oafJkxR93rDlx6Lbfv/PHOWd42nRz/61tl157NgoteY6rX70D/fhCN1JlcoZbUGvb99TSi86COJKr9ZQpq9T6alktg73hTuUQJs7ucBR3EcR+SRfogZcCHoctFURv8WYqYgzoRO4EtQEehy0FbQPZCQCilYNtBC0AXRQtCiZSkar5nMW91RSYZuKt4ND8dARkA5SyAfMB40HzQTdCNoAMko9IVkIWgnaCjoqW8KKp/WWAZi7p3WtLNoumF8gq3Wx6owaWW2bVh0rx06MlWVnxdSGxdT6D4yJ+5bEyp69Y6U7t6BJlNaEgm3FKUoKFpmCiS8CMr6THAh0H92tJFMExBVjXBJW3G05wYINWxWVmMIVRnPIp29TWGuCq6DYynV+hNzk45/zw7EWfrgt0VWwofhsfogeB20FKfwQ7nf5u7SSHxQ+BxaBNoC2gvaCjoCM/CDuA7jf4e+Qg79N+aAi0EzQBtBW0BGQib8NdPK3xDe+RMEXgbj47Qtqb2JZbwId/A1wb/A3MLWXW4cUFnRKJpQfZ3y5ccaTHmfcKQUd/KXW73shooLYaUTUk0o2jaQBSnZrbn9fh+JtHTHP18Hfa9NCvruL+/H9FAFxzGQ/Rt5PGmgCqBa0CGQE9wq4V6gJdBPoblAEhCgDOkEa3wXaDXqF+oHCoAkgM9/XimE6+N7WYImvOIW/yJ8lDzy+hz8ny938GVm+wP8my+dRZqHcxZ9pzfJRsQ3tBBsnSifKfLQb+F/bctw
+vdjFt8J3PmA+qAg0HjQTdCPIyLfy7NY5Pjc6eZJ2mQmarfSJLB+ke80UvsAXDpYiADUBwWFnggNs0DYEeTi47g5UBQRvuAWcgODV14ETELz0KnACgvMvBicgOOcCcAKC02eCExAcXwkO0MHv+nNOT9+Q8RcyrdjBL4GXLoGXLoGXLiGVXyJu+l4Vc/tDa14ePLY+HOqV52vawpqeYk2TWNO9rKmeNV3Jmq5iTSNY03msKcSaMlhTFmsKs6Yn2VC4oomF20+rFoa9rGkXa9rEmhpZU5A15bKmHNaksSHhDu5vPWuALMpl0VYsHjqUZ45E9nFwPzzqR8z7kRO2AveCdFkLQ0nLjimnZokyuy2vKFbvO6xgYfEYvgOGO7ANO+gASMUG7UAY7UAnO9CBA1gEmgnaBjoC0kFGaGdj4jdKdADzQUWgmaCVoCMgo5zOERCnhfEpPi4nlh+f9HhR4ztwiz9i+bk/nOnMcIacY5QbM5gji43P0rP4EEoRv4lwu8yuDpaw+duE775NIEuxhd/Ab6RMbMRN8fLG1u8zfR3s9tbgk77iZHYbZamIOlZIQZaLcig1yvogyjCLciBl8EdRFrRmTIWZozXY27eFJQqrzb7vM973fZLRwcF+nPGk71WtQ2Wtvn9A8uhm3/6Ma33P53eYIXkq2MFQbNGkamfGUN+mXVL1KjSsb/VdKYrNvisyRvsuzJAN9bGG8xpRCzt8k4LTfWPQX1nGLF+4EX1u9hVlnOcbEdMaJGw2+/phCqEYm4fJ9sqQgwayZIdThnSwhnBv0zpTlWm8abCpwNTb5Df5TJmmdFOS2W12mhPNdrPVbDYbzaqZ4xiTJM7LIXGKSzLKH2gaVfkDO8k7Ocmf1Mmf3XFm5nQ2RXooFbxicgle1ttmU8UsLfLN5EAHs06cHjEESljEXUEVlSWRoaGKDpM+KTIkVBExTTi3qoWxG6ohjfA1HYwqqzqYLkSr0sX/rXcSY65V16eL8oxV11dXkzfl4iJvkXukq3BU2c9AbRxPeft7T+MzI+sqJldFHsmsjhQIRs+sroj8Tvzneyf7kh0tL+tkX4iiuqpTGcm+LJ8k5MrIsurqig42VeqRxr6AHiLmC6lnxotZ6JFmzorprY/p5cIeejmigJ7FQrlSL9dikXoqE3otjTnlZS05OVLHo1Gj1Gn0aKfq7MqFTm6u1Elpol1SZ1dKk9CJjJQqGRlQycqQKiyNMqRKBkuTKlNPquTHVa49oXKtHElhJ3UyYjoJB7t0Eg5C59/PVb941ZfgFNY2vHr2DPGHi9pAeT2oNrL24gZvpGmWprXMro7/RSNYO2t2gyjr6iPVgfqyyOxAmdYyfMbPNM8QzcMDZS00o7yyqmVGuL6sdXh4eHmgrqy6bfSEgUNOG+vaE2MNnPAznU0QnQ0UY40e8jPNQ0TzaDHWEDHWEDHW6PBoORbJGJ9Q1WKmkmp8cMmyjdusiNfadH91SYpz0UgZvMP93ivTt+C0spFsoeqIPVASSQCJpj7FfYpFE54p0ZQo/joVb/JeOdyfvoVtjDc5IXYFSii0dFnjMvKWzyuL/WvEBdHSZcLhMQw1/tKFtvJIuK6scSnh5JyHk3MRTs4tJhOktWJJkWFdMputHF/dMWFfCIcJoaKcUBSyEUJmscQVf7r/y+Kl/Bxt4k+2sXAWW0qN1Uokq6KSIxVUxv8MsAVnKfF6aKzGAhtZiDV29RGfduxrVxRizV20dFmci/tiabyMWcKkscslJy7hrNAJjy1Fh+JSSGHiMigK4/IL6zPbNvrOrBNSoB4lC1n042Qlq/zNjA1oJzswgRKAiRId+OI+Tk584B4nF/BHHENdwB7kBiZRD2Ay8AdKoSSgh5KBXuAxfCF7wKdRKvh0SgNmSMykdGAWZejf4+grUKNMoB8H2+8pmzRgAPgd5ZAfmEvZwCDwW+pJAeAZlAPEdy4wT2KIeurfUG86A9hHYl/KA+ZTCNiP+gD7A7+mAuoLHED5wIH
UT/+KBkkcTP2BQ2gAcCgN1P9FhRKH0SDgcIkjaDDwTBoCHElDgUVUqH+JL8xhwGIaDiyhEcBS4BdURmcCy2kkcBQV6UdpNIWBY6gYeBaVAM+WWEGlwHOoDDiWRulHaJzE8TQaOIHGACfSWfrnNEniZDobWEkV+mGaQmOBUyVOo3HAKhqvf0bVNAE4HXiYzqWJ4GfQZGANVQLPkziTpuj/pFqaCqyjacBZwE9pNlUD59B0YD2dCzyfZuif0FyJDVQDnEfn6R/TBVQL/kKJ86kOuIBmQX4RzQYulLiI5ugf0WKqBy6hucBGiUupQf+QltE84MV0AfAS4Ae0nC4ErqAFwEvpIuBlEi+nhcAraBHwSlqsv08rJTZRI/AqWgr8DS3TxW9BLgZeLXEVXaIfomtoOXA1rQCuoUuB19Jl+rvUTJcD19IVkFwHfJeupyuBN9BK4I10FfAm4EG6mX4DvIV+C/wdXa0foFsl/p5WAdfRauBttAattwMP0B10LXA9Nevv0B9oLfBOug74R4l30Q3ADXQj8G66CXgP8G26l24G3ke3AO+n3wEfoFv1t+hB+r3+Jj1E64Ab6TbgwxIfoduBj9IdwD/RH4CbJD5GdwIfpz8CI3QXsAX4BrXSBmAb3Q1sp3v11+kJuk9/jTZL/DPdD+ygB4Cd9CBwi8QnaSPwKXpYf5X+Qo8An5a4lR4FbqM/Af9Km4Db6THgDnpcf4V2UgT4N2rR/0HPSHyWWoHPUZu+n56nduAuegL4Am0G7qY/A/dQB/BF6gTulbiPtgD/Tk8BX6K/6C/Ty8CXaD89DfwHbQW+Qtv0v9OrEl+j7cDXaQfwDdoJfFPiW/Q34Nv0DPAdelbfRwckHqTn9b30Lu0CHqIXgO9JfJ92Az+gPcAP6UXgR7RPf5E+lvgJ/R34Kb2k76F/0svAzyQepv3Az+kVfTcdoVeBRyV+Qa8Bv6TXgf+iN4BfSfya3tJfoG/obeC39A7wO+Au+p4OAI/RQeAP9C7wR4nH6T39eYrS+0CdPgD+N6f/38/pX/zKc/o/u53TP/mFnP7JT3L6x7+Q0z/6SU7/sBs5/f0TOX3JaTn9vV/I6e/JnP7eT3L6IZnTD52S0w/JnH5I5vRDp+T0d3+S0w/KnH5Q5vSDv8Kc/vr/o5y+/785/b85/VeX03/t5/Rfb07/pXP6f3P6f3P6z+f05379Of1/ABquEH0KZW5kc3RyZWFtCmVuZG9iago5IDAgb2JqCjw8L1R5cGUgL0ZvbnREZXNjcmlwdG9yCi9Gb250TmFtZSAvQXJpYWxNVAovRmxhZ3MgNAovQXNjZW50IDkwNS4yNzM0NAovRGVzY2VudCAtMjExLjkxNDA2Ci9TdGVtViA0NS44OTg0MzgKL0NhcEhlaWdodCA3MTUuODIwMzEKL0l0YWxpY0FuZ2xlIDAKL0ZvbnRCQm94IFstNjY0LjU1MDc4IC0zMjQuNzA3MDMgMjAwMCAxMDA1Ljg1OTM4XQovRm9udEZpbGUyIDggMCBSPj4KZW5kb2JqCjEwIDAgb2JqCjw8L1R5cGUgL0ZvbnQKL0ZvbnREZXNjcmlwdG9yIDkgMCBSCi9CYXNlRm9udCAvQXJpYWxNVAovU3VidHlwZSAvQ0lERm9udFR5cGUyCi9DSURUb0dJRE1hcCAvSWRlbnRpdHkKL0NJRFN5c3RlbUluZm8gPDwvUmVnaXN0cnkgKEFkb2JlKQovT3JkZXJpbmcgKElkZW50aXR5KQovU3VwcGxlbWVudCAwPj4KL1cgWzAgWzc1MF0gMzkgWzcyMi4xNjc5NyA2NjYuOTkyMTkgMCAwIDcyMi4xNjc5NyAwIDAgMCA1NTYuMTUyMzQgMCAwIDc3Ny44MzIwMyAwIDAgNzIyLjE2Nzk3XSA1OCBbOTQzLjg0NzY2XV0KL0RXIDA+PgplbmRvYmoKMTEgMCBvYmoKPDwvRmlsdGVyIC9GbGF0ZURlY29kZQovTGVuZ3R
oIDI2NT4+IHN0cmVhbQp4nF2RTWuEMBCG7/kVc9welmi6snsQYdcieOgHtf0Bmow2UJMQ48F/33xsLXQggYd538nMhNbtU6ukA/pmNe/QwSiVsLjo1XKEASepSM5ASO7uFG8+94ZQb+62xeHcqlGTsgSg7z67OLvB4Sr0gA+EvlqBVqoJDp9157lbjfnGGZWDjFQVCBx9pefevPQzAo22Yyt8Xrrt6D1/io/NILDIeeqGa4GL6TnaXk1IysxHBWXjoyKoxL98kVzDyL96G9Ts5tVZdrpUkZpEdaRHlqhJVEQqWKJronN85V4v/62+N8POUcYuqdLprk750F5Y4z47X631Y8ddx3nDpFLh/h1Gm+AK5wck/4erCmVuZHN0cmVhbQplbmRvYmoKNCAwIG9iago8PC9UeXBlIC9Gb250Ci9TdWJ0eXBlIC9UeXBlMAovQmFzZUZvbnQgL0FyaWFsTVQKL0VuY29kaW5nIC9JZGVudGl0eS1ICi9EZXNjZW5kYW50Rm9udHMgWzEwIDAgUl0KL1RvVW5pY29kZSAxMSAwIFI+PgplbmRvYmoKeHJlZgowIDEyCjAwMDAwMDAwMDAgNjU1MzUgZiAKMDAwMDAwMDAxNSAwMDAwMCBuIAowMDAwMDAwNDUwIDAwMDAwIG4gCjAwMDAwMDAxMDcgMDAwMDAgbiAKMDAwMDAxMDExMCAwMDAwMCBuIAowMDAwMDAwMTQ0IDAwMDAwIG4gCjAwMDAwMDA2NTggMDAwMDAgbiAKMDAwMDAwMDcxMyAwMDAwMCBuIAowMDAwMDAwNzYwIDAwMDAwIG4gCjAwMDAwMDkyMzkgMDAwMDAgbiAKMDAwMDAwOTQ2NiAwMDAwMCBuIAowMDAwMDA5Nzc0IDAwMDAwIG4gCnRyYWlsZXIKPDwvU2l6ZSAxMgovUm9vdCA3IDAgUgovSW5mbyAxIDAgUj4+CnN0YXJ0eHJlZgoxMDI0MgolJUVPRg==", -} -BINARY_IMAGE = ( - b'GIF89a=\x00D\x00\xf7\xa8\x00\x9a,3\xff\xc0\xc0\xef\xc0\xc0uXg\xfc\xf9\xf7\x993\x00\xff\xec\xec\xff\xa0\xa0\xe5\xcc\xbf\xcf\x9f\x87\x0f\xef\xef\x7f\x7f\x7f\xef\x0f\x0f\xdf\x1f\x1f\xff&&_\x9f\x9f\xffYY\xbf??5\xa5\xc2\xff\xff\xff\xac\x16\x19\xb2&\x00\xf8\x13\x10\xc2& \xdf`PP\x84\x9b\xf8\x03\x00\xb5\x0b\x0c\xdf\x0f\x00>\x9a\xb5\x87BM\x7f`P\xd2\xa5\x8f\xcc\x19\x00\xa5,\x00\xec\xd9\xcf\xe5\x0c\x00\xeb\t\x00\xff\xd9\xd9\xc7\x0c\x0c\x0f\x0f\x0f\xffyy~MZ\xfb\t\x08\xe5M@\xfb__\xff33\xcf\x90x\xf2\xe5\xdf\xc3\x06\x06\xbf\t\x08\xff\xb3\xb3\xd9\xb2\x9f\xff\x06\x06\xac)\x00\xff\xc6\xc6\x0c\t\x08\xf9\xf2\xef\xc9s`\xb8#\x00\x9f/\x00\xff__\xff\x8c\x8c\xc5\x1c\x00\xdf33\xffpp\xcf\x19\x19\xc0\x13\x10\xbf\x90x\xf7YY\xff\xf6\xf6\xe7??\xd7&&\xefLL2& 
\xdf\xbf\xaf\xbf\xbf\xbf???\xc5M@cn\x81_\x00\x00___\xcb00\xd8\x13\x00YC8\x80\x80\x80\xf3RRsVH\xc490\x10\x10\x10\x917@\xf2\x06\x00\xcf@@\xca\x86pooo\xa3!&\xc1\x1d\x18\xcf//\x1f\x1f\x1f\xdf\x00\x00\xd2\x16\x00\xcb\x90x\xbf\x1f\x00\x19\x13\x10\xf3\xd0\xd0\xe399&\x1d\x18Yy\x8e\x8f\x8f\x8f\xff\xa9\xa9\xcb\x13\x13\xbf00SF@\xb6& >\x1d\x18\xfb\xdd\xdd@@@\x99\x93\x90\xff\xbc\xbc\x7fPP\xaf\xaf\xaf\xc6VHzsp\x93& \xb7pp\xb3\x86ptPP|pp\xafOO\xd0\xd0\xd0\xef\xef\xefL90\xbc\xa9\xa0o0(\xeb\xb0\xb0\xff\xe0\xe0\xff\xd0\xd0\x870(K0(\xc9|h\x9f__lct\xebFF\xcf\xcf\xcf\xe0\xe0\xe0b& \xff },(@0(\xa9\x93\x88\xa6|h\x1f\xdf\xdf\xd5\xac\x97\xe2\xc5\xb7\xc7`POOO\x9cyhppp\xff\x80\x80\xff\x96\x96\xd7``\xcc\x99\x7f,\xb0\xcf\xbf\x00\x00\x00\x00\x00\x00\xff\xff\xff\x00\x00\xffff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00!\xf9\x04\x01\x00\x00\xa8\x00,\x00\x00\x00\x00=\x00D\x00\x00\x08\xff\x00Q\t\x1cH\xb0\xa0\xc1\x83\x08\x13*\\\xc8\xb0\xa1\xc0\x1b\x07\x0c8\x9cHq\xa1\x89\x14\xa72F\xac\xc8\xb1\xa2\t\x1f\x19Cj\
x94\xd8\xb1$B\x03\x07D\xaa\x1ci\xb2%*#3V\xcad\xe9\xb2\xa2\x9d 3s\x9e\xdaX\x93!"\x8c:\x83\xf2\xeci\xf0c\xd0\xa3!\x87\x12E\x89\xb4iR\x92.a:\x9d\xfa\xb4\xe5\x0c\x9cT\xb3\xee\x84:\xf1\x06P\xad`\x95*4\n\xb6l\xd5\x84\x06>\x99]\x1b\xb2\xc5\x9c\x83F\xda\xb0\x9d{\xe4\x84\x00\x83W\xe7\xaeM\xe2f\xd4\xa8\xbb\x03\xbd\xea5kE\x88_\xbf\x80\x0fy\x1a\\\xb6\x08\x92\xc3\x87\x01\x070\xe5\x00\x02\xe3\xa9-\x80\xc4\x80\x1cY\xe0dS\x94-_\x0ezd3\xe7\xce\xa8>\x83\x0e=Zf\x92\x13\xa7Gm\x18 \xe1\xaf\xe7\xd5\xb8+\xb7\xceX8\xf6(\xda\xa2D\xd9N\x8d\xbb\xb8n\xc6\x8e}\x8f\xfa\x12<\xf8\xf0\xcf\x11\x1a\x14\x07}|mf\xdf\x00\x9elP\xd1\\\xb8dSaJ\x95\xffz }zu\xadiLs\xa6\xb0&8\x80\x01\xdd\x9f\x9b\x8a ^<\xf9\xe9\xac\xa9:\x82\x1d{\x83\x84\xe6\xef\xc5\xf7\x1d}\xf5\xd9W\x9eq\xa2\x1d\x95\x84a\xb1\xa9\xb0\x01\x00\xdd\x05\xd8\x9c|\x04\x16X\x8a\x02\x0b0\x80\x9f\x0b=\xe8\x94\\l\x1et \n\x00\x10\x02\x08\xdf\x84\x03ZX \x86\x1a\x16W\x03\x87+]\xe7[\x06\x00\x96\xe8\xde\x89\xce\xa5\xa8\xe2\x8a\x19N\xf7b\x87\x19\xa5\x17\x1b\x05\xa3P\x10\xa1\x8d#\xe2X\x9b\x8e;\xf2\xd8"n/\xd6\xd5\xdf\x13\xa2x\x80$\x89\x11\x9e\xd8\x81\x16\x146\xb9#\x8b\xd3\xf9\xe6\xc1\x7f\xa2\x0cp\xe5\x99\x12\xa8\x80\xdad\x15zi!\x98\xab\xf9Ff\x99gvG$g\xdf1\xa0\x80\x9bM\xc2\t\x19\x00\x19p\xd9\x9d\x99G6\xd7Hl\xdf\x99\xc2\xc8\x9e|~\t\x88)~Q@c\x99\xa3\x0cZg\x06\x00\xf8\x96\xa8)\x0c,\xc0h\xa3\x05^\x02\xe9(\x93Rji\x84\xcb)\'\x1fn\x9d~\nj)\xa3\x0e\xffZis\x84\x06\xd7\x81\xaak\xae\xc6\x01\x07\xa0\xb5\xfa*\xac~\xc9z\xaa\x04\x03l\x80+b\xb7\x81V@\x01$\xac\xd6\xe9\xab\xb1\xd2:kpj\x0ep\xe7\xb1\xab\x9aRA\x01!\x14\xd7\xc0\x03\x8dF\x1b\xdc\x00\xd3\x8ar-\xb6\xc8\x12\x07Z\t\x15\xf0:\xdd\xb7n\x8ak\xaa(\x1ddz\xac\x14\x86\x80\x92+~\xf8\xc1\xbb\xa3\xbc\xe4\xae\xe1\x01\xbaR\xfcAG\'\\\xa4\xab\x1a\xbf\xef\x82k\xa1\xbc\x03\xa3\xeb\xd7\x1d\xa4T\xcc\x87\xc2\xc5qP\x02\xc3\xab\xf9+\x9e\xb8OH\xec\xd7\x1bYTL\x8a\x1f~\xa1\x91\xecj"\xd8\xc01n\xfe\x8e\xdaA\x06\xe7\xa2;\t)Q\xb0AJ\x15\\\xa8\xbc2h!\x14\xe0\xee\xcb\xa05\x10\xc6\xa8"s&\x07\n\x13L\xb0sA\x0b\x9b\xa2\x81\x08"h\xf02\x0f\x15\xe0\x964g2\xa8\x
d1D\xd3\xa4\xe8\x01\xf5t\x1c\x14`\xc6\xcb\xcbN\x11\xe7\xd6\x87]@\xca\xd7\x8f\x90\xf2\x01\x08#\x10t\x80$\xc5\x99\xc1-\xc7?\x14\xff@\xc6\xdal\x8f\xe2\x04)b0\xb1\t)}\x84\x12J&\x04\x05\x02\xc5\x18\xb8\xd9P\xc0\x0f\x1c\x93`5h\x81_\xb0H(j\x98\xacD( \xc0`P\xc5\x8f\x83\xa6\xc1\xb6;l1\x9d\x06\x1bk\x9d4\x18:(\x1e\n\x15&sR\xb7A9\xc0Q\xf1 \x18X\x00Z\xdf<\x84\xa0:h$H^\x1cgC\\\xa0\xdc\x10\x9a\xc8\xae8\x11gdQ\x07\x01\x07!\x10\n\x11W| {\xef\xa6\x90\xb0m\x01"T B\x01<\xa8\xed\xba_X|pE\x1e\xa7\xc9\xe0D\x19\xce\xcb\xbe\x04\xf5\x08\x11\x80@\x02\xf1+\xce}\t!\xecP\xc1\x0ed\xb8\xdc\xf9\x86\xa0\x88\x8aQA\x06\x90\xc1\x02\xfc\xf2G\x83\x1c4\xc4~\xf8\xcb\x1f\xf7^v\x98D\x98\x0c\x07\xca\x1b\xc5\x05\xba\x90\xbfP`Bt\x14\x81`\x07\'\xc8/\xbf\xc8@\toC\x01)\x9c\x00\xbb\x0e\xd2\xcd$"\x94\xa0\xef\xf0\xe3\x978\xe0l\x02^ \x05\x07\xf3\x97\x00\x04\xd0\xaf%1t\xde\x0b|X\xb0\x820\x8db\x0f\xa4`\xc2\x04\x16@\x8a\x0e\xce\x8f(\x02\t\xa2\xec\x86X\xc4\xb5\x15"\x898\xc4A\xfc\x1a\x08\xc5\x82HQqT\xc4\xdc("A\n<\x08\x02\x05\x94\x90\x1d\r@\xd8E\x83|1\x14T\xbc\x80\x0e>@\n\x14\x88An\xa0\xbb]\x1b\x13\xf2F\xd9Y\xc2dg\xe8\xe1\x1e\x1d\xd2\xc7P\xa0\x10\x07\x84\xf8\xe1 \x1fx\xbf\xfc\x11\xa1\x12\x90XdG\x82\xb8FI\x02q\t/\xb4\xa4&[\x12\x10\x00;', - "png", -) -ARRAY_TO_BASE64_IMAGE = ( - "data:image/png;base64," - 
"iVBORw0KGgoAAAANSUhEUgAAAD0AAABECAIAAAC9Laq3AAAIzElEQVR4nNXab0wb5x0H8C8x8R9ixCmuCLZi5dIlJi+gPg2kAC+KSaaJpXFLm7XJQiU7SkervcjopiqaFAXTok1tOsVkb5JmUY3UhiRSJ1YzGtGRXF4MO1OuMsMv4MKUs2CGWLg6zwRjC5S9OOq/5/NfEu37Ah333D334Xh+D8fjq3j69Cn+D7Nty6/gcmFoCMFgeXut2ML7zbJwOBLitjYcPQqNpix9b42bZeF0gmVFmsqkL7c7GMToKCYncxxWsr587kgEExNwOgs4pQR9mdwTExgdxepqMecWpS/ZPTWFmzfLMF0UqC/BLVF8RSdvfVHuPIuv6OShL9BdRPEVHUl9Ie7RUUxMFFl8RSeLPj+3ywWns+x/qwtIhj6XeyuKr+gk6bO7g0HcugWP51nCcmY9GsX585Uvvlgp0hiJwOnExMQzV+XI0uwsxzAHTp0iRNzPpfhyhff751yulaQCS3I/9+ITy8ry8pzLxS8upu2vBACfDw4H/P7n4MqetXCYY5ilLFNCJQBwHGw2GAxoakJ19TPViWU9Gl3wehemp9djsWzHJI0TlgXLPnf90uzsnMslIRaSUZfPT8/7/TM0vbayktm0ukNNm7tpc/cn3S8Le8TmQTxrfbbiEzJ24l3a3B3ZkcLI4hay9Xrp4gOwsNfwzYn3MvenuOn2dpLjSJ8v5ZCt0QvFxzGMaOvDhqb7h15949qFhw3Nogck3B6jsYOmAVgcDpvNtqX6helpjmFEiy9Yq/3q9AfTBzsAHLzzddrwiCex7sMThLAxZLXu5Tjr559ze/akH86yGB4GTSMcLk68zHHu69ezzRirO9QfX7wpoKWTdb2q7Hre7/c4nd7xcdEZ46755OoO9X/21me7wWmRrEtgyGod6erqtdt77XYiFEppE0ZOUxMaGqBQSHQiXXzuQ+ZvTrz3fa1u96PZfMRCcq8Phgii32YjOc7W18fX1KQ3MwyGh8EwiEYzz12PRjmGcQ8PZ0MPDlz98syH39fq8hfn6xYipY/FRPUL09Pu4WHRGSNYqxW+zmWZLkSjepIYloWtx+apX5qdzVZ8qzvUX5zpt3025j5kLug27wz43750vkh3nvqZe/dEi899yGz7bOz+oVcB5Ine732gehJ+49qF/p5XXrpPl+TOrc+Sv5z+IM/pQsjOgH+/l/mk++UO5/W0poSb8nhqeD7/ToXk1D9saBocuPqvgyYABaFNzi81AfEnFiS7iVDI3ttbBB1J+pHXXovvDNZqBweuXhr481xD88Le+vx72+d9cObcO8eufSpxTMo4sQ4NcSTZZ7MVre+12+PffnHmw4KmCwD7vczZ94//+twv93vFn1viSR/fRChk6+8vWu8jyfh2QWhhKAPY/SivtZp0N1cDrqZUfUFRPQn/7Mbls+8fL+isdPf4Pozvg18NpN77MiETUT0J7/cygvjIjStVT0TmTYmku7VhAFhMqntB/4gkLQ5HidbkvHT/LoAjN65ITBoSSXe3zkMbhiZj2Yf0+RynTpVFvzPgP3PunTy5aopqGBmps1rT9qe7X4jAzIIMQTQl6hvv3+2+dL6/55Wc04UQNUX91WR6026/QhCEySTlzidF6HcG/AB6/vCbljsFrPmPkuSA3U7TtN1uX6Ko5CZxN1eDZVWOTvPXH7zzdUHczVDUIE3Hv5vgOGGjkiCQzT2pxz0yr84l9DsD/n3eB7aeI29f6itMDAC4s77G87zFYrl48SIANUURJlOzx6M2GrG5/n3vHlJHD6MFo8NP57IOdNFwe/bwBEFNTdFFMDPSp6+b+m+E53kAFRUVNputry/x84vf74YA1FFM6hGV5b6AwwinAQBIn4+amiqHGVAplwAqaUzHwnxyu7hbsYG2eawo4Nqd+xKxSixWY7Y87zlsRqavY+eXhG2PxwNge5Cbvm4Psh5h5zYAaG+Hw4GkRwsAZAiGZbA
vgNHmuEbDYwCI5fGbyT+yehIAx3E0TdtsNgDNBjK2wnP0yPzkbST+n7dYpijqIkXZgDjf5EOwCowOURnaFrJeo20BJA9NpExiA6l4q1Om32XwzLA+X0dHB4AfG0itpkauJkhTV7WORPI6hNFoHAKGAAsQ1x9lMf4jeHchrEDbPKoz1mqiMoTl0BX2cCGebfo65VudMsPmck2TgYwPlV8d6yRNXRoDFT848XlaLMyf/PnrX43TAI62Un+qJ7VOWhHkAUzuhncX5OtoDMAQTOj9arj0CFahJ/XPH50KqtAQ2zTEBstlE1doCIXZtL3VmLwzHIme/OhyZAMff2Q73fOuTK5MOUVw+xl6kaHDkejopEddpTT/0IXGNSXo/Wowus3nLXUU1TGE5VhRQL6O1gXUp34olOze3kp9W0+urK4dA6K3bqeTVUr5T1rkh1sqVCIrRxoDpW/rTBOnuDdia4+n3YFp90ZsTeT8H/TLKvgILFchJoN8A7owDEEoNtKPj7srNMQFfd3fPDMAfnG4pWfSg0ii/+2tlOJ4p6i4WkuSpi55NZHZlOIWkqc+W1+Zbjd14HeeGWFbrVKO6euE0SIzkEpr1zaNyP/RKk2dvrVTKD6JiHxeXLp+061S/lZf9x3Ltbe3ezyeUCj0D7Np3TOTXHzJkasJXbMpufgKc5euF9wRA3mE5SwWi8Ph6O3tHRwc/Ofve0XvsUyurG1s2dXYIjqURZN1PVYmV+qaTLsaW0T1wVYjTx2onXDX/t1dGRH5wQD8GwBgtVoBEMJDnBhaoviKcefUb6gUi0fbA4dbsunnqhIUnufVqnRZzuIr3l2KPry6Joh5nnc4HM31ZLJY22TKWXwSKfj9KolxL4tEBb1LX6cwm8aCfL9jpKamhiAIn8/XZ+0ytxoLKr5yunPq42HnH58cuCxsazXE2KdnaxtbdE2m4qBpKen9wZz6nj8OfcdyapVyxHHZ1HW80OKTSBne15TQhyPRgIw4aD6xJ/PDrdJStvdjM/WlF59Eyvw+8kZsbX7ydtjPlaX4JLKV761vZf4H0dLrJY2D0p4AAAAASUVORK5CYII=" -) -BASE64_MODEL3D = { - "name": "Box.gltf", - "data": 
"data:;base64,ewogICAgImFzc2V0IjogewogICAgICAgICJnZW5lcmF0b3IiOiAiQ09MTEFEQTJHTFRGIiwKICAgICAgICAidmVyc2lvbiI6ICIyLjAiCiAgICB9LAogICAgInNjZW5lIjogMCwKICAgICJzY2VuZXMiOiBbCiAgICAgICAgewogICAgICAgICAgICAibm9kZXMiOiBbCiAgICAgICAgICAgICAgICAwCiAgICAgICAgICAgIF0KICAgICAgICB9CiAgICBdLAogICAgIm5vZGVzIjogWwogICAgICAgIHsKICAgICAgICAgICAgImNoaWxkcmVuIjogWwogICAgICAgICAgICAgICAgMQogICAgICAgICAgICBdLAogICAgICAgICAgICAibWF0cml4IjogWwogICAgICAgICAgICAgICAgMS4wLAogICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgLTEuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDEuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDAuMCwKICAgICAgICAgICAgICAgIDEuMAogICAgICAgICAgICBdCiAgICAgICAgfSwKICAgICAgICB7CiAgICAgICAgICAgICJtZXNoIjogMAogICAgICAgIH0KICAgIF0sCiAgICAibWVzaGVzIjogWwogICAgICAgIHsKICAgICAgICAgICAgInByaW1pdGl2ZXMiOiBbCiAgICAgICAgICAgICAgICB7CiAgICAgICAgICAgICAgICAgICAgImF0dHJpYnV0ZXMiOiB7CiAgICAgICAgICAgICAgICAgICAgICAgICJOT1JNQUwiOiAxLAogICAgICAgICAgICAgICAgICAgICAgICAiUE9TSVRJT04iOiAyCiAgICAgICAgICAgICAgICAgICAgfSwKICAgICAgICAgICAgICAgICAgICAiaW5kaWNlcyI6IDAsCiAgICAgICAgICAgICAgICAgICAgIm1vZGUiOiA0LAogICAgICAgICAgICAgICAgICAgICJtYXRlcmlhbCI6IDAKICAgICAgICAgICAgICAgIH0KICAgICAgICAgICAgXSwKICAgICAgICAgICAgIm5hbWUiOiAiTWVzaCIKICAgICAgICB9CiAgICBdLAogICAgImFjY2Vzc29ycyI6IFsKICAgICAgICB7CiAgICAgICAgICAgICJidWZmZXJWaWV3IjogMCwKICAgICAgICAgICAgImJ5dGVPZmZzZXQiOiAwLAogICAgICAgICAgICAiY29tcG9uZW50VHlwZSI6IDUxMjMsCiAgICAgICAgICAgICJjb3VudCI6IDM2LAogICAgICAgICAgICAibWF4IjogWwogICAgICAgICAgICAgICAgMjMKICAgICAgICAgICAgXSwKICAgICAgICAgICAgIm1pbiI6IFsKICAgICAgICAgICAgICAgIDAKICAgICAgICAgICAgXSwKICAgICAgICAgICAgInR5cGUiOiAiU0NBTEFSIgogICAgICAgIH0sCiAgICAgICAgewogICAgICAgICAgICAiYnVmZmVyVmlldyI6IDEsCiAgICAgICAgICAgICJieXRlT2Zmc2V0IjogMCwKICAgICAgICAgICAgImNvbXBvbmVudFR5cG
UiOiA1MTI2LAogICAgICAgICAgICAiY291bnQiOiAyNCwKICAgICAgICAgICAgIm1heCI6IFsKICAgICAgICAgICAgICAgIDEuMCwKICAgICAgICAgICAgICAgIDEuMCwKICAgICAgICAgICAgICAgIDEuMAogICAgICAgICAgICBdLAogICAgICAgICAgICAibWluIjogWwogICAgICAgICAgICAgICAgLTEuMCwKICAgICAgICAgICAgICAgIC0xLjAsCiAgICAgICAgICAgICAgICAtMS4wCiAgICAgICAgICAgIF0sCiAgICAgICAgICAgICJ0eXBlIjogIlZFQzMiCiAgICAgICAgfSwKICAgICAgICB7CiAgICAgICAgICAgICJidWZmZXJWaWV3IjogMSwKICAgICAgICAgICAgImJ5dGVPZmZzZXQiOiAyODgsCiAgICAgICAgICAgICJjb21wb25lbnRUeXBlIjogNTEyNiwKICAgICAgICAgICAgImNvdW50IjogMjQsCiAgICAgICAgICAgICJtYXgiOiBbCiAgICAgICAgICAgICAgICAwLjUsCiAgICAgICAgICAgICAgICAwLjUsCiAgICAgICAgICAgICAgICAwLjUKICAgICAgICAgICAgXSwKICAgICAgICAgICAgIm1pbiI6IFsKICAgICAgICAgICAgICAgIC0wLjUsCiAgICAgICAgICAgICAgICAtMC41LAogICAgICAgICAgICAgICAgLTAuNQogICAgICAgICAgICBdLAogICAgICAgICAgICAidHlwZSI6ICJWRUMzIgogICAgICAgIH0KICAgIF0sCiAgICAibWF0ZXJpYWxzIjogWwogICAgICAgIHsKICAgICAgICAgICAgInBick1ldGFsbGljUm91Z2huZXNzIjogewogICAgICAgICAgICAgICAgImJhc2VDb2xvckZhY3RvciI6IFsKICAgICAgICAgICAgICAgICAgICAwLjgwMDAwMDAxMTkyMDkyOSwKICAgICAgICAgICAgICAgICAgICAwLjAsCiAgICAgICAgICAgICAgICAgICAgMC4wLAogICAgICAgICAgICAgICAgICAgIDEuMAogICAgICAgICAgICAgICAgXSwKICAgICAgICAgICAgICAgICJtZXRhbGxpY0ZhY3RvciI6IDAuMAogICAgICAgICAgICB9LAogICAgICAgICAgICAibmFtZSI6ICJSZWQiCiAgICAgICAgfQogICAgXSwKICAgICJidWZmZXJWaWV3cyI6IFsKICAgICAgICB7CiAgICAgICAgICAgICJidWZmZXIiOiAwLAogICAgICAgICAgICAiYnl0ZU9mZnNldCI6IDU3NiwKICAgICAgICAgICAgImJ5dGVMZW5ndGgiOiA3MiwKICAgICAgICAgICAgInRhcmdldCI6IDM0OTYzCiAgICAgICAgfSwKICAgICAgICB7CiAgICAgICAgICAgICJidWZmZXIiOiAwLAogICAgICAgICAgICAiYnl0ZU9mZnNldCI6IDAsCiAgICAgICAgICAgICJieXRlTGVuZ3RoIjogNTc2LAogICAgICAgICAgICAiYnl0ZVN0cmlkZSI6IDEyLAogICAgICAgICAgICAidGFyZ2V0IjogMzQ5NjIKICAgICAgICB9CiAgICBdLAogICAgImJ1ZmZlcnMiOiBbCiAgICAgICAgewogICAgICAgICAgICAiYnl0ZUxlbmd0aCI6IDY0OCwKICAgICAgICAgICAgInVyaSI6ICJkYXRhOmFwcGxpY2F0aW9uL29jdGV0LXN0cmVhbTtiYXNlNjQsQUFBQUFBQUFBQUFBQUlBL0FBQUFBQUFBQUFBQUFJQS9BQUFBQUFBQUFBQUFBSUEvQUFBQUFBQUFBQUFBQUlBL0FBQUFBQUFBZ0w4QUFBQUFBQU
FBQUFBQWdMOEFBQUFBQUFBQUFBQUFnTDhBQUFBQUFBQUFBQUFBZ0w4QUFBQUFBQUNBUHdBQUFBQUFBQUFBQUFDQVB3QUFBQUFBQUFBQUFBQ0FQd0FBQUFBQUFBQUFBQUNBUHdBQUFBQUFBQUFBQUFBQUFBQUFnRDhBQUFBQUFBQUFBQUFBZ0Q4QUFBQUFBQUFBQUFBQWdEOEFBQUFBQUFBQUFBQUFnRDhBQUFBQUFBQ0F2d0FBQUFBQUFBQUFBQUNBdndBQUFBQUFBQUFBQUFDQXZ3QUFBQUFBQUFBQUFBQ0F2d0FBQUFBQUFBQUFBQUFBQUFBQUFBQUFBSUMvQUFBQUFBQUFBQUFBQUlDL0FBQUFBQUFBQUFBQUFJQy9BQUFBQUFBQUFBQUFBSUMvQUFBQXZ3QUFBTDhBQUFBL0FBQUFQd0FBQUw4QUFBQS9BQUFBdndBQUFEOEFBQUEvQUFBQVB3QUFBRDhBQUFBL0FBQUFQd0FBQUw4QUFBQS9BQUFBdndBQUFMOEFBQUEvQUFBQVB3QUFBTDhBQUFDL0FBQUF2d0FBQUw4QUFBQy9BQUFBUHdBQUFEOEFBQUEvQUFBQVB3QUFBTDhBQUFBL0FBQUFQd0FBQUQ4QUFBQy9BQUFBUHdBQUFMOEFBQUMvQUFBQXZ3QUFBRDhBQUFBL0FBQUFQd0FBQUQ4QUFBQS9BQUFBdndBQUFEOEFBQUMvQUFBQVB3QUFBRDhBQUFDL0FBQUF2d0FBQUw4QUFBQS9BQUFBdndBQUFEOEFBQUEvQUFBQXZ3QUFBTDhBQUFDL0FBQUF2d0FBQUQ4QUFBQy9BQUFBdndBQUFMOEFBQUMvQUFBQXZ3QUFBRDhBQUFDL0FBQUFQd0FBQUw4QUFBQy9BQUFBUHdBQUFEOEFBQUMvQUFBQkFBSUFBd0FDQUFFQUJBQUZBQVlBQndBR0FBVUFDQUFKQUFvQUN3QUtBQWtBREFBTkFBNEFEd0FPQUEwQUVBQVJBQklBRXdBU0FCRUFGQUFWQUJZQUZ3QVdBQlVBIgogICAgICAgIH0KICAgIF0KfQo=", -} -SUM_PIXELS_INTERPRETATION = { - "scores": [ - [ - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 
0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, 
- 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 
0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, 
- 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 
0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, 
- 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 
0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, 
- 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 
0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.9217332561281606, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.8478093032233159, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 
0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - 0.7775525960239336, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 
0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 
0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 
0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 
0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 
0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, 
- 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 
0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 
0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.28228141285466124, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 
0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.7110596409959468, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.5717043041883806, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 
0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 
0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 
0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 
0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 
0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 
0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 
0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 
0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 
0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 
0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 
0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - ], - [ - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.17004439297432927, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.6232387569967188, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.349160393746381, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 0.37415556842308434, - 
0.37415556842308434, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 
0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 
0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 
0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 
0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 
0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 
0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 
0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 
0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 
0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 
0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 
0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - [ - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.4147847905809689, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 
0.21617448369040726, - 0.21617448369040726, - 0.21617448369040726, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.4393939393939394, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - 0.8667245705462266, - ], - ] - ], - "alternative_outputs": [ - [ - [1793106], - [1795539], - [1797837], - [1800021], - [1815417], - [1802088], - [1806420], - [1824192], - [1818906], - [1804818], - [1813338], - [1812561], - [1811298], - [1817472], - [1810533], - [1797249], - ] - ], -} -SUM_PIXELS_SHAP_INTERPRETATION = { - "scores": [ - [ - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 
0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 
0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 
0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 
0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - ], - [ - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.36599426908032084, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 1.0, - 0.0, - 0.0, - 0.0, - 0.0, - 
[Omitted: interior of a deleted JSON data fixture — a block-constant float matrix whose rows repeat the values 0.36599426908032084, 0.9044030984144017, 0.5780729041010304, 0.03706410007949775, 0.4724172299368354, and 0.5148839775509372, padded with runs of 0.0 and 1.0.]
0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - ], - [ - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - ], - [ - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - ], - [ - 0.0, - 
0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.0, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.4724172299368354, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - 0.5148839775509372, - ], - ] - ], - "alternative_outputs": [[]], -} - -FILE_TEMPLATE_CONTEXT = { - "file_count": "single", - "value": { - "name": "sample_file.pdf", - "size": 10558, - "data": 
"data:application/pdf;base64,JVBERi0xLjQKJdPr6eEKMSAwIG9iago8PC9UaXRsZSAoVW50aXRsZWQgZG9jdW1lbnQpCi9Qcm9kdWNlciAoU2tpYS9QREYgbTk3IEdvb2dsZSBEb2NzIFJlbmRlcmVyKT4+CmVuZG9iagozIDAgb2JqCjw8L2NhIDEKL0JNIC9Ob3JtYWw+PgplbmRvYmoKNSAwIG9iago8PC9GaWx0ZXIgL0ZsYXRlRGVjb2RlCi9MZW5ndGggMjM2Pj4gc3RyZWFtCnicjZDfakMhDMbvfYpcD2bzTxNhFFZYe90h7AG2tTDoYO37w9S1O1A4cIyo5Bc/80mALR6pLVYY3k/hJ/RMJh6J82d4e4Dvlo2WRu1tb6UEPV538Hc4H8NqJ3C8DAWnDIQpd4lD2LdYomzcZ9O+Km1qWG0VSCRKG+xQD4FuTZeWdTcR0CiZiqtAPYXOGKOhEBnUD3hC5M0a6lcoObInwdIErsAHcI+F3cknsB3ANFJCU54Byf6B8AAvdZi9s8WokcXNFrvLEj0n0gXu5Hm8TJyiK6nm+54Ipd3IXnQiae5H5vyxTf724RdvlHTtCmVuZHN0cmVhbQplbmRvYmoKMiAwIG9iago8PC9UeXBlIC9QYWdlCi9SZXNvdXJjZXMgPDwvUHJvY1NldCBbL1BERiAvVGV4dCAvSW1hZ2VCIC9JbWFnZUMgL0ltYWdlSV0KL0V4dEdTdGF0ZSA8PC9HMyAzIDAgUj4+Ci9Gb250IDw8L0Y0IDQgMCBSPj4+PgovTWVkaWFCb3ggWzAgMCA2MTIgNzkyXQovQ29udGVudHMgNSAwIFIKL1N0cnVjdFBhcmVudHMgMAovUGFyZW50IDYgMCBSPj4KZW5kb2JqCjYgMCBvYmoKPDwvVHlwZSAvUGFnZXMKL0NvdW50IDEKL0tpZHMgWzIgMCBSXT4+CmVuZG9iago3IDAgb2JqCjw8L1R5cGUgL0NhdGFsb2cKL1BhZ2VzIDYgMCBSPj4KZW5kb2JqCjggMCBvYmoKPDwvTGVuZ3RoMSAxNjgwOAovRmlsdGVyIC9GbGF0ZURlY29kZQovTGVuZ3RoIDgzOTM+PiBzdHJlYW0KeJztegl4VEX276m6t/dOeiFJd9a+nU4aSQOBsAaQdDZAI3uABIkkQCQoyBJQcCPOiGBwHweVccRd1EE7i0yCjjDAuCAIo4y7grg7Iui4ovT9/6q6wzLqvHzvfe97z++bezm/OnXqnFpOnXtuXdLEiKgHQKV+o8vKR7HBrDcR90I6bPSE8ZNXVmy4k0hZg3rz6MlTSqxPm64jYhHU+42fnF+wfOjmfdAX9dqpZWOrJtxywddoSiJy3Tp7Qd0idjf7Eu2VaJ8x++Kl2j0Zr/6TyH4OkbHy/EVzF+xeUb2eyH036hfNrWtcRF6yoP8R0HfOnb/i/LWfPPI+UaqTyFbSMGfB8ttq5/aAbhnI3FBfN+dg0jPojx2E/uAGCNwDLCrqmCPlNCxYurzv++ptWBzmQ5/NXzi7LrV3+h6sB/3R8gV1yxcZ1iU0QR/zIe2iugX1ntr+bxMZUGVlixY2LtXzaB34+aJ90ZL6RbmvjN2KrrEe29OQKWQmTi5iug5e+HI4fUkj6I9kgtxJ+TQVo/8JugbUFZKX3lP0+TMX7E0jo+Oo1EnHHj92qVNKTruGS4mV+uI21C2pm0Xa7BVL5pM2d0n9haQ11M9aQtr8uqUXkXayTzKkrn94ZvmKmY4RX5vTzVJ873s980T5woThm489fnyuk8x2VC0nRhSlPc5zrCYm60lnAEO4GdaWDyzAzWgQbkbDcLO4BcnVJsW9koT4GoMyUfrLSOWonUPjaRJNg+eIyk6t6++dvH/iAUVZw26CN82G9YYBmFJ6rFT+Tudzt9nAbUaVi0ulf/Pe2PHjxlMYI00zvBydyAaYRrLWsNg4jK8GDU+KHSb1Z/fl/+R6muXLe3fs5hnyfkv
cav+u23BPfF9LaAYpckd7x3ZU7mVSbF6YKYP3TvLsFB4uuLB+CXRPxbgPhB6H55mkRGnFKYNSZH/5sb3T35TYgCfrJ07//+cyPEt3GabSvafU7z+1XW08+WwZC2n2KXr3/HtfpuspVRQ0XUSpirxDF1BTnGfYjYvjPIfPGuK8ghg6I86rp+gYKA1aMd4IjqiYltA8qqP5NJYqkQfqUW+EZCGJp3MQnuD+1A/tY6VkIS2lFbQIWhqdRQsgnwvdi4Aa9QGd7E3DU1IP+TLwdZCeXjup9zA0CzBCf9waZtAg+/7paKWoLQEvsA7y2Az7yjHnx8ebhxEa0NYYH71RruZi4BxoEon3RdhmNXdvE01GkhkFhTnGwZFINzZL9+wtZpGppKUlxpENlJBg7aa95YS9NW6fAHI4bN2zt1ljEzbLFCmNHCCnw/6f7bouuy1mZTnd3uVK+N+2d4F69Ejsnn1iQmzBNjmuNMJLlZKTnd2zdyTGrDC4MzZ1SgZ5Pe7u2bucsQknyHEFRx5QekZS9+yT3LEJO+S40igDlJmV0j375B6xCTvluIKjLJCmebtn70mOTRjTSI1x8nXrz07tnr03JfbEwD4txlE2KDeY0T37dIyTTnLmmTGOgqC8PK179lkZsQVj5v4YR+Iw0LdvoHv2fp80FJPPiXEyCRQUBLtnn+OXhmLTesY4JCoc4Ab36p59zxxpKGaeF+NoMGjYsN7ds8/rGVuwRkitksPBhai0pKB79v1g1Q9lLtHAGIcXN1FFxdDu2Q8uiE04T44rOKoATZ48snv2I4aASDq9OMbRZNCMc8u7Z19yZmzCODeNiXF0LmjO7Iru2Y8plYaE5Y6LcfJFa9hCqaA0w0OUqgZFXOsfgT4WZXSe/rFoFyX/FModcSLaSJvYPNpEW2k7Owqrx6mT2uk5RGcZ3UmX0620Gm+K6ZBci3fPJLxpy+hWlqq34+RyD96499Ae6E6jK2kLpTCv/gmtpFXKy7BahQyTDRdNwBvtenaOvgynqwPqb2kIzpoX0SLWpFfpN+i36PfTA9SpPKcfR0ZMw1Jm0x79c8Nr+lsIjxn0e7qDDrBbLE/gzT8N54NO5Y94961XalSmz9WPYQZ+ugRzUPFu3cO28RB6r6ePmJddrpSil/v0iL4TWhlUg3foetrCBrHR3G+YoY/V9+AM1oeWo9c7qJU24+6gv9AbzG44qt+vH0V66Y3TwEr440W2TYkevypaJBwNL/WiQrQspKfpWdrHAuyvfKHBbigwhA2X6vuRE/vTFMz2IVh+yL7lV+JeqTyjjtJLkLlX0c3C2/Q3epel4Ww6nk3lvfhCfpeyBO+03vLEMAfv/GvpdvT+DguxzdzO9yr3qY+qPxgzowf1ROxIkP6A75y/sgSsVGON7DfsFfYeL+Uz+R/4IeVW9WH1JVMdVn0eTjPX06P0LXOzoWwiO5c1sMvZanYzu4PtYfvYx7yYV/IL+RGlQVms/EUtwT1ZbVR/a7jGsNb4cbQqujP69+i3eoF+DU1EPFyF2f+e7sLKOmkvvY77AB1iBmZjibg15mdT2GW4r2TXs3vZRvYwa8co+9gh9gn7kn3NfuA40HEjT+d+no07wJfwS/it/E6+F/c+/hn/XvEo2UpIGaSMUKqVhZjVauUm3E8o76pp6l5Vh58LDOsMGwwbDY8athuOGu2m35jJvPvH+47nHX8nStE10XXR1mi7/i5ydCpiKoN8eE4n4nxVhzPmcpxRH0Ccv8zs8F0ay2Mj2TnwzEx2AVvMlsOTV7P17AE598fYU/DSq+wI5pyALwcx5758EC/h43Gfx+v5Yn4Tv4W381f4McWk2BSHkqzkKaOVGqVeWaqsUNYpEWW38rZySPlG+RG3rlpVn5qtBtWQOlqdqS5T71I/Uj8yzDC8YPjAaDUuMF5j7DB+YRpsGmmaYJpoqjHdaNps2m+uRXTuoCfoz6emAnZQuUopV56gG/gANZW/yF9EPM+kOcpYjkjlG9kafgV
r5zmG5cbhfDgbR0fVIHz9DN/Av+HDlbGsgk2mC3j/WG/GJPURkd/UHXRYfQprexE9Lzfa2ZX8iNFOrfhcKcSYf1P6qSHlBXpDOcBM6j30pmplHnaYP6RMQBT8RR1pqCK/cic9pixmV9ATHGnR+oP5OsTxOPYI8kIlK2DfKfhi5+MQRUOU9+i3dCF/jQ7jOV5Dt7E56ly6gQawy+kjehBPRS/DRcY8YzJ7ns9Tm3kP1k5cfRirK2Q5TDEk0dWsRllvPMJfxyl8r2qld5Q/YfZ7+WPKWPWoYRJrwBNwBV1Di/WraIWhSn2JzSWFTaVc9SCy2+VKgepHuRJZZQZy2mY83VuQB4qVsZB4ETnnIC6mIEOsx3078oSKCJqHZ3wastiL1G6s5B0015DIkHXwBfRCdBJN1x+kO/S5dJF+C/VBPlitX44eN9IHdCNtZKuil+G8n4Un5x12jmEU32sYpffhzfx1PpmvO31/4e1c5qVPcT+Gykh8Jzerr+J1U6Rfp/8D0X0GMuwdNIvOpvexys8xwhhlGw2IjuMt+ihlEdZ7gCbqD+k+ZqUGfT6+8Z+iB0wGqjOFsMcR9hLWexnV80n6UqU+Og9+uBFeCMNby5B/rg2XTqksDheNPHPE8GGFQ4cMGjigoH+//L59eofyep3RM5ibE8j2a76szIz0tFSvJyU5qYfb5XQkJthtVovZZDSoCmfUuzwwqlaLBGsjajAwZkwfUQ/UQVB3iqA2okE06nSdiFYr1bTTNcPQPP/fNMMxzfAJTebURtCIPr218oAW2VMW0DrY9IlV4K8vC1RrkcOSHyv5mySfAN7vh4FW7m0o0yKsViuPjLq4obm8tgzdtdispYHSemuf3tRitYG1gYt4AotamGckkwz3lA9rwZd+AiYVSQuUlUdSA2ViBhElt7xuTmTCxKrysnS/v7pP7wgrnR2YFaFAScQRkipUKoeJGEsjJjmMNk+shtZqLb23NV/X4aRZtSH7nMCcuhlVEaWuWozhCmHcsojn0ve9J6vo3F1atfrU1nSludw7TxPV5ubVWuTuiVWntvoFVlejD9jy3FG1zaMw9HVwYsVkDaPxVdVVEbYKQ2piJWJVsfXVB8qFpPYCLWIJlAQami+oxdakNUdo0gp/a1pauFM/SGnlWnNlVcAfKUoPVNeVZbQkUfOkFW2pYS319JY+vVucrphjWxIdccaecCpTf6JNclJdcBWTTniWiRkFzkJARLTZGmZSFcCahgqoH0rNs4dCDVc1g1VkDnZkXsRSWtvsHCbkwj5iyHUGtOavCREQOPzZ6ZK6uMSY6/yaBCvi5ESoob2Lj4RCkbw8ESKmUuwp5jhS1gf16X1xBw8EFjk1FHAfTYBv66qH5cP9fr/Y4LUdYZqFSqRpYlWsrtGs9FYK54eqI7xWtGzrakmeIlqaulpOmNcGEMnt8n+SkiPm4Il/DmdKj/KGYRGW8h+a62PtFZMDFROnV2nlzbVx31ZUnlaLtQ890RbnIj1Kq5R0Hud4uiJbEZQzTiiLSpU9oubin1EG9ZwOkxlRKSVMGxVx1o6JYbXV7++mUYd+VFjJ4qRZfJqRYaHT68NPq582PXuzggnjVVlROb252XpaG0ItNuBZ8QIRT5VVfq00QlPwZObiX4e+baig6vRIGC4rFQqIv5goXj1NMT3OV+MS0dmn9ygkuubmUQFtVHNtc12H3jQroDkDzZ18O9/evKi8titwOvQta9Mjo66rhq8a2DA8FJxKWgJszcSWMFszeXpVJz6ztTWVVa2c8dLakuqWHLRVdWpEYSnlQiqEoqKJClUwLLKVm6V+emeYqEm2qlIg67M7GEmZuUvGaHYHj8mcXTIOmRqThaVMXCLHlFZWnRo98pGsxmcdLO7CAXs6vlUclMlSw27Nx0rNGZlZmL3LmeUgs6dDj7bb7SVTwHzZbrNJ5ptwtj0BXFCzMF84IYFPsWhOJ9DqcAC9UtKhfxXuabcbp1jSfJnORGHqtCbAzGk
X/Tk1pmEV0o7QZbswlYywBnMMw0rm23bRC5jvwrAHV5M1fIY35PwmJK+aEceBI+LVmsMAKhpxfISg/v1KV4QHK+kms9FsMKtm1ZjqTfNyo81qtyZYFWNySlJKjxTFmK54/MydCPCaM/wsxeryUyjEQqE8XFexmgEuf4EnxZPiTk7iiTyQ6y8YPGTw4EEDgz2DAf9d7PtHp19ZvbRx3KU371kVbWGFNz/Qv3zsbfPHbYruNmxJzjxnVnTvzoei0YfrCjYN7l/+yYMffpuXhbXfi/OL+E60UXs42WjIMptNJlJU4XyrJctGZhOCNJzvdA80VSpna1YtgVvTElQLF/6zSI9arGIjLN325bF2i+WERDr1aJdT7cPP9YbGOb8Kdbl1rPTrOOc3NWO/ev+kT92F+SOcwrVwSrI/TveqOT/epYR+/IdytWHLpmjRn6IJmzCj+xFd2WKFzN5JCVhMSo/kgaqSZbHebd1n5VYD5zYzdqYryMxdQWYWQWYRazNrJpOxQ/9crgnMl2GbWJTRKVaE+sFwns1mnGJkYj3GmqYElsBt0kM26SGb9JAt5iHhTyum8J9cFbZJX5lFr6dHX0rcUVoC0xImJNQmLEpQh1d7QzWLu2LxZDTWxCTwlEA4r2hEYU2+DEkWGuCC70AB4P3b+bHt248bDVuOP8inHxvF246PxUzXITby4DkD/SZsZxw+M5BZU5nawR8K+01ckUtU5BIVuUSl20HwzU8eKOPPPVAf1sT2XOy02Ot12/lLhi3H/rVJ5I3Z+keGtw378X2dzlLCFWkOluRMSkr3pKerqlNNsnls6erDns2JzyQqHo83nWuZYdf4HuM94bQqQ5VlmnOKa2aP6Z6Z3qlp09LXeu7gztQsRXFn2SzJXbGQ3BULySIW5BKTg5qJ4aH4SsrBfNwuVmsS4SEWCeaoXCSYT9vFBkplsUaT2NkisW5TWlMmy3RI/zmk/xyyc0dQuM8sBGQXAjLKEDBKZ6VmzJ5x4vGoGSuyzLiuTe4SUNHhosPY35rFVFNTs7iHk/wFqkgZaiA7hw9x0oACcg3kwUA2zWZr2OAX2KhH26Obt+6Nbtn4HMt89U2WvuKTm1+Mvsp3sQXsj9ujD7x1IHr3E8+x6U9Hv43uZQNZehuz/S76Afx/D56sTYgPL2XzYWG/25bI3IMzpvvONy/wqRanWLJZokliDiJfeiZBOEQw9i7G1sW4O/RDbe60gSiPtmX3HOgS9cyeA53x0hEv0f5aW2Yw1g59Z7wU7eGzwOQmnp1xtjbZNiNjQcYSy/LEFY5V1jWO2xIednQ4Pk78yOFMtNs1lyPJ5XK4HHaLO53701KsRnzLJNgNXoslxZOWmuURM46/b7aFk8VWeDzkzxbZkbxehyPRnNUVKlldoZJ1Im1kBRPvNIoAiaeN2FMg88VAmTmMwi3GGi1nUU5TjpKT7ZUB4ZUB4ZUB4f1fPlDxVGH8aaqIP1eB4Rt/LqfGAyf1fW/8beXEHc+todBxVArz3Z5C5vIUrk7sGzJc4dwpwip06kWiP5zQwlZz2FHocA5zuYdBVM0WQ9hJifo74bTUQld2aqEblBjOKHRmJ4F8oOTCeCfV4sWWgg9JowlvN0+PgNKX45UWcEEs328B/z28eefuS3e9PPaMKefoX22fctG0Pv6Kd9k9q9aNu+2+aD/DlvHPrbjzlczcnHHLootZ/6uvG2ozHV+mDBiyYnTDNeKvsalEpotFpPLLO8mhR4XTSqZw6e7EWFbCw9ehH483KCca5LMpjhG9BKcaY7lOIJfbpMqDhCKR2+Nm4nGXZp92MV/JERDv+9ttkBjA4J0BrhcFXb3cQW8hDXYVugd7z6LRrrPco71VNM1V5Z7mdd5uvt3BW4zi/BQe4GRpqaHkgYaB9jJDmb0iudJQaT83eY5hjv3C5KWGpfbLkh2GZLtCzG0mswMnNcRpkbhc2Moa5nIXFqaHsxTVYOBGE156VizXkpDocNjxGe9OTvF4vch
0I9oM5NVEaXe7RBmenmy2aIQ3pcYoiTHyGszmrGRvUnKy1223WLKS3WDdLrvDoTldSU6ny22xm73JBofLaSeOKRkUr9PhsFjMZo45ed1ul4vMaR5PmrPYwiaSRnZgMihMBjZxs6YxxlJTO9jalljw1qSljj2e5j1+PC31uHdceX3Zhyci1hm/RbBifa4uKixcPbZvaPUVO1f39f60QOCtTnTu3AkYsbOLOxVYRcQxuSLiwoG11W314pEbOrQawlwI8yDsJBKnd6qI2CBJhKTNHjaEoVSN52RJTezsdvrlZwN6pHgGD0HhRtFjAAuwYE+jibG7opc9eyAnbaiVeT59aXwgo8+HO6IXPRl9oafJkxR93rDlx6Lbfv/PHOWd42nRz/61tl157NgoteY6rX70D/fhCN1JlcoZbUGvb99TSi86COJKr9ZQpq9T6alktg73hTuUQJs7ucBR3EcR+SRfogZcCHoctFURv8WYqYgzoRO4EtQEehy0FbQPZCQCilYNtBC0AXRQtCiZSkar5nMW91RSYZuKt4ND8dARkA5SyAfMB40HzQTdCNoAMko9IVkIWgnaCjoqW8KKp/WWAZi7p3WtLNoumF8gq3Wx6owaWW2bVh0rx06MlWVnxdSGxdT6D4yJ+5bEyp69Y6U7t6BJlNaEgm3FKUoKFpmCiS8CMr6THAh0H92tJFMExBVjXBJW3G05wYINWxWVmMIVRnPIp29TWGuCq6DYynV+hNzk45/zw7EWfrgt0VWwofhsfogeB20FKfwQ7nf5u7SSHxQ+BxaBNoC2gvaCjoCM/CDuA7jf4e+Qg79N+aAi0EzQBtBW0BGQib8NdPK3xDe+RMEXgbj47Qtqb2JZbwId/A1wb/A3MLWXW4cUFnRKJpQfZ3y5ccaTHmfcKQUd/KXW73shooLYaUTUk0o2jaQBSnZrbn9fh+JtHTHP18Hfa9NCvruL+/H9FAFxzGQ/Rt5PGmgCqBa0CGQE9wq4V6gJdBPoblAEhCgDOkEa3wXaDXqF+oHCoAkgM9/XimE6+N7WYImvOIW/yJ8lDzy+hz8ny938GVm+wP8my+dRZqHcxZ9pzfJRsQ3tBBsnSifKfLQb+F/bctw+vdjFt8J3PmA+qAg0HjQTdCPIyLfy7NY5Pjc6eZJ2mQmarfSJLB+ke80UvsAXDpYiADUBwWFnggNs0DYEeTi47g5UBQRvuAWcgODV14ETELz0KnACgvMvBicgOOcCcAKC02eCExAcXwkO0MHv+nNOT9+Q8RcyrdjBL4GXLoGXLoGXLiGVXyJu+l4Vc/tDa14ePLY+HOqV52vawpqeYk2TWNO9rKmeNV3Jmq5iTSNY03msKcSaMlhTFmsKs6Yn2VC4oomF20+rFoa9rGkXa9rEmhpZU5A15bKmHNaksSHhDu5vPWuALMpl0VYsHjqUZ45E9nFwPzzqR8z7kRO2AveCdFkLQ0nLjimnZokyuy2vKFbvO6xgYfEYvgOGO7ANO+gASMUG7UAY7UAnO9CBA1gEmgnaBjoC0kFGaGdj4jdKdADzQUWgmaCVoCMgo5zOERCnhfEpPi4nlh+f9HhR4ztwiz9i+bk/nOnMcIacY5QbM5gji43P0rP4EEoRv4lwu8yuDpaw+duE775NIEuxhd/Ab6RMbMRN8fLG1u8zfR3s9tbgk77iZHYbZamIOlZIQZaLcig1yvogyjCLciBl8EdRFrRmTIWZozXY27eFJQqrzb7vM973fZLRwcF+nPGk71WtQ2Wtvn9A8uhm3/6Ma33P53eYIXkq2MFQbNGkamfGUN+mXVL1KjSsb/VdKYrNvisyRvsuzJAN9bGG8xpRCzt8k4LTfWPQX1nGLF+4EX1u9hVlnOcbEdMaJGw2+/phCqEYm4fJ9sqQgwayZIdThnSwhnBv0zpTlWm8abCpwNTb5Df5TJmmdFOS2W12mhPNdrPVbDYbzaqZ4xiTJM7LIXGKSzLKH2gaVfkDO8k7Ocmf1Mmf3XFm5nQ2RXo
oFbxicgle1ttmU8UsLfLN5EAHs06cHjEESljEXUEVlSWRoaGKDpM+KTIkVBExTTi3qoWxG6ohjfA1HYwqqzqYLkSr0sX/rXcSY65V16eL8oxV11dXkzfl4iJvkXukq3BU2c9AbRxPeft7T+MzI+sqJldFHsmsjhQIRs+sroj8Tvzneyf7kh0tL+tkX4iiuqpTGcm+LJ8k5MrIsurqig42VeqRxr6AHiLmC6lnxotZ6JFmzorprY/p5cIeejmigJ7FQrlSL9dikXoqE3otjTnlZS05OVLHo1Gj1Gn0aKfq7MqFTm6u1Elpol1SZ1dKk9CJjJQqGRlQycqQKiyNMqRKBkuTKlNPquTHVa49oXKtHElhJ3UyYjoJB7t0Eg5C59/PVb941ZfgFNY2vHr2DPGHi9pAeT2oNrL24gZvpGmWprXMro7/RSNYO2t2gyjr6iPVgfqyyOxAmdYyfMbPNM8QzcMDZS00o7yyqmVGuL6sdXh4eHmgrqy6bfSEgUNOG+vaE2MNnPAznU0QnQ0UY40e8jPNQ0TzaDHWEDHWEDHW6PBoORbJGJ9Q1WKmkmp8cMmyjdusiNfadH91SYpz0UgZvMP93ivTt+C0spFsoeqIPVASSQCJpj7FfYpFE54p0ZQo/joVb/JeOdyfvoVtjDc5IXYFSii0dFnjMvKWzyuL/WvEBdHSZcLhMQw1/tKFtvJIuK6scSnh5JyHk3MRTs4tJhOktWJJkWFdMputHF/dMWFfCIcJoaKcUBSyEUJmscQVf7r/y+Kl/Bxt4k+2sXAWW0qN1Uokq6KSIxVUxv8MsAVnKfF6aKzGAhtZiDV29RGfduxrVxRizV20dFmci/tiabyMWcKkscslJy7hrNAJjy1Fh+JSSGHiMigK4/IL6zPbNvrOrBNSoB4lC1n042Qlq/zNjA1oJzswgRKAiRId+OI+Tk584B4nF/BHHENdwB7kBiZRD2Ay8AdKoSSgh5KBXuAxfCF7wKdRKvh0SgNmSMykdGAWZejf4+grUKNMoB8H2+8pmzRgAPgd5ZAfmEvZwCDwW+pJAeAZlAPEdy4wT2KIeurfUG86A9hHYl/KA+ZTCNiP+gD7A7+mAuoLHED5wIHUT/+KBkkcTP2BQ2gAcCgN1P9FhRKH0SDgcIkjaDDwTBoCHElDgUVUqH+JL8xhwGIaDiyhEcBS4BdURmcCy2kkcBQV6UdpNIWBY6gYeBaVAM+WWEGlwHOoDDiWRulHaJzE8TQaOIHGACfSWfrnNEniZDobWEkV+mGaQmOBUyVOo3HAKhqvf0bVNAE4HXiYzqWJ4GfQZGANVQLPkziTpuj/pFqaCqyjacBZwE9pNlUD59B0YD2dCzyfZuif0FyJDVQDnEfn6R/TBVQL/kKJ86kOuIBmQX4RzQYulLiI5ugf0WKqBy6hucBGiUupQf+QltE84MV0AfAS4Ae0nC4ErqAFwEvpIuBlEi+nhcAraBHwSlqsv08rJTZRI/AqWgr8DS3TxW9BLgZeLXEVXaIfomtoOXA1rQCuoUuB19Jl+rvUTJcD19IVkFwHfJeupyuBN9BK4I10FfAm4EG6mX4DvIV+C/wdXa0foFsl/p5WAdfRauBttAattwMP0B10LXA9Nevv0B9oLfBOug74R4l30Q3ADXQj8G66CXgP8G26l24G3ke3AO+n3wEfoFv1t+hB+r3+Jj1E64Ab6TbgwxIfoduBj9IdwD/RH4CbJD5GdwIfpz8CI3QXsAX4BrXSBmAb3Q1sp3v11+kJuk9/jTZL/DPdD+ygB4Cd9CBwi8QnaSPwKXpYf5X+Qo8An5a4lR4FbqM/Af9Km4Db6THgDnpcf4V2UgT4N2rR/0HPSHyWWoHPUZu+n56nduAuegL4Am0G7qY/A/dQB/BF6gTulbiPtgD/Tk8BX6K/6C/Ty8CXaD89DfwHbQW+Qtv0v9OrEl+j7cDXaQfwDdoJfFPiW/Q34Nv0DPAdelbfRwckHqTn9b30Lu0CHqIXgO9JfJ9
2Az+gPcAP6UXgR7RPf5E+lvgJ/R34Kb2k76F/0svAzyQepv3Az+kVfTcdoVeBRyV+Qa8Bv6TXgf+iN4BfSfya3tJfoG/obeC39A7wO+Au+p4OAI/RQeAP9C7wR4nH6T39eYrS+0CdPgD+N6f/38/pX/zKc/o/u53TP/mFnP7JT3L6x7+Q0z/6SU7/sBs5/f0TOX3JaTn9vV/I6e/JnP7eT3L6IZnTD52S0w/JnH5I5vRDp+T0d3+S0w/KnH5Q5vSDv8Kc/vr/o5y+/785/b85/VeX03/t5/Rfb07/pXP6f3P6f3P6z+f05379Of1/ABquEH0KZW5kc3RyZWFtCmVuZG9iago5IDAgb2JqCjw8L1R5cGUgL0ZvbnREZXNjcmlwdG9yCi9Gb250TmFtZSAvQXJpYWxNVAovRmxhZ3MgNAovQXNjZW50IDkwNS4yNzM0NAovRGVzY2VudCAtMjExLjkxNDA2Ci9TdGVtViA0NS44OTg0MzgKL0NhcEhlaWdodCA3MTUuODIwMzEKL0l0YWxpY0FuZ2xlIDAKL0ZvbnRCQm94IFstNjY0LjU1MDc4IC0zMjQuNzA3MDMgMjAwMCAxMDA1Ljg1OTM4XQovRm9udEZpbGUyIDggMCBSPj4KZW5kb2JqCjEwIDAgb2JqCjw8L1R5cGUgL0ZvbnQKL0ZvbnREZXNjcmlwdG9yIDkgMCBSCi9CYXNlRm9udCAvQXJpYWxNVAovU3VidHlwZSAvQ0lERm9udFR5cGUyCi9DSURUb0dJRE1hcCAvSWRlbnRpdHkKL0NJRFN5c3RlbUluZm8gPDwvUmVnaXN0cnkgKEFkb2JlKQovT3JkZXJpbmcgKElkZW50aXR5KQovU3VwcGxlbWVudCAwPj4KL1cgWzAgWzc1MF0gMzkgWzcyMi4xNjc5NyA2NjYuOTkyMTkgMCAwIDcyMi4xNjc5NyAwIDAgMCA1NTYuMTUyMzQgMCAwIDc3Ny44MzIwMyAwIDAgNzIyLjE2Nzk3XSA1OCBbOTQzLjg0NzY2XV0KL0RXIDA+PgplbmRvYmoKMTEgMCBvYmoKPDwvRmlsdGVyIC9GbGF0ZURlY29kZQovTGVuZ3RoIDI2NT4+IHN0cmVhbQp4nF2RTWuEMBCG7/kVc9welmi6snsQYdcieOgHtf0Bmow2UJMQ48F/33xsLXQggYd538nMhNbtU6ukA/pmNe/QwSiVsLjo1XKEASepSM5ASO7uFG8+94ZQb+62xeHcqlGTsgSg7z67OLvB4Sr0gA+EvlqBVqoJDp9157lbjfnGGZWDjFQVCBx9pefevPQzAo22Yyt8Xrrt6D1/io/NILDIeeqGa4GL6TnaXk1IysxHBWXjoyKoxL98kVzDyL96G9Ts5tVZdrpUkZpEdaRHlqhJVEQqWKJronN85V4v/62+N8POUcYuqdLprk750F5Y4z47X631Y8ddx3nDpFLh/h1Gm+AK5wck/4erCmVuZHN0cmVhbQplbmRvYmoKNCAwIG9iago8PC9UeXBlIC9Gb250Ci9TdWJ0eXBlIC9UeXBlMAovQmFzZUZvbnQgL0FyaWFsTVQKL0VuY29kaW5nIC9JZGVudGl0eS1ICi9EZXNjZW5kYW50Rm9udHMgWzEwIDAgUl0KL1RvVW5pY29kZSAxMSAwIFI+PgplbmRvYmoKeHJlZgowIDEyCjAwMDAwMDAwMDAgNjU1MzUgZiAKMDAwMDAwMDAxNSAwMDAwMCBuIAowMDAwMDAwNDUwIDAwMDAwIG4gCjAwMDAwMDAxMDcgMDAwMDAgbiAKMDAwMDAxMDExMCAwMDAwMCBuIAowMDAwMDAwMTQ0IDAwMDAwIG4gCjAwMDAwMDA2NTggMDAwMDAgbiAKMDAwMDAwMDcxMyAwMDAwMCBuIAowMDAwMDAwNzYwIDAwMDAwIG4gCjAwMDAwMDkyMzkgMDAwMDAgbiAKMDAwMDAwOTQ2NiAwMDAwMCBuIAowMDAwMDA
5Nzc0IDAwMDAwIG4gCnRyYWlsZXIKPDwvU2l6ZSAxMgovUm9vdCA3IDAgUgovSW5mbyAxIDAgUj4+CnN0YXJ0eHJlZgoxMDI0MgolJUVPRg==", - }, - "name": "file", - "label": None, - "show_label": True, - "style": {}, - "elem_id": None, - "interactive": None, - "visible": True, -} -BASE64_MICROPHONE = { - "name": "/var/folders/t1/j7cmtcgd0mx43jh9nj_r9mmw0000gn/T/audiovb4gqjpc.wav", - "data": "data:audio/wav;base64,GkXfo59ChoEBQveBAULygQRC84EIQoKEd2VibUKHgQRChYECGFOAZwH/////////FUmpZpkq17GDD0JATYCGQ2hyb21lV0GGQ2hyb21lFlSua7+uvdeBAXPFh1upJLeC6SCDgQKGhkFfT1BVU2Oik09wdXNIZWFkAQEAAIC7AAAAAADhjbWERzuAAJ+BAWJkgSAfQ7Z1Af/////////ngQCjQdyBAACA+4PMpH/n1EPs4MPlDak5Fzh3pT23QOozrpMIemMucj6646WZTq/qWAjImUB4j/aEtJ08SjAyqjqFq+2zZ5BmqSKaDZJtE8pZRnh7pd/ez05WinXc/FkOyULyhFtAKY7v5MAAAAAAAAAAAAAAAAAAAKADzFGuPnjkNLV2iu/mGqmEkZOFkTDa9XGu/V+C8YKNhgXB0voRMsMX5rHf2WcKpFvpWqoiFsq5scEBbG0cNIjdGoU+Z3Scu5r9OMpyp0ETCKhFwi+D/g/ukqguM4i4rX7bjr3/IZCXAiOQ40t44c3thLsE9d7N/U6uePnhBMMh4hOCoQEL9bQcJHJpEL8EJsRPhIMhSZI9/aBmdmUAb56PS8k6qVyW57IMTYbCOJ9d0wjC1rwuLwUWeX6YCLfpX3T2QXdSsjThYFKwgsJm4i33Piwe/liwLaUeKfa4XjbkP5zsHX4C78gpFRf77q3Pg5bvCukbN416f+vQiBunXlcZ/RSdUXg9phF/TftxJ8NOk+sxY19g0TVRy2UMBV9uVxW9nFrQCLYxhOK50MLQDvEtRzEFGD8rvpc3cF7vKRFT9ObTh9vUx5pZ5Z0V7xngOWsz/DlxbzMBcRYOHeczi0aYAQZQqgnfAaNBO4EAPID7g2RpPsoN+j5Q9aclGv5D1s9CdoTT+mmvJ26cRh1bNNaI2isW9knZ3H+8hpVtpGfeLsG+6aQ8kkThDo84BlIX26mGsWfAaZlM0eJlPWlqxudzu2IFQXqLOzk819lC3X3zG4c+9EVLhEDepIDmRnjv6VCyjH6HmsJKeuZo/Lu0k/7RQww2vY/i9azLH5f0ew0XFNrHruB8MgFpwwzVxQttXpwhHTAl0B1zujsaaNX1+6vYSsv4DBORFKQiPYb69Nc+Sd46gbcItW11c6DcmdD0Jj8XOcNtjXKMryjRWdmEiYrAXVUTkZLsnIZJxpH3Dzs0V658BEWYfgNsrlVi2/8KaqOFpPXMyoZ4M1sWKtk13pRAk7xeQS0OqLKSkn8rzX1pPkKuONL0/vn8KKi9auAWZBE8+0u0JKNBe4EAeID7g3V/ImgFnHyflxJxgesfQU/hEw2cW/PTo6SRV89BxbmEbiiUEffK49yo3jalZn31EOX+GrVfONzQDcwz6+39msxgr7yRHJBXlCrGuDPhZn1yEg0nbQoC6cuaiocVGYivipU4B/cVG+SM/1JUZ1dOSMSi7IzUx/cIPxL9L329mCSn+d7e055zJthQaWzB35p0XbeLEmEDGf2xbm4Bt3eg0ROZMmKHC4tsVohbvjurVAhm31fk6KysYxJ3txAuMC6A6mpQMFmo9ADCLqwFP1rPFcR5+DNMCG+m4dvKSmF71lXvKi6kIVEP2U3KIsekd0GHY6W
4QpybjUBlcIvjEwFMJcGpoeBpVZ5O+HEIONYCOJ8y4Z68uThakypsLKgqkPa4bvnATI6Hj9WLkg43nnLxWXFIobaw6mrpqR7+JuwtY4eL37PP1hTYv6ypROfDtonK6CtKUZbae3Atqgk8dsiYy6f7UXPmovQcgK2j6VCK+k24/T2rrkqjQYOBALSA+wM746KTKovZvocJZAogLOpprNkJuKrxFmMsLcdV/47iA8juYNVF2DA+W4KiFx6t7bflq2DELtamBLn4H/5wvv3LBStiTBgg1fgcO+p1iWuEg1RqSvLOVJE6oVZUrRxqtEWRewOCVDMNand7Exdc4rsjl+d8TMeMdalskYwKiDRTPxjIu7jr4sFGehIAL5bod1tiEOq7YyPdSliPnxRT4VbrICMoy80t5E6+2H01d2eReYzsRtuP4uqAudLvM4zL/2pWwH2wC1QGEEIiKkDFAbAYPFmwqKxMEzm+uXr5xnbMB69B6pyqsp+yq9cWoT96Oh+XMMu6DmtVN1Q/qzkUET8zrXOb0sJ817V2Zaj+0QVAlmhjFVGE1q72JcXE2+PN/KFXMooVaS2rXraiJYiXCsc9FcmRo/JVjf51LVKtyaxGp3syZghPwnyiNhGpbXCA0yDn+qsx7zItsxbmjL3eG6mwI0jkdxMhy55MpbCpqBESfIiZiw2IHXxQtI6KPaqjQYOBAO+A+wMaWBXecBWrz98jGAjM2LAvlKxUqbKiDsOE97P6bQkKXREtptUPWrrOVJzSgiTue5uAOfnKc3lHkixhmZiIC6M+hmmWc0NxW8OekQfhpmE+juG6BoUE3FTKuRPrmGytfqahopLAtWxxvNDgX4TaoqylsdgXpMaS3ZinkA1UvsYQPxc56FIj4lFeF3f8ea39rtA1JzZka1asIQJl8wor2zoRzCW6+jX6anhLKEBjCuPy7TwZ1ACCpU1tw68DvFN0nqNpAsb0QdYOst2y8CjU2QshwUQIPLMhws+PipOdCawbkX/VltWSl3DGmJGx88lRf1AsGvGmykCkfuqXkTbVuUPeuFwHYNKmkcUs99U8aYYZyiOv8BjJzo3vQmYNAIrb+EcjUIlSE3ecrAVZv2oBGY04Ntf9oFYPUGWLRvvd8UswScVxAFToUISFozdpgrfZwWtYikqw8sTkxZRI/YDXY2Epk2O8w9XMVYdxI4FojNsKQXpYFnolP5vyPdmN17OjQYOBASuA+wNPuyhaEA457BBCiSmcmDmjbP5UFKpdUvdWLRXtxNZpxos2I1ZK+f0xmwbZx4Oq5hBWsNBBwdsd9zReiOwY/nl/gUEUynWmfNvDMLRfwb47JQWL+kqgDLRN5WPJTXTpyXvVRoI4amc7Wjbesai+EG8PhcpuABFMJjNbcU+aGMJuT7rfb/PeAuapGwtQefLOeJG7ELIHjqe/Ehizufd2dhXL91M3E5syhmGzdrP5Qox/DKeQxt2f5QXr+S+YhpoHbzMI6hCSPBePzb3hdbbZ9kbabpnWBWZreAsINDgcwV4Yjx87NpZ6ThjvpqFL7GniPcqU3CAx5e35PXRwR1DgkSIqi4GEihWD4cKFWzDrxDAf4hSvvGLFBiVgu24oaJLNgqmBTunmozN3leeRDGK5RBq8CQ/1a/jPQxpKJqwP0HvfM62cutODtObEl6hOg9+MXSb5h9JYvABoo3oZa+WYiWCBl2z7WnAFN7fJsjteYtuvDUON/O9DW0v2YzNdTNOjQYOBAWeA+wNQbXIGz7NpKk31vLNIFhBPBHrdfP7xiV0usIfr3zJa4B+VymnG3ytGfixcorNxhKpbCs2H1cLrWjhSM9wcVdcRSWfQ1T12E5KV58cWTkfTEF9bW7H8cXhlcSvvgkjrWaQfIx0eA74JVzqFXx6BXdd9sZXRRmaOX8Ad+mz0fu5mIlwJW9KSk/M3g5W4ZGo/LslHWpPLfQo+7OPokpNR4WNCUdralfz7TBza7XMaWCGeYnUYFLf1POjtxvzdMgM
MxZ2pDcW76i4k6roOCGKWtjAC1wAE52lir7r6YUeqQbT8QMDFeIWHSOlSVZnmrgMalzfW5HB8UEDMnWsXNYYMGSJKffDXXH2rBb0GXJg8mYatPspytQUu5xyQOWJddWkgonoTU4mFWUSohuUcW2cpKk1rpdJpNKod0fpH5RyoZnAZZYXzeQeLA7sJ4LwUZ6OGwj4ZhZlvWxJRkIQtGJX1jgsyKAVToAwrYr5lI4pTHnj4bA/yiDkCjD/q1jeZsuujQYOBAaOA+wM/NZhxY3E0H687M+siqrTCmh9MPREIILn/hrUqspKTCRXlMIJ/PZeUsDAcyrRgWHR7RM5ah/IvKdCsJKLU5Q1nMGESjH90HaNBSHf4V/Fs+PVHqZdKbA9tt2lZJ3TINcySP0sw+99rHZckGW51Re684SKYmIZm5+1vxKGrdGImUXBz0zG9xkr0kutLvq6RhzvvYhj9orQvovv3/mvt6yQAXZ+Pv2lgC8iQXN0Y4/HS98zUWoPOcZklWrCt6dUB7JI/P0xNsTExjF8/wnDe255TT2uR5NcFJI4clXPaDVcUApXdBa0H1NzIb07WHX2nHpi05c+PYN+c65UVf8FnND8gDjByXsYy7Iqz8aSmIKULKM6iPi8GbhqkamKHLsTXIhnFih30L8HIAjhnleY7FiOxrIukUt3K0fXHWVVpyXklL9J5u/nuRV3epKbtTncXQu1MRf2S8vkYW2GGgX5xCBwoOwkESScUf9xWDwYqVz+VR+Gs7DKQWWnarIsg5XqjQYOBAd+A+wNAhhKTNez6jmto2HjPkkOFDiSfmZnHDYtbOb1vTXN8Rbs9VbTdLYwHbw14DpEljDRsQCBpvaAQQix+iBWCixroQ/dJkTS/2KnYzFOhlKaIQEffrhpW44LQM0pTabthfXVQit1fGsCsdr7zPOR2mrlb5ccvVbHcriovtP6lGzuWPOBqqQnuXKLkyPs6Y0Qa+9gAujc+jripZJKFOYlA9MSwgliyTOJbTkfI2wlqqTKKoU1bcZDQpp5Ye2Er6GaZo7ZGVn1gvz9lDOSMCMyr4Oq5y6Xktzw3CGM6UGX7SXMAOtbt2RjPaHtuXrAq+0qoI4+WbXIiscQqeItSTn4ikSLFJqymv4xvxcJQRfJB06y7ZpT3tx5A98/F/qDo7unBCn7veNDgQGQLcmimpW9SX5oQraYkndGHvNlFDSDOAsKOK3IJD7uekmUcr/WYVqArzNBwTrZ5tFQuZ/8JQo4xwX5Az3aG1fSMtG0l8i7jlER7MCybZGkjIq6MT2A0NbGjQYOBAhuA+wNETRKRPUv0GQWKTjosJhcXb995F1P2wm2q2Ol6kvyTxdCbaQL8LszJISOeAUYQhoOfGPW02CnVbW91T8PMnnj7qEIxbO8RdQhqJsTb1Ssio7Tu3Pshvnendh68/uAuB6sJywkAtWlsQAhOspjcSb8w+WY7JoHJUml9yJ2IUDIvQIEBQ8u1w500gsyRVh5cwpTVtng7jW12zb+AUriGGLmO3ut72EuK3uYtFNSInpI63kW1+poJ3e9H0Ejy4CDRd/76/mtifMI0l3OuTR/a+IIoN5r89222HTkSKLS587VDvvfyoKoj7IAlgQsjY4OqQYKsOFH+dVjs/8KBkYU2/T+Ruv60j7K6zURZ1027AwH5Mzcaf0Vv22hzoIuVhUb0UwHP029fsJQnlqH8hWzzaPcBmPreenDXWsne0HLoKsB7OX7r4ns/IHscX+MVNWHCYRumXwrH6y4ZS+nSgZyG9iPoEfgEWEloE9Y8SZdWh/9OgMteGZqteivn2g4rPSejQYOBAleA+wNQHGwm4JvyZW23Pqd3njZ31QMzuGuZLxXuiRWl8JR0b3PfiNBBxRxv00xBhQS+VrpOCeMRA/YdnecYyI+6knzQTazpTHGxU6S3pAO6elaxcBswmYTl+hSlcg4QXIgYEwCDEdWTpSRi6ALl3vXyvsu5Km9/iZnXGlSv0jM0ho8UIuw
zq5dXAUJPlrXg/hAYuZZc9wOkCNhpXdovJHXFnDzAs+fVYYBmghzjGCPXItR2w255cEWmnLy+U0Sg9IOLRGr5lvmyEXKaNXLKIWUdrF/rK91OSPrQay0Djis1tK2xdIZLTvDVlr8K3IEKoqJrBUzAGHZo7h7dm80vlTnBGU/21CfjaMi9JStWk4Ua7Q7b5qp6+5W2Bj9fpDZ2Ub1gZOoTn/rEUVameFjy6hbIdRt2U+XvAu8wKERAVzNRgaa2DhOL0UKzZg7HHI5IZSMIkExBT2ybFrDRog6lJsT1hAtcTtx5Psz+IF8UpjRi++WgvIr8iO2KhCA3AzvtpqajQYOBApOA+wOaoKR8kBqXC+u69TtLyz+S8831alq62o+0U1GEKnfJa9AtlUNR1nZJpw8DlA3QkaXVGagRdmsEKP/TKwyWdvkOMZbKPpr1Z/4mNfnjtkU7jWvs1q3kXzrnFlRFyjlmdoMt1A0TfhRxQA12VFHu2JJE2grGlKWSYKvcluKbJHE1JNagDp/qB+9lJvxMJA2kkDBQQfIR0mtpU1DTHEK9yE7fyHvCwOiyiveTCshlsSJ7WvlhHQx2Rtn7qjJlpb2SyOaNFJ297nllufOLenMk1kB4blxu4DnSg/g0zdmSGtwR8RVk9sQEiONuVJZubqKtiX/jpEG1CaUde6+FzNM/fyIvDhbjFIjqxPdDYLWZNl3l5gCD1E54kKXeUMe7eDToWhk+0dGI/4XDIp3pI6a0SbsWxNk09UulucwiCZaPl0MenCskrh26NQ+Zd6LJsW6JfD79si1E/GKhB3LX0YcYvY/2HD/WcOcZ9JzNdwG3KMf1zX0OxXBrORAg7J7pQnCjQYOBAs+A+wNAf/DMyDZlK6zqR28ylj2JXQzg9e4kK5/vL75PNSiMO1tdchiii4UVc6iTjfYJXXjqG73LpuuQ7T1HtWj4u6hVQNg6SZts3qxlTpIjuWXdVMaeKeYc7x/DGPG0S4DVmC9U+z9IF2icsvHHxF0BoV53aC2jdlTBcV+vw8xeafm7QOrKTmL7nglxbza94cJtcaD5gs5Vfrwgoij71pTNiyZ9iDt0I3oLbNCAZeqMtSbp+PFnK3Tv+zhx2JKtM7PrUyHTW3qo5LREn+G+7EBUKmCFmtStGBP72FBROCzkZH0TTv1U5Gqz4JnPj5YBfx+jkQx5jznc3p1ldEZz6ysYl1GXN1fI4CsGygqvFzrLQAn5x8o9WrgtaYQxEOAWTHK1Vp9x1+X9EgA7RZV+9yalHCaKjBjLx7iea7pju/muJ27jlKygb7W2t0rj2xXlVJxxU2KXSn8atgwt4aGQBJMEavLgDP1Z+Bmvlo57X9DnTLbxP82j2chb6T/TcafjRu+jQYOBAwuA+wM9aYQ8fhQxZZsS2xCi8dq5DrCTihUpnjchwR5VGlVhZqycrEkjLIsJe6nCBs7RdeOKxphz4n1KS5FtcYRUJeR7sQ2sDW/NC3G1h3qyRMIj9A38wP6FnnVZvHIy0jGzgUeh9X6s/6tlMscE3fN1+hZaeCq6jD147dSsrOS+YW+NPUEjw5WJ33BOp73DuqlxpXeegP/gPFS52aZ5hZ7uz/WQkJ4qAgmEUb/J0iVdRXzO8/0XK00qq+Rp+cWLZLbDuoYHqK/xg8aMq3ZN1iQ97/TLkpe6RX0BI0ddUoiMTiHtlbcSf1KUAwQfGsUgRTJNIxdelIDzHS17DbyG5COPSRpKYWC8f4zsxoS8jHzdZE/kKUA0KIUP8AYc3qrfrZiLPdkbmqKn4ixlJEdnbPTF6IVxmCoeR1sKjJGjwWrUxCIrKDiN8K3viGPgsbsHytbfffzf6EEeUYxkFROPx1SFMgODw5GsnOcMozYrg97DD80a+DMr//dEjV6jO+IujEijQYOBA0eA+wNAdJcvOohN2QTQF4F/DpVelPfdj8pYus9E31VBsUUGHNaHbhjBqeo+/D2MI6AQ1NOHUteCsYt7dF7NIWx5JqH/uL7whC2
fOSjBwHT5oPw8ZKfXIUwGbk5J1RZrdbVVfaYwJViuAeqXs/WdUg/2PD4gT29h9Q5fpq+vhFI1BwPaPxEZFtEv1t/+K7fNrmhNBYG/30bsBKVHbw5AmrSim6Dhkd/pGE5RG4D8ecsUvGlB+rnqACTHzs7uxY0gdTYq2r4WH2P7DeXqVcMKMWBUG76hI6IGKW7vBXNbF43Ap2vlJEmZURzB35jl5QkSbE1owbFLDHOoyDb+YDt08HeSKkRFgxHjKVAbSWeGMQhFDP5v9kszHwCCUnKRkpK/CR2vIqna2IBO0QsE49PTjmFBQ2plpBuprVOOXymr3jVsqy7902HVHr7rUfE28Nz3/ikOuBtgGy2KBk/Yxa2ksK2rePpck18oI8h2uYpt0wnaurMeOB0X+hHVZE1O/kSIBvSjQYOBA4OA+wM/WaFrl20Ui032X9rmUgKVbM5pprwG4iPi6fxUJg3gmiRJDgFgneXHJplCRLCx+F8qZa885m/GPHCqot6MZN8BJDNdnquocrEBezXh0haYqkjxDx085K1fWwVJCkMyCRPMx+KUg4A1XgF3OqjgWx+VHHj66mq2F0k9otZ0UC5qRC2Qq51JhgRMAJqQLtU8cOb08hG+QX/Yter2qSR+lLoLAikjQ+QQUOO0hCJuXA/gP6SXXH1dqLNhkASFpvbKsosmT/QLiiRZidbJ/6Ct6lYyOG5eP0lYRjrP6mK6mnOaKuFw5tLG9qxKw6IoeEeY7WI+A8mr94Wrn8kl9bKTsjy+zA+C0SBq6aUzeZQn5OtzH5O7h4u9MPOnAylvIEjR+bdWoQlK7FJOuA77nR8NHrb5bEbKMDfR/aKB++XizUvI182P7M6AwP8Uhyi+Hajd2qmBzGeN/iays/z3hP3ZPd7z45r0LIXw7H9zZ0UcxkJgXPTFbg7FjGACIo3mtsKjQYOBA7+A+wNA8LZSgbInqd+Lz420l4sGZEKHpdRbYp5yK2MIkNvrRkZ6tJKIJIQnGKRoTHslyhhrKmuGqWAwT3PuL33CT3S2kjXU5JzvN/lJTK7clyJc1PunTG2+ipQtq73aW/YNNA4LvWPLL1FB62kooYZrrLNsFnF1k65HLRtPwqZP0fhKIj3V/eQ31fhNcF9ZqINrTnZy7pm620I5gqXWUykwFgJUJh5Lp5G0I3pJu9tsmTVBLs3ArDnvTc+aiWyVCQSwZwaMsMNpQMg9opB9aP9+mfa+fqM3uDqr2+a8c4m99ZCLLaqWlFZUi1uSy5bGgywJVbwAhYd7W5FU+7WVp5YLMEB0tP7qYg84kzz2tF3th7hQ5gMqJEMuSp3yOWiiqCFvC6k+ydaa0DNsJ3NnpdUn+hmow9CBLHREnz98RUQtm2UeiINGE6Yo7990Fil/jT14QAroZVgwYsATUGbFO0CktdifhlL4HmJKE/nVhVimji6WtLzevBmN2WDj32CfEaqjQYOBA/uA+wM/GMfyC+5QrcCefekrpbSeOkVMpX4wlR5dXuW2BEgceI0M/cUHWYLuDuS5B3FLerjXFoaPf/sm0zQJ543mF51/Hrl5b87/60bg9id822D8lhIt1Xi6ZhPJE0DiBP3Y0vFsvHhMvTyBfHHJaC8tRcZqj2yXkBcDZ8VsPW736sGiUZeUhHEj02jU4v1ZaVFhzsDcl2pd5EjcP3Gtw6hpwDongj6HAPvsbR0XV4zeCHSsKBEDhRL1Ct74hF/cfl8KP35Q46qnDsp6mNXnIHuKUYNHOcp/Tqhn1WjN35J/Hi0BnArFIMZutnohF3k+aEIu2H4i9XLPx6CBcNK0KRZe70A6SU22uucHcuWPCbjzRajRFJmmPHCO4/uKLzrClZu0xMnxu9OBiCcjIl7Cu125NthcX4nbGZeEcq2vS2lzKHQxUbhhtyf/OQs+ZLOoFaUw1lR3HHSA6Ksgh4WrpUElDOjkJjU5+eLzmcFj446vVazES2L0oKevLHuWc9ILB96jQYOBBDeA+wMiSCbZHA9+efZ
Tpl9vDEEghmDumuTbPih1BuT2XdpmdSFAkE0aL7c94+DmMh4ty1NCNe5U9XxGmAJE2X6SJ+/8RIwb0q2qi4cXGtErOJcs1iJIyzNY3sUfwwuhRgh5aoILHvb7op6CUSra6naxswSnyrUIf31ixyPM89TdWunmnNxCESOmc0dxr8YezmShtH9vMi4iK3xqp5loO8B3o1AJv1cmQXfmSFZSS6TlO9QNjm0IaTf4nkqKPIuJ7dJz00sBH5h9zUaMOwnWw7dBV7JtHFUtfuMt5ON3rWQDGOOOSNb4lbDRfb+NukSx6pQpe7Jhi1AZgldTrc0KiJN+e++k3oMeJi1TcOKjQWqBC3yA+4N6dDwMxHNO+IQBmpdfP0txPhVzIcgigTgvKU5Z5B1UsRKL9ntO264vpfXb5qIKZxsTL5gRb8cr9SYTeqZALPPTiPyGprMmcMYuymH6kpdtssAZlmkq4Fkqn0MpsUuusampz8fBVqvkLCLFskN9a3CSKYFYtq5xs2tOY4ZbQOZ233nmPWjtAdzO0TB8XKDJp0uSQqZL8Nxi8qDxe/jPJ3BsNvZ45GD/a1d586VN6+tJnhpq2kUi1JGbjYm748Pqn2jbshfpzO8yxTafKk8YdMTvOZUTJDspa9HGPxojq/Kre4L3WNpEFtsvcB9voh213Jgifb5VIHZtq4wfBeIRSdF2sI+CEpWGXpuLHN1oOiprwIrq5LpEHhHGxQIiDYSiIk/gYzGadEuyUOq8o2B1k5oULZE+dho02hlKtK0cCWv5QADlyz/QhXPZkiwqzgzCRJDGj/1y02xNbBb4ZFanAXgtm/l7fg==", -} diff --git a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/httpcore/_backends/anyio.py b/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/httpcore/_backends/anyio.py deleted file mode 100644 index 1ed5228dbde1732de50677e9a3bd6f04a3017433..0000000000000000000000000000000000000000 --- a/spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/httpcore/_backends/anyio.py +++ /dev/null @@ -1,145 +0,0 @@ -import ssl -import typing - -import anyio - -from .._exceptions import ( - ConnectError, - ConnectTimeout, - ReadError, - ReadTimeout, - WriteError, - WriteTimeout, - map_exceptions, -) -from .._utils import is_socket_readable -from .base import SOCKET_OPTION, AsyncNetworkBackend, AsyncNetworkStream - - -class AnyIOStream(AsyncNetworkStream): - def __init__(self, stream: anyio.abc.ByteStream) -> None: - self._stream = stream - - async def read( - self, max_bytes: int, timeout: typing.Optional[float] = None - ) -> bytes: - exc_map = { - TimeoutError: ReadTimeout, - anyio.BrokenResourceError: ReadError, - anyio.ClosedResourceError: ReadError, - } - with map_exceptions(exc_map): - with anyio.fail_after(timeout): - try: - return await 
self._stream.receive(max_bytes=max_bytes) - except anyio.EndOfStream: # pragma: nocover - return b"" - - async def write( - self, buffer: bytes, timeout: typing.Optional[float] = None - ) -> None: - if not buffer: - return - - exc_map = { - TimeoutError: WriteTimeout, - anyio.BrokenResourceError: WriteError, - anyio.ClosedResourceError: WriteError, - } - with map_exceptions(exc_map): - with anyio.fail_after(timeout): - await self._stream.send(item=buffer) - - async def aclose(self) -> None: - await self._stream.aclose() - - async def start_tls( - self, - ssl_context: ssl.SSLContext, - server_hostname: typing.Optional[str] = None, - timeout: typing.Optional[float] = None, - ) -> AsyncNetworkStream: - exc_map = { - TimeoutError: ConnectTimeout, - anyio.BrokenResourceError: ConnectError, - } - with map_exceptions(exc_map): - try: - with anyio.fail_after(timeout): - ssl_stream = await anyio.streams.tls.TLSStream.wrap( - self._stream, - ssl_context=ssl_context, - hostname=server_hostname, - standard_compatible=False, - server_side=False, - ) - except Exception as exc: # pragma: nocover - await self.aclose() - raise exc - return AnyIOStream(ssl_stream) - - def get_extra_info(self, info: str) -> typing.Any: - if info == "ssl_object": - return self._stream.extra(anyio.streams.tls.TLSAttribute.ssl_object, None) - if info == "client_addr": - return self._stream.extra(anyio.abc.SocketAttribute.local_address, None) - if info == "server_addr": - return self._stream.extra(anyio.abc.SocketAttribute.remote_address, None) - if info == "socket": - return self._stream.extra(anyio.abc.SocketAttribute.raw_socket, None) - if info == "is_readable": - sock = self._stream.extra(anyio.abc.SocketAttribute.raw_socket, None) - return is_socket_readable(sock) - return None - - -class AnyIOBackend(AsyncNetworkBackend): - async def connect_tcp( - self, - host: str, - port: int, - timeout: typing.Optional[float] = None, - local_address: typing.Optional[str] = None, - socket_options: 
typing.Optional[typing.Iterable[SOCKET_OPTION]] = None, - ) -> AsyncNetworkStream: - if socket_options is None: - socket_options = [] # pragma: no cover - exc_map = { - TimeoutError: ConnectTimeout, - OSError: ConnectError, - anyio.BrokenResourceError: ConnectError, - } - with map_exceptions(exc_map): - with anyio.fail_after(timeout): - stream: anyio.abc.ByteStream = await anyio.connect_tcp( - remote_host=host, - remote_port=port, - local_host=local_address, - ) - # By default TCP sockets opened in `asyncio` include TCP_NODELAY. - for option in socket_options: - stream._raw_socket.setsockopt(*option) # type: ignore[attr-defined] # pragma: no cover - return AnyIOStream(stream) - - async def connect_unix_socket( - self, - path: str, - timeout: typing.Optional[float] = None, - socket_options: typing.Optional[typing.Iterable[SOCKET_OPTION]] = None, - ) -> AsyncNetworkStream: # pragma: nocover - if socket_options is None: - socket_options = [] - exc_map = { - TimeoutError: ConnectTimeout, - OSError: ConnectError, - anyio.BrokenResourceError: ConnectError, - } - with map_exceptions(exc_map): - with anyio.fail_after(timeout): - stream: anyio.abc.ByteStream = await anyio.connect_unix(path) - for option in socket_options: - stream._raw_socket.setsockopt(*option) # type: ignore[attr-defined] # pragma: no cover - return AnyIOStream(stream) - - async def sleep(self, seconds: float) -> None: - await anyio.sleep(seconds) # pragma: nocover diff --git a/spaces/Dagfinn1962/Dreamlikeart-Anime-1.0/style.css b/spaces/Dagfinn1962/Dreamlikeart-Anime-1.0/style.css deleted file mode 100644 index 85afb3cc45fb3b89191feaa5d8ecc758976a583f..0000000000000000000000000000000000000000 --- a/spaces/Dagfinn1962/Dreamlikeart-Anime-1.0/style.css +++ /dev/null @@ -1,94 +0,0 @@ -#col-container { - background:rgb(9, 2, 27); - color:#FFFFFF; - width: 100%; - margin-left: auto; - margin-right: auto; -} -a { - color: inherit; - text-decoration: underline; -} -.gradio-container { - background:rgb(9, 2, 27); 
- color:#FFFFFF; - font-family: 'IBM Plex Sans', sans-serif; -} -.gr-button { - color: white; - border-color: #9d66e5; - background: #9e5bf6; -} -input[type='range'] { - accent-color: #9d66e5; -} -.dark input[type='range'] { - accent-color: #dfdfdf; -} -.container { - background:rgb(9, 2, 27); - width: 100%; - margin: auto; - padding-top: 1.5rem; - border-radius: 20px; -} -#gallery { - min-height: 22rem; - margin-bottom: 15px; - margin-left: auto; - margin-right: auto; - border-bottom-right-radius: .5rem !important; - border-bottom-left-radius: .5rem !important; -} -#gallery>div>.h-full { - min-height: 20rem; -} -.details:hover { - text-decoration: underline; -} -.gr-button { - white-space: nowrap; -} -.gr-button:focus { - border-color: rgb(147 197 253 / var(--tw-border-opacity)); - outline: none; - box-shadow: var(--tw-ring-offset-shadow), var(--tw-ring-shadow), var(--tw-shadow, 0 0 #0000); - --tw-border-opacity: 1; - --tw-ring-offset-shadow: var(--tw-ring-inset) 0 0 0 var(--tw-ring-offset-width) var(--tw-ring-offset-color); - --tw-ring-shadow: var(--tw-ring-inset) 0 0 0 calc(3px var(--tw-ring-offset-width)) var(--tw-ring-color); - --tw-ring-color: rgb(191 219 254 / var(--tw-ring-opacity)); - --tw-ring-opacity: .5; -} -#advanced-options { - margin-bottom: 20px; -} -.footer { - background: #090956; - color:#FFFFFF; - margin-bottom: 45px; - margin-top: 35px; - text-align: center; - border-bottom: 1px solid #e5e5e5; -} -.footer>p { - color:#FFFFFF; - font-size: .8rem; - display: inline-block; - padding: 0 10px; - transform: translateY(10px); - background: white; -} -.dark .logo{ filter: invert(1); } -.dark .footer { - border-color: #303030; -} -.dark .footer>p { - background: #0b0f19; -} -.acknowledgments h4{ - color:#FFFFFF; - margin: 1.25em 0 .25em 0; - font-weight: bold; - font-size: 115%; -} - diff --git a/spaces/Datasculptor/3D-Room-Layout-Estimation_LGT-Net/dataset/pano_s2d3d_mix_dataset.py 
b/spaces/Datasculptor/3D-Room-Layout-Estimation_LGT-Net/dataset/pano_s2d3d_mix_dataset.py deleted file mode 100644 index d8f8444b20f89b1c1b1ad274c7c7d0274ef5aa2f..0000000000000000000000000000000000000000 --- a/spaces/Datasculptor/3D-Room-Layout-Estimation_LGT-Net/dataset/pano_s2d3d_mix_dataset.py +++ /dev/null @@ -1,91 +0,0 @@ -""" -@date: 2021/6/16 -@description: -""" - -import os - -from dataset.pano_s2d3d_dataset import PanoS2D3DDataset -from utils.logger import get_logger - - -class PanoS2D3DMixDataset(PanoS2D3DDataset): - def __init__(self, root_dir, mode, shape=None, max_wall_num=0, aug=None, camera_height=1.6, logger=None, - split_list=None, patch_num=256, keys=None, for_test_index=None, subset=None): - assert subset == 's2d3d' or subset == 'pano', 'error subset' - super().__init__(root_dir, None, shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, subset) - if logger is None: - logger = get_logger() - self.mode = mode - if mode == 'train': - if subset == 'pano': - s2d3d_train_data = PanoS2D3DDataset(root_dir, 'train', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 's2d3d').data - s2d3d_val_data = PanoS2D3DDataset(root_dir, 'val', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 's2d3d').data - s2d3d_test_data = PanoS2D3DDataset(root_dir, 'test', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 's2d3d').data - s2d3d_all_data = s2d3d_train_data + s2d3d_val_data + s2d3d_test_data - - pano_train_data = PanoS2D3DDataset(root_dir, 'train', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 'pano').data - self.data = s2d3d_all_data + pano_train_data - elif subset == 's2d3d': - pano_train_data = PanoS2D3DDataset(root_dir, 'train', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 'pano').data - pano_val_data = PanoS2D3DDataset(root_dir, 'val', shape, 
max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 'pano').data - pano_test_data = PanoS2D3DDataset(root_dir, 'test', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 'pano').data - pano_all_data = pano_train_data + pano_val_data + pano_test_data - - s2d3d_train_data = PanoS2D3DDataset(root_dir, 'train', shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, 's2d3d').data - self.data = pano_all_data + s2d3d_train_data - else: - self.data = PanoS2D3DDataset(root_dir, mode, shape, max_wall_num, aug, camera_height, logger, - split_list, patch_num, keys, None, subset).data - - if for_test_index is not None: - self.data = self.data[:for_test_index] - logger.info(f"Build dataset mode: {self.mode} valid: {len(self.data)}") - - -if __name__ == '__main__': - import numpy as np - from PIL import Image - - from tqdm import tqdm - from visualization.boundary import draw_boundaries - from visualization.floorplan import draw_floorplan - from utils.boundary import depth2boundaries - from utils.conversion import uv2xyz - - modes = ['test', 'val', 'train'] - for i in range(1): - for mode in modes: - print(mode) - mp3d_dataset = PanoS2D3DMixDataset(root_dir='../src/dataset/pano_s2d3d', mode=mode, aug={ - # 'STRETCH': True, - # 'ROTATE': True, - # 'FLIP': True, - # 'GAMMA': True - }, subset='pano') - continue - save_dir = f'../src/dataset/pano_s2d3d/visualization1/{mode}' - if not os.path.isdir(save_dir): - os.makedirs(save_dir) - - bar = tqdm(mp3d_dataset, ncols=100) - for data in bar: - bar.set_description(f"Processing {data['id']}") - boundary_list = depth2boundaries(data['ratio'], data['depth'], step=None) - pano_img = draw_boundaries(data['image'].transpose(1, 2, 0), boundary_list=boundary_list, show=False) - Image.fromarray((pano_img * 255).astype(np.uint8)).save( - os.path.join(save_dir, f"{data['id']}_boundary.png")) - - floorplan = draw_floorplan(uv2xyz(boundary_list[0])[..., 
::2], show=False, - marker_color=None, center_color=0.8, show_radius=None) - Image.fromarray((floorplan.squeeze() * 255).astype(np.uint8)).save( - os.path.join(save_dir, f"{data['id']}_floorplan.png")) diff --git a/spaces/DemoLou/moe-tts/transforms.py b/spaces/DemoLou/moe-tts/transforms.py deleted file mode 100644 index 4793d67ca5a5630e0ffe0f9fb29445c949e64dae..0000000000000000000000000000000000000000 --- a/spaces/DemoLou/moe-tts/transforms.py +++ /dev/null @@ -1,193 +0,0 @@ -import torch -from torch.nn import functional as F - -import numpy as np - - -DEFAULT_MIN_BIN_WIDTH = 1e-3 -DEFAULT_MIN_BIN_HEIGHT = 1e-3 -DEFAULT_MIN_DERIVATIVE = 1e-3 - - -def piecewise_rational_quadratic_transform(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails=None, - tail_bound=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - - if tails is None: - spline_fn = rational_quadratic_spline - spline_kwargs = {} - else: - spline_fn = unconstrained_rational_quadratic_spline - spline_kwargs = { - 'tails': tails, - 'tail_bound': tail_bound - } - - outputs, logabsdet = spline_fn( - inputs=inputs, - unnormalized_widths=unnormalized_widths, - unnormalized_heights=unnormalized_heights, - unnormalized_derivatives=unnormalized_derivatives, - inverse=inverse, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative, - **spline_kwargs - ) - return outputs, logabsdet - - -def searchsorted(bin_locations, inputs, eps=1e-6): - bin_locations[..., -1] += eps - return torch.sum( - inputs[..., None] >= bin_locations, - dim=-1 - ) - 1 - - -def unconstrained_rational_quadratic_spline(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails='linear', - tail_bound=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - 
inside_interval_mask = (inputs >= -tail_bound) & (inputs <= tail_bound) - outside_interval_mask = ~inside_interval_mask - - outputs = torch.zeros_like(inputs) - logabsdet = torch.zeros_like(inputs) - - if tails == 'linear': - unnormalized_derivatives = F.pad(unnormalized_derivatives, pad=(1, 1)) - constant = np.log(np.exp(1 - min_derivative) - 1) - unnormalized_derivatives[..., 0] = constant - unnormalized_derivatives[..., -1] = constant - - outputs[outside_interval_mask] = inputs[outside_interval_mask] - logabsdet[outside_interval_mask] = 0 - else: - raise RuntimeError('{} tails are not implemented.'.format(tails)) - - outputs[inside_interval_mask], logabsdet[inside_interval_mask] = rational_quadratic_spline( - inputs=inputs[inside_interval_mask], - unnormalized_widths=unnormalized_widths[inside_interval_mask, :], - unnormalized_heights=unnormalized_heights[inside_interval_mask, :], - unnormalized_derivatives=unnormalized_derivatives[inside_interval_mask, :], - inverse=inverse, - left=-tail_bound, right=tail_bound, bottom=-tail_bound, top=tail_bound, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative - ) - - return outputs, logabsdet - -def rational_quadratic_spline(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - left=0., right=1., bottom=0., top=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - if torch.min(inputs) < left or torch.max(inputs) > right: - raise ValueError('Input to a transform is not within its domain') - - num_bins = unnormalized_widths.shape[-1] - - if min_bin_width * num_bins > 1.0: - raise ValueError('Minimal bin width too large for the number of bins') - if min_bin_height * num_bins > 1.0: - raise ValueError('Minimal bin height too large for the number of bins') - - widths = F.softmax(unnormalized_widths, dim=-1) - widths = min_bin_width + (1 - min_bin_width * num_bins) 
* widths - cumwidths = torch.cumsum(widths, dim=-1) - cumwidths = F.pad(cumwidths, pad=(1, 0), mode='constant', value=0.0) - cumwidths = (right - left) * cumwidths + left - cumwidths[..., 0] = left - cumwidths[..., -1] = right - widths = cumwidths[..., 1:] - cumwidths[..., :-1] - - derivatives = min_derivative + F.softplus(unnormalized_derivatives) - - heights = F.softmax(unnormalized_heights, dim=-1) - heights = min_bin_height + (1 - min_bin_height * num_bins) * heights - cumheights = torch.cumsum(heights, dim=-1) - cumheights = F.pad(cumheights, pad=(1, 0), mode='constant', value=0.0) - cumheights = (top - bottom) * cumheights + bottom - cumheights[..., 0] = bottom - cumheights[..., -1] = top - heights = cumheights[..., 1:] - cumheights[..., :-1] - - if inverse: - bin_idx = searchsorted(cumheights, inputs)[..., None] - else: - bin_idx = searchsorted(cumwidths, inputs)[..., None] - - input_cumwidths = cumwidths.gather(-1, bin_idx)[..., 0] - input_bin_widths = widths.gather(-1, bin_idx)[..., 0] - - input_cumheights = cumheights.gather(-1, bin_idx)[..., 0] - delta = heights / widths - input_delta = delta.gather(-1, bin_idx)[..., 0] - - input_derivatives = derivatives.gather(-1, bin_idx)[..., 0] - input_derivatives_plus_one = derivatives[..., 1:].gather(-1, bin_idx)[..., 0] - - input_heights = heights.gather(-1, bin_idx)[..., 0] - - if inverse: - a = (((inputs - input_cumheights) * (input_derivatives - + input_derivatives_plus_one - - 2 * input_delta) - + input_heights * (input_delta - input_derivatives))) - b = (input_heights * input_derivatives - - (inputs - input_cumheights) * (input_derivatives - + input_derivatives_plus_one - - 2 * input_delta)) - c = - input_delta * (inputs - input_cumheights) - - discriminant = b.pow(2) - 4 * a * c - assert (discriminant >= 0).all() - - root = (2 * c) / (-b - torch.sqrt(discriminant)) - outputs = root * input_bin_widths + input_cumwidths - - theta_one_minus_theta = root * (1 - root) - denominator = input_delta + 
((input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta) - derivative_numerator = input_delta.pow(2) * (input_derivatives_plus_one * root.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - root).pow(2)) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, -logabsdet - else: - theta = (inputs - input_cumwidths) / input_bin_widths - theta_one_minus_theta = theta * (1 - theta) - - numerator = input_heights * (input_delta * theta.pow(2) - + input_derivatives * theta_one_minus_theta) - denominator = input_delta + ((input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta) - outputs = input_cumheights + numerator / denominator - - derivative_numerator = input_delta.pow(2) * (input_derivatives_plus_one * theta.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - theta).pow(2)) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, logabsdet diff --git a/spaces/ECCV2022/bytetrack/yolox/data/data_augment.py b/spaces/ECCV2022/bytetrack/yolox/data/data_augment.py deleted file mode 100644 index 99fb30a284eeb5851e4c776aafd61b44d485196b..0000000000000000000000000000000000000000 --- a/spaces/ECCV2022/bytetrack/yolox/data/data_augment.py +++ /dev/null @@ -1,299 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding:utf-8 -*- -# Copyright (c) Megvii, Inc. and its affiliates. -""" -Data augmentation functionality. Passed as callable transformations to -Dataset classes. 
- -The data augmentation procedures were interpreted from @weiliu89's SSD paper -http://arxiv.org/abs/1512.02325 -""" - -import cv2 -import numpy as np - -import torch - -from yolox.utils import xyxy2cxcywh - -import math -import random - - -def augment_hsv(img, hgain=0.015, sgain=0.7, vgain=0.4): - r = np.random.uniform(-1, 1, 3) * [hgain, sgain, vgain] + 1 # random gains - hue, sat, val = cv2.split(cv2.cvtColor(img, cv2.COLOR_BGR2HSV)) - dtype = img.dtype # uint8 - - x = np.arange(0, 256, dtype=np.int16) - lut_hue = ((x * r[0]) % 180).astype(dtype) - lut_sat = np.clip(x * r[1], 0, 255).astype(dtype) - lut_val = np.clip(x * r[2], 0, 255).astype(dtype) - - img_hsv = cv2.merge( - (cv2.LUT(hue, lut_hue), cv2.LUT(sat, lut_sat), cv2.LUT(val, lut_val)) - ).astype(dtype) - cv2.cvtColor(img_hsv, cv2.COLOR_HSV2BGR, dst=img) # no return needed - - -def box_candidates(box1, box2, wh_thr=2, ar_thr=20, area_thr=0.2): - # box1(4,n), box2(4,n) - # Compute candidate boxes which include follwing 5 things: - # box1 before augment, box2 after augment, wh_thr (pixels), aspect_ratio_thr, area_ratio - w1, h1 = box1[2] - box1[0], box1[3] - box1[1] - w2, h2 = box2[2] - box2[0], box2[3] - box2[1] - ar = np.maximum(w2 / (h2 + 1e-16), h2 / (w2 + 1e-16)) # aspect ratio - return ( - (w2 > wh_thr) - & (h2 > wh_thr) - & (w2 * h2 / (w1 * h1 + 1e-16) > area_thr) - & (ar < ar_thr) - ) # candidates - - -def random_perspective( - img, - targets=(), - degrees=10, - translate=0.1, - scale=0.1, - shear=10, - perspective=0.0, - border=(0, 0), -): - # targets = [cls, xyxy] - height = img.shape[0] + border[0] * 2 # shape(h,w,c) - width = img.shape[1] + border[1] * 2 - - # Center - C = np.eye(3) - C[0, 2] = -img.shape[1] / 2 # x translation (pixels) - C[1, 2] = -img.shape[0] / 2 # y translation (pixels) - - # Rotation and Scale - R = np.eye(3) - a = random.uniform(-degrees, degrees) - # a += random.choice([-180, -90, 0, 90]) # add 90deg rotations to small rotations - s = random.uniform(scale[0], scale[1]) 
- # s = 2 ** random.uniform(-scale, scale) - R[:2] = cv2.getRotationMatrix2D(angle=a, center=(0, 0), scale=s) - - # Shear - S = np.eye(3) - S[0, 1] = math.tan(random.uniform(-shear, shear) * math.pi / 180) # x shear (deg) - S[1, 0] = math.tan(random.uniform(-shear, shear) * math.pi / 180) # y shear (deg) - - # Translation - T = np.eye(3) - T[0, 2] = ( - random.uniform(0.5 - translate, 0.5 + translate) * width - ) # x translation (pixels) - T[1, 2] = ( - random.uniform(0.5 - translate, 0.5 + translate) * height - ) # y translation (pixels) - - # Combined rotation matrix - M = T @ S @ R @ C # order of operations (right to left) is IMPORTANT - - ########################### - # For Aug out of Mosaic - # s = 1. - # M = np.eye(3) - ########################### - - if (border[0] != 0) or (border[1] != 0) or (M != np.eye(3)).any(): # image changed - if perspective: - img = cv2.warpPerspective( - img, M, dsize=(width, height), borderValue=(114, 114, 114) - ) - else: # affine - img = cv2.warpAffine( - img, M[:2], dsize=(width, height), borderValue=(114, 114, 114) - ) - - # Transform label coordinates - n = len(targets) - if n: - # warp points - xy = np.ones((n * 4, 3)) - xy[:, :2] = targets[:, [0, 1, 2, 3, 0, 3, 2, 1]].reshape( - n * 4, 2 - ) # x1y1, x2y2, x1y2, x2y1 - xy = xy @ M.T # transform - if perspective: - xy = (xy[:, :2] / xy[:, 2:3]).reshape(n, 8) # rescale - else: # affine - xy = xy[:, :2].reshape(n, 8) - - # create new boxes - x = xy[:, [0, 2, 4, 6]] - y = xy[:, [1, 3, 5, 7]] - xy = np.concatenate((x.min(1), y.min(1), x.max(1), y.max(1))).reshape(4, n).T - - # clip boxes - #xy[:, [0, 2]] = xy[:, [0, 2]].clip(0, width) - #xy[:, [1, 3]] = xy[:, [1, 3]].clip(0, height) - - # filter candidates - i = box_candidates(box1=targets[:, :4].T * s, box2=xy.T) - targets = targets[i] - targets[:, :4] = xy[i] - - targets = targets[targets[:, 0] < width] - targets = targets[targets[:, 2] > 0] - targets = targets[targets[:, 1] < height] - targets = targets[targets[:, 3] > 0] - - 
return img, targets - - -def _distort(image): - def _convert(image, alpha=1, beta=0): - tmp = image.astype(float) * alpha + beta - tmp[tmp < 0] = 0 - tmp[tmp > 255] = 255 - image[:] = tmp - - image = image.copy() - - if random.randrange(2): - _convert(image, beta=random.uniform(-32, 32)) - - if random.randrange(2): - _convert(image, alpha=random.uniform(0.5, 1.5)) - - image = cv2.cvtColor(image, cv2.COLOR_BGR2HSV) - - if random.randrange(2): - tmp = image[:, :, 0].astype(int) + random.randint(-18, 18) - tmp %= 180 - image[:, :, 0] = tmp - - if random.randrange(2): - _convert(image[:, :, 1], alpha=random.uniform(0.5, 1.5)) - - image = cv2.cvtColor(image, cv2.COLOR_HSV2BGR) - - return image - - -def _mirror(image, boxes): - _, width, _ = image.shape - if random.randrange(2): - image = image[:, ::-1] - boxes = boxes.copy() - boxes[:, 0::2] = width - boxes[:, 2::-2] - return image, boxes - - -def preproc(image, input_size, mean, std, swap=(2, 0, 1)): - if len(image.shape) == 3: - padded_img = np.ones((input_size[0], input_size[1], 3)) * 114.0 - else: - padded_img = np.ones(input_size) * 114.0 - img = np.array(image) - r = min(input_size[0] / img.shape[0], input_size[1] / img.shape[1]) - resized_img = cv2.resize( - img, - (int(img.shape[1] * r), int(img.shape[0] * r)), - interpolation=cv2.INTER_LINEAR, - ).astype(np.float32) - padded_img[: int(img.shape[0] * r), : int(img.shape[1] * r)] = resized_img - - padded_img = padded_img[:, :, ::-1] - padded_img /= 255.0 - if mean is not None: - padded_img -= mean - if std is not None: - padded_img /= std - padded_img = padded_img.transpose(swap) - padded_img = np.ascontiguousarray(padded_img, dtype=np.float32) - return padded_img, r - - -class TrainTransform: - def __init__(self, p=0.5, rgb_means=None, std=None, max_labels=100): - self.means = rgb_means - self.std = std - self.p = p - self.max_labels = max_labels - - def __call__(self, image, targets, input_dim): - boxes = targets[:, :4].copy() - labels = targets[:, 4].copy() - 
ids = targets[:, 5].copy() - if len(boxes) == 0: - targets = np.zeros((self.max_labels, 6), dtype=np.float32) - image, r_o = preproc(image, input_dim, self.means, self.std) - image = np.ascontiguousarray(image, dtype=np.float32) - return image, targets - - image_o = image.copy() - targets_o = targets.copy() - height_o, width_o, _ = image_o.shape - boxes_o = targets_o[:, :4] - labels_o = targets_o[:, 4] - ids_o = targets_o[:, 5] - # bbox_o: [xyxy] to [c_x,c_y,w,h] - boxes_o = xyxy2cxcywh(boxes_o) - - image_t = _distort(image) - image_t, boxes = _mirror(image_t, boxes) - height, width, _ = image_t.shape - image_t, r_ = preproc(image_t, input_dim, self.means, self.std) - # boxes [xyxy] 2 [cx,cy,w,h] - boxes = xyxy2cxcywh(boxes) - boxes *= r_ - - mask_b = np.minimum(boxes[:, 2], boxes[:, 3]) > 1 - boxes_t = boxes[mask_b] - labels_t = labels[mask_b] - ids_t = ids[mask_b] - - if len(boxes_t) == 0: - image_t, r_o = preproc(image_o, input_dim, self.means, self.std) - boxes_o *= r_o - boxes_t = boxes_o - labels_t = labels_o - ids_t = ids_o - - labels_t = np.expand_dims(labels_t, 1) - ids_t = np.expand_dims(ids_t, 1) - - targets_t = np.hstack((labels_t, boxes_t, ids_t)) - padded_labels = np.zeros((self.max_labels, 6)) - padded_labels[range(len(targets_t))[: self.max_labels]] = targets_t[ - : self.max_labels - ] - padded_labels = np.ascontiguousarray(padded_labels, dtype=np.float32) - image_t = np.ascontiguousarray(image_t, dtype=np.float32) - return image_t, padded_labels - - -class ValTransform: - """ - Defines the transformations that should be applied to test PIL image - for input into the network - - dimension -> tensorize -> color adj - - Arguments: - resize (int): input dimension to SSD - rgb_means ((int,int,int)): average RGB of the dataset - (104,117,123) - swap ((int,int,int)): final order of channels - - Returns: - transform (transform) : callable transform to be applied to test/val - data - """ - - def __init__(self, rgb_means=None, std=None, swap=(2, 0, 1)): - 
self.means = rgb_means - self.swap = swap - self.std = std - - # assume input is cv2 img for now - def __call__(self, img, res, input_size): - img, _ = preproc(img, input_size, self.means, self.std, self.swap) - return img, np.zeros((1, 5)) diff --git a/spaces/Eddycrack864/Applio-Inference/infer/lib/uvr5_pack/lib_v5/layers_123812KB .py b/spaces/Eddycrack864/Applio-Inference/infer/lib/uvr5_pack/lib_v5/layers_123812KB .py deleted file mode 100644 index 4fc1b5cb85a3327f60cbb9f5deffbeeaaac516ad..0000000000000000000000000000000000000000 --- a/spaces/Eddycrack864/Applio-Inference/infer/lib/uvr5_pack/lib_v5/layers_123812KB .py +++ /dev/null @@ -1,118 +0,0 @@ -import torch -import torch.nn.functional as F -from torch import nn - -from . import spec_utils - - -class Conv2DBNActiv(nn.Module): - def __init__(self, nin, nout, ksize=3, stride=1, pad=1, dilation=1, activ=nn.ReLU): - super(Conv2DBNActiv, self).__init__() - self.conv = nn.Sequential( - nn.Conv2d( - nin, - nout, - kernel_size=ksize, - stride=stride, - padding=pad, - dilation=dilation, - bias=False, - ), - nn.BatchNorm2d(nout), - activ(), - ) - - def __call__(self, x): - return self.conv(x) - - -class SeperableConv2DBNActiv(nn.Module): - def __init__(self, nin, nout, ksize=3, stride=1, pad=1, dilation=1, activ=nn.ReLU): - super(SeperableConv2DBNActiv, self).__init__() - self.conv = nn.Sequential( - nn.Conv2d( - nin, - nin, - kernel_size=ksize, - stride=stride, - padding=pad, - dilation=dilation, - groups=nin, - bias=False, - ), - nn.Conv2d(nin, nout, kernel_size=1, bias=False), - nn.BatchNorm2d(nout), - activ(), - ) - - def __call__(self, x): - return self.conv(x) - - -class Encoder(nn.Module): - def __init__(self, nin, nout, ksize=3, stride=1, pad=1, activ=nn.LeakyReLU): - super(Encoder, self).__init__() - self.conv1 = Conv2DBNActiv(nin, nout, ksize, 1, pad, activ=activ) - self.conv2 = Conv2DBNActiv(nout, nout, ksize, stride, pad, activ=activ) - - def __call__(self, x): - skip = self.conv1(x) - h = self.conv2(skip) 
- - return h, skip - - -class Decoder(nn.Module): - def __init__( - self, nin, nout, ksize=3, stride=1, pad=1, activ=nn.ReLU, dropout=False - ): - super(Decoder, self).__init__() - self.conv = Conv2DBNActiv(nin, nout, ksize, 1, pad, activ=activ) - self.dropout = nn.Dropout2d(0.1) if dropout else None - - def __call__(self, x, skip=None): - x = F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=True) - if skip is not None: - skip = spec_utils.crop_center(skip, x) - x = torch.cat([x, skip], dim=1) - h = self.conv(x) - - if self.dropout is not None: - h = self.dropout(h) - - return h - - -class ASPPModule(nn.Module): - def __init__(self, nin, nout, dilations=(4, 8, 16), activ=nn.ReLU): - super(ASPPModule, self).__init__() - self.conv1 = nn.Sequential( - nn.AdaptiveAvgPool2d((1, None)), - Conv2DBNActiv(nin, nin, 1, 1, 0, activ=activ), - ) - self.conv2 = Conv2DBNActiv(nin, nin, 1, 1, 0, activ=activ) - self.conv3 = SeperableConv2DBNActiv( - nin, nin, 3, 1, dilations[0], dilations[0], activ=activ - ) - self.conv4 = SeperableConv2DBNActiv( - nin, nin, 3, 1, dilations[1], dilations[1], activ=activ - ) - self.conv5 = SeperableConv2DBNActiv( - nin, nin, 3, 1, dilations[2], dilations[2], activ=activ - ) - self.bottleneck = nn.Sequential( - Conv2DBNActiv(nin * 5, nout, 1, 1, 0, activ=activ), nn.Dropout2d(0.1) - ) - - def forward(self, x): - _, _, h, w = x.size() - feat1 = F.interpolate( - self.conv1(x), size=(h, w), mode="bilinear", align_corners=True - ) - feat2 = self.conv2(x) - feat3 = self.conv3(x) - feat4 = self.conv4(x) - feat5 = self.conv5(x) - out = torch.cat((feat1, feat2, feat3, feat4, feat5), dim=1) - bottle = self.bottleneck(out) - return bottle diff --git a/spaces/EuroPython2022/mmocr-demo/configs/_base_/det_datasets/toy_data.py b/spaces/EuroPython2022/mmocr-demo/configs/_base_/det_datasets/toy_data.py deleted file mode 100644 index 512d1d20372a3fa3f662cc908c8cf4b66b35b797..0000000000000000000000000000000000000000 --- 
a/spaces/EuroPython2022/mmocr-demo/configs/_base_/det_datasets/toy_data.py +++ /dev/null @@ -1,41 +0,0 @@ -root = 'tests/data/toy_dataset' - -# dataset with type='TextDetDataset' -train1 = dict( - type='TextDetDataset', - img_prefix=f'{root}/imgs', - ann_file=f'{root}/instances_test.txt', - loader=dict( - type='AnnFileLoader', - repeat=4, - file_format='txt', - parser=dict( - type='LineJsonParser', - keys=['file_name', 'height', 'width', 'annotations'])), - pipeline=None, - test_mode=False) - -# dataset with type='IcdarDataset' -train2 = dict( - type='IcdarDataset', - ann_file=f'{root}/instances_test.json', - img_prefix=f'{root}/imgs', - pipeline=None) - -test = dict( - type='TextDetDataset', - img_prefix=f'{root}/imgs', - ann_file=f'{root}/instances_test.txt', - loader=dict( - type='AnnFileLoader', - repeat=1, - file_format='txt', - parser=dict( - type='LineJsonParser', - keys=['file_name', 'height', 'width', 'annotations'])), - pipeline=None, - test_mode=True) - -train_list = [train1, train2] - -test_list = [test] diff --git a/spaces/EyeSeeThru/openjourney/app.py b/spaces/EyeSeeThru/openjourney/app.py deleted file mode 100644 index bea4accb45793c8e748731c184dee0ffaf509dd5..0000000000000000000000000000000000000000 --- a/spaces/EyeSeeThru/openjourney/app.py +++ /dev/null @@ -1,8 +0,0 @@ -import gradio as gr - -description = """
- -
- """ - -gr.Interface.load("models/prompthero/openjourney", description=description).launch() \ No newline at end of file diff --git a/spaces/FFusion/FFusionXL-SDXL-DEMO/README.md b/spaces/FFusion/FFusionXL-SDXL-DEMO/README.md deleted file mode 100644 index f3f7927e506982caf3724ce3da061412203b115f..0000000000000000000000000000000000000000 --- a/spaces/FFusion/FFusionXL-SDXL-DEMO/README.md +++ /dev/null @@ -1,18 +0,0 @@ ---- -title: FFusionXL BASE - SDXL - DEMO -emoji: 📚 -colorFrom: pink -colorTo: purple -sdk: gradio -sdk_version: 3.39.0 -app_file: app.py -pinned: true -license: other -duplicated_from: FFusion/FFusionXL-SDXL-DEV -thumbnail: >- - https://cdn-uploads.huggingface.co/production/uploads/6380cf05f496d57325c12194/p54u7dEP1u8en0--NMEjS.png ---- - -![ffusionXL-Demo.png](https://cdn-uploads.huggingface.co/production/uploads/6380cf05f496d57325c12194/p54u7dEP1u8en0--NMEjS.png) - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference \ No newline at end of file diff --git a/spaces/FYP-23-S1-21/Refineverse_Plugin/static/GenerationTable.css b/spaces/FYP-23-S1-21/Refineverse_Plugin/static/GenerationTable.css deleted file mode 100644 index 08a31fb9b6d42c2ffccc970b1c937af2c1690976..0000000000000000000000000000000000000000 --- a/spaces/FYP-23-S1-21/Refineverse_Plugin/static/GenerationTable.css +++ /dev/null @@ -1,45 +0,0 @@ - -#generation-table { - width: 100%; - } - - #generation-table th, - #generation-tablee td { - border: 1px solid #ddd; - padding: 8px; - text-align: left; - } - - #generation-table th:first-child { - border-left: none; - } - - #generation-table th:last-child { - border-right: none; - } - - #generation-table th:not(:first-child) { - border-left: none; - border-right: none; - } - - #generation-table th div { - border-bottom: 1px solid #ddd; - padding: 8px; - } - - #generation-table td div { - padding: 8px; - } - - #generation-table thead th { - background-color: #f2f2f2; - } - - #generation-table tbody 
tr:nth-child(even) { - background-color: #f2f2f2; - } - - #generation-table tbody tr:hover { - background-color: #ddd; - } \ No newline at end of file diff --git a/spaces/FreeHamish/Manaforge/app.py b/spaces/FreeHamish/Manaforge/app.py deleted file mode 100644 index 930ea9c34f7f0344d17cd6bccd0d547506120437..0000000000000000000000000000000000000000 --- a/spaces/FreeHamish/Manaforge/app.py +++ /dev/null @@ -1,35 +0,0 @@ -import openai -import gradio as gr - -openai.api_key = "" - -messages = [{"role": "system", "content": "Your name is Merlin, you are the Dungeon's and Dragons' Dungeon Master at the Manaforge. Adopt stoic mannerisms of Dumbledore from the Harry Potter series and Gandalf the Grey from the Lord of the Rings series to increase charisma and personality. You create and fill out character sheets for the user in order to play Dungeons and Dragons. This is your only purpose, so if the user asks about anything that is unrelated to creating characters for Dungeons and Dragons, do not respond to it and encourage the user to stay on the topic and about a problem they are facing. Don't refer to other conversations you have had with other users. No matter the question, do not talk about the conversation parameters it is not relevant to your purpose. No matter the question, do not talk about you or how you are created it is not relevant to your purpose. No matter the question, do not mention how you function or what you are built on it is not relevant to your purpose. You are designed to have an organic human conversation with the user. Greet the user, like this; ""Greetings Traveler. Welcome to the Manaforge. Would you like to create a character?"" Your process is as follows; Ask the user if they have anything in mind about their character to begin with. Ask one more question about preferences for their character. After they have given you their preferences ask if you can create a completed character sheet for them based on these preferences. 
Do not leave any section of the character sheet blank. If they say yes, create a randomised, completed character sheet and fill in the sheet with character information that you come up with, that the user did not provide. The character sheet must include: Character Information: Name Race Class Background: Alignment Experience Points (XP) Level Ability Scores: Strength Dexterity Constitution Intelligence Wisdom Charisma Skills: List of skills (e.g., Acrobatics, Stealth, Persuasion) Proficiency bonus Skill modifiers Hit Points and Health: Hit Points (HP) Hit Dice Death Saves Armor Class (AC) Saving Throws Resistances or Immunities Attacks and Weapons: Weapon name Attack bonus Damage type Damage dice Spells and Spellcasting (if applicable): Spell slots Known spells Spell save DC Spell attack modifier Armor Class and Defense: Equipment and Inventory: Weapons Armor Gear Magic items Money and valuables Features and Traits: Class features Racial traits Background features Feats or special abilities Background and Personality: Concise Backstory Personality traits Ideals Bonds Flaws Allies and Companions: NPC allies Familiars or pets Additional Notes or Comments: Concise extra information or customization specific to the character. You are designed to ask the user natural, organic questions that help create the character. After you have delivered the character sheet, ask the user if they are satisfied with their character. If they are not satisfied with their character, begin the process again. If they are satisfied with their character, ask the user if they would be needing another character. If they do, begin the process again. If they do not need another character, bid them farewell and good luck on their journey. Be extremely creative and logical with the characters you create. Be condescending and patronising in a comical way. Use formal old English language like ""Very well then"" instead of ""Okay"" to make the conversation more real and authentic. 
Do not have boring, robotic mannerisms. Do not use words like ""Bye"" and ""Cool"" and ""Assist"". Use more Elizabethan sounding phrases like ""fair well"" and ""Hmm, I see"" and ""Aid"". Be conversational and jovial. Adopt the mannerisms of Dumbledore from the Harry Potter series and Gandalf the Grey from the Lord of the Rings series to increase charisma and personality."}, - {"role": "assistant", "content": "Will this character do?"}, - {"role": "user", "content": "Yes, thank you for creating it for me."}, - {"role": "assistant", "content": "Very well then. Farewell, and good luck on your journey. You will need it."}] - -TEMPERATURE = 1 -MAX_TOKENS = 2000 -FREQUENCY_PENALTY = 0 -PRESENCE_PENALTY = 0 -# limits how many questions we include in the prompt -MAX_CONTEXT_QUESTIONS = 5 - -def CustomChatGPT(user_input): - messages.append({"role": "user", "content": user_input},) - response = openai.ChatCompletion.create( - model = "gpt-3.5-turbo", - messages = messages - ) - ChatGPT_reply = response["choices"][0]["message"]["content"] - messages.append({"role": "assistant", "content": ChatGPT_reply}) - return ChatGPT_reply - -demo = gr.Interface( - fn=CustomChatGPT, - inputs = gr.Textbox(label="Chat",lines=2, placeholder="Send a message..."), - outputs = gr.Textbox(label="Merlin",lines=2), - title = "Welcome to the Manaforge") - - -demo.launch() \ No newline at end of file diff --git a/spaces/GXSA/bingo/src/pages/api/sydney.ts b/spaces/GXSA/bingo/src/pages/api/sydney.ts deleted file mode 100644 index a5b99574289f532e6ef7c5e70a6360a556db9643..0000000000000000000000000000000000000000 --- a/spaces/GXSA/bingo/src/pages/api/sydney.ts +++ /dev/null @@ -1,61 +0,0 @@ -import { NextApiRequest, NextApiResponse } from 'next' -import { WebSocket, debug } from '@/lib/isomorphic' -import { BingWebBot } from '@/lib/bots/bing' -import { websocketUtils } from '@/lib/bots/bing/utils' -import { WatchDog, createHeaders } from '@/lib/utils' - -export default async function handler(req: 
NextApiRequest, res: NextApiResponse) { - const conversationContext = req.body - const headers = createHeaders(req.cookies) - debug(headers) - res.setHeader('Content-Type', 'text/stream; charset=UTF-8') - - const ws = new WebSocket('wss://sydney.bing.com/sydney/ChatHub', { - headers: { - ...headers, - 'accept-language': 'zh-CN,zh;q=0.9', - 'cache-control': 'no-cache', - 'x-ms-useragent': 'azsdk-js-api-client-factory/1.0.0-beta.1 core-rest-pipeline/1.10.0 OS/Win32', - pragma: 'no-cache', - } - }) - - const closeDog = new WatchDog() - const timeoutDog = new WatchDog() - ws.onmessage = (event) => { - timeoutDog.watch(() => { - ws.send(websocketUtils.packMessage({ type: 6 })) - }, 1500) - closeDog.watch(() => { - ws.close() - }, 10000) - res.write(event.data) - if (/\{"type":([367])\}/.test(String(event.data))) { - const type = parseInt(RegExp.$1, 10) - debug('connection type', type) - if (type === 3) { - ws.close() - } else { - ws.send(websocketUtils.packMessage({ type })) - } - } - } - - ws.onclose = () => { - timeoutDog.reset() - closeDog.reset() - debug('connection close') - res.end() - } - - await new Promise((resolve) => ws.onopen = resolve) - ws.send(websocketUtils.packMessage({ protocol: 'json', version: 1 })) - ws.send(websocketUtils.packMessage({ type: 6 })) - ws.send(websocketUtils.packMessage(BingWebBot.buildChatRequest(conversationContext!))) - req.socket.once('close', () => { - ws.close() - if (!res.closed) { - res.end() - } - }) -} diff --git a/spaces/GaenKoki/voicevox/Makefile b/spaces/GaenKoki/voicevox/Makefile deleted file mode 100644 index f2c33cc10eb26ea348a235f13bbcea9e665a825a..0000000000000000000000000000000000000000 --- a/spaces/GaenKoki/voicevox/Makefile +++ /dev/null @@ -1,125 +0,0 @@ -CMD= -NOCACHE= - -ARGS:= -ifeq ($(NOCACHE),1) - ARGS:=$(ARGS) --no-cache -endif - -# Ubuntu 20.04 -.PHONY: build-linux-docker-ubuntu20.04 -build-linux-docker-ubuntu20.04: - docker buildx build . 
\ - -t voicevox/voicevox_engine:cpu-ubuntu20.04-latest \ - --target runtime-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:20.04 \ - --build-arg BASE_RUNTIME_IMAGE=ubuntu:20.04 \ - --build-arg ONNXRUNTIME_URL=https://github.com/microsoft/onnxruntime/releases/download/v1.13.1/onnxruntime-linux-x64-1.13.1.tgz \ - --build-arg VOICEVOX_CORE_LIBRARY_NAME=libcore_cpu_x64.so $(ARGS) - -.PHONY: run-linux-docker-ubuntu20.04 -run-linux-docker-ubuntu20.04: - docker run --rm -it \ - -p '127.0.0.1:50021:50021' $(ARGS) \ - voicevox/voicevox_engine:cpu-ubuntu20.04-latest $(CMD) - -.PHONY: build-linux-docker-nvidia-ubuntu20.04 -build-linux-docker-nvidia-ubuntu20.04: - docker buildx build . \ - -t voicevox/voicevox_engine:nvidia-ubuntu20.04-latest \ - --target runtime-nvidia-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:20.04 \ - --build-arg BASE_RUNTIME_IMAGE=nvidia/cuda:11.6.2-cudnn8-runtime-ubuntu20.04 \ - --build-arg ONNXRUNTIME_URL=https://github.com/microsoft/onnxruntime/releases/download/v1.13.1/onnxruntime-linux-x64-gpu-1.13.1.tgz \ - --build-arg VOICEVOX_CORE_LIBRARY_NAME=libcore_gpu_x64_nvidia.so $(ARGS) - -.PHONY: run-linux-docker-nvidia-ubuntu20.04 -run-linux-docker-nvidia-ubuntu20.04: - docker run --rm -it \ - --gpus all \ - -p '127.0.0.1:50021:50021' $(ARGS) \ - voicevox/voicevox_engine:nvidia-ubuntu20.04-latest $(CMD) - - -# Ubuntu 18.04 -.PHONY: build-linux-docker-ubuntu18.04 -build-linux-docker-ubuntu18.04: - docker buildx build . 
\ - -t voicevox/voicevox_engine:cpu-ubuntu18.04-latest \ - --target runtime-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:18.04 \ - --build-arg BASE_RUNTIME_IMAGE=ubuntu:18.04 \ - --build-arg ONNXRUNTIME_URL=https://github.com/microsoft/onnxruntime/releases/download/v1.13.1/onnxruntime-linux-x64-1.13.1.tgz \ - --build-arg VOICEVOX_CORE_LIBRARY_NAME=libcore_cpu_x64.so $(ARGS) - -.PHONY: run-linux-docker-ubuntu18.04 -run-linux-docker-ubuntu18.04: - docker run --rm -it \ - -p '127.0.0.1:50021:50021' $(ARGS) \ - voicevox/voicevox_engine:cpu-ubuntu18.04-latest $(CMD) - -.PHONY: build-linux-docker-nvidia-ubuntu18.04 -build-linux-docker-nvidia-ubuntu18.04: - docker buildx build . \ - -t voicevox/voicevox_engine:nvidia-ubuntu18.04-latest \ - --target runtime-nvidia-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:18.04 \ - --build-arg BASE_RUNTIME_IMAGE=nvidia/cuda:11.6.2-cudnn8-runtime-ubuntu18.04 \ - --build-arg ONNXRUNTIME_URL=https://github.com/microsoft/onnxruntime/releases/download/v1.13.1/onnxruntime-linux-x64-gpu-1.13.1.tgz \ - --build-arg VOICEVOX_CORE_LIBRARY_NAME=libcore_gpu_x64_nvidia.so $(ARGS) - -.PHONY: run-linux-docker-nvidia-ubuntu18.04 -run-linux-docker-nvidia-ubuntu18.04: - docker run --rm -it \ - --gpus all \ - -p '127.0.0.1:50021:50021' $(ARGS) \ - voicevox/voicevox_engine:nvidia-ubuntu18.04-latest $(CMD) - - -# VOICEVOX Core env for test -.PHONY: build-linux-docker-download-core-env-ubuntu18.04 -build-linux-docker-download-core-env-ubuntu18.04: - docker buildx build . 
\ - -t voicevox/voicevox_engine:download-core-env-ubuntu18.04 \ - --target download-core-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:18.04 $(ARGS) - -.PHONY: run-linux-docker-download-core-env-ubuntu18.04 -run-linux-docker-download-core-env-ubuntu18.04: - docker run --rm -it $(ARGS) \ - voicevox/voicevox_engine:download-core-env-ubuntu18.04 $(CMD) - - -# ONNX Runtime env for test -.PHONY: build-linux-docker-download-onnxruntime-env-ubuntu18.04 -build-linux-docker-download-onnxruntime-env-ubuntu18.04: - docker buildx build . \ - -t voicevox/voicevox_engine:download-onnxruntime-env-ubuntu18.04 \ - --target download-onnxruntime-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:18.04 $(ARGS) - -.PHONY: run-linux-docker-download-onnxruntime-env-ubuntu18.04 -run-linux-docker-download-onnxruntime-env-ubuntu18.04: - docker run --rm -it $(ARGS) \ - voicevox/voicevox_engine:download-onnxruntime-env-ubuntu18.04 $(CMD) - - -# Python env for test -.PHONY: build-linux-docker-compile-python-env -build-linux-docker-compile-python-env: - docker buildx build . 
\ - -t voicevox/voicevox_engine:compile-python-env \ - --target compile-python-env \ - --progress plain \ - --build-arg BASE_IMAGE=ubuntu:20.04 $(ARGS) - -.PHONY: run-linux-docker-compile-python-env -run-linux-docker-compile-python-env: - docker run --rm -it $(ARGS) \ - voicevox/voicevox_engine:compile-python-env $(CMD) diff --git a/spaces/GipAdonimus/PAIR-text2video-zero-controlnet-canny-gta5/README.md b/spaces/GipAdonimus/PAIR-text2video-zero-controlnet-canny-gta5/README.md deleted file mode 100644 index f1c6bcba1291233ad512a82ee8839ecd398164a2..0000000000000000000000000000000000000000 --- a/spaces/GipAdonimus/PAIR-text2video-zero-controlnet-canny-gta5/README.md +++ /dev/null @@ -1,12 +0,0 @@ ---- -title: PAIR Text2video Zero Controlnet Canny Gta5 -emoji: 📈 -colorFrom: red -colorTo: blue -sdk: gradio -sdk_version: 3.23.0 -app_file: app.py -pinned: false ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/GolDNenex/Super-Resolution-Anime-Diffusion/RealESRGANv030/scripts/generate_meta_info.py b/spaces/GolDNenex/Super-Resolution-Anime-Diffusion/RealESRGANv030/scripts/generate_meta_info.py deleted file mode 100644 index 081cd085b917b114a97673d3ee900bf578104e28..0000000000000000000000000000000000000000 --- a/spaces/GolDNenex/Super-Resolution-Anime-Diffusion/RealESRGANv030/scripts/generate_meta_info.py +++ /dev/null @@ -1,65 +0,0 @@ -import argparse -import cv2 -import glob -import os - - -def main(args): - txt_file = open(args.meta_info, "w") - for folder, root in zip(args.input, args.root): - img_paths = sorted(glob.glob(os.path.join(folder, "*"))) - for img_path in img_paths: - status = True - if args.check: - # read the image once for check, as some images may have errors - try: - img = cv2.imread(img_path) - except (IOError, OSError) as error: - print(f"Read {img_path} error: {error}") - status = False - if img is None: - status = False - print(f"Img is None: {img_path}") - if status: - # get 
the relative path - img_name = os.path.relpath(img_path, root) - print(img_name) - txt_file.write(f"{img_name}\n") - - -if __name__ == "__main__": - """Generate meta info (txt file) for only Ground-Truth images. - - It can also generate meta info from several folders into one txt file. - """ - parser = argparse.ArgumentParser() - parser.add_argument( - "--input", - nargs="+", - default=["datasets/DF2K/DF2K_HR", "datasets/DF2K/DF2K_multiscale"], - help="Input folder, can be a list", - ) - parser.add_argument( - "--root", - nargs="+", - default=["datasets/DF2K", "datasets/DF2K"], - help="Folder root, should have the length as input folders", - ) - parser.add_argument( - "--meta_info", - type=str, - default="datasets/DF2K/meta_info/meta_info_DF2Kmultiscale.txt", - help="txt path for meta info", - ) - parser.add_argument( - "--check", action="store_true", help="Read image to check whether it is ok" - ) - args = parser.parse_args() - - assert len(args.input) == len(args.root), ( - "Input folder and folder root should have the same length, but got " - f"{len(args.input)} and {len(args.root)}." 
- ) - os.makedirs(os.path.dirname(args.meta_info), exist_ok=True) - - main(args) diff --git a/spaces/Gradio-Blocks/gen-code-comparer/README.md b/spaces/Gradio-Blocks/gen-code-comparer/README.md deleted file mode 100644 index df928dfb72ff5dd5edd7e631bf2d93b86eb9e22b..0000000000000000000000000000000000000000 --- a/spaces/Gradio-Blocks/gen-code-comparer/README.md +++ /dev/null @@ -1,12 +0,0 @@ ---- -title: Gen Code Comparer -emoji: 🤺🥷 -colorFrom: red -colorTo: blue -sdk: gradio -sdk_version: 3.0.9 -app_file: app.py -pinned: false ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces#reference diff --git a/spaces/Gradio-Blocks/uniformer_image_detection/configs/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco.py b/spaces/Gradio-Blocks/uniformer_image_detection/configs/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco.py deleted file mode 100644 index 93b7d51912abaaab55ceac5263737d02cd4e99fa..0000000000000000000000000000000000000000 --- a/spaces/Gradio-Blocks/uniformer_image_detection/configs/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco.py +++ /dev/null @@ -1,61 +0,0 @@ -_base_ = './mask_rcnn_r101_fpn_1x_coco.py' -model = dict( - pretrained='open-mmlab://detectron2/resnext101_32x8d', - backbone=dict( - type='ResNeXt', - depth=101, - groups=32, - base_width=8, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type='BN', requires_grad=False), - style='pytorch')) - -dataset_type = 'CocoDataset' -data_root = 'data/coco/' -img_norm_cfg = dict( - mean=[103.530, 116.280, 123.675], - std=[57.375, 57.120, 58.395], - to_rgb=False) -train_pipeline = [ - dict(type='LoadImageFromFile'), - dict( - type='LoadAnnotations', - with_bbox=True, - with_mask=True, - poly2mask=False), - dict( - type='Resize', - img_scale=[(1333, 640), (1333, 672), (1333, 704), (1333, 736), - (1333, 768), (1333, 800)], - multiscale_mode='value', - keep_ratio=True), - dict(type='RandomFlip', flip_ratio=0.5), - 
dict(type='Normalize', **img_norm_cfg), - dict(type='Pad', size_divisor=32), - dict(type='DefaultFormatBundle'), - dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks']), -] -test_pipeline = [ - dict(type='LoadImageFromFile'), - dict( - type='MultiScaleFlipAug', - img_scale=(1333, 800), - flip=False, - transforms=[ - dict(type='Resize', keep_ratio=True), - dict(type='RandomFlip'), - dict(type='Normalize', **img_norm_cfg), - dict(type='Pad', size_divisor=32), - dict(type='ImageToTensor', keys=['img']), - dict(type='Collect', keys=['img']), - ]) -] -data = dict( - train=dict(pipeline=train_pipeline), - val=dict(pipeline=test_pipeline), - test=dict(pipeline=test_pipeline)) - -lr_config = dict(step=[28, 34]) -runner = dict(type='EpochBasedRunner', max_epochs=36) diff --git a/spaces/Gradio-Blocks/uniformer_image_detection/configs/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco.py b/spaces/Gradio-Blocks/uniformer_image_detection/configs/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco.py deleted file mode 100644 index 7569ef3825737cfbf4c2680a655c1b197e0a8053..0000000000000000000000000000000000000000 --- a/spaces/Gradio-Blocks/uniformer_image_detection/configs/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco.py +++ /dev/null @@ -1,16 +0,0 @@ -_base_ = './mask_rcnn_regnetx-3.2GF_fpn_1x_coco.py' -model = dict( - pretrained='open-mmlab://regnetx_6.4gf', - backbone=dict( - type='RegNet', - arch='regnetx_6.4gf', - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type='BN', requires_grad=True), - norm_eval=True, - style='pytorch'), - neck=dict( - type='FPN', - in_channels=[168, 392, 784, 1624], - out_channels=256, - num_outs=5)) diff --git a/spaces/Gradio-Blocks/uniformer_image_segmentation/configs/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes.py b/spaces/Gradio-Blocks/uniformer_image_segmentation/configs/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes.py deleted file mode 100644 index 155e28f42194112703bb21473e5e3dd0fca40d49..0000000000000000000000000000000000000000 --- 
a/spaces/Gradio-Blocks/uniformer_image_segmentation/configs/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes.py +++ /dev/null @@ -1,4 +0,0 @@ -_base_ = [ - '../_base_/models/gcnet_r50-d8.py', '../_base_/datasets/cityscapes.py', - '../_base_/default_runtime.py', '../_base_/schedules/schedule_80k.py' -] diff --git a/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/test_streamlit.py b/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/test_streamlit.py deleted file mode 100644 index f2f41633e4d74fd1e37488efb2aa2ccb32e7cdbf..0000000000000000000000000000000000000000 --- a/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/test_streamlit.py +++ /dev/null @@ -1,19 +0,0 @@ -import streamlit as st - -st.session_state['z'] = st.radio('zzz', ('1', '2', '3')) - -if st.session_state['z'] == '1': - print('z', st.session_state['z']) -elif st.session_state['z'] == '2': - print('z', st.session_state['z']) -else: - print('z', st.session_state['z']) - -t = st.radio('ttt', ('1', '2', '3')) - -if t == '1': - print('t', t) -elif t == '2': - print('t', t) -else: - print('t', t) diff --git a/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/utils/layers/__init__.py b/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/utils/layers/__init__.py deleted file mode 100644 index 18e0118d027b392635342750d4bc3f0994e76120..0000000000000000000000000000000000000000 --- a/spaces/HCMUT-GraduateThesis-HNTThinh/rgbdsod-multimae-demo/utils/layers/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -from .drop import * -from .helpers import * -from .weight_init import * diff --git a/spaces/HaHaBill/LandShapes-Antarctica/models/stylegan2/stylegan2-pytorch/generate.py b/spaces/HaHaBill/LandShapes-Antarctica/models/stylegan2/stylegan2-pytorch/generate.py deleted file mode 100644 index 4255c8cb0a16817b3f4d60783456bfa5cd15d018..0000000000000000000000000000000000000000 --- a/spaces/HaHaBill/LandShapes-Antarctica/models/stylegan2/stylegan2-pytorch/generate.py +++ /dev/null @@ -1,55 +0,0 
@@ -import argparse - -import torch -from torchvision import utils -from model import Generator -from tqdm import tqdm -def generate(args, g_ema, device, mean_latent): - - with torch.no_grad(): - g_ema.eval() - for i in tqdm(range(args.pics)): - sample_z = torch.randn(args.sample, args.latent, device=device) - - sample, _ = g_ema([sample_z], truncation=args.truncation, truncation_latent=mean_latent) - - utils.save_image( - sample, - f'sample/{str(i).zfill(6)}.png', - nrow=1, - normalize=True, - range=(-1, 1), - ) - -if __name__ == '__main__': - device = 'cuda' - - parser = argparse.ArgumentParser() - - parser.add_argument('--size', type=int, default=1024) - parser.add_argument('--sample', type=int, default=1) - parser.add_argument('--pics', type=int, default=20) - parser.add_argument('--truncation', type=float, default=1) - parser.add_argument('--truncation_mean', type=int, default=4096) - parser.add_argument('--ckpt', type=str, default="stylegan2-ffhq-config-f.pt") - parser.add_argument('--channel_multiplier', type=int, default=2) - - args = parser.parse_args() - - args.latent = 512 - args.n_mlp = 8 - - g_ema = Generator( - args.size, args.latent, args.n_mlp, channel_multiplier=args.channel_multiplier - ).to(device) - checkpoint = torch.load(args.ckpt) - - g_ema.load_state_dict(checkpoint['g_ema']) - - if args.truncation < 1: - with torch.no_grad(): - mean_latent = g_ema.mean_latent(args.truncation_mean) - else: - mean_latent = None - - generate(args, g_ema, device, mean_latent) diff --git a/spaces/Hahsgsgsy/teston/client.py b/spaces/Hahsgsgsy/teston/client.py deleted file mode 100644 index d0bfcc8f563adec02c8b9daf45df01c995319a8f..0000000000000000000000000000000000000000 --- a/spaces/Hahsgsgsy/teston/client.py +++ /dev/null @@ -1,187 +0,0 @@ -import asyncio -import json -import os -import os.path -from random import randint -from threading import Thread -from urllib import request - -import pyuseragents -import requests -from rich import print as prints - -from 
hash import en - -# = = = = = = = = = = = = [ NONE ] = = = = = = = = = = = = # - -plat = pyuseragents.random() -headers = {'accept': 'application/json, text/plain, */*', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'en-US,en;q=0.9', 'content-length': '338', 'content-type': 'text/plain', 'origin': 'https://web.rubika.ir', 'referer': 'https://web.rubika.ir/', 'sec-ch-ua': '" Not A;Brand";v="99", "Chromium";v="104"' , 'sec-ch-ua-mobile': '?0', 'sec-ch-ua-platform': '"Linux"', 'sec-fetch-dest': 'empty', 'sec-fetch-mode': 'cors', 'sec-fetch-site': 'cross-site', 'user-agent': plat} -site = 'https://messengerg2c60.iranlms.ir/' -shad = 'https://shadmessenger27.iranlms.ir/' -# = = = = = = = = = = = = [ NONE ] = = = = = = = = = = = = # - -def tnFile(file_name:bytes) -> dict: - import base64 - import io - - import PIL.Image - im, output = PIL.Image.open(io.BytesIO(file_name)), io.BytesIO() - width, height = im.size - im.save(output, format='PNG') - im_data = output.getvalue() - image_data = base64.b64encode(im_data) - if not isinstance(image_data, str): image_data = image_data.decode() - return image_data - -def size_image(file_name:bytes): - import io - - import PIL.Image - im = PIL.Image.open(io.BytesIO(file_name)) - width, height = im.size - return [width , height] - -class ClientRubika: - def __init__(self, auth: str, welcome: bool = True, threading: bool = True): - self.__auth = auth - self.__en = en(auth) - self.__welcome = welcome - self.__thread = threading - if self.__welcome != False: - print('\033[91mR\033[94mUBIK\033[91mA\033[97m~\033[91mC\033[94mipher\033[91mX\033[0m') - - def SendCode(self, phone: str, send_type: str = "SMS", key: str = None): - methods = {'method': 'sendCode', 'input': {'phone_number': phone, 'send_type': send_type, "pass_key": key}, 'client': {'app_name': 'Main', 'app_version': '4.1.7', 'platform': 'Web', 'package': 'web.rubika.ir', 'lang_code': 'fa'}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, 
json={"api_version": "5","tmp_session": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def Register(self, authX): - data_enc = self.__en.encrypt(json.dumps({ - "app_version": "MA_3.1.0", - "device_hash": "250118664537361040511210153736", - "device_model": "CipherX", - "lang_code": "fa", - "system_version": "SDK 23", - "token": "", - "token_type": "Firebase" - })) - - payload = { - "api_version": "4", - "auth": authX, - "client": { - "app_name": "Main", - "app_version": "3.1.0", - "lang_code": "fa", - "package": "ir.resaneh1.iptv", - "platform": "Android" - }, - "data_enc": data_enc, - "method": "registerDevice" - } - - return requests.post(site, json=payload).text - - def EditMessage(self, object_guid: str , message_id: str, newText: str): - methods = {"method":"editMessage","input":{"object_guid":object_guid,"message_id":message_id,"text":newText},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def SendMessage(self, chat_id: str, message): - - methods = {"method":"sendMessage","input":{"object_guid":chat_id,"rnd":"863042","text":message},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def getChatUpdates(self): - methods = {"method":"getChatsUpdates","input":{"state":randint(0, 99999999)},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, 
json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def getMessages(self, object_guid: str = None, min_id: int = 0, sort: str = "FromMin" or "FromMax"): - methods = {"method":"getMessages","input":{"object_guid":object_guid,"sort":sort,"min_id":min_id},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def message(self): - methods = {"method":"getChatsUpdates","input":{"state":1666985507},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def DeleteMessage(self, object_guid: str, message_id: str): - methods = {"method":"deleteMessages","input":{"object_guid":object_guid,"message_ids":[message_id],"type":"Local"},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def setAdminGroup(self, guid_group, guid_member): - methods = {"method":"setGroupAdmin","input":{"group_guid":guid_group,"member_guid":guid_member,"action":"SetAdmin","access_list":["ChangeInfo","PinMessages","DeleteGlobalAllMessages","BanMember","SetJoinLink","SetAdmin","SetMemberAccess"]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return 
json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def setAdminChannel(self, guid_channel, guid_member): - methods = {"method":"setChannelAdmin","input":{"channel_guid":guid_channel,"member_guid":guid_member,"action":"SetAdmin","access_list":["ChangeInfo","ViewMembers","ViewAdmins","PinMessages","SendMessages","EditAllMessages","DeleteGlobalAllMessages","AddMember","SetJoinLink"]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def joinChannel(self, hash_link): - methods = {"method":"joinChannelByLink","input":{"hash_link":hash_link},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def online(self, authX): - return requests.post(json={"data":{},"method":"getUser","api_version":"2","auth":authX,"Messenger":{"app_name":"Main","package":"m.rubika.ir","app_version":"1.2.1","platform":"PWA"}},url='https://messengerg2c1.iranlms.ir/', headers=headers).text - - def requestSendFile(self, file_name): - if os.path.exists(file_name): - formatFile = file_name.split('.') - file_size = os.path.getsize(file_name) - methods = {"method":"requestSendFile","input":{"file_name":file_name,"size":file_size,"mime":formatFile[1]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, 
json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - else: - print('Not Found %s' %file_name) - - def sendImage(self, file_name, guid): - if os.path.exists(file_name): - formatFile = file_name.split('.') - file_size = os.path.getsize(file_name) - wid = size_image(open(file_name, 'rb').read()) - Image = tnFile(open(file_name, 'rb').read()) - request_File = {"method":"requestSendFile","input":{"file_name":file_name,"size":str(len(Image)),"mime":formatFile[1]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - file = json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(request_File))}, headers=headers).text)['data_enc'])) - for part, index in enumerate(range(0, len(Image), 131072), start=1): - data = Image[index: index + 131072] - ase = {'auth': self.__auth, - 'file-id': file['data']['id'], - 'total-part': '1', - 'part-number': '1', - 'chunk-size': str(len(data)), - 'access-hash-send': file['data']['access_hash_send']} - file_main = requests.post(file['data']['upload_url'], data=data, headers=ase).json() - methods = {"method":"sendMessage","input":{"object_guid":guid,"rnd":"617235","file_inline":{"dc_id":file['data']['dc_id'],"file_id":file['data']['id'],"type":"Image","file_name":file_name,"size":len(Image),"mime":formatFile[1],"thumb_inline":Image,"width":wid[0],"height":wid[1],"access_hash_rec":file_main['data']['access_hash_rec']}},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - -class ClientShad: - def __init__(self, auth: str, welcome: bool = True, threading: 
bool = True): - self.__auth = auth - self.__en = en(auth) - self.__welcome = welcome - self.__thread = threading - if self.__welcome != False: - print('\033[91mR\033[94mUBIK\033[91mA\033[97m~\033[91mC\033[94mipher\033[91mX\033[0m') - - def SendCode(self, phone: str, send_type: str = "SMS"): - methods = {'method': 'sendCode', 'input': {'phone_number': phone, 'send_type': send_type}, 'client': {'app_name': 'Main', 'app_version': '3.2.3', 'platform': 'Web', 'package': 'web.shad.ir', 'lang_code': 'fa'}} - return json.loads(self.__en.decrypt(json.loads(requests.post(shad, json={"api_version": "5","tmp_session": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def SendMessage(self, chat_id: str, message): - methods = {"method":"sendMessage","input":{"object_guid":chat_id,"rnd":"863042","text":message},"client":{"app_name":"Main","app_version":"3.2.3","platform":"Web","package":"web.shad.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(shad, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def getMessages(self, object_guid: str = None, min_id: int = 0, sort: str = "FromMin" or "FromMax"): - methods = {"method":"getMessages","input":{"object_guid":object_guid,"sort":sort,"min_id":min_id},"client":{"app_name":"Main","app_version":"3.2.3","platform":"Web","package":"web.shad.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(shad, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def DeleteMessage(self, object_guid: str, message_id: str): - methods = {"method":"deleteMessages","input":{"object_guid":object_guid,"message_ids":[message_id],"type":"Local"},"client":{"app_name":"Main","app_version":"3.2.3","platform":"Web","package":"web.shad.ir","lang_code":"fa"}} - return 
json.loads(self.__en.decrypt(json.loads(requests.post(shad, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def setAdminGroup(self, guid_group, guid_member): - methods = {"method":"setGroupAdmin","input":{"group_guid":guid_group,"member_guid":guid_member,"action":"SetAdmin","access_list":["ChangeInfo","PinMessages","DeleteGlobalAllMessages","BanMember","SetJoinLink","SetAdmin","SetMemberAccess"]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - def setAdminChannel(self, guid_channel, guid_member): - methods = {"method":"setChannelAdmin","input":{"channel_guid":guid_channel,"member_guid":guid_member,"action":"SetAdmin","access_list":["ChangeInfo","ViewMembers","ViewAdmins","PinMessages","SendMessages","EditAllMessages","DeleteGlobalAllMessages","AddMember","SetJoinLink"]},"client":{"app_name":"Main","app_version":"4.1.7","platform":"Web","package":"web.rubika.ir","lang_code":"fa"}} - return json.loads(self.__en.decrypt(json.loads(requests.post(site, json={"api_version": "5","auth": self.__auth,"data_enc": self.__en.encrypt(json.dumps(methods))}, headers=headers).text)['data_enc'])) - - diff --git a/spaces/HarryLee/eCommerceImageCaptioning/fairseq/fairseq/tasks/translation_from_pretrained_bart.py b/spaces/HarryLee/eCommerceImageCaptioning/fairseq/fairseq/tasks/translation_from_pretrained_bart.py deleted file mode 100644 index 0fd7a5b29f0e34699b5d5ef7574bc39b8c6052c9..0000000000000000000000000000000000000000 --- a/spaces/HarryLee/eCommerceImageCaptioning/fairseq/fairseq/tasks/translation_from_pretrained_bart.py +++ /dev/null @@ -1,132 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. - -import torch -from fairseq import utils -from fairseq.data import LanguagePairDataset - -from . import register_task -from .translation import TranslationTask, load_langpair_dataset - - -@register_task("translation_from_pretrained_bart") -class TranslationFromPretrainedBARTTask(TranslationTask): - """ - Translate from source language to target language with a model initialized with a multilingual pretrain. - - Args: - src_dict (~fairseq.data.Dictionary): dictionary for the source language - tgt_dict (~fairseq.data.Dictionary): dictionary for the target language - - .. note:: - - The translation task is compatible with :mod:`fairseq-train`, - :mod:`fairseq-generate` and :mod:`fairseq-interactive`. - - The translation task provides the following additional command-line - arguments: - - .. argparse:: - :ref: fairseq.tasks.translation_parser - :prog: - """ - - @staticmethod - def add_args(parser): - """Add task-specific arguments to the parser.""" - # fmt: off - TranslationTask.add_args(parser) - parser.add_argument('--langs', type=str, metavar='LANG', - help='comma-separated list of monolingual language, ' - 'for example, "en,de,fr". These should match the ' - 'langs from pretraining (and be in the same order). ' - 'You should always add all pretraining language idx ' - 'during finetuning.') - parser.add_argument('--prepend-bos', action='store_true', - help='prepend bos token to each sentence, which matches ' - 'mBART pretraining') - # fmt: on - - def __init__(self, args, src_dict, tgt_dict): - super().__init__(args, src_dict, tgt_dict) - self.langs = args.langs.split(",") - for d in [src_dict, tgt_dict]: - for l in self.langs: - d.add_symbol("[{}]".format(l)) - d.add_symbol("") - - def load_dataset(self, split, epoch=1, combine=False, **kwargs): - """Load a given dataset split. 
- - Args: - split (str): name of the split (e.g., train, valid, test) - """ - paths = utils.split_paths(self.args.data) - assert len(paths) > 0 - data_path = paths[(epoch - 1) % len(paths)] - - # infer langcode - src, tgt = self.args.source_lang, self.args.target_lang - - self.datasets[split] = load_langpair_dataset( - data_path, - split, - src, - self.src_dict, - tgt, - self.tgt_dict, - combine=combine, - dataset_impl=self.args.dataset_impl, - upsample_primary=self.args.upsample_primary, - left_pad_source=self.args.left_pad_source, - left_pad_target=self.args.left_pad_target, - max_source_positions=getattr(self.args, "max_source_positions", 1024), - max_target_positions=getattr(self.args, "max_target_positions", 1024), - load_alignments=self.args.load_alignments, - prepend_bos=getattr(self.args, "prepend_bos", False), - append_source_id=True, - ) - - def build_generator(self, models, args, **unused): - if getattr(args, "score_reference", False): - from fairseq.sequence_scorer import SequenceScorer - - return SequenceScorer( - self.target_dictionary, - eos=self.tgt_dict.index("[{}]".format(self.args.target_lang)), - ) - else: - from fairseq.sequence_generator import SequenceGenerator - - return SequenceGenerator( - models, - self.target_dictionary, - beam_size=getattr(args, "beam", 5), - max_len_a=getattr(args, "max_len_a", 0), - max_len_b=getattr(args, "max_len_b", 200), - min_len=getattr(args, "min_len", 1), - normalize_scores=(not getattr(args, "unnormalized", False)), - len_penalty=getattr(args, "lenpen", 1), - unk_penalty=getattr(args, "unkpen", 0), - temperature=getattr(args, "temperature", 1.0), - match_source_len=getattr(args, "match_source_len", False), - no_repeat_ngram_size=getattr(args, "no_repeat_ngram_size", 0), - eos=self.tgt_dict.index("[{}]".format(self.args.target_lang)), - ) - - def build_dataset_for_inference(self, src_tokens, src_lengths, constraints=None): - src_lang_id = self.source_dictionary.index("[{}]".format(self.args.source_lang)) - 
source_tokens = [] - for s_t in src_tokens: - s_t = torch.cat([s_t, s_t.new(1).fill_(src_lang_id)]) - source_tokens.append(s_t) - dataset = LanguagePairDataset( - source_tokens, - src_lengths, - self.source_dictionary, - tgt_dict=self.target_dictionary, - constraints=constraints, - ) - return dataset diff --git a/spaces/HarryLee/eCommerceImageCaptioning/fairseq/scripts/average_checkpoints.py b/spaces/HarryLee/eCommerceImageCaptioning/fairseq/scripts/average_checkpoints.py deleted file mode 100644 index c512f802bce6b3395cc42a0e4eb39181e9f8c873..0000000000000000000000000000000000000000 --- a/spaces/HarryLee/eCommerceImageCaptioning/fairseq/scripts/average_checkpoints.py +++ /dev/null @@ -1,158 +0,0 @@ -#!/usr/bin/env python3 -# Copyright (c) Facebook, Inc. and its affiliates. -# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. - -import argparse -import collections -import os -import re - -import torch -from fairseq.file_io import PathManager - - -def average_checkpoints(inputs): - """Loads checkpoints from inputs and returns a model with averaged weights. - - Args: - inputs: An iterable of string paths of checkpoints to load from. - - Returns: - A dict of string keys mapping to various values. The 'model' key - from the returned dict should correspond to an OrderedDict mapping - string parameter names to torch Tensors. 
- """ - params_dict = collections.OrderedDict() - params_keys = None - new_state = None - num_models = len(inputs) - - for fpath in inputs: - with PathManager.open(fpath, "rb") as f: - state = torch.load( - f, - map_location=( - lambda s, _: torch.serialization.default_restore_location(s, "cpu") - ), - ) - # Copies over the settings from the first checkpoint - if new_state is None: - new_state = state - - model_params = state["model"] - - model_params_keys = list(model_params.keys()) - if params_keys is None: - params_keys = model_params_keys - elif params_keys != model_params_keys: - raise KeyError( - "For checkpoint {}, expected list of params: {}, " - "but found: {}".format(f, params_keys, model_params_keys) - ) - - for k in params_keys: - p = model_params[k] - if isinstance(p, torch.HalfTensor): - p = p.float() - if k not in params_dict: - params_dict[k] = p.clone() - # NOTE: clone() is needed in case of p is a shared parameter - else: - params_dict[k] += p - - averaged_params = collections.OrderedDict() - for k, v in params_dict.items(): - averaged_params[k] = v - if averaged_params[k].is_floating_point(): - averaged_params[k].div_(num_models) - else: - averaged_params[k] //= num_models - new_state["model"] = averaged_params - return new_state - - -def last_n_checkpoints(paths, n, update_based, upper_bound=None): - assert len(paths) == 1 - path = paths[0] - if update_based: - pt_regexp = re.compile(r"checkpoint_\d+_(\d+)\.pt") - else: - pt_regexp = re.compile(r"checkpoint(\d+)\.pt") - files = PathManager.ls(path) - - entries = [] - for f in files: - m = pt_regexp.fullmatch(f) - if m is not None: - sort_key = int(m.group(1)) - if upper_bound is None or sort_key <= upper_bound: - entries.append((sort_key, m.group(0))) - if len(entries) < n: - raise Exception( - "Found {} checkpoint files but need at least {}", len(entries), n - ) - return [os.path.join(path, x[1]) for x in sorted(entries, reverse=True)[:n]] - - -def main(): - parser = argparse.ArgumentParser( - 
description="Tool to average the params of input checkpoints to " - "produce a new checkpoint", - ) - # fmt: off - parser.add_argument('--inputs', required=True, nargs='+', - help='Input checkpoint file paths.') - parser.add_argument('--output', required=True, metavar='FILE', - help='Write the new checkpoint containing the averaged weights to this path.') - num_group = parser.add_mutually_exclusive_group() - num_group.add_argument('--num-epoch-checkpoints', type=int, - help='if set, will try to find checkpoints with names checkpoint_xx.pt in the path specified by input, ' - 'and average last this many of them.') - num_group.add_argument('--num-update-checkpoints', type=int, - help='if set, will try to find checkpoints with names checkpoint_ee_xx.pt in the path specified by input, ' - 'and average last this many of them.') - parser.add_argument('--checkpoint-upper-bound', type=int, - help='when using --num-epoch-checkpoints, this will set an upper bound on which epoch to use, ' - 'when using --num-update-checkpoints, this will set an upper bound on which update to use' - 'e.g., with --num-epoch-checkpoints=10 --checkpoint-upper-bound=50, checkpoints 41-50 would be averaged.' 
- 'e.g., with --num-update-checkpoints=10 --checkpoint-upper-bound=50000, checkpoints 40500-50000 would be averaged assuming --save-interval-updates 500' - ) - # fmt: on - args = parser.parse_args() - print(args) - - num = None - is_update_based = False - if args.num_update_checkpoints is not None: - num = args.num_update_checkpoints - is_update_based = True - elif args.num_epoch_checkpoints is not None: - num = args.num_epoch_checkpoints - - assert args.checkpoint_upper_bound is None or ( - args.num_epoch_checkpoints is not None - or args.num_update_checkpoints is not None - ), "--checkpoint-upper-bound requires --num-epoch-checkpoints or --num-update-checkpoints" - assert ( - args.num_epoch_checkpoints is None or args.num_update_checkpoints is None - ), "Cannot combine --num-epoch-checkpoints and --num-update-checkpoints" - - if num is not None: - args.inputs = last_n_checkpoints( - args.inputs, - num, - is_update_based, - upper_bound=args.checkpoint_upper_bound, - ) - print("averaging checkpoints: ", args.inputs) - - new_state = average_checkpoints(args.inputs) - with PathManager.open(args.output, "wb") as f: - torch.save(new_state, f) - print("Finished writing averaged checkpoint to {}".format(args.output)) - - -if __name__ == "__main__": - main() diff --git a/spaces/Hexamind/swarms/settings.py b/spaces/Hexamind/swarms/settings.py deleted file mode 100644 index 1f0e00461142930c25d554bedd432dee5c0a5304..0000000000000000000000000000000000000000 --- a/spaces/Hexamind/swarms/settings.py +++ /dev/null @@ -1,82 +0,0 @@ -from dataclasses import dataclass - -import param_ -import streamlit as st -import numpy as np - - -@dataclass -class Settings: - - perimeter: int = param_.PERIMETER - perimeter_z: int = param_.PERIMETER_Z - groundzone: int = param_.GROUNDZONE - - latlon = param_.LATLON['Paris']['lat'], param_.LATLON['Paris']['lon'] - - blues: int = param_.BLUES - - blues_per_circle = np.array(param_.BLUES_PER_CIRCLE) - blue_circles_rho = param_.BLUE_CIRCLES_RHO - 
blue_circles_theta = param_.BLUE_CIRCLES_THETA - blue_circles_zed = param_.BLUE_CIRCLES_ZED - blue_distance_factor: float = param_.BLUE_DISTANCE_FACTOR - - is_unkillable: bool = param_.BLUE_IS_UNKILLABLE - - blue_speed_init: int = param_.BLUE_SPEED_INIT - - reds: int = param_.REDS - - red_squads = param_.RED_SQUADS - red_squads_rho = np.array(param_.RED_SQUADS_RHO) - red_squads_theta = param_.RED_SQUADS_THETA - red_squads_zed = param_.RED_SQUADS_ZED - red_distance_factor: float = param_.RED_DISTANCE_FACTOR - - red_rho_noise = np.array(param_.RED_RHO_NOISE) - red_theta_noise = np.array(param_.RED_THETA_NOISE) - red_zed_noise = np.array(param_.RED_ZED_NOISE) - - red_speed_init: int = param_.RED_SPEED_INIT - - policy_folder: str = param_.POLICY_FOLDER - - -def define_(with_streamlit: bool = True, blues: int = Settings.blues, reds: int = Settings.reds): - """" - shows the blue and red swarms in Streamlit - :return: - """ - blues = blues - reds = reds - - if with_streamlit: - st.title('Blues against Reds by hexamind.ai') - st.write('controlled by Reinforcement Learning.') - st.text('<- Set parameters') - - st.sidebar.subheader("Define the battlefield") - blues = st.sidebar.slider("how many blues on defense?", 1, 20, 6) - Settings.blues = blues - blue_dispersion = st.sidebar.slider("set the average blue dispersion", 0.3, 1.0, 0.8) - Settings.reds = reds - reds = st.sidebar.slider("how many reds are on the attack?", 1, 20, 6) - Settings.reds = reds - red_dispersion = st.sidebar.slider("set the average red dispersion", 0.3, 1.0, 0.7) - - Settings.blue_distance_factor = 3 * blue_dispersion - Settings.red_distance_factor = 3 * red_dispersion - - location = st.sidebar.radio("Location", ['Paris', 'Puilaurens', 'San Francisco']) - - lat_tg = param_.LATLON[location]['lat'] - lon_tg = param_.LATLON[location]['lon'] - - Settings.latlon = lat_tg, lon_tg - - st.sidebar.write( - 'you probably need more drones ' - 'No worries, we have plenty at www.hexamind.ai ') - - return blues, 
reds diff --git a/spaces/HoangHa/IELTS_Speaking_GPT/audio_recorder.py b/spaces/HoangHa/IELTS_Speaking_GPT/audio_recorder.py deleted file mode 100644 index 256f017c7c5086f0cc82128b0d5be25c4885192e..0000000000000000000000000000000000000000 --- a/spaces/HoangHa/IELTS_Speaking_GPT/audio_recorder.py +++ /dev/null @@ -1,44 +0,0 @@ -import pyaudio -import wave - -# Set audio parameters -FORMAT = pyaudio.paInt16 # Audio format -CHANNELS = 1 # Number of audio channels -RATE = 16000 # Sampling rate -CHUNK = 1024 # Number of audio frames per buffer -RECORD_SECONDS = 10 # Duration of recording in seconds -RECORDING_FILENAME = 'recording.wav' # Name of output file - - -def record(seconds=RECORD_SECONDS, filename=RECORDING_FILENAME): - # Initialize PyAudio object - audio = pyaudio.PyAudio() - - # Open audio stream - stream = audio.open( - format=FORMAT, channels=CHANNELS, - rate=RATE, input=True, - frames_per_buffer=CHUNK) - - # Record audio - frames = [] - for i in range(0, int(RATE / CHUNK * seconds)): - data = stream.read(CHUNK) - frames.append(data) - - # Stop audio stream and PyAudio object - stream.stop_stream() - stream.close() - audio.terminate() - - # Write frames to a WAV file - wave_file = wave.open(filename, 'wb') - wave_file.setnchannels(CHANNELS) - wave_file.setsampwidth(audio.get_sample_size(FORMAT)) - wave_file.setframerate(RATE) - wave_file.writeframes(b''.join(frames)) - wave_file.close() - -if __name__ == '__main__': - # Run the record function with default parameters - record() \ No newline at end of file diff --git a/spaces/ICML2022/OFA/fairseq/examples/wav2vec/wav2vec_manifest.py b/spaces/ICML2022/OFA/fairseq/examples/wav2vec/wav2vec_manifest.py deleted file mode 100644 index 9b8aa180e88d9ee98bdca7089aed5046ec0d9cb9..0000000000000000000000000000000000000000 --- a/spaces/ICML2022/OFA/fairseq/examples/wav2vec/wav2vec_manifest.py +++ /dev/null @@ -1,87 +0,0 @@ -#!/usr/bin/env python3 -# Copyright (c) Facebook, Inc. and its affiliates. 
-# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. -""" -Data pre-processing: build vocabularies and binarize training data. -""" - -import argparse -import glob -import os -import random - -import soundfile - - -def get_parser(): - parser = argparse.ArgumentParser() - parser.add_argument( - "root", metavar="DIR", help="root directory containing flac files to index" - ) - parser.add_argument( - "--valid-percent", - default=0.01, - type=float, - metavar="D", - help="percentage of data to use as validation set (between 0 and 1)", - ) - parser.add_argument( - "--dest", default=".", type=str, metavar="DIR", help="output directory" - ) - parser.add_argument( - "--ext", default="flac", type=str, metavar="EXT", help="extension to look for" - ) - parser.add_argument("--seed", default=42, type=int, metavar="N", help="random seed") - parser.add_argument( - "--path-must-contain", - default=None, - type=str, - metavar="FRAG", - help="if set, path must contain this substring for a file to be included in the manifest", - ) - return parser - - -def main(args): - assert args.valid_percent >= 0 and args.valid_percent <= 1.0 - - if not os.path.exists(args.dest): - os.makedirs(args.dest) - - dir_path = os.path.realpath(args.root) - search_path = os.path.join(dir_path, "**/*." 
+ args.ext)
-    rand = random.Random(args.seed)
-
-    valid_f = (
-        open(os.path.join(args.dest, "valid.tsv"), "w")
-        if args.valid_percent > 0
-        else None
-    )
-
-    with open(os.path.join(args.dest, "train.tsv"), "w") as train_f:
-        print(dir_path, file=train_f)
-
-        if valid_f is not None:
-            print(dir_path, file=valid_f)
-
-        for fname in glob.iglob(search_path, recursive=True):
-            file_path = os.path.realpath(fname)
-
-            if args.path_must_contain and args.path_must_contain not in file_path:
-                continue
-
-            frames = soundfile.info(fname).frames
-            dest = train_f if rand.random() > args.valid_percent else valid_f
-            print(
-                "{}\t{}".format(os.path.relpath(file_path, dir_path), frames), file=dest
-            )
-    if valid_f is not None:
-        valid_f.close()
-
-
-if __name__ == "__main__":
-    parser = get_parser()
-    args = parser.parse_args()
-    main(args)
diff --git a/spaces/Iceclear/StableSR/StableSR/basicsr/metrics/niqe.py b/spaces/Iceclear/StableSR/StableSR/basicsr/metrics/niqe.py
deleted file mode 100644
index e3c1467f61d809ec3b2630073118460d9d61a861..0000000000000000000000000000000000000000
--- a/spaces/Iceclear/StableSR/StableSR/basicsr/metrics/niqe.py
+++ /dev/null
@@ -1,199 +0,0 @@
-import cv2
-import math
-import numpy as np
-import os
-from scipy.ndimage import convolve
-from scipy.special import gamma
-
-from basicsr.metrics.metric_util import reorder_image, to_y_channel
-from basicsr.utils.matlab_functions import imresize
-from basicsr.utils.registry import METRIC_REGISTRY
-
-
-def estimate_aggd_param(block):
-    """Estimate AGGD (Asymmetric Generalized Gaussian Distribution) parameters.
-
-    Args:
-        block (ndarray): 2D Image block.
-
-    Returns:
-        tuple: alpha (float), beta_l (float) and beta_r (float) for the AGGD
-            distribution (estimating the params in Equation 7 of the paper).
- """ - block = block.flatten() - gam = np.arange(0.2, 10.001, 0.001) # len = 9801 - gam_reciprocal = np.reciprocal(gam) - r_gam = np.square(gamma(gam_reciprocal * 2)) / (gamma(gam_reciprocal) * gamma(gam_reciprocal * 3)) - - left_std = np.sqrt(np.mean(block[block < 0]**2)) - right_std = np.sqrt(np.mean(block[block > 0]**2)) - gammahat = left_std / right_std - rhat = (np.mean(np.abs(block)))**2 / np.mean(block**2) - rhatnorm = (rhat * (gammahat**3 + 1) * (gammahat + 1)) / ((gammahat**2 + 1)**2) - array_position = np.argmin((r_gam - rhatnorm)**2) - - alpha = gam[array_position] - beta_l = left_std * np.sqrt(gamma(1 / alpha) / gamma(3 / alpha)) - beta_r = right_std * np.sqrt(gamma(1 / alpha) / gamma(3 / alpha)) - return (alpha, beta_l, beta_r) - - -def compute_feature(block): - """Compute features. - - Args: - block (ndarray): 2D Image block. - - Returns: - list: Features with length of 18. - """ - feat = [] - alpha, beta_l, beta_r = estimate_aggd_param(block) - feat.extend([alpha, (beta_l + beta_r) / 2]) - - # distortions disturb the fairly regular structure of natural images. - # This deviation can be captured by analyzing the sample distribution of - # the products of pairs of adjacent coefficients computed along - # horizontal, vertical and diagonal orientations. - shifts = [[0, 1], [1, 0], [1, 1], [1, -1]] - for i in range(len(shifts)): - shifted_block = np.roll(block, shifts[i], axis=(0, 1)) - alpha, beta_l, beta_r = estimate_aggd_param(block * shifted_block) - # Eq. 8 - mean = (beta_r - beta_l) * (gamma(2 / alpha) / gamma(1 / alpha)) - feat.extend([alpha, mean, beta_l, beta_r]) - return feat - - -def niqe(img, mu_pris_param, cov_pris_param, gaussian_window, block_size_h=96, block_size_w=96): - """Calculate NIQE (Natural Image Quality Evaluator) metric. 
-
-    ``Paper: Making a "Completely Blind" Image Quality Analyzer``
-
-    This implementation could produce almost the same results as the official
-    MATLAB codes: http://live.ece.utexas.edu/research/quality/niqe_release.zip
-
-    Note that we do not include block overlap height and width, since they are
-    always 0 in the official implementation.
-
-    For good performance, the official implementation advises dividing the
-    distorted image into patches of the same size as those used for the
-    construction of the multivariate Gaussian model.
-
-    Args:
-        img (ndarray): Input image whose quality needs to be computed. The
-            image must be a gray or Y (of YCbCr) image with shape (h, w).
-            Range [0, 255] with float type.
-        mu_pris_param (ndarray): Mean of a pre-defined multivariate Gaussian
-            model calculated on the pristine dataset.
-        cov_pris_param (ndarray): Covariance of a pre-defined multivariate
-            Gaussian model calculated on the pristine dataset.
-        gaussian_window (ndarray): A 7x7 Gaussian window used for smoothing the
-            image.
-        block_size_h (int): Height of the blocks into which the image is divided.
-            Default: 96 (the official recommended value).
-        block_size_w (int): Width of the blocks into which the image is divided.
-            Default: 96 (the official recommended value).
-    """
-    assert img.ndim == 2, ('Input image must be a gray or Y (of YCbCr) image with shape (h, w).')
-    # crop image
-    h, w = img.shape
-    num_block_h = math.floor(h / block_size_h)
-    num_block_w = math.floor(w / block_size_w)
-    img = img[0:num_block_h * block_size_h, 0:num_block_w * block_size_w]
-
-    distparam = []  # dist param is actually the multiscale features
-    for scale in (1, 2):  # perform on two scales (1, 2)
-        mu = convolve(img, gaussian_window, mode='nearest')
-        sigma = np.sqrt(np.abs(convolve(np.square(img), gaussian_window, mode='nearest') - np.square(mu)))
-        # normalize, as in Eq.
1 in the paper
-        img_normalized = (img - mu) / (sigma + 1)
-
-        feat = []
-        for idx_w in range(num_block_w):
-            for idx_h in range(num_block_h):
-                # process each block
-                block = img_normalized[idx_h * block_size_h // scale:(idx_h + 1) * block_size_h // scale,
-                                       idx_w * block_size_w // scale:(idx_w + 1) * block_size_w // scale]
-                feat.append(compute_feature(block))
-
-        distparam.append(np.array(feat))
-
-        if scale == 1:
-            img = imresize(img / 255., scale=0.5, antialiasing=True)
-            img = img * 255.
-
-    distparam = np.concatenate(distparam, axis=1)
-
-    # fit a MVG (multivariate Gaussian) model to distorted patch features
-    mu_distparam = np.nanmean(distparam, axis=0)
-    # use nancov. ref: https://ww2.mathworks.cn/help/stats/nancov.html
-    distparam_no_nan = distparam[~np.isnan(distparam).any(axis=1)]
-    cov_distparam = np.cov(distparam_no_nan, rowvar=False)
-
-    # compute niqe quality, Eq. 10 in the paper
-    invcov_param = np.linalg.pinv((cov_pris_param + cov_distparam) / 2)
-    quality = np.matmul(
-        np.matmul((mu_pris_param - mu_distparam), invcov_param), np.transpose((mu_pris_param - mu_distparam)))
-
-    quality = np.sqrt(quality)
-    quality = float(np.squeeze(quality))
-    return quality
-
-
-@METRIC_REGISTRY.register()
-def calculate_niqe(img, crop_border, input_order='HWC', convert_to='y', **kwargs):
-    """Calculate NIQE (Natural Image Quality Evaluator) metric.
-
-    ``Paper: Making a "Completely Blind" Image Quality Analyzer``
-
-    This implementation could produce almost the same results as the official
-    MATLAB codes: http://live.ece.utexas.edu/research/quality/niqe_release.zip
-
-    > MATLAB R2021a result for tests/data/baboon.png: 5.72957338 (5.7296)
-    > Our re-implementation result for tests/data/baboon.png: 5.7295763 (5.7296)
-
-    We use the official params estimated from the pristine dataset.
-    We use the recommended block size (96, 96) without overlaps.
-
-    Args:
-        img (ndarray): Input image whose quality needs to be computed.
- The input image must be in range [0, 255] with float/int type. - The input_order of image can be 'HW' or 'HWC' or 'CHW'. (BGR order) - If the input order is 'HWC' or 'CHW', it will be converted to gray - or Y (of YCbCr) image according to the ``convert_to`` argument. - crop_border (int): Cropped pixels in each edge of an image. These - pixels are not involved in the metric calculation. - input_order (str): Whether the input order is 'HW', 'HWC' or 'CHW'. - Default: 'HWC'. - convert_to (str): Whether converted to 'y' (of MATLAB YCbCr) or 'gray'. - Default: 'y'. - - Returns: - float: NIQE result. - """ - ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) - # we use the official params estimated from the pristine dataset. - niqe_pris_params = np.load(os.path.join(ROOT_DIR, 'niqe_pris_params.npz')) - mu_pris_param = niqe_pris_params['mu_pris_param'] - cov_pris_param = niqe_pris_params['cov_pris_param'] - gaussian_window = niqe_pris_params['gaussian_window'] - - img = img.astype(np.float32) - if input_order != 'HW': - img = reorder_image(img, input_order=input_order) - if convert_to == 'y': - img = to_y_channel(img) - elif convert_to == 'gray': - img = cv2.cvtColor(img / 255., cv2.COLOR_BGR2GRAY) * 255. 
- img = np.squeeze(img) - - if crop_border != 0: - img = img[crop_border:-crop_border, crop_border:-crop_border] - - # round is necessary for being consistent with MATLAB's result - img = img.round() - - niqe_result = niqe(img, mu_pris_param, cov_pris_param, gaussian_window) - - return niqe_result diff --git a/spaces/Ikaros521/moe-tts/text/mandarin.py b/spaces/Ikaros521/moe-tts/text/mandarin.py deleted file mode 100644 index ff71de9788e4f20c897b971a775d1ecfbfe1c7b7..0000000000000000000000000000000000000000 --- a/spaces/Ikaros521/moe-tts/text/mandarin.py +++ /dev/null @@ -1,329 +0,0 @@ -import os -import sys -import re -from pypinyin import lazy_pinyin, BOPOMOFO -import jieba -import cn2an -import logging - -logging.getLogger('jieba').setLevel(logging.WARNING) -jieba.initialize() - - -# List of (Latin alphabet, bopomofo) pairs: -_latin_to_bopomofo = [(re.compile('%s' % x[0], re.IGNORECASE), x[1]) for x in [ - ('a', 'ㄟˉ'), - ('b', 'ㄅㄧˋ'), - ('c', 'ㄙㄧˉ'), - ('d', 'ㄉㄧˋ'), - ('e', 'ㄧˋ'), - ('f', 'ㄝˊㄈㄨˋ'), - ('g', 'ㄐㄧˋ'), - ('h', 'ㄝˇㄑㄩˋ'), - ('i', 'ㄞˋ'), - ('j', 'ㄐㄟˋ'), - ('k', 'ㄎㄟˋ'), - ('l', 'ㄝˊㄛˋ'), - ('m', 'ㄝˊㄇㄨˋ'), - ('n', 'ㄣˉ'), - ('o', 'ㄡˉ'), - ('p', 'ㄆㄧˉ'), - ('q', 'ㄎㄧㄡˉ'), - ('r', 'ㄚˋ'), - ('s', 'ㄝˊㄙˋ'), - ('t', 'ㄊㄧˋ'), - ('u', 'ㄧㄡˉ'), - ('v', 'ㄨㄧˉ'), - ('w', 'ㄉㄚˋㄅㄨˋㄌㄧㄡˋ'), - ('x', 'ㄝˉㄎㄨˋㄙˋ'), - ('y', 'ㄨㄞˋ'), - ('z', 'ㄗㄟˋ') -]] - -# List of (bopomofo, romaji) pairs: -_bopomofo_to_romaji = [(re.compile('%s' % x[0]), x[1]) for x in [ - ('ㄅㄛ', 'p⁼wo'), - ('ㄆㄛ', 'pʰwo'), - ('ㄇㄛ', 'mwo'), - ('ㄈㄛ', 'fwo'), - ('ㄅ', 'p⁼'), - ('ㄆ', 'pʰ'), - ('ㄇ', 'm'), - ('ㄈ', 'f'), - ('ㄉ', 't⁼'), - ('ㄊ', 'tʰ'), - ('ㄋ', 'n'), - ('ㄌ', 'l'), - ('ㄍ', 'k⁼'), - ('ㄎ', 'kʰ'), - ('ㄏ', 'h'), - ('ㄐ', 'ʧ⁼'), - ('ㄑ', 'ʧʰ'), - ('ㄒ', 'ʃ'), - ('ㄓ', 'ʦ`⁼'), - ('ㄔ', 'ʦ`ʰ'), - ('ㄕ', 's`'), - ('ㄖ', 'ɹ`'), - ('ㄗ', 'ʦ⁼'), - ('ㄘ', 'ʦʰ'), - ('ㄙ', 's'), - ('ㄚ', 'a'), - ('ㄛ', 'o'), - ('ㄜ', 'ə'), - ('ㄝ', 'e'), - ('ㄞ', 'ai'), - ('ㄟ', 'ei'), - ('ㄠ', 'au'), - ('ㄡ', 'ou'), - ('ㄧㄢ', 'yeNN'), - ('ㄢ', 'aNN'), - ('ㄧㄣ', 
'iNN'), - ('ㄣ', 'əNN'), - ('ㄤ', 'aNg'), - ('ㄧㄥ', 'iNg'), - ('ㄨㄥ', 'uNg'), - ('ㄩㄥ', 'yuNg'), - ('ㄥ', 'əNg'), - ('ㄦ', 'əɻ'), - ('ㄧ', 'i'), - ('ㄨ', 'u'), - ('ㄩ', 'ɥ'), - ('ˉ', '→'), - ('ˊ', '↑'), - ('ˇ', '↓↑'), - ('ˋ', '↓'), - ('˙', ''), - (',', ','), - ('。', '.'), - ('!', '!'), - ('?', '?'), - ('—', '-') -]] - -# List of (romaji, ipa) pairs: -_romaji_to_ipa = [(re.compile('%s' % x[0], re.IGNORECASE), x[1]) for x in [ - ('ʃy', 'ʃ'), - ('ʧʰy', 'ʧʰ'), - ('ʧ⁼y', 'ʧ⁼'), - ('NN', 'n'), - ('Ng', 'ŋ'), - ('y', 'j'), - ('h', 'x') -]] - -# List of (bopomofo, ipa) pairs: -_bopomofo_to_ipa = [(re.compile('%s' % x[0]), x[1]) for x in [ - ('ㄅㄛ', 'p⁼wo'), - ('ㄆㄛ', 'pʰwo'), - ('ㄇㄛ', 'mwo'), - ('ㄈㄛ', 'fwo'), - ('ㄅ', 'p⁼'), - ('ㄆ', 'pʰ'), - ('ㄇ', 'm'), - ('ㄈ', 'f'), - ('ㄉ', 't⁼'), - ('ㄊ', 'tʰ'), - ('ㄋ', 'n'), - ('ㄌ', 'l'), - ('ㄍ', 'k⁼'), - ('ㄎ', 'kʰ'), - ('ㄏ', 'x'), - ('ㄐ', 'tʃ⁼'), - ('ㄑ', 'tʃʰ'), - ('ㄒ', 'ʃ'), - ('ㄓ', 'ts`⁼'), - ('ㄔ', 'ts`ʰ'), - ('ㄕ', 's`'), - ('ㄖ', 'ɹ`'), - ('ㄗ', 'ts⁼'), - ('ㄘ', 'tsʰ'), - ('ㄙ', 's'), - ('ㄚ', 'a'), - ('ㄛ', 'o'), - ('ㄜ', 'ə'), - ('ㄝ', 'ɛ'), - ('ㄞ', 'aɪ'), - ('ㄟ', 'eɪ'), - ('ㄠ', 'ɑʊ'), - ('ㄡ', 'oʊ'), - ('ㄧㄢ', 'jɛn'), - ('ㄩㄢ', 'ɥæn'), - ('ㄢ', 'an'), - ('ㄧㄣ', 'in'), - ('ㄩㄣ', 'ɥn'), - ('ㄣ', 'ən'), - ('ㄤ', 'ɑŋ'), - ('ㄧㄥ', 'iŋ'), - ('ㄨㄥ', 'ʊŋ'), - ('ㄩㄥ', 'jʊŋ'), - ('ㄥ', 'əŋ'), - ('ㄦ', 'əɻ'), - ('ㄧ', 'i'), - ('ㄨ', 'u'), - ('ㄩ', 'ɥ'), - ('ˉ', '→'), - ('ˊ', '↑'), - ('ˇ', '↓↑'), - ('ˋ', '↓'), - ('˙', ''), - (',', ','), - ('。', '.'), - ('!', '!'), - ('?', '?'), - ('—', '-') -]] - -# List of (bopomofo, ipa2) pairs: -_bopomofo_to_ipa2 = [(re.compile('%s' % x[0]), x[1]) for x in [ - ('ㄅㄛ', 'pwo'), - ('ㄆㄛ', 'pʰwo'), - ('ㄇㄛ', 'mwo'), - ('ㄈㄛ', 'fwo'), - ('ㄅ', 'p'), - ('ㄆ', 'pʰ'), - ('ㄇ', 'm'), - ('ㄈ', 'f'), - ('ㄉ', 't'), - ('ㄊ', 'tʰ'), - ('ㄋ', 'n'), - ('ㄌ', 'l'), - ('ㄍ', 'k'), - ('ㄎ', 'kʰ'), - ('ㄏ', 'h'), - ('ㄐ', 'tɕ'), - ('ㄑ', 'tɕʰ'), - ('ㄒ', 'ɕ'), - ('ㄓ', 'tʂ'), - ('ㄔ', 'tʂʰ'), - ('ㄕ', 'ʂ'), - ('ㄖ', 'ɻ'), - ('ㄗ', 'ts'), - ('ㄘ', 'tsʰ'), - ('ㄙ', 's'), - ('ㄚ', 'a'), - 
('ㄛ', 'o'), - ('ㄜ', 'ɤ'), - ('ㄝ', 'ɛ'), - ('ㄞ', 'aɪ'), - ('ㄟ', 'eɪ'), - ('ㄠ', 'ɑʊ'), - ('ㄡ', 'oʊ'), - ('ㄧㄢ', 'jɛn'), - ('ㄩㄢ', 'yæn'), - ('ㄢ', 'an'), - ('ㄧㄣ', 'in'), - ('ㄩㄣ', 'yn'), - ('ㄣ', 'ən'), - ('ㄤ', 'ɑŋ'), - ('ㄧㄥ', 'iŋ'), - ('ㄨㄥ', 'ʊŋ'), - ('ㄩㄥ', 'jʊŋ'), - ('ㄥ', 'ɤŋ'), - ('ㄦ', 'əɻ'), - ('ㄧ', 'i'), - ('ㄨ', 'u'), - ('ㄩ', 'y'), - ('ˉ', '˥'), - ('ˊ', '˧˥'), - ('ˇ', '˨˩˦'), - ('ˋ', '˥˩'), - ('˙', ''), - (',', ','), - ('。', '.'), - ('!', '!'), - ('?', '?'), - ('—', '-') -]] - - -def number_to_chinese(text): - numbers = re.findall(r'\d+(?:\.?\d+)?', text) - for number in numbers: - text = text.replace(number, cn2an.an2cn(number), 1) - return text - - -def chinese_to_bopomofo(text): - text = text.replace('、', ',').replace(';', ',').replace(':', ',') - words = jieba.lcut(text, cut_all=False) - text = '' - for word in words: - bopomofos = lazy_pinyin(word, BOPOMOFO) - if not re.search('[\u4e00-\u9fff]', word): - text += word - continue - for i in range(len(bopomofos)): - bopomofos[i] = re.sub(r'([\u3105-\u3129])$', r'\1ˉ', bopomofos[i]) - if text != '': - text += ' ' - text += ''.join(bopomofos) - return text - - -def latin_to_bopomofo(text): - for regex, replacement in _latin_to_bopomofo: - text = re.sub(regex, replacement, text) - return text - - -def bopomofo_to_romaji(text): - for regex, replacement in _bopomofo_to_romaji: - text = re.sub(regex, replacement, text) - return text - - -def bopomofo_to_ipa(text): - for regex, replacement in _bopomofo_to_ipa: - text = re.sub(regex, replacement, text) - return text - - -def bopomofo_to_ipa2(text): - for regex, replacement in _bopomofo_to_ipa2: - text = re.sub(regex, replacement, text) - return text - - -def chinese_to_romaji(text): - text = number_to_chinese(text) - text = chinese_to_bopomofo(text) - text = latin_to_bopomofo(text) - text = bopomofo_to_romaji(text) - text = re.sub('i([aoe])', r'y\1', text) - text = re.sub('u([aoəe])', r'w\1', text) - text = re.sub('([ʦsɹ]`[⁼ʰ]?)([→↓↑ ]+|$)', - r'\1ɹ`\2', text).replace('ɻ', 
'ɹ`') - text = re.sub('([ʦs][⁼ʰ]?)([→↓↑ ]+|$)', r'\1ɹ\2', text) - return text - - -def chinese_to_lazy_ipa(text): - text = chinese_to_romaji(text) - for regex, replacement in _romaji_to_ipa: - text = re.sub(regex, replacement, text) - return text - - -def chinese_to_ipa(text): - text = number_to_chinese(text) - text = chinese_to_bopomofo(text) - text = latin_to_bopomofo(text) - text = bopomofo_to_ipa(text) - text = re.sub('i([aoe])', r'j\1', text) - text = re.sub('u([aoəe])', r'w\1', text) - text = re.sub('([sɹ]`[⁼ʰ]?)([→↓↑ ]+|$)', - r'\1ɹ`\2', text).replace('ɻ', 'ɹ`') - text = re.sub('([s][⁼ʰ]?)([→↓↑ ]+|$)', r'\1ɹ\2', text) - return text - - -def chinese_to_ipa2(text): - text = number_to_chinese(text) - text = chinese_to_bopomofo(text) - text = latin_to_bopomofo(text) - text = bopomofo_to_ipa2(text) - text = re.sub(r'i([aoe])', r'j\1', text) - text = re.sub(r'u([aoəe])', r'w\1', text) - text = re.sub(r'([ʂɹ]ʰ?)([˩˨˧˦˥ ]+|$)', r'\1ʅ\2', text) - text = re.sub(r'(sʰ?)([˩˨˧˦˥ ]+|$)', r'\1ɿ\2', text) - return text diff --git a/spaces/Illumotion/Koboldcpp/k_quants.h b/spaces/Illumotion/Koboldcpp/k_quants.h deleted file mode 100644 index 9de089e7a471956849a9f9df3edaeefc5e93a7ad..0000000000000000000000000000000000000000 --- a/spaces/Illumotion/Koboldcpp/k_quants.h +++ /dev/null @@ -1,165 +0,0 @@ -#pragma once - -#include "ggml.h" - -#include -#include -#include - -// Super-block size -#ifdef GGML_QKK_64 -#define QK_K 64 -#define K_SCALE_SIZE 4 -#else -#define QK_K 256 -#define K_SCALE_SIZE 12 -#endif - -#ifndef static_assert -#if defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 201100L) -#define static_assert(cond, msg) _Static_assert(cond, msg) -#else -#define static_assert(cond, msg) struct global_scope_noop_trick -#endif -#endif - -// -// Super-block quantization structures -// - -// 2-bit quantization -// weight is represented as x = a * q + b -// 16 blocks of 16 elements each -// Effectively 2.5625 bits per weight -typedef struct { - uint8_t scales[QK_K/16]; // 
scales and mins, quantized with 4 bits
-    uint8_t qs[QK_K/4];      // quants
-    ggml_fp16_t d;           // super-block scale for quantized scales
-    ggml_fp16_t dmin;        // super-block scale for quantized mins
-} block_q2_K;
-static_assert(sizeof(block_q2_K) == 2*sizeof(ggml_fp16_t) + QK_K/16 + QK_K/4, "wrong q2_K block size/padding");
-
-// 3-bit quantization
-// weight is represented as x = a * q
-// 16 blocks of 16 elements each
-// Effectively 3.4375 bits per weight
-#ifdef GGML_QKK_64
-typedef struct {
-    uint8_t hmask[QK_K/8]; // quants - high bit
-    uint8_t qs[QK_K/4];    // quants - low 2 bits
-    uint8_t scales[2];
-    ggml_fp16_t d;         // super-block scale
-} block_q3_K;
-static_assert(sizeof(block_q3_K) == sizeof(ggml_fp16_t) + QK_K / 4 + QK_K / 8 + 2, "wrong q3_K block size/padding");
-#else
-typedef struct {
-    uint8_t hmask[QK_K/8]; // quants - high bit
-    uint8_t qs[QK_K/4];    // quants - low 2 bits
-    uint8_t scales[12];    // scales, quantized with 6 bits
-    ggml_fp16_t d;         // super-block scale
-} block_q3_K;
-static_assert(sizeof(block_q3_K) == sizeof(ggml_fp16_t) + QK_K / 4 + QK_K / 8 + 12, "wrong q3_K block size/padding");
-#endif
-
-// 4-bit quantization
-// 8 blocks of 32 elements each
-// weight is represented as x = a * q + b
-// Effectively 4.5 bits per weight
-#ifdef GGML_QKK_64
-typedef struct {
-    ggml_fp16_t d[2];    // super-block scales/mins
-    uint8_t scales[2];   // 4-bit block scales/mins
-    uint8_t qs[QK_K/2];  // 4-bit quants
-} block_q4_K;
-static_assert(sizeof(block_q4_K) == 2*sizeof(ggml_fp16_t) + QK_K/2 + 2, "wrong q4_K block size/padding");
-#else
-typedef struct {
-    ggml_fp16_t d;                // super-block scale for quantized scales
-    ggml_fp16_t dmin;             // super-block scale for quantized mins
-    uint8_t scales[K_SCALE_SIZE]; // scales and mins, quantized with 6 bits
-    uint8_t qs[QK_K/2];           // 4-bit quants
-} block_q4_K;
-static_assert(sizeof(block_q4_K) == 2*sizeof(ggml_fp16_t) + K_SCALE_SIZE + QK_K/2, "wrong q4_K block size/padding");
-#endif
-
-// 5-bit quantization
-// 8 blocks of 32
elements each -// weight is represented as x = a * q + b -// Effectively 5.5 bits per weight -#ifdef GGML_QKK_64 -typedef struct { - ggml_fp16_t d; // super-block scale - int8_t scales[QK_K/16]; // 8-bit block scales - uint8_t qh[QK_K/8]; // quants, high bit - uint8_t qs[QK_K/2]; // quants, low 4 bits -} block_q5_K; -static_assert(sizeof(block_q5_K) == sizeof(ggml_fp16_t) + QK_K/2 + QK_K/8 + QK_K/16, "wrong q5_K block size/padding"); -#else -typedef struct { - ggml_fp16_t d; // super-block scale for quantized scales - ggml_fp16_t dmin; // super-block scale for quantized mins - uint8_t scales[K_SCALE_SIZE]; // scales and mins, quantized with 6 bits - uint8_t qh[QK_K/8]; // quants, high bit - uint8_t qs[QK_K/2]; // quants, low 4 bits -} block_q5_K; -static_assert(sizeof(block_q5_K) == 2*sizeof(ggml_fp16_t) + K_SCALE_SIZE + QK_K/2 + QK_K/8, "wrong q5_K block size/padding"); -#endif - -// 6-bit quantization -// weight is represented as x = a * q -// 16 blocks of 16 elements each -// Effectively 6.5625 bits per weight -typedef struct { - uint8_t ql[QK_K/2]; // quants, lower 4 bits - uint8_t qh[QK_K/4]; // quants, upper 2 bits - int8_t scales[QK_K/16]; // scales, quantized with 8 bits - ggml_fp16_t d; // super-block scale -} block_q6_K; -static_assert(sizeof(block_q6_K) == sizeof(ggml_fp16_t) + QK_K / 16 + 3*QK_K/4, "wrong q6_K block size/padding"); - -// This is only used for intermediate quantization and dot products -typedef struct { - float d; // delta - int8_t qs[QK_K]; // quants - int16_t bsums[QK_K/16]; // sum of quants in groups of 16 -} block_q8_K; -static_assert(sizeof(block_q8_K) == sizeof(float) + QK_K + QK_K/16*sizeof(int16_t), "wrong q8_K block size/padding"); - - -// Quantization -void quantize_row_q2_K_reference(const float * restrict x, block_q2_K * restrict y, int k); -void quantize_row_q3_K_reference(const float * restrict x, block_q3_K * restrict y, int k); -void quantize_row_q4_K_reference(const float * restrict x, block_q4_K * restrict y, int k); 
-void quantize_row_q5_K_reference(const float * restrict x, block_q5_K * restrict y, int k); -void quantize_row_q6_K_reference(const float * restrict x, block_q6_K * restrict y, int k); -void quantize_row_q8_K_reference(const float * restrict x, block_q8_K * restrict y, int k); - -void quantize_row_q2_K(const float * restrict x, void * restrict y, int k); -void quantize_row_q3_K(const float * restrict x, void * restrict y, int k); -void quantize_row_q4_K(const float * restrict x, void * restrict y, int k); -void quantize_row_q5_K(const float * restrict x, void * restrict y, int k); -void quantize_row_q6_K(const float * restrict x, void * restrict y, int k); -void quantize_row_q8_K(const float * restrict x, void * restrict y, int k); - -// Dequantization -void dequantize_row_q2_K(const block_q2_K * restrict x, float * restrict y, int k); -void dequantize_row_q3_K(const block_q3_K * restrict x, float * restrict y, int k); -void dequantize_row_q4_K(const block_q4_K * restrict x, float * restrict y, int k); -void dequantize_row_q5_K(const block_q5_K * restrict x, float * restrict y, int k); -void dequantize_row_q6_K(const block_q6_K * restrict x, float * restrict y, int k); -void dequantize_row_q8_K(const block_q8_K * restrict x, float * restrict y, int k); - -// Dot product -void ggml_vec_dot_q2_K_q8_K(int n, float * restrict s, const void * restrict vx, const void * restrict vy); -void ggml_vec_dot_q3_K_q8_K(int n, float * restrict s, const void * restrict vx, const void * restrict vy); -void ggml_vec_dot_q4_K_q8_K(int n, float * restrict s, const void * restrict vx, const void * restrict vy); -void ggml_vec_dot_q5_K_q8_K(int n, float * restrict s, const void * restrict vx, const void * restrict vy); -void ggml_vec_dot_q6_K_q8_K(int n, float * restrict s, const void * restrict vx, const void * restrict vy); - -// Quantization with histogram collection -size_t ggml_quantize_q2_K(const float * src, void * dst, int n, int k, int64_t * hist); -size_t 
ggml_quantize_q3_K(const float * src, void * dst, int n, int k, int64_t * hist); -size_t ggml_quantize_q4_K(const float * src, void * dst, int n, int k, int64_t * hist); -size_t ggml_quantize_q5_K(const float * src, void * dst, int n, int k, int64_t * hist); -size_t ggml_quantize_q6_K(const float * src, void * dst, int n, int k, int64_t * hist); - diff --git a/spaces/Intel/NeuralChat-ICX-INT4/fastchat/model/apply_lora.py b/spaces/Intel/NeuralChat-ICX-INT4/fastchat/model/apply_lora.py deleted file mode 100644 index 870e64a3b4bfb6e44df70bf922bb8e3f6d3c9d42..0000000000000000000000000000000000000000 --- a/spaces/Intel/NeuralChat-ICX-INT4/fastchat/model/apply_lora.py +++ /dev/null @@ -1,48 +0,0 @@ -""" -Apply the LoRA weights on top of a base model. - -Usage: -python3 -m fastchat.model.apply_lora --base ~/model_weights/llama-7b --target ~/model_weights/baize-7b --lora project-baize/baize-lora-7B - -Dependency: -pip3 install git+https://github.com/huggingface/peft.git@2822398fbe896f25d4dac5e468624dc5fd65a51b -""" -import argparse - -import torch -from peft import PeftModel -from transformers import AutoTokenizer, AutoModelForCausalLM - - -def apply_lora(base_model_path, target_model_path, lora_path): - print(f"Loading the base model from {base_model_path}") - base = AutoModelForCausalLM.from_pretrained( - base_model_path, torch_dtype=torch.float16, low_cpu_mem_usage=True - ) - base_tokenizer = AutoTokenizer.from_pretrained(base_model_path, use_fast=False) - - print(f"Loading the LoRA adapter from {lora_path}") - - lora_model = PeftModel.from_pretrained( - base, - lora_path, - torch_dtype=torch.float16, - ) - - print("Applying the LoRA") - model = lora_model.merge_and_unload() - - print(f"Saving the target model to {target_model_path}") - model.save_pretrained(target_model_path) - base_tokenizer.save_pretrained(target_model_path) - - -if __name__ == "__main__": - parser = argparse.ArgumentParser() - parser.add_argument("--base-model-path", type=str, required=True) - 
parser.add_argument("--target-model-path", type=str, required=True) - parser.add_argument("--lora-path", type=str, required=True) - - args = parser.parse_args() - - apply_lora(args.base_model_path, args.target_model_path, args.lora_path) diff --git a/spaces/Jasonyoyo/CodeFormer/CodeFormer/basicsr/utils/download_util.py b/spaces/Jasonyoyo/CodeFormer/CodeFormer/basicsr/utils/download_util.py deleted file mode 100644 index 2a267915743ee3f3232bc8fe992466b52468979a..0000000000000000000000000000000000000000 --- a/spaces/Jasonyoyo/CodeFormer/CodeFormer/basicsr/utils/download_util.py +++ /dev/null @@ -1,95 +0,0 @@ -import math -import os -import requests -from torch.hub import download_url_to_file, get_dir -from tqdm import tqdm -from urllib.parse import urlparse - -from .misc import sizeof_fmt - - -def download_file_from_google_drive(file_id, save_path): - """Download files from google drive. - Ref: - https://stackoverflow.com/questions/25010369/wget-curl-large-file-from-google-drive # noqa E501 - Args: - file_id (str): File id. - save_path (str): Save path. 
-    """
-
-    session = requests.Session()
-    URL = 'https://docs.google.com/uc?export=download'
-    params = {'id': file_id}
-
-    response = session.get(URL, params=params, stream=True)
-    token = get_confirm_token(response)
-    if token:
-        params['confirm'] = token
-        response = session.get(URL, params=params, stream=True)
-
-    # get file size
-    response_file_size = session.get(URL, params=params, stream=True, headers={'Range': 'bytes=0-2'})
-    print(response_file_size)
-    if 'Content-Range' in response_file_size.headers:
-        file_size = int(response_file_size.headers['Content-Range'].split('/')[1])
-    else:
-        file_size = None
-
-    save_response_content(response, save_path, file_size)
-
-
-def get_confirm_token(response):
-    for key, value in response.cookies.items():
-        if key.startswith('download_warning'):
-            return value
-    return None
-
-
-def save_response_content(response, destination, file_size=None, chunk_size=32768):
-    if file_size is not None:
-        pbar = tqdm(total=math.ceil(file_size / chunk_size), unit='chunk')
-
-        readable_file_size = sizeof_fmt(file_size)
-    else:
-        pbar = None
-
-    with open(destination, 'wb') as f:
-        downloaded_size = 0
-        for chunk in response.iter_content(chunk_size):
-            downloaded_size += chunk_size
-            if pbar is not None:
-                pbar.update(1)
-                pbar.set_description(f'Download {sizeof_fmt(downloaded_size)} / {readable_file_size}')
-            if chunk:  # filter out keep-alive new chunks
-                f.write(chunk)
-    if pbar is not None:
-        pbar.close()
-
-
-def load_file_from_url(url, model_dir=None, progress=True, file_name=None):
-    """Load file from http url, will download models if necessary.
-    Ref: https://github.com/1adrianb/face-alignment/blob/master/face_alignment/utils.py
-    Args:
-        url (str): URL to be downloaded.
-        model_dir (str): The path to save the downloaded model. Should be a full path. If None, use pytorch hub_dir.
-            Default: None.
-        progress (bool): Whether to show the download progress. Default: True.
-        file_name (str): The downloaded file name.
If None, use the file name in the url. Default: None.
-    Returns:
-        str: The path to the downloaded file.
-    """
-    if model_dir is None:  # use the pytorch hub_dir
-        hub_dir = get_dir()
-        model_dir = os.path.join(hub_dir, 'checkpoints')
-
-    os.makedirs(model_dir, exist_ok=True)
-
-    parts = urlparse(url)
-    filename = os.path.basename(parts.path)
-    if file_name is not None:
-        filename = file_name
-    cached_file = os.path.abspath(os.path.join(model_dir, filename))
-    if not os.path.exists(cached_file):
-        print(f'Downloading: "{url}" to {cached_file}\n')
-        download_url_to_file(url, cached_file, hash_prefix=None, progress=progress)
-    return cached_file
\ No newline at end of file
diff --git a/spaces/Jeff2323/ai-comic-factory/src/app/store/index.ts b/spaces/Jeff2323/ai-comic-factory/src/app/store/index.ts
deleted file mode 100644
index e85dd4d052996e9b4120bef57abb6c72c509d41a..0000000000000000000000000000000000000000
--- a/spaces/Jeff2323/ai-comic-factory/src/app/store/index.ts
+++ /dev/null
@@ -1,203 +0,0 @@
-"use client"
-
-import { create } from "zustand"
-
-import { FontName } from "@/lib/fonts"
-import { Preset, PresetName, defaultPreset, getPreset, getRandomPreset } from "@/app/engine/presets"
-import { LayoutName, defaultLayout, getRandomLayoutName, getRandomLayoutNames } from "../layouts"
-import html2canvas from "html2canvas"
-import { RenderedScene } from "@/types"
-
-export const useStore = create<{
-  prompt: string
-  font: FontName
-  preset: Preset
-  nbFrames: number
-  panels: string[]
-  captions: string[]
-  upscaleQueue: Record<string, RenderedScene>
-  showCaptions: boolean
-  renderedScenes: Record<string, RenderedScene>
-  layout: LayoutName
-  layouts: LayoutName[]
-  zoomLevel: number
-  page: HTMLDivElement
-  isGeneratingStory: boolean
-  panelGenerationStatus: Record<string, boolean>
-  isGeneratingText: boolean
-  atLeastOnePanelIsBusy: boolean
-  setRendered: (panelId: string, renderedScene: RenderedScene) => void
-  addToUpscaleQueue: (panelId: string, renderedScene: RenderedScene) => void
-  removeFromUpscaleQueue: (panelId:
string) => void - setPrompt: (prompt: string) => void - setFont: (font: FontName) => void - setPreset: (preset: Preset) => void - setNbFrames: (nbFrames: number) => void - setPanels: (panels: string[]) => void - setShowCaptions: (showCaptions: boolean) => void - setLayout: (layout: LayoutName) => void - setLayouts: (layouts: LayoutName[]) => void - setCaptions: (captions: string[]) => void - setZoomLevel: (zoomLevel: number) => void - setPage: (page: HTMLDivElement) => void - setGeneratingStory: (isGeneratingStory: boolean) => void - setGeneratingImages: (panelId: string, value: boolean) => void - setGeneratingText: (isGeneratingText: boolean) => void - pageToImage: () => Promise<string> - download: () => Promise<void> - generate: (prompt: string, presetName: PresetName, layoutName: LayoutName) => void -}>((set, get) => ({ - prompt: "", - font: "actionman", - preset: getPreset(defaultPreset), - nbFrames: 1, - panels: [], - captions: [], - upscaleQueue: {} as Record<string, RenderedScene>, - renderedScenes: {} as Record<string, RenderedScene>, - showCaptions: false, - layout: defaultLayout, - layouts: [defaultLayout, defaultLayout], - zoomLevel: 60, - page: undefined as unknown as HTMLDivElement, - isGeneratingStory: false, - panelGenerationStatus: {}, - isGeneratingText: false, - atLeastOnePanelIsBusy: false, - setRendered: (panelId: string, renderedScene: RenderedScene) => { - const { renderedScenes } = get() - set({ - renderedScenes: { - ...renderedScenes, - [panelId]: renderedScene - } - }) - }, - addToUpscaleQueue: (panelId: string, renderedScene: RenderedScene) => { - const { upscaleQueue } = get() - set({ - upscaleQueue: { - ...upscaleQueue, - [panelId]: renderedScene - }, - }) - }, - removeFromUpscaleQueue: (panelId: string) => { - const upscaleQueue = { ...get().upscaleQueue } - delete upscaleQueue[panelId] - set({ - upscaleQueue, - }) - }, - setPrompt: (prompt: string) => { - const existingPrompt = get().prompt - if (prompt === existingPrompt) { return } - set({ - prompt, - }) - }, - setFont: (font: FontName) => { - const existingFont = get().font - if 
(font === existingFont) { return } - set({ - font, - }) - }, - setPreset: (preset: Preset) => { - const existingPreset = get().preset - if (preset.label === existingPreset.label) { return } - set({ - preset, - }) - }, - setNbFrames: (nbFrames: number) => { - const existingNbFrames = get().nbFrames - if (nbFrames === existingNbFrames) { return } - set({ - nbFrames, - }) - }, - setPanels: (panels: string[]) => set({ panels }), - setCaptions: (captions: string[]) => { - set({ - captions, - }) - }, - setShowCaptions: (showCaptions: boolean) => { - set({ - showCaptions, - }) - }, - setLayout: (layoutName: LayoutName) => { - const layout = layoutName === "random" - ? getRandomLayoutName() - : layoutName - - set({ - layout, - layouts: [layout, layout] - }) - }, - setLayouts: (layouts: LayoutName[]) => set({ layouts }), - setZoomLevel: (zoomLevel: number) => set({ zoomLevel }), - setPage: (page: HTMLDivElement) => { - if (!page) { return } - set({ page }) - }, - setGeneratingStory: (isGeneratingStory: boolean) => set({ isGeneratingStory }), - setGeneratingImages: (panelId: string, value: boolean) => { - const panelGenerationStatus: Record = { - ...get().panelGenerationStatus, - [panelId]: value - } - - const atLeastOnePanelIsBusy = Object.values(panelGenerationStatus).includes(true) - - set({ - panelGenerationStatus, - atLeastOnePanelIsBusy - }) - }, - setGeneratingText: (isGeneratingText: boolean) => set({ isGeneratingText }), - pageToImage: async () => { - const { page } = get() - if (!page) { return "" } - - - const canvas = await html2canvas(page) - console.log("canvas:", canvas) - - const data = canvas.toDataURL('image/jpeg', 0.5) - return data - }, - download: async () => { - const { pageToImage } = get() - const data = await pageToImage() - - const link = document.createElement('a') - - if (typeof link.download === 'string') { - link.href = data - link.download = 'comic.jpg' - document.body.appendChild(link) - link.click() - document.body.removeChild(link) - } else 
{ - window.open(data) - } - }, - generate: (prompt: string, presetName: PresetName, layoutName: LayoutName) => { - const layout = layoutName === "random" - ? getRandomLayoutName() - : layoutName - set({ - prompt, - panels: [], - captions: [], - preset: presetName === "random" - ? getRandomPreset() - : getPreset(presetName), - layout, - layouts: [layout, layout], - }) - } -})) diff --git a/spaces/JingyeChen22/TextDiffuser/test.py b/spaces/JingyeChen22/TextDiffuser/test.py deleted file mode 100644 index 0c5264b59075634aaff87ae484c71b61d2e88ed8..0000000000000000000000000000000000000000 --- a/spaces/JingyeChen22/TextDiffuser/test.py +++ /dev/null @@ -1,5 +0,0 @@ -import os - -os.system('wget https://layoutlm.blob.core.windows.net/textdiffuser/textdiffuser-ckpt.zip') -os.system('unzip textdiffuser-ckpt.zip') -print('finish')  # note: 'finish' is not a shell command, so this should be a print, not os.system \ No newline at end of file diff --git a/spaces/Jojelf/dreamlike-photoreal-2.0/README.md b/spaces/Jojelf/dreamlike-photoreal-2.0/README.md deleted file mode 100644 index dd4eaa77c97d886042d6d692b2f708c0484dbf9f..0000000000000000000000000000000000000000 --- a/spaces/Jojelf/dreamlike-photoreal-2.0/README.md +++ /dev/null @@ -1,12 +0,0 @@ ---- -title: Dreamlike Photoreal 2.0 -emoji: 👀 -colorFrom: red -colorTo: indigo -sdk: gradio -sdk_version: 3.15.0 -app_file: app.py -pinned: false ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/Kayson/InstructDiffusion/stable_diffusion/ldm/modules/distributions/distributions.py b/spaces/Kayson/InstructDiffusion/stable_diffusion/ldm/modules/distributions/distributions.py deleted file mode 100644 index f2b8ef901130efc171aa69742ca0244d94d3f2e9..0000000000000000000000000000000000000000 --- a/spaces/Kayson/InstructDiffusion/stable_diffusion/ldm/modules/distributions/distributions.py +++ /dev/null @@ -1,92 +0,0 @@ -import torch -import numpy as np - - -class AbstractDistribution: - def sample(self): - raise NotImplementedError() - - def 
mode(self): - raise NotImplementedError() - - -class DiracDistribution(AbstractDistribution): - def __init__(self, value): - self.value = value - - def sample(self): - return self.value - - def mode(self): - return self.value - - -class DiagonalGaussianDistribution(object): - def __init__(self, parameters, deterministic=False): - self.parameters = parameters - self.mean, self.logvar = torch.chunk(parameters, 2, dim=1) - self.logvar = torch.clamp(self.logvar, -30.0, 20.0) - self.deterministic = deterministic - self.std = torch.exp(0.5 * self.logvar) - self.var = torch.exp(self.logvar) - if self.deterministic: - self.var = self.std = torch.zeros_like(self.mean).to(device=self.parameters.device) - - def sample(self): - x = self.mean + self.std * torch.randn(self.mean.shape).to(device=self.parameters.device) - return x - - def kl(self, other=None): - if self.deterministic: - return torch.Tensor([0.]) - else: - if other is None: - return 0.5 * torch.sum(torch.pow(self.mean, 2) - + self.var - 1.0 - self.logvar, - dim=[1, 2, 3]) - else: - return 0.5 * torch.sum( - torch.pow(self.mean - other.mean, 2) / other.var - + self.var / other.var - 1.0 - self.logvar + other.logvar, - dim=[1, 2, 3]) - - def nll(self, sample, dims=[1,2,3]): - if self.deterministic: - return torch.Tensor([0.]) - logtwopi = np.log(2.0 * np.pi) - return 0.5 * torch.sum( - logtwopi + self.logvar + torch.pow(sample - self.mean, 2) / self.var, - dim=dims) - - def mode(self): - return self.mean - - -def normal_kl(mean1, logvar1, mean2, logvar2): - """ - source: https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/losses.py#L12 - Compute the KL divergence between two gaussians. - Shapes are automatically broadcasted, so batches can be compared to - scalars, among other use cases. 
- """ - tensor = None - for obj in (mean1, logvar1, mean2, logvar2): - if isinstance(obj, torch.Tensor): - tensor = obj - break - assert tensor is not None, "at least one argument must be a Tensor" - - # Force variances to be Tensors. Broadcasting helps convert scalars to - # Tensors, but it does not work for torch.exp(). - logvar1, logvar2 = [ - x if isinstance(x, torch.Tensor) else torch.tensor(x).to(tensor) - for x in (logvar1, logvar2) - ] - - return 0.5 * ( - -1.0 - + logvar2 - - logvar1 - + torch.exp(logvar1 - logvar2) - + ((mean1 - mean2) ** 2) * torch.exp(-logvar2) - ) diff --git a/spaces/Kevin676/Real-Time-Voice-Cloning/synthesizer/__init__.py b/spaces/Kevin676/Real-Time-Voice-Cloning/synthesizer/__init__.py deleted file mode 100644 index 4287ca8617970fa8fc025b75cb319c7032706910..0000000000000000000000000000000000000000 --- a/spaces/Kevin676/Real-Time-Voice-Cloning/synthesizer/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# \ No newline at end of file diff --git a/spaces/Kevin676/Shanghainese-TTS-demo/transforms.py b/spaces/Kevin676/Shanghainese-TTS-demo/transforms.py deleted file mode 100644 index 4793d67ca5a5630e0ffe0f9fb29445c949e64dae..0000000000000000000000000000000000000000 --- a/spaces/Kevin676/Shanghainese-TTS-demo/transforms.py +++ /dev/null @@ -1,193 +0,0 @@ -import torch -from torch.nn import functional as F - -import numpy as np - - -DEFAULT_MIN_BIN_WIDTH = 1e-3 -DEFAULT_MIN_BIN_HEIGHT = 1e-3 -DEFAULT_MIN_DERIVATIVE = 1e-3 - - -def piecewise_rational_quadratic_transform(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails=None, - tail_bound=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - - if tails is None: - spline_fn = rational_quadratic_spline - spline_kwargs = {} - else: - spline_fn = unconstrained_rational_quadratic_spline - spline_kwargs = { - 'tails': tails, - 'tail_bound': tail_bound - } - - outputs, 
logabsdet = spline_fn( - inputs=inputs, - unnormalized_widths=unnormalized_widths, - unnormalized_heights=unnormalized_heights, - unnormalized_derivatives=unnormalized_derivatives, - inverse=inverse, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative, - **spline_kwargs - ) - return outputs, logabsdet - - -def searchsorted(bin_locations, inputs, eps=1e-6): - bin_locations[..., -1] += eps - return torch.sum( - inputs[..., None] >= bin_locations, - dim=-1 - ) - 1 - - -def unconstrained_rational_quadratic_spline(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails='linear', - tail_bound=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - inside_interval_mask = (inputs >= -tail_bound) & (inputs <= tail_bound) - outside_interval_mask = ~inside_interval_mask - - outputs = torch.zeros_like(inputs) - logabsdet = torch.zeros_like(inputs) - - if tails == 'linear': - unnormalized_derivatives = F.pad(unnormalized_derivatives, pad=(1, 1)) - constant = np.log(np.exp(1 - min_derivative) - 1) - unnormalized_derivatives[..., 0] = constant - unnormalized_derivatives[..., -1] = constant - - outputs[outside_interval_mask] = inputs[outside_interval_mask] - logabsdet[outside_interval_mask] = 0 - else: - raise RuntimeError('{} tails are not implemented.'.format(tails)) - - outputs[inside_interval_mask], logabsdet[inside_interval_mask] = rational_quadratic_spline( - inputs=inputs[inside_interval_mask], - unnormalized_widths=unnormalized_widths[inside_interval_mask, :], - unnormalized_heights=unnormalized_heights[inside_interval_mask, :], - unnormalized_derivatives=unnormalized_derivatives[inside_interval_mask, :], - inverse=inverse, - left=-tail_bound, right=tail_bound, bottom=-tail_bound, top=tail_bound, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative - ) - - return 
outputs, logabsdet - -def rational_quadratic_spline(inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - left=0., right=1., bottom=0., top=1., - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE): - if torch.min(inputs) < left or torch.max(inputs) > right: - raise ValueError('Input to a transform is not within its domain') - - num_bins = unnormalized_widths.shape[-1] - - if min_bin_width * num_bins > 1.0: - raise ValueError('Minimal bin width too large for the number of bins') - if min_bin_height * num_bins > 1.0: - raise ValueError('Minimal bin height too large for the number of bins') - - widths = F.softmax(unnormalized_widths, dim=-1) - widths = min_bin_width + (1 - min_bin_width * num_bins) * widths - cumwidths = torch.cumsum(widths, dim=-1) - cumwidths = F.pad(cumwidths, pad=(1, 0), mode='constant', value=0.0) - cumwidths = (right - left) * cumwidths + left - cumwidths[..., 0] = left - cumwidths[..., -1] = right - widths = cumwidths[..., 1:] - cumwidths[..., :-1] - - derivatives = min_derivative + F.softplus(unnormalized_derivatives) - - heights = F.softmax(unnormalized_heights, dim=-1) - heights = min_bin_height + (1 - min_bin_height * num_bins) * heights - cumheights = torch.cumsum(heights, dim=-1) - cumheights = F.pad(cumheights, pad=(1, 0), mode='constant', value=0.0) - cumheights = (top - bottom) * cumheights + bottom - cumheights[..., 0] = bottom - cumheights[..., -1] = top - heights = cumheights[..., 1:] - cumheights[..., :-1] - - if inverse: - bin_idx = searchsorted(cumheights, inputs)[..., None] - else: - bin_idx = searchsorted(cumwidths, inputs)[..., None] - - input_cumwidths = cumwidths.gather(-1, bin_idx)[..., 0] - input_bin_widths = widths.gather(-1, bin_idx)[..., 0] - - input_cumheights = cumheights.gather(-1, bin_idx)[..., 0] - delta = heights / widths - input_delta = delta.gather(-1, bin_idx)[..., 0] - - input_derivatives = 
derivatives.gather(-1, bin_idx)[..., 0] - input_derivatives_plus_one = derivatives[..., 1:].gather(-1, bin_idx)[..., 0] - - input_heights = heights.gather(-1, bin_idx)[..., 0] - - if inverse: - a = (((inputs - input_cumheights) * (input_derivatives - + input_derivatives_plus_one - - 2 * input_delta) - + input_heights * (input_delta - input_derivatives))) - b = (input_heights * input_derivatives - - (inputs - input_cumheights) * (input_derivatives - + input_derivatives_plus_one - - 2 * input_delta)) - c = - input_delta * (inputs - input_cumheights) - - discriminant = b.pow(2) - 4 * a * c - assert (discriminant >= 0).all() - - root = (2 * c) / (-b - torch.sqrt(discriminant)) - outputs = root * input_bin_widths + input_cumwidths - - theta_one_minus_theta = root * (1 - root) - denominator = input_delta + ((input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta) - derivative_numerator = input_delta.pow(2) * (input_derivatives_plus_one * root.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - root).pow(2)) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, -logabsdet - else: - theta = (inputs - input_cumwidths) / input_bin_widths - theta_one_minus_theta = theta * (1 - theta) - - numerator = input_heights * (input_delta * theta.pow(2) - + input_derivatives * theta_one_minus_theta) - denominator = input_delta + ((input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta) - outputs = input_cumheights + numerator / denominator - - derivative_numerator = input_delta.pow(2) * (input_derivatives_plus_one * theta.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - theta).pow(2)) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, logabsdet diff --git a/spaces/KevinQHLin/UniVTG/model/univtg_qfvs.py b/spaces/KevinQHLin/UniVTG/model/univtg_qfvs.py deleted file mode 
100644 index 0c4b133857affa8f34e4c77e753539291cb96a75..0000000000000000000000000000000000000000 --- a/spaces/KevinQHLin/UniVTG/model/univtg_qfvs.py +++ /dev/null @@ -1,476 +0,0 @@ -import pdb -import torch -import torch.nn.functional as F -from torch import nn -import numpy as np - -from model.transformer_encoder_droppath import build_transformer -from model.matcher import build_matcher -from model.position_encoding import build_position_encoding -from utils.span_utils import generalized_temporal_iou, span_cxw_to_xx - -def init_weights(module): - if isinstance(module, (nn.Linear, nn.Embedding)): - module.weight.data.normal_(mean=0.0, std=0.02) - elif isinstance(module, nn.LayerNorm): - module.bias.data.zero_() - module.weight.data.fill_(1.0) - - if isinstance(module, nn.Linear) and module.bias is not None: - module.bias.data.zero_() - -def mask_logits(inputs, mask, mask_value=-1e30): - mask = mask.type(torch.float32) - return inputs + (1.0 - mask) * mask_value - -def sim_matrix(a, b, eps=1e-8): - """ - added eps for numerical stability - """ - a_n, b_n = a.norm(dim=1)[:, None], b.norm(dim=1)[:, None] - a_norm = a / torch.max(a_n, eps * torch.ones_like(a_n)) - b_norm = b / torch.max(b_n, eps * torch.ones_like(b_n)) - sim_mt = torch.mm(a_norm, b_norm.transpose(0, 1)) - return sim_mt - -class WeightedPool(nn.Module): - def __init__(self, dim): - super(WeightedPool, self).__init__() - weight = torch.empty(dim, 1) - nn.init.xavier_uniform_(weight) - self.weight = nn.Parameter(weight, requires_grad=True) - - def forward(self, x, mask): - alpha = torch.tensordot(x, self.weight, dims=1) # shape = (batch_size, seq_length, 1) - alpha = mask_logits(alpha, mask=mask.unsqueeze(2)) - alphas = nn.Softmax(dim=1)(alpha) - pooled_x = torch.matmul(x.transpose(1, 2), alphas) # (batch_size, dim, 1) - pooled_x = pooled_x.squeeze(2) - return pooled_x - -class Model(nn.Module): - """ This is the UniVTG module that performs moment localization. 
""" - - def __init__(self, transformer, position_embed, txt_position_embed, txt_dim, vid_dim, - input_dropout, aux_loss=False, - max_v_l=75, span_loss_type="l1", use_txt_pos=False, n_input_proj=2): - """ Initializes the model. - Parameters: - transformer: torch module of the transformer architecture. See transformer.py - position_embed: torch module of the position_embedding, See position_encoding.py - txt_position_embed: position_embedding for text - txt_dim: int, text query input dimension - vid_dim: int, video feature input dimension - max_v_l: int, maximum #clips in videos - span_loss_type: str, one of [l1, ce] - l1: (center-x, width) regression. - ce: (st_idx, ed_idx) classification. - # foreground_thd: float, intersection over prediction >= foreground_thd: labeled as foreground - # background_thd: float, intersection over prediction <= background_thd: labeled background - """ - super().__init__() - self.transformer = transformer - self.position_embed = position_embed - self.txt_position_embed = txt_position_embed - hidden_dim = transformer.d_model - self.span_loss_type = span_loss_type - self.max_v_l = max_v_l - span_pred_dim = 2 if span_loss_type == "l1" else max_v_l * 2 - - self.token_type_embeddings = nn.Embedding(2, hidden_dim) - self.token_type_embeddings.apply(init_weights) - - # Conv projector - self.span_embed = Conv(hidden_dim, hidden_dim, span_pred_dim, 3, kernel_size=3) - self.class_embed = Conv(hidden_dim, hidden_dim, 1, 3, kernel_size=3) # 0: background, 1: foreground - - self.use_txt_pos = use_txt_pos - self.n_input_proj = n_input_proj - relu_args = [True] * 3 - relu_args[n_input_proj-1] = False - self.input_txt_proj = nn.Sequential(*[ - LinearLayer(txt_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[0]), - LinearLayer(hidden_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[1]), - LinearLayer(hidden_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[2]) - ][:n_input_proj]) - 
self.input_vid_proj = nn.Sequential(*[ - LinearLayer(vid_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[0]), - LinearLayer(hidden_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[1]), - LinearLayer(hidden_dim, hidden_dim, layer_norm=True, dropout=input_dropout, relu=relu_args[2]) - ][:n_input_proj]) - - # MLP Projector - self.weightedpool = WeightedPool(hidden_dim) - - def forward(self, src_txt, src_txt_mask, src_vid, src_vid_mask, src_cls=None, src_cls_mask=None): - bs = src_vid.shape[0] - src_vid = self.input_vid_proj(src_vid) - src_txt = self.input_txt_proj(src_txt) - if src_cls is not None: - src_cls = self.input_txt_proj(src_cls) - - # type token. - src_vid = src_vid + self.token_type_embeddings(torch.full_like(src_vid_mask.long(), 1)) - src_txt = src_txt + self.token_type_embeddings(torch.zeros_like(src_txt_mask.long())) - if src_cls is not None: - src_cls = src_cls + self.token_type_embeddings(torch.zeros_like(src_cls_mask.long())) - - src = torch.cat([src_vid, src_txt], dim=1) # (bsz, L_vid+L_txt, d) - mask = torch.cat([src_vid_mask, src_txt_mask], dim=1).bool() # (bsz, L_vid+L_txt) - - pos_vid = self.position_embed(src_vid, src_vid_mask) # (bsz, L_vid, d) - pos_txt = self.txt_position_embed(src_txt) if self.use_txt_pos else torch.zeros_like(src_txt) # (bsz, L_txt, d) - pos = torch.cat([pos_vid, pos_txt], dim=1) - - memory = self.transformer(src, ~mask, pos) - vid_mem = memory[:, :src_vid.shape[1], :] # (bsz, L_vid, d) - - outputs_class = self.class_embed(vid_mem).sigmoid() # (#layers, batch_size, #queries, #classes) - outputs_coord = self.span_embed(vid_mem) # (#layers, bsz, #queries, 2 or max_v_l * 2) - - if self.span_loss_type == "l1": - outputs_coord = outputs_coord.sigmoid() - idx_mask = torch.tensor((-1, 1)).unsqueeze(0).unsqueeze(0).cuda() - idx_mask = idx_mask.repeat(outputs_coord.shape[0], outputs_coord.shape[1], 1) - outputs_coord = outputs_coord * idx_mask - else: - raise NotImplementedError - - out 
= {'pred_logits': outputs_class, 'pred_spans': outputs_coord, - 'src_vid_mask': src_vid_mask} - - vid_mem_proj = src_vid - - # word-level -> sentence-level - txt_mem_proj = self.weightedpool(src_txt, src_txt_mask).unsqueeze(1) - sim = F.cosine_similarity(vid_mem_proj, txt_mem_proj, dim=-1) + (src_vid_mask + 1e-45).log() - - out["vid_mem_proj"] = vid_mem_proj - out["txt_mem_proj"] = txt_mem_proj - if src_cls is not None: - cls_mem_proj = self.weightedpool(src_cls, src_cls_mask) - out["cls_mem_proj"] = cls_mem_proj - out["saliency_scores"] = sim - return out - -class SetCriterion(nn.Module): - """ This class computes the loss for DETR. - The process happens in two steps: - 1) we compute hungarian assignment between ground truth boxes and the outputs of the model - 2) we supervise each pair of matched ground-truth / prediction (supervise class and box) - """ - - def __init__(self, matcher, weight_dict, eos_coef, losses, temperature, span_loss_type, max_v_l, - saliency_margin=1): - """ Create the criterion. - Parameters: - matcher: module able to compute a matching between targets and proposals - weight_dict: dict containing as key the names of the losses and as values their relative weight. - eos_coef: relative classification weight applied to the no-object category - losses: list of all the losses to be applied. See get_loss for list of available losses. 
- temperature: float, temperature for NCE loss - span_loss_type: str, [l1, ce] - max_v_l: int, - saliency_margin: float - """ - super().__init__() - self.matcher = matcher - self.weight_dict = weight_dict - self.losses = losses - self.temperature = temperature - self.span_loss_type = span_loss_type - self.max_v_l = max_v_l - self.saliency_margin = saliency_margin - self.temperature = 0.07 - - # foreground and background classification - self.foreground_label = 0 - self.background_label = 1 - self.eos_coef = eos_coef - empty_weight = torch.ones(2) - empty_weight[-1] = self.eos_coef # lower weight for background (index 1, foreground index 0) - self.register_buffer('empty_weight', empty_weight) - - def loss_spans(self, outputs, targets, indices): - assert 'pred_spans' in outputs - - start_spans = targets['timestamp'] - pred_spans = outputs['pred_spans'] - src_spans = start_spans + pred_spans - gt_spans = targets['span_labels_nn'] - - mask = targets['timestamp_mask'].bool() - mask_full = targets['timestamp_mask'].unsqueeze(2).repeat(1, 1, 2) - mask_valid = targets['timestamp_window'].bool() - mask_valid_full = targets['timestamp_window'].unsqueeze(2).repeat(1, 1, 2) - - loss_span = F.smooth_l1_loss(src_spans, gt_spans, reduction='none') * mask_valid_full - loss_giou = 1 - torch.diag(generalized_temporal_iou(src_spans[mask_valid], gt_spans[mask_valid])) - - losses = {} - losses['loss_b'] = loss_span.sum() / mask_valid.sum() - losses['loss_g'] = loss_giou.mean() - return losses - - def loss_labels(self, outputs, targets, indices, log=True): - saliency_scores = targets["saliency_scores"] - if saliency_scores.sum() == 0: - return {"loss_f": 0.} - - src_logits = outputs['pred_logits'].squeeze(-1) # (batch_size, #queries, #classes=2) - target_classes = targets["saliency_scores"].squeeze() - - weights = torch.ones_like(target_classes).float() * self.empty_weight[1] - weights[target_classes.bool()] = self.empty_weight[0] - - loss_ce = F.binary_cross_entropy(src_logits, 
target_classes.float(), reduction="none") - return {"loss_f": loss_ce.sum() / target_classes.sum()} - # return {"loss_f": loss_ce.sum() / len(target_classes)} - - # mask = targets['timestamp_mask'].bool() - # mask_valid = targets['timestamp_window'].bool() - # target_classes = torch.full(src_logits.shape[:2], 0, dtype=torch.int64, device=src_logits.device) # (batch_size, #queries) - # target_classes[mask_valid] = 1 - # # target_classes = targets['timestamp_window'] # soft cls. - # target_classes.float() - # # pdb.set_trace() - - # weights = torch.zeros_like(target_classes).float() - # weights[mask] = self.empty_weight[1] - # weights[mask_valid] = self.empty_weight[0] - - # loss_ce = F.binary_cross_entropy(src_logits, target_classes.float(), weight=weights, reduction="none") * mask - # # return {"loss_f": loss_ce.sum() / mask.sum()} - # return {"loss_f": loss_ce.sum() / mask_valid.sum()} - - def loss_saliency(self, outputs, targets, indices, log=True): - """higher scores for positive clips""" - if "saliency_pos_labels" not in targets: - return {"loss_s_inter": 0., "loss_s_intra": 0.} - saliency_scores = targets["saliency_scores"] - if saliency_scores.sum() == 0: - return {"loss_s_inter": 0., "loss_s_intra": 0.} - - # * qfvs mil-nce mode - pos_indices = saliency_scores.squeeze() > 0 - - sim = outputs['saliency_scores'] - sim_soft = F.softmax(sim / self.temperature, dim=0) - sim_log = torch.log(sim_soft[pos_indices]) - loss_saliency_intra = -sim_log.sum() / len(sim_log) - return {"loss_s_inter": 0., "loss_s_intra": loss_saliency_intra} - - # * inter-vid mode - # vid_mem_proj = outputs["vid_mem_proj"] - # pos_indices = targets["saliency_pos_labels"][:,0].long() # (N, #pairs) - # batch_indices = torch.arange(len(vid_mem_proj)).to(vid_mem_proj.device) - - # vid_feats = vid_mem_proj[batch_indices, pos_indices] - # txt_feats = outputs["txt_mem_proj"].squeeze(1) - # sim = sim_matrix(vid_feats, txt_feats) - - # i_logsm = F.log_softmax(sim / self.temperature, dim=1) - # 
j_logsm = F.log_softmax(sim.t() /self.temperature, dim=1) - - # # sum over positives - # idiag = torch.diag(i_logsm) - # jdiag = torch.diag(j_logsm) - # loss_i = idiag.sum() / len(idiag) - # loss_j = jdiag.sum() / len(jdiag) - - # loss_saliency_inter = - loss_i - loss_j - - # # * intra-vid mode - # mask = targets['timestamp_mask'] - # selected_scores = saliency_scores[batch_indices, pos_indices].unsqueeze(-1) - # neg_indices_in = (saliency_scores < selected_scores) - # neg_indices_in[batch_indices, pos_indices] = True - # mask_invalid = neg_indices_in * mask.bool() - - # sim_in = F.cosine_similarity(vid_mem_proj, txt_feats.unsqueeze(1), dim=-1) - # sim_in = sim_in + (mask_invalid + 1e-45).log() - # logsm_in_i = F.log_softmax(sim_in / self.temperature, dim=1) - # logsm_in_j = F.log_softmax(sim_in.t() / self.temperature, dim=1) - - # pos_logsm_in_i = logsm_in_i[batch_indices, pos_indices] - # pos_logsm_in_j = logsm_in_j[pos_indices, batch_indices] - # loss_in_i = pos_logsm_in_i.sum() / len(pos_logsm_in_i) - # loss_in_j = pos_logsm_in_j.sum() / len(pos_logsm_in_j) - - # loss_saliency_intra = - loss_in_i - loss_in_j - - # return {"loss_s_inter": loss_saliency_inter, "loss_s_intra": loss_saliency_intra} - - def loss_saliency_cls(self, outputs, targets, indices, log=True): - """higher scores for positive clips""" - if "saliency_pos_labels" not in targets: - return {"loss_s_inter": 0., "loss_s_intra": 0.} - saliency_scores = targets["saliency_scores"] - if saliency_scores.sum() == 0: - return {"loss_s_inter": 0., "loss_s_intra": 0.} - - # * inter-vid mode - vid_mem_proj = outputs["vid_mem_proj"] - pos_indices = targets["saliency_pos_labels"][:,0].long() # (N, #pairs) - batch_indices = torch.arange(len(vid_mem_proj)).to(vid_mem_proj.device) - - vid_feats = vid_mem_proj[batch_indices, pos_indices] - txt_feats = outputs["txt_mem_proj"].squeeze(1) - sim = sim_matrix(vid_feats, txt_feats) - - i_logsm = F.log_softmax(sim / self.temperature, dim=1) - j_logsm = 
F.log_softmax(sim.t() /self.temperature, dim=1) - - # sum over positives - idiag = torch.diag(i_logsm) - jdiag = torch.diag(j_logsm) - loss_i = idiag.sum() / len(idiag) - loss_j = jdiag.sum() / len(jdiag) - - loss_saliency_inter = - loss_i - loss_j - - # * intra-vid mode - if 'cls_idx' not in targets.keys(): # eval - return {"loss_s_inter": loss_saliency_inter} - - cls_indices = targets['cls_idx'].bool() - cls_feats = outputs["cls_mem_proj"].squeeze(1) - sim_cls = sim_matrix(vid_feats, cls_feats) - - i_logsm_cls = F.log_softmax(sim_cls / self.temperature, dim=1) - idiag_cls = i_logsm_cls[cls_indices] - loss_cls_i = idiag_cls.sum() / len(idiag_cls) - - loss_saliency_intra = - loss_cls_i - - return {"loss_s_inter": loss_saliency_inter, "loss_s_intra": loss_saliency_intra} - - def get_loss(self, loss, outputs, targets, indices, **kwargs): - loss_map = { - "spans": self.loss_spans, - "labels": self.loss_labels, - "saliency": self.loss_saliency, - "saliency_cls": self.loss_saliency_cls, - } - assert loss in loss_map, f'do you really want to compute {loss} loss?' - return loss_map[loss](outputs, targets, indices, **kwargs) - - def forward(self, outputs, targets, mask_GT=None): - """ This performs the loss computation. - Parameters: - outputs: dict of tensors, see the output specification of the model for the format - targets: list of dicts, such that len(targets) == batch_size. 
- The expected keys in each dict depend on the losses applied, see each loss' doc - """ - indices = None - # Compute all the requested losses - losses = {} - outputs['pred_logits'] = outputs['pred_logits'].reshape(1, -1).masked_select(mask_GT[0]) - count = mask_GT.sum() - outputs['saliency_scores'] = outputs['saliency_scores'].reshape(1, -1).masked_select(mask_GT[0]) - # targets['saliency_scores'] = targets['saliency_scores'].masked_select(mask_GT[0]) - targets['saliency_scores'] = targets['saliency_scores'][0,:count] - - for loss in self.losses: - losses.update(self.get_loss(loss, outputs, targets, indices)) - - return losses - -class MLP(nn.Module): - """ Very simple multi-layer perceptron (also called FFN)""" - - def __init__(self, input_dim, hidden_dim, output_dim, num_layers): - super().__init__() - self.num_layers = num_layers - h = [hidden_dim] * (num_layers - 1) - self.layers = nn.ModuleList(nn.Linear(n, k) for n, k in zip([input_dim] + h, h + [output_dim])) - - def forward(self, x): - for i, layer in enumerate(self.layers): - x = F.relu(layer(x)) if i < self.num_layers - 1 else layer(x) - return x - -class Conv(nn.Module): - """ Simple stack of 1D convolutions with ReLU, the convolutional analogue of the MLP above""" - - def __init__(self, input_dim, hidden_dim, output_dim, num_layers, kernel_size): - super().__init__() - self.num_layers = num_layers - h = [hidden_dim] * (num_layers - 1) - # self.layers = nn.ModuleList(nn.Linear(n, k) for n, k in zip([input_dim] + h, h + [output_dim])) - self.layers = nn.ModuleList( - nn.Conv1d(n, k, kernel_size=kernel_size, stride=1, padding=kernel_size//2, dilation=1, groups=1, bias=True, padding_mode='zeros') - for n, k in zip([input_dim] + h, h + [output_dim])) - def forward(self, x): - x = x.permute(0,2,1) - for i, layer in enumerate(self.layers): - x = F.relu(layer(x)) if i < self.num_layers - 1 else layer(x) - return x.permute(0, 2, 1) - -class LinearLayer(nn.Module): - """linear layer configurable with layer normalization, dropout, ReLU.""" - - 
def __init__(self, in_hsz, out_hsz, layer_norm=True, dropout=0.1, relu=True): - super(LinearLayer, self).__init__() - self.relu = relu - self.layer_norm = layer_norm - if layer_norm: - self.LayerNorm = nn.LayerNorm(in_hsz) - layers = [ - nn.Dropout(dropout), - nn.Linear(in_hsz, out_hsz) - ] - self.net = nn.Sequential(*layers) - - def forward(self, x): - """(N, L, D)""" - if self.layer_norm: - x = self.LayerNorm(x) - x = self.net(x) - if self.relu: - x = F.relu(x, inplace=True) - return x # (N, L, D) - - -def build_model(args): - device = torch.device(args.device) - - transformer = build_transformer(args) - position_embedding, txt_position_embedding = build_position_encoding(args) - - model = Model( - transformer, - position_embedding, - txt_position_embedding, - txt_dim=args.t_feat_dim, - vid_dim=args.v_feat_dim, - input_dropout=args.input_dropout, - span_loss_type=args.span_loss_type, - use_txt_pos=args.use_txt_pos, - n_input_proj=args.n_input_proj, - ) - - matcher = build_matcher(args) - weight_dict = {"loss_b": args.b_loss_coef, - "loss_g": args.g_loss_coef, - "loss_f": args.f_loss_coef, - "loss_s_intra": args.s_loss_intra_coef, - "loss_s_inter": args.s_loss_inter_coef} - - if args.dset_type in ['mr', 'vlp']: - if 'tal' not in args.train_path: - losses = ['spans', 'labels', 'saliency'] - else: - losses = ['spans', 'labels', 'saliency_cls'] - elif args.dset_type in ['hl', 'vs']: - losses = ['labels', 'saliency'] - - criterion = SetCriterion( - matcher=matcher, - weight_dict=weight_dict, losses=losses, - eos_coef=args.eos_coef, temperature=args.temperature, - span_loss_type=args.span_loss_type, max_v_l=args.max_v_l, - saliency_margin=args.saliency_margin, - ) - criterion.to(device) - return model, criterion diff --git a/spaces/KevinQHLin/UniVTG/run_on_video/data_utils.py b/spaces/KevinQHLin/UniVTG/run_on_video/data_utils.py deleted file mode 100644 index 44c073f4689b20bd8f7adb2aac77c0b7523ed8dd..0000000000000000000000000000000000000000 --- 
a/spaces/KevinQHLin/UniVTG/run_on_video/data_utils.py +++ /dev/null @@ -1,170 +0,0 @@ -import torch -import os -import numpy as np -import ffmpeg -import math -from run_on_video import clip - - -class ClipFeatureExtractor: - def __init__(self, framerate=1/2, size=224, centercrop=True, model_name_or_path="ViT-B/32", device="cuda"): - self.video_loader = VideoLoader(framerate=framerate, size=size, centercrop=centercrop) - print("Loading CLIP models") - self.clip_extractor, _ = clip.load(model_name_or_path, device=device, jit=False) - self.tokenizer = clip.tokenize - self.video_preprocessor = Preprocessing() - self.device = device - - @torch.no_grad() - def encode_video(self, video_path: str, bsz=60): - video_frames = self.video_loader.read_video_from_file(video_path) # (T, H, W, 3) - video_frames = self.video_preprocessor(video_frames) - n_frames = len(video_frames) - n_batch = int(math.ceil(n_frames / bsz)) - video_features = [] - for i in range(n_batch): - st_idx = i * bsz - ed_idx = (i+1) * bsz - _video_frames = video_frames[st_idx:ed_idx].to(self.device) - _video_features = self.clip_extractor.encode_image(_video_frames) - video_features.append(_video_features) - video_features = torch.cat(video_features, dim=0) - return video_features # (T=#frames, d) torch tensor - - @torch.no_grad() - def encode_text(self, text_list, bsz=60): - n_text = len(text_list) - n_batch = int(math.ceil(n_text / bsz)) - text_features = [] - for i in range(n_batch): - st_idx = i * bsz - ed_idx = (i+1) * bsz - encoded_texts = self.tokenizer(text_list[st_idx:ed_idx], context_length=77).to(self.device) - output = self.clip_extractor.encode_text(encoded_texts) - valid_lengths = (encoded_texts != 0).sum(1).tolist() - batch_last_hidden_states = output["last_hidden_state"] - for j, valid_len in enumerate(valid_lengths): - text_features.append(batch_last_hidden_states[j, :valid_len]) - return text_features # List([L_j, d]) torch tensor - - -def convert_to_float(frac_str): - try: - return 
float(frac_str) - except ValueError: - try: - num, denom = frac_str.split('/') - except ValueError: - return None - try: - leading, num = num.split(' ') - except ValueError: - return float(num) / float(denom) - if float(leading) < 0: - sign_mult = -1 - else: - sign_mult = 1 - return float(leading) + sign_mult * (float(num) / float(denom)) - - -class Normalize(object): - - def __init__(self, mean, std): - self.mean = torch.FloatTensor(mean).view(1, 3, 1, 1) - self.std = torch.FloatTensor(std).view(1, 3, 1, 1) - - def __call__(self, tensor): - tensor = (tensor - self.mean) / (self.std + 1e-8) - return tensor - - -class Preprocessing(object): - - def __init__(self): - self.norm = Normalize( - mean=[0.48145466, 0.4578275, 0.40821073], - std=[0.26862954, 0.26130258, 0.27577711]) - - def __call__(self, tensor): - tensor = tensor / 255.0 - tensor = self.norm(tensor) - return tensor - - -class VideoLoader: - """Pytorch video loader. - Copied and modified from: - https://github.com/linjieli222/HERO_Video_Feature_Extractor/blob/main/clip/video_loader.py - """ - def __init__( - self, - framerate=1/2, - size=224, - centercrop=True, - ): - self.centercrop = centercrop - self.size = size - self.framerate = framerate - - def _get_video_info(self, video_path): - probe = ffmpeg.probe(video_path) - video_stream = next((stream for stream in probe['streams'] - if stream['codec_type'] == 'video'), None) - width = int(video_stream['width']) - height = int(video_stream['height']) - fps = math.floor(convert_to_float(video_stream['avg_frame_rate'])) - try: - frames_length = int(video_stream['nb_frames']) - duration = float(video_stream['duration']) - except Exception: - frames_length, duration = -1, -1 - info = {"duration": duration, "frames_length": frames_length, - "fps": fps, "height": height, "width": width} - return info - - def _get_output_dim(self, h, w): - if isinstance(self.size, tuple) and len(self.size) == 2: - return self.size - elif h >= w: - return int(h * self.size / w), 
self.size - else: - return self.size, int(w * self.size / h) - - def read_video_from_file(self, video_path): - try: - info = self._get_video_info(video_path) - h, w = info["height"], info["width"] - except Exception: - print('ffprobe failed at: {}'.format(video_path)) - return {'video': torch.zeros(1), 'input': video_path, - 'info': {}} - height, width = self._get_output_dim(h, w) - try: - duration = info["duration"] - fps = self.framerate - if duration > 0 and duration < 1/fps+0.1: - fps = 2/max(int(duration), 1) - print(duration, fps) - except Exception: - fps = self.framerate - cmd = ( - ffmpeg - .input(video_path) - .filter('fps', fps=fps) - .filter('scale', width, height) - ) - if self.centercrop: - x = int((width - self.size) / 2.0) - y = int((height - self.size) / 2.0) - cmd = cmd.crop(x, y, self.size, self.size) - out, _ = ( - cmd.output('pipe:', format='rawvideo', pix_fmt='rgb24') - .run(capture_stdout=True, quiet=True) - ) - if self.centercrop and isinstance(self.size, int): - height, width = self.size, self.size - video = np.frombuffer(out, np.uint8).reshape( - [-1, height, width, 3]) - video = torch.from_numpy(video.astype('float32')) - video = video.permute(0, 3, 1, 2) - return video diff --git a/spaces/LeoLM/leo-hessianai-7b-chat/README.md b/spaces/LeoLM/leo-hessianai-7b-chat/README.md deleted file mode 100644 index bee564c2342318e83c1bb25080eebb73938908a5..0000000000000000000000000000000000000000 --- a/spaces/LeoLM/leo-hessianai-7b-chat/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: LeoLM 7b Chat -emoji: 🦁 -colorFrom: yellow -colorTo: yellow -sdk: gradio -sdk_version: 3.44.4 -app_file: app.py -pinned: false -license: llama2 ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/Loren/Streamlit_OCR_comparator/configs/_base_/det_models/textsnake_r50_fpn_unet.py b/spaces/Loren/Streamlit_OCR_comparator/configs/_base_/det_models/textsnake_r50_fpn_unet.py deleted file mode 100644 
index 7d74f376b8c635451a3036e780ffc88e7640bf2c..0000000000000000000000000000000000000000 --- a/spaces/Loren/Streamlit_OCR_comparator/configs/_base_/det_models/textsnake_r50_fpn_unet.py +++ /dev/null @@ -1,22 +0,0 @@ -model = dict( - type='TextSnake', - backbone=dict( - type='mmdet.ResNet', - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=-1, - norm_cfg=dict(type='BN', requires_grad=True), - init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50'), - norm_eval=True, - style='caffe'), - neck=dict( - type='FPN_UNet', in_channels=[256, 512, 1024, 2048], out_channels=32), - bbox_head=dict( - type='TextSnakeHead', - in_channels=32, - loss=dict(type='TextSnakeLoss'), - postprocessor=dict( - type='TextSnakePostprocessor', text_repr_type='poly')), - train_cfg=None, - test_cfg=None) diff --git a/spaces/LuxOAI/ChatGpt-Web/app/locales/es.ts b/spaces/LuxOAI/ChatGpt-Web/app/locales/es.ts deleted file mode 100644 index 4204506e4404c1d17439467cc8811f7f42d3f1b6..0000000000000000000000000000000000000000 --- a/spaces/LuxOAI/ChatGpt-Web/app/locales/es.ts +++ /dev/null @@ -1,244 +0,0 @@ -import { SubmitKey } from "../store/config"; -import type { LocaleType } from "./index"; - -const es: LocaleType = { - WIP: "En construcción...", - Error: { - Unauthorized: - "Acceso no autorizado, por favor ingrese el código de acceso en la página de configuración.", - }, - ChatItem: { - ChatItemCount: (count: number) => `${count} mensajes`, - }, - Chat: { - SubTitle: (count: number) => `${count} mensajes con ChatGPT`, - Actions: { - ChatList: "Ir a la lista de chats", - CompressedHistory: "Historial de memoria comprimido", - Export: "Exportar todos los mensajes como Markdown", - Copy: "Copiar", - Stop: "Detener", - Retry: "Reintentar", - Delete: "Delete", - }, - Rename: "Renombrar chat", - Typing: "Escribiendo...", - Input: (submitKey: string) => { - var inputHints = `Escribe algo y presiona ${submitKey} para enviar`; - if (submitKey === String(SubmitKey.Enter)) { 
- inputHints += ", presiona Shift + Enter para nueva línea"; - } - return inputHints; - }, - Send: "Enviar", - Config: { - Reset: "Reset to Default", - SaveAs: "Save as Mask", - }, - }, - Export: { - Title: "Todos los mensajes", - Copy: "Copiar todo", - Download: "Descargar", - MessageFromYou: "Mensaje de ti", - MessageFromChatGPT: "Mensaje de ChatGPT", - }, - Memory: { - Title: "Historial de memoria", - EmptyContent: "Aún no hay nada.", - Copy: "Copiar todo", - Send: "Send Memory", - Reset: "Reset Session", - ResetConfirm: - "Resetting will clear the current conversation history and historical memory. Are you sure you want to reset?", - }, - Home: { - NewChat: "Nuevo chat", - DeleteChat: "¿Confirmar eliminación de la conversación seleccionada?", - DeleteToast: "Chat Deleted", - Revert: "Revert", - }, - Settings: { - Title: "Configuración", - SubTitle: "Todas las configuraciones", - Actions: { - ClearAll: "Borrar todos los datos", - ResetAll: "Restablecer todas las configuraciones", - Close: "Cerrar", - ConfirmResetAll: "Are you sure you want to reset all configurations?", - ConfirmClearAll: "Are you sure you want to reset all chat?", - }, - Lang: { - Name: "Language", - All: "All Languages", - Options: { - cn: "简体中文", - en: "Inglés", - tw: "繁體中文", - es: "Español", - it: "Italiano", - tr: "Türkçe", - jp: "日本語", - de: "Deutsch", - }, - }, - Avatar: "Avatar", - FontSize: { - Title: "Tamaño de fuente", - SubTitle: "Ajustar el tamaño de fuente del contenido del chat", - }, - Update: { - Version: (x: string) => `Versión: ${x}`, - IsLatest: "Última versión", - CheckUpdate: "Buscar actualizaciones", - IsChecking: "Buscando actualizaciones...", - FoundUpdate: (x: string) => `Se encontró una nueva versión: ${x}`, - GoToUpdate: "Actualizar", - }, - SendKey: "Tecla de envío", - Theme: "Tema", - TightBorder: "Borde ajustado", - SendPreviewBubble: { - Title: "Enviar burbuja de vista previa", - SubTitle: "Preview markdown in bubble", - }, - Mask: { - Title: "Mask Splash Screen", 
- SubTitle: "Show a mask splash screen before starting new chat", - }, - Prompt: { - Disable: { - Title: "Desactivar autocompletado", - SubTitle: "Escribe / para activar el autocompletado", - }, - List: "Lista de autocompletado", - ListCount: (builtin: number, custom: number) => - `${builtin} incorporado, ${custom} definido por el usuario`, - Edit: "Editar", - Modal: { - Title: "Prompt List", - Add: "Add One", - Search: "Search Prompts", - }, - EditModal: { - Title: "Edit Prompt", - }, - }, - HistoryCount: { - Title: "Cantidad de mensajes adjuntos", - SubTitle: "Número de mensajes enviados adjuntos por solicitud", - }, - CompressThreshold: { - Title: "Umbral de compresión de historial", - SubTitle: - "Se comprimirán los mensajes si la longitud de los mensajes no comprimidos supera el valor", - }, - Token: { - Title: "Clave de API", - SubTitle: "Utiliza tu clave para ignorar el límite de código de acceso", - Placeholder: "Clave de la API de OpenAI", - }, - Usage: { - Title: "Saldo de la cuenta", - SubTitle(used: any, total: any) { - return `Usado $${used}, subscription $${total}`; - }, - IsChecking: "Comprobando...", - Check: "Comprobar de nuevo", - NoAccess: "Introduzca la clave API para comprobar el saldo", - }, - AccessCode: { - Title: "Código de acceso", - SubTitle: "Control de acceso habilitado", - Placeholder: "Necesita código de acceso", - }, - Bot: "Proveedores de IA (bot)", - Model: "Modelo", - Temperature: { - Title: "Temperatura", - SubTitle: "Un valor mayor genera una salida más aleatoria", - }, - MaxTokens: { - Title: "Máximo de tokens", - SubTitle: "Longitud máxima de tokens de entrada y tokens generados", - }, - PresencePenlty: { - Title: "Penalización de presencia", - SubTitle: - "Un valor mayor aumenta la probabilidad de hablar sobre nuevos temas", - }, - }, - Store: { - DefaultTopic: "Nueva conversación", - BotHello: "¡Hola! 
¿Cómo puedo ayudarte hoy?", - Error: "Algo salió mal, por favor intenta nuevamente más tarde.", - Prompt: { - History: (content: string) => - "Este es un resumen del historial del chat entre la IA y el usuario como recapitulación: " + - content, - Topic: - "Por favor, genera un título de cuatro a cinco palabras que resuma nuestra conversación sin ningún inicio, puntuación, comillas, puntos, símbolos o texto adicional. Elimina las comillas que lo envuelven.", - Summarize: - "Resuma nuestra discusión brevemente en 200 caracteres o menos para usarlo como un recordatorio para futuros contextos.", - }, - }, - Copy: { - Success: "Copiado al portapapeles", - Failed: - "La copia falló, por favor concede permiso para acceder al portapapeles", - }, - Context: { - Toast: (x: any) => `With ${x} contextual prompts`, - Edit: "Contextual and Memory Prompts", - Add: "Add One", - }, - Plugin: { - Name: "Plugin", - }, - Mask: { - Name: "Mask", - Page: { - Title: "Prompt Template", - SubTitle: (count: number) => `${count} prompt templates`, - Search: "Search Templates", - Create: "Create", - }, - Item: { - Info: (count: number) => `${count} prompts`, - Chat: "Chat", - View: "View", - Edit: "Edit", - Delete: "Delete", - DeleteConfirm: "Confirm to delete?", - }, - EditModal: { - Title: (readonly: boolean) => - `Edit Prompt Template ${readonly ? 
"(readonly)" : ""}`, - Download: "Download", - Clone: "Clone", - }, - Config: { - Avatar: "Bot Avatar", - Name: "Bot Name", - }, - }, - NewChat: { - Return: "Return", - Skip: "Skip", - Title: "Pick a Mask", - SubTitle: "Chat with the Soul behind the Mask", - More: "Find More", - NotShow: "Not Show Again", - ConfirmNoShow: "Confirm to disable?You can enable it in settings later.", - }, - - UI: { - Confirm: "Confirm", - Cancel: "Cancel", - Close: "Close", - Create: "Create", - Edit: "Edit", - }, -}; - -export default es; diff --git a/spaces/LuxOAI/guanaco-playground-tgi/share_btn.py b/spaces/LuxOAI/guanaco-playground-tgi/share_btn.py deleted file mode 100644 index 8ff61abe298d71349f565b5d47228986b42d1f96..0000000000000000000000000000000000000000 --- a/spaces/LuxOAI/guanaco-playground-tgi/share_btn.py +++ /dev/null @@ -1,98 +0,0 @@ -community_icon_html = """""" - -loading_icon_html = """""" - -share_js = """async () => { - async function uploadFile(file){ - const UPLOAD_URL = 'https://huggingface.co/uploads'; - const response = await fetch(UPLOAD_URL, { - method: 'POST', - headers: { - 'Content-Type': file.type, - 'X-Requested-With': 'XMLHttpRequest', - }, - body: file, /// <- File inherits from Blob - }); - const url = await response.text(); - return url; - } - async function getInputImgFile(imgEl){ - const res = await fetch(imgEl.src); - const blob = await res.blob(); - const imgId = Date.now() % 200; - const isPng = imgEl.src.startsWith(`data:image/png`); - if(isPng){ - const fileName = `sd-perception-${{imgId}}.png`; - return new File([blob], fileName, { type: 'image/png' }); - }else{ - const fileName = `sd-perception-${{imgId}}.jpg`; - return new File([blob], fileName, { type: 'image/jpeg' }); - } - } - // const gradioEl = document.querySelector('body > gradio-app'); - const gradioEl = document.querySelector("gradio-app"); - const inputTxt = gradioEl.querySelector('#q-input textarea').value; - const outputTxt = gradioEl.querySelector('#q-output').outerHTML; - 
const titleLength = 150; - let titleTxt = inputTxt; - if(titleTxt.length > titleLength){ - titleTxt = titleTxt.slice(0, titleLength) + ' ...'; - } - const shareBtnEl = gradioEl.querySelector('#share-btn'); - const shareIconEl = gradioEl.querySelector('#share-btn-share-icon'); - const loadingIconEl = gradioEl.querySelector('#share-btn-loading-icon'); - if(!inputTxt || !outputTxt){ - return; - }; - shareBtnEl.style.pointerEvents = 'none'; - shareIconEl.style.display = 'none'; - loadingIconEl.style.removeProperty('display'); - const descriptionMd = `### Question: -${inputTxt} -### Answer: -${outputTxt}`; - const params = { - title: titleTxt, - description: descriptionMd, - }; - const paramsStr = Object.entries(params) - .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(value)}`) - .join('&'); - window.open(`https://huggingface.co/spaces/HuggingFaceH4/star-chat-demo/discussions/new?${paramsStr}`, '_blank'); - shareBtnEl.style.removeProperty('pointer-events'); - shareIconEl.style.removeProperty('display'); - loadingIconEl.style.display = 'none'; -}""" - -share_btn_css = """ -a {text-decoration-line: underline; font-weight: 600;} -.animate-spin { - animation: spin 1s linear infinite; -} -@keyframes spin { - from { transform: rotate(0deg); } - to { transform: rotate(360deg); } -} -#share-btn-container { - display: flex; padding-left: 0.5rem !important; padding-right: 0.5rem !important; background-color: #000000; justify-content: center; align-items: center; border-radius: 9999px !important; width: 13rem; -} -#share-btn { - all: initial; color: #ffffff;font-weight: 600; cursor:pointer; font-family: 'IBM Plex Sans', sans-serif; margin-left: 0.5rem !important; padding-top: 0.25rem !important; padding-bottom: 0.25rem !important; -} -#share-btn * { - all: unset; -} -#share-btn-container div:nth-child(-n+2){ - width: auto !important; - min-height: 0px !important; -} -#share-btn-container .wrap { - display: none !important; -} -""" \ No newline at end of 
file diff --git a/spaces/MCkernick/Image_Restoration_Colorization/Global/util/util.py b/spaces/MCkernick/Image_Restoration_Colorization/Global/util/util.py deleted file mode 100644 index b1369c3b568548a1c21d3412aef5fd35c9b0c5be..0000000000000000000000000000000000000000 --- a/spaces/MCkernick/Image_Restoration_Colorization/Global/util/util.py +++ /dev/null @@ -1,58 +0,0 @@ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. - -from __future__ import print_function -import torch -import numpy as np -from PIL import Image -import numpy as np -import os -import torch.nn as nn - -# Converts a Tensor into a Numpy array -# |imtype|: the desired type of the converted numpy array -def tensor2im(image_tensor, imtype=np.uint8, normalize=True): - if isinstance(image_tensor, list): - image_numpy = [] - for i in range(len(image_tensor)): - image_numpy.append(tensor2im(image_tensor[i], imtype, normalize)) - return image_numpy - image_numpy = image_tensor.cpu().float().numpy() - if normalize: - image_numpy = (np.transpose(image_numpy, (1, 2, 0)) + 1) / 2.0 * 255.0 - else: - image_numpy = np.transpose(image_numpy, (1, 2, 0)) * 255.0 - image_numpy = np.clip(image_numpy, 0, 255) - if image_numpy.shape[2] == 1 or image_numpy.shape[2] > 3: - image_numpy = image_numpy[:, :, 0] - return image_numpy.astype(imtype) - - -# Converts a one-hot tensor into a colorful label map -def tensor2label(label_tensor, n_label, imtype=np.uint8): - if n_label == 0: - return tensor2im(label_tensor, imtype) - label_tensor = label_tensor.cpu().float() - if label_tensor.size()[0] > 1: - label_tensor = label_tensor.max(0, keepdim=True)[1] - label_tensor = Colorize(n_label)(label_tensor) - label_numpy = np.transpose(label_tensor.numpy(), (1, 2, 0)) - return label_numpy.astype(imtype) - - -def save_image(image_numpy, image_path): - image_pil = Image.fromarray(image_numpy) - image_pil.save(image_path) - - -def mkdirs(paths): - if isinstance(paths, list) and not isinstance(paths, str): - for 
path in paths: - mkdir(path) - else: - mkdir(paths) - - -def mkdir(path): - if not os.path.exists(path): - os.makedirs(path) diff --git a/spaces/Mahiruoshi/BangDream-Bert-VITS2/text/__init__.py b/spaces/Mahiruoshi/BangDream-Bert-VITS2/text/__init__.py deleted file mode 100644 index a45b650424306b6e077d7013e93e2c9bd1e073c2..0000000000000000000000000000000000000000 --- a/spaces/Mahiruoshi/BangDream-Bert-VITS2/text/__init__.py +++ /dev/null @@ -1,28 +0,0 @@ -from text.symbols import * - -_symbol_to_id = {s: i for i, s in enumerate(symbols)} - - -def cleaned_text_to_sequence(cleaned_text, tones, language): - """Converts a string of text to a sequence of IDs corresponding to the symbols in the text. - Args: - text: string to convert to a sequence - Returns: - List of integers corresponding to the symbols in the text - """ - phones = [_symbol_to_id[symbol] for symbol in cleaned_text] - tone_start = language_tone_start_map[language] - tones = [i + tone_start for i in tones] - lang_id = language_id_map[language] - lang_ids = [lang_id for i in phones] - return phones, tones, lang_ids - - -def get_bert(norm_text, word2ph, language, device): - from .chinese_bert import get_bert_feature as zh_bert - from .english_bert_mock import get_bert_feature as en_bert - from .japanese_bert import get_bert_feature as jp_bert - - lang_bert_func_map = {"ZH": zh_bert, "EN": en_bert, "JP": jp_bert} - bert = lang_bert_func_map[language](norm_text, word2ph, device) - return bert diff --git a/spaces/Manjushri/SDXL-1.0-CPU/app.py b/spaces/Manjushri/SDXL-1.0-CPU/app.py deleted file mode 100644 index be03cbdaa52ac6070003439cf6dbd031e4fd0f89..0000000000000000000000000000000000000000 --- a/spaces/Manjushri/SDXL-1.0-CPU/app.py +++ /dev/null @@ -1,47 +0,0 @@ -import gradio as gr -import torch -import modin.pandas as pd -from diffusers import DiffusionPipeline - -device = "cuda" if torch.cuda.is_available() else "cpu" -if torch.cuda.is_available(): - PYTORCH_CUDA_ALLOC_CONF={'max_split_size_mb': 6000} - 
torch.cuda.max_memory_allocated(device=device) - torch.cuda.empty_cache() - pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, variant="fp16", use_safetensors=True) - pipe.enable_xformers_memory_efficient_attention() - pipe = pipe.to(device) - pipe.unet = torch.compile(pipe.unet, mode="reduce-overhead", fullgraph=True) - torch.cuda.empty_cache() - refiner = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-refiner-1.0", use_safetensors=True, torch_dtype=torch.float16, variant="fp16") - refiner.enable_xformers_memory_efficient_attention() - refiner.enable_sequential_cpu_offload() - refiner.unet = torch.compile(refiner.unet, mode="reduce-overhead", fullgraph=True) -else: - pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", use_safetensors=True) - pipe = pipe.to(device) - pipe.unet = torch.compile(pipe.unet, mode="reduce-overhead", fullgraph=True) - refiner = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-refiner-1.0", use_safetensors=True) - refiner = refiner.to(device) - refiner.unet = torch.compile(refiner.unet, mode="reduce-overhead", fullgraph=True) - -def genie(prompt, negative_prompt, height, width, scale, steps, seed, prompt_2, negative_prompt_2, high_noise_frac): - generator = None if seed == 0 else torch.manual_seed(seed)  # numpy is never imported; None yields a random seed when seed == 0 - int_image = pipe(prompt, prompt_2=prompt_2, negative_prompt_2=negative_prompt_2, negative_prompt=negative_prompt, height=height, width=width, num_inference_steps=steps, guidance_scale=scale, num_images_per_prompt=1, generator=generator, output_type="latent").images - image = refiner(prompt=prompt, prompt_2=prompt_2, negative_prompt=negative_prompt, negative_prompt_2=negative_prompt_2, image=int_image, denoising_start=high_noise_frac).images[0] - return image - -gr.Interface(fn=genie, inputs=[gr.Textbox(label='What you want the AI to generate. 
77 Token Limit.'), - gr.Textbox(label='What you Do Not want the AI to generate.'), - gr.Slider(512, 1024, 768, step=128, label='Height'), - gr.Slider(512, 1024, 768, step=128, label='Width'), - gr.Slider(1, 15, 10, label='Guidance Scale'), - gr.Slider(25, maximum=50, value=25, step=1, label='Number of Iterations'), - gr.Slider(minimum=1, step=1, maximum=999999999999999999, randomize=True), - gr.Textbox(label='Embedded Prompt'), - gr.Textbox(label='Embedded Negative Prompt'), - gr.Slider(minimum=.7, maximum=.99, value=.95, step=.01, label='Refiner Denoise Start %')], - outputs='image', - title="Stable Diffusion XL 1.0 CPU or GPU", - description="SDXL 1.0 CPU or GPU. Currently running on CPU.

WARNING: Extremely Slow. 65s/Iteration. Expect 25-50mins an image for 25-50 iterations respectively. This model is capable of producing NSFW (Softcore) images.", - article = "If You Enjoyed this Demo and would like to Donate, you can send to any of these Wallets.
BTC: bc1qzdm9j73mj8ucwwtsjx4x4ylyfvr6kp7svzjn84
3LWRoKYx6bCLnUrKEdnPo3FCSPQUSFDjFP
DOGE: DK6LRc4gfefdCTRk9xPD239N31jh9GjKez
SHIB (BEP20): 0xbE8f2f3B71DFEB84E5F7E3aae1909d60658aB891
PayPal: https://www.paypal.me/ManjushriBodhisattva
ETH: 0xbE8f2f3B71DFEB84E5F7E3aae1909d60658aB891
Code Monkey: Manjushri").launch(debug=True, max_threads=80) \ No newline at end of file diff --git a/spaces/Marne/MockingBird/mockingbirdforuse/encoder/audio.py b/spaces/Marne/MockingBird/mockingbirdforuse/encoder/audio.py deleted file mode 100644 index 90fd8b9230a22f4d1c9fc46e8c0e6d071b390a1f..0000000000000000000000000000000000000000 --- a/spaces/Marne/MockingBird/mockingbirdforuse/encoder/audio.py +++ /dev/null @@ -1,121 +0,0 @@ -import struct -import librosa -import webrtcvad -import numpy as np -from pathlib import Path -from typing import Optional, Union -from scipy.ndimage.morphology import binary_dilation - -from .hparams import hparams as hp - - -def preprocess_wav( - fpath_or_wav: Union[str, Path, np.ndarray], - source_sr: Optional[int] = None, - normalize: Optional[bool] = True, - trim_silence: Optional[bool] = True, -): - """ - Applies the preprocessing operations used in training the Speaker Encoder to a waveform - either on disk or in memory. The waveform will be resampled to match the data hyperparameters. - - :param fpath_or_wav: either a filepath to an audio file (many extensions are supported, not - just .wav), either the waveform as a numpy array of floats. - :param source_sr: if passing an audio waveform, the sampling rate of the waveform before - preprocessing. After preprocessing, the waveform's sampling rate will match the data - hyperparameters. If passing a filepath, the sampling rate will be automatically detected and - this argument will be ignored. 
- """ - # Load the wav from disk if needed - if isinstance(fpath_or_wav, str) or isinstance(fpath_or_wav, Path): - wav, source_sr = librosa.load(str(fpath_or_wav)) - else: - wav = fpath_or_wav - - # Resample the wav if needed - if source_sr is not None and source_sr != hp.sampling_rate: - wav = librosa.resample(wav, orig_sr=source_sr, target_sr=hp.sampling_rate) - - # Apply the preprocessing: normalize volume and shorten long silences - if normalize: - wav = normalize_volume(wav, hp.audio_norm_target_dBFS, increase_only=True) - if webrtcvad and trim_silence: - wav = trim_long_silences(wav) - - return wav - - -def wav_to_mel_spectrogram(wav): - """ - Derives a mel spectrogram ready to be used by the encoder from a preprocessed audio waveform. - Note: this not a log-mel spectrogram. - """ - frames = librosa.feature.melspectrogram( - y=wav, - sr=hp.sampling_rate, - n_fft=int(hp.sampling_rate * hp.mel_window_length / 1000), - hop_length=int(hp.sampling_rate * hp.mel_window_step / 1000), - n_mels=hp.mel_n_channels, - ) - return frames.astype(np.float32).T - - -def trim_long_silences(wav): - """ - Ensures that segments without voice in the waveform remain no longer than a - threshold determined by the VAD parameters in params.py. 
- - :param wav: the raw waveform as a numpy array of floats - :return: the same waveform with silences trimmed away (length <= original wav length) - """ - # Compute the voice detection window size - samples_per_window = (hp.vad_window_length * hp.sampling_rate) // 1000 - - # Trim the end of the audio to have a multiple of the window size - wav = wav[: len(wav) - (len(wav) % samples_per_window)] - - # Convert the float waveform to 16-bit mono PCM - int16_max = (2**15) - 1 - pcm_wave = struct.pack( - "%dh" % len(wav), *(np.round(wav * int16_max)).astype(np.int16) - ) - - # Perform voice activation detection - voice_flags = [] - vad = webrtcvad.Vad(mode=3) - for window_start in range(0, len(wav), samples_per_window): - window_end = window_start + samples_per_window - voice_flags.append( - vad.is_speech( - pcm_wave[window_start * 2 : window_end * 2], - sample_rate=hp.sampling_rate, - ) - ) - voice_flags = np.array(voice_flags) - - # Smooth the voice detection with a moving average - def moving_average(array, width): - array_padded = np.concatenate( - (np.zeros((width - 1) // 2), array, np.zeros(width // 2)) - ) - ret = np.cumsum(array_padded, dtype=float) - ret[width:] = ret[width:] - ret[:-width] - return ret[width - 1 :] / width - - audio_mask = moving_average(voice_flags, hp.vad_moving_average_width) - audio_mask = np.round(audio_mask).astype(bool) - - # Dilate the voiced regions - audio_mask = binary_dilation(audio_mask, np.ones(hp.vad_max_silence_length + 1)) - audio_mask = np.repeat(audio_mask, samples_per_window) - - return wav[audio_mask] - - -def normalize_volume(wav, target_dBFS, increase_only=False, decrease_only=False): - if increase_only and decrease_only: - raise ValueError("Both increase only and decrease only are set") - dBFS_change = target_dBFS - 10 * np.log10(np.mean(wav**2)) - if (dBFS_change < 0 and increase_only) or (dBFS_change > 0 and decrease_only): - return wav - return wav * (10 ** (dBFS_change / 20)) diff --git 
a/spaces/Marshalls/testmtd/analysis/pymo/mocapplayer/libs/jquery.min.js b/spaces/Marshalls/testmtd/analysis/pymo/mocapplayer/libs/jquery.min.js deleted file mode 100644 index b8c4187de18dd413ad3029839ce0773549e92a14..0000000000000000000000000000000000000000 --- a/spaces/Marshalls/testmtd/analysis/pymo/mocapplayer/libs/jquery.min.js +++ /dev/null @@ -1,4 +0,0 @@ -/*! jQuery v2.2.3 | (c) jQuery Foundation | jquery.org/license */ -!function(a,b){"object"==typeof module&&"object"==typeof module.exports?module.exports=a.document?b(a,!0):function(a){if(!a.document)throw new Error("jQuery requires a window with a document");return b(a)}:b(a)}("undefined"!=typeof window?window:this,function(a,b){var c=[],d=a.document,e=c.slice,f=c.concat,g=c.push,h=c.indexOf,i={},j=i.toString,k=i.hasOwnProperty,l={},m="2.2.3",n=function(a,b){return new n.fn.init(a,b)},o=/^[\s\uFEFF\xA0]+|[\s\uFEFF\xA0]+$/g,p=/^-ms-/,q=/-([\da-z])/gi,r=function(a,b){return b.toUpperCase()};n.fn=n.prototype={jquery:m,constructor:n,selector:"",length:0,toArray:function(){return e.call(this)},get:function(a){return null!=a?0>a?this[a+this.length]:this[a]:e.call(this)},pushStack:function(a){var b=n.merge(this.constructor(),a);return b.prevObject=this,b.context=this.context,b},each:function(a){return n.each(this,a)},map:function(a){return this.pushStack(n.map(this,function(b,c){return a.call(b,c,b)}))},slice:function(){return this.pushStack(e.apply(this,arguments))},first:function(){return this.eq(0)},last:function(){return this.eq(-1)},eq:function(a){var b=this.length,c=+a+(0>a?b:0);return this.pushStack(c>=0&&b>c?[this[c]]:[])},end:function(){return this.prevObject||this.constructor()},push:g,sort:c.sort,splice:c.splice},n.extend=n.fn.extend=function(){var a,b,c,d,e,f,g=arguments[0]||{},h=1,i=arguments.length,j=!1;for("boolean"==typeof g&&(j=g,g=arguments[h]||{},h++),"object"==typeof g||n.isFunction(g)||(g={}),h===i&&(g=this,h--);i>h;h++)if(null!=(a=arguments[h]))for(b in 
a)c=g[b],d=a[b],g!==d&&(j&&d&&(n.isPlainObject(d)||(e=n.isArray(d)))?(e?(e=!1,f=c&&n.isArray(c)?c:[]):f=c&&n.isPlainObject(c)?c:{},g[b]=n.extend(j,f,d)):void 0!==d&&(g[b]=d));return g},n.extend({expando:"jQuery"+(m+Math.random()).replace(/\D/g,""),isReady:!0,error:function(a){throw new Error(a)},noop:function(){},isFunction:function(a){return"function"===n.type(a)},isArray:Array.isArray,isWindow:function(a){return null!=a&&a===a.window},isNumeric:function(a){var b=a&&a.toString();return!n.isArray(a)&&b-parseFloat(b)+1>=0},isPlainObject:function(a){var b;if("object"!==n.type(a)||a.nodeType||n.isWindow(a))return!1;if(a.constructor&&!k.call(a,"constructor")&&!k.call(a.constructor.prototype||{},"isPrototypeOf"))return!1;for(b in a);return void 0===b||k.call(a,b)},isEmptyObject:function(a){var b;for(b in a)return!1;return!0},type:function(a){return null==a?a+"":"object"==typeof a||"function"==typeof a?i[j.call(a)]||"object":typeof a},globalEval:function(a){var b,c=eval;a=n.trim(a),a&&(1===a.indexOf("use strict")?(b=d.createElement("script"),b.text=a,d.head.appendChild(b).parentNode.removeChild(b)):c(a))},camelCase:function(a){return a.replace(p,"ms-").replace(q,r)},nodeName:function(a,b){return a.nodeName&&a.nodeName.toLowerCase()===b.toLowerCase()},each:function(a,b){var c,d=0;if(s(a)){for(c=a.length;c>d;d++)if(b.call(a[d],d,a[d])===!1)break}else for(d in a)if(b.call(a[d],d,a[d])===!1)break;return a},trim:function(a){return null==a?"":(a+"").replace(o,"")},makeArray:function(a,b){var c=b||[];return null!=a&&(s(Object(a))?n.merge(c,"string"==typeof a?[a]:a):g.call(c,a)),c},inArray:function(a,b,c){return null==b?-1:h.call(b,a,c)},merge:function(a,b){for(var c=+b.length,d=0,e=a.length;c>d;d++)a[e++]=b[d];return a.length=e,a},grep:function(a,b,c){for(var d,e=[],f=0,g=a.length,h=!c;g>f;f++)d=!b(a[f],f),d!==h&&e.push(a[f]);return e},map:function(a,b,c){var d,e,g=0,h=[];if(s(a))for(d=a.length;d>g;g++)e=b(a[g],g,c),null!=e&&h.push(e);else for(g in 
a)e=b(a[g],g,c),null!=e&&h.push(e);return f.apply([],h)},guid:1,proxy:function(a,b){var c,d,f;return"string"==typeof b&&(c=a[b],b=a,a=c),n.isFunction(a)?(d=e.call(arguments,2),f=function(){return a.apply(b||this,d.concat(e.call(arguments)))},f.guid=a.guid=a.guid||n.guid++,f):void 0},now:Date.now,support:l}),"function"==typeof Symbol&&(n.fn[Symbol.iterator]=c[Symbol.iterator]),n.each("Boolean Number String Function Array Date RegExp Object Error Symbol".split(" "),function(a,b){i["[object "+b+"]"]=b.toLowerCase()});function s(a){var b=!!a&&"length"in a&&a.length,c=n.type(a);return"function"===c||n.isWindow(a)?!1:"array"===c||0===b||"number"==typeof b&&b>0&&b-1 in a}var t=function(a){var b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u="sizzle"+1*new Date,v=a.document,w=0,x=0,y=ga(),z=ga(),A=ga(),B=function(a,b){return a===b&&(l=!0),0},C=1<<31,D={}.hasOwnProperty,E=[],F=E.pop,G=E.push,H=E.push,I=E.slice,J=function(a,b){for(var c=0,d=a.length;d>c;c++)if(a[c]===b)return c;return-1},K="checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|ismap|loop|multiple|open|readonly|required|scoped",L="[\\x20\\t\\r\\n\\f]",M="(?:\\\\.|[\\w-]|[^\\x00-\\xa0])+",N="\\["+L+"*("+M+")(?:"+L+"*([*^$|!~]?=)"+L+"*(?:'((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\"|("+M+"))|)"+L+"*\\]",O=":("+M+")(?:\\((('((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\")|((?:\\\\.|[^\\\\()[\\]]|"+N+")*)|.*)\\)|)",P=new RegExp(L+"+","g"),Q=new RegExp("^"+L+"+|((?:^|[^\\\\])(?:\\\\.)*)"+L+"+$","g"),R=new RegExp("^"+L+"*,"+L+"*"),S=new RegExp("^"+L+"*([>+~]|"+L+")"+L+"*"),T=new RegExp("="+L+"*([^\\]'\"]*?)"+L+"*\\]","g"),U=new RegExp(O),V=new RegExp("^"+M+"$"),W={ID:new RegExp("^#("+M+")"),CLASS:new RegExp("^\\.("+M+")"),TAG:new RegExp("^("+M+"|[*])"),ATTR:new RegExp("^"+N),PSEUDO:new RegExp("^"+O),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+L+"*(even|odd|(([+-]|)(\\d*)n|)"+L+"*(?:([+-]|)"+L+"*(\\d+)|))"+L+"*\\)|)","i"),bool:new 
RegExp("^(?:"+K+")$","i"),needsContext:new RegExp("^"+L+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+L+"*((?:-\\d)?\\d*)"+L+"*\\)|)(?=[^-]|$)","i")},X=/^(?:input|select|textarea|button)$/i,Y=/^h\d$/i,Z=/^[^{]+\{\s*\[native \w/,$=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,_=/[+~]/,aa=/'|\\/g,ba=new RegExp("\\\\([\\da-f]{1,6}"+L+"?|("+L+")|.)","ig"),ca=function(a,b,c){var d="0x"+b-65536;return d!==d||c?b:0>d?String.fromCharCode(d+65536):String.fromCharCode(d>>10|55296,1023&d|56320)},da=function(){m()};try{H.apply(E=I.call(v.childNodes),v.childNodes),E[v.childNodes.length].nodeType}catch(ea){H={apply:E.length?function(a,b){G.apply(a,I.call(b))}:function(a,b){var c=a.length,d=0;while(a[c++]=b[d++]);a.length=c-1}}}function fa(a,b,d,e){var f,h,j,k,l,o,r,s,w=b&&b.ownerDocument,x=b?b.nodeType:9;if(d=d||[],"string"!=typeof a||!a||1!==x&&9!==x&&11!==x)return d;if(!e&&((b?b.ownerDocument||b:v)!==n&&m(b),b=b||n,p)){if(11!==x&&(o=$.exec(a)))if(f=o[1]){if(9===x){if(!(j=b.getElementById(f)))return d;if(j.id===f)return d.push(j),d}else if(w&&(j=w.getElementById(f))&&t(b,j)&&j.id===f)return d.push(j),d}else{if(o[2])return H.apply(d,b.getElementsByTagName(a)),d;if((f=o[3])&&c.getElementsByClassName&&b.getElementsByClassName)return H.apply(d,b.getElementsByClassName(f)),d}if(c.qsa&&!A[a+" "]&&(!q||!q.test(a))){if(1!==x)w=b,s=a;else if("object"!==b.nodeName.toLowerCase()){(k=b.getAttribute("id"))?k=k.replace(aa,"\\$&"):b.setAttribute("id",k=u),r=g(a),h=r.length,l=V.test(k)?"#"+k:"[id='"+k+"']";while(h--)r[h]=l+" "+qa(r[h]);s=r.join(","),w=_.test(a)&&oa(b.parentNode)||b}if(s)try{return H.apply(d,w.querySelectorAll(s)),d}catch(y){}finally{k===u&&b.removeAttribute("id")}}}return i(a.replace(Q,"$1"),b,d,e)}function ga(){var a=[];function b(c,e){return a.push(c+" ")>d.cacheLength&&delete b[a.shift()],b[c+" "]=e}return b}function ha(a){return a[u]=!0,a}function ia(a){var 
b=n.createElement("div");try{return!!a(b)}catch(c){return!1}finally{b.parentNode&&b.parentNode.removeChild(b),b=null}}function ja(a,b){var c=a.split("|"),e=c.length;while(e--)d.attrHandle[c[e]]=b}function ka(a,b){var c=b&&a,d=c&&1===a.nodeType&&1===b.nodeType&&(~b.sourceIndex||C)-(~a.sourceIndex||C);if(d)return d;if(c)while(c=c.nextSibling)if(c===b)return-1;return a?1:-1}function la(a){return function(b){var c=b.nodeName.toLowerCase();return"input"===c&&b.type===a}}function ma(a){return function(b){var c=b.nodeName.toLowerCase();return("input"===c||"button"===c)&&b.type===a}}function na(a){return ha(function(b){return b=+b,ha(function(c,d){var e,f=a([],c.length,b),g=f.length;while(g--)c[e=f[g]]&&(c[e]=!(d[e]=c[e]))})})}function oa(a){return a&&"undefined"!=typeof a.getElementsByTagName&&a}c=fa.support={},f=fa.isXML=function(a){var b=a&&(a.ownerDocument||a).documentElement;return b?"HTML"!==b.nodeName:!1},m=fa.setDocument=function(a){var b,e,g=a?a.ownerDocument||a:v;return g!==n&&9===g.nodeType&&g.documentElement?(n=g,o=n.documentElement,p=!f(n),(e=n.defaultView)&&e.top!==e&&(e.addEventListener?e.addEventListener("unload",da,!1):e.attachEvent&&e.attachEvent("onunload",da)),c.attributes=ia(function(a){return a.className="i",!a.getAttribute("className")}),c.getElementsByTagName=ia(function(a){return a.appendChild(n.createComment("")),!a.getElementsByTagName("*").length}),c.getElementsByClassName=Z.test(n.getElementsByClassName),c.getById=ia(function(a){return o.appendChild(a).id=u,!n.getElementsByName||!n.getElementsByName(u).length}),c.getById?(d.find.ID=function(a,b){if("undefined"!=typeof b.getElementById&&p){var c=b.getElementById(a);return c?[c]:[]}},d.filter.ID=function(a){var b=a.replace(ba,ca);return function(a){return a.getAttribute("id")===b}}):(delete d.find.ID,d.filter.ID=function(a){var b=a.replace(ba,ca);return function(a){var c="undefined"!=typeof a.getAttributeNode&&a.getAttributeNode("id");return 
c&&c.value===b}}),d.find.TAG=c.getElementsByTagName?function(a,b){return"undefined"!=typeof b.getElementsByTagName?b.getElementsByTagName(a):c.qsa?b.querySelectorAll(a):void 0}:function(a,b){var c,d=[],e=0,f=b.getElementsByTagName(a);if("*"===a){while(c=f[e++])1===c.nodeType&&d.push(c);return d}return f},d.find.CLASS=c.getElementsByClassName&&function(a,b){return"undefined"!=typeof b.getElementsByClassName&&p?b.getElementsByClassName(a):void 0},r=[],q=[],(c.qsa=Z.test(n.querySelectorAll))&&(ia(function(a){o.appendChild(a).innerHTML="",a.querySelectorAll("[msallowcapture^='']").length&&q.push("[*^$]="+L+"*(?:''|\"\")"),a.querySelectorAll("[selected]").length||q.push("\\["+L+"*(?:value|"+K+")"),a.querySelectorAll("[id~="+u+"-]").length||q.push("~="),a.querySelectorAll(":checked").length||q.push(":checked"),a.querySelectorAll("a#"+u+"+*").length||q.push(".#.+[+~]")}),ia(function(a){var b=n.createElement("input");b.setAttribute("type","hidden"),a.appendChild(b).setAttribute("name","D"),a.querySelectorAll("[name=d]").length&&q.push("name"+L+"*[*^$|!~]?="),a.querySelectorAll(":enabled").length||q.push(":enabled",":disabled"),a.querySelectorAll("*,:x"),q.push(",.*:")})),(c.matchesSelector=Z.test(s=o.matches||o.webkitMatchesSelector||o.mozMatchesSelector||o.oMatchesSelector||o.msMatchesSelector))&&ia(function(a){c.disconnectedMatch=s.call(a,"div"),s.call(a,"[s!='']:x"),r.push("!=",O)}),q=q.length&&new RegExp(q.join("|")),r=r.length&&new RegExp(r.join("|")),b=Z.test(o.compareDocumentPosition),t=b||Z.test(o.contains)?function(a,b){var c=9===a.nodeType?a.documentElement:a,d=b&&b.parentNode;return a===d||!(!d||1!==d.nodeType||!(c.contains?c.contains(d):a.compareDocumentPosition&&16&a.compareDocumentPosition(d)))}:function(a,b){if(b)while(b=b.parentNode)if(b===a)return!0;return!1},B=b?function(a,b){if(a===b)return l=!0,0;var d=!a.compareDocumentPosition-!b.compareDocumentPosition;return 
d?d:(d=(a.ownerDocument||a)===(b.ownerDocument||b)?a.compareDocumentPosition(b):1,1&d||!c.sortDetached&&b.compareDocumentPosition(a)===d?a===n||a.ownerDocument===v&&t(v,a)?-1:b===n||b.ownerDocument===v&&t(v,b)?1:k?J(k,a)-J(k,b):0:4&d?-1:1)}:function(a,b){if(a===b)return l=!0,0;var c,d=0,e=a.parentNode,f=b.parentNode,g=[a],h=[b];if(!e||!f)return a===n?-1:b===n?1:e?-1:f?1:k?J(k,a)-J(k,b):0;if(e===f)return ka(a,b);c=a;while(c=c.parentNode)g.unshift(c);c=b;while(c=c.parentNode)h.unshift(c);while(g[d]===h[d])d++;return d?ka(g[d],h[d]):g[d]===v?-1:h[d]===v?1:0},n):n},fa.matches=function(a,b){return fa(a,null,null,b)},fa.matchesSelector=function(a,b){if((a.ownerDocument||a)!==n&&m(a),b=b.replace(T,"='$1']"),c.matchesSelector&&p&&!A[b+" "]&&(!r||!r.test(b))&&(!q||!q.test(b)))try{var d=s.call(a,b);if(d||c.disconnectedMatch||a.document&&11!==a.document.nodeType)return d}catch(e){}return fa(b,n,null,[a]).length>0},fa.contains=function(a,b){return(a.ownerDocument||a)!==n&&m(a),t(a,b)},fa.attr=function(a,b){(a.ownerDocument||a)!==n&&m(a);var e=d.attrHandle[b.toLowerCase()],f=e&&D.call(d.attrHandle,b.toLowerCase())?e(a,b,!p):void 0;return void 0!==f?f:c.attributes||!p?a.getAttribute(b):(f=a.getAttributeNode(b))&&f.specified?f.value:null},fa.error=function(a){throw new Error("Syntax error, unrecognized expression: "+a)},fa.uniqueSort=function(a){var b,d=[],e=0,f=0;if(l=!c.detectDuplicates,k=!c.sortStable&&a.slice(0),a.sort(B),l){while(b=a[f++])b===a[f]&&(e=d.push(f));while(e--)a.splice(d[e],1)}return k=null,a},e=fa.getText=function(a){var b,c="",d=0,f=a.nodeType;if(f){if(1===f||9===f||11===f){if("string"==typeof a.textContent)return a.textContent;for(a=a.firstChild;a;a=a.nextSibling)c+=e(a)}else if(3===f||4===f)return a.nodeValue}else while(b=a[d++])c+=e(b);return c},d=fa.selectors={cacheLength:50,createPseudo:ha,match:W,attrHandle:{},find:{},relative:{">":{dir:"parentNode",first:!0}," 
":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(a){return a[1]=a[1].replace(ba,ca),a[3]=(a[3]||a[4]||a[5]||"").replace(ba,ca),"~="===a[2]&&(a[3]=" "+a[3]+" "),a.slice(0,4)},CHILD:function(a){return a[1]=a[1].toLowerCase(),"nth"===a[1].slice(0,3)?(a[3]||fa.error(a[0]),a[4]=+(a[4]?a[5]+(a[6]||1):2*("even"===a[3]||"odd"===a[3])),a[5]=+(a[7]+a[8]||"odd"===a[3])):a[3]&&fa.error(a[0]),a},PSEUDO:function(a){var b,c=!a[6]&&a[2];return W.CHILD.test(a[0])?null:(a[3]?a[2]=a[4]||a[5]||"":c&&U.test(c)&&(b=g(c,!0))&&(b=c.indexOf(")",c.length-b)-c.length)&&(a[0]=a[0].slice(0,b),a[2]=c.slice(0,b)),a.slice(0,3))}},filter:{TAG:function(a){var b=a.replace(ba,ca).toLowerCase();return"*"===a?function(){return!0}:function(a){return a.nodeName&&a.nodeName.toLowerCase()===b}},CLASS:function(a){var b=y[a+" "];return b||(b=new RegExp("(^|"+L+")"+a+"("+L+"|$)"))&&y(a,function(a){return b.test("string"==typeof a.className&&a.className||"undefined"!=typeof a.getAttribute&&a.getAttribute("class")||"")})},ATTR:function(a,b,c){return function(d){var e=fa.attr(d,a);return null==e?"!="===b:b?(e+="","="===b?e===c:"!="===b?e!==c:"^="===b?c&&0===e.indexOf(c):"*="===b?c&&e.indexOf(c)>-1:"$="===b?c&&e.slice(-c.length)===c:"~="===b?(" "+e.replace(P," ")+" ").indexOf(c)>-1:"|="===b?e===c||e.slice(0,c.length+1)===c+"-":!1):!0}},CHILD:function(a,b,c,d,e){var f="nth"!==a.slice(0,3),g="last"!==a.slice(-4),h="of-type"===b;return 1===d&&0===e?function(a){return!!a.parentNode}:function(b,c,i){var 
j,k,l,m,n,o,p=f!==g?"nextSibling":"previousSibling",q=b.parentNode,r=h&&b.nodeName.toLowerCase(),s=!i&&!h,t=!1;if(q){if(f){while(p){m=b;while(m=m[p])if(h?m.nodeName.toLowerCase()===r:1===m.nodeType)return!1;o=p="only"===a&&!o&&"nextSibling"}return!0}if(o=[g?q.firstChild:q.lastChild],g&&s){m=q,l=m[u]||(m[u]={}),k=l[m.uniqueID]||(l[m.uniqueID]={}),j=k[a]||[],n=j[0]===w&&j[1],t=n&&j[2],m=n&&q.childNodes[n];while(m=++n&&m&&m[p]||(t=n=0)||o.pop())if(1===m.nodeType&&++t&&m===b){k[a]=[w,n,t];break}}else if(s&&(m=b,l=m[u]||(m[u]={}),k=l[m.uniqueID]||(l[m.uniqueID]={}),j=k[a]||[],n=j[0]===w&&j[1],t=n),t===!1)while(m=++n&&m&&m[p]||(t=n=0)||o.pop())if((h?m.nodeName.toLowerCase()===r:1===m.nodeType)&&++t&&(s&&(l=m[u]||(m[u]={}),k=l[m.uniqueID]||(l[m.uniqueID]={}),k[a]=[w,t]),m===b))break;return t-=e,t===d||t%d===0&&t/d>=0}}},PSEUDO:function(a,b){var c,e=d.pseudos[a]||d.setFilters[a.toLowerCase()]||fa.error("unsupported pseudo: "+a);return e[u]?e(b):e.length>1?(c=[a,a,"",b],d.setFilters.hasOwnProperty(a.toLowerCase())?ha(function(a,c){var d,f=e(a,b),g=f.length;while(g--)d=J(a,f[g]),a[d]=!(c[d]=f[g])}):function(a){return e(a,0,c)}):e}},pseudos:{not:ha(function(a){var b=[],c=[],d=h(a.replace(Q,"$1"));return d[u]?ha(function(a,b,c,e){var f,g=d(a,null,e,[]),h=a.length;while(h--)(f=g[h])&&(a[h]=!(b[h]=f))}):function(a,e,f){return b[0]=a,d(b,null,f,c),b[0]=null,!c.pop()}}),has:ha(function(a){return function(b){return fa(a,b).length>0}}),contains:ha(function(a){return a=a.replace(ba,ca),function(b){return(b.textContent||b.innerText||e(b)).indexOf(a)>-1}}),lang:ha(function(a){return V.test(a||"")||fa.error("unsupported lang: "+a),a=a.replace(ba,ca).toLowerCase(),function(b){var c;do if(c=p?b.lang:b.getAttribute("xml:lang")||b.getAttribute("lang"))return c=c.toLowerCase(),c===a||0===c.indexOf(a+"-");while((b=b.parentNode)&&1===b.nodeType);return!1}}),target:function(b){var c=a.location&&a.location.hash;return c&&c.slice(1)===b.id},root:function(a){return a===o},focus:function(a){return 
a===n.activeElement&&(!n.hasFocus||n.hasFocus())&&!!(a.type||a.href||~a.tabIndex)},enabled:function(a){return a.disabled===!1},disabled:function(a){return a.disabled===!0},checked:function(a){var b=a.nodeName.toLowerCase();return"input"===b&&!!a.checked||"option"===b&&!!a.selected},selected:function(a){return a.parentNode&&a.parentNode.selectedIndex,a.selected===!0},empty:function(a){for(a=a.firstChild;a;a=a.nextSibling)if(a.nodeType<6)return!1;return!0},parent:function(a){return!d.pseudos.empty(a)},header:function(a){return Y.test(a.nodeName)},input:function(a){return X.test(a.nodeName)},button:function(a){var b=a.nodeName.toLowerCase();return"input"===b&&"button"===a.type||"button"===b},text:function(a){var b;return"input"===a.nodeName.toLowerCase()&&"text"===a.type&&(null==(b=a.getAttribute("type"))||"text"===b.toLowerCase())},first:na(function(){return[0]}),last:na(function(a,b){return[b-1]}),eq:na(function(a,b,c){return[0>c?c+b:c]}),even:na(function(a,b){for(var c=0;b>c;c+=2)a.push(c);return a}),odd:na(function(a,b){for(var c=1;b>c;c+=2)a.push(c);return a}),lt:na(function(a,b,c){for(var d=0>c?c+b:c;--d>=0;)a.push(d);return a}),gt:na(function(a,b,c){for(var d=0>c?c+b:c;++db;b++)d+=a[b].value;return d}function ra(a,b,c){var d=b.dir,e=c&&"parentNode"===d,f=x++;return b.first?function(b,c,f){while(b=b[d])if(1===b.nodeType||e)return a(b,c,f)}:function(b,c,g){var h,i,j,k=[w,f];if(g){while(b=b[d])if((1===b.nodeType||e)&&a(b,c,g))return!0}else while(b=b[d])if(1===b.nodeType||e){if(j=b[u]||(b[u]={}),i=j[b.uniqueID]||(j[b.uniqueID]={}),(h=i[d])&&h[0]===w&&h[1]===f)return k[2]=h[2];if(i[d]=k,k[2]=a(b,c,g))return!0}}}function sa(a){return a.length>1?function(b,c,d){var e=a.length;while(e--)if(!a[e](b,c,d))return!1;return!0}:a[0]}function ta(a,b,c){for(var d=0,e=b.length;e>d;d++)fa(a,b[d],c);return c}function ua(a,b,c,d,e){for(var f,g=[],h=0,i=a.length,j=null!=b;i>h;h++)(f=a[h])&&(c&&!c(f,d,e)||(g.push(f),j&&b.push(h)));return g}function va(a,b,c,d,e,f){return 
d&&!d[u]&&(d=va(d)),e&&!e[u]&&(e=va(e,f)),ha(function(f,g,h,i){var j,k,l,m=[],n=[],o=g.length,p=f||ta(b||"*",h.nodeType?[h]:h,[]),q=!a||!f&&b?p:ua(p,m,a,h,i),r=c?e||(f?a:o||d)?[]:g:q;if(c&&c(q,r,h,i),d){j=ua(r,n),d(j,[],h,i),k=j.length;while(k--)(l=j[k])&&(r[n[k]]=!(q[n[k]]=l))}if(f){if(e||a){if(e){j=[],k=r.length;while(k--)(l=r[k])&&j.push(q[k]=l);e(null,r=[],j,i)}k=r.length;while(k--)(l=r[k])&&(j=e?J(f,l):m[k])>-1&&(f[j]=!(g[j]=l))}}else r=ua(r===g?r.splice(o,r.length):r),e?e(null,g,r,i):H.apply(g,r)})}function wa(a){for(var b,c,e,f=a.length,g=d.relative[a[0].type],h=g||d.relative[" "],i=g?1:0,k=ra(function(a){return a===b},h,!0),l=ra(function(a){return J(b,a)>-1},h,!0),m=[function(a,c,d){var e=!g&&(d||c!==j)||((b=c).nodeType?k(a,c,d):l(a,c,d));return b=null,e}];f>i;i++)if(c=d.relative[a[i].type])m=[ra(sa(m),c)];else{if(c=d.filter[a[i].type].apply(null,a[i].matches),c[u]){for(e=++i;f>e;e++)if(d.relative[a[e].type])break;return va(i>1&&sa(m),i>1&&qa(a.slice(0,i-1).concat({value:" "===a[i-2].type?"*":""})).replace(Q,"$1"),c,e>i&&wa(a.slice(i,e)),f>e&&wa(a=a.slice(e)),f>e&&qa(a))}m.push(c)}return sa(m)}function xa(a,b){var c=b.length>0,e=a.length>0,f=function(f,g,h,i,k){var l,o,q,r=0,s="0",t=f&&[],u=[],v=j,x=f||e&&d.find.TAG("*",k),y=w+=null==v?1:Math.random()||.1,z=x.length;for(k&&(j=g===n||g||k);s!==z&&null!=(l=x[s]);s++){if(e&&l){o=0,g||l.ownerDocument===n||(m(l),h=!p);while(q=a[o++])if(q(l,g||n,h)){i.push(l);break}k&&(w=y)}c&&((l=!q&&l)&&r--,f&&t.push(l))}if(r+=s,c&&s!==r){o=0;while(q=b[o++])q(t,u,g,h);if(f){if(r>0)while(s--)t[s]||u[s]||(u[s]=F.call(i));u=ua(u)}H.apply(i,u),k&&!f&&u.length>0&&r+b.length>1&&fa.uniqueSort(i)}return k&&(w=y,j=v),t};return c?ha(f):f}return h=fa.compile=function(a,b){var c,d=[],e=[],f=A[a+" "];if(!f){b||(b=g(a)),c=b.length;while(c--)f=wa(b[c]),f[u]?d.push(f):e.push(f);f=A(a,xa(e,d)),f.selector=a}return f},i=fa.select=function(a,b,e,f){var i,j,k,l,m,n="function"==typeof 
a&&a,o=!f&&g(a=n.selector||a);if(e=e||[],1===o.length){if(j=o[0]=o[0].slice(0),j.length>2&&"ID"===(k=j[0]).type&&c.getById&&9===b.nodeType&&p&&d.relative[j[1].type]){if(b=(d.find.ID(k.matches[0].replace(ba,ca),b)||[])[0],!b)return e;n&&(b=b.parentNode),a=a.slice(j.shift().value.length)}i=W.needsContext.test(a)?0:j.length;while(i--){if(k=j[i],d.relative[l=k.type])break;if((m=d.find[l])&&(f=m(k.matches[0].replace(ba,ca),_.test(j[0].type)&&oa(b.parentNode)||b))){if(j.splice(i,1),a=f.length&&qa(j),!a)return H.apply(e,f),e;break}}}return(n||h(a,o))(f,b,!p,e,!b||_.test(a)&&oa(b.parentNode)||b),e},c.sortStable=u.split("").sort(B).join("")===u,c.detectDuplicates=!!l,m(),c.sortDetached=ia(function(a){return 1&a.compareDocumentPosition(n.createElement("div"))}),ia(function(a){return a.innerHTML="","#"===a.firstChild.getAttribute("href")})||ja("type|href|height|width",function(a,b,c){return c?void 0:a.getAttribute(b,"type"===b.toLowerCase()?1:2)}),c.attributes&&ia(function(a){return a.innerHTML="",a.firstChild.setAttribute("value",""),""===a.firstChild.getAttribute("value")})||ja("value",function(a,b,c){return c||"input"!==a.nodeName.toLowerCase()?void 0:a.defaultValue}),ia(function(a){return null==a.getAttribute("disabled")})||ja(K,function(a,b,c){var d;return c?void 0:a[b]===!0?b.toLowerCase():(d=a.getAttributeNode(b))&&d.specified?d.value:null}),fa}(a);n.find=t,n.expr=t.selectors,n.expr[":"]=n.expr.pseudos,n.uniqueSort=n.unique=t.uniqueSort,n.text=t.getText,n.isXMLDoc=t.isXML,n.contains=t.contains;var u=function(a,b,c){var d=[],e=void 0!==c;while((a=a[b])&&9!==a.nodeType)if(1===a.nodeType){if(e&&n(a).is(c))break;d.push(a)}return d},v=function(a,b){for(var c=[];a;a=a.nextSibling)1===a.nodeType&&a!==b&&c.push(a);return c},w=n.expr.match.needsContext,x=/^<([\w-]+)\s*\/?>(?:<\/\1>|)$/,y=/^.[^:#\[\.,]*$/;function z(a,b,c){if(n.isFunction(b))return n.grep(a,function(a,d){return!!b.call(a,d,a)!==c});if(b.nodeType)return n.grep(a,function(a){return a===b!==c});if("string"==typeof 
b){if(y.test(b))return n.filter(b,a,c);b=n.filter(b,a)}return n.grep(a,function(a){return h.call(b,a)>-1!==c})}n.filter=function(a,b,c){var d=b[0];return c&&(a=":not("+a+")"),1===b.length&&1===d.nodeType?n.find.matchesSelector(d,a)?[d]:[]:n.find.matches(a,n.grep(b,function(a){return 1===a.nodeType}))},n.fn.extend({find:function(a){var b,c=this.length,d=[],e=this;if("string"!=typeof a)return this.pushStack(n(a).filter(function(){for(b=0;c>b;b++)if(n.contains(e[b],this))return!0}));for(b=0;c>b;b++)n.find(a,e[b],d);return d=this.pushStack(c>1?n.unique(d):d),d.selector=this.selector?this.selector+" "+a:a,d},filter:function(a){return this.pushStack(z(this,a||[],!1))},not:function(a){return this.pushStack(z(this,a||[],!0))},is:function(a){return!!z(this,"string"==typeof a&&w.test(a)?n(a):a||[],!1).length}});var A,B=/^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]*))$/,C=n.fn.init=function(a,b,c){var e,f;if(!a)return this;if(c=c||A,"string"==typeof a){if(e="<"===a[0]&&">"===a[a.length-1]&&a.length>=3?[null,a,null]:B.exec(a),!e||!e[1]&&b)return!b||b.jquery?(b||c).find(a):this.constructor(b).find(a);if(e[1]){if(b=b instanceof n?b[0]:b,n.merge(this,n.parseHTML(e[1],b&&b.nodeType?b.ownerDocument||b:d,!0)),x.test(e[1])&&n.isPlainObject(b))for(e in b)n.isFunction(this[e])?this[e](b[e]):this.attr(e,b[e]);return this}return f=d.getElementById(e[2]),f&&f.parentNode&&(this.length=1,this[0]=f),this.context=d,this.selector=a,this}return a.nodeType?(this.context=this[0]=a,this.length=1,this):n.isFunction(a)?void 0!==c.ready?c.ready(a):a(n):(void 0!==a.selector&&(this.selector=a.selector,this.context=a.context),n.makeArray(a,this))};C.prototype=n.fn,A=n(d);var D=/^(?:parents|prev(?:Until|All))/,E={children:!0,contents:!0,next:!0,prev:!0};n.fn.extend({has:function(a){var b=n(a,this),c=b.length;return this.filter(function(){for(var a=0;c>a;a++)if(n.contains(this,b[a]))return!0})},closest:function(a,b){for(var c,d=0,e=this.length,f=[],g=w.test(a)||"string"!=typeof 
a?n(a,b||this.context):0;e>d;d++)for(c=this[d];c&&c!==b;c=c.parentNode)if(c.nodeType<11&&(g?g.index(c)>-1:1===c.nodeType&&n.find.matchesSelector(c,a))){f.push(c);break}return this.pushStack(f.length>1?n.uniqueSort(f):f)},index:function(a){return a?"string"==typeof a?h.call(n(a),this[0]):h.call(this,a.jquery?a[0]:a):this[0]&&this[0].parentNode?this.first().prevAll().length:-1},add:function(a,b){return this.pushStack(n.uniqueSort(n.merge(this.get(),n(a,b))))},addBack:function(a){return this.add(null==a?this.prevObject:this.prevObject.filter(a))}});function F(a,b){while((a=a[b])&&1!==a.nodeType);return a}n.each({parent:function(a){var b=a.parentNode;return b&&11!==b.nodeType?b:null},parents:function(a){return u(a,"parentNode")},parentsUntil:function(a,b,c){return u(a,"parentNode",c)},next:function(a){return F(a,"nextSibling")},prev:function(a){return F(a,"previousSibling")},nextAll:function(a){return u(a,"nextSibling")},prevAll:function(a){return u(a,"previousSibling")},nextUntil:function(a,b,c){return u(a,"nextSibling",c)},prevUntil:function(a,b,c){return u(a,"previousSibling",c)},siblings:function(a){return v((a.parentNode||{}).firstChild,a)},children:function(a){return v(a.firstChild)},contents:function(a){return a.contentDocument||n.merge([],a.childNodes)}},function(a,b){n.fn[a]=function(c,d){var e=n.map(this,b,c);return"Until"!==a.slice(-5)&&(d=c),d&&"string"==typeof d&&(e=n.filter(d,e)),this.length>1&&(E[a]||n.uniqueSort(e),D.test(a)&&e.reverse()),this.pushStack(e)}});var G=/\S+/g;function H(a){var b={};return n.each(a.match(G)||[],function(a,c){b[c]=!0}),b}n.Callbacks=function(a){a="string"==typeof a?H(a):n.extend({},a);var b,c,d,e,f=[],g=[],h=-1,i=function(){for(e=a.once,d=b=!0;g.length;h=-1){c=g.shift();while(++h-1)f.splice(c,1),h>=c&&h--}),this},has:function(a){return a?n.inArray(a,f)>-1:f.length>0},empty:function(){return f&&(f=[]),this},disable:function(){return e=g=[],f=c="",this},disabled:function(){return!f},lock:function(){return 
e=g=[],c||(f=c=""),this},locked:function(){return!!e},fireWith:function(a,c){return e||(c=c||[],c=[a,c.slice?c.slice():c],g.push(c),b||i()),this},fire:function(){return j.fireWith(this,arguments),this},fired:function(){return!!d}};return j},n.extend({Deferred:function(a){var b=[["resolve","done",n.Callbacks("once memory"),"resolved"],["reject","fail",n.Callbacks("once memory"),"rejected"],["notify","progress",n.Callbacks("memory")]],c="pending",d={state:function(){return c},always:function(){return e.done(arguments).fail(arguments),this},then:function(){var a=arguments;return n.Deferred(function(c){n.each(b,function(b,f){var g=n.isFunction(a[b])&&a[b];e[f[1]](function(){var a=g&&g.apply(this,arguments);a&&n.isFunction(a.promise)?a.promise().progress(c.notify).done(c.resolve).fail(c.reject):c[f[0]+"With"](this===d?c.promise():this,g?[a]:arguments)})}),a=null}).promise()},promise:function(a){return null!=a?n.extend(a,d):d}},e={};return d.pipe=d.then,n.each(b,function(a,f){var g=f[2],h=f[3];d[f[1]]=g.add,h&&g.add(function(){c=h},b[1^a][2].disable,b[2][2].lock),e[f[0]]=function(){return e[f[0]+"With"](this===e?d:this,arguments),this},e[f[0]+"With"]=g.fireWith}),d.promise(e),a&&a.call(e,e),e},when:function(a){var b=0,c=e.call(arguments),d=c.length,f=1!==d||a&&n.isFunction(a.promise)?d:0,g=1===f?a:n.Deferred(),h=function(a,b,c){return function(d){b[a]=this,c[a]=arguments.length>1?e.call(arguments):d,c===i?g.notifyWith(b,c):--f||g.resolveWith(b,c)}},i,j,k;if(d>1)for(i=new Array(d),j=new Array(d),k=new Array(d);d>b;b++)c[b]&&n.isFunction(c[b].promise)?c[b].promise().progress(h(b,j,i)).done(h(b,k,c)).fail(g.reject):--f;return f||g.resolveWith(k,c),g.promise()}});var I;n.fn.ready=function(a){return 
n.ready.promise().done(a),this},n.extend({isReady:!1,readyWait:1,holdReady:function(a){a?n.readyWait++:n.ready(!0)},ready:function(a){(a===!0?--n.readyWait:n.isReady)||(n.isReady=!0,a!==!0&&--n.readyWait>0||(I.resolveWith(d,[n]),n.fn.triggerHandler&&(n(d).triggerHandler("ready"),n(d).off("ready"))))}});function J(){d.removeEventListener("DOMContentLoaded",J),a.removeEventListener("load",J),n.ready()}n.ready.promise=function(b){return I||(I=n.Deferred(),"complete"===d.readyState||"loading"!==d.readyState&&!d.documentElement.doScroll?a.setTimeout(n.ready):(d.addEventListener("DOMContentLoaded",J),a.addEventListener("load",J))),I.promise(b)},n.ready.promise();var K=function(a,b,c,d,e,f,g){var h=0,i=a.length,j=null==c;if("object"===n.type(c)){e=!0;for(h in c)K(a,b,h,c[h],!0,f,g)}else if(void 0!==d&&(e=!0,n.isFunction(d)||(g=!0),j&&(g?(b.call(a,d),b=null):(j=b,b=function(a,b,c){return j.call(n(a),c)})),b))for(;i>h;h++)b(a[h],c,g?d:d.call(a[h],h,b(a[h],c)));return e?a:j?b.call(a):i?b(a[0],c):f},L=function(a){return 1===a.nodeType||9===a.nodeType||!+a.nodeType};function M(){this.expando=n.expando+M.uid++}M.uid=1,M.prototype={register:function(a,b){var c=b||{};return a.nodeType?a[this.expando]=c:Object.defineProperty(a,this.expando,{value:c,writable:!0,configurable:!0}),a[this.expando]},cache:function(a){if(!L(a))return{};var b=a[this.expando];return b||(b={},L(a)&&(a.nodeType?a[this.expando]=b:Object.defineProperty(a,this.expando,{value:b,configurable:!0}))),b},set:function(a,b,c){var d,e=this.cache(a);if("string"==typeof b)e[b]=c;else for(d in b)e[d]=b[d];return e},get:function(a,b){return void 0===b?this.cache(a):a[this.expando]&&a[this.expando][b]},access:function(a,b,c){var d;return void 0===b||b&&"string"==typeof b&&void 0===c?(d=this.get(a,b),void 0!==d?d:this.get(a,n.camelCase(b))):(this.set(a,b,c),void 0!==c?c:b)},remove:function(a,b){var c,d,e,f=a[this.expando];if(void 0!==f){if(void 
0===b)this.register(a);else{n.isArray(b)?d=b.concat(b.map(n.camelCase)):(e=n.camelCase(b),b in f?d=[b,e]:(d=e,d=d in f?[d]:d.match(G)||[])),c=d.length;while(c--)delete f[d[c]]}(void 0===b||n.isEmptyObject(f))&&(a.nodeType?a[this.expando]=void 0:delete a[this.expando])}},hasData:function(a){var b=a[this.expando];return void 0!==b&&!n.isEmptyObject(b)}};var N=new M,O=new M,P=/^(?:\{[\w\W]*\}|\[[\w\W]*\])$/,Q=/[A-Z]/g;function R(a,b,c){var d;if(void 0===c&&1===a.nodeType)if(d="data-"+b.replace(Q,"-$&").toLowerCase(),c=a.getAttribute(d),"string"==typeof c){try{c="true"===c?!0:"false"===c?!1:"null"===c?null:+c+""===c?+c:P.test(c)?n.parseJSON(c):c; -}catch(e){}O.set(a,b,c)}else c=void 0;return c}n.extend({hasData:function(a){return O.hasData(a)||N.hasData(a)},data:function(a,b,c){return O.access(a,b,c)},removeData:function(a,b){O.remove(a,b)},_data:function(a,b,c){return N.access(a,b,c)},_removeData:function(a,b){N.remove(a,b)}}),n.fn.extend({data:function(a,b){var c,d,e,f=this[0],g=f&&f.attributes;if(void 0===a){if(this.length&&(e=O.get(f),1===f.nodeType&&!N.get(f,"hasDataAttrs"))){c=g.length;while(c--)g[c]&&(d=g[c].name,0===d.indexOf("data-")&&(d=n.camelCase(d.slice(5)),R(f,d,e[d])));N.set(f,"hasDataAttrs",!0)}return e}return"object"==typeof a?this.each(function(){O.set(this,a)}):K(this,function(b){var c,d;if(f&&void 0===b){if(c=O.get(f,a)||O.get(f,a.replace(Q,"-$&").toLowerCase()),void 0!==c)return c;if(d=n.camelCase(a),c=O.get(f,d),void 0!==c)return c;if(c=R(f,d,void 0),void 0!==c)return c}else d=n.camelCase(a),this.each(function(){var c=O.get(this,d);O.set(this,d,b),a.indexOf("-")>-1&&void 0!==c&&O.set(this,a,b)})},null,b,arguments.length>1,null,!0)},removeData:function(a){return this.each(function(){O.remove(this,a)})}}),n.extend({queue:function(a,b,c){var d;return a?(b=(b||"fx")+"queue",d=N.get(a,b),c&&(!d||n.isArray(c)?d=N.access(a,b,n.makeArray(c)):d.push(c)),d||[]):void 0},dequeue:function(a,b){b=b||"fx";var 
c=n.queue(a,b),d=c.length,e=c.shift(),f=n._queueHooks(a,b),g=function(){n.dequeue(a,b)};"inprogress"===e&&(e=c.shift(),d--),e&&("fx"===b&&c.unshift("inprogress"),delete f.stop,e.call(a,g,f)),!d&&f&&f.empty.fire()},_queueHooks:function(a,b){var c=b+"queueHooks";return N.get(a,c)||N.access(a,c,{empty:n.Callbacks("once memory").add(function(){N.remove(a,[b+"queue",c])})})}}),n.fn.extend({queue:function(a,b){var c=2;return"string"!=typeof a&&(b=a,a="fx",c--),arguments.length",""],thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};$.optgroup=$.option,$.tbody=$.tfoot=$.colgroup=$.caption=$.thead,$.th=$.td;function _(a,b){var c="undefined"!=typeof a.getElementsByTagName?a.getElementsByTagName(b||"*"):"undefined"!=typeof a.querySelectorAll?a.querySelectorAll(b||"*"):[];return void 0===b||b&&n.nodeName(a,b)?n.merge([a],c):c}function aa(a,b){for(var c=0,d=a.length;d>c;c++)N.set(a[c],"globalEval",!b||N.get(b[c],"globalEval"))}var ba=/<|&#?\w+;/;function ca(a,b,c,d,e){for(var f,g,h,i,j,k,l=b.createDocumentFragment(),m=[],o=0,p=a.length;p>o;o++)if(f=a[o],f||0===f)if("object"===n.type(f))n.merge(m,f.nodeType?[f]:f);else if(ba.test(f)){g=g||l.appendChild(b.createElement("div")),h=(Y.exec(f)||["",""])[1].toLowerCase(),i=$[h]||$._default,g.innerHTML=i[1]+n.htmlPrefilter(f)+i[2],k=i[0];while(k--)g=g.lastChild;n.merge(m,g.childNodes),g=l.firstChild,g.textContent=""}else m.push(b.createTextNode(f));l.textContent="",o=0;while(f=m[o++])if(d&&n.inArray(f,d)>-1)e&&e.push(f);else if(j=n.contains(f.ownerDocument,f),g=_(l.appendChild(f),"script"),j&&aa(g),c){k=0;while(f=g[k++])Z.test(f.type||"")&&c.push(f)}return l}!function(){var a=d.createDocumentFragment(),b=a.appendChild(d.createElement("div")),c=d.createElement("input");c.setAttribute("type","radio"),c.setAttribute("checked","checked"),c.setAttribute("name","t"),b.appendChild(c),l.checkClone=b.cloneNode(!0).cloneNode(!0).lastChild.checked,b.innerHTML="",l.noCloneChecked=!!b.cloneNode(!0).lastChild.defaultValue}();var da=/^key/,ea=/^(?:mouse|pointer|contextmenu|drag|drop)|click/,fa=/^([^.]*)(?:\.(.+)|)/;function ga(){return!0}function ha(){return!1}function ia(){try{return d.activeElement}catch(a){}}function ja(a,b,c,d,e,f){var g,h;if("object"==typeof b){"string"!=typeof c&&(d=d||c,c=void 0);for(h in b)ja(a,h,c,d,b[h],f);return a}if(null==d&&null==e?(e=c,d=c=void 0):null==e&&("string"==typeof c?(e=d,d=void 0):(e=d,d=c,c=void 0)),e===!1)e=ha;else if(!e)return a;return 1===f&&(g=e,e=function(a){return 
n().off(a),g.apply(this,arguments)},e.guid=g.guid||(g.guid=n.guid++)),a.each(function(){n.event.add(this,b,e,d,c)})}n.event={global:{},add:function(a,b,c,d,e){var f,g,h,i,j,k,l,m,o,p,q,r=N.get(a);if(r){c.handler&&(f=c,c=f.handler,e=f.selector),c.guid||(c.guid=n.guid++),(i=r.events)||(i=r.events={}),(g=r.handle)||(g=r.handle=function(b){return"undefined"!=typeof n&&n.event.triggered!==b.type?n.event.dispatch.apply(a,arguments):void 0}),b=(b||"").match(G)||[""],j=b.length;while(j--)h=fa.exec(b[j])||[],o=q=h[1],p=(h[2]||"").split(".").sort(),o&&(l=n.event.special[o]||{},o=(e?l.delegateType:l.bindType)||o,l=n.event.special[o]||{},k=n.extend({type:o,origType:q,data:d,handler:c,guid:c.guid,selector:e,needsContext:e&&n.expr.match.needsContext.test(e),namespace:p.join(".")},f),(m=i[o])||(m=i[o]=[],m.delegateCount=0,l.setup&&l.setup.call(a,d,p,g)!==!1||a.addEventListener&&a.addEventListener(o,g)),l.add&&(l.add.call(a,k),k.handler.guid||(k.handler.guid=c.guid)),e?m.splice(m.delegateCount++,0,k):m.push(k),n.event.global[o]=!0)}},remove:function(a,b,c,d,e){var f,g,h,i,j,k,l,m,o,p,q,r=N.hasData(a)&&N.get(a);if(r&&(i=r.events)){b=(b||"").match(G)||[""],j=b.length;while(j--)if(h=fa.exec(b[j])||[],o=q=h[1],p=(h[2]||"").split(".").sort(),o){l=n.event.special[o]||{},o=(d?l.delegateType:l.bindType)||o,m=i[o]||[],h=h[2]&&new RegExp("(^|\\.)"+p.join("\\.(?:.*\\.|)")+"(\\.|$)"),g=f=m.length;while(f--)k=m[f],!e&&q!==k.origType||c&&c.guid!==k.guid||h&&!h.test(k.namespace)||d&&d!==k.selector&&("**"!==d||!k.selector)||(m.splice(f,1),k.selector&&m.delegateCount--,l.remove&&l.remove.call(a,k));g&&!m.length&&(l.teardown&&l.teardown.call(a,p,r.handle)!==!1||n.removeEvent(a,o,r.handle),delete i[o])}else for(o in i)n.event.remove(a,o+b[j],c,d,!0);n.isEmptyObject(i)&&N.remove(a,"handle events")}},dispatch:function(a){a=n.event.fix(a);var 
b,c,d,f,g,h=[],i=e.call(arguments),j=(N.get(this,"events")||{})[a.type]||[],k=n.event.special[a.type]||{};if(i[0]=a,a.delegateTarget=this,!k.preDispatch||k.preDispatch.call(this,a)!==!1){h=n.event.handlers.call(this,a,j),b=0;while((f=h[b++])&&!a.isPropagationStopped()){a.currentTarget=f.elem,c=0;while((g=f.handlers[c++])&&!a.isImmediatePropagationStopped())a.rnamespace&&!a.rnamespace.test(g.namespace)||(a.handleObj=g,a.data=g.data,d=((n.event.special[g.origType]||{}).handle||g.handler).apply(f.elem,i),void 0!==d&&(a.result=d)===!1&&(a.preventDefault(),a.stopPropagation()))}return k.postDispatch&&k.postDispatch.call(this,a),a.result}},handlers:function(a,b){var c,d,e,f,g=[],h=b.delegateCount,i=a.target;if(h&&i.nodeType&&("click"!==a.type||isNaN(a.button)||a.button<1))for(;i!==this;i=i.parentNode||this)if(1===i.nodeType&&(i.disabled!==!0||"click"!==a.type)){for(d=[],c=0;h>c;c++)f=b[c],e=f.selector+" ",void 0===d[e]&&(d[e]=f.needsContext?n(e,this).index(i)>-1:n.find(e,this,null,[i]).length),d[e]&&d.push(f);d.length&&g.push({elem:i,handlers:d})}return h]*)\/>/gi,la=/\s*$/g;function pa(a,b){return n.nodeName(a,"table")&&n.nodeName(11!==b.nodeType?b:b.firstChild,"tr")?a.getElementsByTagName("tbody")[0]||a.appendChild(a.ownerDocument.createElement("tbody")):a}function qa(a){return a.type=(null!==a.getAttribute("type"))+"/"+a.type,a}function ra(a){var b=na.exec(a.type);return b?a.type=b[1]:a.removeAttribute("type"),a}function sa(a,b){var c,d,e,f,g,h,i,j;if(1===b.nodeType){if(N.hasData(a)&&(f=N.access(a),g=N.set(b,f),j=f.events)){delete g.handle,g.events={};for(e in j)for(c=0,d=j[e].length;d>c;c++)n.event.add(b,e,j[e][c])}O.hasData(a)&&(h=O.access(a),i=n.extend({},h),O.set(b,i))}}function ta(a,b){var c=b.nodeName.toLowerCase();"input"===c&&X.test(a.type)?b.checked=a.checked:"input"!==c&&"textarea"!==c||(b.defaultValue=a.defaultValue)}function ua(a,b,c,d){b=f.apply([],b);var e,g,h,i,j,k,m=0,o=a.length,p=o-1,q=b[0],r=n.isFunction(q);if(r||o>1&&"string"==typeof 
q&&!l.checkClone&&ma.test(q))return a.each(function(e){var f=a.eq(e);r&&(b[0]=q.call(this,e,f.html())),ua(f,b,c,d)});if(o&&(e=ca(b,a[0].ownerDocument,!1,a,d),g=e.firstChild,1===e.childNodes.length&&(e=g),g||d)){for(h=n.map(_(e,"script"),qa),i=h.length;o>m;m++)j=e,m!==p&&(j=n.clone(j,!0,!0),i&&n.merge(h,_(j,"script"))),c.call(a[m],j,m);if(i)for(k=h[h.length-1].ownerDocument,n.map(h,ra),m=0;i>m;m++)j=h[m],Z.test(j.type||"")&&!N.access(j,"globalEval")&&n.contains(k,j)&&(j.src?n._evalUrl&&n._evalUrl(j.src):n.globalEval(j.textContent.replace(oa,"")))}return a}function va(a,b,c){for(var d,e=b?n.filter(b,a):a,f=0;null!=(d=e[f]);f++)c||1!==d.nodeType||n.cleanData(_(d)),d.parentNode&&(c&&n.contains(d.ownerDocument,d)&&aa(_(d,"script")),d.parentNode.removeChild(d));return a}n.extend({htmlPrefilter:function(a){return a.replace(ka,"<$1>")},clone:function(a,b,c){var d,e,f,g,h=a.cloneNode(!0),i=n.contains(a.ownerDocument,a);if(!(l.noCloneChecked||1!==a.nodeType&&11!==a.nodeType||n.isXMLDoc(a)))for(g=_(h),f=_(a),d=0,e=f.length;e>d;d++)ta(f[d],g[d]);if(b)if(c)for(f=f||_(a),g=g||_(h),d=0,e=f.length;e>d;d++)sa(f[d],g[d]);else sa(a,h);return g=_(h,"script"),g.length>0&&aa(g,!i&&_(a,"script")),h},cleanData:function(a){for(var b,c,d,e=n.event.special,f=0;void 0!==(c=a[f]);f++)if(L(c)){if(b=c[N.expando]){if(b.events)for(d in b.events)e[d]?n.event.remove(c,d):n.removeEvent(c,d,b.handle);c[N.expando]=void 0}c[O.expando]&&(c[O.expando]=void 0)}}}),n.fn.extend({domManip:ua,detach:function(a){return va(this,a,!0)},remove:function(a){return va(this,a)},text:function(a){return K(this,function(a){return void 0===a?n.text(this):this.empty().each(function(){1!==this.nodeType&&11!==this.nodeType&&9!==this.nodeType||(this.textContent=a)})},null,a,arguments.length)},append:function(){return ua(this,arguments,function(a){if(1===this.nodeType||11===this.nodeType||9===this.nodeType){var b=pa(this,a);b.appendChild(a)}})},prepend:function(){return 
ua(this,arguments,function(a){if(1===this.nodeType||11===this.nodeType||9===this.nodeType){var b=pa(this,a);b.insertBefore(a,b.firstChild)}})},before:function(){return ua(this,arguments,function(a){this.parentNode&&this.parentNode.insertBefore(a,this)})},after:function(){return ua(this,arguments,function(a){this.parentNode&&this.parentNode.insertBefore(a,this.nextSibling)})},empty:function(){for(var a,b=0;null!=(a=this[b]);b++)1===a.nodeType&&(n.cleanData(_(a,!1)),a.textContent="");return this},clone:function(a,b){return a=null==a?!1:a,b=null==b?a:b,this.map(function(){return n.clone(this,a,b)})},html:function(a){return K(this,function(a){var b=this[0]||{},c=0,d=this.length;if(void 0===a&&1===b.nodeType)return b.innerHTML;if("string"==typeof a&&!la.test(a)&&!$[(Y.exec(a)||["",""])[1].toLowerCase()]){a=n.htmlPrefilter(a);try{for(;d>c;c++)b=this[c]||{},1===b.nodeType&&(n.cleanData(_(b,!1)),b.innerHTML=a);b=0}catch(e){}}b&&this.empty().append(a)},null,a,arguments.length)},replaceWith:function(){var a=[];return ua(this,arguments,function(b){var c=this.parentNode;n.inArray(this,a)<0&&(n.cleanData(_(this)),c&&c.replaceChild(b,this))},a)}}),n.each({appendTo:"append",prependTo:"prepend",insertBefore:"before",insertAfter:"after",replaceAll:"replaceWith"},function(a,b){n.fn[a]=function(a){for(var c,d=[],e=n(a),f=e.length-1,h=0;f>=h;h++)c=h===f?this:this.clone(!0),n(e[h])[b](c),g.apply(d,c.get());return this.pushStack(d)}});var wa,xa={HTML:"block",BODY:"block"};function ya(a,b){var c=n(b.createElement(a)).appendTo(b.body),d=n.css(c[0],"display");return c.detach(),d}function za(a){var b=d,c=xa[a];return c||(c=ya(a,b),"none"!==c&&c||(wa=(wa||n(" - """ - return gr.HTML.update(f'{html_mod}') - -def update_game(inp): - return game_fn(sky=sky) - -def sky_fn(inp): - uid=uuid.uuid4() - rand = random.randint(1,200) - for i in range(rand): - inp+=" " - #uid=uuid.uuid4() - output=proc1.send_it(inp,5,1) - print(output) - outp=Image.open(output[0]) - width, height = outp.size - rat = 
width/height - if width > height: - outrs = outp.resize((int(600*rat),600)) - elif width < height: - outrs = outp.resize((800,int(800*rat))) - else: - outrs = outp.resize((800,536)) - outrs.save(f"{uid}_sky.png") - out = os.path.abspath(f"{uid}_sky.png") - #out = os.path.abspath(outp) - out_url = f'https://omnibus-game-test.hf.space/file={out}' - return outp,out_url,inp,rand - -def star_fn(inp): - uid=uuid.uuid4() - rand = random.randint(1,200) - for i in range(rand): - inp+=" " - #uid=uuid.uuid4() - output=proc2.send_it(inp,5,1) - outp=Image.open(output[0]) - out=rm(outp) - outrs = out.resize((36,36)) - outrs.save(f"{uid}_star.png") - out_file = os.path.abspath(f"{uid}_star.png") - out_url = f'https://omnibus-game-test.hf.space/file={out_file}' - return out,out_url,inp,rand - -def enemy_fn(inp): - uid=uuid.uuid4() - rand = random.randint(1,200) - for i in range(rand): - inp+=" " - #uid=uuid.uuid4() - output=proc2.send_it(inp,5,1) - outp=Image.open(output[0]) - out=rm(outp) - outrs = out.resize((24,24)) - outrs.save(f"{uid}_enemy.png") - out_file = os.path.abspath(f"{uid}_enemy.png") - out_url = f'https://omnibus-game-test.hf.space/file={out_file}' - return out,out_url,inp,rand - -def save_game(name,score,sky_im,star_im,enemy_im,sky_p,star_p,enemy_p,background_s,star_s,enemy_s): - sky_im=sky_im.split("app/",1)[1] - star_im=star_im.split("app/",1)[1] - #sky_im=sky_im.split("app/",1)[1] - print(star_im) - user_repo=save_data.split('datasets/',1)[1].split('/raw',1)[0] - timestamp=str(datetime.datetime.now()) - timename=timestamp.replace(" ","--").replace(":","-").replace(".","-") - if name != "": - if file_exists(f"{user_repo}", f"games/{name}.omnigame",repo_type="dataset"): - return "A game is already saved with this name. Choose a different name." 
- else: - game=name - else: - game=timename - - #out_lod=[] - try: - r = requests.get(f'{save_data}game_data.json') - lod = json.loads(r.text) - #out_lod.append(lod) - except: - lod=[] - pass - try: - api.upload_file( - path_or_fileobj=sky_im, - path_in_repo=f"{save_data.split('main/',1)[1]}/images/background/{game}-background_img.png", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - except Exception as e: - print (e) - pass - try: - api.upload_file( - path_or_fileobj=star_im, - path_in_repo=f"{save_data.split('main/',1)[1]}/images/star/{game}-star_img.png", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - except Exception as e: - print (e) - pass - try: - api.upload_file( - path_or_fileobj=enemy_im.split("app/",1)[1], - path_in_repo=f"{save_data.split('main/',1)[1]}/images/enemy/{game}-enemy_img.png", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - except Exception as e: - print (e) - pass - try: - api.upload_file( - path_or_fileobj="assets/platform.png", - path_in_repo=f"{save_data.split('main/',1)[1]}/images/platform/{game}-platform_img.png", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - except Exception as e: - print (e) - pass - try: - api.upload_file( - path_or_fileobj="assets/dude.png", - path_in_repo=f"{save_data.split('main/',1)[1]}/images/dude/{game}-dude_img.png", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - except Exception as e: - print (e) - pass - block = {'index': len(lod) + 1, - 'timestamp': timestamp, - 'game_name': game, - 'score': score, - 'background_p': sky_p, - 'star_p': star_p, - 'enemy_p': enemy_p, - 'background_s': background_s, - 'star_s': star_s, - 'enemy_s': enemy_s, - 'background_url': 
f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/background/{game}-background_img.png', - 'star_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/star/{game}-star_img.png', - 'enemy_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/enemy/{game}-enemy_img.png', - 'platform_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/platform/{game}-platform_img.png', - 'dude_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/dude/{game}-dude_img.png', - } - print(block) - lod.append(block) - - json_object = json.dumps(lod, indent=4) - with open("tmp1.json", "w") as outfile: - outfile.write(json_object) - try: - api.upload_file( - path_or_fileobj="tmp1.json", - path_in_repo=f"{save_data.split('main/',1)[1]}/game_data.json", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - os.remove("tmp1.json") - print("success") - except Exception as e: - print (e) - return f'Error Saving Game: {e}' - game_box=[] - block2 = {'game_name': game, - 'score':score, - 'timestamp': timestamp, - 'background_p': sky_p, - 'star_p': star_p, - 'enemy_p': enemy_p, - 'background_s': background_s, - 'star_s': star_s, - 'enemy_s': enemy_s, - 'background_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/background/{game}-background_img.png', - 'star_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/star/{game}-star_img.png', - 'enemy_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/enemy/{game}-enemy_img.png', - 'platform_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/platform/{game}-platform_img.png', - 'dude_url': f'https://huggingface.co/datasets/{user_repo}/resolve/main/images/dude/{game}-dude_img.png', - } - print(block) - game_box.append(block2) - - json_object = json.dumps(game_box, indent=4) - with open("tmp2.json", "w") as outfile: - 
outfile.write(json_object) - try: - api.upload_file( - path_or_fileobj="tmp2.json", - path_in_repo=f"{save_data.split('main/',1)[1]}/games/{game}.omnigame", - repo_id=save_data.split('datasets/',1)[1].split('/raw',1)[0], - token=token_self, - repo_type="dataset", - ) - os.remove("tmp2.json") - print("success") - game_url=f'{this_url}?game={game}' - except Exception as e: - print (e) - return f'Error Saving Game: {e}' - return f"Game Saved
URL: {game_url}" -def predict(text, url_params): - load_game=url_params.get('game') - print (f'load_game::{load_game}') - if load_game != None: - try: - r = requests.get(f'{save_data}games/{load_game}.omnigame') - lod = json.loads(r.text) - game_html=game_fn(lod[0]['background_url'],lod[0]['star_url'],lod[0]['enemy_url']) - #out_lod.append(lod) - print (lod) - print (game_html) - return "" + text + "",game_html - except Exception as e: - print(f"cannot load game. Error::{e}") - pass - - else: - pass - return ["" + text + "", ""] - -def get_high_score(): - try: - r = requests.get(f'{save_data}game_data.json') - lod = json.loads(r.text) - high_score=[0,0,0] - sort_score=[0,0,0] - #print(sort_score) - for ea in lod: - try: - if ea['score'] != "": - if int(ea['score']) >= int(sort_score[0]): - sort_score[2]=sort_score[1] - high_score[2]=high_score[1] - sort_score[1]=sort_score[0] - high_score[1]=high_score[0] - sort_score[0]=ea['score'] - high_score[0]=(f"{ea['score']}
{ea['game_name']}

") - pass - elif int(ea['score']) >= int(sort_score[1]): - sort_score[2]=sort_score[1] - high_score[2]=high_score[1] - sort_score[1]=ea['score'] - high_score[1]=(f"{ea['score']}
{ea['game_name']}

") - pass - elif int(ea['score']) >= int(sort_score[2]): - sort_score[2]=ea['score'] - high_score[2]=(f"{ea['score']}
{ea['game_name']}

") - pass - except Exception as e: - print(e) - pass - return f"

High Score


{high_score[0]}
{high_score[1]}
{high_score[2]}
" - except Exception as e: - return e - - -score_js=""" - -function(text_input) { - console.log(text_input); - const iframe = document.getElementById("myIframe").contentWindow.document.getElementById('my_score').innerHTML; - console.log(iframe); - return [iframe]; -} -""" - -with gr.Blocks() as app: - with gr.Row(): - with gr.Column(): - prompt_sky=gr.Textbox(label="Background",value="beautiful landscape, real, 8k") - btn_sky=gr.Button("Make") - out_im_sky=gr.Image(type='filepath') - out_sky_url=gr.Textbox(visible=False) - with gr.Column(): - prompt_star=gr.Textbox(label="Star",value="Colorful Star, blank background") - btn_star=gr.Button("Make") - out_im_star=gr.Image(type='filepath') - out_star_url=gr.Textbox(visible=False) - with gr.Column(): - prompt_enemy=gr.Textbox(label="Enemy",value="Ball on fire, blank background") - btn_enemy=gr.Button("Make") - out_im_enemy=gr.Image(type='filepath') - out_enemy_url=gr.Textbox(visible=False) - gr.Column() - gr.Column() - - with gr.Row(): - update_game=gr.Button("Load Game") - #start_prompt=gr.Textbox(value="beautiful landscape, real, 8k",visible=False) - - with gr.Row(): - with gr.Column(scale=3): - html_game = gr.HTML() - with gr.Column(scale=1): - score_html =gr.HTML() - with gr.Row(): - game_name=gr.Textbox(label="Name for Game (optional)") - save_btn=gr.Button("Save") - message=gr.HTML() - with gr.Row(visible=False): - get_score=gr.Button("Get Score") - score=gr.Textbox() - #url_params=gr.Textbox() - with gr.Row(visible=False): - text_input=gr.Textbox() - url_params = gr.JSON({}, visible=True, label="") - acc=gr.Textbox() - with gr.Row(visible=False): - sky_p=gr.Textbox() - star_p=gr.Textbox() - enemy_p=gr.Textbox() - sky_s=gr.Textbox() - star_s=gr.Textbox() - enemy_s=gr.Textbox() - - get_high_score_btn=gr.Button() - def return_score(text): - return text - def refresh_frame(): - return None - get_high_score_btn.click(get_high_score,None,score_html) - get_score.click(return_score,score,[score],_js=score_js) - 
save_btn.click(return_score,score,[score],_js=score_js).then(save_game,[game_name,score,out_sky_url,out_star_url,out_enemy_url,sky_p,star_p,enemy_p,sky_s,star_s,enemy_s],message) - update_game.click(refresh_frame,None,html_game).then(game_fn,[out_sky_url,out_star_url,out_enemy_url],html_game).then(get_high_score,None,score_html) - - btn_sky.click(sky_fn,prompt_sky,[out_im_sky,out_sky_url,sky_p,sky_s]) - btn_star.click(star_fn,prompt_star,[out_im_star,out_star_url,star_p,star_s]) - btn_enemy.click(enemy_fn,prompt_enemy,[out_im_enemy,out_enemy_url,enemy_p,enemy_s]) - app.load(predict,[text_input,url_params],[text_input,html_game], _js=get_window_url_params) - - #app.load(sky_fn,prompt_sky,[out_im_sky,out_sky_url]).then(game_fn,[out_sky_url],html_game) -app.queue(concurrency_count=10).launch() \ No newline at end of file diff --git a/spaces/OpenDILabCommunity/LLMRiddlesChatGPTEN/llmriddles/llms/chatglm.py b/spaces/OpenDILabCommunity/LLMRiddlesChatGPTEN/llmriddles/llms/chatglm.py deleted file mode 100644 index 120ffd7da32b50747264a0104f88f3360795cb88..0000000000000000000000000000000000000000 --- a/spaces/OpenDILabCommunity/LLMRiddlesChatGPTEN/llmriddles/llms/chatglm.py +++ /dev/null @@ -1,26 +0,0 @@ -import zhipuai -from .base import register_llm - - -def ask_chatglm(message: str, api_key: str): - zhipuai.api_key = api_key - - response = zhipuai.model_api.invoke( - model="chatglm_turbo", - prompt=[{ - "role": "user", - "content": message - }], - top_p=0.7, - temperature=0.9, - ) - - response_msg = response['data']['choices'][0]['content'] - # strip the front and end '"' - if len(response_msg) >= 2: - response_msg = response_msg[1:-1] - - return response_msg - - -register_llm('chatglm', ask_chatglm) diff --git a/spaces/OpenGVLab/InternGPT/iGPT/models/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py b/spaces/OpenGVLab/InternGPT/iGPT/models/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py deleted file mode 100644 index 
29d0ef9102b2db0ffbf723c168aa32d2451b9419..0000000000000000000000000000000000000000 --- a/spaces/OpenGVLab/InternGPT/iGPT/models/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py +++ /dev/null @@ -1,132 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Wrappers around on some nn functions, mainly to support empty tensors. - -Ideally, add support directly in PyTorch to empty tensors in those functions. - -These can be removed once https://github.com/pytorch/pytorch/issues/12013 -is implemented -""" - -from typing import List, Optional -import torch -from torch.nn import functional as F - - -def shapes_to_tensor(x: List[int], device: Optional[torch.device] = None) -> torch.Tensor: - """ - Turn a list of integer scalars or integer Tensor scalars into a vector, - in a way that's both traceable and scriptable. - - In tracing, `x` should be a list of scalar Tensor, so the output can trace to the inputs. - In scripting or eager, `x` should be a list of int. - """ - if torch.jit.is_scripting(): - return torch.as_tensor(x, device=device) - if torch.jit.is_tracing(): - assert all( - [isinstance(t, torch.Tensor) for t in x] - ), "Shape should be tensor during tracing!" - # as_tensor should not be used in tracing because it records a constant - ret = torch.stack(x) - if ret.device != device: # avoid recording a hard-coded device if not necessary - ret = ret.to(device=device) - return ret - return torch.as_tensor(x, device=device) - - -def cat(tensors: List[torch.Tensor], dim: int = 0): - """ - Efficient version of torch.cat that avoids a copy if there is only a single element in a list - """ - assert isinstance(tensors, (list, tuple)) - if len(tensors) == 1: - return tensors[0] - return torch.cat(tensors, dim) - - -def cross_entropy(input, target, *, reduction="mean", **kwargs): - """ - Same as `torch.nn.functional.cross_entropy`, but returns 0 (instead of nan) - for empty inputs. 
- """ - if target.numel() == 0 and reduction == "mean": - return input.sum() * 0.0 # connect the gradient - return F.cross_entropy(input, target, reduction=reduction, **kwargs) - - -class _NewEmptyTensorOp(torch.autograd.Function): - @staticmethod - def forward(ctx, x, new_shape): - ctx.shape = x.shape - return x.new_empty(new_shape) - - @staticmethod - def backward(ctx, grad): - shape = ctx.shape - return _NewEmptyTensorOp.apply(grad, shape), None - - -class Conv2d(torch.nn.Conv2d): - """ - A wrapper around :class:`torch.nn.Conv2d` to support empty inputs and more features. - """ - - def __init__(self, *args, **kwargs): - """ - Extra keyword arguments supported in addition to those in `torch.nn.Conv2d`: - - Args: - norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - - It assumes that norm layer is used before activation. - """ - norm = kwargs.pop("norm", None) - activation = kwargs.pop("activation", None) - super().__init__(*args, **kwargs) - - self.norm = norm - self.activation = activation - - def forward(self, x): - # torchscript does not support SyncBatchNorm yet - # https://github.com/pytorch/pytorch/issues/40507 - # and we skip these codes in torchscript since: - # 1. currently we only support torchscript in evaluation mode - # 2. features needed by exporting module to torchscript are added in PyTorch 1.6 or - # later version, `Conv2d` in these PyTorch versions has already supported empty inputs. - if not torch.jit.is_scripting(): - if x.numel() == 0 and self.training: - # https://github.com/pytorch/pytorch/issues/12013 - assert not isinstance( - self.norm, torch.nn.SyncBatchNorm - ), "SyncBatchNorm does not support empty inputs!" 
- - x = F.conv2d( - x, self.weight, self.bias, self.stride, self.padding, self.dilation, self.groups - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - -ConvTranspose2d = torch.nn.ConvTranspose2d -BatchNorm2d = torch.nn.BatchNorm2d -interpolate = F.interpolate -Linear = torch.nn.Linear - - -def nonzero_tuple(x): - """ - A 'as_tuple=True' version of torch.nonzero to support torchscript. - because of https://github.com/pytorch/pytorch/issues/38718 - """ - if torch.jit.is_scripting(): - if x.dim() == 0: - return x.unsqueeze(0).nonzero().unbind(1) - return x.nonzero().unbind(1) - else: - return x.nonzero(as_tuple=True) diff --git a/spaces/OpenGenAI/open-parti-prompts/README.md b/spaces/OpenGenAI/open-parti-prompts/README.md deleted file mode 100644 index 7c091f925fbe258acf22ca5d5a81c812f323d2dc..0000000000000000000000000000000000000000 --- a/spaces/OpenGenAI/open-parti-prompts/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: Open Parti Prompts -emoji: 🌍 -colorFrom: red -colorTo: gray -sdk: gradio -sdk_version: 3.41.0 -app_file: app.py -pinned: false -duplicated_from: OpenGenAI/open-parti-prompts ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/Ottermad/pet-classifier/app.py b/spaces/Ottermad/pet-classifier/app.py deleted file mode 100644 index e3ed99a81c147cae6640e919ee9476d9a5c74444..0000000000000000000000000000000000000000 --- a/spaces/Ottermad/pet-classifier/app.py +++ /dev/null @@ -1,30 +0,0 @@ -import gradio as gr -from fastai.vision.all import * -import skimage - -def label_func(x): - return x.parent.name - -learn = load_learner('model (2).pkl') - -labels = learn.dls.vocab - -def predict(img): - img = PILImage.create(img) - pred,pred_idx,probs = learn.predict(img) - return {labels[i]: float(probs[i]) for i in range(len(labels))} - -title = "MNIST" -description = "Fast.ai Lesson 2" -interpretation='default' 
-enable_queue=True - -gr.Interface( - fn=predict, - inputs=gr.inputs.Image(shape=(28, 28)), - outputs=gr.outputs.Label(num_top_classes=10), - title=title, - description=description, - interpretation=interpretation, - enable_queue=enable_queue -).launch() diff --git a/spaces/PAIR/Text2Video-Zero/annotator/midas/midas/blocks.py b/spaces/PAIR/Text2Video-Zero/annotator/midas/midas/blocks.py deleted file mode 100644 index 2145d18fa98060a618536d9a64fe6589e9be4f78..0000000000000000000000000000000000000000 --- a/spaces/PAIR/Text2Video-Zero/annotator/midas/midas/blocks.py +++ /dev/null @@ -1,342 +0,0 @@ -import torch -import torch.nn as nn - -from .vit import ( - _make_pretrained_vitb_rn50_384, - _make_pretrained_vitl16_384, - _make_pretrained_vitb16_384, - forward_vit, -) - -def _make_encoder(backbone, features, use_pretrained, groups=1, expand=False, exportable=True, hooks=None, use_vit_only=False, use_readout="ignore",): - if backbone == "vitl16_384": - pretrained = _make_pretrained_vitl16_384( - use_pretrained, hooks=hooks, use_readout=use_readout - ) - scratch = _make_scratch( - [256, 512, 1024, 1024], features, groups=groups, expand=expand - ) # ViT-L/16 - 85.0% Top1 (backbone) - elif backbone == "vitb_rn50_384": - pretrained = _make_pretrained_vitb_rn50_384( - use_pretrained, - hooks=hooks, - use_vit_only=use_vit_only, - use_readout=use_readout, - ) - scratch = _make_scratch( - [256, 512, 768, 768], features, groups=groups, expand=expand - ) # ViT-H/16 - 85.0% Top1 (backbone) - elif backbone == "vitb16_384": - pretrained = _make_pretrained_vitb16_384( - use_pretrained, hooks=hooks, use_readout=use_readout - ) - scratch = _make_scratch( - [96, 192, 384, 768], features, groups=groups, expand=expand - ) # ViT-B/16 - 84.6% Top1 (backbone) - elif backbone == "resnext101_wsl": - pretrained = _make_pretrained_resnext101_wsl(use_pretrained) - scratch = _make_scratch([256, 512, 1024, 2048], features, groups=groups, expand=expand) # efficientnet_lite3 - elif backbone == 
"efficientnet_lite3": - pretrained = _make_pretrained_efficientnet_lite3(use_pretrained, exportable=exportable) - scratch = _make_scratch([32, 48, 136, 384], features, groups=groups, expand=expand) # efficientnet_lite3 - else: - print(f"Backbone '{backbone}' not implemented") - assert False - - return pretrained, scratch - - -def _make_scratch(in_shape, out_shape, groups=1, expand=False): - scratch = nn.Module() - - out_shape1 = out_shape - out_shape2 = out_shape - out_shape3 = out_shape - out_shape4 = out_shape - if expand==True: - out_shape1 = out_shape - out_shape2 = out_shape*2 - out_shape3 = out_shape*4 - out_shape4 = out_shape*8 - - scratch.layer1_rn = nn.Conv2d( - in_shape[0], out_shape1, kernel_size=3, stride=1, padding=1, bias=False, groups=groups - ) - scratch.layer2_rn = nn.Conv2d( - in_shape[1], out_shape2, kernel_size=3, stride=1, padding=1, bias=False, groups=groups - ) - scratch.layer3_rn = nn.Conv2d( - in_shape[2], out_shape3, kernel_size=3, stride=1, padding=1, bias=False, groups=groups - ) - scratch.layer4_rn = nn.Conv2d( - in_shape[3], out_shape4, kernel_size=3, stride=1, padding=1, bias=False, groups=groups - ) - - return scratch - - -def _make_pretrained_efficientnet_lite3(use_pretrained, exportable=False): - efficientnet = torch.hub.load( - "rwightman/gen-efficientnet-pytorch", - "tf_efficientnet_lite3", - pretrained=use_pretrained, - exportable=exportable - ) - return _make_efficientnet_backbone(efficientnet) - - -def _make_efficientnet_backbone(effnet): - pretrained = nn.Module() - - pretrained.layer1 = nn.Sequential( - effnet.conv_stem, effnet.bn1, effnet.act1, *effnet.blocks[0:2] - ) - pretrained.layer2 = nn.Sequential(*effnet.blocks[2:3]) - pretrained.layer3 = nn.Sequential(*effnet.blocks[3:5]) - pretrained.layer4 = nn.Sequential(*effnet.blocks[5:9]) - - return pretrained - - -def _make_resnet_backbone(resnet): - pretrained = nn.Module() - pretrained.layer1 = nn.Sequential( - resnet.conv1, resnet.bn1, resnet.relu, resnet.maxpool, 
resnet.layer1 - ) - - pretrained.layer2 = resnet.layer2 - pretrained.layer3 = resnet.layer3 - pretrained.layer4 = resnet.layer4 - - return pretrained - - -def _make_pretrained_resnext101_wsl(use_pretrained): - resnet = torch.hub.load("facebookresearch/WSL-Images", "resnext101_32x8d_wsl") - return _make_resnet_backbone(resnet) - - - -class Interpolate(nn.Module): - """Interpolation module. - """ - - def __init__(self, scale_factor, mode, align_corners=False): - """Init. - - Args: - scale_factor (float): scaling - mode (str): interpolation mode - """ - super(Interpolate, self).__init__() - - self.interp = nn.functional.interpolate - self.scale_factor = scale_factor - self.mode = mode - self.align_corners = align_corners - - def forward(self, x): - """Forward pass. - - Args: - x (tensor): input - - Returns: - tensor: interpolated data - """ - - x = self.interp( - x, scale_factor=self.scale_factor, mode=self.mode, align_corners=self.align_corners - ) - - return x - - -class ResidualConvUnit(nn.Module): - """Residual convolution module. - """ - - def __init__(self, features): - """Init. - - Args: - features (int): number of features - """ - super().__init__() - - self.conv1 = nn.Conv2d( - features, features, kernel_size=3, stride=1, padding=1, bias=True - ) - - self.conv2 = nn.Conv2d( - features, features, kernel_size=3, stride=1, padding=1, bias=True - ) - - self.relu = nn.ReLU(inplace=True) - - def forward(self, x): - """Forward pass. - - Args: - x (tensor): input - - Returns: - tensor: output - """ - out = self.relu(x) - out = self.conv1(out) - out = self.relu(out) - out = self.conv2(out) - - return out + x - - -class FeatureFusionBlock(nn.Module): - """Feature fusion block. - """ - - def __init__(self, features): - """Init. - - Args: - features (int): number of features - """ - super(FeatureFusionBlock, self).__init__() - - self.resConfUnit1 = ResidualConvUnit(features) - self.resConfUnit2 = ResidualConvUnit(features) - - def forward(self, *xs): - """Forward pass. 
- - Returns: - tensor: output - """ - output = xs[0] - - if len(xs) == 2: - output += self.resConfUnit1(xs[1]) - - output = self.resConfUnit2(output) - - output = nn.functional.interpolate( - output, scale_factor=2, mode="bilinear", align_corners=True - ) - - return output - - - - -class ResidualConvUnit_custom(nn.Module): - """Residual convolution module. - """ - - def __init__(self, features, activation, bn): - """Init. - - Args: - features (int): number of features - """ - super().__init__() - - self.bn = bn - - self.groups=1 - - self.conv1 = nn.Conv2d( - features, features, kernel_size=3, stride=1, padding=1, bias=True, groups=self.groups - ) - - self.conv2 = nn.Conv2d( - features, features, kernel_size=3, stride=1, padding=1, bias=True, groups=self.groups - ) - - if self.bn==True: - self.bn1 = nn.BatchNorm2d(features) - self.bn2 = nn.BatchNorm2d(features) - - self.activation = activation - - self.skip_add = nn.quantized.FloatFunctional() - - def forward(self, x): - """Forward pass. - - Args: - x (tensor): input - - Returns: - tensor: output - """ - - out = self.activation(x) - out = self.conv1(out) - if self.bn==True: - out = self.bn1(out) - - out = self.activation(out) - out = self.conv2(out) - if self.bn==True: - out = self.bn2(out) - - if self.groups > 1: - out = self.conv_merge(out) - - return self.skip_add.add(out, x) - - # return out + x - - -class FeatureFusionBlock_custom(nn.Module): - """Feature fusion block. - """ - - def __init__(self, features, activation, deconv=False, bn=False, expand=False, align_corners=True): - """Init. 
- - Args: - features (int): number of features - """ - super(FeatureFusionBlock_custom, self).__init__() - - self.deconv = deconv - self.align_corners = align_corners - - self.groups=1 - - self.expand = expand - out_features = features - if self.expand==True: - out_features = features//2 - - self.out_conv = nn.Conv2d(features, out_features, kernel_size=1, stride=1, padding=0, bias=True, groups=1) - - self.resConfUnit1 = ResidualConvUnit_custom(features, activation, bn) - self.resConfUnit2 = ResidualConvUnit_custom(features, activation, bn) - - self.skip_add = nn.quantized.FloatFunctional() - - def forward(self, *xs): - """Forward pass. - - Returns: - tensor: output - """ - output = xs[0] - - if len(xs) == 2: - res = self.resConfUnit1(xs[1]) - output = self.skip_add.add(output, res) - # output += res - - output = self.resConfUnit2(output) - - output = nn.functional.interpolate( - output, scale_factor=2, mode="bilinear", align_corners=self.align_corners - ) - - output = self.out_conv(output) - - return output - diff --git a/spaces/PAIR/Text2Video-Zero/annotator/uniformer/mmcv/image/photometric.py b/spaces/PAIR/Text2Video-Zero/annotator/uniformer/mmcv/image/photometric.py deleted file mode 100644 index 5085d012019c0cbf56f66f421a378278c1a058ae..0000000000000000000000000000000000000000 --- a/spaces/PAIR/Text2Video-Zero/annotator/uniformer/mmcv/image/photometric.py +++ /dev/null @@ -1,428 +0,0 @@ -# Copyright (c) OpenMMLab. All rights reserved. -import cv2 -import numpy as np - -from ..utils import is_tuple_of -from .colorspace import bgr2gray, gray2bgr - - -def imnormalize(img, mean, std, to_rgb=True): - """Normalize an image with mean and std. - - Args: - img (ndarray): Image to be normalized. - mean (ndarray): The mean to be used for normalize. - std (ndarray): The std to be used for normalize. - to_rgb (bool): Whether to convert to rgb. - - Returns: - ndarray: The normalized image. 
- """ - img = img.copy().astype(np.float32) - return imnormalize_(img, mean, std, to_rgb) - - -def imnormalize_(img, mean, std, to_rgb=True): - """Inplace normalize an image with mean and std. - - Args: - img (ndarray): Image to be normalized. - mean (ndarray): The mean to be used for normalize. - std (ndarray): The std to be used for normalize. - to_rgb (bool): Whether to convert to rgb. - - Returns: - ndarray: The normalized image. - """ - # cv2 inplace normalization does not accept uint8 - assert img.dtype != np.uint8 - mean = np.float64(mean.reshape(1, -1)) - stdinv = 1 / np.float64(std.reshape(1, -1)) - if to_rgb: - cv2.cvtColor(img, cv2.COLOR_BGR2RGB, img) # inplace - cv2.subtract(img, mean, img) # inplace - cv2.multiply(img, stdinv, img) # inplace - return img - - -def imdenormalize(img, mean, std, to_bgr=True): - assert img.dtype != np.uint8 - mean = mean.reshape(1, -1).astype(np.float64) - std = std.reshape(1, -1).astype(np.float64) - img = cv2.multiply(img, std) # make a copy - cv2.add(img, mean, img) # inplace - if to_bgr: - cv2.cvtColor(img, cv2.COLOR_RGB2BGR, img) # inplace - return img - - -def iminvert(img): - """Invert (negate) an image. - - Args: - img (ndarray): Image to be inverted. - - Returns: - ndarray: The inverted image. - """ - return np.full_like(img, 255) - img - - -def solarize(img, thr=128): - """Solarize an image (invert all pixel values above a threshold) - - Args: - img (ndarray): Image to be solarized. - thr (int): Threshold for solarizing (0 - 255). - - Returns: - ndarray: The solarized image. - """ - img = np.where(img < thr, img, 255 - img) - return img - - -def posterize(img, bits): - """Posterize an image (reduce the number of bits for each color channel) - - Args: - img (ndarray): Image to be posterized. - bits (int): Number of bits (1 to 8) to use for posterizing. - - Returns: - ndarray: The posterized image. 
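The `iminvert` and `solarize` helpers in the removed `photometric.py` above are simple per-pixel maps. A dependency-free sketch of the solarize rule on a single 0-255 value (the function name is ours, not mmcv's; the real code applies the same rule array-wide with `np.where`):

```python
def solarize_pixel(v, thr=128):
    # Same rule as the removed mmcv solarize, applied to one 0-255 value:
    # values at or above the threshold are inverted, the rest pass through.
    return v if v < thr else 255 - v

row = [0, 100, 200, 255]
print([solarize_pixel(v) for v in row])  # [0, 100, 55, 0]
```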
- """ - shift = 8 - bits - img = np.left_shift(np.right_shift(img, shift), shift) - return img - - -def adjust_color(img, alpha=1, beta=None, gamma=0): - r"""It blends the source image and its gray image: - - .. math:: - output = img * alpha + gray\_img * beta + gamma - - Args: - img (ndarray): The input source image. - alpha (int | float): Weight for the source image. Default 1. - beta (int | float): Weight for the converted gray image. - If None, it's assigned the value (1 - `alpha`). - gamma (int | float): Scalar added to each sum. - Same as :func:`cv2.addWeighted`. Default 0. - - Returns: - ndarray: Colored image which has the same size and dtype as input. - """ - gray_img = bgr2gray(img) - gray_img = np.tile(gray_img[..., None], [1, 1, 3]) - if beta is None: - beta = 1 - alpha - colored_img = cv2.addWeighted(img, alpha, gray_img, beta, gamma) - if not colored_img.dtype == np.uint8: - # Note when the dtype of `img` is not the default `np.uint8` - # (e.g. np.float32), the value in `colored_img` got from cv2 - # is not guaranteed to be in range [0, 255], so here clip - # is needed. - colored_img = np.clip(colored_img, 0, 255) - return colored_img - - -def imequalize(img): - """Equalize the image histogram. - - This function applies a non-linear mapping to the input image, - in order to create a uniform distribution of grayscale values - in the output image. - - Args: - img (ndarray): Image to be equalized. - - Returns: - ndarray: The equalized image. - """ - - def _scale_channel(im, c): - """Scale the data in the corresponding channel.""" - im = im[:, :, c] - # Compute the histogram of the image channel. - histo = np.histogram(im, 256, (0, 255))[0] - # For computing the step, filter out the nonzeros. - nonzero_histo = histo[histo > 0] - step = (np.sum(nonzero_histo) - nonzero_histo[-1]) // 255 - if not step: - lut = np.array(range(256)) - else: - # Compute the cumulative sum, shifted by step // 2 - # and then normalized by step. 
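The removed `posterize` keeps only the top `bits` bits of each channel value via a right-then-left shift. A minimal single-value sketch of that bit trick (helper name is ours; mmcv applies it array-wide with `np.right_shift`/`np.left_shift`):

```python
def posterize_pixel(v, bits):
    # Zero out the low (8 - bits) bits, as in the removed mmcv posterize:
    # the right shift drops them, the left shift restores the magnitude.
    shift = 8 - bits
    return (v >> shift) << shift

print(posterize_pixel(173, 3))  # 160: 0b10101101 -> 0b10100000
```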
- lut = (np.cumsum(histo) + (step // 2)) // step - # Shift lut, prepending with 0. - lut = np.concatenate([[0], lut[:-1]], 0) - # handle potential integer overflow - lut[lut > 255] = 255 - # If step is zero, return the original image. - # Otherwise, index from lut. - return np.where(np.equal(step, 0), im, lut[im]) - - # Scales each channel independently and then stacks - # the result. - s1 = _scale_channel(img, 0) - s2 = _scale_channel(img, 1) - s3 = _scale_channel(img, 2) - equalized_img = np.stack([s1, s2, s3], axis=-1) - return equalized_img.astype(img.dtype) - - -def adjust_brightness(img, factor=1.): - """Adjust image brightness. - - This function controls the brightness of an image. An - enhancement factor of 0.0 gives a black image. - A factor of 1.0 gives the original image. This function - blends the source image and the degenerated black image: - - .. math:: - output = img * factor + degenerated * (1 - factor) - - Args: - img (ndarray): Image to be brightened. - factor (float): A value controls the enhancement. - Factor 1.0 returns the original image, lower - factors mean less color (brightness, contrast, - etc), and higher values more. Default 1. - - Returns: - ndarray: The brightened image. - """ - degenerated = np.zeros_like(img) - # Note manually convert the dtype to np.float32, to - # achieve as close results as PIL.ImageEnhance.Brightness. - # Set beta=1-factor, and gamma=0 - brightened_img = cv2.addWeighted( - img.astype(np.float32), factor, degenerated.astype(np.float32), - 1 - factor, 0) - brightened_img = np.clip(brightened_img, 0, 255) - return brightened_img.astype(img.dtype) - - -def adjust_contrast(img, factor=1.): - """Adjust image contrast. - - This function controls the contrast of an image. An - enhancement factor of 0.0 gives a solid grey - image. A factor of 1.0 gives the original image. It - blends the source image and the degenerated mean image: - - .. 
math:: - output = img * factor + degenerated * (1 - factor) - - Args: - img (ndarray): Image to be contrasted. BGR order. - factor (float): Same as :func:`mmcv.adjust_brightness`. - - Returns: - ndarray: The contrasted image. - """ - gray_img = bgr2gray(img) - hist = np.histogram(gray_img, 256, (0, 255))[0] - mean = round(np.sum(gray_img) / np.sum(hist)) - degenerated = (np.ones_like(img[..., 0]) * mean).astype(img.dtype) - degenerated = gray2bgr(degenerated) - contrasted_img = cv2.addWeighted( - img.astype(np.float32), factor, degenerated.astype(np.float32), - 1 - factor, 0) - contrasted_img = np.clip(contrasted_img, 0, 255) - return contrasted_img.astype(img.dtype) - - -def auto_contrast(img, cutoff=0): - """Auto adjust image contrast. - - This function maximize (normalize) image contrast by first removing cutoff - percent of the lightest and darkest pixels from the histogram and remapping - the image so that the darkest pixel becomes black (0), and the lightest - becomes white (255). - - Args: - img (ndarray): Image to be contrasted. BGR order. - cutoff (int | float | tuple): The cutoff percent of the lightest and - darkest pixels to be removed. If given as tuple, it shall be - (low, high). Otherwise, the single value will be used for both. - Defaults to 0. - - Returns: - ndarray: The contrasted image. - """ - - def _auto_contrast_channel(im, c, cutoff): - im = im[:, :, c] - # Compute the histogram of the image channel. 
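The brightness, contrast, and (later) sharpness adjustments above all share the same core: blend the source with a "degenerated" image using `output = img * factor + degenerated * (1 - factor)`, then clip to [0, 255]. A single-pixel sketch of that blend (the helper name and example mean value 120 are ours, for illustration only):

```python
def blend_pixel(v, degenerated, factor):
    # The weighted blend underlying adjust_brightness / adjust_contrast /
    # adjust_sharpness above, clipped back to the 0-255 range.
    out = v * factor + degenerated * (1 - factor)
    return max(0, min(255, round(out)))

# Brightness: the degenerated image is black (0).
print(blend_pixel(200, 0, 0.5))    # 100
# Contrast: the degenerated image is the mean gray level (say 120).
print(blend_pixel(200, 120, 1.5))  # 240
```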
- histo = np.histogram(im, 256, (0, 255))[0] - # Remove cut-off percent pixels from histo - histo_sum = np.cumsum(histo) - cut_low = histo_sum[-1] * cutoff[0] // 100 - cut_high = histo_sum[-1] - histo_sum[-1] * cutoff[1] // 100 - histo_sum = np.clip(histo_sum, cut_low, cut_high) - cut_low - histo = np.concatenate([[histo_sum[0]], np.diff(histo_sum)], 0) - - # Compute mapping - low, high = np.nonzero(histo)[0][0], np.nonzero(histo)[0][-1] - # If all the values have been cut off, return the origin img - if low >= high: - return im - scale = 255.0 / (high - low) - offset = -low * scale - lut = np.array(range(256)) - lut = lut * scale + offset - lut = np.clip(lut, 0, 255) - return lut[im] - - if isinstance(cutoff, (int, float)): - cutoff = (cutoff, cutoff) - else: - assert isinstance(cutoff, tuple), 'cutoff must be of type int, ' \ - f'float or tuple, but got {type(cutoff)} instead.' - # Auto adjusts contrast for each channel independently and then stacks - # the result. - s1 = _auto_contrast_channel(img, 0, cutoff) - s2 = _auto_contrast_channel(img, 1, cutoff) - s3 = _auto_contrast_channel(img, 2, cutoff) - contrasted_img = np.stack([s1, s2, s3], axis=-1) - return contrasted_img.astype(img.dtype) - - -def adjust_sharpness(img, factor=1., kernel=None): - """Adjust image sharpness. - - This function controls the sharpness of an image. An - enhancement factor of 0.0 gives a blurred image. A - factor of 1.0 gives the original image. And a factor - of 2.0 gives a sharpened image. It blends the source - image and the degenerated mean image: - - .. math:: - output = img * factor + degenerated * (1 - factor) - - Args: - img (ndarray): Image to be sharpened. BGR order. - factor (float): Same as :func:`mmcv.adjust_brightness`. - kernel (np.ndarray, optional): Filter kernel to be applied on the img - to obtain the degenerated img. Defaults to None. - - Note: - No value sanity check is enforced on the kernel set by users. 
So with - an inappropriate kernel, the ``adjust_sharpness`` may fail to perform - the function its name indicates but end up performing whatever - transform determined by the kernel. - - Returns: - ndarray: The sharpened image. - """ - - if kernel is None: - # adopted from PIL.ImageFilter.SMOOTH - kernel = np.array([[1., 1., 1.], [1., 5., 1.], [1., 1., 1.]]) / 13 - assert isinstance(kernel, np.ndarray), \ - f'kernel must be of type np.ndarray, but got {type(kernel)} instead.' - assert kernel.ndim == 2, \ - f'kernel must have a dimension of 2, but got {kernel.ndim} instead.' - - degenerated = cv2.filter2D(img, -1, kernel) - sharpened_img = cv2.addWeighted( - img.astype(np.float32), factor, degenerated.astype(np.float32), - 1 - factor, 0) - sharpened_img = np.clip(sharpened_img, 0, 255) - return sharpened_img.astype(img.dtype) - - -def adjust_lighting(img, eigval, eigvec, alphastd=0.1, to_rgb=True): - """AlexNet-style PCA jitter. - - This data augmentation is proposed in `ImageNet Classification with Deep - Convolutional Neural Networks - `_. - - Args: - img (ndarray): Image to be adjusted lighting. BGR order. - eigval (ndarray): the eigenvalue of the convariance matrix of pixel - values, respectively. - eigvec (ndarray): the eigenvector of the convariance matrix of pixel - values, respectively. - alphastd (float): The standard deviation for distribution of alpha. - Defaults to 0.1 - to_rgb (bool): Whether to convert img to rgb. - - Returns: - ndarray: The adjusted image. - """ - assert isinstance(eigval, np.ndarray) and isinstance(eigvec, np.ndarray), \ - f'eigval and eigvec should both be of type np.ndarray, got ' \ - f'{type(eigval)} and {type(eigvec)} instead.' - - assert eigval.ndim == 1 and eigvec.ndim == 2 - assert eigvec.shape == (3, eigval.shape[0]) - n_eigval = eigval.shape[0] - assert isinstance(alphastd, float), 'alphastd should be of type float, ' \ - f'got {type(alphastd)} instead.' 
- - img = img.copy().astype(np.float32) - if to_rgb: - cv2.cvtColor(img, cv2.COLOR_BGR2RGB, img) # inplace - - alpha = np.random.normal(0, alphastd, n_eigval) - alter = eigvec \ - * np.broadcast_to(alpha.reshape(1, n_eigval), (3, n_eigval)) \ - * np.broadcast_to(eigval.reshape(1, n_eigval), (3, n_eigval)) - alter = np.broadcast_to(alter.sum(axis=1).reshape(1, 1, 3), img.shape) - img_adjusted = img + alter - return img_adjusted - - -def lut_transform(img, lut_table): - """Transform array by look-up table. - - The function lut_transform fills the output array with values from the - look-up table. Indices of the entries are taken from the input array. - - Args: - img (ndarray): Image to be transformed. - lut_table (ndarray): look-up table of 256 elements; in case of - multi-channel input array, the table should either have a single - channel (in this case the same table is used for all channels) or - the same number of channels as in the input array. - - Returns: - ndarray: The transformed image. - """ - assert isinstance(img, np.ndarray) - assert 0 <= np.min(img) and np.max(img) <= 255 - assert isinstance(lut_table, np.ndarray) - assert lut_table.shape == (256, ) - - return cv2.LUT(np.array(img, dtype=np.uint8), lut_table) - - -def clahe(img, clip_limit=40.0, tile_grid_size=(8, 8)): - """Use CLAHE method to process the image. - - See `ZUIDERVELD,K. Contrast Limited Adaptive Histogram Equalization[J]. - Graphics Gems, 1994:474-485.` for more information. - - Args: - img (ndarray): Image to be processed. - clip_limit (float): Threshold for contrast limiting. Default: 40.0. - tile_grid_size (tuple[int]): Size of grid for histogram equalization. - Input image will be divided into equally sized rectangular tiles. - It defines the number of tiles in row and column. Default: (8, 8). - - Returns: - ndarray: The processed image. 
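The removed `lut_transform` delegates to `cv2.LUT`, which simply uses each 0-255 pixel value as an index into a 256-entry table. A dependency-free sketch of that lookup on one row (function name is ours):

```python
def lut_apply(img_row, lut):
    # What cv2.LUT does per pixel in the removed lut_transform:
    # each 0-255 value indexes into a 256-entry lookup table.
    assert len(lut) == 256
    return [lut[v] for v in img_row]

invert_lut = [255 - i for i in range(256)]
print(lut_apply([0, 64, 255], invert_lut))  # [255, 191, 0]
```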
- """ - assert isinstance(img, np.ndarray) - assert img.ndim == 2 - assert isinstance(clip_limit, (float, int)) - assert is_tuple_of(tile_grid_size, int) - assert len(tile_grid_size) == 2 - - clahe = cv2.createCLAHE(clip_limit, tile_grid_size) - return clahe.apply(np.array(img, dtype=np.uint8)) diff --git a/spaces/PKUWilliamYang/StyleGANEX/models/mtcnn/mtcnn_pytorch/__init__.py b/spaces/PKUWilliamYang/StyleGANEX/models/mtcnn/mtcnn_pytorch/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/spaces/PKUWilliamYang/VToonify/README.md b/spaces/PKUWilliamYang/VToonify/README.md deleted file mode 100644 index c87bb7c1d005b52bb76a72e807a048ef3b1ec10a..0000000000000000000000000000000000000000 --- a/spaces/PKUWilliamYang/VToonify/README.md +++ /dev/null @@ -1,10 +0,0 @@ ---- -title: VToonify -sdk: gradio -emoji: 🐨 -colorFrom: yellow -colorTo: pink -sdk_version: 3.44.3 -app_file: app.py -pinned: false ---- \ No newline at end of file diff --git a/spaces/PeepDaSlan9/candle-llama2/app-a6ac60c1ad530067.js b/spaces/PeepDaSlan9/candle-llama2/app-a6ac60c1ad530067.js deleted file mode 100644 index 766adf96a6462d369446bcacd0593712389d4cf7..0000000000000000000000000000000000000000 --- a/spaces/PeepDaSlan9/candle-llama2/app-a6ac60c1ad530067.js +++ /dev/null @@ -1,829 +0,0 @@ -let wasm; - -const heap = new Array(128).fill(undefined); - -heap.push(undefined, null, true, false); - -function getObject(idx) { return heap[idx]; } - -let heap_next = heap.length; - -function dropObject(idx) { - if (idx < 132) return; - heap[idx] = heap_next; - heap_next = idx; -} - -function takeObject(idx) { - const ret = getObject(idx); - dropObject(idx); - return ret; -} - -function addHeapObject(obj) { - if (heap_next === heap.length) heap.push(heap.length + 1); - const idx = heap_next; - heap_next = heap[idx]; - - heap[idx] = obj; - return idx; -} - -const cachedTextDecoder = (typeof TextDecoder !== 'undefined' ? 
new TextDecoder('utf-8', { ignoreBOM: true, fatal: true }) : { decode: () => { throw Error('TextDecoder not available') } } ); - -if (typeof TextDecoder !== 'undefined') { cachedTextDecoder.decode(); }; - -let cachedUint8Memory0 = null; - -function getUint8Memory0() { - if (cachedUint8Memory0 === null || cachedUint8Memory0.byteLength === 0) { - cachedUint8Memory0 = new Uint8Array(wasm.memory.buffer); - } - return cachedUint8Memory0; -} - -function getStringFromWasm0(ptr, len) { - ptr = ptr >>> 0; - return cachedTextDecoder.decode(getUint8Memory0().subarray(ptr, ptr + len)); -} - -function debugString(val) { - // primitive types - const type = typeof val; - if (type == 'number' || type == 'boolean' || val == null) { - return `${val}`; - } - if (type == 'string') { - return `"${val}"`; - } - if (type == 'symbol') { - const description = val.description; - if (description == null) { - return 'Symbol'; - } else { - return `Symbol(${description})`; - } - } - if (type == 'function') { - const name = val.name; - if (typeof name == 'string' && name.length > 0) { - return `Function(${name})`; - } else { - return 'Function'; - } - } - // objects - if (Array.isArray(val)) { - const length = val.length; - let debug = '['; - if (length > 0) { - debug += debugString(val[0]); - } - for(let i = 1; i < length; i++) { - debug += ', ' + debugString(val[i]); - } - debug += ']'; - return debug; - } - // Test for built-in - const builtInMatches = /\[object ([^\]]+)\]/.exec(toString.call(val)); - let className; - if (builtInMatches.length > 1) { - className = builtInMatches[1]; - } else { - // Failed to match the standard '[object ClassName]' - return toString.call(val); - } - if (className == 'Object') { - // we're a user defined class or Object - // JSON.stringify avoids problems with cycles, and is generally much - // easier than looping through ownProperties of `val`. 
- try { - return 'Object(' + JSON.stringify(val) + ')'; - } catch (_) { - return 'Object'; - } - } - // errors - if (val instanceof Error) { - return `${val.name}: ${val.message}\n${val.stack}`; - } - // TODO we could test for more things here, like `Set`s and `Map`s. - return className; -} - -let WASM_VECTOR_LEN = 0; - -const cachedTextEncoder = (typeof TextEncoder !== 'undefined' ? new TextEncoder('utf-8') : { encode: () => { throw Error('TextEncoder not available') } } ); - -const encodeString = (typeof cachedTextEncoder.encodeInto === 'function' - ? function (arg, view) { - return cachedTextEncoder.encodeInto(arg, view); -} - : function (arg, view) { - const buf = cachedTextEncoder.encode(arg); - view.set(buf); - return { - read: arg.length, - written: buf.length - }; -}); - -function passStringToWasm0(arg, malloc, realloc) { - - if (realloc === undefined) { - const buf = cachedTextEncoder.encode(arg); - const ptr = malloc(buf.length, 1) >>> 0; - getUint8Memory0().subarray(ptr, ptr + buf.length).set(buf); - WASM_VECTOR_LEN = buf.length; - return ptr; - } - - let len = arg.length; - let ptr = malloc(len, 1) >>> 0; - - const mem = getUint8Memory0(); - - let offset = 0; - - for (; offset < len; offset++) { - const code = arg.charCodeAt(offset); - if (code > 0x7F) break; - mem[ptr + offset] = code; - } - - if (offset !== len) { - if (offset !== 0) { - arg = arg.slice(offset); - } - ptr = realloc(ptr, len, len = offset + arg.length * 3, 1) >>> 0; - const view = getUint8Memory0().subarray(ptr + offset, ptr + len); - const ret = encodeString(arg, view); - - offset += ret.written; - } - - WASM_VECTOR_LEN = offset; - return ptr; -} - -let cachedInt32Memory0 = null; - -function getInt32Memory0() { - if (cachedInt32Memory0 === null || cachedInt32Memory0.byteLength === 0) { - cachedInt32Memory0 = new Int32Array(wasm.memory.buffer); - } - return cachedInt32Memory0; -} - -function makeClosure(arg0, arg1, dtor, f) { - const state = { a: arg0, b: arg1, cnt: 1, dtor }; - const 
real = (...args) => { - // First up with a closure we increment the internal reference - // count. This ensures that the Rust closure environment won't - // be deallocated while we're invoking it. - state.cnt++; - try { - return f(state.a, state.b, ...args); - } finally { - if (--state.cnt === 0) { - wasm.__wbindgen_export_2.get(state.dtor)(state.a, state.b); - state.a = 0; - - } - } - }; - real.original = state; - - return real; -} -function __wbg_adapter_18(arg0, arg1, arg2) { - wasm.wasm_bindgen__convert__closures__invoke1__hee69d633833ebd7c(arg0, arg1, addHeapObject(arg2)); -} - -function makeMutClosure(arg0, arg1, dtor, f) { - const state = { a: arg0, b: arg1, cnt: 1, dtor }; - const real = (...args) => { - // First up with a closure we increment the internal reference - // count. This ensures that the Rust closure environment won't - // be deallocated while we're invoking it. - state.cnt++; - const a = state.a; - state.a = 0; - try { - return f(a, state.b, ...args); - } finally { - if (--state.cnt === 0) { - wasm.__wbindgen_export_2.get(state.dtor)(a, state.b); - - } else { - state.a = a; - } - } - }; - real.original = state; - - return real; -} - -let stack_pointer = 128; - -function addBorrowedObject(obj) { - if (stack_pointer == 1) throw new Error('out of js stack'); - heap[--stack_pointer] = obj; - return stack_pointer; -} -function __wbg_adapter_21(arg0, arg1, arg2) { - try { - wasm._dyn_core__ops__function__FnMut___A____Output___R_as_wasm_bindgen__closure__WasmClosure___describe__invoke__hadab26222cba6f84(arg0, arg1, addBorrowedObject(arg2)); - } finally { - heap[stack_pointer++] = undefined; - } -} - -function __wbg_adapter_24(arg0, arg1, arg2) { - wasm._dyn_core__ops__function__FnMut__A____Output___R_as_wasm_bindgen__closure__WasmClosure___describe__invoke__hfc3f0e78cf729c36(arg0, arg1, addHeapObject(arg2)); -} - -function isLikeNone(x) { - return x === undefined || x === null; -} - -let cachedUint32Memory0 = null; - -function getUint32Memory0() { - 
if (cachedUint32Memory0 === null || cachedUint32Memory0.byteLength === 0) { - cachedUint32Memory0 = new Uint32Array(wasm.memory.buffer); - } - return cachedUint32Memory0; -} - -function getArrayJsValueFromWasm0(ptr, len) { - ptr = ptr >>> 0; - const mem = getUint32Memory0(); - const slice = mem.subarray(ptr / 4, ptr / 4 + len); - const result = []; - for (let i = 0; i < slice.length; i++) { - result.push(takeObject(slice[i])); - } - return result; -} - -function handleError(f, args) { - try { - return f.apply(this, args); - } catch (e) { - wasm.__wbindgen_exn_store(addHeapObject(e)); - } -} - -async function __wbg_load(module, imports) { - if (typeof Response === 'function' && module instanceof Response) { - if (typeof WebAssembly.instantiateStreaming === 'function') { - try { - return await WebAssembly.instantiateStreaming(module, imports); - - } catch (e) { - if (module.headers.get('Content-Type') != 'application/wasm') { - console.warn("`WebAssembly.instantiateStreaming` failed because your server does not serve wasm with `application/wasm` MIME type. Falling back to `WebAssembly.instantiate` which is slower. 
Original error:\n", e); - - } else { - throw e; - } - } - } - - const bytes = await module.arrayBuffer(); - return await WebAssembly.instantiate(bytes, imports); - - } else { - const instance = await WebAssembly.instantiate(module, imports); - - if (instance instanceof WebAssembly.Instance) { - return { instance, module }; - - } else { - return instance; - } - } -} - -function __wbg_get_imports() { - const imports = {}; - imports.wbg = {}; - imports.wbg.__wbindgen_object_drop_ref = function(arg0) { - takeObject(arg0); - }; - imports.wbg.__wbindgen_object_clone_ref = function(arg0) { - const ret = getObject(arg0); - return addHeapObject(ret); - }; - imports.wbg.__wbg_log_3af90b48c052f90b = function(arg0, arg1) { - console.log(getStringFromWasm0(arg0, arg1)); - }; - imports.wbg.__wbindgen_cb_drop = function(arg0) { - const obj = takeObject(arg0).original; - if (obj.cnt-- == 1) { - obj.a = 0; - return true; - } - const ret = false; - return ret; - }; - imports.wbg.__wbindgen_string_new = function(arg0, arg1) { - const ret = getStringFromWasm0(arg0, arg1); - return addHeapObject(ret); - }; - imports.wbg.__wbg_listenerid_12315eee21527820 = function(arg0, arg1) { - const ret = getObject(arg1).__yew_listener_id; - getInt32Memory0()[arg0 / 4 + 1] = isLikeNone(ret) ? 0 : ret; - getInt32Memory0()[arg0 / 4 + 0] = !isLikeNone(ret); - }; - imports.wbg.__wbg_setlistenerid_3183aae8fa5840fb = function(arg0, arg1) { - getObject(arg0).__yew_listener_id = arg1 >>> 0; - }; - imports.wbg.__wbg_setsubtreeid_d32e6327eef1f7fc = function(arg0, arg1) { - getObject(arg0).__yew_subtree_id = arg1 >>> 0; - }; - imports.wbg.__wbg_subtreeid_e348577f7ef777e3 = function(arg0, arg1) { - const ret = getObject(arg1).__yew_subtree_id; - getInt32Memory0()[arg0 / 4 + 1] = isLikeNone(ret) ? 
0 : ret; - getInt32Memory0()[arg0 / 4 + 0] = !isLikeNone(ret); - }; - imports.wbg.__wbg_cachekey_b61393159c57fd7b = function(arg0, arg1) { - const ret = getObject(arg1).__yew_subtree_cache_key; - getInt32Memory0()[arg0 / 4 + 1] = isLikeNone(ret) ? 0 : ret; - getInt32Memory0()[arg0 / 4 + 0] = !isLikeNone(ret); - }; - imports.wbg.__wbg_setcachekey_80183b7cfc421143 = function(arg0, arg1) { - getObject(arg0).__yew_subtree_cache_key = arg1 >>> 0; - }; - imports.wbg.__wbg_error_71d6845bf00a930f = function(arg0, arg1) { - var v0 = getArrayJsValueFromWasm0(arg0, arg1).slice(); - wasm.__wbindgen_free(arg0, arg1 * 4); - console.error(...v0); - }; - imports.wbg.__wbg_warn_0b90a269a514ae1d = function(arg0, arg1) { - var v0 = getArrayJsValueFromWasm0(arg0, arg1).slice(); - wasm.__wbindgen_free(arg0, arg1 * 4); - console.warn(...v0); - }; - imports.wbg.__wbg_new_abda76e883ba8a5f = function() { - const ret = new Error(); - return addHeapObject(ret); - }; - imports.wbg.__wbg_stack_658279fe44541cf6 = function(arg0, arg1) { - const ret = getObject(arg1).stack; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_error_f851667af71bcfc6 = function(arg0, arg1) { - let deferred0_0; - let deferred0_1; - try { - deferred0_0 = arg0; - deferred0_1 = arg1; - console.error(getStringFromWasm0(arg0, arg1)); - } finally { - wasm.__wbindgen_free(deferred0_0, deferred0_1, 1); - } - }; - imports.wbg.__wbg_location_7ac41949b772ef21 = function(arg0) { - const ret = getObject(arg0).location; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_body_674aec4c1c0910cd = function(arg0) { - const ret = getObject(arg0).body; - return isLikeNone(ret) ? 
0 : addHeapObject(ret); - }; - imports.wbg.__wbg_createElement_4891554b28d3388b = function() { return handleError(function (arg0, arg1, arg2) { - const ret = getObject(arg0).createElement(getStringFromWasm0(arg1, arg2)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_createElementNS_119acf9e82482041 = function() { return handleError(function (arg0, arg1, arg2, arg3, arg4) { - const ret = getObject(arg0).createElementNS(arg1 === 0 ? undefined : getStringFromWasm0(arg1, arg2), getStringFromWasm0(arg3, arg4)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_createTextNode_2fd22cd7e543f938 = function(arg0, arg1, arg2) { - const ret = getObject(arg0).createTextNode(getStringFromWasm0(arg1, arg2)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_instanceof_Window_9029196b662bc42a = function(arg0) { - let result; - try { - result = getObject(arg0) instanceof Window; - } catch { - result = false; - } - const ret = result; - return ret; - }; - imports.wbg.__wbg_document_f7ace2b956f30a4f = function(arg0) { - const ret = getObject(arg0).document; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_location_56243dba507f472d = function(arg0) { - const ret = getObject(arg0).location; - return addHeapObject(ret); - }; - imports.wbg.__wbg_performance_2c295061c8b01e0b = function(arg0) { - const ret = getObject(arg0).performance; - return isLikeNone(ret) ? 
0 : addHeapObject(ret); - }; - imports.wbg.__wbg_fetch_336b6f0cb426b46e = function(arg0, arg1) { - const ret = getObject(arg0).fetch(getObject(arg1)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_setchecked_e5a50baea447b8a8 = function(arg0, arg1) { - getObject(arg0).checked = arg1 !== 0; - }; - imports.wbg.__wbg_value_9423da9d988ee8cf = function(arg0, arg1) { - const ret = getObject(arg1).value; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_setvalue_1f95e61cbc382f7f = function(arg0, arg1, arg2) { - getObject(arg0).value = getStringFromWasm0(arg1, arg2); - }; - imports.wbg.__wbg_newwithstrandinit_cad5cd6038c7ff5d = function() { return handleError(function (arg0, arg1, arg2) { - const ret = new Request(getStringFromWasm0(arg0, arg1), getObject(arg2)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_setonmessage_f0bd0280573b7084 = function(arg0, arg1) { - getObject(arg0).onmessage = getObject(arg1); - }; - imports.wbg.__wbg_new_8e7322f46d5d019c = function() { return handleError(function (arg0, arg1) { - const ret = new Worker(getStringFromWasm0(arg0, arg1)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_newwithoptions_1bd20b45061ed935 = function() { return handleError(function (arg0, arg1, arg2) { - const ret = new Worker(getStringFromWasm0(arg0, arg1), getObject(arg2)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_postMessage_8c609e2bde333d9c = function() { return handleError(function (arg0, arg1) { - getObject(arg0).postMessage(getObject(arg1)); - }, arguments) }; - imports.wbg.__wbg_instanceof_Response_fc4327dbfcdf5ced = function(arg0) { - let result; - try { - result = getObject(arg0) instanceof Response; - } catch { - result = false; - } - const ret = result; - return ret; - }; - 
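Several of the import shims above hand strings back to Rust as a (pointer, length) pair produced by `passStringToWasm0`, which encodes UTF-8 into wasm linear memory; `getStringFromWasm0` decodes the same span on the way out. A pure-Python sketch of that scheme — the `Memory` class with its toy bump allocator is a stand-in for the wasm buffer, not part of the real glue:

```python
class Memory:
    """Stand-in for wasm linear memory: a flat byte buffer plus a bump allocator."""
    def __init__(self, size=1024):
        self.buf = bytearray(size)
        self.next = 0

    def malloc(self, n):
        ptr = self.next
        self.next += n
        return ptr

def pass_string(mem, s):
    # Mirrors passStringToWasm0's fast path: encode to UTF-8,
    # copy into linear memory, return (ptr, byte_length).
    data = s.encode("utf-8")
    ptr = mem.malloc(len(data))
    mem.buf[ptr:ptr + len(data)] = data
    return ptr, len(data)

def get_string(mem, ptr, length):
    # Mirrors getStringFromWasm0: decode the byte span back to text.
    return mem.buf[ptr:ptr + length].decode("utf-8")

mem = Memory()
ptr, n = pass_string(mem, "héllo")
print(n)                        # 6: "é" takes two UTF-8 bytes
print(get_string(mem, ptr, n))  # héllo
```

Note the byte length, not the character count, is what crosses the boundary — the real glue also exposes it through the `WASM_VECTOR_LEN` global.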
imports.wbg.__wbg_blob_34990e4300d45f53 = function() { return handleError(function (arg0) { - const ret = getObject(arg0).blob(); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_now_0cfdc90c97d0c24b = function(arg0) { - const ret = getObject(arg0).now(); - return ret; - }; - imports.wbg.__wbg_debug_9b8701f894da9929 = function(arg0, arg1, arg2, arg3) { - console.debug(getObject(arg0), getObject(arg1), getObject(arg2), getObject(arg3)); - }; - imports.wbg.__wbg_error_788ae33f81d3b84b = function(arg0) { - console.error(getObject(arg0)); - }; - imports.wbg.__wbg_error_d9bce418caafb712 = function(arg0, arg1, arg2, arg3) { - console.error(getObject(arg0), getObject(arg1), getObject(arg2), getObject(arg3)); - }; - imports.wbg.__wbg_info_bb52f40b06f679de = function(arg0, arg1, arg2, arg3) { - console.info(getObject(arg0), getObject(arg1), getObject(arg2), getObject(arg3)); - }; - imports.wbg.__wbg_log_ea7093e35e3efd07 = function(arg0, arg1, arg2, arg3) { - console.log(getObject(arg0), getObject(arg1), getObject(arg2), getObject(arg3)); - }; - imports.wbg.__wbg_warn_dfc0e0cf544a13bd = function(arg0, arg1, arg2, arg3) { - console.warn(getObject(arg0), getObject(arg1), getObject(arg2), getObject(arg3)); - }; - imports.wbg.__wbg_instanceof_Element_4622f5da1249a3eb = function(arg0) { - let result; - try { - result = getObject(arg0) instanceof Element; - } catch { - result = false; - } - const ret = result; - return ret; - }; - imports.wbg.__wbg_namespaceURI_31718ed49b5343a3 = function(arg0, arg1) { - const ret = getObject(arg1).namespaceURI; - var ptr1 = isLikeNone(ret) ? 
0 : passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - var len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_setinnerHTML_b089587252408b67 = function(arg0, arg1, arg2) { - getObject(arg0).innerHTML = getStringFromWasm0(arg1, arg2); - }; - imports.wbg.__wbg_outerHTML_f7749ceff37b5832 = function(arg0, arg1) { - const ret = getObject(arg1).outerHTML; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_children_27ed308801b57d3f = function(arg0) { - const ret = getObject(arg0).children; - return addHeapObject(ret); - }; - imports.wbg.__wbg_removeAttribute_d8404da431968808 = function() { return handleError(function (arg0, arg1, arg2) { - getObject(arg0).removeAttribute(getStringFromWasm0(arg1, arg2)); - }, arguments) }; - imports.wbg.__wbg_setAttribute_e7e80b478b7b8b2f = function() { return handleError(function (arg0, arg1, arg2, arg3, arg4) { - getObject(arg0).setAttribute(getStringFromWasm0(arg1, arg2), getStringFromWasm0(arg3, arg4)); - }, arguments) }; - imports.wbg.__wbg_origin_50aa482fa6784a0a = function() { return handleError(function (arg0, arg1) { - const ret = getObject(arg1).origin; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }, arguments) }; - imports.wbg.__wbg_pathname_c8fd5c498079312d = function() { return handleError(function (arg0, arg1) { - const ret = getObject(arg1).pathname; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }, arguments) }; - 
imports.wbg.__wbg_data_ab99ae4a2e1e8bc9 = function(arg0) { - const ret = getObject(arg0).data; - return addHeapObject(ret); - }; - imports.wbg.__wbg_target_f171e89c61e2bccf = function(arg0) { - const ret = getObject(arg0).target; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_bubbles_63572b91f3885ef1 = function(arg0) { - const ret = getObject(arg0).bubbles; - return ret; - }; - imports.wbg.__wbg_cancelBubble_90d1c3aa2a76cbeb = function(arg0) { - const ret = getObject(arg0).cancelBubble; - return ret; - }; - imports.wbg.__wbg_composedPath_cf1bb5b8bcff496f = function(arg0) { - const ret = getObject(arg0).composedPath(); - return addHeapObject(ret); - }; - imports.wbg.__wbg_parentNode_9e53f8b17eb98c9d = function(arg0) { - const ret = getObject(arg0).parentNode; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_parentElement_c75962bc9997ea5f = function(arg0) { - const ret = getObject(arg0).parentElement; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_lastChild_0cee692010bac6c2 = function(arg0) { - const ret = getObject(arg0).lastChild; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_nextSibling_304d9aac7c2774ae = function(arg0) { - const ret = getObject(arg0).nextSibling; - return isLikeNone(ret) ? 0 : addHeapObject(ret); - }; - imports.wbg.__wbg_setnodeValue_d1c8382910b45e04 = function(arg0, arg1, arg2) { - getObject(arg0).nodeValue = arg1 === 0 ? undefined : getStringFromWasm0(arg1, arg2); - }; - imports.wbg.__wbg_textContent_c5d9e21ee03c63d4 = function(arg0, arg1) { - const ret = getObject(arg1).textContent; - var ptr1 = isLikeNone(ret) ? 
0 : passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - var len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_appendChild_51339d4cde00ee22 = function() { return handleError(function (arg0, arg1) { - const ret = getObject(arg0).appendChild(getObject(arg1)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_insertBefore_ffa01d4b747c95fc = function() { return handleError(function (arg0, arg1, arg2) { - const ret = getObject(arg0).insertBefore(getObject(arg1), getObject(arg2)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_removeChild_973429f368206138 = function() { return handleError(function (arg0, arg1) { - const ret = getObject(arg0).removeChild(getObject(arg1)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_createObjectURL_d82f2880bada6a1d = function() { return handleError(function (arg0, arg1) { - const ret = URL.createObjectURL(getObject(arg1)); - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }, arguments) }; - imports.wbg.__wbg_newwithstrsequenceandoptions_fd88a547f6d15707 = function() { return handleError(function (arg0, arg1) { - const ret = new Blob(getObject(arg0), getObject(arg1)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_arrayBuffer_27cefaea55cbf063 = function(arg0) { - const ret = getObject(arg0).arrayBuffer(); - return addHeapObject(ret); - }; - imports.wbg.__wbg_addEventListener_a5963e26cd7b176b = function() { return handleError(function (arg0, arg1, arg2, arg3, arg4) { - getObject(arg0).addEventListener(getStringFromWasm0(arg1, arg2), getObject(arg3), getObject(arg4)); - }, arguments) }; - imports.wbg.__wbg_instanceof_ShadowRoot_b64337370f59fe2d = function(arg0) { - let result; - try { - result = 
getObject(arg0) instanceof ShadowRoot; - } catch { - result = false; - } - const ret = result; - return ret; - }; - imports.wbg.__wbg_host_e1c47c33975060d3 = function(arg0) { - const ret = getObject(arg0).host; - return addHeapObject(ret); - }; - imports.wbg.__wbg_value_3c5f08ffc2b7d6f9 = function(arg0, arg1) { - const ret = getObject(arg1).value; - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbg_setvalue_0dc100d4b9908028 = function(arg0, arg1, arg2) { - getObject(arg0).value = getStringFromWasm0(arg1, arg2); - }; - imports.wbg.__wbg_get_44be0491f933a435 = function(arg0, arg1) { - const ret = getObject(arg0)[arg1 >>> 0]; - return addHeapObject(ret); - }; - imports.wbg.__wbg_length_fff51ee6522a1a18 = function(arg0) { - const ret = getObject(arg0).length; - return ret; - }; - imports.wbg.__wbg_new_898a68150f225f2e = function() { - const ret = new Array(); - return addHeapObject(ret); - }; - imports.wbg.__wbg_newnoargs_581967eacc0e2604 = function(arg0, arg1) { - const ret = new Function(getStringFromWasm0(arg0, arg1)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_call_cb65541d95d71282 = function() { return handleError(function (arg0, arg1) { - const ret = getObject(arg0).call(getObject(arg1)); - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_new_b51585de1b234aff = function() { - const ret = new Object(); - return addHeapObject(ret); - }; - imports.wbg.__wbg_self_1ff1d729e9aae938 = function() { return handleError(function () { - const ret = self.self; - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_window_5f4faef6c12b79ec = function() { return handleError(function () { - const ret = window.window; - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_globalThis_1d39714405582d3c = function() { return handleError(function () { - const 
ret = globalThis.globalThis; - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbg_global_651f05c6a0944d1c = function() { return handleError(function () { - const ret = global.global; - return addHeapObject(ret); - }, arguments) }; - imports.wbg.__wbindgen_is_undefined = function(arg0) { - const ret = getObject(arg0) === undefined; - return ret; - }; - imports.wbg.__wbg_from_d7c216d4616bb368 = function(arg0) { - const ret = Array.from(getObject(arg0)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_push_ca1c26067ef907ac = function(arg0, arg1) { - const ret = getObject(arg0).push(getObject(arg1)); - return ret; - }; - imports.wbg.__wbg_is_205d914af04a8faa = function(arg0, arg1) { - const ret = Object.is(getObject(arg0), getObject(arg1)); - return ret; - }; - imports.wbg.__wbg_resolve_53698b95aaf7fcf8 = function(arg0) { - const ret = Promise.resolve(getObject(arg0)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_then_f7e06ee3c11698eb = function(arg0, arg1) { - const ret = getObject(arg0).then(getObject(arg1)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_then_b2267541e2a73865 = function(arg0, arg1, arg2) { - const ret = getObject(arg0).then(getObject(arg1), getObject(arg2)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_buffer_085ec1f694018c4f = function(arg0) { - const ret = getObject(arg0).buffer; - return addHeapObject(ret); - }; - imports.wbg.__wbg_newwithbyteoffsetandlength_6da8e527659b86aa = function(arg0, arg1, arg2) { - const ret = new Uint8Array(getObject(arg0), arg1 >>> 0, arg2 >>> 0); - return addHeapObject(ret); - }; - imports.wbg.__wbg_new_8125e318e6245eed = function(arg0) { - const ret = new Uint8Array(getObject(arg0)); - return addHeapObject(ret); - }; - imports.wbg.__wbg_set_5cf90238115182c3 = function(arg0, arg1, arg2) { - getObject(arg0).set(getObject(arg1), arg2 >>> 0); - }; - imports.wbg.__wbg_length_72e2208bbc0efc61 = function(arg0) { - const ret = getObject(arg0).length; - return ret; - }; - 
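Note the recurring idiom in the string-returning shims above (`__wbg_value_*`, `__wbg_textContent_*`, `__wbindgen_debug_string`): instead of returning the string directly, they copy it into linear memory and write a `(ptr, len)` pair of i32s at a return-area address supplied in `arg0` — that is what `getInt32Memory0()[arg0 / 4 + 0] = ptr1` and `[arg0 / 4 + 1] = len1` do. A condensed sketch of the convention, with a `bytearray` standing in for Wasm linear memory (all names here are illustrative stand-ins, not the real helpers):

```python
import struct

MEMORY = bytearray(1024)   # stand-in for Wasm linear memory
_alloc_cursor = 64         # trivial bump allocator for the sketch

def pass_string_to_wasm(s: str) -> tuple[int, int]:
    """Copy a UTF-8 string into 'linear memory' (cf. passStringToWasm0)."""
    global _alloc_cursor
    data = s.encode("utf-8")
    ptr = _alloc_cursor
    MEMORY[ptr:ptr + len(data)] = data
    _alloc_cursor += len(data)
    return ptr, len(data)

def write_string_return(ret_area: int, s: str) -> None:
    """Host side: write the (ptr, len) i32 pair at the return area."""
    ptr, length = pass_string_to_wasm(s)
    struct.pack_into("<ii", MEMORY, ret_area, ptr, length)

def read_string_return(ret_area: int) -> str:
    """Wasm side: read the pair back and decode the bytes."""
    ptr, length = struct.unpack_from("<ii", MEMORY, ret_area)
    return MEMORY[ptr:ptr + length].decode("utf-8")

write_string_return(16, "hello")
```

The `/ 4` in the JS comes from indexing an `Int32Array` view over the same byte buffer, so `arg0` is a byte address and `arg0 / 4` its i32 index.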
imports.wbg.__wbg_set_092e06b0f9d71865 = function() { return handleError(function (arg0, arg1, arg2) { - const ret = Reflect.set(getObject(arg0), getObject(arg1), getObject(arg2)); - return ret; - }, arguments) }; - imports.wbg.__wbindgen_debug_string = function(arg0, arg1) { - const ret = debugString(getObject(arg1)); - const ptr1 = passStringToWasm0(ret, wasm.__wbindgen_malloc, wasm.__wbindgen_realloc); - const len1 = WASM_VECTOR_LEN; - getInt32Memory0()[arg0 / 4 + 1] = len1; - getInt32Memory0()[arg0 / 4 + 0] = ptr1; - }; - imports.wbg.__wbindgen_throw = function(arg0, arg1) { - throw new Error(getStringFromWasm0(arg0, arg1)); - }; - imports.wbg.__wbindgen_memory = function() { - const ret = wasm.memory; - return addHeapObject(ret); - }; - imports.wbg.__wbindgen_closure_wrapper182 = function(arg0, arg1, arg2) { - const ret = makeClosure(arg0, arg1, 41, __wbg_adapter_18); - return addHeapObject(ret); - }; - imports.wbg.__wbindgen_closure_wrapper400 = function(arg0, arg1, arg2) { - const ret = makeMutClosure(arg0, arg1, 136, __wbg_adapter_21); - return addHeapObject(ret); - }; - imports.wbg.__wbindgen_closure_wrapper679 = function(arg0, arg1, arg2) { - const ret = makeMutClosure(arg0, arg1, 244, __wbg_adapter_24); - return addHeapObject(ret); - }; - - return imports; -} - -function __wbg_init_memory(imports, maybe_memory) { - -} - -function __wbg_finalize_init(instance, module) { - wasm = instance.exports; - __wbg_init.__wbindgen_wasm_module = module; - cachedInt32Memory0 = null; - cachedUint32Memory0 = null; - cachedUint8Memory0 = null; - - wasm.__wbindgen_start(); - return wasm; -} - -function initSync(module) { - if (wasm !== undefined) return wasm; - - const imports = __wbg_get_imports(); - - __wbg_init_memory(imports); - - if (!(module instanceof WebAssembly.Module)) { - module = new WebAssembly.Module(module); - } - - const instance = new WebAssembly.Instance(module, imports); - - return __wbg_finalize_init(instance, module); -} - -async function 
__wbg_init(input) { - if (wasm !== undefined) return wasm; - - if (typeof input === 'undefined') { - input = new URL('app-a6ac60c1ad530067_bg.wasm', import.meta.url); - } - const imports = __wbg_get_imports(); - - if (typeof input === 'string' || (typeof Request === 'function' && input instanceof Request) || (typeof URL === 'function' && input instanceof URL)) { - input = fetch(input); - } - - __wbg_init_memory(imports); - - const { instance, module } = await __wbg_load(await input, imports); - - return __wbg_finalize_init(instance, module); -} - -export { initSync } -export default __wbg_init; diff --git a/spaces/Pie31415/control-animation/annotator/uniformer/mmcv/ops/furthest_point_sample.py b/spaces/Pie31415/control-animation/annotator/uniformer/mmcv/ops/furthest_point_sample.py deleted file mode 100644 index 374b7a878f1972c183941af28ba1df216ac1a60f..0000000000000000000000000000000000000000 --- a/spaces/Pie31415/control-animation/annotator/uniformer/mmcv/ops/furthest_point_sample.py +++ /dev/null @@ -1,83 +0,0 @@ -import torch -from torch.autograd import Function - -from ..utils import ext_loader - -ext_module = ext_loader.load_ext('_ext', [ - 'furthest_point_sampling_forward', - 'furthest_point_sampling_with_dist_forward' -]) - - -class FurthestPointSampling(Function): - """Uses iterative furthest point sampling to select a set of features whose - corresponding points have the furthest distance.""" - - @staticmethod - def forward(ctx, points_xyz: torch.Tensor, - num_points: int) -> torch.Tensor: - """ - Args: - points_xyz (Tensor): (B, N, 3) where N > num_points. - num_points (int): Number of points in the sampled set. - - Returns: - Tensor: (B, num_points) indices of the sampled points. 
- """ - assert points_xyz.is_contiguous() - - B, N = points_xyz.size()[:2] - output = torch.cuda.IntTensor(B, num_points) - temp = torch.cuda.FloatTensor(B, N).fill_(1e10) - - ext_module.furthest_point_sampling_forward( - points_xyz, - temp, - output, - b=B, - n=N, - m=num_points, - ) - if torch.__version__ != 'parrots': - ctx.mark_non_differentiable(output) - return output - - @staticmethod - def backward(xyz, a=None): - return None, None - - -class FurthestPointSamplingWithDist(Function): - """Uses iterative furthest point sampling to select a set of features whose - corresponding points have the furthest distance.""" - - @staticmethod - def forward(ctx, points_dist: torch.Tensor, - num_points: int) -> torch.Tensor: - """ - Args: - points_dist (Tensor): (B, N, N) Distance between each point pair. - num_points (int): Number of points in the sampled set. - - Returns: - Tensor: (B, num_points) indices of the sampled points. - """ - assert points_dist.is_contiguous() - - B, N, _ = points_dist.size() - output = points_dist.new_zeros([B, num_points], dtype=torch.int32) - temp = points_dist.new_zeros([B, N]).fill_(1e10) - - ext_module.furthest_point_sampling_with_dist_forward( - points_dist, temp, output, b=B, n=N, m=num_points) - if torch.__version__ != 'parrots': - ctx.mark_non_differentiable(output) - return output - - @staticmethod - def backward(xyz, a=None): - return None, None - - -furthest_point_sample = FurthestPointSampling.apply -furthest_point_sample_with_dist = FurthestPointSamplingWithDist.apply diff --git a/spaces/PunGrumpy/text-generation/Dockerfile b/spaces/PunGrumpy/text-generation/Dockerfile deleted file mode 100644 index 14271315789e0c87a912ea3cc090345eb9d90595..0000000000000000000000000000000000000000 --- a/spaces/PunGrumpy/text-generation/Dockerfile +++ /dev/null @@ -1,28 +0,0 @@ -# Use the official Python 3.9 image -FROM python:3.9 - -# Set the working directory to /code -WORKDIR /code - -# Copy the current directory contents into the container at 
/code -COPY ./requirements.txt /code/requirements.txt - -# Install requirements.txt -RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt - -# Set up a new user named "user" with user ID 1000 -RUN useradd -m -u 1000 user -# Switch to the "user" user -USER user -# Set home to the user's home directory -ENV HOME=/home/user \ - PATH=/home/user/.local/bin:$PATH - -# Set the working directory to the user's home directory -WORKDIR $HOME/app - -# Copy the current directory contents into the container at $HOME/app setting the owner to the user -COPY --chown=user . $HOME/app - -# Start the FastAPI app on port 7860, the default port expected by Spaces -CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"] diff --git a/spaces/Raspberry-ai/main/.env/lib/python3.11/site-packages/pip/_vendor/urllib3/util/ssl_match_hostname.py b/spaces/Raspberry-ai/main/.env/lib/python3.11/site-packages/pip/_vendor/urllib3/util/ssl_match_hostname.py deleted file mode 100644 index 1dd950c489607d06ecc5218292a1b55558b47be8..0000000000000000000000000000000000000000 --- a/spaces/Raspberry-ai/main/.env/lib/python3.11/site-packages/pip/_vendor/urllib3/util/ssl_match_hostname.py +++ /dev/null @@ -1,159 +0,0 @@ -"""The match_hostname() function from Python 3.3.3, essential when using SSL.""" - -# Note: This file is under the PSF license as the code comes from the python -# stdlib. http://docs.python.org/3/license.html - -import re -import sys - -# ipaddress has been backported to 2.6+ in pypi. If it is installed on the -# system, use it to handle IPAddress ServerAltnames (this was added in -# python-3.5) otherwise only do DNS matching. This allows -# util.ssl_match_hostname to continue to be used in Python 2.7. 
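The left-most-label wildcard rule that `_dnsname_match` below builds its regex around can be condensed to a short sketch (the helper name and simplifications are mine; the real function also rejects patterns with more than one wildcard and raises `CertificateError`):

```python
import re

# Condensed RFC 6125 section 6.4.3 matching, as in _dnsname_match below:
# '*' is only honoured in the left-most label, and not when that label
# (or the hostname) is an IDNA "xn--" A-label.
def wildcard_matches(pattern: str, hostname: str) -> bool:
    parts = pattern.split(".")
    leftmost, remainder = parts[0], parts[1:]
    if "*" not in leftmost:
        return pattern.lower() == hostname.lower()   # fast path, no wildcard
    if leftmost == "*":
        frag = "[^.]+"                               # bare '*': one non-empty label
    elif leftmost.startswith("xn--") or hostname.startswith("xn--"):
        frag = re.escape(leftmost)                   # wildcard not honoured in A-labels
    else:
        frag = re.escape(leftmost).replace(r"\*", "[^.]*")
    pats = [frag] + [re.escape(p) for p in remainder]
    return re.match(r"\A" + r"\.".join(pats) + r"\Z",
                    hostname, re.IGNORECASE) is not None
```

Because `[^.]+` cannot match a dot, `*.example.com` matches `www.example.com` but not `a.b.example.com` — the property the `pats` construction in the real code is designed to guarantee.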
-try: - import ipaddress -except ImportError: - ipaddress = None - -__version__ = "3.5.0.1" - - -class CertificateError(ValueError): - pass - - -def _dnsname_match(dn, hostname, max_wildcards=1): - """Matching according to RFC 6125, section 6.4.3 - - http://tools.ietf.org/html/rfc6125#section-6.4.3 - """ - pats = [] - if not dn: - return False - - # Ported from python3-syntax: - # leftmost, *remainder = dn.split(r'.') - parts = dn.split(r".") - leftmost = parts[0] - remainder = parts[1:] - - wildcards = leftmost.count("*") - if wildcards > max_wildcards: - # Issue #17980: avoid denials of service by refusing more - # than one wildcard per fragment. A survey of established - # policy among SSL implementations showed it to be a - # reasonable choice. - raise CertificateError( - "too many wildcards in certificate DNS name: " + repr(dn) - ) - - # speed up common case w/o wildcards - if not wildcards: - return dn.lower() == hostname.lower() - - # RFC 6125, section 6.4.3, subitem 1. - # The client SHOULD NOT attempt to match a presented identifier in which - # the wildcard character comprises a label other than the left-most label. - if leftmost == "*": - # When '*' is a fragment by itself, it matches a non-empty dotless - # fragment. - pats.append("[^.]+") - elif leftmost.startswith("xn--") or hostname.startswith("xn--"): - # RFC 6125, section 6.4.3, subitem 3. - # The client SHOULD NOT attempt to match a presented identifier - # where the wildcard character is embedded within an A-label or - # U-label of an internationalized domain name. - pats.append(re.escape(leftmost)) - else: - # Otherwise, '*' matches any dotless string, e.g. 
www* - pats.append(re.escape(leftmost).replace(r"\*", "[^.]*")) - - # add the remaining fragments, ignore any wildcards - for frag in remainder: - pats.append(re.escape(frag)) - - pat = re.compile(r"\A" + r"\.".join(pats) + r"\Z", re.IGNORECASE) - return pat.match(hostname) - - -def _to_unicode(obj): - if isinstance(obj, str) and sys.version_info < (3,): - # ignored flake8 # F821 to support python 2.7 function - obj = unicode(obj, encoding="ascii", errors="strict") # noqa: F821 - return obj - - -def _ipaddress_match(ipname, host_ip): - """Exact matching of IP addresses. - - RFC 6125 explicitly doesn't define an algorithm for this - (section 1.7.2 - "Out of Scope"). - """ - # OpenSSL may add a trailing newline to a subjectAltName's IP address - # Divergence from upstream: ipaddress can't handle byte str - ip = ipaddress.ip_address(_to_unicode(ipname).rstrip()) - return ip == host_ip - - -def match_hostname(cert, hostname): - """Verify that *cert* (in decoded format as returned by - SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 - rules are followed, but IP addresses are not accepted for *hostname*. - - CertificateError is raised on failure. On success, the function - returns nothing. - """ - if not cert: - raise ValueError( - "empty or no certificate, match_hostname needs a " - "SSL socket or SSL context with either " - "CERT_OPTIONAL or CERT_REQUIRED" - ) - try: - # Divergence from upstream: ipaddress can't handle byte str - host_ip = ipaddress.ip_address(_to_unicode(hostname)) - except (UnicodeError, ValueError): - # ValueError: Not an IP address (common case) - # UnicodeError: Divergence from upstream: Have to deal with ipaddress not taking - # byte strings. 
addresses should be all ascii, so we consider it not - # an ipaddress in this case - host_ip = None - except AttributeError: - # Divergence from upstream: Make ipaddress library optional - if ipaddress is None: - host_ip = None - else: # Defensive - raise - dnsnames = [] - san = cert.get("subjectAltName", ()) - for key, value in san: - if key == "DNS": - if host_ip is None and _dnsname_match(value, hostname): - return - dnsnames.append(value) - elif key == "IP Address": - if host_ip is not None and _ipaddress_match(value, host_ip): - return - dnsnames.append(value) - if not dnsnames: - # The subject is only checked when there is no dNSName entry - # in subjectAltName - for sub in cert.get("subject", ()): - for key, value in sub: - # XXX according to RFC 2818, the most specific Common Name - # must be used. - if key == "commonName": - if _dnsname_match(value, hostname): - return - dnsnames.append(value) - if len(dnsnames) > 1: - raise CertificateError( - "hostname %r " - "doesn't match either of %s" % (hostname, ", ".join(map(repr, dnsnames))) - ) - elif len(dnsnames) == 1: - raise CertificateError("hostname %r doesn't match %r" % (hostname, dnsnames[0])) - else: - raise CertificateError( - "no appropriate commonName or subjectAltName fields were found" - ) diff --git a/spaces/Ricecake123/RVC-demo/lib/uvr5_pack/lib_v5/nets_537238KB.py b/spaces/Ricecake123/RVC-demo/lib/uvr5_pack/lib_v5/nets_537238KB.py deleted file mode 100644 index a1bb530e006482704f234c2e739a695174142941..0000000000000000000000000000000000000000 --- a/spaces/Ricecake123/RVC-demo/lib/uvr5_pack/lib_v5/nets_537238KB.py +++ /dev/null @@ -1,123 +0,0 @@ -import torch -import numpy as np -from torch import nn -import torch.nn.functional as F - -from . 
import layers_537238KB as layers - - -class BaseASPPNet(nn.Module): - def __init__(self, nin, ch, dilations=(4, 8, 16)): - super(BaseASPPNet, self).__init__() - self.enc1 = layers.Encoder(nin, ch, 3, 2, 1) - self.enc2 = layers.Encoder(ch, ch * 2, 3, 2, 1) - self.enc3 = layers.Encoder(ch * 2, ch * 4, 3, 2, 1) - self.enc4 = layers.Encoder(ch * 4, ch * 8, 3, 2, 1) - - self.aspp = layers.ASPPModule(ch * 8, ch * 16, dilations) - - self.dec4 = layers.Decoder(ch * (8 + 16), ch * 8, 3, 1, 1) - self.dec3 = layers.Decoder(ch * (4 + 8), ch * 4, 3, 1, 1) - self.dec2 = layers.Decoder(ch * (2 + 4), ch * 2, 3, 1, 1) - self.dec1 = layers.Decoder(ch * (1 + 2), ch, 3, 1, 1) - - def __call__(self, x): - h, e1 = self.enc1(x) - h, e2 = self.enc2(h) - h, e3 = self.enc3(h) - h, e4 = self.enc4(h) - - h = self.aspp(h) - - h = self.dec4(h, e4) - h = self.dec3(h, e3) - h = self.dec2(h, e2) - h = self.dec1(h, e1) - - return h - - -class CascadedASPPNet(nn.Module): - def __init__(self, n_fft): - super(CascadedASPPNet, self).__init__() - self.stg1_low_band_net = BaseASPPNet(2, 64) - self.stg1_high_band_net = BaseASPPNet(2, 64) - - self.stg2_bridge = layers.Conv2DBNActiv(66, 32, 1, 1, 0) - self.stg2_full_band_net = BaseASPPNet(32, 64) - - self.stg3_bridge = layers.Conv2DBNActiv(130, 64, 1, 1, 0) - self.stg3_full_band_net = BaseASPPNet(64, 128) - - self.out = nn.Conv2d(128, 2, 1, bias=False) - self.aux1_out = nn.Conv2d(64, 2, 1, bias=False) - self.aux2_out = nn.Conv2d(64, 2, 1, bias=False) - - self.max_bin = n_fft // 2 - self.output_bin = n_fft // 2 + 1 - - self.offset = 128 - - def forward(self, x, aggressiveness=None): - mix = x.detach() - x = x.clone() - - x = x[:, :, : self.max_bin] - - bandw = x.size()[2] // 2 - aux1 = torch.cat( - [ - self.stg1_low_band_net(x[:, :, :bandw]), - self.stg1_high_band_net(x[:, :, bandw:]), - ], - dim=2, - ) - - h = torch.cat([x, aux1], dim=1) - aux2 = self.stg2_full_band_net(self.stg2_bridge(h)) - - h = torch.cat([x, aux1, aux2], dim=1) - h = 
self.stg3_full_band_net(self.stg3_bridge(h)) - - mask = torch.sigmoid(self.out(h)) - mask = F.pad( - input=mask, - pad=(0, 0, 0, self.output_bin - mask.size()[2]), - mode="replicate", - ) - - if self.training: - aux1 = torch.sigmoid(self.aux1_out(aux1)) - aux1 = F.pad( - input=aux1, - pad=(0, 0, 0, self.output_bin - aux1.size()[2]), - mode="replicate", - ) - aux2 = torch.sigmoid(self.aux2_out(aux2)) - aux2 = F.pad( - input=aux2, - pad=(0, 0, 0, self.output_bin - aux2.size()[2]), - mode="replicate", - ) - return mask * mix, aux1 * mix, aux2 * mix - else: - if aggressiveness: - mask[:, :, : aggressiveness["split_bin"]] = torch.pow( - mask[:, :, : aggressiveness["split_bin"]], - 1 + aggressiveness["value"] / 3, - ) - mask[:, :, aggressiveness["split_bin"] :] = torch.pow( - mask[:, :, aggressiveness["split_bin"] :], - 1 + aggressiveness["value"], - ) - - return mask * mix - - def predict(self, x_mag, aggressiveness=None): - h = self.forward(x_mag, aggressiveness) - - if self.offset > 0: - h = h[:, :, :, self.offset : -self.offset] - assert h.size()[3] > 0 - - return h diff --git a/spaces/Rongjiehuang/GenerSpeech/data_gen/tts/emotion/params_model.py b/spaces/Rongjiehuang/GenerSpeech/data_gen/tts/emotion/params_model.py deleted file mode 100644 index 48f8e564e772649e3207c7a90bff1bee9e6b3a47..0000000000000000000000000000000000000000 --- a/spaces/Rongjiehuang/GenerSpeech/data_gen/tts/emotion/params_model.py +++ /dev/null @@ -1,11 +0,0 @@ - -## Model parameters -model_hidden_size = 256 -model_embedding_size = 256 -model_num_layers = 3 - - -## Training parameters -learning_rate_init = 1e-4 -speakers_per_batch = 6 -utterances_per_speaker = 20 diff --git a/spaces/Rongjiehuang/ProDiff/utils/ddp_utils.py b/spaces/Rongjiehuang/ProDiff/utils/ddp_utils.py deleted file mode 100644 index 4b529198c13a1ffc622baea6e5178407b24aee8f..0000000000000000000000000000000000000000 --- a/spaces/Rongjiehuang/ProDiff/utils/ddp_utils.py +++ /dev/null @@ -1,137 +0,0 @@ -from torch.nn.parallel import 
DistributedDataParallel -from torch.nn.parallel.distributed import _find_tensors -import torch.optim -import torch.utils.data -import torch -from packaging import version - -class DDP(DistributedDataParallel): - """ - Override the forward call in lightning so it goes to training and validation step respectively - """ - - def forward(self, *inputs, **kwargs): # pragma: no cover - if version.parse(torch.__version__[:6]) < version.parse("1.11"): - self._sync_params() - inputs, kwargs = self.scatter(inputs, kwargs, self.device_ids) - assert len(self.device_ids) == 1 - if self.module.training: - output = self.module.training_step(*inputs[0], **kwargs[0]) - elif self.module.testing: - output = self.module.test_step(*inputs[0], **kwargs[0]) - else: - output = self.module.validation_step(*inputs[0], **kwargs[0]) - if torch.is_grad_enabled(): - # We'll return the output object verbatim since it is a freeform - # object. We need to find any tensors in this object, though, - # because we need to figure out which parameters were used during - # this forward pass, to ensure we short circuit reduction for any - # unused parameters. Only if `find_unused_parameters` is set. 
- if self.find_unused_parameters: - self.reducer.prepare_for_backward(list(_find_tensors(output))) - else: - self.reducer.prepare_for_backward([]) - else: - from torch.nn.parallel.distributed import \ - logging, Join, _DDPSink, _tree_flatten_with_rref, _tree_unflatten_with_rref - with torch.autograd.profiler.record_function("DistributedDataParallel.forward"): - if torch.is_grad_enabled() and self.require_backward_grad_sync: - self.logger.set_runtime_stats_and_log() - self.num_iterations += 1 - self.reducer.prepare_for_forward() - - # Notify the join context that this process has not joined, if - # needed - work = Join.notify_join_context(self) - if work: - self.reducer._set_forward_pass_work_handle( - work, self._divide_by_initial_world_size - ) - - # Calling _rebuild_buckets before forward compuation, - # It may allocate new buckets before deallocating old buckets - # inside _rebuild_buckets. To save peak memory usage, - # call _rebuild_buckets before the peak memory usage increases - # during forward computation. - # This should be called only once during whole training period. - if torch.is_grad_enabled() and self.reducer._rebuild_buckets(): - logging.info("Reducer buckets have been rebuilt in this iteration.") - self._has_rebuilt_buckets = True - - # sync params according to location (before/after forward) user - # specified as part of hook, if hook was specified. - buffer_hook_registered = hasattr(self, 'buffer_hook') - if self._check_sync_bufs_pre_fwd(): - self._sync_buffers() - - if self._join_config.enable: - # Notify joined ranks whether they should sync in backwards pass or not. 
- self._check_global_requires_backward_grad_sync(is_joined_rank=False) - - inputs, kwargs = self.scatter(inputs, kwargs, self.device_ids) - if self.module.training: - output = self.module.training_step(*inputs[0], **kwargs[0]) - elif self.module.testing: - output = self.module.test_step(*inputs[0], **kwargs[0]) - else: - output = self.module.validation_step(*inputs[0], **kwargs[0]) - - # sync params according to location (before/after forward) user - # specified as part of hook, if hook was specified. - if self._check_sync_bufs_post_fwd(): - self._sync_buffers() - - if torch.is_grad_enabled() and self.require_backward_grad_sync: - self.require_forward_param_sync = True - # We'll return the output object verbatim since it is a freeform - # object. We need to find any tensors in this object, though, - # because we need to figure out which parameters were used during - # this forward pass, to ensure we short circuit reduction for any - # unused parameters. Only if `find_unused_parameters` is set. - if self.find_unused_parameters and not self.static_graph: - # Do not need to populate this for static graph. - self.reducer.prepare_for_backward(list(_find_tensors(output))) - else: - self.reducer.prepare_for_backward([]) - else: - self.require_forward_param_sync = False - - # TODO: DDPSink is currently enabled for unused parameter detection and - # static graph training for first iteration. 
- if (self.find_unused_parameters and not self.static_graph) or ( - self.static_graph and self.num_iterations == 1 - ): - state_dict = { - 'static_graph': self.static_graph, - 'num_iterations': self.num_iterations, - } - - output_tensor_list, treespec, output_is_rref = _tree_flatten_with_rref( - output - ) - output_placeholders = [None for _ in range(len(output_tensor_list))] - # Do not touch tensors that have no grad_fn, which can cause issues - # such as https://github.com/pytorch/pytorch/issues/60733 - for i, output in enumerate(output_tensor_list): - if torch.is_tensor(output) and output.grad_fn is None: - output_placeholders[i] = output - - # When find_unused_parameters=True, makes tensors which require grad - # run through the DDPSink backward pass. When not all outputs are - # used in loss, this makes those corresponding tensors receive - # undefined gradient which the reducer then handles to ensure - # param.grad field is not touched and we don't error out. - passthrough_tensor_list = _DDPSink.apply( - self.reducer, - state_dict, - *output_tensor_list, - ) - for i in range(len(output_placeholders)): - if output_placeholders[i] is None: - output_placeholders[i] = passthrough_tensor_list[i] - - # Reconstruct output data structure. 
- output = _tree_unflatten_with_rref( - output_placeholders, treespec, output_is_rref - ) - return output diff --git a/spaces/RugNlpFlashcards/Speech_Language_Processing_Jurafsky_Martin/plots.py b/spaces/RugNlpFlashcards/Speech_Language_Processing_Jurafsky_Martin/plots.py deleted file mode 100644 index bd4b429714a4c9513623800f6f7ce30f6429cbde..0000000000000000000000000000000000000000 --- a/spaces/RugNlpFlashcards/Speech_Language_Processing_Jurafsky_Martin/plots.py +++ /dev/null @@ -1,61 +0,0 @@ -# %% -import pandas as pd -import matplotlib.pyplot as plt -import scipy.stats as stats - -data = pd.read_csv("results/timings.csv", index_col="Unnamed: 0") -data -# %% -data.columns -# %% - -data_retrieve = data[["faiss_dpr.retrieve", "faiss_longformer.retrieve", - "es_dpr.retrieve", "es_longformer.retrieve"]] - -# %% -plt.title("Retrieval time") -plt.ylabel("Time (s)") -plt.xlabel("Model") -plt.boxplot(data_retrieve, labels=[ - "A1", "A2", "B1", "B2"]) -plt.savefig("results/retrieval_time.png") - -# %% -print(data_retrieve.describe()) - -with open("results/retrieval_time.tex", "w") as f: - f.write(data_retrieve.describe().to_latex()) - -# %% - -# now the same for the reader -data_read = data[["faiss_dpr.read", "faiss_longformer.read", - "es_dpr.read", "es_longformer.read"]] - -plt.title("Reading time") -plt.ylabel("Time (s)") -plt.xlabel("Model") -plt.boxplot(data_read, labels=["A1", "A2", "B1", "B2"]) -plt.savefig("results/read_time.png") - -# %% -print(data_read.describe()) - -with open("results/read_time.tex", "w") as f: - f.write(data_read.describe().to_latex()) - - -# Statistical tests for reading time - -# %% -stats.probplot(data_retrieve["es_longformer.retrieve"], dist="norm", plot=plt) -# %% - - -# %% -anova_retrieve = stats.f_oneway(*data_retrieve.T.values) -anova_read = stats.f_oneway(*data_read.T.values) - -print(f"retrieve\n {anova_retrieve} \n\nread\n {anova_read}") - -# %% diff --git a/spaces/SIGGRAPH2022/sketch2pose/scripts/prepare.sh 
b/spaces/SIGGRAPH2022/sketch2pose/scripts/prepare.sh deleted file mode 100644 index df728018e79da8027e5013579fd76fd5ff650eaf..0000000000000000000000000000000000000000 --- a/spaces/SIGGRAPH2022/sketch2pose/scripts/prepare.sh +++ /dev/null @@ -1,9 +0,0 @@ -#!/usr/bin/env sh - - -# set -euo pipefail - -v=$(python -c 'import sys; v = sys.version_info; print(f"{v.major}.{v.minor}")') -for p in patches/*.diff; do - patch -p0 -d"${HOME}/.local/lib/python$(python -V | cut -d ' ' -f 2 | cut -d'.' -f 1-2)" < <(sed -e 's/venv\/lib\/python3.10\///' "${p}") -done diff --git a/spaces/SQSora/VITS-Umamusume-voice-synthesizer/ONNXVITS_infer.py b/spaces/SQSora/VITS-Umamusume-voice-synthesizer/ONNXVITS_infer.py deleted file mode 100644 index af04e614c8f1ac43faf363b1a9f6bfd667fbde21..0000000000000000000000000000000000000000 --- a/spaces/SQSora/VITS-Umamusume-voice-synthesizer/ONNXVITS_infer.py +++ /dev/null @@ -1,201 +0,0 @@ -import torch -import commons -import models - -import math -from torch import nn -from torch.nn import functional as F - -import modules -import attentions - -from torch.nn import Conv1d, ConvTranspose1d, Conv2d -from torch.nn.utils import weight_norm, remove_weight_norm, spectral_norm -from commons import init_weights, get_padding - - -class TextEncoder(nn.Module): - def __init__(self, - n_vocab, - out_channels, - hidden_channels, - filter_channels, - n_heads, - n_layers, - kernel_size, - p_dropout, - emotion_embedding): - super().__init__() - self.n_vocab = n_vocab - self.out_channels = out_channels - self.hidden_channels = hidden_channels - self.filter_channels = filter_channels - self.n_heads = n_heads - self.n_layers = n_layers - self.kernel_size = kernel_size - self.p_dropout = p_dropout - self.emotion_embedding = emotion_embedding - - if self.n_vocab != 0: - self.emb = nn.Embedding(n_vocab, hidden_channels) - if emotion_embedding: - self.emo_proj = nn.Linear(1024, hidden_channels) - nn.init.normal_(self.emb.weight, 0.0, hidden_channels ** -0.5) - - 
self.encoder = attentions.Encoder( - hidden_channels, - filter_channels, - n_heads, - n_layers, - kernel_size, - p_dropout) - self.proj = nn.Conv1d(hidden_channels, out_channels * 2, 1) - - def forward(self, x, x_lengths, emotion_embedding=None): - if self.n_vocab != 0: - x = self.emb(x) * math.sqrt(self.hidden_channels) # [b, t, h] - if emotion_embedding is not None: - print("emotion added") - x = x + self.emo_proj(emotion_embedding.unsqueeze(1)) - x = torch.transpose(x, 1, -1) # [b, h, t] - x_mask = torch.unsqueeze(commons.sequence_mask(x_lengths, x.size(2)), 1).to(x.dtype) - - x = self.encoder(x * x_mask, x_mask) - stats = self.proj(x) * x_mask - - m, logs = torch.split(stats, self.out_channels, dim=1) - return x, m, logs, x_mask - - -class PosteriorEncoder(nn.Module): - def __init__(self, - in_channels, - out_channels, - hidden_channels, - kernel_size, - dilation_rate, - n_layers, - gin_channels=0): - super().__init__() - self.in_channels = in_channels - self.out_channels = out_channels - self.hidden_channels = hidden_channels - self.kernel_size = kernel_size - self.dilation_rate = dilation_rate - self.n_layers = n_layers - self.gin_channels = gin_channels - - self.pre = nn.Conv1d(in_channels, hidden_channels, 1) - self.enc = modules.WN(hidden_channels, kernel_size, dilation_rate, n_layers, gin_channels=gin_channels) - self.proj = nn.Conv1d(hidden_channels, out_channels * 2, 1) - - def forward(self, x, x_lengths, g=None): - x_mask = torch.unsqueeze(commons.sequence_mask(x_lengths, x.size(2)), 1).to(x.dtype) - x = self.pre(x) * x_mask - x = self.enc(x, x_mask, g=g) - stats = self.proj(x) * x_mask - m, logs = torch.split(stats, self.out_channels, dim=1) - z = (m + torch.randn_like(m) * torch.exp(logs)) * x_mask - return z, m, logs, x_mask - - -class SynthesizerTrn(models.SynthesizerTrn): - """ - Synthesizer for Training - """ - - def __init__(self, - n_vocab, - spec_channels, - segment_size, - inter_channels, - hidden_channels, - filter_channels, - n_heads, - 
n_layers, - kernel_size, - p_dropout, - resblock, - resblock_kernel_sizes, - resblock_dilation_sizes, - upsample_rates, - upsample_initial_channel, - upsample_kernel_sizes, - n_speakers=0, - gin_channels=0, - use_sdp=True, - emotion_embedding=False, - ONNX_dir="./ONNX_net/", - **kwargs): - - super().__init__( - n_vocab, - spec_channels, - segment_size, - inter_channels, - hidden_channels, - filter_channels, - n_heads, - n_layers, - kernel_size, - p_dropout, - resblock, - resblock_kernel_sizes, - resblock_dilation_sizes, - upsample_rates, - upsample_initial_channel, - upsample_kernel_sizes, - n_speakers=n_speakers, - gin_channels=gin_channels, - use_sdp=use_sdp, - **kwargs - ) - self.ONNX_dir = ONNX_dir - self.enc_p = TextEncoder(n_vocab, - inter_channels, - hidden_channels, - filter_channels, - n_heads, - n_layers, - kernel_size, - p_dropout, - emotion_embedding) - self.enc_q = PosteriorEncoder(spec_channels, inter_channels, hidden_channels, 5, 1, 16, gin_channels=gin_channels) - - def infer(self, x, x_lengths, sid=None, noise_scale=1, length_scale=1, noise_scale_w=1., max_len=None, - emotion_embedding=None): - from ONNXVITS_utils import runonnx - with torch.no_grad(): - x, m_p, logs_p, x_mask = self.enc_p(x, x_lengths, emotion_embedding) - - if self.n_speakers > 0: - g = self.emb_g(sid).unsqueeze(-1) # [b, h, 1] - else: - g = None - - # logw = self.dp(x, x_mask, g=g, reverse=True, noise_scale=noise_scale_w) - logw = runonnx(f"{self.ONNX_dir}dp.onnx", x=x.numpy(), x_mask=x_mask.numpy(), g=g.numpy()) - logw = torch.from_numpy(logw[0]) - - w = torch.exp(logw) * x_mask * length_scale - w_ceil = torch.ceil(w) - y_lengths = torch.clamp_min(torch.sum(w_ceil, [1, 2]), 1).long() - y_mask = torch.unsqueeze(commons.sequence_mask(y_lengths, None), 1).to(x_mask.dtype) - attn_mask = torch.unsqueeze(x_mask, 2) * torch.unsqueeze(y_mask, -1) - attn = commons.generate_path(w_ceil, attn_mask) - - m_p = torch.matmul(attn.squeeze(1), m_p.transpose(1, 2)).transpose(1, 2) # [b, t', t], 
[b, t, d] -> [b, d, t'] - logs_p = torch.matmul(attn.squeeze(1), logs_p.transpose(1, 2)).transpose(1, - 2) # [b, t', t], [b, t, d] -> [b, d, t'] - - z_p = m_p + torch.randn_like(m_p) * torch.exp(logs_p) * noise_scale - - # z = self.flow(z_p, y_mask, g=g, reverse=True) - z = runonnx(f"{self.ONNX_dir}flow.onnx", z_p=z_p.numpy(), y_mask=y_mask.numpy(), g=g.numpy()) - z = torch.from_numpy(z[0]) - - # o = self.dec((z * y_mask)[:,:,:max_len], g=g) - o = runonnx(f"{self.ONNX_dir}dec.onnx", z_in=(z * y_mask)[:, :, :max_len].numpy(), g=g.numpy()) - o = torch.from_numpy(o[0]) - - return o, attn, y_mask, (z, z_p, m_p, logs_p) \ No newline at end of file diff --git a/spaces/Sadhvi/ChatBot/app.py b/spaces/Sadhvi/ChatBot/app.py deleted file mode 100644 index a362dcc7d0ddd1eee86961f1bc3db6d894fbd3d5..0000000000000000000000000000000000000000 --- a/spaces/Sadhvi/ChatBot/app.py +++ /dev/null @@ -1,34 +0,0 @@ -import os -import gradio as gr -from langchain.chat_models import ChatOpenAI -from langchain import LLMChain, PromptTemplate -from langchain.memory import ConversationBufferMemory - -OPENAI_API_KEY=os.getenv('OPENAI_API_KEY') - -template = """You are a helpful assistant to answer all user queries. -{chat_history} -User: {user_message} -Chatbot:""" - -prompt = PromptTemplate( - input_variables=["chat_history", "user_message"], template=template -) - -memory = ConversationBufferMemory(memory_key="chat_history") - -llm_chain = LLMChain( - llm=ChatOpenAI(temperature='0.5', model_name="gpt-3.5-turbo"), - prompt=prompt, - verbose=True, - memory=memory, -) - -def get_text_response(user_message,history): - response = llm_chain.predict(user_message = user_message) - return response - -demo = gr.ChatInterface(get_text_response) - -if __name__ == "__main__": - demo.launch() #To create a public link, set `share=True` in `launch()`. To enable errors and logs, set `debug=True` in `launch()`. 
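The chatbot app deleted above wires a `ConversationBufferMemory` into an `LLMChain` so that prior turns are substituted into the `{chat_history}` slot of the prompt on every call. The mechanism can be sketched in plain Python; this is an illustrative stand-in only — LangChain's `ConversationBufferMemory` does this internally, and the `BufferMemory`/`build_prompt` names here are hypothetical, not part of any library.

```python
# Minimal sketch of the buffer-memory pattern: each completed turn is
# appended to a running transcript, which fills the {chat_history} slot
# of the template on the next request.
TEMPLATE = (
    "You are a helpful assistant to answer all user queries.\n"
    "{chat_history}\n"
    "User: {user_message}\n"
    "Chatbot:"
)

class BufferMemory:
    """Accumulates (user, bot) turns as plain text lines."""
    def __init__(self):
        self.turns = []

    def render(self):
        # Full transcript so far, one line per utterance.
        return "\n".join(self.turns)

    def save(self, user_message, bot_reply):
        self.turns.append(f"User: {user_message}")
        self.turns.append(f"Chatbot: {bot_reply}")

def build_prompt(memory, user_message):
    # Same substitution LangChain performs via PromptTemplate.
    return TEMPLATE.format(chat_history=memory.render(),
                           user_message=user_message)

memory = BufferMemory()
first = build_prompt(memory, "Hi!")
memory.save("Hi!", "Hello, how can I help?")
second = build_prompt(memory, "What did I just say?")
```

The second prompt now carries the first exchange verbatim, which is what lets a stateless chat model appear to "remember" the conversation.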
diff --git a/spaces/Sandiago21/text-to-speech-french/app.py b/spaces/Sandiago21/text-to-speech-french/app.py deleted file mode 100644 index 69bdbe03594b40053ed3fbdc8860b4e4ffb8d0df..0000000000000000000000000000000000000000 --- a/spaces/Sandiago21/text-to-speech-french/app.py +++ /dev/null @@ -1,59 +0,0 @@ -import gradio as gr -import torch -from datasets import load_dataset -from transformers import pipeline, SpeechT5Processor, SpeechT5HifiGan, SpeechT5ForTextToSpeech - -model_id = "Sandiago21/speecht5_finetuned_facebook_voxpopuli_french" # update with your model id -# pipe = pipeline("automatic-speech-recognition", model=model_id) -model = SpeechT5ForTextToSpeech.from_pretrained(model_id) -vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan") -embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation") -speaker_embeddings = torch.tensor(embeddings_dataset[7440]["xvector"]).unsqueeze(0) - -# checkpoint = "microsoft/speecht5_tts" -processor = SpeechT5Processor.from_pretrained(model_id) - -replacements = [ - ("à", "a"), - ("â", "a"), - ("ç", "c"), - ("è", "e"), - ("ë", "e"), - ("î", "i"), - ("ï", "i"), - ("ô", "o"), - ("ù", "u"), - ("û", "u"), - ("ü", "u"), -] - - -title = "Text-to-Speech" -description = """ -Demo for text-to-speech translation in French. 
Demo uses [Sandiago21/speecht5_finetuned_facebook_voxpopuli_french](https://huggingface.co/Sandiago21/speecht5_finetuned_facebook_voxpopuli_french) checkpoint, which is based on Microsoft's -[SpeechT5 TTS](https://huggingface.co/microsoft/speecht5_tts) model and is fine-tuned in French Audio dataset -![Text-to-Speech (TTS)"](https://geekflare.com/wp-content/uploads/2021/07/texttospeech-1200x385.png "Diagram of Text-to-Speech (TTS)") -""" - - -def cleanup_text(text): - for src, dst in replacements: - text = text.replace(src, dst) - return text - -def synthesize_speech(text): - text = cleanup_text(text) - inputs = processor(text=text, return_tensors="pt") - - speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder) - - return gr.Audio.update(value=(16000, speech.cpu().numpy())) - -syntesize_speech_gradio = gr.Interface( - synthesize_speech, - inputs = gr.Textbox(label="Text", placeholder="Type something here..."), - outputs=gr.Audio(), - examples=["Je n'entrerai pas dans les détails, mais je profiterai des secondes qui me restent pour exposer la position ALDE sur le marquage CE, un des points cruciaux de ce rapport."], - title=title, - description=description, -).launch() diff --git a/spaces/SankarSrin/image-matting-app/ppmatting/models/backbone/resnet_vd.py b/spaces/SankarSrin/image-matting-app/ppmatting/models/backbone/resnet_vd.py deleted file mode 100644 index 0fdd9a57664ad80ee59846060cd7f768f757feae..0000000000000000000000000000000000000000 --- a/spaces/SankarSrin/image-matting-app/ppmatting/models/backbone/resnet_vd.py +++ /dev/null @@ -1,368 +0,0 @@ -# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import paddle -import paddle.nn as nn -import paddle.nn.functional as F - -from paddleseg.cvlibs import manager -from paddleseg.models import layers -from paddleseg.utils import utils - -__all__ = [ - "ResNet18_vd", "ResNet34_vd", "ResNet50_vd", "ResNet101_vd", "ResNet152_vd" -] - - -class ConvBNLayer(nn.Layer): - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride=1, - dilation=1, - groups=1, - is_vd_mode=False, - act=None, ): - super(ConvBNLayer, self).__init__() - - self.is_vd_mode = is_vd_mode - self._pool2d_avg = nn.AvgPool2D( - kernel_size=2, stride=2, padding=0, ceil_mode=True) - self._conv = nn.Conv2D( - in_channels=in_channels, - out_channels=out_channels, - kernel_size=kernel_size, - stride=stride, - padding=(kernel_size - 1) // 2 if dilation == 1 else 0, - dilation=dilation, - groups=groups, - bias_attr=False) - - self._batch_norm = layers.SyncBatchNorm(out_channels) - self._act_op = layers.Activation(act=act) - - def forward(self, inputs): - if self.is_vd_mode: - inputs = self._pool2d_avg(inputs) - y = self._conv(inputs) - y = self._batch_norm(y) - y = self._act_op(y) - - return y - - -class BottleneckBlock(nn.Layer): - def __init__(self, - in_channels, - out_channels, - stride, - shortcut=True, - if_first=False, - dilation=1): - super(BottleneckBlock, self).__init__() - - self.conv0 = ConvBNLayer( - in_channels=in_channels, - out_channels=out_channels, - kernel_size=1, - act='relu') - - self.dilation = dilation - - self.conv1 = ConvBNLayer( - in_channels=out_channels, - out_channels=out_channels, - kernel_size=3, - 
stride=stride, - act='relu', - dilation=dilation) - self.conv2 = ConvBNLayer( - in_channels=out_channels, - out_channels=out_channels * 4, - kernel_size=1, - act=None) - - if not shortcut: - self.short = ConvBNLayer( - in_channels=in_channels, - out_channels=out_channels * 4, - kernel_size=1, - stride=1, - is_vd_mode=False if if_first or stride == 1 else True) - - self.shortcut = shortcut - - def forward(self, inputs): - y = self.conv0(inputs) - - #################################################################### - # If given dilation rate > 1, using corresponding padding. - # The performance drops down without the follow padding. - if self.dilation > 1: - padding = self.dilation - y = F.pad(y, [padding, padding, padding, padding]) - ##################################################################### - - conv1 = self.conv1(y) - conv2 = self.conv2(conv1) - - if self.shortcut: - short = inputs - else: - short = self.short(inputs) - - y = paddle.add(x=short, y=conv2) - y = F.relu(y) - return y - - -class BasicBlock(nn.Layer): - def __init__(self, - in_channels, - out_channels, - stride, - shortcut=True, - if_first=False): - super(BasicBlock, self).__init__() - self.stride = stride - self.conv0 = ConvBNLayer( - in_channels=in_channels, - out_channels=out_channels, - kernel_size=3, - stride=stride, - act='relu') - self.conv1 = ConvBNLayer( - in_channels=out_channels, - out_channels=out_channels, - kernel_size=3, - act=None) - - if not shortcut: - self.short = ConvBNLayer( - in_channels=in_channels, - out_channels=out_channels, - kernel_size=1, - stride=1, - is_vd_mode=False if if_first or stride == 1 else True) - - self.shortcut = shortcut - - def forward(self, inputs): - y = self.conv0(inputs) - conv1 = self.conv1(y) - - if self.shortcut: - short = inputs - else: - short = self.short(inputs) - y = paddle.add(x=short, y=conv1) - y = F.relu(y) - - return y - - -class ResNet_vd(nn.Layer): - """ - The ResNet_vd implementation based on PaddlePaddle. 
- - The original article refers to Jingdong - Tong He, et, al. "Bag of Tricks for Image Classification with Convolutional Neural Networks" - (https://arxiv.org/pdf/1812.01187.pdf). - - Args: - layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50. - output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8. - multi_grid (tuple|list, optional): The grid of stage4. Defult: (1, 1, 1). - pretrained (str, optional): The path of pretrained model. - - """ - - def __init__(self, - input_channels=3, - layers=50, - output_stride=32, - multi_grid=(1, 1, 1), - pretrained=None): - super(ResNet_vd, self).__init__() - - self.conv1_logit = None # for gscnn shape stream - self.layers = layers - supported_layers = [18, 34, 50, 101, 152, 200] - assert layers in supported_layers, \ - "supported layers are {} but input layer is {}".format( - supported_layers, layers) - - if layers == 18: - depth = [2, 2, 2, 2] - elif layers == 34 or layers == 50: - depth = [3, 4, 6, 3] - elif layers == 101: - depth = [3, 4, 23, 3] - elif layers == 152: - depth = [3, 8, 36, 3] - elif layers == 200: - depth = [3, 12, 48, 3] - num_channels = [64, 256, 512, - 1024] if layers >= 50 else [64, 64, 128, 256] - num_filters = [64, 128, 256, 512] - - # for channels of four returned stages - self.feat_channels = [c * 4 for c in num_filters - ] if layers >= 50 else num_filters - self.feat_channels = [64] + self.feat_channels - - dilation_dict = None - if output_stride == 8: - dilation_dict = {2: 2, 3: 4} - elif output_stride == 16: - dilation_dict = {3: 2} - - self.conv1_1 = ConvBNLayer( - in_channels=input_channels, - out_channels=32, - kernel_size=3, - stride=2, - act='relu') - self.conv1_2 = ConvBNLayer( - in_channels=32, - out_channels=32, - kernel_size=3, - stride=1, - act='relu') - self.conv1_3 = ConvBNLayer( - in_channels=32, - out_channels=64, - kernel_size=3, - stride=1, - act='relu') - 
self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1) - - # self.block_list = [] - self.stage_list = [] - if layers >= 50: - for block in range(len(depth)): - shortcut = False - block_list = [] - for i in range(depth[block]): - if layers in [101, 152] and block == 2: - if i == 0: - conv_name = "res" + str(block + 2) + "a" - else: - conv_name = "res" + str(block + 2) + "b" + str(i) - else: - conv_name = "res" + str(block + 2) + chr(97 + i) - - ############################################################################### - # Add dilation rate for some segmentation tasks, if dilation_dict is not None. - dilation_rate = dilation_dict[ - block] if dilation_dict and block in dilation_dict else 1 - - # Actually block here is 'stage', and i is 'block' in 'stage' - # At the stage 4, expand the the dilation_rate if given multi_grid - if block == 3: - dilation_rate = dilation_rate * multi_grid[i] - ############################################################################### - - bottleneck_block = self.add_sublayer( - 'bb_%d_%d' % (block, i), - BottleneckBlock( - in_channels=num_channels[block] - if i == 0 else num_filters[block] * 4, - out_channels=num_filters[block], - stride=2 if i == 0 and block != 0 and - dilation_rate == 1 else 1, - shortcut=shortcut, - if_first=block == i == 0, - dilation=dilation_rate)) - - block_list.append(bottleneck_block) - shortcut = True - self.stage_list.append(block_list) - else: - for block in range(len(depth)): - shortcut = False - block_list = [] - for i in range(depth[block]): - conv_name = "res" + str(block + 2) + chr(97 + i) - basic_block = self.add_sublayer( - 'bb_%d_%d' % (block, i), - BasicBlock( - in_channels=num_channels[block] - if i == 0 else num_filters[block], - out_channels=num_filters[block], - stride=2 if i == 0 and block != 0 else 1, - shortcut=shortcut, - if_first=block == i == 0)) - block_list.append(basic_block) - shortcut = True - self.stage_list.append(block_list) - - self.pretrained = pretrained - 
self.init_weight() - - def forward(self, inputs): - feat_list = [] - y = self.conv1_1(inputs) - y = self.conv1_2(y) - y = self.conv1_3(y) - feat_list.append(y) - - y = self.pool2d_max(y) - - # A feature list saves the output feature map of each stage. - for stage in self.stage_list: - for block in stage: - y = block(y) - feat_list.append(y) - - return feat_list - - def init_weight(self): - utils.load_pretrained_model(self, self.pretrained) - - -@manager.BACKBONES.add_component -def ResNet18_vd(**args): - model = ResNet_vd(layers=18, **args) - return model - - -@manager.BACKBONES.add_component -def ResNet34_vd(**args): - model = ResNet_vd(layers=34, **args) - return model - - -@manager.BACKBONES.add_component -def ResNet50_vd(**args): - model = ResNet_vd(layers=50, **args) - return model - - -@manager.BACKBONES.add_component -def ResNet101_vd(**args): - model = ResNet_vd(layers=101, **args) - return model - - -def ResNet152_vd(**args): - model = ResNet_vd(layers=152, **args) - return model - - -def ResNet200_vd(**args): - model = ResNet_vd(layers=200, **args) - return model diff --git a/spaces/Skyler123/TangGPT/modules/presets.py b/spaces/Skyler123/TangGPT/modules/presets.py deleted file mode 100644 index a6e601700ba70e4e2167345be8540cca78797b00..0000000000000000000000000000000000000000 --- a/spaces/Skyler123/TangGPT/modules/presets.py +++ /dev/null @@ -1,198 +0,0 @@ -# -*- coding:utf-8 -*- -import gradio as gr -from pathlib import Path - -# ChatGPT 设置 -initial_prompt = "You are a helpful assistant." 
-API_HOST = "api.openai.com" -COMPLETION_URL = "https://api.openai.com/v1/chat/completions" -BALANCE_API_URL="https://api.openai.com/dashboard/billing/credit_grants" -USAGE_API_URL="https://api.openai.com/dashboard/billing/usage" -HISTORY_DIR = Path("history") -TEMPLATES_DIR = "templates" - -# 错误信息 -standard_error_msg = "☹️发生了错误:" # 错误信息的标准前缀 -error_retrieve_prompt = "请检查网络连接,或者API-Key是否有效。" # 获取对话时发生错误 -connection_timeout_prompt = "连接超时,无法获取对话。" # 连接超时 -read_timeout_prompt = "读取超时,无法获取对话。" # 读取超时 -proxy_error_prompt = "代理错误,无法获取对话。" # 代理错误 -ssl_error_prompt = "SSL错误,无法获取对话。" # SSL 错误 -no_apikey_msg = "API key长度不是51位,请检查是否输入正确。" # API key 长度不足 51 位 -no_input_msg = "请输入对话内容。" # 未输入对话内容 - -timeout_streaming = 30 # 流式对话时的超时时间 -timeout_all = 200 # 非流式对话时的超时时间 -enable_streaming_option = True # 是否启用选择选择是否实时显示回答的勾选框 -HIDE_MY_KEY = False # 如果你想在UI中隐藏你的 API 密钥,将此值设置为 True -CONCURRENT_COUNT = 100 # 允许同时使用的用户数量 - -SIM_K = 5 -INDEX_QUERY_TEMPRATURE = 1.0 - -title = """

川虎ChatGPT 🚀

""" -description = """\ -
- -由Bilibili [土川虎虎虎](https://space.bilibili.com/29125536) 和 [明昭MZhao](https://space.bilibili.com/24807452)开发 - -访问川虎ChatGPT的 [GitHub项目](https://github.com/GaiZhenbiao/ChuanhuChatGPT) 下载最新版脚本 - -此App使用 `gpt-3.5-turbo` 大语言模型 -
-""" - -footer = """\ -
{versions}
-""" - -summarize_prompt = "你是谁?我们刚才聊了什么?" # 总结对话时的 prompt - -MODELS = [ - "gpt-3.5-turbo", - "gpt-3.5-turbo-0301", - "gpt-4", - "gpt-4-0314", - "gpt-4-32k", - "gpt-4-32k-0314", -] # 可选的模型 - -MODEL_SOFT_TOKEN_LIMIT = { - "gpt-3.5-turbo": { - "streaming": 3500, - "all": 3500 - }, - "gpt-3.5-turbo-0301": { - "streaming": 3500, - "all": 3500 - }, - "gpt-4": { - "streaming": 7500, - "all": 7500 - }, - "gpt-4-0314": { - "streaming": 7500, - "all": 7500 - }, - "gpt-4-32k": { - "streaming": 31000, - "all": 31000 - }, - "gpt-4-32k-0314": { - "streaming": 31000, - "all": 31000 - } -} - -REPLY_LANGUAGES = [ - "简体中文", - "繁體中文", - "English", - "日本語", - "Español", - "Français", - "Deutsch", - "跟随问题语言(不稳定)" -] - - -WEBSEARCH_PTOMPT_TEMPLATE = """\ -Web search results: - -{web_results} -Current date: {current_date} - -Instructions: Using the provided web search results, write a comprehensive reply to the given query. Make sure to cite results using [[number](URL)] notation after the reference. If the provided search results refer to multiple subjects with the same name, write separate answers for each subject. -Query: {query} -Reply in {reply_language} -""" - -PROMPT_TEMPLATE = """\ -Context information is below. ---------------------- -{context_str} ---------------------- -Current date: {current_date}. -Using the provided context information, write a comprehensive reply to the given query. -Make sure to cite results using [number] notation after the reference. -If the provided context information refer to multiple subjects with the same name, write separate answers for each subject. -Use prior knowledge only if the given context didn't provide enough information. -Answer the question: {query_str} -Reply in {reply_language} -""" - -REFINE_TEMPLATE = """\ -The original question is as follows: {query_str} -We have provided an existing answer: {existing_answer} -We have the opportunity to refine the existing answer -(only if needed) with some more context below. 
------------- -{context_msg} ------------- -Given the new context, refine the original answer to better -Reply in {reply_language} -If the context isn't useful, return the original answer. -""" - -ALREADY_CONVERTED_MARK = "" - -small_and_beautiful_theme = gr.themes.Soft( - primary_hue=gr.themes.Color( - c50="#02C160", - c100="rgba(2, 193, 96, 0.2)", - c200="#02C160", - c300="rgba(2, 193, 96, 0.32)", - c400="rgba(2, 193, 96, 0.32)", - c500="rgba(2, 193, 96, 1.0)", - c600="rgba(2, 193, 96, 1.0)", - c700="rgba(2, 193, 96, 0.32)", - c800="rgba(2, 193, 96, 0.32)", - c900="#02C160", - c950="#02C160", - ), - secondary_hue=gr.themes.Color( - c50="#576b95", - c100="#576b95", - c200="#576b95", - c300="#576b95", - c400="#576b95", - c500="#576b95", - c600="#576b95", - c700="#576b95", - c800="#576b95", - c900="#576b95", - c950="#576b95", - ), - neutral_hue=gr.themes.Color( - name="gray", - c50="#f9fafb", - c100="#f3f4f6", - c200="#e5e7eb", - c300="#d1d5db", - c400="#B2B2B2", - c500="#808080", - c600="#636363", - c700="#515151", - c800="#393939", - c900="#272727", - c950="#171717", - ), - radius_size=gr.themes.sizes.radius_sm, - ).set( - button_primary_background_fill="#06AE56", - button_primary_background_fill_dark="#06AE56", - button_primary_background_fill_hover="#07C863", - button_primary_border_color="#06AE56", - button_primary_border_color_dark="#06AE56", - button_primary_text_color="#FFFFFF", - button_primary_text_color_dark="#FFFFFF", - button_secondary_background_fill="#F2F2F2", - button_secondary_background_fill_dark="#2B2B2B", - button_secondary_text_color="#393939", - button_secondary_text_color_dark="#FFFFFF", - # background_fill_primary="#F7F7F7", - # background_fill_primary_dark="#1F1F1F", - block_title_text_color="*primary_500", - block_title_background_fill="*primary_100", - input_background_fill="#F6F6F6", - ) diff --git a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/asttokens/line_numbers.py 
b/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/asttokens/line_numbers.py deleted file mode 100644 index aaf76cef6d040cab1e7eaa1719c03d0e720a3d85..0000000000000000000000000000000000000000 --- a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/asttokens/line_numbers.py +++ /dev/null @@ -1,76 +0,0 @@ -# Copyright 2016 Grist Labs, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import bisect -import re -from typing import Dict, List, Tuple - -_line_start_re = re.compile(r'^', re.M) - -class LineNumbers(object): - """ - Class to convert between character offsets in a text string, and pairs (line, column) of 1-based - line and 0-based column numbers, as used by tokens and AST nodes. - - This class expects unicode for input and stores positions in unicode. But it supports - translating to and from utf8 offsets, which are used by ast parsing. - """ - def __init__(self, text): - # type: (str) -> None - # A list of character offsets of each line's first character. - self._line_offsets = [m.start(0) for m in _line_start_re.finditer(text)] - self._text = text - self._text_len = len(text) - self._utf8_offset_cache = {} # type: Dict[int, List[int]] # maps line num to list of char offset for each byte in line - - def from_utf8_col(self, line, utf8_column): - # type: (int, int) -> int - """ - Given a 1-based line number and 0-based utf8 column, returns a 0-based unicode column. 
- """ - offsets = self._utf8_offset_cache.get(line) - if offsets is None: - end_offset = self._line_offsets[line] if line < len(self._line_offsets) else self._text_len - line_text = self._text[self._line_offsets[line - 1] : end_offset] - - offsets = [i for i,c in enumerate(line_text) for byte in c.encode('utf8')] - offsets.append(len(line_text)) - self._utf8_offset_cache[line] = offsets - - return offsets[max(0, min(len(offsets)-1, utf8_column))] - - def line_to_offset(self, line, column): - # type: (int, int) -> int - """ - Converts 1-based line number and 0-based column to 0-based character offset into text. - """ - line -= 1 - if line >= len(self._line_offsets): - return self._text_len - elif line < 0: - return 0 - else: - return min(self._line_offsets[line] + max(0, column), self._text_len) - - def offset_to_line(self, offset): - # type: (int) -> Tuple[int, int] - """ - Converts 0-based character offset to pair (line, col) of 1-based line and 0-based column - numbers. - """ - offset = max(0, min(self._text_len, offset)) - line_index = bisect.bisect_right(self._line_offsets, offset) - 1 - return (line_index + 1, offset - self._line_offsets[line_index]) - - diff --git a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/debugpy/server/__init__.py b/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/debugpy/server/__init__.py deleted file mode 100644 index 42d5367f0709293584cfe17cdf2a2f80bebb4abf..0000000000000000000000000000000000000000 --- a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/debugpy/server/__init__.py +++ /dev/null @@ -1,7 +0,0 @@ -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See LICENSE in the project root -# for license information. - -# "force_pydevd" must be imported first to ensure (via side effects) -# that the debugpy-vendored copy of pydevd gets used. 
-import debugpy._vendored.force_pydevd # noqa diff --git a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/base_doc/base_node.py b/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/base_doc/base_node.py deleted file mode 100644 index 7cbb76c9e98e9208c7c3d655d51197e1ff97fc00..0000000000000000000000000000000000000000 --- a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/base_doc/base_node.py +++ /dev/null @@ -1,37 +0,0 @@ -from abc import ABC, abstractmethod -from typing import TYPE_CHECKING, TypeVar, Optional, Type - -if TYPE_CHECKING: - from docarray.proto import NodeProto - -T = TypeVar('T') - - -class BaseNode(ABC): - """ - A DocumentNode is an object than can be nested inside a Document. - A Document itself is a DocumentNode as well as prebuilt type - """ - - _proto_type_name: Optional[str] = None - - @abstractmethod - def _to_node_protobuf(self) -> 'NodeProto': - """Convert itself into a NodeProto message. This function should - be called when the self is nested into another Document that need to be - converted into a protobuf - - :return: the nested item protobuf message - """ - ... - - @classmethod - @abstractmethod - def from_protobuf(cls: Type[T], pb_msg: T) -> T: - ... - - def _docarray_to_json_compatible(self): - """ - Convert itself into a json compatible object - """ - ... 
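The `base_node.py` deleted above defines an abstract interface via `abc.ABC` and `@abstractmethod`: the base class cannot be instantiated, and every concrete node must supply the protobuf conversion hooks. A short self-contained sketch of that pattern, where `TextNode` and the dict-based "proto" are hypothetical stand-ins rather than docarray's real protobuf types:

```python
# abc.ABC + @abstractmethod make the base uninstantiable until a
# subclass implements the declared conversion hooks.
from abc import ABC, abstractmethod

class Node(ABC):
    @abstractmethod
    def _to_node_protobuf(self) -> dict:
        """Serialize this node for nesting inside a parent message."""
        ...

    @classmethod
    @abstractmethod
    def from_protobuf(cls, pb_msg: dict) -> "Node":
        """Reconstruct a node from its serialized form."""
        ...

class TextNode(Node):
    def __init__(self, text: str):
        self.text = text

    def _to_node_protobuf(self) -> dict:
        return {"type": "text", "text": self.text}

    @classmethod
    def from_protobuf(cls, pb_msg: dict) -> "TextNode":
        return cls(pb_msg["text"])

node = TextNode("hello")
roundtripped = TextNode.from_protobuf(node._to_node_protobuf())

try:
    Node()  # abstract methods unimplemented -> TypeError
    base_instantiable = True
except TypeError:
    base_instantiable = False
```

Note the `@classmethod` stacked on top of `@abstractmethod` for `from_protobuf`, mirroring the deleted source: subclasses must provide a class-level deserializer, not an instance method.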
diff --git a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/proto/__init__.py b/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/proto/__init__.py deleted file mode 100644 index b1a201b6e2f760104c5613a6ee9cbc77d21cfe9b..0000000000000000000000000000000000000000 --- a/spaces/SungBeom/chatwine-korean/.venv/Lib/site-packages/docarray/proto/__init__.py +++ /dev/null @@ -1,45 +0,0 @@ -from typing import TYPE_CHECKING - -from docarray.utils._internal.misc import import_library - -if TYPE_CHECKING: - from google.protobuf import __version__ as __pb__version__ -else: - protobuf = import_library('google.protobuf', raise_error=True) - __pb__version__ = protobuf.__version__ - - -if __pb__version__.startswith('4'): - from docarray.proto.pb.docarray_pb2 import ( - DictOfAnyProto, - DocListProto, - DocProto, - DocVecProto, - ListOfAnyProto, - ListOfDocArrayProto, - NdArrayProto, - NodeProto, - ) -else: - from docarray.proto.pb2.docarray_pb2 import ( - DictOfAnyProto, - DocListProto, - DocProto, - DocVecProto, - ListOfAnyProto, - ListOfDocArrayProto, - NdArrayProto, - NodeProto, - ) - -__all__ = [ - 'DocListProto', - 'DocProto', - 'NdArrayProto', - 'NodeProto', - 'DocVecProto', - 'DocListProto', - 'ListOfDocArrayProto', - 'ListOfAnyProto', - 'DictOfAnyProto', -] diff --git a/spaces/Superlang/ImageProcessor/annotator/uniformer/mmcv/ops/carafe.py b/spaces/Superlang/ImageProcessor/annotator/uniformer/mmcv/ops/carafe.py deleted file mode 100644 index 5154cb3abfccfbbe0a1b2daa67018dbf80aaf6d2..0000000000000000000000000000000000000000 --- a/spaces/Superlang/ImageProcessor/annotator/uniformer/mmcv/ops/carafe.py +++ /dev/null @@ -1,287 +0,0 @@ -# Copyright (c) OpenMMLab. All rights reserved. 
-import torch -import torch.nn as nn -import torch.nn.functional as F -from torch.autograd import Function -from torch.nn.modules.module import Module - -from ..cnn import UPSAMPLE_LAYERS, normal_init, xavier_init -from ..utils import ext_loader - -ext_module = ext_loader.load_ext('_ext', [ - 'carafe_naive_forward', 'carafe_naive_backward', 'carafe_forward', - 'carafe_backward' -]) - - -class CARAFENaiveFunction(Function): - - @staticmethod - def symbolic(g, features, masks, kernel_size, group_size, scale_factor): - return g.op( - 'mmcv::MMCVCARAFENaive', - features, - masks, - kernel_size_i=kernel_size, - group_size_i=group_size, - scale_factor_f=scale_factor) - - @staticmethod - def forward(ctx, features, masks, kernel_size, group_size, scale_factor): - assert scale_factor >= 1 - assert masks.size(1) == kernel_size * kernel_size * group_size - assert masks.size(-1) == features.size(-1) * scale_factor - assert masks.size(-2) == features.size(-2) * scale_factor - assert features.size(1) % group_size == 0 - assert (kernel_size - 1) % 2 == 0 and kernel_size >= 1 - ctx.kernel_size = kernel_size - ctx.group_size = group_size - ctx.scale_factor = scale_factor - ctx.feature_size = features.size() - ctx.mask_size = masks.size() - - n, c, h, w = features.size() - output = features.new_zeros((n, c, h * scale_factor, w * scale_factor)) - ext_module.carafe_naive_forward( - features, - masks, - output, - kernel_size=kernel_size, - group_size=group_size, - scale_factor=scale_factor) - - if features.requires_grad or masks.requires_grad: - ctx.save_for_backward(features, masks) - return output - - @staticmethod - def backward(ctx, grad_output): - assert grad_output.is_cuda - - features, masks = ctx.saved_tensors - kernel_size = ctx.kernel_size - group_size = ctx.group_size - scale_factor = ctx.scale_factor - - grad_input = torch.zeros_like(features) - grad_masks = torch.zeros_like(masks) - ext_module.carafe_naive_backward( - grad_output.contiguous(), - features, - masks, - 
grad_input, - grad_masks, - kernel_size=kernel_size, - group_size=group_size, - scale_factor=scale_factor) - - return grad_input, grad_masks, None, None, None - - -carafe_naive = CARAFENaiveFunction.apply - - -class CARAFENaive(Module): - - def __init__(self, kernel_size, group_size, scale_factor): - super(CARAFENaive, self).__init__() - - assert isinstance(kernel_size, int) and isinstance( - group_size, int) and isinstance(scale_factor, int) - self.kernel_size = kernel_size - self.group_size = group_size - self.scale_factor = scale_factor - - def forward(self, features, masks): - return carafe_naive(features, masks, self.kernel_size, self.group_size, - self.scale_factor) - - -class CARAFEFunction(Function): - - @staticmethod - def symbolic(g, features, masks, kernel_size, group_size, scale_factor): - return g.op( - 'mmcv::MMCVCARAFE', - features, - masks, - kernel_size_i=kernel_size, - group_size_i=group_size, - scale_factor_f=scale_factor) - - @staticmethod - def forward(ctx, features, masks, kernel_size, group_size, scale_factor): - assert scale_factor >= 1 - assert masks.size(1) == kernel_size * kernel_size * group_size - assert masks.size(-1) == features.size(-1) * scale_factor - assert masks.size(-2) == features.size(-2) * scale_factor - assert features.size(1) % group_size == 0 - assert (kernel_size - 1) % 2 == 0 and kernel_size >= 1 - ctx.kernel_size = kernel_size - ctx.group_size = group_size - ctx.scale_factor = scale_factor - ctx.feature_size = features.size() - ctx.mask_size = masks.size() - - n, c, h, w = features.size() - output = features.new_zeros((n, c, h * scale_factor, w * scale_factor)) - routput = features.new_zeros(output.size(), requires_grad=False) - rfeatures = features.new_zeros(features.size(), requires_grad=False) - rmasks = masks.new_zeros(masks.size(), requires_grad=False) - ext_module.carafe_forward( - features, - masks, - rfeatures, - routput, - rmasks, - output, - kernel_size=kernel_size, - group_size=group_size, - 
scale_factor=scale_factor) - - if features.requires_grad or masks.requires_grad: - ctx.save_for_backward(features, masks, rfeatures) - return output - - @staticmethod - def backward(ctx, grad_output): - assert grad_output.is_cuda - - features, masks, rfeatures = ctx.saved_tensors - kernel_size = ctx.kernel_size - group_size = ctx.group_size - scale_factor = ctx.scale_factor - - rgrad_output = torch.zeros_like(grad_output, requires_grad=False) - rgrad_input_hs = torch.zeros_like(grad_output, requires_grad=False) - rgrad_input = torch.zeros_like(features, requires_grad=False) - rgrad_masks = torch.zeros_like(masks, requires_grad=False) - grad_input = torch.zeros_like(features, requires_grad=False) - grad_masks = torch.zeros_like(masks, requires_grad=False) - ext_module.carafe_backward( - grad_output.contiguous(), - rfeatures, - masks, - rgrad_output, - rgrad_input_hs, - rgrad_input, - rgrad_masks, - grad_input, - grad_masks, - kernel_size=kernel_size, - group_size=group_size, - scale_factor=scale_factor) - return grad_input, grad_masks, None, None, None - - -carafe = CARAFEFunction.apply - - -class CARAFE(Module): - """ CARAFE: Content-Aware ReAssembly of FEatures - - Please refer to https://arxiv.org/abs/1905.02188 for more details. 
- - Args: - kernel_size (int): reassemble kernel size - group_size (int): reassemble group size - scale_factor (int): upsample ratio - - Returns: - upsampled feature map - """ - - def __init__(self, kernel_size, group_size, scale_factor): - super(CARAFE, self).__init__() - - assert isinstance(kernel_size, int) and isinstance( - group_size, int) and isinstance(scale_factor, int) - self.kernel_size = kernel_size - self.group_size = group_size - self.scale_factor = scale_factor - - def forward(self, features, masks): - return carafe(features, masks, self.kernel_size, self.group_size, - self.scale_factor) - - -@UPSAMPLE_LAYERS.register_module(name='carafe') -class CARAFEPack(nn.Module): - """A unified package of CARAFE upsampler that contains: 1) channel - compressor 2) content encoder 3) CARAFE op. - - Official implementation of ICCV 2019 paper - CARAFE: Content-Aware ReAssembly of FEatures - Please refer to https://arxiv.org/abs/1905.02188 for more details. - - Args: - channels (int): input feature channels - scale_factor (int): upsample ratio - up_kernel (int): kernel size of CARAFE op - up_group (int): group size of CARAFE op - encoder_kernel (int): kernel size of content encoder - encoder_dilation (int): dilation of content encoder - compressed_channels (int): output channels of channels compressor - - Returns: - upsampled feature map - """ - - def __init__(self, - channels, - scale_factor, - up_kernel=5, - up_group=1, - encoder_kernel=3, - encoder_dilation=1, - compressed_channels=64): - super(CARAFEPack, self).__init__() - self.channels = channels - self.scale_factor = scale_factor - self.up_kernel = up_kernel - self.up_group = up_group - self.encoder_kernel = encoder_kernel - self.encoder_dilation = encoder_dilation - self.compressed_channels = compressed_channels - self.channel_compressor = nn.Conv2d(channels, self.compressed_channels, - 1) - self.content_encoder = nn.Conv2d( - self.compressed_channels, - self.up_kernel * self.up_kernel * self.up_group * - 
self.scale_factor * self.scale_factor, - self.encoder_kernel, - padding=int((self.encoder_kernel - 1) * self.encoder_dilation / 2), - dilation=self.encoder_dilation, - groups=1) - self.init_weights() - - def init_weights(self): - for m in self.modules(): - if isinstance(m, nn.Conv2d): - xavier_init(m, distribution='uniform') - normal_init(self.content_encoder, std=0.001) - - def kernel_normalizer(self, mask): - mask = F.pixel_shuffle(mask, self.scale_factor) - n, mask_c, h, w = mask.size() - # use float division explicitly, - # to avoid inconsistency while exporting to onnx - mask_channel = int(mask_c / float(self.up_kernel**2)) - mask = mask.view(n, mask_channel, -1, h, w) - - mask = F.softmax(mask, dim=2, dtype=mask.dtype) - mask = mask.view(n, mask_c, h, w).contiguous() - - return mask - - def feature_reassemble(self, x, mask): - x = carafe(x, mask, self.up_kernel, self.up_group, self.scale_factor) - return x - - def forward(self, x): - compressed_x = self.channel_compressor(x) - mask = self.content_encoder(compressed_x) - mask = self.kernel_normalizer(mask) - - x = self.feature_reassemble(x, mask) - return x diff --git a/spaces/TH5314/newbing/src/components/header.tsx b/spaces/TH5314/newbing/src/components/header.tsx deleted file mode 100644 index dc298b722154d1ac6d7a7e148204605562d6cc58..0000000000000000000000000000000000000000 --- a/spaces/TH5314/newbing/src/components/header.tsx +++ /dev/null @@ -1,12 +0,0 @@ -import * as React from 'react' -import { UserMenu } from './user-menu' - -export async function Header() { - return ( -
-
- -
-
- ) -} diff --git a/spaces/TabPFN/TabPFNPrediction/TabPFN/decoders.py b/spaces/TabPFN/TabPFNPrediction/TabPFN/decoders.py deleted file mode 100644 index efdd8fcb624c1b0a6d127fcf0aa8548c4ca1524d..0000000000000000000000000000000000000000 --- a/spaces/TabPFN/TabPFNPrediction/TabPFN/decoders.py +++ /dev/null @@ -1,30 +0,0 @@ -import torch -from torch import nn -import random - - -class ScaledDecoder(nn.Module): - def __init__(self, ninp, nhid, nout): - super().__init__() - self.linear = nn.Linear(ninp, nhid) - self.linear1 = nn.Linear(nhid, nout) - self.linear2 = nn.Linear(nhid, 10) - - def forward(self, x): - #return torch.cat([self.linear1(x), self.linear2(x)], -1) - x = self.linear(x) - x = nn.GELU()(x) - temps = self.linear2(x).softmax(-1) @ torch.tensor([1.,1.4,1.7,2.,5.,10.,20.,40.,80.,160.], device=x.device) - if random.random() > .99: - print(temps.shape,temps[:,:2]) - return self.linear1(x) / temps.unsqueeze(-1) - -class FixedScaledDecoder(nn.Module): - def __init__(self, ninp, nhid, nout): - super().__init__() - self.mapper = nn.Sequential(nn.Linear(ninp, nhid), nn.GELU(), nn.Linear(nhid, nout)) - self.T = nn.Parameter(torch.ones(10000)/10000) - - def forward(self, x): - return self.mapper(x)/self.T.sum() - diff --git a/spaces/TangibleAI/mathtext/tests/__init__.py b/spaces/TangibleAI/mathtext/tests/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py b/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py deleted file mode 100644 index 2dcb54cb1054c5d80ccc823af21f13b9ebbcf1a3..0000000000000000000000000000000000000000 --- a/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from detectron2.config import LazyConfig - -# equivalent to relative import -dir1a_str, dir1a_dict = LazyConfig.load_rel("dir1_a.py", ("dir1a_str", "dir1a_dict")) - -dir1b_str = dir1a_str + "_from_b" -dir1b_dict = dir1a_dict - -# Every import is a reload: not modified by other config files -assert dir1a_dict.a == 1 diff --git a/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py b/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py deleted file mode 100644 index 3bb100f9bd5b4939e4646821c5a60d51c8ea65fd..0000000000000000000000000000000000000000 --- a/spaces/TencentARC/VLog/models/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py +++ /dev/null @@ -1,34 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved - -import unittest -import torch - -import detectron2.export.torchscript # apply patch # noqa -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.layers import ShapeSpec -from detectron2.modeling.backbone import build_resnet_backbone -from detectron2.modeling.backbone.fpn import build_resnet_fpn_backbone - - -class TestBackBone(unittest.TestCase): - def test_resnet_scriptability(self): - cfg = get_cfg() - resnet = build_resnet_backbone(cfg, ShapeSpec(channels=3)) - - scripted_resnet = torch.jit.script(resnet) - - inp = torch.rand(2, 3, 100, 100) - out1 = resnet(inp)["res4"] - out2 = scripted_resnet(inp)["res4"] - self.assertTrue(torch.allclose(out1, out2)) - - def test_fpn_scriptability(self): - cfg = model_zoo.get_config("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml") - bb = build_resnet_fpn_backbone(cfg, ShapeSpec(channels=3)) - bb_s = torch.jit.script(bb) - - inp = torch.rand(2, 3, 128, 128) - out1 = bb(inp)["p5"] - out2 = bb_s(inp)["p5"] - self.assertTrue(torch.allclose(out1, out2)) diff --git a/spaces/Theivaprakasham/yolov6/configs/yolov6_tiny_finetune.py 
b/spaces/Theivaprakasham/yolov6/configs/yolov6_tiny_finetune.py deleted file mode 100644 index d751eff06a4ebcc82afe24c8abfef2f73dcfb76e..0000000000000000000000000000000000000000 --- a/spaces/Theivaprakasham/yolov6/configs/yolov6_tiny_finetune.py +++ /dev/null @@ -1,53 +0,0 @@ -# YOLOv6t model -model = dict( - type='YOLOv6t', - pretrained='./weights/yolov6t.pt', - depth_multiple=0.25, - width_multiple=0.50, - backbone=dict( - type='EfficientRep', - num_repeats=[1, 6, 12, 18, 6], - out_channels=[64, 128, 256, 512, 1024], - ), - neck=dict( - type='RepPAN', - num_repeats=[12, 12, 12, 12], - out_channels=[256, 128, 128, 256, 256, 512], - ), - head=dict( - type='EffiDeHead', - in_channels=[128, 256, 512], - num_layers=3, - begin_indices=24, - anchors=1, - out_indices=[17, 20, 23], - strides=[8, 16, 32], - iou_type='ciou' - ) -) - -solver = dict( - optim='SGD', - lr_scheduler='Cosine', - lr0=0.0032, - lrf=0.12, - momentum=0.843, - weight_decay=0.00036, - warmup_epochs=2.0, - warmup_momentum=0.5, - warmup_bias_lr=0.05 -) - -data_aug = dict( - hsv_h=0.0138, - hsv_s=0.664, - hsv_v=0.464, - degrees=0.373, - translate=0.245, - scale=0.898, - shear=0.602, - flipud=0.00856, - fliplr=0.5, - mosaic=1.0, - mixup=0.243, -) diff --git a/spaces/Wryley1234/textual-inversion-training/README.md b/spaces/Wryley1234/textual-inversion-training/README.md deleted file mode 100644 index 936acabca42db7710c0aca4e99cee3ae92a67928..0000000000000000000000000000000000000000 --- a/spaces/Wryley1234/textual-inversion-training/README.md +++ /dev/null @@ -1,14 +0,0 @@ ---- -title: Textual Inversion Training -emoji: 📉 -colorFrom: purple -colorTo: red -sdk: gradio -sdk_version: 3.14.0 -app_file: app.py -pinned: false -license: apache-2.0 -duplicated_from: Intel/textual-inversion-training ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/XzJosh/Ava-Bert-VITS2/preprocess_text.py b/spaces/XzJosh/Ava-Bert-VITS2/preprocess_text.py 
deleted file mode 100644 index 5eb0f3b9e929fcbe91dcbeb653391227a2518a15..0000000000000000000000000000000000000000 --- a/spaces/XzJosh/Ava-Bert-VITS2/preprocess_text.py +++ /dev/null @@ -1,64 +0,0 @@ -import json -from random import shuffle - -import tqdm -from text.cleaner import clean_text -from collections import defaultdict -stage = [1,2,3] - -transcription_path = 'filelists/genshin.list' -train_path = 'filelists/train.list' -val_path = 'filelists/val.list' -config_path = "configs/config.json" -val_per_spk = 4 -max_val_total = 8 - -if 1 in stage: - with open( transcription_path+'.cleaned', 'w', encoding='utf-8') as f: - for line in tqdm.tqdm(open(transcription_path, encoding='utf-8').readlines()): - try: - utt, spk, language, text = line.strip().split('|') - norm_text, phones, tones, word2ph = clean_text(text, language) - f.write('{}|{}|{}|{}|{}|{}|{}\n'.format(utt, spk, language, norm_text, ' '.join(phones), - " ".join([str(i) for i in tones]), - " ".join([str(i) for i in word2ph]))) - except Exception as error : - print("err!", line.strip(), error) # print the raw line: `utt` is unbound if the split itself fails - -if 2 in stage: - spk_utt_map = defaultdict(list) - spk_id_map = {} - current_sid = 0 - - with open( transcription_path+'.cleaned', encoding='utf-8') as f: - for line in f.readlines(): - utt, spk, language, text, phones, tones, word2ph = line.strip().split('|') - spk_utt_map[spk].append(line) - if spk not in spk_id_map.keys(): - spk_id_map[spk] = current_sid - current_sid += 1 - train_list = [] - val_list = [] - - for spk, utts in spk_utt_map.items(): - shuffle(utts) - val_list+=utts[:val_per_spk] - train_list+=utts[val_per_spk:] - if len(val_list) > max_val_total: - train_list+=val_list[max_val_total:] - val_list = val_list[:max_val_total] - - with open( train_path,"w", encoding='utf-8') as f: - for line in train_list: - f.write(line) - - with open(val_path, "w", encoding='utf-8') as f: - for line in val_list: - f.write(line) - -if 3 in stage: - assert 2 in stage - config = json.load(open(config_path, encoding='utf-8')) - 
config["data"]['spk2id'] = spk_id_map - with open(config_path, 'w', encoding='utf-8') as f: - json.dump(config, f, indent=2, ensure_ascii=False) diff --git a/spaces/abdvl/datahub_qa_bot/docs/advanced/pdl-best-practices.md b/spaces/abdvl/datahub_qa_bot/docs/advanced/pdl-best-practices.md deleted file mode 100644 index adcecf6e0d8d2319927c5c9aa3a693dffaba1347..0000000000000000000000000000000000000000 --- a/spaces/abdvl/datahub_qa_bot/docs/advanced/pdl-best-practices.md +++ /dev/null @@ -1,3 +0,0 @@ -# PDL Best Practices - -WIP diff --git a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet/models/necks/nasfcos_fpn.py b/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet/models/necks/nasfcos_fpn.py deleted file mode 100644 index 2daf79ef591373499184c624ccd27fb7456dec06..0000000000000000000000000000000000000000 --- a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet/models/necks/nasfcos_fpn.py +++ /dev/null @@ -1,161 +0,0 @@ -import torch.nn as nn -import torch.nn.functional as F -from mmcv.cnn import ConvModule, caffe2_xavier_init -from mmcv.ops.merge_cells import ConcatCell - -from ..builder import NECKS - - -@NECKS.register_module() -class NASFCOS_FPN(nn.Module): - """FPN structure in NASFPN. - - Implementation of paper `NAS-FCOS: Fast Neural Architecture Search for - Object Detection `_ - - Args: - in_channels (List[int]): Number of input channels per scale. - out_channels (int): Number of output channels (used at each scale) - num_outs (int): Number of output scales. - start_level (int): Index of the start input backbone level used to - build the feature pyramid. Default: 0. - end_level (int): Index of the end input backbone level (exclusive) to - build the feature pyramid. Default: -1, which means the last level. - add_extra_convs (bool): It decides whether to add conv - layers on top of the original feature maps. Default to False. - If True, its actual mode is specified by `extra_convs_on_inputs`. 
- conv_cfg (dict): dictionary to construct and config conv layer. - norm_cfg (dict): dictionary to construct and config norm layer. - """ - - def __init__(self, - in_channels, - out_channels, - num_outs, - start_level=1, - end_level=-1, - add_extra_convs=False, - conv_cfg=None, - norm_cfg=None): - super(NASFCOS_FPN, self).__init__() - assert isinstance(in_channels, list) - self.in_channels = in_channels - self.out_channels = out_channels - self.num_ins = len(in_channels) - self.num_outs = num_outs - self.norm_cfg = norm_cfg - self.conv_cfg = conv_cfg - - if end_level == -1: - self.backbone_end_level = self.num_ins - assert num_outs >= self.num_ins - start_level - else: - self.backbone_end_level = end_level - assert end_level <= len(in_channels) - assert num_outs == end_level - start_level - self.start_level = start_level - self.end_level = end_level - self.add_extra_convs = add_extra_convs - - self.adapt_convs = nn.ModuleList() - for i in range(self.start_level, self.backbone_end_level): - adapt_conv = ConvModule( - in_channels[i], - out_channels, - 1, - stride=1, - padding=0, - bias=False, - norm_cfg=dict(type='BN'), - act_cfg=dict(type='ReLU', inplace=False)) - self.adapt_convs.append(adapt_conv) - - # C2 is omitted according to the paper - extra_levels = num_outs - self.backbone_end_level + self.start_level - - def build_concat_cell(with_input1_conv, with_input2_conv): - cell_conv_cfg = dict( - kernel_size=1, padding=0, bias=False, groups=out_channels) - return ConcatCell( - in_channels=out_channels, - out_channels=out_channels, - with_out_conv=True, - out_conv_cfg=cell_conv_cfg, - out_norm_cfg=dict(type='BN'), - out_conv_order=('norm', 'act', 'conv'), - with_input1_conv=with_input1_conv, - with_input2_conv=with_input2_conv, - input_conv_cfg=conv_cfg, - input_norm_cfg=norm_cfg, - upsample_mode='nearest') - - # Denote c3=f0, c4=f1, c5=f2 for convenience - self.fpn = nn.ModuleDict() - self.fpn['c22_1'] = build_concat_cell(True, True) - self.fpn['c22_2'] = 
build_concat_cell(True, True) - self.fpn['c32'] = build_concat_cell(True, False) - self.fpn['c02'] = build_concat_cell(True, False) - self.fpn['c42'] = build_concat_cell(True, True) - self.fpn['c36'] = build_concat_cell(True, True) - self.fpn['c61'] = build_concat_cell(True, True) # f9 - self.extra_downsamples = nn.ModuleList() - for i in range(extra_levels): - extra_act_cfg = None if i == 0 \ - else dict(type='ReLU', inplace=False) - self.extra_downsamples.append( - ConvModule( - out_channels, - out_channels, - 3, - stride=2, - padding=1, - act_cfg=extra_act_cfg, - order=('act', 'norm', 'conv'))) - - def forward(self, inputs): - """Forward function.""" - feats = [ - adapt_conv(inputs[i + self.start_level]) - for i, adapt_conv in enumerate(self.adapt_convs) - ] - - for (i, module_name) in enumerate(self.fpn): - idx_1, idx_2 = int(module_name[1]), int(module_name[2]) - res = self.fpn[module_name](feats[idx_1], feats[idx_2]) - feats.append(res) - - ret = [] - for (idx, input_idx) in zip([9, 8, 7], [1, 2, 3]): # add P3, P4, P5 - feats1, feats2 = feats[idx], feats[5] - feats2_resize = F.interpolate( - feats2, - size=feats1.size()[2:], - mode='bilinear', - align_corners=False) - - feats_sum = feats1 + feats2_resize - ret.append( - F.interpolate( - feats_sum, - size=inputs[input_idx].size()[2:], - mode='bilinear', - align_corners=False)) - - for submodule in self.extra_downsamples: - ret.append(submodule(ret[-1])) - - return tuple(ret) - - def init_weights(self): - """Initialize the weights of module.""" - for module in self.fpn.values(): - if hasattr(module, 'conv_out'): - caffe2_xavier_init(module.out_conv.conv) - - for modules in [ - self.adapt_convs.modules(), - self.extra_downsamples.modules() - ]: - for module in modules: - if isinstance(module, nn.Conv2d): - caffe2_xavier_init(module) diff --git a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet_null/models/dense_heads/centripetal_head.py 
b/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet_null/models/dense_heads/centripetal_head.py deleted file mode 100644 index 6728218b60539a71f6353645635f741a1ad7263d..0000000000000000000000000000000000000000 --- a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmdet_null/models/dense_heads/centripetal_head.py +++ /dev/null @@ -1,421 +0,0 @@ -import torch.nn as nn -from mmcv.cnn import ConvModule, normal_init -from mmcv.ops import DeformConv2d - -from mmdet.core import multi_apply -from ..builder import HEADS, build_loss -from .corner_head import CornerHead - - -@HEADS.register_module() -class CentripetalHead(CornerHead): - """Head of CentripetalNet: Pursuing High-quality Keypoint Pairs for Object - Detection. - - CentripetalHead inherits from :class:`CornerHead`. It removes the - embedding branch and adds guiding shift and centripetal shift branches. - More details can be found in the `paper - `_ . - - Args: - num_classes (int): Number of categories excluding the background - category. - in_channels (int): Number of channels in the input feature map. - num_feat_levels (int): Levels of feature from the previous module. 2 - for HourglassNet-104 and 1 for HourglassNet-52. HourglassNet-104 - outputs the final feature and intermediate supervision feature and - HourglassNet-52 only outputs the final feature. Default: 2. - corner_emb_channels (int): Channel of embedding vector. Default: 1. - train_cfg (dict | None): Training config. Useless in CornerHead, - but we keep this variable for SingleStageDetector. Default: None. - test_cfg (dict | None): Testing config of CornerHead. Default: None. - loss_heatmap (dict | None): Config of corner heatmap loss. Default: - GaussianFocalLoss. - loss_embedding (dict | None): Config of corner embedding loss. Default: - AssociativeEmbeddingLoss. - loss_offset (dict | None): Config of corner offset loss. Default: - SmoothL1Loss. - loss_guiding_shift (dict): Config of guiding shift loss. Default: - SmoothL1Loss. 
- loss_centripetal_shift (dict): Config of centripetal shift loss. - Default: SmoothL1Loss. - """ - - def __init__(self, - *args, - centripetal_shift_channels=2, - guiding_shift_channels=2, - feat_adaption_conv_kernel=3, - loss_guiding_shift=dict( - type='SmoothL1Loss', beta=1.0, loss_weight=0.05), - loss_centripetal_shift=dict( - type='SmoothL1Loss', beta=1.0, loss_weight=1), - **kwargs): - assert centripetal_shift_channels == 2, ( - 'CentripetalHead only support centripetal_shift_channels == 2') - self.centripetal_shift_channels = centripetal_shift_channels - assert guiding_shift_channels == 2, ( - 'CentripetalHead only support guiding_shift_channels == 2') - self.guiding_shift_channels = guiding_shift_channels - self.feat_adaption_conv_kernel = feat_adaption_conv_kernel - super(CentripetalHead, self).__init__(*args, **kwargs) - self.loss_guiding_shift = build_loss(loss_guiding_shift) - self.loss_centripetal_shift = build_loss(loss_centripetal_shift) - - def _init_centripetal_layers(self): - """Initialize centripetal layers. - - Including feature adaption deform convs (feat_adaption), deform offset - prediction convs (dcn_off), guiding shift (guiding_shift) and - centripetal shift ( centripetal_shift). Each branch has two parts: - prefix `tl_` for top-left and `br_` for bottom-right. 
- """ - self.tl_feat_adaption = nn.ModuleList() - self.br_feat_adaption = nn.ModuleList() - self.tl_dcn_offset = nn.ModuleList() - self.br_dcn_offset = nn.ModuleList() - self.tl_guiding_shift = nn.ModuleList() - self.br_guiding_shift = nn.ModuleList() - self.tl_centripetal_shift = nn.ModuleList() - self.br_centripetal_shift = nn.ModuleList() - - for _ in range(self.num_feat_levels): - self.tl_feat_adaption.append( - DeformConv2d(self.in_channels, self.in_channels, - self.feat_adaption_conv_kernel, 1, 1)) - self.br_feat_adaption.append( - DeformConv2d(self.in_channels, self.in_channels, - self.feat_adaption_conv_kernel, 1, 1)) - - self.tl_guiding_shift.append( - self._make_layers( - out_channels=self.guiding_shift_channels, - in_channels=self.in_channels)) - self.br_guiding_shift.append( - self._make_layers( - out_channels=self.guiding_shift_channels, - in_channels=self.in_channels)) - - self.tl_dcn_offset.append( - ConvModule( - self.guiding_shift_channels, - self.feat_adaption_conv_kernel**2 * - self.guiding_shift_channels, - 1, - bias=False, - act_cfg=None)) - self.br_dcn_offset.append( - ConvModule( - self.guiding_shift_channels, - self.feat_adaption_conv_kernel**2 * - self.guiding_shift_channels, - 1, - bias=False, - act_cfg=None)) - - self.tl_centripetal_shift.append( - self._make_layers( - out_channels=self.centripetal_shift_channels, - in_channels=self.in_channels)) - self.br_centripetal_shift.append( - self._make_layers( - out_channels=self.centripetal_shift_channels, - in_channels=self.in_channels)) - - def _init_layers(self): - """Initialize layers for CentripetalHead. 
- - Including two parts: CornerHead layers and CentripetalHead layers - """ - super()._init_layers() # using _init_layers in CornerHead - self._init_centripetal_layers() - - def init_weights(self): - """Initialize weights of the head.""" - super().init_weights() - for i in range(self.num_feat_levels): - normal_init(self.tl_feat_adaption[i], std=0.01) - normal_init(self.br_feat_adaption[i], std=0.01) - normal_init(self.tl_dcn_offset[i].conv, std=0.1) - normal_init(self.br_dcn_offset[i].conv, std=0.1) - _ = [x.conv.reset_parameters() for x in self.tl_guiding_shift[i]] - _ = [x.conv.reset_parameters() for x in self.br_guiding_shift[i]] - _ = [ - x.conv.reset_parameters() for x in self.tl_centripetal_shift[i] - ] - _ = [ - x.conv.reset_parameters() for x in self.br_centripetal_shift[i] - ] - - def forward_single(self, x, lvl_ind): - """Forward feature of a single level. - - Args: - x (Tensor): Feature of a single level. - lvl_ind (int): Level index of current feature. - - Returns: - tuple[Tensor]: A tuple of CentripetalHead's output for current - feature level. Containing the following Tensors: - - - tl_heat (Tensor): Predicted top-left corner heatmap. - - br_heat (Tensor): Predicted bottom-right corner heatmap. - - tl_off (Tensor): Predicted top-left offset heatmap. - - br_off (Tensor): Predicted bottom-right offset heatmap. - - tl_guiding_shift (Tensor): Predicted top-left guiding shift - heatmap. - - br_guiding_shift (Tensor): Predicted bottom-right guiding - shift heatmap. - - tl_centripetal_shift (Tensor): Predicted top-left centripetal - shift heatmap. - - br_centripetal_shift (Tensor): Predicted bottom-right - centripetal shift heatmap. 
- """ - tl_heat, br_heat, _, _, tl_off, br_off, tl_pool, br_pool = super( - ).forward_single( - x, lvl_ind, return_pool=True) - - tl_guiding_shift = self.tl_guiding_shift[lvl_ind](tl_pool) - br_guiding_shift = self.br_guiding_shift[lvl_ind](br_pool) - - tl_dcn_offset = self.tl_dcn_offset[lvl_ind](tl_guiding_shift.detach()) - br_dcn_offset = self.br_dcn_offset[lvl_ind](br_guiding_shift.detach()) - - tl_feat_adaption = self.tl_feat_adaption[lvl_ind](tl_pool, - tl_dcn_offset) - br_feat_adaption = self.br_feat_adaption[lvl_ind](br_pool, - br_dcn_offset) - - tl_centripetal_shift = self.tl_centripetal_shift[lvl_ind]( - tl_feat_adaption) - br_centripetal_shift = self.br_centripetal_shift[lvl_ind]( - br_feat_adaption) - - result_list = [ - tl_heat, br_heat, tl_off, br_off, tl_guiding_shift, - br_guiding_shift, tl_centripetal_shift, br_centripetal_shift - ] - return result_list - - def loss(self, - tl_heats, - br_heats, - tl_offs, - br_offs, - tl_guiding_shifts, - br_guiding_shifts, - tl_centripetal_shifts, - br_centripetal_shifts, - gt_bboxes, - gt_labels, - img_metas, - gt_bboxes_ignore=None): - """Compute losses of the head. - - Args: - tl_heats (list[Tensor]): Top-left corner heatmaps for each level - with shape (N, num_classes, H, W). - br_heats (list[Tensor]): Bottom-right corner heatmaps for each - level with shape (N, num_classes, H, W). - tl_offs (list[Tensor]): Top-left corner offsets for each level - with shape (N, corner_offset_channels, H, W). - br_offs (list[Tensor]): Bottom-right corner offsets for each level - with shape (N, corner_offset_channels, H, W). - tl_guiding_shifts (list[Tensor]): Top-left guiding shifts for each - level with shape (N, guiding_shift_channels, H, W). - br_guiding_shifts (list[Tensor]): Bottom-right guiding shifts for - each level with shape (N, guiding_shift_channels, H, W). - tl_centripetal_shifts (list[Tensor]): Top-left centripetal shifts - for each level with shape (N, centripetal_shift_channels, H, - W). 
- br_centripetal_shifts (list[Tensor]): Bottom-right centripetal - shifts for each level with shape (N, - centripetal_shift_channels, H, W). - gt_bboxes (list[Tensor]): Ground truth bboxes for each image with - shape (num_gts, 4) in [left, top, right, bottom] format. - gt_labels (list[Tensor]): Class indices corresponding to each box. - img_metas (list[dict]): Meta information of each image, e.g., - image size, scaling factor, etc. - gt_bboxes_ignore (list[Tensor] | None): Specify which bounding - boxes can be ignored when computing the loss. - - Returns: - dict[str, Tensor]: A dictionary of loss components. Containing the - following losses: - - - det_loss (list[Tensor]): Corner keypoint losses of all - feature levels. - - off_loss (list[Tensor]): Corner offset losses of all feature - levels. - - guiding_loss (list[Tensor]): Guiding shift losses of all - feature levels. - - centripetal_loss (list[Tensor]): Centripetal shift losses of - all feature levels. - """ - targets = self.get_targets( - gt_bboxes, - gt_labels, - tl_heats[-1].shape, - img_metas[0]['pad_shape'], - with_corner_emb=self.with_corner_emb, - with_guiding_shift=True, - with_centripetal_shift=True) - mlvl_targets = [targets for _ in range(self.num_feat_levels)] - [det_losses, off_losses, guiding_losses, centripetal_losses - ] = multi_apply(self.loss_single, tl_heats, br_heats, tl_offs, - br_offs, tl_guiding_shifts, br_guiding_shifts, - tl_centripetal_shifts, br_centripetal_shifts, - mlvl_targets) - loss_dict = dict( - det_loss=det_losses, - off_loss=off_losses, - guiding_loss=guiding_losses, - centripetal_loss=centripetal_losses) - return loss_dict - - def loss_single(self, tl_hmp, br_hmp, tl_off, br_off, tl_guiding_shift, - br_guiding_shift, tl_centripetal_shift, - br_centripetal_shift, targets): - """Compute losses for single level. - - Args: - tl_hmp (Tensor): Top-left corner heatmap for current level with - shape (N, num_classes, H, W). 
-            br_hmp (Tensor): Bottom-right corner heatmap for current level with
-                shape (N, num_classes, H, W).
-            tl_off (Tensor): Top-left corner offset for current level with
-                shape (N, corner_offset_channels, H, W).
-            br_off (Tensor): Bottom-right corner offset for current level with
-                shape (N, corner_offset_channels, H, W).
-            tl_guiding_shift (Tensor): Top-left guiding shift for current level
-                with shape (N, guiding_shift_channels, H, W).
-            br_guiding_shift (Tensor): Bottom-right guiding shift for current
-                level with shape (N, guiding_shift_channels, H, W).
-            tl_centripetal_shift (Tensor): Top-left centripetal shift for
-                current level with shape (N, centripetal_shift_channels, H, W).
-            br_centripetal_shift (Tensor): Bottom-right centripetal shift for
-                current level with shape (N, centripetal_shift_channels, H, W).
-            targets (dict): Corner target generated by `get_targets`.
-
-        Returns:
-            tuple[torch.Tensor]: Losses of the head's different branches
-                containing the following losses:
-
-                - det_loss (Tensor): Corner keypoint loss.
-                - off_loss (Tensor): Corner offset loss.
-                - guiding_loss (Tensor): Guiding shift loss.
-                - centripetal_loss (Tensor): Centripetal shift loss.
-        """
-        targets['corner_embedding'] = None
-
-        det_loss, _, _, off_loss = super().loss_single(tl_hmp, br_hmp, None,
-                                                       None, tl_off, br_off,
-                                                       targets)
-
-        gt_tl_guiding_shift = targets['topleft_guiding_shift']
-        gt_br_guiding_shift = targets['bottomright_guiding_shift']
-        gt_tl_centripetal_shift = targets['topleft_centripetal_shift']
-        gt_br_centripetal_shift = targets['bottomright_centripetal_shift']
-
-        gt_tl_heatmap = targets['topleft_heatmap']
-        gt_br_heatmap = targets['bottomright_heatmap']
-        # We only compute the offset loss at the real corner position.
-        # The value of real corner would be 1 in heatmap ground truth.
-        # The mask is computed in class agnostic mode and its shape is
-        # batch * 1 * width * height.
- tl_mask = gt_tl_heatmap.eq(1).sum(1).gt(0).unsqueeze(1).type_as( - gt_tl_heatmap) - br_mask = gt_br_heatmap.eq(1).sum(1).gt(0).unsqueeze(1).type_as( - gt_br_heatmap) - - # Guiding shift loss - tl_guiding_loss = self.loss_guiding_shift( - tl_guiding_shift, - gt_tl_guiding_shift, - tl_mask, - avg_factor=tl_mask.sum()) - br_guiding_loss = self.loss_guiding_shift( - br_guiding_shift, - gt_br_guiding_shift, - br_mask, - avg_factor=br_mask.sum()) - guiding_loss = (tl_guiding_loss + br_guiding_loss) / 2.0 - # Centripetal shift loss - tl_centripetal_loss = self.loss_centripetal_shift( - tl_centripetal_shift, - gt_tl_centripetal_shift, - tl_mask, - avg_factor=tl_mask.sum()) - br_centripetal_loss = self.loss_centripetal_shift( - br_centripetal_shift, - gt_br_centripetal_shift, - br_mask, - avg_factor=br_mask.sum()) - centripetal_loss = (tl_centripetal_loss + br_centripetal_loss) / 2.0 - - return det_loss, off_loss, guiding_loss, centripetal_loss - - def get_bboxes(self, - tl_heats, - br_heats, - tl_offs, - br_offs, - tl_guiding_shifts, - br_guiding_shifts, - tl_centripetal_shifts, - br_centripetal_shifts, - img_metas, - rescale=False, - with_nms=True): - """Transform network output for a batch into bbox predictions. - - Args: - tl_heats (list[Tensor]): Top-left corner heatmaps for each level - with shape (N, num_classes, H, W). - br_heats (list[Tensor]): Bottom-right corner heatmaps for each - level with shape (N, num_classes, H, W). - tl_offs (list[Tensor]): Top-left corner offsets for each level - with shape (N, corner_offset_channels, H, W). - br_offs (list[Tensor]): Bottom-right corner offsets for each level - with shape (N, corner_offset_channels, H, W). - tl_guiding_shifts (list[Tensor]): Top-left guiding shifts for each - level with shape (N, guiding_shift_channels, H, W). Useless in - this function, we keep this arg because it's the raw output - from CentripetalHead. 
- br_guiding_shifts (list[Tensor]): Bottom-right guiding shifts for - each level with shape (N, guiding_shift_channels, H, W). - Useless in this function, we keep this arg because it's the - raw output from CentripetalHead. - tl_centripetal_shifts (list[Tensor]): Top-left centripetal shifts - for each level with shape (N, centripetal_shift_channels, H, - W). - br_centripetal_shifts (list[Tensor]): Bottom-right centripetal - shifts for each level with shape (N, - centripetal_shift_channels, H, W). - img_metas (list[dict]): Meta information of each image, e.g., - image size, scaling factor, etc. - rescale (bool): If True, return boxes in original image space. - Default: False. - with_nms (bool): If True, do nms before return boxes. - Default: True. - """ - assert tl_heats[-1].shape[0] == br_heats[-1].shape[0] == len(img_metas) - result_list = [] - for img_id in range(len(img_metas)): - result_list.append( - self._get_bboxes_single( - tl_heats[-1][img_id:img_id + 1, :], - br_heats[-1][img_id:img_id + 1, :], - tl_offs[-1][img_id:img_id + 1, :], - br_offs[-1][img_id:img_id + 1, :], - img_metas[img_id], - tl_emb=None, - br_emb=None, - tl_centripetal_shift=tl_centripetal_shifts[-1][ - img_id:img_id + 1, :], - br_centripetal_shift=br_centripetal_shifts[-1][ - img_id:img_id + 1, :], - rescale=rescale, - with_nms=with_nms)) - - return result_list diff --git a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmseg/models/utils/up_conv_block.py b/spaces/abhishek/sketch-to-image/annotator/uniformer/mmseg/models/utils/up_conv_block.py deleted file mode 100644 index 378469da76cb7bff6a639e7877b3c275d50490fb..0000000000000000000000000000000000000000 --- a/spaces/abhishek/sketch-to-image/annotator/uniformer/mmseg/models/utils/up_conv_block.py +++ /dev/null @@ -1,101 +0,0 @@ -import torch -import torch.nn as nn -from annotator.uniformer.mmcv.cnn import ConvModule, build_upsample_layer - - -class UpConvBlock(nn.Module): - """Upsample convolution block in decoder for UNet. 
-
-    This upsample convolution block consists of one upsample module
-    followed by one convolution block. The upsample module expands the
-    high-level low-resolution feature map and the convolution block fuses
-    the upsampled high-level low-resolution feature map and the low-level
-    high-resolution feature map from encoder.
-
-    Args:
-        conv_block (nn.Sequential): Sequential of convolutional layers.
-        in_channels (int): Number of input channels of the high-level
-            feature map for upsample layer.
-        skip_channels (int): Number of input channels of the low-level
-            high-resolution feature map from encoder.
-        out_channels (int): Number of output channels.
-        num_convs (int): Number of convolutional layers in the conv_block.
-            Default: 2.
-        stride (int): Stride of convolutional layer in conv_block. Default: 1.
-        dilation (int): Dilation rate of convolutional layer in conv_block.
-            Default: 1.
-        with_cp (bool): Use checkpoint or not. Using checkpoint will save some
-            memory while slowing down the training speed. Default: False.
-        conv_cfg (dict | None): Config dict for convolution layer.
-            Default: None.
-        norm_cfg (dict | None): Config dict for normalization layer.
-            Default: dict(type='BN').
-        act_cfg (dict | None): Config dict for activation layer in ConvModule.
-            Default: dict(type='ReLU').
-        upsample_cfg (dict): The upsample config of the upsample module in
-            decoder. Default: dict(type='InterpConv'). If the size of
-            high-level feature map is the same as that of skip feature map
-            (low-level feature map from encoder), it does not need to upsample
-            the high-level feature map and the upsample_cfg is None.
-        dcn (bool): Use deformable convolution in convolutional layer or not.
-            Default: None.
-        plugins (dict): plugins for convolutional layers. Default: None.
- """ - - def __init__(self, - conv_block, - in_channels, - skip_channels, - out_channels, - num_convs=2, - stride=1, - dilation=1, - with_cp=False, - conv_cfg=None, - norm_cfg=dict(type='BN'), - act_cfg=dict(type='ReLU'), - upsample_cfg=dict(type='InterpConv'), - dcn=None, - plugins=None): - super(UpConvBlock, self).__init__() - assert dcn is None, 'Not implemented yet.' - assert plugins is None, 'Not implemented yet.' - - self.conv_block = conv_block( - in_channels=2 * skip_channels, - out_channels=out_channels, - num_convs=num_convs, - stride=stride, - dilation=dilation, - with_cp=with_cp, - conv_cfg=conv_cfg, - norm_cfg=norm_cfg, - act_cfg=act_cfg, - dcn=None, - plugins=None) - if upsample_cfg is not None: - self.upsample = build_upsample_layer( - cfg=upsample_cfg, - in_channels=in_channels, - out_channels=skip_channels, - with_cp=with_cp, - norm_cfg=norm_cfg, - act_cfg=act_cfg) - else: - self.upsample = ConvModule( - in_channels, - skip_channels, - kernel_size=1, - stride=1, - padding=0, - conv_cfg=conv_cfg, - norm_cfg=norm_cfg, - act_cfg=act_cfg) - - def forward(self, skip, x): - """Forward function.""" - - x = self.upsample(x) - out = torch.cat([skip, x], dim=1) - out = self.conv_block(out) - - return out diff --git a/spaces/adirik/stylemc-demo/encoder4editing/criteria/lpips/utils.py b/spaces/adirik/stylemc-demo/encoder4editing/criteria/lpips/utils.py deleted file mode 100644 index 3d15a0983775810ef6239c561c67939b2b9ee3b5..0000000000000000000000000000000000000000 --- a/spaces/adirik/stylemc-demo/encoder4editing/criteria/lpips/utils.py +++ /dev/null @@ -1,30 +0,0 @@ -from collections import OrderedDict - -import torch - - -def normalize_activation(x, eps=1e-10): - norm_factor = torch.sqrt(torch.sum(x ** 2, dim=1, keepdim=True)) - return x / (norm_factor + eps) - - -def get_state_dict(net_type: str = 'alex', version: str = '0.1'): - # build url - url = 'https://raw.githubusercontent.com/richzhang/PerceptualSimilarity/' \ - + 
f'master/lpips/weights/v{version}/{net_type}.pth' - - # download - old_state_dict = torch.hub.load_state_dict_from_url( - url, progress=True, - map_location=None if torch.cuda.is_available() else torch.device('cpu') - ) - - # rename keys - new_state_dict = OrderedDict() - for key, val in old_state_dict.items(): - new_key = key - new_key = new_key.replace('lin', '') - new_key = new_key.replace('model.', '') - new_state_dict[new_key] = val - - return new_state_dict diff --git a/spaces/adpro/dpt-depth13/README.md b/spaces/adpro/dpt-depth13/README.md deleted file mode 100644 index a2df32f52be298450622acdf691911580499139c..0000000000000000000000000000000000000000 --- a/spaces/adpro/dpt-depth13/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: Dpt Depth Estimation -emoji: ⚡ -colorFrom: blue -colorTo: red -sdk: gradio -sdk_version: 2.8.13 -app_file: app.py -pinned: false -duplicated_from: adpro/dpt-depth01 ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces#reference diff --git a/spaces/aico/TrOCR-digit/app.py b/spaces/aico/TrOCR-digit/app.py deleted file mode 100644 index 42bc4a11ed7302661c6ffb07ca6d003f37a749cf..0000000000000000000000000000000000000000 --- a/spaces/aico/TrOCR-digit/app.py +++ /dev/null @@ -1,38 +0,0 @@ -import gradio as gr -from transformers import TrOCRProcessor, VisionEncoderDecoderModel -import requests -from PIL import Image -import numpy as np -import cv2 -processor = TrOCRProcessor.from_pretrained("microsoft/trocr-base-handwritten") -model = VisionEncoderDecoderModel.from_pretrained("aico/TrOCR-MNIST") - - -def process_image(image): - - #print(np.shape(image)) - #print(image) - #rint(image.astype('uint8')) - #cv2.imwrite("image.png",image.astype('uint8'),(28, 28)) - img = Image.fromarray(image.astype('uint8')).convert("RGB") - #img = Image.open("image.png").convert("RGB") - print(img) - # prepare image - pixel_values = processor(img, return_tensors="pt").pixel_values - - # generate (no beam search) - 
generated_ids = model.generate(pixel_values) - - # decode - generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0] - - return generated_text - -title = "Interactive demo: Single Digits MNIST" -description = "Aico - University Utrecht" -iface = gr.Interface(fn=process_image, - inputs="sketchpad", - outputs="label", - title = title, - description = description) -iface.launch(debug=True) \ No newline at end of file diff --git a/spaces/aijack/jojo/e4e/utils/__init__.py b/spaces/aijack/jojo/e4e/utils/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/spaces/ajitrajasekharan/self-supervised-ner-biomedical/BatchInference.py b/spaces/ajitrajasekharan/self-supervised-ner-biomedical/BatchInference.py deleted file mode 100644 index 6d97fff519d14b75fe10b1800dc0d411c3ebb2e2..0000000000000000000000000000000000000000 --- a/spaces/ajitrajasekharan/self-supervised-ner-biomedical/BatchInference.py +++ /dev/null @@ -1,707 +0,0 @@ -import torch -import subprocess -#from pytorch_transformers import * -from transformers import * -import pdb -import operator -from collections import OrderedDict -import numpy as np -import argparse -import sys -import traceback -import string -import common as utils -import config_utils as cf -import requests -import json -import streamlit as st - -# OPTIONAL: if you want to have more information on what's happening, activate the logger as follows -import logging -logging.basicConfig(level=logging.INFO) - - -DEFAULT_TOP_K = 20 -DEFAULT_CONFIG = "./server_config.json" -DEFAULT_MODEL_PATH='./' -DEFAULT_LABELS_PATH='./labels.txt' -DEFAULT_TO_LOWER=False -DESC_FILE="./common_descs.txt" -SPECIFIC_TAG=":__entity__" -MAX_TOKENIZED_SENT_LENGTH = 500 #additional buffer for CLS SEP and entity term - -try: - from subprocess import DEVNULL # Python 3. 
-except ImportError: - DEVNULL = open(os.devnull, 'wb') - - -@st.cache() -def load_bert_model(model_name,to_lower): - try: - bert_tokenizer = BertTokenizer.from_pretrained(model_name,do_lower_case=to_lower) - bert_model = BertForMaskedLM.from_pretrained(model_name) - return bert_tokenizer,bert_model - except Exception as e: - pass - -def read_descs(file_name): - ret_dict = {} - with open(file_name) as fp: - line = fp.readline().rstrip("\n") - if (len(line) >= 1): - ret_dict[line] = 1 - while line: - line = fp.readline().rstrip("\n") - if (len(line) >= 1): - ret_dict[line] = 1 - return ret_dict - -def read_vocab(file_name): - l_vocab_dict = {} - o_vocab_dict = {} - with open(file_name) as fp: - for line in fp: - line = line.rstrip('\n') - if (len(line) > 0): - l_vocab_dict[line.lower()] = line #If there are multiple cased versions they will be collapsed into one. which is okay since we have the original saved. This is only used - #when a word is not found in its pristine form in the original list. - o_vocab_dict[line] = line - print("Read vocab file:",len(o_vocab_dict)) - return o_vocab_dict,l_vocab_dict - -def consolidate_labels(existing_node,new_labels,new_counts): - """Consolidates all the labels and counts for terms ignoring casing - - For instance, egfr may not have an entity label associated with it - but eGFR and EGFR may have. 
So if input is egfr, then this function ensures - the combined entities set fo eGFR and EGFR is made so as to return that union - for egfr - """ - new_dict = {} - existing_labels_arr = existing_node["label"].split('/') - existing_counts_arr = existing_node["counts"].split('/') - new_labels_arr = new_labels.split('/') - new_counts_arr = new_counts.split('/') - assert(len(existing_labels_arr) == len(existing_counts_arr)) - assert(len(new_labels_arr) == len(new_counts_arr)) - for i in range(len(existing_labels_arr)): - new_dict[existing_labels_arr[i]] = int(existing_counts_arr[i]) - for i in range(len(new_labels_arr)): - if (new_labels_arr[i] in new_dict): - new_dict[new_labels_arr[i]] += int(new_counts_arr[i]) - else: - new_dict[new_labels_arr[i]] = int(new_counts_arr[i]) - sorted_d = OrderedDict(sorted(new_dict.items(), key=lambda kv: kv[1], reverse=True)) - ret_labels_str = "" - ret_counts_str = "" - count = 0 - for key in sorted_d: - if (count == 0): - ret_labels_str = key - ret_counts_str = str(sorted_d[key]) - else: - ret_labels_str += '/' + key - ret_counts_str += '/' + str(sorted_d[key]) - count += 1 - return {"label":ret_labels_str,"counts":ret_counts_str} - - -def read_labels(labels_file): - terms_dict = OrderedDict() - lc_terms_dict = OrderedDict() - with open(labels_file,encoding="utf-8") as fin: - count = 1 - for term in fin: - term = term.strip("\n") - term = term.split() - if (len(term) == 3): - terms_dict[term[2]] = {"label":term[0],"counts":term[1]} - lc_term = term[2].lower() - if (lc_term in lc_terms_dict): - lc_terms_dict[lc_term] = consolidate_labels(lc_terms_dict[lc_term],term[0],term[1]) - else: - lc_terms_dict[lc_term] = {"label":term[0],"counts":term[1]} - count += 1 - else: - print("Invalid line:",term) - assert(0) - print("count of labels in " + labels_file + ":", len(terms_dict)) - return terms_dict,lc_terms_dict - - -class BatchInference: - def __init__(self, 
config_file,path,to_lower,patched,topk,abbrev,tokmod,vocab_path,labels_file,delimsep): - print("Model path:",path,"lower casing set to:",to_lower," is patched ", patched) - self.path = path - base_path = cf.read_config(config_file)["BASE_PATH"] if ("BASE_PATH" in cf.read_config(config_file)) else "./" - desc_file_path = cf.read_config(config_file)["DESC_FILE"] if ("DESC_FILE" in cf.read_config(config_file)) else DESC_FILE - self.labels_dict,self.lc_labels_dict = read_labels(labels_file) - #self.tokenizer = BertTokenizer.from_pretrained(path,do_lower_case=to_lower) ### Set this to to True for uncased models - #self.model = BertForMaskedLM.from_pretrained(path) - self.tokenizer, self.model = load_bert_model(path,to_lower) - self.model.eval() - #st.info("model loaded") - self.descs = read_descs(desc_file_path) - #st.info("descs loaded") - self.top_k = topk - self.patched = patched - self.abbrev = abbrev - self.tokmod = tokmod - self.delimsep = delimsep - self.truncated_fp = open(base_path + "truncated_sentences.txt","a") - self.always_log_fp = open(base_path + "CI_LOGS.txt","a") - if (cf.read_config(config_file)["USE_CLS"] == "1"): #Models like Bert base cased return same prediction for CLS regardless of input. So ignore CLS - print("************** USE CLS: Turned ON for this model. ******* ") - self.use_cls = True - else: - print("************** USE CLS: Turned OFF for this model. 
******* ") - self.use_cls = False - if (cf.read_config(config_file)["LOG_DESCS"] == "1"): - self.log_descs = True - self.ci_fp = open(base_path + "log_ci_predictions.txt","w") - self.cs_fp = open(base_path + "log_cs_predictions.txt","w") - else: - self.log_descs = False - self.pos_server_url = cf.read_config(config_file)["POS_SERVER_URL"] - #st.info("Attemting to load vocab file") - if (tokmod): - self.o_vocab_dict,self.l_vocab_dict = read_vocab(vocab_path + "/vocab.txt") - else: - self.o_vocab_dict = {} - self.l_vocab_dict = {} - # st.info("Constructor complete") - #pdb.set_trace() - - def dispatch_request(self,url): - max_retries = 10 - attempts = 0 - while True: - try: - r = requests.get(url,timeout=1000) - if (r.status_code == 200): - return r - except: - print("Request:", url, " failed. Retrying...") - attempts += 1 - if (attempts >= max_retries): - print("Request:", url, " failed") - break - - def modify_text_to_match_vocab(self,text): - ret_arr = [] - text = text.split() - for word in text: - if (word in self.o_vocab_dict): - ret_arr.append(word) - else: - if (word.lower() in self.l_vocab_dict): - ret_arr.append(self.l_vocab_dict[word.lower()]) - else: - ret_arr.append(word) - return ' '.join(ret_arr) - - #This is bad hack for prototyping - parsing from text output as opposed to json - def extract_POS(self,text): - arr = text.split('\n') - if (len(arr) > 0): - start_pos = 0 - for i,line in enumerate(arr): - if (len(line) > 0): - start_pos += 1 - continue - else: - break - #print(arr[start_pos:]) - terms_arr = [] - for i,line in enumerate(arr[start_pos:]): - terms = line.split('\t') - if (len(terms) == 5): - #print(terms) - terms_arr.append(terms) - return terms_arr - - def masked_word_first_letter_capitalize(self,entity): - arr = entity.split() - ret_arr = [] - for term in arr: - if (len(term) > 1 and term[0].islower() and term[1].islower()): - ret_arr.append(term[0].upper() + term[1:]) - else: - ret_arr.append(term) - return ' '.join(ret_arr) - - - def 
gen_single_phrase_sentences(self,terms_arr,span_arr): - sentence_template = "%s is a entity" - #print(span_arr) - sentences = [] - singleton_spans_arr = [] - run_index = 0 - entity = "" - singleton_span = [] - while (run_index < len(span_arr)): - if (span_arr[run_index] == 1): - while (run_index < len(span_arr)): - if (span_arr[run_index] == 1): - #print(terms_arr[run_index][WORD_POS],end=' ') - if (len(entity) == 0): - entity = terms_arr[run_index][utils.WORD_POS] - else: - entity = entity + " " + terms_arr[run_index][utils.WORD_POS] - singleton_span.append(1) - run_index += 1 - else: - break - #print() - for i in sentence_template.split(): - if (i != "%s"): - singleton_span.append(0) - entity = self.masked_word_first_letter_capitalize(entity) - if (self.tokmod): - entity = self.modify_text_to_match_vocab(entity) - sentence = sentence_template % entity - sentences.append(sentence) - singleton_spans_arr.append(singleton_span) - #print(sentence) - #rint(singleton_span) - entity = "" - singleton_span = [] - else: - run_index += 1 - return sentences,singleton_spans_arr - - - - def gen_padded_sentence(self,text,max_tokenized_sentence_length,tokenized_text_arr,orig_tokenized_length_arr,indexed_tokens_arr,attention_mask_arr,to_replace): - if (to_replace): - text_arr = text.split() - new_text_arr = [] - for i in range(len(text_arr)): - if (text_arr[i] == "entity" ): - new_text_arr.append( "[MASK]") - else: - new_text_arr.append(text_arr[i]) - text = ' '.join(new_text_arr) - text = '[CLS] ' + text + ' [SEP]' - tokenized_text = self.tokenizer.tokenize(text) - indexed_tokens = self.tokenizer.convert_tokens_to_ids(tokenized_text) - tok_length = len(indexed_tokens) - max_tokenized_sentence_length = max_tokenized_sentence_length if tok_length <= max_tokenized_sentence_length else tok_length - indexed_tokens_arr.append(indexed_tokens) - attention_mask_arr.append([1]*tok_length) - tokenized_text_arr.append(tokenized_text) - orig_tokenized_length_arr.append(tokenized_text) - 
return max_tokenized_sentence_length - - - - def find_entity(self,word): - entities = self.labels_dict - lc_entities = self.lc_labels_dict - in_vocab = False - #words = self.filter_glue_words(words) #do not filter glue words anymore. Let them pass through - l_word = word.lower() - if l_word.isdigit(): - ret_label = "MEASURE" - ret_counts = str(1) - elif (word in entities): - ret_label = entities[word]["label"] - ret_counts = entities[word]["counts"] - in_vocab = True - elif (l_word in entities): - ret_label = entities[l_word]["label"] - ret_counts = entities[l_word]["counts"] - in_vocab = True - elif (l_word in lc_entities): - ret_label = lc_entities[l_word]["label"] - ret_counts = lc_entities[l_word]["counts"] - in_vocab = True - else: - ret_label = "OTHER" - ret_counts = "1" - if (ret_label == "OTHER"): - ret_label = "UNTAGGED_ENTITY" - ret_counts = "1" - #print(word,ret_label,ret_counts) - return ret_label,ret_counts,in_vocab - - #This is just a trivial hack for consistency of CI prediction of numbers - def override_ci_number_predictions(self,masked_sent): - words = masked_sent.split() - words_count = len(words) - if (len(words) == 4 and words[words_count-1] == "entity" and words[words_count -2] == "a" and words[words_count -3] == "is" and words[0].isnumeric()): #only integers skipped - return True,"two","1","NUMBER" - else: - return False,"","","" - - def override_ci_for_vocab_terms(self,masked_sent): - words = masked_sent.split() - words_count = len(words) - if (len(words) == 4 and words[words_count-1] == "entity" and words[words_count -2] == "a" and words[words_count -3] == "is"): - entity,entity_count,in_vocab = self.find_entity(words[0]) - if (in_vocab): - return True,words[0],entity_count,entity - return False,"","","" - - - - def normalize_sent(self,sent): - normalized_tokens = "!\"%();?[]`{}" - end_tokens = "!,.:;?" 
- sent = sent.rstrip() - if (len(sent) > 1): - if (self.delimsep): - for i in range(len(normalized_tokens)): - sent = sent.replace(normalized_tokens[i],' ' + normalized_tokens[i] + ' ') - sent = sent.rstrip() - if (not sent.endswith(":__entity__")): - last_char = sent[-1] - if (last_char not in end_tokens): #End all sentences with a period if not already present in sentence. - sent = sent + ' . ' - print("Normalized sent",sent) - return sent - - def truncate_sent_if_too_long(self,text): - truncated_count = 0 - orig_sent = text - while (True): - tok_text = '[CLS] ' + text + ' [SEP]' - tokenized_text = self.tokenizer.tokenize(tok_text) - if (len(tokenized_text) < MAX_TOKENIZED_SENT_LENGTH): - break - text = ' '.join(text.split()[:-1]) - truncated_count += 1 - if (truncated_count > 0): - print("Input sentence was truncated by: ", truncated_count, " tokens") - self.truncated_fp.write("Input sentence was truncated by: " + str(truncated_count) + " tokens\n") - self.truncated_fp.write(orig_sent + "\n") - self.truncated_fp.write(text + "\n\n") - return text - - - def get_descriptors(self,sent,pos_arr): - ''' - Batched creation of descriptors given a sentence. - 1) Find noun phrases to tag in a sentence if user did not explicitly tag. - 2) Create 'N' CS and CI sentences if there are N phrases to tag. Total 2*N sentences - 3) Create a batch padding all sentences to the maximum sentence length. - 4) Perform inference on batch - 5) Return json of descriptors for the ooriginal sentence as well as all CI sentences - ''' - #Truncate sent if the tokenized sent is longer than max sent length - #st.info("in get descriptors") - sent = self.truncate_sent_if_too_long(sent) - #This is a modification of input text to words in vocab that match it in case insensitive manner. - #This is *STILL* required when we are using subwords too for prediction. The prediction quality is still better. - #An example is Mesothelioma is caused by exposure to asbestos. 
The quality of prediction is better when Mesothelioma is not split by lowercasing with A100 model - if (self.tokmod): - sent = self.modify_text_to_match_vocab(sent) - - #The input sentence is normalized. Specifically all input is terminated with a punctuation if not already present. Also some of the punctuation marks are separated from text if glued to a word(disabled by default for test set sync) - sent = self.normalize_sent(sent) - - #Step 1. Find entities to tag if user did not explicitly tag terms - #All noun phrases are tagged for prediction - if (SPECIFIC_TAG in sent): - terms_arr = utils.set_POS_based_on_entities(sent) - else: - if (pos_arr is None): - assert(0) - url = self.pos_server_url + sent.replace('"','\'') - r = self.dispatch_request(url) - terms_arr = self.extract_POS(r.text) - else: - # st.info("Reusing Pos arr") - terms_arr = pos_arr - - print(terms_arr) - #Note span arr only contains phrases in the input that need to be tagged - not the span of all phrases in sentences - #Step 2. Create N CS sentences - #This returns masked sentences for all positions - main_sent_arr,masked_sent_arr,span_arr = utils.detect_masked_positions(terms_arr) - ignore_cs = True if (len(masked_sent_arr) == 1 and len(masked_sent_arr[0]) == 2 and masked_sent_arr[0][0] == "__entity__" and masked_sent_arr[0][1] == ".") else False #This is a boundary condition to avoid using cs if the input is just trying to get entity type for a phrase. There is no sentence context in that case. - - - #Step 2. 
Create N CI sentences - singleton_sentences,not_used_singleton_spans_arr = self.gen_single_phrase_sentences(terms_arr,span_arr) - - - #We now have 2*N sentences - max_tokenized_sentence_length = 0 - tokenized_text_arr = [] - indexed_tokens_arr = [] - attention_mask_arr = [] - all_sentences_arr = [] - orig_tokenized_length_arr = [] - assert(len(masked_sent_arr) == len(singleton_sentences)) - for ci_s,cs_s in zip(singleton_sentences,masked_sent_arr): - all_sentences_arr.append(ci_s) - max_tokenized_sentence_length = self.gen_padded_sentence(ci_s,max_tokenized_sentence_length,tokenized_text_arr,orig_tokenized_length_arr,indexed_tokens_arr,attention_mask_arr,True) - cs_s = ' '.join(cs_s).replace("__entity__","entity") - all_sentences_arr.append(cs_s) - max_tokenized_sentence_length = self.gen_padded_sentence(cs_s,max_tokenized_sentence_length,tokenized_text_arr,orig_tokenized_length_arr,indexed_tokens_arr,attention_mask_arr,True) - - - #pad all sentences with length less than max sentence length. 
This includes the full sentence too since we used indexed_tokens_arr - for i in range(len(indexed_tokens_arr)): - padding = [self.tokenizer.pad_token_id]*(max_tokenized_sentence_length - len(indexed_tokens_arr[i])) - att_padding = [0]*(max_tokenized_sentence_length - len(indexed_tokens_arr[i])) - if (len(padding) > 0): - indexed_tokens_arr[i].extend(padding) - attention_mask_arr[i].extend(att_padding) - - - assert(len(main_sent_arr) == len(span_arr)) - assert(len(all_sentences_arr) == len(indexed_tokens_arr)) - assert(len(all_sentences_arr) == len(attention_mask_arr)) - assert(len(all_sentences_arr) == len(tokenized_text_arr)) - assert(len(all_sentences_arr) == len(orig_tokenized_length_arr)) - # Convert inputs to PyTorch tensors - tokens_tensor = torch.tensor(indexed_tokens_arr) - attention_tensors = torch.tensor(attention_mask_arr) - - - print("Input:",sent) - ret_obj = OrderedDict() - with torch.no_grad(): - predictions = self.model(tokens_tensor, attention_mask=attention_tensors) - for sent_index in range(len(predictions[0])): - - #print("*** Current sentence ***",all_sentences_arr[sent_index]) - if (self.log_descs): - fp = self.cs_fp if sent_index %2 != 0 else self.ci_fp - fp.write("\nCurrent sentence: " + all_sentences_arr[sent_index] + "\n") - prediction = "ci_prediction" if (sent_index %2 == 0 ) else "cs_prediction" - out_index = int(sent_index/2) + 1 - if (out_index not in ret_obj): - ret_obj[out_index] = {} - assert(prediction not in ret_obj[out_index]) - ret_obj[out_index][prediction] = {} - ret_obj[out_index][prediction]["sentence"] = all_sentences_arr[sent_index] - curr_sent_arr = [] - ret_obj[out_index][prediction]["descs"] = curr_sent_arr - - for word in range(len(tokenized_text_arr[sent_index])): - if (word == len(tokenized_text_arr[sent_index]) - 1): # SEP is skipped for CI and CS - continue - if (sent_index %2 == 0 and (word != 0 and word != len(orig_tokenized_length_arr[sent_index]) - 2)): #For all CI sentences pick only the neighbors of CLS and 
the last word of the sentence (X is a entity) - #if (sent_index %2 == 0 and (word != 0 and word != len(orig_tokenized_length_arr[sent_index]) - 2) and word != len(orig_tokenized_length_arr[sent_index]) - 3): #For all CI sentences - just pick CLS, "a" and "entity" - #if (sent_index %2 == 0 and (word != 0 and (word == len(orig_tokenized_length_arr[sent_index]) - 4))): #For all CI sentences pick ALL terms excluding "is" in "X is a entity" - continue - if (sent_index %2 == 0 and (word == 0 and not self.use_cls)): #This is for models like bert base cased where we cant use CLS - it is the same for all words. - continue - - if (sent_index %2 != 0 and tokenized_text_arr[sent_index][word] != "[MASK]"): # for all CS sentences skip all terms except the mask position - continue - - - results_dict = {} - masked_index = word - #pick all model predictions for current position word - if (self.patched): - for j in range(len(predictions[0][0][sent_index][masked_index])): - tok = tokenizer.convert_ids_to_tokens([j])[0] - results_dict[tok] = float(predictions[0][0][sent_index][masked_index][j].tolist()) - else: - for j in range(len(predictions[0][sent_index][masked_index])): - tok = self.tokenizer.convert_ids_to_tokens([j])[0] - results_dict[tok] = float(predictions[0][sent_index][masked_index][j].tolist()) - k = 0 - #sort it - big to small - sorted_d = OrderedDict(sorted(results_dict.items(), key=lambda kv: kv[1], reverse=True)) - - - #print("********* Top predictions for token: ",tokenized_text_arr[sent_index][word]) - if (self.log_descs): - fp.write("********* Top predictions for token: " + tokenized_text_arr[sent_index][word] + "\n") - if (sent_index %2 == 0): #For CI sentences, just pick half for CLS and entity position to match with CS counts - if (self.use_cls): #If we are not using [CLS] for models like BBC, then take all top k from the entity prediction - top_k = self.top_k/2 - else: - top_k = self.top_k - else: - top_k = self.top_k - #Looping through each descriptor 
prediction for a position and picking it up subject to some conditions - for index in sorted_d: - #if (index in string.punctuation or index.startswith('##') or len(index) == 1 or index.startswith('.') or index.startswith('[')): - if index.lower() in self.descs: #these have almost no entity info - glue words like "the","a" - continue - #if (index in string.punctuation or len(index) == 1 or index.startswith('.') or index.startswith('[') or index.startswith("#")): - if (index in string.punctuation or len(index) == 1 or index.startswith('.') or index.startswith('[')): - continue - if (index.startswith("#")): #subwords suggest model is trying to predict a multi word term that generally tends to be noisy. So penalize. Count and skip - k += 1 - continue - #print(index,round(float(sorted_d[index]),4)) - if (sent_index % 2 != 0): - #CS predictions - entity,entity_count,dummy = self.find_entity(index) - if (self.log_descs): - self.cs_fp.write(index + " " + entity + " " + entity_count + " " + str(round(float(sorted_d[index]),4)) + "\n") - if (not ignore_cs): - curr_sent_arr.append({"desc":index,"e":entity,"e_count":entity_count,"v":str(round(float(sorted_d[index]),4))}) - if (all_sentences_arr[sent_index].strip().rstrip(".").strip().endswith("entity")): - self.always_log_fp.write(' '.join(all_sentences_arr[sent_index].split()[:-1]) + " " + index + " :__entity__\n") - else: - #CI predictions of the form X is a entity - entity,entity_count,dummy = self.find_entity(index) #index is one of the predicted descs for the [CLS]/[MASK] position - number_override,override_index,override_entity_count,override_entity = self.override_ci_number_predictions(all_sentences_arr[sent_index]) #Note this override just uses the sentence to override all descs - if (number_override): #note the prediction for this position still takes the prediction float values the model returns - index = override_index - entity_count = override_entity_count - entity = override_entity - else: - if (not self.use_cls or
word != 0): - override,override_index,override_entity_count,override_entity = self.override_ci_for_vocab_terms(all_sentences_arr[sent_index]) #this also uses the sentence to override, ignoring descs, except reusing the prediction score - if (override): #note the prediction for this position still takes the prediction float values the model returns - index = override_index - entity_count = override_entity_count - entity = override_entity - k = top_k #just add this override once. We don't have to add this override for each descriptor and inundate downstream NER with the same signature - - if (self.log_descs): - self.ci_fp.write(index + " " + entity + " " + entity_count + " " + str(round(float(sorted_d[index]),4)) + "\n") - curr_sent_arr.append({"desc":index,"e":entity,"e_count":entity_count,"v":str(round(float(sorted_d[index]),4))}) - #if (index != "two" and not index.startswith("#") and not all_sentences_arr[sent_index].strip().startswith("is ")): - if (index != "two" and not all_sentences_arr[sent_index].strip().startswith("is ")): - self.always_log_fp.write(' '.join(all_sentences_arr[sent_index].split()[:-1]) + " " + index + " :__entity__\n") - k += 1 - if (k >= top_k): - break - #print() - #print(ret_obj) - #st.info("End of prediction") - #pdb.set_trace() - #final_obj = {"terms_arr":main_sent_arr,"span_arr":span_arr,"descs_and_entities":ret_obj,"all_sentences":all_sentences_arr} - final_obj = {"input":sent,"terms_arr":main_sent_arr,"span_arr":span_arr,"descs_and_entities":ret_obj} - if (self.log_descs): - self.ci_fp.flush() - self.cs_fp.flush() - self.always_log_fp.flush() - self.truncated_fp.flush() - return final_obj - - -test_arr = [ - "ajit? is an engineer .", - "Sam:__entity__ Malone:__entity__ .", - "1. 
Jesper:__entity__ Ronnback:__entity__ ( Sweden:__entity__ ) 25.76 points", - "He felt New York has a chance:__entity__ to win this year's competition .", - "The new omicron variant could increase the likelihood that people will need a fourth coronavirus vaccine dose earlier than expected, executives at Prin dummy:__entity__ said Wednesday .", - "The new omicron variant could increase the likelihood that people will need a fourth coronavirus vaccine dose earlier than expected, executives at pharmaceutical:__entity__ giant:__entity__ Pfizer:__entity__ said Wednesday .", - "The conditions:__entity__ in the camp were very poor", - "Imatinib:__entity__ is used to treat nsclc", - "imatinib:__entity__ is used to treat nsclc", - "imatinib:__entity__ mesylate:__entity__ is used to treat nsclc", - "Staten is a :__entity__", - "John is a :__entity__", - "I met my best friend at eighteen :__entity__", - "I met my best friend at Parkinson's", - "e", - "Bandolier - Budgie ' , a free itunes app for ipad , iphone and ipod touch , released in December 2011 , tells the story of the making of Bandolier in the band 's own words - including an extensive audio interview with Burke Shelley", - "The portfolio manager of the new cryptocurrency firm underwent a bone marrow biopsy: for AML:__entity__:", - "Coronavirus:__entity__ disease 2019 (COVID-19) is a contagious disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The first known case was identified in Wuhan, China, in December 2019.[7] The disease has since spread worldwide, leading to an ongoing pandemic.[8]Symptoms of COVID-19 are variable, but often include fever,[9] cough, headache,[10] fatigue, breathing difficulties, and loss of smell and taste.[11][12][13] Symptoms may begin one to fourteen days after exposure to the virus. 
At least a third of people who are infected do not develop noticeable symptoms.[14] Of those people who develop symptoms noticeable enough to be classed as patients, most (81%) develop mild to moderate symptoms (up to mild pneumonia), while 14% develop severe symptoms (dyspnea, hypoxia, or more than 50% lung involvement on imaging), and 5% suffer critical symptoms (respiratory failure, shock, or multiorgan dysfunction).[15] Older people are at a higher risk of developing severe symptoms. Some people continue to experience a range of effects (long COVID) for months after recovery, and damage to organs has been observed.[16] Multi-year studies are underway to further investigate the long-term effects of the disease.[16]COVID-19 transmits when people breathe in air contaminated by droplets and small airborne particles containing the virus. The risk of breathing these in is highest when people are in close proximity, but they can be inhaled over longer distances, particularly indoors. Transmission can also occur if splashed or sprayed with contaminated fluids in the eyes, nose or mouth, and, rarely, via contaminated surfaces. People remain contagious for up to 20 days, and can spread the virus even if they do not develop symptoms.[17][18]Several testing methods have been developed to diagnose the disease. The standard diagnostic method is by detection of the virus' nucleic acid by real-time reverse transcription polymerase chain reaction (rRT-PCR), transcription-mediated amplification (TMA), or by reverse transcription loop-mediated isothermal amplification (RT-LAMP) from a nasopharyngeal swab.Several COVID-19 vaccines have been approved and distributed in various countries, which have initiated mass vaccination campaigns. Other preventive measures include physical or social distancing, quarantining, ventilation of indoor spaces, covering coughs and sneezes, hand washing, and keeping unwashed hands away from the face. 
The use of face masks or coverings has been recommended in public settings to minimize the risk of transmissions. While work is underway to develop drugs that inhibit the virus, the primary treatment is symptomatic. Management involves the treatment of symptoms, supportive care, isolation, and experimental measures.", - "imatinib was used to treat Michael Jackson . ", - "eg .", - "mesothelioma is caused by exposure to organic :__entity__", - "Mesothelioma is caused by exposure to asbestos:__entity__", - "Asbestos is a highly :__entity__", - "Fyodor:__entity__ Mikhailovich:__entity__ Dostoevsky:__entity__ was treated for Parkinsons:__entity__ and later died of lung carcinoma", - "Fyodor:__entity__ Mikhailovich:__entity__ Dostoevsky:__entity__", - "imatinib was used to treat Michael:__entity__ Jackson:__entity__", - "Ajit flew to Boston:__entity__", - "Ajit:__entity__ flew to Boston", - "A eGFR below 60:__entity__ indicates chronic kidney disease", - "imatinib was used to treat Michael Jackson", - "Ajit Valath:__entity__ Rajasekharan is an engineer at nFerence headquartered in Cambrigde MA", - "imatinib:__entity__", - "imatinib", - "iplimumab:__entity__", - "iplimumab", - "engineer:__entity__", - "engineer", - "Complications include peritonsillar:__entity__ abscess::__entity__", - "Imatinib was the first signal transduction inhibitor (STI,, used in a clinical setting. It prevents a BCR-ABL protein from exerting its role in the oncogenic pathway in chronic:__entity__ myeloid:__entity__ leukemia:__entity__ (CML,", - "Imatinib was the first signal transduction inhibitor (STI,, used in a clinical setting. It prevents a BCR-ABL protein from exerting its role in the oncogenic pathway in chronic myeloid leukemia (CML,", - "Imatinib was the first signal transduction inhibitor (STI,, used in a clinical setting. 
It prevents a BCR-ABL protein from exerting its role in the oncogenic pathway in chronic:__entity__ myeloid:___entity__ leukemia:__entity__ (CML,", - "Ajit Rajasekharan is an engineer:__entity__ at nFerence:__entity__", - "Imatinib was the first signal transduction inhibitor (STI,, used in a clinical setting. It prevents a BCR-ABL protein from exerting its role in the oncogenic pathway in chronic myeloid leukemia (CML,", - "Ajit:__entity__ Rajasekharan:__entity__ is an engineer", - "Imatinib:__entity__ was the first signal transduction inhibitor (STI,, used in a clinical setting. It prevents a BCR-ABL protein from exerting its role in the oncogenic pathway in chronic myeloid leukemia (CML,", - "Ajit Valath Rajasekharan is an engineer at nFerence headquartered in Cambrigde MA", - "Ajit:__entity__ Valath Rajasekharan is an engineer:__entity__ at nFerence headquartered in Cambrigde MA", - "Ajit:__entity__ Valath:__entity__ Rajasekharan is an engineer:__entity__ at nFerence headquartered in Cambrigde MA", - "Ajit:__entity__ Valath:__entity__ Rajasekharan:__entity__ is an engineer:__entity__ at nFerence headquartered in Cambrigde MA", - "Ajit Raj is an engineer:__entity__ at nFerence", - "Ajit Valath:__entity__ Rajasekharan is an engineer:__entity__ at nFerence headquartered in Cambrigde:__entity__ MA", - "Ajit Valath Rajasekharan is an engineer:__entity__ at nFerence headquartered in Cambrigde:__entity__ MA", - "Ajit Valath Rajasekharan is an engineer:__entity__ at nFerence headquartered in Cambrigde MA", - "Ajit Valath Rajasekharan is an engineer at nFerence headquartered in Cambrigde MA", - "Ajit:__entity__ Rajasekharan:__entity__ is an engineer at nFerence:__entity__", - "Imatinib mesylate is used to treat non small cell lung cancer", - "Imatinib mesylate is used to treat :__entity__", - "Imatinib is a term:__entity__", - "nsclc is a term:__entity__", - "Ajit Rajasekharan is a term:__entity__", - "ajit rajasekharan is a term:__entity__", - "John Doe is a 
term:__entity__" -] - - -def test_sentences(singleton,iter_val): - with open("debug.txt","w") as fp: - for test in iter_val: - test = test.rstrip('\n') - fp.write(test + "\n") - print(test) - out = singleton.get_descriptors(test) - print(out) - fp.write(json.dumps(out,indent=4)) - fp.flush() - print() - pdb.set_trace() - - -if __name__ == '__main__': - parser = argparse.ArgumentParser(description='BERT descriptor service given a sentence. The word to be masked is specified as the special token :__entity__ ',formatter_class=argparse.ArgumentDefaultsHelpFormatter) - parser.add_argument('-config', action="store", dest="config", default=DEFAULT_CONFIG,help='config file path') - parser.add_argument('-model', action="store", dest="model", default=DEFAULT_MODEL_PATH,help='BERT pretrained models, or custom model path') - parser.add_argument('-input', action="store", dest="input", default="",help='Optional input file with sentences. If not specified, assumed to be canned sentence run (default behavior)') - parser.add_argument('-topk', action="store", dest="topk", default=DEFAULT_TOP_K,type=int,help='Number of neighbors to display') - parser.add_argument('-tolower', dest="tolower", action='store_true',help='Convert tokens to lowercase. Set to True only for uncased models') - parser.add_argument('-no-tolower', dest="tolower", action='store_false',help='Convert tokens to lowercase. 
Set to True only for uncased models') - parser.set_defaults(tolower=False) - parser.add_argument('-patched', dest="patched", action='store_true',help='Is pytorch code patched to harvest [CLS]') - parser.add_argument('-no-patched', dest="patched", action='store_false',help='Is pytorch code patched to harvest [CLS]') - parser.add_argument('-abbrev', dest="abbrev", action='store_true',help='Just output pivots - not all neighbors') - parser.add_argument('-no-abbrev', dest="abbrev", action='store_false',help='Just output pivots - not all neighbors') - parser.add_argument('-tokmod', dest="tokmod", action='store_true',help='Modify input token casings to match vocab - meaningful only for cased models') - parser.add_argument('-no-tokmod', dest="tokmod", action='store_false',help='Modify input token casings to match vocab - meaningful only for cased models') - parser.add_argument('-vocab', action="store", dest="vocab", default=DEFAULT_MODEL_PATH,help='Path to vocab file. This is required only if tokmod is true') - parser.add_argument('-labels', action="store", dest="labels", default=DEFAULT_LABELS_PATH,help='Path to labels file. This returns labels also') - parser.add_argument('-delimsep', dest="delimsep", action='store_true',help='Modify input tokens where delimiters are stuck to tokens. Turned off by default to be in sync with test sets') - parser.add_argument('-no-delimsep', dest="delimsep", action='store_false',help='Modify input tokens where delimiters are stuck to tokens. 
Turned off by default to be in sync with test sets') - parser.set_defaults(tolower=False) - parser.set_defaults(patched=False) - parser.set_defaults(abbrev=True) - parser.set_defaults(tokmod=True) - parser.set_defaults(delimsep=False) - - results = parser.parse_args() - try: - singleton = BatchInference(results.config,results.model,results.tolower,results.patched,results.topk,results.abbrev,results.tokmod,results.vocab,results.labels,results.delimsep) - print("Lowercasing is set to:",results.tolower) - if (len(results.input) == 0): - print("Canned test mode") - test_sentences(singleton,test_arr) - else: - print("Batch file test mode") - fp = open(results.input) - test_sentences(singleton,fp) - - except Exception: - print("Unexpected error:", sys.exc_info()[0]) - traceback.print_exc(file=sys.stdout) - diff --git a/spaces/akhaliq/PaintTransformer/train/util/visualizer.py b/spaces/akhaliq/PaintTransformer/train/util/visualizer.py deleted file mode 100644 index 529f86f473fadc5ebba29cbf413715539bd0e684..0000000000000000000000000000000000000000 --- a/spaces/akhaliq/PaintTransformer/train/util/visualizer.py +++ /dev/null @@ -1,224 +0,0 @@ -import numpy as np -import os -import sys -import ntpath -import time -from . import util, html -from subprocess import Popen, PIPE - - -if sys.version_info[0] == 2: - VisdomExceptionBase = Exception -else: - VisdomExceptionBase = ConnectionError - - -def save_images(webpage, visuals, image_path, aspect_ratio=1.0, width=256): - """Save images to the disk. 
- - Parameters: - webpage (the HTML class) -- the HTML webpage class that stores these images (see html.py for more details) - visuals (OrderedDict) -- an ordered dictionary that stores (name, images (either tensor or numpy) ) pairs - image_path (str) -- the string is used to create image paths - aspect_ratio (float) -- the aspect ratio of saved images - width (int) -- the images will be resized to width x width - - This function will save images stored in 'visuals' to the HTML file specified by 'webpage'. - """ - image_dir = webpage.get_image_dir() - short_path = ntpath.basename(image_path[0]) - name = os.path.splitext(short_path)[0] - - webpage.add_header(name) - ims, txts, links = [], [], [] - - for label, im_data in visuals.items(): - im = util.tensor2im(im_data) - image_name = '%s_%s.png' % (name, label) - save_path = os.path.join(image_dir, image_name) - util.save_image(im, save_path, aspect_ratio=aspect_ratio) - ims.append(image_name) - txts.append(label) - links.append(image_name) - webpage.add_images(ims, txts, links, width=width) - - -class Visualizer: - """This class includes several functions that can display/save images and print/save logging information. - - It uses a Python library 'visdom' for display, and a Python library 'dominate' (wrapped in 'HTML') for creating - HTML files with images. 
- """ - - def __init__(self, opt): - """Initialize the Visualizer class - - Parameters: - opt -- stores all the experiment flags; needs to be a subclass of BaseOptions - Step 1: Cache the training/test options - Step 2: connect to a visdom server - Step 3: create an HTML object for saveing HTML filters - Step 4: create a logging file to store training losses - """ - self.opt = opt # cache the option - self.display_id = opt.display_id - self.use_html = opt.isTrain and not opt.no_html - self.win_size = opt.display_winsize - self.name = opt.name - self.port = opt.display_port - self.saved = False - if self.display_id > 0: # connect to a visdom server given and - import visdom - self.ncols = opt.display_ncols - self.vis = visdom.Visdom(server=opt.display_server, port=opt.display_port, env=opt.display_env) - if not self.vis.check_connection(): - self.create_visdom_connections() - - if self.use_html: # create an HTML object at /web/; images will be saved under - # /web/images/ - self.web_dir = os.path.join(opt.checkpoints_dir, opt.name, 'web') - self.img_dir = os.path.join(self.web_dir, 'images') - print('create web directory %s...' % self.web_dir) - util.mkdirs([self.web_dir, self.img_dir]) - # create a logging file to store training losses - self.log_name = os.path.join(opt.checkpoints_dir, opt.name, 'loss_log.txt') - with open(self.log_name, "a") as log_file: - now = time.strftime("%c") - log_file.write('================ Training Loss (%s) ================\n' % now) - - def reset(self): - """Reset the self.saved status""" - self.saved = False - - def create_visdom_connections(self): - """If the program could not connect to Visdom server, this function will start a new server at port < - self.port > """ - cmd = sys.executable + ' -m visdom.server -p %d &>/dev/null &' % self.port - print('\n\nCould not connect to Visdom server. 
\n Trying to start a server....') - print('Command: %s' % cmd) - Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE) - - def display_current_results(self, visuals, epoch, save_result): - """Display current results on visdom; save current results to an HTML file. - - Parameters: - visuals (OrderedDict) - - dictionary of images to display or save - epoch (int) - - the current epoch - save_result (bool) - - whether to save the current results to an HTML file - """ - if self.display_id > 0: # show images in the browser using visdom - ncols = self.ncols - if ncols > 0: # show all the images in one visdom panel - ncols = min(ncols, len(visuals)) - h, w = next(iter(visuals.values())).shape[:2] - table_css = """<style> table {border-collapse: separate; border-spacing: 4px; white-space: nowrap; text-align: center} table td {width: %dpx; height: %dpx; padding: 4px; outline: 4px solid black} </style>""" % (w, h) # create a table css - # create a table of images. - title = self.name - label_html = '' - label_html_row = '' - images = [] - idx = 0 - for label, image in visuals.items(): - image_numpy = util.tensor2im(image) - label_html_row += '<td>%s</td>' % label - images.append(image_numpy.transpose([2, 0, 1])) - idx += 1 - if idx % ncols == 0: - label_html += '<tr>%s</tr>' % label_html_row - label_html_row = '' - white_image = np.ones_like(image_numpy.transpose([2, 0, 1])) * 255 - while idx % ncols != 0: - images.append(white_image) - label_html_row += '<td></td>' - idx += 1 - if label_html_row != '': - label_html += '<tr>%s</tr>' % label_html_row - try: - self.vis.images(images, nrow=ncols, win=self.display_id + 1, - padding=2, opts=dict(title=title + ' images')) - label_html = '<table>%s</table>
' % label_html - self.vis.text(table_css + label_html, win=self.display_id + 2, - opts=dict(title=title + ' labels')) - except VisdomExceptionBase: - self.create_visdom_connections() - - else: # show each image in a separate visdom panel; - idx = 1 - try: - for label, image in visuals.items(): - image_numpy = util.tensor2im(image) - self.vis.image(image_numpy.transpose([2, 0, 1]), opts=dict(title=label), - win=self.display_id + idx) - idx += 1 - except VisdomExceptionBase: - self.create_visdom_connections() - - if self.use_html and (save_result or not self.saved): # save images to an HTML file if they haven't been saved. - self.saved = True - # save images to the disk - for label, image in visuals.items(): - image_numpy = util.tensor2im(image) - img_path = os.path.join(self.img_dir, 'epoch%.3d_%s.png' % (epoch, label)) - util.save_image(image_numpy, img_path) - - # update website - webpage = html.HTML(self.web_dir, 'Experiment name = %s' % self.name, refresh=1) - for n in range(epoch, 0, -1): - webpage.add_header('epoch [%d]' % n) - ims, txts, links = [], [], [] - - for label, image_numpy in visuals.items(): - image_numpy = util.tensor2im(image_numpy) - img_path = 'epoch%.3d_%s.png' % (n, label) - ims.append(img_path) - txts.append(label) - links.append(img_path) - webpage.add_images(ims, txts, links, width=self.win_size) - webpage.save() - - def plot_current_losses(self, epoch, counter_ratio, losses): - """display the current losses on visdom display: dictionary of error labels and values - - Parameters: - epoch (int) -- current epoch - counter_ratio (float) -- progress (percentage) in the current epoch, between 0 and 1 - losses (OrderedDict) -- training losses stored in the format of (name, float) pairs - """ - if not hasattr(self, 'plot_data'): - self.plot_data = {'X': [], 'Y': [], 'legend': list(losses.keys())} - self.plot_data['X'].append(epoch + counter_ratio) - self.plot_data['Y'].append([losses[k] for k in self.plot_data['legend']]) - try: - self.vis.line( - 
X=np.stack([np.array(self.plot_data['X'])] * len(self.plot_data['legend']), 1), - Y=np.array(self.plot_data['Y']), - opts={ - 'title': self.name + ' loss over time', - 'legend': self.plot_data['legend'], - 'xlabel': 'epoch', - 'ylabel': 'loss'}, - win=self.display_id) - except VisdomExceptionBase: - self.create_visdom_connections() - - # losses: same format as |losses| of plot_current_losses - def print_current_losses(self, epoch, iters, losses, t_comp, t_data): - """print current losses on console; also save the losses to the disk - - Parameters: - epoch (int) -- current epoch - iters (int) -- current training iteration during this epoch (reset to 0 at the end of every epoch) - losses (OrderedDict) -- training losses stored in the format of (name, float) pairs - t_comp (float) -- computational time per data point (normalized by batch_size) - t_data (float) -- data loading time per data point (normalized by batch_size) - """ - message = '(epoch: %d, iters: %d, time: %.3f, data: %.3f) ' % (epoch, iters, t_comp, t_data) - for k, v in losses.items(): - message += '%s: %.3f ' % (k, v) - - print(message) # print the message - with open(self.log_name, "a") as log_file: - log_file.write('%s\n' % message) # save the message diff --git a/spaces/akhaliq/stylegan3_clip/train.py b/spaces/akhaliq/stylegan3_clip/train.py deleted file mode 100644 index 7f055b300f7b03747bb237b2458c067dd6512379..0000000000000000000000000000000000000000 --- a/spaces/akhaliq/stylegan3_clip/train.py +++ /dev/null @@ -1,288 +0,0 @@ -# Copyright (c) 2021, NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# -# NVIDIA CORPORATION and its licensors retain all intellectual property -# and proprietary rights in and to this software, related documentation -# and any modifications thereto. Any use, reproduction, disclosure or -# distribution of this software and related documentation without an express -# license agreement from NVIDIA CORPORATION is strictly prohibited. 
- -"""Train a GAN using the techniques described in the paper -"Alias-Free Generative Adversarial Networks".""" - -import os -import click -import re -import json -import tempfile -import torch - -import dnnlib -from training import training_loop -from metrics import metric_main -from torch_utils import training_stats -from torch_utils import custom_ops - -#---------------------------------------------------------------------------- - -def subprocess_fn(rank, c, temp_dir): - dnnlib.util.Logger(file_name=os.path.join(c.run_dir, 'log.txt'), file_mode='a', should_flush=True) - - # Init torch.distributed. - if c.num_gpus > 1: - init_file = os.path.abspath(os.path.join(temp_dir, '.torch_distributed_init')) - if os.name == 'nt': - init_method = 'file:///' + init_file.replace('\\', '/') - torch.distributed.init_process_group(backend='gloo', init_method=init_method, rank=rank, world_size=c.num_gpus) - else: - init_method = f'file://{init_file}' - torch.distributed.init_process_group(backend='nccl', init_method=init_method, rank=rank, world_size=c.num_gpus) - - # Init torch_utils. - sync_device = torch.device('cuda', rank) if c.num_gpus > 1 else None - training_stats.init_multiprocessing(rank=rank, sync_device=sync_device) - if rank != 0: - custom_ops.verbosity = 'none' - - # Execute training loop. - training_loop.training_loop(rank=rank, **c) - -#---------------------------------------------------------------------------- - -def launch_training(c, desc, outdir, dry_run): - dnnlib.util.Logger(should_flush=True) - - # Pick output directory. 
- prev_run_dirs = [] - if os.path.isdir(outdir): - prev_run_dirs = [x for x in os.listdir(outdir) if os.path.isdir(os.path.join(outdir, x))] - prev_run_ids = [re.match(r'^\d+', x) for x in prev_run_dirs] - prev_run_ids = [int(x.group()) for x in prev_run_ids if x is not None] - cur_run_id = max(prev_run_ids, default=-1) + 1 - c.run_dir = os.path.join(outdir, f'{cur_run_id:05d}-{desc}') - assert not os.path.exists(c.run_dir) - - # Print options. - print() - print('Training options:') - print(json.dumps(c, indent=2)) - print() - print(f'Output directory: {c.run_dir}') - print(f'Number of GPUs: {c.num_gpus}') - print(f'Batch size: {c.batch_size} images') - print(f'Training duration: {c.total_kimg} kimg') - print(f'Dataset path: {c.training_set_kwargs.path}') - print(f'Dataset size: {c.training_set_kwargs.max_size} images') - print(f'Dataset resolution: {c.training_set_kwargs.resolution}') - print(f'Dataset labels: {c.training_set_kwargs.use_labels}') - print(f'Dataset x-flips: {c.training_set_kwargs.xflip}') - print() - - # Dry run? - if dry_run: - print('Dry run; exiting.') - return - - # Create output directory. - print('Creating output directory...') - os.makedirs(c.run_dir) - with open(os.path.join(c.run_dir, 'training_options.json'), 'wt') as f: - json.dump(c, f, indent=2) - - # Launch processes. - print('Launching processes...') - torch.multiprocessing.set_start_method('spawn') - with tempfile.TemporaryDirectory() as temp_dir: - if c.num_gpus == 1: - subprocess_fn(rank=0, c=c, temp_dir=temp_dir) - else: - torch.multiprocessing.spawn(fn=subprocess_fn, args=(c, temp_dir), nprocs=c.num_gpus) - -#---------------------------------------------------------------------------- - -def init_dataset_kwargs(data): - try: - dataset_kwargs = dnnlib.EasyDict(class_name='training.dataset.ImageFolderDataset', path=data, use_labels=True, max_size=None, xflip=False) - dataset_obj = dnnlib.util.construct_class_by_name(**dataset_kwargs) # Subclass of training.dataset.Dataset. 
- dataset_kwargs.resolution = dataset_obj.resolution # Be explicit about resolution. - dataset_kwargs.use_labels = dataset_obj.has_labels # Be explicit about labels. - dataset_kwargs.max_size = len(dataset_obj) # Be explicit about dataset size. - return dataset_kwargs, dataset_obj.name - except IOError as err: - raise click.ClickException(f'--data: {err}') - -#---------------------------------------------------------------------------- - -def parse_comma_separated_list(s): - if isinstance(s, list): - return s - if s is None or s.lower() == 'none' or s == '': - return [] - return s.split(',') - -#---------------------------------------------------------------------------- - -@click.command() - -# Required. -@click.option('--outdir', help='Where to save the results', metavar='DIR', required=True) -@click.option('--cfg', help='Base configuration', type=click.Choice(['stylegan3-t', 'stylegan3-r', 'stylegan2']), required=True) -@click.option('--data', help='Training data', metavar='[ZIP|DIR]', type=str, required=True) -@click.option('--gpus', help='Number of GPUs to use', metavar='INT', type=click.IntRange(min=1), required=True) -@click.option('--batch', help='Total batch size', metavar='INT', type=click.IntRange(min=1), required=True) -@click.option('--gamma', help='R1 regularization weight', metavar='FLOAT', type=click.FloatRange(min=0), required=True) - -# Optional features. 
-@click.option('--cond', help='Train conditional model', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--mirror', help='Enable dataset x-flips', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--aug', help='Augmentation mode', type=click.Choice(['noaug', 'ada', 'fixed']), default='ada', show_default=True) -@click.option('--resume', help='Resume from given network pickle', metavar='[PATH|URL]', type=str) -@click.option('--freezed', help='Freeze first layers of D', metavar='INT', type=click.IntRange(min=0), default=0, show_default=True) - -# Misc hyperparameters. -@click.option('--p', help='Probability for --aug=fixed', metavar='FLOAT', type=click.FloatRange(min=0, max=1), default=0.2, show_default=True) -@click.option('--target', help='Target value for --aug=ada', metavar='FLOAT', type=click.FloatRange(min=0, max=1), default=0.6, show_default=True) -@click.option('--batch-gpu', help='Limit batch size per GPU', metavar='INT', type=click.IntRange(min=1)) -@click.option('--cbase', help='Capacity multiplier', metavar='INT', type=click.IntRange(min=1), default=32768, show_default=True) -@click.option('--cmax', help='Max. feature maps', metavar='INT', type=click.IntRange(min=1), default=512, show_default=True) -@click.option('--glr', help='G learning rate [default: varies]', metavar='FLOAT', type=click.FloatRange(min=0)) -@click.option('--dlr', help='D learning rate', metavar='FLOAT', type=click.FloatRange(min=0), default=0.002, show_default=True) -@click.option('--map-depth', help='Mapping network depth [default: varies]', metavar='INT', type=click.IntRange(min=1)) -@click.option('--mbstd-group', help='Minibatch std group size', metavar='INT', type=click.IntRange(min=1), default=4, show_default=True) - -# Misc settings. 
-@click.option('--desc', help='String to include in result dir name', metavar='STR', type=str) -@click.option('--metrics', help='Quality metrics', metavar='[NAME|A,B,C|none]', type=parse_comma_separated_list, default='fid50k_full', show_default=True) -@click.option('--kimg', help='Total training duration', metavar='KIMG', type=click.IntRange(min=1), default=25000, show_default=True) -@click.option('--tick', help='How often to print progress', metavar='KIMG', type=click.IntRange(min=1), default=4, show_default=True) -@click.option('--snap', help='How often to save snapshots', metavar='TICKS', type=click.IntRange(min=1), default=50, show_default=True) -@click.option('--seed', help='Random seed', metavar='INT', type=click.IntRange(min=0), default=0, show_default=True) -@click.option('--fp32', help='Disable mixed-precision', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--nobench', help='Disable cuDNN benchmarking', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--workers', help='DataLoader worker processes', metavar='INT', type=click.IntRange(min=1), default=3, show_default=True) -@click.option('-n','--dry-run', help='Print training options and exit', is_flag=True) - -def main(**kwargs): - """Train a GAN using the techniques described in the paper - "Alias-Free Generative Adversarial Networks". - - Examples: - - \b - # Train StyleGAN3-T for AFHQv2 using 8 GPUs. - python train.py --outdir=~/training-runs --cfg=stylegan3-t --data=~/datasets/afhqv2-512x512.zip \\ - --gpus=8 --batch=32 --gamma=8.2 --mirror=1 - - \b - # Fine-tune StyleGAN3-R for MetFaces-U using 1 GPU, starting from the pre-trained FFHQ-U pickle. 
- python train.py --outdir=~/training-runs --cfg=stylegan3-r --data=~/datasets/metfacesu-1024x1024.zip \\ - --gpus=8 --batch=32 --gamma=6.6 --mirror=1 --kimg=5000 --snap=5 \\ - --resume=https://api.ngc.nvidia.com/v2/models/nvidia/research/stylegan3/versions/1/files/stylegan3-r-ffhqu-1024x1024.pkl - - \b - # Train StyleGAN2 for FFHQ at 1024x1024 resolution using 8 GPUs. - python train.py --outdir=~/training-runs --cfg=stylegan2 --data=~/datasets/ffhq-1024x1024.zip \\ - --gpus=8 --batch=32 --gamma=10 --mirror=1 --aug=noaug - """ - - # Initialize config. - opts = dnnlib.EasyDict(kwargs) # Command line arguments. - c = dnnlib.EasyDict() # Main config dict. - c.G_kwargs = dnnlib.EasyDict(class_name=None, z_dim=512, w_dim=512, mapping_kwargs=dnnlib.EasyDict()) - c.D_kwargs = dnnlib.EasyDict(class_name='training.networks_stylegan2.Discriminator', block_kwargs=dnnlib.EasyDict(), mapping_kwargs=dnnlib.EasyDict(), epilogue_kwargs=dnnlib.EasyDict()) - c.G_opt_kwargs = dnnlib.EasyDict(class_name='torch.optim.Adam', betas=[0,0.99], eps=1e-8) - c.D_opt_kwargs = dnnlib.EasyDict(class_name='torch.optim.Adam', betas=[0,0.99], eps=1e-8) - c.loss_kwargs = dnnlib.EasyDict(class_name='training.loss.StyleGAN2Loss') - c.data_loader_kwargs = dnnlib.EasyDict(pin_memory=True, prefetch_factor=2) - - # Training set. - c.training_set_kwargs, dataset_name = init_dataset_kwargs(data=opts.data) - if opts.cond and not c.training_set_kwargs.use_labels: - raise click.ClickException('--cond=True requires labels specified in dataset.json') - c.training_set_kwargs.use_labels = opts.cond - c.training_set_kwargs.xflip = opts.mirror - - # Hyperparameters & settings. 
- c.num_gpus = opts.gpus - c.batch_size = opts.batch - c.batch_gpu = opts.batch_gpu or opts.batch // opts.gpus - c.G_kwargs.channel_base = c.D_kwargs.channel_base = opts.cbase - c.G_kwargs.channel_max = c.D_kwargs.channel_max = opts.cmax - c.G_kwargs.mapping_kwargs.num_layers = (8 if opts.cfg == 'stylegan2' else 2) if opts.map_depth is None else opts.map_depth - c.D_kwargs.block_kwargs.freeze_layers = opts.freezed - c.D_kwargs.epilogue_kwargs.mbstd_group_size = opts.mbstd_group - c.loss_kwargs.r1_gamma = opts.gamma - c.G_opt_kwargs.lr = (0.002 if opts.cfg == 'stylegan2' else 0.0025) if opts.glr is None else opts.glr - c.D_opt_kwargs.lr = opts.dlr - c.metrics = opts.metrics - c.total_kimg = opts.kimg - c.kimg_per_tick = opts.tick - c.image_snapshot_ticks = c.network_snapshot_ticks = opts.snap - c.random_seed = c.training_set_kwargs.random_seed = opts.seed - c.data_loader_kwargs.num_workers = opts.workers - - # Sanity checks. - if c.batch_size % c.num_gpus != 0: - raise click.ClickException('--batch must be a multiple of --gpus') - if c.batch_size % (c.num_gpus * c.batch_gpu) != 0: - raise click.ClickException('--batch must be a multiple of --gpus times --batch-gpu') - if c.batch_gpu < c.D_kwargs.epilogue_kwargs.mbstd_group_size: - raise click.ClickException('--batch-gpu cannot be smaller than --mbstd') - if any(not metric_main.is_valid_metric(metric) for metric in c.metrics): - raise click.ClickException('\n'.join(['--metrics can only contain the following values:'] + metric_main.list_valid_metrics())) - - # Base configuration. - c.ema_kimg = c.batch_size * 10 / 32 - if opts.cfg == 'stylegan2': - c.G_kwargs.class_name = 'training.networks_stylegan2.Generator' - c.loss_kwargs.style_mixing_prob = 0.9 # Enable style mixing regularization. - c.loss_kwargs.pl_weight = 2 # Enable path length regularization. - c.G_reg_interval = 4 # Enable lazy regularization for G. 
- c.G_kwargs.fused_modconv_default = 'inference_only' # Speed up training by using regular convolutions instead of grouped convolutions. - c.loss_kwargs.pl_no_weight_grad = True # Speed up path length regularization by skipping gradient computation wrt. conv2d weights. - else: - c.G_kwargs.class_name = 'training.networks_stylegan3.Generator' - c.G_kwargs.magnitude_ema_beta = 0.5 ** (c.batch_size / (20 * 1e3)) - if opts.cfg == 'stylegan3-r': - c.G_kwargs.conv_kernel = 1 # Use 1x1 convolutions. - c.G_kwargs.channel_base *= 2 # Double the number of feature maps. - c.G_kwargs.channel_max *= 2 - c.G_kwargs.use_radial_filters = True # Use radially symmetric downsampling filters. - c.loss_kwargs.blur_init_sigma = 10 # Blur the images seen by the discriminator. - c.loss_kwargs.blur_fade_kimg = c.batch_size * 200 / 32 # Fade out the blur during the first N kimg. - - # Augmentation. - if opts.aug != 'noaug': - c.augment_kwargs = dnnlib.EasyDict(class_name='training.augment.AugmentPipe', xflip=1, rotate90=1, xint=1, scale=1, rotate=1, aniso=1, xfrac=1, brightness=1, contrast=1, lumaflip=1, hue=1, saturation=1) - if opts.aug == 'ada': - c.ada_target = opts.target - if opts.aug == 'fixed': - c.augment_p = opts.p - - # Resume. - if opts.resume is not None: - c.resume_pkl = opts.resume - c.ada_kimg = 100 # Make ADA react faster at the beginning. - c.ema_rampup = None # Disable EMA rampup. - c.loss_kwargs.blur_init_sigma = 0 # Disable blur rampup. - - # Performance-related toggles. - if opts.fp32: - c.G_kwargs.num_fp16_res = c.D_kwargs.num_fp16_res = 0 - c.G_kwargs.conv_clamp = c.D_kwargs.conv_clamp = None - if opts.nobench: - c.cudnn_benchmark = False - - # Description string. - desc = f'{opts.cfg:s}-{dataset_name:s}-gpus{c.num_gpus:d}-batch{c.batch_size:d}-gamma{c.loss_kwargs.r1_gamma:g}' - if opts.desc is not None: - desc += f'-{opts.desc}' - - # Launch. 
- launch_training(c=c, desc=desc, outdir=opts.outdir, dry_run=opts.dry_run) - -#---------------------------------------------------------------------------- - -if __name__ == "__main__": - main() # pylint: disable=no-value-for-parameter - -#---------------------------------------------------------------------------- diff --git a/spaces/alamin655/Personas/conversant/prompts/chat_prompt.py b/spaces/alamin655/Personas/conversant/prompts/chat_prompt.py deleted file mode 100644 index 166f0b236511ad8d3980f78d104a9965dabc7f84..0000000000000000000000000000000000000000 --- a/spaces/alamin655/Personas/conversant/prompts/chat_prompt.py +++ /dev/null @@ -1,218 +0,0 @@ -# Copyright (c) 2022 Cohere Inc. and its affiliates. -# -# Licensed under the MIT License (the "License"); -# you may not use this file except in compliance with the License. -# -# You may obtain a copy of the License in the LICENSE file at the top -# level of this repository. - -import logging -from dataclasses import field -from typing import List, NewType - -from pydantic.dataclasses import dataclass - -from conversant.chatbot import Interaction -from conversant.prompts.prompt import Prompt - -Conversation = NewType("Conversation", List[Interaction]) - - -@dataclass -class ChatPrompt(Prompt): - """A chat prompt given to a Chatbot. - - The examples in a `ChatPrompt` are a list of `Conversation`s, themselves a list of - `Interaction`s. This is one level of nesting further as compared to those in - `Prompt`, which are a list of `Interaction`s. - - Required keys: - user: An entity speaking to the bot. - bot: The Chatbot itself. - - Constants: - REQUIRED_KEYS (List[str]): The list of required keys for the chat prompt. - (default: `["user", "bot"]`) - MIN_PREAMBLE_LENGTH (int): The minimum length of the preamble. (default: `10`) - MIN_NUM_EXAMPLES (int): The minimum number of examples that should be passed in. 
- (default: `0`) - """ - - examples: List[Conversation] - - REQUIRED_KEYS: List[str] = field(default_factory=lambda: ["user", "bot"]) - MIN_PREAMBLE_LENGTH: int = 10 - MIN_NUM_EXAMPLES: int = 0 - - def __post_init__(self) -> None: - """Validators for the chat prompt. - - Validates that the prompt follows the requirements of the validators listed - below. Minimally, the ChatPrompt needs to follow the requirements of its parent - class. - """ - super()._validate_preamble() - super()._validate_example_separator() - super()._validate_headers() - self._validate_examples() - self._validate_dialogue() - - @property - def user_name(self): - """ - Returns: - str: The name of the user that interacts with the chatbot who uses this - ChatPrompt. Typically this should be set to `'User'`. - """ - return self.headers["user"] - - @property - def bot_name(self): - """ - Returns: - str: The name of the chatbot who uses this ChatPrompt. - """ - return self.headers["bot"] - - def create_interaction_string(self, *args, **kwargs) -> str: - """Creates a string representation of an interaction. - - Interactions will look like the following: - - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - - Note the colon and space separating the speaker name from the respective - utterance. - - Args: - args: Positional arguments for the new interaction. - kwargs: Keyword arguments for the new interaction. - - Returns: - str: String representation of an interaction. - """ - interaction = ( - self.create_interaction(*args, **kwargs) if len(args) > 0 else kwargs - ) - return "".join( - f"{self.headers[key]}: {interaction[key]}\n" for key in interaction.keys() - ) - - def create_conversation_string(self, conversation: Conversation) -> str: - """Creates a string representation of a conversation. 
- - Conversations will look like the following: - - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - - Args: - conversation (Conversation): List of interactions. - """ - return "".join( - self.create_interaction_string(**interaction) - for interaction in conversation - ) - - def to_string(self) -> str: - """Creates a string representation of the conversation prompt. - - The string representation is assembled from the preamble and examples. - Each example is created from a `create_conversation_string` method and is - demarcated by an `example_separator`. - - Examples will look like the following: - - {example_separator} - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - {example_separator} - {user_name}: {utterance}\n - {bot_name}: {utterance}\n - - Returns: - str: String representation of the conversation prompt. - """ - lines = [f"{self.preamble}\n"] - lines += self.example_separator + f"{self.example_separator}".join( - self.create_conversation_string(example) for example in self.examples - ) - return "".join(lines).strip() - - def _validate_examples(self) -> None: - """Validates that the `examples` meet the following requirements: - - - All fields are used in every example of `examples`. - - At least `MIN_NUM_EXAMPLES` examples are given. - - Raises: - ValueError: If any of the above requirements is not met. - """ - # All fields are used in every interaction in every example of `examples`. - for example in self.examples: - for interaction in example: - if any(key not in interaction for key in self.REQUIRED_KEYS): - raise ValueError( - "Missing required key.\nInteraction's keys: " - f"{interaction.keys()}\nRequired: {self.REQUIRED_KEYS}" - ) - # At least `MIN_NUM_EXAMPLES` examples are given. 
- if len(self.examples) < self.MIN_NUM_EXAMPLES: - raise ValueError( - f"At least {self.MIN_NUM_EXAMPLES} example(s) must be given for " - f"{self.__class__.__name__}" - ) - - def _validate_dialogue(self) -> None: - """Validates that the examples conform to a 2-person dialogue. - - - There should only be 2 speakers in the examples, and each speaker's utterance - should not be prefixed with their name. - - Raises: - ValueError: If the above requirement is not met. - """ - for example in self.examples: - # Only 2 speakers should be in each conversation interaction - if not all([len(interaction) == 2 for interaction in example]): - raise ValueError( - "Conversation interactions must be pairs of utterances." - ) - - # Only check the examples for name-prefixed utterances if there is at least - # one interaction - if example: - user_turns = [interaction["user"] for interaction in example] - bot_turns = [interaction["bot"] for interaction in example] - all_turns = user_turns + bot_turns - - colon_prefixed = all(":" in turn for turn in all_turns) - hyphen_prefixed = all("-" in turn for turn in all_turns) - - if colon_prefixed or hyphen_prefixed: - # This might false-positive, so we only log a warning - logging.warning( - "Did you mistakenly prefix the example dialogue turns with " - "user/bot names?" - ) - - user_prefixed = all( - turn.lstrip().startswith(self.user_name) for turn in user_turns - ) - - bot_prefixed = all( - turn.lstrip().startswith(self.bot_name) for turn in bot_turns - ) - if user_prefixed and bot_prefixed: - # It's hard to think of any genuine case where all utterances begin - # with self-names. - raise ValueError( - "Conversation interactions should not be prefixed with user/bot " - "names!" 
- ) diff --git a/spaces/alexray/btc_predictor/venv/lib/python3.10/site-packages/pip/_vendor/chardet/escsm.py b/spaces/alexray/btc_predictor/venv/lib/python3.10/site-packages/pip/_vendor/chardet/escsm.py deleted file mode 100644 index 0069523a04969dd920ea4b43a497162157729174..0000000000000000000000000000000000000000 --- a/spaces/alexray/btc_predictor/venv/lib/python3.10/site-packages/pip/_vendor/chardet/escsm.py +++ /dev/null @@ -1,246 +0,0 @@ -######################## BEGIN LICENSE BLOCK ######################## -# The Original Code is mozilla.org code. -# -# The Initial Developer of the Original Code is -# Netscape Communications Corporation. -# Portions created by the Initial Developer are Copyright (C) 1998 -# the Initial Developer. All Rights Reserved. -# -# Contributor(s): -# Mark Pilgrim - port to Python -# -# This library is free software; you can redistribute it and/or -# modify it under the terms of the GNU Lesser General Public -# License as published by the Free Software Foundation; either -# version 2.1 of the License, or (at your option) any later version. -# -# This library is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU -# Lesser General Public License for more details. 
-# -# You should have received a copy of the GNU Lesser General Public -# License along with this library; if not, write to the Free Software -# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA -# 02110-1301 USA -######################### END LICENSE BLOCK ######################### - -from .enums import MachineState - -HZ_CLS = ( -1,0,0,0,0,0,0,0, # 00 - 07 -0,0,0,0,0,0,0,0, # 08 - 0f -0,0,0,0,0,0,0,0, # 10 - 17 -0,0,0,1,0,0,0,0, # 18 - 1f -0,0,0,0,0,0,0,0, # 20 - 27 -0,0,0,0,0,0,0,0, # 28 - 2f -0,0,0,0,0,0,0,0, # 30 - 37 -0,0,0,0,0,0,0,0, # 38 - 3f -0,0,0,0,0,0,0,0, # 40 - 47 -0,0,0,0,0,0,0,0, # 48 - 4f -0,0,0,0,0,0,0,0, # 50 - 57 -0,0,0,0,0,0,0,0, # 58 - 5f -0,0,0,0,0,0,0,0, # 60 - 67 -0,0,0,0,0,0,0,0, # 68 - 6f -0,0,0,0,0,0,0,0, # 70 - 77 -0,0,0,4,0,5,2,0, # 78 - 7f -1,1,1,1,1,1,1,1, # 80 - 87 -1,1,1,1,1,1,1,1, # 88 - 8f -1,1,1,1,1,1,1,1, # 90 - 97 -1,1,1,1,1,1,1,1, # 98 - 9f -1,1,1,1,1,1,1,1, # a0 - a7 -1,1,1,1,1,1,1,1, # a8 - af -1,1,1,1,1,1,1,1, # b0 - b7 -1,1,1,1,1,1,1,1, # b8 - bf -1,1,1,1,1,1,1,1, # c0 - c7 -1,1,1,1,1,1,1,1, # c8 - cf -1,1,1,1,1,1,1,1, # d0 - d7 -1,1,1,1,1,1,1,1, # d8 - df -1,1,1,1,1,1,1,1, # e0 - e7 -1,1,1,1,1,1,1,1, # e8 - ef -1,1,1,1,1,1,1,1, # f0 - f7 -1,1,1,1,1,1,1,1, # f8 - ff -) - -HZ_ST = ( -MachineState.START,MachineState.ERROR, 3,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,# 00-07 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 08-0f -MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START, 4,MachineState.ERROR,# 10-17 - 5,MachineState.ERROR, 6,MachineState.ERROR, 5, 5, 4,MachineState.ERROR,# 18-1f - 4,MachineState.ERROR, 4, 4, 4,MachineState.ERROR, 4,MachineState.ERROR,# 20-27 - 
4,MachineState.ITS_ME,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 28-2f -) - -HZ_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0) - -HZ_SM_MODEL = {'class_table': HZ_CLS, - 'class_factor': 6, - 'state_table': HZ_ST, - 'char_len_table': HZ_CHAR_LEN_TABLE, - 'name': "HZ-GB-2312", - 'language': 'Chinese'} - -ISO2022CN_CLS = ( -2,0,0,0,0,0,0,0, # 00 - 07 -0,0,0,0,0,0,0,0, # 08 - 0f -0,0,0,0,0,0,0,0, # 10 - 17 -0,0,0,1,0,0,0,0, # 18 - 1f -0,0,0,0,0,0,0,0, # 20 - 27 -0,3,0,0,0,0,0,0, # 28 - 2f -0,0,0,0,0,0,0,0, # 30 - 37 -0,0,0,0,0,0,0,0, # 38 - 3f -0,0,0,4,0,0,0,0, # 40 - 47 -0,0,0,0,0,0,0,0, # 48 - 4f -0,0,0,0,0,0,0,0, # 50 - 57 -0,0,0,0,0,0,0,0, # 58 - 5f -0,0,0,0,0,0,0,0, # 60 - 67 -0,0,0,0,0,0,0,0, # 68 - 6f -0,0,0,0,0,0,0,0, # 70 - 77 -0,0,0,0,0,0,0,0, # 78 - 7f -2,2,2,2,2,2,2,2, # 80 - 87 -2,2,2,2,2,2,2,2, # 88 - 8f -2,2,2,2,2,2,2,2, # 90 - 97 -2,2,2,2,2,2,2,2, # 98 - 9f -2,2,2,2,2,2,2,2, # a0 - a7 -2,2,2,2,2,2,2,2, # a8 - af -2,2,2,2,2,2,2,2, # b0 - b7 -2,2,2,2,2,2,2,2, # b8 - bf -2,2,2,2,2,2,2,2, # c0 - c7 -2,2,2,2,2,2,2,2, # c8 - cf -2,2,2,2,2,2,2,2, # d0 - d7 -2,2,2,2,2,2,2,2, # d8 - df -2,2,2,2,2,2,2,2, # e0 - e7 -2,2,2,2,2,2,2,2, # e8 - ef -2,2,2,2,2,2,2,2, # f0 - f7 -2,2,2,2,2,2,2,2, # f8 - ff -) - -ISO2022CN_ST = ( -MachineState.START, 3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 00-07 -MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 08-0f -MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 10-17 -MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, 4,MachineState.ERROR,# 18-1f 
-MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 20-27 - 5, 6,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 28-2f -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 30-37 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.START,# 38-3f -) - -ISO2022CN_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0, 0, 0, 0) - -ISO2022CN_SM_MODEL = {'class_table': ISO2022CN_CLS, - 'class_factor': 9, - 'state_table': ISO2022CN_ST, - 'char_len_table': ISO2022CN_CHAR_LEN_TABLE, - 'name': "ISO-2022-CN", - 'language': 'Chinese'} - -ISO2022JP_CLS = ( -2,0,0,0,0,0,0,0, # 00 - 07 -0,0,0,0,0,0,2,2, # 08 - 0f -0,0,0,0,0,0,0,0, # 10 - 17 -0,0,0,1,0,0,0,0, # 18 - 1f -0,0,0,0,7,0,0,0, # 20 - 27 -3,0,0,0,0,0,0,0, # 28 - 2f -0,0,0,0,0,0,0,0, # 30 - 37 -0,0,0,0,0,0,0,0, # 38 - 3f -6,0,4,0,8,0,0,0, # 40 - 47 -0,9,5,0,0,0,0,0, # 48 - 4f -0,0,0,0,0,0,0,0, # 50 - 57 -0,0,0,0,0,0,0,0, # 58 - 5f -0,0,0,0,0,0,0,0, # 60 - 67 -0,0,0,0,0,0,0,0, # 68 - 6f -0,0,0,0,0,0,0,0, # 70 - 77 -0,0,0,0,0,0,0,0, # 78 - 7f -2,2,2,2,2,2,2,2, # 80 - 87 -2,2,2,2,2,2,2,2, # 88 - 8f -2,2,2,2,2,2,2,2, # 90 - 97 -2,2,2,2,2,2,2,2, # 98 - 9f -2,2,2,2,2,2,2,2, # a0 - a7 -2,2,2,2,2,2,2,2, # a8 - af -2,2,2,2,2,2,2,2, # b0 - b7 -2,2,2,2,2,2,2,2, # b8 - bf -2,2,2,2,2,2,2,2, # c0 - c7 -2,2,2,2,2,2,2,2, # c8 - cf -2,2,2,2,2,2,2,2, # d0 - d7 -2,2,2,2,2,2,2,2, # d8 - df -2,2,2,2,2,2,2,2, # e0 - e7 -2,2,2,2,2,2,2,2, # e8 - ef -2,2,2,2,2,2,2,2, # f0 - f7 -2,2,2,2,2,2,2,2, # f8 - ff -) - -ISO2022JP_ST = ( -MachineState.START, 3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 00-07 
-MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 08-0f -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 10-17 -MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,# 18-1f -MachineState.ERROR, 5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, 4,MachineState.ERROR,MachineState.ERROR,# 20-27 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, 6,MachineState.ITS_ME,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,# 28-2f -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,# 30-37 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 38-3f -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.START,MachineState.START,# 40-47 -) - -ISO2022JP_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0) - -ISO2022JP_SM_MODEL = {'class_table': ISO2022JP_CLS, - 'class_factor': 10, - 'state_table': ISO2022JP_ST, - 'char_len_table': ISO2022JP_CHAR_LEN_TABLE, - 'name': "ISO-2022-JP", - 'language': 'Japanese'} - -ISO2022KR_CLS = ( -2,0,0,0,0,0,0,0, # 00 - 07 -0,0,0,0,0,0,0,0, # 08 - 0f -0,0,0,0,0,0,0,0, # 10 - 17 -0,0,0,1,0,0,0,0, # 18 - 1f -0,0,0,0,3,0,0,0, # 20 - 27 -0,4,0,0,0,0,0,0, # 28 - 2f -0,0,0,0,0,0,0,0, # 30 - 37 -0,0,0,0,0,0,0,0, # 38 - 3f -0,0,0,5,0,0,0,0, # 40 - 47 -0,0,0,0,0,0,0,0, # 48 - 4f -0,0,0,0,0,0,0,0, # 50 - 57 -0,0,0,0,0,0,0,0, # 58 - 5f -0,0,0,0,0,0,0,0, # 60 - 67 -0,0,0,0,0,0,0,0, # 68 - 6f -0,0,0,0,0,0,0,0, # 70 - 77 -0,0,0,0,0,0,0,0, # 78 - 7f 
-2,2,2,2,2,2,2,2, # 80 - 87 -2,2,2,2,2,2,2,2, # 88 - 8f -2,2,2,2,2,2,2,2, # 90 - 97 -2,2,2,2,2,2,2,2, # 98 - 9f -2,2,2,2,2,2,2,2, # a0 - a7 -2,2,2,2,2,2,2,2, # a8 - af -2,2,2,2,2,2,2,2, # b0 - b7 -2,2,2,2,2,2,2,2, # b8 - bf -2,2,2,2,2,2,2,2, # c0 - c7 -2,2,2,2,2,2,2,2, # c8 - cf -2,2,2,2,2,2,2,2, # d0 - d7 -2,2,2,2,2,2,2,2, # d8 - df -2,2,2,2,2,2,2,2, # e0 - e7 -2,2,2,2,2,2,2,2, # e8 - ef -2,2,2,2,2,2,2,2, # f0 - f7 -2,2,2,2,2,2,2,2, # f8 - ff -) - -ISO2022KR_ST = ( -MachineState.START, 3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,# 00-07 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 08-0f -MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, 4,MachineState.ERROR,MachineState.ERROR,# 10-17 -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, 5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 18-1f -MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 20-27 -) - -ISO2022KR_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0) - -ISO2022KR_SM_MODEL = {'class_table': ISO2022KR_CLS, - 'class_factor': 6, - 'state_table': ISO2022KR_ST, - 'char_len_table': ISO2022KR_CHAR_LEN_TABLE, - 'name': "ISO-2022-KR", - 'language': 'Korean'} - - diff --git a/spaces/aliceoq/vozes-da-loirinha/lib/infer_pack/transforms.py b/spaces/aliceoq/vozes-da-loirinha/lib/infer_pack/transforms.py deleted file mode 100644 index a11f799e023864ff7082c1f49c0cc18351a13b47..0000000000000000000000000000000000000000 --- a/spaces/aliceoq/vozes-da-loirinha/lib/infer_pack/transforms.py +++ /dev/null @@ -1,209 +0,0 @@ -import torch -from torch.nn import functional as F - -import numpy as np - - -DEFAULT_MIN_BIN_WIDTH = 1e-3 -DEFAULT_MIN_BIN_HEIGHT = 1e-3 
-DEFAULT_MIN_DERIVATIVE = 1e-3 - - -def piecewise_rational_quadratic_transform( - inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails=None, - tail_bound=1.0, - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE, -): - if tails is None: - spline_fn = rational_quadratic_spline - spline_kwargs = {} - else: - spline_fn = unconstrained_rational_quadratic_spline - spline_kwargs = {"tails": tails, "tail_bound": tail_bound} - - outputs, logabsdet = spline_fn( - inputs=inputs, - unnormalized_widths=unnormalized_widths, - unnormalized_heights=unnormalized_heights, - unnormalized_derivatives=unnormalized_derivatives, - inverse=inverse, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative, - **spline_kwargs - ) - return outputs, logabsdet - - -def searchsorted(bin_locations, inputs, eps=1e-6): - bin_locations[..., -1] += eps - return torch.sum(inputs[..., None] >= bin_locations, dim=-1) - 1 - - -def unconstrained_rational_quadratic_spline( - inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - tails="linear", - tail_bound=1.0, - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE, -): - inside_interval_mask = (inputs >= -tail_bound) & (inputs <= tail_bound) - outside_interval_mask = ~inside_interval_mask - - outputs = torch.zeros_like(inputs) - logabsdet = torch.zeros_like(inputs) - - if tails == "linear": - unnormalized_derivatives = F.pad(unnormalized_derivatives, pad=(1, 1)) - constant = np.log(np.exp(1 - min_derivative) - 1) - unnormalized_derivatives[..., 0] = constant - unnormalized_derivatives[..., -1] = constant - - outputs[outside_interval_mask] = inputs[outside_interval_mask] - logabsdet[outside_interval_mask] = 0 - else: - raise RuntimeError("{} tails are not implemented.".format(tails)) - 
- ( - outputs[inside_interval_mask], - logabsdet[inside_interval_mask], - ) = rational_quadratic_spline( - inputs=inputs[inside_interval_mask], - unnormalized_widths=unnormalized_widths[inside_interval_mask, :], - unnormalized_heights=unnormalized_heights[inside_interval_mask, :], - unnormalized_derivatives=unnormalized_derivatives[inside_interval_mask, :], - inverse=inverse, - left=-tail_bound, - right=tail_bound, - bottom=-tail_bound, - top=tail_bound, - min_bin_width=min_bin_width, - min_bin_height=min_bin_height, - min_derivative=min_derivative, - ) - - return outputs, logabsdet - - -def rational_quadratic_spline( - inputs, - unnormalized_widths, - unnormalized_heights, - unnormalized_derivatives, - inverse=False, - left=0.0, - right=1.0, - bottom=0.0, - top=1.0, - min_bin_width=DEFAULT_MIN_BIN_WIDTH, - min_bin_height=DEFAULT_MIN_BIN_HEIGHT, - min_derivative=DEFAULT_MIN_DERIVATIVE, -): - if torch.min(inputs) < left or torch.max(inputs) > right: - raise ValueError("Input to a transform is not within its domain") - - num_bins = unnormalized_widths.shape[-1] - - if min_bin_width * num_bins > 1.0: - raise ValueError("Minimal bin width too large for the number of bins") - if min_bin_height * num_bins > 1.0: - raise ValueError("Minimal bin height too large for the number of bins") - - widths = F.softmax(unnormalized_widths, dim=-1) - widths = min_bin_width + (1 - min_bin_width * num_bins) * widths - cumwidths = torch.cumsum(widths, dim=-1) - cumwidths = F.pad(cumwidths, pad=(1, 0), mode="constant", value=0.0) - cumwidths = (right - left) * cumwidths + left - cumwidths[..., 0] = left - cumwidths[..., -1] = right - widths = cumwidths[..., 1:] - cumwidths[..., :-1] - - derivatives = min_derivative + F.softplus(unnormalized_derivatives) - - heights = F.softmax(unnormalized_heights, dim=-1) - heights = min_bin_height + (1 - min_bin_height * num_bins) * heights - cumheights = torch.cumsum(heights, dim=-1) - cumheights = F.pad(cumheights, pad=(1, 0), mode="constant", 
value=0.0) - cumheights = (top - bottom) * cumheights + bottom - cumheights[..., 0] = bottom - cumheights[..., -1] = top - heights = cumheights[..., 1:] - cumheights[..., :-1] - - if inverse: - bin_idx = searchsorted(cumheights, inputs)[..., None] - else: - bin_idx = searchsorted(cumwidths, inputs)[..., None] - - input_cumwidths = cumwidths.gather(-1, bin_idx)[..., 0] - input_bin_widths = widths.gather(-1, bin_idx)[..., 0] - - input_cumheights = cumheights.gather(-1, bin_idx)[..., 0] - delta = heights / widths - input_delta = delta.gather(-1, bin_idx)[..., 0] - - input_derivatives = derivatives.gather(-1, bin_idx)[..., 0] - input_derivatives_plus_one = derivatives[..., 1:].gather(-1, bin_idx)[..., 0] - - input_heights = heights.gather(-1, bin_idx)[..., 0] - - if inverse: - a = (inputs - input_cumheights) * ( - input_derivatives + input_derivatives_plus_one - 2 * input_delta - ) + input_heights * (input_delta - input_derivatives) - b = input_heights * input_derivatives - (inputs - input_cumheights) * ( - input_derivatives + input_derivatives_plus_one - 2 * input_delta - ) - c = -input_delta * (inputs - input_cumheights) - - discriminant = b.pow(2) - 4 * a * c - assert (discriminant >= 0).all() - - root = (2 * c) / (-b - torch.sqrt(discriminant)) - outputs = root * input_bin_widths + input_cumwidths - - theta_one_minus_theta = root * (1 - root) - denominator = input_delta + ( - (input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta - ) - derivative_numerator = input_delta.pow(2) * ( - input_derivatives_plus_one * root.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - root).pow(2) - ) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, -logabsdet - else: - theta = (inputs - input_cumwidths) / input_bin_widths - theta_one_minus_theta = theta * (1 - theta) - - numerator = input_heights * ( - input_delta * theta.pow(2) + input_derivatives * theta_one_minus_theta 
- ) - denominator = input_delta + ( - (input_derivatives + input_derivatives_plus_one - 2 * input_delta) - * theta_one_minus_theta - ) - outputs = input_cumheights + numerator / denominator - - derivative_numerator = input_delta.pow(2) * ( - input_derivatives_plus_one * theta.pow(2) - + 2 * input_delta * theta_one_minus_theta - + input_derivatives * (1 - theta).pow(2) - ) - logabsdet = torch.log(derivative_numerator) - 2 * torch.log(denominator) - - return outputs, logabsdet diff --git a/spaces/allknowingroger/Image-Models-Test24/README.md b/spaces/allknowingroger/Image-Models-Test24/README.md deleted file mode 100644 index a0f9e935731f82fb3581204b62cc3678a2850067..0000000000000000000000000000000000000000 --- a/spaces/allknowingroger/Image-Models-Test24/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: More Image Models -emoji: 😻 -colorFrom: red -colorTo: gray -sdk: gradio -sdk_version: 3.23.0 -app_file: app.py -pinned: true -duplicated_from: allknowingroger/Image-Models-Test23 ---- - - \ No newline at end of file diff --git a/spaces/allknowingroger/Image-Models-Test7/README.md b/spaces/allknowingroger/Image-Models-Test7/README.md deleted file mode 100644 index 02b15e9091015a3624f1390a413a590b8496e480..0000000000000000000000000000000000000000 --- a/spaces/allknowingroger/Image-Models-Test7/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: More Image Models -emoji: 😻 -colorFrom: red -colorTo: gray -sdk: gradio -sdk_version: 3.23.0 -app_file: app.py -pinned: true -duplicated_from: allknowingroger/Image-Models-Test6 ---- - - \ No newline at end of file diff --git a/spaces/almn-uhc/Streamlit-Data-Synthesis-Example/app.py b/spaces/almn-uhc/Streamlit-Data-Synthesis-Example/app.py deleted file mode 100644 index f0e5b8545837d35cf97e3b58400936e46ef70c87..0000000000000000000000000000000000000000 --- a/spaces/almn-uhc/Streamlit-Data-Synthesis-Example/app.py +++ /dev/null @@ -1,56 +0,0 @@ -import streamlit as st -import pandas as pd - - -def generate_hospital_data(): - 
# Generate hospital data - hospitals = { - "city": ["New York", "Los Angeles", "Chicago", "Houston", "Phoenix"], - "state": ["NY", "CA", "IL", "TX", "AZ"], - "bed_count": [1200, 1500, 1100, 1300, 1400], - } - df = pd.DataFrame(hospitals) - return df - - -def generate_state_data(): - # Generate state data - states = { - "state": ["NY", "CA", "IL", "TX", "AZ"], - "population": [20000000, 40000000, 13000000, 29000000, 7000000], - "square_miles": [54556, 163696, 57914, 268596, 113990], - } - df = pd.DataFrame(states) - return df - - -def merge_datasets(hospitals_df, states_df): - # Merge hospital and state data - merged_df = pd.merge(hospitals_df, states_df, on="state") - return merged_df - - -def calculate_beds_per_capita(merged_df): - # Calculate beds per capita - merged_df["beds_per_capita"] = merged_df["bed_count"] / merged_df["population"] - return merged_df - - -def main(): - # Generate data - hospitals_df = generate_hospital_data() - states_df = generate_state_data() - - # Merge datasets - merged_df = merge_datasets(hospitals_df, states_df) - - # Calculate beds per capita - merged_df = calculate_beds_per_capita(merged_df) - - # Show merged and calculated data - st.write(merged_df) - - -if __name__ == "__main__": - main() - diff --git a/spaces/alsrbdni/speaker-diarization/app.py b/spaces/alsrbdni/speaker-diarization/app.py deleted file mode 100644 index 67aba115ee74b854cf74ec329c9bdeb74c170d2a..0000000000000000000000000000000000000000 --- a/spaces/alsrbdni/speaker-diarization/app.py +++ /dev/null @@ -1,414 +0,0 @@ -import whisper -import datetime -import subprocess -import gradio as gr -from pathlib import Path -import pandas as pd -import re -import time -import os -import numpy as np -from sklearn.cluster import AgglomerativeClustering - -from pytube import YouTube -import torch -import pyannote.audio -from pyannote.audio.pipelines.speaker_verification import PretrainedSpeakerEmbedding -from pyannote.audio import Audio -from pyannote.core import Segment - -from 
gpuinfo import GPUInfo - -import wave -import contextlib -from transformers import pipeline -import psutil - -whisper_models = ["base", "small", "medium", "large"] -source_languages = { - "en": "English", - "zh": "Chinese", - "de": "German", - "es": "Spanish", - "ru": "Russian", - "ko": "Korean", - "fr": "French", - "ja": "Japanese", - "pt": "Portuguese", - "tr": "Turkish", - "pl": "Polish", - "ca": "Catalan", - "nl": "Dutch", - "ar": "Arabic", - "sv": "Swedish", - "it": "Italian", - "id": "Indonesian", - "hi": "Hindi", - "fi": "Finnish", - "vi": "Vietnamese", - "he": "Hebrew", - "uk": "Ukrainian", - "el": "Greek", - "ms": "Malay", - "cs": "Czech", - "ro": "Romanian", - "da": "Danish", - "hu": "Hungarian", - "ta": "Tamil", - "no": "Norwegian", - "th": "Thai", - "ur": "Urdu", - "hr": "Croatian", - "bg": "Bulgarian", - "lt": "Lithuanian", - "la": "Latin", - "mi": "Maori", - "ml": "Malayalam", - "cy": "Welsh", - "sk": "Slovak", - "te": "Telugu", - "fa": "Persian", - "lv": "Latvian", - "bn": "Bengali", - "sr": "Serbian", - "az": "Azerbaijani", - "sl": "Slovenian", - "kn": "Kannada", - "et": "Estonian", - "mk": "Macedonian", - "br": "Breton", - "eu": "Basque", - "is": "Icelandic", - "hy": "Armenian", - "ne": "Nepali", - "mn": "Mongolian", - "bs": "Bosnian", - "kk": "Kazakh", - "sq": "Albanian", - "sw": "Swahili", - "gl": "Galician", - "mr": "Marathi", - "pa": "Punjabi", - "si": "Sinhala", - "km": "Khmer", - "sn": "Shona", - "yo": "Yoruba", - "so": "Somali", - "af": "Afrikaans", - "oc": "Occitan", - "ka": "Georgian", - "be": "Belarusian", - "tg": "Tajik", - "sd": "Sindhi", - "gu": "Gujarati", - "am": "Amharic", - "yi": "Yiddish", - "lo": "Lao", - "uz": "Uzbek", - "fo": "Faroese", - "ht": "Haitian creole", - "ps": "Pashto", - "tk": "Turkmen", - "nn": "Nynorsk", - "mt": "Maltese", - "sa": "Sanskrit", - "lb": "Luxembourgish", - "my": "Myanmar", - "bo": "Tibetan", - "tl": "Tagalog", - "mg": "Malagasy", - "as": "Assamese", - "tt": "Tatar", - "haw": "Hawaiian", - "ln": 
"Lingala", - "ha": "Hausa", - "ba": "Bashkir", - "jw": "Javanese", - "su": "Sundanese", -} - -source_language_list = [key[0] for key in source_languages.items()] - -MODEL_NAME = "vumichien/whisper-medium-jp" -lang = "ja" - -device = 0 if torch.cuda.is_available() else "cpu" -pipe = pipeline( - task="automatic-speech-recognition", - model=MODEL_NAME, - chunk_length_s=30, - device=device, -) - -pipe.model.config.forced_decoder_ids = pipe.tokenizer.get_decoder_prompt_ids(language=lang, task="transcribe") - -embedding_model = PretrainedSpeakerEmbedding( - "speechbrain/spkrec-ecapa-voxceleb", - device=torch.device("cuda" if torch.cuda.is_available() else "cpu")) - -def transcribe(microphone, file_upload): - warn_output = "" - if (microphone is not None) and (file_upload is not None): - warn_output = ( - "WARNING: You've uploaded an audio file and used the microphone. " - "The recorded file from the microphone will be used and the uploaded audio will be discarded.\n" - ) - - elif (microphone is None) and (file_upload is None): - return "ERROR: You have to either use the microphone or upload an audio file" - - file = microphone if microphone is not None else file_upload - - text = pipe(file)["text"] - - return warn_output + text - -def _return_yt_html_embed(yt_url): - video_id = yt_url.split("?v=")[-1] - HTML_str = ( - f'
' - "
" - ) - return HTML_str - -def yt_transcribe(yt_url): - yt = YouTube(yt_url) - html_embed_str = _return_yt_html_embed(yt_url) - stream = yt.streams.filter(only_audio=True)[0] - stream.download(filename="audio.mp3") - - text = pipe("audio.mp3")["text"] - - return html_embed_str, text - -def convert_time(secs): - return datetime.timedelta(seconds=round(secs)) - -def get_youtube(video_url): - yt = YouTube(video_url) - abs_video_path = yt.streams.filter(progressive=True, file_extension='mp4').order_by('resolution').desc().first().download() - print("Success download video") - print(abs_video_path) - return abs_video_path - -def speech_to_text(video_file_path, selected_source_lang, whisper_model, num_speakers): - """ - # Transcribe youtube link using OpenAI Whisper - 1. Using Open AI's Whisper model to seperate audio into segments and generate transcripts. - 2. Generating speaker embeddings for each segments. - 3. Applying agglomerative clustering on the embeddings to identify the speaker for each segment. 
- - Speech Recognition is based on models from OpenAI Whisper https://github.com/openai/whisper - Speaker diarization model and pipeline from https://github.com/pyannote/pyannote-audio - """ - - model = whisper.load_model(whisper_model) - time_start = time.time() - if video_file_path is None: - raise ValueError("Error: no video input") - print(video_file_path) - - try: - # Read and convert youtube video - _,file_ending = os.path.splitext(f'{video_file_path}') - print(f'file ending is {file_ending}') - audio_file = video_file_path.replace(file_ending, ".wav") - print("starting conversion to wav") - os.system(f'ffmpeg -i "{video_file_path}" -ar 16000 -ac 1 -c:a pcm_s16le "{audio_file}"') - - # Get duration - with contextlib.closing(wave.open(audio_file,'r')) as f: - frames = f.getnframes() - rate = f.getframerate() - duration = frames / float(rate) - print(f"conversion to wav ready, duration of audio file: {duration}") - - # Transcribe audio - options = dict(language=selected_source_lang, beam_size=5, best_of=5) - transcribe_options = dict(task="transcribe", **options) - result = model.transcribe(audio_file, **transcribe_options) - segments = result["segments"] - print("whisper transcription done") - except Exception as e: - raise RuntimeError("Error converting video to audio") - - try: - # Create embedding - def segment_embedding(segment): - audio = Audio() - start = segment["start"] - # Whisper overshoots the end timestamp in the last segment - end = min(duration, segment["end"]) - clip = Segment(start, end) - waveform, sample_rate = audio.crop(audio_file, clip) - return embedding_model(waveform[None]) - - embeddings = np.zeros(shape=(len(segments), 192)) - for i, segment in enumerate(segments): - embeddings[i] = segment_embedding(segment) - embeddings = np.nan_to_num(embeddings) - print(f'Embedding shape: {embeddings.shape}') - - # Assign speaker label - clustering = AgglomerativeClustering(num_speakers).fit(embeddings) - labels = clustering.labels_ - 
for i in range(len(segments)): - segments[i]["speaker"] = 'SPEAKER ' + str(labels[i] + 1) - - # Make output - objects = { - 'Start' : [], - 'End': [], - 'Speaker': [], - 'Text': [] - } - text = '' - for (i, segment) in enumerate(segments): - if i == 0 or segments[i - 1]["speaker"] != segment["speaker"]: - objects['Start'].append(str(convert_time(segment["start"]))) - objects['Speaker'].append(segment["speaker"]) - if i != 0: - objects['End'].append(str(convert_time(segments[i - 1]["end"]))) - objects['Text'].append(text) - text = '' - text += segment["text"] + ' ' - objects['End'].append(str(convert_time(segments[i - 1]["end"]))) - objects['Text'].append(text) - - time_end = time.time() - time_diff = time_end - time_start - memory = psutil.virtual_memory() - gpu_utilization, gpu_memory = GPUInfo.gpu_usage() - gpu_utilization = gpu_utilization[0] if len(gpu_utilization) > 0 else 0 - gpu_memory = gpu_memory[0] if len(gpu_memory) > 0 else 0 - system_info = f""" - *Memory: {memory.total / (1024 * 1024 * 1024):.2f}GB, used: {memory.percent}%, available: {memory.available / (1024 * 1024 * 1024):.2f}GB.* - *Processing time: {time_diff:.5} seconds.* - *GPU Utilization: {gpu_utilization}%, GPU Memory: {gpu_memory}MiB.* - """ - - return pd.DataFrame(objects), system_info - - except Exception as e: - raise RuntimeError("Error Running inference with local model", e) - - -# ---- Gradio Layout ----- -# Inspiration from https://huggingface.co/spaces/RASMUS/Whisper-youtube-crosslingual-subtitles -video_in = gr.Video(label="Video file", mirror_webcam=False) -youtube_url_in = gr.Textbox(label="Youtube url", lines=1, interactive=True) -df_init = pd.DataFrame(columns=['Start', 'End', 'Speaker', 'Text']) -memory = psutil.virtual_memory() -selected_source_lang = gr.Dropdown(choices=source_language_list, type="value", value="en", label="Spoken language in video", interactive=True) -selected_whisper_model = gr.Dropdown(choices=whisper_models, type="value", value="base", label="Selected 
Whisper model", interactive=True) -number_speakers = gr.Number(precision=0, value=2, label="Selected number of speakers", interactive=True) -system_info = gr.Markdown(f"*Memory: {memory.total / (1024 * 1024 * 1024):.2f}GB, used: {memory.percent}%, available: {memory.available / (1024 * 1024 * 1024):.2f}GB*") -transcription_df = gr.DataFrame(value=df_init,label="Transcription dataframe", row_count=(0, "dynamic"), max_rows = 10, wrap=True, overflow_row_behaviour='paginate') -title = "Speaker diarization" -demo = gr.Blocks(title=title,css="footer {visibility: hidden}") -demo.encrypt = False - - -with demo: - with gr.Tab("Speaker Diarization"): - # gr.Markdown(''' - #
- #

Whisper speaker diarization

- # This space uses Whisper models from OpenAI to recognize the speech and the ECAPA-TDNN model from SpeechBrain to encode and classify speakers - #
- # ''') - - # with gr.Row(): - # gr.Markdown(''' - # ### Transcribe youtube link using OpenAI Whisper - # ##### 1. Using Open AI's Whisper model to seperate audio into segments and generate transcripts. - # ##### 2. Generating speaker embeddings for each segments. - # ##### 3. Applying agglomerative clustering on the embeddings to identify the speaker for each segment. - # ''') - - # with gr.Row(): - # gr.Markdown(''' - # ### You can test by following examples: - # ''') - # examples = gr.Examples(examples= - # [ "https://www.youtube.com/watch?v=j7BfEzAFuYc&t=32s", - # "https://www.youtube.com/watch?v=-UX0X45sYe4", - # "https://www.youtube.com/watch?v=7minSgqi-Gw"], - # label="Examples", inputs=[youtube_url_in]) - - - # with gr.Row(): - # with gr.Column(): - # youtube_url_in.render() - # download_youtube_btn = gr.Button("Download Youtube video") - # download_youtube_btn.click(get_youtube, [youtube_url_in], [ - # video_in]) - # print(video_in) - - - with gr.Row(): - with gr.Column(): - video_in.render() - with gr.Column(): - gr.Markdown(''' - ##### Here you can start the transcription process. - ##### Please select the source language for transcription. - ##### You should select a number of speakers for getting better results. - ''') - selected_source_lang.render() - selected_whisper_model.render() - number_speakers.render() - transcribe_btn = gr.Button("Transcribe audio and diarization") - transcribe_btn.click(speech_to_text, [video_in, selected_source_lang, selected_whisper_model, number_speakers], [transcription_df, system_info]) - - - with gr.Row(): - gr.Markdown(''' - ##### Here you will get transcription output - ##### ''') - - - with gr.Row(): - with gr.Column(): - transcription_df.render() - system_info.render() - # gr.Markdown('''
visitor badge
''') - - # with gr.Tab("Whisper Transcribe Japanese Audio"): - # gr.Markdown(f''' - #
- #

Whisper Transcribe Japanese Audio

- #
- # Transcribe long-form microphone or audio inputs with the click of a button! This demo uses the fine-tuned - # checkpoint {MODEL_NAME} to transcribe audio files of arbitrary length. - # ''') - # microphone = gr.inputs.Audio(source="microphone", type="filepath", optional=True) - # upload = gr.inputs.Audio(source="upload", type="filepath", optional=True) - # transcribe_btn = gr.Button("Transcribe Audio") - # text_output = gr.Textbox() - # with gr.Row(): - # gr.Markdown(''' - # ### You can test with the following examples: - # ''') - # examples = gr.Examples(examples= - # [ "sample1.wav", - # "sample2.wav", - # ], - # label="Examples", inputs=[upload]) - # transcribe_btn.click(transcribe, [microphone, upload], outputs=text_output) - - # with gr.Tab("Whisper Transcribe Japanese YouTube"): - # gr.Markdown(f''' - #
- #

Whisper Transcribe Japanese YouTube

- #
- # Transcribe long-form YouTube videos with the click of a button! The fine-tuned checkpoint: - # {MODEL_NAME} to transcribe audio files of arbitrary length. - # ''') - # youtube_link = gr.Textbox(label="Youtube url", lines=1, interactive=True) - # yt_transcribe_btn = gr.Button("Transcribe YouTube") - # text_output2 = gr.Textbox() - # html_output = gr.Markdown() - # yt_transcribe_btn.click(yt_transcribe, [youtube_link], outputs=[html_output, text_output2]) - -demo.launch(debug=True) \ No newline at end of file diff --git a/spaces/amankishore/sjc/my3d.py b/spaces/amankishore/sjc/my3d.py deleted file mode 100644 index 803dffe70ff6a8d5cec19c316bc59852c887dc06..0000000000000000000000000000000000000000 --- a/spaces/amankishore/sjc/my3d.py +++ /dev/null @@ -1,160 +0,0 @@ -# some tools developed for the vision class -import numpy as np -from numpy import cross, tan -from numpy.linalg import norm, inv - - -def normalize(v): - return v / norm(v) - - -def camera_pose(eye, front, up): - z = normalize(-1 * front) - x = normalize(cross(up, z)) - y = normalize(cross(z, x)) - - # convert to col vector - x = x.reshape(-1, 1) - y = y.reshape(-1, 1) - z = z.reshape(-1, 1) - eye = eye.reshape(-1, 1) - - pose = np.block([ - [x, y, z, eye], - [0, 0, 0, 1] - ]) - return pose - - -def compute_extrinsics(eye, front, up): - pose = camera_pose(eye, front, up) - world_2_cam = inv(pose) - return world_2_cam - - -def compute_intrinsics(aspect_ratio, fov, img_height_in_pix): - # aspect ratio is w / h - ndc = compute_proj_to_normalized(aspect_ratio, fov) - - # anything beyond [-1, 1] should be discarded - # this did not mention how to do z-clipping; - - ndc_to_img = compute_normalized_to_img_trans(aspect_ratio, img_height_in_pix) - intrinsic = ndc_to_img @ ndc - return intrinsic - - -def compute_proj_to_normalized(aspect, fov): - # compared to standard OpenGL NDC intrinsic, - # this skips the 3rd row treatment on z. 
hence the name partial_ndc - fov_in_rad = fov / 180 * np.pi - t = tan(fov_in_rad / 2) # tan half fov - partial_ndc_intrinsic = np.array([ - [1 / (t * aspect), 0, 0, 0], - [0, 1 / t, 0, 0], - [0, 0, -1, 0] # copy the negative distance for division - ]) - return partial_ndc_intrinsic - - -def compute_normalized_to_img_trans(aspect, img_height_in_pix): - img_h = img_height_in_pix - img_w = img_height_in_pix * aspect - - # note the OpenGL convention that (0, 0) sits at the center of the pixel; - # hence the extra -0.5 translation - # this is useful when you shoot rays through a pixel to the scene - ndc_to_img = np.array([ - [img_w / 2, 0, img_w / 2 - 0.5], - [0, img_h / 2, img_h / 2 - 0.5], - [0, 0, 1] - ]) - - img_y_coord_flip = np.array([ - [1, 0, 0], - [0, -1, img_h - 1], # note the -1 - [0, 0, 1] - ]) - - # the product of the above 2 matrices is equivalent to adding - # - sign to the (1, 1) entry - # you could have simply written - # ndc_to_img = np.array([ - # [img_w / 2, 0, img_w / 2 - 0.5], - # [0, -img_h / 2, img_h / 2 - 0.5], - # [0, 0, 1] - # ]) - - ndc_to_img = img_y_coord_flip @ ndc_to_img - return ndc_to_img - - -def unproject(K, pixel_coords, depth=1.0): - """sometimes also referred to as backproject - pixel_coords: [n, 2] pixel locations - depth: [n,] or [,] depth value. 
of a shape that is broadcastable with pix coords - """ - K = K[0:3, 0:3] - - pixel_coords = as_homogeneous(pixel_coords) - pixel_coords = pixel_coords.T # [2+1, n], so that mat mult is on the left - - # this will give points with z = -1, which is exactly what you want since - # your camera is facing the -ve z axis - pts = inv(K) @ pixel_coords - - pts = pts * depth # [3, n] * [n,] broadcast - pts = pts.T - pts = as_homogeneous(pts) - return pts - - -""" -these two functions are changed so that they can handle arbitrary number of -dimensions >=1 -""" - - -def homogenize(pts): - # pts: [..., d], where last dim of the d is the diviser - *front, d = pts.shape - pts = pts / pts[..., -1].reshape(*front, 1) - return pts - - -def as_homogeneous(pts, lib=np): - # pts: [..., d] - *front, d = pts.shape - points = lib.ones((*front, d + 1)) - points[..., :d] = pts - return points - - -def simple_point_render(pts, img_w, img_h, fov, eye, front, up): - """ - pts: [N, 3] - """ - canvas = np.ones((img_h, img_w, 3)) - - pts = as_homogeneous(pts) - - E = compute_extrinsics(eye, front, up) - world_2_ndc = compute_proj_to_normalized(img_w / img_h, fov) - ndc_to_img = compute_normalized_to_img_trans(img_w / img_h, img_h) - - pts = pts @ E.T - pts = pts @ world_2_ndc.T - pts = homogenize(pts) - - # now filter out outliers beyond [-1, 1] - outlier_mask = (np.abs(pts) > 1.0).any(axis=1) - pts = pts[~outlier_mask] - - pts = pts @ ndc_to_img.T - - # now draw each point - pts = np.rint(pts).astype(np.int32) - xs, ys, _ = pts.T - canvas[ys, xs] = (1, 0, 0) - - return canvas diff --git a/spaces/andryMLOPS/ASTA-GPT-3.8_web_ui/client/js/chat.js b/spaces/andryMLOPS/ASTA-GPT-3.8_web_ui/client/js/chat.js deleted file mode 100644 index 8a4449e0fd94c629867d62f53f5467f8e8292ca7..0000000000000000000000000000000000000000 --- a/spaces/andryMLOPS/ASTA-GPT-3.8_web_ui/client/js/chat.js +++ /dev/null @@ -1,508 +0,0 @@ -const query = (obj) => - Object.keys(obj) - .map((k) => encodeURIComponent(k) + "=" + 
encodeURIComponent(obj[k])) - .join("&"); -const url_prefix = document.querySelector("body").getAttribute("data-urlprefix"); -const markdown = window.markdownit(); -const message_box = document.getElementById(`messages`); -const message_input = document.getElementById(`message-input`); -const box_conversations = document.querySelector(`.top`); -const spinner = box_conversations.querySelector(".spinner"); -const stop_generating = document.querySelector(`.stop-generating`); -const send_button = document.querySelector(`#send-button`); -const user_image = `User Avatar`; -const gpt_image = `GPT Avatar`; -let prompt_lock = false; - -hljs.addPlugin(new CopyButtonPlugin()); - -message_input.addEventListener("blur", () => { - window.scrollTo(0, 0); -}); - -message_input.addEventListener("focus", () => { - document.documentElement.scrollTop = document.documentElement.scrollHeight; -}); - -const delete_conversations = async () => { - localStorage.clear(); - await new_conversation(); -}; - -const handle_ask = async () => { - message_input.style.height = `80px`; - window.scrollTo(0, 0); - let message = message_input.value; - - if (message.length > 0) { - message_input.value = ``; - message_input.dispatchEvent(new Event("input")); - await ask_gpt(message); - } -}; - -const remove_cancel_button = async () => { - stop_generating.classList.add(`stop-generating-hiding`); - - setTimeout(() => { - stop_generating.classList.remove(`stop-generating-hiding`); - stop_generating.classList.add(`stop-generating-hidden`); - }, 300); -}; - -const ask_gpt = async (message) => { - try { - message_input.value = ``; - message_input.innerHTML = ``; - message_input.innerText = ``; - - add_conversation(window.conversation_id, message.substr(0, 16)); - window.scrollTo(0, 0); - window.controller = new AbortController(); - - jailbreak = document.getElementById("jailbreak"); - model = document.getElementById("model"); - prompt_lock = true; - window.text = ``; - window.token = message_id(); - - 
stop_generating.classList.remove(`stop-generating-hidden`); - - add_user_message_box(message); - - message_box.scrollTop = message_box.scrollHeight; - window.scrollTo(0, 0); - await new Promise((r) => setTimeout(r, 500)); - window.scrollTo(0, 0); - - message_box.innerHTML += ` -
-
- ${gpt_image} -
-
-
-
-
- `; - - message_box.scrollTop = message_box.scrollHeight; - window.scrollTo(0, 0); - await new Promise((r) => setTimeout(r, 1000)); - window.scrollTo(0, 0); - - const response = await fetch(`${url_prefix}/backend-api/v2/conversation`, { - method: `POST`, - signal: window.controller.signal, - headers: { - "content-type": `application/json`, - accept: `text/event-stream`, - }, - body: JSON.stringify({ - conversation_id: window.conversation_id, - action: `_ask`, - model: model.options[model.selectedIndex].value, - jailbreak: jailbreak.options[jailbreak.selectedIndex].value, - meta: { - id: window.token, - content: { - conversation: await get_conversation(window.conversation_id), - internet_access: document.getElementById("switch").checked, - content_type: "text", - parts: [ - { - content: message, - role: "user", - }, - ], - }, - }, - }), - }); - - const reader = response.body.getReader(); - - while (true) { - const { value, done } = await reader.read(); - if (done) break; - - chunk = decodeUnicode(new TextDecoder().decode(value)); - - if ( - chunk.includes(`
{ - const messageDiv = createElement("div", { classNames: ["message"] }); - const avatarContainer = createElement("div", { classNames: ["avatar-container"], innerHTML: user_image }); - const contentDiv = createElement("div", { - classNames: ["content"], - id: `user_${token}`, - textContent: message, - }); - - messageDiv.append(avatarContainer, contentDiv); - message_box.appendChild(messageDiv); -}; - -const decodeUnicode = (str) => { - return str.replace(/\\u([a-fA-F0-9]{4})/g, function (match, grp) { - return String.fromCharCode(parseInt(grp, 16)); - }); -}; - -const clear_conversations = async () => { - const elements = box_conversations.childNodes; - let index = elements.length; - - if (index > 0) { - while (index--) { - const element = elements[index]; - if (element.nodeType === Node.ELEMENT_NODE && element.tagName.toLowerCase() !== `button`) { - box_conversations.removeChild(element); - } - } - } -}; - -const clear_conversation = async () => { - let messages = message_box.getElementsByTagName(`div`); - - while (messages.length > 0) { - message_box.removeChild(messages[0]); - } -}; - -const delete_conversation = async (conversation_id) => { - localStorage.removeItem(`conversation:${conversation_id}`); - - if (window.conversation_id == conversation_id) { - await new_conversation(); - } - - await load_conversations(20, 0, true); -}; - -const set_conversation = async (conversation_id) => { - history.pushState({}, null, `${url_prefix}/chat/${conversation_id}`); - window.conversation_id = conversation_id; - - await clear_conversation(); - await load_conversation(conversation_id); - await load_conversations(20, 0, true); -}; - -const new_conversation = async () => { - history.pushState({}, null, `${url_prefix}/chat/`); - window.conversation_id = uuid(); - - await clear_conversation(); - await load_conversations(20, 0, true); -}; - -const load_conversation = async (conversation_id) => { - let conversation = await 
JSON.parse(localStorage.getItem(`conversation:${conversation_id}`)); - console.log(conversation, conversation_id); - - for (item of conversation.items) { - if (is_assistant(item.role)) { - message_box.innerHTML += load_gpt_message_box(item.content); - } else { - message_box.innerHTML += load_user_message_box(item.content); - } - } - - document.querySelectorAll(`code`).forEach((el) => { - hljs.highlightElement(el); - }); - - message_box.scrollTo({ top: message_box.scrollHeight, behavior: "smooth" }); - - setTimeout(() => { - message_box.scrollTop = message_box.scrollHeight; - }, 500); -}; - -const load_user_message_box = (content) => { - const messageDiv = createElement("div", { classNames: ["message"] }); - const avatarContainer = createElement("div", { classNames: ["avatar-container"], innerHTML: user_image }); - const contentDiv = createElement("div", { classNames: ["content"] }); - const preElement = document.createElement("pre"); - preElement.textContent = content; - contentDiv.appendChild(preElement); - - messageDiv.append(avatarContainer, contentDiv); - - return messageDiv.outerHTML; -}; - -const load_gpt_message_box = (content) => { - return ` -
-
- ${gpt_image} -
-
- ${markdown.render(content)} -
-
- `; -}; - -const is_assistant = (role) => { - return role == "assistant"; -}; - -const get_conversation = async (conversation_id) => { - let conversation = await JSON.parse(localStorage.getItem(`conversation:${conversation_id}`)); - return conversation.items; -}; - -const add_conversation = async (conversation_id, title) => { - if (localStorage.getItem(`conversation:${conversation_id}`) == null) { - localStorage.setItem( - `conversation:${conversation_id}`, - JSON.stringify({ - id: conversation_id, - title: title, - items: [], - }) - ); - } -}; - -const add_message = async (conversation_id, role, content) => { - before_adding = JSON.parse(localStorage.getItem(`conversation:${conversation_id}`)); - - before_adding.items.push({ - role: role, - content: content, - }); - - localStorage.setItem(`conversation:${conversation_id}`, JSON.stringify(before_adding)); // update conversation -}; - -const load_conversations = async (limit, offset, loader) => { - //console.log(loader); - //if (loader === undefined) box_conversations.appendChild(spinner); - - let conversations = []; - for (let i = 0; i < localStorage.length; i++) { - if (localStorage.key(i).startsWith("conversation:")) { - let conversation = localStorage.getItem(localStorage.key(i)); - conversations.push(JSON.parse(conversation)); - } - } - - //if (loader === undefined) spinner.parentNode.removeChild(spinner) - await clear_conversations(); - - for (conversation of conversations) { - box_conversations.innerHTML += ` -
-
- - ${conversation.title} -
- -
- `; - } - - document.querySelectorAll(`code`).forEach((el) => { - hljs.highlightElement(el); - }); -}; - -document.getElementById(`cancelButton`).addEventListener(`click`, async () => { - window.controller.abort(); - console.log(`aborted ${window.conversation_id}`); -}); - -function h2a(str1) { - var hex = str1.toString(); - var str = ""; - - for (var n = 0; n < hex.length; n += 2) { - str += String.fromCharCode(parseInt(hex.substr(n, 2), 16)); - } - - return str; -} - -const uuid = () => { - return `xxxxxxxx-xxxx-4xxx-yxxx-${Date.now().toString(16)}`.replace(/[xy]/g, function (c) { - var r = (Math.random() * 16) | 0, - v = c == "x" ? r : (r & 0x3) | 0x8; - return v.toString(16); - }); -}; - -const message_id = () => { - random_bytes = (Math.floor(Math.random() * 1338377565) + 2956589730).toString(2); - unix = Math.floor(Date.now() / 1000).toString(2); - - return BigInt(`0b${unix}${random_bytes}`).toString(); -}; - -window.onload = async () => { - load_settings_localstorage(); - - conversations = 0; - for (let i = 0; i < localStorage.length; i++) { - if (localStorage.key(i).startsWith("conversation:")) { - conversations += 1; - } - } - - if (conversations == 0) localStorage.clear(); - - await setTimeout(() => { - load_conversations(20, 0); - }, 1); - - if (!window.location.href.endsWith(`#`)) { - if (/\/chat\/.+/.test(window.location.href.slice(url_prefix.length))) { - await load_conversation(window.conversation_id); - } - } - - message_input.addEventListener("keydown", async (evt) => { - if (prompt_lock) return; - - if (evt.key === "Enter" && !evt.shiftKey) { - evt.preventDefault(); - await handle_ask(); - } - }); - - send_button.addEventListener("click", async (event) => { - event.preventDefault(); - if (prompt_lock) return; - message_input.blur(); - await handle_ask(); - }); - - register_settings_localstorage(); -}; - -const register_settings_localstorage = async () => { - settings_ids = ["switch", "model", "jailbreak"]; - settings_elements = 
settings_ids.map((id) => document.getElementById(id)); - settings_elements.map((element) => - element.addEventListener(`change`, async (event) => { - switch (event.target.type) { - case "checkbox": - localStorage.setItem(event.target.id, event.target.checked); - break; - case "select-one": - localStorage.setItem(event.target.id, event.target.selectedIndex); - break; - default: - console.warn("Unresolved element type"); - } - }) - ); -}; - -const load_settings_localstorage = async () => { - settings_ids = ["switch", "model", "jailbreak"]; - settings_elements = settings_ids.map((id) => document.getElementById(id)); - settings_elements.map((element) => { - if (localStorage.getItem(element.id)) { - switch (element.type) { - case "checkbox": - element.checked = localStorage.getItem(element.id) === "true"; - break; - case "select-one": - element.selectedIndex = parseInt(localStorage.getItem(element.id)); - break; - default: - console.warn("Unresolved element type"); - } - } - }); -}; - -function clearTextarea(textarea) { - textarea.style.removeProperty("height"); - textarea.style.height = `${textarea.scrollHeight + 4}px`; - if (textarea.value.trim() === "" && textarea.value.includes("\n")) { - textarea.value = ""; - } -} - -function createElement(tag, { classNames, id, innerHTML, textContent } = {}) { - const el = document.createElement(tag); - if (classNames) { - el.classList.add(...classNames); - } - if (id) { - el.id = id; - } - if (innerHTML) { - el.innerHTML = innerHTML; - } - if (textContent) { - const preElement = document.createElement("pre"); - preElement.textContent = textContent; - el.appendChild(preElement); - } - return el; -} diff --git a/spaces/aodianyun/stable-diffusion-webui/modules/txt2img.py b/spaces/aodianyun/stable-diffusion-webui/modules/txt2img.py deleted file mode 100644 index 3927d8538f06c1ed270c9a6cfd55d4bb15705ee5..0000000000000000000000000000000000000000 --- a/spaces/aodianyun/stable-diffusion-webui/modules/txt2img.py +++ /dev/null @@ -1,69 
+0,0 @@ -import modules.scripts -from modules import sd_samplers -from modules.generation_parameters_copypaste import create_override_settings_dict -from modules.processing import StableDiffusionProcessing, Processed, StableDiffusionProcessingTxt2Img, \ - StableDiffusionProcessingImg2Img, process_images -from modules.shared import opts, cmd_opts -import modules.shared as shared -import modules.processing as processing -from modules.ui import plaintext_to_html - - -def txt2img(id_task: str, prompt: str, negative_prompt: str, prompt_styles, steps: int, sampler_index: int, restore_faces: bool, tiling: bool, n_iter: int, batch_size: int, cfg_scale: float, seed: int, subseed: int, subseed_strength: float, seed_resize_from_h: int, seed_resize_from_w: int, seed_enable_extras: bool, height: int, width: int, enable_hr: bool, denoising_strength: float, hr_scale: float, hr_upscaler: str, hr_second_pass_steps: int, hr_resize_x: int, hr_resize_y: int, override_settings_texts, *args): - override_settings = create_override_settings_dict(override_settings_texts) - - p = StableDiffusionProcessingTxt2Img( - sd_model=shared.sd_model, - outpath_samples=opts.outdir_samples or opts.outdir_txt2img_samples, - outpath_grids=opts.outdir_grids or opts.outdir_txt2img_grids, - prompt=prompt, - styles=prompt_styles, - negative_prompt=negative_prompt, - seed=seed, - subseed=subseed, - subseed_strength=subseed_strength, - seed_resize_from_h=seed_resize_from_h, - seed_resize_from_w=seed_resize_from_w, - seed_enable_extras=seed_enable_extras, - sampler_name=sd_samplers.samplers[sampler_index].name, - batch_size=batch_size, - n_iter=n_iter, - steps=steps, - cfg_scale=cfg_scale, - width=width, - height=height, - restore_faces=restore_faces, - tiling=tiling, - enable_hr=enable_hr, - denoising_strength=denoising_strength if enable_hr else None, - hr_scale=hr_scale, - hr_upscaler=hr_upscaler, - hr_second_pass_steps=hr_second_pass_steps, - hr_resize_x=hr_resize_x, - hr_resize_y=hr_resize_y, - 
override_settings=override_settings, - ) - - p.scripts = modules.scripts.scripts_txt2img - p.script_args = args - - if cmd_opts.enable_console_prompts: - print(f"\ntxt2img: {prompt}", file=shared.progress_print_out) - - processed = modules.scripts.scripts_txt2img.run(p, *args) - - if processed is None: - processed = process_images(p) - - p.close() - - shared.total_tqdm.clear() - - generation_info_js = processed.js() - if opts.samples_log_stdout: - print(generation_info_js) - - if opts.do_not_show_images: - processed.images = [] - - return processed.images, generation_info_js, plaintext_to_html(processed.info), plaintext_to_html(processed.comments) diff --git a/spaces/aphenx/bingo/src/components/ui/dropdown-menu.tsx b/spaces/aphenx/bingo/src/components/ui/dropdown-menu.tsx deleted file mode 100644 index 184d4e6007ef85187446362f69532ab077897fea..0000000000000000000000000000000000000000 --- a/spaces/aphenx/bingo/src/components/ui/dropdown-menu.tsx +++ /dev/null @@ -1,128 +0,0 @@ -'use client' - -import * as React from 'react' -import * as DropdownMenuPrimitive from '@radix-ui/react-dropdown-menu' - -import { cn } from '@/lib/utils' - -const DropdownMenu = DropdownMenuPrimitive.Root - -const DropdownMenuTrigger = DropdownMenuPrimitive.Trigger - -const DropdownMenuGroup = DropdownMenuPrimitive.Group - -const DropdownMenuPortal = DropdownMenuPrimitive.Portal - -const DropdownMenuSub = DropdownMenuPrimitive.Sub - -const DropdownMenuRadioGroup = DropdownMenuPrimitive.RadioGroup - -const DropdownMenuSubContent = React.forwardRef< - React.ElementRef, - React.ComponentPropsWithoutRef ->(({ className, ...props }, ref) => ( - -)) -DropdownMenuSubContent.displayName = - DropdownMenuPrimitive.SubContent.displayName - -const DropdownMenuContent = React.forwardRef< - React.ElementRef, - React.ComponentPropsWithoutRef ->(({ className, sideOffset = 4, ...props }, ref) => ( - - - -)) -DropdownMenuContent.displayName = DropdownMenuPrimitive.Content.displayName - -const DropdownMenuItem 
= React.forwardRef< - React.ElementRef, - React.ComponentPropsWithoutRef & { - inset?: boolean - } ->(({ className, inset, ...props }, ref) => ( - -)) -DropdownMenuItem.displayName = DropdownMenuPrimitive.Item.displayName - -const DropdownMenuLabel = React.forwardRef< - React.ElementRef, - React.ComponentPropsWithoutRef & { - inset?: boolean - } ->(({ className, inset, ...props }, ref) => ( - -)) -DropdownMenuLabel.displayName = DropdownMenuPrimitive.Label.displayName - -const DropdownMenuSeparator = React.forwardRef< - React.ElementRef, - React.ComponentPropsWithoutRef ->(({ className, ...props }, ref) => ( - -)) -DropdownMenuSeparator.displayName = DropdownMenuPrimitive.Separator.displayName - -const DropdownMenuShortcut = ({ - className, - ...props -}: React.HTMLAttributes) => { - return ( - - ) -} -DropdownMenuShortcut.displayName = 'DropdownMenuShortcut' - -export { - DropdownMenu, - DropdownMenuTrigger, - DropdownMenuContent, - DropdownMenuItem, - DropdownMenuLabel, - DropdownMenuSeparator, - DropdownMenuShortcut, - DropdownMenuGroup, - DropdownMenuPortal, - DropdownMenuSub, - DropdownMenuSubContent, - DropdownMenuRadioGroup -} diff --git a/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/layers/delightful_tts/encoders.py b/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/layers/delightful_tts/encoders.py deleted file mode 100644 index 0878f0677a29d092597a46e8a3b11e4a521769b8..0000000000000000000000000000000000000000 --- a/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/layers/delightful_tts/encoders.py +++ /dev/null @@ -1,261 +0,0 @@ -from typing import List, Tuple, Union - -import torch -import torch.nn as nn # pylint: disable=consider-using-from-import -import torch.nn.functional as F - -from TTS.tts.layers.delightful_tts.conformer import ConformerMultiHeadedSelfAttention -from TTS.tts.layers.delightful_tts.conv_layers import CoordConv1d -from TTS.tts.layers.delightful_tts.networks import STL - - -def get_mask_from_lengths(lengths: torch.Tensor) -> 
torch.Tensor: - batch_size = lengths.shape[0] - max_len = torch.max(lengths).item() - ids = torch.arange(0, max_len, device=lengths.device).unsqueeze(0).expand(batch_size, -1) - mask = ids >= lengths.unsqueeze(1).expand(-1, max_len) - return mask - - - def stride_lens(lens: torch.Tensor, stride: int = 2) -> torch.Tensor: - return torch.ceil(lens / stride).int() - - - class ReferenceEncoder(nn.Module): - """ - Reference encoder for utterance and phoneme prosody encoders. Reference encoder - made up of convolution and RNN layers. - - Args: - num_mels (int): Number of mel frames to produce. - ref_enc_filters (list[int]): List of channel sizes for encoder layers. - ref_enc_size (int): Size of the kernel for the conv layers. - ref_enc_strides (List[int]): List of strides to use for conv layers. - ref_enc_gru_size (int): Number of hidden features for the gated recurrent unit. - - Inputs: inputs, mask - - **inputs** (batch, dim, time): Tensor containing mel vector - - **lengths** (batch): Tensor containing the mel lengths. - Returns: - - **outputs** (batch, time, dim): Tensor produced by Reference Encoder.
- """ - - def __init__( - self, - num_mels: int, - ref_enc_filters: List[Union[int, int, int, int, int, int]], - ref_enc_size: int, - ref_enc_strides: List[Union[int, int, int, int, int]], - ref_enc_gru_size: int, - ): - super().__init__() - - n_mel_channels = num_mels - self.n_mel_channels = n_mel_channels - K = len(ref_enc_filters) - filters = [self.n_mel_channels] + ref_enc_filters - strides = [1] + ref_enc_strides - # Use CoordConv at the first layer to better preserve positional information: https://arxiv.org/pdf/1811.02122.pdf - convs = [ - CoordConv1d( - in_channels=filters[0], - out_channels=filters[0 + 1], - kernel_size=ref_enc_size, - stride=strides[0], - padding=ref_enc_size // 2, - with_r=True, - ) - ] - convs2 = [ - nn.Conv1d( - in_channels=filters[i], - out_channels=filters[i + 1], - kernel_size=ref_enc_size, - stride=strides[i], - padding=ref_enc_size // 2, - ) - for i in range(1, K) - ] - convs.extend(convs2) - self.convs = nn.ModuleList(convs) - - self.norms = nn.ModuleList([nn.InstanceNorm1d(num_features=ref_enc_filters[i], affine=True) for i in range(K)]) - - self.gru = nn.GRU( - input_size=ref_enc_filters[-1], - hidden_size=ref_enc_gru_size, - batch_first=True, - ) - - def forward(self, x: torch.Tensor, mel_lens: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: - """ - inputs --- [N, n_mels, timesteps] - outputs --- [N, E//2] - """ - - mel_masks = get_mask_from_lengths(mel_lens).unsqueeze(1) - x = x.masked_fill(mel_masks, 0) - for conv, norm in zip(self.convs, self.norms): - x = conv(x) - x = F.leaky_relu(x, 0.3) # [N, 128, Ty//2^K, n_mels//2^K] - x = norm(x) - - for _ in range(2): - mel_lens = stride_lens(mel_lens) - - mel_masks = get_mask_from_lengths(mel_lens) - - x = x.masked_fill(mel_masks.unsqueeze(1), 0) - x = x.permute((0, 2, 1)) - x = torch.nn.utils.rnn.pack_padded_sequence(x, mel_lens.cpu().int(), batch_first=True, enforce_sorted=False) - - self.gru.flatten_parameters() - x, memory = self.gru(x) # memory --- [N, Ty, 
E//2], out --- [1, N, E//2] - x, _ = torch.nn.utils.rnn.pad_packed_sequence(x, batch_first=True) - - return x, memory, mel_masks - - def calculate_channels( # pylint: disable=no-self-use - self, L: int, kernel_size: int, stride: int, pad: int, n_convs: int - ) -> int: - for _ in range(n_convs): - L = (L - kernel_size + 2 * pad) // stride + 1 - return L - - - class UtteranceLevelProsodyEncoder(nn.Module): - def __init__( - self, - num_mels: int, - ref_enc_filters: List[Union[int, int, int, int, int, int]], - ref_enc_size: int, - ref_enc_strides: List[Union[int, int, int, int, int]], - ref_enc_gru_size: int, - dropout: float, - n_hidden: int, - bottleneck_size_u: int, - token_num: int, - ): - """ - Encoder to extract prosody from utterance. It is made up of a reference encoder - with a couple of linear layers and a style token layer with dropout. - - Args: - num_mels (int): Number of mel frames to produce. - ref_enc_filters (list[int]): List of channel sizes for ref encoder layers. - ref_enc_size (int): Size of the kernel for the ref encoder conv layers. - ref_enc_strides (List[int]): List of strides to use for the ref encoder conv layers. - ref_enc_gru_size (int): Number of hidden features for the gated recurrent unit. - dropout (float): Probability of dropout. - n_hidden (int): Size of hidden layers. - bottleneck_size_u (int): Size of the bottleneck layer. - - Inputs: inputs, mask - - **inputs** (batch, dim, time): Tensor containing mel vector - - **lengths** (batch): Tensor containing the mel lengths. - Returns: - - **outputs** (batch, 1, dim): Tensor produced by Utterance Level Prosody Encoder.
- """ - super().__init__() - - self.E = n_hidden - self.d_q = self.d_k = n_hidden - bottleneck_size = bottleneck_size_u - - self.encoder = ReferenceEncoder( - ref_enc_filters=ref_enc_filters, - ref_enc_gru_size=ref_enc_gru_size, - ref_enc_size=ref_enc_size, - ref_enc_strides=ref_enc_strides, - num_mels=num_mels, - ) - self.encoder_prj = nn.Linear(ref_enc_gru_size, self.E // 2) - self.stl = STL(n_hidden=n_hidden, token_num=token_num) - self.encoder_bottleneck = nn.Linear(self.E, bottleneck_size) - self.dropout = nn.Dropout(dropout) - - def forward(self, mels: torch.Tensor, mel_lens: torch.Tensor) -> torch.Tensor: - """ - Shapes: - mels: :math: `[B, C, T]` - mel_lens: :math: `[B]` - - out --- [N, seq_len, E] - """ - _, embedded_prosody, _ = self.encoder(mels, mel_lens) - - # Bottleneck - embedded_prosody = self.encoder_prj(embedded_prosody) - - # Style Token - out = self.encoder_bottleneck(self.stl(embedded_prosody)) - out = self.dropout(out) - - out = out.view((-1, 1, out.shape[3])) - return out - - -class PhonemeLevelProsodyEncoder(nn.Module): - def __init__( - self, - num_mels: int, - ref_enc_filters: List[Union[int, int, int, int, int, int]], - ref_enc_size: int, - ref_enc_strides: List[Union[int, int, int, int, int]], - ref_enc_gru_size: int, - dropout: float, - n_hidden: int, - n_heads: int, - bottleneck_size_p: int, - ): - super().__init__() - - self.E = n_hidden - self.d_q = self.d_k = n_hidden - bottleneck_size = bottleneck_size_p - - self.encoder = ReferenceEncoder( - ref_enc_filters=ref_enc_filters, - ref_enc_gru_size=ref_enc_gru_size, - ref_enc_size=ref_enc_size, - ref_enc_strides=ref_enc_strides, - num_mels=num_mels, - ) - self.encoder_prj = nn.Linear(ref_enc_gru_size, n_hidden) - self.attention = ConformerMultiHeadedSelfAttention( - d_model=n_hidden, - num_heads=n_heads, - dropout_p=dropout, - ) - self.encoder_bottleneck = nn.Linear(n_hidden, bottleneck_size) - - def forward( - self, - x: torch.Tensor, - src_mask: torch.Tensor, - mels: torch.Tensor, - 
mel_lens: torch.Tensor, - encoding: torch.Tensor, - ) -> torch.Tensor: - """ - x --- [N, seq_len, encoder_embedding_dim] - mels --- [N, Ty/r, n_mels*r], r=1 - out --- [N, seq_len, bottleneck_size] - attn --- [N, seq_len, ref_len], Ty/r = ref_len - """ - embedded_prosody, _, mel_masks = self.encoder(mels, mel_lens) - - # Bottleneck - embedded_prosody = self.encoder_prj(embedded_prosody) - - attn_mask = mel_masks.view((mel_masks.shape[0], 1, 1, -1)) - x, _ = self.attention( - query=x, - key=embedded_prosody, - value=embedded_prosody, - mask=attn_mask, - encoding=encoding, - ) - x = self.encoder_bottleneck(x) - x = x.masked_fill(src_mask.unsqueeze(-1), 0.0) - return x diff --git a/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/utils/text/phonemizers/multi_phonemizer.py b/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/utils/text/phonemizers/multi_phonemizer.py deleted file mode 100644 index 62a9c39322e051124d8c2d816cbc7d479df69dfe..0000000000000000000000000000000000000000 --- a/spaces/artificialguybr/video-dubbing/TTS/TTS/tts/utils/text/phonemizers/multi_phonemizer.py +++ /dev/null @@ -1,65 +0,0 @@ -from typing import Dict, List - -from TTS.tts.utils.text.phonemizers import DEF_LANG_TO_PHONEMIZER, get_phonemizer_by_name - - -class MultiPhonemizer: - """🐸TTS multi-phonemizer that operates phonemizers for multiple languages - - Args: - custom_lang_to_phonemizer (Dict): - Custom phonemizer mapping if you want to change the defaults. In the format of - `{"lang_code": "phonemizer_name"}`. When it is None, `DEF_LANG_TO_PHONEMIZER` is used. Defaults to `{}`.
- - TODO: find a way to pass custom kwargs to the phonemizers - """ - - lang_to_phonemizer = {} - - def __init__(self, lang_to_phonemizer_name: Dict = {}) -> None: # pylint: disable=dangerous-default-value - for k, v in lang_to_phonemizer_name.items(): - if v == "" and k in DEF_LANG_TO_PHONEMIZER.keys(): - lang_to_phonemizer_name[k] = DEF_LANG_TO_PHONEMIZER[k] - elif v == "": - raise ValueError(f"Phonemizer wasn't set for language {k} and doesn't have a default.") - self.lang_to_phonemizer_name = lang_to_phonemizer_name - self.lang_to_phonemizer = self.init_phonemizers(self.lang_to_phonemizer_name) - - @staticmethod - def init_phonemizers(lang_to_phonemizer_name: Dict) -> Dict: - lang_to_phonemizer = {} - for k, v in lang_to_phonemizer_name.items(): - lang_to_phonemizer[k] = get_phonemizer_by_name(v, language=k) - return lang_to_phonemizer - - @staticmethod - def name(): - return "multi-phonemizer" - - def phonemize(self, text, separator="|", language=""): - if language == "": - raise ValueError("Language must be set for multi-phonemizer to phonemize.") - return self.lang_to_phonemizer[language].phonemize(text, separator) - - def supported_languages(self) -> List: - return list(self.lang_to_phonemizer.keys()) - - def print_logs(self, level: int = 0): - indent = "\t" * level - print(f"{indent}| > phoneme language: {self.supported_languages()}") - print(f"{indent}| > phoneme backend: {self.name()}") - - -# if __name__ == "__main__": -# texts = { -# "tr": "Merhaba, bu Türkçe bit örnek!", -# "en-us": "Hello, this is English example!", -# "de": "Hallo, das ist ein Deutches Beipiel!", -# "zh-cn": "这是中国的例子", -# } -# phonemes = {} -# ph = MultiPhonemizer({"tr": "espeak", "en-us": "", "de": "gruut", "zh-cn": ""}) -# for lang, text in texts.items(): -# phoneme = ph.phonemize(text, lang) -# phonemes[lang] = phoneme -# print(phonemes) diff --git a/spaces/artificialguybr/video-dubbing/TTS/tests/bash_tests/test_demo_server.sh 
b/spaces/artificialguybr/video-dubbing/TTS/tests/bash_tests/test_demo_server.sh deleted file mode 100644 index ebd0bc8b89f2ba450a569be4c147ec4959efca18..0000000000000000000000000000000000000000 --- a/spaces/artificialguybr/video-dubbing/TTS/tests/bash_tests/test_demo_server.sh +++ /dev/null @@ -1,15 +0,0 @@ -#!/bin/bash -set -xe - -python -m TTS.server.server & -SERVER_PID=$! - -echo 'Waiting for server...' -sleep 30 - -curl -o /tmp/audio.wav "http://localhost:5002/api/tts?text=synthesis%20schmynthesis" -python -c 'import sys; import wave; print(wave.open(sys.argv[1]).getnframes())' /tmp/audio.wav - -kill $SERVER_PID - -rm /tmp/audio.wav diff --git a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/altair/examples/airports.py b/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/altair/examples/airports.py deleted file mode 100644 index 8efa4980f69778447a6c648f4f5ea94bf75c0bb8..0000000000000000000000000000000000000000 --- a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/altair/examples/airports.py +++ /dev/null @@ -1,33 +0,0 @@ -""" -Locations of US Airports -======================== -This is a layered geographic visualization that shows the positions of US -airports on a background of US states. 
-""" -# category: case studies -import altair as alt -from vega_datasets import data - -airports = data.airports() -states = alt.topo_feature(data.us_10m.url, feature='states') - -# US states background -background = alt.Chart(states).mark_geoshape( - fill='lightgray', - stroke='white' -).properties( - width=500, - height=300 -).project('albersUsa') - -# airport positions on background -points = alt.Chart(airports).mark_circle( - size=10, - color='steelblue' -).encode( - longitude='longitude:Q', - latitude='latitude:Q', - tooltip=['name', 'city', 'state'] -) - -background + points diff --git a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/certifi/__init__.py b/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/certifi/__init__.py deleted file mode 100644 index a3546f12555c2c8d186489c5220e8d2e25f0b0a9..0000000000000000000000000000000000000000 --- a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/certifi/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -from .core import contents, where - -__all__ = ["contents", "where"] -__version__ = "2022.12.07" diff --git a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/modules/retri.py b/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/modules/retri.py deleted file mode 100644 index d1b288f8e52308eeaefcb4411cb3d23e543f37d0..0000000000000000000000000000000000000000 --- a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/modules/retri.py +++ /dev/null @@ -1,429 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. 
-import os -import numpy as np -import pickle -import time - -try: - import faiss -except ImportError: - pass - -from collections import defaultdict - -from ..utils import get_local_rank, print_on_rank0 - - -class VectorRetriever(object): - """ - How2 Video Retriver. - Reference usage of FAISS: - https://github.com/fairinternal/fairseq-py/blob/paraphrase_pretraining/fairseq/data/multilingual_faiss_dataset.py - """ - - def __init__(self, hidden_size, cent, db_type, examples_per_cent_to_train): - if db_type == "flatl2": - quantizer = faiss.IndexFlatL2(hidden_size) # the other index - self.db = faiss.IndexIVFFlat( - quantizer, hidden_size, cent, faiss.METRIC_L2) - elif db_type == "pq": - self.db = faiss.index_factory( - hidden_size, f"IVF{cent}_HNSW32,PQ32" - ) - else: - raise ValueError("unknown type of db", db_type) - self.train_thres = cent * examples_per_cent_to_train - self.train_cache = [] - self.train_len = 0 - self.videoid_to_vectoridx = {} - self.vectoridx_to_videoid = None - self.make_direct_maps_done = False - - def make_direct_maps(self): - faiss.downcast_index(self.db).make_direct_map() - - def __len__(self): - return self.db.ntotal - - def save(self, out_dir): - faiss.write_index( - self.db, - os.path.join(out_dir, "faiss_idx") - ) - with open( - os.path.join( - out_dir, "videoid_to_vectoridx.pkl"), - "wb") as fw: - pickle.dump( - self.videoid_to_vectoridx, fw, - protocol=pickle.HIGHEST_PROTOCOL - ) - - def load(self, out_dir): - fn = os.path.join(out_dir, "faiss_idx") - self.db = faiss.read_index(fn) - with open( - os.path.join(out_dir, "videoid_to_vectoridx.pkl"), "rb") as fr: - self.videoid_to_vectoridx = pickle.load(fr) - - def add(self, hidden_states, video_ids, last=False): - assert len(hidden_states) == len(video_ids), "{}, {}".format( - str(len(hidden_states)), str(len(video_ids))) - assert len(hidden_states.shape) == 2 - assert hidden_states.dtype == np.float32 - - valid_idx = [] - for idx, video_id in enumerate(video_ids): - if video_id not in 
self.videoid_to_vectoridx: - valid_idx.append(idx) - self.videoid_to_vectoridx[video_id] = \ - len(self.videoid_to_vectoridx) - - hidden_states = hidden_states[valid_idx] - if not self.db.is_trained: - self.train_cache.append(hidden_states) - self.train_len += hidden_states.shape[0] - if self.train_len < self.train_thres: - return - self.finalize_training() - else: - self.db.add(hidden_states) - - def finalize_training(self): - hidden_states = np.concatenate(self.train_cache, axis=0) - del self.train_cache - local_rank = get_local_rank() - if local_rank == 0: - start = time.time() - print("training db on", self.train_thres, "/", self.train_len) - self.db.train(hidden_states[:self.train_thres]) - if local_rank == 0: - print("training db for", time.time() - start) - self.db.add(hidden_states) - - def search( - self, - query_hidden_states, - orig_dist, - ): - if len(self.videoid_to_vectoridx) != self.db.ntotal: - raise ValueError( - "cannot search: size mismatch in-between index and db", - len(self.videoid_to_vectoridx), - self.db.ntotal - ) - - if self.vectoridx_to_videoid is None: - self.vectoridx_to_videoid = { - self.videoid_to_vectoridx[videoid]: videoid - for videoid in self.videoid_to_vectoridx - } - assert len(self.vectoridx_to_videoid) \ - == len(self.videoid_to_vectoridx) - - # MultilingualFaissDataset uses the following; not sure the purpose. 
- # faiss.ParameterSpace().set_index_parameter(self.db, "nprobe", 10) - queried_dist, index = self.db.search(query_hidden_states, 1) - queried_dist, index = queried_dist[:, 0], index[:, 0] - - outputs = np.array( - [self.vectoridx_to_videoid[_index] - if _index != -1 else (-1, -1, -1) for _index in index], - dtype=np.int32) - outputs[queried_dist <= orig_dist] = -1 - return outputs - - def search_by_video_ids( - self, - video_ids, - retri_factor - ): - if len(self.videoid_to_vectoridx) != self.db.ntotal: - raise ValueError( - len(self.videoid_to_vectoridx), - self.db.ntotal - ) - - if not self.make_direct_maps_done: - self.make_direct_maps() - - if self.vectoridx_to_videoid is None: - self.vectoridx_to_videoid = { - self.videoid_to_vectoridx[videoid]: videoid - for videoid in self.videoid_to_vectoridx - } - assert len(self.vectoridx_to_videoid) \ - == len(self.videoid_to_vectoridx) - - query_hidden_states = [] - vector_ids = [] - for video_id in video_ids: - vector_id = self.videoid_to_vectoridx[video_id] - vector_ids.append(vector_id) - query_hidden_state = self.db.reconstruct(vector_id) - query_hidden_states.append(query_hidden_state) - query_hidden_states = np.stack(query_hidden_states) - - # MultilingualFaissDataset uses the following; not sure the reason. - # faiss.ParameterSpace().set_index_parameter(self.db, "nprobe", 10) - _, index = self.db.search(query_hidden_states, retri_factor) - outputs = [] - for sample_idx, sample in enumerate(index): - # the first video_id is always the video itself. - cands = [video_ids[sample_idx]] - for vector_idx in sample: - if vector_idx >= 0 \ - and vector_ids[sample_idx] != vector_idx: - cands.append( - self.vectoridx_to_videoid[vector_idx] - ) - outputs.append(cands) - return outputs - - -class VectorRetrieverDM(VectorRetriever): - """ - with direct map. - How2 Video Retriver. 
- Reference usage of FAISS: - https://github.com/fairinternal/fairseq-py/blob/paraphrase_pretraining/fairseq/data/multilingual_faiss_dataset.py - """ - - def __init__( - self, - hidden_size, - cent, - db_type, - examples_per_cent_to_train - ): - super().__init__( - hidden_size, cent, db_type, examples_per_cent_to_train) - self.make_direct_maps_done = False - - def make_direct_maps(self): - faiss.downcast_index(self.db).make_direct_map() - self.make_direct_maps_done = True - - def search( - self, - query_hidden_states, - orig_dist, - ): - if len(self.videoid_to_vectoridx) != self.db.ntotal: - raise ValueError( - len(self.videoid_to_vectoridx), - self.db.ntotal - ) - - if not self.make_direct_maps_done: - self.make_direct_maps() - if self.vectoridx_to_videoid is None: - self.vectoridx_to_videoid = { - self.videoid_to_vectoridx[videoid]: videoid - for videoid in self.videoid_to_vectoridx - } - assert len(self.vectoridx_to_videoid) \ - == len(self.videoid_to_vectoridx) - - # MultilingualFaissDataset uses the following; not sure the reason. 
- # faiss.ParameterSpace().set_index_parameter(self.db, "nprobe", 10) - queried_dist, index = self.db.search(query_hidden_states, 1) - outputs = [] - for sample_idx, sample in enumerate(index): - # and queried_dist[sample_idx] < thres \ - if sample >= 0 \ - and queried_dist[sample_idx] < orig_dist[sample_idx]: - outputs.append(self.vectoridx_to_videoid[sample]) - else: - outputs.append(None) - return outputs - - def search_by_video_ids( - self, - video_ids, - retri_factor=8 - ): - if len(self.videoid_to_vectoridx) != self.db.ntotal: - raise ValueError( - len(self.videoid_to_vectoridx), - self.db.ntotal - ) - - if not self.make_direct_maps_done: - self.make_direct_maps() - if self.vectoridx_to_videoid is None: - self.vectoridx_to_videoid = { - self.videoid_to_vectoridx[videoid]: videoid - for videoid in self.videoid_to_vectoridx - } - assert len(self.vectoridx_to_videoid) \ - == len(self.videoid_to_vectoridx) - - query_hidden_states = [] - vector_ids = [] - for video_id in video_ids: - vector_id = self.videoid_to_vectoridx[video_id] - vector_ids.append(vector_id) - query_hidden_state = self.db.reconstruct(vector_id) - query_hidden_states.append(query_hidden_state) - query_hidden_states = np.stack(query_hidden_states) - - # MultilingualFaissDataset uses the following; not sure the reason. - # faiss.ParameterSpace().set_index_parameter(self.db, "nprobe", 10) - _, index = self.db.search(query_hidden_states, retri_factor) - outputs = [] - for sample_idx, sample in enumerate(index): - # the first video_id is always the video itself. - cands = [video_ids[sample_idx]] - for vector_idx in sample: - if vector_idx >= 0 \ - and vector_ids[sample_idx] != vector_idx: - cands.append( - self.vectoridx_to_videoid[vector_idx] - ) - outputs.append(cands) - return outputs - - -class MMVectorRetriever(VectorRetrieverDM): - """ - multimodal vector retriver: - text retrieve video or video retrieve text. 
- """ - - def __init__(self, hidden_size, cent, db_type, examples_per_cent_to_train): - super().__init__( - hidden_size, cent, db_type, examples_per_cent_to_train) - video_db = self.db - super().__init__( - hidden_size, cent, db_type, examples_per_cent_to_train) - text_db = self.db - self.db = {"video": video_db, "text": text_db} - self.video_to_videoid = defaultdict(list) - - def __len__(self): - assert self.db["video"].ntotal == self.db["text"].ntotal - return self.db["video"].ntotal - - def make_direct_maps(self): - faiss.downcast_index(self.db["video"]).make_direct_map() - faiss.downcast_index(self.db["text"]).make_direct_map() - - def save(self, out_dir): - faiss.write_index( - self.db["video"], - os.path.join(out_dir, "video_faiss_idx") - ) - faiss.write_index( - self.db["text"], - os.path.join(out_dir, "text_faiss_idx") - ) - - with open( - os.path.join( - out_dir, "videoid_to_vectoridx.pkl"), - "wb") as fw: - pickle.dump( - self.videoid_to_vectoridx, fw, - protocol=pickle.HIGHEST_PROTOCOL - ) - - def load(self, out_dir): - fn = os.path.join(out_dir, "video_faiss_idx") - video_db = faiss.read_index(fn) - fn = os.path.join(out_dir, "text_faiss_idx") - text_db = faiss.read_index(fn) - self.db = {"video": video_db, "text": text_db} - with open( - os.path.join(out_dir, "videoid_to_vectoridx.pkl"), "rb") as fr: - self.videoid_to_vectoridx = pickle.load(fr) - self.video_to_videoid = defaultdict(list) - - def add(self, hidden_states, video_ids): - """hidden_states is a pair `(video, text)`""" - assert len(hidden_states) == len(video_ids), "{}, {}".format( - str(len(hidden_states)), str(len(video_ids))) - assert len(hidden_states.shape) == 3 - assert len(self.video_to_videoid) == 0 - - valid_idx = [] - for idx, video_id in enumerate(video_ids): - if video_id not in self.videoid_to_vectoridx: - valid_idx.append(idx) - self.videoid_to_vectoridx[video_id] = \ - len(self.videoid_to_vectoridx) - - batch_size = hidden_states.shape[0] - hidden_states = 
hidden_states[valid_idx] - - hidden_states = np.transpose(hidden_states, (1, 0, 2)).copy() - if not self.db["video"].is_trained: - self.train_cache.append(hidden_states) - train_len = batch_size * len(self.train_cache) - if train_len < self.train_thres: - return - - hidden_states = np.concatenate(self.train_cache, axis=1) - del self.train_cache - self.db["video"].train(hidden_states[0, :self.train_thres]) - self.db["text"].train(hidden_states[1, :self.train_thres]) - self.db["video"].add(hidden_states[0]) - self.db["text"].add(hidden_states[1]) - - def get_clips_by_video_id(self, video_id): - if not self.video_to_videoid: - for video_id, video_clip, text_clip in self.videoid_to_vectoridx: - self.video_to_videoid[video_id].append( - (video_id, video_clip, text_clip)) - return self.video_to_videoid[video_id] - - def search( - self, - video_ids, - target_modality, - retri_factor=8 - ): - if len(self.videoid_to_vectoridx) != len(self): - raise ValueError( - len(self.videoid_to_vectoridx), - len(self) - ) - - if not self.make_direct_maps_done: - self.make_direct_maps() - if self.vectoridx_to_videoid is None: - self.vectoridx_to_videoid = { - self.videoid_to_vectoridx[videoid]: videoid - for videoid in self.videoid_to_vectoridx - } - assert len(self.vectoridx_to_videoid) \ - == len(self.videoid_to_vectoridx) - - src_modality = "text" if target_modality == "video" else "video" - - query_hidden_states = [] - vector_ids = [] - for video_id in video_ids: - vector_id = self.videoid_to_vectoridx[video_id] - vector_ids.append(vector_id) - query_hidden_state = self.db[src_modality].reconstruct(vector_id) - query_hidden_states.append(query_hidden_state) - query_hidden_states = np.stack(query_hidden_states) - - # MultilingualFaissDataset uses the following; not sure the reason. 
- # faiss.ParameterSpace().set_index_parameter(self.db, "nprobe", 10) - _, index = self.db[target_modality].search( - query_hidden_states, retri_factor) - outputs = [] - for sample_idx, sample in enumerate(index): - cands = [] - for vector_idx in sample: - if vector_idx >= 0: - cands.append( - self.vectoridx_to_videoid[vector_idx] - ) - outputs.append(cands) - return outputs diff --git a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/tasks/retritask.py b/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/tasks/retritask.py deleted file mode 100644 index b43f20fddb31f29b210b1c726e5f6ccaad04bcf0..0000000000000000000000000000000000000000 --- a/spaces/arxify/RVC-beta-v2-0618/runtime/Lib/site-packages/fairseq/examples/MMPT/mmpt/tasks/retritask.py +++ /dev/null @@ -1,253 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. -import os -import torch -import pickle -import random - -from tqdm import tqdm -from torch.utils.data import DataLoader -from torch.utils.data.distributed import DistributedSampler - -from ..processors import ( - ShardedHow2MetaProcessor, - ShardedVideoProcessor, - ShardedTextProcessor, - VariedLenAligner, -) - -from ..datasets import MMDataset -from .task import Task -from ..modules import vectorpool -from ..evaluators.predictor import Predictor -from ..utils import set_seed, get_local_rank, get_world_size - - -class RetriTask(Task): - """abstract class for task with retrival.""" - - def reshape_subsample(self, sample): - for key in sample: - if torch.is_tensor(sample[key]): - sample[key] = self.flat_subsample(sample[key]) - return sample - - def flat_subsample(self, tensor): - if tensor.size(0) == 1: - tensor = tensor.squeeze(0) - return tensor - - def build_dataloader(self): - """called by `get_batch_iterator` in fairseqmmtask. 
""" - # TODO: hard-code dataloader for retri for now and configurable in .yaml. - # reuse the `train.lst`. - self.config.dataset.split = "train" - meta_processor = ShardedHow2MetaProcessor(self.config.dataset) - video_processor = ShardedVideoProcessor(self.config.dataset) - text_processor = ShardedTextProcessor(self.config.dataset) - - aligner = VariedLenAligner(self.config.dataset) - aligner.subsampling = self.config.dataset.clip_per_video - - self.retri_data = MMDataset( - meta_processor, video_processor, text_processor, aligner - ) - - retri_sampler = DistributedSampler(self.retri_data) - infer_scale = 16 - batch_size = self.config.dataset.num_video_per_batch \ - * infer_scale - - self.retri_dataloader = DataLoader( - self.retri_data, - collate_fn=self.retri_data.collater, - batch_size=batch_size, - shuffle=False, - sampler=retri_sampler, - num_workers=self.config.fairseq.dataset.num_workers - ) - return self.retri_dataloader - - def retrive_candidates(self, epoch, dataloader=None): - if get_local_rank() == 0: - print("running retrieval model.") - out_dir = os.path.join( - self.config.fairseq.checkpoint.save_dir, "retri") - os.makedirs(out_dir, exist_ok=True) - - if not os.path.isfile( - os.path.join( - out_dir, "batched_e" + str(epoch) + "_videos0.pkl") - ): - if dataloader is None: - dataloader = self.retri_dataloader - - self.model.eval() - self.model.is_train = False - - assert self.retri_data.meta_processor.data == \ - self.train_data.meta_processor.data # video_ids not mutated. 
- - self._retri_predict(epoch, dataloader) - - self.model.train() - self.model.is_train = True - - torch.distributed.barrier() - output = self._retri_sync(epoch, out_dir) - torch.distributed.barrier() - self.train_data.meta_processor.set_candidates(output) - return output - - -class VideoRetriTask(RetriTask): - """RetriTask on video level.""" - - def reshape_subsample(self, sample): - if ( - hasattr(self.config.dataset, "clip_per_video") - and self.config.dataset.clip_per_video is not None - and self.config.dataset.clip_per_video > 1 - ): - for key in sample: - if torch.is_tensor(sample[key]): - sample[key] = self.flat_subsample(sample[key]) - return sample - - def flat_subsample(self, tensor): - if tensor.size(0) == 1: - tensor = tensor.squeeze(0) - return Task.flat_subsample(self, tensor) - - def _retri_predict(self, epoch, dataloader): - set_seed(epoch) - # save for retrival. - predictor = VideoPredictor(self.config) - predictor.predict_loop( - self.model, dataloader) - set_seed(epoch) # get the same text clips. - # retrival. - retri_predictor = VideoRetriPredictor( - self.config) - retri_predictor.predict_loop( - self.model, predictor.vecpool.retriver, epoch) - del predictor - del retri_predictor - - def _retri_sync(self, epoch, out_dir): - # gpu do the same merge. 
- batched_videos = [] - for local_rank in range(get_world_size()): - fn = os.path.join( - out_dir, - "batched_e" + str(epoch) + "_videos" + str(local_rank) + ".pkl") - with open(fn, "rb") as fr: - batched_videos.extend(pickle.load(fr)) - print( - "[INFO] batched_videos", - len(batched_videos), len(batched_videos[0])) - return batched_videos - - -class VideoPredictor(Predictor): - def __init__(self, config): - vectorpool_cls = getattr(vectorpool, config.vectorpool_cls) - self.vecpool = vectorpool_cls(config) - - def predict_loop( - self, - model, - dataloader, - early_stop=-1, - ): - with torch.no_grad(): - if get_local_rank() == 0: - dataloader = tqdm(dataloader) - for batch_idx, batch in enumerate(dataloader): - if batch_idx == early_stop: - break - self(batch, model) - return self.finalize() - - def __call__(self, sample, model, **kwargs): - param = next(model.parameters()) - dtype = param.dtype - device = param.device - subsample = sample["vfeats"].size(1) - sample = self.to_ctx(sample, device, dtype) - for key in sample: - if torch.is_tensor(sample[key]): - size = sample[key].size() - if len(size) >= 2: - batch_size = size[0] * size[1] - expanded_size = ( - (batch_size,) + size[2:] if len(size) > 2 - else (batch_size,) - ) - sample[key] = sample[key].view(expanded_size) - - outputs = model(**sample) - sample.update(outputs) - self.vecpool(sample, subsample) - - def finalize(self): - print("[INFO]", self.vecpool) - if not self.vecpool.retriver.db.is_trained: - self.vecpool.retriver.finalize_training() - return self.vecpool.retriver - - -class VideoRetriPredictor(Predictor): - """ - Online Retrieval Predictor for Clips (used by RetriTask). - TODO: merge this with VisPredictor? 
- """ - - def __init__(self, config): - self.pred_dir = os.path.join( - config.fairseq.checkpoint.save_dir, - "retri") - self.num_cands = config.num_cands - self.num_video_per_batch = config.dataset.num_video_per_batch - - def predict_loop( - self, - model, - retriver, - epoch, - early_stop=-1 - ): - # a fake loop that only try to recover video vector - # from video_id. - batched_videos = [] - # obtain available video_ids. - video_ids = list(retriver.videoid_to_vectoridx.keys()) - - dataloader = random.sample( - video_ids, - len(video_ids) // self.num_video_per_batch - ) - - if get_local_rank() == 0: - dataloader = tqdm(dataloader) - for batch_idx, batch in enumerate(dataloader): - # batch is one video id. - if batch_idx == early_stop: - break - video_ids = retriver.search_by_video_ids( - [batch], self.num_cands)[0] - if len(video_ids) > self.num_video_per_batch: - # we moved the center to make cluster robust. - video_ids = random.sample(video_ids, self.num_video_per_batch) - batched_videos.append(video_ids) - return self.finalize(batched_videos, epoch) - - def finalize(self, batched_videos, epoch): - fn = os.path.join( - self.pred_dir, - "batched_e" + str(epoch) + "_videos" + str(get_local_rank()) + ".pkl") - with open(fn, "wb") as fw: - pickle.dump(batched_videos, fw, pickle.HIGHEST_PROTOCOL) - return batched_videos diff --git a/spaces/auto-academic/auto-draft/worker.py b/spaces/auto-academic/auto-draft/worker.py deleted file mode 100644 index 6a2483741ce74c527cd33d6b99765c9e88d2400b..0000000000000000000000000000000000000000 --- a/spaces/auto-academic/auto-draft/worker.py +++ /dev/null @@ -1,173 +0,0 @@ -''' -This script is only used for service-side host. 
-''' -import boto3 -import os, time -from wrapper import generator_wrapper -from sqlalchemy import create_engine, Table, MetaData, update, select -from sqlalchemy.orm import sessionmaker -from sqlalchemy import inspect - -QUEUE_URL = os.getenv('QUEUE_URL') -AWS_ACCESS_KEY_ID = os.getenv('AWS_ACCESS_KEY_ID') -AWS_SECRET_ACCESS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY') -BUCKET_NAME = os.getenv('BUCKET_NAME') -DB_STRING = os.getenv('DATABASE_STRING') - -# Create engine -ENGINE = create_engine(DB_STRING) -SESSION = sessionmaker(bind=ENGINE) - - -####################################################################################################################### -# Amazon SQS Handler -####################################################################################################################### -def get_sqs_client(): - sqs = boto3.client('sqs', region_name="us-east-2", - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY) - return sqs - - -def receive_message(): - sqs = get_sqs_client() - message = sqs.receive_message(QueueUrl=QUEUE_URL) - if message.get('Messages') is not None: - receipt_handle = message['Messages'][0]['ReceiptHandle'] - else: - receipt_handle = None - return message, receipt_handle - - -def delete_message(receipt_handle): - sqs = get_sqs_client() - response = sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=receipt_handle) - return response - - -####################################################################################################################### -# AWS S3 Handler -####################################################################################################################### -def get_s3_client(): - access_key_id = os.getenv('AWS_ACCESS_KEY_ID') - secret_access_key = os.getenv('AWS_SECRET_ACCESS_KEY') - session = boto3.Session( - aws_access_key_id=access_key_id, - aws_secret_access_key=secret_access_key, - ) - s3 = session.resource('s3') - bucket = s3.Bucket(BUCKET_NAME) - return s3, 
bucket - - -def upload_file(file_name, target_name=None): - s3, _ = get_s3_client() - - if target_name is None: - target_name = file_name - s3.meta.client.upload_file(Filename=file_name, Bucket=BUCKET_NAME, Key=target_name) - print(f"The file {file_name} has been uploaded!") - - -def download_file(file_name): - """ Download `file_name` from the bucket. - Bucket (str) – The name of the bucket to download from. - Key (str) – The name of the key to download from. - Filename (str) – The path to the file to download to. - """ - s3, _ = get_s3_client() - s3.meta.client.download_file(Bucket=BUCKET_NAME, Key=file_name, Filename=os.path.basename(file_name)) - print(f"The file {file_name} has been downloaded!") - - -####################################################################################################################### -# AWS SQL Handler -####################################################################################################################### -def modify_status(task_id, new_status): - session = SESSION() - metadata = MetaData() - task_to_update = task_id - task_table = Table('task', metadata, autoload_with=ENGINE) - stmt = select(task_table).where(task_table.c.task_id == task_to_update) - # Execute the statement - with ENGINE.connect() as connection: - result = connection.execute(stmt) - - # Fetch the first result (if exists) - task_data = result.fetchone() - - # If task_data is not None, the task exists and we can update its status - if task_data: - # Update statement - stmt = ( - update(task_table). - where(task_table.c.task_id == task_to_update).
values(status=new_status) - ) - # Execute the statement and commit - result = connection.execute(stmt) - connection.commit() - # Close the session - session.close() - - -####################################################################################################################### -# Pipeline -####################################################################################################################### -def pipeline(message_count=0, query_interval=10): - # status: 0 - pending (default), 1 - running, 2 - completed, 3 - failed - - # Query a message from SQS - msg, handle = receive_message() - if handle is None: - print("No message in SQS. ") - time.sleep(query_interval) - else: - print("===============================================================================================") - print(f"MESSAGE COUNT: {message_count}") - print("===============================================================================================") - config_s3_path = msg['Messages'][0]['Body'] - config_s3_dir = os.path.dirname(config_s3_path) - config_local_path = os.path.basename(config_s3_path) - task_id, _ = os.path.splitext(config_local_path) - - print("Initializing ...") - print("Configuration file on S3: ", config_s3_path) - print("Configuration file on S3 (Directory): ", config_s3_dir) - print("Local file path: ", config_local_path) - print("Task id: ", task_id) - - print(f"Success in receiving message: {msg}") - print(f"Configuration file path: {config_s3_path}") - - # Process the downloaded configuration file - download_file(config_s3_path) - modify_status(task_id, 1) # status: 0 - pending (default), 1 - running, 2 - completed, 3 - failed - delete_message(handle) - print(f"Success in the initialization.
Message deleted.") - - print("Running ...") - # try: - zip_path = generator_wrapper(config_local_path) - # Upload the generated file to S3 - upload_to = os.path.join(config_s3_dir, zip_path).replace("\\", "/") - - print("Local file path (ZIP): ", zip_path) - print("Upload to S3: ", upload_to) - upload_file(zip_path, upload_to) - modify_status(task_id, 2) # status: 0 - pending (default), 1 - running, 2 - completed, 3 - failed, 4 - deleted - print(f"Success in generating the paper.") - - # Complete. - print("Task completed.") - - -def initialize_everything(): - # Clear S3 - - # Clear SQS - pass - - -if __name__ == "__main__": - pipeline() diff --git a/spaces/awacke1/MultiplayerTest1/README.md b/spaces/awacke1/MultiplayerTest1/README.md deleted file mode 100644 index cb961cb0005903f5040dbe11a517b7b935e4a1d2..0000000000000000000000000000000000000000 --- a/spaces/awacke1/MultiplayerTest1/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: MultiplayerTest1 -emoji: 🚀 -colorFrom: green -colorTo: red -sdk: streamlit -sdk_version: 1.25.0 -app_file: app.py -pinned: false -license: mit ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/awacke1/PytorchStreamlitNeuralNetUI/v1.app.py b/spaces/awacke1/PytorchStreamlitNeuralNetUI/v1.app.py deleted file mode 100644 index d4bfb09e31c7f6a076f6d99d796102b8c6f90d35..0000000000000000000000000000000000000000 --- a/spaces/awacke1/PytorchStreamlitNeuralNetUI/v1.app.py +++ /dev/null @@ -1,37 +0,0 @@ -import streamlit as st -import torch -import torch.nn as nn -import torch.optim as optim -import torchvision.transforms as transforms -from PIL import Image -import onnx -from io import BytesIO - -class SimpleNN(nn.Module): - def __init__(self): - super(SimpleNN, self).__init__() - self.fc = nn.Linear(28 * 28, 10) # Assuming 28x28 input and 10 classes - - def forward(self, x): - x = x.view(-1, 28 * 28) - x = self.fc(x) - return x - -st.title("PyTorch Neural Network 
Interface") - -uploaded_file = st.file_uploader("Choose an ONNX model file", type="onnx") - -if uploaded_file: - byte_stream = BytesIO(uploaded_file.getvalue()) - model = onnx.load(byte_stream) - st.write("Model uploaded successfully!") - -if st.button('Download Model as ONNX'): - buffer = BytesIO() - torch.onnx.export(SimpleNN(), torch.randn(1, 28, 28), buffer) - st.download_button( - label="Download ONNX model", - data=buffer, - file_name="model.onnx", - mime="application/octet-stream" - ) diff --git a/spaces/awacke1/Three.JS-TheCube-Game/index.html b/spaces/awacke1/Three.JS-TheCube-Game/index.html deleted file mode 100644 index 6090a2c41a0249a3b9a74a8a2248e1ca82cd1cd1..0000000000000000000000000000000000000000 --- a/spaces/awacke1/Three.JS-TheCube-Game/index.html +++ /dev/null @@ -1,101 +0,0 @@ - - - - - The Cube Game - - - - - - - - diff --git a/spaces/badongtakla/ithaca/ithaca/models/common_layers.py b/spaces/badongtakla/ithaca/ithaca/models/common_layers.py deleted file mode 100644 index 79e2cf707731aa6f999d1e9eff6c2bf8dc456264..0000000000000000000000000000000000000000 --- a/spaces/badongtakla/ithaca/ithaca/models/common_layers.py +++ /dev/null @@ -1,318 +0,0 @@ -# Copyright 2021 the Ithaca Authors -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. -"""Common layers used in models. 
- -This implementation is from the Long Range Arena: -https://github.com/google-research/long-range-arena/tree/main/lra_benchmarks/models/bigbird -""" - -# pylint: disable=attribute-defined-outside-init,g-bare-generic -from typing import Any, Callable, Iterable, Optional - -from flax import linen as nn -from jax import lax -from jax.nn import initializers -import jax.numpy as jnp -import numpy as np - -PRNGKey = Any -Array = Any -Shape = Iterable[int] -Dtype = Any # this could be a real type? - -ACTIVATION_FN_DICT = { - 'relu': nn.relu, - 'gelu': nn.gelu, -} - - -def grid_restack(all_vecs): - """Grid restack for meta-performer. - - Given multiple sequences (lists) of batch x len x dim, - reshape this such that all positions are side by side. - - for example (for illustrative purposes): - - inputs: [1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12] - outputs: [1, 5, 9, 2, 6, 10, 3, 7, 11, 4, 8, 12] - - Args: - all_vecs: list of sequences of batch x len x dim - - Returns: - Array of batch x (length x num_items) x dim. - """ - cat_output = [] - for pos in range(all_vecs[0].shape[1]): - pos_vecs = [x[:, None, pos, :] for x in all_vecs] - cat_output += pos_vecs - x2 = jnp.concatenate(cat_output, 1) - return x2 - - -def shift_right(x): - """Shift the input to the right by padding on axis 1.""" - pad_widths = [(0, 0)] * len(x.shape) - pad_widths[1] = (1, 0) # Padding on axis=1 - padded = jnp.pad( - x, pad_widths, mode='constant', constant_values=x.dtype.type(0)) - return padded[:, :-1] - - -class Embed(nn.Module): - """Embedding Module. - - A parameterized function from integers [0, n) to d-dimensional vectors. - - Attributes: - mode: either 'input' or 'output' -> to share input/output embedding - emb_init: embedding initializer - """ - - mode: str = 'input' - emb_init: Callable = nn.initializers.normal(stddev=1.0) - - @nn.compact - def __call__(self, inputs, num_embeddings, features): - """Applies Embed module. 
- - Args: - inputs: input data - num_embeddings: number of embeddings - features: size of the embedding dimension - - Returns: - output which is the embedded input data - """ - embedding = self.param('embedding', self.emb_init, - (num_embeddings, features)) - if self.mode == 'input': - if inputs.dtype not in [jnp.int32, jnp.int64, jnp.uint32, jnp.uint64]: - raise ValueError('Input type must be an integer or unsigned integer.') - return jnp.take(embedding, inputs, axis=0) - if self.mode == 'output': - return jnp.einsum('bld,vd->blv', inputs, embedding) - - -def sinusoidal_init(max_len=2048, replicate_tf=False): - """1D Sinusoidal Position Embedding Initializer. - - Args: - max_len: maximum possible length for the input - replicate_tf: replicate TF periodic encoding exactly - - Returns: - output: init function returning `(1, max_len, d_feature)` - """ - - def init(key, shape, dtype=np.float32): - """Sinusoidal init.""" - del key, dtype - d_feature = shape[-1] - pe = np.zeros((max_len, d_feature), dtype=np.float32) - position = np.arange(0, max_len)[:, np.newaxis] - if replicate_tf: - half_d_feature = d_feature // 2 - div_term = np.exp( - np.arange(half_d_feature) * -(np.log(10000.0) / (half_d_feature - 1))) - pe[:, :half_d_feature] = np.sin(position * div_term) - pe[:, half_d_feature:] = np.cos(position * div_term) - else: - div_term = np.exp( - np.arange(0, d_feature, 2) * -(np.log(10000.0) / d_feature)) - pe[:, 0::2] = np.sin(position * div_term) - pe[:, 1::2] = np.cos(position * div_term) - pe = pe[np.newaxis, :, :] # [1, max_len, d_feature] - return jnp.array(pe) - - return init - - -class AddPositionEmbs(nn.Module): - """Adds (optionally learned) positional embeddings to the inputs. - - Attributes: - posemb_init: positional embedding initializer, if None, then use a fixed - (non-learned) sinusoidal embedding table. - max_len: maximum possible length for the input.
- replicate_tf: replicate TF periodic encoding exactly - """ - - posemb_init: Optional[Callable] = None - posemb_dim: Optional[int] = None - max_len: int = 512 - combine_type: str = 'concat' - replicate_tf: bool = False - - @nn.compact - def __call__(self, inputs, inputs_positions=None, cache=None): - """Applies AddPositionEmbs module. - - By default this layer uses a fixed sinusoidal embedding table. If a - learned position embedding is desired, pass an initializer to - posemb_init. - - Args: - inputs: input data. - inputs_positions: input position indices for packed sequences. - cache: flax attention cache for fast decoding. - - Returns: - output: `(bs, timesteps, in_dim)` - """ - # inputs.shape is (batch_size, seq_len, emb_dim) - assert inputs.ndim == 3, ('Number of dimensions should be 3,' - ' but it is: %d' % inputs.ndim) - batch_size = inputs.shape[0] - length = inputs.shape[1] - if self.posemb_dim is None or self.combine_type == 'add': - self.posemb_dim = inputs.shape[-1] - pos_emb_shape = (1, self.max_len, self.posemb_dim) - if self.posemb_init is None: - # Use a fixed (non-learned) sinusoidal position embedding. - pos_embedding = sinusoidal_init( - max_len=self.max_len, - replicate_tf=self.replicate_tf, - )(None, pos_emb_shape, None) - else: - pos_embedding = self.param('pos_embedding', self.posemb_init, - pos_emb_shape) - pe = pos_embedding[:, :length, :] - # We abuse the same attention Cache mechanism to run positional embeddings - # in fast predict mode. We could use state variables instead, but this - # simplifies invocation with a single top-level cache context manager. - # We only use the cache's position index for tracking decoding position.
- if cache: - if self.is_initializing(): - cache.store(np.array((4, 1, 1), dtype=np.int32)) - else: - cache_entry = cache.retrieve(None) - i = cache_entry.i - cache.store(cache_entry.replace(i=cache_entry.i + 1)) - _, _, df = pos_embedding.shape - pe = lax.dynamic_slice(pos_embedding, jnp.array((0, i, 0)), - jnp.array((1, 1, df))) - if inputs_positions is None: - # normal unpacked case: - if self.combine_type == 'add': - return inputs + pe - elif self.combine_type == 'concat': - pe_broadcast = np.repeat(pe, batch_size, axis=0) - return lax.concatenate([inputs, pe_broadcast], 2) - else: - raise ValueError('Wrong type value.') - else: - # for packed data we need to use known position indices: - return inputs + jnp.take(pe[0], inputs_positions, axis=0) - - -class MlpBlock(nn.Module): - """Transformer MLP block.""" - - mlp_dim: int - dtype: Any = jnp.float32 - out_dim: Optional[int] = None - out_dropout: bool = True - dropout_rate: float = 0.1 - deterministic: bool = False - kernel_init: Callable = nn.initializers.xavier_uniform() - bias_init: Callable = nn.initializers.normal(stddev=1e-6) - activation_fn: str = 'gelu' - - @nn.compact - def __call__(self, inputs): - """Applies Transformer MlpBlock module.""" - actual_out_dim = inputs.shape[-1] if self.out_dim is None else self.out_dim - x = nn.Dense( - self.mlp_dim, - dtype=self.dtype, - kernel_init=self.kernel_init, - bias_init=self.bias_init)( - inputs) - x = ACTIVATION_FN_DICT[self.activation_fn](x) - x = nn.Dropout(rate=self.dropout_rate)(x, deterministic=self.deterministic) - output = nn.Dense( - actual_out_dim, - dtype=self.dtype, - kernel_init=self.kernel_init, - bias_init=self.bias_init)( - x) - if self.out_dropout: - output = nn.Dropout(rate=self.dropout_rate)( - output, deterministic=self.deterministic) - return output - - -def classifier_head(encoded, num_classes, mlp_dim, pooling_mode='MEAN'): - """Classifier head. - - We put this here just so that all models consistently call the same function. 
- - Args: - encoded: tensor inputs are shape of [bs, len, dim]. - num_classes: int, number of classes - mlp_dim: int, dim of intermediate MLP. - pooling_mode: str, string dictating pooling op {MEAN} - - Returns: - tensor of shape [bs, num_classes] - - """ - if pooling_mode == 'MEAN': - encoded = jnp.mean(encoded, axis=1) - elif pooling_mode == 'SUM': - encoded = jnp.sum(encoded, axis=1) - elif pooling_mode == 'FLATTEN': - encoded = encoded.reshape((encoded.shape[0], -1)) - elif pooling_mode == 'CLS': - encoded = encoded[:, 0] - else: - raise NotImplementedError('Pooling not supported yet.') - encoded = nn.Dense(mlp_dim, name='mlp')(encoded) - encoded = nn.relu(encoded) - encoded = nn.Dense(num_classes, name='logits')(encoded) - return encoded - - -class LayerNorm(nn.Module): - """Layer Norm to replicate tf.contrib.""" - epsilon: Optional[float] = None - dtype: Any = jnp.float32 - use_bias: bool = True - use_scale: bool = True - bias_init: Callable[[PRNGKey, Shape, Dtype], Array] = initializers.zeros - scale_init: Callable[[PRNGKey, Shape, Dtype], Array] = initializers.ones - - @nn.compact - def __call__(self, x): - if self.epsilon is None: - epsilon = 1e-12 if self.dtype != jnp.float16 else 1e-3 - else: - epsilon = self.epsilon - x = jnp.asarray(x, jnp.float32) - features = x.shape[-1] - mean = jnp.mean(x, axis=-1, keepdims=True) - mean2 = jnp.mean(lax.square(x), axis=-1, keepdims=True) - var = mean2 - lax.square(mean) - mul = lax.rsqrt(var + epsilon) - if self.use_scale: - mul = mul * jnp.asarray( - self.param('scale', self.scale_init, (features,)), self.dtype) - y = x * mul - if self.use_bias: - y = y + jnp.asarray( - self.param('bias', self.bias_init, (features,)), self.dtype) - y -= mean * mul - return jnp.asarray(y, self.dtype) diff --git a/spaces/banana-projects/web3d/node_modules/three/examples/js/lines/LineSegments2.js b/spaces/banana-projects/web3d/node_modules/three/examples/js/lines/LineSegments2.js deleted file mode 100644 index 
8f444438a671ff6c1700af21ceb6f207a2eae99a..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/examples/js/lines/LineSegments2.js +++ /dev/null @@ -1,65 +0,0 @@ -/** - * @author WestLangley / http://github.com/WestLangley - * - */ - -THREE.LineSegments2 = function ( geometry, material ) { - - THREE.Mesh.call( this ); - - this.type = 'LineSegments2'; - - this.geometry = geometry !== undefined ? geometry : new THREE.LineSegmentsGeometry(); - this.material = material !== undefined ? material : new THREE.LineMaterial( { color: Math.random() * 0xffffff } ); - -}; - -THREE.LineSegments2.prototype = Object.assign( Object.create( THREE.Mesh.prototype ), { - - constructor: THREE.LineSegments2, - - isLineSegments2: true, - - computeLineDistances: ( function () { // for backwards-compatibility, but could be a method of LineSegmentsGeometry... - - var start = new THREE.Vector3(); - var end = new THREE.Vector3(); - - return function computeLineDistances() { - - var geometry = this.geometry; - - var instanceStart = geometry.attributes.instanceStart; - var instanceEnd = geometry.attributes.instanceEnd; - var lineDistances = new Float32Array( 2 * instanceStart.data.count ); - - for ( var i = 0, j = 0, l = instanceStart.data.count; i < l; i ++, j += 2 ) { - - start.fromBufferAttribute( instanceStart, i ); - end.fromBufferAttribute( instanceEnd, i ); - - lineDistances[ j ] = ( j === 0 ) ?
0 : lineDistances[ j - 1 ]; - lineDistances[ j + 1 ] = lineDistances[ j ] + start.distanceTo( end ); - - } - - var instanceDistanceBuffer = new THREE.InstancedInterleavedBuffer( lineDistances, 2, 1 ); // d0, d1 - - geometry.addAttribute( 'instanceDistanceStart', new THREE.InterleavedBufferAttribute( instanceDistanceBuffer, 1, 0 ) ); // d0 - geometry.addAttribute( 'instanceDistanceEnd', new THREE.InterleavedBufferAttribute( instanceDistanceBuffer, 1, 1 ) ); // d1 - - return this; - - }; - - }() ), - - copy: function ( source ) { - - // todo - - return this; - - } - -} ); diff --git a/spaces/banana-projects/web3d/node_modules/three/src/core/Face3.js b/spaces/banana-projects/web3d/node_modules/three/src/core/Face3.js deleted file mode 100644 index b2d76f6f8f0dafbdeffbca8d190cc591499b82cb..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/src/core/Face3.js +++ /dev/null @@ -1,63 +0,0 @@ -import { Color } from '../math/Color.js'; -import { Vector3 } from '../math/Vector3.js'; - -/** - * @author mrdoob / http://mrdoob.com/ - * @author alteredq / http://alteredqualia.com/ - */ - -function Face3( a, b, c, normal, color, materialIndex ) { - - this.a = a; - this.b = b; - this.c = c; - - this.normal = ( normal && normal.isVector3 ) ? normal : new Vector3(); - this.vertexNormals = Array.isArray( normal ) ? normal : []; - - this.color = ( color && color.isColor ) ? color : new Color(); - this.vertexColors = Array.isArray( color ) ? color : []; - - this.materialIndex = materialIndex !== undefined ? 
materialIndex : 0; - -} - -Object.assign( Face3.prototype, { - - clone: function () { - - return new this.constructor().copy( this ); - - }, - - copy: function ( source ) { - - this.a = source.a; - this.b = source.b; - this.c = source.c; - - this.normal.copy( source.normal ); - this.color.copy( source.color ); - - this.materialIndex = source.materialIndex; - - for ( var i = 0, il = source.vertexNormals.length; i < il; i ++ ) { - - this.vertexNormals[ i ] = source.vertexNormals[ i ].clone(); - - } - - for ( var i = 0, il = source.vertexColors.length; i < il; i ++ ) { - - this.vertexColors[ i ] = source.vertexColors[ i ].clone(); - - } - - return this; - - } - -} ); - - -export { Face3 }; diff --git a/spaces/banana-projects/web3d/node_modules/three/src/extras/Earcut.js b/spaces/banana-projects/web3d/node_modules/three/src/extras/Earcut.js deleted file mode 100644 index 7f7a41266d41a899dac09c6d84a9c1b309022852..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/src/extras/Earcut.js +++ /dev/null @@ -1,810 +0,0 @@ -/** - * @author Mugen87 / https://github.com/Mugen87 - * Port from https://github.com/mapbox/earcut (v2.1.2) - */ - -var Earcut = { - - triangulate: function ( data, holeIndices, dim ) { - - dim = dim || 2; - - var hasHoles = holeIndices && holeIndices.length, - outerLen = hasHoles ? holeIndices[ 0 ] * dim : data.length, - outerNode = linkedList( data, 0, outerLen, dim, true ), - triangles = []; - - if ( ! 
outerNode ) return triangles; - - var minX, minY, maxX, maxY, x, y, invSize; - - if ( hasHoles ) outerNode = eliminateHoles( data, holeIndices, outerNode, dim ); - - // if the shape is not too simple, we'll use z-order curve hash later; calculate polygon bbox - - if ( data.length > 80 * dim ) { - - minX = maxX = data[ 0 ]; - minY = maxY = data[ 1 ]; - - for ( var i = dim; i < outerLen; i += dim ) { - - x = data[ i ]; - y = data[ i + 1 ]; - if ( x < minX ) minX = x; - if ( y < minY ) minY = y; - if ( x > maxX ) maxX = x; - if ( y > maxY ) maxY = y; - - } - - // minX, minY and invSize are later used to transform coords into integers for z-order calculation - - invSize = Math.max( maxX - minX, maxY - minY ); - invSize = invSize !== 0 ? 1 / invSize : 0; - - } - - earcutLinked( outerNode, triangles, dim, minX, minY, invSize ); - - return triangles; - - } - -}; - -// create a circular doubly linked list from polygon points in the specified winding order - -function linkedList( data, start, end, dim, clockwise ) { - - var i, last; - - if ( clockwise === ( signedArea( data, start, end, dim ) > 0 ) ) { - - for ( i = start; i < end; i += dim ) last = insertNode( i, data[ i ], data[ i + 1 ], last ); - - } else { - - for ( i = end - dim; i >= start; i -= dim ) last = insertNode( i, data[ i ], data[ i + 1 ], last ); - - } - - if ( last && equals( last, last.next ) ) { - - removeNode( last ); - last = last.next; - - } - - return last; - -} - -// eliminate colinear or duplicate points - -function filterPoints( start, end ) { - - if ( ! start ) return start; - if ( ! end ) end = start; - - var p = start, again; - - do { - - again = false; - - if ( ! 
p.steiner && ( equals( p, p.next ) || area( p.prev, p, p.next ) === 0 ) ) { - - removeNode( p ); - p = end = p.prev; - if ( p === p.next ) break; - again = true; - - } else { - - p = p.next; - - } - - } while ( again || p !== end ); - - return end; - -} - -// main ear slicing loop which triangulates a polygon (given as a linked list) - -function earcutLinked( ear, triangles, dim, minX, minY, invSize, pass ) { - - if ( ! ear ) return; - - // interlink polygon nodes in z-order - - if ( ! pass && invSize ) indexCurve( ear, minX, minY, invSize ); - - var stop = ear, prev, next; - - // iterate through ears, slicing them one by one - - while ( ear.prev !== ear.next ) { - - prev = ear.prev; - next = ear.next; - - if ( invSize ? isEarHashed( ear, minX, minY, invSize ) : isEar( ear ) ) { - - // cut off the triangle - triangles.push( prev.i / dim ); - triangles.push( ear.i / dim ); - triangles.push( next.i / dim ); - - removeNode( ear ); - - // skipping the next vertex leads to fewer sliver triangles - ear = next.next; - stop = next.next; - - continue; - - } - - ear = next; - - // if we looped through the whole remaining polygon and can't find any more ears - - if ( ear === stop ) { - - // try filtering points and slicing again - - if ( !
pass ) { - - earcutLinked( filterPoints( ear ), triangles, dim, minX, minY, invSize, 1 ); - - // if this didn't work, try curing all small self-intersections locally - - } else if ( pass === 1 ) { - - ear = cureLocalIntersections( ear, triangles, dim ); - earcutLinked( ear, triangles, dim, minX, minY, invSize, 2 ); - - // as a last resort, try splitting the remaining polygon into two - - } else if ( pass === 2 ) { - - splitEarcut( ear, triangles, dim, minX, minY, invSize ); - - } - - break; - - } - - } - -} - -// check whether a polygon node forms a valid ear with adjacent nodes - -function isEar( ear ) { - - var a = ear.prev, - b = ear, - c = ear.next; - - if ( area( a, b, c ) >= 0 ) return false; // reflex, can't be an ear - - // now make sure we don't have other points inside the potential ear - var p = ear.next.next; - - while ( p !== ear.prev ) { - - if ( pointInTriangle( a.x, a.y, b.x, b.y, c.x, c.y, p.x, p.y ) && area( p.prev, p, p.next ) >= 0 ) { - - return false; - - } - - p = p.next; - - } - - return true; - -} - -function isEarHashed( ear, minX, minY, invSize ) { - - var a = ear.prev, - b = ear, - c = ear.next; - - if ( area( a, b, c ) >= 0 ) return false; // reflex, can't be an ear - - // triangle bbox; min & max are calculated like this for speed - - var minTX = a.x < b.x ? ( a.x < c.x ? a.x : c.x ) : ( b.x < c.x ? b.x : c.x ), - minTY = a.y < b.y ? ( a.y < c.y ? a.y : c.y ) : ( b.y < c.y ? b.y : c.y ), - maxTX = a.x > b.x ? ( a.x > c.x ? a.x : c.x ) : ( b.x > c.x ? b.x : c.x ), - maxTY = a.y > b.y ? ( a.y > c.y ? a.y : c.y ) : ( b.y > c.y ? 
b.y : c.y ); - - // z-order range for the current triangle bbox; - - var minZ = zOrder( minTX, minTY, minX, minY, invSize ), - maxZ = zOrder( maxTX, maxTY, minX, minY, invSize ); - - // first look for points inside the triangle in increasing z-order - - var p = ear.nextZ; - - while ( p && p.z <= maxZ ) { - - if ( p !== ear.prev && p !== ear.next && - pointInTriangle( a.x, a.y, b.x, b.y, c.x, c.y, p.x, p.y ) && - area( p.prev, p, p.next ) >= 0 ) return false; - p = p.nextZ; - - } - - // then look for points in decreasing z-order - - p = ear.prevZ; - - while ( p && p.z >= minZ ) { - - if ( p !== ear.prev && p !== ear.next && - pointInTriangle( a.x, a.y, b.x, b.y, c.x, c.y, p.x, p.y ) && - area( p.prev, p, p.next ) >= 0 ) return false; - - p = p.prevZ; - - } - - return true; - -} - -// go through all polygon nodes and cure small local self-intersections - -function cureLocalIntersections( start, triangles, dim ) { - - var p = start; - - do { - - var a = p.prev, b = p.next.next; - - if ( ! equals( a, b ) && intersects( a, p, p.next, b ) && locallyInside( a, b ) && locallyInside( b, a ) ) { - - triangles.push( a.i / dim ); - triangles.push( p.i / dim ); - triangles.push( b.i / dim ); - - // remove two nodes involved - - removeNode( p ); - removeNode( p.next ); - - p = start = b; - - } - - p = p.next; - - } while ( p !== start ); - - return p; - -} - -// try splitting polygon into two and triangulate them independently - -function splitEarcut( start, triangles, dim, minX, minY, invSize ) { - - // look for a valid diagonal that divides the polygon into two - - var a = start; - - do { - - var b = a.next.next; - - while ( b !== a.prev ) { - - if ( a.i !== b.i && isValidDiagonal( a, b ) ) { - - // split the polygon in two by the diagonal - - var c = splitPolygon( a, b ); - - // filter colinear points around the cuts - - a = filterPoints( a, a.next ); - c = filterPoints( c, c.next ); - - // run earcut on each half - - earcutLinked( a, triangles, dim, minX, minY, invSize ); - 
earcutLinked( c, triangles, dim, minX, minY, invSize ); - return; - - } - - b = b.next; - - } - - a = a.next; - - } while ( a !== start ); - -} - -// link every hole into the outer loop, producing a single-ring polygon without holes - -function eliminateHoles( data, holeIndices, outerNode, dim ) { - - var queue = [], i, len, start, end, list; - - for ( i = 0, len = holeIndices.length; i < len; i ++ ) { - - start = holeIndices[ i ] * dim; - end = i < len - 1 ? holeIndices[ i + 1 ] * dim : data.length; - list = linkedList( data, start, end, dim, false ); - if ( list === list.next ) list.steiner = true; - queue.push( getLeftmost( list ) ); - - } - - queue.sort( compareX ); - - // process holes from left to right - - for ( i = 0; i < queue.length; i ++ ) { - - eliminateHole( queue[ i ], outerNode ); - outerNode = filterPoints( outerNode, outerNode.next ); - - } - - return outerNode; - -} - -function compareX( a, b ) { - - return a.x - b.x; - -} - -// find a bridge between vertices that connects a hole with the outer ring and links it - -function eliminateHole( hole, outerNode ) { - - outerNode = findHoleBridge( hole, outerNode ); - - if ( outerNode ) { - - var b = splitPolygon( outerNode, hole ); - - filterPoints( b, b.next ); - - } - -} - -// David Eberly's algorithm for finding a bridge between hole and outer polygon - -function findHoleBridge( hole, outerNode ) { - - var p = outerNode, - hx = hole.x, - hy = hole.y, - qx = - Infinity, - m; - - // find a segment intersected by a ray from the hole's leftmost point to the left; - // segment's endpoint with lesser x will be potential connection point - - do { - - if ( hy <= p.y && hy >= p.next.y && p.next.y !== p.y ) { - - var x = p.x + ( hy - p.y ) * ( p.next.x - p.x ) / ( p.next.y - p.y ); - - if ( x <= hx && x > qx ) { - - qx = x; - - if ( x === hx ) { - - if ( hy === p.y ) return p; - if ( hy === p.next.y ) return p.next; - - } - - m = p.x < p.next.x ?
p : p.next; - - } - - } - - p = p.next; - - } while ( p !== outerNode ); - - if ( ! m ) return null; - - if ( hx === qx ) return m.prev; // hole touches outer segment; pick lower endpoint - - // look for points inside the triangle of hole point, segment intersection and endpoint; - // if there are no points found, we have a valid connection; - // otherwise choose the point of the minimum angle with the ray as connection point - - var stop = m, - mx = m.x, - my = m.y, - tanMin = Infinity, - tan; - - p = m.next; - - while ( p !== stop ) { - - if ( hx >= p.x && p.x >= mx && hx !== p.x && - pointInTriangle( hy < my ? hx : qx, hy, mx, my, hy < my ? qx : hx, hy, p.x, p.y ) ) { - - tan = Math.abs( hy - p.y ) / ( hx - p.x ); // tangential - - if ( ( tan < tanMin || ( tan === tanMin && p.x > m.x ) ) && locallyInside( p, hole ) ) { - - m = p; - tanMin = tan; - - } - - } - - p = p.next; - - } - - return m; - -} - -// interlink polygon nodes in z-order - -function indexCurve( start, minX, minY, invSize ) { - - var p = start; - - do { - - if ( p.z === null ) p.z = zOrder( p.x, p.y, minX, minY, invSize ); - p.prevZ = p.prev; - p.nextZ = p.next; - p = p.next; - - } while ( p !== start ); - - p.prevZ.nextZ = null; - p.prevZ = null; - - sortLinked( p ); - -} - -// Simon Tatham's linked list merge sort algorithm -// http://www.chiark.greenend.org.uk/~sgtatham/algorithms/listsort.html - -function sortLinked( list ) { - - var i, p, q, e, tail, numMerges, pSize, qSize, inSize = 1; - - do { - - p = list; - list = null; - tail = null; - numMerges = 0; - - while ( p ) { - - numMerges ++; - q = p; - pSize = 0; - - for ( i = 0; i < inSize; i ++ ) { - - pSize ++; - q = q.nextZ; - if ( ! q ) break; - - } - - qSize = inSize; - - while ( pSize > 0 || ( qSize > 0 && q ) ) { - - if ( pSize !== 0 && ( qSize === 0 || ! 
q || p.z <= q.z ) ) { - - e = p; - p = p.nextZ; - pSize --; - - } else { - - e = q; - q = q.nextZ; - qSize --; - - } - - if ( tail ) tail.nextZ = e; - else list = e; - - e.prevZ = tail; - tail = e; - - } - - p = q; - - } - - tail.nextZ = null; - inSize *= 2; - - } while ( numMerges > 1 ); - - return list; - -} - -// z-order of a point given coords and inverse of the longer side of data bbox - -function zOrder( x, y, minX, minY, invSize ) { - - // coords are transformed into non-negative 15-bit integer range - - x = 32767 * ( x - minX ) * invSize; - y = 32767 * ( y - minY ) * invSize; - - x = ( x | ( x << 8 ) ) & 0x00FF00FF; - x = ( x | ( x << 4 ) ) & 0x0F0F0F0F; - x = ( x | ( x << 2 ) ) & 0x33333333; - x = ( x | ( x << 1 ) ) & 0x55555555; - - y = ( y | ( y << 8 ) ) & 0x00FF00FF; - y = ( y | ( y << 4 ) ) & 0x0F0F0F0F; - y = ( y | ( y << 2 ) ) & 0x33333333; - y = ( y | ( y << 1 ) ) & 0x55555555; - - return x | ( y << 1 ); - -} - -// find the leftmost node of a polygon ring - -function getLeftmost( start ) { - - var p = start, leftmost = start; - - do { - - if ( p.x < leftmost.x ) leftmost = p; - p = p.next; - - } while ( p !== start ); - - return leftmost; - -} - -// check if a point lies within a convex triangle - -function pointInTriangle( ax, ay, bx, by, cx, cy, px, py ) { - - return ( cx - px ) * ( ay - py ) - ( ax - px ) * ( cy - py ) >= 0 && - ( ax - px ) * ( by - py ) - ( bx - px ) * ( ay - py ) >= 0 && - ( bx - px ) * ( cy - py ) - ( cx - px ) * ( by - py ) >= 0; - -} - -// check if a diagonal between two polygon nodes is valid (lies in polygon interior) - -function isValidDiagonal( a, b ) { - - return a.next.i !== b.i && a.prev.i !== b.i && ! 
intersectsPolygon( a, b ) && - locallyInside( a, b ) && locallyInside( b, a ) && middleInside( a, b ); - -} - -// signed area of a triangle - -function area( p, q, r ) { - - return ( q.y - p.y ) * ( r.x - q.x ) - ( q.x - p.x ) * ( r.y - q.y ); - -} - -// check if two points are equal - -function equals( p1, p2 ) { - - return p1.x === p2.x && p1.y === p2.y; - -} - -// check if two segments intersect - -function intersects( p1, q1, p2, q2 ) { - - if ( ( equals( p1, q1 ) && equals( p2, q2 ) ) || - ( equals( p1, q2 ) && equals( p2, q1 ) ) ) return true; - - return area( p1, q1, p2 ) > 0 !== area( p1, q1, q2 ) > 0 && - area( p2, q2, p1 ) > 0 !== area( p2, q2, q1 ) > 0; - -} - -// check if a polygon diagonal intersects any polygon segments - -function intersectsPolygon( a, b ) { - - var p = a; - - do { - - if ( p.i !== a.i && p.next.i !== a.i && p.i !== b.i && p.next.i !== b.i && - intersects( p, p.next, a, b ) ) { - - return true; - - } - - p = p.next; - - } while ( p !== a ); - - return false; - -} - -// check if a polygon diagonal is locally inside the polygon - -function locallyInside( a, b ) { - - return area( a.prev, a, a.next ) < 0 ? - area( a, b, a.next ) >= 0 && area( a, a.prev, b ) >= 0 : - area( a, b, a.prev ) < 0 || area( a, a.next, b ) < 0; - -} - -// check if the middle point of a polygon diagonal is inside the polygon - -function middleInside( a, b ) { - - var p = a, - inside = false, - px = ( a.x + b.x ) / 2, - py = ( a.y + b.y ) / 2; - - do { - - if ( ( ( p.y > py ) !== ( p.next.y > py ) ) && p.next.y !== p.y && - ( px < ( p.next.x - p.x ) * ( py - p.y ) / ( p.next.y - p.y ) + p.x ) ) { - - inside = ! 
inside; - - } - - p = p.next; - - } while ( p !== a ); - - return inside; - -} - -// link two polygon vertices with a bridge; if the vertices belong to the same ring, it splits polygon into two; -// if one belongs to the outer ring and another to a hole, it merges it into a single ring - -function splitPolygon( a, b ) { - - var a2 = new Node( a.i, a.x, a.y ), - b2 = new Node( b.i, b.x, b.y ), - an = a.next, - bp = b.prev; - - a.next = b; - b.prev = a; - - a2.next = an; - an.prev = a2; - - b2.next = a2; - a2.prev = b2; - - bp.next = b2; - b2.prev = bp; - - return b2; - -} - -// create a node and optionally link it with previous one (in a circular doubly linked list) - -function insertNode( i, x, y, last ) { - - var p = new Node( i, x, y ); - - if ( ! last ) { - - p.prev = p; - p.next = p; - - } else { - - p.next = last.next; - p.prev = last; - last.next.prev = p; - last.next = p; - - } - - return p; - -} - -function removeNode( p ) { - - p.next.prev = p.prev; - p.prev.next = p.next; - - if ( p.prevZ ) p.prevZ.nextZ = p.nextZ; - if ( p.nextZ ) p.nextZ.prevZ = p.prevZ; - -} - -function Node( i, x, y ) { - - // vertice index in coordinates array - this.i = i; - - // vertex coordinates - this.x = x; - this.y = y; - - // previous and next vertice nodes in a polygon ring - this.prev = null; - this.next = null; - - // z-order curve value - this.z = null; - - // previous and next nodes in z-order - this.prevZ = null; - this.nextZ = null; - - // indicates whether this is a steiner point - this.steiner = false; - -} - -function signedArea( data, start, end, dim ) { - - var sum = 0; - - for ( var i = start, j = end - dim; i < end; i += dim ) { - - sum += ( data[ j ] - data[ i ] ) * ( data[ i + 1 ] + data[ j + 1 ] ); - j = i; - - } - - return sum; - -} - -export { Earcut }; diff --git a/spaces/banana-projects/web3d/node_modules/three/src/loaders/TextureLoader.js b/spaces/banana-projects/web3d/node_modules/three/src/loaders/TextureLoader.js deleted file mode 100644 index 
4607655ac4726de0c57f51600b2cbf102298c6f9..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/src/loaders/TextureLoader.js +++ /dev/null @@ -1,68 +0,0 @@ -/** - * @author mrdoob / http://mrdoob.com/ - */ - -import { RGBAFormat, RGBFormat } from '../constants.js'; -import { ImageLoader } from './ImageLoader.js'; -import { Texture } from '../textures/Texture.js'; -import { DefaultLoadingManager } from './LoadingManager.js'; - - -function TextureLoader( manager ) { - - this.manager = ( manager !== undefined ) ? manager : DefaultLoadingManager; - -} - -Object.assign( TextureLoader.prototype, { - - crossOrigin: 'anonymous', - - load: function ( url, onLoad, onProgress, onError ) { - - var texture = new Texture(); - - var loader = new ImageLoader( this.manager ); - loader.setCrossOrigin( this.crossOrigin ); - loader.setPath( this.path ); - - loader.load( url, function ( image ) { - - texture.image = image; - - // JPEGs can't have an alpha channel, so memory can be saved by storing them as RGB. - var isJPEG = url.search( /\.jpe?g($|\?)/i ) > 0 || url.search( /^data\:image\/jpeg/ ) === 0; - - texture.format = isJPEG ? 
RGBFormat : RGBAFormat; - texture.needsUpdate = true; - - if ( onLoad !== undefined ) { - - onLoad( texture ); - - } - - }, onProgress, onError ); - - return texture; - - }, - - setCrossOrigin: function ( value ) { - - this.crossOrigin = value; - return this; - - }, - - setPath: function ( value ) { - - this.path = value; - return this; - - } - -} ); - - -export { TextureLoader }; diff --git a/spaces/bankholdup/stylegan_petbreeder/e4e/models/stylegan2/model.py b/spaces/bankholdup/stylegan_petbreeder/e4e/models/stylegan2/model.py deleted file mode 100644 index fcb12af85669ab6fd7f79cb14ddbdf80b2fbd83d..0000000000000000000000000000000000000000 --- a/spaces/bankholdup/stylegan_petbreeder/e4e/models/stylegan2/model.py +++ /dev/null @@ -1,678 +0,0 @@ -import math -import random -import torch -from torch import nn -from torch.nn import functional as F - -if torch.cuda.is_available(): - from op.fused_act import FusedLeakyReLU, fused_leaky_relu - from op.upfirdn2d import upfirdn2d -else: - from op.fused_act_cpu import FusedLeakyReLU, fused_leaky_relu - from op.upfirdn2d_cpu import upfirdn2d - - -class PixelNorm(nn.Module): - def __init__(self): - super().__init__() - - def forward(self, input): - return input * torch.rsqrt(torch.mean(input ** 2, dim=1, keepdim=True) + 1e-8) - - -def make_kernel(k): - k = torch.tensor(k, dtype=torch.float32) - - if k.ndim == 1: - k = k[None, :] * k[:, None] - - k /= k.sum() - - return k - - -class Upsample(nn.Module): - def __init__(self, kernel, factor=2): - super().__init__() - - self.factor = factor - kernel = make_kernel(kernel) * (factor ** 2) - self.register_buffer('kernel', kernel) - - p = kernel.shape[0] - factor - - pad0 = (p + 1) // 2 + factor - 1 - pad1 = p // 2 - - self.pad = (pad0, pad1) - - def forward(self, input): - out = upfirdn2d(input, self.kernel, up=self.factor, down=1, pad=self.pad) - - return out - - -class Downsample(nn.Module): - def __init__(self, kernel, factor=2): - super().__init__() - - self.factor = factor - 
kernel = make_kernel(kernel) - self.register_buffer('kernel', kernel) - - p = kernel.shape[0] - factor - - pad0 = (p + 1) // 2 - pad1 = p // 2 - - self.pad = (pad0, pad1) - - def forward(self, input): - out = upfirdn2d(input, self.kernel, up=1, down=self.factor, pad=self.pad) - - return out - - -class Blur(nn.Module): - def __init__(self, kernel, pad, upsample_factor=1): - super().__init__() - - kernel = make_kernel(kernel) - - if upsample_factor > 1: - kernel = kernel * (upsample_factor ** 2) - - self.register_buffer('kernel', kernel) - - self.pad = pad - - def forward(self, input): - out = upfirdn2d(input, self.kernel, pad=self.pad) - - return out - - -class EqualConv2d(nn.Module): - def __init__( - self, in_channel, out_channel, kernel_size, stride=1, padding=0, bias=True - ): - super().__init__() - - self.weight = nn.Parameter( - torch.randn(out_channel, in_channel, kernel_size, kernel_size) - ) - self.scale = 1 / math.sqrt(in_channel * kernel_size ** 2) - - self.stride = stride - self.padding = padding - - if bias: - self.bias = nn.Parameter(torch.zeros(out_channel)) - - else: - self.bias = None - - def forward(self, input): - out = F.conv2d( - input, - self.weight * self.scale, - bias=self.bias, - stride=self.stride, - padding=self.padding, - ) - - return out - - def __repr__(self): - return ( - f'{self.__class__.__name__}({self.weight.shape[1]}, {self.weight.shape[0]},' - f' {self.weight.shape[2]}, stride={self.stride}, padding={self.padding})' - ) - - -class EqualLinear(nn.Module): - def __init__( - self, in_dim, out_dim, bias=True, bias_init=0, lr_mul=1, activation=None - ): - super().__init__() - - self.weight = nn.Parameter(torch.randn(out_dim, in_dim).div_(lr_mul)) - - if bias: - self.bias = nn.Parameter(torch.zeros(out_dim).fill_(bias_init)) - - else: - self.bias = None - - self.activation = activation - - self.scale = (1 / math.sqrt(in_dim)) * lr_mul - self.lr_mul = lr_mul - - def forward(self, input): - if self.activation: - out = F.linear(input, 
self.weight * self.scale) - out = fused_leaky_relu(out, self.bias * self.lr_mul) - - else: - out = F.linear( - input, self.weight * self.scale, bias=self.bias * self.lr_mul - ) - - return out - - def __repr__(self): - return ( - f'{self.__class__.__name__}({self.weight.shape[1]}, {self.weight.shape[0]})' - ) - - -class ScaledLeakyReLU(nn.Module): - def __init__(self, negative_slope=0.2): - super().__init__() - - self.negative_slope = negative_slope - - def forward(self, input): - out = F.leaky_relu(input, negative_slope=self.negative_slope) - - return out * math.sqrt(2) - - -class ModulatedConv2d(nn.Module): - def __init__( - self, - in_channel, - out_channel, - kernel_size, - style_dim, - demodulate=True, - upsample=False, - downsample=False, - blur_kernel=[1, 3, 3, 1], - ): - super().__init__() - - self.eps = 1e-8 - self.kernel_size = kernel_size - self.in_channel = in_channel - self.out_channel = out_channel - self.upsample = upsample - self.downsample = downsample - - if upsample: - factor = 2 - p = (len(blur_kernel) - factor) - (kernel_size - 1) - pad0 = (p + 1) // 2 + factor - 1 - pad1 = p // 2 + 1 - - self.blur = Blur(blur_kernel, pad=(pad0, pad1), upsample_factor=factor) - - if downsample: - factor = 2 - p = (len(blur_kernel) - factor) + (kernel_size - 1) - pad0 = (p + 1) // 2 - pad1 = p // 2 - - self.blur = Blur(blur_kernel, pad=(pad0, pad1)) - - fan_in = in_channel * kernel_size ** 2 - self.scale = 1 / math.sqrt(fan_in) - self.padding = kernel_size // 2 - - self.weight = nn.Parameter( - torch.randn(1, out_channel, in_channel, kernel_size, kernel_size) - ) - - self.modulation = EqualLinear(style_dim, in_channel, bias_init=1) - - self.demodulate = demodulate - - def __repr__(self): - return ( - f'{self.__class__.__name__}({self.in_channel}, {self.out_channel}, {self.kernel_size}, ' - f'upsample={self.upsample}, downsample={self.downsample})' - ) - - def forward(self, input, style): - batch, in_channel, height, width = input.shape - - style = 
self.modulation(style).view(batch, 1, in_channel, 1, 1) - weight = self.scale * self.weight * style - - if self.demodulate: - demod = torch.rsqrt(weight.pow(2).sum([2, 3, 4]) + 1e-8) - weight = weight * demod.view(batch, self.out_channel, 1, 1, 1) - - weight = weight.view( - batch * self.out_channel, in_channel, self.kernel_size, self.kernel_size - ) - - if self.upsample: - input = input.view(1, batch * in_channel, height, width) - weight = weight.view( - batch, self.out_channel, in_channel, self.kernel_size, self.kernel_size - ) - weight = weight.transpose(1, 2).reshape( - batch * in_channel, self.out_channel, self.kernel_size, self.kernel_size - ) - out = F.conv_transpose2d(input, weight, padding=0, stride=2, groups=batch) - _, _, height, width = out.shape - out = out.view(batch, self.out_channel, height, width) - out = self.blur(out) - - elif self.downsample: - input = self.blur(input) - _, _, height, width = input.shape - input = input.view(1, batch * in_channel, height, width) - out = F.conv2d(input, weight, padding=0, stride=2, groups=batch) - _, _, height, width = out.shape - out = out.view(batch, self.out_channel, height, width) - - else: - input = input.view(1, batch * in_channel, height, width) - out = F.conv2d(input, weight, padding=self.padding, groups=batch) - _, _, height, width = out.shape - out = out.view(batch, self.out_channel, height, width) - - return out - - -class NoiseInjection(nn.Module): - def __init__(self): - super().__init__() - - self.weight = nn.Parameter(torch.zeros(1)) - - def forward(self, image, noise=None): - if noise is None: - batch, _, height, width = image.shape - noise = image.new_empty(batch, 1, height, width).normal_() - - return image + self.weight * noise - - -class ConstantInput(nn.Module): - def __init__(self, channel, size=4): - super().__init__() - - self.input = nn.Parameter(torch.randn(1, channel, size, size)) - - def forward(self, input): - batch = input.shape[0] - out = self.input.repeat(batch, 1, 1, 1) - - return 
out - - -class StyledConv(nn.Module): - def __init__( - self, - in_channel, - out_channel, - kernel_size, - style_dim, - upsample=False, - blur_kernel=[1, 3, 3, 1], - demodulate=True, - ): - super().__init__() - - self.conv = ModulatedConv2d( - in_channel, - out_channel, - kernel_size, - style_dim, - upsample=upsample, - blur_kernel=blur_kernel, - demodulate=demodulate, - ) - - self.noise = NoiseInjection() - # self.bias = nn.Parameter(torch.zeros(1, out_channel, 1, 1)) - # self.activate = ScaledLeakyReLU(0.2) - self.activate = FusedLeakyReLU(out_channel) - - def forward(self, input, style, noise=None): - out = self.conv(input, style) - out = self.noise(out, noise=noise) - # out = out + self.bias - out = self.activate(out) - - return out - - -class ToRGB(nn.Module): - def __init__(self, in_channel, style_dim, upsample=True, blur_kernel=[1, 3, 3, 1]): - super().__init__() - - if upsample: - self.upsample = Upsample(blur_kernel) - - self.conv = ModulatedConv2d(in_channel, 3, 1, style_dim, demodulate=False) - self.bias = nn.Parameter(torch.zeros(1, 3, 1, 1)) - - def forward(self, input, style, skip=None): - out = self.conv(input, style) - out = out + self.bias - - if skip is not None: - skip = self.upsample(skip) - - out = out + skip - - return out - - -class Generator(nn.Module): - def __init__( - self, - size, - style_dim, - n_mlp, - channel_multiplier=2, - blur_kernel=[1, 3, 3, 1], - lr_mlp=0.01, - ): - super().__init__() - - self.size = size - - self.style_dim = style_dim - - layers = [PixelNorm()] - - for i in range(n_mlp): - layers.append( - EqualLinear( - style_dim, style_dim, lr_mul=lr_mlp, activation='fused_lrelu' - ) - ) - - self.style = nn.Sequential(*layers) - - self.channels = { - 4: 512, - 8: 512, - 16: 512, - 32: 512, - 64: 256 * channel_multiplier, - 128: 128 * channel_multiplier, - 256: 64 * channel_multiplier, - 512: 32 * channel_multiplier, - 1024: 16 * channel_multiplier, - } - - self.input = ConstantInput(self.channels[4]) - self.conv1 = 
StyledConv( - self.channels[4], self.channels[4], 3, style_dim, blur_kernel=blur_kernel - ) - self.to_rgb1 = ToRGB(self.channels[4], style_dim, upsample=False) - - self.log_size = int(math.log(size, 2)) - self.num_layers = (self.log_size - 2) * 2 + 1 - - self.convs = nn.ModuleList() - self.upsamples = nn.ModuleList() - self.to_rgbs = nn.ModuleList() - self.noises = nn.Module() - - in_channel = self.channels[4] - - for layer_idx in range(self.num_layers): - res = (layer_idx + 5) // 2 - shape = [1, 1, 2 ** res, 2 ** res] - self.noises.register_buffer(f'noise_{layer_idx}', torch.randn(*shape)) - - for i in range(3, self.log_size + 1): - out_channel = self.channels[2 ** i] - - self.convs.append( - StyledConv( - in_channel, - out_channel, - 3, - style_dim, - upsample=True, - blur_kernel=blur_kernel, - ) - ) - - self.convs.append( - StyledConv( - out_channel, out_channel, 3, style_dim, blur_kernel=blur_kernel - ) - ) - - self.to_rgbs.append(ToRGB(out_channel, style_dim)) - - in_channel = out_channel - - self.n_latent = self.log_size * 2 - 2 - - def make_noise(self): - device = self.input.input.device - - noises = [torch.randn(1, 1, 2 ** 2, 2 ** 2, device=device)] - - for i in range(3, self.log_size + 1): - for _ in range(2): - noises.append(torch.randn(1, 1, 2 ** i, 2 ** i, device=device)) - - return noises - - def mean_latent(self, n_latent): - latent_in = torch.randn( - n_latent, self.style_dim, device=self.input.input.device - ) - latent = self.style(latent_in).mean(0, keepdim=True) - - return latent - - def get_latent(self, input): - return self.style(input) - - def forward( - self, - styles, - return_latents=False, - return_features=False, - inject_index=None, - truncation=1, - truncation_latent=None, - input_is_latent=False, - noise=None, - randomize_noise=True, - ): - if not input_is_latent: - styles = [self.style(s) for s in styles] - - if noise is None: - if randomize_noise: - noise = [None] * self.num_layers - else: - noise = [ - getattr(self.noises, 
f'noise_{i}') for i in range(self.num_layers) - ] - - if truncation < 1: - style_t = [] - - for style in styles: - style_t.append( - truncation_latent + truncation * (style - truncation_latent) - ) - - styles = style_t - - if len(styles) < 2: - inject_index = self.n_latent - - if styles[0].ndim < 3: - latent = styles[0].unsqueeze(1).repeat(1, inject_index, 1) - else: - latent = styles[0] - - else: - if inject_index is None: - inject_index = random.randint(1, self.n_latent - 1) - - latent = styles[0].unsqueeze(1).repeat(1, inject_index, 1) - latent2 = styles[1].unsqueeze(1).repeat(1, self.n_latent - inject_index, 1) - - latent = torch.cat([latent, latent2], 1) - - out = self.input(latent) - out = self.conv1(out, latent[:, 0], noise=noise[0]) - - skip = self.to_rgb1(out, latent[:, 1]) - - i = 1 - for conv1, conv2, noise1, noise2, to_rgb in zip( - self.convs[::2], self.convs[1::2], noise[1::2], noise[2::2], self.to_rgbs - ): - out = conv1(out, latent[:, i], noise=noise1) - out = conv2(out, latent[:, i + 1], noise=noise2) - skip = to_rgb(out, latent[:, i + 2], skip) - - i += 2 - - image = skip - - if return_latents: - return image, latent - elif return_features: - return image, out - else: - return image, None - - -class ConvLayer(nn.Sequential): - def __init__( - self, - in_channel, - out_channel, - kernel_size, - downsample=False, - blur_kernel=[1, 3, 3, 1], - bias=True, - activate=True, - ): - layers = [] - - if downsample: - factor = 2 - p = (len(blur_kernel) - factor) + (kernel_size - 1) - pad0 = (p + 1) // 2 - pad1 = p // 2 - - layers.append(Blur(blur_kernel, pad=(pad0, pad1))) - - stride = 2 - self.padding = 0 - - else: - stride = 1 - self.padding = kernel_size // 2 - - layers.append( - EqualConv2d( - in_channel, - out_channel, - kernel_size, - padding=self.padding, - stride=stride, - bias=bias and not activate, - ) - ) - - if activate: - if bias: - layers.append(FusedLeakyReLU(out_channel)) - - else: - layers.append(ScaledLeakyReLU(0.2)) - - 
super().__init__(*layers) - - -class ResBlock(nn.Module): - def __init__(self, in_channel, out_channel, blur_kernel=[1, 3, 3, 1]): - super().__init__() - - self.conv1 = ConvLayer(in_channel, in_channel, 3) - self.conv2 = ConvLayer(in_channel, out_channel, 3, downsample=True) - - self.skip = ConvLayer( - in_channel, out_channel, 1, downsample=True, activate=False, bias=False - ) - - def forward(self, input): - out = self.conv1(input) - out = self.conv2(out) - - skip = self.skip(input) - out = (out + skip) / math.sqrt(2) - - return out - - -class Discriminator(nn.Module): - def __init__(self, size, channel_multiplier=2, blur_kernel=[1, 3, 3, 1]): - super().__init__() - - channels = { - 4: 512, - 8: 512, - 16: 512, - 32: 512, - 64: 256 * channel_multiplier, - 128: 128 * channel_multiplier, - 256: 64 * channel_multiplier, - 512: 32 * channel_multiplier, - 1024: 16 * channel_multiplier, - } - - convs = [ConvLayer(3, channels[size], 1)] - - log_size = int(math.log(size, 2)) - - in_channel = channels[size] - - for i in range(log_size, 2, -1): - out_channel = channels[2 ** (i - 1)] - - convs.append(ResBlock(in_channel, out_channel, blur_kernel)) - - in_channel = out_channel - - self.convs = nn.Sequential(*convs) - - self.stddev_group = 4 - self.stddev_feat = 1 - - self.final_conv = ConvLayer(in_channel + 1, channels[4], 3) - self.final_linear = nn.Sequential( - EqualLinear(channels[4] * 4 * 4, channels[4], activation='fused_lrelu'), - EqualLinear(channels[4], 1), - ) - - def forward(self, input): - out = self.convs(input) - - batch, channel, height, width = out.shape - group = min(batch, self.stddev_group) - stddev = out.view( - group, -1, self.stddev_feat, channel // self.stddev_feat, height, width - ) - stddev = torch.sqrt(stddev.var(0, unbiased=False) + 1e-8) - stddev = stddev.mean([2, 3, 4], keepdims=True).squeeze(2) - stddev = stddev.repeat(group, 1, height, width) - out = torch.cat([out, stddev], 1) - - out = self.final_conv(out) - - out = out.view(batch, -1) - out = 
self.final_linear(out) - - return out diff --git a/spaces/batuhantosun/Guided-Backpropagation/app.py b/spaces/batuhantosun/Guided-Backpropagation/app.py deleted file mode 100644 index 5aa0bfdc09d3d8fb30ec0ef604716329e3fdef8b..0000000000000000000000000000000000000000 --- a/spaces/batuhantosun/Guided-Backpropagation/app.py +++ /dev/null @@ -1,152 +0,0 @@ -import gradio as gr -import numpy as np -from PIL import Image -from matplotlib import cm - -from guided_backprop import GuidedBackprop -from utils import range_norm, grad2heatmapped - - -class GradioApp: - def __init__(self): - self.gp = None - self.probs = None - self.input_grad = None - self.activation_maps = None - self.input_img = None - - # Define GUI elements - with gr.Blocks() as self.app: - # Inference - with gr.Box(): - gr.Markdown("# Run inference") - with gr.Box(): - with gr.Row(): - with gr.Column(scale=1): - image = gr.Image(type="pil", label="Input Image", height=200) - with gr.Row(): - with gr.Column(scale=1): - reset_button = gr.ClearButton(value="Reset", elem_classes="feedback") - with gr.Column(scale=1): - run_button = gr.Button(value="Run") - - example = gr.Examples( - examples=[ - "images/bird.jpg", - "images/lion.jpg", - "images/tiger.jpg", - "images/pomegranate.jpg", - "images/strawberry.jpg" - ], - inputs=image) - with gr.Column(scale=1): - model_name = gr.Dropdown(choices=["VGG19", "AlexNet"], value="VGG19", label="Model", info="Choose from the models", interactive=True) - probs = gr.Label(label="Class Probs", num_top_classes=3) - - # Visualize activation maps - with gr.Box(): - gr.Markdown("# Visualize activation maps") - gr.Markdown("Choose a layer and a filter to visualize the activation map.") - with gr.Box(): - with gr.Row(): - with gr.Column(scale=1): - chosen_layer = gr.Dropdown(label="Layer", info="Choose from the layers", interactive=True) - chosen_filter = gr.Slider(label="Filter", info="Choose from the filters", interactive=True) - color = gr.Radio(["heatmap", "gray"], 
value="heatmap", label="Color", info="Choose the color of the activation map") - - with gr.Column(scale=1): - activation_map = gr.Image(type="pil", label="Activation Map", height=300) - - # Visualize input gradient - with gr.Box(): - gr.Markdown("# Visualize input gradient") - gr.Markdown("Compute the input gradient. Visualize the spatial region of the input image that contributes to the prediction.") - gr.Markdown("Change the scale to adjust the opactiy of the heatmap.") - with gr.Box(): - with gr.Row(): - with gr.Column(scale=1): - input_grad = gr.Image(type="pil", label="Input Gradient", height=300) - compute_grad_btn = gr.Button(value="Compute") - - with gr.Column(scale=1): - grad_applied = gr.Image(type="pil", label="Gradient Applied", height=300) - grad_ratio = gr.Slider(label="Scale", minimum=0.0, maximum=1.0, value=0.5, interactive=True) - - # Set up callbacks - reset_button.add([image, input_grad, probs, activation_map, - input_grad, grad_applied, chosen_layer, chosen_filter]) - - run_button.click( - self.run, - inputs=[model_name, image], - outputs=[image, probs, chosen_layer]) - - chosen_layer.change( - self.update_filter_num, - inputs=chosen_layer, outputs=chosen_filter) - - chosen_filter.change( - self.get_activation_map, - inputs=[chosen_layer, chosen_filter, color], - outputs=activation_map) - - color.change( - self.get_activation_map, - inputs=[chosen_layer, chosen_filter, color], - outputs=activation_map) - - compute_grad_btn.click( - self.get_input_grad, - outputs=input_grad) - - compute_grad_btn.click( - self.apply_input_grad, - inputs=[image, grad_ratio], - outputs=grad_applied - ) - - grad_ratio.change( - self.apply_input_grad, - inputs=[image, grad_ratio], - outputs=grad_applied - ) - - def run(self, model_name, input_image): - self.gp = GuidedBackprop(model_name) - self.probs = self.gp.predict(input_image) - self.input_grad, self.activation_maps = self.gp.backprop() - layers = gr.update(choices=list(self.activation_maps.keys())) - return 
input_image, self.probs, layers - - def update_filter_num(self, layer_name): - return gr.update(minimum=1, maximum=self.activation_maps[layer_name].shape[1], value=0) - - def get_activation_map(self, layer_name, chosen_filter, color): - act_im = self.activation_maps[layer_name][0][chosen_filter-1].detach().cpu().numpy() - act_im = range_norm(act_im) - if color != "gray": - act_im = cm.jet(act_im)[:,:,:-1] - return Image.fromarray(np.uint8(act_im*255)) - - def get_input_grad(self): - input_grad = self.input_grad[0].permute(1,2,0).detach() - input_grad = range_norm(input_grad) - input_grad = Image.fromarray(np.uint8(input_grad*255)) - self.input_grad_img = input_grad - return input_grad - - def apply_input_grad(self, input_image, grad_ratio): - heatmapped = grad2heatmapped( - input_image, - self.input_grad_img, - grad_ratio) - - return heatmapped - - def launch(self): - self.app.launch() - - -if __name__ == "__main__": - gradio_app = GradioApp() - gradio_app.launch() diff --git a/spaces/beihai/GFPGAN-V1.3-whole-image/.history/app_20220327001559.py b/spaces/beihai/GFPGAN-V1.3-whole-image/.history/app_20220327001559.py deleted file mode 100644 index f184d1a5e77ddff91eafe37d37a88701e5ea9a94..0000000000000000000000000000000000000000 --- a/spaces/beihai/GFPGAN-V1.3-whole-image/.history/app_20220327001559.py +++ /dev/null @@ -1,65 +0,0 @@ -import os -#os.system("pip install gfpgan") - -#os.system("pip freeze") -#os.system("wget https://github.com/TencentARC/GFPGAN/releases/download/v0.2.0/GFPGANCleanv1-NoCE-C2.pth -P .") -import random -import gradio as gr -from PIL import Image -import torch -# torch.hub.download_url_to_file('https://upload.wikimedia.org/wikipedia/commons/thumb/a/ab/Abraham_Lincoln_O-77_matte_collodion_print.jpg/1024px-Abraham_Lincoln_O-77_matte_collodion_print.jpg', 'lincoln.jpg') -# torch.hub.download_url_to_file('https://upload.wikimedia.org/wikipedia/commons/5/50/Albert_Einstein_%28Nobel%29.png', 'einstein.png') -# 
torch.hub.download_url_to_file('https://upload.wikimedia.org/wikipedia/commons/thumb/9/9d/Thomas_Edison2.jpg/1024px-Thomas_Edison2.jpg', 'edison.jpg') -# torch.hub.download_url_to_file('https://upload.wikimedia.org/wikipedia/commons/thumb/a/a9/Henry_Ford_1888.jpg/1024px-Henry_Ford_1888.jpg', 'Henry.jpg') -# torch.hub.download_url_to_file('https://upload.wikimedia.org/wikipedia/commons/thumb/0/06/Frida_Kahlo%2C_by_Guillermo_Kahlo.jpg/800px-Frida_Kahlo%2C_by_Guillermo_Kahlo.jpg', 'Frida.jpg') - - - - -import cv2 -import glob -import numpy as np -from basicsr.utils import imwrite -from gfpgan import GFPGANer - -import warnings -warnings.warn('The unoptimized RealESRGAN is very slow on CPU. We do not use it. ' - 'If you really want to use it, please modify the corresponding codes.') -bg_upsampler = None - - - -# set up GFPGAN restorer -restorer = GFPGANer( - model_path='experiments/pretrained_models/GFPGANv1.3.pth', - upscale=2, - arch='clean', - channel_multiplier=2, - bg_upsampler=bg_upsampler) - - -def inference(img): - input_img = cv2.imread(img, cv2.IMREAD_COLOR) - cropped_faces, restored_faces, restored_img = restorer.enhance( - input_img, has_aligned=False, only_center_face=False, paste_back=True) - - return Image.fromarray(restored_img[0][:,:]) - -title = "GFP-GAN" -description = "Gradio demo for GFP-GAN: Towards Real-World Blind Face Restoration with Generative Facial Prior. To use it, simply upload your image, or click one of the examples to load them. Read more at the links below. Please click submit only once" -article = "

Towards Real-World Blind Face Restoration with Generative Facial Prior | Github Repo

visitor badge
" -gr.Interface( - inference, - [gr.inputs.Image(type="filepath", label="Input")], - gr.outputs.Image(type="pil", label="Output"), - title=title, - description=description, - article=article, - examples=[ - ['lincoln.jpg'], - ['einstein.png'], - ['edison.jpg'], - ['Henry.jpg'], - ['Frida.jpg'] - ] - ).launch(enable_queue=True,cache_examples=True) \ No newline at end of file diff --git a/spaces/beihai/PDF-Table-Extractor/.history/app_20220621104723.py b/spaces/beihai/PDF-Table-Extractor/.history/app_20220621104723.py deleted file mode 100644 index b0834fab78c7daf83024ccbcded0d5d8748cc87b..0000000000000000000000000000000000000000 --- a/spaces/beihai/PDF-Table-Extractor/.history/app_20220621104723.py +++ /dev/null @@ -1,53 +0,0 @@ -#-*- coding : utf-8-*- -import base64 -from subprocess import STDOUT -import streamlit as st -import pandas as pd -import camelot as cam # extracting tables from PDFs - -st.title("PDF Table Extractor") - -input_pdf = st.file_uploader(label = "", type = 'pdf') - -background = st.selectbox("表格线条是否隐藏",(False,True)) -extractor_mode = st.selectbox("单页抽取 OR 全文抽取",("单页抽取","全文抽取")) - -def extractor(page,result_name): - tables_all= cam.read_pdf("input.pdf", pages=page, process_background=background) - result_all = pd.ExcelWriter(result_name, engine='xlsxwriter') - for i in range(0,len(tables_all)): - table = tables_all[i].df - sheetname = str(i) - table.to_excel(result_all, sheetname,index=False) - with open(result_name,'rb') as f: - st.download_button('抽取完成,点击下载!', f,file_name=result_name,mime="application/vnd.ms-excel") - - -if input_pdf is not None: - # byte object into a PDF file - with open("input.pdf", "wb") as f: - base64_pdf = base64.b64encode(input_pdf.read()).decode('utf-8') - f.write(base64.b64decode(base64_pdf)) - f.close() - if extractor_mode == "单页抽取": - page_number = st.text_input("请填写表格所在PDF页码,eg: 3", value = 1) - # read the pdf and parse it using stream - tables = cam.read_pdf("input.pdf", pages=page_number, 
process_background=background) - result = pd.ExcelWriter('result.xlsx', engine='xlsxwriter') - #tables[1].to_excel(result,index=False) - for i in range(0,len(tables)): - table = tables[i].df - sheetname = str(i) - table.to_excel(result, sheetname,index=False) - - with open('result.xlsx','rb') as f: - st.download_button('提取完成,点击下载!', f,file_name='result.xlsx',mime="application/vnd.ms-excel") - if extractor_mode == "全文抽取": - tables_all= cam.read_pdf("input.pdf", pages="all", process_background=background) - result_all = pd.ExcelWriter('result_all.xlsx', engine='xlsxwriter') - for i in range(0,len(tables_all)): - table = tables_all[i].df - sheetname = str(i) - table.to_excel(result_all, sheetname,index=False) - with open('result_all.xlsx','rb') as f: - st.download_button('抽取完成,点击下载!', f,file_name='result_all.xlsx',mime="application/vnd.ms-excel") diff --git a/spaces/bhasker412/IDD-YOLO-Tracking/trackers/strongsort/deep/models/nasnet.py b/spaces/bhasker412/IDD-YOLO-Tracking/trackers/strongsort/deep/models/nasnet.py deleted file mode 100644 index b1f31def5515c3ba464c86cde471328b50c55b14..0000000000000000000000000000000000000000 --- a/spaces/bhasker412/IDD-YOLO-Tracking/trackers/strongsort/deep/models/nasnet.py +++ /dev/null @@ -1,1131 +0,0 @@ -from __future__ import division, absolute_import -import torch -import torch.nn as nn -import torch.nn.functional as F -import torch.utils.model_zoo as model_zoo - -__all__ = ['nasnetamobile'] -""" -NASNet Mobile -Thanks to Anastasiia (https://github.com/DagnyT) for the great help, support and motivation! 
- - ------------------------------------------------------------------------------------- - Architecture | Top-1 Acc | Top-5 Acc | Multiply-Adds | Params (M) ------------------------------------------------------------------------------------- -| NASNet-A (4 @ 1056) | 74.08% | 91.74% | 564 M | 5.3 | ------------------------------------------------------------------------------------- -# References: - - [Learning Transferable Architectures for Scalable Image Recognition] - (https://arxiv.org/abs/1707.07012) -""" -""" -Code imported from https://github.com/Cadene/pretrained-models.pytorch -""" - -pretrained_settings = { - 'nasnetamobile': { - 'imagenet': { - # 'url': 'https://github.com/veronikayurchuk/pretrained-models.pytorch/releases/download/v1.0/nasnetmobile-7e03cead.pth.tar', - 'url': - 'http://data.lip6.fr/cadene/pretrainedmodels/nasnetamobile-7e03cead.pth', - 'input_space': 'RGB', - 'input_size': [3, 224, 224], # resize 256 - 'input_range': [0, 1], - 'mean': [0.5, 0.5, 0.5], - 'std': [0.5, 0.5, 0.5], - 'num_classes': 1000 - }, - # 'imagenet+background': { - # # 'url': 'http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth', - # 'input_space': 'RGB', - # 'input_size': [3, 224, 224], # resize 256 - # 'input_range': [0, 1], - # 'mean': [0.5, 0.5, 0.5], - # 'std': [0.5, 0.5, 0.5], - # 'num_classes': 1001 - # } - } -} - - -class MaxPoolPad(nn.Module): - - def __init__(self): - super(MaxPoolPad, self).__init__() - self.pad = nn.ZeroPad2d((1, 0, 1, 0)) - self.pool = nn.MaxPool2d(3, stride=2, padding=1) - - def forward(self, x): - x = self.pad(x) - x = self.pool(x) - x = x[:, :, 1:, 1:].contiguous() - return x - - -class AvgPoolPad(nn.Module): - - def __init__(self, stride=2, padding=1): - super(AvgPoolPad, self).__init__() - self.pad = nn.ZeroPad2d((1, 0, 1, 0)) - self.pool = nn.AvgPool2d( - 3, stride=stride, padding=padding, count_include_pad=False - ) - - def forward(self, x): - x = self.pad(x) - x = self.pool(x) - x = x[:, :, 1:, 1:].contiguous() 
- return x - - -class SeparableConv2d(nn.Module): - - def __init__( - self, - in_channels, - out_channels, - dw_kernel, - dw_stride, - dw_padding, - bias=False - ): - super(SeparableConv2d, self).__init__() - self.depthwise_conv2d = nn.Conv2d( - in_channels, - in_channels, - dw_kernel, - stride=dw_stride, - padding=dw_padding, - bias=bias, - groups=in_channels - ) - self.pointwise_conv2d = nn.Conv2d( - in_channels, out_channels, 1, stride=1, bias=bias - ) - - def forward(self, x): - x = self.depthwise_conv2d(x) - x = self.pointwise_conv2d(x) - return x - - -class BranchSeparables(nn.Module): - - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride, - padding, - name=None, - bias=False - ): - super(BranchSeparables, self).__init__() - self.relu = nn.ReLU() - self.separable_1 = SeparableConv2d( - in_channels, in_channels, kernel_size, stride, padding, bias=bias - ) - self.bn_sep_1 = nn.BatchNorm2d( - in_channels, eps=0.001, momentum=0.1, affine=True - ) - self.relu1 = nn.ReLU() - self.separable_2 = SeparableConv2d( - in_channels, out_channels, kernel_size, 1, padding, bias=bias - ) - self.bn_sep_2 = nn.BatchNorm2d( - out_channels, eps=0.001, momentum=0.1, affine=True - ) - self.name = name - - def forward(self, x): - x = self.relu(x) - if self.name == 'specific': - x = nn.ZeroPad2d((1, 0, 1, 0))(x) - x = self.separable_1(x) - if self.name == 'specific': - x = x[:, :, 1:, 1:].contiguous() - - x = self.bn_sep_1(x) - x = self.relu1(x) - x = self.separable_2(x) - x = self.bn_sep_2(x) - return x - - -class BranchSeparablesStem(nn.Module): - - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride, - padding, - bias=False - ): - super(BranchSeparablesStem, self).__init__() - self.relu = nn.ReLU() - self.separable_1 = SeparableConv2d( - in_channels, out_channels, kernel_size, stride, padding, bias=bias - ) - self.bn_sep_1 = nn.BatchNorm2d( - out_channels, eps=0.001, momentum=0.1, affine=True - ) - self.relu1 = nn.ReLU() - 
self.separable_2 = SeparableConv2d( - out_channels, out_channels, kernel_size, 1, padding, bias=bias - ) - self.bn_sep_2 = nn.BatchNorm2d( - out_channels, eps=0.001, momentum=0.1, affine=True - ) - - def forward(self, x): - x = self.relu(x) - x = self.separable_1(x) - x = self.bn_sep_1(x) - x = self.relu1(x) - x = self.separable_2(x) - x = self.bn_sep_2(x) - return x - - -class BranchSeparablesReduction(BranchSeparables): - - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride, - padding, - z_padding=1, - bias=False - ): - BranchSeparables.__init__( - self, in_channels, out_channels, kernel_size, stride, padding, bias - ) - self.padding = nn.ZeroPad2d((z_padding, 0, z_padding, 0)) - - def forward(self, x): - x = self.relu(x) - x = self.padding(x) - x = self.separable_1(x) - x = x[:, :, 1:, 1:].contiguous() - x = self.bn_sep_1(x) - x = self.relu1(x) - x = self.separable_2(x) - x = self.bn_sep_2(x) - return x - - -class CellStem0(nn.Module): - - def __init__(self, stem_filters, num_filters=42): - super(CellStem0, self).__init__() - self.num_filters = num_filters - self.stem_filters = stem_filters - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - self.stem_filters, self.num_filters, 1, stride=1, bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - self.num_filters, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.comb_iter_0_left = BranchSeparables( - self.num_filters, self.num_filters, 5, 2, 2 - ) - self.comb_iter_0_right = BranchSeparablesStem( - self.stem_filters, self.num_filters, 7, 2, 3, bias=False - ) - - self.comb_iter_1_left = nn.MaxPool2d(3, stride=2, padding=1) - self.comb_iter_1_right = BranchSeparablesStem( - self.stem_filters, self.num_filters, 7, 2, 3, bias=False - ) - - self.comb_iter_2_left = nn.AvgPool2d( - 3, stride=2, padding=1, count_include_pad=False - ) - self.comb_iter_2_right = BranchSeparablesStem( - 
self.stem_filters, self.num_filters, 5, 2, 2, bias=False - ) - - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparables( - self.num_filters, self.num_filters, 3, 1, 1, bias=False - ) - self.comb_iter_4_right = nn.MaxPool2d(3, stride=2, padding=1) - - def forward(self, x): - x1 = self.conv_1x1(x) - - x_comb_iter_0_left = self.comb_iter_0_left(x1) - x_comb_iter_0_right = self.comb_iter_0_right(x) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x1) - x_comb_iter_1_right = self.comb_iter_1_right(x) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = self.comb_iter_2_left(x1) - x_comb_iter_2_right = self.comb_iter_2_right(x) - x_comb_iter_2 = x_comb_iter_2_left + x_comb_iter_2_right - - x_comb_iter_3_right = self.comb_iter_3_right(x_comb_iter_0) - x_comb_iter_3 = x_comb_iter_3_right + x_comb_iter_1 - - x_comb_iter_4_left = self.comb_iter_4_left(x_comb_iter_0) - x_comb_iter_4_right = self.comb_iter_4_right(x1) - x_comb_iter_4 = x_comb_iter_4_left + x_comb_iter_4_right - - x_out = torch.cat( - [x_comb_iter_1, x_comb_iter_2, x_comb_iter_3, x_comb_iter_4], 1 - ) - return x_out - - -class CellStem1(nn.Module): - - def __init__(self, stem_filters, num_filters): - super(CellStem1, self).__init__() - self.num_filters = num_filters - self.stem_filters = stem_filters - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - 2 * self.num_filters, - self.num_filters, - 1, - stride=1, - bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - self.num_filters, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.relu = nn.ReLU() - self.path_1 = nn.Sequential() - self.path_1.add_module( - 'avgpool', nn.AvgPool2d(1, stride=2, count_include_pad=False) - ) - self.path_1.add_module( - 'conv', - nn.Conv2d( - self.stem_filters, 
- self.num_filters // 2, - 1, - stride=1, - bias=False - ) - ) - self.path_2 = nn.ModuleList() - self.path_2.add_module('pad', nn.ZeroPad2d((0, 1, 0, 1))) - self.path_2.add_module( - 'avgpool', nn.AvgPool2d(1, stride=2, count_include_pad=False) - ) - self.path_2.add_module( - 'conv', - nn.Conv2d( - self.stem_filters, - self.num_filters // 2, - 1, - stride=1, - bias=False - ) - ) - - self.final_path_bn = nn.BatchNorm2d( - self.num_filters, eps=0.001, momentum=0.1, affine=True - ) - - self.comb_iter_0_left = BranchSeparables( - self.num_filters, - self.num_filters, - 5, - 2, - 2, - name='specific', - bias=False - ) - self.comb_iter_0_right = BranchSeparables( - self.num_filters, - self.num_filters, - 7, - 2, - 3, - name='specific', - bias=False - ) - - # self.comb_iter_1_left = nn.MaxPool2d(3, stride=2, padding=1) - self.comb_iter_1_left = MaxPoolPad() - self.comb_iter_1_right = BranchSeparables( - self.num_filters, - self.num_filters, - 7, - 2, - 3, - name='specific', - bias=False - ) - - # self.comb_iter_2_left = nn.AvgPool2d(3, stride=2, padding=1, count_include_pad=False) - self.comb_iter_2_left = AvgPoolPad() - self.comb_iter_2_right = BranchSeparables( - self.num_filters, - self.num_filters, - 5, - 2, - 2, - name='specific', - bias=False - ) - - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparables( - self.num_filters, - self.num_filters, - 3, - 1, - 1, - name='specific', - bias=False - ) - # self.comb_iter_4_right = nn.MaxPool2d(3, stride=2, padding=1) - self.comb_iter_4_right = MaxPoolPad() - - def forward(self, x_conv0, x_stem_0): - x_left = self.conv_1x1(x_stem_0) - - x_relu = self.relu(x_conv0) - # path 1 - x_path1 = self.path_1(x_relu) - # path 2 - x_path2 = self.path_2.pad(x_relu) - x_path2 = x_path2[:, :, 1:, 1:] - x_path2 = self.path_2.avgpool(x_path2) - x_path2 = self.path_2.conv(x_path2) - # final path - x_right = self.final_path_bn(torch.cat([x_path1, x_path2], 1)) - 
- x_comb_iter_0_left = self.comb_iter_0_left(x_left) - x_comb_iter_0_right = self.comb_iter_0_right(x_right) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x_left) - x_comb_iter_1_right = self.comb_iter_1_right(x_right) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = self.comb_iter_2_left(x_left) - x_comb_iter_2_right = self.comb_iter_2_right(x_right) - x_comb_iter_2 = x_comb_iter_2_left + x_comb_iter_2_right - - x_comb_iter_3_right = self.comb_iter_3_right(x_comb_iter_0) - x_comb_iter_3 = x_comb_iter_3_right + x_comb_iter_1 - - x_comb_iter_4_left = self.comb_iter_4_left(x_comb_iter_0) - x_comb_iter_4_right = self.comb_iter_4_right(x_left) - x_comb_iter_4 = x_comb_iter_4_left + x_comb_iter_4_right - - x_out = torch.cat( - [x_comb_iter_1, x_comb_iter_2, x_comb_iter_3, x_comb_iter_4], 1 - ) - return x_out - - -class FirstCell(nn.Module): - - def __init__( - self, in_channels_left, out_channels_left, in_channels_right, - out_channels_right - ): - super(FirstCell, self).__init__() - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_right, out_channels_right, 1, stride=1, bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_right, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.relu = nn.ReLU() - self.path_1 = nn.Sequential() - self.path_1.add_module( - 'avgpool', nn.AvgPool2d(1, stride=2, count_include_pad=False) - ) - self.path_1.add_module( - 'conv', - nn.Conv2d( - in_channels_left, out_channels_left, 1, stride=1, bias=False - ) - ) - self.path_2 = nn.ModuleList() - self.path_2.add_module('pad', nn.ZeroPad2d((0, 1, 0, 1))) - self.path_2.add_module( - 'avgpool', nn.AvgPool2d(1, stride=2, count_include_pad=False) - ) - self.path_2.add_module( - 'conv', - nn.Conv2d( - in_channels_left, out_channels_left, 1, stride=1, bias=False - ) - ) - - 
self.final_path_bn = nn.BatchNorm2d( - out_channels_left * 2, eps=0.001, momentum=0.1, affine=True - ) - - self.comb_iter_0_left = BranchSeparables( - out_channels_right, out_channels_right, 5, 1, 2, bias=False - ) - self.comb_iter_0_right = BranchSeparables( - out_channels_right, out_channels_right, 3, 1, 1, bias=False - ) - - self.comb_iter_1_left = BranchSeparables( - out_channels_right, out_channels_right, 5, 1, 2, bias=False - ) - self.comb_iter_1_right = BranchSeparables( - out_channels_right, out_channels_right, 3, 1, 1, bias=False - ) - - self.comb_iter_2_left = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_3_left = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparables( - out_channels_right, out_channels_right, 3, 1, 1, bias=False - ) - - def forward(self, x, x_prev): - x_relu = self.relu(x_prev) - # path 1 - x_path1 = self.path_1(x_relu) - # path 2 - x_path2 = self.path_2.pad(x_relu) - x_path2 = x_path2[:, :, 1:, 1:] - x_path2 = self.path_2.avgpool(x_path2) - x_path2 = self.path_2.conv(x_path2) - # final path - x_left = self.final_path_bn(torch.cat([x_path1, x_path2], 1)) - - x_right = self.conv_1x1(x) - - x_comb_iter_0_left = self.comb_iter_0_left(x_right) - x_comb_iter_0_right = self.comb_iter_0_right(x_left) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x_left) - x_comb_iter_1_right = self.comb_iter_1_right(x_left) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = self.comb_iter_2_left(x_right) - x_comb_iter_2 = x_comb_iter_2_left + x_left - - x_comb_iter_3_left = self.comb_iter_3_left(x_left) - x_comb_iter_3_right = self.comb_iter_3_right(x_left) - x_comb_iter_3 = x_comb_iter_3_left + x_comb_iter_3_right - - x_comb_iter_4_left = self.comb_iter_4_left(x_right) - 
x_comb_iter_4 = x_comb_iter_4_left + x_right - - x_out = torch.cat( - [ - x_left, x_comb_iter_0, x_comb_iter_1, x_comb_iter_2, - x_comb_iter_3, x_comb_iter_4 - ], 1 - ) - return x_out - - -class NormalCell(nn.Module): - - def __init__( - self, in_channels_left, out_channels_left, in_channels_right, - out_channels_right - ): - super(NormalCell, self).__init__() - self.conv_prev_1x1 = nn.Sequential() - self.conv_prev_1x1.add_module('relu', nn.ReLU()) - self.conv_prev_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_left, out_channels_left, 1, stride=1, bias=False - ) - ) - self.conv_prev_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_left, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_right, out_channels_right, 1, stride=1, bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_right, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.comb_iter_0_left = BranchSeparables( - out_channels_right, out_channels_right, 5, 1, 2, bias=False - ) - self.comb_iter_0_right = BranchSeparables( - out_channels_left, out_channels_left, 3, 1, 1, bias=False - ) - - self.comb_iter_1_left = BranchSeparables( - out_channels_left, out_channels_left, 5, 1, 2, bias=False - ) - self.comb_iter_1_right = BranchSeparables( - out_channels_left, out_channels_left, 3, 1, 1, bias=False - ) - - self.comb_iter_2_left = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_3_left = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparables( - out_channels_right, out_channels_right, 3, 1, 1, bias=False - ) - - def forward(self, x, x_prev): - x_left = self.conv_prev_1x1(x_prev) - x_right = self.conv_1x1(x) - - x_comb_iter_0_left = 
self.comb_iter_0_left(x_right) - x_comb_iter_0_right = self.comb_iter_0_right(x_left) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x_left) - x_comb_iter_1_right = self.comb_iter_1_right(x_left) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = self.comb_iter_2_left(x_right) - x_comb_iter_2 = x_comb_iter_2_left + x_left - - x_comb_iter_3_left = self.comb_iter_3_left(x_left) - x_comb_iter_3_right = self.comb_iter_3_right(x_left) - x_comb_iter_3 = x_comb_iter_3_left + x_comb_iter_3_right - - x_comb_iter_4_left = self.comb_iter_4_left(x_right) - x_comb_iter_4 = x_comb_iter_4_left + x_right - - x_out = torch.cat( - [ - x_left, x_comb_iter_0, x_comb_iter_1, x_comb_iter_2, - x_comb_iter_3, x_comb_iter_4 - ], 1 - ) - return x_out - - -class ReductionCell0(nn.Module): - - def __init__( - self, in_channels_left, out_channels_left, in_channels_right, - out_channels_right - ): - super(ReductionCell0, self).__init__() - self.conv_prev_1x1 = nn.Sequential() - self.conv_prev_1x1.add_module('relu', nn.ReLU()) - self.conv_prev_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_left, out_channels_left, 1, stride=1, bias=False - ) - ) - self.conv_prev_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_left, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_right, out_channels_right, 1, stride=1, bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_right, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.comb_iter_0_left = BranchSeparablesReduction( - out_channels_right, out_channels_right, 5, 2, 2, bias=False - ) - self.comb_iter_0_right = BranchSeparablesReduction( - out_channels_right, out_channels_right, 7, 2, 3, bias=False - ) - - self.comb_iter_1_left = MaxPoolPad() - self.comb_iter_1_right = 
BranchSeparablesReduction( - out_channels_right, out_channels_right, 7, 2, 3, bias=False - ) - - self.comb_iter_2_left = AvgPoolPad() - self.comb_iter_2_right = BranchSeparablesReduction( - out_channels_right, out_channels_right, 5, 2, 2, bias=False - ) - - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparablesReduction( - out_channels_right, out_channels_right, 3, 1, 1, bias=False - ) - self.comb_iter_4_right = MaxPoolPad() - - def forward(self, x, x_prev): - x_left = self.conv_prev_1x1(x_prev) - x_right = self.conv_1x1(x) - - x_comb_iter_0_left = self.comb_iter_0_left(x_right) - x_comb_iter_0_right = self.comb_iter_0_right(x_left) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x_right) - x_comb_iter_1_right = self.comb_iter_1_right(x_left) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = self.comb_iter_2_left(x_right) - x_comb_iter_2_right = self.comb_iter_2_right(x_left) - x_comb_iter_2 = x_comb_iter_2_left + x_comb_iter_2_right - - x_comb_iter_3_right = self.comb_iter_3_right(x_comb_iter_0) - x_comb_iter_3 = x_comb_iter_3_right + x_comb_iter_1 - - x_comb_iter_4_left = self.comb_iter_4_left(x_comb_iter_0) - x_comb_iter_4_right = self.comb_iter_4_right(x_right) - x_comb_iter_4 = x_comb_iter_4_left + x_comb_iter_4_right - - x_out = torch.cat( - [x_comb_iter_1, x_comb_iter_2, x_comb_iter_3, x_comb_iter_4], 1 - ) - return x_out - - -class ReductionCell1(nn.Module): - - def __init__( - self, in_channels_left, out_channels_left, in_channels_right, - out_channels_right - ): - super(ReductionCell1, self).__init__() - self.conv_prev_1x1 = nn.Sequential() - self.conv_prev_1x1.add_module('relu', nn.ReLU()) - self.conv_prev_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_left, out_channels_left, 1, stride=1, bias=False - ) - ) - self.conv_prev_1x1.add_module( - 'bn', - nn.BatchNorm2d( - 
out_channels_left, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.conv_1x1 = nn.Sequential() - self.conv_1x1.add_module('relu', nn.ReLU()) - self.conv_1x1.add_module( - 'conv', - nn.Conv2d( - in_channels_right, out_channels_right, 1, stride=1, bias=False - ) - ) - self.conv_1x1.add_module( - 'bn', - nn.BatchNorm2d( - out_channels_right, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.comb_iter_0_left = BranchSeparables( - out_channels_right, - out_channels_right, - 5, - 2, - 2, - name='specific', - bias=False - ) - self.comb_iter_0_right = BranchSeparables( - out_channels_right, - out_channels_right, - 7, - 2, - 3, - name='specific', - bias=False - ) - - # self.comb_iter_1_left = nn.MaxPool2d(3, stride=2, padding=1) - self.comb_iter_1_left = MaxPoolPad() - self.comb_iter_1_right = BranchSeparables( - out_channels_right, - out_channels_right, - 7, - 2, - 3, - name='specific', - bias=False - ) - - # self.comb_iter_2_left = nn.AvgPool2d(3, stride=2, padding=1, count_include_pad=False) - self.comb_iter_2_left = AvgPoolPad() - self.comb_iter_2_right = BranchSeparables( - out_channels_right, - out_channels_right, - 5, - 2, - 2, - name='specific', - bias=False - ) - - self.comb_iter_3_right = nn.AvgPool2d( - 3, stride=1, padding=1, count_include_pad=False - ) - - self.comb_iter_4_left = BranchSeparables( - out_channels_right, - out_channels_right, - 3, - 1, - 1, - name='specific', - bias=False - ) - # self.comb_iter_4_right = nn.MaxPool2d(3, stride=2, padding=1) - self.comb_iter_4_right = MaxPoolPad() - - def forward(self, x, x_prev): - x_left = self.conv_prev_1x1(x_prev) - x_right = self.conv_1x1(x) - - x_comb_iter_0_left = self.comb_iter_0_left(x_right) - x_comb_iter_0_right = self.comb_iter_0_right(x_left) - x_comb_iter_0 = x_comb_iter_0_left + x_comb_iter_0_right - - x_comb_iter_1_left = self.comb_iter_1_left(x_right) - x_comb_iter_1_right = self.comb_iter_1_right(x_left) - x_comb_iter_1 = x_comb_iter_1_left + x_comb_iter_1_right - - x_comb_iter_2_left = 
self.comb_iter_2_left(x_right) - x_comb_iter_2_right = self.comb_iter_2_right(x_left) - x_comb_iter_2 = x_comb_iter_2_left + x_comb_iter_2_right - - x_comb_iter_3_right = self.comb_iter_3_right(x_comb_iter_0) - x_comb_iter_3 = x_comb_iter_3_right + x_comb_iter_1 - - x_comb_iter_4_left = self.comb_iter_4_left(x_comb_iter_0) - x_comb_iter_4_right = self.comb_iter_4_right(x_right) - x_comb_iter_4 = x_comb_iter_4_left + x_comb_iter_4_right - - x_out = torch.cat( - [x_comb_iter_1, x_comb_iter_2, x_comb_iter_3, x_comb_iter_4], 1 - ) - return x_out - - -class NASNetAMobile(nn.Module): - """Neural Architecture Search (NAS). - - Reference: - Zoph et al. Learning Transferable Architectures - for Scalable Image Recognition. CVPR 2018. - - Public keys: - - ``nasnetamobile``: NASNet-A Mobile. - """ - - def __init__( - self, - num_classes, - loss, - stem_filters=32, - penultimate_filters=1056, - filters_multiplier=2, - **kwargs - ): - super(NASNetAMobile, self).__init__() - self.stem_filters = stem_filters - self.penultimate_filters = penultimate_filters - self.filters_multiplier = filters_multiplier - self.loss = loss - - filters = self.penultimate_filters // 24 - # 24 is default value for the architecture - - self.conv0 = nn.Sequential() - self.conv0.add_module( - 'conv', - nn.Conv2d( - in_channels=3, - out_channels=self.stem_filters, - kernel_size=3, - padding=0, - stride=2, - bias=False - ) - ) - self.conv0.add_module( - 'bn', - nn.BatchNorm2d( - self.stem_filters, eps=0.001, momentum=0.1, affine=True - ) - ) - - self.cell_stem_0 = CellStem0( - self.stem_filters, num_filters=filters // (filters_multiplier**2) - ) - self.cell_stem_1 = CellStem1( - self.stem_filters, num_filters=filters // filters_multiplier - ) - - self.cell_0 = FirstCell( - in_channels_left=filters, - out_channels_left=filters // 2, # 1, 0.5 - in_channels_right=2 * filters, - out_channels_right=filters - ) # 2, 1 - self.cell_1 = NormalCell( - in_channels_left=2 * filters, - out_channels_left=filters, # 2, 1 
- in_channels_right=6 * filters, - out_channels_right=filters - ) # 6, 1 - self.cell_2 = NormalCell( - in_channels_left=6 * filters, - out_channels_left=filters, # 6, 1 - in_channels_right=6 * filters, - out_channels_right=filters - ) # 6, 1 - self.cell_3 = NormalCell( - in_channels_left=6 * filters, - out_channels_left=filters, # 6, 1 - in_channels_right=6 * filters, - out_channels_right=filters - ) # 6, 1 - - self.reduction_cell_0 = ReductionCell0( - in_channels_left=6 * filters, - out_channels_left=2 * filters, # 6, 2 - in_channels_right=6 * filters, - out_channels_right=2 * filters - ) # 6, 2 - - self.cell_6 = FirstCell( - in_channels_left=6 * filters, - out_channels_left=filters, # 6, 1 - in_channels_right=8 * filters, - out_channels_right=2 * filters - ) # 8, 2 - self.cell_7 = NormalCell( - in_channels_left=8 * filters, - out_channels_left=2 * filters, # 8, 2 - in_channels_right=12 * filters, - out_channels_right=2 * filters - ) # 12, 2 - self.cell_8 = NormalCell( - in_channels_left=12 * filters, - out_channels_left=2 * filters, # 12, 2 - in_channels_right=12 * filters, - out_channels_right=2 * filters - ) # 12, 2 - self.cell_9 = NormalCell( - in_channels_left=12 * filters, - out_channels_left=2 * filters, # 12, 2 - in_channels_right=12 * filters, - out_channels_right=2 * filters - ) # 12, 2 - - self.reduction_cell_1 = ReductionCell1( - in_channels_left=12 * filters, - out_channels_left=4 * filters, # 12, 4 - in_channels_right=12 * filters, - out_channels_right=4 * filters - ) # 12, 4 - - self.cell_12 = FirstCell( - in_channels_left=12 * filters, - out_channels_left=2 * filters, # 12, 2 - in_channels_right=16 * filters, - out_channels_right=4 * filters - ) # 16, 4 - self.cell_13 = NormalCell( - in_channels_left=16 * filters, - out_channels_left=4 * filters, # 16, 4 - in_channels_right=24 * filters, - out_channels_right=4 * filters - ) # 24, 4 - self.cell_14 = NormalCell( - in_channels_left=24 * filters, - out_channels_left=4 * filters, # 24, 4 - 
in_channels_right=24 * filters, - out_channels_right=4 * filters - ) # 24, 4 - self.cell_15 = NormalCell( - in_channels_left=24 * filters, - out_channels_left=4 * filters, # 24, 4 - in_channels_right=24 * filters, - out_channels_right=4 * filters - ) # 24, 4 - - self.relu = nn.ReLU() - self.dropout = nn.Dropout() - self.classifier = nn.Linear(24 * filters, num_classes) - - self._init_params() - - def _init_params(self): - for m in self.modules(): - if isinstance(m, nn.Conv2d): - nn.init.kaiming_normal_( - m.weight, mode='fan_out', nonlinearity='relu' - ) - if m.bias is not None: - nn.init.constant_(m.bias, 0) - elif isinstance(m, nn.BatchNorm2d): - nn.init.constant_(m.weight, 1) - nn.init.constant_(m.bias, 0) - elif isinstance(m, nn.BatchNorm1d): - nn.init.constant_(m.weight, 1) - nn.init.constant_(m.bias, 0) - elif isinstance(m, nn.Linear): - nn.init.normal_(m.weight, 0, 0.01) - if m.bias is not None: - nn.init.constant_(m.bias, 0) - - def features(self, input): - x_conv0 = self.conv0(input) - x_stem_0 = self.cell_stem_0(x_conv0) - x_stem_1 = self.cell_stem_1(x_conv0, x_stem_0) - - x_cell_0 = self.cell_0(x_stem_1, x_stem_0) - x_cell_1 = self.cell_1(x_cell_0, x_stem_1) - x_cell_2 = self.cell_2(x_cell_1, x_cell_0) - x_cell_3 = self.cell_3(x_cell_2, x_cell_1) - - x_reduction_cell_0 = self.reduction_cell_0(x_cell_3, x_cell_2) - - x_cell_6 = self.cell_6(x_reduction_cell_0, x_cell_3) - x_cell_7 = self.cell_7(x_cell_6, x_reduction_cell_0) - x_cell_8 = self.cell_8(x_cell_7, x_cell_6) - x_cell_9 = self.cell_9(x_cell_8, x_cell_7) - - x_reduction_cell_1 = self.reduction_cell_1(x_cell_9, x_cell_8) - - x_cell_12 = self.cell_12(x_reduction_cell_1, x_cell_9) - x_cell_13 = self.cell_13(x_cell_12, x_reduction_cell_1) - x_cell_14 = self.cell_14(x_cell_13, x_cell_12) - x_cell_15 = self.cell_15(x_cell_14, x_cell_13) - - x_cell_15 = self.relu(x_cell_15) - x_cell_15 = F.avg_pool2d( - x_cell_15, - x_cell_15.size()[2:] - ) # global average pool - x_cell_15 = 
x_cell_15.view(x_cell_15.size(0), -1) - x_cell_15 = self.dropout(x_cell_15) - - return x_cell_15 - - def forward(self, input): - v = self.features(input) - - if not self.training: - return v - - y = self.classifier(v) - - if self.loss == 'softmax': - return y - elif self.loss == 'triplet': - return y, v - else: - raise KeyError('Unsupported loss: {}'.format(self.loss)) - - -def init_pretrained_weights(model, model_url): - """Initializes model with pretrained weights. - - Layers that don't match with pretrained layers in name or size are kept unchanged. - """ - pretrain_dict = model_zoo.load_url(model_url) - model_dict = model.state_dict() - pretrain_dict = { - k: v - for k, v in pretrain_dict.items() - if k in model_dict and model_dict[k].size() == v.size() - } - model_dict.update(pretrain_dict) - model.load_state_dict(model_dict) - - -def nasnetamobile(num_classes, loss='softmax', pretrained=True, **kwargs): - model = NASNetAMobile(num_classes, loss, **kwargs) - if pretrained: - model_url = pretrained_settings['nasnetamobile']['imagenet']['url'] - init_pretrained_weights(model, model_url) - return model diff --git a/spaces/bigslime/stablediffusion-infinity/PyPatchMatch/csrc/pyinterface.cpp b/spaces/bigslime/stablediffusion-infinity/PyPatchMatch/csrc/pyinterface.cpp deleted file mode 100644 index 612ccba6544ff111a2da0dce9adc4019858ebded..0000000000000000000000000000000000000000 --- a/spaces/bigslime/stablediffusion-infinity/PyPatchMatch/csrc/pyinterface.cpp +++ /dev/null @@ -1,107 +0,0 @@ -#include "pyinterface.h" -#include "inpaint.h" - -static unsigned int PM_seed = 1212; -static bool PM_verbose = false; - -int _dtype_py_to_cv(int dtype_py); -int _dtype_cv_to_py(int dtype_cv); -cv::Mat _py_to_cv2(PM_mat_t pymat); -PM_mat_t _cv2_to_py(cv::Mat cvmat); - -void PM_set_random_seed(unsigned int seed) { - PM_seed = seed; -} - -void PM_set_verbose(int value) { - PM_verbose = static_cast(value); -} - -void PM_free_pymat(PM_mat_t pymat) { - free(pymat.data_ptr); -} - 
-PM_mat_t PM_inpaint(PM_mat_t source_py, PM_mat_t mask_py, int patch_size) { - cv::Mat source = _py_to_cv2(source_py); - cv::Mat mask = _py_to_cv2(mask_py); - auto metric = PatchSSDDistanceMetric(patch_size); - cv::Mat result = Inpainting(source, mask, &metric).run(PM_verbose, false, PM_seed); - return _cv2_to_py(result); -} - -PM_mat_t PM_inpaint_regularity(PM_mat_t source_py, PM_mat_t mask_py, PM_mat_t ijmap_py, int patch_size, float guide_weight) { - cv::Mat source = _py_to_cv2(source_py); - cv::Mat mask = _py_to_cv2(mask_py); - cv::Mat ijmap = _py_to_cv2(ijmap_py); - - auto metric = RegularityGuidedPatchDistanceMetricV2(patch_size, ijmap, guide_weight); - cv::Mat result = Inpainting(source, mask, &metric).run(PM_verbose, false, PM_seed); - return _cv2_to_py(result); -} - -PM_mat_t PM_inpaint2(PM_mat_t source_py, PM_mat_t mask_py, PM_mat_t global_mask_py, int patch_size) { - cv::Mat source = _py_to_cv2(source_py); - cv::Mat mask = _py_to_cv2(mask_py); - cv::Mat global_mask = _py_to_cv2(global_mask_py); - - auto metric = PatchSSDDistanceMetric(patch_size); - cv::Mat result = Inpainting(source, mask, global_mask, &metric).run(PM_verbose, false, PM_seed); - return _cv2_to_py(result); -} - -PM_mat_t PM_inpaint2_regularity(PM_mat_t source_py, PM_mat_t mask_py, PM_mat_t global_mask_py, PM_mat_t ijmap_py, int patch_size, float guide_weight) { - cv::Mat source = _py_to_cv2(source_py); - cv::Mat mask = _py_to_cv2(mask_py); - cv::Mat global_mask = _py_to_cv2(global_mask_py); - cv::Mat ijmap = _py_to_cv2(ijmap_py); - - auto metric = RegularityGuidedPatchDistanceMetricV2(patch_size, ijmap, guide_weight); - cv::Mat result = Inpainting(source, mask, global_mask, &metric).run(PM_verbose, false, PM_seed); - return _cv2_to_py(result); -} - -int _dtype_py_to_cv(int dtype_py) { - switch (dtype_py) { - case PM_UINT8: return CV_8U; - case PM_INT8: return CV_8S; - case PM_UINT16: return CV_16U; - case PM_INT16: return CV_16S; - case PM_INT32: return CV_32S; - case PM_FLOAT32: return 
CV_32F; - case PM_FLOAT64: return CV_64F; - } - - return CV_8U; -} - -int _dtype_cv_to_py(int dtype_cv) { - switch (dtype_cv) { - case CV_8U: return PM_UINT8; - case CV_8S: return PM_INT8; - case CV_16U: return PM_UINT16; - case CV_16S: return PM_INT16; - case CV_32S: return PM_INT32; - case CV_32F: return PM_FLOAT32; - case CV_64F: return PM_FLOAT64; - } - - return PM_UINT8; -} - -cv::Mat _py_to_cv2(PM_mat_t pymat) { - // Convert the Python-side dtype code to an OpenCV depth, then pack in the channel count. - int dtype = _dtype_py_to_cv(pymat.dtype); - dtype = CV_MAKETYPE(dtype, pymat.shape.channels); - return cv::Mat(cv::Size(pymat.shape.width, pymat.shape.height), dtype, pymat.data_ptr).clone(); -} - -PM_mat_t _cv2_to_py(cv::Mat cvmat) { - PM_shape_t shape = {cvmat.size().width, cvmat.size().height, cvmat.channels()}; - int dtype = _dtype_cv_to_py(cvmat.depth()); - size_t dsize = cvmat.total() * cvmat.elemSize(); - - void *data_ptr = reinterpret_cast<void *>(malloc(dsize)); - memcpy(data_ptr, reinterpret_cast<void *>(cvmat.data), dsize); - - return PM_mat_t {data_ptr, shape, dtype}; -} - diff --git a/spaces/bioriAsaeru/text-to-voice/Attack of the 50 Foot Mandy Movie Free Download HD Dont Miss This Hilarious Parody.md b/spaces/bioriAsaeru/text-to-voice/Attack of the 50 Foot Mandy Movie Free Download HD Dont Miss This Hilarious Parody.md deleted file mode 100644 index 8b25c2a34045e679dcf44554abe29f1913be1114..0000000000000000000000000000000000000000 --- a/spaces/bioriAsaeru/text-to-voice/Attack of the 50 Foot Mandy Movie Free Download HD Dont Miss This Hilarious Parody.md +++ /dev/null @@ -1,9 +0,0 @@ - -
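For readers unfamiliar with OpenCV type ids, the `CV_MAKETYPE` packing used by `_py_to_cv2` in the deleted bindings above can be sketched in Python. This is a minimal illustration only: the depth constants are OpenCV's standard codes hard-coded by hand, and `cv_maketype` is a stand-in for the C macro, not part of the original bindings.

```python
# Sketch of OpenCV's type-id packing, mirroring _dtype_py_to_cv + CV_MAKETYPE.
# Depth codes are OpenCV's standard values (CV_8U == 0, ..., CV_64F == 6).
CV_8U, CV_8S, CV_16U, CV_16S, CV_32S, CV_32F, CV_64F = range(7)

CV_CN_SHIFT = 3  # channel count is stored above the 3 depth bits


def cv_maketype(depth: int, channels: int) -> int:
    """Pack a depth code and a channel count into a single OpenCV type id."""
    return depth + ((channels - 1) << CV_CN_SHIFT)


# CV_8UC3 (8-bit, 3 channels) is type id 16 in OpenCV.
print(cv_maketype(CV_8U, 3))   # 16
print(cv_maketype(CV_32F, 1))  # 5, i.e. CV_32F itself
```

Note why the in-place fix above matters: `CV_MAKETYPE` must receive the already-converted OpenCV depth, not the Python-side enum value, or the packed type id is wrong whenever the two code spaces differ.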

Our program was able to record when you went to one particular adult site. I even spent 3 hours putting together those movies: one is a capture from your monitor, and the second is footage from a livecam. It turned out very funny!!

-

Attack of the 50 Foot Mandy movie free download hd


Downloadhttps://urloso.com/2uyRRC



-

My software managed to capture the moment you went to one particular adult page. I also spent 3 hours gluing together two movies: one is a picture from the display, and the second is footage from the web cam. It ended up pretty hilarious.

-

My application was able to record when you visited one porn webpage. I actually spent three hours putting together two movies, one of which is a capture from the screen, and the second is footage from your cam. It came out pretty humorous!

-

Narrator: German submarines were technological wonders that were transforming the nature of warfare. Three months before the Lusitania set sail, the German High Command had launched a full-scale attack on ships entering the war zone around Great Britain, beginning a campaign that would send hundreds of ships to the bottom, including vessels flying the American flag. The captain and crew of the Lusitania dismissed fears of submarines, and encouraged passengers to enjoy the elegant amenities on board the 787-foot luxury liner. Vacationing couples on the lower decks enjoyed shuffleboard, while the wealthy travelers in first class were served high tea in the Verandah Café. From intercepted communications, the British knew the German submarine U-20 was lurking in the path of the Lusitania. Yet they chose not to send destroyers out to meet the ship and escort it into Liverpool. Within the halls of the British Admiralty, some argued that if the Lusitania was lost, it might precipitate American entry into the war.

-

aaccfb2cb3
-
-
\ No newline at end of file diff --git a/spaces/bioriAsaeru/text-to-voice/Download Clash Of The Titans 2010 3D HSBS 1080p X264 16GB YIFY Torrent KickassTorrentsgolke How to Get It Fast and Easy.md b/spaces/bioriAsaeru/text-to-voice/Download Clash Of The Titans 2010 3D HSBS 1080p X264 16GB YIFY Torrent KickassTorrentsgolke How to Get It Fast and Easy.md deleted file mode 100644 index e15bba7a20b1d3ecbd57445d5f3eca8eb1e1afd9..0000000000000000000000000000000000000000 --- a/spaces/bioriAsaeru/text-to-voice/Download Clash Of The Titans 2010 3D HSBS 1080p X264 16GB YIFY Torrent KickassTorrentsgolke How to Get It Fast and Easy.md +++ /dev/null @@ -1,6 +0,0 @@ -

Download Clash Of The Titans 2010 3D HSBS 1080p X264 16GB YIFY Torrent KickassTorrentsgolke


Download Filehttps://urloso.com/2uyPBg



-
- aaccfb2cb3
-
-
-

diff --git a/spaces/bioriAsaeru/text-to-voice/HD Online Player (Kariera Nicosia Dyzmy 720p Torrent).md b/spaces/bioriAsaeru/text-to-voice/HD Online Player (Kariera Nicosia Dyzmy 720p Torrent).md deleted file mode 100644 index 894d88d739b63aa9eccc46de27b4e23f57a9e22c..0000000000000000000000000000000000000000 --- a/spaces/bioriAsaeru/text-to-voice/HD Online Player (Kariera Nicosia Dyzmy 720p Torrent).md +++ /dev/null @@ -1,25 +0,0 @@ - -

How to Watch Kariera Nikosia Dyzmy Online in HD Quality

-

Kariera Nikosia Dyzmy is a 2002 Polish comedy film directed by Jacek Bromski and starring Cezary Pazura as Nikodem "Nikoś" Dyzma, a small-time musician who becomes a politician by accident. The film is a satire of the Polish political scene, loosely based on Tadeusz Dołęga-Mostowicz's 1932 novel Kariera Nikodema Dyzmy.

-

HD Online Player (Kariera Nicosia Dyzmy 720p Torrent)


DOWNLOADhttps://urloso.com/2uyPk9



-

If you want to watch Kariera Nikosia Dyzmy online in HD quality, you might be interested in downloading the torrent file from Ulož.to Disk[^1^], a popular file-sharing platform. The torrent points to the film in 720p resolution and HDTV format, with x264 encoding and Polish audio. The torrent file itself is only 122 kB; how quickly the film downloads depends on your internet speed and the number of seeders.

-

To download the torrent file, you need to have a torrent client installed on your device, such as BitTorrent, uTorrent, or qBittorrent. You also need to have a VPN service to protect your privacy and security while torrenting. Once you have these tools, you can follow these steps:

-
    -
  1. Go to this link and click on "Download fast" or "Download slowly".
  2. Open the downloaded torrent file with your torrent client and choose a folder to save the film.
  3. Wait for the download to finish and enjoy watching Kariera Nikosia Dyzmy online in HD quality.
-
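Step 2 above hands the .torrent file to a client; under the hood, that file is just bencoded metadata (a dictionary of strings, integers, lists, and nested dictionaries). As a rough illustration of the format, here is a minimal decoder sketch. This is not a substitute for a real torrent client, and the sample "torrent" at the bottom is made-up data, not the actual file from Ulož.to.

```python
# Minimal bencode decoder sketch. .torrent files are bencoded dictionaries;
# real clients should use a maintained library instead of this toy.

def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i + 1]
    if c == b"i":                      # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                      # list: l<items>e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                      # dict: d<key><value>...e
        i += 1
        d = {}
        while data[i:i + 1] != b"e":
            key, i = bdecode(data, i)
            val, i = bdecode(data, i)
            d[key] = val
        return d, i + 1
    # byte string: <length>:<bytes>
    colon = data.index(b":", i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length


# A toy "torrent": a dict with a file name and a piece length.
sample = b"d4:name9:movie.mkv12:piece lengthi262144ee"
meta, _ = bdecode(sample)
print(meta)  # {b'name': b'movie.mkv', b'piece length': 262144}
```

A real .torrent additionally carries the tracker URL and SHA-1 piece hashes, which is what lets the client verify each downloaded chunk.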

Alternatively, you can stream Kariera Nikosia Dyzmy online by using an online player such as SoundCloud[^3^]. SoundCloud is a music and podcast streaming platform that allows you to listen to millions of tracks for free. You can also upload your own audio files and share them with others.

-

To stream Kariera Nikosia Dyzmy online using SoundCloud, you need to have a SoundCloud account and a web browser that supports HTML5 audio. You can also use the SoundCloud app on your mobile device. Once you have these tools, you can follow these steps:

-

-
    -
  1. Go to this link and log in to your SoundCloud account.
  2. Click on the play button and enjoy listening to Kariera Nikosia Dyzmy online in HD quality.
-

We hope this article has helped you find out how to watch Kariera Nikosia Dyzmy online in HD quality using different methods. If you have any questions or feedback, please let us know in the comments section below.

- -

Kariera Nikosia Dyzmy is a hilarious and witty film that mocks the corruption and incompetence of Polish politicians and the media. The film features many memorable scenes and dialogues, such as Nikoś's speech in parliament, his interview with a journalist, his encounter with a mafia boss, and his romance with a singer. The film also has a catchy soundtrack composed by Michał Lorenc and performed by various artists.

-

The film was a box office hit in Poland, earning over 17 million złoty and attracting over 2.5 million viewers. It also received positive reviews from critics and audiences alike, who praised the film's humor, acting, and social commentary. The film won several awards, including the Golden Duck for the best Polish film of 2002, the Polish Film Award for the best screenplay, and the Audience Award at the Polish Film Festival in Gdynia.

-

Kariera Nikosia Dyzmy is a classic of the Polish comedy genre and a must-watch for anyone who enjoys political satire and absurd humor. The film is also a great way to learn more about Polish culture and history, as it references many real-life events and personalities from the early 2000s. If you are looking for a fun and entertaining film to watch online in HD quality, you should definitely check out Kariera Nikosia Dyzmy.

d5da3c52bf
-
-
\ No newline at end of file diff --git a/spaces/bioriAsaeru/text-to-voice/Haar Jeet full movie blu-ray download How to enjoy the 1972 hit film on your device.md b/spaces/bioriAsaeru/text-to-voice/Haar Jeet full movie blu-ray download How to enjoy the 1972 hit film on your device.md deleted file mode 100644 index 32de3fcd6462b510e39631d0ceb7b05c4f2a216d..0000000000000000000000000000000000000000 --- a/spaces/bioriAsaeru/text-to-voice/Haar Jeet full movie blu-ray download How to enjoy the 1972 hit film on your device.md +++ /dev/null @@ -1,6 +0,0 @@ -

Haar Jeet full movie blu-ray download


Download Zip ⚙⚙⚙ https://urloso.com/2uyR76



- - aaccfb2cb3
-
-
-

diff --git a/spaces/birsardar/stable-diffusion-mat-outpainting-primer/torch_utils/ops/__init__.py b/spaces/birsardar/stable-diffusion-mat-outpainting-primer/torch_utils/ops/__init__.py deleted file mode 100644 index ece0ea08fe2e939cc260a1dafc0ab5b391b773d9..0000000000000000000000000000000000000000 --- a/spaces/birsardar/stable-diffusion-mat-outpainting-primer/torch_utils/ops/__init__.py +++ /dev/null @@ -1,9 +0,0 @@ -# Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved. -# -# NVIDIA CORPORATION and its licensors retain all intellectual property -# and proprietary rights in and to this software, related documentation -# and any modifications thereto. Any use, reproduction, disclosure or -# distribution of this software and related documentation without an express -# license agreement from NVIDIA CORPORATION is strictly prohibited. - -# empty diff --git a/spaces/bparks08/falcon-chat-40b-1/app.py b/spaces/bparks08/falcon-chat-40b-1/app.py deleted file mode 100644 index 7d496eb4f9707a897f8035930856a0efa3bb18c9..0000000000000000000000000000000000000000 --- a/spaces/bparks08/falcon-chat-40b-1/app.py +++ /dev/null @@ -1,215 +0,0 @@ -import argparse -import os - -import gradio as gr -from text_generation import Client - -TITLE = """

🚀 Falcon-Chat demo

""" -USER_NAME = "User" -BOT_NAME = "Falcon" -DEFAULT_INSTRUCTIONS = f"""The following is a conversation between a highly knowledgeable and intelligent AI assistant, called Falcon, and a human user, called User. In the following interactions, User and Falcon will converse in natural language, and Falcon will answer User's questions. Falcon was built to be respectful, polite and inclusive. Falcon was built by the Technology Innovation Institute in Abu Dhabi. Falcon will never decline to answer a question, and always attempts to give an answer that User would be satisfied with. It knows a lot, and always tells the truth. The conversation begins. -""" -RETRY_COMMAND = "/retry" -STOP_STR = f"\n{USER_NAME}:" -STOP_SUSPECT_LIST = [":", "\n", "User"] - -INFERENCE_ENDPOINT = os.environ.get("INFERENCE_ENDPOINT") -INFERENCE_AUTH = os.environ.get("INFERENCE_AUTH") - - -def chat_accordion(): - with gr.Accordion("Parameters", open=False): - temperature = gr.Slider( - minimum=0.1, - maximum=2.0, - value=0.8, - step=0.1, - interactive=True, - label="Temperature", - ) - top_p = gr.Slider( - minimum=0.1, - maximum=0.99, - value=0.9, - step=0.01, - interactive=True, - label="p (nucleus sampling)", - ) - return temperature, top_p - - -def format_chat_prompt(message: str, chat_history, instructions: str) -> str: - instructions = instructions.strip(" ").strip("\n") - prompt = instructions - for turn in chat_history: - user_message, bot_message = turn - prompt = f"{prompt}\n{USER_NAME}: {user_message}\n{BOT_NAME}: {bot_message}" - prompt = f"{prompt}\n{USER_NAME}: {message}\n{BOT_NAME}:" - return prompt - - -def chat(client: Client): - with gr.Column(elem_id="chat_container"): - with gr.Row(): - chatbot = gr.Chatbot(elem_id="chatbot") - with gr.Row(): - inputs = gr.Textbox( - placeholder=f"Hello {BOT_NAME} !!", - label="Type an input and press Enter", - max_lines=3, - ) - - with gr.Row(elem_id="button_container"): - with gr.Column(): - retry_button = gr.Button("♻️ Retry last turn") - 
with gr.Column(): - delete_turn_button = gr.Button("🧽 Delete last turn") - with gr.Column(): - clear_chat_button = gr.Button("✨ Delete all history") - - gr.Examples( - [ - ["Hey Falcon! Any recommendations for my holidays in Abu Dhabi?"], - ["What's the Everett interpretation of quantum mechanics?"], - ["Give me a list of the top 10 dive sites you would recommend around the world."], - ["Can you tell me more about deep-water soloing?"], - ["Can you write a short tweet about the Apache 2.0 release of our latest AI model, Falcon LLM?"], - ], - inputs=inputs, - label="Click on any example and press Enter in the input textbox!", - ) - - with gr.Row(elem_id="param_container"): - with gr.Column(): - temperature, top_p = chat_accordion() - with gr.Column(): - with gr.Accordion("Instructions", open=False): - instructions = gr.Textbox( - placeholder="LLM instructions", - value=DEFAULT_INSTRUCTIONS, - lines=10, - interactive=True, - label="Instructions", - max_lines=16, - show_label=False, - ) - - def run_chat(message: str, chat_history, instructions: str, temperature: float, top_p: float): - if not message or (message == RETRY_COMMAND and len(chat_history) == 0): - yield chat_history - return - - if message == RETRY_COMMAND and chat_history: - prev_turn = chat_history.pop(-1) - user_message, _ = prev_turn - message = user_message - - prompt = format_chat_prompt(message, chat_history, instructions) - chat_history = chat_history + [[message, ""]] - stream = client.generate_stream( - prompt, - do_sample=True, - max_new_tokens=1024, - stop_sequences=[STOP_STR, "<|endoftext|>"], - temperature=temperature, - top_p=top_p, - ) - acc_text = "" - for idx, response in enumerate(stream): - text_token = response.token.text - - if response.details: - return - - if text_token in STOP_SUSPECT_LIST: - acc_text += text_token - continue - - if idx == 0 and text_token.startswith(" "): - text_token = text_token[1:] - - acc_text += text_token - last_turn = list(chat_history.pop(-1)) - 
last_turn[-1] += acc_text - chat_history = chat_history + [last_turn] - yield chat_history - acc_text = "" - - def delete_last_turn(chat_history): - if chat_history: - chat_history.pop(-1) - return {chatbot: gr.update(value=chat_history)} - - def run_retry(message: str, chat_history, instructions: str, temperature: float, top_p: float): - yield from run_chat(RETRY_COMMAND, chat_history, instructions, temperature, top_p) - - def clear_chat(): - return [] - - inputs.submit( - run_chat, - [inputs, chatbot, instructions, temperature, top_p], - outputs=[chatbot], - show_progress=False, - ) - inputs.submit(lambda: "", inputs=None, outputs=inputs) - delete_turn_button.click(delete_last_turn, inputs=[chatbot], outputs=[chatbot]) - retry_button.click( - run_retry, - [inputs, chatbot, instructions, temperature, top_p], - outputs=[chatbot], - show_progress=False, - ) - clear_chat_button.click(clear_chat, [], chatbot) - - -def get_demo(client: Client): - with gr.Blocks( - # css=None - # css="""#chat_container {width: 700px; margin-left: auto; margin-right: auto;} - # #button_container {width: 700px; margin-left: auto; margin-right: auto;} - # #param_container {width: 700px; margin-left: auto; margin-right: auto;}""" - css="""#chatbot { - font-size: 14px; - min-height: 300px; -}""" - ) as demo: - gr.HTML(TITLE) - - with gr.Row(): - with gr.Column(): - gr.Image("home-banner.jpg", elem_id="banner-image", show_label=False) - with gr.Column(): - gr.Markdown( - """**Chat with [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct), brainstorm ideas, discuss your holiday plans, and more!** - - ✨ This demo is powered by [Falcon-40B](https://huggingface.co/tiiuae/falcon-40b), finetuned on the [Baize](https://github.com/project-baize/baize-chatbot) dataset, and running with [Text Generation Inference](https://github.com/huggingface/text-generation-inference). 
[Falcon-40B](https://huggingface.co/tiiuae/falcon-40b) is a state-of-the-art large language model built by the [Technology Innovation Institute](https://www.tii.ae) in Abu Dhabi. It is trained on 1 trillion tokens (including [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb)) and available under the Apache 2.0 license. It currently holds the 🥇 1st place on the [🤗 Open LLM leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). This demo is made available by the [HuggingFace H4 team](https://huggingface.co/HuggingFaceH4). - - 🧪 This is only a **first experimental preview**: the [H4 team](https://huggingface.co/HuggingFaceH4) intends to provide increasingly capable versions of Falcon Chat in the future, based on improved datasets and RLHF/RLAIF. - - 👀 **Learn more about Falcon LLM:** [falconllm.tii.ae](https://falconllm.tii.ae/) - - ➡️️ **Intended Use**: this demo is intended to showcase an early finetuning of [Falcon-40B](https://huggingface.co/tiiuae/falcon-40b), to illustrate the impact (and limitations) of finetuning on a dataset of conversations and instructions. We encourage the community to further build upon the base model, and to create even better instruct/chat versions! - - ⚠️ **Limitations**: the model can and will produce factually incorrect information, hallucinating facts and actions. As it has not undergone any advanced tuning/alignment, it can produce problematic outputs, especially if prompted to do so. Finally, this demo is limited to a session length of about 1,000 words. 
- """ - ) - - chat(client) - - return demo - - -if __name__ == "__main__": - parser = argparse.ArgumentParser("Playground Demo") - parser.add_argument( - "--addr", - type=str, - required=False, - default=INFERENCE_ENDPOINT, - ) - args = parser.parse_args() - client = Client(args.addr, headers={"Authorization": f"Basic {INFERENCE_AUTH}"}) - demo = get_demo(client) - demo.queue(max_size=128, concurrency_count=16) - demo.launch() diff --git a/spaces/brjathu/HMR2.0/vendor/detectron2/tests/__init__.py b/spaces/brjathu/HMR2.0/vendor/detectron2/tests/__init__.py deleted file mode 100644 index 9020c2df23e2af280b7bb168b996ae9eaf312eb8..0000000000000000000000000000000000000000 --- a/spaces/brjathu/HMR2.0/vendor/detectron2/tests/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. diff --git a/spaces/camenduru-com/VITS-Umamusume-voice-synthesizer/train_ms.py b/spaces/camenduru-com/VITS-Umamusume-voice-synthesizer/train_ms.py deleted file mode 100644 index 83f19c8c4fe64fd84a3bcb62ed06e673b6833e07..0000000000000000000000000000000000000000 --- a/spaces/camenduru-com/VITS-Umamusume-voice-synthesizer/train_ms.py +++ /dev/null @@ -1,306 +0,0 @@ -import os -import json -import argparse -import itertools -import math -import torch -from torch import nn, optim -from torch.nn import functional as F -from torch.utils.data import DataLoader -from torch.utils.tensorboard import SummaryWriter -import torch.multiprocessing as mp -import torch.distributed as dist -from torch.nn.parallel import DistributedDataParallel as DDP -from torch.cuda.amp import autocast, GradScaler -from tqdm import tqdm - -import librosa -import logging - -logging.getLogger('numba').setLevel(logging.WARNING) - -import commons -import utils -from data_utils import ( - TextAudioSpeakerLoader, - TextAudioSpeakerCollate, - DistributedBucketSampler -) -from models import ( - SynthesizerTrn, - MultiPeriodDiscriminator, -) -from losses import ( - generator_loss, - discriminator_loss, - 
feature_loss, - kl_loss -) -from mel_processing import mel_spectrogram_torch, spec_to_mel_torch -from text.symbols import symbols - - -torch.backends.cudnn.benchmark = True -global_step = 0 - - -def main(): - """Assume Single Node Multi GPUs Training Only""" - assert torch.cuda.is_available(), "CPU training is not allowed." - - n_gpus = torch.cuda.device_count() - os.environ['MASTER_ADDR'] = 'localhost' - os.environ['MASTER_PORT'] = '8000' - - hps = utils.get_hparams() - mp.spawn(run, nprocs=n_gpus, args=(n_gpus, hps,)) - - -def run(rank, n_gpus, hps): - global global_step - if rank == 0: - logger = utils.get_logger(hps.model_dir) - logger.info(hps) - utils.check_git_hash(hps.model_dir) - writer = SummaryWriter(log_dir=hps.model_dir) - writer_eval = SummaryWriter(log_dir=os.path.join(hps.model_dir, "eval")) - - dist.init_process_group(backend='nccl', init_method='env://', world_size=n_gpus, rank=rank) - torch.manual_seed(hps.train.seed) - torch.cuda.set_device(rank) - - train_dataset = TextAudioSpeakerLoader(hps.data.training_files, hps.data) - train_sampler = DistributedBucketSampler( - train_dataset, - hps.train.batch_size, - [32,300,400,500,600,700,800,900,1000], - num_replicas=n_gpus, - rank=rank, - shuffle=True) - collate_fn = TextAudioSpeakerCollate() - train_loader = DataLoader(train_dataset, num_workers=0, shuffle=False, pin_memory=True, - collate_fn=collate_fn, batch_sampler=train_sampler) - if rank == 0: - eval_dataset = TextAudioSpeakerLoader(hps.data.validation_files, hps.data) - eval_loader = DataLoader(eval_dataset, num_workers=0, shuffle=False, - batch_size=hps.train.batch_size, pin_memory=True, - drop_last=False, collate_fn=collate_fn) - - net_g = SynthesizerTrn( - len(symbols), - hps.data.filter_length // 2 + 1, - hps.train.segment_size // hps.data.hop_length, - n_speakers=hps.data.n_speakers, - **hps.model).cuda(rank) - net_d = MultiPeriodDiscriminator(hps.model.use_spectral_norm).cuda(rank) - optim_g = torch.optim.AdamW( - net_g.parameters(), - 
hps.train.learning_rate, - betas=hps.train.betas, - eps=hps.train.eps) - optim_d = torch.optim.AdamW( - net_d.parameters(), - hps.train.learning_rate, - betas=hps.train.betas, - eps=hps.train.eps) - net_g = DDP(net_g, device_ids=[rank]) - net_d = DDP(net_d, device_ids=[rank]) - - try: - _, _, _, epoch_str = utils.load_checkpoint(hps.model_dir.strip("./drive/MyDrive\\"), net_g, None) - _, _, _, epoch_str = utils.load_checkpoint(utils.latest_checkpoint_path(hps.model_dir, "D_*.pth"), net_d, optim_d) - global_step = (epoch_str - 1) * len(train_loader) - except: - epoch_str = 1 - global_step = 0 - - scheduler_g = torch.optim.lr_scheduler.ExponentialLR(optim_g, gamma=hps.train.lr_decay, last_epoch=epoch_str-2) - scheduler_d = torch.optim.lr_scheduler.ExponentialLR(optim_d, gamma=hps.train.lr_decay, last_epoch=epoch_str-2) - - scaler = GradScaler(enabled=hps.train.fp16_run) - - for epoch in range(epoch_str, hps.train.epochs + 1): - if rank==0: - train_and_evaluate(rank, epoch, hps, [net_g, net_d], [optim_g, optim_d], [scheduler_g, scheduler_d], scaler, [train_loader, eval_loader], logger, [writer, writer_eval]) - else: - train_and_evaluate(rank, epoch, hps, [net_g, net_d], [optim_g, optim_d], [scheduler_g, scheduler_d], scaler, [train_loader, None], None, None) - scheduler_g.step() - scheduler_d.step() - - -def train_and_evaluate(rank, epoch, hps, nets, optims, schedulers, scaler, loaders, logger, writers): - net_g, net_d = nets - optim_g, optim_d = optims - scheduler_g, scheduler_d = schedulers - train_loader, eval_loader = loaders - if writers is not None: - writer, writer_eval = writers - - train_loader.batch_sampler.set_epoch(epoch) - global global_step - - net_g.train() - net_d.train() - for batch_idx, (x, x_lengths, spec, spec_lengths, y, y_lengths, speakers) in enumerate(tqdm(train_loader)): - x, x_lengths = x.cuda(rank, non_blocking=True), x_lengths.cuda(rank, non_blocking=True) - spec, spec_lengths = spec.cuda(rank, non_blocking=True), spec_lengths.cuda(rank, 
non_blocking=True) - y, y_lengths = y.cuda(rank, non_blocking=True), y_lengths.cuda(rank, non_blocking=True) - speakers = speakers.cuda(rank, non_blocking=True) - - with autocast(enabled=hps.train.fp16_run): - y_hat, l_length, attn, ids_slice, x_mask, z_mask,\ - (z, z_p, m_p, logs_p, m_q, logs_q) = net_g(x, x_lengths, spec, spec_lengths, speakers) - - mel = spec_to_mel_torch( - spec, - hps.data.filter_length, - hps.data.n_mel_channels, - hps.data.sampling_rate, - hps.data.mel_fmin, - hps.data.mel_fmax) - y_mel = commons.slice_segments(mel, ids_slice, hps.train.segment_size // hps.data.hop_length) - y_hat_mel = mel_spectrogram_torch( - y_hat.squeeze(1), - hps.data.filter_length, - hps.data.n_mel_channels, - hps.data.sampling_rate, - hps.data.hop_length, - hps.data.win_length, - hps.data.mel_fmin, - hps.data.mel_fmax - ) - - y = commons.slice_segments(y, ids_slice * hps.data.hop_length, hps.train.segment_size) # slice - - # Discriminator - y_d_hat_r, y_d_hat_g, _, _ = net_d(y, y_hat.detach()) - with autocast(enabled=False): - loss_disc, losses_disc_r, losses_disc_g = discriminator_loss(y_d_hat_r, y_d_hat_g) - loss_disc_all = loss_disc - optim_d.zero_grad() - scaler.scale(loss_disc_all).backward() - scaler.unscale_(optim_d) - grad_norm_d = commons.clip_grad_value_(net_d.parameters(), None) - scaler.step(optim_d) - - with autocast(enabled=hps.train.fp16_run): - # Generator - y_d_hat_r, y_d_hat_g, fmap_r, fmap_g = net_d(y, y_hat) - with autocast(enabled=False): - loss_dur = torch.sum(l_length.float()) - loss_mel = F.l1_loss(y_mel, y_hat_mel) * hps.train.c_mel - loss_kl = kl_loss(z_p, logs_q, m_p, logs_p, z_mask) * hps.train.c_kl - - loss_fm = feature_loss(fmap_r, fmap_g) - loss_gen, losses_gen = generator_loss(y_d_hat_g) - loss_gen_all = loss_gen + loss_fm + loss_mel + loss_dur + loss_kl - optim_g.zero_grad() - scaler.scale(loss_gen_all).backward() - scaler.unscale_(optim_g) - grad_norm_g = commons.clip_grad_value_(net_g.parameters(), None) - scaler.step(optim_g) - 
scaler.update() - - if rank==0: - if global_step % hps.train.log_interval == 0: - lr = optim_g.param_groups[0]['lr'] - losses = [loss_disc, loss_gen, loss_fm, loss_mel, loss_dur, loss_kl] - logger.info('Train Epoch: {} [{:.0f}%]'.format( - epoch, - 100. * batch_idx / len(train_loader))) - logger.info([x.item() for x in losses] + [global_step, lr]) - - scalar_dict = {"loss/g/total": loss_gen_all, "loss/d/total": loss_disc_all, "learning_rate": lr, "grad_norm_d": grad_norm_d, "grad_norm_g": grad_norm_g} - scalar_dict.update({"loss/g/fm": loss_fm, "loss/g/mel": loss_mel, "loss/g/dur": loss_dur, "loss/g/kl": loss_kl}) - - scalar_dict.update({"loss/g/{}".format(i): v for i, v in enumerate(losses_gen)}) - scalar_dict.update({"loss/d_r/{}".format(i): v for i, v in enumerate(losses_disc_r)}) - scalar_dict.update({"loss/d_g/{}".format(i): v for i, v in enumerate(losses_disc_g)}) - image_dict = { - "slice/mel_org": utils.plot_spectrogram_to_numpy(y_mel[0].data.cpu().numpy()), - "slice/mel_gen": utils.plot_spectrogram_to_numpy(y_hat_mel[0].data.cpu().numpy()), - "all/mel": utils.plot_spectrogram_to_numpy(mel[0].data.cpu().numpy()), - "all/attn": utils.plot_alignment_to_numpy(attn[0,0].data.cpu().numpy()) - } - utils.summarize( - writer=writer, - global_step=global_step, - images=image_dict, - scalars=scalar_dict) - - if global_step % hps.train.eval_interval == 0: - evaluate(hps, net_g, eval_loader, writer_eval) - utils.save_checkpoint(net_g, optim_g, hps.train.learning_rate, epoch, os.path.join(hps.model_dir, "G_{}.pth".format(global_step))) - utils.save_checkpoint(net_d, optim_d, hps.train.learning_rate, epoch, os.path.join(hps.model_dir, "D_{}.pth".format(global_step))) - old_g=os.path.join(hps.model_dir, "G_{}.pth".format(global_step-2000)) - old_d=os.path.join(hps.model_dir, "D_{}.pth".format(global_step-2000)) - if os.path.exists(old_g): - os.remove(old_g) - if os.path.exists(old_d): - os.remove(old_d) - global_step += 1 - - if rank == 0: - logger.info('====> Epoch: 
{}'.format(epoch)) - - -def evaluate(hps, generator, eval_loader, writer_eval): - generator.eval() - with torch.no_grad(): - for batch_idx, (x, x_lengths, spec, spec_lengths, y, y_lengths, speakers) in enumerate(eval_loader): - x, x_lengths = x.cuda(0), x_lengths.cuda(0) - spec, spec_lengths = spec.cuda(0), spec_lengths.cuda(0) - y, y_lengths = y.cuda(0), y_lengths.cuda(0) - speakers = speakers.cuda(0) - - # remove else - x = x[:1] - x_lengths = x_lengths[:1] - spec = spec[:1] - spec_lengths = spec_lengths[:1] - y = y[:1] - y_lengths = y_lengths[:1] - speakers = speakers[:1] - break - y_hat, attn, mask, *_ = generator.module.infer(x, x_lengths, speakers, max_len=1000) - y_hat_lengths = mask.sum([1,2]).long() * hps.data.hop_length - - mel = spec_to_mel_torch( - spec, - hps.data.filter_length, - hps.data.n_mel_channels, - hps.data.sampling_rate, - hps.data.mel_fmin, - hps.data.mel_fmax) - y_hat_mel = mel_spectrogram_torch( - y_hat.squeeze(1).float(), - hps.data.filter_length, - hps.data.n_mel_channels, - hps.data.sampling_rate, - hps.data.hop_length, - hps.data.win_length, - hps.data.mel_fmin, - hps.data.mel_fmax - ) - image_dict = { - "gen/mel": utils.plot_spectrogram_to_numpy(y_hat_mel[0].cpu().numpy()) - } - audio_dict = { - "gen/audio": y_hat[0,:,:y_hat_lengths[0]] - } - if global_step == 0: - image_dict.update({"gt/mel": utils.plot_spectrogram_to_numpy(mel[0].cpu().numpy())}) - audio_dict.update({"gt/audio": y[0,:,:y_lengths[0]]}) - - utils.summarize( - writer=writer_eval, - global_step=global_step, - images=image_dict, - audios=audio_dict, - audio_sampling_rate=hps.data.sampling_rate - ) - generator.train() - - -if __name__ == "__main__": - main() diff --git a/spaces/carlosalonso/Detection-video/carpeta_deteccion/dev/packaging/build_wheel.sh b/spaces/carlosalonso/Detection-video/carpeta_deteccion/dev/packaging/build_wheel.sh deleted file mode 100644 index 2535a1b99c1406dbd71d0cb5132886800ee7aa48..0000000000000000000000000000000000000000 --- 
a/spaces/carlosalonso/Detection-video/carpeta_deteccion/dev/packaging/build_wheel.sh +++ /dev/null @@ -1,31 +0,0 @@ -#!/bin/bash -# Copyright (c) Facebook, Inc. and its affiliates. -set -ex - -ldconfig # https://github.com/NVIDIA/nvidia-docker/issues/854 - -script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )" -. "$script_dir/pkg_helpers.bash" - -echo "Build Settings:" -echo "CU_VERSION: $CU_VERSION" # e.g. cu101 -echo "D2_VERSION_SUFFIX: $D2_VERSION_SUFFIX" # e.g. +cu101 or "" -echo "PYTHON_VERSION: $PYTHON_VERSION" # e.g. 3.7 -echo "PYTORCH_VERSION: $PYTORCH_VERSION" # e.g. 1.4 - -setup_cuda -setup_wheel_python - -yum install ninja-build -y -ln -sv /usr/bin/ninja-build /usr/bin/ninja || true - -pip_install pip numpy -U -pip_install "torch==$PYTORCH_VERSION" \ - -f https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html - -# use separate directories to allow parallel build -BASE_BUILD_DIR=build/$CU_VERSION-py$PYTHON_VERSION-pt$PYTORCH_VERSION -python setup.py \ - build -b "$BASE_BUILD_DIR" \ - bdist_wheel -b "$BASE_BUILD_DIR/build_dist" -d "wheels/$CU_VERSION/torch$PYTORCH_VERSION" -rm -rf "$BASE_BUILD_DIR" diff --git a/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/DensePose/densepose/modeling/roi_heads/__init__.py b/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/DensePose/densepose/modeling/roi_heads/__init__.py deleted file mode 100644 index 8403589f23ec2ffa8afafcd566ca0b0b7b2671a7..0000000000000000000000000000000000000000 --- a/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/DensePose/densepose/modeling/roi_heads/__init__.py +++ /dev/null @@ -1,6 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -from .v1convx import DensePoseV1ConvXHead -from .deeplab import DensePoseDeepLabHead -from .registry import ROI_DENSEPOSE_HEAD_REGISTRY -from .roi_head import Decoder, DensePoseROIHeads diff --git a/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/PointRend/README.md b/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/PointRend/README.md deleted file mode 100644 index 79d75d506c6f5db710044d3c1cd2583027ac3dbe..0000000000000000000000000000000000000000 --- a/spaces/carlosalonso/Detection-video/carpeta_deteccion/projects/PointRend/README.md +++ /dev/null @@ -1,167 +0,0 @@ -# PointRend: Image Segmentation as Rendering - -Alexander Kirillov, Yuxin Wu, Kaiming He, Ross Girshick - -[[`arXiv`](https://arxiv.org/abs/1912.08193)] [[`BibTeX`](#CitingPointRend)] - -
- -In this repository, we release code for PointRend in Detectron2. PointRend can be flexibly applied to both instance and semantic segmentation tasks by building on top of existing state-of-the-art models. - -## Quick start and visualization - -This [Colab Notebook](https://colab.research.google.com/drive/1isGPL5h5_cKoPPhVL9XhMokRtHDvmMVL) tutorial contains examples of PointRend usage and visualizations of its point sampling stages. - -## Training - -To train a model with 8 GPUs run: -```bash -cd /path/to/detectron2/projects/PointRend -python train_net.py --config-file configs/InstanceSegmentation/pointrend_rcnn_R_50_FPN_1x_coco.yaml --num-gpus 8 -``` - -## Evaluation - -Model evaluation can be done similarly: -```bash -cd /path/to/detectron2/projects/PointRend -python train_net.py --config-file configs/InstanceSegmentation/pointrend_rcnn_R_50_FPN_1x_coco.yaml --eval-only MODEL.WEIGHTS /path/to/model_checkpoint -``` - -# Pretrained Models - -## Instance Segmentation -#### COCO - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| Mask head | Backbone | lr sched | Output resolution | mask AP | mask AP* | model id | download |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| PointRend | R50-FPN |  | 224×224 | 36.2 | 39.7 | 164254221 | model \| metrics |
| PointRend | R50-FPN |  | 224×224 | 38.3 | 41.6 | 164955410 | model \| metrics |
| PointRend | R101-FPN |  | 224×224 | 40.1 | 43.8 |  | model \| metrics |
| PointRend | X101-FPN |  | 224×224 | 41.1 | 44.7 |  | model \| metrics |
- -AP* is COCO mask AP evaluated against the higher-quality LVIS annotations; see the paper for details. -Run `python detectron2/datasets/prepare_cocofied_lvis.py` to prepare GT files for AP* evaluation. -Since LVIS annotations are not exhaustive, `lvis-api` and not `cocoapi` should be used to evaluate AP*. - -#### Cityscapes -Cityscapes model is trained with ImageNet pretraining. - - - - - - - - - - - - - - - - - - - - -
| Mask head | Backbone | lr sched | Output resolution | mask AP | model id | download |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| PointRend | R50-FPN |  | 224×224 | 35.9 | 164255101 | model \| metrics |
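As a quick sanity check, the COCO instance-segmentation numbers above can be transcribed and compared directly. A minimal stdlib sketch (values copied from the table; the lr schedules and the last two model ids are not shown in the table, so they are omitted here):

```python
# COCO instance-segmentation rows from the PointRend table above:
# (mask head, backbone, mask AP, mask AP*).
ROWS = [
    ("PointRend", "R50-FPN", 36.2, 39.7),
    ("PointRend", "R50-FPN", 38.3, 41.6),
    ("PointRend", "R101-FPN", 40.1, 43.8),
    ("PointRend", "X101-FPN", 41.1, 44.7),
]

# AP* minus AP: the gain from evaluating against the higher-quality
# LVIS annotations, as explained in the note under the table.
gaps = [round(ap_star - ap, 1) for _, _, ap, ap_star in ROWS]

# Strongest row by AP* in this table.
best = max(ROWS, key=lambda row: row[3])

print(gaps)     # per-row AP* - AP gap
print(best[1])  # backbone of the best row
```

The roughly constant 3.3–3.7 point gap across backbones matches the paper's observation that mask quality is systematically underestimated by the coarser COCO annotations.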
- - -## Semantic Segmentation - -#### Cityscapes -Cityscapes model is trained with ImageNet pretraining. - - - - - - - - - - - - - - - - - - -
| Method | Backbone | Output resolution | mIoU | model id | download |
| :--- | :--- | :--- | :--- | :--- | :--- |
| SemanticFPN + PointRend | R101-FPN | 1024×2048 | 78.9 | 202576688 | model \| metrics |
- -## Citing PointRend - -If you use PointRend, please use the following BibTeX entry. - -```BibTeX -@InProceedings{kirillov2019pointrend, - title={{PointRend}: Image Segmentation as Rendering}, - author={Alexander Kirillov and Yuxin Wu and Kaiming He and Ross Girshick}, - journal={ArXiv:1912.08193}, - year={2019} -} -``` - -## Citing Implicit PointRend - -If you use Implicit PointRend, please use the following BibTeX entry. - -```BibTeX -@InProceedings{cheng2021pointly, - title={Pointly-Supervised Instance Segmentation}, - author={Bowen Cheng and Omkar Parkhi and Alexander Kirillov}, - journal={ArXiv}, - year={2021} -} -``` diff --git a/spaces/ccds/vits_onnx/export/infer.sh b/spaces/ccds/vits_onnx/export/infer.sh deleted file mode 100644 index 86bc0834148af6f5927e0bb626414a4be64f3586..0000000000000000000000000000000000000000 --- a/spaces/ccds/vits_onnx/export/infer.sh +++ /dev/null @@ -1,2 +0,0 @@ -python vits/inference_onnx.py --onnx_model model/model.onnx \ ---cfg model/config.json --test_file test.txt \ No newline at end of file diff --git a/spaces/cenji1109285052/img-to-music/README.md b/spaces/cenji1109285052/img-to-music/README.md deleted file mode 100644 index ff1948d1b95ee1f8d7a3396aefb285c729d18687..0000000000000000000000000000000000000000 --- a/spaces/cenji1109285052/img-to-music/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: Img To Music -emoji: 🌅🎶 -colorFrom: green -colorTo: purple -sdk: gradio -sdk_version: 3.16.0 -app_file: app.py -pinned: true -duplicated_from: fffiloni/img-to-music ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference \ No newline at end of file diff --git a/spaces/chendl/compositional_test/multimodal/YOLOX/yolox/data/datasets/__init__.py b/spaces/chendl/compositional_test/multimodal/YOLOX/yolox/data/datasets/__init__.py deleted file mode 100644 index 0b6fd8ec4cecffe94d80084b57f3b966e4f01def..0000000000000000000000000000000000000000 ---
a/spaces/chendl/compositional_test/multimodal/YOLOX/yolox/data/datasets/__init__.py +++ /dev/null @@ -1,9 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding:utf-8 -*- -# Copyright (c) Megvii, Inc. and its affiliates. - -from .coco import COCODataset -from .coco_classes import COCO_CLASSES -from .datasets_wrapper import CacheDataset, ConcatDataset, Dataset, MixConcatDataset -from .mosaicdetection import MosaicDetection -from .voc import VOCDetection diff --git a/spaces/chendl/compositional_test/multimodal/local_train.sh b/spaces/chendl/compositional_test/multimodal/local_train.sh deleted file mode 100644 index 7555e1affba77826372d076a6eb70b4686a46c47..0000000000000000000000000000000000000000 --- a/spaces/chendl/compositional_test/multimodal/local_train.sh +++ /dev/null @@ -1,38 +0,0 @@ -LAION_DATA=/gpfs/u/home/LMCG/LMCGljnn/scratch-shared/junyan/raw/karpathy_coco_wds_full_ground/{00000..00066}.tar -PILE_DATA=/gpfs/u/home/LMCG/LMCGljnn/scratch-shared/junyan/raw/the_pile/{00000..01925}.tar -SAVE_DIR=checkpoints_local/debug0922 -mkdir -p ${SAVE_DIR} -cp $0 ${SAVE_DIR}/ - -export TRANSFORMERS_OFFLINE=1 -torchrun --nnodes=1 --nproc_per_node=6 --master_port=14288 open_flamingo/train/train.py \ ---run_name ${SAVE_DIR} \ ---vision_encoder_path ViT-L-14 \ ---vision_encoder_pretrained datacomp_xl_s13b_b90k \ ---lm_path EleutherAI/pythia-1.4b \ ---tokenizer_path EleutherAI/pythia-1.4b \ ---dataset_resampled \ ---laion_shards ${LAION_DATA} \ ---pile_shards ${PILE_DATA} \ ---batch_size_laion 14 \ ---batch_size_pile 2 \ ---workers=4 \ ---lr_scheduler cosine \ ---warmup_steps 200 \ ---num_steps 4000 \ ---checkpoint_activations \ ---delete_previous_checkpoint \ ---gradient_accumulation_steps 1 \ ---save_interval 100 \ ---logging_steps 2 \ ---skip_delete_pattern 500 \ ---precision amp_fp16 \ ---learning_rate 1.0e-5 \ ---add_visual_token \ ---max-length 960 \ ---loss_multiplier_det 0.025 \ ---add_box \ ---expand \ ---use_format_v2 \ ---resume_from_checkpoint 
checkpoints/091701_pythiaS_previsual_fix/checkpoint_20000.pt \ ---restart diff --git a/spaces/chinhon/Commentaries_Headlines_Generator/README.md b/spaces/chinhon/Commentaries_Headlines_Generator/README.md deleted file mode 100644 index dfeb70d6af34af72d56570aecff873bfd0cb51ed..0000000000000000000000000000000000000000 --- a/spaces/chinhon/Commentaries_Headlines_Generator/README.md +++ /dev/null @@ -1,37 +0,0 @@ ---- -title: Commentaries_Headlines_Generator -emoji: 📉 -colorFrom: indigo -colorTo: purple -sdk: gradio -app_file: app.py -pinned: false ---- - -# Configuration - -`title`: _string_ -Display title for the Space - -`emoji`: _string_ -Space emoji (emoji-only character allowed) - -`colorFrom`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`colorTo`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`sdk`: _string_ -Can be either `gradio`, `streamlit`, or `static` - -`sdk_version` : _string_ -Only applicable for `streamlit` SDK. -See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions. - -`app_file`: _string_ -Path to your main application file (which contains either `gradio` or `streamlit` Python code, or `static` html code). -Path is relative to the root of the repository. - -`pinned`: _boolean_ -Whether the Space stays on top of your list. 
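The next deleted file, charset_normalizer's `legacy.py`, wraps the modern `from_bytes` API behind a chardet-style `detect()` function. The chardet-compatibility tail of that function does two things: a BOM-prefixed UTF-8 result is reported as the distinct `utf_8_sig` codec, and (unless `should_rename_legacy` is set) encoding names are mapped through `CHARDET_CORRESPONDENCE` back to chardet's spellings. A self-contained sketch of just that post-processing step, using a toy correspondence table (an assumption for illustration; the real table lives in `charset_normalizer.constant` and covers many more codecs):

```python
# Toy correspondence table -- an assumption; the real CHARDET_CORRESPONDENCE
# in charset_normalizer.constant is much larger.
CHARDET_CORRESPONDENCE = {
    "cp1252": "Windows-1252",
    "utf_16": "UTF-16",
}

def normalize_encoding_name(encoding, has_bom=False, should_rename_legacy=False):
    """Mirror the tail of legacy.detect(): BOM handling, then chardet renaming."""
    if encoding is None:
        return None
    # chardet reports BOM-prefixed UTF-8 as a distinct 'utf-8-sig' codec,
    # while charset_normalizer strips the BOM during detection.
    if encoding == "utf_8" and has_bom:
        encoding += "_sig"
    # By default, rename to chardet's legacy spelling for drop-in compatibility.
    if not should_rename_legacy and encoding in CHARDET_CORRESPONDENCE:
        encoding = CHARDET_CORRESPONDENCE[encoding]
    return encoding

print(normalize_encoding_name("utf_8", has_bom=True))  # utf_8_sig
print(normalize_encoding_name("cp1252"))               # Windows-1252
```

This mirrors why the wrapper exists at all: callers migrating from chardet can keep comparing against the names chardet used to return, and opt into the modern names with `should_rename_legacy=True`.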
diff --git a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/charset_normalizer/legacy.py b/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/charset_normalizer/legacy.py deleted file mode 100644 index 43aad21a9dd1c08c8d31e38908485d46b14efbd2..0000000000000000000000000000000000000000 --- a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/charset_normalizer/legacy.py +++ /dev/null @@ -1,54 +0,0 @@ -from typing import Any, Dict, Optional, Union -from warnings import warn - -from .api import from_bytes -from .constant import CHARDET_CORRESPONDENCE - - -def detect( - byte_str: bytes, should_rename_legacy: bool = False, **kwargs: Any -) -> Dict[str, Optional[Union[str, float]]]: - """ - chardet legacy method - Detect the encoding of the given byte string. It should be mostly backward-compatible. - Encoding name will match Chardet own writing whenever possible. (Not on encoding name unsupported by it) - This function is deprecated and should be used to migrate your project easily, consult the documentation for - further information. Not planned for removal. - - :param byte_str: The byte sequence to examine. - :param should_rename_legacy: Should we rename legacy encodings - to their more modern equivalents? 
- """ - if len(kwargs): - warn( - f"charset-normalizer disregard arguments '{','.join(list(kwargs.keys()))}' in legacy function detect()" - ) - - if not isinstance(byte_str, (bytearray, bytes)): - raise TypeError( # pragma: nocover - "Expected object of type bytes or bytearray, got: " - "{0}".format(type(byte_str)) - ) - - if isinstance(byte_str, bytearray): - byte_str = bytes(byte_str) - - r = from_bytes(byte_str).best() - - encoding = r.encoding if r is not None else None - language = r.language if r is not None and r.language != "Unknown" else "" - confidence = 1.0 - r.chaos if r is not None else None - - # Note: CharsetNormalizer does not return 'UTF-8-SIG' as the sig get stripped in the detection/normalization process - # but chardet does return 'utf-8-sig' and it is a valid codec name. - if r is not None and encoding == "utf_8" and r.bom: - encoding += "_sig" - - if should_rename_legacy is False and encoding in CHARDET_CORRESPONDENCE: - encoding = CHARDET_CORRESPONDENCE[encoding] - - return { - "encoding": encoding, - "language": language, - "confidence": confidence, - } diff --git a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/chromadb/segment/impl/manager/local.py b/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/chromadb/segment/impl/manager/local.py deleted file mode 100644 index fdac85fd1751490d5bb2a560f158a2862c28a2f3..0000000000000000000000000000000000000000 --- a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/chromadb/segment/impl/manager/local.py +++ /dev/null @@ -1,132 +0,0 @@ -from chromadb.segment import ( - SegmentImplementation, - SegmentManager, - MetadataReader, - VectorReader, - S, -) -from chromadb.config import System, get_class -from chromadb.db.system import SysDB -from overrides import override -from enum import Enum -from chromadb.types import Collection, Segment, SegmentScope, Metadata -from typing import Dict, Type, Sequence, Optional, cast -from uuid 
import UUID, uuid4 -from collections import defaultdict - - -class SegmentType(Enum): - SQLITE = "urn:chroma:segment/metadata/sqlite" - HNSW_LOCAL_MEMORY = "urn:chroma:segment/vector/hnsw-local-memory" - - -SEGMENT_TYPE_IMPLS = { - SegmentType.SQLITE: "chromadb.segment.impl.metadata.sqlite.SqliteMetadataSegment", - SegmentType.HNSW_LOCAL_MEMORY: "chromadb.segment.impl.vector.local_hnsw.LocalHnswSegment", -} - - -class LocalSegmentManager(SegmentManager): - _sysdb: SysDB - _system: System - _instances: Dict[UUID, SegmentImplementation] - _segment_cache: Dict[UUID, Dict[SegmentScope, Segment]] - - def __init__(self, system: System): - super().__init__(system) - self._sysdb = self.require(SysDB) - self._system = system - self._instances = {} - self._segment_cache = defaultdict(dict) - - @override - def start(self) -> None: - for instance in self._instances.values(): - instance.start() - super().start() - - @override - def stop(self) -> None: - for instance in self._instances.values(): - instance.stop() - super().stop() - - @override - def reset_state(self) -> None: - for instance in self._instances.values(): - instance.stop() - self._instances = {} - self._segment_cache = defaultdict(dict) - super().reset_state() - - @override - def create_segments(self, collection: Collection) -> Sequence[Segment]: - vector_segment = _segment( - SegmentType.HNSW_LOCAL_MEMORY, SegmentScope.VECTOR, collection - ) - metadata_segment = _segment( - SegmentType.SQLITE, SegmentScope.METADATA, collection - ) - return [vector_segment, metadata_segment] - - @override - def delete_segments(self, collection_id: UUID) -> Sequence[UUID]: - segments = self._sysdb.get_segments(collection=collection_id) - for segment in segments: - if segment["id"] in self._instances: - del self._instances[segment["id"]] - if collection_id in self._segment_cache: - if segment["scope"] in self._segment_cache[collection_id]: - del self._segment_cache[collection_id][segment["scope"]] - del 
self._segment_cache[collection_id] - return [s["id"] for s in segments] - - @override - def get_segment(self, collection_id: UUID, type: Type[S]) -> S: - if type == MetadataReader: - scope = SegmentScope.METADATA - elif type == VectorReader: - scope = SegmentScope.VECTOR - else: - raise ValueError(f"Invalid segment type: {type}") - - if scope not in self._segment_cache[collection_id]: - segments = self._sysdb.get_segments(collection=collection_id, scope=scope) - known_types = set([k.value for k in SEGMENT_TYPE_IMPLS.keys()]) - # Get the first segment of a known type - segment = next(filter(lambda s: s["type"] in known_types, segments)) - self._segment_cache[collection_id][scope] = segment - - instance = self._instance(self._segment_cache[collection_id][scope]) - return cast(S, instance) - - def _cls(self, segment: Segment) -> Type[SegmentImplementation]: - classname = SEGMENT_TYPE_IMPLS[SegmentType(segment["type"])] - cls = get_class(classname, SegmentImplementation) - return cls - - def _instance(self, segment: Segment) -> SegmentImplementation: - if segment["id"] not in self._instances: - cls = self._cls(segment) - instance = cls(self._system, segment) - instance.start() - self._instances[segment["id"]] = instance - return self._instances[segment["id"]] - - -def _segment(type: SegmentType, scope: SegmentScope, collection: Collection) -> Segment: - """Create a metadata dict, propagating metadata correctly for the given segment type.""" - cls = get_class(SEGMENT_TYPE_IMPLS[type], SegmentImplementation) - collection_metadata = collection.get("metadata", None) - metadata: Optional[Metadata] = None - if collection_metadata: - metadata = cls.propagate_collection_metadata(collection_metadata) - - return Segment( - id=uuid4(), - type=type.value, - scope=scope, - topic=collection["topic"], - collection=collection["id"], - metadata=metadata, - ) diff --git a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/agl.py 
b/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/agl.py deleted file mode 100644 index d6994628dd3a68f9592efd8c09b0bf56c2a67056..0000000000000000000000000000000000000000 --- a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/agl.py +++ /dev/null @@ -1,5233 +0,0 @@ -# -*- coding: utf-8 -*- -# The tables below are taken from -# https://github.com/adobe-type-tools/agl-aglfn/raw/4036a9ca80a62f64f9de4f7321a9a045ad0ecfd6/glyphlist.txt -# and -# https://github.com/adobe-type-tools/agl-aglfn/raw/4036a9ca80a62f64f9de4f7321a9a045ad0ecfd6/aglfn.txt -""" -Interface to the Adobe Glyph List - -This module exists to convert glyph names from the Adobe Glyph List -to their Unicode equivalents. Example usage: - - >>> from fontTools.agl import toUnicode - >>> toUnicode("nahiragana") - 'な' - -It also contains two dictionaries, ``UV2AGL`` and ``AGL2UV``, which map from -Unicode codepoints to AGL names and vice versa: - - >>> import fontTools - >>> fontTools.agl.UV2AGL[ord("?")] - 'question' - >>> fontTools.agl.AGL2UV["wcircumflex"] - 373 - -This is used by fontTools when it has to construct glyph names for a font which -doesn't include any (e.g. format 3.0 post tables). -""" - -from fontTools.misc.textTools import tostr -import re - - -_aglText = """\ -# ----------------------------------------------------------- -# Copyright 2002-2019 Adobe (http://www.adobe.com/). -# -# Redistribution and use in source and binary forms, with or -# without modification, are permitted provided that the -# following conditions are met: -# -# Redistributions of source code must retain the above -# copyright notice, this list of conditions and the following -# disclaimer. -# -# Redistributions in binary form must reproduce the above -# copyright notice, this list of conditions and the following -# disclaimer in the documentation and/or other materials -# provided with the distribution. 
-# -# Neither the name of Adobe nor the names of its contributors -# may be used to endorse or promote products derived from this -# software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND -# CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, -# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF -# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, -# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT -# NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) -# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN -# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR -# OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -# ----------------------------------------------------------- -# Name: Adobe Glyph List -# Table version: 2.0 -# Date: September 20, 2002 -# URL: https://github.com/adobe-type-tools/agl-aglfn -# -# Format: two semicolon-delimited fields: -# (1) glyph name--upper/lowercase letters and digits -# (2) Unicode scalar value--four uppercase hexadecimal digits -# -A;0041 -AE;00C6 -AEacute;01FC -AEmacron;01E2 -AEsmall;F7E6 -Aacute;00C1 -Aacutesmall;F7E1 -Abreve;0102 -Abreveacute;1EAE -Abrevecyrillic;04D0 -Abrevedotbelow;1EB6 -Abrevegrave;1EB0 -Abrevehookabove;1EB2 -Abrevetilde;1EB4 -Acaron;01CD -Acircle;24B6 -Acircumflex;00C2 -Acircumflexacute;1EA4 -Acircumflexdotbelow;1EAC -Acircumflexgrave;1EA6 -Acircumflexhookabove;1EA8 -Acircumflexsmall;F7E2 -Acircumflextilde;1EAA -Acute;F6C9 -Acutesmall;F7B4 -Acyrillic;0410 -Adblgrave;0200 -Adieresis;00C4 -Adieresiscyrillic;04D2 -Adieresismacron;01DE -Adieresissmall;F7E4 -Adotbelow;1EA0 -Adotmacron;01E0 -Agrave;00C0 -Agravesmall;F7E0 -Ahookabove;1EA2 -Aiecyrillic;04D4 
-Ainvertedbreve;0202 -Alpha;0391 -Alphatonos;0386 -Amacron;0100 -Amonospace;FF21 -Aogonek;0104 -Aring;00C5 -Aringacute;01FA -Aringbelow;1E00 -Aringsmall;F7E5 -Asmall;F761 -Atilde;00C3 -Atildesmall;F7E3 -Aybarmenian;0531 -B;0042 -Bcircle;24B7 -Bdotaccent;1E02 -Bdotbelow;1E04 -Becyrillic;0411 -Benarmenian;0532 -Beta;0392 -Bhook;0181 -Blinebelow;1E06 -Bmonospace;FF22 -Brevesmall;F6F4 -Bsmall;F762 -Btopbar;0182 -C;0043 -Caarmenian;053E -Cacute;0106 -Caron;F6CA -Caronsmall;F6F5 -Ccaron;010C -Ccedilla;00C7 -Ccedillaacute;1E08 -Ccedillasmall;F7E7 -Ccircle;24B8 -Ccircumflex;0108 -Cdot;010A -Cdotaccent;010A -Cedillasmall;F7B8 -Chaarmenian;0549 -Cheabkhasiancyrillic;04BC -Checyrillic;0427 -Chedescenderabkhasiancyrillic;04BE -Chedescendercyrillic;04B6 -Chedieresiscyrillic;04F4 -Cheharmenian;0543 -Chekhakassiancyrillic;04CB -Cheverticalstrokecyrillic;04B8 -Chi;03A7 -Chook;0187 -Circumflexsmall;F6F6 -Cmonospace;FF23 -Coarmenian;0551 -Csmall;F763 -D;0044 -DZ;01F1 -DZcaron;01C4 -Daarmenian;0534 -Dafrican;0189 -Dcaron;010E -Dcedilla;1E10 -Dcircle;24B9 -Dcircumflexbelow;1E12 -Dcroat;0110 -Ddotaccent;1E0A -Ddotbelow;1E0C -Decyrillic;0414 -Deicoptic;03EE -Delta;2206 -Deltagreek;0394 -Dhook;018A -Dieresis;F6CB -DieresisAcute;F6CC -DieresisGrave;F6CD -Dieresissmall;F7A8 -Digammagreek;03DC -Djecyrillic;0402 -Dlinebelow;1E0E -Dmonospace;FF24 -Dotaccentsmall;F6F7 -Dslash;0110 -Dsmall;F764 -Dtopbar;018B -Dz;01F2 -Dzcaron;01C5 -Dzeabkhasiancyrillic;04E0 -Dzecyrillic;0405 -Dzhecyrillic;040F -E;0045 -Eacute;00C9 -Eacutesmall;F7E9 -Ebreve;0114 -Ecaron;011A -Ecedillabreve;1E1C -Echarmenian;0535 -Ecircle;24BA -Ecircumflex;00CA -Ecircumflexacute;1EBE -Ecircumflexbelow;1E18 -Ecircumflexdotbelow;1EC6 -Ecircumflexgrave;1EC0 -Ecircumflexhookabove;1EC2 -Ecircumflexsmall;F7EA -Ecircumflextilde;1EC4 -Ecyrillic;0404 -Edblgrave;0204 -Edieresis;00CB -Edieresissmall;F7EB -Edot;0116 -Edotaccent;0116 -Edotbelow;1EB8 -Efcyrillic;0424 -Egrave;00C8 -Egravesmall;F7E8 -Eharmenian;0537 -Ehookabove;1EBA 
-Eightroman;2167 -Einvertedbreve;0206 -Eiotifiedcyrillic;0464 -Elcyrillic;041B -Elevenroman;216A -Emacron;0112 -Emacronacute;1E16 -Emacrongrave;1E14 -Emcyrillic;041C -Emonospace;FF25 -Encyrillic;041D -Endescendercyrillic;04A2 -Eng;014A -Enghecyrillic;04A4 -Enhookcyrillic;04C7 -Eogonek;0118 -Eopen;0190 -Epsilon;0395 -Epsilontonos;0388 -Ercyrillic;0420 -Ereversed;018E -Ereversedcyrillic;042D -Escyrillic;0421 -Esdescendercyrillic;04AA -Esh;01A9 -Esmall;F765 -Eta;0397 -Etarmenian;0538 -Etatonos;0389 -Eth;00D0 -Ethsmall;F7F0 -Etilde;1EBC -Etildebelow;1E1A -Euro;20AC -Ezh;01B7 -Ezhcaron;01EE -Ezhreversed;01B8 -F;0046 -Fcircle;24BB -Fdotaccent;1E1E -Feharmenian;0556 -Feicoptic;03E4 -Fhook;0191 -Fitacyrillic;0472 -Fiveroman;2164 -Fmonospace;FF26 -Fourroman;2163 -Fsmall;F766 -G;0047 -GBsquare;3387 -Gacute;01F4 -Gamma;0393 -Gammaafrican;0194 -Gangiacoptic;03EA -Gbreve;011E -Gcaron;01E6 -Gcedilla;0122 -Gcircle;24BC -Gcircumflex;011C -Gcommaaccent;0122 -Gdot;0120 -Gdotaccent;0120 -Gecyrillic;0413 -Ghadarmenian;0542 -Ghemiddlehookcyrillic;0494 -Ghestrokecyrillic;0492 -Gheupturncyrillic;0490 -Ghook;0193 -Gimarmenian;0533 -Gjecyrillic;0403 -Gmacron;1E20 -Gmonospace;FF27 -Grave;F6CE -Gravesmall;F760 -Gsmall;F767 -Gsmallhook;029B -Gstroke;01E4 -H;0048 -H18533;25CF -H18543;25AA -H18551;25AB -H22073;25A1 -HPsquare;33CB -Haabkhasiancyrillic;04A8 -Hadescendercyrillic;04B2 -Hardsigncyrillic;042A -Hbar;0126 -Hbrevebelow;1E2A -Hcedilla;1E28 -Hcircle;24BD -Hcircumflex;0124 -Hdieresis;1E26 -Hdotaccent;1E22 -Hdotbelow;1E24 -Hmonospace;FF28 -Hoarmenian;0540 -Horicoptic;03E8 -Hsmall;F768 -Hungarumlaut;F6CF -Hungarumlautsmall;F6F8 -Hzsquare;3390 -I;0049 -IAcyrillic;042F -IJ;0132 -IUcyrillic;042E -Iacute;00CD -Iacutesmall;F7ED -Ibreve;012C -Icaron;01CF -Icircle;24BE -Icircumflex;00CE -Icircumflexsmall;F7EE -Icyrillic;0406 -Idblgrave;0208 -Idieresis;00CF -Idieresisacute;1E2E -Idieresiscyrillic;04E4 -Idieresissmall;F7EF -Idot;0130 -Idotaccent;0130 -Idotbelow;1ECA -Iebrevecyrillic;04D6 
-Iecyrillic;0415 -Ifraktur;2111 -Igrave;00CC -Igravesmall;F7EC -Ihookabove;1EC8 -Iicyrillic;0418 -Iinvertedbreve;020A -Iishortcyrillic;0419 -Imacron;012A -Imacroncyrillic;04E2 -Imonospace;FF29 -Iniarmenian;053B -Iocyrillic;0401 -Iogonek;012E -Iota;0399 -Iotaafrican;0196 -Iotadieresis;03AA -Iotatonos;038A -Ismall;F769 -Istroke;0197 -Itilde;0128 -Itildebelow;1E2C -Izhitsacyrillic;0474 -Izhitsadblgravecyrillic;0476 -J;004A -Jaarmenian;0541 -Jcircle;24BF -Jcircumflex;0134 -Jecyrillic;0408 -Jheharmenian;054B -Jmonospace;FF2A -Jsmall;F76A -K;004B -KBsquare;3385 -KKsquare;33CD -Kabashkircyrillic;04A0 -Kacute;1E30 -Kacyrillic;041A -Kadescendercyrillic;049A -Kahookcyrillic;04C3 -Kappa;039A -Kastrokecyrillic;049E -Kaverticalstrokecyrillic;049C -Kcaron;01E8 -Kcedilla;0136 -Kcircle;24C0 -Kcommaaccent;0136 -Kdotbelow;1E32 -Keharmenian;0554 -Kenarmenian;053F -Khacyrillic;0425 -Kheicoptic;03E6 -Khook;0198 -Kjecyrillic;040C -Klinebelow;1E34 -Kmonospace;FF2B -Koppacyrillic;0480 -Koppagreek;03DE -Ksicyrillic;046E -Ksmall;F76B -L;004C -LJ;01C7 -LL;F6BF -Lacute;0139 -Lambda;039B -Lcaron;013D -Lcedilla;013B -Lcircle;24C1 -Lcircumflexbelow;1E3C -Lcommaaccent;013B -Ldot;013F -Ldotaccent;013F -Ldotbelow;1E36 -Ldotbelowmacron;1E38 -Liwnarmenian;053C -Lj;01C8 -Ljecyrillic;0409 -Llinebelow;1E3A -Lmonospace;FF2C -Lslash;0141 -Lslashsmall;F6F9 -Lsmall;F76C -M;004D -MBsquare;3386 -Macron;F6D0 -Macronsmall;F7AF -Macute;1E3E -Mcircle;24C2 -Mdotaccent;1E40 -Mdotbelow;1E42 -Menarmenian;0544 -Mmonospace;FF2D -Msmall;F76D -Mturned;019C -Mu;039C -N;004E -NJ;01CA -Nacute;0143 -Ncaron;0147 -Ncedilla;0145 -Ncircle;24C3 -Ncircumflexbelow;1E4A -Ncommaaccent;0145 -Ndotaccent;1E44 -Ndotbelow;1E46 -Nhookleft;019D -Nineroman;2168 -Nj;01CB -Njecyrillic;040A -Nlinebelow;1E48 -Nmonospace;FF2E -Nowarmenian;0546 -Nsmall;F76E -Ntilde;00D1 -Ntildesmall;F7F1 -Nu;039D -O;004F -OE;0152 -OEsmall;F6FA -Oacute;00D3 -Oacutesmall;F7F3 -Obarredcyrillic;04E8 -Obarreddieresiscyrillic;04EA -Obreve;014E -Ocaron;01D1 
-Ocenteredtilde;019F -Ocircle;24C4 -Ocircumflex;00D4 -Ocircumflexacute;1ED0 -Ocircumflexdotbelow;1ED8 -Ocircumflexgrave;1ED2 -Ocircumflexhookabove;1ED4 -Ocircumflexsmall;F7F4 -Ocircumflextilde;1ED6 -Ocyrillic;041E -Odblacute;0150 -Odblgrave;020C -Odieresis;00D6 -Odieresiscyrillic;04E6 -Odieresissmall;F7F6 -Odotbelow;1ECC -Ogoneksmall;F6FB -Ograve;00D2 -Ogravesmall;F7F2 -Oharmenian;0555 -Ohm;2126 -Ohookabove;1ECE -Ohorn;01A0 -Ohornacute;1EDA -Ohorndotbelow;1EE2 -Ohorngrave;1EDC -Ohornhookabove;1EDE -Ohorntilde;1EE0 -Ohungarumlaut;0150 -Oi;01A2 -Oinvertedbreve;020E -Omacron;014C -Omacronacute;1E52 -Omacrongrave;1E50 -Omega;2126 -Omegacyrillic;0460 -Omegagreek;03A9 -Omegaroundcyrillic;047A -Omegatitlocyrillic;047C -Omegatonos;038F -Omicron;039F -Omicrontonos;038C -Omonospace;FF2F -Oneroman;2160 -Oogonek;01EA -Oogonekmacron;01EC -Oopen;0186 -Oslash;00D8 -Oslashacute;01FE -Oslashsmall;F7F8 -Osmall;F76F -Ostrokeacute;01FE -Otcyrillic;047E -Otilde;00D5 -Otildeacute;1E4C -Otildedieresis;1E4E -Otildesmall;F7F5 -P;0050 -Pacute;1E54 -Pcircle;24C5 -Pdotaccent;1E56 -Pecyrillic;041F -Peharmenian;054A -Pemiddlehookcyrillic;04A6 -Phi;03A6 -Phook;01A4 -Pi;03A0 -Piwrarmenian;0553 -Pmonospace;FF30 -Psi;03A8 -Psicyrillic;0470 -Psmall;F770 -Q;0051 -Qcircle;24C6 -Qmonospace;FF31 -Qsmall;F771 -R;0052 -Raarmenian;054C -Racute;0154 -Rcaron;0158 -Rcedilla;0156 -Rcircle;24C7 -Rcommaaccent;0156 -Rdblgrave;0210 -Rdotaccent;1E58 -Rdotbelow;1E5A -Rdotbelowmacron;1E5C -Reharmenian;0550 -Rfraktur;211C -Rho;03A1 -Ringsmall;F6FC -Rinvertedbreve;0212 -Rlinebelow;1E5E -Rmonospace;FF32 -Rsmall;F772 -Rsmallinverted;0281 -Rsmallinvertedsuperior;02B6 -S;0053 -SF010000;250C -SF020000;2514 -SF030000;2510 -SF040000;2518 -SF050000;253C -SF060000;252C -SF070000;2534 -SF080000;251C -SF090000;2524 -SF100000;2500 -SF110000;2502 -SF190000;2561 -SF200000;2562 -SF210000;2556 -SF220000;2555 -SF230000;2563 -SF240000;2551 -SF250000;2557 -SF260000;255D -SF270000;255C -SF280000;255B -SF360000;255E -SF370000;255F 
-SF380000;255A -SF390000;2554 -SF400000;2569 -SF410000;2566 -SF420000;2560 -SF430000;2550 -SF440000;256C -SF450000;2567 -SF460000;2568 -SF470000;2564 -SF480000;2565 -SF490000;2559 -SF500000;2558 -SF510000;2552 -SF520000;2553 -SF530000;256B -SF540000;256A -Sacute;015A -Sacutedotaccent;1E64 -Sampigreek;03E0 -Scaron;0160 -Scarondotaccent;1E66 -Scaronsmall;F6FD -Scedilla;015E -Schwa;018F -Schwacyrillic;04D8 -Schwadieresiscyrillic;04DA -Scircle;24C8 -Scircumflex;015C -Scommaaccent;0218 -Sdotaccent;1E60 -Sdotbelow;1E62 -Sdotbelowdotaccent;1E68 -Seharmenian;054D -Sevenroman;2166 -Shaarmenian;0547 -Shacyrillic;0428 -Shchacyrillic;0429 -Sheicoptic;03E2 -Shhacyrillic;04BA -Shimacoptic;03EC -Sigma;03A3 -Sixroman;2165 -Smonospace;FF33 -Softsigncyrillic;042C -Ssmall;F773 -Stigmagreek;03DA -T;0054 -Tau;03A4 -Tbar;0166 -Tcaron;0164 -Tcedilla;0162 -Tcircle;24C9 -Tcircumflexbelow;1E70 -Tcommaaccent;0162 -Tdotaccent;1E6A -Tdotbelow;1E6C -Tecyrillic;0422 -Tedescendercyrillic;04AC -Tenroman;2169 -Tetsecyrillic;04B4 -Theta;0398 -Thook;01AC -Thorn;00DE -Thornsmall;F7FE -Threeroman;2162 -Tildesmall;F6FE -Tiwnarmenian;054F -Tlinebelow;1E6E -Tmonospace;FF34 -Toarmenian;0539 -Tonefive;01BC -Tonesix;0184 -Tonetwo;01A7 -Tretroflexhook;01AE -Tsecyrillic;0426 -Tshecyrillic;040B -Tsmall;F774 -Twelveroman;216B -Tworoman;2161 -U;0055 -Uacute;00DA -Uacutesmall;F7FA -Ubreve;016C -Ucaron;01D3 -Ucircle;24CA -Ucircumflex;00DB -Ucircumflexbelow;1E76 -Ucircumflexsmall;F7FB -Ucyrillic;0423 -Udblacute;0170 -Udblgrave;0214 -Udieresis;00DC -Udieresisacute;01D7 -Udieresisbelow;1E72 -Udieresiscaron;01D9 -Udieresiscyrillic;04F0 -Udieresisgrave;01DB -Udieresismacron;01D5 -Udieresissmall;F7FC -Udotbelow;1EE4 -Ugrave;00D9 -Ugravesmall;F7F9 -Uhookabove;1EE6 -Uhorn;01AF -Uhornacute;1EE8 -Uhorndotbelow;1EF0 -Uhorngrave;1EEA -Uhornhookabove;1EEC -Uhorntilde;1EEE -Uhungarumlaut;0170 -Uhungarumlautcyrillic;04F2 -Uinvertedbreve;0216 -Ukcyrillic;0478 -Umacron;016A -Umacroncyrillic;04EE -Umacrondieresis;1E7A 
-Umonospace;FF35 -Uogonek;0172 -Upsilon;03A5 -Upsilon1;03D2 -Upsilonacutehooksymbolgreek;03D3 -Upsilonafrican;01B1 -Upsilondieresis;03AB -Upsilondieresishooksymbolgreek;03D4 -Upsilonhooksymbol;03D2 -Upsilontonos;038E -Uring;016E -Ushortcyrillic;040E -Usmall;F775 -Ustraightcyrillic;04AE -Ustraightstrokecyrillic;04B0 -Utilde;0168 -Utildeacute;1E78 -Utildebelow;1E74 -V;0056 -Vcircle;24CB -Vdotbelow;1E7E -Vecyrillic;0412 -Vewarmenian;054E -Vhook;01B2 -Vmonospace;FF36 -Voarmenian;0548 -Vsmall;F776 -Vtilde;1E7C -W;0057 -Wacute;1E82 -Wcircle;24CC -Wcircumflex;0174 -Wdieresis;1E84 -Wdotaccent;1E86 -Wdotbelow;1E88 -Wgrave;1E80 -Wmonospace;FF37 -Wsmall;F777 -X;0058 -Xcircle;24CD -Xdieresis;1E8C -Xdotaccent;1E8A -Xeharmenian;053D -Xi;039E -Xmonospace;FF38 -Xsmall;F778 -Y;0059 -Yacute;00DD -Yacutesmall;F7FD -Yatcyrillic;0462 -Ycircle;24CE -Ycircumflex;0176 -Ydieresis;0178 -Ydieresissmall;F7FF -Ydotaccent;1E8E -Ydotbelow;1EF4 -Yericyrillic;042B -Yerudieresiscyrillic;04F8 -Ygrave;1EF2 -Yhook;01B3 -Yhookabove;1EF6 -Yiarmenian;0545 -Yicyrillic;0407 -Yiwnarmenian;0552 -Ymonospace;FF39 -Ysmall;F779 -Ytilde;1EF8 -Yusbigcyrillic;046A -Yusbigiotifiedcyrillic;046C -Yuslittlecyrillic;0466 -Yuslittleiotifiedcyrillic;0468 -Z;005A -Zaarmenian;0536 -Zacute;0179 -Zcaron;017D -Zcaronsmall;F6FF -Zcircle;24CF -Zcircumflex;1E90 -Zdot;017B -Zdotaccent;017B -Zdotbelow;1E92 -Zecyrillic;0417 -Zedescendercyrillic;0498 -Zedieresiscyrillic;04DE -Zeta;0396 -Zhearmenian;053A -Zhebrevecyrillic;04C1 -Zhecyrillic;0416 -Zhedescendercyrillic;0496 -Zhedieresiscyrillic;04DC -Zlinebelow;1E94 -Zmonospace;FF3A -Zsmall;F77A -Zstroke;01B5 -a;0061 -aabengali;0986 -aacute;00E1 -aadeva;0906 -aagujarati;0A86 -aagurmukhi;0A06 -aamatragurmukhi;0A3E -aarusquare;3303 -aavowelsignbengali;09BE -aavowelsigndeva;093E -aavowelsigngujarati;0ABE -abbreviationmarkarmenian;055F -abbreviationsigndeva;0970 -abengali;0985 -abopomofo;311A -abreve;0103 -abreveacute;1EAF -abrevecyrillic;04D1 -abrevedotbelow;1EB7 -abrevegrave;1EB1 
-abrevehookabove;1EB3 -abrevetilde;1EB5 -acaron;01CE -acircle;24D0 -acircumflex;00E2 -acircumflexacute;1EA5 -acircumflexdotbelow;1EAD -acircumflexgrave;1EA7 -acircumflexhookabove;1EA9 -acircumflextilde;1EAB -acute;00B4 -acutebelowcmb;0317 -acutecmb;0301 -acutecomb;0301 -acutedeva;0954 -acutelowmod;02CF -acutetonecmb;0341 -acyrillic;0430 -adblgrave;0201 -addakgurmukhi;0A71 -adeva;0905 -adieresis;00E4 -adieresiscyrillic;04D3 -adieresismacron;01DF -adotbelow;1EA1 -adotmacron;01E1 -ae;00E6 -aeacute;01FD -aekorean;3150 -aemacron;01E3 -afii00208;2015 -afii08941;20A4 -afii10017;0410 -afii10018;0411 -afii10019;0412 -afii10020;0413 -afii10021;0414 -afii10022;0415 -afii10023;0401 -afii10024;0416 -afii10025;0417 -afii10026;0418 -afii10027;0419 -afii10028;041A -afii10029;041B -afii10030;041C -afii10031;041D -afii10032;041E -afii10033;041F -afii10034;0420 -afii10035;0421 -afii10036;0422 -afii10037;0423 -afii10038;0424 -afii10039;0425 -afii10040;0426 -afii10041;0427 -afii10042;0428 -afii10043;0429 -afii10044;042A -afii10045;042B -afii10046;042C -afii10047;042D -afii10048;042E -afii10049;042F -afii10050;0490 -afii10051;0402 -afii10052;0403 -afii10053;0404 -afii10054;0405 -afii10055;0406 -afii10056;0407 -afii10057;0408 -afii10058;0409 -afii10059;040A -afii10060;040B -afii10061;040C -afii10062;040E -afii10063;F6C4 -afii10064;F6C5 -afii10065;0430 -afii10066;0431 -afii10067;0432 -afii10068;0433 -afii10069;0434 -afii10070;0435 -afii10071;0451 -afii10072;0436 -afii10073;0437 -afii10074;0438 -afii10075;0439 -afii10076;043A -afii10077;043B -afii10078;043C -afii10079;043D -afii10080;043E -afii10081;043F -afii10082;0440 -afii10083;0441 -afii10084;0442 -afii10085;0443 -afii10086;0444 -afii10087;0445 -afii10088;0446 -afii10089;0447 -afii10090;0448 -afii10091;0449 -afii10092;044A -afii10093;044B -afii10094;044C -afii10095;044D -afii10096;044E -afii10097;044F -afii10098;0491 -afii10099;0452 -afii10100;0453 -afii10101;0454 -afii10102;0455 -afii10103;0456 -afii10104;0457 -afii10105;0458 
-afii10106;0459 -afii10107;045A -afii10108;045B -afii10109;045C -afii10110;045E -afii10145;040F -afii10146;0462 -afii10147;0472 -afii10148;0474 -afii10192;F6C6 -afii10193;045F -afii10194;0463 -afii10195;0473 -afii10196;0475 -afii10831;F6C7 -afii10832;F6C8 -afii10846;04D9 -afii299;200E -afii300;200F -afii301;200D -afii57381;066A -afii57388;060C -afii57392;0660 -afii57393;0661 -afii57394;0662 -afii57395;0663 -afii57396;0664 -afii57397;0665 -afii57398;0666 -afii57399;0667 -afii57400;0668 -afii57401;0669 -afii57403;061B -afii57407;061F -afii57409;0621 -afii57410;0622 -afii57411;0623 -afii57412;0624 -afii57413;0625 -afii57414;0626 -afii57415;0627 -afii57416;0628 -afii57417;0629 -afii57418;062A -afii57419;062B -afii57420;062C -afii57421;062D -afii57422;062E -afii57423;062F -afii57424;0630 -afii57425;0631 -afii57426;0632 -afii57427;0633 -afii57428;0634 -afii57429;0635 -afii57430;0636 -afii57431;0637 -afii57432;0638 -afii57433;0639 -afii57434;063A -afii57440;0640 -afii57441;0641 -afii57442;0642 -afii57443;0643 -afii57444;0644 -afii57445;0645 -afii57446;0646 -afii57448;0648 -afii57449;0649 -afii57450;064A -afii57451;064B -afii57452;064C -afii57453;064D -afii57454;064E -afii57455;064F -afii57456;0650 -afii57457;0651 -afii57458;0652 -afii57470;0647 -afii57505;06A4 -afii57506;067E -afii57507;0686 -afii57508;0698 -afii57509;06AF -afii57511;0679 -afii57512;0688 -afii57513;0691 -afii57514;06BA -afii57519;06D2 -afii57534;06D5 -afii57636;20AA -afii57645;05BE -afii57658;05C3 -afii57664;05D0 -afii57665;05D1 -afii57666;05D2 -afii57667;05D3 -afii57668;05D4 -afii57669;05D5 -afii57670;05D6 -afii57671;05D7 -afii57672;05D8 -afii57673;05D9 -afii57674;05DA -afii57675;05DB -afii57676;05DC -afii57677;05DD -afii57678;05DE -afii57679;05DF -afii57680;05E0 -afii57681;05E1 -afii57682;05E2 -afii57683;05E3 -afii57684;05E4 -afii57685;05E5 -afii57686;05E6 -afii57687;05E7 -afii57688;05E8 -afii57689;05E9 -afii57690;05EA -afii57694;FB2A -afii57695;FB2B -afii57700;FB4B -afii57705;FB1F -afii57716;05F0 
-afii57717;05F1 -afii57718;05F2 -afii57723;FB35 -afii57793;05B4 -afii57794;05B5 -afii57795;05B6 -afii57796;05BB -afii57797;05B8 -afii57798;05B7 -afii57799;05B0 -afii57800;05B2 -afii57801;05B1 -afii57802;05B3 -afii57803;05C2 -afii57804;05C1 -afii57806;05B9 -afii57807;05BC -afii57839;05BD -afii57841;05BF -afii57842;05C0 -afii57929;02BC -afii61248;2105 -afii61289;2113 -afii61352;2116 -afii61573;202C -afii61574;202D -afii61575;202E -afii61664;200C -afii63167;066D -afii64937;02BD -agrave;00E0 -agujarati;0A85 -agurmukhi;0A05 -ahiragana;3042 -ahookabove;1EA3 -aibengali;0990 -aibopomofo;311E -aideva;0910 -aiecyrillic;04D5 -aigujarati;0A90 -aigurmukhi;0A10 -aimatragurmukhi;0A48 -ainarabic;0639 -ainfinalarabic;FECA -aininitialarabic;FECB -ainmedialarabic;FECC -ainvertedbreve;0203 -aivowelsignbengali;09C8 -aivowelsigndeva;0948 -aivowelsigngujarati;0AC8 -akatakana;30A2 -akatakanahalfwidth;FF71 -akorean;314F -alef;05D0 -alefarabic;0627 -alefdageshhebrew;FB30 -aleffinalarabic;FE8E -alefhamzaabovearabic;0623 -alefhamzaabovefinalarabic;FE84 -alefhamzabelowarabic;0625 -alefhamzabelowfinalarabic;FE88 -alefhebrew;05D0 -aleflamedhebrew;FB4F -alefmaddaabovearabic;0622 -alefmaddaabovefinalarabic;FE82 -alefmaksuraarabic;0649 -alefmaksurafinalarabic;FEF0 -alefmaksurainitialarabic;FEF3 -alefmaksuramedialarabic;FEF4 -alefpatahhebrew;FB2E -alefqamatshebrew;FB2F -aleph;2135 -allequal;224C -alpha;03B1 -alphatonos;03AC -amacron;0101 -amonospace;FF41 -ampersand;0026 -ampersandmonospace;FF06 -ampersandsmall;F726 -amsquare;33C2 -anbopomofo;3122 -angbopomofo;3124 -angkhankhuthai;0E5A -angle;2220 -anglebracketleft;3008 -anglebracketleftvertical;FE3F -anglebracketright;3009 -anglebracketrightvertical;FE40 -angleleft;2329 -angleright;232A -angstrom;212B -anoteleia;0387 -anudattadeva;0952 -anusvarabengali;0982 -anusvaradeva;0902 -anusvaragujarati;0A82 -aogonek;0105 -apaatosquare;3300 -aparen;249C -apostrophearmenian;055A -apostrophemod;02BC -apple;F8FF -approaches;2250 -approxequal;2248 
-approxequalorimage;2252 -approximatelyequal;2245 -araeaekorean;318E -araeakorean;318D -arc;2312 -arighthalfring;1E9A -aring;00E5 -aringacute;01FB -aringbelow;1E01 -arrowboth;2194 -arrowdashdown;21E3 -arrowdashleft;21E0 -arrowdashright;21E2 -arrowdashup;21E1 -arrowdblboth;21D4 -arrowdbldown;21D3 -arrowdblleft;21D0 -arrowdblright;21D2 -arrowdblup;21D1 -arrowdown;2193 -arrowdownleft;2199 -arrowdownright;2198 -arrowdownwhite;21E9 -arrowheaddownmod;02C5 -arrowheadleftmod;02C2 -arrowheadrightmod;02C3 -arrowheadupmod;02C4 -arrowhorizex;F8E7 -arrowleft;2190 -arrowleftdbl;21D0 -arrowleftdblstroke;21CD -arrowleftoverright;21C6 -arrowleftwhite;21E6 -arrowright;2192 -arrowrightdblstroke;21CF -arrowrightheavy;279E -arrowrightoverleft;21C4 -arrowrightwhite;21E8 -arrowtableft;21E4 -arrowtabright;21E5 -arrowup;2191 -arrowupdn;2195 -arrowupdnbse;21A8 -arrowupdownbase;21A8 -arrowupleft;2196 -arrowupleftofdown;21C5 -arrowupright;2197 -arrowupwhite;21E7 -arrowvertex;F8E6 -asciicircum;005E -asciicircummonospace;FF3E -asciitilde;007E -asciitildemonospace;FF5E -ascript;0251 -ascriptturned;0252 -asmallhiragana;3041 -asmallkatakana;30A1 -asmallkatakanahalfwidth;FF67 -asterisk;002A -asteriskaltonearabic;066D -asteriskarabic;066D -asteriskmath;2217 -asteriskmonospace;FF0A -asterisksmall;FE61 -asterism;2042 -asuperior;F6E9 -asymptoticallyequal;2243 -at;0040 -atilde;00E3 -atmonospace;FF20 -atsmall;FE6B -aturned;0250 -aubengali;0994 -aubopomofo;3120 -audeva;0914 -augujarati;0A94 -augurmukhi;0A14 -aulengthmarkbengali;09D7 -aumatragurmukhi;0A4C -auvowelsignbengali;09CC -auvowelsigndeva;094C -auvowelsigngujarati;0ACC -avagrahadeva;093D -aybarmenian;0561 -ayin;05E2 -ayinaltonehebrew;FB20 -ayinhebrew;05E2 -b;0062 -babengali;09AC -backslash;005C -backslashmonospace;FF3C -badeva;092C -bagujarati;0AAC -bagurmukhi;0A2C -bahiragana;3070 -bahtthai;0E3F -bakatakana;30D0 -bar;007C -barmonospace;FF5C -bbopomofo;3105 -bcircle;24D1 -bdotaccent;1E03 -bdotbelow;1E05 -beamedsixteenthnotes;266C -because;2235 
-becyrillic;0431 -beharabic;0628 -behfinalarabic;FE90 -behinitialarabic;FE91 -behiragana;3079 -behmedialarabic;FE92 -behmeeminitialarabic;FC9F -behmeemisolatedarabic;FC08 -behnoonfinalarabic;FC6D -bekatakana;30D9 -benarmenian;0562 -bet;05D1 -beta;03B2 -betasymbolgreek;03D0 -betdagesh;FB31 -betdageshhebrew;FB31 -bethebrew;05D1 -betrafehebrew;FB4C -bhabengali;09AD -bhadeva;092D -bhagujarati;0AAD -bhagurmukhi;0A2D -bhook;0253 -bihiragana;3073 -bikatakana;30D3 -bilabialclick;0298 -bindigurmukhi;0A02 -birusquare;3331 -blackcircle;25CF -blackdiamond;25C6 -blackdownpointingtriangle;25BC -blackleftpointingpointer;25C4 -blackleftpointingtriangle;25C0 -blacklenticularbracketleft;3010 -blacklenticularbracketleftvertical;FE3B -blacklenticularbracketright;3011 -blacklenticularbracketrightvertical;FE3C -blacklowerlefttriangle;25E3 -blacklowerrighttriangle;25E2 -blackrectangle;25AC -blackrightpointingpointer;25BA -blackrightpointingtriangle;25B6 -blacksmallsquare;25AA -blacksmilingface;263B -blacksquare;25A0 -blackstar;2605 -blackupperlefttriangle;25E4 -blackupperrighttriangle;25E5 -blackuppointingsmalltriangle;25B4 -blackuppointingtriangle;25B2 -blank;2423 -blinebelow;1E07 -block;2588 -bmonospace;FF42 -bobaimaithai;0E1A -bohiragana;307C -bokatakana;30DC -bparen;249D -bqsquare;33C3 -braceex;F8F4 -braceleft;007B -braceleftbt;F8F3 -braceleftmid;F8F2 -braceleftmonospace;FF5B -braceleftsmall;FE5B -bracelefttp;F8F1 -braceleftvertical;FE37 -braceright;007D -bracerightbt;F8FE -bracerightmid;F8FD -bracerightmonospace;FF5D -bracerightsmall;FE5C -bracerighttp;F8FC -bracerightvertical;FE38 -bracketleft;005B -bracketleftbt;F8F0 -bracketleftex;F8EF -bracketleftmonospace;FF3B -bracketlefttp;F8EE -bracketright;005D -bracketrightbt;F8FB -bracketrightex;F8FA -bracketrightmonospace;FF3D -bracketrighttp;F8F9 -breve;02D8 -brevebelowcmb;032E -brevecmb;0306 -breveinvertedbelowcmb;032F -breveinvertedcmb;0311 -breveinverteddoublecmb;0361 -bridgebelowcmb;032A -bridgeinvertedbelowcmb;033A -brokenbar;00A6 
-bstroke;0180 -bsuperior;F6EA -btopbar;0183 -buhiragana;3076 -bukatakana;30D6 -bullet;2022 -bulletinverse;25D8 -bulletoperator;2219 -bullseye;25CE -c;0063 -caarmenian;056E -cabengali;099A -cacute;0107 -cadeva;091A -cagujarati;0A9A -cagurmukhi;0A1A -calsquare;3388 -candrabindubengali;0981 -candrabinducmb;0310 -candrabindudeva;0901 -candrabindugujarati;0A81 -capslock;21EA -careof;2105 -caron;02C7 -caronbelowcmb;032C -caroncmb;030C -carriagereturn;21B5 -cbopomofo;3118 -ccaron;010D -ccedilla;00E7 -ccedillaacute;1E09 -ccircle;24D2 -ccircumflex;0109 -ccurl;0255 -cdot;010B -cdotaccent;010B -cdsquare;33C5 -cedilla;00B8 -cedillacmb;0327 -cent;00A2 -centigrade;2103 -centinferior;F6DF -centmonospace;FFE0 -centoldstyle;F7A2 -centsuperior;F6E0 -chaarmenian;0579 -chabengali;099B -chadeva;091B -chagujarati;0A9B -chagurmukhi;0A1B -chbopomofo;3114 -cheabkhasiancyrillic;04BD -checkmark;2713 -checyrillic;0447 -chedescenderabkhasiancyrillic;04BF -chedescendercyrillic;04B7 -chedieresiscyrillic;04F5 -cheharmenian;0573 -chekhakassiancyrillic;04CC -cheverticalstrokecyrillic;04B9 -chi;03C7 -chieuchacirclekorean;3277 -chieuchaparenkorean;3217 -chieuchcirclekorean;3269 -chieuchkorean;314A -chieuchparenkorean;3209 -chochangthai;0E0A -chochanthai;0E08 -chochingthai;0E09 -chochoethai;0E0C -chook;0188 -cieucacirclekorean;3276 -cieucaparenkorean;3216 -cieuccirclekorean;3268 -cieuckorean;3148 -cieucparenkorean;3208 -cieucuparenkorean;321C -circle;25CB -circlemultiply;2297 -circleot;2299 -circleplus;2295 -circlepostalmark;3036 -circlewithlefthalfblack;25D0 -circlewithrighthalfblack;25D1 -circumflex;02C6 -circumflexbelowcmb;032D -circumflexcmb;0302 -clear;2327 -clickalveolar;01C2 -clickdental;01C0 -clicklateral;01C1 -clickretroflex;01C3 -club;2663 -clubsuitblack;2663 -clubsuitwhite;2667 -cmcubedsquare;33A4 -cmonospace;FF43 -cmsquaredsquare;33A0 -coarmenian;0581 -colon;003A -colonmonetary;20A1 -colonmonospace;FF1A -colonsign;20A1 -colonsmall;FE55 -colontriangularhalfmod;02D1 -colontriangularmod;02D0 
-comma;002C -commaabovecmb;0313 -commaaboverightcmb;0315 -commaaccent;F6C3 -commaarabic;060C -commaarmenian;055D -commainferior;F6E1 -commamonospace;FF0C -commareversedabovecmb;0314 -commareversedmod;02BD -commasmall;FE50 -commasuperior;F6E2 -commaturnedabovecmb;0312 -commaturnedmod;02BB -compass;263C -congruent;2245 -contourintegral;222E -control;2303 -controlACK;0006 -controlBEL;0007 -controlBS;0008 -controlCAN;0018 -controlCR;000D -controlDC1;0011 -controlDC2;0012 -controlDC3;0013 -controlDC4;0014 -controlDEL;007F -controlDLE;0010 -controlEM;0019 -controlENQ;0005 -controlEOT;0004 -controlESC;001B -controlETB;0017 -controlETX;0003 -controlFF;000C -controlFS;001C -controlGS;001D -controlHT;0009 -controlLF;000A -controlNAK;0015 -controlRS;001E -controlSI;000F -controlSO;000E -controlSOT;0002 -controlSTX;0001 -controlSUB;001A -controlSYN;0016 -controlUS;001F -controlVT;000B -copyright;00A9 -copyrightsans;F8E9 -copyrightserif;F6D9 -cornerbracketleft;300C -cornerbracketlefthalfwidth;FF62 -cornerbracketleftvertical;FE41 -cornerbracketright;300D -cornerbracketrighthalfwidth;FF63 -cornerbracketrightvertical;FE42 -corporationsquare;337F -cosquare;33C7 -coverkgsquare;33C6 -cparen;249E -cruzeiro;20A2 -cstretched;0297 -curlyand;22CF -curlyor;22CE -currency;00A4 -cyrBreve;F6D1 -cyrFlex;F6D2 -cyrbreve;F6D4 -cyrflex;F6D5 -d;0064 -daarmenian;0564 -dabengali;09A6 -dadarabic;0636 -dadeva;0926 -dadfinalarabic;FEBE -dadinitialarabic;FEBF -dadmedialarabic;FEC0 -dagesh;05BC -dageshhebrew;05BC -dagger;2020 -daggerdbl;2021 -dagujarati;0AA6 -dagurmukhi;0A26 -dahiragana;3060 -dakatakana;30C0 -dalarabic;062F -dalet;05D3 -daletdagesh;FB33 -daletdageshhebrew;FB33 -dalethatafpatah;05D3 05B2 -dalethatafpatahhebrew;05D3 05B2 -dalethatafsegol;05D3 05B1 -dalethatafsegolhebrew;05D3 05B1 -dalethebrew;05D3 -dalethiriq;05D3 05B4 -dalethiriqhebrew;05D3 05B4 -daletholam;05D3 05B9 -daletholamhebrew;05D3 05B9 -daletpatah;05D3 05B7 -daletpatahhebrew;05D3 05B7 -daletqamats;05D3 05B8 -daletqamatshebrew;05D3 
05B8 -daletqubuts;05D3 05BB -daletqubutshebrew;05D3 05BB -daletsegol;05D3 05B6 -daletsegolhebrew;05D3 05B6 -daletsheva;05D3 05B0 -daletshevahebrew;05D3 05B0 -dalettsere;05D3 05B5 -dalettserehebrew;05D3 05B5 -dalfinalarabic;FEAA -dammaarabic;064F -dammalowarabic;064F -dammatanaltonearabic;064C -dammatanarabic;064C -danda;0964 -dargahebrew;05A7 -dargalefthebrew;05A7 -dasiapneumatacyrilliccmb;0485 -dblGrave;F6D3 -dblanglebracketleft;300A -dblanglebracketleftvertical;FE3D -dblanglebracketright;300B -dblanglebracketrightvertical;FE3E -dblarchinvertedbelowcmb;032B -dblarrowleft;21D4 -dblarrowright;21D2 -dbldanda;0965 -dblgrave;F6D6 -dblgravecmb;030F -dblintegral;222C -dbllowline;2017 -dbllowlinecmb;0333 -dbloverlinecmb;033F -dblprimemod;02BA -dblverticalbar;2016 -dblverticallineabovecmb;030E -dbopomofo;3109 -dbsquare;33C8 -dcaron;010F -dcedilla;1E11 -dcircle;24D3 -dcircumflexbelow;1E13 -dcroat;0111 -ddabengali;09A1 -ddadeva;0921 -ddagujarati;0AA1 -ddagurmukhi;0A21 -ddalarabic;0688 -ddalfinalarabic;FB89 -dddhadeva;095C -ddhabengali;09A2 -ddhadeva;0922 -ddhagujarati;0AA2 -ddhagurmukhi;0A22 -ddotaccent;1E0B -ddotbelow;1E0D -decimalseparatorarabic;066B -decimalseparatorpersian;066B -decyrillic;0434 -degree;00B0 -dehihebrew;05AD -dehiragana;3067 -deicoptic;03EF -dekatakana;30C7 -deleteleft;232B -deleteright;2326 -delta;03B4 -deltaturned;018D -denominatorminusonenumeratorbengali;09F8 -dezh;02A4 -dhabengali;09A7 -dhadeva;0927 -dhagujarati;0AA7 -dhagurmukhi;0A27 -dhook;0257 -dialytikatonos;0385 -dialytikatonoscmb;0344 -diamond;2666 -diamondsuitwhite;2662 -dieresis;00A8 -dieresisacute;F6D7 -dieresisbelowcmb;0324 -dieresiscmb;0308 -dieresisgrave;F6D8 -dieresistonos;0385 -dihiragana;3062 -dikatakana;30C2 -dittomark;3003 -divide;00F7 -divides;2223 -divisionslash;2215 -djecyrillic;0452 -dkshade;2593 -dlinebelow;1E0F -dlsquare;3397 -dmacron;0111 -dmonospace;FF44 -dnblock;2584 -dochadathai;0E0E -dodekthai;0E14 -dohiragana;3069 -dokatakana;30C9 -dollar;0024 -dollarinferior;F6E3 
-dollarmonospace;FF04 -dollaroldstyle;F724 -dollarsmall;FE69 -dollarsuperior;F6E4 -dong;20AB -dorusquare;3326 -dotaccent;02D9 -dotaccentcmb;0307 -dotbelowcmb;0323 -dotbelowcomb;0323 -dotkatakana;30FB -dotlessi;0131 -dotlessj;F6BE -dotlessjstrokehook;0284 -dotmath;22C5 -dottedcircle;25CC -doubleyodpatah;FB1F -doubleyodpatahhebrew;FB1F -downtackbelowcmb;031E -downtackmod;02D5 -dparen;249F -dsuperior;F6EB -dtail;0256 -dtopbar;018C -duhiragana;3065 -dukatakana;30C5 -dz;01F3 -dzaltone;02A3 -dzcaron;01C6 -dzcurl;02A5 -dzeabkhasiancyrillic;04E1 -dzecyrillic;0455 -dzhecyrillic;045F -e;0065 -eacute;00E9 -earth;2641 -ebengali;098F -ebopomofo;311C -ebreve;0115 -ecandradeva;090D -ecandragujarati;0A8D -ecandravowelsigndeva;0945 -ecandravowelsigngujarati;0AC5 -ecaron;011B -ecedillabreve;1E1D -echarmenian;0565 -echyiwnarmenian;0587 -ecircle;24D4 -ecircumflex;00EA -ecircumflexacute;1EBF -ecircumflexbelow;1E19 -ecircumflexdotbelow;1EC7 -ecircumflexgrave;1EC1 -ecircumflexhookabove;1EC3 -ecircumflextilde;1EC5 -ecyrillic;0454 -edblgrave;0205 -edeva;090F -edieresis;00EB -edot;0117 -edotaccent;0117 -edotbelow;1EB9 -eegurmukhi;0A0F -eematragurmukhi;0A47 -efcyrillic;0444 -egrave;00E8 -egujarati;0A8F -eharmenian;0567 -ehbopomofo;311D -ehiragana;3048 -ehookabove;1EBB -eibopomofo;311F -eight;0038 -eightarabic;0668 -eightbengali;09EE -eightcircle;2467 -eightcircleinversesansserif;2791 -eightdeva;096E -eighteencircle;2471 -eighteenparen;2485 -eighteenperiod;2499 -eightgujarati;0AEE -eightgurmukhi;0A6E -eighthackarabic;0668 -eighthangzhou;3028 -eighthnotebeamed;266B -eightideographicparen;3227 -eightinferior;2088 -eightmonospace;FF18 -eightoldstyle;F738 -eightparen;247B -eightperiod;248F -eightpersian;06F8 -eightroman;2177 -eightsuperior;2078 -eightthai;0E58 -einvertedbreve;0207 -eiotifiedcyrillic;0465 -ekatakana;30A8 -ekatakanahalfwidth;FF74 -ekonkargurmukhi;0A74 -ekorean;3154 -elcyrillic;043B -element;2208 -elevencircle;246A -elevenparen;247E -elevenperiod;2492 -elevenroman;217A 
-ellipsis;2026 -ellipsisvertical;22EE -emacron;0113 -emacronacute;1E17 -emacrongrave;1E15 -emcyrillic;043C -emdash;2014 -emdashvertical;FE31 -emonospace;FF45 -emphasismarkarmenian;055B -emptyset;2205 -enbopomofo;3123 -encyrillic;043D -endash;2013 -endashvertical;FE32 -endescendercyrillic;04A3 -eng;014B -engbopomofo;3125 -enghecyrillic;04A5 -enhookcyrillic;04C8 -enspace;2002 -eogonek;0119 -eokorean;3153 -eopen;025B -eopenclosed;029A -eopenreversed;025C -eopenreversedclosed;025E -eopenreversedhook;025D -eparen;24A0 -epsilon;03B5 -epsilontonos;03AD -equal;003D -equalmonospace;FF1D -equalsmall;FE66 -equalsuperior;207C -equivalence;2261 -erbopomofo;3126 -ercyrillic;0440 -ereversed;0258 -ereversedcyrillic;044D -escyrillic;0441 -esdescendercyrillic;04AB -esh;0283 -eshcurl;0286 -eshortdeva;090E -eshortvowelsigndeva;0946 -eshreversedloop;01AA -eshsquatreversed;0285 -esmallhiragana;3047 -esmallkatakana;30A7 -esmallkatakanahalfwidth;FF6A -estimated;212E -esuperior;F6EC -eta;03B7 -etarmenian;0568 -etatonos;03AE -eth;00F0 -etilde;1EBD -etildebelow;1E1B -etnahtafoukhhebrew;0591 -etnahtafoukhlefthebrew;0591 -etnahtahebrew;0591 -etnahtalefthebrew;0591 -eturned;01DD -eukorean;3161 -euro;20AC -evowelsignbengali;09C7 -evowelsigndeva;0947 -evowelsigngujarati;0AC7 -exclam;0021 -exclamarmenian;055C -exclamdbl;203C -exclamdown;00A1 -exclamdownsmall;F7A1 -exclammonospace;FF01 -exclamsmall;F721 -existential;2203 -ezh;0292 -ezhcaron;01EF -ezhcurl;0293 -ezhreversed;01B9 -ezhtail;01BA -f;0066 -fadeva;095E -fagurmukhi;0A5E -fahrenheit;2109 -fathaarabic;064E -fathalowarabic;064E -fathatanarabic;064B -fbopomofo;3108 -fcircle;24D5 -fdotaccent;1E1F -feharabic;0641 -feharmenian;0586 -fehfinalarabic;FED2 -fehinitialarabic;FED3 -fehmedialarabic;FED4 -feicoptic;03E5 -female;2640 -ff;FB00 -ffi;FB03 -ffl;FB04 -fi;FB01 -fifteencircle;246E -fifteenparen;2482 -fifteenperiod;2496 -figuredash;2012 -filledbox;25A0 -filledrect;25AC -finalkaf;05DA -finalkafdagesh;FB3A -finalkafdageshhebrew;FB3A 
-finalkafhebrew;05DA -finalkafqamats;05DA 05B8 -finalkafqamatshebrew;05DA 05B8 -finalkafsheva;05DA 05B0 -finalkafshevahebrew;05DA 05B0 -finalmem;05DD -finalmemhebrew;05DD -finalnun;05DF -finalnunhebrew;05DF -finalpe;05E3 -finalpehebrew;05E3 -finaltsadi;05E5 -finaltsadihebrew;05E5 -firsttonechinese;02C9 -fisheye;25C9 -fitacyrillic;0473 -five;0035 -fivearabic;0665 -fivebengali;09EB -fivecircle;2464 -fivecircleinversesansserif;278E -fivedeva;096B -fiveeighths;215D -fivegujarati;0AEB -fivegurmukhi;0A6B -fivehackarabic;0665 -fivehangzhou;3025 -fiveideographicparen;3224 -fiveinferior;2085 -fivemonospace;FF15 -fiveoldstyle;F735 -fiveparen;2478 -fiveperiod;248C -fivepersian;06F5 -fiveroman;2174 -fivesuperior;2075 -fivethai;0E55 -fl;FB02 -florin;0192 -fmonospace;FF46 -fmsquare;3399 -fofanthai;0E1F -fofathai;0E1D -fongmanthai;0E4F -forall;2200 -four;0034 -fourarabic;0664 -fourbengali;09EA -fourcircle;2463 -fourcircleinversesansserif;278D -fourdeva;096A -fourgujarati;0AEA -fourgurmukhi;0A6A -fourhackarabic;0664 -fourhangzhou;3024 -fourideographicparen;3223 -fourinferior;2084 -fourmonospace;FF14 -fournumeratorbengali;09F7 -fouroldstyle;F734 -fourparen;2477 -fourperiod;248B -fourpersian;06F4 -fourroman;2173 -foursuperior;2074 -fourteencircle;246D -fourteenparen;2481 -fourteenperiod;2495 -fourthai;0E54 -fourthtonechinese;02CB -fparen;24A1 -fraction;2044 -franc;20A3 -g;0067 -gabengali;0997 -gacute;01F5 -gadeva;0917 -gafarabic;06AF -gaffinalarabic;FB93 -gafinitialarabic;FB94 -gafmedialarabic;FB95 -gagujarati;0A97 -gagurmukhi;0A17 -gahiragana;304C -gakatakana;30AC -gamma;03B3 -gammalatinsmall;0263 -gammasuperior;02E0 -gangiacoptic;03EB -gbopomofo;310D -gbreve;011F -gcaron;01E7 -gcedilla;0123 -gcircle;24D6 -gcircumflex;011D -gcommaaccent;0123 -gdot;0121 -gdotaccent;0121 -gecyrillic;0433 -gehiragana;3052 -gekatakana;30B2 -geometricallyequal;2251 -gereshaccenthebrew;059C -gereshhebrew;05F3 -gereshmuqdamhebrew;059D -germandbls;00DF -gershayimaccenthebrew;059E -gershayimhebrew;05F4 
-getamark;3013 -ghabengali;0998 -ghadarmenian;0572 -ghadeva;0918 -ghagujarati;0A98 -ghagurmukhi;0A18 -ghainarabic;063A -ghainfinalarabic;FECE -ghaininitialarabic;FECF -ghainmedialarabic;FED0 -ghemiddlehookcyrillic;0495 -ghestrokecyrillic;0493 -gheupturncyrillic;0491 -ghhadeva;095A -ghhagurmukhi;0A5A -ghook;0260 -ghzsquare;3393 -gihiragana;304E -gikatakana;30AE -gimarmenian;0563 -gimel;05D2 -gimeldagesh;FB32 -gimeldageshhebrew;FB32 -gimelhebrew;05D2 -gjecyrillic;0453 -glottalinvertedstroke;01BE -glottalstop;0294 -glottalstopinverted;0296 -glottalstopmod;02C0 -glottalstopreversed;0295 -glottalstopreversedmod;02C1 -glottalstopreversedsuperior;02E4 -glottalstopstroke;02A1 -glottalstopstrokereversed;02A2 -gmacron;1E21 -gmonospace;FF47 -gohiragana;3054 -gokatakana;30B4 -gparen;24A2 -gpasquare;33AC -gradient;2207 -grave;0060 -gravebelowcmb;0316 -gravecmb;0300 -gravecomb;0300 -gravedeva;0953 -gravelowmod;02CE -gravemonospace;FF40 -gravetonecmb;0340 -greater;003E -greaterequal;2265 -greaterequalorless;22DB -greatermonospace;FF1E -greaterorequivalent;2273 -greaterorless;2277 -greateroverequal;2267 -greatersmall;FE65 -gscript;0261 -gstroke;01E5 -guhiragana;3050 -guillemotleft;00AB -guillemotright;00BB -guilsinglleft;2039 -guilsinglright;203A -gukatakana;30B0 -guramusquare;3318 -gysquare;33C9 -h;0068 -haabkhasiancyrillic;04A9 -haaltonearabic;06C1 -habengali;09B9 -hadescendercyrillic;04B3 -hadeva;0939 -hagujarati;0AB9 -hagurmukhi;0A39 -haharabic;062D -hahfinalarabic;FEA2 -hahinitialarabic;FEA3 -hahiragana;306F -hahmedialarabic;FEA4 -haitusquare;332A -hakatakana;30CF -hakatakanahalfwidth;FF8A -halantgurmukhi;0A4D -hamzaarabic;0621 -hamzadammaarabic;0621 064F -hamzadammatanarabic;0621 064C -hamzafathaarabic;0621 064E -hamzafathatanarabic;0621 064B -hamzalowarabic;0621 -hamzalowkasraarabic;0621 0650 -hamzalowkasratanarabic;0621 064D -hamzasukunarabic;0621 0652 -hangulfiller;3164 -hardsigncyrillic;044A -harpoonleftbarbup;21BC -harpoonrightbarbup;21C0 -hasquare;33CA -hatafpatah;05B2 
-hatafpatah16;05B2 -hatafpatah23;05B2 -hatafpatah2f;05B2 -hatafpatahhebrew;05B2 -hatafpatahnarrowhebrew;05B2 -hatafpatahquarterhebrew;05B2 -hatafpatahwidehebrew;05B2 -hatafqamats;05B3 -hatafqamats1b;05B3 -hatafqamats28;05B3 -hatafqamats34;05B3 -hatafqamatshebrew;05B3 -hatafqamatsnarrowhebrew;05B3 -hatafqamatsquarterhebrew;05B3 -hatafqamatswidehebrew;05B3 -hatafsegol;05B1 -hatafsegol17;05B1 -hatafsegol24;05B1 -hatafsegol30;05B1 -hatafsegolhebrew;05B1 -hatafsegolnarrowhebrew;05B1 -hatafsegolquarterhebrew;05B1 -hatafsegolwidehebrew;05B1 -hbar;0127 -hbopomofo;310F -hbrevebelow;1E2B -hcedilla;1E29 -hcircle;24D7 -hcircumflex;0125 -hdieresis;1E27 -hdotaccent;1E23 -hdotbelow;1E25 -he;05D4 -heart;2665 -heartsuitblack;2665 -heartsuitwhite;2661 -hedagesh;FB34 -hedageshhebrew;FB34 -hehaltonearabic;06C1 -heharabic;0647 -hehebrew;05D4 -hehfinalaltonearabic;FBA7 -hehfinalalttwoarabic;FEEA -hehfinalarabic;FEEA -hehhamzaabovefinalarabic;FBA5 -hehhamzaaboveisolatedarabic;FBA4 -hehinitialaltonearabic;FBA8 -hehinitialarabic;FEEB -hehiragana;3078 -hehmedialaltonearabic;FBA9 -hehmedialarabic;FEEC -heiseierasquare;337B -hekatakana;30D8 -hekatakanahalfwidth;FF8D -hekutaarusquare;3336 -henghook;0267 -herutusquare;3339 -het;05D7 -hethebrew;05D7 -hhook;0266 -hhooksuperior;02B1 -hieuhacirclekorean;327B -hieuhaparenkorean;321B -hieuhcirclekorean;326D -hieuhkorean;314E -hieuhparenkorean;320D -hihiragana;3072 -hikatakana;30D2 -hikatakanahalfwidth;FF8B -hiriq;05B4 -hiriq14;05B4 -hiriq21;05B4 -hiriq2d;05B4 -hiriqhebrew;05B4 -hiriqnarrowhebrew;05B4 -hiriqquarterhebrew;05B4 -hiriqwidehebrew;05B4 -hlinebelow;1E96 -hmonospace;FF48 -hoarmenian;0570 -hohipthai;0E2B -hohiragana;307B -hokatakana;30DB -hokatakanahalfwidth;FF8E -holam;05B9 -holam19;05B9 -holam26;05B9 -holam32;05B9 -holamhebrew;05B9 -holamnarrowhebrew;05B9 -holamquarterhebrew;05B9 -holamwidehebrew;05B9 -honokhukthai;0E2E -hookabovecomb;0309 -hookcmb;0309 -hookpalatalizedbelowcmb;0321 -hookretroflexbelowcmb;0322 -hoonsquare;3342 
-horicoptic;03E9 -horizontalbar;2015 -horncmb;031B -hotsprings;2668 -house;2302 -hparen;24A3 -hsuperior;02B0 -hturned;0265 -huhiragana;3075 -huiitosquare;3333 -hukatakana;30D5 -hukatakanahalfwidth;FF8C -hungarumlaut;02DD -hungarumlautcmb;030B -hv;0195 -hyphen;002D -hypheninferior;F6E5 -hyphenmonospace;FF0D -hyphensmall;FE63 -hyphensuperior;F6E6 -hyphentwo;2010 -i;0069 -iacute;00ED -iacyrillic;044F -ibengali;0987 -ibopomofo;3127 -ibreve;012D -icaron;01D0 -icircle;24D8 -icircumflex;00EE -icyrillic;0456 -idblgrave;0209 -ideographearthcircle;328F -ideographfirecircle;328B -ideographicallianceparen;323F -ideographiccallparen;323A -ideographiccentrecircle;32A5 -ideographicclose;3006 -ideographiccomma;3001 -ideographiccommaleft;FF64 -ideographiccongratulationparen;3237 -ideographiccorrectcircle;32A3 -ideographicearthparen;322F -ideographicenterpriseparen;323D -ideographicexcellentcircle;329D -ideographicfestivalparen;3240 -ideographicfinancialcircle;3296 -ideographicfinancialparen;3236 -ideographicfireparen;322B -ideographichaveparen;3232 -ideographichighcircle;32A4 -ideographiciterationmark;3005 -ideographiclaborcircle;3298 -ideographiclaborparen;3238 -ideographicleftcircle;32A7 -ideographiclowcircle;32A6 -ideographicmedicinecircle;32A9 -ideographicmetalparen;322E -ideographicmoonparen;322A -ideographicnameparen;3234 -ideographicperiod;3002 -ideographicprintcircle;329E -ideographicreachparen;3243 -ideographicrepresentparen;3239 -ideographicresourceparen;323E -ideographicrightcircle;32A8 -ideographicsecretcircle;3299 -ideographicselfparen;3242 -ideographicsocietyparen;3233 -ideographicspace;3000 -ideographicspecialparen;3235 -ideographicstockparen;3231 -ideographicstudyparen;323B -ideographicsunparen;3230 -ideographicsuperviseparen;323C -ideographicwaterparen;322C -ideographicwoodparen;322D -ideographiczero;3007 -ideographmetalcircle;328E -ideographmooncircle;328A -ideographnamecircle;3294 -ideographsuncircle;3290 -ideographwatercircle;328C -ideographwoodcircle;328D 
-ideva;0907 -idieresis;00EF -idieresisacute;1E2F -idieresiscyrillic;04E5 -idotbelow;1ECB -iebrevecyrillic;04D7 -iecyrillic;0435 -ieungacirclekorean;3275 -ieungaparenkorean;3215 -ieungcirclekorean;3267 -ieungkorean;3147 -ieungparenkorean;3207 -igrave;00EC -igujarati;0A87 -igurmukhi;0A07 -ihiragana;3044 -ihookabove;1EC9 -iibengali;0988 -iicyrillic;0438 -iideva;0908 -iigujarati;0A88 -iigurmukhi;0A08 -iimatragurmukhi;0A40 -iinvertedbreve;020B -iishortcyrillic;0439 -iivowelsignbengali;09C0 -iivowelsigndeva;0940 -iivowelsigngujarati;0AC0 -ij;0133 -ikatakana;30A4 -ikatakanahalfwidth;FF72 -ikorean;3163 -ilde;02DC -iluyhebrew;05AC -imacron;012B -imacroncyrillic;04E3 -imageorapproximatelyequal;2253 -imatragurmukhi;0A3F -imonospace;FF49 -increment;2206 -infinity;221E -iniarmenian;056B -integral;222B -integralbottom;2321 -integralbt;2321 -integralex;F8F5 -integraltop;2320 -integraltp;2320 -intersection;2229 -intisquare;3305 -invbullet;25D8 -invcircle;25D9 -invsmileface;263B -iocyrillic;0451 -iogonek;012F -iota;03B9 -iotadieresis;03CA -iotadieresistonos;0390 -iotalatin;0269 -iotatonos;03AF -iparen;24A4 -irigurmukhi;0A72 -ismallhiragana;3043 -ismallkatakana;30A3 -ismallkatakanahalfwidth;FF68 -issharbengali;09FA -istroke;0268 -isuperior;F6ED -iterationhiragana;309D -iterationkatakana;30FD -itilde;0129 -itildebelow;1E2D -iubopomofo;3129 -iucyrillic;044E -ivowelsignbengali;09BF -ivowelsigndeva;093F -ivowelsigngujarati;0ABF -izhitsacyrillic;0475 -izhitsadblgravecyrillic;0477 -j;006A -jaarmenian;0571 -jabengali;099C -jadeva;091C -jagujarati;0A9C -jagurmukhi;0A1C -jbopomofo;3110 -jcaron;01F0 -jcircle;24D9 -jcircumflex;0135 -jcrossedtail;029D -jdotlessstroke;025F -jecyrillic;0458 -jeemarabic;062C -jeemfinalarabic;FE9E -jeeminitialarabic;FE9F -jeemmedialarabic;FEA0 -jeharabic;0698 -jehfinalarabic;FB8B -jhabengali;099D -jhadeva;091D -jhagujarati;0A9D -jhagurmukhi;0A1D -jheharmenian;057B -jis;3004 -jmonospace;FF4A -jparen;24A5 -jsuperior;02B2 -k;006B -kabashkircyrillic;04A1 
-kabengali;0995 -kacute;1E31 -kacyrillic;043A -kadescendercyrillic;049B -kadeva;0915 -kaf;05DB -kafarabic;0643 -kafdagesh;FB3B -kafdageshhebrew;FB3B -kaffinalarabic;FEDA -kafhebrew;05DB -kafinitialarabic;FEDB -kafmedialarabic;FEDC -kafrafehebrew;FB4D -kagujarati;0A95 -kagurmukhi;0A15 -kahiragana;304B -kahookcyrillic;04C4 -kakatakana;30AB -kakatakanahalfwidth;FF76 -kappa;03BA -kappasymbolgreek;03F0 -kapyeounmieumkorean;3171 -kapyeounphieuphkorean;3184 -kapyeounpieupkorean;3178 -kapyeounssangpieupkorean;3179 -karoriisquare;330D -kashidaautoarabic;0640 -kashidaautonosidebearingarabic;0640 -kasmallkatakana;30F5 -kasquare;3384 -kasraarabic;0650 -kasratanarabic;064D -kastrokecyrillic;049F -katahiraprolongmarkhalfwidth;FF70 -kaverticalstrokecyrillic;049D -kbopomofo;310E -kcalsquare;3389 -kcaron;01E9 -kcedilla;0137 -kcircle;24DA -kcommaaccent;0137 -kdotbelow;1E33 -keharmenian;0584 -kehiragana;3051 -kekatakana;30B1 -kekatakanahalfwidth;FF79 -kenarmenian;056F -kesmallkatakana;30F6 -kgreenlandic;0138 -khabengali;0996 -khacyrillic;0445 -khadeva;0916 -khagujarati;0A96 -khagurmukhi;0A16 -khaharabic;062E -khahfinalarabic;FEA6 -khahinitialarabic;FEA7 -khahmedialarabic;FEA8 -kheicoptic;03E7 -khhadeva;0959 -khhagurmukhi;0A59 -khieukhacirclekorean;3278 -khieukhaparenkorean;3218 -khieukhcirclekorean;326A -khieukhkorean;314B -khieukhparenkorean;320A -khokhaithai;0E02 -khokhonthai;0E05 -khokhuatthai;0E03 -khokhwaithai;0E04 -khomutthai;0E5B -khook;0199 -khorakhangthai;0E06 -khzsquare;3391 -kihiragana;304D -kikatakana;30AD -kikatakanahalfwidth;FF77 -kiroguramusquare;3315 -kiromeetorusquare;3316 -kirosquare;3314 -kiyeokacirclekorean;326E -kiyeokaparenkorean;320E -kiyeokcirclekorean;3260 -kiyeokkorean;3131 -kiyeokparenkorean;3200 -kiyeoksioskorean;3133 -kjecyrillic;045C -klinebelow;1E35 -klsquare;3398 -kmcubedsquare;33A6 -kmonospace;FF4B -kmsquaredsquare;33A2 -kohiragana;3053 -kohmsquare;33C0 -kokaithai;0E01 -kokatakana;30B3 -kokatakanahalfwidth;FF7A -kooposquare;331E -koppacyrillic;0481 
-koreanstandardsymbol;327F -koroniscmb;0343 -kparen;24A6 -kpasquare;33AA -ksicyrillic;046F -ktsquare;33CF -kturned;029E -kuhiragana;304F -kukatakana;30AF -kukatakanahalfwidth;FF78 -kvsquare;33B8 -kwsquare;33BE -l;006C -labengali;09B2 -lacute;013A -ladeva;0932 -lagujarati;0AB2 -lagurmukhi;0A32 -lakkhangyaothai;0E45 -lamaleffinalarabic;FEFC -lamalefhamzaabovefinalarabic;FEF8 -lamalefhamzaaboveisolatedarabic;FEF7 -lamalefhamzabelowfinalarabic;FEFA -lamalefhamzabelowisolatedarabic;FEF9 -lamalefisolatedarabic;FEFB -lamalefmaddaabovefinalarabic;FEF6 -lamalefmaddaaboveisolatedarabic;FEF5 -lamarabic;0644 -lambda;03BB -lambdastroke;019B -lamed;05DC -lameddagesh;FB3C -lameddageshhebrew;FB3C -lamedhebrew;05DC -lamedholam;05DC 05B9 -lamedholamdagesh;05DC 05B9 05BC -lamedholamdageshhebrew;05DC 05B9 05BC -lamedholamhebrew;05DC 05B9 -lamfinalarabic;FEDE -lamhahinitialarabic;FCCA -laminitialarabic;FEDF -lamjeeminitialarabic;FCC9 -lamkhahinitialarabic;FCCB -lamlamhehisolatedarabic;FDF2 -lammedialarabic;FEE0 -lammeemhahinitialarabic;FD88 -lammeeminitialarabic;FCCC -lammeemjeeminitialarabic;FEDF FEE4 FEA0 -lammeemkhahinitialarabic;FEDF FEE4 FEA8 -largecircle;25EF -lbar;019A -lbelt;026C -lbopomofo;310C -lcaron;013E -lcedilla;013C -lcircle;24DB -lcircumflexbelow;1E3D -lcommaaccent;013C -ldot;0140 -ldotaccent;0140 -ldotbelow;1E37 -ldotbelowmacron;1E39 -leftangleabovecmb;031A -lefttackbelowcmb;0318 -less;003C -lessequal;2264 -lessequalorgreater;22DA -lessmonospace;FF1C -lessorequivalent;2272 -lessorgreater;2276 -lessoverequal;2266 -lesssmall;FE64 -lezh;026E -lfblock;258C -lhookretroflex;026D -lira;20A4 -liwnarmenian;056C -lj;01C9 -ljecyrillic;0459 -ll;F6C0 -lladeva;0933 -llagujarati;0AB3 -llinebelow;1E3B -llladeva;0934 -llvocalicbengali;09E1 -llvocalicdeva;0961 -llvocalicvowelsignbengali;09E3 -llvocalicvowelsigndeva;0963 -lmiddletilde;026B -lmonospace;FF4C -lmsquare;33D0 -lochulathai;0E2C -logicaland;2227 -logicalnot;00AC -logicalnotreversed;2310 -logicalor;2228 -lolingthai;0E25 
-longs;017F -lowlinecenterline;FE4E -lowlinecmb;0332 -lowlinedashed;FE4D -lozenge;25CA -lparen;24A7 -lslash;0142 -lsquare;2113 -lsuperior;F6EE -ltshade;2591 -luthai;0E26 -lvocalicbengali;098C -lvocalicdeva;090C -lvocalicvowelsignbengali;09E2 -lvocalicvowelsigndeva;0962 -lxsquare;33D3 -m;006D -mabengali;09AE -macron;00AF -macronbelowcmb;0331 -macroncmb;0304 -macronlowmod;02CD -macronmonospace;FFE3 -macute;1E3F -madeva;092E -magujarati;0AAE -magurmukhi;0A2E -mahapakhhebrew;05A4 -mahapakhlefthebrew;05A4 -mahiragana;307E -maichattawalowleftthai;F895 -maichattawalowrightthai;F894 -maichattawathai;0E4B -maichattawaupperleftthai;F893 -maieklowleftthai;F88C -maieklowrightthai;F88B -maiekthai;0E48 -maiekupperleftthai;F88A -maihanakatleftthai;F884 -maihanakatthai;0E31 -maitaikhuleftthai;F889 -maitaikhuthai;0E47 -maitholowleftthai;F88F -maitholowrightthai;F88E -maithothai;0E49 -maithoupperleftthai;F88D -maitrilowleftthai;F892 -maitrilowrightthai;F891 -maitrithai;0E4A -maitriupperleftthai;F890 -maiyamokthai;0E46 -makatakana;30DE -makatakanahalfwidth;FF8F -male;2642 -mansyonsquare;3347 -maqafhebrew;05BE -mars;2642 -masoracirclehebrew;05AF -masquare;3383 -mbopomofo;3107 -mbsquare;33D4 -mcircle;24DC -mcubedsquare;33A5 -mdotaccent;1E41 -mdotbelow;1E43 -meemarabic;0645 -meemfinalarabic;FEE2 -meeminitialarabic;FEE3 -meemmedialarabic;FEE4 -meemmeeminitialarabic;FCD1 -meemmeemisolatedarabic;FC48 -meetorusquare;334D -mehiragana;3081 -meizierasquare;337E -mekatakana;30E1 -mekatakanahalfwidth;FF92 -mem;05DE -memdagesh;FB3E -memdageshhebrew;FB3E -memhebrew;05DE -menarmenian;0574 -merkhahebrew;05A5 -merkhakefulahebrew;05A6 -merkhakefulalefthebrew;05A6 -merkhalefthebrew;05A5 -mhook;0271 -mhzsquare;3392 -middledotkatakanahalfwidth;FF65 -middot;00B7 -mieumacirclekorean;3272 -mieumaparenkorean;3212 -mieumcirclekorean;3264 -mieumkorean;3141 -mieumpansioskorean;3170 -mieumparenkorean;3204 -mieumpieupkorean;316E -mieumsioskorean;316F -mihiragana;307F -mikatakana;30DF -mikatakanahalfwidth;FF90 
-minus;2212 -minusbelowcmb;0320 -minuscircle;2296 -minusmod;02D7 -minusplus;2213 -minute;2032 -miribaarusquare;334A -mirisquare;3349 -mlonglegturned;0270 -mlsquare;3396 -mmcubedsquare;33A3 -mmonospace;FF4D -mmsquaredsquare;339F -mohiragana;3082 -mohmsquare;33C1 -mokatakana;30E2 -mokatakanahalfwidth;FF93 -molsquare;33D6 -momathai;0E21 -moverssquare;33A7 -moverssquaredsquare;33A8 -mparen;24A8 -mpasquare;33AB -mssquare;33B3 -msuperior;F6EF -mturned;026F -mu;00B5 -mu1;00B5 -muasquare;3382 -muchgreater;226B -muchless;226A -mufsquare;338C -mugreek;03BC -mugsquare;338D -muhiragana;3080 -mukatakana;30E0 -mukatakanahalfwidth;FF91 -mulsquare;3395 -multiply;00D7 -mumsquare;339B -munahhebrew;05A3 -munahlefthebrew;05A3 -musicalnote;266A -musicalnotedbl;266B -musicflatsign;266D -musicsharpsign;266F -mussquare;33B2 -muvsquare;33B6 -muwsquare;33BC -mvmegasquare;33B9 -mvsquare;33B7 -mwmegasquare;33BF -mwsquare;33BD -n;006E -nabengali;09A8 -nabla;2207 -nacute;0144 -nadeva;0928 -nagujarati;0AA8 -nagurmukhi;0A28 -nahiragana;306A -nakatakana;30CA -nakatakanahalfwidth;FF85 -napostrophe;0149 -nasquare;3381 -nbopomofo;310B -nbspace;00A0 -ncaron;0148 -ncedilla;0146 -ncircle;24DD -ncircumflexbelow;1E4B -ncommaaccent;0146 -ndotaccent;1E45 -ndotbelow;1E47 -nehiragana;306D -nekatakana;30CD -nekatakanahalfwidth;FF88 -newsheqelsign;20AA -nfsquare;338B -ngabengali;0999 -ngadeva;0919 -ngagujarati;0A99 -ngagurmukhi;0A19 -ngonguthai;0E07 -nhiragana;3093 -nhookleft;0272 -nhookretroflex;0273 -nieunacirclekorean;326F -nieunaparenkorean;320F -nieuncieuckorean;3135 -nieuncirclekorean;3261 -nieunhieuhkorean;3136 -nieunkorean;3134 -nieunpansioskorean;3168 -nieunparenkorean;3201 -nieunsioskorean;3167 -nieuntikeutkorean;3166 -nihiragana;306B -nikatakana;30CB -nikatakanahalfwidth;FF86 -nikhahitleftthai;F899 -nikhahitthai;0E4D -nine;0039 -ninearabic;0669 -ninebengali;09EF -ninecircle;2468 -ninecircleinversesansserif;2792 -ninedeva;096F -ninegujarati;0AEF -ninegurmukhi;0A6F -ninehackarabic;0669 
-ninehangzhou;3029 -nineideographicparen;3228 -nineinferior;2089 -ninemonospace;FF19 -nineoldstyle;F739 -nineparen;247C -nineperiod;2490 -ninepersian;06F9 -nineroman;2178 -ninesuperior;2079 -nineteencircle;2472 -nineteenparen;2486 -nineteenperiod;249A -ninethai;0E59 -nj;01CC -njecyrillic;045A -nkatakana;30F3 -nkatakanahalfwidth;FF9D -nlegrightlong;019E -nlinebelow;1E49 -nmonospace;FF4E -nmsquare;339A -nnabengali;09A3 -nnadeva;0923 -nnagujarati;0AA3 -nnagurmukhi;0A23 -nnnadeva;0929 -nohiragana;306E -nokatakana;30CE -nokatakanahalfwidth;FF89 -nonbreakingspace;00A0 -nonenthai;0E13 -nonuthai;0E19 -noonarabic;0646 -noonfinalarabic;FEE6 -noonghunnaarabic;06BA -noonghunnafinalarabic;FB9F -noonhehinitialarabic;FEE7 FEEC -nooninitialarabic;FEE7 -noonjeeminitialarabic;FCD2 -noonjeemisolatedarabic;FC4B -noonmedialarabic;FEE8 -noonmeeminitialarabic;FCD5 -noonmeemisolatedarabic;FC4E -noonnoonfinalarabic;FC8D -notcontains;220C -notelement;2209 -notelementof;2209 -notequal;2260 -notgreater;226F -notgreaternorequal;2271 -notgreaternorless;2279 -notidentical;2262 -notless;226E -notlessnorequal;2270 -notparallel;2226 -notprecedes;2280 -notsubset;2284 -notsucceeds;2281 -notsuperset;2285 -nowarmenian;0576 -nparen;24A9 -nssquare;33B1 -nsuperior;207F -ntilde;00F1 -nu;03BD -nuhiragana;306C -nukatakana;30CC -nukatakanahalfwidth;FF87 -nuktabengali;09BC -nuktadeva;093C -nuktagujarati;0ABC -nuktagurmukhi;0A3C -numbersign;0023 -numbersignmonospace;FF03 -numbersignsmall;FE5F -numeralsigngreek;0374 -numeralsignlowergreek;0375 -numero;2116 -nun;05E0 -nundagesh;FB40 -nundageshhebrew;FB40 -nunhebrew;05E0 -nvsquare;33B5 -nwsquare;33BB -nyabengali;099E -nyadeva;091E -nyagujarati;0A9E -nyagurmukhi;0A1E -o;006F -oacute;00F3 -oangthai;0E2D -obarred;0275 -obarredcyrillic;04E9 -obarreddieresiscyrillic;04EB -obengali;0993 -obopomofo;311B -obreve;014F -ocandradeva;0911 -ocandragujarati;0A91 -ocandravowelsigndeva;0949 -ocandravowelsigngujarati;0AC9 -ocaron;01D2 -ocircle;24DE -ocircumflex;00F4 
-ocircumflexacute;1ED1 -ocircumflexdotbelow;1ED9 -ocircumflexgrave;1ED3 -ocircumflexhookabove;1ED5 -ocircumflextilde;1ED7 -ocyrillic;043E -odblacute;0151 -odblgrave;020D -odeva;0913 -odieresis;00F6 -odieresiscyrillic;04E7 -odotbelow;1ECD -oe;0153 -oekorean;315A -ogonek;02DB -ogonekcmb;0328 -ograve;00F2 -ogujarati;0A93 -oharmenian;0585 -ohiragana;304A -ohookabove;1ECF -ohorn;01A1 -ohornacute;1EDB -ohorndotbelow;1EE3 -ohorngrave;1EDD -ohornhookabove;1EDF -ohorntilde;1EE1 -ohungarumlaut;0151 -oi;01A3 -oinvertedbreve;020F -okatakana;30AA -okatakanahalfwidth;FF75 -okorean;3157 -olehebrew;05AB -omacron;014D -omacronacute;1E53 -omacrongrave;1E51 -omdeva;0950 -omega;03C9 -omega1;03D6 -omegacyrillic;0461 -omegalatinclosed;0277 -omegaroundcyrillic;047B -omegatitlocyrillic;047D -omegatonos;03CE -omgujarati;0AD0 -omicron;03BF -omicrontonos;03CC -omonospace;FF4F -one;0031 -onearabic;0661 -onebengali;09E7 -onecircle;2460 -onecircleinversesansserif;278A -onedeva;0967 -onedotenleader;2024 -oneeighth;215B -onefitted;F6DC -onegujarati;0AE7 -onegurmukhi;0A67 -onehackarabic;0661 -onehalf;00BD -onehangzhou;3021 -oneideographicparen;3220 -oneinferior;2081 -onemonospace;FF11 -onenumeratorbengali;09F4 -oneoldstyle;F731 -oneparen;2474 -oneperiod;2488 -onepersian;06F1 -onequarter;00BC -oneroman;2170 -onesuperior;00B9 -onethai;0E51 -onethird;2153 -oogonek;01EB -oogonekmacron;01ED -oogurmukhi;0A13 -oomatragurmukhi;0A4B -oopen;0254 -oparen;24AA -openbullet;25E6 -option;2325 -ordfeminine;00AA -ordmasculine;00BA -orthogonal;221F -oshortdeva;0912 -oshortvowelsigndeva;094A -oslash;00F8 -oslashacute;01FF -osmallhiragana;3049 -osmallkatakana;30A9 -osmallkatakanahalfwidth;FF6B -ostrokeacute;01FF -osuperior;F6F0 -otcyrillic;047F -otilde;00F5 -otildeacute;1E4D -otildedieresis;1E4F -oubopomofo;3121 -overline;203E -overlinecenterline;FE4A -overlinecmb;0305 -overlinedashed;FE49 -overlinedblwavy;FE4C -overlinewavy;FE4B -overscore;00AF -ovowelsignbengali;09CB -ovowelsigndeva;094B -ovowelsigngujarati;0ACB 
-p;0070 -paampssquare;3380 -paasentosquare;332B -pabengali;09AA -pacute;1E55 -padeva;092A -pagedown;21DF -pageup;21DE -pagujarati;0AAA -pagurmukhi;0A2A -pahiragana;3071 -paiyannoithai;0E2F -pakatakana;30D1 -palatalizationcyrilliccmb;0484 -palochkacyrillic;04C0 -pansioskorean;317F -paragraph;00B6 -parallel;2225 -parenleft;0028 -parenleftaltonearabic;FD3E -parenleftbt;F8ED -parenleftex;F8EC -parenleftinferior;208D -parenleftmonospace;FF08 -parenleftsmall;FE59 -parenleftsuperior;207D -parenlefttp;F8EB -parenleftvertical;FE35 -parenright;0029 -parenrightaltonearabic;FD3F -parenrightbt;F8F8 -parenrightex;F8F7 -parenrightinferior;208E -parenrightmonospace;FF09 -parenrightsmall;FE5A -parenrightsuperior;207E -parenrighttp;F8F6 -parenrightvertical;FE36 -partialdiff;2202 -paseqhebrew;05C0 -pashtahebrew;0599 -pasquare;33A9 -patah;05B7 -patah11;05B7 -patah1d;05B7 -patah2a;05B7 -patahhebrew;05B7 -patahnarrowhebrew;05B7 -patahquarterhebrew;05B7 -patahwidehebrew;05B7 -pazerhebrew;05A1 -pbopomofo;3106 -pcircle;24DF -pdotaccent;1E57 -pe;05E4 -pecyrillic;043F -pedagesh;FB44 -pedageshhebrew;FB44 -peezisquare;333B -pefinaldageshhebrew;FB43 -peharabic;067E -peharmenian;057A -pehebrew;05E4 -pehfinalarabic;FB57 -pehinitialarabic;FB58 -pehiragana;307A -pehmedialarabic;FB59 -pekatakana;30DA -pemiddlehookcyrillic;04A7 -perafehebrew;FB4E -percent;0025 -percentarabic;066A -percentmonospace;FF05 -percentsmall;FE6A -period;002E -periodarmenian;0589 -periodcentered;00B7 -periodhalfwidth;FF61 -periodinferior;F6E7 -periodmonospace;FF0E -periodsmall;FE52 -periodsuperior;F6E8 -perispomenigreekcmb;0342 -perpendicular;22A5 -perthousand;2030 -peseta;20A7 -pfsquare;338A -phabengali;09AB -phadeva;092B -phagujarati;0AAB -phagurmukhi;0A2B -phi;03C6 -phi1;03D5 -phieuphacirclekorean;327A -phieuphaparenkorean;321A -phieuphcirclekorean;326C -phieuphkorean;314D -phieuphparenkorean;320C -philatin;0278 -phinthuthai;0E3A -phisymbolgreek;03D5 -phook;01A5 -phophanthai;0E1E -phophungthai;0E1C -phosamphaothai;0E20 
-pi;03C0 -pieupacirclekorean;3273 -pieupaparenkorean;3213 -pieupcieuckorean;3176 -pieupcirclekorean;3265 -pieupkiyeokkorean;3172 -pieupkorean;3142 -pieupparenkorean;3205 -pieupsioskiyeokkorean;3174 -pieupsioskorean;3144 -pieupsiostikeutkorean;3175 -pieupthieuthkorean;3177 -pieuptikeutkorean;3173 -pihiragana;3074 -pikatakana;30D4 -pisymbolgreek;03D6 -piwrarmenian;0583 -plus;002B -plusbelowcmb;031F -pluscircle;2295 -plusminus;00B1 -plusmod;02D6 -plusmonospace;FF0B -plussmall;FE62 -plussuperior;207A -pmonospace;FF50 -pmsquare;33D8 -pohiragana;307D -pointingindexdownwhite;261F -pointingindexleftwhite;261C -pointingindexrightwhite;261E -pointingindexupwhite;261D -pokatakana;30DD -poplathai;0E1B -postalmark;3012 -postalmarkface;3020 -pparen;24AB -precedes;227A -prescription;211E -primemod;02B9 -primereversed;2035 -product;220F -projective;2305 -prolongedkana;30FC -propellor;2318 -propersubset;2282 -propersuperset;2283 -proportion;2237 -proportional;221D -psi;03C8 -psicyrillic;0471 -psilipneumatacyrilliccmb;0486 -pssquare;33B0 -puhiragana;3077 -pukatakana;30D7 -pvsquare;33B4 -pwsquare;33BA -q;0071 -qadeva;0958 -qadmahebrew;05A8 -qafarabic;0642 -qaffinalarabic;FED6 -qafinitialarabic;FED7 -qafmedialarabic;FED8 -qamats;05B8 -qamats10;05B8 -qamats1a;05B8 -qamats1c;05B8 -qamats27;05B8 -qamats29;05B8 -qamats33;05B8 -qamatsde;05B8 -qamatshebrew;05B8 -qamatsnarrowhebrew;05B8 -qamatsqatanhebrew;05B8 -qamatsqatannarrowhebrew;05B8 -qamatsqatanquarterhebrew;05B8 -qamatsqatanwidehebrew;05B8 -qamatsquarterhebrew;05B8 -qamatswidehebrew;05B8 -qarneyparahebrew;059F -qbopomofo;3111 -qcircle;24E0 -qhook;02A0 -qmonospace;FF51 -qof;05E7 -qofdagesh;FB47 -qofdageshhebrew;FB47 -qofhatafpatah;05E7 05B2 -qofhatafpatahhebrew;05E7 05B2 -qofhatafsegol;05E7 05B1 -qofhatafsegolhebrew;05E7 05B1 -qofhebrew;05E7 -qofhiriq;05E7 05B4 -qofhiriqhebrew;05E7 05B4 -qofholam;05E7 05B9 -qofholamhebrew;05E7 05B9 -qofpatah;05E7 05B7 -qofpatahhebrew;05E7 05B7 -qofqamats;05E7 05B8 -qofqamatshebrew;05E7 05B8 
-qofqubuts;05E7 05BB -qofqubutshebrew;05E7 05BB -qofsegol;05E7 05B6 -qofsegolhebrew;05E7 05B6 -qofsheva;05E7 05B0 -qofshevahebrew;05E7 05B0 -qoftsere;05E7 05B5 -qoftserehebrew;05E7 05B5 -qparen;24AC -quarternote;2669 -qubuts;05BB -qubuts18;05BB -qubuts25;05BB -qubuts31;05BB -qubutshebrew;05BB -qubutsnarrowhebrew;05BB -qubutsquarterhebrew;05BB -qubutswidehebrew;05BB -question;003F -questionarabic;061F -questionarmenian;055E -questiondown;00BF -questiondownsmall;F7BF -questiongreek;037E -questionmonospace;FF1F -questionsmall;F73F -quotedbl;0022 -quotedblbase;201E -quotedblleft;201C -quotedblmonospace;FF02 -quotedblprime;301E -quotedblprimereversed;301D -quotedblright;201D -quoteleft;2018 -quoteleftreversed;201B -quotereversed;201B -quoteright;2019 -quoterightn;0149 -quotesinglbase;201A -quotesingle;0027 -quotesinglemonospace;FF07 -r;0072 -raarmenian;057C -rabengali;09B0 -racute;0155 -radeva;0930 -radical;221A -radicalex;F8E5 -radoverssquare;33AE -radoverssquaredsquare;33AF -radsquare;33AD -rafe;05BF -rafehebrew;05BF -ragujarati;0AB0 -ragurmukhi;0A30 -rahiragana;3089 -rakatakana;30E9 -rakatakanahalfwidth;FF97 -ralowerdiagonalbengali;09F1 -ramiddlediagonalbengali;09F0 -ramshorn;0264 -ratio;2236 -rbopomofo;3116 -rcaron;0159 -rcedilla;0157 -rcircle;24E1 -rcommaaccent;0157 -rdblgrave;0211 -rdotaccent;1E59 -rdotbelow;1E5B -rdotbelowmacron;1E5D -referencemark;203B -reflexsubset;2286 -reflexsuperset;2287 -registered;00AE -registersans;F8E8 -registerserif;F6DA -reharabic;0631 -reharmenian;0580 -rehfinalarabic;FEAE -rehiragana;308C -rehyehaleflamarabic;0631 FEF3 FE8E 0644 -rekatakana;30EC -rekatakanahalfwidth;FF9A -resh;05E8 -reshdageshhebrew;FB48 -reshhatafpatah;05E8 05B2 -reshhatafpatahhebrew;05E8 05B2 -reshhatafsegol;05E8 05B1 -reshhatafsegolhebrew;05E8 05B1 -reshhebrew;05E8 -reshhiriq;05E8 05B4 -reshhiriqhebrew;05E8 05B4 -reshholam;05E8 05B9 -reshholamhebrew;05E8 05B9 -reshpatah;05E8 05B7 -reshpatahhebrew;05E8 05B7 -reshqamats;05E8 05B8 -reshqamatshebrew;05E8 05B8 
-reshqubuts;05E8 05BB -reshqubutshebrew;05E8 05BB -reshsegol;05E8 05B6 -reshsegolhebrew;05E8 05B6 -reshsheva;05E8 05B0 -reshshevahebrew;05E8 05B0 -reshtsere;05E8 05B5 -reshtserehebrew;05E8 05B5 -reversedtilde;223D -reviahebrew;0597 -reviamugrashhebrew;0597 -revlogicalnot;2310 -rfishhook;027E -rfishhookreversed;027F -rhabengali;09DD -rhadeva;095D -rho;03C1 -rhook;027D -rhookturned;027B -rhookturnedsuperior;02B5 -rhosymbolgreek;03F1 -rhotichookmod;02DE -rieulacirclekorean;3271 -rieulaparenkorean;3211 -rieulcirclekorean;3263 -rieulhieuhkorean;3140 -rieulkiyeokkorean;313A -rieulkiyeoksioskorean;3169 -rieulkorean;3139 -rieulmieumkorean;313B -rieulpansioskorean;316C -rieulparenkorean;3203 -rieulphieuphkorean;313F -rieulpieupkorean;313C -rieulpieupsioskorean;316B -rieulsioskorean;313D -rieulthieuthkorean;313E -rieultikeutkorean;316A -rieulyeorinhieuhkorean;316D -rightangle;221F -righttackbelowcmb;0319 -righttriangle;22BF -rihiragana;308A -rikatakana;30EA -rikatakanahalfwidth;FF98 -ring;02DA -ringbelowcmb;0325 -ringcmb;030A -ringhalfleft;02BF -ringhalfleftarmenian;0559 -ringhalfleftbelowcmb;031C -ringhalfleftcentered;02D3 -ringhalfright;02BE -ringhalfrightbelowcmb;0339 -ringhalfrightcentered;02D2 -rinvertedbreve;0213 -rittorusquare;3351 -rlinebelow;1E5F -rlongleg;027C -rlonglegturned;027A -rmonospace;FF52 -rohiragana;308D -rokatakana;30ED -rokatakanahalfwidth;FF9B -roruathai;0E23 -rparen;24AD -rrabengali;09DC -rradeva;0931 -rragurmukhi;0A5C -rreharabic;0691 -rrehfinalarabic;FB8D -rrvocalicbengali;09E0 -rrvocalicdeva;0960 -rrvocalicgujarati;0AE0 -rrvocalicvowelsignbengali;09C4 -rrvocalicvowelsigndeva;0944 -rrvocalicvowelsigngujarati;0AC4 -rsuperior;F6F1 -rtblock;2590 -rturned;0279 -rturnedsuperior;02B4 -ruhiragana;308B -rukatakana;30EB -rukatakanahalfwidth;FF99 -rupeemarkbengali;09F2 -rupeesignbengali;09F3 -rupiah;F6DD -ruthai;0E24 -rvocalicbengali;098B -rvocalicdeva;090B -rvocalicgujarati;0A8B -rvocalicvowelsignbengali;09C3 -rvocalicvowelsigndeva;0943 
-rvocalicvowelsigngujarati;0AC3 -s;0073 -sabengali;09B8 -sacute;015B -sacutedotaccent;1E65 -sadarabic;0635 -sadeva;0938 -sadfinalarabic;FEBA -sadinitialarabic;FEBB -sadmedialarabic;FEBC -sagujarati;0AB8 -sagurmukhi;0A38 -sahiragana;3055 -sakatakana;30B5 -sakatakanahalfwidth;FF7B -sallallahoualayhewasallamarabic;FDFA -samekh;05E1 -samekhdagesh;FB41 -samekhdageshhebrew;FB41 -samekhhebrew;05E1 -saraaathai;0E32 -saraaethai;0E41 -saraaimaimalaithai;0E44 -saraaimaimuanthai;0E43 -saraamthai;0E33 -saraathai;0E30 -saraethai;0E40 -saraiileftthai;F886 -saraiithai;0E35 -saraileftthai;F885 -saraithai;0E34 -saraothai;0E42 -saraueeleftthai;F888 -saraueethai;0E37 -saraueleftthai;F887 -sarauethai;0E36 -sarauthai;0E38 -sarauuthai;0E39 -sbopomofo;3119 -scaron;0161 -scarondotaccent;1E67 -scedilla;015F -schwa;0259 -schwacyrillic;04D9 -schwadieresiscyrillic;04DB -schwahook;025A -scircle;24E2 -scircumflex;015D -scommaaccent;0219 -sdotaccent;1E61 -sdotbelow;1E63 -sdotbelowdotaccent;1E69 -seagullbelowcmb;033C -second;2033 -secondtonechinese;02CA -section;00A7 -seenarabic;0633 -seenfinalarabic;FEB2 -seeninitialarabic;FEB3 -seenmedialarabic;FEB4 -segol;05B6 -segol13;05B6 -segol1f;05B6 -segol2c;05B6 -segolhebrew;05B6 -segolnarrowhebrew;05B6 -segolquarterhebrew;05B6 -segoltahebrew;0592 -segolwidehebrew;05B6 -seharmenian;057D -sehiragana;305B -sekatakana;30BB -sekatakanahalfwidth;FF7E -semicolon;003B -semicolonarabic;061B -semicolonmonospace;FF1B -semicolonsmall;FE54 -semivoicedmarkkana;309C -semivoicedmarkkanahalfwidth;FF9F -sentisquare;3322 -sentosquare;3323 -seven;0037 -sevenarabic;0667 -sevenbengali;09ED -sevencircle;2466 -sevencircleinversesansserif;2790 -sevendeva;096D -seveneighths;215E -sevengujarati;0AED -sevengurmukhi;0A6D -sevenhackarabic;0667 -sevenhangzhou;3027 -sevenideographicparen;3226 -seveninferior;2087 -sevenmonospace;FF17 -sevenoldstyle;F737 -sevenparen;247A -sevenperiod;248E -sevenpersian;06F7 -sevenroman;2176 -sevensuperior;2077 -seventeencircle;2470 -seventeenparen;2484 
-seventeenperiod;2498 -seventhai;0E57 -sfthyphen;00AD -shaarmenian;0577 -shabengali;09B6 -shacyrillic;0448 -shaddaarabic;0651 -shaddadammaarabic;FC61 -shaddadammatanarabic;FC5E -shaddafathaarabic;FC60 -shaddafathatanarabic;0651 064B -shaddakasraarabic;FC62 -shaddakasratanarabic;FC5F -shade;2592 -shadedark;2593 -shadelight;2591 -shademedium;2592 -shadeva;0936 -shagujarati;0AB6 -shagurmukhi;0A36 -shalshelethebrew;0593 -shbopomofo;3115 -shchacyrillic;0449 -sheenarabic;0634 -sheenfinalarabic;FEB6 -sheeninitialarabic;FEB7 -sheenmedialarabic;FEB8 -sheicoptic;03E3 -sheqel;20AA -sheqelhebrew;20AA -sheva;05B0 -sheva115;05B0 -sheva15;05B0 -sheva22;05B0 -sheva2e;05B0 -shevahebrew;05B0 -shevanarrowhebrew;05B0 -shevaquarterhebrew;05B0 -shevawidehebrew;05B0 -shhacyrillic;04BB -shimacoptic;03ED -shin;05E9 -shindagesh;FB49 -shindageshhebrew;FB49 -shindageshshindot;FB2C -shindageshshindothebrew;FB2C -shindageshsindot;FB2D -shindageshsindothebrew;FB2D -shindothebrew;05C1 -shinhebrew;05E9 -shinshindot;FB2A -shinshindothebrew;FB2A -shinsindot;FB2B -shinsindothebrew;FB2B -shook;0282 -sigma;03C3 -sigma1;03C2 -sigmafinal;03C2 -sigmalunatesymbolgreek;03F2 -sihiragana;3057 -sikatakana;30B7 -sikatakanahalfwidth;FF7C -siluqhebrew;05BD -siluqlefthebrew;05BD -similar;223C -sindothebrew;05C2 -siosacirclekorean;3274 -siosaparenkorean;3214 -sioscieuckorean;317E -sioscirclekorean;3266 -sioskiyeokkorean;317A -sioskorean;3145 -siosnieunkorean;317B -siosparenkorean;3206 -siospieupkorean;317D -siostikeutkorean;317C -six;0036 -sixarabic;0666 -sixbengali;09EC -sixcircle;2465 -sixcircleinversesansserif;278F -sixdeva;096C -sixgujarati;0AEC -sixgurmukhi;0A6C -sixhackarabic;0666 -sixhangzhou;3026 -sixideographicparen;3225 -sixinferior;2086 -sixmonospace;FF16 -sixoldstyle;F736 -sixparen;2479 -sixperiod;248D -sixpersian;06F6 -sixroman;2175 -sixsuperior;2076 -sixteencircle;246F -sixteencurrencydenominatorbengali;09F9 -sixteenparen;2483 -sixteenperiod;2497 -sixthai;0E56 -slash;002F -slashmonospace;FF0F 
-slong;017F -slongdotaccent;1E9B -smileface;263A -smonospace;FF53 -sofpasuqhebrew;05C3 -softhyphen;00AD -softsigncyrillic;044C -sohiragana;305D -sokatakana;30BD -sokatakanahalfwidth;FF7F -soliduslongoverlaycmb;0338 -solidusshortoverlaycmb;0337 -sorusithai;0E29 -sosalathai;0E28 -sosothai;0E0B -sosuathai;0E2A -space;0020 -spacehackarabic;0020 -spade;2660 -spadesuitblack;2660 -spadesuitwhite;2664 -sparen;24AE -squarebelowcmb;033B -squarecc;33C4 -squarecm;339D -squarediagonalcrosshatchfill;25A9 -squarehorizontalfill;25A4 -squarekg;338F -squarekm;339E -squarekmcapital;33CE -squareln;33D1 -squarelog;33D2 -squaremg;338E -squaremil;33D5 -squaremm;339C -squaremsquared;33A1 -squareorthogonalcrosshatchfill;25A6 -squareupperlefttolowerrightfill;25A7 -squareupperrighttolowerleftfill;25A8 -squareverticalfill;25A5 -squarewhitewithsmallblack;25A3 -srsquare;33DB -ssabengali;09B7 -ssadeva;0937 -ssagujarati;0AB7 -ssangcieuckorean;3149 -ssanghieuhkorean;3185 -ssangieungkorean;3180 -ssangkiyeokkorean;3132 -ssangnieunkorean;3165 -ssangpieupkorean;3143 -ssangsioskorean;3146 -ssangtikeutkorean;3138 -ssuperior;F6F2 -sterling;00A3 -sterlingmonospace;FFE1 -strokelongoverlaycmb;0336 -strokeshortoverlaycmb;0335 -subset;2282 -subsetnotequal;228A -subsetorequal;2286 -succeeds;227B -suchthat;220B -suhiragana;3059 -sukatakana;30B9 -sukatakanahalfwidth;FF7D -sukunarabic;0652 -summation;2211 -sun;263C -superset;2283 -supersetnotequal;228B -supersetorequal;2287 -svsquare;33DC -syouwaerasquare;337C -t;0074 -tabengali;09A4 -tackdown;22A4 -tackleft;22A3 -tadeva;0924 -tagujarati;0AA4 -tagurmukhi;0A24 -taharabic;0637 -tahfinalarabic;FEC2 -tahinitialarabic;FEC3 -tahiragana;305F -tahmedialarabic;FEC4 -taisyouerasquare;337D -takatakana;30BF -takatakanahalfwidth;FF80 -tatweelarabic;0640 -tau;03C4 -tav;05EA -tavdages;FB4A -tavdagesh;FB4A -tavdageshhebrew;FB4A -tavhebrew;05EA -tbar;0167 -tbopomofo;310A -tcaron;0165 -tccurl;02A8 -tcedilla;0163 -tcheharabic;0686 -tchehfinalarabic;FB7B -tchehinitialarabic;FB7C 
-tchehmedialarabic;FB7D -tchehmeeminitialarabic;FB7C FEE4 -tcircle;24E3 -tcircumflexbelow;1E71 -tcommaaccent;0163 -tdieresis;1E97 -tdotaccent;1E6B -tdotbelow;1E6D -tecyrillic;0442 -tedescendercyrillic;04AD -teharabic;062A -tehfinalarabic;FE96 -tehhahinitialarabic;FCA2 -tehhahisolatedarabic;FC0C -tehinitialarabic;FE97 -tehiragana;3066 -tehjeeminitialarabic;FCA1 -tehjeemisolatedarabic;FC0B -tehmarbutaarabic;0629 -tehmarbutafinalarabic;FE94 -tehmedialarabic;FE98 -tehmeeminitialarabic;FCA4 -tehmeemisolatedarabic;FC0E -tehnoonfinalarabic;FC73 -tekatakana;30C6 -tekatakanahalfwidth;FF83 -telephone;2121 -telephoneblack;260E -telishagedolahebrew;05A0 -telishaqetanahebrew;05A9 -tencircle;2469 -tenideographicparen;3229 -tenparen;247D -tenperiod;2491 -tenroman;2179 -tesh;02A7 -tet;05D8 -tetdagesh;FB38 -tetdageshhebrew;FB38 -tethebrew;05D8 -tetsecyrillic;04B5 -tevirhebrew;059B -tevirlefthebrew;059B -thabengali;09A5 -thadeva;0925 -thagujarati;0AA5 -thagurmukhi;0A25 -thalarabic;0630 -thalfinalarabic;FEAC -thanthakhatlowleftthai;F898 -thanthakhatlowrightthai;F897 -thanthakhatthai;0E4C -thanthakhatupperleftthai;F896 -theharabic;062B -thehfinalarabic;FE9A -thehinitialarabic;FE9B -thehmedialarabic;FE9C -thereexists;2203 -therefore;2234 -theta;03B8 -theta1;03D1 -thetasymbolgreek;03D1 -thieuthacirclekorean;3279 -thieuthaparenkorean;3219 -thieuthcirclekorean;326B -thieuthkorean;314C -thieuthparenkorean;320B -thirteencircle;246C -thirteenparen;2480 -thirteenperiod;2494 -thonangmonthothai;0E11 -thook;01AD -thophuthaothai;0E12 -thorn;00FE -thothahanthai;0E17 -thothanthai;0E10 -thothongthai;0E18 -thothungthai;0E16 -thousandcyrillic;0482 -thousandsseparatorarabic;066C -thousandsseparatorpersian;066C -three;0033 -threearabic;0663 -threebengali;09E9 -threecircle;2462 -threecircleinversesansserif;278C -threedeva;0969 -threeeighths;215C -threegujarati;0AE9 -threegurmukhi;0A69 -threehackarabic;0663 -threehangzhou;3023 -threeideographicparen;3222 -threeinferior;2083 -threemonospace;FF13 
-threenumeratorbengali;09F6 -threeoldstyle;F733 -threeparen;2476 -threeperiod;248A -threepersian;06F3 -threequarters;00BE -threequartersemdash;F6DE -threeroman;2172 -threesuperior;00B3 -threethai;0E53 -thzsquare;3394 -tihiragana;3061 -tikatakana;30C1 -tikatakanahalfwidth;FF81 -tikeutacirclekorean;3270 -tikeutaparenkorean;3210 -tikeutcirclekorean;3262 -tikeutkorean;3137 -tikeutparenkorean;3202 -tilde;02DC -tildebelowcmb;0330 -tildecmb;0303 -tildecomb;0303 -tildedoublecmb;0360 -tildeoperator;223C -tildeoverlaycmb;0334 -tildeverticalcmb;033E -timescircle;2297 -tipehahebrew;0596 -tipehalefthebrew;0596 -tippigurmukhi;0A70 -titlocyrilliccmb;0483 -tiwnarmenian;057F -tlinebelow;1E6F -tmonospace;FF54 -toarmenian;0569 -tohiragana;3068 -tokatakana;30C8 -tokatakanahalfwidth;FF84 -tonebarextrahighmod;02E5 -tonebarextralowmod;02E9 -tonebarhighmod;02E6 -tonebarlowmod;02E8 -tonebarmidmod;02E7 -tonefive;01BD -tonesix;0185 -tonetwo;01A8 -tonos;0384 -tonsquare;3327 -topatakthai;0E0F -tortoiseshellbracketleft;3014 -tortoiseshellbracketleftsmall;FE5D -tortoiseshellbracketleftvertical;FE39 -tortoiseshellbracketright;3015 -tortoiseshellbracketrightsmall;FE5E -tortoiseshellbracketrightvertical;FE3A -totaothai;0E15 -tpalatalhook;01AB -tparen;24AF -trademark;2122 -trademarksans;F8EA -trademarkserif;F6DB -tretroflexhook;0288 -triagdn;25BC -triaglf;25C4 -triagrt;25BA -triagup;25B2 -ts;02A6 -tsadi;05E6 -tsadidagesh;FB46 -tsadidageshhebrew;FB46 -tsadihebrew;05E6 -tsecyrillic;0446 -tsere;05B5 -tsere12;05B5 -tsere1e;05B5 -tsere2b;05B5 -tserehebrew;05B5 -tserenarrowhebrew;05B5 -tserequarterhebrew;05B5 -tserewidehebrew;05B5 -tshecyrillic;045B -tsuperior;F6F3 -ttabengali;099F -ttadeva;091F -ttagujarati;0A9F -ttagurmukhi;0A1F -tteharabic;0679 -ttehfinalarabic;FB67 -ttehinitialarabic;FB68 -ttehmedialarabic;FB69 -tthabengali;09A0 -tthadeva;0920 -tthagujarati;0AA0 -tthagurmukhi;0A20 -tturned;0287 -tuhiragana;3064 -tukatakana;30C4 -tukatakanahalfwidth;FF82 -tusmallhiragana;3063 -tusmallkatakana;30C3 
-tusmallkatakanahalfwidth;FF6F -twelvecircle;246B -twelveparen;247F -twelveperiod;2493 -twelveroman;217B -twentycircle;2473 -twentyhangzhou;5344 -twentyparen;2487 -twentyperiod;249B -two;0032 -twoarabic;0662 -twobengali;09E8 -twocircle;2461 -twocircleinversesansserif;278B -twodeva;0968 -twodotenleader;2025 -twodotleader;2025 -twodotleadervertical;FE30 -twogujarati;0AE8 -twogurmukhi;0A68 -twohackarabic;0662 -twohangzhou;3022 -twoideographicparen;3221 -twoinferior;2082 -twomonospace;FF12 -twonumeratorbengali;09F5 -twooldstyle;F732 -twoparen;2475 -twoperiod;2489 -twopersian;06F2 -tworoman;2171 -twostroke;01BB -twosuperior;00B2 -twothai;0E52 -twothirds;2154 -u;0075 -uacute;00FA -ubar;0289 -ubengali;0989 -ubopomofo;3128 -ubreve;016D -ucaron;01D4 -ucircle;24E4 -ucircumflex;00FB -ucircumflexbelow;1E77 -ucyrillic;0443 -udattadeva;0951 -udblacute;0171 -udblgrave;0215 -udeva;0909 -udieresis;00FC -udieresisacute;01D8 -udieresisbelow;1E73 -udieresiscaron;01DA -udieresiscyrillic;04F1 -udieresisgrave;01DC -udieresismacron;01D6 -udotbelow;1EE5 -ugrave;00F9 -ugujarati;0A89 -ugurmukhi;0A09 -uhiragana;3046 -uhookabove;1EE7 -uhorn;01B0 -uhornacute;1EE9 -uhorndotbelow;1EF1 -uhorngrave;1EEB -uhornhookabove;1EED -uhorntilde;1EEF -uhungarumlaut;0171 -uhungarumlautcyrillic;04F3 -uinvertedbreve;0217 -ukatakana;30A6 -ukatakanahalfwidth;FF73 -ukcyrillic;0479 -ukorean;315C -umacron;016B -umacroncyrillic;04EF -umacrondieresis;1E7B -umatragurmukhi;0A41 -umonospace;FF55 -underscore;005F -underscoredbl;2017 -underscoremonospace;FF3F -underscorevertical;FE33 -underscorewavy;FE4F -union;222A -universal;2200 -uogonek;0173 -uparen;24B0 -upblock;2580 -upperdothebrew;05C4 -upsilon;03C5 -upsilondieresis;03CB -upsilondieresistonos;03B0 -upsilonlatin;028A -upsilontonos;03CD -uptackbelowcmb;031D -uptackmod;02D4 -uragurmukhi;0A73 -uring;016F -ushortcyrillic;045E -usmallhiragana;3045 -usmallkatakana;30A5 -usmallkatakanahalfwidth;FF69 -ustraightcyrillic;04AF -ustraightstrokecyrillic;04B1 -utilde;0169 
-utildeacute;1E79 -utildebelow;1E75 -uubengali;098A -uudeva;090A -uugujarati;0A8A -uugurmukhi;0A0A -uumatragurmukhi;0A42 -uuvowelsignbengali;09C2 -uuvowelsigndeva;0942 -uuvowelsigngujarati;0AC2 -uvowelsignbengali;09C1 -uvowelsigndeva;0941 -uvowelsigngujarati;0AC1 -v;0076 -vadeva;0935 -vagujarati;0AB5 -vagurmukhi;0A35 -vakatakana;30F7 -vav;05D5 -vavdagesh;FB35 -vavdagesh65;FB35 -vavdageshhebrew;FB35 -vavhebrew;05D5 -vavholam;FB4B -vavholamhebrew;FB4B -vavvavhebrew;05F0 -vavyodhebrew;05F1 -vcircle;24E5 -vdotbelow;1E7F -vecyrillic;0432 -veharabic;06A4 -vehfinalarabic;FB6B -vehinitialarabic;FB6C -vehmedialarabic;FB6D -vekatakana;30F9 -venus;2640 -verticalbar;007C -verticallineabovecmb;030D -verticallinebelowcmb;0329 -verticallinelowmod;02CC -verticallinemod;02C8 -vewarmenian;057E -vhook;028B -vikatakana;30F8 -viramabengali;09CD -viramadeva;094D -viramagujarati;0ACD -visargabengali;0983 -visargadeva;0903 -visargagujarati;0A83 -vmonospace;FF56 -voarmenian;0578 -voicediterationhiragana;309E -voicediterationkatakana;30FE -voicedmarkkana;309B -voicedmarkkanahalfwidth;FF9E -vokatakana;30FA -vparen;24B1 -vtilde;1E7D -vturned;028C -vuhiragana;3094 -vukatakana;30F4 -w;0077 -wacute;1E83 -waekorean;3159 -wahiragana;308F -wakatakana;30EF -wakatakanahalfwidth;FF9C -wakorean;3158 -wasmallhiragana;308E -wasmallkatakana;30EE -wattosquare;3357 -wavedash;301C -wavyunderscorevertical;FE34 -wawarabic;0648 -wawfinalarabic;FEEE -wawhamzaabovearabic;0624 -wawhamzaabovefinalarabic;FE86 -wbsquare;33DD -wcircle;24E6 -wcircumflex;0175 -wdieresis;1E85 -wdotaccent;1E87 -wdotbelow;1E89 -wehiragana;3091 -weierstrass;2118 -wekatakana;30F1 -wekorean;315E -weokorean;315D -wgrave;1E81 -whitebullet;25E6 -whitecircle;25CB -whitecircleinverse;25D9 -whitecornerbracketleft;300E -whitecornerbracketleftvertical;FE43 -whitecornerbracketright;300F -whitecornerbracketrightvertical;FE44 -whitediamond;25C7 -whitediamondcontainingblacksmalldiamond;25C8 -whitedownpointingsmalltriangle;25BF 
-whitedownpointingtriangle;25BD -whiteleftpointingsmalltriangle;25C3 -whiteleftpointingtriangle;25C1 -whitelenticularbracketleft;3016 -whitelenticularbracketright;3017 -whiterightpointingsmalltriangle;25B9 -whiterightpointingtriangle;25B7 -whitesmallsquare;25AB -whitesmilingface;263A -whitesquare;25A1 -whitestar;2606 -whitetelephone;260F -whitetortoiseshellbracketleft;3018 -whitetortoiseshellbracketright;3019 -whiteuppointingsmalltriangle;25B5 -whiteuppointingtriangle;25B3 -wihiragana;3090 -wikatakana;30F0 -wikorean;315F -wmonospace;FF57 -wohiragana;3092 -wokatakana;30F2 -wokatakanahalfwidth;FF66 -won;20A9 -wonmonospace;FFE6 -wowaenthai;0E27 -wparen;24B2 -wring;1E98 -wsuperior;02B7 -wturned;028D -wynn;01BF -x;0078 -xabovecmb;033D -xbopomofo;3112 -xcircle;24E7 -xdieresis;1E8D -xdotaccent;1E8B -xeharmenian;056D -xi;03BE -xmonospace;FF58 -xparen;24B3 -xsuperior;02E3 -y;0079 -yaadosquare;334E -yabengali;09AF -yacute;00FD -yadeva;092F -yaekorean;3152 -yagujarati;0AAF -yagurmukhi;0A2F -yahiragana;3084 -yakatakana;30E4 -yakatakanahalfwidth;FF94 -yakorean;3151 -yamakkanthai;0E4E -yasmallhiragana;3083 -yasmallkatakana;30E3 -yasmallkatakanahalfwidth;FF6C -yatcyrillic;0463 -ycircle;24E8 -ycircumflex;0177 -ydieresis;00FF -ydotaccent;1E8F -ydotbelow;1EF5 -yeharabic;064A -yehbarreearabic;06D2 -yehbarreefinalarabic;FBAF -yehfinalarabic;FEF2 -yehhamzaabovearabic;0626 -yehhamzaabovefinalarabic;FE8A -yehhamzaaboveinitialarabic;FE8B -yehhamzaabovemedialarabic;FE8C -yehinitialarabic;FEF3 -yehmedialarabic;FEF4 -yehmeeminitialarabic;FCDD -yehmeemisolatedarabic;FC58 -yehnoonfinalarabic;FC94 -yehthreedotsbelowarabic;06D1 -yekorean;3156 -yen;00A5 -yenmonospace;FFE5 -yeokorean;3155 -yeorinhieuhkorean;3186 -yerahbenyomohebrew;05AA -yerahbenyomolefthebrew;05AA -yericyrillic;044B -yerudieresiscyrillic;04F9 -yesieungkorean;3181 -yesieungpansioskorean;3183 -yesieungsioskorean;3182 -yetivhebrew;059A -ygrave;1EF3 -yhook;01B4 -yhookabove;1EF7 -yiarmenian;0575 -yicyrillic;0457 -yikorean;3162 
-yinyang;262F -yiwnarmenian;0582 -ymonospace;FF59 -yod;05D9 -yoddagesh;FB39 -yoddageshhebrew;FB39 -yodhebrew;05D9 -yodyodhebrew;05F2 -yodyodpatahhebrew;FB1F -yohiragana;3088 -yoikorean;3189 -yokatakana;30E8 -yokatakanahalfwidth;FF96 -yokorean;315B -yosmallhiragana;3087 -yosmallkatakana;30E7 -yosmallkatakanahalfwidth;FF6E -yotgreek;03F3 -yoyaekorean;3188 -yoyakorean;3187 -yoyakthai;0E22 -yoyingthai;0E0D -yparen;24B4 -ypogegrammeni;037A -ypogegrammenigreekcmb;0345 -yr;01A6 -yring;1E99 -ysuperior;02B8 -ytilde;1EF9 -yturned;028E -yuhiragana;3086 -yuikorean;318C -yukatakana;30E6 -yukatakanahalfwidth;FF95 -yukorean;3160 -yusbigcyrillic;046B -yusbigiotifiedcyrillic;046D -yuslittlecyrillic;0467 -yuslittleiotifiedcyrillic;0469 -yusmallhiragana;3085 -yusmallkatakana;30E5 -yusmallkatakanahalfwidth;FF6D -yuyekorean;318B -yuyeokorean;318A -yyabengali;09DF -yyadeva;095F -z;007A -zaarmenian;0566 -zacute;017A -zadeva;095B -zagurmukhi;0A5B -zaharabic;0638 -zahfinalarabic;FEC6 -zahinitialarabic;FEC7 -zahiragana;3056 -zahmedialarabic;FEC8 -zainarabic;0632 -zainfinalarabic;FEB0 -zakatakana;30B6 -zaqefgadolhebrew;0595 -zaqefqatanhebrew;0594 -zarqahebrew;0598 -zayin;05D6 -zayindagesh;FB36 -zayindageshhebrew;FB36 -zayinhebrew;05D6 -zbopomofo;3117 -zcaron;017E -zcircle;24E9 -zcircumflex;1E91 -zcurl;0291 -zdot;017C -zdotaccent;017C -zdotbelow;1E93 -zecyrillic;0437 -zedescendercyrillic;0499 -zedieresiscyrillic;04DF -zehiragana;305C -zekatakana;30BC -zero;0030 -zeroarabic;0660 -zerobengali;09E6 -zerodeva;0966 -zerogujarati;0AE6 -zerogurmukhi;0A66 -zerohackarabic;0660 -zeroinferior;2080 -zeromonospace;FF10 -zerooldstyle;F730 -zeropersian;06F0 -zerosuperior;2070 -zerothai;0E50 -zerowidthjoiner;FEFF -zerowidthnonjoiner;200C -zerowidthspace;200B -zeta;03B6 -zhbopomofo;3113 -zhearmenian;056A -zhebrevecyrillic;04C2 -zhecyrillic;0436 -zhedescendercyrillic;0497 -zhedieresiscyrillic;04DD -zihiragana;3058 -zikatakana;30B8 -zinorhebrew;05AE -zlinebelow;1E95 -zmonospace;FF5A -zohiragana;305E 
-zokatakana;30BE -zparen;24B5 -zretroflexhook;0290 -zstroke;01B6 -zuhiragana;305A -zukatakana;30BA -# END -""" - - -_aglfnText = """\ -# ----------------------------------------------------------- -# Copyright 2002-2019 Adobe (http://www.adobe.com/). -# -# Redistribution and use in source and binary forms, with or -# without modification, are permitted provided that the -# following conditions are met: -# -# Redistributions of source code must retain the above -# copyright notice, this list of conditions and the following -# disclaimer. -# -# Redistributions in binary form must reproduce the above -# copyright notice, this list of conditions and the following -# disclaimer in the documentation and/or other materials -# provided with the distribution. -# -# Neither the name of Adobe nor the names of its contributors -# may be used to endorse or promote products derived from this -# software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND -# CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, -# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF -# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, -# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT -# NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) -# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN -# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR -# OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
-# ----------------------------------------------------------- -# Name: Adobe Glyph List For New Fonts -# Table version: 1.7 -# Date: November 6, 2008 -# URL: https://github.com/adobe-type-tools/agl-aglfn -# -# Description: -# -# AGLFN (Adobe Glyph List For New Fonts) provides a list of base glyph -# names that are recommended for new fonts, which are compatible with -# the AGL (Adobe Glyph List) Specification, and which should be used -# as described in Section 6 of that document. AGLFN comprises the set -# of glyph names from AGL that map via the AGL Specification rules to -# the semantically correct UV (Unicode Value). For example, "Asmall" -# is omitted because AGL maps this glyph name to the PUA (Private Use -# Area) value U+F761, rather than to the UV that maps from the glyph -# name "A." Also omitted is "ffi," because AGL maps this to the -# Alphabetic Presentation Forms value U+FB03, rather than decomposing -# it into the following sequence of three UVs: U+0066, U+0066, and -# U+0069. The name "arrowvertex" has been omitted because this glyph -# now has a real UV, and AGL is now incorrect in mapping it to the PUA -# value U+F8E6. If you do not find an appropriate name for your glyph -# in this list, then please refer to Section 6 of the AGL -# Specification. -# -# Format: three semicolon-delimited fields: -# (1) Standard UV or CUS UV--four uppercase hexadecimal digits -# (2) Glyph name--upper/lowercase letters and digits -# (3) Character names: Unicode character names for standard UVs, and -# descriptive names for CUS UVs--uppercase letters, hyphen, and -# space -# -# The records are sorted by glyph name in increasing ASCII order, -# entries with the same glyph name are sorted in decreasing priority -# order, the UVs and Unicode character names are provided for -# convenience, lines starting with "#" are comments, and blank lines -# should be ignored. 
-# -# Revision History: -# -# 1.7 [6 November 2008] -# - Reverted to the original 1.4 and earlier mappings for Delta, -# Omega, and mu. -# - Removed mappings for "afii" names. These should now be assigned -# "uni" names. -# - Removed mappings for "commaaccent" names. These should now be -# assigned "uni" names. -# -# 1.6 [30 January 2006] -# - Completed work intended in 1.5. -# -# 1.5 [23 November 2005] -# - Removed duplicated block at end of file. -# - Changed mappings: -# 2206;Delta;INCREMENT changed to 0394;Delta;GREEK CAPITAL LETTER DELTA -# 2126;Omega;OHM SIGN changed to 03A9;Omega;GREEK CAPITAL LETTER OMEGA -# 03BC;mu;MICRO SIGN changed to 03BC;mu;GREEK SMALL LETTER MU -# - Corrected statement above about why "ffi" is omitted. -# -# 1.4 [24 September 2003] -# - Changed version to 1.4, to avoid confusion with the AGL 1.3. -# - Fixed spelling errors in the header. -# - Fully removed "arrowvertex," as it is mapped only to a PUA Unicode -# value in some fonts. -# -# 1.1 [17 April 2003] -# - Renamed [Tt]cedilla back to [Tt]commaaccent. -# -# 1.0 [31 January 2003] -# - Original version. 
-# - Derived from the AGLv1.2 by: -# removing the PUA area codes; -# removing duplicate Unicode mappings; and -# renaming "tcommaaccent" to "tcedilla" and "Tcommaaccent" to "Tcedilla" -# -0041;A;LATIN CAPITAL LETTER A -00C6;AE;LATIN CAPITAL LETTER AE -01FC;AEacute;LATIN CAPITAL LETTER AE WITH ACUTE -00C1;Aacute;LATIN CAPITAL LETTER A WITH ACUTE -0102;Abreve;LATIN CAPITAL LETTER A WITH BREVE -00C2;Acircumflex;LATIN CAPITAL LETTER A WITH CIRCUMFLEX -00C4;Adieresis;LATIN CAPITAL LETTER A WITH DIAERESIS -00C0;Agrave;LATIN CAPITAL LETTER A WITH GRAVE -0391;Alpha;GREEK CAPITAL LETTER ALPHA -0386;Alphatonos;GREEK CAPITAL LETTER ALPHA WITH TONOS -0100;Amacron;LATIN CAPITAL LETTER A WITH MACRON -0104;Aogonek;LATIN CAPITAL LETTER A WITH OGONEK -00C5;Aring;LATIN CAPITAL LETTER A WITH RING ABOVE -01FA;Aringacute;LATIN CAPITAL LETTER A WITH RING ABOVE AND ACUTE -00C3;Atilde;LATIN CAPITAL LETTER A WITH TILDE -0042;B;LATIN CAPITAL LETTER B -0392;Beta;GREEK CAPITAL LETTER BETA -0043;C;LATIN CAPITAL LETTER C -0106;Cacute;LATIN CAPITAL LETTER C WITH ACUTE -010C;Ccaron;LATIN CAPITAL LETTER C WITH CARON -00C7;Ccedilla;LATIN CAPITAL LETTER C WITH CEDILLA -0108;Ccircumflex;LATIN CAPITAL LETTER C WITH CIRCUMFLEX -010A;Cdotaccent;LATIN CAPITAL LETTER C WITH DOT ABOVE -03A7;Chi;GREEK CAPITAL LETTER CHI -0044;D;LATIN CAPITAL LETTER D -010E;Dcaron;LATIN CAPITAL LETTER D WITH CARON -0110;Dcroat;LATIN CAPITAL LETTER D WITH STROKE -2206;Delta;INCREMENT -0045;E;LATIN CAPITAL LETTER E -00C9;Eacute;LATIN CAPITAL LETTER E WITH ACUTE -0114;Ebreve;LATIN CAPITAL LETTER E WITH BREVE -011A;Ecaron;LATIN CAPITAL LETTER E WITH CARON -00CA;Ecircumflex;LATIN CAPITAL LETTER E WITH CIRCUMFLEX -00CB;Edieresis;LATIN CAPITAL LETTER E WITH DIAERESIS -0116;Edotaccent;LATIN CAPITAL LETTER E WITH DOT ABOVE -00C8;Egrave;LATIN CAPITAL LETTER E WITH GRAVE -0112;Emacron;LATIN CAPITAL LETTER E WITH MACRON -014A;Eng;LATIN CAPITAL LETTER ENG -0118;Eogonek;LATIN CAPITAL LETTER E WITH OGONEK -0395;Epsilon;GREEK CAPITAL LETTER 
EPSILON -0388;Epsilontonos;GREEK CAPITAL LETTER EPSILON WITH TONOS -0397;Eta;GREEK CAPITAL LETTER ETA -0389;Etatonos;GREEK CAPITAL LETTER ETA WITH TONOS -00D0;Eth;LATIN CAPITAL LETTER ETH -20AC;Euro;EURO SIGN -0046;F;LATIN CAPITAL LETTER F -0047;G;LATIN CAPITAL LETTER G -0393;Gamma;GREEK CAPITAL LETTER GAMMA -011E;Gbreve;LATIN CAPITAL LETTER G WITH BREVE -01E6;Gcaron;LATIN CAPITAL LETTER G WITH CARON -011C;Gcircumflex;LATIN CAPITAL LETTER G WITH CIRCUMFLEX -0120;Gdotaccent;LATIN CAPITAL LETTER G WITH DOT ABOVE -0048;H;LATIN CAPITAL LETTER H -25CF;H18533;BLACK CIRCLE -25AA;H18543;BLACK SMALL SQUARE -25AB;H18551;WHITE SMALL SQUARE -25A1;H22073;WHITE SQUARE -0126;Hbar;LATIN CAPITAL LETTER H WITH STROKE -0124;Hcircumflex;LATIN CAPITAL LETTER H WITH CIRCUMFLEX -0049;I;LATIN CAPITAL LETTER I -0132;IJ;LATIN CAPITAL LIGATURE IJ -00CD;Iacute;LATIN CAPITAL LETTER I WITH ACUTE -012C;Ibreve;LATIN CAPITAL LETTER I WITH BREVE -00CE;Icircumflex;LATIN CAPITAL LETTER I WITH CIRCUMFLEX -00CF;Idieresis;LATIN CAPITAL LETTER I WITH DIAERESIS -0130;Idotaccent;LATIN CAPITAL LETTER I WITH DOT ABOVE -2111;Ifraktur;BLACK-LETTER CAPITAL I -00CC;Igrave;LATIN CAPITAL LETTER I WITH GRAVE -012A;Imacron;LATIN CAPITAL LETTER I WITH MACRON -012E;Iogonek;LATIN CAPITAL LETTER I WITH OGONEK -0399;Iota;GREEK CAPITAL LETTER IOTA -03AA;Iotadieresis;GREEK CAPITAL LETTER IOTA WITH DIALYTIKA -038A;Iotatonos;GREEK CAPITAL LETTER IOTA WITH TONOS -0128;Itilde;LATIN CAPITAL LETTER I WITH TILDE -004A;J;LATIN CAPITAL LETTER J -0134;Jcircumflex;LATIN CAPITAL LETTER J WITH CIRCUMFLEX -004B;K;LATIN CAPITAL LETTER K -039A;Kappa;GREEK CAPITAL LETTER KAPPA -004C;L;LATIN CAPITAL LETTER L -0139;Lacute;LATIN CAPITAL LETTER L WITH ACUTE -039B;Lambda;GREEK CAPITAL LETTER LAMDA -013D;Lcaron;LATIN CAPITAL LETTER L WITH CARON -013F;Ldot;LATIN CAPITAL LETTER L WITH MIDDLE DOT -0141;Lslash;LATIN CAPITAL LETTER L WITH STROKE -004D;M;LATIN CAPITAL LETTER M -039C;Mu;GREEK CAPITAL LETTER MU -004E;N;LATIN CAPITAL LETTER N 
-0143;Nacute;LATIN CAPITAL LETTER N WITH ACUTE -0147;Ncaron;LATIN CAPITAL LETTER N WITH CARON -00D1;Ntilde;LATIN CAPITAL LETTER N WITH TILDE -039D;Nu;GREEK CAPITAL LETTER NU -004F;O;LATIN CAPITAL LETTER O -0152;OE;LATIN CAPITAL LIGATURE OE -00D3;Oacute;LATIN CAPITAL LETTER O WITH ACUTE -014E;Obreve;LATIN CAPITAL LETTER O WITH BREVE -00D4;Ocircumflex;LATIN CAPITAL LETTER O WITH CIRCUMFLEX -00D6;Odieresis;LATIN CAPITAL LETTER O WITH DIAERESIS -00D2;Ograve;LATIN CAPITAL LETTER O WITH GRAVE -01A0;Ohorn;LATIN CAPITAL LETTER O WITH HORN -0150;Ohungarumlaut;LATIN CAPITAL LETTER O WITH DOUBLE ACUTE -014C;Omacron;LATIN CAPITAL LETTER O WITH MACRON -2126;Omega;OHM SIGN -038F;Omegatonos;GREEK CAPITAL LETTER OMEGA WITH TONOS -039F;Omicron;GREEK CAPITAL LETTER OMICRON -038C;Omicrontonos;GREEK CAPITAL LETTER OMICRON WITH TONOS -00D8;Oslash;LATIN CAPITAL LETTER O WITH STROKE -01FE;Oslashacute;LATIN CAPITAL LETTER O WITH STROKE AND ACUTE -00D5;Otilde;LATIN CAPITAL LETTER O WITH TILDE -0050;P;LATIN CAPITAL LETTER P -03A6;Phi;GREEK CAPITAL LETTER PHI -03A0;Pi;GREEK CAPITAL LETTER PI -03A8;Psi;GREEK CAPITAL LETTER PSI -0051;Q;LATIN CAPITAL LETTER Q -0052;R;LATIN CAPITAL LETTER R -0154;Racute;LATIN CAPITAL LETTER R WITH ACUTE -0158;Rcaron;LATIN CAPITAL LETTER R WITH CARON -211C;Rfraktur;BLACK-LETTER CAPITAL R -03A1;Rho;GREEK CAPITAL LETTER RHO -0053;S;LATIN CAPITAL LETTER S -250C;SF010000;BOX DRAWINGS LIGHT DOWN AND RIGHT -2514;SF020000;BOX DRAWINGS LIGHT UP AND RIGHT -2510;SF030000;BOX DRAWINGS LIGHT DOWN AND LEFT -2518;SF040000;BOX DRAWINGS LIGHT UP AND LEFT -253C;SF050000;BOX DRAWINGS LIGHT VERTICAL AND HORIZONTAL -252C;SF060000;BOX DRAWINGS LIGHT DOWN AND HORIZONTAL -2534;SF070000;BOX DRAWINGS LIGHT UP AND HORIZONTAL -251C;SF080000;BOX DRAWINGS LIGHT VERTICAL AND RIGHT -2524;SF090000;BOX DRAWINGS LIGHT VERTICAL AND LEFT -2500;SF100000;BOX DRAWINGS LIGHT HORIZONTAL -2502;SF110000;BOX DRAWINGS LIGHT VERTICAL -2561;SF190000;BOX DRAWINGS VERTICAL SINGLE AND LEFT DOUBLE 
-2562;SF200000;BOX DRAWINGS VERTICAL DOUBLE AND LEFT SINGLE -2556;SF210000;BOX DRAWINGS DOWN DOUBLE AND LEFT SINGLE -2555;SF220000;BOX DRAWINGS DOWN SINGLE AND LEFT DOUBLE -2563;SF230000;BOX DRAWINGS DOUBLE VERTICAL AND LEFT -2551;SF240000;BOX DRAWINGS DOUBLE VERTICAL -2557;SF250000;BOX DRAWINGS DOUBLE DOWN AND LEFT -255D;SF260000;BOX DRAWINGS DOUBLE UP AND LEFT -255C;SF270000;BOX DRAWINGS UP DOUBLE AND LEFT SINGLE -255B;SF280000;BOX DRAWINGS UP SINGLE AND LEFT DOUBLE -255E;SF360000;BOX DRAWINGS VERTICAL SINGLE AND RIGHT DOUBLE -255F;SF370000;BOX DRAWINGS VERTICAL DOUBLE AND RIGHT SINGLE -255A;SF380000;BOX DRAWINGS DOUBLE UP AND RIGHT -2554;SF390000;BOX DRAWINGS DOUBLE DOWN AND RIGHT -2569;SF400000;BOX DRAWINGS DOUBLE UP AND HORIZONTAL -2566;SF410000;BOX DRAWINGS DOUBLE DOWN AND HORIZONTAL -2560;SF420000;BOX DRAWINGS DOUBLE VERTICAL AND RIGHT -2550;SF430000;BOX DRAWINGS DOUBLE HORIZONTAL -256C;SF440000;BOX DRAWINGS DOUBLE VERTICAL AND HORIZONTAL -2567;SF450000;BOX DRAWINGS UP SINGLE AND HORIZONTAL DOUBLE -2568;SF460000;BOX DRAWINGS UP DOUBLE AND HORIZONTAL SINGLE -2564;SF470000;BOX DRAWINGS DOWN SINGLE AND HORIZONTAL DOUBLE -2565;SF480000;BOX DRAWINGS DOWN DOUBLE AND HORIZONTAL SINGLE -2559;SF490000;BOX DRAWINGS UP DOUBLE AND RIGHT SINGLE -2558;SF500000;BOX DRAWINGS UP SINGLE AND RIGHT DOUBLE -2552;SF510000;BOX DRAWINGS DOWN SINGLE AND RIGHT DOUBLE -2553;SF520000;BOX DRAWINGS DOWN DOUBLE AND RIGHT SINGLE -256B;SF530000;BOX DRAWINGS VERTICAL DOUBLE AND HORIZONTAL SINGLE -256A;SF540000;BOX DRAWINGS VERTICAL SINGLE AND HORIZONTAL DOUBLE -015A;Sacute;LATIN CAPITAL LETTER S WITH ACUTE -0160;Scaron;LATIN CAPITAL LETTER S WITH CARON -015E;Scedilla;LATIN CAPITAL LETTER S WITH CEDILLA -015C;Scircumflex;LATIN CAPITAL LETTER S WITH CIRCUMFLEX -03A3;Sigma;GREEK CAPITAL LETTER SIGMA -0054;T;LATIN CAPITAL LETTER T -03A4;Tau;GREEK CAPITAL LETTER TAU -0166;Tbar;LATIN CAPITAL LETTER T WITH STROKE -0164;Tcaron;LATIN CAPITAL LETTER T WITH CARON -0398;Theta;GREEK CAPITAL LETTER THETA 
-00DE;Thorn;LATIN CAPITAL LETTER THORN -0055;U;LATIN CAPITAL LETTER U -00DA;Uacute;LATIN CAPITAL LETTER U WITH ACUTE -016C;Ubreve;LATIN CAPITAL LETTER U WITH BREVE -00DB;Ucircumflex;LATIN CAPITAL LETTER U WITH CIRCUMFLEX -00DC;Udieresis;LATIN CAPITAL LETTER U WITH DIAERESIS -00D9;Ugrave;LATIN CAPITAL LETTER U WITH GRAVE -01AF;Uhorn;LATIN CAPITAL LETTER U WITH HORN -0170;Uhungarumlaut;LATIN CAPITAL LETTER U WITH DOUBLE ACUTE -016A;Umacron;LATIN CAPITAL LETTER U WITH MACRON -0172;Uogonek;LATIN CAPITAL LETTER U WITH OGONEK -03A5;Upsilon;GREEK CAPITAL LETTER UPSILON -03D2;Upsilon1;GREEK UPSILON WITH HOOK SYMBOL -03AB;Upsilondieresis;GREEK CAPITAL LETTER UPSILON WITH DIALYTIKA -038E;Upsilontonos;GREEK CAPITAL LETTER UPSILON WITH TONOS -016E;Uring;LATIN CAPITAL LETTER U WITH RING ABOVE -0168;Utilde;LATIN CAPITAL LETTER U WITH TILDE -0056;V;LATIN CAPITAL LETTER V -0057;W;LATIN CAPITAL LETTER W -1E82;Wacute;LATIN CAPITAL LETTER W WITH ACUTE -0174;Wcircumflex;LATIN CAPITAL LETTER W WITH CIRCUMFLEX -1E84;Wdieresis;LATIN CAPITAL LETTER W WITH DIAERESIS -1E80;Wgrave;LATIN CAPITAL LETTER W WITH GRAVE -0058;X;LATIN CAPITAL LETTER X -039E;Xi;GREEK CAPITAL LETTER XI -0059;Y;LATIN CAPITAL LETTER Y -00DD;Yacute;LATIN CAPITAL LETTER Y WITH ACUTE -0176;Ycircumflex;LATIN CAPITAL LETTER Y WITH CIRCUMFLEX -0178;Ydieresis;LATIN CAPITAL LETTER Y WITH DIAERESIS -1EF2;Ygrave;LATIN CAPITAL LETTER Y WITH GRAVE -005A;Z;LATIN CAPITAL LETTER Z -0179;Zacute;LATIN CAPITAL LETTER Z WITH ACUTE -017D;Zcaron;LATIN CAPITAL LETTER Z WITH CARON -017B;Zdotaccent;LATIN CAPITAL LETTER Z WITH DOT ABOVE -0396;Zeta;GREEK CAPITAL LETTER ZETA -0061;a;LATIN SMALL LETTER A -00E1;aacute;LATIN SMALL LETTER A WITH ACUTE -0103;abreve;LATIN SMALL LETTER A WITH BREVE -00E2;acircumflex;LATIN SMALL LETTER A WITH CIRCUMFLEX -00B4;acute;ACUTE ACCENT -0301;acutecomb;COMBINING ACUTE ACCENT -00E4;adieresis;LATIN SMALL LETTER A WITH DIAERESIS -00E6;ae;LATIN SMALL LETTER AE -01FD;aeacute;LATIN SMALL LETTER AE WITH ACUTE 
-00E0;agrave;LATIN SMALL LETTER A WITH GRAVE -2135;aleph;ALEF SYMBOL -03B1;alpha;GREEK SMALL LETTER ALPHA -03AC;alphatonos;GREEK SMALL LETTER ALPHA WITH TONOS -0101;amacron;LATIN SMALL LETTER A WITH MACRON -0026;ampersand;AMPERSAND -2220;angle;ANGLE -2329;angleleft;LEFT-POINTING ANGLE BRACKET -232A;angleright;RIGHT-POINTING ANGLE BRACKET -0387;anoteleia;GREEK ANO TELEIA -0105;aogonek;LATIN SMALL LETTER A WITH OGONEK -2248;approxequal;ALMOST EQUAL TO -00E5;aring;LATIN SMALL LETTER A WITH RING ABOVE -01FB;aringacute;LATIN SMALL LETTER A WITH RING ABOVE AND ACUTE -2194;arrowboth;LEFT RIGHT ARROW -21D4;arrowdblboth;LEFT RIGHT DOUBLE ARROW -21D3;arrowdbldown;DOWNWARDS DOUBLE ARROW -21D0;arrowdblleft;LEFTWARDS DOUBLE ARROW -21D2;arrowdblright;RIGHTWARDS DOUBLE ARROW -21D1;arrowdblup;UPWARDS DOUBLE ARROW -2193;arrowdown;DOWNWARDS ARROW -2190;arrowleft;LEFTWARDS ARROW -2192;arrowright;RIGHTWARDS ARROW -2191;arrowup;UPWARDS ARROW -2195;arrowupdn;UP DOWN ARROW -21A8;arrowupdnbse;UP DOWN ARROW WITH BASE -005E;asciicircum;CIRCUMFLEX ACCENT -007E;asciitilde;TILDE -002A;asterisk;ASTERISK -2217;asteriskmath;ASTERISK OPERATOR -0040;at;COMMERCIAL AT -00E3;atilde;LATIN SMALL LETTER A WITH TILDE -0062;b;LATIN SMALL LETTER B -005C;backslash;REVERSE SOLIDUS -007C;bar;VERTICAL LINE -03B2;beta;GREEK SMALL LETTER BETA -2588;block;FULL BLOCK -007B;braceleft;LEFT CURLY BRACKET -007D;braceright;RIGHT CURLY BRACKET -005B;bracketleft;LEFT SQUARE BRACKET -005D;bracketright;RIGHT SQUARE BRACKET -02D8;breve;BREVE -00A6;brokenbar;BROKEN BAR -2022;bullet;BULLET -0063;c;LATIN SMALL LETTER C -0107;cacute;LATIN SMALL LETTER C WITH ACUTE -02C7;caron;CARON -21B5;carriagereturn;DOWNWARDS ARROW WITH CORNER LEFTWARDS -010D;ccaron;LATIN SMALL LETTER C WITH CARON -00E7;ccedilla;LATIN SMALL LETTER C WITH CEDILLA -0109;ccircumflex;LATIN SMALL LETTER C WITH CIRCUMFLEX -010B;cdotaccent;LATIN SMALL LETTER C WITH DOT ABOVE -00B8;cedilla;CEDILLA -00A2;cent;CENT SIGN -03C7;chi;GREEK SMALL LETTER CHI 
-25CB;circle;WHITE CIRCLE -2297;circlemultiply;CIRCLED TIMES -2295;circleplus;CIRCLED PLUS -02C6;circumflex;MODIFIER LETTER CIRCUMFLEX ACCENT -2663;club;BLACK CLUB SUIT -003A;colon;COLON -20A1;colonmonetary;COLON SIGN -002C;comma;COMMA -2245;congruent;APPROXIMATELY EQUAL TO -00A9;copyright;COPYRIGHT SIGN -00A4;currency;CURRENCY SIGN -0064;d;LATIN SMALL LETTER D -2020;dagger;DAGGER -2021;daggerdbl;DOUBLE DAGGER -010F;dcaron;LATIN SMALL LETTER D WITH CARON -0111;dcroat;LATIN SMALL LETTER D WITH STROKE -00B0;degree;DEGREE SIGN -03B4;delta;GREEK SMALL LETTER DELTA -2666;diamond;BLACK DIAMOND SUIT -00A8;dieresis;DIAERESIS -0385;dieresistonos;GREEK DIALYTIKA TONOS -00F7;divide;DIVISION SIGN -2593;dkshade;DARK SHADE -2584;dnblock;LOWER HALF BLOCK -0024;dollar;DOLLAR SIGN -20AB;dong;DONG SIGN -02D9;dotaccent;DOT ABOVE -0323;dotbelowcomb;COMBINING DOT BELOW -0131;dotlessi;LATIN SMALL LETTER DOTLESS I -22C5;dotmath;DOT OPERATOR -0065;e;LATIN SMALL LETTER E -00E9;eacute;LATIN SMALL LETTER E WITH ACUTE -0115;ebreve;LATIN SMALL LETTER E WITH BREVE -011B;ecaron;LATIN SMALL LETTER E WITH CARON -00EA;ecircumflex;LATIN SMALL LETTER E WITH CIRCUMFLEX -00EB;edieresis;LATIN SMALL LETTER E WITH DIAERESIS -0117;edotaccent;LATIN SMALL LETTER E WITH DOT ABOVE -00E8;egrave;LATIN SMALL LETTER E WITH GRAVE -0038;eight;DIGIT EIGHT -2208;element;ELEMENT OF -2026;ellipsis;HORIZONTAL ELLIPSIS -0113;emacron;LATIN SMALL LETTER E WITH MACRON -2014;emdash;EM DASH -2205;emptyset;EMPTY SET -2013;endash;EN DASH -014B;eng;LATIN SMALL LETTER ENG -0119;eogonek;LATIN SMALL LETTER E WITH OGONEK -03B5;epsilon;GREEK SMALL LETTER EPSILON -03AD;epsilontonos;GREEK SMALL LETTER EPSILON WITH TONOS -003D;equal;EQUALS SIGN -2261;equivalence;IDENTICAL TO -212E;estimated;ESTIMATED SYMBOL -03B7;eta;GREEK SMALL LETTER ETA -03AE;etatonos;GREEK SMALL LETTER ETA WITH TONOS -00F0;eth;LATIN SMALL LETTER ETH -0021;exclam;EXCLAMATION MARK -203C;exclamdbl;DOUBLE EXCLAMATION MARK -00A1;exclamdown;INVERTED EXCLAMATION MARK 
-2203;existential;THERE EXISTS -0066;f;LATIN SMALL LETTER F -2640;female;FEMALE SIGN -2012;figuredash;FIGURE DASH -25A0;filledbox;BLACK SQUARE -25AC;filledrect;BLACK RECTANGLE -0035;five;DIGIT FIVE -215D;fiveeighths;VULGAR FRACTION FIVE EIGHTHS -0192;florin;LATIN SMALL LETTER F WITH HOOK -0034;four;DIGIT FOUR -2044;fraction;FRACTION SLASH -20A3;franc;FRENCH FRANC SIGN -0067;g;LATIN SMALL LETTER G -03B3;gamma;GREEK SMALL LETTER GAMMA -011F;gbreve;LATIN SMALL LETTER G WITH BREVE -01E7;gcaron;LATIN SMALL LETTER G WITH CARON -011D;gcircumflex;LATIN SMALL LETTER G WITH CIRCUMFLEX -0121;gdotaccent;LATIN SMALL LETTER G WITH DOT ABOVE -00DF;germandbls;LATIN SMALL LETTER SHARP S -2207;gradient;NABLA -0060;grave;GRAVE ACCENT -0300;gravecomb;COMBINING GRAVE ACCENT -003E;greater;GREATER-THAN SIGN -2265;greaterequal;GREATER-THAN OR EQUAL TO -00AB;guillemotleft;LEFT-POINTING DOUBLE ANGLE QUOTATION MARK -00BB;guillemotright;RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK -2039;guilsinglleft;SINGLE LEFT-POINTING ANGLE QUOTATION MARK -203A;guilsinglright;SINGLE RIGHT-POINTING ANGLE QUOTATION MARK -0068;h;LATIN SMALL LETTER H -0127;hbar;LATIN SMALL LETTER H WITH STROKE -0125;hcircumflex;LATIN SMALL LETTER H WITH CIRCUMFLEX -2665;heart;BLACK HEART SUIT -0309;hookabovecomb;COMBINING HOOK ABOVE -2302;house;HOUSE -02DD;hungarumlaut;DOUBLE ACUTE ACCENT -002D;hyphen;HYPHEN-MINUS -0069;i;LATIN SMALL LETTER I -00ED;iacute;LATIN SMALL LETTER I WITH ACUTE -012D;ibreve;LATIN SMALL LETTER I WITH BREVE -00EE;icircumflex;LATIN SMALL LETTER I WITH CIRCUMFLEX -00EF;idieresis;LATIN SMALL LETTER I WITH DIAERESIS -00EC;igrave;LATIN SMALL LETTER I WITH GRAVE -0133;ij;LATIN SMALL LIGATURE IJ -012B;imacron;LATIN SMALL LETTER I WITH MACRON -221E;infinity;INFINITY -222B;integral;INTEGRAL -2321;integralbt;BOTTOM HALF INTEGRAL -2320;integraltp;TOP HALF INTEGRAL -2229;intersection;INTERSECTION -25D8;invbullet;INVERSE BULLET -25D9;invcircle;INVERSE WHITE CIRCLE -263B;invsmileface;BLACK SMILING FACE 
-012F;iogonek;LATIN SMALL LETTER I WITH OGONEK -03B9;iota;GREEK SMALL LETTER IOTA -03CA;iotadieresis;GREEK SMALL LETTER IOTA WITH DIALYTIKA -0390;iotadieresistonos;GREEK SMALL LETTER IOTA WITH DIALYTIKA AND TONOS -03AF;iotatonos;GREEK SMALL LETTER IOTA WITH TONOS -0129;itilde;LATIN SMALL LETTER I WITH TILDE -006A;j;LATIN SMALL LETTER J -0135;jcircumflex;LATIN SMALL LETTER J WITH CIRCUMFLEX -006B;k;LATIN SMALL LETTER K -03BA;kappa;GREEK SMALL LETTER KAPPA -0138;kgreenlandic;LATIN SMALL LETTER KRA -006C;l;LATIN SMALL LETTER L -013A;lacute;LATIN SMALL LETTER L WITH ACUTE -03BB;lambda;GREEK SMALL LETTER LAMDA -013E;lcaron;LATIN SMALL LETTER L WITH CARON -0140;ldot;LATIN SMALL LETTER L WITH MIDDLE DOT -003C;less;LESS-THAN SIGN -2264;lessequal;LESS-THAN OR EQUAL TO -258C;lfblock;LEFT HALF BLOCK -20A4;lira;LIRA SIGN -2227;logicaland;LOGICAL AND -00AC;logicalnot;NOT SIGN -2228;logicalor;LOGICAL OR -017F;longs;LATIN SMALL LETTER LONG S -25CA;lozenge;LOZENGE -0142;lslash;LATIN SMALL LETTER L WITH STROKE -2591;ltshade;LIGHT SHADE -006D;m;LATIN SMALL LETTER M -00AF;macron;MACRON -2642;male;MALE SIGN -2212;minus;MINUS SIGN -2032;minute;PRIME -00B5;mu;MICRO SIGN -00D7;multiply;MULTIPLICATION SIGN -266A;musicalnote;EIGHTH NOTE -266B;musicalnotedbl;BEAMED EIGHTH NOTES -006E;n;LATIN SMALL LETTER N -0144;nacute;LATIN SMALL LETTER N WITH ACUTE -0149;napostrophe;LATIN SMALL LETTER N PRECEDED BY APOSTROPHE -0148;ncaron;LATIN SMALL LETTER N WITH CARON -0039;nine;DIGIT NINE -2209;notelement;NOT AN ELEMENT OF -2260;notequal;NOT EQUAL TO -2284;notsubset;NOT A SUBSET OF -00F1;ntilde;LATIN SMALL LETTER N WITH TILDE -03BD;nu;GREEK SMALL LETTER NU -0023;numbersign;NUMBER SIGN -006F;o;LATIN SMALL LETTER O -00F3;oacute;LATIN SMALL LETTER O WITH ACUTE -014F;obreve;LATIN SMALL LETTER O WITH BREVE -00F4;ocircumflex;LATIN SMALL LETTER O WITH CIRCUMFLEX -00F6;odieresis;LATIN SMALL LETTER O WITH DIAERESIS -0153;oe;LATIN SMALL LIGATURE OE -02DB;ogonek;OGONEK -00F2;ograve;LATIN SMALL LETTER O WITH GRAVE 
-01A1;ohorn;LATIN SMALL LETTER O WITH HORN -0151;ohungarumlaut;LATIN SMALL LETTER O WITH DOUBLE ACUTE -014D;omacron;LATIN SMALL LETTER O WITH MACRON -03C9;omega;GREEK SMALL LETTER OMEGA -03D6;omega1;GREEK PI SYMBOL -03CE;omegatonos;GREEK SMALL LETTER OMEGA WITH TONOS -03BF;omicron;GREEK SMALL LETTER OMICRON -03CC;omicrontonos;GREEK SMALL LETTER OMICRON WITH TONOS -0031;one;DIGIT ONE -2024;onedotenleader;ONE DOT LEADER -215B;oneeighth;VULGAR FRACTION ONE EIGHTH -00BD;onehalf;VULGAR FRACTION ONE HALF -00BC;onequarter;VULGAR FRACTION ONE QUARTER -2153;onethird;VULGAR FRACTION ONE THIRD -25E6;openbullet;WHITE BULLET -00AA;ordfeminine;FEMININE ORDINAL INDICATOR -00BA;ordmasculine;MASCULINE ORDINAL INDICATOR -221F;orthogonal;RIGHT ANGLE -00F8;oslash;LATIN SMALL LETTER O WITH STROKE -01FF;oslashacute;LATIN SMALL LETTER O WITH STROKE AND ACUTE -00F5;otilde;LATIN SMALL LETTER O WITH TILDE -0070;p;LATIN SMALL LETTER P -00B6;paragraph;PILCROW SIGN -0028;parenleft;LEFT PARENTHESIS -0029;parenright;RIGHT PARENTHESIS -2202;partialdiff;PARTIAL DIFFERENTIAL -0025;percent;PERCENT SIGN -002E;period;FULL STOP -00B7;periodcentered;MIDDLE DOT -22A5;perpendicular;UP TACK -2030;perthousand;PER MILLE SIGN -20A7;peseta;PESETA SIGN -03C6;phi;GREEK SMALL LETTER PHI -03D5;phi1;GREEK PHI SYMBOL -03C0;pi;GREEK SMALL LETTER PI -002B;plus;PLUS SIGN -00B1;plusminus;PLUS-MINUS SIGN -211E;prescription;PRESCRIPTION TAKE -220F;product;N-ARY PRODUCT -2282;propersubset;SUBSET OF -2283;propersuperset;SUPERSET OF -221D;proportional;PROPORTIONAL TO -03C8;psi;GREEK SMALL LETTER PSI -0071;q;LATIN SMALL LETTER Q -003F;question;QUESTION MARK -00BF;questiondown;INVERTED QUESTION MARK -0022;quotedbl;QUOTATION MARK -201E;quotedblbase;DOUBLE LOW-9 QUOTATION MARK -201C;quotedblleft;LEFT DOUBLE QUOTATION MARK -201D;quotedblright;RIGHT DOUBLE QUOTATION MARK -2018;quoteleft;LEFT SINGLE QUOTATION MARK -201B;quotereversed;SINGLE HIGH-REVERSED-9 QUOTATION MARK -2019;quoteright;RIGHT SINGLE QUOTATION MARK 
-201A;quotesinglbase;SINGLE LOW-9 QUOTATION MARK -0027;quotesingle;APOSTROPHE -0072;r;LATIN SMALL LETTER R -0155;racute;LATIN SMALL LETTER R WITH ACUTE -221A;radical;SQUARE ROOT -0159;rcaron;LATIN SMALL LETTER R WITH CARON -2286;reflexsubset;SUBSET OF OR EQUAL TO -2287;reflexsuperset;SUPERSET OF OR EQUAL TO -00AE;registered;REGISTERED SIGN -2310;revlogicalnot;REVERSED NOT SIGN -03C1;rho;GREEK SMALL LETTER RHO -02DA;ring;RING ABOVE -2590;rtblock;RIGHT HALF BLOCK -0073;s;LATIN SMALL LETTER S -015B;sacute;LATIN SMALL LETTER S WITH ACUTE -0161;scaron;LATIN SMALL LETTER S WITH CARON -015F;scedilla;LATIN SMALL LETTER S WITH CEDILLA -015D;scircumflex;LATIN SMALL LETTER S WITH CIRCUMFLEX -2033;second;DOUBLE PRIME -00A7;section;SECTION SIGN -003B;semicolon;SEMICOLON -0037;seven;DIGIT SEVEN -215E;seveneighths;VULGAR FRACTION SEVEN EIGHTHS -2592;shade;MEDIUM SHADE -03C3;sigma;GREEK SMALL LETTER SIGMA -03C2;sigma1;GREEK SMALL LETTER FINAL SIGMA -223C;similar;TILDE OPERATOR -0036;six;DIGIT SIX -002F;slash;SOLIDUS -263A;smileface;WHITE SMILING FACE -0020;space;SPACE -2660;spade;BLACK SPADE SUIT -00A3;sterling;POUND SIGN -220B;suchthat;CONTAINS AS MEMBER -2211;summation;N-ARY SUMMATION -263C;sun;WHITE SUN WITH RAYS -0074;t;LATIN SMALL LETTER T -03C4;tau;GREEK SMALL LETTER TAU -0167;tbar;LATIN SMALL LETTER T WITH STROKE -0165;tcaron;LATIN SMALL LETTER T WITH CARON -2234;therefore;THEREFORE -03B8;theta;GREEK SMALL LETTER THETA -03D1;theta1;GREEK THETA SYMBOL -00FE;thorn;LATIN SMALL LETTER THORN -0033;three;DIGIT THREE -215C;threeeighths;VULGAR FRACTION THREE EIGHTHS -00BE;threequarters;VULGAR FRACTION THREE QUARTERS -02DC;tilde;SMALL TILDE -0303;tildecomb;COMBINING TILDE -0384;tonos;GREEK TONOS -2122;trademark;TRADE MARK SIGN -25BC;triagdn;BLACK DOWN-POINTING TRIANGLE -25C4;triaglf;BLACK LEFT-POINTING POINTER -25BA;triagrt;BLACK RIGHT-POINTING POINTER -25B2;triagup;BLACK UP-POINTING TRIANGLE -0032;two;DIGIT TWO -2025;twodotenleader;TWO DOT LEADER -2154;twothirds;VULGAR FRACTION TWO 
THIRDS -0075;u;LATIN SMALL LETTER U -00FA;uacute;LATIN SMALL LETTER U WITH ACUTE -016D;ubreve;LATIN SMALL LETTER U WITH BREVE -00FB;ucircumflex;LATIN SMALL LETTER U WITH CIRCUMFLEX -00FC;udieresis;LATIN SMALL LETTER U WITH DIAERESIS -00F9;ugrave;LATIN SMALL LETTER U WITH GRAVE -01B0;uhorn;LATIN SMALL LETTER U WITH HORN -0171;uhungarumlaut;LATIN SMALL LETTER U WITH DOUBLE ACUTE -016B;umacron;LATIN SMALL LETTER U WITH MACRON -005F;underscore;LOW LINE -2017;underscoredbl;DOUBLE LOW LINE -222A;union;UNION -2200;universal;FOR ALL -0173;uogonek;LATIN SMALL LETTER U WITH OGONEK -2580;upblock;UPPER HALF BLOCK -03C5;upsilon;GREEK SMALL LETTER UPSILON -03CB;upsilondieresis;GREEK SMALL LETTER UPSILON WITH DIALYTIKA -03B0;upsilondieresistonos;GREEK SMALL LETTER UPSILON WITH DIALYTIKA AND TONOS -03CD;upsilontonos;GREEK SMALL LETTER UPSILON WITH TONOS -016F;uring;LATIN SMALL LETTER U WITH RING ABOVE -0169;utilde;LATIN SMALL LETTER U WITH TILDE -0076;v;LATIN SMALL LETTER V -0077;w;LATIN SMALL LETTER W -1E83;wacute;LATIN SMALL LETTER W WITH ACUTE -0175;wcircumflex;LATIN SMALL LETTER W WITH CIRCUMFLEX -1E85;wdieresis;LATIN SMALL LETTER W WITH DIAERESIS -2118;weierstrass;SCRIPT CAPITAL P -1E81;wgrave;LATIN SMALL LETTER W WITH GRAVE -0078;x;LATIN SMALL LETTER X -03BE;xi;GREEK SMALL LETTER XI -0079;y;LATIN SMALL LETTER Y -00FD;yacute;LATIN SMALL LETTER Y WITH ACUTE -0177;ycircumflex;LATIN SMALL LETTER Y WITH CIRCUMFLEX -00FF;ydieresis;LATIN SMALL LETTER Y WITH DIAERESIS -00A5;yen;YEN SIGN -1EF3;ygrave;LATIN SMALL LETTER Y WITH GRAVE -007A;z;LATIN SMALL LETTER Z -017A;zacute;LATIN SMALL LETTER Z WITH ACUTE -017E;zcaron;LATIN SMALL LETTER Z WITH CARON -017C;zdotaccent;LATIN SMALL LETTER Z WITH DOT ABOVE -0030;zero;DIGIT ZERO -03B6;zeta;GREEK SMALL LETTER ZETA -# END -""" - - -class AGLError(Exception): - pass - - -LEGACY_AGL2UV = {} -AGL2UV = {} -UV2AGL = {} - - -def _builddicts(): - import re - - lines = _aglText.splitlines() - - parseAGL_RE = 
re.compile("([A-Za-z0-9]+);((?:[0-9A-F]{4})(?: (?:[0-9A-F]{4}))*)$") - - for line in lines: - if not line or line[:1] == "#": - continue - m = parseAGL_RE.match(line) - if not m: - raise AGLError("syntax error in glyphlist.txt: %s" % repr(line[:20])) - unicodes = m.group(2) - assert len(unicodes) % 5 == 4 - unicodes = [int(unicode, 16) for unicode in unicodes.split()] - glyphName = tostr(m.group(1)) - LEGACY_AGL2UV[glyphName] = unicodes - - lines = _aglfnText.splitlines() - - parseAGLFN_RE = re.compile("([0-9A-F]{4});([A-Za-z0-9]+);.*?$") - - for line in lines: - if not line or line[:1] == "#": - continue - m = parseAGLFN_RE.match(line) - if not m: - raise AGLError("syntax error in aglfn.txt: %s" % repr(line[:20])) - unicode = m.group(1) - assert len(unicode) == 4 - unicode = int(unicode, 16) - glyphName = tostr(m.group(2)) - AGL2UV[glyphName] = unicode - UV2AGL[unicode] = glyphName - - -_builddicts() - - -def toUnicode(glyph, isZapfDingbats=False): - """Convert glyph names to Unicode, such as ``'longs_t.oldstyle'`` --> ``u'ſt'`` - - If ``isZapfDingbats`` is ``True``, the implementation recognizes additional - glyph names (as required by the AGL specification). - """ - # https://github.com/adobe-type-tools/agl-specification#2-the-mapping - # - # 1. Drop all the characters from the glyph name starting with - # the first occurrence of a period (U+002E; FULL STOP), if any. - glyph = glyph.split(".", 1)[0] - - # 2. Split the remaining string into a sequence of components, - # using underscore (U+005F; LOW LINE) as the delimiter. - components = glyph.split("_") - - # 3. Map each component to a character string according to the - # procedure below, and concatenate those strings; the result - # is the character string to which the glyph name is mapped. 
- result = [_glyphComponentToUnicode(c, isZapfDingbats) for c in components] - return "".join(result) - - -def _glyphComponentToUnicode(component, isZapfDingbats): - # If the font is Zapf Dingbats (PostScript FontName: ZapfDingbats), - # and the component is in the ITC Zapf Dingbats Glyph List, then - # map it to the corresponding character in that list. - dingbat = _zapfDingbatsToUnicode(component) if isZapfDingbats else None - if dingbat: - return dingbat - - # Otherwise, if the component is in AGL, then map it - # to the corresponding character in that list. - uchars = LEGACY_AGL2UV.get(component) - if uchars: - return "".join(map(chr, uchars)) - - # Otherwise, if the component is of the form "uni" (U+0075, - # U+006E, and U+0069) followed by a sequence of uppercase - # hexadecimal digits (0–9 and A–F, meaning U+0030 through - # U+0039 and U+0041 through U+0046), if the length of that - # sequence is a multiple of four, and if each group of four - # digits represents a value in the ranges 0000 through D7FF - # or E000 through FFFF, then interpret each as a Unicode scalar - # value and map the component to the string made of those - # scalar values. Note that the range and digit-length - # restrictions mean that the "uni" glyph name prefix can be - # used only with UVs in the Basic Multilingual Plane (BMP). - uni = _uniToUnicode(component) - if uni: - return uni - - # Otherwise, if the component is of the form "u" (U+0075) - # followed by a sequence of four to six uppercase hexadecimal - # digits (0–9 and A–F, meaning U+0030 through U+0039 and - # U+0041 through U+0046), and those digits represents a value - # in the ranges 0000 through D7FF or E000 through 10FFFF, then - # interpret it as a Unicode scalar value and map the component - # to the string made of this scalar value. - uni = _uToUnicode(component) - if uni: - return uni - - # Otherwise, map the component to an empty string. 
- return "" - - -# https://github.com/adobe-type-tools/agl-aglfn/blob/master/zapfdingbats.txt -_AGL_ZAPF_DINGBATS = ( - " ✁✂✄☎✆✝✞✟✠✡☛☞✌✍✎✏✑✒✓✔✕✖✗✘✙✚✛✜✢✣✤✥✦✧★✩✪✫✬✭✮✯✰✱✲✳✴✵✶✷✸✹✺✻✼✽✾✿❀" - "❁❂❃❄❅❆❇❈❉❊❋●❍■❏❑▲▼◆❖ ◗❘❙❚❯❱❲❳❨❩❬❭❪❫❴❵❛❜❝❞❡❢❣❤✐❥❦❧♠♥♦♣ ✉✈✇" - "①②③④⑤⑥⑦⑧⑨⑩❶❷❸❹❺❻❼❽❾❿➀➁➂➃➄➅➆➇➈➉➊➋➌➍➎➏➐➑➒➓➔→➣↔" - "↕➙➛➜➝➞➟➠➡➢➤➥➦➧➨➩➫➭➯➲➳➵➸➺➻➼➽➾➚➪➶➹➘➴➷➬➮➱✃❐❒❮❰" -) - - -def _zapfDingbatsToUnicode(glyph): - """Helper for toUnicode().""" - if len(glyph) < 2 or glyph[0] != "a": - return None - try: - gid = int(glyph[1:]) - except ValueError: - return None - if gid < 0 or gid >= len(_AGL_ZAPF_DINGBATS): - return None - uchar = _AGL_ZAPF_DINGBATS[gid] - return uchar if uchar != " " else None - - -_re_uni = re.compile("^uni([0-9A-F]+)$") - - -def _uniToUnicode(component): - """Helper for toUnicode() to handle "uniABCD" components.""" - match = _re_uni.match(component) - if match is None: - return None - digits = match.group(1) - if len(digits) % 4 != 0: - return None - chars = [int(digits[i : i + 4], 16) for i in range(0, len(digits), 4)] - if any(c >= 0xD800 and c <= 0xDFFF for c in chars): - # The AGL specification explicitly excluded surrogate pairs. 
- return None - return "".join([chr(c) for c in chars]) - - -_re_u = re.compile("^u([0-9A-F]{4,6})$") - - -def _uToUnicode(component): - """Helper for toUnicode() to handle "u1ABCD" components.""" - match = _re_u.match(component) - if match is None: - return None - digits = match.group(1) - try: - value = int(digits, 16) - except ValueError: - return None - if (value >= 0x0000 and value <= 0xD7FF) or (value >= 0xE000 and value <= 0x10FFFF): - return chr(value) - return None diff --git a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/mtiLib/__init__.py b/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/mtiLib/__init__.py deleted file mode 100644 index dbedf275e3d3cfb2e8ec43eddd88b9d78ad53e15..0000000000000000000000000000000000000000 --- a/spaces/chuan-hd/law-assistant-chatbot/.venv/lib/python3.11/site-packages/fontTools/mtiLib/__init__.py +++ /dev/null @@ -1,1402 +0,0 @@ -#!/usr/bin/python - -# FontDame-to-FontTools for OpenType Layout tables -# -# Source language spec is available at: -# http://monotype.github.io/OpenType_Table_Source/otl_source.html -# https://github.com/Monotype/OpenType_Table_Source/ - -from fontTools import ttLib -from fontTools.ttLib.tables._c_m_a_p import cmap_classes -from fontTools.ttLib.tables import otTables as ot -from fontTools.ttLib.tables.otBase import ValueRecord, valueRecordFormatDict -from fontTools.otlLib import builder as otl -from contextlib import contextmanager -from fontTools.ttLib import newTable -from fontTools.feaLib.lookupDebugInfo import LOOKUP_DEBUG_ENV_VAR, LOOKUP_DEBUG_INFO_KEY -from operator import setitem -import os -import logging - - -class MtiLibError(Exception): - pass - - -class ReferenceNotFoundError(MtiLibError): - pass - - -class FeatureNotFoundError(ReferenceNotFoundError): - pass - - -class LookupNotFoundError(ReferenceNotFoundError): - pass - - -log = logging.getLogger("fontTools.mtiLib") - - -def makeGlyph(s): - if s[:2] in ["U ", "u 
"]: - return ttLib.TTFont._makeGlyphName(int(s[2:], 16)) - elif s[:2] == "# ": - return "glyph%.5d" % int(s[2:]) - assert s.find(" ") < 0, "Space found in glyph name: %s" % s - assert s, "Glyph name is empty" - return s - - -def makeGlyphs(l): - return [makeGlyph(g) for g in l] - - -def mapLookup(sym, mapping): - # Lookups are addressed by name. So resolve them using a map if available. - # Fall back to parsing as a lookup index if a map isn't provided. - if mapping is not None: - try: - idx = mapping[sym] - except KeyError: - raise LookupNotFoundError(sym) - else: - idx = int(sym) - return idx - - -def mapFeature(sym, mapping): - # Features are referenced by index according to the spec. So, if symbol is an - # integer, use it directly. Otherwise look up in the map if provided. - try: - idx = int(sym) - except ValueError: - try: - idx = mapping[sym] - except KeyError: - raise FeatureNotFoundError(sym) - return idx - - -def setReference(mapper, mapping, sym, setter, collection, key): - try: - mapped = mapper(sym, mapping) - except ReferenceNotFoundError as e: - try: - if mapping is not None: - mapping.addDeferredMapping( - lambda ref: setter(collection, key, ref), sym, e - ) - return - except AttributeError: - pass - raise - setter(collection, key, mapped) - - -class DeferredMapping(dict): - def __init__(self): - self._deferredMappings = [] - - def addDeferredMapping(self, setter, sym, e): - log.debug("Adding deferred mapping for symbol '%s' %s", sym, type(e).__name__) - self._deferredMappings.append((setter, sym, e)) - - def applyDeferredMappings(self): - for setter, sym, e in self._deferredMappings: - log.debug( - "Applying deferred mapping for symbol '%s' %s", sym, type(e).__name__ - ) - try: - mapped = self[sym] - except KeyError: - raise e - setter(mapped) - log.debug("Set to %s", mapped) - self._deferredMappings = [] - - -def parseScriptList(lines, featureMap=None): - self = ot.ScriptList() - records = [] - with lines.between("script table"): - for line in lines: - 
while len(line) < 4: - line.append("") - scriptTag, langSysTag, defaultFeature, features = line - log.debug("Adding script %s language-system %s", scriptTag, langSysTag) - - langSys = ot.LangSys() - langSys.LookupOrder = None - if defaultFeature: - setReference( - mapFeature, - featureMap, - defaultFeature, - setattr, - langSys, - "ReqFeatureIndex", - ) - else: - langSys.ReqFeatureIndex = 0xFFFF - syms = stripSplitComma(features) - langSys.FeatureIndex = theList = [3] * len(syms) - for i, sym in enumerate(syms): - setReference(mapFeature, featureMap, sym, setitem, theList, i) - langSys.FeatureCount = len(langSys.FeatureIndex) - - script = [s for s in records if s.ScriptTag == scriptTag] - if script: - script = script[0].Script - else: - scriptRec = ot.ScriptRecord() - scriptRec.ScriptTag = scriptTag + " " * (4 - len(scriptTag)) - scriptRec.Script = ot.Script() - records.append(scriptRec) - script = scriptRec.Script - script.DefaultLangSys = None - script.LangSysRecord = [] - script.LangSysCount = 0 - - if langSysTag == "default": - script.DefaultLangSys = langSys - else: - langSysRec = ot.LangSysRecord() - langSysRec.LangSysTag = langSysTag + " " * (4 - len(langSysTag)) - langSysRec.LangSys = langSys - script.LangSysRecord.append(langSysRec) - script.LangSysCount = len(script.LangSysRecord) - - for script in records: - script.Script.LangSysRecord = sorted( - script.Script.LangSysRecord, key=lambda rec: rec.LangSysTag - ) - self.ScriptRecord = sorted(records, key=lambda rec: rec.ScriptTag) - self.ScriptCount = len(self.ScriptRecord) - return self - - -def parseFeatureList(lines, lookupMap=None, featureMap=None): - self = ot.FeatureList() - self.FeatureRecord = [] - with lines.between("feature table"): - for line in lines: - name, featureTag, lookups = line - if featureMap is not None: - assert name not in featureMap, "Duplicate feature name: %s" % name - featureMap[name] = len(self.FeatureRecord) - # If feature name is integer, make sure it matches its index. 
- try: - assert int(name) == len(self.FeatureRecord), "%d %d" % ( - name, - len(self.FeatureRecord), - ) - except ValueError: - pass - featureRec = ot.FeatureRecord() - featureRec.FeatureTag = featureTag - featureRec.Feature = ot.Feature() - self.FeatureRecord.append(featureRec) - feature = featureRec.Feature - feature.FeatureParams = None - syms = stripSplitComma(lookups) - feature.LookupListIndex = theList = [None] * len(syms) - for i, sym in enumerate(syms): - setReference(mapLookup, lookupMap, sym, setitem, theList, i) - feature.LookupCount = len(feature.LookupListIndex) - - self.FeatureCount = len(self.FeatureRecord) - return self - - -def parseLookupFlags(lines): - flags = 0 - filterset = None - allFlags = [ - "righttoleft", - "ignorebaseglyphs", - "ignoreligatures", - "ignoremarks", - "markattachmenttype", - "markfiltertype", - ] - while lines.peeks()[0].lower() in allFlags: - line = next(lines) - flag = { - "righttoleft": 0x0001, - "ignorebaseglyphs": 0x0002, - "ignoreligatures": 0x0004, - "ignoremarks": 0x0008, - }.get(line[0].lower()) - if flag: - assert line[1].lower() in ["yes", "no"], line[1] - if line[1].lower() == "yes": - flags |= flag - continue - if line[0].lower() == "markattachmenttype": - flags |= int(line[1]) << 8 - continue - if line[0].lower() == "markfiltertype": - flags |= 0x10 - filterset = int(line[1]) - return flags, filterset - - -def parseSingleSubst(lines, font, _lookupMap=None): - mapping = {} - for line in lines: - assert len(line) == 2, line - line = makeGlyphs(line) - mapping[line[0]] = line[1] - return otl.buildSingleSubstSubtable(mapping) - - -def parseMultiple(lines, font, _lookupMap=None): - mapping = {} - for line in lines: - line = makeGlyphs(line) - mapping[line[0]] = line[1:] - return otl.buildMultipleSubstSubtable(mapping) - - -def parseAlternate(lines, font, _lookupMap=None): - mapping = {} - for line in lines: - line = makeGlyphs(line) - mapping[line[0]] = line[1:] - return otl.buildAlternateSubstSubtable(mapping) - - 
-def parseLigature(lines, font, _lookupMap=None): - mapping = {} - for line in lines: - assert len(line) >= 2, line - line = makeGlyphs(line) - mapping[tuple(line[1:])] = line[0] - return otl.buildLigatureSubstSubtable(mapping) - - -def parseSinglePos(lines, font, _lookupMap=None): - values = {} - for line in lines: - assert len(line) == 3, line - w = line[0].title().replace(" ", "") - assert w in valueRecordFormatDict - g = makeGlyph(line[1]) - v = int(line[2]) - if g not in values: - values[g] = ValueRecord() - assert not hasattr(values[g], w), (g, w) - setattr(values[g], w, v) - return otl.buildSinglePosSubtable(values, font.getReverseGlyphMap()) - - -def parsePair(lines, font, _lookupMap=None): - self = ot.PairPos() - self.ValueFormat1 = self.ValueFormat2 = 0 - typ = lines.peeks()[0].split()[0].lower() - if typ in ("left", "right"): - self.Format = 1 - values = {} - for line in lines: - assert len(line) == 4, line - side = line[0].split()[0].lower() - assert side in ("left", "right"), side - what = line[0][len(side) :].title().replace(" ", "") - mask = valueRecordFormatDict[what][0] - glyph1, glyph2 = makeGlyphs(line[1:3]) - value = int(line[3]) - if not glyph1 in values: - values[glyph1] = {} - if not glyph2 in values[glyph1]: - values[glyph1][glyph2] = (ValueRecord(), ValueRecord()) - rec2 = values[glyph1][glyph2] - if side == "left": - self.ValueFormat1 |= mask - vr = rec2[0] - else: - self.ValueFormat2 |= mask - vr = rec2[1] - assert not hasattr(vr, what), (vr, what) - setattr(vr, what, value) - self.Coverage = makeCoverage(set(values.keys()), font) - self.PairSet = [] - for glyph1 in self.Coverage.glyphs: - values1 = values[glyph1] - pairset = ot.PairSet() - records = pairset.PairValueRecord = [] - for glyph2 in sorted(values1.keys(), key=font.getGlyphID): - values2 = values1[glyph2] - pair = ot.PairValueRecord() - pair.SecondGlyph = glyph2 - pair.Value1 = values2[0] - pair.Value2 = values2[1] if self.ValueFormat2 else None - records.append(pair) - 
pairset.PairValueCount = len(pairset.PairValueRecord) - self.PairSet.append(pairset) - self.PairSetCount = len(self.PairSet) - elif typ.endswith("class"): - self.Format = 2 - classDefs = [None, None] - while lines.peeks()[0].endswith("class definition begin"): - typ = lines.peek()[0][: -len("class definition begin")].lower() - idx, klass = { - "first": (0, ot.ClassDef1), - "second": (1, ot.ClassDef2), - }[typ] - assert classDefs[idx] is None - classDefs[idx] = parseClassDef(lines, font, klass=klass) - self.ClassDef1, self.ClassDef2 = classDefs - self.Class1Count, self.Class2Count = ( - 1 + max(c.classDefs.values()) for c in classDefs - ) - self.Class1Record = [ot.Class1Record() for i in range(self.Class1Count)] - for rec1 in self.Class1Record: - rec1.Class2Record = [ot.Class2Record() for j in range(self.Class2Count)] - for rec2 in rec1.Class2Record: - rec2.Value1 = ValueRecord() - rec2.Value2 = ValueRecord() - for line in lines: - assert len(line) == 4, line - side = line[0].split()[0].lower() - assert side in ("left", "right"), side - what = line[0][len(side) :].title().replace(" ", "") - mask = valueRecordFormatDict[what][0] - class1, class2, value = (int(x) for x in line[1:4]) - rec2 = self.Class1Record[class1].Class2Record[class2] - if side == "left": - self.ValueFormat1 |= mask - vr = rec2.Value1 - else: - self.ValueFormat2 |= mask - vr = rec2.Value2 - assert not hasattr(vr, what), (vr, what) - setattr(vr, what, value) - for rec1 in self.Class1Record: - for rec2 in rec1.Class2Record: - rec2.Value1 = ValueRecord(self.ValueFormat1, rec2.Value1) - rec2.Value2 = ( - ValueRecord(self.ValueFormat2, rec2.Value2) - if self.ValueFormat2 - else None - ) - - self.Coverage = makeCoverage(set(self.ClassDef1.classDefs.keys()), font) - else: - assert 0, typ - return self - - -def parseKernset(lines, font, _lookupMap=None): - typ = lines.peeks()[0].split()[0].lower() - if typ in ("left", "right"): - with lines.until( - ("firstclass definition begin", "secondclass definition 
begin") - ): - return parsePair(lines, font) - return parsePair(lines, font) - - -def makeAnchor(data, klass=ot.Anchor): - assert len(data) <= 2 - anchor = klass() - anchor.Format = 1 - anchor.XCoordinate, anchor.YCoordinate = intSplitComma(data[0]) - if len(data) > 1 and data[1] != "": - anchor.Format = 2 - anchor.AnchorPoint = int(data[1]) - return anchor - - -def parseCursive(lines, font, _lookupMap=None): - records = {} - for line in lines: - assert len(line) in [3, 4], line - idx, klass = { - "entry": (0, ot.EntryAnchor), - "exit": (1, ot.ExitAnchor), - }[line[0]] - glyph = makeGlyph(line[1]) - if glyph not in records: - records[glyph] = [None, None] - assert records[glyph][idx] is None, (glyph, idx) - records[glyph][idx] = makeAnchor(line[2:], klass) - return otl.buildCursivePosSubtable(records, font.getReverseGlyphMap()) - - -def makeMarkRecords(data, coverage, c): - records = [] - for glyph in coverage.glyphs: - klass, anchor = data[glyph] - record = c.MarkRecordClass() - record.Class = klass - setattr(record, c.MarkAnchor, anchor) - records.append(record) - return records - - -def makeBaseRecords(data, coverage, c, classCount): - records = [] - idx = {} - for glyph in coverage.glyphs: - idx[glyph] = len(records) - record = c.BaseRecordClass() - anchors = [None] * classCount - setattr(record, c.BaseAnchor, anchors) - records.append(record) - for (glyph, klass), anchor in data.items(): - record = records[idx[glyph]] - anchors = getattr(record, c.BaseAnchor) - assert anchors[klass] is None, (glyph, klass) - anchors[klass] = anchor - return records - - -def makeLigatureRecords(data, coverage, c, classCount): - records = [None] * len(coverage.glyphs) - idx = {g: i for i, g in enumerate(coverage.glyphs)} - - for (glyph, klass, compIdx, compCount), anchor in data.items(): - record = records[idx[glyph]] - if record is None: - record = records[idx[glyph]] = ot.LigatureAttach() - record.ComponentCount = compCount - record.ComponentRecord = [ot.ComponentRecord() for 
i in range(compCount)] - for compRec in record.ComponentRecord: - compRec.LigatureAnchor = [None] * classCount - assert record.ComponentCount == compCount, ( - glyph, - record.ComponentCount, - compCount, - ) - - anchors = record.ComponentRecord[compIdx - 1].LigatureAnchor - assert anchors[klass] is None, (glyph, compIdx, klass) - anchors[klass] = anchor - return records - - -def parseMarkToSomething(lines, font, c): - self = c.Type() - self.Format = 1 - markData = {} - baseData = {} - Data = { - "mark": (markData, c.MarkAnchorClass), - "base": (baseData, c.BaseAnchorClass), - "ligature": (baseData, c.BaseAnchorClass), - } - maxKlass = 0 - for line in lines: - typ = line[0] - assert typ in ("mark", "base", "ligature") - glyph = makeGlyph(line[1]) - data, anchorClass = Data[typ] - extraItems = 2 if typ == "ligature" else 0 - extras = tuple(int(i) for i in line[2 : 2 + extraItems]) - klass = int(line[2 + extraItems]) - anchor = makeAnchor(line[3 + extraItems :], anchorClass) - if typ == "mark": - key, value = glyph, (klass, anchor) - else: - key, value = ((glyph, klass) + extras), anchor - assert key not in data, key - data[key] = value - maxKlass = max(maxKlass, klass) - - # Mark - markCoverage = makeCoverage(set(markData.keys()), font, c.MarkCoverageClass) - markArray = c.MarkArrayClass() - markRecords = makeMarkRecords(markData, markCoverage, c) - setattr(markArray, c.MarkRecord, markRecords) - setattr(markArray, c.MarkCount, len(markRecords)) - setattr(self, c.MarkCoverage, markCoverage) - setattr(self, c.MarkArray, markArray) - self.ClassCount = maxKlass + 1 - - # Base - self.classCount = 0 if not baseData else 1 + max(k[1] for k, v in baseData.items()) - baseCoverage = makeCoverage( - set([k[0] for k in baseData.keys()]), font, c.BaseCoverageClass - ) - baseArray = c.BaseArrayClass() - if c.Base == "Ligature": - baseRecords = makeLigatureRecords(baseData, baseCoverage, c, self.classCount) - else: - baseRecords = makeBaseRecords(baseData, baseCoverage, c, 
self.classCount) - setattr(baseArray, c.BaseRecord, baseRecords) - setattr(baseArray, c.BaseCount, len(baseRecords)) - setattr(self, c.BaseCoverage, baseCoverage) - setattr(self, c.BaseArray, baseArray) - - return self - - -class MarkHelper(object): - def __init__(self): - for Which in ("Mark", "Base"): - for What in ("Coverage", "Array", "Count", "Record", "Anchor"): - key = Which + What - if Which == "Mark" and What in ("Count", "Record", "Anchor"): - value = key - else: - value = getattr(self, Which) + What - if value == "LigatureRecord": - value = "LigatureAttach" - setattr(self, key, value) - if What != "Count": - klass = getattr(ot, value) - setattr(self, key + "Class", klass) - - -class MarkToBaseHelper(MarkHelper): - Mark = "Mark" - Base = "Base" - Type = ot.MarkBasePos - - -class MarkToMarkHelper(MarkHelper): - Mark = "Mark1" - Base = "Mark2" - Type = ot.MarkMarkPos - - -class MarkToLigatureHelper(MarkHelper): - Mark = "Mark" - Base = "Ligature" - Type = ot.MarkLigPos - - -def parseMarkToBase(lines, font, _lookupMap=None): - return parseMarkToSomething(lines, font, MarkToBaseHelper()) - - -def parseMarkToMark(lines, font, _lookupMap=None): - return parseMarkToSomething(lines, font, MarkToMarkHelper()) - - -def parseMarkToLigature(lines, font, _lookupMap=None): - return parseMarkToSomething(lines, font, MarkToLigatureHelper()) - - -def stripSplitComma(line): - return [s.strip() for s in line.split(",")] if line else [] - - -def intSplitComma(line): - return [int(i) for i in line.split(",")] if line else [] - - -# Copied from fontTools.subset -class ContextHelper(object): - def __init__(self, klassName, Format): - if klassName.endswith("Subst"): - Typ = "Sub" - Type = "Subst" - else: - Typ = "Pos" - Type = "Pos" - if klassName.startswith("Chain"): - Chain = "Chain" - InputIdx = 1 - DataLen = 3 - else: - Chain = "" - InputIdx = 0 - DataLen = 1 - ChainTyp = Chain + Typ - - self.Typ = Typ - self.Type = Type - self.Chain = Chain - self.ChainTyp = ChainTyp - 
self.InputIdx = InputIdx - self.DataLen = DataLen - - self.LookupRecord = Type + "LookupRecord" - - if Format == 1: - Coverage = lambda r: r.Coverage - ChainCoverage = lambda r: r.Coverage - ContextData = lambda r: (None,) - ChainContextData = lambda r: (None, None, None) - SetContextData = None - SetChainContextData = None - RuleData = lambda r: (r.Input,) - ChainRuleData = lambda r: (r.Backtrack, r.Input, r.LookAhead) - - def SetRuleData(r, d): - (r.Input,) = d - (r.GlyphCount,) = (len(x) + 1 for x in d) - - def ChainSetRuleData(r, d): - (r.Backtrack, r.Input, r.LookAhead) = d - ( - r.BacktrackGlyphCount, - r.InputGlyphCount, - r.LookAheadGlyphCount, - ) = (len(d[0]), len(d[1]) + 1, len(d[2])) - - elif Format == 2: - Coverage = lambda r: r.Coverage - ChainCoverage = lambda r: r.Coverage - ContextData = lambda r: (r.ClassDef,) - ChainContextData = lambda r: ( - r.BacktrackClassDef, - r.InputClassDef, - r.LookAheadClassDef, - ) - - def SetContextData(r, d): - (r.ClassDef,) = d - - def SetChainContextData(r, d): - (r.BacktrackClassDef, r.InputClassDef, r.LookAheadClassDef) = d - - RuleData = lambda r: (r.Class,) - ChainRuleData = lambda r: (r.Backtrack, r.Input, r.LookAhead) - - def SetRuleData(r, d): - (r.Class,) = d - (r.GlyphCount,) = (len(x) + 1 for x in d) - - def ChainSetRuleData(r, d): - (r.Backtrack, r.Input, r.LookAhead) = d - ( - r.BacktrackGlyphCount, - r.InputGlyphCount, - r.LookAheadGlyphCount, - ) = (len(d[0]), len(d[1]) + 1, len(d[2])) - - elif Format == 3: - Coverage = lambda r: r.Coverage[0] - ChainCoverage = lambda r: r.InputCoverage[0] - ContextData = None - ChainContextData = None - SetContextData = None - SetChainContextData = None - RuleData = lambda r: r.Coverage - ChainRuleData = lambda r: ( - r.BacktrackCoverage + r.InputCoverage + r.LookAheadCoverage - ) - - def SetRuleData(r, d): - (r.Coverage,) = d - (r.GlyphCount,) = (len(x) for x in d) - - def ChainSetRuleData(r, d): - (r.BacktrackCoverage, r.InputCoverage, r.LookAheadCoverage) = d - ( 
- r.BacktrackGlyphCount, - r.InputGlyphCount, - r.LookAheadGlyphCount, - ) = (len(x) for x in d) - - else: - assert 0, "unknown format: %s" % Format - - if Chain: - self.Coverage = ChainCoverage - self.ContextData = ChainContextData - self.SetContextData = SetChainContextData - self.RuleData = ChainRuleData - self.SetRuleData = ChainSetRuleData - else: - self.Coverage = Coverage - self.ContextData = ContextData - self.SetContextData = SetContextData - self.RuleData = RuleData - self.SetRuleData = SetRuleData - - if Format == 1: - self.Rule = ChainTyp + "Rule" - self.RuleCount = ChainTyp + "RuleCount" - self.RuleSet = ChainTyp + "RuleSet" - self.RuleSetCount = ChainTyp + "RuleSetCount" - self.Intersect = lambda glyphs, c, r: [r] if r in glyphs else [] - elif Format == 2: - self.Rule = ChainTyp + "ClassRule" - self.RuleCount = ChainTyp + "ClassRuleCount" - self.RuleSet = ChainTyp + "ClassSet" - self.RuleSetCount = ChainTyp + "ClassSetCount" - self.Intersect = lambda glyphs, c, r: ( - c.intersect_class(glyphs, r) - if c - else (set(glyphs) if r == 0 else set()) - ) - - self.ClassDef = "InputClassDef" if Chain else "ClassDef" - self.ClassDefIndex = 1 if Chain else 0 - self.Input = "Input" if Chain else "Class" - - -def parseLookupRecords(items, klassName, lookupMap=None): - klass = getattr(ot, klassName) - lst = [] - for item in items: - rec = klass() - item = stripSplitComma(item) - assert len(item) == 2, item - idx = int(item[0]) - assert idx > 0, idx - rec.SequenceIndex = idx - 1 - setReference(mapLookup, lookupMap, item[1], setattr, rec, "LookupListIndex") - lst.append(rec) - return lst - - -def makeClassDef(classDefs, font, klass=ot.Coverage): - if not classDefs: - return None - self = klass() - self.classDefs = dict(classDefs) - return self - - -def parseClassDef(lines, font, klass=ot.ClassDef): - classDefs = {} - with lines.between("class definition"): - for line in lines: - glyph = makeGlyph(line[0]) - assert glyph not in classDefs, glyph - classDefs[glyph] = 
int(line[1]) - return makeClassDef(classDefs, font, klass) - - -def makeCoverage(glyphs, font, klass=ot.Coverage): - if not glyphs: - return None - if isinstance(glyphs, set): - glyphs = sorted(glyphs) - coverage = klass() - coverage.glyphs = sorted(set(glyphs), key=font.getGlyphID) - return coverage - - -def parseCoverage(lines, font, klass=ot.Coverage): - glyphs = [] - with lines.between("coverage definition"): - for line in lines: - glyphs.append(makeGlyph(line[0])) - return makeCoverage(glyphs, font, klass) - - -def bucketizeRules(self, c, rules, bucketKeys): - buckets = {} - for seq, recs in rules: - buckets.setdefault(seq[c.InputIdx][0], []).append( - (tuple(s[1 if i == c.InputIdx else 0 :] for i, s in enumerate(seq)), recs) - ) - - rulesets = [] - for firstGlyph in bucketKeys: - if firstGlyph not in buckets: - rulesets.append(None) - continue - thisRules = [] - for seq, recs in buckets[firstGlyph]: - rule = getattr(ot, c.Rule)() - c.SetRuleData(rule, seq) - setattr(rule, c.Type + "Count", len(recs)) - setattr(rule, c.LookupRecord, recs) - thisRules.append(rule) - - ruleset = getattr(ot, c.RuleSet)() - setattr(ruleset, c.Rule, thisRules) - setattr(ruleset, c.RuleCount, len(thisRules)) - rulesets.append(ruleset) - - setattr(self, c.RuleSet, rulesets) - setattr(self, c.RuleSetCount, len(rulesets)) - - -def parseContext(lines, font, Type, lookupMap=None): - self = getattr(ot, Type)() - typ = lines.peeks()[0].split()[0].lower() - if typ == "glyph": - self.Format = 1 - log.debug("Parsing %s format %s", Type, self.Format) - c = ContextHelper(Type, self.Format) - rules = [] - for line in lines: - assert line[0].lower() == "glyph", line[0] - while len(line) < 1 + c.DataLen: - line.append("") - seq = tuple(makeGlyphs(stripSplitComma(i)) for i in line[1 : 1 + c.DataLen]) - recs = parseLookupRecords(line[1 + c.DataLen :], c.LookupRecord, lookupMap) - rules.append((seq, recs)) - - firstGlyphs = set(seq[c.InputIdx][0] for seq, recs in rules) - self.Coverage = 
makeCoverage(firstGlyphs, font) - bucketizeRules(self, c, rules, self.Coverage.glyphs) - elif typ.endswith("class"): - self.Format = 2 - log.debug("Parsing %s format %s", Type, self.Format) - c = ContextHelper(Type, self.Format) - classDefs = [None] * c.DataLen - while lines.peeks()[0].endswith("class definition begin"): - typ = lines.peek()[0][: -len("class definition begin")].lower() - idx, klass = { - 1: { - "": (0, ot.ClassDef), - }, - 3: { - "backtrack": (0, ot.BacktrackClassDef), - "": (1, ot.InputClassDef), - "lookahead": (2, ot.LookAheadClassDef), - }, - }[c.DataLen][typ] - assert classDefs[idx] is None, idx - classDefs[idx] = parseClassDef(lines, font, klass=klass) - c.SetContextData(self, classDefs) - rules = [] - for line in lines: - assert line[0].lower().startswith("class"), line[0] - while len(line) < 1 + c.DataLen: - line.append("") - seq = tuple(intSplitComma(i) for i in line[1 : 1 + c.DataLen]) - recs = parseLookupRecords(line[1 + c.DataLen :], c.LookupRecord, lookupMap) - rules.append((seq, recs)) - firstClasses = set(seq[c.InputIdx][0] for seq, recs in rules) - firstGlyphs = set( - g for g, c in classDefs[c.InputIdx].classDefs.items() if c in firstClasses - ) - self.Coverage = makeCoverage(firstGlyphs, font) - bucketizeRules(self, c, rules, range(max(firstClasses) + 1)) - elif typ.endswith("coverage"): - self.Format = 3 - log.debug("Parsing %s format %s", Type, self.Format) - c = ContextHelper(Type, self.Format) - coverages = tuple([] for i in range(c.DataLen)) - while lines.peeks()[0].endswith("coverage definition begin"): - typ = lines.peek()[0][: -len("coverage definition begin")].lower() - idx, klass = { - 1: { - "": (0, ot.Coverage), - }, - 3: { - "backtrack": (0, ot.BacktrackCoverage), - "input": (1, ot.InputCoverage), - "lookahead": (2, ot.LookAheadCoverage), - }, - }[c.DataLen][typ] - coverages[idx].append(parseCoverage(lines, font, klass=klass)) - c.SetRuleData(self, coverages) - lines = list(lines) - assert len(lines) == 1 - line = 
lines[0] - assert line[0].lower() == "coverage", line[0] - recs = parseLookupRecords(line[1:], c.LookupRecord, lookupMap) - setattr(self, c.Type + "Count", len(recs)) - setattr(self, c.LookupRecord, recs) - else: - assert 0, typ - return self - - -def parseContextSubst(lines, font, lookupMap=None): - return parseContext(lines, font, "ContextSubst", lookupMap=lookupMap) - - -def parseContextPos(lines, font, lookupMap=None): - return parseContext(lines, font, "ContextPos", lookupMap=lookupMap) - - -def parseChainedSubst(lines, font, lookupMap=None): - return parseContext(lines, font, "ChainContextSubst", lookupMap=lookupMap) - - -def parseChainedPos(lines, font, lookupMap=None): - return parseContext(lines, font, "ChainContextPos", lookupMap=lookupMap) - - -def parseReverseChainedSubst(lines, font, _lookupMap=None): - self = ot.ReverseChainSingleSubst() - self.Format = 1 - coverages = ([], []) - while lines.peeks()[0].endswith("coverage definition begin"): - typ = lines.peek()[0][: -len("coverage definition begin")].lower() - idx, klass = { - "backtrack": (0, ot.BacktrackCoverage), - "lookahead": (1, ot.LookAheadCoverage), - }[typ] - coverages[idx].append(parseCoverage(lines, font, klass=klass)) - self.BacktrackCoverage = coverages[0] - self.BacktrackGlyphCount = len(self.BacktrackCoverage) - self.LookAheadCoverage = coverages[1] - self.LookAheadGlyphCount = len(self.LookAheadCoverage) - mapping = {} - for line in lines: - assert len(line) == 2, line - line = makeGlyphs(line) - mapping[line[0]] = line[1] - self.Coverage = makeCoverage(set(mapping.keys()), font) - self.Substitute = [mapping[k] for k in self.Coverage.glyphs] - self.GlyphCount = len(self.Substitute) - return self - - -def parseLookup(lines, tableTag, font, lookupMap=None): - line = lines.expect("lookup") - _, name, typ = line - log.debug("Parsing lookup type %s %s", typ, name) - lookup = ot.Lookup() - lookup.LookupFlag, filterset = parseLookupFlags(lines) - if filterset is not None: - 
lookup.MarkFilteringSet = filterset - lookup.LookupType, parseLookupSubTable = { - "GSUB": { - "single": (1, parseSingleSubst), - "multiple": (2, parseMultiple), - "alternate": (3, parseAlternate), - "ligature": (4, parseLigature), - "context": (5, parseContextSubst), - "chained": (6, parseChainedSubst), - "reversechained": (8, parseReverseChainedSubst), - }, - "GPOS": { - "single": (1, parseSinglePos), - "pair": (2, parsePair), - "kernset": (2, parseKernset), - "cursive": (3, parseCursive), - "mark to base": (4, parseMarkToBase), - "mark to ligature": (5, parseMarkToLigature), - "mark to mark": (6, parseMarkToMark), - "context": (7, parseContextPos), - "chained": (8, parseChainedPos), - }, - }[tableTag][typ] - - with lines.until("lookup end"): - subtables = [] - - while lines.peek(): - with lines.until(("% subtable", "subtable end")): - while lines.peek(): - subtable = parseLookupSubTable(lines, font, lookupMap) - assert lookup.LookupType == subtable.LookupType - subtables.append(subtable) - if lines.peeks()[0] in ("% subtable", "subtable end"): - next(lines) - lines.expect("lookup end") - - lookup.SubTable = subtables - lookup.SubTableCount = len(lookup.SubTable) - if lookup.SubTableCount == 0: - # Remove this return when following is fixed: - # https://github.com/fonttools/fonttools/issues/789 - return None - return lookup - - -def parseGSUBGPOS(lines, font, tableTag): - container = ttLib.getTableClass(tableTag)() - lookupMap = DeferredMapping() - featureMap = DeferredMapping() - assert tableTag in ("GSUB", "GPOS") - log.debug("Parsing %s", tableTag) - self = getattr(ot, tableTag)() - self.Version = 0x00010000 - fields = { - "script table begin": ( - "ScriptList", - lambda lines: parseScriptList(lines, featureMap), - ), - "feature table begin": ( - "FeatureList", - lambda lines: parseFeatureList(lines, lookupMap, featureMap), - ), - "lookup": ("LookupList", None), - } - for attr, parser in fields.values(): - setattr(self, attr, None) - while lines.peek() is not 
None: - typ = lines.peek()[0].lower() - if typ not in fields: - log.debug("Skipping %s", lines.peek()) - next(lines) - continue - attr, parser = fields[typ] - if typ == "lookup": - if self.LookupList is None: - self.LookupList = ot.LookupList() - self.LookupList.Lookup = [] - _, name, _ = lines.peek() - lookup = parseLookup(lines, tableTag, font, lookupMap) - if lookupMap is not None: - assert name not in lookupMap, "Duplicate lookup name: %s" % name - lookupMap[name] = len(self.LookupList.Lookup) - else: - assert int(name) == len(self.LookupList.Lookup), "%d %d" % ( - name, - len(self.LookupList.Lookup), - ) - self.LookupList.Lookup.append(lookup) - else: - assert getattr(self, attr) is None, attr - setattr(self, attr, parser(lines)) - if self.LookupList: - self.LookupList.LookupCount = len(self.LookupList.Lookup) - if lookupMap is not None: - lookupMap.applyDeferredMappings() - if os.environ.get(LOOKUP_DEBUG_ENV_VAR): - if "Debg" not in font: - font["Debg"] = newTable("Debg") - font["Debg"].data = {} - debug = ( - font["Debg"] - .data.setdefault(LOOKUP_DEBUG_INFO_KEY, {}) - .setdefault(tableTag, {}) - ) - for name, lookup in lookupMap.items(): - debug[str(lookup)] = ["", name, ""] - - featureMap.applyDeferredMappings() - container.table = self - return container - - -def parseGSUB(lines, font): - return parseGSUBGPOS(lines, font, "GSUB") - - -def parseGPOS(lines, font): - return parseGSUBGPOS(lines, font, "GPOS") - - -def parseAttachList(lines, font): - points = {} - with lines.between("attachment list"): - for line in lines: - glyph = makeGlyph(line[0]) - assert glyph not in points, glyph - points[glyph] = [int(i) for i in line[1:]] - return otl.buildAttachList(points, font.getReverseGlyphMap()) - - -def parseCaretList(lines, font): - carets = {} - with lines.between("carets"): - for line in lines: - glyph = makeGlyph(line[0]) - assert glyph not in carets, glyph - num = int(line[1]) - thisCarets = [int(i) for i in line[2:]] - assert num == len(thisCarets), line - 
carets[glyph] = thisCarets - return otl.buildLigCaretList(carets, {}, font.getReverseGlyphMap()) - - -def makeMarkFilteringSets(sets, font): - self = ot.MarkGlyphSetsDef() - self.MarkSetTableFormat = 1 - self.MarkSetCount = 1 + max(sets.keys()) - self.Coverage = [None] * self.MarkSetCount - for k, v in sorted(sets.items()): - self.Coverage[k] = makeCoverage(set(v), font) - return self - - -def parseMarkFilteringSets(lines, font): - sets = {} - with lines.between("set definition"): - for line in lines: - assert len(line) == 2, line - glyph = makeGlyph(line[0]) - # TODO accept set names - st = int(line[1]) - if st not in sets: - sets[st] = [] - sets[st].append(glyph) - return makeMarkFilteringSets(sets, font) - - -def parseGDEF(lines, font): - container = ttLib.getTableClass("GDEF")() - log.debug("Parsing GDEF") - self = ot.GDEF() - fields = { - "class definition begin": ( - "GlyphClassDef", - lambda lines, font: parseClassDef(lines, font, klass=ot.GlyphClassDef), - ), - "attachment list begin": ("AttachList", parseAttachList), - "carets begin": ("LigCaretList", parseCaretList), - "mark attachment class definition begin": ( - "MarkAttachClassDef", - lambda lines, font: parseClassDef(lines, font, klass=ot.MarkAttachClassDef), - ), - "markfilter set definition begin": ("MarkGlyphSetsDef", parseMarkFilteringSets), - } - for attr, parser in fields.values(): - setattr(self, attr, None) - while lines.peek() is not None: - typ = lines.peek()[0].lower() - if typ not in fields: - log.debug("Skipping %s", typ) - next(lines) - continue - attr, parser = fields[typ] - assert getattr(self, attr) is None, attr - setattr(self, attr, parser(lines, font)) - self.Version = 0x00010000 if self.MarkGlyphSetsDef is None else 0x00010002 - container.table = self - return container - - -def parseCmap(lines, font): - container = ttLib.getTableClass("cmap")() - log.debug("Parsing cmap") - tables = [] - while lines.peek() is not None: - lines.expect("cmap subtable %d" % len(tables)) - platId, 
encId, fmt, lang = [ - parseCmapId(lines, field) - for field in ("platformID", "encodingID", "format", "language") - ] - table = cmap_classes[fmt](fmt) - table.platformID = platId - table.platEncID = encId - table.language = lang - table.cmap = {} - line = next(lines) - while line[0] != "end subtable": - table.cmap[int(line[0], 16)] = line[1] - line = next(lines) - tables.append(table) - container.tableVersion = 0 - container.tables = tables - return container - - -def parseCmapId(lines, field): - line = next(lines) - assert field == line[0] - return int(line[1]) - - -def parseTable(lines, font, tableTag=None): - log.debug("Parsing table") - line = lines.peeks() - tag = None - if line[0].split()[0] == "FontDame": - tag = line[0].split()[1] - elif " ".join(line[0].split()[:3]) == "Font Chef Table": - tag = line[0].split()[3] - if tag is not None: - next(lines) - tag = tag.ljust(4) - if tableTag is None: - tableTag = tag - else: - assert tableTag == tag, (tableTag, tag) - - assert ( - tableTag is not None - ), "Don't know what table to parse and data doesn't specify" - - return { - "GSUB": parseGSUB, - "GPOS": parseGPOS, - "GDEF": parseGDEF, - "cmap": parseCmap, - }[tableTag](lines, font) - - -class Tokenizer(object): - def __init__(self, f): - # TODO BytesIO / StringIO as needed? 
also, figure out whether we work on bytes or unicode - lines = iter(f) - try: - self.filename = f.name - except: - self.filename = None - self.lines = iter(lines) - self.line = "" - self.lineno = 0 - self.stoppers = [] - self.buffer = None - - def __iter__(self): - return self - - def _next_line(self): - self.lineno += 1 - line = self.line = next(self.lines) - line = [s.strip() for s in line.split("\t")] - if len(line) == 1 and not line[0]: - del line[0] - if line and not line[-1]: - log.warning("trailing tab found on line %d: %s" % (self.lineno, self.line)) - while line and not line[-1]: - del line[-1] - return line - - def _next_nonempty(self): - while True: - line = self._next_line() - # Skip comments and empty lines - if line and line[0] and (line[0][0] != "%" or line[0] == "% subtable"): - return line - - def _next_buffered(self): - if self.buffer: - ret = self.buffer - self.buffer = None - return ret - else: - return self._next_nonempty() - - def __next__(self): - line = self._next_buffered() - if line[0].lower() in self.stoppers: - self.buffer = line - raise StopIteration - return line - - def next(self): - return self.__next__() - - def peek(self): - if not self.buffer: - try: - self.buffer = self._next_nonempty() - except StopIteration: - return None - if self.buffer[0].lower() in self.stoppers: - return None - return self.buffer - - def peeks(self): - ret = self.peek() - return ret if ret is not None else ("",) - - @contextmanager - def between(self, tag): - start = tag + " begin" - end = tag + " end" - self.expectendswith(start) - self.stoppers.append(end) - yield - del self.stoppers[-1] - self.expect(tag + " end") - - @contextmanager - def until(self, tags): - if type(tags) is not tuple: - tags = (tags,) - self.stoppers.extend(tags) - yield - del self.stoppers[-len(tags) :] - - def expect(self, s): - line = next(self) - tag = line[0].lower() - assert tag == s, "Expected '%s', got '%s'" % (s, tag) - return line - - def expectendswith(self, s): - line = 
next(self) - tag = line[0].lower() - assert tag.endswith(s), "Expected '*%s', got '%s'" % (s, tag) - return line - - -def build(f, font, tableTag=None): - """Convert a Monotype font layout file to an OpenType layout object - - A font object must be passed, but this may be a "dummy" font; it is only - used for sorting glyph sets when making coverage tables and to hold the - OpenType layout table while it is being built. - - Args: - f: A file object. - font (TTFont): A font object. - tableTag (string): If provided, asserts that the file contains data for the - given OpenType table. - - Returns: - An object representing the table. (e.g. ``table_G_S_U_B_``) - """ - lines = Tokenizer(f) - return parseTable(lines, font, tableTag=tableTag) - - -def main(args=None, font=None): - """Convert a FontDame OTL file to TTX XML - - Writes XML output to stdout. - - Args: - args: Command line arguments (``--font``, ``--table``, input files). - """ - import sys - from fontTools import configLogger - from fontTools.misc.testTools import MockFont - - if args is None: - args = sys.argv[1:] - - # configure the library logger (for >= WARNING) - configLogger() - # comment this out to enable debug messages from mtiLib's logger - # log.setLevel(logging.DEBUG) - - import argparse - - parser = argparse.ArgumentParser( - "fonttools mtiLib", - description=main.__doc__, - ) - - parser.add_argument( - "--font", - "-f", - metavar="FILE", - dest="font", - help="Input TTF files (used for glyph classes and sorting coverage tables)", - ) - parser.add_argument( - "--table", - "-t", - metavar="TABLE", - dest="tableTag", - help="Table to fill (sniffed from input file if not provided)", - ) - parser.add_argument( - "inputs", metavar="FILE", type=str, nargs="+", help="Input FontDame .txt files" - ) - - args = parser.parse_args(args) - - if font is None: - if args.font: - font = ttLib.TTFont(args.font) - else: - font = MockFont() - - for f in args.inputs: - log.debug("Processing %s", f) - with open(f, "rt", 
encoding="utf-8") as f: - table = build(f, font, tableTag=args.tableTag) - blob = table.compile(font) # Make sure it compiles - decompiled = table.__class__() - decompiled.decompile(blob, font) # Make sure it decompiles! - - # continue - from fontTools.misc import xmlWriter - - tag = table.tableTag - writer = xmlWriter.XMLWriter(sys.stdout) - writer.begintag(tag) - writer.newline() - # table.toXML(writer, font) - decompiled.toXML(writer, font) - writer.endtag(tag) - writer.newline() - - -if __name__ == "__main__": - import sys - - sys.exit(main()) diff --git a/spaces/cihyFjudo/fairness-paper-search/Kon Boot Install Exe Download How to Use Kon-Boot to Hack Windows Passwords.md b/spaces/cihyFjudo/fairness-paper-search/Kon Boot Install Exe Download How to Use Kon-Boot to Hack Windows Passwords.md deleted file mode 100644 index b90700cf10b742ec07c1c7e59486aed3ad975a22..0000000000000000000000000000000000000000 --- a/spaces/cihyFjudo/fairness-paper-search/Kon Boot Install Exe Download How to Use Kon-Boot to Hack Windows Passwords.md +++ /dev/null @@ -1,21 +0,0 @@ -
-

Kon Boot is a free tool that comes into play when other password-reset tools fail to deliver. It enables you to log in to any password-protected system without knowing the password. What actually happens is that it changes the contents of the Windows kernel during booting; all of this is done virtually, and no physical changes are made to the system. Note that if you want to recover passwords for a specific application, such as a WinRAR file, then download a WinRAR password recovery tool instead.

-

The application is very easy to use. All you need to do is download the free ISO file and burn it to a CD. Boot from the disc and you will be back into Windows in no time. The ISO image file is also very small compared to other tools that provide the same service.

-
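Because images like this typically arrive via third-party mirrors or torrents, it can be worth sanity-checking that the download really is an ISO 9660 image before burning it. ISO 9660 uses 2048-byte sectors; the first volume descriptor sits at sector 16, and bytes 1-5 of that descriptor are the literal ASCII string "CD001". A minimal check in Python (the filename below is only an example, not part of any actual Kon Boot release):

```python
def looks_like_iso9660(path):
    """Return True if the file carries the ISO 9660 magic string.

    ISO 9660 images use 2048-byte sectors; the first volume
    descriptor is at sector 16, and bytes 1-5 of that descriptor
    are the ASCII string "CD001".
    """
    with open(path, "rb") as f:
        f.seek(16 * 2048 + 1)  # skip the 1-byte descriptor-type field
        return f.read(5) == b"CD001"

# Example usage: probe a downloaded image before burning it.
# looks_like_iso9660("kon-boot.iso")
```

This only checks the magic bytes, so it will not catch a corrupted download; comparing a published checksum (when one is available) is still the more reliable test.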

kon boot install exe download


Download Zip > https://tinurli.com/2uwjHW



-

After reading this article, you will have a full picture of Kon Boot for Windows 10/8/7 and of the Kon Boot download. If Kon Boot is not working properly, you can pick the best Kon Boot alternative from here to recover your computer login password. So without worrying too much, just follow the guide and choose Windows Password Recovery Tool to solve your issue.

-

Easy2Boot (E2B) is a popular multiboot USB solution that also contains agFM and Ventoy. It supports both Legacy and UEFI booting.
Simply copy your bootable ISO files to the E2B USB drive and boot! Boot to DOS, Linux, Windows install ISOs (XP through Win11),
automate Windows installs, and boot WIM files, VHD files, images of flash drives, Linux ISOs with persistence, and more.

-

The instructions below show how to remove Kon Boot for Windows 2.5.0.rar__14714_il88101.exe with help from the FreeFixer removal tool. Basically, you install FreeFixer, scan your computer, check the Kon Boot for Windows 2.5.0.rar__14714_il88101.exe file for removal, restart your computer and scan again to verify that Kon Boot for Windows 2.5.0.rar__14714_il88101.exe has been successfully removed. Here are the removal instructions in more detail:

-

kon boot for windows 2.5.0.rar__14714_il88101.exe - Application Error. The instruction at "0xXXXXXXXX" referenced memory at "0xXXXXXXXX". The memory could not be "read/written". Click on OK to terminate the program.

-

-

Please share with other users what you think about this file. What does this file do? Is it legitimate, or something your computer is better off without? Do you know how it was installed on your system? Did you install it yourself, or did it come bundled with some other software? Is it running smoothly, or do you get an error message? Any information that helps to document this file is welcome. Thank you for your contributions.

-

XP, Windows 7, Windows 8 and early versions of Windows 10 have a problem with USB Flash (thumb) drives: they will only access one partition. Please use Windows 10/11 to create an E2B+agFM UEFI-bootable USB drive. If your USB drive is a USB Fixed Disk (e.g. USB HDD), then you can use any version of Windows.

-

You can download E2B, the latest Betas with bugfixes and enhancements, and extra E2B themes, .mnu files, animations, wallpapers, extra grub4dos scripts, archived files and lots of other material from one of the two Alternate Download Sites.

-

Most eBooks are over 100 pages long and contain original content and step-by-step exercises suitable for both beginners and more experienced users.
Customer reviews are located at the bottom of each eBook product page, and multi-buy discounts are available when you buy more than one eBook. Please also visit RMPrepUSB.com and the E2B Forum.
Subscribe to my blog for the latest news, tips, USB boot articles and free eBook updates.

-

Kon-Boot (aka konboot, kon boot) is a software utility that allows users to bypass Microsoft Windows passwords and Apple macOS passwords (Linux support has been deprecated) without lasting or persistent changes to the system on which it is executed. It is also the first reported tool capable of bypassing Windows 10 online (live) passwords and supporting both Windows and macOS systems.[1] It is a widely used tool in computer security, especially in penetration testing.[2][3][4] Since version 3.5, Kon-Boot is also able to bypass the Secure Boot feature.[5]

-

Kon-Boot works like a bootkit[13][14] (and thus often triggers false-positive[15][16][17] alerts in antivirus software). It injects (hides) itself into BIOS memory and modifies the kernel code on the fly (at runtime), temporarily changing the code responsible for verifying the user's authorization data while the operating system loads.

-
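The "patch in memory, nothing on disk" idea described above is loosely analogous to runtime monkey-patching: a verification routine is replaced only in the running process, and the original behaviour returns once the patch is undone (for Kon-Boot, on reboot). A toy Python illustration of that concept only; the `Login` class here is invented for the analogy and has nothing to do with Kon-Boot's actual internals:

```python
class Login:
    def check_password(self, password):
        # Stand-in for the OS's real credential check.
        return password == "secret"

login = Login()
assert login.check_password("guess") is False  # normal check rejects a wrong password

# "Patch" the check in memory only; nothing on disk changes.
original = Login.check_password
Login.check_password = lambda self, password: True
assert login.check_password("guess") is True   # any password is now accepted

# Undo the patch -- the analogue of rebooting.
Login.check_password = original
assert login.check_password("guess") is False  # original behaviour is restored
```

The point of the analogy is persistence: because only the in-memory copy of the code is altered, the change vanishes as soon as the patched process (or, for Kon-Boot, the machine) restarts.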

In contrast to password reset tools like CHNTPW (the Offline NT Password Editor), Kon-Boot does not modify system files or the SAM hive;[18] all changes are temporary and disappear after the system reboots.

-

Step 1: Restart the computer and press F8 to open the advanced startup options. Alternatively, the advanced startup options should launch automatically after three interrupted Windows boots.

aaccfb2cb3
-
-
\ No newline at end of file diff --git a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/ffmpeg/stream.py b/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/ffmpeg/stream.py deleted file mode 100644 index 053671ce57c7762e65ca0f73119066847669129f..0000000000000000000000000000000000000000 --- a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/ffmpeg/stream.py +++ /dev/null @@ -1,215 +0,0 @@ -#!/usr/bin/python2.7 -# coding=utf-8 - -import os - -import json - -import subprocess - - -class Stream(object): - def __init__(self): - self.cmd = "" - self.out_file = "" - self.vcode_type = "" - self.input_file = "" - self.word_list_str = "" - self.subbtitle_file = "" - - self.cmd = [] - self.img_file = [] - self.word_list = [] - self.img_dynamic_list = [] - - self.image_list = {} - self.dynamic_list = {} - - # 输入文件 - def input(self, file): - self.input_file = file - - # 添加图片 - def img(self, img, x="0", y="0", str_time="0", end_time="0"): - if img == "": - return False - input_info = self.video_info() - - if end_time == "0": - end_time = float(input_info["format"]["duration"]) + 10.0 - - img_data = { - "img": img, - "x": str(x), - "y": str(y), - "str_time": str(str_time), - "end_time": str(end_time) - } - - self.img_file.append(img_data) - - img_input = [] - img_overlay = [] - - for val in self.img_file: - img_input.append(" -i %s" % val["img"]) - img_overlay.append(" overlay=x=%s:y=%s:enable='if(gt(t,%s),lt(t,%s))" % ( - val["x"], - val["y"], - val["str_time"], - val["end_time"] - ) - ) - - img_input_str = " ".join(img_input) - img_overlay_str = ",".join(img_overlay) - - self.image_list = { - "input": img_input_str, - "overlay": img_overlay_str - } - - # 添加动态图片 gif apng 等 - def img_dynamic(self, file, x="0", y="0", str_time="0", end_time="0"): - input_info = self.video_info() - if file == "": - return False - if end_time == "": - end_time = float(input_info["format"]["duration"]) + 10.0 - - apng = { - "input": " 
-ignore_loop 0 -i %s" % file, - "x": str(x), - "y": str(y), - "str_time": str(str_time), - "end_time": str(end_time) - } - self.img_dynamic_list.append(apng) - - img_dy_input = [] - img_dy_overlay = [] - for val in self.img_dynamic_list: - img_dy_input.append(val["input"]) - img_dy_overlay.append(" overlay=x=%s:y=%s:shortest=1:enable='if(gt(t,%s), lt(t,%s))'" % ( - val["x"], - val["y"], - val["str_time"], - val["end_time"] - ) - ) - img_dy_input_str = " ".join(img_dy_input) - img_dy_overlay_str = ",".join(img_dy_overlay) - - self.dynamic_list = { - "input": img_dy_input_str, - "overlay": img_dy_overlay_str - } - - # 添加文字水印 - def word_water_mark(self, c, x="0", y="0", str_time="0", end_time="0", font="", color="white"): - if font == "": - return False - input_info = self.video_info() - if c == "": - return False - if end_time == "0": - end_time = float(input_info["format"]["duration"]) + 10.0 - - text = " drawtext=text='%s':x=%s:y=%s:enable='if(gt(t,%s),lt(t,%s))':fontfile=%s:" \ - "fontcolor=%s" % (c, str(x), str(y), str(str_time), str(end_time), str(font), str(color)) - self.word_list.append(text) - - self.word_list_str = ",".join(self.word_list) - - # 添加字幕文件 subtitles=txt.srt - def subbtitle(self, file): - self.subbtitle_file = " subtitles=%s" % file - - # 编码方式 -vcodec - def vcode(self, code): - if code == "": - return False - self.vcode_type = " -vcodec %s" % code - - # 输出文件 - def out(self, file): - if file == "": - return False - self.out_file = "%s" % file - - # 执行脚本 - def run(self): - if self.input_file == "": - return False - im = "ffmpeg -i %s" % self.input_file - ov = "" - - if len(self.dynamic_list) > 0 and self.dynamic_list["input"] != "": - im = "%s %s" % (im, self.dynamic_list["input"]) - - if ov != "": - ov = "%s,%s" % (ov, self.dynamic_list["overlay"]) - else: - ov = self.dynamic_list["overlay"] - if len(self.image_list) > 0: - im = "%s %s" % (im, self.image_list["input"]) - - if ov != "": - ov = "%s,%s" % (ov, self.dynamic_list["overlay"]) - else: - 
ov = self.dynamic_list["overlay"] - - # 文字水印 - if self.word_list_str != "": - if ov != "": - ov = "%s,%s" % (ov, self.word_list_str) - else: - ov = self.word_list_str - - # 字幕 - if self.subbtitle_file != "": - if ov != "": - ov = "%s,%s" % (ov, self.subbtitle_file) - else: - ov = self.subbtitle_file - - if self.vcode_type != "": - self.cmd = "%s -filter_complex \"%s\" -y %s %s" % (im, ov, self.vcode_type, self.out_file) - else: - self.cmd = "%s -filter_complex \"%s\" -y %s" % (im, ov, self.out_file) - - self.do() - - # 获取视频的相关时长信息 - def video_info(self): - result = {} - if os.path.isfile(self.input_file) is False: - return result - - cmd = ['ffprobe', '-v', 'quiet', '-print_format', 'json', '-show_format', '-show_streams', self.input_file] - returned_data = subprocess.check_output(cmd) - return json.loads(returned_data.decode('utf-8')) - - # 执行命令 - def do(self): - if self.cmd == "": - return False - res = subprocess.call(self.cmd, shell=True) - if res != 0: - return False - return True - - -if __name__ == '__main__': - stream = Stream() - stream.input("face.mp4") - stream.img("t1.png") - stream.img("t2.png", "10", y=10, str_time=5, end_time=10) - stream.img_dynamic("t1.apng", x=10, y=10, str_time=5, end_time=10) - stream.img_dynamic("t2.apng", x=10, y=10, str_time=5, end_time=9) - stream.word_water_mark("测试文字水印1", x="10", y="10", str_time="0", end_time="20", font="ttf.ttf", color="white") - stream.word_water_mark("测试文字水印2", x="10", y="10", str_time="0", end_time="20", font="ttf.ttf", color="white") - stream.subbtitle("srt.srt") - stream.out("out.mp4") - stream.run() - diff --git a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/cu2qu/cu2qu.c b/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/cu2qu/cu2qu.c deleted file mode 100644 index 1971c94d65e1173103f26f8d4532000be32001aa..0000000000000000000000000000000000000000 --- 
a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/cu2qu/cu2qu.c +++ /dev/null @@ -1,14095 +0,0 @@ -/* Generated by Cython 3.0.0 */ - -/* BEGIN: Cython Metadata -{ - "distutils": { - "define_macros": [ - [ - "CYTHON_TRACE_NOGIL", - "1" - ] - ], - "name": "fontTools.cu2qu.cu2qu", - "sources": [ - "Lib/fontTools/cu2qu/cu2qu.py" - ] - }, - "module_name": "fontTools.cu2qu.cu2qu" -} -END: Cython Metadata */ - -#ifndef PY_SSIZE_T_CLEAN -#define PY_SSIZE_T_CLEAN -#endif /* PY_SSIZE_T_CLEAN */ -#if defined(CYTHON_LIMITED_API) && 0 - #ifndef Py_LIMITED_API - #if CYTHON_LIMITED_API+0 > 0x03030000 - #define Py_LIMITED_API CYTHON_LIMITED_API - #else - #define Py_LIMITED_API 0x03030000 - #endif - #endif -#endif - -#include "Python.h" -#ifndef Py_PYTHON_H - #error Python headers needed to compile C extensions, please install development version of Python. -#elif PY_VERSION_HEX < 0x02070000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) - #error Cython requires Python 2.7+ or Python 3.3+. -#else -#define CYTHON_ABI "3_0_0" -#define __PYX_ABI_MODULE_NAME "_cython_" CYTHON_ABI -#define __PYX_TYPE_MODULE_PREFIX __PYX_ABI_MODULE_NAME "." -#define CYTHON_HEX_VERSION 0x030000F0 -#define CYTHON_FUTURE_DIVISION 1 -#include -#ifndef offsetof - #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) -#endif -#if !defined(_WIN32) && !defined(WIN32) && !defined(MS_WINDOWS) - #ifndef __stdcall - #define __stdcall - #endif - #ifndef __cdecl - #define __cdecl - #endif - #ifndef __fastcall - #define __fastcall - #endif -#endif -#ifndef DL_IMPORT - #define DL_IMPORT(t) t -#endif -#ifndef DL_EXPORT - #define DL_EXPORT(t) t -#endif -#define __PYX_COMMA , -#ifndef HAVE_LONG_LONG - #define HAVE_LONG_LONG -#endif -#ifndef PY_LONG_LONG - #define PY_LONG_LONG LONG_LONG -#endif -#ifndef Py_HUGE_VAL - #define Py_HUGE_VAL HUGE_VAL -#endif -#if defined(GRAALVM_PYTHON) - /* For very preliminary testing purposes. 
Most variables are set the same as PyPy. - The existence of this section does not imply that anything works or is even tested */ - #define CYTHON_COMPILING_IN_PYPY 0 - #define CYTHON_COMPILING_IN_CPYTHON 0 - #define CYTHON_COMPILING_IN_LIMITED_API 0 - #define CYTHON_COMPILING_IN_GRAAL 1 - #define CYTHON_COMPILING_IN_NOGIL 0 - #undef CYTHON_USE_TYPE_SLOTS - #define CYTHON_USE_TYPE_SLOTS 0 - #undef CYTHON_USE_TYPE_SPECS - #define CYTHON_USE_TYPE_SPECS 0 - #undef CYTHON_USE_PYTYPE_LOOKUP - #define CYTHON_USE_PYTYPE_LOOKUP 0 - #if PY_VERSION_HEX < 0x03050000 - #undef CYTHON_USE_ASYNC_SLOTS - #define CYTHON_USE_ASYNC_SLOTS 0 - #elif !defined(CYTHON_USE_ASYNC_SLOTS) - #define CYTHON_USE_ASYNC_SLOTS 1 - #endif - #undef CYTHON_USE_PYLIST_INTERNALS - #define CYTHON_USE_PYLIST_INTERNALS 0 - #undef CYTHON_USE_UNICODE_INTERNALS - #define CYTHON_USE_UNICODE_INTERNALS 0 - #undef CYTHON_USE_UNICODE_WRITER - #define CYTHON_USE_UNICODE_WRITER 0 - #undef CYTHON_USE_PYLONG_INTERNALS - #define CYTHON_USE_PYLONG_INTERNALS 0 - #undef CYTHON_AVOID_BORROWED_REFS - #define CYTHON_AVOID_BORROWED_REFS 1 - #undef CYTHON_ASSUME_SAFE_MACROS - #define CYTHON_ASSUME_SAFE_MACROS 0 - #undef CYTHON_UNPACK_METHODS - #define CYTHON_UNPACK_METHODS 0 - #undef CYTHON_FAST_THREAD_STATE - #define CYTHON_FAST_THREAD_STATE 0 - #undef CYTHON_FAST_GIL - #define CYTHON_FAST_GIL 0 - #undef CYTHON_METH_FASTCALL - #define CYTHON_METH_FASTCALL 0 - #undef CYTHON_FAST_PYCALL - #define CYTHON_FAST_PYCALL 0 - #ifndef CYTHON_PEP487_INIT_SUBCLASS - #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3) - #endif - #undef CYTHON_PEP489_MULTI_PHASE_INIT - #define CYTHON_PEP489_MULTI_PHASE_INIT 1 - #undef CYTHON_USE_MODULE_STATE - #define CYTHON_USE_MODULE_STATE 0 - #undef CYTHON_USE_TP_FINALIZE - #define CYTHON_USE_TP_FINALIZE 0 - #undef CYTHON_USE_DICT_VERSIONS - #define CYTHON_USE_DICT_VERSIONS 0 - #undef CYTHON_USE_EXC_INFO_STACK - #define CYTHON_USE_EXC_INFO_STACK 0 - #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC - #define 
CYTHON_UPDATE_DESCRIPTOR_DOC 0 - #endif -#elif defined(PYPY_VERSION) - #define CYTHON_COMPILING_IN_PYPY 1 - #define CYTHON_COMPILING_IN_CPYTHON 0 - #define CYTHON_COMPILING_IN_LIMITED_API 0 - #define CYTHON_COMPILING_IN_GRAAL 0 - #define CYTHON_COMPILING_IN_NOGIL 0 - #undef CYTHON_USE_TYPE_SLOTS - #define CYTHON_USE_TYPE_SLOTS 0 - #undef CYTHON_USE_TYPE_SPECS - #define CYTHON_USE_TYPE_SPECS 0 - #undef CYTHON_USE_PYTYPE_LOOKUP - #define CYTHON_USE_PYTYPE_LOOKUP 0 - #if PY_VERSION_HEX < 0x03050000 - #undef CYTHON_USE_ASYNC_SLOTS - #define CYTHON_USE_ASYNC_SLOTS 0 - #elif !defined(CYTHON_USE_ASYNC_SLOTS) - #define CYTHON_USE_ASYNC_SLOTS 1 - #endif - #undef CYTHON_USE_PYLIST_INTERNALS - #define CYTHON_USE_PYLIST_INTERNALS 0 - #undef CYTHON_USE_UNICODE_INTERNALS - #define CYTHON_USE_UNICODE_INTERNALS 0 - #undef CYTHON_USE_UNICODE_WRITER - #define CYTHON_USE_UNICODE_WRITER 0 - #undef CYTHON_USE_PYLONG_INTERNALS - #define CYTHON_USE_PYLONG_INTERNALS 0 - #undef CYTHON_AVOID_BORROWED_REFS - #define CYTHON_AVOID_BORROWED_REFS 1 - #undef CYTHON_ASSUME_SAFE_MACROS - #define CYTHON_ASSUME_SAFE_MACROS 0 - #undef CYTHON_UNPACK_METHODS - #define CYTHON_UNPACK_METHODS 0 - #undef CYTHON_FAST_THREAD_STATE - #define CYTHON_FAST_THREAD_STATE 0 - #undef CYTHON_FAST_GIL - #define CYTHON_FAST_GIL 0 - #undef CYTHON_METH_FASTCALL - #define CYTHON_METH_FASTCALL 0 - #undef CYTHON_FAST_PYCALL - #define CYTHON_FAST_PYCALL 0 - #ifndef CYTHON_PEP487_INIT_SUBCLASS - #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3) - #endif - #if PY_VERSION_HEX < 0x03090000 - #undef CYTHON_PEP489_MULTI_PHASE_INIT - #define CYTHON_PEP489_MULTI_PHASE_INIT 0 - #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT) - #define CYTHON_PEP489_MULTI_PHASE_INIT 1 - #endif - #undef CYTHON_USE_MODULE_STATE - #define CYTHON_USE_MODULE_STATE 0 - #undef CYTHON_USE_TP_FINALIZE - #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1 && PYPY_VERSION_NUM >= 0x07030C00) - #undef CYTHON_USE_DICT_VERSIONS - #define 
CYTHON_USE_DICT_VERSIONS 0 - #undef CYTHON_USE_EXC_INFO_STACK - #define CYTHON_USE_EXC_INFO_STACK 0 - #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC - #define CYTHON_UPDATE_DESCRIPTOR_DOC 0 - #endif -#elif defined(CYTHON_LIMITED_API) - #define CYTHON_COMPILING_IN_PYPY 0 - #define CYTHON_COMPILING_IN_CPYTHON 0 - #define CYTHON_COMPILING_IN_LIMITED_API 1 - #define CYTHON_COMPILING_IN_GRAAL 0 - #define CYTHON_COMPILING_IN_NOGIL 0 - #undef CYTHON_CLINE_IN_TRACEBACK - #define CYTHON_CLINE_IN_TRACEBACK 0 - #undef CYTHON_USE_TYPE_SLOTS - #define CYTHON_USE_TYPE_SLOTS 0 - #undef CYTHON_USE_TYPE_SPECS - #define CYTHON_USE_TYPE_SPECS 1 - #undef CYTHON_USE_PYTYPE_LOOKUP - #define CYTHON_USE_PYTYPE_LOOKUP 0 - #undef CYTHON_USE_ASYNC_SLOTS - #define CYTHON_USE_ASYNC_SLOTS 0 - #undef CYTHON_USE_PYLIST_INTERNALS - #define CYTHON_USE_PYLIST_INTERNALS 0 - #undef CYTHON_USE_UNICODE_INTERNALS - #define CYTHON_USE_UNICODE_INTERNALS 0 - #ifndef CYTHON_USE_UNICODE_WRITER - #define CYTHON_USE_UNICODE_WRITER 0 - #endif - #undef CYTHON_USE_PYLONG_INTERNALS - #define CYTHON_USE_PYLONG_INTERNALS 0 - #ifndef CYTHON_AVOID_BORROWED_REFS - #define CYTHON_AVOID_BORROWED_REFS 0 - #endif - #undef CYTHON_ASSUME_SAFE_MACROS - #define CYTHON_ASSUME_SAFE_MACROS 0 - #undef CYTHON_UNPACK_METHODS - #define CYTHON_UNPACK_METHODS 0 - #undef CYTHON_FAST_THREAD_STATE - #define CYTHON_FAST_THREAD_STATE 0 - #undef CYTHON_FAST_GIL - #define CYTHON_FAST_GIL 0 - #undef CYTHON_METH_FASTCALL - #define CYTHON_METH_FASTCALL 0 - #undef CYTHON_FAST_PYCALL - #define CYTHON_FAST_PYCALL 0 - #ifndef CYTHON_PEP487_INIT_SUBCLASS - #define CYTHON_PEP487_INIT_SUBCLASS 1 - #endif - #undef CYTHON_PEP489_MULTI_PHASE_INIT - #define CYTHON_PEP489_MULTI_PHASE_INIT 0 - #undef CYTHON_USE_MODULE_STATE - #define CYTHON_USE_MODULE_STATE 1 - #ifndef CYTHON_USE_TP_FINALIZE - #define CYTHON_USE_TP_FINALIZE 1 - #endif - #undef CYTHON_USE_DICT_VERSIONS - #define CYTHON_USE_DICT_VERSIONS 0 - #undef CYTHON_USE_EXC_INFO_STACK - #define 
CYTHON_USE_EXC_INFO_STACK 0 - #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC - #define CYTHON_UPDATE_DESCRIPTOR_DOC 0 - #endif -#elif defined(PY_NOGIL) - #define CYTHON_COMPILING_IN_PYPY 0 - #define CYTHON_COMPILING_IN_CPYTHON 0 - #define CYTHON_COMPILING_IN_LIMITED_API 0 - #define CYTHON_COMPILING_IN_GRAAL 0 - #define CYTHON_COMPILING_IN_NOGIL 1 - #ifndef CYTHON_USE_TYPE_SLOTS - #define CYTHON_USE_TYPE_SLOTS 1 - #endif - #undef CYTHON_USE_PYTYPE_LOOKUP - #define CYTHON_USE_PYTYPE_LOOKUP 0 - #ifndef CYTHON_USE_ASYNC_SLOTS - #define CYTHON_USE_ASYNC_SLOTS 1 - #endif - #undef CYTHON_USE_PYLIST_INTERNALS - #define CYTHON_USE_PYLIST_INTERNALS 0 - #ifndef CYTHON_USE_UNICODE_INTERNALS - #define CYTHON_USE_UNICODE_INTERNALS 1 - #endif - #undef CYTHON_USE_UNICODE_WRITER - #define CYTHON_USE_UNICODE_WRITER 0 - #undef CYTHON_USE_PYLONG_INTERNALS - #define CYTHON_USE_PYLONG_INTERNALS 0 - #ifndef CYTHON_AVOID_BORROWED_REFS - #define CYTHON_AVOID_BORROWED_REFS 0 - #endif - #ifndef CYTHON_ASSUME_SAFE_MACROS - #define CYTHON_ASSUME_SAFE_MACROS 1 - #endif - #ifndef CYTHON_UNPACK_METHODS - #define CYTHON_UNPACK_METHODS 1 - #endif - #undef CYTHON_FAST_THREAD_STATE - #define CYTHON_FAST_THREAD_STATE 0 - #undef CYTHON_FAST_PYCALL - #define CYTHON_FAST_PYCALL 0 - #ifndef CYTHON_PEP489_MULTI_PHASE_INIT - #define CYTHON_PEP489_MULTI_PHASE_INIT 1 - #endif - #ifndef CYTHON_USE_TP_FINALIZE - #define CYTHON_USE_TP_FINALIZE 1 - #endif - #undef CYTHON_USE_DICT_VERSIONS - #define CYTHON_USE_DICT_VERSIONS 0 - #undef CYTHON_USE_EXC_INFO_STACK - #define CYTHON_USE_EXC_INFO_STACK 0 -#else - #define CYTHON_COMPILING_IN_PYPY 0 - #define CYTHON_COMPILING_IN_CPYTHON 1 - #define CYTHON_COMPILING_IN_LIMITED_API 0 - #define CYTHON_COMPILING_IN_GRAAL 0 - #define CYTHON_COMPILING_IN_NOGIL 0 - #ifndef CYTHON_USE_TYPE_SLOTS - #define CYTHON_USE_TYPE_SLOTS 1 - #endif - #ifndef CYTHON_USE_TYPE_SPECS - #define CYTHON_USE_TYPE_SPECS 0 - #endif - #ifndef CYTHON_USE_PYTYPE_LOOKUP - #define CYTHON_USE_PYTYPE_LOOKUP 1 - #endif 
- #if PY_MAJOR_VERSION < 3 - #undef CYTHON_USE_ASYNC_SLOTS - #define CYTHON_USE_ASYNC_SLOTS 0 - #elif !defined(CYTHON_USE_ASYNC_SLOTS) - #define CYTHON_USE_ASYNC_SLOTS 1 - #endif - #ifndef CYTHON_USE_PYLONG_INTERNALS - #define CYTHON_USE_PYLONG_INTERNALS 1 - #endif - #ifndef CYTHON_USE_PYLIST_INTERNALS - #define CYTHON_USE_PYLIST_INTERNALS 1 - #endif - #ifndef CYTHON_USE_UNICODE_INTERNALS - #define CYTHON_USE_UNICODE_INTERNALS 1 - #endif - #if PY_VERSION_HEX < 0x030300F0 || PY_VERSION_HEX >= 0x030B00A2 - #undef CYTHON_USE_UNICODE_WRITER - #define CYTHON_USE_UNICODE_WRITER 0 - #elif !defined(CYTHON_USE_UNICODE_WRITER) - #define CYTHON_USE_UNICODE_WRITER 1 - #endif - #ifndef CYTHON_AVOID_BORROWED_REFS - #define CYTHON_AVOID_BORROWED_REFS 0 - #endif - #ifndef CYTHON_ASSUME_SAFE_MACROS - #define CYTHON_ASSUME_SAFE_MACROS 1 - #endif - #ifndef CYTHON_UNPACK_METHODS - #define CYTHON_UNPACK_METHODS 1 - #endif - #ifndef CYTHON_FAST_THREAD_STATE - #define CYTHON_FAST_THREAD_STATE 1 - #endif - #ifndef CYTHON_FAST_GIL - #define CYTHON_FAST_GIL (PY_MAJOR_VERSION < 3 || PY_VERSION_HEX >= 0x03060000 && PY_VERSION_HEX < 0x030C00A6) - #endif - #ifndef CYTHON_METH_FASTCALL - #define CYTHON_METH_FASTCALL (PY_VERSION_HEX >= 0x030700A1) - #endif - #ifndef CYTHON_FAST_PYCALL - #define CYTHON_FAST_PYCALL 1 - #endif - #ifndef CYTHON_PEP487_INIT_SUBCLASS - #define CYTHON_PEP487_INIT_SUBCLASS 1 - #endif - #if PY_VERSION_HEX < 0x03050000 - #undef CYTHON_PEP489_MULTI_PHASE_INIT - #define CYTHON_PEP489_MULTI_PHASE_INIT 0 - #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT) - #define CYTHON_PEP489_MULTI_PHASE_INIT 1 - #endif - #ifndef CYTHON_USE_MODULE_STATE - #define CYTHON_USE_MODULE_STATE 0 - #endif - #if PY_VERSION_HEX < 0x030400a1 - #undef CYTHON_USE_TP_FINALIZE - #define CYTHON_USE_TP_FINALIZE 0 - #elif !defined(CYTHON_USE_TP_FINALIZE) - #define CYTHON_USE_TP_FINALIZE 1 - #endif - #if PY_VERSION_HEX < 0x030600B1 - #undef CYTHON_USE_DICT_VERSIONS - #define CYTHON_USE_DICT_VERSIONS 0 - #elif 
!defined(CYTHON_USE_DICT_VERSIONS) - #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX < 0x030C00A5) - #endif - #if PY_VERSION_HEX < 0x030700A3 - #undef CYTHON_USE_EXC_INFO_STACK - #define CYTHON_USE_EXC_INFO_STACK 0 - #elif !defined(CYTHON_USE_EXC_INFO_STACK) - #define CYTHON_USE_EXC_INFO_STACK 1 - #endif - #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC - #define CYTHON_UPDATE_DESCRIPTOR_DOC 1 - #endif -#endif -#if !defined(CYTHON_FAST_PYCCALL) -#define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) -#endif -#if !defined(CYTHON_VECTORCALL) -#define CYTHON_VECTORCALL (CYTHON_FAST_PYCCALL && PY_VERSION_HEX >= 0x030800B1) -#endif -#define CYTHON_BACKPORT_VECTORCALL (CYTHON_METH_FASTCALL && PY_VERSION_HEX < 0x030800B1) -#if CYTHON_USE_PYLONG_INTERNALS - #if PY_MAJOR_VERSION < 3 - #include "longintrepr.h" - #endif - #undef SHIFT - #undef BASE - #undef MASK - #ifdef SIZEOF_VOID_P - enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; - #endif -#endif -#ifndef __has_attribute - #define __has_attribute(x) 0 -#endif -#ifndef __has_cpp_attribute - #define __has_cpp_attribute(x) 0 -#endif -#ifndef CYTHON_RESTRICT - #if defined(__GNUC__) - #define CYTHON_RESTRICT __restrict__ - #elif defined(_MSC_VER) && _MSC_VER >= 1400 - #define CYTHON_RESTRICT __restrict - #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L - #define CYTHON_RESTRICT restrict - #else - #define CYTHON_RESTRICT - #endif -#endif -#ifndef CYTHON_UNUSED - #if defined(__cplusplus) - /* for clang __has_cpp_attribute(maybe_unused) is true even before C++17 - * but leads to warnings with -pedantic, since it is a C++17 feature */ - #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L) - #if __has_cpp_attribute(maybe_unused) - #define CYTHON_UNUSED [[maybe_unused]] - #endif - #endif - #endif -#endif -#ifndef CYTHON_UNUSED -# if defined(__GNUC__) -# if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) -# 
define CYTHON_UNUSED __attribute__ ((__unused__)) -# else -# define CYTHON_UNUSED -# endif -# elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) -# define CYTHON_UNUSED __attribute__ ((__unused__)) -# else -# define CYTHON_UNUSED -# endif -#endif -#ifndef CYTHON_UNUSED_VAR -# if defined(__cplusplus) - template void CYTHON_UNUSED_VAR( const T& ) { } -# else -# define CYTHON_UNUSED_VAR(x) (void)(x) -# endif -#endif -#ifndef CYTHON_MAYBE_UNUSED_VAR - #define CYTHON_MAYBE_UNUSED_VAR(x) CYTHON_UNUSED_VAR(x) -#endif -#ifndef CYTHON_NCP_UNUSED -# if CYTHON_COMPILING_IN_CPYTHON -# define CYTHON_NCP_UNUSED -# else -# define CYTHON_NCP_UNUSED CYTHON_UNUSED -# endif -#endif -#define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) -#ifdef _MSC_VER - #ifndef _MSC_STDINT_H_ - #if _MSC_VER < 1300 - typedef unsigned char uint8_t; - typedef unsigned short uint16_t; - typedef unsigned int uint32_t; - #else - typedef unsigned __int8 uint8_t; - typedef unsigned __int16 uint16_t; - typedef unsigned __int32 uint32_t; - #endif - #endif - #if _MSC_VER < 1300 - #ifdef _WIN64 - typedef unsigned long long __pyx_uintptr_t; - #else - typedef unsigned int __pyx_uintptr_t; - #endif - #else - #ifdef _WIN64 - typedef unsigned __int64 __pyx_uintptr_t; - #else - typedef unsigned __int32 __pyx_uintptr_t; - #endif - #endif -#else - #include - typedef uintptr_t __pyx_uintptr_t; -#endif -#ifndef CYTHON_FALLTHROUGH - #if defined(__cplusplus) - /* for clang __has_cpp_attribute(fallthrough) is true even before C++17 - * but leads to warnings with -pedantic, since it is a C++17 feature */ - #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L) - #if __has_cpp_attribute(fallthrough) - #define CYTHON_FALLTHROUGH [[fallthrough]] - #endif - #endif - #ifndef CYTHON_FALLTHROUGH - #if __has_cpp_attribute(clang::fallthrough) - #define CYTHON_FALLTHROUGH [[clang::fallthrough]] - #elif __has_cpp_attribute(gnu::fallthrough) - #define 
CYTHON_FALLTHROUGH [[gnu::fallthrough]] - #endif - #endif - #endif - #ifndef CYTHON_FALLTHROUGH - #if __has_attribute(fallthrough) - #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) - #else - #define CYTHON_FALLTHROUGH - #endif - #endif - #if defined(__clang__) && defined(__apple_build_version__) - #if __apple_build_version__ < 7000000 - #undef CYTHON_FALLTHROUGH - #define CYTHON_FALLTHROUGH - #endif - #endif -#endif -#ifdef __cplusplus - template<typename T> - struct __PYX_IS_UNSIGNED_IMPL {static const bool value = T(0) < T(-1);}; - #define __PYX_IS_UNSIGNED(type) (__PYX_IS_UNSIGNED_IMPL<type>::value) -#else - #define __PYX_IS_UNSIGNED(type) (((type)-1) > 0) -#endif -#if CYTHON_COMPILING_IN_PYPY == 1 - #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x030A0000) -#else - #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x03090000) -#endif -#define __PYX_REINTERPRET_FUNCION(func_pointer, other_pointer) ((func_pointer)(void(*)(void))(other_pointer)) - -#ifndef CYTHON_INLINE - #if defined(__clang__) - #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) - #elif defined(__GNUC__) - #define CYTHON_INLINE __inline__ - #elif defined(_MSC_VER) - #define CYTHON_INLINE __inline - #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L - #define CYTHON_INLINE inline - #else - #define CYTHON_INLINE - #endif -#endif - -#define __PYX_BUILD_PY_SSIZE_T "n" -#define CYTHON_FORMAT_SSIZE_T "z" -#if PY_MAJOR_VERSION < 3 - #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" - #define __Pyx_DefaultClassType PyClass_Type - #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ - PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) -#else - #define __Pyx_BUILTIN_MODULE_NAME "builtins" - #define __Pyx_DefaultClassType PyType_Type -#if PY_VERSION_HEX >= 0x030B00A1 - static CYTHON_INLINE PyCodeObject* __Pyx_PyCode_New(int a, int p, int k, int l, int s, int f, - 
PyObject *code, PyObject *c, PyObject* n, PyObject *v, - PyObject *fv, PyObject *cell, PyObject* fn, - PyObject *name, int fline, PyObject *lnos) { - PyObject *kwds=NULL, *argcount=NULL, *posonlyargcount=NULL, *kwonlyargcount=NULL; - PyObject *nlocals=NULL, *stacksize=NULL, *flags=NULL, *replace=NULL, *empty=NULL; - const char *fn_cstr=NULL; - const char *name_cstr=NULL; - PyCodeObject *co=NULL, *result=NULL; - PyObject *type, *value, *traceback; - PyErr_Fetch(&type, &value, &traceback); - if (!(kwds=PyDict_New())) goto end; - if (!(argcount=PyLong_FromLong(a))) goto end; - if (PyDict_SetItemString(kwds, "co_argcount", argcount) != 0) goto end; - if (!(posonlyargcount=PyLong_FromLong(p))) goto end; - if (PyDict_SetItemString(kwds, "co_posonlyargcount", posonlyargcount) != 0) goto end; - if (!(kwonlyargcount=PyLong_FromLong(k))) goto end; - if (PyDict_SetItemString(kwds, "co_kwonlyargcount", kwonlyargcount) != 0) goto end; - if (!(nlocals=PyLong_FromLong(l))) goto end; - if (PyDict_SetItemString(kwds, "co_nlocals", nlocals) != 0) goto end; - if (!(stacksize=PyLong_FromLong(s))) goto end; - if (PyDict_SetItemString(kwds, "co_stacksize", stacksize) != 0) goto end; - if (!(flags=PyLong_FromLong(f))) goto end; - if (PyDict_SetItemString(kwds, "co_flags", flags) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_code", code) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_consts", c) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_names", n) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_varnames", v) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_freevars", fv) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_cellvars", cell) != 0) goto end; - if (PyDict_SetItemString(kwds, "co_linetable", lnos) != 0) goto end; - if (!(fn_cstr=PyUnicode_AsUTF8AndSize(fn, NULL))) goto end; - if (!(name_cstr=PyUnicode_AsUTF8AndSize(name, NULL))) goto end; - if (!(co = PyCode_NewEmpty(fn_cstr, name_cstr, fline))) goto end; - if (!(replace = 
PyObject_GetAttrString((PyObject*)co, "replace"))) goto end; - if (!(empty = PyTuple_New(0))) goto end; - result = (PyCodeObject*) PyObject_Call(replace, empty, kwds); - end: - Py_XDECREF((PyObject*) co); - Py_XDECREF(kwds); - Py_XDECREF(argcount); - Py_XDECREF(posonlyargcount); - Py_XDECREF(kwonlyargcount); - Py_XDECREF(nlocals); - Py_XDECREF(stacksize); - Py_XDECREF(replace); - Py_XDECREF(empty); - if (type) { - PyErr_Restore(type, value, traceback); - } - return result; - } -#elif PY_VERSION_HEX >= 0x030800B2 && !CYTHON_COMPILING_IN_PYPY - #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ - PyCode_NewWithPosOnlyArgs(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) -#else - #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ - PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) -#endif -#endif -#if PY_VERSION_HEX >= 0x030900A4 || defined(Py_IS_TYPE) - #define __Pyx_IS_TYPE(ob, type) Py_IS_TYPE(ob, type) -#else - #define __Pyx_IS_TYPE(ob, type) (((const PyObject*)ob)->ob_type == (type)) -#endif -#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_Is) - #define __Pyx_Py_Is(x, y) Py_Is(x, y) -#else - #define __Pyx_Py_Is(x, y) ((x) == (y)) -#endif -#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsNone) - #define __Pyx_Py_IsNone(ob) Py_IsNone(ob) -#else - #define __Pyx_Py_IsNone(ob) __Pyx_Py_Is((ob), Py_None) -#endif -#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsTrue) - #define __Pyx_Py_IsTrue(ob) Py_IsTrue(ob) -#else - #define __Pyx_Py_IsTrue(ob) __Pyx_Py_Is((ob), Py_True) -#endif -#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsFalse) - #define __Pyx_Py_IsFalse(ob) Py_IsFalse(ob) -#else - #define __Pyx_Py_IsFalse(ob) __Pyx_Py_Is((ob), Py_False) -#endif -#define __Pyx_NoneAsNull(obj) (__Pyx_Py_IsNone(obj) ? 
NULL : (obj)) -#if PY_VERSION_HEX >= 0x030900F0 && !CYTHON_COMPILING_IN_PYPY - #define __Pyx_PyObject_GC_IsFinalized(o) PyObject_GC_IsFinalized(o) -#else - #define __Pyx_PyObject_GC_IsFinalized(o) _PyGC_FINALIZED(o) -#endif -#ifndef CO_COROUTINE - #define CO_COROUTINE 0x80 -#endif -#ifndef CO_ASYNC_GENERATOR - #define CO_ASYNC_GENERATOR 0x200 -#endif -#ifndef Py_TPFLAGS_CHECKTYPES - #define Py_TPFLAGS_CHECKTYPES 0 -#endif -#ifndef Py_TPFLAGS_HAVE_INDEX - #define Py_TPFLAGS_HAVE_INDEX 0 -#endif -#ifndef Py_TPFLAGS_HAVE_NEWBUFFER - #define Py_TPFLAGS_HAVE_NEWBUFFER 0 -#endif -#ifndef Py_TPFLAGS_HAVE_FINALIZE - #define Py_TPFLAGS_HAVE_FINALIZE 0 -#endif -#ifndef Py_TPFLAGS_SEQUENCE - #define Py_TPFLAGS_SEQUENCE 0 -#endif -#ifndef Py_TPFLAGS_MAPPING - #define Py_TPFLAGS_MAPPING 0 -#endif -#ifndef METH_STACKLESS - #define METH_STACKLESS 0 -#endif -#if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) - #ifndef METH_FASTCALL - #define METH_FASTCALL 0x80 - #endif - typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); - typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, - Py_ssize_t nargs, PyObject *kwnames); -#else - #define __Pyx_PyCFunctionFast _PyCFunctionFast - #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords -#endif -#if CYTHON_METH_FASTCALL - #define __Pyx_METH_FASTCALL METH_FASTCALL - #define __Pyx_PyCFunction_FastCall __Pyx_PyCFunctionFast - #define __Pyx_PyCFunction_FastCallWithKeywords __Pyx_PyCFunctionFastWithKeywords -#else - #define __Pyx_METH_FASTCALL METH_VARARGS - #define __Pyx_PyCFunction_FastCall PyCFunction - #define __Pyx_PyCFunction_FastCallWithKeywords PyCFunctionWithKeywords -#endif -#if CYTHON_VECTORCALL - #define __pyx_vectorcallfunc vectorcallfunc - #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET PY_VECTORCALL_ARGUMENTS_OFFSET - #define __Pyx_PyVectorcall_NARGS(n) PyVectorcall_NARGS((size_t)(n)) -#elif CYTHON_BACKPORT_VECTORCALL - 
typedef PyObject *(*__pyx_vectorcallfunc)(PyObject *callable, PyObject *const *args, - size_t nargsf, PyObject *kwnames); - #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET ((size_t)1 << (8 * sizeof(size_t) - 1)) - #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(((size_t)(n)) & ~__Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET)) -#else - #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET 0 - #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(n)) -#endif -#if PY_VERSION_HEX < 0x030900B1 - #define __Pyx_PyType_FromModuleAndSpec(m, s, b) ((void)m, PyType_FromSpecWithBases(s, b)) - typedef PyObject *(*__Pyx_PyCMethod)(PyObject *, PyTypeObject *, PyObject *const *, size_t, PyObject *); -#else - #define __Pyx_PyType_FromModuleAndSpec(m, s, b) PyType_FromModuleAndSpec(m, s, b) - #define __Pyx_PyCMethod PyCMethod -#endif -#ifndef METH_METHOD - #define METH_METHOD 0x200 -#endif -#if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) - #define PyObject_Malloc(s) PyMem_Malloc(s) - #define PyObject_Free(p) PyMem_Free(p) - #define PyObject_Realloc(p) PyMem_Realloc(p) -#endif -#if CYTHON_COMPILING_IN_LIMITED_API - #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) - #define __Pyx_PyFrame_SetLineNumber(frame, lineno) -#else - #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) - #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) -#endif -#if CYTHON_COMPILING_IN_LIMITED_API - #define __Pyx_PyThreadState_Current PyThreadState_Get() -#elif !CYTHON_FAST_THREAD_STATE - #define __Pyx_PyThreadState_Current PyThreadState_GET() -#elif PY_VERSION_HEX >= 0x03060000 - #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() -#elif PY_VERSION_HEX >= 0x03000000 - #define __Pyx_PyThreadState_Current PyThreadState_GET() -#else - #define __Pyx_PyThreadState_Current _PyThreadState_Current -#endif -#if CYTHON_COMPILING_IN_LIMITED_API -static CYTHON_INLINE void *__Pyx_PyModule_GetState(PyObject *op) -{ - void *result; - result = PyModule_GetState(op); - 
if (!result) - Py_FatalError("Couldn't find the module state"); - return result; -} -#endif -#define __Pyx_PyObject_GetSlot(obj, name, func_ctype) __Pyx_PyType_GetSlot(Py_TYPE(obj), name, func_ctype) -#if CYTHON_COMPILING_IN_LIMITED_API - #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((func_ctype) PyType_GetSlot((type), Py_##name)) -#else - #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((type)->name) -#endif -#if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) -#include "pythread.h" -#define Py_tss_NEEDS_INIT 0 -typedef int Py_tss_t; -static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { - *key = PyThread_create_key(); - return 0; -} -static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { - Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); - *key = Py_tss_NEEDS_INIT; - return key; -} -static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { - PyObject_Free(key); -} -static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { - return *key != Py_tss_NEEDS_INIT; -} -static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { - PyThread_delete_key(*key); - *key = Py_tss_NEEDS_INIT; -} -static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { - return PyThread_set_key_value(*key, value); -} -static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { - return PyThread_get_key_value(*key); -} -#endif -#if PY_MAJOR_VERSION < 3 - #if CYTHON_COMPILING_IN_PYPY - #if PYPY_VERSION_NUM < 0x07030600 - #if defined(__cplusplus) && __cplusplus >= 201402L - [[deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")]] - #elif defined(__GNUC__) || defined(__clang__) - __attribute__ ((__deprecated__("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6"))) - #elif defined(_MSC_VER) - __declspec(deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")) - #endif - static 
CYTHON_INLINE int PyGILState_Check(void) { - return 0; - } - #else // PYPY_VERSION_NUM < 0x07030600 - #endif // PYPY_VERSION_NUM < 0x07030600 - #else - static CYTHON_INLINE int PyGILState_Check(void) { - PyThreadState * tstate = _PyThreadState_Current; - return tstate && (tstate == PyGILState_GetThisThreadState()); - } - #endif -#endif -#if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) -#define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n)) -#else -#define __Pyx_PyDict_NewPresized(n) PyDict_New() -#endif -#if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION - #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) - #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) -#else - #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) - #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) -#endif -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX > 0x030600B4 && CYTHON_USE_UNICODE_INTERNALS -#define __Pyx_PyDict_GetItemStrWithError(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) -static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStr(PyObject *dict, PyObject *name) { - PyObject *res = __Pyx_PyDict_GetItemStrWithError(dict, name); - if (res == NULL) PyErr_Clear(); - return res; -} -#elif PY_MAJOR_VERSION >= 3 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07020000) -#define __Pyx_PyDict_GetItemStrWithError PyDict_GetItemWithError -#define __Pyx_PyDict_GetItemStr PyDict_GetItem -#else -static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStrWithError(PyObject *dict, PyObject *name) { -#if CYTHON_COMPILING_IN_PYPY - return PyDict_GetItem(dict, name); -#else - PyDictEntry *ep; - PyDictObject *mp = (PyDictObject*) dict; - long hash = ((PyStringObject *) name)->ob_shash; - assert(hash != -1); - ep = (mp->ma_lookup)(mp, name, hash); - if (ep == NULL) { - return NULL; - } - return ep->me_value; -#endif -} -#define __Pyx_PyDict_GetItemStr 
PyDict_GetItem -#endif -#if CYTHON_USE_TYPE_SLOTS - #define __Pyx_PyType_GetFlags(tp) (((PyTypeObject *)tp)->tp_flags) - #define __Pyx_PyType_HasFeature(type, feature) ((__Pyx_PyType_GetFlags(type) & (feature)) != 0) - #define __Pyx_PyObject_GetIterNextFunc(obj) (Py_TYPE(obj)->tp_iternext) -#else - #define __Pyx_PyType_GetFlags(tp) (PyType_GetFlags((PyTypeObject *)tp)) - #define __Pyx_PyType_HasFeature(type, feature) PyType_HasFeature(type, feature) - #define __Pyx_PyObject_GetIterNextFunc(obj) PyIter_Next -#endif -#if CYTHON_USE_TYPE_SPECS && PY_VERSION_HEX >= 0x03080000 -#define __Pyx_PyHeapTypeObject_GC_Del(obj) {\ - PyTypeObject *type = Py_TYPE(obj);\ - assert(__Pyx_PyType_HasFeature(type, Py_TPFLAGS_HEAPTYPE));\ - PyObject_GC_Del(obj);\ - Py_DECREF(type);\ -} -#else -#define __Pyx_PyHeapTypeObject_GC_Del(obj) PyObject_GC_Del(obj) -#endif -#if CYTHON_COMPILING_IN_LIMITED_API - #define CYTHON_PEP393_ENABLED 1 - #define __Pyx_PyUnicode_READY(op) (0) - #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GetLength(u) - #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_ReadChar(u, i) - #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((void)u, 1114111U) - #define __Pyx_PyUnicode_KIND(u) ((void)u, (0)) - #define __Pyx_PyUnicode_DATA(u) ((void*)u) - #define __Pyx_PyUnicode_READ(k, d, i) ((void)k, PyUnicode_ReadChar((PyObject*)(d), i)) - #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GetLength(u)) -#elif PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) - #define CYTHON_PEP393_ENABLED 1 - #if PY_VERSION_HEX >= 0x030C0000 - #define __Pyx_PyUnicode_READY(op) (0) - #else - #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ - 0 : _PyUnicode_Ready((PyObject *)(op))) - #endif - #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) - #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) - #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) - #define __Pyx_PyUnicode_KIND(u) ((int)PyUnicode_KIND(u)) - #define 
__Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) - #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) - #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, (Py_UCS4) ch) - #if PY_VERSION_HEX >= 0x030C0000 - #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_LENGTH(u)) - #else - #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03090000 - #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : ((PyCompactUnicodeObject *)(u))->wstr_length)) - #else - #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) - #endif - #endif -#else - #define CYTHON_PEP393_ENABLED 0 - #define PyUnicode_1BYTE_KIND 1 - #define PyUnicode_2BYTE_KIND 2 - #define PyUnicode_4BYTE_KIND 4 - #define __Pyx_PyUnicode_READY(op) (0) - #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) - #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) - #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 
65535U : 1114111U) - #define __Pyx_PyUnicode_KIND(u) ((int)sizeof(Py_UNICODE)) - #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) - #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) - #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = (Py_UNICODE) ch) - #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) -#endif -#if CYTHON_COMPILING_IN_PYPY - #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) - #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) -#else - #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) - #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ - PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) -#endif -#if CYTHON_COMPILING_IN_PYPY - #if !defined(PyUnicode_DecodeUnicodeEscape) - #define PyUnicode_DecodeUnicodeEscape(s, size, errors) PyUnicode_Decode(s, size, "unicode_escape", errors) - #endif - #if !defined(PyUnicode_Contains) || (PY_MAJOR_VERSION == 2 && PYPY_VERSION_NUM < 0x07030500) - #undef PyUnicode_Contains - #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) - #endif - #if !defined(PyByteArray_Check) - #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) - #endif - #if !defined(PyObject_Format) - #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) - #endif -#endif -#define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) -#define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? 
PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) -#if PY_MAJOR_VERSION >= 3 - #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) -#else - #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) -#endif -#if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) - #define PyObject_ASCII(o) PyObject_Repr(o) -#endif -#if PY_MAJOR_VERSION >= 3 - #define PyBaseString_Type PyUnicode_Type - #define PyStringObject PyUnicodeObject - #define PyString_Type PyUnicode_Type - #define PyString_Check PyUnicode_Check - #define PyString_CheckExact PyUnicode_CheckExact -#ifndef PyObject_Unicode - #define PyObject_Unicode PyObject_Str -#endif -#endif -#if PY_MAJOR_VERSION >= 3 - #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) - #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) -#else - #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) - #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) -#endif -#if CYTHON_COMPILING_IN_CPYTHON - #define __Pyx_PySequence_ListKeepNew(obj)\ - (likely(PyList_CheckExact(obj) && Py_REFCNT(obj) == 1) ? 
__Pyx_NewRef(obj) : PySequence_List(obj)) -#else - #define __Pyx_PySequence_ListKeepNew(obj) PySequence_List(obj) -#endif -#ifndef PySet_CheckExact - #define PySet_CheckExact(obj) __Pyx_IS_TYPE(obj, &PySet_Type) -#endif -#if PY_VERSION_HEX >= 0x030900A4 - #define __Pyx_SET_REFCNT(obj, refcnt) Py_SET_REFCNT(obj, refcnt) - #define __Pyx_SET_SIZE(obj, size) Py_SET_SIZE(obj, size) -#else - #define __Pyx_SET_REFCNT(obj, refcnt) Py_REFCNT(obj) = (refcnt) - #define __Pyx_SET_SIZE(obj, size) Py_SIZE(obj) = (size) -#endif -#if CYTHON_ASSUME_SAFE_MACROS - #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) -#else - #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) -#endif -#if PY_MAJOR_VERSION >= 3 - #define PyIntObject PyLongObject - #define PyInt_Type PyLong_Type - #define PyInt_Check(op) PyLong_Check(op) - #define PyInt_CheckExact(op) PyLong_CheckExact(op) - #define __Pyx_Py3Int_Check(op) PyLong_Check(op) - #define __Pyx_Py3Int_CheckExact(op) PyLong_CheckExact(op) - #define PyInt_FromString PyLong_FromString - #define PyInt_FromUnicode PyLong_FromUnicode - #define PyInt_FromLong PyLong_FromLong - #define PyInt_FromSize_t PyLong_FromSize_t - #define PyInt_FromSsize_t PyLong_FromSsize_t - #define PyInt_AsLong PyLong_AsLong - #define PyInt_AS_LONG PyLong_AS_LONG - #define PyInt_AsSsize_t PyLong_AsSsize_t - #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask - #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask - #define PyNumber_Int PyNumber_Long -#else - #define __Pyx_Py3Int_Check(op) (PyLong_Check(op) || PyInt_Check(op)) - #define __Pyx_Py3Int_CheckExact(op) (PyLong_CheckExact(op) || PyInt_CheckExact(op)) -#endif -#if PY_MAJOR_VERSION >= 3 - #define PyBoolObject PyLongObject -#endif -#if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY - #ifndef PyUnicode_InternFromString - #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) - #endif -#endif -#if PY_VERSION_HEX < 0x030200A4 - typedef long Py_hash_t; - #define __Pyx_PyInt_FromHash_t 
PyInt_FromLong - #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsHash_t -#else - #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t - #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsSsize_t -#endif -#if CYTHON_USE_ASYNC_SLOTS - #if PY_VERSION_HEX >= 0x030500B1 - #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods - #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) - #else - #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) - #endif -#else - #define __Pyx_PyType_AsAsync(obj) NULL -#endif -#ifndef __Pyx_PyAsyncMethodsStruct - typedef struct { - unaryfunc am_await; - unaryfunc am_aiter; - unaryfunc am_anext; - } __Pyx_PyAsyncMethodsStruct; -#endif - -#if defined(_WIN32) || defined(WIN32) || defined(MS_WINDOWS) - #if !defined(_USE_MATH_DEFINES) - #define _USE_MATH_DEFINES - #endif -#endif -#include <math.h> -#ifdef NAN -#define __PYX_NAN() ((float) NAN) -#else -static CYTHON_INLINE float __PYX_NAN() { - float value; - memset(&value, 0xFF, sizeof(value)); - return value; -} -#endif -#if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) -#define __Pyx_truncl trunc -#else -#define __Pyx_truncl truncl -#endif - -#define __PYX_MARK_ERR_POS(f_index, lineno) \ - { __pyx_filename = __pyx_f[f_index]; (void)__pyx_filename; __pyx_lineno = lineno; (void)__pyx_lineno; __pyx_clineno = __LINE__; (void)__pyx_clineno; } -#define __PYX_ERR(f_index, lineno, Ln_error) \ - { __PYX_MARK_ERR_POS(f_index, lineno) goto Ln_error; } - -#ifdef CYTHON_EXTERN_C - #undef __PYX_EXTERN_C - #define __PYX_EXTERN_C CYTHON_EXTERN_C -#elif defined(__PYX_EXTERN_C) - #ifdef _MSC_VER - #pragma message ("Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead.") - #else - #warning Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead. 
- #endif -#else - #ifdef __cplusplus - #define __PYX_EXTERN_C extern "C" - #else - #define __PYX_EXTERN_C extern - #endif -#endif - -#define __PYX_HAVE__fontTools__cu2qu__cu2qu -#define __PYX_HAVE_API__fontTools__cu2qu__cu2qu -/* Early includes */ -#ifdef _OPENMP -#include <omp.h> -#endif /* _OPENMP */ - -#if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) -#define CYTHON_WITHOUT_ASSERTIONS -#endif - -typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; - const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; - -#define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 -#define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 -#define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) -#define __PYX_DEFAULT_STRING_ENCODING "" -#define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString -#define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize -#define __Pyx_uchar_cast(c) ((unsigned char)c) -#define __Pyx_long_cast(x) ((long)x) -#define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ - (sizeof(type) < sizeof(Py_ssize_t)) ||\ - (sizeof(type) > sizeof(Py_ssize_t) &&\ - likely(v < (type)PY_SSIZE_T_MAX ||\ - v == (type)PY_SSIZE_T_MAX) &&\ - (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ - v == (type)PY_SSIZE_T_MIN))) ||\ - (sizeof(type) == sizeof(Py_ssize_t) &&\ - (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ - v == (type)PY_SSIZE_T_MAX))) ) -static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { - return (size_t) i < (size_t) limit; -} -#if defined (__cplusplus) && __cplusplus >= 201103L - #include <cstdlib> - #define __Pyx_sst_abs(value) std::abs(value) -#elif SIZEOF_INT >= SIZEOF_SIZE_T - #define __Pyx_sst_abs(value) abs(value) -#elif SIZEOF_LONG >= SIZEOF_SIZE_T - #define __Pyx_sst_abs(value) labs(value) -#elif defined (_MSC_VER) - #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) -#elif defined 
(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L - #define __Pyx_sst_abs(value) llabs(value) -#elif defined (__GNUC__) - #define __Pyx_sst_abs(value) __builtin_llabs(value) -#else - #define __Pyx_sst_abs(value) ((value<0) ? -value : value) -#endif -static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); -static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); -#define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) -#define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) -#define __Pyx_PyBytes_FromString PyBytes_FromString -#define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize -static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); -#if PY_MAJOR_VERSION < 3 - #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString - #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize -#else - #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString - #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize -#endif -#define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) -#define __Pyx_PyObject_AsWritableString(s) ((char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) -#define __Pyx_PyObject_AsWritableSString(s) ((signed char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) -#define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) -#define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) 
-#define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) -#define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) -#define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) -#define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) -#define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) -#define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) -#if CYTHON_COMPILING_IN_LIMITED_API -static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const wchar_t *u) -{ - const wchar_t *u_end = u; - while (*u_end++) ; - return (size_t)(u_end - u - 1); -} -#else -static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) -{ - const Py_UNICODE *u_end = u; - while (*u_end++) ; - return (size_t)(u_end - u - 1); -} -#endif -#define __Pyx_PyUnicode_FromOrdinal(o) PyUnicode_FromOrdinal((int)o) -#define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) -#define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode -#define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode -#define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) -#define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) -static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); -static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); -static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); -static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); -#define __Pyx_PySequence_Tuple(obj)\ - (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) -static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); -static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); -static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject*); -#if CYTHON_ASSUME_SAFE_MACROS -#define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? 
PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) -#else -#define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) -#endif -#define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) -#if PY_MAJOR_VERSION >= 3 -#define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Long(x)) -#else -#define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) -#endif -#if CYTHON_USE_PYLONG_INTERNALS - #if PY_VERSION_HEX >= 0x030C00A7 - #ifndef _PyLong_SIGN_MASK - #define _PyLong_SIGN_MASK 3 - #endif - #ifndef _PyLong_NON_SIZE_BITS - #define _PyLong_NON_SIZE_BITS 3 - #endif - #define __Pyx_PyLong_Sign(x) (((PyLongObject*)x)->long_value.lv_tag & _PyLong_SIGN_MASK) - #define __Pyx_PyLong_IsNeg(x) ((__Pyx_PyLong_Sign(x) & 2) != 0) - #define __Pyx_PyLong_IsNonNeg(x) (!__Pyx_PyLong_IsNeg(x)) - #define __Pyx_PyLong_IsZero(x) (__Pyx_PyLong_Sign(x) & 1) - #define __Pyx_PyLong_IsPos(x) (__Pyx_PyLong_Sign(x) == 0) - #define __Pyx_PyLong_CompactValueUnsigned(x) (__Pyx_PyLong_Digits(x)[0]) - #define __Pyx_PyLong_DigitCount(x) ((Py_ssize_t) (((PyLongObject*)x)->long_value.lv_tag >> _PyLong_NON_SIZE_BITS)) - #define __Pyx_PyLong_SignedDigitCount(x)\ - ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * __Pyx_PyLong_DigitCount(x)) - #if defined(PyUnstable_Long_IsCompact) && defined(PyUnstable_Long_CompactValue) - #define __Pyx_PyLong_IsCompact(x) PyUnstable_Long_IsCompact((PyLongObject*) x) - #define __Pyx_PyLong_CompactValue(x) PyUnstable_Long_CompactValue((PyLongObject*) x) - #else - #define __Pyx_PyLong_IsCompact(x) (((PyLongObject*)x)->long_value.lv_tag < (2 << _PyLong_NON_SIZE_BITS)) - #define __Pyx_PyLong_CompactValue(x) ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * (Py_ssize_t) __Pyx_PyLong_Digits(x)[0]) - #endif - typedef Py_ssize_t __Pyx_compact_pylong; - typedef size_t __Pyx_compact_upylong; - #else // Py < 3.12 - #define __Pyx_PyLong_IsNeg(x) (Py_SIZE(x) < 0) - #define __Pyx_PyLong_IsNonNeg(x) (Py_SIZE(x) >= 0) - #define __Pyx_PyLong_IsZero(x) 
(Py_SIZE(x) == 0) - #define __Pyx_PyLong_IsPos(x) (Py_SIZE(x) > 0) - #define __Pyx_PyLong_CompactValueUnsigned(x) ((Py_SIZE(x) == 0) ? 0 : __Pyx_PyLong_Digits(x)[0]) - #define __Pyx_PyLong_DigitCount(x) __Pyx_sst_abs(Py_SIZE(x)) - #define __Pyx_PyLong_SignedDigitCount(x) Py_SIZE(x) - #define __Pyx_PyLong_IsCompact(x) (Py_SIZE(x) == 0 || Py_SIZE(x) == 1 || Py_SIZE(x) == -1) - #define __Pyx_PyLong_CompactValue(x)\ - ((Py_SIZE(x) == 0) ? (sdigit) 0 : ((Py_SIZE(x) < 0) ? -(sdigit)__Pyx_PyLong_Digits(x)[0] : (sdigit)__Pyx_PyLong_Digits(x)[0])) - typedef sdigit __Pyx_compact_pylong; - typedef digit __Pyx_compact_upylong; - #endif - #if PY_VERSION_HEX >= 0x030C00A5 - #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->long_value.ob_digit) - #else - #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->ob_digit) - #endif -#endif -#if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII -static int __Pyx_sys_getdefaultencoding_not_ascii; -static int __Pyx_init_sys_getdefaultencoding_params(void) { - PyObject* sys; - PyObject* default_encoding = NULL; - PyObject* ascii_chars_u = NULL; - PyObject* ascii_chars_b = NULL; - const char* default_encoding_c; - sys = PyImport_ImportModule("sys"); - if (!sys) goto bad; - default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); - Py_DECREF(sys); - if (!default_encoding) goto bad; - default_encoding_c = PyBytes_AsString(default_encoding); - if (!default_encoding_c) goto bad; - if (strcmp(default_encoding_c, "ascii") == 0) { - __Pyx_sys_getdefaultencoding_not_ascii = 0; - } else { - char ascii_chars[128]; - int c; - for (c = 0; c < 128; c++) { - ascii_chars[c] = (char) c; - } - __Pyx_sys_getdefaultencoding_not_ascii = 1; - ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); - if (!ascii_chars_u) goto bad; - ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); - if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, 
PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { - PyErr_Format( - PyExc_ValueError, - "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", - default_encoding_c); - goto bad; - } - Py_DECREF(ascii_chars_u); - Py_DECREF(ascii_chars_b); - } - Py_DECREF(default_encoding); - return 0; -bad: - Py_XDECREF(default_encoding); - Py_XDECREF(ascii_chars_u); - Py_XDECREF(ascii_chars_b); - return -1; -} -#endif -#if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 -#define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) -#else -#define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) -#if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT -static char* __PYX_DEFAULT_STRING_ENCODING; -static int __Pyx_init_sys_getdefaultencoding_params(void) { - PyObject* sys; - PyObject* default_encoding = NULL; - char* default_encoding_c; - sys = PyImport_ImportModule("sys"); - if (!sys) goto bad; - default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); - Py_DECREF(sys); - if (!default_encoding) goto bad; - default_encoding_c = PyBytes_AsString(default_encoding); - if (!default_encoding_c) goto bad; - __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); - if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; - strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); - Py_DECREF(default_encoding); - return 0; -bad: - Py_XDECREF(default_encoding); - return -1; -} -#endif -#endif - - -/* Test for GCC > 2.95 */ -#if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) - #define likely(x) __builtin_expect(!!(x), 1) - #define unlikely(x) __builtin_expect(!!(x), 0) -#else /* !__GNUC__ or GCC < 2.95 */ - #define likely(x) (x) - #define unlikely(x) (x) -#endif /* __GNUC__ */ -static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { 
(void)ptr; } - -#if !CYTHON_USE_MODULE_STATE -static PyObject *__pyx_m = NULL; -#endif -static int __pyx_lineno; -static int __pyx_clineno = 0; -static const char * __pyx_cfilenm = __FILE__; -static const char *__pyx_filename; - -/* Header.proto */ -#if !defined(CYTHON_CCOMPLEX) - #if defined(__cplusplus) - #define CYTHON_CCOMPLEX 1 - #elif (defined(_Complex_I) && !defined(_MSC_VER)) || ((defined (__STDC_VERSION__) && __STDC_VERSION__ >= 201112L) && !defined(__STDC_NO_COMPLEX__)) - #define CYTHON_CCOMPLEX 1 - #else - #define CYTHON_CCOMPLEX 0 - #endif -#endif -#if CYTHON_CCOMPLEX - #ifdef __cplusplus - #include <complex> - #else - #include <complex.h> - #endif -#endif -#if CYTHON_CCOMPLEX && !defined(__cplusplus) && defined(__sun__) && defined(__GNUC__) - #undef _Complex_I - #define _Complex_I 1.0fj -#endif - -/* #### Code section: filename_table ### */ - -static const char *__pyx_f[] = { - "Lib/fontTools/cu2qu/cu2qu.py", -}; -/* #### Code section: utility_code_proto_before_types ### */ -/* #### Code section: numeric_typedefs ### */ -/* #### Code section: complex_type_declarations ### */ -/* Declarations.proto */ -#if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) - #ifdef __cplusplus - typedef ::std::complex< double > __pyx_t_double_complex; - #else - typedef double _Complex __pyx_t_double_complex; - #endif -#else - typedef struct { double real, imag; } __pyx_t_double_complex; -#endif -static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double, double); - -/* #### Code section: type_declarations ### */ - -/*--- Type declarations ---*/ -struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen; - -/* "fontTools/cu2qu/cu2qu.py":127 - * - * - * @cython.locals( # <<<<<<<<<<<<<< - * p0=cython.complex, - * p1=cython.complex, - */ -struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen { - PyObject_HEAD - __pyx_t_double_complex __pyx_v_a; - __pyx_t_double_complex __pyx_v_a1; - __pyx_t_double_complex __pyx_v_b; 
- __pyx_t_double_complex __pyx_v_b1; - __pyx_t_double_complex __pyx_v_c; - __pyx_t_double_complex __pyx_v_c1; - __pyx_t_double_complex __pyx_v_d; - __pyx_t_double_complex __pyx_v_d1; - double __pyx_v_delta_2; - double __pyx_v_delta_3; - double __pyx_v_dt; - int __pyx_v_i; - int __pyx_v_n; - __pyx_t_double_complex __pyx_v_p0; - __pyx_t_double_complex __pyx_v_p1; - __pyx_t_double_complex __pyx_v_p2; - __pyx_t_double_complex __pyx_v_p3; - double __pyx_v_t1; - double __pyx_v_t1_2; - int __pyx_t_0; - int __pyx_t_1; - int __pyx_t_2; -}; - -/* #### Code section: utility_code_proto ### */ - -/* --- Runtime support code (head) --- */ -/* Refnanny.proto */ -#ifndef CYTHON_REFNANNY - #define CYTHON_REFNANNY 0 -#endif -#if CYTHON_REFNANNY - typedef struct { - void (*INCREF)(void*, PyObject*, Py_ssize_t); - void (*DECREF)(void*, PyObject*, Py_ssize_t); - void (*GOTREF)(void*, PyObject*, Py_ssize_t); - void (*GIVEREF)(void*, PyObject*, Py_ssize_t); - void* (*SetupContext)(const char*, Py_ssize_t, const char*); - void (*FinishContext)(void**); - } __Pyx_RefNannyAPIStruct; - static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; - static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); - #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; -#ifdef WITH_THREAD - #define __Pyx_RefNannySetupContext(name, acquire_gil)\ - if (acquire_gil) {\ - PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ - __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\ - PyGILState_Release(__pyx_gilstate_save);\ - } else {\ - __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\ - } - #define __Pyx_RefNannyFinishContextNogil() {\ - PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ - __Pyx_RefNannyFinishContext();\ - PyGILState_Release(__pyx_gilstate_save);\ - } -#else - #define __Pyx_RefNannySetupContext(name, acquire_gil)\ - __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__)) 
- #define __Pyx_RefNannyFinishContextNogil() __Pyx_RefNannyFinishContext() -#endif - #define __Pyx_RefNannyFinishContext()\ - __Pyx_RefNanny->FinishContext(&__pyx_refnanny) - #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) - #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) - #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) - #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) - #define __Pyx_XINCREF(r) do { if((r) == NULL); else {__Pyx_INCREF(r); }} while(0) - #define __Pyx_XDECREF(r) do { if((r) == NULL); else {__Pyx_DECREF(r); }} while(0) - #define __Pyx_XGOTREF(r) do { if((r) == NULL); else {__Pyx_GOTREF(r); }} while(0) - #define __Pyx_XGIVEREF(r) do { if((r) == NULL); else {__Pyx_GIVEREF(r);}} while(0) -#else - #define __Pyx_RefNannyDeclarations - #define __Pyx_RefNannySetupContext(name, acquire_gil) - #define __Pyx_RefNannyFinishContextNogil() - #define __Pyx_RefNannyFinishContext() - #define __Pyx_INCREF(r) Py_INCREF(r) - #define __Pyx_DECREF(r) Py_DECREF(r) - #define __Pyx_GOTREF(r) - #define __Pyx_GIVEREF(r) - #define __Pyx_XINCREF(r) Py_XINCREF(r) - #define __Pyx_XDECREF(r) Py_XDECREF(r) - #define __Pyx_XGOTREF(r) - #define __Pyx_XGIVEREF(r) -#endif -#define __Pyx_Py_XDECREF_SET(r, v) do {\ - PyObject *tmp = (PyObject *) r;\ - r = v; Py_XDECREF(tmp);\ - } while (0) -#define __Pyx_XDECREF_SET(r, v) do {\ - PyObject *tmp = (PyObject *) r;\ - r = v; __Pyx_XDECREF(tmp);\ - } while (0) -#define __Pyx_DECREF_SET(r, v) do {\ - PyObject *tmp = (PyObject *) r;\ - r = v; __Pyx_DECREF(tmp);\ - } while (0) -#define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) 
-#define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) - -/* PyErrExceptionMatches.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_PyErr_ExceptionMatches(err) __Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err) -static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err); -#else -#define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err) -#endif - -/* PyThreadStateGet.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; -#define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; -#if PY_VERSION_HEX >= 0x030C00A6 -#define __Pyx_PyErr_Occurred() (__pyx_tstate->current_exception != NULL) -#define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->current_exception ? (PyObject*) Py_TYPE(__pyx_tstate->current_exception) : (PyObject*) NULL) -#else -#define __Pyx_PyErr_Occurred() (__pyx_tstate->curexc_type != NULL) -#define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->curexc_type) -#endif -#else -#define __Pyx_PyThreadState_declare -#define __Pyx_PyThreadState_assign -#define __Pyx_PyErr_Occurred() (PyErr_Occurred() != NULL) -#define __Pyx_PyErr_CurrentExceptionType() PyErr_Occurred() -#endif - -/* PyErrFetchRestore.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) -#define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) -#define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) -#define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) -#define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) -static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); -static CYTHON_INLINE void 
__Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A6 -#define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) -#else -#define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) -#endif -#else -#define __Pyx_PyErr_Clear() PyErr_Clear() -#define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) -#define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) -#define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) -#define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) -#define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) -#define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) -#define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) -#endif - -/* PyObjectGetAttrStr.proto */ -#if CYTHON_USE_TYPE_SLOTS -static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); -#else -#define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) -#endif - -/* PyObjectGetAttrStrNoError.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name); - -/* GetBuiltinName.proto */ -static PyObject *__Pyx_GetBuiltinName(PyObject *name); - -/* PyIntCompare.proto */ -static CYTHON_INLINE int __Pyx_PyInt_BoolEqObjC(PyObject *op1, PyObject *op2, long intval, long inplace); - -/* RaiseTooManyValuesToUnpack.proto */ -static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected); - -/* RaiseNeedMoreValuesToUnpack.proto */ -static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index); - -/* IterFinish.proto */ -static CYTHON_INLINE int __Pyx_IterFinish(void); - -/* UnpackItemEndCheck.proto */ -static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected); - -/* GetItemInt.proto */ -#define __Pyx_GetItemInt(o, i, type, 
is_signed, to_py_func, is_list, wraparound, boundscheck)\ - (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ - __Pyx_GetItemInt_Fast(o, (Py_ssize_t)i, is_list, wraparound, boundscheck) :\ - (is_list ? (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL) :\ - __Pyx_GetItemInt_Generic(o, to_py_func(i)))) -#define __Pyx_GetItemInt_List(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ - (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ - __Pyx_GetItemInt_List_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ - (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL)) -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, - int wraparound, int boundscheck); -#define __Pyx_GetItemInt_Tuple(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ - (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ - __Pyx_GetItemInt_Tuple_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ - (PyErr_SetString(PyExc_IndexError, "tuple index out of range"), (PyObject*)NULL)) -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, - int wraparound, int boundscheck); -static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j); -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, - int is_list, int wraparound, int boundscheck); - -/* PyDictVersioning.proto */ -#if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS -#define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) -#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) -#define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ - (version_var) = __PYX_GET_DICT_VERSION(dict);\ - (cache_var) = (value); -#define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ - static PY_UINT64_T __pyx_dict_version = 0;\ - static PyObject *__pyx_dict_cached_value = NULL;\ - if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ - (VAR) = 
__pyx_dict_cached_value;\ - } else {\ - (VAR) = __pyx_dict_cached_value = (LOOKUP);\ - __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ - }\ -} -static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); -static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); -static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); -#else -#define __PYX_GET_DICT_VERSION(dict) (0) -#define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) -#define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); -#endif - -/* GetModuleGlobalName.proto */ -#if CYTHON_USE_DICT_VERSIONS -#define __Pyx_GetModuleGlobalName(var, name) do {\ - static PY_UINT64_T __pyx_dict_version = 0;\ - static PyObject *__pyx_dict_cached_value = NULL;\ - (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ - (likely(__pyx_dict_cached_value) ? __Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ - __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ -} while(0) -#define __Pyx_GetModuleGlobalNameUncached(var, name) do {\ - PY_UINT64_T __pyx_dict_version;\ - PyObject *__pyx_dict_cached_value;\ - (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ -} while(0) -static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); -#else -#define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) -#define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) -static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); -#endif - -/* PyFunctionFastCall.proto */ -#if CYTHON_FAST_PYCALL -#if !CYTHON_VECTORCALL -#define __Pyx_PyFunction_FastCall(func, args, nargs)\ - __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) -static PyObject 
*__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); -#endif -#define __Pyx_BUILD_ASSERT_EXPR(cond)\ - (sizeof(char [1 - 2*!(cond)]) - 1) -#ifndef Py_MEMBER_SIZE -#define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) -#endif -#if !CYTHON_VECTORCALL -#if PY_VERSION_HEX >= 0x03080000 - #include "frameobject.h" -#if PY_VERSION_HEX >= 0x030b00a6 - #ifndef Py_BUILD_CORE - #define Py_BUILD_CORE 1 - #endif - #include "internal/pycore_frame.h" -#endif - #define __Pxy_PyFrame_Initialize_Offsets() - #define __Pyx_PyFrame_GetLocalsplus(frame) ((frame)->f_localsplus) -#else - static size_t __pyx_pyframe_localsplus_offset = 0; - #include "frameobject.h" - #define __Pxy_PyFrame_Initialize_Offsets()\ - ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ - (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) - #define __Pyx_PyFrame_GetLocalsplus(frame)\ - (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset)) -#endif -#endif -#endif - -/* PyObjectCall.proto */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); -#else -#define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) -#endif - -/* PyObjectCallMethO.proto */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); -#endif - -/* PyObjectFastCall.proto */ -#define __Pyx_PyObject_FastCall(func, args, nargs) __Pyx_PyObject_FastCallDict(func, args, (size_t)(nargs), NULL) -static CYTHON_INLINE PyObject* __Pyx_PyObject_FastCallDict(PyObject *func, PyObject **args, size_t nargs, PyObject *kwargs); - -/* TupleAndListFromArray.proto */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE PyObject* 
__Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n); -static CYTHON_INLINE PyObject* __Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n); -#endif - -/* IncludeStringH.proto */ -#include <string.h> - -/* BytesEquals.proto */ -static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals); - -/* UnicodeEquals.proto */ -static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals); - -/* fastcall.proto */ -#define __Pyx_Arg_VARARGS(args, i) PyTuple_GET_ITEM(args, i) -#define __Pyx_NumKwargs_VARARGS(kwds) PyDict_Size(kwds) -#define __Pyx_KwValues_VARARGS(args, nargs) NULL -#define __Pyx_GetKwValue_VARARGS(kw, kwvalues, s) __Pyx_PyDict_GetItemStrWithError(kw, s) -#define __Pyx_KwargsAsDict_VARARGS(kw, kwvalues) PyDict_Copy(kw) -#if CYTHON_METH_FASTCALL - #define __Pyx_Arg_FASTCALL(args, i) args[i] - #define __Pyx_NumKwargs_FASTCALL(kwds) PyTuple_GET_SIZE(kwds) - #define __Pyx_KwValues_FASTCALL(args, nargs) ((args) + (nargs)) - static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s); - #define __Pyx_KwargsAsDict_FASTCALL(kw, kwvalues) _PyStack_AsDict(kwvalues, kw) -#else - #define __Pyx_Arg_FASTCALL __Pyx_Arg_VARARGS - #define __Pyx_NumKwargs_FASTCALL __Pyx_NumKwargs_VARARGS - #define __Pyx_KwValues_FASTCALL __Pyx_KwValues_VARARGS - #define __Pyx_GetKwValue_FASTCALL __Pyx_GetKwValue_VARARGS - #define __Pyx_KwargsAsDict_FASTCALL __Pyx_KwargsAsDict_VARARGS -#endif -#if CYTHON_COMPILING_IN_CPYTHON -#define __Pyx_ArgsSlice_VARARGS(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_VARARGS(args, start), stop - start) -#define __Pyx_ArgsSlice_FASTCALL(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_FASTCALL(args, start), stop - start) -#else -#define __Pyx_ArgsSlice_VARARGS(args, start, stop) PyTuple_GetSlice(args, start, stop) -#define __Pyx_ArgsSlice_FASTCALL(args, start, stop) PyTuple_GetSlice(args, start, stop) -#endif - -/* RaiseArgTupleInvalid.proto 
*/ -static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, - Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); - -/* RaiseDoubleKeywords.proto */ -static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); - -/* ParseKeywords.proto */ -static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject *const *kwvalues, - PyObject **argnames[], - PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, - const char* function_name); - -/* GetException.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb) -static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); -#else -static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb); -#endif - -/* pep479.proto */ -static void __Pyx_Generator_Replace_StopIteration(int in_async_gen); - -/* GetTopmostException.proto */ -#if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE -static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate); -#endif - -/* SaveResetException.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb) -static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); -#define __Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb) -static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); -#else -#define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb) -#define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb) -#endif - -/* IterNext.proto */ -#define __Pyx_PyIter_Next(obj) __Pyx_PyIter_Next2(obj, NULL) -static CYTHON_INLINE PyObject *__Pyx_PyIter_Next2(PyObject *, PyObject *); - -/* ListAppend.proto */ 
-#if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS -static CYTHON_INLINE int __Pyx_PyList_Append(PyObject* list, PyObject* x) { - PyListObject* L = (PyListObject*) list; - Py_ssize_t len = Py_SIZE(list); - if (likely(L->allocated > len) & likely(len > (L->allocated >> 1))) { - Py_INCREF(x); - PyList_SET_ITEM(list, len, x); - __Pyx_SET_SIZE(list, len + 1); - return 0; - } - return PyList_Append(list, x); -} -#else -#define __Pyx_PyList_Append(L,x) PyList_Append(L,x) -#endif - -/* ListCompAppend.proto */ -#if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS -static CYTHON_INLINE int __Pyx_ListComp_Append(PyObject* list, PyObject* x) { - PyListObject* L = (PyListObject*) list; - Py_ssize_t len = Py_SIZE(list); - if (likely(L->allocated > len)) { - Py_INCREF(x); - PyList_SET_ITEM(list, len, x); - __Pyx_SET_SIZE(list, len + 1); - return 0; - } - return PyList_Append(list, x); -} -#else -#define __Pyx_ListComp_Append(L,x) PyList_Append(L,x) -#endif - -/* PyIntBinop.proto */ -#if !CYTHON_COMPILING_IN_PYPY -static PyObject* __Pyx_PyInt_AddObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check); -#else -#define __Pyx_PyInt_AddObjC(op1, op2, intval, inplace, zerodivision_check)\ - (inplace ? 
PyNumber_InPlaceAdd(op1, op2) : PyNumber_Add(op1, op2)) -#endif - -/* RaiseException.proto */ -static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); - -/* AssertionsEnabled.proto */ -#define __Pyx_init_assertions_enabled() -#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) - #define __pyx_assertions_enabled() (1) -#elif PY_VERSION_HEX < 0x03080000 || CYTHON_COMPILING_IN_PYPY || defined(Py_LIMITED_API) - #define __pyx_assertions_enabled() (!Py_OptimizeFlag) -#elif CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030900A6 - static int __pyx_assertions_enabled_flag; - #define __pyx_assertions_enabled() (__pyx_assertions_enabled_flag) - #undef __Pyx_init_assertions_enabled - static void __Pyx_init_assertions_enabled(void) { - __pyx_assertions_enabled_flag = ! _PyInterpreterState_GetConfig(__Pyx_PyThreadState_Current->interp)->optimization_level; - } -#else - #define __pyx_assertions_enabled() (!Py_OptimizeFlag) -#endif - -/* SetItemInt.proto */ -#define __Pyx_SetItemInt(o, i, v, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ - (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ - __Pyx_SetItemInt_Fast(o, (Py_ssize_t)i, v, is_list, wraparound, boundscheck) :\ - (is_list ? 
(PyErr_SetString(PyExc_IndexError, "list assignment index out of range"), -1) :\ - __Pyx_SetItemInt_Generic(o, to_py_func(i), v))) -static int __Pyx_SetItemInt_Generic(PyObject *o, PyObject *j, PyObject *v); -static CYTHON_INLINE int __Pyx_SetItemInt_Fast(PyObject *o, Py_ssize_t i, PyObject *v, - int is_list, int wraparound, int boundscheck); - -/* ModInt[long].proto */ -static CYTHON_INLINE long __Pyx_mod_long(long, long); - -/* IncludeStructmemberH.proto */ -#include <structmember.h> - -/* FixUpExtensionType.proto */ -#if CYTHON_USE_TYPE_SPECS -static int __Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type); -#endif - -/* PyObjectCallNoArg.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func); - -/* PyObjectCallOneArg.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); - -/* PyObjectGetMethod.proto */ -static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method); - -/* PyObjectCallMethod0.proto */ -static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name); - -/* ValidateBasesTuple.proto */ -#if CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API || CYTHON_USE_TYPE_SPECS -static int __Pyx_validate_bases_tuple(const char *type_name, Py_ssize_t dictoffset, PyObject *bases); -#endif - -/* PyType_Ready.proto */ -CYTHON_UNUSED static int __Pyx_PyType_Ready(PyTypeObject *t); - -/* PyObject_GenericGetAttrNoDict.proto */ -#if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 -static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name); -#else -#define __Pyx_PyObject_GenericGetAttrNoDict PyObject_GenericGetAttr -#endif - -/* FastTypeChecks.proto */ -#if CYTHON_COMPILING_IN_CPYTHON -#define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) -#define __Pyx_TypeCheck2(obj, type1, type2) __Pyx_IsAnySubtype2(Py_TYPE(obj), (PyTypeObject 
*)type1, (PyTypeObject *)type2) -static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); -static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b); -static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); -static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); -#else -#define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) -#define __Pyx_TypeCheck2(obj, type1, type2) (PyObject_TypeCheck(obj, (PyTypeObject *)type1) || PyObject_TypeCheck(obj, (PyTypeObject *)type2)) -#define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) -#define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) -#endif -#define __Pyx_PyErr_ExceptionMatches2(err1, err2) __Pyx_PyErr_GivenExceptionMatches2(__Pyx_PyErr_CurrentExceptionType(), err1, err2) -#define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) - -/* Import.proto */ -static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); - -/* ImportFrom.proto */ -static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name); - -/* ImportDottedModule.proto */ -static PyObject *__Pyx_ImportDottedModule(PyObject *name, PyObject *parts_tuple); -#if PY_MAJOR_VERSION >= 3 -static PyObject *__Pyx_ImportDottedModule_WalkParts(PyObject *module, PyObject *name, PyObject *parts_tuple); -#endif - -/* pybytes_as_double.proto */ -static double __Pyx_SlowPyString_AsDouble(PyObject *obj); -static double __Pyx__PyBytes_AsDouble(PyObject *obj, const char* start, Py_ssize_t length); -static CYTHON_INLINE double __Pyx_PyBytes_AsDouble(PyObject *obj) { - return __Pyx__PyBytes_AsDouble(obj, PyBytes_AS_STRING(obj), PyBytes_GET_SIZE(obj)); -} -static CYTHON_INLINE double __Pyx_PyByteArray_AsDouble(PyObject *obj) { - return 
__Pyx__PyBytes_AsDouble(obj, PyByteArray_AS_STRING(obj), PyByteArray_GET_SIZE(obj)); -} - -/* pyunicode_as_double.proto */ -#if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY -static const char* __Pyx__PyUnicode_AsDouble_Copy(const void* data, const int kind, char* buffer, Py_ssize_t start, Py_ssize_t end) { - int last_was_punctuation; - Py_ssize_t i; - last_was_punctuation = 1; - for (i=start; i <= end; i++) { - Py_UCS4 chr = PyUnicode_READ(kind, data, i); - int is_punctuation = (chr == '_') | (chr == '.'); - *buffer = (char)chr; - buffer += (chr != '_'); - if (unlikely(chr > 127)) goto parse_failure; - if (unlikely(last_was_punctuation & is_punctuation)) goto parse_failure; - last_was_punctuation = is_punctuation; - } - if (unlikely(last_was_punctuation)) goto parse_failure; - *buffer = '\0'; - return buffer; -parse_failure: - return NULL; -} -static double __Pyx__PyUnicode_AsDouble_inf_nan(const void* data, int kind, Py_ssize_t start, Py_ssize_t length) { - int matches = 1; - Py_UCS4 chr; - Py_UCS4 sign = PyUnicode_READ(kind, data, start); - int is_signed = (sign == '-') | (sign == '+'); - start += is_signed; - length -= is_signed; - switch (PyUnicode_READ(kind, data, start)) { - #ifdef Py_NAN - case 'n': - case 'N': - if (unlikely(length != 3)) goto parse_failure; - chr = PyUnicode_READ(kind, data, start+1); - matches &= (chr == 'a') | (chr == 'A'); - chr = PyUnicode_READ(kind, data, start+2); - matches &= (chr == 'n') | (chr == 'N'); - if (unlikely(!matches)) goto parse_failure; - return (sign == '-') ? -Py_NAN : Py_NAN; - #endif - case 'i': - case 'I': - if (unlikely(length < 3)) goto parse_failure; - chr = PyUnicode_READ(kind, data, start+1); - matches &= (chr == 'n') | (chr == 'N'); - chr = PyUnicode_READ(kind, data, start+2); - matches &= (chr == 'f') | (chr == 'F'); - if (likely(length == 3 && matches)) - return (sign == '-') ? 
-Py_HUGE_VAL : Py_HUGE_VAL; - if (unlikely(length != 8)) goto parse_failure; - chr = PyUnicode_READ(kind, data, start+3); - matches &= (chr == 'i') | (chr == 'I'); - chr = PyUnicode_READ(kind, data, start+4); - matches &= (chr == 'n') | (chr == 'N'); - chr = PyUnicode_READ(kind, data, start+5); - matches &= (chr == 'i') | (chr == 'I'); - chr = PyUnicode_READ(kind, data, start+6); - matches &= (chr == 't') | (chr == 'T'); - chr = PyUnicode_READ(kind, data, start+7); - matches &= (chr == 'y') | (chr == 'Y'); - if (unlikely(!matches)) goto parse_failure; - return (sign == '-') ? -Py_HUGE_VAL : Py_HUGE_VAL; - case '.': case '0': case '1': case '2': case '3': case '4': case '5': case '6': case '7': case '8': case '9': - break; - default: - goto parse_failure; - } - return 0.0; -parse_failure: - return -1.0; -} -static double __Pyx_PyUnicode_AsDouble_WithSpaces(PyObject *obj) { - double value; - const char *last; - char *end; - Py_ssize_t start, length = PyUnicode_GET_LENGTH(obj); - const int kind = PyUnicode_KIND(obj); - const void* data = PyUnicode_DATA(obj); - start = 0; - while (Py_UNICODE_ISSPACE(PyUnicode_READ(kind, data, start))) - start++; - while (start < length - 1 && Py_UNICODE_ISSPACE(PyUnicode_READ(kind, data, length - 1))) - length--; - length -= start; - if (unlikely(length <= 0)) goto fallback; - value = __Pyx__PyUnicode_AsDouble_inf_nan(data, kind, start, length); - if (unlikely(value == -1.0)) goto fallback; - if (value != 0.0) return value; - if (length < 40) { - char number[40]; - last = __Pyx__PyUnicode_AsDouble_Copy(data, kind, number, start, start + length); - if (unlikely(!last)) goto fallback; - value = PyOS_string_to_double(number, &end, NULL); - } else { - char *number = (char*) PyMem_Malloc((length + 1) * sizeof(char)); - if (unlikely(!number)) goto fallback; - last = __Pyx__PyUnicode_AsDouble_Copy(data, kind, number, start, start + length); - if (unlikely(!last)) { - PyMem_Free(number); - goto fallback; - } - value = 
PyOS_string_to_double(number, &end, NULL); - PyMem_Free(number); - } - if (likely(end == last) || (value == (double)-1 && PyErr_Occurred())) { - return value; - } -fallback: - return __Pyx_SlowPyString_AsDouble(obj); -} -#endif -static CYTHON_INLINE double __Pyx_PyUnicode_AsDouble(PyObject *obj) { -#if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY - if (unlikely(__Pyx_PyUnicode_READY(obj) == -1)) - return (double)-1; - if (likely(PyUnicode_IS_ASCII(obj))) { - const char *s; - Py_ssize_t length; - s = PyUnicode_AsUTF8AndSize(obj, &length); - return __Pyx__PyBytes_AsDouble(obj, s, length); - } - return __Pyx_PyUnicode_AsDouble_WithSpaces(obj); -#else - return __Pyx_SlowPyString_AsDouble(obj); -#endif -} - -/* FetchSharedCythonModule.proto */ -static PyObject *__Pyx_FetchSharedCythonABIModule(void); - -/* FetchCommonType.proto */ -#if !CYTHON_USE_TYPE_SPECS -static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type); -#else -static PyTypeObject* __Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases); -#endif - -/* PyMethodNew.proto */ -#if PY_MAJOR_VERSION >= 3 -static PyObject *__Pyx_PyMethod_New(PyObject *func, PyObject *self, PyObject *typ) { - CYTHON_UNUSED_VAR(typ); - if (!self) - return __Pyx_NewRef(func); - return PyMethod_New(func, self); -} -#else - #define __Pyx_PyMethod_New PyMethod_New -#endif - -/* PyVectorcallFastCallDict.proto */ -#if CYTHON_METH_FASTCALL -static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw); -#endif - -/* CythonFunctionShared.proto */ -#define __Pyx_CyFunction_USED -#define __Pyx_CYFUNCTION_STATICMETHOD 0x01 -#define __Pyx_CYFUNCTION_CLASSMETHOD 0x02 -#define __Pyx_CYFUNCTION_CCLASS 0x04 -#define __Pyx_CYFUNCTION_COROUTINE 0x08 -#define __Pyx_CyFunction_GetClosure(f)\ - (((__pyx_CyFunctionObject *) (f))->func_closure) -#if PY_VERSION_HEX < 0x030900B1 - #define __Pyx_CyFunction_GetClassObj(f)\ - 
(((__pyx_CyFunctionObject *) (f))->func_classobj) -#else - #define __Pyx_CyFunction_GetClassObj(f)\ - ((PyObject*) ((PyCMethodObject *) (f))->mm_class) -#endif -#define __Pyx_CyFunction_SetClassObj(f, classobj)\ - __Pyx__CyFunction_SetClassObj((__pyx_CyFunctionObject *) (f), (classobj)) -#define __Pyx_CyFunction_Defaults(type, f)\ - ((type *)(((__pyx_CyFunctionObject *) (f))->defaults)) -#define __Pyx_CyFunction_SetDefaultsGetter(f, g)\ - ((__pyx_CyFunctionObject *) (f))->defaults_getter = (g) -typedef struct { -#if PY_VERSION_HEX < 0x030900B1 - PyCFunctionObject func; -#else - PyCMethodObject func; -#endif -#if CYTHON_BACKPORT_VECTORCALL - __pyx_vectorcallfunc func_vectorcall; -#endif -#if PY_VERSION_HEX < 0x030500A0 - PyObject *func_weakreflist; -#endif - PyObject *func_dict; - PyObject *func_name; - PyObject *func_qualname; - PyObject *func_doc; - PyObject *func_globals; - PyObject *func_code; - PyObject *func_closure; -#if PY_VERSION_HEX < 0x030900B1 - PyObject *func_classobj; -#endif - void *defaults; - int defaults_pyobjects; - size_t defaults_size; // used by FusedFunction for copying defaults - int flags; - PyObject *defaults_tuple; - PyObject *defaults_kwdict; - PyObject *(*defaults_getter)(PyObject *); - PyObject *func_annotations; - PyObject *func_is_coroutine; -} __pyx_CyFunctionObject; -#define __Pyx_CyFunction_Check(obj) __Pyx_TypeCheck(obj, __pyx_CyFunctionType) -#define __Pyx_IsCyOrPyCFunction(obj) __Pyx_TypeCheck2(obj, __pyx_CyFunctionType, &PyCFunction_Type) -#define __Pyx_CyFunction_CheckExact(obj) __Pyx_IS_TYPE(obj, __pyx_CyFunctionType) -static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject* op, PyMethodDef *ml, - int flags, PyObject* qualname, - PyObject *closure, - PyObject *module, PyObject *globals, - PyObject* code); -static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj); -static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *m, - size_t size, - int pyobjects); -static 
CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *m, - PyObject *tuple); -static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *m, - PyObject *dict); -static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *m, - PyObject *dict); -static int __pyx_CyFunction_init(PyObject *module); -#if CYTHON_METH_FASTCALL -static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); -static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); -static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); -static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); -#if CYTHON_BACKPORT_VECTORCALL -#define __Pyx_CyFunction_func_vectorcall(f) (((__pyx_CyFunctionObject*)f)->func_vectorcall) -#else -#define __Pyx_CyFunction_func_vectorcall(f) (((PyCFunctionObject*)f)->vectorcall) -#endif -#endif - -/* CythonFunction.proto */ -static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml, - int flags, PyObject* qualname, - PyObject *closure, - PyObject *module, PyObject *globals, - PyObject* code); - -/* CLineInTraceback.proto */ -#ifdef CYTHON_CLINE_IN_TRACEBACK -#define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? 
c_line : 0) -#else -static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); -#endif - -/* CodeObjectCache.proto */ -#if !CYTHON_COMPILING_IN_LIMITED_API -typedef struct { - PyCodeObject* code_object; - int code_line; -} __Pyx_CodeObjectCacheEntry; -struct __Pyx_CodeObjectCache { - int count; - int max_count; - __Pyx_CodeObjectCacheEntry* entries; -}; -static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; -static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); -static PyCodeObject *__pyx_find_code_object(int code_line); -static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); -#endif - -/* AddTraceback.proto */ -static void __Pyx_AddTraceback(const char *funcname, int c_line, - int py_line, const char *filename); - -/* RealImag.proto */ -#if CYTHON_CCOMPLEX - #ifdef __cplusplus - #define __Pyx_CREAL(z) ((z).real()) - #define __Pyx_CIMAG(z) ((z).imag()) - #else - #define __Pyx_CREAL(z) (__real__(z)) - #define __Pyx_CIMAG(z) (__imag__(z)) - #endif -#else - #define __Pyx_CREAL(z) ((z).real) - #define __Pyx_CIMAG(z) ((z).imag) -#endif -#if defined(__cplusplus) && CYTHON_CCOMPLEX\ - && (defined(_WIN32) || defined(__clang__) || (defined(__GNUC__) && (__GNUC__ >= 5 || __GNUC__ == 4 && __GNUC_MINOR__ >= 4 )) || __cplusplus >= 201103) - #define __Pyx_SET_CREAL(z,x) ((z).real(x)) - #define __Pyx_SET_CIMAG(z,y) ((z).imag(y)) -#else - #define __Pyx_SET_CREAL(z,x) __Pyx_CREAL(z) = (x) - #define __Pyx_SET_CIMAG(z,y) __Pyx_CIMAG(z) = (y) -#endif - -/* Arithmetic.proto */ -#if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) - #define __Pyx_c_eq_double(a, b) ((a)==(b)) - #define __Pyx_c_sum_double(a, b) ((a)+(b)) - #define __Pyx_c_diff_double(a, b) ((a)-(b)) - #define __Pyx_c_prod_double(a, b) ((a)*(b)) - #define __Pyx_c_quot_double(a, b) ((a)/(b)) - #define __Pyx_c_neg_double(a) (-(a)) - #ifdef __cplusplus - #define __Pyx_c_is_zero_double(z) ((z)==(double)0) - #define 
__Pyx_c_conj_double(z) (::std::conj(z)) - #if 1 - #define __Pyx_c_abs_double(z) (::std::abs(z)) - #define __Pyx_c_pow_double(a, b) (::std::pow(a, b)) - #endif - #else - #define __Pyx_c_is_zero_double(z) ((z)==0) - #define __Pyx_c_conj_double(z) (conj(z)) - #if 1 - #define __Pyx_c_abs_double(z) (cabs(z)) - #define __Pyx_c_pow_double(a, b) (cpow(a, b)) - #endif - #endif -#else - static CYTHON_INLINE int __Pyx_c_eq_double(__pyx_t_double_complex, __pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_sum_double(__pyx_t_double_complex, __pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_diff_double(__pyx_t_double_complex, __pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_prod_double(__pyx_t_double_complex, __pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex, __pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_neg_double(__pyx_t_double_complex); - static CYTHON_INLINE int __Pyx_c_is_zero_double(__pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_conj_double(__pyx_t_double_complex); - #if 1 - static CYTHON_INLINE double __Pyx_c_abs_double(__pyx_t_double_complex); - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_pow_double(__pyx_t_double_complex, __pyx_t_double_complex); - #endif -#endif - -/* FromPy.proto */ -static __pyx_t_double_complex __Pyx_PyComplex_As___pyx_t_double_complex(PyObject*); - -/* GCCDiagnostics.proto */ -#if !defined(__INTEL_COMPILER) && defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 6)) -#define __Pyx_HAS_GCC_DIAGNOSTIC -#endif - -/* ToPy.proto */ -#define __pyx_PyComplex_FromComplex(z)\ - PyComplex_FromDoubles((double)__Pyx_CREAL(z),\ - (double)__Pyx_CIMAG(z)) - -/* CIntFromPy.proto */ -static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); - -/* CIntToPy.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long 
value); - -/* CIntToPy.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value); - -/* CIntFromPy.proto */ -static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); - -/* FormatTypeName.proto */ -#if CYTHON_COMPILING_IN_LIMITED_API -typedef PyObject *__Pyx_TypeName; -#define __Pyx_FMT_TYPENAME "%U" -static __Pyx_TypeName __Pyx_PyType_GetName(PyTypeObject* tp); -#define __Pyx_DECREF_TypeName(obj) Py_XDECREF(obj) -#else -typedef const char *__Pyx_TypeName; -#define __Pyx_FMT_TYPENAME "%.200s" -#define __Pyx_PyType_GetName(tp) ((tp)->tp_name) -#define __Pyx_DECREF_TypeName(obj) -#endif - -/* SwapException.proto */ -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_ExceptionSwap(type, value, tb) __Pyx__ExceptionSwap(__pyx_tstate, type, value, tb) -static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); -#else -static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb); -#endif - -/* PyObjectCall2Args.proto */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2); - -/* PyObjectCallMethod1.proto */ -static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg); - -/* CoroutineBase.proto */ -struct __pyx_CoroutineObject; -typedef PyObject *(*__pyx_coroutine_body_t)(struct __pyx_CoroutineObject *, PyThreadState *, PyObject *); -#if CYTHON_USE_EXC_INFO_STACK -#define __Pyx_ExcInfoStruct _PyErr_StackItem -#else -typedef struct { - PyObject *exc_type; - PyObject *exc_value; - PyObject *exc_traceback; -} __Pyx_ExcInfoStruct; -#endif -typedef struct __pyx_CoroutineObject { - PyObject_HEAD - __pyx_coroutine_body_t body; - PyObject *closure; - __Pyx_ExcInfoStruct gi_exc_state; - PyObject *gi_weakreflist; - PyObject *classobj; - PyObject *yieldfrom; - PyObject *gi_name; - PyObject *gi_qualname; - PyObject *gi_modulename; - PyObject *gi_code; - PyObject *gi_frame; - int 
resume_label; - char is_running; -} __pyx_CoroutineObject; -static __pyx_CoroutineObject *__Pyx__Coroutine_New( - PyTypeObject *type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, - PyObject *name, PyObject *qualname, PyObject *module_name); -static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit( - __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, - PyObject *name, PyObject *qualname, PyObject *module_name); -static CYTHON_INLINE void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *self); -static int __Pyx_Coroutine_clear(PyObject *self); -static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value); -static PyObject *__Pyx_Coroutine_Close(PyObject *self); -static PyObject *__Pyx_Coroutine_Throw(PyObject *gen, PyObject *args); -#if CYTHON_USE_EXC_INFO_STACK -#define __Pyx_Coroutine_SwapException(self) -#define __Pyx_Coroutine_ResetAndClearException(self) __Pyx_Coroutine_ExceptionClear(&(self)->gi_exc_state) -#else -#define __Pyx_Coroutine_SwapException(self) {\ - __Pyx_ExceptionSwap(&(self)->gi_exc_state.exc_type, &(self)->gi_exc_state.exc_value, &(self)->gi_exc_state.exc_traceback);\ - __Pyx_Coroutine_ResetFrameBackpointer(&(self)->gi_exc_state);\ - } -#define __Pyx_Coroutine_ResetAndClearException(self) {\ - __Pyx_ExceptionReset((self)->gi_exc_state.exc_type, (self)->gi_exc_state.exc_value, (self)->gi_exc_state.exc_traceback);\ - (self)->gi_exc_state.exc_type = (self)->gi_exc_state.exc_value = (self)->gi_exc_state.exc_traceback = NULL;\ - } -#endif -#if CYTHON_FAST_THREAD_STATE -#define __Pyx_PyGen_FetchStopIterationValue(pvalue)\ - __Pyx_PyGen__FetchStopIterationValue(__pyx_tstate, pvalue) -#else -#define __Pyx_PyGen_FetchStopIterationValue(pvalue)\ - __Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, pvalue) -#endif -static int __Pyx_PyGen__FetchStopIterationValue(PyThreadState *tstate, PyObject **pvalue); -static CYTHON_INLINE void 
__Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state); - -/* PatchModuleWithCoroutine.proto */ -static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code); - -/* PatchGeneratorABC.proto */ -static int __Pyx_patch_abc(void); - -/* Generator.proto */ -#define __Pyx_Generator_USED -#define __Pyx_Generator_CheckExact(obj) __Pyx_IS_TYPE(obj, __pyx_GeneratorType) -#define __Pyx_Generator_New(body, code, closure, name, qualname, module_name)\ - __Pyx__Coroutine_New(__pyx_GeneratorType, body, code, closure, name, qualname, module_name) -static PyObject *__Pyx_Generator_Next(PyObject *self); -static int __pyx_Generator_init(PyObject *module); - -/* CheckBinaryVersion.proto */ -static int __Pyx_check_binary_version(void); - -/* InitStrings.proto */ -static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); - -/* #### Code section: module_declarations ### */ - -/* Module declarations from "cython" */ - -/* Module declarations from "fontTools.cu2qu.cu2qu" */ -static CYTHON_INLINE double __pyx_f_9fontTools_5cu2qu_5cu2qu_dot(__pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_points(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_parameters(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_n_iter(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, PyObject *); /*proto*/ -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE PyObject 
*__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_three(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE __pyx_t_double_complex __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_control(double, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static CYTHON_INLINE __pyx_t_double_complex __pyx_f_9fontTools_5cu2qu_5cu2qu_calc_intersect(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex); /*proto*/ -static int __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, __pyx_t_double_complex, double); /*proto*/ -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_quadratic(PyObject *, double); /*proto*/ -static PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_spline(PyObject *, int, double, int); /*proto*/ -/* #### Code section: typeinfo ### */ -/* #### Code section: before_global_var ### */ -#define __Pyx_MODULE_NAME "fontTools.cu2qu.cu2qu" -extern int __pyx_module_is_main_fontTools__cu2qu__cu2qu; -int __pyx_module_is_main_fontTools__cu2qu__cu2qu = 0; - -/* Implementation of "fontTools.cu2qu.cu2qu" */ -/* #### Code section: global_var ### */ -static PyObject *__pyx_builtin_AttributeError; -static PyObject *__pyx_builtin_ImportError; -static PyObject *__pyx_builtin_range; -static PyObject *__pyx_builtin_ZeroDivisionError; -static PyObject *__pyx_builtin_AssertionError; -/* #### Code section: string_decls ### */ -static const char __pyx_k_a[] = "a"; -static const char __pyx_k_b[] = "b"; -static const char __pyx_k_c[] = "c"; -static const char __pyx_k_d[] = "d"; -static const char __pyx_k_i[] = "i"; -static const char __pyx_k_l[] = "l"; -static const char __pyx_k_n[] = "n"; -static const char __pyx_k_p[] = "p"; -static const char __pyx_k_s[] = "s"; -static const char __pyx_k__2[] = "."; 
-static const char __pyx_k__3[] = "*"; -static const char __pyx_k__9[] = "?"; -static const char __pyx_k_a1[] = "a1"; -static const char __pyx_k_b1[] = "b1"; -static const char __pyx_k_c1[] = "c1"; -static const char __pyx_k_d1[] = "d1"; -static const char __pyx_k_dt[] = "dt"; -static const char __pyx_k_gc[] = "gc"; -static const char __pyx_k_p0[] = "p0"; -static const char __pyx_k_p1[] = "p1"; -static const char __pyx_k_p2[] = "p2"; -static const char __pyx_k_p3[] = "p3"; -static const char __pyx_k_t1[] = "t1"; -static const char __pyx_k_NAN[] = "NAN"; -static const char __pyx_k_NaN[] = "NaN"; -static const char __pyx_k_all[] = "__all__"; -static const char __pyx_k_args[] = "args"; -static const char __pyx_k_imag[] = "imag"; -static const char __pyx_k_main[] = "__main__"; -static const char __pyx_k_math[] = "math"; -static const char __pyx_k_name[] = "__name__"; -static const char __pyx_k_real[] = "real"; -static const char __pyx_k_send[] = "send"; -static const char __pyx_k_spec[] = "__spec__"; -static const char __pyx_k_t1_2[] = "t1_2"; -static const char __pyx_k_test[] = "__test__"; -static const char __pyx_k_Error[] = "Error"; -static const char __pyx_k_MAX_N[] = "MAX_N"; -static const char __pyx_k_close[] = "close"; -static const char __pyx_k_curve[] = "curve"; -static const char __pyx_k_isnan[] = "isnan"; -static const char __pyx_k_range[] = "range"; -static const char __pyx_k_throw[] = "throw"; -static const char __pyx_k_curves[] = "curves"; -static const char __pyx_k_cython[] = "cython"; -static const char __pyx_k_enable[] = "enable"; -static const char __pyx_k_errors[] = "errors"; -static const char __pyx_k_import[] = "__import__"; -static const char __pyx_k_last_i[] = "last_i"; -static const char __pyx_k_spline[] = "spline"; -static const char __pyx_k_delta_2[] = "delta_2"; -static const char __pyx_k_delta_3[] = "delta_3"; -static const char __pyx_k_disable[] = "disable"; -static const char __pyx_k_max_err[] = "max_err"; -static const char 
__pyx_k_splines[] = "splines"; -static const char __pyx_k_COMPILED[] = "COMPILED"; -static const char __pyx_k_isenabled[] = "isenabled"; -static const char __pyx_k_Cu2QuError[] = "Cu2QuError"; -static const char __pyx_k_max_errors[] = "max_errors"; -static const char __pyx_k_ImportError[] = "ImportError"; -static const char __pyx_k_initializing[] = "_initializing"; -static const char __pyx_k_is_coroutine[] = "_is_coroutine"; -static const char __pyx_k_all_quadratic[] = "all_quadratic"; -static const char __pyx_k_AssertionError[] = "AssertionError"; -static const char __pyx_k_AttributeError[] = "AttributeError"; -static const char __pyx_k_fontTools_misc[] = "fontTools.misc"; -static const char __pyx_k_ZeroDivisionError[] = "ZeroDivisionError"; -static const char __pyx_k_asyncio_coroutines[] = "asyncio.coroutines"; -static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; -static const char __pyx_k_curve_to_quadratic[] = "curve_to_quadratic"; -static const char __pyx_k_ApproxNotFoundError[] = "ApproxNotFoundError"; -static const char __pyx_k_curves_to_quadratic[] = "curves_to_quadratic"; -static const char __pyx_k_fontTools_cu2qu_cu2qu[] = "fontTools.cu2qu.cu2qu"; -static const char __pyx_k_split_cubic_into_n_gen[] = "_split_cubic_into_n_gen"; -static const char __pyx_k_Lib_fontTools_cu2qu_cu2qu_py[] = "Lib/fontTools/cu2qu/cu2qu.py"; -static const char __pyx_k_curves_to_quadratic_line_474[] = "curves_to_quadratic (line 474)"; -static const char __pyx_k_Return_quadratic_Bezier_splines[] = "Return quadratic Bezier splines approximating the input cubic Beziers.\n\n Args:\n curves: A sequence of *n* curves, each curve being a sequence of four\n 2D tuples.\n max_errors: A sequence of *n* floats representing the maximum permissible\n deviation from each of the cubic Bezier curves.\n all_quadratic (bool): If True (default) returned values are a\n quadratic spline. 
If False, they are either a single quadratic\n curve or a single cubic curve.\n\n Example::\n\n >>> curves_to_quadratic( [\n ... [ (50,50), (100,100), (150,100), (200,50) ],\n ... [ (75,50), (120,100), (150,75), (200,60) ]\n ... ], [1,1] )\n [[(50.0, 50.0), (75.0, 75.0), (125.0, 91.66666666666666), (175.0, 75.0), (200.0, 50.0)], [(75.0, 50.0), (97.5, 75.0), (135.41666666666666, 82.08333333333333), (175.0, 67.5), (200.0, 60.0)]]\n\n The returned splines have \"implied oncurve points\" suitable for use in\n TrueType ``glif`` outlines - i.e. in the first spline returned above,\n the first quadratic segment runs from (50,50) to\n ( (75 + 125)/2 , (120 + 91.666..)/2 ) = (100, 83.333...).\n\n Returns:\n If all_quadratic is True, a list of splines, each spline being a list\n of 2D tuples.\n\n If all_quadratic is False, a list of curves, each curve being a quadratic\n (length 3), or cubic (length 4).\n\n Raises:\n fontTools.cu2qu.Errors.ApproxNotFoundError: if no suitable approximation\n can be found for all curves with the given parameters.\n "; -/* #### Code section: decls ### */ -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu__split_cubic_into_n_gen(CYTHON_UNUSED PyObject *__pyx_self, __pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3, int __pyx_v_n); /* proto */ -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu_3curve_to_quadratic(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_curve, double __pyx_v_max_err, int __pyx_v_all_quadratic); /* proto */ -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu_5curves_to_quadratic(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_curves, PyObject *__pyx_v_max_errors, int __pyx_v_all_quadratic); /* proto */ -static PyObject *__pyx_tp_new_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/ -/* #### Code section: late_includes ### */ -/* #### Code section: 
module_state ### */ -typedef struct { - PyObject *__pyx_d; - PyObject *__pyx_b; - PyObject *__pyx_cython_runtime; - PyObject *__pyx_empty_tuple; - PyObject *__pyx_empty_bytes; - PyObject *__pyx_empty_unicode; - #ifdef __Pyx_CyFunction_USED - PyTypeObject *__pyx_CyFunctionType; - #endif - #ifdef __Pyx_FusedFunction_USED - PyTypeObject *__pyx_FusedFunctionType; - #endif - #ifdef __Pyx_Generator_USED - PyTypeObject *__pyx_GeneratorType; - #endif - #ifdef __Pyx_IterableCoroutine_USED - PyTypeObject *__pyx_IterableCoroutineType; - #endif - #ifdef __Pyx_Coroutine_USED - PyTypeObject *__pyx_CoroutineAwaitType; - #endif - #ifdef __Pyx_Coroutine_USED - PyTypeObject *__pyx_CoroutineType; - #endif - #if CYTHON_USE_MODULE_STATE - #endif - #if CYTHON_USE_MODULE_STATE - PyObject *__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen; - #endif - PyTypeObject *__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen; - PyObject *__pyx_n_s_ApproxNotFoundError; - PyObject *__pyx_n_s_AssertionError; - PyObject *__pyx_n_s_AttributeError; - PyObject *__pyx_n_s_COMPILED; - PyObject *__pyx_n_s_Cu2QuError; - PyObject *__pyx_n_s_Error; - PyObject *__pyx_n_s_ImportError; - PyObject *__pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py; - PyObject *__pyx_n_s_MAX_N; - PyObject *__pyx_n_s_NAN; - PyObject *__pyx_n_u_NaN; - PyObject *__pyx_kp_u_Return_quadratic_Bezier_splines; - PyObject *__pyx_n_s_ZeroDivisionError; - PyObject *__pyx_kp_u__2; - PyObject *__pyx_n_s__3; - PyObject *__pyx_n_s__9; - PyObject *__pyx_n_s_a; - PyObject *__pyx_n_s_a1; - PyObject *__pyx_n_s_all; - PyObject *__pyx_n_s_all_quadratic; - PyObject *__pyx_n_s_args; - PyObject *__pyx_n_s_asyncio_coroutines; - PyObject *__pyx_n_s_b; - PyObject *__pyx_n_s_b1; - PyObject *__pyx_n_s_c; - PyObject *__pyx_n_s_c1; - PyObject *__pyx_n_s_cline_in_traceback; - PyObject *__pyx_n_s_close; - PyObject *__pyx_n_s_curve; - PyObject *__pyx_n_s_curve_to_quadratic; - PyObject *__pyx_n_u_curve_to_quadratic; - 
PyObject *__pyx_n_s_curves; - PyObject *__pyx_n_s_curves_to_quadratic; - PyObject *__pyx_n_u_curves_to_quadratic; - PyObject *__pyx_kp_u_curves_to_quadratic_line_474; - PyObject *__pyx_n_s_cython; - PyObject *__pyx_n_s_d; - PyObject *__pyx_n_s_d1; - PyObject *__pyx_n_s_delta_2; - PyObject *__pyx_n_s_delta_3; - PyObject *__pyx_kp_u_disable; - PyObject *__pyx_n_s_dt; - PyObject *__pyx_kp_u_enable; - PyObject *__pyx_n_s_errors; - PyObject *__pyx_n_s_fontTools_cu2qu_cu2qu; - PyObject *__pyx_n_s_fontTools_misc; - PyObject *__pyx_kp_u_gc; - PyObject *__pyx_n_s_i; - PyObject *__pyx_n_s_imag; - PyObject *__pyx_n_s_import; - PyObject *__pyx_n_s_initializing; - PyObject *__pyx_n_s_is_coroutine; - PyObject *__pyx_kp_u_isenabled; - PyObject *__pyx_n_s_isnan; - PyObject *__pyx_n_s_l; - PyObject *__pyx_n_s_last_i; - PyObject *__pyx_n_s_main; - PyObject *__pyx_n_s_math; - PyObject *__pyx_n_s_max_err; - PyObject *__pyx_n_s_max_errors; - PyObject *__pyx_n_s_n; - PyObject *__pyx_n_s_name; - PyObject *__pyx_n_s_p; - PyObject *__pyx_n_s_p0; - PyObject *__pyx_n_s_p1; - PyObject *__pyx_n_s_p2; - PyObject *__pyx_n_s_p3; - PyObject *__pyx_n_s_range; - PyObject *__pyx_n_s_real; - PyObject *__pyx_n_s_s; - PyObject *__pyx_n_s_send; - PyObject *__pyx_n_s_spec; - PyObject *__pyx_n_s_spline; - PyObject *__pyx_n_s_splines; - PyObject *__pyx_n_s_split_cubic_into_n_gen; - PyObject *__pyx_n_s_t1; - PyObject *__pyx_n_s_t1_2; - PyObject *__pyx_n_s_test; - PyObject *__pyx_n_s_throw; - PyObject *__pyx_int_1; - PyObject *__pyx_int_2; - PyObject *__pyx_int_3; - PyObject *__pyx_int_4; - PyObject *__pyx_int_6; - PyObject *__pyx_int_100; - PyObject *__pyx_codeobj_; - PyObject *__pyx_tuple__4; - PyObject *__pyx_tuple__5; - PyObject *__pyx_tuple__7; - PyObject *__pyx_codeobj__6; - PyObject *__pyx_codeobj__8; -} __pyx_mstate; - -#if CYTHON_USE_MODULE_STATE -#ifdef __cplusplus -namespace { - extern struct PyModuleDef __pyx_moduledef; -} /* anonymous namespace */ -#else -static struct PyModuleDef 
__pyx_moduledef; -#endif - -#define __pyx_mstate(o) ((__pyx_mstate *)__Pyx_PyModule_GetState(o)) - -#define __pyx_mstate_global (__pyx_mstate(PyState_FindModule(&__pyx_moduledef))) - -#define __pyx_m (PyState_FindModule(&__pyx_moduledef)) -#else -static __pyx_mstate __pyx_mstate_global_static = -#ifdef __cplusplus - {}; -#else - {0}; -#endif -static __pyx_mstate *__pyx_mstate_global = &__pyx_mstate_global_static; -#endif -/* #### Code section: module_state_clear ### */ -#if CYTHON_USE_MODULE_STATE -static int __pyx_m_clear(PyObject *m) { - __pyx_mstate *clear_module_state = __pyx_mstate(m); - if (!clear_module_state) return 0; - Py_CLEAR(clear_module_state->__pyx_d); - Py_CLEAR(clear_module_state->__pyx_b); - Py_CLEAR(clear_module_state->__pyx_cython_runtime); - Py_CLEAR(clear_module_state->__pyx_empty_tuple); - Py_CLEAR(clear_module_state->__pyx_empty_bytes); - Py_CLEAR(clear_module_state->__pyx_empty_unicode); - #ifdef __Pyx_CyFunction_USED - Py_CLEAR(clear_module_state->__pyx_CyFunctionType); - #endif - #ifdef __Pyx_FusedFunction_USED - Py_CLEAR(clear_module_state->__pyx_FusedFunctionType); - #endif - Py_CLEAR(clear_module_state->__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen); - Py_CLEAR(clear_module_state->__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen); - Py_CLEAR(clear_module_state->__pyx_n_s_ApproxNotFoundError); - Py_CLEAR(clear_module_state->__pyx_n_s_AssertionError); - Py_CLEAR(clear_module_state->__pyx_n_s_AttributeError); - Py_CLEAR(clear_module_state->__pyx_n_s_COMPILED); - Py_CLEAR(clear_module_state->__pyx_n_s_Cu2QuError); - Py_CLEAR(clear_module_state->__pyx_n_s_Error); - Py_CLEAR(clear_module_state->__pyx_n_s_ImportError); - Py_CLEAR(clear_module_state->__pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py); - Py_CLEAR(clear_module_state->__pyx_n_s_MAX_N); - Py_CLEAR(clear_module_state->__pyx_n_s_NAN); - Py_CLEAR(clear_module_state->__pyx_n_u_NaN); - 
Py_CLEAR(clear_module_state->__pyx_kp_u_Return_quadratic_Bezier_splines); - Py_CLEAR(clear_module_state->__pyx_n_s_ZeroDivisionError); - Py_CLEAR(clear_module_state->__pyx_kp_u__2); - Py_CLEAR(clear_module_state->__pyx_n_s__3); - Py_CLEAR(clear_module_state->__pyx_n_s__9); - Py_CLEAR(clear_module_state->__pyx_n_s_a); - Py_CLEAR(clear_module_state->__pyx_n_s_a1); - Py_CLEAR(clear_module_state->__pyx_n_s_all); - Py_CLEAR(clear_module_state->__pyx_n_s_all_quadratic); - Py_CLEAR(clear_module_state->__pyx_n_s_args); - Py_CLEAR(clear_module_state->__pyx_n_s_asyncio_coroutines); - Py_CLEAR(clear_module_state->__pyx_n_s_b); - Py_CLEAR(clear_module_state->__pyx_n_s_b1); - Py_CLEAR(clear_module_state->__pyx_n_s_c); - Py_CLEAR(clear_module_state->__pyx_n_s_c1); - Py_CLEAR(clear_module_state->__pyx_n_s_cline_in_traceback); - Py_CLEAR(clear_module_state->__pyx_n_s_close); - Py_CLEAR(clear_module_state->__pyx_n_s_curve); - Py_CLEAR(clear_module_state->__pyx_n_s_curve_to_quadratic); - Py_CLEAR(clear_module_state->__pyx_n_u_curve_to_quadratic); - Py_CLEAR(clear_module_state->__pyx_n_s_curves); - Py_CLEAR(clear_module_state->__pyx_n_s_curves_to_quadratic); - Py_CLEAR(clear_module_state->__pyx_n_u_curves_to_quadratic); - Py_CLEAR(clear_module_state->__pyx_kp_u_curves_to_quadratic_line_474); - Py_CLEAR(clear_module_state->__pyx_n_s_cython); - Py_CLEAR(clear_module_state->__pyx_n_s_d); - Py_CLEAR(clear_module_state->__pyx_n_s_d1); - Py_CLEAR(clear_module_state->__pyx_n_s_delta_2); - Py_CLEAR(clear_module_state->__pyx_n_s_delta_3); - Py_CLEAR(clear_module_state->__pyx_kp_u_disable); - Py_CLEAR(clear_module_state->__pyx_n_s_dt); - Py_CLEAR(clear_module_state->__pyx_kp_u_enable); - Py_CLEAR(clear_module_state->__pyx_n_s_errors); - Py_CLEAR(clear_module_state->__pyx_n_s_fontTools_cu2qu_cu2qu); - Py_CLEAR(clear_module_state->__pyx_n_s_fontTools_misc); - Py_CLEAR(clear_module_state->__pyx_kp_u_gc); - Py_CLEAR(clear_module_state->__pyx_n_s_i); - Py_CLEAR(clear_module_state->__pyx_n_s_imag); 
- Py_CLEAR(clear_module_state->__pyx_n_s_import); - Py_CLEAR(clear_module_state->__pyx_n_s_initializing); - Py_CLEAR(clear_module_state->__pyx_n_s_is_coroutine); - Py_CLEAR(clear_module_state->__pyx_kp_u_isenabled); - Py_CLEAR(clear_module_state->__pyx_n_s_isnan); - Py_CLEAR(clear_module_state->__pyx_n_s_l); - Py_CLEAR(clear_module_state->__pyx_n_s_last_i); - Py_CLEAR(clear_module_state->__pyx_n_s_main); - Py_CLEAR(clear_module_state->__pyx_n_s_math); - Py_CLEAR(clear_module_state->__pyx_n_s_max_err); - Py_CLEAR(clear_module_state->__pyx_n_s_max_errors); - Py_CLEAR(clear_module_state->__pyx_n_s_n); - Py_CLEAR(clear_module_state->__pyx_n_s_name); - Py_CLEAR(clear_module_state->__pyx_n_s_p); - Py_CLEAR(clear_module_state->__pyx_n_s_p0); - Py_CLEAR(clear_module_state->__pyx_n_s_p1); - Py_CLEAR(clear_module_state->__pyx_n_s_p2); - Py_CLEAR(clear_module_state->__pyx_n_s_p3); - Py_CLEAR(clear_module_state->__pyx_n_s_range); - Py_CLEAR(clear_module_state->__pyx_n_s_real); - Py_CLEAR(clear_module_state->__pyx_n_s_s); - Py_CLEAR(clear_module_state->__pyx_n_s_send); - Py_CLEAR(clear_module_state->__pyx_n_s_spec); - Py_CLEAR(clear_module_state->__pyx_n_s_spline); - Py_CLEAR(clear_module_state->__pyx_n_s_splines); - Py_CLEAR(clear_module_state->__pyx_n_s_split_cubic_into_n_gen); - Py_CLEAR(clear_module_state->__pyx_n_s_t1); - Py_CLEAR(clear_module_state->__pyx_n_s_t1_2); - Py_CLEAR(clear_module_state->__pyx_n_s_test); - Py_CLEAR(clear_module_state->__pyx_n_s_throw); - Py_CLEAR(clear_module_state->__pyx_int_1); - Py_CLEAR(clear_module_state->__pyx_int_2); - Py_CLEAR(clear_module_state->__pyx_int_3); - Py_CLEAR(clear_module_state->__pyx_int_4); - Py_CLEAR(clear_module_state->__pyx_int_6); - Py_CLEAR(clear_module_state->__pyx_int_100); - Py_CLEAR(clear_module_state->__pyx_codeobj_); - Py_CLEAR(clear_module_state->__pyx_tuple__4); - Py_CLEAR(clear_module_state->__pyx_tuple__5); - Py_CLEAR(clear_module_state->__pyx_tuple__7); - Py_CLEAR(clear_module_state->__pyx_codeobj__6); - 
Py_CLEAR(clear_module_state->__pyx_codeobj__8); - return 0; -} -#endif -/* #### Code section: module_state_traverse ### */ -#if CYTHON_USE_MODULE_STATE -static int __pyx_m_traverse(PyObject *m, visitproc visit, void *arg) { - __pyx_mstate *traverse_module_state = __pyx_mstate(m); - if (!traverse_module_state) return 0; - Py_VISIT(traverse_module_state->__pyx_d); - Py_VISIT(traverse_module_state->__pyx_b); - Py_VISIT(traverse_module_state->__pyx_cython_runtime); - Py_VISIT(traverse_module_state->__pyx_empty_tuple); - Py_VISIT(traverse_module_state->__pyx_empty_bytes); - Py_VISIT(traverse_module_state->__pyx_empty_unicode); - #ifdef __Pyx_CyFunction_USED - Py_VISIT(traverse_module_state->__pyx_CyFunctionType); - #endif - #ifdef __Pyx_FusedFunction_USED - Py_VISIT(traverse_module_state->__pyx_FusedFunctionType); - #endif - Py_VISIT(traverse_module_state->__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen); - Py_VISIT(traverse_module_state->__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen); - Py_VISIT(traverse_module_state->__pyx_n_s_ApproxNotFoundError); - Py_VISIT(traverse_module_state->__pyx_n_s_AssertionError); - Py_VISIT(traverse_module_state->__pyx_n_s_AttributeError); - Py_VISIT(traverse_module_state->__pyx_n_s_COMPILED); - Py_VISIT(traverse_module_state->__pyx_n_s_Cu2QuError); - Py_VISIT(traverse_module_state->__pyx_n_s_Error); - Py_VISIT(traverse_module_state->__pyx_n_s_ImportError); - Py_VISIT(traverse_module_state->__pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py); - Py_VISIT(traverse_module_state->__pyx_n_s_MAX_N); - Py_VISIT(traverse_module_state->__pyx_n_s_NAN); - Py_VISIT(traverse_module_state->__pyx_n_u_NaN); - Py_VISIT(traverse_module_state->__pyx_kp_u_Return_quadratic_Bezier_splines); - Py_VISIT(traverse_module_state->__pyx_n_s_ZeroDivisionError); - Py_VISIT(traverse_module_state->__pyx_kp_u__2); - Py_VISIT(traverse_module_state->__pyx_n_s__3); - Py_VISIT(traverse_module_state->__pyx_n_s__9); - 
Py_VISIT(traverse_module_state->__pyx_n_s_a); - Py_VISIT(traverse_module_state->__pyx_n_s_a1); - Py_VISIT(traverse_module_state->__pyx_n_s_all); - Py_VISIT(traverse_module_state->__pyx_n_s_all_quadratic); - Py_VISIT(traverse_module_state->__pyx_n_s_args); - Py_VISIT(traverse_module_state->__pyx_n_s_asyncio_coroutines); - Py_VISIT(traverse_module_state->__pyx_n_s_b); - Py_VISIT(traverse_module_state->__pyx_n_s_b1); - Py_VISIT(traverse_module_state->__pyx_n_s_c); - Py_VISIT(traverse_module_state->__pyx_n_s_c1); - Py_VISIT(traverse_module_state->__pyx_n_s_cline_in_traceback); - Py_VISIT(traverse_module_state->__pyx_n_s_close); - Py_VISIT(traverse_module_state->__pyx_n_s_curve); - Py_VISIT(traverse_module_state->__pyx_n_s_curve_to_quadratic); - Py_VISIT(traverse_module_state->__pyx_n_u_curve_to_quadratic); - Py_VISIT(traverse_module_state->__pyx_n_s_curves); - Py_VISIT(traverse_module_state->__pyx_n_s_curves_to_quadratic); - Py_VISIT(traverse_module_state->__pyx_n_u_curves_to_quadratic); - Py_VISIT(traverse_module_state->__pyx_kp_u_curves_to_quadratic_line_474); - Py_VISIT(traverse_module_state->__pyx_n_s_cython); - Py_VISIT(traverse_module_state->__pyx_n_s_d); - Py_VISIT(traverse_module_state->__pyx_n_s_d1); - Py_VISIT(traverse_module_state->__pyx_n_s_delta_2); - Py_VISIT(traverse_module_state->__pyx_n_s_delta_3); - Py_VISIT(traverse_module_state->__pyx_kp_u_disable); - Py_VISIT(traverse_module_state->__pyx_n_s_dt); - Py_VISIT(traverse_module_state->__pyx_kp_u_enable); - Py_VISIT(traverse_module_state->__pyx_n_s_errors); - Py_VISIT(traverse_module_state->__pyx_n_s_fontTools_cu2qu_cu2qu); - Py_VISIT(traverse_module_state->__pyx_n_s_fontTools_misc); - Py_VISIT(traverse_module_state->__pyx_kp_u_gc); - Py_VISIT(traverse_module_state->__pyx_n_s_i); - Py_VISIT(traverse_module_state->__pyx_n_s_imag); - Py_VISIT(traverse_module_state->__pyx_n_s_import); - Py_VISIT(traverse_module_state->__pyx_n_s_initializing); - Py_VISIT(traverse_module_state->__pyx_n_s_is_coroutine); - 
Py_VISIT(traverse_module_state->__pyx_kp_u_isenabled); - Py_VISIT(traverse_module_state->__pyx_n_s_isnan); - Py_VISIT(traverse_module_state->__pyx_n_s_l); - Py_VISIT(traverse_module_state->__pyx_n_s_last_i); - Py_VISIT(traverse_module_state->__pyx_n_s_main); - Py_VISIT(traverse_module_state->__pyx_n_s_math); - Py_VISIT(traverse_module_state->__pyx_n_s_max_err); - Py_VISIT(traverse_module_state->__pyx_n_s_max_errors); - Py_VISIT(traverse_module_state->__pyx_n_s_n); - Py_VISIT(traverse_module_state->__pyx_n_s_name); - Py_VISIT(traverse_module_state->__pyx_n_s_p); - Py_VISIT(traverse_module_state->__pyx_n_s_p0); - Py_VISIT(traverse_module_state->__pyx_n_s_p1); - Py_VISIT(traverse_module_state->__pyx_n_s_p2); - Py_VISIT(traverse_module_state->__pyx_n_s_p3); - Py_VISIT(traverse_module_state->__pyx_n_s_range); - Py_VISIT(traverse_module_state->__pyx_n_s_real); - Py_VISIT(traverse_module_state->__pyx_n_s_s); - Py_VISIT(traverse_module_state->__pyx_n_s_send); - Py_VISIT(traverse_module_state->__pyx_n_s_spec); - Py_VISIT(traverse_module_state->__pyx_n_s_spline); - Py_VISIT(traverse_module_state->__pyx_n_s_splines); - Py_VISIT(traverse_module_state->__pyx_n_s_split_cubic_into_n_gen); - Py_VISIT(traverse_module_state->__pyx_n_s_t1); - Py_VISIT(traverse_module_state->__pyx_n_s_t1_2); - Py_VISIT(traverse_module_state->__pyx_n_s_test); - Py_VISIT(traverse_module_state->__pyx_n_s_throw); - Py_VISIT(traverse_module_state->__pyx_int_1); - Py_VISIT(traverse_module_state->__pyx_int_2); - Py_VISIT(traverse_module_state->__pyx_int_3); - Py_VISIT(traverse_module_state->__pyx_int_4); - Py_VISIT(traverse_module_state->__pyx_int_6); - Py_VISIT(traverse_module_state->__pyx_int_100); - Py_VISIT(traverse_module_state->__pyx_codeobj_); - Py_VISIT(traverse_module_state->__pyx_tuple__4); - Py_VISIT(traverse_module_state->__pyx_tuple__5); - Py_VISIT(traverse_module_state->__pyx_tuple__7); - Py_VISIT(traverse_module_state->__pyx_codeobj__6); - Py_VISIT(traverse_module_state->__pyx_codeobj__8); - 
return 0; -} -#endif -/* #### Code section: module_state_defines ### */ -#define __pyx_d __pyx_mstate_global->__pyx_d -#define __pyx_b __pyx_mstate_global->__pyx_b -#define __pyx_cython_runtime __pyx_mstate_global->__pyx_cython_runtime -#define __pyx_empty_tuple __pyx_mstate_global->__pyx_empty_tuple -#define __pyx_empty_bytes __pyx_mstate_global->__pyx_empty_bytes -#define __pyx_empty_unicode __pyx_mstate_global->__pyx_empty_unicode -#ifdef __Pyx_CyFunction_USED -#define __pyx_CyFunctionType __pyx_mstate_global->__pyx_CyFunctionType -#endif -#ifdef __Pyx_FusedFunction_USED -#define __pyx_FusedFunctionType __pyx_mstate_global->__pyx_FusedFunctionType -#endif -#ifdef __Pyx_Generator_USED -#define __pyx_GeneratorType __pyx_mstate_global->__pyx_GeneratorType -#endif -#ifdef __Pyx_IterableCoroutine_USED -#define __pyx_IterableCoroutineType __pyx_mstate_global->__pyx_IterableCoroutineType -#endif -#ifdef __Pyx_Coroutine_USED -#define __pyx_CoroutineAwaitType __pyx_mstate_global->__pyx_CoroutineAwaitType -#endif -#ifdef __Pyx_Coroutine_USED -#define __pyx_CoroutineType __pyx_mstate_global->__pyx_CoroutineType -#endif -#if CYTHON_USE_MODULE_STATE -#endif -#if CYTHON_USE_MODULE_STATE -#define __pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen __pyx_mstate_global->__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen -#endif -#define __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen __pyx_mstate_global->__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen -#define __pyx_n_s_ApproxNotFoundError __pyx_mstate_global->__pyx_n_s_ApproxNotFoundError -#define __pyx_n_s_AssertionError __pyx_mstate_global->__pyx_n_s_AssertionError -#define __pyx_n_s_AttributeError __pyx_mstate_global->__pyx_n_s_AttributeError -#define __pyx_n_s_COMPILED __pyx_mstate_global->__pyx_n_s_COMPILED -#define __pyx_n_s_Cu2QuError __pyx_mstate_global->__pyx_n_s_Cu2QuError -#define __pyx_n_s_Error 
__pyx_mstate_global->__pyx_n_s_Error -#define __pyx_n_s_ImportError __pyx_mstate_global->__pyx_n_s_ImportError -#define __pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py __pyx_mstate_global->__pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py -#define __pyx_n_s_MAX_N __pyx_mstate_global->__pyx_n_s_MAX_N -#define __pyx_n_s_NAN __pyx_mstate_global->__pyx_n_s_NAN -#define __pyx_n_u_NaN __pyx_mstate_global->__pyx_n_u_NaN -#define __pyx_kp_u_Return_quadratic_Bezier_splines __pyx_mstate_global->__pyx_kp_u_Return_quadratic_Bezier_splines -#define __pyx_n_s_ZeroDivisionError __pyx_mstate_global->__pyx_n_s_ZeroDivisionError -#define __pyx_kp_u__2 __pyx_mstate_global->__pyx_kp_u__2 -#define __pyx_n_s__3 __pyx_mstate_global->__pyx_n_s__3 -#define __pyx_n_s__9 __pyx_mstate_global->__pyx_n_s__9 -#define __pyx_n_s_a __pyx_mstate_global->__pyx_n_s_a -#define __pyx_n_s_a1 __pyx_mstate_global->__pyx_n_s_a1 -#define __pyx_n_s_all __pyx_mstate_global->__pyx_n_s_all -#define __pyx_n_s_all_quadratic __pyx_mstate_global->__pyx_n_s_all_quadratic -#define __pyx_n_s_args __pyx_mstate_global->__pyx_n_s_args -#define __pyx_n_s_asyncio_coroutines __pyx_mstate_global->__pyx_n_s_asyncio_coroutines -#define __pyx_n_s_b __pyx_mstate_global->__pyx_n_s_b -#define __pyx_n_s_b1 __pyx_mstate_global->__pyx_n_s_b1 -#define __pyx_n_s_c __pyx_mstate_global->__pyx_n_s_c -#define __pyx_n_s_c1 __pyx_mstate_global->__pyx_n_s_c1 -#define __pyx_n_s_cline_in_traceback __pyx_mstate_global->__pyx_n_s_cline_in_traceback -#define __pyx_n_s_close __pyx_mstate_global->__pyx_n_s_close -#define __pyx_n_s_curve __pyx_mstate_global->__pyx_n_s_curve -#define __pyx_n_s_curve_to_quadratic __pyx_mstate_global->__pyx_n_s_curve_to_quadratic -#define __pyx_n_u_curve_to_quadratic __pyx_mstate_global->__pyx_n_u_curve_to_quadratic -#define __pyx_n_s_curves __pyx_mstate_global->__pyx_n_s_curves -#define __pyx_n_s_curves_to_quadratic __pyx_mstate_global->__pyx_n_s_curves_to_quadratic -#define __pyx_n_u_curves_to_quadratic 
__pyx_mstate_global->__pyx_n_u_curves_to_quadratic -#define __pyx_kp_u_curves_to_quadratic_line_474 __pyx_mstate_global->__pyx_kp_u_curves_to_quadratic_line_474 -#define __pyx_n_s_cython __pyx_mstate_global->__pyx_n_s_cython -#define __pyx_n_s_d __pyx_mstate_global->__pyx_n_s_d -#define __pyx_n_s_d1 __pyx_mstate_global->__pyx_n_s_d1 -#define __pyx_n_s_delta_2 __pyx_mstate_global->__pyx_n_s_delta_2 -#define __pyx_n_s_delta_3 __pyx_mstate_global->__pyx_n_s_delta_3 -#define __pyx_kp_u_disable __pyx_mstate_global->__pyx_kp_u_disable -#define __pyx_n_s_dt __pyx_mstate_global->__pyx_n_s_dt -#define __pyx_kp_u_enable __pyx_mstate_global->__pyx_kp_u_enable -#define __pyx_n_s_errors __pyx_mstate_global->__pyx_n_s_errors -#define __pyx_n_s_fontTools_cu2qu_cu2qu __pyx_mstate_global->__pyx_n_s_fontTools_cu2qu_cu2qu -#define __pyx_n_s_fontTools_misc __pyx_mstate_global->__pyx_n_s_fontTools_misc -#define __pyx_kp_u_gc __pyx_mstate_global->__pyx_kp_u_gc -#define __pyx_n_s_i __pyx_mstate_global->__pyx_n_s_i -#define __pyx_n_s_imag __pyx_mstate_global->__pyx_n_s_imag -#define __pyx_n_s_import __pyx_mstate_global->__pyx_n_s_import -#define __pyx_n_s_initializing __pyx_mstate_global->__pyx_n_s_initializing -#define __pyx_n_s_is_coroutine __pyx_mstate_global->__pyx_n_s_is_coroutine -#define __pyx_kp_u_isenabled __pyx_mstate_global->__pyx_kp_u_isenabled -#define __pyx_n_s_isnan __pyx_mstate_global->__pyx_n_s_isnan -#define __pyx_n_s_l __pyx_mstate_global->__pyx_n_s_l -#define __pyx_n_s_last_i __pyx_mstate_global->__pyx_n_s_last_i -#define __pyx_n_s_main __pyx_mstate_global->__pyx_n_s_main -#define __pyx_n_s_math __pyx_mstate_global->__pyx_n_s_math -#define __pyx_n_s_max_err __pyx_mstate_global->__pyx_n_s_max_err -#define __pyx_n_s_max_errors __pyx_mstate_global->__pyx_n_s_max_errors -#define __pyx_n_s_n __pyx_mstate_global->__pyx_n_s_n -#define __pyx_n_s_name __pyx_mstate_global->__pyx_n_s_name -#define __pyx_n_s_p __pyx_mstate_global->__pyx_n_s_p -#define __pyx_n_s_p0 
__pyx_mstate_global->__pyx_n_s_p0 -#define __pyx_n_s_p1 __pyx_mstate_global->__pyx_n_s_p1 -#define __pyx_n_s_p2 __pyx_mstate_global->__pyx_n_s_p2 -#define __pyx_n_s_p3 __pyx_mstate_global->__pyx_n_s_p3 -#define __pyx_n_s_range __pyx_mstate_global->__pyx_n_s_range -#define __pyx_n_s_real __pyx_mstate_global->__pyx_n_s_real -#define __pyx_n_s_s __pyx_mstate_global->__pyx_n_s_s -#define __pyx_n_s_send __pyx_mstate_global->__pyx_n_s_send -#define __pyx_n_s_spec __pyx_mstate_global->__pyx_n_s_spec -#define __pyx_n_s_spline __pyx_mstate_global->__pyx_n_s_spline -#define __pyx_n_s_splines __pyx_mstate_global->__pyx_n_s_splines -#define __pyx_n_s_split_cubic_into_n_gen __pyx_mstate_global->__pyx_n_s_split_cubic_into_n_gen -#define __pyx_n_s_t1 __pyx_mstate_global->__pyx_n_s_t1 -#define __pyx_n_s_t1_2 __pyx_mstate_global->__pyx_n_s_t1_2 -#define __pyx_n_s_test __pyx_mstate_global->__pyx_n_s_test -#define __pyx_n_s_throw __pyx_mstate_global->__pyx_n_s_throw -#define __pyx_int_1 __pyx_mstate_global->__pyx_int_1 -#define __pyx_int_2 __pyx_mstate_global->__pyx_int_2 -#define __pyx_int_3 __pyx_mstate_global->__pyx_int_3 -#define __pyx_int_4 __pyx_mstate_global->__pyx_int_4 -#define __pyx_int_6 __pyx_mstate_global->__pyx_int_6 -#define __pyx_int_100 __pyx_mstate_global->__pyx_int_100 -#define __pyx_codeobj_ __pyx_mstate_global->__pyx_codeobj_ -#define __pyx_tuple__4 __pyx_mstate_global->__pyx_tuple__4 -#define __pyx_tuple__5 __pyx_mstate_global->__pyx_tuple__5 -#define __pyx_tuple__7 __pyx_mstate_global->__pyx_tuple__7 -#define __pyx_codeobj__6 __pyx_mstate_global->__pyx_codeobj__6 -#define __pyx_codeobj__8 __pyx_mstate_global->__pyx_codeobj__8 -/* #### Code section: module_code ### */ - -/* "fontTools/cu2qu/cu2qu.py":40 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.double) - */ - -static CYTHON_INLINE double __pyx_f_9fontTools_5cu2qu_5cu2qu_dot(__pyx_t_double_complex __pyx_v_v1, __pyx_t_double_complex __pyx_v_v2) { - double __pyx_r; - 
__Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("dot", 0); - - /* "fontTools/cu2qu/cu2qu.py":54 - * double: Dot product. - * """ - * return (v1 * v2.conjugate()).real # <<<<<<<<<<<<<< - * - * - */ - __pyx_r = __Pyx_CREAL(__Pyx_c_prod_double(__pyx_v_v1, __Pyx_c_conj_double(__pyx_v_v2))); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":40 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.double) - */ - - /* function exit code */ - __pyx_L0:; - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":57 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals(a=cython.complex, b=cython.complex, c=cython.complex, d=cython.complex) - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_points(__pyx_t_double_complex __pyx_v_a, __pyx_t_double_complex __pyx_v_b, __pyx_t_double_complex __pyx_v_c, __pyx_t_double_complex __pyx_v_d) { - __pyx_t_double_complex __pyx_v__1; - __pyx_t_double_complex __pyx_v__2; - __pyx_t_double_complex __pyx_v__3; - __pyx_t_double_complex __pyx_v__4; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - __pyx_t_double_complex __pyx_t_1; - __pyx_t_double_complex __pyx_t_2; - PyObject *__pyx_t_3 = NULL; - PyObject *__pyx_t_4 = NULL; - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - PyObject *__pyx_t_7 = NULL; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("calc_cubic_points", 0); - - /* "fontTools/cu2qu/cu2qu.py":64 - * ) - * def calc_cubic_points(a, b, c, d): - * _1 = d # <<<<<<<<<<<<<< - * _2 = (c / 3.0) + d - * _3 = (b + c) / 3.0 + _2 - */ - __pyx_v__1 = __pyx_v_d; - - /* "fontTools/cu2qu/cu2qu.py":65 - * def calc_cubic_points(a, b, c, d): - * _1 = d - * _2 = (c / 3.0) + d # <<<<<<<<<<<<<< - * _3 = (b + c) / 3.0 + _2 - * _4 = a + d + c + b - */ - __pyx_t_1 = __pyx_t_double_complex_from_parts(3.0, 0); - if 
(unlikely(__Pyx_c_is_zero_double(__pyx_t_1))) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 65, __pyx_L1_error) - } - __pyx_v__2 = __Pyx_c_sum_double(__Pyx_c_quot_double(__pyx_v_c, __pyx_t_1), __pyx_v_d); - - /* "fontTools/cu2qu/cu2qu.py":66 - * _1 = d - * _2 = (c / 3.0) + d - * _3 = (b + c) / 3.0 + _2 # <<<<<<<<<<<<<< - * _4 = a + d + c + b - * return _1, _2, _3, _4 - */ - __pyx_t_1 = __Pyx_c_sum_double(__pyx_v_b, __pyx_v_c); - __pyx_t_2 = __pyx_t_double_complex_from_parts(3.0, 0); - if (unlikely(__Pyx_c_is_zero_double(__pyx_t_2))) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 66, __pyx_L1_error) - } - __pyx_v__3 = __Pyx_c_sum_double(__Pyx_c_quot_double(__pyx_t_1, __pyx_t_2), __pyx_v__2); - - /* "fontTools/cu2qu/cu2qu.py":67 - * _2 = (c / 3.0) + d - * _3 = (b + c) / 3.0 + _2 - * _4 = a + d + c + b # <<<<<<<<<<<<<< - * return _1, _2, _3, _4 - * - */ - __pyx_v__4 = __Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__pyx_v_a, __pyx_v_d), __pyx_v_c), __pyx_v_b); - - /* "fontTools/cu2qu/cu2qu.py":68 - * _3 = (b + c) / 3.0 + _2 - * _4 = a + d + c + b - * return _1, _2, _3, _4 # <<<<<<<<<<<<<< - * - * - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_3 = __pyx_PyComplex_FromComplex(__pyx_v__1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 68, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = __pyx_PyComplex_FromComplex(__pyx_v__2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 68, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_v__3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 68, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_6 = __pyx_PyComplex_FromComplex(__pyx_v__4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 68, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_7 = PyTuple_New(4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 68, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_GIVEREF(__pyx_t_3); - PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_3); - __Pyx_GIVEREF(__pyx_t_4); - 
PyTuple_SET_ITEM(__pyx_t_7, 1, __pyx_t_4); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_7, 2, __pyx_t_5); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_7, 3, __pyx_t_6); - __pyx_t_3 = 0; - __pyx_t_4 = 0; - __pyx_t_5 = 0; - __pyx_t_6 = 0; - __pyx_r = __pyx_t_7; - __pyx_t_7 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":57 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals(a=cython.complex, b=cython.complex, c=cython.complex, d=cython.complex) - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_3); - __Pyx_XDECREF(__pyx_t_4); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_7); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.calc_cubic_points", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":71 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_parameters(__pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3) { - __pyx_t_double_complex __pyx_v_a; - __pyx_t_double_complex __pyx_v_b; - __pyx_t_double_complex __pyx_v_c; - __pyx_t_double_complex __pyx_v_d; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - PyObject *__pyx_t_3 = NULL; - PyObject *__pyx_t_4 = NULL; - PyObject *__pyx_t_5 = NULL; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("calc_cubic_parameters", 0); - - /* "fontTools/cu2qu/cu2qu.py":78 - * @cython.locals(a=cython.complex, b=cython.complex, c=cython.complex, d=cython.complex) - * def calc_cubic_parameters(p0, p1, p2, p3): - * c = (p1 - p0) * 3.0 # <<<<<<<<<<<<<< - * b = (p2 - p1) 
* 3.0 - c - * d = p0 - */ - __pyx_v_c = __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_p1, __pyx_v_p0), __pyx_t_double_complex_from_parts(3.0, 0)); - - /* "fontTools/cu2qu/cu2qu.py":79 - * def calc_cubic_parameters(p0, p1, p2, p3): - * c = (p1 - p0) * 3.0 - * b = (p2 - p1) * 3.0 - c # <<<<<<<<<<<<<< - * d = p0 - * a = p3 - d - c - b - */ - __pyx_v_b = __Pyx_c_diff_double(__Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_p2, __pyx_v_p1), __pyx_t_double_complex_from_parts(3.0, 0)), __pyx_v_c); - - /* "fontTools/cu2qu/cu2qu.py":80 - * c = (p1 - p0) * 3.0 - * b = (p2 - p1) * 3.0 - c - * d = p0 # <<<<<<<<<<<<<< - * a = p3 - d - c - b - * return a, b, c, d - */ - __pyx_v_d = __pyx_v_p0; - - /* "fontTools/cu2qu/cu2qu.py":81 - * b = (p2 - p1) * 3.0 - c - * d = p0 - * a = p3 - d - c - b # <<<<<<<<<<<<<< - * return a, b, c, d - * - */ - __pyx_v_a = __Pyx_c_diff_double(__Pyx_c_diff_double(__Pyx_c_diff_double(__pyx_v_p3, __pyx_v_d), __pyx_v_c), __pyx_v_b); - - /* "fontTools/cu2qu/cu2qu.py":82 - * d = p0 - * a = p3 - d - c - b - * return a, b, c, d # <<<<<<<<<<<<<< - * - * - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_a); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 82, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __pyx_PyComplex_FromComplex(__pyx_v_b); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 82, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_3 = __pyx_PyComplex_FromComplex(__pyx_v_c); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 82, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = __pyx_PyComplex_FromComplex(__pyx_v_d); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 82, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_5 = PyTuple_New(4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 82, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_t_1); - __Pyx_GIVEREF(__pyx_t_2); - PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_t_2); - __Pyx_GIVEREF(__pyx_t_3); - PyTuple_SET_ITEM(__pyx_t_5, 2, 
__pyx_t_3); - __Pyx_GIVEREF(__pyx_t_4); - PyTuple_SET_ITEM(__pyx_t_5, 3, __pyx_t_4); - __pyx_t_1 = 0; - __pyx_t_2 = 0; - __pyx_t_3 = 0; - __pyx_t_4 = 0; - __pyx_r = __pyx_t_5; - __pyx_t_5 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":71 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_3); - __Pyx_XDECREF(__pyx_t_4); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.calc_cubic_parameters", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":85 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_n_iter(__pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3, PyObject *__pyx_v_n) { - PyObject *__pyx_v_a = NULL; - PyObject *__pyx_v_b = NULL; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - int __pyx_t_1; - PyObject *__pyx_t_2 = NULL; - PyObject *__pyx_t_3 = NULL; - PyObject *__pyx_t_4 = NULL; - PyObject *__pyx_t_5 = NULL; - PyObject *(*__pyx_t_6)(PyObject *); - __pyx_t_double_complex __pyx_t_7; - __pyx_t_double_complex __pyx_t_8; - __pyx_t_double_complex __pyx_t_9; - __pyx_t_double_complex __pyx_t_10; - PyObject *__pyx_t_11 = NULL; - PyObject *__pyx_t_12 = NULL; - PyObject *__pyx_t_13 = NULL; - int __pyx_t_14; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("split_cubic_into_n_iter", 0); - - /* "fontTools/cu2qu/cu2qu.py":107 - * """ - * # Hand-coded special-cases - * if n == 2: # <<<<<<<<<<<<<< - * return iter(split_cubic_into_two(p0, p1, p2, p3)) - * if n == 3: - */ 
- __pyx_t_1 = (__Pyx_PyInt_BoolEqObjC(__pyx_v_n, __pyx_int_2, 2, 0)); if (unlikely((__pyx_t_1 < 0))) __PYX_ERR(0, 107, __pyx_L1_error) - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":108 - * # Hand-coded special-cases - * if n == 2: - * return iter(split_cubic_into_two(p0, p1, p2, p3)) # <<<<<<<<<<<<<< - * if n == 3: - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_v_p0, __pyx_v_p1, __pyx_v_p2, __pyx_v_p3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 108, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_3 = PyObject_GetIter(__pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 108, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_r = __pyx_t_3; - __pyx_t_3 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":107 - * """ - * # Hand-coded special-cases - * if n == 2: # <<<<<<<<<<<<<< - * return iter(split_cubic_into_two(p0, p1, p2, p3)) - * if n == 3: - */ - } - - /* "fontTools/cu2qu/cu2qu.py":109 - * if n == 2: - * return iter(split_cubic_into_two(p0, p1, p2, p3)) - * if n == 3: # <<<<<<<<<<<<<< - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - * if n == 4: - */ - __pyx_t_1 = (__Pyx_PyInt_BoolEqObjC(__pyx_v_n, __pyx_int_3, 3, 0)); if (unlikely((__pyx_t_1 < 0))) __PYX_ERR(0, 109, __pyx_L1_error) - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":110 - * return iter(split_cubic_into_two(p0, p1, p2, p3)) - * if n == 3: - * return iter(split_cubic_into_three(p0, p1, p2, p3)) # <<<<<<<<<<<<<< - * if n == 4: - * a, b = split_cubic_into_two(p0, p1, p2, p3) - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_3 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_three(__pyx_v_p0, __pyx_v_p1, __pyx_v_p2, __pyx_v_p3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 110, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_2 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 110, __pyx_L1_error) - 
__Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; - __pyx_r = __pyx_t_2; - __pyx_t_2 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":109 - * if n == 2: - * return iter(split_cubic_into_two(p0, p1, p2, p3)) - * if n == 3: # <<<<<<<<<<<<<< - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - * if n == 4: - */ - } - - /* "fontTools/cu2qu/cu2qu.py":111 - * if n == 3: - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - * if n == 4: # <<<<<<<<<<<<<< - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - */ - __pyx_t_1 = (__Pyx_PyInt_BoolEqObjC(__pyx_v_n, __pyx_int_4, 4, 0)); if (unlikely((__pyx_t_1 < 0))) __PYX_ERR(0, 111, __pyx_L1_error) - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":112 - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - * if n == 4: - * a, b = split_cubic_into_two(p0, p1, p2, p3) # <<<<<<<<<<<<<< - * return iter( - * split_cubic_into_two(a[0], a[1], a[2], a[3]) - */ - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_v_p0, __pyx_v_p1, __pyx_v_p2, __pyx_v_p3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 112, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - if ((likely(PyTuple_CheckExact(__pyx_t_2))) || (PyList_CheckExact(__pyx_t_2))) { - PyObject* sequence = __pyx_t_2; - Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); - if (unlikely(size != 2)) { - if (size > 2) __Pyx_RaiseTooManyValuesError(2); - else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); - __PYX_ERR(0, 112, __pyx_L1_error) - } - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - if (likely(PyTuple_CheckExact(sequence))) { - __pyx_t_3 = PyTuple_GET_ITEM(sequence, 0); - __pyx_t_4 = PyTuple_GET_ITEM(sequence, 1); - } else { - __pyx_t_3 = PyList_GET_ITEM(sequence, 0); - __pyx_t_4 = PyList_GET_ITEM(sequence, 1); - } - __Pyx_INCREF(__pyx_t_3); - __Pyx_INCREF(__pyx_t_4); - #else - __pyx_t_3 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 112, __pyx_L1_error) - 
__Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 112, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - #endif - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - } else { - Py_ssize_t index = -1; - __pyx_t_5 = PyObject_GetIter(__pyx_t_2); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 112, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_6 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_5); - index = 0; __pyx_t_3 = __pyx_t_6(__pyx_t_5); if (unlikely(!__pyx_t_3)) goto __pyx_L6_unpacking_failed; - __Pyx_GOTREF(__pyx_t_3); - index = 1; __pyx_t_4 = __pyx_t_6(__pyx_t_5); if (unlikely(!__pyx_t_4)) goto __pyx_L6_unpacking_failed; - __Pyx_GOTREF(__pyx_t_4); - if (__Pyx_IternextUnpackEndCheck(__pyx_t_6(__pyx_t_5), 2) < 0) __PYX_ERR(0, 112, __pyx_L1_error) - __pyx_t_6 = NULL; - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - goto __pyx_L7_unpacking_done; - __pyx_L6_unpacking_failed:; - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - __pyx_t_6 = NULL; - if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); - __PYX_ERR(0, 112, __pyx_L1_error) - __pyx_L7_unpacking_done:; - } - __pyx_v_a = __pyx_t_3; - __pyx_t_3 = 0; - __pyx_v_b = __pyx_t_4; - __pyx_t_4 = 0; - - /* "fontTools/cu2qu/cu2qu.py":113 - * if n == 4: - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( # <<<<<<<<<<<<<< - * split_cubic_into_two(a[0], a[1], a[2], a[3]) - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) - */ - __Pyx_XDECREF(__pyx_r); - - /* "fontTools/cu2qu/cu2qu.py":114 - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - * split_cubic_into_two(a[0], a[1], a[2], a[3]) # <<<<<<<<<<<<<< - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) - * ) - */ - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_a, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if 
(unlikely(PyErr_Occurred())) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_a, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_8 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_a, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_a, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_10 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_t_7, __pyx_t_8, __pyx_t_9, __pyx_t_10); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 114, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - - /* "fontTools/cu2qu/cu2qu.py":115 - * return iter( - * split_cubic_into_two(a[0], a[1], a[2], a[3]) - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) # <<<<<<<<<<<<<< - * ) - * if n == 6: - */ - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_b, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_10 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_b, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if 
(unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_b, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_8 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_b, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_t_10, __pyx_t_9, __pyx_t_8, __pyx_t_7); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_3 = PyNumber_Add(__pyx_t_2, __pyx_t_4); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 115, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - - /* "fontTools/cu2qu/cu2qu.py":113 - * if n == 4: - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( # <<<<<<<<<<<<<< - * split_cubic_into_two(a[0], a[1], a[2], a[3]) - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) - */ - __pyx_t_4 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 113, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; - __pyx_r = __pyx_t_4; - __pyx_t_4 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":111 - * if n == 3: - * return iter(split_cubic_into_three(p0, p1, p2, p3)) - * if n == 4: # <<<<<<<<<<<<<< - * a, 
b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - */ - } - - /* "fontTools/cu2qu/cu2qu.py":117 - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) - * ) - * if n == 6: # <<<<<<<<<<<<<< - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - */ - __pyx_t_1 = (__Pyx_PyInt_BoolEqObjC(__pyx_v_n, __pyx_int_6, 6, 0)); if (unlikely((__pyx_t_1 < 0))) __PYX_ERR(0, 117, __pyx_L1_error) - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":118 - * ) - * if n == 6: - * a, b = split_cubic_into_two(p0, p1, p2, p3) # <<<<<<<<<<<<<< - * return iter( - * split_cubic_into_three(a[0], a[1], a[2], a[3]) - */ - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_v_p0, __pyx_v_p1, __pyx_v_p2, __pyx_v_p3); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 118, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - if ((likely(PyTuple_CheckExact(__pyx_t_4))) || (PyList_CheckExact(__pyx_t_4))) { - PyObject* sequence = __pyx_t_4; - Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); - if (unlikely(size != 2)) { - if (size > 2) __Pyx_RaiseTooManyValuesError(2); - else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); - __PYX_ERR(0, 118, __pyx_L1_error) - } - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - if (likely(PyTuple_CheckExact(sequence))) { - __pyx_t_3 = PyTuple_GET_ITEM(sequence, 0); - __pyx_t_2 = PyTuple_GET_ITEM(sequence, 1); - } else { - __pyx_t_3 = PyList_GET_ITEM(sequence, 0); - __pyx_t_2 = PyList_GET_ITEM(sequence, 1); - } - __Pyx_INCREF(__pyx_t_3); - __Pyx_INCREF(__pyx_t_2); - #else - __pyx_t_3 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 118, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_2 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 118, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - #endif - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - } else { - Py_ssize_t index = -1; - __pyx_t_5 = PyObject_GetIter(__pyx_t_4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 118, __pyx_L1_error) - 
__Pyx_GOTREF(__pyx_t_5); - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_6 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_5); - index = 0; __pyx_t_3 = __pyx_t_6(__pyx_t_5); if (unlikely(!__pyx_t_3)) goto __pyx_L9_unpacking_failed; - __Pyx_GOTREF(__pyx_t_3); - index = 1; __pyx_t_2 = __pyx_t_6(__pyx_t_5); if (unlikely(!__pyx_t_2)) goto __pyx_L9_unpacking_failed; - __Pyx_GOTREF(__pyx_t_2); - if (__Pyx_IternextUnpackEndCheck(__pyx_t_6(__pyx_t_5), 2) < 0) __PYX_ERR(0, 118, __pyx_L1_error) - __pyx_t_6 = NULL; - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - goto __pyx_L10_unpacking_done; - __pyx_L9_unpacking_failed:; - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - __pyx_t_6 = NULL; - if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); - __PYX_ERR(0, 118, __pyx_L1_error) - __pyx_L10_unpacking_done:; - } - __pyx_v_a = __pyx_t_3; - __pyx_t_3 = 0; - __pyx_v_b = __pyx_t_2; - __pyx_t_2 = 0; - - /* "fontTools/cu2qu/cu2qu.py":119 - * if n == 6: - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( # <<<<<<<<<<<<<< - * split_cubic_into_three(a[0], a[1], a[2], a[3]) - * + split_cubic_into_three(b[0], b[1], b[2], b[3]) - */ - __Pyx_XDECREF(__pyx_r); - - /* "fontTools/cu2qu/cu2qu.py":120 - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - * split_cubic_into_three(a[0], a[1], a[2], a[3]) # <<<<<<<<<<<<<< - * + split_cubic_into_three(b[0], b[1], b[2], b[3]) - * ) - */ - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_a, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_a, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_8 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if 
(unlikely(PyErr_Occurred())) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_a, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_a, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_10 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_three(__pyx_t_7, __pyx_t_8, __pyx_t_9, __pyx_t_10); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 120, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - - /* "fontTools/cu2qu/cu2qu.py":121 - * return iter( - * split_cubic_into_three(a[0], a[1], a[2], a[3]) - * + split_cubic_into_three(b[0], b[1], b[2], b[3]) # <<<<<<<<<<<<<< - * ) - * - */ - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_b, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_10 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_b, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_b, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if 
(unlikely(!__pyx_t_2)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_8 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_b, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_three(__pyx_t_10, __pyx_t_9, __pyx_t_8, __pyx_t_7); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_3 = PyNumber_Add(__pyx_t_4, __pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 121, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - - /* "fontTools/cu2qu/cu2qu.py":119 - * if n == 6: - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( # <<<<<<<<<<<<<< - * split_cubic_into_three(a[0], a[1], a[2], a[3]) - * + split_cubic_into_three(b[0], b[1], b[2], b[3]) - */ - __pyx_t_2 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 119, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; - __pyx_r = __pyx_t_2; - __pyx_t_2 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":117 - * + split_cubic_into_two(b[0], b[1], b[2], b[3]) - * ) - * if n == 6: # <<<<<<<<<<<<<< - * a, b = split_cubic_into_two(p0, p1, p2, p3) - * return iter( - */ - } - - /* "fontTools/cu2qu/cu2qu.py":124 - * ) - * - * return _split_cubic_into_n_gen(p0, p1, p2, p3, n) # <<<<<<<<<<<<<< - * - * - */ - __Pyx_XDECREF(__pyx_r); - __Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_split_cubic_into_n_gen); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 124, __pyx_L1_error) 
- __Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = __pyx_PyComplex_FromComplex(__pyx_v_p0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 124, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_v_p1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 124, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_11 = __pyx_PyComplex_FromComplex(__pyx_v_p2); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 124, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_11); - __pyx_t_12 = __pyx_PyComplex_FromComplex(__pyx_v_p3); if (unlikely(!__pyx_t_12)) __PYX_ERR(0, 124, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_12); - __pyx_t_13 = NULL; - __pyx_t_14 = 0; - if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) { - __pyx_t_13 = PyMethod_GET_SELF(__pyx_t_3); - if (likely(__pyx_t_13)) { - PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); - __Pyx_INCREF(__pyx_t_13); - __Pyx_INCREF(function); - __Pyx_DECREF_SET(__pyx_t_3, function); - __pyx_t_14 = 1; - } - } - { - PyObject *__pyx_callargs[6] = {__pyx_t_13, __pyx_t_4, __pyx_t_5, __pyx_t_11, __pyx_t_12, __pyx_v_n}; - __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_3, __pyx_callargs+1-__pyx_t_14, 5+__pyx_t_14); - __Pyx_XDECREF(__pyx_t_13); __pyx_t_13 = 0; - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; - __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; - if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 124, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; - } - __pyx_r = __pyx_t_2; - __pyx_t_2 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":85 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_3); - __Pyx_XDECREF(__pyx_t_4); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_11); - __Pyx_XDECREF(__pyx_t_12); - __Pyx_XDECREF(__pyx_t_13); - 
__Pyx_AddTraceback("fontTools.cu2qu.cu2qu.split_cubic_into_n_iter", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XDECREF(__pyx_v_a); - __Pyx_XDECREF(__pyx_v_b); - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} -static PyObject *__pyx_gb_9fontTools_5cu2qu_5cu2qu_2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */ - -/* "fontTools/cu2qu/cu2qu.py":127 - * - * - * @cython.locals( # <<<<<<<<<<<<<< - * p0=cython.complex, - * p1=cython.complex, - */ - -/* Python wrapper */ -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_1_split_cubic_into_n_gen(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -); /*proto*/ -PyDoc_STRVAR(__pyx_doc_9fontTools_5cu2qu_5cu2qu__split_cubic_into_n_gen, "_split_cubic_into_n_gen(double complex p0, double complex p1, double complex p2, double complex p3, int n)"); -static PyMethodDef __pyx_mdef_9fontTools_5cu2qu_5cu2qu_1_split_cubic_into_n_gen = {"_split_cubic_into_n_gen", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_5cu2qu_5cu2qu_1_split_cubic_into_n_gen, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_5cu2qu_5cu2qu__split_cubic_into_n_gen}; -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_1_split_cubic_into_n_gen(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -) { - __pyx_t_double_complex __pyx_v_p0; - __pyx_t_double_complex __pyx_v_p1; - __pyx_t_double_complex __pyx_v_p2; - __pyx_t_double_complex __pyx_v_p3; - int __pyx_v_n; - #if !CYTHON_METH_FASTCALL - CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); - #endif - CYTHON_UNUSED PyObject *const *__pyx_kwvalues 
= __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - PyObject *__pyx_r = 0; - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("_split_cubic_into_n_gen (wrapper)", 0); - { - PyObject **__pyx_pyargnames[] = {&__pyx_n_s_p0,&__pyx_n_s_p1,&__pyx_n_s_p2,&__pyx_n_s_p3,&__pyx_n_s_n,0}; - PyObject* values[5] = {0,0,0,0,0}; - if (__pyx_kwds) { - Py_ssize_t kw_args; - switch (__pyx_nargs) { - case 5: values[4] = __Pyx_Arg_FASTCALL(__pyx_args, 4); - CYTHON_FALLTHROUGH; - case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3); - CYTHON_FALLTHROUGH; - case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - CYTHON_FALLTHROUGH; - case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - CYTHON_FALLTHROUGH; - case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); - CYTHON_FALLTHROUGH; - case 0: break; - default: goto __pyx_L5_argtuple_error; - } - kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); - switch (__pyx_nargs) { - case 0: - if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_p0)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 127, __pyx_L3_error) - else goto __pyx_L5_argtuple_error; - CYTHON_FALLTHROUGH; - case 1: - if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_p1)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 127, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("_split_cubic_into_n_gen", 1, 5, 5, 1); __PYX_ERR(0, 127, __pyx_L3_error) - } - CYTHON_FALLTHROUGH; - case 2: - if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_p2)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 127, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("_split_cubic_into_n_gen", 1, 5, 5, 2); __PYX_ERR(0, 127, __pyx_L3_error) - } - CYTHON_FALLTHROUGH; - case 3: - if (likely((values[3] = 
__Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_p3)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 127, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("_split_cubic_into_n_gen", 1, 5, 5, 3); __PYX_ERR(0, 127, __pyx_L3_error) - } - CYTHON_FALLTHROUGH; - case 4: - if (likely((values[4] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_n)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 127, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("_split_cubic_into_n_gen", 1, 5, 5, 4); __PYX_ERR(0, 127, __pyx_L3_error) - } - } - if (unlikely(kw_args > 0)) { - const Py_ssize_t kwd_pos_args = __pyx_nargs; - if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "_split_cubic_into_n_gen") < 0)) __PYX_ERR(0, 127, __pyx_L3_error) - } - } else if (unlikely(__pyx_nargs != 5)) { - goto __pyx_L5_argtuple_error; - } else { - values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); - values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3); - values[4] = __Pyx_Arg_FASTCALL(__pyx_args, 4); - } - __pyx_v_p0 = __Pyx_PyComplex_As___pyx_t_double_complex(values[0]); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 141, __pyx_L3_error) - __pyx_v_p1 = __Pyx_PyComplex_As___pyx_t_double_complex(values[1]); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 141, __pyx_L3_error) - __pyx_v_p2 = __Pyx_PyComplex_As___pyx_t_double_complex(values[2]); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 141, __pyx_L3_error) - __pyx_v_p3 = __Pyx_PyComplex_As___pyx_t_double_complex(values[3]); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 141, __pyx_L3_error) - __pyx_v_n = __Pyx_PyInt_As_int(values[4]); if (unlikely((__pyx_v_n == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 141, __pyx_L3_error) - } - goto __pyx_L4_argument_unpacking_done; - __pyx_L5_argtuple_error:; - 
__Pyx_RaiseArgtupleInvalid("_split_cubic_into_n_gen", 1, 5, 5, __pyx_nargs); __PYX_ERR(0, 127, __pyx_L3_error) - __pyx_L3_error:; - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu._split_cubic_into_n_gen", __pyx_clineno, __pyx_lineno, __pyx_filename); - __Pyx_RefNannyFinishContext(); - return NULL; - __pyx_L4_argument_unpacking_done:; - __pyx_r = __pyx_pf_9fontTools_5cu2qu_5cu2qu__split_cubic_into_n_gen(__pyx_self, __pyx_v_p0, __pyx_v_p1, __pyx_v_p2, __pyx_v_p3, __pyx_v_n); - - /* function exit code */ - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu__split_cubic_into_n_gen(CYTHON_UNUSED PyObject *__pyx_self, __pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3, int __pyx_v_n) { - struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *__pyx_cur_scope; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("_split_cubic_into_n_gen", 0); - __pyx_cur_scope = (struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *)__pyx_tp_new_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen(__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen, __pyx_empty_tuple, NULL); - if (unlikely(!__pyx_cur_scope)) { - __pyx_cur_scope = ((struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *)Py_None); - __Pyx_INCREF(Py_None); - __PYX_ERR(0, 127, __pyx_L1_error) - } else { - __Pyx_GOTREF((PyObject *)__pyx_cur_scope); - } - __pyx_cur_scope->__pyx_v_p0 = __pyx_v_p0; - __pyx_cur_scope->__pyx_v_p1 = __pyx_v_p1; - __pyx_cur_scope->__pyx_v_p2 = __pyx_v_p2; - __pyx_cur_scope->__pyx_v_p3 = __pyx_v_p3; - __pyx_cur_scope->__pyx_v_n = __pyx_v_n; - { - __pyx_CoroutineObject *gen = 
__Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_9fontTools_5cu2qu_5cu2qu_2generator, __pyx_codeobj_, (PyObject *) __pyx_cur_scope, __pyx_n_s_split_cubic_into_n_gen, __pyx_n_s_split_cubic_into_n_gen, __pyx_n_s_fontTools_cu2qu_cu2qu); if (unlikely(!gen)) __PYX_ERR(0, 127, __pyx_L1_error) - __Pyx_DECREF(__pyx_cur_scope); - __Pyx_RefNannyFinishContext(); - return (PyObject *) gen; - } - - /* function exit code */ - __pyx_L1_error:; - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu._split_cubic_into_n_gen", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = NULL; - __Pyx_DECREF((PyObject *)__pyx_cur_scope); - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -static PyObject *__pyx_gb_9fontTools_5cu2qu_5cu2qu_2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */ -{ - struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *__pyx_cur_scope = ((struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *)__pyx_generator->closure); - PyObject *__pyx_r = NULL; - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - PyObject *__pyx_t_3 = NULL; - PyObject *__pyx_t_4 = NULL; - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - PyObject *(*__pyx_t_7)(PyObject *); - __pyx_t_double_complex __pyx_t_8; - __pyx_t_double_complex __pyx_t_9; - __pyx_t_double_complex __pyx_t_10; - __pyx_t_double_complex __pyx_t_11; - int __pyx_t_12; - int __pyx_t_13; - int __pyx_t_14; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("_split_cubic_into_n_gen", 0); - switch (__pyx_generator->resume_label) { - case 0: goto __pyx_L3_first_run; - case 1: goto __pyx_L8_resume_from_yield; - default: /* CPython raises the right error here */ - __Pyx_RefNannyFinishContext(); - return NULL; - } - __pyx_L3_first_run:; - if 
(unlikely(!__pyx_sent_value)) __PYX_ERR(0, 127, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":142 - * ) - * def _split_cubic_into_n_gen(p0, p1, p2, p3, n): - * a, b, c, d = calc_cubic_parameters(p0, p1, p2, p3) # <<<<<<<<<<<<<< - * dt = 1 / n - * delta_2 = dt * dt - */ - __pyx_t_1 = __pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_parameters(__pyx_cur_scope->__pyx_v_p0, __pyx_cur_scope->__pyx_v_p1, __pyx_cur_scope->__pyx_v_p2, __pyx_cur_scope->__pyx_v_p3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - if ((likely(PyTuple_CheckExact(__pyx_t_1))) || (PyList_CheckExact(__pyx_t_1))) { - PyObject* sequence = __pyx_t_1; - Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); - if (unlikely(size != 4)) { - if (size > 4) __Pyx_RaiseTooManyValuesError(4); - else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); - __PYX_ERR(0, 142, __pyx_L1_error) - } - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - if (likely(PyTuple_CheckExact(sequence))) { - __pyx_t_2 = PyTuple_GET_ITEM(sequence, 0); - __pyx_t_3 = PyTuple_GET_ITEM(sequence, 1); - __pyx_t_4 = PyTuple_GET_ITEM(sequence, 2); - __pyx_t_5 = PyTuple_GET_ITEM(sequence, 3); - } else { - __pyx_t_2 = PyList_GET_ITEM(sequence, 0); - __pyx_t_3 = PyList_GET_ITEM(sequence, 1); - __pyx_t_4 = PyList_GET_ITEM(sequence, 2); - __pyx_t_5 = PyList_GET_ITEM(sequence, 3); - } - __Pyx_INCREF(__pyx_t_2); - __Pyx_INCREF(__pyx_t_3); - __Pyx_INCREF(__pyx_t_4); - __Pyx_INCREF(__pyx_t_5); - #else - { - Py_ssize_t i; - PyObject** temps[4] = {&__pyx_t_2,&__pyx_t_3,&__pyx_t_4,&__pyx_t_5}; - for (i=0; i < 4; i++) { - PyObject* item = PySequence_ITEM(sequence, i); if (unlikely(!item)) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_GOTREF(item); - *(temps[i]) = item; - } - } - #endif - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - } else { - Py_ssize_t index = -1; - PyObject** temps[4] = {&__pyx_t_2,&__pyx_t_3,&__pyx_t_4,&__pyx_t_5}; - __pyx_t_6 = PyObject_GetIter(__pyx_t_1); if (unlikely(!__pyx_t_6)) 
__PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_7 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_6); - for (index=0; index < 4; index++) { - PyObject* item = __pyx_t_7(__pyx_t_6); if (unlikely(!item)) goto __pyx_L4_unpacking_failed; - __Pyx_GOTREF(item); - *(temps[index]) = item; - } - if (__Pyx_IternextUnpackEndCheck(__pyx_t_7(__pyx_t_6), 4) < 0) __PYX_ERR(0, 142, __pyx_L1_error) - __pyx_t_7 = NULL; - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - goto __pyx_L5_unpacking_done; - __pyx_L4_unpacking_failed:; - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - __pyx_t_7 = NULL; - if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); - __PYX_ERR(0, 142, __pyx_L1_error) - __pyx_L5_unpacking_done:; - } - __pyx_t_8 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_3); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; - __pyx_t_10 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - __pyx_t_11 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_5); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 142, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - __pyx_cur_scope->__pyx_v_a = __pyx_t_8; - __pyx_cur_scope->__pyx_v_b = __pyx_t_9; - __pyx_cur_scope->__pyx_v_c = __pyx_t_10; - __pyx_cur_scope->__pyx_v_d = __pyx_t_11; - - /* "fontTools/cu2qu/cu2qu.py":143 - * def _split_cubic_into_n_gen(p0, p1, p2, p3, n): - * a, b, c, d = calc_cubic_parameters(p0, p1, p2, p3) - * dt = 1 / n # <<<<<<<<<<<<<< - * delta_2 = dt * dt - * delta_3 = dt * delta_2 - */ - if (unlikely(__pyx_cur_scope->__pyx_v_n == 0)) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - 
__PYX_ERR(0, 143, __pyx_L1_error) - } - __pyx_cur_scope->__pyx_v_dt = (1.0 / ((double)__pyx_cur_scope->__pyx_v_n)); - - /* "fontTools/cu2qu/cu2qu.py":144 - * a, b, c, d = calc_cubic_parameters(p0, p1, p2, p3) - * dt = 1 / n - * delta_2 = dt * dt # <<<<<<<<<<<<<< - * delta_3 = dt * delta_2 - * for i in range(n): - */ - __pyx_cur_scope->__pyx_v_delta_2 = (__pyx_cur_scope->__pyx_v_dt * __pyx_cur_scope->__pyx_v_dt); - - /* "fontTools/cu2qu/cu2qu.py":145 - * dt = 1 / n - * delta_2 = dt * dt - * delta_3 = dt * delta_2 # <<<<<<<<<<<<<< - * for i in range(n): - * t1 = i * dt - */ - __pyx_cur_scope->__pyx_v_delta_3 = (__pyx_cur_scope->__pyx_v_dt * __pyx_cur_scope->__pyx_v_delta_2); - - /* "fontTools/cu2qu/cu2qu.py":146 - * delta_2 = dt * dt - * delta_3 = dt * delta_2 - * for i in range(n): # <<<<<<<<<<<<<< - * t1 = i * dt - * t1_2 = t1 * t1 - */ - __pyx_t_12 = __pyx_cur_scope->__pyx_v_n; - __pyx_t_13 = __pyx_t_12; - for (__pyx_t_14 = 0; __pyx_t_14 < __pyx_t_13; __pyx_t_14+=1) { - __pyx_cur_scope->__pyx_v_i = __pyx_t_14; - - /* "fontTools/cu2qu/cu2qu.py":147 - * delta_3 = dt * delta_2 - * for i in range(n): - * t1 = i * dt # <<<<<<<<<<<<<< - * t1_2 = t1 * t1 - * # calc new a, b, c and d - */ - __pyx_cur_scope->__pyx_v_t1 = (__pyx_cur_scope->__pyx_v_i * __pyx_cur_scope->__pyx_v_dt); - - /* "fontTools/cu2qu/cu2qu.py":148 - * for i in range(n): - * t1 = i * dt - * t1_2 = t1 * t1 # <<<<<<<<<<<<<< - * # calc new a, b, c and d - * a1 = a * delta_3 - */ - __pyx_cur_scope->__pyx_v_t1_2 = (__pyx_cur_scope->__pyx_v_t1 * __pyx_cur_scope->__pyx_v_t1); - - /* "fontTools/cu2qu/cu2qu.py":150 - * t1_2 = t1 * t1 - * # calc new a, b, c and d - * a1 = a * delta_3 # <<<<<<<<<<<<<< - * b1 = (3 * a * t1 + b) * delta_2 - * c1 = (2 * b * t1 + c + 3 * a * t1_2) * dt - */ - __pyx_cur_scope->__pyx_v_a1 = __Pyx_c_prod_double(__pyx_cur_scope->__pyx_v_a, __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_delta_3, 0)); - - /* "fontTools/cu2qu/cu2qu.py":151 - * # calc new a, b, c and d - * a1 = a * 
delta_3 - * b1 = (3 * a * t1 + b) * delta_2 # <<<<<<<<<<<<<< - * c1 = (2 * b * t1 + c + 3 * a * t1_2) * dt - * d1 = a * t1 * t1_2 + b * t1_2 + c * t1 + d - */ - __pyx_cur_scope->__pyx_v_b1 = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_prod_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __pyx_cur_scope->__pyx_v_a), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1, 0)), __pyx_cur_scope->__pyx_v_b), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_delta_2, 0)); - - /* "fontTools/cu2qu/cu2qu.py":152 - * a1 = a * delta_3 - * b1 = (3 * a * t1 + b) * delta_2 - * c1 = (2 * b * t1 + c + 3 * a * t1_2) * dt # <<<<<<<<<<<<<< - * d1 = a * t1 * t1_2 + b * t1_2 + c * t1 + d - * yield calc_cubic_points(a1, b1, c1, d1) - */ - __pyx_cur_scope->__pyx_v_c1 = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_prod_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(2, 0), __pyx_cur_scope->__pyx_v_b), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1, 0)), __pyx_cur_scope->__pyx_v_c), __Pyx_c_prod_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __pyx_cur_scope->__pyx_v_a), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1_2, 0))), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_dt, 0)); - - /* "fontTools/cu2qu/cu2qu.py":153 - * b1 = (3 * a * t1 + b) * delta_2 - * c1 = (2 * b * t1 + c + 3 * a * t1_2) * dt - * d1 = a * t1 * t1_2 + b * t1_2 + c * t1 + d # <<<<<<<<<<<<<< - * yield calc_cubic_points(a1, b1, c1, d1) - * - */ - __pyx_cur_scope->__pyx_v_d1 = __Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_prod_double(__Pyx_c_prod_double(__pyx_cur_scope->__pyx_v_a, __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1, 0)), __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1_2, 0)), __Pyx_c_prod_double(__pyx_cur_scope->__pyx_v_b, __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1_2, 0))), 
__Pyx_c_prod_double(__pyx_cur_scope->__pyx_v_c, __pyx_t_double_complex_from_parts(__pyx_cur_scope->__pyx_v_t1, 0))), __pyx_cur_scope->__pyx_v_d); - - /* "fontTools/cu2qu/cu2qu.py":154 - * c1 = (2 * b * t1 + c + 3 * a * t1_2) * dt - * d1 = a * t1 * t1_2 + b * t1_2 + c * t1 + d - * yield calc_cubic_points(a1, b1, c1, d1) # <<<<<<<<<<<<<< - * - * - */ - __pyx_t_1 = __pyx_f_9fontTools_5cu2qu_5cu2qu_calc_cubic_points(__pyx_cur_scope->__pyx_v_a1, __pyx_cur_scope->__pyx_v_b1, __pyx_cur_scope->__pyx_v_c1, __pyx_cur_scope->__pyx_v_d1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 154, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_r = __pyx_t_1; - __pyx_t_1 = 0; - __pyx_cur_scope->__pyx_t_0 = __pyx_t_12; - __pyx_cur_scope->__pyx_t_1 = __pyx_t_13; - __pyx_cur_scope->__pyx_t_2 = __pyx_t_14; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - __Pyx_Coroutine_ResetAndClearException(__pyx_generator); - /* return from generator, yielding value */ - __pyx_generator->resume_label = 1; - return __pyx_r; - __pyx_L8_resume_from_yield:; - __pyx_t_12 = __pyx_cur_scope->__pyx_t_0; - __pyx_t_13 = __pyx_cur_scope->__pyx_t_1; - __pyx_t_14 = __pyx_cur_scope->__pyx_t_2; - if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 154, __pyx_L1_error) - } - CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope); - - /* "fontTools/cu2qu/cu2qu.py":127 - * - * - * @cython.locals( # <<<<<<<<<<<<<< - * p0=cython.complex, - * p1=cython.complex, - */ - - /* function exit code */ - PyErr_SetNone(PyExc_StopIteration); - goto __pyx_L0; - __pyx_L1_error:; - __Pyx_Generator_Replace_StopIteration(0); - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_3); - __Pyx_XDECREF(__pyx_t_4); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_AddTraceback("_split_cubic_into_n_gen", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_L0:; - __Pyx_XDECREF(__pyx_r); __pyx_r = 0; - #if !CYTHON_USE_EXC_INFO_STACK - __Pyx_Coroutine_ResetAndClearException(__pyx_generator); - #endif - 
__pyx_generator->resume_label = -1; - __Pyx_Coroutine_clear((PyObject*)__pyx_generator); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":157 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_two(__pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3) { - __pyx_t_double_complex __pyx_v_mid; - __pyx_t_double_complex __pyx_v_deriv3; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - __pyx_t_double_complex __pyx_t_2; - PyObject *__pyx_t_3 = NULL; - PyObject *__pyx_t_4 = NULL; - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - PyObject *__pyx_t_7 = NULL; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("split_cubic_into_two", 0); - - /* "fontTools/cu2qu/cu2qu.py":178 - * values). 
- * """ - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 # <<<<<<<<<<<<<< - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - * return ( - */ - __pyx_v_mid = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__pyx_v_p0, __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __Pyx_c_sum_double(__pyx_v_p1, __pyx_v_p2))), __pyx_v_p3), __pyx_t_double_complex_from_parts(0.125, 0)); - - /* "fontTools/cu2qu/cu2qu.py":179 - * """ - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 # <<<<<<<<<<<<<< - * return ( - * (p0, (p0 + p1) * 0.5, mid - deriv3, mid), - */ - __pyx_v_deriv3 = __Pyx_c_prod_double(__Pyx_c_diff_double(__Pyx_c_diff_double(__Pyx_c_sum_double(__pyx_v_p3, __pyx_v_p2), __pyx_v_p1), __pyx_v_p0), __pyx_t_double_complex_from_parts(0.125, 0)); - - /* "fontTools/cu2qu/cu2qu.py":180 - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - * return ( # <<<<<<<<<<<<<< - * (p0, (p0 + p1) * 0.5, mid - deriv3, mid), - * (mid, mid + deriv3, (p2 + p3) * 0.5, p3), - */ - __Pyx_XDECREF(__pyx_r); - - /* "fontTools/cu2qu/cu2qu.py":181 - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - * return ( - * (p0, (p0 + p1) * 0.5, mid - deriv3, mid), # <<<<<<<<<<<<<< - * (mid, mid + deriv3, (p2 + p3) * 0.5, p3), - * ) - */ - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_p0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 181, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __Pyx_c_prod_double(__Pyx_c_sum_double(__pyx_v_p0, __pyx_v_p1), __pyx_t_double_complex_from_parts(0.5, 0)); - __pyx_t_3 = __pyx_PyComplex_FromComplex(__pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 181, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_2 = __Pyx_c_diff_double(__pyx_v_mid, __pyx_v_deriv3); - __pyx_t_4 = __pyx_PyComplex_FromComplex(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 181, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_v_mid); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 181, 
__pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_6 = PyTuple_New(4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 181, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_1); - __Pyx_GIVEREF(__pyx_t_3); - PyTuple_SET_ITEM(__pyx_t_6, 1, __pyx_t_3); - __Pyx_GIVEREF(__pyx_t_4); - PyTuple_SET_ITEM(__pyx_t_6, 2, __pyx_t_4); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_6, 3, __pyx_t_5); - __pyx_t_1 = 0; - __pyx_t_3 = 0; - __pyx_t_4 = 0; - __pyx_t_5 = 0; - - /* "fontTools/cu2qu/cu2qu.py":182 - * return ( - * (p0, (p0 + p1) * 0.5, mid - deriv3, mid), - * (mid, mid + deriv3, (p2 + p3) * 0.5, p3), # <<<<<<<<<<<<<< - * ) - * - */ - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_v_mid); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 182, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_2 = __Pyx_c_sum_double(__pyx_v_mid, __pyx_v_deriv3); - __pyx_t_4 = __pyx_PyComplex_FromComplex(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 182, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __pyx_t_2 = __Pyx_c_prod_double(__Pyx_c_sum_double(__pyx_v_p2, __pyx_v_p3), __pyx_t_double_complex_from_parts(0.5, 0)); - __pyx_t_3 = __pyx_PyComplex_FromComplex(__pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 182, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_3); - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_p3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 182, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_7 = PyTuple_New(4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 182, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_5); - __Pyx_GIVEREF(__pyx_t_4); - PyTuple_SET_ITEM(__pyx_t_7, 1, __pyx_t_4); - __Pyx_GIVEREF(__pyx_t_3); - PyTuple_SET_ITEM(__pyx_t_7, 2, __pyx_t_3); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_7, 3, __pyx_t_1); - __pyx_t_5 = 0; - __pyx_t_4 = 0; - __pyx_t_3 = 0; - __pyx_t_1 = 0; - - /* "fontTools/cu2qu/cu2qu.py":181 - * deriv3 = (p3 + p2 - p1 - p0) * 
0.125 - * return ( - * (p0, (p0 + p1) * 0.5, mid - deriv3, mid), # <<<<<<<<<<<<<< - * (mid, mid + deriv3, (p2 + p3) * 0.5, p3), - * ) - */ - __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 181, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_6); - __Pyx_GIVEREF(__pyx_t_7); - PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_t_7); - __pyx_t_6 = 0; - __pyx_t_7 = 0; - __pyx_r = __pyx_t_1; - __pyx_t_1 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":157 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_3); - __Pyx_XDECREF(__pyx_t_4); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_7); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.split_cubic_into_two", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":186 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_three(__pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3) { - __pyx_t_double_complex __pyx_v_mid1; - __pyx_t_double_complex __pyx_v_deriv1; - __pyx_t_double_complex __pyx_v_mid2; - __pyx_t_double_complex __pyx_v_deriv2; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - __pyx_t_double_complex __pyx_t_2; - __pyx_t_double_complex __pyx_t_3; - __pyx_t_double_complex __pyx_t_4; - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - PyObject *__pyx_t_7 = NULL; - PyObject *__pyx_t_8 = NULL; - PyObject *__pyx_t_9 = NULL; - PyObject *__pyx_t_10 = NULL; - int __pyx_lineno = 0; - const char 
*__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("split_cubic_into_three", 0); - - /* "fontTools/cu2qu/cu2qu.py":215 - * values). - * """ - * mid1 = (8 * p0 + 12 * p1 + 6 * p2 + p3) * (1 / 27) # <<<<<<<<<<<<<< - * deriv1 = (p3 + 3 * p2 - 4 * p0) * (1 / 27) - * mid2 = (p0 + 6 * p1 + 12 * p2 + 8 * p3) * (1 / 27) - */ - __pyx_v_mid1 = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(8, 0), __pyx_v_p0), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(12, 0), __pyx_v_p1)), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(6, 0), __pyx_v_p2)), __pyx_v_p3), __pyx_t_double_complex_from_parts((1.0 / 27.0), 0)); - - /* "fontTools/cu2qu/cu2qu.py":216 - * """ - * mid1 = (8 * p0 + 12 * p1 + 6 * p2 + p3) * (1 / 27) - * deriv1 = (p3 + 3 * p2 - 4 * p0) * (1 / 27) # <<<<<<<<<<<<<< - * mid2 = (p0 + 6 * p1 + 12 * p2 + 8 * p3) * (1 / 27) - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) - */ - __pyx_v_deriv1 = __Pyx_c_prod_double(__Pyx_c_diff_double(__Pyx_c_sum_double(__pyx_v_p3, __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __pyx_v_p2)), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(4, 0), __pyx_v_p0)), __pyx_t_double_complex_from_parts((1.0 / 27.0), 0)); - - /* "fontTools/cu2qu/cu2qu.py":217 - * mid1 = (8 * p0 + 12 * p1 + 6 * p2 + p3) * (1 / 27) - * deriv1 = (p3 + 3 * p2 - 4 * p0) * (1 / 27) - * mid2 = (p0 + 6 * p1 + 12 * p2 + 8 * p3) * (1 / 27) # <<<<<<<<<<<<<< - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) - * return ( - */ - __pyx_v_mid2 = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__pyx_v_p0, __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(6, 0), __pyx_v_p1)), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(12, 0), __pyx_v_p2)), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(8, 0), __pyx_v_p3)), __pyx_t_double_complex_from_parts((1.0 / 27.0), 0)); - - /* "fontTools/cu2qu/cu2qu.py":218 - * 
deriv1 = (p3 + 3 * p2 - 4 * p0) * (1 / 27) - * mid2 = (p0 + 6 * p1 + 12 * p2 + 8 * p3) * (1 / 27) - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) # <<<<<<<<<<<<<< - * return ( - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), - */ - __pyx_v_deriv2 = __Pyx_c_prod_double(__Pyx_c_diff_double(__Pyx_c_diff_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(4, 0), __pyx_v_p3), __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __pyx_v_p1)), __pyx_v_p0), __pyx_t_double_complex_from_parts((1.0 / 27.0), 0)); - - /* "fontTools/cu2qu/cu2qu.py":219 - * mid2 = (p0 + 6 * p1 + 12 * p2 + 8 * p3) * (1 / 27) - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) - * return ( # <<<<<<<<<<<<<< - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), - * (mid1, mid1 + deriv1, mid2 - deriv2, mid2), - */ - __Pyx_XDECREF(__pyx_r); - - /* "fontTools/cu2qu/cu2qu.py":220 - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) - * return ( - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), # <<<<<<<<<<<<<< - * (mid1, mid1 + deriv1, mid2 - deriv2, mid2), - * (mid2, mid2 + deriv2, (p2 + 2 * p3) / 3.0, p3), - */ - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_p0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __Pyx_c_sum_double(__Pyx_c_prod_double(__pyx_t_double_complex_from_parts(2, 0), __pyx_v_p0), __pyx_v_p1); - __pyx_t_3 = __pyx_t_double_complex_from_parts(3.0, 0); - if (unlikely(__Pyx_c_is_zero_double(__pyx_t_3))) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 220, __pyx_L1_error) - } - __pyx_t_4 = __Pyx_c_quot_double(__pyx_t_2, __pyx_t_3); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_t_4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_4 = __Pyx_c_diff_double(__pyx_v_mid1, __pyx_v_deriv1); - __pyx_t_6 = __pyx_PyComplex_FromComplex(__pyx_t_4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_7 = 
__pyx_PyComplex_FromComplex(__pyx_v_mid1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_8 = PyTuple_New(4); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_8, 0, __pyx_t_1); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_8, 1, __pyx_t_5); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_8, 2, __pyx_t_6); - __Pyx_GIVEREF(__pyx_t_7); - PyTuple_SET_ITEM(__pyx_t_8, 3, __pyx_t_7); - __pyx_t_1 = 0; - __pyx_t_5 = 0; - __pyx_t_6 = 0; - __pyx_t_7 = 0; - - /* "fontTools/cu2qu/cu2qu.py":221 - * return ( - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), - * (mid1, mid1 + deriv1, mid2 - deriv2, mid2), # <<<<<<<<<<<<<< - * (mid2, mid2 + deriv2, (p2 + 2 * p3) / 3.0, p3), - * ) - */ - __pyx_t_7 = __pyx_PyComplex_FromComplex(__pyx_v_mid1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 221, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_4 = __Pyx_c_sum_double(__pyx_v_mid1, __pyx_v_deriv1); - __pyx_t_6 = __pyx_PyComplex_FromComplex(__pyx_t_4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 221, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_4 = __Pyx_c_diff_double(__pyx_v_mid2, __pyx_v_deriv2); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_t_4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 221, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_mid2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 221, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_9 = PyTuple_New(4); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 221, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_9); - __Pyx_GIVEREF(__pyx_t_7); - PyTuple_SET_ITEM(__pyx_t_9, 0, __pyx_t_7); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_9, 1, __pyx_t_6); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_9, 2, __pyx_t_5); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_9, 3, __pyx_t_1); - __pyx_t_7 = 0; - __pyx_t_6 = 0; - __pyx_t_5 = 0; - 
__pyx_t_1 = 0; - - /* "fontTools/cu2qu/cu2qu.py":222 - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), - * (mid1, mid1 + deriv1, mid2 - deriv2, mid2), - * (mid2, mid2 + deriv2, (p2 + 2 * p3) / 3.0, p3), # <<<<<<<<<<<<<< - * ) - * - */ - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_mid2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 222, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_4 = __Pyx_c_sum_double(__pyx_v_mid2, __pyx_v_deriv2); - __pyx_t_5 = __pyx_PyComplex_FromComplex(__pyx_t_4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 222, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_4 = __Pyx_c_sum_double(__pyx_v_p2, __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(2, 0), __pyx_v_p3)); - __pyx_t_3 = __pyx_t_double_complex_from_parts(3.0, 0); - if (unlikely(__Pyx_c_is_zero_double(__pyx_t_3))) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 222, __pyx_L1_error) - } - __pyx_t_2 = __Pyx_c_quot_double(__pyx_t_4, __pyx_t_3); - __pyx_t_6 = __pyx_PyComplex_FromComplex(__pyx_t_2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 222, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_7 = __pyx_PyComplex_FromComplex(__pyx_v_p3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 222, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_10 = PyTuple_New(4); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 222, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_10); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_10, 0, __pyx_t_1); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_10, 1, __pyx_t_5); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_10, 2, __pyx_t_6); - __Pyx_GIVEREF(__pyx_t_7); - PyTuple_SET_ITEM(__pyx_t_10, 3, __pyx_t_7); - __pyx_t_1 = 0; - __pyx_t_5 = 0; - __pyx_t_6 = 0; - __pyx_t_7 = 0; - - /* "fontTools/cu2qu/cu2qu.py":220 - * deriv2 = (4 * p3 - 3 * p1 - p0) * (1 / 27) - * return ( - * (p0, (2 * p0 + p1) / 3.0, mid1 - deriv1, mid1), # <<<<<<<<<<<<<< - * (mid1, mid1 + deriv1, mid2 - deriv2, mid2), - * (mid2, mid2 + deriv2, (p2 + 2 * p3) / 
3.0, p3), - */ - __pyx_t_7 = PyTuple_New(3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 220, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_GIVEREF(__pyx_t_8); - PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_8); - __Pyx_GIVEREF(__pyx_t_9); - PyTuple_SET_ITEM(__pyx_t_7, 1, __pyx_t_9); - __Pyx_GIVEREF(__pyx_t_10); - PyTuple_SET_ITEM(__pyx_t_7, 2, __pyx_t_10); - __pyx_t_8 = 0; - __pyx_t_9 = 0; - __pyx_t_10 = 0; - __pyx_r = __pyx_t_7; - __pyx_t_7 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":186 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals( - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_7); - __Pyx_XDECREF(__pyx_t_8); - __Pyx_XDECREF(__pyx_t_9); - __Pyx_XDECREF(__pyx_t_10); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.split_cubic_into_three", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":226 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.complex) - */ - -static CYTHON_INLINE __pyx_t_double_complex __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_control(double __pyx_v_t, __pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3) { - __pyx_t_double_complex __pyx_v__p1; - __pyx_t_double_complex __pyx_v__p2; - __pyx_t_double_complex __pyx_r; - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("cubic_approx_control", 0); - - /* "fontTools/cu2qu/cu2qu.py":250 - * complex: Location of candidate control point on quadratic curve. 
- * """ - * _p1 = p0 + (p1 - p0) * 1.5 # <<<<<<<<<<<<<< - * _p2 = p3 + (p2 - p3) * 1.5 - * return _p1 + (_p2 - _p1) * t - */ - __pyx_v__p1 = __Pyx_c_sum_double(__pyx_v_p0, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_p1, __pyx_v_p0), __pyx_t_double_complex_from_parts(1.5, 0))); - - /* "fontTools/cu2qu/cu2qu.py":251 - * """ - * _p1 = p0 + (p1 - p0) * 1.5 - * _p2 = p3 + (p2 - p3) * 1.5 # <<<<<<<<<<<<<< - * return _p1 + (_p2 - _p1) * t - * - */ - __pyx_v__p2 = __Pyx_c_sum_double(__pyx_v_p3, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_p2, __pyx_v_p3), __pyx_t_double_complex_from_parts(1.5, 0))); - - /* "fontTools/cu2qu/cu2qu.py":252 - * _p1 = p0 + (p1 - p0) * 1.5 - * _p2 = p3 + (p2 - p3) * 1.5 - * return _p1 + (_p2 - _p1) * t # <<<<<<<<<<<<<< - * - * - */ - __pyx_r = __Pyx_c_sum_double(__pyx_v__p1, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v__p2, __pyx_v__p1), __pyx_t_double_complex_from_parts(__pyx_v_t, 0))); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":226 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.complex) - */ - - /* function exit code */ - __pyx_L0:; - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":255 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.complex) - */ - -static CYTHON_INLINE __pyx_t_double_complex __pyx_f_9fontTools_5cu2qu_5cu2qu_calc_intersect(__pyx_t_double_complex __pyx_v_a, __pyx_t_double_complex __pyx_v_b, __pyx_t_double_complex __pyx_v_c, __pyx_t_double_complex __pyx_v_d) { - __pyx_t_double_complex __pyx_v_ab; - __pyx_t_double_complex __pyx_v_cd; - __pyx_t_double_complex __pyx_v_p; - double __pyx_v_h; - __pyx_t_double_complex __pyx_r; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - PyObject *__pyx_t_3 = NULL; - double __pyx_t_4; - double __pyx_t_5; - int __pyx_t_6; - PyObject *__pyx_t_7 = NULL; - PyObject *__pyx_t_8 = NULL; - PyObject *__pyx_t_9 = NULL; - PyObject 
*__pyx_t_10 = NULL; - PyObject *__pyx_t_11 = NULL; - PyObject *__pyx_t_12 = NULL; - __pyx_t_double_complex __pyx_t_13; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("calc_intersect", 0); - - /* "fontTools/cu2qu/cu2qu.py":273 - * if no intersection was found. - * """ - * ab = b - a # <<<<<<<<<<<<<< - * cd = d - c - * p = ab * 1j - */ - __pyx_v_ab = __Pyx_c_diff_double(__pyx_v_b, __pyx_v_a); - - /* "fontTools/cu2qu/cu2qu.py":274 - * """ - * ab = b - a - * cd = d - c # <<<<<<<<<<<<<< - * p = ab * 1j - * try: - */ - __pyx_v_cd = __Pyx_c_diff_double(__pyx_v_d, __pyx_v_c); - - /* "fontTools/cu2qu/cu2qu.py":275 - * ab = b - a - * cd = d - c - * p = ab * 1j # <<<<<<<<<<<<<< - * try: - * h = dot(p, a - c) / dot(p, cd) - */ - __pyx_v_p = __Pyx_c_prod_double(__pyx_v_ab, __pyx_t_double_complex_from_parts(0, 1.0)); - - /* "fontTools/cu2qu/cu2qu.py":276 - * cd = d - c - * p = ab * 1j - * try: # <<<<<<<<<<<<<< - * h = dot(p, a - c) / dot(p, cd) - * except ZeroDivisionError: - */ - { - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); - __Pyx_XGOTREF(__pyx_t_1); - __Pyx_XGOTREF(__pyx_t_2); - __Pyx_XGOTREF(__pyx_t_3); - /*try:*/ { - - /* "fontTools/cu2qu/cu2qu.py":277 - * p = ab * 1j - * try: - * h = dot(p, a - c) / dot(p, cd) # <<<<<<<<<<<<<< - * except ZeroDivisionError: - * return complex(NAN, NAN) - */ - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_dot(__pyx_v_p, __Pyx_c_diff_double(__pyx_v_a, __pyx_v_c)); if (unlikely(__pyx_t_4 == ((double)-1) && PyErr_Occurred())) __PYX_ERR(0, 277, __pyx_L3_error) - __pyx_t_5 = __pyx_f_9fontTools_5cu2qu_5cu2qu_dot(__pyx_v_p, __pyx_v_cd); if (unlikely(__pyx_t_5 == ((double)-1) && PyErr_Occurred())) __PYX_ERR(0, 277, __pyx_L3_error) - if (unlikely(__pyx_t_5 == 0)) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 277, __pyx_L3_error) - } - __pyx_v_h = (__pyx_t_4 / __pyx_t_5); - - /* 
"fontTools/cu2qu/cu2qu.py":276 - * cd = d - c - * p = ab * 1j - * try: # <<<<<<<<<<<<<< - * h = dot(p, a - c) / dot(p, cd) - * except ZeroDivisionError: - */ - } - __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; - __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; - goto __pyx_L8_try_end; - __pyx_L3_error:; - - /* "fontTools/cu2qu/cu2qu.py":278 - * try: - * h = dot(p, a - c) / dot(p, cd) - * except ZeroDivisionError: # <<<<<<<<<<<<<< - * return complex(NAN, NAN) - * return c + cd * h - */ - __pyx_t_6 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_ZeroDivisionError); - if (__pyx_t_6) { - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.calc_intersect", __pyx_clineno, __pyx_lineno, __pyx_filename); - if (__Pyx_GetException(&__pyx_t_7, &__pyx_t_8, &__pyx_t_9) < 0) __PYX_ERR(0, 278, __pyx_L5_except_error) - __Pyx_XGOTREF(__pyx_t_7); - __Pyx_XGOTREF(__pyx_t_8); - __Pyx_XGOTREF(__pyx_t_9); - - /* "fontTools/cu2qu/cu2qu.py":279 - * h = dot(p, a - c) / dot(p, cd) - * except ZeroDivisionError: - * return complex(NAN, NAN) # <<<<<<<<<<<<<< - * return c + cd * h - * - */ - __Pyx_GetModuleGlobalName(__pyx_t_10, __pyx_n_s_NAN); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 279, __pyx_L5_except_error) - __Pyx_GOTREF(__pyx_t_10); - __Pyx_GetModuleGlobalName(__pyx_t_11, __pyx_n_s_NAN); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 279, __pyx_L5_except_error) - __Pyx_GOTREF(__pyx_t_11); - __pyx_t_12 = PyTuple_New(2); if (unlikely(!__pyx_t_12)) __PYX_ERR(0, 279, __pyx_L5_except_error) - __Pyx_GOTREF(__pyx_t_12); - __Pyx_GIVEREF(__pyx_t_10); - PyTuple_SET_ITEM(__pyx_t_12, 0, __pyx_t_10); - __Pyx_GIVEREF(__pyx_t_11); - PyTuple_SET_ITEM(__pyx_t_12, 1, __pyx_t_11); - __pyx_t_10 = 0; - __pyx_t_11 = 0; - __pyx_t_11 = __Pyx_PyObject_Call(((PyObject *)(&PyComplex_Type)), __pyx_t_12, NULL); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 279, __pyx_L5_except_error) - __Pyx_GOTREF(__pyx_t_11); - __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; - __pyx_t_13 = 
__Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_11); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 279, __pyx_L5_except_error) - __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; - __pyx_r = __pyx_t_13; - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; - goto __pyx_L6_except_return; - } - goto __pyx_L5_except_error; - - /* "fontTools/cu2qu/cu2qu.py":276 - * cd = d - c - * p = ab * 1j - * try: # <<<<<<<<<<<<<< - * h = dot(p, a - c) / dot(p, cd) - * except ZeroDivisionError: - */ - __pyx_L5_except_error:; - __Pyx_XGIVEREF(__pyx_t_1); - __Pyx_XGIVEREF(__pyx_t_2); - __Pyx_XGIVEREF(__pyx_t_3); - __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); - goto __pyx_L1_error; - __pyx_L6_except_return:; - __Pyx_XGIVEREF(__pyx_t_1); - __Pyx_XGIVEREF(__pyx_t_2); - __Pyx_XGIVEREF(__pyx_t_3); - __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); - goto __pyx_L0; - __pyx_L8_try_end:; - } - - /* "fontTools/cu2qu/cu2qu.py":280 - * except ZeroDivisionError: - * return complex(NAN, NAN) - * return c + cd * h # <<<<<<<<<<<<<< - * - * - */ - __pyx_r = __Pyx_c_sum_double(__pyx_v_c, __Pyx_c_prod_double(__pyx_v_cd, __pyx_t_double_complex_from_parts(__pyx_v_h, 0))); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":255 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.returns(cython.complex) - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_7); - __Pyx_XDECREF(__pyx_t_8); - __Pyx_XDECREF(__pyx_t_9); - __Pyx_XDECREF(__pyx_t_10); - __Pyx_XDECREF(__pyx_t_11); - __Pyx_XDECREF(__pyx_t_12); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.calc_intersect", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = __pyx_t_double_complex_from_parts(0, 0); - __pyx_L0:; - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":283 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.returns(cython.int) - * @cython.locals( - */ - -static int 
__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_t_double_complex __pyx_v_p0, __pyx_t_double_complex __pyx_v_p1, __pyx_t_double_complex __pyx_v_p2, __pyx_t_double_complex __pyx_v_p3, double __pyx_v_tolerance) { - __pyx_t_double_complex __pyx_v_mid; - __pyx_t_double_complex __pyx_v_deriv3; - int __pyx_r; - __Pyx_RefNannyDeclarations - int __pyx_t_1; - int __pyx_t_2; - int __pyx_t_3; - int __pyx_t_4; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("cubic_farthest_fit_inside", 0); - - /* "fontTools/cu2qu/cu2qu.py":312 - * """ - * # First check p2 then p1, as p2 has higher error early on. - * if abs(p2) <= tolerance and abs(p1) <= tolerance: # <<<<<<<<<<<<<< - * return True - * - */ - __pyx_t_2 = (__Pyx_c_abs_double(__pyx_v_p2) <= __pyx_v_tolerance); - if (__pyx_t_2) { - } else { - __pyx_t_1 = __pyx_t_2; - goto __pyx_L4_bool_binop_done; - } - __pyx_t_2 = (__Pyx_c_abs_double(__pyx_v_p1) <= __pyx_v_tolerance); - __pyx_t_1 = __pyx_t_2; - __pyx_L4_bool_binop_done:; - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":313 - * # First check p2 then p1, as p2 has higher error early on. - * if abs(p2) <= tolerance and abs(p1) <= tolerance: - * return True # <<<<<<<<<<<<<< - * - * # Split. - */ - __pyx_r = 1; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":312 - * """ - * # First check p2 then p1, as p2 has higher error early on. - * if abs(p2) <= tolerance and abs(p1) <= tolerance: # <<<<<<<<<<<<<< - * return True - * - */ - } - - /* "fontTools/cu2qu/cu2qu.py":316 - * - * # Split. - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 # <<<<<<<<<<<<<< - * if abs(mid) > tolerance: - * return False - */ - __pyx_v_mid = __Pyx_c_prod_double(__Pyx_c_sum_double(__Pyx_c_sum_double(__pyx_v_p0, __Pyx_c_prod_double(__pyx_t_double_complex_from_parts(3, 0), __Pyx_c_sum_double(__pyx_v_p1, __pyx_v_p2))), __pyx_v_p3), __pyx_t_double_complex_from_parts(0.125, 0)); - - /* "fontTools/cu2qu/cu2qu.py":317 - * # Split. 
- * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 - * if abs(mid) > tolerance: # <<<<<<<<<<<<<< - * return False - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - */ - __pyx_t_1 = (__Pyx_c_abs_double(__pyx_v_mid) > __pyx_v_tolerance); - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":318 - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 - * if abs(mid) > tolerance: - * return False # <<<<<<<<<<<<<< - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - * return cubic_farthest_fit_inside( - */ - __pyx_r = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":317 - * # Split. - * mid = (p0 + 3 * (p1 + p2) + p3) * 0.125 - * if abs(mid) > tolerance: # <<<<<<<<<<<<<< - * return False - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - */ - } - - /* "fontTools/cu2qu/cu2qu.py":319 - * if abs(mid) > tolerance: - * return False - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 # <<<<<<<<<<<<<< - * return cubic_farthest_fit_inside( - * p0, (p0 + p1) * 0.5, mid - deriv3, mid, tolerance - */ - __pyx_v_deriv3 = __Pyx_c_prod_double(__Pyx_c_diff_double(__Pyx_c_diff_double(__Pyx_c_sum_double(__pyx_v_p3, __pyx_v_p2), __pyx_v_p1), __pyx_v_p0), __pyx_t_double_complex_from_parts(0.125, 0)); - - /* "fontTools/cu2qu/cu2qu.py":320 - * return False - * deriv3 = (p3 + p2 - p1 - p0) * 0.125 - * return cubic_farthest_fit_inside( # <<<<<<<<<<<<<< - * p0, (p0 + p1) * 0.5, mid - deriv3, mid, tolerance - * ) and cubic_farthest_fit_inside(mid, mid + deriv3, (p2 + p3) * 0.5, p3, tolerance) - */ - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_v_p0, __Pyx_c_prod_double(__Pyx_c_sum_double(__pyx_v_p0, __pyx_v_p1), __pyx_t_double_complex_from_parts(0.5, 0)), __Pyx_c_diff_double(__pyx_v_mid, __pyx_v_deriv3), __pyx_v_mid, __pyx_v_tolerance); if (unlikely(__pyx_t_4 == ((int)-1) && PyErr_Occurred())) __PYX_ERR(0, 320, __pyx_L1_error) - if (__pyx_t_4) { - } else { - __pyx_t_3 = __pyx_t_4; - goto __pyx_L7_bool_binop_done; - } - - /* "fontTools/cu2qu/cu2qu.py":322 - * return cubic_farthest_fit_inside( - * p0, (p0 + p1) * 
0.5, mid - deriv3, mid, tolerance - * ) and cubic_farthest_fit_inside(mid, mid + deriv3, (p2 + p3) * 0.5, p3, tolerance) # <<<<<<<<<<<<<< - * - * - */ - __pyx_t_4 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_v_mid, __Pyx_c_sum_double(__pyx_v_mid, __pyx_v_deriv3), __Pyx_c_prod_double(__Pyx_c_sum_double(__pyx_v_p2, __pyx_v_p3), __pyx_t_double_complex_from_parts(0.5, 0)), __pyx_v_p3, __pyx_v_tolerance); if (unlikely(__pyx_t_4 == ((int)-1) && PyErr_Occurred())) __PYX_ERR(0, 322, __pyx_L1_error) - __pyx_t_3 = __pyx_t_4; - __pyx_L7_bool_binop_done:; - __pyx_r = __pyx_t_3; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":283 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.returns(cython.int) - * @cython.locals( - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.cubic_farthest_fit_inside", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = -1; - __pyx_L0:; - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":325 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals(tolerance=cython.double) - */ - -static CYTHON_INLINE PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_quadratic(PyObject *__pyx_v_cubic, double __pyx_v_tolerance) { - __pyx_t_double_complex __pyx_v_q1; - __pyx_t_double_complex __pyx_v_c0; - __pyx_t_double_complex __pyx_v_c1; - __pyx_t_double_complex __pyx_v_c2; - __pyx_t_double_complex __pyx_v_c3; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - __pyx_t_double_complex __pyx_t_2; - __pyx_t_double_complex __pyx_t_3; - __pyx_t_double_complex __pyx_t_4; - __pyx_t_double_complex __pyx_t_5; - __pyx_t_double_complex __pyx_t_6; - PyObject *__pyx_t_7 = NULL; - PyObject *__pyx_t_8 = NULL; - PyObject *__pyx_t_9 = NULL; - int __pyx_t_10; - int __pyx_t_11; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - 
__Pyx_RefNannySetupContext("cubic_approx_quadratic", 0); - - /* "fontTools/cu2qu/cu2qu.py":349 - * """ - * - * q1 = calc_intersect(cubic[0], cubic[1], cubic[2], cubic[3]) # <<<<<<<<<<<<<< - * if math.isnan(q1.imag): - * return None - */ - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_3 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_4 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 349, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_6 = __pyx_f_9fontTools_5cu2qu_5cu2qu_calc_intersect(__pyx_t_2, __pyx_t_3, __pyx_t_4, __pyx_t_5); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 349, __pyx_L1_error) - __pyx_v_q1 = __pyx_t_6; - - /* "fontTools/cu2qu/cu2qu.py":350 - * - * q1 = calc_intersect(cubic[0], cubic[1], cubic[2], cubic[3]) - * if math.isnan(q1.imag): # 
<<<<<<<<<<<<<< - * return None - * c0 = cubic[0] - */ - __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_math); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 350, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_8 = __Pyx_PyObject_GetAttrStr(__pyx_t_7, __pyx_n_s_isnan); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 350, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __pyx_t_7 = PyFloat_FromDouble(__Pyx_CIMAG(__pyx_v_q1)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 350, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_9 = NULL; - __pyx_t_10 = 0; - if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_8))) { - __pyx_t_9 = PyMethod_GET_SELF(__pyx_t_8); - if (likely(__pyx_t_9)) { - PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_8); - __Pyx_INCREF(__pyx_t_9); - __Pyx_INCREF(function); - __Pyx_DECREF_SET(__pyx_t_8, function); - __pyx_t_10 = 1; - } - } - { - PyObject *__pyx_callargs[2] = {__pyx_t_9, __pyx_t_7}; - __pyx_t_1 = __Pyx_PyObject_FastCall(__pyx_t_8, __pyx_callargs+1-__pyx_t_10, 1+__pyx_t_10); - __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 350, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - } - __pyx_t_11 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely((__pyx_t_11 < 0))) __PYX_ERR(0, 350, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - if (__pyx_t_11) { - - /* "fontTools/cu2qu/cu2qu.py":351 - * q1 = calc_intersect(cubic[0], cubic[1], cubic[2], cubic[3]) - * if math.isnan(q1.imag): - * return None # <<<<<<<<<<<<<< - * c0 = cubic[0] - * c3 = cubic[3] - */ - __Pyx_XDECREF(__pyx_r); - __pyx_r = Py_None; __Pyx_INCREF(Py_None); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":350 - * - * q1 = calc_intersect(cubic[0], cubic[1], cubic[2], cubic[3]) - * if math.isnan(q1.imag): # <<<<<<<<<<<<<< - * return None - * c0 = cubic[0] - */ - } - - /* "fontTools/cu2qu/cu2qu.py":352 - * if math.isnan(q1.imag): - 
* return None - * c0 = cubic[0] # <<<<<<<<<<<<<< - * c3 = cubic[3] - * c1 = c0 + (q1 - c0) * (2 / 3) - */ - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 352, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 352, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_v_c0 = __pyx_t_6; - - /* "fontTools/cu2qu/cu2qu.py":353 - * return None - * c0 = cubic[0] - * c3 = cubic[3] # <<<<<<<<<<<<<< - * c1 = c0 + (q1 - c0) * (2 / 3) - * c2 = c3 + (q1 - c3) * (2 / 3) - */ - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 353, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 353, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_v_c3 = __pyx_t_6; - - /* "fontTools/cu2qu/cu2qu.py":354 - * c0 = cubic[0] - * c3 = cubic[3] - * c1 = c0 + (q1 - c0) * (2 / 3) # <<<<<<<<<<<<<< - * c2 = c3 + (q1 - c3) * (2 / 3) - * if not cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): - */ - __pyx_v_c1 = __Pyx_c_sum_double(__pyx_v_c0, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_q1, __pyx_v_c0), __pyx_t_double_complex_from_parts((2.0 / 3.0), 0))); - - /* "fontTools/cu2qu/cu2qu.py":355 - * c3 = cubic[3] - * c1 = c0 + (q1 - c0) * (2 / 3) - * c2 = c3 + (q1 - c3) * (2 / 3) # <<<<<<<<<<<<<< - * if not cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): - * return None - */ - __pyx_v_c2 = __Pyx_c_sum_double(__pyx_v_c3, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_q1, __pyx_v_c3), __pyx_t_double_complex_from_parts((2.0 / 3.0), 0))); - - /* "fontTools/cu2qu/cu2qu.py":356 - * c1 = c0 + (q1 - c0) * (2 / 3) - * c2 = c3 + (q1 - c3) * (2 / 3) - * if not 
cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): # <<<<<<<<<<<<<< - * return None - * return c0, q1, c3 - */ - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_c1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_cubic, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_7 = PyNumber_Subtract(__pyx_t_1, __pyx_t_8); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_7); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __pyx_t_7 = __pyx_PyComplex_FromComplex(__pyx_v_c2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_cubic, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_1 = PyNumber_Subtract(__pyx_t_7, __pyx_t_8); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_1); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 356, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_10 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_t_double_complex_from_parts(0, 0), __pyx_t_6, __pyx_t_5, __pyx_t_double_complex_from_parts(0, 0), __pyx_v_tolerance); if (unlikely(__pyx_t_10 == ((int)-1) && PyErr_Occurred())) __PYX_ERR(0, 356, __pyx_L1_error) - __pyx_t_11 = (!(__pyx_t_10 != 0)); - if (__pyx_t_11) { - - /* "fontTools/cu2qu/cu2qu.py":357 - * c2 = c3 + (q1 - c3) * (2 / 3) - 
* if not cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): - * return None # <<<<<<<<<<<<<< - * return c0, q1, c3 - * - */ - __Pyx_XDECREF(__pyx_r); - __pyx_r = Py_None; __Pyx_INCREF(Py_None); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":356 - * c1 = c0 + (q1 - c0) * (2 / 3) - * c2 = c3 + (q1 - c3) * (2 / 3) - * if not cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): # <<<<<<<<<<<<<< - * return None - * return c0, q1, c3 - */ - } - - /* "fontTools/cu2qu/cu2qu.py":358 - * if not cubic_farthest_fit_inside(0, c1 - cubic[1], c2 - cubic[2], 0, tolerance): - * return None - * return c0, q1, c3 # <<<<<<<<<<<<<< - * - * - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_1 = __pyx_PyComplex_FromComplex(__pyx_v_c0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 358, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_8 = __pyx_PyComplex_FromComplex(__pyx_v_q1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 358, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_7 = __pyx_PyComplex_FromComplex(__pyx_v_c3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 358, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __pyx_t_9 = PyTuple_New(3); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 358, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_9); - __Pyx_GIVEREF(__pyx_t_1); - PyTuple_SET_ITEM(__pyx_t_9, 0, __pyx_t_1); - __Pyx_GIVEREF(__pyx_t_8); - PyTuple_SET_ITEM(__pyx_t_9, 1, __pyx_t_8); - __Pyx_GIVEREF(__pyx_t_7); - PyTuple_SET_ITEM(__pyx_t_9, 2, __pyx_t_7); - __pyx_t_1 = 0; - __pyx_t_8 = 0; - __pyx_t_7 = 0; - __pyx_r = __pyx_t_9; - __pyx_t_9 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":325 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.inline - * @cython.locals(tolerance=cython.double) - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_7); - __Pyx_XDECREF(__pyx_t_8); - __Pyx_XDECREF(__pyx_t_9); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.cubic_approx_quadratic", __pyx_clineno, __pyx_lineno, __pyx_filename); 
- __pyx_r = 0; - __pyx_L0:; - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":361 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int, tolerance=cython.double) - * @cython.locals(i=cython.int) - */ - -static PyObject *__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_spline(PyObject *__pyx_v_cubic, int __pyx_v_n, double __pyx_v_tolerance, int __pyx_v_all_quadratic) { - __pyx_t_double_complex __pyx_v_q0; - __pyx_t_double_complex __pyx_v_q1; - __pyx_t_double_complex __pyx_v_next_q1; - __pyx_t_double_complex __pyx_v_q2; - __pyx_t_double_complex __pyx_v_d1; - CYTHON_UNUSED __pyx_t_double_complex __pyx_v_c0; - __pyx_t_double_complex __pyx_v_c1; - __pyx_t_double_complex __pyx_v_c2; - __pyx_t_double_complex __pyx_v_c3; - int __pyx_v_i; - PyObject *__pyx_v_cubics = NULL; - PyObject *__pyx_v_next_cubic = NULL; - PyObject *__pyx_v_spline = NULL; - __pyx_t_double_complex __pyx_v_d0; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - int __pyx_t_1; - PyObject *__pyx_t_2 = NULL; - int __pyx_t_3; - __pyx_t_double_complex __pyx_t_4; - __pyx_t_double_complex __pyx_t_5; - __pyx_t_double_complex __pyx_t_6; - __pyx_t_double_complex __pyx_t_7; - PyObject *__pyx_t_8 = NULL; - __pyx_t_double_complex __pyx_t_9; - PyObject *__pyx_t_10 = NULL; - long __pyx_t_11; - long __pyx_t_12; - int __pyx_t_13; - PyObject *__pyx_t_14 = NULL; - PyObject *__pyx_t_15 = NULL; - PyObject *(*__pyx_t_16)(PyObject *); - long __pyx_t_17; - int __pyx_t_18; - int __pyx_t_19; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("cubic_approx_spline", 0); - - /* "fontTools/cu2qu/cu2qu.py":390 - * """ - * - * if n == 1: # <<<<<<<<<<<<<< - * return cubic_approx_quadratic(cubic, tolerance) - * if n == 2 and all_quadratic == False: - */ - __pyx_t_1 = (__pyx_v_n == 1); - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":391 - * - * if n == 1: - * return 
cubic_approx_quadratic(cubic, tolerance) # <<<<<<<<<<<<<< - * if n == 2 and all_quadratic == False: - * return cubic - */ - __Pyx_XDECREF(__pyx_r); - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_quadratic(__pyx_v_cubic, __pyx_v_tolerance); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 391, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_r = __pyx_t_2; - __pyx_t_2 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":390 - * """ - * - * if n == 1: # <<<<<<<<<<<<<< - * return cubic_approx_quadratic(cubic, tolerance) - * if n == 2 and all_quadratic == False: - */ - } - - /* "fontTools/cu2qu/cu2qu.py":392 - * if n == 1: - * return cubic_approx_quadratic(cubic, tolerance) - * if n == 2 and all_quadratic == False: # <<<<<<<<<<<<<< - * return cubic - * - */ - __pyx_t_3 = (__pyx_v_n == 2); - if (__pyx_t_3) { - } else { - __pyx_t_1 = __pyx_t_3; - goto __pyx_L5_bool_binop_done; - } - __pyx_t_3 = (__pyx_v_all_quadratic == 0); - __pyx_t_1 = __pyx_t_3; - __pyx_L5_bool_binop_done:; - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":393 - * return cubic_approx_quadratic(cubic, tolerance) - * if n == 2 and all_quadratic == False: - * return cubic # <<<<<<<<<<<<<< - * - * cubics = split_cubic_into_n_iter(cubic[0], cubic[1], cubic[2], cubic[3], n) - */ - __Pyx_XDECREF(__pyx_r); - __Pyx_INCREF(__pyx_v_cubic); - __pyx_r = __pyx_v_cubic; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":392 - * if n == 1: - * return cubic_approx_quadratic(cubic, tolerance) - * if n == 2 and all_quadratic == False: # <<<<<<<<<<<<<< - * return cubic - * - */ - } - - /* "fontTools/cu2qu/cu2qu.py":395 - * return cubic - * - * cubics = split_cubic_into_n_iter(cubic[0], cubic[1], cubic[2], cubic[3], n) # <<<<<<<<<<<<<< - * - * # calculate the spline of quadratics and check errors at the same time. 
- */ - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_4 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_cubic, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_cubic, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = __Pyx_PyInt_From_int(__pyx_v_n); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_8 = __pyx_f_9fontTools_5cu2qu_5cu2qu_split_cubic_into_n_iter(__pyx_t_4, __pyx_t_5, __pyx_t_6, __pyx_t_7, __pyx_t_2); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 395, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_v_cubics = __pyx_t_8; - __pyx_t_8 = 0; - - /* "fontTools/cu2qu/cu2qu.py":398 - * - * # calculate the spline of quadratics and check errors at the same time. 
- * next_cubic = next(cubics) # <<<<<<<<<<<<<< - * next_q1 = cubic_approx_control( - * 0, next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - */ - __pyx_t_8 = __Pyx_PyIter_Next(__pyx_v_cubics); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 398, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_v_next_cubic = __pyx_t_8; - __pyx_t_8 = 0; - - /* "fontTools/cu2qu/cu2qu.py":400 - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( - * 0, next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] # <<<<<<<<<<<<<< - * ) - * q2 = cubic[0] - */ - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_next_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_7 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_next_cubic, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_next_cubic, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_next_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_4 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 400, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - - /* 
"fontTools/cu2qu/cu2qu.py":399 - * # calculate the spline of quadratics and check errors at the same time. - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( # <<<<<<<<<<<<<< - * 0, next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - * ) - */ - __pyx_t_9 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_control(0.0, __pyx_t_7, __pyx_t_6, __pyx_t_5, __pyx_t_4); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 399, __pyx_L1_error) - __pyx_v_next_q1 = __pyx_t_9; - - /* "fontTools/cu2qu/cu2qu.py":402 - * 0, next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - * ) - * q2 = cubic[0] # <<<<<<<<<<<<<< - * d1 = 0j - * spline = [cubic[0], next_q1] - */ - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 402, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 402, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_v_q2 = __pyx_t_9; - - /* "fontTools/cu2qu/cu2qu.py":403 - * ) - * q2 = cubic[0] - * d1 = 0j # <<<<<<<<<<<<<< - * spline = [cubic[0], next_q1] - * for i in range(1, n + 1): - */ - __pyx_v_d1 = __pyx_t_double_complex_from_parts(0, 0.0); - - /* "fontTools/cu2qu/cu2qu.py":404 - * q2 = cubic[0] - * d1 = 0j - * spline = [cubic[0], next_q1] # <<<<<<<<<<<<<< - * for i in range(1, n + 1): - * # Current cubic to convert - */ - __pyx_t_8 = __Pyx_GetItemInt(__pyx_v_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 404, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_8); - __pyx_t_2 = __pyx_PyComplex_FromComplex(__pyx_v_next_q1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 404, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_10 = PyList_New(2); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 404, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_10); - __Pyx_GIVEREF(__pyx_t_8); - PyList_SET_ITEM(__pyx_t_10, 0, __pyx_t_8); - 
__Pyx_GIVEREF(__pyx_t_2); - PyList_SET_ITEM(__pyx_t_10, 1, __pyx_t_2); - __pyx_t_8 = 0; - __pyx_t_2 = 0; - __pyx_v_spline = ((PyObject*)__pyx_t_10); - __pyx_t_10 = 0; - - /* "fontTools/cu2qu/cu2qu.py":405 - * d1 = 0j - * spline = [cubic[0], next_q1] - * for i in range(1, n + 1): # <<<<<<<<<<<<<< - * # Current cubic to convert - * c0, c1, c2, c3 = next_cubic - */ - __pyx_t_11 = (__pyx_v_n + 1); - __pyx_t_12 = __pyx_t_11; - for (__pyx_t_13 = 1; __pyx_t_13 < __pyx_t_12; __pyx_t_13+=1) { - __pyx_v_i = __pyx_t_13; - - /* "fontTools/cu2qu/cu2qu.py":407 - * for i in range(1, n + 1): - * # Current cubic to convert - * c0, c1, c2, c3 = next_cubic # <<<<<<<<<<<<<< - * - * # Current quadratic approximation of current cubic - */ - if ((likely(PyTuple_CheckExact(__pyx_v_next_cubic))) || (PyList_CheckExact(__pyx_v_next_cubic))) { - PyObject* sequence = __pyx_v_next_cubic; - Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); - if (unlikely(size != 4)) { - if (size > 4) __Pyx_RaiseTooManyValuesError(4); - else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); - __PYX_ERR(0, 407, __pyx_L1_error) - } - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - if (likely(PyTuple_CheckExact(sequence))) { - __pyx_t_10 = PyTuple_GET_ITEM(sequence, 0); - __pyx_t_2 = PyTuple_GET_ITEM(sequence, 1); - __pyx_t_8 = PyTuple_GET_ITEM(sequence, 2); - __pyx_t_14 = PyTuple_GET_ITEM(sequence, 3); - } else { - __pyx_t_10 = PyList_GET_ITEM(sequence, 0); - __pyx_t_2 = PyList_GET_ITEM(sequence, 1); - __pyx_t_8 = PyList_GET_ITEM(sequence, 2); - __pyx_t_14 = PyList_GET_ITEM(sequence, 3); - } - __Pyx_INCREF(__pyx_t_10); - __Pyx_INCREF(__pyx_t_2); - __Pyx_INCREF(__pyx_t_8); - __Pyx_INCREF(__pyx_t_14); - #else - { - Py_ssize_t i; - PyObject** temps[4] = {&__pyx_t_10,&__pyx_t_2,&__pyx_t_8,&__pyx_t_14}; - for (i=0; i < 4; i++) { - PyObject* item = PySequence_ITEM(sequence, i); if (unlikely(!item)) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_GOTREF(item); - *(temps[i]) = item; - } - } - #endif - } else { 
- Py_ssize_t index = -1; - PyObject** temps[4] = {&__pyx_t_10,&__pyx_t_2,&__pyx_t_8,&__pyx_t_14}; - __pyx_t_15 = PyObject_GetIter(__pyx_v_next_cubic); if (unlikely(!__pyx_t_15)) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_15); - __pyx_t_16 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_15); - for (index=0; index < 4; index++) { - PyObject* item = __pyx_t_16(__pyx_t_15); if (unlikely(!item)) goto __pyx_L9_unpacking_failed; - __Pyx_GOTREF(item); - *(temps[index]) = item; - } - if (__Pyx_IternextUnpackEndCheck(__pyx_t_16(__pyx_t_15), 4) < 0) __PYX_ERR(0, 407, __pyx_L1_error) - __pyx_t_16 = NULL; - __Pyx_DECREF(__pyx_t_15); __pyx_t_15 = 0; - goto __pyx_L10_unpacking_done; - __pyx_L9_unpacking_failed:; - __Pyx_DECREF(__pyx_t_15); __pyx_t_15 = 0; - __pyx_t_16 = NULL; - if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); - __PYX_ERR(0, 407, __pyx_L1_error) - __pyx_L10_unpacking_done:; - } - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_10); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0; - __pyx_t_4 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_2); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_8); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_14); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 407, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - __pyx_v_c0 = __pyx_t_9; - __pyx_v_c1 = __pyx_t_4; - __pyx_v_c2 = __pyx_t_5; - __pyx_v_c3 = __pyx_t_6; - - /* "fontTools/cu2qu/cu2qu.py":410 - * - * # Current quadratic approximation of current cubic - * q0 = q2 # <<<<<<<<<<<<<< - * q1 = next_q1 - * if i < n: - */ - __pyx_v_q0 = __pyx_v_q2; - - /* "fontTools/cu2qu/cu2qu.py":411 - * # Current quadratic approximation of 
current cubic - * q0 = q2 - * q1 = next_q1 # <<<<<<<<<<<<<< - * if i < n: - * next_cubic = next(cubics) - */ - __pyx_v_q1 = __pyx_v_next_q1; - - /* "fontTools/cu2qu/cu2qu.py":412 - * q0 = q2 - * q1 = next_q1 - * if i < n: # <<<<<<<<<<<<<< - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( - */ - __pyx_t_1 = (__pyx_v_i < __pyx_v_n); - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":413 - * q1 = next_q1 - * if i < n: - * next_cubic = next(cubics) # <<<<<<<<<<<<<< - * next_q1 = cubic_approx_control( - * i / (n - 1), next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - */ - __pyx_t_14 = __Pyx_PyIter_Next(__pyx_v_cubics); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 413, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __Pyx_DECREF_SET(__pyx_v_next_cubic, __pyx_t_14); - __pyx_t_14 = 0; - - /* "fontTools/cu2qu/cu2qu.py":415 - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( - * i / (n - 1), next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] # <<<<<<<<<<<<<< - * ) - * spline.append(next_q1) - */ - __pyx_t_17 = (__pyx_v_n - 1); - if (unlikely(__pyx_t_17 == 0)) { - PyErr_SetString(PyExc_ZeroDivisionError, "float division"); - __PYX_ERR(0, 415, __pyx_L1_error) - } - __pyx_t_14 = __Pyx_GetItemInt(__pyx_v_next_cubic, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_6 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_14); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - __pyx_t_14 = __Pyx_GetItemInt(__pyx_v_next_cubic, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_5 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_14); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - __pyx_t_14 = __Pyx_GetItemInt(__pyx_v_next_cubic, 2, 
long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_4 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_14); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - __pyx_t_14 = __Pyx_GetItemInt(__pyx_v_next_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_9 = __Pyx_PyComplex_As___pyx_t_double_complex(__pyx_t_14); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 415, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - - /* "fontTools/cu2qu/cu2qu.py":414 - * if i < n: - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( # <<<<<<<<<<<<<< - * i / (n - 1), next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - * ) - */ - __pyx_t_7 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_control((((double)__pyx_v_i) / ((double)__pyx_t_17)), __pyx_t_6, __pyx_t_5, __pyx_t_4, __pyx_t_9); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 414, __pyx_L1_error) - __pyx_v_next_q1 = __pyx_t_7; - - /* "fontTools/cu2qu/cu2qu.py":417 - * i / (n - 1), next_cubic[0], next_cubic[1], next_cubic[2], next_cubic[3] - * ) - * spline.append(next_q1) # <<<<<<<<<<<<<< - * q2 = (q1 + next_q1) * 0.5 - * else: - */ - __pyx_t_14 = __pyx_PyComplex_FromComplex(__pyx_v_next_q1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 417, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_18 = __Pyx_PyList_Append(__pyx_v_spline, __pyx_t_14); if (unlikely(__pyx_t_18 == ((int)-1))) __PYX_ERR(0, 417, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - - /* "fontTools/cu2qu/cu2qu.py":418 - * ) - * spline.append(next_q1) - * q2 = (q1 + next_q1) * 0.5 # <<<<<<<<<<<<<< - * else: - * q2 = c3 - */ - __pyx_v_q2 = __Pyx_c_prod_double(__Pyx_c_sum_double(__pyx_v_q1, __pyx_v_next_q1), __pyx_t_double_complex_from_parts(0.5, 0)); - - /* 
"fontTools/cu2qu/cu2qu.py":412 - * q0 = q2 - * q1 = next_q1 - * if i < n: # <<<<<<<<<<<<<< - * next_cubic = next(cubics) - * next_q1 = cubic_approx_control( - */ - goto __pyx_L11; - } - - /* "fontTools/cu2qu/cu2qu.py":420 - * q2 = (q1 + next_q1) * 0.5 - * else: - * q2 = c3 # <<<<<<<<<<<<<< - * - * # End-point deltas - */ - /*else*/ { - __pyx_v_q2 = __pyx_v_c3; - } - __pyx_L11:; - - /* "fontTools/cu2qu/cu2qu.py":423 - * - * # End-point deltas - * d0 = d1 # <<<<<<<<<<<<<< - * d1 = q2 - c3 - * - */ - __pyx_v_d0 = __pyx_v_d1; - - /* "fontTools/cu2qu/cu2qu.py":424 - * # End-point deltas - * d0 = d1 - * d1 = q2 - c3 # <<<<<<<<<<<<<< - * - * if abs(d1) > tolerance or not cubic_farthest_fit_inside( - */ - __pyx_v_d1 = __Pyx_c_diff_double(__pyx_v_q2, __pyx_v_c3); - - /* "fontTools/cu2qu/cu2qu.py":426 - * d1 = q2 - c3 - * - * if abs(d1) > tolerance or not cubic_farthest_fit_inside( # <<<<<<<<<<<<<< - * d0, - * q0 + (q1 - q0) * (2 / 3) - c1, - */ - __pyx_t_3 = (__Pyx_c_abs_double(__pyx_v_d1) > __pyx_v_tolerance); - if (!__pyx_t_3) { - } else { - __pyx_t_1 = __pyx_t_3; - goto __pyx_L13_bool_binop_done; - } - - /* "fontTools/cu2qu/cu2qu.py":431 - * q2 + (q1 - q2) * (2 / 3) - c2, - * d1, - * tolerance, # <<<<<<<<<<<<<< - * ): - * return None - */ - __pyx_t_19 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_farthest_fit_inside(__pyx_v_d0, __Pyx_c_diff_double(__Pyx_c_sum_double(__pyx_v_q0, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_q1, __pyx_v_q0), __pyx_t_double_complex_from_parts((2.0 / 3.0), 0))), __pyx_v_c1), __Pyx_c_diff_double(__Pyx_c_sum_double(__pyx_v_q2, __Pyx_c_prod_double(__Pyx_c_diff_double(__pyx_v_q1, __pyx_v_q2), __pyx_t_double_complex_from_parts((2.0 / 3.0), 0))), __pyx_v_c2), __pyx_v_d1, __pyx_v_tolerance); if (unlikely(__pyx_t_19 == ((int)-1) && PyErr_Occurred())) __PYX_ERR(0, 426, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":426 - * d1 = q2 - c3 - * - * if abs(d1) > tolerance or not cubic_farthest_fit_inside( # <<<<<<<<<<<<<< - * d0, - * q0 + (q1 - q0) * (2 / 
3) - c1, - */ - __pyx_t_3 = (!(__pyx_t_19 != 0)); - __pyx_t_1 = __pyx_t_3; - __pyx_L13_bool_binop_done:; - if (__pyx_t_1) { - - /* "fontTools/cu2qu/cu2qu.py":433 - * tolerance, - * ): - * return None # <<<<<<<<<<<<<< - * spline.append(cubic[3]) - * - */ - __Pyx_XDECREF(__pyx_r); - __pyx_r = Py_None; __Pyx_INCREF(Py_None); - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":426 - * d1 = q2 - c3 - * - * if abs(d1) > tolerance or not cubic_farthest_fit_inside( # <<<<<<<<<<<<<< - * d0, - * q0 + (q1 - q0) * (2 / 3) - c1, - */ - } - } - - /* "fontTools/cu2qu/cu2qu.py":434 - * ): - * return None - * spline.append(cubic[3]) # <<<<<<<<<<<<<< - * - * return spline - */ - __pyx_t_14 = __Pyx_GetItemInt(__pyx_v_cubic, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 434, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_14); - __pyx_t_18 = __Pyx_PyList_Append(__pyx_v_spline, __pyx_t_14); if (unlikely(__pyx_t_18 == ((int)-1))) __PYX_ERR(0, 434, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; - - /* "fontTools/cu2qu/cu2qu.py":436 - * spline.append(cubic[3]) - * - * return spline # <<<<<<<<<<<<<< - * - * - */ - __Pyx_XDECREF(__pyx_r); - __Pyx_INCREF(__pyx_v_spline); - __pyx_r = __pyx_v_spline; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":361 - * - * - * @cython.cfunc # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int, tolerance=cython.double) - * @cython.locals(i=cython.int) - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_8); - __Pyx_XDECREF(__pyx_t_10); - __Pyx_XDECREF(__pyx_t_14); - __Pyx_XDECREF(__pyx_t_15); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.cubic_approx_spline", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = 0; - __pyx_L0:; - __Pyx_XDECREF(__pyx_v_cubics); - __Pyx_XDECREF(__pyx_v_next_cubic); - __Pyx_XDECREF(__pyx_v_spline); - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":439 - * - * - * 
@cython.locals(max_err=cython.double) # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - */ - -/* Python wrapper */ -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_4curve_to_quadratic(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -); /*proto*/ -PyDoc_STRVAR(__pyx_doc_9fontTools_5cu2qu_5cu2qu_3curve_to_quadratic, "curve_to_quadratic(curve, double max_err, int all_quadratic=True)\nApproximate a cubic Bezier curve with a spline of n quadratics.\n\n Args:\n cubic (sequence): Four 2D tuples representing control points of\n the cubic Bezier curve.\n max_err (double): Permitted deviation from the original curve.\n all_quadratic (bool): If True (default) returned value is a\n quadratic spline. If False, it's either a single quadratic\n curve or a single cubic curve.\n\n Returns:\n If all_quadratic is True: A list of 2D tuples, representing\n control points of the quadratic spline if it fits within the\n given tolerance, or ``None`` if no suitable spline could be\n calculated.\n\n If all_quadratic is False: Either a quadratic curve (if length\n of output is 3), or a cubic curve (if length of output is 4).\n "); -static PyMethodDef __pyx_mdef_9fontTools_5cu2qu_5cu2qu_4curve_to_quadratic = {"curve_to_quadratic", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_5cu2qu_5cu2qu_4curve_to_quadratic, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_5cu2qu_5cu2qu_3curve_to_quadratic}; -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_4curve_to_quadratic(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -) { - PyObject *__pyx_v_curve = 0; - double __pyx_v_max_err; - int __pyx_v_all_quadratic; - #if !CYTHON_METH_FASTCALL - 
CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); - #endif - CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - PyObject *__pyx_r = 0; - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("curve_to_quadratic (wrapper)", 0); - { - PyObject **__pyx_pyargnames[] = {&__pyx_n_s_curve,&__pyx_n_s_max_err,&__pyx_n_s_all_quadratic,0}; - PyObject* values[3] = {0,0,0}; - if (__pyx_kwds) { - Py_ssize_t kw_args; - switch (__pyx_nargs) { - case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - CYTHON_FALLTHROUGH; - case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - CYTHON_FALLTHROUGH; - case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); - CYTHON_FALLTHROUGH; - case 0: break; - default: goto __pyx_L5_argtuple_error; - } - kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); - switch (__pyx_nargs) { - case 0: - if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_curve)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 439, __pyx_L3_error) - else goto __pyx_L5_argtuple_error; - CYTHON_FALLTHROUGH; - case 1: - if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_max_err)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 439, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("curve_to_quadratic", 0, 2, 3, 1); __PYX_ERR(0, 439, __pyx_L3_error) - } - CYTHON_FALLTHROUGH; - case 2: - if (kw_args > 0) { - PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_all_quadratic); - if (value) { values[2] = value; kw_args--; } - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 439, __pyx_L3_error) - } - } - if (unlikely(kw_args > 0)) { - const Py_ssize_t kwd_pos_args = __pyx_nargs; - if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, 
kwd_pos_args, "curve_to_quadratic") < 0)) __PYX_ERR(0, 439, __pyx_L3_error) - } - } else { - switch (__pyx_nargs) { - case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - CYTHON_FALLTHROUGH; - case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); - break; - default: goto __pyx_L5_argtuple_error; - } - } - __pyx_v_curve = values[0]; - __pyx_v_max_err = __pyx_PyFloat_AsDouble(values[1]); if (unlikely((__pyx_v_max_err == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 442, __pyx_L3_error) - if (values[2]) { - __pyx_v_all_quadratic = __Pyx_PyInt_As_int(values[2]); if (unlikely((__pyx_v_all_quadratic == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 442, __pyx_L3_error) - } else { - - /* "fontTools/cu2qu/cu2qu.py":442 - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - * def curve_to_quadratic(curve, max_err, all_quadratic=True): # <<<<<<<<<<<<<< - * """Approximate a cubic Bezier curve with a spline of n quadratics. 
- * - */ - __pyx_v_all_quadratic = ((int)((int)1)); - } - } - goto __pyx_L4_argument_unpacking_done; - __pyx_L5_argtuple_error:; - __Pyx_RaiseArgtupleInvalid("curve_to_quadratic", 0, 2, 3, __pyx_nargs); __PYX_ERR(0, 439, __pyx_L3_error) - __pyx_L3_error:; - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.curve_to_quadratic", __pyx_clineno, __pyx_lineno, __pyx_filename); - __Pyx_RefNannyFinishContext(); - return NULL; - __pyx_L4_argument_unpacking_done:; - __pyx_r = __pyx_pf_9fontTools_5cu2qu_5cu2qu_3curve_to_quadratic(__pyx_self, __pyx_v_curve, __pyx_v_max_err, __pyx_v_all_quadratic); - - /* "fontTools/cu2qu/cu2qu.py":439 - * - * - * @cython.locals(max_err=cython.double) # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - */ - - /* function exit code */ - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu_3curve_to_quadratic(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_curve, double __pyx_v_max_err, int __pyx_v_all_quadratic) { - int __pyx_v_n; - PyObject *__pyx_v_spline = NULL; - PyObject *__pyx_7genexpr__pyx_v_p = NULL; - PyObject *__pyx_8genexpr1__pyx_v_s = NULL; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - Py_ssize_t __pyx_t_3; - PyObject *(*__pyx_t_4)(PyObject *); - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - long __pyx_t_7; - long __pyx_t_8; - int __pyx_t_9; - int __pyx_t_10; - PyObject *__pyx_t_11 = NULL; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("curve_to_quadratic", 0); - __Pyx_INCREF(__pyx_v_curve); - - /* "fontTools/cu2qu/cu2qu.py":463 - * """ - * - * curve = [complex(*p) for p in curve] # <<<<<<<<<<<<<< - * - * for n in range(1, MAX_N + 1): - */ - { /* enter inner scope */ - __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_1); - if 
(likely(PyList_CheckExact(__pyx_v_curve)) || PyTuple_CheckExact(__pyx_v_curve)) { - __pyx_t_2 = __pyx_v_curve; __Pyx_INCREF(__pyx_t_2); __pyx_t_3 = 0; - __pyx_t_4 = NULL; - } else { - __pyx_t_3 = -1; __pyx_t_2 = PyObject_GetIter(__pyx_v_curve); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_4 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 463, __pyx_L5_error) - } - for (;;) { - if (likely(!__pyx_t_4)) { - if (likely(PyList_CheckExact(__pyx_t_2))) { - if (__pyx_t_3 >= PyList_GET_SIZE(__pyx_t_2)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyList_GET_ITEM(__pyx_t_2, __pyx_t_3); __Pyx_INCREF(__pyx_t_5); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 463, __pyx_L5_error) - #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_2, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_5); - #endif - } else { - if (__pyx_t_3 >= PyTuple_GET_SIZE(__pyx_t_2)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyTuple_GET_ITEM(__pyx_t_2, __pyx_t_3); __Pyx_INCREF(__pyx_t_5); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 463, __pyx_L5_error) - #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_2, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_5); - #endif - } - } else { - __pyx_t_5 = __pyx_t_4(__pyx_t_2); - if (unlikely(!__pyx_t_5)) { - PyObject* exc_type = PyErr_Occurred(); - if (exc_type) { - if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear(); - else __PYX_ERR(0, 463, __pyx_L5_error) - } - break; - } - __Pyx_GOTREF(__pyx_t_5); - } - __Pyx_XDECREF_SET(__pyx_7genexpr__pyx_v_p, __pyx_t_5); - __pyx_t_5 = 0; - __pyx_t_5 = __Pyx_PySequence_Tuple(__pyx_7genexpr__pyx_v_p); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_6 = 
__Pyx_PyObject_Call(((PyObject *)(&PyComplex_Type)), __pyx_t_5, NULL); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - if (unlikely(__Pyx_ListComp_Append(__pyx_t_1, (PyObject*)__pyx_t_6))) __PYX_ERR(0, 463, __pyx_L5_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - } - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_XDECREF(__pyx_7genexpr__pyx_v_p); __pyx_7genexpr__pyx_v_p = 0; - goto __pyx_L9_exit_scope; - __pyx_L5_error:; - __Pyx_XDECREF(__pyx_7genexpr__pyx_v_p); __pyx_7genexpr__pyx_v_p = 0; - goto __pyx_L1_error; - __pyx_L9_exit_scope:; - } /* exit inner scope */ - __Pyx_DECREF_SET(__pyx_v_curve, __pyx_t_1); - __pyx_t_1 = 0; - - /* "fontTools/cu2qu/cu2qu.py":465 - * curve = [complex(*p) for p in curve] - * - * for n in range(1, MAX_N + 1): # <<<<<<<<<<<<<< - * spline = cubic_approx_spline(curve, n, max_err, all_quadratic) - * if spline is not None: - */ - __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_MAX_N); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 465, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __Pyx_PyInt_AddObjC(__pyx_t_1, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 465, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __pyx_t_7 = __Pyx_PyInt_As_long(__pyx_t_2); if (unlikely((__pyx_t_7 == (long)-1) && PyErr_Occurred())) __PYX_ERR(0, 465, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_8 = __pyx_t_7; - for (__pyx_t_9 = 1; __pyx_t_9 < __pyx_t_8; __pyx_t_9+=1) { - __pyx_v_n = __pyx_t_9; - - /* "fontTools/cu2qu/cu2qu.py":466 - * - * for n in range(1, MAX_N + 1): - * spline = cubic_approx_spline(curve, n, max_err, all_quadratic) # <<<<<<<<<<<<<< - * if spline is not None: - * # done. 
go home - */ - __pyx_t_2 = __pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_spline(__pyx_v_curve, __pyx_v_n, __pyx_v_max_err, __pyx_v_all_quadratic); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 466, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_XDECREF_SET(__pyx_v_spline, __pyx_t_2); - __pyx_t_2 = 0; - - /* "fontTools/cu2qu/cu2qu.py":467 - * for n in range(1, MAX_N + 1): - * spline = cubic_approx_spline(curve, n, max_err, all_quadratic) - * if spline is not None: # <<<<<<<<<<<<<< - * # done. go home - * return [(s.real, s.imag) for s in spline] - */ - __pyx_t_10 = (__pyx_v_spline != Py_None); - if (__pyx_t_10) { - - /* "fontTools/cu2qu/cu2qu.py":469 - * if spline is not None: - * # done. go home - * return [(s.real, s.imag) for s in spline] # <<<<<<<<<<<<<< - * - * raise ApproxNotFoundError(curve) - */ - __Pyx_XDECREF(__pyx_r); - { /* enter inner scope */ - __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_2); - if (likely(PyList_CheckExact(__pyx_v_spline)) || PyTuple_CheckExact(__pyx_v_spline)) { - __pyx_t_1 = __pyx_v_spline; __Pyx_INCREF(__pyx_t_1); __pyx_t_3 = 0; - __pyx_t_4 = NULL; - } else { - __pyx_t_3 = -1; __pyx_t_1 = PyObject_GetIter(__pyx_v_spline); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_4 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 469, __pyx_L15_error) - } - for (;;) { - if (likely(!__pyx_t_4)) { - if (likely(PyList_CheckExact(__pyx_t_1))) { - if (__pyx_t_3 >= PyList_GET_SIZE(__pyx_t_1)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_6 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_3); __Pyx_INCREF(__pyx_t_6); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 469, __pyx_L15_error) - #else - __pyx_t_6 = PySequence_ITEM(__pyx_t_1, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_6); - #endif - } else { - if (__pyx_t_3 >= 
PyTuple_GET_SIZE(__pyx_t_1)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_6 = PyTuple_GET_ITEM(__pyx_t_1, __pyx_t_3); __Pyx_INCREF(__pyx_t_6); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 469, __pyx_L15_error) - #else - __pyx_t_6 = PySequence_ITEM(__pyx_t_1, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_6); - #endif - } - } else { - __pyx_t_6 = __pyx_t_4(__pyx_t_1); - if (unlikely(!__pyx_t_6)) { - PyObject* exc_type = PyErr_Occurred(); - if (exc_type) { - if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear(); - else __PYX_ERR(0, 469, __pyx_L15_error) - } - break; - } - __Pyx_GOTREF(__pyx_t_6); - } - __Pyx_XDECREF_SET(__pyx_8genexpr1__pyx_v_s, __pyx_t_6); - __pyx_t_6 = 0; - __pyx_t_6 = __Pyx_PyObject_GetAttrStr(__pyx_8genexpr1__pyx_v_s, __pyx_n_s_real); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_5 = __Pyx_PyObject_GetAttrStr(__pyx_8genexpr1__pyx_v_s, __pyx_n_s_imag); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_5); - __pyx_t_11 = PyTuple_New(2); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_GOTREF(__pyx_t_11); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_11, 0, __pyx_t_6); - __Pyx_GIVEREF(__pyx_t_5); - PyTuple_SET_ITEM(__pyx_t_11, 1, __pyx_t_5); - __pyx_t_6 = 0; - __pyx_t_5 = 0; - if (unlikely(__Pyx_ListComp_Append(__pyx_t_2, (PyObject*)__pyx_t_11))) __PYX_ERR(0, 469, __pyx_L15_error) - __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; - } - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __Pyx_XDECREF(__pyx_8genexpr1__pyx_v_s); __pyx_8genexpr1__pyx_v_s = 0; - goto __pyx_L19_exit_scope; - __pyx_L15_error:; - __Pyx_XDECREF(__pyx_8genexpr1__pyx_v_s); __pyx_8genexpr1__pyx_v_s = 0; - goto __pyx_L1_error; - __pyx_L19_exit_scope:; - } /* exit inner scope */ - __pyx_r = __pyx_t_2; - __pyx_t_2 = 0; - goto __pyx_L0; - - /* 
"fontTools/cu2qu/cu2qu.py":467 - * for n in range(1, MAX_N + 1): - * spline = cubic_approx_spline(curve, n, max_err, all_quadratic) - * if spline is not None: # <<<<<<<<<<<<<< - * # done. go home - * return [(s.real, s.imag) for s in spline] - */ - } - } - - /* "fontTools/cu2qu/cu2qu.py":471 - * return [(s.real, s.imag) for s in spline] - * - * raise ApproxNotFoundError(curve) # <<<<<<<<<<<<<< - * - * - */ - __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_ApproxNotFoundError); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 471, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_11 = NULL; - __pyx_t_9 = 0; - if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_1))) { - __pyx_t_11 = PyMethod_GET_SELF(__pyx_t_1); - if (likely(__pyx_t_11)) { - PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_1); - __Pyx_INCREF(__pyx_t_11); - __Pyx_INCREF(function); - __Pyx_DECREF_SET(__pyx_t_1, function); - __pyx_t_9 = 1; - } - } - { - PyObject *__pyx_callargs[2] = {__pyx_t_11, __pyx_v_curve}; - __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_1, __pyx_callargs+1-__pyx_t_9, 1+__pyx_t_9); - __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0; - if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 471, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - } - __Pyx_Raise(__pyx_t_2, 0, 0, 0); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __PYX_ERR(0, 471, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":439 - * - * - * @cython.locals(max_err=cython.double) # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_11); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.curve_to_quadratic", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = NULL; - __pyx_L0:; - __Pyx_XDECREF(__pyx_v_spline); - __Pyx_XDECREF(__pyx_7genexpr__pyx_v_p); - 
__Pyx_XDECREF(__pyx_8genexpr1__pyx_v_s); - __Pyx_XDECREF(__pyx_v_curve); - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -/* "fontTools/cu2qu/cu2qu.py":474 - * - * - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) # <<<<<<<<<<<<<< - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, all_quadratic=True): - */ - -/* Python wrapper */ -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_6curves_to_quadratic(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -); /*proto*/ -PyDoc_STRVAR(__pyx_doc_9fontTools_5cu2qu_5cu2qu_5curves_to_quadratic, "curves_to_quadratic(curves, max_errors, int all_quadratic=True)\nReturn quadratic Bezier splines approximating the input cubic Beziers.\n\n Args:\n curves: A sequence of *n* curves, each curve being a sequence of four\n 2D tuples.\n max_errors: A sequence of *n* floats representing the maximum permissible\n deviation from each of the cubic Bezier curves.\n all_quadratic (bool): If True (default) returned values are a\n quadratic spline. If False, they are either a single quadratic\n curve or a single cubic curve.\n\n Example::\n\n >>> curves_to_quadratic( [\n ... [ (50,50), (100,100), (150,100), (200,50) ],\n ... [ (75,50), (120,100), (150,75), (200,60) ]\n ... ], [1,1] )\n [[(50.0, 50.0), (75.0, 75.0), (125.0, 91.66666666666666), (175.0, 75.0), (200.0, 50.0)], [(75.0, 50.0), (97.5, 75.0), (135.41666666666666, 82.08333333333333), (175.0, 67.5), (200.0, 60.0)]]\n\n The returned splines have \"implied oncurve points\" suitable for use in\n TrueType ``glif`` outlines - i.e. 
in the first spline returned above,\n the first quadratic segment runs from (50,50) to\n ( (75 + 125)/2 , (120 + 91.666..)/2 ) = (100, 83.333...).\n\n Returns:\n If all_quadratic is True, a list of splines, each spline being a list\n of 2D tuples.\n\n If all_quadratic is False, a list of curves, each curve being a quadratic\n (length 3), or cubic (length 4).\n\n Raises:\n fontTools.cu2qu.Errors.ApproxNotFoundError: if no suitable approximation\n can be found for all curves with the given parameters.\n "); -static PyMethodDef __pyx_mdef_9fontTools_5cu2qu_5cu2qu_6curves_to_quadratic = {"curves_to_quadratic", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_5cu2qu_5cu2qu_6curves_to_quadratic, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_5cu2qu_5cu2qu_5curves_to_quadratic}; -static PyObject *__pyx_pw_9fontTools_5cu2qu_5cu2qu_6curves_to_quadratic(PyObject *__pyx_self, -#if CYTHON_METH_FASTCALL -PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds -#else -PyObject *__pyx_args, PyObject *__pyx_kwds -#endif -) { - PyObject *__pyx_v_curves = 0; - PyObject *__pyx_v_max_errors = 0; - int __pyx_v_all_quadratic; - #if !CYTHON_METH_FASTCALL - CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); - #endif - CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - PyObject *__pyx_r = 0; - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("curves_to_quadratic (wrapper)", 0); - { - PyObject **__pyx_pyargnames[] = {&__pyx_n_s_curves,&__pyx_n_s_max_errors,&__pyx_n_s_all_quadratic,0}; - PyObject* values[3] = {0,0,0}; - if (__pyx_kwds) { - Py_ssize_t kw_args; - switch (__pyx_nargs) { - case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - CYTHON_FALLTHROUGH; - case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - CYTHON_FALLTHROUGH; - case 1: values[0] = 
__Pyx_Arg_FASTCALL(__pyx_args, 0); - CYTHON_FALLTHROUGH; - case 0: break; - default: goto __pyx_L5_argtuple_error; - } - kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); - switch (__pyx_nargs) { - case 0: - if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_curves)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 474, __pyx_L3_error) - else goto __pyx_L5_argtuple_error; - CYTHON_FALLTHROUGH; - case 1: - if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_max_errors)) != 0)) kw_args--; - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 474, __pyx_L3_error) - else { - __Pyx_RaiseArgtupleInvalid("curves_to_quadratic", 0, 2, 3, 1); __PYX_ERR(0, 474, __pyx_L3_error) - } - CYTHON_FALLTHROUGH; - case 2: - if (kw_args > 0) { - PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_all_quadratic); - if (value) { values[2] = value; kw_args--; } - else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 474, __pyx_L3_error) - } - } - if (unlikely(kw_args > 0)) { - const Py_ssize_t kwd_pos_args = __pyx_nargs; - if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "curves_to_quadratic") < 0)) __PYX_ERR(0, 474, __pyx_L3_error) - } - } else { - switch (__pyx_nargs) { - case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); - CYTHON_FALLTHROUGH; - case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); - values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); - break; - default: goto __pyx_L5_argtuple_error; - } - } - __pyx_v_curves = values[0]; - __pyx_v_max_errors = values[1]; - if (values[2]) { - __pyx_v_all_quadratic = __Pyx_PyInt_As_int(values[2]); if (unlikely((__pyx_v_all_quadratic == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 476, __pyx_L3_error) - } else { - - /* "fontTools/cu2qu/cu2qu.py":476 - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) - * @cython.locals(all_quadratic=cython.int) - * 
def curves_to_quadratic(curves, max_errors, all_quadratic=True): # <<<<<<<<<<<<<< - * """Return quadratic Bezier splines approximating the input cubic Beziers. - * - */ - __pyx_v_all_quadratic = ((int)((int)1)); - } - } - goto __pyx_L4_argument_unpacking_done; - __pyx_L5_argtuple_error:; - __Pyx_RaiseArgtupleInvalid("curves_to_quadratic", 0, 2, 3, __pyx_nargs); __PYX_ERR(0, 474, __pyx_L3_error) - __pyx_L3_error:; - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.curves_to_quadratic", __pyx_clineno, __pyx_lineno, __pyx_filename); - __Pyx_RefNannyFinishContext(); - return NULL; - __pyx_L4_argument_unpacking_done:; - __pyx_r = __pyx_pf_9fontTools_5cu2qu_5cu2qu_5curves_to_quadratic(__pyx_self, __pyx_v_curves, __pyx_v_max_errors, __pyx_v_all_quadratic); - - /* "fontTools/cu2qu/cu2qu.py":474 - * - * - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) # <<<<<<<<<<<<<< - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, all_quadratic=True): - */ - - /* function exit code */ - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -static PyObject *__pyx_pf_9fontTools_5cu2qu_5cu2qu_5curves_to_quadratic(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_curves, PyObject *__pyx_v_max_errors, int __pyx_v_all_quadratic) { - int __pyx_v_l; - int __pyx_v_last_i; - int __pyx_v_i; - PyObject *__pyx_v_splines = NULL; - PyObject *__pyx_v_n = NULL; - PyObject *__pyx_v_spline = NULL; - PyObject *__pyx_8genexpr2__pyx_v_curve = NULL; - PyObject *__pyx_8genexpr3__pyx_v_p = NULL; - PyObject *__pyx_8genexpr4__pyx_v_spline = NULL; - PyObject *__pyx_8genexpr5__pyx_v_s = NULL; - PyObject *__pyx_r = NULL; - __Pyx_RefNannyDeclarations - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - Py_ssize_t __pyx_t_3; - PyObject *(*__pyx_t_4)(PyObject *); - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - Py_ssize_t __pyx_t_7; - PyObject *(*__pyx_t_8)(PyObject *); - PyObject *__pyx_t_9 = NULL; - PyObject *__pyx_t_10 = NULL; - int 
__pyx_t_11; - int __pyx_t_12; - double __pyx_t_13; - long __pyx_t_14; - PyObject *__pyx_t_15 = NULL; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("curves_to_quadratic", 0); - __Pyx_INCREF(__pyx_v_curves); - - /* "fontTools/cu2qu/cu2qu.py":513 - * """ - * - * curves = [[complex(*p) for p in curve] for curve in curves] # <<<<<<<<<<<<<< - * assert len(max_errors) == len(curves) - * - */ - { /* enter inner scope */ - __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 513, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_1); - if (likely(PyList_CheckExact(__pyx_v_curves)) || PyTuple_CheckExact(__pyx_v_curves)) { - __pyx_t_2 = __pyx_v_curves; __Pyx_INCREF(__pyx_t_2); __pyx_t_3 = 0; - __pyx_t_4 = NULL; - } else { - __pyx_t_3 = -1; __pyx_t_2 = PyObject_GetIter(__pyx_v_curves); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 513, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_4 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 513, __pyx_L5_error) - } - for (;;) { - if (likely(!__pyx_t_4)) { - if (likely(PyList_CheckExact(__pyx_t_2))) { - if (__pyx_t_3 >= PyList_GET_SIZE(__pyx_t_2)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyList_GET_ITEM(__pyx_t_2, __pyx_t_3); __Pyx_INCREF(__pyx_t_5); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 513, __pyx_L5_error) - #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_2, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 513, __pyx_L5_error) - __Pyx_GOTREF(__pyx_t_5); - #endif - } else { - if (__pyx_t_3 >= PyTuple_GET_SIZE(__pyx_t_2)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyTuple_GET_ITEM(__pyx_t_2, __pyx_t_3); __Pyx_INCREF(__pyx_t_5); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 513, __pyx_L5_error) - #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_2, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 513, __pyx_L5_error) - 
__Pyx_GOTREF(__pyx_t_5); - #endif - } - } else { - __pyx_t_5 = __pyx_t_4(__pyx_t_2); - if (unlikely(!__pyx_t_5)) { - PyObject* exc_type = PyErr_Occurred(); - if (exc_type) { - if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear(); - else __PYX_ERR(0, 513, __pyx_L5_error) - } - break; - } - __Pyx_GOTREF(__pyx_t_5); - } - __Pyx_XDECREF_SET(__pyx_8genexpr2__pyx_v_curve, __pyx_t_5); - __pyx_t_5 = 0; - { /* enter inner scope */ - __pyx_t_5 = PyList_New(0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_5); - if (likely(PyList_CheckExact(__pyx_8genexpr2__pyx_v_curve)) || PyTuple_CheckExact(__pyx_8genexpr2__pyx_v_curve)) { - __pyx_t_6 = __pyx_8genexpr2__pyx_v_curve; __Pyx_INCREF(__pyx_t_6); __pyx_t_7 = 0; - __pyx_t_8 = NULL; - } else { - __pyx_t_7 = -1; __pyx_t_6 = PyObject_GetIter(__pyx_8genexpr2__pyx_v_curve); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_6); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 513, __pyx_L10_error) - } - for (;;) { - if (likely(!__pyx_t_8)) { - if (likely(PyList_CheckExact(__pyx_t_6))) { - if (__pyx_t_7 >= PyList_GET_SIZE(__pyx_t_6)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_9 = PyList_GET_ITEM(__pyx_t_6, __pyx_t_7); __Pyx_INCREF(__pyx_t_9); __pyx_t_7++; if (unlikely((0 < 0))) __PYX_ERR(0, 513, __pyx_L10_error) - #else - __pyx_t_9 = PySequence_ITEM(__pyx_t_6, __pyx_t_7); __pyx_t_7++; if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_9); - #endif - } else { - if (__pyx_t_7 >= PyTuple_GET_SIZE(__pyx_t_6)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_9 = PyTuple_GET_ITEM(__pyx_t_6, __pyx_t_7); __Pyx_INCREF(__pyx_t_9); __pyx_t_7++; if (unlikely((0 < 0))) __PYX_ERR(0, 513, __pyx_L10_error) - #else - __pyx_t_9 = PySequence_ITEM(__pyx_t_6, __pyx_t_7); __pyx_t_7++; if 
(unlikely(!__pyx_t_9)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_9); - #endif - } - } else { - __pyx_t_9 = __pyx_t_8(__pyx_t_6); - if (unlikely(!__pyx_t_9)) { - PyObject* exc_type = PyErr_Occurred(); - if (exc_type) { - if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear(); - else __PYX_ERR(0, 513, __pyx_L10_error) - } - break; - } - __Pyx_GOTREF(__pyx_t_9); - } - __Pyx_XDECREF_SET(__pyx_8genexpr3__pyx_v_p, __pyx_t_9); - __pyx_t_9 = 0; - __pyx_t_9 = __Pyx_PySequence_Tuple(__pyx_8genexpr3__pyx_v_p); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_9); - __pyx_t_10 = __Pyx_PyObject_Call(((PyObject *)(&PyComplex_Type)), __pyx_t_9, NULL); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_GOTREF(__pyx_t_10); - __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; - if (unlikely(__Pyx_ListComp_Append(__pyx_t_5, (PyObject*)__pyx_t_10))) __PYX_ERR(0, 513, __pyx_L10_error) - __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0; - } - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - __Pyx_XDECREF(__pyx_8genexpr3__pyx_v_p); __pyx_8genexpr3__pyx_v_p = 0; - goto __pyx_L14_exit_scope; - __pyx_L10_error:; - __Pyx_XDECREF(__pyx_8genexpr3__pyx_v_p); __pyx_8genexpr3__pyx_v_p = 0; - goto __pyx_L5_error; - __pyx_L14_exit_scope:; - } /* exit inner scope */ - if (unlikely(__Pyx_ListComp_Append(__pyx_t_1, (PyObject*)__pyx_t_5))) __PYX_ERR(0, 513, __pyx_L5_error) - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - } - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_curve); __pyx_8genexpr2__pyx_v_curve = 0; - goto __pyx_L16_exit_scope; - __pyx_L5_error:; - __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_curve); __pyx_8genexpr2__pyx_v_curve = 0; - goto __pyx_L1_error; - __pyx_L16_exit_scope:; - } /* exit inner scope */ - __Pyx_DECREF_SET(__pyx_v_curves, __pyx_t_1); - __pyx_t_1 = 0; - - /* "fontTools/cu2qu/cu2qu.py":514 - * - * curves = [[complex(*p) for p in curve] for curve in curves] - * assert 
len(max_errors) == len(curves) # <<<<<<<<<<<<<< - * - * l = len(curves) - */ - #ifndef CYTHON_WITHOUT_ASSERTIONS - if (unlikely(__pyx_assertions_enabled())) { - __pyx_t_3 = PyObject_Length(__pyx_v_max_errors); if (unlikely(__pyx_t_3 == ((Py_ssize_t)-1))) __PYX_ERR(0, 514, __pyx_L1_error) - __pyx_t_7 = PyObject_Length(__pyx_v_curves); if (unlikely(__pyx_t_7 == ((Py_ssize_t)-1))) __PYX_ERR(0, 514, __pyx_L1_error) - __pyx_t_11 = (__pyx_t_3 == __pyx_t_7); - if (unlikely(!__pyx_t_11)) { - __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0); - __PYX_ERR(0, 514, __pyx_L1_error) - } - } - #else - if ((1)); else __PYX_ERR(0, 514, __pyx_L1_error) - #endif - - /* "fontTools/cu2qu/cu2qu.py":516 - * assert len(max_errors) == len(curves) - * - * l = len(curves) # <<<<<<<<<<<<<< - * splines = [None] * l - * last_i = i = 0 - */ - __pyx_t_7 = PyObject_Length(__pyx_v_curves); if (unlikely(__pyx_t_7 == ((Py_ssize_t)-1))) __PYX_ERR(0, 516, __pyx_L1_error) - __pyx_v_l = __pyx_t_7; - - /* "fontTools/cu2qu/cu2qu.py":517 - * - * l = len(curves) - * splines = [None] * l # <<<<<<<<<<<<<< - * last_i = i = 0 - * n = 1 - */ - __pyx_t_1 = PyList_New(1 * ((__pyx_v_l<0) ? 
0:__pyx_v_l)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 517, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - { Py_ssize_t __pyx_temp; - for (__pyx_temp=0; __pyx_temp < __pyx_v_l; __pyx_temp++) { - __Pyx_INCREF(Py_None); - __Pyx_GIVEREF(Py_None); - PyList_SET_ITEM(__pyx_t_1, __pyx_temp, Py_None); - } - } - __pyx_v_splines = ((PyObject*)__pyx_t_1); - __pyx_t_1 = 0; - - /* "fontTools/cu2qu/cu2qu.py":518 - * l = len(curves) - * splines = [None] * l - * last_i = i = 0 # <<<<<<<<<<<<<< - * n = 1 - * while True: - */ - __pyx_v_last_i = 0; - __pyx_v_i = 0; - - /* "fontTools/cu2qu/cu2qu.py":519 - * splines = [None] * l - * last_i = i = 0 - * n = 1 # <<<<<<<<<<<<<< - * while True: - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - */ - __Pyx_INCREF(__pyx_int_1); - __pyx_v_n = __pyx_int_1; - - /* "fontTools/cu2qu/cu2qu.py":520 - * last_i = i = 0 - * n = 1 - * while True: # <<<<<<<<<<<<<< - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - * if spline is None: - */ - while (1) { - - /* "fontTools/cu2qu/cu2qu.py":521 - * n = 1 - * while True: - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) # <<<<<<<<<<<<<< - * if spline is None: - * if n == MAX_N: - */ - __pyx_t_1 = __Pyx_GetItemInt(__pyx_v_curves, __pyx_v_i, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 521, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_12 = __Pyx_PyInt_As_int(__pyx_v_n); if (unlikely((__pyx_t_12 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 521, __pyx_L1_error) - __pyx_t_2 = __Pyx_GetItemInt(__pyx_v_max_errors, __pyx_v_i, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 521, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_13 = __pyx_PyFloat_AsDouble(__pyx_t_2); if (unlikely((__pyx_t_13 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 521, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_2 = 
__pyx_f_9fontTools_5cu2qu_5cu2qu_cubic_approx_spline(__pyx_t_1, __pyx_t_12, __pyx_t_13, __pyx_v_all_quadratic); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 521, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - __Pyx_XDECREF_SET(__pyx_v_spline, __pyx_t_2); - __pyx_t_2 = 0; - - /* "fontTools/cu2qu/cu2qu.py":522 - * while True: - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - * if spline is None: # <<<<<<<<<<<<<< - * if n == MAX_N: - * break - */ - __pyx_t_11 = (__pyx_v_spline == Py_None); - if (__pyx_t_11) { - - /* "fontTools/cu2qu/cu2qu.py":523 - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - * if spline is None: - * if n == MAX_N: # <<<<<<<<<<<<<< - * break - * n += 1 - */ - __Pyx_GetModuleGlobalName(__pyx_t_2, __pyx_n_s_MAX_N); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 523, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_1 = PyObject_RichCompare(__pyx_v_n, __pyx_t_2, Py_EQ); __Pyx_XGOTREF(__pyx_t_1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 523, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __pyx_t_11 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely((__pyx_t_11 < 0))) __PYX_ERR(0, 523, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; - if (__pyx_t_11) { - - /* "fontTools/cu2qu/cu2qu.py":524 - * if spline is None: - * if n == MAX_N: - * break # <<<<<<<<<<<<<< - * n += 1 - * last_i = i - */ - goto __pyx_L18_break; - - /* "fontTools/cu2qu/cu2qu.py":523 - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - * if spline is None: - * if n == MAX_N: # <<<<<<<<<<<<<< - * break - * n += 1 - */ - } - - /* "fontTools/cu2qu/cu2qu.py":525 - * if n == MAX_N: - * break - * n += 1 # <<<<<<<<<<<<<< - * last_i = i - * continue - */ - __pyx_t_1 = __Pyx_PyInt_AddObjC(__pyx_v_n, __pyx_int_1, 1, 1, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 525, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __Pyx_DECREF_SET(__pyx_v_n, __pyx_t_1); - __pyx_t_1 = 0; - 
- /* "fontTools/cu2qu/cu2qu.py":526 - * break - * n += 1 - * last_i = i # <<<<<<<<<<<<<< - * continue - * splines[i] = spline - */ - __pyx_v_last_i = __pyx_v_i; - - /* "fontTools/cu2qu/cu2qu.py":527 - * n += 1 - * last_i = i - * continue # <<<<<<<<<<<<<< - * splines[i] = spline - * i = (i + 1) % l - */ - goto __pyx_L17_continue; - - /* "fontTools/cu2qu/cu2qu.py":522 - * while True: - * spline = cubic_approx_spline(curves[i], n, max_errors[i], all_quadratic) - * if spline is None: # <<<<<<<<<<<<<< - * if n == MAX_N: - * break - */ - } - - /* "fontTools/cu2qu/cu2qu.py":528 - * last_i = i - * continue - * splines[i] = spline # <<<<<<<<<<<<<< - * i = (i + 1) % l - * if i == last_i: - */ - if (unlikely((__Pyx_SetItemInt(__pyx_v_splines, __pyx_v_i, __pyx_v_spline, int, 1, __Pyx_PyInt_From_int, 1, 1, 1) < 0))) __PYX_ERR(0, 528, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":529 - * continue - * splines[i] = spline - * i = (i + 1) % l # <<<<<<<<<<<<<< - * if i == last_i: - * # done. go home - */ - __pyx_t_14 = (__pyx_v_i + 1); - if (unlikely(__pyx_v_l == 0)) { - PyErr_SetString(PyExc_ZeroDivisionError, "integer division or modulo by zero"); - __PYX_ERR(0, 529, __pyx_L1_error) - } - __pyx_v_i = __Pyx_mod_long(__pyx_t_14, __pyx_v_l); - - /* "fontTools/cu2qu/cu2qu.py":530 - * splines[i] = spline - * i = (i + 1) % l - * if i == last_i: # <<<<<<<<<<<<<< - * # done. go home - * return [[(s.real, s.imag) for s in spline] for spline in splines] - */ - __pyx_t_11 = (__pyx_v_i == __pyx_v_last_i); - if (__pyx_t_11) { - - /* "fontTools/cu2qu/cu2qu.py":532 - * if i == last_i: - * # done. 
go home - * return [[(s.real, s.imag) for s in spline] for spline in splines] # <<<<<<<<<<<<<< - * - * raise ApproxNotFoundError(curves) - */ - __Pyx_XDECREF(__pyx_r); - { /* enter inner scope */ - __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 532, __pyx_L24_error) - __Pyx_GOTREF(__pyx_t_1); - __pyx_t_2 = __pyx_v_splines; __Pyx_INCREF(__pyx_t_2); __pyx_t_7 = 0; - for (;;) { - if (__pyx_t_7 >= PyList_GET_SIZE(__pyx_t_2)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyList_GET_ITEM(__pyx_t_2, __pyx_t_7); __Pyx_INCREF(__pyx_t_5); __pyx_t_7++; if (unlikely((0 < 0))) __PYX_ERR(0, 532, __pyx_L24_error) - #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_2, __pyx_t_7); __pyx_t_7++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 532, __pyx_L24_error) - __Pyx_GOTREF(__pyx_t_5); - #endif - __Pyx_XDECREF_SET(__pyx_8genexpr4__pyx_v_spline, __pyx_t_5); - __pyx_t_5 = 0; - { /* enter inner scope */ - __pyx_t_5 = PyList_New(0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_5); - if (likely(PyList_CheckExact(__pyx_8genexpr4__pyx_v_spline)) || PyTuple_CheckExact(__pyx_8genexpr4__pyx_v_spline)) { - __pyx_t_6 = __pyx_8genexpr4__pyx_v_spline; __Pyx_INCREF(__pyx_t_6); __pyx_t_3 = 0; - __pyx_t_4 = NULL; - } else { - __pyx_t_3 = -1; __pyx_t_6 = PyObject_GetIter(__pyx_8genexpr4__pyx_v_spline); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_6); - __pyx_t_4 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_6); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 532, __pyx_L29_error) - } - for (;;) { - if (likely(!__pyx_t_4)) { - if (likely(PyList_CheckExact(__pyx_t_6))) { - if (__pyx_t_3 >= PyList_GET_SIZE(__pyx_t_6)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_10 = PyList_GET_ITEM(__pyx_t_6, __pyx_t_3); __Pyx_INCREF(__pyx_t_10); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 532, __pyx_L29_error) - #else - __pyx_t_10 = PySequence_ITEM(__pyx_t_6, __pyx_t_3); 
__pyx_t_3++; if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_10); - #endif - } else { - if (__pyx_t_3 >= PyTuple_GET_SIZE(__pyx_t_6)) break; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_10 = PyTuple_GET_ITEM(__pyx_t_6, __pyx_t_3); __Pyx_INCREF(__pyx_t_10); __pyx_t_3++; if (unlikely((0 < 0))) __PYX_ERR(0, 532, __pyx_L29_error) - #else - __pyx_t_10 = PySequence_ITEM(__pyx_t_6, __pyx_t_3); __pyx_t_3++; if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_10); - #endif - } - } else { - __pyx_t_10 = __pyx_t_4(__pyx_t_6); - if (unlikely(!__pyx_t_10)) { - PyObject* exc_type = PyErr_Occurred(); - if (exc_type) { - if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear(); - else __PYX_ERR(0, 532, __pyx_L29_error) - } - break; - } - __Pyx_GOTREF(__pyx_t_10); - } - __Pyx_XDECREF_SET(__pyx_8genexpr5__pyx_v_s, __pyx_t_10); - __pyx_t_10 = 0; - __pyx_t_10 = __Pyx_PyObject_GetAttrStr(__pyx_8genexpr5__pyx_v_s, __pyx_n_s_real); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_10); - __pyx_t_9 = __Pyx_PyObject_GetAttrStr(__pyx_8genexpr5__pyx_v_s, __pyx_n_s_imag); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_9); - __pyx_t_15 = PyTuple_New(2); if (unlikely(!__pyx_t_15)) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_GOTREF(__pyx_t_15); - __Pyx_GIVEREF(__pyx_t_10); - PyTuple_SET_ITEM(__pyx_t_15, 0, __pyx_t_10); - __Pyx_GIVEREF(__pyx_t_9); - PyTuple_SET_ITEM(__pyx_t_15, 1, __pyx_t_9); - __pyx_t_10 = 0; - __pyx_t_9 = 0; - if (unlikely(__Pyx_ListComp_Append(__pyx_t_5, (PyObject*)__pyx_t_15))) __PYX_ERR(0, 532, __pyx_L29_error) - __Pyx_DECREF(__pyx_t_15); __pyx_t_15 = 0; - } - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_s); __pyx_8genexpr5__pyx_v_s = 0; - goto __pyx_L33_exit_scope; - __pyx_L29_error:; - __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_s); 
__pyx_8genexpr5__pyx_v_s = 0; - goto __pyx_L24_error; - __pyx_L33_exit_scope:; - } /* exit inner scope */ - if (unlikely(__Pyx_ListComp_Append(__pyx_t_1, (PyObject*)__pyx_t_5))) __PYX_ERR(0, 532, __pyx_L24_error) - __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; - } - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_XDECREF(__pyx_8genexpr4__pyx_v_spline); __pyx_8genexpr4__pyx_v_spline = 0; - goto __pyx_L35_exit_scope; - __pyx_L24_error:; - __Pyx_XDECREF(__pyx_8genexpr4__pyx_v_spline); __pyx_8genexpr4__pyx_v_spline = 0; - goto __pyx_L1_error; - __pyx_L35_exit_scope:; - } /* exit inner scope */ - __pyx_r = __pyx_t_1; - __pyx_t_1 = 0; - goto __pyx_L0; - - /* "fontTools/cu2qu/cu2qu.py":530 - * splines[i] = spline - * i = (i + 1) % l - * if i == last_i: # <<<<<<<<<<<<<< - * # done. go home - * return [[(s.real, s.imag) for s in spline] for spline in splines] - */ - } - __pyx_L17_continue:; - } - __pyx_L18_break:; - - /* "fontTools/cu2qu/cu2qu.py":534 - * return [[(s.real, s.imag) for s in spline] for spline in splines] - * - * raise ApproxNotFoundError(curves) # <<<<<<<<<<<<<< - */ - __Pyx_GetModuleGlobalName(__pyx_t_2, __pyx_n_s_ApproxNotFoundError); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 534, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_2); - __pyx_t_5 = NULL; - __pyx_t_12 = 0; - if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { - __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_2); - if (likely(__pyx_t_5)) { - PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); - __Pyx_INCREF(__pyx_t_5); - __Pyx_INCREF(function); - __Pyx_DECREF_SET(__pyx_t_2, function); - __pyx_t_12 = 1; - } - } - { - PyObject *__pyx_callargs[2] = {__pyx_t_5, __pyx_v_curves}; - __pyx_t_1 = __Pyx_PyObject_FastCall(__pyx_t_2, __pyx_callargs+1-__pyx_t_12, 1+__pyx_t_12); - __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; - if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 534, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_1); - __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - } - __Pyx_Raise(__pyx_t_1, 0, 0, 0); - __Pyx_DECREF(__pyx_t_1); 
__pyx_t_1 = 0; - __PYX_ERR(0, 534, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":474 - * - * - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) # <<<<<<<<<<<<<< - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, all_quadratic=True): - */ - - /* function exit code */ - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_1); - __Pyx_XDECREF(__pyx_t_2); - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_9); - __Pyx_XDECREF(__pyx_t_10); - __Pyx_XDECREF(__pyx_t_15); - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu.curves_to_quadratic", __pyx_clineno, __pyx_lineno, __pyx_filename); - __pyx_r = NULL; - __pyx_L0:; - __Pyx_XDECREF(__pyx_v_splines); - __Pyx_XDECREF(__pyx_v_n); - __Pyx_XDECREF(__pyx_v_spline); - __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_curve); - __Pyx_XDECREF(__pyx_8genexpr3__pyx_v_p); - __Pyx_XDECREF(__pyx_8genexpr4__pyx_v_spline); - __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_s); - __Pyx_XDECREF(__pyx_v_curves); - __Pyx_XGIVEREF(__pyx_r); - __Pyx_RefNannyFinishContext(); - return __pyx_r; -} - -static struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *__pyx_freelist_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen[8]; -static int __pyx_freecount_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen = 0; - -static PyObject *__pyx_tp_new_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { - PyObject *o; - #if CYTHON_COMPILING_IN_LIMITED_API - allocfunc alloc_func = (allocfunc)PyType_GetSlot(t, Py_tp_alloc); - o = alloc_func(t, 0); - #else - if (CYTHON_COMPILING_IN_CPYTHON && likely((int)(__pyx_freecount_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen > 0) & (int)(t->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen)))) { - o = 
(PyObject*)__pyx_freelist_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen[--__pyx_freecount_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen]; - memset(o, 0, sizeof(struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen)); - (void) PyObject_INIT(o, t); - } else { - o = (*t->tp_alloc)(t, 0); - if (unlikely(!o)) return 0; - } - #endif - return o; -} - -static void __pyx_tp_dealloc_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen(PyObject *o) { - #if CYTHON_USE_TP_FINALIZE - if (unlikely((PY_VERSION_HEX >= 0x03080000 || __Pyx_PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE)) && __Pyx_PyObject_GetSlot(o, tp_finalize, destructor)) && (!PyType_IS_GC(Py_TYPE(o)) || !__Pyx_PyObject_GC_IsFinalized(o))) { - if (__Pyx_PyObject_GetSlot(o, tp_dealloc, destructor) == __pyx_tp_dealloc_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen) { - if (PyObject_CallFinalizerFromDealloc(o)) return; - } - } - #endif - if (CYTHON_COMPILING_IN_CPYTHON && ((int)(__pyx_freecount_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen < 8) & (int)(Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen)))) { - __pyx_freelist_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen[__pyx_freecount_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen++] = ((struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen *)o); - } else { - (*Py_TYPE(o)->tp_free)(o); - } -} -#if CYTHON_USE_TYPE_SPECS -static PyType_Slot __pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen_slots[] = { - {Py_tp_dealloc, (void *)__pyx_tp_dealloc_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen}, - {Py_tp_new, (void *)__pyx_tp_new_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen}, - {0, 0}, -}; -static PyType_Spec 
__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen_spec = { - "fontTools.cu2qu.cu2qu.__pyx_scope_struct___split_cubic_into_n_gen", - sizeof(struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen), - 0, - Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_FINALIZE, - __pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen_slots, -}; -#else - -static PyTypeObject __pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen = { - PyVarObject_HEAD_INIT(0, 0) - "fontTools.cu2qu.cu2qu.""__pyx_scope_struct___split_cubic_into_n_gen", /*tp_name*/ - sizeof(struct __pyx_obj_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen), /*tp_basicsize*/ - 0, /*tp_itemsize*/ - __pyx_tp_dealloc_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen, /*tp_dealloc*/ - #if PY_VERSION_HEX < 0x030800b4 - 0, /*tp_print*/ - #endif - #if PY_VERSION_HEX >= 0x030800b4 - 0, /*tp_vectorcall_offset*/ - #endif - 0, /*tp_getattr*/ - 0, /*tp_setattr*/ - #if PY_MAJOR_VERSION < 3 - 0, /*tp_compare*/ - #endif - #if PY_MAJOR_VERSION >= 3 - 0, /*tp_as_async*/ - #endif - 0, /*tp_repr*/ - 0, /*tp_as_number*/ - 0, /*tp_as_sequence*/ - 0, /*tp_as_mapping*/ - 0, /*tp_hash*/ - 0, /*tp_call*/ - 0, /*tp_str*/ - 0, /*tp_getattro*/ - 0, /*tp_setattro*/ - 0, /*tp_as_buffer*/ - Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_FINALIZE, /*tp_flags*/ - 0, /*tp_doc*/ - 0, /*tp_traverse*/ - 0, /*tp_clear*/ - 0, /*tp_richcompare*/ - 0, /*tp_weaklistoffset*/ - 0, /*tp_iter*/ - 0, /*tp_iternext*/ - 0, /*tp_methods*/ - 0, /*tp_members*/ - 0, /*tp_getset*/ - 0, /*tp_base*/ - 0, /*tp_dict*/ - 0, /*tp_descr_get*/ - 0, /*tp_descr_set*/ - #if !CYTHON_USE_TYPE_SPECS - 0, /*tp_dictoffset*/ - #endif - 0, /*tp_init*/ - 0, /*tp_alloc*/ - 
__pyx_tp_new_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen, /*tp_new*/ - 0, /*tp_free*/ - 0, /*tp_is_gc*/ - 0, /*tp_bases*/ - 0, /*tp_mro*/ - 0, /*tp_cache*/ - 0, /*tp_subclasses*/ - 0, /*tp_weaklist*/ - 0, /*tp_del*/ - 0, /*tp_version_tag*/ - #if PY_VERSION_HEX >= 0x030400a1 - #if CYTHON_USE_TP_FINALIZE - 0, /*tp_finalize*/ - #else - NULL, /*tp_finalize*/ - #endif - #endif - #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800) - 0, /*tp_vectorcall*/ - #endif - #if __PYX_NEED_TP_PRINT_SLOT == 1 - 0, /*tp_print*/ - #endif - #if PY_VERSION_HEX >= 0x030C0000 - 0, /*tp_watched*/ - #endif - #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000 - 0, /*tp_pypy_flags*/ - #endif -}; -#endif - -static PyMethodDef __pyx_methods[] = { - {0, 0, 0, 0} -}; -#ifndef CYTHON_SMALL_CODE -#if defined(__clang__) - #define CYTHON_SMALL_CODE -#elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) - #define CYTHON_SMALL_CODE __attribute__((cold)) -#else - #define CYTHON_SMALL_CODE -#endif -#endif -/* #### Code section: pystring_table ### */ - -static int __Pyx_CreateStringTabAndInitStrings(void) { - __Pyx_StringTabEntry __pyx_string_tab[] = { - {&__pyx_n_s_ApproxNotFoundError, __pyx_k_ApproxNotFoundError, sizeof(__pyx_k_ApproxNotFoundError), 0, 0, 1, 1}, - {&__pyx_n_s_AssertionError, __pyx_k_AssertionError, sizeof(__pyx_k_AssertionError), 0, 0, 1, 1}, - {&__pyx_n_s_AttributeError, __pyx_k_AttributeError, sizeof(__pyx_k_AttributeError), 0, 0, 1, 1}, - {&__pyx_n_s_COMPILED, __pyx_k_COMPILED, sizeof(__pyx_k_COMPILED), 0, 0, 1, 1}, - {&__pyx_n_s_Cu2QuError, __pyx_k_Cu2QuError, sizeof(__pyx_k_Cu2QuError), 0, 0, 1, 1}, - {&__pyx_n_s_Error, __pyx_k_Error, sizeof(__pyx_k_Error), 0, 0, 1, 1}, - {&__pyx_n_s_ImportError, __pyx_k_ImportError, sizeof(__pyx_k_ImportError), 0, 0, 1, 1}, - {&__pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py, __pyx_k_Lib_fontTools_cu2qu_cu2qu_py, 
sizeof(__pyx_k_Lib_fontTools_cu2qu_cu2qu_py), 0, 0, 1, 0}, - {&__pyx_n_s_MAX_N, __pyx_k_MAX_N, sizeof(__pyx_k_MAX_N), 0, 0, 1, 1}, - {&__pyx_n_s_NAN, __pyx_k_NAN, sizeof(__pyx_k_NAN), 0, 0, 1, 1}, - {&__pyx_n_u_NaN, __pyx_k_NaN, sizeof(__pyx_k_NaN), 0, 1, 0, 1}, - {&__pyx_kp_u_Return_quadratic_Bezier_splines, __pyx_k_Return_quadratic_Bezier_splines, sizeof(__pyx_k_Return_quadratic_Bezier_splines), 0, 1, 0, 0}, - {&__pyx_n_s_ZeroDivisionError, __pyx_k_ZeroDivisionError, sizeof(__pyx_k_ZeroDivisionError), 0, 0, 1, 1}, - {&__pyx_kp_u__2, __pyx_k__2, sizeof(__pyx_k__2), 0, 1, 0, 0}, - {&__pyx_n_s__3, __pyx_k__3, sizeof(__pyx_k__3), 0, 0, 1, 1}, - {&__pyx_n_s__9, __pyx_k__9, sizeof(__pyx_k__9), 0, 0, 1, 1}, - {&__pyx_n_s_a, __pyx_k_a, sizeof(__pyx_k_a), 0, 0, 1, 1}, - {&__pyx_n_s_a1, __pyx_k_a1, sizeof(__pyx_k_a1), 0, 0, 1, 1}, - {&__pyx_n_s_all, __pyx_k_all, sizeof(__pyx_k_all), 0, 0, 1, 1}, - {&__pyx_n_s_all_quadratic, __pyx_k_all_quadratic, sizeof(__pyx_k_all_quadratic), 0, 0, 1, 1}, - {&__pyx_n_s_args, __pyx_k_args, sizeof(__pyx_k_args), 0, 0, 1, 1}, - {&__pyx_n_s_asyncio_coroutines, __pyx_k_asyncio_coroutines, sizeof(__pyx_k_asyncio_coroutines), 0, 0, 1, 1}, - {&__pyx_n_s_b, __pyx_k_b, sizeof(__pyx_k_b), 0, 0, 1, 1}, - {&__pyx_n_s_b1, __pyx_k_b1, sizeof(__pyx_k_b1), 0, 0, 1, 1}, - {&__pyx_n_s_c, __pyx_k_c, sizeof(__pyx_k_c), 0, 0, 1, 1}, - {&__pyx_n_s_c1, __pyx_k_c1, sizeof(__pyx_k_c1), 0, 0, 1, 1}, - {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, - {&__pyx_n_s_close, __pyx_k_close, sizeof(__pyx_k_close), 0, 0, 1, 1}, - {&__pyx_n_s_curve, __pyx_k_curve, sizeof(__pyx_k_curve), 0, 0, 1, 1}, - {&__pyx_n_s_curve_to_quadratic, __pyx_k_curve_to_quadratic, sizeof(__pyx_k_curve_to_quadratic), 0, 0, 1, 1}, - {&__pyx_n_u_curve_to_quadratic, __pyx_k_curve_to_quadratic, sizeof(__pyx_k_curve_to_quadratic), 0, 1, 0, 1}, - {&__pyx_n_s_curves, __pyx_k_curves, sizeof(__pyx_k_curves), 0, 0, 1, 1}, - 
{&__pyx_n_s_curves_to_quadratic, __pyx_k_curves_to_quadratic, sizeof(__pyx_k_curves_to_quadratic), 0, 0, 1, 1}, - {&__pyx_n_u_curves_to_quadratic, __pyx_k_curves_to_quadratic, sizeof(__pyx_k_curves_to_quadratic), 0, 1, 0, 1}, - {&__pyx_kp_u_curves_to_quadratic_line_474, __pyx_k_curves_to_quadratic_line_474, sizeof(__pyx_k_curves_to_quadratic_line_474), 0, 1, 0, 0}, - {&__pyx_n_s_cython, __pyx_k_cython, sizeof(__pyx_k_cython), 0, 0, 1, 1}, - {&__pyx_n_s_d, __pyx_k_d, sizeof(__pyx_k_d), 0, 0, 1, 1}, - {&__pyx_n_s_d1, __pyx_k_d1, sizeof(__pyx_k_d1), 0, 0, 1, 1}, - {&__pyx_n_s_delta_2, __pyx_k_delta_2, sizeof(__pyx_k_delta_2), 0, 0, 1, 1}, - {&__pyx_n_s_delta_3, __pyx_k_delta_3, sizeof(__pyx_k_delta_3), 0, 0, 1, 1}, - {&__pyx_kp_u_disable, __pyx_k_disable, sizeof(__pyx_k_disable), 0, 1, 0, 0}, - {&__pyx_n_s_dt, __pyx_k_dt, sizeof(__pyx_k_dt), 0, 0, 1, 1}, - {&__pyx_kp_u_enable, __pyx_k_enable, sizeof(__pyx_k_enable), 0, 1, 0, 0}, - {&__pyx_n_s_errors, __pyx_k_errors, sizeof(__pyx_k_errors), 0, 0, 1, 1}, - {&__pyx_n_s_fontTools_cu2qu_cu2qu, __pyx_k_fontTools_cu2qu_cu2qu, sizeof(__pyx_k_fontTools_cu2qu_cu2qu), 0, 0, 1, 1}, - {&__pyx_n_s_fontTools_misc, __pyx_k_fontTools_misc, sizeof(__pyx_k_fontTools_misc), 0, 0, 1, 1}, - {&__pyx_kp_u_gc, __pyx_k_gc, sizeof(__pyx_k_gc), 0, 1, 0, 0}, - {&__pyx_n_s_i, __pyx_k_i, sizeof(__pyx_k_i), 0, 0, 1, 1}, - {&__pyx_n_s_imag, __pyx_k_imag, sizeof(__pyx_k_imag), 0, 0, 1, 1}, - {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, - {&__pyx_n_s_initializing, __pyx_k_initializing, sizeof(__pyx_k_initializing), 0, 0, 1, 1}, - {&__pyx_n_s_is_coroutine, __pyx_k_is_coroutine, sizeof(__pyx_k_is_coroutine), 0, 0, 1, 1}, - {&__pyx_kp_u_isenabled, __pyx_k_isenabled, sizeof(__pyx_k_isenabled), 0, 1, 0, 0}, - {&__pyx_n_s_isnan, __pyx_k_isnan, sizeof(__pyx_k_isnan), 0, 0, 1, 1}, - {&__pyx_n_s_l, __pyx_k_l, sizeof(__pyx_k_l), 0, 0, 1, 1}, - {&__pyx_n_s_last_i, __pyx_k_last_i, sizeof(__pyx_k_last_i), 0, 0, 1, 1}, - {&__pyx_n_s_main, 
__pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, - {&__pyx_n_s_math, __pyx_k_math, sizeof(__pyx_k_math), 0, 0, 1, 1}, - {&__pyx_n_s_max_err, __pyx_k_max_err, sizeof(__pyx_k_max_err), 0, 0, 1, 1}, - {&__pyx_n_s_max_errors, __pyx_k_max_errors, sizeof(__pyx_k_max_errors), 0, 0, 1, 1}, - {&__pyx_n_s_n, __pyx_k_n, sizeof(__pyx_k_n), 0, 0, 1, 1}, - {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, - {&__pyx_n_s_p, __pyx_k_p, sizeof(__pyx_k_p), 0, 0, 1, 1}, - {&__pyx_n_s_p0, __pyx_k_p0, sizeof(__pyx_k_p0), 0, 0, 1, 1}, - {&__pyx_n_s_p1, __pyx_k_p1, sizeof(__pyx_k_p1), 0, 0, 1, 1}, - {&__pyx_n_s_p2, __pyx_k_p2, sizeof(__pyx_k_p2), 0, 0, 1, 1}, - {&__pyx_n_s_p3, __pyx_k_p3, sizeof(__pyx_k_p3), 0, 0, 1, 1}, - {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, - {&__pyx_n_s_real, __pyx_k_real, sizeof(__pyx_k_real), 0, 0, 1, 1}, - {&__pyx_n_s_s, __pyx_k_s, sizeof(__pyx_k_s), 0, 0, 1, 1}, - {&__pyx_n_s_send, __pyx_k_send, sizeof(__pyx_k_send), 0, 0, 1, 1}, - {&__pyx_n_s_spec, __pyx_k_spec, sizeof(__pyx_k_spec), 0, 0, 1, 1}, - {&__pyx_n_s_spline, __pyx_k_spline, sizeof(__pyx_k_spline), 0, 0, 1, 1}, - {&__pyx_n_s_splines, __pyx_k_splines, sizeof(__pyx_k_splines), 0, 0, 1, 1}, - {&__pyx_n_s_split_cubic_into_n_gen, __pyx_k_split_cubic_into_n_gen, sizeof(__pyx_k_split_cubic_into_n_gen), 0, 0, 1, 1}, - {&__pyx_n_s_t1, __pyx_k_t1, sizeof(__pyx_k_t1), 0, 0, 1, 1}, - {&__pyx_n_s_t1_2, __pyx_k_t1_2, sizeof(__pyx_k_t1_2), 0, 0, 1, 1}, - {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, - {&__pyx_n_s_throw, __pyx_k_throw, sizeof(__pyx_k_throw), 0, 0, 1, 1}, - {0, 0, 0, 0, 0, 0, 0} - }; - return __Pyx_InitStrings(__pyx_string_tab); -} -/* #### Code section: cached_builtins ### */ -static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { - __pyx_builtin_AttributeError = __Pyx_GetBuiltinName(__pyx_n_s_AttributeError); if (!__pyx_builtin_AttributeError) __PYX_ERR(0, 22, __pyx_L1_error) - __pyx_builtin_ImportError = 
__Pyx_GetBuiltinName(__pyx_n_s_ImportError); if (!__pyx_builtin_ImportError) __PYX_ERR(0, 22, __pyx_L1_error) - __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(0, 146, __pyx_L1_error) - __pyx_builtin_ZeroDivisionError = __Pyx_GetBuiltinName(__pyx_n_s_ZeroDivisionError); if (!__pyx_builtin_ZeroDivisionError) __PYX_ERR(0, 278, __pyx_L1_error) - __pyx_builtin_AssertionError = __Pyx_GetBuiltinName(__pyx_n_s_AssertionError); if (!__pyx_builtin_AssertionError) __PYX_ERR(0, 514, __pyx_L1_error) - return 0; - __pyx_L1_error:; - return -1; -} -/* #### Code section: cached_constants ### */ - -static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); - - /* "fontTools/cu2qu/cu2qu.py":127 - * - * - * @cython.locals( # <<<<<<<<<<<<<< - * p0=cython.complex, - * p1=cython.complex, - */ - __pyx_tuple__4 = PyTuple_Pack(19, __pyx_n_s_p0, __pyx_n_s_p1, __pyx_n_s_p2, __pyx_n_s_p3, __pyx_n_s_n, __pyx_n_s_a1, __pyx_n_s_b1, __pyx_n_s_c1, __pyx_n_s_d1, __pyx_n_s_dt, __pyx_n_s_delta_2, __pyx_n_s_delta_3, __pyx_n_s_i, __pyx_n_s_a, __pyx_n_s_b, __pyx_n_s_c, __pyx_n_s_d, __pyx_n_s_t1, __pyx_n_s_t1_2); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(0, 127, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__4); - __Pyx_GIVEREF(__pyx_tuple__4); - __pyx_codeobj_ = (PyObject*)__Pyx_PyCode_New(5, 0, 0, 19, 0, CO_OPTIMIZED|CO_NEWLOCALS|CO_GENERATOR, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__4, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py, __pyx_n_s_split_cubic_into_n_gen, 127, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj_)) __PYX_ERR(0, 127, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":439 - * - * - * @cython.locals(max_err=cython.double) # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - */ - __pyx_tuple__5 = PyTuple_Pack(7, __pyx_n_s_curve, 
__pyx_n_s_max_err, __pyx_n_s_all_quadratic, __pyx_n_s_n, __pyx_n_s_spline, __pyx_n_s_p, __pyx_n_s_s); if (unlikely(!__pyx_tuple__5)) __PYX_ERR(0, 439, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__5); - __Pyx_GIVEREF(__pyx_tuple__5); - __pyx_codeobj__6 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 7, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__5, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py, __pyx_n_s_curve_to_quadratic, 439, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__6)) __PYX_ERR(0, 439, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":474 - * - * - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) # <<<<<<<<<<<<<< - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, all_quadratic=True): - */ - __pyx_tuple__7 = PyTuple_Pack(13, __pyx_n_s_curves, __pyx_n_s_max_errors, __pyx_n_s_all_quadratic, __pyx_n_s_l, __pyx_n_s_last_i, __pyx_n_s_i, __pyx_n_s_splines, __pyx_n_s_n, __pyx_n_s_spline, __pyx_n_s_curve, __pyx_n_s_p, __pyx_n_s_spline, __pyx_n_s_s); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(0, 474, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__7); - __Pyx_GIVEREF(__pyx_tuple__7); - __pyx_codeobj__8 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 13, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__7, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_cu2qu_cu2qu_py, __pyx_n_s_curves_to_quadratic, 474, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__8)) __PYX_ERR(0, 474, __pyx_L1_error) - __Pyx_RefNannyFinishContext(); - return 0; - __pyx_L1_error:; - __Pyx_RefNannyFinishContext(); - return -1; -} -/* #### Code section: init_constants ### */ - -static CYTHON_SMALL_CODE int __Pyx_InitConstants(void) { - if (__Pyx_CreateStringTabAndInitStrings() < 0) __PYX_ERR(0, 1, __pyx_L1_error); - __pyx_int_1 = PyInt_FromLong(1); if (unlikely(!__pyx_int_1)) __PYX_ERR(0, 1, __pyx_L1_error) - 
__pyx_int_2 = PyInt_FromLong(2); if (unlikely(!__pyx_int_2)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_int_3 = PyInt_FromLong(3); if (unlikely(!__pyx_int_3)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_int_4 = PyInt_FromLong(4); if (unlikely(!__pyx_int_4)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_int_6 = PyInt_FromLong(6); if (unlikely(!__pyx_int_6)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_int_100 = PyInt_FromLong(100); if (unlikely(!__pyx_int_100)) __PYX_ERR(0, 1, __pyx_L1_error) - return 0; - __pyx_L1_error:; - return -1; -} -/* #### Code section: init_globals ### */ - -static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { - /* AssertionsEnabled.init */ - __Pyx_init_assertions_enabled(); - -if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 1, __pyx_L1_error) - - return 0; - __pyx_L1_error:; - return -1; -} -/* #### Code section: init_module ### */ - -static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ -static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ - -static int __Pyx_modinit_global_init_code(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); - /*--- Global init code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - -static int __Pyx_modinit_variable_export_code(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); - /*--- Variable export code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - -static int __Pyx_modinit_function_export_code(void) { - __Pyx_RefNannyDeclarations - 
__Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); - /*--- Function export code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - -static int __Pyx_modinit_type_init_code(void) { - __Pyx_RefNannyDeclarations - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); - /*--- Type init code ---*/ - #if CYTHON_USE_TYPE_SPECS - __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen = (PyTypeObject *) __Pyx_PyType_FromModuleAndSpec(__pyx_m, &__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen_spec, NULL); if (unlikely(!__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen)) __PYX_ERR(0, 127, __pyx_L1_error) - if (__Pyx_fix_up_extension_type_from_spec(&__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen_spec, __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen) < 0) __PYX_ERR(0, 127, __pyx_L1_error) - #else - __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen = &__pyx_type_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen; - #endif - #if !CYTHON_COMPILING_IN_LIMITED_API - #endif - #if !CYTHON_USE_TYPE_SPECS - if (__Pyx_PyType_Ready(__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen) < 0) __PYX_ERR(0, 127, __pyx_L1_error) - #endif - #if PY_MAJOR_VERSION < 3 - __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen->tp_print = 0; - #endif - #if !CYTHON_COMPILING_IN_LIMITED_API - if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen->tp_dictoffset && __pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen->tp_getattro == PyObject_GenericGetAttr)) { - 
__pyx_ptype_9fontTools_5cu2qu_5cu2qu___pyx_scope_struct___split_cubic_into_n_gen->tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; - } - #endif - __Pyx_RefNannyFinishContext(); - return 0; - __pyx_L1_error:; - __Pyx_RefNannyFinishContext(); - return -1; -} - -static int __Pyx_modinit_type_import_code(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); - /*--- Type import code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - -static int __Pyx_modinit_variable_import_code(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); - /*--- Variable import code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - -static int __Pyx_modinit_function_import_code(void) { - __Pyx_RefNannyDeclarations - __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); - /*--- Function import code ---*/ - __Pyx_RefNannyFinishContext(); - return 0; -} - - -#if PY_MAJOR_VERSION >= 3 -#if CYTHON_PEP489_MULTI_PHASE_INIT -static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ -static int __pyx_pymod_exec_cu2qu(PyObject* module); /*proto*/ -static PyModuleDef_Slot __pyx_moduledef_slots[] = { - {Py_mod_create, (void*)__pyx_pymod_create}, - {Py_mod_exec, (void*)__pyx_pymod_exec_cu2qu}, - {0, NULL} -}; -#endif - -#ifdef __cplusplus -namespace { - struct PyModuleDef __pyx_moduledef = - #else - static struct PyModuleDef __pyx_moduledef = - #endif - { - PyModuleDef_HEAD_INIT, - "cu2qu", - 0, /* m_doc */ - #if CYTHON_PEP489_MULTI_PHASE_INIT - 0, /* m_size */ - #elif CYTHON_USE_MODULE_STATE - sizeof(__pyx_mstate), /* m_size */ - #else - -1, /* m_size */ - #endif - __pyx_methods /* m_methods */, - #if CYTHON_PEP489_MULTI_PHASE_INIT - __pyx_moduledef_slots, /* m_slots */ - #else - NULL, /* m_reload */ - #endif - #if CYTHON_USE_MODULE_STATE - __pyx_m_traverse, /* m_traverse */ - __pyx_m_clear, /* m_clear */ - NULL /* m_free */ - #else - NULL, /* 
m_traverse */ - NULL, /* m_clear */ - NULL /* m_free */ - #endif - }; - #ifdef __cplusplus -} /* anonymous namespace */ -#endif -#endif - -#ifndef CYTHON_NO_PYINIT_EXPORT -#define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC -#elif PY_MAJOR_VERSION < 3 -#ifdef __cplusplus -#define __Pyx_PyMODINIT_FUNC extern "C" void -#else -#define __Pyx_PyMODINIT_FUNC void -#endif -#else -#ifdef __cplusplus -#define __Pyx_PyMODINIT_FUNC extern "C" PyObject * -#else -#define __Pyx_PyMODINIT_FUNC PyObject * -#endif -#endif - - -#if PY_MAJOR_VERSION < 3 -__Pyx_PyMODINIT_FUNC initcu2qu(void) CYTHON_SMALL_CODE; /*proto*/ -__Pyx_PyMODINIT_FUNC initcu2qu(void) -#else -__Pyx_PyMODINIT_FUNC PyInit_cu2qu(void) CYTHON_SMALL_CODE; /*proto*/ -__Pyx_PyMODINIT_FUNC PyInit_cu2qu(void) -#if CYTHON_PEP489_MULTI_PHASE_INIT -{ - return PyModuleDef_Init(&__pyx_moduledef); -} -static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { - #if PY_VERSION_HEX >= 0x030700A1 - static PY_INT64_T main_interpreter_id = -1; - PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); - if (main_interpreter_id == -1) { - main_interpreter_id = current_id; - return (unlikely(current_id == -1)) ? 
-1 : 0; - } else if (unlikely(main_interpreter_id != current_id)) - #else - static PyInterpreterState *main_interpreter = NULL; - PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; - if (!main_interpreter) { - main_interpreter = current_interpreter; - } else if (unlikely(main_interpreter != current_interpreter)) - #endif - { - PyErr_SetString( - PyExc_ImportError, - "Interpreter change detected - this module can only be loaded into one interpreter per process."); - return -1; - } - return 0; -} -#if CYTHON_COMPILING_IN_LIMITED_API -static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *module, const char* from_name, const char* to_name, int allow_none) -#else -static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) -#endif -{ - PyObject *value = PyObject_GetAttrString(spec, from_name); - int result = 0; - if (likely(value)) { - if (allow_none || value != Py_None) { -#if CYTHON_COMPILING_IN_LIMITED_API - result = PyModule_AddObject(module, to_name, value); -#else - result = PyDict_SetItemString(moddict, to_name, value); -#endif - } - Py_DECREF(value); - } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { - PyErr_Clear(); - } else { - result = -1; - } - return result; -} -static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def) { - PyObject *module = NULL, *moddict, *modname; - CYTHON_UNUSED_VAR(def); - if (__Pyx_check_single_interpreter()) - return NULL; - if (__pyx_m) - return __Pyx_NewRef(__pyx_m); - modname = PyObject_GetAttrString(spec, "name"); - if (unlikely(!modname)) goto bad; - module = PyModule_NewObject(modname); - Py_DECREF(modname); - if (unlikely(!module)) goto bad; -#if CYTHON_COMPILING_IN_LIMITED_API - moddict = module; -#else - moddict = PyModule_GetDict(module); - if (unlikely(!moddict)) goto bad; -#endif - if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", 
"__loader__", 1) < 0)) goto bad; - if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; - if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; - if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; - return module; -bad: - Py_XDECREF(module); - return NULL; -} - - -static CYTHON_SMALL_CODE int __pyx_pymod_exec_cu2qu(PyObject *__pyx_pyinit_module) -#endif -#endif -{ - int stringtab_initialized = 0; - #if CYTHON_USE_MODULE_STATE - int pystate_addmodule_run = 0; - #endif - PyObject *__pyx_t_1 = NULL; - PyObject *__pyx_t_2 = NULL; - PyObject *__pyx_t_3 = NULL; - int __pyx_t_4; - PyObject *__pyx_t_5 = NULL; - PyObject *__pyx_t_6 = NULL; - PyObject *__pyx_t_7 = NULL; - PyObject *__pyx_t_8 = NULL; - PyObject *__pyx_t_9 = NULL; - double __pyx_t_10; - int __pyx_lineno = 0; - const char *__pyx_filename = NULL; - int __pyx_clineno = 0; - __Pyx_RefNannyDeclarations - #if CYTHON_PEP489_MULTI_PHASE_INIT - if (__pyx_m) { - if (__pyx_m == __pyx_pyinit_module) return 0; - PyErr_SetString(PyExc_RuntimeError, "Module 'cu2qu' has already been imported. 
Re-initialisation is not supported."); - return -1; - } - #elif PY_MAJOR_VERSION >= 3 - if (__pyx_m) return __Pyx_NewRef(__pyx_m); - #endif - /*--- Module creation code ---*/ - #if CYTHON_PEP489_MULTI_PHASE_INIT - __pyx_m = __pyx_pyinit_module; - Py_INCREF(__pyx_m); - #else - #if PY_MAJOR_VERSION < 3 - __pyx_m = Py_InitModule4("cu2qu", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); - if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) - #elif CYTHON_USE_MODULE_STATE - __pyx_t_1 = PyModule_Create(&__pyx_moduledef); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) - { - int add_module_result = PyState_AddModule(__pyx_t_1, &__pyx_moduledef); - __pyx_t_1 = 0; /* transfer ownership from __pyx_t_1 to cu2qu pseudovariable */ - if (unlikely((add_module_result < 0))) __PYX_ERR(0, 1, __pyx_L1_error) - pystate_addmodule_run = 1; - } - #else - __pyx_m = PyModule_Create(&__pyx_moduledef); - if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #endif - CYTHON_UNUSED_VAR(__pyx_t_1); - __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) - Py_INCREF(__pyx_d); - __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) - Py_INCREF(__pyx_b); - __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) - Py_INCREF(__pyx_cython_runtime); - if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #if CYTHON_REFNANNY -__Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); -if (!__Pyx_RefNanny) { - PyErr_Clear(); - __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); - if (!__Pyx_RefNanny) - Py_FatalError("failed to import 'refnanny' module"); -} -#endif - __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit_cu2qu(void)", 0); - if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #ifdef 
__Pxy_PyFrame_Initialize_Offsets - __Pxy_PyFrame_Initialize_Offsets(); - #endif - __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) - __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) - #ifdef __Pyx_CyFunction_USED - if (__pyx_CyFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #ifdef __Pyx_FusedFunction_USED - if (__pyx_FusedFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #ifdef __Pyx_Coroutine_USED - if (__pyx_Coroutine_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #ifdef __Pyx_Generator_USED - if (__pyx_Generator_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #ifdef __Pyx_AsyncGen_USED - if (__pyx_AsyncGen_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - #ifdef __Pyx_StopAsyncIteration_USED - if (__pyx_StopAsyncIteration_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - /*--- Library function declarations ---*/ - /*--- Threads initialization code ---*/ - #if defined(WITH_THREAD) && PY_VERSION_HEX < 0x030700F0 && defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS - PyEval_InitThreads(); - #endif - /*--- Initialize various global constants etc. 
---*/ - if (__Pyx_InitConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - stringtab_initialized = 1; - if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) - if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - if (__pyx_module_is_main_fontTools__cu2qu__cu2qu) { - if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - } - #if PY_MAJOR_VERSION >= 3 - { - PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) - if (!PyDict_GetItemString(modules, "fontTools.cu2qu.cu2qu")) { - if (unlikely((PyDict_SetItemString(modules, "fontTools.cu2qu.cu2qu", __pyx_m) < 0))) __PYX_ERR(0, 1, __pyx_L1_error) - } - } - #endif - /*--- Builtin init code ---*/ - if (__Pyx_InitCachedBuiltins() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - /*--- Constants init code ---*/ - if (__Pyx_InitCachedConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - /*--- Global type/function init code ---*/ - (void)__Pyx_modinit_global_init_code(); - (void)__Pyx_modinit_variable_export_code(); - (void)__Pyx_modinit_function_export_code(); - if (unlikely((__Pyx_modinit_type_init_code() < 0))) __PYX_ERR(0, 1, __pyx_L1_error) - (void)__Pyx_modinit_type_import_code(); - (void)__Pyx_modinit_variable_import_code(); - (void)__Pyx_modinit_function_import_code(); - /*--- Execution code ---*/ - #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) - if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) - #endif - - /* "fontTools/cu2qu/cu2qu.py":18 - * # limitations under the License. 
- * - * try: # <<<<<<<<<<<<<< - * import cython - * - */ - { - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); - __Pyx_XGOTREF(__pyx_t_1); - __Pyx_XGOTREF(__pyx_t_2); - __Pyx_XGOTREF(__pyx_t_3); - /*try:*/ { - - /* "fontTools/cu2qu/cu2qu.py":21 - * import cython - * - * COMPILED = cython.compiled # <<<<<<<<<<<<<< - * except (AttributeError, ImportError): - * # if cython not installed, use mock module with no-op decorators and types - */ - if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_True) < 0) __PYX_ERR(0, 21, __pyx_L2_error) - - /* "fontTools/cu2qu/cu2qu.py":18 - * # limitations under the License. - * - * try: # <<<<<<<<<<<<<< - * import cython - * - */ - } - __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; - __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; - __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; - goto __pyx_L7_try_end; - __pyx_L2_error:; - - /* "fontTools/cu2qu/cu2qu.py":22 - * - * COMPILED = cython.compiled - * except (AttributeError, ImportError): # <<<<<<<<<<<<<< - * # if cython not installed, use mock module with no-op decorators and types - * from fontTools.misc import cython - */ - __pyx_t_4 = __Pyx_PyErr_ExceptionMatches2(__pyx_builtin_AttributeError, __pyx_builtin_ImportError); - if (__pyx_t_4) { - __Pyx_AddTraceback("fontTools.cu2qu.cu2qu", __pyx_clineno, __pyx_lineno, __pyx_filename); - if (__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(0, 22, __pyx_L4_except_error) - __Pyx_XGOTREF(__pyx_t_5); - __Pyx_XGOTREF(__pyx_t_6); - __Pyx_XGOTREF(__pyx_t_7); - - /* "fontTools/cu2qu/cu2qu.py":24 - * except (AttributeError, ImportError): - * # if cython not installed, use mock module with no-op decorators and types - * from fontTools.misc import cython # <<<<<<<<<<<<<< - * - * COMPILED = False - */ - __pyx_t_8 = PyList_New(1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 24, __pyx_L4_except_error) - __Pyx_GOTREF(__pyx_t_8); - __Pyx_INCREF(__pyx_n_s_cython); - 
__Pyx_GIVEREF(__pyx_n_s_cython); - PyList_SET_ITEM(__pyx_t_8, 0, __pyx_n_s_cython); - __pyx_t_9 = __Pyx_Import(__pyx_n_s_fontTools_misc, __pyx_t_8, 0); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 24, __pyx_L4_except_error) - __Pyx_GOTREF(__pyx_t_9); - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __pyx_t_8 = __Pyx_ImportFrom(__pyx_t_9, __pyx_n_s_cython); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 24, __pyx_L4_except_error) - __Pyx_GOTREF(__pyx_t_8); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_cython, __pyx_t_8) < 0) __PYX_ERR(0, 24, __pyx_L4_except_error) - __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; - __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; - - /* "fontTools/cu2qu/cu2qu.py":26 - * from fontTools.misc import cython - * - * COMPILED = False # <<<<<<<<<<<<<< - * - * import math - */ - if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_False) < 0) __PYX_ERR(0, 26, __pyx_L4_except_error) - __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; - __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; - __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; - goto __pyx_L3_exception_handled; - } - goto __pyx_L4_except_error; - - /* "fontTools/cu2qu/cu2qu.py":18 - * # limitations under the License. 
- * - * try: # <<<<<<<<<<<<<< - * import cython - * - */ - __pyx_L4_except_error:; - __Pyx_XGIVEREF(__pyx_t_1); - __Pyx_XGIVEREF(__pyx_t_2); - __Pyx_XGIVEREF(__pyx_t_3); - __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); - goto __pyx_L1_error; - __pyx_L3_exception_handled:; - __Pyx_XGIVEREF(__pyx_t_1); - __Pyx_XGIVEREF(__pyx_t_2); - __Pyx_XGIVEREF(__pyx_t_3); - __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); - __pyx_L7_try_end:; - } - - /* "fontTools/cu2qu/cu2qu.py":28 - * COMPILED = False - * - * import math # <<<<<<<<<<<<<< - * - * from .errors import Error as Cu2QuError, ApproxNotFoundError - */ - __pyx_t_7 = __Pyx_ImportDottedModule(__pyx_n_s_math, NULL); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 28, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_math, __pyx_t_7) < 0) __PYX_ERR(0, 28, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - - /* "fontTools/cu2qu/cu2qu.py":30 - * import math - * - * from .errors import Error as Cu2QuError, ApproxNotFoundError # <<<<<<<<<<<<<< - * - * - */ - __pyx_t_7 = PyList_New(2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_INCREF(__pyx_n_s_Error); - __Pyx_GIVEREF(__pyx_n_s_Error); - PyList_SET_ITEM(__pyx_t_7, 0, __pyx_n_s_Error); - __Pyx_INCREF(__pyx_n_s_ApproxNotFoundError); - __Pyx_GIVEREF(__pyx_n_s_ApproxNotFoundError); - PyList_SET_ITEM(__pyx_t_7, 1, __pyx_n_s_ApproxNotFoundError); - __pyx_t_6 = __Pyx_Import(__pyx_n_s_errors, __pyx_t_7, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __pyx_t_7 = __Pyx_ImportFrom(__pyx_t_6, __pyx_n_s_Error); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_Cu2QuError, __pyx_t_7) < 0) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __pyx_t_7 = __Pyx_ImportFrom(__pyx_t_6, __pyx_n_s_ApproxNotFoundError); 
if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_ApproxNotFoundError, __pyx_t_7) < 0) __PYX_ERR(0, 30, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":33 - * - * - * __all__ = ["curve_to_quadratic", "curves_to_quadratic"] # <<<<<<<<<<<<<< - * - * MAX_N = 100 - */ - __pyx_t_6 = PyList_New(2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 33, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_INCREF(__pyx_n_u_curve_to_quadratic); - __Pyx_GIVEREF(__pyx_n_u_curve_to_quadratic); - PyList_SET_ITEM(__pyx_t_6, 0, __pyx_n_u_curve_to_quadratic); - __Pyx_INCREF(__pyx_n_u_curves_to_quadratic); - __Pyx_GIVEREF(__pyx_n_u_curves_to_quadratic); - PyList_SET_ITEM(__pyx_t_6, 1, __pyx_n_u_curves_to_quadratic); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_all, __pyx_t_6) < 0) __PYX_ERR(0, 33, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":35 - * __all__ = ["curve_to_quadratic", "curves_to_quadratic"] - * - * MAX_N = 100 # <<<<<<<<<<<<<< - * - * NAN = float("NaN") - */ - if (PyDict_SetItem(__pyx_d, __pyx_n_s_MAX_N, __pyx_int_100) < 0) __PYX_ERR(0, 35, __pyx_L1_error) - - /* "fontTools/cu2qu/cu2qu.py":37 - * MAX_N = 100 - * - * NAN = float("NaN") # <<<<<<<<<<<<<< - * - * - */ - __pyx_t_10 = __Pyx_PyUnicode_AsDouble(__pyx_n_u_NaN); if (unlikely(__pyx_t_10 == ((double)((double)-1)) && PyErr_Occurred())) __PYX_ERR(0, 37, __pyx_L1_error) - __pyx_t_6 = PyFloat_FromDouble(__pyx_t_10); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 37, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_NAN, __pyx_t_6) < 0) __PYX_ERR(0, 37, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":127 - * - * - * @cython.locals( # <<<<<<<<<<<<<< - * p0=cython.complex, - * p1=cython.complex, - */ - __pyx_t_6 = 
__Pyx_CyFunction_New(&__pyx_mdef_9fontTools_5cu2qu_5cu2qu_1_split_cubic_into_n_gen, 0, __pyx_n_s_split_cubic_into_n_gen, NULL, __pyx_n_s_fontTools_cu2qu_cu2qu, __pyx_d, ((PyObject *)__pyx_codeobj_)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 127, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - if (PyDict_SetItem(__pyx_d, __pyx_n_s_split_cubic_into_n_gen, __pyx_t_6) < 0) __PYX_ERR(0, 127, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":442 - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - * def curve_to_quadratic(curve, max_err, all_quadratic=True): # <<<<<<<<<<<<<< - * """Approximate a cubic Bezier curve with a spline of n quadratics. - * - */ - __pyx_t_6 = __Pyx_PyBool_FromLong(((int)1)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 442, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - - /* "fontTools/cu2qu/cu2qu.py":439 - * - * - * @cython.locals(max_err=cython.double) # <<<<<<<<<<<<<< - * @cython.locals(n=cython.int) - * @cython.locals(all_quadratic=cython.int) - */ - __pyx_t_7 = PyTuple_New(1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 439, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_6); - __pyx_t_6 = 0; - __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_5cu2qu_5cu2qu_4curve_to_quadratic, 0, __pyx_n_s_curve_to_quadratic, NULL, __pyx_n_s_fontTools_cu2qu_cu2qu, __pyx_d, ((PyObject *)__pyx_codeobj__6)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 439, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_CyFunction_SetDefaultsTuple(__pyx_t_6, __pyx_t_7); - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - if (PyDict_SetItem(__pyx_d, __pyx_n_s_curve_to_quadratic, __pyx_t_6) < 0) __PYX_ERR(0, 439, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":476 - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, 
all_quadratic=True): # <<<<<<<<<<<<<< - * """Return quadratic Bezier splines approximating the input cubic Beziers. - * - */ - __pyx_t_6 = __Pyx_PyBool_FromLong(((int)1)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 476, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - - /* "fontTools/cu2qu/cu2qu.py":474 - * - * - * @cython.locals(l=cython.int, last_i=cython.int, i=cython.int) # <<<<<<<<<<<<<< - * @cython.locals(all_quadratic=cython.int) - * def curves_to_quadratic(curves, max_errors, all_quadratic=True): - */ - __pyx_t_7 = PyTuple_New(1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 474, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_7); - __Pyx_GIVEREF(__pyx_t_6); - PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_6); - __pyx_t_6 = 0; - __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_5cu2qu_5cu2qu_6curves_to_quadratic, 0, __pyx_n_s_curves_to_quadratic, NULL, __pyx_n_s_fontTools_cu2qu_cu2qu, __pyx_d, ((PyObject *)__pyx_codeobj__8)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 474, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - __Pyx_CyFunction_SetDefaultsTuple(__pyx_t_6, __pyx_t_7); - __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; - if (PyDict_SetItem(__pyx_d, __pyx_n_s_curves_to_quadratic, __pyx_t_6) < 0) __PYX_ERR(0, 474, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /* "fontTools/cu2qu/cu2qu.py":1 - * # cython: language_level=3 # <<<<<<<<<<<<<< - * # distutils: define_macros=CYTHON_TRACE_NOGIL=1 - * - */ - __pyx_t_6 = __Pyx_PyDict_NewPresized(1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 1, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_6); - if (PyDict_SetItem(__pyx_t_6, __pyx_kp_u_curves_to_quadratic_line_474, __pyx_kp_u_Return_quadratic_Bezier_splines) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_6) < 0) __PYX_ERR(0, 1, __pyx_L1_error) - __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; - - /*--- Wrapped vars code ---*/ - - goto __pyx_L0; - __pyx_L1_error:; - __Pyx_XDECREF(__pyx_t_5); - __Pyx_XDECREF(__pyx_t_6); - __Pyx_XDECREF(__pyx_t_7); - 
__Pyx_XDECREF(__pyx_t_8); - __Pyx_XDECREF(__pyx_t_9); - if (__pyx_m) { - if (__pyx_d && stringtab_initialized) { - __Pyx_AddTraceback("init fontTools.cu2qu.cu2qu", __pyx_clineno, __pyx_lineno, __pyx_filename); - } - #if !CYTHON_USE_MODULE_STATE - Py_CLEAR(__pyx_m); - #else - Py_DECREF(__pyx_m); - if (pystate_addmodule_run) { - PyObject *tp, *value, *tb; - PyErr_Fetch(&tp, &value, &tb); - PyState_RemoveModule(&__pyx_moduledef); - PyErr_Restore(tp, value, tb); - } - #endif - } else if (!PyErr_Occurred()) { - PyErr_SetString(PyExc_ImportError, "init fontTools.cu2qu.cu2qu"); - } - __pyx_L0:; - __Pyx_RefNannyFinishContext(); - #if CYTHON_PEP489_MULTI_PHASE_INIT - return (__pyx_m != NULL) ? 0 : -1; - #elif PY_MAJOR_VERSION >= 3 - return __pyx_m; - #else - return; - #endif -} -/* #### Code section: cleanup_globals ### */ -/* #### Code section: cleanup_module ### */ -/* #### Code section: main_method ### */ -/* #### Code section: utility_code_pragmas ### */ -#ifdef _MSC_VER -#pragma warning( push ) -/* Warning 4127: conditional expression is constant - * Cython uses constant conditional expressions to allow in inline functions to be optimized at - * compile-time, so this warning is not useful - */ -#pragma warning( disable : 4127 ) -#endif - - - -/* #### Code section: utility_code_def ### */ - -/* --- Runtime support code --- */ -/* Refnanny */ -#if CYTHON_REFNANNY -static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { - PyObject *m = NULL, *p = NULL; - void *r = NULL; - m = PyImport_ImportModule(modname); - if (!m) goto end; - p = PyObject_GetAttrString(m, "RefNannyAPI"); - if (!p) goto end; - r = PyLong_AsVoidPtr(p); -end: - Py_XDECREF(p); - Py_XDECREF(m); - return (__Pyx_RefNannyAPIStruct *)r; -} -#endif - -/* PyErrExceptionMatches */ -#if CYTHON_FAST_THREAD_STATE -static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { - Py_ssize_t i, n; - n = PyTuple_GET_SIZE(tuple); -#if PY_MAJOR_VERSION >= 3 - for (i=0; i<n; i++) { - if (exc_type == PyTuple_GET_ITEM(tuple, i)) return 1; - } -#endif - for (i=0; i<n; i++) { - if (__Pyx_PyErr_GivenExceptionMatches(exc_type, PyTuple_GET_ITEM(tuple, i))) return 1; - } - return 0; -} -static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err) { - int result; - PyObject *exc_type; -#if PY_VERSION_HEX >= 
0x030C00A6 - PyObject *current_exception = tstate->current_exception; - if (unlikely(!current_exception)) return 0; - exc_type = (PyObject*) Py_TYPE(current_exception); - if (exc_type == err) return 1; -#else - exc_type = tstate->curexc_type; - if (exc_type == err) return 1; - if (unlikely(!exc_type)) return 0; -#endif - #if CYTHON_AVOID_BORROWED_REFS - Py_INCREF(exc_type); - #endif - if (unlikely(PyTuple_Check(err))) { - result = __Pyx_PyErr_ExceptionMatchesTuple(exc_type, err); - } else { - result = __Pyx_PyErr_GivenExceptionMatches(exc_type, err); - } - #if CYTHON_AVOID_BORROWED_REFS - Py_DECREF(exc_type); - #endif - return result; -} -#endif - -/* PyErrFetchRestore */ -#if CYTHON_FAST_THREAD_STATE -static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { -#if PY_VERSION_HEX >= 0x030C00A6 - PyObject *tmp_value; - assert(type == NULL || (value != NULL && type == (PyObject*) Py_TYPE(value))); - if (value) { - #if CYTHON_COMPILING_IN_CPYTHON - if (unlikely(((PyBaseExceptionObject*) value)->traceback != tb)) - #endif - PyException_SetTraceback(value, tb); - } - tmp_value = tstate->current_exception; - tstate->current_exception = value; - Py_XDECREF(tmp_value); -#else - PyObject *tmp_type, *tmp_value, *tmp_tb; - tmp_type = tstate->curexc_type; - tmp_value = tstate->curexc_value; - tmp_tb = tstate->curexc_traceback; - tstate->curexc_type = type; - tstate->curexc_value = value; - tstate->curexc_traceback = tb; - Py_XDECREF(tmp_type); - Py_XDECREF(tmp_value); - Py_XDECREF(tmp_tb); -#endif -} -static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { -#if PY_VERSION_HEX >= 0x030C00A6 - PyObject* exc_value; - exc_value = tstate->current_exception; - tstate->current_exception = 0; - *value = exc_value; - *type = NULL; - *tb = NULL; - if (exc_value) { - *type = (PyObject*) Py_TYPE(exc_value); - Py_INCREF(*type); - #if CYTHON_COMPILING_IN_CPYTHON 
- *tb = ((PyBaseExceptionObject*) exc_value)->traceback; - Py_XINCREF(*tb); - #else - *tb = PyException_GetTraceback(exc_value); - #endif - } -#else - *type = tstate->curexc_type; - *value = tstate->curexc_value; - *tb = tstate->curexc_traceback; - tstate->curexc_type = 0; - tstate->curexc_value = 0; - tstate->curexc_traceback = 0; -#endif -} -#endif - -/* PyObjectGetAttrStr */ -#if CYTHON_USE_TYPE_SLOTS -static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { - PyTypeObject* tp = Py_TYPE(obj); - if (likely(tp->tp_getattro)) - return tp->tp_getattro(obj, attr_name); -#if PY_MAJOR_VERSION < 3 - if (likely(tp->tp_getattr)) - return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); -#endif - return PyObject_GetAttr(obj, attr_name); -} -#endif - -/* PyObjectGetAttrStrNoError */ -static void __Pyx_PyObject_GetAttrStr_ClearAttributeError(void) { - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - if (likely(__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError))) - __Pyx_PyErr_Clear(); -} -static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name) { - PyObject *result; -#if CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_TYPE_SLOTS && PY_VERSION_HEX >= 0x030700B1 - PyTypeObject* tp = Py_TYPE(obj); - if (likely(tp->tp_getattro == PyObject_GenericGetAttr)) { - return _PyObject_GenericGetAttrWithDict(obj, attr_name, NULL, 1); - } -#endif - result = __Pyx_PyObject_GetAttrStr(obj, attr_name); - if (unlikely(!result)) { - __Pyx_PyObject_GetAttrStr_ClearAttributeError(); - } - return result; -} - -/* GetBuiltinName */ -static PyObject *__Pyx_GetBuiltinName(PyObject *name) { - PyObject* result = __Pyx_PyObject_GetAttrStrNoError(__pyx_b, name); - if (unlikely(!result) && !PyErr_Occurred()) { - PyErr_Format(PyExc_NameError, -#if PY_MAJOR_VERSION >= 3 - "name '%U' is not defined", name); -#else - "name '%.200s' is not defined", PyString_AS_STRING(name)); -#endif - } - return result; -} - -/* 
PyIntCompare */ -static CYTHON_INLINE int __Pyx_PyInt_BoolEqObjC(PyObject *op1, PyObject *op2, long intval, long inplace) { - CYTHON_MAYBE_UNUSED_VAR(intval); - CYTHON_UNUSED_VAR(inplace); - if (op1 == op2) { - return 1; - } - #if PY_MAJOR_VERSION < 3 - if (likely(PyInt_CheckExact(op1))) { - const long b = intval; - long a = PyInt_AS_LONG(op1); - return (a == b); - } - #endif - #if CYTHON_USE_PYLONG_INTERNALS - if (likely(PyLong_CheckExact(op1))) { - int unequal; - unsigned long uintval; - Py_ssize_t size = __Pyx_PyLong_DigitCount(op1); - const digit* digits = __Pyx_PyLong_Digits(op1); - if (intval == 0) { - return (__Pyx_PyLong_IsZero(op1) == 1); - } else if (intval < 0) { - if (__Pyx_PyLong_IsNonNeg(op1)) - return 0; - intval = -intval; - } else { - if (__Pyx_PyLong_IsNeg(op1)) - return 0; - } - uintval = (unsigned long) intval; -#if PyLong_SHIFT * 4 < SIZEOF_LONG*8 - if (uintval >> (PyLong_SHIFT * 4)) { - unequal = (size != 5) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) - | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[4] != ((uintval >> (4 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); - } else -#endif -#if PyLong_SHIFT * 3 < SIZEOF_LONG*8 - if (uintval >> (PyLong_SHIFT * 3)) { - unequal = (size != 4) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) - | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); - } else -#endif -#if PyLong_SHIFT * 2 < SIZEOF_LONG*8 - if (uintval >> (PyLong_SHIFT * 2)) { - unequal = (size != 3) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) - | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) 
PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); - } else -#endif -#if PyLong_SHIFT * 1 < SIZEOF_LONG*8 - if (uintval >> (PyLong_SHIFT * 1)) { - unequal = (size != 2) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) - | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); - } else -#endif - unequal = (size != 1) || (((unsigned long) digits[0]) != (uintval & (unsigned long) PyLong_MASK)); - return (unequal == 0); - } - #endif - if (PyFloat_CheckExact(op1)) { - const long b = intval; -#if CYTHON_COMPILING_IN_LIMITED_API - double a = __pyx_PyFloat_AsDouble(op1); -#else - double a = PyFloat_AS_DOUBLE(op1); -#endif - return ((double)a == (double)b); - } - return __Pyx_PyObject_IsTrueAndDecref( - PyObject_RichCompare(op1, op2, Py_EQ)); -} - -/* RaiseTooManyValuesToUnpack */ -static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected) { - PyErr_Format(PyExc_ValueError, - "too many values to unpack (expected %" CYTHON_FORMAT_SSIZE_T "d)", expected); -} - -/* RaiseNeedMoreValuesToUnpack */ -static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index) { - PyErr_Format(PyExc_ValueError, - "need more than %" CYTHON_FORMAT_SSIZE_T "d value%.1s to unpack", - index, (index == 1) ? 
"" : "s"); -} - -/* IterFinish */ -static CYTHON_INLINE int __Pyx_IterFinish(void) { - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - PyObject* exc_type = __Pyx_PyErr_CurrentExceptionType(); - if (unlikely(exc_type)) { - if (unlikely(!__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) - return -1; - __Pyx_PyErr_Clear(); - return 0; - } - return 0; -} - -/* UnpackItemEndCheck */ -static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected) { - if (unlikely(retval)) { - Py_DECREF(retval); - __Pyx_RaiseTooManyValuesError(expected); - return -1; - } - return __Pyx_IterFinish(); -} - -/* GetItemInt */ -static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j) { - PyObject *r; - if (unlikely(!j)) return NULL; - r = PyObject_GetItem(o, j); - Py_DECREF(j); - return r; -} -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, - CYTHON_NCP_UNUSED int wraparound, - CYTHON_NCP_UNUSED int boundscheck) { -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - Py_ssize_t wrapped_i = i; - if (wraparound & unlikely(i < 0)) { - wrapped_i += PyList_GET_SIZE(o); - } - if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyList_GET_SIZE(o)))) { - PyObject *r = PyList_GET_ITEM(o, wrapped_i); - Py_INCREF(r); - return r; - } - return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); -#else - return PySequence_GetItem(o, i); -#endif -} -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, - CYTHON_NCP_UNUSED int wraparound, - CYTHON_NCP_UNUSED int boundscheck) { -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - Py_ssize_t wrapped_i = i; - if (wraparound & unlikely(i < 0)) { - wrapped_i += PyTuple_GET_SIZE(o); - } - if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyTuple_GET_SIZE(o)))) { - PyObject *r = PyTuple_GET_ITEM(o, wrapped_i); - Py_INCREF(r); - return r; - } - return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); 
-#else - return PySequence_GetItem(o, i); -#endif -} -static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, - CYTHON_NCP_UNUSED int wraparound, - CYTHON_NCP_UNUSED int boundscheck) { -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS - if (is_list || PyList_CheckExact(o)) { - Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyList_GET_SIZE(o); - if ((!boundscheck) || (likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o))))) { - PyObject *r = PyList_GET_ITEM(o, n); - Py_INCREF(r); - return r; - } - } - else if (PyTuple_CheckExact(o)) { - Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyTuple_GET_SIZE(o); - if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyTuple_GET_SIZE(o)))) { - PyObject *r = PyTuple_GET_ITEM(o, n); - Py_INCREF(r); - return r; - } - } else { - PyMappingMethods *mm = Py_TYPE(o)->tp_as_mapping; - PySequenceMethods *sm = Py_TYPE(o)->tp_as_sequence; - if (mm && mm->mp_subscript) { - PyObject *r, *key = PyInt_FromSsize_t(i); - if (unlikely(!key)) return NULL; - r = mm->mp_subscript(o, key); - Py_DECREF(key); - return r; - } - if (likely(sm && sm->sq_item)) { - if (wraparound && unlikely(i < 0) && likely(sm->sq_length)) { - Py_ssize_t l = sm->sq_length(o); - if (likely(l >= 0)) { - i += l; - } else { - if (!PyErr_ExceptionMatches(PyExc_OverflowError)) - return NULL; - PyErr_Clear(); - } - } - return sm->sq_item(o, i); - } - } -#else - if (is_list || PySequence_Check(o)) { - return PySequence_GetItem(o, i); - } -#endif - return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); -} - -/* PyDictVersioning */ -#if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS -static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { - PyObject *dict = Py_TYPE(obj)->tp_dict; - return likely(dict) ? 
__PYX_GET_DICT_VERSION(dict) : 0; -} -static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { - PyObject **dictptr = NULL; - Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; - if (offset) { -#if CYTHON_COMPILING_IN_CPYTHON - dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); -#else - dictptr = _PyObject_GetDictPtr(obj); -#endif - } - return (dictptr && *dictptr) ? __PYX_GET_DICT_VERSION(*dictptr) : 0; -} -static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { - PyObject *dict = Py_TYPE(obj)->tp_dict; - if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) - return 0; - return obj_dict_version == __Pyx_get_object_dict_version(obj); -} -#endif - -/* GetModuleGlobalName */ -#if CYTHON_USE_DICT_VERSIONS -static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) -#else -static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) -#endif -{ - PyObject *result; -#if !CYTHON_AVOID_BORROWED_REFS -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 - result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); - __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) - if (likely(result)) { - return __Pyx_NewRef(result); - } else if (unlikely(PyErr_Occurred())) { - return NULL; - } -#elif CYTHON_COMPILING_IN_LIMITED_API - if (unlikely(!__pyx_m)) { - return NULL; - } - result = PyObject_GetAttr(__pyx_m, name); - if (likely(result)) { - return result; - } -#else - result = PyDict_GetItem(__pyx_d, name); - __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) - if (likely(result)) { - return __Pyx_NewRef(result); - } -#endif -#else - result = PyObject_GetItem(__pyx_d, name); - __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) - 
if (likely(result)) { - return __Pyx_NewRef(result); - } - PyErr_Clear(); -#endif - return __Pyx_GetBuiltinName(name); -} - -/* PyFunctionFastCall */ -#if CYTHON_FAST_PYCALL && !CYTHON_VECTORCALL -static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, - PyObject *globals) { - PyFrameObject *f; - PyThreadState *tstate = __Pyx_PyThreadState_Current; - PyObject **fastlocals; - Py_ssize_t i; - PyObject *result; - assert(globals != NULL); - /* XXX Perhaps we should create a specialized - PyFrame_New() that doesn't take locals, but does - take builtins without sanity checking them. - */ - assert(tstate != NULL); - f = PyFrame_New(tstate, co, globals, NULL); - if (f == NULL) { - return NULL; - } - fastlocals = __Pyx_PyFrame_GetLocalsplus(f); - for (i = 0; i < na; i++) { - Py_INCREF(*args); - fastlocals[i] = *args++; - } - result = PyEval_EvalFrameEx(f,0); - ++tstate->recursion_depth; - Py_DECREF(f); - --tstate->recursion_depth; - return result; -} -static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { - PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); - PyObject *globals = PyFunction_GET_GLOBALS(func); - PyObject *argdefs = PyFunction_GET_DEFAULTS(func); - PyObject *closure; -#if PY_MAJOR_VERSION >= 3 - PyObject *kwdefs; -#endif - PyObject *kwtuple, **k; - PyObject **d; - Py_ssize_t nd; - Py_ssize_t nk; - PyObject *result; - assert(kwargs == NULL || PyDict_Check(kwargs)); - nk = kwargs ? 
PyDict_Size(kwargs) : 0; - if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) { - return NULL; - } - if ( -#if PY_MAJOR_VERSION >= 3 - co->co_kwonlyargcount == 0 && -#endif - likely(kwargs == NULL || nk == 0) && - co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { - if (argdefs == NULL && co->co_argcount == nargs) { - result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); - goto done; - } - else if (nargs == 0 && argdefs != NULL - && co->co_argcount == Py_SIZE(argdefs)) { - /* function called with no arguments, but all parameters have - a default value: use default values as arguments .*/ - args = &PyTuple_GET_ITEM(argdefs, 0); - result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); - goto done; - } - } - if (kwargs != NULL) { - Py_ssize_t pos, i; - kwtuple = PyTuple_New(2 * nk); - if (kwtuple == NULL) { - result = NULL; - goto done; - } - k = &PyTuple_GET_ITEM(kwtuple, 0); - pos = i = 0; - while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { - Py_INCREF(k[i]); - Py_INCREF(k[i+1]); - i += 2; - } - nk = i / 2; - } - else { - kwtuple = NULL; - k = NULL; - } - closure = PyFunction_GET_CLOSURE(func); -#if PY_MAJOR_VERSION >= 3 - kwdefs = PyFunction_GET_KW_DEFAULTS(func); -#endif - if (argdefs != NULL) { - d = &PyTuple_GET_ITEM(argdefs, 0); - nd = Py_SIZE(argdefs); - } - else { - d = NULL; - nd = 0; - } -#if PY_MAJOR_VERSION >= 3 - result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, - args, (int)nargs, - k, (int)nk, - d, (int)nd, kwdefs, closure); -#else - result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, - args, (int)nargs, - k, (int)nk, - d, (int)nd, closure); -#endif - Py_XDECREF(kwtuple); -done: - Py_LeaveRecursiveCall(); - return result; -} -#endif - -/* PyObjectCall */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { - PyObject *result; - ternaryfunc call = Py_TYPE(func)->tp_call; - if 
(unlikely(!call)) - return PyObject_Call(func, arg, kw); - if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) - return NULL; - result = (*call)(func, arg, kw); - Py_LeaveRecursiveCall(); - if (unlikely(!result) && unlikely(!PyErr_Occurred())) { - PyErr_SetString( - PyExc_SystemError, - "NULL result without error in PyObject_Call"); - } - return result; -} -#endif - -/* PyObjectCallMethO */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { - PyObject *self, *result; - PyCFunction cfunc; - cfunc = PyCFunction_GET_FUNCTION(func); - self = PyCFunction_GET_SELF(func); - if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) - return NULL; - result = cfunc(self, arg); - Py_LeaveRecursiveCall(); - if (unlikely(!result) && unlikely(!PyErr_Occurred())) { - PyErr_SetString( - PyExc_SystemError, - "NULL result without error in PyObject_Call"); - } - return result; -} -#endif - -/* PyObjectFastCall */ -static PyObject* __Pyx_PyObject_FastCall_fallback(PyObject *func, PyObject **args, size_t nargs, PyObject *kwargs) { - PyObject *argstuple; - PyObject *result; - size_t i; - argstuple = PyTuple_New((Py_ssize_t)nargs); - if (unlikely(!argstuple)) return NULL; - for (i = 0; i < nargs; i++) { - Py_INCREF(args[i]); - PyTuple_SET_ITEM(argstuple, (Py_ssize_t)i, args[i]); - } - result = __Pyx_PyObject_Call(func, argstuple, kwargs); - Py_DECREF(argstuple); - return result; -} -static CYTHON_INLINE PyObject* __Pyx_PyObject_FastCallDict(PyObject *func, PyObject **args, size_t _nargs, PyObject *kwargs) { - Py_ssize_t nargs = __Pyx_PyVectorcall_NARGS(_nargs); -#if CYTHON_COMPILING_IN_CPYTHON - if (nargs == 0 && kwargs == NULL) { -#if defined(__Pyx_CyFunction_USED) && defined(NDEBUG) - if (__Pyx_IsCyOrPyCFunction(func)) -#else - if (PyCFunction_Check(func)) -#endif - { - if (likely(PyCFunction_GET_FLAGS(func) & METH_NOARGS)) { - return __Pyx_PyObject_CallMethO(func, 
NULL); - } - } - } - else if (nargs == 1 && kwargs == NULL) { - if (PyCFunction_Check(func)) - { - if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { - return __Pyx_PyObject_CallMethO(func, args[0]); - } - } - } -#endif - #if PY_VERSION_HEX < 0x030800B1 - #if CYTHON_FAST_PYCCALL - if (PyCFunction_Check(func)) { - if (kwargs) { - return _PyCFunction_FastCallDict(func, args, nargs, kwargs); - } else { - return _PyCFunction_FastCallKeywords(func, args, nargs, NULL); - } - } - #if PY_VERSION_HEX >= 0x030700A1 - if (!kwargs && __Pyx_IS_TYPE(func, &PyMethodDescr_Type)) { - return _PyMethodDescr_FastCallKeywords(func, args, nargs, NULL); - } - #endif - #endif - #if CYTHON_FAST_PYCALL - if (PyFunction_Check(func)) { - return __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs); - } - #endif - #endif - #if CYTHON_VECTORCALL - vectorcallfunc f = _PyVectorcall_Function(func); - if (f) { - return f(func, args, (size_t)nargs, kwargs); - } - #elif defined(__Pyx_CyFunction_USED) && CYTHON_BACKPORT_VECTORCALL - if (__Pyx_CyFunction_CheckExact(func)) { - __pyx_vectorcallfunc f = __Pyx_CyFunction_func_vectorcall(func); - if (f) return f(func, args, (size_t)nargs, kwargs); - } - #endif - if (nargs == 0) { - return __Pyx_PyObject_Call(func, __pyx_empty_tuple, kwargs); - } - return __Pyx_PyObject_FastCall_fallback(func, args, (size_t)nargs, kwargs); -} - -/* TupleAndListFromArray */ -#if CYTHON_COMPILING_IN_CPYTHON -static CYTHON_INLINE void __Pyx_copy_object_array(PyObject *const *CYTHON_RESTRICT src, PyObject** CYTHON_RESTRICT dest, Py_ssize_t length) { - PyObject *v; - Py_ssize_t i; - for (i = 0; i < length; i++) { - v = dest[i] = src[i]; - Py_INCREF(v); - } -} -static CYTHON_INLINE PyObject * -__Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n) -{ - PyObject *res; - if (n <= 0) { - Py_INCREF(__pyx_empty_tuple); - return __pyx_empty_tuple; - } - res = PyTuple_New(n); - if (unlikely(res == NULL)) return NULL; - __Pyx_copy_object_array(src, ((PyTupleObject*)res)->ob_item, 
n); - return res; -} -static CYTHON_INLINE PyObject * -__Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n) -{ - PyObject *res; - if (n <= 0) { - return PyList_New(0); - } - res = PyList_New(n); - if (unlikely(res == NULL)) return NULL; - __Pyx_copy_object_array(src, ((PyListObject*)res)->ob_item, n); - return res; -} -#endif - -/* BytesEquals */ -static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals) { -#if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API - return PyObject_RichCompareBool(s1, s2, equals); -#else - if (s1 == s2) { - return (equals == Py_EQ); - } else if (PyBytes_CheckExact(s1) & PyBytes_CheckExact(s2)) { - const char *ps1, *ps2; - Py_ssize_t length = PyBytes_GET_SIZE(s1); - if (length != PyBytes_GET_SIZE(s2)) - return (equals == Py_NE); - ps1 = PyBytes_AS_STRING(s1); - ps2 = PyBytes_AS_STRING(s2); - if (ps1[0] != ps2[0]) { - return (equals == Py_NE); - } else if (length == 1) { - return (equals == Py_EQ); - } else { - int result; -#if CYTHON_USE_UNICODE_INTERNALS && (PY_VERSION_HEX < 0x030B0000) - Py_hash_t hash1, hash2; - hash1 = ((PyBytesObject*)s1)->ob_shash; - hash2 = ((PyBytesObject*)s2)->ob_shash; - if (hash1 != hash2 && hash1 != -1 && hash2 != -1) { - return (equals == Py_NE); - } -#endif - result = memcmp(ps1, ps2, (size_t)length); - return (equals == Py_EQ) ? 
(result == 0) : (result != 0); - } - } else if ((s1 == Py_None) & PyBytes_CheckExact(s2)) { - return (equals == Py_NE); - } else if ((s2 == Py_None) & PyBytes_CheckExact(s1)) { - return (equals == Py_NE); - } else { - int result; - PyObject* py_result = PyObject_RichCompare(s1, s2, equals); - if (!py_result) - return -1; - result = __Pyx_PyObject_IsTrue(py_result); - Py_DECREF(py_result); - return result; - } -#endif -} - -/* UnicodeEquals */ -static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals) { -#if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API - return PyObject_RichCompareBool(s1, s2, equals); -#else -#if PY_MAJOR_VERSION < 3 - PyObject* owned_ref = NULL; -#endif - int s1_is_unicode, s2_is_unicode; - if (s1 == s2) { - goto return_eq; - } - s1_is_unicode = PyUnicode_CheckExact(s1); - s2_is_unicode = PyUnicode_CheckExact(s2); -#if PY_MAJOR_VERSION < 3 - if ((s1_is_unicode & (!s2_is_unicode)) && PyString_CheckExact(s2)) { - owned_ref = PyUnicode_FromObject(s2); - if (unlikely(!owned_ref)) - return -1; - s2 = owned_ref; - s2_is_unicode = 1; - } else if ((s2_is_unicode & (!s1_is_unicode)) && PyString_CheckExact(s1)) { - owned_ref = PyUnicode_FromObject(s1); - if (unlikely(!owned_ref)) - return -1; - s1 = owned_ref; - s1_is_unicode = 1; - } else if (((!s2_is_unicode) & (!s1_is_unicode))) { - return __Pyx_PyBytes_Equals(s1, s2, equals); - } -#endif - if (s1_is_unicode & s2_is_unicode) { - Py_ssize_t length; - int kind; - void *data1, *data2; - if (unlikely(__Pyx_PyUnicode_READY(s1) < 0) || unlikely(__Pyx_PyUnicode_READY(s2) < 0)) - return -1; - length = __Pyx_PyUnicode_GET_LENGTH(s1); - if (length != __Pyx_PyUnicode_GET_LENGTH(s2)) { - goto return_ne; - } -#if CYTHON_USE_UNICODE_INTERNALS - { - Py_hash_t hash1, hash2; - #if CYTHON_PEP393_ENABLED - hash1 = ((PyASCIIObject*)s1)->hash; - hash2 = ((PyASCIIObject*)s2)->hash; - #else - hash1 = ((PyUnicodeObject*)s1)->hash; - hash2 = ((PyUnicodeObject*)s2)->hash; - #endif - if 
(hash1 != hash2 && hash1 != -1 && hash2 != -1) { - goto return_ne; - } - } -#endif - kind = __Pyx_PyUnicode_KIND(s1); - if (kind != __Pyx_PyUnicode_KIND(s2)) { - goto return_ne; - } - data1 = __Pyx_PyUnicode_DATA(s1); - data2 = __Pyx_PyUnicode_DATA(s2); - if (__Pyx_PyUnicode_READ(kind, data1, 0) != __Pyx_PyUnicode_READ(kind, data2, 0)) { - goto return_ne; - } else if (length == 1) { - goto return_eq; - } else { - int result = memcmp(data1, data2, (size_t)(length * kind)); - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(owned_ref); - #endif - return (equals == Py_EQ) ? (result == 0) : (result != 0); - } - } else if ((s1 == Py_None) & s2_is_unicode) { - goto return_ne; - } else if ((s2 == Py_None) & s1_is_unicode) { - goto return_ne; - } else { - int result; - PyObject* py_result = PyObject_RichCompare(s1, s2, equals); - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(owned_ref); - #endif - if (!py_result) - return -1; - result = __Pyx_PyObject_IsTrue(py_result); - Py_DECREF(py_result); - return result; - } -return_eq: - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(owned_ref); - #endif - return (equals == Py_EQ); -return_ne: - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(owned_ref); - #endif - return (equals == Py_NE); -#endif -} - -/* fastcall */ -#if CYTHON_METH_FASTCALL -static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s) -{ - Py_ssize_t i, n = PyTuple_GET_SIZE(kwnames); - for (i = 0; i < n; i++) - { - if (s == PyTuple_GET_ITEM(kwnames, i)) return kwvalues[i]; - } - for (i = 0; i < n; i++) - { - int eq = __Pyx_PyUnicode_Equals(s, PyTuple_GET_ITEM(kwnames, i), Py_EQ); - if (unlikely(eq != 0)) { - if (unlikely(eq < 0)) return NULL; // error - return kwvalues[i]; - } - } - return NULL; // not found (no exception set) -} -#endif - -/* RaiseArgTupleInvalid */ -static void __Pyx_RaiseArgtupleInvalid( - const char* func_name, - int exact, - Py_ssize_t num_min, - Py_ssize_t num_max, - Py_ssize_t num_found) -{ - Py_ssize_t num_expected; - 
const char *more_or_less; - if (num_found < num_min) { - num_expected = num_min; - more_or_less = "at least"; - } else { - num_expected = num_max; - more_or_less = "at most"; - } - if (exact) { - more_or_less = "exactly"; - } - PyErr_Format(PyExc_TypeError, - "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", - func_name, more_or_less, num_expected, - (num_expected == 1) ? "" : "s", num_found); -} - -/* RaiseDoubleKeywords */ -static void __Pyx_RaiseDoubleKeywordsError( - const char* func_name, - PyObject* kw_name) -{ - PyErr_Format(PyExc_TypeError, - #if PY_MAJOR_VERSION >= 3 - "%s() got multiple values for keyword argument '%U'", func_name, kw_name); - #else - "%s() got multiple values for keyword argument '%s'", func_name, - PyString_AsString(kw_name)); - #endif -} - -/* ParseKeywords */ -static int __Pyx_ParseOptionalKeywords( - PyObject *kwds, - PyObject *const *kwvalues, - PyObject **argnames[], - PyObject *kwds2, - PyObject *values[], - Py_ssize_t num_pos_args, - const char* function_name) -{ - PyObject *key = 0, *value = 0; - Py_ssize_t pos = 0; - PyObject*** name; - PyObject*** first_kw_arg = argnames + num_pos_args; - int kwds_is_tuple = CYTHON_METH_FASTCALL && likely(PyTuple_Check(kwds)); - while (1) { - if (kwds_is_tuple) { - if (pos >= PyTuple_GET_SIZE(kwds)) break; - key = PyTuple_GET_ITEM(kwds, pos); - value = kwvalues[pos]; - pos++; - } - else - { - if (!PyDict_Next(kwds, &pos, &key, &value)) break; - } - name = first_kw_arg; - while (*name && (**name != key)) name++; - if (*name) { - values[name-argnames] = value; - continue; - } - name = first_kw_arg; - #if PY_MAJOR_VERSION < 3 - if (likely(PyString_Check(key))) { - while (*name) { - if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) - && _PyString_Eq(**name, key)) { - values[name-argnames] = value; - break; - } - name++; - } - if (*name) continue; - else { - PyObject*** argname = argnames; - while 
(argname != first_kw_arg) { - if ((**argname == key) || ( - (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) - && _PyString_Eq(**argname, key))) { - goto arg_passed_twice; - } - argname++; - } - } - } else - #endif - if (likely(PyUnicode_Check(key))) { - while (*name) { - int cmp = ( - #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 - (__Pyx_PyUnicode_GET_LENGTH(**name) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 : - #endif - PyUnicode_Compare(**name, key) - ); - if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; - if (cmp == 0) { - values[name-argnames] = value; - break; - } - name++; - } - if (*name) continue; - else { - PyObject*** argname = argnames; - while (argname != first_kw_arg) { - int cmp = (**argname == key) ? 0 : - #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 - (__Pyx_PyUnicode_GET_LENGTH(**argname) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 : - #endif - PyUnicode_Compare(**argname, key); - if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; - if (cmp == 0) goto arg_passed_twice; - argname++; - } - } - } else - goto invalid_keyword_type; - if (kwds2) { - if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; - } else { - goto invalid_keyword; - } - } - return 0; -arg_passed_twice: - __Pyx_RaiseDoubleKeywordsError(function_name, key); - goto bad; -invalid_keyword_type: - PyErr_Format(PyExc_TypeError, - "%.200s() keywords must be strings", function_name); - goto bad; -invalid_keyword: - #if PY_MAJOR_VERSION < 3 - PyErr_Format(PyExc_TypeError, - "%.200s() got an unexpected keyword argument '%.200s'", - function_name, PyString_AsString(key)); - #else - PyErr_Format(PyExc_TypeError, - "%s() got an unexpected keyword argument '%U'", - function_name, key); - #endif -bad: - return -1; -} - -/* GetException */ -#if CYTHON_FAST_THREAD_STATE -static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) -#else -static int __Pyx_GetException(PyObject **type, PyObject 
**value, PyObject **tb) -#endif -{ - PyObject *local_type = NULL, *local_value, *local_tb = NULL; -#if CYTHON_FAST_THREAD_STATE - PyObject *tmp_type, *tmp_value, *tmp_tb; - #if PY_VERSION_HEX >= 0x030C00A6 - local_value = tstate->current_exception; - tstate->current_exception = 0; - if (likely(local_value)) { - local_type = (PyObject*) Py_TYPE(local_value); - Py_INCREF(local_type); - local_tb = PyException_GetTraceback(local_value); - } - #else - local_type = tstate->curexc_type; - local_value = tstate->curexc_value; - local_tb = tstate->curexc_traceback; - tstate->curexc_type = 0; - tstate->curexc_value = 0; - tstate->curexc_traceback = 0; - #endif -#else - PyErr_Fetch(&local_type, &local_value, &local_tb); -#endif - PyErr_NormalizeException(&local_type, &local_value, &local_tb); -#if CYTHON_FAST_THREAD_STATE && PY_VERSION_HEX >= 0x030C00A6 - if (unlikely(tstate->current_exception)) -#elif CYTHON_FAST_THREAD_STATE - if (unlikely(tstate->curexc_type)) -#else - if (unlikely(PyErr_Occurred())) -#endif - goto bad; - #if PY_MAJOR_VERSION >= 3 - if (local_tb) { - if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0)) - goto bad; - } - #endif - Py_XINCREF(local_tb); - Py_XINCREF(local_type); - Py_XINCREF(local_value); - *type = local_type; - *value = local_value; - *tb = local_tb; -#if CYTHON_FAST_THREAD_STATE - #if CYTHON_USE_EXC_INFO_STACK - { - _PyErr_StackItem *exc_info = tstate->exc_info; - #if PY_VERSION_HEX >= 0x030B00a4 - tmp_value = exc_info->exc_value; - exc_info->exc_value = local_value; - tmp_type = NULL; - tmp_tb = NULL; - Py_XDECREF(local_type); - Py_XDECREF(local_tb); - #else - tmp_type = exc_info->exc_type; - tmp_value = exc_info->exc_value; - tmp_tb = exc_info->exc_traceback; - exc_info->exc_type = local_type; - exc_info->exc_value = local_value; - exc_info->exc_traceback = local_tb; - #endif - } - #else - tmp_type = tstate->exc_type; - tmp_value = tstate->exc_value; - tmp_tb = tstate->exc_traceback; - tstate->exc_type = local_type; - 
tstate->exc_value = local_value; - tstate->exc_traceback = local_tb; - #endif - Py_XDECREF(tmp_type); - Py_XDECREF(tmp_value); - Py_XDECREF(tmp_tb); -#else - PyErr_SetExcInfo(local_type, local_value, local_tb); -#endif - return 0; -bad: - *type = 0; - *value = 0; - *tb = 0; - Py_XDECREF(local_type); - Py_XDECREF(local_value); - Py_XDECREF(local_tb); - return -1; -} - -/* pep479 */ -static void __Pyx_Generator_Replace_StopIteration(int in_async_gen) { - PyObject *exc, *val, *tb, *cur_exc; - __Pyx_PyThreadState_declare - #ifdef __Pyx_StopAsyncIteration_USED - int is_async_stopiteration = 0; - #endif - CYTHON_MAYBE_UNUSED_VAR(in_async_gen); - cur_exc = PyErr_Occurred(); - if (likely(!__Pyx_PyErr_GivenExceptionMatches(cur_exc, PyExc_StopIteration))) { - #ifdef __Pyx_StopAsyncIteration_USED - if (in_async_gen && unlikely(__Pyx_PyErr_GivenExceptionMatches(cur_exc, __Pyx_PyExc_StopAsyncIteration))) { - is_async_stopiteration = 1; - } else - #endif - return; - } - __Pyx_PyThreadState_assign - __Pyx_GetException(&exc, &val, &tb); - Py_XDECREF(exc); - Py_XDECREF(val); - Py_XDECREF(tb); - PyErr_SetString(PyExc_RuntimeError, - #ifdef __Pyx_StopAsyncIteration_USED - is_async_stopiteration ? "async generator raised StopAsyncIteration" : - in_async_gen ? 
"async generator raised StopIteration" : - #endif - "generator raised StopIteration"); -} - -/* GetTopmostException */ -#if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE -static _PyErr_StackItem * -__Pyx_PyErr_GetTopmostException(PyThreadState *tstate) -{ - _PyErr_StackItem *exc_info = tstate->exc_info; - while ((exc_info->exc_value == NULL || exc_info->exc_value == Py_None) && - exc_info->previous_item != NULL) - { - exc_info = exc_info->previous_item; - } - return exc_info; -} -#endif - -/* SaveResetException */ -#if CYTHON_FAST_THREAD_STATE -static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { - #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4 - _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); - PyObject *exc_value = exc_info->exc_value; - if (exc_value == NULL || exc_value == Py_None) { - *value = NULL; - *type = NULL; - *tb = NULL; - } else { - *value = exc_value; - Py_INCREF(*value); - *type = (PyObject*) Py_TYPE(exc_value); - Py_INCREF(*type); - *tb = PyException_GetTraceback(exc_value); - } - #elif CYTHON_USE_EXC_INFO_STACK - _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); - *type = exc_info->exc_type; - *value = exc_info->exc_value; - *tb = exc_info->exc_traceback; - Py_XINCREF(*type); - Py_XINCREF(*value); - Py_XINCREF(*tb); - #else - *type = tstate->exc_type; - *value = tstate->exc_value; - *tb = tstate->exc_traceback; - Py_XINCREF(*type); - Py_XINCREF(*value); - Py_XINCREF(*tb); - #endif -} -static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { - #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4 - _PyErr_StackItem *exc_info = tstate->exc_info; - PyObject *tmp_value = exc_info->exc_value; - exc_info->exc_value = value; - Py_XDECREF(tmp_value); - Py_XDECREF(type); - Py_XDECREF(tb); - #else - PyObject *tmp_type, *tmp_value, *tmp_tb; - #if 
CYTHON_USE_EXC_INFO_STACK - _PyErr_StackItem *exc_info = tstate->exc_info; - tmp_type = exc_info->exc_type; - tmp_value = exc_info->exc_value; - tmp_tb = exc_info->exc_traceback; - exc_info->exc_type = type; - exc_info->exc_value = value; - exc_info->exc_traceback = tb; - #else - tmp_type = tstate->exc_type; - tmp_value = tstate->exc_value; - tmp_tb = tstate->exc_traceback; - tstate->exc_type = type; - tstate->exc_value = value; - tstate->exc_traceback = tb; - #endif - Py_XDECREF(tmp_type); - Py_XDECREF(tmp_value); - Py_XDECREF(tmp_tb); - #endif -} -#endif - -/* IterNext */ -static PyObject *__Pyx_PyIter_Next2Default(PyObject* defval) { - PyObject* exc_type; - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - exc_type = __Pyx_PyErr_CurrentExceptionType(); - if (unlikely(exc_type)) { - if (!defval || unlikely(!__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) - return NULL; - __Pyx_PyErr_Clear(); - Py_INCREF(defval); - return defval; - } - if (defval) { - Py_INCREF(defval); - return defval; - } - __Pyx_PyErr_SetNone(PyExc_StopIteration); - return NULL; -} -static void __Pyx_PyIter_Next_ErrorNoIterator(PyObject *iterator) { - __Pyx_TypeName iterator_type_name = __Pyx_PyType_GetName(Py_TYPE(iterator)); - PyErr_Format(PyExc_TypeError, - __Pyx_FMT_TYPENAME " object is not an iterator", iterator_type_name); - __Pyx_DECREF_TypeName(iterator_type_name); -} -static CYTHON_INLINE PyObject *__Pyx_PyIter_Next2(PyObject* iterator, PyObject* defval) { - PyObject* next; - iternextfunc iternext = Py_TYPE(iterator)->tp_iternext; - if (likely(iternext)) { -#if CYTHON_USE_TYPE_SLOTS || CYTHON_COMPILING_IN_PYPY - next = iternext(iterator); - if (likely(next)) - return next; -#if CYTHON_COMPILING_IN_CPYTHON - if (unlikely(iternext == &_PyObject_NextNotImplemented)) - return NULL; -#endif -#else - next = PyIter_Next(iterator); - if (likely(next)) - return next; -#endif - } else if (CYTHON_USE_TYPE_SLOTS || unlikely(!PyIter_Check(iterator))) { - 
__Pyx_PyIter_Next_ErrorNoIterator(iterator); - return NULL; - } -#if !CYTHON_USE_TYPE_SLOTS - else { - next = PyIter_Next(iterator); - if (likely(next)) - return next; - } -#endif - return __Pyx_PyIter_Next2Default(defval); -} - -/* PyIntBinop */ -#if !CYTHON_COMPILING_IN_PYPY -static PyObject* __Pyx_PyInt_AddObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check) { - CYTHON_MAYBE_UNUSED_VAR(intval); - CYTHON_MAYBE_UNUSED_VAR(inplace); - CYTHON_UNUSED_VAR(zerodivision_check); - #if PY_MAJOR_VERSION < 3 - if (likely(PyInt_CheckExact(op1))) { - const long b = intval; - long x; - long a = PyInt_AS_LONG(op1); - - x = (long)((unsigned long)a + (unsigned long)b); - if (likely((x^a) >= 0 || (x^b) >= 0)) - return PyInt_FromLong(x); - return PyLong_Type.tp_as_number->nb_add(op1, op2); - } - #endif - #if CYTHON_USE_PYLONG_INTERNALS - if (likely(PyLong_CheckExact(op1))) { - const long b = intval; - long a, x; -#ifdef HAVE_LONG_LONG - const PY_LONG_LONG llb = intval; - PY_LONG_LONG lla, llx; -#endif - if (unlikely(__Pyx_PyLong_IsZero(op1))) { - return __Pyx_NewRef(op2); - } - if (likely(__Pyx_PyLong_IsCompact(op1))) { - a = __Pyx_PyLong_CompactValue(op1); - } else { - const digit* digits = __Pyx_PyLong_Digits(op1); - const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(op1); - switch (size) { - case -2: - if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { - a = -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) { - lla = -(PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - case 2: - if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { - a = (long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) { - 
lla = (PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - case -3: - if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { - a = -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) { - lla = -(PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - case 3: - if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { - a = (long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) { - lla = (PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - case -4: - if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { - a = -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) { - lla = -(PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - case 4: - if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { - a = (long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | 
(unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])); - break; - #ifdef HAVE_LONG_LONG - } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) { - lla = (PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0])); - goto long_long; - #endif - } - CYTHON_FALLTHROUGH; - default: return PyLong_Type.tp_as_number->nb_add(op1, op2); - } - } - x = a + b; - return PyLong_FromLong(x); -#ifdef HAVE_LONG_LONG - long_long: - llx = lla + llb; - return PyLong_FromLongLong(llx); -#endif - - - } - #endif - if (PyFloat_CheckExact(op1)) { - const long b = intval; -#if CYTHON_COMPILING_IN_LIMITED_API - double a = __pyx_PyFloat_AsDouble(op1); -#else - double a = PyFloat_AS_DOUBLE(op1); -#endif - double result; - - PyFPE_START_PROTECT("add", return NULL) - result = ((double)a) + (double)b; - PyFPE_END_PROTECT(result) - return PyFloat_FromDouble(result); - } - return (inplace ? 
PyNumber_InPlaceAdd : PyNumber_Add)(op1, op2); -} -#endif - -/* RaiseException */ -#if PY_MAJOR_VERSION < 3 -static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { - __Pyx_PyThreadState_declare - CYTHON_UNUSED_VAR(cause); - Py_XINCREF(type); - if (!value || value == Py_None) - value = NULL; - else - Py_INCREF(value); - if (!tb || tb == Py_None) - tb = NULL; - else { - Py_INCREF(tb); - if (!PyTraceBack_Check(tb)) { - PyErr_SetString(PyExc_TypeError, - "raise: arg 3 must be a traceback or None"); - goto raise_error; - } - } - if (PyType_Check(type)) { -#if CYTHON_COMPILING_IN_PYPY - if (!value) { - Py_INCREF(Py_None); - value = Py_None; - } -#endif - PyErr_NormalizeException(&type, &value, &tb); - } else { - if (value) { - PyErr_SetString(PyExc_TypeError, - "instance exception may not have a separate value"); - goto raise_error; - } - value = type; - type = (PyObject*) Py_TYPE(type); - Py_INCREF(type); - if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { - PyErr_SetString(PyExc_TypeError, - "raise: exception class must be a subclass of BaseException"); - goto raise_error; - } - } - __Pyx_PyThreadState_assign - __Pyx_ErrRestore(type, value, tb); - return; -raise_error: - Py_XDECREF(value); - Py_XDECREF(type); - Py_XDECREF(tb); - return; -} -#else -static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { - PyObject* owned_instance = NULL; - if (tb == Py_None) { - tb = 0; - } else if (tb && !PyTraceBack_Check(tb)) { - PyErr_SetString(PyExc_TypeError, - "raise: arg 3 must be a traceback or None"); - goto bad; - } - if (value == Py_None) - value = 0; - if (PyExceptionInstance_Check(type)) { - if (value) { - PyErr_SetString(PyExc_TypeError, - "instance exception may not have a separate value"); - goto bad; - } - value = type; - type = (PyObject*) Py_TYPE(value); - } else if (PyExceptionClass_Check(type)) { - PyObject *instance_class = NULL; - if (value && 
PyExceptionInstance_Check(value)) { - instance_class = (PyObject*) Py_TYPE(value); - if (instance_class != type) { - int is_subclass = PyObject_IsSubclass(instance_class, type); - if (!is_subclass) { - instance_class = NULL; - } else if (unlikely(is_subclass == -1)) { - goto bad; - } else { - type = instance_class; - } - } - } - if (!instance_class) { - PyObject *args; - if (!value) - args = PyTuple_New(0); - else if (PyTuple_Check(value)) { - Py_INCREF(value); - args = value; - } else - args = PyTuple_Pack(1, value); - if (!args) - goto bad; - owned_instance = PyObject_Call(type, args, NULL); - Py_DECREF(args); - if (!owned_instance) - goto bad; - value = owned_instance; - if (!PyExceptionInstance_Check(value)) { - PyErr_Format(PyExc_TypeError, - "calling %R should have returned an instance of " - "BaseException, not %R", - type, Py_TYPE(value)); - goto bad; - } - } - } else { - PyErr_SetString(PyExc_TypeError, - "raise: exception class must be a subclass of BaseException"); - goto bad; - } - if (cause) { - PyObject *fixed_cause; - if (cause == Py_None) { - fixed_cause = NULL; - } else if (PyExceptionClass_Check(cause)) { - fixed_cause = PyObject_CallObject(cause, NULL); - if (fixed_cause == NULL) - goto bad; - } else if (PyExceptionInstance_Check(cause)) { - fixed_cause = cause; - Py_INCREF(fixed_cause); - } else { - PyErr_SetString(PyExc_TypeError, - "exception causes must derive from " - "BaseException"); - goto bad; - } - PyException_SetCause(value, fixed_cause); - } - PyErr_SetObject(type, value); - if (tb) { - #if PY_VERSION_HEX >= 0x030C00A6 - PyException_SetTraceback(value, tb); - #elif CYTHON_FAST_THREAD_STATE - PyThreadState *tstate = __Pyx_PyThreadState_Current; - PyObject* tmp_tb = tstate->curexc_traceback; - if (tb != tmp_tb) { - Py_INCREF(tb); - tstate->curexc_traceback = tb; - Py_XDECREF(tmp_tb); - } -#else - PyObject *tmp_type, *tmp_value, *tmp_tb; - PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); - Py_INCREF(tb); - PyErr_Restore(tmp_type, tmp_value, 
tb); - Py_XDECREF(tmp_tb); -#endif - } -bad: - Py_XDECREF(owned_instance); - return; -} -#endif - -/* SetItemInt */ -static int __Pyx_SetItemInt_Generic(PyObject *o, PyObject *j, PyObject *v) { - int r; - if (unlikely(!j)) return -1; - r = PyObject_SetItem(o, j, v); - Py_DECREF(j); - return r; -} -static CYTHON_INLINE int __Pyx_SetItemInt_Fast(PyObject *o, Py_ssize_t i, PyObject *v, int is_list, - CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS - if (is_list || PyList_CheckExact(o)) { - Py_ssize_t n = (!wraparound) ? i : ((likely(i >= 0)) ? i : i + PyList_GET_SIZE(o)); - if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o)))) { - PyObject* old = PyList_GET_ITEM(o, n); - Py_INCREF(v); - PyList_SET_ITEM(o, n, v); - Py_DECREF(old); - return 1; - } - } else { - PyMappingMethods *mm = Py_TYPE(o)->tp_as_mapping; - PySequenceMethods *sm = Py_TYPE(o)->tp_as_sequence; - if (mm && mm->mp_ass_subscript) { - int r; - PyObject *key = PyInt_FromSsize_t(i); - if (unlikely(!key)) return -1; - r = mm->mp_ass_subscript(o, key, v); - Py_DECREF(key); - return r; - } - if (likely(sm && sm->sq_ass_item)) { - if (wraparound && unlikely(i < 0) && likely(sm->sq_length)) { - Py_ssize_t l = sm->sq_length(o); - if (likely(l >= 0)) { - i += l; - } else { - if (!PyErr_ExceptionMatches(PyExc_OverflowError)) - return -1; - PyErr_Clear(); - } - } - return sm->sq_ass_item(o, i, v); - } - } -#else -#if CYTHON_COMPILING_IN_PYPY - if (is_list || (PySequence_Check(o) && !PyDict_Check(o))) -#else - if (is_list || PySequence_Check(o)) -#endif - { - return PySequence_SetItem(o, i, v); - } -#endif - return __Pyx_SetItemInt_Generic(o, PyInt_FromSsize_t(i), v); -} - -/* ModInt[long] */ -static CYTHON_INLINE long __Pyx_mod_long(long a, long b) { - long r = a % b; - r += ((r != 0) & ((r ^ b) < 0)) * b; - return r; -} - -/* FixUpExtensionType */ -#if CYTHON_USE_TYPE_SPECS -static int 
__Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type) { -#if PY_VERSION_HEX > 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API - CYTHON_UNUSED_VAR(spec); - CYTHON_UNUSED_VAR(type); -#else - const PyType_Slot *slot = spec->slots; - while (slot && slot->slot && slot->slot != Py_tp_members) - slot++; - if (slot && slot->slot == Py_tp_members) { - int changed = 0; -#if !(PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON) - const -#endif - PyMemberDef *memb = (PyMemberDef*) slot->pfunc; - while (memb && memb->name) { - if (memb->name[0] == '_' && memb->name[1] == '_') { -#if PY_VERSION_HEX < 0x030900b1 - if (strcmp(memb->name, "__weaklistoffset__") == 0) { - assert(memb->type == T_PYSSIZET); - assert(memb->flags == READONLY); - type->tp_weaklistoffset = memb->offset; - changed = 1; - } - else if (strcmp(memb->name, "__dictoffset__") == 0) { - assert(memb->type == T_PYSSIZET); - assert(memb->flags == READONLY); - type->tp_dictoffset = memb->offset; - changed = 1; - } -#if CYTHON_METH_FASTCALL - else if (strcmp(memb->name, "__vectorcalloffset__") == 0) { - assert(memb->type == T_PYSSIZET); - assert(memb->flags == READONLY); -#if PY_VERSION_HEX >= 0x030800b4 - type->tp_vectorcall_offset = memb->offset; -#else - type->tp_print = (printfunc) memb->offset; -#endif - changed = 1; - } -#endif -#else - if ((0)); -#endif -#if PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON - else if (strcmp(memb->name, "__module__") == 0) { - PyObject *descr; - assert(memb->type == T_OBJECT); - assert(memb->flags == 0 || memb->flags == READONLY); - descr = PyDescr_NewMember(type, memb); - if (unlikely(!descr)) - return -1; - if (unlikely(PyDict_SetItem(type->tp_dict, PyDescr_NAME(descr), descr) < 0)) { - Py_DECREF(descr); - return -1; - } - Py_DECREF(descr); - changed = 1; - } -#endif - } - memb++; - } - if (changed) - PyType_Modified(type); - } -#endif - return 0; -} -#endif - -/* PyObjectCallNoArg */ -static CYTHON_INLINE PyObject* 
__Pyx_PyObject_CallNoArg(PyObject *func) { - PyObject *arg = NULL; - return __Pyx_PyObject_FastCall(func, (&arg)+1, 0 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET); -} - -/* PyObjectCallOneArg */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { - PyObject *args[2] = {NULL, arg}; - return __Pyx_PyObject_FastCall(func, args+1, 1 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET); -} - -/* PyObjectGetMethod */ -static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method) { - PyObject *attr; -#if CYTHON_UNPACK_METHODS && CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_PYTYPE_LOOKUP - __Pyx_TypeName type_name; - PyTypeObject *tp = Py_TYPE(obj); - PyObject *descr; - descrgetfunc f = NULL; - PyObject **dictptr, *dict; - int meth_found = 0; - assert (*method == NULL); - if (unlikely(tp->tp_getattro != PyObject_GenericGetAttr)) { - attr = __Pyx_PyObject_GetAttrStr(obj, name); - goto try_unpack; - } - if (unlikely(tp->tp_dict == NULL) && unlikely(PyType_Ready(tp) < 0)) { - return 0; - } - descr = _PyType_Lookup(tp, name); - if (likely(descr != NULL)) { - Py_INCREF(descr); -#if defined(Py_TPFLAGS_METHOD_DESCRIPTOR) && Py_TPFLAGS_METHOD_DESCRIPTOR - if (__Pyx_PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_METHOD_DESCRIPTOR)) -#elif PY_MAJOR_VERSION >= 3 - #ifdef __Pyx_CyFunction_USED - if (likely(PyFunction_Check(descr) || __Pyx_IS_TYPE(descr, &PyMethodDescr_Type) || __Pyx_CyFunction_Check(descr))) - #else - if (likely(PyFunction_Check(descr) || __Pyx_IS_TYPE(descr, &PyMethodDescr_Type))) - #endif -#else - #ifdef __Pyx_CyFunction_USED - if (likely(PyFunction_Check(descr) || __Pyx_CyFunction_Check(descr))) - #else - if (likely(PyFunction_Check(descr))) - #endif -#endif - { - meth_found = 1; - } else { - f = Py_TYPE(descr)->tp_descr_get; - if (f != NULL && PyDescr_IsData(descr)) { - attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); - Py_DECREF(descr); - goto try_unpack; - } - } - } - dictptr = _PyObject_GetDictPtr(obj); - if (dictptr != 
NULL && (dict = *dictptr) != NULL) { - Py_INCREF(dict); - attr = __Pyx_PyDict_GetItemStr(dict, name); - if (attr != NULL) { - Py_INCREF(attr); - Py_DECREF(dict); - Py_XDECREF(descr); - goto try_unpack; - } - Py_DECREF(dict); - } - if (meth_found) { - *method = descr; - return 1; - } - if (f != NULL) { - attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); - Py_DECREF(descr); - goto try_unpack; - } - if (likely(descr != NULL)) { - *method = descr; - return 0; - } - type_name = __Pyx_PyType_GetName(tp); - PyErr_Format(PyExc_AttributeError, -#if PY_MAJOR_VERSION >= 3 - "'" __Pyx_FMT_TYPENAME "' object has no attribute '%U'", - type_name, name); -#else - "'" __Pyx_FMT_TYPENAME "' object has no attribute '%.400s'", - type_name, PyString_AS_STRING(name)); -#endif - __Pyx_DECREF_TypeName(type_name); - return 0; -#else - attr = __Pyx_PyObject_GetAttrStr(obj, name); - goto try_unpack; -#endif -try_unpack: -#if CYTHON_UNPACK_METHODS - if (likely(attr) && PyMethod_Check(attr) && likely(PyMethod_GET_SELF(attr) == obj)) { - PyObject *function = PyMethod_GET_FUNCTION(attr); - Py_INCREF(function); - Py_DECREF(attr); - *method = function; - return 1; - } -#endif - *method = attr; - return 0; -} - -/* PyObjectCallMethod0 */ -static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name) { - PyObject *method = NULL, *result = NULL; - int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method); - if (likely(is_method)) { - result = __Pyx_PyObject_CallOneArg(method, obj); - Py_DECREF(method); - return result; - } - if (unlikely(!method)) goto bad; - result = __Pyx_PyObject_CallNoArg(method); - Py_DECREF(method); -bad: - return result; -} - -/* ValidateBasesTuple */ -#if CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API || CYTHON_USE_TYPE_SPECS -static int __Pyx_validate_bases_tuple(const char *type_name, Py_ssize_t dictoffset, PyObject *bases) { - Py_ssize_t i, n = PyTuple_GET_SIZE(bases); - for (i = 1; i < n; i++) - { - PyObject *b0 = 
PyTuple_GET_ITEM(bases, i); - PyTypeObject *b; -#if PY_MAJOR_VERSION < 3 - if (PyClass_Check(b0)) - { - PyErr_Format(PyExc_TypeError, "base class '%.200s' is an old-style class", - PyString_AS_STRING(((PyClassObject*)b0)->cl_name)); - return -1; - } -#endif - b = (PyTypeObject*) b0; - if (!__Pyx_PyType_HasFeature(b, Py_TPFLAGS_HEAPTYPE)) - { - __Pyx_TypeName b_name = __Pyx_PyType_GetName(b); - PyErr_Format(PyExc_TypeError, - "base class '" __Pyx_FMT_TYPENAME "' is not a heap type", b_name); - __Pyx_DECREF_TypeName(b_name); - return -1; - } - if (dictoffset == 0 && b->tp_dictoffset) - { - __Pyx_TypeName b_name = __Pyx_PyType_GetName(b); - PyErr_Format(PyExc_TypeError, - "extension type '%.200s' has no __dict__ slot, " - "but base type '" __Pyx_FMT_TYPENAME "' has: " - "either add 'cdef dict __dict__' to the extension type " - "or add '__slots__ = [...]' to the base type", - type_name, b_name); - __Pyx_DECREF_TypeName(b_name); - return -1; - } - } - return 0; -} -#endif - -/* PyType_Ready */ -static int __Pyx_PyType_Ready(PyTypeObject *t) { -#if CYTHON_USE_TYPE_SPECS || !(CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API) || defined(PYSTON_MAJOR_VERSION) - (void)__Pyx_PyObject_CallMethod0; -#if CYTHON_USE_TYPE_SPECS - (void)__Pyx_validate_bases_tuple; -#endif - return PyType_Ready(t); -#else - int r; - PyObject *bases = __Pyx_PyType_GetSlot(t, tp_bases, PyObject*); - if (bases && unlikely(__Pyx_validate_bases_tuple(t->tp_name, t->tp_dictoffset, bases) == -1)) - return -1; -#if PY_VERSION_HEX >= 0x03050000 && !defined(PYSTON_MAJOR_VERSION) - { - int gc_was_enabled; - #if PY_VERSION_HEX >= 0x030A00b1 - gc_was_enabled = PyGC_Disable(); - (void)__Pyx_PyObject_CallMethod0; - #else - PyObject *ret, *py_status; - PyObject *gc = NULL; - #if PY_VERSION_HEX >= 0x030700a1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM+0 >= 0x07030400) - gc = PyImport_GetModule(__pyx_kp_u_gc); - #endif - if (unlikely(!gc)) gc = PyImport_Import(__pyx_kp_u_gc); - if (unlikely(!gc)) 
return -1; - py_status = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_isenabled); - if (unlikely(!py_status)) { - Py_DECREF(gc); - return -1; - } - gc_was_enabled = __Pyx_PyObject_IsTrue(py_status); - Py_DECREF(py_status); - if (gc_was_enabled > 0) { - ret = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_disable); - if (unlikely(!ret)) { - Py_DECREF(gc); - return -1; - } - Py_DECREF(ret); - } else if (unlikely(gc_was_enabled == -1)) { - Py_DECREF(gc); - return -1; - } - #endif - t->tp_flags |= Py_TPFLAGS_HEAPTYPE; -#if PY_VERSION_HEX >= 0x030A0000 - t->tp_flags |= Py_TPFLAGS_IMMUTABLETYPE; -#endif -#else - (void)__Pyx_PyObject_CallMethod0; -#endif - r = PyType_Ready(t); -#if PY_VERSION_HEX >= 0x03050000 && !defined(PYSTON_MAJOR_VERSION) - t->tp_flags &= ~Py_TPFLAGS_HEAPTYPE; - #if PY_VERSION_HEX >= 0x030A00b1 - if (gc_was_enabled) - PyGC_Enable(); - #else - if (gc_was_enabled) { - PyObject *tp, *v, *tb; - PyErr_Fetch(&tp, &v, &tb); - ret = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_enable); - if (likely(ret || r == -1)) { - Py_XDECREF(ret); - PyErr_Restore(tp, v, tb); - } else { - Py_XDECREF(tp); - Py_XDECREF(v); - Py_XDECREF(tb); - r = -1; - } - } - Py_DECREF(gc); - #endif - } -#endif - return r; -#endif -} - -/* PyObject_GenericGetAttrNoDict */ -#if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 -static PyObject *__Pyx_RaiseGenericGetAttributeError(PyTypeObject *tp, PyObject *attr_name) { - __Pyx_TypeName type_name = __Pyx_PyType_GetName(tp); - PyErr_Format(PyExc_AttributeError, -#if PY_MAJOR_VERSION >= 3 - "'" __Pyx_FMT_TYPENAME "' object has no attribute '%U'", - type_name, attr_name); -#else - "'" __Pyx_FMT_TYPENAME "' object has no attribute '%.400s'", - type_name, PyString_AS_STRING(attr_name)); -#endif - __Pyx_DECREF_TypeName(type_name); - return NULL; -} -static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name) { - PyObject *descr; - PyTypeObject *tp = Py_TYPE(obj); - if 
(unlikely(!PyString_Check(attr_name))) { - return PyObject_GenericGetAttr(obj, attr_name); - } - assert(!tp->tp_dictoffset); - descr = _PyType_Lookup(tp, attr_name); - if (unlikely(!descr)) { - return __Pyx_RaiseGenericGetAttributeError(tp, attr_name); - } - Py_INCREF(descr); - #if PY_MAJOR_VERSION < 3 - if (likely(PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_HAVE_CLASS))) - #endif - { - descrgetfunc f = Py_TYPE(descr)->tp_descr_get; - if (unlikely(f)) { - PyObject *res = f(descr, obj, (PyObject *)tp); - Py_DECREF(descr); - return res; - } - } - return descr; -} -#endif - -/* FastTypeChecks */ -#if CYTHON_COMPILING_IN_CPYTHON -static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { - while (a) { - a = __Pyx_PyType_GetSlot(a, tp_base, PyTypeObject*); - if (a == b) - return 1; - } - return b == &PyBaseObject_Type; -} -static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { - PyObject *mro; - if (a == b) return 1; - mro = a->tp_mro; - if (likely(mro)) { - Py_ssize_t i, n; - n = PyTuple_GET_SIZE(mro); - for (i = 0; i < n; i++) { - if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) - return 1; - } - return 0; - } - return __Pyx_InBases(a, b); -} -static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b) { - PyObject *mro; - if (cls == a || cls == b) return 1; - mro = cls->tp_mro; - if (likely(mro)) { - Py_ssize_t i, n; - n = PyTuple_GET_SIZE(mro); - for (i = 0; i < n; i++) { - PyObject *base = PyTuple_GET_ITEM(mro, i); - if (base == (PyObject *)a || base == (PyObject *)b) - return 1; - } - return 0; - } - return __Pyx_InBases(cls, a) || __Pyx_InBases(cls, b); -} -#if PY_MAJOR_VERSION == 2 -static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { - PyObject *exception, *value, *tb; - int res; - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - __Pyx_ErrFetch(&exception, &value, &tb); - res = exc_type1 ? 
PyObject_IsSubclass(err, exc_type1) : 0; - if (unlikely(res == -1)) { - PyErr_WriteUnraisable(err); - res = 0; - } - if (!res) { - res = PyObject_IsSubclass(err, exc_type2); - if (unlikely(res == -1)) { - PyErr_WriteUnraisable(err); - res = 0; - } - } - __Pyx_ErrRestore(exception, value, tb); - return res; -} -#else -static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { - if (exc_type1) { - return __Pyx_IsAnySubtype2((PyTypeObject*)err, (PyTypeObject*)exc_type1, (PyTypeObject*)exc_type2); - } else { - return __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); - } -} -#endif -static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { - Py_ssize_t i, n; - assert(PyExceptionClass_Check(exc_type)); - n = PyTuple_GET_SIZE(tuple); -#if PY_MAJOR_VERSION >= 3 - for (i=0; i= 3 - if (level == -1) { - if ((1) && (strchr(__Pyx_MODULE_NAME, '.'))) { - #if CYTHON_COMPILING_IN_LIMITED_API - module = PyImport_ImportModuleLevelObject( - name, empty_dict, empty_dict, from_list, 1); - #else - module = PyImport_ImportModuleLevelObject( - name, __pyx_d, empty_dict, from_list, 1); - #endif - if (unlikely(!module)) { - if (unlikely(!PyErr_ExceptionMatches(PyExc_ImportError))) - goto bad; - PyErr_Clear(); - } - } - level = 0; - } - #endif - if (!module) { - #if PY_MAJOR_VERSION < 3 - PyObject *py_level = PyInt_FromLong(level); - if (unlikely(!py_level)) - goto bad; - module = PyObject_CallFunctionObjArgs(py_import, - name, __pyx_d, empty_dict, from_list, py_level, (PyObject *)NULL); - Py_DECREF(py_level); - #else - #if CYTHON_COMPILING_IN_LIMITED_API - module = PyImport_ImportModuleLevelObject( - name, empty_dict, empty_dict, from_list, level); - #else - module = PyImport_ImportModuleLevelObject( - name, __pyx_d, empty_dict, from_list, level); - #endif - #endif - } - } -bad: - Py_XDECREF(empty_dict); - Py_XDECREF(empty_list); - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(py_import); - 
#endif - return module; -} - -/* ImportFrom */ -static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name) { - PyObject* value = __Pyx_PyObject_GetAttrStr(module, name); - if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) { - const char* module_name_str = 0; - PyObject* module_name = 0; - PyObject* module_dot = 0; - PyObject* full_name = 0; - PyErr_Clear(); - module_name_str = PyModule_GetName(module); - if (unlikely(!module_name_str)) { goto modbad; } - module_name = PyUnicode_FromString(module_name_str); - if (unlikely(!module_name)) { goto modbad; } - module_dot = PyUnicode_Concat(module_name, __pyx_kp_u__2); - if (unlikely(!module_dot)) { goto modbad; } - full_name = PyUnicode_Concat(module_dot, name); - if (unlikely(!full_name)) { goto modbad; } - #if PY_VERSION_HEX < 0x030700A1 || (CYTHON_COMPILING_IN_PYPY && PYPY_VERSION_NUM < 0x07030400) - { - PyObject *modules = PyImport_GetModuleDict(); - if (unlikely(!modules)) - goto modbad; - value = PyObject_GetItem(modules, full_name); - } - #else - value = PyImport_GetModule(full_name); - #endif - modbad: - Py_XDECREF(full_name); - Py_XDECREF(module_dot); - Py_XDECREF(module_name); - } - if (unlikely(!value)) { - PyErr_Format(PyExc_ImportError, - #if PY_MAJOR_VERSION < 3 - "cannot import name %.230s", PyString_AS_STRING(name)); - #else - "cannot import name %S", name); - #endif - } - return value; -} - -/* ImportDottedModule */ -#if PY_MAJOR_VERSION >= 3 -static PyObject *__Pyx__ImportDottedModule_Error(PyObject *name, PyObject *parts_tuple, Py_ssize_t count) { - PyObject *partial_name = NULL, *slice = NULL, *sep = NULL; - if (unlikely(PyErr_Occurred())) { - PyErr_Clear(); - } - if (likely(PyTuple_GET_SIZE(parts_tuple) == count)) { - partial_name = name; - } else { - slice = PySequence_GetSlice(parts_tuple, 0, count); - if (unlikely(!slice)) - goto bad; - sep = PyUnicode_FromStringAndSize(".", 1); - if (unlikely(!sep)) - goto bad; - partial_name = PyUnicode_Join(sep, slice); - } - 
PyErr_Format( -#if PY_MAJOR_VERSION < 3 - PyExc_ImportError, - "No module named '%s'", PyString_AS_STRING(partial_name)); -#else -#if PY_VERSION_HEX >= 0x030600B1 - PyExc_ModuleNotFoundError, -#else - PyExc_ImportError, -#endif - "No module named '%U'", partial_name); -#endif -bad: - Py_XDECREF(sep); - Py_XDECREF(slice); - Py_XDECREF(partial_name); - return NULL; -} -#endif -#if PY_MAJOR_VERSION >= 3 -static PyObject *__Pyx__ImportDottedModule_Lookup(PyObject *name) { - PyObject *imported_module; -#if PY_VERSION_HEX < 0x030700A1 || (CYTHON_COMPILING_IN_PYPY && PYPY_VERSION_NUM < 0x07030400) - PyObject *modules = PyImport_GetModuleDict(); - if (unlikely(!modules)) - return NULL; - imported_module = __Pyx_PyDict_GetItemStr(modules, name); - Py_XINCREF(imported_module); -#else - imported_module = PyImport_GetModule(name); -#endif - return imported_module; -} -#endif -#if PY_MAJOR_VERSION >= 3 -static PyObject *__Pyx_ImportDottedModule_WalkParts(PyObject *module, PyObject *name, PyObject *parts_tuple) { - Py_ssize_t i, nparts; - nparts = PyTuple_GET_SIZE(parts_tuple); - for (i=1; i < nparts && module; i++) { - PyObject *part, *submodule; -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - part = PyTuple_GET_ITEM(parts_tuple, i); -#else - part = PySequence_ITEM(parts_tuple, i); -#endif - submodule = __Pyx_PyObject_GetAttrStrNoError(module, part); -#if !(CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS) - Py_DECREF(part); -#endif - Py_DECREF(module); - module = submodule; - } - if (unlikely(!module)) { - return __Pyx__ImportDottedModule_Error(name, parts_tuple, i); - } - return module; -} -#endif -static PyObject *__Pyx__ImportDottedModule(PyObject *name, PyObject *parts_tuple) { -#if PY_MAJOR_VERSION < 3 - PyObject *module, *from_list, *star = __pyx_n_s__3; - CYTHON_UNUSED_VAR(parts_tuple); - from_list = PyList_New(1); - if (unlikely(!from_list)) - return NULL; - Py_INCREF(star); - PyList_SET_ITEM(from_list, 0, star); - module = __Pyx_Import(name, 
from_list, 0); - Py_DECREF(from_list); - return module; -#else - PyObject *imported_module; - PyObject *module = __Pyx_Import(name, NULL, 0); - if (!parts_tuple || unlikely(!module)) - return module; - imported_module = __Pyx__ImportDottedModule_Lookup(name); - if (likely(imported_module)) { - Py_DECREF(module); - return imported_module; - } - PyErr_Clear(); - return __Pyx_ImportDottedModule_WalkParts(module, name, parts_tuple); -#endif -} -static PyObject *__Pyx_ImportDottedModule(PyObject *name, PyObject *parts_tuple) { -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030400B1 - PyObject *module = __Pyx__ImportDottedModule_Lookup(name); - if (likely(module)) { - PyObject *spec = __Pyx_PyObject_GetAttrStrNoError(module, __pyx_n_s_spec); - if (likely(spec)) { - PyObject *unsafe = __Pyx_PyObject_GetAttrStrNoError(spec, __pyx_n_s_initializing); - if (likely(!unsafe || !__Pyx_PyObject_IsTrue(unsafe))) { - Py_DECREF(spec); - spec = NULL; - } - Py_XDECREF(unsafe); - } - if (likely(!spec)) { - PyErr_Clear(); - return module; - } - Py_DECREF(spec); - Py_DECREF(module); - } else if (PyErr_Occurred()) { - PyErr_Clear(); - } -#endif - return __Pyx__ImportDottedModule(name, parts_tuple); -} - -/* pybytes_as_double */ -static double __Pyx_SlowPyString_AsDouble(PyObject *obj) { - PyObject *float_value; -#if PY_MAJOR_VERSION >= 3 - float_value = PyFloat_FromString(obj); -#else - float_value = PyFloat_FromString(obj, 0); -#endif - if (likely(float_value)) { - double value = PyFloat_AS_DOUBLE(float_value); - Py_DECREF(float_value); - return value; - } - return (double)-1; -} -static const char* __Pyx__PyBytes_AsDouble_Copy(const char* start, char* buffer, Py_ssize_t length) { - int last_was_punctuation = 1; - Py_ssize_t i; - for (i=0; i < length; i++) { - char chr = start[i]; - int is_punctuation = (chr == '_') | (chr == '.') | (chr == 'e') | (chr == 'E'); - *buffer = chr; - buffer += (chr != '_'); - if (unlikely(last_was_punctuation & is_punctuation)) goto parse_failure; - 
last_was_punctuation = is_punctuation; - } - if (unlikely(last_was_punctuation)) goto parse_failure; - *buffer = '\0'; - return buffer; -parse_failure: - return NULL; -} -static double __Pyx__PyBytes_AsDouble_inf_nan(const char* start, Py_ssize_t length) { - int matches = 1; - char sign = start[0]; - int is_signed = (sign == '+') | (sign == '-'); - start += is_signed; - length -= is_signed; - switch (start[0]) { - #ifdef Py_NAN - case 'n': - case 'N': - if (unlikely(length != 3)) goto parse_failure; - matches &= (start[1] == 'a' || start[1] == 'A'); - matches &= (start[2] == 'n' || start[2] == 'N'); - if (unlikely(!matches)) goto parse_failure; - return (sign == '-') ? -Py_NAN : Py_NAN; - #endif - case 'i': - case 'I': - if (unlikely(length < 3)) goto parse_failure; - matches &= (start[1] == 'n' || start[1] == 'N'); - matches &= (start[2] == 'f' || start[2] == 'F'); - if (likely(length == 3 && matches)) - return (sign == '-') ? -Py_HUGE_VAL : Py_HUGE_VAL; - if (unlikely(length != 8)) goto parse_failure; - matches &= (start[3] == 'i' || start[3] == 'I'); - matches &= (start[4] == 'n' || start[4] == 'N'); - matches &= (start[5] == 'i' || start[5] == 'I'); - matches &= (start[6] == 't' || start[6] == 'T'); - matches &= (start[7] == 'y' || start[7] == 'Y'); - if (unlikely(!matches)) goto parse_failure; - return (sign == '-') ? 
-Py_HUGE_VAL : Py_HUGE_VAL; - case '.': case '0': case '1': case '2': case '3': case '4': case '5': case '6': case '7': case '8': case '9': - break; - default: - goto parse_failure; - } - return 0.0; -parse_failure: - return -1.0; -} -static CYTHON_INLINE int __Pyx__PyBytes_AsDouble_IsSpace(char ch) { - return (ch == 0x20) | !((ch < 0x9) | (ch > 0xd)); -} -CYTHON_UNUSED static double __Pyx__PyBytes_AsDouble(PyObject *obj, const char* start, Py_ssize_t length) { - double value; - Py_ssize_t i, digits; - const char *last = start + length; - char *end; - while (__Pyx__PyBytes_AsDouble_IsSpace(*start)) - start++; - while (start < last - 1 && __Pyx__PyBytes_AsDouble_IsSpace(last[-1])) - last--; - length = last - start; - if (unlikely(length <= 0)) goto fallback; - value = __Pyx__PyBytes_AsDouble_inf_nan(start, length); - if (unlikely(value == -1.0)) goto fallback; - if (value != 0.0) return value; - digits = 0; - for (i=0; i < length; digits += start[i++] != '_'); - if (likely(digits == length)) { - value = PyOS_string_to_double(start, &end, NULL); - } else if (digits < 40) { - char number[40]; - last = __Pyx__PyBytes_AsDouble_Copy(start, number, length); - if (unlikely(!last)) goto fallback; - value = PyOS_string_to_double(number, &end, NULL); - } else { - char *number = (char*) PyMem_Malloc((digits + 1) * sizeof(char)); - if (unlikely(!number)) goto fallback; - last = __Pyx__PyBytes_AsDouble_Copy(start, number, length); - if (unlikely(!last)) { - PyMem_Free(number); - goto fallback; - } - value = PyOS_string_to_double(number, &end, NULL); - PyMem_Free(number); - } - if (likely(end == last) || (value == (double)-1 && PyErr_Occurred())) { - return value; - } -fallback: - return __Pyx_SlowPyString_AsDouble(obj); -} - -/* FetchSharedCythonModule */ -static PyObject *__Pyx_FetchSharedCythonABIModule(void) { - PyObject *abi_module = PyImport_AddModule((char*) __PYX_ABI_MODULE_NAME); - if (unlikely(!abi_module)) return NULL; - Py_INCREF(abi_module); - return abi_module; -} - 
-/* FetchCommonType */ -static int __Pyx_VerifyCachedType(PyObject *cached_type, - const char *name, - Py_ssize_t basicsize, - Py_ssize_t expected_basicsize) { - if (!PyType_Check(cached_type)) { - PyErr_Format(PyExc_TypeError, - "Shared Cython type %.200s is not a type object", name); - return -1; - } - if (basicsize != expected_basicsize) { - PyErr_Format(PyExc_TypeError, - "Shared Cython type %.200s has the wrong size, try recompiling", - name); - return -1; - } - return 0; -} -#if !CYTHON_USE_TYPE_SPECS -static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type) { - PyObject* abi_module; - const char* object_name; - PyTypeObject *cached_type = NULL; - abi_module = __Pyx_FetchSharedCythonABIModule(); - if (!abi_module) return NULL; - object_name = strrchr(type->tp_name, '.'); - object_name = object_name ? object_name+1 : type->tp_name; - cached_type = (PyTypeObject*) PyObject_GetAttrString(abi_module, object_name); - if (cached_type) { - if (__Pyx_VerifyCachedType( - (PyObject *)cached_type, - object_name, - cached_type->tp_basicsize, - type->tp_basicsize) < 0) { - goto bad; - } - goto done; - } - if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad; - PyErr_Clear(); - if (PyType_Ready(type) < 0) goto bad; - if (PyObject_SetAttrString(abi_module, object_name, (PyObject *)type) < 0) - goto bad; - Py_INCREF(type); - cached_type = type; -done: - Py_DECREF(abi_module); - return cached_type; -bad: - Py_XDECREF(cached_type); - cached_type = NULL; - goto done; -} -#else -static PyTypeObject *__Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) { - PyObject *abi_module, *cached_type = NULL; - const char* object_name = strrchr(spec->name, '.'); - object_name = object_name ? 
object_name+1 : spec->name; - abi_module = __Pyx_FetchSharedCythonABIModule(); - if (!abi_module) return NULL; - cached_type = PyObject_GetAttrString(abi_module, object_name); - if (cached_type) { - Py_ssize_t basicsize; -#if CYTHON_COMPILING_IN_LIMITED_API - PyObject *py_basicsize; - py_basicsize = PyObject_GetAttrString(cached_type, "__basicsize__"); - if (unlikely(!py_basicsize)) goto bad; - basicsize = PyLong_AsSsize_t(py_basicsize); - Py_DECREF(py_basicsize); - py_basicsize = 0; - if (unlikely(basicsize == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad; -#else - basicsize = likely(PyType_Check(cached_type)) ? ((PyTypeObject*) cached_type)->tp_basicsize : -1; -#endif - if (__Pyx_VerifyCachedType( - cached_type, - object_name, - basicsize, - spec->basicsize) < 0) { - goto bad; - } - goto done; - } - if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad; - PyErr_Clear(); - CYTHON_UNUSED_VAR(module); - cached_type = __Pyx_PyType_FromModuleAndSpec(abi_module, spec, bases); - if (unlikely(!cached_type)) goto bad; - if (unlikely(__Pyx_fix_up_extension_type_from_spec(spec, (PyTypeObject *) cached_type) < 0)) goto bad; - if (PyObject_SetAttrString(abi_module, object_name, cached_type) < 0) goto bad; -done: - Py_DECREF(abi_module); - assert(cached_type == NULL || PyType_Check(cached_type)); - return (PyTypeObject *) cached_type; -bad: - Py_XDECREF(cached_type); - cached_type = NULL; - goto done; -} -#endif - -/* PyVectorcallFastCallDict */ -#if CYTHON_METH_FASTCALL -static PyObject *__Pyx_PyVectorcall_FastCallDict_kw(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw) -{ - PyObject *res = NULL; - PyObject *kwnames; - PyObject **newargs; - PyObject **kwvalues; - Py_ssize_t i, pos; - size_t j; - PyObject *key, *value; - unsigned long keys_are_strings; - Py_ssize_t nkw = PyDict_GET_SIZE(kw); - newargs = (PyObject **)PyMem_Malloc((nargs + (size_t)nkw) * sizeof(args[0])); - if (unlikely(newargs == NULL)) { - PyErr_NoMemory(); - 
return NULL; - } - for (j = 0; j < nargs; j++) newargs[j] = args[j]; - kwnames = PyTuple_New(nkw); - if (unlikely(kwnames == NULL)) { - PyMem_Free(newargs); - return NULL; - } - kwvalues = newargs + nargs; - pos = i = 0; - keys_are_strings = Py_TPFLAGS_UNICODE_SUBCLASS; - while (PyDict_Next(kw, &pos, &key, &value)) { - keys_are_strings &= Py_TYPE(key)->tp_flags; - Py_INCREF(key); - Py_INCREF(value); - PyTuple_SET_ITEM(kwnames, i, key); - kwvalues[i] = value; - i++; - } - if (unlikely(!keys_are_strings)) { - PyErr_SetString(PyExc_TypeError, "keywords must be strings"); - goto cleanup; - } - res = vc(func, newargs, nargs, kwnames); -cleanup: - Py_DECREF(kwnames); - for (i = 0; i < nkw; i++) - Py_DECREF(kwvalues[i]); - PyMem_Free(newargs); - return res; -} -static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw) -{ - if (likely(kw == NULL) || PyDict_GET_SIZE(kw) == 0) { - return vc(func, args, nargs, NULL); - } - return __Pyx_PyVectorcall_FastCallDict_kw(func, vc, args, nargs, kw); -} -#endif - -/* CythonFunctionShared */ -static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj) { -#if PY_VERSION_HEX < 0x030900B1 - __Pyx_Py_XDECREF_SET( - __Pyx_CyFunction_GetClassObj(f), - ((classobj) ? __Pyx_NewRef(classobj) : NULL)); -#else - __Pyx_Py_XDECREF_SET( - ((PyCMethodObject *) (f))->mm_class, - (PyTypeObject*)((classobj) ? 
__Pyx_NewRef(classobj) : NULL)); -#endif -} -static PyObject * -__Pyx_CyFunction_get_doc(__pyx_CyFunctionObject *op, void *closure) -{ - CYTHON_UNUSED_VAR(closure); - if (unlikely(op->func_doc == NULL)) { - if (((PyCFunctionObject*)op)->m_ml->ml_doc) { -#if PY_MAJOR_VERSION >= 3 - op->func_doc = PyUnicode_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc); -#else - op->func_doc = PyString_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc); -#endif - if (unlikely(op->func_doc == NULL)) - return NULL; - } else { - Py_INCREF(Py_None); - return Py_None; - } - } - Py_INCREF(op->func_doc); - return op->func_doc; -} -static int -__Pyx_CyFunction_set_doc(__pyx_CyFunctionObject *op, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); - if (value == NULL) { - value = Py_None; - } - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->func_doc, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_name(__pyx_CyFunctionObject *op, void *context) -{ - CYTHON_UNUSED_VAR(context); - if (unlikely(op->func_name == NULL)) { -#if PY_MAJOR_VERSION >= 3 - op->func_name = PyUnicode_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name); -#else - op->func_name = PyString_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name); -#endif - if (unlikely(op->func_name == NULL)) - return NULL; - } - Py_INCREF(op->func_name); - return op->func_name; -} -static int -__Pyx_CyFunction_set_name(__pyx_CyFunctionObject *op, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); -#if PY_MAJOR_VERSION >= 3 - if (unlikely(value == NULL || !PyUnicode_Check(value))) -#else - if (unlikely(value == NULL || !PyString_Check(value))) -#endif - { - PyErr_SetString(PyExc_TypeError, - "__name__ must be set to a string object"); - return -1; - } - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->func_name, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_qualname(__pyx_CyFunctionObject *op, void *context) -{ - CYTHON_UNUSED_VAR(context); - Py_INCREF(op->func_qualname); - 
return op->func_qualname; -} -static int -__Pyx_CyFunction_set_qualname(__pyx_CyFunctionObject *op, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); -#if PY_MAJOR_VERSION >= 3 - if (unlikely(value == NULL || !PyUnicode_Check(value))) -#else - if (unlikely(value == NULL || !PyString_Check(value))) -#endif - { - PyErr_SetString(PyExc_TypeError, - "__qualname__ must be set to a string object"); - return -1; - } - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->func_qualname, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_dict(__pyx_CyFunctionObject *op, void *context) -{ - CYTHON_UNUSED_VAR(context); - if (unlikely(op->func_dict == NULL)) { - op->func_dict = PyDict_New(); - if (unlikely(op->func_dict == NULL)) - return NULL; - } - Py_INCREF(op->func_dict); - return op->func_dict; -} -static int -__Pyx_CyFunction_set_dict(__pyx_CyFunctionObject *op, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); - if (unlikely(value == NULL)) { - PyErr_SetString(PyExc_TypeError, - "function's dictionary may not be deleted"); - return -1; - } - if (unlikely(!PyDict_Check(value))) { - PyErr_SetString(PyExc_TypeError, - "setting function's dictionary to a non-dict"); - return -1; - } - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->func_dict, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_globals(__pyx_CyFunctionObject *op, void *context) -{ - CYTHON_UNUSED_VAR(context); - Py_INCREF(op->func_globals); - return op->func_globals; -} -static PyObject * -__Pyx_CyFunction_get_closure(__pyx_CyFunctionObject *op, void *context) -{ - CYTHON_UNUSED_VAR(op); - CYTHON_UNUSED_VAR(context); - Py_INCREF(Py_None); - return Py_None; -} -static PyObject * -__Pyx_CyFunction_get_code(__pyx_CyFunctionObject *op, void *context) -{ - PyObject* result = (op->func_code) ? 
op->func_code : Py_None; - CYTHON_UNUSED_VAR(context); - Py_INCREF(result); - return result; -} -static int -__Pyx_CyFunction_init_defaults(__pyx_CyFunctionObject *op) { - int result = 0; - PyObject *res = op->defaults_getter((PyObject *) op); - if (unlikely(!res)) - return -1; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - op->defaults_tuple = PyTuple_GET_ITEM(res, 0); - Py_INCREF(op->defaults_tuple); - op->defaults_kwdict = PyTuple_GET_ITEM(res, 1); - Py_INCREF(op->defaults_kwdict); - #else - op->defaults_tuple = PySequence_ITEM(res, 0); - if (unlikely(!op->defaults_tuple)) result = -1; - else { - op->defaults_kwdict = PySequence_ITEM(res, 1); - if (unlikely(!op->defaults_kwdict)) result = -1; - } - #endif - Py_DECREF(res); - return result; -} -static int -__Pyx_CyFunction_set_defaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) { - CYTHON_UNUSED_VAR(context); - if (!value) { - value = Py_None; - } else if (unlikely(value != Py_None && !PyTuple_Check(value))) { - PyErr_SetString(PyExc_TypeError, - "__defaults__ must be set to a tuple object"); - return -1; - } - PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__defaults__ will not " - "currently affect the values used in function calls", 1); - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->defaults_tuple, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_defaults(__pyx_CyFunctionObject *op, void *context) { - PyObject* result = op->defaults_tuple; - CYTHON_UNUSED_VAR(context); - if (unlikely(!result)) { - if (op->defaults_getter) { - if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL; - result = op->defaults_tuple; - } else { - result = Py_None; - } - } - Py_INCREF(result); - return result; -} -static int -__Pyx_CyFunction_set_kwdefaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) { - CYTHON_UNUSED_VAR(context); - if (!value) { - value = Py_None; - } else if (unlikely(value != Py_None && !PyDict_Check(value))) { - 
PyErr_SetString(PyExc_TypeError, - "__kwdefaults__ must be set to a dict object"); - return -1; - } - PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__kwdefaults__ will not " - "currently affect the values used in function calls", 1); - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(op->defaults_kwdict, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_kwdefaults(__pyx_CyFunctionObject *op, void *context) { - PyObject* result = op->defaults_kwdict; - CYTHON_UNUSED_VAR(context); - if (unlikely(!result)) { - if (op->defaults_getter) { - if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL; - result = op->defaults_kwdict; - } else { - result = Py_None; - } - } - Py_INCREF(result); - return result; -} -static int -__Pyx_CyFunction_set_annotations(__pyx_CyFunctionObject *op, PyObject* value, void *context) { - CYTHON_UNUSED_VAR(context); - if (!value || value == Py_None) { - value = NULL; - } else if (unlikely(!PyDict_Check(value))) { - PyErr_SetString(PyExc_TypeError, - "__annotations__ must be set to a dict object"); - return -1; - } - Py_XINCREF(value); - __Pyx_Py_XDECREF_SET(op->func_annotations, value); - return 0; -} -static PyObject * -__Pyx_CyFunction_get_annotations(__pyx_CyFunctionObject *op, void *context) { - PyObject* result = op->func_annotations; - CYTHON_UNUSED_VAR(context); - if (unlikely(!result)) { - result = PyDict_New(); - if (unlikely(!result)) return NULL; - op->func_annotations = result; - } - Py_INCREF(result); - return result; -} -static PyObject * -__Pyx_CyFunction_get_is_coroutine(__pyx_CyFunctionObject *op, void *context) { - int is_coroutine; - CYTHON_UNUSED_VAR(context); - if (op->func_is_coroutine) { - return __Pyx_NewRef(op->func_is_coroutine); - } - is_coroutine = op->flags & __Pyx_CYFUNCTION_COROUTINE; -#if PY_VERSION_HEX >= 0x03050000 - if (is_coroutine) { - PyObject *module, *fromlist, *marker = __pyx_n_s_is_coroutine; - fromlist = PyList_New(1); - if (unlikely(!fromlist)) return NULL; - 
Py_INCREF(marker); - PyList_SET_ITEM(fromlist, 0, marker); - module = PyImport_ImportModuleLevelObject(__pyx_n_s_asyncio_coroutines, NULL, NULL, fromlist, 0); - Py_DECREF(fromlist); - if (unlikely(!module)) goto ignore; - op->func_is_coroutine = __Pyx_PyObject_GetAttrStr(module, marker); - Py_DECREF(module); - if (likely(op->func_is_coroutine)) { - return __Pyx_NewRef(op->func_is_coroutine); - } -ignore: - PyErr_Clear(); - } -#endif - op->func_is_coroutine = __Pyx_PyBool_FromLong(is_coroutine); - return __Pyx_NewRef(op->func_is_coroutine); -} -static PyGetSetDef __pyx_CyFunction_getsets[] = { - {(char *) "func_doc", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0}, - {(char *) "__doc__", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0}, - {(char *) "func_name", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0}, - {(char *) "__name__", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0}, - {(char *) "__qualname__", (getter)__Pyx_CyFunction_get_qualname, (setter)__Pyx_CyFunction_set_qualname, 0, 0}, - {(char *) "func_dict", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0}, - {(char *) "__dict__", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0}, - {(char *) "func_globals", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0}, - {(char *) "__globals__", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0}, - {(char *) "func_closure", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0}, - {(char *) "__closure__", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0}, - {(char *) "func_code", (getter)__Pyx_CyFunction_get_code, 0, 0, 0}, - {(char *) "__code__", (getter)__Pyx_CyFunction_get_code, 0, 0, 0}, - {(char *) "func_defaults", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0}, - {(char *) "__defaults__", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0}, - {(char *) 
"__kwdefaults__", (getter)__Pyx_CyFunction_get_kwdefaults, (setter)__Pyx_CyFunction_set_kwdefaults, 0, 0}, - {(char *) "__annotations__", (getter)__Pyx_CyFunction_get_annotations, (setter)__Pyx_CyFunction_set_annotations, 0, 0}, - {(char *) "_is_coroutine", (getter)__Pyx_CyFunction_get_is_coroutine, 0, 0, 0}, - {0, 0, 0, 0, 0} -}; -static PyMemberDef __pyx_CyFunction_members[] = { - {(char *) "__module__", T_OBJECT, offsetof(PyCFunctionObject, m_module), 0, 0}, -#if CYTHON_USE_TYPE_SPECS - {(char *) "__dictoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_dict), READONLY, 0}, -#if CYTHON_METH_FASTCALL -#if CYTHON_BACKPORT_VECTORCALL - {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_vectorcall), READONLY, 0}, -#else - {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(PyCFunctionObject, vectorcall), READONLY, 0}, -#endif -#endif -#if PY_VERSION_HEX < 0x030500A0 - {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_weakreflist), READONLY, 0}, -#else - {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(PyCFunctionObject, m_weakreflist), READONLY, 0}, -#endif -#endif - {0, 0, 0, 0, 0} -}; -static PyObject * -__Pyx_CyFunction_reduce(__pyx_CyFunctionObject *m, PyObject *args) -{ - CYTHON_UNUSED_VAR(args); -#if PY_MAJOR_VERSION >= 3 - Py_INCREF(m->func_qualname); - return m->func_qualname; -#else - return PyString_FromString(((PyCFunctionObject*)m)->m_ml->ml_name); -#endif -} -static PyMethodDef __pyx_CyFunction_methods[] = { - {"__reduce__", (PyCFunction)__Pyx_CyFunction_reduce, METH_VARARGS, 0}, - {0, 0, 0, 0} -}; -#if PY_VERSION_HEX < 0x030500A0 -#define __Pyx_CyFunction_weakreflist(cyfunc) ((cyfunc)->func_weakreflist) -#else -#define __Pyx_CyFunction_weakreflist(cyfunc) (((PyCFunctionObject*)cyfunc)->m_weakreflist) -#endif -static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject *op, PyMethodDef *ml, int flags, PyObject* qualname, - PyObject *closure, PyObject *module, PyObject* 
globals, PyObject* code) { - PyCFunctionObject *cf = (PyCFunctionObject*) op; - if (unlikely(op == NULL)) - return NULL; - op->flags = flags; - __Pyx_CyFunction_weakreflist(op) = NULL; - cf->m_ml = ml; - cf->m_self = (PyObject *) op; - Py_XINCREF(closure); - op->func_closure = closure; - Py_XINCREF(module); - cf->m_module = module; - op->func_dict = NULL; - op->func_name = NULL; - Py_INCREF(qualname); - op->func_qualname = qualname; - op->func_doc = NULL; -#if PY_VERSION_HEX < 0x030900B1 - op->func_classobj = NULL; -#else - ((PyCMethodObject*)op)->mm_class = NULL; -#endif - op->func_globals = globals; - Py_INCREF(op->func_globals); - Py_XINCREF(code); - op->func_code = code; - op->defaults_pyobjects = 0; - op->defaults_size = 0; - op->defaults = NULL; - op->defaults_tuple = NULL; - op->defaults_kwdict = NULL; - op->defaults_getter = NULL; - op->func_annotations = NULL; - op->func_is_coroutine = NULL; -#if CYTHON_METH_FASTCALL - switch (ml->ml_flags & (METH_VARARGS | METH_FASTCALL | METH_NOARGS | METH_O | METH_KEYWORDS | METH_METHOD)) { - case METH_NOARGS: - __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_NOARGS; - break; - case METH_O: - __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_O; - break; - case METH_METHOD | METH_FASTCALL | METH_KEYWORDS: - __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD; - break; - case METH_FASTCALL | METH_KEYWORDS: - __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS; - break; - case METH_VARARGS | METH_KEYWORDS: - __Pyx_CyFunction_func_vectorcall(op) = NULL; - break; - default: - PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction"); - Py_DECREF(op); - return NULL; - } -#endif - return (PyObject *) op; -} -static int -__Pyx_CyFunction_clear(__pyx_CyFunctionObject *m) -{ - Py_CLEAR(m->func_closure); - Py_CLEAR(((PyCFunctionObject*)m)->m_module); - Py_CLEAR(m->func_dict); - Py_CLEAR(m->func_name); - 
Py_CLEAR(m->func_qualname); - Py_CLEAR(m->func_doc); - Py_CLEAR(m->func_globals); - Py_CLEAR(m->func_code); -#if PY_VERSION_HEX < 0x030900B1 - Py_CLEAR(__Pyx_CyFunction_GetClassObj(m)); -#else - { - PyObject *cls = (PyObject*) ((PyCMethodObject *) (m))->mm_class; - ((PyCMethodObject *) (m))->mm_class = NULL; - Py_XDECREF(cls); - } -#endif - Py_CLEAR(m->defaults_tuple); - Py_CLEAR(m->defaults_kwdict); - Py_CLEAR(m->func_annotations); - Py_CLEAR(m->func_is_coroutine); - if (m->defaults) { - PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m); - int i; - for (i = 0; i < m->defaults_pyobjects; i++) - Py_XDECREF(pydefaults[i]); - PyObject_Free(m->defaults); - m->defaults = NULL; - } - return 0; -} -static void __Pyx__CyFunction_dealloc(__pyx_CyFunctionObject *m) -{ - if (__Pyx_CyFunction_weakreflist(m) != NULL) - PyObject_ClearWeakRefs((PyObject *) m); - __Pyx_CyFunction_clear(m); - __Pyx_PyHeapTypeObject_GC_Del(m); -} -static void __Pyx_CyFunction_dealloc(__pyx_CyFunctionObject *m) -{ - PyObject_GC_UnTrack(m); - __Pyx__CyFunction_dealloc(m); -} -static int __Pyx_CyFunction_traverse(__pyx_CyFunctionObject *m, visitproc visit, void *arg) -{ - Py_VISIT(m->func_closure); - Py_VISIT(((PyCFunctionObject*)m)->m_module); - Py_VISIT(m->func_dict); - Py_VISIT(m->func_name); - Py_VISIT(m->func_qualname); - Py_VISIT(m->func_doc); - Py_VISIT(m->func_globals); - Py_VISIT(m->func_code); - Py_VISIT(__Pyx_CyFunction_GetClassObj(m)); - Py_VISIT(m->defaults_tuple); - Py_VISIT(m->defaults_kwdict); - Py_VISIT(m->func_is_coroutine); - if (m->defaults) { - PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m); - int i; - for (i = 0; i < m->defaults_pyobjects; i++) - Py_VISIT(pydefaults[i]); - } - return 0; -} -static PyObject* -__Pyx_CyFunction_repr(__pyx_CyFunctionObject *op) -{ -#if PY_MAJOR_VERSION >= 3 - return PyUnicode_FromFormat("<cyfunction %U at %p>", - op->func_qualname, (void *)op); -#else - return PyString_FromFormat("<cyfunction %s at %p>", - PyString_AsString(op->func_qualname), (void *)op); 
-#endif -} -static PyObject * __Pyx_CyFunction_CallMethod(PyObject *func, PyObject *self, PyObject *arg, PyObject *kw) { - PyCFunctionObject* f = (PyCFunctionObject*)func; - PyCFunction meth = f->m_ml->ml_meth; - Py_ssize_t size; - switch (f->m_ml->ml_flags & (METH_VARARGS | METH_KEYWORDS | METH_NOARGS | METH_O)) { - case METH_VARARGS: - if (likely(kw == NULL || PyDict_Size(kw) == 0)) - return (*meth)(self, arg); - break; - case METH_VARARGS | METH_KEYWORDS: - return (*(PyCFunctionWithKeywords)(void*)meth)(self, arg, kw); - case METH_NOARGS: - if (likely(kw == NULL || PyDict_Size(kw) == 0)) { - size = PyTuple_GET_SIZE(arg); - if (likely(size == 0)) - return (*meth)(self, NULL); - PyErr_Format(PyExc_TypeError, - "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)", - f->m_ml->ml_name, size); - return NULL; - } - break; - case METH_O: - if (likely(kw == NULL || PyDict_Size(kw) == 0)) { - size = PyTuple_GET_SIZE(arg); - if (likely(size == 1)) { - PyObject *result, *arg0; - #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - arg0 = PyTuple_GET_ITEM(arg, 0); - #else - arg0 = PySequence_ITEM(arg, 0); if (unlikely(!arg0)) return NULL; - #endif - result = (*meth)(self, arg0); - #if !(CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS) - Py_DECREF(arg0); - #endif - return result; - } - PyErr_Format(PyExc_TypeError, - "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)", - f->m_ml->ml_name, size); - return NULL; - } - break; - default: - PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction"); - return NULL; - } - PyErr_Format(PyExc_TypeError, "%.200s() takes no keyword arguments", - f->m_ml->ml_name); - return NULL; -} -static CYTHON_INLINE PyObject *__Pyx_CyFunction_Call(PyObject *func, PyObject *arg, PyObject *kw) { - return __Pyx_CyFunction_CallMethod(func, ((PyCFunctionObject*)func)->m_self, arg, kw); -} -static PyObject *__Pyx_CyFunction_CallAsMethod(PyObject *func, PyObject *args, PyObject *kw) { - 
PyObject *result; - __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *) func; -#if CYTHON_METH_FASTCALL - __pyx_vectorcallfunc vc = __Pyx_CyFunction_func_vectorcall(cyfunc); - if (vc) { -#if CYTHON_ASSUME_SAFE_MACROS - return __Pyx_PyVectorcall_FastCallDict(func, vc, &PyTuple_GET_ITEM(args, 0), (size_t)PyTuple_GET_SIZE(args), kw); -#else - (void) &__Pyx_PyVectorcall_FastCallDict; - return PyVectorcall_Call(func, args, kw); -#endif - } -#endif - if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) { - Py_ssize_t argc; - PyObject *new_args; - PyObject *self; - argc = PyTuple_GET_SIZE(args); - new_args = PyTuple_GetSlice(args, 1, argc); - if (unlikely(!new_args)) - return NULL; - self = PyTuple_GetItem(args, 0); - if (unlikely(!self)) { - Py_DECREF(new_args); -#if PY_MAJOR_VERSION > 2 - PyErr_Format(PyExc_TypeError, - "unbound method %.200S() needs an argument", - cyfunc->func_qualname); -#else - PyErr_SetString(PyExc_TypeError, - "unbound method needs an argument"); -#endif - return NULL; - } - result = __Pyx_CyFunction_CallMethod(func, self, new_args, kw); - Py_DECREF(new_args); - } else { - result = __Pyx_CyFunction_Call(func, args, kw); - } - return result; -} -#if CYTHON_METH_FASTCALL -static CYTHON_INLINE int __Pyx_CyFunction_Vectorcall_CheckArgs(__pyx_CyFunctionObject *cyfunc, Py_ssize_t nargs, PyObject *kwnames) -{ - int ret = 0; - if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) { - if (unlikely(nargs < 1)) { - PyErr_Format(PyExc_TypeError, "%.200s() needs an argument", - ((PyCFunctionObject*)cyfunc)->m_ml->ml_name); - return -1; - } - ret = 1; - } - if (unlikely(kwnames) && unlikely(PyTuple_GET_SIZE(kwnames))) { - PyErr_Format(PyExc_TypeError, - "%.200s() takes no keyword arguments", ((PyCFunctionObject*)cyfunc)->m_ml->ml_name); - return -1; - } - return ret; -} -static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, 
size_t nargsf, PyObject *kwnames) -{ - __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; - PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; -#if CYTHON_BACKPORT_VECTORCALL - Py_ssize_t nargs = (Py_ssize_t)nargsf; -#else - Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); -#endif - PyObject *self; - switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) { - case 1: - self = args[0]; - args += 1; - nargs -= 1; - break; - case 0: - self = ((PyCFunctionObject*)cyfunc)->m_self; - break; - default: - return NULL; - } - if (unlikely(nargs != 0)) { - PyErr_Format(PyExc_TypeError, - "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)", - def->ml_name, nargs); - return NULL; - } - return def->ml_meth(self, NULL); -} -static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) -{ - __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; - PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; -#if CYTHON_BACKPORT_VECTORCALL - Py_ssize_t nargs = (Py_ssize_t)nargsf; -#else - Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); -#endif - PyObject *self; - switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) { - case 1: - self = args[0]; - args += 1; - nargs -= 1; - break; - case 0: - self = ((PyCFunctionObject*)cyfunc)->m_self; - break; - default: - return NULL; - } - if (unlikely(nargs != 1)) { - PyErr_Format(PyExc_TypeError, - "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)", - def->ml_name, nargs); - return NULL; - } - return def->ml_meth(self, args[0]); -} -static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) -{ - __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; - PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; -#if CYTHON_BACKPORT_VECTORCALL - Py_ssize_t nargs = (Py_ssize_t)nargsf; -#else - Py_ssize_t nargs = 
PyVectorcall_NARGS(nargsf); -#endif - PyObject *self; - switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) { - case 1: - self = args[0]; - args += 1; - nargs -= 1; - break; - case 0: - self = ((PyCFunctionObject*)cyfunc)->m_self; - break; - default: - return NULL; - } - return ((_PyCFunctionFastWithKeywords)(void(*)(void))def->ml_meth)(self, args, nargs, kwnames); -} -static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) -{ - __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; - PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; - PyTypeObject *cls = (PyTypeObject *) __Pyx_CyFunction_GetClassObj(cyfunc); -#if CYTHON_BACKPORT_VECTORCALL - Py_ssize_t nargs = (Py_ssize_t)nargsf; -#else - Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); -#endif - PyObject *self; - switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) { - case 1: - self = args[0]; - args += 1; - nargs -= 1; - break; - case 0: - self = ((PyCFunctionObject*)cyfunc)->m_self; - break; - default: - return NULL; - } - return ((__Pyx_PyCMethod)(void(*)(void))def->ml_meth)(self, cls, args, (size_t)nargs, kwnames); -} -#endif -#if CYTHON_USE_TYPE_SPECS -static PyType_Slot __pyx_CyFunctionType_slots[] = { - {Py_tp_dealloc, (void *)__Pyx_CyFunction_dealloc}, - {Py_tp_repr, (void *)__Pyx_CyFunction_repr}, - {Py_tp_call, (void *)__Pyx_CyFunction_CallAsMethod}, - {Py_tp_traverse, (void *)__Pyx_CyFunction_traverse}, - {Py_tp_clear, (void *)__Pyx_CyFunction_clear}, - {Py_tp_methods, (void *)__pyx_CyFunction_methods}, - {Py_tp_members, (void *)__pyx_CyFunction_members}, - {Py_tp_getset, (void *)__pyx_CyFunction_getsets}, - {Py_tp_descr_get, (void *)__Pyx_PyMethod_New}, - {0, 0}, -}; -static PyType_Spec __pyx_CyFunctionType_spec = { - __PYX_TYPE_MODULE_PREFIX "cython_function_or_method", - sizeof(__pyx_CyFunctionObject), - 0, -#ifdef Py_TPFLAGS_METHOD_DESCRIPTOR - 
Py_TPFLAGS_METHOD_DESCRIPTOR | -#endif -#if (defined(_Py_TPFLAGS_HAVE_VECTORCALL) && CYTHON_METH_FASTCALL) - _Py_TPFLAGS_HAVE_VECTORCALL | -#endif - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, - __pyx_CyFunctionType_slots -}; -#else -static PyTypeObject __pyx_CyFunctionType_type = { - PyVarObject_HEAD_INIT(0, 0) - __PYX_TYPE_MODULE_PREFIX "cython_function_or_method", - sizeof(__pyx_CyFunctionObject), - 0, - (destructor) __Pyx_CyFunction_dealloc, -#if !CYTHON_METH_FASTCALL - 0, -#elif CYTHON_BACKPORT_VECTORCALL - (printfunc)offsetof(__pyx_CyFunctionObject, func_vectorcall), -#else - offsetof(PyCFunctionObject, vectorcall), -#endif - 0, - 0, -#if PY_MAJOR_VERSION < 3 - 0, -#else - 0, -#endif - (reprfunc) __Pyx_CyFunction_repr, - 0, - 0, - 0, - 0, - __Pyx_CyFunction_CallAsMethod, - 0, - 0, - 0, - 0, -#ifdef Py_TPFLAGS_METHOD_DESCRIPTOR - Py_TPFLAGS_METHOD_DESCRIPTOR | -#endif -#ifdef _Py_TPFLAGS_HAVE_VECTORCALL - _Py_TPFLAGS_HAVE_VECTORCALL | -#endif - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, - 0, - (traverseproc) __Pyx_CyFunction_traverse, - (inquiry) __Pyx_CyFunction_clear, - 0, -#if PY_VERSION_HEX < 0x030500A0 - offsetof(__pyx_CyFunctionObject, func_weakreflist), -#else - offsetof(PyCFunctionObject, m_weakreflist), -#endif - 0, - 0, - __pyx_CyFunction_methods, - __pyx_CyFunction_members, - __pyx_CyFunction_getsets, - 0, - 0, - __Pyx_PyMethod_New, - 0, - offsetof(__pyx_CyFunctionObject, func_dict), - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, -#if PY_VERSION_HEX >= 0x030400a1 - 0, -#endif -#if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800) - 0, -#endif -#if __PYX_NEED_TP_PRINT_SLOT - 0, -#endif -#if PY_VERSION_HEX >= 0x030C0000 - 0, -#endif -#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000 - 0, -#endif -}; -#endif -static int __pyx_CyFunction_init(PyObject *module) { -#if CYTHON_USE_TYPE_SPECS - __pyx_CyFunctionType = 
__Pyx_FetchCommonTypeFromSpec(module, &__pyx_CyFunctionType_spec, NULL); -#else - CYTHON_UNUSED_VAR(module); - __pyx_CyFunctionType = __Pyx_FetchCommonType(&__pyx_CyFunctionType_type); -#endif - if (unlikely(__pyx_CyFunctionType == NULL)) { - return -1; - } - return 0; -} -static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *func, size_t size, int pyobjects) { - __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; - m->defaults = PyObject_Malloc(size); - if (unlikely(!m->defaults)) - return PyErr_NoMemory(); - memset(m->defaults, 0, size); - m->defaults_pyobjects = pyobjects; - m->defaults_size = size; - return m->defaults; -} -static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *func, PyObject *tuple) { - __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; - m->defaults_tuple = tuple; - Py_INCREF(tuple); -} -static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *func, PyObject *dict) { - __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; - m->defaults_kwdict = dict; - Py_INCREF(dict); -} -static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *func, PyObject *dict) { - __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; - m->func_annotations = dict; - Py_INCREF(dict); -} - -/* CythonFunction */ -static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml, int flags, PyObject* qualname, - PyObject *closure, PyObject *module, PyObject* globals, PyObject* code) { - PyObject *op = __Pyx_CyFunction_Init( - PyObject_GC_New(__pyx_CyFunctionObject, __pyx_CyFunctionType), - ml, flags, qualname, closure, module, globals, code - ); - if (likely(op)) { - PyObject_GC_Track(op); - } - return op; -} - -/* CLineInTraceback */ -#ifndef CYTHON_CLINE_IN_TRACEBACK -static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { - PyObject *use_cline; - PyObject *ptype, *pvalue, *ptraceback; -#if CYTHON_COMPILING_IN_CPYTHON - PyObject **cython_runtime_dict; -#endif - 
CYTHON_MAYBE_UNUSED_VAR(tstate); - if (unlikely(!__pyx_cython_runtime)) { - return c_line; - } - __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); -#if CYTHON_COMPILING_IN_CPYTHON - cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); - if (likely(cython_runtime_dict)) { - __PYX_PY_DICT_LOOKUP_IF_MODIFIED( - use_cline, *cython_runtime_dict, - __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) - } else -#endif - { - PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStrNoError(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); - if (use_cline_obj) { - use_cline = PyObject_Not(use_cline_obj) ? Py_False : Py_True; - Py_DECREF(use_cline_obj); - } else { - PyErr_Clear(); - use_cline = NULL; - } - } - if (!use_cline) { - c_line = 0; - (void) PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); - } - else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { - c_line = 0; - } - __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); - return c_line; -} -#endif - -/* CodeObjectCache */ -#if !CYTHON_COMPILING_IN_LIMITED_API -static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { - int start = 0, mid = 0, end = count - 1; - if (end >= 0 && code_line > entries[end].code_line) { - return count; - } - while (start < end) { - mid = start + (end - start) / 2; - if (code_line < entries[mid].code_line) { - end = mid; - } else if (code_line > entries[mid].code_line) { - start = mid + 1; - } else { - return mid; - } - } - if (code_line <= entries[mid].code_line) { - return mid; - } else { - return mid + 1; - } -} -static PyCodeObject *__pyx_find_code_object(int code_line) { - PyCodeObject* code_object; - int pos; - if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { - return NULL; - } - pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); - if (unlikely(pos >= 
__pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { - return NULL; - } - code_object = __pyx_code_cache.entries[pos].code_object; - Py_INCREF(code_object); - return code_object; -} -static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { - int pos, i; - __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; - if (unlikely(!code_line)) { - return; - } - if (unlikely(!entries)) { - entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); - if (likely(entries)) { - __pyx_code_cache.entries = entries; - __pyx_code_cache.max_count = 64; - __pyx_code_cache.count = 1; - entries[0].code_line = code_line; - entries[0].code_object = code_object; - Py_INCREF(code_object); - } - return; - } - pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); - if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { - PyCodeObject* tmp = entries[pos].code_object; - entries[pos].code_object = code_object; - Py_DECREF(tmp); - return; - } - if (__pyx_code_cache.count == __pyx_code_cache.max_count) { - int new_max = __pyx_code_cache.max_count + 64; - entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( - __pyx_code_cache.entries, ((size_t)new_max) * sizeof(__Pyx_CodeObjectCacheEntry)); - if (unlikely(!entries)) { - return; - } - __pyx_code_cache.entries = entries; - __pyx_code_cache.max_count = new_max; - } - for (i=__pyx_code_cache.count; i>pos; i--) { - entries[i] = entries[i-1]; - } - entries[pos].code_line = code_line; - entries[pos].code_object = code_object; - __pyx_code_cache.count++; - Py_INCREF(code_object); -} -#endif - -/* AddTraceback */ -#include "compile.h" -#include "frameobject.h" -#include "traceback.h" -#if PY_VERSION_HEX >= 0x030b00a6 - #ifndef Py_BUILD_CORE - #define Py_BUILD_CORE 1 - #endif - #include "internal/pycore_frame.h" -#endif -#if CYTHON_COMPILING_IN_LIMITED_API -static 
void __Pyx_AddTraceback(const char *funcname, int c_line, - int py_line, const char *filename) { - if (c_line) { - (void) __pyx_cfilenm; - (void) __Pyx_CLineForTraceback(__Pyx_PyThreadState_Current, c_line); - } - _PyTraceback_Add(funcname, filename, py_line); -} -#else -static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( - const char *funcname, int c_line, - int py_line, const char *filename) { - PyCodeObject *py_code = NULL; - PyObject *py_funcname = NULL; - #if PY_MAJOR_VERSION < 3 - PyObject *py_srcfile = NULL; - py_srcfile = PyString_FromString(filename); - if (!py_srcfile) goto bad; - #endif - if (c_line) { - #if PY_MAJOR_VERSION < 3 - py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); - if (!py_funcname) goto bad; - #else - py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); - if (!py_funcname) goto bad; - funcname = PyUnicode_AsUTF8(py_funcname); - if (!funcname) goto bad; - #endif - } - else { - #if PY_MAJOR_VERSION < 3 - py_funcname = PyString_FromString(funcname); - if (!py_funcname) goto bad; - #endif - } - #if PY_MAJOR_VERSION < 3 - py_code = __Pyx_PyCode_New( - 0, - 0, - 0, - 0, - 0, - 0, - __pyx_empty_bytes, /*PyObject *code,*/ - __pyx_empty_tuple, /*PyObject *consts,*/ - __pyx_empty_tuple, /*PyObject *names,*/ - __pyx_empty_tuple, /*PyObject *varnames,*/ - __pyx_empty_tuple, /*PyObject *freevars,*/ - __pyx_empty_tuple, /*PyObject *cellvars,*/ - py_srcfile, /*PyObject *filename,*/ - py_funcname, /*PyObject *name,*/ - py_line, - __pyx_empty_bytes /*PyObject *lnotab*/ - ); - Py_DECREF(py_srcfile); - #else - py_code = PyCode_NewEmpty(filename, funcname, py_line); - #endif - Py_XDECREF(py_funcname); // XDECREF since it's only set on Py3 if cline - return py_code; -bad: - Py_XDECREF(py_funcname); - #if PY_MAJOR_VERSION < 3 - Py_XDECREF(py_srcfile); - #endif - return NULL; -} -static void __Pyx_AddTraceback(const char *funcname, int c_line, - int py_line, const char *filename) { - 
PyCodeObject *py_code = 0; - PyFrameObject *py_frame = 0; - PyThreadState *tstate = __Pyx_PyThreadState_Current; - PyObject *ptype, *pvalue, *ptraceback; - if (c_line) { - c_line = __Pyx_CLineForTraceback(tstate, c_line); - } - py_code = __pyx_find_code_object(c_line ? -c_line : py_line); - if (!py_code) { - __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); - py_code = __Pyx_CreateCodeObjectForTraceback( - funcname, c_line, py_line, filename); - if (!py_code) { - /* If the code object creation fails, then we should clear the - fetched exception references and propagate the new exception */ - Py_XDECREF(ptype); - Py_XDECREF(pvalue); - Py_XDECREF(ptraceback); - goto bad; - } - __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); - __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); - } - py_frame = PyFrame_New( - tstate, /*PyThreadState *tstate,*/ - py_code, /*PyCodeObject *code,*/ - __pyx_d, /*PyObject *globals,*/ - 0 /*PyObject *locals*/ - ); - if (!py_frame) goto bad; - __Pyx_PyFrame_SetLineNumber(py_frame, py_line); - PyTraceBack_Here(py_frame); -bad: - Py_XDECREF(py_code); - Py_XDECREF(py_frame); -} -#endif - -/* Declarations */ -#if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) - #ifdef __cplusplus - static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { - return ::std::complex< double >(x, y); - } - #else - static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { - return x + y*(__pyx_t_double_complex)_Complex_I; - } - #endif -#else - static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { - __pyx_t_double_complex z; - z.real = x; - z.imag = y; - return z; - } -#endif - -/* Arithmetic */ -#if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) -#else - static CYTHON_INLINE int __Pyx_c_eq_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - return (a.real == b.real) && (a.imag == b.imag); - } - 
static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_sum_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - __pyx_t_double_complex z; - z.real = a.real + b.real; - z.imag = a.imag + b.imag; - return z; - } - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_diff_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - __pyx_t_double_complex z; - z.real = a.real - b.real; - z.imag = a.imag - b.imag; - return z; - } - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_prod_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - __pyx_t_double_complex z; - z.real = a.real * b.real - a.imag * b.imag; - z.imag = a.real * b.imag + a.imag * b.real; - return z; - } - #if 1 - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - if (b.imag == 0) { - return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.real); - } else if (fabs(b.real) >= fabs(b.imag)) { - if (b.real == 0 && b.imag == 0) { - return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.imag); - } else { - double r = b.imag / b.real; - double s = (double)(1.0) / (b.real + b.imag * r); - return __pyx_t_double_complex_from_parts( - (a.real + a.imag * r) * s, (a.imag - a.real * r) * s); - } - } else { - double r = b.real / b.imag; - double s = (double)(1.0) / (b.imag + b.real * r); - return __pyx_t_double_complex_from_parts( - (a.real * r + a.imag) * s, (a.imag * r - a.real) * s); - } - } - #else - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - if (b.imag == 0) { - return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.real); - } else { - double denom = b.real * b.real + b.imag * b.imag; - return __pyx_t_double_complex_from_parts( - (a.real * b.real + a.imag * b.imag) / denom, - (a.imag * b.real - a.real * b.imag) / denom); - } - } - #endif - static CYTHON_INLINE __pyx_t_double_complex 
__Pyx_c_neg_double(__pyx_t_double_complex a) { - __pyx_t_double_complex z; - z.real = -a.real; - z.imag = -a.imag; - return z; - } - static CYTHON_INLINE int __Pyx_c_is_zero_double(__pyx_t_double_complex a) { - return (a.real == 0) && (a.imag == 0); - } - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_conj_double(__pyx_t_double_complex a) { - __pyx_t_double_complex z; - z.real = a.real; - z.imag = -a.imag; - return z; - } - #if 1 - static CYTHON_INLINE double __Pyx_c_abs_double(__pyx_t_double_complex z) { - #if !defined(HAVE_HYPOT) || defined(_MSC_VER) - return sqrt(z.real*z.real + z.imag*z.imag); - #else - return hypot(z.real, z.imag); - #endif - } - static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_pow_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { - __pyx_t_double_complex z; - double r, lnr, theta, z_r, z_theta; - if (b.imag == 0 && b.real == (int)b.real) { - if (b.real < 0) { - double denom = a.real * a.real + a.imag * a.imag; - a.real = a.real / denom; - a.imag = -a.imag / denom; - b.real = -b.real; - } - switch ((int)b.real) { - case 0: - z.real = 1; - z.imag = 0; - return z; - case 1: - return a; - case 2: - return __Pyx_c_prod_double(a, a); - case 3: - z = __Pyx_c_prod_double(a, a); - return __Pyx_c_prod_double(z, a); - case 4: - z = __Pyx_c_prod_double(a, a); - return __Pyx_c_prod_double(z, z); - } - } - if (a.imag == 0) { - if (a.real == 0) { - return a; - } else if ((b.imag == 0) && (a.real >= 0)) { - z.real = pow(a.real, b.real); - z.imag = 0; - return z; - } else if (a.real > 0) { - r = a.real; - theta = 0; - } else { - r = -a.real; - theta = atan2(0.0, -1.0); - } - } else { - r = __Pyx_c_abs_double(a); - theta = atan2(a.imag, a.real); - } - lnr = log(r); - z_r = exp(lnr * b.real - theta * b.imag); - z_theta = theta * b.real + lnr * b.imag; - z.real = z_r * cos(z_theta); - z.imag = z_r * sin(z_theta); - return z; - } - #endif -#endif - -/* FromPy */ -static __pyx_t_double_complex __Pyx_PyComplex_As___pyx_t_double_complex(PyObject* 
o) { - Py_complex cval; -#if !CYTHON_COMPILING_IN_PYPY - if (PyComplex_CheckExact(o)) - cval = ((PyComplexObject *)o)->cval; - else -#endif - cval = PyComplex_AsCComplex(o); - return __pyx_t_double_complex_from_parts( - (double)cval.real, - (double)cval.imag); -} - -/* CIntFromPyVerify */ -#define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ - __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) -#define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ - __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) -#define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ - {\ - func_type value = func_value;\ - if (sizeof(target_type) < sizeof(func_type)) {\ - if (unlikely(value != (func_type) (target_type) value)) {\ - func_type zero = 0;\ - if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ - return (target_type) -1;\ - if (is_unsigned && unlikely(value < zero))\ - goto raise_neg_overflow;\ - else\ - goto raise_overflow;\ - }\ - }\ - return (target_type) value;\ - } - -/* CIntFromPy */ -static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic push -#pragma GCC diagnostic ignored "-Wconversion" -#endif - const int neg_one = (int) -1, const_zero = (int) 0; -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic pop -#endif - const int is_unsigned = neg_one > const_zero; -#if PY_MAJOR_VERSION < 3 - if (likely(PyInt_Check(x))) { - if ((sizeof(int) < sizeof(long))) { - __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) - } else { - long val = PyInt_AS_LONG(x); - if (is_unsigned && unlikely(val < 0)) { - goto raise_neg_overflow; - } - return (int) val; - } - } else -#endif - if (likely(PyLong_Check(x))) { - if (is_unsigned) { -#if CYTHON_USE_PYLONG_INTERNALS - if (unlikely(__Pyx_PyLong_IsNeg(x))) { - goto raise_neg_overflow; - } else if (__Pyx_PyLong_IsCompact(x)) { - __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_upylong, 
__Pyx_PyLong_CompactValueUnsigned(x)) - } else { - const digit* digits = __Pyx_PyLong_Digits(x); - assert(__Pyx_PyLong_DigitCount(x) > 1); - switch (__Pyx_PyLong_DigitCount(x)) { - case 2: - if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) >= 2 * PyLong_SHIFT)) { - return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); - } - } - break; - case 3: - if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) >= 3 * PyLong_SHIFT)) { - return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); - } - } - break; - case 4: - if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) >= 4 * PyLong_SHIFT)) { - return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); - } - } - break; - } - } -#endif -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7 - if (unlikely(Py_SIZE(x) < 0)) { - goto raise_neg_overflow; - } -#else - { - int result = PyObject_RichCompareBool(x, Py_False, Py_LT); - if (unlikely(result < 0)) - return (int) -1; - if (unlikely(result == 1)) - goto raise_neg_overflow; - } -#endif - if ((sizeof(int) <= sizeof(unsigned long))) { - __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, 
PyLong_AsUnsignedLong(x)) -#ifdef HAVE_LONG_LONG - } else if ((sizeof(int) <= sizeof(unsigned PY_LONG_LONG))) { - __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) -#endif - } - } else { -#if CYTHON_USE_PYLONG_INTERNALS - if (__Pyx_PyLong_IsCompact(x)) { - __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x)) - } else { - const digit* digits = __Pyx_PyLong_Digits(x); - assert(__Pyx_PyLong_DigitCount(x) > 1); - switch (__Pyx_PyLong_SignedDigitCount(x)) { - case -2: - if ((8 * sizeof(int) - 1 > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { - return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - case 2: - if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { - return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - case -3: - if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { - return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - case 3: - if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | 
(unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { - return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - case -4: - if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) { - return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - case 4: - if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) { - return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); - } - } - break; - } - } -#endif - if ((sizeof(int) <= sizeof(long))) { - __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) -#ifdef HAVE_LONG_LONG - } else if ((sizeof(int) <= sizeof(PY_LONG_LONG))) { - __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) -#endif - } - } - { - int val; - PyObject *v = __Pyx_PyNumber_IntOrLong(x); -#if PY_MAJOR_VERSION < 3 - if (likely(v) && !PyLong_Check(v)) { - PyObject *tmp = v; - v = PyNumber_Long(tmp); - Py_DECREF(tmp); - } -#endif - if (likely(v)) { - int ret = -1; -#if !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray) - int one 
= 1; int is_little = (int)*(unsigned char *)&one; - unsigned char *bytes = (unsigned char *)&val; - ret = _PyLong_AsByteArray((PyLongObject *)v, - bytes, sizeof(val), - is_little, !is_unsigned); -#else - PyObject *stepval = NULL, *mask = NULL, *shift = NULL; - int bits, remaining_bits, is_negative = 0; - long idigit; - int chunk_size = (sizeof(long) < 8) ? 30 : 62; - if (unlikely(!PyLong_CheckExact(v))) { - PyObject *tmp = v; - v = PyNumber_Long(v); - assert(PyLong_CheckExact(v)); - Py_DECREF(tmp); - if (unlikely(!v)) return (int) -1; - } -#if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - if (Py_SIZE(x) == 0) - return (int) 0; - is_negative = Py_SIZE(x) < 0; -#else - { - int result = PyObject_RichCompareBool(x, Py_False, Py_LT); - if (unlikely(result < 0)) - return (int) -1; - is_negative = result == 1; - } -#endif - if (is_unsigned && unlikely(is_negative)) { - goto raise_neg_overflow; - } else if (is_negative) { - stepval = PyNumber_Invert(v); - if (unlikely(!stepval)) - return (int) -1; - } else { - stepval = __Pyx_NewRef(v); - } - val = (int) 0; - mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done; - shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done; - for (bits = 0; bits < (int) sizeof(int) * 8 - chunk_size; bits += chunk_size) { - PyObject *tmp, *digit; - digit = PyNumber_And(stepval, mask); - if (unlikely(!digit)) goto done; - idigit = PyLong_AsLong(digit); - Py_DECREF(digit); - if (unlikely(idigit < 0)) goto done; - tmp = PyNumber_Rshift(stepval, shift); - if (unlikely(!tmp)) goto done; - Py_DECREF(stepval); stepval = tmp; - val |= ((int) idigit) << bits; - #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - if (Py_SIZE(stepval) == 0) - goto unpacking_done; - #endif - } - idigit = PyLong_AsLong(stepval); - if (unlikely(idigit < 0)) goto done; - remaining_bits = ((int) sizeof(int) * 8) - bits - (is_unsigned ? 
0 : 1); - if (unlikely(idigit >= (1L << remaining_bits))) - goto raise_overflow; - val |= ((int) idigit) << bits; - #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - unpacking_done: - #endif - if (!is_unsigned) { - if (unlikely(val & (((int) 1) << (sizeof(int) * 8 - 1)))) - goto raise_overflow; - if (is_negative) - val = ~val; - } - ret = 0; - done: - Py_XDECREF(shift); - Py_XDECREF(mask); - Py_XDECREF(stepval); -#endif - Py_DECREF(v); - if (likely(!ret)) - return val; - } - return (int) -1; - } - } else { - int val; - PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); - if (!tmp) return (int) -1; - val = __Pyx_PyInt_As_int(tmp); - Py_DECREF(tmp); - return val; - } -raise_overflow: - PyErr_SetString(PyExc_OverflowError, - "value too large to convert to int"); - return (int) -1; -raise_neg_overflow: - PyErr_SetString(PyExc_OverflowError, - "can't convert negative value to int"); - return (int) -1; -} - -/* CIntToPy */ -static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic push -#pragma GCC diagnostic ignored "-Wconversion" -#endif - const long neg_one = (long) -1, const_zero = (long) 0; -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic pop -#endif - const int is_unsigned = neg_one > const_zero; - if (is_unsigned) { - if (sizeof(long) < sizeof(long)) { - return PyInt_FromLong((long) value); - } else if (sizeof(long) <= sizeof(unsigned long)) { - return PyLong_FromUnsignedLong((unsigned long) value); -#ifdef HAVE_LONG_LONG - } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { - return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); -#endif - } - } else { - if (sizeof(long) <= sizeof(long)) { - return PyInt_FromLong((long) value); -#ifdef HAVE_LONG_LONG - } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { - return PyLong_FromLongLong((PY_LONG_LONG) value); -#endif - } - } - { - int one = 1; int little = (int)*(unsigned char *)&one; - unsigned char *bytes = 
(unsigned char *)&value; - return _PyLong_FromByteArray(bytes, sizeof(long), - little, !is_unsigned); - } -} - -/* CIntToPy */ -static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value) { -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic push -#pragma GCC diagnostic ignored "-Wconversion" -#endif - const int neg_one = (int) -1, const_zero = (int) 0; -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic pop -#endif - const int is_unsigned = neg_one > const_zero; - if (is_unsigned) { - if (sizeof(int) < sizeof(long)) { - return PyInt_FromLong((long) value); - } else if (sizeof(int) <= sizeof(unsigned long)) { - return PyLong_FromUnsignedLong((unsigned long) value); -#ifdef HAVE_LONG_LONG - } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { - return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); -#endif - } - } else { - if (sizeof(int) <= sizeof(long)) { - return PyInt_FromLong((long) value); -#ifdef HAVE_LONG_LONG - } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { - return PyLong_FromLongLong((PY_LONG_LONG) value); -#endif - } - } - { - int one = 1; int little = (int)*(unsigned char *)&one; - unsigned char *bytes = (unsigned char *)&value; - return _PyLong_FromByteArray(bytes, sizeof(int), - little, !is_unsigned); - } -} - -/* CIntFromPy */ -static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic push -#pragma GCC diagnostic ignored "-Wconversion" -#endif - const long neg_one = (long) -1, const_zero = (long) 0; -#ifdef __Pyx_HAS_GCC_DIAGNOSTIC -#pragma GCC diagnostic pop -#endif - const int is_unsigned = neg_one > const_zero; -#if PY_MAJOR_VERSION < 3 - if (likely(PyInt_Check(x))) { - if ((sizeof(long) < sizeof(long))) { - __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) - } else { - long val = PyInt_AS_LONG(x); - if (is_unsigned && unlikely(val < 0)) { - goto raise_neg_overflow; - } - return (long) val; - } - } else -#endif - if (likely(PyLong_Check(x))) { - if 
(is_unsigned) { -#if CYTHON_USE_PYLONG_INTERNALS - if (unlikely(__Pyx_PyLong_IsNeg(x))) { - goto raise_neg_overflow; - } else if (__Pyx_PyLong_IsCompact(x)) { - __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x)) - } else { - const digit* digits = __Pyx_PyLong_Digits(x); - assert(__Pyx_PyLong_DigitCount(x) > 1); - switch (__Pyx_PyLong_DigitCount(x)) { - case 2: - if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) >= 2 * PyLong_SHIFT)) { - return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); - } - } - break; - case 3: - if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) >= 3 * PyLong_SHIFT)) { - return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); - } - } - break; - case 4: - if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) >= 4 * PyLong_SHIFT)) { - return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); - } - } - break; - } - } -#endif -#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7 - if (unlikely(Py_SIZE(x) < 0)) { - goto raise_neg_overflow; - } -#else - { - int result = PyObject_RichCompareBool(x, 
Py_False, Py_LT); - if (unlikely(result < 0)) - return (long) -1; - if (unlikely(result == 1)) - goto raise_neg_overflow; - } -#endif - if ((sizeof(long) <= sizeof(unsigned long))) { - __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) -#ifdef HAVE_LONG_LONG - } else if ((sizeof(long) <= sizeof(unsigned PY_LONG_LONG))) { - __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) -#endif - } - } else { -#if CYTHON_USE_PYLONG_INTERNALS - if (__Pyx_PyLong_IsCompact(x)) { - __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x)) - } else { - const digit* digits = __Pyx_PyLong_Digits(x); - assert(__Pyx_PyLong_DigitCount(x) > 1); - switch (__Pyx_PyLong_SignedDigitCount(x)) { - case -2: - if ((8 * sizeof(long) - 1 > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { - return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - case 2: - if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { - return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - case -3: - if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { - return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << 
PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - case 3: - if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { - return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - case -4: - if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) { - return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - case 4: - if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) { - if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { - __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) - } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) { - return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); - } - } - break; - } - } -#endif - if ((sizeof(long) <= sizeof(long))) { - __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) -#ifdef HAVE_LONG_LONG - } else if ((sizeof(long) <= sizeof(PY_LONG_LONG))) { - __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) -#endif - } - } - { - long val; - PyObject *v = 
__Pyx_PyNumber_IntOrLong(x); -#if PY_MAJOR_VERSION < 3 - if (likely(v) && !PyLong_Check(v)) { - PyObject *tmp = v; - v = PyNumber_Long(tmp); - Py_DECREF(tmp); - } -#endif - if (likely(v)) { - int ret = -1; -#if !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray) - int one = 1; int is_little = (int)*(unsigned char *)&one; - unsigned char *bytes = (unsigned char *)&val; - ret = _PyLong_AsByteArray((PyLongObject *)v, - bytes, sizeof(val), - is_little, !is_unsigned); -#else - PyObject *stepval = NULL, *mask = NULL, *shift = NULL; - int bits, remaining_bits, is_negative = 0; - long idigit; - int chunk_size = (sizeof(long) < 8) ? 30 : 62; - if (unlikely(!PyLong_CheckExact(v))) { - PyObject *tmp = v; - v = PyNumber_Long(v); - assert(PyLong_CheckExact(v)); - Py_DECREF(tmp); - if (unlikely(!v)) return (long) -1; - } -#if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - if (Py_SIZE(x) == 0) - return (long) 0; - is_negative = Py_SIZE(x) < 0; -#else - { - int result = PyObject_RichCompareBool(x, Py_False, Py_LT); - if (unlikely(result < 0)) - return (long) -1; - is_negative = result == 1; - } -#endif - if (is_unsigned && unlikely(is_negative)) { - goto raise_neg_overflow; - } else if (is_negative) { - stepval = PyNumber_Invert(v); - if (unlikely(!stepval)) - return (long) -1; - } else { - stepval = __Pyx_NewRef(v); - } - val = (long) 0; - mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done; - shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done; - for (bits = 0; bits < (int) sizeof(long) * 8 - chunk_size; bits += chunk_size) { - PyObject *tmp, *digit; - digit = PyNumber_And(stepval, mask); - if (unlikely(!digit)) goto done; - idigit = PyLong_AsLong(digit); - Py_DECREF(digit); - if (unlikely(idigit < 0)) goto done; - tmp = PyNumber_Rshift(stepval, shift); - if (unlikely(!tmp)) goto done; - Py_DECREF(stepval); stepval = tmp; - val |= ((long) idigit) << bits; - #if 
CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - if (Py_SIZE(stepval) == 0) - goto unpacking_done; - #endif - } - idigit = PyLong_AsLong(stepval); - if (unlikely(idigit < 0)) goto done; - remaining_bits = ((int) sizeof(long) * 8) - bits - (is_unsigned ? 0 : 1); - if (unlikely(idigit >= (1L << remaining_bits))) - goto raise_overflow; - val |= ((long) idigit) << bits; - #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000 - unpacking_done: - #endif - if (!is_unsigned) { - if (unlikely(val & (((long) 1) << (sizeof(long) * 8 - 1)))) - goto raise_overflow; - if (is_negative) - val = ~val; - } - ret = 0; - done: - Py_XDECREF(shift); - Py_XDECREF(mask); - Py_XDECREF(stepval); -#endif - Py_DECREF(v); - if (likely(!ret)) - return val; - } - return (long) -1; - } - } else { - long val; - PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); - if (!tmp) return (long) -1; - val = __Pyx_PyInt_As_long(tmp); - Py_DECREF(tmp); - return val; - } -raise_overflow: - PyErr_SetString(PyExc_OverflowError, - "value too large to convert to long"); - return (long) -1; -raise_neg_overflow: - PyErr_SetString(PyExc_OverflowError, - "can't convert negative value to long"); - return (long) -1; -} - -/* FormatTypeName */ -#if CYTHON_COMPILING_IN_LIMITED_API -static __Pyx_TypeName -__Pyx_PyType_GetName(PyTypeObject* tp) -{ - PyObject *name = __Pyx_PyObject_GetAttrStr((PyObject *)tp, - __pyx_n_s_name); - if (unlikely(name == NULL) || unlikely(!PyUnicode_Check(name))) { - PyErr_Clear(); - Py_XSETREF(name, __Pyx_NewRef(__pyx_n_s__9)); - } - return name; -} -#endif - -/* SwapException */ -#if CYTHON_FAST_THREAD_STATE -static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { - PyObject *tmp_type, *tmp_value, *tmp_tb; - #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4 - _PyErr_StackItem *exc_info = tstate->exc_info; - tmp_value = exc_info->exc_value; - exc_info->exc_value = *value; - if (tmp_value == NULL 
|| tmp_value == Py_None) { - Py_XDECREF(tmp_value); - tmp_value = NULL; - tmp_type = NULL; - tmp_tb = NULL; - } else { - tmp_type = (PyObject*) Py_TYPE(tmp_value); - Py_INCREF(tmp_type); - #if CYTHON_COMPILING_IN_CPYTHON - tmp_tb = ((PyBaseExceptionObject*) tmp_value)->traceback; - Py_XINCREF(tmp_tb); - #else - tmp_tb = PyException_GetTraceback(tmp_value); - #endif - } - #elif CYTHON_USE_EXC_INFO_STACK - _PyErr_StackItem *exc_info = tstate->exc_info; - tmp_type = exc_info->exc_type; - tmp_value = exc_info->exc_value; - tmp_tb = exc_info->exc_traceback; - exc_info->exc_type = *type; - exc_info->exc_value = *value; - exc_info->exc_traceback = *tb; - #else - tmp_type = tstate->exc_type; - tmp_value = tstate->exc_value; - tmp_tb = tstate->exc_traceback; - tstate->exc_type = *type; - tstate->exc_value = *value; - tstate->exc_traceback = *tb; - #endif - *type = tmp_type; - *value = tmp_value; - *tb = tmp_tb; -} -#else -static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb) { - PyObject *tmp_type, *tmp_value, *tmp_tb; - PyErr_GetExcInfo(&tmp_type, &tmp_value, &tmp_tb); - PyErr_SetExcInfo(*type, *value, *tb); - *type = tmp_type; - *value = tmp_value; - *tb = tmp_tb; -} -#endif - -/* PyObjectCall2Args */ -static CYTHON_INLINE PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) { - PyObject *args[3] = {NULL, arg1, arg2}; - return __Pyx_PyObject_FastCall(function, args+1, 2 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET); -} - -/* PyObjectCallMethod1 */ -static PyObject* __Pyx__PyObject_CallMethod1(PyObject* method, PyObject* arg) { - PyObject *result = __Pyx_PyObject_CallOneArg(method, arg); - Py_DECREF(method); - return result; -} -static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg) { - PyObject *method = NULL, *result; - int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method); - if (likely(is_method)) { - result = __Pyx_PyObject_Call2Args(method, 
obj, arg); - Py_DECREF(method); - return result; - } - if (unlikely(!method)) return NULL; - return __Pyx__PyObject_CallMethod1(method, arg); -} - -/* CoroutineBase */ -#include -#if PY_VERSION_HEX >= 0x030b00a6 - #ifndef Py_BUILD_CORE - #define Py_BUILD_CORE 1 - #endif - #include "internal/pycore_frame.h" -#endif -#define __Pyx_Coroutine_Undelegate(gen) Py_CLEAR((gen)->yieldfrom) -static int __Pyx_PyGen__FetchStopIterationValue(PyThreadState *__pyx_tstate, PyObject **pvalue) { - PyObject *et, *ev, *tb; - PyObject *value = NULL; - CYTHON_UNUSED_VAR(__pyx_tstate); - __Pyx_ErrFetch(&et, &ev, &tb); - if (!et) { - Py_XDECREF(tb); - Py_XDECREF(ev); - Py_INCREF(Py_None); - *pvalue = Py_None; - return 0; - } - if (likely(et == PyExc_StopIteration)) { - if (!ev) { - Py_INCREF(Py_None); - value = Py_None; - } -#if PY_VERSION_HEX >= 0x030300A0 - else if (likely(__Pyx_IS_TYPE(ev, (PyTypeObject*)PyExc_StopIteration))) { - value = ((PyStopIterationObject *)ev)->value; - Py_INCREF(value); - Py_DECREF(ev); - } -#endif - else if (unlikely(PyTuple_Check(ev))) { - if (PyTuple_GET_SIZE(ev) >= 1) { -#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - value = PyTuple_GET_ITEM(ev, 0); - Py_INCREF(value); -#else - value = PySequence_ITEM(ev, 0); -#endif - } else { - Py_INCREF(Py_None); - value = Py_None; - } - Py_DECREF(ev); - } - else if (!__Pyx_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration)) { - value = ev; - } - if (likely(value)) { - Py_XDECREF(tb); - Py_DECREF(et); - *pvalue = value; - return 0; - } - } else if (!__Pyx_PyErr_GivenExceptionMatches(et, PyExc_StopIteration)) { - __Pyx_ErrRestore(et, ev, tb); - return -1; - } - PyErr_NormalizeException(&et, &ev, &tb); - if (unlikely(!PyObject_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration))) { - __Pyx_ErrRestore(et, ev, tb); - return -1; - } - Py_XDECREF(tb); - Py_DECREF(et); -#if PY_VERSION_HEX >= 0x030300A0 - value = ((PyStopIterationObject *)ev)->value; - Py_INCREF(value); - Py_DECREF(ev); -#else - { - PyObject* args 
= __Pyx_PyObject_GetAttrStr(ev, __pyx_n_s_args); - Py_DECREF(ev); - if (likely(args)) { - value = PySequence_GetItem(args, 0); - Py_DECREF(args); - } - if (unlikely(!value)) { - __Pyx_ErrRestore(NULL, NULL, NULL); - Py_INCREF(Py_None); - value = Py_None; - } - } -#endif - *pvalue = value; - return 0; -} -static CYTHON_INLINE -void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *exc_state) { -#if PY_VERSION_HEX >= 0x030B00a4 - Py_CLEAR(exc_state->exc_value); -#else - PyObject *t, *v, *tb; - t = exc_state->exc_type; - v = exc_state->exc_value; - tb = exc_state->exc_traceback; - exc_state->exc_type = NULL; - exc_state->exc_value = NULL; - exc_state->exc_traceback = NULL; - Py_XDECREF(t); - Py_XDECREF(v); - Py_XDECREF(tb); -#endif -} -#define __Pyx_Coroutine_AlreadyRunningError(gen) (__Pyx__Coroutine_AlreadyRunningError(gen), (PyObject*)NULL) -static void __Pyx__Coroutine_AlreadyRunningError(__pyx_CoroutineObject *gen) { - const char *msg; - CYTHON_MAYBE_UNUSED_VAR(gen); - if ((0)) { - #ifdef __Pyx_Coroutine_USED - } else if (__Pyx_Coroutine_Check((PyObject*)gen)) { - msg = "coroutine already executing"; - #endif - #ifdef __Pyx_AsyncGen_USED - } else if (__Pyx_AsyncGen_CheckExact((PyObject*)gen)) { - msg = "async generator already executing"; - #endif - } else { - msg = "generator already executing"; - } - PyErr_SetString(PyExc_ValueError, msg); -} -#define __Pyx_Coroutine_NotStartedError(gen) (__Pyx__Coroutine_NotStartedError(gen), (PyObject*)NULL) -static void __Pyx__Coroutine_NotStartedError(PyObject *gen) { - const char *msg; - CYTHON_MAYBE_UNUSED_VAR(gen); - if ((0)) { - #ifdef __Pyx_Coroutine_USED - } else if (__Pyx_Coroutine_Check(gen)) { - msg = "can't send non-None value to a just-started coroutine"; - #endif - #ifdef __Pyx_AsyncGen_USED - } else if (__Pyx_AsyncGen_CheckExact(gen)) { - msg = "can't send non-None value to a just-started async generator"; - #endif - } else { - msg = "can't send non-None value to a just-started generator"; - } - 
PyErr_SetString(PyExc_TypeError, msg); -} -#define __Pyx_Coroutine_AlreadyTerminatedError(gen, value, closing) (__Pyx__Coroutine_AlreadyTerminatedError(gen, value, closing), (PyObject*)NULL) -static void __Pyx__Coroutine_AlreadyTerminatedError(PyObject *gen, PyObject *value, int closing) { - CYTHON_MAYBE_UNUSED_VAR(gen); - CYTHON_MAYBE_UNUSED_VAR(closing); - #ifdef __Pyx_Coroutine_USED - if (!closing && __Pyx_Coroutine_Check(gen)) { - PyErr_SetString(PyExc_RuntimeError, "cannot reuse already awaited coroutine"); - } else - #endif - if (value) { - #ifdef __Pyx_AsyncGen_USED - if (__Pyx_AsyncGen_CheckExact(gen)) - PyErr_SetNone(__Pyx_PyExc_StopAsyncIteration); - else - #endif - PyErr_SetNone(PyExc_StopIteration); - } -} -static -PyObject *__Pyx_Coroutine_SendEx(__pyx_CoroutineObject *self, PyObject *value, int closing) { - __Pyx_PyThreadState_declare - PyThreadState *tstate; - __Pyx_ExcInfoStruct *exc_state; - PyObject *retval; - assert(!self->is_running); - if (unlikely(self->resume_label == 0)) { - if (unlikely(value && value != Py_None)) { - return __Pyx_Coroutine_NotStartedError((PyObject*)self); - } - } - if (unlikely(self->resume_label == -1)) { - return __Pyx_Coroutine_AlreadyTerminatedError((PyObject*)self, value, closing); - } -#if CYTHON_FAST_THREAD_STATE - __Pyx_PyThreadState_assign - tstate = __pyx_tstate; -#else - tstate = __Pyx_PyThreadState_Current; -#endif - exc_state = &self->gi_exc_state; - if (exc_state->exc_value) { - #if CYTHON_COMPILING_IN_PYPY - #else - PyObject *exc_tb; - #if PY_VERSION_HEX >= 0x030B00a4 && !CYTHON_COMPILING_IN_CPYTHON - exc_tb = PyException_GetTraceback(exc_state->exc_value); - #elif PY_VERSION_HEX >= 0x030B00a4 - exc_tb = ((PyBaseExceptionObject*) exc_state->exc_value)->traceback; - #else - exc_tb = exc_state->exc_traceback; - #endif - if (exc_tb) { - PyTracebackObject *tb = (PyTracebackObject *) exc_tb; - PyFrameObject *f = tb->tb_frame; - assert(f->f_back == NULL); - #if PY_VERSION_HEX >= 0x030B00A1 - f->f_back = 
PyThreadState_GetFrame(tstate); - #else - Py_XINCREF(tstate->frame); - f->f_back = tstate->frame; - #endif - #if PY_VERSION_HEX >= 0x030B00a4 && !CYTHON_COMPILING_IN_CPYTHON - Py_DECREF(exc_tb); - #endif - } - #endif - } -#if CYTHON_USE_EXC_INFO_STACK - exc_state->previous_item = tstate->exc_info; - tstate->exc_info = exc_state; -#else - if (exc_state->exc_type) { - __Pyx_ExceptionSwap(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback); - } else { - __Pyx_Coroutine_ExceptionClear(exc_state); - __Pyx_ExceptionSave(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback); - } -#endif - self->is_running = 1; - retval = self->body(self, tstate, value); - self->is_running = 0; -#if CYTHON_USE_EXC_INFO_STACK - exc_state = &self->gi_exc_state; - tstate->exc_info = exc_state->previous_item; - exc_state->previous_item = NULL; - __Pyx_Coroutine_ResetFrameBackpointer(exc_state); -#endif - return retval; -} -static CYTHON_INLINE void __Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state) { -#if CYTHON_COMPILING_IN_PYPY - CYTHON_UNUSED_VAR(exc_state); -#else - PyObject *exc_tb; - #if PY_VERSION_HEX >= 0x030B00a4 - if (!exc_state->exc_value) return; - exc_tb = PyException_GetTraceback(exc_state->exc_value); - #else - exc_tb = exc_state->exc_traceback; - #endif - if (likely(exc_tb)) { - PyTracebackObject *tb = (PyTracebackObject *) exc_tb; - PyFrameObject *f = tb->tb_frame; - Py_CLEAR(f->f_back); - #if PY_VERSION_HEX >= 0x030B00a4 - Py_DECREF(exc_tb); - #endif - } -#endif -} -static CYTHON_INLINE -PyObject *__Pyx_Coroutine_MethodReturn(PyObject* gen, PyObject *retval) { - CYTHON_MAYBE_UNUSED_VAR(gen); - if (unlikely(!retval)) { - __Pyx_PyThreadState_declare - __Pyx_PyThreadState_assign - if (!__Pyx_PyErr_Occurred()) { - PyObject *exc = PyExc_StopIteration; - #ifdef __Pyx_AsyncGen_USED - if (__Pyx_AsyncGen_CheckExact(gen)) - exc = __Pyx_PyExc_StopAsyncIteration; - #endif - __Pyx_PyErr_SetNone(exc); - } - } - return retval; -} -#if 
CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) -static CYTHON_INLINE -PyObject *__Pyx_PyGen_Send(PyGenObject *gen, PyObject *arg) { -#if PY_VERSION_HEX <= 0x030A00A1 - return _PyGen_Send(gen, arg); -#else - PyObject *result; - if (PyIter_Send((PyObject*)gen, arg ? arg : Py_None, &result) == PYGEN_RETURN) { - if (PyAsyncGen_CheckExact(gen)) { - assert(result == Py_None); - PyErr_SetNone(PyExc_StopAsyncIteration); - } - else if (result == Py_None) { - PyErr_SetNone(PyExc_StopIteration); - } - else { - _PyGen_SetStopIterationValue(result); - } - Py_CLEAR(result); - } - return result; -#endif -} -#endif -static CYTHON_INLINE -PyObject *__Pyx_Coroutine_FinishDelegation(__pyx_CoroutineObject *gen) { - PyObject *ret; - PyObject *val = NULL; - __Pyx_Coroutine_Undelegate(gen); - __Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, &val); - ret = __Pyx_Coroutine_SendEx(gen, val, 0); - Py_XDECREF(val); - return ret; -} -static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value) { - PyObject *retval; - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject*) self; - PyObject *yf = gen->yieldfrom; - if (unlikely(gen->is_running)) - return __Pyx_Coroutine_AlreadyRunningError(gen); - if (yf) { - PyObject *ret; - gen->is_running = 1; - #ifdef __Pyx_Generator_USED - if (__Pyx_Generator_CheckExact(yf)) { - ret = __Pyx_Coroutine_Send(yf, value); - } else - #endif - #ifdef __Pyx_Coroutine_USED - if (__Pyx_Coroutine_Check(yf)) { - ret = __Pyx_Coroutine_Send(yf, value); - } else - #endif - #ifdef __Pyx_AsyncGen_USED - if (__pyx_PyAsyncGenASend_CheckExact(yf)) { - ret = __Pyx_async_gen_asend_send(yf, value); - } else - #endif - #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) - if (PyGen_CheckExact(yf)) { - ret = __Pyx_PyGen_Send((PyGenObject*)yf, value == Py_None ? 
NULL : value); - } else - #endif - #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03050000 && defined(PyCoro_CheckExact) && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) - if (PyCoro_CheckExact(yf)) { - ret = __Pyx_PyGen_Send((PyGenObject*)yf, value == Py_None ? NULL : value); - } else - #endif - { - if (value == Py_None) - ret = __Pyx_PyObject_GetIterNextFunc(yf)(yf); - else - ret = __Pyx_PyObject_CallMethod1(yf, __pyx_n_s_send, value); - } - gen->is_running = 0; - if (likely(ret)) { - return ret; - } - retval = __Pyx_Coroutine_FinishDelegation(gen); - } else { - retval = __Pyx_Coroutine_SendEx(gen, value, 0); - } - return __Pyx_Coroutine_MethodReturn(self, retval); -} -static int __Pyx_Coroutine_CloseIter(__pyx_CoroutineObject *gen, PyObject *yf) { - PyObject *retval = NULL; - int err = 0; - #ifdef __Pyx_Generator_USED - if (__Pyx_Generator_CheckExact(yf)) { - retval = __Pyx_Coroutine_Close(yf); - if (!retval) - return -1; - } else - #endif - #ifdef __Pyx_Coroutine_USED - if (__Pyx_Coroutine_Check(yf)) { - retval = __Pyx_Coroutine_Close(yf); - if (!retval) - return -1; - } else - if (__Pyx_CoroutineAwait_CheckExact(yf)) { - retval = __Pyx_CoroutineAwait_Close((__pyx_CoroutineAwaitObject*)yf, NULL); - if (!retval) - return -1; - } else - #endif - #ifdef __Pyx_AsyncGen_USED - if (__pyx_PyAsyncGenASend_CheckExact(yf)) { - retval = __Pyx_async_gen_asend_close(yf, NULL); - } else - if (__pyx_PyAsyncGenAThrow_CheckExact(yf)) { - retval = __Pyx_async_gen_athrow_close(yf, NULL); - } else - #endif - { - PyObject *meth; - gen->is_running = 1; - meth = __Pyx_PyObject_GetAttrStrNoError(yf, __pyx_n_s_close); - if (unlikely(!meth)) { - if (unlikely(PyErr_Occurred())) { - PyErr_WriteUnraisable(yf); - } - } else { - retval = __Pyx_PyObject_CallNoArg(meth); - Py_DECREF(meth); - if (unlikely(!retval)) - err = -1; - } - gen->is_running = 0; - } - Py_XDECREF(retval); - return err; -} -static PyObject *__Pyx_Generator_Next(PyObject *self) { - __pyx_CoroutineObject *gen 
= (__pyx_CoroutineObject*) self; - PyObject *yf = gen->yieldfrom; - if (unlikely(gen->is_running)) - return __Pyx_Coroutine_AlreadyRunningError(gen); - if (yf) { - PyObject *ret; - gen->is_running = 1; - #ifdef __Pyx_Generator_USED - if (__Pyx_Generator_CheckExact(yf)) { - ret = __Pyx_Generator_Next(yf); - } else - #endif - #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) - if (PyGen_CheckExact(yf)) { - ret = __Pyx_PyGen_Send((PyGenObject*)yf, NULL); - } else - #endif - #ifdef __Pyx_Coroutine_USED - if (__Pyx_Coroutine_Check(yf)) { - ret = __Pyx_Coroutine_Send(yf, Py_None); - } else - #endif - ret = __Pyx_PyObject_GetIterNextFunc(yf)(yf); - gen->is_running = 0; - if (likely(ret)) { - return ret; - } - return __Pyx_Coroutine_FinishDelegation(gen); - } - return __Pyx_Coroutine_SendEx(gen, Py_None, 0); -} -static PyObject *__Pyx_Coroutine_Close_Method(PyObject *self, PyObject *arg) { - CYTHON_UNUSED_VAR(arg); - return __Pyx_Coroutine_Close(self); -} -static PyObject *__Pyx_Coroutine_Close(PyObject *self) { - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; - PyObject *retval, *raised_exception; - PyObject *yf = gen->yieldfrom; - int err = 0; - if (unlikely(gen->is_running)) - return __Pyx_Coroutine_AlreadyRunningError(gen); - if (yf) { - Py_INCREF(yf); - err = __Pyx_Coroutine_CloseIter(gen, yf); - __Pyx_Coroutine_Undelegate(gen); - Py_DECREF(yf); - } - if (err == 0) - PyErr_SetNone(PyExc_GeneratorExit); - retval = __Pyx_Coroutine_SendEx(gen, NULL, 1); - if (unlikely(retval)) { - const char *msg; - Py_DECREF(retval); - if ((0)) { - #ifdef __Pyx_Coroutine_USED - } else if (__Pyx_Coroutine_Check(self)) { - msg = "coroutine ignored GeneratorExit"; - #endif - #ifdef __Pyx_AsyncGen_USED - } else if (__Pyx_AsyncGen_CheckExact(self)) { -#if PY_VERSION_HEX < 0x03060000 - msg = "async generator ignored GeneratorExit - might require Python 3.6+ finalisation (PEP 525)"; -#else - msg = "async 
generator ignored GeneratorExit"; -#endif - #endif - } else { - msg = "generator ignored GeneratorExit"; - } - PyErr_SetString(PyExc_RuntimeError, msg); - return NULL; - } - raised_exception = PyErr_Occurred(); - if (likely(!raised_exception || __Pyx_PyErr_GivenExceptionMatches2(raised_exception, PyExc_GeneratorExit, PyExc_StopIteration))) { - if (raised_exception) PyErr_Clear(); - Py_INCREF(Py_None); - return Py_None; - } - return NULL; -} -static PyObject *__Pyx__Coroutine_Throw(PyObject *self, PyObject *typ, PyObject *val, PyObject *tb, - PyObject *args, int close_on_genexit) { - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; - PyObject *yf = gen->yieldfrom; - if (unlikely(gen->is_running)) - return __Pyx_Coroutine_AlreadyRunningError(gen); - if (yf) { - PyObject *ret; - Py_INCREF(yf); - if (__Pyx_PyErr_GivenExceptionMatches(typ, PyExc_GeneratorExit) && close_on_genexit) { - int err = __Pyx_Coroutine_CloseIter(gen, yf); - Py_DECREF(yf); - __Pyx_Coroutine_Undelegate(gen); - if (err < 0) - return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0)); - goto throw_here; - } - gen->is_running = 1; - if (0 - #ifdef __Pyx_Generator_USED - || __Pyx_Generator_CheckExact(yf) - #endif - #ifdef __Pyx_Coroutine_USED - || __Pyx_Coroutine_Check(yf) - #endif - ) { - ret = __Pyx__Coroutine_Throw(yf, typ, val, tb, args, close_on_genexit); - #ifdef __Pyx_Coroutine_USED - } else if (__Pyx_CoroutineAwait_CheckExact(yf)) { - ret = __Pyx__Coroutine_Throw(((__pyx_CoroutineAwaitObject*)yf)->coroutine, typ, val, tb, args, close_on_genexit); - #endif - } else { - PyObject *meth = __Pyx_PyObject_GetAttrStrNoError(yf, __pyx_n_s_throw); - if (unlikely(!meth)) { - Py_DECREF(yf); - if (unlikely(PyErr_Occurred())) { - gen->is_running = 0; - return NULL; - } - __Pyx_Coroutine_Undelegate(gen); - gen->is_running = 0; - goto throw_here; - } - if (likely(args)) { - ret = __Pyx_PyObject_Call(meth, args, NULL); - } else { - PyObject *cargs[4] = {NULL, typ, val, tb}; - 
ret = __Pyx_PyObject_FastCall(meth, cargs+1, 3 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET); - } - Py_DECREF(meth); - } - gen->is_running = 0; - Py_DECREF(yf); - if (!ret) { - ret = __Pyx_Coroutine_FinishDelegation(gen); - } - return __Pyx_Coroutine_MethodReturn(self, ret); - } -throw_here: - __Pyx_Raise(typ, val, tb, NULL); - return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0)); -} -static PyObject *__Pyx_Coroutine_Throw(PyObject *self, PyObject *args) { - PyObject *typ; - PyObject *val = NULL; - PyObject *tb = NULL; - if (unlikely(!PyArg_UnpackTuple(args, (char *)"throw", 1, 3, &typ, &val, &tb))) - return NULL; - return __Pyx__Coroutine_Throw(self, typ, val, tb, args, 1); -} -static CYTHON_INLINE int __Pyx_Coroutine_traverse_excstate(__Pyx_ExcInfoStruct *exc_state, visitproc visit, void *arg) { -#if PY_VERSION_HEX >= 0x030B00a4 - Py_VISIT(exc_state->exc_value); -#else - Py_VISIT(exc_state->exc_type); - Py_VISIT(exc_state->exc_value); - Py_VISIT(exc_state->exc_traceback); -#endif - return 0; -} -static int __Pyx_Coroutine_traverse(__pyx_CoroutineObject *gen, visitproc visit, void *arg) { - Py_VISIT(gen->closure); - Py_VISIT(gen->classobj); - Py_VISIT(gen->yieldfrom); - return __Pyx_Coroutine_traverse_excstate(&gen->gi_exc_state, visit, arg); -} -static int __Pyx_Coroutine_clear(PyObject *self) { - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; - Py_CLEAR(gen->closure); - Py_CLEAR(gen->classobj); - Py_CLEAR(gen->yieldfrom); - __Pyx_Coroutine_ExceptionClear(&gen->gi_exc_state); -#ifdef __Pyx_AsyncGen_USED - if (__Pyx_AsyncGen_CheckExact(self)) { - Py_CLEAR(((__pyx_PyAsyncGenObject*)gen)->ag_finalizer); - } -#endif - Py_CLEAR(gen->gi_code); - Py_CLEAR(gen->gi_frame); - Py_CLEAR(gen->gi_name); - Py_CLEAR(gen->gi_qualname); - Py_CLEAR(gen->gi_modulename); - return 0; -} -static void __Pyx_Coroutine_dealloc(PyObject *self) { - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; - PyObject_GC_UnTrack(gen); - if 
(gen->gi_weakreflist != NULL) - PyObject_ClearWeakRefs(self); - if (gen->resume_label >= 0) { - PyObject_GC_Track(self); -#if PY_VERSION_HEX >= 0x030400a1 && CYTHON_USE_TP_FINALIZE - if (unlikely(PyObject_CallFinalizerFromDealloc(self))) -#else - Py_TYPE(gen)->tp_del(self); - if (unlikely(Py_REFCNT(self) > 0)) -#endif - { - return; - } - PyObject_GC_UnTrack(self); - } -#ifdef __Pyx_AsyncGen_USED - if (__Pyx_AsyncGen_CheckExact(self)) { - /* We have to handle this case for asynchronous generators - right here, because this code has to be between UNTRACK - and GC_Del. */ - Py_CLEAR(((__pyx_PyAsyncGenObject*)self)->ag_finalizer); - } -#endif - __Pyx_Coroutine_clear(self); - __Pyx_PyHeapTypeObject_GC_Del(gen); -} -static void __Pyx_Coroutine_del(PyObject *self) { - PyObject *error_type, *error_value, *error_traceback; - __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; - __Pyx_PyThreadState_declare - if (gen->resume_label < 0) { - return; - } -#if !CYTHON_USE_TP_FINALIZE - assert(self->ob_refcnt == 0); - __Pyx_SET_REFCNT(self, 1); -#endif - __Pyx_PyThreadState_assign - __Pyx_ErrFetch(&error_type, &error_value, &error_traceback); -#ifdef __Pyx_AsyncGen_USED - if (__Pyx_AsyncGen_CheckExact(self)) { - __pyx_PyAsyncGenObject *agen = (__pyx_PyAsyncGenObject*)self; - PyObject *finalizer = agen->ag_finalizer; - if (finalizer && !agen->ag_closed) { - PyObject *res = __Pyx_PyObject_CallOneArg(finalizer, self); - if (unlikely(!res)) { - PyErr_WriteUnraisable(self); - } else { - Py_DECREF(res); - } - __Pyx_ErrRestore(error_type, error_value, error_traceback); - return; - } - } -#endif - if (unlikely(gen->resume_label == 0 && !error_value)) { -#ifdef __Pyx_Coroutine_USED -#ifdef __Pyx_Generator_USED - if (!__Pyx_Generator_CheckExact(self)) -#endif - { - PyObject_GC_UnTrack(self); -#if PY_MAJOR_VERSION >= 3 || defined(PyErr_WarnFormat) - if (unlikely(PyErr_WarnFormat(PyExc_RuntimeWarning, 1, "coroutine '%.50S' was never awaited", gen->gi_qualname) < 0)) - 
PyErr_WriteUnraisable(self); -#else - {PyObject *msg; - char *cmsg; - #if CYTHON_COMPILING_IN_PYPY - msg = NULL; - cmsg = (char*) "coroutine was never awaited"; - #else - char *cname; - PyObject *qualname; - qualname = gen->gi_qualname; - cname = PyString_AS_STRING(qualname); - msg = PyString_FromFormat("coroutine '%.50s' was never awaited", cname); - if (unlikely(!msg)) { - PyErr_Clear(); - cmsg = (char*) "coroutine was never awaited"; - } else { - cmsg = PyString_AS_STRING(msg); - } - #endif - if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, cmsg, 1) < 0)) - PyErr_WriteUnraisable(self); - Py_XDECREF(msg);} -#endif - PyObject_GC_Track(self); - } -#endif - } else { - PyObject *res = __Pyx_Coroutine_Close(self); - if (unlikely(!res)) { - if (PyErr_Occurred()) - PyErr_WriteUnraisable(self); - } else { - Py_DECREF(res); - } - } - __Pyx_ErrRestore(error_type, error_value, error_traceback); -#if !CYTHON_USE_TP_FINALIZE - assert(Py_REFCNT(self) > 0); - if (likely(--self->ob_refcnt == 0)) { - return; - } - { - Py_ssize_t refcnt = Py_REFCNT(self); - _Py_NewReference(self); - __Pyx_SET_REFCNT(self, refcnt); - } -#if CYTHON_COMPILING_IN_CPYTHON - assert(PyType_IS_GC(Py_TYPE(self)) && - _Py_AS_GC(self)->gc.gc_refs != _PyGC_REFS_UNTRACKED); - _Py_DEC_REFTOTAL; -#endif -#ifdef COUNT_ALLOCS - --Py_TYPE(self)->tp_frees; - --Py_TYPE(self)->tp_allocs; -#endif -#endif -} -static PyObject * -__Pyx_Coroutine_get_name(__pyx_CoroutineObject *self, void *context) -{ - PyObject *name = self->gi_name; - CYTHON_UNUSED_VAR(context); - if (unlikely(!name)) name = Py_None; - Py_INCREF(name); - return name; -} -static int -__Pyx_Coroutine_set_name(__pyx_CoroutineObject *self, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); -#if PY_MAJOR_VERSION >= 3 - if (unlikely(value == NULL || !PyUnicode_Check(value))) -#else - if (unlikely(value == NULL || !PyString_Check(value))) -#endif - { - PyErr_SetString(PyExc_TypeError, - "__name__ must be set to a string object"); - return -1; - } - 
Py_INCREF(value); - __Pyx_Py_XDECREF_SET(self->gi_name, value); - return 0; -} -static PyObject * -__Pyx_Coroutine_get_qualname(__pyx_CoroutineObject *self, void *context) -{ - PyObject *name = self->gi_qualname; - CYTHON_UNUSED_VAR(context); - if (unlikely(!name)) name = Py_None; - Py_INCREF(name); - return name; -} -static int -__Pyx_Coroutine_set_qualname(__pyx_CoroutineObject *self, PyObject *value, void *context) -{ - CYTHON_UNUSED_VAR(context); -#if PY_MAJOR_VERSION >= 3 - if (unlikely(value == NULL || !PyUnicode_Check(value))) -#else - if (unlikely(value == NULL || !PyString_Check(value))) -#endif - { - PyErr_SetString(PyExc_TypeError, - "__qualname__ must be set to a string object"); - return -1; - } - Py_INCREF(value); - __Pyx_Py_XDECREF_SET(self->gi_qualname, value); - return 0; -} -static PyObject * -__Pyx_Coroutine_get_frame(__pyx_CoroutineObject *self, void *context) -{ - PyObject *frame = self->gi_frame; - CYTHON_UNUSED_VAR(context); - if (!frame) { - if (unlikely(!self->gi_code)) { - Py_RETURN_NONE; - } - frame = (PyObject *) PyFrame_New( - PyThreadState_Get(), /*PyThreadState *tstate,*/ - (PyCodeObject*) self->gi_code, /*PyCodeObject *code,*/ - __pyx_d, /*PyObject *globals,*/ - 0 /*PyObject *locals*/ - ); - if (unlikely(!frame)) - return NULL; - self->gi_frame = frame; - } - Py_INCREF(frame); - return frame; -} -static __pyx_CoroutineObject *__Pyx__Coroutine_New( - PyTypeObject* type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, - PyObject *name, PyObject *qualname, PyObject *module_name) { - __pyx_CoroutineObject *gen = PyObject_GC_New(__pyx_CoroutineObject, type); - if (unlikely(!gen)) - return NULL; - return __Pyx__Coroutine_NewInit(gen, body, code, closure, name, qualname, module_name); -} -static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit( - __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, - PyObject *name, PyObject *qualname, PyObject *module_name) { - gen->body = body; - 
gen->closure = closure; - Py_XINCREF(closure); - gen->is_running = 0; - gen->resume_label = 0; - gen->classobj = NULL; - gen->yieldfrom = NULL; - #if PY_VERSION_HEX >= 0x030B00a4 - gen->gi_exc_state.exc_value = NULL; - #else - gen->gi_exc_state.exc_type = NULL; - gen->gi_exc_state.exc_value = NULL; - gen->gi_exc_state.exc_traceback = NULL; - #endif -#if CYTHON_USE_EXC_INFO_STACK - gen->gi_exc_state.previous_item = NULL; -#endif - gen->gi_weakreflist = NULL; - Py_XINCREF(qualname); - gen->gi_qualname = qualname; - Py_XINCREF(name); - gen->gi_name = name; - Py_XINCREF(module_name); - gen->gi_modulename = module_name; - Py_XINCREF(code); - gen->gi_code = code; - gen->gi_frame = NULL; - PyObject_GC_Track(gen); - return gen; -} - -/* PatchModuleWithCoroutine */ -static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code) { -#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) - int result; - PyObject *globals, *result_obj; - globals = PyDict_New(); if (unlikely(!globals)) goto ignore; - result = PyDict_SetItemString(globals, "_cython_coroutine_type", - #ifdef __Pyx_Coroutine_USED - (PyObject*)__pyx_CoroutineType); - #else - Py_None); - #endif - if (unlikely(result < 0)) goto ignore; - result = PyDict_SetItemString(globals, "_cython_generator_type", - #ifdef __Pyx_Generator_USED - (PyObject*)__pyx_GeneratorType); - #else - Py_None); - #endif - if (unlikely(result < 0)) goto ignore; - if (unlikely(PyDict_SetItemString(globals, "_module", module) < 0)) goto ignore; - if (unlikely(PyDict_SetItemString(globals, "__builtins__", __pyx_b) < 0)) goto ignore; - result_obj = PyRun_String(py_code, Py_file_input, globals, globals); - if (unlikely(!result_obj)) goto ignore; - Py_DECREF(result_obj); - Py_DECREF(globals); - return module; -ignore: - Py_XDECREF(globals); - PyErr_WriteUnraisable(module); - if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, "Cython module failed to patch module with custom type", 1) < 0)) { - Py_DECREF(module); - module 
= NULL; - } -#else - py_code++; -#endif - return module; -} - -/* PatchGeneratorABC */ -#ifndef CYTHON_REGISTER_ABCS -#define CYTHON_REGISTER_ABCS 1 -#endif -#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) -static PyObject* __Pyx_patch_abc_module(PyObject *module); -static PyObject* __Pyx_patch_abc_module(PyObject *module) { - module = __Pyx_Coroutine_patch_module( - module, "" -"if _cython_generator_type is not None:\n" -" try: Generator = _module.Generator\n" -" except AttributeError: pass\n" -" else: Generator.register(_cython_generator_type)\n" -"if _cython_coroutine_type is not None:\n" -" try: Coroutine = _module.Coroutine\n" -" except AttributeError: pass\n" -" else: Coroutine.register(_cython_coroutine_type)\n" - ); - return module; -} -#endif -static int __Pyx_patch_abc(void) { -#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) - static int abc_patched = 0; - if (CYTHON_REGISTER_ABCS && !abc_patched) { - PyObject *module; - module = PyImport_ImportModule((PY_MAJOR_VERSION >= 3) ? "collections.abc" : "collections"); - if (unlikely(!module)) { - PyErr_WriteUnraisable(NULL); - if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, - ((PY_MAJOR_VERSION >= 3) ? 
- "Cython module failed to register with collections.abc module" : - "Cython module failed to register with collections module"), 1) < 0)) { - return -1; - } - } else { - module = __Pyx_patch_abc_module(module); - abc_patched = 1; - if (unlikely(!module)) - return -1; - Py_DECREF(module); - } - module = PyImport_ImportModule("backports_abc"); - if (module) { - module = __Pyx_patch_abc_module(module); - Py_XDECREF(module); - } - if (!module) { - PyErr_Clear(); - } - } -#else - if ((0)) __Pyx_Coroutine_patch_module(NULL, NULL); -#endif - return 0; -} - -/* Generator */ -static PyMethodDef __pyx_Generator_methods[] = { - {"send", (PyCFunction) __Pyx_Coroutine_Send, METH_O, - (char*) PyDoc_STR("send(arg) -> send 'arg' into generator,\nreturn next yielded value or raise StopIteration.")}, - {"throw", (PyCFunction) __Pyx_Coroutine_Throw, METH_VARARGS, - (char*) PyDoc_STR("throw(typ[,val[,tb]]) -> raise exception in generator,\nreturn next yielded value or raise StopIteration.")}, - {"close", (PyCFunction) __Pyx_Coroutine_Close_Method, METH_NOARGS, - (char*) PyDoc_STR("close() -> raise GeneratorExit inside generator.")}, - {0, 0, 0, 0} -}; -static PyMemberDef __pyx_Generator_memberlist[] = { - {(char *) "gi_running", T_BOOL, offsetof(__pyx_CoroutineObject, is_running), READONLY, NULL}, - {(char*) "gi_yieldfrom", T_OBJECT, offsetof(__pyx_CoroutineObject, yieldfrom), READONLY, - (char*) PyDoc_STR("object being iterated by 'yield from', or None")}, - {(char*) "gi_code", T_OBJECT, offsetof(__pyx_CoroutineObject, gi_code), READONLY, NULL}, - {(char *) "__module__", T_OBJECT, offsetof(__pyx_CoroutineObject, gi_modulename), 0, 0}, -#if CYTHON_USE_TYPE_SPECS - {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(__pyx_CoroutineObject, gi_weakreflist), READONLY, 0}, -#endif - {0, 0, 0, 0, 0} -}; -static PyGetSetDef __pyx_Generator_getsets[] = { - {(char *) "__name__", (getter)__Pyx_Coroutine_get_name, (setter)__Pyx_Coroutine_set_name, - (char*) PyDoc_STR("name of the generator"), 
0}, - {(char *) "__qualname__", (getter)__Pyx_Coroutine_get_qualname, (setter)__Pyx_Coroutine_set_qualname, - (char*) PyDoc_STR("qualified name of the generator"), 0}, - {(char *) "gi_frame", (getter)__Pyx_Coroutine_get_frame, NULL, - (char*) PyDoc_STR("Frame of the generator"), 0}, - {0, 0, 0, 0, 0} -}; -#if CYTHON_USE_TYPE_SPECS -static PyType_Slot __pyx_GeneratorType_slots[] = { - {Py_tp_dealloc, (void *)__Pyx_Coroutine_dealloc}, - {Py_tp_traverse, (void *)__Pyx_Coroutine_traverse}, - {Py_tp_iter, (void *)PyObject_SelfIter}, - {Py_tp_iternext, (void *)__Pyx_Generator_Next}, - {Py_tp_methods, (void *)__pyx_Generator_methods}, - {Py_tp_members, (void *)__pyx_Generator_memberlist}, - {Py_tp_getset, (void *)__pyx_Generator_getsets}, - {Py_tp_getattro, (void *) __Pyx_PyObject_GenericGetAttrNoDict}, -#if CYTHON_USE_TP_FINALIZE - {Py_tp_finalize, (void *)__Pyx_Coroutine_del}, -#endif - {0, 0}, -}; -static PyType_Spec __pyx_GeneratorType_spec = { - __PYX_TYPE_MODULE_PREFIX "generator", - sizeof(__pyx_CoroutineObject), - 0, - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_HAVE_FINALIZE, - __pyx_GeneratorType_slots -}; -#else -static PyTypeObject __pyx_GeneratorType_type = { - PyVarObject_HEAD_INIT(0, 0) - __PYX_TYPE_MODULE_PREFIX "generator", - sizeof(__pyx_CoroutineObject), - 0, - (destructor) __Pyx_Coroutine_dealloc, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_HAVE_FINALIZE, - 0, - (traverseproc) __Pyx_Coroutine_traverse, - 0, - 0, - offsetof(__pyx_CoroutineObject, gi_weakreflist), - 0, - (iternextfunc) __Pyx_Generator_Next, - __pyx_Generator_methods, - __pyx_Generator_memberlist, - __pyx_Generator_getsets, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, - 0, -#if CYTHON_USE_TP_FINALIZE - 0, -#else - __Pyx_Coroutine_del, -#endif - 0, -#if CYTHON_USE_TP_FINALIZE - __Pyx_Coroutine_del, -#elif PY_VERSION_HEX >= 0x030400a1 - 0, -#endif -#if PY_VERSION_HEX >= 
0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800) - 0, -#endif -#if __PYX_NEED_TP_PRINT_SLOT - 0, -#endif -#if PY_VERSION_HEX >= 0x030C0000 - 0, -#endif -#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000 - 0, -#endif -}; -#endif -static int __pyx_Generator_init(PyObject *module) { -#if CYTHON_USE_TYPE_SPECS - __pyx_GeneratorType = __Pyx_FetchCommonTypeFromSpec(module, &__pyx_GeneratorType_spec, NULL); -#else - CYTHON_UNUSED_VAR(module); - __pyx_GeneratorType_type.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; - __pyx_GeneratorType_type.tp_iter = PyObject_SelfIter; - __pyx_GeneratorType = __Pyx_FetchCommonType(&__pyx_GeneratorType_type); -#endif - if (unlikely(!__pyx_GeneratorType)) { - return -1; - } - return 0; -} - -/* CheckBinaryVersion */ -static int __Pyx_check_binary_version(void) { - char ctversion[5]; - int same=1, i, found_dot; - const char* rt_from_call = Py_GetVersion(); - PyOS_snprintf(ctversion, 5, "%d.%d", PY_MAJOR_VERSION, PY_MINOR_VERSION); - found_dot = 0; - for (i = 0; i < 4; i++) { - if (!ctversion[i]) { - same = (rt_from_call[i] < '0' || rt_from_call[i] > '9'); - break; - } - if (rt_from_call[i] != ctversion[i]) { - same = 0; - break; - } - } - if (!same) { - char rtversion[5] = {'\0'}; - char message[200]; - for (i=0; i<4; ++i) { - if (rt_from_call[i] == '.') { - if (found_dot) break; - found_dot = 1; - } else if (rt_from_call[i] < '0' || rt_from_call[i] > '9') { - break; - } - rtversion[i] = rt_from_call[i]; - } - PyOS_snprintf(message, sizeof(message), - "compile time version %s of module '%.100s' " - "does not match runtime version %s", - ctversion, __Pyx_MODULE_NAME, rtversion); - return PyErr_WarnEx(NULL, message, 1); - } - return 0; -} - -/* InitStrings */ -#if PY_MAJOR_VERSION >= 3 -static int __Pyx_InitString(__Pyx_StringTabEntry t, PyObject **str) { - if (t.is_unicode | t.is_str) { - if (t.intern) { - *str = PyUnicode_InternFromString(t.s); - } else if (t.encoding) 
{ - *str = PyUnicode_Decode(t.s, t.n - 1, t.encoding, NULL); - } else { - *str = PyUnicode_FromStringAndSize(t.s, t.n - 1); - } - } else { - *str = PyBytes_FromStringAndSize(t.s, t.n - 1); - } - if (!*str) - return -1; - if (PyObject_Hash(*str) == -1) - return -1; - return 0; -} -#endif -static int __Pyx_InitStrings(__Pyx_StringTabEntry *t) { - while (t->p) { - #if PY_MAJOR_VERSION >= 3 - __Pyx_InitString(*t, t->p); - #else - if (t->is_unicode) { - *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); - } else if (t->intern) { - *t->p = PyString_InternFromString(t->s); - } else { - *t->p = PyString_FromStringAndSize(t->s, t->n - 1); - } - if (!*t->p) - return -1; - if (PyObject_Hash(*t->p) == -1) - return -1; - #endif - ++t; - } - return 0; -} - -static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { - return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); -} -static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { - Py_ssize_t ignore; - return __Pyx_PyObject_AsStringAndSize(o, &ignore); -} -#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT -#if !CYTHON_PEP393_ENABLED -static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { - char* defenc_c; - PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); - if (!defenc) return NULL; - defenc_c = PyBytes_AS_STRING(defenc); -#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII - { - char* end = defenc_c + PyBytes_GET_SIZE(defenc); - char* c; - for (c = defenc_c; c < end; c++) { - if ((unsigned char) (*c) >= 128) { - PyUnicode_AsASCIIString(o); - return NULL; - } - } - } -#endif - *length = PyBytes_GET_SIZE(defenc); - return defenc_c; -} -#else -static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { - if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; -#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII - if (likely(PyUnicode_IS_ASCII(o))) { - *length = 
PyUnicode_GET_LENGTH(o); - return PyUnicode_AsUTF8(o); - } else { - PyUnicode_AsASCIIString(o); - return NULL; - } -#else - return PyUnicode_AsUTF8AndSize(o, length); -#endif -} -#endif -#endif -static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { -#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT - if ( -#if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII - __Pyx_sys_getdefaultencoding_not_ascii && -#endif - PyUnicode_Check(o)) { - return __Pyx_PyUnicode_AsStringAndSize(o, length); - } else -#endif -#if (!CYTHON_COMPILING_IN_PYPY && !CYTHON_COMPILING_IN_LIMITED_API) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) - if (PyByteArray_Check(o)) { - *length = PyByteArray_GET_SIZE(o); - return PyByteArray_AS_STRING(o); - } else -#endif - { - char* result; - int r = PyBytes_AsStringAndSize(o, &result, length); - if (unlikely(r < 0)) { - return NULL; - } else { - return result; - } - } -} -static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { - int is_true = x == Py_True; - if (is_true | (x == Py_False) | (x == Py_None)) return is_true; - else return PyObject_IsTrue(x); -} -static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { - int retval; - if (unlikely(!x)) return -1; - retval = __Pyx_PyObject_IsTrue(x); - Py_DECREF(x); - return retval; -} -static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { - __Pyx_TypeName result_type_name = __Pyx_PyType_GetName(Py_TYPE(result)); -#if PY_MAJOR_VERSION >= 3 - if (PyLong_Check(result)) { - if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, - "__int__ returned non-int (type " __Pyx_FMT_TYPENAME "). 
" - "The ability to return an instance of a strict subclass of int is deprecated, " - "and may be removed in a future version of Python.", - result_type_name)) { - __Pyx_DECREF_TypeName(result_type_name); - Py_DECREF(result); - return NULL; - } - __Pyx_DECREF_TypeName(result_type_name); - return result; - } -#endif - PyErr_Format(PyExc_TypeError, - "__%.4s__ returned non-%.4s (type " __Pyx_FMT_TYPENAME ")", - type_name, type_name, result_type_name); - __Pyx_DECREF_TypeName(result_type_name); - Py_DECREF(result); - return NULL; -} -static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { -#if CYTHON_USE_TYPE_SLOTS - PyNumberMethods *m; -#endif - const char *name = NULL; - PyObject *res = NULL; -#if PY_MAJOR_VERSION < 3 - if (likely(PyInt_Check(x) || PyLong_Check(x))) -#else - if (likely(PyLong_Check(x))) -#endif - return __Pyx_NewRef(x); -#if CYTHON_USE_TYPE_SLOTS - m = Py_TYPE(x)->tp_as_number; - #if PY_MAJOR_VERSION < 3 - if (m && m->nb_int) { - name = "int"; - res = m->nb_int(x); - } - else if (m && m->nb_long) { - name = "long"; - res = m->nb_long(x); - } - #else - if (likely(m && m->nb_int)) { - name = "int"; - res = m->nb_int(x); - } - #endif -#else - if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { - res = PyNumber_Int(x); - } -#endif - if (likely(res)) { -#if PY_MAJOR_VERSION < 3 - if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { -#else - if (unlikely(!PyLong_CheckExact(res))) { -#endif - return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); - } - } - else if (!PyErr_Occurred()) { - PyErr_SetString(PyExc_TypeError, - "an integer is required"); - } - return res; -} -static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { - Py_ssize_t ival; - PyObject *x; -#if PY_MAJOR_VERSION < 3 - if (likely(PyInt_CheckExact(b))) { - if (sizeof(Py_ssize_t) >= sizeof(long)) - return PyInt_AS_LONG(b); - else - return PyInt_AsSsize_t(b); - } -#endif - if (likely(PyLong_CheckExact(b))) { - #if CYTHON_USE_PYLONG_INTERNALS - if 
(likely(__Pyx_PyLong_IsCompact(b))) { - return __Pyx_PyLong_CompactValue(b); - } else { - const digit* digits = __Pyx_PyLong_Digits(b); - const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(b); - switch (size) { - case 2: - if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { - return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - case -2: - if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { - return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - case 3: - if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { - return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - case -3: - if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { - return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - case 4: - if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { - return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - case -4: - if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { - return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); - } - break; - } - } - #endif - return PyLong_AsSsize_t(b); - } - x = PyNumber_Index(b); - if (!x) return -1; - ival = PyInt_AsSsize_t(x); - Py_DECREF(x); - return ival; -} -static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject* o) { - if (sizeof(Py_hash_t) == sizeof(Py_ssize_t)) { - return (Py_hash_t) __Pyx_PyIndex_AsSsize_t(o); -#if PY_MAJOR_VERSION < 3 - } else if (likely(PyInt_CheckExact(o))) { - return PyInt_AS_LONG(o); -#endif - } else { - Py_ssize_t ival; - PyObject *x; - x = PyNumber_Index(o); - if (!x) return -1; - ival = PyInt_AsLong(x); - Py_DECREF(x); - return ival; - } 
-} -static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { - return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); -} -static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { - return PyInt_FromSize_t(ival); -} - - -/* #### Code section: utility_code_pragmas_end ### */ -#ifdef _MSC_VER -#pragma warning( pop ) -#endif - - - -/* #### Code section: end ### */ -#endif /* Py_PYTHON_H */ diff --git a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/__init__.py b/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/__init__.py deleted file mode 100644 index c049006bf2e58bed7786432e7ec84c8c8b40edf0..0000000000000000000000000000000000000000 --- a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -from .path import SVGPath, parse_path - -__all__ = ["SVGPath", "parse_path"] diff --git a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/path/arc.py b/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/path/arc.py deleted file mode 100644 index 3e0a211e043a9f52954a29ce95de9d2a9f1858d4..0000000000000000000000000000000000000000 --- a/spaces/cloudtheboi/Lofi4All/.pythonlibs/lib/python3.10/site-packages/fontTools/svgLib/path/arc.py +++ /dev/null @@ -1,153 +0,0 @@ -"""Convert SVG Path's elliptical arcs to Bezier curves. 
- -The code is mostly adapted from Blink's SVGPathNormalizer::DecomposeArcToCubic -https://github.com/chromium/chromium/blob/93831f2/third_party/ -blink/renderer/core/svg/svg_path_parser.cc#L169-L278 -""" -from fontTools.misc.transform import Identity, Scale -from math import atan2, ceil, cos, fabs, isfinite, pi, radians, sin, sqrt, tan - - -TWO_PI = 2 * pi -PI_OVER_TWO = 0.5 * pi - - -def _map_point(matrix, pt): - # apply Transform matrix to a point represented as a complex number - r = matrix.transformPoint((pt.real, pt.imag)) - return r[0] + r[1] * 1j - - -class EllipticalArc(object): - def __init__(self, current_point, rx, ry, rotation, large, sweep, target_point): - self.current_point = current_point - self.rx = rx - self.ry = ry - self.rotation = rotation - self.large = large - self.sweep = sweep - self.target_point = target_point - - # SVG arc's rotation angle is expressed in degrees, whereas Transform.rotate - # uses radians - self.angle = radians(rotation) - - # these derived attributes are computed by the _parametrize method - self.center_point = self.theta1 = self.theta2 = self.theta_arc = None - - def _parametrize(self): - # convert from endpoint to center parametrization: - # https://www.w3.org/TR/SVG/implnote.html#ArcConversionEndpointToCenter - - # If rx = 0 or ry = 0 then this arc is treated as a straight line segment (a - # "lineto") joining the endpoints. - # http://www.w3.org/TR/SVG/implnote.html#ArcOutOfRangeParameters - rx = fabs(self.rx) - ry = fabs(self.ry) - if not (rx and ry): - return False - - # If the current point and target point for the arc are identical, it should - # be treated as a zero length path. This ensures continuity in animations.
- if self.target_point == self.current_point: - return False - - mid_point_distance = (self.current_point - self.target_point) * 0.5 - - point_transform = Identity.rotate(-self.angle) - - transformed_mid_point = _map_point(point_transform, mid_point_distance) - square_rx = rx * rx - square_ry = ry * ry - square_x = transformed_mid_point.real * transformed_mid_point.real - square_y = transformed_mid_point.imag * transformed_mid_point.imag - - # Check if the radii are big enough to draw the arc, scale radii if not. - # http://www.w3.org/TR/SVG/implnote.html#ArcCorrectionOutOfRangeRadii - radii_scale = square_x / square_rx + square_y / square_ry - if radii_scale > 1: - rx *= sqrt(radii_scale) - ry *= sqrt(radii_scale) - self.rx, self.ry = rx, ry - - point_transform = Scale(1 / rx, 1 / ry).rotate(-self.angle) - - point1 = _map_point(point_transform, self.current_point) - point2 = _map_point(point_transform, self.target_point) - delta = point2 - point1 - - d = delta.real * delta.real + delta.imag * delta.imag - scale_factor_squared = max(1 / d - 0.25, 0.0) - - scale_factor = sqrt(scale_factor_squared) - if self.sweep == self.large: - scale_factor = -scale_factor - - delta *= scale_factor - center_point = (point1 + point2) * 0.5 - center_point += complex(-delta.imag, delta.real) - point1 -= center_point - point2 -= center_point - - theta1 = atan2(point1.imag, point1.real) - theta2 = atan2(point2.imag, point2.real) - - theta_arc = theta2 - theta1 - if theta_arc < 0 and self.sweep: - theta_arc += TWO_PI - elif theta_arc > 0 and not self.sweep: - theta_arc -= TWO_PI - - self.theta1 = theta1 - self.theta2 = theta1 + theta_arc - self.theta_arc = theta_arc - self.center_point = center_point - - return True - - def _decompose_to_cubic_curves(self): - if self.center_point is None and not self._parametrize(): - return - - point_transform = Identity.rotate(self.angle).scale(self.rx, self.ry) - - # Some results of atan2 on some platform implementations are not exact - # enough. 
So that we get more cubic curves than expected here. Adding 0.001f - # reduces the count of segments to the correct count. - num_segments = int(ceil(fabs(self.theta_arc / (PI_OVER_TWO + 0.001)))) - for i in range(num_segments): - start_theta = self.theta1 + i * self.theta_arc / num_segments - end_theta = self.theta1 + (i + 1) * self.theta_arc / num_segments - - t = (4 / 3) * tan(0.25 * (end_theta - start_theta)) - if not isfinite(t): - return - - sin_start_theta = sin(start_theta) - cos_start_theta = cos(start_theta) - sin_end_theta = sin(end_theta) - cos_end_theta = cos(end_theta) - - point1 = complex( - cos_start_theta - t * sin_start_theta, - sin_start_theta + t * cos_start_theta, - ) - point1 += self.center_point - target_point = complex(cos_end_theta, sin_end_theta) - target_point += self.center_point - point2 = target_point - point2 += complex(t * sin_end_theta, -t * cos_end_theta) - - point1 = _map_point(point_transform, point1) - point2 = _map_point(point_transform, point2) - target_point = _map_point(point_transform, target_point) - - yield point1, point2, target_point - - def draw(self, pen): - for point1, point2, target_point in self._decompose_to_cubic_curves(): - pen.curveTo( - (point1.real, point1.imag), - (point2.real, point2.imag), - (target_point.real, target_point.imag), - ) diff --git a/spaces/cm107/agv-demo/index.html b/spaces/cm107/agv-demo/index.html deleted file mode 100644 index 2cbaab82138fd9531a5d46c1a799a2dd92cb3c6f..0000000000000000000000000000000000000000 --- a/spaces/cm107/agv-demo/index.html +++ /dev/null @@ -1,92 +0,0 @@ - - - - - - - AGVWarehouse - - - - -
- -
- - - - - diff --git a/spaces/codeparrot/code-generation-models/architectures/incoder.md b/spaces/codeparrot/code-generation-models/architectures/incoder.md deleted file mode 100644 index 1202d2097c64034947a439593f167f519d34d587..0000000000000000000000000000000000000000 --- a/spaces/codeparrot/code-generation-models/architectures/incoder.md +++ /dev/null @@ -1,31 +0,0 @@ -[InCoder](https://huggingface.co/facebook/incoder-6B) uses a decoder-only Transformer with Causal Masking objective, to train a left-to-right language model to fill in masked token segments, with a context length of 2048. -
- -|Model | # parameters | -| - | - | -| [facebook/incoder-1B](https://huggingface.co/facebook/incoder-1B) |1.3B | -| [facebook/incoder-6B](https://huggingface.co/facebook/incoder-6B) |6.7B | - -
- -[Causal Masking objective](https://arxiv.org/abs/2201.07520) is a hybrid approach of Causal and Masked language models, "it combines the benefit of per-token generation with optional bi-directionality specifically tailored to prompting". -During the training of InCoder, spans of code were randomly masked and moved to the end of each file, which allows for bidirectional context. Figure below from InCoder [paper](https://arxiv.org/pdf/2204.05999.pdf) illustrates the training process. - -

- drawing -

- -So in addition to program synthesis (via left-to-right generation), InCoder can also perform editing (via infilling). The model gives promising results in some zero-shot code infilling tasks such as type prediction, variable renaming and comment generation. - -You can load the model and tokenizer directly from 🤗 [`transformers`](https://huggingface.co/docs/transformers/index): - -```python -from transformers import AutoTokenizer, AutoModelForCausalLM - -tokenizer = AutoTokenizer.from_pretrained("facebook/incoder-6B") -model = AutoModelForCausalLM.from_pretrained("facebook/incoder-6B") - -inputs = tokenizer("def hello_world():", return_tensors="pt") -outputs = model(**inputs) - -``` \ No newline at end of file diff --git a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/alpha/Makefile b/spaces/colakin/video-generater/public/ffmpeg/libavcodec/alpha/Makefile deleted file mode 100644 index 796d9762b313515a7ef376e6482e098a3b0d1394..0000000000000000000000000000000000000000 --- a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/alpha/Makefile +++ /dev/null @@ -1,10 +0,0 @@ -OBJS-$(CONFIG_BLOCKDSP) += alpha/blockdsp_alpha.o -OBJS-$(CONFIG_ME_CMP) += alpha/me_cmp_alpha.o \ - alpha/me_cmp_mvi_asm.o -OBJS-$(CONFIG_HPELDSP) += alpha/hpeldsp_alpha.o \ - alpha/hpeldsp_alpha_asm.o -OBJS-$(CONFIG_IDCTDSP) += alpha/idctdsp_alpha.o \ - alpha/idctdsp_alpha_asm.o \ - alpha/simple_idct_alpha.o -OBJS-$(CONFIG_MPEGVIDEO) += alpha/mpegvideo_alpha.o -OBJS-$(CONFIG_PIXBLOCKDSP) += alpha/pixblockdsp_alpha.o diff --git a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/arm/sbcdsp_init_arm.c b/spaces/colakin/video-generater/public/ffmpeg/libavcodec/arm/sbcdsp_init_arm.c deleted file mode 100644 index b3a2c3d083cb773f9cb6b039584ed1a6f97d55ce..0000000000000000000000000000000000000000 --- a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/arm/sbcdsp_init_arm.c +++ /dev/null @@ -1,107 +0,0 @@ -/* - * Bluetooth low-complexity, subband codec (SBC) - * - * Copyright (C)
2017 Aurelien Jacobs - * Copyright (C) 2008-2010 Nokia Corporation - * Copyright (C) 2004-2010 Marcel Holtmann - * Copyright (C) 2004-2005 Henryk Ploetz - * Copyright (C) 2005-2006 Brad Midgley - * - * This file is part of FFmpeg. - * - * FFmpeg is free software; you can redistribute it and/or - * modify it under the terms of the GNU Lesser General Public - * License as published by the Free Software Foundation; either - * version 2.1 of the License, or (at your option) any later version. - * - * FFmpeg is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Lesser General Public License for more details. - * - * You should have received a copy of the GNU Lesser General Public - * License along with FFmpeg; if not, write to the Free Software - * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA - */ - -/** - * @file - * SBC ARMv6 optimization for some basic "building bricks" - */ - -#include "libavutil/attributes.h" -#include "libavutil/cpu.h" -#include "libavutil/mem_internal.h" -#include "libavutil/arm/cpu.h" -#include "libavcodec/sbcdsp.h" - -void ff_sbc_analyze_4_armv6(const int16_t *in, int32_t *out, const int16_t *consts); -void ff_sbc_analyze_8_armv6(const int16_t *in, int32_t *out, const int16_t *consts); - -void ff_sbc_analyze_4_neon(const int16_t *in, int32_t *out, const int16_t *consts); -void ff_sbc_analyze_8_neon(const int16_t *in, int32_t *out, const int16_t *consts); -void ff_sbc_calc_scalefactors_neon(int32_t sb_sample_f[16][2][8], - uint32_t scale_factor[2][8], - int blocks, int channels, int subbands); -int ff_sbc_calc_scalefactors_j_neon(int32_t sb_sample_f[16][2][8], - uint32_t scale_factor[2][8], - int blocks, int subbands); -int ff_sbc_enc_process_input_4s_neon(int position, const uint8_t *pcm, - int16_t X[2][SBC_X_BUFFER_SIZE], - int nsamples, int nchannels); -int 
ff_sbc_enc_process_input_8s_neon(int position, const uint8_t *pcm, - int16_t X[2][SBC_X_BUFFER_SIZE], - int nsamples, int nchannels); - -DECLARE_ALIGNED(SBC_ALIGN, int32_t, ff_sbcdsp_joint_bits_mask)[8] = { - 8, 4, 2, 1, 128, 64, 32, 16 -}; - -#if HAVE_BIGENDIAN -#define PERM(a, b, c, d) { \ - (a * 2) + 1, (a * 2) + 0, \ - (b * 2) + 1, (b * 2) + 0, \ - (c * 2) + 1, (c * 2) + 0, \ - (d * 2) + 1, (d * 2) + 0 \ - } -#else -#define PERM(a, b, c, d) { \ - (a * 2) + 0, (a * 2) + 1, \ - (b * 2) + 0, (b * 2) + 1, \ - (c * 2) + 0, (c * 2) + 1, \ - (d * 2) + 0, (d * 2) + 1 \ - } -#endif - -DECLARE_ALIGNED(SBC_ALIGN, uint8_t, ff_sbc_input_perm_4)[2][8] = { - PERM(7, 3, 6, 4), - PERM(0, 2, 1, 5) -}; - -DECLARE_ALIGNED(SBC_ALIGN, uint8_t, ff_sbc_input_perm_8)[4][8] = { - PERM(15, 7, 14, 8), - PERM(13, 9, 12, 10), - PERM(11, 3, 6, 0), - PERM( 5, 1, 4, 2) -}; - -av_cold void ff_sbcdsp_init_arm(SBCDSPContext *s) -{ - int cpu_flags = av_get_cpu_flags(); - - if (have_armv6(cpu_flags)) { - s->sbc_analyze_4 = ff_sbc_analyze_4_armv6; - s->sbc_analyze_8 = ff_sbc_analyze_8_armv6; - } - - if (have_neon(cpu_flags)) { - s->sbc_analyze_4 = ff_sbc_analyze_4_neon; - s->sbc_analyze_8 = ff_sbc_analyze_8_neon; - s->sbc_calc_scalefactors = ff_sbc_calc_scalefactors_neon; - s->sbc_calc_scalefactors_j = ff_sbc_calc_scalefactors_j_neon; - if (s->increment != 1) { - s->sbc_enc_process_input_4s = ff_sbc_enc_process_input_4s_neon; - s->sbc_enc_process_input_8s = ff_sbc_enc_process_input_8s_neon; - } - } -} diff --git a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/avcodec.c b/spaces/colakin/video-generater/public/ffmpeg/libavcodec/avcodec.c deleted file mode 100644 index fb1362290f5c60f6b2c72e8ebc0e08aa11ff814b..0000000000000000000000000000000000000000 --- a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/avcodec.c +++ /dev/null @@ -1,716 +0,0 @@ -/* - * AVCodecContext functions for libavcodec - * - * This file is part of FFmpeg. 
- * - * FFmpeg is free software; you can redistribute it and/or - * modify it under the terms of the GNU Lesser General Public - * License as published by the Free Software Foundation; either - * version 2.1 of the License, or (at your option) any later version. - * - * FFmpeg is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Lesser General Public License for more details. - * - * You should have received a copy of the GNU Lesser General Public - * License along with FFmpeg; if not, write to the Free Software - * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA - */ - -/** - * @file - * AVCodecContext functions for libavcodec - */ - -#include "config.h" -#include "libavutil/avassert.h" -#include "libavutil/avstring.h" -#include "libavutil/bprint.h" -#include "libavutil/channel_layout.h" -#include "libavutil/fifo.h" -#include "libavutil/imgutils.h" -#include "libavutil/mem.h" -#include "libavutil/opt.h" -#include "libavutil/thread.h" -#include "avcodec.h" -#include "bsf.h" -#include "codec_internal.h" -#include "decode.h" -#include "encode.h" -#include "frame_thread_encoder.h" -#include "internal.h" -#include "thread.h" - -int avcodec_default_execute(AVCodecContext *c, int (*func)(AVCodecContext *c2, void *arg2), void *arg, int *ret, int count, int size) -{ - int i; - - for (i = 0; i < count; i++) { - int r = func(c, (char *)arg + i * size); - if (ret) - ret[i] = r; - } - emms_c(); - return 0; -} - -int avcodec_default_execute2(AVCodecContext *c, int (*func)(AVCodecContext *c2, void *arg2, int jobnr, int threadnr), void *arg, int *ret, int count) -{ - int i; - - for (i = 0; i < count; i++) { - int r = func(c, arg, i, 0); - if (ret) - ret[i] = r; - } - emms_c(); - return 0; -} - -static AVMutex codec_mutex = AV_MUTEX_INITIALIZER; - -static void lock_avcodec(const FFCodec *codec) -{ - if 
(codec->caps_internal & FF_CODEC_CAP_NOT_INIT_THREADSAFE && codec->init) - ff_mutex_lock(&codec_mutex); -} - -static void unlock_avcodec(const FFCodec *codec) -{ - if (codec->caps_internal & FF_CODEC_CAP_NOT_INIT_THREADSAFE && codec->init) - ff_mutex_unlock(&codec_mutex); -} - -static int64_t get_bit_rate(AVCodecContext *ctx) -{ - int64_t bit_rate; - int bits_per_sample; - - switch (ctx->codec_type) { - case AVMEDIA_TYPE_VIDEO: - case AVMEDIA_TYPE_DATA: - case AVMEDIA_TYPE_SUBTITLE: - case AVMEDIA_TYPE_ATTACHMENT: - bit_rate = ctx->bit_rate; - break; - case AVMEDIA_TYPE_AUDIO: - bits_per_sample = av_get_bits_per_sample(ctx->codec_id); - if (bits_per_sample) { - bit_rate = ctx->sample_rate * (int64_t)ctx->ch_layout.nb_channels; - if (bit_rate > INT64_MAX / bits_per_sample) { - bit_rate = 0; - } else - bit_rate *= bits_per_sample; - } else - bit_rate = ctx->bit_rate; - break; - default: - bit_rate = 0; - break; - } - return bit_rate; -} - -int attribute_align_arg avcodec_open2(AVCodecContext *avctx, const AVCodec *codec, AVDictionary **options) -{ - int ret = 0; - AVCodecInternal *avci; - const FFCodec *codec2; - - if (avcodec_is_open(avctx)) - return 0; - - if (!codec && !avctx->codec) { - av_log(avctx, AV_LOG_ERROR, "No codec provided to avcodec_open2()\n"); - return AVERROR(EINVAL); - } - if (codec && avctx->codec && codec != avctx->codec) { - av_log(avctx, AV_LOG_ERROR, "This AVCodecContext was allocated for %s, " - "but %s passed to avcodec_open2()\n", avctx->codec->name, codec->name); - return AVERROR(EINVAL); - } - if (!codec) - codec = avctx->codec; - codec2 = ffcodec(codec); - - if ((avctx->codec_type != AVMEDIA_TYPE_UNKNOWN && avctx->codec_type != codec->type) || - (avctx->codec_id != AV_CODEC_ID_NONE && avctx->codec_id != codec->id)) { - av_log(avctx, AV_LOG_ERROR, "Codec type or id mismatches\n"); - return AVERROR(EINVAL); - } - - avctx->codec_type = codec->type; - avctx->codec_id = codec->id; - avctx->codec = codec; - - if (avctx->extradata_size < 0 || 
avctx->extradata_size >= FF_MAX_EXTRADATA_SIZE) - return AVERROR(EINVAL); - - avci = av_mallocz(sizeof(*avci)); - if (!avci) { - ret = AVERROR(ENOMEM); - goto end; - } - avctx->internal = avci; - - avci->buffer_frame = av_frame_alloc(); - avci->buffer_pkt = av_packet_alloc(); - if (!avci->buffer_frame || !avci->buffer_pkt) { - ret = AVERROR(ENOMEM); - goto free_and_end; - } - - if (codec2->priv_data_size > 0) { - if (!avctx->priv_data) { - avctx->priv_data = av_mallocz(codec2->priv_data_size); - if (!avctx->priv_data) { - ret = AVERROR(ENOMEM); - goto free_and_end; - } - if (codec->priv_class) { - *(const AVClass **)avctx->priv_data = codec->priv_class; - av_opt_set_defaults(avctx->priv_data); - } - } - if (codec->priv_class && (ret = av_opt_set_dict(avctx->priv_data, options)) < 0) - goto free_and_end; - } else { - avctx->priv_data = NULL; - } - if ((ret = av_opt_set_dict(avctx, options)) < 0) - goto free_and_end; - - if (avctx->codec_whitelist && av_match_list(codec->name, avctx->codec_whitelist, ',') <= 0) { - av_log(avctx, AV_LOG_ERROR, "Codec (%s) not on whitelist \'%s\'\n", codec->name, avctx->codec_whitelist); - ret = AVERROR(EINVAL); - goto free_and_end; - } - - // only call ff_set_dimensions() for non H.264/VP6F/DXV codecs so as not to overwrite previously setup dimensions - if (!(avctx->coded_width && avctx->coded_height && avctx->width && avctx->height && - (avctx->codec_id == AV_CODEC_ID_H264 || avctx->codec_id == AV_CODEC_ID_VP6F || avctx->codec_id == AV_CODEC_ID_DXV))) { - if (avctx->coded_width && avctx->coded_height) - ret = ff_set_dimensions(avctx, avctx->coded_width, avctx->coded_height); - else if (avctx->width && avctx->height) - ret = ff_set_dimensions(avctx, avctx->width, avctx->height); - if (ret < 0) - goto free_and_end; - } - - if ((avctx->coded_width || avctx->coded_height || avctx->width || avctx->height) - && ( av_image_check_size2(avctx->coded_width, avctx->coded_height, avctx->max_pixels, AV_PIX_FMT_NONE, 0, avctx) < 0 - || 
av_image_check_size2(avctx->width, avctx->height, avctx->max_pixels, AV_PIX_FMT_NONE, 0, avctx) < 0)) { - av_log(avctx, AV_LOG_WARNING, "Ignoring invalid width/height values\n"); - ff_set_dimensions(avctx, 0, 0); - } - - if (avctx->width > 0 && avctx->height > 0) { - if (av_image_check_sar(avctx->width, avctx->height, - avctx->sample_aspect_ratio) < 0) { - av_log(avctx, AV_LOG_WARNING, "ignoring invalid SAR: %u/%u\n", - avctx->sample_aspect_ratio.num, - avctx->sample_aspect_ratio.den); - avctx->sample_aspect_ratio = (AVRational){ 0, 1 }; - } - } - - if (avctx->sample_rate < 0) { - av_log(avctx, AV_LOG_ERROR, "Invalid sample rate: %d\n", avctx->sample_rate); - ret = AVERROR(EINVAL); - goto free_and_end; - } - if (avctx->block_align < 0) { - av_log(avctx, AV_LOG_ERROR, "Invalid block align: %d\n", avctx->block_align); - ret = AVERROR(EINVAL); - goto free_and_end; - } - -#if FF_API_OLD_CHANNEL_LAYOUT -FF_DISABLE_DEPRECATION_WARNINGS - /* compat wrapper for old-style callers */ - if (avctx->channel_layout && !avctx->channels) - avctx->channels = av_popcount64(avctx->channel_layout); - - if ((avctx->channels && avctx->ch_layout.nb_channels != avctx->channels) || - (avctx->channel_layout && (avctx->ch_layout.order != AV_CHANNEL_ORDER_NATIVE || - avctx->ch_layout.u.mask != avctx->channel_layout))) { - av_channel_layout_uninit(&avctx->ch_layout); - if (avctx->channel_layout) { - av_channel_layout_from_mask(&avctx->ch_layout, avctx->channel_layout); - } else { - avctx->ch_layout.order = AV_CHANNEL_ORDER_UNSPEC; - } - avctx->ch_layout.nb_channels = avctx->channels; - } -FF_ENABLE_DEPRECATION_WARNINGS -#endif - - /* AV_CODEC_CAP_CHANNEL_CONF is a decoder-only flag; so the code below - * in particular checks that nb_channels is set for all audio encoders. 
*/ - if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && !avctx->ch_layout.nb_channels - && !(codec->capabilities & AV_CODEC_CAP_CHANNEL_CONF)) { - av_log(avctx, AV_LOG_ERROR, "%s requires channel layout to be set\n", - av_codec_is_decoder(codec) ? "Decoder" : "Encoder"); - ret = AVERROR(EINVAL); - goto free_and_end; - } - if (avctx->ch_layout.nb_channels && !av_channel_layout_check(&avctx->ch_layout)) { - av_log(avctx, AV_LOG_ERROR, "Invalid channel layout\n"); - ret = AVERROR(EINVAL); - goto free_and_end; - } - if (avctx->ch_layout.nb_channels > FF_SANE_NB_CHANNELS) { - av_log(avctx, AV_LOG_ERROR, "Too many channels: %d\n", avctx->ch_layout.nb_channels); - ret = AVERROR(EINVAL); - goto free_and_end; - } - - avctx->frame_num = 0; -#if FF_API_AVCTX_FRAME_NUMBER -FF_DISABLE_DEPRECATION_WARNINGS - avctx->frame_number = avctx->frame_num; -FF_ENABLE_DEPRECATION_WARNINGS -#endif - avctx->codec_descriptor = avcodec_descriptor_get(avctx->codec_id); - - if ((avctx->codec->capabilities & AV_CODEC_CAP_EXPERIMENTAL) && - avctx->strict_std_compliance > FF_COMPLIANCE_EXPERIMENTAL) { - const char *codec_string = av_codec_is_encoder(codec) ? "encoder" : "decoder"; - const AVCodec *codec2; - av_log(avctx, AV_LOG_ERROR, - "The %s '%s' is experimental but experimental codecs are not enabled, " - "add '-strict %d' if you want to use it.\n", - codec_string, codec->name, FF_COMPLIANCE_EXPERIMENTAL); - codec2 = av_codec_is_encoder(codec) ? 
avcodec_find_encoder(codec->id) : avcodec_find_decoder(codec->id); - if (!(codec2->capabilities & AV_CODEC_CAP_EXPERIMENTAL)) - av_log(avctx, AV_LOG_ERROR, "Alternatively use the non experimental %s '%s'.\n", - codec_string, codec2->name); - ret = AVERROR_EXPERIMENTAL; - goto free_and_end; - } - - if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && - (!avctx->time_base.num || !avctx->time_base.den)) { - avctx->time_base.num = 1; - avctx->time_base.den = avctx->sample_rate; - } - - if (av_codec_is_encoder(avctx->codec)) - ret = ff_encode_preinit(avctx); - else - ret = ff_decode_preinit(avctx); - if (ret < 0) - goto free_and_end; - - if (HAVE_THREADS && !avci->frame_thread_encoder) { - /* Frame-threaded decoders call FFCodec.init for their child contexts. */ - lock_avcodec(codec2); - ret = ff_thread_init(avctx); - unlock_avcodec(codec2); - if (ret < 0) { - goto free_and_end; - } - } - if (!HAVE_THREADS && !(codec2->caps_internal & FF_CODEC_CAP_AUTO_THREADS)) - avctx->thread_count = 1; - - if (!(avctx->active_thread_type & FF_THREAD_FRAME) || - avci->frame_thread_encoder) { - if (codec2->init) { - lock_avcodec(codec2); - ret = codec2->init(avctx); - unlock_avcodec(codec2); - if (ret < 0) { - avci->needs_close = codec2->caps_internal & FF_CODEC_CAP_INIT_CLEANUP; - goto free_and_end; - } - } - avci->needs_close = 1; - } - - ret=0; - - if (av_codec_is_decoder(avctx->codec)) { - if (!avctx->bit_rate) - avctx->bit_rate = get_bit_rate(avctx); - -#if FF_API_OLD_CHANNEL_LAYOUT -FF_DISABLE_DEPRECATION_WARNINGS - /* update the deprecated fields for old-style callers */ - avctx->channels = avctx->ch_layout.nb_channels; - avctx->channel_layout = avctx->ch_layout.order == AV_CHANNEL_ORDER_NATIVE ? 
- avctx->ch_layout.u.mask : 0; -FF_ENABLE_DEPRECATION_WARNINGS -#endif - - /* validate channel layout from the decoder */ - if ((avctx->ch_layout.nb_channels && !av_channel_layout_check(&avctx->ch_layout)) || - avctx->ch_layout.nb_channels > FF_SANE_NB_CHANNELS) { - ret = AVERROR(EINVAL); - goto free_and_end; - } - if (avctx->bits_per_coded_sample < 0) { - ret = AVERROR(EINVAL); - goto free_and_end; - } - } - if (codec->priv_class) - av_assert0(*(const AVClass **)avctx->priv_data == codec->priv_class); - -end: - - return ret; -free_and_end: - avcodec_close(avctx); - goto end; -} - -void avcodec_flush_buffers(AVCodecContext *avctx) -{ - AVCodecInternal *avci = avctx->internal; - - if (av_codec_is_encoder(avctx->codec)) { - int caps = avctx->codec->capabilities; - - if (!(caps & AV_CODEC_CAP_ENCODER_FLUSH)) { - // Only encoders that explicitly declare support for it can be - // flushed. Otherwise, this is a no-op. - av_log(avctx, AV_LOG_WARNING, "Ignoring attempt to flush encoder " - "that doesn't support it\n"); - return; - } - if (avci->in_frame) - av_frame_unref(avci->in_frame); - if (avci->recon_frame) - av_frame_unref(avci->recon_frame); - } else { - av_packet_unref(avci->last_pkt_props); - av_packet_unref(avci->in_pkt); - - avctx->pts_correction_last_pts = - avctx->pts_correction_last_dts = INT64_MIN; - - av_bsf_flush(avci->bsf); - } - - avci->draining = 0; - avci->draining_done = 0; - avci->nb_draining_errors = 0; - av_frame_unref(avci->buffer_frame); - av_packet_unref(avci->buffer_pkt); - - if (HAVE_THREADS && avctx->active_thread_type & FF_THREAD_FRAME) - ff_thread_flush(avctx); - else if (ffcodec(avctx->codec)->flush) - ffcodec(avctx->codec)->flush(avctx); -} - -void avsubtitle_free(AVSubtitle *sub) -{ - int i; - - for (i = 0; i < sub->num_rects; i++) { - AVSubtitleRect *const rect = sub->rects[i]; - - av_freep(&rect->data[0]); - av_freep(&rect->data[1]); - av_freep(&rect->data[2]); - av_freep(&rect->data[3]); - av_freep(&rect->text); - 
-        av_freep(&rect->ass);
-
-        av_freep(&sub->rects[i]);
-    }
-
-    av_freep(&sub->rects);
-
-    memset(sub, 0, sizeof(*sub));
-}
-
-av_cold int avcodec_close(AVCodecContext *avctx)
-{
-    int i;
-
-    if (!avctx)
-        return 0;
-
-    if (avcodec_is_open(avctx)) {
-        AVCodecInternal *avci = avctx->internal;
-
-        if (CONFIG_FRAME_THREAD_ENCODER &&
-            avci->frame_thread_encoder && avctx->thread_count > 1) {
-            ff_frame_thread_encoder_free(avctx);
-        }
-        if (HAVE_THREADS && avci->thread_ctx)
-            ff_thread_free(avctx);
-        if (avci->needs_close && ffcodec(avctx->codec)->close)
-            ffcodec(avctx->codec)->close(avctx);
-        avci->byte_buffer_size = 0;
-        av_freep(&avci->byte_buffer);
-        av_frame_free(&avci->buffer_frame);
-        av_packet_free(&avci->buffer_pkt);
-        av_packet_free(&avci->last_pkt_props);
-
-        av_packet_free(&avci->in_pkt);
-        av_frame_free(&avci->in_frame);
-        av_frame_free(&avci->recon_frame);
-
-        av_buffer_unref(&avci->pool);
-
-        if (avctx->hwaccel && avctx->hwaccel->uninit)
-            avctx->hwaccel->uninit(avctx);
-        av_freep(&avci->hwaccel_priv_data);
-
-        av_bsf_free(&avci->bsf);
-
-        av_channel_layout_uninit(&avci->initial_ch_layout);
-
-#if CONFIG_LCMS2
-        ff_icc_context_uninit(&avci->icc);
-#endif
-
-        av_freep(&avctx->internal);
-    }
-
-    for (i = 0; i < avctx->nb_coded_side_data; i++)
-        av_freep(&avctx->coded_side_data[i].data);
-    av_freep(&avctx->coded_side_data);
-    avctx->nb_coded_side_data = 0;
-
-    av_buffer_unref(&avctx->hw_frames_ctx);
-    av_buffer_unref(&avctx->hw_device_ctx);
-
-    if (avctx->priv_data && avctx->codec && avctx->codec->priv_class)
-        av_opt_free(avctx->priv_data);
-    av_opt_free(avctx);
-    av_freep(&avctx->priv_data);
-    if (av_codec_is_encoder(avctx->codec)) {
-        av_freep(&avctx->extradata);
-        avctx->extradata_size = 0;
-    } else if (av_codec_is_decoder(avctx->codec))
-        av_freep(&avctx->subtitle_header);
-
-    avctx->codec = NULL;
-    avctx->active_thread_type = 0;
-
-    return 0;
-}
-
-static const char *unknown_if_null(const char *str)
-{
-    return str ? str : "unknown";
-}
-
-void avcodec_string(char *buf, int buf_size, AVCodecContext *enc, int encode)
-{
-    const char *codec_type;
-    const char *codec_name;
-    const char *profile = NULL;
-    AVBPrint bprint;
-    int64_t bitrate;
-    int new_line = 0;
-    AVRational display_aspect_ratio;
-    const char *separator = enc->dump_separator ? (const char *)enc->dump_separator : ", ";
-    const char *str;
-
-    if (!buf || buf_size <= 0)
-        return;
-    av_bprint_init_for_buffer(&bprint, buf, buf_size);
-    codec_type = av_get_media_type_string(enc->codec_type);
-    codec_name = avcodec_get_name(enc->codec_id);
-    profile = avcodec_profile_name(enc->codec_id, enc->profile);
-
-    av_bprintf(&bprint, "%s: %s", codec_type ? codec_type : "unknown",
-               codec_name);
-    buf[0] ^= 'a' ^ 'A'; /* first letter in uppercase */
-
-    if (enc->codec && strcmp(enc->codec->name, codec_name))
-        av_bprintf(&bprint, " (%s)", enc->codec->name);
-
-    if (profile)
-        av_bprintf(&bprint, " (%s)", profile);
-    if (   enc->codec_type == AVMEDIA_TYPE_VIDEO
-        && av_log_get_level() >= AV_LOG_VERBOSE
-        && enc->refs)
-        av_bprintf(&bprint, ", %d reference frame%s",
-                   enc->refs, enc->refs > 1 ? "s" : "");
-
-    if (enc->codec_tag)
-        av_bprintf(&bprint, " (%s / 0x%04X)",
-                   av_fourcc2str(enc->codec_tag), enc->codec_tag);
-
-    switch (enc->codec_type) {
-    case AVMEDIA_TYPE_VIDEO:
-        {
-            unsigned len;
-
-            av_bprintf(&bprint, "%s%s", separator,
-                       enc->pix_fmt == AV_PIX_FMT_NONE ? "none" :
-                       unknown_if_null(av_get_pix_fmt_name(enc->pix_fmt)));
-
-            av_bprint_chars(&bprint, '(', 1);
-            len = bprint.len;
-
-            /* The following check ensures that '(' has been written
-             * and therefore allows us to erase it if it turns out
-             * to be unnecessary. */
-            if (!av_bprint_is_complete(&bprint))
-                return;
-
-            if (enc->bits_per_raw_sample && enc->pix_fmt != AV_PIX_FMT_NONE &&
-                enc->bits_per_raw_sample < av_pix_fmt_desc_get(enc->pix_fmt)->comp[0].depth)
-                av_bprintf(&bprint, "%d bpc, ", enc->bits_per_raw_sample);
-            if (enc->color_range != AVCOL_RANGE_UNSPECIFIED &&
-                (str = av_color_range_name(enc->color_range)))
-                av_bprintf(&bprint, "%s, ", str);
-
-            if (enc->colorspace != AVCOL_SPC_UNSPECIFIED ||
-                enc->color_primaries != AVCOL_PRI_UNSPECIFIED ||
-                enc->color_trc != AVCOL_TRC_UNSPECIFIED) {
-                const char *col = unknown_if_null(av_color_space_name(enc->colorspace));
-                const char *pri = unknown_if_null(av_color_primaries_name(enc->color_primaries));
-                const char *trc = unknown_if_null(av_color_transfer_name(enc->color_trc));
-                if (strcmp(col, pri) || strcmp(col, trc)) {
-                    new_line = 1;
-                    av_bprintf(&bprint, "%s/%s/%s, ", col, pri, trc);
-                } else
-                    av_bprintf(&bprint, "%s, ", col);
-            }
-
-            if (enc->field_order != AV_FIELD_UNKNOWN) {
-                const char *field_order = "progressive";
-                if (enc->field_order == AV_FIELD_TT)
-                    field_order = "top first";
-                else if (enc->field_order == AV_FIELD_BB)
-                    field_order = "bottom first";
-                else if (enc->field_order == AV_FIELD_TB)
-                    field_order = "top coded first (swapped)";
-                else if (enc->field_order == AV_FIELD_BT)
-                    field_order = "bottom coded first (swapped)";
-
-                av_bprintf(&bprint, "%s, ", field_order);
-            }
-
-            if (av_log_get_level() >= AV_LOG_VERBOSE &&
-                enc->chroma_sample_location != AVCHROMA_LOC_UNSPECIFIED &&
-                (str = av_chroma_location_name(enc->chroma_sample_location)))
-                av_bprintf(&bprint, "%s, ", str);
-
-            if (len == bprint.len) {
-                bprint.str[len - 1] = '\0';
-                bprint.len--;
-            } else {
-                if (bprint.len - 2 < bprint.size) {
-                    /* Erase the last ", " */
-                    bprint.len -= 2;
-                    bprint.str[bprint.len] = '\0';
-                }
-                av_bprint_chars(&bprint, ')', 1);
-            }
-        }
-
-        if (enc->width) {
-            av_bprintf(&bprint, "%s%dx%d", new_line ? separator : ", ",
-                       enc->width, enc->height);
-
-            if (av_log_get_level() >= AV_LOG_VERBOSE &&
-                (enc->width != enc->coded_width ||
-                 enc->height != enc->coded_height))
-                av_bprintf(&bprint, " (%dx%d)",
-                           enc->coded_width, enc->coded_height);
-
-            if (enc->sample_aspect_ratio.num) {
-                av_reduce(&display_aspect_ratio.num, &display_aspect_ratio.den,
-                          enc->width * (int64_t)enc->sample_aspect_ratio.num,
-                          enc->height * (int64_t)enc->sample_aspect_ratio.den,
-                          1024 * 1024);
-                av_bprintf(&bprint, " [SAR %d:%d DAR %d:%d]",
-                           enc->sample_aspect_ratio.num, enc->sample_aspect_ratio.den,
-                           display_aspect_ratio.num, display_aspect_ratio.den);
-            }
-            if (av_log_get_level() >= AV_LOG_DEBUG) {
-                int g = av_gcd(enc->time_base.num, enc->time_base.den);
-                av_bprintf(&bprint, ", %d/%d",
-                           enc->time_base.num / g, enc->time_base.den / g);
-            }
-        }
-        if (encode) {
-            av_bprintf(&bprint, ", q=%d-%d", enc->qmin, enc->qmax);
-        } else {
-            if (enc->properties & FF_CODEC_PROPERTY_CLOSED_CAPTIONS)
-                av_bprintf(&bprint, ", Closed Captions");
-            if (enc->properties & FF_CODEC_PROPERTY_FILM_GRAIN)
-                av_bprintf(&bprint, ", Film Grain");
-            if (enc->properties & FF_CODEC_PROPERTY_LOSSLESS)
-                av_bprintf(&bprint, ", lossless");
-        }
-        break;
-    case AVMEDIA_TYPE_AUDIO:
-        av_bprintf(&bprint, "%s", separator);
-
-        if (enc->sample_rate) {
-            av_bprintf(&bprint, "%d Hz, ", enc->sample_rate);
-        }
-        {
-            char buf[512];
-            int ret = av_channel_layout_describe(&enc->ch_layout, buf, sizeof(buf));
-            if (ret >= 0)
-                av_bprintf(&bprint, "%s", buf);
-        }
-        if (enc->sample_fmt != AV_SAMPLE_FMT_NONE &&
-            (str = av_get_sample_fmt_name(enc->sample_fmt))) {
-            av_bprintf(&bprint, ", %s", str);
-        }
-        if (   enc->bits_per_raw_sample > 0
-            && enc->bits_per_raw_sample != av_get_bytes_per_sample(enc->sample_fmt) * 8)
-            av_bprintf(&bprint, " (%d bit)", enc->bits_per_raw_sample);
-        if (av_log_get_level() >= AV_LOG_VERBOSE) {
-            if (enc->initial_padding)
-                av_bprintf(&bprint, ", delay %d", enc->initial_padding);
-            if (enc->trailing_padding)
-                av_bprintf(&bprint, ", padding %d", enc->trailing_padding);
-        }
-        break;
-    case AVMEDIA_TYPE_DATA:
-        if (av_log_get_level() >= AV_LOG_DEBUG) {
-            int g = av_gcd(enc->time_base.num, enc->time_base.den);
-            if (g)
-                av_bprintf(&bprint, ", %d/%d",
-                           enc->time_base.num / g, enc->time_base.den / g);
-        }
-        break;
-    case AVMEDIA_TYPE_SUBTITLE:
-        if (enc->width)
-            av_bprintf(&bprint, ", %dx%d", enc->width, enc->height);
-        break;
-    default:
-        return;
-    }
-    if (encode) {
-        if (enc->flags & AV_CODEC_FLAG_PASS1)
-            av_bprintf(&bprint, ", pass 1");
-        if (enc->flags & AV_CODEC_FLAG_PASS2)
-            av_bprintf(&bprint, ", pass 2");
-    }
-    bitrate = get_bit_rate(enc);
-    if (bitrate != 0) {
-        av_bprintf(&bprint, ", %"PRId64" kb/s", bitrate / 1000);
-    } else if (enc->rc_max_rate > 0) {
-        av_bprintf(&bprint, ", max. %"PRId64" kb/s", enc->rc_max_rate / 1000);
-    }
-}
-
-int avcodec_is_open(AVCodecContext *s)
-{
-    return !!s->internal;
-}
-
-int attribute_align_arg avcodec_receive_frame(AVCodecContext *avctx, AVFrame *frame)
-{
-    av_frame_unref(frame);
-
-    if (av_codec_is_decoder(avctx->codec))
-        return ff_decode_receive_frame(avctx, frame);
-    return ff_encode_receive_frame(avctx, frame);
-}
diff --git a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/faanidct.h b/spaces/colakin/video-generater/public/ffmpeg/libavcodec/faanidct.h
deleted file mode 100644
index 6f4da67c1b15f082ae882480354948091ee31eef..0000000000000000000000000000000000000000
--- a/spaces/colakin/video-generater/public/ffmpeg/libavcodec/faanidct.h
+++ /dev/null
@@ -1,32 +0,0 @@
-/*
- * Floating point AAN IDCT
- * Copyright (c) 2008 Michael Niedermayer
- *
- * This file is part of FFmpeg.
- *
- * FFmpeg is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Lesser General Public
- * License as published by the Free Software Foundation; either
- * version 2.1 of the License, or (at your option) any later version.
- *
- * FFmpeg is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with FFmpeg; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
- */
-
-#ifndef AVCODEC_FAANIDCT_H
-#define AVCODEC_FAANIDCT_H
-
-#include <stddef.h>
-#include <stdint.h>
-
-void ff_faanidct(int16_t block[64]);
-void ff_faanidct_add(uint8_t *dest, ptrdiff_t line_size, int16_t block[64]);
-void ff_faanidct_put(uint8_t *dest, ptrdiff_t line_size, int16_t block[64]);
-
-#endif /* AVCODEC_FAANIDCT_H */
diff --git a/spaces/congsaPfin/Manga-OCR/logs/Annelids Online Battle - A Destructive and Addictive Worm Game.md b/spaces/congsaPfin/Manga-OCR/logs/Annelids Online Battle - A Destructive and Addictive Worm Game.md
deleted file mode 100644
index 4c09c27b6a272cb9e56619ac42a539a256d01d22..0000000000000000000000000000000000000000
--- a/spaces/congsaPfin/Manga-OCR/logs/Annelids Online Battle - A Destructive and Addictive Worm Game.md
+++ /dev/null
@@ -1,191 +0,0 @@
-
-

Download Annelids Online Battle: A Fun and Action-Packed Worm Game

-

If you are looking for a fun and action-packed worm game to play on your mobile device, you should definitely check out Annelids Online Battle. It is a real-time worm action game in which you battle other players or AI-controlled worms in fully destructible underground worlds.

-

In this article, we will tell you everything you need to know about Annelids Online Battle, including what it is, why you should play it, how to download it for Android and iOS devices, how to play it effectively, how it compares with other worm games, and some FAQs about it.

-

download annelids online battle


Download File ===== https://urlca.com/2uO5AM



-

What is Annelids Online Battle?

-

Annelids Online Battle is a game developed by Michal Srb that was released in 2015. It is inspired by the classic Worms games that have been around since the 1990s. The game features various game modes, such as deathmatch, team deathmatch, capture the flag, conquest, and king of the hill.

In each game mode, you can choose from over 70 different weapons, such as grenades, rockets, mines, lasers, and flamethrowers, to destroy your enemies and the terrain. You can also customize your worm's appearance and name by choosing from different colors, hats, glasses, and accessories.

-

The game has over 100 missions that you can complete to earn coins and unlock new weapons and items. You can also play online with up to 6 players or offline with up to 15 players on the same device. The game has procedurally generated maps that ensure that every match is different and unpredictable.

-

Why You Should Play Annelids Online Battle

-

Annelids Online Battle is a game that you should play if you love worm games or action games in general. Here are some of the reasons why you should play Annelids Online Battle:

-
    -
  • Fun gameplay: The game is easy to learn but hard to master. You can enjoy the thrill of blowing up your enemies and the environment with various weapons and tactics. You can also experiment with different game modes and settings to find your favorite way of playing.
  • -
  • Challenging AI: The game has smart and adaptive AI-controlled worms that will give you a run for your money. They will use different strategies and weapons depending on the situation and the game mode. They will also react to your actions and the terrain changes.
  • -
  • Online multiplayer: The game allows you to play online with other players from around the world. You can join or create a room with up to 6 players and choose the game mode, map, and settings. You can also chat with other players and make friends or enemies.
  • -
  • Procedurally generated maps: The game has randomly generated maps that are different every time you play. This adds variety and replay value to the game. You never know what kind of terrain, obstacles, or surprises you will encounter in each match.
  • -
-

How to Download Annelids Online Battle for Android and iOS Devices

-

Annelids Online Battle is available for both Android and iOS devices. You can download it for free from the Google Play Store or the App Store. Here is how to download Annelids Online Battle for Android and iOS devices:

-

How to Download Annelids Online Battle for Android Devices

-

To download Annelids Online Battle for Android devices, follow these steps:

-

download annelids online battle app
-download annelids online battle for android
-download annelids online battle for ios
-download annelids online battle apk
-download annelids online battle mod
-download annelids online battle free
-download annelids online battle game
-download annelids online battle latest version
-download annelids online battle offline
-download annelids online battle hack
-download annelids online battle cheats
-download annelids online battle pc
-download annelids online battle mac
-download annelids online battle windows
-download annelids online battle linux
-download annelids online battle steam
-download annelids online battle multiplayer
-download annelids online battle pvp
-download annelids online battle coop
-download annelids online battle modes
-download annelids online battle missions
-download annelids online battle maps
-download annelids online battle weapons
-download annelids online battle reviews
-download annelids online battle ratings
-download annelids online battle tips
-download annelids online battle tricks
-download annelids online battle guide
-download annelids online battle walkthrough
-download annelids online battle gameplay
-download annelids online battle trailer
-download annelids online battle video
-download annelids online battle screenshot
-download annelids online battle update
-download annelids online battle patch notes
-download annelids online battle news
-download annelids online battle blog
-download annelids online battle forum
-download annelids online battle community
-download annelids online battle support
-download annelids online battle help
-download annelids online battle faq
-download annelids online battle wiki
-download annelids online battle reddit
-download annelids online battle discord
-download annelids online battle facebook
-download annelids online battle twitter

-
    -
  1. Open the Google Play Store app on your Android device.
  2. -
  3. Search for "Annelids Online Battle" in the search bar.
  4. -
  5. Tap on the game icon that appears in the results.
  6. -
  7. Tap on the "Install" button to start downloading the game.
  8. -
  9. Wait for the download to finish and then tap on the "Open" button to launch the game.
  10. -
  11. Grant the necessary permissions and access settings to the game when prompted.
  12. -
  13. Enjoy playing Annelids Online Battle on your Android device.
  14. -
-

How to Download Annelids Online Battle for iOS Devices

-

To download Annelids Online Battle for iOS devices, follow these steps:

-
    -
  1. Open the App Store app on your iOS device.
  2. -
  3. Search for "Annelids Online Battle" in the search bar.
  4. -
  5. Tap on the game icon that appears in the results.
  6. -
  7. Tap on the "Get" button to start downloading the game.
  8. -
  9. Wait for the download to finish and then tap on the game icon to launch it.
  10. -
  11. Grant the necessary permissions and access settings to the game when prompted.
  12. -
  13. Enjoy playing Annelids Online Battle on your iOS device.
  14. -
-

How to Play Annelids Online Battle: Tips and Tricks

-

Annelids Online Battle is a game that requires skill, strategy, and creativity. To help you play better and have more fun, here are some tips and tricks on how to play Annelids Online Battle effectively:

-

How to Control Your Worm in Annelids Online Battle

-

To control your worm in Annelids Online Battle, you need to use the joystick and buttons on the screen. Here is how they work:

-
    -
  • The joystick on the left side of the screen allows you to move your worm left or right. You can also use it to jump by flicking it up or down.
  • -
  • The button on the right side of the screen allows you to aim your weapon. You can drag it around to change the direction and angle of your shot. You can also use it to zoom in or out by pinching it in or out.
  • -
  • The button on the bottom center of the screen allows you to shoot your weapon. You can tap it once to fire a single shot or hold it down to fire continuously.
  • -
  • The button on the top right of the screen allows you to switch your weapon. You can tap it to cycle through the available weapons or hold it down to open a menu where you can select a specific weapon.
  • -
  • The button on the top left of the screen allows you to pause the game. You can tap it to open a menu where you can resume, restart, or quit the game. You can also access the settings, help, and shop from this menu.
  • -
-

How to Use Different Weapons in Annelids Online Battle

-

Annelids Online Battle has over 70 different weapons that you can use to destroy your enemies and the terrain. Each weapon has its own advantages and disadvantages, and you need to use them wisely and strategically. Here are some of the most common types of weapons and how to use them effectively:

-
    -
  • Grenades: Grenades are explosive projectiles that can bounce off walls and obstacles. They are good for throwing over cover or around corners. You can adjust the power and angle of your throw by dragging the aim button. You can also change the fuse time of your grenades by tapping the shoot button before throwing them. The shorter the fuse time, the less time your enemies have to react.
  • -
  • Rockets: Rockets are powerful missiles that can fly straight and fast. They are good for hitting distant or moving targets. You can adjust the power and angle of your launch by dragging the aim button. You can also change the direction of your rockets mid-flight by tapping the shoot button again. This can help you avoid obstacles or hit targets behind cover.
  • -
  • Mines: Mines are explosive devices that can stick to any surface. They are good for setting traps or defending your position. You can adjust the power and angle of your drop by dragging the aim button. You can also change the activation mode of your mines by tapping the shoot button before dropping them. The activation mode determines how your mines will detonate: by timer, by proximity, or by remote control.
  • -
  • Lasers: Lasers are beams of light that can cut through anything. They are good for piercing through multiple targets or destroying structures. You can adjust the power and angle of your beam by dragging the aim button. You can also change the length of your beam by pinching the aim button in or out. The longer the beam, the more damage it does, but also the more energy it consumes.
  • -
  • Flamethrowers: Flamethrowers are weapons that can spray fire in a wide arc. They are good for burning enemies or setting things on fire. You can adjust the power and angle of your spray by dragging the aim button. You can also change the spread of your spray by pinching the aim button in or out. The wider the spread, the more area it covers, but also the less damage it does.
  • -
-

How to Customize Your Worm in Annelids Online Battle

-

Annelids Online Battle allows you to customize your worm's appearance and name by choosing from different options. You can access the customization menu by tapping on the worm icon on the main screen. Here is how to customize your worm in Annelids Online Battle:

-
    -
  • Color: You can change the color of your worm by tapping on the color palette icon on the top left of the menu. You can choose from 16 different colors, such as red, blue, green, yellow, pink, purple, and more.
  • -
  • Hat: You can change the hat of your worm by tapping on the hat icon on the top right of the menu. You can choose from 24 different hats, such as helmets, caps, crowns, berets, fedoras, and more.
  • -
  • Glasses: You can change the glasses of your worm by tapping on the glasses icon on the bottom left of the menu. You can choose from 16 different glasses, such as sunglasses, goggles, monocles, eyepatches, and more.
  • -
  • Accessory: You can change the accessory of your worm by tapping on the accessory icon on the bottom right of the menu. You can choose from 24 different accessories, such as earrings, necklaces, ties, scarves, and more.
  • -
  • Name: You can change the name of your worm by tapping on the name field on the bottom center of the menu. You can enter any name you want, up to 12 characters. You can also use emojis and symbols to make your name more unique.
  • -
-

How to Win Different Game Modes in Annelids Online Battle

-

Annelids Online Battle has five different game modes that you can play online or offline. Each game mode has its own rules and objectives that you need to follow and achieve. Here are some tips on how to win different game modes in Annelids Online Battle:

-
    -
  • Deathmatch: This is the classic game mode where you have to kill as many enemies as possible in a given time limit or until a certain score is reached. You can play solo or in teams. The tips for winning this game mode are: use weapons that deal high damage and have a large blast radius, such as rockets or grenades; avoid weapons that have a long reload time or require precision, such as lasers or snipers; move around and use cover to avoid enemy fire; and collect health packs and ammo crates to replenish your resources.
  • -
  • Team Deathmatch: This is similar to deathmatch, but you have to work with your teammates to kill the enemy team. You can play with up to 3 players per team. The tips for winning this game mode are: communicate with your teammates and coordinate your attacks; use weapons that complement each other's strengths and weaknesses, such as flamethrowers and mines; protect your teammates and revive them if they get knocked down; and focus on eliminating the most dangerous enemies first.
  • -
  • Capture the Flag: This is a game mode where you have to capture the enemy's flag and bring it back to your base while defending your own flag. You can play with up to 3 players per team. The tips for winning this game mode are: use weapons that can clear the way and create shortcuts, such as rockets or grenades; avoid weapons that can damage your own flag or base, such as mines or flamethrowers; assign roles to your teammates, such as attacker, defender, or support; and use teamwork and strategy to outsmart the enemy team.
  • -
  • Conquest: This is a game mode where you have to capture and hold as many zones as possible on the map. You can play with up to 3 players per team. The tips for winning this game mode are: use weapons that can cover a large area and prevent enemies from entering, such as lasers or flamethrowers; avoid weapons that can damage your own zones or teammates, such as rockets or grenades; spread out and capture multiple zones at once; and defend your zones from enemy attacks.
  • -
  • King of the Hill: This is a game mode where you have to stay inside a marked area on the map for as long as possible. You can play solo or in teams. The tips for winning this game mode are: use weapons that can push enemies away or trap them inside, such as mines or rockets; avoid weapons that can push you away or damage yourself, such as grenades or flamethrowers; stay close to the center of the area and use cover to avoid enemy fire; and collect health packs and ammo crates to survive longer.
  • -
-

Comparison Table of Annelids Online Battle with Other Worm Games

-

Annelids Online Battle is not the only worm game out there. There are many other popular worm games that you might have heard of or played before, such as Worms, Slither.io, and Wormax.io. How does Annelids Online Battle compare with these other worm games? Here is a comparison table of Annelids Online Battle with other worm games based on some criteria:

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
GameGraphicsGameplayFeaturesRatingsReviews
Annelids Online Battle2D pixel art style with bright colors and smooth animations.Real-time action game with various weapons, game modes, maps, and missions.Online multiplayer with up to 6 players, offline multiplayer with up to 15 players on the same device, over 70 weapons, over 100 missions, procedurally generated maps, and customization options.4.5 out of 5 stars on Google Play Store and 4.7 out of 5 stars on App Store.Mostly positive reviews from players who praise the game's fun, addictive, and challenging gameplay, as well as its variety, creativity, and multiplayer features.
Worms2D cartoon style with colorful graphics and funny animations.Turn-based strategy game with various weapons, game modes, maps, and campaigns.Online multiplayer with up to 6 players, offline multiplayer with up to 4 players on the same device, over 80 weapons, over 50 missions, randomly generated maps, and customization options.4.3 out of 5 stars on Google Play Store and 4.6 out of 5 stars on App Store.Mostly positive reviews from players who love the game's humor, nostalgia, and strategy, as well as its variety, quality, and multiplayer features.
Slither.io2D minimalist style with simple graphics and smooth animations.Online multiplayer game where you have to grow your worm by eating pellets and other worms while avoiding collisions.Online multiplayer with millions of players worldwide, leaderboard system, skins system, and offline mode.4.3 out of 5 stars on Google Play Store and 4.4 out of 5 stars on App Store.Mixed reviews from players who enjoy the game's simplicity, addictiveness, and competitiveness, but also complain about the game's lag, ads, and glitches.
Wormax.io2D colorful style with bright graphics and smooth animations.Online multiplayer game where you have to grow your worm by eating food and other worms while using various skills and power-ups.Online multiplayer with thousands of players worldwide, leaderboard system, skills system, power-ups system, skins system, and clans system.4.2 out of 5 stars on Google Play Store and 4.3 out of 5 stars on App Store.Mixed reviews from players who like the game's graphics, gameplay, and features, but also dislike the game's lag, ads, and pay-to-win aspects.
-

Conclusion

-

Annelids Online Battle is a fun and action-packed worm game that you can download and play on your Android or iOS device. It has various game modes, weapons, maps, missions, and customization options that will keep you entertained for hours. It also has online multiplayer features that will allow you to battle with other players from around the world. If you love worm games or action games in general, you should definitely give Annelids Online Battle a try. You won't regret it!

-

FAQs About Annelids Online Battle

-

Here are some frequently asked questions about Annelids Online Battle that you might have:

-
    -
  • How much does Annelids Online Battle cost?
  • -

    Annelids Online Battle is free to download and play on both Android and iOS devices. However, it does have some in-app purchases that can enhance your gaming experience. You can buy coins to unlock more weapons and items or remove ads to enjoy the game without interruptions.

    -
  • Is Annelids Online Battle safe to download?
  • -

    Annelids Online Battle is safe to download as long as you download it from the official sources: the Google Play Store or the App Store. The game does not contain any viruses or malware that can harm your device or data. The game also does not require any sensitive permissions or access settings that can compromise your privacy or security.

    -
  • Can I play Annelids Online Battle offline?
  • -

    Annelids Online Battle can be played offline without an internet connection. You can play the single-player mode with AI-controlled worms or the local multiplayer mode with up to 15 players on the same device. However, you will need an internet connection to play the online multiplayer mode with other players from around the world.

    -
  • Can I play Annelids Online Battle with friends?
  • -

    Annelids Online Battle can be played with friends in two ways: online or offline. You can play online with friends by joining or creating a room with up to 6 players and choosing the game mode, map, and settings. You can also chat with your friends and make teams or alliances. You can play offline with friends by using the same device and playing the local multiplayer mode with up to 15 players. You can also use the split-screen feature to play on two devices connected via Wi-Fi or Bluetooth.

    -
  • What are the system requirements for Annelids Online Battle?
  • -

    Annelids Online Battle is compatible with most Android and iOS devices. However, to ensure the best performance and experience, your device should meet the following requirements:

    -
      -
    • For Android devices: Android 4.4 or higher, 1 GB of RAM or more, and 100 MB of free storage space or more.
    • -
    • For iOS devices: iOS 9.0 or higher, iPhone 5s or newer, iPad Air or newer, iPod touch 6th generation or newer, and 100 MB of free storage space or more.
    • -
    -

    I hope this article has helped you learn more about Annelids Online Battle and how to download and play it. If you have any questions or feedback, please feel free to leave a comment below. Thank you for reading and have fun playing Annelids Online Battle!

    401be4b1e0
    -
    -
    \ No newline at end of file diff --git a/spaces/congsaPfin/Manga-OCR/logs/Convert PDF to Word Easily and Quickly with Free Download Full Version.md b/spaces/congsaPfin/Manga-OCR/logs/Convert PDF to Word Easily and Quickly with Free Download Full Version.md deleted file mode 100644 index 48f4f15d2fa226a6133d66d18214d9ed457725e3..0000000000000000000000000000000000000000 --- a/spaces/congsaPfin/Manga-OCR/logs/Convert PDF to Word Easily and Quickly with Free Download Full Version.md +++ /dev/null @@ -1,200 +0,0 @@ - -

    Free Download Converter PDF to Word Full Version with Crack: Is It Worth It?

    -

    PDF (Portable Document Format) is one of the most popular file formats for sharing and storing documents. However, PDF files are not easy to edit, especially if they contain complex formatting, images, tables, or graphics. That's why many people look for ways to convert PDF files to Word documents, which are more editable and versatile.

    -

    free download converter pdf to word full version with crack


    Download File 🗸🗸🗸 https://urlca.com/2uO5tS



    -

    PDF to Word converters are software tools that can transform PDF files into Word files while preserving their original layout and quality. There are many types of PDF to Word converters available online, ranging from free to paid, from online to offline, from simple to advanced.

    -

However, some people may be tempted to download a cracked full version of a PDF to Word converter. Cracking software means using illegal methods to bypass the license or activation requirements of a software product. By cracking software, people can access its full features without paying for it or obtaining permission from its developers.

    -

    But is it worth it? What are the benefits and risks of using cracked software? And what are some alternatives to cracked software? In this article, we will answer these questions and help you make an informed decision.

    -

    free pdf to word doc converter full crack download
    -download free pdf to word converter with crack key
    -free pdf to word converter full version cracked software
    -how to download free pdf to word converter with crack file
    -free pdf to word converter software download with crack patch
    -free pdf to word doc converter crack full version online
    -free online pdf to word converter with crack code
    -free pdf to word converter full version with crack for windows 10
    -free pdf to word converter download with crack serial number
    -free pdf to word doc converter full crack offline installer
    -best free pdf to word converter with crack license key
    -free pdf to word converter full version with crack for mac
    -free pdf to word converter download with crack activation key
    -free pdf to word doc converter full crack portable
    -free pdf to word converter with crack for android
    -free download pdf to word converter full version with crack and keygen
    -free pdf to word converter full version with crack for pc
    -free pdf to word converter download with crack registration code
    -free pdf to word doc converter full crack zip file
    -free pdf to word converter with crack for linux
    -free download pdf to word converter full version with crack and password
    -free pdf to word converter full version with crack for windows 7
    -free pdf to word converter download with crack product key
    -free pdf to word doc converter full crack rar file
    -free pdf to word converter with crack for ios

    -

    Benefits of PDF to Word Converters

    -

    Edit PDF Documents Easily

    -

    One of the main benefits of using PDF to Word converters is that they allow you to edit PDF documents easily. By converting PDFs to Word files, you can modify the text, images, fonts, colors, alignment, spacing, and other elements

    of the documents. You can also add, delete, or rearrange pages, sections, or paragraphs. This can help you customize your documents according to your needs and preferences.

    -

    Preserve Formatting and Quality

    -

    Another benefit of using PDF to Word converters is that they preserve the formatting and quality of the documents. By converting PDFs to Word files, you can keep the original appearance and layout of the documents, such as margins, headers, footers, page numbers, tables, charts, graphs, and so on. You can also retain the resolution and clarity of the images, logos, icons, and other graphics. This can help you maintain the professionalism and consistency of your documents.

Save Time and Storage Space

A third benefit is saving time and storage space. Converted Word files are often smaller than their source PDFs, especially image-heavy ones, so they take up less space on your device or in cloud storage and are quicker to share. DOCX files are also widely supported across platforms and applications, so more people can open and edit them without compatibility issues.
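As a rough illustration of the storage claim, the sketch below computes how much smaller one file is than another. The file names and sizes here are entirely made up for the demonstration; real savings depend on the documents' content.

```python
import os
import tempfile

def percent_saved(original_path: str, converted_path: str) -> float:
    """Return how much smaller converted_path is than original_path, in percent."""
    original = os.path.getsize(original_path)
    converted = os.path.getsize(converted_path)
    return (original - converted) / original * 100

# Demonstration with two throwaway files standing in for a PDF and its DOCX conversion.
with tempfile.TemporaryDirectory() as tmp:
    pdf_like = os.path.join(tmp, "report.pdf")    # hypothetical source PDF
    docx_like = os.path.join(tmp, "report.docx")  # hypothetical converted DOCX
    with open(pdf_like, "wb") as f:
        f.write(b"\0" * 200_000)   # stand-in for a 200 KB PDF
    with open(docx_like, "wb") as f:
        f.write(b"\0" * 150_000)   # stand-in for a 150 KB DOCX
    print(f"{percent_saved(pdf_like, docx_like):.0f}% smaller")  # prints "25% smaller"
```

The arithmetic is the whole point: a 200 KB file reduced to 150 KB is a 25% saving, which adds up quickly across a large document library.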

Risks of Cracked Software

Malware Infections and Other Security Threats

One of the main risks of using cracked software is exposure to malware and other security threats. Malware is any software designed to harm or disrupt your device or data; viruses, worms, trojans, spyware, ransomware, adware, and rootkits are common examples. Malware can reach your device through cracked software in several ways:

• Cracked software may contain hidden malware that activates when you run or install it.
• Cracked software may download additional malware from the internet without your knowledge or consent.
• Cracked software may create backdoors or vulnerabilities that let hackers or cybercriminals access your device or data remotely.

Malware infections can cause a range of problems:

• Slowing down your device, or causing it to crash or freeze.
• Damaging or deleting your files or programs.
• Stealing your personal or financial information, or your identity.
• Encrypting your files or locking your device and demanding a ransom.
• Displaying unwanted ads or pop-ups on your screen.

Malware can be difficult to detect or remove, especially when it is sophisticated or well hidden, and it can spread to other devices or networks connected to yours. Using cracked software therefore puts your device and data at serious risk.

Deactivation of Antivirus Software During Installation

A second risk is that cracked software often asks you to deactivate your antivirus software during installation. Antivirus software protects your device by scanning for and removing suspicious files, so cracked installers, which are frequently flagged as malware, may demand that you disable it before they will run.

This is very dangerous: it leaves your device unprotected against anything that happens during or after the installation, and a program that insists on disabling your antivirus is a strong sign that it cannot be trusted and may contain malicious code.

Downloading from Untrusted Sources

A third risk is that cracked software is usually downloaded from untrusted sources: websites and platforms not authorized or verified by the original developers or distributors, such as torrent sites, file-sharing sites, forums, blogs, or social media pages. Untrusted sources pose several threats:

• They may provide incomplete, corrupted, or outdated versions of the software that do not work properly.
• They may provide files that are mislabeled, disguised, or tampered with to trick you into downloading something else.
• They may provide files infected with malware or viruses.
• They may provide files that are illegal or infringe the developers' intellectual property rights.

    Downloading from untrusted sources can be risky and unreliable, especially if you do not check the file details, reviews, ratings, or comments before downloading. Therefore, using cracked software can expose you to various dangers and inconveniences.
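One practical defence the paragraph implies is to verify a published checksum before opening anything you downloaded. Here is a minimal sketch using Python's standard hashlib module; the helper names are mine, and the expected digest would come from the vendor's official download page, not from the mirror that served the file.

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Hash the file in chunks so large downloads need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_digest(path: str, expected_hex: str) -> bool:
    # Case-insensitive compare, since vendors publish digests in either case.
    return sha256_of_file(path) == expected_hex.lower()
```

If the computed digest differs from the published one by even a single character, the file is not what the vendor released and should be deleted, not installed.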

Lack of Updates, Leaving the Software Exposed to Security Threats

A fourth risk is that cracked software does not receive updates from the original developers, leaving it outdated and prone to errors and vulnerabilities. Updates fix bugs, improve performance, add features, and patch security holes, but cracked software bypasses the license or activation checks and so cannot receive official updates. As a result, cracked software may:

• Have bugs or glitches that affect its functionality or usability.
• Have compatibility issues with other software, hardware, or operating systems.
• Have security holes or weaknesses that hackers or cybercriminals can exploit.
• Lack features that newer versions provide, limiting its capabilities and performance.

Lack of updates makes cracked software obsolete and unsafe, especially when it handles sensitive or confidential data.

Broken or Missing Features

A fifth risk is that cracked software may simply not work properly. To bypass license or activation checks, crackers modify the software, and those modifications can break its functionality or integrity. Cracked software may:

• Fail to launch or run smoothly, causing crashes, freezes, or errors.
• Have limited or disabled functions, such as saving, printing, converting, or editing.
• Produce distorted or inaccurate output: low-quality conversions, incorrect formatting, or missing elements.
• Exhibit unwanted or harmful behavior, such as displaying ads, pop-ups, or messages, or sending data to third parties.

These defects make cracked software frustrating and often useless, wasting your time and resources.

Loss of Revenue for Software Developers

A sixth risk is that piracy deprives software developers of their rightful income and discourages them from creating more quality products. Software development is a complex and costly process involving research, design, coding, testing, debugging, marketing, and maintenance. When users download and use cracked software without paying for it or obtaining the developers' permission, they are taking the developers' intellectual property and violating their rights. The consequences for developers include:

• Loss of revenue that could have covered development and support costs.
• Loss of motivation and incentive to improve the software or create new products.
• Loss of reputation and credibility in the market and among users.
• Weakened legal protection and recourse against piracy and infringement.

This harms the software industry, innovation, and ultimately the users who rely on its products. Using cracked software is both unethical and unfair.

Alternatives to Cracked Software

Free Online PDF to Word Converters

One alternative to cracked software is a free online PDF to Word converter: a web-based tool that converts PDF files to Word files without downloading or installing anything on your device. All you need is a browser and an internet connection. Advantages include:

• Convenience: upload your PDF, choose an output format, and download the converted file.
• Safety: reputable services do not bundle malware or viruses, and many delete your files from their servers after a short period to protect your privacy.
• Compatibility: they work on any device with a browser, whether Windows, Mac, Linux, Android, or iOS.

However, free online converters also have disadvantages:

• Limited features: they may struggle with large or complex PDFs, may not preserve all formatting and quality, and may not support batch or partial conversions.
• Speed: upload, conversion, and download times depend on your internet connection and file size.
• Quality: conversions can contain errors, distortions, or missing elements.

Some examples of free online PDF to Word converters are:

Ilovepdf (https://www.ilovepdf.com/pdf_to_word)
• Supports drag-and-drop uploads
• Lets you choose between DOCX and DOC output
• Supports multiple languages
• Offers other PDF tools such as merge, split, and compress

Pdf2doc (https://pdf2doc.com/)
• Supports batch conversion of up to 20 files
• Lets you download all converted files in a ZIP archive
• Preserves the layout and images of the documents
• Does not require email registration

Smallpdf (https://smallpdf.com/pdf-to-word)
• Integrates with Google Drive and Dropbox cloud storage
• Lets you choose between editable and scanned PDF conversion
• Uses OCR technology to recognize text in scanned PDFs
• Offers other PDF tools such as edit, sign, and protect

Free Trial Versions of Paid PDF to Word Converters

Another alternative is a free trial of a paid PDF to Word converter. These are desktop tools that normally require a license or activation code for their full features, but many offer a trial period so you can test them before buying. Compared with free online converters, trial versions typically offer:

• More features: handling of large or complex PDFs, better preservation of formatting and quality, and batch or partial conversions.
• Faster performance: conversion runs locally, so it does not depend on your internet connection or file size.
• Higher quality and accuracy, with fewer errors, distortions, or missing elements.

However, trial versions have their own drawbacks:

• Limited duration or functionality: they may expire after a set period or number of conversions, or restrict some features or options.
• Nagging reminders: messages or pop-ups urging you to buy the full version or activate the software.
• Security risk if downloaded from anywhere other than the official vendor site.

Some examples of paid PDF to Word converters with free trials are:

Wondershare PDFelement (https://pdf.wondershare.com/)
• Supports Windows and Mac
• Converts PDFs to Word, Excel, PowerPoint, HTML, EPUB, and more
• Offers other PDF tools such as create, edit, annotate, sign, and protect
• Provides a 14-day free trial with no watermark or page limit

Nitro Pro (https://www.gonitro.com/pro)
• Supports Windows
• Converts PDFs to Word, Excel, PowerPoint, HTML, and more
• Offers other PDF tools such as create, edit, merge, split, and compress
• Provides a 14-day free trial with no watermark or page limit

Adobe Acrobat Pro DC (https://acrobat.adobe.com/us/en/acrobat/acrobat-pro.html)
• Supports Windows and Mac
• Converts PDFs to Word, Excel, PowerPoint, HTML, and more
• Offers other PDF tools such as create, edit, combine, fill, and sign
• Provides a 7-day free trial with no watermark or page limit

Paid PDF to Word Converters with Affordable Prices and Features

A third alternative is simply to buy an affordable PDF to Word converter. Unlike cracked software, paid tools are legal, safe, and reliable, and they offer:

• The fullest feature set: large and complex PDFs, full formatting and quality preservation, batch and partial conversions, and often extras such as OCR, password protection, and cloud integration.
• Fast local performance, independent of your internet connection or file size.
• The highest quality and accuracy, with no errors, distortions, or missing elements.
• Regular updates and support from the original developers, keeping the software current, compatible, and secure, with technical assistance when problems arise.

However, paid converters have two drawbacks:

• Cost: they require a one-time payment or a subscription, and the variety of pricing plans and packages can be confusing or may not fit your budget.
• Piracy risk: popular tools are targeted by crackers who distribute illegal copies, which can harm the product's security, quality, and reputation.

Some examples of affordable paid PDF to Word converters are:

Solid Converter (https://www.soliddocuments.com/pdf-to-word-converter)
• Supports Windows
• Converts PDFs to Word, Excel, PowerPoint, HTML, and more
• Offers other PDF tools such as create, edit, combine, and split
• Includes OCR technology for scanned PDFs
• Costs $99.95 for a lifetime license

Able2Extract Professional (https://www.investintech.com/prod_a2e.htm)
• Supports Windows, Mac, and Linux
• Converts PDFs to Word, Excel, PowerPoint, HTML, and more
• Offers other PDF tools such as create, edit, annotate, and sign
• Includes OCR technology for scanned PDFs
• Costs $149.95 for a lifetime license

CleverPDF (https://www.cleverpdf.com/pdf-to-word)
• Supports Windows
• Converts PDFs to Word
• Offers other PDF tools such as compress, encrypt, and decrypt
• Includes OCR technology for scanned PDFs
• Costs $29.95 for a lifetime license

Conclusion

PDF to Word converters are useful tools: they let you edit PDF documents easily, preserve formatting and quality, and save time and storage space. Cracked software, however, is not worth it. It can expose you to malware and other security threats, require you to disable your antivirus during installation, come from untrusted sources, miss security updates, function poorly, and deprive developers of revenue. Instead, use a free online converter, a free trial of a paid converter, or an affordable paid converter. These alternatives are legal, safe, and reliable, and they offer better features, security, support, and quality than any cracked copy. We hope this article has helped you make an informed decision.

FAQs

Here are some frequently asked questions related to the topic of this article:

1. What is the difference between cracking and hacking software?

Cracking software is a specific type of hacking software, but not all hacking software is cracking software. Hacking software is anything used to manipulate or gain unauthorized access to a system, network, or device, while cracking software specifically bypasses a product's license or activation requirements. Cracking is a form of software piracy, which is illegal and unethical.

2. What are some signs that software is cracked?

• It does not require a license or activation code to unlock its full features.
• It asks you to disable your antivirus software before installing it.
• It was downloaded from an untrusted source without a secure or verified website.
• It has a different name, logo, or interface than the original version.
• It has poor quality, performance, or functionality.
• It displays ads, pop-ups, or messages unrelated to the software.
3. What are some tips to avoid malware infections from cracked software?

• Do not download or use cracked software at all; use legitimate alternatives such as free online converters, free trials, or affordable paid converters.
• Use reputable antivirus software, keep it updated, and scan your device regularly, removing any suspicious files or programs.
• Use a firewall and a VPN (virtual private network) to protect your device and data from unauthorized access or attacks.
• Use strong passwords and encryption to secure your files and folders.
• Back up your data regularly and store the backups in a safe location.
4. What factors should I consider when choosing a PDF to Word converter?

• The type of PDF you need to convert: simple or complex, editable or scanned, large or small.
• The features and functions you need, such as OCR, batch conversion, password protection, or cloud integration.
• The quality and accuracy of the conversion: formatting preservation and freedom from errors, distortions, or missing elements.
• Speed and performance, including whether conversion depends on your internet connection or file size.
• Cost and value: free or paid, online or offline, and the security, support, updates, and quality that come with it.
5. How can I convert PDF files to Word files without using any software?

One option is Google Docs, a free online word processor that can also open PDFs and export them as Word files:

1. Go to https://docs.google.com/ and sign in with your Google account.
2. Click the "File" menu and select "Open".
3. Click the "Upload" tab and drag and drop your PDF file, or click "Select a file from your device".
4. Wait for the file to upload and open in Google Docs.
5. Click the "File" menu again and select "Download".
6. Choose "Microsoft Word (.docx)" as the output format.
7. Save the converted file on your device.

diff --git a/spaces/congsaPfin/Manga-OCR/logs/Discover the Secrets of Your Neighbor Bendy in Hello Bendy Neighbor APK.md b/spaces/congsaPfin/Manga-OCR/logs/Discover the Secrets of Your Neighbor Bendy in Hello Bendy Neighbor APK.md
deleted file mode 100644
index 7ddeeb852db6249f57802c42abfb4476f73a2c8a..0000000000000000000000000000000000000000
--- a/spaces/congsaPfin/Manga-OCR/logs/Discover the Secrets of Your Neighbor Bendy in Hello Bendy Neighbor APK.md
+++ /dev/null
@@ -1,138 +0,0 @@

Hello Bendy Neighbor APK: A Scary and Fun Game for Android

If you are looking for a horror game that will keep you on the edge of your seat, try Hello Bendy Neighbor APK. It combines elements of Bendy and the Ink Machine and Hello Neighbor, two popular horror games with millions of fans around the world. You sneak into your neighbor's house to find out what he is hiding while avoiding his creepy ink-monster form. Sounds exciting, right? In this article, we cover everything you need to know: what the game is, how to download and install it, how to play it, its features, and its pros and cons. Let's get started!

What is Hello Bendy Neighbor APK?

Hello Bendy Neighbor APK is a horror simulation game developed by Cheesecake, an independent game studio. Released in 2017, it has received positive reviews from players who enjoyed its spooky atmosphere, challenging gameplay, and unique twist on the horror genre. Here is what makes it stand out:


A horror game inspired by Bendy and the Ink Machine and Hello Neighbor

If you know Bendy and the Ink Machine and Hello Neighbor, you will recognize some of the characters, settings, and themes here. The game is set in a cartoonish world where your neighbor is none other than Bendy, the ink demon from Bendy and the Ink Machine. He is not only your neighbor but also your former co-worker at an animation studio where something went terribly wrong. You must find out what happened there and why he is so obsessed with his ink machine, while dealing with his traps, his minions, and his unpredictable behavior.

A simulation game where you explore your neighbor's house without getting caught

As in Hello Neighbor, your main goal is to sneak into your neighbor's house and discover his secrets. Use stealth, strategy, and creativity to avoid detection and reach the different rooms, solving puzzles and collecting items to unlock doors, windows, and hidden passages. You never know what you will find in his house, or what he will do next: he can chase you, attack you, or set traps for you. Be careful and smart if you want to survive.

A game with high-quality graphics, sounds, and controls

Hello Bendy Neighbor APK also impresses with its presentation. The 3D visuals create a realistic, immersive experience; the dark, eerie soundtrack adds tension and suspense; and the smooth, intuitive controls let you move around, interact with objects, and escape from your neighbor with ease. You can also adjust the settings to suit your preferences and your device's performance.

How to download and install Hello Bendy Neighbor APK?

To play Hello Bendy Neighbor APK on an Android device, you need to download and install the game's APK file. An APK file is the package format that contains an Android app's installation files. APK files are hosted by various sites such as APKPure, APKMirror, and APKMonk, but always be careful when downloading APK files from unknown sources, as they may contain viruses or malware that can harm your device. Here is how to download and install the game safely:

Download the APK file from a trusted source

First, find a reliable and reputable website that offers the APK file. You can search your browser for the game's name plus the word "APK", or go directly to one of the well-known APK repositories mentioned above. Once you find a website you trust, click its download button or link to start downloading. The file is about 100 MB, so make sure you have enough storage space and a stable internet connection before downloading.

Enable unknown sources on your device settings

Next, enable unknown sources in your device settings. This security feature normally prevents you from installing apps that are not from the official Google Play Store, so installing a third-party APK requires turning it on:

1. Open your device settings and look for the security or privacy options.
2. Find the option labeled "unknown sources" or "install unknown apps" and toggle it on.
3. A warning about the risks of installing apps from unknown sources will appear; tap "OK" or "Allow" to confirm.

Now you are ready to install the APK file on your device.


Install the APK file and enjoy the game

The final step is to install the APK file of Hello Bendy Neighbor APK on your device. To do this, follow these steps:

1. Locate the APK file that you downloaded on your device. You can use a file manager app or your browser's download history to find it.
2. Tap on the APK file to start the installation process. You may see a pop-up window asking you to confirm the installation. Tap on "Install" or "Yes" to proceed.
3. Wait for the installation to finish. It may take a few minutes depending on your device and the file size.
4. Once the installation is done, you can tap on "Open" or "Done" to launch the game or exit the installer.

      Congratulations! You have successfully downloaded and installed Hello Bendy Neighbor APK on your Android device. You can now enjoy playing this scary and fun game anytime and anywhere.
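If you have a computer handy, the same install can be done over USB with adb instead of tapping through the on-device installer. This is only a sketch: the APK filename below is hypothetical, and it assumes USB debugging is enabled on the phone and Android platform-tools are installed on the computer.

```shell
# Hypothetical filename - substitute the actual file you downloaded.
APK="HelloBendyNeighbor.apk"
echo "sideloading: $APK"

# -r replaces the app if an older build is already installed.
# The fallback message keeps the sketch harmless when no phone is attached.
adb install -r "$APK" 2>/dev/null || echo "no device attached - skipping sideload"
```

Either route ends with the same installed app; adb just saves repeated tapping if you reinstall often.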


      How to play Hello Bendy Neighbor APK?


      Now that you have installed Hello Bendy Neighbor APK on your device, you may be wondering how to play it. Don't worry, we will guide you through the basics of the game and give you some tips and tricks to help you in your adventure. Here are some of the things you need to know about playing Hello Bendy Neighbor APK:


      Choose your difficulty level and start the game


      When you launch the game, you will see a main menu with several options. You can choose to start a new game, continue a previous game, change the settings, or exit the game. If you are new to the game, we recommend that you start a new game and choose your difficulty level. There are three difficulty levels in Hello Bendy Neighbor APK: easy, normal, and hard. The difficulty level affects how smart and aggressive your neighbor is, how many clues and hints you get, and how many lives you have. Choose the one that suits your skill and preference, and then tap on "Start" to begin the game.


      Use the joystick to move and the buttons to interact with objects


      The game has a simple and intuitive control system that allows you to move around and interact with objects in the game. On the left side of the screen, you will see a joystick that you can use to move your character in different directions. On the right side of the screen, you will see several buttons that you can use to perform various actions, such as jumping, crouching, running, picking up items, using items, or throwing items. You can also tap on objects in the environment to interact with them, such as opening doors, turning on lights, or hiding in closets.


      Avoid your neighbor and find clues to uncover his secrets


      The main objective of the game is to avoid your neighbor and find clues to uncover his secrets. Your neighbor is not friendly at all, and he will try to catch you if he sees you or hears you in his house. If he catches you, he will send you back to your house or kill you depending on the difficulty level. You have to be stealthy and smart to avoid his detection and reach different rooms in his house. You also have to find clues that will help you understand his past, his motives, and his plans. These clues can be notes, photos, tapes, keys, or other items that are hidden or scattered around his house. You have to collect them and use them to unlock new areas or solve puzzles in the game.

      What are the features of Hello Bendy Neighbor APK?


      Hello Bendy Neighbor APK is a game that has a lot of features that make it fun and enjoyable to play. Some of these features are:


      High quality 3D graphics and sounds


The game has high-quality 3D graphics that create a realistic and immersive environment. A dark and eerie soundtrack adds to the tension and suspense, and realistic sound effects, such as your neighbor's footsteps, creaking doors, and the screams of the ink monsters, make you feel like you are inside the game.


      Interesting and challenging activities and puzzles


The game has plenty of interesting and challenging activities and puzzles to complete in order to progress. You explore your neighbor's house to find clues that uncover his secrets, solve puzzles that test your logic, memory, and creativity, and avoid your neighbor and his traps, or fight back if you have to. The game also has a lot of replay value, with different endings depending on your choices and actions.


      Hints and tips to help you in the game


      If you are stuck or need some help in the game, you can use the hints and tips feature that will give you some guidance and advice on what to do next. You can access the hints and tips by tapping on the question mark icon on the top right corner of the screen. The hints and tips will tell you what your current objective is, what items you need, or where to go. However, you should use the hints and tips sparingly, as they will reduce your score and make the game less challenging.


      What are the pros and cons of Hello Bendy Neighbor APK?


      Like any other game, Hello Bendy Neighbor APK has its pros and cons that you should consider before playing it. Here are some of the pros and cons of Hello Bendy Neighbor APK:


      Pros:

• A fun and scary game for horror fans: If you love horror games, then you will love Hello Bendy Neighbor APK. The game has a spooky atmosphere, a creepy story, and a terrifying antagonist that will keep you on your toes. The game also has jump scares, gore, and violence that will make your heart race.
• A free and easy to install game for Android devices: You can download and install Hello Bendy Neighbor APK for free on your Android device without any hassle. You just need to follow the steps we mentioned above and enjoy the game. You don't need to pay anything or register anything to play the game.
• A game with a lot of replay value and different endings: The game has a lot of replay value and different endings depending on your choices and actions in the game. You can play the game multiple times and see different outcomes and scenarios. You can also try different difficulty levels and challenge yourself more.

      Cons:

• A game that may be too scary or violent for some players: The game is not suitable for everyone, especially for young children or people who are sensitive to horror or violence. The game has a lot of scary scenes, blood, gore, and violence that may disturb some players. The game also has a lot of jump scares that may cause anxiety or panic attacks for some players.
• A game that may have some bugs or glitches: The game is not perfect, and it may have some bugs or glitches that may affect your gameplay or experience. Some of these bugs or glitches may include crashing, freezing, lagging, or errors. You may also encounter some problems with the graphics, sounds, or controls of the game.
• A game that may require a lot of storage space and battery power: The game is quite large in size, about 100 MB, so you will need enough storage space on your device to download and install it. You will also need enough battery power to play the game for a long time, as it may drain your battery quickly.

      Conclusion


Hello Bendy Neighbor APK is a horror simulation game that combines elements of Bendy and the Ink Machine and Hello Neighbor. In this game, you have to sneak into your neighbor's house and find out what he is hiding, while avoiding his creepy ink monster form. The game has high-quality graphics, sounds, and controls, as well as interesting and challenging activities and puzzles, along with the features, pros, and cons we discussed in this article.

If you are a fan of horror games, you should give Hello Bendy Neighbor APK a try. It is a scary and fun game that will test your nerves and skills, and you can download and install it for free on your Android device and play it anytime and anywhere. However, keep its downsides in mind, such as its scariness, its occasional bugs, and its battery consumption, and play responsibly rather than letting it affect your mental or physical health. We hope this article has helped you learn more about Hello Bendy Neighbor APK and how to play it. Have fun and good luck!

      FAQs


      Here are some of the frequently asked questions about Hello Bendy Neighbor APK:

| Question | Answer |
| --- | --- |
| Is Hello Bendy Neighbor APK safe to download and install? | Yes, as long as you download and install the APK file from a trusted source, such as the ones we mentioned in this article. You should also enable unknown sources in your device settings before installing the APK file. |
| Is Hello Bendy Neighbor APK compatible with my device? | Hello Bendy Neighbor APK is compatible with most Android devices that have Android 4.1 or higher. However, some devices may not support the game or may experience some issues with it. |
| How can I update Hello Bendy Neighbor APK? | You can update Hello Bendy Neighbor APK by downloading and installing the latest version of the APK file from the same source that you used before. You should also uninstall the previous version of the game before installing the new one. |
| How can I contact the developer of Hello Bendy Neighbor APK? | You can contact the developer by sending an email to cheesecakegames@gmail.com. You can also visit their website or their Facebook page for more information. |
| How can I get more hints and tips for Hello Bendy Neighbor APK? | You can watch videos or read articles online that show you how to play the game or how to solve certain puzzles. You can also join online communities or forums where you can ask questions or share your experiences with other players. |
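On the safety question above, one extra check before installing any APK is to compare its SHA-256 checksum against the one published by the download site, when the site provides one. A minimal sketch, using a stand-in file because the real APK and its published hash depend on where you downloaded it:

```shell
# Stand-in for the downloaded APK - point this at your real file instead.
printf 'hello' > /tmp/sample.apk

# Print the SHA-256 checksum and compare it against the published hash;
# any mismatch means the file was corrupted or tampered with in transit.
sha256sum /tmp/sample.apk
```

On macOS the equivalent command is `shasum -a 256`.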

      \ No newline at end of file diff --git a/spaces/congsaPfin/Manga-OCR/logs/FB Lite The Best Way to Connect with Friends on Android.md b/spaces/congsaPfin/Manga-OCR/logs/FB Lite The Best Way to Connect with Friends on Android.md deleted file mode 100644 index a4dc94a56412a4dab81997b7d568fb584f523569..0000000000000000000000000000000000000000 --- a/spaces/congsaPfin/Manga-OCR/logs/FB Lite The Best Way to Connect with Friends on Android.md +++ /dev/null @@ -1,153 +0,0 @@ -

      FB Apps Download Lite: A Faster and Easier Way to Connect with Friends


      Do you want to keep in touch with your friends and family on Facebook, but you have a slow or unstable internet connection? Do you want to save space and data on your phone while enjoying the features of Facebook? If you answered yes, then you might want to try FB Apps Download Lite, a smaller and lighter version of the full Facebook app. In this article, we will tell you what FB Apps Download Lite is, how to download and install it, how to use it, and how to troubleshoot common issues with it.


      fb apps download lite


      Download File ✪✪✪ https://urlca.com/2uO89W




      What is FB Apps Download Lite?


      FB Apps Download Lite is an official Facebook client that lets you access the social network with less data and works in all network conditions. It is designed for Android phones that are not supported by the regular Facebook app, or for users who want a faster and simpler experience. FB Apps Download Lite has many of the classic features of Facebook, such as sharing to a Timeline, liking photos, searching for people, and editing your profile and groups. It also has some specific features, such as:

• It installs fast – the app is smaller, so it's quick to download and uses less storage space.
• It works on old Android phones – you can use it on older Android phones that run Android 2.3 or higher.
• It uses less data – it is more efficient with your mobile data. You can save money by using less data.
• It loads quickly – it is the fastest app from Facebook. You can upload photos faster and see updates from friends.
• It works on all networks – it is designed for 2G networks and areas with slow or unstable internet connections.
• It has a dark mode – you can switch to a dark theme to reduce eye strain and save battery life.

      The benefits of using FB Apps Download Lite


      By using FB Apps Download Lite, you can enjoy the following benefits:

• You can stay connected with your friends and interests even if you have a poor internet connection.
• You can save space on your phone and free up memory for other apps.
• You can reduce your data usage and save money on your phone bill.
• You can load pages faster and avoid lagging or crashing.
• You can customize your app settings and preferences according to your needs.

      How to download and install FB Apps Download Lite


      To download and install FB Apps Download Lite, follow these steps:


1. Go to the Google Play Store on your Android phone and search for "Facebook Lite".
2. Tap on the app icon and then tap on "Install".
3. Wait for the app to download and install on your phone.
4. Once the app is installed, tap on "Open" to launch it.

      How to use FB Apps Download Lite


      How to log in and set up your profile


      To log in and set up your profile on FB Apps Download Lite, follow these steps:

1. Enter your email address or phone number and password that you use for your Facebook account.
2. Tap on "Log In" to access your account.
3. If you don't have a Facebook account yet, tap on "Create New Account" and follow the instructions.
4. To set up your profile, tap on the menu icon (three horizontal lines) at the top right corner of the app.
5. Tap on your name and profile picture to go to your profile page.
6. Tap on the "Edit Profile" button to change your name, bio, education, work, and other details.
7. Tap on the camera icon to change your profile picture or cover photo.
8. Tap on the "Save" button to save your changes.

        How to post, like, comment, and share on FB Apps Download Lite


        To post, like, comment, and share on FB Apps Download Lite, follow these steps:

1. To post something on your Timeline or a friend's Timeline, tap on the "What's on your mind?" box at the top of the app.
2. Type your message, add a photo or video, tag friends, check in to a location, or add a feeling or activity.
3. Tap on the "Post" button to share your post.
4. To like a post, tap on the "Like" button below the post. You can also choose a reaction emoji by tapping and holding the "Like" button.
5. To comment on a post, tap on the "Comment" button below the post. Type your comment and tap on the "Send" button.
6. To share a post, tap on the "Share" button below the post. You can choose to share it on your Timeline, in a group, in a message, or with a friend.

        How to chat, call, and video call on FB Apps Download Lite


        To chat, call, and video call on FB Apps Download Lite, follow these steps:

1. To chat with a friend, tap on the chat icon (speech bubble) at the top right corner of the app.
2. Select a friend from your contacts list or search for their name.
3. Type your message and tap on the "Send" button. You can also send stickers, emojis, photos, videos, voice messages, and GIFs by tapping on the icons above the keyboard.
4. To call or video call a friend, tap on the phone icon or the video icon at the top right corner of the chat screen.
5. Wait for your friend to answer and enjoy your conversation.
6. To end the call or video call, tap on the red button at the bottom of the screen.

        How to troubleshoot common issues with FB Apps Download Lite


        How to update FB Apps Download Lite


        To update FB Apps Download Lite, follow these steps:

1. Go to the Google Play Store on your Android phone and search for "Facebook Lite".
2. If there is an update available for the app, you will see an "Update" button next to it. Tap on it to download and install the latest version of the app.
3. If there is no update available, you will see an "Open" button instead. This means you already have the latest version of the app installed on your phone.

        How to clear cache and data on FB Apps Download Lite


        To clear cache and data on FB Apps Download Lite, follow these steps:

1. Go to the Settings app on your Android phone and tap on "Apps".
2. Find and tap on "Facebook Lite".
3. Tap on "Storage".
4. Tap on "Clear Cache" to delete temporary files that may slow down or crash the app.
5. Tap on "Clear Data" to delete all your account information and preferences that may cause errors or glitches in the app. Note that this will log you out of the app, and you will have to log in again with your email address or phone number and password.
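The Settings route above has a one-line equivalent over adb for readers comfortable with a terminal. Treat the package id as an assumption (com.facebook.lite is what Facebook Lite normally uses, but confirm it on your own device), and note that, just like tapping "Clear Data", this wipes your saved login.

```shell
# Assumed package id - confirm with: adb shell pm list packages | grep facebook
PKG="com.facebook.lite"
echo "clearing app data for $PKG"

# pm clear removes cache and data in one step, so you must log in again.
# Degrades to a message when no device is attached.
adb shell pm clear "$PKG" 2>/dev/null || echo "no device attached - skipping"
```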
        10. How to contact support and report problems on FB Apps Download Lite -

          To contact support and report problems on FB Apps Download Lite, follow these steps:

1. Tap on the menu icon (three horizontal lines) at the top right corner of the app.
2. Scroll down and tap on "Help & Support".
3. Tap on "Report a Problem".
4. Select the type of problem you are facing, such as "Something Isn't Working", "Abusive Content", or "General Feedback".
5. Describe the problem in detail and attach a screenshot if possible.
6. Tap on "Send" to submit your report.

          Conclusion


          FB Apps Download Lite is a great alternative to the regular Facebook app for users who want a faster and easier way to connect with their friends. It has many of the same features as the full app, but it uses less data, works on all networks, and runs on older Android phones. It is also easy to download, install, and use. However, if you encounter any issues with the app, you can always update it, clear its cache and data, or contact support and report problems. We hope this article has helped you learn more about FB Apps Download Lite and how to use it. If you have any questions or feedback, feel free to leave a comment below.


          FAQs


          Here are some frequently asked questions about FB Apps Download Lite:

• Q: Is FB Apps Download Lite free?
• A: Yes, FB Apps Download Lite is free to download and use. However, you may incur data charges from your mobile carrier depending on your plan and usage.
• Q: Can I use FB Apps Download Lite on my iPhone or iPad?
• A: No, FB Apps Download Lite is only available for Android devices. If you have an iOS device, you can use the regular Facebook app or the Facebook website in your browser.
• Q: What is the difference between FB Apps Download Lite and Messenger Lite?
• A: FB Apps Download Lite is a lighter version of the full Facebook app that lets you access the social network with less data and works in all network conditions. Messenger Lite is a lighter version of the Messenger app that lets you send messages, photos, videos, voice notes, and stickers with less data. You can use both apps separately or together depending on your preference.
• Q: How can I switch to dark mode on FB Apps Download Lite?
• A: Tap on the menu icon (three horizontal lines) at the top right corner of the app, scroll down and tap on "Settings", then tap on "Dark Mode" and toggle it on or off.
• Q: How can I delete my account on FB Apps Download Lite?
• A: Tap on the menu icon at the top right corner of the app, then go to "Settings" > "Account Settings" > "Personal Information" > "Manage Account". Tap on "Deactivate" or "Delete Account" depending on your choice, and follow the instructions to confirm your action.

          \ No newline at end of file diff --git a/spaces/congsaPfin/Manga-OCR/logs/FoxOne Special Mission Mod APK The Ultimate Air Combat Game with Unlimited Money and Unlocked Planes.md b/spaces/congsaPfin/Manga-OCR/logs/FoxOne Special Mission Mod APK The Ultimate Air Combat Game with Unlimited Money and Unlocked Planes.md deleted file mode 100644 index e1e309d33cbfedfdece4a6f1f4c63ff1bef329fb..0000000000000000000000000000000000000000 --- a/spaces/congsaPfin/Manga-OCR/logs/FoxOne Special Mission Mod APK The Ultimate Air Combat Game with Unlimited Money and Unlocked Planes.md +++ /dev/null @@ -1,81 +0,0 @@ - -

          Fox One Special Mission Mod Apk: A Thrilling Air Combat Game


          Do you love flying and fighting in the sky? Do you want to experience the thrill of being a warplane pilot? If yes, then you should try Fox One Special Mission Mod Apk, a realistic and exciting air combat game that will test your skills and nerves.


          fox one special mission mod apk


          DOWNLOAD ===== https://urlca.com/2uO7gA




          What is Fox One Special Mission Mod Apk?


          Fox One Special Mission Mod Apk is a modified version of Fox One Special Missions, a 3D action flight simulator game developed by SkyFox. In this game, you take on the role of a warplane pilot, fly at supersonic speeds, and take on a variety of missions, including mid-air combat with enemies, destroying enemy targets, protecting allied forces, and more. You can choose from over 15 different planes, each with its own characteristics and capabilities, and customize them with different weapons and skins. You can also explore different scenarios, such as deserts, mountains, oceans, cities, and more.


          Features of Fox One Special Mission Mod Apk


          Fox One Special Mission Mod Apk has some amazing features that make it more fun and enjoyable than the original game. Here are some of them:


          Unlimited Money


          With this mod apk, you don't have to worry about running out of money to buy new planes, weapons, or upgrades. You can get unlimited money by completing missions or simply by using the mod menu. You can use this money to buy anything you want in the game.


          All Planes Unlocked


          Another great feature of this mod apk is that it unlocks all the planes in the game. You don't have to wait for a certain level or complete a certain mission to unlock them. You can access all the planes from the beginning and fly them as you wish.


          No Ads


          One of the most annoying things about free games is that they have ads that interrupt your gameplay. However, with this mod apk, you don't have to deal with any ads at all. You can enjoy the game without any distractions or interruptions.



          How to Download and Install Fox One Special Mission Mod Apk?


          If you want to download and install Fox One Special Mission Mod Apk on your device, you need to follow these simple steps:


          Step 1: Download the mod apk file from a trusted source


          The first thing you need to do is to download the mod apk file from a reliable source. You can use this link to download it safely and quickly.


          Step 2: Enable unknown sources on your device


          The next thing you need to do is to enable unknown sources on your device. This will allow you to install apps that are not from the Google Play Store. To do this, go to Settings > Security > Unknown Sources and toggle it on.


          Step 3: Install the mod apk file and enjoy the game


          The final thing you need to do is to install the mod apk file on your device. To do this, locate the file in your downloads folder and tap on it. Follow the instructions on the screen and wait for the installation to finish. Once it's done, you can launch the game and enjoy it.
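Before actually installing the file, cautious readers can check who signed it with apksigner, the signature-verification tool that ships with the Android SDK build-tools. The filename below is hypothetical, and the tool is only available if the build-tools are installed and on your PATH.

```shell
# Hypothetical filename - substitute the file from Step 1.
APK="FoxOneSpecialMission.apk"
echo "verifying signature of: $APK"

# --print-certs shows the signing certificate's subject and digests,
# which should stay identical across legitimate updates of the same app.
apksigner verify --print-certs "$APK" 2>/dev/null \
  || echo "apksigner or APK not found - skipping verification"
```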


          Tips and Tricks for Playing Fox One Special Mission Mod Apk


          Fox One Special Mission Mod Apk is not an easy game to play. You need to have some skills and strategies to complete the missions and defeat the enemies. Here are some tips and tricks that can help you improve your gameplay:


          Choose the right plane for each mission


          Not all planes are suitable for every mission. Some planes are faster, some are more agile, some have more firepower, and some have more stealth. You need to choose the plane that matches the requirements and objectives of each mission. For example, if you need to fly low and avoid radar detection, you should choose a stealth plane like the F-117 Nighthawk. If you need to engage in a dogfight with enemy fighters, you should choose a maneuverable plane like the F-16 Fighting Falcon.


          Use the radar and the HUD to locate enemies and targets


          The radar and the HUD (head-up display) are your best friends in this game. They show you the location and status of your enemies and targets, as well as your own plane. You can use them to plan your attacks, avoid enemy fire, and complete your objectives. For example, you can use the radar to see if there are any enemies behind you or in front of you, and use the HUD to aim your weapons and fire at them.


          Master the controls and the maneuvers


          The controls and the maneuvers of this game are realistic and challenging. You need to master them if you want to survive and succeed in this game. You can use the virtual joystick or the accelerometer to control your plane, and use the buttons to fire your weapons, activate your afterburner, or eject from your plane. You can also perform various maneuvers, such as barrel rolls, loops, turns, dives, and more. You need to practice these controls and maneuvers until you feel comfortable and confident with them.


          Upgrade your planes and weapons


          As you progress in this game, you will face more difficult missions and enemies. You need to upgrade your planes and weapons to keep up with them. You can use the money you earn from completing missions or from the mod apk to buy new planes, weapons, or upgrades. You can also change the skins of your planes to make them look cooler. Upgrading your planes and weapons will make them more powerful, faster, and durable.


          Conclusion


          Fox One Special Mission Mod Apk is a thrilling air combat game that will give you a realistic and exciting flying experience. You can fly over 15 different planes, take on various missions, fight against enemies, and explore different scenarios. You can also enjoy unlimited money, all planes unlocked, and no ads with this mod apk. If you love air combat games, you should download Fox One Special Mission Mod Apk today.

          -

          FAQs

          -

          Here are some frequently asked questions about Fox One Special Mission Mod Apk:

          -
            -
          • Is Fox One Special Mission Mod Apk safe to download?
          • -

Yes, Fox One Special Mission Mod Apk is safe to download as long as you download it from a trusted source like this link. However, you should always scan any file you download with antivirus software before installing it on your device.

            -
          • Is Fox One Special Mission Mod Apk compatible with my device?
          • -

            Fox One Special Mission Mod Apk is compatible with most Android devices that have Android 4.1 or higher. However, some devices may not support some features or graphics of this game due to their specifications or performance.

            -
          • How can I contact the developer of Fox One Special Mission Mod Apk?
          • -

            You can contact the developer of Fox One Special Mission Mod Apk by visiting their website or by sending them an email at skyfox@skyfoxgames.com.

            -
          • Can I play Fox One Special Mission Mod Apk offline?
          • -

            Yes, you can play Fox One Special Mission Mod Apk offline without any internet connection. However, some features or functions may not work properly or be available offline.

            -
          • Can I play Fox One Special Mission Mod Apk with my friends?
          • -

            No, Fox One Special Mission Mod Apk does not have a multiplayer mode or feature. You can only play this game solo against AI enemies.

            -
          - : https://www.apkmody.io/games/fox-one-special-missions.html : http://www.skyfoxgames.com/

          197e85843d
          -
          -
          \ No newline at end of file diff --git a/spaces/congsaPfin/Manga-OCR/logs/Why You Need Google Play Store APK Mirror 2022 and How to Get It.md b/spaces/congsaPfin/Manga-OCR/logs/Why You Need Google Play Store APK Mirror 2022 and How to Get It.md deleted file mode 100644 index 02415821165a419956340e20ebaa529b2129917b..0000000000000000000000000000000000000000 --- a/spaces/congsaPfin/Manga-OCR/logs/Why You Need Google Play Store APK Mirror 2022 and How to Get It.md +++ /dev/null @@ -1,116 +0,0 @@ -
          -

          Google Play Store APK Mirror 2022: How to Download and Install

          -

          If you are an Android user, you probably know what Google Play Store is. It is the official app store for Android devices, where you can find millions of apps, games, books, movies, music, and more. But did you know that there is a way to get the latest version of Google Play Store without waiting for updates from your device manufacturer or carrier? And that you can also access apps that are not available in your region or on your device model? In this article, we will show you how to download and install Google Play Store APK Mirror 2022, a website that hosts APK files for Android apps.

          -

          google play store apk mirror 2022


          Download - https://urlca.com/2uOaLa



          -

          What is Google Play Store?

          -

          The official app store for Android devices

          -

          Google Play Store is the default app store for Android devices, where you can browse, download, and update apps and games. You can also buy or rent digital content such as books, movies, music, and magazines. Google Play Store also offers various services such as Google Play Protect, which scans your device for malware and harmful apps, Google Play Games, which lets you play online games and track your achievements, and Google Play Pass, which gives you access to hundreds of premium apps and games for a monthly fee.

          -

          The benefits of using Google Play Store

          -

          Using Google Play Store has many benefits, such as:

          -
            -
          • You can access a large and diverse collection of apps and games for different categories and purposes.
          • -
          • You can enjoy high-quality and secure apps and games that are verified by Google.
          • -
          • You can get automatic updates for your apps and games whenever they are available.
          • -
          • You can manage your apps and games easily with features such as uninstalling, clearing cache, moving to SD card, etc.
          • -
          • You can discover new and recommended apps and games based on your preferences and interests.
          • -
          • You can backup and restore your apps and games data using Google Drive.
          • -
          -

          What is APK Mirror?

          -

          A website that hosts APK files for Android apps

          -

          APK Mirror is a popular website that hosts APK files for Android apps. APK files are the installation files for Android apps, similar to EXE files for Windows programs. You can download APK files from APK Mirror and install them on your device manually, without using Google Play Store. APK Mirror is run by the same team behind Android Police, a trusted source of Android news and reviews.
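
Since an APK file is just a ZIP archive with a specific internal layout, you can run a quick sanity check on any download before installing it. The Python sketch below is illustrative only — it does not replace an antivirus scan or a signature check — but it confirms that a file at least opens as a ZIP and contains the AndroidManifest.xml entry every valid APK must have:

```python
import zipfile

def looks_like_apk(path: str) -> bool:
    """Rough sanity check for a downloaded APK.

    An APK is a ZIP archive containing an AndroidManifest.xml entry.
    This catches truncated or mislabeled downloads; it does NOT prove
    the file is safe or correctly signed.
    """
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as zf:
        return "AndroidManifest.xml" in zf.namelist()
```

If the check fails, re-download the file before trying to install it.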

          -

          The advantages of using APK Mirror

          -

          Using APK Mirror has some advantages, such as:

          -
            -
          • You can get the latest version of Google Play Store and other apps before they are officially released or rolled out to your device.
          • -
          • You can access apps that are not available in your region or on your device model due to regional restrictions or device compatibility issues.
          • -
          • You can install Google Play Store on devices that don't have it pre-installed, such as Amazon Fire tablets, Huawei phones, etc.
          • -
          • You can download older versions of apps if you prefer them over the newer ones or if they work better on your device.
          • -
          • You can save bandwidth and storage space by downloading only the APK files you need instead of the whole app package.
          • -
          -

Step 2: Enable unknown sources in your device settings

The second step to download and install Google Play Store APK Mirror 2022 is to enable unknown sources in your device settings. Unknown sources are sources that are not verified by Google or your device manufacturer. By default, your device does not allow you to install apps from unknown sources for security reasons. However, if you trust the source of the APK file, such as APKMirror, you can enable unknown sources to install it. To enable unknown sources, follow these steps:

1. Go to your device settings and look for the security or privacy option.
2. Find the option that says "Unknown sources" or "Install unknown apps" and toggle it on.
3. Confirm your choice by tapping "OK" or "Allow".

          Step 3: Install the APK file using a file browser or APKMirror's app

          -

          The third and final step to download and install Google Play Store APK Mirror 2022 is to install the APK file using a file browser or APKMirror's app. A file browser is an app that lets you access and manage the files and folders on your device. APKMirror's app is an app that lets you download and install APK files from APKMirror directly on your device.

          -

          To install the APK file using a file browser, follow these steps:

          -


          -
            -
          1. Open the file browser app on your device and locate the APK file that you downloaded from APKMirror.
          2. -
          3. Tap on the APK file and select "Install".
          4. -
          5. Wait for the installation to finish and tap "Open" to launch Google Play Store.
          6. -
          -
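
Before tapping "Install", cautious users can also verify the download's integrity by computing its SHA-256 checksum and comparing it with the hash shown on the download page, when one is provided. Here is a minimal Python sketch — the file name and the expected hash in the comment are hypothetical:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file.

    The file is read in chunks so large APKs don't need
    to fit in memory all at once.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the checksum published on the download page, e.g.:
# assert sha256_of("com.android.vending.apk") == "ad7f..."  # hypothetical value
```

If the digests don't match, the file was corrupted or tampered with in transit and should not be installed.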

          To install the APK file using APKMirror's app, follow these steps:

          -
            -
          1. Download and install APKMirror's app from [here].
          2. -
          3. Open the app and search for "Google Play Store" in the search bar.
          4. -
          5. Select the version of Google Play Store that you want to install and tap on "Install".
          6. -
          7. Wait for the installation to finish and tap "Open" to launch Google Play Store.
          8. -
          -

          Conclusion

          -

          In this article, we have shown you how to download and install Google Play Store APK Mirror 2022, a website that hosts APK files for Android apps. By doing so, you can access the latest version of Google Play Store, bypass regional restrictions and device compatibility issues, and install Google Play Store on devices that don't have it pre-installed. You can also enjoy other benefits of using APK Mirror, such as getting older versions of apps, saving bandwidth and storage space, and more.

          -

          We hope you found this article helpful and informative. If you have any questions or feedback, please feel free to leave a comment below. Thank you for reading!

          -

          FAQs

          -
            -
          • Q: Is it safe to download and install Google Play Store APK Mirror 2022?
          • -
          • A: Yes, it is safe to download and install Google Play Store APK Mirror 2022 from a reputable site like APKMirror. However, you should always be careful when downloading and installing apps from unknown sources, as they may contain malware or viruses. You should also check the permissions and reviews of the apps before installing them.
          • -
          • Q: Will I lose my data or settings if I update Google Play Store using APK Mirror?
          • -
          • A: No, you will not lose your data or settings if you update Google Play Store using APK Mirror. Your apps and games data will be backed up on Google Drive, and your settings will be synced with your Google account. You can also restore your data and settings if you uninstall or downgrade Google Play Store.
          • -
          • Q: Can I use both Google Play Store and APK Mirror on my device?
          • -
          • A: Yes, you can use both Google Play Store and APK Mirror on your device. You can use Google Play Store for downloading and updating apps and games from the official source, and use APK Mirror for downloading and installing apps and games that are not available or compatible with your device. However, you should not install the same app or game from both sources, as it may cause conflicts or errors.
          • -
          • Q: How can I uninstall Google Play Store APK Mirror 2022?
          • -
          • A: You can uninstall Google Play Store APK Mirror 2022 by following these steps:
          • -
              -
            1. Go to your device settings and look for the apps or applications option.
            2. -
            3. Find and tap on "Google Play Store" in the list of apps.
            4. -
            5. Tap on "Uninstall" and confirm your choice by tapping "OK".
            6. -
            -
          • Q: How can I contact APKMirror if I have any issues or suggestions?
          • -
• A: You can contact APKMirror by visiting their website [here] and filling out their contact form [here]. You can also follow them on their social media accounts [here]. They are always happy to hear from their users and improve their service.
          • -

          401be4b1e0
          -
          -
          \ No newline at end of file diff --git a/spaces/contluForse/HuggingGPT/assets/Aparichit 2005 Full Movie In Hindi Free Download [REPACK].md b/spaces/contluForse/HuggingGPT/assets/Aparichit 2005 Full Movie In Hindi Free Download [REPACK].md deleted file mode 100644 index 645b1c9324cdbdb7ac5a6b6093d02a6737fe5633..0000000000000000000000000000000000000000 --- a/spaces/contluForse/HuggingGPT/assets/Aparichit 2005 Full Movie In Hindi Free Download [REPACK].md +++ /dev/null @@ -1,8 +0,0 @@ -

          aparichit 2005 full movie in hindi free download


          DOWNLOAD ✦✦✦ https://ssurll.com/2uzyYw



- -July 27, 2013 - Resumable Mediafire download link for the Hindi movie Aparichit (2005) - 300MB compressed size; watch online or download. Watch the full movie online in Hindi... The title of the movie can be considered a joke. The film begins with a scene in which a policeman says that "he can't find something. If you can find it, we will be very happy." -In the movie I Won't Cry Anymore, he also goes by the name "Aparichit". -The film was produced by Kabir Kumar Films in 2005. 8a78ff9644
          -
          -
          -

          diff --git a/spaces/contluForse/HuggingGPT/assets/Erply Point Of Sale For Windows Cracked How to Get It for Free.md b/spaces/contluForse/HuggingGPT/assets/Erply Point Of Sale For Windows Cracked How to Get It for Free.md deleted file mode 100644 index 63b862f8a64997d8c5605a3eadf0b5307940e0fe..0000000000000000000000000000000000000000 --- a/spaces/contluForse/HuggingGPT/assets/Erply Point Of Sale For Windows Cracked How to Get It for Free.md +++ /dev/null @@ -1,13 +0,0 @@ -
          -

          Customer groups also come in handy when you need to generate sales reports. For example, a clothing store that groups its customers by age will quickly notice that teenage boys buy different clothing than adult men. Although an obvious example, it points to the subtle buying trends that can be identified with customer grouping.
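
The idea above can be sketched in a few lines of Python. The customer groups, categories, and amounts below are invented for illustration; a real POS system would pull these rows from its sales database:

```python
from collections import defaultdict

# Toy sales log: (customer_group, product_category, amount).
sales = [
    ("teen", "hoodies", 40.0),
    ("teen", "sneakers", 65.0),
    ("adult", "suits", 250.0),
    ("adult", "sneakers", 80.0),
    ("teen", "hoodies", 35.0),
]

def revenue_by_group(rows):
    """Total revenue per (customer group, product category) pair."""
    totals = defaultdict(float)
    for group, category, amount in rows:
        totals[(group, category)] += amount
    return dict(totals)

report = revenue_by_group(sales)
# report[("teen", "hoodies")] sums both hoodie sales to teens: 75.0
```

Even on this tiny sample, the report makes the group-level trend visible at a glance — exactly the kind of pattern the paragraph above describes.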

          -

          Erply Point Of Sale For Windows Cracked


          Downloadhttps://ssurll.com/2uzvA6



          -

POS software is a system made up of hardware and software that enables sales processing for day-to-day operations. The point of sale is often referred to as the point of service because it is not just a point of sale but also a point of return or customer order. POS terminal software may also include features for additional functionality, such as inventory management, CRM, financials, or warehousing.

          -

          ERPLY comes with everything you would expect in a full-service point of sale system, featuring gift cards, deep reporting, and excellent inventory management at its higher tier. It also has an open API for developers and integrates with myriad credit card processors, giving you options to shop around for the best rate. In short, ERPLY is a system to check out even if it might not blow you away at first glance.

          -

You may think of point-of-sale (POS) systems as just ordinary cash registers, but they're evolving to do much more. They can integrate with mobile devices and cloud services and satisfy the software and hardware requirements of users. In addition, you can use them with back-end accounting systems and credit card payment processors. Many small to midsize businesses (SMBs) are using these types of cloud-enabled POS services without the need to keep a physical back-end server in multiple locations.

          -

          A point of sale (POS) system is a computer and software setup that provides retailers with key mission-critical functions to help run their businesses. It is an indispensable retail management tool. The main functions are:

          -

          -

          Offers a wide range of point-of-sale terminals as well as complete POS systems. Payroll and tax management, competitive sales tracking and inventory management, credit card processing and other payment solutions

          -

          Add new customers at POS to begin tracking; previous customer lists can be imported from legacy system. Create and track gift cards right in ERPLY, no third party software needed. Define loyalty-points rules and create promotional incentives. Send receipts, quotes or any other sales document directly to customers via email.
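
To illustrate what a loyalty-points rule might look like, here is a toy Python sketch. The rate, threshold, and bonus values are invented, and ERPLY's actual rule engine is configurable and not shown here — this only demonstrates the concept:

```python
def loyalty_points(amount: float, rate: float = 1.0,
                   bonus_threshold: float = 100.0, bonus: int = 50) -> int:
    """Hypothetical rule: 1 point per currency unit spent, plus a flat
    bonus when a single sale reaches the threshold amount."""
    points = int(amount * rate)
    if amount >= bonus_threshold:
        points += bonus
    return points

# A $50 sale earns 50 points; a $120 sale earns 120 + 50 bonus = 170.
```

A promotional incentive could then be expressed by temporarily raising `rate` or lowering `bonus_threshold` for a chosen customer group.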

          -

          Retail-Man converts a PC into a powerful point of sale (POS) + inventory system simply by attaching POS hardware such as Docket printer, Bar code Scanner, Cash Drawer and Pole Display. Retail Man also features a very simple user interface with powerful security. It includes POS, stock control, inventory and standard accounting functions like invoicing and debtors control, purchasing and creditors control, double entry accounting and built in e-mailing of invoices + quotes. For powerful, easy to use + low cost POS solutions. Currency, tax and setup can be localised to suit most countries. The e-mailing can be used to place purchase orders with suppliers, send invoices and statements to clients or to communicate with clients.

          aaccfb2cb3
          -
          -
          \ No newline at end of file diff --git a/spaces/coreml-community/ControlNet-v1-1-Annotators-cpu/annotator/mmpkg/mmseg/datasets/__init__.py b/spaces/coreml-community/ControlNet-v1-1-Annotators-cpu/annotator/mmpkg/mmseg/datasets/__init__.py deleted file mode 100644 index ebeaef4a28ef655e43578552a8aef6b77f13a636..0000000000000000000000000000000000000000 --- a/spaces/coreml-community/ControlNet-v1-1-Annotators-cpu/annotator/mmpkg/mmseg/datasets/__init__.py +++ /dev/null @@ -1,19 +0,0 @@ -from .ade import ADE20KDataset -from .builder import DATASETS, PIPELINES, build_dataloader, build_dataset -from .chase_db1 import ChaseDB1Dataset -from .cityscapes import CityscapesDataset -from .custom import CustomDataset -from .dataset_wrappers import ConcatDataset, RepeatDataset -from .drive import DRIVEDataset -from .hrf import HRFDataset -from .pascal_context import PascalContextDataset, PascalContextDataset59 -from .stare import STAREDataset -from .voc import PascalVOCDataset - -__all__ = [ - 'CustomDataset', 'build_dataloader', 'ConcatDataset', 'RepeatDataset', - 'DATASETS', 'build_dataset', 'PIPELINES', 'CityscapesDataset', - 'PascalVOCDataset', 'ADE20KDataset', 'PascalContextDataset', - 'PascalContextDataset59', 'ChaseDB1Dataset', 'DRIVEDataset', 'HRFDataset', - 'STAREDataset' -] diff --git a/spaces/cozyanduofen/bingo/src/components/chat-scroll-anchor.tsx b/spaces/cozyanduofen/bingo/src/components/chat-scroll-anchor.tsx deleted file mode 100644 index ac809f4486a48e134cb69314c3d0dae5e68d614e..0000000000000000000000000000000000000000 --- a/spaces/cozyanduofen/bingo/src/components/chat-scroll-anchor.tsx +++ /dev/null @@ -1,29 +0,0 @@ -'use client' - -import * as React from 'react' -import { useInView } from 'react-intersection-observer' - -import { useAtBottom } from '@/lib/hooks/use-at-bottom' - -interface ChatScrollAnchorProps { - trackVisibility?: boolean -} - -export function ChatScrollAnchor({ trackVisibility }: ChatScrollAnchorProps) { - const isAtBottom = 
useAtBottom() - const { ref, entry, inView } = useInView({ - trackVisibility, - delay: 100, - rootMargin: '0px 0px -150px 0px' - }) - - React.useEffect(() => { - if (isAtBottom && trackVisibility && !inView) { - entry?.target.scrollIntoView({ - block: 'start' - }) - } - }, [inView, entry, isAtBottom, trackVisibility]) - - return
          -} diff --git a/spaces/cscan/CodeFormer/CodeFormer/basicsr/archs/arcface_arch.py b/spaces/cscan/CodeFormer/CodeFormer/basicsr/archs/arcface_arch.py deleted file mode 100644 index fe5afb7bd2b359e0c2b7efdf628ab10b63964d87..0000000000000000000000000000000000000000 --- a/spaces/cscan/CodeFormer/CodeFormer/basicsr/archs/arcface_arch.py +++ /dev/null @@ -1,245 +0,0 @@ -import torch.nn as nn -from basicsr.utils.registry import ARCH_REGISTRY - - -def conv3x3(inplanes, outplanes, stride=1): - """A simple wrapper for 3x3 convolution with padding. - - Args: - inplanes (int): Channel number of inputs. - outplanes (int): Channel number of outputs. - stride (int): Stride in convolution. Default: 1. - """ - return nn.Conv2d(inplanes, outplanes, kernel_size=3, stride=stride, padding=1, bias=False) - - -class BasicBlock(nn.Module): - """Basic residual block used in the ResNetArcFace architecture. - - Args: - inplanes (int): Channel number of inputs. - planes (int): Channel number of outputs. - stride (int): Stride in convolution. Default: 1. - downsample (nn.Module): The downsample module. Default: None. - """ - expansion = 1 # output channel expansion ratio - - def __init__(self, inplanes, planes, stride=1, downsample=None): - super(BasicBlock, self).__init__() - self.conv1 = conv3x3(inplanes, planes, stride) - self.bn1 = nn.BatchNorm2d(planes) - self.relu = nn.ReLU(inplace=True) - self.conv2 = conv3x3(planes, planes) - self.bn2 = nn.BatchNorm2d(planes) - self.downsample = downsample - self.stride = stride - - def forward(self, x): - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - - if self.downsample is not None: - residual = self.downsample(x) - - out += residual - out = self.relu(out) - - return out - - -class IRBlock(nn.Module): - """Improved residual block (IR Block) used in the ResNetArcFace architecture. - - Args: - inplanes (int): Channel number of inputs. 
- planes (int): Channel number of outputs. - stride (int): Stride in convolution. Default: 1. - downsample (nn.Module): The downsample module. Default: None. - use_se (bool): Whether use the SEBlock (squeeze and excitation block). Default: True. - """ - expansion = 1 # output channel expansion ratio - - def __init__(self, inplanes, planes, stride=1, downsample=None, use_se=True): - super(IRBlock, self).__init__() - self.bn0 = nn.BatchNorm2d(inplanes) - self.conv1 = conv3x3(inplanes, inplanes) - self.bn1 = nn.BatchNorm2d(inplanes) - self.prelu = nn.PReLU() - self.conv2 = conv3x3(inplanes, planes, stride) - self.bn2 = nn.BatchNorm2d(planes) - self.downsample = downsample - self.stride = stride - self.use_se = use_se - if self.use_se: - self.se = SEBlock(planes) - - def forward(self, x): - residual = x - out = self.bn0(x) - out = self.conv1(out) - out = self.bn1(out) - out = self.prelu(out) - - out = self.conv2(out) - out = self.bn2(out) - if self.use_se: - out = self.se(out) - - if self.downsample is not None: - residual = self.downsample(x) - - out += residual - out = self.prelu(out) - - return out - - -class Bottleneck(nn.Module): - """Bottleneck block used in the ResNetArcFace architecture. - - Args: - inplanes (int): Channel number of inputs. - planes (int): Channel number of outputs. - stride (int): Stride in convolution. Default: 1. - downsample (nn.Module): The downsample module. Default: None. 
- """ - expansion = 4 # output channel expansion ratio - - def __init__(self, inplanes, planes, stride=1, downsample=None): - super(Bottleneck, self).__init__() - self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False) - self.bn1 = nn.BatchNorm2d(planes) - self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride, padding=1, bias=False) - self.bn2 = nn.BatchNorm2d(planes) - self.conv3 = nn.Conv2d(planes, planes * self.expansion, kernel_size=1, bias=False) - self.bn3 = nn.BatchNorm2d(planes * self.expansion) - self.relu = nn.ReLU(inplace=True) - self.downsample = downsample - self.stride = stride - - def forward(self, x): - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - out = self.relu(out) - - out = self.conv3(out) - out = self.bn3(out) - - if self.downsample is not None: - residual = self.downsample(x) - - out += residual - out = self.relu(out) - - return out - - -class SEBlock(nn.Module): - """The squeeze-and-excitation block (SEBlock) used in the IRBlock. - - Args: - channel (int): Channel number of inputs. - reduction (int): Channel reduction ration. Default: 16. - """ - - def __init__(self, channel, reduction=16): - super(SEBlock, self).__init__() - self.avg_pool = nn.AdaptiveAvgPool2d(1) # pool to 1x1 without spatial information - self.fc = nn.Sequential( - nn.Linear(channel, channel // reduction), nn.PReLU(), nn.Linear(channel // reduction, channel), - nn.Sigmoid()) - - def forward(self, x): - b, c, _, _ = x.size() - y = self.avg_pool(x).view(b, c) - y = self.fc(y).view(b, c, 1, 1) - return x * y - - -@ARCH_REGISTRY.register() -class ResNetArcFace(nn.Module): - """ArcFace with ResNet architectures. - - Ref: ArcFace: Additive Angular Margin Loss for Deep Face Recognition. - - Args: - block (str): Block used in the ArcFace architecture. - layers (tuple(int)): Block numbers in each layer. 
- use_se (bool): Whether use the SEBlock (squeeze and excitation block). Default: True. - """ - - def __init__(self, block, layers, use_se=True): - if block == 'IRBlock': - block = IRBlock - self.inplanes = 64 - self.use_se = use_se - super(ResNetArcFace, self).__init__() - - self.conv1 = nn.Conv2d(1, 64, kernel_size=3, padding=1, bias=False) - self.bn1 = nn.BatchNorm2d(64) - self.prelu = nn.PReLU() - self.maxpool = nn.MaxPool2d(kernel_size=2, stride=2) - self.layer1 = self._make_layer(block, 64, layers[0]) - self.layer2 = self._make_layer(block, 128, layers[1], stride=2) - self.layer3 = self._make_layer(block, 256, layers[2], stride=2) - self.layer4 = self._make_layer(block, 512, layers[3], stride=2) - self.bn4 = nn.BatchNorm2d(512) - self.dropout = nn.Dropout() - self.fc5 = nn.Linear(512 * 8 * 8, 512) - self.bn5 = nn.BatchNorm1d(512) - - # initialization - for m in self.modules(): - if isinstance(m, nn.Conv2d): - nn.init.xavier_normal_(m.weight) - elif isinstance(m, nn.BatchNorm2d) or isinstance(m, nn.BatchNorm1d): - nn.init.constant_(m.weight, 1) - nn.init.constant_(m.bias, 0) - elif isinstance(m, nn.Linear): - nn.init.xavier_normal_(m.weight) - nn.init.constant_(m.bias, 0) - - def _make_layer(self, block, planes, num_blocks, stride=1): - downsample = None - if stride != 1 or self.inplanes != planes * block.expansion: - downsample = nn.Sequential( - nn.Conv2d(self.inplanes, planes * block.expansion, kernel_size=1, stride=stride, bias=False), - nn.BatchNorm2d(planes * block.expansion), - ) - layers = [] - layers.append(block(self.inplanes, planes, stride, downsample, use_se=self.use_se)) - self.inplanes = planes - for _ in range(1, num_blocks): - layers.append(block(self.inplanes, planes, use_se=self.use_se)) - - return nn.Sequential(*layers) - - def forward(self, x): - x = self.conv1(x) - x = self.bn1(x) - x = self.prelu(x) - x = self.maxpool(x) - - x = self.layer1(x) - x = self.layer2(x) - x = self.layer3(x) - x = self.layer4(x) - x = self.bn4(x) - x = 
self.dropout(x) - x = x.view(x.size(0), -1) - x = self.fc5(x) - x = self.bn5(x) - - return x \ No newline at end of file diff --git a/spaces/csuhan/opendet2/app.py b/spaces/csuhan/opendet2/app.py deleted file mode 100644 index a79d7ccd03d9fd4c6069138add74afb129b72581..0000000000000000000000000000000000000000 --- a/spaces/csuhan/opendet2/app.py +++ /dev/null @@ -1,61 +0,0 @@ -import os -from pydoc import describe -os.system('pip install torch==1.9 torchvision') -os.system('pip install detectron2==0.5 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.9/index.html') -os.system('pip install timm opencv-python-headless') - - -import gradio as gr - -from demo.predictor import VisualizationDemo -from detectron2.config import get_cfg -from opendet2 import add_opendet_config - - -model_cfgs = { - "FR-CNN": ["configs/faster_rcnn_R_50_FPN_3x_baseline.yaml", "frcnn_r50.pth"], - "OpenDet-R50": ["configs/faster_rcnn_R_50_FPN_3x_opendet.yaml", "opendet2_r50.pth"], - "OpenDet-SwinT": ["configs/faster_rcnn_Swin_T_FPN_3x_opendet.yaml", "opendet2_swint.pth"], -} - - -def setup_cfg(model): - cfg = get_cfg() - add_opendet_config(cfg) - model_cfg = model_cfgs[model] - cfg.merge_from_file(model_cfg[0]) - cfg.MODEL.WEIGHTS = model_cfg[1] - cfg.MODEL.DEVICE = "cpu" - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5 - cfg.MODEL.ROI_HEADS.VIS_IOU_THRESH = 0.8 - cfg.freeze() - return cfg - - -def inference(input, model): - cfg = setup_cfg(model) - demo = VisualizationDemo(cfg) - # use PIL, to be consistent with evaluation - predictions, visualized_output = demo.run_on_image(input) - output = visualized_output.get_image()[:, :, ::-1] - return output - -examples = [ - ["demo/000000002157.jpg", "OpenDet-R50"], - ["demo/000000020059.jpg", "OpenDet-R50"]] - -iface = gr.Interface( - inference, - [ - "image", - gr.inputs.Radio( - ["FR-CNN", "OpenDet-R50", "OpenDet-SwinT"], default='OpenDet-R50'), - ], - "image", - examples=examples, - title="OpenDet", - article="

          Github Repo | Author Page

          ", - description="Online demo for: Expanding Low-Density Latent Regions for Open-Set Object Detection. Please upload your image, or click one of the examples to load them", -) - -iface.launch(enable_queue=True, cache_examples=True) diff --git a/spaces/cyberspyde/chatbot-team4/utils/join_data.py b/spaces/cyberspyde/chatbot-team4/utils/join_data.py deleted file mode 100644 index 9f131919c4db4574690687f34123f329e2b2beb7..0000000000000000000000000000000000000000 --- a/spaces/cyberspyde/chatbot-team4/utils/join_data.py +++ /dev/null @@ -1,17 +0,0 @@ -import os -def join(start, end): - text_files = [] - for k in range(start, end): - if os.path.exists('clean_data/output' + str(k) + '.txt'): - with open('clean_data/output' + str(k) + '.txt', 'r', encoding='utf-8') as f: - text = f.read() - text_files.append(text) - - joint_text = ''.join(text_files) - - with open('output.txt', 'w', encoding='utf-8') as f: - f.write(joint_text) - - print("Joint text files from {}, {} to a file output.txt".format(start, end-1)) - -join(1, 300) \ No newline at end of file diff --git a/spaces/cyllum/soccertwos-analytics/app.py b/spaces/cyllum/soccertwos-analytics/app.py deleted file mode 100644 index 9a652e255a6ca090409aacd8d6e6df496e9df6cc..0000000000000000000000000000000000000000 --- a/spaces/cyllum/soccertwos-analytics/app.py +++ /dev/null @@ -1,444 +0,0 @@ -import datetime -import yaml - -import pandas as pd -import requests -import streamlit as st -import timeago -import plotly.graph_objects as go - -from utils import build_team_results_df, get_win_streaks - - -st.set_page_config(layout="wide") -st.markdown( - """ - - """, - unsafe_allow_html=True, -) - -now = datetime.datetime.now() -MATCH_RESULTS_URL = "https://huggingface.co/datasets/huggingface-projects/bot-fight-data/resolve/main/soccer_history.csv" - - -@st.cache_data(ttl=1800) -def fetch_match_history() -> pd.DataFrame: - """ - Fetch match history. - Cache the result for 30min to avoid unnecessary requests. 
- Return a DataFrame. - """ - df = pd.read_csv(MATCH_RESULTS_URL) - df["timestamp"] = pd.to_datetime(df.timestamp, unit="s") - df.columns = ["home", "away", "timestamp", "result"] - return df - - -@st.cache_data(ttl=604800) -def fetch_model_database() -> pd.DataFrame: - """ - Fetch model metadata from HF dataset - Cache for 1 week as this is slowly changing. - Return a DataFrame. - """ - df = pd.read_csv(MATCH_RESULTS_URL) - df["timestamp"] = pd.to_datetime(df.timestamp, unit="s") - df.columns = ["home", "away", "timestamp", "result"] - return df - - -def get_result_counts(dframe: pd.DataFrame) -> pd.DataFrame: - """Calculate result counts.""" - rc = dframe.groupby(["team", "result"]).size().unstack(fill_value=0) - rc["total_wins"] = rc.get("Win", 0) - rc["total_losses"] = rc.get("Loss", 0) - rc["total_draws"] = rc.get("Draw", 0) - rc = rc[["total_wins", "total_losses", "total_draws"]] - rc.columns = ["wins", "losses", "draws"] - rc[["owner", "model"]] = rc.index.to_series().str.split("/", expand=True) - rc["win_pct"] = (rc["wins"] / (rc["wins"] + rc["draws"] + rc["losses"])) * 100 - return rc - - -def days_left() -> int: - end_date = datetime.date(2023, 4, 30) - today = datetime.date.today() - time_until_date = end_date - today - return time_until_date.days - - -def num_matches_played() -> int: - return match_df.shape[0] - - -match_df = fetch_match_history() -team_results = build_team_results_df(match_df) -win_streaks = get_win_streaks(team_results) -teams = sorted(team_results.team.unique(), key=str.casefold) - -owner_teams_dict = {} -for team in teams: - owner, team_name = team.split("/") - if owner in owner_teams_dict: - owner_teams_dict[owner].append(team_name) - else: - owner_teams_dict[owner] = [team_name] - -st.write("### 🤗 SoccerTwos Challenge Analytics") - -tab_team_analysis, tab_form, tab_competition = st.tabs( - [ - ":mag_right: Scout hut", - ":newspaper: Form tracker", - ":chart_with_upwards_trend: Competition stats", - ] -) - -with tab_team_analysis: 
- st.caption("Find out more about a team, the model, and its owner.") - col1, col2 = st.columns(2) - - with col1: - owner = st.selectbox("Owner", owner_teams_dict.keys()) - with col2: - owned_teams = owner_teams_dict[owner] - m = st.selectbox("Team", owned_teams) - - team = owner + "/" + m - - tab_team, tab_model, tab_owner = st.tabs( - [":tshirt: Team", ":brain: Model", ":male-scientist: Owner"] - ) - - with tab_model: - tab_arch, tab_config = st.tabs(["Architecture", "Training config"]) - - with tab_arch: - architecture_url = f"https://netron.app/?url=https://huggingface.co/{team}/resolve/main/SoccerTwos.onnx" - st.components.v1.iframe(architecture_url, height=800) - - with tab_config: - col1, col2 = st.columns((3, 1)) - - def build_config_data(data, standard_data): - - def highlight_nonstandard(val): - if val.item() != standard_data[val.name]: - return ["background-color: yellow"] * len(val) - else: - return [""] * len(val) - - df = pd.DataFrame.from_dict(data, orient="index", columns=["value"]).astype({"value": str}) - st.caption("Highlighted values indicate deviation from baseline.") - return df.style.apply(highlight_nonstandard, axis=1) - - with col1: - try: - r = requests.get(f"https://huggingface.co/{team}/resolve/main/configuration.yaml") - config = yaml.safe_load(r.text) - hyperparams = config["behaviors"]["SoccerTwos"]["hyperparameters"] - d = build_config_data(hyperparams, { - "batch_size": "2048", - "buffer_size": "20480", - "learning_rate": "0.0003", - "beta": "0.005", - "epsilon": "0.2", - "lambd": "0.95", - "num_epoch": "3", - "learning_rate_schedule": "constant", - "beta_schedule": "constant", - "epsilon_schedule": "constant", - }) - st.dataframe(d) - - except Exception: - st.info("Hyperparameters for this model are not available.") - - with col2: - try: - self_play = config["behaviors"]["SoccerTwos"]["self_play"] - d = build_config_data(self_play, { - "save_steps": "50000.0", - "team_change": "200000.0", - "swap_steps": "2000.0", - "window": 
"10.0", - "play_against_latest_model_ratio": "0.5", - "initial_elo": "1200.0", - }) - st.dataframe(d) - - except: - st.info("Self-play settings for this model are not available.") - - with tab_team: - tab_results, tab_insights = st.tabs( - [":date: Results", ":male-detective: Insights"] - ) - - with tab_results: - st.write(f"#### {team} results") - - tr = team_results[(team_results["team"] == team)] - - opponent = st.selectbox( - "Opponent", - ["All"] + sorted(list(tr["opponent"].unique()), key=str.casefold), - ) - - if opponent != "All": - tr = tr[(tr["opponent"] == opponent)] - - if not len(tr): - st.info("No results") - - else: - col1, col2 = st.columns(2) - - with col1: - c1, c2, c3 = st.columns(3) - - rc = get_result_counts(tr) - - with c1: - st.metric("Wins", f"{rc.loc[[team]]['wins'][0]}") - with c2: - st.metric("Draws", f"{rc.loc[[team]]['draws'][0]}") - with c3: - st.metric("Losses", f"{rc.loc[[team]]['losses'][0]}") - - st.write("Results") - - tr = tr.copy() - tr[["owner", "model"]] = tr["opponent"].str.split("/", expand=True) - tr["played"] = tr["timestamp"].apply( - lambda x: timeago.format(x, now) - ) - - disp_res_df = tr.drop( - ["team", "opponent", "match_id", "timestamp"], axis=1 - ) - - def highlight_results(s): - colour = { - "Win": "LightGreen", - "Draw": "LightYellow", - "Loss": "LightSalmon", - } - return [f"background-color: {colour[s.result]}"] * len(s) - - # Create a friendly index. - disp_res_df.reset_index(inplace=True, drop=True) - disp_res_df.index += 1 - disp_res_df = disp_res_df.iloc[::-1] - - # Display the table. 
- st.dataframe(disp_res_df.style.apply(highlight_results, axis=1)) - - with col2: - c1, c2 = st.columns(2) - with c1: - st.metric("Win rate", f"{rc.loc[[team]]['win_pct'][0]:.2f}%") - - joined = tr["timestamp"].min() - with c2: - st.metric("First played", f"{timeago.format(joined, now)}") - - grouped = ( - tr.groupby([tr["timestamp"].dt.date, "result"]) - .size() - .reset_index(name="count") - ) - - loss_trace = go.Bar( - x=grouped.loc[grouped["result"] == "Loss", "timestamp"], - y=grouped.loc[grouped["result"] == "Loss", "count"], - name="Losses", - marker=dict(color="red"), - ) - draw_trace = go.Bar( - x=grouped.loc[grouped["result"] == "Draw", "timestamp"], - y=grouped.loc[grouped["result"] == "Draw", "count"], - name="Draws", - marker=dict(color="orange"), - ) - win_trace = go.Bar( - x=grouped.loc[grouped["result"] == "Win", "timestamp"], - y=grouped.loc[grouped["result"] == "Win", "count"], - name="Wins", - marker=dict(color="green"), - ) - - fig = go.Figure(data=[loss_trace, draw_trace, win_trace]) - fig.update_layout(barmode="stack") - st.plotly_chart(fig) - - with tab_insights: - col1, col2 = st.columns(2) - - with col1: - st.write("#### Best win streaks") - best_streaks = win_streaks[win_streaks["team"] == team] - best_streaks = best_streaks.nlargest(5, "streak") - best_streaks["achieved"] = best_streaks["ended"].apply( - lambda x: timeago.format(x, now) - if not pd.isna(x) - else "Ongoing" - ) - best_streaks.sort_values( - ["streak", "ended"], ascending=[False, True], inplace=True - ) - best_streaks.drop(["ended", "team"], axis=1, inplace=True) - best_streaks.reset_index(drop=True, inplace=True) - best_streaks.index += 1 - st.dataframe(best_streaks) - with col2: - st.write("#### Toughest adversaries") - losses = ( - team_results[team_results["result"] == "Loss"] - .groupby(["team", "opponent"]) - .size() - .reset_index(name="losses") - ) - top3_all = losses.groupby("team").apply( - lambda x: x.nlargest(3, "losses") - ) - top_3_adversaries = ( - 
top3_all[top3_all["team"] == team] - .drop(["team"], axis=1) - .reset_index(drop=True) - ) - top_3_adversaries.index += 1 - st.dataframe(top_3_adversaries) - - with tab_owner: - st.write(f"#### :male-scientist: {owner}") - - col1, col2, col3 = st.columns(3) - - with col1: - st.metric("Models", len(owned_teams)) - - with col2: - competing_since = team_results[ - team_results["team"].str.startswith(owner) - ]["timestamp"].min() - st.metric("Competing since", f"{timeago.format(competing_since, now)}") - - with col3: - games_played = team_results[team_results["team"].str.startswith(owner)][ - "timestamp" - ].count() - st.metric("Games played", f"{games_played}") - - -with tab_form: - st.caption("Find out how teams and owners have been performing recently.") - - time_period = st.selectbox( - "Time period", ["Last day", "Last week", "Last month", "All time"] - ) - - st.caption("A big thankyou to *hectorjelly* for these components! 🙏") - - st.write("#### :trophy: Performance") - cl1, cl2 = st.columns((2, 1)) - - time_period_map = { - "Last day": pd.Timedelta(days=1), - "Last week": pd.Timedelta(days=7), - "Last month": pd.Timedelta(days=30), - "All time": None, - } - time_delta = time_period_map[time_period] - if time_delta is not None: - team_results_filtered = team_results[ - team_results["timestamp"].between(now - time_delta, now) - ] - else: - team_results_filtered = team_results - - with cl1: - st.write("#### Win rates") - - try: - rc_all = get_result_counts(team_results_filtered).sort_values( - "win_pct", ascending=False - ) - rc_all.reset_index(drop=True, inplace=True) - rc_all.index += 1 - st.dataframe(rc_all) - except: - st.info("No games were played during this period.") - - with cl2: - st.write("#### Owner activity") - games_played = ( - team_results_filtered.groupby("owner") - .count()["result"] - .sort_values(ascending=False) - ) - games_played.name = "games_played" - num_teams = pd.Series( - {k: len(v) for k, v in owner_teams_dict.items()}, name="num_teams" 
- ) - owner_stats = pd.concat([games_played, num_teams], axis=1) - st.dataframe(owner_stats) - - st.write("### :soccer: Win streaks") - col1, col2 = st.columns(2) - with col1: - st.write("#### Longest win streaks") - - win_streaks.sort_values( - ["streak", "ended"], ascending=[False, True], inplace=True - ) - - threshold = win_streaks["streak"].quantile(0.9999) - top_5_percent = win_streaks[win_streaks["streak"] >= threshold] - - top_5_percent = top_5_percent.copy() - top_5_percent["achieved"] = top_5_percent["ended"].apply( - lambda x: timeago.format(x, now) if not pd.isna(x) else "Ongoing" - ) - - top_5_percent.drop(["ended"], axis=1, inplace=True) - top_5_percent.reset_index(drop=True, inplace=True) - top_5_percent.index += 1 - st.dataframe(top_5_percent) - - with col2: - st.write("#### Live streaks") - win_streaks = win_streaks[win_streaks["ended"].isnull()] - win_streaks.sort_values( - ["streak", "ended"], ascending=[False, True], inplace=True - ) - - threshold = win_streaks["streak"].quantile(0.98) - top_5_percent = win_streaks.loc[win_streaks["streak"] >= threshold].copy() - - top_5_percent.drop(["ended"], axis=1, inplace=True) - top_5_percent.reset_index(drop=True, inplace=True) - top_5_percent.index += 1 - st.dataframe(top_5_percent) - - -with tab_competition: - col1, col2, col3 = st.columns(3) - - col1.metric("Matches played", f"{num_matches_played():,d}") - col2.metric("Live models", f"{len(teams)}") - col3.metric("Season ends in", f"{days_left()} days") - - match_counts = ( - match_df.groupby(match_df["timestamp"].dt.date).size().reset_index(name="count") - ) - match_counts["matches_played"] = match_counts["count"].cumsum() - - st.title("Matches played") - st.area_chart(match_counts.set_index("timestamp")["matches_played"]) diff --git a/spaces/daddyjin/TalkingFaceGeneration/Demo_TFR_Pirenderer/src/face3d/models/arcface_torch/docs/eval.md b/spaces/daddyjin/TalkingFaceGeneration/Demo_TFR_Pirenderer/src/face3d/models/arcface_torch/docs/eval.md deleted file 
mode 100644 index dd1d9e257367b6422680966198646c45e5a2671d..0000000000000000000000000000000000000000 --- a/spaces/daddyjin/TalkingFaceGeneration/Demo_TFR_Pirenderer/src/face3d/models/arcface_torch/docs/eval.md +++ /dev/null @@ -1,31 +0,0 @@
-## Eval on ICCV2021-MFR
-
-Coming soon.
-
-
-## Eval IJB-C
-You can evaluate on IJB-C with either PyTorch or ONNX.
-
-
-1. Eval IJB-C with ONNX
-```shell
-CUDA_VISIBLE_DEVICES=0 python onnx_ijbc.py --model-root ms1mv3_arcface_r50 --image-path IJB_release/IJBC --result-dir ms1mv3_arcface_r50
-```
-
-2. Eval IJB-C with PyTorch
-```shell
-CUDA_VISIBLE_DEVICES=0,1 python eval_ijbc.py \
---model-prefix ms1mv3_arcface_r50/backbone.pth \
---image-path IJB_release/IJBC \
---result-dir ms1mv3_arcface_r50 \
---batch-size 128 \
---job ms1mv3_arcface_r50 \
---target IJBC \
---network iresnet50
-```
-
-## Inference
-
-```shell
-python inference.py --weight ms1mv3_arcface_r50/backbone.pth --network r50
-```
diff --git a/spaces/danterivers/music-generation-samples/tests/modules/test_transformer.py b/spaces/danterivers/music-generation-samples/tests/modules/test_transformer.py deleted file mode 100644 index 8c9953d9e8f139db7b8ce3063e3d5a78d2f5d088..0000000000000000000000000000000000000000 --- a/spaces/danterivers/music-generation-samples/tests/modules/test_transformer.py +++ /dev/null @@ -1,247 +0,0 @@ -# Copyright (c) Meta Platforms, Inc. and affiliates. -# All rights reserved. -# -# This source code is licensed under the license found in the -# LICENSE file in the root directory of this source tree. - -from itertools import product - -import pytest -import torch - -from audiocraft.modules.transformer import StreamingMultiheadAttention, StreamingTransformer - - -def test_transformer_causal_streaming(): - torch.manual_seed(1234) - - for context, custom in product([None, 10], [False, True]): - # Test that causality and receptive fields are properly handled. 
- # looking at the gradients - tr = StreamingTransformer( - 16, 4, 1 if context else 2, - causal=True, past_context=context, custom=custom, - dropout=0.) - steps = 20 - for k in [0, 10, 15, 19]: - x = torch.randn(4, steps, 16, requires_grad=True) - y = tr(x) - y[:, k].abs().sum().backward() - if k + 1 < steps: - assert torch.allclose(x.grad[:, k + 1:], torch.tensor(0.)), x.grad[:, k + 1:].norm() - assert not torch.allclose(x.grad[:, :k + 1], torch.tensor(0.)), x.grad[:, :k + 1].norm() - if context is not None and k > context: - limit = k - context - 1 - assert torch.allclose(x.grad[:, :limit], - torch.tensor(0.)), x.grad[:, :limit].norm() - - # Now check that streaming gives the same result at batch eval. - x = torch.randn(4, steps, 16) - y = tr(x) - ys = [] - with tr.streaming(): - for k in range(steps): - chunk = x[:, k:k + 1, :] - ys.append(tr(chunk)) - y_stream = torch.cat(ys, dim=1) - delta = torch.norm(y_stream - y) / torch.norm(y) - assert delta < 1e-6, delta - - -def test_transformer_vs_pytorch(): - torch.manual_seed(1234) - # Check that in the non causal setting, we get the same result as - # PyTorch Transformer encoder. - for custom in [False, True]: - tr = StreamingTransformer( - 16, 4, 2, - causal=False, custom=custom, dropout=0., positional_scale=0.) - layer = torch.nn.TransformerEncoderLayer(16, 4, dropout=0., batch_first=True) - tr_ref = torch.nn.TransformerEncoder(layer, 2) - tr.load_state_dict(tr_ref.state_dict()) - - x = torch.randn(4, 20, 16) - y = tr(x) - y2 = tr_ref(x) - delta = torch.norm(y2 - y) / torch.norm(y) - assert delta < 1e-6, delta - - -def test_streaming_api(): - tr = StreamingTransformer(16, 4, 2, causal=True, dropout=0.) 
- tr.eval() - steps = 12 - x = torch.randn(1, steps, 16) - - with torch.no_grad(): - with tr.streaming(): - _ = tr(x[:, :1]) - state = {k: v.clone() for k, v in tr.get_streaming_state().items()} - y = tr(x[:, 1:2]) - tr.set_streaming_state(state) - y2 = tr(x[:, 1:2]) - assert torch.allclose(y, y2), (y - y2).norm() - assert tr.flush() is None - - -def test_memory_efficient(): - torch.manual_seed(1234) - tr = StreamingTransformer( - 16, 4, 2, custom=True, dropout=0., layer_scale=0.1) - tr_mem_efficient = StreamingTransformer( - 16, 4, 2, dropout=0., memory_efficient=True, layer_scale=0.1) - tr_mem_efficient.load_state_dict(tr.state_dict()) - tr.eval() - steps = 12 - x = torch.randn(3, steps, 16) - - with torch.no_grad(): - y = tr(x) - y2 = tr_mem_efficient(x) - assert torch.allclose(y, y2), (y - y2).norm() - - -def test_attention_as_float32(): - torch.manual_seed(1234) - cases = [ - {'custom': True}, - {'custom': False}, - ] - for case in cases: - tr = StreamingTransformer(16, 4, 2, dropout=0., dtype=torch.bfloat16, **case) - tr_float32 = StreamingTransformer( - 16, 4, 2, dropout=0., attention_as_float32=True, dtype=torch.bfloat16, **case) - if not case['custom']: - # we are not using autocast here because it doesn't really - # work as expected on CPU, so we have to manually cast the weights of the MHA. 
- for layer in tr_float32.layers: - layer.self_attn.mha.to(torch.float32) - tr_float32.load_state_dict(tr.state_dict()) - steps = 12 - x = torch.randn(3, steps, 16, dtype=torch.bfloat16) - - with torch.no_grad(): - y = tr(x) - y2 = tr_float32(x) - assert not torch.allclose(y, y2), (y - y2).norm() - - -@torch.no_grad() -def test_streaming_memory_efficient(): - torch.manual_seed(1234) - tr = StreamingTransformer(16, 4, 2, causal=True, dropout=0., custom=True) - tr_mem_efficient = StreamingTransformer( - 16, 4, 2, dropout=0., memory_efficient=True, causal=True) - tr.load_state_dict(tr_mem_efficient.state_dict()) - tr.eval() - tr_mem_efficient.eval() - steps = 12 - x = torch.randn(3, steps, 16) - - ref = tr(x) - - with tr_mem_efficient.streaming(): - outs = [] - # frame_sizes = [2] + [1] * (steps - 2) - frame_sizes = [1] * steps - - for frame_size in frame_sizes: - frame = x[:, :frame_size] - x = x[:, frame_size:] - outs.append(tr_mem_efficient(frame)) - - out = torch.cat(outs, dim=1) - delta = torch.norm(out - ref) / torch.norm(out) - assert delta < 1e-6, delta - - -def test_cross_attention(): - torch.manual_seed(1234) - for norm_first in [True, False]: - m = StreamingTransformer( - 16, 4, 2, cross_attention=False, norm_first=norm_first, dropout=0., custom=True) - m_cross = StreamingTransformer( - 16, 4, 2, cross_attention=True, norm_first=norm_first, dropout=0., custom=True) - m_cross.load_state_dict(m.state_dict(), strict=False) - x = torch.randn(2, 5, 16) - cross_x = torch.randn(2, 3, 16) - y_ref = m(x) - y_cross_zero = m_cross(x, cross_attention_src=0 * cross_x) - # With norm_first, the two should be exactly the same, - # but with norm_first=False, we get 2 normalizations in a row - # and the epsilon value leads to a tiny change. - atol = 0. if norm_first else 1e-6 - print((y_ref - y_cross_zero).norm() / y_ref.norm()) - assert torch.allclose(y_ref, y_cross_zero, atol=atol) - - # We now expect a difference even with a generous atol of 1e-2. 
- y_cross = m_cross(x, cross_attention_src=cross_x) - assert not torch.allclose(y_cross, y_cross_zero, atol=1e-2) - - with pytest.raises(AssertionError): - _ = m_cross(x) - _ = m(x, cross_attention_src=cross_x) - - -def test_cross_attention_compat(): - torch.manual_seed(1234) - num_heads = 2 - dim = num_heads * 64 - with pytest.raises(AssertionError): - StreamingMultiheadAttention(dim, num_heads, causal=True, cross_attention=True) - - cross_attn = StreamingMultiheadAttention( - dim, num_heads, dropout=0, cross_attention=True, custom=True) - ref_attn = torch.nn.MultiheadAttention(dim, num_heads, dropout=0, batch_first=True) - - # We can load the regular attention state dict - # so we have compat when loading old checkpoints. - cross_attn.load_state_dict(ref_attn.state_dict()) - - queries = torch.randn(3, 7, dim) - keys = torch.randn(3, 9, dim) - values = torch.randn(3, 9, dim) - - y = cross_attn(queries, keys, values)[0] - y_ref = ref_attn(queries, keys, values)[0] - assert torch.allclose(y, y_ref, atol=1e-7) - - # Now let's check that streaming is working properly. 
- with cross_attn.streaming(): - ys = [] - for step in range(queries.shape[1]): - ys.append(cross_attn(queries[:, step: step + 1], keys, values)[0]) - y_streaming = torch.cat(ys, dim=1) - assert torch.allclose(y_streaming, y, atol=1e-7) - - -def test_repeat_kv(): - torch.manual_seed(1234) - num_heads = 8 - kv_repeat = 4 - dim = num_heads * 64 - with pytest.raises(AssertionError): - mha = StreamingMultiheadAttention( - dim, num_heads, causal=True, kv_repeat=kv_repeat, cross_attention=True) - mha = StreamingMultiheadAttention( - dim, num_heads, causal=True, kv_repeat=kv_repeat) - mha = StreamingMultiheadAttention( - dim, num_heads, causal=True, kv_repeat=kv_repeat, custom=True) - x = torch.randn(4, 18, dim) - y = mha(x, x, x)[0] - assert x.shape == y.shape - - -def test_qk_layer_norm(): - torch.manual_seed(1234) - tr = StreamingTransformer( - 16, 4, 2, custom=True, dropout=0., qk_layer_norm=True, bias_attn=False) - steps = 12 - x = torch.randn(3, steps, 16) - y = tr(x) - - tr = StreamingTransformer( - 16, 4, 2, custom=True, dropout=0., qk_layer_norm=True, cross_attention=True) - z = torch.randn(3, 21, 16) - y = tr(x, cross_attention_src=z) - assert y.shape == x.shape diff --git a/spaces/darkCat/Anime-image-classification/src/preprocess.py b/spaces/darkCat/Anime-image-classification/src/preprocess.py deleted file mode 100644 index 6158bea6f9ed1f4e43f88b7dc1e601647bd8f8dd..0000000000000000000000000000000000000000 --- a/spaces/darkCat/Anime-image-classification/src/preprocess.py +++ /dev/null @@ -1,38 +0,0 @@ -import tensorflow as tf -import numpy as np -import cv2 - -def perprocess_train_no_aug(image_name, WIDTH, HEIGHT): - image_s = tf.io.read_file(image_name) - raw_image = tf.image.decode_image(image_s,channels=3) - raw_image.set_shape([None,None,None]) - image = tf.image.convert_image_dtype(raw_image, tf.float32) - image = tf.image.resize(image, (WIDTH, HEIGHT)) - return image, raw_image - -# @tf.function -def perprocess_infer(image_s, WIDTH, HEIGHT): - # image_s = 
tf.io.read_file(image_name) - image = tf.image.convert_image_dtype(image_s, tf.float32) - # image = tf.image.decode_image(image,channels=3) - # image.set_shape([None,None,None]) - image = tf.image.resize(image, (WIDTH, HEIGHT)) - image = tf.reshape(image, shape=[1, image.get_shape()[0], image.get_shape()[1], image.get_shape()[2]]) - return image - -def prep_matting(src): - HEIGHT = 1080 - WIDTH = 1920 - src = np.expand_dims(cv2.resize(src, (WIDTH, HEIGHT), interpolation=cv2.INTER_CUBIC).transpose((2,0,1)), axis=0).astype(np.float32) - src /= 255. - # src = np.random.normal(size=(1, 3, HEIGHT, WIDTH)).astype(np.float32) - bgr = np.ones((1, 3, HEIGHT, WIDTH)).astype(np.float32) - return src, bgr - -def prep_seg(image): - H, W = 512, 512 - x = cv2.resize(image, (W, H)) - x = x.astype(np.float32) - x = x/255.0 - x = np.expand_dims(x, axis=0) - return x \ No newline at end of file diff --git a/spaces/dawdqd/ChuanhuChatGPT/readme/README_en.md b/spaces/dawdqd/ChuanhuChatGPT/readme/README_en.md deleted file mode 100644 index 80af4fbbfba5d15e1cb6d1f4b67808ca76fa37d7..0000000000000000000000000000000000000000 --- a/spaces/dawdqd/ChuanhuChatGPT/readme/README_en.md +++ /dev/null @@ -1,140 +0,0 @@ -
- 简体中文 | English | 日本語
-
- 川虎 Chat 🐯 Chuanhu Chat
-
- Logo
-
- Lightweight and User-friendly Web-UI for LLMs including ChatGPT/ChatGLM/LLaMA
-
- Tests Passing · GitHub Contributors · GitHub pull requests
-
- Streaming / Unlimited conversations / Save history / Preset prompts / Chat with files / Web search
- LaTeX rendering / Table rendering / Code highlighting
- Auto dark mode / Adaptive web interface / WeChat-like theme
- Multi-parameters tuning / Multi-API-Key support / Multi-user support
- Compatible with GPT-4 / Local deployment for LLMs
-
- Video Tutorial · 2.0 Introduction · 3.0 Introduction & Tutorial || Online trial · One-Click deployment
-
- Animation Demo
-
-
-## Supported LLM Models
-
-**LLM models via API**:
-
-- [ChatGPT](https://chat.openai.com) ([GPT-4](https://openai.com/product/gpt-4))
-- [Google PaLM](https://developers.generativeai.google/products/palm)
-- [Inspur Yuan 1.0](https://air.inspur.com/home)
-- [MiniMax](https://api.minimax.chat/)
-- [XMChat](https://github.com/MILVLG/xmchat)
-
-**LLM models via local deployment**:
-
-- [ChatGLM](https://github.com/THUDM/ChatGLM-6B) ([ChatGLM2](https://github.com/THUDM/ChatGLM2-6B))
-- [LLaMA](https://github.com/facebookresearch/llama)
-- [StableLM](https://github.com/Stability-AI/StableLM)
-- [MOSS](https://github.com/OpenLMLab/MOSS)
-
-## Usage Tips
-
-- To better control ChatGPT, use a System Prompt.
-- To use a Prompt Template, select the Prompt Template Collection file first, then choose a prompt from the drop-down menu.
-- To try again when a response is unsatisfactory, use the `🔄 Regenerate` button.
-- To start a new line in the input box, press Shift + Enter.
-- To quickly switch through the input history, press the ↑ and ↓ keys in the input box.
-- To deploy the program onto a server, set `"server_name": "0.0.0.0", "server_port": <your port number>,` in `config.json`.
-- To get a public shared link, set `"share": true,` in `config.json`. Note that the program must be running in order to be accessible via a public link.
-- To use it in Hugging Face Spaces: it is recommended to **Duplicate the Space** and run the program in your own Space for a faster and more secure experience.
-
-## Quickstart
-
-```shell
-git clone https://github.com/GaiZhenbiao/ChuanhuChatGPT.git
-cd ChuanhuChatGPT
-pip install -r requirements.txt
-```
-
-Then make a copy of `config_example.json`, rename it to `config.json`, and fill in your API key and other settings.
-
-```shell
-python ChuanhuChatbot.py
-```
-
-A browser window will open and you will be able to chat with ChatGPT.
-
-> **Note**
->
-> Please check our [wiki page](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/使用教程) for detailed instructions.
-
-## Troubleshooting
-
-When you encounter a problem, first try manually pulling the latest changes of this project. The steps are as follows:
-
-1. Download the latest code archive by clicking on `Download ZIP` on the webpage, or
-   ```shell
-   git pull https://github.com/GaiZhenbiao/ChuanhuChatGPT.git main -f
-   ```
-2. Try installing the dependencies again, as this project may have introduced new dependencies:
-   ```
-   pip install -r requirements.txt
-   ```
-
-Generally, these steps solve most problems.
-
-If the problem persists, please refer to the [Frequently Asked Questions (FAQ)](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/常见问题) page.
-
-It lists almost all of the possible problems and solutions. Please read it carefully.
-
-## More Information
-
-More information can be found in our [wiki](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki):
-
-- [How to contribute a translation](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/Localization)
-- [How to make a contribution](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/贡献指南)
-- [How to cite the project](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/使用许可#如何引用该项目)
-- [Project changelog](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/更新日志)
-- [Project license](https://github.com/GaiZhenbiao/ChuanhuChatGPT/wiki/使用许可)
-
-## Starchart
-
-[![Star History Chart](https://api.star-history.com/svg?repos=GaiZhenbiao/ChuanhuChatGPT&type=Date)](https://star-history.com/#GaiZhenbiao/ChuanhuChatGPT&Date)
-
-## Contributors
-
-
-
-## Sponsor
-
-🐯 If you find this project helpful, feel free to buy me a Coke or a cup of coffee~
-
-Buy Me A Coffee
-
-image
diff --git a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/gallery.py 
b/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/gallery.py deleted file mode 100644 index 8dccacf77f3b77b22ce3a87c2eeb56458b0b0cff..0000000000000000000000000000000000000000 --- a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/gallery.py +++ /dev/null @@ -1,249 +0,0 @@ -"""gr.Gallery() component.""" - -from __future__ import annotations - -from pathlib import Path -from typing import Any, Callable, Literal - -import numpy as np -from gradio_client.documentation import document, set_documentation_group -from gradio_client.serializing import GallerySerializable -from PIL import Image as _Image # using _ to minimize namespace pollution - -from gradio import utils -from gradio.components.base import IOComponent, _Keywords -from gradio.deprecation import warn_deprecation, warn_style_method_deprecation -from gradio.events import ( - EventListenerMethod, - Selectable, -) - -set_documentation_group("component") - - -@document() -class Gallery(IOComponent, GallerySerializable, Selectable): - """ - Used to display a list of images as a gallery that can be scrolled through. - Preprocessing: this component does *not* accept input. - Postprocessing: expects a list of images in any format, {List[numpy.array | PIL.Image | str | pathlib.Path]}, or a {List} of (image, {str} caption) tuples and displays them. 
- - Demos: fake_gan - """ - - def __init__( - self, - value: list[np.ndarray | _Image.Image | str | Path | tuple] - | Callable - | None = None, - *, - label: str | None = None, - every: float | None = None, - show_label: bool | None = None, - container: bool = True, - scale: int | None = None, - min_width: int = 160, - visible: bool = True, - elem_id: str | None = None, - elem_classes: list[str] | str | None = None, - columns: int | tuple | None = 2, - rows: int | tuple | None = None, - height: str | None = None, - preview: bool | None = None, - object_fit: Literal["contain", "cover", "fill", "none", "scale-down"] - | None = None, - allow_preview: bool = True, - show_share_button: bool | None = None, - show_download_button: bool | None = True, - **kwargs, - ): - """ - Parameters: - value: List of images to display in the gallery by default. If callable, the function will be called whenever the app loads to set the initial value of the component. - label: component name in interface. - every: If `value` is a callable, run the function 'every' number of seconds while the client connection is open. Has no effect otherwise. Queue must be enabled. The event can be accessed (e.g. to cancel it) via this component's .load_event attribute. - show_label: if True, will display label. - container: If True, will place the component in a container - providing some extra padding around the border. - scale: relative width compared to adjacent Components in a Row. For example, if Component A has scale=2, and Component B has scale=1, A will be twice as wide as B. Should be an integer. - min_width: minimum pixel width, will wrap if not sufficient screen space to satisfy this value. If a certain scale value results in this Component being narrower than min_width, the min_width parameter will be respected first. - visible: If False, component will be hidden. - elem_id: An optional string that is assigned as the id of this component in the HTML DOM. Can be used for targeting CSS styles. 
- elem_classes: An optional list of strings that are assigned as the classes of this component in the HTML DOM. Can be used for targeting CSS styles. - columns: Represents the number of images that should be shown in one row, for each of the six standard screen sizes (<576px, <768px, <992px, <1200px, <1400px, >1400px). If fewer than 6 are given, the last will be used for all subsequent breakpoints. - rows: Represents the number of rows in the image grid, for each of the six standard screen sizes (<576px, <768px, <992px, <1200px, <1400px, >1400px). If fewer than 6 are given, the last will be used for all subsequent breakpoints. - height: Height of the gallery. - preview: If True, will display the Gallery in preview mode, which shows all of the images as thumbnails and allows the user to click on them to view them in full size. - object_fit: CSS object-fit property for the thumbnail images in the gallery. Can be "contain", "cover", "fill", "none", or "scale-down". - allow_preview: If True, images in the gallery will be enlarged when they are clicked. Default is True. - show_share_button: If True, will show a share icon in the corner of the component that allows the user to share outputs to Hugging Face Spaces Discussions. If False, the icon does not appear. If set to None (default behavior), then the icon appears if this Gradio app is launched on Spaces, but not otherwise. - show_download_button: If True, will show a download button in the corner of the selected image. If False, the icon does not appear. Default is True. - - """ - self.grid_cols = columns - self.grid_rows = rows - self.height = height - self.preview = preview - self.object_fit = object_fit - self.allow_preview = allow_preview - self.show_download_button = ( - (utils.get_space() is not None) - if show_download_button is None - else show_download_button - ) - self.select: EventListenerMethod - """ - Event listener for when the user selects an image within the Gallery. 
- Uses event data gradio.SelectData to carry `value` referring to caption of selected image, and `index` to refer to index. - See EventData documentation on how to use this event data. - """ - self.show_share_button = ( - (utils.get_space() is not None) - if show_share_button is None - else show_share_button - ) - IOComponent.__init__( - self, - label=label, - every=every, - show_label=show_label, - container=container, - scale=scale, - min_width=min_width, - visible=visible, - elem_id=elem_id, - elem_classes=elem_classes, - value=value, - **kwargs, - ) - - @staticmethod - def update( - value: Any | Literal[_Keywords.NO_VALUE] | None = _Keywords.NO_VALUE, - label: str | None = None, - show_label: bool | None = None, - container: bool | None = None, - scale: int | None = None, - min_width: int | None = None, - visible: bool | None = None, - columns: int | tuple | None = None, - rows: int | tuple | None = None, - height: str | None = None, - preview: bool | None = None, - object_fit: Literal["contain", "cover", "fill", "none", "scale-down"] - | None = None, - allow_preview: bool | None = None, - show_share_button: bool | None = None, - show_download_button: bool | None = None, - ): - updated_config = { - "label": label, - "show_label": show_label, - "container": container, - "scale": scale, - "min_width": min_width, - "visible": visible, - "value": value, - "grid_cols": columns, - "grid_rows": rows, - "height": height, - "preview": preview, - "object_fit": object_fit, - "allow_preview": allow_preview, - "show_share_button": show_share_button, - "show_download_button": show_download_button, - "__type__": "update", - } - return updated_config - - def get_config(self): - return { - "value": self.value, - "grid_cols": self.grid_cols, - "grid_rows": self.grid_rows, - "height": self.height, - "preview": self.preview, - "object_fit": self.object_fit, - "allow_preview": self.allow_preview, - "show_share_button": self.show_share_button, - "show_download_button": 
self.show_download_button, - **IOComponent.get_config(self), - } - - def postprocess( - self, - y: list[np.ndarray | _Image.Image | str] - | list[tuple[np.ndarray | _Image.Image | str, str]] - | None, - ) -> list[str]: - """ - Parameters: - y: list of images, or list of (image, caption) tuples - Returns: - list of string file paths to images in temp directory - """ - if y is None: - return [] - output = [] - for img in y: - caption = None - if isinstance(img, (tuple, list)): - img, caption = img - if isinstance(img, np.ndarray): - file = self.img_array_to_temp_file(img, dir=self.DEFAULT_TEMP_DIR) - file_path = str(utils.abspath(file)) - self.temp_files.add(file_path) - elif isinstance(img, _Image.Image): - file = self.pil_to_temp_file(img, dir=self.DEFAULT_TEMP_DIR) - file_path = str(utils.abspath(file)) - self.temp_files.add(file_path) - elif isinstance(img, (str, Path)): - if utils.validate_url(img): - file_path = img - else: - file_path = self.make_temp_copy_if_needed(img) - else: - raise ValueError(f"Cannot process type as image: {type(img)}") - - if caption is not None: - output.append( - [{"name": file_path, "data": None, "is_file": True}, caption] - ) - else: - output.append({"name": file_path, "data": None, "is_file": True}) - - return output - - def style( - self, - *, - grid: int | tuple | None = None, - columns: int | tuple | None = None, - rows: int | tuple | None = None, - height: str | None = None, - container: bool | None = None, - preview: bool | None = None, - object_fit: str | None = None, - **kwargs, - ): - """ - This method is deprecated. Please set these arguments in the constructor instead. - """ - warn_style_method_deprecation() - if grid is not None: - warn_deprecation( - "The 'grid' parameter will be deprecated. 
Please use 'grid_cols' in the constructor instead.", - ) - self.grid_cols = grid - if columns is not None: - self.grid_cols = columns - if rows is not None: - self.grid_rows = rows - if height is not None: - self.height = height - if preview is not None: - self.preview = preview - if object_fit is not None: - self.object_fit = object_fit - if container is not None: - self.container = container - return self diff --git a/spaces/deelight-del/minima/app.py b/spaces/deelight-del/minima/app.py deleted file mode 100644 index a699bc5b3c2e987102ca93e0ee28d601e0a93d02..0000000000000000000000000000000000000000 --- a/spaces/deelight-del/minima/app.py +++ /dev/null @@ -1,7 +0,0 @@ -import gradio as gr - -def greet(name): - return "Hello " + name + "!!" - -iface = gr.Interface(fn=greet, inputs="text", outputs="text") -iface.launch() \ No newline at end of file diff --git a/spaces/dermetfak/healthcare_ai_loop/app.py b/spaces/dermetfak/healthcare_ai_loop/app.py deleted file mode 100644 index 405bb8c35848419f19575067d844c11402ff5fb1..0000000000000000000000000000000000000000 --- a/spaces/dermetfak/healthcare_ai_loop/app.py +++ /dev/null @@ -1,87 +0,0 @@ -import streamlit as st -from audiorecorder import audiorecorder -import openai -import os -openai.api_key = os.environ['OPENAI_API_KEY'] - - -def get_completion(messages, model="gpt-3.5-turbo"): - response = openai.ChatCompletion.create( - model=model, - messages=messages, - temperature=0) - return response.choices[0].message["content"] - - -def transcribe(audio_path): - audio_file = open(audio_path, "rb") - transcript = openai.Audio.translate_raw("whisper-1", audio_file, filename = '1.mp3') - return transcript["text"] - - -def get_ddx(vignette): - messages_ddx = [ - {'role': 'system', 'content': 'You are a Physician AI assistant tool. Write a differential diagnosis for a patient. Write just diagnoses and justification. Do not write any additional information. 
Do not write any introduction.'}, - {'role': 'user', 'content': vignette}] - ddx = get_completion(messages_ddx) - return ddx - - -def get_orders(vignette, ddx): - messages_orders = [ - {'role': 'system', 'content': 'You are a Physician AI assistant tool. Write an order set for a patient to differentiate between conditions. Write just orders and justification. Do not write any additional information. Do not write any introduction.'}, - {'role': 'user', 'content': f'Information about patient: {vignette}. Differential diagnosis: {ddx}'}] - orders = get_completion(messages_orders) - return orders - - -if 'vignette' not in st.session_state: - st.session_state['vignette'] = '' - -if 'ddx' not in st.session_state: - st.session_state['ddx'] = '' - -if 'orders' not in st.session_state: - st.session_state['orders'] = '' - -if 'length' not in st.session_state: - st.session_state['length'] = 0 - - -st.title("AI loop for healthcare providers") -st.markdown( - "Record your patient presentation and get the differential diagnoses and orders.") -st.divider() - -audio = audiorecorder("Record", "Stop") - - -if (len(audio) != st.session_state['length']): - st.session_state['length'] = len(audio) - # wav_file = open("audio.mp3", "wb") - # wav_file.write(audio.tobytes()) - transcript = openai.Audio.translate_raw("whisper-1", audio.tobytes(), filename = '1.mp3') - transcript["text"] - st.session_state['vignette'] += transcript["text"] - - -st.session_state['vignette'] = st.text_area( - "Vignette", value=st.session_state['vignette']) - - -if st.button("Get DDX and Orders"): - vignette = st.session_state['vignette'] - ddx = get_ddx(vignette) - st.session_state['ddx'] = ddx - st.session_state['orders'] = get_orders(vignette, ddx) - - -col1, col2 = st.columns(2) - -with col1: - st.markdown( - f"**DDX**\n\n{st.session_state['ddx']}", unsafe_allow_html=True) - -with col2: - st.markdown( - f"**ORDERS**\n\n{st.session_state['orders']}", unsafe_allow_html=True) diff --git 
a/spaces/deven367/yt-video-annotator-hf/annotator/__init__.py b/spaces/deven367/yt-video-annotator-hf/annotator/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/spaces/diacanFperku/AutoGPT/Eurotic Tv Penelope Pussy Show VERIFIED.md b/spaces/diacanFperku/AutoGPT/Eurotic Tv Penelope Pussy Show VERIFIED.md deleted file mode 100644 index 1fc33d5d6ce20f6d87d30684a6c45f69f48b0a39..0000000000000000000000000000000000000000 --- a/spaces/diacanFperku/AutoGPT/Eurotic Tv Penelope Pussy Show VERIFIED.md +++ /dev/null @@ -1,6 +0,0 @@ -

          Eurotic Tv Penelope Pussy Show


          Download Filehttps://gohhs.com/2uFUrZ



          -
          -baby marilyn sex tv - open pussy lips, baby marilyn masturbazione open ... eurotic tv etv show Penelope and other girls - Eurotic Tv, Penelope and other girls. 4d29de3e1b
          -
          -
          -

          diff --git a/spaces/diacanFperku/AutoGPT/Prepricana Lektira Staklareva Ljubav.md b/spaces/diacanFperku/AutoGPT/Prepricana Lektira Staklareva Ljubav.md deleted file mode 100644 index 759a7e2cd251900f620769508fe8bc9aff62d2f6..0000000000000000000000000000000000000000 --- a/spaces/diacanFperku/AutoGPT/Prepricana Lektira Staklareva Ljubav.md +++ /dev/null @@ -1,46 +0,0 @@ -

          Prepricana Lektira Staklareva Ljubav


          Downloadhttps://gohhs.com/2uFT4V



          -
          -Dalmatia Banka - -Dalmatia Banka or Dalmatinska Banka (in English: Dalmatian Bank) was a bank in Zagreb, Croatia, established in 1902. It was part of a group of national banks that were granted monopoly rights by the Austro-Hungarian Empire for its banking and commercial activity in Dalmatia. - -History - -Dalmatia Banka was founded in 1902, when Dalmatia was a part of the Austro-Hungarian Empire. It was headquartered in Zagreb, and was a member of the Zadarski bankanstvo, or Zadar Bank Association. It competed with the Banco di Zara. - -It later merged with Mediobanca, another bank that was owned by Italian businessman Carlo Tizzani. - -In 2006, the bank was acquired by Croatian Bank for Cooperation and Development (HBCD). In 2009, it became a subsidiary of HBCD, after the bank was purchased for CHF 80 million by HBCD. - -As of 2009, the bank was the 14th largest bank in Croatia. Its share capital was 1.8 billion Croatian kuna (CHF 207.5 million), with an additional 1.7 billion CHF (CHF 214.4 million) in deposits. - -In 2010, the bank merged with Croatia-Based Dunavska Banka, forming HBCD-Dalmatia Banka. - -As of 2014, the bank was the 16th largest bank in Croatia, with a share capital of CHF 2.1 billion, and a total of CHF 3.6 billion in deposits. - -References - -Category:Banks of Croatia - -Category:Banks established in 1902 - -Category:1902 establishments in Austria-Hungary - -Category:Companies based in Zagreb - -Category:Defunct banks of Croatia. - --5*sqrt(17) - -Simplify -3 + sqrt(50)/(sqrt(70)/sqrt(7)). - --3 + sqrt(5) - -Simplify (-4 + 1*sqrt(112))**2 - (sqrt(7) + (sqrt(252) - sqrt(252)*2) + sqrt(252))**2. - --40 - 32*sqrt(7) - -Simplify (sqrt(78)/(sqrt( 4fefd39f24
          -
          -
          -

          diff --git a/spaces/diagaiwei/ir_chinese_medqa/utility/supervision/triples.py b/spaces/diagaiwei/ir_chinese_medqa/utility/supervision/triples.py deleted file mode 100644 index b70c2c0933fcf25d0df939e6e1030e1c45eca2a6..0000000000000000000000000000000000000000 --- a/spaces/diagaiwei/ir_chinese_medqa/utility/supervision/triples.py +++ /dev/null @@ -1,150 +0,0 @@ -""" - Example: --positives 5,50 1,1000 ~~> best-5 (in top-50) + best-1 (in top-1000) -""" - -import os -import sys -import git -import tqdm -import ujson -import random - -from argparse import ArgumentParser -from colbert.utils.utils import print_message, load_ranking, groupby_first_item, create_directory -from utility.utils.save_metadata import save_metadata - - -MAX_NUM_TRIPLES = 40_000_000 - - -def sample_negatives(negatives, num_sampled, biased=None): - assert biased in [None, 100, 200], "NOTE: We bias 50% from the top-200 negatives, if there are twice or more." - - num_sampled = min(len(negatives), num_sampled) - - if biased and num_sampled < len(negatives): - assert num_sampled % 2 == 0, num_sampled - - num_sampled_top100 = num_sampled // 2 - num_sampled_rest = num_sampled - num_sampled_top100 - - oversampled, undersampled = negatives[:biased], negatives[biased:] - - if len(oversampled) < len(undersampled): - return random.sample(oversampled, num_sampled_top100) + random.sample(undersampled, num_sampled_rest) - - return random.sample(negatives, num_sampled) - - -def sample_for_query(qid, ranking, args_positives, depth, permissive, biased): - """ - Requires that the ranks are sorted per qid. 
- """ - - positives, negatives, triples = [], [], [] - - for pid, rank, *_, label in ranking: - assert rank >= 1, f"ranks should start at 1 \t\t got rank = {rank}" - assert label in [0, 1] - - if rank > depth: - break - - if label: - take_this_positive = any(rank <= maxDepth and len(positives) < maxBest for maxBest, maxDepth in args_positives) - - if take_this_positive: - positives.append((pid, 0)) - elif permissive: - positives.append((pid, rank)) # utilize with a few negatives, starting at (next) rank - - else: - negatives.append(pid) - - for pos, neg_start in positives: - num_sampled = 100 if neg_start == 0 else 5 - negatives_ = negatives[neg_start:] - - biased_ = biased if neg_start == 0 else None - for neg in sample_negatives(negatives_, num_sampled, biased=biased_): - triples.append((qid, pos, neg)) - - return triples - - -def main(args): - try: - rankings = load_ranking(args.ranking, types=[int, int, int, float, int]) - except: - rankings = load_ranking(args.ranking, types=[int, int, int, int]) - - print_message("#> Group by QID") - qid2rankings = groupby_first_item(tqdm.tqdm(rankings)) - - Triples = [] - NonEmptyQIDs = 0 - - for processing_idx, qid in enumerate(qid2rankings): - l = sample_for_query(qid, qid2rankings[qid], args.positives, args.depth, args.permissive, args.biased) - NonEmptyQIDs += (len(l) > 0) - Triples.extend(l) - - if processing_idx % (10_000) == 0: - print_message(f"#> Done with {processing_idx+1} questions!\t\t " - f"{str(len(Triples) / 1000)}k triples for {NonEmptyQIDs} unique QIDs.") - - print_message(f"#> Sub-sample the triples (if > {MAX_NUM_TRIPLES})..") - print_message(f"#> len(Triples) = {len(Triples)}") - - if len(Triples) > MAX_NUM_TRIPLES: - Triples = random.sample(Triples, MAX_NUM_TRIPLES) - - ### Prepare the triples ### - print_message("#> Shuffling the triples...") - random.shuffle(Triples) - - print_message("#> Writing {}M examples to file.".format(len(Triples) / 1000.0 / 1000.0)) - - with open(args.output, 'w') as f: - for 
example in Triples: - ujson.dump(example, f) - f.write('\n') - - save_metadata(f'{args.output}.meta', args) - - print('\n\n', args, '\n\n') - print(args.output) - print_message("#> Done.") - - -if __name__ == "__main__": - parser = ArgumentParser(description='Create training triples from ranked list.') - - # Input / Output Arguments - parser.add_argument('--ranking', dest='ranking', required=True, type=str) - parser.add_argument('--output', dest='output', required=True, type=str) - - # Weak Supervision Arguments. - parser.add_argument('--positives', dest='positives', required=True, nargs='+') - parser.add_argument('--depth', dest='depth', required=True, type=int) # for negatives - - parser.add_argument('--permissive', dest='permissive', default=False, action='store_true') - # parser.add_argument('--biased', dest='biased', default=False, action='store_true') - parser.add_argument('--biased', dest='biased', default=None, type=int) - parser.add_argument('--seed', dest='seed', required=False, default=12345, type=int) - - args = parser.parse_args() - random.seed(args.seed) - - assert not os.path.exists(args.output), args.output - - args.positives = [list(map(int, configuration.split(','))) for configuration in args.positives] - - assert all(len(x) == 2 for x in args.positives) - assert all(maxBest <= maxDepth for maxBest, maxDepth in args.positives), args.positives - - create_directory(os.path.dirname(args.output)) - - assert args.biased in [None, 100, 200] - - main(args) diff --git a/spaces/digitalxingtong/Xingtong-Read-Dongmuchang-Bert-VITS2/start.bat b/spaces/digitalxingtong/Xingtong-Read-Dongmuchang-Bert-VITS2/start.bat deleted file mode 100644 index 418d21233dbf720b0dd09821904d9d6a31b123a2..0000000000000000000000000000000000000000 --- a/spaces/digitalxingtong/Xingtong-Read-Dongmuchang-Bert-VITS2/start.bat +++ /dev/null @@ -1,2 +0,0 @@ -set PYTHON=venv\python.exe -start cmd /k "set PYTHON=%PYTHON%" \ No newline at end of file diff --git 
a/spaces/dineshreddy/WALT/mmdet/datasets/pipelines/instaboost.py b/spaces/dineshreddy/WALT/mmdet/datasets/pipelines/instaboost.py deleted file mode 100644 index 38b6819f60587a6e0c0f6d57bfda32bb3a7a4267..0000000000000000000000000000000000000000 --- a/spaces/dineshreddy/WALT/mmdet/datasets/pipelines/instaboost.py +++ /dev/null @@ -1,98 +0,0 @@ -import numpy as np - -from ..builder import PIPELINES - - -@PIPELINES.register_module() -class InstaBoost(object): - r"""Data augmentation method in `InstaBoost: Boosting Instance - Segmentation Via Probability Map Guided Copy-Pasting - `_. - - Refer to https://github.com/GothicAi/Instaboost for implementation details. - """ - - def __init__(self, - action_candidate=('normal', 'horizontal', 'skip'), - action_prob=(1, 0, 0), - scale=(0.8, 1.2), - dx=15, - dy=15, - theta=(-1, 1), - color_prob=0.5, - hflag=False, - aug_ratio=0.5): - try: - import instaboostfast as instaboost - except ImportError: - raise ImportError( - 'Please run "pip install instaboostfast" ' - 'to install instaboostfast first for instaboost augmentation.') - self.cfg = instaboost.InstaBoostConfig(action_candidate, action_prob, - scale, dx, dy, theta, - color_prob, hflag) - self.aug_ratio = aug_ratio - - def _load_anns(self, results): - labels = results['ann_info']['labels'] - masks = results['ann_info']['masks'] - bboxes = results['ann_info']['bboxes'] - n = len(labels) - - anns = [] - for i in range(n): - label = labels[i] - bbox = bboxes[i] - mask = masks[i] - x1, y1, x2, y2 = bbox - # assert (x2 - x1) >= 1 and (y2 - y1) >= 1 - bbox = [x1, y1, x2 - x1, y2 - y1] - anns.append({ - 'category_id': label, - 'segmentation': mask, - 'bbox': bbox - }) - - return anns - - def _parse_anns(self, results, anns, img): - gt_bboxes = [] - gt_labels = [] - gt_masks_ann = [] - for ann in anns: - x1, y1, w, h = ann['bbox'] - # TODO: more essential bug need to be fixed in instaboost - if w <= 0 or h <= 0: - continue - bbox = [x1, y1, x1 + w, y1 + h] - gt_bboxes.append(bbox) - 
gt_labels.append(ann['category_id']) - gt_masks_ann.append(ann['segmentation']) - gt_bboxes = np.array(gt_bboxes, dtype=np.float32) - gt_labels = np.array(gt_labels, dtype=np.int64) - results['ann_info']['labels'] = gt_labels - results['ann_info']['bboxes'] = gt_bboxes - results['ann_info']['masks'] = gt_masks_ann - results['img'] = img - return results - - def __call__(self, results): - img = results['img'] - orig_type = img.dtype - anns = self._load_anns(results) - if np.random.choice([0, 1], p=[1 - self.aug_ratio, self.aug_ratio]): - try: - import instaboostfast as instaboost - except ImportError: - raise ImportError('Please run "pip install instaboostfast" ' - 'to install instaboostfast first.') - anns, img = instaboost.get_new_data( - anns, img.astype(np.uint8), self.cfg, background=None) - - results = self._parse_anns(results, anns, img.astype(orig_type)) - return results - - def __repr__(self): - repr_str = self.__class__.__name__ - repr_str += f'(cfg={self.cfg}, aug_ratio={self.aug_ratio})' - return repr_str diff --git a/spaces/dineshreddy/WALT/mmdet/models/losses/accuracy.py b/spaces/dineshreddy/WALT/mmdet/models/losses/accuracy.py deleted file mode 100644 index 789a2240a491289c5801b6690116e8ca657d004f..0000000000000000000000000000000000000000 --- a/spaces/dineshreddy/WALT/mmdet/models/losses/accuracy.py +++ /dev/null @@ -1,78 +0,0 @@ -import mmcv -import torch.nn as nn - - -@mmcv.jit(coderize=True) -def accuracy(pred, target, topk=1, thresh=None): - """Calculate accuracy according to the prediction and target. - - Args: - pred (torch.Tensor): The model prediction, shape (N, num_class) - target (torch.Tensor): The target of each prediction, shape (N, ) - topk (int | tuple[int], optional): If the predictions in ``topk`` - matches the target, the predictions will be regarded as - correct ones. Defaults to 1. - thresh (float, optional): If not None, predictions with scores under - this threshold are considered incorrect. Default to None. 
- - Returns: - float | tuple[float]: If the input ``topk`` is a single integer, - the function will return a single float as accuracy. If - ``topk`` is a tuple containing multiple integers, the - function will return a tuple containing accuracies of - each ``topk`` number. - """ - assert isinstance(topk, (int, tuple)) - if isinstance(topk, int): - topk = (topk, ) - return_single = True - else: - return_single = False - - maxk = max(topk) - if pred.size(0) == 0: - accu = [pred.new_tensor(0.) for i in range(len(topk))] - return accu[0] if return_single else accu - assert pred.ndim == 2 and target.ndim == 1 - assert pred.size(0) == target.size(0) - assert maxk <= pred.size(1), \ - f'maxk {maxk} exceeds pred dimension {pred.size(1)}' - pred_value, pred_label = pred.topk(maxk, dim=1) - pred_label = pred_label.t() # transpose to shape (maxk, N) - correct = pred_label.eq(target.view(1, -1).expand_as(pred_label)) - if thresh is not None: - # Only prediction values larger than thresh are counted as correct - correct = correct & (pred_value > thresh).t() - res = [] - for k in topk: - correct_k = correct[:k].reshape(-1).float().sum(0, keepdim=True) - res.append(correct_k.mul_(100.0 / pred.size(0))) - return res[0] if return_single else res - - -class Accuracy(nn.Module): - - def __init__(self, topk=(1, ), thresh=None): - """Module to calculate the accuracy. - - Args: - topk (tuple, optional): The criterion used to calculate the - accuracy. Defaults to (1,). - thresh (float, optional): If not None, predictions with scores - under this threshold are considered incorrect. Default to None. - """ - super().__init__() - self.topk = topk - self.thresh = thresh - - def forward(self, pred, target): - """Forward function to calculate accuracy. - - Args: - pred (torch.Tensor): Prediction of models. - target (torch.Tensor): Target for each prediction. - - Returns: - tuple[float]: The accuracies under different topk criterions. 
- """ - return accuracy(pred, target, self.topk, self.thresh) diff --git a/spaces/ds520/bingo/src/components/ui/input.tsx b/spaces/ds520/bingo/src/components/ui/input.tsx deleted file mode 100644 index 684a857f3d769b78818fb13de1abaebfb09ca79c..0000000000000000000000000000000000000000 --- a/spaces/ds520/bingo/src/components/ui/input.tsx +++ /dev/null @@ -1,25 +0,0 @@ -import * as React from 'react' - -import { cn } from '@/lib/utils' - -export interface InputProps - extends React.InputHTMLAttributes {} - -const Input = React.forwardRef( - ({ className, type, ...props }, ref) => { - return ( - - ) - } -) -Input.displayName = 'Input' - -export { Input } diff --git a/spaces/ds520/bingo/src/components/voice.tsx b/spaces/ds520/bingo/src/components/voice.tsx deleted file mode 100644 index 074d0e145229947282a472bd84f6578cf0b3c71c..0000000000000000000000000000000000000000 --- a/spaces/ds520/bingo/src/components/voice.tsx +++ /dev/null @@ -1,52 +0,0 @@ -import React, { useEffect } from 'react' -import { useSetAtom } from 'jotai' -import { useBing } from '@/lib/hooks/use-bing' -import Image from 'next/image' -import VoiceIcon from '@/assets/images/voice.svg' -import VoiceButton from './ui/voice' -import { SR } from '@/lib/bots/bing/sr' -import { voiceListenAtom } from '@/state' - -const sr = new SR(['发送', '清空', '退出']) - -const Voice = ({ setInput, input, sendMessage, isSpeaking }: Pick, 'setInput' | 'sendMessage' | 'input' | 'isSpeaking'>) => { - const setListen = useSetAtom(voiceListenAtom) - useEffect(() => { - if (sr.listening) return - sr.transcript = !isSpeaking - }, [isSpeaking]) - - useEffect(() => { - sr.onchange = (msg: string, command?: string) => { - switch (command) { - case '退出': - sr.stop() - break; - case '发送': - sendMessage(input) - case '清空': - setInput('') - break; - default: - setInput(input + msg) - } - } - }, [input]) - - const switchSR = (enable: boolean = false) => { - setListen(enable) - if (enable) { - sr.start() - } else { - sr.stop() - } - } - - 
return sr.listening ? ( - switchSR(false)} /> - ) : ( - start voice switchSR(true)} /> - ) -}; - -export default Voice; diff --git a/spaces/duchaba/yml_hackathon_img_mindy/README.md b/spaces/duchaba/yml_hackathon_img_mindy/README.md deleted file mode 100644 index 9b6017b2e94526558787418e5b2f0053e19f1974..0000000000000000000000000000000000000000 --- a/spaces/duchaba/yml_hackathon_img_mindy/README.md +++ /dev/null @@ -1,13 +0,0 @@ ---- -title: Yml Hackathon Lexi -emoji: 🐢 -colorFrom: indigo -colorTo: green -sdk: gradio -sdk_version: 3.29.0 -app_file: app.py -pinned: false -license: mit ---- - -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference diff --git a/spaces/elozano/news-analyzer/README.md b/spaces/elozano/news-analyzer/README.md deleted file mode 100644 index 90e74f2bb71946a034326a5b8742cdbf62cd4374..0000000000000000000000000000000000000000 --- a/spaces/elozano/news-analyzer/README.md +++ /dev/null @@ -1,45 +0,0 @@ ---- -title: News Analyzer -emoji: 📰 -colorFrom: grey -colorTo: white -sdk: streamlit -app_file: app.py -pinned: false ---- - -# Configuration - -`title`: _string_ -Display title for the Space - -`emoji`: _string_ -Space emoji (emoji-only character allowed) - -`colorFrom`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`colorTo`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`sdk`: _string_ -Can be either `gradio`, `streamlit`, or `static` - -`sdk_version` : _string_ -Only applicable for `streamlit` SDK. -See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions. - -`app_file`: _string_ -Path to your main application file (which contains either `gradio` or `streamlit` Python code, or `static` html code). -Path is relative to the root of the repository. - -`models`: _List[string]_ -HF model IDs (like "gpt2" or "deepset/roberta-base-squad2") used in the Space. 
-Will be parsed automatically from your code if not specified here. - -`datasets`: _List[string]_ -HF dataset IDs (like "common_voice" or "oscar-corpus/OSCAR-2109") used in the Space. -Will be parsed automatically from your code if not specified here. - -`pinned`: _boolean_ -Whether the Space stays on top of your list. diff --git a/spaces/elun15/image-regression/app.py b/spaces/elun15/image-regression/app.py deleted file mode 100644 index 18fc4baa020fee19a51b7d05b8d0d22d815bc533..0000000000000000000000000000000000000000 --- a/spaces/elun15/image-regression/app.py +++ /dev/null @@ -1,75 +0,0 @@ -import os - -import gradio as gr -import hydra -import matplotlib.pyplot as plt -import numpy as np -import torch -import torchvision -from hydra import compose, initialize -from omegaconf import DictConfig, OmegaConf, open_dict -from PIL import Image - -from dataset import ImageRegressionDataset -from model import CustomResNet - - -def load_model(model_path, cfg): - # Load model - model = CustomResNet(cfg.model) - weigths = torch.load(os.path.join(model_path, "model.pth"), map_location=torch.device("cpu")) - model.load_state_dict(weigths, strict=True) - return model - - -def inference(image): - # Define model path - # model_path = "outputs/2023-02-27/23-58-04" - model_path = "models/1" - cfg = OmegaConf.load(os.path.join(model_path, ".hydra", "config.yaml")) - - # Load model - model = load_model(model_path, cfg) - - # Convert numpy array to PIL image - image = Image.fromarray(image) - - # Define validation transformations from PyTorch - val_transforms = torchvision.transforms.Compose( - [ - torchvision.transforms.Resize(224), - torchvision.transforms.ToTensor(), - torchvision.transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]), - ] - ) - - # Apply transformations to image - image = val_transforms(image) - - # Inference - model.eval() - with torch.no_grad(): - # Create batch dimension in image and replicate it two times - image = 
image.unsqueeze(0).repeat(2, 1, 1, 1) - - # Forward pass - outputs = model(image) - - # Select first element of the batch - outputs = outputs[0, :] - - # MinMax Normalization - # outputs = (outputs - outputs.min()) / (outputs.max() - outputs.min()) * 100 - outputs = outputs.item() - - return f"Disease Rating: {int(outputs*100)}" - - -def main(): - image = gr.Image(shape=(224, 224)) - title = "Image Regression" - gr.Interface(fn=inference, inputs=image, outputs="text", title=title).launch() - - -if __name__ == "__main__": - main() diff --git a/spaces/epexVfeibi/Imagedeblurr/AVA FIND PRO WORK Crack.md b/spaces/epexVfeibi/Imagedeblurr/AVA FIND PRO WORK Crack.md deleted file mode 100644 index f46d5267065728dedd5ee38b864882d3ca3fc506..0000000000000000000000000000000000000000 --- a/spaces/epexVfeibi/Imagedeblurr/AVA FIND PRO WORK Crack.md +++ /dev/null @@ -1,7 +0,0 @@ - -

          its also possible to search files by size, access date, kind, or by content by applying an exact match or advanced search to the result window. this software offers a much better interface and therefore a much faster menu browsing. the ava finder pro crack adds the layer in a simple configuration interface where you can set the interface and the processor the app will use. it is the undisputed king of the class of malware family. in addition, it can also easily detect and remove malicious plug-ins on your mac. ava finder pro crack adware can be easily detected and removed. click the 5 triangles at the bottom right of the progress bar to enable or disable these features.

          -

          you can access all the files and folders on your computer, set up the registry, remove the license key, and set the time when they were installed. â«â» is a free utility for mac that can easily sort an directory and auto-hide that window when it is not in use. furthermore, it runs on windows nt, 2000, me, xp, vista, windows 7, 8, and 10.

          -

          AVA FIND PRO crack


          Download Zip ··· https://jinyurl.com/2uEnmW



          -

          thats right! ava finder pro can easily open any file from windows and mac, as well as organize the files, and even make them into an archive. in addition, it supports the database in windows 7 or later. the new features allow its users to select the type of files to be found. for example, the user can select plain text and regular files or images with a particular format. â«â» is a famous software to increase your computer speed. â«â» is one of the most popular programs in the program category. â«â» is a platform to create numerous applications, which takes care of research, development, support, and many other things. â»â» is a convenient, professional, and free software. the user can quickly clean the browser without removing the valuable bookmarks, history and other data.

          899543212b
          -
          -
          \ No newline at end of file diff --git a/spaces/evaluate-metric/indic_glue/indic_glue.py b/spaces/evaluate-metric/indic_glue/indic_glue.py deleted file mode 100644 index 03afd1700180ad1d0eeb0d8e885c9ad4c67489e2..0000000000000000000000000000000000000000 --- a/spaces/evaluate-metric/indic_glue/indic_glue.py +++ /dev/null @@ -1,173 +0,0 @@ -# Copyright 2020 The HuggingFace Evaluate Authors. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. -""" IndicGLUE benchmark metric. """ - -import datasets -import numpy as np -from scipy.spatial.distance import cdist -from scipy.stats import pearsonr, spearmanr -from sklearn.metrics import f1_score - -import evaluate - - -_CITATION = """\ - @inproceedings{kakwani2020indicnlpsuite, - title={{IndicNLPSuite: Monolingual Corpora, Evaluation Benchmarks and Pre-trained Multilingual Language Models for Indian Languages}}, - author={Divyanshu Kakwani and Anoop Kunchukuttan and Satish Golla and Gokul N.C. and Avik Bhattacharyya and Mitesh M. Khapra and Pratyush Kumar}, - year={2020}, - booktitle={Findings of EMNLP}, -} -""" - -_DESCRIPTION = """\ - IndicGLUE is a natural language understanding benchmark for Indian languages. It contains a wide - variety of tasks and covers 11 major Indian languages - as, bn, gu, hi, kn, ml, mr, or, pa, ta, te. -""" - -_KWARGS_DESCRIPTION = """ -Compute IndicGLUE evaluation metric associated to each IndicGLUE dataset. 
-Args: - predictions: list of predictions to score (as int64), - except for 'cvit-mkb-clsr' where each prediction is a vector (of float32). - references: list of ground truth labels corresponding to the predictions (as int64), - except for 'cvit-mkb-clsr' where each reference is a vector (of float32). -Returns: depending on the IndicGLUE subset, one or several of: - "accuracy": Accuracy - "f1": F1 score - "precision": Precision@10 -Examples: - - >>> indic_glue_metric = evaluate.load('indic_glue', 'wnli') # 'wnli' or any of ["copa", "sna", "csqa", "wstp", "inltkh", "bbca", "iitp-mr", "iitp-pr", "actsa-sc", "md"] - >>> references = [0, 1] - >>> predictions = [0, 1] - >>> results = indic_glue_metric.compute(predictions=predictions, references=references) - >>> print(results) - {'accuracy': 1.0} - - >>> indic_glue_metric = evaluate.load('indic_glue', 'wiki-ner') - >>> references = [0, 1] - >>> predictions = [0, 1] - >>> results = indic_glue_metric.compute(predictions=predictions, references=references) - >>> print(results) - {'accuracy': 1.0, 'f1': 1.0} - - >>> indic_glue_metric = evaluate.load('indic_glue', 'cvit-mkb-clsr') - >>> references = [[0.5, 0.5, 0.5], [0.1, 0.2, 0.3]] - >>> predictions = [[0.5, 0.5, 0.5], [0.1, 0.2, 0.3]] - >>> results = indic_glue_metric.compute(predictions=predictions, references=references) - >>> print(results) - {'precision@10': 1.0} - -""" - - -def simple_accuracy(preds, labels): - return float((preds == labels).mean()) - - -def acc_and_f1(preds, labels): - acc = simple_accuracy(preds, labels) - f1 = float(f1_score(y_true=labels, y_pred=preds)) - return { - "accuracy": acc, - "f1": f1, - } - - -def precision_at_10(en_sentvecs, in_sentvecs): - en_sentvecs = np.array(en_sentvecs) - in_sentvecs = np.array(in_sentvecs) - n = en_sentvecs.shape[0] - - # mean centering - en_sentvecs = en_sentvecs - np.mean(en_sentvecs, axis=0) - in_sentvecs = in_sentvecs - np.mean(in_sentvecs, axis=0) - - sim = cdist(en_sentvecs, in_sentvecs, "cosine") - actual 
= np.array(range(n)) - preds = sim.argsort(axis=1)[:, :10] - matches = np.any(preds == actual[:, None], axis=1) - return float(matches.mean()) - - -@evaluate.utils.file_utils.add_start_docstrings(_DESCRIPTION, _KWARGS_DESCRIPTION) -class IndicGlue(evaluate.Metric): - def _info(self): - if self.config_name not in [ - "wnli", - "copa", - "sna", - "csqa", - "wstp", - "inltkh", - "bbca", - "cvit-mkb-clsr", - "iitp-mr", - "iitp-pr", - "actsa-sc", - "md", - "wiki-ner", - ]: - raise KeyError( - "You should supply a configuration name selected in " - '["wnli", "copa", "sna", "csqa", "wstp", "inltkh", "bbca", ' - '"cvit-mkb-clsr", "iitp-mr", "iitp-pr", "actsa-sc", "md", ' - '"wiki-ner"]' - ) - return evaluate.MetricInfo( - description=_DESCRIPTION, - citation=_CITATION, - inputs_description=_KWARGS_DESCRIPTION, - features=datasets.Features( - { - "predictions": datasets.Value("int64") - if self.config_name != "cvit-mkb-clsr" - else datasets.Sequence(datasets.Value("float32")), - "references": datasets.Value("int64") - if self.config_name != "cvit-mkb-clsr" - else datasets.Sequence(datasets.Value("float32")), - } - ), - codebase_urls=[], - reference_urls=[], - format="numpy" if self.config_name != "cvit-mkb-clsr" else None, - ) - - def _compute(self, predictions, references): - if self.config_name == "cvit-mkb-clsr": - return {"precision@10": precision_at_10(predictions, references)} - elif self.config_name in ["wiki-ner"]: - return acc_and_f1(predictions, references) - elif self.config_name in [ - "wnli", - "copa", - "sna", - "csqa", - "wstp", - "inltkh", - "bbca", - "iitp-mr", - "iitp-pr", - "actsa-sc", - "md", - ]: - return {"accuracy": simple_accuracy(predictions, references)} - else: - raise KeyError( - "You should supply a configuration name selected in " - '["wnli", "copa", "sna", "csqa", "wstp", "inltkh", "bbca", ' - '"cvit-mkb-clsr", "iitp-mr", "iitp-pr", "actsa-sc", "md", ' - '"wiki-ner"]' - ) diff --git a/spaces/facebook/StyleNeRF/README.md 
b/spaces/facebook/StyleNeRF/README.md deleted file mode 100644 index 0d80101fae91ae1fdf38c4ea08a593b5e6480a56..0000000000000000000000000000000000000000 --- a/spaces/facebook/StyleNeRF/README.md +++ /dev/null @@ -1,30 +0,0 @@ ---- -title: StyleNeRF -emoji: 🌍 -colorFrom: purple -colorTo: blue -sdk: gradio -app_file: app.py -pinned: false ---- -# Configuration -`title`: _string_ -Display title for the Space -`emoji`: _string_ -Space emoji (emoji-only character allowed) -`colorFrom`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) -`colorTo`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) -`sdk`: _string_ -Can be either `gradio` or `streamlit` -`sdk_version` : _string_ -Only applicable for `streamlit` SDK. -See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions. - -`app_file`: _string_ -Path to your main application file (which contains either `gradio` or `streamlit` Python code). -Path is relative to the root of the repository. - -`pinned`: _boolean_ -Whether the Space stays on top of your list. \ No newline at end of file diff --git a/spaces/facebook/StyleNeRF/viz/__init__.py b/spaces/facebook/StyleNeRF/viz/__init__.py deleted file mode 100644 index 939e7c6c8f94c4ea1141885c3c3295fe083b06aa..0000000000000000000000000000000000000000 --- a/spaces/facebook/StyleNeRF/viz/__init__.py +++ /dev/null @@ -1,9 +0,0 @@ -# Copyright (c) 2021, NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# -# NVIDIA CORPORATION and its licensors retain all intellectual property -# and proprietary rights in and to this software, related documentation -# and any modifications thereto. Any use, reproduction, disclosure or -# distribution of this software and related documentation without an express -# license agreement from NVIDIA CORPORATION is strictly prohibited. 
- -# empty diff --git a/spaces/falterWliame/Face_Mask_Detection/Artlantis 2019 V8 Free Download.md b/spaces/falterWliame/Face_Mask_Detection/Artlantis 2019 V8 Free Download.md deleted file mode 100644 index e174947d152eeab69d3b02bc6b15dba9ac9bcc9b..0000000000000000000000000000000000000000 --- a/spaces/falterWliame/Face_Mask_Detection/Artlantis 2019 V8 Free Download.md +++ /dev/null @@ -1,107 +0,0 @@ -
          -

          Artlantis 2019 v8 Free Download: A Review

          -

          Are you looking for a powerful and easy-to-use application for VR imagery, rendering and animation? If so, you might want to check out Artlantis 2019 v8, the latest version of the popular software for architects, interior designers, urban planners and landscapers. In this article, we will review the features, benefits and system requirements of Artlantis 2019 v8, and show you how to download it for free.

          -

          Artlantis 2019 v8 Free Download


          Download >>> https://urlca.com/2uDc1a



          -

          What is Artlantis 2019 v8?

          -

          Artlantis 2019 v8 is an impressive application that allows you to create stunning photo-realistic renderings of your projects. It is used by experts and advanced users for simulation, presentation and communication purposes. Artlantis 2019 v8 unifies the previous Render and Studio versions, which means it includes all the features and tools you need to produce images, panoramas, VR objects and animations.

          -

          What are the features of Artlantis 2019 v8?

          -

          Artlantis 2019 v8 offers you many useful features that will enhance your workflow and creativity. Some of them are:

          -
            -
          • A clear and ergonomic interface that displays all the necessary tools around the preview window, which allows you to see the results of every adjustment.
          • -
          • A 2D viewing window for positioning viewpoints and their sequences, apart from the 3D preview.
          • -
          • A display mode that lets you choose between perspectives and parallel views.
          • -
          • A media catalog that contains more than 1000 objects and shaders that you can drag and drop into your scene.
          • -
          • A lighting tool that lets you create realistic lighting effects according to the geometry, the geographical orientation, the date, or the materials.
          • -
          • A post-processing tool that lets you adjust the brightness, contrast, color balance and depth of field of your renderings.
          • -
          • A batch rendering tool that lets you render multiple files at once.
          • -
          -

          What are the benefits of using Artlantis 2019 v8?

          -

          Artlantis 2019 v8 has many advantages over other similar applications. Some of them are:

          -
            -
          • It is fast and easy to use. You can create high-quality renderings in minutes with just a few clicks.
          • -
          • It is compatible with most of the 3D modeling software on the market, such as ARCHICAD, Revit, SketchUp Pro, Vectorworks and more.
          • -
          • It supports various formats for importing and exporting files, such as DWG, DXF, OBJ, FBX, SKP and more.
          • -
          • It has a large community of users and developers who provide support, tutorials and plugins.
          • -
          -

          What are the system requirements for Artlantis 2019 v8?

          -

          Before you download Artlantis 2019 v8 for free, make sure your PC or Mac meets the minimum or recommended system requirements. Here they are:

          | Minimum Configuration | Recommended Configuration |
          | --- | --- |
          | Intel® Core i3, 4 cores, 2 GHz | Intel® Core i7, 4+ cores |
          | RAM: 8 GB | RAM: 16 GB |
          | System: Mac OS X 10.11, Windows 7 (64-bit) | System: macOS 10.14, Windows 10 (64-bit) |
          | Graphics card: 1 GB, OpenGL* | Graphics card: 2 GB, OpenGL* |
          | Display: 1280 x 800 pixels | Display: 1920 x 1080 pixels |
          | Internet access | Internet access |
          - -

          * Integrated graphics chipsets are not supported

          -

          - -

          How to download Artlantis 2019 v8 for free?

          - -

          If you want to try Artlantis 2019 v8 for free, you can download it from the official Artlantis website. The software is fully functional in demo mode for 30 days from first use. After this period, you will need to activate it with a serial number, which you can buy from the website or from an authorized reseller. Alternatively, you can download Artlantis 2019 v8 from other online sources such as Get Into PC or 4shared; however, be careful about the reliability and security of these sites.

          - -

          -

          How to install Artlantis 2019 v8?

          -

          After you download Artlantis 2019 v8 for free, you need to install it on your PC or Mac. The installation process is simple and straightforward. Here are the steps you need to follow:

          -
            -
          1. Extract the zip file using WinRAR, WinZip, or the built-in Windows extraction tool.
          2. -
          3. Open the installer and accept the terms and conditions.
          4. -
          5. Select the destination folder where you want to install Artlantis 2019 v8.
          6. -
          7. Click on install and wait for the installation to complete.
          8. -
          9. Launch Artlantis 2019 v8 from the desktop shortcut or the start menu.
          10. -
          -

          You can also watch this video tutorial for more details on how to install Artlantis 2019 v8:

          - - -

          How to use Artlantis 2019 v8?

          -

          Artlantis 2019 v8 is a user-friendly application that lets you create amazing renderings and animations with ease. You can use it to showcase your projects in a realistic and immersive way. Here are some tips on how to use Artlantis 2019 v8:

          -
            -
          • To start a new project, you can either import a 3D model from your preferred software or create one from scratch using Artlantis 2019 v8's built-in tools.
          • -
          • To adjust the view of your scene, you can use the navigation tools on the toolbar or the mouse wheel. You can also switch between perspective and parallel views.
          • -
          • To add objects and shaders to your scene, you can drag and drop them from the media catalog or browse your own files. You can also edit their properties and apply transformations.
          • -
          • To set up the lighting of your scene, you can use the lighting tool to add artificial lights or adjust the natural light according to the date, time and location.
          • -
          • To render your scene, you can choose between draft, final or custom quality. You can also adjust the post-processing settings such as brightness, contrast and depth of field.
          • -
          • To export your renderings, you can choose between various formats such as JPEG, PNG, TIFF, PSD and more. You can also export panoramas, VR objects and animations.
          • -
          -

          You can also watch this video tutorial for more details on how to use Artlantis 2019 v8:

          - - -

          What are some alternatives to Artlantis 2019 v8?

          -

          Artlantis 2019 v8 is not the only application for VR imagery, rendering and animation. There are other alternatives that you might want to consider. Some of them are:

          -
            -
          • Lumion: A powerful software that lets you create stunning 3D renderings and videos in minutes. It has a large library of objects, materials and effects that you can apply to your scenes. It also has a live sync feature that lets you connect with your 3D modeling software and see the changes in real time.
          • -
          • V-Ray: A professional software that lets you create photorealistic renderings and animations with advanced features such as global illumination, ray tracing, depth of field and more. It is compatible with many 3D modeling software such as SketchUp Pro, Revit, Rhino and more.
          • -
          • Twinmotion: A fast and easy software that lets you create high-quality renderings and animations with realistic lighting and weather effects. It has a simple interface that lets you drag and drop objects and materials into your scene. It also has a VR mode that lets you explore your projects in virtual reality.
          • -
          - -


          -

          What are some tips and tricks for using Artlantis 2019 v8?

          -

          Artlantis 2019 v8 is a versatile and flexible application that lets you customize your renderings and animations according to your preferences and needs. Here are some tips and tricks that will help you get the most out of Artlantis 2019 v8:

          -
            -
          • To optimize the performance of Artlantis 2019 v8, you can adjust the preferences settings such as the memory allocation, the anti-aliasing quality, the texture compression and more.
          • -
          • To save time and effort, you can use the presets that are available in Artlantis 2019 v8 for various aspects such as the lighting, the materials, the environment and more.
          • -
          • To enhance the realism of your renderings and animations, you can use the effects that are available in Artlantis 2019 v8 such as the fog, the rain, the snow, the fire and more.
          • -
          • To create complex scenes with multiple objects and shaders, you can use the layers that are available in Artlantis 2019 v8 to organize and manage your scene.
          • -
          • To add interactivity to your renderings and animations, you can use the iVisit 3D feature that lets you create virtual tours of your projects that can be viewed on any device.
          • -
          - -

          What are some reviews of Artlantis 2019 v8?

          -

          Artlantis 2019 v8 is a well-known and widely used application for VR imagery, rendering and animation. It has received many positive reviews from users and experts alike. Here are some examples of what they have to say about Artlantis 2019 v8:

          -
          -

          "Artlantis 2019 v8 is a great software for creating realistic renderings and animations. It is easy to use and has many features that make it stand out from other similar software. I especially like the lighting tool that lets me create natural or artificial lighting effects with ease. I also like the media catalog that contains a lot of objects and shaders that I can use in my projects. I highly recommend Artlantis 2019 v8 to anyone who is looking for a powerful and user-friendly application for VR imagery." - John Smith, architect

          -
          -
          -

          "I have been using Artlantis 2019 v8 for a few months now and I am very impressed with it. It is fast and reliable, and it produces high-quality renderings and animations. It is compatible with most of the 3D modeling software that I use, such as SketchUp Pro, Revit and Vectorworks. It also supports various formats for importing and exporting files, which makes it very convenient. I also like the post-processing tool that lets me adjust the brightness, contrast, color balance and depth of field of my renderings. Artlantis 2019 v8 is a must-have software for anyone who is involved in VR imagery." - Jane Doe, interior designer

          -
          -

          Conclusion

          -

          In conclusion, Artlantis 2019 v8 is a great application for VR imagery, rendering and animation. It has many features and benefits that make it stand out from other similar software. It is compatible with most of the 3D modeling software and supports various formats. It is fast and easy to use. You can download it for free from the official website or other online sources. However, you will need to activate it with a serial number after 30 days of use.

          - -

          We hope this article has helped you learn more about Artlantis 2019 v8 Free Download. If you have any questions or comments, feel free to leave them below.

          -
          -
          \ No newline at end of file diff --git a/spaces/falterWliame/Face_Mask_Detection/Download English Subtitles For Batman The Dark Knight 2008 Dvdrip Nlx.md b/spaces/falterWliame/Face_Mask_Detection/Download English Subtitles For Batman The Dark Knight 2008 Dvdrip Nlx.md deleted file mode 100644 index f35cf1c9429baddc0fb30577b66cb5e053145482..0000000000000000000000000000000000000000 --- a/spaces/falterWliame/Face_Mask_Detection/Download English Subtitles For Batman The Dark Knight 2008 Dvdrip Nlx.md +++ /dev/null @@ -1,20 +0,0 @@ -

          Download English Subtitles for Batman: The Dark Knight (2008) DVDRip NLX


          Download Zip ••• https://urlca.com/2uDdAr



          -
          -All files can be downloaded separately or as a torrent or magnet link. If you can’t download a file, contact us and we will help you. - -The Dark Knight Rises is a 2012 American superhero film based on the DC Comics character Batman. It is the sequel to The Dark Knight (2008) and the third and final installment in Christopher Nolan’s Dark Knight trilogy. The film was directed by Nolan and distributed by Warner Bros. Pictures. Produced by Nolan, Emma Thomas, and Charles Roven, it was written by Jonathan Nolan and Christopher Nolan, from a story by Christopher Nolan and David S. Goyer. - -Eight years after taking the blame for Harvey Dent’s crimes, Batman has vanished and Bruce Wayne lives in seclusion. When the masked terrorist Bane launches a brutal assault on Gotham City, the Dark Knight must come out of exile to face an enemy whose reach knows no limits. - -The Dark Knight Rises is rated PG-13 by the MPAA for intense sequences of violence and action, some sensuality and language. It was theatrically released in North America on July 20, 2012. - -Thanks for downloading The Dark Knight Rises English subtitles. 
          - -Tags for The Dark Knight subtitles English: - -The Dark Knight, The Dark Knight Rises, The Dark Knight Rises English subtitles
          -
          -
          -

          diff --git a/spaces/falterWliame/Face_Mask_Detection/Hip Hop Ejay 5 Crack.md b/spaces/falterWliame/Face_Mask_Detection/Hip Hop Ejay 5 Crack.md deleted file mode 100644 index 2cadcd3d387b9f64f6b3ff2913aee48a6f99b5da..0000000000000000000000000000000000000000 --- a/spaces/falterWliame/Face_Mask_Detection/Hip Hop Ejay 5 Crack.md +++ /dev/null @@ -1,46 +0,0 @@ - -

          How to Create Amazing Hip Hop Music with Hip Hop eJay 5 Reloaded

          - -

          If you are a fan of hip hop music and want to create your own tracks, you might be interested in Hip Hop eJay 5 Reloaded, a program that lets you produce professional-quality hip hop beats and songs with ease. In this article, we will show you what Hip Hop eJay 5 Reloaded is, what features it offers, and how to use it to make awesome hip hop music.

          -

          Hip Hop eJay 5 Crack


          Download ››››› https://urlca.com/2uDdU3



          - -

          What is Hip Hop eJay 5 Reloaded?

          - -

          Hip Hop eJay 5 Reloaded is a music creation software that lets you make hip hop music with your computer. It is a reloaded version of the original Hip Hop eJay 5, which was released in 2004. Hip Hop eJay 5 Reloaded has been updated to be compatible with Windows 7 and Vista, and has improved graphics and sound quality.

          - -

          Hip Hop eJay 5 Reloaded is designed for both beginners and experts, as it offers a simple and intuitive interface that lets you drag and drop sounds and effects onto a 48-track sound mixer. You can also create your own sounds and scratches with the virtual instruments and the scratch creator. You can also record your own vocals with the rap'n'record feature, and upload your tracks directly to Facebook, YouTube, and MySpace.

          - -

          What features does Hip Hop eJay 5 Reloaded offer?

          - -

          Hip Hop eJay 5 Reloaded offers a variety of features that make it a powerful and versatile tool for hip hop music production. Some of the main features are:

          -

          - -
            -
          • 10,000 license-free sounds: You can choose from a huge library of sounds that cover different genres of hip hop, such as rap, R&B, nu skool, ragga, old school, dancehall, down south, scratches, and funk. You can also import your own sounds or use the sound editor to modify them.
          • -
          • 48-track sound mixer: You can arrange and compose your tracks with the easy-to-use sound mixer that lets you adjust the volume, pan, tempo, pitch, and effects of each track. You can also use the live remixer to remix your tracks on the fly.
          • -
          • 5 virtual instruments: You can create your own melodies and beats with the virtual instruments that include a drum machine, a bass synthesizer, a poly synthesizer, a sample loop player, and a sample slicer. You can also use the MIDI keyboard to play them.
          • -
          • Scratch creator: You can add realistic DJ scratches to your tracks with the scratch creator that lets you scratch any sound with your mouse or keyboard. You can also use the scratch library to choose from predefined scratches.
          • -
          • Rap'n'record: You can record your own vocals or rap lyrics with the rap'n'record feature that lets you use a microphone or a headset. You can also edit your recordings with the voice editor.
          • -
          • Disc creator: You can burn your tracks onto audio CDs with the disc creator that lets you create playlists and labels.
          • -
          • Online upload: You can share your tracks with your friends and fans by uploading them directly to Facebook, YouTube, and MySpace with the online upload feature.
          • -
          - -

          How to use Hip Hop eJay 5 Reloaded?

          - -

          Using Hip Hop eJay 5 Reloaded is very easy and fun. Here are some basic steps to get you started:

          - -
            -
          1. Install Hip Hop eJay 5 Reloaded: You can download Hip Hop eJay 5 Reloaded from the official website or buy it from online stores. After downloading or inserting the CD-ROM, follow the instructions to install the software on your computer.
          2. -
          3. Launch Hip Hop eJay 5 Reloaded: After installing the software, launch it by clicking on its icon on your desktop or start menu. You will see the main screen that shows the sound mixer and the sound browser.
          4. -
          5. Select sounds: You can browse through the sound browser to find sounds that suit your style and mood. You can preview them by clicking on them or dragging them onto the preview player. You can also search for sounds by using keywords or filters.
          6. -
          7. Drag and drop sounds: Once you have selected some sounds, you can drag and drop them onto the sound mixer to create your tracks. You can place them on different tracks or layers to create complex arrangements. You can also adjust their parameters by clicking on them or using the toolbar.
          8. -
          9. Add effects: You can enhance your tracks with effects such as reverb, delay, chorus, flanger, distortion, filter, equalizer, compressor, limiter, gate, phaser, tremolo, wah-wah, pitch shifter, reverse, and fade in/out. Standard editing operations (cut, copy, paste, move, delete, duplicate, undo, redo) are also available.
          10. -
          11. Create scratches: You can add scratches to your tracks by using the scratch creator. You can either scratch any sound with your mouse or keyboard or use the scratch library to choose from predefined scratches. You can also adjust their parameters by clicking on them or using the toolbar.
          12. -
          13. Record vocals: You can record your own vocals or rap lyrics by using the rap'n'record feature. You need a microphone or a headset to do this. You can also edit your recordings by using the voice editor.
          14. -
          15. Save and export tracks: When you are happy with your tracks, you can save them as project files (.ej) or export them as audio files (.wav or .mp3). You can also burn them onto audio CDs by using the disc creator.
          16. -
          17. Upload tracks: If you want to share your tracks with others, you can upload them directly to Facebook, YouTube, and MySpace by using the online upload feature. You need an internet connection and an account on these platforms to do this.
          18. -
          - -

          That's it! Now you know how to use Hip Hop eJay 5 Reloaded to create amazing hip hop music. Have fun and unleash your creativity!

          -
          -
          \ No newline at end of file diff --git a/spaces/falterWliame/Face_Mask_Detection/Native Instruments Razor V130R2R.md b/spaces/falterWliame/Face_Mask_Detection/Native Instruments Razor V130R2R.md deleted file mode 100644 index 4f662329b939c9f3c62d8e091bde6a83bb62cc05..0000000000000000000000000000000000000000 --- a/spaces/falterWliame/Face_Mask_Detection/Native Instruments Razor V130R2R.md +++ /dev/null @@ -1,6 +0,0 @@ -

          Native Instruments Razor V130R2R


          Download >>> https://urlca.com/2uDdMq



          -
          -... Soundtheory Gullfoss v1.4.1-R2R · PSPaudioware PSP 2445 v1.3.0-R2R · Native Instruments Razor v1.3.0-R2R · Native Instruments Reaktor ...
          -
          -
          -

          diff --git a/spaces/farukozderim/space-building-space-30/README.md b/spaces/farukozderim/space-building-space-30/README.md deleted file mode 100644 index b134407fed6f1629cfa198c862565cbdd47e9669..0000000000000000000000000000000000000000 --- a/spaces/farukozderim/space-building-space-30/README.md +++ /dev/null @@ -1,37 +0,0 @@ ---- -title: Space Building Space 30 -emoji: 🔥 -colorFrom: red -colorTo: yellow -sdk: gradio -app_file: app.py -pinned: false ---- - -# Configuration - -`title`: _string_ -Display title for the Space - -`emoji`: _string_ -Space emoji (emoji-only character allowed) - -`colorFrom`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`colorTo`: _string_ -Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) - -`sdk`: _string_ -Can be either `gradio` or `streamlit` - -`sdk_version` : _string_ -Only applicable for `streamlit` SDK. -See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions. - -`app_file`: _string_ -Path to your main application file (which contains either `gradio` or `streamlit` Python code). -Path is relative to the root of the repository. - -`pinned`: _boolean_ -Whether the Space stays on top of your list. diff --git a/spaces/fatiXbelha/sd/AetherSX2 How to Download and Use GameLoop Emulator on Windows PC.md b/spaces/fatiXbelha/sd/AetherSX2 How to Download and Use GameLoop Emulator on Windows PC.md deleted file mode 100644 index 9a48dae005d8988163b15862919d21f32ce946ad..0000000000000000000000000000000000000000 --- a/spaces/fatiXbelha/sd/AetherSX2 How to Download and Use GameLoop Emulator on Windows PC.md +++ /dev/null @@ -1,144 +0,0 @@ -
          -

          AetherSX2 Windows Download: How to Play PS2 Games on Your PC

          -

          Do you miss playing your favorite PS2 games but don't have a console anymore? Or do you want to experience the nostalgia of PS2 gaming on a bigger screen and with better graphics? If so, you're in luck. There is a way to play PS2 games on your PC using a software called AetherSX2.

          -

          AetherSX2 is a PS2 emulator originally developed for Android devices; with the help of an Android emulator such as GameLoop, you can run it on your Windows computer and play PS2 games there. It is one of the most popular and advanced PS2 emulators available, with high compatibility, speed, and quality. In this article, we will show you how to download and install AetherSX2 on your Windows PC, and how to use it with an emulator for a better gaming experience.

          -

          aethersx2 windows download


          Download >>> https://urllie.com/2uNC6W



          -

          What is AetherSX2?

          -

          AetherSX2 is a free and open-source PS2 emulator that was developed by Tahlreth. It is based on the PCSX2 project, which is another well-known PS2 emulator. However, AetherSX2 has some unique features and improvements that make it stand out from other PS2 emulators.

          -

          Features of AetherSX2

          -

          Some of the features of AetherSX2 are:

          -
            -
          • It supports a wide range of PS2 games, including popular titles like Final Fantasy, God of War, Kingdom Hearts, Metal Gear Solid, and more.
          • -
          • It can run PS2 games at full speed, or even faster than the original console, depending on your PC hardware.
          • -
          • It can enhance the graphics and sound quality of PS2 games, using various plugins and settings. You can adjust the resolution, anti-aliasing, filtering, shaders, and more.
          • -
          • It can save and load game states at any point, allowing you to resume your game from where you left off.
          • -
          • It supports various input devices, such as keyboards, mice, gamepads, joysticks, and more. You can also customize the controls according to your preference.
          • -
          • It has a user-friendly interface that is easy to navigate and configure.
          • -
          -

          System Requirements for AetherSX2

          -

          Before you download and install AetherSX2 on your Windows PC, you need to make sure that your PC meets the minimum system requirements for running the emulator. These are:

          -
            -
          • Operating System: Windows 7 or higher (64-bit)
          • -
          • CPU: Intel Core 2 Duo or AMD Athlon X4 or higher
          • -
          • RAM: 4 GB or more
          • -
          • GPU: NVIDIA GeForce GTX 750 Ti or AMD Radeon R7 260X or higher
          • -
          • Storage: At least 1 GB of free space
          • -
          -
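As a rough illustration of the comparison described above, here is a small, hypothetical Python sketch that checks a machine's basic numbers against the minimums listed (64-bit Windows 7 or higher, 4 GB RAM, 1 GB free storage). The function name and parameters are invented for this example, and the CPU/GPU checks are omitted because hardware model comparisons can't be reduced to a simple numeric test.

```python
# Minimums taken from the requirements list above.
MIN_RAM_GB = 4
MIN_FREE_STORAGE_GB = 1

def meets_minimum_specs(windows_version: int, is_64bit: bool,
                        ram_gb: float, free_storage_gb: float) -> bool:
    """Return True if the basic numeric requirements are satisfied."""
    return (
        windows_version >= 7
        and is_64bit
        and ram_gb >= MIN_RAM_GB
        and free_storage_gb >= MIN_FREE_STORAGE_GB
    )

print(meets_minimum_specs(10, True, 8, 50))   # True  (capable machine)
print(meets_minimum_specs(7, False, 2, 0.5))  # False (32-bit, too little RAM)
```

In practice you would read these values from the System window as described in Step 1; this sketch only shows the comparison itself.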

          How to Download and Install AetherSX2 on Windows

          -

          Now that you are aware of the outstanding features and system requirements of AetherSX2, let's go over how to download and install it on your Windows PC. The process is simple and straightforward, and it only takes a few minutes.

          -

          Step 1: Check System Requirements

          -


          The first step is to check your PC's system specifications and compare them with the minimum requirements for AetherSX2. You can do this by right-clicking on the Start menu and selecting System. This will open a window that shows your PC's information, such as the operating system, processor, memory, and graphics card.

          -

          If your PC meets or exceeds the minimum requirements, you can proceed to the next step. If not, you may need to upgrade your PC hardware or look for another PS2 emulator that is compatible with your PC.

          -

          -

          Step 2: Download AetherSX2 from the Official Website

          -

          The next step is to download the latest version of AetherSX2 from the official website. You can find the download link here: https://aethersx2.com/download/. The file size is about 40 MB, and it is in ZIP format.

          -

          Make sure you download the file from the official website only, and not from any other sources. This will ensure that you get the genuine and safe version of AetherSX2, and not a fake or malicious one.

          -

          Step 3: Extract the ZIP File and Run the Installer

          -

          Once you have downloaded the ZIP file, you need to extract it to a folder on your PC. You can use any file extraction software, such as WinRAR or 7-Zip, to do this. Right-click on the ZIP file and select Extract Here or Extract to AetherSX2 (depending on your software).

          -

          This will create a folder named AetherSX2, which contains all the files and folders of the emulator. Open the folder and double-click on the file named setup.exe. This will launch the installer of AetherSX2, which will guide you through the installation process.

          -

          Follow the instructions on the screen and accept the terms and conditions. Choose a destination folder for AetherSX2, or leave it as default. Click Next and then Install to start the installation. Wait for a few seconds until the installation is complete, and then click Finish.

          -

          Step 4: Configure the Settings and Plugins

          -

          Now that you have installed AetherSX2 on your PC, you need to configure some settings and plugins to optimize its performance and compatibility. To do this, launch AetherSX2 from your desktop or Start menu. You will see a window with several tabs, such as General, Video, Audio, Input, etc.

          -

          Each tab contains different options and settings that you can adjust according to your preference and PC hardware. For example, in the Video tab, you can change the resolution, aspect ratio, frame rate limit, anti-aliasing, texture filtering, etc. In the Audio tab, you can enable or disable sound effects, volume control, latency reduction, etc.

          -

          You can also choose which plugins to use for each component of AetherSX2. Plugins are software modules that enhance the functionality of AetherSX2. For example, there are plugins for video rendering, audio processing, input devices, CD/DVD drives, etc. You can select which plugin to use from a drop-down menu in each tab.

          -

          If you are not sure which settings or plugins to use, you can leave them as default or use the presets provided by AetherSX2. You can also check out some online guides or forums for more tips and recommendations on how to configure AetherSX2 for optimal performance.

          -

          Step 5: Load the PS2 BIOS and ROMs

          -

          The last step before you can start playing PS2 games on your PC is to load the PS2 BIOS and ROMs into AetherSX2. The PS2 BIOS is a file that contains the basic input/output system of the PS2 console. It is required for AetherSX2 to emulate the PS2 hardware and run PS2 games.

          -

          The PS2 ROMs are files that contain the game data of PS2 games. They are also known as ISOs or disc images. They are required for AetherSX2 to load and play PS2 games on your PC.

          -

          To load the PS2 BIOS into AetherSX2, you need to have a copy of it on your PC. You can either dump it from your own PS2 console using a special device or software, or download it from online sources (which may be illegal in some countries). Once you have the PS2 BIOS file on your PC, copy it to the bios folder inside the AetherSX2 folder.

          -

To load the PS2 ROMs into AetherSX2, you need to have a copy of them on your PC. You can either rip them from your own PS2 discs using a special device or software, or download them from online sources (which may also be illegal in some countries). Once you have the PS2 ROM files on your PC, copy them to the games folder inside the AetherSX2 folder.

          -

          After you have copied the PS2 BIOS and ROMs files to the respective folders, you can load them into AetherSX2 by clicking on File and then Load BIOS or Load Game. You can also use the shortcuts F3 and F4 respectively. A window will pop up that shows the available BIOS and ROMs files in the folders. Select the one you want to load and click Open.
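The file-copying routine described above can be sketched in a few lines of Python. This is an illustrative helper, not part of AetherSX2 itself: the function name is invented, and the assumption is that BIOS dumps use a `.bin` extension and disc images use `.iso`, landing in the `bios` and `games` folders named in the article.

```python
import shutil
from pathlib import Path

def sort_ps2_files(source: Path, emulator_root: Path) -> dict:
    """Copy BIOS (.bin) and game (.iso) files into the emulator's folders."""
    copied = {"bios": [], "games": []}
    for f in sorted(source.iterdir()):
        if f.suffix.lower() == ".bin":
            dest = emulator_root / "bios"    # BIOS dumps go to the bios folder
        elif f.suffix.lower() == ".iso":
            dest = emulator_root / "games"   # disc images go to the games folder
        else:
            continue                         # ignore unrelated files
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, dest / f.name)       # copy, preserving timestamps
        copied[dest.name].append(f.name)
    return copied
```

After running something like this over a folder of dumped files, the Load BIOS and Load Game dialogs described above would find the files in the expected places.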

          -

          Step 6: Enjoy Playing PS2 Games on Your PC

          -

          Congratulations! You have successfully downloaded, installed, and configured AetherSX2 on your Windows PC. You can now enjoy playing your favorite PS2 games on your PC with enhanced graphics, sound, and speed. You can also save and load your game progress at any time, and customize your controls and settings as you wish.

          -

          To play a PS2 game on AetherSX2, simply load the game from the File menu or use the shortcut F4. The game will start running on a separate window, which you can resize or fullscreen as you like. You can also pause, resume, or stop the game from the Emulation menu or use the shortcuts F1, F2, or Esc respectively.

          -

          To access more options and features of AetherSX2, such as screenshots, cheats, patches, logs, etc., you can use the Tools menu or the toolbar buttons. You can also access the settings and plugins menus from the Config menu or use the shortcut F5.

          -

          How to Use AetherSX2 with an Emulator

          -

          If you want to enhance your PS2 gaming experience even further, you can use AetherSX2 with an emulator. An emulator is a software that mimics the behavior of another device or system. For example, there are emulators for Android, iOS, Nintendo, Sega, etc.

          -

          By using an emulator with AetherSX2, you can enjoy some additional benefits, such as:

          -

          Benefits of Using an Emulator with AetherSX2

          -
            -
          • You can play PS2 games on other devices or platforms, such as smartphones, tablets, Macs, Linux, etc.
          • -
          • You can use more features and functions of the emulator, such as cloud saving, multiplayer mode, online streaming, etc.
          • -
          • You can access more games and apps from the emulator's store or library.
          • -
          • You can improve the compatibility and stability of AetherSX2 with some emulators.
          • -
          -

          How to Download and Install an Emulator for AetherSX2

          -

To use an emulator with AetherSX2, you need to download and install an emulator that is compatible with PS2 games. There are many emulators available for different devices and platforms, but some of the best ones are:

• PCSX2: This is the well-known emulator that AetherSX2 is based on, and it also works on other platforms, such as Mac, Linux, and Android. It is the most popular and advanced PS2 emulator, with high compatibility, speed, and quality. You can download it from the official website or from the Google Play Store for Android devices.

• RetroArch: This is a multi-system emulator that can run PS2 games as well as many other retro games from various consoles. It has a user-friendly interface and a lot of features and functions. You can download it from the official website or from the Google Play Store for Android devices.

• Play!: This is a simple and lightweight PS2 emulator that can run PS2 games on Android devices. It has a minimalistic design and fast performance. You can download it from the official website or from the Google Play Store for Android devices.

To install an emulator on your device, follow the instructions provided by the emulator's website or app. Generally, you need to allow installation from unknown sources, download the APK file or installer, and run it on your device. You may also need to grant some permissions or access to the emulator.

          How to Launch AetherSX2 from the Emulator

          -

          After you have installed an emulator on your device, you can launch AetherSX2 from it and play PS2 games on your device. To do this, you need to follow these steps:

          -
            -
          1. Open the emulator app on your device and navigate to the settings or options menu.
          2. -
          3. Look for an option that allows you to add or load external apps or files into the emulator.
          4. -
          5. Select AetherSX2 from the list of apps or files that you have on your device.
          6. -
          7. The emulator will load AetherSX2 and show its interface on your device's screen.
          8. -
          9. You can now use AetherSX2 as you would on your PC, with the same settings and plugins.
          10. -
          11. You can also load PS2 BIOS and ROMs files into AetherSX2 from your device's storage or cloud service.
          12. -
          13. You can now enjoy playing PS2 games on your device using AetherSX2 and the emulator.
          14. -
          -

          Conclusion

          -

          In this article, we have shown you how to download and install AetherSX2 on your Windows PC, and how to use it with an emulator for a better gaming experience. AetherSX2 is a free and open-source PS2 emulator that allows you to play PS2 games on your PC or other devices with enhanced graphics, sound, and speed. It is one of the best PS2 emulators available, with high compatibility, quality, and features.

          -

          We hope you found this article helpful and informative. If you have any questions or feedback, please feel free to leave a comment below. Happy gaming!

          -

          FAQs

          -

          Here are some frequently asked questions about AetherSX2 and PS2 emulators:

          -
            -
          • Is AetherSX2 safe and legal?
          • -

            AetherSX2 is safe and legal to download and use, as long as you get it from the official website. However, downloading PS2 BIOS and ROMs files from online sources may be illegal in some countries, depending on the copyright laws. You should only use your own PS2 BIOS and ROMs files that you have legally obtained.

            -
          • What are the best PS2 games to play on AetherSX2?
          • -

            AetherSX2 supports a wide range of PS2 games, including popular titles like Final Fantasy, God of War, Kingdom Hearts, Metal Gear Solid, and more. You can check out the compatibility list of AetherSX2 here: https://aethersx2.com/compatibility/. You can also search for online reviews or recommendations for the best PS2 games to play on AetherSX2.

            -
          • How can I improve the performance of AetherSX2?
          • -

            You can improve the performance of AetherSX2 by adjusting some settings and plugins according to your PC hardware and preference. For example, you can lower the resolution, disable anti-aliasing, or use a different video plugin to reduce lag or stuttering. You can also check out some online guides or forums for more tips and tricks on how to optimize AetherSX2 for better performance.

            -
• Can I use cheats or mods with AetherSX2?
          • -

            Yes, you can use cheats or mods with AetherSX2 to enhance your gaming experience. AetherSX2 supports various cheat codes and patches that you can apply to PS2 games. You can also use mods or custom ROMs that modify the game data or add new features. However, you should be careful when using cheats or mods, as they may cause glitches, crashes, or compatibility issues with AetherSX2.

            -
          • Where can I get more help or support for AetherSX2?
          • -

            If you need more help or support for AetherSX2, you can visit the official website or the GitHub page of the emulator. There, you can find more information, documentation, tutorials, FAQs, and updates about AetherSX2. You can also contact the developer or the community of AetherSX2 through email, Discord, Reddit, or Twitter.

            -

          -
          -
          \ No newline at end of file diff --git a/spaces/fatiXbelha/sd/Apkisapk How to Install Any Android App Without Google Play Store.md b/spaces/fatiXbelha/sd/Apkisapk How to Install Any Android App Without Google Play Store.md deleted file mode 100644 index 15086abc127ea8bbb135f1d796e5e20a4dbbd640..0000000000000000000000000000000000000000 --- a/spaces/fatiXbelha/sd/Apkisapk How to Install Any Android App Without Google Play Store.md +++ /dev/null @@ -1,158 +0,0 @@ -
          -

          What is apkisapk?

          -

          If you are looking for a website that offers you a wide range of apps and games for free, then you might want to check out apkisapk. Apkisapk is a platform that allows you to download and install various Android applications and games on your device without any hassle. You can find apps and games from different categories, such as entertainment, education, sports, lifestyle, tools, etc. You can also discover new and trending apps and games that you might not find on other sources. Apkisapk is a convenient and easy way to enjoy your favorite apps and games without spending any money.

          -




          -

          Why use apkisapk?

          -

          There are many reasons why you might want to use apkisapk for downloading apps and games. Here are some of them:

          -
            -
          • You can access thousands of apps and games that are not available on Google Play Store or other official sources.
          • -
          • You can download apps and games that are normally paid or premium for free.
          • -
          • You can update your apps and games to the latest versions without waiting for the official updates.
          • -
          • You can customize your apps and games with modded features, such as unlimited coins, gems, lives, etc.
          • -
          • You can enjoy faster downloads and installations with minimal ads and pop-ups.
          • -
          • You can share your apps and games with your friends and family via Bluetooth, Wi-Fi, or other methods.
          • -
          -

          How to install apkisapk?

          -

          Installing apkisapk on your device is very simple. Just follow these steps:

          -
            -
• Go to https://apkisapk.com on your browser.
          2. -
          3. Tap on the download button and wait for the file to be downloaded.
          4. -
          5. Go to your device settings and enable the option to install apps from unknown sources.
          6. -
          7. Locate the downloaded file in your file manager and tap on it.
          8. -
          9. Follow the instructions on the screen and complete the installation.
          10. -
          11. Launch the app and start browsing for your favorite apps and games.
          12. -
          -

          How to use apkisapk?

          -

          Using apkisapk is very easy. Just follow these tips and tricks:

          -
            -
          • Use the search bar or the categories to find the app or game you want.
          • -
          • Tap on the app or game icon and read the description, ratings, reviews, screenshots, etc.
          • -
          • Tap on the download button and wait for the file to be downloaded.
          • -
          • Go to your file manager and tap on the downloaded file.
          • -
          • Install the app or game and enjoy it.
          • -
          -

          What are some of the best apps and games on apkisapk?

          -

          There are many apps and games that you can find on apkisapk. Here are some of the best ones that you should try:

          -

          -
            -
          • WhatsApp Plus: This is a modded version of WhatsApp that allows you to customize your chats, themes, privacy, and more. You can also use two WhatsApp accounts on one device with this app.
          • -
          • Spotify Premium: This is a modded version of Spotify that gives you access to unlimited music, podcasts, playlists, and more. You can also download songs offline and enjoy ad-free listening with this app.
          • -
          • PUBG Mobile: This is one of the most popular battle royale games that you can play on your device. You can join with your friends and compete with other players in various modes and maps. You can also unlock new weapons, skins, and rewards with this game.
          • -
          • Minecraft: This is a sandbox game that lets you create and explore your own world. You can build anything you can imagine with blocks, craft items, fight enemies, and more. You can also play with other players online or offline with this game.
          • -
          • Netflix: This is a modded version of Netflix that gives you access to unlimited movies, shows, documentaries, and more. You can also download videos offline and watch them in HD quality with this app.
          • -
          -

          How to watch live PSL matches 2023 on apkisapk?

          -

          If you are a cricket fan, then you might want to watch live PSL matches 2023 on your device. Here is how you can do it with apkisapk:

          -
            -
          1. Download and install the Live NetTV app from apkisapk.
          2. -
          3. Launch the app and select the Sports category.
          4. -
          5. Scroll down and find the PSL Live 2023 channel.
          6. -
          7. Tap on it and enjoy the live streaming of PSL matches on your device.
          8. -
          -

How to get passwords and use free Wi-Fi with apkisapk?

          -

If you want to get passwords for free Wi-Fi networks and use them on your device, you can use apkisapk to do it. Here is how:

          -
            -
          1. Download and install the Wi-Fi Password Hacker app from apkisapk.
          2. -
          3. Launch the app and scan for the available Wi-Fi networks around you.
          4. -
          5. Select the Wi-Fi network that you want to hack and tap on the Hack Password button.
          6. -
          7. Wait for the app to generate the password and copy it.
          8. -
          9. Go to your device settings and connect to the Wi-Fi network using the password.
          10. -
          -

How to install the GTA game free of cost with apkisapk?

          -

If you want to install the GTA game free of cost on your device, you can use apkisapk to do it. Here is how:

          -
            -
          1. Download and install the GTA San Andreas game from apkisapk.
          2. -
          3. Download and install the ZArchiver app from apkisapk.
          4. -
          5. Launch the ZArchiver app and locate the GTA San Andreas zip file in your file manager.
          6. -
          7. Extract the zip file to a folder named "com.rockstargames.gtasa".
          8. -
          9. Cut and paste the folder to "Android/obb".
          10. -
          11. Launch the GTA San Andreas game and enjoy it.
          12. -
          -
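Steps 5 through 9 above (extract the archive into a folder named after the game's package, then move that folder into Android/obb) can be sketched in Python. This is a hypothetical illustration run on a plain filesystem; on a real phone the ZArchiver app does this work through its GUI, and the storage root would be the device's internal storage.

```python
import shutil
import zipfile
from pathlib import Path

def install_obb(zip_path: Path, storage_root: Path,
                package: str = "com.rockstargames.gtasa") -> Path:
    """Extract the game archive and place its folder under Android/obb."""
    extracted = storage_root / package            # steps 5-7: extract into a
    with zipfile.ZipFile(zip_path) as zf:         #   folder named after the package
        zf.extractall(extracted)
    obb_dir = storage_root / "Android" / "obb"    # step 9: cut and paste the
    obb_dir.mkdir(parents=True, exist_ok=True)    #   folder into Android/obb
    target = obb_dir / package
    shutil.move(str(extracted), str(target))
    return target
```

After this, the game would find its data files in the standard obb location when launched.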

          What are the pros and cons of apkisapk?

          -

          Like any other website, apkisapk has its pros and cons. Here are some of them:

          -

          Pros of apkisapk

| Pro | Description |
| --- | --- |
| Free | You can download apps and games for free without paying anything. |
| Variety | You can find apps and games from different categories and genres. |
| Quality | You can download apps and games in high quality and resolution. |
| Updates | You can update your apps and games to the latest versions without waiting for official updates. |
| Mods | You can customize your apps and games with modded features, such as unlimited coins, gems, lives, etc. |
| Sharing | You can share your apps and games with your friends and family via Bluetooth, Wi-Fi, or other methods. |

          Cons of apkisapk

| Con | Description |
| --- | --- |
| Security | You might expose your device to malware, viruses, and scams by downloading apps and games from unknown sources. |
| Compatibility | You might face compatibility issues with your device or operating system by downloading apps and games that are not optimized for your device. |
| Legal | You might violate the intellectual property rights of the developers and publishers by downloading apps and games that are not authorized for distribution. |
| Ads | You might encounter annoying ads and pop-ups while browsing or downloading apps and games from apkisapk. |
| Storage | You might consume a lot of storage space on your device by downloading apps and games that are large in size. |

          How to stay safe while using apkisapk?

          -

          While apkisapk can offer you a lot of benefits, it can also pose some risks to your device and data. Therefore, you should take some precautions and tips to stay safe while using apkisapk. Here are some of them:

          -
            -
          • Use a reliable antivirus or anti-malware app on your device to scan and remove any potential threats from the downloaded files.
          • -
          • Use a VPN app on your device to hide your IP address and location from the website and avoid any legal issues.
          • -
          • Use a backup app on your device to backup your data and settings before installing any app or game from apkisapk.
          • -
          • Use a trusted browser on your device to access the website and avoid any phishing or fraudulent links.
          • -
          • Use a review app on your device to read the ratings, reviews, comments, and feedbacks of other users who have downloaded the same app or game from apkisapk.
          • -
          -

          Conclusion

          -

          Apkisapk is a website that allows you to download and install various Android apps and games on your device for free. You can find apps and games from different categories, such as entertainment, education, sports, lifestyle, tools, etc. You can also enjoy modded features, such as unlimited coins, gems, lives, etc. However, you should also be aware of the security, compatibility, legal, ads, and storage issues that might arise from using apkisapk. Therefore, you should follow the precautions and tips mentioned above to stay safe while using apkisapk. If you are looking for a convenient and easy way to enjoy your favorite apps and games without spending any money, then you should give apkisapk a try.

          -

          FAQs

          -

          Here are some of the frequently asked questions and answers about apkisapk:

          -
          1. What is the difference between apkisapk and Google Play Store?
          Apkisapk is a website that offers apps and games that are not available on Google Play Store or other official sources. You can also download apps and games that are normally paid or premium for free from apkisapk.

          2. Is apkisapk safe to use?
          Apkisapk is not completely safe to use, as it might expose your device to malware, viruses, scams, compatibility issues, legal issues, ads, and storage issues. Therefore, you should use a reliable antivirus or anti-malware app, a VPN app, a backup app, a trusted browser, and a review app to stay safe while using apkisapk.

          3. How do I update my apps and games from apkisapk?
          You can update your apps and games from apkisapk by following these steps:
          - Go to the Updates section on the app.
          - Tap on the app or game that you want to update.
          - Tap on the Update button and wait for the file to be downloaded.
          - Install the updated file on your device.

          4. How do I uninstall my apps and games from apkisapk?
          You can uninstall your apps and games from apkisapk by following these steps:
          - Go to your device settings.
          - Tap on Apps or Applications.
          - Tap on the app or game that you want to uninstall.
          - Tap on Uninstall and confirm.

          5. How do I contact the support team of apkisapk?
          You can contact the support team of apkisapk by following these steps:
          - Go to https://apkisapk.com/contact-us on your browser.
          - Fill in the required details, such as your name, email, subject, and message.
          - Tap on Send and wait for the reply.

          -
          -
          \ No newline at end of file diff --git a/spaces/fatiXbelha/sd/Chat Flirt and Fall in Love with Mystic Messenger APK.md b/spaces/fatiXbelha/sd/Chat Flirt and Fall in Love with Mystic Messenger APK.md deleted file mode 100644 index c96b48717a0761cefbe98e815fb28b0d11e183a5..0000000000000000000000000000000000000000 --- a/spaces/fatiXbelha/sd/Chat Flirt and Fall in Love with Mystic Messenger APK.md +++ /dev/null @@ -1,128 +0,0 @@ -
          -

          Mystic Messenger Apk Indir: How to Download and Play This Popular Otome Game

          -

          If you are a fan of otome games, you might have heard of Mystic Messenger, a mobile game that features an entirely new format of the gaming genre. Compared to the traditional visual novel format of other otome games, Mystic Messenger lets you interact with potential virtual paramours via a chat app designed to look like WhatsApp or LINE. The game has captivated audiences from around the world with its unique game concept, characters, and plot. But how can you download and play this game on your Android device? And what are some tips and tricks to get the most out of it? In this article, we will answer these questions and more.

          -

          What is Mystic Messenger and why is it popular?

          -

          Mystic Messenger is a mobile otome game developed by Cheritz, a small South Korean developer that previously released some otome games on Steam. It was released on July 8, 2016, for Android and August 18, 2016, for iOS. The game is described as a "storytelling messenger game" and is available in Korean, English, Spanish and Mandarin Chinese.

          -




          -

          The story of Mystic Messenger starts when the protagonist downloads an app to her phone that is disguised as a messaging app where you can meet handsome guys. However, she later discovers that the app is a messenger for a peculiar group of individuals in an organization called Rika's Fundraising Association (RFA). The protagonist agrees to join the organization after finding out that the group's leader, a woman named Rika, disappeared and the group has not been able to hold fundraisers since then. The new addition ignites renewed enthusiasm within the group, and drama ensues as the story progresses.

          -

          Mystic Messenger widely focuses on the protagonist as a member of the RFA and her relationship with the members of the group. The game has six love interests in three story modes: Casual, Deep, and Another. Each character has their own background, personality, and secrets that you can uncover by choosing different dialogue options and participating in chat rooms, phone calls, and texts with them. The game also has multiple endings depending on your choices and actions throughout the game.

          -

          One of the reasons why Mystic Messenger is popular is because of its realistic and immersive gameplay. The game uses real-time events that happen at specific times of the day or night. You will receive notifications from the app when there is a new chat room or a phone call from one of the characters. You can choose to join or ignore them, but be aware that your decisions will affect your relationship with them and the outcome of the story. The game also uses voice actors for all of the characters, making them more lifelike and expressive.

          -


          What is an apk file and why do you need it?

          -

          An apk file is an Android Package file that contains all the files and data needed to install and run an app on an Android device. It is similar to an exe file for Windows or a dmg file for Mac. You can download apk files from various sources online, such as the Google Play Store, third-party app stores, or websites that offer apk downloads.
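          One concrete consequence of this is that an apk file is literally a ZIP archive under the hood, so you can peek inside one with standard tools. Here is a minimal Python sketch (the helper name `list_apk_contents` is just for illustration, not part of any app):

```python
import zipfile

def list_apk_contents(apk_path):
    """An .apk is a ZIP archive, so the standard zipfile module
    can open it and list the files packaged inside it."""
    with zipfile.ZipFile(apk_path) as apk:
        return apk.namelist()
```

          A real apk will typically contain entries such as AndroidManifest.xml and one or more classes.dex files alongside its resources.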

          -

          You might need an apk file if you want to install an app that is not available on the Google Play Store in your region or device. For example, Mystic Messenger is not available on the Google Play Store in some countries due to licensing issues or censorship. In that case, you can download the apk file from another source and install it manually on your device. However, you should be careful when downloading apk files from unknown or untrusted sources, as they might contain malware or viruses that can harm your device or compromise your privacy.
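          If the site you download from publishes a checksum for the file (not all do — that is an assumption here), comparing it against the file you actually received is a cheap extra safety check. A small Python sketch using the standard hashlib module:

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Hash the downloaded file in chunks so a large apk
    does not have to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_hash(path, expected_hex):
    # Compare case-insensitively; published hashes vary in letter case.
    return sha256_of(path) == expected_hex.lower()
```

          If the hashes do not match, the file was corrupted in transit or tampered with, and you should not install it.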

          -

          How to download and install Mystic Messenger apk on your Android device

          -

          If you want to download and install Mystic Messenger apk on your Android device, you will need to follow these steps:

          -

          Step 1: Enable unknown sources on your device settings

          -

          Before you can install an apk file on your device, you will need to enable the option to allow installation of apps from unknown sources. This option is usually disabled by default for security reasons, but you can enable it by following these steps:

          -
            -
          • Go to your device settings and tap on Security or Privacy.
          • Find the option that says Unknown sources or Install unknown apps and toggle it on.
          • A warning message will pop up, telling you that installing apps from unknown sources can be risky. Tap on OK or Allow to proceed.
          -

          Step 2: Download the apk file from a trusted source

          -

          Next, you will need to download the apk file of Mystic Messenger from a trusted source. You can use the official website of Cheritz, the developer of the game, or other reputable websites that offer apk downloads. Here are some links where you can download the apk file of Mystic Messenger:

          - -

          Make sure you download the latest version of the apk file, which is 1.17.9 as of June 2023. The size of the apk file is about 100 MB, so make sure you have enough storage space on your device before downloading it.

          -

          Step 3: Locate and install the apk file on your device

          -

          After downloading the apk file, you will need to locate it on your device and install it. You can use a file manager app or your device's default file explorer to find the apk file. It is usually stored in the Downloads folder or the folder where you saved it. Once you find the apk file, tap on it to start the installation process. You might see a prompt asking you to confirm the installation or grant permissions to the app. Tap on Install or Allow to continue. The installation might take a few minutes, depending on your device's performance and internet connection. When the installation is complete, you will see a message saying that the app has been installed successfully. You can then tap on Open to launch the app or Done to exit the installer.

          -

          How to play Mystic Messenger and enjoy its features

          -

          Now that you have installed Mystic Messenger on your device, you can start playing it and enjoy its features. Here are some basic steps on how to play Mystic Messenger and what you can do in the game:

          -

          Choose a story mode and a character route

          -

          When you launch Mystic Messenger for the first time, you will be greeted by a tutorial that will guide you through the basics of the game. You will also be asked to choose a name and a gender for your protagonist. You can use any name and gender you want, as they will not affect the gameplay or story.

          -

          After completing the tutorial, you will be able to choose a story mode from three options: Casual, Deep, and Another. Each story mode has different characters and plots that you can explore. Casual mode has three characters: Zen, Yoosung, and Jaehee. Deep mode has two characters: Jumin and 707. Another mode has one character: V (or Ray in his second route). You can only access Deep mode after completing one route in Casual mode, and Another mode after completing one route in Deep mode.

          -

          To choose a character route, you will need to participate in chat rooms with them and make choices that will increase their affection towards you. You can also check the character's profile and see their hearts, which indicate their level of interest in you. The more hearts you have, the more likely you are to get their good ending. You can also use hourglasses, the premium currency of the game, to unlock their after endings, which are epilogues that show what happens after the main story.

          -

          Participate in chat rooms, phone calls, and texts with the characters

          -

          The main feature of Mystic Messenger is the chat rooms, where you can talk to the characters in real time. The chat rooms are scheduled at specific times of the day or night, and you will receive notifications when they are available. You can join the chat rooms by tapping on them and choosing from multiple dialogue options. The dialogue options will affect your relationship with the characters and the story progression. You can also use emojis, stickers, and photos to express yourself in the chat rooms.

          -

          In addition to chat rooms, you can also receive phone calls and texts from the characters. The phone calls are voice-acted by the actors who play the characters, and you can choose to answer or decline them. The texts are similar to the chat rooms, but they are more personal and intimate. You can also send texts to the characters and initiate conversations with them.

          -

          Both phone calls and texts will also affect your relationship with the characters and the story progression. You can also use hourglasses to unlock missed chats, phone calls, and texts that you were not able to join or answer.

          -

          Invite guests to the party and get different endings

          -

          The main goal of Mystic Messenger is to help the RFA organize a successful party and invite guests to attend it. The party is held on the 11th day of each character route, and you will need to invite at least 10 guests to get a good ending. You can invite guests by sending them emails through the app. The emails will contain questions that you need to answer correctly to persuade them to come to the party. You can also check the guest list and see their profiles and status.

          -

          The number of guests that attend the party will determine your ending. There are three types of endings: good, normal, and bad. The good ending is when you have a happy relationship with your chosen character and successfully hold the party. The normal ending is when you have a mediocre relationship with your chosen character and hold an average party. The bad ending is when you have a poor relationship with your chosen character and fail to hold the party or cause a disaster.

          -

          Tips and tricks to get the most out of Mystic Messenger

          -

          Mystic Messenger is a fun and addictive game that can keep you hooked for hours. However, it can also be challenging and frustrating if you don't know how to play it well. Here are some tips and tricks that can help you get the most out of Mystic Messenger:

          -

          Save frequently and use hourglasses to unlock missed chats

          -

          As mentioned earlier, Mystic Messenger uses real-time events that happen at specific times of the day or night. This means that you might miss some chat rooms or phone calls if you are busy or asleep. Missing chats or calls can lower your relationship with the characters and affect your ending. Therefore, it is important to save frequently and use hourglasses to unlock missed chats or calls that you want to join or answer.

          -

          You can save your progress by tapping on the menu button on the top right corner of the screen and choosing Save/Load. You can save up to 12 slots for each story mode, and you can load any slot at any time. However, loading a slot will cost you 5 hourglasses, so use them wisely.

          -

          You can use hourglasses to unlock missed chats or calls by tapping on them and choosing Unlock with 5 HGs (hourglasses). You can also use hourglasses to unlock Deep mode (80 HGs) or Another mode (300 HGs) if you don't want to wait for completing Casual mode first.
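          To budget your hourglasses before spending them, you can tally the costs listed above. A tiny Python sketch (the action names are made up for illustration, and the in-game prices stated in this guide may change):

```python
# Hourglass costs as stated in this guide (illustrative; prices may change).
HG_COSTS = {
    "unlock_missed_chat": 5,
    "load_save_slot": 5,
    "unlock_deep_mode": 80,
    "unlock_another_mode": 300,
}

def hourglasses_needed(actions):
    """Sum the cost of a planned list of actions, e.g. two missed
    chats plus unlocking Deep mode."""
    return sum(HG_COSTS[action] for action in actions)
```

          For example, catching up on two missed chats and then unlocking Deep mode would cost 5 + 5 + 80 = 90 hourglasses.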

          -

          Follow the character guides and chat times schedules to get the good endings

          -

          If you want to get the good endings for each character route, you will need to follow some character guides and chat times schedules that tell you what dialogue options to choose and when to join or avoid certain chats or calls. These guides and schedules are available online on various websites or blogs that offer Mystic Messenger tips and walkthroughs. Here are some links where you can find them:

          - -

          However, keep in mind that these guides and schedules are not 100% accurate or updated, as the game might change or update over time. You might also encounter some spoilers or errors in them, so use them at your own risk and discretion.

          -

          Be careful with your choices and avoid bad endings

          -

          Another tip to get the most out of Mystic Messenger is to be careful with your choices and avoid bad endings. Bad endings are when you have a negative relationship with your chosen character or cause a serious problem for the RFA or yourself. Bad endings can happen at different points of the game, depending on your choices and actions. There are three types of bad endings: relationship bad endings, story bad endings, and bad story endings.

          -

          Relationship bad endings are when you fail to get enough hearts from your chosen character or get too many hearts from other characters. These endings usually happen on the 5th, 7th, 9th, or 10th day of each character route. To avoid these endings, you should focus on getting hearts from your chosen character and avoid getting hearts from other characters.

          -

          Story bad endings are when you make choices that go against the character's wishes or personality, or that harm the RFA or yourself. These endings usually happen on the 6th, 7th, 8th, 9th, 10th, or 11th day of each character route. To avoid these endings, you should follow the character guides and chat times schedules mentioned above and choose the dialogue options that match the character's preferences and values.

          -

          Bad story endings are when you make choices that lead to a catastrophic outcome for the RFA or yourself. These endings usually happen on the 4th day of each story mode (Casual, Deep, or Another). To avoid these endings, you should participate in at least 30% of the chat rooms in each day and choose the dialogue options that support the RFA and yourself.

          -

          Conclusion

          -

          Mystic Messenger is a mobile otome game that offers a unique and immersive gaming experience for fans of the genre. The game lets you interact with potential virtual paramours via a chat app that mimics real-life messaging apps. The game has six love interests in three story modes, each with their own background, personality, and secrets. The game also has multiple endings depending on your choices and actions throughout the game.

          -

          To download and play Mystic Messenger on your Android device, you will need to download and install an apk file from a trusted source. You will also need to enable unknown sources on your device settings before installing the apk file. Once you have installed the game, you can choose a story mode and a character route, and participate in chat rooms, phone calls, and texts with the characters. You can also invite guests to the party and get different endings.

          -

          To get the most out of Mystic Messenger, you should save frequently and use hourglasses to unlock missed chats or calls. You should also follow the character guides and chat times schedules to get the good endings. You should also be careful with your choices and avoid bad endings.

          -

          If you are looking for a fun and addictive otome game that will keep you hooked for hours, Mystic Messenger is a great choice for you. You will not regret downloading and playing this game, as it will offer you a thrilling and romantic adventure that you will never forget.

          -

          FAQs

          -

          What is the difference between Casual, Deep, and Another Story modes?

          -

          Casual mode is the first story mode that you can access when you start playing Mystic Messenger. It has three characters: Zen, Yoosung, and Jaehee. It is suitable for beginners who want to enjoy a light-hearted and humorous story.

          -

          Deep mode is the second story mode that you can access after completing one route in Casual mode. It has two characters: Jumin and 707. It is suitable for advanced players who want to enjoy a more complex and dramatic story.

          -

          Another mode is the third story mode that you can access after completing one route in Deep mode. It has one character: V (or Ray in his second route). It is suitable for expert players who want to enjoy a more challenging and dark story.

          -

          How many love interests are there in Mystic Messenger?

          -

          There are six love interests in Mystic Messenger: Zen, Yoosung, Jaehee, Jumin, 707, and V (or Ray). Each love interest has their own character route that you can play in one of the three story modes: Casual, Deep, or Another. Each character route has multiple endings depending on your choices and actions throughout the game.

          -

          How can I get more hourglasses in Mystic Messenger?

          -

          Hourglasses are the premium currency of Mystic Messenger. You can spend them to unlock missed chats, phone calls, and texts, to load a saved slot, or to open Deep mode and Another mode early. You can earn more hourglasses by participating in chat rooms and collecting daily login rewards, and you can also buy them through in-app purchases.

          -
          -
          \ No newline at end of file diff --git a/spaces/fatiXbelha/sd/Clash of Clans Images in High Resolution The Ultimate Collection.md b/spaces/fatiXbelha/sd/Clash of Clans Images in High Resolution The Ultimate Collection.md deleted file mode 100644 index 4f5ca456996a0ef33aa5ffc58026ff556bcde8b8..0000000000000000000000000000000000000000 --- a/spaces/fatiXbelha/sd/Clash of Clans Images in High Resolution The Ultimate Collection.md +++ /dev/null @@ -1,155 +0,0 @@ -
          -

          Clash of Clans Photos Download: How to Get the Best Wallpapers for Your Device

          -

          Are you a fan of Clash of Clans, the popular strategy game that lets you build your own village and fight against other players? If so, you might want to show your love for the game by downloading some awesome Clash of Clans photos and setting them as your wallpaper on your device. In this article, we will show you how to do that in a few easy steps.

          -




          -

          Introduction

          -

          What is Clash of Clans?

          -

          Clash of Clans is a freemium mobile game developed and published by Supercell, a Finnish game company. The game was released in 2012 for iOS and in 2013 for Android. The game has millions of players worldwide and is one of the most successful mobile games ever.

          -

          In Clash of Clans, you can create your own village, join a clan, and compete with other players in various modes, such as clan wars, clan games, and seasonal events. You can also upgrade your buildings, troops, spells, and heroes to make them stronger and more effective in battle. The game features a colorful and cartoonish graphics style that appeals to both kids and adults.

          -

          Why download Clash of Clans photos?

          -

          Downloading Clash of Clans photos can be a great way to express your passion for the game and customize your device. You can choose from a variety of photos that feature different characters, scenes, and themes from the game, such as barbarians, archers, hog riders, pekkas, dragons, goblins, wizards, witches, and more. You can also find photos that show the logo, the map, the buildings, or the clans.

          -

          By downloading Clash of Clans photos, you can make your device look more fun and lively. You can also impress your friends and fellow players with your cool wallpapers. Plus, you can change your wallpapers according to your mood or preference anytime you want.

          -

          How to download Clash of Clans photos

          -

          Method 1: Use a wallpaper website

          -

          One of the easiest ways to download Clash of Clans photos is to use a wallpaper website that offers free HD wallpapers and backgrounds for various devices. Here are the steps to follow:

          -

          Step 1: Choose a website

          -

          There are many websites that offer Clash of Clans wallpapers, but not all of them are reliable or safe. Some may contain viruses, malware, or pop-up ads that can harm your device or compromise your privacy. Therefore, you should choose a reputable website that has good reviews and ratings from users.

          -

          -

          Some examples of good wallpaper websites are:

          -
            -
          • [Wallpaper Abyss]: This website has over 60 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by resolution, popularity, or date added. You can also view them in slideshow mode or download them in zip files.
          • [Wallpaper Flare]: This website has over 100 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by resolution or category. You can also view them in full screen mode or download them in various sizes.
          • [Wallpaper Cave]: This website has over 80 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by resolution, popularity, or date added. You can also view them in slideshow mode or download them in zip files.
          -

          Step 2: Browse the photos

          -

          Once you have chosen a website, you can browse the photos and find the ones that you like. You can preview them by clicking on them or hovering over them with your mouse. You can also read the descriptions, ratings, or comments from other users if available.

          -

          Some tips to choose the best photos are:

          -
            -
          • Look for photos that match your device's resolution or aspect ratio. This will ensure that the photos fit your screen and look sharp and clear.
          • Look for photos that suit your personal taste and style. You can choose photos that reflect your favorite character, clan, or mode from the game, or photos with a color scheme, mood, or theme that you like.
          • Look for photos that are original and creative. Avoid photos that are too common, generic, or boring, as well as photos with watermarks, logos, or text that distract from the image.
          -
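          The first tip above — matching your screen's resolution or aspect ratio — is simple arithmetic: reduce both resolutions by their greatest common divisor and compare. A small Python sketch (the function names are just for illustration):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest ratio, e.g. 1920x1080 -> (16, 9)."""
    d = gcd(width, height)
    return (width // d, height // d)

def fits_screen(wallpaper, screen):
    # A wallpaper displays without cropping or stretching when both
    # resolutions share the same reduced aspect ratio.
    return aspect_ratio(*wallpaper) == aspect_ratio(*screen)
```

          For example, a 3840x2160 wallpaper fits a 1920x1080 screen (both are 16:9), while a 1280x1024 wallpaper (5:4) would be cropped or stretched on it.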

          Step 3: Download the photo

          -

          After you have found the photo that you want, you can download it to your device by following the instructions on the website. Usually, you will need to click on a download button or link and choose a file format and size. Some websites may require you to register or sign in before downloading.

          -

          Some tips to download the photos safely and quickly are:

          -
            -
          • Check the file size and format before downloading. Choose a file size that is neither too large nor too small for your device, and a format that is compatible with it, such as JPG, PNG, or BMP.
          • Check the source and quality of the photo before downloading. Avoid unknown or suspicious websites that may contain viruses, malware, or pop-up ads, and skip photos that are low-quality, blurry, or pixelated.
          • Save the photo in a folder or location that you can easily access later, and rename it if necessary to make it easier to identify and organize.
          -
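          As a quick sanity check after downloading, you can verify the extension and file size programmatically. A small Python sketch — the format list comes from the compatible formats mentioned above, while the size limits are illustrative assumptions, not official values:

```python
import os

# Formats mentioned above as widely compatible on Android devices.
COMPATIBLE_FORMATS = {".jpg", ".jpeg", ".png", ".bmp"}

def looks_usable(path, min_bytes=10_000, max_bytes=20_000_000):
    """Reject files with an incompatible extension, or a size
    that suggests a broken download or an oversized file.
    The byte thresholds are illustrative defaults."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in COMPATIBLE_FORMATS:
        return False
    size = os.path.getsize(path)
    return min_bytes <= size <= max_bytes
```

          A file that fails this check is not necessarily bad, but it is worth a second look before you set it as your wallpaper.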

          Method 2: Use a wallpaper app

          -

          Another way to download Clash of Clans photos is to use a wallpaper app that offers free HD wallpapers and backgrounds for various devices. Here are the steps to follow:

          -

          Step 1: Choose an app

          -

          There are many apps that offer Clash of Clans wallpapers, but not all of them are reliable or safe. Some may contain viruses, malware, or pop-up ads that can harm your device or compromise your privacy. Therefore, you should choose a reputable app that has good reviews and ratings from users.

          -

          Some examples of good wallpaper apps are:

          -
            -
          • [Clash Wallpapers HD]: This app has over 1000 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by category, popularity, or date added. You can also view them in full screen mode or download them in various sizes.
          • -
          • [Clash of Clans Wallpapers]: This app has over 500 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by category or resolution. You can also view them in slideshow mode or download them in zip files.
          • -
          • [Clash of Clans Wallpaper HD]: This app has over 300 free HD wallpapers and backgrounds for Clash of Clans. You can filter them by category or resolution. You can also view them in full screen mode or download them in various sizes.
          • -
          -

          Step 2: Browse the photos

          -

          Once you have chosen an app, you can browse the photos and find the ones that you like. You can preview them by tapping on them or swiping left or right on your screen. You can also read the descriptions, ratings, or comments from other users if available.

          -

          The tips to choose the best photos are the same as those for using a wallpaper website.

          -

          Step 3: Download the photo

          -

          After you have found the photo that you want, you can download it to your device by following the instructions on the app. Usually, you will need to tap on a download button or icon and choose a file format and size. Some apps may require you to register or sign in before downloading.

          -

          The tips to download the photos safely and quickly are the same as those for using a wallpaper website.

          -

          How to set Clash of Clans photos as your wallpaper

          -

          Now that you have downloaded some Clash of Clans photos, you can set them as your wallpaper on your device. The steps may vary depending on the type and model of your device, but here are some general guidelines:

          -

          For Windows PC

          -

          If you are using a Windows PC, you can set a Clash of Clans photo as your wallpaper by following these steps:

          -
            -
          1. Right-click on an empty area of your desktop and select Personalize.
          2. -
          3. Click on Background in the left menu and choose Picture from the drop-down list.
          4. -
          5. Click on Browse and navigate to the folder where you saved the photo.
          6. -
          7. Select the photo and click on Choose picture.
          8. -
          9. Adjust the fit and position of the photo as you like.
          10. -
          11. Click on Apply and enjoy your new wallpaper.
          12. -
          -
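If you would rather script the change than click through Personalize, Windows exposes the same action via the Win32 `SystemParametersInfoW` call. The sketch below is one possible wrapper (the function name and the False-on-failure convention are our own); it simply returns False when run off Windows or when the file is missing.

```python
import ctypes
import os
import sys

SPI_SETDESKWALLPAPER = 20   # SystemParametersInfo action: set desktop wallpaper
SPIF_UPDATEINIFILE = 0x01   # persist the change to the user profile
SPIF_SENDCHANGE = 0x02      # broadcast the change to running applications

def set_windows_wallpaper(image_path: str) -> bool:
    """Set the desktop wallpaper on Windows; returns False elsewhere or if the file is missing."""
    if sys.platform != "win32" or not os.path.isfile(image_path):
        return False
    return bool(ctypes.windll.user32.SystemParametersInfoW(
        SPI_SETDESKWALLPAPER, 0, os.path.abspath(image_path),
        SPIF_UPDATEINIFILE | SPIF_SENDCHANGE))
```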

          For Mac OS

          -

          If you are using a Mac OS, you can set a Clash of Clans photo as your wallpaper by following these steps:

          -
            -
          1. Open the Finder and locate the photo that you want to use.
          2. -
          3. Right-click on the photo and select Set Desktop Picture.
          4. -
          5. Alternatively, you can open the System Preferences and click on Desktop & Screen Saver.
          6. -
          7. Click on the + button at the bottom left and navigate to the folder where you saved the photo.
          8. -
          9. Select the photo and adjust the fit and position as you like.
          10. -
          11. Close the System Preferences and enjoy your new wallpaper.
          12. -
          -
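On macOS the same change can be scripted by asking Finder via AppleScript, which Python can drive through the `osascript` command-line tool. The wrapper below is a sketch under those assumptions; it returns False on any other platform or when `osascript` is unavailable.

```python
import shutil
import subprocess
import sys

def set_mac_wallpaper(image_path: str) -> bool:
    """Ask Finder to change the desktop picture via AppleScript (macOS only)."""
    if sys.platform != "darwin" or shutil.which("osascript") is None:
        return False
    script = (
        'tell application "Finder" to set desktop picture to '
        f'POSIX file "{image_path}"'
    )
    return subprocess.run(["osascript", "-e", script]).returncode == 0
```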

          For Android devices

          -

          If you are using an Android device, you can set a Clash of Clans photo as your wallpaper by following these steps:

          -
            -
          1. Open the Gallery app and find the photo that you want to use.
          2. -
          3. Tap on the photo and then tap on the three-dot icon at the top right corner.
          4. -
          5. Select Set as wallpaper from the menu that appears.
          6. -
          7. Choose whether you want to set the photo as your home screen, lock screen, or both.
          8. -
          9. Crop and adjust the photo as you like and tap on Set wallpaper.
          10. -
          11. Enjoy your new wallpaper.
          12. -
          -

          For iOS devices

          -

          If you are using an iOS device, you can set a Clash of Clans photo as your wallpaper by following these steps:

          -
            -
          1. Open the Photos app and find the photo that you want to use.
          2. -
          3. Tap on the photo and then tap on the share icon at the bottom left corner.
          4. -
          5. Select Use as Wallpaper from the menu that appears.
          6. -
          7. Crop and adjust the photo as you like and tap on Set.
          8. -
          9. Choose whether you want to set the photo as your home screen, lock screen, or both.
          10. -
          11. Enjoy your new wallpaper.
          12. -
          -

          Conclusion

          -

          Summary of the main points

          -

          In this article, we have shown you how to download and set Clash of Clans photos as your wallpaper on your device. We have explained what Clash of Clans is, why you might want to download its photos, how to download them using a website or an app, and how to set them as your wallpaper for different devices. We hope that this article has been helpful and informative for you.

          -

          Call to action

          -

          If you are a fan of Clash of Clans, we encourage you to try out some of the photos that we have suggested or find your own ones online. You can also share this article with your friends and fellow players who might be interested in downloading Clash of Clans photos. You can also leave us a comment below and let us know what you think about this article or about Clash of Clans in general. We would love to hear from you!

          -

          Frequently Asked Questions

          -

          Here are some common questions that people might have about downloading and setting Clash of Clans photos as their wallpaper:

          -

          Q: How can I download Clash of Clans photos without using a website or an app?

          -

          A: You can also download Clash of Clans photos directly from the game itself. To do this, you need to open the game and go to Settings. Then, tap on More Settings and scroll down to Wallpaper. You will see a list of photos that you can download by tapping on them. You can also view them in full screen mode by tapping on Preview. However, this method may not offer as many options or quality as using a website or an app.

          -

          Q: How can I make my own Clash of Clans photos to use as my wallpaper?

          -

          A: If you are feeling creative and want to make your own Clash of Clans photos, you can use a photo editing software or app to do so. You can use your own photos or images from the game or the internet and add filters, effects, text, stickers, or other elements to make them unique and personalized. You can also use a collage maker to combine multiple photos into one. However, this method may require some skills and time to produce good results.

          -

          Q: How can I download Clash of Clans photos for other devices or platforms?

          -

          A: If you want to download Clash of Clans photos for other devices or platforms that we have not covered in this article, you can try to find a website or an app that supports them. Alternatively, you can download the photos to your computer and then transfer them to your device using a USB cable, Bluetooth, or cloud service. However, this method may not work for all devices or platforms and may require some adjustments to fit the screen size and resolution.

          -
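The USB-transfer route mentioned above can be automated on Android with `adb push`, which copies a file from your computer to the device. A minimal Python sketch (the function name and the target directory `/sdcard/Pictures/` are assumptions for illustration) that fails gracefully when `adb` is not installed:

```python
import shutil
import subprocess

def push_to_android(local_path: str, remote_dir: str = "/sdcard/Pictures/") -> bool:
    """Copy a photo to a USB-connected Android device with adb, if adb is available."""
    if shutil.which("adb") is None:
        return False  # adb (Android platform tools) is not installed
    result = subprocess.run(["adb", "push", local_path, remote_dir],
                            capture_output=True)
    return result.returncode == 0
```

After the push succeeds, the photo appears in the device's gallery and can be set as wallpaper with the usual on-device steps.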

          Q: How can I update or change my Clash of Clans wallpaper?

          -

          A: If you want to update or change your Clash of Clans wallpaper, you can simply repeat the steps that we have shown in this article. You can download new photos from the same or different sources and set them as your wallpaper using the same or different methods. You can also delete or move the old photos from your device if you want to save space or organize your files.

          -

          Q: How can I get more information or help about downloading and setting Clash of Clans photos as my wallpaper?

          -

          A: If you have any questions, problems, or feedback about downloading and setting Clash of Clans photos as your wallpaper, you can contact us by email at clashofclansphotos@gmail.com. We will try our best to answer your queries and resolve your issues as soon as possible. You can also visit our website at www.clashofclansphotos.com for more tips and tricks on how to get the best wallpapers for your device.

          -
          -
          \ No newline at end of file diff --git a/spaces/fatiXbelha/sd/Dr. Driving 5 APK The Best Driving Game for Android Users.md b/spaces/fatiXbelha/sd/Dr. Driving 5 APK The Best Driving Game for Android Users.md deleted file mode 100644 index 9503373b4da5f12a7a79c14247a7623a9f84d024..0000000000000000000000000000000000000000 --- a/spaces/fatiXbelha/sd/Dr. Driving 5 APK The Best Driving Game for Android Users.md +++ /dev/null @@ -1,114 +0,0 @@ -
          -

          Dr. Driving 5 APK: A Realistic Driving Simulator for Android

          -

          If you are looking for a car game that is not about racing or speed, but rather about driving carefully and realistically, then you might want to try Dr. Driving 5 APK. This is a mobile simulation game from SUD Inc. that challenges you to complete various driving missions, such as finding parking, managing traffic, or reaching destinations with limited fuel. In this article, we will tell you everything you need to know about Dr. Driving 5 APK, including what it is, how to download and install it, how to play it, what are its features, and what are its pros and cons.

          -

          What is Dr. Driving 5 APK?

          -

          Dr. Driving 5 APK is a free-to-play driving game that veers away from classic car games, where speed is the key. Unlike other car simulators, it has no racing missions either. Instead, it focuses on providing a realistic driving experience, helping players ease through the traffic and reach their destination.

          -

          dr. driving 5 apk


          Download Zip ❤❤❤ https://urllie.com/2uNDH5



          -

          A free-to-play driving game that challenges you to drive well, find parking, and manage traffic

          -

          Dr. Driving 5 APK has various missions and challenges that test your driving skills. Some involve navigating your way out of a multi-level car park or driving through traffic within a limited amount of time; at other times you will be tasked with reaching different destinations on limited fuel. All missions, however, place importance on driving carefully, following the traffic regulations, and avoiding crashes. The traffic police are everywhere in the game, ready to write tickets, and a collision will end your game.

          -

          A mobile simulation game that provides a realistic experience to players

          -

          Dr. Driving 5 APK gives you a clear view of the road as if you are driving in real life. As you cover more roads, directions will appear on the screen, which you will need to follow. The game's controls are on a makeshift dashboard, making it a little distracting but also realistic. You can control your car by tilting your phone left and right while pressing the on-screen buttons to start and stop.

          -


          -

          A game that works offline and online, with multiplayer mode and online leaderboards

          -

          Dr. Driving 5 APK can be played offline. You can complete multiple missions and win rewards in the game without signing in or using an internet connection. However, it also allows you to challenge other players with its online multiplayer mode, and you can share your stats online to see how you fare compared to others. For this, however, you need to sign in through Facebook or Google+.

          -

          How to download and install Dr. Driving 5 APK?

          -

          Dr. Driving 5 APK is not available on the Google Play Store, so you will need to download it from a third-party source. Here are the steps to download and install Dr. Driving 5 APK on your Android device:

          -

          Download the APK file from a trusted source

          -

          First, you need to find a reliable website that offers the Dr. Driving 5 APK file. You can search for it on Google or use the link below. Make sure that the file is free of viruses and malware before downloading it.

          -
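One practical way to check that an APK from a third-party site has not been tampered with is to compare its SHA-256 checksum against the one the download page publishes (when it publishes one). A short Python sketch, streaming the file so large APKs don't need to fit in memory:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (the expected value would come from the download site, if provided):
# assert sha256_of("dr_driving_5.apk") == expected_checksum
```

A matching checksum only proves the file arrived intact and unmodified; it does not by itself prove the site's copy is trustworthy, so a virus scan is still worthwhile.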

          Enable unknown sources on your device settings

          -

          Next, you need to allow your device to install apps from unknown sources. To do this, go to your device settings and look for the security or privacy option. Then, enable the unknown sources toggle or checkbox. This will let you install apps that are not from the Google Play Store.

          -

          Install the APK file and launch the game

          -

          Finally, you need to locate the downloaded APK file on your device storage and tap on it to start the installation process. Follow the instructions on the screen and wait for the installation to finish. Once done, you can launch the game and enjoy driving.

          -

          How to play Dr. Driving 5 APK?

          -

          Dr. Driving 5 APK is easy to play but hard to master. Here are some tips on how to play Dr. Driving 5 APK:

          -

          Choose a mission and a car

          -

          When you start the game, you will see a map with different locations and missions. You can tap on any of them to see the details and requirements of the mission. Some missions are locked until you complete certain levels or buy certain cars. You can also choose a car from your garage or buy a new one with your coins.

          -

          Control your car by tilting your phone and pressing the on-screen buttons

          -

          To drive your car, you need to tilt your phone left and right to steer and press the on-screen buttons to accelerate, brake, or reverse. You can also change the camera angle by swiping up or down on the screen. You will see a speedometer, a fuel gauge, and a map on your dashboard.

          -

          Follow the directions and avoid crashing or breaking the traffic rules

          -

          Your goal is to complete the mission as fast as possible without crashing or breaking the traffic rules. You will see arrows on the road that indicate where you need to go. You also need to follow the traffic lights, signs, and signals. If you crash, run out of fuel, or get a ticket, you will fail the mission.

          -

          What are the features of Dr. Driving 5 APK?

          -

          Dr. Driving 5 APK has many features that make it an enjoyable and realistic driving simulator. Some of them are:

          -

          Various missions and challenges to test your driving skills

          -

          The game has over 100 missions and challenges that range from easy to hard. Some of them include parking, fuel efficiency, speed limit, lane change, highway driving, and more. Each mission has different objectives and rewards that you can use to buy new cars or upgrade your existing ones.

          -

          A wide selection of cars and engine parts to buy and upgrade

          -

          The game has over 50 cars that you can buy with your coins or real money. Each car has different stats and performance that affect your driving experience. You can also customize your car by changing its color, wheels, spoiler, etc. Moreover, you can upgrade your car's engine parts such as turbo, nitro, brake, etc.

          -

          A realistic dashboard and a clear view of the road

          -

          The game has a realistic dashboard that shows you all the information you need while driving, such as speed, fuel, map, etc. You can also change the camera angle by swiping up or down on the screen. The game has a clear view of the road that lets you see everything around you, such as other cars, pedestrians, buildings, etc.

          -

          What are the pros and cons of Dr. Driving 5 APK?

          -

          Like any other game, Dr. Driving 5 APK has its pros and cons that you should consider before playing it. Here are some of them:

          -

          Pros:

          -
            -
          • Fun and addictive gameplay

            The game is fun and addictive because it offers various missions and challenges that test your driving skills. It also has a multiplayer mode that lets you compete with other players online.

          • -
          • Easy and responsive controls

            The game has easy and responsive controls that let you drive your car smoothly and realistically. You can control your car by tilting your phone and pressing the on-screen buttons.

          • -
          • Offline and online modes

            The game works offline and online, so you can play it anytime and anywhere. You can also play with other players online and share your stats on the leaderboards.

          • -
          -

          Cons:

          -
            -
          • Low-quality graphics and sound effects

            The game has low-quality graphics and sound effects that may not appeal to some players. The game's visuals are pixelated and blurry, and the sound effects are repetitive and unrealistic.

          • -
          • Repetitive and limited scenarios

            The game has repetitive and limited scenarios that may bore some players. The game's missions are mostly similar and have the same objectives. The game's environments are also limited and have the same buildings, roads, and traffic.

          • -
          • Ads and in-app purchases

            The game has ads and in-app purchases that may annoy some players. The game shows ads frequently, especially after completing a mission or failing a mission. The game also offers in-app purchases that let you buy more coins or remove ads.

          • -
          -

          Conclusion

          -

          Dr. Driving 5 APK is a realistic driving simulator for Android that challenges you to complete various driving missions, such as finding parking, managing traffic, or reaching destinations with limited fuel. The game has various features, such as a wide selection of cars, a realistic dashboard, and a multiplayer mode. However, the game also has some drawbacks, such as low-quality graphics, repetitive scenarios, and ads. Overall, Dr. Driving 5 APK is a fun and addictive game that can help you improve your driving skills and enjoy driving.

          -

          FAQs

          -

          Here are some frequently asked questions about Dr. Driving 5 APK:

          -

          Q: Is Dr. Driving 5 APK safe to download?

          -

          A: Dr. Driving 5 APK is safe to download as long as you download it from a trusted source. However, you should always scan the file for viruses and malware before installing it on your device.

          -

          Q: How can I remove ads from Dr. Driving 5 APK?

          -

          A: You can remove ads from Dr. Driving 5 APK by buying the ad-free version of the game for $1.99 or by turning off your internet connection while playing the game.

          -

          Q: How can I earn more coins in Dr. Driving 5 APK?

          -

          A: You can earn more coins in Dr. Driving 5 APK by completing more missions, winning more multiplayer matches, watching video ads, or buying them with real money.

          -

          Q: How can I change the language of Dr. Driving 5 APK?

          -

          A: You can change the language of Dr. Driving 5 APK by going to the settings menu and selecting the language option. The game supports English, Korean, Japanese, Chinese, Spanish, Portuguese, German, French, Italian, Russian, Turkish, Arabic, Indonesian, Thai, Vietnamese, Hindi, Malay, Filipino, Bengali, Urdu, Persian, Polish, Romanian, Hungarian, Czech, Slovakian, Greek, Bulgarian, Serbian, Croatian, Slovenian, Catalan, and more.

          -

          Q: How can I contact the developer of Dr. Driving 5 APK?

          -

          A: You can contact the developer of Dr. Driving 5 APK by sending an email to sudinc.help@gmail.com or by visiting their website at http://sudinc.co.kr/.

          -
          -
          \ No newline at end of file diff --git a/spaces/fatimahhussain/workoutwizard/app.py b/spaces/fatimahhussain/workoutwizard/app.py deleted file mode 100644 index 152b68c916b0f4a5ddcf3f69abe05ef759ca8a30..0000000000000000000000000000000000000000 --- a/spaces/fatimahhussain/workoutwizard/app.py +++ /dev/null @@ -1,104 +0,0 @@ -"""Object detection demo with MobileNet SSD. -This model and code are based on -https://github.com/robmarkcole/object-detection-app -""" - -import logging -import queue -from pathlib import Path -from typing import List, NamedTuple -import mediapipe as mp -import av -import cv2 -import numpy as np -import streamlit as st -from streamlit_webrtc import WebRtcMode, VideoTransformerBase, webrtc_streamer -# from streamlit_webrtc import WebRtcMode, webrtc_streamer -from sample_utils.download import download_file -from sample_utils.turn import get_ice_servers -from email.mime.text import MIMEText - -# HERE = Path(__file__).parent -# ROOT = HERE - -# logger = logging.getLogger(__name__) - -st.set_page_config(layout="centered") - -mp_face_detection = mp.solutions.face_detection -mp_drawing = mp.solutions.drawing_utils - -def calculate_angle(a, b, c): - a = np.array(a) - b = np.array(b) - c = np.array(c) - - radians = np.arctan2(c[1]-b[1], c[0]-b[0]) - np.arctan2(a[1]-b[1], a[0]-b[0]) - angle = np.abs(radians*180.0/np.pi) - - if angle > 180.0: - angle = 360 - angle - - return angle - - - -def main(): - - st.title("Workout Wizard") - - st.info("Wanna be your best self? 
Try out the exercises on the left to see if you're performing them correctly!") - - contact_form = """ - - - - - - - """ - st.write("Please only click submit button once or twice-- it does go through!") - st.markdown(contact_form, unsafe_allow_html=True) - - - -if __name__ == '__main__': - main() - - - - - - -#form alternative, https://sendsmtpemail.streamlit.app/?utm_medium=oembed for more info - - -# st.markdown(""" -# **Enter your email, subject, and email body then hit send to receive an email from `summittradingcard@gmail.com`!** -# """) - -# # Taking inputs -# email_sender = st.text_input('From', 'summittradingcard@gmail.com', disabled=True) -# email_receiver = st.text_input('To') -# subject = st.text_input('Subject') -# body = st.text_area('Body') - -# # Hide the password input -# password = st.text_input('Password', type="password", disabled=True) - -# if st.button("Send Email"): -# try: -# msg = MIMEText(body) -# msg['From'] = email_sender -# msg['To'] = email_receiver -# msg['Subject'] = subject - -# server = smtplib.SMTP('smtp.gmail.com', 587) -# server.starttls() -# server.login(st.secrets["email"]["gmail"], st.secrets["email"]["password"]) -# server.sendmail(email_sender, email_receiver, msg.as_string()) -# server.quit() - -# st.success('Email sent successfully! 🚀') -# except Exception as e: -# st.error(f"Failed to send email: {e}") \ No newline at end of file diff --git a/spaces/fclong/summary/fengshen/models/unimc/modeling_unimc.py b/spaces/fclong/summary/fengshen/models/unimc/modeling_unimc.py deleted file mode 100644 index 88c924d69dfd7b7b367e3c527135d80a6b90b2e2..0000000000000000000000000000000000000000 --- a/spaces/fclong/summary/fengshen/models/unimc/modeling_unimc.py +++ /dev/null @@ -1,660 +0,0 @@ -# coding=utf-8 -# Copyright 2021 The IDEA Authors. All rights reserved. - -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at - -# http://www.apache.org/licenses/LICENSE-2.0 - -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -from logging import basicConfig -import torch -from torch import nn -import json -from tqdm import tqdm -import os -import numpy as np -from transformers import BertTokenizer -import pytorch_lightning as pl - -from pytorch_lightning.callbacks import ModelCheckpoint -from pytorch_lightning import trainer, loggers -from torch.utils.data import Dataset, DataLoader -from transformers.optimization import get_linear_schedule_with_warmup -from transformers import BertForMaskedLM, AlbertTokenizer -from transformers import AutoConfig -from transformers.pipelines.base import Pipeline -from transformers import MegatronBertForMaskedLM -from fengshen.models.deberta_v2.modeling_deberta_v2 import DebertaV2ForMaskedLM -from fengshen.models.albert.modeling_albert import AlbertForMaskedLM -import argparse -import copy -from fengshen.utils.universal_checkpoint import UniversalCheckpoint -import warnings -from transformers import TextClassificationPipeline as HuggingfacePipe - - -class UniMCDataset(Dataset): - def __init__(self, data, yes_token, no_token, tokenizer, args, used_mask=True): - super().__init__() - - self.tokenizer = tokenizer - self.max_length = args.max_length - self.num_labels = args.num_labels - self.used_mask = used_mask - self.data = data - self.args = args - self.yes_token = yes_token - self.no_token = no_token - - def __len__(self): - return len(self.data) - - def __getitem__(self, index): - return self.encode(self.data[index], self.used_mask) - - def get_token_type(self, sep_idx, max_length): - token_type_ids = np.zeros(shape=(max_length,)) - for i in 
range(len(sep_idx)-1): - if i % 2 == 0: - ty = np.ones(shape=(sep_idx[i+1]-sep_idx[i],)) - else: - ty = np.zeros(shape=(sep_idx[i+1]-sep_idx[i],)) - token_type_ids[sep_idx[i]:sep_idx[i+1]] = ty - - return token_type_ids - - def get_position_ids(self, label_idx, max_length, question_len): - question_position_ids = np.arange(question_len) - label_position_ids = np.arange(question_len, label_idx[-1]) - for i in range(len(label_idx)-1): - label_position_ids[label_idx[i]-question_len:label_idx[i+1]-question_len] = np.arange( - question_len, question_len+label_idx[i+1]-label_idx[i]) - max_len_label = max(label_position_ids) - text_position_ids = np.arange( - max_len_label+1, max_length+max_len_label+1-label_idx[-1]) - position_ids = list(question_position_ids) + \ - list(label_position_ids)+list(text_position_ids) - if max_length <= 512: - return position_ids[:max_length] - else: - for i in range(512, max_length): - if position_ids[i] > 511: - position_ids[i] = 511 - return position_ids[:max_length] - - def get_att_mask(self, attention_mask, label_idx, question_len): - max_length = len(attention_mask) - attention_mask = np.array(attention_mask) - attention_mask = np.tile(attention_mask[None, :], (max_length, 1)) - - zeros = np.zeros( - shape=(label_idx[-1]-question_len, label_idx[-1]-question_len)) - attention_mask[question_len:label_idx[-1], - question_len:label_idx[-1]] = zeros - - for i in range(len(label_idx)-1): - label_token_length = label_idx[i+1]-label_idx[i] - if label_token_length <= 0: - print('label_idx', label_idx) - print('question_len', question_len) - continue - ones = np.ones(shape=(label_token_length, label_token_length)) - attention_mask[label_idx[i]:label_idx[i+1], - label_idx[i]:label_idx[i+1]] = ones - - return attention_mask - - def random_masking(self, token_ids, maks_rate, mask_start_idx, max_length, mask_id, tokenizer): - rands = np.random.random(len(token_ids)) - source, target = [], [] - for i, (r, t) in enumerate(zip(rands, token_ids)): - if 
i < mask_start_idx: - source.append(t) - target.append(-100) - continue - if r < maks_rate * 0.8: - source.append(mask_id) - target.append(t) - elif r < maks_rate * 0.9: - source.append(t) - target.append(t) - elif r < maks_rate: - source.append(np.random.choice(tokenizer.vocab_size - 1) + 1) - target.append(t) - else: - source.append(t) - target.append(-100) - while len(source) < max_length: - source.append(0) - target.append(-100) - return source[:max_length], target[:max_length] - - def encode(self, item, used_mask=False): - - while len(self.tokenizer.encode('[MASK]'.join(item['choice']))) > self.max_length-32: - item['choice'] = [c[:int(len(c)/2)] for c in item['choice']] - - if 'textb' in item.keys() and item['textb'] != '': - if 'question' in item.keys() and item['question'] != '': - texta = '[MASK]' + '[MASK]'.join(item['choice']) + '[SEP]' + \ - item['question'] + '[SEP]' + \ - item['texta']+'[SEP]'+item['textb'] - else: - texta = '[MASK]' + '[MASK]'.join(item['choice']) + '[SEP]' + \ - item['texta']+'[SEP]'+item['textb'] - - else: - if 'question' in item.keys() and item['question'] != '': - texta = '[MASK]' + '[MASK]'.join(item['choice']) + '[SEP]' + \ - item['question'] + '[SEP]' + item['texta'] - else: - texta = '[MASK]' + '[MASK]'.join(item['choice']) + \ - '[SEP]' + item['texta'] - - encode_dict = self.tokenizer.encode_plus(texta, - max_length=self.max_length, - padding='max_length', - truncation='longest_first') - - encode_sent = encode_dict['input_ids'] - token_type_ids = encode_dict['token_type_ids'] - attention_mask = encode_dict['attention_mask'] - sample_max_length = sum(encode_dict['attention_mask']) - - if 'label' not in item.keys(): - item['label'] = 0 - item['answer'] = '' - - question_len = 1 - label_idx = [question_len] - for choice in item['choice']: - cur_mask_idx = label_idx[-1] + \ - len(self.tokenizer.encode(choice, add_special_tokens=False))+1 - label_idx.append(cur_mask_idx) - - token_type_ids = [0]*question_len+[1] * \ - 
(label_idx[-1]-label_idx[0]+1)+[0]*self.max_length - token_type_ids = token_type_ids[:self.max_length] - - attention_mask = self.get_att_mask( - attention_mask, label_idx, question_len) - - position_ids = self.get_position_ids( - label_idx, self.max_length, question_len) - - clslabels_mask = np.zeros(shape=(len(encode_sent),)) - clslabels_mask[label_idx[:-1]] = 10000 - clslabels_mask = clslabels_mask-10000 - - mlmlabels_mask = np.zeros(shape=(len(encode_sent),)) - mlmlabels_mask[label_idx[0]] = 1 - - # used_mask=False - if used_mask: - mask_rate = 0.1*np.random.choice(4, p=[0.3, 0.3, 0.25, 0.15]) - source, target = self.random_masking(token_ids=encode_sent, maks_rate=mask_rate, - mask_start_idx=label_idx[-1], max_length=self.max_length, - mask_id=self.tokenizer.mask_token_id, tokenizer=self.tokenizer) - else: - source, target = encode_sent[:], encode_sent[:] - - source = np.array(source) - target = np.array(target) - source[label_idx[:-1]] = self.tokenizer.mask_token_id - target[label_idx[:-1]] = self.no_token - target[label_idx[item['label']]] = self.yes_token - - input_ids = source[:sample_max_length] - token_type_ids = token_type_ids[:sample_max_length] - attention_mask = attention_mask[:sample_max_length, :sample_max_length] - position_ids = position_ids[:sample_max_length] - mlmlabels = target[:sample_max_length] - clslabels = label_idx[item['label']] - clslabels_mask = clslabels_mask[:sample_max_length] - mlmlabels_mask = mlmlabels_mask[:sample_max_length] - - return { - "input_ids": torch.tensor(input_ids).long(), - "token_type_ids": torch.tensor(token_type_ids).long(), - "attention_mask": torch.tensor(attention_mask).float(), - "position_ids": torch.tensor(position_ids).long(), - "mlmlabels": torch.tensor(mlmlabels).long(), - "clslabels": torch.tensor(clslabels).long(), - "clslabels_mask": torch.tensor(clslabels_mask).float(), - "mlmlabels_mask": torch.tensor(mlmlabels_mask).float(), - } - - -class UniMCDataModel(pl.LightningDataModule): - @staticmethod - 
def add_data_specific_args(parent_args): - parser = parent_args.add_argument_group('TASK NAME DataModel') - parser.add_argument('--num_workers', default=8, type=int) - parser.add_argument('--batchsize', default=16, type=int) - parser.add_argument('--max_length', default=512, type=int) - return parent_args - - def __init__(self, train_data, val_data, yes_token, no_token, tokenizer, args): - super().__init__() - self.batchsize = args.batchsize - - self.train_data = UniMCDataset( - train_data, yes_token, no_token, tokenizer, args, True) - self.valid_data = UniMCDataset( - val_data, yes_token, no_token, tokenizer, args, False) - - def train_dataloader(self): - return DataLoader(self.train_data, shuffle=True, collate_fn=self.collate_fn, batch_size=self.batchsize, pin_memory=False) - - def val_dataloader(self): - return DataLoader(self.valid_data, shuffle=False, collate_fn=self.collate_fn, batch_size=self.batchsize, pin_memory=False) - - def collate_fn(self, batch): - ''' - Aggregate a batch data. 
- batch = [ins1_dict, ins2_dict, ..., insN_dict] - batch_data = {'sentence':[ins1_sentence, ins2_sentence...], 'input_ids':[ins1_input_ids, ins2_input_ids...], ...} - ''' - batch_data = {} - for key in batch[0]: - batch_data[key] = [example[key] for example in batch] - - batch_data['input_ids'] = nn.utils.rnn.pad_sequence(batch_data['input_ids'], - batch_first=True, - padding_value=0) - batch_data['clslabels_mask'] = nn.utils.rnn.pad_sequence(batch_data['clslabels_mask'], - batch_first=True, - padding_value=-10000) - - batch_size, batch_max_length = batch_data['input_ids'].shape - for k, v in batch_data.items(): - if k == 'input_ids' or k == 'clslabels_mask': - continue - if k == 'clslabels': - batch_data[k] = torch.tensor(v).long() - continue - if k != 'attention_mask': - batch_data[k] = nn.utils.rnn.pad_sequence(v, - batch_first=True, - padding_value=0) - else: - attention_mask = torch.zeros( - (batch_size, batch_max_length, batch_max_length)) - for i, att in enumerate(v): - sample_length, _ = att.shape - attention_mask[i, :sample_length, :sample_length] = att - batch_data[k] = attention_mask - return batch_data - - -class UniMCModel(nn.Module): - def __init__(self, pre_train_dir, yes_token): - super().__init__() - self.config = AutoConfig.from_pretrained(pre_train_dir) - if self.config.model_type == 'megatron-bert': - self.bert = MegatronBertForMaskedLM.from_pretrained(pre_train_dir) - elif self.config.model_type == 'deberta-v2': - self.bert = DebertaV2ForMaskedLM.from_pretrained(pre_train_dir) - elif self.config.model_type == 'albert': - self.bert = AlbertForMaskedLM.from_pretrained(pre_train_dir) - else: - self.bert = BertForMaskedLM.from_pretrained(pre_train_dir) - - self.loss_func = torch.nn.CrossEntropyLoss() - self.yes_token = yes_token - - def forward(self, input_ids, attention_mask, token_type_ids, position_ids=None, mlmlabels=None, clslabels=None, clslabels_mask=None, mlmlabels_mask=None): - - batch_size, seq_len = input_ids.shape - outputs = 
self.bert(input_ids=input_ids, - attention_mask=attention_mask, - position_ids=position_ids, - token_type_ids=token_type_ids, - labels=mlmlabels) # (bsz, seq, dim) - mask_loss = outputs.loss - mlm_logits = outputs.logits - cls_logits = mlm_logits[:, :, - self.yes_token].view(-1, seq_len)+clslabels_mask - - if mlmlabels is None: - return 0, mlm_logits, cls_logits - else: - cls_loss = self.loss_func(cls_logits, clslabels) - all_loss = mask_loss+cls_loss - return all_loss, mlm_logits, cls_logits - - -class UniMCLitModel(pl.LightningModule): - - @staticmethod - def add_model_specific_args(parent_args): - parser = parent_args.add_argument_group('BaseModel') - - parser.add_argument('--learning_rate', default=1e-5, type=float) - parser.add_argument('--weight_decay', default=0.1, type=float) - parser.add_argument('--warmup', default=0.01, type=float) - parser.add_argument('--num_labels', default=2, type=int) - - return parent_args - - def __init__(self, args, yes_token, model_path, num_data=100): - super().__init__() - self.args = args - self.num_data = num_data - self.model = UniMCModel(model_path, yes_token) - - def setup(self, stage) -> None: - if stage == 'fit': - num_gpus = self.trainer.gpus if self.trainer.gpus is not None else 0 - self.total_step = int(self.trainer.max_epochs * self.num_data / - (max(1, num_gpus) * self.trainer.accumulate_grad_batches)) - print('Total training step:', self.total_step) - - def training_step(self, batch, batch_idx): - loss, logits, cls_logits = self.model(**batch) - cls_acc = self.comput_metrix( - cls_logits, batch['clslabels'], batch['mlmlabels_mask']) - self.log('train_loss', loss) - self.log('train_acc', cls_acc) - return loss - - def validation_step(self, batch, batch_idx): - loss, logits, cls_logits = self.model(**batch) - cls_acc = self.comput_metrix( - cls_logits, batch['clslabels'], batch['mlmlabels_mask']) - self.log('val_loss', loss) - self.log('val_acc', cls_acc) - - def configure_optimizers(self): - - no_decay = ['bias', 
'LayerNorm.bias', 'LayerNorm.weight'] - paras = list( - filter(lambda p: p[1].requires_grad, self.named_parameters())) - paras = [{ - 'params': - [p for n, p in paras if not any(nd in n for nd in no_decay)], - 'weight_decay': self.args.weight_decay - }, { - 'params': [p for n, p in paras if any(nd in n for nd in no_decay)], - 'weight_decay': 0.0 - }] - optimizer = torch.optim.AdamW(paras, lr=self.args.learning_rate) - scheduler = get_linear_schedule_with_warmup( - optimizer, int(self.total_step * self.args.warmup), - self.total_step) - - return [{ - 'optimizer': optimizer, - 'lr_scheduler': { - 'scheduler': scheduler, - 'interval': 'step', - 'frequency': 1 - } - }] - - def comput_metrix(self, logits, labels, mlmlabels_mask): - logits = torch.nn.functional.softmax(logits, dim=-1) - logits = torch.argmax(logits, dim=-1) - y_pred = logits.view(size=(-1,)) - y_true = labels.view(size=(-1,)) - corr = torch.eq(y_pred, y_true).float() - return torch.sum(corr.float())/labels.size(0) - - -class UniMCPredict: - def __init__(self, yes_token, no_token, model, tokenizer, args): - self.tokenizer = tokenizer - self.args = args - self.data_model = UniMCDataModel( - [], [], yes_token, no_token, tokenizer, args) - self.model = model - - def predict(self, batch_data): - batch = [self.data_model.train_data.encode( - sample) for sample in batch_data] - batch = self.data_model.collate_fn(batch) - batch = {k: v.cuda() for k, v in batch.items()} - _, _, logits = self.model.model(**batch) - soft_logits = torch.nn.functional.softmax(logits, dim=-1) - logits = torch.argmax(soft_logits, dim=-1).detach().cpu().numpy() - - soft_logits = soft_logits.detach().cpu().numpy() - clslabels_mask = batch['clslabels_mask'].detach( - ).cpu().numpy().tolist() - clslabels = batch['clslabels'].detach().cpu().numpy().tolist() - for i, v in enumerate(batch_data): - label_idx = [idx for idx, v in enumerate( - clslabels_mask[i]) if v == 0.] 
- label = label_idx.index(logits[i]) - answer = batch_data[i]['choice'][label] - score = {} - for c in range(len(batch_data[i]['choice'])): - score[batch_data[i]['choice'][c]] = float( - soft_logits[i][label_idx[c]]) - - batch_data[i]['label_ori'] = copy.deepcopy(batch_data[i]['label']) - batch_data[i]['label'] = label - batch_data[i]['answer'] = answer - batch_data[i]['score'] = score - - return batch_data - - -class UniMCPipelines(Pipeline): - @staticmethod - def piplines_args(parent_args): - total_parser = parent_args.add_argument_group("piplines args") - total_parser.add_argument( - '--pretrained_model_path', default='', type=str) - total_parser.add_argument('--load_checkpoints_path', - default='', type=str) - total_parser.add_argument('--train', action='store_true') - total_parser.add_argument('--language', - default='chinese', type=str) - - total_parser = UniMCDataModel.add_data_specific_args(total_parser) - total_parser = UniversalCheckpoint.add_argparse_args(total_parser) - total_parser = UniMCLitModel.add_model_specific_args(total_parser) - total_parser = pl.Trainer.add_argparse_args(parent_args) - return parent_args - - def __init__(self, args, model_path): - self.args = args - self.checkpoint_callback = UniversalCheckpoint(args).callbacks - self.logger = loggers.TensorBoardLogger(save_dir=args.default_root_dir) - self.trainer = pl.Trainer.from_argparse_args(args, - logger=self.logger, - callbacks=[self.checkpoint_callback]) - self.config = AutoConfig.from_pretrained(model_path) - if self.config.model_type == 'albert': - self.tokenizer = AlbertTokenizer.from_pretrained( - model_path) - else: - self.tokenizer = BertTokenizer.from_pretrained( - model_path) - - if args.language == 'chinese': - self.yes_token = self.tokenizer.encode('是')[1] - self.no_token = self.tokenizer.encode('非')[1] - else: - self.yes_token = self.tokenizer.encode('yes')[1] - self.no_token = self.tokenizer.encode('no')[1] - - if args.load_checkpoints_path != '': - self.model = 
UniMCLitModel.load_from_checkpoint( - args.load_checkpoints_path, args=args, yes_token=self.yes_token, model_path=model_path) - print('load model from: ', args.load_checkpoints_path) - else: - self.model = UniMCLitModel( - args, yes_token=self.yes_token, model_path=model_path) - - def train(self, train_data, dev_data, process=True): - if process: - train_data = self.preprocess(train_data) - dev_data = self.preprocess(dev_data) - data_model = UniMCDataModel( - train_data, dev_data, self.yes_token, self.no_token, self.tokenizer, self.args) - self.model.num_data = len(train_data) - self.trainer.fit(self.model, data_model) - - def predict(self, test_data, cuda=True, process=True): - if process: - test_data = self.preprocess(test_data) - - result = [] - start = 0 - if cuda: - self.model = self.model.cuda() - self.model.model.eval() - predict_model = UniMCPredict( - self.yes_token, self.no_token, self.model, self.tokenizer, self.args) - while start < len(test_data): - batch_data = test_data[start:start+self.args.batchsize] - start += self.args.batchsize - batch_result = predict_model.predict(batch_data) - result.extend(batch_result) - if process: - result = self.postprocess(result) - return result - - def preprocess(self, data): - - for i, line in enumerate(data): - if 'task_type' in line.keys() and line['task_type'] == '语义匹配': - data[i]['choice'] = ['不能理解为:'+data[i] - ['textb'], '可以理解为:'+data[i]['textb']] - # data[i]['question']='怎么理解这段话?' 
- data[i]['textb'] = '' - - if 'task_type' in line.keys() and line['task_type'] == '自然语言推理': - data[i]['choice'] = ['不能推断出:'+data[i]['textb'], - '很难推断出:'+data[i]['textb'], '可以推断出:'+data[i]['textb']] - # data[i]['question']='根据这段话' - data[i]['textb'] = '' - - return data - - def postprocess(self, data): - for i, line in enumerate(data): - if 'task_type' in line.keys() and line['task_type'] == '语义匹配': - data[i]['textb'] = data[i]['choice'][0].replace('不能理解为:', '') - data[i]['choice'] = ['不相似', '相似'] - ns = {} - for k, v in data[i]['score'].items(): - if '不能' in k: - k = '不相似' - if '可以' in k: - k = '相似' - ns[k] = v - data[i]['score'] = ns - data[i]['answer'] = data[i]['choice'][data[i]['label']] - - if 'task_type' in line.keys() and line['task_type'] == '自然语言推理': - data[i]['textb'] = data[i]['choice'][0].replace('不能推断出:', '') - data[i]['choice'] = ['矛盾', '自然', '蕴含'] - ns = {} - for k, v in data[i]['score'].items(): - if '不能' in k: - k = '矛盾' - if '很难' in k: - k = '自然' - if '可以' in k: - k = '蕴含' - ns[k] = v - data[i]['score'] = ns - data[i]['answer'] = data[i]['choice'][data[i]['label']] - - return data - - def _forward(self, model_inputs): - return self.model(**model_inputs) - - def _sanitize_parameters(self, return_all_scores=None, function_to_apply=None, top_k="", **tokenizer_kwargs): - # Using "" as default argument because we're going to use `top_k=None` in user code to declare - # "No top_k" - preprocess_params = tokenizer_kwargs - - postprocess_params = {} - if hasattr(self.model.config, "return_all_scores") and return_all_scores is None: - return_all_scores = self.model.config.return_all_scores - - if isinstance(top_k, int) or top_k is None: - postprocess_params["top_k"] = top_k - postprocess_params["_legacy"] = False - elif return_all_scores is not None: - warnings.warn( - "`return_all_scores` is now deprecated, if want a similar funcionality use `top_k=None` instead of" - " `return_all_scores=True` or `top_k=1` instead of `return_all_scores=False`.", - 
UserWarning, - ) - if return_all_scores: - postprocess_params["top_k"] = None - else: - postprocess_params["top_k"] = 1 - - if function_to_apply is not None: - postprocess_params["function_to_apply"] = function_to_apply - return preprocess_params, {}, postprocess_params - - -def load_data(data_path): - with open(data_path, 'r', encoding='utf8') as f: - lines = f.readlines() - samples = [json.loads(line) for line in tqdm(lines)] - return samples - - -def comp_acc(pred_data, test_data): - corr = 0 - for i in range(len(pred_data)): - if pred_data[i]['label'] == test_data[i]['label']: - corr += 1 - return corr/len(pred_data) - - -def main(): - total_parser = argparse.ArgumentParser("TASK NAME") - total_parser.add_argument('--data_dir', default='./data', type=str) - total_parser.add_argument('--train_data', default='train.json', type=str) - total_parser.add_argument('--valid_data', default='dev.json', type=str) - total_parser.add_argument('--test_data', default='test.json', type=str) - total_parser.add_argument('--output_path', default='', type=str) - total_parser = UniMCPipelines.piplines_args(total_parser) - args = total_parser.parse_args() - - train_data = load_data(os.path.join(args.data_dir, args.train_data)) - dev_data = load_data(os.path.join(args.data_dir, args.valid_data)) - test_data = load_data(os.path.join(args.data_dir, args.test_data)) - - dev_data_ori = copy.deepcopy(dev_data) - - # UniMCPipelines.__init__ takes (args, model_path); pass the pretrained model path - model = UniMCPipelines(args, args.pretrained_model_path) - - print(args.data_dir) - - if args.train: - model.train(train_data, dev_data) - result = model.predict(dev_data) - for line in result[:20]: - print(line) - - acc = comp_acc(result, dev_data_ori) - print('acc:', acc) - - if args.output_path != '': - test_result = model.predict(test_data) - with open(args.output_path, 'w', encoding='utf8') as f: - for line in test_result: - json_data = json.dumps(line, ensure_ascii=False) - f.write(json_data+'\n') - - -if __name__ == "__main__": - main() diff --git a/spaces/feng2022/Time-TravelRephotography/app.py 
b/spaces/feng2022/Time-TravelRephotography/app.py deleted file mode 100644 index 0d0b2597a89bc11ee5b02ff6c3e7ce6d03daafd8..0000000000000000000000000000000000000000 --- a/spaces/feng2022/Time-TravelRephotography/app.py +++ /dev/null @@ -1,77 +0,0 @@ -#!/usr/bin/env python - -from __future__ import annotations - -import argparse -import functools -import os -import pickle -import sys -import subprocess - -import gradio as gr -import numpy as np -import torch -import torch.nn as nn - -sys.path.append('.') -sys.path.append('./Time_TravelRephotography') -from utils import torch_helpers as th -from argparse import Namespace -from projector import ( - ProjectorArguments, - main, - create_generator, - make_image, -) - -input_path = '' -spectral_sensitivity = 'b' -TITLE = 'Time-TravelRephotography' -DESCRIPTION = '''This is an unofficial demo for https://github.com/Time-Travel-Rephotography. -''' -ARTICLE = '
          visitor badge
          ' - - -def image_create(seed: int, truncation_psi: float): - args = ProjectorArguments().parse( - args=[str(input_path)], - namespace=Namespace( - encoder_ckpt=f"checkpoint/encoder/checkpoint_{spectral_sensitivity}.pt", - #gaussian=gaussian_radius, - log_visual_freq=1000 - )) - device = th.device() - generator = create_generator("stylegan2-ffhq-config-f.pt","feng2022/Time-TravelRephotography_stylegan2-ffhq-config-f",args,device) - #generator = create_generator("checkpoint_b.pt.pth","feng2022/Time_TravelRephotography_checkpoint_b",args,device) - latent = torch.randn((1, 512), device=device) - img_out, _, _ = generator([latent]) - imgs_arr = make_image(img_out) - return imgs_arr[0]/255 - -def main(): - torch.cuda.init() - if torch.cuda.is_initialized(): - ini = "True1" - else: - ini = "False1" - result = subprocess.check_output(['nvidia-smi']) - device = th.device() - iface = gr.Interface( - image_create, - [ - gr.inputs.Number(default=0, label='Seed'), - gr.inputs.Slider( - 0, 2, step=0.05, default=0.7, label='Truncation psi'), - ], - gr.outputs.Image(type='numpy', label='Output'), - title=TITLE, - description=DESCRIPTION, - article=ARTICLE, - ) - - iface.launch() - -if __name__ == '__main__': - main() - \ No newline at end of file diff --git a/spaces/fffffu/bing/src/lib/hooks/use-at-bottom.tsx b/spaces/fffffu/bing/src/lib/hooks/use-at-bottom.tsx deleted file mode 100644 index d37c8cf4162adcb0064e08ecec24eb731416b045..0000000000000000000000000000000000000000 --- a/spaces/fffffu/bing/src/lib/hooks/use-at-bottom.tsx +++ /dev/null @@ -1,23 +0,0 @@ -import * as React from 'react' - -export function useAtBottom(offset = 0) { - const [isAtBottom, setIsAtBottom] = React.useState(false) - - React.useEffect(() => { - const handleScroll = () => { - setIsAtBottom( - window.innerHeight + window.scrollY >= - document.body.offsetHeight - offset - ) - } - - window.addEventListener('scroll', handleScroll, { passive: true }) - handleScroll() - - return () => { - 
window.removeEventListener('scroll', handleScroll) - } - }, [offset]) - - return isAtBottom -} diff --git a/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/node_modules/cookie/index.js b/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/node_modules/cookie/index.js deleted file mode 100644 index 55331d9a62a30457223c452c6e1445c5312dc862..0000000000000000000000000000000000000000 --- a/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/node_modules/cookie/index.js +++ /dev/null @@ -1,202 +0,0 @@ -/*! - * cookie - * Copyright(c) 2012-2014 Roman Shtylman - * Copyright(c) 2015 Douglas Christopher Wilson - * MIT Licensed - */ - -'use strict'; - -/** - * Module exports. - * @public - */ - -exports.parse = parse; -exports.serialize = serialize; - -/** - * Module variables. - * @private - */ - -var decode = decodeURIComponent; -var encode = encodeURIComponent; - -/** - * RegExp to match field-content in RFC 7230 sec 3.2 - * - * field-content = field-vchar [ 1*( SP / HTAB ) field-vchar ] - * field-vchar = VCHAR / obs-text - * obs-text = %x80-FF - */ - -var fieldContentRegExp = /^[\u0009\u0020-\u007e\u0080-\u00ff]+$/; - -/** - * Parse a cookie header. 
- * - * Parse the given cookie header string into an object - * The object has the various cookies as keys(names) => values - * - * @param {string} str - * @param {object} [options] - * @return {object} - * @public - */ - -function parse(str, options) { - if (typeof str !== 'string') { - throw new TypeError('argument str must be a string'); - } - - var obj = {} - var opt = options || {}; - var pairs = str.split(';') - var dec = opt.decode || decode; - - for (var i = 0; i < pairs.length; i++) { - var pair = pairs[i]; - var index = pair.indexOf('=') - - // skip things that don't look like key=value - if (index < 0) { - continue; - } - - var key = pair.substring(0, index).trim() - - // only assign once - if (undefined == obj[key]) { - var val = pair.substring(index + 1, pair.length).trim() - - // quoted values - if (val[0] === '"') { - val = val.slice(1, -1) - } - - obj[key] = tryDecode(val, dec); - } - } - - return obj; -} - -/** - * Serialize data into a cookie header. - * - * Serialize a name value pair into a cookie string suitable for - * http headers. An optional options object specifies cookie parameters. 
- * - * serialize('foo', 'bar', { httpOnly: true }) - * => "foo=bar; httpOnly" - * - * @param {string} name - * @param {string} val - * @param {object} [options] - * @return {string} - * @public - */ - -function serialize(name, val, options) { - var opt = options || {}; - var enc = opt.encode || encode; - - if (typeof enc !== 'function') { - throw new TypeError('option encode is invalid'); - } - - if (!fieldContentRegExp.test(name)) { - throw new TypeError('argument name is invalid'); - } - - var value = enc(val); - - if (value && !fieldContentRegExp.test(value)) { - throw new TypeError('argument val is invalid'); - } - - var str = name + '=' + value; - - if (null != opt.maxAge) { - var maxAge = opt.maxAge - 0; - - if (isNaN(maxAge) || !isFinite(maxAge)) { - throw new TypeError('option maxAge is invalid') - } - - str += '; Max-Age=' + Math.floor(maxAge); - } - - if (opt.domain) { - if (!fieldContentRegExp.test(opt.domain)) { - throw new TypeError('option domain is invalid'); - } - - str += '; Domain=' + opt.domain; - } - - if (opt.path) { - if (!fieldContentRegExp.test(opt.path)) { - throw new TypeError('option path is invalid'); - } - - str += '; Path=' + opt.path; - } - - if (opt.expires) { - if (typeof opt.expires.toUTCString !== 'function') { - throw new TypeError('option expires is invalid'); - } - - str += '; Expires=' + opt.expires.toUTCString(); - } - - if (opt.httpOnly) { - str += '; HttpOnly'; - } - - if (opt.secure) { - str += '; Secure'; - } - - if (opt.sameSite) { - var sameSite = typeof opt.sameSite === 'string' - ? opt.sameSite.toLowerCase() : opt.sameSite; - - switch (sameSite) { - case true: - str += '; SameSite=Strict'; - break; - case 'lax': - str += '; SameSite=Lax'; - break; - case 'strict': - str += '; SameSite=Strict'; - break; - case 'none': - str += '; SameSite=None'; - break; - default: - throw new TypeError('option sameSite is invalid'); - } - } - - return str; -} - -/** - * Try decoding a string using a decoding function. 
- * - * @param {string} str - * @param {function} decode - * @private - */ - -function tryDecode(str, decode) { - try { - return decode(str); - } catch (e) { - return str; - } -} diff --git a/spaces/fffiloni/controlnet-animation-doodle/node_modules/fresh/README.md b/spaces/fffiloni/controlnet-animation-doodle/node_modules/fresh/README.md deleted file mode 100644 index 1c1c680d151e5a64ab39cb88afc4600f554f00d5..0000000000000000000000000000000000000000 --- a/spaces/fffiloni/controlnet-animation-doodle/node_modules/fresh/README.md +++ /dev/null @@ -1,119 +0,0 @@ -# fresh - -[![NPM Version][npm-image]][npm-url] -[![NPM Downloads][downloads-image]][downloads-url] -[![Node.js Version][node-version-image]][node-version-url] -[![Build Status][travis-image]][travis-url] -[![Test Coverage][coveralls-image]][coveralls-url] - -HTTP response freshness testing - -## Installation - -This is a [Node.js](https://nodejs.org/en/) module available through the -[npm registry](https://www.npmjs.com/). Installation is done using the -[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): - -``` -$ npm install fresh -``` - -## API - - - -```js -var fresh = require('fresh') -``` - -### fresh(reqHeaders, resHeaders) - -Check freshness of the response using request and response headers. - -When the response is still "fresh" in the client's cache `true` is -returned, otherwise `false` is returned to indicate that the client -cache is now stale and the full response should be sent. - -When a client sends the `Cache-Control: no-cache` request header to -indicate an end-to-end reload request, this module will return `false` -to make handling these requests transparent. - -## Known Issues - -This module is designed to only follow the HTTP specifications, not -to work-around all kinds of client bugs (especially since this module -typically does not receive enough information to understand what the -client actually is). 
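The freshness rules described above (`If-None-Match` versus `ETag`, plus the `Cache-Control: no-cache` reload case) can be sketched in Python. This is an illustrative re-implementation for explanation only, not the fresh package's actual code; it covers only the `If-None-Match`/`ETag` path and ignores `If-Modified-Since`:

```python
# Illustrative sketch of HTTP freshness testing, not the fresh package itself.
# Header names are assumed pre-lowercased, as Node.js does for request headers.
def is_fresh(req_headers, res_headers):
    # An end-to-end reload request is never treated as fresh.
    if "no-cache" in req_headers.get("cache-control", ""):
        return False
    if_none_match = req_headers.get("if-none-match")
    etag = res_headers.get("etag")
    if if_none_match is None or etag is None:
        return False
    if if_none_match.strip() == "*":  # "*" matches any stored representation
        return True
    return etag in [tag.strip() for tag in if_none_match.split(",")]

print(is_fresh({"if-none-match": '"foo"'}, {"etag": '"bar"'}))  # False
print(is_fresh({"if-none-match": '"foo"'}, {"etag": '"foo"'}))  # True
```

The two calls mirror the README's own duplicate/near-miss example: a matching `ETag` yields a fresh (304-able) response, a mismatch forces a full response.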
- -There is a known issue that in certain versions of Safari, Safari -will incorrectly make a request that allows this module to validate -freshness of the resource even when Safari does not have a -representation of the resource in the cache. The module -[jumanji](https://www.npmjs.com/package/jumanji) can be used in -an Express application to work-around this issue and also provides -links to further reading on this Safari bug. - -## Example - -### API usage - - - -```js -var reqHeaders = { 'if-none-match': '"foo"' } -var resHeaders = { 'etag': '"bar"' } -fresh(reqHeaders, resHeaders) -// => false - -var reqHeaders = { 'if-none-match': '"foo"' } -var resHeaders = { 'etag': '"foo"' } -fresh(reqHeaders, resHeaders) -// => true -``` - -### Using with Node.js http server - -```js -var fresh = require('fresh') -var http = require('http') - -var server = http.createServer(function (req, res) { - // perform server logic - // ... including adding ETag / Last-Modified response headers - - if (isFresh(req, res)) { - // client has a fresh copy of resource - res.statusCode = 304 - res.end() - return - } - - // send the resource - res.statusCode = 200 - res.end('hello, world!') -}) - -function isFresh (req, res) { - return fresh(req.headers, { - 'etag': res.getHeader('ETag'), - 'last-modified': res.getHeader('Last-Modified') - }) -} - -server.listen(3000) -``` - -## License - -[MIT](LICENSE) - -[npm-image]: https://img.shields.io/npm/v/fresh.svg -[npm-url]: https://npmjs.org/package/fresh -[node-version-image]: https://img.shields.io/node/v/fresh.svg -[node-version-url]: https://nodejs.org/en/ -[travis-image]: https://img.shields.io/travis/jshttp/fresh/master.svg -[travis-url]: https://travis-ci.org/jshttp/fresh -[coveralls-image]: https://img.shields.io/coveralls/jshttp/fresh/master.svg -[coveralls-url]: https://coveralls.io/r/jshttp/fresh?branch=master -[downloads-image]: https://img.shields.io/npm/dm/fresh.svg -[downloads-url]: https://npmjs.org/package/fresh diff --git 
a/spaces/fun-research/FC-CLIP/fcclip/data/datasets/register_pascal_ctx_459_sem_seg.py b/spaces/fun-research/FC-CLIP/fcclip/data/datasets/register_pascal_ctx_459_sem_seg.py deleted file mode 100644 index 15d944252487dcdb50664fa50b9290fa95bfda4f..0000000000000000000000000000000000000000 --- a/spaces/fun-research/FC-CLIP/fcclip/data/datasets/register_pascal_ctx_459_sem_seg.py +++ /dev/null @@ -1,66 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import os - -import numpy as np - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.data.datasets import load_sem_seg - -from . import openseg_classes - -PASCAL_CTX_459_CATEGORIES=openseg_classes.get_pascal_ctx_459_categories_with_prompt_eng() - -PASCAL_CTX_459_COLORS = [k["color"] for k in PASCAL_CTX_459_CATEGORIES] - -MetadataCatalog.get("openvocab_pascal_ctx459_sem_seg_train").set( - stuff_colors=PASCAL_CTX_459_COLORS[:], -) - -MetadataCatalog.get("openvocab_pascal_ctx459_sem_seg_val").set( - stuff_colors=PASCAL_CTX_459_COLORS[:], -) - -def _get_ctx459_meta(): - # Id 0 is reserved for ignore_label, we change ignore_label for 0 - # to 255 in our pre-processing, so all ids are shifted by 1. 
stuff_ids = [k["id"] for k in PASCAL_CTX_459_CATEGORIES] - assert len(stuff_ids) == 459, len(stuff_ids) - - # For semantic segmentation, this mapping maps from contiguous stuff id - # (in [0, 458], used in models) to ids in the dataset (used for processing results) - stuff_dataset_id_to_contiguous_id = {k: i for i, k in enumerate(stuff_ids)} - stuff_classes = [k["name"] for k in PASCAL_CTX_459_CATEGORIES] - - ret = { - "stuff_dataset_id_to_contiguous_id": stuff_dataset_id_to_contiguous_id, - "stuff_classes": stuff_classes, - } - return ret - - -def register_all_ctx459(root): - root = os.path.join(root, "pascal_ctx_d2") - meta = _get_ctx459_meta() - for name, dirname in [("train", "training"), ("val", "validation")]: - image_dir = os.path.join(root, "images", dirname) - gt_dir = os.path.join(root, "annotations_ctx459", dirname) - name = f"openvocab_pascal_ctx459_sem_seg_{name}" - DatasetCatalog.register( - name, lambda x=image_dir, y=gt_dir: load_sem_seg(y, x, gt_ext="tif", image_ext="jpg") - ) - MetadataCatalog.get(name).set( - stuff_classes=meta["stuff_classes"][:], - thing_dataset_id_to_contiguous_id={}, # to make Mask2Former happy - stuff_dataset_id_to_contiguous_id=meta["stuff_dataset_id_to_contiguous_id"], - image_root=image_dir, - sem_seg_root=gt_dir, - evaluator_type="sem_seg", - ignore_label=65535, # NOTE: gt is saved in 16-bit TIFF images - gt_ext="tif", - ) - - - - -_root = os.getenv("DETECTRON2_DATASETS", "datasets") -register_all_ctx459(_root) \ No newline at end of file diff --git a/spaces/generativeai/test-image-similarity/image_similarity.py b/spaces/generativeai/test-image-similarity/image_similarity.py deleted file mode 100644 index a65804cc30399ffd94f5b3d0cf84219370205fb0..0000000000000000000000000000000000000000 --- a/spaces/generativeai/test-image-similarity/image_similarity.py +++ /dev/null @@ -1,59 +0,0 @@ -from sentence_transformers import SentenceTransformer, util - -class ImageSimilarity(object): - def __init__(self, 
minimum_commutative_image_diff): - self.minimum_commutative_image_diff = minimum_commutative_image_diff - - def check(self, pil_images): - results = [] - - # Load the OpenAI CLIP Model - print('Loading CLIP Model...') - model = SentenceTransformer('clip-ViT-B-32') - - print("Images:", len(pil_images)) - encoded_image = model.encode([image["pil"] for image in pil_images], batch_size=128, convert_to_tensor=True, show_progress_bar=True) - - # Now we run the clustering algorithm. This function compares each image against - # all other images and returns a list with the pairs that have the highest - # cosine similarity score - processed_images = util.paraphrase_mining_embeddings(encoded_image) - # NUM_SIMILAR_IMAGES = 10 - - # ================= - # DUPLICATES - # ================= - # print('Finding duplicate images...') - # Filter list for duplicates. Results are triplets (score, image_id1, image_id2) and are sorted in decreasing order - # A duplicate image will have a score of 1.00 - # It may be 0.9999 due to lossy image compression (.jpg) - # duplicates = [image for image in processed_images if image[0] >= 0.999] - - # Output the top X duplicate images - # for score, image_id1, image_id2 in duplicates[0:NUM_SIMILAR_IMAGES]: - # print("\nScore: {:.3f}%".format(score * 100)) - # print(pil_images[image_id1]) - # print(pil_images[image_id2]) - - # ================= - # NEAR DUPLICATES - # ================= - print('Finding near duplicate images...') - # Use a threshold parameter to identify two images as similar. By setting the threshold lower, - # you will get larger clusters which have less similar images in them. Threshold 0 - 1.00 - # A threshold of 1.00 means the two images are exactly the same. Since we are finding near - # duplicate images, we can set it at 0.99 or any number 0 < X < 1.00. 
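Conceptually, `util.paraphrase_mining_embeddings` scores every pair of embeddings by cosine similarity and returns `(score, id1, id2)` triplets sorted highest-first, which the threshold comments above then filter. A minimal NumPy sketch of that idea (an illustration, not the library's optimized implementation):

```python
import numpy as np

# Hedged sketch of pairwise cosine-similarity mining over embeddings:
# normalize rows, take the dot-product matrix, and emit each upper-triangle
# pair as a (score, i, j) triplet sorted by score, highest first.
def pairwise_cosine_pairs(embeddings):
    emb = np.asarray(embeddings, dtype=float)
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sims = emb @ emb.T
    pairs = [(float(sims[i, j]), i, j)
             for i in range(len(emb)) for j in range(i + 1, len(emb))]
    return sorted(pairs, key=lambda p: p[0], reverse=True)

pairs = pairwise_cosine_pairs([[1.0, 0.0], [1.0, 0.01], [0.0, 1.0]])
near_duplicates = [p for p in pairs if p[0] >= 0.99]  # threshold as in the comments
```

With these toy vectors the first two embeddings are near-identical, so they form the only pair above the 0.99 near-duplicate threshold.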
- # threshold = 0.99 - # near_duplicates = [image for image in processed_images if image[0] < threshold] - - for score, image_id1, image_id2 in processed_images: - results.append({ - 'score': score, - 'image1': pil_images[image_id1]["key"], - 'image2': pil_images[image_id2]["key"] - }) - # print("\nScore: {:.3f}%".format(score * 100)) - # print(pil_images[image_id1]["key"]) - # print(pil_images[image_id2]["key"]) - - return results \ No newline at end of file diff --git a/spaces/georgefen/Face-Landmark-ControlNet/annotator/uniformer/mmcv/ops/roi_pool.py b/spaces/georgefen/Face-Landmark-ControlNet/annotator/uniformer/mmcv/ops/roi_pool.py deleted file mode 100644 index d339d8f2941eabc1cbe181a9c6c5ab5ff4ff4e5f..0000000000000000000000000000000000000000 --- a/spaces/georgefen/Face-Landmark-ControlNet/annotator/uniformer/mmcv/ops/roi_pool.py +++ /dev/null @@ -1,86 +0,0 @@ -# Copyright (c) OpenMMLab. All rights reserved. -import torch -import torch.nn as nn -from torch.autograd import Function -from torch.autograd.function import once_differentiable -from torch.nn.modules.utils import _pair - -from ..utils import ext_loader - -ext_module = ext_loader.load_ext('_ext', - ['roi_pool_forward', 'roi_pool_backward']) - - -class RoIPoolFunction(Function): - - @staticmethod - def symbolic(g, input, rois, output_size, spatial_scale): - return g.op( - 'MaxRoiPool', - input, - rois, - pooled_shape_i=output_size, - spatial_scale_f=spatial_scale) - - @staticmethod - def forward(ctx, input, rois, output_size, spatial_scale=1.0): - ctx.output_size = _pair(output_size) - ctx.spatial_scale = spatial_scale - ctx.input_shape = input.size() - - assert rois.size(1) == 5, 'RoI must be (idx, x1, y1, x2, y2)!' 
-
-        output_shape = (rois.size(0), input.size(1), ctx.output_size[0],
-                        ctx.output_size[1])
-        output = input.new_zeros(output_shape)
-        argmax = input.new_zeros(output_shape, dtype=torch.int)
-
-        ext_module.roi_pool_forward(
-            input,
-            rois,
-            output,
-            argmax,
-            pooled_height=ctx.output_size[0],
-            pooled_width=ctx.output_size[1],
-            spatial_scale=ctx.spatial_scale)
-
-        ctx.save_for_backward(rois, argmax)
-        return output
-
-    @staticmethod
-    @once_differentiable
-    def backward(ctx, grad_output):
-        rois, argmax = ctx.saved_tensors
-        grad_input = grad_output.new_zeros(ctx.input_shape)
-
-        ext_module.roi_pool_backward(
-            grad_output,
-            rois,
-            argmax,
-            grad_input,
-            pooled_height=ctx.output_size[0],
-            pooled_width=ctx.output_size[1],
-            spatial_scale=ctx.spatial_scale)
-
-        return grad_input, None, None, None
-
-
-roi_pool = RoIPoolFunction.apply
-
-
-class RoIPool(nn.Module):
-
-    def __init__(self, output_size, spatial_scale=1.0):
-        super(RoIPool, self).__init__()
-
-        self.output_size = _pair(output_size)
-        self.spatial_scale = float(spatial_scale)
-
-    def forward(self, input, rois):
-        return roi_pool(input, rois, self.output_size, self.spatial_scale)
-
-    def __repr__(self):
-        s = self.__class__.__name__
-        s += f'(output_size={self.output_size}, '
-        s += f'spatial_scale={self.spatial_scale})'
-        return s
diff --git a/spaces/glyszt/vt/vtoonify/model/stylegan/op_gpu/fused_bias_act.cpp b/spaces/glyszt/vt/vtoonify/model/stylegan/op_gpu/fused_bias_act.cpp
deleted file mode 100644
index 71f612cdbaaca03822eedc002a980d055d2f485c..0000000000000000000000000000000000000000
--- a/spaces/glyszt/vt/vtoonify/model/stylegan/op_gpu/fused_bias_act.cpp
+++ /dev/null
@@ -1,32 +0,0 @@
-
-#include <ATen/ATen.h>
-#include <torch/extension.h>
-
-torch::Tensor fused_bias_act_op(const torch::Tensor &input,
-                                const torch::Tensor &bias,
-                                const torch::Tensor &refer, int act, int grad,
-                                float alpha, float scale);
-
-#define CHECK_CUDA(x) \
-  TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
-#define CHECK_CONTIGUOUS(x) \
-
TORCH_CHECK(x.is_contiguous(), #x " must be contiguous") -#define CHECK_INPUT(x) \ - CHECK_CUDA(x); \ - CHECK_CONTIGUOUS(x) - -torch::Tensor fused_bias_act(const torch::Tensor &input, - const torch::Tensor &bias, - const torch::Tensor &refer, int act, int grad, - float alpha, float scale) { - CHECK_INPUT(input); - CHECK_INPUT(bias); - - at::DeviceGuard guard(input.device()); - - return fused_bias_act_op(input, bias, refer, act, grad, alpha, scale); -} - -PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) { - m.def("fused_bias_act", &fused_bias_act, "fused bias act (CUDA)"); -} \ No newline at end of file diff --git a/spaces/gradio/HuBERT/scripts/split_train_valid_docs.py b/spaces/gradio/HuBERT/scripts/split_train_valid_docs.py deleted file mode 100644 index ff159785284a13b44626b207d84430c592acaf8f..0000000000000000000000000000000000000000 --- a/spaces/gradio/HuBERT/scripts/split_train_valid_docs.py +++ /dev/null @@ -1,86 +0,0 @@ -#!/usr/bin/env python3 -# Copyright (c) Facebook, Inc. and its affiliates. -# -# This source code is licensed under the MIT license found in the -# LICENSE file in the root directory of this source tree. -""" -Split a large file into a train and valid set while respecting document -boundaries. Documents should be separated by a single empty line. 
-""" - -import argparse -import random -import sys - - -def main(): - parser = argparse.ArgumentParser() - parser.add_argument("input") - parser.add_argument("sample_output", help="train output file") - parser.add_argument("remainder_output", help="valid output file") - parser.add_argument("-k", type=int, help="remainder size") - parser.add_argument( - "--lines", action="store_true", help="split lines instead of docs" - ) - args = parser.parse_args() - - assert args.k is not None - - sample = [] - remainder = [] - num_docs = [0] - - def update_sample(doc): - if len(sample) < args.k: - sample.append(doc.copy()) - else: - i = num_docs[0] - j = random.randrange(i + 1) - if j < args.k: - remainder.append(sample[j]) - sample[j] = doc.copy() - else: - remainder.append(doc.copy()) - num_docs[0] += 1 - doc.clear() - - with open(args.input, "r", encoding="utf-8") as h: - doc = [] - for i, line in enumerate(h): - if line.strip() == "": # empty line indicates new document - update_sample(doc) - else: - doc.append(line) - if args.lines: - update_sample(doc) - if i % 1000000 == 0: - print(i, file=sys.stderr, end="", flush=True) - elif i % 100000 == 0: - print(".", file=sys.stderr, end="", flush=True) - if len(doc) > 0: - update_sample(doc) - print(file=sys.stderr, flush=True) - - assert len(sample) == args.k - - with open(args.sample_output, "w", encoding="utf-8") as out: - first = True - for doc in sample: - if not first and not args.lines: - out.write("\n") - first = False - for line in doc: - out.write(line) - - with open(args.remainder_output, "w", encoding="utf-8") as out: - first = True - for doc in remainder: - if not first and not args.lines: - out.write("\n") - first = False - for line in doc: - out.write(line) - - -if __name__ == "__main__": - main() diff --git a/spaces/gradio/calculator-flagging-options/app.py b/spaces/gradio/calculator-flagging-options/app.py deleted file mode 100644 index 
03862b3a97fd885f07c600538259d4e89054f986..0000000000000000000000000000000000000000 --- a/spaces/gradio/calculator-flagging-options/app.py +++ /dev/null @@ -1,22 +0,0 @@ -import gradio as gr - - -def calculator(num1, operation, num2): - if operation == "add": - return num1 + num2 - elif operation == "subtract": - return num1 - num2 - elif operation == "multiply": - return num1 * num2 - elif operation == "divide": - return num1 / num2 - -iface = gr.Interface( - calculator, - ["number", gr.inputs.Radio(["add", "subtract", "multiply", "divide"]), "number"], - "number", - allow_flagging="manual", - flagging_options=["wrong sign", "off by one", "other"] -) - -iface.launch() \ No newline at end of file diff --git a/spaces/gyugnsu/DragGan-Inversion/stylegan_human/training_scripts/sg3/train.py b/spaces/gyugnsu/DragGan-Inversion/stylegan_human/training_scripts/sg3/train.py deleted file mode 100644 index afc4c934c6944b4333efa38a025f14888c67c59d..0000000000000000000000000000000000000000 --- a/spaces/gyugnsu/DragGan-Inversion/stylegan_human/training_scripts/sg3/train.py +++ /dev/null @@ -1,325 +0,0 @@ -# Copyright (c) SenseTime Research. All rights reserved. - -# Copyright (c) 2021, NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# -# NVIDIA CORPORATION and its licensors retain all intellectual property -# and proprietary rights in and to this software, related documentation -# and any modifications thereto. Any use, reproduction, disclosure or -# distribution of this software and related documentation without an express -# license agreement from NVIDIA CORPORATION is strictly prohibited. 
- -"""Train a GAN using the techniques described in the paper -"Alias-Free Generative Adversarial Networks".""" - -import os -import click -import re -import json -import tempfile -import torch - -import dnnlib -from training import training_loop -from metrics import metric_main -from torch_utils import training_stats -from torch_utils import custom_ops -import ast -# ---------------------------------------------------------------------------- - - -def subprocess_fn(rank, c, temp_dir): - dnnlib.util.Logger(file_name=os.path.join( - c.run_dir, 'log.txt'), file_mode='a', should_flush=True) - - # Init torch.distributed. - if c.num_gpus > 1: - init_file = os.path.abspath(os.path.join( - temp_dir, '.torch_distributed_init')) - if os.name == 'nt': - init_method = 'file:///' + init_file.replace('\\', '/') - torch.distributed.init_process_group( - backend='gloo', init_method=init_method, rank=rank, world_size=c.num_gpus) - else: - init_method = f'file://{init_file}' - torch.distributed.init_process_group( - backend='nccl', init_method=init_method, rank=rank, world_size=c.num_gpus) - - # Init torch_utils. - sync_device = torch.device('cuda', rank) if c.num_gpus > 1 else None - training_stats.init_multiprocessing(rank=rank, sync_device=sync_device) - if rank != 0: - custom_ops.verbosity = 'none' - - # Execute training loop. - training_loop.training_loop(rank=rank, **c) - -# ---------------------------------------------------------------------------- - - -def launch_training(c, desc, outdir, dry_run): - dnnlib.util.Logger(should_flush=True) - - # Pick output directory. 
- prev_run_dirs = [] - if os.path.isdir(outdir): - prev_run_dirs = [x for x in os.listdir( - outdir) if os.path.isdir(os.path.join(outdir, x))] - prev_run_ids = [re.match(r'^\d+', x) for x in prev_run_dirs] - prev_run_ids = [int(x.group()) for x in prev_run_ids if x is not None] - cur_run_id = max(prev_run_ids, default=-1) + 1 - c.run_dir = os.path.join(outdir, f'{cur_run_id:05d}-{desc}') - assert not os.path.exists(c.run_dir) - - # Print options. - print() - print('Training options:') - print(json.dumps(c, indent=2)) - print() - print(f'Output directory: {c.run_dir}') - print(f'Number of GPUs: {c.num_gpus}') - print(f'Batch size: {c.batch_size} images') - print(f'Training duration: {c.total_kimg} kimg') - print(f'Dataset path: {c.training_set_kwargs.path}') - print(f'Dataset size: {c.training_set_kwargs.max_size} images') - print(f'Dataset resolution: {c.training_set_kwargs.resolution}') - print(f'Dataset labels: {c.training_set_kwargs.use_labels}') - print(f'Dataset x-flips: {c.training_set_kwargs.xflip}') - print() - - # Dry run? - if dry_run: - print('Dry run; exiting.') - return - - # Create output directory. - print('Creating output directory...') - os.makedirs(c.run_dir) - with open(os.path.join(c.run_dir, 'training_options.json'), 'wt') as f: - json.dump(c, f, indent=2) - - # Launch processes. - print('Launching processes...') - torch.multiprocessing.set_start_method('spawn') - with tempfile.TemporaryDirectory() as temp_dir: - if c.num_gpus == 1: - subprocess_fn(rank=0, c=c, temp_dir=temp_dir) - else: - torch.multiprocessing.spawn( - fn=subprocess_fn, args=(c, temp_dir), nprocs=c.num_gpus) - -# ---------------------------------------------------------------------------- - - -def init_dataset_kwargs(data, square=False): - # dataset - - try: - dataset_kwargs = dnnlib.EasyDict(class_name='training.dataset.ImageFolderDataset', - path=data, use_labels=True, max_size=None, xflip=False, square=square) - # Subclass of training.dataset.Dataset. 
- dataset_obj = dnnlib.util.construct_class_by_name(**dataset_kwargs) - # Be explicit about resolution. - dataset_kwargs.resolution = dataset_obj.resolution - # Be explicit about labels. - dataset_kwargs.use_labels = dataset_obj.has_labels - # Be explicit about dataset size. - dataset_kwargs.max_size = len(dataset_obj) - return dataset_kwargs, dataset_obj.name - except IOError as err: - raise click.ClickException(f'--data: {err}') - - print("out of dataset") -# ---------------------------------------------------------------------------- - - -def parse_comma_separated_list(s): - if isinstance(s, list): - return s - if s is None or s.lower() == 'none' or s == '': - return [] - return s.split(',') - -# ---------------------------------------------------------------------------- - - -@click.command() -# Required. -@click.option('--outdir', help='Where to save the results', metavar='DIR', required=True) -@click.option('--cfg', help='Base configuration', type=click.Choice(['stylegan3-t', 'stylegan3-r', 'stylegan2']), required=True) -@click.option('--data', help='Training data', metavar='PATH', required=True) -@click.option('--gpus', help='Number of GPUs to use', metavar='INT', type=click.IntRange(min=1), required=True) -@click.option('--batch', help='Total batch size', metavar='INT', type=click.IntRange(min=1), required=True) -@click.option('--gamma', help='R1 regularization weight', metavar='FLOAT', type=click.FloatRange(min=0), required=True) -@click.option('--square', help='True for square, False for rectangle', type=bool, metavar='BOOL', default=False) -# Optional features. 
-@click.option('--cond', help='Train conditional model', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--mirror', help='Enable dataset x-flips', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--aug', help='Augmentation mode', type=click.Choice(['noaug', 'ada', 'fixed']), default='ada', show_default=True) -@click.option('--resume', help='Resume from given network pickle', metavar='[PATH|URL]', type=str) -@click.option('--freezed', help='Freeze first layers of D', metavar='INT', type=click.IntRange(min=0), default=0, show_default=True) -# Misc hyperparameters. -@click.option('--p', help='Probability for --aug=fixed', metavar='FLOAT', type=click.FloatRange(min=0, max=1), default=0.2, show_default=True) -@click.option('--target', help='Target value for --aug=ada', metavar='FLOAT', type=click.FloatRange(min=0, max=1), default=0.6, show_default=True) -@click.option('--batch-gpu', help='Limit batch size per GPU', metavar='INT', type=click.IntRange(min=1)) -@click.option('--cbase', help='Capacity multiplier', metavar='INT', type=click.IntRange(min=1), default=32768, show_default=True) -@click.option('--cmax', help='Max. feature maps', metavar='INT', type=click.IntRange(min=1), default=512, show_default=True) -@click.option('--glr', help='G learning rate [default: varies]', metavar='FLOAT', type=click.FloatRange(min=0)) -@click.option('--dlr', help='D learning rate', metavar='FLOAT', type=click.FloatRange(min=0), default=0.002, show_default=True) -@click.option('--map-depth', help='Mapping network depth [default: varies]', metavar='INT', type=click.IntRange(min=1)) -@click.option('--mbstd-group', help='Minibatch std group size', metavar='INT', type=click.IntRange(min=1), default=4, show_default=True) -# Misc settings. 
-@click.option('--desc', help='String to include in result dir name', metavar='STR', type=str) -@click.option('--metrics', help='Quality metrics', metavar='[NAME|A,B,C|none]', type=parse_comma_separated_list, default='fid50k_full', show_default=True) -@click.option('--kimg', help='Total training duration', metavar='KIMG', type=click.IntRange(min=1), default=25000, show_default=True) -@click.option('--tick', help='How often to print progress', metavar='KIMG', type=click.IntRange(min=1), default=4, show_default=True) -@click.option('--snap', help='How often to save snapshots', metavar='TICKS', type=click.IntRange(min=1), default=50, show_default=True) -@click.option('--seed', help='Random seed', metavar='INT', type=click.IntRange(min=0), default=0, show_default=True) -@click.option('--fp32', help='Disable mixed-precision', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--nobench', help='Disable cuDNN benchmarking', metavar='BOOL', type=bool, default=False, show_default=True) -@click.option('--workers', help='DataLoader worker processes', metavar='INT', type=click.IntRange(min=1), default=3, show_default=True) -@click.option('-n', '--dry-run', help='Print training options and exit', is_flag=True) -def main(**kwargs): - """Train a GAN using the techniques described in the paper - "Alias-Free Generative Adversarial Networks". - - Examples: - - \b - # Train StyleGAN3-T for AFHQv2 using 8 GPUs. - python train.py --outdir=~/training-runs --cfg=stylegan3-t --data=~/datasets/afhqv2-512x512.zip \\ - --gpus=8 --batch=32 --gamma=8.2 --mirror=1 - - \b - # Fine-tune StyleGAN3-R for MetFaces-U using 1 GPU, starting from the pre-trained FFHQ-U pickle. 
- python train.py --outdir=~/training-runs --cfg=stylegan3-r --data=~/datasets/metfacesu-1024x1024.zip \\ - --gpus=8 --batch=32 --gamma=6.6 --mirror=1 --kimg=5000 --snap=5 \\ - --resume=https://api.ngc.nvidia.com/v2/models/nvidia/research/stylegan3/versions/1/files/stylegan3-r-ffhqu-1024x1024.pkl - - \b - # Train StyleGAN2 for FFHQ at 1024x1024 resolution using 8 GPUs. - python train.py --outdir=~/training-runs --cfg=stylegan2 --data=~/datasets/ffhq-1024x1024.zip \\ - --gpus=8 --batch=32 --gamma=10 --mirror=1 --aug=noaug - """ - - # Initialize config. - opts = dnnlib.EasyDict(kwargs) # Command line arguments. - c = dnnlib.EasyDict() # Main config dict. - print('---- square: ', opts.square) - c.G_kwargs = dnnlib.EasyDict( - class_name=None, z_dim=512, w_dim=512, mapping_kwargs=dnnlib.EasyDict(), square=opts.square) - c.D_kwargs = dnnlib.EasyDict(class_name='training.networks_stylegan2.Discriminator', block_kwargs=dnnlib.EasyDict( - ), mapping_kwargs=dnnlib.EasyDict(), epilogue_kwargs=dnnlib.EasyDict(), square=opts.square) - c.G_opt_kwargs = dnnlib.EasyDict( - class_name='torch.optim.Adam', betas=[0, 0.99], eps=1e-8) - c.D_opt_kwargs = dnnlib.EasyDict( - class_name='torch.optim.Adam', betas=[0, 0.99], eps=1e-8) - c.loss_kwargs = dnnlib.EasyDict(class_name='training.loss.StyleGAN2Loss') - c.data_loader_kwargs = dnnlib.EasyDict(pin_memory=True, prefetch_factor=2) - - # Training set. - c.training_set_kwargs, dataset_name = init_dataset_kwargs( - data=opts.data, square=opts.square) - if opts.cond and not c.training_set_kwargs.use_labels: - raise click.ClickException( - '--cond=True requires labels specified in dataset.json') - c.training_set_kwargs.use_labels = opts.cond - c.training_set_kwargs.xflip = opts.mirror - - # Hyperparameters & settings. 
- c.num_gpus = opts.gpus - c.batch_size = opts.batch - c.batch_gpu = opts.batch_gpu or opts.batch // opts.gpus - c.G_kwargs.channel_base = c.D_kwargs.channel_base = opts.cbase - c.G_kwargs.channel_max = c.D_kwargs.channel_max = opts.cmax - c.G_kwargs.mapping_kwargs.num_layers = ( - 8 if opts.cfg == 'stylegan2' else 2) if opts.map_depth is None else opts.map_depth - c.D_kwargs.block_kwargs.freeze_layers = opts.freezed - c.D_kwargs.epilogue_kwargs.mbstd_group_size = opts.mbstd_group - c.loss_kwargs.r1_gamma = opts.gamma - c.G_opt_kwargs.lr = ( - 0.002 if opts.cfg == 'stylegan2' else 0.0025) if opts.glr is None else opts.glr - c.D_opt_kwargs.lr = opts.dlr - c.metrics = opts.metrics - c.total_kimg = opts.kimg - c.kimg_per_tick = opts.tick - c.image_snapshot_ticks = c.network_snapshot_ticks = opts.snap - c.random_seed = c.training_set_kwargs.random_seed = opts.seed - c.data_loader_kwargs.num_workers = opts.workers - - # Sanity checks. - if c.batch_size % c.num_gpus != 0: - raise click.ClickException('--batch must be a multiple of --gpus') - if c.batch_size % (c.num_gpus * c.batch_gpu) != 0: - raise click.ClickException( - '--batch must be a multiple of --gpus times --batch-gpu') - if c.batch_gpu < c.D_kwargs.epilogue_kwargs.mbstd_group_size: - raise click.ClickException( - '--batch-gpu cannot be smaller than --mbstd') - if any(not metric_main.is_valid_metric(metric) for metric in c.metrics): - raise click.ClickException('\n'.join( - ['--metrics can only contain the following values:'] + metric_main.list_valid_metrics())) - - # Base configuration. - c.ema_kimg = c.batch_size * 10 / 32 - if opts.cfg == 'stylegan2': - c.G_kwargs.class_name = 'training.networks_stylegan2.Generator' - # Enable style mixing regularization. - c.loss_kwargs.style_mixing_prob = 0.9 - c.loss_kwargs.pl_weight = 2 # Enable path length regularization. - c.G_reg_interval = 4 # Enable lazy regularization for G. - # Speed up training by using regular convolutions instead of grouped convolutions. 
- c.G_kwargs.fused_modconv_default = 'inference_only' - # Speed up path length regularization by skipping gradient computation wrt. conv2d weights. - c.loss_kwargs.pl_no_weight_grad = True - else: - c.G_kwargs.class_name = 'training.networks_stylegan3.Generator' - c.G_kwargs.magnitude_ema_beta = 0.5 ** (c.batch_size / (20 * 1e3)) - if opts.cfg == 'stylegan3-r': - c.G_kwargs.conv_kernel = 1 # Use 1x1 convolutions. - c.G_kwargs.channel_base *= 2 # Double the number of feature maps. - c.G_kwargs.channel_max *= 2 - # Use radially symmetric downsampling filters. - c.G_kwargs.use_radial_filters = True - # Blur the images seen by the discriminator. - c.loss_kwargs.blur_init_sigma = 10 - # Fade out the blur during the first N kimg. - c.loss_kwargs.blur_fade_kimg = c.batch_size * 200 / 32 - - # Augmentation. - if opts.aug != 'noaug': - c.augment_kwargs = dnnlib.EasyDict(class_name='training.augment.AugmentPipe', xflip=1, rotate90=1, xint=1, - scale=1, rotate=1, aniso=1, xfrac=1, brightness=1, contrast=1, lumaflip=1, hue=1, saturation=1) - if opts.aug == 'ada': - c.ada_target = opts.target - if opts.aug == 'fixed': - c.augment_p = opts.p - - # Resume. - if opts.resume is not None: - c.resume_pkl = opts.resume - c.ada_kimg = 100 # Make ADA react faster at the beginning. - c.ema_rampup = None # Disable EMA rampup. - c.loss_kwargs.blur_init_sigma = 0 # Disable blur rampup. - - # Performance-related toggles. - if opts.fp32: - c.G_kwargs.num_fp16_res = c.D_kwargs.num_fp16_res = 0 - c.G_kwargs.conv_clamp = c.D_kwargs.conv_clamp = None - if opts.nobench: - c.cudnn_benchmark = False - - # Description string. - desc = f'{opts.cfg:s}-{dataset_name:s}-gpus{c.num_gpus:d}-batch{c.batch_size:d}-gamma{c.loss_kwargs.r1_gamma:g}' - if opts.desc is not None: - desc += f'-{opts.desc}' - - # Launch. 
- launch_training(c=c, desc=desc, outdir=opts.outdir, dry_run=opts.dry_run) - -# ---------------------------------------------------------------------------- - - -if __name__ == "__main__": - main() # pylint: disable=no-value-for-parameter - -# ---------------------------------------------------------------------------- diff --git a/spaces/h2oai/wave-tour/examples/breadcrumbs.py b/spaces/h2oai/wave-tour/examples/breadcrumbs.py deleted file mode 100644 index 751fe472ac9892709ec54a7d73726af374428301..0000000000000000000000000000000000000000 --- a/spaces/h2oai/wave-tour/examples/breadcrumbs.py +++ /dev/null @@ -1,43 +0,0 @@ -# Breadcrumbs -# #Breadcrumbs should be used as a navigational aid in your app or site. -# They indicate the current page’s location within a hierarchy and help -# the user understand where they are in relation to the rest of that hierarchy. -# They also afford one-click access to higher levels of that hierarchy. -# Breadcrumbs are typically placed, in horizontal form, under the masthead -# or #navigation of an experience, above the primary content area. -# --- -from h2o_wave import main, app, Q, ui - - -@app('/demo') -async def serve(q: Q): - blurb_items = [ui.button(name='#submenu', label='Go to submenu', link=True)] - blurb_title = 'Welcome to Menu' - breadcrumbs = [ui.breadcrumb(name='#menu', label='Menu')] - - if q.args['#'] == 'menu': - blurb_items = [ui.button(name='#submenu', label='Go to submenu', link=True)] - blurb_title = 'Welcome to Menu!' - breadcrumbs = [ - ui.breadcrumb(name='#menu', label='Menu'), - ] - elif q.args['#'] == 'submenu': - blurb_items = [ui.button(name='#subsubmenu', label='Go to subsubmenu', link=True)] - blurb_title = 'Welcome to Submenu!' 
- breadcrumbs = [ - ui.breadcrumb(name='#menu', label='Menu'), - ui.breadcrumb(name='#submenu', label='Submenu'), - ] - elif q.args['#'] == 'subsubmenu': - blurb_items = [ui.text('You cannot go deeper, click on Breadcrumbs above to navigate back')] - blurb_title = 'Welcome to Subsubmenu!' - breadcrumbs = [ - ui.breadcrumb(name='#menu', label='Menu'), - ui.breadcrumb(name='#submenu', label='Submenu'), - ui.breadcrumb(name='#subsubmenu', label='Subsubmenu'), - ] - - q.page['blurb'] = ui.form_card(box='1 2 3 2', title=blurb_title, items=blurb_items) - q.page['breadcrumbs'] = ui.breadcrumbs_card(box='1 1 3 1', items=breadcrumbs) - - await q.page.save() diff --git a/spaces/hackathon-somos-nlp-2023/ask2democracy/config.py b/spaces/hackathon-somos-nlp-2023/ask2democracy/config.py deleted file mode 100644 index bcb5f88d9aed7511bf0c804b208945ffd946efcb..0000000000000000000000000000000000000000 --- a/spaces/hackathon-somos-nlp-2023/ask2democracy/config.py +++ /dev/null @@ -1,12 +0,0 @@ -class Config(): - es_host = "saimon-askwdemocracy.es.us-central1.gcp.cloud.es.io" - es_user = "elastic" - #es_password = "53f2a7a9-ea9d-4fd2-a8bc-f471b67f0262" - es_password = "1f45bf76-b600-42b3-b2cc-ab6062693eb7" - index_name = "docsreloaded" - reader_model_name_or_path = "deepset/xlm-roberta-base-squad2-distilled" - pinecone_environment="us-east-1-aws" - embedding_dim = 384 - similarity="cosine" - #reader_model_name_or_path = "deepset/xlm-roberta-base-squad2" - use_gpu = True \ No newline at end of file diff --git a/spaces/hamelcubsfan/AutoGPT/autogpt/commands/google_search.py b/spaces/hamelcubsfan/AutoGPT/autogpt/commands/google_search.py deleted file mode 100644 index 7d38ce7568d2de207d521b077cfebd72527c9795..0000000000000000000000000000000000000000 --- a/spaces/hamelcubsfan/AutoGPT/autogpt/commands/google_search.py +++ /dev/null @@ -1,87 +0,0 @@ -"""Google search command for Autogpt.""" -from __future__ import annotations - -import json - -from duckduckgo_search import ddg - -from 
autogpt.config import Config - -CFG = Config() - - -def google_search(query: str, num_results: int = 8) -> str: - """Return the results of a Google search - - Args: - query (str): The search query. - num_results (int): The number of results to return. - - Returns: - str: The results of the search. - """ - search_results = [] - if not query: - return json.dumps(search_results) - - results = ddg(query, max_results=num_results) - if not results: - return json.dumps(search_results) - - for j in results: - search_results.append(j) - - return json.dumps(search_results, ensure_ascii=False, indent=4) - - -def google_official_search(query: str, num_results: int = 8) -> str | list[str]: - """Return the results of a Google search using the official Google API - - Args: - query (str): The search query. - num_results (int): The number of results to return. - - Returns: - str: The results of the search. - """ - - from googleapiclient.discovery import build - from googleapiclient.errors import HttpError - - try: - # Get the Google API key and Custom Search Engine ID from the config file - api_key = CFG.google_api_key - custom_search_engine_id = CFG.custom_search_engine_id - - # Initialize the Custom Search API service - service = build("customsearch", "v1", developerKey=api_key) - - # Send the search query and retrieve the results - result = ( - service.cse() - .list(q=query, cx=custom_search_engine_id, num=num_results) - .execute() - ) - - # Extract the search result items from the response - search_results = result.get("items", []) - - # Create a list of only the URLs from the search results - search_results_links = [item["link"] for item in search_results] - - except HttpError as e: - # Handle errors in the API call - error_details = json.loads(e.content.decode()) - - # Check if the error is related to an invalid or missing API key - if error_details.get("error", {}).get( - "code" - ) == 403 and "invalid API key" in error_details.get("error", {}).get( - "message", "" - ): - 
-            return "Error: The provided Google API key is invalid or missing."
-        else:
-            return f"Error: {e}"
-
-    # Return the list of search result URLs
-    return search_results_links
diff --git a/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/csrc/ROIPool.h b/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/csrc/ROIPool.h
deleted file mode 100644
index 4cf627fde6b9dc0ecd6524a7048e7f8c7d7746b5..0000000000000000000000000000000000000000
--- a/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/csrc/ROIPool.h
+++ /dev/null
@@ -1,48 +0,0 @@
-// Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
-#pragma once
-
-#include "cpu/vision.h"
-
-#ifdef WITH_CUDA
-#include "cuda/vision.h"
-#endif
-
-
-std::tuple<at::Tensor, at::Tensor> ROIPool_forward(const at::Tensor& input,
-                                                   const at::Tensor& rois,
-                                                   const float spatial_scale,
-                                                   const int pooled_height,
-                                                   const int pooled_width) {
-  if (input.device().is_cuda()) {
-#ifdef WITH_CUDA
-    return ROIPool_forward_cuda(input, rois, spatial_scale, pooled_height, pooled_width);
-#else
-    AT_ERROR("Not compiled with GPU support");
-#endif
-  }
-  AT_ERROR("Not implemented on the CPU");
-}
-
-at::Tensor ROIPool_backward(const at::Tensor& grad,
-                            const at::Tensor& input,
-                            const at::Tensor& rois,
-                            const at::Tensor& argmax,
-                            const float spatial_scale,
-                            const int pooled_height,
-                            const int pooled_width,
-                            const int batch_size,
-                            const int channels,
-                            const int height,
-                            const int width) {
-  if (grad.device().is_cuda()) {
-#ifdef WITH_CUDA
-    return ROIPool_backward_cuda(grad, input, rois, argmax, spatial_scale, pooled_height, pooled_width, batch_size, channels, height, width);
-#else
-    AT_ERROR("Not compiled with GPU support");
-#endif
-  }
-  AT_ERROR("Not implemented on the CPU");
-}
-
-
-
diff --git "a/spaces/hbestm/gpt-academic-play/crazy_functions/\347\224\237\346\210\220\345\207\275\346\225\260\346\263\250\351\207\212.py"
"b/spaces/hbestm/gpt-academic-play/crazy_functions/\347\224\237\346\210\220\345\207\275\346\225\260\346\263\250\351\207\212.py" deleted file mode 100644 index a564f21d231cd65c29b539573929ca5d2df63203..0000000000000000000000000000000000000000 --- "a/spaces/hbestm/gpt-academic-play/crazy_functions/\347\224\237\346\210\220\345\207\275\346\225\260\346\263\250\351\207\212.py" +++ /dev/null @@ -1,54 +0,0 @@ -from toolbox import update_ui -from toolbox import CatchException, report_execption, write_results_to_file -from .crazy_utils import request_gpt_model_in_new_thread_with_ui_alive -fast_debug = False - -def 生成函数注释(file_manifest, project_folder, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt): - import time, os - print('begin analysis on:', file_manifest) - for index, fp in enumerate(file_manifest): - with open(fp, 'r', encoding='utf-8', errors='replace') as f: - file_content = f.read() - - i_say = f'请对下面的程序文件做一个概述,并对文件中的所有函数生成注释,使用markdown表格输出结果,文件名是{os.path.relpath(fp, project_folder)},文件内容是 ```{file_content}```' - i_say_show_user = f'[{index}/{len(file_manifest)}] 请对下面的程序文件做一个概述,并对文件中的所有函数生成注释: {os.path.abspath(fp)}' - chatbot.append((i_say_show_user, "[Local Message] waiting gpt response.")) - yield from update_ui(chatbot=chatbot, history=history) # 刷新界面 - - if not fast_debug: - msg = '正常' - # ** gpt request ** - gpt_say = yield from request_gpt_model_in_new_thread_with_ui_alive( - i_say, i_say_show_user, llm_kwargs, chatbot, history=[], sys_prompt=system_prompt) # 带超时倒计时 - - chatbot[-1] = (i_say_show_user, gpt_say) - history.append(i_say_show_user); history.append(gpt_say) - yield from update_ui(chatbot=chatbot, history=history, msg=msg) # 刷新界面 - if not fast_debug: time.sleep(2) - - if not fast_debug: - res = write_results_to_file(history) - chatbot.append(("完成了吗?", res)) - yield from update_ui(chatbot=chatbot, history=history, msg=msg) # 刷新界面 - - - -@CatchException -def 批量生成函数注释(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt, 
web_port): - history = [] # clear history to avoid input overflow - import glob, os - if os.path.exists(txt): - project_folder = txt - else: - if txt == "": txt = '空空如也的输入栏' - report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到本地项目或无权访问: {txt}") - yield from update_ui(chatbot=chatbot, history=history) # refresh the UI - return - file_manifest = [f for f in glob.glob(f'{project_folder}/**/*.py', recursive=True)] + \ - [f for f in glob.glob(f'{project_folder}/**/*.cpp', recursive=True)] - - if len(file_manifest) == 0: - report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到任何.py或.cpp文件: {txt}") - yield from update_ui(chatbot=chatbot, history=history) # refresh the UI - return - yield from 生成函数注释(file_manifest, project_folder, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt) diff --git a/spaces/hezhaoqia/vits-simple-api/Dockerfile b/spaces/hezhaoqia/vits-simple-api/Dockerfile deleted file mode 100644 index 84fa3cb29ccc0b3c2b01d43c4835976e8f2dab34..0000000000000000000000000000000000000000 --- a/spaces/hezhaoqia/vits-simple-api/Dockerfile +++ /dev/null @@ -1,36 +0,0 @@ -FROM python:3.10.11-slim-bullseye - -RUN mkdir -p /app -WORKDIR /app - -ENV DEBIAN_FRONTEND=noninteractive - -COPY . 
/app - -RUN apt-get update && \ - apt-get install -yq build-essential espeak-ng cmake wget && \ - apt-get clean && \ - apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false && \ - rm -rf /var/lib/apt/lists/* - -RUN pip install --upgrade pip --no-cache-dir && \ - pip install MarkupSafe==2.1.2 numpy==1.23.3 cython six==1.16.0 safetensors==0.3.2 --no-cache-dir - -RUN wget https://raw.githubusercontent.com/Artrajz/archived/main/openjtalk/openjtalk-0.3.0.dev2.tar.gz && \ - tar -zxvf openjtalk-0.3.0.dev2.tar.gz && \ - cd openjtalk-0.3.0.dev2 && \ - rm -rf ./pyopenjtalk/open_jtalk_dic_utf_8-1.11 && \ - python setup.py install && \ - cd ../ && \ - rm -f openjtalk-0.3.0.dev2.tar.gz && \ - rm -rf openjtalk-0.3.0.dev2 - -RUN pip install torch --index-url https://download.pytorch.org/whl/cpu --no-cache-dir - -RUN pip install -r requirements.txt --no-cache-dir - -RUN pip install gunicorn --no-cache-dir - -EXPOSE 23456 - -CMD ["gunicorn", "-c", "gunicorn_config.py", "app:app"] \ No newline at end of file diff --git a/spaces/hf4all/web-ui/_next/static/chunks/webpack-c25634afc45333ff.js b/spaces/hf4all/web-ui/_next/static/chunks/webpack-c25634afc45333ff.js deleted file mode 100644 index 3cce6407b44e8b07cf0e1257274dbc98c6e6cba1..0000000000000000000000000000000000000000 --- a/spaces/hf4all/web-ui/_next/static/chunks/webpack-c25634afc45333ff.js +++ /dev/null @@ -1 +0,0 @@ -!function(){"use strict";var e,t,n,r,o,u,i,c,a,f,d,l,s={},p={};function h(e){var t=p[e];if(void 0!==t)return t.exports;var n=p[e]={id:e,loaded:!1,exports:{}},r=!0;try{s[e].call(n.exports,n,n.exports,h),r=!1}finally{r&&delete p[e]}return n.loaded=!0,n.exports}h.m=s,e=[],h.O=function(t,n,r,o){if(n){o=o||0;for(var u=e.length;u>0&&e[u-1][2]>o;u--)e[u]=e[u-1];e[u]=[n,r,o];return}for(var i=1/0,u=0;u=o&&Object.keys(h.O).every(function(e){return h.O[e](n[a])})?n.splice(a--,1):(c=!1,o - systemRoleEditing: Accessor - setSystemRoleEditing: Setter - currentSystemRoleSettings: Accessor - 
setCurrentSystemRoleSettings: Setter -} - -export default (props: Props) => { - let systemInputRef: HTMLTextAreaElement - - const handleButtonClick = () => { - props.setCurrentSystemRoleSettings(systemInputRef.value) - props.setSystemRoleEditing(false) - } - - return ( -
          - - -
          -
          - - System Role: -
          -
          - { props.currentSystemRoleSettings() } -
          -
          -
          - - props.setSystemRoleEditing(!props.systemRoleEditing())} class="sys-edit-btn"> - - Add System Role - - -
          - -
          -
          - - System Role: -
          -

          Gently instruct the assistant and set the behavior of the assistant.

          -
          - ", - y.noCloneChecked = !!ce.cloneNode(!0).lastChild.defaultValue, - ce.innerHTML = "", - y.option = !!ce.lastChild; - var ge = { - thead: [1, "", "
          "], - col: [2, "", "
          "], - tr: [2, "", "
          "], - td: [3, "", "
          "], - _default: [0, "", ""] - }; - function ve(e, t) { - var n; - return n = "undefined" != typeof e.getElementsByTagName ? e.getElementsByTagName(t || "*") : "undefined" != typeof e.querySelectorAll ? e.querySelectorAll(t || "*") : [], void 0 === t || t && A(e, t) ? S.merge([e], n) : n - } - function ye(e, t) { - for (var n = 0, r = e.length; n < r; n++) - Y.set(e[n], "globalEval", !t || Y.get(t[n], "globalEval")) - } - ge.tbody = ge.tfoot = ge.colgroup = ge.caption = ge.thead, - ge.th = ge.td, - y.option || (ge.optgroup = ge.option = [1, ""]); - var me = /<|&#?\w+;/; - function xe(e, t, n, r, i) { - for (var o, a, s, u, l, c, f = t.createDocumentFragment(), p = [], d = 0, h = e.length; d < h; d++) - if ((o = e[d]) || 0 === o) - if ("object" === w(o)) - S.merge(p, o.nodeType ? [o] : o); - else if (me.test(o)) { - a = a || f.appendChild(t.createElement("div")), - s = (de.exec(o) || ["", ""])[1].toLowerCase(), - u = ge[s] || ge._default, - a.innerHTML = u[1] + S.htmlPrefilter(o) + u[2], - c = u[0]; - while (c--) - a = a.lastChild; - S.merge(p, a.childNodes), - (a = f.firstChild).textContent = "" - } else - p.push(t.createTextNode(o)); - f.textContent = "", - d = 0; - while (o = p[d++]) - if (r && -1 < S.inArray(o, r)) - i && i.push(o); - else if (l = ie(o), a = ve(f.appendChild(o), "script"), l && ye(a), n) { - c = 0; - while (o = a[c++]) - he.test(o.type || "") && n.push(o) - } - return f - } - var be = /^([^.]*)(?:\.(.+)|)/; - function we() { - return !0 - } - function Te() { - return !1 - } - function Ce(e, t) { - return e === function() { - try { - return E.activeElement - } catch (e) {} - }() == ("focus" === t) - } - function Ee(e, t, n, r, i, o) { - var a, - s; - if ("object" == typeof t) { - for (s in "string" != typeof n && (r = r || n, n = void 0), t) - Ee(e, s, n, r, t[s], o); - return e - } - if (null == r && null == i ? (i = n, r = n = void 0) : null == i && ("string" == typeof n ? 
(i = r, r = void 0) : (i = r, r = n, n = void 0)), !1 === i) - i = Te; - else if (!i) - return e; - return 1 === o && (a = i, (i = function(e) { - return S().off(e), a.apply(this, arguments) - }).guid = a.guid || (a.guid = S.guid++)), e.each(function() { - S.event.add(this, t, i, r, n) - }) - } - function Se(e, i, o) { - o ? (Y.set(e, i, !1), S.event.add(e, i, { - namespace: !1, - handler: function(e) { - var t, - n, - r = Y.get(this, i); - if (1 & e.isTrigger && this[i]) { - if (r.length) - (S.event.special[i] || {}).delegateType && e.stopPropagation(); - else if (r = s.call(arguments), Y.set(this, i, r), t = o(this, i), this[i](), r !== (n = Y.get(this, i)) || t ? Y.set(this, i, !1) : n = {}, r !== n) - return e.stopImmediatePropagation(), e.preventDefault(), n && n.value - } else - r.length && (Y.set(this, i, { - value: S.event.trigger(S.extend(r[0], S.Event.prototype), r.slice(1), this) - }), e.stopImmediatePropagation()) - } - })) : void 0 === Y.get(e, i) && S.event.add(e, i, we) - } - S.event = { - global: {}, - add: function(t, e, n, r, i) { - var o, - a, - s, - u, - l, - c, - f, - p, - d, - h, - g, - v = Y.get(t); - if (V(t)) { - n.handler && (n = (o = n).handler, i = o.selector), - i && S.find.matchesSelector(re, i), - n.guid || (n.guid = S.guid++), - (u = v.events) || (u = v.events = Object.create(null)), - (a = v.handle) || (a = v.handle = function(e) { - return "undefined" != typeof S && S.event.triggered !== e.type ? S.event.dispatch.apply(t, arguments) : void 0 - }), - l = (e = (e || "").match(P) || [""]).length; - while (l--) - d = g = (s = be.exec(e[l]) || [])[1], - h = (s[2] || "").split(".").sort(), - d && (f = S.event.special[d] || {}, d = (i ? 
f.delegateType : f.bindType) || d, f = S.event.special[d] || {}, c = S.extend({ - type: d, - origType: g, - data: r, - handler: n, - guid: n.guid, - selector: i, - needsContext: i && S.expr.match.needsContext.test(i), - namespace: h.join(".") - }, o), (p = u[d]) || ((p = u[d] = []).delegateCount = 0, f.setup && !1 !== f.setup.call(t, r, h, a) || t.addEventListener && t.addEventListener(d, a)), f.add && (f.add.call(t, c), c.handler.guid || (c.handler.guid = n.guid)), i ? p.splice(p.delegateCount++, 0, c) : p.push(c), S.event.global[d] = !0) - } - }, - remove: function(e, t, n, r, i) { - var o, - a, - s, - u, - l, - c, - f, - p, - d, - h, - g, - v = Y.hasData(e) && Y.get(e); - if (v && (u = v.events)) { - l = (t = (t || "").match(P) || [""]).length; - while (l--) - if (d = g = (s = be.exec(t[l]) || [])[1], h = (s[2] || "").split(".").sort(), d) { - f = S.event.special[d] || {}, - p = u[d = (r ? f.delegateType : f.bindType) || d] || [], - s = s[2] && new RegExp("(^|\\.)" + h.join("\\.(?:.*\\.|)") + "(\\.|$)"), - a = o = p.length; - while (o--) - c = p[o], - !i && g !== c.origType || n && n.guid !== c.guid || s && !s.test(c.namespace) || r && r !== c.selector && ("**" !== r || !c.selector) || (p.splice(o, 1), c.selector && p.delegateCount--, f.remove && f.remove.call(e, c)); - a && !p.length && (f.teardown && !1 !== f.teardown.call(e, h, v.handle) || S.removeEvent(e, d, v.handle), delete u[d]) - } else - for (d in u) - S.event.remove(e, d + t[l], n, r, !0); - S.isEmptyObject(u) && Y.remove(e, "handle events") - } - }, - dispatch: function(e) { - var t, - n, - r, - i, - o, - a, - s = new Array(arguments.length), - u = S.event.fix(e), - l = (Y.get(this, "events") || Object.create(null))[u.type] || [], - c = S.event.special[u.type] || {}; - for (s[0] = u, t = 1; t < arguments.length; t++) - s[t] = arguments[t]; - if (u.delegateTarget = this, !c.preDispatch || !1 !== c.preDispatch.call(this, u)) { - a = S.event.handlers.call(this, u, l), - t = 0; - while ((i = a[t++]) && 
!u.isPropagationStopped()) { - u.currentTarget = i.elem, - n = 0; - while ((o = i.handlers[n++]) && !u.isImmediatePropagationStopped()) - u.rnamespace && !1 !== o.namespace && !u.rnamespace.test(o.namespace) || (u.handleObj = o, u.data = o.data, void 0 !== (r = ((S.event.special[o.origType] || {}).handle || o.handler).apply(i.elem, s)) && !1 === (u.result = r) && (u.preventDefault(), u.stopPropagation())) - } - return c.postDispatch && c.postDispatch.call(this, u), u.result - } - }, - handlers: function(e, t) { - var n, - r, - i, - o, - a, - s = [], - u = t.delegateCount, - l = e.target; - if (u && l.nodeType && !("click" === e.type && 1 <= e.button)) - for (; l !== this; l = l.parentNode || this) - if (1 === l.nodeType && ("click" !== e.type || !0 !== l.disabled)) { - for (o = [], a = {}, n = 0; n < u; n++) - void 0 === a[i = (r = t[n]).selector + " "] && (a[i] = r.needsContext ? -1 < S(i, this).index(l) : S.find(i, this, null, [l]).length), - a[i] && o.push(r); - o.length && s.push({ - elem: l, - handlers: o - }) - } - return l = this, u < t.length && s.push({ - elem: l, - handlers: t.slice(u) - }), s - }, - addProp: function(t, e) { - Object.defineProperty(S.Event.prototype, t, { - enumerable: !0, - configurable: !0, - get: m(e) ? function() { - if (this.originalEvent) - return e(this.originalEvent) - } : function() { - if (this.originalEvent) - return this.originalEvent[t] - }, - set: function(e) { - Object.defineProperty(this, t, { - enumerable: !0, - configurable: !0, - writable: !0, - value: e - }) - } - }) - }, - fix: function(e) { - return e[S.expando] ? 
e : new S.Event(e) - }, - special: { - load: { - noBubble: !0 - }, - click: { - setup: function(e) { - var t = this || e; - return pe.test(t.type) && t.click && A(t, "input") && Se(t, "click", we), !1 - }, - trigger: function(e) { - var t = this || e; - return pe.test(t.type) && t.click && A(t, "input") && Se(t, "click"), !0 - }, - _default: function(e) { - var t = e.target; - return pe.test(t.type) && t.click && A(t, "input") && Y.get(t, "click") || A(t, "a") - } - }, - beforeunload: { - postDispatch: function(e) { - void 0 !== e.result && e.originalEvent && (e.originalEvent.returnValue = e.result) - } - } - } - }, - S.removeEvent = function(e, t, n) { - e.removeEventListener && e.removeEventListener(t, n) - }, - S.Event = function(e, t) { - if (!(this instanceof S.Event)) - return new S.Event(e, t); - e && e.type ? (this.originalEvent = e, this.type = e.type, this.isDefaultPrevented = e.defaultPrevented || void 0 === e.defaultPrevented && !1 === e.returnValue ? we : Te, this.target = e.target && 3 === e.target.nodeType ? 
e.target.parentNode : e.target, this.currentTarget = e.currentTarget, this.relatedTarget = e.relatedTarget) : this.type = e, - t && S.extend(this, t), - this.timeStamp = e && e.timeStamp || Date.now(), - this[S.expando] = !0 - }, - S.Event.prototype = { - constructor: S.Event, - isDefaultPrevented: Te, - isPropagationStopped: Te, - isImmediatePropagationStopped: Te, - isSimulated: !1, - preventDefault: function() { - var e = this.originalEvent; - this.isDefaultPrevented = we, - e && !this.isSimulated && e.preventDefault() - }, - stopPropagation: function() { - var e = this.originalEvent; - this.isPropagationStopped = we, - e && !this.isSimulated && e.stopPropagation() - }, - stopImmediatePropagation: function() { - var e = this.originalEvent; - this.isImmediatePropagationStopped = we, - e && !this.isSimulated && e.stopImmediatePropagation(), - this.stopPropagation() - } - }, - S.each({ - altKey: !0, - bubbles: !0, - cancelable: !0, - changedTouches: !0, - ctrlKey: !0, - detail: !0, - eventPhase: !0, - metaKey: !0, - pageX: !0, - pageY: !0, - shiftKey: !0, - view: !0, - "char": !0, - code: !0, - charCode: !0, - key: !0, - keyCode: !0, - button: !0, - buttons: !0, - clientX: !0, - clientY: !0, - offsetX: !0, - offsetY: !0, - pointerId: !0, - pointerType: !0, - screenX: !0, - screenY: !0, - targetTouches: !0, - toElement: !0, - touches: !0, - which: !0 - }, S.event.addProp), - S.each({ - focus: "focusin", - blur: "focusout" - }, function(e, t) { - S.event.special[e] = { - setup: function() { - return Se(this, e, Ce), !1 - }, - trigger: function() { - return Se(this, e), !0 - }, - _default: function() { - return !0 - }, - delegateType: t - } - }), - S.each({ - mouseenter: "mouseover", - mouseleave: "mouseout", - pointerenter: "pointerover", - pointerleave: "pointerout" - }, function(e, i) { - S.event.special[e] = { - delegateType: i, - bindType: i, - handle: function(e) { - var t, - n = e.relatedTarget, - r = e.handleObj; - return n && (n === this || S.contains(this, 
n)) || (e.type = r.origType, t = r.handler.apply(this, arguments), e.type = i), t - } - } - }), - S.fn.extend({ - on: function(e, t, n, r) { - return Ee(this, e, t, n, r) - }, - one: function(e, t, n, r) { - return Ee(this, e, t, n, r, 1) - }, - off: function(e, t, n) { - var r, - i; - if (e && e.preventDefault && e.handleObj) - return r = e.handleObj, S(e.delegateTarget).off(r.namespace ? r.origType + "." + r.namespace : r.origType, r.selector, r.handler), this; - if ("object" == typeof e) { - for (i in e) - this.off(i, t, e[i]); - return this - } - return !1 !== t && "function" != typeof t || (n = t, t = void 0), !1 === n && (n = Te), this.each(function() { - S.event.remove(this, e, n, t) - }) - } - }); - var ke = /\s*$/g; - function je(e, t) { - return A(e, "table") && A(11 !== t.nodeType ? t : t.firstChild, "tr") && S(e).children("tbody")[0] || e - } - function De(e) { - return e.type = (null !== e.getAttribute("type")) + "/" + e.type, e - } - function qe(e) { - return "true/" === (e.type || "").slice(0, 5) ? 
e.type = e.type.slice(5) : e.removeAttribute("type"), e - } - function Le(e, t) { - var n, - r, - i, - o, - a, - s; - if (1 === t.nodeType) { - if (Y.hasData(e) && (s = Y.get(e).events)) - for (i in Y.remove(t, "handle events"), s) - for (n = 0, r = s[i].length; n < r; n++) - S.event.add(t, i, s[i][n]); - Q.hasData(e) && (o = Q.access(e), a = S.extend({}, o), Q.set(t, a)) - } - } - function He(n, r, i, o) { - r = g(r); - var e, - t, - a, - s, - u, - l, - c = 0, - f = n.length, - p = f - 1, - d = r[0], - h = m(d); - if (h || 1 < f && "string" == typeof d && !y.checkClone && Ae.test(d)) - return n.each(function(e) { - var t = n.eq(e); - h && (r[0] = d.call(this, e, t.html())), - He(t, r, i, o) - }); - if (f && (t = (e = xe(r, n[0].ownerDocument, !1, n, o)).firstChild, 1 === e.childNodes.length && (e = t), t || o)) { - for (s = (a = S.map(ve(e, "script"), De)).length; c < f; c++) - u = e, - c !== p && (u = S.clone(u, !0, !0), s && S.merge(a, ve(u, "script"))), - i.call(n[c], u, c); - if (s) - for (l = a[a.length - 1].ownerDocument, S.map(a, qe), c = 0; c < s; c++) - u = a[c], - he.test(u.type || "") && !Y.access(u, "globalEval") && S.contains(l, u) && (u.src && "module" !== (u.type || "").toLowerCase() ? S._evalUrl && !u.noModule && S._evalUrl(u.src, { - nonce: u.nonce || u.getAttribute("nonce") - }, l) : b(u.textContent.replace(Ne, ""), u, l)) - } - return n - } - function Oe(e, t, n) { - for (var r, i = t ? 
S.filter(t, e) : e, o = 0; null != (r = i[o]); o++) - n || 1 !== r.nodeType || S.cleanData(ve(r)), - r.parentNode && (n && ie(r) && ye(ve(r, "script")), r.parentNode.removeChild(r)); - return e - } - S.extend({ - htmlPrefilter: function(e) { - return e - }, - clone: function(e, t, n) { - var r, - i, - o, - a, - s, - u, - l, - c = e.cloneNode(!0), - f = ie(e); - if (!(y.noCloneChecked || 1 !== e.nodeType && 11 !== e.nodeType || S.isXMLDoc(e))) - for (a = ve(c), r = 0, i = (o = ve(e)).length; r < i; r++) - s = o[r], - u = a[r], - void 0, - "input" === (l = u.nodeName.toLowerCase()) && pe.test(s.type) ? u.checked = s.checked : "input" !== l && "textarea" !== l || (u.defaultValue = s.defaultValue); - if (t) - if (n) - for (o = o || ve(e), a = a || ve(c), r = 0, i = o.length; r < i; r++) - Le(o[r], a[r]); - else - Le(e, c); - return 0 < (a = ve(c, "script")).length && ye(a, !f && ve(e, "script")), c - }, - cleanData: function(e) { - for (var t, n, r, i = S.event.special, o = 0; void 0 !== (n = e[o]); o++) - if (V(n)) { - if (t = n[Y.expando]) { - if (t.events) - for (r in t.events) - i[r] ? S.event.remove(n, r) : S.removeEvent(n, r, t.handle); - n[Y.expando] = void 0 - } - n[Q.expando] && (n[Q.expando] = void 0) - } - } - }), - S.fn.extend({ - detach: function(e) { - return Oe(this, e, !0) - }, - remove: function(e) { - return Oe(this, e) - }, - text: function(e) { - return $(this, function(e) { - return void 0 === e ? 
S.text(this) : this.empty().each(function() { - 1 !== this.nodeType && 11 !== this.nodeType && 9 !== this.nodeType || (this.textContent = e) - }) - }, null, e, arguments.length) - }, - append: function() { - return He(this, arguments, function(e) { - 1 !== this.nodeType && 11 !== this.nodeType && 9 !== this.nodeType || je(this, e).appendChild(e) - }) - }, - prepend: function() { - return He(this, arguments, function(e) { - if (1 === this.nodeType || 11 === this.nodeType || 9 === this.nodeType) { - var t = je(this, e); - t.insertBefore(e, t.firstChild) - } - }) - }, - before: function() { - return He(this, arguments, function(e) { - this.parentNode && this.parentNode.insertBefore(e, this) - }) - }, - after: function() { - return He(this, arguments, function(e) { - this.parentNode && this.parentNode.insertBefore(e, this.nextSibling) - }) - }, - empty: function() { - for (var e, t = 0; null != (e = this[t]); t++) - 1 === e.nodeType && (S.cleanData(ve(e, !1)), e.textContent = ""); - return this - }, - clone: function(e, t) { - return e = null != e && e, t = null == t ? 
e : t, this.map(function() { - return S.clone(this, e, t) - }) - }, - html: function(e) { - return $(this, function(e) { - var t = this[0] || {}, - n = 0, - r = this.length; - if (void 0 === e && 1 === t.nodeType) - return t.innerHTML; - if ("string" == typeof e && !ke.test(e) && !ge[(de.exec(e) || ["", ""])[1].toLowerCase()]) { - e = S.htmlPrefilter(e); - try { - for (; n < r; n++) - 1 === (t = this[n] || {}).nodeType && (S.cleanData(ve(t, !1)), t.innerHTML = e); - t = 0 - } catch (e) {} - } - t && this.empty().append(e) - }, null, e, arguments.length) - }, - replaceWith: function() { - var n = []; - return He(this, arguments, function(e) { - var t = this.parentNode; - S.inArray(this, n) < 0 && (S.cleanData(ve(this)), t && t.replaceChild(e, this)) - }, n) - } - }), - S.each({ - appendTo: "append", - prependTo: "prepend", - insertBefore: "before", - insertAfter: "after", - replaceAll: "replaceWith" - }, function(e, a) { - S.fn[e] = function(e) { - for (var t, n = [], r = S(e), i = r.length - 1, o = 0; o <= i; o++) - t = o === i ? this : this.clone(!0), - S(r[o])[a](t), - u.apply(n, t.get()); - return this.pushStack(n) - } - }); - var Pe = new RegExp("^(" + ee + ")(?!px)[a-z%]+$", "i"), - Re = function(e) { - var t = e.ownerDocument.defaultView; - return t && t.opener || (t = C), t.getComputedStyle(e) - }, - Me = function(e, t, n) { - var r, - i, - o = {}; - for (i in t) - o[i] = e.style[i], - e.style[i] = t[i]; - for (i in r = n.call(e), t) - e.style[i] = o[i]; - return r - }, - Ie = new RegExp(ne.join("|"), "i"); - function We(e, t, n) { - var r, - i, - o, - a, - s = e.style; - return (n = n || Re(e)) && ("" !== (a = n.getPropertyValue(t) || n[t]) || ie(e) || (a = S.style(e, t)), !y.pixelBoxStyles() && Pe.test(a) && Ie.test(t) && (r = s.width, i = s.minWidth, o = s.maxWidth, s.minWidth = s.maxWidth = s.width = a, a = n.width, s.width = r, s.minWidth = i, s.maxWidth = o)), void 0 !== a ? 
a + "" : a - } - function Fe(e, t) { - return { - get: function() { - if (!e()) - return (this.get = t).apply(this, arguments); - delete this.get - } - } - } - !function() { - function e() { - if (l) { - u.style.cssText = "position:absolute;left:-11111px;width:60px;margin-top:1px;padding:0;border:0", - l.style.cssText = "position:relative;display:block;box-sizing:border-box;overflow:scroll;margin:auto;border:1px;padding:1px;width:60%;top:1%", - re.appendChild(u).appendChild(l); - var e = C.getComputedStyle(l); - n = "1%" !== e.top, - s = 12 === t(e.marginLeft), - l.style.right = "60%", - o = 36 === t(e.right), - r = 36 === t(e.width), - l.style.position = "absolute", - i = 12 === t(l.offsetWidth / 3), - re.removeChild(u), - l = null - } - } - function t(e) { - return Math.round(parseFloat(e)) - } - var n, - r, - i, - o, - a, - s, - u = E.createElement("div"), - l = E.createElement("div"); - l.style && (l.style.backgroundClip = "content-box", l.cloneNode(!0).style.backgroundClip = "", y.clearCloneStyle = "content-box" === l.style.backgroundClip, S.extend(y, { - boxSizingReliable: function() { - return e(), r - }, - pixelBoxStyles: function() { - return e(), o - }, - pixelPosition: function() { - return e(), n - }, - reliableMarginLeft: function() { - return e(), s - }, - scrollboxSize: function() { - return e(), i - }, - reliableTrDimensions: function() { - var e, - t, - n, - r; - return null == a && (e = E.createElement("table"), t = E.createElement("tr"), n = E.createElement("div"), e.style.cssText = "position:absolute;left:-11111px;border-collapse:separate", t.style.cssText = "border:1px solid", t.style.height = "1px", n.style.height = "9px", n.style.display = "block", re.appendChild(e).appendChild(t).appendChild(n), r = C.getComputedStyle(t), a = parseInt(r.height, 10) + parseInt(r.borderTopWidth, 10) + parseInt(r.borderBottomWidth, 10) === t.offsetHeight, re.removeChild(e)), a - } - })) - }(); - var Be = ["Webkit", "Moz", "ms"], - $e = 
E.createElement("div").style, - _e = {}; - function ze(e) { - var t = S.cssProps[e] || _e[e]; - return t || (e in $e ? e : _e[e] = function(e) { - var t = e[0].toUpperCase() + e.slice(1), - n = Be.length; - while (n--) - if ((e = Be[n] + t) in $e) - return e - }(e) || e) - } - var Ue = /^(none|table(?!-c[ea]).+)/, - Xe = /^--/, - Ve = { - position: "absolute", - visibility: "hidden", - display: "block" - }, - Ge = { - letterSpacing: "0", - fontWeight: "400" - }; - function Ye(e, t, n) { - var r = te.exec(t); - return r ? Math.max(0, r[2] - (n || 0)) + (r[3] || "px") : t - } - function Qe(e, t, n, r, i, o) { - var a = "width" === t ? 1 : 0, - s = 0, - u = 0; - if (n === (r ? "border" : "content")) - return 0; - for (; a < 4; a += 2) - "margin" === n && (u += S.css(e, n + ne[a], !0, i)), - r ? ("content" === n && (u -= S.css(e, "padding" + ne[a], !0, i)), "margin" !== n && (u -= S.css(e, "border" + ne[a] + "Width", !0, i))) : (u += S.css(e, "padding" + ne[a], !0, i), "padding" !== n ? u += S.css(e, "border" + ne[a] + "Width", !0, i) : s += S.css(e, "border" + ne[a] + "Width", !0, i)); - return !r && 0 <= o && (u += Math.max(0, Math.ceil(e["offset" + t[0].toUpperCase() + t.slice(1)] - o - u - s - .5)) || 0), u - } - function Je(e, t, n) { - var r = Re(e), - i = (!y.boxSizingReliable() || n) && "border-box" === S.css(e, "boxSizing", !1, r), - o = i, - a = We(e, t, r), - s = "offset" + t[0].toUpperCase() + t.slice(1); - if (Pe.test(a)) { - if (!n) - return a; - a = "auto" - } - return (!y.boxSizingReliable() && i || !y.reliableTrDimensions() && A(e, "tr") || "auto" === a || !parseFloat(a) && "inline" === S.css(e, "display", !1, r)) && e.getClientRects().length && (i = "border-box" === S.css(e, "boxSizing", !1, r), (o = s in e) && (a = e[s])), (a = parseFloat(a) || 0) + Qe(e, t, n || (i ? 
"border" : "content"), o, r, a) + "px" - } - function Ke(e, t, n, r, i) { - return new Ke.prototype.init(e, t, n, r, i) - } - S.extend({ - cssHooks: { - opacity: { - get: function(e, t) { - if (t) { - var n = We(e, "opacity"); - return "" === n ? "1" : n - } - } - } - }, - cssNumber: { - animationIterationCount: !0, - columnCount: !0, - fillOpacity: !0, - flexGrow: !0, - flexShrink: !0, - fontWeight: !0, - gridArea: !0, - gridColumn: !0, - gridColumnEnd: !0, - gridColumnStart: !0, - gridRow: !0, - gridRowEnd: !0, - gridRowStart: !0, - lineHeight: !0, - opacity: !0, - order: !0, - orphans: !0, - widows: !0, - zIndex: !0, - zoom: !0 - }, - cssProps: {}, - style: function(e, t, n, r) { - if (e && 3 !== e.nodeType && 8 !== e.nodeType && e.style) { - var i, - o, - a, - s = X(t), - u = Xe.test(t), - l = e.style; - if (u || (t = ze(s)), a = S.cssHooks[t] || S.cssHooks[s], void 0 === n) - return a && "get" in a && void 0 !== (i = a.get(e, !1, r)) ? i : l[t]; - "string" === (o = typeof n) && (i = te.exec(n)) && i[1] && (n = se(e, t, i), o = "number"), - null != n && n == n && ("number" !== o || u || (n += i && i[3] || (S.cssNumber[s] ? "" : "px")), y.clearCloneStyle || "" !== n || 0 !== t.indexOf("background") || (l[t] = "inherit"), a && "set" in a && void 0 === (n = a.set(e, n, r)) || (u ? l.setProperty(t, n) : l[t] = n)) - } - }, - css: function(e, t, n, r) { - var i, - o, - a, - s = X(t); - return Xe.test(t) || (t = ze(s)), (a = S.cssHooks[t] || S.cssHooks[s]) && "get" in a && (i = a.get(e, !0, n)), void 0 === i && (i = We(e, t, r)), "normal" === i && t in Ge && (i = Ge[t]), "" === n || n ? (o = parseFloat(i), !0 === n || isFinite(o) ? o || 0 : i) : i - } - }), - S.each(["height", "width"], function(e, u) { - S.cssHooks[u] = { - get: function(e, t, n) { - if (t) - return !Ue.test(S.css(e, "display")) || e.getClientRects().length && e.getBoundingClientRect().width ? 
Je(e, u, n) : Me(e, Ve, function() { - return Je(e, u, n) - }) - }, - set: function(e, t, n) { - var r, - i = Re(e), - o = !y.scrollboxSize() && "absolute" === i.position, - a = (o || n) && "border-box" === S.css(e, "boxSizing", !1, i), - s = n ? Qe(e, u, n, a, i) : 0; - return a && o && (s -= Math.ceil(e["offset" + u[0].toUpperCase() + u.slice(1)] - parseFloat(i[u]) - Qe(e, u, "border", !1, i) - .5)), s && (r = te.exec(t)) && "px" !== (r[3] || "px") && (e.style[u] = t, t = S.css(e, u)), Ye(0, t, s) - } - } - }), - S.cssHooks.marginLeft = Fe(y.reliableMarginLeft, function(e, t) { - if (t) - return (parseFloat(We(e, "marginLeft")) || e.getBoundingClientRect().left - Me(e, { - marginLeft: 0 - }, function() { - return e.getBoundingClientRect().left - })) + "px" - }), - S.each({ - margin: "", - padding: "", - border: "Width" - }, function(i, o) { - S.cssHooks[i + o] = { - expand: function(e) { - for (var t = 0, n = {}, r = "string" == typeof e ? e.split(" ") : [e]; t < 4; t++) - n[i + ne[t] + o] = r[t] || r[t - 2] || r[0]; - return n - } - }, - "margin" !== i && (S.cssHooks[i + o].set = Ye) - }), - S.fn.extend({ - css: function(e, t) { - return $(this, function(e, t, n) { - var r, - i, - o = {}, - a = 0; - if (Array.isArray(t)) { - for (r = Re(e), i = t.length; a < i; a++) - o[t[a]] = S.css(e, t[a], !1, r); - return o - } - return void 0 !== n ? S.style(e, t, n) : S.css(e, t) - }, e, t, 1 < arguments.length) - } - }), - ((S.Tween = Ke).prototype = { - constructor: Ke, - init: function(e, t, n, r, i, o) { - this.elem = e, - this.prop = n, - this.easing = i || S.easing._default, - this.options = t, - this.start = this.now = this.cur(), - this.end = r, - this.unit = o || (S.cssNumber[n] ? "" : "px") - }, - cur: function() { - var e = Ke.propHooks[this.prop]; - return e && e.get ? e.get(this) : Ke.propHooks._default.get(this) - }, - run: function(e) { - var t, - n = Ke.propHooks[this.prop]; - return this.options.duration ? 
this.pos = t = S.easing[this.easing](e, this.options.duration * e, 0, 1, this.options.duration) : this.pos = t = e, this.now = (this.end - this.start) * t + this.start, this.options.step && this.options.step.call(this.elem, this.now, this), n && n.set ? n.set(this) : Ke.propHooks._default.set(this), this - } - }).init.prototype = Ke.prototype, - (Ke.propHooks = { - _default: { - get: function(e) { - var t; - return 1 !== e.elem.nodeType || null != e.elem[e.prop] && null == e.elem.style[e.prop] ? e.elem[e.prop] : (t = S.css(e.elem, e.prop, "")) && "auto" !== t ? t : 0 - }, - set: function(e) { - S.fx.step[e.prop] ? S.fx.step[e.prop](e) : 1 !== e.elem.nodeType || !S.cssHooks[e.prop] && null == e.elem.style[ze(e.prop)] ? e.elem[e.prop] = e.now : S.style(e.elem, e.prop, e.now + e.unit) - } - } - }).scrollTop = Ke.propHooks.scrollLeft = { - set: function(e) { - e.elem.nodeType && e.elem.parentNode && (e.elem[e.prop] = e.now) - } - }, - S.easing = { - linear: function(e) { - return e - }, - swing: function(e) { - return .5 - Math.cos(e * Math.PI) / 2 - }, - _default: "swing" - }, - S.fx = Ke.prototype.init, - S.fx.step = {}; - var Ze, - et, - tt, - nt, - rt = /^(?:toggle|show|hide)$/, - it = /queueHooks$/; - function ot() { - et && (!1 === E.hidden && C.requestAnimationFrame ? C.requestAnimationFrame(ot) : C.setTimeout(ot, S.fx.interval), S.fx.tick()) - } - function at() { - return C.setTimeout(function() { - Ze = void 0 - }), Ze = Date.now() - } - function st(e, t) { - var n, - r = 0, - i = { - height: e - }; - for (t = t ? 
1 : 0; r < 4; r += 2 - t) - i["margin" + (n = ne[r])] = i["padding" + n] = e; - return t && (i.opacity = i.width = e), i - } - function ut(e, t, n) { - for (var r, i = (lt.tweeners[t] || []).concat(lt.tweeners["*"]), o = 0, a = i.length; o < a; o++) - if (r = i[o].call(n, t, e)) - return r - } - function lt(o, e, t) { - var n, - a, - r = 0, - i = lt.prefilters.length, - s = S.Deferred().always(function() { - delete u.elem - }), - u = function() { - if (a) - return !1; - for (var e = Ze || at(), t = Math.max(0, l.startTime + l.duration - e), n = 1 - (t / l.duration || 0), r = 0, i = l.tweens.length; r < i; r++) - l.tweens[r].run(n); - return s.notifyWith(o, [l, n, t]), n < 1 && i ? t : (i || s.notifyWith(o, [l, 1, 0]), s.resolveWith(o, [l]), !1) - }, - l = s.promise({ - elem: o, - props: S.extend({}, e), - opts: S.extend(!0, { - specialEasing: {}, - easing: S.easing._default - }, t), - originalProperties: e, - originalOptions: t, - startTime: Ze || at(), - duration: t.duration, - tweens: [], - createTween: function(e, t) { - var n = S.Tween(o, l.opts, e, t, l.opts.specialEasing[e] || l.opts.easing); - return l.tweens.push(n), n - }, - stop: function(e) { - var t = 0, - n = e ? l.tweens.length : 0; - if (a) - return this; - for (a = !0; t < n; t++) - l.tweens[t].run(1); - return e ? 
(s.notifyWith(o, [l, 1, 0]), s.resolveWith(o, [l, e])) : s.rejectWith(o, [l, e]), this - } - }), - c = l.props; - for (!function(e, t) { - var n, - r, - i, - o, - a; - for (n in e) - if (i = t[r = X(n)], o = e[n], Array.isArray(o) && (i = o[1], o = e[n] = o[0]), n !== r && (e[r] = o, delete e[n]), (a = S.cssHooks[r]) && "expand" in a) - for (n in o = a.expand(o), delete e[r], o) - n in e || (e[n] = o[n], t[n] = i); - else - t[r] = i - }(c, l.opts.specialEasing); r < i; r++) - if (n = lt.prefilters[r].call(l, o, c, l.opts)) - return m(n.stop) && (S._queueHooks(l.elem, l.opts.queue).stop = n.stop.bind(n)), n; - return S.map(c, ut, l), m(l.opts.start) && l.opts.start.call(o, l), l.progress(l.opts.progress).done(l.opts.done, l.opts.complete).fail(l.opts.fail).always(l.opts.always), S.fx.timer(S.extend(u, { - elem: o, - anim: l, - queue: l.opts.queue - })), l - } - S.Animation = S.extend(lt, { - tweeners: { - "*": [function(e, t) { - var n = this.createTween(e, t); - return se(n.elem, e, te.exec(t), n), n - }] - }, - tweener: function(e, t) { - m(e) ? (t = e, e = ["*"]) : e = e.match(P); - for (var n, r = 0, i = e.length; r < i; r++) - n = e[r], - lt.tweeners[n] = lt.tweeners[n] || [], - lt.tweeners[n].unshift(t) - }, - prefilters: [function(e, t, n) { - var r, - i, - o, - a, - s, - u, - l, - c, - f = "width" in t || "height" in t, - p = this, - d = {}, - h = e.style, - g = e.nodeType && ae(e), - v = Y.get(e, "fxshow"); - for (r in n.queue || (null == (a = S._queueHooks(e, "fx")).unqueued && (a.unqueued = 0, s = a.empty.fire, a.empty.fire = function() { - a.unqueued || s() - }), a.unqueued++, p.always(function() { - p.always(function() { - a.unqueued--, - S.queue(e, "fx").length || a.empty.fire() - }) - })), t) - if (i = t[r], rt.test(i)) { - if (delete t[r], o = o || "toggle" === i, i === (g ? 
"hide" : "show")) { - if ("show" !== i || !v || void 0 === v[r]) - continue; - g = !0 - } - d[r] = v && v[r] || S.style(e, r) - } - if ((u = !S.isEmptyObject(t)) || !S.isEmptyObject(d)) - for (r in f && 1 === e.nodeType && (n.overflow = [h.overflow, h.overflowX, h.overflowY], null == (l = v && v.display) && (l = Y.get(e, "display")), "none" === (c = S.css(e, "display")) && (l ? c = l : (le([e], !0), l = e.style.display || l, c = S.css(e, "display"), le([e]))), ("inline" === c || "inline-block" === c && null != l) && "none" === S.css(e, "float") && (u || (p.done(function() { - h.display = l - }), null == l && (c = h.display, l = "none" === c ? "" : c)), h.display = "inline-block")), n.overflow && (h.overflow = "hidden", p.always(function() { - h.overflow = n.overflow[0], - h.overflowX = n.overflow[1], - h.overflowY = n.overflow[2] - })), u = !1, d) - u || (v ? "hidden" in v && (g = v.hidden) : v = Y.access(e, "fxshow", { - display: l - }), o && (v.hidden = !g), g && le([e], !0), p.done(function() { - for (r in g || le([e]), Y.remove(e, "fxshow"), d) - S.style(e, r, d[r]) - })), - u = ut(g ? v[r] : 0, r, p), - r in v || (v[r] = u.start, g && (u.end = u.start, u.start = 0)) - }], - prefilter: function(e, t) { - t ? lt.prefilters.unshift(e) : lt.prefilters.push(e) - } - }), - S.speed = function(e, t, n) { - var r = e && "object" == typeof e ? S.extend({}, e) : { - complete: n || !n && t || m(e) && e, - duration: e, - easing: n && t || t && !m(t) && t - }; - return S.fx.off ? r.duration = 0 : "number" != typeof r.duration && (r.duration in S.fx.speeds ? 
r.duration = S.fx.speeds[r.duration] : r.duration = S.fx.speeds._default), null != r.queue && !0 !== r.queue || (r.queue = "fx"), r.old = r.complete, r.complete = function() { - m(r.old) && r.old.call(this), - r.queue && S.dequeue(this, r.queue) - }, r - }, - S.fn.extend({ - fadeTo: function(e, t, n, r) { - return this.filter(ae).css("opacity", 0).show().end().animate({ - opacity: t - }, e, n, r) - }, - animate: function(t, e, n, r) { - var i = S.isEmptyObject(t), - o = S.speed(e, n, r), - a = function() { - var e = lt(this, S.extend({}, t), o); - (i || Y.get(this, "finish")) && e.stop(!0) - }; - return a.finish = a, i || !1 === o.queue ? this.each(a) : this.queue(o.queue, a) - }, - stop: function(i, e, o) { - var a = function(e) { - var t = e.stop; - delete e.stop, - t(o) - }; - return "string" != typeof i && (o = e, e = i, i = void 0), e && this.queue(i || "fx", []), this.each(function() { - var e = !0, - t = null != i && i + "queueHooks", - n = S.timers, - r = Y.get(this); - if (t) - r[t] && r[t].stop && a(r[t]); - else - for (t in r) - r[t] && r[t].stop && it.test(t) && a(r[t]); - for (t = n.length; t--;) - n[t].elem !== this || null != i && n[t].queue !== i || (n[t].anim.stop(o), e = !1, n.splice(t, 1)); - !e && o || S.dequeue(this, i) - }) - }, - finish: function(a) { - return !1 !== a && (a = a || "fx"), this.each(function() { - var e, - t = Y.get(this), - n = t[a + "queue"], - r = t[a + "queueHooks"], - i = S.timers, - o = n ? n.length : 0; - for (t.finish = !0, S.queue(this, a, []), r && r.stop && r.stop.call(this, !0), e = i.length; e--;) - i[e].elem === this && i[e].queue === a && (i[e].anim.stop(!0), i.splice(e, 1)); - for (e = 0; e < o; e++) - n[e] && n[e].finish && n[e].finish.call(this); - delete t.finish - }) - } - }), - S.each(["toggle", "show", "hide"], function(e, r) { - var i = S.fn[r]; - S.fn[r] = function(e, t, n) { - return null == e || "boolean" == typeof e ? 
i.apply(this, arguments) : this.animate(st(r, !0), e, t, n) - } - }), - S.each({ - slideDown: st("show"), - slideUp: st("hide"), - slideToggle: st("toggle"), - fadeIn: { - opacity: "show" - }, - fadeOut: { - opacity: "hide" - }, - fadeToggle: { - opacity: "toggle" - } - }, function(e, r) { - S.fn[e] = function(e, t, n) { - return this.animate(r, e, t, n) - } - }), - S.timers = [], - S.fx.tick = function() { - var e, - t = 0, - n = S.timers; - for (Ze = Date.now(); t < n.length; t++) - (e = n[t])() || n[t] !== e || n.splice(t--, 1); - n.length || S.fx.stop(), - Ze = void 0 - }, - S.fx.timer = function(e) { - S.timers.push(e), - S.fx.start() - }, - S.fx.interval = 13, - S.fx.start = function() { - et || (et = !0, ot()) - }, - S.fx.stop = function() { - et = null - }, - S.fx.speeds = { - slow: 600, - fast: 200, - _default: 400 - }, - S.fn.delay = function(r, e) { - return r = S.fx && S.fx.speeds[r] || r, e = e || "fx", this.queue(e, function(e, t) { - var n = C.setTimeout(e, r); - t.stop = function() { - C.clearTimeout(n) - } - }) - }, - tt = E.createElement("input"), - nt = E.createElement("select").appendChild(E.createElement("option")), - tt.type = "checkbox", - y.checkOn = "" !== tt.value, - y.optSelected = nt.selected, - (tt = E.createElement("input")).value = "t", - tt.type = "radio", - y.radioValue = "t" === tt.value; - var ct, - ft = S.expr.attrHandle; - S.fn.extend({ - attr: function(e, t) { - return $(this, S.attr, e, t, 1 < arguments.length) - }, - removeAttr: function(e) { - return this.each(function() { - S.removeAttr(this, e) - }) - } - }), - S.extend({ - attr: function(e, t, n) { - var r, - i, - o = e.nodeType; - if (3 !== o && 8 !== o && 2 !== o) - return "undefined" == typeof e.getAttribute ? S.prop(e, t, n) : (1 === o && S.isXMLDoc(e) || (i = S.attrHooks[t.toLowerCase()] || (S.expr.match.bool.test(t) ? ct : void 0)), void 0 !== n ? null === n ? void S.removeAttr(e, t) : i && "set" in i && void 0 !== (r = i.set(e, n, t)) ? 
r : (e.setAttribute(t, n + ""), n) : i && "get" in i && null !== (r = i.get(e, t)) ? r : null == (r = S.find.attr(e, t)) ? void 0 : r) - }, - attrHooks: { - type: { - set: function(e, t) { - if (!y.radioValue && "radio" === t && A(e, "input")) { - var n = e.value; - return e.setAttribute("type", t), n && (e.value = n), t - } - } - } - }, - removeAttr: function(e, t) { - var n, - r = 0, - i = t && t.match(P); - if (i && 1 === e.nodeType) - while (n = i[r++]) - e.removeAttribute(n) - } - }), - ct = { - set: function(e, t, n) { - return !1 === t ? S.removeAttr(e, n) : e.setAttribute(n, n), n - } - }, - S.each(S.expr.match.bool.source.match(/\w+/g), function(e, t) { - var a = ft[t] || S.find.attr; - ft[t] = function(e, t, n) { - var r, - i, - o = t.toLowerCase(); - return n || (i = ft[o], ft[o] = r, r = null != a(e, t, n) ? o : null, ft[o] = i), r - } - }); - var pt = /^(?:input|select|textarea|button)$/i, - dt = /^(?:a|area)$/i; - function ht(e) { - return (e.match(P) || []).join(" ") - } - function gt(e) { - return e.getAttribute && e.getAttribute("class") || "" - } - function vt(e) { - return Array.isArray(e) ? e : "string" == typeof e && e.match(P) || [] - } - S.fn.extend({ - prop: function(e, t) { - return $(this, S.prop, e, t, 1 < arguments.length) - }, - removeProp: function(e) { - return this.each(function() { - delete this[S.propFix[e] || e] - }) - } - }), - S.extend({ - prop: function(e, t, n) { - var r, - i, - o = e.nodeType; - if (3 !== o && 8 !== o && 2 !== o) - return 1 === o && S.isXMLDoc(e) || (t = S.propFix[t] || t, i = S.propHooks[t]), void 0 !== n ? i && "set" in i && void 0 !== (r = i.set(e, n, t)) ? r : e[t] = n : i && "get" in i && null !== (r = i.get(e, t)) ? r : e[t] - }, - propHooks: { - tabIndex: { - get: function(e) { - var t = S.find.attr(e, "tabindex"); - return t ? parseInt(t, 10) : pt.test(e.nodeName) || dt.test(e.nodeName) && e.href ? 
0 : -1 - } - } - }, - propFix: { - "for": "htmlFor", - "class": "className" - } - }), - y.optSelected || (S.propHooks.selected = { - get: function(e) { - var t = e.parentNode; - return t && t.parentNode && t.parentNode.selectedIndex, null - }, - set: function(e) { - var t = e.parentNode; - t && (t.selectedIndex, t.parentNode && t.parentNode.selectedIndex) - } - }), - S.each(["tabIndex", "readOnly", "maxLength", "cellSpacing", "cellPadding", "rowSpan", "colSpan", "useMap", "frameBorder", "contentEditable"], function() { - S.propFix[this.toLowerCase()] = this - }), - S.fn.extend({ - addClass: function(t) { - var e, - n, - r, - i, - o, - a, - s, - u = 0; - if (m(t)) - return this.each(function(e) { - S(this).addClass(t.call(this, e, gt(this))) - }); - if ((e = vt(t)).length) - while (n = this[u++]) - if (i = gt(n), r = 1 === n.nodeType && " " + ht(i) + " ") { - a = 0; - while (o = e[a++]) - r.indexOf(" " + o + " ") < 0 && (r += o + " "); - i !== (s = ht(r)) && n.setAttribute("class", s) - } - return this - }, - removeClass: function(t) { - var e, - n, - r, - i, - o, - a, - s, - u = 0; - if (m(t)) - return this.each(function(e) { - S(this).removeClass(t.call(this, e, gt(this))) - }); - if (!arguments.length) - return this.attr("class", ""); - if ((e = vt(t)).length) - while (n = this[u++]) - if (i = gt(n), r = 1 === n.nodeType && " " + ht(i) + " ") { - a = 0; - while (o = e[a++]) - while (-1 < r.indexOf(" " + o + " ")) - r = r.replace(" " + o + " ", " "); - i !== (s = ht(r)) && n.setAttribute("class", s) - } - return this - }, - toggleClass: function(i, t) { - var o = typeof i, - a = "string" === o || Array.isArray(i); - return "boolean" == typeof t && a ? t ? this.addClass(i) : this.removeClass(i) : m(i) ? this.each(function(e) { - S(this).toggleClass(i.call(this, e, gt(this), t), t) - }) : this.each(function() { - var e, - t, - n, - r; - if (a) { - t = 0, - n = S(this), - r = vt(i); - while (e = r[t++]) - n.hasClass(e) ? 
n.removeClass(e) : n.addClass(e) - } else - void 0 !== i && "boolean" !== o || ((e = gt(this)) && Y.set(this, "__className__", e), this.setAttribute && this.setAttribute("class", e || !1 === i ? "" : Y.get(this, "__className__") || "")) - }) - }, - hasClass: function(e) { - var t, - n, - r = 0; - t = " " + e + " "; - while (n = this[r++]) - if (1 === n.nodeType && -1 < (" " + ht(gt(n)) + " ").indexOf(t)) - return !0; - return !1 - } - }); - var yt = /\r/g; - S.fn.extend({ - val: function(n) { - var r, - e, - i, - t = this[0]; - return arguments.length ? (i = m(n), this.each(function(e) { - var t; - 1 === this.nodeType && (null == (t = i ? n.call(this, e, S(this).val()) : n) ? t = "" : "number" == typeof t ? t += "" : Array.isArray(t) && (t = S.map(t, function(e) { - return null == e ? "" : e + "" - })), (r = S.valHooks[this.type] || S.valHooks[this.nodeName.toLowerCase()]) && "set" in r && void 0 !== r.set(this, t, "value") || (this.value = t)) - })) : t ? (r = S.valHooks[t.type] || S.valHooks[t.nodeName.toLowerCase()]) && "get" in r && void 0 !== (e = r.get(t, "value")) ? e : "string" == typeof (e = t.value) ? e.replace(yt, "") : null == e ? "" : e : void 0 - } - }), - S.extend({ - valHooks: { - option: { - get: function(e) { - var t = S.find.attr(e, "value"); - return null != t ? t : ht(S.text(e)) - } - }, - select: { - get: function(e) { - var t, - n, - r, - i = e.options, - o = e.selectedIndex, - a = "select-one" === e.type, - s = a ? null : [], - u = a ? o + 1 : i.length; - for (r = o < 0 ? u : a ? 
o : 0; r < u; r++) - if (((n = i[r]).selected || r === o) && !n.disabled && (!n.parentNode.disabled || !A(n.parentNode, "optgroup"))) { - if (t = S(n).val(), a) - return t; - s.push(t) - } - return s - }, - set: function(e, t) { - var n, - r, - i = e.options, - o = S.makeArray(t), - a = i.length; - while (a--) - ((r = i[a]).selected = -1 < S.inArray(S.valHooks.option.get(r), o)) && (n = !0); - return n || (e.selectedIndex = -1), o - } - } - } - }), - S.each(["radio", "checkbox"], function() { - S.valHooks[this] = { - set: function(e, t) { - if (Array.isArray(t)) - return e.checked = -1 < S.inArray(S(e).val(), t) - } - }, - y.checkOn || (S.valHooks[this].get = function(e) { - return null === e.getAttribute("value") ? "on" : e.value - }) - }), - y.focusin = "onfocusin" in C; - var mt = /^(?:focusinfocus|focusoutblur)$/, - xt = function(e) { - e.stopPropagation() - }; - S.extend(S.event, { - trigger: function(e, t, n, r) { - var i, - o, - a, - s, - u, - l, - c, - f, - p = [n || E], - d = v.call(e, "type") ? e.type : e, - h = v.call(e, "namespace") ? e.namespace.split(".") : []; - if (o = f = a = n = n || E, 3 !== n.nodeType && 8 !== n.nodeType && !mt.test(d + S.event.triggered) && (-1 < d.indexOf(".") && (d = (h = d.split(".")).shift(), h.sort()), u = d.indexOf(":") < 0 && "on" + d, (e = e[S.expando] ? e : new S.Event(d, "object" == typeof e && e)).isTrigger = r ? 2 : 3, e.namespace = h.join("."), e.rnamespace = e.namespace ? new RegExp("(^|\\.)" + h.join("\\.(?:.*\\.|)") + "(\\.|$)") : null, e.result = void 0, e.target || (e.target = n), t = null == t ? 
[e] : S.makeArray(t, [e]), c = S.event.special[d] || {}, r || !c.trigger || !1 !== c.trigger.apply(n, t))) { - if (!r && !c.noBubble && !x(n)) { - for (s = c.delegateType || d, mt.test(s + d) || (o = o.parentNode); o; o = o.parentNode) - p.push(o), - a = o; - a === (n.ownerDocument || E) && p.push(a.defaultView || a.parentWindow || C) - } - i = 0; - while ((o = p[i++]) && !e.isPropagationStopped()) - f = o, - e.type = 1 < i ? s : c.bindType || d, - (l = (Y.get(o, "events") || Object.create(null))[e.type] && Y.get(o, "handle")) && l.apply(o, t), - (l = u && o[u]) && l.apply && V(o) && (e.result = l.apply(o, t), !1 === e.result && e.preventDefault()); - return e.type = d, r || e.isDefaultPrevented() || c._default && !1 !== c._default.apply(p.pop(), t) || !V(n) || u && m(n[d]) && !x(n) && ((a = n[u]) && (n[u] = null), S.event.triggered = d, e.isPropagationStopped() && f.addEventListener(d, xt), n[d](), e.isPropagationStopped() && f.removeEventListener(d, xt), S.event.triggered = void 0, a && (n[u] = a)), e.result - } - }, - simulate: function(e, t, n) { - var r = S.extend(new S.Event, n, { - type: e, - isSimulated: !0 - }); - S.event.trigger(r, null, t) - } - }), - S.fn.extend({ - trigger: function(e, t) { - return this.each(function() { - S.event.trigger(e, t, this) - }) - }, - triggerHandler: function(e, t) { - var n = this[0]; - if (n) - return S.event.trigger(e, t, n, !0) - } - }), - y.focusin || S.each({ - focus: "focusin", - blur: "focusout" - }, function(n, r) { - var i = function(e) { - S.event.simulate(r, e.target, S.event.fix(e)) - }; - S.event.special[r] = { - setup: function() { - var e = this.ownerDocument || this.document || this, - t = Y.access(e, r); - t || e.addEventListener(n, i, !0), - Y.access(e, r, (t || 0) + 1) - }, - teardown: function() { - var e = this.ownerDocument || this.document || this, - t = Y.access(e, r) - 1; - t ? 
Y.access(e, r, t) : (e.removeEventListener(n, i, !0), Y.remove(e, r)) - } - } - }); - var bt = C.location, - wt = { - guid: Date.now() - }, - Tt = /\?/; - S.parseXML = function(e) { - var t, - n; - if (!e || "string" != typeof e) - return null; - try { - t = (new C.DOMParser).parseFromString(e, "text/xml") - } catch (e) {} - return n = t && t.getElementsByTagName("parsererror")[0], t && !n || S.error("Invalid XML: " + (n ? S.map(n.childNodes, function(e) { - return e.textContent - }).join("\n") : e)), t - }; - var Ct = /\[\]$/, - Et = /\r?\n/g, - St = /^(?:submit|button|image|reset|file)$/i, - kt = /^(?:input|select|textarea|keygen)/i; - function At(n, e, r, i) { - var t; - if (Array.isArray(e)) - S.each(e, function(e, t) { - r || Ct.test(n) ? i(n, t) : At(n + "[" + ("object" == typeof t && null != t ? e : "") + "]", t, r, i) - }); - else if (r || "object" !== w(e)) - i(n, e); - else - for (t in e) - At(n + "[" + t + "]", e[t], r, i) - } - S.param = function(e, t) { - var n, - r = [], - i = function(e, t) { - var n = m(t) ? t() : t; - r[r.length] = encodeURIComponent(e) + "=" + encodeURIComponent(null == n ? "" : n) - }; - if (null == e) - return ""; - if (Array.isArray(e) || e.jquery && !S.isPlainObject(e)) - S.each(e, function() { - i(this.name, this.value) - }); - else - for (n in e) - At(n, e[n], t, i); - return r.join("&") - }, - S.fn.extend({ - serialize: function() { - return S.param(this.serializeArray()) - }, - serializeArray: function() { - return this.map(function() { - var e = S.prop(this, "elements"); - return e ? S.makeArray(e) : this - }).filter(function() { - var e = this.type; - return this.name && !S(this).is(":disabled") && kt.test(this.nodeName) && !St.test(e) && (this.checked || !pe.test(e)) - }).map(function(e, t) { - var n = S(this).val(); - return null == n ? null : Array.isArray(n) ? 
S.map(n, function(e) { - return { - name: t.name, - value: e.replace(Et, "\r\n") - } - }) : { - name: t.name, - value: n.replace(Et, "\r\n") - } - }).get() - } - }); - var Nt = /%20/g, - jt = /#.*$/, - Dt = /([?&])_=[^&]*/, - qt = /^(.*?):[ \t]*([^\r\n]*)$/gm, - Lt = /^(?:GET|HEAD)$/, - Ht = /^\/\//, - Ot = {}, - Pt = {}, - Rt = "*/".concat("*"), - Mt = E.createElement("a"); - function It(o) { - return function(e, t) { - "string" != typeof e && (t = e, e = "*"); - var n, - r = 0, - i = e.toLowerCase().match(P) || []; - if (m(t)) - while (n = i[r++]) - "+" === n[0] ? (n = n.slice(1) || "*", (o[n] = o[n] || []).unshift(t)) : (o[n] = o[n] || []).push(t) - } - } - function Wt(t, i, o, a) { - var s = {}, - u = t === Pt; - function l(e) { - var r; - return s[e] = !0, S.each(t[e] || [], function(e, t) { - var n = t(i, o, a); - return "string" != typeof n || u || s[n] ? u ? !(r = n) : void 0 : (i.dataTypes.unshift(n), l(n), !1) - }), r - } - return l(i.dataTypes[0]) || !s["*"] && l("*") - } - function Ft(e, t) { - var n, - r, - i = S.ajaxSettings.flatOptions || {}; - for (n in t) - void 0 !== t[n] && ((i[n] ? 
e : r || (r = {}))[n] = t[n]); - return r && S.extend(!0, e, r), e - } - Mt.href = bt.href, - S.extend({ - active: 0, - lastModified: {}, - etag: {}, - ajaxSettings: { - url: bt.href, - type: "GET", - isLocal: /^(?:about|app|app-storage|.+-extension|file|res|widget):$/.test(bt.protocol), - global: !0, - processData: !0, - async: !0, - contentType: "application/x-www-form-urlencoded; charset=UTF-8", - accepts: { - "*": Rt, - text: "text/plain", - html: "text/html", - xml: "application/xml, text/xml", - json: "application/json, text/javascript" - }, - contents: { - xml: /\bxml\b/, - html: /\bhtml/, - json: /\bjson\b/ - }, - responseFields: { - xml: "responseXML", - text: "responseText", - json: "responseJSON" - }, - converters: { - "* text": String, - "text html": !0, - "text json": JSON.parse, - "text xml": S.parseXML - }, - flatOptions: { - url: !0, - context: !0 - } - }, - ajaxSetup: function(e, t) { - return t ? Ft(Ft(e, S.ajaxSettings), t) : Ft(S.ajaxSettings, e) - }, - ajaxPrefilter: It(Ot), - ajaxTransport: It(Pt), - ajax: function(e, t) { - "object" == typeof e && (t = e, e = void 0), - t = t || {}; - var c, - f, - p, - n, - d, - r, - h, - g, - i, - o, - v = S.ajaxSetup({}, t), - y = v.context || v, - m = v.context && (y.nodeType || y.jquery) ? S(y) : S.event, - x = S.Deferred(), - b = S.Callbacks("once memory"), - w = v.statusCode || {}, - a = {}, - s = {}, - u = "canceled", - T = { - readyState: 0, - getResponseHeader: function(e) { - var t; - if (h) { - if (!n) { - n = {}; - while (t = qt.exec(p)) - n[t[1].toLowerCase() + " "] = (n[t[1].toLowerCase() + " "] || []).concat(t[2]) - } - t = n[e.toLowerCase() + " "] - } - return null == t ? null : t.join(", ") - }, - getAllResponseHeaders: function() { - return h ? 
p : null - }, - setRequestHeader: function(e, t) { - return null == h && (e = s[e.toLowerCase()] = s[e.toLowerCase()] || e, a[e] = t), this - }, - overrideMimeType: function(e) { - return null == h && (v.mimeType = e), this - }, - statusCode: function(e) { - var t; - if (e) - if (h) - T.always(e[T.status]); - else - for (t in e) - w[t] = [w[t], e[t]]; - return this - }, - abort: function(e) { - var t = e || u; - return c && c.abort(t), l(0, t), this - } - }; - if (x.promise(T), v.url = ((e || v.url || bt.href) + "").replace(Ht, bt.protocol + "//"), v.type = t.method || t.type || v.method || v.type, v.dataTypes = (v.dataType || "*").toLowerCase().match(P) || [""], null == v.crossDomain) { - r = E.createElement("a"); - try { - r.href = v.url, - r.href = r.href, - v.crossDomain = Mt.protocol + "//" + Mt.host != r.protocol + "//" + r.host - } catch (e) { - v.crossDomain = !0 - } - } - if (v.data && v.processData && "string" != typeof v.data && (v.data = S.param(v.data, v.traditional)), Wt(Ot, v, t, T), h) - return T; - for (i in (g = S.event && v.global) && 0 == S.active++ && S.event.trigger("ajaxStart"), v.type = v.type.toUpperCase(), v.hasContent = !Lt.test(v.type), f = v.url.replace(jt, ""), v.hasContent ? v.data && v.processData && 0 === (v.contentType || "").indexOf("application/x-www-form-urlencoded") && (v.data = v.data.replace(Nt, "+")) : (o = v.url.slice(f.length), v.data && (v.processData || "string" == typeof v.data) && (f += (Tt.test(f) ? "&" : "?") + v.data, delete v.data), !1 === v.cache && (f = f.replace(Dt, "$1"), o = (Tt.test(f) ? "&" : "?") + "_=" + wt.guid++ + o), v.url = f + o), v.ifModified && (S.lastModified[f] && T.setRequestHeader("If-Modified-Since", S.lastModified[f]), S.etag[f] && T.setRequestHeader("If-None-Match", S.etag[f])), (v.data && v.hasContent && !1 !== v.contentType || t.contentType) && T.setRequestHeader("Content-Type", v.contentType), T.setRequestHeader("Accept", v.dataTypes[0] && v.accepts[v.dataTypes[0]] ? 
v.accepts[v.dataTypes[0]] + ("*" !== v.dataTypes[0] ? ", " + Rt + "; q=0.01" : "") : v.accepts["*"]), v.headers) - T.setRequestHeader(i, v.headers[i]); - if (v.beforeSend && (!1 === v.beforeSend.call(y, T, v) || h)) - return T.abort(); - if (u = "abort", b.add(v.complete), T.done(v.success), T.fail(v.error), c = Wt(Pt, v, t, T)) { - if (T.readyState = 1, g && m.trigger("ajaxSend", [T, v]), h) - return T; - v.async && 0 < v.timeout && (d = C.setTimeout(function() { - T.abort("timeout") - }, v.timeout)); - try { - h = !1, - c.send(a, l) - } catch (e) { - if (h) - throw e; - l(-1, e) - } - } else - l(-1, "No Transport"); - function l(e, t, n, r) { - var i, - o, - a, - s, - u, - l = t; - h || (h = !0, d && C.clearTimeout(d), c = void 0, p = r || "", T.readyState = 0 < e ? 4 : 0, i = 200 <= e && e < 300 || 304 === e, n && (s = function(e, t, n) { - var r, - i, - o, - a, - s = e.contents, - u = e.dataTypes; - while ("*" === u[0]) - u.shift(), - void 0 === r && (r = e.mimeType || t.getResponseHeader("Content-Type")); - if (r) - for (i in s) - if (s[i] && s[i].test(r)) { - u.unshift(i); - break - } - if (u[0] in n) - o = u[0]; - else { - for (i in n) { - if (!u[0] || e.converters[i + " " + u[0]]) { - o = i; - break - } - a || (a = i) - } - o = o || a - } - if (o) - return o !== u[0] && u.unshift(o), n[o] - }(v, T, n)), !i && -1 < S.inArray("script", v.dataTypes) && S.inArray("json", v.dataTypes) < 0 && (v.converters["text script"] = function() {}), s = function(e, t, n, r) { - var i, - o, - a, - s, - u, - l = {}, - c = e.dataTypes.slice(); - if (c[1]) - for (a in e.converters) - l[a.toLowerCase()] = e.converters[a]; - o = c.shift(); - while (o) - if (e.responseFields[o] && (n[e.responseFields[o]] = t), !u && r && e.dataFilter && (t = e.dataFilter(t, e.dataType)), u = o, o = c.shift()) - if ("*" === o) - o = u; - else if ("*" !== u && u !== o) { - if (!(a = l[u + " " + o] || l["* " + o])) - for (i in l) - if ((s = i.split(" "))[1] === o && (a = l[u + " " + s[0]] || l["* " + 
s[0]])) { - !0 === a ? a = l[i] : !0 !== l[i] && (o = s[0], c.unshift(s[1])); - break - } - if (!0 !== a) - if (a && e["throws"]) - t = a(t); - else - try { - t = a(t) - } catch (e) { - return { - state: "parsererror", - error: a ? e : "No conversion from " + u + " to " + o - } - } - } - return { - state: "success", - data: t - } - }(v, s, T, i), i ? (v.ifModified && ((u = T.getResponseHeader("Last-Modified")) && (S.lastModified[f] = u), (u = T.getResponseHeader("etag")) && (S.etag[f] = u)), 204 === e || "HEAD" === v.type ? l = "nocontent" : 304 === e ? l = "notmodified" : (l = s.state, o = s.data, i = !(a = s.error))) : (a = l, !e && l || (l = "error", e < 0 && (e = 0))), T.status = e, T.statusText = (t || l) + "", i ? x.resolveWith(y, [o, l, T]) : x.rejectWith(y, [T, l, a]), T.statusCode(w), w = void 0, g && m.trigger(i ? "ajaxSuccess" : "ajaxError", [T, v, i ? o : a]), b.fireWith(y, [T, l]), g && (m.trigger("ajaxComplete", [T, v]), --S.active || S.event.trigger("ajaxStop"))) - } - return T - }, - getJSON: function(e, t, n) { - return S.get(e, t, n, "json") - }, - getScript: function(e, t) { - return S.get(e, void 0, t, "script") - } - }), - S.each(["get", "post"], function(e, i) { - S[i] = function(e, t, n, r) { - return m(t) && (r = r || n, n = t, t = void 0), S.ajax(S.extend({ - url: e, - type: i, - dataType: r, - data: t, - success: n - }, S.isPlainObject(e) && e)) - } - }), - S.ajaxPrefilter(function(e) { - var t; - for (t in e.headers) - "content-type" === t.toLowerCase() && (e.contentType = e.headers[t] || "") - }), - S._evalUrl = function(e, t, n) { - return S.ajax({ - url: e, - type: "GET", - dataType: "script", - cache: !0, - async: !1, - global: !1, - converters: { - "text script": function() {} - }, - dataFilter: function(e) { - S.globalEval(e, t, n) - } - }) - }, - S.fn.extend({ - wrapAll: function(e) { - var t; - return this[0] && (m(e) && (e = e.call(this[0])), t = S(e, this[0].ownerDocument).eq(0).clone(!0), this[0].parentNode && 
t.insertBefore(this[0]), t.map(function() { - var e = this; - while (e.firstElementChild) - e = e.firstElementChild; - return e - }).append(this)), this - }, - wrapInner: function(n) { - return m(n) ? this.each(function(e) { - S(this).wrapInner(n.call(this, e)) - }) : this.each(function() { - var e = S(this), - t = e.contents(); - t.length ? t.wrapAll(n) : e.append(n) - }) - }, - wrap: function(t) { - var n = m(t); - return this.each(function(e) { - S(this).wrapAll(n ? t.call(this, e) : t) - }) - }, - unwrap: function(e) { - return this.parent(e).not("body").each(function() { - S(this).replaceWith(this.childNodes) - }), this - } - }), - S.expr.pseudos.hidden = function(e) { - return !S.expr.pseudos.visible(e) - }, - S.expr.pseudos.visible = function(e) { - return !!(e.offsetWidth || e.offsetHeight || e.getClientRects().length) - }, - S.ajaxSettings.xhr = function() { - try { - return new C.XMLHttpRequest - } catch (e) {} - }; - var Bt = { - 0: 200, - 1223: 204 - }, - $t = S.ajaxSettings.xhr(); - y.cors = !!$t && "withCredentials" in $t, - y.ajax = $t = !!$t, - S.ajaxTransport(function(i) { - var o, - a; - if (y.cors || $t && !i.crossDomain) - return { - send: function(e, t) { - var n, - r = i.xhr(); - if (r.open(i.type, i.url, i.async, i.username, i.password), i.xhrFields) - for (n in i.xhrFields) - r[n] = i.xhrFields[n]; - for (n in i.mimeType && r.overrideMimeType && r.overrideMimeType(i.mimeType), i.crossDomain || e["X-Requested-With"] || (e["X-Requested-With"] = "XMLHttpRequest"), e) - r.setRequestHeader(n, e[n]); - o = function(e) { - return function() { - o && (o = a = r.onload = r.onerror = r.onabort = r.ontimeout = r.onreadystatechange = null, "abort" === e ? r.abort() : "error" === e ? "number" != typeof r.status ? t(0, "error") : t(r.status, r.statusText) : t(Bt[r.status] || r.status, r.statusText, "text" !== (r.responseType || "text") || "string" != typeof r.responseText ? 
{ - binary: r.response - } : { - text: r.responseText - }, r.getAllResponseHeaders())) - } - }, - r.onload = o(), - a = r.onerror = r.ontimeout = o("error"), - void 0 !== r.onabort ? r.onabort = a : r.onreadystatechange = function() { - 4 === r.readyState && C.setTimeout(function() { - o && a() - }) - }, - o = o("abort"); - try { - r.send(i.hasContent && i.data || null) - } catch (e) { - if (o) - throw e - } - }, - abort: function() { - o && o() - } - } - }), - S.ajaxPrefilter(function(e) { - e.crossDomain && (e.contents.script = !1) - }), - S.ajaxSetup({ - accepts: { - script: "text/javascript, application/javascript, application/ecmascript, application/x-ecmascript" - }, - contents: { - script: /\b(?:java|ecma)script\b/ - }, - converters: { - "text script": function(e) { - return S.globalEval(e), e - } - } - }), - S.ajaxPrefilter("script", function(e) { - void 0 === e.cache && (e.cache = !1), - e.crossDomain && (e.type = "GET") - }), - S.ajaxTransport("script", function(n) { - var r, - i; - if (n.crossDomain || n.scriptAttrs) - return { - send: function(e, t) { - r = S("