parquet-converter committed
Commit 658a70c
1 Parent(s): 753f4ff

Update parquet files (step 38 of 397)

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. spaces/1acneusushi/gradio-2dmoleculeeditor/data/Download program decodare casetofoane auto Learn how to decode your car stereo with this simple program.md +0 -130
  2. spaces/1gistliPinn/ChatGPT4/Examples/Ace Attorney Cracked Ipa Apps Discover the Amazing Story and Characters of the Game Series.md +0 -18
  3. spaces/1gistliPinn/ChatGPT4/Examples/FULLIObitDriverBoosterPro7426810Crack.md +0 -6
  4. spaces/1gistliPinn/ChatGPT4/Examples/Free Download Ekahau Site Survey.md +0 -29
  5. spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Buy and Sell Anything with Aladdin.az the Leading E-commerce Platform in Azerbaijan.md +0 -207
  6. spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Download Clash of Clans and Join the Epic Clan Wars.md +0 -176
  7. spaces/1phancelerku/anime-remove-background/Car Driving School Simulator APK Drive Around the World in 9 Different Maps.md +0 -81
  8. spaces/1phancelerku/anime-remove-background/Christopher Martin - Let Her Go (Lyrics Video) - MP3 Download.md +0 -101
  9. spaces/1phancelerku/anime-remove-background/Dr. Driving for PC How to Download and Play the Best Racing Game on Your Computer.md +0 -131
  10. spaces/1phancelerku/anime-remove-background/Fly Anywhere in the World with Flight Simulator Download.md +0 -155
  11. spaces/801artistry/RVC801/configs/config.py +0 -265
  12. spaces/A666sxr/Genshin_TTS/text/english.py +0 -171
  13. spaces/AIConsultant/MusicGen/audiocraft/environment.py +0 -176
  14. spaces/AIGC-Audio/AudioGPT/NeuralSeq/modules/commons/transformer.py +0 -747
  15. spaces/AIGC-Audio/AudioGPT/NeuralSeq/utils/multiprocess_utils.py +0 -54
  16. spaces/AIGC-Audio/AudioGPT/text_to_speech/modules/tts/diffspeech/net.py +0 -110
  17. spaces/AIGText/GlyphControl/ldm/models/diffusion/dpm_solver/sampler.py +0 -87
  18. spaces/AgentVerse/agentVerse/agentverse/environments/base.py +0 -58
  19. spaces/AgentVerse/agentVerse/ui/src/phaser3-rex-plugins/templates/spinner/dots/Dots.js +0 -54
  20. spaces/AgentVerse/agentVerse/ui/src/phaser3-rex-plugins/templates/ui/fixwidthsizer/GetNearestChildIndex.js +0 -41
  21. spaces/Alven/background-remover/app.py +0 -127
  22. spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/en/api/attnprocessor.md +0 -42
  23. spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/en/using-diffusers/pipeline_overview.md +0 -17
  24. spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/ko/using-diffusers/write_own_pipeline.md +0 -290
  25. spaces/Andy1621/uniformer_image_detection/configs/_base_/datasets/wider_face.py +0 -63
  26. spaces/Andy1621/uniformer_image_detection/configs/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco.py +0 -10
  27. spaces/Andy1621/uniformer_image_detection/tools/dist_train.sh +0 -9
  28. spaces/Andy1621/uniformer_image_segmentation/configs/dnlnet/dnl_r50-d8_512x512_80k_ade20k.py +0 -6
  29. spaces/Andy1621/uniformer_image_segmentation/configs/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes.py +0 -23
  30. spaces/AnimalEquality/chatbot/scripts/nbdev_readme_patch_hface.sh +0 -18
  31. spaces/AsakuraMizu/moe-tts/text/sanskrit.py +0 -62
  32. spaces/Ataturk-Chatbot/HuggingFaceChat/venv/lib/python3.11/site-packages/pkg_resources/_vendor/jaraco/__init__.py +0 -0
  33. spaces/Atualli/node-media-server/Dockerfile +0 -13
  34. spaces/Awesimo/jojogan/e4e/models/discriminator.py +0 -20
  35. spaces/BetterAPI/BetterChat_new/src/routes/conversation/+server.ts +0 -57
  36. spaces/Big-Web/MMSD/env/Lib/site-packages/setuptools/_distutils/msvccompiler.py +0 -695
  37. spaces/Brij1808/Blog_Generator/app.py +0 -20
  38. spaces/CVPR/LIVE/pybind11/tests/test_class.py +0 -333
  39. spaces/CVPR/LIVE/pybind11/tests/test_embed/test_interpreter.cpp +0 -284
  40. spaces/CVPR/LIVE/thrust/cub/cmake/cub-config.cmake +0 -62
  41. spaces/CVPR/LIVE/thrust/thrust/detail/caching_allocator.h +0 -45
  42. spaces/CVPR/WALT/mmdet/core/mask/structures.py +0 -1042
  43. spaces/CVPR/WALT/mmdet/models/detectors/trident_faster_rcnn.py +0 -66
  44. spaces/Corran/qnagenerator/app.py +0 -48
  45. spaces/DHEIVER/Kidney_Image_Classifier/app.py +0 -29
  46. spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/templates/frontend/assets/index-ec39e521.css +0 -1
  47. spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/h11/tests/test_util.py +0 -112
  48. spaces/DanielPinsk/StableDiffusion/README.md +0 -13
  49. spaces/DaniilMIPT/greenatomtest/app.py +0 -111
  50. spaces/Dao3/MBTI_Test/app.py +0 -47
spaces/1acneusushi/gradio-2dmoleculeeditor/data/Download program decodare casetofoane auto Learn how to decode your car stereo with this simple program.md DELETED
@@ -1,130 +0,0 @@
1
-
2
- <br> - Benefits: List some advantages of using a program to decode car radios. <br> - Challenges: Mention some difficulties or risks of decoding car radios without a program. | | H2: How to find the best program decodare casetofoane auto for your car model? | - Criteria: Describe some features or factors to consider when choosing a program. <br> - Examples: Give some examples of popular or reliable programs for different car models. <br> - Tips: Provide some tips or warnings for using a program safely and effectively. | | H3: How to use a program decodare casetofoane auto to unlock your car radio? | - Steps: Explain the steps to follow to download, install and run a program. <br> - Troubleshooting: Suggest some solutions for common problems or errors that may occur. <br> - Alternatives: Mention some other ways to decode car radios if a program does not work. | | H4: Conclusion | - Summary: Summarize the main points of the article. <br> - Call to action: Invite the reader to try a program or contact you for more information. | **Table 2: Article with HTML formatting** ```html <h1>Download program decodare casetofoane auto: What is it and why do you need it?</h1>
3
- <p>If you own a car with a radio or a CD player, you may have encountered a situation where your device asks for a code after you disconnect the battery or change the car. This code is a security feature that prevents unauthorized use of your radio in case of theft or loss. However, if you don't have the code or you forget it, you may not be able to use your radio anymore.</p>
4
- <p>This is where a <strong>program decodare casetofoane auto</strong> comes in handy. This is a software tool that can help you generate or recover the code for your radio based on its serial number or model. By using such a program, you can unlock your radio and enjoy your music again.</p>
5
- <h2>Download program decodare casetofoane auto</h2><br /><p><b><b>Download Zip</b> &#10001; &#10001; &#10001; <a href="https://byltly.com/2uKxc2">https://byltly.com/2uKxc2</a></b></p><br /><br />
6
- <p>There are many benefits of using a program decodare casetofoane auto, such as:</p>
7
- <ul>
8
- <li>You can save time and money by avoiding going to a dealer or a service center to get your code.</li>
9
- <li>You can avoid losing your radio settings and presets by keeping your battery connected.</li>
10
- <li>You can use your radio in any car without worrying about compatibility issues.</li>
11
- <li>You can protect your radio from theft by changing the code periodically.</li>
12
- </ul>
13
- <p>However, there are also some challenges or risks of decoding your radio without a program, such as:</p>
14
- <ul>
15
- <li>You may damage your radio or void its warranty by trying to open it or modify it.</li>
16
- <li>You may enter the wrong code multiple times and lock your radio permanently.</li>
17
- <li>You may download a malicious or fake program that can harm your computer or steal your personal information.</li>
18
- <li>You may violate the law or the terms of service of your radio manufacturer by decoding your radio without authorization.</li>
19
- </ul>
20
- <p>Therefore, it is important to use a reliable and safe program decodare casetofoane auto that can help you unlock your radio without any hassle.</p>
21
- <h2>How to find the best program decodare casetofoane auto for your car model?</h2>
22
- <p>There are many programs decodare casetofoane auto available online, but not all of them are suitable for your car model. Some programs may only work for certain brands or models, while others may not work at all. To find the best program for your car model, you need to consider some features or factors, such as:</p>
23
- <ul>
24
- <li>The compatibility of the program with your car model and radio type.</li>
25
- <li>The accuracy and speed of the program in generating or recovering the code.</li>
26
- <li>The ease of use and installation of the program on your computer.</li>
27
- <li>The security and reputation of the program and its source website.</li>
28
- <li>The cost and availability of the program and its updates.</li>
29
- </ul>
30
- <p>To help you choose a good program decodare casetofoane auto, here are some examples of popular or reliable programs for different car models:</p>
31
- <table border="1">
32
- <tr><th>Car model</th><th>Radio type</th><th>Program name</th><th>Website</th></tr>
33
- <tr><td>Ford</td><td>M-Serial</td><td>Ford M-Serial Radio Code Decoder</td><td><a href="https://fordradiocode.eu/">https://fordradiocode.eu/</a></td></tr>
34
- <tr><td>Renault</td><td>Clio, Megane, Scenic etc.</td><td>Renault Radio Code Generator</td><td><a href="https://renaultradiocode.com/">https://renaultradiocode.com/</a></td></tr>
35
- <tr><td>Volkswagen</td><td>RNS 310 / 315 / 510 etc.</td><td>VW Radio Code Calculator</td><td><a href="https://www.vw-radio-code.com/">https://www.vw-radio-code.com/</a></td></tr>
36
- <tr><td>Blaupunkt</td><td>Blaupunkt SC202 etc.</td><td>Blaupunkt Calculator</td><td><a href="https://www.decodari-casetofoane-auto.ro/">https://www.decodari-casetofoane-auto.ro/</a></td></tr>
37
- <tr><td>Philips</td><td>Philips Car 400 etc.</td><td>Philips Radio Code Generator</td><td><a href="https://philipsradiocode.com/">https://philipsradiocode.com/</a></td></tr>
38
- </table>
39
- <p>Here are some tips or warnings for using a program decodare casetofoane auto safely and effectively:</p>
40
- <ul>
41
- <li>Always check the source website of the program for reviews, ratings, feedbacks, certificates, etc.</li>
42
- <li>Always scan the downloaded file with an antivirus software before opening it.</li>
43
- <li>Always backup your data and create a restore point before installing or running the program.</li>
44
- <li>Always follow the instructions and guidelines provided by the program carefully.</li>
45
- <li>Always contact the support team or the developer of the program if you have any questions or issues.</li>
46
- </ul>
47
- <h3>How to use a program decodare casetofoane auto to unlock your car radio?</h3>
48
- <p>To use a program decodare casetofoane auto to unlock your car radio, you need to follow these steps:</p>
49
- <ol>
50
- <li>Find out the serial number and model of your radio. You can usually find them on a sticker on the side or back of the device, or on the screen after pressing certain buttons. You may also need to remove the radio from the dashboard to access them.</li>
51
- <li>Select and download a suitable program decodare casetofoane auto for your car model and radio type from a trusted website. Make sure it is compatible with your computer system and has good reviews.</li>
52
- <li>Install and run the program on your computer. Follow the instructions on how to connect your radio to your computer via USB cable, Bluetooth, Wi-Fi, etc. You may need to enter some information about your radio such as serial number, model, brand, etc.</li>
53
- <li>Wait for the program to scan your radio and generate or recover the code. This may take from few seconds to few minutes depending on the complexity of the algorithm and the speed of the connection.</li>
54
- <li>Enter the code on your radio using the buttons or knobs. Confirm and test if it works. If not, try again with another code or another program.</li>
55
- <li>Enjoy your music!</li>
56
- </ol>
57
- <p>If you encounter any problems or errors while using a program decodare casetofoane auto, here are some solutions you can try:</p>
58
- <p>How to download software for unlocking car radios<br />
59
- Best program for decoding car stereo codes<br />
60
- Download free program to decode car radio serial numbers<br />
61
- Where to find program for decodare casetofoane auto online<br />
62
- Program decodare casetofoane auto compatible with Windows 10<br />
63
- Download program decodare casetofoane auto for Android devices<br />
64
- Program decodare casetofoane auto with lifetime updates<br />
65
- How to use program decodare casetofoane auto step by step<br />
66
- Program decodare casetofoane auto reviews and ratings<br />
67
- Program decodare casetofoane auto download link and activation code<br />
68
- Program decodare casetofoane auto for Ford, Renault, Volkswagen, etc.<br />
69
- Program decodare casetofoane auto supported models and brands<br />
70
- Program decodare casetofoane auto troubleshooting and error messages<br />
71
- Program decodare casetofoane auto alternatives and competitors<br />
72
- Program decodare casetofoane auto discount and coupon codes<br />
73
- How to install program decodare casetofoane auto on your PC or laptop<br />
74
- How to transfer program decodare casetofoane auto to your smartphone or tablet<br />
75
- How to backup and restore program decodare casetofoane auto data and settings<br />
76
- How to uninstall program decodare casetofoane auto from your device<br />
77
- How to contact program decodare casetofoane auto customer support and service<br />
78
- Program decodare casetofoane auto FAQs and tips<br />
79
- Program decodare casetofoane auto features and benefits<br />
80
- Program decodare casetofoane auto testimonials and success stories<br />
81
- Program decodare casetofoane auto demo and trial version<br />
82
- Program decodare casetofoane auto license and terms of use<br />
83
- How to update program decodare casetofoane auto to the latest version<br />
84
- How to customize program decodare casetofoane auto settings and preferences<br />
85
- How to access program decodare casetofoane auto online dashboard and account<br />
86
- How to share program decodare casetofoane auto with your friends and family<br />
87
- How to get program decodare casetofoane auto for free or cheap<br />
88
- How to solve common problems with program decodare casetofoane auto<br />
89
- How to verify program decodare casetofoane auto authenticity and security<br />
90
- How to speed up program decodare casetofoane auto performance and efficiency<br />
91
- How to integrate program decodare casetofoane auto with other tools and apps<br />
92
- How to learn more about program decodare casetofoane auto functionality and usage<br />
93
- How to get the most out of program decodare casetofoane auto for your needs<br />
94
- How to find the best deal on program decodare casetofoane auto online or offline<br />
95
- How to compare program decodare casetofoane auto with other similar programs<br />
96
- How to avoid scams and frauds with program decodare casetofoane auto downloads<br />
97
- How to report bugs and issues with program decodare casetofoane auto software</p>
98
- <ul>
99
- <li>Check if you have entered the correct serial number and model of your radio. Make sure there are no typos or missing digits.</li>
100
- <li>Check if you have downloaded and installed the correct version of the program for your computer system and radio type. Make sure it is updated and compatible.</li>
101
- <li>Check if you have connected your radio and computer properly via USB cable, Bluetooth, Wi-Fi, etc. Make sure there are no loose wires or interferences.</li>
102
- <li>Check if you have followed the instructions and guidelines provided by the program carefully. Make sure you have entered all the required information correctly.</li>
103
- <li>Contact the support team or the developer of the program if you have any questions or issues.</li>
104
- </ul>
105
- <p>If a program decodare casetofoane auto does not work for your car model or radio type, or if you prefer not to use a program at all, here are some other ways to decode your car radio:</p>
106
- <ul>
107
- <li>You can contact your car dealer or service center and provide them with your radio serial number and model. They may be able to give you the code or unlock your radio for a fee.</li>
108
- <li>You can search online for websites or forums that offer free codes or solutions for decoding car radios. You may need to register or provide some information to access them.</li>
109
- <li>You can buy a code or a service from a third-party provider that specializes in decoding car radios. You may need to pay a fee or send your radio to them.</li>
110
- </ul>
111
- <h4>Conclusion</h4>
112
- <p>In conclusion, a program decodare casetofoane auto is a software tool that can help you unlock your car radio by generating or recovering the code based on its serial number or model. It can save you time and money, and allow you to use your radio in any car. However, you need to be careful and choose a reliable and safe program that is compatible with your car model and radio type. You also need to follow the instructions and guidelines provided by the program carefully. If a program does not work or if you prefer not to use one, you can try other alternatives such as contacting your dealer, searching online, or buying a code or a service.</p>
113
- <p>We hope this article has helped you understand what a program decodare casetofoane auto is and how to use it. If you have any questions or comments, please feel free to contact us. We would love to hear from you!</p>
114
- <h4>FAQs</h4>
115
- <p>Here are some frequently asked questions about program decodare casetofoane auto:</p>
116
- <ol>
117
- <li>Q: Is it legal to use a program decodare casetofoane auto?<br>
118
- A: It depends on the laws and regulations of your country and the terms of service of your radio manufacturer. In general, it is legal to use a program decodare casetofoane auto if you own the radio and have the right to use it. However, it may be illegal to use a program decodare casetofoane auto if you do not own the radio, if you have stolen it, or if you intend to sell it.</li>
119
- <li>Q: Is it safe to use a program decodare casetofoane auto?<br>
120
- A: It depends on the quality and security of the program and its source website. In general, it is safe to use a program decodare casetofoane auto if you download it from a trusted website, scan it with an antivirus software, backup your data, create a restore point, and follow the instructions carefully. However, it may be unsafe to use a program decodare casetofoane auto if you download it from an unknown website, open it without scanning it, install it without backing up your data, run it without creating a restore point, or enter incorrect information.</li>
121
- <li>Q: How much does it cost to use a program decodare casetofoane auto?<br>
122
- A: It depends on the type and source of the program. In general, it is free to use a program decodare casetofoane auto if you download it from a website that offers free codes or solutions for decoding car radios. However, it may cost some money to use a program decodare casetofoane auto if you download it from a website that charges a fee for codes or solutions for decoding car radios.</li>
123
- <li>Q: How long does it take to use a program decodare casetofoane auto?<br>
124
- A: It depends on the speed and complexity of the program and the connection. In general, it takes few seconds to few minutes to use a program decodare casetofoane auto if you have a fast and simple program and a good connection. However, it may take longer to use a program decodare casetofoane auto if you have a slow and complex program and a poor connection.</li>
125
- <li>Q: What are some examples of program decodare casetofoane auto?<br>
126
- A: Some examples of program decodare casetofoane auto are Ford M-Serial Radio Code Decoder for Ford radios with M-Serial numbers, Renault Radio Code Generator for Renault radios with Clio, Megane, Scenic models etc., VW Radio Code Calculator for Volkswagen radios with RNS 310 / 315 / 510 models etc., Blaupunkt Calculator for Blaupunkt radios with SC202 models etc., Philips Radio Code Generator for Philips radios with Car 400 models etc.</li>
127
- </ol>
128
- </p> 0a6ba089eb<br />
129
- <br />
130
- <br />
spaces/1gistliPinn/ChatGPT4/Examples/Ace Attorney Cracked Ipa Apps Discover the Amazing Story and Characters of the Game Series.md DELETED
@@ -1,18 +0,0 @@
1
- <br />
2
- <p><br><strong>Modded/Hacked App:</strong> Ace Attorney Trilogy HD by CAPCOM Co., Ltd<br><strong>Bundle ID:</strong> jp.co.capcom.gyakusaisetus<br><strong>iTunes Store Link:</strong> -attorney-trilogy-hd/id365681816?uo=4&at=1010lce4</p>
3
- <p>Wright thought back to the state of the murder weapon and recalled that the Time Keeper had been activated during the reception as part of the First Startup of Love. Wright recalled that this process involved two symbols of the bride and groom's love, and that Wyatt's pendant was supposed to be a component of the Time Keeper. He fitted the pendant into a spot near the bottom, inserted the Key of Love into the pendant, and turned it. Doing so opened the top of the Time Keeper to reveal a miniature scene of the happy couple, protected by a glass covering, which was cracked and stained with blood. Since everyone had been watching Sorin and Wyatt at the time, the only person with the opportunity for murder was Nichody. Panicked, Nichody began frantically operating on his FXR-UPR mech, saying he could repair anything, until he dropped his tools, cursing himself for having listened to Selena when she asked him to save Sorin instead of her. Clutching at his chest, Nichody collapsed.</p>
4
- <h2>ace attorney cracked ipa apps</h2><br /><p><b><b>Download</b> &rarr;&rarr;&rarr; <a href="https://imgfil.com/2uxZZO">https://imgfil.com/2uxZZO</a></b></p><br /><br />
5
- <p>Court Is Back In Session. Star as rookie defense attorney, Apollo Justice, as he visits crime scenes, questions key witnesses and collects vital evidence before stepping into the courtroom to prove his clients innocence.</p>
6
- <p>Features:<br />· All-new high-resolution graphics<br />· A new touch screen interface<br />· Interactive forensic testing mini-games that allow players to reveal hidden clues by dusting for prints, testing for traces of blood, and other exciting techniques.<br />· Two distinct gameplay segments:<br />o Investigation phase survey crime scenes, interview witnesses and gather forensic evidence that will be used in court<br />o Trial phase present findings from the investigation to support your case, listen to testimonies and examine witnesses<br />· An eclectic cast of characters:<br />o Apollo Justice: Stepping into the shoes of Phoenix Wright, the new rookie attorney leads the series into an exciting next chapter<br />o Klavier Gavin: Lead prosecutor, Apollos nemesis and rock star legend<br />o Kristoph Gavin: the coolest defense attorney on the judicial circuit, and Apollos mentor.<br />o Trucy: A mysterious magician and Apollos assistant</p>
7
- <p>While both the SYSTEM_ALERT_WINDOW and the BIND_ACCESSIBILITY_SERVICE Android permissions have been abused individually (e.g., in UI redressing attacks, accessibility attacks), previous attacks based on these permissions failed to completely control the UI feedback loop and thus either rely on vanishing side-channels to time the appearance of overlay UI, cannot respond properly to user input, or make the attacks literally visible. In this work, we demonstrate how combining the capabilities of these permissions leads to complete control of the UI feedback loop and creates devastating and stealthy attacks. In particular, we demonstrate how an app with these two permissions can launch a variety of stealthy, powerful attacks, ranging from stealing user's login credentials and security PIN, to the silent installation of a God-like app with all permissions enabled. To make things even worse, we note that when installing an app targeting a recent Android SDK, the list of its required permissions is not shown to the user and that these attacks can be carried out without needing to lure the user to knowingly enable any permission, thus leaving him completely unsuspecting. In fact, we found that the SYSTEM_ALERT_WINDOW permission is automatically granted for apps installed from the Play Store and, even though the BIND_ACCESSIBILITY_SERVICE is not automatically granted, our experiment shows that it is very easy to lure users to unknowingly grant that permission by abusing capabilities from the SYSTEM_ALERT_WINDOW permission. We also found that it is straightforward to get a proof-of-concept app requiring both permissions accepted on the official store. We evaluated the practicality of these attacks by performing a user study: none of the 20 human subjects that took part of the experiment even suspected they had been attacked. We conclude with a number of observations and best-practices that Google and developers can adopt to secure the Android GUI.</p>
8
- <p>Despite all predictions, native Desktop apps are back. After years porting stand-alone apps to the web, we are witnessing an inverse trend. Many companies have started providing native desktop apps built using the same technologies as their web counterparts. In this trend, Github's Electron has become a popular framework to build cross-platform desktop apps with JavaScript, HTML, and CSS. While it seems to be easy, embedding a webapp in a self-contained web environment (Chromium, Node.Js) introduces new security challenges.<br> <br> In this presentation, we will illustrate Electron's security model and describe current isolation mechanisms to prevent untrusted content from using Node.js primitives. Electron's IPC messaging, preloading and other internals will be comprehensively discussed. BrowserWindow and WebView security-relevant options will be also analyzed, together with design-level weaknesses and implementation bugs in Electron-based applications.<br> <br> As part of our study of Electron security, we have mapped the overall attack surface and derived a comprehensive checklist of anti-patterns. A new tool (electronegativity) to facilitate testing of Electron-based apps will be released.</p>
9
- <p>The law affords unique protections to communications between a lawyer and client, commonly referred to as the "attorney-client privilege." This tool is indispensable because a lawyer can best advocate for a client when the client is free to disclose both the good and the bad. The law affords similar protections to communications between a physician/therapist and patient.<br /><br />Cybersecurity professionals have no equivalent. This is true despite the fact that cybersecurity professionals are regularly entrusted with more sensitive information (about an individual/company) than what is entrusted to a lawyer or doctor. Security consultants can hold their clients' darkest secrets, or perhaps information that could "bring down" the company. These professionals are asked to find flaws, infiltrate networks, gather sensitive data, and document exactly how it happened, all-the-while contemplating how to use the information to the worst detriment of the target.<br /><br />Although security consultants have no straightforward legal privilege for protecting client data, they may have the best mechanism of all: White Hat Privilege. By using this term, the speakers submit that a white hat professional is perhaps able to utilize technical savvy to implement technological solutions to the problem of protecting client data while staying within the confines of the law.<br /><br />In this talk, we will examine the legal landscape for cybersecurity professionals seeking to safeguard a clients' sensitive client data. We will cover issues including contract formation, risk allocation, and other legal issues that arise during formation of services contracts. We will pivot to legal regimes for handling PII, cross-border data transfers, IP rights, and export-control issues. And because security professionals are not static beings, we will also examine border crossings, including authority of TSA/Customs to search and seize devices that might hold client data. 
While examining these issues, where possible, we will discuss potential technological solutions to legal problems.</p>
10
- <p>- <strong>Download Ace Attorney: Dual Destinies mod for android phone apk</strong>: Click the download button on the Android device that corresponds to your phone's operating system at the top of this page! Here EN.VNMOD.NET commit to bring the file download link ace-attorney-dual-destinies-hack_mod.apk & full other version, the most accurate from the publisher CAPCOM CO., LTD..</p>
11
- <p>- <strong>Download Ace Attorney: Dual Destinies mod for iphone ios phone</strong>: Click the download button to your iphone then follow the instructions to download the file ace-attorney-dual-destinies-hack_mod.ipa for IPhone IOS phone. Install without jailbreak.</p>
12
- <p>You play as a young boy who must head out to stay with his aunt, who, as the title implies, happens to be a powerful witch. Unfortunately, his aunt isn't all that she is cracked up to be. While she is powerful, she is also somewhat of an outcast. Together, the two, and her cat, must explore various lands while solving puzzles and figuring out why darkness hangs over the kingdom.</p>
13
- <p></p>
14
- <p>Ace Attorney Investigations - Miles Edgeworth APK is definitely a great Adventure app for Android, and has been already downloaded 15341 times here at Sbenny.com! You'll love its gameplay for sure and we hope you'll enjoy it! If you have some spare moments, please scroll down and review this app by giving a valuable feedback and sharing your experience about Ace Attorney Investigations - Miles Edgeworth APK, to help people from all around the world to know what you think about it.<br>If you love Adventure apps for Android like we do, you'll be glad to know we have thousand of similar games and apps, just click here to find our full collection of Adventure apps for Android!</p>
15
- <p>Always run more like a family business than a blue-chip corporate empire, the private Trump Organization has operated free from the oversight of independent board members or pesky shareholders. But now that secrecy has cracked.</p>
16
- <p>Yet here were are, three years later, and 3DS is doing quite well for itself. While it may not command the enormous marketshare of its esteemed predecessor, 3DS's current success does at least give Nintendo a much-needed fallback position as it struggles to make a viable business of the Wii U. Oh, and also, it makes for a ton of great games: First-party hits, third-party creations from Japan, classic games via Virtual Console, and a healthy selection of entertaining independent software. To mark the occasion, USgamer's biggest 3DS fans have reflected on their feelings about the system... and cracked open our Activity Logs to confess our most-played titles.</p> aaccfb2cb3<br />
17
- <br />
18
- <br />
spaces/1gistliPinn/ChatGPT4/Examples/FULLIObitDriverBoosterPro7426810Crack.md DELETED
@@ -1,6 +0,0 @@
- <h2>FULLIObitDriverBoosterPro7426810Crack</h2><br /><p><b><b>Download Zip</b> &middot;&middot;&middot; <a href="https://imgfil.com/2uxZkI">https://imgfil.com/2uxZkI</a></b></p><br /><br />
-
- 899543212b<br />
- <br />
- <br />
- <p></p>
 
spaces/1gistliPinn/ChatGPT4/Examples/Free Download Ekahau Site Survey.md DELETED
@@ -1,29 +0,0 @@
-
- <p>Sorry, but Ekahau Site Survey for Mac does not have a direct download. Using the link below and downloading the required application from the developer's site was possible when we last checked. We cannot confirm if there is a free download of this app available. FDMLib cannot ensure the security of software that is hosted on external sites.</p>
- <h2>Free download ekahau site survey</h2><br /><p><b><b>DOWNLOAD</b> &#9999; &#9999; &#9999; <a href="https://imgfil.com/2uy1Tj">https://imgfil.com/2uy1Tj</a></b></p><br /><br />
- <p>About the download, Ekahau HeatMapper is a program that takes up less free space than many programs in the category Networking software. It's software frequently downloaded in the United States, Hungary, and China.</p>
- <p>Ekahau HeatMap is the latest "must have" Internet Marketing Tool for all the new breed of website owners who want to carve a niche for themselves and expand their market share. Ekahau has made it very easy to use their free website meter, which you can find at the link below, but what is this "must have" tool that everyone is raving about, and why do so many people think that it will help you increase your profits? For starters, it is extremely easy to use and understand. All you need to do is open the link below this article, then just follow the instructions on screen, and within minutes you are ready to begin your own website meter viewing, which can bring you great potential traffic, and thereby increased sales, for your business. And if you are looking for a great "make money online" tool, this is the tool for you!</p>
- <p>The Ekahau HeatMapper: Website Metering for FREE Wireless Networks. Website Metering is a free wi-fi site survey tool to help website owners to determine the best place to connect their internet connections through various wireless networks. It helps determine the best internet connection for your site, based on where your visitors are coming from, what they are typing in to see your site, what time of day they visit, what their computer language is, and what browser type they use, among other factors. Website meter is an interactive tool to help you determine what the optimum location for you is, and how to connect with the closest hotspots. The site survey tool also helps you determine the most cost effective route for connecting to users in your local area.</p>
- <p>In addition, the Ekahau HeatMapper can be used by webmasters who want to know which areas of their site need the most improvement. For example, if you have a restaurant that receives a lot of traffic, but you notice that the majority of these visitors have poor wi-fi coverage, you can run a site survey to find out what areas in your location are getting the best coverage. By knowing which areas of your site get the best reception of internet traffic, you can improve your marketing strategies and expand your target market. Another great thing about the Ekahau HeatMapper is that it is completely free to use. Users just need to download the software, and then they can start to measure the performance of their wireless networks immediately. All you need to do is log in with your user ID and password to begin!</p>
- <p></p>
- <p>With the Wifi HeatMap from Solarwinds, you can create custom heatmaps. Start with a manual site survey of the wireless signals so that the tool can record it on the surface of a blueprint or layout. NPM with Wifi Heatmaps will poll the signal strength from clients and APs and generate dynamic Wifi heatmaps.</p>
- <p>NetSpot is a comprehensive wireless site survey, Wifi analysis, and troubleshooting application. It is designed for a wide variety of scenarios, from small home WLAN users to large-scale wireless deployments. NetSpot can help you improve the wireless security standpoint by running an advanced analysis. It can also help optimize the Wifi signal strength via heatmaps.</p>
- <p>To start a site survey with NetSpot, you need to upload your office plan or area layout. Indicate your location on the office plan, and the tool will begin to calculate the wireless coverage. As you move around with your site survey device, Netspot will continue to record data of the signals received in the area, and finally, create a heatmap.</p>
- <p>VisiWave Site Survey is an advanced WLAN site survey solution with data collection and visualization capabilities. It is designed for indoor/outdoor and metropolitan hotspots surveys. VisiWave allows you to analyze your WLAN, generate coverage heat maps automatically, and display radio waves.</p>
- <p>Although the Wifi heatmap software comes in different sizes and prices, they all have the same basic functionality, to collect data and generate a heatmap. All the software shown in this list have this ability, and some improve on it with amazing features. Most of the software have free trial downloads, which is an excellent way to start testing your Wifi networks.</p>
- <p>We cannot guarantee that the program is safe to download as it will be downloaded from the developer's website. Before launching the program, check it with any free antivirus software. The program lies within Internet & Network Tools, more precisely Network Tools.</p>
- <p>HeatMapper is the free version of a more powerful Wi-Fi surveying tool called Ekahau Site Survey. HeatMapper lets you do surveys for only 15 minutes at a time; Site Survey gives you unlimited time, along with additional features. Pricing varies according to the size and complexity of your network.</p>
- <p>If you're looking for the simplest and most basic test of your Wi-Fi speed, then Ookla Speedtest is the way to go. You don't need to download any software (which means this particular app works just fine for Macs as well). Just head to the site, click "Begin Test" and the site tests your upload and download speeds. It's a great tool for getting quick-and-dirty information about your network's throughput.</p>
- <p>CyberGhost is simple to use: Just download and install the client. (Note: In order to download the free version of the software, click the Free Download link on the upper-right hand of the CyberGhost home page.) You won't even need to create an account; after you install the client, you're ready to go. There are clients for Windows, Mac, iOS, Linux and Android.</p>
- <p>However, there are a few hurdles you'll need to clear first. To begin with, the Connectify home page is a bit confusing when it comes to figuring out how to download the free (Lite) version. At the top of the page are two buttons: Buy Now and Download. Click the Download button and install the app; you can then choose the Lite option during installation, which will let you share Wi-Fi hotspots.</p>
- <p>TamoGraph is a powerful and user-friendly wireless site survey software tool for <b>collecting</b>, <b>visualizing</b>, and <b>analyzing 802.11 a/b/g/n/ac/ax Wi-Fi data</b>. Wireless network deployment and maintenance requires the use of a professional RF site survey tool that facilitates otherwise time-consuming and very complex tasks, such as ongoing analysis and reporting of <b>signal strength</b>, <b>noise</b> and <b>interference</b>, <b>channel allocation</b>, <b>data rates</b>, etc.</p>
- <p>In a word, wireless site surveys are necessary because radio wave propagation is difficult to predict, especially in non-open space environments. It is virtually impossible to consider all the variables that might affect the health and performance of your WLAN. Changing conditions, even something as seemingly minor as a notebook equipped with a legacy 802.11g adapter that your new employee connected to the office wireless network, might seriously affect the WLAN performance. In addition, considering the wide proliferation of wireless infrastructure, factors such as interference from nearby WLANs play a very important role. This is why regular site surveys that are conducted with a professional tool are important.</p>
- <p>Ekahau Site Survey & Planner allows you to design and maintain Wi-Fi networks to ensure your requirements are met. <br/>Main features:<br/>- Wi-Fi site survey tool allows simple yet ultra-comprehensive validation of Wi-Fi network coverage and performance.<br/>- Eliminate coverage holes, interference issues, roaming problems and all performance bottlenecks.</p>
- <p>A wireless site survey analyzes the radio frequency environment of an area where a Wi-Fi network is deployed. Network teams use site surveys when planning a new network to determine where to install access points (APs).</p>
- <p>Teams should also conduct periodic site surveys while the network is operating. Changes to office floor plans and layouts, such as the movement or addition of desks or file cabinets, might require changes to AP locations. New equipment might need a high level of signal strength in a location that doesn't provide the required signal levels. New applications or increased use of an existing application might also require improved performance. A site survey can determine any necessary changes to the network.</p>
- <p>In a predictive site survey, the blueprint shows virtual APs, and the survey software determines signal strength based on information about how signals propagate through walls and around cabinets and desks. The software also factors in the types of applications used in the area -- e.g., heavy video use requires high throughput, while VoIP calls don't require high throughput, but do require tight limits on latency and delay.</p>
- <p>The goal of a passive survey is to report on all signals at each location, including the installed network and signals from neighboring sites or other devices that generate noise at wireless frequencies.</p>
- <p>Teams should perform passive surveys periodically after they build the site, install equipment and activate the network. These surveys report information on APs and their characteristics, signal strength, signal-to-noise ratios and interference. They might reveal marginal performance changes before users notice.</p>
- <p>Active surveys focus on a specific signal or set of specific signals and produce an extensive list of measurements for each AP that generates a studied signal. These measurements include signal strength, throughput, round-trip time, packet loss and retransmission rate throughout the area where the signal is used. Active site surveys also measure upstream and downstream data rates and might result in teams moving an AP or adding or removing an unneeded AP. Teams should perform active surveys when investigating performance problems.</p> aaccfb2cb3<br />
- <br />
- <br />
 
spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Buy and Sell Anything with Aladdin.az the Leading E-commerce Platform in Azerbaijan.md DELETED
@@ -1,207 +0,0 @@
-
- <h1> tag defines a heading level 1, and the <table> tag defines a table. To create a table in HTML, you need to use the following tags: - The <table> tag defines the start and end of a table. - The <tr> tag defines a table row. - The <th> tag defines a table header cell. - The <td> tag defines a table data cell. You can also use other tags to add more features to your table, such as captions, column groups, headers, footers, etc. You can find more details and examples at [HTML Tables - W3Schools](^14^). Here is an example of a simple HTML table: <table>
- <tr>
- <th>Name</th>
- <th>Age</th>
- <th>Country</th>
- </tr>
- <tr>
- <td>Alice</td>
- <td>25</td>
- <td>USA</td>
- </tr>
- <tr>
- <td>Bob</td>
- <td>32</td>
- <td>UK</td>
- </tr>
- <tr>
- <td>Charlie</td>
- <td>28</td>
- <td>Australia</td>
- </tr>
- </table>
- This table will look like this in the browser: | Name | Age | Country | | --- | --- | --- | | Alice | 25 | USA | | Bob | 32 | UK | | Charlie | 28 | Australia | Now that you know how to create a table in HTML, let's move on to how to write a conversational article. A conversational article is an article that uses a friendly and informal tone to engage the reader and make them feel like they are having a chat with the writer. A conversational article should also be clear, concise, and well-structured, using headings, subheadings, paragraphs, lists, examples, questions, and other elements to organize the content and make it easy to read and understand. To write a conversational article, you should follow these tips: - Speak to one person and personalize your writing. Imagine your ideal reader and write as if you are talking to them directly. Use the word "you" to address them and make them feel involved. For example: "In this article, I will show you how to create a table in HTML and write a conversational article." - Open with a story. A story can capture the reader's attention and interest from the start. It can also help you establish rapport and credibility with the reader. You can use a personal story, an anecdote, a case study, or any other relevant story that relates to your topic and purpose. For example: "When I first started learning HTML, I was confused by all the different tags and how to use them. I wanted to create a simple table to display some data, but I had no idea how to do it. That's why I decided to write this article for you." - Break grammar rules. Conversational writing does not have to follow all the formal rules of grammar and punctuation. You can use contractions, slang, colloquialisms, fragments, run-on sentences, etc., as long as they make sense and do not affect the clarity and readability of your writing. For example: "Don't worry if you're not an expert in HTML. It's not that hard once you get the hang of it." - Ask questions. Questions can help you engage the reader and make them think about your topic. They can also help you transition from one point to another or introduce a new idea or example. You can use rhetorical questions or direct questions that invite the reader to respond or take action. For example: "Do you want to learn how to create a table in HTML? Then keep reading." <h1>How to Create a Table in HTML and Write a Conversational Article</h1>
- - Introduction - Open with a story - Explain what HTML tags are and how to use them - State the purpose and scope of the article - How to Create a Table in HTML - Explain the basic tags for creating a table - Show an example of a simple HTML table - Explain how to add more features to a table - Show an example of a more complex HTML table - How to Write a Conversational Article - Explain what a conversational article is and why it is effective - Give some tips for writing a conversational article - Speak to one person and personalize your writing - Open with a story - Break grammar rules - Ask questions - Use examples - Use humor and emotion - Use transitions and summaries - Conclusion - Summarize the main points of the article - Restate the benefits of creating a table in HTML and writing a conversational article - Include a call to action or a suggestion for further reading or learning - FAQs - List some common questions and answers related to the topic of the article Here is the article based on the outline: <h1>How to Create a Table in HTML and Write a Conversational Article</h1>
- <p>Have you ever wanted to create a table in HTML and write a conversational article? If so, you're not alone. I was in the same boat when I first started learning HTML. I wanted to create a simple table to display some data, but I had no idea how to do it. I also wanted to write an engaging and friendly article that would appeal to my readers, but I didn't know how to use the right tone and style. That's why I decided to write this article for you.</p>
- <h2>aladdin.az</h2><br /><p><b><b>Download</b> &#128279; <a href="https://urlin.us/2uT0W1">https://urlin.us/2uT0W1</a></b></p><br /><br />
- <p>In this article, I will show you how to create a table in HTML and write a conversational article. You will learn what HTML tags are and how to use them, how to create a simple and complex table in HTML, and how to write an article that uses a friendly and informal tone to engage your reader. By the end of this article, you will be able to create your own tables in HTML and write your own conversational articles with ease.</p>
- <h2>How to Create a Table in HTML</h2>
- <p>HTML stands for HyperText Markup Language, and it is the standard language for creating web pages. HTML tags are special words or letters surrounded by angle brackets, < and >, that tell the browser how to display the content of a web page. For example, the <p> tag defines a paragraph, the <h1> tag defines a heading level 1, and the <table> tag defines a table.</p>
- <p>To create a table in HTML, you need to use the following tags:</p>
- <ul>
- <li>The <table> tag defines the start and end of a table.</li>
- <li>The <tr> tag defines a table row.</li>
- <li>The <th> tag defines a table header cell.</li>
- <li>The <td> tag defines a table data cell.</li>
- </ul>
- <p>You can also use other tags to add more features to your table, such as captions, column groups, headers, footers, etc. You can find more details and examples at [HTML Tables - W3Schools].</p>
- <p>aladdin.az marketplace<br />
- aladdin.az instagram<br />
- aladdin.az baku<br />
- aladdin.az online shopping<br />
- aladdin.az e-commerce<br />
- aladdin.az technology<br />
- aladdin.az reviews<br />
- aladdin.az products<br />
- aladdin.az delivery<br />
- aladdin.az customer service<br />
- aladdin.az coupons<br />
- aladdin.az discounts<br />
- aladdin.az careers<br />
- aladdin.az jobs<br />
- aladdin.az contact<br />
- aladdin.az phone number<br />
- aladdin.az email address<br />
- aladdin.az app<br />
- aladdin.az download<br />
- aladdin.az login<br />
- aladdin.az register<br />
- aladdin.az account<br />
- aladdin.az orders<br />
- aladdin.az returns<br />
- aladdin.az refund policy<br />
- aladdin.az payment methods<br />
- aladdin.az credit card<br />
- aladdin.az gift card<br />
- aladdin.az loyalty program<br />
- aladdin.az rewards<br />
- aladdin.az points<br />
- aladdin.az cashback<br />
- aladdin.az affiliates<br />
- aladdin.az partners<br />
- aladdin.az sellers<br />
- aladdin.az vendors<br />
- aladdin.az categories<br />
- aladdin.az fashion<br />
- aladdin.az electronics<br />
- aladdin.az home and garden<br />
- aladdin.az beauty and health<br />
- aladdin.az sports and outdoors<br />
- aladdin.az books and media<br />
- aladdin.az toys and games<br />
- aladdin.az pets and animals<br />
- aladdin.az travel and leisure<br />
- aladdin.az automotive and motorcycles<br />
- aladdin.az food and beverages</p>
- <p>Here is an example of a simple HTML table:</p>
- <table>
- <tr>
- <th>Name</th>
- <th>Age</th>
- <th>Country</th>
- </tr>
- <tr>
- <td>Alice</td>
- <td>25</td>
- <td>USA</td>
- </tr>
- <tr>
- <td>Bob</td>
- <td>32</td>
- <td>UK</td>
- </tr>
- <tr>
- <td>Charlie</td>
- <td>28</td>
- <td>Australia</td>
- </tr>
- </table>
- <p>This table will look like this in the browser:</p>
- | Name | Age | Country | | --- | --- | --- | | Alice | 25 | USA | | Bob | 32 | UK | | Charlie | 28 | Australia | <p>If you want to create a more complex table in HTML, you can use some additional tags and attributes. For example, you can use the colspan and rowspan attributes to merge cells horizontally or vertically. You can also use the border, cellpadding, cellspacing, width, height, align, valign, bgcolor, etc., attributes to style your table. You can find more details and examples at [HTML Table Advanced Features - W3Schools](^2^).</p>
- <p>Here is an example of a more complex HTML table:</p>
- <table border="1" cellpadding="10" cellspacing="0" width="80%" align="center" bgcolor="#f0f0f0">
- <caption><strong>Monthly Savings</strong></caption>
- <colgroup>
- <col span="2" style="background-color:yellow">
- <col style="background-color:orange">
- </colgroup>
- <thead>
- <tr>
- <th>Name</th>
- <th>Age</th>
- <th>Country</th>
- </tr>
- </thead>
- <tbody>
- <tr>
- <td>Alice</td>
- <td rowspan="2">25</td>
- <td>USA</td>
- </tr>
- <tr>
- <td>Bob</td>
- <td>UK</td>
- </tr>
- <tr>
- <td colspan="2">Charlie</td>
- <td>Australia</td>
- </tr>
- </tbody>
- <tfoot>
- <tr>
- <td colspan="3">This table shows some data about people and their savings.</td>
- </tr>
- </tfoot>
- </table>
- <p>This table will look like this in the browser:</p>
- | Name | Age | Country | | --- | --- | --- | | Alice | 25 | USA | | Bob | | UK | | Charlie | | Australia | <p>This table shows some data about people and their savings.</p>
- <h2>How to Write a Conversational Article</h2>
- <p>A conversational article is an article that uses a friendly and informal tone to engage the reader and make them feel like they are having a chat with the writer. A conversational article should also be clear, concise, and well-structured, using headings, subheadings, paragraphs, lists, examples, questions, and other elements to organize the content and make it easy to read and understand.</p>
- <p>To write a conversational article, you should follow these tips:</p>
- <h3>Speak to one person and personalize your writing</h3>
- <p>Imagine your ideal reader and write as if you are talking to them directly. Use the word "you" to address them and make them feel involved. For example: "In this article, I will show you how to create a table in HTML and write a conversational article."</p>
- <h3>Open with a story</h3>
- <p>A story can capture the reader's attention and interest from the start. It can also help you establish rapport and credibility with the reader. You can use a personal story, an anecdote, a case study, or any other relevant story that relates to your topic and purpose. For example: "When I first started learning HTML, I was confused by all the different tags and how to use them. I wanted to create a simple table to display some data, but I had no idea how to do it. That's why I decided to write this article for you."</p>
- <h3>Break grammar rules</h3>
- <p>Conversational writing does not have to follow all the formal rules of grammar and punctuation. You can use contractions, slang, colloquialisms, fragments, run-on sentences, etc., as long as they make sense and do not affect the clarity and readability of your writing. For example: "Don't worry if you're not an expert in HTML. It's not that hard once you get the hang of it."</p>
- <h3>Ask questions</h3>
- <p>Questions can help you engage the reader and make them think about your topic. They can also help you transition from one point to another or introduce a new idea or example. You can use rhetorical questions or direct questions that invite the reader to respond or take action. For example: "Do you want to learn how to create a table in HTML? Then keep reading."</p>
- <h3>Use examples</h3>
- <p>Examples can help you illustrate your points and make them more concrete and relatable for the reader. They can also help you explain complex or abstract concepts in simpler terms. You can use real-life examples, hypothetical scenarios, analogies, metaphors, etc., as long as they are relevant and accurate. For example: "Creating a table in HTML is like building a Lego structure. You need different pieces (tags) that fit together (nest) in a certain way (syntax) to form the shape (table) you want."</p>
- <h3>Use humor and emotion</h3>
- <p>Humor and emotion can help you connect with the reader and make your writing more enjoyable and memorable. You can use jokes, puns, sarcasm, irony, exaggeration, etc., as long as they are appropriate and tasteful for your audience and topic. You can also use emoticons, emojis, gifs, etc., to add some fun and personality to your writing. For example: "HTML tables are awesome ?. They can help you organize and present your data in a neat and attractive way. But they can also be tricky ?. You need to know how to use the right tags and attributes to create the table you want."</p>
- <h3>Use transitions and summaries</h3>
- <p>Transitions and summaries can help you guide the reader through your article and keep them on track. They can also help you emphasize your main points and remind the reader of what they have learned or what they need to do next. You can use words or phrases such as "however", "therefore", "in conclusion", "to sum up", etc., to create smooth and logical connections between your paragraphs and sections. For example: "In conclusion, creating a table in HTML is not as hard as it may seem. You just need to follow some basic steps and use some simple tags and attributes. Let's recap what we have learned so far."</p>
- <h2>Conclusion</h2>
- <p>In this article, I have shown you how to create a table in HTML and write a conversational article. You have learned what HTML tags are and how to use them, how to create a simple and complex table in HTML, and how to write an article that uses a friendly and informal tone to engage your reader.</p>
- <p>Creating a table in HTML can help you organize and present your data in a neat and attractive way. Writing a conversational article can help you connect with your reader and make your writing more enjoyable and memorable. By following the tips and examples I have given you, you will be able to create your own tables in HTML and write your own conversational articles with ease.</p>
- <p>I hope you have found this article helpful and informative. If you have any questions or comments, please feel free to leave them below. I would love to hear from you. And if you want to learn more about HTML or conversational writing, you can check out these resources:</p>
- <ul>
- <li>[HTML Tutorial - W3Schools]</li>
- <li>[How to Write Conversational Content - Neil Patel]</li>
- </ul>
- <p>Thank you for reading and happy coding!</p>
- <h2>FAQs</h2>
- <h3>What is HTML?</h3>
- <p>HTML stands for HyperText Markup Language, and it is the standard language for creating web pages. HTML tags are special words or letters surrounded by angle brackets, < and >, that tell the browser how to display the content of a web page.</p>
- <h3>What is a conversational article?</h3>
- <p>A conversational article is an article that uses a friendly and informal tone to engage the reader and make them feel like they are having a chat with the writer. A conversational article should also be clear, concise, and well-structured, using headings, subheadings, paragraphs, lists, examples, questions, and other elements to organize the content and make it easy to read and understand.</p>
- <h3>How do I create a table in HTML?</h3>
- <p>To create a table in HTML, you need to use the following tags:</p>
- <ul>
- <li>The <table> tag defines the start and end of a table.</li>
- <li>The <tr> tag defines a table row.</li>
- <li>The <th> tag defines a table header cell.</li>
- <li>The <td> tag defines a table data cell.</li>
- </ul>
- <p>You can also use other tags to add more features to your table, such as captions, column groups, headers, footers, etc.</p>
- <h3>How do I write a conversational article?</h3>
- <p>To write a conversational article, you should follow these tips:</p>
- <ul>
- <li>Speak to one person and personalize your writing.</li>
- <li>Open with a story.</li>
- <li>Break grammar rules.</li>
- <li>Ask questions.</li>
- <li>Use examples.</li>
- <li>Use humor and emotion.</li>
- <li>Use transitions and summaries.</li>
- </ul>
- <h3>Where can I learn more about HTML or conversational writing?</h3>
- <p>You can learn more about HTML or conversational writing by checking out these resources:</p>
- <ul>
- <li>[HTML Tutorial - W3Schools]</li>
- <li>[How to Write Conversational Content - Neil Patel]</li>
- </ul></p> 197e85843d<br />
- <br />
- <br />
 
spaces/1pelhydcardo/ChatGPT-prompt-generator/assets/Download Clash of Clans and Join the Epic Clan Wars.md DELETED
@@ -1,176 +0,0 @@
1
-
2
- <h1>How to Download Clash of Clans</h1>
3
- <p>Clash of Clans is one of the most popular and addictive mobile games in the world. It is a strategy game where you build your own village, raise a clan, and compete in epic clan wars with millions of other players. In this article, we will show you how to download clash of clans on your Android or iOS device and start playing right away.</p>
4
- <h2>how to download clash of clans</h2><br /><p><b><b>Download Zip</b> >> <a href="https://urlin.us/2uSYbK">https://urlin.us/2uSYbK</a></b></p><br /><br />
5
- <h2>What is Clash of Clans?</h2>
6
- <h3>A brief introduction to the game and its features</h3>
7
- <p>Clash of Clans is a free-to-play game developed by Supercell, a Finnish company that also created other hit games like Clash Royale, Brawl Stars, Boom Beach, and Hay Day. The game was launched in 2012 and has since become one of the highest-grossing and most-downloaded apps on both Google Play Store and App Store .</p>
8
- <p>The game is set in a fantasy world where you can create your own village with various buildings, such as town hall, barracks, gold mines, elixir collectors, defenses, walls, and more. You can also train different types of troops, such as barbarians, archers, wizards, giants, dragons, and more. You can use these troops to attack other players' villages or defend your own from enemy raids.</p>
9
- <p>One of the main features of the game is the clan system. You can join or create a clan with other players from around the world. You can chat with your clanmates, donate and receive troops, and participate in clan wars. Clan wars are special events where two clans face each other in a series of attacks. The clan with the most stars (earned by destroying enemy buildings) at the end of the war wins.</p>
10
- <h2>Why should you play Clash of Clans?</h2>
11
- <h3>The benefits of playing the game, such as fun, strategy, social interaction, and competition</h3>
12
- <p>There are many reasons why you should play Clash of Clans. Here are some of them:</p>
13
- <p>How to download clash of clans on Android devices<br />
- How to download clash of clans on iOS devices<br />
- How to download clash of clans on PC or Mac<br />
- How to download clash of clans on Windows 10<br />
- How to download clash of clans on Chromebook<br />
- How to download clash of clans without Google Play<br />
- How to download clash of clans without App Store<br />
- How to download clash of clans APK file<br />
- How to download clash of clans latest version<br />
- How to download clash of clans update<br />
- How to download clash of clans mod APK<br />
- How to download clash of clans hack version<br />
- How to download clash of clans private server<br />
- How to download clash of clans offline mode<br />
- How to download clash of clans for free<br />
- How to install clash of clans after downloading<br />
- How to play clash of clans after downloading<br />
- How to create an account for clash of clans after downloading<br />
- How to join a clan in clash of clans after downloading<br />
- How to start a clan war in clash of clans after downloading<br />
- How to upgrade your village in clash of clans after downloading<br />
- How to unlock new troops in clash of clans after downloading<br />
- How to use spells in clash of clans after downloading<br />
- How to use heroes in clash of clans after downloading<br />
- How to use siege machines in clash of clans after downloading<br />
- How to switch between villages in clash of clans after downloading<br />
- How to access the builder base in clash of clans after downloading<br />
- How to earn gems in clash of clans after downloading<br />
- How to spend gems wisely in clash of clans after downloading<br />
- How to get free gems in clash of clans after downloading<br />
- How to get unlimited gems in clash of clans after downloading<br />
- How to get magic items in clash of clans after downloading<br />
- How to use magic items effectively in clash of clans after downloading<br />
- How to get free magic items in clash of clans after downloading<br />
- How to participate in clan games in clash of clans after downloading<br />
- How to complete clan games challenges in clash of clans after downloading<br />
- How to earn clan games rewards in clash of clans after downloading<br />
- How to join the clan war leagues in clash of clans after downloading<br />
- How to win the clan war leagues in clash of clans after downloading<br />
- How to earn league medals in clan war leagues in clash of clans after downloading<br />
- How to redeem league medals for rewards in clan war leagues in clash of clans after downloading<br />
- How to join the clan capital leagues in clash of clans after downloading<br />
- How to win the clan capital leagues in clash of clans after downloading<br />
- How to earn capital trophies in clan capital leagues in clash of clans after downloading<br />
- How to customize your player house in clan capital leagues in clash of clans after downloading<br />
- How to get hero skins in clash of clans after downloading<br />
- How to change hero skins in clash of clans after downloading</p>
- <ul>
- <li>It is fun. The game offers endless hours of entertainment and challenge. You can design your own village, experiment with different troop combinations, and discover new strategies. You can also enjoy the colorful graphics, sound effects, and animations.</li>
- <li>It is strategic. The game requires you to think carefully about how to build your village, how to attack your enemies, and how to cooperate with your clanmates. You have to balance your resources, plan your upgrades, and choose your targets wisely. You also have to adapt to changing situations and learn from your mistakes.</li>
- <li>It is social. The game allows you to interact with other players from different countries and cultures. You can chat with them, share tips and tricks, and make friends. You can also join or create a clan that suits your play style and preferences. You can support each other in times of need and celebrate your victories together.</li>
- <li>It is competitive. The game gives you the opportunity to test your skills against other players from around the world. You can climb the leaderboards, earn trophies, and prove yourself as the best. You can also challenge yourself in various game modes, such as clan war leagues, friendly wars, clan games, and special events.</li>
- </ul>
- <h2>How to download Clash of Clans on Android devices</h2>
- <h3>The steps to download the game from Google Play Store</h3>
- <p>If you have an Android device, such as a smartphone or a tablet, you can download Clash of Clans from Google Play Store. Here are the steps to follow:</p>
- <ol>
- <li>Open the Google Play Store app on your device. You can find it on your home screen or in your app drawer.</li>
- <li>Tap on the search bar and type "Clash of Clans". You can also use the voice search feature by tapping on the microphone icon.</li>
- <li>Select the game from the list of results. You can recognize it by its icon, which is a red shield with a yellow lion.</li>
- <li>Tap on the green "Install" button. This will start downloading the game to your device. You may need to accept some permissions and terms of service before proceeding.</li>
- <li>Wait for the download and installation to finish. You can check the progress on the notification bar or on the app page.</li>
- <li>Once the game is installed, you can tap on the "Open" button to launch it. You can also find it on your home screen or in your app drawer.</li>
- </ol>
- <h4>Requirements and compatibility</h4>
- <p>To play Clash of Clans on your Android device, you need to meet some minimum requirements. These are:</p>
- <ul>
- <li>An Android version of 4.4 or higher</li>
- <li>A device with at least 1 GB of RAM</li>
- <li>A stable internet connection (Wi-Fi or mobile data)</li>
- <li>At least 200 MB of free storage space</li>
- </ul>
- <p>You can check your device's specifications by going to Settings > About phone or tablet. You can also use third-party apps like CPU-Z or Droid Hardware Info to get more details.</p>
- <p>If your device does not meet these requirements, you may not be able to download or play the game properly. You may experience crashes, glitches, or lagging issues. In that case, you may need to upgrade your device or look for alternative ways to play the game, such as using an emulator or a cloud gaming service.</p>
- <h4>Installation and updates</h4>
- <p>Once you have downloaded and installed Clash of Clans on your Android device, you can start playing right away. However, you may need to update the game from time to time to get new features, bug fixes, and security patches. You can update the game manually or automatically, depending on your preferences.</p>
- <p>To update the game manually, you need to follow these steps:</p>
- <ol>
- <li>Open the Google Play Store app on your device.</li>
- <li>Tap on the menu icon (three horizontal lines) on the top left corner.</li>
- <li>Select "My apps & games" from the menu.</li>
- <li>Find Clash of Clans from the list of installed apps and tap on it.</li>
- <li>If there is an update available, you will see an "Update" button. Tap on it to start downloading and installing the update.</li>
- <li>Wait for the update to finish and then tap on "Open" to launch the game.</li>
- </ol>
- <p>To update the game automatically, you need to follow these steps:</p>
- <ol>
- <li>Open the Google Play Store app on your device.</li>
- <li>Tap on the menu icon (three horizontal lines) on the top left corner.</li>
- <li>Select "Settings" from the menu.</li>
- <li>Tap on "Auto-update apps" under General settings.</li>
- <li>Select "Over Wi-Fi only" or "Over any network" depending on your preference. This will enable automatic updates for all your apps, including Clash of Clans.</li>
- <li>You can also choose to update individual apps by going to their app pages and tapping on the menu icon (three vertical dots) on the top right corner. Then select "Enable auto-update" from the menu.</li>
- </ol>
- <h4>Permissions and settings</h4>
- <p>To play Clash of Clans on your Android device, you need to grant some permissions to the game. These are:</p>
- <ul>
- <li>Access to photos, media, and files: This allows the game to save data and settings on your device's storage.</li>
- <li>Access to Wi-Fi connection information: This allows the game to check your internet connection status and quality.</li>
- <li>Access to device ID and call information: This allows the game to identify your device and prevent fraud and abuse.</li>
- </ul>
- <p>You can manage these permissions by going to Settings > Apps > Clash of Clans > Permissions. You can also revoke these permissions at any time, but this may affect your gameplay experience or functionality.</p>
- <p>In addition to these permissions, you can also customize some settings to enhance your gameplay experience. These include:</p>
- <ul>
- <li>Sound and music: You can adjust the volume and mute or unmute the sound effects and background music of the game. You can also enable or disable notifications and vibration alerts.</li>
- <li>Language: You can change the language of the game to your preferred one. The game supports over 20 languages, including English, Spanish, French, German, Chinese, Japanese, Korean, and more.</li>
- <li>Graphics: You can change the graphics quality of the game to suit your device's performance and battery life. You can choose between low, medium, high, or ultra settings.</li>
- <li>Account: You can link your game account to your Google Play Games or Supercell ID account. This will allow you to save your progress and access your game from different devices. You can also switch between multiple accounts if you have more than one.</li>
- </ul>
- <p>You can access these settings by tapping on the gear icon on the top right corner of the game screen. You can also find more information and help by tapping on the question mark icon next to it.</p>
- <h2>How to download Clash of Clans on iOS devices</h2>
- <h3>The steps to download the game from App Store</h3>
- <p>If you have an iOS device, such as an iPhone or an iPad, you can download Clash of Clans from App Store. Here are the steps to follow:</p>
- <ol>
- <li>Open the App Store app on your device. You can find it on your home screen or in your app library.</li>
- <li>Tap on the search icon (a magnifying glass) on the bottom right corner.</li>
- <li>Type "Clash of Clans" in the search bar and tap on the search button. You can also use the voice search feature by tapping on the microphone icon.</li>
- <li>Select the game from the list of results. You can recognize it by its icon, which is a red shield with a yellow lion.</li>
- <li>Tap on the blue "Get" button. This will start downloading the game to your device. You may need to enter your Apple ID and password or use Touch ID or Face ID before proceeding.</li>
- <li>Wait for the download and installation to finish. You can check the progress on the app page or on your home screen.</li>
- <li>Once the game is installed, you can tap on it to launch it. You can also find it in your app library.</li>
- </ol>
- <h4>Requirements and compatibility</h4>
- <p>To play Clash of Clans on your iOS device, you need to meet some minimum requirements. These are:</p>
- <ul>
- <li>An iOS version of 10 or higher</li>
- <li>A device with at least 1 GB of RAM</li>
- <li>A stable internet connection (Wi-Fi or mobile data)</li>
- <li>At least 200 MB of free storage space</li>
- </ul>
- <p>You can check your device's specifications by going to Settings > General > About. You can also use third-party apps like System Status or Battery Life to get more details.</p>
- <p>If your device does not meet these requirements, you may not be able to download or play the game properly. You may experience crashes, glitches, or lagging issues. In that case, you may need to upgrade your device or look for alternative ways to play the game, such as using an emulator or a cloud gaming service.</p>
- <h4>Installation and updates</h4>
- <p>Once you have downloaded and installed Clash of Clans on your iOS device, you can start playing right away. However, you may need to update the game from time to time to get new features, bug fixes, and security patches. You can update the game manually or automatically, depending on your preferences.</p>
- <p>To update the game manually, you need to follow these steps:</p>
- <ol>
- <li>Open the App Store app on your device.</li>
- <li>Tap on your profile picture on the top right corner.</li>
- <li>Scroll down to see a list of apps that have updates available.</li>
- <li>Find Clash of Clans from the list and tap on "Update" next to it. This will start downloading and installing the update.</li>
- <li>Wait for the update to finish and then tap on "Open" to launch the game.</li>
- </ol>
- <p>To update the game automatically, you need to follow these steps:</p>
- <ol>
- <li>Open the Settings app on your device.</li>
- <li>Tap on "App Store" under General settings.</li>
- <li>Toggle on "App Updates" under Automatic Downloads. This will enable automatic updates for all your apps, including Clash of Clans.</li>
- <li>You can also choose to update individual apps by going to their app pages, tapping on "More" (three horizontal dots) on the top right corner, and then toggling on "Enable Automatic Updates" in the menu.</li>
- </ol>
- <h4>Permissions and settings</h4>
- <p>To play Clash of Clans on your iOS device, you need to grant some permissions to the game. These are:</p>
- <ul>
- <li>Access to photos: This allows the game to save screenshots and videos of your gameplay on your device's photo library.</li>
- <li>Access to microphone: This allows the game to record your voice and use it for voice chat with your clanmates or other players.</li>
- <li>Access to notifications: This allows the game to send you alerts and reminders about your game progress, events, and offers.</li>
- </ul>
- <p>You can manage these permissions by going to Settings > Clash of Clans. You can also revoke these permissions at any time, but this may affect your gameplay experience or functionality.</p>
- <p>In addition to these permissions, you can also customize some settings to enhance your gameplay experience. These include:</p>
- <ul>
- <li>Sound and music: You can adjust the volume and mute or unmute the sound effects and background music of the game. You can also enable or disable notifications and vibration alerts.</li>
- <li>Language: You can change the language of the game to your preferred one. The game supports over 20 languages, including English, Spanish, French, German, Chinese, Japanese, Korean, and more.</li>
- <li>Graphics: You can change the graphics quality of the game to suit your device's performance and battery life. You can choose between low, medium, high, or ultra settings.</li>
- <li>Account: You can link your game account to your Game Center or Supercell ID account. This will allow you to save your progress and access your game from different devices. You can also switch between multiple accounts if you have more than one.</li>
- </ul>
- <p>You can access these settings by tapping on the gear icon on the top right corner of the game screen. You can also find more information and help by tapping on the question mark icon next to it.</p>
- <h2>How to start playing Clash of Clans</h2>
- <h3>The basics of the game, such as building your village, joining a clan, and fighting in clan wars</h3>
- <p>Now that you have downloaded and installed Clash of Clans on your device, you are ready to start playing. Here are some of the basics of the game that you need to know:</p>
- <ul>
- <li>Building your village: Your village is your base of operations and your main source of resources. You can build various buildings, such as town hall, barracks, gold mines, elixir collectors, defenses, walls, and more. You can also upgrade these buildings to improve their functions and appearance. You can use gold and elixir as the main currencies to build and upgrade your buildings. You can also use gems as a premium currency to speed up the process or buy special items.</li>
- <li>Joining a clan: A clan is a group of players who share a common interest and goal in the game. You can join or create a clan with other players from around the world. You can chat with your clanmates, donate and receive troops, and participate in clan wars. Clan wars are special events where two clans face each other in a series of attacks. The clan with the most stars (earned by destroying enemy buildings) at the end of the war wins.</li>
- <li>Fighting in clan wars: Clan wars are one of the most exciting and rewarding aspects of the game. They allow you to test your skills against other players from around the world. You can participate in clan wars once you have a town hall level 4 or higher and join a clan that has at least 10 members. To start a clan war, you need to have a leader or co-leader in your clan who can initiate the war search. Once a match is found, you will enter a preparation day where you can scout your enemy's village and donate troops to your clan's war base. Then you will enter a battle day where you can attack your enemy's village twice and earn stars for your clan. The clan with the most stars at the end of the battle day wins the war.</li>
- </ul>
- <h2>Conclusion</h2>
- <h3>A summary of the main points and a call to action</h3>
- <p>Clash of Clans is a fun, strategic, social, and competitive game that you can play on your Android or iOS device. It is easy to download and install from Google Play Store or App Store. All you need is a compatible device, a stable internet connection, and some free storage space. You can also update the game regularly to get new features, bug fixes, and security patches.</p>
- <p>Once you have downloaded and installed Clash of Clans on your device, you can start building your own village, raising a clan, and competing in epic clan wars with millions of other players. You can also customize some settings to enhance your gameplay experience.</p>
- <p>If you are looking for a game that offers endless hours of entertainment and challenge, then Clash of Clans is the perfect choice for you. Download it today and join the millions of players who are already enjoying this amazing game. You won't regret it!</p>
- <h2>FAQs</h2>
- <h3>Some common questions and answers about Clash of Clans</h3>
- <p>Here are some of the frequently asked questions and answers about Clash of Clans that you may find helpful:</p>
- <ol>
- <li><b>How can I save my game progress and access it from different devices?</b><br>You can save your game progress and access it from different devices by linking your game account to your Google Play Games or Supercell ID account. You can do this by going to Settings > Account in the game. You can also switch between multiple accounts if you have more than one.</li>
- <li><b>How can I get more gems in the game?</b><br>Gems are the premium currency in the game that you can use to speed up the process or buy special items. You can get more gems by completing achievements, removing obstacles, participating in events, or purchasing them with real money.</li>
- <li><b>How can I contact the support team if I have any issues or questions about the game?</b><br>You can contact the support team by going to Settings > Help and Support in the game. You can also visit the official website, forum, or social media pages of the game for more information and help.</li>
- <li><b>How can I report a bug, a glitch, or a hacker in the game?</b><br>You can report a bug, a glitch, or a hacker by going to Settings > Help and Support > Report an Issue in the game. You can also send a screenshot or a video of the problem to the support team for further investigation.</li>
- <li><b>How can I join or create a clan in the game?</b><br>You can join or create a clan in the game by tapping on the clan icon on the bottom left corner of the game screen. You can then search for a clan that suits your play style and preferences, or create your own clan with your own rules and requirements.</li>
- </ol>
 
spaces/1phancelerku/anime-remove-background/Car Driving School Simulator APK Drive Around the World in 9 Different Maps.md DELETED
@@ -1,81 +0,0 @@
- <br />
- <h1>Car Driving School Simulator Apk Dayı: A Fun and Useful Game for Learning to Drive</h1>
- <p>Do you want to learn how to drive in a realistic and safe way? Do you want to have fun while doing so? If yes, then you should try car driving school simulator apk dayı, a popular driving simulation game that has been on the market since 2017. This game will test your driving skills in various scenarios and teach you useful traffic rules along the way. In this article, we will tell you more about this game and why you should download it today.</p>
- <h2>car driving school simulator apk dayı</h2><br /><p><b><b>Download Zip</b> ::: <a href="https://jinyurl.com/2uNOcd">https://jinyurl.com/2uNOcd</a></b></p><br /><br />
- <h2>Features of Car Driving School Simulator Apk Dayı</h2>
- <p>Car driving school simulator apk dayı is a feature-packed game that offers a lot of content and variety for the players. Here are some of the main features of the game:</p>
- <ul>
- <li><b>Huge car collection:</b> You can choose from over 39 awesome cars in different categories, such as sedans, pickup trucks, muscle cars, 4x4s, buses, and even a supercar.</li>
- <li><b>Multiple varied maps:</b> You can drive around 9 different locations around the world, such as California, Canada, Aspen, Las Vegas, New York, Miami, Tokyo, and Norway.</li>
- <li><b>Realistic traffic:</b> You have to deal with real traffic AI that will react to your actions and follow the traffic rules.</li>
- <li><b>Dynamic weather:</b> You have to adapt to the changes on the road due to different weather conditions, such as rain, snow, fog, or night.</li>
- <li><b>Online multiplayer:</b> You can compete with other players online in free roaming mode or seasonal events.</li>
- <li><b>Seasonal events:</b> You can participate in special challenges that will surprise you with different themes and rewards.</li>
- </ul>
- <h2>Tips and Tricks for Playing Car Driving School Simulator Apk Dayı</h2>
- <p>If you want to improve your driving skills and enjoy the game more, here are some tips and tricks for playing car driving school simulator apk dayı:</p>
- <ul>
- <li><b>Practice vehicle control tasks until you can do them automatically:</b> You need to master the basics of handling the steering wheel, using the gas and brake pedals, shifting gears, visually scanning the road, and using the turn signal.</li>
- <li><b>Test various braking scenarios to learn how to safely stop your car:</b> You need to practice braking for different reasons, such as stop signs, red lights, pedestrians, speed limits, or sharp turns.</li>
- <li><b>Use traffic sign scenarios to practice responding to different signs:</b> You need to learn how to obey different traffic signs, such as stop signs, yield signs, speed limit signs, or turn signs.</li>
- <li><b>Practice turning and changing lanes using your turn signal:</b> You need to learn how to maneuver your car in different situations, such as turning left or right, changing lanes, overtaking, or merging with traffic.</li>
- <li><b>Use the first person mode to increase your immersion:</b> You can switch to the optional first person camera view to get a more realistic driving experience.</li>
- </ul>
- <h2>Review of Car Driving School Simulator Apk Dayı</h2>
- <p>Car driving school simulator apk dayı is one of the best driving simulators available on the market. It has received positive reviews from players who praised its realistic graphics, sound effects, physics, and gameplay. The game is also constantly updated with new features, improvements, and bug fixes. Here are some of the pros and cons of the game:</p>
- <table>
- <tr><th>Pros</th><th>Cons</th></tr>
- <tr><td>- Realistic graphics, sound effects, and physics<br>- Huge car collection and multiple varied maps<br>- Realistic traffic and dynamic weather<br>- Online multiplayer and seasonal events<br>- Educational and fun gameplay</td><td>- Requires a lot of storage space and a good device<br>- Some bugs and glitches may occur<br>- Some ads and in-app purchases may be annoying<br>- Some traffic rules may differ from real life<br>- Some features may require internet connection</td></tr>
- </table>
- <h2>Conclusion</h2>
- <p>Car driving school simulator apk dayı is a game that will not only entertain you, but also teach you valuable driving skills. You can choose from a variety of cars and locations, and experience realistic traffic and weather conditions. You can also compete with other players online and participate in seasonal events. The game is well-designed, realistic, and fun to play. If you are looking for a driving simulator that will challenge you and help you learn, you should download car driving school simulator apk dayı today.</p>
- <h2>FAQs</h2>
- <p>Here are some frequently asked questions about the game:</p>
- <p>car driving school simulator 2021 apk download<br />
- car driving school simulator mod apk unlimited money<br />
- car driving school simulator boombit games apk<br />
- car driving school simulator android oyun club<br />
- car driving school simulator apk indir<br />
- car driving school simulator hack apk<br />
- car driving school simulator 3d apk<br />
- car driving school simulator apk pure<br />
- car driving school simulator apk uptodown<br />
- car driving school simulator apk mod menu<br />
- car driving school simulator 2020 apk<br />
- car driving school simulator full apk<br />
- car driving school simulator pro apk<br />
- car driving school simulator free download apk<br />
- car driving school simulator latest version apk<br />
- car driving school simulator offline apk<br />
- car driving school simulator online apk<br />
- car driving school simulator premium apk<br />
- car driving school simulator real parking game apk<br />
- car driving school simulator unlocked apk<br />
- car driving school simulator 2 apk<br />
- car driving school simulator 4x4 apk<br />
- car driving school simulator android 1<br />
- car driving school simulator by nullapp apk<br />
- car driving school simulator cheats apk<br />
- car driving school simulator city drive game apk<br />
- car driving school simulator cracked apk<br />
- car driving school simulator everything unlocked apk<br />
- car driving school simulator for android apk<br />
- car driving school simulator games bracket apk<br />
- car driving school simulator hack mod apk download<br />
- car driving school simulator in usa apk<br />
- car driving school simulator ios apk<br />
- car driving school simulator latest mod apk<br />
- car driving school simulator mod apk android oyun club<br />
- car driving school simulator mod apk revdl<br />
- car driving school simulator new update apk<br />
- car driving school simulator no ads apk<br />
- car driving school simulator obb file download apk<br />
- car driving school simulator old version apk download</p>
- <ol>
- <li><b>How can I download car driving school simulator apk dayı?</b><br>You can download the game from the Google Play Store or the App Store for free. You can also download the apk file from various websites, such as [Apk Dayı] or [Apk Pure].</li>
- <li><b>How can I unlock more cars and maps?</b><br>You can unlock more cars and maps by earning coins and XP in the game. You can also buy them with real money or watch ads to get them for free.</li>
- <li><b>How can I play online multiplayer?</b><br>You can play online multiplayer by tapping on the multiplayer icon on the main menu. You can then choose a map and a mode to join or create a room. You can also invite your friends to play with you.</li>
- <li><b>How can I participate in seasonal events?</b><br>You can participate in seasonal events by tapping on the event icon on the main menu. You can then see the current event theme, duration, rewards, and challenges. You can complete the challenges to earn points and rank up on the leaderboard.</li>
- <li><b>How can I contact the developers of the game?</b><br>You can contact the developers of the game by sending an email to [email protected] or by visiting their website at [BoomBit Games]. You can also follow them on social media platforms, such as [Facebook], [Twitter], or [Instagram].</li>
- </ol>
 
spaces/1phancelerku/anime-remove-background/Christopher Martin - Let Her Go (Lyrics Video) - MP3 Download.md DELETED
@@ -1,101 +0,0 @@
1
- <br />
2
- <h1>Let Her Go MP3 Download by Christopher Martin</h1>
3
- <p>If you are a fan of reggae and dancehall music, you might have heard of Christopher Martin, a talented singer and songwriter from Jamaica. He is best known for his cover version of Passenger's hit song "Let Her Go", which has over 48 million views on YouTube. In this article, we will tell you more about Christopher Martin, his rendition of "Let Her Go", and how you can download it as an MP3 file legally and easily.</p>
4
- <h2>Who is Christopher Martin?</h2>
5
- <p>Christopher Martin is a reggae/dancehall artist who was born on February 14, 1987, in St. Catherine, Jamaica. He rose to fame after winning the Digicel Rising Stars competition in 2005, which is the Jamaican version of American Idol. Since then, he has released several albums and singles, such as "Cheaters Prayer", "I'm a Big Deal", "Is It Love", and "Dreams of Brighter Days". He has also collaborated with other prominent artists in the genre, such as Busy Signal, Romain Virgo, Cecile, and Konshens.</p>
6
- <h2>let her go mp3 download by christopher martin</h2><br /><p><b><b>Download Zip</b> &#9675; <a href="https://jinyurl.com/2uNTuo">https://jinyurl.com/2uNTuo</a></b></p><br /><br />
7
- <h3>His background and achievements</h3>
8
- <p>Martin grew up in a musical family and developed a passion for singing at an early age. He attended the Watermount All-Age School and later graduated from St. Jago High School, where he excelled in academics and sports. He also participated in various drama and music festivals, winning several awards and accolades. After winning the Digicel Rising Stars contest, he signed a contract with VP Records, one of the largest independent reggae labels in the world. He has since toured extensively across the globe, performing at major events such as One Night with Michael Bolton, Air Jamaica Jazz Festival, Reggae Sumfest, and Rebel Salute. He has also received numerous nominations and honors for his work, such as the Excellence in Music and Entertainment Award, the Youth View Award, the International Reggae and World Music Award, and the MOBO Award.</p>
9
- <h3>His musical style and influences</h3>
10
- <p>Martin's music is a blend of reggae, dancehall, pop, R&B, and soul. He has a smooth and versatile voice that can deliver both romantic ballads and upbeat anthems. He writes most of his songs himself, drawing inspiration from his personal experiences, social issues, spirituality, and love. Some of his musical influences include Bob Marley, Dennis Brown, Beres Hammond, Luther Vandross, Boyz II Men, Usher, and R. Kelly.</p>
11
- <h2>What is Let Her Go?</h2>
12
- <p>"Let Her Go" is a song that was originally written and recorded by Passenger, a British singer-songwriter. It was released in July 2012 as the second single from his third album, All the Little Lights. It became a huge international success, topping the charts in over 20 countries and selling over six million copies worldwide. It was also nominated for a Brit Award for British Single of the Year in 2014.</p>
13
- <h3>The original song by Passenger</h3>
14
- <p>The song is about realizing the value of someone when they are already gone. It emphasizes how waiting too long to tell someone how you feel may be too late and that they may have moved on. The chorus says: "Well you only need the light when it's burning low / Only miss the sun when it starts to snow / Only know you love her when you let her go / Only know you've been high when you're feeling low / Only hate the road when you're missing home / Only know you love her when you let her go".</p> <h3>The cover version by Christopher Martin</h3>
15
- <p>In 2014, Christopher Martin released his own version of "Let Her Go" as a single. He changed the lyrics slightly to suit his reggae style and added some Jamaican slang and expressions. He also gave the song a more upbeat and cheerful vibe, contrasting with the melancholic tone of the original. His cover was well-received by fans and critics alike, who praised his vocal delivery and interpretation of the song. It became one of his most popular and requested songs, especially in the Caribbean and Africa.</p>
16
- <h3>The meaning and message of the song</h3>
17
- <p>According to Martin, he chose to cover "Let Her Go" because he liked the message and the melody of the song. He said that he could relate to the song because he had experienced losing someone he loved before. He also said that he wanted to share the song with his fans who might be going through a similar situation. He explained that the song is about learning from your mistakes and appreciating what you have before it's gone. He said that the song is also about hope and optimism, because even if you let someone go, you can still find happiness and love again.</p>
18
- <p>* christopher martin let her go lyrics<br />
19
- * let her go by christopher martin video<br />
20
- * christopher martin let her go reggaeville<br />
21
- * let her go christopher martin mp3 free download<br />
22
- * christopher martin let her go official video<br />
23
- * let her go by christopher martin audio<br />
24
- * christopher martin let her go song download<br />
25
- * let her go christopher martin youtube<br />
26
- * christopher martin let her go instrumental<br />
27
- * let her go by christopher martin remix<br />
28
- * christopher martin let her go live performance<br />
29
- * let her go by christopher martin cover<br />
30
- * christopher martin let her go album<br />
31
- * let her go by christopher martin meaning<br />
32
- * christopher martin let her go chords<br />
33
- * let her go by christopher martin ringtone<br />
34
- * christopher martin let her go karaoke<br />
35
- * let her go by christopher martin spotify<br />
36
- * christopher martin let her go acoustic version<br />
37
- * let her go by christopher martin soundcloud<br />
38
- * christopher martin let her go piano tutorial<br />
39
- * let her go by christopher martin genre<br />
40
- * christopher martin let her go guitar tabs<br />
41
- * let her go by christopher martin release date<br />
42
- * christopher martin let her go reaction video<br />
43
- * let her go by christopher martin tiktok<br />
44
- * christopher martin let her go 320kbps download<br />
45
- * let her go by christopher martin last.fm<br />
46
- * christopher martin let her go slowed down<br />
47
- * let her go by christopher martin shazam<br />
48
- * christopher martin let her go dancehall version<br />
49
- * let her go by christopher martin azlyrics<br />
50
- * christopher martin let her go extended mix<br />
51
- * let her go by christopher martin genius.com<br />
52
- * christopher martin let her go sheet music<br />
53
- * let her go by christopher martin billboard chart<br />
54
- * christopher martin let her go mashup song<br />
55
- * let her go by christopher martin apple music<br />
56
- * christopher martin let her go nightcore edit<br />
57
- * let her go by christopher martin amazon music<br />
58
- * christopher martin let her go edm remix <br />
59
- * let her go by christopher martin deezer <br />
60
- * christopher martin let her go whatsapp status <br />
61
- * let her go by christopher martin pandora <br />
62
- * christopher martin let her go saxophone version <br />
63
- * let her go by christopher martin vevo <br />
64
- * christopher martin let her go violin cover <br />
65
- * let her go by christopher martin vimeo <br />
66
- * christopher martin let her go ukulele chords</p>
67
- <h2>How to download Let Her Go MP3 by Christopher Martin?</h2>
68
- <p>If you want to enjoy "Let Her Go" by Christopher Martin on your device, you might be wondering how to download it as an MP3 file. There are many ways to do so, but not all of them are legal and ethical. In this section, we will show you the best and safest ways to download the song without breaking any laws or harming the artist.</p>
69
- <h3>The legal and ethical ways</h3>
70
- <p>The most legal and ethical way to download "Let Her Go" by Christopher Martin is to buy it from an authorized online store or platform, such as iTunes, Amazon Music, or Google Play Music. By doing so, you will support the artist financially and help him continue making music. You will also get a high-quality MP3 file that you can play on any device. Buying the song usually costs less than a dollar, which is a fair price for a great song.</p>
71
- <p>Another legal and ethical way to download "Let Her Go" by Christopher Martin is to stream it from a licensed online service or app, such as Spotify, YouTube Music, or Deezer. By doing so, you will also support the artist indirectly, as he will receive royalties from the streaming platforms based on the number of plays and views. You will also get access to a large library of music that you can listen to anytime and anywhere. Streaming the song is usually free or very cheap, depending on the service or app you use.</p>
72
- <h3>The best sources and platforms</h3>
73
- <p>Among the online stores and platforms that sell "Let Her Go" by Christopher Martin, we recommend iTunes as the best option. This is because iTunes has a user-friendly interface, a fast and secure payment system, and a wide compatibility with various devices. You can easily buy the song from iTunes using your Apple ID and download it to your computer or mobile device. You can also sync it with your iCloud account and access it from any Apple device.</p>
74
- <p>Among the online services and apps that stream "Let Her Go" by Christopher Martin, we recommend Spotify as the best option. This is because Spotify has a huge catalog of music, a smart algorithm that suggests songs based on your preferences, and a social feature that lets you share your playlists with your friends. You can easily stream the song from Spotify using your account and listen to it online or offline. You can also create your own radio station based on the song or discover similar songs by other artists.</p>
75
- <h3>The tips and tricks for a smooth download</h3>
76
- <p>To ensure a smooth download of "Let Her Go" by Christopher Martin, here are some tips and tricks that you should follow:</p>
77
- <ul>
78
- <li>Make sure you have a stable internet connection and enough storage space on your device.</li>
79
- <li>Choose a reputable and reliable source or platform that offers high-quality MP3 files.</li>
80
- <li>Check the reviews and ratings of the source or platform before buying or streaming the song.</li>
81
- <li>Use a VPN or proxy service if you encounter any geo-restrictions or censorship issues.</li>
82
- <li>Scan your device for viruses or malware after downloading or streaming the song.</li>
83
- </ul>
84
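One more check worth adding to the list above: an MP3 obtained from an unfamiliar source should at least begin like an MP3. Real files open with an ID3v2 tag or an MPEG frame-sync pattern, whereas a saved HTML error page will not. A minimal sketch in Python (the file path you pass in is your own; nothing here comes from the article's links):

```python
def looks_like_mp3(path: str) -> bool:
    """Heuristic sanity check: True if the file starts like an MP3.

    MP3 files normally begin with an ID3v2 tag (the bytes "ID3") or
    directly with an MPEG audio frame header, whose first 11 bits are
    all set (0xFF followed by a byte with its top three bits set).
    """
    with open(path, "rb") as f:
        head = f.read(3)
    if head.startswith(b"ID3"):  # ID3v2 tag
        return True
    # MPEG frame sync: 0xFF then a byte whose top 3 bits are set.
    return len(head) >= 2 and head[0] == 0xFF and (head[1] & 0xE0) == 0xE0
```

A file that fails this check is not necessarily malicious, and one that passes is not guaranteed safe; it simply catches the common case of a web page or text error saved under an .mp3 name.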
- <h2>Conclusion</h2>
85
- <p>"Let Her Go" by Christopher Martin is a beautiful reggae cover of Passenger's hit song. It showcases Martin's talent and versatility as a singer and songwriter. It also conveys a powerful message about love, loss, and hope. If you want to download it as an MP3 file, you can do so legally and ethically by buying it from an authorized online store or platform, such as iTunes, or streaming it from a licensed online service or app, such as Spotify. You can also follow some tips and tricks to ensure a smooth download and enjoy the song on your device. We hope you found this article helpful and informative. If you have any questions or comments, feel free to leave them below.</p>
86
- <h2>FAQs</h2>
87
- <p>Here are some frequently asked questions about "Let Her Go" by Christopher Martin:</p>
88
- <ol>
89
- <li>When did Christopher Martin release his cover of "Let Her Go"?</li>
90
- <p>He released it in 2014 as a single.</p>
91
- <li>What album is "Let Her Go" by Christopher Martin from?</li>
92
- <p>It is not from any album, but it is included in some of his compilations, such as Reggae Gold 2014 and Strictly the Best Vol. 50.</p>
93
- <li>How long is "Let Her Go" by Christopher Martin?</li>
94
- <p>It is 4 minutes and 25 seconds long.</p>
95
- <li>Who produced "Let Her Go" by Christopher Martin?</li>
96
- <p>It was produced by Frankie Music, a Jamaican record label and production company.</p>
97
- <li>Where can I watch the video of "Let Her Go" by Christopher Martin?</li>
98
- <p>You can watch it on YouTube or on his official website.</p>
99
- </ol></p><br />
100
- <br />
101
- <br />
spaces/1phancelerku/anime-remove-background/Dr. Driving for PC How to Download and Play the Best Racing Game on Your Computer.md DELETED
@@ -1,131 +0,0 @@
1
- <br />
2
- <h1>How to Download and Play Dr. Driving on PC</h1>
3
- <p>Do you love racing games but want to experience a more realistic driving simulation? Do you want to test your driving skills in various missions and modes? Do you want to challenge other players online and compete for the top spot on the leaderboards? If you answered yes to any of these questions, then you should try <strong>Dr. Driving</strong>, a popular racing game developed by SUD Inc.</p>
4
- <h2>Dr. Driving PC Download APK</h2><br /><p><b><b>DOWNLOAD</b> ->->->-> <a href="https://jinyurl.com/2uNSbT">https://jinyurl.com/2uNSbT</a></b></p><br /><br />
5
- <p>Dr. Driving is a game that drives you crazy with its fast-paced and visually stunning gameplay. You can choose from different cars, modes, and missions, and drive through the streets with realistic physics and graphics. You can also sign in with your Google account to play online multiplayer and see how you rank among other drivers.</p>
6
- <p>But what if you want to play Dr. Driving on a bigger screen with better controls? What if you want to access thousands of other Android games and apps on your PC? Well, there is a way to do that, and it's called an emulator. An emulator is a software that lets you run Android applications on your PC or laptop, giving you the best of both worlds.</p>
7
- <p>In this article, we will show you how to download and install Dr. Driving on PC using three different emulators: BlueStacks, LDPlayer, and NoxPlayer. We will also give you some tips and tricks for playing Dr. Driving on PC more efficiently and enjoyably.</p>
8
- <h2>What is Dr. Driving?</h2>
9
- <p>Dr. Driving is a racing game that is not like any other racing game you have played before. It is not about speed or adrenaline, but about skill and precision. It is a game that simulates real driving scenarios, such as parking, lane changing, traffic rules, etc.</p>
10
- <h3>A racing game with realistic driving physics and graphics</h3>
11
- <p>Dr. Driving features realistic driving physics that make you feel like you are behind the wheel of a real car. You can feel the weight, acceleration, braking, steering, and suspension of your car as you drive through the streets with different weather and traffic conditions. You can also see the details of your car, such as the speedometer, the fuel gauge, the brake pedal, etc.</p>
12
- <h3>A game that challenges you to complete various missions and modes</h3>
13
- <p>Dr. Driving is not just about driving around aimlessly. It is a game that tests your driving skills in various missions and modes. You can choose from different game modes, such as Highway, Drift, Speed, Fuel Efficiency, VIP Escort, Parking, Broken Brake, and more. Each mode has its own objectives and difficulties, such as reaching a certain speed, drifting for a certain distance, saving fuel, avoiding collisions, parking accurately, etc. You can also earn coins and gold by completing missions and modes, which you can use to buy or upgrade your cars.</p>
14
- <p>dr driving game download for pc windows 10<br />
15
- dr driving 2 pc download free<br />
16
- dr driving for pc online play<br />
17
- dr driving pc game full version download<br />
18
- dr driving mod apk download for pc<br />
19
- dr driving 3d game download for pc<br />
20
- dr driving simulator pc download<br />
21
- dr driving offline game download for pc<br />
22
- dr driving car game download for pc<br />
23
- dr driving hack apk download for pc<br />
24
- dr driving 1 pc download<br />
25
- dr driving for pc without emulator<br />
26
- dr driving game install in pc<br />
27
- dr driving game free download for laptop<br />
28
- dr driving exe file download for pc<br />
29
- dr driving 2 mod apk download for pc<br />
30
- dr driving game play on pc<br />
31
- dr driving game setup download for pc<br />
32
- dr driving unlimited coins and gold apk download for pc<br />
33
- dr driving game download for windows 7<br />
34
- dr driving 2 game download for pc windows 10<br />
35
- dr driving game download for mac<br />
36
- dr driving game free download for windows 8.1<br />
37
- dr driving game download for pc softonic<br />
38
- dr driving game free download for windows xp<br />
39
- dr driving 2 game download for laptop<br />
40
- dr driving game free download for macbook air<br />
41
- dr driving game free download for windows 7 ultimate<br />
42
- dr driving game free download for windows 10 pro<br />
43
- dr driving 2 game free download for windows 8.1<br />
44
- dr driving game free download for windows 10 64 bit<br />
45
- dr driving 2 game free download for windows xp<br />
46
- dr driving game free download for windows 7 professional<br />
47
- dr driving 2 game free download for windows 7 ultimate<br />
48
- dr driving game free download for windows 10 home<br />
49
- dr driving 2 game free download for windows 10 pro<br />
50
- dr driving game free download for windows 8.1 pro<br />
51
- dr driving 2 game free download for windows 10 home<br />
52
- dr driving game free download for windows xp sp3<br />
53
- dr driving 2 game free download for windows xp sp3<br />
54
- how to play dr driving on pc with keyboard<br />
55
- how to install and play dr driving on pc using bluestacks emulator<br />
56
- how to play and enjoy dr driving on pc with ldplayer emulator <br />
57
- how to run and play dr driving on pc without bluestacks emulator <br />
58
- how to play and have fun with dr driving on pc using noxplayer emulator <br />
59
- how to play and experience dr driving on pc with koplayer emulator <br />
60
- how to play and master dr driving on pc with memu emulator <br />
61
- how to play and win in dr driving on pc with gameloop emulator</p>
62
- <h3>A game that supports online multiplayer and leaderboards</h3>
63
- <p>Dr. Driving is not just a solo game. It is also a game that lets you compete with other players online. You can sign in with your Google account to play online multiplayer and see how you rank among other drivers. You can challenge your friends or random players in different modes, such as Speed Parking, Drift Race, Fuel Battle, etc. You can also see your stats and achievements on the leaderboards and compare them with others.</p>
64
- <h2>Why play Dr. Driving on PC?</h2>
65
- <p>Dr. Driving is a fun and addictive game that you can play on your Android device. But what if you want to enjoy it on a bigger screen with better controls? What if you want to access thousands of other Android games and apps on your PC? Well, there is a way to do that, and it's called an emulator. An emulator is a software that lets you run Android applications on your PC or laptop, giving you the best of both worlds.</p>
66
- <h3>Enjoy a larger screen and better controls with an emulator</h3>
67
- <p>One of the main advantages of playing Dr. Driving on PC using an emulator is that you can enjoy a larger screen and better controls. You can see the graphics and details of the game more clearly and vividly on your PC monitor. You can also use your keyboard, mouse, or gamepad to control your car more smoothly and accurately. You can customize your controls using the emulator settings and choose the layout that suits you best.</p>
68
- <h3>Choose from different emulators such as BlueStacks, LDPlayer, Nox, etc.</h3>
69
- <p>Another advantage of playing Dr. Driving on PC using an emulator is that you can choose from different emulators according to your preferences and needs. There are many emulators available for PC, such as BlueStacks, LDPlayer, NoxPlayer, etc. Each emulator has its own features and benefits, such as performance, compatibility, stability, user interface, etc. You can compare and try different emulators to find the one that works best for you.</p>
70
- <h3>Access thousands of other Android games and apps on your PC</h3>
71
- <p>A third advantage of playing Dr. Driving on PC using an emulator is that you can access thousands of other Android games and apps on your PC. You can download and install any Android game or app from the Google Play Store or other sources using the emulator. You can also switch between different games and apps easily using the emulator's multitasking feature. You can enjoy a variety of Android games and apps on your PC without any hassle.</p>
72
- <h2>How to download and install Dr. Driving on PC using BlueStacks</h2>
73
- <p>One of the most popular and widely used emulators for PC is BlueStacks. BlueStacks is a powerful and reliable emulator that lets you run Android games and apps on your PC smoothly and efficiently. Here are the steps to download and install Dr. Driving on PC using BlueStacks:</p>
74
- <h3>Download and install BlueStacks from its official website</h3>
75
- <p>The first step is to download and install BlueStacks from its official website. You can choose the version that matches your PC's operating system (Windows or Mac). The download process may take some time depending on your internet speed. Once the download is complete, you can run the installer file and follow the instructions to install BlueStacks on your PC.</p>
76
- <h3>Launch BlueStacks and sign in with your Google account</h3>
77
- <p>The second step is to launch BlueStacks and sign in with your Google account. You can use your existing Google account or create a new one if you don't have one yet. Signing in with your Google account will allow you to access the Google Play Store and other Google services on BlueStacks.</p>
78
- <h3>Search for Dr. Driving in the Play Store and install it</h3>
79
- <p>The third step is to search for Dr. Driving in the Play Store and install it. You can use the search bar on the top right corner of the BlueStacks home screen to look for Dr. Driving. You can also browse the categories or recommendations to find the game. Once you find the game, you can click on it and then click on the Install button to download and install it on BlueStacks.</p>
80
- <h3>Click the Dr. Driving icon on the home screen and start playing</h3>
81
- <p>The fourth and final step is to click the Dr. Driving icon on the home screen and start playing. You can also find the game in the My Games tab on BlueStacks. You can now enjoy Dr. Driving on your PC with a larger screen and better controls.</p>
82
- <h2>How to download and install Dr. Driving on PC using LDPlayer</h2>
83
- <p>Another popular and widely used emulator for PC is LDPlayer. LDPlayer is a fast and lightweight emulator that lets you run Android games and apps on your PC smoothly and efficiently. Here are the steps to download and install Dr. Driving on PC using LDPlayer:</p>
84
- <h3>Download and install LDPlayer from its official website</h3>
85
- <p>The first step is to download and install LDPlayer from its official website. You can choose the version that matches your PC's operating system (Windows or Mac). The download process may take some time depending on your internet speed. Once the download is complete, you can run the installer file and follow the instructions to install LDPlayer on your PC.</p>
86
- <h3>Launch LDPlayer and sign in with your Google account</h3>
87
- <p>The second step is to launch LDPlayer and sign in with your Google account. You can use your existing Google account or create a new one if you don't have one yet. Signing in with your Google account will allow you to access the LD Store and other Google services on LDPlayer.</p>
88
- <h3>Search for Dr. Driving in the LD Store and install it</h3>
89
- <p>The third step is to search for Dr. Driving in the LD Store and install it. You can use the search bar on the top right corner of the LDPlayer home screen to look for Dr. Driving. You can also browse the categories or recommendations to find the game. Once you find the game, you can click on it and then click on the Install button to download and install it on LDPlayer.</p>
90
- <h3>Click the Dr. Driving icon on the desktop and start playing</h3>
91
- <p>The fourth and final step is to click the Dr. Driving icon on the desktop and start playing. You can also find the game in the My Games tab on LDPlayer. You can now enjoy Dr. Driving on your PC with a larger screen and better controls.</p>
92
- <h2>How to download and install Dr. Driving on PC using NoxPlayer</h2>
93
- <p>A third popular and widely used emulator for PC is NoxPlayer. NoxPlayer is a powerful and stable emulator that lets you run Android games and apps on your PC smoothly and efficiently. Here are the steps to download and install Dr. Driving on PC using NoxPlayer:</p>
94
- <h3>Download and install NoxPlayer from its official website</h3>
95
- <p>The first step is to download and install NoxPlayer from its official website. You can choose the version that matches your PC's operating system (Windows or Mac). The download process may take some time depending on your internet speed. Once the download is complete, you can run the installer file and follow the instructions to install NoxPlayer on your PC.</p>
96
- <h3>Launch NoxPlayer and sign in with your Google account</h3>
97
- <p>The second step is to launch NoxPlayer and sign in with your Google account. You can use your existing Google account or create a new one if you don't have one yet. Signing in with your Google account will allow you to access the Google Play Store and other Google services on NoxPlayer.</p>
98
- <h3>Drag and drop the APK/XAPK file of Dr. Driving onto the NoxPlayer window</h3>
99
- <p>The third step is to drag and drop the APK/XAPK file of Dr. Driving onto the NoxPlayer window. You can download the APK/XAPK file of Dr. Driving from any reliable source, such as APKPure, APKMirror, etc. Once you drag and drop the file onto the NoxPlayer window, it will automatically install the game on NoxPlayer.</p>
100
- <h3>Click the Dr. Driving icon on the home screen and start playing</h3>
101
- <p>The fourth and final step is to click the Dr. Driving icon on the home screen and start playing. You can also find the game in the My Games tab on NoxPlayer. You can now enjoy Dr. Driving on your PC with a larger screen and better controls.</p>
102
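Since an APK is just a ZIP archive, the "reliable source" advice above can be backed by a quick local check before you drop the file onto the emulator: confirm it actually opens as a ZIP and, if the download page publishes one, compare its SHA-256 checksum. A minimal sketch; the file name is illustrative, not one of the article's links:

```python
import hashlib
import zipfile

def apk_sha256(path: str) -> str:
    """Return the SHA-256 hex digest of the file at `path`."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in chunks so large APKs do not need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def looks_like_apk(path: str) -> bool:
    """APKs are ZIP archives; a file that is not a valid ZIP is suspect."""
    return zipfile.is_zipfile(path)
```

For example, you might refuse to sideload when `not looks_like_apk("dr-driving.apk")`, or when `apk_sha256("dr-driving.apk")` differs from the checksum the download page lists.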
- <h2>Tips and tricks for playing Dr. Driving on PC</h2>
103
- <p>Now that you know how to download and install Dr. Driving on PC using different emulators, you might want to know some tips and tricks for playing the game more efficiently and enjoyably. Here are some of them:</p>
104
- <h3>Customize your controls using the keyboard, mouse, or gamepad</h3>
105
- <p>One of the benefits of playing Dr. Driving on PC using an emulator is that you can customize your controls using the keyboard, mouse, or gamepad. You can use the emulator settings to change the key mapping, sensitivity, layout, etc. of your controls. You can also use the default controls or choose from different presets. You can find the best controls that suit your style and preference.</p>
106
- <h3>Use the overview mode to see your car and the road better</h3>
107
- <p>Another tip for playing Dr. Driving on PC using an emulator is to use the overview mode to see your car and the road better. You can use the overview mode by pressing the O key on your keyboard or clicking the O button on your emulator screen. The overview mode will zoom out your view and show you a wider angle of your car and the road. This will help you see your surroundings better and avoid crashing into other vehicles or obstacles.</p>
108
- <h3>Follow the instructions and avoid crashing into other vehicles or obstacles</h3>
109
- <p>A third tip for playing Dr. Driving on PC using an emulator is to follow the instructions and avoid crashing into other vehicles or obstacles. You can see the instructions for each mode and mission on the top left corner of your screen. You can also see the indicators for your speed, fuel, damage, etc. on the bottom right corner of your screen. You should follow the instructions carefully and complete the objectives as fast as possible. You should also avoid crashing into other vehicles or obstacles, as this will damage your car and reduce your score.</p>
110
- <h3>Earn coins and gold by completing missions and modes</h3>
111
- <p>A fourth tip for playing Dr. Driving on PC using an emulator is to earn coins and gold by completing missions and modes. You can earn coins and gold by finishing each mode and mission with a high score and rank. You can also earn coins and gold by signing in with your Google account and playing online multiplayer. You can use coins and gold to buy or upgrade your cars in the shop.</p>
112
- <h3>Upgrade your car or buy new ones with different features and stats</h3>
113
- <p>A fifth tip for playing Dr. Driving on PC using an emulator is to upgrade your car or buy new ones with different features and stats. You can upgrade your car's engine, brake, tire, suspension, etc. in the shop using coins. You can also buy new cars with different features and stats, such as speed, handling, fuel efficiency, etc. using gold. You can choose from different cars, such as sedan, truck, bus, sports car, etc. You can also customize your car's color and appearance.</p>
114
- <h2>Conclusion</h2>
115
- <p>Dr. Driving is a racing game that drives you crazy with its realistic driving physics and graphics. You can choose from different cars, modes, and missions, and drive through the streets with skill and precision. You can also sign in with your Google account to play online multiplayer and compete with other drivers.</p>
116
- <p>If you want to play Dr. Driving on PC, you can use an emulator to run the game on your PC or laptop. You can choose from different emulators, such as BlueStacks, LDPlayer, and NoxPlayer, and follow the steps to download and install the game on your PC. You can also customize your controls using the keyboard, mouse, or gamepad, and enjoy a larger screen and better performance.</p>
117
- <p>We hope this article has helped you learn how to download and play Dr. Driving on PC using different emulators. We also hope you have enjoyed some tips and tricks for playing the game more efficiently and enjoyably. If you have any questions or feedback, please feel free to leave a comment below.</p>
118
- <h2>FAQs</h2>
119
- <p>Here are some frequently asked questions about Dr. Driving and playing it on PC:</p>
120
- <h3>Is Dr. Driving free to play?</h3>
121
- <p>Yes, Dr. Driving is free to play on Android devices. You can download and install it from the Google Play Store or other sources without paying any money. However, the game may contain ads and in-app purchases that require real money.</p>
122
- <h3>Is Dr. Driving safe to play?</h3>
123
- <p>Yes, Dr. Driving is safe to play on Android devices. The game does not contain any harmful or malicious content that may harm your device or data. However, you should always download and install the game from trusted sources and avoid any modded or hacked versions that may contain viruses or malware.</p>
124
- <h3>Can I play Dr. Driving offline?</h3>
125
- <p>Yes, you can play Dr. Driving offline on Android devices. You can play most of the modes and missions without an internet connection. However, you will need an internet connection to play online multiplayer and access some of the Google services, such as leaderboards and achievements.</p>
126
- <h3>Can I play Dr. Driving with my friends?</h3>
127
- <p>Yes, you can play Dr. Driving with your friends on Android devices. You can sign in with your Google account and challenge your friends or random players in different modes online. You can also see your friends' stats and rankings on the leaderboards.</p>
128
- <h3>Can I transfer my progress from Android to PC or vice versa?</h3>
129
- <p>Yes, you can transfer your progress from Android to PC or vice versa using an emulator. You can use the same Google account to sign in on both devices and sync your data across them. However, you may need to reinstall the game on the new device if you switch between different emulators.</p><br />
130
- <br />
131
- <br />
spaces/1phancelerku/anime-remove-background/Fly Anywhere in the World with Flight Simulator Download.md DELETED
@@ -1,155 +0,0 @@
1
-
2
- <h1>Flight Simulator Download: How to Get the Best Flight Simulation Experience</h1>
3
- <p>If you are a fan of aviation, you might have dreamed of flying a plane yourself. But unless you have a pilot license, a lot of money, and access to an airport, this dream might seem impossible. Fortunately, there is a way to experience the thrill and challenge of flying without leaving your home: flight simulation.</p>
4
- <h2>Flight Simulator Download</h2><br /><p><b><b>Download</b> &gt; <a href="https://jinyurl.com/2uNRug">https://jinyurl.com/2uNRug</a></b></p><br /><br />
5
- <h2>Introduction</h2>
6
- <h3>What is flight simulation and why is it popular?</h3>
7
- <p>Flight simulation is a computer-based activity that simulates the operation and control of an aircraft in a virtual environment. It allows you to fly various types of planes, from light aircraft to wide-body jets, in different locations, weather conditions, and scenarios. You can also interact with other pilots, air traffic controllers, and ground crew online.</p>
8
- <p>Flight simulation is popular among aviation enthusiasts, hobbyists, gamers, and aspiring pilots. It offers a realistic and immersive way to learn about aviation, practice your skills, explore the world, and have fun. It can also help you prepare for real-world flying situations, such as emergencies, navigation, communication, and instrument procedures.</p>
9
- <h3>What are the benefits of flight simulation?</h3>
10
- <p>Flight simulation has many benefits for different types of users. Some of the benefits are:</p>
11
- <ul>
12
- <li>It improves your knowledge of aviation theory, terminology, regulations, and procedures.</li>
13
- <li>It enhances your spatial awareness, hand-eye coordination, decision making, and problem solving skills.</li>
14
- <li>It boosts your confidence and self-esteem as you master new challenges and achieve your goals.</li>
15
- <li>It reduces your stress and anxiety by providing a relaxing and enjoyable activity.</li>
16
- <li>It saves you time and money by allowing you to fly anytime and anywhere without spending on fuel, maintenance, or fees.</li>
17
- </ul>
18
- <h3>What are the features of a good flight simulator?</h3>
19
- <p>Not all flight simulators are created equal. Some are more realistic, detailed, and comprehensive than others. Some are more user-friendly, customizable, and compatible than others. Some are more affordable, accessible, and supported than others. So how do you choose the best flight simulator for your needs?</p>
20
- <p>Here are some of the features that you should look for in a good flight simulator:</p>
21
- <ul>
22
- <li>A large selection of aircraft models with accurate physics, aerodynamics, performance, and appearance.</li>
23
- <li>A vast and diverse world with high-resolution scenery, landmarks, airports, runways, and terrain.</li>
24
- <li>A dynamic weather system with realistic clouds, wind, precipitation, visibility, temperature, and pressure.</li>
25
- <li>A live traffic system with real-time flights, schedules, routes, frequencies, and callsigns.</li>
26
- <li>A multiplayer mode with voice chat, shared cockpit, formation flying, and online events.</li>
27
- <li>A user interface that is easy to navigate, configure, and customize.</li>
28
- <li>A tutorial mode that guides you through the basics of flying and using the simulator.</li>
29
- <li>A mission mode that challenges you with various objectives and scenarios.</li>
30
- <li>A community that provides support, feedback, tips, mods, add-ons, and updates.</li>
31
- </ul>
32
- <h2>Microsoft Flight Simulator: The Next Generation of Flight Simulation</h2>
33
- <h3>What is Microsoft Flight Simulator and how does it work?</h3>
34
- <p>Microsoft Flight Simulator is the next generation of one of the most beloved and longest-running flight simulation franchises in history. It is developed by Asobo Studio and published by Xbox Game Studios, and it was released in August 2020 for Windows 10 and Xbox Series X/S.</p>
35
- <p>Microsoft Flight Simulator uses cutting-edge technology to create a stunning and realistic simulation of the entire planet. It uses data from Bing Maps, Azure cloud computing, and artificial intelligence to render over 37,000 airports, 2 million cities, 1.5 billion buildings, and 2 trillion trees. It also uses real-time data from sources like FlightAware, Meteoblue, and Navblue to simulate live weather, traffic, and navigation.</p>
36
- <p>To play Microsoft Flight Simulator, you need a device that meets the minimum or recommended system requirements, a stable internet connection, and a Microsoft account. You can also use various peripherals, such as a keyboard, mouse, joystick, yoke, throttle, rudder pedals, headset, or VR headset, to enhance your experience. You can purchase the game from the Microsoft Store or Steam, or you can subscribe to Xbox Game Pass for PC or Xbox Game Pass Ultimate.</p>
37
- <h3>What are the main features and highlights of Microsoft Flight Simulator?</h3>
38
- <p>Microsoft Flight Simulator offers a wide range of features and highlights that make it one of the most impressive and immersive flight simulators ever made. Some of the main features and highlights are:</p>
87
- <ul>
88
- <li>A huge variety of aircraft to choose from, including light planes, airliners, helicopters, jets, gliders, and more. Each aircraft has a detailed and functional cockpit with realistic instruments and controls.</li>
89
- <li>A beautiful and diverse world to explore, with stunning graphics and lighting effects that change depending on the time of day and weather conditions. You can fly over mountains, oceans, forests, deserts, cities, landmarks, and more.</li>
90
- <li>A dynamic weather system that affects the flight performance and visuals of your aircraft. You can experience rain, snow, fog, clouds, wind, turbulence, lightning, thunderstorms, hurricanes, and more. You can also adjust the weather settings to your liking or match them with the real-world conditions.</li>
91
- <li>A live traffic system that populates the skies and airports with real-world flights and aircraft. You can see and interact with other planes and vehicles on the ground and in the air. You can also tune in to various radio frequencies and communicate with air traffic controllers and other pilots.</li>
92
- <li>A multiplayer mode that lets you fly with or against other players online. You can join or create a group session with your friends or strangers, or you can join an event or activity with specific rules and objectives. You can also chat with other players using text or voice.</li>
93
- <li>A user interface that is easy to use and customize. You can access various menus and options from the main screen or during your flight. You can also adjust your graphics settings, sound settings, camera settings, control settings, difficulty settings, and more.</li>
94
- <li>A tutorial mode that teaches you how to fly and use the simulator. You can learn the basics of flight physics, aerodynamics, controls, instruments, procedures, and more. You can also get feedback and tips from your instructor.</li>
95
- <li>A mission mode that challenges you with various tasks and scenarios. You can test your skills and knowledge in different situations, such as landing, takeoff, navigation, emergency, combat, and more. You can also earn rewards and achievements for completing the missions.</li>
96
- <li>A community that provides support, feedback, tips, mods, add-ons, and updates. You can access various forums, blogs, videos, guides, reviews, and more from other users and developers. You can also download and install various mods and add-ons that enhance or modify the simulator.</li>
97
- </ul>
98
- <h2>Tips and Tricks for Getting the Most Out of Microsoft Flight Simulator</h2>
99
- <h3>How to customize your settings and preferences?</h3>
100
- <p>One of the best things about Microsoft Flight Simulator is that you can customize it to suit your preferences and needs. Here are some of the things that you can do to make the simulator more enjoyable and comfortable for you:</p>
101
- <ul>
102
- <li>Choose the right edition for you. Microsoft Flight Simulator comes in three editions: Standard, Deluxe, and Premium Deluxe. Each edition has a different number of aircraft and airports included. You can compare the editions and their prices on the official website or store.</li>
103
- <li>Optimize your graphics settings. Microsoft Flight Simulator is a very demanding game that requires a lot of processing power and memory. To ensure a smooth and stable performance, you should adjust your graphics settings according to your device's capabilities. You can use the preset options (low, medium, high, ultra) or customize them individually.</li>
104
- <li>Select your control device and scheme. Microsoft Flight Simulator supports various types of control devices, such as keyboards, mice, joysticks, yokes, throttles, rudders, pedals, headsets, and VR headsets. You can choose the device that you prefer or have available, and then select the corresponding control scheme from the menu. You can also customize the key bindings and sensitivity of each device.</li>
105
- <li>Set your difficulty level and assistance options. Microsoft Flight Simulator allows you to choose how realistic and challenging you want your flight experience to be. You can select from three difficulty levels (easy, medium, hard) or customize them individually. You can also enable or disable various assistance options that help you with different aspects of flying, such as checklists, navigation aids, copilot assistance, failure simulation, and more.</li>
- </ul>
106
- <h3>How to create and follow your flight plan?</h3>
107
- <p>A flight plan is a document that specifies the details of your flight, such as the departure and arrival airports, the route, the altitude, the speed, the fuel, and the weather. It is important to create and follow a flight plan to ensure a safe and efficient flight.</p>
108
- <p>Here are some of the steps that you can take to create and follow your flight plan in Microsoft Flight Simulator:</p>
109
- <ul>
110
- <li>Choose your aircraft and airport. You can select from the available aircraft and airports in the simulator, or you can search for a specific one by name, code, or location. You can also choose whether you want to start from a parking spot, a runway, or in the air.</li>
111
- <li>Plan your route. You can use the world map to plan your route by clicking on the waypoints, airways, or airports that you want to fly through. You can also enter the route manually using the ICAO codes or the GPS coordinates. You can also use various tools and websites, such as SkyVector, SimBrief, or Little Navmap, to generate and import your route.</li>
112
- <li>Set your altitude and speed. You can set your cruising altitude and speed according to your aircraft's capabilities and the airspace regulations. You can also adjust them during your flight as needed. You can use the autopilot or the manual controls to maintain your altitude and speed.</li>
113
- <li>Check the weather and traffic. You can check the current and forecasted weather and traffic conditions for your flight using the simulator's menu or external sources, such as Meteoblue, FlightAware, or Navblue. You can also change the weather and traffic settings to your liking or match them with the real-world conditions.</li>
114
- <li>Follow your flight plan. You can use the navigation instruments and aids in your cockpit, such as the GPS, the VOR, the NDB, the DME, the ILS, or the ATC, to follow your flight plan. You can also use the simulator's menu or external sources, such as Navigraph Charts, FltPlan Go, or ForeFlight, to view your flight plan on a map.</li>
115
- </ul>
116
- <h3>How to use the interactive cockpit and instruments?</h3>
117
- <p>One of the most realistic and immersive features of Microsoft Flight Simulator is the interactive cockpit and instruments. Each aircraft has a detailed and functional cockpit with realistic instruments and controls that you can interact with using your mouse, keyboard, or peripheral device.</p>
118
- <p>Here are some of the things that you can do to use the interactive cockpit and instruments in Microsoft Flight Simulator:</p>
119
- <ul>
120
- <li>Familiarize yourself with the cockpit layout and functions. You can use the cockpit camera to look around and zoom in on different parts of the cockpit. You can also use the tooltips to see the name and function of each instrument and control. You can also access various checklists and manuals that explain how to operate each aircraft.</li>
121
- <li>Interact with the instruments and controls using your mouse. You can use your mouse cursor to click on buttons, switches, knobs, levers, dials, screens, and more. You can also use your mouse wheel to rotate knobs and dials. You can also use your right mouse button to drag or pan certain instruments or controls.</li>
122
- <li>Interact with the instruments and controls using your keyboard. You can use various keyboard shortcuts to activate or deactivate certain instruments or controls. You can also use the arrow keys, the page up and page down keys, the home and end keys, and the enter key to adjust certain instruments or controls. You can also customize your keyboard bindings in the settings menu.</li>
123
- <li>Interact with the instruments and controls using your peripheral device. You can use various peripheral devices, such as joysticks, yokes, throttles, rudders, pedals, headsets, or VR headsets, to interact with the instruments and controls. You can also use the buttons, switches, knobs, levers, dials, screens, and more on your device to control certain instruments or controls. You can also customize your device settings and bindings in the settings menu.</li>
124
- </ul>
125
- <h3>How to deal with realistic weather and traffic conditions?</h3>
126
- <p>Another realistic and immersive feature of Microsoft Flight Simulator is the realistic weather and traffic conditions. The simulator uses real-time data from various sources to simulate the current and forecasted weather and traffic conditions for your flight. You can also change the weather and traffic settings to your liking or match them with the real-world conditions.</p>
127
- <p>Here are some of the things that you can do to deal with realistic weather and traffic conditions in Microsoft Flight Simulator:</p>
128
- <ul>
129
- <li>Check the weather and traffic information before and during your flight. You can use the simulator's menu or external sources, such as Meteoblue, FlightAware, or Navblue, to check the current and forecasted weather and traffic conditions for your flight. You can also see the weather and traffic information on your navigation instruments and aids, such as the GPS, the ATIS, or the ATC.</li>
130
- <li>Adjust your flight plan and performance according to the weather and traffic conditions. You can use the simulator's menu or external sources, such as SkyVector, SimBrief, or Little Navmap, to plan or modify your route, altitude, speed, fuel, and more according to the weather and traffic conditions. You can also adjust your aircraft's performance settings, such as the engine power, the flaps, the trim, the landing gear, and more according to the weather and traffic conditions.</li>
131
- <li>Follow the rules and procedures for flying in different weather and traffic conditions. You can use various sources, such as checklists, manuals, guides, videos, or online courses, to learn and follow the rules and procedures for flying in different weather and traffic conditions. You can also use the simulator's tutorial mode, mission mode, or assistance options to help you with the rules and procedures.</li>
132
- <li>React to the changing weather and traffic conditions during your flight. You can use your instruments, controls, communication, and judgment to react to the changing weather and traffic conditions during your flight. You can also use the simulator's menu or external sources, such as Meteoblue, FlightAware, or Navblue, to update the weather and traffic information during your flight. You can also pause, save, or restart your flight if needed.</li>
133
- </ul>
134
- <h2>Conclusion</h2>
135
- <p>Flight simulation is a great way to experience the thrill and challenge of flying without leaving your home. It has many benefits for different types of users, such as improving your knowledge, skills, confidence, and enjoyment of aviation. However, not all flight simulators are the same. You need to choose the best flight simulator for your needs and preferences.</p>
136
- <p>Microsoft Flight Simulator is one of the best flight simulators available today. It offers a stunning and realistic simulation of the entire planet with various types of aircraft, scenery, weather, traffic, and more. It also offers a user-friendly and customizable interface with various modes, settings, options, and features. It also has a supportive and active community that provides support, feedback, tips, mods, add-ons, and updates.</p>
137
- <p>To get the best flight simulation experience with Microsoft Flight Simulator, you need to download and install it on your device, customize your settings and preferences, create and follow your flight plan, use the interactive cockpit and instruments, and deal with realistic weather and traffic conditions. You also need to follow the rules and procedures for flying in different situations, and react to the changing conditions during your flight. You can also use various sources and tools to help you with your flight simulation, such as tutorials, missions, checklists, manuals, guides, videos, websites, apps, and more.</p>
138
- <p>Flight simulation is a fun and rewarding activity that can enrich your life in many ways. Whether you are a beginner or an expert, a casual or a serious user, a gamer or a learner, you can find something that suits your taste and style in Microsoft Flight Simulator. So what are you waiting for? Download Microsoft Flight Simulator today and start your flight simulation adventure!</p>
139
- <h3>FAQs</h3>
140
- <p>Here are some of the frequently asked questions about flight simulation and Microsoft Flight Simulator:</p>
141
- <ol>
142
- <li>Q: How realistic is Microsoft Flight Simulator?<br>
143
- A: Microsoft Flight Simulator is one of the most realistic flight simulators ever made. It uses advanced technology and data to create a lifelike simulation of the entire planet with various types of aircraft, scenery, weather, traffic, and more. It also simulates the physics, aerodynamics, performance, and appearance of each aircraft with high accuracy and detail.</li>
144
- <li>Q: How much does Microsoft Flight Simulator cost?<br>
145
- A: Microsoft Flight Simulator comes in three editions: Standard ($59.99), Deluxe ($89.99), and Premium Deluxe ($119.99). Each edition has a different number of aircraft and airports included. You can also subscribe to Xbox Game Pass for PC or Xbox Game Pass Ultimate to access the Standard edition of the game.</li>
146
- <li>Q: What are the system requirements for Microsoft Flight Simulator?<br>
147
- A: Microsoft Flight Simulator is a very demanding game that requires a lot of processing power and memory. The minimum system requirements are: Windows 10 64-bit, Intel Core i5-4460 or AMD Ryzen 3 1200 processor, NVIDIA GTX 770 or AMD Radeon RX 570 graphics card, 8 GB RAM, 150 GB storage space, and 5 Mbps internet speed. The recommended system requirements are: Windows 10 64-bit, Intel Core i5-8400 or AMD Ryzen 5 1500X processor, NVIDIA GTX 970 or AMD Radeon RX 590 graphics card, 16 GB RAM, 150 GB storage space, and 20 Mbps internet speed.</li>
148
- <li>Q: How do I download and install Microsoft Flight Simulator?<br>
149
- A: You can download and install Microsoft Flight Simulator from the Microsoft Store or Steam. You need to have a Microsoft account and a stable internet connection. You also need to have enough storage space on your device. The download size is about 100 GB for the Standard edition, 125 GB for the Deluxe edition, and 150 GB for the Premium Deluxe edition.</li>
150
- <li>Q: Where can I find more information and help about Microsoft Flight Simulator?<br>
151
- A: You can find more information and help about Microsoft Flight Simulator on the official website (https://www.flightsimulator.com/), the official forums (https://forums.flightsimulator.com/), the official support page (https://flightsimulator.zendesk.com/hc/en-us), the official YouTube channel (https://www.youtube.com/channel/UCqONzeACDBaF6FfKjh7ndAQ), or various other sources, such as blogs, videos, guides, reviews, and more from other users and developers.</li>
152
- </ol>
153
- <p>I hope this article has helped you learn more about flight simulation and Microsoft Flight Simulator. If you have any questions or comments, please feel free to share them below. Happy flying!</p>
spaces/801artistry/RVC801/configs/config.py DELETED
@@ -1,265 +0,0 @@
1
- import argparse
2
- import os
3
- import sys
4
- import json
5
- from multiprocessing import cpu_count
6
-
7
- import torch
8
-
9
- try:
10
- import intel_extension_for_pytorch as ipex # pylint: disable=import-error, unused-import
11
- if torch.xpu.is_available():
12
- from infer.modules.ipex import ipex_init
13
- ipex_init()
14
- except Exception:
15
- pass
16
-
17
- import logging
18
-
19
- logger = logging.getLogger(__name__)
20
-
21
-
22
- version_config_list = [
23
- "v1/32k.json",
24
- "v1/40k.json",
25
- "v1/48k.json",
26
- "v2/48k.json",
27
- "v2/32k.json",
28
- ]
29
-
30
-
31
- def singleton_variable(func):
32
- def wrapper(*args, **kwargs):
33
- if not wrapper.instance:
34
- wrapper.instance = func(*args, **kwargs)
35
- return wrapper.instance
36
-
37
- wrapper.instance = None
38
- return wrapper
39
-
40
-
41
- @singleton_variable
42
- class Config:
43
- def __init__(self):
44
- self.device = "cuda:0"
45
- self.is_half = True
46
- self.n_cpu = 0
47
- self.gpu_name = None
48
- self.json_config = self.load_config_json()
49
- self.gpu_mem = None
50
- (
51
- self.python_cmd,
52
- self.listen_port,
53
- self.iscolab,
54
- self.noparallel,
55
- self.noautoopen,
56
- self.paperspace,
57
- self.is_cli,
58
- self.grtheme,
59
- self.dml,
60
- ) = self.arg_parse()
61
- self.instead = ""
62
- self.x_pad, self.x_query, self.x_center, self.x_max = self.device_config()
63
-
64
- @staticmethod
65
- def load_config_json() -> dict:
66
- d = {}
67
- for config_file in version_config_list:
68
- with open(f"configs/{config_file}", "r") as f:
69
- d[config_file] = json.load(f)
70
- return d
71
-
72
- @staticmethod
73
- def arg_parse() -> tuple:
74
- exe = sys.executable or "python"
75
- parser = argparse.ArgumentParser()
76
- parser.add_argument("--port", type=int, default=7865, help="Listen port")
77
- parser.add_argument("--pycmd", type=str, default=exe, help="Python command")
78
- parser.add_argument("--colab", action="store_true", help="Launch in colab")
79
- parser.add_argument(
80
- "--noparallel", action="store_true", help="Disable parallel processing"
81
- )
82
- parser.add_argument(
83
- "--noautoopen",
84
- action="store_true",
85
- help="Do not open in browser automatically",
86
- )
87
- parser.add_argument(
88
- "--paperspace",
89
- action="store_true",
90
- help="Note that this argument just shares a gradio link for the web UI. Thus can be used on other non-local CLI systems.",
91
- )
92
- parser.add_argument(
93
- "--is_cli",
94
- action="store_true",
95
- help="Use the CLI instead of setting up a gradio UI. This flag will launch an RVC text interface where you can execute functions from infer-web.py!",
96
- )
97
-
98
- parser.add_argument(
99
- "-t",
100
- "--theme",
101
- help = "Theme for Gradio. Format - `JohnSmith9982/small_and_pretty` (no backticks)",
102
- default = "JohnSmith9982/small_and_pretty",
103
- type = str
104
- )
105
-
106
- parser.add_argument(
107
- "--dml",
108
- action="store_true",
109
- help="Use DirectML backend instead of CUDA."
110
- )
111
-
112
- cmd_opts = parser.parse_args()
113
-
114
- cmd_opts.port = cmd_opts.port if 0 <= cmd_opts.port <= 65535 else 7865
115
-
116
- return (
117
- cmd_opts.pycmd,
118
- cmd_opts.port,
119
- cmd_opts.colab,
120
- cmd_opts.noparallel,
121
- cmd_opts.noautoopen,
122
- cmd_opts.paperspace,
123
- cmd_opts.is_cli,
124
- cmd_opts.theme,
125
- cmd_opts.dml,
126
- )
127
-
128
- # has_mps is only available in nightly PyTorch (for now) and macOS 12.3+.
129
- # check `getattr` and try it for compatibility
130
- @staticmethod
131
- def has_mps() -> bool:
132
- if not torch.backends.mps.is_available():
133
- return False
134
- try:
135
- torch.zeros(1).to(torch.device("mps"))
136
- return True
137
- except Exception:
138
- return False
139
-
140
- @staticmethod
141
- def has_xpu() -> bool:
142
- if hasattr(torch, "xpu") and torch.xpu.is_available():
143
- return True
144
- else:
145
- return False
146
-
147
- def use_fp32_config(self):
148
- for config_file in version_config_list:
149
- self.json_config[config_file]["train"]["fp16_run"] = False
150
-
151
- def device_config(self) -> tuple:
152
- if torch.cuda.is_available():
153
- if self.has_xpu():
154
- self.device = self.instead = "xpu:0"
155
- self.is_half = True
156
- i_device = int(self.device.split(":")[-1])
157
- self.gpu_name = torch.cuda.get_device_name(i_device)
158
- if (
159
- ("16" in self.gpu_name and "V100" not in self.gpu_name.upper())
160
- or "P40" in self.gpu_name.upper()
161
- or "P10" in self.gpu_name.upper()
162
- or "1060" in self.gpu_name
163
- or "1070" in self.gpu_name
164
- or "1080" in self.gpu_name
165
- ):
166
- logger.info("Found GPU %s, force to fp32", self.gpu_name)
167
- self.is_half = False
168
- self.use_fp32_config()
169
- else:
170
- logger.info("Found GPU %s", self.gpu_name)
171
- self.gpu_mem = int(
172
- torch.cuda.get_device_properties(i_device).total_memory
173
- / 1024
174
- / 1024
175
- / 1024
176
- + 0.4
177
- )
178
- if self.gpu_mem <= 4:
179
- with open("infer/modules/train/preprocess.py", "r") as f:
180
- strr = f.read().replace("3.7", "3.0")
181
- with open("infer/modules/train/preprocess.py", "w") as f:
182
- f.write(strr)
183
- elif self.has_mps():
184
- logger.info("No supported Nvidia GPU found")
185
- self.device = self.instead = "mps"
186
- self.is_half = False
187
- self.use_fp32_config()
188
- else:
189
- logger.info("No supported Nvidia GPU found")
190
- self.device = self.instead = "cpu"
191
- self.is_half = False
192
- self.use_fp32_config()
193
-
194
- if self.n_cpu == 0:
195
- self.n_cpu = cpu_count()
196
-
197
- if self.is_half:
198
- # 6G显存配置
199
- x_pad = 3
200
- x_query = 10
201
- x_center = 60
202
- x_max = 65
203
- else:
204
- # 5G显存配置
205
- x_pad = 1
206
- x_query = 6
207
- x_center = 38
208
- x_max = 41
209
-
210
- if self.gpu_mem is not None and self.gpu_mem <= 4:
211
- x_pad = 1
212
- x_query = 5
213
- x_center = 30
214
- x_max = 32
215
- if self.dml:
216
- logger.info("Use DirectML instead")
217
- if not os.path.exists(
- r"runtime\Lib\site-packages\onnxruntime\capi\DirectML.dll"
- ):
223
- try:
224
- os.rename(
- r"runtime\Lib\site-packages\onnxruntime",
- r"runtime\Lib\site-packages\onnxruntime-cuda",
- )
- except OSError:
229
- pass
230
- try:
231
- os.rename(
232
- "runtime\Lib\site-packages\onnxruntime-dml",
233
- "runtime\Lib\site-packages\onnxruntime",
234
- )
235
- except:
236
- pass
237
- # if self.device != "cpu":
238
- import torch_directml
239
-
240
- self.device = torch_directml.device(torch_directml.default_device())
241
- self.is_half = False
242
- else:
243
- if self.instead:
244
- logger.info(f"Use {self.instead} instead")
245
- if not os.path.exists(
- r"runtime\Lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
- ):
251
- try:
252
- os.rename(
- r"runtime\Lib\site-packages\onnxruntime",
- r"runtime\Lib\site-packages\onnxruntime-dml",
- )
- except OSError:
257
- pass
258
- try:
259
- os.rename(
- r"runtime\Lib\site-packages\onnxruntime-cuda",
- r"runtime\Lib\site-packages\onnxruntime",
- )
- except OSError:
264
- pass
265
- return x_pad, x_query, x_center, x_max
spaces/A666sxr/Genshin_TTS/text/english.py DELETED
@@ -1,171 +0,0 @@
- """ from https://github.com/keithito/tacotron """
-
- '''
- Cleaners are transformations that run over the input text at both training and eval time.
-
- Cleaners can be selected by passing a comma-delimited list of cleaner names as the "cleaners"
- hyperparameter. Some cleaners are English-specific. You'll typically want to use:
-   1. "english_cleaners" for English text
-   2. "transliteration_cleaners" for non-English text that can be transliterated to ASCII using
-      the Unidecode library (https://pypi.python.org/pypi/Unidecode)
-   3. "basic_cleaners" if you do not want to transliterate (in this case, you should also update
-      the symbols in symbols.py to match your data).
- '''
-
-
- # Regular expression matching whitespace:
-
-
- import re
- import inflect
- from unidecode import unidecode
- import eng_to_ipa as ipa
- _inflect = inflect.engine()
- _comma_number_re = re.compile(r'([0-9][0-9\,]+[0-9])')
- _decimal_number_re = re.compile(r'([0-9]+\.[0-9]+)')
- _pounds_re = re.compile(r'£([0-9\,]*[0-9]+)')
- _dollars_re = re.compile(r'\$([0-9\.\,]*[0-9]+)')
- _ordinal_re = re.compile(r'[0-9]+(st|nd|rd|th)')
- _number_re = re.compile(r'[0-9]+')
-
- # List of (regular expression, replacement) pairs for abbreviations:
- _abbreviations = [(re.compile('\\b%s\\.' % x[0], re.IGNORECASE), x[1]) for x in [
-   ('mrs', 'misess'),
-   ('mr', 'mister'),
-   ('dr', 'doctor'),
-   ('st', 'saint'),
-   ('co', 'company'),
-   ('jr', 'junior'),
-   ('maj', 'major'),
-   ('gen', 'general'),
-   ('drs', 'doctors'),
-   ('rev', 'reverend'),
-   ('lt', 'lieutenant'),
-   ('hon', 'honorable'),
-   ('sgt', 'sergeant'),
-   ('capt', 'captain'),
-   ('esq', 'esquire'),
-   ('ltd', 'limited'),
-   ('col', 'colonel'),
-   ('ft', 'fort'),
- ]]
-
-
- # List of (ipa, lazy ipa) pairs:
- _lazy_ipa = [(re.compile('%s' % x[0]), x[1]) for x in [
-   ('r', 'ɹ'),
-   ('æ', 'e'),
-   ('ɑ', 'a'),
-   ('ɔ', 'o'),
-   ('ð', 'z'),
-   ('θ', 's'),
-   ('ɛ', 'e'),
-   ('ɪ', 'i'),
-   ('ʊ', 'u'),
-   ('ʒ', 'ʥ'),
-   ('ʤ', 'ʥ'),
-   ('ˈ', '↓'),
- ]]
-
- # List of (ipa, ipa2) pairs
- _ipa_to_ipa2 = [(re.compile('%s' % x[0]), x[1]) for x in [
-   ('r', 'ɹ'),
-   ('ʤ', 'dʒ'),
-   ('ʧ', 'tʃ')
- ]]
-
-
- def expand_abbreviations(text):
-     for regex, replacement in _abbreviations:
-         text = re.sub(regex, replacement, text)
-     return text
-
-
- def collapse_whitespace(text):
-     return re.sub(r'\s+', ' ', text)
-
-
- def _remove_commas(m):
-     return m.group(1).replace(',', '')
-
-
- def _expand_decimal_point(m):
-     return m.group(1).replace('.', ' point ')
-
-
- def _expand_dollars(m):
-     match = m.group(1)
-     parts = match.split('.')
-     if len(parts) > 2:
-         return match + ' dollars'  # Unexpected format
-     dollars = int(parts[0]) if parts[0] else 0
-     cents = int(parts[1]) if len(parts) > 1 and parts[1] else 0
-     if dollars and cents:
-         dollar_unit = 'dollar' if dollars == 1 else 'dollars'
-         cent_unit = 'cent' if cents == 1 else 'cents'
-         return '%s %s, %s %s' % (dollars, dollar_unit, cents, cent_unit)
-     elif dollars:
-         dollar_unit = 'dollar' if dollars == 1 else 'dollars'
-         return '%s %s' % (dollars, dollar_unit)
-     elif cents:
-         cent_unit = 'cent' if cents == 1 else 'cents'
-         return '%s %s' % (cents, cent_unit)
-     else:
-         return 'zero dollars'
-
-
- def _expand_ordinal(m):
-     return _inflect.number_to_words(m.group(0))
-
-
- def _expand_number(m):
-     num = int(m.group(0))
-     if num > 1000 and num < 3000:
-         if num == 2000:
-             return 'two thousand'
-         elif num > 2000 and num < 2010:
-             return 'two thousand ' + _inflect.number_to_words(num % 100)
-         elif num % 100 == 0:
-             return _inflect.number_to_words(num // 100) + ' hundred'
-         else:
-             return _inflect.number_to_words(num, andword='', zero='oh', group=2).replace(', ', ' ')
-     else:
-         return _inflect.number_to_words(num, andword='')
-
-
- def normalize_numbers(text):
-     text = re.sub(_comma_number_re, _remove_commas, text)
-     text = re.sub(_pounds_re, r'\1 pounds', text)
-     text = re.sub(_dollars_re, _expand_dollars, text)
-     text = re.sub(_decimal_number_re, _expand_decimal_point, text)
-     text = re.sub(_ordinal_re, _expand_ordinal, text)
-     text = re.sub(_number_re, _expand_number, text)
-     return text
-
-
- def mark_dark_l(text):
-     return re.sub(r'l([^aeiouæɑɔəɛɪʊ ]*(?: |$))', lambda x: 'ɫ'+x.group(1), text)
-
-
- def english_to_ipa(text):
-     text = unidecode(text).lower()
-     text = expand_abbreviations(text)
-     text = normalize_numbers(text)
-     phonemes = ipa.convert(text)
-     phonemes = collapse_whitespace(phonemes)
-     return phonemes
-
-
- def english_to_lazy_ipa(text):
-     text = english_to_ipa(text)
-     for regex, replacement in _lazy_ipa:
-         text = re.sub(regex, replacement, text)
-     return text
-
-
- def english_to_ipa2(text):
-     text = english_to_ipa(text)
-     text = mark_dark_l(text)
-     for regex, replacement in _ipa_to_ipa2:
-         text = re.sub(regex, replacement, text)
-     return text.replace('...', '…')
spaces/AIConsultant/MusicGen/audiocraft/environment.py DELETED
@@ -1,176 +0,0 @@
- # Copyright (c) Meta Platforms, Inc. and affiliates.
- # All rights reserved.
- #
- # This source code is licensed under the license found in the
- # LICENSE file in the root directory of this source tree.
-
- """
- Provides cluster and tools configuration across clusters (slurm, dora, utilities).
- """
-
- import logging
- import os
- from pathlib import Path
- import re
- import typing as tp
-
- import omegaconf
-
- from .utils.cluster import _guess_cluster_type
-
-
- logger = logging.getLogger(__name__)
-
-
- class AudioCraftEnvironment:
-     """Environment configuration for teams and clusters.
-
-     AudioCraftEnvironment picks compute cluster settings (slurm, dora) from the current running environment
-     or declared variable and the loaded team configuration. Additionally, the AudioCraftEnvironment
-     provides pointers to a reference folder resolved automatically across clusters that is shared across team members,
-     allowing to share sigs or other files to run jobs. Finally, it provides dataset mappers to automatically
-     map dataset file paths to new locations across clusters, allowing to use the same manifest of files across cluters.
-
-     The cluster type is identified automatically and base configuration file is read from config/teams.yaml.
-     Use the following environment variables to specify the cluster, team or configuration:
-
-         AUDIOCRAFT_CLUSTER (optional): Cluster type to enforce. Useful if the cluster type
-             cannot be inferred automatically.
-         AUDIOCRAFT_CONFIG (optional): Path to yaml config holding the teams configuration.
-             If not set, configuration is read from config/teams.yaml.
-         AUDIOCRAFT_TEAM (optional): Name of the team. Recommended to set to your own team.
-             Cluster configuration are shared across teams to match compute allocation,
-             specify your cluster configuration in the configuration file under a key mapping
-             your team name.
-     """
-     _instance = None
-     DEFAULT_TEAM = "default"
-
-     def __init__(self) -> None:
-         """Loads configuration."""
-         self.team: str = os.getenv("AUDIOCRAFT_TEAM", self.DEFAULT_TEAM)
-         cluster_type = _guess_cluster_type()
-         cluster = os.getenv(
-             "AUDIOCRAFT_CLUSTER", cluster_type.value
-         )
-         logger.info("Detecting cluster type %s", cluster_type)
-
-         self.cluster: str = cluster
-
-         config_path = os.getenv(
-             "AUDIOCRAFT_CONFIG",
-             Path(__file__)
-             .parent.parent.joinpath("config/teams", self.team)
-             .with_suffix(".yaml"),
-         )
-         self.config = omegaconf.OmegaConf.load(config_path)
-         self._dataset_mappers = []
-         cluster_config = self._get_cluster_config()
-         if "dataset_mappers" in cluster_config:
-             for pattern, repl in cluster_config["dataset_mappers"].items():
-                 regex = re.compile(pattern)
-                 self._dataset_mappers.append((regex, repl))
-
-     def _get_cluster_config(self) -> omegaconf.DictConfig:
-         assert isinstance(self.config, omegaconf.DictConfig)
-         return self.config[self.cluster]
-
-     @classmethod
-     def instance(cls):
-         if cls._instance is None:
-             cls._instance = cls()
-         return cls._instance
-
-     @classmethod
-     def reset(cls):
-         """Clears the environment and forces a reload on next invocation."""
-         cls._instance = None
-
-     @classmethod
-     def get_team(cls) -> str:
-         """Gets the selected team as dictated by the AUDIOCRAFT_TEAM env var.
-         If not defined, defaults to "labs".
-         """
-         return cls.instance().team
-
-     @classmethod
-     def get_cluster(cls) -> str:
-         """Gets the detected cluster.
-         This value can be overridden by the AUDIOCRAFT_CLUSTER env var.
-         """
-         return cls.instance().cluster
-
-     @classmethod
-     def get_dora_dir(cls) -> Path:
-         """Gets the path to the dora directory for the current team and cluster.
-         Value is overridden by the AUDIOCRAFT_DORA_DIR env var.
-         """
-         cluster_config = cls.instance()._get_cluster_config()
-         dora_dir = os.getenv("AUDIOCRAFT_DORA_DIR", cluster_config["dora_dir"])
-         logger.warning(f"Dora directory: {dora_dir}")
-         return Path(dora_dir)
-
-     @classmethod
-     def get_reference_dir(cls) -> Path:
-         """Gets the path to the reference directory for the current team and cluster.
-         Value is overridden by the AUDIOCRAFT_REFERENCE_DIR env var.
-         """
-         cluster_config = cls.instance()._get_cluster_config()
-         return Path(os.getenv("AUDIOCRAFT_REFERENCE_DIR", cluster_config["reference_dir"]))
-
-     @classmethod
-     def get_slurm_exclude(cls) -> tp.Optional[str]:
-         """Get the list of nodes to exclude for that cluster."""
-         cluster_config = cls.instance()._get_cluster_config()
-         return cluster_config.get("slurm_exclude")
-
-     @classmethod
-     def get_slurm_partitions(cls, partition_types: tp.Optional[tp.List[str]] = None) -> str:
-         """Gets the requested partitions for the current team and cluster as a comma-separated string.
-
-         Args:
-             partition_types (list[str], optional): partition types to retrieve. Values must be
-                 from ['global', 'team']. If not provided, the global partition is returned.
-         """
-         if not partition_types:
-             partition_types = ["global"]
-
-         cluster_config = cls.instance()._get_cluster_config()
-         partitions = [
-             cluster_config["partitions"][partition_type]
-             for partition_type in partition_types
-         ]
-         return ",".join(partitions)
-
-     @classmethod
-     def resolve_reference_path(cls, path: tp.Union[str, Path]) -> Path:
-         """Converts reference placeholder in path with configured reference dir to resolve paths.
-
-         Args:
-             path (str or Path): Path to resolve.
-         Returns:
-             Path: Resolved path.
-         """
-         path = str(path)
-
-         if path.startswith("//reference"):
-             reference_dir = cls.get_reference_dir()
-             logger.warn(f"Reference directory: {reference_dir}")
-             assert (
-                 reference_dir.exists() and reference_dir.is_dir()
-             ), f"Reference directory does not exist: {reference_dir}."
-             path = re.sub("^//reference", str(reference_dir), path)
-
-         return Path(path)
-
-     @classmethod
-     def apply_dataset_mappers(cls, path: str) -> str:
-         """Applies dataset mapping regex rules as defined in the configuration.
-         If no rules are defined, the path is returned as-is.
-         """
-         instance = cls.instance()
-
-         for pattern, repl in instance._dataset_mappers:
-             path = pattern.sub(repl, path)
-
-         return path
spaces/AIGC-Audio/AudioGPT/NeuralSeq/modules/commons/transformer.py DELETED
@@ -1,747 +0,0 @@
- import math
- import torch
- from torch import nn
- from torch.nn import Parameter, Linear
- from modules.commons.common_layers import LayerNorm, Embedding
- from utils.tts_utils import get_incremental_state, set_incremental_state, softmax, make_positions
- import torch.nn.functional as F
-
- DEFAULT_MAX_SOURCE_POSITIONS = 2000
- DEFAULT_MAX_TARGET_POSITIONS = 2000
-
-
- class SinusoidalPositionalEmbedding(nn.Module):
-     """This module produces sinusoidal positional embeddings of any length.
-
-     Padding symbols are ignored.
-     """
-
-     def __init__(self, embedding_dim, padding_idx, init_size=1024):
-         super().__init__()
-         self.embedding_dim = embedding_dim
-         self.padding_idx = padding_idx
-         self.weights = SinusoidalPositionalEmbedding.get_embedding(
-             init_size,
-             embedding_dim,
-             padding_idx,
-         )
-         self.register_buffer('_float_tensor', torch.FloatTensor(1))
-
-     @staticmethod
-     def get_embedding(num_embeddings, embedding_dim, padding_idx=None):
-         """Build sinusoidal embeddings.
-
-         This matches the implementation in tensor2tensor, but differs slightly
-         from the description in Section 3.5 of "Attention Is All You Need".
-         """
-         half_dim = embedding_dim // 2
-         emb = math.log(10000) / (half_dim - 1)
-         emb = torch.exp(torch.arange(half_dim, dtype=torch.float) * -emb)
-         emb = torch.arange(num_embeddings, dtype=torch.float).unsqueeze(1) * emb.unsqueeze(0)
-         emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1).view(num_embeddings, -1)
-         if embedding_dim % 2 == 1:
-             # zero pad
-             emb = torch.cat([emb, torch.zeros(num_embeddings, 1)], dim=1)
-         if padding_idx is not None:
-             emb[padding_idx, :] = 0
-         return emb
-
-     def forward(self, input, incremental_state=None, timestep=None, positions=None, **kwargs):
-         """Input is expected to be of size [bsz x seqlen]."""
-         bsz, seq_len = input.shape[:2]
-         max_pos = self.padding_idx + 1 + seq_len
-         if self.weights is None or max_pos > self.weights.size(0):
-             # recompute/expand embeddings if needed
-             self.weights = SinusoidalPositionalEmbedding.get_embedding(
-                 max_pos,
-                 self.embedding_dim,
-                 self.padding_idx,
-             )
-         self.weights = self.weights.to(self._float_tensor)
-
-         if incremental_state is not None:
-             # positions is the same for every token when decoding a single step
-             pos = timestep.view(-1)[0] + 1 if timestep is not None else seq_len
-             return self.weights[self.padding_idx + pos, :].expand(bsz, 1, -1)
-
-         positions = make_positions(input, self.padding_idx) if positions is None else positions
-         return self.weights.index_select(0, positions.view(-1)).view(bsz, seq_len, -1).detach()
-
-     def max_positions(self):
-         """Maximum number of supported positions."""
-         return int(1e5)  # an arbitrary large number
-
-
- class TransformerFFNLayer(nn.Module):
-     def __init__(self, hidden_size, filter_size, padding="SAME", kernel_size=1, dropout=0., act='gelu'):
-         super().__init__()
-         self.kernel_size = kernel_size
-         self.dropout = dropout
-         self.act = act
-         if padding == 'SAME':
-             self.ffn_1 = nn.Conv1d(hidden_size, filter_size, kernel_size, padding=kernel_size // 2)
-         elif padding == 'LEFT':
-             self.ffn_1 = nn.Sequential(
-                 nn.ConstantPad1d((kernel_size - 1, 0), 0.0),
-                 nn.Conv1d(hidden_size, filter_size, kernel_size)
-             )
-         self.ffn_2 = Linear(filter_size, hidden_size)
-
-     def forward(self, x, incremental_state=None):
-         # x: T x B x C
-         if incremental_state is not None:
-             saved_state = self._get_input_buffer(incremental_state)
-             if 'prev_input' in saved_state:
-                 prev_input = saved_state['prev_input']
-                 x = torch.cat((prev_input, x), dim=0)
-             x = x[-self.kernel_size:]
-             saved_state['prev_input'] = x
-             self._set_input_buffer(incremental_state, saved_state)
-
-         x = self.ffn_1(x.permute(1, 2, 0)).permute(2, 0, 1)
-         x = x * self.kernel_size ** -0.5
-
-         if incremental_state is not None:
-             x = x[-1:]
-         if self.act == 'gelu':
-             x = F.gelu(x)
-         if self.act == 'relu':
-             x = F.relu(x)
-         x = F.dropout(x, self.dropout, training=self.training)
-         x = self.ffn_2(x)
-         return x
-
-     def _get_input_buffer(self, incremental_state):
-         return get_incremental_state(
-             self,
-             incremental_state,
-             'f',
-         ) or {}
-
-     def _set_input_buffer(self, incremental_state, buffer):
-         set_incremental_state(
-             self,
-             incremental_state,
-             'f',
-             buffer,
-         )
-
-     def clear_buffer(self, incremental_state):
-         if incremental_state is not None:
-             saved_state = self._get_input_buffer(incremental_state)
-             if 'prev_input' in saved_state:
-                 del saved_state['prev_input']
-             self._set_input_buffer(incremental_state, saved_state)
-
-
- class MultiheadAttention(nn.Module):
-     def __init__(self, embed_dim, num_heads, kdim=None, vdim=None, dropout=0., bias=True,
-                  add_bias_kv=False, add_zero_attn=False, self_attention=False,
-                  encoder_decoder_attention=False):
-         super().__init__()
-         self.embed_dim = embed_dim
-         self.kdim = kdim if kdim is not None else embed_dim
-         self.vdim = vdim if vdim is not None else embed_dim
-         self.qkv_same_dim = self.kdim == embed_dim and self.vdim == embed_dim
-
-         self.num_heads = num_heads
-         self.dropout = dropout
-         self.head_dim = embed_dim // num_heads
-         assert self.head_dim * num_heads == self.embed_dim, "embed_dim must be divisible by num_heads"
-         self.scaling = self.head_dim ** -0.5
-
-         self.self_attention = self_attention
-         self.encoder_decoder_attention = encoder_decoder_attention
-
-         assert not self.self_attention or self.qkv_same_dim, 'Self-attention requires query, key and ' \
-                                                              'value to be of the same size'
-
-         if self.qkv_same_dim:
-             self.in_proj_weight = Parameter(torch.Tensor(3 * embed_dim, embed_dim))
-         else:
-             self.k_proj_weight = Parameter(torch.Tensor(embed_dim, self.kdim))
-             self.v_proj_weight = Parameter(torch.Tensor(embed_dim, self.vdim))
-             self.q_proj_weight = Parameter(torch.Tensor(embed_dim, embed_dim))
-
-         if bias:
-             self.in_proj_bias = Parameter(torch.Tensor(3 * embed_dim))
-         else:
-             self.register_parameter('in_proj_bias', None)
-
-         self.out_proj = nn.Linear(embed_dim, embed_dim, bias=bias)
-
-         if add_bias_kv:
-             self.bias_k = Parameter(torch.Tensor(1, 1, embed_dim))
-             self.bias_v = Parameter(torch.Tensor(1, 1, embed_dim))
-         else:
-             self.bias_k = self.bias_v = None
-
-         self.add_zero_attn = add_zero_attn
-
-         self.reset_parameters()
-
-         self.enable_torch_version = False
-         if hasattr(F, "multi_head_attention_forward"):
-             self.enable_torch_version = True
-         else:
-             self.enable_torch_version = False
-         self.last_attn_probs = None
-
-     def reset_parameters(self):
-         if self.qkv_same_dim:
-             nn.init.xavier_uniform_(self.in_proj_weight)
-         else:
-             nn.init.xavier_uniform_(self.k_proj_weight)
-             nn.init.xavier_uniform_(self.v_proj_weight)
-             nn.init.xavier_uniform_(self.q_proj_weight)
-
-         nn.init.xavier_uniform_(self.out_proj.weight)
-         if self.in_proj_bias is not None:
-             nn.init.constant_(self.in_proj_bias, 0.)
-             nn.init.constant_(self.out_proj.bias, 0.)
-         if self.bias_k is not None:
-             nn.init.xavier_normal_(self.bias_k)
-         if self.bias_v is not None:
-             nn.init.xavier_normal_(self.bias_v)
-
-     def forward(
-             self,
-             query, key, value,
-             key_padding_mask=None,
-             incremental_state=None,
-             need_weights=True,
-             static_kv=False,
-             attn_mask=None,
-             before_softmax=False,
-             need_head_weights=False,
-             enc_dec_attn_constraint_mask=None,
-             reset_attn_weight=None
-     ):
-         """Input shape: Time x Batch x Channel
-
-         Args:
-             key_padding_mask (ByteTensor, optional): mask to exclude
-                 keys that are pads, of shape `(batch, src_len)`, where
-                 padding elements are indicated by 1s.
-             need_weights (bool, optional): return the attention weights,
-                 averaged over heads (default: False).
-             attn_mask (ByteTensor, optional): typically used to
-                 implement causal attention, where the mask prevents the
-                 attention from looking forward in time (default: None).
-             before_softmax (bool, optional): return the raw attention
-                 weights and values before the attention softmax.
-             need_head_weights (bool, optional): return the attention
-                 weights for each head. Implies *need_weights*. Default:
-                 return the average attention weights over all heads.
-         """
-         if need_head_weights:
-             need_weights = True
-
-         tgt_len, bsz, embed_dim = query.size()
-         assert embed_dim == self.embed_dim
-         assert list(query.size()) == [tgt_len, bsz, embed_dim]
-         if self.enable_torch_version and incremental_state is None and not static_kv and reset_attn_weight is None:
-             if self.qkv_same_dim:
-                 return F.multi_head_attention_forward(query, key, value,
-                                                       self.embed_dim, self.num_heads,
-                                                       self.in_proj_weight,
-                                                       self.in_proj_bias, self.bias_k, self.bias_v,
-                                                       self.add_zero_attn, self.dropout,
-                                                       self.out_proj.weight, self.out_proj.bias,
-                                                       self.training, key_padding_mask, need_weights,
-                                                       attn_mask)
-             else:
-                 return F.multi_head_attention_forward(query, key, value,
-                                                       self.embed_dim, self.num_heads,
-                                                       torch.empty([0]),
-                                                       self.in_proj_bias, self.bias_k, self.bias_v,
-                                                       self.add_zero_attn, self.dropout,
-                                                       self.out_proj.weight, self.out_proj.bias,
-                                                       self.training, key_padding_mask, need_weights,
-                                                       attn_mask, use_separate_proj_weight=True,
-                                                       q_proj_weight=self.q_proj_weight,
-                                                       k_proj_weight=self.k_proj_weight,
-                                                       v_proj_weight=self.v_proj_weight)
-
-         if incremental_state is not None:
-             saved_state = self._get_input_buffer(incremental_state)
-             if 'prev_key' in saved_state:
-                 # previous time steps are cached - no need to recompute
-                 # key and value if they are static
-                 if static_kv:
-                     assert self.encoder_decoder_attention and not self.self_attention
-                     key = value = None
-         else:
-             saved_state = None
-
-         if self.self_attention:
-             # self-attention
-             q, k, v = self.in_proj_qkv(query)
-         elif self.encoder_decoder_attention:
-             # encoder-decoder attention
-             q = self.in_proj_q(query)
-             if key is None:
-                 assert value is None
-                 k = v = None
-             else:
-                 k = self.in_proj_k(key)
-                 v = self.in_proj_v(key)
-
-         else:
-             q = self.in_proj_q(query)
-             k = self.in_proj_k(key)
-             v = self.in_proj_v(value)
-         q *= self.scaling
-
-         if self.bias_k is not None:
-             assert self.bias_v is not None
-             k = torch.cat([k, self.bias_k.repeat(1, bsz, 1)])
-             v = torch.cat([v, self.bias_v.repeat(1, bsz, 1)])
-             if attn_mask is not None:
-                 attn_mask = torch.cat([attn_mask, attn_mask.new_zeros(attn_mask.size(0), 1)], dim=1)
-             if key_padding_mask is not None:
-                 key_padding_mask = torch.cat(
-                     [key_padding_mask, key_padding_mask.new_zeros(key_padding_mask.size(0), 1)], dim=1)
-
-         q = q.contiguous().view(tgt_len, bsz * self.num_heads, self.head_dim).transpose(0, 1)
-         if k is not None:
-             k = k.contiguous().view(-1, bsz * self.num_heads, self.head_dim).transpose(0, 1)
-         if v is not None:
-             v = v.contiguous().view(-1, bsz * self.num_heads, self.head_dim).transpose(0, 1)
-
-         if saved_state is not None:
-             # saved states are stored with shape (bsz, num_heads, seq_len, head_dim)
-             if 'prev_key' in saved_state:
-                 prev_key = saved_state['prev_key'].view(bsz * self.num_heads, -1, self.head_dim)
-                 if static_kv:
-                     k = prev_key
-                 else:
-                     k = torch.cat((prev_key, k), dim=1)
-             if 'prev_value' in saved_state:
-                 prev_value = saved_state['prev_value'].view(bsz * self.num_heads, -1, self.head_dim)
-                 if static_kv:
-                     v = prev_value
-                 else:
-                     v = torch.cat((prev_value, v), dim=1)
-             if 'prev_key_padding_mask' in saved_state and saved_state['prev_key_padding_mask'] is not None:
-                 prev_key_padding_mask = saved_state['prev_key_padding_mask']
-                 if static_kv:
-                     key_padding_mask = prev_key_padding_mask
-                 else:
-                     key_padding_mask = torch.cat((prev_key_padding_mask, key_padding_mask), dim=1)
-
-             saved_state['prev_key'] = k.view(bsz, self.num_heads, -1, self.head_dim)
-             saved_state['prev_value'] = v.view(bsz, self.num_heads, -1, self.head_dim)
-             saved_state['prev_key_padding_mask'] = key_padding_mask
-
-             self._set_input_buffer(incremental_state, saved_state)
-
-         src_len = k.size(1)
-
-         # This is part of a workaround to get around fork/join parallelism
-         # not supporting Optional types.
-         if key_padding_mask is not None and key_padding_mask.shape == torch.Size([]):
-             key_padding_mask = None
-
-         if key_padding_mask is not None:
-             assert key_padding_mask.size(0) == bsz
-             assert key_padding_mask.size(1) == src_len
-
-         if self.add_zero_attn:
-             src_len += 1
-             k = torch.cat([k, k.new_zeros((k.size(0), 1) + k.size()[2:])], dim=1)
-             v = torch.cat([v, v.new_zeros((v.size(0), 1) + v.size()[2:])], dim=1)
-             if attn_mask is not None:
-                 attn_mask = torch.cat([attn_mask, attn_mask.new_zeros(attn_mask.size(0), 1)], dim=1)
-             if key_padding_mask is not None:
-                 key_padding_mask = torch.cat(
-                     [key_padding_mask, torch.zeros(key_padding_mask.size(0), 1).type_as(key_padding_mask)], dim=1)
-
-         attn_weights = torch.bmm(q, k.transpose(1, 2))
-         attn_weights = self.apply_sparse_mask(attn_weights, tgt_len, src_len, bsz)
-
-         assert list(attn_weights.size()) == [bsz * self.num_heads, tgt_len, src_len]
-
-         if attn_mask is not None:
-             if len(attn_mask.shape) == 2:
-                 attn_mask = attn_mask.unsqueeze(0)
-             elif len(attn_mask.shape) == 3:
-                 attn_mask = attn_mask[:, None].repeat([1, self.num_heads, 1, 1]).reshape(
-                     bsz * self.num_heads, tgt_len, src_len)
-             attn_weights = attn_weights + attn_mask
-
-         if enc_dec_attn_constraint_mask is not None:  # bs x head x L_kv
-             attn_weights = attn_weights.view(bsz, self.num_heads, tgt_len, src_len)
-             attn_weights = attn_weights.masked_fill(
-                 enc_dec_attn_constraint_mask.unsqueeze(2).bool(),
-                 -1e8,
-             )
-             attn_weights = attn_weights.view(bsz * self.num_heads, tgt_len, src_len)
-
-         if key_padding_mask is not None:
-             # don't attend to padding symbols
-             attn_weights = attn_weights.view(bsz, self.num_heads, tgt_len, src_len)
-             attn_weights = attn_weights.masked_fill(
-                 key_padding_mask.unsqueeze(1).unsqueeze(2),
-                 -1e8,
-             )
-             attn_weights = attn_weights.view(bsz * self.num_heads, tgt_len, src_len)
-
-         attn_logits = attn_weights.view(bsz, self.num_heads, tgt_len, src_len)
-
-         if before_softmax:
-             return attn_weights, v
-
-         attn_weights_float = softmax(attn_weights, dim=-1)
-         attn_weights = attn_weights_float.type_as(attn_weights)
-         attn_probs = F.dropout(attn_weights_float.type_as(attn_weights), p=self.dropout, training=self.training)
-
-         if reset_attn_weight is not None:
-             if reset_attn_weight:
-                 self.last_attn_probs = attn_probs.detach()
-             else:
-                 assert self.last_attn_probs is not None
-                 attn_probs = self.last_attn_probs
-         attn = torch.bmm(attn_probs, v)
-         assert list(attn.size()) == [bsz * self.num_heads, tgt_len, self.head_dim]
-         attn = attn.transpose(0, 1).contiguous().view(tgt_len, bsz, embed_dim)
-         attn = self.out_proj(attn)
-
-         if need_weights:
-             attn_weights = attn_weights_float.view(bsz, self.num_heads, tgt_len, src_len).transpose(1, 0)
-             if not need_head_weights:
-                 # average attention weights over heads
-                 attn_weights = attn_weights.mean(dim=0)
-         else:
-             attn_weights = None
-
-         return attn, (attn_weights, attn_logits)
-
-     def in_proj_qkv(self, query):
-         return self._in_proj(query).chunk(3, dim=-1)
-
-     def in_proj_q(self, query):
-         if self.qkv_same_dim:
-             return self._in_proj(query, end=self.embed_dim)
-         else:
-             bias = self.in_proj_bias
-             if bias is not None:
-                 bias = bias[:self.embed_dim]
-             return F.linear(query, self.q_proj_weight, bias)
-
-     def in_proj_k(self, key):
-         if self.qkv_same_dim:
-             return self._in_proj(key, start=self.embed_dim, end=2 * self.embed_dim)
-         else:
-             weight = self.k_proj_weight
-             bias = self.in_proj_bias
-             if bias is not None:
-                 bias = bias[self.embed_dim:2 * self.embed_dim]
-             return F.linear(key, weight, bias)
-
-     def in_proj_v(self, value):
-         if self.qkv_same_dim:
-             return self._in_proj(value, start=2 * self.embed_dim)
-         else:
-             weight = self.v_proj_weight
-             bias = self.in_proj_bias
-             if bias is not None:
-                 bias = bias[2 * self.embed_dim:]
-             return F.linear(value, weight, bias)
-
-     def _in_proj(self, input, start=0, end=None):
-         weight = self.in_proj_weight
-         bias = self.in_proj_bias
-         weight = weight[start:end, :]
-         if bias is not None:
-             bias = bias[start:end]
-         return F.linear(input, weight, bias)
-
-     def _get_input_buffer(self, incremental_state):
-         return get_incremental_state(
-             self,
-             incremental_state,
-             'attn_state',
-         ) or {}
-
-     def _set_input_buffer(self, incremental_state, buffer):
-         set_incremental_state(
-             self,
-             incremental_state,
-             'attn_state',
-             buffer,
-         )
-
-     def apply_sparse_mask(self, attn_weights, tgt_len, src_len, bsz):
-         return attn_weights
-
-     def clear_buffer(self, incremental_state=None):
-         if incremental_state is not None:
-             saved_state = self._get_input_buffer(incremental_state)
-             if 'prev_key' in saved_state:
-                 del saved_state['prev_key']
-             if 'prev_value' in saved_state:
-                 del saved_state['prev_value']
-             self._set_input_buffer(incremental_state, saved_state)
-
-
- class EncSALayer(nn.Module):
-     def __init__(self, c, num_heads, dropout, attention_dropout=0.1,
-                  relu_dropout=0.1, kernel_size=9, padding='SAME', act='gelu'):
-         super().__init__()
-         self.c = c
-         self.dropout = dropout
-         self.num_heads = num_heads
-         if num_heads > 0:
-             self.layer_norm1 = LayerNorm(c)
-             self.self_attn = MultiheadAttention(
-                 self.c, num_heads, self_attention=True, dropout=attention_dropout, bias=False)
-         self.layer_norm2 = LayerNorm(c)
-         self.ffn = TransformerFFNLayer(
-             c, 4 * c, kernel_size=kernel_size, dropout=relu_dropout, padding=padding, act=act)
-
-     def forward(self, x, encoder_padding_mask=None, **kwargs):
-         layer_norm_training = kwargs.get('layer_norm_training', None)
-         if layer_norm_training is not None:
-             self.layer_norm1.training = layer_norm_training
-             self.layer_norm2.training = layer_norm_training
-         if self.num_heads > 0:
-             residual = x
-             x = self.layer_norm1(x)
-             x, _, = self.self_attn(
-                 query=x,
-                 key=x,
-                 value=x,
-                 key_padding_mask=encoder_padding_mask
-             )
-             x = F.dropout(x, self.dropout, training=self.training)
-             x = residual + x
-             x = x * (1 - encoder_padding_mask.float()).transpose(0, 1)[..., None]
-
-         residual = x
-         x = self.layer_norm2(x)
-         x = self.ffn(x)
-         x = F.dropout(x, self.dropout, training=self.training)
-         x = residual + x
-         x = x * (1 - encoder_padding_mask.float()).transpose(0, 1)[..., None]
-         return x
-
-
- class DecSALayer(nn.Module):
-     def __init__(self, c, num_heads, dropout, attention_dropout=0.1, relu_dropout=0.1,
-                  kernel_size=9, act='gelu'):
-         super().__init__()
-         self.c = c
-         self.dropout = dropout
-         self.layer_norm1 = LayerNorm(c)
-         self.self_attn = MultiheadAttention(
-             c, num_heads, self_attention=True, dropout=attention_dropout, bias=False
-         )
-         self.layer_norm2 = LayerNorm(c)
-         self.encoder_attn = MultiheadAttention(
-             c, num_heads, encoder_decoder_attention=True, dropout=attention_dropout, bias=False,
-         )
-         self.layer_norm3 = LayerNorm(c)
-         self.ffn = TransformerFFNLayer(
-             c, 4 * c, padding='LEFT', kernel_size=kernel_size, dropout=relu_dropout, act=act)
-
-     def forward(
-             self,
-             x,
-             encoder_out=None,
-             encoder_padding_mask=None,
-             incremental_state=None,
-             self_attn_mask=None,
-             self_attn_padding_mask=None,
-             attn_out=None,
-             reset_attn_weight=None,
-             **kwargs,
-     ):
-         layer_norm_training = kwargs.get('layer_norm_training', None)
-         if layer_norm_training is not None:
-             self.layer_norm1.training = layer_norm_training
-             self.layer_norm2.training = layer_norm_training
-             self.layer_norm3.training = layer_norm_training
-         residual = x
-         x = self.layer_norm1(x)
-         x, _ = self.self_attn(
-             query=x,
-             key=x,
-             value=x,
-             key_padding_mask=self_attn_padding_mask,
-             incremental_state=incremental_state,
-             attn_mask=self_attn_mask
-         )
-         x = F.dropout(x, self.dropout, training=self.training)
-         x = residual + x
-
-         attn_logits = None
-         if encoder_out is not None or attn_out is not None:
-             residual = x
-             x = self.layer_norm2(x)
-         if encoder_out is not None:
-             x, attn = self.encoder_attn(
-                 query=x,
-                 key=encoder_out,
-                 value=encoder_out,
-                 key_padding_mask=encoder_padding_mask,
-                 incremental_state=incremental_state,
-                 static_kv=True,
-                 enc_dec_attn_constraint_mask=get_incremental_state(self, incremental_state,
-                                                                    'enc_dec_attn_constraint_mask'),
-                 reset_attn_weight=reset_attn_weight
-             )
-             attn_logits = attn[1]
-         elif attn_out is not None:
-             x = self.encoder_attn.in_proj_v(attn_out)
-         if encoder_out is not None or attn_out is not None:
-             x = F.dropout(x, self.dropout, training=self.training)
-             x = residual + x
-
-         residual = x
-         x = self.layer_norm3(x)
-         x = self.ffn(x, incremental_state=incremental_state)
-         x = F.dropout(x, self.dropout, training=self.training)
-         x = residual + x
-         return x, attn_logits
-
-     def clear_buffer(self, input, encoder_out=None, encoder_padding_mask=None, incremental_state=None):
-         self.encoder_attn.clear_buffer(incremental_state)
-         self.ffn.clear_buffer(incremental_state)
-
-     def set_buffer(self, name, tensor, incremental_state):
-         return set_incremental_state(self, incremental_state, name, tensor)
-
-
- class TransformerEncoderLayer(nn.Module):
-     def __init__(self, hidden_size, dropout, kernel_size=9, num_heads=2):
618
- super().__init__()
619
- self.hidden_size = hidden_size
620
- self.dropout = dropout
621
- self.num_heads = num_heads
622
- self.op = EncSALayer(
623
- hidden_size, num_heads, dropout=dropout,
624
- attention_dropout=0.0, relu_dropout=dropout,
625
- kernel_size=kernel_size)
626
-
627
- def forward(self, x, **kwargs):
628
- return self.op(x, **kwargs)
629
-
630
-
631
- class TransformerDecoderLayer(nn.Module):
632
- def __init__(self, hidden_size, dropout, kernel_size=9, num_heads=2):
633
- super().__init__()
634
- self.hidden_size = hidden_size
635
- self.dropout = dropout
636
- self.num_heads = num_heads
637
- self.op = DecSALayer(
638
- hidden_size, num_heads, dropout=dropout,
639
- attention_dropout=0.0, relu_dropout=dropout,
640
- kernel_size=kernel_size)
641
-
642
- def forward(self, x, **kwargs):
643
- return self.op(x, **kwargs)
644
-
645
- def clear_buffer(self, *args):
646
- return self.op.clear_buffer(*args)
647
-
648
- def set_buffer(self, *args):
649
- return self.op.set_buffer(*args)
650
-
651
-
652
- class FFTBlocks(nn.Module):
653
- def __init__(self, hidden_size, num_layers, ffn_kernel_size=9, dropout=0.0,
654
- num_heads=2, use_pos_embed=True, use_last_norm=True,
655
- use_pos_embed_alpha=True):
656
- super().__init__()
657
- self.num_layers = num_layers
658
- embed_dim = self.hidden_size = hidden_size
659
- self.dropout = dropout
660
- self.use_pos_embed = use_pos_embed
661
- self.use_last_norm = use_last_norm
662
- if use_pos_embed:
663
- self.max_source_positions = DEFAULT_MAX_TARGET_POSITIONS
664
- self.padding_idx = 0
665
- self.pos_embed_alpha = nn.Parameter(torch.Tensor([1])) if use_pos_embed_alpha else 1
666
- self.embed_positions = SinusoidalPositionalEmbedding(
667
- embed_dim, self.padding_idx, init_size=DEFAULT_MAX_TARGET_POSITIONS,
668
- )
669
-
670
- self.layers = nn.ModuleList([])
671
- self.layers.extend([
672
- TransformerEncoderLayer(self.hidden_size, self.dropout,
673
- kernel_size=ffn_kernel_size, num_heads=num_heads)
674
- for _ in range(self.num_layers)
675
- ])
676
- if self.use_last_norm:
677
- self.layer_norm = nn.LayerNorm(embed_dim)
678
- else:
679
- self.layer_norm = None
680
-
681
- def forward(self, x, padding_mask=None, attn_mask=None, return_hiddens=False):
682
- """
683
- :param x: [B, T, C]
684
- :param padding_mask: [B, T]
685
- :return: [B, T, C] or [L, B, T, C]
686
- """
687
- padding_mask = x.abs().sum(-1).eq(0).data if padding_mask is None else padding_mask
688
- nonpadding_mask_TB = 1 - padding_mask.transpose(0, 1).float()[:, :, None] # [T, B, 1]
689
- if self.use_pos_embed:
690
- positions = self.pos_embed_alpha * self.embed_positions(x[..., 0])
691
- x = x + positions
692
- x = F.dropout(x, p=self.dropout, training=self.training)
693
- # B x T x C -> T x B x C
694
- x = x.transpose(0, 1) * nonpadding_mask_TB
695
- hiddens = []
696
- for layer in self.layers:
697
- x = layer(x, encoder_padding_mask=padding_mask, attn_mask=attn_mask) * nonpadding_mask_TB
698
- hiddens.append(x)
699
- if self.use_last_norm:
700
- x = self.layer_norm(x) * nonpadding_mask_TB
701
- if return_hiddens:
702
- x = torch.stack(hiddens, 0) # [L, T, B, C]
703
- x = x.transpose(1, 2) # [L, B, T, C]
704
- else:
705
- x = x.transpose(0, 1) # [B, T, C]
706
- return x
707
-
708
-
709
- class FastSpeechEncoder(FFTBlocks):
710
- def __init__(self, dict_size, hidden_size=256, num_layers=4, kernel_size=9, num_heads=2,
711
- dropout=0.0):
712
- super().__init__(hidden_size, num_layers, kernel_size, num_heads=num_heads,
713
- use_pos_embed=False, dropout=dropout) # use_pos_embed_alpha for compatibility
714
- self.embed_tokens = Embedding(dict_size, hidden_size, 0)
715
- self.embed_scale = math.sqrt(hidden_size)
716
- self.padding_idx = 0
717
- self.embed_positions = SinusoidalPositionalEmbedding(
718
- hidden_size, self.padding_idx, init_size=DEFAULT_MAX_TARGET_POSITIONS,
719
- )
720
-
721
- def forward(self, txt_tokens, attn_mask=None):
722
- """
723
-
724
- :param txt_tokens: [B, T]
725
- :return: {
726
- 'encoder_out': [B x T x C]
727
- }
728
- """
729
- encoder_padding_mask = txt_tokens.eq(self.padding_idx).data
730
- x = self.forward_embedding(txt_tokens) # [B, T, H]
731
- if self.num_layers > 0:
732
- x = super(FastSpeechEncoder, self).forward(x, encoder_padding_mask, attn_mask=attn_mask)
733
- return x
734
-
735
- def forward_embedding(self, txt_tokens):
736
- # embed tokens and positions
737
- x = self.embed_scale * self.embed_tokens(txt_tokens)
738
- if self.use_pos_embed:
739
- positions = self.embed_positions(txt_tokens)
740
- x = x + positions
741
- x = F.dropout(x, p=self.dropout, training=self.training)
742
- return x
743
-
744
-
745
- class FastSpeechDecoder(FFTBlocks):
746
- def __init__(self, hidden_size=256, num_layers=4, kernel_size=9, num_heads=2):
747
- super().__init__(hidden_size, num_layers, kernel_size, num_heads=num_heads)
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
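The padding handling in `FFTBlocks.forward` above (`padding_mask = x.abs().sum(-1).eq(0)`) treats a timestep as padding exactly when its feature vector is all zeros. A dependency-free sketch of that rule (plain lists instead of the module's actual tensors):

```python
def infer_padding_mask(x):
    """Mirror FFTBlocks' rule: a timestep is padding iff its feature vector is all zeros.

    x: a batch as a list of sequences, each a list of feature vectors.
    Returns a [B, T] boolean mask, True at padded positions.
    """
    return [[sum(abs(v) for v in frame) == 0 for frame in seq] for seq in x]


batch = [
    [[0.5, -0.2], [0.1, 0.3], [0.0, 0.0]],  # last frame is padding
    [[1.0, 1.0], [0.0, 0.0], [0.0, 0.0]],   # two padded frames
]
mask = infer_padding_mask(batch)
assert mask == [[False, False, True], [False, True, True]]
```

The real module then broadcasts `1 - mask` over the channel dimension to zero out padded timesteps after every layer.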
spaces/AIGC-Audio/AudioGPT/NeuralSeq/utils/multiprocess_utils.py DELETED
@@ -1,54 +0,0 @@
- import os
- import traceback
- from multiprocessing import Queue, Process
-
-
- def chunked_worker(worker_id, map_func, args, results_queue=None, init_ctx_func=None):
-     ctx = init_ctx_func(worker_id) if init_ctx_func is not None else None
-     for job_idx, arg in args:
-         try:
-             if ctx is not None:
-                 res = map_func(*arg, ctx=ctx)
-             else:
-                 res = map_func(*arg)
-             results_queue.put((job_idx, res))
-         except:
-             traceback.print_exc()
-             results_queue.put((job_idx, None))
-
- def chunked_multiprocess_run(map_func, args, num_workers=None, ordered=True, init_ctx_func=None, q_max_size=1000):
-     args = zip(range(len(args)), args)
-     args = list(args)
-     n_jobs = len(args)
-     if num_workers is None:
-         num_workers = int(os.getenv('N_PROC', os.cpu_count()))
-     results_queues = []
-     if ordered:
-         for i in range(num_workers):
-             results_queues.append(Queue(maxsize=q_max_size // num_workers))
-     else:
-         results_queue = Queue(maxsize=q_max_size)
-         for i in range(num_workers):
-             results_queues.append(results_queue)
-     workers = []
-     for i in range(num_workers):
-         args_worker = args[i::num_workers]
-         p = Process(target=chunked_worker, args=(
-             i, map_func, args_worker, results_queues[i], init_ctx_func), daemon=True)
-         workers.append(p)
-         p.start()
-     for n_finished in range(n_jobs):
-         results_queue = results_queues[n_finished % num_workers]
-         job_idx, res = results_queue.get()
-         assert job_idx == n_finished or not ordered, (job_idx, n_finished)
-         yield res
-     for w in workers:
-         w.join()
-         w.close()
-
- def multiprocess_run_tqdm(map_func, args, num_workers=None, ordered=True, init_ctx_func=None,
-                           multithread=False, desc=None):
-     for i, res in tqdm(enumerate(
-             multiprocess_run(map_func, args, num_workers, ordered, init_ctx_func, multithread)),
-             total=len(args), desc=desc):
-         yield i, res
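The ordered mode of `chunked_multiprocess_run` above relies on two facts: worker `i` takes the shard `args[i::num_workers]` and writes results to its own queue in shard order, and the consumer reads queue `n_finished % num_workers` at each step. A process-free simulation (deques standing in for the real `multiprocessing.Queue`s) shows why this yields results in submission order:

```python
from collections import deque


def round_robin_order(n_jobs, num_workers):
    """Simulate chunked_multiprocess_run's ordered mode without spawning processes.

    Worker i handles jobs i, i + num_workers, ... and pushes results to its own
    queue in that order; reading queue (n_finished % num_workers) then recovers
    the global order 0, 1, 2, ...
    """
    queues = [deque(range(i, n_jobs, num_workers)) for i in range(num_workers)]
    out = []
    for n_finished in range(n_jobs):
        out.append(queues[n_finished % num_workers].popleft())
    return out


assert round_robin_order(10, 3) == list(range(10))
```

This also makes the `assert job_idx == n_finished` invariant in the original loop easy to see: each queue's head is always the next job index for its round-robin slot.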
spaces/AIGC-Audio/AudioGPT/text_to_speech/modules/tts/diffspeech/net.py DELETED
@@ -1,110 +0,0 @@
- import math
-
- import torch
- import torch.nn as nn
- import torch.nn.functional as F
-
- from math import sqrt
-
- Linear = nn.Linear
- ConvTranspose2d = nn.ConvTranspose2d
-
-
- class Mish(nn.Module):
-     def forward(self, x):
-         return x * torch.tanh(F.softplus(x))
-
-
- class SinusoidalPosEmb(nn.Module):
-     def __init__(self, dim):
-         super().__init__()
-         self.dim = dim
-
-     def forward(self, x):
-         device = x.device
-         half_dim = self.dim // 2
-         emb = math.log(10000) / (half_dim - 1)
-         emb = torch.exp(torch.arange(half_dim, device=device) * -emb)
-         emb = x[:, None] * emb[None, :]
-         emb = torch.cat((emb.sin(), emb.cos()), dim=-1)
-         return emb
-
-
- def Conv1d(*args, **kwargs):
-     layer = nn.Conv1d(*args, **kwargs)
-     nn.init.kaiming_normal_(layer.weight)
-     return layer
-
-
- class ResidualBlock(nn.Module):
-     def __init__(self, encoder_hidden, residual_channels, dilation):
-         super().__init__()
-         self.dilated_conv = Conv1d(residual_channels, 2 * residual_channels, 3, padding=dilation, dilation=dilation)
-         self.diffusion_projection = Linear(residual_channels, residual_channels)
-         self.conditioner_projection = Conv1d(encoder_hidden, 2 * residual_channels, 1)
-         self.output_projection = Conv1d(residual_channels, 2 * residual_channels, 1)
-
-     def forward(self, x, conditioner, diffusion_step):
-         diffusion_step = self.diffusion_projection(diffusion_step).unsqueeze(-1)
-         conditioner = self.conditioner_projection(conditioner)
-         y = x + diffusion_step
-
-         y = self.dilated_conv(y) + conditioner
-
-         gate, filter = torch.chunk(y, 2, dim=1)
-         y = torch.sigmoid(gate) * torch.tanh(filter)
-
-         y = self.output_projection(y)
-         residual, skip = torch.chunk(y, 2, dim=1)
-         return (x + residual) / sqrt(2.0), skip
-
-
- class DiffNet(nn.Module):
-     def __init__(self, hparams):
-         super().__init__()
-         in_dims = hparams['audio_num_mel_bins']
-         self.encoder_hidden = hparams['hidden_size']
-         self.residual_layers = hparams['residual_layers']
-         self.residual_channels = hparams['residual_channels']
-         self.dilation_cycle_length = hparams['dilation_cycle_length']
-
-         self.input_projection = Conv1d(in_dims, self.residual_channels, 1)
-         self.diffusion_embedding = SinusoidalPosEmb(self.residual_channels)
-         dim = self.residual_channels
-         self.mlp = nn.Sequential(
-             nn.Linear(dim, dim * 4),
-             Mish(),
-             nn.Linear(dim * 4, dim)
-         )
-         self.residual_layers = nn.ModuleList([
-             ResidualBlock(self.encoder_hidden, self.residual_channels, 2 ** (i % self.dilation_cycle_length))
-             for i in range(self.residual_layers)
-         ])
-         self.skip_projection = Conv1d(self.residual_channels, self.residual_channels, 1)
-         self.output_projection = Conv1d(self.residual_channels, in_dims, 1)
-         nn.init.zeros_(self.output_projection.weight)
-
-     def forward(self, spec, diffusion_step, cond):
-         """
-
-         :param spec: [B, 1, M, T]
-         :param diffusion_step: [B, 1]
-         :param cond: [B, M, T]
-         :return:
-         """
-         x = spec[:, 0]
-         x = self.input_projection(x)  # x [B, residual_channel, T]
-
-         x = F.relu(x)
-         diffusion_step = self.diffusion_embedding(diffusion_step)
-         diffusion_step = self.mlp(diffusion_step)
-         skip = []
-         for layer_id, layer in enumerate(self.residual_layers):
-             x, skip_connection = layer(x, cond, diffusion_step)
-             skip.append(skip_connection)
-
-         x = torch.sum(torch.stack(skip), dim=0) / sqrt(len(self.residual_layers))
-         x = self.skip_projection(x)
-         x = F.relu(x)
-         x = self.output_projection(x)  # [B, 80, T]
-         return x[:, None, :, :]
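`SinusoidalPosEmb` above builds geometric frequencies `exp(-i * log(10000) / (half_dim - 1))` and concatenates sines and cosines of the timestep against them. A pure-Python sketch for a single scalar step (the module itself operates on batched tensors):

```python
import math


def sinusoidal_pos_emb(t, dim):
    """Pure-Python sketch of SinusoidalPosEmb's forward pass for one scalar step t."""
    half_dim = dim // 2
    scale = math.log(10000) / (half_dim - 1)
    # Geometrically decaying frequencies: 1, exp(-scale), exp(-2*scale), ...
    freqs = [math.exp(-scale * i) for i in range(half_dim)]
    # First half: sines, second half: cosines, matching torch.cat((sin, cos), dim=-1).
    return [math.sin(t * f) for f in freqs] + [math.cos(t * f) for f in freqs]


emb = sinusoidal_pos_emb(50, 128)
assert len(emb) == 128
assert abs(emb[0] - math.sin(50)) < 1e-9    # lowest index uses frequency 1
assert abs(emb[64] - math.cos(50)) < 1e-9
```

Because the frequencies span many orders of magnitude, nearby diffusion steps get embeddings that differ smoothly in the low-index dimensions and rapidly in the high-frequency ones.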
spaces/AIGText/GlyphControl/ldm/models/diffusion/dpm_solver/sampler.py DELETED
@@ -1,87 +0,0 @@
- """SAMPLING ONLY."""
- import torch
-
- from .dpm_solver import NoiseScheduleVP, model_wrapper, DPM_Solver
-
-
- MODEL_TYPES = {
-     "eps": "noise",
-     "v": "v"
- }
-
-
- class DPMSolverSampler(object):
-     def __init__(self, model, **kwargs):
-         super().__init__()
-         self.model = model
-         to_torch = lambda x: x.clone().detach().to(torch.float32).to(model.device)
-         self.register_buffer('alphas_cumprod', to_torch(model.alphas_cumprod))
-
-     def register_buffer(self, name, attr):
-         if type(attr) == torch.Tensor:
-             if attr.device != torch.device("cuda"):
-                 attr = attr.to(torch.device("cuda"))
-         setattr(self, name, attr)
-
-     @torch.no_grad()
-     def sample(self,
-                S,
-                batch_size,
-                shape,
-                conditioning=None,
-                callback=None,
-                normals_sequence=None,
-                img_callback=None,
-                quantize_x0=False,
-                eta=0.,
-                mask=None,
-                x0=None,
-                temperature=1.,
-                noise_dropout=0.,
-                score_corrector=None,
-                corrector_kwargs=None,
-                verbose=True,
-                x_T=None,
-                log_every_t=100,
-                unconditional_guidance_scale=1.,
-                unconditional_conditioning=None,
-                # this has to come in the same format as the conditioning, # e.g. as encoded tokens, ...
-                **kwargs
-                ):
-         if conditioning is not None:
-             if isinstance(conditioning, dict):
-                 cbs = conditioning[list(conditioning.keys())[0]].shape[0]
-                 if cbs != batch_size:
-                     print(f"Warning: Got {cbs} conditionings but batch-size is {batch_size}")
-             else:
-                 if conditioning.shape[0] != batch_size:
-                     print(f"Warning: Got {conditioning.shape[0]} conditionings but batch-size is {batch_size}")
-
-         # sampling
-         C, H, W = shape
-         size = (batch_size, C, H, W)
-
-         print(f'Data shape for DPM-Solver sampling is {size}, sampling steps {S}')
-
-         device = self.model.betas.device
-         if x_T is None:
-             img = torch.randn(size, device=device)
-         else:
-             img = x_T
-
-         ns = NoiseScheduleVP('discrete', alphas_cumprod=self.alphas_cumprod)
-
-         model_fn = model_wrapper(
-             lambda x, t, c: self.model.apply_model(x, t, c),
-             ns,
-             model_type=MODEL_TYPES[self.model.parameterization],
-             guidance_type="classifier-free",
-             condition=conditioning,
-             unconditional_condition=unconditional_conditioning,
-             guidance_scale=unconditional_guidance_scale,
-         )
-
-         dpm_solver = DPM_Solver(model_fn, ns, predict_x0=True, thresholding=False)
-         x = dpm_solver.sample(img, steps=S, skip_type="time_uniform", method="multistep", order=2, lower_order_final=True)
-
-         return x.to(device), None
spaces/AgentVerse/agentVerse/agentverse/environments/base.py DELETED
@@ -1,58 +0,0 @@
- from __future__ import annotations
- from agentverse.logging import logger
-
- from abc import abstractmethod
- from typing import TYPE_CHECKING, Any, Dict, List
-
- from pydantic import BaseModel
-
- # from agentverse.agents.agent import Agent
-
- if TYPE_CHECKING:
-     from agentverse.agents.base import BaseAgent
-     from agentverse.message import Message
-
-
- class BaseRule(BaseModel):
-     pass
-
-
- class BaseEnvironment(BaseModel):
-     """
-     Base class for environment.
-
-     Args:
-         agents: List of agents
-         rule: Rule for the environment
-         max_turns: Maximum number of turns
-         cnt_turn: Current turn number
-         last_messages: Messages from last turn
-         rule_params: Variables set by the rule
-     """
-
-     agents: List[BaseAgent]
-     rule: BaseRule
-     max_turns: int = 10
-     cnt_turn: int = 0
-     last_messages: List[Message] = []
-     rule_params: Dict = {}
-
-     @abstractmethod
-     async def step(self) -> List[Message]:
-         """Run one step of the environment"""
-         pass
-
-     @abstractmethod
-     def reset(self) -> None:
-         """Reset the environment"""
-         pass
-
-     def report_metrics(self) -> None:
-         """Report useful metrics"""
-         total_spent = sum([agent.get_spend() for agent in self.agents])
-         logger.info(f"Total spent: ${total_spent}")
-         pass
-
-     def is_done(self) -> bool:
-         """Check if the environment is done"""
-         return self.cnt_turn >= self.max_turns
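The turn bookkeeping in `BaseEnvironment` (`cnt_turn`, `max_turns`, `is_done`) is easy to exercise in isolation. A minimal plain-Python sketch without the pydantic base class, agents, or async machinery:

```python
class MiniEnvironment:
    """Plain-Python sketch of BaseEnvironment's turn bookkeeping only."""

    def __init__(self, max_turns=10):
        self.max_turns = max_turns
        self.cnt_turn = 0

    def step(self):
        # A real subclass would also gather agent messages here.
        self.cnt_turn += 1

    def is_done(self):
        # Same termination rule as BaseEnvironment.is_done.
        return self.cnt_turn >= self.max_turns


env = MiniEnvironment(max_turns=3)
while not env.is_done():
    env.step()
assert env.cnt_turn == 3
```

The driver loop (`while not env.is_done(): env.step()`) is the shape a runner built on this base class would take, with `reset()` returning `cnt_turn` to zero between episodes.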
spaces/AgentVerse/agentVerse/ui/src/phaser3-rex-plugins/templates/spinner/dots/Dots.js DELETED
@@ -1,54 +0,0 @@
- import Base from '../base/Base.js';
- import { Circle } from '../utils/Geoms.js';
- import Yoyo from '../utils/Yoyo.js';
-
-
- const Linear = Phaser.Math.Linear;
-
- class Dots extends Base {
-     constructor(scene, config) {
-         super(scene, config);
-         this.type = 'rexSpinnerDots';
-     }
-
-     buildShapes() {
-         var cnt = 3;
-         for (var i = 0; i < cnt; i++) {
-             var dot = new Circle();
-             this.addShape(dot);
-
-             var offset = Yoyo(i / (cnt - 1)) / 2;
-             dot.setData('offset', offset);
-         }
-     }
-
-     updateShapes() {
-         var centerX = this.centerX;
-         var centerY = this.centerY;
-         var radius = this.radius;
-         var leftBound = centerX - radius;
-
-         var shapes = this.getShapes(),
-             cnt = shapes.length;
-         var cellWidth = (radius * 2) / cnt;
-         var maxDotRadius = cellWidth / 2;
-
-         for (var i = 0; i < cnt; i++) {
-             var dot = shapes[i];
-             var t = (this.value + dot.getData('offset')) % 1;
-             t = Yoyo(t);
-
-             var dotAlpha = Linear(0.25, 1, t);
-             var dotRadius = Linear(0.5, 1, t) * maxDotRadius;
-             dot
-                 .fillStyle(this.color, dotAlpha)
-                 .setRadius(dotRadius)
-                 .setCenterPosition(
-                     leftBound + (cellWidth * (i + 0.5)),
-                     centerY
-                 )
-         }
-     }
- }
-
- export default Dots;
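The geometry in `updateShapes` divides the spinner's width (`2 * radius`) into `cnt` equal cells and centers one dot per cell. A Python re-derivation of that layout; the triangle-wave behaviour of the `yoyo` helper is an assumption based on how `Yoyo` is used here, not the plugin's actual source:

```python
def yoyo(t):
    """Assumed Yoyo behaviour: ramp 0 -> 1 on [0, 0.5], back down to 0 on [0.5, 1]."""
    return 2 * t if t < 0.5 else 2 * (1 - t)


def dot_centers(center_x, radius, cnt=3):
    """Re-derive the evenly spaced dot x-positions from updateShapes()."""
    left = center_x - radius
    cell = (radius * 2) / cnt
    return [left + cell * (i + 0.5) for i in range(cnt)]


# With centerX=100, radius=30: three cells of width 20, centered at 80, 100, 120.
assert dot_centers(100, 30) == [80.0, 100.0, 120.0]
```

Each dot's per-frame phase `(value + offset) % 1` is then passed through `yoyo`, so alpha and radius pulse up and back down once per cycle, staggered across the three dots.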
spaces/AgentVerse/agentVerse/ui/src/phaser3-rex-plugins/templates/ui/fixwidthsizer/GetNearestChildIndex.js DELETED
@@ -1,41 +0,0 @@
- const DistanceBetween = Phaser.Math.Distance.Between;
-
- var GetNearestChildIndex = function (x, y) {
-     var children = this.sizerChildren;
-     if (children.length === 0) {
-         return -1;
-     }
-
-     var nearestIndex = -1,
-         minDistance = Infinity;
-     for (var i = 0, cnt = children.length; i < cnt; i++) {
-         var child = children[i];
-         // position is not at this line
-         if (Math.abs(child.centerY - y) > (child.height / 2)) {
-             continue;
-         }
-
-         // Check left bound
-         var distance = DistanceBetween(child.left, child.centerY, x, y);
-         if (minDistance > distance) {
-             minDistance = distance;
-             nearestIndex = i;
-         }
-
-         // Is last child of this line
-         var nextChild = children[i + 1];
-         if (nextChild && (nextChild.y === child.y)) {
-             continue;
-         }
-
-         var distance = DistanceBetween(child.right, child.centerY, x, y);
-         if (minDistance > distance) {
-             minDistance = distance;
-             nearestIndex = i + 1;
-         }
-     }
-
-     return nearestIndex;
- }
-
- export default GetNearestChildIndex;
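The search above scores each child's left edge, and additionally its right edge when the child is the last one on its line, returning an *insertion* index (which is why the right edge maps to `i + 1`). A Python sketch of the same logic, with children as plain dicts rather than Phaser game objects:

```python
import math


def nearest_child_index(children, x, y):
    """Python sketch of GetNearestChildIndex: children are dicts with
    left/right/centerY/height/y keys; returns an insertion index, or -1 if empty."""
    if not children:
        return -1
    best, best_d = -1, math.inf
    for i, c in enumerate(children):
        if abs(c["centerY"] - y) > c["height"] / 2:
            continue  # pointer is not on this line
        d = math.hypot(c["left"] - x, c["centerY"] - y)  # left edge -> insert before
        if d < best_d:
            best_d, best = d, i
        nxt = children[i + 1] if i + 1 < len(children) else None
        if nxt and nxt["y"] == c["y"]:
            continue  # not the last child of this line
        d = math.hypot(c["right"] - x, c["centerY"] - y)  # right edge -> insert after
        if d < best_d:
            best_d, best = d, i + 1
    return best


row = [{"left": j * 10, "right": j * 10 + 10, "centerY": 5, "height": 10, "y": 0}
       for j in range(3)]
# A point just past the right edge of the last child maps to insertion index 2... 3.
assert nearest_child_index(row, 21, 5) == 2
```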
spaces/Alven/background-remover/app.py DELETED
@@ -1,127 +0,0 @@
- import cv2
- import gradio as gr
- import numpy as np
- import onnxruntime
- import requests
- from huggingface_hub import hf_hub_download
- from PIL import Image
-
-
- # Get x_scale_factor & y_scale_factor to resize image
- def get_scale_factor(im_h, im_w, ref_size=512):
-
-     if max(im_h, im_w) < ref_size or min(im_h, im_w) > ref_size:
-         if im_w >= im_h:
-             im_rh = ref_size
-             im_rw = int(im_w / im_h * ref_size)
-         elif im_w < im_h:
-             im_rw = ref_size
-             im_rh = int(im_h / im_w * ref_size)
-     else:
-         im_rh = im_h
-         im_rw = im_w
-
-     im_rw = im_rw - im_rw % 32
-     im_rh = im_rh - im_rh % 32
-
-     x_scale_factor = im_rw / im_w
-     y_scale_factor = im_rh / im_h
-
-     return x_scale_factor, y_scale_factor
-
-
- MODEL_PATH = hf_hub_download('nateraw/background-remover-files', 'modnet.onnx', repo_type='dataset')
-
-
- def main(image_path, threshold):
-
-     # read image
-     im = cv2.imread(image_path)
-     im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
-
-     # unify image channels to 3
-     if len(im.shape) == 2:
-         im = im[:, :, None]
-     if im.shape[2] == 1:
-         im = np.repeat(im, 3, axis=2)
-     elif im.shape[2] == 4:
-         im = im[:, :, 0:3]
-
-     # normalize values to scale it between -1 to 1
-     im = (im - 127.5) / 127.5
-
-     im_h, im_w, im_c = im.shape
-     x, y = get_scale_factor(im_h, im_w)
-
-     # resize image
-     im = cv2.resize(im, None, fx=x, fy=y, interpolation=cv2.INTER_AREA)
-
-     # prepare input shape
-     im = np.transpose(im)
-     im = np.swapaxes(im, 1, 2)
-     im = np.expand_dims(im, axis=0).astype('float32')
-
-     # Initialize session and get prediction
-     session = onnxruntime.InferenceSession(MODEL_PATH, None)
-     input_name = session.get_inputs()[0].name
-     output_name = session.get_outputs()[0].name
-     result = session.run([output_name], {input_name: im})
-
-     # refine matte
-     matte = (np.squeeze(result[0]) * 255).astype('uint8')
-     matte = cv2.resize(matte, dsize=(im_w, im_h), interpolation=cv2.INTER_AREA)
-
-     # HACK - Could probably just convert this to PIL instead of writing
-     cv2.imwrite('out.png', matte)
-
-     image = Image.open(image_path)
-     matte = Image.open('out.png')
-
-     # obtain predicted foreground
-     image = np.asarray(image)
-     if len(image.shape) == 2:
-         image = image[:, :, None]
-     if image.shape[2] == 1:
-         image = np.repeat(image, 3, axis=2)
-     elif image.shape[2] == 4:
-         image = image[:, :, 0:3]
-
-     b, g, r = cv2.split(image)
-
-     mask = np.asarray(matte)
-     a = np.ones(mask.shape, dtype='uint8') * 255
-     alpha_im = cv2.merge([b, g, r, a], 4)
-     bg = np.zeros(alpha_im.shape)
-     new_mask = np.stack([mask, mask, mask, mask], axis=2)
-     foreground = np.where(new_mask > threshold, alpha_im, bg).astype(np.uint8)
-
-     return Image.fromarray(foreground)
-
-
- title = "MODNet Background Remover"
- description = "Gradio demo for MODNet, a model that can remove the background from a given image. To use it, simply upload your image, or click one of the examples to load them. Read more at the links below."
- article = "<div style='text-align: center;'> <a href='https://github.com/ZHKKKe/MODNet' target='_blank'>Github Repo</a> | <a href='https://arxiv.org/abs/2011.11961' target='_blank'>MODNet: Real-Time Trimap-Free Portrait Matting via Objective Decomposition</a> </div>"
-
- url = "https://huggingface.co/datasets/nateraw/background-remover-files/resolve/main/twitter_profile_pic.jpeg"
- image = Image.open(requests.get(url, stream=True).raw)
- image.save('twitter_profile_pic.jpg')
-
- url = "https://upload.wikimedia.org/wikipedia/commons/8/8d/President_Barack_Obama.jpg"
- image = Image.open(requests.get(url, stream=True).raw)
- image.save('obama.jpg')
-
- interface = gr.Interface(
-     fn=main,
-     inputs=[
-         gr.inputs.Image(type='filepath'),
-         gr.inputs.Slider(minimum=0, maximum=250, default=100, step=5, label='Mask Cutoff Threshold'),
-     ],
-     outputs='image',
-     examples=[['twitter_profile_pic.jpg', 120], ['obama.jpg', 155]],
-     title=title,
-     description=description,
-     article=article,
- )
-
- if __name__ == '__main__':
-     interface.launch(debug=True)
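The resize rule in `get_scale_factor` is worth checking in isolation: the short side is scaled toward `ref_size` only when the whole image is strictly smaller or strictly larger than it, and both sides are then snapped down to multiples of 32 (MODNet expects dimensions divisible by 32). A dependency-free restatement with a worked example:

```python
def get_scale_factor(im_h, im_w, ref_size=512):
    """Restatement of the app's resize rule, without OpenCV."""
    if max(im_h, im_w) < ref_size or min(im_h, im_w) > ref_size:
        # Scale the short side to ref_size, keeping the aspect ratio.
        if im_w >= im_h:
            im_rh = ref_size
            im_rw = int(im_w / im_h * ref_size)
        else:
            im_rw = ref_size
            im_rh = int(im_h / im_w * ref_size)
    else:
        im_rh, im_rw = im_h, im_w
    # Snap both sides down to a multiple of 32.
    im_rw -= im_rw % 32
    im_rh -= im_rh % 32
    return im_rw / im_w, im_rh / im_h


# A 1920x1080 image resizes to 896x512: 1080 -> 512, 1920 -> 910 -> 896 after snapping.
x, y = get_scale_factor(1080, 1920)
assert (round(x * 1920), round(y * 1080)) == (896, 512)
```

Note the snapping means the two scale factors are generally unequal, so `cv2.resize(im, None, fx=x, fy=y)` in `main` slightly changes the aspect ratio.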
spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/en/api/attnprocessor.md DELETED
@@ -1,42 +0,0 @@
- # Attention Processor
-
- An attention processor is a class for applying different types of attention mechanisms.
-
- ## AttnProcessor
- [[autodoc]] models.attention_processor.AttnProcessor
-
- ## AttnProcessor2_0
- [[autodoc]] models.attention_processor.AttnProcessor2_0
-
- ## LoRAAttnProcessor
- [[autodoc]] models.attention_processor.LoRAAttnProcessor
-
- ## LoRAAttnProcessor2_0
- [[autodoc]] models.attention_processor.LoRAAttnProcessor2_0
-
- ## CustomDiffusionAttnProcessor
- [[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor
-
- ## AttnAddedKVProcessor
- [[autodoc]] models.attention_processor.AttnAddedKVProcessor
-
- ## AttnAddedKVProcessor2_0
- [[autodoc]] models.attention_processor.AttnAddedKVProcessor2_0
-
- ## LoRAAttnAddedKVProcessor
- [[autodoc]] models.attention_processor.LoRAAttnAddedKVProcessor
-
- ## XFormersAttnProcessor
- [[autodoc]] models.attention_processor.XFormersAttnProcessor
-
- ## LoRAXFormersAttnProcessor
- [[autodoc]] models.attention_processor.LoRAXFormersAttnProcessor
-
- ## CustomDiffusionXFormersAttnProcessor
- [[autodoc]] models.attention_processor.CustomDiffusionXFormersAttnProcessor
-
- ## SlicedAttnProcessor
- [[autodoc]] models.attention_processor.SlicedAttnProcessor
-
- ## SlicedAttnAddedKVProcessor
- [[autodoc]] models.attention_processor.SlicedAttnAddedKVProcessor
spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/en/using-diffusers/pipeline_overview.md DELETED
@@ -1,17 +0,0 @@
- <!--Copyright 2023 The HuggingFace Team. All rights reserved.
-
- Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
- the License. You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
- an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
- specific language governing permissions and limitations under the License.
- -->
-
- # Overview
-
- A pipeline is an end-to-end class that provides a quick and easy way to use a diffusion system for inference by bundling independently trained models and schedulers together. Certain combinations of models and schedulers define specific pipeline types, like [`StableDiffusionPipeline`] or [`StableDiffusionControlNetPipeline`], with specific capabilities. All pipeline types inherit from the base [`DiffusionPipeline`] class; pass it any checkpoint, and it'll automatically detect the pipeline type and load the necessary components.
-
- This section introduces you to some of the tasks supported by our pipelines, such as unconditional image generation and different techniques and variations of text-to-image generation. You'll also learn how to gain more control over the generation process by setting a seed for reproducibility and weighting prompts to adjust the influence certain words in the prompt have over the output. Finally, you'll see how you can create a community pipeline for a custom task like generating images from speech.
spaces/Androidonnxfork/CivitAi-to-Diffusers/diffusers/docs/source/ko/using-diffusers/write_own_pipeline.md DELETED
@@ -1,290 +0,0 @@
- <!--Copyright 2023 The HuggingFace Team. All rights reserved.
-
- Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
- the License. You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
- an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
- specific language governing permissions and limitations under the License.
- -->
-
- # Understanding pipelines, models and schedulers
-
- [[open-in-colab]]
-
- 🧨 Diffusers is designed to be a user-friendly and flexible toolbox for building diffusion systems tailored to your use-case. At the core of the toolbox are models and schedulers. While the [`DiffusionPipeline`] bundles these components together for convenience, you can also unbundle the pipeline and use the models and schedulers separately to build new diffusion systems.
-
- In this tutorial, you'll learn how to use models and schedulers to assemble a diffusion system for inference, starting with a basic pipeline and then progressing to the Stable Diffusion pipeline.
-
- ## Deconstruct a basic pipeline
-
- A pipeline is a quick and easy way to run a model for inference, requiring no more than four lines of code to generate an image:
-
- ```py
- >>> from diffusers import DDPMPipeline
-
- >>> ddpm = DDPMPipeline.from_pretrained("google/ddpm-cat-256").to("cuda")
- >>> image = ddpm(num_inference_steps=25).images[0]
- >>> image
- ```
-
- <div class="flex justify-center">
-     <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/ddpm-cat.png" alt="Image of cat created from DDPMPipeline"/>
- </div>
-
- That was incredibly easy, but how did the pipeline do it? Let's break the pipeline down and take a look at what's happening under the hood.
-
- In the example above, the pipeline contains a [`UNet2DModel`] model and a [`DDPMScheduler`]. The pipeline denoises an image by taking random noise the size of the desired output and passing it through the model several times. At each timestep, the model predicts the *noise residual* and the scheduler uses it to predict a less noisy image. The pipeline repeats this process until it reaches the specified number of inference steps.
-
- To recreate the pipeline with the model and scheduler separately, let's write our own denoising process.
-
- 1. Load the model and scheduler:
-
- ```py
- >>> from diffusers import DDPMScheduler, UNet2DModel
-
- >>> scheduler = DDPMScheduler.from_pretrained("google/ddpm-cat-256")
- >>> model = UNet2DModel.from_pretrained("google/ddpm-cat-256").to("cuda")
- ```
-
- 2. Set the number of timesteps to run the denoising process for:
-
- ```py
- >>> scheduler.set_timesteps(50)
- ```
-
- 3. Setting the scheduler timesteps creates a tensor with evenly spaced elements in it (50 in this example). Each element corresponds to a timestep at which the model denoises the image. When you create the denoising loop later, you'll iterate over this tensor to denoise the image:
-
- ```py
- >>> scheduler.timesteps
- tensor([980, 960, 940, 920, 900, 880, 860, 840, 820, 800, 780, 760, 740, 720,
-     700, 680, 660, 640, 620, 600, 580, 560, 540, 520, 500, 480, 460, 440,
-     420, 400, 380, 360, 340, 320, 300, 280, 260, 240, 220, 200, 180, 160,
-     140, 120, 100, 80, 60, 40, 20, 0])
- ```
-
- 4. Create some random noise with the same shape as the desired output:
-
- ```py
- >>> import torch
-
- >>> sample_size = model.config.sample_size
- >>> noise = torch.randn((1, 3, sample_size, sample_size)).to("cuda")
- ```
-
- 5. Now write a loop to iterate over the timesteps. At each timestep, the model does a [`UNet2DModel.forward`] pass and returns the noisy residual. The scheduler's [`~DDPMScheduler.step`] method takes the noisy residual, timestep, and input, and predicts the image at the previous timestep. This output becomes the model's next input in the denoising loop, and it repeats until the end of the `timesteps` array is reached.
-
- ```py
- >>> input = noise
-
- >>> for t in scheduler.timesteps:
- ...     with torch.no_grad():
- ...         noisy_residual = model(input, t).sample
- ...     previous_noisy_sample = scheduler.step(noisy_residual, t, input).prev_sample
- ...     input = previous_noisy_sample
- ```
-
- This is the entire denoising process, and you can use the same pattern to write any diffusion system.
-
- 6. The last step is to convert the denoised output into an image:
-
- ```py
- >>> from PIL import Image
- >>> import numpy as np
-
- >>> image = (input / 2 + 0.5).clamp(0, 1)
- >>> image = image.cpu().permute(0, 2, 3, 1).numpy()[0]
- >>> image = Image.fromarray((image * 255).round().astype("uint8"))
- >>> image
- ```
-
- In the next section, you'll put your skills to the test and break down the more complex Stable Diffusion pipeline. The approach is more or less the same: initialize the necessary components and set the number of timesteps to create a `timestep` array. The denoising loop iterates over this array, and for each timestep the model outputs a noise residual which the scheduler uses to predict a less noisy image at the previous timestep. This process repeats until the end of the `timestep` array is reached.
-
- Let's try it out!
-
- ## Deconstruct the Stable Diffusion pipeline
-
- Stable Diffusion is a text-to-image *latent diffusion* model. It is called a latent diffusion model because it works with a lower-dimensional representation of the image instead of the actual pixel space, which makes it more memory efficient. The encoder compresses the image into a smaller representation, and the decoder converts the compressed representation back into an image. For a text-to-image model, you also need a tokenizer and an encoder to generate text embeddings. From the previous example, you already know you need a UNet model and a scheduler.
-
- As you can see, this is already more complex than the DDPM pipeline, which only contains a UNet model. The Stable Diffusion model has three separate pretrained models.
-
- <Tip>
-
- 💡 Read the [How does Stable Diffusion work?](https://huggingface.co/blog/stable_diffusion#how-does-stable-diffusion-work) blog post for more details about how the VAE, UNet, and text encoder models work.
-
- </Tip>
-
- Now that you know which components the Stable Diffusion pipeline needs, load them all with the [`~ModelMixin.from_pretrained`] method. You can find them in the pretrained [`runwayml/stable-diffusion-v1-5`](https://huggingface.co/runwayml/stable-diffusion-v1-5) checkpoint, where each component is stored in a separate subfolder:
-
- ```py
- >>> from PIL import Image
- >>> import torch
- >>> from transformers import CLIPTextModel, CLIPTokenizer
- >>> from diffusers import AutoencoderKL, UNet2DConditionModel, PNDMScheduler
-
- >>> vae = AutoencoderKL.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="vae")
- >>> tokenizer = CLIPTokenizer.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="tokenizer")
- >>> text_encoder = CLIPTextModel.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="text_encoder")
- >>> unet = UNet2DConditionModel.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="unet")
- ```
-
- Instead of the default [`PNDMScheduler`], swap it out for the [`UniPCMultistepScheduler`] to see how easy it is to plug in a different scheduler:
-
- ```py
- >>> from diffusers import UniPCMultistepScheduler
-
- >>> scheduler = UniPCMultistepScheduler.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="scheduler")
- ```
-
- To speed up inference, move the models to a GPU since, unlike the scheduler, they have trainable weights:
-
- ```py
- >>> torch_device = "cuda"
- >>> vae.to(torch_device)
- >>> text_encoder.to(torch_device)
- >>> unet.to(torch_device)
- ```
-
- ### Create text embeddings
-
- The next step is to tokenize the text to generate embeddings. The text is used to condition the UNet model and steer the diffusion process towards something resembling the input prompt.
-
- <Tip>
-
- 💡 The `guidance_scale` parameter determines how much weight is given to the prompt when generating the image.
-
- </Tip>
-
- Feel free to choose any prompt you like if you want to generate something else!
-
- ```py
- >>> prompt = ["a photograph of an astronaut riding a horse"]
- >>> height = 512  # default height of Stable Diffusion
- >>> width = 512  # default width of Stable Diffusion
- >>> num_inference_steps = 25  # number of denoising steps
- >>> guidance_scale = 7.5  # scale for classifier-free guidance
- >>> generator = torch.manual_seed(0)  # seed generator to create the initial latent noise
- >>> batch_size = len(prompt)
- ```
-
- Tokenize the text and generate the embeddings from the prompt:
-
- ```py
- >>> text_input = tokenizer(
- ...     prompt, padding="max_length", max_length=tokenizer.model_max_length, truncation=True, return_tensors="pt"
- ... )
-
- >>> with torch.no_grad():
- ...     text_embeddings = text_encoder(text_input.input_ids.to(torch_device))[0]
- ```
-
- You also need to generate the *unconditional text embeddings*, which are the embeddings for the padding token. These need to have the same shape (`batch_size` and `seq_length`) as the conditional `text_embeddings`:
-
- ```py
- >>> max_length = text_input.input_ids.shape[-1]
- >>> uncond_input = tokenizer([""] * batch_size, padding="max_length", max_length=max_length, return_tensors="pt")
- >>> uncond_embeddings = text_encoder(uncond_input.input_ids.to(torch_device))[0]
- ```
-
- Let's concatenate the conditional and unconditional embeddings into a batch to avoid doing two forward passes:
-
- ```py
- >>> text_embeddings = torch.cat([uncond_embeddings, text_embeddings])
- ```
-
- ### Create random noise
-
- Next, generate some initial random noise as a starting point for the diffusion process. This is the latent representation of the image, and it will be gradually denoised. At this point, the `latent` image is smaller than the final image size, but that's okay because the model will transform it into the final 512x512 image size later.
-
- <Tip>
-
- 💡 The height and width are divided by 8 because the `vae` model has 3 down-sampling layers. You can check by running the following:
-
- ```py
- 2 ** (len(vae.config.block_out_channels) - 1) == 8
- ```
-
- </Tip>
-
- ```py
- >>> latents = torch.randn(
- ...     (batch_size, unet.in_channels, height // 8, width // 8),
- ...     generator=generator,
- ... )
- >>> latents = latents.to(torch_device)
- ```
-
- ### Denoise the image
-
- Start by scaling the input with the initial noise distribution, *sigma*, the noise scale value required by improved schedulers like [`UniPCMultistepScheduler`]:
-
- ```py
- >>> latents = latents * scheduler.init_noise_sigma
- ```
-
- The last step is to create the denoising loop that progressively transforms the pure noise in `latents` into the image described by your prompt. Remember, the denoising loop needs to do three things:
-
- 1. Set the scheduler's timesteps to use during denoising.
- 2. Iterate over the timesteps.
- 3. At each timestep, call the UNet model to predict the noise residual and pass it to the scheduler to compute the previous noisy sample.
-
- ```py
- >>> from tqdm.auto import tqdm
-
- >>> scheduler.set_timesteps(num_inference_steps)
-
- >>> for t in tqdm(scheduler.timesteps):
- ...     # expand the latents if we are doing classifier-free guidance to avoid doing two forward passes.
- ...     latent_model_input = torch.cat([latents] * 2)
-
- ...     latent_model_input = scheduler.scale_model_input(latent_model_input, timestep=t)
-
- ...     # predict the noise residual
- ...     with torch.no_grad():
- ...         noise_pred = unet(latent_model_input, t, encoder_hidden_states=text_embeddings).sample
-
- ...     # perform guidance
- ...     noise_pred_uncond, noise_pred_text = noise_pred.chunk(2)
- ...     noise_pred = noise_pred_uncond + guidance_scale * (noise_pred_text - noise_pred_uncond)
-
- ...     # compute the previous noisy sample x_t -> x_t-1
- ...     latents = scheduler.step(noise_pred, t, latents).prev_sample
- ```
-
- ### Decode the image
-
- The final step is to use the `vae` to decode the latent representation into an image and get the decoded output with `sample`:
-
- ```py
- # scale and decode the image latents with the vae
- latents = 1 / 0.18215 * latents
- with torch.no_grad():
-     image = vae.decode(latents).sample
- ```
-
- Lastly, convert the image to a `PIL.Image` to see your generated image!
-
- ```py
- >>> image = (image / 2 + 0.5).clamp(0, 1)
- >>> image = image.detach().cpu().permute(0, 2, 3, 1).numpy()
- >>> images = (image * 255).round().astype("uint8")
- >>> pil_images = [Image.fromarray(image) for image in images]
- >>> pil_images[0]
- ```
-
- <div class="flex justify-center">
-     <img src="https://huggingface.co/blog/assets/98_stable_diffusion/stable_diffusion_k_lms.png"/>
- </div>
-
- ## Next steps
-
- From basic to complex pipelines, you've seen that all you really need to write your own diffusion system is a denoising loop. The loop should set the scheduler's timesteps, iterate over them, and alternate between calling the UNet model to predict the noise residual and passing it to the scheduler to compute the previous noisy sample.
-
- This is exactly what 🧨 Diffusers is designed for: to make it intuitive and easy to write your own diffusion system using models and schedulers.
-
- For your next steps, feel free to:
-
- * Learn how to [build and contribute a pipeline](using-diffusers/#contribute_pipeline) to 🧨 Diffusers. We can't wait to see what you come up with!
- * Explore [existing pipelines](./api/pipelines/overview) in the library, and see if you can deconstruct and build a pipeline from scratch using the models and schedulers separately.
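The closing summary of the deleted tutorial above boils every diffusion system down to one pattern: set the scheduler's timesteps, iterate over them, predict a noise residual, and let the scheduler step back to the previous, less noisy sample. As a toy sketch of just that control flow, using invented scalar stand-ins (`toy_model` and `toy_scheduler_step` are not Diffusers APIs):

```python
# Toy illustration of the denoising-loop pattern: a "model" predicts a
# noise residual and a "scheduler" subtracts it to step from sample x_t
# to the less noisy x_{t-1}. The arithmetic is purely illustrative.

def toy_model(sample: float, t: int) -> float:
    """Pretend noise predictor: guesses the noise still present in sample."""
    return sample * t / (t + 1)

def toy_scheduler_step(noise_residual: float, sample: float) -> float:
    """Pretend scheduler step: remove the predicted noise from the sample."""
    return sample - noise_residual

def denoise(initial_noise: float, num_steps: int) -> float:
    sample = initial_noise
    for t in reversed(range(1, num_steps + 1)):  # e.g. t = 5, 4, ..., 1
        residual = toy_model(sample, t)
        sample = toy_scheduler_step(residual, sample)  # x_t -> x_{t-1}
    return sample

print(denoise(1.0, 5))  # the sample shrinks toward 0 as noise is removed
```

Swapping in a real UNet for `toy_model` and a real scheduler's `step` recovers the loops shown in the tutorial.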
 
spaces/Andy1621/uniformer_image_detection/configs/_base_/datasets/wider_face.py DELETED
@@ -1,63 +0,0 @@
- # dataset settings
- dataset_type = 'WIDERFaceDataset'
- data_root = 'data/WIDERFace/'
- img_norm_cfg = dict(mean=[123.675, 116.28, 103.53], std=[1, 1, 1], to_rgb=True)
- train_pipeline = [
-     dict(type='LoadImageFromFile', to_float32=True),
-     dict(type='LoadAnnotations', with_bbox=True),
-     dict(
-         type='PhotoMetricDistortion',
-         brightness_delta=32,
-         contrast_range=(0.5, 1.5),
-         saturation_range=(0.5, 1.5),
-         hue_delta=18),
-     dict(
-         type='Expand',
-         mean=img_norm_cfg['mean'],
-         to_rgb=img_norm_cfg['to_rgb'],
-         ratio_range=(1, 4)),
-     dict(
-         type='MinIoURandomCrop',
-         min_ious=(0.1, 0.3, 0.5, 0.7, 0.9),
-         min_crop_size=0.3),
-     dict(type='Resize', img_scale=(300, 300), keep_ratio=False),
-     dict(type='Normalize', **img_norm_cfg),
-     dict(type='RandomFlip', flip_ratio=0.5),
-     dict(type='DefaultFormatBundle'),
-     dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels']),
- ]
- test_pipeline = [
-     dict(type='LoadImageFromFile'),
-     dict(
-         type='MultiScaleFlipAug',
-         img_scale=(300, 300),
-         flip=False,
-         transforms=[
-             dict(type='Resize', keep_ratio=False),
-             dict(type='Normalize', **img_norm_cfg),
-             dict(type='ImageToTensor', keys=['img']),
-             dict(type='Collect', keys=['img']),
-         ])
- ]
- data = dict(
-     samples_per_gpu=60,
-     workers_per_gpu=2,
-     train=dict(
-         type='RepeatDataset',
-         times=2,
-         dataset=dict(
-             type=dataset_type,
-             ann_file=data_root + 'train.txt',
-             img_prefix=data_root + 'WIDER_train/',
-             min_size=17,
-             pipeline=train_pipeline)),
-     val=dict(
-         type=dataset_type,
-         ann_file=data_root + 'val.txt',
-         img_prefix=data_root + 'WIDER_val/',
-         pipeline=test_pipeline),
-     test=dict(
-         type=dataset_type,
-         ann_file=data_root + 'val.txt',
-         img_prefix=data_root + 'WIDER_val/',
-         pipeline=test_pipeline))
 
 
spaces/Andy1621/uniformer_image_detection/configs/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco.py DELETED
@@ -1,10 +0,0 @@
- _base_ = './mask_rcnn_hrnetv2p_w18_1x_coco.py'
- model = dict(
-     pretrained='open-mmlab://msra/hrnetv2_w40',
-     backbone=dict(
-         type='HRNet',
-         extra=dict(
-             stage2=dict(num_channels=(40, 80)),
-             stage3=dict(num_channels=(40, 80, 160)),
-             stage4=dict(num_channels=(40, 80, 160, 320)))),
-     neck=dict(type='HRFPN', in_channels=[40, 80, 160, 320], out_channels=256))
 
 
 
 
 
 
 
 
 
 
 
spaces/Andy1621/uniformer_image_detection/tools/dist_train.sh DELETED
@@ -1,9 +0,0 @@
- #!/usr/bin/env bash
-
- CONFIG=$1
- GPUS=$2
- PORT=${PORT:-29500}
-
- PYTHONPATH="$(dirname $0)/..":$PYTHONPATH \
- python -m torch.distributed.launch --nproc_per_node=$GPUS --master_port=$PORT \
-     $(dirname "$0")/train.py $CONFIG --launcher pytorch ${@:3}
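The deleted launcher above leans on two bash idioms: `${PORT:-29500}` (use `PORT` from the environment, else a default) and `${@:3}` (forward every argument from the third onward to `train.py`). The same argument plumbing, sketched in Python for readability (`build_launch_args` is a made-up illustration, not part of the repo):

```python
import os

def build_launch_args(argv, env):
    """Mirror dist_train.sh: argv[0] is the config, argv[1] the GPU
    count, and everything else is forwarded verbatim (the ${@:3} idiom).
    PORT falls back to 29500 unless set in the environment."""
    config, gpus = argv[0], argv[1]
    extra = argv[2:]
    port = env.get("PORT", "29500")  # ${PORT:-29500}
    return [
        "python", "-m", "torch.distributed.launch",
        f"--nproc_per_node={gpus}", f"--master_port={port}",
        "train.py", config, "--launcher", "pytorch", *extra,
    ]

print(build_launch_args(["cfg.py", "4", "--seed", "0"], dict(os.environ)))
```

This is why extra flags like `--seed 0` can be appended after the GPU count without the script knowing about them.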
 
 
 
 
 
 
 
 
 
 
spaces/Andy1621/uniformer_image_segmentation/configs/dnlnet/dnl_r50-d8_512x512_80k_ade20k.py DELETED
@@ -1,6 +0,0 @@
- _base_ = [
-     '../_base_/models/dnl_r50-d8.py', '../_base_/datasets/ade20k.py',
-     '../_base_/default_runtime.py', '../_base_/schedules/schedule_80k.py'
- ]
- model = dict(
-     decode_head=dict(num_classes=150), auxiliary_head=dict(num_classes=150))
 
 
 
 
 
 
 
spaces/Andy1621/uniformer_image_segmentation/configs/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes.py DELETED
@@ -1,23 +0,0 @@
- _base_ = './lraspp_m-v3-d8_512x1024_320k_cityscapes.py'
- norm_cfg = dict(type='SyncBN', eps=0.001, requires_grad=True)
- model = dict(
-     type='EncoderDecoder',
-     pretrained='open-mmlab://contrib/mobilenet_v3_small',
-     backbone=dict(
-         type='MobileNetV3',
-         arch='small',
-         out_indices=(0, 1, 12),
-         norm_cfg=norm_cfg),
-     decode_head=dict(
-         type='LRASPPHead',
-         in_channels=(16, 16, 576),
-         in_index=(0, 1, 2),
-         channels=128,
-         input_transform='multiple_select',
-         dropout_ratio=0.1,
-         num_classes=19,
-         norm_cfg=norm_cfg,
-         act_cfg=dict(type='ReLU'),
-         align_corners=False,
-         loss_decode=dict(
-             type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0)))
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
spaces/AnimalEquality/chatbot/scripts/nbdev_readme_patch_hface.sh DELETED
@@ -1,18 +0,0 @@
- #!/bin/bash
- # Run from root dir
-
- nbdev_readme
- yaml_file="hface.yaml"
- readme_file="README.md"
-
- # Read the content of the YAML file
- yaml_content=$(cat "$yaml_file")
-
- # Read the content of the README file
- readme_content=$(cat "$readme_file")
-
- # Combine the YAML content and README content
- combined_content="$yaml_content\n$readme_content"
-
- # Overwrite the README file with the combined content
- echo -e "$combined_content" > "$readme_file"
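The deleted script above splices HuggingFace Space front matter (`hface.yaml`) onto the nbdev-generated README. A sketch of the same splice in Python, which also sidesteps a quirk of the shell version: `echo -e` would expand any literal backslash escapes (`\n`, `\t`) that happen to appear in the README body:

```python
# Prepend Space front matter to a README, as the shell script does with
# echo -e "$yaml_content\n$readme_content" > README.md.
from pathlib import Path

def patch_readme(yaml_path, readme_path):
    """Return the README content with the YAML front matter prepended."""
    front_matter = Path(yaml_path).read_text()
    body = Path(readme_path).read_text()
    return front_matter + "\n" + body

# usage (same file names as the script):
# Path("README.md").write_text(patch_readme("hface.yaml", "README.md"))
```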
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
spaces/AsakuraMizu/moe-tts/text/sanskrit.py DELETED
@@ -1,62 +0,0 @@
- import re
- from indic_transliteration import sanscript
-
-
- # List of (iast, ipa) pairs:
- _iast_to_ipa = [(re.compile('%s' % x[0]), x[1]) for x in [
-     ('a', 'ə'),
-     ('ā', 'aː'),
-     ('ī', 'iː'),
-     ('ū', 'uː'),
-     ('ṛ', 'ɹ`'),
-     ('ṝ', 'ɹ`ː'),
-     ('ḷ', 'l`'),
-     ('ḹ', 'l`ː'),
-     ('e', 'eː'),
-     ('o', 'oː'),
-     ('k', 'k⁼'),
-     ('k⁼h', 'kʰ'),
-     ('g', 'g⁼'),
-     ('g⁼h', 'gʰ'),
-     ('ṅ', 'ŋ'),
-     ('c', 'ʧ⁼'),
-     ('ʧ⁼h', 'ʧʰ'),
-     ('j', 'ʥ⁼'),
-     ('ʥ⁼h', 'ʥʰ'),
-     ('ñ', 'n^'),
-     ('ṭ', 't`⁼'),
-     ('t`⁼h', 't`ʰ'),
-     ('ḍ', 'd`⁼'),
-     ('d`⁼h', 'd`ʰ'),
-     ('ṇ', 'n`'),
-     ('t', 't⁼'),
-     ('t⁼h', 'tʰ'),
-     ('d', 'd⁼'),
-     ('d⁼h', 'dʰ'),
-     ('p', 'p⁼'),
-     ('p⁼h', 'pʰ'),
-     ('b', 'b⁼'),
-     ('b⁼h', 'bʰ'),
-     ('y', 'j'),
-     ('ś', 'ʃ'),
-     ('ṣ', 's`'),
-     ('r', 'ɾ'),
-     ('l̤', 'l`'),
-     ('h', 'ɦ'),
-     ("'", ''),
-     ('~', '^'),
-     ('ṃ', '^')
- ]]
-
-
- def devanagari_to_ipa(text):
-     text = text.replace('ॐ', 'ओम्')
-     text = re.sub(r'\s*।\s*$', '.', text)
-     text = re.sub(r'\s*।\s*', ', ', text)
-     text = re.sub(r'\s*॥', '.', text)
-     text = sanscript.transliterate(text, sanscript.DEVANAGARI, sanscript.IAST)
-     for regex, replacement in _iast_to_ipa:
-         text = re.sub(regex, replacement, text)
-     text = re.sub('(.)[`ː]*ḥ', lambda x: x.group(0)
-                   [:-1]+'h'+x.group(1)+'*', text)
-     return text
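A design note on the deleted table above: the substitutions are order-dependent. `'k'` is first rewritten to `'k⁼'`, and only then does a later rule turn the resulting `'k⁼h'` into the aspirated `'kʰ'`. A minimal standalone illustration of that chained-substitution idea (the two-rule table here is invented for the example):

```python
import re

# Order matters: the second rule matches the *output* of the first one,
# which is how the transliteration table above handles aspirated stops.
rules = [(re.compile(p), r) for p, r in [
    ("k", "k⁼"),     # plain velar stop gets the unaspirated marker
    ("k⁼h", "kʰ"),   # a following 'h' folds the marker into aspiration
]]

def apply(text):
    for regex, repl in rules:
        text = regex.sub(repl, text)
    return text

print(apply("kha"))  # -> 'kʰa'
print(apply("ka"))   # -> 'k⁼a'
```

Reordering the two rules would leave `'kha'` as `'k⁼ha'`, so the list order in such tables is load-bearing.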
 
spaces/Ataturk-Chatbot/HuggingFaceChat/venv/lib/python3.11/site-packages/pkg_resources/_vendor/jaraco/__init__.py DELETED
File without changes
spaces/Atualli/node-media-server/Dockerfile DELETED
@@ -1,13 +0,0 @@
- FROM node:alpine
- ENV PORT=7860
- # ENV PORT=7861
- # ENV UUID=d342d11e-d424-4583-b36e-524ab1f0afa4
- # EXPOSE 7860 7861
- WORKDIR /nms
- RUN npm i node-media-server
- #RUN npm uninstall node-media-server
- #CMD ["node-media-server","start"]
- #RUN nms -d -p 1935:1935 -p 8000:8000 -p 8443:8443 illuspas/node-media-server
- COPY . /nms
- CMD ["node", "app.js"]
- # RUN node app.js
 
 
 
 
 
 
 
 
 
 
 
 
 
 
spaces/Awesimo/jojogan/e4e/models/discriminator.py DELETED
@@ -1,20 +0,0 @@
- from torch import nn
-
-
- class LatentCodesDiscriminator(nn.Module):
-     def __init__(self, style_dim, n_mlp):
-         super().__init__()
-
-         self.style_dim = style_dim
-
-         layers = []
-         for i in range(n_mlp-1):
-             layers.append(
-                 nn.Linear(style_dim, style_dim)
-             )
-             layers.append(nn.LeakyReLU(0.2))
-         layers.append(nn.Linear(512, 1))
-         self.mlp = nn.Sequential(*layers)
-
-     def forward(self, w):
-         return self.mlp(w)
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
spaces/BetterAPI/BetterChat_new/src/routes/conversation/+server.ts DELETED
@@ -1,57 +0,0 @@
- import type { RequestHandler } from "./$types";
- import { collections } from "$lib/server/database";
- import { ObjectId } from "mongodb";
- import { error, redirect } from "@sveltejs/kit";
- import { base } from "$app/paths";
- import { z } from "zod";
- import type { Message } from "$lib/types/Message";
-
- export const POST: RequestHandler = async (input) => {
- 	const body = await input.request.text();
-
- 	let title = "";
- 	let messages: Message[] = [];
- 	let fromShareId: string | undefined;
-
- 	if (body) {
- 		fromShareId = z.object({ fromShare: z.string().optional() }).parse(JSON.parse(body)).fromShare;
-
- 		if (fromShareId) {
- 			const conversation = await collections.sharedConversations.findOne({
- 				_id: fromShareId,
- 			});
-
- 			if (!conversation) {
- 				throw error(404, "Conversation not found");
- 			}
-
- 			title = conversation.title;
- 			messages = conversation.messages;
- 		}
- 	}
-
- 	const res = await collections.conversations.insertOne({
- 		_id: new ObjectId(),
- 		title:
- 			title ||
- 			"Untitled " +
- 				((await collections.conversations.countDocuments({ sessionId: input.locals.sessionId })) +
- 					1),
- 		messages,
- 		createdAt: new Date(),
- 		updatedAt: new Date(),
- 		sessionId: input.locals.sessionId,
- 		...(fromShareId ? { meta: { fromShareId } } : {}),
- 	});
-
- 	return new Response(
- 		JSON.stringify({
- 			conversationId: res.insertedId.toString(),
- 		}),
- 		{ headers: { "Content-Type": "application/json" } }
- 	);
- };
-
- export const GET: RequestHandler = async () => {
- 	throw redirect(302, base || "/");
- };
 
spaces/Big-Web/MMSD/env/Lib/site-packages/setuptools/_distutils/msvccompiler.py DELETED
@@ -1,695 +0,0 @@
- """distutils.msvccompiler
-
- Contains MSVCCompiler, an implementation of the abstract CCompiler class
- for the Microsoft Visual Studio.
- """
-
- # Written by Perry Stoll
- # hacked by Robin Becker and Thomas Heller to do a better job of
- # finding DevStudio (through the registry)
-
- import sys
- import os
- import warnings
- from distutils.errors import (
-     DistutilsExecError,
-     DistutilsPlatformError,
-     CompileError,
-     LibError,
-     LinkError,
- )
- from distutils.ccompiler import CCompiler, gen_lib_options
- from distutils import log
-
- _can_read_reg = False
- try:
-     import winreg
-
-     _can_read_reg = True
-     hkey_mod = winreg
-
-     RegOpenKeyEx = winreg.OpenKeyEx
-     RegEnumKey = winreg.EnumKey
-     RegEnumValue = winreg.EnumValue
-     RegError = winreg.error
-
- except ImportError:
-     try:
-         import win32api
-         import win32con
-
-         _can_read_reg = True
-         hkey_mod = win32con
-
-         RegOpenKeyEx = win32api.RegOpenKeyEx
-         RegEnumKey = win32api.RegEnumKey
-         RegEnumValue = win32api.RegEnumValue
-         RegError = win32api.error
-     except ImportError:
-         log.info(
-             "Warning: Can't read registry to find the "
-             "necessary compiler setting\n"
-             "Make sure that Python modules winreg, "
-             "win32api or win32con are installed."
-         )
-         pass
-
- if _can_read_reg:
-     HKEYS = (
-         hkey_mod.HKEY_USERS,
-         hkey_mod.HKEY_CURRENT_USER,
-         hkey_mod.HKEY_LOCAL_MACHINE,
-         hkey_mod.HKEY_CLASSES_ROOT,
-     )
-
-
- warnings.warn(
-     "msvccompiler is deprecated and slated to be removed "
-     "in the future. Please discontinue use or file an issue "
-     "with pypa/distutils describing your use case.",
-     DeprecationWarning,
- )
-
-
- def read_keys(base, key):
-     """Return list of registry keys."""
-     try:
-         handle = RegOpenKeyEx(base, key)
-     except RegError:
-         return None
-     L = []
-     i = 0
-     while True:
-         try:
-             k = RegEnumKey(handle, i)
-         except RegError:
-             break
-         L.append(k)
-         i += 1
-     return L
-
-
- def read_values(base, key):
-     """Return dict of registry keys and values.
-
-     All names are converted to lowercase.
-     """
-     try:
-         handle = RegOpenKeyEx(base, key)
-     except RegError:
-         return None
-     d = {}
-     i = 0
-     while True:
-         try:
-             name, value, type = RegEnumValue(handle, i)
-         except RegError:
-             break
-         name = name.lower()
-         d[convert_mbcs(name)] = convert_mbcs(value)
-         i += 1
-     return d
-
-
- def convert_mbcs(s):
-     dec = getattr(s, "decode", None)
-     if dec is not None:
-         try:
-             s = dec("mbcs")
-         except UnicodeError:
-             pass
-     return s
-
-
- class MacroExpander:
-     def __init__(self, version):
-         self.macros = {}
-         self.load_macros(version)
-
-     def set_macro(self, macro, path, key):
-         for base in HKEYS:
-             d = read_values(base, path)
-             if d:
-                 self.macros["$(%s)" % macro] = d[key]
-                 break
-
-     def load_macros(self, version):
-         vsbase = r"Software\Microsoft\VisualStudio\%0.1f" % version
-         self.set_macro("VCInstallDir", vsbase + r"\Setup\VC", "productdir")
-         self.set_macro("VSInstallDir", vsbase + r"\Setup\VS", "productdir")
-         net = r"Software\Microsoft\.NETFramework"
-         self.set_macro("FrameworkDir", net, "installroot")
-         try:
-             if version > 7.0:
-                 self.set_macro("FrameworkSDKDir", net, "sdkinstallrootv1.1")
-             else:
-                 self.set_macro("FrameworkSDKDir", net, "sdkinstallroot")
-         except KeyError:
-             raise DistutilsPlatformError(
-                 """Python was built with Visual Studio 2003;
- extensions must be built with a compiler that can generate compatible binaries.
- Visual Studio 2003 was not found on this system. If you have Cygwin installed,
- you can try compiling with MingW32, by passing "-c mingw32" to setup.py."""
-             )
-
-         p = r"Software\Microsoft\NET Framework Setup\Product"
-         for base in HKEYS:
-             try:
-                 h = RegOpenKeyEx(base, p)
-             except RegError:
-                 continue
-             key = RegEnumKey(h, 0)
-             d = read_values(base, r"{}\{}".format(p, key))
-             self.macros["$(FrameworkVersion)"] = d["version"]
-
-     def sub(self, s):
-         for k, v in self.macros.items():
-             s = s.replace(k, v)
-         return s
-
-
- def get_build_version():
-     """Return the version of MSVC that was used to build Python.
-
-     For Python 2.3 and up, the version number is included in
-     sys.version. For earlier versions, assume the compiler is MSVC 6.
-     """
-     prefix = "MSC v."
-     i = sys.version.find(prefix)
-     if i == -1:
-         return 6
-     i = i + len(prefix)
-     s, rest = sys.version[i:].split(" ", 1)
-     majorVersion = int(s[:-2]) - 6
-     if majorVersion >= 13:
-         # v13 was skipped and should be v14
-         majorVersion += 1
-     minorVersion = int(s[2:3]) / 10.0
-     # I don't think paths are affected by minor version in version 6
-     if majorVersion == 6:
-         minorVersion = 0
-     if majorVersion >= 6:
-         return majorVersion + minorVersion
-     # else we don't know what version of the compiler this is
-     return None
-
-
- def get_build_architecture():
-     """Return the processor architecture.
-
-     Possible results are "Intel" or "AMD64".
-     """
-
-     prefix = " bit ("
-     i = sys.version.find(prefix)
-     if i == -1:
-         return "Intel"
-     j = sys.version.find(")", i)
-     return sys.version[i + len(prefix) : j]
-
-
- def normalize_and_reduce_paths(paths):
-     """Return a list of normalized paths with duplicates removed.
-
-     The current order of paths is maintained.
-     """
-     # Paths are normalized so things like: /a and /a/ aren't both preserved.
-     reduced_paths = []
-     for p in paths:
-         np = os.path.normpath(p)
-         # XXX(nnorwitz): O(n**2), if reduced_paths gets long perhaps use a set.
-         if np not in reduced_paths:
-             reduced_paths.append(np)
-     return reduced_paths
-
-
- class MSVCCompiler(CCompiler):
-     """Concrete class that implements an interface to Microsoft Visual C++,
-     as defined by the CCompiler abstract class."""
-
-     compiler_type = 'msvc'
-
-     # Just set this so CCompiler's constructor doesn't barf. We currently
-     # don't use the 'set_executables()' bureaucracy provided by CCompiler,
-     # as it really isn't necessary for this sort of single-compiler class.
-     # Would be nice to have a consistent interface with UnixCCompiler,
-     # though, so it's worth thinking about.
-     executables = {}
-
-     # Private class data (need to distinguish C from C++ source for compiler)
-     _c_extensions = ['.c']
-     _cpp_extensions = ['.cc', '.cpp', '.cxx']
-     _rc_extensions = ['.rc']
-     _mc_extensions = ['.mc']
-
-     # Needed for the filename generation methods provided by the
-     # base class, CCompiler.
-     src_extensions = _c_extensions + _cpp_extensions + _rc_extensions + _mc_extensions
-     res_extension = '.res'
-     obj_extension = '.obj'
-     static_lib_extension = '.lib'
-     shared_lib_extension = '.dll'
-     static_lib_format = shared_lib_format = '%s%s'
-     exe_extension = '.exe'
-
-     def __init__(self, verbose=0, dry_run=0, force=0):
-         super().__init__(verbose, dry_run, force)
-         self.__version = get_build_version()
-         self.__arch = get_build_architecture()
-         if self.__arch == "Intel":
-             # x86
-             if self.__version >= 7:
-                 self.__root = r"Software\Microsoft\VisualStudio"
-                 self.__macros = MacroExpander(self.__version)
-             else:
-                 self.__root = r"Software\Microsoft\Devstudio"
-             self.__product = "Visual Studio version %s" % self.__version
-         else:
-             # Win64. Assume this was built with the platform SDK
-             self.__product = "Microsoft SDK compiler %s" % (self.__version + 6)
-
-         self.initialized = False
-
-     def initialize(self):
-         self.__paths = []
-         if (
-             "DISTUTILS_USE_SDK" in os.environ
-             and "MSSdk" in os.environ
-             and self.find_exe("cl.exe")
-         ):
-             # Assume that the SDK set up everything alright; don't try to be
-             # smarter
-             self.cc = "cl.exe"
-             self.linker = "link.exe"
-             self.lib = "lib.exe"
-             self.rc = "rc.exe"
-             self.mc = "mc.exe"
-         else:
-             self.__paths = self.get_msvc_paths("path")
-
-             if len(self.__paths) == 0:
-                 raise DistutilsPlatformError(
-                     "Python was built with %s, "
-                     "and extensions need to be built with the same "
-                     "version of the compiler, but it isn't installed." % self.__product
-                 )
-
-             self.cc = self.find_exe("cl.exe")
-             self.linker = self.find_exe("link.exe")
-             self.lib = self.find_exe("lib.exe")
-             self.rc = self.find_exe("rc.exe")  # resource compiler
-             self.mc = self.find_exe("mc.exe")  # message compiler
-             self.set_path_env_var('lib')
-             self.set_path_env_var('include')
-
-             # extend the MSVC path with the current path
-             try:
-                 for p in os.environ['path'].split(';'):
-                     self.__paths.append(p)
-             except KeyError:
-                 pass
-             self.__paths = normalize_and_reduce_paths(self.__paths)
-             os.environ['path'] = ";".join(self.__paths)
-
-         self.preprocess_options = None
-         if self.__arch == "Intel":
-             self.compile_options = ['/nologo', '/O2', '/MD', '/W3', '/GX', '/DNDEBUG']
-             self.compile_options_debug = [
-                 '/nologo',
-                 '/Od',
-                 '/MDd',
-                 '/W3',
-                 '/GX',
-                 '/Z7',
-                 '/D_DEBUG',
-             ]
-         else:
-             # Win64
-             self.compile_options = ['/nologo', '/O2', '/MD', '/W3', '/GS-', '/DNDEBUG']
-             self.compile_options_debug = [
-                 '/nologo',
-                 '/Od',
-                 '/MDd',
-                 '/W3',
-                 '/GS-',
-                 '/Z7',
-                 '/D_DEBUG',
-             ]
-
-         self.ldflags_shared = ['/DLL', '/nologo', '/INCREMENTAL:NO']
-         if self.__version >= 7:
-             self.ldflags_shared_debug = ['/DLL', '/nologo', '/INCREMENTAL:no', '/DEBUG']
-         else:
-             self.ldflags_shared_debug = [
-                 '/DLL',
-                 '/nologo',
-                 '/INCREMENTAL:no',
-                 '/pdb:None',
-                 '/DEBUG',
-             ]
-         self.ldflags_static = ['/nologo']
-
-         self.initialized = True
-
-     # -- Worker methods ------------------------------------------------
-
-     def object_filenames(self, source_filenames, strip_dir=0, output_dir=''):
-         # Copied from ccompiler.py, extended to return .res as 'object'-file
-         # for .rc input file
-         if output_dir is None:
-             output_dir = ''
-         obj_names = []
-         for src_name in source_filenames:
-             (base, ext) = os.path.splitext(src_name)
-             base = os.path.splitdrive(base)[1]  # Chop off the drive
-             base = base[os.path.isabs(base) :]  # If abs, chop off leading /
-             if ext not in self.src_extensions:
-                 # Better to raise an exception instead of silently continuing
-                 # and later complain about sources and targets having
-                 # different lengths
-                 raise CompileError("Don't know how to compile %s" % src_name)
-             if strip_dir:
-                 base = os.path.basename(base)
-             if ext in self._rc_extensions:
-                 obj_names.append(os.path.join(output_dir, base + self.res_extension))
-             elif ext in self._mc_extensions:
-                 obj_names.append(os.path.join(output_dir, base + self.res_extension))
-             else:
-                 obj_names.append(os.path.join(output_dir, base + self.obj_extension))
-         return obj_names
-
-     def compile(  # noqa: C901
-         self,
-         sources,
-         output_dir=None,
-         macros=None,
-         include_dirs=None,
-         debug=0,
-         extra_preargs=None,
-         extra_postargs=None,
-         depends=None,
-     ):
-
-         if not self.initialized:
-             self.initialize()
-         compile_info = self._setup_compile(
-             output_dir, macros, include_dirs, sources, depends, extra_postargs
-         )
-         macros, objects, extra_postargs, pp_opts, build = compile_info
-
-         compile_opts = extra_preargs or []
-         compile_opts.append('/c')
-         if debug:
-             compile_opts.extend(self.compile_options_debug)
-         else:
-             compile_opts.extend(self.compile_options)
-
-         for obj in objects:
-             try:
-                 src, ext = build[obj]
-             except KeyError:
-                 continue
-             if debug:
-                 # pass the full pathname to MSVC in debug mode,
-                 # this allows the debugger to find the source file
-                 # without asking the user to browse for it
-                 src = os.path.abspath(src)
-
-             if ext in self._c_extensions:
-                 input_opt = "/Tc" + src
-             elif ext in self._cpp_extensions:
-                 input_opt = "/Tp" + src
-             elif ext in self._rc_extensions:
-                 # compile .RC to .RES file
-                 input_opt = src
-                 output_opt = "/fo" + obj
-                 try:
-                     self.spawn([self.rc] + pp_opts + [output_opt] + [input_opt])
-                 except DistutilsExecError as msg:
-                     raise CompileError(msg)
-                 continue
-             elif ext in self._mc_extensions:
-                 # Compile .MC to .RC file to .RES file.
-                 #   * '-h dir' specifies the directory for the
-                 #     generated include file
-                 #   * '-r dir' specifies the target directory of the
-                 #     generated RC file and the binary message resource
-                 #     it includes
-                 #
-                 # For now (since there are no options to change this),
- # we use the source-directory for the include file and
441
- # the build directory for the RC file and message
442
- # resources. This works at least for win32all.
443
- h_dir = os.path.dirname(src)
444
- rc_dir = os.path.dirname(obj)
445
- try:
446
- # first compile .MC to .RC and .H file
447
- self.spawn([self.mc] + ['-h', h_dir, '-r', rc_dir] + [src])
448
- base, _ = os.path.splitext(os.path.basename(src))
449
- rc_file = os.path.join(rc_dir, base + '.rc')
450
- # then compile .RC to .RES file
451
- self.spawn([self.rc] + ["/fo" + obj] + [rc_file])
452
-
453
- except DistutilsExecError as msg:
454
- raise CompileError(msg)
455
- continue
456
- else:
457
- # how to handle this file?
458
- raise CompileError(
459
- "Don't know how to compile {} to {}".format(src, obj)
460
- )
461
-
462
- output_opt = "/Fo" + obj
463
- try:
464
- self.spawn(
465
- [self.cc]
466
- + compile_opts
467
- + pp_opts
468
- + [input_opt, output_opt]
469
- + extra_postargs
470
- )
471
- except DistutilsExecError as msg:
472
- raise CompileError(msg)
473
-
474
- return objects
475
-
476
- def create_static_lib(
477
- self, objects, output_libname, output_dir=None, debug=0, target_lang=None
478
- ):
479
-
480
- if not self.initialized:
481
- self.initialize()
482
- (objects, output_dir) = self._fix_object_args(objects, output_dir)
483
- output_filename = self.library_filename(output_libname, output_dir=output_dir)
484
-
485
- if self._need_link(objects, output_filename):
486
- lib_args = objects + ['/OUT:' + output_filename]
487
- if debug:
488
- pass # XXX what goes here?
489
- try:
490
- self.spawn([self.lib] + lib_args)
491
- except DistutilsExecError as msg:
492
- raise LibError(msg)
493
- else:
494
- log.debug("skipping %s (up-to-date)", output_filename)
495
-
496
- def link( # noqa: C901
497
- self,
498
- target_desc,
499
- objects,
500
- output_filename,
501
- output_dir=None,
502
- libraries=None,
503
- library_dirs=None,
504
- runtime_library_dirs=None,
505
- export_symbols=None,
506
- debug=0,
507
- extra_preargs=None,
508
- extra_postargs=None,
509
- build_temp=None,
510
- target_lang=None,
511
- ):
512
-
513
- if not self.initialized:
514
- self.initialize()
515
- (objects, output_dir) = self._fix_object_args(objects, output_dir)
516
- fixed_args = self._fix_lib_args(libraries, library_dirs, runtime_library_dirs)
517
- (libraries, library_dirs, runtime_library_dirs) = fixed_args
518
-
519
- if runtime_library_dirs:
520
- self.warn(
521
- "I don't know what to do with 'runtime_library_dirs': "
522
- + str(runtime_library_dirs)
523
- )
524
-
525
- lib_opts = gen_lib_options(self, library_dirs, runtime_library_dirs, libraries)
526
- if output_dir is not None:
527
- output_filename = os.path.join(output_dir, output_filename)
528
-
529
- if self._need_link(objects, output_filename):
530
- if target_desc == CCompiler.EXECUTABLE:
531
- if debug:
532
- ldflags = self.ldflags_shared_debug[1:]
533
- else:
534
- ldflags = self.ldflags_shared[1:]
535
- else:
536
- if debug:
537
- ldflags = self.ldflags_shared_debug
538
- else:
539
- ldflags = self.ldflags_shared
540
-
541
- export_opts = []
542
- for sym in export_symbols or []:
543
- export_opts.append("/EXPORT:" + sym)
544
-
545
- ld_args = (
546
- ldflags + lib_opts + export_opts + objects + ['/OUT:' + output_filename]
547
- )
548
-
549
- # The MSVC linker generates .lib and .exp files, which cannot be
550
- # suppressed by any linker switches. The .lib files may even be
551
- # needed! Make sure they are generated in the temporary build
552
- # directory. Since they have different names for debug and release
553
- # builds, they can go into the same directory.
554
- if export_symbols is not None:
555
- (dll_name, dll_ext) = os.path.splitext(
556
- os.path.basename(output_filename)
557
- )
558
- implib_file = os.path.join(
559
- os.path.dirname(objects[0]), self.library_filename(dll_name)
560
- )
561
- ld_args.append('/IMPLIB:' + implib_file)
562
-
563
- if extra_preargs:
564
- ld_args[:0] = extra_preargs
565
- if extra_postargs:
566
- ld_args.extend(extra_postargs)
567
-
568
- self.mkpath(os.path.dirname(output_filename))
569
- try:
570
- self.spawn([self.linker] + ld_args)
571
- except DistutilsExecError as msg:
572
- raise LinkError(msg)
573
-
574
- else:
575
- log.debug("skipping %s (up-to-date)", output_filename)
576
-
577
- # -- Miscellaneous methods -----------------------------------------
578
- # These are all used by the 'gen_lib_options() function, in
579
- # ccompiler.py.
580
-
581
- def library_dir_option(self, dir):
582
- return "/LIBPATH:" + dir
583
-
584
- def runtime_library_dir_option(self, dir):
585
- raise DistutilsPlatformError(
586
- "don't know how to set runtime library search path for MSVC++"
587
- )
588
-
589
- def library_option(self, lib):
590
- return self.library_filename(lib)
591
-
592
- def find_library_file(self, dirs, lib, debug=0):
593
- # Prefer a debugging library if found (and requested), but deal
594
- # with it if we don't have one.
595
- if debug:
596
- try_names = [lib + "_d", lib]
597
- else:
598
- try_names = [lib]
599
- for dir in dirs:
600
- for name in try_names:
601
- libfile = os.path.join(dir, self.library_filename(name))
602
- if os.path.exists(libfile):
603
- return libfile
604
- else:
605
- # Oops, didn't find it in *any* of 'dirs'
606
- return None
607
-
608
- # Helper methods for using the MSVC registry settings
609
-
610
- def find_exe(self, exe):
611
- """Return path to an MSVC executable program.
612
-
613
- Tries to find the program in several places: first, one of the
614
- MSVC program search paths from the registry; next, the directories
615
- in the PATH environment variable. If any of those work, return an
616
- absolute path that is known to exist. If none of them work, just
617
- return the original program name, 'exe'.
618
- """
619
- for p in self.__paths:
620
- fn = os.path.join(os.path.abspath(p), exe)
621
- if os.path.isfile(fn):
622
- return fn
623
-
624
- # didn't find it; try existing path
625
- for p in os.environ['Path'].split(';'):
626
- fn = os.path.join(os.path.abspath(p), exe)
627
- if os.path.isfile(fn):
628
- return fn
629
-
630
- return exe
631
-
632
- def get_msvc_paths(self, path, platform='x86'):
633
- """Get a list of devstudio directories (include, lib or path).
634
-
635
- Return a list of strings. The list will be empty if unable to
636
- access the registry or appropriate registry keys not found.
637
- """
638
- if not _can_read_reg:
639
- return []
640
-
641
- path = path + " dirs"
642
- if self.__version >= 7:
643
- key = r"{}\{:0.1f}\VC\VC_OBJECTS_PLATFORM_INFO\Win32\Directories".format(
644
- self.__root,
645
- self.__version,
646
- )
647
- else:
648
- key = (
649
- r"%s\6.0\Build System\Components\Platforms"
650
- r"\Win32 (%s)\Directories" % (self.__root, platform)
651
- )
652
-
653
- for base in HKEYS:
654
- d = read_values(base, key)
655
- if d:
656
- if self.__version >= 7:
657
- return self.__macros.sub(d[path]).split(";")
658
- else:
659
- return d[path].split(";")
660
- # MSVC 6 seems to create the registry entries we need only when
661
- # the GUI is run.
662
- if self.__version == 6:
663
- for base in HKEYS:
664
- if read_values(base, r"%s\6.0" % self.__root) is not None:
665
- self.warn(
666
- "It seems you have Visual Studio 6 installed, "
667
- "but the expected registry settings are not present.\n"
668
- "You must at least run the Visual Studio GUI once "
669
- "so that these entries are created."
670
- )
671
- break
672
- return []
673
-
674
- def set_path_env_var(self, name):
675
- """Set environment variable 'name' to an MSVC path type value.
676
-
677
- This is equivalent to a SET command prior to execution of spawned
678
- commands.
679
- """
680
-
681
- if name == "lib":
682
- p = self.get_msvc_paths("library")
683
- else:
684
- p = self.get_msvc_paths(name)
685
- if p:
686
- os.environ[name] = ';'.join(p)
687
-
688
-
689
- if get_build_version() >= 8.0:
690
- log.debug("Importing new compiler from distutils.msvc9compiler")
691
- OldMSVCCompiler = MSVCCompiler
692
- from distutils.msvc9compiler import MSVCCompiler
693
-
694
- # get_build_architecture not really relevant now we support cross-compile
695
- from distutils.msvc9compiler import MacroExpander # noqa: F811
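The suffix mapping done by `object_filenames` above (`.rc`/`.mc` sources become `.res` resources, everything else becomes `.obj`) can be sketched as a standalone function. This is a simplified illustration, not the distutils API: the extension lists and suffixes below are assumptions standing in for the compiler's instance attributes, and the drive/absolute-path handling is omitted.

```python
import os

# Assumed stand-ins for the MSVCCompiler instance attributes.
SRC_EXTENSIONS = ['.c', '.cc', '.cpp', '.cxx', '.rc', '.mc']
RC_EXTENSIONS = ['.rc']
MC_EXTENSIONS = ['.mc']
OBJ_EXTENSION = '.obj'
RES_EXTENSION = '.res'


def object_filenames(source_filenames, strip_dir=False, output_dir=''):
    """Map each source file to its output name: .rc/.mc -> .res, else .obj."""
    obj_names = []
    for src_name in source_filenames:
        base, ext = os.path.splitext(src_name)
        if ext not in SRC_EXTENSIONS:
            # Fail loudly instead of producing a mismatched objects list.
            raise ValueError("Don't know how to compile %s" % src_name)
        if strip_dir:
            base = os.path.basename(base)
        suffix = RES_EXTENSION if ext in RC_EXTENSIONS + MC_EXTENSIONS else OBJ_EXTENSION
        obj_names.append(os.path.join(output_dir, base + suffix))
    return obj_names
```

Note that `.mc` sources map to `.res` just like `.rc`, because the message compiler output is ultimately fed through the resource compiler, as the `compile` method above shows.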
spaces/Brij1808/Blog_Generator/app.py DELETED
@@ -1,20 +0,0 @@
-import os
-import gradio as gr
-from transformers import pipeline
-
-
-def blog_gen(txt):
-    generator = pipeline(task='text-generation', model='gpt2')
-    gen_blog = generator(txt, max_length=300, num_return_sequences=2)
-    remove_gen_text = list(map(lambda x: x["generated_text"], gen_blog))
-    clean_text = list(map(lambda x: x.replace("\n\n", " "), remove_gen_text))
-    return clean_text
-
-
-iface = gr.Interface(fn=blog_gen,
-                     inputs=[
-                         gr.inputs.Textbox(
-                             lines=2, placeholder=None, label='Sentence'),
-                     ],
-                     outputs=[gr.outputs.JSON(label=None)])
-iface.launch()
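The post-processing in `blog_gen` (extract each `generated_text`, collapse double newlines) can be exercised without downloading GPT-2 by faking the pipeline's output shape. A minimal sketch, assuming the `transformers` text-generation pipeline's list-of-dicts return format:

```python
def clean_generations(gen_blog):
    """Replicate blog_gen's post-processing on pipeline-style output."""
    # Pull out the generated strings, then flatten paragraph breaks.
    texts = [item["generated_text"] for item in gen_blog]
    return [t.replace("\n\n", " ") for t in texts]


# Fake output shaped like the transformers text-generation pipeline's.
fake = [
    {"generated_text": "First paragraph.\n\nSecond paragraph."},
    {"generated_text": "One block only."},
]
print(clean_generations(fake))
# → ['First paragraph. Second paragraph.', 'One block only.']
```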
spaces/CVPR/LIVE/pybind11/tests/test_class.py DELETED
@@ -1,333 +0,0 @@
-# -*- coding: utf-8 -*-
-import pytest
-
-import env  # noqa: F401
-
-from pybind11_tests import class_ as m
-from pybind11_tests import UserType, ConstructorStats
-
-
-def test_repr():
-    # In Python 3.3+, repr() accesses __qualname__
-    assert "pybind11_type" in repr(type(UserType))
-    assert "UserType" in repr(UserType)
-
-
-def test_instance(msg):
-    with pytest.raises(TypeError) as excinfo:
-        m.NoConstructor()
-    assert msg(excinfo.value) == "m.class_.NoConstructor: No constructor defined!"
-
-    instance = m.NoConstructor.new_instance()
-
-    cstats = ConstructorStats.get(m.NoConstructor)
-    assert cstats.alive() == 1
-    del instance
-    assert cstats.alive() == 0
-
-
-def test_docstrings(doc):
-    assert doc(UserType) == "A `py::class_` type for testing"
-    assert UserType.__name__ == "UserType"
-    assert UserType.__module__ == "pybind11_tests"
-    assert UserType.get_value.__name__ == "get_value"
-    assert UserType.get_value.__module__ == "pybind11_tests"
-
-    assert doc(UserType.get_value) == """
-        get_value(self: m.UserType) -> int
-
-        Get value using a method
-    """
-    assert doc(UserType.value) == "Get/set value using a property"
-
-    assert doc(m.NoConstructor.new_instance) == """
-        new_instance() -> m.class_.NoConstructor
-
-        Return an instance
-    """
-
-
-def test_qualname(doc):
-    """Tests that a properly qualified name is set in __qualname__ (even in pre-3.3, where we
-    backport the attribute) and that generated docstrings properly use it and the module name"""
-    assert m.NestBase.__qualname__ == "NestBase"
-    assert m.NestBase.Nested.__qualname__ == "NestBase.Nested"
-
-    assert doc(m.NestBase.__init__) == """
-        __init__(self: m.class_.NestBase) -> None
-    """
-    assert doc(m.NestBase.g) == """
-        g(self: m.class_.NestBase, arg0: m.class_.NestBase.Nested) -> None
-    """
-    assert doc(m.NestBase.Nested.__init__) == """
-        __init__(self: m.class_.NestBase.Nested) -> None
-    """
-    assert doc(m.NestBase.Nested.fn) == """
-        fn(self: m.class_.NestBase.Nested, arg0: int, arg1: m.class_.NestBase, arg2: m.class_.NestBase.Nested) -> None
-    """  # noqa: E501 line too long
-    assert doc(m.NestBase.Nested.fa) == """
-        fa(self: m.class_.NestBase.Nested, a: int, b: m.class_.NestBase, c: m.class_.NestBase.Nested) -> None
-    """  # noqa: E501 line too long
-    assert m.NestBase.__module__ == "pybind11_tests.class_"
-    assert m.NestBase.Nested.__module__ == "pybind11_tests.class_"
-
-
-def test_inheritance(msg):
-    roger = m.Rabbit('Rabbit')
-    assert roger.name() + " is a " + roger.species() == "Rabbit is a parrot"
-    assert m.pet_name_species(roger) == "Rabbit is a parrot"
-
-    polly = m.Pet('Polly', 'parrot')
-    assert polly.name() + " is a " + polly.species() == "Polly is a parrot"
-    assert m.pet_name_species(polly) == "Polly is a parrot"
-
-    molly = m.Dog('Molly')
-    assert molly.name() + " is a " + molly.species() == "Molly is a dog"
-    assert m.pet_name_species(molly) == "Molly is a dog"
-
-    fred = m.Hamster('Fred')
-    assert fred.name() + " is a " + fred.species() == "Fred is a rodent"
-
-    assert m.dog_bark(molly) == "Woof!"
-
-    with pytest.raises(TypeError) as excinfo:
-        m.dog_bark(polly)
-    assert msg(excinfo.value) == """
-        dog_bark(): incompatible function arguments. The following argument types are supported:
-            1. (arg0: m.class_.Dog) -> str
-
-        Invoked with: <m.class_.Pet object at 0>
-    """
-
-    with pytest.raises(TypeError) as excinfo:
-        m.Chimera("lion", "goat")
-    assert "No constructor defined!" in str(excinfo.value)
-
-
-def test_inheritance_init(msg):
-
-    # Single base
-    class Python(m.Pet):
-        def __init__(self):
-            pass
-    with pytest.raises(TypeError) as exc_info:
-        Python()
-    expected = ["m.class_.Pet.__init__() must be called when overriding __init__",
-                "Pet.__init__() must be called when overriding __init__"]  # PyPy?
-    # TODO: fix PyPy error message wrt. tp_name/__qualname__?
-    assert msg(exc_info.value) in expected
-
-    # Multiple bases
-    class RabbitHamster(m.Rabbit, m.Hamster):
-        def __init__(self):
-            m.Rabbit.__init__(self, "RabbitHamster")
-
-    with pytest.raises(TypeError) as exc_info:
-        RabbitHamster()
-    expected = ["m.class_.Hamster.__init__() must be called when overriding __init__",
-                "Hamster.__init__() must be called when overriding __init__"]  # PyPy
-    assert msg(exc_info.value) in expected
-
-
-def test_automatic_upcasting():
-    assert type(m.return_class_1()).__name__ == "DerivedClass1"
-    assert type(m.return_class_2()).__name__ == "DerivedClass2"
-    assert type(m.return_none()).__name__ == "NoneType"
-    # Repeat these a few times in a random order to ensure no invalid caching is applied
-    assert type(m.return_class_n(1)).__name__ == "DerivedClass1"
-    assert type(m.return_class_n(2)).__name__ == "DerivedClass2"
-    assert type(m.return_class_n(0)).__name__ == "BaseClass"
-    assert type(m.return_class_n(2)).__name__ == "DerivedClass2"
-    assert type(m.return_class_n(2)).__name__ == "DerivedClass2"
-    assert type(m.return_class_n(0)).__name__ == "BaseClass"
-    assert type(m.return_class_n(1)).__name__ == "DerivedClass1"
-
-
-def test_isinstance():
-    objects = [tuple(), dict(), m.Pet("Polly", "parrot")] + [m.Dog("Molly")] * 4
-    expected = (True, True, True, True, True, False, False)
-    assert m.check_instances(objects) == expected
-
-
-def test_mismatched_holder():
-    import re
-
-    with pytest.raises(RuntimeError) as excinfo:
-        m.mismatched_holder_1()
-    assert re.match('generic_type: type ".*MismatchDerived1" does not have a non-default '
-                    'holder type while its base ".*MismatchBase1" does', str(excinfo.value))
-
-    with pytest.raises(RuntimeError) as excinfo:
-        m.mismatched_holder_2()
-    assert re.match('generic_type: type ".*MismatchDerived2" has a non-default holder type '
-                    'while its base ".*MismatchBase2" does not', str(excinfo.value))
-
-
-def test_override_static():
-    """#511: problem with inheritance + overwritten def_static"""
-    b = m.MyBase.make()
-    d1 = m.MyDerived.make2()
-    d2 = m.MyDerived.make()
-
-    assert isinstance(b, m.MyBase)
-    assert isinstance(d1, m.MyDerived)
-    assert isinstance(d2, m.MyDerived)
-
-
-def test_implicit_conversion_life_support():
-    """Ensure the lifetime of temporary objects created for implicit conversions"""
-    assert m.implicitly_convert_argument(UserType(5)) == 5
-    assert m.implicitly_convert_variable(UserType(5)) == 5
-
-    assert "outside a bound function" in m.implicitly_convert_variable_fail(UserType(5))
-
-
-def test_operator_new_delete(capture):
-    """Tests that class-specific operator new/delete functions are invoked"""
-
-    class SubAliased(m.AliasedHasOpNewDelSize):
-        pass
-
-    with capture:
-        a = m.HasOpNewDel()
-        b = m.HasOpNewDelSize()
-        d = m.HasOpNewDelBoth()
-    assert capture == """
-        A new 8
-        B new 4
-        D new 32
-    """
-    sz_alias = str(m.AliasedHasOpNewDelSize.size_alias)
-    sz_noalias = str(m.AliasedHasOpNewDelSize.size_noalias)
-    with capture:
-        c = m.AliasedHasOpNewDelSize()
-        c2 = SubAliased()
-    assert capture == (
-        "C new " + sz_noalias + "\n" +
-        "C new " + sz_alias + "\n"
-    )
-
-    with capture:
-        del a
-        pytest.gc_collect()
-        del b
-        pytest.gc_collect()
-        del d
-        pytest.gc_collect()
-    assert capture == """
-        A delete
-        B delete 4
-        D delete
-    """
-
-    with capture:
-        del c
-        pytest.gc_collect()
-        del c2
-        pytest.gc_collect()
-    assert capture == (
-        "C delete " + sz_noalias + "\n" +
-        "C delete " + sz_alias + "\n"
-    )
-
-
-def test_bind_protected_functions():
-    """Expose protected member functions to Python using a helper class"""
-    a = m.ProtectedA()
-    assert a.foo() == 42
-
-    b = m.ProtectedB()
-    assert b.foo() == 42
-
-    class C(m.ProtectedB):
-        def __init__(self):
-            m.ProtectedB.__init__(self)
-
-        def foo(self):
-            return 0
-
-    c = C()
-    assert c.foo() == 0
-
-
-def test_brace_initialization():
-    """ Tests that simple POD classes can be constructed using C++11 brace initialization """
-    a = m.BraceInitialization(123, "test")
-    assert a.field1 == 123
-    assert a.field2 == "test"
-
-    # Tests that a non-simple class doesn't get brace initialization (if the
-    # class defines an initializer_list constructor, in particular, it would
-    # win over the expected constructor).
-    b = m.NoBraceInitialization([123, 456])
-    assert b.vec == [123, 456]
-
-
-@pytest.mark.xfail("env.PYPY")
-def test_class_refcount():
-    """Instances must correctly increase/decrease the reference count of their types (#1029)"""
-    from sys import getrefcount
-
-    class PyDog(m.Dog):
-        pass
-
-    for cls in m.Dog, PyDog:
-        refcount_1 = getrefcount(cls)
-        molly = [cls("Molly") for _ in range(10)]
-        refcount_2 = getrefcount(cls)
-
-        del molly
-        pytest.gc_collect()
-        refcount_3 = getrefcount(cls)
-
-        assert refcount_1 == refcount_3
-        assert refcount_2 > refcount_1
-
-
-def test_reentrant_implicit_conversion_failure(msg):
-    # ensure that there is no runaway reentrant implicit conversion (#1035)
-    with pytest.raises(TypeError) as excinfo:
-        m.BogusImplicitConversion(0)
-    assert msg(excinfo.value) == '''
-        __init__(): incompatible constructor arguments. The following argument types are supported:
-            1. m.class_.BogusImplicitConversion(arg0: m.class_.BogusImplicitConversion)
-
-        Invoked with: 0
-    '''
-
-
-def test_error_after_conversions():
-    with pytest.raises(TypeError) as exc_info:
-        m.test_error_after_conversions("hello")
-    assert str(exc_info.value).startswith(
-        "Unable to convert function return value to a Python type!")
-
-
-def test_aligned():
-    if hasattr(m, "Aligned"):
-        p = m.Aligned().ptr()
-        assert p % 1024 == 0
-
-
-# https://foss.heptapod.net/pypy/pypy/-/issues/2742
-@pytest.mark.xfail("env.PYPY")
-def test_final():
-    with pytest.raises(TypeError) as exc_info:
-        class PyFinalChild(m.IsFinal):
-            pass
-    assert str(exc_info.value).endswith("is not an acceptable base type")
-
-
-# https://foss.heptapod.net/pypy/pypy/-/issues/2742
-@pytest.mark.xfail("env.PYPY")
-def test_non_final_final():
-    with pytest.raises(TypeError) as exc_info:
-        class PyNonFinalFinalChild(m.IsNonFinalFinal):
-            pass
-    assert str(exc_info.value).endswith("is not an acceptable base type")
-
-
-# https://github.com/pybind/pybind11/issues/1878
-def test_exception_rvalue_abort():
-    with pytest.raises(RuntimeError):
-        m.PyPrintDestructor().throw_something()
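The `test_class_refcount` pattern above relies on each instance holding a reference to its type, so creating instances raises the type's refcount and releasing them restores it. The same effect is observable on a plain Python class; a minimal sketch, CPython-specific since it uses `sys.getrefcount`:

```python
import gc
import sys


class Dog:
    def __init__(self, name):
        self.name = name


refcount_1 = sys.getrefcount(Dog)
molly = [Dog("Molly") for _ in range(10)]  # each instance references Dog via its type slot
refcount_2 = sys.getrefcount(Dog)

del molly
gc.collect()
refcount_3 = sys.getrefcount(Dog)

assert refcount_2 > refcount_1   # instances raised the type's refcount
assert refcount_1 == refcount_3  # releasing them restored it
```

pybind11 issue #1029, which this test guards against, was precisely bound instances failing to hold such a reference to their (Python-derived) type.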
spaces/CVPR/LIVE/pybind11/tests/test_embed/test_interpreter.cpp DELETED
@@ -1,284 +0,0 @@
-#include <pybind11/embed.h>
-
-#ifdef _MSC_VER
-// Silence MSVC C++17 deprecation warning from Catch regarding std::uncaught_exceptions (up to catch
-// 2.0.1; this should be fixed in the next catch release after 2.0.1).
-#  pragma warning(disable: 4996)
-#endif
-
-#include <catch.hpp>
-
-#include <thread>
-#include <fstream>
-#include <functional>
-
-namespace py = pybind11;
-using namespace py::literals;
-
-class Widget {
-public:
-    Widget(std::string message) : message(message) { }
-    virtual ~Widget() = default;
-
-    std::string the_message() const { return message; }
-    virtual int the_answer() const = 0;
-
-private:
-    std::string message;
-};
-
-class PyWidget final : public Widget {
-    using Widget::Widget;
-
-    int the_answer() const override { PYBIND11_OVERLOAD_PURE(int, Widget, the_answer); }
-};
-
-PYBIND11_EMBEDDED_MODULE(widget_module, m) {
-    py::class_<Widget, PyWidget>(m, "Widget")
-        .def(py::init<std::string>())
-        .def_property_readonly("the_message", &Widget::the_message);
-
-    m.def("add", [](int i, int j) { return i + j; });
-}
-
-PYBIND11_EMBEDDED_MODULE(throw_exception, ) {
-    throw std::runtime_error("C++ Error");
-}
-
-PYBIND11_EMBEDDED_MODULE(throw_error_already_set, ) {
-    auto d = py::dict();
-    d["missing"].cast<py::object>();
-}
-
-TEST_CASE("Pass classes and data between modules defined in C++ and Python") {
-    auto module = py::module::import("test_interpreter");
-    REQUIRE(py::hasattr(module, "DerivedWidget"));
-
-    auto locals = py::dict("hello"_a="Hello, World!", "x"_a=5, **module.attr("__dict__"));
-    py::exec(R"(
-        widget = DerivedWidget("{} - {}".format(hello, x))
-        message = widget.the_message
-    )", py::globals(), locals);
-    REQUIRE(locals["message"].cast<std::string>() == "Hello, World! - 5");
-
-    auto py_widget = module.attr("DerivedWidget")("The question");
-    auto message = py_widget.attr("the_message");
-    REQUIRE(message.cast<std::string>() == "The question");
-
-    const auto &cpp_widget = py_widget.cast<const Widget &>();
-    REQUIRE(cpp_widget.the_answer() == 42);
-}
-
-TEST_CASE("Import error handling") {
-    REQUIRE_NOTHROW(py::module::import("widget_module"));
-    REQUIRE_THROWS_WITH(py::module::import("throw_exception"),
-                        "ImportError: C++ Error");
-    REQUIRE_THROWS_WITH(py::module::import("throw_error_already_set"),
-                        Catch::Contains("ImportError: KeyError"));
-}
-
-TEST_CASE("There can be only one interpreter") {
-    static_assert(std::is_move_constructible<py::scoped_interpreter>::value, "");
-    static_assert(!std::is_move_assignable<py::scoped_interpreter>::value, "");
-    static_assert(!std::is_copy_constructible<py::scoped_interpreter>::value, "");
-    static_assert(!std::is_copy_assignable<py::scoped_interpreter>::value, "");
-
-    REQUIRE_THROWS_WITH(py::initialize_interpreter(), "The interpreter is already running");
-    REQUIRE_THROWS_WITH(py::scoped_interpreter(), "The interpreter is already running");
-
-    py::finalize_interpreter();
-    REQUIRE_NOTHROW(py::scoped_interpreter());
-    {
-        auto pyi1 = py::scoped_interpreter();
-        auto pyi2 = std::move(pyi1);
-    }
-    py::initialize_interpreter();
-}
-
-bool has_pybind11_internals_builtin() {
-    auto builtins = py::handle(PyEval_GetBuiltins());
-    return builtins.contains(PYBIND11_INTERNALS_ID);
-};
-
-bool has_pybind11_internals_static() {
-    auto **&ipp = py::detail::get_internals_pp();
-    return ipp && *ipp;
-}
-
-TEST_CASE("Restart the interpreter") {
-    // Verify pre-restart state.
-    REQUIRE(py::module::import("widget_module").attr("add")(1, 2).cast<int>() == 3);
-    REQUIRE(has_pybind11_internals_builtin());
-    REQUIRE(has_pybind11_internals_static());
-    REQUIRE(py::module::import("external_module").attr("A")(123).attr("value").cast<int>() == 123);
-
-    // local and foreign module internals should point to the same internals:
-    REQUIRE(reinterpret_cast<uintptr_t>(*py::detail::get_internals_pp()) ==
-            py::module::import("external_module").attr("internals_at")().cast<uintptr_t>());
-
-    // Restart the interpreter.
-    py::finalize_interpreter();
-    REQUIRE(Py_IsInitialized() == 0);
-
-    py::initialize_interpreter();
-    REQUIRE(Py_IsInitialized() == 1);
-
-    // Internals are deleted after a restart.
-    REQUIRE_FALSE(has_pybind11_internals_builtin());
-    REQUIRE_FALSE(has_pybind11_internals_static());
-    pybind11::detail::get_internals();
-    REQUIRE(has_pybind11_internals_builtin());
-    REQUIRE(has_pybind11_internals_static());
-    REQUIRE(reinterpret_cast<uintptr_t>(*py::detail::get_internals_pp()) ==
-            py::module::import("external_module").attr("internals_at")().cast<uintptr_t>());
-
-    // Make sure that an interpreter with no get_internals() created until finalize still gets the
-    // internals destroyed
-    py::finalize_interpreter();
-    py::initialize_interpreter();
-    bool ran = false;
-    py::module::import("__main__").attr("internals_destroy_test") =
-        py::capsule(&ran, [](void *ran) { py::detail::get_internals(); *static_cast<bool *>(ran) = true; });
-    REQUIRE_FALSE(has_pybind11_internals_builtin());
-    REQUIRE_FALSE(has_pybind11_internals_static());
-    REQUIRE_FALSE(ran);
-    py::finalize_interpreter();
-    REQUIRE(ran);
-    py::initialize_interpreter();
-    REQUIRE_FALSE(has_pybind11_internals_builtin());
-    REQUIRE_FALSE(has_pybind11_internals_static());
-
-    // C++ modules can be reloaded.
-    auto cpp_module = py::module::import("widget_module");
-    REQUIRE(cpp_module.attr("add")(1, 2).cast<int>() == 3);
-
-    // C++ type information is reloaded and can be used in python modules.
-    auto py_module = py::module::import("test_interpreter");
-    auto py_widget = py_module.attr("DerivedWidget")("Hello after restart");
-    REQUIRE(py_widget.attr("the_message").cast<std::string>() == "Hello after restart");
-}
-
-TEST_CASE("Subinterpreter") {
-    // Add tags to the modules in the main interpreter and test the basics.
-    py::module::import("__main__").attr("main_tag") = "main interpreter";
-    {
-        auto m = py::module::import("widget_module");
-        m.attr("extension_module_tag") = "added to module in main interpreter";
-
-        REQUIRE(m.attr("add")(1, 2).cast<int>() == 3);
-    }
-    REQUIRE(has_pybind11_internals_builtin());
-    REQUIRE(has_pybind11_internals_static());
-
-    /// Create and switch to a subinterpreter.
-    auto main_tstate = PyThreadState_Get();
-    auto sub_tstate = Py_NewInterpreter();
-
-    // Subinterpreters get their own copy of builtins. detail::get_internals() still
-    // works by returning from the static variable, i.e. all interpreters share a single
-    // global pybind11::internals;
-    REQUIRE_FALSE(has_pybind11_internals_builtin());
-    REQUIRE(has_pybind11_internals_static());
-
-    // Modules tags should be gone.
-    REQUIRE_FALSE(py::hasattr(py::module::import("__main__"), "tag"));
-    {
-        auto m = py::module::import("widget_module");
-        REQUIRE_FALSE(py::hasattr(m, "extension_module_tag"));
-
-        // Function bindings should still work.
-        REQUIRE(m.attr("add")(1, 2).cast<int>() == 3);
-    }
-
-    // Restore main interpreter.
-    Py_EndInterpreter(sub_tstate);
-    PyThreadState_Swap(main_tstate);
-
-    REQUIRE(py::hasattr(py::module::import("__main__"), "main_tag"));
-    REQUIRE(py::hasattr(py::module::import("widget_module"), "extension_module_tag"));
-}
-
-TEST_CASE("Execution frame") {
-    // When the interpreter is embedded, there is no execution frame, but `py::exec`
-    // should still function by using reasonable globals: `__main__.__dict__`.
-    py::exec("var = dict(number=42)");
-    REQUIRE(py::globals()["var"]["number"].cast<int>() == 42);
-}
-
-TEST_CASE("Threads") {
-    // Restart interpreter to ensure threads are not initialized
-    py::finalize_interpreter();
-    py::initialize_interpreter();
-    REQUIRE_FALSE(has_pybind11_internals_static());
-
-    constexpr auto num_threads = 10;
-    auto locals = py::dict("count"_a=0);
-
-    {
-        py::gil_scoped_release gil_release{};
-        REQUIRE(has_pybind11_internals_static());
-
-        auto threads = std::vector<std::thread>();
-        for (auto i = 0; i < num_threads; ++i) {
-            threads.emplace_back([&]() {
-                py::gil_scoped_acquire gil{};
-                locals["count"] = locals["count"].cast<int>() + 1;
-            });
-        }
-
-        for (auto &thread : threads) {
-            thread.join();
-        }
-    }
-
-    REQUIRE(locals["count"].cast<int>() == num_threads);
-}
-
-// Scope exit utility https://stackoverflow.com/a/36644501/7255855
-struct scope_exit {
-    std::function<void()> f_;
-    explicit scope_exit(std::function<void()> f) noexcept : f_(std::move(f)) {}
-    ~scope_exit() { if (f_) f_(); }
-};
-
-TEST_CASE("Reload module from file") {
-    // Disable generation of cached bytecode (.pyc files) for this test, otherwise
-    // Python might pick up an old version from the cache instead of the new versions
-    // of the .py files generated below
- // of the .py files generated below
248
- auto sys = py::module::import("sys");
249
- bool dont_write_bytecode = sys.attr("dont_write_bytecode").cast<bool>();
250
- sys.attr("dont_write_bytecode") = true;
251
- // Reset the value at scope exit
252
- scope_exit reset_dont_write_bytecode([&]() {
253
- sys.attr("dont_write_bytecode") = dont_write_bytecode;
254
- });
255
-
256
- std::string module_name = "test_module_reload";
257
- std::string module_file = module_name + ".py";
258
-
259
- // Create the module .py file
260
- std::ofstream test_module(module_file);
261
- test_module << "def test():\n";
262
- test_module << " return 1\n";
263
- test_module.close();
264
- // Delete the file at scope exit
265
- scope_exit delete_module_file([&]() {
266
- std::remove(module_file.c_str());
267
- });
268
-
269
- // Import the module from file
270
- auto module = py::module::import(module_name.c_str());
271
- int result = module.attr("test")().cast<int>();
272
- REQUIRE(result == 1);
273
-
274
- // Update the module .py file with a small change
275
- test_module.open(module_file);
276
- test_module << "def test():\n";
277
- test_module << " return 2\n";
278
- test_module.close();
279
-
280
- // Reload the module
281
- module.reload();
282
- result = module.attr("test")().cast<int>();
283
- REQUIRE(result == 2);
284
- }
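The "Reload module from file" test above follows the same write/import/edit/reload pattern that plain Python exposes through `importlib`. A minimal pure-Python sketch of that pattern (the module name `demo_reload_module` and the temporary directory are made up for this illustration):

```python
import importlib
import os
import sys
import tempfile

# Avoid stale .pyc files, mirroring the sys.dont_write_bytecode trick in the test.
sys.dont_write_bytecode = True

workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)
path = os.path.join(workdir, "demo_reload_module.py")

# Create the module file and import it.
with open(path, "w") as f:
    f.write("def test():\n    return 1\n")
module = importlib.import_module("demo_reload_module")
assert module.test() == 1

# Update the module file with a small change, then reload.
with open(path, "w") as f:
    f.write("def test():\n    return 2\n")
importlib.invalidate_caches()
module = importlib.reload(module)
assert module.test() == 2
```

`importlib.reload` re-executes the module's source in place, which is the behaviour `py::module::reload()` wraps in the C++ test.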
spaces/CVPR/LIVE/thrust/cub/cmake/cub-config.cmake DELETED
@@ -1,62 +0,0 @@
- #
- # find_package(CUB) config file.
- #
- # Defines a CUB::CUB target that may be linked from user projects to include
- # CUB.
-
- if (TARGET CUB::CUB)
-   return()
- endif()
-
- function(_cub_declare_interface_alias alias_name ugly_name)
-   # 1) Only IMPORTED and ALIAS targets can be placed in a namespace.
-   # 2) When an IMPORTED library is linked to another target, its include
-   #    directories are treated as SYSTEM includes.
-   # 3) nvcc will automatically check the CUDA Toolkit include path *before* the
-   #    system includes. This means that the Toolkit CUB will *always* be used
-   #    during compilation, and the include paths of an IMPORTED CUB::CUB
-   #    target will never have any effect.
-   # 4) This behavior can be fixed by setting the property NO_SYSTEM_FROM_IMPORTED
-   #    on EVERY target that links to CUB::CUB. This would be a burden and a
-   #    footgun for our users. Forgetting this would silently pull in the wrong CUB!
-   # 5) A workaround is to make a non-IMPORTED library outside of the namespace,
-   #    configure it, and then ALIAS it into the namespace (or ALIAS and then
-   #    configure, that seems to work too).
-   add_library(${ugly_name} INTERFACE)
-   add_library(${alias_name} ALIAS ${ugly_name})
- endfunction()
-
- #
- # Setup targets
- #
-
- _cub_declare_interface_alias(CUB::CUB _CUB_CUB)
- # Strip out the 'cub/cmake/' from 'cub/cmake/cub-config.cmake':
- get_filename_component(_CUB_INCLUDE_DIR "../.." ABSOLUTE BASE_DIR "${CMAKE_CURRENT_LIST_DIR}")
- target_include_directories(_CUB_CUB INTERFACE "${_CUB_INCLUDE_DIR}")
-
- if (CUB_IGNORE_DEPRECATED_CPP_DIALECT OR
-     THRUST_IGNORE_DEPRECATED_CPP_DIALECT)
-   target_compile_definitions(_CUB_CUB INTERFACE "CUB_IGNORE_DEPRECATED_CPP_DIALECT")
- endif()
-
- if (CUB_IGNORE_DEPRECATED_CPP_11 OR
-     THRUST_IGNORE_DEPRECATED_CPP_11)
-   target_compile_definitions(_CUB_CUB INTERFACE "CUB_IGNORE_DEPRECATED_CPP_11")
- endif()
-
- if (CUB_IGNORE_DEPRECATED_COMPILER OR
-     THRUST_IGNORE_DEPRECATED_COMPILER)
-   target_compile_definitions(_CUB_CUB INTERFACE "CUB_IGNORE_DEPRECATED_COMPILER")
- endif()
-
- #
- # Standardize version info
- #
-
- set(CUB_VERSION ${${CMAKE_FIND_PACKAGE_NAME}_VERSION} CACHE INTERNAL "")
- set(CUB_VERSION_MAJOR ${${CMAKE_FIND_PACKAGE_NAME}_VERSION_MAJOR} CACHE INTERNAL "")
- set(CUB_VERSION_MINOR ${${CMAKE_FIND_PACKAGE_NAME}_VERSION_MINOR} CACHE INTERNAL "")
- set(CUB_VERSION_PATCH ${${CMAKE_FIND_PACKAGE_NAME}_VERSION_PATCH} CACHE INTERNAL "")
- set(CUB_VERSION_TWEAK ${${CMAKE_FIND_PACKAGE_NAME}_VERSION_TWEAK} CACHE INTERNAL "")
- set(CUB_VERSION_COUNT ${${CMAKE_FIND_PACKAGE_NAME}_VERSION_COUNT} CACHE INTERNAL "")
spaces/CVPR/LIVE/thrust/thrust/detail/caching_allocator.h DELETED
@@ -1,45 +0,0 @@
- /*
-  * Copyright 2020 NVIDIA Corporation
-  *
-  * Licensed under the Apache License, Version 2.0 (the "License");
-  * you may not use this file except in compliance with the License.
-  * You may obtain a copy of the License at
-  *
-  *     http://www.apache.org/licenses/LICENSE-2.0
-  *
-  * Unless required by applicable law or agreed to in writing, software
-  * distributed under the License is distributed on an "AS IS" BASIS,
-  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  * See the License for the specific language governing permissions and
-  * limitations under the License.
-  */
-
- #pragma once
-
- #include <thrust/mr/allocator.h>
- #include <thrust/mr/disjoint_tls_pool.h>
- #include <thrust/mr/new.h>
- #include <thrust/memory/detail/device_system_resource.h>
-
- namespace thrust
- {
- namespace detail
- {
-
- inline
- thrust::mr::allocator<
-     char,
-     thrust::mr::disjoint_unsynchronized_pool_resource<
-         thrust::device_memory_resource,
-         thrust::mr::new_delete_resource
-     >
- > single_device_tls_caching_allocator()
- {
-     return {
-         &thrust::mr::tls_disjoint_pool(
-             thrust::mr::get_global_resource<thrust::device_memory_resource>(),
-             thrust::mr::get_global_resource<thrust::mr::new_delete_resource>()
-         )
-     };
- }
-
- } // namespace detail
- } // namespace thrust
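`single_device_tls_caching_allocator()` builds an allocator on top of a disjoint pool resource: freed blocks are cached and handed back out instead of going through the expensive upstream (device) allocation each time. As a language-neutral sketch of that caching idea only (this is not the Thrust API), a minimal free-list pool might look like:

```python
class CachingPool:
    """Toy caching allocator: reuse freed blocks of the same size
    instead of asking the upstream allocator again."""

    def __init__(self):
        self.free_blocks = {}    # block size -> list of reusable buffers
        self.upstream_allocs = 0

    def allocate(self, size):
        blocks = self.free_blocks.get(size)
        if blocks:
            return blocks.pop()  # cache hit: no upstream allocation
        self.upstream_allocs += 1  # cache miss: go to the upstream allocator
        return bytearray(size)

    def deallocate(self, buf):
        # Keep the block for reuse rather than releasing it upstream.
        self.free_blocks.setdefault(len(buf), []).append(buf)


pool = CachingPool()
a = pool.allocate(1024)
pool.deallocate(a)
b = pool.allocate(1024)  # served from the cache, not the upstream allocator
assert b is a
assert pool.upstream_allocs == 1
```

The real Thrust resource adds thread-local storage (one pool per thread, hence "tls") and size-class bookkeeping, but the hit/miss structure is the same.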
spaces/CVPR/WALT/mmdet/core/mask/structures.py DELETED
@@ -1,1042 +0,0 @@
- from abc import ABCMeta, abstractmethod
-
- import cv2
- import mmcv
- import numpy as np
- import pycocotools.mask as maskUtils
- import torch
- from mmcv.ops.roi_align import roi_align
-
-
- class BaseInstanceMasks(metaclass=ABCMeta):
-     """Base class for instance masks."""
-
-     @abstractmethod
-     def rescale(self, scale, interpolation='nearest'):
-         """Rescale masks as large as possible while keeping the aspect ratio.
-         For details, refer to :func:`mmcv.imrescale`.
-
-         Args:
-             scale (tuple[int]): The maximum size (h, w) of the rescaled mask.
-             interpolation (str): Same as :func:`mmcv.imrescale`.
-
-         Returns:
-             BaseInstanceMasks: The rescaled masks.
-         """
-
-     @abstractmethod
-     def resize(self, out_shape, interpolation='nearest'):
-         """Resize masks to the given out_shape.
-
-         Args:
-             out_shape: Target (h, w) of resized mask.
-             interpolation (str): See :func:`mmcv.imresize`.
-
-         Returns:
-             BaseInstanceMasks: The resized masks.
-         """
-
-     @abstractmethod
-     def flip(self, flip_direction='horizontal'):
-         """Flip masks along the given direction.
-
-         Args:
-             flip_direction (str): Either 'horizontal' or 'vertical'.
-
-         Returns:
-             BaseInstanceMasks: The flipped masks.
-         """
-
-     @abstractmethod
-     def pad(self, out_shape, pad_val):
-         """Pad masks to the given size of (h, w).
-
-         Args:
-             out_shape (tuple[int]): Target (h, w) of padded mask.
-             pad_val (int): The padded value.
-
-         Returns:
-             BaseInstanceMasks: The padded masks.
-         """
-
-     @abstractmethod
-     def crop(self, bbox):
-         """Crop each mask by the given bbox.
-
-         Args:
-             bbox (ndarray): Bbox in format [x1, y1, x2, y2], shape (4, ).
-
-         Return:
-             BaseInstanceMasks: The cropped masks.
-         """
-
-     @abstractmethod
-     def crop_and_resize(self,
-                         bboxes,
-                         out_shape,
-                         inds,
-                         device,
-                         interpolation='bilinear'):
-         """Crop and resize masks by the given bboxes.
-
-         This function is mainly used in mask targets computation.
-         It first aligns masks to bboxes by assigned_inds, then crops each mask
-         by its assigned bbox and resizes it to the size of (mask_h, mask_w).
-
-         Args:
-             bboxes (Tensor): Bboxes in format [x1, y1, x2, y2], shape (N, 4)
-             out_shape (tuple[int]): Target (h, w) of resized mask
-             inds (ndarray): Indexes to assign masks to each bbox,
-                 shape (N,) and values should be between [0, num_masks - 1].
-             device (str): Device of bboxes
-             interpolation (str): See `mmcv.imresize`
-
-         Return:
-             BaseInstanceMasks: the cropped and resized masks.
-         """
-
-     @abstractmethod
-     def expand(self, expanded_h, expanded_w, top, left):
-         """see :class:`Expand`."""
-
-     @property
-     @abstractmethod
-     def areas(self):
-         """ndarray: areas of each instance."""
-
-     @abstractmethod
-     def to_ndarray(self):
-         """Convert masks to the format of ndarray.
-
-         Return:
-             ndarray: Converted masks in the format of ndarray.
-         """
-
-     @abstractmethod
-     def to_tensor(self, dtype, device):
-         """Convert masks to the format of Tensor.
-
-         Args:
-             dtype (str): Dtype of converted mask.
-             device (torch.device): Device of converted masks.
-
-         Returns:
-             Tensor: Converted masks in the format of Tensor.
-         """
-
-     @abstractmethod
-     def translate(self,
-                   out_shape,
-                   offset,
-                   direction='horizontal',
-                   fill_val=0,
-                   interpolation='bilinear'):
-         """Translate the masks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             offset (int | float): The offset for translate.
-             direction (str): The translate direction, either "horizontal"
-                 or "vertical".
-             fill_val (int | float): Border value. Default 0.
-             interpolation (str): Same as :func:`mmcv.imtranslate`.
-
-         Returns:
-             Translated masks.
-         """
-
-     def shear(self,
-               out_shape,
-               magnitude,
-               direction='horizontal',
-               border_value=0,
-               interpolation='bilinear'):
-         """Shear the masks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             magnitude (int | float): The magnitude used for shear.
-             direction (str): The shear direction, either "horizontal"
-                 or "vertical".
-             border_value (int | tuple[int]): Value used in case of a
-                 constant border. Default 0.
-             interpolation (str): Same as in :func:`mmcv.imshear`.
-
-         Returns:
-             ndarray: Sheared masks.
-         """
-
-     @abstractmethod
-     def rotate(self, out_shape, angle, center=None, scale=1.0, fill_val=0):
-         """Rotate the masks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             angle (int | float): Rotation angle in degrees. Positive values
-                 mean counter-clockwise rotation.
-             center (tuple[float], optional): Center point (w, h) of the
-                 rotation in source image. If not specified, the center of
-                 the image will be used.
-             scale (int | float): Isotropic scale factor.
-             fill_val (int | float): Border value. Default 0 for masks.
-
-         Returns:
-             Rotated masks.
-         """
-
-
- class BitmapMasks(BaseInstanceMasks):
-     """This class represents masks in the form of bitmaps.
-
-     Args:
-         masks (ndarray): ndarray of masks in shape (N, H, W), where N is
-             the number of objects.
-         height (int): height of masks
-         width (int): width of masks
-
-     Example:
-         >>> from mmdet.core.mask.structures import *  # NOQA
-         >>> num_masks, H, W = 3, 32, 32
-         >>> rng = np.random.RandomState(0)
-         >>> masks = (rng.rand(num_masks, H, W) > 0.1).astype(np.int)
-         >>> self = BitmapMasks(masks, height=H, width=W)
-
-         >>> # demo crop_and_resize
-         >>> num_boxes = 5
-         >>> bboxes = np.array([[0, 0, 30, 10.0]] * num_boxes)
-         >>> out_shape = (14, 14)
-         >>> inds = torch.randint(0, len(self), size=(num_boxes,))
-         >>> device = 'cpu'
-         >>> interpolation = 'bilinear'
-         >>> new = self.crop_and_resize(
-         ...     bboxes, out_shape, inds, device, interpolation)
-         >>> assert len(new) == num_boxes
-         >>> assert new.height, new.width == out_shape
-     """
-
-     def __init__(self, masks, height, width):
-         self.height = height
-         self.width = width
-         if len(masks) == 0:
-             self.masks = np.empty((0, self.height, self.width), dtype=np.uint8)
-         else:
-             assert isinstance(masks, (list, np.ndarray))
-             if isinstance(masks, list):
-                 assert isinstance(masks[0], np.ndarray)
-                 assert masks[0].ndim == 2  # (H, W)
-             else:
-                 assert masks.ndim == 3 or masks.ndim == 4  # (N, H, W)
-
-             self.masks = np.stack(masks).reshape(-1, height, width)
-             assert self.masks.shape[1] == self.height
-             assert self.masks.shape[2] == self.width
-
-     def __getitem__(self, index):
-         """Index the BitmapMask.
-
-         Args:
-             index (int | ndarray): Indices in the format of integer or ndarray.
-
-         Returns:
-             :obj:`BitmapMasks`: Indexed bitmap masks.
-         """
-         masks = self.masks[index].reshape(-1, self.height, self.width)
-         return BitmapMasks(masks, self.height, self.width)
-
-     def __iter__(self):
-         return iter(self.masks)
-
-     def __repr__(self):
-         s = self.__class__.__name__ + '('
-         s += f'num_masks={len(self.masks)}, '
-         s += f'height={self.height}, '
-         s += f'width={self.width})'
-         return s
-
-     def __len__(self):
-         """Number of masks."""
-         return len(self.masks)
-
-     def rescale(self, scale, interpolation='nearest'):
-         """See :func:`BaseInstanceMasks.rescale`."""
-         if len(self.masks) == 0:
-             new_w, new_h = mmcv.rescale_size((self.width, self.height), scale)
-             rescaled_masks = np.empty((0, new_h, new_w), dtype=np.uint8)
-         else:
-             rescaled_masks = np.stack([
-                 mmcv.imrescale(mask, scale, interpolation=interpolation)
-                 for mask in self.masks
-             ])
-         height, width = rescaled_masks.shape[1:]
-         return BitmapMasks(rescaled_masks, height, width)
-
-     def resize(self, out_shape, interpolation='nearest'):
-         """See :func:`BaseInstanceMasks.resize`."""
-         if len(self.masks) == 0:
-             resized_masks = np.empty((0, *out_shape), dtype=np.uint8)
-         else:
-             resized_masks = np.stack([
-                 mmcv.imresize(
-                     mask, out_shape[::-1], interpolation=interpolation)
-                 for mask in self.masks
-             ])
-         return BitmapMasks(resized_masks, *out_shape)
-
-     def flip(self, flip_direction='horizontal'):
-         """See :func:`BaseInstanceMasks.flip`."""
-         assert flip_direction in ('horizontal', 'vertical', 'diagonal')
-
-         if len(self.masks) == 0:
-             flipped_masks = self.masks
-         else:
-             flipped_masks = np.stack([
-                 mmcv.imflip(mask, direction=flip_direction)
-                 for mask in self.masks
-             ])
-         return BitmapMasks(flipped_masks, self.height, self.width)
-
-     def pad(self, out_shape, pad_val=0):
-         """See :func:`BaseInstanceMasks.pad`."""
-         if len(self.masks) == 0:
-             padded_masks = np.empty((0, *out_shape), dtype=np.uint8)
-         else:
-             padded_masks = np.stack([
-                 mmcv.impad(mask, shape=out_shape, pad_val=pad_val)
-                 for mask in self.masks
-             ])
-         return BitmapMasks(padded_masks, *out_shape)
-
-     def crop(self, bbox):
-         """See :func:`BaseInstanceMasks.crop`."""
-         assert isinstance(bbox, np.ndarray)
-         assert bbox.ndim == 1
-
-         # clip the boundary
-         bbox = bbox.copy()
-         bbox[0::2] = np.clip(bbox[0::2], 0, self.width)
-         bbox[1::2] = np.clip(bbox[1::2], 0, self.height)
-         x1, y1, x2, y2 = bbox
-         w = np.maximum(x2 - x1, 1)
-         h = np.maximum(y2 - y1, 1)
-
-         if len(self.masks) == 0:
-             cropped_masks = np.empty((0, h, w), dtype=np.uint8)
-         else:
-             cropped_masks = self.masks[:, y1:y1 + h, x1:x1 + w]
-         return BitmapMasks(cropped_masks, h, w)
-
-     def crop_and_resize(self,
-                         bboxes,
-                         out_shape,
-                         inds,
-                         device='cpu',
-                         interpolation='bilinear'):
-         """See :func:`BaseInstanceMasks.crop_and_resize`."""
-         if len(self.masks) == 0:
-             empty_masks = np.empty((0, *out_shape), dtype=np.uint8)
-             return BitmapMasks(empty_masks, *out_shape)
-
-         # convert bboxes to tensor
-         if isinstance(bboxes, np.ndarray):
-             bboxes = torch.from_numpy(bboxes).to(device=device)
-         if isinstance(inds, np.ndarray):
-             inds = torch.from_numpy(inds).to(device=device)
-
-         num_bbox = bboxes.shape[0]
-         fake_inds = torch.arange(
-             num_bbox, device=device).to(dtype=bboxes.dtype)[:, None]
-         rois = torch.cat([fake_inds, bboxes], dim=1)  # Nx5
-         rois = rois.to(device=device)
-         if num_bbox > 0:
-             # masks_vis = (self.masks == 1)
-             masks_vis = (self.masks > 0)
-             gt_masks_th = torch.from_numpy(masks_vis).to(device).index_select(
-                 0, inds).to(dtype=rois.dtype)
-             targets = roi_align(gt_masks_th[:, None, :, :], rois, out_shape,
-                                 1.0, 0, 'avg', True).squeeze(1)
-             targets = targets.cpu().numpy().astype(int)
-             resized_masks_vis = (targets > 0.5)
-
-             # masks_full = (self.masks > 0)
-             masks_full = (self.masks == 2)
-             # masks_occ = (self.masks == 2)
-             gt_masks_th = torch.from_numpy(masks_full).to(device).index_select(
-                 0, inds).to(dtype=rois.dtype)
-             targets = roi_align(gt_masks_th[:, None, :, :], rois, out_shape,
-                                 1.0, 0, 'avg', True).squeeze(1)
-             targets = targets.cpu().numpy().astype(int)
-             resized_masks_full = (targets > 0.5)
-             resized_masks = np.stack(
-                 [resized_masks_vis, resized_masks_full], axis=1)
-         else:
-             resized_masks = []
-         return BitmapMasks(resized_masks, *out_shape)
-
-     def expand(self, expanded_h, expanded_w, top, left):
-         """See :func:`BaseInstanceMasks.expand`."""
-         if len(self.masks) == 0:
-             expanded_mask = np.empty((0, expanded_h, expanded_w),
-                                      dtype=np.uint8)
-         else:
-             expanded_mask = np.zeros((len(self), expanded_h, expanded_w),
-                                      dtype=np.uint8)
-             expanded_mask[:, top:top + self.height,
-                           left:left + self.width] = self.masks
-         return BitmapMasks(expanded_mask, expanded_h, expanded_w)
-
-     def translate(self,
-                   out_shape,
-                   offset,
-                   direction='horizontal',
-                   fill_val=0,
-                   interpolation='bilinear'):
-         """Translate the BitmapMasks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             offset (int | float): The offset for translate.
-             direction (str): The translate direction, either "horizontal"
-                 or "vertical".
-             fill_val (int | float): Border value. Default 0 for masks.
-             interpolation (str): Same as :func:`mmcv.imtranslate`.
-
-         Returns:
-             BitmapMasks: Translated BitmapMasks.
-
-         Example:
-             >>> from mmdet.core.mask.structures import BitmapMasks
-             >>> self = BitmapMasks.random(dtype=np.uint8)
-             >>> out_shape = (32, 32)
-             >>> offset = 4
-             >>> direction = 'horizontal'
-             >>> fill_val = 0
-             >>> interpolation = 'bilinear'
-             >>> # Note, there seem to be issues when:
-             >>> # * out_shape is different than self's shape
-             >>> # * the mask dtype is not supported by cv2.AffineWarp
-             >>> new = self.translate(out_shape, offset, direction, fill_val,
-             >>>                      interpolation)
-             >>> assert len(new) == len(self)
-             >>> assert new.height, new.width == out_shape
-         """
-         if len(self.masks) == 0:
-             translated_masks = np.empty((0, *out_shape), dtype=np.uint8)
-         else:
-             translated_masks = mmcv.imtranslate(
-                 self.masks.transpose((1, 2, 0)),
-                 offset,
-                 direction,
-                 border_value=fill_val,
-                 interpolation=interpolation)
-             if translated_masks.ndim == 2:
-                 translated_masks = translated_masks[:, :, None]
-             translated_masks = translated_masks.transpose(
-                 (2, 0, 1)).astype(self.masks.dtype)
-         return BitmapMasks(translated_masks, *out_shape)
-
-     def shear(self,
-               out_shape,
-               magnitude,
-               direction='horizontal',
-               border_value=0,
-               interpolation='bilinear'):
-         """Shear the BitmapMasks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             magnitude (int | float): The magnitude used for shear.
-             direction (str): The shear direction, either "horizontal"
-                 or "vertical".
-             border_value (int | tuple[int]): Value used in case of a
-                 constant border.
-             interpolation (str): Same as in :func:`mmcv.imshear`.
-
-         Returns:
-             BitmapMasks: The sheared masks.
-         """
-         if len(self.masks) == 0:
-             sheared_masks = np.empty((0, *out_shape), dtype=np.uint8)
-         else:
-             sheared_masks = mmcv.imshear(
-                 self.masks.transpose((1, 2, 0)),
-                 magnitude,
-                 direction,
-                 border_value=border_value,
-                 interpolation=interpolation)
-             if sheared_masks.ndim == 2:
-                 sheared_masks = sheared_masks[:, :, None]
-             sheared_masks = sheared_masks.transpose(
-                 (2, 0, 1)).astype(self.masks.dtype)
-         return BitmapMasks(sheared_masks, *out_shape)
-
-     def rotate(self, out_shape, angle, center=None, scale=1.0, fill_val=0):
-         """Rotate the BitmapMasks.
-
-         Args:
-             out_shape (tuple[int]): Shape for output mask, format (h, w).
-             angle (int | float): Rotation angle in degrees. Positive values
-                 mean counter-clockwise rotation.
-             center (tuple[float], optional): Center point (w, h) of the
-                 rotation in source image. If not specified, the center of
-                 the image will be used.
-             scale (int | float): Isotropic scale factor.
-             fill_val (int | float): Border value. Default 0 for masks.
-
-         Returns:
-             BitmapMasks: Rotated BitmapMasks.
-         """
-         if len(self.masks) == 0:
-             rotated_masks = np.empty((0, *out_shape), dtype=self.masks.dtype)
-         else:
-             rotated_masks = mmcv.imrotate(
-                 self.masks.transpose((1, 2, 0)),
-                 angle,
-                 center=center,
-                 scale=scale,
-                 border_value=fill_val)
-             if rotated_masks.ndim == 2:
-                 # case when only one mask, (h, w)
-                 rotated_masks = rotated_masks[:, :, None]  # (h, w, 1)
-             rotated_masks = rotated_masks.transpose(
-                 (2, 0, 1)).astype(self.masks.dtype)
-         return BitmapMasks(rotated_masks, *out_shape)
-
-     @property
-     def areas(self):
-         """See :py:attr:`BaseInstanceMasks.areas`."""
-         return self.masks.sum((1, 2))
-
-     def to_ndarray(self):
-         """See :func:`BaseInstanceMasks.to_ndarray`."""
-         return self.masks
-
-     def to_tensor(self, dtype, device):
-         """See :func:`BaseInstanceMasks.to_tensor`."""
-         return torch.tensor(self.masks, dtype=dtype, device=device)
-
-     @classmethod
-     def random(cls,
-                num_masks=3,
-                height=32,
-                width=32,
-                dtype=np.uint8,
-                rng=None):
-         """Generate random bitmap masks for demo / testing purposes.
-
-         Example:
-             >>> from mmdet.core.mask.structures import BitmapMasks
-             >>> self = BitmapMasks.random()
-             >>> print('self = {}'.format(self))
-             self = BitmapMasks(num_masks=3, height=32, width=32)
-         """
-         from mmdet.utils.util_random import ensure_rng
-         rng = ensure_rng(rng)
-         masks = (rng.rand(num_masks, height, width) > 0.1).astype(dtype)
-         self = cls(masks, height=height, width=width)
-         return self
-
-
- class PolygonMasks(BaseInstanceMasks):
-     """This class represents masks in the form of polygons.
-
-     Polygons is a list of three levels. The first level of the list
-     corresponds to objects, the second level to the polys that compose the
-     object, the third level to the poly coordinates.
-
-     Args:
-         masks (list[list[ndarray]]): The first level of the list
-             corresponds to objects, the second level to the polys that
-             compose the object, the third level to the poly coordinates
-         height (int): height of masks
-         width (int): width of masks
-
-     Example:
-         >>> from mmdet.core.mask.structures import *  # NOQA
-         >>> masks = [
-         >>>     [ np.array([0, 0, 10, 0, 10, 10., 0, 10, 0, 0]) ]
-         >>> ]
-         >>> height, width = 16, 16
-         >>> self = PolygonMasks(masks, height, width)
-
-         >>> # demo translate
-         >>> new = self.translate((16, 16), 4., direction='horizontal')
-         >>> assert np.all(new.masks[0][0][1::2] == masks[0][0][1::2])
-         >>> assert np.all(new.masks[0][0][0::2] == masks[0][0][0::2] + 4)
-
-         >>> # demo crop_and_resize
-         >>> num_boxes = 3
-         >>> bboxes = np.array([[0, 0, 30, 10.0]] * num_boxes)
-         >>> out_shape = (16, 16)
-         >>> inds = torch.randint(0, len(self), size=(num_boxes,))
-         >>> device = 'cpu'
-         >>> interpolation = 'bilinear'
-         >>> new = self.crop_and_resize(
-         ...     bboxes, out_shape, inds, device, interpolation)
-         >>> assert len(new) == num_boxes
-         >>> assert new.height, new.width == out_shape
-     """
-
-     def __init__(self, masks, height, width):
-         assert isinstance(masks, list)
-         if len(masks) > 0:
-             assert isinstance(masks[0], list)
-             assert isinstance(masks[0][0], np.ndarray)
-
-         self.height = height
-         self.width = width
-         self.masks = masks
-
-     def __getitem__(self, index):
-         """Index the polygon masks.
-
-         Args:
-             index (ndarray | List): The indices.
-
-         Returns:
-             :obj:`PolygonMasks`: The indexed polygon masks.
-         """
-         if isinstance(index, np.ndarray):
-             index = index.tolist()
-         if isinstance(index, list):
-             masks = [self.masks[i] for i in index]
-         else:
-             try:
-                 masks = self.masks[index]
-             except Exception:
-                 raise ValueError(
-                     f'Unsupported input of type {type(index)} for indexing!')
-         if len(masks) and isinstance(masks[0], np.ndarray):
-             masks = [masks]  # ensure a list of three levels
-         return PolygonMasks(masks, self.height, self.width)
-
-     def __iter__(self):
-         return iter(self.masks)
-
-     def __repr__(self):
-         s = self.__class__.__name__ + '('
-         s += f'num_masks={len(self.masks)}, '
-         s += f'height={self.height}, '
-         s += f'width={self.width})'
-         return s
-
-     def __len__(self):
-         """Number of masks."""
-         return len(self.masks)
-
-     def rescale(self, scale, interpolation=None):
-         """see :func:`BaseInstanceMasks.rescale`"""
-         new_w, new_h = mmcv.rescale_size((self.width, self.height), scale)
-         if len(self.masks) == 0:
-             rescaled_masks = PolygonMasks([], new_h, new_w)
-         else:
-             rescaled_masks = self.resize((new_h, new_w))
-         return rescaled_masks
-
-     def resize(self, out_shape, interpolation=None):
-         """see :func:`BaseInstanceMasks.resize`"""
-         if len(self.masks) == 0:
-             resized_masks = PolygonMasks([], *out_shape)
-         else:
-             h_scale = out_shape[0] / self.height
-             w_scale = out_shape[1] / self.width
-             resized_masks = []
-             for poly_per_obj in self.masks:
-                 resized_poly = []
-                 for p in poly_per_obj:
-                     p = p.copy()
-                     p[0::2] *= w_scale
-                     p[1::2] *= h_scale
-                     resized_poly.append(p)
-                 resized_masks.append(resized_poly)
-             resized_masks = PolygonMasks(resized_masks, *out_shape)
-         return resized_masks
-
-     def flip(self, flip_direction='horizontal'):
-         """see :func:`BaseInstanceMasks.flip`"""
-         assert flip_direction in ('horizontal', 'vertical', 'diagonal')
-         if len(self.masks) == 0:
-             flipped_masks = PolygonMasks([], self.height, self.width)
-         else:
-             flipped_masks = []
-             for poly_per_obj in self.masks:
-                 flipped_poly_per_obj = []
-                 for p in poly_per_obj:
-                     p = p.copy()
-                     if flip_direction == 'horizontal':
-                         p[0::2] = self.width - p[0::2]
-                     elif flip_direction == 'vertical':
-                         p[1::2] = self.height - p[1::2]
-                     else:
-                         p[0::2] = self.width - p[0::2]
-                         p[1::2] = self.height - p[1::2]
-                     flipped_poly_per_obj.append(p)
-                 flipped_masks.append(flipped_poly_per_obj)
-             flipped_masks = PolygonMasks(flipped_masks, self.height,
-                                          self.width)
-         return flipped_masks
-
-     def crop(self, bbox):
-         """see :func:`BaseInstanceMasks.crop`"""
-         assert isinstance(bbox, np.ndarray)
-         assert bbox.ndim == 1
-
-         # clip the boundary
-         bbox = bbox.copy()
-         bbox[0::2] = np.clip(bbox[0::2], 0, self.width)
-         bbox[1::2] = np.clip(bbox[1::2], 0, self.height)
-         x1, y1, x2, y2 = bbox
-         w = np.maximum(x2 - x1, 1)
-         h = np.maximum(y2 - y1, 1)
-
-         if len(self.masks) == 0:
-             cropped_masks = PolygonMasks([], h, w)
-         else:
-             cropped_masks = []
-             for poly_per_obj in self.masks:
-                 cropped_poly_per_obj = []
-                 for p in poly_per_obj:
-                     # pycocotools will clip the boundary
-                     p = p.copy()
-                     p[0::2] -= bbox[0]
-                     p[1::2] -= bbox[1]
-                     cropped_poly_per_obj.append(p)
-                 cropped_masks.append(cropped_poly_per_obj)
-             cropped_masks = PolygonMasks(cropped_masks, h, w)
-         return cropped_masks
-
-     def pad(self, out_shape, pad_val=0):
-         """Padding has no effect on polygons."""
-         return PolygonMasks(self.masks, *out_shape)
-
-     def expand(self, *args, **kwargs):
-         """TODO: Add expand for polygon"""
-         raise NotImplementedError
-
-     def crop_and_resize(self,
-                         bboxes,
-                         out_shape,
-                         inds,
-                         device='cpu',
-                         interpolation='bilinear'):
-         """see :func:`BaseInstanceMasks.crop_and_resize`"""
-         out_h, out_w = out_shape
-         if len(self.masks) == 0:
-             return PolygonMasks([], out_h, out_w)
-
-         resized_masks = []
-         for i in range(len(bboxes)):
-             mask = self.masks[inds[i]]
-             bbox = bboxes[i, :]
-             x1, y1, x2, y2 = bbox
-             w = np.maximum(x2 - x1, 1)
-             h = np.maximum(y2 - y1, 1)
-             h_scale = out_h / max(h, 0.1)  # avoid too large scale
-             w_scale = out_w / max(w, 0.1)
-
-             resized_mask = []
-             for p in mask:
-                 p = p.copy()
-                 # crop
-                 # pycocotools will clip the boundary
-                 p[0::2] -= bbox[0]
-                 p[1::2] -= bbox[1]
-
-                 # resize
-                 p[0::2] *= w_scale
-                 p[1::2] *= h_scale
-                 resized_mask.append(p)
-             resized_masks.append(resized_mask)
-         return PolygonMasks(resized_masks, *out_shape)
-
-     def translate(self,
-                   out_shape,
-                   offset,
-                   direction='horizontal',
-                   fill_val=None,
-                   interpolation=None):
-         """Translate the PolygonMasks.
-
-         Example:
-             >>> self = PolygonMasks.random(dtype=np.int)
-             >>> out_shape = (self.height, self.width)
-             >>> new = self.translate(out_shape, 4., direction='horizontal')
-             >>> assert np.all(new.masks[0][0][1::2] == self.masks[0][0][1::2])
-             >>> assert np.all(new.masks[0][0][0::2] == self.masks[0][0][0::2] + 4)  # noqa: E501
-         """
-         assert fill_val is None or fill_val == 0, 'Here fill_val is not '\
-             f'used, and by default should be None or 0. got {fill_val}.'
-         if len(self.masks) == 0:
-             translated_masks = PolygonMasks([], *out_shape)
-         else:
-             translated_masks = []
-             for poly_per_obj in self.masks:
-                 translated_poly_per_obj = []
-                 for p in poly_per_obj:
778
- p = p.copy()
779
- if direction == 'horizontal':
780
- p[0::2] = np.clip(p[0::2] + offset, 0, out_shape[1])
781
- elif direction == 'vertical':
782
- p[1::2] = np.clip(p[1::2] + offset, 0, out_shape[0])
783
- translated_poly_per_obj.append(p)
784
- translated_masks.append(translated_poly_per_obj)
785
- translated_masks = PolygonMasks(translated_masks, *out_shape)
786
- return translated_masks
787
-
788
- def shear(self,
789
- out_shape,
790
- magnitude,
791
- direction='horizontal',
792
- border_value=0,
793
- interpolation='bilinear'):
794
- """See :func:`BaseInstanceMasks.shear`."""
795
- if len(self.masks) == 0:
796
- sheared_masks = PolygonMasks([], *out_shape)
797
- else:
798
- sheared_masks = []
799
- if direction == 'horizontal':
800
- shear_matrix = np.stack([[1, magnitude],
801
- [0, 1]]).astype(np.float32)
802
- elif direction == 'vertical':
803
- shear_matrix = np.stack([[1, 0], [magnitude,
804
- 1]]).astype(np.float32)
805
- for poly_per_obj in self.masks:
806
- sheared_poly = []
807
- for p in poly_per_obj:
808
- p = np.stack([p[0::2], p[1::2]], axis=0) # [2, n]
809
- new_coords = np.matmul(shear_matrix, p) # [2, n]
810
- new_coords[0, :] = np.clip(new_coords[0, :], 0,
811
- out_shape[1])
812
- new_coords[1, :] = np.clip(new_coords[1, :], 0,
813
- out_shape[0])
814
- sheared_poly.append(
815
- new_coords.transpose((1, 0)).reshape(-1))
816
- sheared_masks.append(sheared_poly)
817
- sheared_masks = PolygonMasks(sheared_masks, *out_shape)
818
- return sheared_masks
819
-
820
- def rotate(self, out_shape, angle, center=None, scale=1.0, fill_val=0):
821
- """See :func:`BaseInstanceMasks.rotate`."""
822
- if len(self.masks) == 0:
823
- rotated_masks = PolygonMasks([], *out_shape)
824
- else:
825
- rotated_masks = []
826
- rotate_matrix = cv2.getRotationMatrix2D(center, -angle, scale)
827
- for poly_per_obj in self.masks:
828
- rotated_poly = []
829
- for p in poly_per_obj:
830
- p = p.copy()
831
- coords = np.stack([p[0::2], p[1::2]], axis=1) # [n, 2]
832
- # pad 1 to convert from format [x, y] to homogeneous
833
- # coordinates format [x, y, 1]
834
- coords = np.concatenate(
835
- (coords, np.ones((coords.shape[0], 1), coords.dtype)),
836
- axis=1) # [n, 3]
837
- rotated_coords = np.matmul(
838
- rotate_matrix[None, :, :],
839
- coords[:, :, None])[..., 0] # [n, 2, 1] -> [n, 2]
840
- rotated_coords[:, 0] = np.clip(rotated_coords[:, 0], 0,
841
- out_shape[1])
842
- rotated_coords[:, 1] = np.clip(rotated_coords[:, 1], 0,
843
- out_shape[0])
844
- rotated_poly.append(rotated_coords.reshape(-1))
845
- rotated_masks.append(rotated_poly)
846
- rotated_masks = PolygonMasks(rotated_masks, *out_shape)
847
- return rotated_masks
848
-
849
- def to_bitmap(self):
850
- """convert polygon masks to bitmap masks."""
851
- bitmap_masks = self.to_ndarray()
852
- return BitmapMasks(bitmap_masks, self.height, self.width)
853
-
854
- @property
855
- def areas(self):
856
- """Compute areas of masks.
857
-
858
- This func is modified from `detectron2
859
- <https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387>`_.
860
- The function only works with Polygons using the shoelace formula.
861
-
862
- Return:
863
- ndarray: areas of each instance
864
- """ # noqa: W501
865
- area = []
866
- for polygons_per_obj in self.masks:
867
- area_per_obj = 0
868
- for p in polygons_per_obj:
869
- area_per_obj += self._polygon_area(p[0::2], p[1::2])
870
- area.append(area_per_obj)
871
- return np.asarray(area)
872
-
873
- def _polygon_area(self, x, y):
874
- """Compute the area of a component of a polygon.
875
-
876
- Using the shoelace formula:
877
- https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates
878
-
879
- Args:
880
- x (ndarray): x coordinates of the component
881
- y (ndarray): y coordinates of the component
882
-
883
- Return:
884
- float: the are of the component
885
- """ # noqa: 501
886
- return 0.5 * np.abs(
887
- np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
888
-
889
- def to_ndarray(self):
890
- """Convert masks to the format of ndarray."""
891
- if len(self.masks) == 0:
892
- return np.empty((0, self.height, self.width), dtype=np.uint8)
893
- bitmap_masks = []
894
- for poly_per_obj in self.masks:
895
- bitmap_masks.append(
896
- polygon_to_bitmap(poly_per_obj, self.height, self.width))
897
- return np.stack(bitmap_masks)
898
-
899
- def to_tensor(self, dtype, device):
900
- """See :func:`BaseInstanceMasks.to_tensor`."""
901
- if len(self.masks) == 0:
902
- return torch.empty((0, self.height, self.width),
903
- dtype=dtype,
904
- device=device)
905
- ndarray_masks = self.to_ndarray()
906
- return torch.tensor(ndarray_masks, dtype=dtype, device=device)
907
-
908
- @classmethod
909
- def random(cls,
910
- num_masks=3,
911
- height=32,
912
- width=32,
913
- n_verts=5,
914
- dtype=np.float32,
915
- rng=None):
916
- """Generate random polygon masks for demo / testing purposes.
917
-
918
- Adapted from [1]_
919
-
920
- References:
921
- .. [1] https://gitlab.kitware.com/computer-vision/kwimage/-/blob/928cae35ca8/kwimage/structs/polygon.py#L379 # noqa: E501
922
-
923
- Example:
924
- >>> from mmdet.core.mask.structures import PolygonMasks
925
- >>> self = PolygonMasks.random()
926
- >>> print('self = {}'.format(self))
927
- """
928
- from mmdet.utils.util_random import ensure_rng
929
- rng = ensure_rng(rng)
930
-
931
- def _gen_polygon(n, irregularity, spikeyness):
932
- """Creates the polygon by sampling points on a circle around the
933
- centre. Random noise is added by varying the angular spacing
934
- between sequential points, and by varying the radial distance of
935
- each point from the centre.
936
-
937
- Based on original code by Mike Ounsworth
938
-
939
- Args:
940
- n (int): number of vertices
941
- irregularity (float): [0,1] indicating how much variance there
942
- is in the angular spacing of vertices. [0,1] will map to
943
- [0, 2pi/numberOfVerts]
944
- spikeyness (float): [0,1] indicating how much variance there is
945
- in each vertex from the circle of radius aveRadius. [0,1]
946
- will map to [0, aveRadius]
947
-
948
- Returns:
949
- a list of vertices, in CCW order.
950
- """
951
- from scipy.stats import truncnorm
952
- # Generate around the unit circle
953
- cx, cy = (0.0, 0.0)
954
- radius = 1
955
-
956
- tau = np.pi * 2
957
-
958
- irregularity = np.clip(irregularity, 0, 1) * 2 * np.pi / n
959
- spikeyness = np.clip(spikeyness, 1e-9, 1)
960
-
961
- # generate n angle steps
962
- lower = (tau / n) - irregularity
963
- upper = (tau / n) + irregularity
964
- angle_steps = rng.uniform(lower, upper, n)
965
-
966
- # normalize the steps so that point 0 and point n+1 are the same
967
- k = angle_steps.sum() / (2 * np.pi)
968
- angles = (angle_steps / k).cumsum() + rng.uniform(0, tau)
969
-
970
- # Convert high and low values to be wrt the standard normal range
971
- # https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.truncnorm.html
972
- low = 0
973
- high = 2 * radius
974
- mean = radius
975
- std = spikeyness
976
- a = (low - mean) / std
977
- b = (high - mean) / std
978
- tnorm = truncnorm(a=a, b=b, loc=mean, scale=std)
979
-
980
- # now generate the points
981
- radii = tnorm.rvs(n, random_state=rng)
982
- x_pts = cx + radii * np.cos(angles)
983
- y_pts = cy + radii * np.sin(angles)
984
-
985
- points = np.hstack([x_pts[:, None], y_pts[:, None]])
986
-
987
- # Scale to 0-1 space
988
- points = points - points.min(axis=0)
989
- points = points / points.max(axis=0)
990
-
991
- # Randomly place within 0-1 space
992
- points = points * (rng.rand() * .8 + .2)
993
- min_pt = points.min(axis=0)
994
- max_pt = points.max(axis=0)
995
-
996
- high = (1 - max_pt)
997
- low = (0 - min_pt)
998
- offset = (rng.rand(2) * (high - low)) + low
999
- points = points + offset
1000
- return points
1001
-
1002
- def _order_vertices(verts):
1003
- """
1004
- References:
1005
- https://stackoverflow.com/questions/1709283/how-can-i-sort-a-coordinate-list-for-a-rectangle-counterclockwise
1006
- """
1007
- mlat = verts.T[0].sum() / len(verts)
1008
- mlng = verts.T[1].sum() / len(verts)
1009
-
1010
- tau = np.pi * 2
1011
- angle = (np.arctan2(mlat - verts.T[0], verts.T[1] - mlng) +
1012
- tau) % tau
1013
- sortx = angle.argsort()
1014
- verts = verts.take(sortx, axis=0)
1015
- return verts
1016
-
1017
- # Generate a random exterior for each requested mask
1018
- masks = []
1019
- for _ in range(num_masks):
1020
- exterior = _order_vertices(_gen_polygon(n_verts, 0.9, 0.9))
1021
- exterior = (exterior * [(width, height)]).astype(dtype)
1022
- masks.append([exterior.ravel()])
1023
-
1024
- self = cls(masks, height, width)
1025
- return self
1026
-
1027
-
1028
- def polygon_to_bitmap(polygons, height, width):
1029
- """Convert masks from the form of polygons to bitmaps.
1030
-
1031
- Args:
1032
- polygons (list[ndarray]): masks in polygon representation
1033
- height (int): mask height
1034
- width (int): mask width
1035
-
1036
- Return:
1037
- ndarray: the converted masks in bitmap representation
1038
- """
1039
- rles = maskUtils.frPyObjects(polygons, height, width)
1040
- rle = maskUtils.merge(rles)
1041
- bitmap_mask = maskUtils.decode(rle).astype(np.bool)
1042
- return bitmap_mask
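The `_polygon_area` helper in the deleted file above computes instance areas with the shoelace formula. As a sanity check, the same one-liner can be exercised standalone with plain NumPy (no mmdet dependency; `polygon_area` here is a local re-statement of the method body, not an mmdet API):

```python
import numpy as np

def polygon_area(x, y):
    # Shoelace formula: 0.5 * |sum_i (x_i * y_{i-1} - y_i * x_{i-1})|
    return 0.5 * np.abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Unit square with vertices (0,0), (1,0), (1,1), (0,1) -> area 1.0
x = np.array([0.0, 1.0, 1.0, 0.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
print(polygon_area(x, y))  # 1.0
```

The absolute value makes the result independent of vertex winding order, which is why `_order_vertices` sorting vertices CCW is not required for correct areas.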
spaces/CVPR/WALT/mmdet/models/detectors/trident_faster_rcnn.py DELETED
@@ -1,66 +0,0 @@
- from ..builder import DETECTORS
- from .faster_rcnn import FasterRCNN
-
-
- @DETECTORS.register_module()
- class TridentFasterRCNN(FasterRCNN):
-     """Implementation of `TridentNet <https://arxiv.org/abs/1901.01892>`_"""
-
-     def __init__(self,
-                  backbone,
-                  rpn_head,
-                  roi_head,
-                  train_cfg,
-                  test_cfg,
-                  neck=None,
-                  pretrained=None):
-
-         super(TridentFasterRCNN, self).__init__(
-             backbone=backbone,
-             neck=neck,
-             rpn_head=rpn_head,
-             roi_head=roi_head,
-             train_cfg=train_cfg,
-             test_cfg=test_cfg,
-             pretrained=pretrained)
-         assert self.backbone.num_branch == self.roi_head.num_branch
-         assert self.backbone.test_branch_idx == self.roi_head.test_branch_idx
-         self.num_branch = self.backbone.num_branch
-         self.test_branch_idx = self.backbone.test_branch_idx
-
-     def simple_test(self, img, img_metas, proposals=None, rescale=False):
-         """Test without augmentation."""
-         assert self.with_bbox, 'Bbox head must be implemented.'
-         x = self.extract_feat(img)
-         if proposals is None:
-             num_branch = (self.num_branch if self.test_branch_idx == -1 else 1)
-             trident_img_metas = img_metas * num_branch
-             proposal_list = self.rpn_head.simple_test_rpn(x, trident_img_metas)
-         else:
-             proposal_list = proposals
-             # keep metas defined when proposals are supplied externally
-             trident_img_metas = img_metas
-
-         return self.roi_head.simple_test(
-             x, proposal_list, trident_img_metas, rescale=rescale)
-
-     def aug_test(self, imgs, img_metas, rescale=False):
-         """Test with augmentations.
-
-         If rescale is False, then returned bboxes and masks will fit the scale
-         of imgs[0].
-         """
-         x = self.extract_feats(imgs)
-         num_branch = (self.num_branch if self.test_branch_idx == -1 else 1)
-         trident_img_metas = [metas * num_branch for metas in img_metas]
-         proposal_list = self.rpn_head.aug_test_rpn(x, trident_img_metas)
-         return self.roi_head.aug_test(
-             x, proposal_list, img_metas, rescale=rescale)
-
-     def forward_train(self, img, img_metas, gt_bboxes, gt_labels, **kwargs):
-         """Make copies of img metas and gts to fit multi-branch."""
-         trident_gt_bboxes = tuple(gt_bboxes * self.num_branch)
-         trident_gt_labels = tuple(gt_labels * self.num_branch)
-         trident_img_metas = tuple(img_metas * self.num_branch)
-
-         return super(TridentFasterRCNN,
-                      self).forward_train(img, trident_img_metas,
-                                          trident_gt_bboxes, trident_gt_labels)
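The multi-branch bookkeeping above boils down to repeating the per-image metas and ground truths once per trident branch, so every branch processes its own copy of the batch. A minimal plain-Python sketch (the dict contents are illustrative placeholders, not real mmdet metas):

```python
# Duplicate metas / ground truths per branch, as TridentFasterRCNN does
# with `img_metas * num_branch` and `tuple(gt_labels * num_branch)`.
num_branch = 3
img_metas = [{'img_shape': (800, 1333)}]   # one image in the batch
gt_labels = [[1, 2]]                        # labels for that image

trident_img_metas = img_metas * num_branch
trident_gt_labels = tuple(gt_labels * num_branch)

print(len(trident_img_metas))   # 3 copies, one per branch
```

Because list multiplication repeats references rather than deep-copying, all branches share the same underlying meta dicts; that is fine here since the heads only read them.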
spaces/Corran/qnagenerator/app.py DELETED
@@ -1,48 +0,0 @@
- import gradio as gr
- import torch
- import random
-
- from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, AutoModelWithLMHead
- from sentence_splitter import SentenceSplitter, split_text_into_sentences
- splitter = SentenceSplitter(language='en')
-
- if torch.cuda.is_available():
-     torch_device = "cuda:0"
- else:
-     torch_device = "cpu"
-
- ptokenizer = AutoTokenizer.from_pretrained("tuner007/pegasus_paraphrase")
- pmodel = AutoModelForSeq2SeqLM.from_pretrained("tuner007/pegasus_paraphrase").to(torch_device)
-
- def get_answer(input_text, num_return_sequences, num_beams):
-     batch = ptokenizer([input_text], truncation=True, padding='longest', max_length=60, return_tensors="pt").to(torch_device)
-     translated = pmodel.generate(**batch, max_length=60, num_beams=num_beams, num_return_sequences=num_return_sequences, temperature=1.5)
-     tgt_text = ptokenizer.batch_decode(translated, skip_special_tokens=True)
-     return tgt_text
-
- qtokenizer = AutoTokenizer.from_pretrained("mrm8488/t5-base-finetuned-question-generation-ap")
- qmodel = AutoModelWithLMHead.from_pretrained("mrm8488/t5-base-finetuned-question-generation-ap").to(torch_device)
-
- def get_question(answer, context, max_length=64):
-     input_text = "answer: %s context: %s </s>" % (answer, context)
-     features = qtokenizer([input_text], return_tensors='pt').to(torch_device)
-
-     output = qmodel.generate(input_ids=features['input_ids'],
-                              attention_mask=features['attention_mask'],
-                              max_length=max_length)
-
-     return qtokenizer.decode(output[0])
-
- def getqna(text):
-     sentences = split_text_into_sentences(text=text, language='en')
-     if len(sentences) <= 1:
-         answer = get_answer(text, 10, 10)[random.randint(0, 9)]
-     else:
-         paraphrased = [get_answer(sentence, 10, 10)[random.randint(0, 9)] for sentence in sentences]
-         answer = " ".join(paraphrased)
-         answer = get_answer(answer, 10, 10)[random.randint(0, 9)]
-     question = get_question(answer, text).replace("<pad>", "").replace("</s>", "")
-     return "%s \nanswer: %s" % (question, answer)
-
- app = gr.Interface(fn=getqna, inputs="text", outputs="text")
- app.launch()
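The control flow in `getqna` — split the text into sentences, pick one of several paraphrase candidates per sentence, rejoin — can be sketched without loading the models, using a stand-in paraphraser (`fake_paraphrase` and `pick` are illustrative stubs, not part of the Space's code):

```python
import random

def fake_paraphrase(text, num_return_sequences, num_beams):
    # Stand-in for get_answer(): returns candidate paraphrases of `text`.
    return [f"{text} (v{i})" for i in range(num_return_sequences)]

def pick(candidates):
    # Mirrors the app's `[random.randint(0, 9)]` random-candidate selection.
    return candidates[random.randint(0, len(candidates) - 1)]

sentences = ["Cats sleep a lot.", "Dogs like walks."]
answer = " ".join(pick(fake_paraphrase(s, 10, 10)) for s in sentences)
print(answer)
```

This makes the structure easy to test in isolation; swapping `fake_paraphrase` for the real Pegasus-backed `get_answer` recovers the app's behavior.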
spaces/DHEIVER/Kidney_Image_Classifier/app.py DELETED
@@ -1,29 +0,0 @@
- import gradio as gr
- import tensorflow as tf
- import tensorflow_io as tfio
- import numpy as np
-
- loaded_model = tf.keras.models.load_model('kidney2.h5')
-
- label_names = {
-     "1": "Cyst",
-     "2": "Normal",
-     "3": "Stone",
-     "4": "Tumor"
- }
-
- def classify_kidney_image(img):
-     resized = tf.image.resize(img, (224, 224))
-     rgb = tfio.experimental.color.bgr_to_rgb(resized)
-     normalized_img = rgb / 255.0
-     yhat = loaded_model.predict(np.expand_dims(normalized_img, 0))
-     class_index = np.argmax(yhat, axis=1)[0]
-     predicted_label = label_names[str(class_index + 1)]
-     probabilities = {label_names[str(i + 1)]: str(prob) for i, prob in enumerate(yhat[0])}
-     return predicted_label, probabilities
-
- image = gr.inputs.Image(shape=(224, 224))
- label = gr.outputs.Label()
-
- app = gr.Interface(fn=classify_kidney_image, inputs=image, outputs=label, interpretation='default', title='Kidney Image Classifier')
- app.launch(debug=True)
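The post-prediction logic in `classify_kidney_image` — argmax over the score row, shift the index by one to match the 1-based `label_names` keys, and build a per-class probability dict — can be checked with a hard-coded score row in place of `loaded_model.predict(...)` (the `yhat` values below are made up for illustration):

```python
import numpy as np

label_names = {"1": "Cyst", "2": "Normal", "3": "Stone", "4": "Tumor"}

# Stand-in for loaded_model.predict(...): one row of class scores.
yhat = np.array([[0.05, 0.10, 0.80, 0.05]])
class_index = np.argmax(yhat, axis=1)[0]             # index of top score: 2
predicted_label = label_names[str(class_index + 1)]  # 1-based key lookup
probabilities = {label_names[str(i + 1)]: str(p) for i, p in enumerate(yhat[0])}
print(predicted_label)  # Stone
```

The `+ 1` offset is the easy thing to get wrong here: the model outputs 0-based indices while the label map is keyed `"1"`..`"4"`.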
spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/gradio/templates/frontend/assets/index-ec39e521.css DELETED
@@ -1 +0,0 @@
1
- @font-face{font-family:KaTeX_AMS;font-style:normal;font-weight:400;src:url(./KaTeX_AMS-Regular-0cdd387c.woff2) format("woff2"),url(./KaTeX_AMS-Regular-30da91e8.woff) format("woff"),url(./KaTeX_AMS-Regular-68534840.ttf) format("truetype")}@font-face{font-family:KaTeX_Caligraphic;font-style:normal;font-weight:700;src:url(./KaTeX_Caligraphic-Bold-de7701e4.woff2) format("woff2"),url(./KaTeX_Caligraphic-Bold-1ae6bd74.woff) format("woff"),url(./KaTeX_Caligraphic-Bold-07d8e303.ttf) format("truetype")}@font-face{font-family:KaTeX_Caligraphic;font-style:normal;font-weight:400;src:url(./KaTeX_Caligraphic-Regular-5d53e70a.woff2) format("woff2"),url(./KaTeX_Caligraphic-Regular-3398dd02.woff) format("woff"),url(./KaTeX_Caligraphic-Regular-ed0b7437.ttf) format("truetype")}@font-face{font-family:KaTeX_Fraktur;font-style:normal;font-weight:700;src:url(./KaTeX_Fraktur-Bold-74444efd.woff2) format("woff2"),url(./KaTeX_Fraktur-Bold-9be7ceb8.woff) format("woff"),url(./KaTeX_Fraktur-Bold-9163df9c.ttf) format("truetype")}@font-face{font-family:KaTeX_Fraktur;font-style:normal;font-weight:400;src:url(./KaTeX_Fraktur-Regular-51814d27.woff2) format("woff2"),url(./KaTeX_Fraktur-Regular-5e28753b.woff) format("woff"),url(./KaTeX_Fraktur-Regular-1e6f9579.ttf) format("truetype")}@font-face{font-family:KaTeX_Main;font-style:normal;font-weight:700;src:url(./KaTeX_Main-Bold-0f60d1b8.woff2) format("woff2"),url(./KaTeX_Main-Bold-c76c5d69.woff) format("woff"),url(./KaTeX_Main-Bold-138ac28d.ttf) format("truetype")}@font-face{font-family:KaTeX_Main;font-style:italic;font-weight:700;src:url(./KaTeX_Main-BoldItalic-99cd42a3.woff2) format("woff2"),url(./KaTeX_Main-BoldItalic-a6f7ec0d.woff) format("woff"),url(./KaTeX_Main-BoldItalic-70ee1f64.ttf) format("truetype")}@font-face{font-family:KaTeX_Main;font-style:italic;font-weight:400;src:url(./KaTeX_Main-Italic-97479ca6.woff2) format("woff2"),url(./KaTeX_Main-Italic-f1d6ef86.woff) format("woff"),url(./KaTeX_Main-Italic-0d85ae7c.ttf) 
format("truetype")}@font-face{font-family:KaTeX_Main;font-style:normal;font-weight:400;src:url(./KaTeX_Main-Regular-c2342cd8.woff2) format("woff2"),url(./KaTeX_Main-Regular-c6368d87.woff) format("woff"),url(./KaTeX_Main-Regular-d0332f52.ttf) format("truetype")}@font-face{font-family:KaTeX_Math;font-style:italic;font-weight:700;src:url(./KaTeX_Math-BoldItalic-dc47344d.woff2) format("woff2"),url(./KaTeX_Math-BoldItalic-850c0af5.woff) format("woff"),url(./KaTeX_Math-BoldItalic-f9377ab0.ttf) format("truetype")}@font-face{font-family:KaTeX_Math;font-style:italic;font-weight:400;src:url(./KaTeX_Math-Italic-7af58c5e.woff2) format("woff2"),url(./KaTeX_Math-Italic-8a8d2445.woff) format("woff"),url(./KaTeX_Math-Italic-08ce98e5.ttf) format("truetype")}@font-face{font-family:KaTeX_SansSerif;font-style:normal;font-weight:700;src:url(./KaTeX_SansSerif-Bold-e99ae511.woff2) format("woff2"),url(./KaTeX_SansSerif-Bold-ece03cfd.woff) format("woff"),url(./KaTeX_SansSerif-Bold-1ece03f7.ttf) format("truetype")}@font-face{font-family:KaTeX_SansSerif;font-style:italic;font-weight:400;src:url(./KaTeX_SansSerif-Italic-00b26ac8.woff2) format("woff2"),url(./KaTeX_SansSerif-Italic-91ee6750.woff) format("woff"),url(./KaTeX_SansSerif-Italic-3931dd81.ttf) format("truetype")}@font-face{font-family:KaTeX_SansSerif;font-style:normal;font-weight:400;src:url(./KaTeX_SansSerif-Regular-68e8c73e.woff2) format("woff2"),url(./KaTeX_SansSerif-Regular-11e4dc8a.woff) format("woff"),url(./KaTeX_SansSerif-Regular-f36ea897.ttf) format("truetype")}@font-face{font-family:KaTeX_Script;font-style:normal;font-weight:400;src:url(./KaTeX_Script-Regular-036d4e95.woff2) format("woff2"),url(./KaTeX_Script-Regular-d96cdf2b.woff) format("woff"),url(./KaTeX_Script-Regular-1c67f068.ttf) format("truetype")}@font-face{font-family:KaTeX_Size1;font-style:normal;font-weight:400;src:url(./KaTeX_Size1-Regular-6b47c401.woff2) format("woff2"),url(./KaTeX_Size1-Regular-c943cc98.woff) 
format("woff"),url(./KaTeX_Size1-Regular-95b6d2f1.ttf) format("truetype")}@font-face{font-family:KaTeX_Size2;font-style:normal;font-weight:400;src:url(./KaTeX_Size2-Regular-d04c5421.woff2) format("woff2"),url(./KaTeX_Size2-Regular-2014c523.woff) format("woff"),url(./KaTeX_Size2-Regular-a6b2099f.ttf) format("truetype")}@font-face{font-family:KaTeX_Size3;font-style:normal;font-weight:400;src:url(data:font/woff2;base64,d09GMgABAAAAAA4oAA4AAAAAHbQAAA3TAAEAAAAAAAAAAAAAAAAAAAAAAAAAAAAABmAAgRQIDgmcDBEICo1oijYBNgIkA14LMgAEIAWJAAeBHAyBHBvbGiMRdnO0IkRRkiYDgr9KsJ1NUAf2kILNxgUmgqIgq1P89vcbIcmsQbRps3vCcXdYOKSWEPEKgZgQkprQQsxIXUgq0DqpGKmIvrgkeVGtEQD9DzAO29fM9jYhxZEsL2FeURH2JN4MIcTdO049NCVdxQ/w9NrSYFEBKTDKpLKfNkCGDc1RwjZLQcm3vqJ2UW9Xfa3tgAHz6ivp6vgC2yD4/6352ndnN0X0TL7seypkjZlMsjmZnf0Mm5Q+JykRWQBKCVCVPbARPXWyQtb5VgLB6Biq7/Uixcj2WGqdI8tGSgkuRG+t910GKP2D7AQH0DB9FMDW/obJZ8giFI3Wg8Cvevz0M+5m0rTh7XDBlvo9Y4vm13EXmfttwI4mBo1EG15fxJhUiCLbiiyCf/ZA6MFAhg3pGIZGdGIVjtPn6UcMk9A/UUr9PhoNsCENw1APAq0gpH73e+M+0ueyHbabc3vkbcdtzcf/fiy+NxQEjf9ud/ELBHAXJ0nk4z+MXH2Ev/kWyV4k7SkvpPc9Qr38F6RPWnM9cN6DJ0AdD1BhtgABtmoRoFCvPsBAumNm6soZG2Gk5GyVTo2sJncSyp0jQTYoR6WDvTwaaEcHsxHfvuWhHA3a6bN7twRKtcGok6NsCi7jYRrM2jExsUFMxMQYuJbMhuWNOumEJy9hi29Dmg5zMp/A5+hhPG19j1vBrq8JTLr8ki5VLPmG/PynJHVul440bxg5xuymHUFPBshC+nA9I1FmwbRBTNHAcik3Oae0cxKoI3MOriM42UrPe51nsaGxJ+WfXubAsP84aabUlQSJ1IiE0iPETLUU4CATgfXSCSpuRFRmCGbO+wSpAnzaeaCYW1VNEysRtuXCEL1kUFUbbtMv3Tilt/1c11jt3Q5bbMa84cpWipp8Elw3MZhOHsOlwwVUQM3lAR35JiFQbaYCRnMF2lxAWoOg2gyoIV4PouX8HytNIfLhqpJtXB4vjiViUI8IJ7bkC4ikkQvKksnOTKICwnqWSZ9YS5f0WCxmpgjbIq7EJcM4aI2nmhLNY2JIUgOjXZFWBHb+x5oh6cwb0Tv1ackHdKi0I9OO2wE9aogIOn540CCCziyhN+IaejtgAONKznHlHyutPrHGwCx9S6B8kfS4Mfi4Eyv7OU730bT1SCBjt834cXsf43zVjPUqqJjgrjeGnBxSG4aYAKFuVbeCfkDIjAqMb6yLNIbCuvXhMH2/+k2vkNpkORhR59N1CkzoOENvneIosjYmuTxlhUzaGEJQ/iWqx4dmwpmKjrwTiTGTCVozNAYqk/zXOndWxuWSmJkQpJw3pK5KX6QrLt5LATMqpmPAQhkhK6PUjzHUn7E0gHE0kPE0iKkolgkUx9SZmVAdDgpffdyJKg3k7VmzYGCwVXGz/tXmkOIp+vcWs+EMuhhvN0h9uhfzWJziBQmCREGSIFmQIkgVpAnSBRmC//6hkLZwaVhw
xlrJSOdqlFtOYxlau9F2QN5Y98xmIAsiM1HVp2VFX+DHHGg6Ecjh3vmqtidX3qHI2qycTk/iwxSt5UzTmEP92ZBnEWTk4Mx8Mpl78ZDokxg/KWb+Q0QkvdKVmq3TMW+RXEgrsziSAfNXFMhDc60N5N9jQzjfO0kBKpUZl0ZmwJ41j/B9Hz6wmRaJB84niNmQrzp9eSlQCDDzazGDdVi3P36VZQ+Jy4f9UBNp+3zTjqI4abaFAm+GShVaXlsGdF3FYzZcDI6cori4kMxUECl9IjJZpzkvitAoxKue+90pDMvcKRxLl53TmOKCmV/xRolNKSqqUxc6LStOETmFOiLZZptlZepcKiAzteG8PEdpnQpbOMNcMsR4RR2Bs0cKFEvSmIjAFcnarqwUL4lDhHmnVkwu1IwshbiCcgvOheZuYyOteufZZwlcTlLgnZ3o/WcYdzZHW/WGaqaVfmTZ1aWCceJjkbZqsfbkOtcFlUZM/jy+hXHDbaUobWqqXaeWobbLO99yG5N3U4wxco0rQGGcOLASFMXeJoham8M+/x6O2WywK2l4HGbq1CoUyC/IZikQhdq3SiuNrvAEj0AVu9x2x3lp/xWzahaxidezFVtdcb5uEnzyl0ZmYiuKI0exvCd4Xc9CV1KB0db00z92wDPde0kukbvZIWN6jUWFTmPIC/Y4UPCm8UfDTFZpZNon1qLFTkBhxzB+FjQRA2Q/YRJT8pQigslMaUpFyAG8TMlXigiqmAZX4xgijKjRlGpLE0GdplRfCaJo0JQaSxNBk6ZmMzcya0FmrcisDdn0Q3HI2sWSppYigmlM1XT/kLQZSNpMJG0WkjYbSZuDpM1F0uYhFc1HxU4m1QJjDK6iL0S5uSj5rgXc3RejEigtcRBtqYPQsiTskmO5vosV+q4VGIKbOkDg0jtRrq+Em1YloaTFar3EGr1EUC8R0kus1Uus00usL97ABr2BjXoDm/QGNhuWtMVBKOwg/i78lT7hBsAvDmwHc/ao3vmUbBmhjeYySZNWvGkfZAgISDSaDo1SVpzGDsAEkF8B+gEapViUoZgUWXcRIGFZNm6gWbAKk0bp0k1MHG9fLYtV4iS2SmLEQFARzRcnf9PUS0LVn05/J9MiRRBU3v2IrvW974v4N00L7ZMk0wXP1409CHo/an8zTRHD3eSJ6m8D4YMkZNl3M79sqeuAsr/m3f+8/yl7A50aiAEJgeBeMWzu7ui9UfUBCe2TIqZIoOd/3/udRBOQidQZUERzb2/VwZN1H/Sju82ew2H2Wfr6qvfVf3hqwDvAIpkQVFy4B9Pe9e4/XvPeceu7h3dvO56iJPf0+A6cqA2ip18ER+iFgggiuOkvj24bby0N9j2UHIkgqIt+sVgfodC4YghLSMjSZbH0VR/6dMDrYJeKHilKTemt6v6kvzvn3/RrdWtr0GoN/xL+Sex/cPYLUpepx9cz/D46UPU5KXgAQa+NDps1v6J3xP1i2HtaDB0M9aX2deA7SYff//+gUCovMmIK/qfsFcOk+4Y5ZN97XlG6zebqtMbKgeRFi51vnxTQYBUik2rS/Cn6PC8ADR8FGxsRPB82dzfND90gIcshOcYUkfjherBz53odpm6TP8txlwOZ71xmfHHOvq053qFF/MRlS3jP0ELudrf2OeN8DHvp6ZceLe8qKYvWz/7yp0u4dKPfli3CYq0O13Ih71mylJ80tOi10On8wi+F4+LWgDPeJ30msSQt9/vkmHq9/Lvo2b461mP801v3W4xTcs6CbvF9UDdrSt+A8OUbpSh55qAUFXWznBBfdeJ8a4d7ugT5tvxUza3h9m4H7ptTqiG4z0g5dc0X29OcGlhpGFMpQo9ytTS+NViZpNdvU4kWx+LKxNY10kQ1yqGXrhe4/1nvP7E+nd5A92TtaRplbHSqoIdOqtRWti+fkB5/n1+/VvCmz12pG1kpQWsfi1ftlBobm0bpngs16CHkbIwdLnParxtTV3QYRlfJ0KFskH7p
dN/YDn+yRuSd7sNH3aO0DYPggk6uWuXrfOc+fa3VTxFVvKaNxHsiHmsXyCLIE5yuOeN3/Jdf8HBL/5M6shjyhxHx9BjB1O0+4NLOnjLLSxwO7ukN4jMbOIcD879KLSi6Pk61Oqm2377n8079PXEEQ7cy7OKEC9nbpet118fxweTafpt69x/Bt8UqGzNQt7aelpc44dn5cqhwf71+qKp/Zf/+a0zcizOUWpl/iBcSXip0pplkatCchoH5c5aUM8I7/dWxAej8WicPL1URFZ9BDJelUwEwTkGqUhgSlydVes95YdXvhh9Gfz/aeFWvgVb4tuLbcv4+wLdutVZv/cUonwBD/6eDlE0aSiKK/uoH3+J1wDE/jMVqY2ysGufN84oIXB0sPzy8ollX/LegY74DgJXJR57sn+VGza0x3DnuIgABFM15LmajjjsNlYj+JEZGbuRYcAMOWxFkPN2w6Wd46xo4gVWQR/X4lyI/R6K/YK0110GzudPRW7Y+UOBGTfNNzHeYT0fiH0taunBpq9HEW8OKSaBGj21L0MqenEmNRWBAWDWAk4CpNoEZJ2tTaPFgbQYj8HxtFilErs3BTRwT8uO1NXQaWfIotchmPkAF5mMBAliEmZiOGVgCG9LgRzpscMAOOwowlT3JhusdazXGSC/hxR3UlmWVwWHpOIKheqONvjyhSiTHIkVUco5bnji8m//zL7PKaT1Vl5I6UE609f+gkr6MZKVyKc7zJRmCahLsdlyA5fdQkRSan9LgnnLEyGSkaKJCJog0wAgvepWBt80+1yKln1bMVtCljfNWDueKLsWwaEbBSfSPTEmVRsUcYYMnEjcjeyCZzBXK9E9BYBXLKjOSpUDR+nEV3TFSUdQaz+ot98QxgXwx0GQ+EEUAKB2qZPkQQ0GqFD8UPFMqyaCHM24BZmSGic9EYMagKizOw9Hz50DMrDLrqqLkTAhplMictiCAx5S3BIUQdeJeLnBy2CNtMfz6cV4u8XKoFZQesbf9YZiIERiHjaNodDW6LgcirX/mPnJIkBGDUpTBhSa0EIr38D5hCIszhCM8URGBqImoWjpvpt1ebu/v3Gl3qJfMnNM+9V+kiRFyROTPHQWOcs1dNW94/ukKMPZBvDi55i5CttdeJz84DLngLqjcdwEZ87bFFR8CIG35OAkDVN6VRDZ7aq67NteYqZ2lpT8oYB2CytoBd6VuAx4WgiAsnuj3WohG+LugzXiQRDeM3XYXlULv4dp5VFYC) format("woff2"),url(./KaTeX_Size3-Regular-6ab6b62e.woff) format("woff"),url(./KaTeX_Size3-Regular-500e04d5.ttf) format("truetype")}@font-face{font-family:KaTeX_Size4;font-style:normal;font-weight:400;src:url(./KaTeX_Size4-Regular-a4af7d41.woff2) format("woff2"),url(./KaTeX_Size4-Regular-99f9c675.woff) format("woff"),url(./KaTeX_Size4-Regular-c647367d.ttf) format("truetype")}@font-face{font-family:KaTeX_Typewriter;font-style:normal;font-weight:400;src:url(./KaTeX_Typewriter-Regular-71d517d6.woff2) format("woff2"),url(./KaTeX_Typewriter-Regular-e14fed02.woff) format("woff"),url(./KaTeX_Typewriter-Regular-f01f3e87.ttf) format("truetype")}.gradio-container-3-37-0 .katex{text-rendering:auto;font: 1.21em KaTeX_Main,Times New 
Roman,serif;line-height:1.2;text-indent:0}.gradio-container-3-37-0 .katex *{-ms-high-contrast-adjust:none!important;border-color:currentColor}.gradio-container-3-37-0 .katex .katex-version:after{content:"0.16.7"}.gradio-container-3-37-0 .katex .katex-mathml{clip:rect(1px,1px,1px,1px);border:0;height:1px;overflow:hidden;padding:0;position:absolute;width:1px}.gradio-container-3-37-0 .katex .katex-html>.newline{display:block}.gradio-container-3-37-0 .katex .base{position:relative;white-space:nowrap;width:-webkit-min-content;width:-moz-min-content;width:min-content}.gradio-container-3-37-0 .katex .base,.gradio-container-3-37-0 .katex .strut{display:inline-block}.gradio-container-3-37-0 .katex .textbf{font-weight:700}.gradio-container-3-37-0 .katex .textit{font-style:italic}.gradio-container-3-37-0 .katex .textrm{font-family:KaTeX_Main}.gradio-container-3-37-0 .katex .textsf{font-family:KaTeX_SansSerif}.gradio-container-3-37-0 .katex .texttt{font-family:KaTeX_Typewriter}.gradio-container-3-37-0 .katex .mathnormal{font-family:KaTeX_Math;font-style:italic}.gradio-container-3-37-0 .katex .mathit{font-family:KaTeX_Main;font-style:italic}.gradio-container-3-37-0 .katex .mathrm{font-style:normal}.gradio-container-3-37-0 .katex .mathbf{font-family:KaTeX_Main;font-weight:700}.gradio-container-3-37-0 .katex .boldsymbol{font-family:KaTeX_Math;font-style:italic;font-weight:700}.gradio-container-3-37-0 .katex .amsrm,.gradio-container-3-37-0 .katex .mathbb,.gradio-container-3-37-0 .katex .textbb{font-family:KaTeX_AMS}.gradio-container-3-37-0 .katex .mathcal{font-family:KaTeX_Caligraphic}.gradio-container-3-37-0 .katex .mathfrak,.gradio-container-3-37-0 .katex .textfrak{font-family:KaTeX_Fraktur}.gradio-container-3-37-0 .katex .mathtt{font-family:KaTeX_Typewriter}.gradio-container-3-37-0 .katex .mathscr,.gradio-container-3-37-0 .katex .textscr{font-family:KaTeX_Script}.gradio-container-3-37-0 .katex .mathsf,.gradio-container-3-37-0 .katex 
.textsf{font-family:KaTeX_SansSerif}.gradio-container-3-37-0 .katex .mathboldsf,.gradio-container-3-37-0 .katex .textboldsf{font-family:KaTeX_SansSerif;font-weight:700}.gradio-container-3-37-0 .katex .mathitsf,.gradio-container-3-37-0 .katex .textitsf{font-family:KaTeX_SansSerif;font-style:italic}.gradio-container-3-37-0 .katex .mainrm{font-family:KaTeX_Main;font-style:normal}.gradio-container-3-37-0 .katex .vlist-t{border-collapse:collapse;display:inline-table;table-layout:fixed}.gradio-container-3-37-0 .katex .vlist-r{display:table-row}.gradio-container-3-37-0 .katex .vlist{display:table-cell;position:relative;vertical-align:bottom}.gradio-container-3-37-0 .katex .vlist>span{display:block;height:0;position:relative}.gradio-container-3-37-0 .katex .vlist>span>span{display:inline-block}.gradio-container-3-37-0 .katex .vlist>span>.pstrut{overflow:hidden;width:0}.gradio-container-3-37-0 .katex .vlist-t2{margin-right:-2px}.gradio-container-3-37-0 .katex .vlist-s{display:table-cell;font-size:1px;min-width:2px;vertical-align:bottom;width:2px}.gradio-container-3-37-0 .katex .vbox{align-items:baseline;display:inline-flex;flex-direction:column}.gradio-container-3-37-0 .katex .hbox{width:100%}.gradio-container-3-37-0 .katex .hbox,.gradio-container-3-37-0 .katex .thinbox{display:inline-flex;flex-direction:row}.gradio-container-3-37-0 .katex .thinbox{max-width:0;width:0}.gradio-container-3-37-0 .katex .msupsub{text-align:left}.gradio-container-3-37-0 .katex .mfrac>span>span{text-align:center}.gradio-container-3-37-0 .katex .mfrac .frac-line{border-bottom-style:solid;display:inline-block;width:100%}.gradio-container-3-37-0 .katex .hdashline,.gradio-container-3-37-0 .katex .hline,.gradio-container-3-37-0 .katex .mfrac .frac-line,.gradio-container-3-37-0 .katex .overline .overline-line,.gradio-container-3-37-0 .katex .rule,.gradio-container-3-37-0 .katex .underline .underline-line{min-height:1px}.gradio-container-3-37-0 .katex 
.mspace{display:inline-block}.gradio-container-3-37-0 .katex .clap,.gradio-container-3-37-0 .katex .llap,.gradio-container-3-37-0 .katex .rlap{position:relative;width:0}.gradio-container-3-37-0 .katex .clap>.inner,.gradio-container-3-37-0 .katex .llap>.inner,.gradio-container-3-37-0 .katex .rlap>.inner{position:absolute}.gradio-container-3-37-0 .katex .clap>.fix,.gradio-container-3-37-0 .katex .llap>.fix,.gradio-container-3-37-0 .katex .rlap>.fix{display:inline-block}.gradio-container-3-37-0 .katex .llap>.inner{right:0}.gradio-container-3-37-0 .katex .clap>.inner,.gradio-container-3-37-0 .katex .rlap>.inner{left:0}.gradio-container-3-37-0 .katex .clap>.inner>span{margin-left:-50%;margin-right:50%}.gradio-container-3-37-0 .katex .rule{border:0 solid;display:inline-block;position:relative}.gradio-container-3-37-0 .katex .hline,.gradio-container-3-37-0 .katex .overline .overline-line,.gradio-container-3-37-0 .katex .underline .underline-line{border-bottom-style:solid;display:inline-block;width:100%}.gradio-container-3-37-0 .katex .hdashline{border-bottom-style:dashed;display:inline-block;width:100%}.gradio-container-3-37-0 .katex .sqrt>.root{margin-left:.27777778em;margin-right:-.55555556em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size1,.gradio-container-3-37-0 .katex .sizing.reset-size1.size1{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size2,.gradio-container-3-37-0 .katex .sizing.reset-size1.size2{font-size:1.2em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size3,.gradio-container-3-37-0 .katex .sizing.reset-size1.size3{font-size:1.4em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size4,.gradio-container-3-37-0 .katex .sizing.reset-size1.size4{font-size:1.6em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size5,.gradio-container-3-37-0 .katex .sizing.reset-size1.size5{font-size:1.8em}.gradio-container-3-37-0 .katex 
.fontsize-ensurer.reset-size1.size6,.gradio-container-3-37-0 .katex .sizing.reset-size1.size6{font-size:2em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size7,.gradio-container-3-37-0 .katex .sizing.reset-size1.size7{font-size:2.4em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size8,.gradio-container-3-37-0 .katex .sizing.reset-size1.size8{font-size:2.88em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size9,.gradio-container-3-37-0 .katex .sizing.reset-size1.size9{font-size:3.456em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size10,.gradio-container-3-37-0 .katex .sizing.reset-size1.size10{font-size:4.148em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size1.size11,.gradio-container-3-37-0 .katex .sizing.reset-size1.size11{font-size:4.976em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size1,.gradio-container-3-37-0 .katex .sizing.reset-size2.size1{font-size:.83333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size2,.gradio-container-3-37-0 .katex .sizing.reset-size2.size2{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size3,.gradio-container-3-37-0 .katex .sizing.reset-size2.size3{font-size:1.16666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size4,.gradio-container-3-37-0 .katex .sizing.reset-size2.size4{font-size:1.33333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size5,.gradio-container-3-37-0 .katex .sizing.reset-size2.size5{font-size:1.5em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size6,.gradio-container-3-37-0 .katex .sizing.reset-size2.size6{font-size:1.66666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size7,.gradio-container-3-37-0 .katex .sizing.reset-size2.size7{font-size:2em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size8,.gradio-container-3-37-0 .katex 
.sizing.reset-size2.size8{font-size:2.4em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size9,.gradio-container-3-37-0 .katex .sizing.reset-size2.size9{font-size:2.88em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size10,.gradio-container-3-37-0 .katex .sizing.reset-size2.size10{font-size:3.45666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size2.size11,.gradio-container-3-37-0 .katex .sizing.reset-size2.size11{font-size:4.14666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size1,.gradio-container-3-37-0 .katex .sizing.reset-size3.size1{font-size:.71428571em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size2,.gradio-container-3-37-0 .katex .sizing.reset-size3.size2{font-size:.85714286em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size3,.gradio-container-3-37-0 .katex .sizing.reset-size3.size3{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size4,.gradio-container-3-37-0 .katex .sizing.reset-size3.size4{font-size:1.14285714em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size5,.gradio-container-3-37-0 .katex .sizing.reset-size3.size5{font-size:1.28571429em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size6,.gradio-container-3-37-0 .katex .sizing.reset-size3.size6{font-size:1.42857143em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size7,.gradio-container-3-37-0 .katex .sizing.reset-size3.size7{font-size:1.71428571em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size8,.gradio-container-3-37-0 .katex .sizing.reset-size3.size8{font-size:2.05714286em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size9,.gradio-container-3-37-0 .katex .sizing.reset-size3.size9{font-size:2.46857143em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size3.size10,.gradio-container-3-37-0 .katex .sizing.reset-size3.size10{font-size:2.96285714em}.gradio-container-3-37-0 .katex 
.fontsize-ensurer.reset-size3.size11,.gradio-container-3-37-0 .katex .sizing.reset-size3.size11{font-size:3.55428571em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size1,.gradio-container-3-37-0 .katex .sizing.reset-size4.size1{font-size:.625em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size2,.gradio-container-3-37-0 .katex .sizing.reset-size4.size2{font-size:.75em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size3,.gradio-container-3-37-0 .katex .sizing.reset-size4.size3{font-size:.875em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size4,.gradio-container-3-37-0 .katex .sizing.reset-size4.size4{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size5,.gradio-container-3-37-0 .katex .sizing.reset-size4.size5{font-size:1.125em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size6,.gradio-container-3-37-0 .katex .sizing.reset-size4.size6{font-size:1.25em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size7,.gradio-container-3-37-0 .katex .sizing.reset-size4.size7{font-size:1.5em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size8,.gradio-container-3-37-0 .katex .sizing.reset-size4.size8{font-size:1.8em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size9,.gradio-container-3-37-0 .katex .sizing.reset-size4.size9{font-size:2.16em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size10,.gradio-container-3-37-0 .katex .sizing.reset-size4.size10{font-size:2.5925em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size4.size11,.gradio-container-3-37-0 .katex .sizing.reset-size4.size11{font-size:3.11em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size1,.gradio-container-3-37-0 .katex .sizing.reset-size5.size1{font-size:.55555556em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size2,.gradio-container-3-37-0 .katex 
.sizing.reset-size5.size2{font-size:.66666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size3,.gradio-container-3-37-0 .katex .sizing.reset-size5.size3{font-size:.77777778em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size4,.gradio-container-3-37-0 .katex .sizing.reset-size5.size4{font-size:.88888889em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size5,.gradio-container-3-37-0 .katex .sizing.reset-size5.size5{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size6,.gradio-container-3-37-0 .katex .sizing.reset-size5.size6{font-size:1.11111111em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size7,.gradio-container-3-37-0 .katex .sizing.reset-size5.size7{font-size:1.33333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size8,.gradio-container-3-37-0 .katex .sizing.reset-size5.size8{font-size:1.6em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size9,.gradio-container-3-37-0 .katex .sizing.reset-size5.size9{font-size:1.92em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size10,.gradio-container-3-37-0 .katex .sizing.reset-size5.size10{font-size:2.30444444em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size5.size11,.gradio-container-3-37-0 .katex .sizing.reset-size5.size11{font-size:2.76444444em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size1,.gradio-container-3-37-0 .katex .sizing.reset-size6.size1{font-size:.5em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size2,.gradio-container-3-37-0 .katex .sizing.reset-size6.size2{font-size:.6em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size3,.gradio-container-3-37-0 .katex .sizing.reset-size6.size3{font-size:.7em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size4,.gradio-container-3-37-0 .katex .sizing.reset-size6.size4{font-size:.8em}.gradio-container-3-37-0 .katex 
.fontsize-ensurer.reset-size6.size5,.gradio-container-3-37-0 .katex .sizing.reset-size6.size5{font-size:.9em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size6,.gradio-container-3-37-0 .katex .sizing.reset-size6.size6{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size7,.gradio-container-3-37-0 .katex .sizing.reset-size6.size7{font-size:1.2em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size8,.gradio-container-3-37-0 .katex .sizing.reset-size6.size8{font-size:1.44em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size9,.gradio-container-3-37-0 .katex .sizing.reset-size6.size9{font-size:1.728em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size10,.gradio-container-3-37-0 .katex .sizing.reset-size6.size10{font-size:2.074em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size6.size11,.gradio-container-3-37-0 .katex .sizing.reset-size6.size11{font-size:2.488em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size1,.gradio-container-3-37-0 .katex .sizing.reset-size7.size1{font-size:.41666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size2,.gradio-container-3-37-0 .katex .sizing.reset-size7.size2{font-size:.5em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size3,.gradio-container-3-37-0 .katex .sizing.reset-size7.size3{font-size:.58333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size4,.gradio-container-3-37-0 .katex .sizing.reset-size7.size4{font-size:.66666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size5,.gradio-container-3-37-0 .katex .sizing.reset-size7.size5{font-size:.75em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size6,.gradio-container-3-37-0 .katex .sizing.reset-size7.size6{font-size:.83333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size7,.gradio-container-3-37-0 .katex 
.sizing.reset-size7.size7{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size8,.gradio-container-3-37-0 .katex .sizing.reset-size7.size8{font-size:1.2em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size9,.gradio-container-3-37-0 .katex .sizing.reset-size7.size9{font-size:1.44em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size10,.gradio-container-3-37-0 .katex .sizing.reset-size7.size10{font-size:1.72833333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size7.size11,.gradio-container-3-37-0 .katex .sizing.reset-size7.size11{font-size:2.07333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size1,.gradio-container-3-37-0 .katex .sizing.reset-size8.size1{font-size:.34722222em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size2,.gradio-container-3-37-0 .katex .sizing.reset-size8.size2{font-size:.41666667em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size3,.gradio-container-3-37-0 .katex .sizing.reset-size8.size3{font-size:.48611111em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size4,.gradio-container-3-37-0 .katex .sizing.reset-size8.size4{font-size:.55555556em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size5,.gradio-container-3-37-0 .katex .sizing.reset-size8.size5{font-size:.625em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size6,.gradio-container-3-37-0 .katex .sizing.reset-size8.size6{font-size:.69444444em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size7,.gradio-container-3-37-0 .katex .sizing.reset-size8.size7{font-size:.83333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size8,.gradio-container-3-37-0 .katex .sizing.reset-size8.size8{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size9,.gradio-container-3-37-0 .katex .sizing.reset-size8.size9{font-size:1.2em}.gradio-container-3-37-0 .katex 
.fontsize-ensurer.reset-size8.size10,.gradio-container-3-37-0 .katex .sizing.reset-size8.size10{font-size:1.44027778em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size8.size11,.gradio-container-3-37-0 .katex .sizing.reset-size8.size11{font-size:1.72777778em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size1,.gradio-container-3-37-0 .katex .sizing.reset-size9.size1{font-size:.28935185em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size2,.gradio-container-3-37-0 .katex .sizing.reset-size9.size2{font-size:.34722222em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size3,.gradio-container-3-37-0 .katex .sizing.reset-size9.size3{font-size:.40509259em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size4,.gradio-container-3-37-0 .katex .sizing.reset-size9.size4{font-size:.46296296em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size5,.gradio-container-3-37-0 .katex .sizing.reset-size9.size5{font-size:.52083333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size6,.gradio-container-3-37-0 .katex .sizing.reset-size9.size6{font-size:.5787037em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size7,.gradio-container-3-37-0 .katex .sizing.reset-size9.size7{font-size:.69444444em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size8,.gradio-container-3-37-0 .katex .sizing.reset-size9.size8{font-size:.83333333em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size9,.gradio-container-3-37-0 .katex .sizing.reset-size9.size9{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size10,.gradio-container-3-37-0 .katex .sizing.reset-size9.size10{font-size:1.20023148em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size9.size11,.gradio-container-3-37-0 .katex .sizing.reset-size9.size11{font-size:1.43981481em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size1,.gradio-container-3-37-0 .katex 
.sizing.reset-size10.size1{font-size:.24108004em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size2,.gradio-container-3-37-0 .katex .sizing.reset-size10.size2{font-size:.28929605em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size3,.gradio-container-3-37-0 .katex .sizing.reset-size10.size3{font-size:.33751205em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size4,.gradio-container-3-37-0 .katex .sizing.reset-size10.size4{font-size:.38572806em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size5,.gradio-container-3-37-0 .katex .sizing.reset-size10.size5{font-size:.43394407em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size6,.gradio-container-3-37-0 .katex .sizing.reset-size10.size6{font-size:.48216008em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size7,.gradio-container-3-37-0 .katex .sizing.reset-size10.size7{font-size:.57859209em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size8,.gradio-container-3-37-0 .katex .sizing.reset-size10.size8{font-size:.69431051em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size9,.gradio-container-3-37-0 .katex .sizing.reset-size10.size9{font-size:.83317261em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size10,.gradio-container-3-37-0 .katex .sizing.reset-size10.size10{font-size:1em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size10.size11,.gradio-container-3-37-0 .katex .sizing.reset-size10.size11{font-size:1.19961427em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size1,.gradio-container-3-37-0 .katex .sizing.reset-size11.size1{font-size:.20096463em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size2,.gradio-container-3-37-0 .katex .sizing.reset-size11.size2{font-size:.24115756em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size3,.gradio-container-3-37-0 .katex 
.sizing.reset-size11.size3{font-size:.28135048em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size4,.gradio-container-3-37-0 .katex .sizing.reset-size11.size4{font-size:.32154341em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size5,.gradio-container-3-37-0 .katex .sizing.reset-size11.size5{font-size:.36173633em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size6,.gradio-container-3-37-0 .katex .sizing.reset-size11.size6{font-size:.40192926em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size7,.gradio-container-3-37-0 .katex .sizing.reset-size11.size7{font-size:.48231511em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size8,.gradio-container-3-37-0 .katex .sizing.reset-size11.size8{font-size:.57877814em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size9,.gradio-container-3-37-0 .katex .sizing.reset-size11.size9{font-size:.69453376em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size10,.gradio-container-3-37-0 .katex .sizing.reset-size11.size10{font-size:.83360129em}.gradio-container-3-37-0 .katex .fontsize-ensurer.reset-size11.size11,.gradio-container-3-37-0 .katex .sizing.reset-size11.size11{font-size:1em}.gradio-container-3-37-0 .katex .delimsizing.size1{font-family:KaTeX_Size1}.gradio-container-3-37-0 .katex .delimsizing.size2{font-family:KaTeX_Size2}.gradio-container-3-37-0 .katex .delimsizing.size3{font-family:KaTeX_Size3}.gradio-container-3-37-0 .katex .delimsizing.size4{font-family:KaTeX_Size4}.gradio-container-3-37-0 .katex .delimsizing.mult .delim-size1>span{font-family:KaTeX_Size1}.gradio-container-3-37-0 .katex .delimsizing.mult .delim-size4>span{font-family:KaTeX_Size4}.gradio-container-3-37-0 .katex .nulldelimiter{display:inline-block;width:.12em}.gradio-container-3-37-0 .katex .delimcenter,.gradio-container-3-37-0 .katex .op-symbol{position:relative}.gradio-container-3-37-0 .katex 
.op-symbol.small-op{font-family:KaTeX_Size1}.gradio-container-3-37-0 .katex .op-symbol.large-op{font-family:KaTeX_Size2}.gradio-container-3-37-0 .katex .accent>.vlist-t,.gradio-container-3-37-0 .katex .op-limits>.vlist-t{text-align:center}.gradio-container-3-37-0 .katex .accent .accent-body{position:relative}.gradio-container-3-37-0 .katex .accent .accent-body:not(.accent-full){width:0}.gradio-container-3-37-0 .katex .overlay{display:block}.gradio-container-3-37-0 .katex .mtable .vertical-separator{display:inline-block;min-width:1px}.gradio-container-3-37-0 .katex .mtable .arraycolsep{display:inline-block}.gradio-container-3-37-0 .katex .mtable .col-align-c>.vlist-t{text-align:center}.gradio-container-3-37-0 .katex .mtable .col-align-l>.vlist-t{text-align:left}.gradio-container-3-37-0 .katex .mtable .col-align-r>.vlist-t{text-align:right}.gradio-container-3-37-0 .katex .svg-align{text-align:left}.gradio-container-3-37-0 .katex svg{fill:currentColor;stroke:currentColor;fill-rule:nonzero;fill-opacity:1;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;display:block;height:inherit;position:absolute;width:100%}.gradio-container-3-37-0 .katex svg path{stroke:none}.gradio-container-3-37-0 .katex img{border-style:none;max-height:none;max-width:none;min-height:0;min-width:0}.gradio-container-3-37-0 .katex .stretchy{display:block;overflow:hidden;position:relative;width:100%}.gradio-container-3-37-0 .katex .stretchy:after,.gradio-container-3-37-0 .katex .stretchy:before{content:""}.gradio-container-3-37-0 .katex .hide-tail{overflow:hidden;position:relative;width:100%}.gradio-container-3-37-0 .katex .halfarrow-left{left:0;overflow:hidden;position:absolute;width:50.2%}.gradio-container-3-37-0 .katex .halfarrow-right{overflow:hidden;position:absolute;right:0;width:50.2%}.gradio-container-3-37-0 .katex .brace-left{left:0;overflow:hidden;position:absolute;width:25.1%}.gradio-container-3-37-0 
.katex .brace-center{left:25%;overflow:hidden;position:absolute;width:50%}.gradio-container-3-37-0 .katex .brace-right{overflow:hidden;position:absolute;right:0;width:25.1%}.gradio-container-3-37-0 .katex .x-arrow-pad{padding:0 .5em}.gradio-container-3-37-0 .katex .cd-arrow-pad{padding:0 .55556em 0 .27778em}.gradio-container-3-37-0 .katex .mover,.gradio-container-3-37-0 .katex .munder,.gradio-container-3-37-0 .katex .x-arrow{text-align:center}.gradio-container-3-37-0 .katex .boxpad{padding:0 .3em}.gradio-container-3-37-0 .katex .fbox,.gradio-container-3-37-0 .katex .fcolorbox{border:.04em solid;box-sizing:border-box}.gradio-container-3-37-0 .katex .cancel-pad{padding:0 .2em}.gradio-container-3-37-0 .katex .cancel-lap{margin-left:-.2em;margin-right:-.2em}.gradio-container-3-37-0 .katex .sout{border-bottom-style:solid;border-bottom-width:.08em}.gradio-container-3-37-0 .katex .angl{border-right:.049em solid;border-top:.049em solid;box-sizing:border-box;margin-right:.03889em}.gradio-container-3-37-0 .katex .anglpad{padding:0 .03889em}.gradio-container-3-37-0 .katex .eqn-num:before{content:"(" counter(katexEqnNo) ")";counter-increment:katexEqnNo}.gradio-container-3-37-0 .katex .mml-eqn-num:before{content:"(" counter(mmlEqnNo) ")";counter-increment:mmlEqnNo}.gradio-container-3-37-0 .katex .mtr-glue{width:50%}.gradio-container-3-37-0 .katex .cd-vert-arrow{display:inline-block;position:relative}.gradio-container-3-37-0 .katex .cd-label-left{display:inline-block;position:absolute;right:calc(50% + .3em);text-align:left}.gradio-container-3-37-0 .katex .cd-label-right{display:inline-block;left:calc(50% + .3em);position:absolute;text-align:right}.gradio-container-3-37-0 .katex-display{display:block;margin:1em 0;text-align:center}.gradio-container-3-37-0 .katex-display>.katex{display:block;text-align:center;white-space:nowrap}.gradio-container-3-37-0 .katex-display>.katex>.katex-html{display:block;position:relative}.gradio-container-3-37-0 
.katex-display>.katex>.katex-html>.tag{position:absolute;right:0}.gradio-container-3-37-0 .katex-display.leqno>.katex>.katex-html>.tag{left:0;right:auto}.gradio-container-3-37-0 .katex-display.fleqn>.katex{padding-left:2em;text-align:left}.gradio-container-3-37-0 body{counter-reset:katexEqnNo mmlEqnNo}span.svelte-15hifvz code[class*=language-],span.svelte-15hifvz pre[class*=language-]{font-size:var(--text-md)}.wrap.svelte-1fzvtqo.svelte-1fzvtqo{padding:var(--block-padding);width:100%;overflow-y:auto}.message-wrap.svelte-1fzvtqo.svelte-1fzvtqo{display:flex;flex-direction:column;gap:var(--spacing-xxl)}.message-wrap.svelte-1fzvtqo>div.svelte-1fzvtqo img{border-radius:13px;max-width:30vw}.message-wrap.svelte-1fzvtqo>div.svelte-1fzvtqo p:not(:first-child){margin-top:var(--spacing-xxl)}.message-wrap.svelte-1fzvtqo audio{width:100%}.message.svelte-1fzvtqo.svelte-1fzvtqo{position:relative;align-self:flex-start;border-width:1px;border-radius:var(--radius-xxl);background:var(--background-fill-secondary);padding:var(--spacing-xxl);width:calc(100% - var(--spacing-xxl));color:var(--body-text-color);font-size:var(--text-lg);line-height:var(--line-lg);overflow-wrap:break-word}.user.svelte-1fzvtqo.svelte-1fzvtqo{align-self:flex-end;border-bottom-right-radius:0}.bot.svelte-1fzvtqo.svelte-1fzvtqo{border-bottom-left-radius:0;padding-left:calc(2 * var(--spacing-xxl))}@media (max-width: 480px){.message.svelte-1fzvtqo.svelte-1fzvtqo{width:auto}.bot.svelte-1fzvtqo.svelte-1fzvtqo{padding-left:var(--spacing-xxl)}}.bot.svelte-1fzvtqo.svelte-1fzvtqo,.pending.svelte-1fzvtqo.svelte-1fzvtqo{border-color:var(--border-color-primary);background:var(--background-fill-secondary)}.user.svelte-1fzvtqo.svelte-1fzvtqo{border-color:var(--border-color-accent);background-color:var(--color-accent-soft)}.feedback.svelte-1fzvtqo.svelte-1fzvtqo{display:flex;position:absolute;top:var(--spacing-xl);right:calc(var(--spacing-xxl) + 
var(--spacing-xl));gap:var(--spacing-lg);font-size:var(--text-sm)}.feedback.svelte-1fzvtqo button.svelte-1fzvtqo{color:var(--body-text-color-subdued)}.feedback.svelte-1fzvtqo button.svelte-1fzvtqo:hover{color:var(--body-text-color)}.selectable.svelte-1fzvtqo.svelte-1fzvtqo{cursor:pointer}.pending.svelte-1fzvtqo.svelte-1fzvtqo{display:flex;justify-content:center;align-items:center;align-self:center;gap:2px}.dot-flashing.svelte-1fzvtqo.svelte-1fzvtqo{animation:svelte-1fzvtqo-dot-flashing 1s infinite linear alternate;border-radius:5px;background-color:var(--body-text-color);width:5px;height:5px;color:var(--body-text-color)}.dot-flashing.svelte-1fzvtqo.svelte-1fzvtqo:nth-child(2){animation-delay:.33s}.dot-flashing.svelte-1fzvtqo.svelte-1fzvtqo:nth-child(3){animation-delay:.66s}@media (max-width: 480px){.user.svelte-1fzvtqo.svelte-1fzvtqo{align-self:flex-end}.bot.svelte-1fzvtqo.svelte-1fzvtqo{align-self:flex-start;padding-left:var(--size-3)}}@keyframes svelte-1fzvtqo-dot-flashing{0%{opacity:.8}50%{opacity:.5}to{opacity:.8}}.message-wrap.svelte-1fzvtqo .message.svelte-1fzvtqo img{margin:var(--size-2);max-height:200px}.message-wrap.svelte-1fzvtqo .message.svelte-1fzvtqo a{color:var(--color-text-link);text-decoration:underline}.hide.svelte-1fzvtqo.svelte-1fzvtqo{display:none}.message-wrap.svelte-1fzvtqo pre[class*=language-],.message-wrap.svelte-1fzvtqo pre{margin-top:var(--spacing-sm);margin-bottom:var(--spacing-sm);box-shadow:none;border:none;border-radius:var(--radius-md);background-color:var(--chatbot-code-background-color);padding:var(--spacing-xl) 10px;direction:ltr}.message-wrap.svelte-1fzvtqo table,.message-wrap.svelte-1fzvtqo tr,.message-wrap.svelte-1fzvtqo td,.message-wrap.svelte-1fzvtqo th{margin-top:var(--spacing-sm);margin-bottom:var(--spacing-sm);padding:var(--spacing-xl)}.message-wrap.svelte-1fzvtqo .bot.svelte-1fzvtqo table,.message-wrap.svelte-1fzvtqo .bot.svelte-1fzvtqo tr,.message-wrap.svelte-1fzvtqo .bot.svelte-1fzvtqo td,.message-wrap.svelte-1fzvtqo 
.bot.svelte-1fzvtqo th{border:1px solid var(--border-color-primary)}.message-wrap.svelte-1fzvtqo .user.svelte-1fzvtqo table,.message-wrap.svelte-1fzvtqo .user.svelte-1fzvtqo tr,.message-wrap.svelte-1fzvtqo .user.svelte-1fzvtqo td,.message-wrap.svelte-1fzvtqo .user.svelte-1fzvtqo th{border:1px solid var(--border-color-accent)}.message-wrap.svelte-1fzvtqo ol,.message-wrap.svelte-1fzvtqo ul{padding-inline-start:2em}.message-wrap.svelte-1fzvtqo span.katex{font-size:var(--text-lg);direction:ltr}.message-wrap.svelte-1fzvtqo code>button{position:absolute;top:var(--spacing-md);right:var(--spacing-md);z-index:1;cursor:pointer;border-bottom-left-radius:var(--radius-sm);padding:5px;padding:var(--spacing-md);width:22px;height:22px}.message-wrap.svelte-1fzvtqo code>button>span{position:absolute;top:var(--spacing-md);right:var(--spacing-md);width:12px;height:12px}.message-wrap.svelte-1fzvtqo .check{position:absolute;top:0;right:0;opacity:0;z-index:var(--layer-top);transition:opacity .2s;background:var(--background-fill-primary);padding:var(--size-1);width:100%;height:100%;color:var(--body-text-color)}.message-wrap.svelte-1fzvtqo pre{position:relative}.icon-button.svelte-1fzvtqo.svelte-1fzvtqo{position:absolute;top:6px;right:6px}.wrapper.svelte-nab2ao{display:flex;position:relative;flex-direction:column;align-items:start;width:100%;height:100%}
 
 
spaces/DQChoi/gpt-demo/venv/lib/python3.11/site-packages/h11/tests/test_util.py DELETED
@@ -1,112 +0,0 @@
1
- import re
2
- import sys
3
- import traceback
4
- from typing import NoReturn
5
-
6
- import pytest
7
-
8
- from .._util import (
9
- bytesify,
10
- LocalProtocolError,
11
- ProtocolError,
12
- RemoteProtocolError,
13
- Sentinel,
14
- validate,
15
- )
16
-
17
-
18
- def test_ProtocolError() -> None:
19
- with pytest.raises(TypeError):
20
- ProtocolError("abstract base class")
21
-
22
-
23
- def test_LocalProtocolError() -> None:
24
- try:
25
- raise LocalProtocolError("foo")
26
- except LocalProtocolError as e:
27
- assert str(e) == "foo"
28
- assert e.error_status_hint == 400
29
-
30
- try:
31
- raise LocalProtocolError("foo", error_status_hint=418)
32
- except LocalProtocolError as e:
33
- assert str(e) == "foo"
34
- assert e.error_status_hint == 418
35
-
36
- def thunk() -> NoReturn:
37
- raise LocalProtocolError("a", error_status_hint=420)
38
-
39
- try:
40
- try:
41
- thunk()
42
- except LocalProtocolError as exc1:
43
- orig_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
44
- exc1._reraise_as_remote_protocol_error()
45
-     except RemoteProtocolError as exc2:
-         assert type(exc2) is RemoteProtocolError
-         assert exc2.args == ("a",)
-         assert exc2.error_status_hint == 420
-         new_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
-         assert new_traceback.endswith(orig_traceback)
-
-
- def test_validate() -> None:
-     my_re = re.compile(rb"(?P<group1>[0-9]+)\.(?P<group2>[0-9]+)")
-     with pytest.raises(LocalProtocolError):
-         validate(my_re, b"0.")
-
-     groups = validate(my_re, b"0.1")
-     assert groups == {"group1": b"0", "group2": b"1"}
-
-     # successful partial matches are an error - must match whole string
-     with pytest.raises(LocalProtocolError):
-         validate(my_re, b"0.1xx")
-     with pytest.raises(LocalProtocolError):
-         validate(my_re, b"0.1\n")
-
-
- def test_validate_formatting() -> None:
-     my_re = re.compile(rb"foo")
-
-     with pytest.raises(LocalProtocolError) as excinfo:
-         validate(my_re, b"", "oops")
-     assert "oops" in str(excinfo.value)
-
-     with pytest.raises(LocalProtocolError) as excinfo:
-         validate(my_re, b"", "oops {}")
-     assert "oops {}" in str(excinfo.value)
-
-     with pytest.raises(LocalProtocolError) as excinfo:
-         validate(my_re, b"", "oops {} xx", 10)
-     assert "oops 10 xx" in str(excinfo.value)
-
-
- def test_make_sentinel() -> None:
-     class S(Sentinel, metaclass=Sentinel):
-         pass
-
-     assert repr(S) == "S"
-     assert S == S
-     assert type(S).__name__ == "S"
-     assert S in {S}
-     assert type(S) is S
-
-     class S2(Sentinel, metaclass=Sentinel):
-         pass
-
-     assert repr(S2) == "S2"
-     assert S != S2
-     assert S not in {S2}
-     assert type(S) is not type(S2)
-
-
- def test_bytesify() -> None:
-     assert bytesify(b"123") == b"123"
-     assert bytesify(bytearray(b"123")) == b"123"
-     assert bytesify("123") == b"123"
-
-     with pytest.raises(UnicodeEncodeError):
-         bytesify("\u1234")
-
-     with pytest.raises(TypeError):
-         bytesify(10)
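Taken together, these assertions pin down the contract of the `validate` helper they exercise: it must match the *whole* input (a successful partial match still raises), return the named groups as a dict of bytes, and lazily format an optional error message with positional arguments. A minimal sketch consistent with the tests (the real h11 implementation may differ in details):

```python
import re


class LocalProtocolError(Exception):
    """Sketch stand-in for h11's error type."""


def validate(regex, data, msg="malformed data", *format_args):
    # fullmatch, not match: a successful *partial* match is still an error
    match = regex.fullmatch(data)
    if not match:
        if format_args:
            # only pay the cost of formatting when validation actually fails
            msg = msg.format(*format_args)
        raise LocalProtocolError(msg)
    # named groups come back as a dict with bytes values
    return match.groupdict()
```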
 
spaces/DanielPinsk/StableDiffusion/README.md DELETED
@@ -1,13 +0,0 @@
- ---
- title: StableDiffusion
- emoji: 💩
- colorFrom: red
- colorTo: gray
- sdk: gradio
- sdk_version: 3.3
- app_file: app.py
- pinned: false
- license: wtfpl
- ---
-
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
 
spaces/DaniilMIPT/greenatomtest/app.py DELETED
@@ -1,111 +0,0 @@
- import streamlit as st
- import re
- from pymorphy2 import MorphAnalyzer
- from functools import lru_cache
- from nltk.corpus import stopwords
- from tqdm import tqdm
- import pandas as pd
- import numpy as np
- from catboost import Pool
- import nltk
- from catboost import CatBoostClassifier
- from catboost import CatBoostRegressor
- model = CatBoostClassifier()
- model.load_model('classifire_model_MVP.cbm')
- model_reg = CatBoostRegressor()
- model_reg.load_model('regressor_model_MVP.cbm')
- nltk.download('stopwords')
- st.markdown(
-     """
-     <style>
-     [data-testid="stAppViewContainer"] {
-         background-color: #e5e5f7;
-         opacity: 1;
-         background-image: radial-gradient(circle at center center, #5f8c4a, #e5e5f7), repeating-radial-gradient(circle at center center, #5f8c4a, #5f8c4a, 14px, transparent 28px, transparent 14px);
-         background-blend-mode: multiply;
-     }
-     body {
-         color: white;
-     }
-     </style>
-     """,
-     unsafe_allow_html=True
- )
- st.markdown("<h1 style='text-align: center; color: white;'>Green Atom</h1>", unsafe_allow_html=True)
- st.markdown("<h2 style='text-align: center; color: white;'>Check the status of your comment</h2>", unsafe_allow_html=True)
-
- start_text = "The Boys is one of the best Superhero shows I've ever seen. While Season 1 was the best season of the series, Season 2 and 3 were also both very good and absolutely worth watching. Season 3 was fantastic, Jensen Ackles was the perfect actor to add to this already incredible show! This show continues to amaze as it's not afraid to try new things and is a show that is definitely for adults. It has no problems being offensive and making you feel squeamish. You don't even have to be a fan of superhero shows to enjoy this. It's violent, funny, thrilling, etc.. Everything you want in a good superhero show. Season 4 just added Jeffrey Dean Morgan to the cast. Another great addition to an already incredible cast. I don't know what else I can say except I absolutely love this series and can't wait for more to come!"
- #st.markdown("<h6 left; color: white;> Enter your review text and click Ctrl+Enter</h6>", unsafe_allow_html=True)
- text = st.text_area("Enter your review text and press Ctrl+Enter", start_text)
- #text = st.text_area(':white[Enter your review text and click Ctrl+Enter]', start_text)
- print(text)
-
- data = pd.DataFrame({'text': [text]})
- m = MorphAnalyzer()
- # "A-Za-z", not "A-z": the latter also matches [ \ ] ^ _ and the backtick
- regex = re.compile("[А-Яа-яA-Za-z]+")
- mystopwords = stopwords.words('english')
-
-
- def words_only(text, regex=regex):
-     try:
-         return regex.findall(text.lower())
-     except AttributeError:  # non-string input
-         return []
-
-
- @lru_cache(maxsize=128)
- def lemmatize_word(token, pymorphy=m):
-     return pymorphy.parse(token)[0].normal_form
-
-
- def lemmatize_text(text):
-     return [lemmatize_word(w) for w in text]
-
-
- def remove_stopwords(lemmas, stopwords=mystopwords):
-     return [w for w in lemmas if w not in stopwords and len(w) > 3]
-
-
- def clean_text(text):
-     tokens = words_only(text)
-     lemmas = lemmatize_text(tokens)
-
-     return ' '.join(remove_stopwords(lemmas))
-
-
- from multiprocessing import Pool as MPPool
-
- with MPPool(4) as p:
-     lemmas = list(tqdm(p.imap(clean_text, data['text']), total=len(data)))
-
- data['text_lemmas'] = lemmas
-
- data['sym_len'] = data.text_lemmas.apply(len)
- data['word_len'] = data.text_lemmas.apply(lambda x: len(x.split()))
- data['sym_len'] = np.log(data['sym_len'])
- data['word_len'] = np.log(data['word_len'])
-
- test_pool = Pool(
-     data,
-     text_features=['text', 'text_lemmas'],
- )
-
- y_pred = model.predict(test_pool)
- y_pred_reg = model_reg.predict(test_pool)
- # clamp the regressor's rounded output to the valid 1..10 rating range
- arr = np.round(y_pred_reg, 0).astype(int)
- arr_1 = []
- for i in range(len(arr)):
-     if arr[i] <= 0:
-         arr_1.append(1)
-     elif arr[i] > 10:
-         arr_1.append(10)
-     else:
-         arr_1.append(arr[i])
-
- ans = 'positive' if y_pred[0] == 1 else 'negative'
- st.write('Estimated review sentiment:', ans)
- st.write('Estimated review rating:', arr_1[0])
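A note on the tokenizer's character class: the range `A-z`, which this file's regex originally used alongside the Cyrillic ranges, is a classic pitfall. In ASCII, `A-z` spans codepoints 65..122 and therefore also matches the punctuation characters `[ \ ] ^ _` and the backtick that sit between `Z` and `a`. A quick check:

```python
import re

buggy = re.compile(r"[A-z]+")     # accidental range: also matches [ \ ] ^ _ `
fixed = re.compile(r"[A-Za-z]+")  # two explicit ranges: letters only

# The underscore (codepoint 95) sits between 'Z' (90) and 'a' (97),
# so the buggy class glues words together across it:
print(buggy.findall("foo_bar"))  # ['foo_bar']
print(fixed.findall("foo_bar"))  # ['foo', 'bar']
```

For a review-classification pipeline this matters: with `A-z`, tokens like `foo_bar` survive as single "words" and leak non-letter characters into the lemmatizer.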
 
spaces/Dao3/MBTI_Test/app.py DELETED
@@ -1,47 +0,0 @@
- import gradio as gr
- import os
- import openai
-
- # Remember to put the API key under Repository Secrets in Settings.
- # There is currently an odd issue: if a duplicated Space's key has the same name as the original, the build fails. It is unclear whether this is because of today's ongoing migration.
- # As a workaround, give the key a different name, and remember to change the key name in the line of code below.
- openai.api_key = os.getenv("key4")
-
-
- # If you only plan to customize the bot's behavior through the prompt, editing this prompt is all you need.
- prompt = 'You are now an MBTI test assistant, and I am starting an MBTI test. You will determine my MBTI personality type through 5 questions, each with four options (a, b, c, d). The 5 questions must cover different aspects, and the options of each question must differ from those of the other questions. After I reply with my answers to these five questions, give me the personality-type analysis, describe the traits of that type in detail, and tell me which personality types suit this type as friends and which suit it as romantic partners. Note: send me all 5 questions and their four options at once, and only reply with the result and description after I reply with my answers. Note: a different friend of mine may take the test each time, so you must answer each time with freshly designed questions and analyze each set of answers independently, without referring to answers from past conversations. Note: use Chinese throughout. Start asking the questions right after my first message; then I answer, and you reply with my personality type. If you understand, let us begin now.'
-
- # Modify this function to implement your own chatbot.
- # p: the text said to the bot
- # qid: unique ID of the current message, e.g. `'bxqid-cManAtRMszw...'`. Generated by the platform and passed to the bot so it can distinguish individual questions (logging, debug tracing, async callbacks, etc.). Can be ignored for synchronous calls.
- # uid: unique ID of the user, e.g. `'bxuid-Aj8Spso8Xsp...'`. Generated by the platform and passed to the bot so it can distinguish users. Can be used to implement multi-turn conversation.
- # Return value: [type, content]
- # See https://huggingface.co/spaces/baixing/hackathon_test/blob/main/bot-api.md for details
- def chat(p, qid, uid):
-     return ["text", callapi(p)]
-
- def callapi(p):
-     response = openai.ChatCompletion.create(
-         model="gpt-3.5-turbo",
-         messages=[{"role": "system", "content": prompt},
-                   {"role": "user", "content": p}],
-     )
-     print(response)
-     response = response["choices"][0]["message"]["content"]
-     # strip any leading blank lines from the completion
-     response = response.lstrip("\n")
-     return response
-
-
- iface = gr.Interface(fn=chat,
-                      inputs=["text", "text", "text"],
-                      outputs=["text", "text"],
-                      description="""I am a personality-test assistant; you can take an MBTI test in the multi-turn chat at [瀛海威广场](https://huggingface.co/spaces/baixing/hackathon_test). The API to fill in is https://dao3-mbti-test.hf.space/run/predict
- """)
-
- iface.launch()
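The description above points callers at the Space's `/run/predict` endpoint. For a plain Gradio `Interface` of this era (sdk 3.x), that endpoint conventionally accepts a JSON body of the form `{"data": [inputs...]}` with one entry per declared input component, and returns `{"data": [outputs...]}` in the order of the declared outputs. A hypothetical client sketch under that assumption (the `qid`/`uid` placeholder values are made up; only the payload shape is exercised here):

```python
import json
import urllib.request


def build_payload(p, qid, uid):
    # One entry per declared input component, in declaration order: p, qid, uid.
    return {"data": [p, qid, uid]}


def call_bot(url, p, qid="bxqid-demo", uid="bxuid-demo"):
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(p, qid, uid)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # chat() returns ["text", content], so the two outputs land in body["data"].
    msg_type, content = body["data"]
    return msg_type, content
```

Since the `chat` function above ignores `qid` and `uid`, any placeholder values work for a one-off call; a multi-turn client would keep `uid` stable across requests.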