CF25
-
- Demo for Counterfeit V2.5 Stable Diffusion model.
- {f"Add the following tokens to your prompts for the model to work properly: {prefix}" if prefix else ""}
-
-
diff --git a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/HD Online Player (Vinnaithaandi Varuvaaya Bluray 1080p) - Watch the Romantic Tamil Movie in High Quality.md b/spaces/1acneusushi/gradio-2dmoleculeeditor/data/HD Online Player (Vinnaithaandi Varuvaaya Bluray 1080p) - Watch the Romantic Tamil Movie in High Quality.md
deleted file mode 100644
index 190371eddbae6f1b5286df0260276358b429c6dc..0000000000000000000000000000000000000000
--- a/spaces/1acneusushi/gradio-2dmoleculeeditor/data/HD Online Player (Vinnaithaandi Varuvaaya Bluray 1080p) - Watch the Romantic Tamil Movie in High Quality.md
+++ /dev/null
@@ -1,80 +0,0 @@
-
-
If you are a fan of Tamil romantic movies, you might have heard of Vinnaithaandi Varuvaaya, a 2010 film written and directed by Gautham Vasudev Menon, starring Silambarasan and Trisha. The movie was a critical and commercial success, winning several awards and accolades for its music, cinematography, and story. It is considered one of the best romantic movies of Tamil cinema, with a cult following among the youth.
-Download File ✵ https://byltly.com/2uKvxM
But have you ever wondered what it would be like to watch this movie in HD online? Well, you are in luck, because many platforms let you stream or download this movie in high-definition quality. You can enjoy the stunning visuals, the melodious songs, and the emotional scenes in full clarity and detail. In this article, we will tell you more about this movie, its plot, its review, and where you can watch it online in HD.
-Vinnaithaandi Varuvaaya (meaning Will You Cross The Skies For Me?) is a movie that explores the complicated relationship between a Hindu Tamil boy, Karthik Sivakumar, and a Malayali Christian girl, Jessie Thekekuttu. Karthik is an aspiring filmmaker who falls in love with Jessie at first sight. Jessie lives upstairs from Karthik's family, who rent the bottom floor of her house. Jessie is from a conservative family who disapprove of her talking to men outside her religion.
-Karthik tries to woo Jessie, who initially rejects him but later develops feelings for him. However, she is afraid of her father's wrath and her family's opposition. She also has doubts about Karthik's commitment and career prospects. Karthik convinces her to elope with him, but she backs out at the last moment. They break up and go their separate ways.
-Years later, Karthik becomes a successful filmmaker and makes a movie based on his love story with Jessie. He meets her again in New York, where she is married to someone else. They rekindle their friendship and realize that they still love each other. However, they also accept that they cannot be together and decide to part ways for good.
Vinnaithaandi Varuvaaya is a movie that touches your heart with its realistic portrayal of love and its challenges. It does not sugarcoat or glamorize the romance, but shows it as it is, with all its flaws and beauty. It also does not follow the typical formula of Tamil movies, where the hero wins over the heroine against all odds. Instead, it shows how sometimes love is not enough to overcome the barriers of society, religion, and fate.
-The movie's strength lies in its script, direction, music, and performances. Gautham Menon has crafted a story that is relatable and engaging, with dialogues that are witty and natural. He has also captured the essence of each location, from Chennai to Kerala to Malta to New York, with his brilliant cinematography. A.R.Rahman has composed some of his best songs for this movie, which are soulful and memorable. The songs blend well with the mood and theme of the movie.
-Silambarasan and Trisha have delivered some of their best performances in this movie. They portray their characters with nuance and depth, making us empathize with their emotions and dilemmas, and they share great chemistry on screen, making us believe in their love story. The supporting cast, especially VTV Ganesh as Karthik's friend, also do a commendable job.
-Vinnaithaandi Varuvaaya is a movie that will make you laugh, cry, and think. It is a movie that will stay with you long after you watch it. It is a movie that will make you appreciate the value of love and life. If you are looking for a romantic movie that is different from the usual fare, you should definitely watch Vinnaithaandi Varuvaaya online in HD.
-You can watch Vinnaithaandi Varuvaaya online in HD on various platforms such as Hotstar, Yidio, or IMDb. You can also download it from these sites or other sources.
-The songs in Vinnaithaandi Varuvaaya are composed by A.R. Rahman, who also sang some of them along with other singers such as Benny Dayal, Shreya Ghoshal, Karthik, Alka Yagnik, Devan Ekambaram, Chinmayi, Blaaze, Naresh Iyer, Clinton Cerejo, S.P. Balasubrahmanyam, Kalyani Menon, Rashid Ali, Sukhwinder Singh, Tanvi Shah, Vijay Prakash, Suzanne D'Mello, Darshana KT, Megha, Krish, Emcee Jesz, Solar Sai, Karky, Madhushree, Haricharan, Timmy Thomas, and Sagarika Mukherjee.
-No, Vinnaithaandi Varuvaaya is not based on a true story. It is a fictional story written by Gautham Menon. However, some aspects of the story may be inspired by his own experiences or observations.
-Some other movies similar to Vinnaithaandi Varuvaaya are Ye Maaya Chesave (the Telugu version of this movie), Ekk Deewana Tha (the Hindi remake of this movie), Alaipayuthey (another Tamil romantic movie by Mani Ratnam), Saathiya (the Hindi remake of Alaipayuthey), Jaane Tu... Ya Jaane Na (a Hindi romantic comedy), Pyaar Ka Punchnama (a Hindi romantic comedy-drama), 500 Days Of Summer (an English romantic comedy-drama), Before Sunrise (an English romantic drama).
-Vinnaithaandi Varuvaayaa means Will You Cross The Skies For Me? in Tamil. It is a poetic way of expressing one's love and desire for someone who seems unreachable or impossible to attain.
-Angry Birds 2 is one of the most popular puzzle games in the world. It is the sequel to the original Angry Birds game that started it all. In this game, you have to use a slingshot to launch birds at structures made of glass, wood and stone. Your goal is to defeat the green pigs who have stolen your eggs.
-Gems and black pearls are two of the most important resources in Angry Birds 2. Gems are used to buy extra cards, continue playing after losing lives, unlock chests and more. Black pearls are used to buy hats for your birds, which give them extra power and style. However, gems and black pearls are hard to come by in the game. You have to either earn them by playing or buy them with real money.
-Download Zip ✫ https://urlin.us/2uT1L4
A mod apk is a modified version of an application that gives you access to features that are not available in the original version. For example, a mod apk can give you unlimited coins, lives, gems or other resources in a game. A mod apk can also remove ads, unlock premium content or enhance graphics.
-In this article, we will show you how to download and install Angry Birds 2 Mod APK, which will give you unlimited gems and black pearls in the game. We will also tell you how to use the mod apk, what features it offers, and some tips and tricks for playing the game. Let's get started!
-The first step to getting unlimited gems and black pearls in Angry Birds 2 is to download and install the mod apk file. Here are the steps you need to follow:
-Congratulations! You have successfully installed Angry Birds 2 Mod APK on your device. Now you can enjoy unlimited gems and black pearls in the game.
-Now that you have installed Angry Birds 2 Mod APK, you may be wondering how to use it. Here are some tips on how to use the mod apk effectively:
-By following these tips, you can use Angry Birds 2 Mod APK without any problems and enjoy unlimited gems and black pearls in the game.
-Angry Birds 2 Mod APK is not just about giving you unlimited gems and black pearls in the game. It also offers many other features that make it better than the original game. Here are some of the features of Angry Birds 2 Mod APK:
| Feature | Benefit |
| --- | --- |
| Unlimited lives | You never have to wait for your lives to refill or buy them with gems. |
| Unlimited energy | You never have to wait for your energy to refill or buy it with gems. |
| All birds unlocked | You can use any bird in any level without having to unlock them with feathers. |
| All spells unlocked | You can use any spell in any level without having to unlock them with gems. |
| All hats unlocked | You can use any hat on any bird without having to buy them with black pearls. |
| No ads | You can play the game without any interruptions or distractions from ads. |
| Improved graphics | You can enjoy the game with better graphics and animations. |
| New birds, levels and events | You can access new content that is not available in the original game, such as new birds, levels and events. |
As you can see, Angry Birds 2 Mod APK has many features that make it more fun and enjoyable than the original game. You can experience the game in a whole new way with the mod apk.
Angry Birds 2 Mod APK is not only about having unlimited resources and features. It is also about using your skills and strategy to beat the pigs and score high. Here are some tips and tricks for playing Angry Birds 2 Mod APK:
-By following these tips and tricks, you can play Angry Birds 2 Mod APK like a pro and have more fun and satisfaction in the game.
-Angry Birds 2 Mod APK is a great way to enjoy Angry Birds 2 with unlimited gems and black pearls. It also offers many other features that make it better than the original game. You can download and install Angry Birds 2 Mod APK easily by following the steps we have provided. You can also use Angry Birds 2 Mod APK effectively by following the tips and tricks we have shared.
-If you are a fan of Angry Birds 2 or puzzle games in general, you should definitely try Angry Birds 2 Mod APK. It will give you a new and exciting experience of playing Angry Birds 2. You will not regret it!
-So what are you waiting for? Download Angry Birds 2 Mod APK now and enjoy unlimited gems and black pearls in the game!
-A1: Yes, as long as you download it from a trusted source and follow the installation instructions carefully.
-A2: No, it is not legal to use as it violates the terms of service of Rovio Entertainment. Use it at your own risk.
-A3: No, it will only work on devices that run on Android operating system. It will not work on iOS or Windows devices.
-A4: No, it will not affect your progress in the original game as it uses a separate data file. You can switch between the mod apk and the original game anytime you want.
-A5: Yes, you can update Angry Birds 2 Mod APK when a new version is released by downloading and installing the latest mod apk file from the same source.
The grade 3 teacher's book for theme 5, 2018 revision (buku guru kelas 3 tema 5 revisi 2018), is an e-book containing weather learning material for elementary school (SD) or madrasah ibtidaiyah (MI) students. It is part of the 2013 curriculum as revised in 2018, and it consists of four subthemes: weather conditions, weather changes, the effects of weather changes on human life, and weather, seasons, and climate.
-Why is it important to download the grade 3 theme 5 teacher's book, 2018 revision? There are several reasons, including:
-Download Zip ✫ https://urlin.us/2uSS2D
There are several ways to download the grade 3 theme 5 teacher's book, 2018 revision: via Google Drive, via the official Kemdikbud website, or via another site. The explanations and steps follow.
-Google Drive is a free online storage service that lets users upload, download, and share files. Its main advantage is speed and ease of access; its main drawback is that shared links sometimes break or stop working.
-To download the book via Google Drive, follow these steps:
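When a Google Drive link is shared as a viewer page rather than a file, a common workaround is to rebuild the direct-download URL from the file ID. A minimal sketch, assuming the standard `uc?export=download` pattern; the file ID used below is illustrative, not the real ID of this book:

```python
def gdrive_direct_url(file_id: str) -> str:
    """Build a direct-download URL from a Google Drive file ID.

    Uses the public "uc?export=download" URL pattern. The caller still
    needs permission to access the file for the download to succeed.
    """
    return f"https://drive.google.com/uc?export=download&id={file_id}"

# Example with a made-up file ID:
print(gdrive_direct_url("1AbC_exampleFileId"))
```

You can then fetch that URL with a browser or any HTTP client.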
-The official Kemdikbud website is maintained by the Ministry of Education and Culture of the Republic of Indonesia. Its advantage is that the authenticity and quality of the files are guaranteed by the responsible authority; its drawback is that the site can occasionally be down or slow to load content.
-To download the book via the official Kemdikbud website, follow these steps:
-Besides Google Drive and the official Kemdikbud website, you can also download the book from other sites that provide download links. Their advantage is the variety of links available; their drawback is that the credibility and safety of the files are not guaranteed.
-To download the book from another site, follow these steps:
The grade 3 theme 5 teacher's book, 2018 revision, has engaging, useful content for learning about weather. It consists of four subthemes: weather conditions, weather changes, the effects of weather changes on human life, and weather, seasons, and climate. Each subtheme is described briefly below.
This subtheme covers the definition, types, and factors that influence weather conditions. Students learn basic weather concepts such as air temperature, humidity, air pressure, wind, clouds, rain, snow, and lightning. They also learn about weather-measuring instruments such as the thermometer, hygrometer, barometer, anemometer, and rain gauge. Students carry out varied, engaging activities: observing and recording the weather around the school, building simple weather instruments from recycled materials, running science experiments on the water cycle and the greenhouse effect, and writing reports on their observations and experiments.
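Temperature activities like the ones above often involve converting between units when comparing readings from different instruments. A minimal sketch of the standard conversion formulas (function names are my own):

```python
def celsius_to_fahrenheit(c: float) -> float:
    # Standard conversion: F = C * 9/5 + 32
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f: float) -> float:
    # Inverse conversion: C = (F - 32) * 5/9
    return (f - 32) * 5 / 9

# A typical warm day of about 32 °C:
print(celsius_to_fahrenheit(32))  # 89.6
```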
-This subtheme covers the definition, causes, and impacts of weather changes. Students learn key concepts such as daily and seasonal weather variation, natural phenomena that influence weather (such as El Nino and La Nina), and global climate change caused by global warming. They also learn how to anticipate and adapt to changing weather: checking forecasts, choosing clothing appropriate to the weather, looking after personal and environmental health, and taking part in conservation efforts. Activities include charting temperature changes over a week or a month, identifying weather-influencing phenomena in Indonesia and worldwide, running experiments on the effect of global warming on polar ice and sea levels, and making posters or slogans about environmental conservation.
-This subtheme covers the positive and negative effects of weather changes on human life in health, social, economic, cultural, and political terms. Students learn about weather-related diseases (such as flu, dengue fever, and malaria), weather-driven natural disasters (such as floods, landslides, and forest fires), and the opportunities and challenges weather creates (in agriculture, tourism, and renewable energy). They also learn ways to cope with and reduce the negative effects: disease prevention and treatment, disaster mitigation and adaptation, and international cooperation and solidarity. Activities include making tables or diagrams of the effects of weather changes on human life, identifying and evaluating examples from Indonesia and the world, running simulations or role plays of disaster situations or international cooperation, and producing reports or presentations on the results.
-This subtheme covers the relationship between weather, seasons, and climate: their definitions, formative factors, types, and the patterns that occur on Earth. Students learn ways to observe and predict weather, seasons, and climate using calendars or almanacs, maps or charts, digital apps or media, and local or traditional knowledge. Activities include making a calendar or almanac based on a year of weather observations, mapping or charting seasons or climate in Indonesia or the world, using digital apps or media to view real-time or historical forecasts, and collecting and comparing local or traditional weather knowledge from different regions or cultures.
-The grade 3 theme 5 teacher's book, 2018 revision, is an e-book of weather learning material for elementary school or madrasah ibtidaiyah students. Its content is engaging and useful, organized into four subthemes: weather conditions, weather changes, the effects of weather changes on human life, and weather, seasons, and climate. The book can help teachers and students plan and deliver lessons aligned with the curriculum's competency standards and objectives; develop literacy, character, higher-order thinking skills, and 21st-century competencies; integrate weather lessons with other subjects; and appreciate the beauty and diversity of the universe created by Allah SWT.
-To download the book, you can use Google Drive, the official Kemdikbud website, or another site, whichever best suits your needs. Be careful, however, when downloading files from untrusted or unofficial sources, as they may carry viruses or malware that can damage your device.
-Here are some tips and suggestions for using the grade 3 theme 5 teacher's book, 2018 revision, effectively:
-Here are five frequently asked questions about downloading the grade 3 theme 5 teacher's book, 2018 revision:
| Question | Answer |
| --- | --- |
| Is the 2018 revision different from earlier editions of the grade 3 theme 5 teacher's book? | Yes. The 2018 revision improves and refines the content, layout, language, and presentation of the material. |
| Is the book available in print? | No. It is available only as an e-book (PDF), which you can download free from various online sources. |
| Do I have to follow all the material and activities in the book? | No. You can adapt the lessons to your students' conditions and characteristics, adding, trimming, or modifying material and activities to match their needs and interests. |
| Do I have to use this book exclusively for weather lessons? | No. You can complement and enrich the lessons with other relevant, high-quality resources: other books, print or electronic media, the internet, the local environment, or competent resource people. |
| Do I have to master every weather concept and phenomenon in the book? | No. You can focus on the concepts and phenomena most relevant and important for the lessons, using the competency-achievement indicators as a guide. |
If you are a fan of sci-fi horror games, you have probably heard of Dead Space, the classic game that puts you in the shoes of Isaac Clarke, an engineer who has to fight his way through a spaceship infested with grotesque creatures called Necromorphs. Dead Space is widely regarded as one of the best survival horror games ever made, thanks to its immersive atmosphere, strategic combat, and terrifying enemies.
-But what if you could play Dead Space with even better graphics, sound, gameplay, and content? That's what the Dead Space remake offers. This game is a complete overhaul of the original Dead Space, using modern technology and design to enhance every aspect of the game. And if you want to take your experience to the next level, you can download the dead space mod apk latest version, which gives you access to unlimited resources, unlocked weapons, and more.
-DOWNLOAD ☑ https://jinyurl.com/2uNOY1
The Dead Space remake is not just a simple remaster of the original game. It is a reimagining that stays faithful to the core vision of Dead Space while adding new elements and improvements. Here are some of the features of the Dead Space remake:
-If you want to enjoy Dead Space without any limitations or restrictions, you can download the dead space mod apk latest version from a reliable source. This mod apk is a modified version of the game that gives you several advantages over the normal version. Here are some of the benefits of downloading the dead space mod apk latest version:
-With this mod apk, you will not see any annoying ads or pop-ups that interrupt your gameplay. You can enjoy Dead Space without any distractions or interruptions.
-Installing the dead space mod apk latest version on your device is easy and simple. Just follow these steps:
-Dead Space is a masterpiece of survival horror that deserves to be played by every fan of the genre. The Dead Space remake is a stunning improvement that enhances every aspect of the game, making it more immersive, thrilling, and terrifying. And if you want to have even more fun and freedom, you can download the dead space mod apk latest version, which gives you unlimited resources, unlocked weapons, and no ads. So what are you waiting for? Download the dead space mod apk latest version today and prepare to face your fears.
-Here are some frequently asked questions about Dead Space and the dead space mod apk latest version:
-A: Yes, Dead Space remake is available for Android devices. You can download it from the Google Play Store or from other sources. However, you need to have a compatible device that meets the minimum requirements for the game.
-A: Yes, dead space mod apk latest version is safe to use as long as you download it from a reliable source. However, you should always be careful when installing apps from unknown sources and scan them for viruses or malware before installing them.
A: Yes, you can play Dead Space offline without any internet connection. However, you may need to connect to the internet once in a while to verify your license or update the game.
-A: Dead Space is a fairly long game that can take you around 10 to 15 hours to complete, depending on your difficulty level and playstyle. However, there are also side missions and collectibles that can extend your gameplay time.
-A: Here are some tips and tricks for playing Dead Space:
-If you are looking for a way to watch live TV on your smartphone, tablet, or computer, you might want to check out Jio TV Live APK. This is an app that lets you stream over 1000+ TV channels, including 300+ HD channels, in 15+ languages, for free. You can also watch the latest TV shows, movies, sports, news, devotional, and more on this app. In this article, we will tell you everything you need to know about Jio TV Live APK, such as its features, benefits, how to download and install it, how to use it, alternatives, reviews, and FAQs.
-Download File > https://jinyurl.com/2uNQeQ
Jio TV Live APK is an Android app that allows you to watch live TV on your device. It is developed by Jio Platforms Limited, a subsidiary of Reliance Industries Limited, which also offers other digital services such as Jio SIM, Jio Fiber, Jio Cinema, Jio Music, and more. Jio TV Live APK is one of the most popular entertainment apps in India, with over 100 million downloads on Google Play Store. It is also available for iOS users and web users.
-Some of the features and benefits of Jio TV Live APK are:
-Once you have downloaded and installed Jio TV Live APK on your device, you can start using it to watch live TV channels and programs. Here are some steps to help you use Jio TV Live APK:
-To mark a program as your favorite, tap on any program and then tap on the 'Favorite' icon at the bottom of the screen. To unmark it, tap on it again and then tap on the 'Unfavorite' icon at the bottom of the screen.
While Jio TV Live APK is a great app for watching live TV on your device, it is not the only one. There are some other apps that offer similar or better features and services. Here are some of the alternatives to Jio TV Live APK that you can try:
-Airtel Xstream TV is an app that lets you watch live TV, movies, shows, and more on your device. It is offered by Airtel, one of the leading telecom operators in India. It has over 400+ live TV channels, 10000+ movies and shows, and exclusive content from platforms such as Zee5, Eros Now, Hooq, Hungama Play, etc. It also has a dedicated kids section with educational and fun content. You can also download and watch offline any content that you like. Airtel Xstream TV is free for Airtel users and requires a subscription for non-Airtel users.
-Disney+ Hotstar is an app that lets you watch live TV, movies, shows, sports, news, and more on your device. It is offered by Star India, a subsidiary of The Walt Disney Company. It has over 300+ live TV channels, 100000+ hours of content, and exclusive access to Disney+ originals, Marvel movies, Star Wars series, Pixar animations, etc. It also has a huge collection of Indian movies and shows in various languages. You can also watch live sports events such as IPL, ICC Cricket World Cup, Premier League, Formula 1, etc. Disney+ Hotstar requires a subscription to access its premium content.
-Vodafone Play is an app that lets you watch live TV, movies, shows, and more on your device. It is offered by Vodafone Idea Limited, another leading telecom operator in India. It has over 450+ live TV channels, 15000+ movies and shows, and exclusive content from platforms such as Zee5, Sony Liv, Lionsgate Play, Shemaroo Me, Sun NXT, etc. It also has a curated section for kids with educational and entertaining content. You can also download and watch offline any content that you like. Vodafone Play is free for Vodafone Idea users and requires a subscription for non-Vodafone Idea users.
-Jio TV Live APK has received mixed reviews from its users. Some users have praised its features, quality, variety, and user interface. Others have complained about its bugs, errors, data consumption, and battery drain.
Some of the user reviews of Jio TV Live APK are:
According to the user ratings and feedback on Google Play Store, some of the pros and cons of Jio TV Live APK are:
| Pros | Cons |
| --- | --- |
| It has a large variety of channels and content in different languages and genres. | It sometimes crashes or freezes while streaming. |
| It has a user-friendly interface and easy navigation. | It consumes a lot of data and battery while streaming. |
| It has a catch-up, record, and reminder feature for convenience. | It does not support casting or mirroring to other devices. |
| It has exclusive access to Jio Sports channels and events. | It requires a Jio SIM or ID to access the app. |
| It is free to download and use for Jio users. | It has ads that can be annoying or intrusive. |
Here are some of the user ratings and feedback of Jio TV Live APK from Google Play Store:
-Jio TV Live APK is an app that lets you watch live TV on your device. It has over 1000+ live TV channels, 10000+ movies and shows, and exclusive content from various platforms. It also has features such as catch-up, record, reminder, favorites, etc. It is free for Jio users and requires a subscription for non-Jio users. It has received mixed reviews from its users, with some praising its features, quality, variety, and user interface, and some complaining about its bugs, errors, data consumption, battery drain, casting support, ads, etc. Overall, it is a good app for watching live TV on your device, but it needs improvement in some areas.
-To use Jio TV Live APK, you need an Android device with Android 4.4 or above, a Jio SIM or ID, and an internet connection.
-To watch Jio TV Live APK on your PC or laptop, you need to visit the official website of Jio TV and sign in with your Jio ID and password. You can also use an Android emulator such as BlueStacks or Nox Player to run the app on your PC or laptop.
-To watch Jio TV Live APK on your smart TV, you need to connect your device with the app to your smart TV using a Chromecast device or a similar device that supports screen mirroring or casting. You can also use an HDMI cable to connect your device with the app to your smart TV.
-To update Jio TV Live APK, you need to visit the Google Play Store or the official website of Jio TV and download the latest version of the app. You can also enable the auto-update option in the settings of the app to get the updates automatically.
-To contact the customer support of Jio TV Live APK, you can call the toll-free number 1800-889-9999 or email at care@jio.com. You can also visit the help and support section of the app or the website to get answers to your queries.
-Paste your data in the textarea below:
- - - - diff --git a/spaces/awacke1/StreamlitSharedChatToFiles/README.md b/spaces/awacke1/StreamlitSharedChatToFiles/README.md deleted file mode 100644 index 928f8c1114dd7df90b7727120d69dfe8e091adf4..0000000000000000000000000000000000000000 --- a/spaces/awacke1/StreamlitSharedChatToFiles/README.md +++ /dev/null @@ -1,27 +0,0 @@ ---- -title: 🍪📚🤓StreamlitSharedChatToFiles -emoji: 📢📚👩🏼🏫🤓 -colorFrom: green -colorTo: pink -sdk: streamlit -sdk_version: 1.17.0 -app_file: app.py -pinned: false -license: mit ---- - -📢📚👩🏼🏫🤓 Calling all bookworms, geeks, and knowledge seekers! 📢📚👩🏼🏫🤓 - -👉🏼 Introducing Streamlit Chat, the funnest chat channel ever! 🎉🥳🤩 - -If you're tired of boring chats and dull conversations, you're going to love Streamlit Chat. With this exciting new platform, you can connect with like-minded people from all over the world and chat about everything from your favorite books to the latest tech gadgets. 📱💻📚🌎 - -And the best part? You can access a wealth of knowledge, all thanks to the incredible power of Streamlit Chat's Wikipedia integration. 🤯🌟👨🏻💻 - -With Streamlit Chat, you don't need to be a librarian or a genius to access valuable information. You can chat with the encyclopedia itself, thanks to our innovative integration with Wikipedia. 🤓📚💻 - -Whether you want to learn about the history of the Eiffel Tower 🗼 or discover the best way to bake a cake 🎂, Streamlit Chat has got you covered. And, with our fun and easy-to-use interface, you can connect with other users and share your knowledge with the world. 🌎👩👩👦👦💬 - -So, if you're ready to take your chats to the next level, join Streamlit Chat today. It's the funnest way to connect with people and learn something new every day. 🤩🥳🎉 - -Streamlit Chat - the cookiest librarian you'll ever meet! 
🍪📚🤓 diff --git a/spaces/azusarang/so-vits-svc-models-ba_P/vencoder/ContentVec768L12.py b/spaces/azusarang/so-vits-svc-models-ba_P/vencoder/ContentVec768L12.py deleted file mode 100644 index 0d1591c8843b920d5685e822354e8e6adc9a9e19..0000000000000000000000000000000000000000 --- a/spaces/azusarang/so-vits-svc-models-ba_P/vencoder/ContentVec768L12.py +++ /dev/null @@ -1,34 +0,0 @@ -from vencoder.encoder import SpeechEncoder -import torch -from fairseq import checkpoint_utils - -class ContentVec768L12(SpeechEncoder): - def __init__(self,vec_path = "pretrain/checkpoint_best_legacy_500.pt",device=None): - print("load model(s) from {}".format(vec_path)) - self.hidden_dim = 768 - models, saved_cfg, task = checkpoint_utils.load_model_ensemble_and_task( - [vec_path], - suffix="", - ) - if device is None: - self.dev = torch.device("cuda" if torch.cuda.is_available() else "cpu") - else: - self.dev = torch.device(device) - self.model = models[0].to(self.dev) - self.model.eval() - - def encoder(self, wav): - feats = wav - if feats.dim() == 2: # double channels - feats = feats.mean(-1) - assert feats.dim() == 1, feats.dim() - feats = feats.view(1, -1) - padding_mask = torch.BoolTensor(feats.shape).fill_(False) - inputs = { - "source": feats.to(wav.device), - "padding_mask": padding_mask.to(wav.device), - "output_layer": 12, # layer 12 - } - with torch.no_grad(): - logits = self.model.extract_features(**inputs) - return logits[0].transpose(1, 2) \ No newline at end of file diff --git a/spaces/azusarang/so-vits-svc-models-ba_P/vencoder/hubert/__init__.py b/spaces/azusarang/so-vits-svc-models-ba_P/vencoder/hubert/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/spaces/banana-projects/web3d/node_modules/three/examples/js/PRNG.js b/spaces/banana-projects/web3d/node_modules/three/examples/js/PRNG.js deleted file mode 100644 index 
ab811a820bc3e440b0fb6aecdeb806f4c02fbb77..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/examples/js/PRNG.js +++ /dev/null @@ -1,23 +0,0 @@ -// Park-Miller-Carta Pseudo-Random Number Generator -// https://github.com/pnitsch/BitmapData.js/blob/master/js/BitmapData.js - -var PRNG = function () { - - this.seed = 1; - this.next = function() { - - return ( this.gen() / 2147483647 ); - - }; - this.nextRange = function( min, max ) { - - return min + ( ( max - min ) * this.next() ) - - }; - this.gen = function() { - - return this.seed = ( this.seed * 16807 ) % 2147483647; - - }; - -}; diff --git a/spaces/banana-projects/web3d/node_modules/three/examples/js/postprocessing/HalftonePass.js b/spaces/banana-projects/web3d/node_modules/three/examples/js/postprocessing/HalftonePass.js deleted file mode 100644 index 5fa841c1248c6232dd7a0373880b868f602e272f..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/examples/js/postprocessing/HalftonePass.js +++ /dev/null @@ -1,72 +0,0 @@ -/** - * @author meatbags / xavierburrow.com, github/meatbags - * - * RGB Halftone pass for three.js effects composer. Requires THREE.HalftoneShader. 
- * - */ - -THREE.HalftonePass = function ( width, height, params ) { - - THREE.Pass.call( this ); - - if ( THREE.HalftoneShader === undefined ) { - - console.error( 'THREE.HalftonePass requires THREE.HalftoneShader' ); - - } - - this.uniforms = THREE.UniformsUtils.clone( THREE.HalftoneShader.uniforms ); - this.material = new THREE.ShaderMaterial( { - uniforms: this.uniforms, - fragmentShader: THREE.HalftoneShader.fragmentShader, - vertexShader: THREE.HalftoneShader.vertexShader - } ); - - // set params - this.uniforms.width.value = width; - this.uniforms.height.value = height; - - for ( var key in params ) { - - if ( params.hasOwnProperty( key ) && this.uniforms.hasOwnProperty( key ) ) { - - this.uniforms[ key ].value = params[ key ]; - - } - - } - - this.fsQuad = new THREE.Pass.FullScreenQuad( this.material ); - -}; - -THREE.HalftonePass.prototype = Object.assign( Object.create( THREE.Pass.prototype ), { - - constructor: THREE.HalftonePass, - - render: function ( renderer, writeBuffer, readBuffer, deltaTime, maskActive ) { - - this.material.uniforms[ "tDiffuse" ].value = readBuffer.texture; - - if ( this.renderToScreen ) { - - renderer.setRenderTarget( null ); - this.fsQuad.render( renderer ); - - } else { - - renderer.setRenderTarget( writeBuffer ); - if ( this.clear ) renderer.clear(); - this.fsQuad.render( renderer ); - - } - - }, - - setSize: function ( width, height ) { - - this.uniforms.width.value = width; - this.uniforms.height.value = height; - - } -} ); diff --git a/spaces/banana-projects/web3d/node_modules/three/src/renderers/shaders/ShaderLib/meshphong_vert.glsl.js b/spaces/banana-projects/web3d/node_modules/three/src/renderers/shaders/ShaderLib/meshphong_vert.glsl.js deleted file mode 100644 index 10588905821ea7bf1f775b26d5070b672b47a989..0000000000000000000000000000000000000000 --- a/spaces/banana-projects/web3d/node_modules/three/src/renderers/shaders/ShaderLib/meshphong_vert.glsl.js +++ /dev/null @@ -1,59 +0,0 @@ -export default /* glsl */` 
-#define PHONG - -varying vec3 vViewPosition; - -#ifndef FLAT_SHADED - - varying vec3 vNormal; - -#endif - -#includeDownload Zip ✵ https://urloso.com/2uyQbK
Download Zip ✵ https://urloso.com/2uyPPV
Download Zip - https://urloso.com/2uyOCC
This space was created using SD Space Creator.
-1D-Nest linear cutting optimizer is the most efficient, friendly and powerful optimizer for cutting of materials provided in standard lengths, like bars, tubes, metallic profiles, extrusions, paper rolls, etc. Power: NO quantity limits for pieces to cut or bars to cut from.
-Download File ⚙ https://tinurli.com/2uwiIJ
The Glass Cutting optimizer based on the PLUS 2D cutting optimization technology consistently delivers high yield layouts and is ideal for cutting glass, whether manually or using a CNC cutting table.
-1D cutting optimizer, 1DNest is the most efficient, friendly and productive one-dimensional cutting optimization software for making cutting plans with top efficiency for manufacturers who cut smaller pieces from materials provided in standard lengths (linear materials) like bars, tubes, structural profiles, beams, aluminium extrusions, paper rolls, etc.
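The kind of one-dimensional nesting described above is the classic cutting-stock problem. As an illustration only — 1DNest's actual algorithm is not published, and `plan_cuts` with its parameters is a hypothetical name — a minimal first-fit-decreasing sketch looks like:

```python
def plan_cuts(pieces, stock_len, kerf=0.0):
    """First-fit-decreasing heuristic for the 1D cutting-stock problem.

    pieces: required piece lengths; stock_len: standard bar length;
    kerf: material lost to each cut (assumed 0 by default).
    Returns a list of bars, each a list of the pieces cut from it.
    """
    bars = []  # each entry: [remaining_length, [pieces cut from this bar]]
    for p in sorted(pieces, reverse=True):  # place the largest pieces first
        if p > stock_len:
            raise ValueError(f"piece {p} is longer than the stock length {stock_len}")
        for bar in bars:
            if bar[0] >= p + kerf:  # piece (plus one cut) fits in an open bar
                bar[0] -= p + kerf
                bar[1].append(p)
                break
        else:  # no open bar has room: start a new standard-length bar
            bars.append([stock_len - p - kerf, [p]])
    return [cuts for _, cuts in bars]

# Five pieces nested into 6.0 m stock bars:
plan = plan_cuts([2.5, 2.5, 1.0, 3.0, 1.5], stock_len=6.0)  # → [[3.0, 2.5], [2.5, 1.5, 1.0]]
```

First-fit-decreasing is only a fast heuristic; commercial optimizers typically combine it with search or linear-programming techniques to squeeze out the last few percent of yield.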
-Using a warez version, crack, warez passwords, patches, serial numbers, registration codes, key generator, pirate key, keymaker or keygen for a 1d cutting license key is illegal. Download links are directly from our mirrors or the publisher's website; 1d cutting torrent files or shared files from free file sharing and free upload services, including Rapidshare, MegaUpload, YouSendIt, Letitbit, DropSend, MediaMax, HellShare, HotFile, FileServe, LeapFile, MyOtherDrive or MediaFire, are not allowed!
-Your computer will be at risk of getting infected with spyware, adware, viruses, worms, trojan horses, dialers, etc. while you are searching and browsing these illegal sites which distribute a so-called keygen, key generator, pirate key, serial number, warez full version or crack for 1d cutting. These infections might corrupt your computer installation or breach your privacy. A 1d cutting keygen or key generator might contain a trojan horse opening a backdoor on your computer.
-This program is a length-nesting optimizer and stock management software. It minimizes the wasted material in the cutting process of any linear material like pipes, bars, tubes, profiles, paper rolls, extrusions, etc.
Do you love candy? Do you love puzzles? Do you love fun? If you answered yes to any of these questions, then you will love Candy Crush Saga, the legendary match 3 puzzle game that has captivated millions of players around the world. In this article, we will tell you everything you need to know about Candy Crush Saga, how to download it, how to play it, and why you should play it. Let's get started!
-Candy Crush Saga is a match 3 puzzle game developed by King, a leading mobile game developer. The game was released in 2012 and has since become one of the most popular and successful mobile games of all time. According to King, the game has been played by over a trillion times, and has over a billion downloads on Google Play Store alone. The game is also available on Microsoft Store and App Store for Windows, iOS, and Mac devices.
-Download ->>> https://urlca.com/2uOb60
The gameplay of Candy Crush Saga is simple but addictive. You have to match three or more candies of the same color to make them pop and clear them from the board. You have to complete different objectives in each level, such as reaching a target score, clearing all the jelly, collecting all the ingredients, or clearing all the orders. You have a limited number of moves or time to complete each level, so you have to plan your moves carefully and use smart strategies. If you run out of moves or time, you will lose a life. You have five lives at a time, and they regenerate over time or by asking your friends for help.
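The match-three rule described above — pop any horizontal or vertical run of three or more same-colored candies — can be sketched as a simple grid scan. This is an illustrative sketch, not King's actual implementation; `find_matches` is a hypothetical helper:

```python
def find_matches(board):
    """Return the set of (row, col) cells that belong to a horizontal or
    vertical run of 3+ identical candies. board is a list of equal-length rows."""
    rows, cols = len(board), len(board[0])
    matched = set()
    # Scan each row for horizontal runs.
    for r in range(rows):
        run = 1
        for c in range(1, cols + 1):
            if c < cols and board[r][c] == board[r][c - 1]:
                run += 1
            else:
                if run >= 3:  # run just ended: mark its cells
                    matched.update((r, k) for k in range(c - run, c))
                run = 1
    # Scan each column for vertical runs.
    for c in range(cols):
        run = 1
        for r in range(1, rows + 1):
            if r < rows and board[r][c] == board[r - 1][c]:
                run += 1
            else:
                if run >= 3:
                    matched.update((k, c) for k in range(r - run, r))
                run = 1
    return matched

board = ["RRRG",
         "GBRG",
         "BBRG"]
cells = find_matches(board)  # → 8 matched cells: the top RRR row plus the R and G columns
```

A real engine would then clear the matched cells, let the candies above fall, refill from the top, and rescan for cascades until no matches remain.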
-Candy Crush Saga has many features that make it fun and exciting. Some of them are:
-Candy Crush Saga is free to download and play on various platforms. However, some optional in-game items require payment. You can turn off the payment feature by disabling in-app purchases in your device’s settings. Here are the steps to download Candy Crush Saga on different platforms:
-To download Candy Crush Saga from Microsoft Store, follow these steps:
-To download Candy Crush Saga from App Store, follow these steps:
Candy Crush Saga is easy to play but hard to master. Here are some tips and tricks to help you play better and have more fun:
-Candy Crush Saga is more than just a game, it's a lifestyle. There are many reasons why you should play Candy Crush Saga, but here are some of the most compelling ones:
-Candy Crush Saga will keep you entertained for hours, days, weeks, and even years. Its simple but addictive gameplay challenges your brain and your fingers, while colorful, charming graphics and sound make you feel like you are in a candy wonderland. A captivating story takes you on a journey through the Candy Kingdom, where you meet many lovable characters and help them with their quests, and the humorous, witty dialogue will make you laugh and smile.
-Candy Crush Saga never runs out of content. It has thousands of levels and puzzles, with new ones added every two weeks, plus various game modes and challenges that test your skills and creativity. Different types of candies, obstacles, boosters, and power-ups spice up the gameplay; events and competitions offer extra fun and rewards; and a range of worlds and episodes takes you to different places and themes.
-Candy Crush Saga also keeps you connected with others. Social features let you link your Facebook or King account, send and receive lives and messages from your friends, compare scores and progress on the leaderboard, and join teams and clubs with other players. Community channels include the official website, Facebook page, Instagram, Twitter, YouTube, and blog, and support features let you contact the customer care team, report bugs and issues, give feedback and suggestions, and access the help center and FAQ.
-Candy Crush Saga is a match 3 puzzle game that has become a global phenomenon. It offers simple but addictive gameplay, colorful and charming graphics and sound, a captivating story and characters, humorous and witty dialogue, thousands of levels and puzzles, various game modes and challenges, different types of candies, obstacles, boosters, and power-ups, daily rewards and bonuses, events and competitions, and social, community, and support features. The game is free to download and play on various platforms, such as the Google Play Store, Microsoft Store, App Store, Facebook, or King.com. If you love candy, puzzles, fun, or all of the above, download Candy Crush Saga today and join the sweetest adventure ever!
-Here are some of the frequently asked questions about Candy Crush Saga:
-If you are looking for a reliable and professional diagnostic tool for cars and trucks, you might have heard of Delphi Cars 2015 R3. This software is one of the most popular products from Delphi Technologies, a global leader in automotive solutions. In this article, we will tell you what Delphi Cars 2015 R3 is, where to download its keygen and software, and how to install and activate it on your computer. Read on to find out more.
Delphi Cars 2015 R3 is diagnostic software that works with various diagnostic tools, such as WOW, CDP, Autocom, and MVDiag TCS CDP. It supports multi-brand cars and trucks up to 2017, and it can perform various functions, such as reading and erasing fault codes, displaying live data, performing actuator tests, adjusting parameters, coding modules, and more. It also has a user-friendly interface that lets you easily navigate the menus and options.
-Some of the features and benefits of Delphi Cars 2015 R3 are:
-Delphi Cars 2015 R3 supports a wide range of vehicles and protocols. Here are some examples of the supported vehicles:
-| Vehicle Manufacturer | Vehicle Model |
-| --- | --- |
| Audi | A1, A2, A3, A4, A5, A6, A7, A8, Q2, Q3, Q5, Q7, R8, S1, S2, S3, S4, S5, S6, S7, S8, TTS |
| Benz | A-Class, B-Class, C-Class, E-Class, G-Class, GL-Class, GLA-Class, GLC-Class, GLE-Class, GLK-Class, GLS-Class, M-Class, R-Class, S-Class, V-Class, Viano, Vito, X-Class |
| BMW | 1 Series, 2 Series, 3 Series, 4 Series, 5 Series, 6 Series, X1, X2, X3, X4, X5, X6, Z3, Z4, Z8, i3, i8 |
| Ford | C-Max, EcoSport, Edge, Escape, Expedition, Explorer, F-150, Fiesta, Focus, Fusion, Galaxy, Ka, Kuga, Mondeo, Mustang, Ranger, S-Max, Taurus, Transit |
| Honda | Accord, Civic, CR-V, CR-Z, Fit, HR-V, Jazz, Odyssey, Pilot, Ridgeline |
| Hyundai | Accent, Azera, Creta, Elantra, Genesis, i10, i20, i30, i40, Ioniq, Kona, Santa Fe, Sonata, Tucson, Veloster |
| Kia | Cadenza, Carens, Carnival, Ceed, Cerato, Forte, K900, Niro, Optima, Picanto, Rio, Sedona, Sorento, Soul, Sportage, Stinger |
| Mazda | 2, 3, 5, 6, CX-3, CX-5, CX-7, CX-9, MX-5, RX-8 |
| Nissan | Altima, Armada, Juke, Kicks, Leaf, March, Micra, Murano, Navara, Note, NV200, NV300, NV400, Pathfinder, Pulsar, Qashqai, Rogue, Sentra, Tiida, X-Trail |
| Toyota | Auris, Avalon, Avensis, Aygo, Camry, Corolla, Fortuner, Hilux, Land Cruiser, Prius, Rav4, Verso, Yaris |
| Volkswagen | Amarok, Arteon, Beetle, Bora, Caddy, Golf, Jetta, Lupo, Polo, Passat, Scirocco, Sharan, Tiguan, Touareg, Touran, Transporter, Up |
And here are some examples of the supported protocols:
-If you want to use Delphi Cars 2015 R3 software, you will need to download its keygen and software from a reliable source. There are two main options for this: the official website of Delphi Technologies or alternative sources of Delphi Cars 2015 R3 keygen and software.
-The official website of Delphi Technologies is https://www.delphi.com/. Here you can find information about the company, its products and services, its news and events, its support and training, and its contact details. You can also access the online shop of Delphi Technologies, where you can buy various diagnostic tools and software. However, to download Delphi Cars 2015 R3 keygen and software from the official website, you will need to register an account and pay a fee. The price of Delphi Cars 2015 R3 software is $149.00 USD. You will also need to provide your device serial number and activation code to get the keygen and software.
-If you don't want to pay on the official Delphi Technologies website, you can look for alternative sources of the Delphi Cars 2015 R3 keygen and software on the Internet. However, be careful about the quality and safety of these sources, as some may contain viruses, malware, or fake files. Check the feedback and reviews of other users before downloading anything. Here are some examples of alternative sources of Delphi Cars 2015 R3 keygen and software:
VXDAS is a professional supplier of auto diagnostic tools and software. On its official blog https://www.vxdas.com/blog/, you can find various articles about auto diagnosis, repair, maintenance, programming, coding, etc. One of these articles is titled "How to Install & Activate Delphi Cars & Trucks Software". In this article, you can find a link to download Delphi Cars 2015 R3 keygen and software for free. You can also find a step-by-step guide for installation and activation.
-MHH AUTO is a forum for automotive enthusiasts and professionals. On this forum https://mhhauto.com/, you can find various topics about auto diagnosis, repair, tuning, modification, etc. One of these topics is titled "Delphi Cars & Trucks DS150E [R3.2015] + KEYGEN". In this topic, you can find a link to download Delphi Cars 2015 R3 keygen and software for free. You can also find some instructions and tips for installation and activation.
-MOTORCARSOFT.COM is another forum for automotive enthusiasts and professionals. On this forum https://www.motorcarsoft.com/, you can find various topics about auto diagnosis, repair, programming, coding, etc. One of these topics is titled "Delphi Cars 2015.R3 + Activation". In this topic, you can find a link to download Delphi Cars 2015 R3 keygen and software for free. You can also find some instructions and screenshots for installation and activation.
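Given the warnings above about viruses and fake files from these sources, it is worth verifying any archive you download against a checksum the uploader publishes before running it. A minimal Python sketch (the stand-in file and the comparison workflow are illustrative, not tied to any specific forum post):

```python
import hashlib
import os
import tempfile

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a file's SHA-256 digest, reading in chunks to handle large archives."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a stand-in file; in practice, hash the downloaded archive and
# compare the result against the digest posted by the uploader.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"demo archive contents")
digest = sha256_of(tmp.name)
os.remove(tmp.name)
print(len(digest))  # a SHA-256 hex digest is always 64 characters
```

If the digests do not match, the file was corrupted or tampered with and should not be installed.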
-After you download Delphi Cars 2015 R3 keygen and software from one of the sources mentioned above, you will need to install and activate it on your computer. The installation and activation process may vary depending on the source, but here is a general step-by-step guide for your reference:
-The first step is to copy the software to your computer. You can use a USB flash drive, a CD, or a DVD to do this. You will need to have at least 4 GB of free space on your computer. You should also disable your antivirus software and firewall before copying the software, as they may interfere with the installation and activation.
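The 4 GB free-space requirement mentioned above can be checked before you start copying. A small Python sketch (checking the current drive is an assumption; point it at whichever drive you intend to use):

```python
import shutil

REQUIRED_BYTES = 4 * 1024**3  # the guide asks for at least 4 GB free

# "." checks the drive of the current working directory;
# substitute e.g. "C:\\" to check a specific Windows drive.
usage = shutil.disk_usage(".")
free_gb = usage.free / 1024**3
print(f"Free space: {free_gb:.1f} GB")
print("Enough space" if usage.free >= REQUIRED_BYTES else "Not enough space")
```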
-The second step is to run the main.exe file to activate the software. You can find this file in the folder where you copied the software. Run it as administrator by right-clicking on it and selecting "Run as administrator". You will see a window that asks you to select the language of the software; you can choose from English, Deutsch, Francais, Hungarian, Espanol, Polish, Cesky, Dansk, Greek, Hollands, Italiano, Norsk, Romania, Russian, Srpski, Suomi, Svenska, Turkish, etc. After you select the language, another window asks you to select the product you want to activate: Cars or Trucks. After you select the product, a final window shows your device serial number and asks you to enter an activation code.
-The third step is to save and send the FileActivation file to your supplier. The FileActivation file is a file that contains information about your device serial number and activation code. You can find this file in the same folder where you copied the software. You should save this file on your computer or a USB flash drive. Then you should send this file to your supplier by email or other methods. Your supplier will use a keygen tool to generate an activated FileActivation file for you.
-The fourth and final step is to import the activated FileActivation file to the software. You should receive this file from your supplier within 24 hours after you send them the original FileActivation file. You should copy this file to the same folder where you copied the software. Then you should run the main.exe file again as administrator and select "Yes" when it asks you if you want to import an activated FileActivation file. You will see a message that says "File Activation Success". This means that your software is now installed and activated.
-In conclusion, Delphi Cars 2015 R3 is a powerful and professional diagnostic software package that works with various diagnostic tools and supports multi-brand cars and trucks up to 2017. It has many features and benefits that can help you diagnose, repair, maintain, and optimize your vehicles. To use this software, you will need to download its keygen and software from a reliable source, such as the official website of Delphi Technologies or one of the alternative sources described above, and then install and activate it on your computer by following the step-by-step guide.
-Here are some frequently asked questions about Delphi Cars 2015 R3 keygen download:
-I hope you enjoyed reading this article and learned something useful about Delphi Cars 2015 R3 keygen download. If you have any questions or comments, please feel free to leave them below. Thank you for your time and attention.
If you are a fan of pool games, you have probably heard of 8 Ball Pool, the most popular and addictive pool game for mobile devices. 8 Ball Pool is a game that lets you play online against other players from around the world, or with your friends, in various modes and tournaments. You can also customize your cue, table, and avatar, and collect coins and cash to buy new items and upgrade your equipment.
-But did you know that there is a new version of 8 Ball Pool available for download? That's right, 8 Ball Pool 5.6.1 is the latest update of the game, and it comes with some exciting new features and improvements that will make your gaming experience even better. In this article, we will tell you everything you need to know about 8 Ball Pool 5.6.1, including what's new, how to download it, and how to play it like a pro.
8 Ball Pool is a game that simulates the real-life pool game of the same name. The game follows the standard rules of 8-ball pool, where you have to pocket either the solid or striped balls, depending on your assigned group, and then pocket the black 8-ball before your opponent does.
-The game is played with a cue ball and 15 object balls, numbered from 1 to 15. One player must pocket the balls numbered from 1 to 7 (solid colors), while the other player must pocket the balls numbered from 9 to 15 (stripes). The player who pockets all their balls first can then try to pocket the 8-ball and win the game.
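The group assignment just described maps neatly to code. Here is a toy Python sketch of the rule (an illustration of standard 8-ball numbering, not code from the game itself):

```python
def ball_group(number: int) -> str:
    """Classify an object ball under standard 8-ball rules."""
    if not 1 <= number <= 15:
        raise ValueError("object balls are numbered 1 to 15")
    if number == 8:
        return "eight-ball"  # only pocketed legally after clearing your group
    return "solid" if number <= 7 else "stripe"

print(ball_group(3))   # solid
print(ball_group(12))  # stripe
```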
-Playing 8 Ball Pool is not only fun, but also beneficial for your brain and skills. Here are some of the benefits of playing this game:
-8 Ball Pool is constantly updated by its developers, Miniclip.com, to provide the best gaming experience for its players. The latest version of the game, 8 Ball Pool 5.6.1, was released on April 17, 2023, and it brings some bug fixes and performance improvements that will make the game run smoother and faster on your device.
-Some of the issues that were fixed in this update include:
-If you encounter any other issues or bugs while playing 8 Ball Pool, you can contact the support team through the game settings or the official website. They will try to help you as soon as possible.
-Aside from the bug fixes and improvements, 8 Ball Pool 5.6.1 also introduces some new features and modes that will make the game more fun and exciting. Here are some of the new additions that you can enjoy in this version:
If you have an Android device, you can download 8 Ball Pool 5.6.1 from the Google Play Store. Here are the steps to do so:
-Note: You may need to update your device's software or free up some storage space before downloading the game. You may also need an internet connection to play the game online.
-If you have a PC, you can download 8 Ball Pool 5.6.1 from the official website. Here are the steps to do so:
-Note: You may need a stable internet connection to play the game online. You may also need a mouse or a keyboard to control the game.
-If you want to play 8 Ball Pool 5.6.1 like a pro, you need to practice your skills and learn some tips and tricks that will help you win more matches and earn more coins. Here are some of them:
-If you want to learn more about 8 Ball Pool 5.6.1 and its features, you can check out some of the following resources:
-8 Ball Pool 5.6.1 is the latest version of the most popular and addictive pool game for mobile devices. It comes with some bug fixes and improvements, as well as some new features and modes that will make your gaming experience even better. You can download 8 Ball Pool 5.6.1 from the Google Play Store or the official website, depending on your device. You can also play 8 Ball Pool 5.6.1 like a pro by following some tips and tricks and checking out some resources to learn more about the game.
-We hope you enjoyed this article and found it helpful. If you have any questions or comments, feel free to leave them below. Happy gaming!
-A: Yes, 8 Ball Pool 5.6.1 is free to play, but it contains some in-app purchases that you can buy with real money if you want to get more coins, cash, or items.
-A: No, you need an internet connection to play 8 Ball Pool 5.6.1 online with other players or access some of its features.
-A: Yes, you can play 8 Ball Pool 5.6.1 on other devices, such as iOS devices or Windows phones, by downloading it from their respective app stores.
-A: You can get more coins and cash in 8 Ball Pool 5.6.1 by playing matches and winning them, completing challenges and missions, participating in tournaments and events, spinning the wheel of fortune, watching ads or videos, or buying them with real money.
-A: You can contact the support team of 8 Ball Pool 5.6.1 by going to the game settings and tapping on Contact Us. You can also email them at support@miniclip.com.
Do you love throwing knives and breaking things? Do you want to test your skills and reflexes in a fun and exciting way? If you answered yes, then you should try Knife Hit Hile APK, a popular arcade game that will keep you hooked for hours.
Knife Hit Hile APK is a modified version of Knife Hit, a game developed by Ketchapp, a famous studio that creates casual and addictive games for mobile devices. In this game, your goal is to throw knives into the logs to break them, while avoiding hitting other knives or spikes. You also have to slash the apples that appear on the logs to unlock new knives and power-ups. Every fifth stage, you will face a boss that will challenge your skills and reward you with exclusive knives if you beat them.
-Knife Hit Hile APK is different from the original version because it gives you unlimited coins, lives, and knives. This means that you can play as much as you want without worrying about running out of resources or losing progress. You can also access all the knives and modes in the game without having to spend real money or watch ads.
-The gameplay of Knife Hit Hile APK is very easy to learn but hard to master. You just have to tap the screen to throw a knife at the log that is spinning in front of you. You have to time your taps carefully so that you don't hit any other knives or spikes on the log. The more knives you throw, the faster the log will spin, making it harder to aim. You have to break the log before it reaches the bottom of the screen or before you run out of knives.
-One of the most fun aspects of Knife Hit Hile APK is that you can collect and use different types of knives in the game. There are more than 100 knives to choose from, ranging from simple kitchen knives to exotic weapons like swords, axes, daggers, shurikens, and more. Each knife has its own design, sound, and animation, making them unique and satisfying to use.
Besides knives, you can also encounter different types of targets in the game. There are logs made of wood, metal, stone, ice, cheese, candy, and more. Each target has its own characteristics, such as durability, speed, shape, size, and obstacles. Some targets may even have special effects, such as exploding, splitting, or changing direction.
-Every fifth stage in Knife Hit Hile APK is a boss battle, where you have to face a giant target that has a unique pattern and behavior. For example, you may have to fight a pineapple that shoots spikes at you, a donut that rotates randomly, or a dragon that breathes fire. You have to hit the boss several times to defeat it, while avoiding its attacks and obstacles.
-If you manage to beat the boss, you will get an exclusive knife that is related to the boss theme. For example, if you beat the pineapple boss, you will get a pineapple knife. These knives are very rare and cool to have, so don't miss the chance to get them.
-Another great feature of Knife Hit Hile APK is that you can play it offline, without needing an internet connection. This means that you can enjoy the game anytime and anywhere, without worrying about data usage or network issues. You can also play the game without being interrupted by ads or pop-ups, which can be annoying and distracting.
-If you want to try Knife Hit Hile APK, you have to download and install it manually on your Android device. Here are the steps you need to follow:
-Since Knife Hit Hile APK is not available on the Google Play Store, you have to enable unknown sources on your device to allow the installation of third-party apps. To do this, go to Settings > Security > Unknown Sources and toggle it on. You may see a warning message, but don't worry, it's safe to proceed.
-Next, you have to download the APK file of Knife Hit Hile APK from a trusted source. You can use the link below to get the latest version of the game. Make sure you have enough storage space on your device before downloading the file.
- -Once you have downloaded the APK file, you have to locate it on your device using a file manager app. You can usually find it in the Downloads folder. Tap on the file and follow the instructions to install it on your device. It may take a few seconds to complete the installation.
-After installing the game, you can launch it from your app drawer or home screen. You will see the game icon with the word "Hile" on it. Tap on it and start playing Knife Hit Hile APK. You will see that you have unlimited coins, lives, and knives in the game. You can also access all the modes and levels in the game without any restrictions.
-One of the most important tips for playing Knife Hit Hile APK is to aim for the apples that appear on the logs. Apples are valuable because they give you extra knives and power-ups that can help you in the game. Power-ups include magnets, lasers, bombs, and more. They can make your knives stick better, destroy more parts of the log, or clear all the obstacles on the log.
-However, you also have to avoid hitting the spikes that are also on the logs. Spikes are dangerous because they can break your knives or make you lose a life. If you hit a spike, you will hear a loud sound and see a red flash on the screen. If you lose all your lives, you will have to start over from the beginning of the stage.
-As mentioned above, power-ups are very useful in Knife Hit Hile APK, but you have to use them wisely. Power-ups are activated automatically when you hit an apple that contains them, but they only last for a limited time. You have to make sure that you use them at the right moment and not waste them.
-For example, if you have a magnet power-up, you should use it when there are many knives or spikes on the log, so that your knives will stick better and avoid hitting them. If you have a laser power-up, you should use it when there are few knives or spikes on the log, so that you can cut through them easily. If you have a bomb power-up, you should use it when the log is almost broken, so that you can finish it off quickly.
-Another tip for playing Knife Hit Hile APK is to collect coins and unlock new knives in the game. Coins are the currency of the game, and you can use them to buy new knives from the shop. You can earn coins by hitting apples, breaking logs, defeating bosses, or watching ads. You can also get coins for free by spinning the wheel or opening the chest every day.
-There are more than 100 knives to unlock in the game, and each one costs a different amount of coins. Some knives are cheap and easy to get, while others are expensive and rare. You can also get some knives by completing certain achievements or challenges in the game. For example, you can get a pizza knife by hitting 1000 apples, or a unicorn knife by beating 10 bosses.
-New knives are not only cosmetic, but they also have different effects on the gameplay. Some knives may be faster, sharper, heavier, or lighter than others. Some knives may also have special abilities, such as bouncing, splitting, or exploding. You should try different knives and see which ones suit your style and preference.
-The last tip for playing Knife Hit Hile APK is to challenge yourself with different modes and levels in the game. The game has four modes to choose from: Classic, Challenge, Arcade, and Event. Each mode has its own rules and objectives, and you can switch between them anytime you want.
-Classic mode is the default mode of the game, where you have to break logs and beat bosses as you progress through the stages. Challenge mode is where you have to complete specific tasks or missions in each stage, such as hitting a certain number of apples or avoiding a certain number of spikes. Arcade mode is where you have to break as many logs as you can in a row without missing or hitting anything else. Event mode is where you can play special events that are available for a limited time, such as Halloween or Christmas.
-Each mode also has different levels of difficulty, ranging from easy to hard. The difficulty affects the speed, size, shape, and obstacles of the logs, as well as the number of knives and lives you have. The higher the difficulty, the more challenging and rewarding the game becomes. You can change the difficulty level anytime you want in the settings menu.
-Knife Hit Hile APK is a fun and challenging game for Android users who love throwing knives and breaking things. It has simple and addictive gameplay, different types of knives and targets, boss battles and exclusive rewards, and no internet connection required. It also gives you unlimited coins, lives, and knives, as well as access to all the modes and levels in the game. You can download and install Knife Hit Hile APK easily by following the steps above. You can also improve your skills and enjoy the game more by following the tips and tricks above.
-If you are looking for a game that will test your reflexes and accuracy, while also giving you a lot of fun and excitement, then you should try Knife Hit Hile APK today. It is one of the best arcade games for Android devices that will keep you entertained for hours.
-Here are some frequently asked questions about Knife Hit Hile APK:
-If you are a fan of soccer games, you must have heard of FIFA Mobile, the official mobile game of the FIFA World Cup 2022. This game lets you build your ultimate team of soccer stars, compete in various modes, and relive the world's greatest soccer tournament. But what if you want to enjoy the game without any limitations or restrictions? That's where FIFA Mobile Mod APK comes in. This is a modified version of the original game that gives you unlimited money, unlocked players and teams, menu mod with cheats and hacks, and more. In this article, we will tell you everything you need to know about FIFA Mobile Mod APK, including its features, how to download and install it, its pros and cons, and some FAQs. Read on to find out more.
FIFA Mobile Mod APK is a hacked version of FIFA Mobile, the official mobile game of the FIFA World Cup 2022. This mod apk gives you access to features that are not available in the original game, such as unlimited money and coins, unlocked players and teams, menu mod with cheats and hacks, and more. With these features, you can easily build your dream team of soccer legends, customize your tactics and strategies, dominate the pvp modes, and enjoy the realistic soccer simulation. You can also replay the official World Cup brackets with any of the 32 qualified nations, play in World Cup stadiums, and listen to localized World Cup commentary.
-FIFA Mobile Mod APK has many features that make it different from the original game. Here are some of the main features that you can enjoy with this mod apk:
-One of the best features of FIFA Mobile Mod APK is that it unlocks all the players and teams in the game. You can choose from over 15,000 authentic soccer stars from over 30 leagues, including world-class talent like Kylian Mbappé, Christian Pulisic, Vinicius Jr, Son Heung-min, and more. You can also play with over 600 teams from around the world, including Chelsea, Paris SG, Real Madrid, Liverpool, Juventus, and more. You can even play with soccer icons and heroes like Paolo Maldini, Ronaldinho, Zinedine Zidane, David Beckham, and more.
-Another great feature of FIFA Mobile Mod APK is that it gives you unlimited money and coins in the game. You can use these resources to buy player items, upgrade your team, unlock new kits and badges, and more. You don't have to worry about running out of money or coins or spending real money to get them. You can enjoy the game without any financial constraints.
FIFA Mobile Mod APK also comes with a menu mod that allows you to activate various cheats and hacks in the game. You can use these cheats and hacks to enhance your gameplay experience and have more fun.
-If you want to download and install FIFA Mobile Mod APK on your device, you need to follow these simple steps:
-The first step is to download the mod apk file from a trusted source. You can use the link below to download the latest version of FIFA Mobile Mod APK for free. This link is safe and secure and will not harm your device. Make sure you have enough storage space on your device before downloading the file.
-The next step is to enable unknown sources in your device settings. This will allow you to install apps from sources other than the Google Play Store. To do this, go to your device settings, then security, then unknown sources, and toggle it on. You may see a warning message, but don't worry, it's safe to proceed.
-The final step is to install the mod apk file and launch the game. To do this, locate the downloaded file on your device, tap on it, and follow the instructions on the screen. Once the installation is complete, you can open the game and enjoy FIFA Mobile Mod APK with unlimited money, unlocked players and teams, menu mod with cheats and hacks, and more.
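If you prefer to sideload from a computer instead of tapping through the file on the phone, the same install can be done over USB with the `adb` tool from Android's platform-tools. This is only a sketch: the filename below is a hypothetical placeholder, and it assumes `adb` is installed and USB debugging is enabled on the device.

```python
import shutil
import subprocess

# Hypothetical filename -- replace with whatever the downloaded file is called.
apk_path = "fifa-mobile-mod.apk"

# `adb install -r` sideloads the package, replacing any existing install.
cmd = ["adb", "install", "-r", apk_path]

if shutil.which("adb") is None:
    # adb is not on PATH; print the command so it can be run manually.
    print("adb not found; run manually:", " ".join(cmd))
else:
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout or result.stderr)
```

Either way, once the package is installed you can launch the game from the app drawer as usual.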
-FIFA Mobile Mod APK has many advantages, but it also has some disadvantages. Here are some of the pros and cons of using this mod apk:
- Pros: unlimited money and coins, all players and teams unlocked, a menu mod with cheats and hacks, and no need to spend real money.
- Cons: it violates the terms and conditions of the original game, you risk bans in online modes, you cannot update it, and you cannot use your existing account.
-Here are some of the frequently asked questions about FIFA Mobile Mod APK:
| Question | Answer |
| --- | --- |
| Is FIFA Mobile Mod APK safe to use? | Yes, as long as you download it from a trusted source. However, you should always be careful when installing apps from unknown sources and scan them for viruses or malware before installing them. |
| Is FIFA Mobile Mod APK legal to use? | No, it violates the terms and conditions of the original game. Using this mod apk may result in legal action from EA Sports or FIFA. Therefore, we do not recommend using this mod apk for any illegal purposes. |
| Can I play online modes with FIFA Mobile Mod APK? | Yes, but you may face some issues or risks. For example, you may not be able to connect to other players or servers, you may experience lag or crashes, or you may get banned from online modes or leaderboards if you use cheats and hacks. Play online modes at your own risk. |
| Can I update FIFA Mobile Mod APK? | No, because it is a modified version of the original game. Updating it may cause it to stop working or lose its features. We suggest you stick with the current version or wait for a new version from the modder. |
| Can I use FIFA Mobile Mod APK with my existing account? | No, as it may cause your account to get banned or corrupted. We recommend you create a new account or use a guest account to play with FIFA Mobile Mod APK. |
FIFA Mobile Mod APK is a modified version of FIFA Mobile, the official mobile game of the FIFA World Cup 2022. This mod apk gives you unlimited money, unlocked players and teams, menu mod with cheats and hacks, and more. You can enjoy the game without any limitations or restrictions, but you may also face some risks or drawbacks. Therefore, you should use this mod apk at your own discretion and responsibility.
-We hope this article has helped you learn more about FIFA Mobile Mod APK and how to download and install it on your device. If you have any questions or feedback, please feel free to leave a comment below. Thank you for reading and have fun playing FIFA Mobile Mod APK.
Final Fantasy Tactics is a strategy role-playing game that was released in 1997 for the PlayStation console. It is one of the most acclaimed games in the Final Fantasy series, with a complex and engaging story, a rich and diverse job system, and challenging, rewarding tactical combat. The game has been re-released several times with enhanced graphics, sound, and content. The latest version is Final Fantasy Tactics 2.1.0 apk, which is available for Android devices.
-Download » https://urlca.com/2uO8Gs
In this article, we will explore the features of Final Fantasy Tactics 2.1.0 apk and how to download it on your Android device. Whether you are a fan of the original game or a newcomer to the world of Ivalice, you will find something to enjoy in this classic strategy RPG.
-Final Fantasy Tactics is set in the medieval fantasy world of Ivalice, where two rival factions are fighting for the throne of the kingdom. You play as Ramza Beoulve, a young noble who gets involved in the conflict and uncovers a sinister plot behind it. Along the way, you will meet various characters, some of whom will join your party, while others will oppose you.
-The story of Final Fantasy Tactics is rich and complex, with multiple twists and turns, political intrigue, moral dilemmas, and hidden secrets. Optional characters and side quests reward exploration beyond the main story, depending on your choices and actions throughout the game.
-Final Fantasy Tactics is a tactical role-playing game, where you control a party of up to five characters in turn-based battles on grid-based maps. You can move your characters around the map, attack enemies, use items, cast spells, and perform various actions depending on their job class.
-The job system is one of the most distinctive features of Final Fantasy Tactics. There are over 20 types of jobs in the game, each with its own abilities, strengths, and weaknesses. You can change your characters' jobs at any time outside of battle, as long as they meet the requirements for that job. You can also learn new abilities by spending job points (JP) that you earn from battles.
The job system allows you to customize your characters according to your preferences and strategies. You can mix and match abilities from different jobs to create unique combinations and synergies. For example, you can have a Knight who can also cast White Magic, or a Ninja who can also use Geomancy.
-Final Fantasy Tactics 2.1.0 apk is based on the enhanced port of the game that was released for the PlayStation Portable in 2007, titled Final Fantasy Tactics: The War of the Lions. This version of the game features several improvements and additions over the original PlayStation version, such as:
- Fully animated cutscenes
- A new, more literary translation of the script
- Two additional jobs: the Onion Knight and the Dark Knight
- New playable guest characters
- Cooperative and competitive multiplayer modes
-Final Fantasy Tactics 2.1.0 apk also fixes some bugs and glitches that were present in the original version of the game.
-If you want to play Final Fantasy Tactics 2.1.0 apk on your Android device, you will need to follow these steps:
- Download the Final Fantasy Tactics 2.1.0 apk file from a trusted source.
- Enable installation from unknown sources in your device settings.
- Tap the downloaded file, follow the on-screen instructions to install it, and launch the game.
-Final Fantasy Tactics 2.1.0 apk is a great way to experience one of the best strategy RPGs ever made on your Android device. The game offers a captivating story, a deep and flexible job system, and challenging, satisfying tactical combat. It also features improved graphics, sound, and content over the original version, as well as new touch screen controls and an autosave function. If you are a fan of Final Fantasy or strategy games, you should definitely give this game a try.
-Q: What are the system requirements for the game?
-A: The game requires Android 4.0.3 or higher, and at least 1 GB of RAM and 700 MB of free storage space.
-Q: How do I save my progress?
-A: The game has an autosave function that saves your progress every time you enter or exit a battle or a location. You can also manually save your progress by accessing the menu and selecting Save Game.
-Q: What languages does the game support?
-A: The game supports English, French, German, Italian, Spanish, Japanese, Korean, Traditional Chinese, and Simplified Chinese. You can change the language by accessing the menu and selecting Options, then Language.
-Q: Does the game have multiplayer?
-A: The game supports local wireless and online multiplayer modes, where you can battle against other players or cooperate with them in special missions. You can access the multiplayer mode by selecting Multiplayer from the main menu.
-Q: Where can I learn more about the game?
-A: You can visit the official website of the game [here], or check out some fan-made guides and wikis [here] and [here].
If you are looking for a puzzle game that is fun, addictive, and challenging, then you should check out Parking Jam 3D. This is a game that will test your logic skills, critical thinking, and timing precision as you try to empty a parking lot full of cars. But there is more to this game than just parking. You will also experience jams, angry grannies, bonus levels, idle money, and much more. And if you want to enjoy all the features of this game without any limitations, then you should download Parking Jam 3D Mod APK Unlocked Everything. This is a modified version of the game that will give you access to all the cars, skins, scenes, buildings, and money that you want. In this article, we will tell you what Parking Jam 3D is, what Parking Jam 3D Mod APK Unlocked Everything is, how to download and install it, and what its features and benefits are.
-Parking Jam 3D is one of the most downloaded Puzzle Board games with more than 80 million installs. It is developed by Popcore Games, a studio that has created other popular games like Pull the Pin, Paint Puzzle, Sandwich!, Clash of Blocks, Paint The Cube and more. Here are some of the aspects of Parking Jam 3D that make it a unique and entertaining game.
-Download Zip ✶ https://urlca.com/2uOd1D
The main objective of Parking Jam 3D is to get all the cars out of the parking lot. Sounds simple enough, right? Well, not quite. You see, the parking lot is full of obstacles and other cars that are blocking your way. You need to move them in the right order and direction to clear a path for your car. But be careful not to hit anything or anyone, especially granny. She might look harmless, but she can flip your car over with her enormous strength. You need to use your logic skills, critical thinking, and timing precision to solve this tricky parking puzzle.
-Parking Jam 3D is not only a game about parking, but also a game about jams. Yes, you heard that right. Jams are bonus levels that you can unlock by completing certain levels. In these levels, you will have to deal with a traffic jam that is caused by various factors, such as accidents, roadworks, police cars, ambulances, fire trucks, and more. You will have to find a way to clear the jam and get all the cars moving again. But be careful not to cause more chaos or damage. Jams are fun and challenging levels that will test your patience and problem-solving skills.
-Parking Jam 3D is a game that will never get boring or repetitive. That's because it has hundreds of levels that get harder and harder every time. You will have to face different parking scenarios and situations that will require different strategies and tactics. You will also have to deal with different maps that have different layouts and features. Some maps are small and simple, while others are large and complex. Some maps have ramps, bridges, tunnels, elevators, and other elements that will add more variety and fun to the game. You will also have access to different cars that have different shapes, sizes, colors, and speeds. You can choose from sedans, SUVs, trucks, buses, sports cars, and more. You can also unlock skins for your cars that will make them look more cool and stylish. And if that's not enough, you can also unlock scenes for your maps that will change the background and the atmosphere of the game. You can choose from city, beach, forest, desert, snow, and more. Parking Jam 3D is a game that has something for everyone.
Parking Jam 3D is a free game that you can download from the Google Play Store or the App Store. However, the game also has some limitations and restrictions that might affect your gaming experience. For example, the game has ads that might interrupt your gameplay or annoy you. The game also has in-app purchases that might tempt you to spend real money to get more coins or unlock more features. The game also has some locked features that you can only access by completing certain levels or tasks. For example, some cars, skins, scenes, buildings, and money are locked until you reach a certain level or collect a certain amount of rent.
-If you want to get rid of these limitations and restrictions, then you should download Parking Jam 3D Mod APK Unlocked Everything. This is a modified version of the game that will give you access to all the features of the game without any limitations or restrictions. You will be able to enjoy the game without ads or in-app purchases. You will also be able to customize your cars and buildings with any skin or scene that you want. You will also have unlimited money that you can use to buy anything in the game.
-If you want to download and install Parking Jam 3D Mod APK Unlocked Everything on your Android device, then you need to follow these simple steps:
-First of all, you need to find a reliable source that provides the mod apk file for Parking Jam 3D Mod APK Unlocked Everything. There are many websites that claim to offer this file, but not all of them are trustworthy or safe. Some of them might contain viruses or malware that could harm your device or steal your personal information. Therefore, you need to be careful and do some research before downloading anything from the internet.
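Beyond researching the site itself, you can also sanity-check the downloaded file before installing it. An APK is a ZIP archive that always contains an AndroidManifest.xml entry, so a quick structural check filters out obviously bogus downloads — though it does not replace a proper virus scan. A minimal sketch, using a tiny stand-in file so it is self-contained:

```python
import zipfile

def looks_like_apk(path: str) -> bool:
    """Cheap structural check: APKs are ZIP archives and always contain
    an AndroidManifest.xml entry. This does NOT verify the signature or
    scan for malware -- it only rejects obviously bogus files."""
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as zf:
        return "AndroidManifest.xml" in zf.namelist()

# Stand-in "download" so the sketch runs anywhere; point a real check
# at the file you actually saved.
with zipfile.ZipFile("demo.apk", "w") as zf:
    zf.writestr("AndroidManifest.xml", "<manifest/>")
    zf.writestr("classes.dex", "")

print(looks_like_apk("demo.apk"))  # → True
```

A file that fails this check is certainly not a valid APK; a file that passes still deserves a malware scan before you install it.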
-One of the websites that we recommend is [APKPure]. This is a popular website that provides mod apk files for various games and apps. It has a good reputation and a large user base. It also has a verification system that ensures the safety and quality of the files. You can download Parking Jam 3D Mod APK Unlocked Everything from this website by following these steps:
- Go to [APKPure] website on your browser.
- Search for Parking Jam 3D Mod APK Unlocked Everything in the search bar.
- Click on the result that matches your query.
- Click on the download button and wait for the file to be downloaded on your device.

Before you can install Parking Jam 3D Mod APK Unlocked Everything, you need to enable unknown sources in your device settings.
Unknown sources are sources that are not verified by Google or the device manufacturer. By default, your device does not allow you to install apps from unknown sources for security reasons. However, since Parking Jam 3D Mod APK Unlocked Everything is not available on the Google Play Store, you need to enable unknown sources to install it. You can do this by following these steps:
- Go to your device settings and look for security or privacy options.
- Find the option that says unknown sources or install unknown apps and toggle it on.
- Confirm your choice by tapping OK or Allow.

After you have downloaded the mod apk file and enabled unknown sources, you can install Parking Jam 3D Mod APK Unlocked Everything on your device. You can do this by following these steps:
- Locate the mod apk file on your device using a file manager app or your browser's downloads folder.
- Tap on the file and select install.
- Wait for the installation process to finish and tap open.
- Enjoy the game with all the features unlocked.

Parking Jam 3D Mod APK Unlocked Everything is a great way to enjoy Parking Jam 3D without any limitations or restrictions. Here are some of the features and benefits that you will get from this mod apk:
-Parking Jam 3D Mod APK Unlocked Everything has some features that will make the game more fun and challenging for you. These include:
-If you are feeling stressed or frustrated, you can vent your anger by hitting cars or avoiding granny in Parking Jam 3D Mod APK Unlocked Everything. You can hit cars with your car or with other cars to cause damage and chaos. You can also avoid granny by driving away from her or hiding behind other cars. However, be careful not to hit her or she will flip your car over. This feature will make the game more fun and relaxing for you.
-If you are looking for a challenge, you will find it in Parking Jam 3D Mod APK Unlocked Everything. The game has hundreds of levels that get harder and harder every time. You will have to face different parking scenarios and situations that will require different strategies and tactics. You will also have to deal with different maps that have different layouts and features. You will need to use your skills and critical thinking to solve these puzzles and clear the parking lot.
-If you want to improve your driving skills, you can learn how to unpark smoothly without hitting anything or anyone in Parking Jam 3D Mod APK Unlocked Everything. The game will teach you how to move your car in the right order and direction to clear a path for your car. You will also learn how to avoid obstacles and other cars that are blocking your way. You will also learn how to park your car in tight spaces without scratching it. This feature will make the game more educational and useful for you.
-Parking Jam 3D Mod APK Unlocked Everything has some benefits that will make the game more rewarding and satisfying for you. These include:
-If you want to earn money, you can do so by completing levels and collecting rent from buildings in Parking Jam 3D Mod APK Unlocked Everything. You will earn money by clearing the parking lot and getting all the cars out. You will also earn money by collecting rent from buildings that you own or manage. You can use this money to buy more cars, skins, scenes, buildings, and more.
-If you want to customize your cars and maps, you can do so by unlocking skins for your cars and scenes for your maps in Parking Jam 3D Mod APK Unlocked Everything. You will unlock skins for your cars by completing levels or buying them with money. You can choose from different colors, patterns, designs, logos, stickers, and more. You will unlock scenes for your maps by completing levels or buying them with money. You can choose from different backgrounds, atmospheres, themes, weather, and more.
-If you want to play offline and on the go without an internet connection, you can do so in Parking Jam 3D Mod APK Unlocked Everything. The game does not require an internet connection to play, so you can play it anytime and anywhere. You can play it on your phone, tablet, or laptop without worrying about data usage or wifi availability. This feature will make the game more convenient and accessible for you.
-Parking Jam 3D is a fun and addictive puzzle game that will test your logic skills, critical thinking, and timing precision as you try to empty a parking lot full of cars. But there is more to this game than just parking. You will also experience jams, angry grannies, bonus levels, idle money, and much more. And if you want to enjoy all the features of this game without any limitations or restrictions, then you should download Parking Jam 3D Mod APK Unlocked Everything. This is a modified version of the game that will give you access to all the cars, skins, scenes, buildings, and money that you want. You will also be able to customize your cars and buildings with any skin or scene that you want. You will also have unlimited money that you can use to buy anything in the game.
-If you are looking for a puzzle game that is fun, addictive, and challenging, then you should try out Parking Jam 3D and Parking Jam 3D Mod APK Unlocked Everything. You will not regret it. You will have hours of entertainment and satisfaction as you solve puzzles and clear parking lots. You will also have fun and relax as you hit cars or avoid granny. You will also learn and improve as you unpark smoothly without hitting anything or anyone. You will also earn and reward as you complete levels and collect rent from buildings. You will also customize and enjoy as you unlock skins for your cars and scenes for your maps.
-So what are you waiting for? Download Parking Jam 3D and Parking Jam 3D Mod APK Unlocked Everything today and start your parking adventure. You will love it. And don't forget to share your feedback with us. We would love to hear from you.
-Here are some of the frequently asked questions about Parking Jam 3D and Parking Jam 3D Mod APK Unlocked Everything:
-Q: Is Parking Jam 3D safe to play?
-A: Yes, Parking Jam 3D is safe to play. It does not contain any viruses or malware that could harm your device or steal your personal information. It is also suitable for all ages and does not contain any inappropriate or offensive content.
-Q: Is Parking Jam 3D Mod APK Unlocked Everything safe to download and install?
-A: Yes, Parking Jam 3D Mod APK Unlocked Everything is safe to download and install. However, you need to make sure that you download it from a reliable source like [APKPure]. You also need to enable unknown sources on your device before installing it.
-Q: How do I update Parking Jam 3D Mod APK Unlocked Everything?
-A: If there is a new version of Parking Jam 3D Mod APK Unlocked Everything available, you can update it by following these steps:
- Go to [APKPure] website on your browser.
- Search for Parking Jam 3D Mod APK Unlocked Everything in the search bar.
- Click on the result that matches your query.
- Click on the update button and wait for the file to be downloaded on your device.
- Install the file over the existing one and launch the game.

Q: How can I contact the developers of Parking Jam 3D?

A: If you have any questions, suggestions, feedback, or issues about Parking Jam 3D, you can contact the developers of the game by emailing them at support@popcore.com. You can also visit their website at https://popcore.com/ or follow them on Facebook at https://www.facebook.com/popcoregames/.
-Q: How can I share my opinion about Parking Jam 3D?
-A: If you want to share your opinion about Parking Jam 3D or Parking Jam 3D Mod APK Unlocked Everything, you can do so by leaving a comment below this article. You can also rate and review the game on the Google Play Store or the App Store. We appreciate your feedback and we hope you enjoy the game.
If you are looking for a fun and easy way to stream live video and chat with friends, you might want to try Quick Live app. Quick Live is a social streaming platform that allows you to create your own live shows, join live battles, watch other streamers, and interact with other users. You can also enjoy party games, music, voice chat, and more on Quick Live.
-Download Zip 🗹 https://urlca.com/2uO4zc
However, you may not be able to find Quick Live app on the Google Play Store, as it is not officially available there. In that case, you will need to download and install the APK file of Quick Live app from a third-party source. But what is an APK file and how do you install it on your Android device? In this article, we will explain everything you need to know about Quick Live 1.15.1 APK download, including its features, installation steps, alternatives, and FAQs.
-An APK file is a package file format used by the Android operating system for the distribution and installation of mobile applications. It contains all the elements that an app needs to run properly on your device, such as code, resources, manifest, certificates, etc. You can think of it as a zip file that contains the app.
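Because the format really is ZIP underneath, you can open an APK with ordinary archive tools and see the pieces described above. A short illustration — it builds a miniature stand-in "APK" so the snippet is self-contained; a real downloaded file would be opened the same way:

```python
import zipfile

# Build a miniature stand-in "APK" with the typical top-level entries.
with zipfile.ZipFile("example.apk", "w") as zf:
    zf.writestr("AndroidManifest.xml", "<manifest/>")  # app metadata
    zf.writestr("classes.dex", "")                     # compiled code
    zf.writestr("resources.arsc", "")                  # compiled resources
    zf.writestr("META-INF/CERT.RSA", "")               # signing certificate

# The standard zipfile module reads it like any other ZIP archive.
with zipfile.ZipFile("example.apk") as zf:
    for name in zf.namelist():
        print(name)
```

Listing the archive this way is also a quick way to confirm that a file you downloaded really is a package and not something mislabeled with an .apk extension.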
-People download APK files from outside the Google Play Store for various reasons. Some apps may not be available on the Play Store due to regional restrictions, licensing issues, or policy violations. Some apps may have newer or older versions that are not updated on the Play Store. Some apps may offer more features or customization options in their APK files than in their official versions.
However, downloading APK files from unknown sources can also be risky, as they may contain malware or viruses that can harm your device or compromise your privacy. Therefore, you should always be careful about where you download APK files from and what permissions you grant them.
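One basic precaution before installing a sideloaded APK is to compare its checksum with the one the publisher lists, when a checksum is published at all. A minimal Python sketch of that check; the file name used here is just an example:

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

# Compare the result against the digest the publisher lists, e.g.:
# sha256_of_file("quick-live-1.15.1.apk") == "<published digest>"
```

A mismatch means the file was corrupted or tampered with in transit; a match only proves integrity, not that the publisher itself is trustworthy.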
-Quick Live is a live streaming app that lets you broadcast your own live shows or watch other streamers from around the world. You can also join live battles for huge rewards, invite friends to co-host with you, tag your location when streaming, and more.
-Some of the features of Quick Live app are:
-If you want to download and install Quick Live 1.15.1 APK file on your Android device, you will need to follow these steps:
-Congratulations! You have successfully downloaded and installed Quick Live 1.15.1 APK file on your Android device. You can now launch the app and enjoy its features.
-If you are looking for some other apps that can help you stream live video and chat with friends, you may want to check out these alternatives:
App Name | Description
---------|------------
Bigo Live | A popular live streaming app that allows you to watch or broadcast live shows, play games, chat with friends, and earn money.
Twitch | A leading live streaming platform for gamers and creators, where you can watch or stream live gameplay, esports, music, and more.
Periscope | A live video app that lets you discover or broadcast live events from around the world, such as news, sports, entertainment, etc.
LivU | A live video chat app that connects you with random strangers from different countries and regions, with filters, stickers, and effects.
Uplive | A social live streaming app that enables you to watch or host live shows, join live parties, send gifts, and make friends.
In conclusion, Quick Live is a fun and easy way to stream live video and chat with friends on your Android device. You can create your own live shows, join live battles, watch other streamers, and interact with other users on Quick Live. However, you may not be able to find Quick Live app on the Google Play Store, so you will need to download and install the APK file of Quick Live 1.15.1 from a third-party source.
-To download and install Quick Live 1.15.1 APK file on your Android device, you will need to allow unknown apps, download the APK file from a reliable source, and install the APK file on your device. You can then launch the app and enjoy its features.
-Some tips for using Quick Live app are:
-Here are some frequently asked questions about Quick Live app and APK files:
-A: Quick Live app is generally safe to use as long as you download it from a trusted source and scan it for any malware or viruses before installing it. However, you should also be careful about what information you share or what permissions you grant to the app.
-A: If you have downloaded Quick Live app from a third-party source, you will not receive automatic updates from the Google Play Store. You will need to check for updates manually from the source where you downloaded the APK file. Alternatively, you can uninstall the app and install the latest version of the APK file from the same or a different source.
-A: If you want to uninstall Quick Live app from your device, you can do so by following these steps:
-A: If you have any questions, feedback, or issues with Quick Live app, you can contact the app support team by using the following methods:
-A: To run Quick Live app on your device, you will need to have the following minimum requirements:
-A: To stream to multiple platforms with Quick Live app, you will need to follow these steps:
-If you are a fan of motorbike racing games, you might want to check out Xtreme Motorbikes Raja APK. This is a free Android game that lets you experience the thrill of riding powerful and exciting sport motorbikes on realistic tracks. In this article, we will review the features, gameplay, pros and cons of Xtreme Motorbikes Raja APK, and also show you how to download and install it on your device.
-DOWNLOAD ✏ ✏ ✏ https://urlca.com/2uO68c
Xtreme Motorbikes Raja APK is an Android game developed by Mehdi Rabiee. It is a simulation game that lets you drive more than 20 different motorbikes with realistic physics, sounds, and graphics. You can customize your motorbikes with exclusive paint jobs and rims, and choose from various game modes and challenges to test your skills. You can also compete with other players online or offline, and share your achievements on social media.
-One of the main attractions of Xtreme Motorbikes Raja APK is its realistic physics engine, which simulates every aspect of motorbike behavior. You can feel the acceleration, braking, steering, suspension, and traction of your motorbike as you race on different tracks. The game also has stunning 3D graphics that create an immersive environment for your racing experience. You can see the details of your motorbike, the track, the weather, and the surroundings as you zoom past them.
-Xtreme Motorbikes Raja APK offers you a wide range of motorbikes to choose from, each with its own specifications and performance. You can drive more than 20 powerful and exciting sport motorbikes, such as Ducati, Yamaha, Kawasaki, Honda, Suzuki, BMW, and more. You can also customize your motorbikes with exclusive paint jobs and rims, to make them look more stylish and unique. You can change the color, design, pattern, and material of your motorbike's body and wheels.
-xtreme motorbikes raja apk download
-xtreme motorbikes raja apk mod
-xtreme motorbikes raja apk free
-xtreme motorbikes raja apk latest version
-xtreme motorbikes raja apk android
-xtreme motorbikes raja apk offline
-xtreme motorbikes raja apk unlimited money
-xtreme motorbikes raja apk hack
-xtreme motorbikes raja apk gameplay
-xtreme motorbikes raja apk review
-xtreme motorbikes raja apk full
-xtreme motorbikes raja apk update
-xtreme motorbikes raja apk 2023
-xtreme motorbikes raja apk cheats
-xtreme motorbikes raja apk online
-xtreme motorbikes raja apk for pc
-xtreme motorbikes raja apk obb
-xtreme motorbikes raja apk data
-xtreme motorbikes raja apk features
-xtreme motorbikes raja apk requirements
-xtreme motorbikes raja apk size
-xtreme motorbikes raja apk best bikes
-xtreme motorbikes raja apk tips and tricks
-xtreme motorbikes raja apk graphics
-xtreme motorbikes raja apk sound effects
-xtreme motorbikes raja apk realistic physics
-xtreme motorbikes raja apk customization options
-xtreme motorbikes raja apk sport bikes
-xtreme motorbikes raja apk turbocharger
-xtreme motorbikes raja apk gearbox and tires sounds
-xtreme motorbikes raja apk 3d graphics
-xtreme motorbikes raja apk racing game
-xtreme motorbikes raja apk simulation game
-xtreme motorbikes raja apk fun game
-xtreme motorbikes raja apk addictive game
-xtreme motorbikes raja apk challenging game
-xtreme motorbikes raja apk levels and modes
-xtreme motorbikes raja apk controls and settings
-xtreme motorbikes raja apk how to play
-xtreme motorbikes raja apk how to install
Xtreme Motorbikes Raja APK has various game modes and challenges to keep you entertained and challenged. You can choose from free ride mode, where you can explore the tracks at your own pace; career mode, where you can complete missions and earn money; time trial mode, where you can race against the clock; or multiplayer mode, where you can race against other players online or offline. You can also take part in different events and tournaments, where you can win prizes and trophies.
-To download and install Xtreme Motorbikes Raja APK on your device, you need to have an Android device that runs on Android 5.0 or higher. You also need to have at least 100 MB of free storage space on your device. The game is compatible with most Android devices, but some features may vary depending on your device's specifications.
-To download and install Xtreme Motorbikes Raja APK on your device, follow these steps:
-To play Xtreme Motorbikes Raja APK, you need to master the controls and the physics of your motorbike. Here are some tips and tricks to help you:
-Xtreme Motorbikes Raja APK is a fun and addictive game that offers many benefits, such as:
-Xtreme Motorbikes Raja APK also has some drawbacks, such as:
-Xtreme Motorbikes Raja APK is a great game for motorbike racing enthusiasts who want to enjoy a realistic and exciting racing experience on their Android devices. It has many features, game modes, challenges, and motorbikes that make it fun and addictive. It is also free to download and play, with no in-app purchases or ads. However, it also has some drawbacks, such as requiring a lot of storage space, a good internet connection, and a compatible device. It may also have some bugs or glitches, or be too difficult for some players. Overall, Xtreme Motorbikes Raja APK is a game worth trying if you love motorbike racing games.
-Here are some frequently asked questions about Xtreme Motorbikes Raja APK:
-The latest version of Xtreme Motorbikes Raja APK is 1.0.1, which was released on June 15, 2023. It fixed some minor bugs and improved the performance of the game.
-Xtreme Motorbikes Raja APK is safe to download and install, as long as you download it from a trusted source, such as [Xtreme Motorbikes] website. It does not contain any viruses, malware, or spyware that can harm your device or data. However, you should always be careful when downloading any file from unknown sources, as they may contain harmful content or software.
-You can play Xtreme Motorbikes Raja APK offline, but you will not be able to access some features, such as multiplayer mode, online events, or social media sharing. You will also need an internet connection to download and install the game, and to update it when needed.
-If you have any questions, feedback, or suggestions about Xtreme Motorbikes Raja APK, you can contact the developer of the game by sending an email to mehdirabiee@gmail.com. You can also visit their [Facebook page] or their [YouTube channel] to get more information and updates about the game.
-If you are looking for some other motorbike racing games for your Android device, you can try some of these alternatives:
-DOWNLOAD ⇔ https://ssurll.com/2uzyHt
My aim here is not to show that the modern theory is incorrect, but to point out that it has little to say about the population question. I will say more about population in my last chapter. To assess the broad outlines of the modern theory and to gain perspective on the questions asked by the classical theory, it helps to focus on the most recent version of the modern theory. It is best introduced as a complement to the first version, which is the classic growth model, in the sense that the Malthusian theory is complementary to Ricardo's law of rent.
-Download File ✑ ✑ ✑ https://ssurll.com/2uzywE
Things changed around the year 1800. Technology, capital, and population growth became decoupled: technical progress could be used for manufacturing goods that could be sold to customers at a higher price than the cost of the raw materials needed to produce them. More resources were available to the producers of goods, and there were more consumers who could afford the products. Moreover, as the Smithian growth mechanism could no longer work, a new growth mechanism, a new growth process, had to be invented. Yet a growing number of people were producing a growing output of a broad range of products. In the year 1800, the Industrial Revolution had not occurred as envisioned by Adam Smith, but Smith was correct to see it as a significant step in the growth process. The discovery of the Leipzig gas fuses and the steam engine had been a step in this direction.
-Problems, at least in the relatively wealthy societies that experienced the Industrial Revolution, were most commonly solved not by science, invention, or innovation but by changing the way people behave. Most of these changes were forced on the system, usually by demagogues who responded to problems, real or perceived, by demanding radical innovations that threatened the old regime. But although new opinions about the best ways to organize societies and industries produced new policies and legislation, most individuals still believed that they were free to organize themselves, as they wished, on the basis of the old social conventions and their particular circumstances. As long as they managed to do this, they usually avoided the disaster of political and religious revolution that characterized much of the history of the Western world. (For example, the struggle to change the institution of slavery was thought to be a fundamental part of the development of capitalism.)
-(function () {
- function addCopyButton(pre) {
- var code = pre.querySelector('code'); // reconstructed opening: assumes each <pre> wraps a <code> element
- if (!code) {
- return; // if there is no <code> element, don't add a button
- }
- var firstChild = code.firstChild;
- if (!firstChild) {
- return; // if the <code> element has no child nodes, don't add a button
- }
- var button = document.createElement('button');
- button.textContent = '\uD83D\uDCCE'; // use the 📎 symbol as the "copy" button's label
- button.style.position = 'relative';
- button.style.float = 'right';
- button.style.fontSize = '1em'; // optional: adjust the button size
- button.style.background = 'none'; // optional: remove the background color
- button.style.border = 'none'; // optional: remove the border
- button.style.cursor = 'pointer'; // optional: show a pointer cursor
- button.addEventListener('click', function () {
- var range = document.createRange();
- range.selectNodeContents(code);
- range.setStartBefore(firstChild); // move the range start to just before the original first child, so the button itself is not copied
- var selection = window.getSelection();
- selection.removeAllRanges();
- selection.addRange(range);
-
- try {
- var success = document.execCommand('copy');
- if (success) {
- button.textContent = '\u2714';
- setTimeout(function () {
- button.textContent = '\uD83D\uDCCE'; // restore the "copy" button label
- }, 2000);
- } else {
- button.textContent = '\u2716';
- }
- } catch (e) {
- console.error(e);
- button.textContent = '\u2716';
- }
-
- selection.removeAllRanges();
- });
- code.insertBefore(button, firstChild); // insert the button before the first child element
- }
-
- function handleNewElements(mutationsList, observer) {
- for (var mutation of mutationsList) {
- if (mutation.type === 'childList') {
- for (var node of mutation.addedNodes) {
- if (node.nodeName === 'PRE') {
- addCopyButton(node);
- }
- }
- }
- }
- }
-
- var observer = new MutationObserver(handleNewElements);
- observer.observe(document.documentElement, { childList: true, subtree: true });
-
- document.querySelectorAll('pre').forEach(addCopyButton);
-})();
diff --git a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fastapi/types.py b/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fastapi/types.py
deleted file mode 100644
index 7adf565a7b6b7d4f1eed3adf6a96faab66fe517c..0000000000000000000000000000000000000000
--- a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fastapi/types.py
+++ /dev/null
@@ -1,11 +0,0 @@
-import types
-from enum import Enum
-from typing import Any, Callable, Dict, Set, Type, TypeVar, Union
-
-from pydantic import BaseModel
-
-DecoratedCallable = TypeVar("DecoratedCallable", bound=Callable[..., Any])
-UnionType = getattr(types, "UnionType", Union)
-NoneType = getattr(types, "NoneType", None)
-ModelNameMap = Dict[Union[Type[BaseModel], Type[Enum]], str]
-IncEx = Union[Set[int], Set[str], Dict[int, Any], Dict[str, Any]]
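The `getattr(types, name, fallback)` lines in the deleted `types.py` above keep the module importable on interpreters that lack an attribute; for instance, `types.UnionType` was only added in Python 3.10. A minimal sketch of that pattern:

```python
import types
from typing import Union

# If the interpreter provides the attribute (types.UnionType exists on
# Python 3.10+ as the type of `int | str`), use it; otherwise fall back
# to the typing equivalent, so the import works on older versions too.
UnionType = getattr(types, "UnionType", Union)

# getattr with a default never raises for a missing attribute:
assert getattr(types, "DoesNotExist", Union) is Union
```

The same three-argument `getattr` trick is a common way to write version-portable module-level aliases without an explicit `try`/`except AttributeError`.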
diff --git a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fontTools/varLib/iup.c b/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fontTools/varLib/iup.c
deleted file mode 100644
index 64891ceac1811e09965675714ea67d1829d4ccb5..0000000000000000000000000000000000000000
--- a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/fontTools/varLib/iup.c
+++ /dev/null
@@ -1,17999 +0,0 @@
-/* Generated by Cython 3.0.0 */
-
-/* BEGIN: Cython Metadata
-{
- "distutils": {
- "name": "fontTools.varLib.iup",
- "sources": [
- "Lib/fontTools/varLib/iup.py"
- ]
- },
- "module_name": "fontTools.varLib.iup"
-}
-END: Cython Metadata */
-
-#ifndef PY_SSIZE_T_CLEAN
-#define PY_SSIZE_T_CLEAN
-#endif /* PY_SSIZE_T_CLEAN */
-#if defined(CYTHON_LIMITED_API) && 0
- #ifndef Py_LIMITED_API
- #if CYTHON_LIMITED_API+0 > 0x03030000
- #define Py_LIMITED_API CYTHON_LIMITED_API
- #else
- #define Py_LIMITED_API 0x03030000
- #endif
- #endif
-#endif
-
-#include "Python.h"
-#ifndef Py_PYTHON_H
- #error Python headers needed to compile C extensions, please install development version of Python.
-#elif PY_VERSION_HEX < 0x02070000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000)
- #error Cython requires Python 2.7+ or Python 3.3+.
-#else
-#define CYTHON_ABI "3_0_0"
-#define __PYX_ABI_MODULE_NAME "_cython_" CYTHON_ABI
-#define __PYX_TYPE_MODULE_PREFIX __PYX_ABI_MODULE_NAME "."
-#define CYTHON_HEX_VERSION 0x030000F0
-#define CYTHON_FUTURE_DIVISION 1
-#include <stddef.h>
-#ifndef offsetof
- #define offsetof(type, member) ( (size_t) & ((type*)0) -> member )
-#endif
-#if !defined(_WIN32) && !defined(WIN32) && !defined(MS_WINDOWS)
- #ifndef __stdcall
- #define __stdcall
- #endif
- #ifndef __cdecl
- #define __cdecl
- #endif
- #ifndef __fastcall
- #define __fastcall
- #endif
-#endif
-#ifndef DL_IMPORT
- #define DL_IMPORT(t) t
-#endif
-#ifndef DL_EXPORT
- #define DL_EXPORT(t) t
-#endif
-#define __PYX_COMMA ,
-#ifndef HAVE_LONG_LONG
- #define HAVE_LONG_LONG
-#endif
-#ifndef PY_LONG_LONG
- #define PY_LONG_LONG LONG_LONG
-#endif
-#ifndef Py_HUGE_VAL
- #define Py_HUGE_VAL HUGE_VAL
-#endif
-#if defined(GRAALVM_PYTHON)
- /* For very preliminary testing purposes. Most variables are set the same as PyPy.
- The existence of this section does not imply that anything works or is even tested */
- #define CYTHON_COMPILING_IN_PYPY 0
- #define CYTHON_COMPILING_IN_CPYTHON 0
- #define CYTHON_COMPILING_IN_LIMITED_API 0
- #define CYTHON_COMPILING_IN_GRAAL 1
- #define CYTHON_COMPILING_IN_NOGIL 0
- #undef CYTHON_USE_TYPE_SLOTS
- #define CYTHON_USE_TYPE_SLOTS 0
- #undef CYTHON_USE_TYPE_SPECS
- #define CYTHON_USE_TYPE_SPECS 0
- #undef CYTHON_USE_PYTYPE_LOOKUP
- #define CYTHON_USE_PYTYPE_LOOKUP 0
- #if PY_VERSION_HEX < 0x03050000
- #undef CYTHON_USE_ASYNC_SLOTS
- #define CYTHON_USE_ASYNC_SLOTS 0
- #elif !defined(CYTHON_USE_ASYNC_SLOTS)
- #define CYTHON_USE_ASYNC_SLOTS 1
- #endif
- #undef CYTHON_USE_PYLIST_INTERNALS
- #define CYTHON_USE_PYLIST_INTERNALS 0
- #undef CYTHON_USE_UNICODE_INTERNALS
- #define CYTHON_USE_UNICODE_INTERNALS 0
- #undef CYTHON_USE_UNICODE_WRITER
- #define CYTHON_USE_UNICODE_WRITER 0
- #undef CYTHON_USE_PYLONG_INTERNALS
- #define CYTHON_USE_PYLONG_INTERNALS 0
- #undef CYTHON_AVOID_BORROWED_REFS
- #define CYTHON_AVOID_BORROWED_REFS 1
- #undef CYTHON_ASSUME_SAFE_MACROS
- #define CYTHON_ASSUME_SAFE_MACROS 0
- #undef CYTHON_UNPACK_METHODS
- #define CYTHON_UNPACK_METHODS 0
- #undef CYTHON_FAST_THREAD_STATE
- #define CYTHON_FAST_THREAD_STATE 0
- #undef CYTHON_FAST_GIL
- #define CYTHON_FAST_GIL 0
- #undef CYTHON_METH_FASTCALL
- #define CYTHON_METH_FASTCALL 0
- #undef CYTHON_FAST_PYCALL
- #define CYTHON_FAST_PYCALL 0
- #ifndef CYTHON_PEP487_INIT_SUBCLASS
- #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3)
- #endif
- #undef CYTHON_PEP489_MULTI_PHASE_INIT
- #define CYTHON_PEP489_MULTI_PHASE_INIT 1
- #undef CYTHON_USE_MODULE_STATE
- #define CYTHON_USE_MODULE_STATE 0
- #undef CYTHON_USE_TP_FINALIZE
- #define CYTHON_USE_TP_FINALIZE 0
- #undef CYTHON_USE_DICT_VERSIONS
- #define CYTHON_USE_DICT_VERSIONS 0
- #undef CYTHON_USE_EXC_INFO_STACK
- #define CYTHON_USE_EXC_INFO_STACK 0
- #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC
- #define CYTHON_UPDATE_DESCRIPTOR_DOC 0
- #endif
-#elif defined(PYPY_VERSION)
- #define CYTHON_COMPILING_IN_PYPY 1
- #define CYTHON_COMPILING_IN_CPYTHON 0
- #define CYTHON_COMPILING_IN_LIMITED_API 0
- #define CYTHON_COMPILING_IN_GRAAL 0
- #define CYTHON_COMPILING_IN_NOGIL 0
- #undef CYTHON_USE_TYPE_SLOTS
- #define CYTHON_USE_TYPE_SLOTS 0
- #undef CYTHON_USE_TYPE_SPECS
- #define CYTHON_USE_TYPE_SPECS 0
- #undef CYTHON_USE_PYTYPE_LOOKUP
- #define CYTHON_USE_PYTYPE_LOOKUP 0
- #if PY_VERSION_HEX < 0x03050000
- #undef CYTHON_USE_ASYNC_SLOTS
- #define CYTHON_USE_ASYNC_SLOTS 0
- #elif !defined(CYTHON_USE_ASYNC_SLOTS)
- #define CYTHON_USE_ASYNC_SLOTS 1
- #endif
- #undef CYTHON_USE_PYLIST_INTERNALS
- #define CYTHON_USE_PYLIST_INTERNALS 0
- #undef CYTHON_USE_UNICODE_INTERNALS
- #define CYTHON_USE_UNICODE_INTERNALS 0
- #undef CYTHON_USE_UNICODE_WRITER
- #define CYTHON_USE_UNICODE_WRITER 0
- #undef CYTHON_USE_PYLONG_INTERNALS
- #define CYTHON_USE_PYLONG_INTERNALS 0
- #undef CYTHON_AVOID_BORROWED_REFS
- #define CYTHON_AVOID_BORROWED_REFS 1
- #undef CYTHON_ASSUME_SAFE_MACROS
- #define CYTHON_ASSUME_SAFE_MACROS 0
- #undef CYTHON_UNPACK_METHODS
- #define CYTHON_UNPACK_METHODS 0
- #undef CYTHON_FAST_THREAD_STATE
- #define CYTHON_FAST_THREAD_STATE 0
- #undef CYTHON_FAST_GIL
- #define CYTHON_FAST_GIL 0
- #undef CYTHON_METH_FASTCALL
- #define CYTHON_METH_FASTCALL 0
- #undef CYTHON_FAST_PYCALL
- #define CYTHON_FAST_PYCALL 0
- #ifndef CYTHON_PEP487_INIT_SUBCLASS
- #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3)
- #endif
- #if PY_VERSION_HEX < 0x03090000
- #undef CYTHON_PEP489_MULTI_PHASE_INIT
- #define CYTHON_PEP489_MULTI_PHASE_INIT 0
- #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT)
- #define CYTHON_PEP489_MULTI_PHASE_INIT 1
- #endif
- #undef CYTHON_USE_MODULE_STATE
- #define CYTHON_USE_MODULE_STATE 0
- #undef CYTHON_USE_TP_FINALIZE
- #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1 && PYPY_VERSION_NUM >= 0x07030C00)
- #undef CYTHON_USE_DICT_VERSIONS
- #define CYTHON_USE_DICT_VERSIONS 0
- #undef CYTHON_USE_EXC_INFO_STACK
- #define CYTHON_USE_EXC_INFO_STACK 0
- #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC
- #define CYTHON_UPDATE_DESCRIPTOR_DOC 0
- #endif
-#elif defined(CYTHON_LIMITED_API)
- #define CYTHON_COMPILING_IN_PYPY 0
- #define CYTHON_COMPILING_IN_CPYTHON 0
- #define CYTHON_COMPILING_IN_LIMITED_API 1
- #define CYTHON_COMPILING_IN_GRAAL 0
- #define CYTHON_COMPILING_IN_NOGIL 0
- #undef CYTHON_CLINE_IN_TRACEBACK
- #define CYTHON_CLINE_IN_TRACEBACK 0
- #undef CYTHON_USE_TYPE_SLOTS
- #define CYTHON_USE_TYPE_SLOTS 0
- #undef CYTHON_USE_TYPE_SPECS
- #define CYTHON_USE_TYPE_SPECS 1
- #undef CYTHON_USE_PYTYPE_LOOKUP
- #define CYTHON_USE_PYTYPE_LOOKUP 0
- #undef CYTHON_USE_ASYNC_SLOTS
- #define CYTHON_USE_ASYNC_SLOTS 0
- #undef CYTHON_USE_PYLIST_INTERNALS
- #define CYTHON_USE_PYLIST_INTERNALS 0
- #undef CYTHON_USE_UNICODE_INTERNALS
- #define CYTHON_USE_UNICODE_INTERNALS 0
- #ifndef CYTHON_USE_UNICODE_WRITER
- #define CYTHON_USE_UNICODE_WRITER 0
- #endif
- #undef CYTHON_USE_PYLONG_INTERNALS
- #define CYTHON_USE_PYLONG_INTERNALS 0
- #ifndef CYTHON_AVOID_BORROWED_REFS
- #define CYTHON_AVOID_BORROWED_REFS 0
- #endif
- #undef CYTHON_ASSUME_SAFE_MACROS
- #define CYTHON_ASSUME_SAFE_MACROS 0
- #undef CYTHON_UNPACK_METHODS
- #define CYTHON_UNPACK_METHODS 0
- #undef CYTHON_FAST_THREAD_STATE
- #define CYTHON_FAST_THREAD_STATE 0
- #undef CYTHON_FAST_GIL
- #define CYTHON_FAST_GIL 0
- #undef CYTHON_METH_FASTCALL
- #define CYTHON_METH_FASTCALL 0
- #undef CYTHON_FAST_PYCALL
- #define CYTHON_FAST_PYCALL 0
- #ifndef CYTHON_PEP487_INIT_SUBCLASS
- #define CYTHON_PEP487_INIT_SUBCLASS 1
- #endif
- #undef CYTHON_PEP489_MULTI_PHASE_INIT
- #define CYTHON_PEP489_MULTI_PHASE_INIT 0
- #undef CYTHON_USE_MODULE_STATE
- #define CYTHON_USE_MODULE_STATE 1
- #ifndef CYTHON_USE_TP_FINALIZE
- #define CYTHON_USE_TP_FINALIZE 1
- #endif
- #undef CYTHON_USE_DICT_VERSIONS
- #define CYTHON_USE_DICT_VERSIONS 0
- #undef CYTHON_USE_EXC_INFO_STACK
- #define CYTHON_USE_EXC_INFO_STACK 0
- #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC
- #define CYTHON_UPDATE_DESCRIPTOR_DOC 0
- #endif
-#elif defined(PY_NOGIL)
- #define CYTHON_COMPILING_IN_PYPY 0
- #define CYTHON_COMPILING_IN_CPYTHON 0
- #define CYTHON_COMPILING_IN_LIMITED_API 0
- #define CYTHON_COMPILING_IN_GRAAL 0
- #define CYTHON_COMPILING_IN_NOGIL 1
- #ifndef CYTHON_USE_TYPE_SLOTS
- #define CYTHON_USE_TYPE_SLOTS 1
- #endif
- #undef CYTHON_USE_PYTYPE_LOOKUP
- #define CYTHON_USE_PYTYPE_LOOKUP 0
- #ifndef CYTHON_USE_ASYNC_SLOTS
- #define CYTHON_USE_ASYNC_SLOTS 1
- #endif
- #undef CYTHON_USE_PYLIST_INTERNALS
- #define CYTHON_USE_PYLIST_INTERNALS 0
- #ifndef CYTHON_USE_UNICODE_INTERNALS
- #define CYTHON_USE_UNICODE_INTERNALS 1
- #endif
- #undef CYTHON_USE_UNICODE_WRITER
- #define CYTHON_USE_UNICODE_WRITER 0
- #undef CYTHON_USE_PYLONG_INTERNALS
- #define CYTHON_USE_PYLONG_INTERNALS 0
- #ifndef CYTHON_AVOID_BORROWED_REFS
- #define CYTHON_AVOID_BORROWED_REFS 0
- #endif
- #ifndef CYTHON_ASSUME_SAFE_MACROS
- #define CYTHON_ASSUME_SAFE_MACROS 1
- #endif
- #ifndef CYTHON_UNPACK_METHODS
- #define CYTHON_UNPACK_METHODS 1
- #endif
- #undef CYTHON_FAST_THREAD_STATE
- #define CYTHON_FAST_THREAD_STATE 0
- #undef CYTHON_FAST_PYCALL
- #define CYTHON_FAST_PYCALL 0
- #ifndef CYTHON_PEP489_MULTI_PHASE_INIT
- #define CYTHON_PEP489_MULTI_PHASE_INIT 1
- #endif
- #ifndef CYTHON_USE_TP_FINALIZE
- #define CYTHON_USE_TP_FINALIZE 1
- #endif
- #undef CYTHON_USE_DICT_VERSIONS
- #define CYTHON_USE_DICT_VERSIONS 0
- #undef CYTHON_USE_EXC_INFO_STACK
- #define CYTHON_USE_EXC_INFO_STACK 0
-#else
- #define CYTHON_COMPILING_IN_PYPY 0
- #define CYTHON_COMPILING_IN_CPYTHON 1
- #define CYTHON_COMPILING_IN_LIMITED_API 0
- #define CYTHON_COMPILING_IN_GRAAL 0
- #define CYTHON_COMPILING_IN_NOGIL 0
- #ifndef CYTHON_USE_TYPE_SLOTS
- #define CYTHON_USE_TYPE_SLOTS 1
- #endif
- #ifndef CYTHON_USE_TYPE_SPECS
- #define CYTHON_USE_TYPE_SPECS 0
- #endif
- #ifndef CYTHON_USE_PYTYPE_LOOKUP
- #define CYTHON_USE_PYTYPE_LOOKUP 1
- #endif
- #if PY_MAJOR_VERSION < 3
- #undef CYTHON_USE_ASYNC_SLOTS
- #define CYTHON_USE_ASYNC_SLOTS 0
- #elif !defined(CYTHON_USE_ASYNC_SLOTS)
- #define CYTHON_USE_ASYNC_SLOTS 1
- #endif
- #ifndef CYTHON_USE_PYLONG_INTERNALS
- #define CYTHON_USE_PYLONG_INTERNALS 1
- #endif
- #ifndef CYTHON_USE_PYLIST_INTERNALS
- #define CYTHON_USE_PYLIST_INTERNALS 1
- #endif
- #ifndef CYTHON_USE_UNICODE_INTERNALS
- #define CYTHON_USE_UNICODE_INTERNALS 1
- #endif
- #if PY_VERSION_HEX < 0x030300F0 || PY_VERSION_HEX >= 0x030B00A2
- #undef CYTHON_USE_UNICODE_WRITER
- #define CYTHON_USE_UNICODE_WRITER 0
- #elif !defined(CYTHON_USE_UNICODE_WRITER)
- #define CYTHON_USE_UNICODE_WRITER 1
- #endif
- #ifndef CYTHON_AVOID_BORROWED_REFS
- #define CYTHON_AVOID_BORROWED_REFS 0
- #endif
- #ifndef CYTHON_ASSUME_SAFE_MACROS
- #define CYTHON_ASSUME_SAFE_MACROS 1
- #endif
- #ifndef CYTHON_UNPACK_METHODS
- #define CYTHON_UNPACK_METHODS 1
- #endif
- #ifndef CYTHON_FAST_THREAD_STATE
- #define CYTHON_FAST_THREAD_STATE 1
- #endif
- #ifndef CYTHON_FAST_GIL
- #define CYTHON_FAST_GIL (PY_MAJOR_VERSION < 3 || PY_VERSION_HEX >= 0x03060000 && PY_VERSION_HEX < 0x030C00A6)
- #endif
- #ifndef CYTHON_METH_FASTCALL
- #define CYTHON_METH_FASTCALL (PY_VERSION_HEX >= 0x030700A1)
- #endif
- #ifndef CYTHON_FAST_PYCALL
- #define CYTHON_FAST_PYCALL 1
- #endif
- #ifndef CYTHON_PEP487_INIT_SUBCLASS
- #define CYTHON_PEP487_INIT_SUBCLASS 1
- #endif
- #if PY_VERSION_HEX < 0x03050000
- #undef CYTHON_PEP489_MULTI_PHASE_INIT
- #define CYTHON_PEP489_MULTI_PHASE_INIT 0
- #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT)
- #define CYTHON_PEP489_MULTI_PHASE_INIT 1
- #endif
- #ifndef CYTHON_USE_MODULE_STATE
- #define CYTHON_USE_MODULE_STATE 0
- #endif
- #if PY_VERSION_HEX < 0x030400a1
- #undef CYTHON_USE_TP_FINALIZE
- #define CYTHON_USE_TP_FINALIZE 0
- #elif !defined(CYTHON_USE_TP_FINALIZE)
- #define CYTHON_USE_TP_FINALIZE 1
- #endif
- #if PY_VERSION_HEX < 0x030600B1
- #undef CYTHON_USE_DICT_VERSIONS
- #define CYTHON_USE_DICT_VERSIONS 0
- #elif !defined(CYTHON_USE_DICT_VERSIONS)
- #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX < 0x030C00A5)
- #endif
- #if PY_VERSION_HEX < 0x030700A3
- #undef CYTHON_USE_EXC_INFO_STACK
- #define CYTHON_USE_EXC_INFO_STACK 0
- #elif !defined(CYTHON_USE_EXC_INFO_STACK)
- #define CYTHON_USE_EXC_INFO_STACK 1
- #endif
- #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC
- #define CYTHON_UPDATE_DESCRIPTOR_DOC 1
- #endif
-#endif
-#if !defined(CYTHON_FAST_PYCCALL)
-#define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1)
-#endif
-#if !defined(CYTHON_VECTORCALL)
-#define CYTHON_VECTORCALL (CYTHON_FAST_PYCCALL && PY_VERSION_HEX >= 0x030800B1)
-#endif
-#define CYTHON_BACKPORT_VECTORCALL (CYTHON_METH_FASTCALL && PY_VERSION_HEX < 0x030800B1)
-#if CYTHON_USE_PYLONG_INTERNALS
- #if PY_MAJOR_VERSION < 3
- #include "longintrepr.h"
- #endif
- #undef SHIFT
- #undef BASE
- #undef MASK
- #ifdef SIZEOF_VOID_P
- enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) };
- #endif
-#endif
-#ifndef __has_attribute
- #define __has_attribute(x) 0
-#endif
-#ifndef __has_cpp_attribute
- #define __has_cpp_attribute(x) 0
-#endif
-#ifndef CYTHON_RESTRICT
- #if defined(__GNUC__)
- #define CYTHON_RESTRICT __restrict__
- #elif defined(_MSC_VER) && _MSC_VER >= 1400
- #define CYTHON_RESTRICT __restrict
- #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
- #define CYTHON_RESTRICT restrict
- #else
- #define CYTHON_RESTRICT
- #endif
-#endif
-#ifndef CYTHON_UNUSED
- #if defined(__cplusplus)
- /* for clang __has_cpp_attribute(maybe_unused) is true even before C++17
- * but leads to warnings with -pedantic, since it is a C++17 feature */
- #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L)
- #if __has_cpp_attribute(maybe_unused)
- #define CYTHON_UNUSED [[maybe_unused]]
- #endif
- #endif
- #endif
-#endif
-#ifndef CYTHON_UNUSED
-# if defined(__GNUC__)
-# if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4))
-# define CYTHON_UNUSED __attribute__ ((__unused__))
-# else
-# define CYTHON_UNUSED
-# endif
-# elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER))
-# define CYTHON_UNUSED __attribute__ ((__unused__))
-# else
-# define CYTHON_UNUSED
-# endif
-#endif
-#ifndef CYTHON_UNUSED_VAR
-# if defined(__cplusplus)
- template<class T> void CYTHON_UNUSED_VAR( const T& ) { }
-# else
-# define CYTHON_UNUSED_VAR(x) (void)(x)
-# endif
-#endif
-#ifndef CYTHON_MAYBE_UNUSED_VAR
- #define CYTHON_MAYBE_UNUSED_VAR(x) CYTHON_UNUSED_VAR(x)
-#endif
-#ifndef CYTHON_NCP_UNUSED
-# if CYTHON_COMPILING_IN_CPYTHON
-# define CYTHON_NCP_UNUSED
-# else
-# define CYTHON_NCP_UNUSED CYTHON_UNUSED
-# endif
-#endif
-#define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None)
-#ifdef _MSC_VER
- #ifndef _MSC_STDINT_H_
- #if _MSC_VER < 1300
- typedef unsigned char uint8_t;
- typedef unsigned short uint16_t;
- typedef unsigned int uint32_t;
- #else
- typedef unsigned __int8 uint8_t;
- typedef unsigned __int16 uint16_t;
- typedef unsigned __int32 uint32_t;
- #endif
- #endif
- #if _MSC_VER < 1300
- #ifdef _WIN64
- typedef unsigned long long __pyx_uintptr_t;
- #else
- typedef unsigned int __pyx_uintptr_t;
- #endif
- #else
- #ifdef _WIN64
- typedef unsigned __int64 __pyx_uintptr_t;
- #else
- typedef unsigned __int32 __pyx_uintptr_t;
- #endif
- #endif
-#else
- #include <stdint.h>
- typedef uintptr_t __pyx_uintptr_t;
-#endif
-#ifndef CYTHON_FALLTHROUGH
- #if defined(__cplusplus)
- /* for clang __has_cpp_attribute(fallthrough) is true even before C++17
- * but leads to warnings with -pedantic, since it is a C++17 feature */
- #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L)
- #if __has_cpp_attribute(fallthrough)
- #define CYTHON_FALLTHROUGH [[fallthrough]]
- #endif
- #endif
- #ifndef CYTHON_FALLTHROUGH
- #if __has_cpp_attribute(clang::fallthrough)
- #define CYTHON_FALLTHROUGH [[clang::fallthrough]]
- #elif __has_cpp_attribute(gnu::fallthrough)
- #define CYTHON_FALLTHROUGH [[gnu::fallthrough]]
- #endif
- #endif
- #endif
- #ifndef CYTHON_FALLTHROUGH
- #if __has_attribute(fallthrough)
- #define CYTHON_FALLTHROUGH __attribute__((fallthrough))
- #else
- #define CYTHON_FALLTHROUGH
- #endif
- #endif
- #if defined(__clang__) && defined(__apple_build_version__)
- #if __apple_build_version__ < 7000000
- #undef CYTHON_FALLTHROUGH
- #define CYTHON_FALLTHROUGH
- #endif
- #endif
-#endif
-#ifdef __cplusplus
- template <typename T>
- struct __PYX_IS_UNSIGNED_IMPL {static const bool value = T(0) < T(-1);};
- #define __PYX_IS_UNSIGNED(type) (__PYX_IS_UNSIGNED_IMPL<type>::value)
-#else
- #define __PYX_IS_UNSIGNED(type) (((type)-1) > 0)
-#endif
-#if CYTHON_COMPILING_IN_PYPY == 1
- #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x030A0000)
-#else
- #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x03090000)
-#endif
-#define __PYX_REINTERPRET_FUNCION(func_pointer, other_pointer) ((func_pointer)(void(*)(void))(other_pointer))
-
-#ifndef CYTHON_INLINE
- #if defined(__clang__)
- #define CYTHON_INLINE __inline__ __attribute__ ((__unused__))
- #elif defined(__GNUC__)
- #define CYTHON_INLINE __inline__
- #elif defined(_MSC_VER)
- #define CYTHON_INLINE __inline
- #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
- #define CYTHON_INLINE inline
- #else
- #define CYTHON_INLINE
- #endif
-#endif
-
-#define __PYX_BUILD_PY_SSIZE_T "n"
-#define CYTHON_FORMAT_SSIZE_T "z"
-#if PY_MAJOR_VERSION < 3
- #define __Pyx_BUILTIN_MODULE_NAME "__builtin__"
- #define __Pyx_DefaultClassType PyClass_Type
- #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\
- PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)
-#else
- #define __Pyx_BUILTIN_MODULE_NAME "builtins"
- #define __Pyx_DefaultClassType PyType_Type
-#if PY_VERSION_HEX >= 0x030B00A1
- static CYTHON_INLINE PyCodeObject* __Pyx_PyCode_New(int a, int p, int k, int l, int s, int f,
- PyObject *code, PyObject *c, PyObject* n, PyObject *v,
- PyObject *fv, PyObject *cell, PyObject* fn,
- PyObject *name, int fline, PyObject *lnos) {
- PyObject *kwds=NULL, *argcount=NULL, *posonlyargcount=NULL, *kwonlyargcount=NULL;
- PyObject *nlocals=NULL, *stacksize=NULL, *flags=NULL, *replace=NULL, *empty=NULL;
- const char *fn_cstr=NULL;
- const char *name_cstr=NULL;
- PyCodeObject *co=NULL, *result=NULL;
- PyObject *type, *value, *traceback;
- PyErr_Fetch(&type, &value, &traceback);
- if (!(kwds=PyDict_New())) goto end;
- if (!(argcount=PyLong_FromLong(a))) goto end;
- if (PyDict_SetItemString(kwds, "co_argcount", argcount) != 0) goto end;
- if (!(posonlyargcount=PyLong_FromLong(p))) goto end;
- if (PyDict_SetItemString(kwds, "co_posonlyargcount", posonlyargcount) != 0) goto end;
- if (!(kwonlyargcount=PyLong_FromLong(k))) goto end;
- if (PyDict_SetItemString(kwds, "co_kwonlyargcount", kwonlyargcount) != 0) goto end;
- if (!(nlocals=PyLong_FromLong(l))) goto end;
- if (PyDict_SetItemString(kwds, "co_nlocals", nlocals) != 0) goto end;
- if (!(stacksize=PyLong_FromLong(s))) goto end;
- if (PyDict_SetItemString(kwds, "co_stacksize", stacksize) != 0) goto end;
- if (!(flags=PyLong_FromLong(f))) goto end;
- if (PyDict_SetItemString(kwds, "co_flags", flags) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_code", code) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_consts", c) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_names", n) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_varnames", v) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_freevars", fv) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_cellvars", cell) != 0) goto end;
- if (PyDict_SetItemString(kwds, "co_linetable", lnos) != 0) goto end;
- if (!(fn_cstr=PyUnicode_AsUTF8AndSize(fn, NULL))) goto end;
- if (!(name_cstr=PyUnicode_AsUTF8AndSize(name, NULL))) goto end;
- if (!(co = PyCode_NewEmpty(fn_cstr, name_cstr, fline))) goto end;
- if (!(replace = PyObject_GetAttrString((PyObject*)co, "replace"))) goto end;
- if (!(empty = PyTuple_New(0))) goto end;
- result = (PyCodeObject*) PyObject_Call(replace, empty, kwds);
- end:
- Py_XDECREF((PyObject*) co);
- Py_XDECREF(kwds);
- Py_XDECREF(argcount);
- Py_XDECREF(posonlyargcount);
- Py_XDECREF(kwonlyargcount);
- Py_XDECREF(nlocals);
- Py_XDECREF(stacksize);
- Py_XDECREF(replace);
- Py_XDECREF(empty);
- if (type) {
- PyErr_Restore(type, value, traceback);
- }
- return result;
- }
-#elif PY_VERSION_HEX >= 0x030800B2 && !CYTHON_COMPILING_IN_PYPY
- #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\
- PyCode_NewWithPosOnlyArgs(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)
-#else
- #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\
- PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)
-#endif
-#endif
-#if PY_VERSION_HEX >= 0x030900A4 || defined(Py_IS_TYPE)
- #define __Pyx_IS_TYPE(ob, type) Py_IS_TYPE(ob, type)
-#else
- #define __Pyx_IS_TYPE(ob, type) (((const PyObject*)ob)->ob_type == (type))
-#endif
-#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_Is)
- #define __Pyx_Py_Is(x, y) Py_Is(x, y)
-#else
- #define __Pyx_Py_Is(x, y) ((x) == (y))
-#endif
-#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsNone)
- #define __Pyx_Py_IsNone(ob) Py_IsNone(ob)
-#else
- #define __Pyx_Py_IsNone(ob) __Pyx_Py_Is((ob), Py_None)
-#endif
-#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsTrue)
- #define __Pyx_Py_IsTrue(ob) Py_IsTrue(ob)
-#else
- #define __Pyx_Py_IsTrue(ob) __Pyx_Py_Is((ob), Py_True)
-#endif
-#if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsFalse)
- #define __Pyx_Py_IsFalse(ob) Py_IsFalse(ob)
-#else
- #define __Pyx_Py_IsFalse(ob) __Pyx_Py_Is((ob), Py_False)
-#endif
-#define __Pyx_NoneAsNull(obj) (__Pyx_Py_IsNone(obj) ? NULL : (obj))
-#if PY_VERSION_HEX >= 0x030900F0 && !CYTHON_COMPILING_IN_PYPY
- #define __Pyx_PyObject_GC_IsFinalized(o) PyObject_GC_IsFinalized(o)
-#else
- #define __Pyx_PyObject_GC_IsFinalized(o) _PyGC_FINALIZED(o)
-#endif
-#ifndef CO_COROUTINE
- #define CO_COROUTINE 0x80
-#endif
-#ifndef CO_ASYNC_GENERATOR
- #define CO_ASYNC_GENERATOR 0x200
-#endif
-#ifndef Py_TPFLAGS_CHECKTYPES
- #define Py_TPFLAGS_CHECKTYPES 0
-#endif
-#ifndef Py_TPFLAGS_HAVE_INDEX
- #define Py_TPFLAGS_HAVE_INDEX 0
-#endif
-#ifndef Py_TPFLAGS_HAVE_NEWBUFFER
- #define Py_TPFLAGS_HAVE_NEWBUFFER 0
-#endif
-#ifndef Py_TPFLAGS_HAVE_FINALIZE
- #define Py_TPFLAGS_HAVE_FINALIZE 0
-#endif
-#ifndef Py_TPFLAGS_SEQUENCE
- #define Py_TPFLAGS_SEQUENCE 0
-#endif
-#ifndef Py_TPFLAGS_MAPPING
- #define Py_TPFLAGS_MAPPING 0
-#endif
-#ifndef METH_STACKLESS
- #define METH_STACKLESS 0
-#endif
-#if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL)
- #ifndef METH_FASTCALL
- #define METH_FASTCALL 0x80
- #endif
- typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs);
- typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args,
- Py_ssize_t nargs, PyObject *kwnames);
-#else
- #define __Pyx_PyCFunctionFast _PyCFunctionFast
- #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords
-#endif
-#if CYTHON_METH_FASTCALL
- #define __Pyx_METH_FASTCALL METH_FASTCALL
- #define __Pyx_PyCFunction_FastCall __Pyx_PyCFunctionFast
- #define __Pyx_PyCFunction_FastCallWithKeywords __Pyx_PyCFunctionFastWithKeywords
-#else
- #define __Pyx_METH_FASTCALL METH_VARARGS
- #define __Pyx_PyCFunction_FastCall PyCFunction
- #define __Pyx_PyCFunction_FastCallWithKeywords PyCFunctionWithKeywords
-#endif
-#if CYTHON_VECTORCALL
- #define __pyx_vectorcallfunc vectorcallfunc
- #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET PY_VECTORCALL_ARGUMENTS_OFFSET
- #define __Pyx_PyVectorcall_NARGS(n) PyVectorcall_NARGS((size_t)(n))
-#elif CYTHON_BACKPORT_VECTORCALL
- typedef PyObject *(*__pyx_vectorcallfunc)(PyObject *callable, PyObject *const *args,
- size_t nargsf, PyObject *kwnames);
- #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET ((size_t)1 << (8 * sizeof(size_t) - 1))
- #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(((size_t)(n)) & ~__Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET))
-#else
- #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET 0
- #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(n))
-#endif
-#if PY_VERSION_HEX < 0x030900B1
- #define __Pyx_PyType_FromModuleAndSpec(m, s, b) ((void)m, PyType_FromSpecWithBases(s, b))
- typedef PyObject *(*__Pyx_PyCMethod)(PyObject *, PyTypeObject *, PyObject *const *, size_t, PyObject *);
-#else
- #define __Pyx_PyType_FromModuleAndSpec(m, s, b) PyType_FromModuleAndSpec(m, s, b)
- #define __Pyx_PyCMethod PyCMethod
-#endif
-#ifndef METH_METHOD
- #define METH_METHOD 0x200
-#endif
-#if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc)
- #define PyObject_Malloc(s) PyMem_Malloc(s)
- #define PyObject_Free(p) PyMem_Free(p)
- #define PyObject_Realloc(p) PyMem_Realloc(p)
-#endif
-#if CYTHON_COMPILING_IN_LIMITED_API
- #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0)
- #define __Pyx_PyFrame_SetLineNumber(frame, lineno)
-#else
- #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0)
- #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno)
-#endif
-#if CYTHON_COMPILING_IN_LIMITED_API
- #define __Pyx_PyThreadState_Current PyThreadState_Get()
-#elif !CYTHON_FAST_THREAD_STATE
- #define __Pyx_PyThreadState_Current PyThreadState_GET()
-#elif PY_VERSION_HEX >= 0x03060000
- #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet()
-#elif PY_VERSION_HEX >= 0x03000000
- #define __Pyx_PyThreadState_Current PyThreadState_GET()
-#else
- #define __Pyx_PyThreadState_Current _PyThreadState_Current
-#endif
-#if CYTHON_COMPILING_IN_LIMITED_API
-static CYTHON_INLINE void *__Pyx_PyModule_GetState(PyObject *op)
-{
- void *result;
- result = PyModule_GetState(op);
- if (!result)
- Py_FatalError("Couldn't find the module state");
- return result;
-}
-#endif
-#define __Pyx_PyObject_GetSlot(obj, name, func_ctype) __Pyx_PyType_GetSlot(Py_TYPE(obj), name, func_ctype)
-#if CYTHON_COMPILING_IN_LIMITED_API
- #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((func_ctype) PyType_GetSlot((type), Py_##name))
-#else
- #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((type)->name)
-#endif
-#if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT)
-#include "pythread.h"
-#define Py_tss_NEEDS_INIT 0
-typedef int Py_tss_t;
-static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) {
- *key = PyThread_create_key();
- return 0;
-}
-static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) {
- Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t));
- *key = Py_tss_NEEDS_INIT;
- return key;
-}
-static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) {
- PyObject_Free(key);
-}
-static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) {
- return *key != Py_tss_NEEDS_INIT;
-}
-static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) {
- PyThread_delete_key(*key);
- *key = Py_tss_NEEDS_INIT;
-}
-static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) {
- return PyThread_set_key_value(*key, value);
-}
-static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) {
- return PyThread_get_key_value(*key);
-}
-#endif
-#if PY_MAJOR_VERSION < 3
- #if CYTHON_COMPILING_IN_PYPY
- #if PYPY_VERSION_NUM < 0x07030600
- #if defined(__cplusplus) && __cplusplus >= 201402L
- [[deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")]]
- #elif defined(__GNUC__) || defined(__clang__)
- __attribute__ ((__deprecated__("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")))
- #elif defined(_MSC_VER)
- __declspec(deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6"))
- #endif
- static CYTHON_INLINE int PyGILState_Check(void) {
- return 0;
- }
- #else // PYPY_VERSION_NUM < 0x07030600
- #endif // PYPY_VERSION_NUM < 0x07030600
- #else
- static CYTHON_INLINE int PyGILState_Check(void) {
- PyThreadState * tstate = _PyThreadState_Current;
- return tstate && (tstate == PyGILState_GetThisThreadState());
- }
- #endif
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized)
-#define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n))
-#else
-#define __Pyx_PyDict_NewPresized(n) PyDict_New()
-#endif
-#if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION
- #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y)
- #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y)
-#else
- #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y)
- #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y)
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX > 0x030600B4 && CYTHON_USE_UNICODE_INTERNALS
-#define __Pyx_PyDict_GetItemStrWithError(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash)
-static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStr(PyObject *dict, PyObject *name) {
- PyObject *res = __Pyx_PyDict_GetItemStrWithError(dict, name);
- if (res == NULL) PyErr_Clear();
- return res;
-}
-#elif PY_MAJOR_VERSION >= 3 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07020000)
-#define __Pyx_PyDict_GetItemStrWithError PyDict_GetItemWithError
-#define __Pyx_PyDict_GetItemStr PyDict_GetItem
-#else
-static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStrWithError(PyObject *dict, PyObject *name) {
-#if CYTHON_COMPILING_IN_PYPY
- return PyDict_GetItem(dict, name);
-#else
- PyDictEntry *ep;
- PyDictObject *mp = (PyDictObject*) dict;
- long hash = ((PyStringObject *) name)->ob_shash;
- assert(hash != -1);
- ep = (mp->ma_lookup)(mp, name, hash);
- if (ep == NULL) {
- return NULL;
- }
- return ep->me_value;
-#endif
-}
-#define __Pyx_PyDict_GetItemStr PyDict_GetItem
-#endif
-#if CYTHON_USE_TYPE_SLOTS
- #define __Pyx_PyType_GetFlags(tp) (((PyTypeObject *)tp)->tp_flags)
- #define __Pyx_PyType_HasFeature(type, feature) ((__Pyx_PyType_GetFlags(type) & (feature)) != 0)
- #define __Pyx_PyObject_GetIterNextFunc(obj) (Py_TYPE(obj)->tp_iternext)
-#else
- #define __Pyx_PyType_GetFlags(tp) (PyType_GetFlags((PyTypeObject *)tp))
- #define __Pyx_PyType_HasFeature(type, feature) PyType_HasFeature(type, feature)
- #define __Pyx_PyObject_GetIterNextFunc(obj) PyIter_Next
-#endif
-#if CYTHON_USE_TYPE_SPECS && PY_VERSION_HEX >= 0x03080000
-#define __Pyx_PyHeapTypeObject_GC_Del(obj) {\
- PyTypeObject *type = Py_TYPE(obj);\
- assert(__Pyx_PyType_HasFeature(type, Py_TPFLAGS_HEAPTYPE));\
- PyObject_GC_Del(obj);\
- Py_DECREF(type);\
-}
-#else
-#define __Pyx_PyHeapTypeObject_GC_Del(obj) PyObject_GC_Del(obj)
-#endif
-#if CYTHON_COMPILING_IN_LIMITED_API
- #define CYTHON_PEP393_ENABLED 1
- #define __Pyx_PyUnicode_READY(op) (0)
- #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GetLength(u)
- #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_ReadChar(u, i)
- #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((void)u, 1114111U)
- #define __Pyx_PyUnicode_KIND(u) ((void)u, (0))
- #define __Pyx_PyUnicode_DATA(u) ((void*)u)
- #define __Pyx_PyUnicode_READ(k, d, i) ((void)k, PyUnicode_ReadChar((PyObject*)(d), i))
- #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GetLength(u))
-#elif PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND)
- #define CYTHON_PEP393_ENABLED 1
- #if PY_VERSION_HEX >= 0x030C0000
- #define __Pyx_PyUnicode_READY(op) (0)
- #else
- #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\
- 0 : _PyUnicode_Ready((PyObject *)(op)))
- #endif
- #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u)
- #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i)
- #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u)
- #define __Pyx_PyUnicode_KIND(u) ((int)PyUnicode_KIND(u))
- #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u)
- #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i)
- #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, (Py_UCS4) ch)
- #if PY_VERSION_HEX >= 0x030C0000
- #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_LENGTH(u))
- #else
- #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03090000
- #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : ((PyCompactUnicodeObject *)(u))->wstr_length))
- #else
- #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u)))
- #endif
- #endif
-#else
- #define CYTHON_PEP393_ENABLED 0
- #define PyUnicode_1BYTE_KIND 1
- #define PyUnicode_2BYTE_KIND 2
- #define PyUnicode_4BYTE_KIND 4
- #define __Pyx_PyUnicode_READY(op) (0)
- #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u)
- #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i]))
- #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 65535U : 1114111U)
- #define __Pyx_PyUnicode_KIND(u) ((int)sizeof(Py_UNICODE))
- #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u))
- #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i]))
- #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = (Py_UNICODE) ch)
- #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u))
-#endif
-#if CYTHON_COMPILING_IN_PYPY
- #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b)
- #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b)
-#else
- #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b)
- #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\
- PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b))
-#endif
-#if CYTHON_COMPILING_IN_PYPY
- #if !defined(PyUnicode_DecodeUnicodeEscape)
- #define PyUnicode_DecodeUnicodeEscape(s, size, errors) PyUnicode_Decode(s, size, "unicode_escape", errors)
- #endif
- #if !defined(PyUnicode_Contains) || (PY_MAJOR_VERSION == 2 && PYPY_VERSION_NUM < 0x07030500)
- #undef PyUnicode_Contains
- #define PyUnicode_Contains(u, s) PySequence_Contains(u, s)
- #endif
- #if !defined(PyByteArray_Check)
- #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type)
- #endif
- #if !defined(PyObject_Format)
- #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt)
- #endif
-#endif
-#define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b))
-#define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? PyNumber_Remainder(a, b) : PyUnicode_Format(a, b))
-#if PY_MAJOR_VERSION >= 3
- #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b)
-#else
- #define __Pyx_PyString_Format(a, b) PyString_Format(a, b)
-#endif
-#if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII)
- #define PyObject_ASCII(o) PyObject_Repr(o)
-#endif
-#if PY_MAJOR_VERSION >= 3
- #define PyBaseString_Type PyUnicode_Type
- #define PyStringObject PyUnicodeObject
- #define PyString_Type PyUnicode_Type
- #define PyString_Check PyUnicode_Check
- #define PyString_CheckExact PyUnicode_CheckExact
-#ifndef PyObject_Unicode
- #define PyObject_Unicode PyObject_Str
-#endif
-#endif
-#if PY_MAJOR_VERSION >= 3
- #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj)
- #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj)
-#else
- #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj))
- #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj))
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON
- #define __Pyx_PySequence_ListKeepNew(obj)\
- (likely(PyList_CheckExact(obj) && Py_REFCNT(obj) == 1) ? __Pyx_NewRef(obj) : PySequence_List(obj))
-#else
- #define __Pyx_PySequence_ListKeepNew(obj) PySequence_List(obj)
-#endif
-#ifndef PySet_CheckExact
- #define PySet_CheckExact(obj) __Pyx_IS_TYPE(obj, &PySet_Type)
-#endif
-#if PY_VERSION_HEX >= 0x030900A4
- #define __Pyx_SET_REFCNT(obj, refcnt) Py_SET_REFCNT(obj, refcnt)
- #define __Pyx_SET_SIZE(obj, size) Py_SET_SIZE(obj, size)
-#else
- #define __Pyx_SET_REFCNT(obj, refcnt) Py_REFCNT(obj) = (refcnt)
- #define __Pyx_SET_SIZE(obj, size) Py_SIZE(obj) = (size)
-#endif
-#if CYTHON_ASSUME_SAFE_MACROS
- #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq)
-#else
- #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq)
-#endif
-#if PY_MAJOR_VERSION >= 3
- #define PyIntObject PyLongObject
- #define PyInt_Type PyLong_Type
- #define PyInt_Check(op) PyLong_Check(op)
- #define PyInt_CheckExact(op) PyLong_CheckExact(op)
- #define __Pyx_Py3Int_Check(op) PyLong_Check(op)
- #define __Pyx_Py3Int_CheckExact(op) PyLong_CheckExact(op)
- #define PyInt_FromString PyLong_FromString
- #define PyInt_FromUnicode PyLong_FromUnicode
- #define PyInt_FromLong PyLong_FromLong
- #define PyInt_FromSize_t PyLong_FromSize_t
- #define PyInt_FromSsize_t PyLong_FromSsize_t
- #define PyInt_AsLong PyLong_AsLong
- #define PyInt_AS_LONG PyLong_AS_LONG
- #define PyInt_AsSsize_t PyLong_AsSsize_t
- #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask
- #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask
- #define PyNumber_Int PyNumber_Long
-#else
- #define __Pyx_Py3Int_Check(op) (PyLong_Check(op) || PyInt_Check(op))
- #define __Pyx_Py3Int_CheckExact(op) (PyLong_CheckExact(op) || PyInt_CheckExact(op))
-#endif
-#if PY_MAJOR_VERSION >= 3
- #define PyBoolObject PyLongObject
-#endif
-#if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY
- #ifndef PyUnicode_InternFromString
- #define PyUnicode_InternFromString(s) PyUnicode_FromString(s)
- #endif
-#endif
-#if PY_VERSION_HEX < 0x030200A4
- typedef long Py_hash_t;
- #define __Pyx_PyInt_FromHash_t PyInt_FromLong
- #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsHash_t
-#else
- #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t
- #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsSsize_t
-#endif
-#if CYTHON_USE_ASYNC_SLOTS
- #if PY_VERSION_HEX >= 0x030500B1
- #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods
- #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async)
- #else
- #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved))
- #endif
-#else
- #define __Pyx_PyType_AsAsync(obj) NULL
-#endif
-#ifndef __Pyx_PyAsyncMethodsStruct
- typedef struct {
- unaryfunc am_await;
- unaryfunc am_aiter;
- unaryfunc am_anext;
- } __Pyx_PyAsyncMethodsStruct;
-#endif
-
-#if defined(_WIN32) || defined(WIN32) || defined(MS_WINDOWS)
- #if !defined(_USE_MATH_DEFINES)
- #define _USE_MATH_DEFINES
- #endif
-#endif
-#include <math.h>
-#ifdef NAN
-#define __PYX_NAN() ((float) NAN)
-#else
-static CYTHON_INLINE float __PYX_NAN() {
- float value;
- memset(&value, 0xFF, sizeof(value));
- return value;
-}
-#endif
-#if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL)
-#define __Pyx_truncl trunc
-#else
-#define __Pyx_truncl truncl
-#endif
-
-#define __PYX_MARK_ERR_POS(f_index, lineno) \
- { __pyx_filename = __pyx_f[f_index]; (void)__pyx_filename; __pyx_lineno = lineno; (void)__pyx_lineno; __pyx_clineno = __LINE__; (void)__pyx_clineno; }
-#define __PYX_ERR(f_index, lineno, Ln_error) \
- { __PYX_MARK_ERR_POS(f_index, lineno) goto Ln_error; }
-
-#ifdef CYTHON_EXTERN_C
- #undef __PYX_EXTERN_C
- #define __PYX_EXTERN_C CYTHON_EXTERN_C
-#elif defined(__PYX_EXTERN_C)
- #ifdef _MSC_VER
- #pragma message ("Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead.")
- #else
- #warning Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead.
- #endif
-#else
- #ifdef __cplusplus
- #define __PYX_EXTERN_C extern "C"
- #else
- #define __PYX_EXTERN_C extern
- #endif
-#endif
-
-#define __PYX_HAVE__fontTools__varLib__iup
-#define __PYX_HAVE_API__fontTools__varLib__iup
-/* Early includes */
-#ifdef _OPENMP
-#include <omp.h>
-#endif /* _OPENMP */
-
-#if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS)
-#define CYTHON_WITHOUT_ASSERTIONS
-#endif
-
-typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding;
- const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry;
-
-#define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0
-#define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0
-#define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8)
-#define __PYX_DEFAULT_STRING_ENCODING ""
-#define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString
-#define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize
-#define __Pyx_uchar_cast(c) ((unsigned char)c)
-#define __Pyx_long_cast(x) ((long)x)
-#define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\
- (sizeof(type) < sizeof(Py_ssize_t)) ||\
- (sizeof(type) > sizeof(Py_ssize_t) &&\
- likely(v < (type)PY_SSIZE_T_MAX ||\
- v == (type)PY_SSIZE_T_MAX) &&\
- (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\
- v == (type)PY_SSIZE_T_MIN))) ||\
- (sizeof(type) == sizeof(Py_ssize_t) &&\
- (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\
- v == (type)PY_SSIZE_T_MAX))) )
-static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) {
- return (size_t) i < (size_t) limit;
-}
-#if defined (__cplusplus) && __cplusplus >= 201103L
- #include <cstdlib>
- #define __Pyx_sst_abs(value) std::abs(value)
-#elif SIZEOF_INT >= SIZEOF_SIZE_T
- #define __Pyx_sst_abs(value) abs(value)
-#elif SIZEOF_LONG >= SIZEOF_SIZE_T
- #define __Pyx_sst_abs(value) labs(value)
-#elif defined (_MSC_VER)
- #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value))
-#elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
- #define __Pyx_sst_abs(value) llabs(value)
-#elif defined (__GNUC__)
- #define __Pyx_sst_abs(value) __builtin_llabs(value)
-#else
- #define __Pyx_sst_abs(value) ((value<0) ? -value : value)
-#endif
-static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*);
-static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length);
-#define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s))
-#define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l)
-#define __Pyx_PyBytes_FromString PyBytes_FromString
-#define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize
-static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*);
-#if PY_MAJOR_VERSION < 3
- #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString
- #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize
-#else
- #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString
- #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize
-#endif
-#define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s))
-#define __Pyx_PyObject_AsWritableString(s) ((char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s))
-#define __Pyx_PyObject_AsWritableSString(s) ((signed char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s))
-#define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s))
-#define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s))
-#define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s))
-#define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s)
-#define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s)
-#define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s)
-#define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s)
-#define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s)
-#if CYTHON_COMPILING_IN_LIMITED_API
-static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const wchar_t *u)
-{
- const wchar_t *u_end = u;
- while (*u_end++) ;
- return (size_t)(u_end - u - 1);
-}
-#else
-static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u)
-{
- const Py_UNICODE *u_end = u;
- while (*u_end++) ;
- return (size_t)(u_end - u - 1);
-}
-#endif
-#define __Pyx_PyUnicode_FromOrdinal(o) PyUnicode_FromOrdinal((int)o)
-#define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u))
-#define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode
-#define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode
-#define __Pyx_NewRef(obj) (Py_INCREF(obj), obj)
-#define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None)
-static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b);
-static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*);
-static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*);
-static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x);
-#define __Pyx_PySequence_Tuple(obj)\
- (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj))
-static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*);
-static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t);
-static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject*);
-#if CYTHON_ASSUME_SAFE_MACROS
-#define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x))
-#else
-#define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x)
-#endif
-#define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x))
-#if PY_MAJOR_VERSION >= 3
-#define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Long(x))
-#else
-#define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x))
-#endif
-#if CYTHON_USE_PYLONG_INTERNALS
- #if PY_VERSION_HEX >= 0x030C00A7
- #ifndef _PyLong_SIGN_MASK
- #define _PyLong_SIGN_MASK 3
- #endif
- #ifndef _PyLong_NON_SIZE_BITS
- #define _PyLong_NON_SIZE_BITS 3
- #endif
- #define __Pyx_PyLong_Sign(x) (((PyLongObject*)x)->long_value.lv_tag & _PyLong_SIGN_MASK)
- #define __Pyx_PyLong_IsNeg(x) ((__Pyx_PyLong_Sign(x) & 2) != 0)
- #define __Pyx_PyLong_IsNonNeg(x) (!__Pyx_PyLong_IsNeg(x))
- #define __Pyx_PyLong_IsZero(x) (__Pyx_PyLong_Sign(x) & 1)
- #define __Pyx_PyLong_IsPos(x) (__Pyx_PyLong_Sign(x) == 0)
- #define __Pyx_PyLong_CompactValueUnsigned(x) (__Pyx_PyLong_Digits(x)[0])
- #define __Pyx_PyLong_DigitCount(x) ((Py_ssize_t) (((PyLongObject*)x)->long_value.lv_tag >> _PyLong_NON_SIZE_BITS))
- #define __Pyx_PyLong_SignedDigitCount(x)\
- ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * __Pyx_PyLong_DigitCount(x))
- #if defined(PyUnstable_Long_IsCompact) && defined(PyUnstable_Long_CompactValue)
- #define __Pyx_PyLong_IsCompact(x) PyUnstable_Long_IsCompact((PyLongObject*) x)
- #define __Pyx_PyLong_CompactValue(x) PyUnstable_Long_CompactValue((PyLongObject*) x)
- #else
- #define __Pyx_PyLong_IsCompact(x) (((PyLongObject*)x)->long_value.lv_tag < (2 << _PyLong_NON_SIZE_BITS))
- #define __Pyx_PyLong_CompactValue(x) ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * (Py_ssize_t) __Pyx_PyLong_Digits(x)[0])
- #endif
- typedef Py_ssize_t __Pyx_compact_pylong;
- typedef size_t __Pyx_compact_upylong;
- #else // Py < 3.12
- #define __Pyx_PyLong_IsNeg(x) (Py_SIZE(x) < 0)
- #define __Pyx_PyLong_IsNonNeg(x) (Py_SIZE(x) >= 0)
- #define __Pyx_PyLong_IsZero(x) (Py_SIZE(x) == 0)
- #define __Pyx_PyLong_IsPos(x) (Py_SIZE(x) > 0)
- #define __Pyx_PyLong_CompactValueUnsigned(x) ((Py_SIZE(x) == 0) ? 0 : __Pyx_PyLong_Digits(x)[0])
- #define __Pyx_PyLong_DigitCount(x) __Pyx_sst_abs(Py_SIZE(x))
- #define __Pyx_PyLong_SignedDigitCount(x) Py_SIZE(x)
- #define __Pyx_PyLong_IsCompact(x) (Py_SIZE(x) == 0 || Py_SIZE(x) == 1 || Py_SIZE(x) == -1)
- #define __Pyx_PyLong_CompactValue(x)\
- ((Py_SIZE(x) == 0) ? (sdigit) 0 : ((Py_SIZE(x) < 0) ? -(sdigit)__Pyx_PyLong_Digits(x)[0] : (sdigit)__Pyx_PyLong_Digits(x)[0]))
- typedef sdigit __Pyx_compact_pylong;
- typedef digit __Pyx_compact_upylong;
- #endif
- #if PY_VERSION_HEX >= 0x030C00A5
- #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->long_value.ob_digit)
- #else
- #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->ob_digit)
- #endif
-#endif
-#if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII
-static int __Pyx_sys_getdefaultencoding_not_ascii;
-static int __Pyx_init_sys_getdefaultencoding_params(void) {
- PyObject* sys;
- PyObject* default_encoding = NULL;
- PyObject* ascii_chars_u = NULL;
- PyObject* ascii_chars_b = NULL;
- const char* default_encoding_c;
- sys = PyImport_ImportModule("sys");
- if (!sys) goto bad;
- default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL);
- Py_DECREF(sys);
- if (!default_encoding) goto bad;
- default_encoding_c = PyBytes_AsString(default_encoding);
- if (!default_encoding_c) goto bad;
- if (strcmp(default_encoding_c, "ascii") == 0) {
- __Pyx_sys_getdefaultencoding_not_ascii = 0;
- } else {
- char ascii_chars[128];
- int c;
- for (c = 0; c < 128; c++) {
- ascii_chars[c] = (char) c;
- }
- __Pyx_sys_getdefaultencoding_not_ascii = 1;
- ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL);
- if (!ascii_chars_u) goto bad;
- ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL);
- if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) {
- PyErr_Format(
- PyExc_ValueError,
- "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.",
- default_encoding_c);
- goto bad;
- }
- Py_DECREF(ascii_chars_u);
- Py_DECREF(ascii_chars_b);
- }
- Py_DECREF(default_encoding);
- return 0;
-bad:
- Py_XDECREF(default_encoding);
- Py_XDECREF(ascii_chars_u);
- Py_XDECREF(ascii_chars_b);
- return -1;
-}
-#endif
-#if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3
-#define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL)
-#else
-#define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL)
-#if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT
-static char* __PYX_DEFAULT_STRING_ENCODING;
-static int __Pyx_init_sys_getdefaultencoding_params(void) {
- PyObject* sys;
- PyObject* default_encoding = NULL;
- char* default_encoding_c;
- sys = PyImport_ImportModule("sys");
- if (!sys) goto bad;
- default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL);
- Py_DECREF(sys);
- if (!default_encoding) goto bad;
- default_encoding_c = PyBytes_AsString(default_encoding);
- if (!default_encoding_c) goto bad;
- __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1);
- if (!__PYX_DEFAULT_STRING_ENCODING) goto bad;
- strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c);
- Py_DECREF(default_encoding);
- return 0;
-bad:
- Py_XDECREF(default_encoding);
- return -1;
-}
-#endif
-#endif
-
-
-/* Test for GCC > 2.95 */
-#if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95)))
- #define likely(x) __builtin_expect(!!(x), 1)
- #define unlikely(x) __builtin_expect(!!(x), 0)
-#else /* !__GNUC__ or GCC < 2.95 */
- #define likely(x) (x)
- #define unlikely(x) (x)
-#endif /* __GNUC__ */
-static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; }
-
-#if !CYTHON_USE_MODULE_STATE
-static PyObject *__pyx_m = NULL;
-#endif
-static int __pyx_lineno;
-static int __pyx_clineno = 0;
-static const char * __pyx_cfilenm = __FILE__;
-static const char *__pyx_filename;
-
-/* #### Code section: filename_table ### */
-
-static const char *__pyx_f[] = {
- "Lib/fontTools/varLib/iup.py",
-};
-/* #### Code section: utility_code_proto_before_types ### */
-/* #### Code section: numeric_typedefs ### */
-/* #### Code section: complex_type_declarations ### */
-/* #### Code section: type_declarations ### */
-
-/*--- Type declarations ---*/
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr;
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize;
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr;
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr;
-struct __pyx_defaults;
-typedef struct __pyx_defaults __pyx_defaults;
-struct __pyx_defaults {
- PyObject *__pyx_arg_forced;
-};
-
-/* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr {
- PyObject_HEAD
- PyObject *__pyx_genexpr_arg_0;
- int __pyx_v_i;
- int __pyx_v_j;
- double __pyx_v_p;
- double __pyx_v_q;
- double __pyx_v_tolerance;
- double __pyx_v_x;
- double __pyx_v_y;
-};
-
-
-/* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize {
- PyObject_HEAD
- PyObject *__pyx_v_d0;
- PyObject *__pyx_v_tolerance;
-};
-
-
-/* "fontTools/varLib/iup.py":384
- *
- * # If all are within tolerance distance of 0, encode nothing:
- * if all(abs(complex(*p)) <= tolerance for p in deltas): # <<<<<<<<<<<<<<
- * return [None] * n
- *
- */
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr {
- PyObject_HEAD
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *__pyx_outer_scope;
- PyObject *__pyx_genexpr_arg_0;
- PyObject *__pyx_v_p;
-};
-
-
-/* "fontTools/varLib/iup.py":393
- * # If all deltas are exactly the same, return just one (the first one):
- * d0 = deltas[0]
- * if all(d0 == d for d in deltas): # <<<<<<<<<<<<<<
- * return [d0] + [None] * (n - 1)
- *
- */
-struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr {
- PyObject_HEAD
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *__pyx_outer_scope;
- PyObject *__pyx_genexpr_arg_0;
- PyObject *__pyx_v_d;
-};
-
-/* #### Code section: utility_code_proto ### */
-
-/* --- Runtime support code (head) --- */
-/* Refnanny.proto */
-#ifndef CYTHON_REFNANNY
- #define CYTHON_REFNANNY 0
-#endif
-#if CYTHON_REFNANNY
- typedef struct {
- void (*INCREF)(void*, PyObject*, Py_ssize_t);
- void (*DECREF)(void*, PyObject*, Py_ssize_t);
- void (*GOTREF)(void*, PyObject*, Py_ssize_t);
- void (*GIVEREF)(void*, PyObject*, Py_ssize_t);
- void* (*SetupContext)(const char*, Py_ssize_t, const char*);
- void (*FinishContext)(void**);
- } __Pyx_RefNannyAPIStruct;
- static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL;
- static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname);
- #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL;
-#ifdef WITH_THREAD
- #define __Pyx_RefNannySetupContext(name, acquire_gil)\
- if (acquire_gil) {\
- PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\
- __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\
- PyGILState_Release(__pyx_gilstate_save);\
- } else {\
- __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\
- }
- #define __Pyx_RefNannyFinishContextNogil() {\
- PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\
- __Pyx_RefNannyFinishContext();\
- PyGILState_Release(__pyx_gilstate_save);\
- }
-#else
- #define __Pyx_RefNannySetupContext(name, acquire_gil)\
- __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__))
- #define __Pyx_RefNannyFinishContextNogil() __Pyx_RefNannyFinishContext()
-#endif
- #define __Pyx_RefNannyFinishContext()\
- __Pyx_RefNanny->FinishContext(&__pyx_refnanny)
- #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), (__LINE__))
- #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), (__LINE__))
- #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), (__LINE__))
- #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), (__LINE__))
- #define __Pyx_XINCREF(r) do { if((r) == NULL); else {__Pyx_INCREF(r); }} while(0)
- #define __Pyx_XDECREF(r) do { if((r) == NULL); else {__Pyx_DECREF(r); }} while(0)
- #define __Pyx_XGOTREF(r) do { if((r) == NULL); else {__Pyx_GOTREF(r); }} while(0)
- #define __Pyx_XGIVEREF(r) do { if((r) == NULL); else {__Pyx_GIVEREF(r);}} while(0)
-#else
- #define __Pyx_RefNannyDeclarations
- #define __Pyx_RefNannySetupContext(name, acquire_gil)
- #define __Pyx_RefNannyFinishContextNogil()
- #define __Pyx_RefNannyFinishContext()
- #define __Pyx_INCREF(r) Py_INCREF(r)
- #define __Pyx_DECREF(r) Py_DECREF(r)
- #define __Pyx_GOTREF(r)
- #define __Pyx_GIVEREF(r)
- #define __Pyx_XINCREF(r) Py_XINCREF(r)
- #define __Pyx_XDECREF(r) Py_XDECREF(r)
- #define __Pyx_XGOTREF(r)
- #define __Pyx_XGIVEREF(r)
-#endif
-#define __Pyx_Py_XDECREF_SET(r, v) do {\
- PyObject *tmp = (PyObject *) r;\
- r = v; Py_XDECREF(tmp);\
- } while (0)
-#define __Pyx_XDECREF_SET(r, v) do {\
- PyObject *tmp = (PyObject *) r;\
- r = v; __Pyx_XDECREF(tmp);\
- } while (0)
-#define __Pyx_DECREF_SET(r, v) do {\
- PyObject *tmp = (PyObject *) r;\
- r = v; __Pyx_DECREF(tmp);\
- } while (0)
-#define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0)
-#define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0)
-
-/* PyErrExceptionMatches.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_PyErr_ExceptionMatches(err) __Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err)
-static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err);
-#else
-#define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err)
-#endif
-
-/* PyThreadStateGet.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate;
-#define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current;
-#if PY_VERSION_HEX >= 0x030C00A6
-#define __Pyx_PyErr_Occurred() (__pyx_tstate->current_exception != NULL)
-#define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->current_exception ? (PyObject*) Py_TYPE(__pyx_tstate->current_exception) : (PyObject*) NULL)
-#else
-#define __Pyx_PyErr_Occurred() (__pyx_tstate->curexc_type != NULL)
-#define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->curexc_type)
-#endif
-#else
-#define __Pyx_PyThreadState_declare
-#define __Pyx_PyThreadState_assign
-#define __Pyx_PyErr_Occurred() (PyErr_Occurred() != NULL)
-#define __Pyx_PyErr_CurrentExceptionType() PyErr_Occurred()
-#endif
-
-/* PyErrFetchRestore.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL)
-#define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb)
-#define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb)
-#define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb)
-#define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb)
-static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb);
-static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb);
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A6
-#define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL))
-#else
-#define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc)
-#endif
-#else
-#define __Pyx_PyErr_Clear() PyErr_Clear()
-#define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc)
-#define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb)
-#define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb)
-#define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb)
-#define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb)
-#define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb)
-#define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb)
-#endif
-
-/* PyObjectGetAttrStr.proto */
-#if CYTHON_USE_TYPE_SLOTS
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name);
-#else
-#define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n)
-#endif
-
-/* PyObjectGetAttrStrNoError.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name);
-
-/* GetBuiltinName.proto */
-static PyObject *__Pyx_GetBuiltinName(PyObject *name);
-
-/* SetItemInt.proto */
-#define __Pyx_SetItemInt(o, i, v, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\
- (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\
- __Pyx_SetItemInt_Fast(o, (Py_ssize_t)i, v, is_list, wraparound, boundscheck) :\
- (is_list ? (PyErr_SetString(PyExc_IndexError, "list assignment index out of range"), -1) :\
- __Pyx_SetItemInt_Generic(o, to_py_func(i), v)))
-static int __Pyx_SetItemInt_Generic(PyObject *o, PyObject *j, PyObject *v);
-static CYTHON_INLINE int __Pyx_SetItemInt_Fast(PyObject *o, Py_ssize_t i, PyObject *v,
- int is_list, int wraparound, int boundscheck);
-
-/* GetItemInt.proto */
-#define __Pyx_GetItemInt(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\
- (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\
- __Pyx_GetItemInt_Fast(o, (Py_ssize_t)i, is_list, wraparound, boundscheck) :\
- (is_list ? (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL) :\
- __Pyx_GetItemInt_Generic(o, to_py_func(i))))
-#define __Pyx_GetItemInt_List(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\
- (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\
- __Pyx_GetItemInt_List_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\
- (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL))
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i,
- int wraparound, int boundscheck);
-#define __Pyx_GetItemInt_Tuple(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\
- (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\
- __Pyx_GetItemInt_Tuple_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\
- (PyErr_SetString(PyExc_IndexError, "tuple index out of range"), (PyObject*)NULL))
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i,
- int wraparound, int boundscheck);
-static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j);
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i,
- int is_list, int wraparound, int boundscheck);
-
-/* ListExtend.proto */
-static CYTHON_INLINE int __Pyx_PyList_Extend(PyObject* L, PyObject* v) {
-#if CYTHON_COMPILING_IN_CPYTHON
- PyObject* none = _PyList_Extend((PyListObject*)L, v);
- if (unlikely(!none))
- return -1;
- Py_DECREF(none);
- return 0;
-#else
- return PyList_SetSlice(L, PY_SSIZE_T_MAX, PY_SSIZE_T_MAX, v);
-#endif
-}
-
-/* ListAppend.proto */
-#if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS
-static CYTHON_INLINE int __Pyx_PyList_Append(PyObject* list, PyObject* x) {
- PyListObject* L = (PyListObject*) list;
- Py_ssize_t len = Py_SIZE(list);
- if (likely(L->allocated > len) & likely(len > (L->allocated >> 1))) {
- Py_INCREF(x);
- PyList_SET_ITEM(list, len, x);
- __Pyx_SET_SIZE(list, len + 1);
- return 0;
- }
- return PyList_Append(list, x);
-}
-#else
-#define __Pyx_PyList_Append(L,x) PyList_Append(L,x)
-#endif
-
-/* PyObjectCall.proto */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw);
-#else
-#define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw)
-#endif
-
-/* TupleAndListFromArray.proto */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n);
-static CYTHON_INLINE PyObject* __Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n);
-#endif
-
-/* IncludeStringH.proto */
-#include <string.h>
-
-/* BytesEquals.proto */
-static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals);
-
-/* UnicodeEquals.proto */
-static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals);
-
-/* fastcall.proto */
-#define __Pyx_Arg_VARARGS(args, i) PyTuple_GET_ITEM(args, i)
-#define __Pyx_NumKwargs_VARARGS(kwds) PyDict_Size(kwds)
-#define __Pyx_KwValues_VARARGS(args, nargs) NULL
-#define __Pyx_GetKwValue_VARARGS(kw, kwvalues, s) __Pyx_PyDict_GetItemStrWithError(kw, s)
-#define __Pyx_KwargsAsDict_VARARGS(kw, kwvalues) PyDict_Copy(kw)
-#if CYTHON_METH_FASTCALL
- #define __Pyx_Arg_FASTCALL(args, i) args[i]
- #define __Pyx_NumKwargs_FASTCALL(kwds) PyTuple_GET_SIZE(kwds)
- #define __Pyx_KwValues_FASTCALL(args, nargs) ((args) + (nargs))
- static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s);
- #define __Pyx_KwargsAsDict_FASTCALL(kw, kwvalues) _PyStack_AsDict(kwvalues, kw)
-#else
- #define __Pyx_Arg_FASTCALL __Pyx_Arg_VARARGS
- #define __Pyx_NumKwargs_FASTCALL __Pyx_NumKwargs_VARARGS
- #define __Pyx_KwValues_FASTCALL __Pyx_KwValues_VARARGS
- #define __Pyx_GetKwValue_FASTCALL __Pyx_GetKwValue_VARARGS
- #define __Pyx_KwargsAsDict_FASTCALL __Pyx_KwargsAsDict_VARARGS
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON
-#define __Pyx_ArgsSlice_VARARGS(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_VARARGS(args, start), stop - start)
-#define __Pyx_ArgsSlice_FASTCALL(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_FASTCALL(args, start), stop - start)
-#else
-#define __Pyx_ArgsSlice_VARARGS(args, start, stop) PyTuple_GetSlice(args, start, stop)
-#define __Pyx_ArgsSlice_FASTCALL(args, start, stop) PyTuple_GetSlice(args, start, stop)
-#endif
-
-/* RaiseArgTupleInvalid.proto */
-static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact,
- Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found);
-
-/* RaiseDoubleKeywords.proto */
-static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name);
-
-/* ParseKeywords.proto */
-static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject *const *kwvalues,
- PyObject **argnames[],
- PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,
- const char* function_name);
-
-/* AssertionsEnabled.proto */
-#define __Pyx_init_assertions_enabled()
-#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag)
- #define __pyx_assertions_enabled() (1)
-#elif PY_VERSION_HEX < 0x03080000 || CYTHON_COMPILING_IN_PYPY || defined(Py_LIMITED_API)
- #define __pyx_assertions_enabled() (!Py_OptimizeFlag)
-#elif CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030900A6
- static int __pyx_assertions_enabled_flag;
- #define __pyx_assertions_enabled() (__pyx_assertions_enabled_flag)
- #undef __Pyx_init_assertions_enabled
- static void __Pyx_init_assertions_enabled(void) {
- __pyx_assertions_enabled_flag = ! _PyInterpreterState_GetConfig(__Pyx_PyThreadState_Current->interp)->optimization_level;
- }
-#else
- #define __pyx_assertions_enabled() (!Py_OptimizeFlag)
-#endif
-
-/* RaiseException.proto */
-static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause);
-
-/* PySequenceContains.proto */
-static CYTHON_INLINE int __Pyx_PySequence_ContainsTF(PyObject* item, PyObject* seq, int eq) {
- int result = PySequence_Contains(seq, item);
- return unlikely(result < 0) ? result : (result == (eq == Py_EQ));
-}
-
-/* PyIntBinop.proto */
-#if !CYTHON_COMPILING_IN_PYPY
-static PyObject* __Pyx_PyInt_AddObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check);
-#else
-#define __Pyx_PyInt_AddObjC(op1, op2, intval, inplace, zerodivision_check)\
- (inplace ? PyNumber_InPlaceAdd(op1, op2) : PyNumber_Add(op1, op2))
-#endif
-
-/* ListCompAppend.proto */
-#if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS
-static CYTHON_INLINE int __Pyx_ListComp_Append(PyObject* list, PyObject* x) {
- PyListObject* L = (PyListObject*) list;
- Py_ssize_t len = Py_SIZE(list);
- if (likely(L->allocated > len)) {
- Py_INCREF(x);
- PyList_SET_ITEM(list, len, x);
- __Pyx_SET_SIZE(list, len + 1);
- return 0;
- }
- return PyList_Append(list, x);
-}
-#else
-#define __Pyx_ListComp_Append(L,x) PyList_Append(L,x)
-#endif
-
-/* IterNext.proto */
-#define __Pyx_PyIter_Next(obj) __Pyx_PyIter_Next2(obj, NULL)
-static CYTHON_INLINE PyObject *__Pyx_PyIter_Next2(PyObject *, PyObject *);
-
-/* PyIntCompare.proto */
-static CYTHON_INLINE int __Pyx_PyInt_BoolNeObjC(PyObject *op1, PyObject *op2, long intval, long inplace);
-
-/* SliceObject.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetSlice(
- PyObject* obj, Py_ssize_t cstart, Py_ssize_t cstop,
- PyObject** py_start, PyObject** py_stop, PyObject** py_slice,
- int has_cstart, int has_cstop, int wraparound);
-
-/* PyFunctionFastCall.proto */
-#if CYTHON_FAST_PYCALL
-#if !CYTHON_VECTORCALL
-#define __Pyx_PyFunction_FastCall(func, args, nargs)\
- __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL)
-static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs);
-#endif
-#define __Pyx_BUILD_ASSERT_EXPR(cond)\
- (sizeof(char [1 - 2*!(cond)]) - 1)
-#ifndef Py_MEMBER_SIZE
-#define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member)
-#endif
-#if !CYTHON_VECTORCALL
-#if PY_VERSION_HEX >= 0x03080000
- #include "frameobject.h"
-#if PY_VERSION_HEX >= 0x030b00a6
- #ifndef Py_BUILD_CORE
- #define Py_BUILD_CORE 1
- #endif
- #include "internal/pycore_frame.h"
-#endif
- #define __Pxy_PyFrame_Initialize_Offsets()
- #define __Pyx_PyFrame_GetLocalsplus(frame) ((frame)->f_localsplus)
-#else
- static size_t __pyx_pyframe_localsplus_offset = 0;
- #include "frameobject.h"
- #define __Pxy_PyFrame_Initialize_Offsets()\
- ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\
- (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus)))
- #define __Pyx_PyFrame_GetLocalsplus(frame)\
- (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset))
-#endif
-#endif
-#endif
-
-/* PyObjectCallMethO.proto */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg);
-#endif
-
-/* PyObjectFastCall.proto */
-#define __Pyx_PyObject_FastCall(func, args, nargs) __Pyx_PyObject_FastCallDict(func, args, (size_t)(nargs), NULL)
-static CYTHON_INLINE PyObject* __Pyx_PyObject_FastCallDict(PyObject *func, PyObject **args, size_t nargs, PyObject *kwargs);
-
-/* PyObjectCallOneArg.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg);
-
-/* ObjectGetItem.proto */
-#if CYTHON_USE_TYPE_SLOTS
-static CYTHON_INLINE PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject *key);
-#else
-#define __Pyx_PyObject_GetItem(obj, key) PyObject_GetItem(obj, key)
-#endif
-
-/* PyIntBinop.proto */
-#if !CYTHON_COMPILING_IN_PYPY
-static PyObject* __Pyx_PyInt_SubtractObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check);
-#else
-#define __Pyx_PyInt_SubtractObjC(op1, op2, intval, inplace, zerodivision_check)\
- (inplace ? PyNumber_InPlaceSubtract(op1, op2) : PyNumber_Subtract(op1, op2))
-#endif
-
-/* PyDictVersioning.proto */
-#if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS
-#define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1)
-#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
-#define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\
- (version_var) = __PYX_GET_DICT_VERSION(dict);\
- (cache_var) = (value);
-#define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\
- static PY_UINT64_T __pyx_dict_version = 0;\
- static PyObject *__pyx_dict_cached_value = NULL;\
- if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\
- (VAR) = __pyx_dict_cached_value;\
- } else {\
- (VAR) = __pyx_dict_cached_value = (LOOKUP);\
- __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\
- }\
-}
-static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj);
-static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj);
-static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version);
-#else
-#define __PYX_GET_DICT_VERSION(dict) (0)
-#define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)
-#define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP);
-#endif
-
-/* GetModuleGlobalName.proto */
-#if CYTHON_USE_DICT_VERSIONS
-#define __Pyx_GetModuleGlobalName(var, name) do {\
- static PY_UINT64_T __pyx_dict_version = 0;\
- static PyObject *__pyx_dict_cached_value = NULL;\
- (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\
- (likely(__pyx_dict_cached_value) ? __Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\
- __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\
-} while(0)
-#define __Pyx_GetModuleGlobalNameUncached(var, name) do {\
- PY_UINT64_T __pyx_dict_version;\
- PyObject *__pyx_dict_cached_value;\
- (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\
-} while(0)
-static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value);
-#else
-#define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name)
-#define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name)
-static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name);
-#endif
-
-/* RaiseUnboundLocalError.proto */
-static CYTHON_INLINE void __Pyx_RaiseUnboundLocalError(const char *varname);
-
-/* RaiseTooManyValuesToUnpack.proto */
-static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected);
-
-/* RaiseNeedMoreValuesToUnpack.proto */
-static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index);
-
-/* IterFinish.proto */
-static CYTHON_INLINE int __Pyx_IterFinish(void);
-
-/* UnpackItemEndCheck.proto */
-static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected);
-
-/* py_abs.proto */
-#if CYTHON_USE_PYLONG_INTERNALS
-static PyObject *__Pyx_PyLong_AbsNeg(PyObject *num);
-#define __Pyx_PyNumber_Absolute(x)\
- ((likely(PyLong_CheckExact(x))) ?\
- (likely(__Pyx_PyLong_IsNonNeg(x)) ? (Py_INCREF(x), (x)) : __Pyx_PyLong_AbsNeg(x)) :\
- PyNumber_Absolute(x))
-#else
-#define __Pyx_PyNumber_Absolute(x) PyNumber_Absolute(x)
-#endif
-
-/* GetException.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb)
-static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb);
-#else
-static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb);
-#endif
-
-/* pep479.proto */
-static void __Pyx_Generator_Replace_StopIteration(int in_async_gen);
-
-/* ArgTypeTest.proto */
-#define __Pyx_ArgTypeTest(obj, type, none_allowed, name, exact)\
- ((likely(__Pyx_IS_TYPE(obj, type) | (none_allowed && (obj == Py_None)))) ? 1 :\
- __Pyx__ArgTypeTest(obj, type, name, exact))
-static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact);
-
-/* DictGetItem.proto */
-#if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY
-static PyObject *__Pyx_PyDict_GetItem(PyObject *d, PyObject* key);
-#define __Pyx_PyObject_Dict_GetItem(obj, name)\
- (likely(PyDict_CheckExact(obj)) ?\
- __Pyx_PyDict_GetItem(obj, name) : PyObject_GetItem(obj, name))
-#else
-#define __Pyx_PyDict_GetItem(d, key) PyObject_GetItem(d, key)
-#define __Pyx_PyObject_Dict_GetItem(obj, name) PyObject_GetItem(obj, name)
-#endif
-
-/* pyfrozenset_new.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyFrozenSet_New(PyObject* it);
-
-/* PySetContains.proto */
-static CYTHON_INLINE int __Pyx_PySet_ContainsTF(PyObject* key, PyObject* set, int eq);
-
-/* RaiseUnexpectedTypeError.proto */
-static int __Pyx_RaiseUnexpectedTypeError(const char *expected, PyObject *obj);
-
-/* SliceTupleAndList.proto */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyList_GetSlice(PyObject* src, Py_ssize_t start, Py_ssize_t stop);
-static CYTHON_INLINE PyObject* __Pyx_PyTuple_GetSlice(PyObject* src, Py_ssize_t start, Py_ssize_t stop);
-#else
-#define __Pyx_PyList_GetSlice(seq, start, stop) PySequence_GetSlice(seq, start, stop)
-#define __Pyx_PyTuple_GetSlice(seq, start, stop) PySequence_GetSlice(seq, start, stop)
-#endif
-
-/* set_iter.proto */
-static CYTHON_INLINE PyObject* __Pyx_set_iterator(PyObject* iterable, int is_set,
- Py_ssize_t* p_orig_length, int* p_source_is_set);
-static CYTHON_INLINE int __Pyx_set_iter_next(
- PyObject* iter_obj, Py_ssize_t orig_length,
- Py_ssize_t* ppos, PyObject **value,
- int source_is_set);
-
-/* RaiseClosureNameError.proto */
-static CYTHON_INLINE void __Pyx_RaiseClosureNameError(const char *varname);
-
-/* PyIntCompare.proto */
-static CYTHON_INLINE int __Pyx_PyInt_BoolEqObjC(PyObject *op1, PyObject *op2, long intval, long inplace);
-
-/* py_set_remove.proto */
-static CYTHON_INLINE int __Pyx_PySet_Remove(PyObject *set, PyObject *key);
-
-/* IncludeStructmemberH.proto */
-#include <structmember.h>
-
-/* FixUpExtensionType.proto */
-#if CYTHON_USE_TYPE_SPECS
-static int __Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type);
-#endif
-
-/* PyObjectCallNoArg.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func);
-
-/* PyObjectGetMethod.proto */
-static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method);
-
-/* PyObjectCallMethod0.proto */
-static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name);
-
-/* ValidateBasesTuple.proto */
-#if CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API || CYTHON_USE_TYPE_SPECS
-static int __Pyx_validate_bases_tuple(const char *type_name, Py_ssize_t dictoffset, PyObject *bases);
-#endif
-
-/* PyType_Ready.proto */
-CYTHON_UNUSED static int __Pyx_PyType_Ready(PyTypeObject *t);
-
-/* PyObject_GenericGetAttrNoDict.proto */
-#if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name);
-#else
-#define __Pyx_PyObject_GenericGetAttrNoDict PyObject_GenericGetAttr
-#endif
-
-/* GetTopmostException.proto */
-#if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE
-static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate);
-#endif
-
-/* SaveResetException.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb)
-static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb);
-#define __Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb)
-static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb);
-#else
-#define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb)
-#define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb)
-#endif
-
-/* FastTypeChecks.proto */
-#if CYTHON_COMPILING_IN_CPYTHON
-#define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type)
-#define __Pyx_TypeCheck2(obj, type1, type2) __Pyx_IsAnySubtype2(Py_TYPE(obj), (PyTypeObject *)type1, (PyTypeObject *)type2)
-static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b);
-static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b);
-static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type);
-static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2);
-#else
-#define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type)
-#define __Pyx_TypeCheck2(obj, type1, type2) (PyObject_TypeCheck(obj, (PyTypeObject *)type1) || PyObject_TypeCheck(obj, (PyTypeObject *)type2))
-#define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type)
-#define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2))
-#endif
-#define __Pyx_PyErr_ExceptionMatches2(err1, err2) __Pyx_PyErr_GivenExceptionMatches2(__Pyx_PyErr_CurrentExceptionType(), err1, err2)
-#define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception)
-
-/* Import.proto */
-static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level);
-
-/* ImportFrom.proto */
-static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name);
-
-/* FetchSharedCythonModule.proto */
-static PyObject *__Pyx_FetchSharedCythonABIModule(void);
-
-/* FetchCommonType.proto */
-#if !CYTHON_USE_TYPE_SPECS
-static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type);
-#else
-static PyTypeObject* __Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases);
-#endif
-
-/* PyMethodNew.proto */
-#if PY_MAJOR_VERSION >= 3
-static PyObject *__Pyx_PyMethod_New(PyObject *func, PyObject *self, PyObject *typ) {
- CYTHON_UNUSED_VAR(typ);
- if (!self)
- return __Pyx_NewRef(func);
- return PyMethod_New(func, self);
-}
-#else
- #define __Pyx_PyMethod_New PyMethod_New
-#endif
-
-/* PyVectorcallFastCallDict.proto */
-#if CYTHON_METH_FASTCALL
-static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw);
-#endif
-
-/* CythonFunctionShared.proto */
-#define __Pyx_CyFunction_USED
-#define __Pyx_CYFUNCTION_STATICMETHOD 0x01
-#define __Pyx_CYFUNCTION_CLASSMETHOD 0x02
-#define __Pyx_CYFUNCTION_CCLASS 0x04
-#define __Pyx_CYFUNCTION_COROUTINE 0x08
-#define __Pyx_CyFunction_GetClosure(f)\
- (((__pyx_CyFunctionObject *) (f))->func_closure)
-#if PY_VERSION_HEX < 0x030900B1
- #define __Pyx_CyFunction_GetClassObj(f)\
- (((__pyx_CyFunctionObject *) (f))->func_classobj)
-#else
- #define __Pyx_CyFunction_GetClassObj(f)\
- ((PyObject*) ((PyCMethodObject *) (f))->mm_class)
-#endif
-#define __Pyx_CyFunction_SetClassObj(f, classobj)\
- __Pyx__CyFunction_SetClassObj((__pyx_CyFunctionObject *) (f), (classobj))
-#define __Pyx_CyFunction_Defaults(type, f)\
- ((type *)(((__pyx_CyFunctionObject *) (f))->defaults))
-#define __Pyx_CyFunction_SetDefaultsGetter(f, g)\
- ((__pyx_CyFunctionObject *) (f))->defaults_getter = (g)
-typedef struct {
-#if PY_VERSION_HEX < 0x030900B1
- PyCFunctionObject func;
-#else
- PyCMethodObject func;
-#endif
-#if CYTHON_BACKPORT_VECTORCALL
- __pyx_vectorcallfunc func_vectorcall;
-#endif
-#if PY_VERSION_HEX < 0x030500A0
- PyObject *func_weakreflist;
-#endif
- PyObject *func_dict;
- PyObject *func_name;
- PyObject *func_qualname;
- PyObject *func_doc;
- PyObject *func_globals;
- PyObject *func_code;
- PyObject *func_closure;
-#if PY_VERSION_HEX < 0x030900B1
- PyObject *func_classobj;
-#endif
- void *defaults;
- int defaults_pyobjects;
- size_t defaults_size; // used by FusedFunction for copying defaults
- int flags;
- PyObject *defaults_tuple;
- PyObject *defaults_kwdict;
- PyObject *(*defaults_getter)(PyObject *);
- PyObject *func_annotations;
- PyObject *func_is_coroutine;
-} __pyx_CyFunctionObject;
-#define __Pyx_CyFunction_Check(obj) __Pyx_TypeCheck(obj, __pyx_CyFunctionType)
-#define __Pyx_IsCyOrPyCFunction(obj) __Pyx_TypeCheck2(obj, __pyx_CyFunctionType, &PyCFunction_Type)
-#define __Pyx_CyFunction_CheckExact(obj) __Pyx_IS_TYPE(obj, __pyx_CyFunctionType)
-static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject* op, PyMethodDef *ml,
- int flags, PyObject* qualname,
- PyObject *closure,
- PyObject *module, PyObject *globals,
- PyObject* code);
-static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj);
-static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *m,
- size_t size,
- int pyobjects);
-static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *m,
- PyObject *tuple);
-static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *m,
- PyObject *dict);
-static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *m,
- PyObject *dict);
-static int __pyx_CyFunction_init(PyObject *module);
-#if CYTHON_METH_FASTCALL
-static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames);
-static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames);
-static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames);
-static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames);
-#if CYTHON_BACKPORT_VECTORCALL
-#define __Pyx_CyFunction_func_vectorcall(f) (((__pyx_CyFunctionObject*)f)->func_vectorcall)
-#else
-#define __Pyx_CyFunction_func_vectorcall(f) (((PyCFunctionObject*)f)->vectorcall)
-#endif
-#endif
-
-/* CythonFunction.proto */
-static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml,
- int flags, PyObject* qualname,
- PyObject *closure,
- PyObject *module, PyObject *globals,
- PyObject* code);
-
-/* CLineInTraceback.proto */
-#ifdef CYTHON_CLINE_IN_TRACEBACK
-#define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? c_line : 0)
-#else
-static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line);
-#endif
-
-/* CodeObjectCache.proto */
-#if !CYTHON_COMPILING_IN_LIMITED_API
-typedef struct {
- PyCodeObject* code_object;
- int code_line;
-} __Pyx_CodeObjectCacheEntry;
-struct __Pyx_CodeObjectCache {
- int count;
- int max_count;
- __Pyx_CodeObjectCacheEntry* entries;
-};
-static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL};
-static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line);
-static PyCodeObject *__pyx_find_code_object(int code_line);
-static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object);
-#endif
-
-/* AddTraceback.proto */
-static void __Pyx_AddTraceback(const char *funcname, int c_line,
- int py_line, const char *filename);
-
-/* GCCDiagnostics.proto */
-#if !defined(__INTEL_COMPILER) && defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 6))
-#define __Pyx_HAS_GCC_DIAGNOSTIC
-#endif
-
-/* CIntFromPy.proto */
-static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *);
-
-/* CIntToPy.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value);
-
-/* CIntToPy.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value);
-
-/* CIntFromPy.proto */
-static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *);
-
-/* FormatTypeName.proto */
-#if CYTHON_COMPILING_IN_LIMITED_API
-typedef PyObject *__Pyx_TypeName;
-#define __Pyx_FMT_TYPENAME "%U"
-static __Pyx_TypeName __Pyx_PyType_GetName(PyTypeObject* tp);
-#define __Pyx_DECREF_TypeName(obj) Py_XDECREF(obj)
-#else
-typedef const char *__Pyx_TypeName;
-#define __Pyx_FMT_TYPENAME "%.200s"
-#define __Pyx_PyType_GetName(tp) ((tp)->tp_name)
-#define __Pyx_DECREF_TypeName(obj)
-#endif
-
-/* SwapException.proto */
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_ExceptionSwap(type, value, tb) __Pyx__ExceptionSwap(__pyx_tstate, type, value, tb)
-static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb);
-#else
-static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb);
-#endif
-
-/* PyObjectCall2Args.proto */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2);
-
-/* PyObjectCallMethod1.proto */
-static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg);
-
-/* CoroutineBase.proto */
-struct __pyx_CoroutineObject;
-typedef PyObject *(*__pyx_coroutine_body_t)(struct __pyx_CoroutineObject *, PyThreadState *, PyObject *);
-#if CYTHON_USE_EXC_INFO_STACK
-#define __Pyx_ExcInfoStruct _PyErr_StackItem
-#else
-typedef struct {
- PyObject *exc_type;
- PyObject *exc_value;
- PyObject *exc_traceback;
-} __Pyx_ExcInfoStruct;
-#endif
-typedef struct __pyx_CoroutineObject {
- PyObject_HEAD
- __pyx_coroutine_body_t body;
- PyObject *closure;
- __Pyx_ExcInfoStruct gi_exc_state;
- PyObject *gi_weakreflist;
- PyObject *classobj;
- PyObject *yieldfrom;
- PyObject *gi_name;
- PyObject *gi_qualname;
- PyObject *gi_modulename;
- PyObject *gi_code;
- PyObject *gi_frame;
- int resume_label;
- char is_running;
-} __pyx_CoroutineObject;
-static __pyx_CoroutineObject *__Pyx__Coroutine_New(
- PyTypeObject *type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure,
- PyObject *name, PyObject *qualname, PyObject *module_name);
-static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit(
- __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure,
- PyObject *name, PyObject *qualname, PyObject *module_name);
-static CYTHON_INLINE void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *self);
-static int __Pyx_Coroutine_clear(PyObject *self);
-static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value);
-static PyObject *__Pyx_Coroutine_Close(PyObject *self);
-static PyObject *__Pyx_Coroutine_Throw(PyObject *gen, PyObject *args);
-#if CYTHON_USE_EXC_INFO_STACK
-#define __Pyx_Coroutine_SwapException(self)
-#define __Pyx_Coroutine_ResetAndClearException(self) __Pyx_Coroutine_ExceptionClear(&(self)->gi_exc_state)
-#else
-#define __Pyx_Coroutine_SwapException(self) {\
- __Pyx_ExceptionSwap(&(self)->gi_exc_state.exc_type, &(self)->gi_exc_state.exc_value, &(self)->gi_exc_state.exc_traceback);\
- __Pyx_Coroutine_ResetFrameBackpointer(&(self)->gi_exc_state);\
- }
-#define __Pyx_Coroutine_ResetAndClearException(self) {\
- __Pyx_ExceptionReset((self)->gi_exc_state.exc_type, (self)->gi_exc_state.exc_value, (self)->gi_exc_state.exc_traceback);\
- (self)->gi_exc_state.exc_type = (self)->gi_exc_state.exc_value = (self)->gi_exc_state.exc_traceback = NULL;\
- }
-#endif
-#if CYTHON_FAST_THREAD_STATE
-#define __Pyx_PyGen_FetchStopIterationValue(pvalue)\
- __Pyx_PyGen__FetchStopIterationValue(__pyx_tstate, pvalue)
-#else
-#define __Pyx_PyGen_FetchStopIterationValue(pvalue)\
- __Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, pvalue)
-#endif
-static int __Pyx_PyGen__FetchStopIterationValue(PyThreadState *tstate, PyObject **pvalue);
-static CYTHON_INLINE void __Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state);
-
-/* PatchModuleWithCoroutine.proto */
-static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code);
-
-/* PatchGeneratorABC.proto */
-static int __Pyx_patch_abc(void);
-
-/* Generator.proto */
-#define __Pyx_Generator_USED
-#define __Pyx_Generator_CheckExact(obj) __Pyx_IS_TYPE(obj, __pyx_GeneratorType)
-#define __Pyx_Generator_New(body, code, closure, name, qualname, module_name)\
- __Pyx__Coroutine_New(__pyx_GeneratorType, body, code, closure, name, qualname, module_name)
-static PyObject *__Pyx_Generator_Next(PyObject *self);
-static int __pyx_Generator_init(PyObject *module);
-
-/* CheckBinaryVersion.proto */
-static int __Pyx_check_binary_version(void);
-
-/* InitStrings.proto */
-static int __Pyx_InitStrings(__Pyx_StringTabEntry *t);
-
-/* #### Code section: module_declarations ### */
-
-/* Module declarations from "cython" */
-
-/* Module declarations from "fontTools.varLib.iup" */
-static PyObject *__pyx_f_9fontTools_6varLib_3iup_iup_segment(PyObject *, PyObject *, PyObject *, PyObject *, PyObject *); /*proto*/
-static CYTHON_INLINE int __pyx_f_9fontTools_6varLib_3iup_can_iup_in_between(PyObject *, PyObject *, int, int, double); /*proto*/
-/* #### Code section: typeinfo ### */
-/* #### Code section: before_global_var ### */
-#define __Pyx_MODULE_NAME "fontTools.varLib.iup"
-extern int __pyx_module_is_main_fontTools__varLib__iup;
-int __pyx_module_is_main_fontTools__varLib__iup = 0;
-
-/* Implementation of "fontTools.varLib.iup" */
-/* #### Code section: global_var ### */
-static PyObject *__pyx_builtin_AttributeError;
-static PyObject *__pyx_builtin_ImportError;
-static PyObject *__pyx_builtin_zip;
-static PyObject *__pyx_builtin_AssertionError;
-static PyObject *__pyx_builtin_enumerate;
-static PyObject *__pyx_builtin_range;
-static PyObject *__pyx_builtin_max;
-/* #### Code section: string_decls ### */
-static const char __pyx_k_c[] = "c";
-static const char __pyx_k_d[] = "d";
-static const char __pyx_k_i[] = "i";
-static const char __pyx_k_j[] = "j";
-static const char __pyx_k_k[] = "k";
-static const char __pyx_k_l[] = "l";
-static const char __pyx_k_n[] = "n";
-static const char __pyx_k_s[] = "s";
-static const char __pyx_k_v[] = "v";
-static const char __pyx_k__3[] = ".";
-static const char __pyx_k_c1[] = "c1";
-static const char __pyx_k_c2[] = "c2";
-static const char __pyx_k_cj[] = "cj";
-static const char __pyx_k_d0[] = "d0";
-static const char __pyx_k_d1[] = "d1";
-static const char __pyx_k_d2[] = "d2";
-static const char __pyx_k_dj[] = "dj";
-static const char __pyx_k_gc[] = "gc";
-static const char __pyx_k_i1[] = "i1";
-static const char __pyx_k_i2[] = "i2";
-static const char __pyx_k_it[] = "it";
-static const char __pyx_k_lc[] = "lc";
-static const char __pyx_k_ld[] = "ld";
-static const char __pyx_k_nc[] = "nc";
-static const char __pyx_k_nd[] = "nd";
-static const char __pyx_k__23[] = "?";
-static const char __pyx_k_end[] = "end";
-static const char __pyx_k_int[] = "int";
-static const char __pyx_k_lcj[] = "lcj";
-static const char __pyx_k_ldj[] = "ldj";
-static const char __pyx_k_max[] = "max";
-static const char __pyx_k_ncj[] = "ncj";
-static const char __pyx_k_ndj[] = "ndj";
-static const char __pyx_k_out[] = "out";
-static const char __pyx_k_ri1[] = "ri1";
-static const char __pyx_k_ri2[] = "ri2";
-static const char __pyx_k_set[] = "set";
-static const char __pyx_k_zip[] = "zip";
-static const char __pyx_k_Real[] = "Real";
-static const char __pyx_k_args[] = "args";
-static const char __pyx_k_cost[] = "cost";
-static const char __pyx_k_ends[] = "ends";
-static const char __pyx_k_list[] = "list";
-static const char __pyx_k_main[] = "__main__";
-static const char __pyx_k_name[] = "__name__";
-static const char __pyx_k_send[] = "send";
-static const char __pyx_k_test[] = "__test__";
-static const char __pyx_k_Delta[] = "_Delta";
-static const char __pyx_k_Point[] = "_Point";
-static const char __pyx_k_Tuple[] = "Tuple";
-static const char __pyx_k_Union[] = "Union";
-static const char __pyx_k_chain[] = "chain";
-static const char __pyx_k_close[] = "close";
-static const char __pyx_k_costs[] = "costs";
-static const char __pyx_k_force[] = "force";
-static const char __pyx_k_range[] = "range";
-static const char __pyx_k_start[] = "start";
-static const char __pyx_k_throw[] = "throw";
-static const char __pyx_k_best_j[] = "best_j";
-static const char __pyx_k_coords[] = "coords";
-static const char __pyx_k_cython[] = "cython";
-static const char __pyx_k_deltas[] = "deltas";
-static const char __pyx_k_enable[] = "enable";
-static const char __pyx_k_forced[] = "forced";
-static const char __pyx_k_import[] = "__import__";
-static const char __pyx_k_return[] = "return";
-static const char __pyx_k_typing[] = "typing";
-static const char __pyx_k_contour[] = "contour";
-static const char __pyx_k_disable[] = "disable";
-static const char __pyx_k_genexpr[] = "genexpr";
-static const char __pyx_k_indices[] = "indices";
-static const char __pyx_k_numbers[] = "numbers";
-static const char __pyx_k_rot_set[] = "_rot_set";
-static const char __pyx_k_COMPILED[] = "COMPILED";
-static const char __pyx_k_Integral[] = "Integral";
-static const char __pyx_k_Sequence[] = "Sequence";
-static const char __pyx_k_best_sol[] = "best_sol";
-static const char __pyx_k_lookback[] = "lookback";
-static const char __pyx_k_rot_list[] = "_rot_list";
-static const char __pyx_k_solution[] = "solution";
-static const char __pyx_k_Endpoints[] = "_Endpoints";
-static const char __pyx_k_best_cost[] = "best_cost";
-static const char __pyx_k_enumerate[] = "enumerate";
-static const char __pyx_k_isenabled[] = "isenabled";
-static const char __pyx_k_iup_delta[] = "iup_delta";
-static const char __pyx_k_tolerance[] = "tolerance";
-static const char __pyx_k_DeltaOrNone[] = "_DeltaOrNone";
-static const char __pyx_k_ImportError[] = "ImportError";
-static const char __pyx_k_iup_contour[] = "iup_contour";
-static const char __pyx_k_DeltaSegment[] = "_DeltaSegment";
-static const char __pyx_k_MAX_LOOKBACK[] = "MAX_LOOKBACK";
-static const char __pyx_k_PointSegment[] = "_PointSegment";
-static const char __pyx_k_is_coroutine[] = "_is_coroutine";
-static const char __pyx_k_class_getitem[] = "__class_getitem__";
-static const char __pyx_k_AssertionError[] = "AssertionError";
-static const char __pyx_k_AttributeError[] = "AttributeError";
-static const char __pyx_k_fontTools_misc[] = "fontTools.misc";
-static const char __pyx_k_DeltaOrNoneSegment[] = "_DeltaOrNoneSegment";
-static const char __pyx_k_asyncio_coroutines[] = "asyncio.coroutines";
-static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback";
-static const char __pyx_k_iup_delta_optimize[] = "iup_delta_optimize";
-static const char __pyx_k_fontTools_varLib_iup[] = "fontTools.varLib.iup";
-static const char __pyx_k_iup_contour_optimize[] = "iup_contour_optimize";
-static const char __pyx_k_iup_contour_optimize_dp[] = "_iup_contour_optimize_dp";
-static const char __pyx_k_Lib_fontTools_varLib_iup_py[] = "Lib/fontTools/varLib/iup.py";
-static const char __pyx_k_iup_contour_bound_forced_set[] = "_iup_contour_bound_forced_set";
-static const char __pyx_k_can_iup_in_between_locals_genexp[] = "can_iup_in_between..genexpr";
-static const char __pyx_k_iup_contour_optimize_locals_gene[] = "iup_contour_optimize..genexpr";
-/* #### Code section: decls ### */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_iup_contour(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_2iup_delta(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_ends); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_18can_iup_in_between_genexpr(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_4_iup_contour_bound_forced_set(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_tolerance); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_16__defaults__(CYTHON_UNUSED PyObject *__pyx_self); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_6_iup_contour_optimize_dp(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_forced, double __pyx_v_tolerance, PyObject *__pyx_v_lookback); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_8_rot_list(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_l, PyObject *__pyx_v_k); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_10_rot_set(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_s, PyObject *__pyx_v_k, PyObject *__pyx_v_n); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_genexpr(PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_3genexpr(PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_12iup_contour_optimize(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_tolerance); /* proto */
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_14iup_delta_optimize(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_ends, PyObject *__pyx_v_tolerance); /* proto */
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/
-/* #### Code section: late_includes ### */
-/* #### Code section: module_state ### */
-typedef struct {
- PyObject *__pyx_d;
- PyObject *__pyx_b;
- PyObject *__pyx_cython_runtime;
- PyObject *__pyx_empty_tuple;
- PyObject *__pyx_empty_bytes;
- PyObject *__pyx_empty_unicode;
- #ifdef __Pyx_CyFunction_USED
- PyTypeObject *__pyx_CyFunctionType;
- #endif
- #ifdef __Pyx_FusedFunction_USED
- PyTypeObject *__pyx_FusedFunctionType;
- #endif
- #ifdef __Pyx_Generator_USED
- PyTypeObject *__pyx_GeneratorType;
- #endif
- #ifdef __Pyx_IterableCoroutine_USED
- PyTypeObject *__pyx_IterableCoroutineType;
- #endif
- #ifdef __Pyx_Coroutine_USED
- PyTypeObject *__pyx_CoroutineAwaitType;
- #endif
- #ifdef __Pyx_Coroutine_USED
- PyTypeObject *__pyx_CoroutineType;
- #endif
- #if CYTHON_USE_MODULE_STATE
- #endif
- #if CYTHON_USE_MODULE_STATE
- PyObject *__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr;
- PyObject *__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize;
- PyObject *__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr;
- PyObject *__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr;
- #endif
- PyTypeObject *__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr;
- PyTypeObject *__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize;
- PyTypeObject *__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr;
- PyTypeObject *__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr;
- PyObject *__pyx_n_s_AssertionError;
- PyObject *__pyx_n_s_AttributeError;
- PyObject *__pyx_n_s_COMPILED;
- PyObject *__pyx_n_s_Delta;
- PyObject *__pyx_n_s_DeltaOrNone;
- PyObject *__pyx_n_s_DeltaOrNoneSegment;
- PyObject *__pyx_n_s_DeltaSegment;
- PyObject *__pyx_n_s_Endpoints;
- PyObject *__pyx_n_s_ImportError;
- PyObject *__pyx_n_s_Integral;
- PyObject *__pyx_kp_s_Lib_fontTools_varLib_iup_py;
- PyObject *__pyx_n_s_MAX_LOOKBACK;
- PyObject *__pyx_n_s_Point;
- PyObject *__pyx_n_s_PointSegment;
- PyObject *__pyx_n_s_Real;
- PyObject *__pyx_n_s_Sequence;
- PyObject *__pyx_n_s_Tuple;
- PyObject *__pyx_n_s_Union;
- PyObject *__pyx_n_s__23;
- PyObject *__pyx_kp_u__3;
- PyObject *__pyx_n_s_args;
- PyObject *__pyx_n_s_asyncio_coroutines;
- PyObject *__pyx_n_s_best_cost;
- PyObject *__pyx_n_s_best_j;
- PyObject *__pyx_n_s_best_sol;
- PyObject *__pyx_n_s_c;
- PyObject *__pyx_n_s_c1;
- PyObject *__pyx_n_s_c2;
- PyObject *__pyx_n_s_can_iup_in_between_locals_genexp;
- PyObject *__pyx_n_s_chain;
- PyObject *__pyx_n_s_cj;
- PyObject *__pyx_n_s_class_getitem;
- PyObject *__pyx_n_s_cline_in_traceback;
- PyObject *__pyx_n_s_close;
- PyObject *__pyx_n_s_contour;
- PyObject *__pyx_n_s_coords;
- PyObject *__pyx_n_s_cost;
- PyObject *__pyx_n_s_costs;
- PyObject *__pyx_n_s_cython;
- PyObject *__pyx_n_s_d;
- PyObject *__pyx_n_s_d0;
- PyObject *__pyx_n_s_d1;
- PyObject *__pyx_n_s_d2;
- PyObject *__pyx_n_s_deltas;
- PyObject *__pyx_kp_u_disable;
- PyObject *__pyx_n_s_dj;
- PyObject *__pyx_kp_u_enable;
- PyObject *__pyx_n_s_end;
- PyObject *__pyx_n_s_ends;
- PyObject *__pyx_n_s_enumerate;
- PyObject *__pyx_n_s_fontTools_misc;
- PyObject *__pyx_n_s_fontTools_varLib_iup;
- PyObject *__pyx_n_s_force;
- PyObject *__pyx_n_s_forced;
- PyObject *__pyx_kp_u_gc;
- PyObject *__pyx_n_s_genexpr;
- PyObject *__pyx_n_s_i;
- PyObject *__pyx_n_s_i1;
- PyObject *__pyx_n_s_i2;
- PyObject *__pyx_n_s_import;
- PyObject *__pyx_n_s_indices;
- PyObject *__pyx_n_s_int;
- PyObject *__pyx_n_s_is_coroutine;
- PyObject *__pyx_kp_u_isenabled;
- PyObject *__pyx_n_s_it;
- PyObject *__pyx_n_s_iup_contour;
- PyObject *__pyx_n_s_iup_contour_bound_forced_set;
- PyObject *__pyx_n_s_iup_contour_optimize;
- PyObject *__pyx_n_s_iup_contour_optimize_dp;
- PyObject *__pyx_n_s_iup_contour_optimize_locals_gene;
- PyObject *__pyx_n_s_iup_delta;
- PyObject *__pyx_n_s_iup_delta_optimize;
- PyObject *__pyx_n_s_j;
- PyObject *__pyx_n_s_k;
- PyObject *__pyx_n_s_l;
- PyObject *__pyx_n_s_lc;
- PyObject *__pyx_n_s_lcj;
- PyObject *__pyx_n_s_ld;
- PyObject *__pyx_n_s_ldj;
- PyObject *__pyx_n_s_list;
- PyObject *__pyx_n_s_lookback;
- PyObject *__pyx_n_s_main;
- PyObject *__pyx_n_s_max;
- PyObject *__pyx_n_s_n;
- PyObject *__pyx_n_s_name;
- PyObject *__pyx_n_s_nc;
- PyObject *__pyx_n_s_ncj;
- PyObject *__pyx_n_s_nd;
- PyObject *__pyx_n_s_ndj;
- PyObject *__pyx_n_s_numbers;
- PyObject *__pyx_n_s_out;
- PyObject *__pyx_n_s_range;
- PyObject *__pyx_n_s_return;
- PyObject *__pyx_n_s_ri1;
- PyObject *__pyx_n_s_ri2;
- PyObject *__pyx_n_s_rot_list;
- PyObject *__pyx_n_s_rot_set;
- PyObject *__pyx_n_s_s;
- PyObject *__pyx_n_s_send;
- PyObject *__pyx_n_s_set;
- PyObject *__pyx_n_s_solution;
- PyObject *__pyx_n_s_start;
- PyObject *__pyx_n_s_test;
- PyObject *__pyx_n_s_throw;
- PyObject *__pyx_n_s_tolerance;
- PyObject *__pyx_n_s_typing;
- PyObject *__pyx_n_s_v;
- PyObject *__pyx_n_s_zip;
- PyObject *__pyx_float_0_0;
- PyObject *__pyx_int_0;
- PyObject *__pyx_int_1;
- PyObject *__pyx_int_2;
- PyObject *__pyx_int_3;
- PyObject *__pyx_int_4;
- PyObject *__pyx_int_8;
- PyObject *__pyx_int_neg_1;
- PyObject *__pyx_tuple_;
- PyObject *__pyx_tuple__2;
- PyObject *__pyx_tuple__4;
- PyObject *__pyx_tuple__6;
- PyObject *__pyx_tuple__8;
- PyObject *__pyx_tuple__10;
- PyObject *__pyx_tuple__11;
- PyObject *__pyx_tuple__13;
- PyObject *__pyx_tuple__15;
- PyObject *__pyx_tuple__17;
- PyObject *__pyx_tuple__19;
- PyObject *__pyx_tuple__20;
- PyObject *__pyx_tuple__22;
- PyObject *__pyx_codeobj__5;
- PyObject *__pyx_codeobj__7;
- PyObject *__pyx_codeobj__9;
- PyObject *__pyx_codeobj__12;
- PyObject *__pyx_codeobj__14;
- PyObject *__pyx_codeobj__16;
- PyObject *__pyx_codeobj__18;
- PyObject *__pyx_codeobj__21;
-} __pyx_mstate;
-
-#if CYTHON_USE_MODULE_STATE
-#ifdef __cplusplus
-namespace {
- extern struct PyModuleDef __pyx_moduledef;
-} /* anonymous namespace */
-#else
-static struct PyModuleDef __pyx_moduledef;
-#endif
-
-#define __pyx_mstate(o) ((__pyx_mstate *)__Pyx_PyModule_GetState(o))
-
-#define __pyx_mstate_global (__pyx_mstate(PyState_FindModule(&__pyx_moduledef)))
-
-#define __pyx_m (PyState_FindModule(&__pyx_moduledef))
-#else
-static __pyx_mstate __pyx_mstate_global_static =
-#ifdef __cplusplus
- {};
-#else
- {0};
-#endif
-static __pyx_mstate *__pyx_mstate_global = &__pyx_mstate_global_static;
-#endif
-/* #### Code section: module_state_clear ### */
-#if CYTHON_USE_MODULE_STATE
-static int __pyx_m_clear(PyObject *m) {
- __pyx_mstate *clear_module_state = __pyx_mstate(m);
- if (!clear_module_state) return 0;
- Py_CLEAR(clear_module_state->__pyx_d);
- Py_CLEAR(clear_module_state->__pyx_b);
- Py_CLEAR(clear_module_state->__pyx_cython_runtime);
- Py_CLEAR(clear_module_state->__pyx_empty_tuple);
- Py_CLEAR(clear_module_state->__pyx_empty_bytes);
- Py_CLEAR(clear_module_state->__pyx_empty_unicode);
- #ifdef __Pyx_CyFunction_USED
- Py_CLEAR(clear_module_state->__pyx_CyFunctionType);
- #endif
- #ifdef __Pyx_FusedFunction_USED
- Py_CLEAR(clear_module_state->__pyx_FusedFunctionType);
- #endif
- Py_CLEAR(clear_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr);
- Py_CLEAR(clear_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr);
- Py_CLEAR(clear_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize);
- Py_CLEAR(clear_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize);
- Py_CLEAR(clear_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr);
- Py_CLEAR(clear_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr);
- Py_CLEAR(clear_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr);
- Py_CLEAR(clear_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr);
- Py_CLEAR(clear_module_state->__pyx_n_s_AssertionError);
- Py_CLEAR(clear_module_state->__pyx_n_s_AttributeError);
- Py_CLEAR(clear_module_state->__pyx_n_s_COMPILED);
- Py_CLEAR(clear_module_state->__pyx_n_s_Delta);
- Py_CLEAR(clear_module_state->__pyx_n_s_DeltaOrNone);
- Py_CLEAR(clear_module_state->__pyx_n_s_DeltaOrNoneSegment);
- Py_CLEAR(clear_module_state->__pyx_n_s_DeltaSegment);
- Py_CLEAR(clear_module_state->__pyx_n_s_Endpoints);
- Py_CLEAR(clear_module_state->__pyx_n_s_ImportError);
- Py_CLEAR(clear_module_state->__pyx_n_s_Integral);
- Py_CLEAR(clear_module_state->__pyx_kp_s_Lib_fontTools_varLib_iup_py);
- Py_CLEAR(clear_module_state->__pyx_n_s_MAX_LOOKBACK);
- Py_CLEAR(clear_module_state->__pyx_n_s_Point);
- Py_CLEAR(clear_module_state->__pyx_n_s_PointSegment);
- Py_CLEAR(clear_module_state->__pyx_n_s_Real);
- Py_CLEAR(clear_module_state->__pyx_n_s_Sequence);
- Py_CLEAR(clear_module_state->__pyx_n_s_Tuple);
- Py_CLEAR(clear_module_state->__pyx_n_s_Union);
- Py_CLEAR(clear_module_state->__pyx_n_s__23);
- Py_CLEAR(clear_module_state->__pyx_kp_u__3);
- Py_CLEAR(clear_module_state->__pyx_n_s_args);
- Py_CLEAR(clear_module_state->__pyx_n_s_asyncio_coroutines);
- Py_CLEAR(clear_module_state->__pyx_n_s_best_cost);
- Py_CLEAR(clear_module_state->__pyx_n_s_best_j);
- Py_CLEAR(clear_module_state->__pyx_n_s_best_sol);
- Py_CLEAR(clear_module_state->__pyx_n_s_c);
- Py_CLEAR(clear_module_state->__pyx_n_s_c1);
- Py_CLEAR(clear_module_state->__pyx_n_s_c2);
- Py_CLEAR(clear_module_state->__pyx_n_s_can_iup_in_between_locals_genexp);
- Py_CLEAR(clear_module_state->__pyx_n_s_chain);
- Py_CLEAR(clear_module_state->__pyx_n_s_cj);
- Py_CLEAR(clear_module_state->__pyx_n_s_class_getitem);
- Py_CLEAR(clear_module_state->__pyx_n_s_cline_in_traceback);
- Py_CLEAR(clear_module_state->__pyx_n_s_close);
- Py_CLEAR(clear_module_state->__pyx_n_s_contour);
- Py_CLEAR(clear_module_state->__pyx_n_s_coords);
- Py_CLEAR(clear_module_state->__pyx_n_s_cost);
- Py_CLEAR(clear_module_state->__pyx_n_s_costs);
- Py_CLEAR(clear_module_state->__pyx_n_s_cython);
- Py_CLEAR(clear_module_state->__pyx_n_s_d);
- Py_CLEAR(clear_module_state->__pyx_n_s_d0);
- Py_CLEAR(clear_module_state->__pyx_n_s_d1);
- Py_CLEAR(clear_module_state->__pyx_n_s_d2);
- Py_CLEAR(clear_module_state->__pyx_n_s_deltas);
- Py_CLEAR(clear_module_state->__pyx_kp_u_disable);
- Py_CLEAR(clear_module_state->__pyx_n_s_dj);
- Py_CLEAR(clear_module_state->__pyx_kp_u_enable);
- Py_CLEAR(clear_module_state->__pyx_n_s_end);
- Py_CLEAR(clear_module_state->__pyx_n_s_ends);
- Py_CLEAR(clear_module_state->__pyx_n_s_enumerate);
- Py_CLEAR(clear_module_state->__pyx_n_s_fontTools_misc);
- Py_CLEAR(clear_module_state->__pyx_n_s_fontTools_varLib_iup);
- Py_CLEAR(clear_module_state->__pyx_n_s_force);
- Py_CLEAR(clear_module_state->__pyx_n_s_forced);
- Py_CLEAR(clear_module_state->__pyx_kp_u_gc);
- Py_CLEAR(clear_module_state->__pyx_n_s_genexpr);
- Py_CLEAR(clear_module_state->__pyx_n_s_i);
- Py_CLEAR(clear_module_state->__pyx_n_s_i1);
- Py_CLEAR(clear_module_state->__pyx_n_s_i2);
- Py_CLEAR(clear_module_state->__pyx_n_s_import);
- Py_CLEAR(clear_module_state->__pyx_n_s_indices);
- Py_CLEAR(clear_module_state->__pyx_n_s_int);
- Py_CLEAR(clear_module_state->__pyx_n_s_is_coroutine);
- Py_CLEAR(clear_module_state->__pyx_kp_u_isenabled);
- Py_CLEAR(clear_module_state->__pyx_n_s_it);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_contour);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_contour_bound_forced_set);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_contour_optimize);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_contour_optimize_dp);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_contour_optimize_locals_gene);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_delta);
- Py_CLEAR(clear_module_state->__pyx_n_s_iup_delta_optimize);
- Py_CLEAR(clear_module_state->__pyx_n_s_j);
- Py_CLEAR(clear_module_state->__pyx_n_s_k);
- Py_CLEAR(clear_module_state->__pyx_n_s_l);
- Py_CLEAR(clear_module_state->__pyx_n_s_lc);
- Py_CLEAR(clear_module_state->__pyx_n_s_lcj);
- Py_CLEAR(clear_module_state->__pyx_n_s_ld);
- Py_CLEAR(clear_module_state->__pyx_n_s_ldj);
- Py_CLEAR(clear_module_state->__pyx_n_s_list);
- Py_CLEAR(clear_module_state->__pyx_n_s_lookback);
- Py_CLEAR(clear_module_state->__pyx_n_s_main);
- Py_CLEAR(clear_module_state->__pyx_n_s_max);
- Py_CLEAR(clear_module_state->__pyx_n_s_n);
- Py_CLEAR(clear_module_state->__pyx_n_s_name);
- Py_CLEAR(clear_module_state->__pyx_n_s_nc);
- Py_CLEAR(clear_module_state->__pyx_n_s_ncj);
- Py_CLEAR(clear_module_state->__pyx_n_s_nd);
- Py_CLEAR(clear_module_state->__pyx_n_s_ndj);
- Py_CLEAR(clear_module_state->__pyx_n_s_numbers);
- Py_CLEAR(clear_module_state->__pyx_n_s_out);
- Py_CLEAR(clear_module_state->__pyx_n_s_range);
- Py_CLEAR(clear_module_state->__pyx_n_s_return);
- Py_CLEAR(clear_module_state->__pyx_n_s_ri1);
- Py_CLEAR(clear_module_state->__pyx_n_s_ri2);
- Py_CLEAR(clear_module_state->__pyx_n_s_rot_list);
- Py_CLEAR(clear_module_state->__pyx_n_s_rot_set);
- Py_CLEAR(clear_module_state->__pyx_n_s_s);
- Py_CLEAR(clear_module_state->__pyx_n_s_send);
- Py_CLEAR(clear_module_state->__pyx_n_s_set);
- Py_CLEAR(clear_module_state->__pyx_n_s_solution);
- Py_CLEAR(clear_module_state->__pyx_n_s_start);
- Py_CLEAR(clear_module_state->__pyx_n_s_test);
- Py_CLEAR(clear_module_state->__pyx_n_s_throw);
- Py_CLEAR(clear_module_state->__pyx_n_s_tolerance);
- Py_CLEAR(clear_module_state->__pyx_n_s_typing);
- Py_CLEAR(clear_module_state->__pyx_n_s_v);
- Py_CLEAR(clear_module_state->__pyx_n_s_zip);
- Py_CLEAR(clear_module_state->__pyx_float_0_0);
- Py_CLEAR(clear_module_state->__pyx_int_0);
- Py_CLEAR(clear_module_state->__pyx_int_1);
- Py_CLEAR(clear_module_state->__pyx_int_2);
- Py_CLEAR(clear_module_state->__pyx_int_3);
- Py_CLEAR(clear_module_state->__pyx_int_4);
- Py_CLEAR(clear_module_state->__pyx_int_8);
- Py_CLEAR(clear_module_state->__pyx_int_neg_1);
- Py_CLEAR(clear_module_state->__pyx_tuple_);
- Py_CLEAR(clear_module_state->__pyx_tuple__2);
- Py_CLEAR(clear_module_state->__pyx_tuple__4);
- Py_CLEAR(clear_module_state->__pyx_tuple__6);
- Py_CLEAR(clear_module_state->__pyx_tuple__8);
- Py_CLEAR(clear_module_state->__pyx_tuple__10);
- Py_CLEAR(clear_module_state->__pyx_tuple__11);
- Py_CLEAR(clear_module_state->__pyx_tuple__13);
- Py_CLEAR(clear_module_state->__pyx_tuple__15);
- Py_CLEAR(clear_module_state->__pyx_tuple__17);
- Py_CLEAR(clear_module_state->__pyx_tuple__19);
- Py_CLEAR(clear_module_state->__pyx_tuple__20);
- Py_CLEAR(clear_module_state->__pyx_tuple__22);
- Py_CLEAR(clear_module_state->__pyx_codeobj__5);
- Py_CLEAR(clear_module_state->__pyx_codeobj__7);
- Py_CLEAR(clear_module_state->__pyx_codeobj__9);
- Py_CLEAR(clear_module_state->__pyx_codeobj__12);
- Py_CLEAR(clear_module_state->__pyx_codeobj__14);
- Py_CLEAR(clear_module_state->__pyx_codeobj__16);
- Py_CLEAR(clear_module_state->__pyx_codeobj__18);
- Py_CLEAR(clear_module_state->__pyx_codeobj__21);
- return 0;
-}
-#endif
-/* #### Code section: module_state_traverse ### */
-#if CYTHON_USE_MODULE_STATE
-static int __pyx_m_traverse(PyObject *m, visitproc visit, void *arg) {
- __pyx_mstate *traverse_module_state = __pyx_mstate(m);
- if (!traverse_module_state) return 0;
- Py_VISIT(traverse_module_state->__pyx_d);
- Py_VISIT(traverse_module_state->__pyx_b);
- Py_VISIT(traverse_module_state->__pyx_cython_runtime);
- Py_VISIT(traverse_module_state->__pyx_empty_tuple);
- Py_VISIT(traverse_module_state->__pyx_empty_bytes);
- Py_VISIT(traverse_module_state->__pyx_empty_unicode);
- #ifdef __Pyx_CyFunction_USED
- Py_VISIT(traverse_module_state->__pyx_CyFunctionType);
- #endif
- #ifdef __Pyx_FusedFunction_USED
- Py_VISIT(traverse_module_state->__pyx_FusedFunctionType);
- #endif
- Py_VISIT(traverse_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr);
- Py_VISIT(traverse_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr);
- Py_VISIT(traverse_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize);
- Py_VISIT(traverse_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize);
- Py_VISIT(traverse_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr);
- Py_VISIT(traverse_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr);
- Py_VISIT(traverse_module_state->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr);
- Py_VISIT(traverse_module_state->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr);
- Py_VISIT(traverse_module_state->__pyx_n_s_AssertionError);
- Py_VISIT(traverse_module_state->__pyx_n_s_AttributeError);
- Py_VISIT(traverse_module_state->__pyx_n_s_COMPILED);
- Py_VISIT(traverse_module_state->__pyx_n_s_Delta);
- Py_VISIT(traverse_module_state->__pyx_n_s_DeltaOrNone);
- Py_VISIT(traverse_module_state->__pyx_n_s_DeltaOrNoneSegment);
- Py_VISIT(traverse_module_state->__pyx_n_s_DeltaSegment);
- Py_VISIT(traverse_module_state->__pyx_n_s_Endpoints);
- Py_VISIT(traverse_module_state->__pyx_n_s_ImportError);
- Py_VISIT(traverse_module_state->__pyx_n_s_Integral);
- Py_VISIT(traverse_module_state->__pyx_kp_s_Lib_fontTools_varLib_iup_py);
- Py_VISIT(traverse_module_state->__pyx_n_s_MAX_LOOKBACK);
- Py_VISIT(traverse_module_state->__pyx_n_s_Point);
- Py_VISIT(traverse_module_state->__pyx_n_s_PointSegment);
- Py_VISIT(traverse_module_state->__pyx_n_s_Real);
- Py_VISIT(traverse_module_state->__pyx_n_s_Sequence);
- Py_VISIT(traverse_module_state->__pyx_n_s_Tuple);
- Py_VISIT(traverse_module_state->__pyx_n_s_Union);
- Py_VISIT(traverse_module_state->__pyx_n_s__23);
- Py_VISIT(traverse_module_state->__pyx_kp_u__3);
- Py_VISIT(traverse_module_state->__pyx_n_s_args);
- Py_VISIT(traverse_module_state->__pyx_n_s_asyncio_coroutines);
- Py_VISIT(traverse_module_state->__pyx_n_s_best_cost);
- Py_VISIT(traverse_module_state->__pyx_n_s_best_j);
- Py_VISIT(traverse_module_state->__pyx_n_s_best_sol);
- Py_VISIT(traverse_module_state->__pyx_n_s_c);
- Py_VISIT(traverse_module_state->__pyx_n_s_c1);
- Py_VISIT(traverse_module_state->__pyx_n_s_c2);
- Py_VISIT(traverse_module_state->__pyx_n_s_can_iup_in_between_locals_genexp);
- Py_VISIT(traverse_module_state->__pyx_n_s_chain);
- Py_VISIT(traverse_module_state->__pyx_n_s_cj);
- Py_VISIT(traverse_module_state->__pyx_n_s_class_getitem);
- Py_VISIT(traverse_module_state->__pyx_n_s_cline_in_traceback);
- Py_VISIT(traverse_module_state->__pyx_n_s_close);
- Py_VISIT(traverse_module_state->__pyx_n_s_contour);
- Py_VISIT(traverse_module_state->__pyx_n_s_coords);
- Py_VISIT(traverse_module_state->__pyx_n_s_cost);
- Py_VISIT(traverse_module_state->__pyx_n_s_costs);
- Py_VISIT(traverse_module_state->__pyx_n_s_cython);
- Py_VISIT(traverse_module_state->__pyx_n_s_d);
- Py_VISIT(traverse_module_state->__pyx_n_s_d0);
- Py_VISIT(traverse_module_state->__pyx_n_s_d1);
- Py_VISIT(traverse_module_state->__pyx_n_s_d2);
- Py_VISIT(traverse_module_state->__pyx_n_s_deltas);
- Py_VISIT(traverse_module_state->__pyx_kp_u_disable);
- Py_VISIT(traverse_module_state->__pyx_n_s_dj);
- Py_VISIT(traverse_module_state->__pyx_kp_u_enable);
- Py_VISIT(traverse_module_state->__pyx_n_s_end);
- Py_VISIT(traverse_module_state->__pyx_n_s_ends);
- Py_VISIT(traverse_module_state->__pyx_n_s_enumerate);
- Py_VISIT(traverse_module_state->__pyx_n_s_fontTools_misc);
- Py_VISIT(traverse_module_state->__pyx_n_s_fontTools_varLib_iup);
- Py_VISIT(traverse_module_state->__pyx_n_s_force);
- Py_VISIT(traverse_module_state->__pyx_n_s_forced);
- Py_VISIT(traverse_module_state->__pyx_kp_u_gc);
- Py_VISIT(traverse_module_state->__pyx_n_s_genexpr);
- Py_VISIT(traverse_module_state->__pyx_n_s_i);
- Py_VISIT(traverse_module_state->__pyx_n_s_i1);
- Py_VISIT(traverse_module_state->__pyx_n_s_i2);
- Py_VISIT(traverse_module_state->__pyx_n_s_import);
- Py_VISIT(traverse_module_state->__pyx_n_s_indices);
- Py_VISIT(traverse_module_state->__pyx_n_s_int);
- Py_VISIT(traverse_module_state->__pyx_n_s_is_coroutine);
- Py_VISIT(traverse_module_state->__pyx_kp_u_isenabled);
- Py_VISIT(traverse_module_state->__pyx_n_s_it);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_contour);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_contour_bound_forced_set);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_contour_optimize);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_contour_optimize_dp);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_contour_optimize_locals_gene);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_delta);
- Py_VISIT(traverse_module_state->__pyx_n_s_iup_delta_optimize);
- Py_VISIT(traverse_module_state->__pyx_n_s_j);
- Py_VISIT(traverse_module_state->__pyx_n_s_k);
- Py_VISIT(traverse_module_state->__pyx_n_s_l);
- Py_VISIT(traverse_module_state->__pyx_n_s_lc);
- Py_VISIT(traverse_module_state->__pyx_n_s_lcj);
- Py_VISIT(traverse_module_state->__pyx_n_s_ld);
- Py_VISIT(traverse_module_state->__pyx_n_s_ldj);
- Py_VISIT(traverse_module_state->__pyx_n_s_list);
- Py_VISIT(traverse_module_state->__pyx_n_s_lookback);
- Py_VISIT(traverse_module_state->__pyx_n_s_main);
- Py_VISIT(traverse_module_state->__pyx_n_s_max);
- Py_VISIT(traverse_module_state->__pyx_n_s_n);
- Py_VISIT(traverse_module_state->__pyx_n_s_name);
- Py_VISIT(traverse_module_state->__pyx_n_s_nc);
- Py_VISIT(traverse_module_state->__pyx_n_s_ncj);
- Py_VISIT(traverse_module_state->__pyx_n_s_nd);
- Py_VISIT(traverse_module_state->__pyx_n_s_ndj);
- Py_VISIT(traverse_module_state->__pyx_n_s_numbers);
- Py_VISIT(traverse_module_state->__pyx_n_s_out);
- Py_VISIT(traverse_module_state->__pyx_n_s_range);
- Py_VISIT(traverse_module_state->__pyx_n_s_return);
- Py_VISIT(traverse_module_state->__pyx_n_s_ri1);
- Py_VISIT(traverse_module_state->__pyx_n_s_ri2);
- Py_VISIT(traverse_module_state->__pyx_n_s_rot_list);
- Py_VISIT(traverse_module_state->__pyx_n_s_rot_set);
- Py_VISIT(traverse_module_state->__pyx_n_s_s);
- Py_VISIT(traverse_module_state->__pyx_n_s_send);
- Py_VISIT(traverse_module_state->__pyx_n_s_set);
- Py_VISIT(traverse_module_state->__pyx_n_s_solution);
- Py_VISIT(traverse_module_state->__pyx_n_s_start);
- Py_VISIT(traverse_module_state->__pyx_n_s_test);
- Py_VISIT(traverse_module_state->__pyx_n_s_throw);
- Py_VISIT(traverse_module_state->__pyx_n_s_tolerance);
- Py_VISIT(traverse_module_state->__pyx_n_s_typing);
- Py_VISIT(traverse_module_state->__pyx_n_s_v);
- Py_VISIT(traverse_module_state->__pyx_n_s_zip);
- Py_VISIT(traverse_module_state->__pyx_float_0_0);
- Py_VISIT(traverse_module_state->__pyx_int_0);
- Py_VISIT(traverse_module_state->__pyx_int_1);
- Py_VISIT(traverse_module_state->__pyx_int_2);
- Py_VISIT(traverse_module_state->__pyx_int_3);
- Py_VISIT(traverse_module_state->__pyx_int_4);
- Py_VISIT(traverse_module_state->__pyx_int_8);
- Py_VISIT(traverse_module_state->__pyx_int_neg_1);
- Py_VISIT(traverse_module_state->__pyx_tuple_);
- Py_VISIT(traverse_module_state->__pyx_tuple__2);
- Py_VISIT(traverse_module_state->__pyx_tuple__4);
- Py_VISIT(traverse_module_state->__pyx_tuple__6);
- Py_VISIT(traverse_module_state->__pyx_tuple__8);
- Py_VISIT(traverse_module_state->__pyx_tuple__10);
- Py_VISIT(traverse_module_state->__pyx_tuple__11);
- Py_VISIT(traverse_module_state->__pyx_tuple__13);
- Py_VISIT(traverse_module_state->__pyx_tuple__15);
- Py_VISIT(traverse_module_state->__pyx_tuple__17);
- Py_VISIT(traverse_module_state->__pyx_tuple__19);
- Py_VISIT(traverse_module_state->__pyx_tuple__20);
- Py_VISIT(traverse_module_state->__pyx_tuple__22);
- Py_VISIT(traverse_module_state->__pyx_codeobj__5);
- Py_VISIT(traverse_module_state->__pyx_codeobj__7);
- Py_VISIT(traverse_module_state->__pyx_codeobj__9);
- Py_VISIT(traverse_module_state->__pyx_codeobj__12);
- Py_VISIT(traverse_module_state->__pyx_codeobj__14);
- Py_VISIT(traverse_module_state->__pyx_codeobj__16);
- Py_VISIT(traverse_module_state->__pyx_codeobj__18);
- Py_VISIT(traverse_module_state->__pyx_codeobj__21);
- return 0;
-}
-#endif
-/* #### Code section: module_state_defines ### */
-#define __pyx_d __pyx_mstate_global->__pyx_d
-#define __pyx_b __pyx_mstate_global->__pyx_b
-#define __pyx_cython_runtime __pyx_mstate_global->__pyx_cython_runtime
-#define __pyx_empty_tuple __pyx_mstate_global->__pyx_empty_tuple
-#define __pyx_empty_bytes __pyx_mstate_global->__pyx_empty_bytes
-#define __pyx_empty_unicode __pyx_mstate_global->__pyx_empty_unicode
-#ifdef __Pyx_CyFunction_USED
-#define __pyx_CyFunctionType __pyx_mstate_global->__pyx_CyFunctionType
-#endif
-#ifdef __Pyx_FusedFunction_USED
-#define __pyx_FusedFunctionType __pyx_mstate_global->__pyx_FusedFunctionType
-#endif
-#ifdef __Pyx_Generator_USED
-#define __pyx_GeneratorType __pyx_mstate_global->__pyx_GeneratorType
-#endif
-#ifdef __Pyx_IterableCoroutine_USED
-#define __pyx_IterableCoroutineType __pyx_mstate_global->__pyx_IterableCoroutineType
-#endif
-#ifdef __Pyx_Coroutine_USED
-#define __pyx_CoroutineAwaitType __pyx_mstate_global->__pyx_CoroutineAwaitType
-#endif
-#ifdef __Pyx_Coroutine_USED
-#define __pyx_CoroutineType __pyx_mstate_global->__pyx_CoroutineType
-#endif
-#if CYTHON_USE_MODULE_STATE
-#endif
-#if CYTHON_USE_MODULE_STATE
-#define __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr __pyx_mstate_global->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr
-#define __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize __pyx_mstate_global->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize
-#define __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr __pyx_mstate_global->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr
-#define __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr __pyx_mstate_global->__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr
-#endif
-#define __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr __pyx_mstate_global->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr
-#define __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize __pyx_mstate_global->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize
-#define __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr __pyx_mstate_global->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr
-#define __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr __pyx_mstate_global->__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr
-#define __pyx_n_s_AssertionError __pyx_mstate_global->__pyx_n_s_AssertionError
-#define __pyx_n_s_AttributeError __pyx_mstate_global->__pyx_n_s_AttributeError
-#define __pyx_n_s_COMPILED __pyx_mstate_global->__pyx_n_s_COMPILED
-#define __pyx_n_s_Delta __pyx_mstate_global->__pyx_n_s_Delta
-#define __pyx_n_s_DeltaOrNone __pyx_mstate_global->__pyx_n_s_DeltaOrNone
-#define __pyx_n_s_DeltaOrNoneSegment __pyx_mstate_global->__pyx_n_s_DeltaOrNoneSegment
-#define __pyx_n_s_DeltaSegment __pyx_mstate_global->__pyx_n_s_DeltaSegment
-#define __pyx_n_s_Endpoints __pyx_mstate_global->__pyx_n_s_Endpoints
-#define __pyx_n_s_ImportError __pyx_mstate_global->__pyx_n_s_ImportError
-#define __pyx_n_s_Integral __pyx_mstate_global->__pyx_n_s_Integral
-#define __pyx_kp_s_Lib_fontTools_varLib_iup_py __pyx_mstate_global->__pyx_kp_s_Lib_fontTools_varLib_iup_py
-#define __pyx_n_s_MAX_LOOKBACK __pyx_mstate_global->__pyx_n_s_MAX_LOOKBACK
-#define __pyx_n_s_Point __pyx_mstate_global->__pyx_n_s_Point
-#define __pyx_n_s_PointSegment __pyx_mstate_global->__pyx_n_s_PointSegment
-#define __pyx_n_s_Real __pyx_mstate_global->__pyx_n_s_Real
-#define __pyx_n_s_Sequence __pyx_mstate_global->__pyx_n_s_Sequence
-#define __pyx_n_s_Tuple __pyx_mstate_global->__pyx_n_s_Tuple
-#define __pyx_n_s_Union __pyx_mstate_global->__pyx_n_s_Union
-#define __pyx_n_s__23 __pyx_mstate_global->__pyx_n_s__23
-#define __pyx_kp_u__3 __pyx_mstate_global->__pyx_kp_u__3
-#define __pyx_n_s_args __pyx_mstate_global->__pyx_n_s_args
-#define __pyx_n_s_asyncio_coroutines __pyx_mstate_global->__pyx_n_s_asyncio_coroutines
-#define __pyx_n_s_best_cost __pyx_mstate_global->__pyx_n_s_best_cost
-#define __pyx_n_s_best_j __pyx_mstate_global->__pyx_n_s_best_j
-#define __pyx_n_s_best_sol __pyx_mstate_global->__pyx_n_s_best_sol
-#define __pyx_n_s_c __pyx_mstate_global->__pyx_n_s_c
-#define __pyx_n_s_c1 __pyx_mstate_global->__pyx_n_s_c1
-#define __pyx_n_s_c2 __pyx_mstate_global->__pyx_n_s_c2
-#define __pyx_n_s_can_iup_in_between_locals_genexp __pyx_mstate_global->__pyx_n_s_can_iup_in_between_locals_genexp
-#define __pyx_n_s_chain __pyx_mstate_global->__pyx_n_s_chain
-#define __pyx_n_s_cj __pyx_mstate_global->__pyx_n_s_cj
-#define __pyx_n_s_class_getitem __pyx_mstate_global->__pyx_n_s_class_getitem
-#define __pyx_n_s_cline_in_traceback __pyx_mstate_global->__pyx_n_s_cline_in_traceback
-#define __pyx_n_s_close __pyx_mstate_global->__pyx_n_s_close
-#define __pyx_n_s_contour __pyx_mstate_global->__pyx_n_s_contour
-#define __pyx_n_s_coords __pyx_mstate_global->__pyx_n_s_coords
-#define __pyx_n_s_cost __pyx_mstate_global->__pyx_n_s_cost
-#define __pyx_n_s_costs __pyx_mstate_global->__pyx_n_s_costs
-#define __pyx_n_s_cython __pyx_mstate_global->__pyx_n_s_cython
-#define __pyx_n_s_d __pyx_mstate_global->__pyx_n_s_d
-#define __pyx_n_s_d0 __pyx_mstate_global->__pyx_n_s_d0
-#define __pyx_n_s_d1 __pyx_mstate_global->__pyx_n_s_d1
-#define __pyx_n_s_d2 __pyx_mstate_global->__pyx_n_s_d2
-#define __pyx_n_s_deltas __pyx_mstate_global->__pyx_n_s_deltas
-#define __pyx_kp_u_disable __pyx_mstate_global->__pyx_kp_u_disable
-#define __pyx_n_s_dj __pyx_mstate_global->__pyx_n_s_dj
-#define __pyx_kp_u_enable __pyx_mstate_global->__pyx_kp_u_enable
-#define __pyx_n_s_end __pyx_mstate_global->__pyx_n_s_end
-#define __pyx_n_s_ends __pyx_mstate_global->__pyx_n_s_ends
-#define __pyx_n_s_enumerate __pyx_mstate_global->__pyx_n_s_enumerate
-#define __pyx_n_s_fontTools_misc __pyx_mstate_global->__pyx_n_s_fontTools_misc
-#define __pyx_n_s_fontTools_varLib_iup __pyx_mstate_global->__pyx_n_s_fontTools_varLib_iup
-#define __pyx_n_s_force __pyx_mstate_global->__pyx_n_s_force
-#define __pyx_n_s_forced __pyx_mstate_global->__pyx_n_s_forced
-#define __pyx_kp_u_gc __pyx_mstate_global->__pyx_kp_u_gc
-#define __pyx_n_s_genexpr __pyx_mstate_global->__pyx_n_s_genexpr
-#define __pyx_n_s_i __pyx_mstate_global->__pyx_n_s_i
-#define __pyx_n_s_i1 __pyx_mstate_global->__pyx_n_s_i1
-#define __pyx_n_s_i2 __pyx_mstate_global->__pyx_n_s_i2
-#define __pyx_n_s_import __pyx_mstate_global->__pyx_n_s_import
-#define __pyx_n_s_indices __pyx_mstate_global->__pyx_n_s_indices
-#define __pyx_n_s_int __pyx_mstate_global->__pyx_n_s_int
-#define __pyx_n_s_is_coroutine __pyx_mstate_global->__pyx_n_s_is_coroutine
-#define __pyx_kp_u_isenabled __pyx_mstate_global->__pyx_kp_u_isenabled
-#define __pyx_n_s_it __pyx_mstate_global->__pyx_n_s_it
-#define __pyx_n_s_iup_contour __pyx_mstate_global->__pyx_n_s_iup_contour
-#define __pyx_n_s_iup_contour_bound_forced_set __pyx_mstate_global->__pyx_n_s_iup_contour_bound_forced_set
-#define __pyx_n_s_iup_contour_optimize __pyx_mstate_global->__pyx_n_s_iup_contour_optimize
-#define __pyx_n_s_iup_contour_optimize_dp __pyx_mstate_global->__pyx_n_s_iup_contour_optimize_dp
-#define __pyx_n_s_iup_contour_optimize_locals_gene __pyx_mstate_global->__pyx_n_s_iup_contour_optimize_locals_gene
-#define __pyx_n_s_iup_delta __pyx_mstate_global->__pyx_n_s_iup_delta
-#define __pyx_n_s_iup_delta_optimize __pyx_mstate_global->__pyx_n_s_iup_delta_optimize
-#define __pyx_n_s_j __pyx_mstate_global->__pyx_n_s_j
-#define __pyx_n_s_k __pyx_mstate_global->__pyx_n_s_k
-#define __pyx_n_s_l __pyx_mstate_global->__pyx_n_s_l
-#define __pyx_n_s_lc __pyx_mstate_global->__pyx_n_s_lc
-#define __pyx_n_s_lcj __pyx_mstate_global->__pyx_n_s_lcj
-#define __pyx_n_s_ld __pyx_mstate_global->__pyx_n_s_ld
-#define __pyx_n_s_ldj __pyx_mstate_global->__pyx_n_s_ldj
-#define __pyx_n_s_list __pyx_mstate_global->__pyx_n_s_list
-#define __pyx_n_s_lookback __pyx_mstate_global->__pyx_n_s_lookback
-#define __pyx_n_s_main __pyx_mstate_global->__pyx_n_s_main
-#define __pyx_n_s_max __pyx_mstate_global->__pyx_n_s_max
-#define __pyx_n_s_n __pyx_mstate_global->__pyx_n_s_n
-#define __pyx_n_s_name __pyx_mstate_global->__pyx_n_s_name
-#define __pyx_n_s_nc __pyx_mstate_global->__pyx_n_s_nc
-#define __pyx_n_s_ncj __pyx_mstate_global->__pyx_n_s_ncj
-#define __pyx_n_s_nd __pyx_mstate_global->__pyx_n_s_nd
-#define __pyx_n_s_ndj __pyx_mstate_global->__pyx_n_s_ndj
-#define __pyx_n_s_numbers __pyx_mstate_global->__pyx_n_s_numbers
-#define __pyx_n_s_out __pyx_mstate_global->__pyx_n_s_out
-#define __pyx_n_s_range __pyx_mstate_global->__pyx_n_s_range
-#define __pyx_n_s_return __pyx_mstate_global->__pyx_n_s_return
-#define __pyx_n_s_ri1 __pyx_mstate_global->__pyx_n_s_ri1
-#define __pyx_n_s_ri2 __pyx_mstate_global->__pyx_n_s_ri2
-#define __pyx_n_s_rot_list __pyx_mstate_global->__pyx_n_s_rot_list
-#define __pyx_n_s_rot_set __pyx_mstate_global->__pyx_n_s_rot_set
-#define __pyx_n_s_s __pyx_mstate_global->__pyx_n_s_s
-#define __pyx_n_s_send __pyx_mstate_global->__pyx_n_s_send
-#define __pyx_n_s_set __pyx_mstate_global->__pyx_n_s_set
-#define __pyx_n_s_solution __pyx_mstate_global->__pyx_n_s_solution
-#define __pyx_n_s_start __pyx_mstate_global->__pyx_n_s_start
-#define __pyx_n_s_test __pyx_mstate_global->__pyx_n_s_test
-#define __pyx_n_s_throw __pyx_mstate_global->__pyx_n_s_throw
-#define __pyx_n_s_tolerance __pyx_mstate_global->__pyx_n_s_tolerance
-#define __pyx_n_s_typing __pyx_mstate_global->__pyx_n_s_typing
-#define __pyx_n_s_v __pyx_mstate_global->__pyx_n_s_v
-#define __pyx_n_s_zip __pyx_mstate_global->__pyx_n_s_zip
-#define __pyx_float_0_0 __pyx_mstate_global->__pyx_float_0_0
-#define __pyx_int_0 __pyx_mstate_global->__pyx_int_0
-#define __pyx_int_1 __pyx_mstate_global->__pyx_int_1
-#define __pyx_int_2 __pyx_mstate_global->__pyx_int_2
-#define __pyx_int_3 __pyx_mstate_global->__pyx_int_3
-#define __pyx_int_4 __pyx_mstate_global->__pyx_int_4
-#define __pyx_int_8 __pyx_mstate_global->__pyx_int_8
-#define __pyx_int_neg_1 __pyx_mstate_global->__pyx_int_neg_1
-#define __pyx_tuple_ __pyx_mstate_global->__pyx_tuple_
-#define __pyx_tuple__2 __pyx_mstate_global->__pyx_tuple__2
-#define __pyx_tuple__4 __pyx_mstate_global->__pyx_tuple__4
-#define __pyx_tuple__6 __pyx_mstate_global->__pyx_tuple__6
-#define __pyx_tuple__8 __pyx_mstate_global->__pyx_tuple__8
-#define __pyx_tuple__10 __pyx_mstate_global->__pyx_tuple__10
-#define __pyx_tuple__11 __pyx_mstate_global->__pyx_tuple__11
-#define __pyx_tuple__13 __pyx_mstate_global->__pyx_tuple__13
-#define __pyx_tuple__15 __pyx_mstate_global->__pyx_tuple__15
-#define __pyx_tuple__17 __pyx_mstate_global->__pyx_tuple__17
-#define __pyx_tuple__19 __pyx_mstate_global->__pyx_tuple__19
-#define __pyx_tuple__20 __pyx_mstate_global->__pyx_tuple__20
-#define __pyx_tuple__22 __pyx_mstate_global->__pyx_tuple__22
-#define __pyx_codeobj__5 __pyx_mstate_global->__pyx_codeobj__5
-#define __pyx_codeobj__7 __pyx_mstate_global->__pyx_codeobj__7
-#define __pyx_codeobj__9 __pyx_mstate_global->__pyx_codeobj__9
-#define __pyx_codeobj__12 __pyx_mstate_global->__pyx_codeobj__12
-#define __pyx_codeobj__14 __pyx_mstate_global->__pyx_codeobj__14
-#define __pyx_codeobj__16 __pyx_mstate_global->__pyx_codeobj__16
-#define __pyx_codeobj__18 __pyx_mstate_global->__pyx_codeobj__18
-#define __pyx_codeobj__21 __pyx_mstate_global->__pyx_codeobj__21
-/* #### Code section: module_code ### */
-
-/* "fontTools/varLib/iup.py":41
- *
- *
- * @cython.cfunc # <<<<<<<<<<<<<<
- * @cython.locals(
- * j=cython.int,
- */
-
-static PyObject *__pyx_f_9fontTools_6varLib_3iup_iup_segment(PyObject *__pyx_v_coords, PyObject *__pyx_v_rc1, PyObject *__pyx_v_rd1, PyObject *__pyx_v_rc2, PyObject *__pyx_v_rd2) {
- int __pyx_v_j;
- int __pyx_v_n;
- double __pyx_v_x1;
- double __pyx_v_x2;
- double __pyx_v_d1;
- double __pyx_v_d2;
- double __pyx_v_scale;
- double __pyx_v_x;
- double __pyx_v_d;
- PyObject *__pyx_v_out_arrays = NULL;
- PyObject *__pyx_v_out = NULL;
- PyObject *__pyx_v_pair = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- PyObject *__pyx_t_1 = NULL;
- Py_ssize_t __pyx_t_2;
- PyObject *__pyx_t_3 = NULL;
- int __pyx_t_4;
- double __pyx_t_5;
- double __pyx_t_6;
- double __pyx_t_7;
- double __pyx_t_8;
- int __pyx_t_9;
- Py_ssize_t __pyx_t_10;
- PyObject *__pyx_t_11 = NULL;
- int __pyx_t_12;
- PyObject *(*__pyx_t_13)(PyObject *);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("iup_segment", 0);
-
- /* "fontTools/varLib/iup.py":62
- * # rc1 = reference coord 1
- * # rd1 = reference delta 1
- * out_arrays = [None, None] # <<<<<<<<<<<<<<
- * for j in 0, 1:
- * out_arrays[j] = out = []
- */
- __pyx_t_1 = PyList_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 62, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyList_SET_ITEM(__pyx_t_1, 0, Py_None);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyList_SET_ITEM(__pyx_t_1, 1, Py_None);
- __pyx_v_out_arrays = ((PyObject*)__pyx_t_1);
- __pyx_t_1 = 0;
-
- /* "fontTools/varLib/iup.py":63
- * # rd1 = reference delta 1
- * out_arrays = [None, None]
- * for j in 0, 1: # <<<<<<<<<<<<<<
- * out_arrays[j] = out = []
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- */
- __pyx_t_1 = __pyx_tuple_; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0;
- for (;;) {
- if (__pyx_t_2 >= 2) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_3 = PyTuple_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_3); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 63, __pyx_L1_error)
- #else
- __pyx_t_3 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 63, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- __pyx_t_4 = __Pyx_PyInt_As_int(__pyx_t_3); if (unlikely((__pyx_t_4 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 63, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_v_j = __pyx_t_4;
-
- /* "fontTools/varLib/iup.py":64
- * out_arrays = [None, None]
- * for j in 0, 1:
- * out_arrays[j] = out = [] # <<<<<<<<<<<<<<
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- *
- */
- __pyx_t_3 = PyList_New(0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 64, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- if (unlikely((__Pyx_SetItemInt(__pyx_v_out_arrays, __pyx_v_j, __pyx_t_3, int, 1, __Pyx_PyInt_From_int, 1, 1, 1) < 0))) __PYX_ERR(0, 64, __pyx_L1_error)
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_XDECREF_SET(__pyx_v_out, __pyx_t_3);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":65
- * for j in 0, 1:
- * out_arrays[j] = out = []
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j] # <<<<<<<<<<<<<<
- *
- * if x1 == x2:
- */
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_rc1, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_5 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_rc2, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_6 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_6 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_rd1, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_7 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_rd2, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_8 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_8 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 65, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_v_x1 = __pyx_t_5;
- __pyx_v_x2 = __pyx_t_6;
- __pyx_v_d1 = __pyx_t_7;
- __pyx_v_d2 = __pyx_t_8;
-
- /* "fontTools/varLib/iup.py":67
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- *
- * if x1 == x2: # <<<<<<<<<<<<<<
- * n = len(coords)
- * if d1 == d2:
- */
- __pyx_t_9 = (__pyx_v_x1 == __pyx_v_x2);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":68
- *
- * if x1 == x2:
- * n = len(coords) # <<<<<<<<<<<<<<
- * if d1 == d2:
- * out.extend([d1] * n)
- */
- __pyx_t_10 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_10 == ((Py_ssize_t)-1))) __PYX_ERR(0, 68, __pyx_L1_error)
- __pyx_v_n = __pyx_t_10;
-
- /* "fontTools/varLib/iup.py":69
- * if x1 == x2:
- * n = len(coords)
- * if d1 == d2: # <<<<<<<<<<<<<<
- * out.extend([d1] * n)
- * else:
- */
- __pyx_t_9 = (__pyx_v_d1 == __pyx_v_d2);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":70
- * n = len(coords)
- * if d1 == d2:
- * out.extend([d1] * n) # <<<<<<<<<<<<<<
- * else:
- * out.extend([0] * n)
- */
- __pyx_t_3 = PyFloat_FromDouble(__pyx_v_d1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 70, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_11 = PyList_New(1 * ((__pyx_v_n<0) ? 0:__pyx_v_n)); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 70, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- { Py_ssize_t __pyx_temp;
- for (__pyx_temp=0; __pyx_temp < __pyx_v_n; __pyx_temp++) {
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_GIVEREF(__pyx_t_3);
- PyList_SET_ITEM(__pyx_t_11, __pyx_temp, __pyx_t_3);
- }
- }
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_12 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_t_11); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 70, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
-
- /* "fontTools/varLib/iup.py":69
- * if x1 == x2:
- * n = len(coords)
- * if d1 == d2: # <<<<<<<<<<<<<<
- * out.extend([d1] * n)
- * else:
- */
- goto __pyx_L6;
- }
-
- /* "fontTools/varLib/iup.py":72
- * out.extend([d1] * n)
- * else:
- * out.extend([0] * n) # <<<<<<<<<<<<<<
- * continue
- *
- */
- /*else*/ {
- __pyx_t_11 = PyList_New(1 * ((__pyx_v_n<0) ? 0:__pyx_v_n)); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 72, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- { Py_ssize_t __pyx_temp;
- for (__pyx_temp=0; __pyx_temp < __pyx_v_n; __pyx_temp++) {
- __Pyx_INCREF(__pyx_int_0);
- __Pyx_GIVEREF(__pyx_int_0);
- PyList_SET_ITEM(__pyx_t_11, __pyx_temp, __pyx_int_0);
- }
- }
- __pyx_t_12 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_t_11); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 72, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
- }
- __pyx_L6:;
-
- /* "fontTools/varLib/iup.py":73
- * else:
- * out.extend([0] * n)
- * continue # <<<<<<<<<<<<<<
- *
- * if x1 > x2:
- */
- goto __pyx_L3_continue;
-
- /* "fontTools/varLib/iup.py":67
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- *
- * if x1 == x2: # <<<<<<<<<<<<<<
- * n = len(coords)
- * if d1 == d2:
- */
- }
-
- /* "fontTools/varLib/iup.py":75
- * continue
- *
- * if x1 > x2: # <<<<<<<<<<<<<<
- * x1, x2 = x2, x1
- * d1, d2 = d2, d1
- */
- __pyx_t_9 = (__pyx_v_x1 > __pyx_v_x2);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":76
- *
- * if x1 > x2:
- * x1, x2 = x2, x1 # <<<<<<<<<<<<<<
- * d1, d2 = d2, d1
- *
- */
- __pyx_t_8 = __pyx_v_x2;
- __pyx_t_7 = __pyx_v_x1;
- __pyx_v_x1 = __pyx_t_8;
- __pyx_v_x2 = __pyx_t_7;
-
- /* "fontTools/varLib/iup.py":77
- * if x1 > x2:
- * x1, x2 = x2, x1
- * d1, d2 = d2, d1 # <<<<<<<<<<<<<<
- *
- * # x1 < x2
- */
- __pyx_t_7 = __pyx_v_d2;
- __pyx_t_8 = __pyx_v_d1;
- __pyx_v_d1 = __pyx_t_7;
- __pyx_v_d2 = __pyx_t_8;
-
- /* "fontTools/varLib/iup.py":75
- * continue
- *
- * if x1 > x2: # <<<<<<<<<<<<<<
- * x1, x2 = x2, x1
- * d1, d2 = d2, d1
- */
- }
-
- /* "fontTools/varLib/iup.py":80
- *
- * # x1 < x2
- * scale = (d2 - d1) / (x2 - x1) # <<<<<<<<<<<<<<
- * for pair in coords:
- * x = pair[j]
- */
- __pyx_t_8 = (__pyx_v_d2 - __pyx_v_d1);
- __pyx_t_7 = (__pyx_v_x2 - __pyx_v_x1);
- if (unlikely(__pyx_t_7 == 0)) {
- PyErr_SetString(PyExc_ZeroDivisionError, "float division");
- __PYX_ERR(0, 80, __pyx_L1_error)
- }
- __pyx_v_scale = (__pyx_t_8 / __pyx_t_7);
-
- /* "fontTools/varLib/iup.py":81
- * # x1 < x2
- * scale = (d2 - d1) / (x2 - x1)
- * for pair in coords: # <<<<<<<<<<<<<<
- * x = pair[j]
- *
- */
- if (likely(PyList_CheckExact(__pyx_v_coords)) || PyTuple_CheckExact(__pyx_v_coords)) {
- __pyx_t_11 = __pyx_v_coords; __Pyx_INCREF(__pyx_t_11); __pyx_t_10 = 0;
- __pyx_t_13 = NULL;
- } else {
- __pyx_t_10 = -1; __pyx_t_11 = PyObject_GetIter(__pyx_v_coords); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 81, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __pyx_t_13 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_11); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 81, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_13)) {
- if (likely(PyList_CheckExact(__pyx_t_11))) {
- if (__pyx_t_10 >= PyList_GET_SIZE(__pyx_t_11)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_3 = PyList_GET_ITEM(__pyx_t_11, __pyx_t_10); __Pyx_INCREF(__pyx_t_3); __pyx_t_10++; if (unlikely((0 < 0))) __PYX_ERR(0, 81, __pyx_L1_error)
- #else
- __pyx_t_3 = PySequence_ITEM(__pyx_t_11, __pyx_t_10); __pyx_t_10++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 81, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- } else {
- if (__pyx_t_10 >= PyTuple_GET_SIZE(__pyx_t_11)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_3 = PyTuple_GET_ITEM(__pyx_t_11, __pyx_t_10); __Pyx_INCREF(__pyx_t_3); __pyx_t_10++; if (unlikely((0 < 0))) __PYX_ERR(0, 81, __pyx_L1_error)
- #else
- __pyx_t_3 = PySequence_ITEM(__pyx_t_11, __pyx_t_10); __pyx_t_10++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 81, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- }
- } else {
- __pyx_t_3 = __pyx_t_13(__pyx_t_11);
- if (unlikely(!__pyx_t_3)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 81, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_3);
- }
- __Pyx_XDECREF_SET(__pyx_v_pair, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":82
- * scale = (d2 - d1) / (x2 - x1)
- * for pair in coords:
- * x = pair[j] # <<<<<<<<<<<<<<
- *
- * if x <= x1:
- */
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_pair, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 82, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_7 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 82, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_v_x = __pyx_t_7;
-
- /* "fontTools/varLib/iup.py":84
- * x = pair[j]
- *
- * if x <= x1: # <<<<<<<<<<<<<<
- * d = d1
- * elif x >= x2:
- */
- __pyx_t_9 = (__pyx_v_x <= __pyx_v_x1);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":85
- *
- * if x <= x1:
- * d = d1 # <<<<<<<<<<<<<<
- * elif x >= x2:
- * d = d2
- */
- __pyx_v_d = __pyx_v_d1;
-
- /* "fontTools/varLib/iup.py":84
- * x = pair[j]
- *
- * if x <= x1: # <<<<<<<<<<<<<<
- * d = d1
- * elif x >= x2:
- */
- goto __pyx_L10;
- }
-
- /* "fontTools/varLib/iup.py":86
- * if x <= x1:
- * d = d1
- * elif x >= x2: # <<<<<<<<<<<<<<
- * d = d2
- * else:
- */
- __pyx_t_9 = (__pyx_v_x >= __pyx_v_x2);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":87
- * d = d1
- * elif x >= x2:
- * d = d2 # <<<<<<<<<<<<<<
- * else:
- * # Interpolate
- */
- __pyx_v_d = __pyx_v_d2;
-
- /* "fontTools/varLib/iup.py":86
- * if x <= x1:
- * d = d1
- * elif x >= x2: # <<<<<<<<<<<<<<
- * d = d2
- * else:
- */
- goto __pyx_L10;
- }
-
- /* "fontTools/varLib/iup.py":90
- * else:
- * # Interpolate
- * d = d1 + (x - x1) * scale # <<<<<<<<<<<<<<
- *
- * out.append(d)
- */
- /*else*/ {
- __pyx_v_d = (__pyx_v_d1 + ((__pyx_v_x - __pyx_v_x1) * __pyx_v_scale));
- }
- __pyx_L10:;
-
- /* "fontTools/varLib/iup.py":92
- * d = d1 + (x - x1) * scale
- *
- * out.append(d) # <<<<<<<<<<<<<<
- *
- * return zip(*out_arrays)
- */
- __pyx_t_3 = PyFloat_FromDouble(__pyx_v_d); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 92, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_12 = __Pyx_PyList_Append(__pyx_v_out, __pyx_t_3); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 92, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":81
- * # x1 < x2
- * scale = (d2 - d1) / (x2 - x1)
- * for pair in coords: # <<<<<<<<<<<<<<
- * x = pair[j]
- *
- */
- }
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
-
- /* "fontTools/varLib/iup.py":63
- * # rd1 = reference delta 1
- * out_arrays = [None, None]
- * for j in 0, 1: # <<<<<<<<<<<<<<
- * out_arrays[j] = out = []
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- */
- __pyx_L3_continue:;
- }
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
-
- /* "fontTools/varLib/iup.py":94
- * out.append(d)
- *
- * return zip(*out_arrays) # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_1 = PySequence_Tuple(__pyx_v_out_arrays); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 94, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __pyx_t_11 = __Pyx_PyObject_Call(__pyx_builtin_zip, __pyx_t_1, NULL); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 94, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- __pyx_r = __pyx_t_11;
- __pyx_t_11 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":41
- *
- *
- * @cython.cfunc # <<<<<<<<<<<<<<
- * @cython.locals(
- * j=cython.int,
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_11);
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_segment", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = 0;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_out_arrays);
- __Pyx_XDECREF(__pyx_v_out);
- __Pyx_XDECREF(__pyx_v_pair);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
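The generated C above is hard to follow, but the `/* "fontTools/varLib/iup.py" */` comments quote the original Python source line by line. Reassembling those quoted lines (62–94) gives roughly this sketch of `iup_segment`; the signature is inferred from the call sites later in this file (`iup_segment(coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2])`), so argument names are an assumption:

```python
def iup_segment(coords, rc1, rd1, rc2, rd2):
    # rc1/rc2: reference coordinates; rd1/rd2: reference deltas.
    # Interpolate a delta for every point in `coords`, per axis.
    out_arrays = [None, None]
    for j in 0, 1:  # j=0: x axis, j=1: y axis
        out_arrays[j] = out = []
        x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]

        if x1 == x2:
            # Degenerate span: same delta everywhere, or 0 if ambiguous.
            n = len(coords)
            if d1 == d2:
                out.extend([d1] * n)
            else:
                out.extend([0] * n)
            continue

        if x1 > x2:
            x1, x2 = x2, x1
            d1, d2 = d2, d1

        # x1 < x2: linear interpolation, clamped at the endpoints.
        scale = (d2 - d1) / (x2 - x1)
        for pair in coords:
            x = pair[j]
            if x <= x1:
                d = d1
            elif x >= x2:
                d = d2
            else:
                d = d1 + (x - x1) * scale
            out.append(d)

    return zip(*out_arrays)
```

Note how the generated C mirrors this exactly: the `PyList_New(1 * ((__pyx_v_n<0) ? 0:__pyx_v_n))` loops implement `[d1] * n` and `[0] * n`, and the explicit `ZeroDivisionError` check before `__pyx_v_scale = (__pyx_t_8 / __pyx_t_7)` guards the `(d2 - d1) / (x2 - x1)` division.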
-
-/* "fontTools/varLib/iup.py":97
- *
- *
- * def iup_contour(deltas: _DeltaOrNoneSegment, coords: _PointSegment) -> _DeltaSegment: # <<<<<<<<<<<<<<
- * """For the contour given in `coords`, interpolate any missing
- * delta values in delta vector `deltas`.
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_1iup_contour(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_iup_contour, "iup_contour(deltas: _DeltaOrNoneSegment, coords: _PointSegment) -> _DeltaSegment\nFor the contour given in `coords`, interpolate any missing\n delta values in delta vector `deltas`.\n\n Returns fully filled-out delta vector.");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_1iup_contour = {"iup_contour", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_1iup_contour, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_iup_contour};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_1iup_contour(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("iup_contour (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,0};
- PyObject* values[2] = {0,0};
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 97, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 97, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_contour", 1, 2, 2, 1); __PYX_ERR(0, 97, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "iup_contour") < 0)) __PYX_ERR(0, 97, __pyx_L3_error)
- }
- } else if (unlikely(__pyx_nargs != 2)) {
- goto __pyx_L5_argtuple_error;
- } else {
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("iup_contour", 1, 2, 2, __pyx_nargs); __PYX_ERR(0, 97, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_iup_contour(__pyx_self, __pyx_v_deltas, __pyx_v_coords);
-
- /* function exit code */
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_iup_contour(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords) {
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_v_indices = NULL;
- PyObject *__pyx_v_out = NULL;
- PyObject *__pyx_v_it = NULL;
- PyObject *__pyx_v_start = NULL;
- PyObject *__pyx_v_i1 = NULL;
- PyObject *__pyx_v_i2 = NULL;
- PyObject *__pyx_v_ri1 = NULL;
- PyObject *__pyx_v_ri2 = NULL;
- PyObject *__pyx_v_end = NULL;
- PyObject *__pyx_7genexpr__pyx_v_i = NULL;
- PyObject *__pyx_7genexpr__pyx_v_v = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- Py_ssize_t __pyx_t_1;
- Py_ssize_t __pyx_t_2;
- int __pyx_t_3;
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- PyObject *__pyx_t_6 = NULL;
- PyObject *(*__pyx_t_7)(PyObject *);
- PyObject *__pyx_t_8 = NULL;
- int __pyx_t_9;
- PyObject *__pyx_t_10 = NULL;
- PyObject *__pyx_t_11 = NULL;
- int __pyx_t_12;
- PyObject *__pyx_t_13 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("iup_contour", 0);
-
- /* "fontTools/varLib/iup.py":103
- * Returns fully filled-out delta vector."""
- *
- * assert len(deltas) == len(coords) # <<<<<<<<<<<<<<
- * if None not in deltas:
- * return deltas
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_1 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 103, __pyx_L1_error)
- __pyx_t_2 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 103, __pyx_L1_error)
- __pyx_t_3 = (__pyx_t_1 == __pyx_t_2);
- if (unlikely(!__pyx_t_3)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 103, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 103, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":104
- *
- * assert len(deltas) == len(coords)
- * if None not in deltas: # <<<<<<<<<<<<<<
- * return deltas
- *
- */
- __pyx_t_3 = (__Pyx_PySequence_ContainsTF(Py_None, __pyx_v_deltas, Py_NE)); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 104, __pyx_L1_error)
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":105
- * assert len(deltas) == len(coords)
- * if None not in deltas:
- * return deltas # <<<<<<<<<<<<<<
- *
- * n = len(deltas)
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_deltas);
- __pyx_r = __pyx_v_deltas;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":104
- *
- * assert len(deltas) == len(coords)
- * if None not in deltas: # <<<<<<<<<<<<<<
- * return deltas
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":107
- * return deltas
- *
- * n = len(deltas) # <<<<<<<<<<<<<<
- * # indices of points with explicit deltas
- * indices = [i for i, v in enumerate(deltas) if v is not None]
- */
- __pyx_t_2 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 107, __pyx_L1_error)
- __pyx_t_4 = PyInt_FromSsize_t(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 107, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_n = __pyx_t_4;
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":109
- * n = len(deltas)
- * # indices of points with explicit deltas
- * indices = [i for i, v in enumerate(deltas) if v is not None] # <<<<<<<<<<<<<<
- * if not indices:
- * # All deltas are None. Return 0,0 for all.
- */
- { /* enter inner scope */
- __pyx_t_4 = PyList_New(0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 109, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_INCREF(__pyx_int_0);
- __pyx_t_5 = __pyx_int_0;
- if (likely(PyList_CheckExact(__pyx_v_deltas)) || PyTuple_CheckExact(__pyx_v_deltas)) {
- __pyx_t_6 = __pyx_v_deltas; __Pyx_INCREF(__pyx_t_6); __pyx_t_2 = 0;
- __pyx_t_7 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_6 = PyObject_GetIter(__pyx_v_deltas); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 109, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_7 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_6); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 109, __pyx_L6_error)
- }
- for (;;) {
- if (likely(!__pyx_t_7)) {
- if (likely(PyList_CheckExact(__pyx_t_6))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_6)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_8 = PyList_GET_ITEM(__pyx_t_6, __pyx_t_2); __Pyx_INCREF(__pyx_t_8); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 109, __pyx_L6_error)
- #else
- __pyx_t_8 = PySequence_ITEM(__pyx_t_6, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 109, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_6)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_8 = PyTuple_GET_ITEM(__pyx_t_6, __pyx_t_2); __Pyx_INCREF(__pyx_t_8); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 109, __pyx_L6_error)
- #else
- __pyx_t_8 = PySequence_ITEM(__pyx_t_6, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 109, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- #endif
- }
- } else {
- __pyx_t_8 = __pyx_t_7(__pyx_t_6);
- if (unlikely(!__pyx_t_8)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 109, __pyx_L6_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_8);
- }
- __Pyx_XDECREF_SET(__pyx_7genexpr__pyx_v_v, __pyx_t_8);
- __pyx_t_8 = 0;
- __Pyx_INCREF(__pyx_t_5);
- __Pyx_XDECREF_SET(__pyx_7genexpr__pyx_v_i, __pyx_t_5);
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_5, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 109, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_5);
- __pyx_t_5 = __pyx_t_8;
- __pyx_t_8 = 0;
- __pyx_t_3 = (__pyx_7genexpr__pyx_v_v != Py_None);
- if (__pyx_t_3) {
- if (unlikely(__Pyx_ListComp_Append(__pyx_t_4, (PyObject*)__pyx_7genexpr__pyx_v_i))) __PYX_ERR(0, 109, __pyx_L6_error)
- }
- }
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_i); __pyx_7genexpr__pyx_v_i = 0;
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_v); __pyx_7genexpr__pyx_v_v = 0;
- goto __pyx_L11_exit_scope;
- __pyx_L6_error:;
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_i); __pyx_7genexpr__pyx_v_i = 0;
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_v); __pyx_7genexpr__pyx_v_v = 0;
- goto __pyx_L1_error;
- __pyx_L11_exit_scope:;
- } /* exit inner scope */
- __pyx_v_indices = ((PyObject*)__pyx_t_4);
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":110
- * # indices of points with explicit deltas
- * indices = [i for i, v in enumerate(deltas) if v is not None]
- * if not indices: # <<<<<<<<<<<<<<
- * # All deltas are None. Return 0,0 for all.
- * return [(0, 0)] * n
- */
- __pyx_t_3 = (PyList_GET_SIZE(__pyx_v_indices) != 0);
- __pyx_t_9 = (!__pyx_t_3);
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":112
- * if not indices:
- * # All deltas are None. Return 0,0 for all.
- * return [(0, 0)] * n # <<<<<<<<<<<<<<
- *
- * out = []
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_4 = PyList_New(1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 112, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_INCREF(__pyx_tuple__2);
- __Pyx_GIVEREF(__pyx_tuple__2);
- PyList_SET_ITEM(__pyx_t_4, 0, __pyx_tuple__2);
- { PyObject* __pyx_temp = PyNumber_InPlaceMultiply(__pyx_t_4, __pyx_v_n); if (unlikely(!__pyx_temp)) __PYX_ERR(0, 112, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_temp);
- __Pyx_DECREF(__pyx_t_4);
- __pyx_t_4 = __pyx_temp;
- }
- __pyx_r = __pyx_t_4;
- __pyx_t_4 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":110
- * # indices of points with explicit deltas
- * indices = [i for i, v in enumerate(deltas) if v is not None]
- * if not indices: # <<<<<<<<<<<<<<
- * # All deltas are None. Return 0,0 for all.
- * return [(0, 0)] * n
- */
- }
-
- /* "fontTools/varLib/iup.py":114
- * return [(0, 0)] * n
- *
- * out = [] # <<<<<<<<<<<<<<
- * it = iter(indices)
- * start = next(it)
- */
- __pyx_t_4 = PyList_New(0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 114, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_out = ((PyObject*)__pyx_t_4);
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":115
- *
- * out = []
- * it = iter(indices) # <<<<<<<<<<<<<<
- * start = next(it)
- * if start != 0:
- */
- __pyx_t_4 = PyObject_GetIter(__pyx_v_indices); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 115, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_it = __pyx_t_4;
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":116
- * out = []
- * it = iter(indices)
- * start = next(it) # <<<<<<<<<<<<<<
- * if start != 0:
- * # Initial segment that wraps around
- */
- __pyx_t_4 = __Pyx_PyIter_Next(__pyx_v_it); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 116, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_start = __pyx_t_4;
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":117
- * it = iter(indices)
- * start = next(it)
- * if start != 0: # <<<<<<<<<<<<<<
- * # Initial segment that wraps around
- * i1, i2, ri1, ri2 = 0, start, start, indices[-1]
- */
- __pyx_t_9 = (__Pyx_PyInt_BoolNeObjC(__pyx_v_start, __pyx_int_0, 0, 0)); if (unlikely((__pyx_t_9 < 0))) __PYX_ERR(0, 117, __pyx_L1_error)
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":119
- * if start != 0:
- * # Initial segment that wraps around
- * i1, i2, ri1, ri2 = 0, start, start, indices[-1] # <<<<<<<<<<<<<<
- * out.extend(
- * iup_segment(
- */
- __pyx_t_4 = __pyx_int_0;
- __Pyx_INCREF(__pyx_t_4);
- __pyx_t_5 = __pyx_v_start;
- __Pyx_INCREF(__pyx_t_5);
- __pyx_t_6 = __pyx_v_start;
- __Pyx_INCREF(__pyx_t_6);
- __pyx_t_8 = __Pyx_GetItemInt_List(__pyx_v_indices, -1L, long, 1, __Pyx_PyInt_From_long, 1, 1, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 119, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_v_i1 = __pyx_t_4;
- __pyx_t_4 = 0;
- __pyx_v_i2 = __pyx_t_5;
- __pyx_t_5 = 0;
- __pyx_v_ri1 = __pyx_t_6;
- __pyx_t_6 = 0;
- __pyx_v_ri2 = __pyx_t_8;
- __pyx_t_8 = 0;
-
- /* "fontTools/varLib/iup.py":122
- * out.extend(
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2] # <<<<<<<<<<<<<<
- * )
- * )
- */
- __pyx_t_8 = __Pyx_PyObject_GetSlice(__pyx_v_coords, 0, 0, &__pyx_v_i1, &__pyx_v_i2, NULL, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 122, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_6 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 122, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_5 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 122, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_4 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 122, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_10 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri2); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 122, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
-
- /* "fontTools/varLib/iup.py":121
- * i1, i2, ri1, ri2 = 0, start, start, indices[-1]
- * out.extend(
- * iup_segment( # <<<<<<<<<<<<<<
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- * )
- */
- __pyx_t_11 = __pyx_f_9fontTools_6varLib_3iup_iup_segment(__pyx_t_8, __pyx_t_6, __pyx_t_5, __pyx_t_4, __pyx_t_10); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 121, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
-
- /* "fontTools/varLib/iup.py":120
- * # Initial segment that wraps around
- * i1, i2, ri1, ri2 = 0, start, start, indices[-1]
- * out.extend( # <<<<<<<<<<<<<<
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- */
- __pyx_t_12 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_t_11); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 120, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
-
- /* "fontTools/varLib/iup.py":117
- * it = iter(indices)
- * start = next(it)
- * if start != 0: # <<<<<<<<<<<<<<
- * # Initial segment that wraps around
- * i1, i2, ri1, ri2 = 0, start, start, indices[-1]
- */
- }
-
- /* "fontTools/varLib/iup.py":125
- * )
- * )
- * out.append(deltas[start]) # <<<<<<<<<<<<<<
- * for end in it:
- * if end - start > 1:
- */
- __pyx_t_11 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_start); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 125, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __pyx_t_12 = __Pyx_PyList_Append(__pyx_v_out, __pyx_t_11); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 125, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
-
- /* "fontTools/varLib/iup.py":126
- * )
- * out.append(deltas[start])
- * for end in it: # <<<<<<<<<<<<<<
- * if end - start > 1:
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- */
- if (likely(PyList_CheckExact(__pyx_v_it)) || PyTuple_CheckExact(__pyx_v_it)) {
- __pyx_t_11 = __pyx_v_it; __Pyx_INCREF(__pyx_t_11); __pyx_t_2 = 0;
- __pyx_t_7 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_11 = PyObject_GetIter(__pyx_v_it); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 126, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __pyx_t_7 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_11); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 126, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_7)) {
- if (likely(PyList_CheckExact(__pyx_t_11))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_11)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_10 = PyList_GET_ITEM(__pyx_t_11, __pyx_t_2); __Pyx_INCREF(__pyx_t_10); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 126, __pyx_L1_error)
- #else
- __pyx_t_10 = PySequence_ITEM(__pyx_t_11, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 126, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_11)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_10 = PyTuple_GET_ITEM(__pyx_t_11, __pyx_t_2); __Pyx_INCREF(__pyx_t_10); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 126, __pyx_L1_error)
- #else
- __pyx_t_10 = PySequence_ITEM(__pyx_t_11, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 126, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- #endif
- }
- } else {
- __pyx_t_10 = __pyx_t_7(__pyx_t_11);
- if (unlikely(!__pyx_t_10)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 126, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_10);
- }
- __Pyx_XDECREF_SET(__pyx_v_end, __pyx_t_10);
- __pyx_t_10 = 0;
-
- /* "fontTools/varLib/iup.py":127
- * out.append(deltas[start])
- * for end in it:
- * if end - start > 1: # <<<<<<<<<<<<<<
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- * out.extend(
- */
- __pyx_t_10 = PyNumber_Subtract(__pyx_v_end, __pyx_v_start); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 127, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __pyx_t_4 = PyObject_RichCompare(__pyx_t_10, __pyx_int_1, Py_GT); __Pyx_XGOTREF(__pyx_t_4); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 127, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- __pyx_t_9 = __Pyx_PyObject_IsTrue(__pyx_t_4); if (unlikely((__pyx_t_9 < 0))) __PYX_ERR(0, 127, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":128
- * for end in it:
- * if end - start > 1:
- * i1, i2, ri1, ri2 = start + 1, end, start, end # <<<<<<<<<<<<<<
- * out.extend(
- * iup_segment(
- */
- __pyx_t_4 = __Pyx_PyInt_AddObjC(__pyx_v_start, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 128, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_10 = __pyx_v_end;
- __Pyx_INCREF(__pyx_t_10);
- __pyx_t_5 = __pyx_v_start;
- __Pyx_INCREF(__pyx_t_5);
- __pyx_t_6 = __pyx_v_end;
- __Pyx_INCREF(__pyx_t_6);
- __Pyx_XDECREF_SET(__pyx_v_i1, __pyx_t_4);
- __pyx_t_4 = 0;
- __Pyx_XDECREF_SET(__pyx_v_i2, __pyx_t_10);
- __pyx_t_10 = 0;
- __Pyx_XDECREF_SET(__pyx_v_ri1, __pyx_t_5);
- __pyx_t_5 = 0;
- __Pyx_XDECREF_SET(__pyx_v_ri2, __pyx_t_6);
- __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":131
- * out.extend(
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2] # <<<<<<<<<<<<<<
- * )
- * )
- */
- __pyx_t_6 = __Pyx_PyObject_GetSlice(__pyx_v_coords, 0, 0, &__pyx_v_i1, &__pyx_v_i2, NULL, 0, 0, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 131, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_5 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 131, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_10 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri1); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 131, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __pyx_t_4 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 131, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri2); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 131, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
-
- /* "fontTools/varLib/iup.py":130
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- * out.extend(
- * iup_segment( # <<<<<<<<<<<<<<
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- * )
- */
- __pyx_t_13 = __pyx_f_9fontTools_6varLib_3iup_iup_segment(__pyx_t_6, __pyx_t_5, __pyx_t_10, __pyx_t_4, __pyx_t_8); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 130, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
-
- /* "fontTools/varLib/iup.py":129
- * if end - start > 1:
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- * out.extend( # <<<<<<<<<<<<<<
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- */
- __pyx_t_12 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_t_13); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 129, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
-
- /* "fontTools/varLib/iup.py":127
- * out.append(deltas[start])
- * for end in it:
- * if end - start > 1: # <<<<<<<<<<<<<<
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- * out.extend(
- */
- }
-
- /* "fontTools/varLib/iup.py":134
- * )
- * )
- * out.append(deltas[end]) # <<<<<<<<<<<<<<
- * start = end
- * if start != n - 1:
- */
- __pyx_t_13 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_end); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 134, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_12 = __Pyx_PyList_Append(__pyx_v_out, __pyx_t_13); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 134, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
-
- /* "fontTools/varLib/iup.py":135
- * )
- * out.append(deltas[end])
- * start = end # <<<<<<<<<<<<<<
- * if start != n - 1:
- * # Final segment that wraps around
- */
- __Pyx_INCREF(__pyx_v_end);
- __Pyx_DECREF_SET(__pyx_v_start, __pyx_v_end);
-
- /* "fontTools/varLib/iup.py":126
- * )
- * out.append(deltas[start])
- * for end in it: # <<<<<<<<<<<<<<
- * if end - start > 1:
- * i1, i2, ri1, ri2 = start + 1, end, start, end
- */
- }
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
-
- /* "fontTools/varLib/iup.py":136
- * out.append(deltas[end])
- * start = end
- * if start != n - 1: # <<<<<<<<<<<<<<
- * # Final segment that wraps around
- * i1, i2, ri1, ri2 = start + 1, n, start, indices[0]
- */
- __pyx_t_11 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 136, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __pyx_t_13 = PyObject_RichCompare(__pyx_v_start, __pyx_t_11, Py_NE); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 136, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
- __pyx_t_9 = __Pyx_PyObject_IsTrue(__pyx_t_13); if (unlikely((__pyx_t_9 < 0))) __PYX_ERR(0, 136, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- if (__pyx_t_9) {
-
- /* "fontTools/varLib/iup.py":138
- * if start != n - 1:
- * # Final segment that wraps around
- * i1, i2, ri1, ri2 = start + 1, n, start, indices[0] # <<<<<<<<<<<<<<
- * out.extend(
- * iup_segment(
- */
- __pyx_t_13 = __Pyx_PyInt_AddObjC(__pyx_v_start, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 138, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_11 = __pyx_v_n;
- __Pyx_INCREF(__pyx_t_11);
- __pyx_t_8 = __pyx_v_start;
- __Pyx_INCREF(__pyx_t_8);
- __pyx_t_4 = __Pyx_GetItemInt_List(__pyx_v_indices, 0, long, 1, __Pyx_PyInt_From_long, 1, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 138, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_XDECREF_SET(__pyx_v_i1, __pyx_t_13);
- __pyx_t_13 = 0;
- __Pyx_XDECREF_SET(__pyx_v_i2, __pyx_t_11);
- __pyx_t_11 = 0;
- __Pyx_XDECREF_SET(__pyx_v_ri1, __pyx_t_8);
- __pyx_t_8 = 0;
- __Pyx_XDECREF_SET(__pyx_v_ri2, __pyx_t_4);
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":141
- * out.extend(
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2] # <<<<<<<<<<<<<<
- * )
- * )
- */
- __pyx_t_4 = __Pyx_PyObject_GetSlice(__pyx_v_coords, 0, 0, &__pyx_v_i1, &__pyx_v_i2, NULL, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 141, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 141, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_11 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri1); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 141, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __pyx_t_13 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_ri2); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 141, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_10 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_ri2); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 141, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
-
- /* "fontTools/varLib/iup.py":140
- * i1, i2, ri1, ri2 = start + 1, n, start, indices[0]
- * out.extend(
- * iup_segment( # <<<<<<<<<<<<<<
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- * )
- */
- __pyx_t_5 = __pyx_f_9fontTools_6varLib_3iup_iup_segment(__pyx_t_4, __pyx_t_8, __pyx_t_11, __pyx_t_13, __pyx_t_10); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 140, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
-
- /* "fontTools/varLib/iup.py":139
- * # Final segment that wraps around
- * i1, i2, ri1, ri2 = start + 1, n, start, indices[0]
- * out.extend( # <<<<<<<<<<<<<<
- * iup_segment(
- * coords[i1:i2], coords[ri1], deltas[ri1], coords[ri2], deltas[ri2]
- */
- __pyx_t_12 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_t_5); if (unlikely(__pyx_t_12 == ((int)-1))) __PYX_ERR(0, 139, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":136
- * out.append(deltas[end])
- * start = end
- * if start != n - 1: # <<<<<<<<<<<<<<
- * # Final segment that wraps around
- * i1, i2, ri1, ri2 = start + 1, n, start, indices[0]
- */
- }
-
- /* "fontTools/varLib/iup.py":145
- * )
- *
- * assert len(deltas) == len(out), (len(deltas), len(out)) # <<<<<<<<<<<<<<
- * return out
- *
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_2 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 145, __pyx_L1_error)
- __pyx_t_1 = PyList_GET_SIZE(__pyx_v_out); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 145, __pyx_L1_error)
- __pyx_t_9 = (__pyx_t_2 == __pyx_t_1);
- if (unlikely(!__pyx_t_9)) {
- __pyx_t_1 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 145, __pyx_L1_error)
- __pyx_t_5 = PyInt_FromSsize_t(__pyx_t_1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 145, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_1 = PyList_GET_SIZE(__pyx_v_out); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 145, __pyx_L1_error)
- __pyx_t_10 = PyInt_FromSsize_t(__pyx_t_1); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 145, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __pyx_t_13 = PyTuple_New(2); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 145, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __Pyx_GIVEREF(__pyx_t_5);
- PyTuple_SET_ITEM(__pyx_t_13, 0, __pyx_t_5);
- __Pyx_GIVEREF(__pyx_t_10);
- PyTuple_SET_ITEM(__pyx_t_13, 1, __pyx_t_10);
- __pyx_t_5 = 0;
- __pyx_t_10 = 0;
- __pyx_t_10 = PyTuple_Pack(1, __pyx_t_13); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 145, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __Pyx_Raise(__pyx_builtin_AssertionError, __pyx_t_10, 0, 0);
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- __PYX_ERR(0, 145, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 145, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":146
- *
- * assert len(deltas) == len(out), (len(deltas), len(out))
- * return out # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_out);
- __pyx_r = __pyx_v_out;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":97
- *
- *
- * def iup_contour(deltas: _DeltaOrNoneSegment, coords: _PointSegment) -> _DeltaSegment: # <<<<<<<<<<<<<<
- * """For the contour given in `coords`, interpolate any missing
- * delta values in delta vector `deltas`.
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_10);
- __Pyx_XDECREF(__pyx_t_11);
- __Pyx_XDECREF(__pyx_t_13);
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_indices);
- __Pyx_XDECREF(__pyx_v_out);
- __Pyx_XDECREF(__pyx_v_it);
- __Pyx_XDECREF(__pyx_v_start);
- __Pyx_XDECREF(__pyx_v_i1);
- __Pyx_XDECREF(__pyx_v_i2);
- __Pyx_XDECREF(__pyx_v_ri1);
- __Pyx_XDECREF(__pyx_v_ri2);
- __Pyx_XDECREF(__pyx_v_end);
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_i);
- __Pyx_XDECREF(__pyx_7genexpr__pyx_v_v);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":149
- *
- *
- * def iup_delta( # <<<<<<<<<<<<<<
- * deltas: _DeltaOrNoneSegment, coords: _PointSegment, ends: _Endpoints
- * ) -> _DeltaSegment:
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_3iup_delta(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_2iup_delta, "iup_delta(deltas: _DeltaOrNoneSegment, coords: _PointSegment, ends: _Endpoints) -> _DeltaSegment\nFor the outline given in `coords`, with contour endpoints given\n in sorted increasing order in `ends`, interpolate any missing\n delta values in delta vector `deltas`.\n\n Returns fully filled-out delta vector.");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_3iup_delta = {"iup_delta", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_3iup_delta, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_2iup_delta};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_3iup_delta(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- PyObject *__pyx_v_ends = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("iup_delta (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,&__pyx_n_s_ends,0};
- PyObject* values[3] = {0,0,0};
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 149, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 149, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_delta", 1, 3, 3, 1); __PYX_ERR(0, 149, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_ends)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 149, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_delta", 1, 3, 3, 2); __PYX_ERR(0, 149, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "iup_delta") < 0)) __PYX_ERR(0, 149, __pyx_L3_error)
- }
- } else if (unlikely(__pyx_nargs != 3)) {
- goto __pyx_L5_argtuple_error;
- } else {
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- __pyx_v_ends = values[2];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("iup_delta", 1, 3, 3, __pyx_nargs); __PYX_ERR(0, 149, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_delta", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_2iup_delta(__pyx_self, __pyx_v_deltas, __pyx_v_coords, __pyx_v_ends);
-
- /* function exit code */
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_2iup_delta(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_ends) {
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_v_out = NULL;
- PyObject *__pyx_v_start = NULL;
- PyObject *__pyx_v_end = NULL;
- PyObject *__pyx_v_contour = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- int __pyx_t_1;
- PyObject *__pyx_t_2 = NULL;
- PyObject *__pyx_t_3 = NULL;
- int __pyx_t_4;
- int __pyx_t_5;
- Py_ssize_t __pyx_t_6;
- PyObject *__pyx_t_7 = NULL;
- PyObject *__pyx_t_8 = NULL;
- PyObject *__pyx_t_9 = NULL;
- PyObject *(*__pyx_t_10)(PyObject *);
- PyObject *__pyx_t_11 = NULL;
- int __pyx_t_12;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("iup_delta", 0);
- __Pyx_INCREF(__pyx_v_ends);
-
- /* "fontTools/varLib/iup.py":158
- * Returns fully filled-out delta vector."""
- *
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4 # <<<<<<<<<<<<<<
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_3 = PySequence_List(__pyx_v_ends); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_2 = ((PyObject*)__pyx_t_3);
- __pyx_t_3 = 0;
- __pyx_t_4 = PyList_Sort(__pyx_t_2); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 158, __pyx_L1_error)
- __pyx_t_3 = PyObject_RichCompare(__pyx_t_2, __pyx_v_ends, Py_EQ); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_5) {
- } else {
- __pyx_t_1 = __pyx_t_5;
- goto __pyx_L3_bool_binop_done;
- }
- __pyx_t_6 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_6 == ((Py_ssize_t)-1))) __PYX_ERR(0, 158, __pyx_L1_error)
- __pyx_t_3 = PyInt_FromSsize_t(__pyx_t_6); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_v_ends); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 158, __pyx_L1_error)
- if (__pyx_t_5) {
- __pyx_t_7 = __Pyx_GetItemInt(__pyx_v_ends, -1L, long, 1, __Pyx_PyInt_From_long, 0, 1, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_7, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_2 = __pyx_t_8;
- __pyx_t_8 = 0;
- } else {
- __Pyx_INCREF(__pyx_int_0);
- __pyx_t_2 = __pyx_int_0;
- }
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_2, __pyx_int_4, 4, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_2 = PyObject_RichCompare(__pyx_t_3, __pyx_t_8, Py_EQ); __Pyx_XGOTREF(__pyx_t_2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_t_2); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 158, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_1 = __pyx_t_5;
- __pyx_L3_bool_binop_done:;
- if (unlikely(!__pyx_t_1)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 158, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 158, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":159
- *
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4
- * n = len(coords) # <<<<<<<<<<<<<<
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = []
- */
- __pyx_t_6 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_6 == ((Py_ssize_t)-1))) __PYX_ERR(0, 159, __pyx_L1_error)
- __pyx_t_2 = PyInt_FromSsize_t(__pyx_t_6); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 159, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_n = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":160
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1] # <<<<<<<<<<<<<<
- * out = []
- * start = 0
- */
- __pyx_t_2 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_4, 4, 0, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_8 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_3, 3, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_3 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_2, 2, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = PyList_New(4); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_GIVEREF(__pyx_t_2);
- PyList_SET_ITEM(__pyx_t_9, 0, __pyx_t_2);
- __Pyx_GIVEREF(__pyx_t_8);
- PyList_SET_ITEM(__pyx_t_9, 1, __pyx_t_8);
- __Pyx_GIVEREF(__pyx_t_3);
- PyList_SET_ITEM(__pyx_t_9, 2, __pyx_t_3);
- __Pyx_GIVEREF(__pyx_t_7);
- PyList_SET_ITEM(__pyx_t_9, 3, __pyx_t_7);
- __pyx_t_2 = 0;
- __pyx_t_8 = 0;
- __pyx_t_3 = 0;
- __pyx_t_7 = 0;
- __pyx_t_7 = PyNumber_Add(__pyx_v_ends, __pyx_t_9); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 160, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __Pyx_DECREF_SET(__pyx_v_ends, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":161
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = [] # <<<<<<<<<<<<<<
- * start = 0
- * for end in ends:
- */
- __pyx_t_7 = PyList_New(0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 161, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_v_out = ((PyObject*)__pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":162
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = []
- * start = 0 # <<<<<<<<<<<<<<
- * for end in ends:
- * end += 1
- */
- __Pyx_INCREF(__pyx_int_0);
- __pyx_v_start = __pyx_int_0;
-
- /* "fontTools/varLib/iup.py":163
- * out = []
- * start = 0
- * for end in ends: # <<<<<<<<<<<<<<
- * end += 1
- * contour = iup_contour(deltas[start:end], coords[start:end])
- */
- if (likely(PyList_CheckExact(__pyx_v_ends)) || PyTuple_CheckExact(__pyx_v_ends)) {
- __pyx_t_7 = __pyx_v_ends; __Pyx_INCREF(__pyx_t_7); __pyx_t_6 = 0;
- __pyx_t_10 = NULL;
- } else {
- __pyx_t_6 = -1; __pyx_t_7 = PyObject_GetIter(__pyx_v_ends); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 163, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_10 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_7); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 163, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_10)) {
- if (likely(PyList_CheckExact(__pyx_t_7))) {
- if (__pyx_t_6 >= PyList_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_9 = PyList_GET_ITEM(__pyx_t_7, __pyx_t_6); __Pyx_INCREF(__pyx_t_9); __pyx_t_6++; if (unlikely((0 < 0))) __PYX_ERR(0, 163, __pyx_L1_error)
- #else
- __pyx_t_9 = PySequence_ITEM(__pyx_t_7, __pyx_t_6); __pyx_t_6++; if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 163, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- #endif
- } else {
- if (__pyx_t_6 >= PyTuple_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_9 = PyTuple_GET_ITEM(__pyx_t_7, __pyx_t_6); __Pyx_INCREF(__pyx_t_9); __pyx_t_6++; if (unlikely((0 < 0))) __PYX_ERR(0, 163, __pyx_L1_error)
- #else
- __pyx_t_9 = PySequence_ITEM(__pyx_t_7, __pyx_t_6); __pyx_t_6++; if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 163, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- #endif
- }
- } else {
- __pyx_t_9 = __pyx_t_10(__pyx_t_7);
- if (unlikely(!__pyx_t_9)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 163, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_9);
- }
- __Pyx_XDECREF_SET(__pyx_v_end, __pyx_t_9);
- __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":164
- * start = 0
- * for end in ends:
- * end += 1 # <<<<<<<<<<<<<<
- * contour = iup_contour(deltas[start:end], coords[start:end])
- * out.extend(contour)
- */
- __pyx_t_9 = __Pyx_PyInt_AddObjC(__pyx_v_end, __pyx_int_1, 1, 1, 0); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 164, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_DECREF_SET(__pyx_v_end, __pyx_t_9);
- __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":165
- * for end in ends:
- * end += 1
- * contour = iup_contour(deltas[start:end], coords[start:end]) # <<<<<<<<<<<<<<
- * out.extend(contour)
- * start = end
- */
- __Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_iup_contour); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 165, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_8 = __Pyx_PyObject_GetSlice(__pyx_v_deltas, 0, 0, &__pyx_v_start, &__pyx_v_end, NULL, 0, 0, 1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 165, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_2 = __Pyx_PyObject_GetSlice(__pyx_v_coords, 0, 0, &__pyx_v_start, &__pyx_v_end, NULL, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 165, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_11 = NULL;
- __pyx_t_12 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) {
- __pyx_t_11 = PyMethod_GET_SELF(__pyx_t_3);
- if (likely(__pyx_t_11)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3);
- __Pyx_INCREF(__pyx_t_11);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_3, function);
- __pyx_t_12 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[3] = {__pyx_t_11, __pyx_t_8, __pyx_t_2};
- __pyx_t_9 = __Pyx_PyObject_FastCall(__pyx_t_3, __pyx_callargs+1-__pyx_t_12, 2+__pyx_t_12);
- __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 165, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- }
- __Pyx_XDECREF_SET(__pyx_v_contour, __pyx_t_9);
- __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":166
- * end += 1
- * contour = iup_contour(deltas[start:end], coords[start:end])
- * out.extend(contour) # <<<<<<<<<<<<<<
- * start = end
- *
- */
- __pyx_t_4 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_v_contour); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 166, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":167
- * contour = iup_contour(deltas[start:end], coords[start:end])
- * out.extend(contour)
- * start = end # <<<<<<<<<<<<<<
- *
- * return out
- */
- __Pyx_INCREF(__pyx_v_end);
- __Pyx_DECREF_SET(__pyx_v_start, __pyx_v_end);
-
- /* "fontTools/varLib/iup.py":163
- * out = []
- * start = 0
- * for end in ends: # <<<<<<<<<<<<<<
- * end += 1
- * contour = iup_contour(deltas[start:end], coords[start:end])
- */
- }
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":169
- * start = end
- *
- * return out # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_out);
- __pyx_r = __pyx_v_out;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":149
- *
- *
- * def iup_delta( # <<<<<<<<<<<<<<
- * deltas: _DeltaOrNoneSegment, coords: _PointSegment, ends: _Endpoints
- * ) -> _DeltaSegment:
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_9);
- __Pyx_XDECREF(__pyx_t_11);
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_delta", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_out);
- __Pyx_XDECREF(__pyx_v_start);
- __Pyx_XDECREF(__pyx_v_end);
- __Pyx_XDECREF(__pyx_v_contour);
- __Pyx_XDECREF(__pyx_v_ends);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_18can_iup_in_between_2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */
-
-/* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_18can_iup_in_between_genexpr(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *__pyx_cur_scope;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("genexpr", 0);
- __pyx_cur_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr, __pyx_empty_tuple, NULL);
- if (unlikely(!__pyx_cur_scope)) {
- __pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)Py_None);
- __Pyx_INCREF(Py_None);
- __PYX_ERR(0, 203, __pyx_L1_error)
- } else {
- __Pyx_GOTREF((PyObject *)__pyx_cur_scope);
- }
- __pyx_cur_scope->__pyx_genexpr_arg_0 = __pyx_genexpr_arg_0;
- __Pyx_INCREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- __Pyx_GIVEREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- {
- __pyx_CoroutineObject *gen = __Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_9fontTools_6varLib_3iup_18can_iup_in_between_2generator, NULL, (PyObject *) __pyx_cur_scope, __pyx_n_s_genexpr, __pyx_n_s_can_iup_in_between_locals_genexp, __pyx_n_s_fontTools_varLib_iup); if (unlikely(!gen)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_DECREF(__pyx_cur_scope);
- __Pyx_RefNannyFinishContext();
- return (PyObject *) gen;
- }
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.can_iup_in_between.genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __Pyx_DECREF((PyObject *)__pyx_cur_scope);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_18can_iup_in_between_2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */
-{
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *__pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)__pyx_generator->closure);
- PyObject *__pyx_r = NULL;
- PyObject *__pyx_t_1 = NULL;
- Py_ssize_t __pyx_t_2;
- PyObject *(*__pyx_t_3)(PyObject *);
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- PyObject *__pyx_t_6 = NULL;
- PyObject *__pyx_t_7 = NULL;
- PyObject *(*__pyx_t_8)(PyObject *);
- PyObject *__pyx_t_9 = NULL;
- PyObject *__pyx_t_10 = NULL;
- double __pyx_t_11;
- double __pyx_t_12;
- int __pyx_t_13;
- int __pyx_t_14;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("genexpr", 0);
- switch (__pyx_generator->resume_label) {
- case 0: goto __pyx_L3_first_run;
- default: /* CPython raises the right error here */
- __Pyx_RefNannyFinishContext();
- return NULL;
- }
- __pyx_L3_first_run:;
- if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 203, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":204
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance
- * for (x, y), (p, q) in zip(deltas, interp) # <<<<<<<<<<<<<<
- * )
- *
- */
- if (unlikely(!__pyx_cur_scope->__pyx_genexpr_arg_0)) { __Pyx_RaiseUnboundLocalError(".0"); __PYX_ERR(0, 204, __pyx_L1_error) }
- if (likely(PyList_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) || PyTuple_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) {
- __pyx_t_1 = __pyx_cur_scope->__pyx_genexpr_arg_0; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0;
- __pyx_t_3 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_1 = PyObject_GetIter(__pyx_cur_scope->__pyx_genexpr_arg_0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __pyx_t_3 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 204, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_3)) {
- if (likely(PyList_CheckExact(__pyx_t_1))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 204, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyTuple_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 204, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- }
- } else {
- __pyx_t_4 = __pyx_t_3(__pyx_t_1);
- if (unlikely(!__pyx_t_4)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 204, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_4);
- }
- if ((likely(PyTuple_CheckExact(__pyx_t_4))) || (PyList_CheckExact(__pyx_t_4))) {
- PyObject* sequence = __pyx_t_4;
- Py_ssize_t size = __Pyx_PySequence_SIZE(sequence);
- if (unlikely(size != 2)) {
- if (size > 2) __Pyx_RaiseTooManyValuesError(2);
- else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size);
- __PYX_ERR(0, 204, __pyx_L1_error)
- }
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- if (likely(PyTuple_CheckExact(sequence))) {
- __pyx_t_5 = PyTuple_GET_ITEM(sequence, 0);
- __pyx_t_6 = PyTuple_GET_ITEM(sequence, 1);
- } else {
- __pyx_t_5 = PyList_GET_ITEM(sequence, 0);
- __pyx_t_6 = PyList_GET_ITEM(sequence, 1);
- }
- __Pyx_INCREF(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_6);
- #else
- __pyx_t_5 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_6 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- #endif
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- } else {
- Py_ssize_t index = -1;
- __pyx_t_7 = PyObject_GetIter(__pyx_t_4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_7);
- index = 0; __pyx_t_5 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_5)) goto __pyx_L6_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_5);
- index = 1; __pyx_t_6 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_6)) goto __pyx_L6_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_6);
- if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_7), 2) < 0) __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_t_8 = NULL;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- goto __pyx_L7_unpacking_done;
- __pyx_L6_unpacking_failed:;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_8 = NULL;
- if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index);
- __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_L7_unpacking_done:;
- }
- if ((likely(PyTuple_CheckExact(__pyx_t_5))) || (PyList_CheckExact(__pyx_t_5))) {
- PyObject* sequence = __pyx_t_5;
- Py_ssize_t size = __Pyx_PySequence_SIZE(sequence);
- if (unlikely(size != 2)) {
- if (size > 2) __Pyx_RaiseTooManyValuesError(2);
- else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size);
- __PYX_ERR(0, 204, __pyx_L1_error)
- }
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- if (likely(PyTuple_CheckExact(sequence))) {
- __pyx_t_7 = PyTuple_GET_ITEM(sequence, 0);
- __pyx_t_9 = PyTuple_GET_ITEM(sequence, 1);
- } else {
- __pyx_t_7 = PyList_GET_ITEM(sequence, 0);
- __pyx_t_9 = PyList_GET_ITEM(sequence, 1);
- }
- __Pyx_INCREF(__pyx_t_7);
- __Pyx_INCREF(__pyx_t_9);
- #else
- __pyx_t_7 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- #endif
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- } else {
- Py_ssize_t index = -1;
- __pyx_t_10 = PyObject_GetIter(__pyx_t_5); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_10);
- index = 0; __pyx_t_7 = __pyx_t_8(__pyx_t_10); if (unlikely(!__pyx_t_7)) goto __pyx_L8_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_7);
- index = 1; __pyx_t_9 = __pyx_t_8(__pyx_t_10); if (unlikely(!__pyx_t_9)) goto __pyx_L8_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_9);
- if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_10), 2) < 0) __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_t_8 = NULL;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- goto __pyx_L9_unpacking_done;
- __pyx_L8_unpacking_failed:;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- __pyx_t_8 = NULL;
- if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index);
- __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_L9_unpacking_done:;
- }
- __pyx_t_11 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_11 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_12 = __pyx_PyFloat_AsDouble(__pyx_t_9); if (unlikely((__pyx_t_12 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __pyx_cur_scope->__pyx_v_x = __pyx_t_11;
- __pyx_cur_scope->__pyx_v_y = __pyx_t_12;
- if ((likely(PyTuple_CheckExact(__pyx_t_6))) || (PyList_CheckExact(__pyx_t_6))) {
- PyObject* sequence = __pyx_t_6;
- Py_ssize_t size = __Pyx_PySequence_SIZE(sequence);
- if (unlikely(size != 2)) {
- if (size > 2) __Pyx_RaiseTooManyValuesError(2);
- else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size);
- __PYX_ERR(0, 204, __pyx_L1_error)
- }
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- if (likely(PyTuple_CheckExact(sequence))) {
- __pyx_t_9 = PyTuple_GET_ITEM(sequence, 0);
- __pyx_t_7 = PyTuple_GET_ITEM(sequence, 1);
- } else {
- __pyx_t_9 = PyList_GET_ITEM(sequence, 0);
- __pyx_t_7 = PyList_GET_ITEM(sequence, 1);
- }
- __Pyx_INCREF(__pyx_t_9);
- __Pyx_INCREF(__pyx_t_7);
- #else
- __pyx_t_9 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __pyx_t_7 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- #endif
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- } else {
- Py_ssize_t index = -1;
- __pyx_t_10 = PyObject_GetIter(__pyx_t_6); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_10);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_10);
- index = 0; __pyx_t_9 = __pyx_t_8(__pyx_t_10); if (unlikely(!__pyx_t_9)) goto __pyx_L10_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_9);
- index = 1; __pyx_t_7 = __pyx_t_8(__pyx_t_10); if (unlikely(!__pyx_t_7)) goto __pyx_L10_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_7);
- if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_10), 2) < 0) __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_t_8 = NULL;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- goto __pyx_L11_unpacking_done;
- __pyx_L10_unpacking_failed:;
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- __pyx_t_8 = NULL;
- if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index);
- __PYX_ERR(0, 204, __pyx_L1_error)
- __pyx_L11_unpacking_done:;
- }
- __pyx_t_12 = __pyx_PyFloat_AsDouble(__pyx_t_9); if (unlikely((__pyx_t_12 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __pyx_t_11 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_11 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_cur_scope->__pyx_v_p = __pyx_t_12;
- __pyx_cur_scope->__pyx_v_q = __pyx_t_11;
-
- /* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
- __pyx_t_4 = PyFloat_FromDouble((__pyx_cur_scope->__pyx_v_x - __pyx_cur_scope->__pyx_v_p)); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_6 = PyFloat_FromDouble((__pyx_cur_scope->__pyx_v_y - __pyx_cur_scope->__pyx_v_q)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_5 = PyTuple_New(2); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GIVEREF(__pyx_t_4);
- PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_t_4);
- __Pyx_GIVEREF(__pyx_t_6);
- PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_t_6);
- __pyx_t_4 = 0;
- __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_PyObject_Call(((PyObject *)(&PyComplex_Type)), __pyx_t_5, NULL); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_5 = __Pyx_PyNumber_Absolute(__pyx_t_6); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_6 = PyFloat_FromDouble(__pyx_cur_scope->__pyx_v_tolerance); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_4 = PyObject_RichCompare(__pyx_t_5, __pyx_t_6, Py_LE); __Pyx_XGOTREF(__pyx_t_4); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_13 = __Pyx_PyObject_IsTrue(__pyx_t_4); if (unlikely((__pyx_t_13 < 0))) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_14 = (!__pyx_t_13);
- if (__pyx_t_14) {
-
- /* "fontTools/varLib/iup.py":202
- * deltas = deltas[i + 1 : j]
- *
- * return all( # <<<<<<<<<<<<<<
- * abs(complex(x - p, y - q)) <= tolerance
- * for (x, y), (p, q) in zip(deltas, interp)
- */
- __Pyx_XDECREF(__pyx_r);
-
- /* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
- __Pyx_INCREF(Py_False);
- __pyx_r = Py_False;
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- goto __pyx_L0;
- }
-
- /* "fontTools/varLib/iup.py":204
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance
- * for (x, y), (p, q) in zip(deltas, interp) # <<<<<<<<<<<<<<
- * )
- *
- */
- }
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- /*else*/ {
-
- /* "fontTools/varLib/iup.py":202
- * deltas = deltas[i + 1 : j]
- *
- * return all( # <<<<<<<<<<<<<<
- * abs(complex(x - p, y - q)) <= tolerance
- * for (x, y), (p, q) in zip(deltas, interp)
- */
- __Pyx_XDECREF(__pyx_r);
-
- /* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
- __Pyx_INCREF(Py_True);
- __pyx_r = Py_True;
- goto __pyx_L0;
- }
- CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __Pyx_Generator_Replace_StopIteration(0);
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_9);
- __Pyx_XDECREF(__pyx_t_10);
- __Pyx_AddTraceback("genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_L0:;
- __Pyx_XGIVEREF(__pyx_r);
- #if !CYTHON_USE_EXC_INFO_STACK
- __Pyx_Coroutine_ResetAndClearException(__pyx_generator);
- #endif
- __pyx_generator->resume_label = -1;
- __Pyx_Coroutine_clear((PyObject*)__pyx_generator);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":175
- *
- *
- * @cython.cfunc # <<<<<<<<<<<<<<
- * @cython.inline
- * @cython.locals(
- */
-
-static CYTHON_INLINE int __pyx_f_9fontTools_6varLib_3iup_can_iup_in_between(PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, int __pyx_v_i, int __pyx_v_j, CYTHON_UNUSED double __pyx_v_tolerance) {
- PyObject *__pyx_v_interp = NULL;
- PyObject *__pyx_gb_9fontTools_6varLib_3iup_18can_iup_in_between_2generator = 0;
- int __pyx_r;
- __Pyx_RefNannyDeclarations
- int __pyx_t_1;
- PyObject *__pyx_t_2 = NULL;
- PyObject *__pyx_t_3 = NULL;
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- PyObject *__pyx_t_6 = NULL;
- PyObject *__pyx_t_7 = NULL;
- int __pyx_t_8;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("can_iup_in_between", 0);
- __Pyx_INCREF(__pyx_v_deltas);
-
- /* "fontTools/varLib/iup.py":198
- * provided error tolerance."""
- *
- * assert j - i >= 2 # <<<<<<<<<<<<<<
- * interp = iup_segment(coords[i + 1 : j], coords[i], deltas[i], coords[j], deltas[j])
- * deltas = deltas[i + 1 : j]
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_1 = ((__pyx_v_j - __pyx_v_i) >= 2);
- if (unlikely(!__pyx_t_1)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 198, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 198, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":199
- *
- * assert j - i >= 2
- * interp = iup_segment(coords[i + 1 : j], coords[i], deltas[i], coords[j], deltas[j]) # <<<<<<<<<<<<<<
- * deltas = deltas[i + 1 : j]
- *
- */
- __pyx_t_2 = __Pyx_PyObject_GetSlice(__pyx_v_coords, (__pyx_v_i + 1), __pyx_v_j, NULL, NULL, NULL, 1, 1, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_coords, __pyx_v_i, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_4 = __Pyx_GetItemInt(__pyx_v_deltas, __pyx_v_i, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_5 = __Pyx_GetItemInt(__pyx_v_coords, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_6 = __Pyx_GetItemInt(__pyx_v_deltas, __pyx_v_j, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_7 = __pyx_f_9fontTools_6varLib_3iup_iup_segment(__pyx_t_2, __pyx_t_3, __pyx_t_4, __pyx_t_5, __pyx_t_6); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 199, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_v_interp = __pyx_t_7;
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":200
- * assert j - i >= 2
- * interp = iup_segment(coords[i + 1 : j], coords[i], deltas[i], coords[j], deltas[j])
- * deltas = deltas[i + 1 : j] # <<<<<<<<<<<<<<
- *
- * return all(
- */
- __pyx_t_7 = __Pyx_PyObject_GetSlice(__pyx_v_deltas, (__pyx_v_i + 1), __pyx_v_j, NULL, NULL, NULL, 1, 1, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 200, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF_SET(__pyx_v_deltas, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":204
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance
- * for (x, y), (p, q) in zip(deltas, interp) # <<<<<<<<<<<<<<
- * )
- *
- */
- __pyx_t_7 = PyTuple_New(2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_INCREF(__pyx_v_deltas);
- __Pyx_GIVEREF(__pyx_v_deltas);
- PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_v_deltas);
- __Pyx_INCREF(__pyx_v_interp);
- __Pyx_GIVEREF(__pyx_v_interp);
- PyTuple_SET_ITEM(__pyx_t_7, 1, __pyx_v_interp);
- __pyx_t_6 = __Pyx_PyObject_Call(__pyx_builtin_zip, __pyx_t_7, NULL); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 204, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":203
- *
- * return all(
- * abs(complex(x - p, y - q)) <= tolerance # <<<<<<<<<<<<<<
- * for (x, y), (p, q) in zip(deltas, interp)
- * )
- */
- __pyx_t_7 = __pyx_pf_9fontTools_6varLib_3iup_18can_iup_in_between_genexpr(NULL, __pyx_t_6); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_Generator_Next(__pyx_t_7); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_8 = __Pyx_PyInt_As_int(__pyx_t_6); if (unlikely((__pyx_t_8 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 203, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_r = __pyx_t_8;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":175
- *
- *
- * @cython.cfunc # <<<<<<<<<<<<<<
- * @cython.inline
- * @cython.locals(
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_AddTraceback("fontTools.varLib.iup.can_iup_in_between", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = -1;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_interp);
- __Pyx_XDECREF(__pyx_gb_9fontTools_6varLib_3iup_18can_iup_in_between_2generator);
- __Pyx_XDECREF(__pyx_v_deltas);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":208
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * cj=cython.double,
- * dj=cython.double,
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_5_iup_contour_bound_forced_set(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_4_iup_contour_bound_forced_set, "_iup_contour_bound_forced_set(deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0) -> set\nThe forced set is a conservative set of points on the contour that must be encoded\n explicitly (ie. cannot be interpolated). Calculating this set allows for significantly\n speeding up the dynamic-programming, as well as resolve circularity in DP.\n\n The set is precise; that is, if an index is in the returned set, then there is no way\n that IUP can generate delta for that point, given `coords` and `deltas`.\n ");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_5_iup_contour_bound_forced_set = {"_iup_contour_bound_forced_set", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_5_iup_contour_bound_forced_set, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_4_iup_contour_bound_forced_set};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_5_iup_contour_bound_forced_set(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- PyObject *__pyx_v_tolerance = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("_iup_contour_bound_forced_set (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,&__pyx_n_s_tolerance,0};
- PyObject* values[3] = {0,0,0};
- values[2] = ((PyObject *)((PyObject *)__pyx_int_0));
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 208, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 208, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("_iup_contour_bound_forced_set", 0, 2, 3, 1); __PYX_ERR(0, 208, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_tolerance);
- if (value) { values[2] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 208, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "_iup_contour_bound_forced_set") < 0)) __PYX_ERR(0, 208, __pyx_L3_error)
- }
- } else {
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- break;
- default: goto __pyx_L5_argtuple_error;
- }
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- __pyx_v_tolerance = values[2];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("_iup_contour_bound_forced_set", 0, 2, 3, __pyx_nargs); __PYX_ERR(0, 208, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup._iup_contour_bound_forced_set", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_4_iup_contour_bound_forced_set(__pyx_self, __pyx_v_deltas, __pyx_v_coords, __pyx_v_tolerance);
-
- /* function exit code */
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_4_iup_contour_bound_forced_set(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_tolerance) {
- double __pyx_v_cj;
- double __pyx_v_dj;
- double __pyx_v_lcj;
- double __pyx_v_ldj;
- double __pyx_v_ncj;
- double __pyx_v_ndj;
- int __pyx_v_force;
- PyObject *__pyx_v_forced = 0;
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_v_i = NULL;
- PyObject *__pyx_v_ld = NULL;
- PyObject *__pyx_v_lc = NULL;
- PyObject *__pyx_v_d = NULL;
- PyObject *__pyx_v_c = NULL;
- PyObject *__pyx_v_nd = NULL;
- PyObject *__pyx_v_nc = NULL;
- PyObject *__pyx_v_j = NULL;
- double __pyx_v_c1;
- double __pyx_v_c2;
- double __pyx_v_d1;
- double __pyx_v_d2;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- Py_ssize_t __pyx_t_1;
- Py_ssize_t __pyx_t_2;
- int __pyx_t_3;
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- PyObject *(*__pyx_t_6)(PyObject *);
- PyObject *__pyx_t_7 = NULL;
- PyObject *__pyx_t_8 = NULL;
- double __pyx_t_9;
- double __pyx_t_10;
- int __pyx_t_11;
- double __pyx_t_12;
- PyObject *__pyx_t_13 = NULL;
- PyObject *__pyx_t_14 = NULL;
- PyObject *__pyx_t_15 = NULL;
- int __pyx_t_16;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("_iup_contour_bound_forced_set", 0);
-
- /* "fontTools/varLib/iup.py":228
- * that IUP can generate delta for that point, given `coords` and `deltas`.
- * """
- * assert len(deltas) == len(coords) # <<<<<<<<<<<<<<
- *
- * n = len(deltas)
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_1 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 228, __pyx_L1_error)
- __pyx_t_2 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 228, __pyx_L1_error)
- __pyx_t_3 = (__pyx_t_1 == __pyx_t_2);
- if (unlikely(!__pyx_t_3)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 228, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 228, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":230
- * assert len(deltas) == len(coords)
- *
- * n = len(deltas) # <<<<<<<<<<<<<<
- * forced = set()
- * # Track "last" and "next" points on the contour as we sweep.
- */
- __pyx_t_2 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 230, __pyx_L1_error)
- __pyx_t_4 = PyInt_FromSsize_t(__pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 230, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_n = __pyx_t_4;
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":231
- *
- * n = len(deltas)
- * forced = set() # <<<<<<<<<<<<<<
- * # Track "last" and "next" points on the contour as we sweep.
- * for i in range(len(deltas) - 1, -1, -1):
- */
- __pyx_t_4 = PySet_New(0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 231, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_v_forced = ((PyObject*)__pyx_t_4);
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":233
- * forced = set()
- * # Track "last" and "next" points on the contour as we sweep.
- * for i in range(len(deltas) - 1, -1, -1): # <<<<<<<<<<<<<<
- * ld, lc = deltas[i - 1], coords[i - 1]
- * d, c = deltas[i], coords[i]
- */
- __pyx_t_2 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 233, __pyx_L1_error)
- __pyx_t_4 = PyInt_FromSsize_t((__pyx_t_2 - 1)); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_5 = PyTuple_New(3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GIVEREF(__pyx_t_4);
- PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_t_4);
- __Pyx_INCREF(__pyx_int_neg_1);
- __Pyx_GIVEREF(__pyx_int_neg_1);
- PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_int_neg_1);
- __Pyx_INCREF(__pyx_int_neg_1);
- __Pyx_GIVEREF(__pyx_int_neg_1);
- PyTuple_SET_ITEM(__pyx_t_5, 2, __pyx_int_neg_1);
- __pyx_t_4 = 0;
- __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_range, __pyx_t_5, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- if (likely(PyList_CheckExact(__pyx_t_4)) || PyTuple_CheckExact(__pyx_t_4)) {
- __pyx_t_5 = __pyx_t_4; __Pyx_INCREF(__pyx_t_5); __pyx_t_2 = 0;
- __pyx_t_6 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_5 = PyObject_GetIter(__pyx_t_4); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_6 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_5); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 233, __pyx_L1_error)
- }
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- for (;;) {
- if (likely(!__pyx_t_6)) {
- if (likely(PyList_CheckExact(__pyx_t_5))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyList_GET_ITEM(__pyx_t_5, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 233, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_5, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyTuple_GET_ITEM(__pyx_t_5, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 233, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_5, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 233, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- }
- } else {
- __pyx_t_4 = __pyx_t_6(__pyx_t_5);
- if (unlikely(!__pyx_t_4)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 233, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_4);
- }
- __Pyx_XDECREF_SET(__pyx_v_i, __pyx_t_4);
- __pyx_t_4 = 0;
-
- /* "fontTools/varLib/iup.py":234
- * # Track "last" and "next" points on the contour as we sweep.
- * for i in range(len(deltas) - 1, -1, -1):
- * ld, lc = deltas[i - 1], coords[i - 1] # <<<<<<<<<<<<<<
- * d, c = deltas[i], coords[i]
- * nd, nc = deltas[i - n + 1], coords[i - n + 1]
- */
- __pyx_t_4 = __Pyx_PyInt_SubtractObjC(__pyx_v_i, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 234, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_t_4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 234, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_4 = __Pyx_PyInt_SubtractObjC(__pyx_v_i, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 234, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_t_4); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 234, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_XDECREF_SET(__pyx_v_ld, __pyx_t_7);
- __pyx_t_7 = 0;
- __Pyx_XDECREF_SET(__pyx_v_lc, __pyx_t_8);
- __pyx_t_8 = 0;
-
- /* "fontTools/varLib/iup.py":235
- * for i in range(len(deltas) - 1, -1, -1):
- * ld, lc = deltas[i - 1], coords[i - 1]
- * d, c = deltas[i], coords[i] # <<<<<<<<<<<<<<
- * nd, nc = deltas[i - n + 1], coords[i - n + 1]
- *
- */
- __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_v_i); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 235, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_v_i); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 235, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_XDECREF_SET(__pyx_v_d, __pyx_t_8);
- __pyx_t_8 = 0;
- __Pyx_XDECREF_SET(__pyx_v_c, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":236
- * ld, lc = deltas[i - 1], coords[i - 1]
- * d, c = deltas[i], coords[i]
- * nd, nc = deltas[i - n + 1], coords[i - n + 1] # <<<<<<<<<<<<<<
- *
- * for j in (0, 1): # For X and for Y
- */
- __pyx_t_7 = PyNumber_Subtract(__pyx_v_i, __pyx_v_n); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_7, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_t_8); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_8 = PyNumber_Subtract(__pyx_v_i, __pyx_v_n); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_4 = __Pyx_PyInt_AddObjC(__pyx_t_8, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_v_coords, __pyx_t_4); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 236, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_XDECREF_SET(__pyx_v_nd, __pyx_t_7);
- __pyx_t_7 = 0;
- __Pyx_XDECREF_SET(__pyx_v_nc, __pyx_t_8);
- __pyx_t_8 = 0;
-
- /* "fontTools/varLib/iup.py":238
- * nd, nc = deltas[i - n + 1], coords[i - n + 1]
- *
- * for j in (0, 1): # For X and for Y # <<<<<<<<<<<<<<
- * cj = c[j]
- * dj = d[j]
- */
- __pyx_t_8 = __pyx_tuple_; __Pyx_INCREF(__pyx_t_8); __pyx_t_1 = 0;
- for (;;) {
- if (__pyx_t_1 >= 2) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_7 = PyTuple_GET_ITEM(__pyx_t_8, __pyx_t_1); __Pyx_INCREF(__pyx_t_7); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 238, __pyx_L1_error)
- #else
- __pyx_t_7 = PySequence_ITEM(__pyx_t_8, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 238, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- #endif
- __Pyx_XDECREF_SET(__pyx_v_j, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":239
- *
- * for j in (0, 1): # For X and for Y
- * cj = c[j] # <<<<<<<<<<<<<<
- * dj = d[j]
- * lcj = lc[j]
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_c, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 239, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 239, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_cj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":240
- * for j in (0, 1): # For X and for Y
- * cj = c[j]
- * dj = d[j] # <<<<<<<<<<<<<<
- * lcj = lc[j]
- * ldj = ld[j]
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_d, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 240, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 240, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_dj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":241
- * cj = c[j]
- * dj = d[j]
- * lcj = lc[j] # <<<<<<<<<<<<<<
- * ldj = ld[j]
- * ncj = nc[j]
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_lc, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 241, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 241, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_lcj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":242
- * dj = d[j]
- * lcj = lc[j]
- * ldj = ld[j] # <<<<<<<<<<<<<<
- * ncj = nc[j]
- * ndj = nd[j]
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_ld, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 242, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 242, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_ldj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":243
- * lcj = lc[j]
- * ldj = ld[j]
- * ncj = nc[j] # <<<<<<<<<<<<<<
- * ndj = nd[j]
- *
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_nc, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 243, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 243, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_ncj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":244
- * ldj = ld[j]
- * ncj = nc[j]
- * ndj = nd[j] # <<<<<<<<<<<<<<
- *
- * if lcj <= ncj:
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_nd, __pyx_v_j); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 244, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_7); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 244, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_v_ndj = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":246
- * ndj = nd[j]
- *
- * if lcj <= ncj: # <<<<<<<<<<<<<<
- * c1, c2 = lcj, ncj
- * d1, d2 = ldj, ndj
- */
- __pyx_t_3 = (__pyx_v_lcj <= __pyx_v_ncj);
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":247
- *
- * if lcj <= ncj:
- * c1, c2 = lcj, ncj # <<<<<<<<<<<<<<
- * d1, d2 = ldj, ndj
- * else:
- */
- __pyx_t_9 = __pyx_v_lcj;
- __pyx_t_10 = __pyx_v_ncj;
- __pyx_v_c1 = __pyx_t_9;
- __pyx_v_c2 = __pyx_t_10;
-
- /* "fontTools/varLib/iup.py":248
- * if lcj <= ncj:
- * c1, c2 = lcj, ncj
- * d1, d2 = ldj, ndj # <<<<<<<<<<<<<<
- * else:
- * c1, c2 = ncj, lcj
- */
- __pyx_t_10 = __pyx_v_ldj;
- __pyx_t_9 = __pyx_v_ndj;
- __pyx_v_d1 = __pyx_t_10;
- __pyx_v_d2 = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":246
- * ndj = nd[j]
- *
- * if lcj <= ncj: # <<<<<<<<<<<<<<
- * c1, c2 = lcj, ncj
- * d1, d2 = ldj, ndj
- */
- goto __pyx_L7;
- }
-
- /* "fontTools/varLib/iup.py":250
- * d1, d2 = ldj, ndj
- * else:
- * c1, c2 = ncj, lcj # <<<<<<<<<<<<<<
- * d1, d2 = ndj, ldj
- *
- */
- /*else*/ {
- __pyx_t_9 = __pyx_v_ncj;
- __pyx_t_10 = __pyx_v_lcj;
- __pyx_v_c1 = __pyx_t_9;
- __pyx_v_c2 = __pyx_t_10;
-
- /* "fontTools/varLib/iup.py":251
- * else:
- * c1, c2 = ncj, lcj
- * d1, d2 = ndj, ldj # <<<<<<<<<<<<<<
- *
- * force = False
- */
- __pyx_t_10 = __pyx_v_ndj;
- __pyx_t_9 = __pyx_v_ldj;
- __pyx_v_d1 = __pyx_t_10;
- __pyx_v_d2 = __pyx_t_9;
- }
- __pyx_L7:;
-
- /* "fontTools/varLib/iup.py":253
- * d1, d2 = ndj, ldj
- *
- * force = False # <<<<<<<<<<<<<<
- *
- * # If the two coordinates are the same, then the interpolation
- */
- __pyx_v_force = 0;
-
- /* "fontTools/varLib/iup.py":260
- * #
- * # This test has to be before the next one.
- * if c1 == c2: # <<<<<<<<<<<<<<
- * if abs(d1 - d2) > tolerance and abs(dj) > tolerance:
- * force = True
- */
- __pyx_t_3 = (__pyx_v_c1 == __pyx_v_c2);
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":261
- * # This test has to be before the next one.
- * if c1 == c2:
- * if abs(d1 - d2) > tolerance and abs(dj) > tolerance: # <<<<<<<<<<<<<<
- * force = True
- *
- */
- __pyx_t_7 = PyFloat_FromDouble(fabs((__pyx_v_d1 - __pyx_v_d2))); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_4 = PyObject_RichCompare(__pyx_t_7, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_4); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_11 = __Pyx_PyObject_IsTrue(__pyx_t_4); if (unlikely((__pyx_t_11 < 0))) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- if (__pyx_t_11) {
- } else {
- __pyx_t_3 = __pyx_t_11;
- goto __pyx_L10_bool_binop_done;
- }
- __pyx_t_4 = PyFloat_FromDouble(fabs(__pyx_v_dj)); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_7 = PyObject_RichCompare(__pyx_t_4, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_11 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_11 < 0))) __PYX_ERR(0, 261, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_3 = __pyx_t_11;
- __pyx_L10_bool_binop_done:;
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":262
- * if c1 == c2:
- * if abs(d1 - d2) > tolerance and abs(dj) > tolerance:
- * force = True # <<<<<<<<<<<<<<
- *
- * # If coordinate for current point is between coordinate of adjacent
- */
- __pyx_v_force = 1;
-
- /* "fontTools/varLib/iup.py":261
- * # This test has to be before the next one.
- * if c1 == c2:
- * if abs(d1 - d2) > tolerance and abs(dj) > tolerance: # <<<<<<<<<<<<<<
- * force = True
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":260
- * #
- * # This test has to be before the next one.
- * if c1 == c2: # <<<<<<<<<<<<<<
- * if abs(d1 - d2) > tolerance and abs(dj) > tolerance:
- * force = True
- */
- goto __pyx_L8;
- }
-
- /* "fontTools/varLib/iup.py":269
- * # allowance), then there is no way that current point can be IUP-ed.
- * # Mark it forced.
- * elif c1 <= cj <= c2: # and c1 != c2 # <<<<<<<<<<<<<<
- * if not (min(d1, d2) - tolerance <= dj <= max(d1, d2) + tolerance):
- * force = True
- */
- __pyx_t_3 = (__pyx_v_c1 <= __pyx_v_cj);
- if (__pyx_t_3) {
- __pyx_t_3 = (__pyx_v_cj <= __pyx_v_c2);
- }
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":270
- * # Mark it forced.
- * elif c1 <= cj <= c2: # and c1 != c2
- * if not (min(d1, d2) - tolerance <= dj <= max(d1, d2) + tolerance): # <<<<<<<<<<<<<<
- * force = True
- *
- */
- __pyx_t_9 = __pyx_v_d2;
- __pyx_t_10 = __pyx_v_d1;
- if ((__pyx_t_9 < __pyx_t_10)) {
- __pyx_t_12 = __pyx_t_9;
- } else {
- __pyx_t_12 = __pyx_t_10;
- }
- __pyx_t_7 = PyFloat_FromDouble(__pyx_t_12); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_4 = PyNumber_Subtract(__pyx_t_7, __pyx_v_tolerance); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = PyFloat_FromDouble(__pyx_v_dj); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_13 = PyObject_RichCompare(__pyx_t_4, __pyx_t_7, Py_LE); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 270, __pyx_L1_error)
- if (__Pyx_PyObject_IsTrue(__pyx_t_13)) {
- __Pyx_DECREF(__pyx_t_13);
- __pyx_t_12 = __pyx_v_d2;
- __pyx_t_9 = __pyx_v_d1;
- if ((__pyx_t_12 > __pyx_t_9)) {
- __pyx_t_10 = __pyx_t_12;
- } else {
- __pyx_t_10 = __pyx_t_9;
- }
- __pyx_t_14 = PyFloat_FromDouble(__pyx_t_10); if (unlikely(!__pyx_t_14)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_14);
- __pyx_t_15 = PyNumber_Add(__pyx_t_14, __pyx_v_tolerance); if (unlikely(!__pyx_t_15)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_15);
- __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0;
- __pyx_t_13 = PyObject_RichCompare(__pyx_t_7, __pyx_t_15, Py_LE); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_15); __pyx_t_15 = 0;
- }
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_13); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 270, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_11 = (!__pyx_t_3);
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":271
- * elif c1 <= cj <= c2: # and c1 != c2
- * if not (min(d1, d2) - tolerance <= dj <= max(d1, d2) + tolerance):
- * force = True # <<<<<<<<<<<<<<
- *
- * # Otherwise, the delta should either match the closest, or have the
- */
- __pyx_v_force = 1;
-
- /* "fontTools/varLib/iup.py":270
- * # Mark it forced.
- * elif c1 <= cj <= c2: # and c1 != c2
- * if not (min(d1, d2) - tolerance <= dj <= max(d1, d2) + tolerance): # <<<<<<<<<<<<<<
- * force = True
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":269
- * # allowance), then there is no way that current point can be IUP-ed.
- * # Mark it forced.
- * elif c1 <= cj <= c2: # and c1 != c2 # <<<<<<<<<<<<<<
- * if not (min(d1, d2) - tolerance <= dj <= max(d1, d2) + tolerance):
- * force = True
- */
- goto __pyx_L8;
- }
-
- /* "fontTools/varLib/iup.py":276
- * # same sign as the interpolation of the two deltas.
- * else: # cj < c1 or c2 < cj
- * if d1 != d2: # <<<<<<<<<<<<<<
- * if cj < c1:
- * if (
- */
- /*else*/ {
- __pyx_t_11 = (__pyx_v_d1 != __pyx_v_d2);
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":277
- * else: # cj < c1 or c2 < cj
- * if d1 != d2:
- * if cj < c1: # <<<<<<<<<<<<<<
- * if (
- * abs(dj) > tolerance
- */
- __pyx_t_11 = (__pyx_v_cj < __pyx_v_c1);
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":279
- * if cj < c1:
- * if (
- * abs(dj) > tolerance # <<<<<<<<<<<<<<
- * and abs(dj - d1) > tolerance
- * and ((dj - tolerance < d1) != (d1 < d2))
- */
- __pyx_t_13 = PyFloat_FromDouble(fabs(__pyx_v_dj)); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 279, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_7 = PyObject_RichCompare(__pyx_t_13, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 279, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 279, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (__pyx_t_3) {
- } else {
- __pyx_t_11 = __pyx_t_3;
- goto __pyx_L16_bool_binop_done;
- }
-
- /* "fontTools/varLib/iup.py":280
- * if (
- * abs(dj) > tolerance
- * and abs(dj - d1) > tolerance # <<<<<<<<<<<<<<
- * and ((dj - tolerance < d1) != (d1 < d2))
- * ):
- */
- __pyx_t_7 = PyFloat_FromDouble(fabs((__pyx_v_dj - __pyx_v_d1))); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 280, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_13 = PyObject_RichCompare(__pyx_t_7, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 280, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_13); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 280, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- if (__pyx_t_3) {
- } else {
- __pyx_t_11 = __pyx_t_3;
- goto __pyx_L16_bool_binop_done;
- }
-
- /* "fontTools/varLib/iup.py":281
- * abs(dj) > tolerance
- * and abs(dj - d1) > tolerance
- * and ((dj - tolerance < d1) != (d1 < d2)) # <<<<<<<<<<<<<<
- * ):
- * force = True
- */
- __pyx_t_13 = PyFloat_FromDouble(__pyx_v_dj); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_7 = PyNumber_Subtract(__pyx_t_13, __pyx_v_tolerance); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_13 = PyFloat_FromDouble(__pyx_v_d1); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_4 = PyObject_RichCompare(__pyx_t_7, __pyx_t_13, Py_LT); __Pyx_XGOTREF(__pyx_t_4); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_13 = __Pyx_PyBool_FromLong((__pyx_v_d1 < __pyx_v_d2)); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_7 = PyObject_RichCompare(__pyx_t_4, __pyx_t_13, Py_NE); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 281, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_11 = __pyx_t_3;
- __pyx_L16_bool_binop_done:;
-
- /* "fontTools/varLib/iup.py":278
- * if d1 != d2:
- * if cj < c1:
- * if ( # <<<<<<<<<<<<<<
- * abs(dj) > tolerance
- * and abs(dj - d1) > tolerance
- */
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":283
- * and ((dj - tolerance < d1) != (d1 < d2))
- * ):
- * force = True # <<<<<<<<<<<<<<
- * else: # c2 < cj
- * if (
- */
- __pyx_v_force = 1;
-
- /* "fontTools/varLib/iup.py":278
- * if d1 != d2:
- * if cj < c1:
- * if ( # <<<<<<<<<<<<<<
- * abs(dj) > tolerance
- * and abs(dj - d1) > tolerance
- */
- }
-
- /* "fontTools/varLib/iup.py":277
- * else: # cj < c1 or c2 < cj
- * if d1 != d2:
- * if cj < c1: # <<<<<<<<<<<<<<
- * if (
- * abs(dj) > tolerance
- */
- goto __pyx_L14;
- }
-
- /* "fontTools/varLib/iup.py":285
- * force = True
- * else: # c2 < cj
- * if ( # <<<<<<<<<<<<<<
- * abs(dj) > tolerance
- * and abs(dj - d2) > tolerance
- */
- /*else*/ {
-
- /* "fontTools/varLib/iup.py":286
- * else: # c2 < cj
- * if (
- * abs(dj) > tolerance # <<<<<<<<<<<<<<
- * and abs(dj - d2) > tolerance
- * and ((d2 < dj + tolerance) != (d1 < d2))
- */
- __pyx_t_7 = PyFloat_FromDouble(fabs(__pyx_v_dj)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 286, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_13 = PyObject_RichCompare(__pyx_t_7, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 286, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_13); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 286, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- if (__pyx_t_3) {
- } else {
- __pyx_t_11 = __pyx_t_3;
- goto __pyx_L20_bool_binop_done;
- }
-
- /* "fontTools/varLib/iup.py":287
- * if (
- * abs(dj) > tolerance
- * and abs(dj - d2) > tolerance # <<<<<<<<<<<<<<
- * and ((d2 < dj + tolerance) != (d1 < d2))
- * ):
- */
- __pyx_t_13 = PyFloat_FromDouble(fabs((__pyx_v_dj - __pyx_v_d2))); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 287, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_7 = PyObject_RichCompare(__pyx_t_13, __pyx_v_tolerance, Py_GT); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 287, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 287, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (__pyx_t_3) {
- } else {
- __pyx_t_11 = __pyx_t_3;
- goto __pyx_L20_bool_binop_done;
- }
-
- /* "fontTools/varLib/iup.py":288
- * abs(dj) > tolerance
- * and abs(dj - d2) > tolerance
- * and ((d2 < dj + tolerance) != (d1 < d2)) # <<<<<<<<<<<<<<
- * ):
- * force = True
- */
- __pyx_t_7 = PyFloat_FromDouble(__pyx_v_d2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_13 = PyFloat_FromDouble(__pyx_v_dj); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_13);
- __pyx_t_4 = PyNumber_Add(__pyx_t_13, __pyx_v_tolerance); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __pyx_t_13 = PyObject_RichCompare(__pyx_t_7, __pyx_t_4, Py_LT); __Pyx_XGOTREF(__pyx_t_13); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_4 = __Pyx_PyBool_FromLong((__pyx_v_d1 < __pyx_v_d2)); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_7 = PyObject_RichCompare(__pyx_t_13, __pyx_t_4, Py_NE); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0;
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 288, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_11 = __pyx_t_3;
- __pyx_L20_bool_binop_done:;
-
- /* "fontTools/varLib/iup.py":285
- * force = True
- * else: # c2 < cj
- * if ( # <<<<<<<<<<<<<<
- * abs(dj) > tolerance
- * and abs(dj - d2) > tolerance
- */
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":290
- * and ((d2 < dj + tolerance) != (d1 < d2))
- * ):
- * force = True # <<<<<<<<<<<<<<
- *
- * if force:
- */
- __pyx_v_force = 1;
-
- /* "fontTools/varLib/iup.py":285
- * force = True
- * else: # c2 < cj
- * if ( # <<<<<<<<<<<<<<
- * abs(dj) > tolerance
- * and abs(dj - d2) > tolerance
- */
- }
- }
- __pyx_L14:;
-
- /* "fontTools/varLib/iup.py":276
- * # same sign as the interpolation of the two deltas.
- * else: # cj < c1 or c2 < cj
- * if d1 != d2: # <<<<<<<<<<<<<<
- * if cj < c1:
- * if (
- */
- }
- }
- __pyx_L8:;
-
- /* "fontTools/varLib/iup.py":292
- * force = True
- *
- * if force: # <<<<<<<<<<<<<<
- * forced.add(i)
- * break
- */
- __pyx_t_11 = (__pyx_v_force != 0);
- if (__pyx_t_11) {
-
- /* "fontTools/varLib/iup.py":293
- *
- * if force:
- * forced.add(i) # <<<<<<<<<<<<<<
- * break
- *
- */
- __pyx_t_16 = PySet_Add(__pyx_v_forced, __pyx_v_i); if (unlikely(__pyx_t_16 == ((int)-1))) __PYX_ERR(0, 293, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":294
- * if force:
- * forced.add(i)
- * break # <<<<<<<<<<<<<<
- *
- * return forced
- */
- goto __pyx_L6_break;
-
- /* "fontTools/varLib/iup.py":292
- * force = True
- *
- * if force: # <<<<<<<<<<<<<<
- * forced.add(i)
- * break
- */
- }
-
- /* "fontTools/varLib/iup.py":238
- * nd, nc = deltas[i - n + 1], coords[i - n + 1]
- *
- * for j in (0, 1): # For X and for Y # <<<<<<<<<<<<<<
- * cj = c[j]
- * dj = d[j]
- */
- }
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- goto __pyx_L24_for_end;
- __pyx_L6_break:;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- goto __pyx_L24_for_end;
- __pyx_L24_for_end:;
-
- /* "fontTools/varLib/iup.py":233
- * forced = set()
- * # Track "last" and "next" points on the contour as we sweep.
- * for i in range(len(deltas) - 1, -1, -1): # <<<<<<<<<<<<<<
- * ld, lc = deltas[i - 1], coords[i - 1]
- * d, c = deltas[i], coords[i]
- */
- }
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":296
- * break
- *
- * return forced # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_forced);
- __pyx_r = __pyx_v_forced;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":208
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * cj=cython.double,
- * dj=cython.double,
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_13);
- __Pyx_XDECREF(__pyx_t_14);
- __Pyx_XDECREF(__pyx_t_15);
- __Pyx_AddTraceback("fontTools.varLib.iup._iup_contour_bound_forced_set", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_forced);
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_i);
- __Pyx_XDECREF(__pyx_v_ld);
- __Pyx_XDECREF(__pyx_v_lc);
- __Pyx_XDECREF(__pyx_v_d);
- __Pyx_XDECREF(__pyx_v_c);
- __Pyx_XDECREF(__pyx_v_nd);
- __Pyx_XDECREF(__pyx_v_nc);
- __Pyx_XDECREF(__pyx_v_j);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":299
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * i=cython.int,
- * j=cython.int,
- */
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_16__defaults__(CYTHON_UNUSED PyObject *__pyx_self) {
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- PyObject *__pyx_t_1 = NULL;
- PyObject *__pyx_t_2 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("__defaults__", 0);
- __Pyx_XDECREF(__pyx_r);
-
- /* "fontTools/varLib/iup.py":312
- * coords: _PointSegment,
- * forced=set(),
- * tolerance: Real = 0, # <<<<<<<<<<<<<<
- * lookback: Integral = None,
- * ):
- */
- __pyx_t_1 = PyFloat_FromDouble(((double)0.0)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 312, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
-
- /* "fontTools/varLib/iup.py":299
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * i=cython.int,
- * j=cython.int,
- */
- __pyx_t_2 = PyTuple_New(3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_INCREF(__Pyx_CyFunction_Defaults(__pyx_defaults, __pyx_self)->__pyx_arg_forced);
- __Pyx_GIVEREF(__Pyx_CyFunction_Defaults(__pyx_defaults, __pyx_self)->__pyx_arg_forced);
- PyTuple_SET_ITEM(__pyx_t_2, 0, __Pyx_CyFunction_Defaults(__pyx_defaults, __pyx_self)->__pyx_arg_forced);
- __Pyx_GIVEREF(__pyx_t_1);
- PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_t_1);
- __Pyx_INCREF(((PyObject *)Py_None));
- __Pyx_GIVEREF(((PyObject *)Py_None));
- PyTuple_SET_ITEM(__pyx_t_2, 2, ((PyObject *)Py_None));
- __pyx_t_1 = 0;
- __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __Pyx_GIVEREF(__pyx_t_2);
- PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_2);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyTuple_SET_ITEM(__pyx_t_1, 1, Py_None);
- __pyx_t_2 = 0;
- __pyx_r = __pyx_t_1;
- __pyx_t_1 = 0;
- goto __pyx_L0;
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_AddTraceback("fontTools.varLib.iup.__defaults__", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_7_iup_contour_optimize_dp(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_6_iup_contour_optimize_dp, "_iup_contour_optimize_dp(deltas: _DeltaSegment, coords: _PointSegment, set forced=set(), double tolerance: Real = 0, lookback: Integral = None)\nStraightforward Dynamic-Programming. For each index i, find least-costly encoding of\n points 0 to i where i is explicitly encoded. We find this by considering all previous\n explicit points j and check whether interpolation can fill points between j and i.\n\n Note that solution always encodes last point explicitly. Higher-level is responsible\n for removing that restriction.\n\n As major speedup, we stop looking further whenever we see a \"forced\" point.");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_7_iup_contour_optimize_dp = {"_iup_contour_optimize_dp", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_7_iup_contour_optimize_dp, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_6_iup_contour_optimize_dp};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_7_iup_contour_optimize_dp(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- PyObject *__pyx_v_forced = 0;
- double __pyx_v_tolerance;
- PyObject *__pyx_v_lookback = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("_iup_contour_optimize_dp (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,&__pyx_n_s_forced,&__pyx_n_s_tolerance,&__pyx_n_s_lookback,0};
- PyObject* values[5] = {0,0,0,0,0};
- __pyx_defaults *__pyx_dynamic_args = __Pyx_CyFunction_Defaults(__pyx_defaults, __pyx_self);
- values[2] = __pyx_dynamic_args->__pyx_arg_forced;
- values[4] = ((PyObject *)((PyObject *)Py_None));
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 5: values[4] = __Pyx_Arg_FASTCALL(__pyx_args, 4);
- CYTHON_FALLTHROUGH;
- case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3);
- CYTHON_FALLTHROUGH;
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 299, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 299, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("_iup_contour_optimize_dp", 0, 2, 5, 1); __PYX_ERR(0, 299, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_forced);
- if (value) { values[2] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 299, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 3:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_tolerance);
- if (value) { values[3] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 299, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 4:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_lookback);
- if (value) { values[4] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 299, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "_iup_contour_optimize_dp") < 0)) __PYX_ERR(0, 299, __pyx_L3_error)
- }
- } else {
- switch (__pyx_nargs) {
- case 5: values[4] = __Pyx_Arg_FASTCALL(__pyx_args, 4);
- CYTHON_FALLTHROUGH;
- case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3);
- CYTHON_FALLTHROUGH;
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- break;
- default: goto __pyx_L5_argtuple_error;
- }
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- __pyx_v_forced = ((PyObject*)values[2]);
- if (values[3]) {
- __pyx_v_tolerance = __pyx_PyFloat_AsDouble(values[3]); if (unlikely((__pyx_v_tolerance == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 312, __pyx_L3_error)
- } else {
- __pyx_v_tolerance = ((double)((double)0.0));
- }
- __pyx_v_lookback = values[4];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("_iup_contour_optimize_dp", 0, 2, 5, __pyx_nargs); __PYX_ERR(0, 299, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup._iup_contour_optimize_dp", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_forced), (&PySet_Type), 1, "forced", 1))) __PYX_ERR(0, 311, __pyx_L1_error)
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_6_iup_contour_optimize_dp(__pyx_self, __pyx_v_deltas, __pyx_v_coords, __pyx_v_forced, __pyx_v_tolerance, __pyx_v_lookback);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_6_iup_contour_optimize_dp(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_forced, double __pyx_v_tolerance, PyObject *__pyx_v_lookback) {
- int __pyx_v_i;
- int __pyx_v_j;
- double __pyx_v_best_cost;
- double __pyx_v_cost;
- Py_ssize_t __pyx_v_n;
- PyObject *__pyx_v_costs = NULL;
- PyObject *__pyx_v_chain = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- Py_ssize_t __pyx_t_1;
- int __pyx_t_2;
- PyObject *__pyx_t_3 = NULL;
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- PyObject *__pyx_t_6 = NULL;
- Py_ssize_t __pyx_t_7;
- int __pyx_t_8;
- double __pyx_t_9;
- long __pyx_t_10;
- long __pyx_t_11;
- int __pyx_t_12;
- int __pyx_t_13;
- int __pyx_t_14;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("_iup_contour_optimize_dp", 0);
- __Pyx_INCREF(__pyx_v_lookback);
-
- /* "fontTools/varLib/iup.py":324
- * As major speedup, we stop looking further whenever we see a "forced" point."""
- *
- * n = len(deltas) # <<<<<<<<<<<<<<
- * if lookback is None:
- * lookback = n
- */
- __pyx_t_1 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 324, __pyx_L1_error)
- __pyx_v_n = __pyx_t_1;
-
- /* "fontTools/varLib/iup.py":325
- *
- * n = len(deltas)
- * if lookback is None: # <<<<<<<<<<<<<<
- * lookback = n
- * lookback = min(lookback, MAX_LOOKBACK)
- */
- __pyx_t_2 = (__pyx_v_lookback == Py_None);
- if (__pyx_t_2) {
-
- /* "fontTools/varLib/iup.py":326
- * n = len(deltas)
- * if lookback is None:
- * lookback = n # <<<<<<<<<<<<<<
- * lookback = min(lookback, MAX_LOOKBACK)
- * costs = {-1: 0}
- */
- __pyx_t_3 = PyInt_FromSsize_t(__pyx_v_n); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 326, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF_SET(__pyx_v_lookback, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":325
- *
- * n = len(deltas)
- * if lookback is None: # <<<<<<<<<<<<<<
- * lookback = n
- * lookback = min(lookback, MAX_LOOKBACK)
- */
- }
-
- /* "fontTools/varLib/iup.py":327
- * if lookback is None:
- * lookback = n
- * lookback = min(lookback, MAX_LOOKBACK) # <<<<<<<<<<<<<<
- * costs = {-1: 0}
- * chain = {-1: None}
- */
- __Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_MAX_LOOKBACK); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 327, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_INCREF(__pyx_v_lookback);
- __pyx_t_4 = __pyx_v_lookback;
- __pyx_t_6 = PyObject_RichCompare(__pyx_t_3, __pyx_t_4, Py_LT); __Pyx_XGOTREF(__pyx_t_6); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 327, __pyx_L1_error)
- __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_6); if (unlikely((__pyx_t_2 < 0))) __PYX_ERR(0, 327, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (__pyx_t_2) {
- __Pyx_INCREF(__pyx_t_3);
- __pyx_t_5 = __pyx_t_3;
- } else {
- __Pyx_INCREF(__pyx_t_4);
- __pyx_t_5 = __pyx_t_4;
- }
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __pyx_t_5;
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF_SET(__pyx_v_lookback, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":328
- * lookback = n
- * lookback = min(lookback, MAX_LOOKBACK)
- * costs = {-1: 0} # <<<<<<<<<<<<<<
- * chain = {-1: None}
- * for i in range(0, n):
- */
- __pyx_t_3 = __Pyx_PyDict_NewPresized(1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 328, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- if (PyDict_SetItem(__pyx_t_3, __pyx_int_neg_1, __pyx_int_0) < 0) __PYX_ERR(0, 328, __pyx_L1_error)
- __pyx_v_costs = ((PyObject*)__pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":329
- * lookback = min(lookback, MAX_LOOKBACK)
- * costs = {-1: 0}
- * chain = {-1: None} # <<<<<<<<<<<<<<
- * for i in range(0, n):
- * best_cost = costs[i - 1] + 1
- */
- __pyx_t_3 = __Pyx_PyDict_NewPresized(1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 329, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- if (PyDict_SetItem(__pyx_t_3, __pyx_int_neg_1, Py_None) < 0) __PYX_ERR(0, 329, __pyx_L1_error)
- __pyx_v_chain = ((PyObject*)__pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":330
- * costs = {-1: 0}
- * chain = {-1: None}
- * for i in range(0, n): # <<<<<<<<<<<<<<
- * best_cost = costs[i - 1] + 1
- *
- */
- __pyx_t_1 = __pyx_v_n;
- __pyx_t_7 = __pyx_t_1;
- for (__pyx_t_8 = 0; __pyx_t_8 < __pyx_t_7; __pyx_t_8+=1) {
- __pyx_v_i = __pyx_t_8;
-
- /* "fontTools/varLib/iup.py":331
- * chain = {-1: None}
- * for i in range(0, n):
- * best_cost = costs[i - 1] + 1 # <<<<<<<<<<<<<<
- *
- * costs[i] = best_cost
- */
- __pyx_t_3 = __Pyx_PyInt_From_long((__pyx_v_i - 1)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 331, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyDict_GetItem(__pyx_v_costs, __pyx_t_3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 331, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_PyInt_AddObjC(__pyx_t_5, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 331, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 331, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_v_best_cost = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":333
- * best_cost = costs[i - 1] + 1
- *
- * costs[i] = best_cost # <<<<<<<<<<<<<<
- * chain[i] = i - 1
- *
- */
- __pyx_t_3 = PyFloat_FromDouble(__pyx_v_best_cost); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 333, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyInt_From_int(__pyx_v_i); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 333, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- if (unlikely((PyDict_SetItem(__pyx_v_costs, __pyx_t_5, __pyx_t_3) < 0))) __PYX_ERR(0, 333, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":334
- *
- * costs[i] = best_cost
- * chain[i] = i - 1 # <<<<<<<<<<<<<<
- *
- * if i - 1 in forced:
- */
- __pyx_t_3 = __Pyx_PyInt_From_long((__pyx_v_i - 1)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 334, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyInt_From_int(__pyx_v_i); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 334, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- if (unlikely((PyDict_SetItem(__pyx_v_chain, __pyx_t_5, __pyx_t_3) < 0))) __PYX_ERR(0, 334, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":336
- * chain[i] = i - 1
- *
- * if i - 1 in forced: # <<<<<<<<<<<<<<
- * continue
- *
- */
- __pyx_t_3 = __Pyx_PyInt_From_long((__pyx_v_i - 1)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 336, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- if (unlikely(__pyx_v_forced == Py_None)) {
- PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable");
- __PYX_ERR(0, 336, __pyx_L1_error)
- }
- __pyx_t_2 = (__Pyx_PySet_ContainsTF(__pyx_t_3, __pyx_v_forced, Py_EQ)); if (unlikely((__pyx_t_2 < 0))) __PYX_ERR(0, 336, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_2) {
-
- /* "fontTools/varLib/iup.py":337
- *
- * if i - 1 in forced:
- * continue # <<<<<<<<<<<<<<
- *
- * for j in range(i - 2, max(i - lookback, -2), -1):
- */
- goto __pyx_L4_continue;
-
- /* "fontTools/varLib/iup.py":336
- * chain[i] = i - 1
- *
- * if i - 1 in forced: # <<<<<<<<<<<<<<
- * continue
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":339
- * continue
- *
- * for j in range(i - 2, max(i - lookback, -2), -1): # <<<<<<<<<<<<<<
- * cost = costs[j] + 1
- *
- */
- __pyx_t_10 = -2L;
- __pyx_t_3 = __Pyx_PyInt_From_int(__pyx_v_i); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = PyNumber_Subtract(__pyx_t_3, __pyx_v_lookback); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_4 = __Pyx_PyInt_From_long(__pyx_t_10); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_6 = PyObject_RichCompare(__pyx_t_4, __pyx_t_5, Py_GT); __Pyx_XGOTREF(__pyx_t_6); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_6); if (unlikely((__pyx_t_2 < 0))) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (__pyx_t_2) {
- __pyx_t_6 = __Pyx_PyInt_From_long(__pyx_t_10); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_3 = __pyx_t_6;
- __pyx_t_6 = 0;
- } else {
- __Pyx_INCREF(__pyx_t_5);
- __pyx_t_3 = __pyx_t_5;
- }
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_10 = __Pyx_PyInt_As_long(__pyx_t_3); if (unlikely((__pyx_t_10 == (long)-1) && PyErr_Occurred())) __PYX_ERR(0, 339, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_11 = __pyx_t_10;
- for (__pyx_t_12 = (__pyx_v_i - 2); __pyx_t_12 > __pyx_t_11; __pyx_t_12-=1) {
- __pyx_v_j = __pyx_t_12;
-
- /* "fontTools/varLib/iup.py":340
- *
- * for j in range(i - 2, max(i - lookback, -2), -1):
- * cost = costs[j] + 1 # <<<<<<<<<<<<<<
- *
- * if cost < best_cost and can_iup_in_between(deltas, coords, j, i, tolerance):
- */
- __pyx_t_3 = __Pyx_PyInt_From_int(__pyx_v_j); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 340, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyDict_GetItem(__pyx_v_costs, __pyx_t_3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 340, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_PyInt_AddObjC(__pyx_t_5, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 340, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_9 = __pyx_PyFloat_AsDouble(__pyx_t_3); if (unlikely((__pyx_t_9 == (double)-1) && PyErr_Occurred())) __PYX_ERR(0, 340, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_v_cost = __pyx_t_9;
-
- /* "fontTools/varLib/iup.py":342
- * cost = costs[j] + 1
- *
- * if cost < best_cost and can_iup_in_between(deltas, coords, j, i, tolerance): # <<<<<<<<<<<<<<
- * costs[i] = best_cost = cost
- * chain[i] = j
- */
- __pyx_t_13 = (__pyx_v_cost < __pyx_v_best_cost);
- if (__pyx_t_13) {
- } else {
- __pyx_t_2 = __pyx_t_13;
- goto __pyx_L10_bool_binop_done;
- }
- __pyx_t_14 = __pyx_f_9fontTools_6varLib_3iup_can_iup_in_between(__pyx_v_deltas, __pyx_v_coords, __pyx_v_j, __pyx_v_i, __pyx_v_tolerance); if (unlikely(__pyx_t_14 == ((int)-1) && PyErr_Occurred())) __PYX_ERR(0, 342, __pyx_L1_error)
- __pyx_t_13 = (__pyx_t_14 != 0);
- __pyx_t_2 = __pyx_t_13;
- __pyx_L10_bool_binop_done:;
- if (__pyx_t_2) {
-
- /* "fontTools/varLib/iup.py":343
- *
- * if cost < best_cost and can_iup_in_between(deltas, coords, j, i, tolerance):
- * costs[i] = best_cost = cost # <<<<<<<<<<<<<<
- * chain[i] = j
- *
- */
- __pyx_t_3 = PyFloat_FromDouble(__pyx_v_cost); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 343, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyInt_From_int(__pyx_v_i); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 343, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- if (unlikely((PyDict_SetItem(__pyx_v_costs, __pyx_t_5, __pyx_t_3) < 0))) __PYX_ERR(0, 343, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_v_best_cost = __pyx_v_cost;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":344
- * if cost < best_cost and can_iup_in_between(deltas, coords, j, i, tolerance):
- * costs[i] = best_cost = cost
- * chain[i] = j # <<<<<<<<<<<<<<
- *
- * if j in forced:
- */
- __pyx_t_3 = __Pyx_PyInt_From_int(__pyx_v_j); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 344, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyInt_From_int(__pyx_v_i); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 344, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- if (unlikely((PyDict_SetItem(__pyx_v_chain, __pyx_t_5, __pyx_t_3) < 0))) __PYX_ERR(0, 344, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":342
- * cost = costs[j] + 1
- *
- * if cost < best_cost and can_iup_in_between(deltas, coords, j, i, tolerance): # <<<<<<<<<<<<<<
- * costs[i] = best_cost = cost
- * chain[i] = j
- */
- }
-
- /* "fontTools/varLib/iup.py":346
- * chain[i] = j
- *
- * if j in forced: # <<<<<<<<<<<<<<
- * break
- *
- */
- __pyx_t_3 = __Pyx_PyInt_From_int(__pyx_v_j); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 346, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- if (unlikely(__pyx_v_forced == Py_None)) {
- PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable");
- __PYX_ERR(0, 346, __pyx_L1_error)
- }
- __pyx_t_2 = (__Pyx_PySet_ContainsTF(__pyx_t_3, __pyx_v_forced, Py_EQ)); if (unlikely((__pyx_t_2 < 0))) __PYX_ERR(0, 346, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_2) {
-
- /* "fontTools/varLib/iup.py":347
- *
- * if j in forced:
- * break # <<<<<<<<<<<<<<
- *
- * return chain, costs
- */
- goto __pyx_L8_break;
-
- /* "fontTools/varLib/iup.py":346
- * chain[i] = j
- *
- * if j in forced: # <<<<<<<<<<<<<<
- * break
- *
- */
- }
- }
- __pyx_L8_break:;
- __pyx_L4_continue:;
- }
-
- /* "fontTools/varLib/iup.py":349
- * break
- *
- * return chain, costs # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 349, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_INCREF(__pyx_v_chain);
- __Pyx_GIVEREF(__pyx_v_chain);
- PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_v_chain);
- __Pyx_INCREF(__pyx_v_costs);
- __Pyx_GIVEREF(__pyx_v_costs);
- PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_costs);
- __pyx_r = __pyx_t_3;
- __pyx_t_3 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":299
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * i=cython.int,
- * j=cython.int,
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_AddTraceback("fontTools.varLib.iup._iup_contour_optimize_dp", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_costs);
- __Pyx_XDECREF(__pyx_v_chain);
- __Pyx_XDECREF(__pyx_v_lookback);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":352
- *
- *
- * def _rot_list(l: list, k: int): # <<<<<<<<<<<<<<
- * """Rotate list by k items forward. Ie. item at position 0 will be
- * at position k in returned list. Negative k is allowed."""
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_9_rot_list(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_8_rot_list, "_rot_list(list l: list, int k: int)\nRotate list by k items forward. Ie. item at position 0 will be\n at position k in returned list. Negative k is allowed.");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_9_rot_list = {"_rot_list", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_9_rot_list, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_8_rot_list};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_9_rot_list(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_l = 0;
- PyObject *__pyx_v_k = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("_rot_list (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_l,&__pyx_n_s_k,0};
- PyObject* values[2] = {0,0};
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_l)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 352, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_k)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 352, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("_rot_list", 1, 2, 2, 1); __PYX_ERR(0, 352, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "_rot_list") < 0)) __PYX_ERR(0, 352, __pyx_L3_error)
- }
- } else if (unlikely(__pyx_nargs != 2)) {
- goto __pyx_L5_argtuple_error;
- } else {
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- }
- __pyx_v_l = ((PyObject*)values[0]);
- __pyx_v_k = ((PyObject*)values[1]);
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("_rot_list", 1, 2, 2, __pyx_nargs); __PYX_ERR(0, 352, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup._rot_list", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_l), (&PyList_Type), 0, "l", 1))) __PYX_ERR(0, 352, __pyx_L1_error)
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_k), (&PyInt_Type), 0, "k", 1))) __PYX_ERR(0, 352, __pyx_L1_error)
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_8_rot_list(__pyx_self, __pyx_v_l, __pyx_v_k);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_8_rot_list(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_l, PyObject *__pyx_v_k) {
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- Py_ssize_t __pyx_t_1;
- PyObject *__pyx_t_2 = NULL;
- int __pyx_t_3;
- int __pyx_t_4;
- Py_ssize_t __pyx_t_5;
- PyObject *__pyx_t_6 = NULL;
- PyObject *__pyx_t_7 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("_rot_list", 0);
- __Pyx_INCREF(__pyx_v_k);
-
- /* "fontTools/varLib/iup.py":355
- * """Rotate list by k items forward. Ie. item at position 0 will be
- * at position k in returned list. Negative k is allowed."""
- * n = len(l) # <<<<<<<<<<<<<<
- * k %= n
- * if not k:
- */
- __pyx_t_1 = PyList_GET_SIZE(__pyx_v_l); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 355, __pyx_L1_error)
- __pyx_t_2 = PyInt_FromSsize_t(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 355, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_n = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":356
- * at position k in returned list. Negative k is allowed."""
- * n = len(l)
- * k %= n # <<<<<<<<<<<<<<
- * if not k:
- * return l
- */
- __pyx_t_2 = PyNumber_InPlaceRemainder(__pyx_v_k, __pyx_v_n); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 356, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- if (!(likely(__Pyx_Py3Int_CheckExact(__pyx_t_2))||((__pyx_t_2) == Py_None) || __Pyx_RaiseUnexpectedTypeError("int", __pyx_t_2))) __PYX_ERR(0, 356, __pyx_L1_error)
- __Pyx_DECREF_SET(__pyx_v_k, ((PyObject*)__pyx_t_2));
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":357
- * n = len(l)
- * k %= n
- * if not k: # <<<<<<<<<<<<<<
- * return l
- * return l[n - k :] + l[: n - k]
- */
- __pyx_t_3 = __Pyx_PyObject_IsTrue(__pyx_v_k); if (unlikely((__pyx_t_3 < 0))) __PYX_ERR(0, 357, __pyx_L1_error)
- __pyx_t_4 = (!__pyx_t_3);
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":358
- * k %= n
- * if not k:
- * return l # <<<<<<<<<<<<<<
- * return l[n - k :] + l[: n - k]
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_l);
- __pyx_r = __pyx_v_l;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":357
- * n = len(l)
- * k %= n
- * if not k: # <<<<<<<<<<<<<<
- * return l
- * return l[n - k :] + l[: n - k]
- */
- }
-
- /* "fontTools/varLib/iup.py":359
- * if not k:
- * return l
- * return l[n - k :] + l[: n - k] # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_2 = PyNumber_Subtract(__pyx_v_n, __pyx_v_k); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 359, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_4 = (__pyx_t_2 == Py_None);
- if (__pyx_t_4) {
- __pyx_t_1 = 0;
- } else {
- __pyx_t_5 = __Pyx_PyIndex_AsSsize_t(__pyx_t_2); if (unlikely((__pyx_t_5 == (Py_ssize_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 359, __pyx_L1_error)
- __pyx_t_1 = __pyx_t_5;
- }
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_2 = __Pyx_PyList_GetSlice(__pyx_v_l, __pyx_t_1, PY_SSIZE_T_MAX); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 359, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_6 = PyNumber_Subtract(__pyx_v_n, __pyx_v_k); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 359, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_4 = (__pyx_t_6 == Py_None);
- if (__pyx_t_4) {
- __pyx_t_1 = PY_SSIZE_T_MAX;
- } else {
- __pyx_t_5 = __Pyx_PyIndex_AsSsize_t(__pyx_t_6); if (unlikely((__pyx_t_5 == (Py_ssize_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 359, __pyx_L1_error)
- __pyx_t_1 = __pyx_t_5;
- }
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_PyList_GetSlice(__pyx_v_l, 0, __pyx_t_1); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 359, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_7 = PyNumber_Add(__pyx_t_2, __pyx_t_6); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 359, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_r = __pyx_t_7;
- __pyx_t_7 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":352
- *
- *
- * def _rot_list(l: list, k: int): # <<<<<<<<<<<<<<
- * """Rotate list by k items forward. Ie. item at position 0 will be
- * at position k in returned list. Negative k is allowed."""
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_AddTraceback("fontTools.varLib.iup._rot_list", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_k);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":362
- *
- *
- * def _rot_set(s: set, k: int, n: int): # <<<<<<<<<<<<<<
- * k %= n
- * if not k:
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_11_rot_set(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_10_rot_set, "_rot_set(set s: set, int k: int, int n: int)");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_11_rot_set = {"_rot_set", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_11_rot_set, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_10_rot_set};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_11_rot_set(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_s = 0;
- PyObject *__pyx_v_k = 0;
- PyObject *__pyx_v_n = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("_rot_set (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_s,&__pyx_n_s_k,&__pyx_n_s_n,0};
- PyObject* values[3] = {0,0,0};
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_s)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 362, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_k)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 362, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("_rot_set", 1, 3, 3, 1); __PYX_ERR(0, 362, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_n)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 362, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("_rot_set", 1, 3, 3, 2); __PYX_ERR(0, 362, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "_rot_set") < 0)) __PYX_ERR(0, 362, __pyx_L3_error)
- }
- } else if (unlikely(__pyx_nargs != 3)) {
- goto __pyx_L5_argtuple_error;
- } else {
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- }
- __pyx_v_s = ((PyObject*)values[0]);
- __pyx_v_k = ((PyObject*)values[1]);
- __pyx_v_n = ((PyObject*)values[2]);
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("_rot_set", 1, 3, 3, __pyx_nargs); __PYX_ERR(0, 362, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup._rot_set", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_s), (&PySet_Type), 0, "s", 1))) __PYX_ERR(0, 362, __pyx_L1_error)
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_k), (&PyInt_Type), 0, "k", 1))) __PYX_ERR(0, 362, __pyx_L1_error)
- if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_n), (&PyInt_Type), 0, "n", 1))) __PYX_ERR(0, 362, __pyx_L1_error)
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_10_rot_set(__pyx_self, __pyx_v_s, __pyx_v_k, __pyx_v_n);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_10_rot_set(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_s, PyObject *__pyx_v_k, PyObject *__pyx_v_n) {
- PyObject *__pyx_8genexpr2__pyx_v_v = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- PyObject *__pyx_t_1 = NULL;
- int __pyx_t_2;
- int __pyx_t_3;
- PyObject *__pyx_t_4 = NULL;
- Py_ssize_t __pyx_t_5;
- Py_ssize_t __pyx_t_6;
- int __pyx_t_7;
- PyObject *__pyx_t_8 = NULL;
- int __pyx_t_9;
- PyObject *__pyx_t_10 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("_rot_set", 0);
- __Pyx_INCREF(__pyx_v_k);
-
- /* "fontTools/varLib/iup.py":363
- *
- * def _rot_set(s: set, k: int, n: int):
- * k %= n # <<<<<<<<<<<<<<
- * if not k:
- * return s
- */
- __pyx_t_1 = PyNumber_InPlaceRemainder(__pyx_v_k, __pyx_v_n); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 363, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __Pyx_DECREF_SET(__pyx_v_k, ((PyObject*)__pyx_t_1));
- __pyx_t_1 = 0;
-
- /* "fontTools/varLib/iup.py":364
- * def _rot_set(s: set, k: int, n: int):
- * k %= n
- * if not k: # <<<<<<<<<<<<<<
- * return s
- * return {(v + k) % n for v in s}
- */
- __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_v_k); if (unlikely((__pyx_t_2 < 0))) __PYX_ERR(0, 364, __pyx_L1_error)
- __pyx_t_3 = (!__pyx_t_2);
- if (__pyx_t_3) {
-
- /* "fontTools/varLib/iup.py":365
- * k %= n
- * if not k:
- * return s # <<<<<<<<<<<<<<
- * return {(v + k) % n for v in s}
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_s);
- __pyx_r = __pyx_v_s;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":364
- * def _rot_set(s: set, k: int, n: int):
- * k %= n
- * if not k: # <<<<<<<<<<<<<<
- * return s
- * return {(v + k) % n for v in s}
- */
- }
-
- /* "fontTools/varLib/iup.py":366
- * if not k:
- * return s
- * return {(v + k) % n for v in s} # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- { /* enter inner scope */
- __pyx_t_1 = PySet_New(NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_1);
- __pyx_t_5 = 0;
- __pyx_t_8 = __Pyx_set_iterator(__pyx_v_s, 1, (&__pyx_t_6), (&__pyx_t_7)); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_4);
- __pyx_t_4 = __pyx_t_8;
- __pyx_t_8 = 0;
- while (1) {
- __pyx_t_9 = __Pyx_set_iter_next(__pyx_t_4, __pyx_t_6, &__pyx_t_5, &__pyx_t_8, __pyx_t_7);
- if (unlikely(__pyx_t_9 == 0)) break;
- if (unlikely(__pyx_t_9 == -1)) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_XDECREF_SET(__pyx_8genexpr2__pyx_v_v, __pyx_t_8);
- __pyx_t_8 = 0;
- __pyx_t_8 = PyNumber_Add(__pyx_8genexpr2__pyx_v_v, __pyx_v_k); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_10 = PyNumber_Remainder(__pyx_t_8, __pyx_v_n); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_GOTREF(__pyx_t_10);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- if (unlikely(PySet_Add(__pyx_t_1, (PyObject*)__pyx_t_10))) __PYX_ERR(0, 366, __pyx_L6_error)
- __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0;
- }
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_v); __pyx_8genexpr2__pyx_v_v = 0;
- goto __pyx_L9_exit_scope;
- __pyx_L6_error:;
- __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_v); __pyx_8genexpr2__pyx_v_v = 0;
- goto __pyx_L1_error;
- __pyx_L9_exit_scope:;
- } /* exit inner scope */
- __pyx_r = __pyx_t_1;
- __pyx_t_1 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":362
- *
- *
- * def _rot_set(s: set, k: int, n: int): # <<<<<<<<<<<<<<
- * k %= n
- * if not k:
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_10);
- __Pyx_AddTraceback("fontTools.varLib.iup._rot_set", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_8genexpr2__pyx_v_v);
- __Pyx_XDECREF(__pyx_v_k);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_13iup_contour_optimize(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_12iup_contour_optimize, "iup_contour_optimize(deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0) -> _DeltaOrNoneSegment\nFor contour with coordinates `coords`, optimize a set of delta\n values `deltas` within error `tolerance`.\n\n Returns delta vector that has most number of None items instead of\n the input delta.\n ");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_13iup_contour_optimize = {"iup_contour_optimize", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_13iup_contour_optimize, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_12iup_contour_optimize};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_13iup_contour_optimize(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- PyObject *__pyx_v_tolerance = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("iup_contour_optimize (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,&__pyx_n_s_tolerance,0};
- PyObject* values[3] = {0,0,0};
- values[2] = ((PyObject *)((PyObject*)__pyx_float_0_0));
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 369, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 369, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_contour_optimize", 0, 2, 3, 1); __PYX_ERR(0, 369, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_tolerance);
- if (value) { values[2] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 369, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "iup_contour_optimize") < 0)) __PYX_ERR(0, 369, __pyx_L3_error)
- }
- } else {
- switch (__pyx_nargs) {
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- break;
- default: goto __pyx_L5_argtuple_error;
- }
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- __pyx_v_tolerance = values[2];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("iup_contour_optimize", 0, 2, 3, __pyx_nargs); __PYX_ERR(0, 369, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour_optimize", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_12iup_contour_optimize(__pyx_self, __pyx_v_deltas, __pyx_v_coords, __pyx_v_tolerance);
-
- /* function exit code */
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_2generator1(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */
-
-/* "fontTools/varLib/iup.py":384
- *
- * # If all are within tolerance distance of 0, encode nothing:
- * if all(abs(complex(*p)) <= tolerance for p in deltas): # <<<<<<<<<<<<<<
- * return [None] * n
- *
- */
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_genexpr(PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *__pyx_cur_scope;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("genexpr", 0);
- __pyx_cur_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr, __pyx_empty_tuple, NULL);
- if (unlikely(!__pyx_cur_scope)) {
- __pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)Py_None);
- __Pyx_INCREF(Py_None);
- __PYX_ERR(0, 384, __pyx_L1_error)
- } else {
- __Pyx_GOTREF((PyObject *)__pyx_cur_scope);
- }
- __pyx_cur_scope->__pyx_outer_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *) __pyx_self;
- __Pyx_INCREF((PyObject *)__pyx_cur_scope->__pyx_outer_scope);
- __Pyx_GIVEREF((PyObject *)__pyx_cur_scope->__pyx_outer_scope);
- __pyx_cur_scope->__pyx_genexpr_arg_0 = __pyx_genexpr_arg_0;
- __Pyx_INCREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- __Pyx_GIVEREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- {
- __pyx_CoroutineObject *gen = __Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_2generator1, NULL, (PyObject *) __pyx_cur_scope, __pyx_n_s_genexpr, __pyx_n_s_iup_contour_optimize_locals_gene, __pyx_n_s_fontTools_varLib_iup); if (unlikely(!gen)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_DECREF(__pyx_cur_scope);
- __Pyx_RefNannyFinishContext();
- return (PyObject *) gen;
- }
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour_optimize.genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __Pyx_DECREF((PyObject *)__pyx_cur_scope);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_2generator1(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */
-{
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *__pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)__pyx_generator->closure);
- PyObject *__pyx_r = NULL;
- PyObject *__pyx_t_1 = NULL;
- Py_ssize_t __pyx_t_2;
- PyObject *(*__pyx_t_3)(PyObject *);
- PyObject *__pyx_t_4 = NULL;
- PyObject *__pyx_t_5 = NULL;
- int __pyx_t_6;
- int __pyx_t_7;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("genexpr", 0);
- switch (__pyx_generator->resume_label) {
- case 0: goto __pyx_L3_first_run;
- default: /* CPython raises the right error here */
- __Pyx_RefNannyFinishContext();
- return NULL;
- }
- __pyx_L3_first_run:;
- if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 384, __pyx_L1_error)
- if (unlikely(!__pyx_cur_scope->__pyx_genexpr_arg_0)) { __Pyx_RaiseUnboundLocalError(".0"); __PYX_ERR(0, 384, __pyx_L1_error) }
- if (likely(PyList_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) || PyTuple_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) {
- __pyx_t_1 = __pyx_cur_scope->__pyx_genexpr_arg_0; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0;
- __pyx_t_3 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_1 = PyObject_GetIter(__pyx_cur_scope->__pyx_genexpr_arg_0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __pyx_t_3 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 384, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_3)) {
- if (likely(PyList_CheckExact(__pyx_t_1))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 384, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyTuple_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 384, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- }
- } else {
- __pyx_t_4 = __pyx_t_3(__pyx_t_1);
- if (unlikely(!__pyx_t_4)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 384, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_4);
- }
- __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_p);
- __Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_p, __pyx_t_4);
- __Pyx_GIVEREF(__pyx_t_4);
- __pyx_t_4 = 0;
- __pyx_t_4 = __Pyx_PySequence_Tuple(__pyx_cur_scope->__pyx_v_p); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __pyx_t_5 = __Pyx_PyObject_Call(((PyObject *)(&PyComplex_Type)), __pyx_t_4, NULL); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_4 = __Pyx_PyNumber_Absolute(__pyx_t_5); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- if (unlikely(!__pyx_cur_scope->__pyx_outer_scope->__pyx_v_tolerance)) { __Pyx_RaiseClosureNameError("tolerance"); __PYX_ERR(0, 384, __pyx_L1_error) }
- __pyx_t_5 = PyObject_RichCompare(__pyx_t_4, __pyx_cur_scope->__pyx_outer_scope->__pyx_v_tolerance, Py_LE); __Pyx_XGOTREF(__pyx_t_5); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_6 = __Pyx_PyObject_IsTrue(__pyx_t_5); if (unlikely((__pyx_t_6 < 0))) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_t_7 = (!__pyx_t_6);
- if (__pyx_t_7) {
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(Py_False);
- __pyx_r = Py_False;
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- goto __pyx_L0;
- }
- }
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- /*else*/ {
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(Py_True);
- __pyx_r = Py_True;
- goto __pyx_L0;
- }
- CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __Pyx_Generator_Replace_StopIteration(0);
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_AddTraceback("genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_L0:;
- __Pyx_XGIVEREF(__pyx_r);
- #if !CYTHON_USE_EXC_INFO_STACK
- __Pyx_Coroutine_ResetAndClearException(__pyx_generator);
- #endif
- __pyx_generator->resume_label = -1;
- __Pyx_Coroutine_clear((PyObject*)__pyx_generator);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_5generator2(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */
-
-/* "fontTools/varLib/iup.py":393
- * # If all deltas are exactly the same, return just one (the first one):
- * d0 = deltas[0]
- * if all(d0 == d for d in deltas): # <<<<<<<<<<<<<<
- * return [d0] + [None] * (n - 1)
- *
- */
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_3genexpr(PyObject *__pyx_self, PyObject *__pyx_genexpr_arg_0) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *__pyx_cur_scope;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("genexpr", 0);
- __pyx_cur_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr, __pyx_empty_tuple, NULL);
- if (unlikely(!__pyx_cur_scope)) {
- __pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)Py_None);
- __Pyx_INCREF(Py_None);
- __PYX_ERR(0, 393, __pyx_L1_error)
- } else {
- __Pyx_GOTREF((PyObject *)__pyx_cur_scope);
- }
- __pyx_cur_scope->__pyx_outer_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *) __pyx_self;
- __Pyx_INCREF((PyObject *)__pyx_cur_scope->__pyx_outer_scope);
- __Pyx_GIVEREF((PyObject *)__pyx_cur_scope->__pyx_outer_scope);
- __pyx_cur_scope->__pyx_genexpr_arg_0 = __pyx_genexpr_arg_0;
- __Pyx_INCREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- __Pyx_GIVEREF(__pyx_cur_scope->__pyx_genexpr_arg_0);
- {
- __pyx_CoroutineObject *gen = __Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_5generator2, NULL, (PyObject *) __pyx_cur_scope, __pyx_n_s_genexpr, __pyx_n_s_iup_contour_optimize_locals_gene, __pyx_n_s_fontTools_varLib_iup); if (unlikely(!gen)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_DECREF(__pyx_cur_scope);
- __Pyx_RefNannyFinishContext();
- return (PyObject *) gen;
- }
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour_optimize.genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __Pyx_DECREF((PyObject *)__pyx_cur_scope);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_5generator2(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */
-{
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *__pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)__pyx_generator->closure);
- PyObject *__pyx_r = NULL;
- PyObject *__pyx_t_1 = NULL;
- Py_ssize_t __pyx_t_2;
- PyObject *(*__pyx_t_3)(PyObject *);
- PyObject *__pyx_t_4 = NULL;
- int __pyx_t_5;
- int __pyx_t_6;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("genexpr", 0);
- switch (__pyx_generator->resume_label) {
- case 0: goto __pyx_L3_first_run;
- default: /* CPython raises the right error here */
- __Pyx_RefNannyFinishContext();
- return NULL;
- }
- __pyx_L3_first_run:;
- if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 393, __pyx_L1_error)
- if (unlikely(!__pyx_cur_scope->__pyx_genexpr_arg_0)) { __Pyx_RaiseUnboundLocalError(".0"); __PYX_ERR(0, 393, __pyx_L1_error) }
- if (likely(PyList_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) || PyTuple_CheckExact(__pyx_cur_scope->__pyx_genexpr_arg_0)) {
- __pyx_t_1 = __pyx_cur_scope->__pyx_genexpr_arg_0; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0;
- __pyx_t_3 = NULL;
- } else {
- __pyx_t_2 = -1; __pyx_t_1 = PyObject_GetIter(__pyx_cur_scope->__pyx_genexpr_arg_0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_1);
- __pyx_t_3 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 393, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_3)) {
- if (likely(PyList_CheckExact(__pyx_t_1))) {
- if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 393, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- } else {
- if (__pyx_t_2 >= PyTuple_GET_SIZE(__pyx_t_1)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_4 = PyTuple_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_4); __pyx_t_2++; if (unlikely((0 < 0))) __PYX_ERR(0, 393, __pyx_L1_error)
- #else
- __pyx_t_4 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_4);
- #endif
- }
- } else {
- __pyx_t_4 = __pyx_t_3(__pyx_t_1);
- if (unlikely(!__pyx_t_4)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 393, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_4);
- }
- __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_d);
- __Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_d, __pyx_t_4);
- __Pyx_GIVEREF(__pyx_t_4);
- __pyx_t_4 = 0;
- if (unlikely(!__pyx_cur_scope->__pyx_outer_scope->__pyx_v_d0)) { __Pyx_RaiseClosureNameError("d0"); __PYX_ERR(0, 393, __pyx_L1_error) }
- __pyx_t_4 = PyObject_RichCompare(__pyx_cur_scope->__pyx_outer_scope->__pyx_v_d0, __pyx_cur_scope->__pyx_v_d, Py_EQ); __Pyx_XGOTREF(__pyx_t_4); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 393, __pyx_L1_error)
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_t_4); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0;
- __pyx_t_6 = (!__pyx_t_5);
- if (__pyx_t_6) {
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(Py_False);
- __pyx_r = Py_False;
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- goto __pyx_L0;
- }
- }
- __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0;
- /*else*/ {
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(Py_True);
- __pyx_r = Py_True;
- goto __pyx_L0;
- }
- CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope);
-
- /* function exit code */
- goto __pyx_L0;
- __pyx_L1_error:;
- __Pyx_Generator_Replace_StopIteration(0);
- __Pyx_XDECREF(__pyx_t_1);
- __Pyx_XDECREF(__pyx_t_4);
- __Pyx_AddTraceback("genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_L0:;
- __Pyx_XGIVEREF(__pyx_r);
- #if !CYTHON_USE_EXC_INFO_STACK
- __Pyx_Coroutine_ResetAndClearException(__pyx_generator);
- #endif
- __pyx_generator->resume_label = -1;
- __Pyx_Coroutine_clear((PyObject*)__pyx_generator);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_12iup_contour_optimize(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_tolerance) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *__pyx_cur_scope;
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_v_forced = NULL;
- PyObject *__pyx_v_k = NULL;
- PyObject *__pyx_v_chain = NULL;
- PyObject *__pyx_v_costs = NULL;
- PyObject *__pyx_v_solution = NULL;
- PyObject *__pyx_v_i = NULL;
- PyObject *__pyx_v_best_sol = NULL;
- PyObject *__pyx_v_best_cost = NULL;
- PyObject *__pyx_v_start = NULL;
- PyObject *__pyx_v_cost = NULL;
- PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_2generator1 = 0;
- PyObject *__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_5generator2 = 0;
- PyObject *__pyx_8genexpr5__pyx_v_i = NULL;
- PyObject *__pyx_8genexpr6__pyx_v_i = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- Py_ssize_t __pyx_t_1;
- PyObject *__pyx_t_2 = NULL;
- PyObject *__pyx_t_3 = NULL;
- int __pyx_t_4;
- PyObject *__pyx_t_5 = NULL;
- int __pyx_t_6;
- PyObject *__pyx_t_7 = NULL;
- PyObject *(*__pyx_t_8)(PyObject *);
- int __pyx_t_9;
- PyObject *(*__pyx_t_10)(PyObject *);
- PyObject *__pyx_t_11 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("iup_contour_optimize", 0);
- __pyx_cur_scope = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize, __pyx_empty_tuple, NULL);
- if (unlikely(!__pyx_cur_scope)) {
- __pyx_cur_scope = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)Py_None);
- __Pyx_INCREF(Py_None);
- __PYX_ERR(0, 369, __pyx_L1_error)
- } else {
- __Pyx_GOTREF((PyObject *)__pyx_cur_scope);
- }
- __pyx_cur_scope->__pyx_v_tolerance = __pyx_v_tolerance;
- __Pyx_INCREF(__pyx_cur_scope->__pyx_v_tolerance);
- __Pyx_GIVEREF(__pyx_cur_scope->__pyx_v_tolerance);
- __Pyx_INCREF(__pyx_v_deltas);
- __Pyx_INCREF(__pyx_v_coords);
-
- /* "fontTools/varLib/iup.py":379
- * """
- *
- * n = len(deltas) # <<<<<<<<<<<<<<
- *
- * # Get the easy cases out of the way:
- */
- __pyx_t_1 = PyObject_Length(__pyx_v_deltas); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 379, __pyx_L1_error)
- __pyx_t_2 = PyInt_FromSsize_t(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 379, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_n = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":384
- *
- * # If all are within tolerance distance of 0, encode nothing:
- * if all(abs(complex(*p)) <= tolerance for p in deltas): # <<<<<<<<<<<<<<
- * return [None] * n
- *
- */
- __pyx_t_2 = __pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_genexpr(((PyObject*)__pyx_cur_scope), __pyx_v_deltas); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_3 = __Pyx_Generator_Next(__pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 384, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":385
- * # If all are within tolerance distance of 0, encode nothing:
- * if all(abs(complex(*p)) <= tolerance for p in deltas):
- * return [None] * n # <<<<<<<<<<<<<<
- *
- * # If there's exactly one point, return it:
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_3 = PyList_New(1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 385, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyList_SET_ITEM(__pyx_t_3, 0, Py_None);
- { PyObject* __pyx_temp = PyNumber_InPlaceMultiply(__pyx_t_3, __pyx_v_n); if (unlikely(!__pyx_temp)) __PYX_ERR(0, 385, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_temp);
- __Pyx_DECREF(__pyx_t_3);
- __pyx_t_3 = __pyx_temp;
- }
- __pyx_r = __pyx_t_3;
- __pyx_t_3 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":384
- *
- * # If all are within tolerance distance of 0, encode nothing:
- * if all(abs(complex(*p)) <= tolerance for p in deltas): # <<<<<<<<<<<<<<
- * return [None] * n
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":388
- *
- * # If there's exactly one point, return it:
- * if n == 1: # <<<<<<<<<<<<<<
- * return deltas
- *
- */
- __pyx_t_4 = (__Pyx_PyInt_BoolEqObjC(__pyx_v_n, __pyx_int_1, 1, 0)); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 388, __pyx_L1_error)
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":389
- * # If there's exactly one point, return it:
- * if n == 1:
- * return deltas # <<<<<<<<<<<<<<
- *
- * # If all deltas are exactly the same, return just one (the first one):
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_deltas);
- __pyx_r = __pyx_v_deltas;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":388
- *
- * # If there's exactly one point, return it:
- * if n == 1: # <<<<<<<<<<<<<<
- * return deltas
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":392
- *
- * # If all deltas are exactly the same, return just one (the first one):
- * d0 = deltas[0] # <<<<<<<<<<<<<<
- * if all(d0 == d for d in deltas):
- * return [d0] + [None] * (n - 1)
- */
- __pyx_t_3 = __Pyx_GetItemInt(__pyx_v_deltas, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 392, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_GIVEREF(__pyx_t_3);
- __pyx_cur_scope->__pyx_v_d0 = __pyx_t_3;
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":393
- * # If all deltas are exactly the same, return just one (the first one):
- * d0 = deltas[0]
- * if all(d0 == d for d in deltas): # <<<<<<<<<<<<<<
- * return [d0] + [None] * (n - 1)
- *
- */
- __pyx_t_3 = __pyx_pf_9fontTools_6varLib_3iup_20iup_contour_optimize_3genexpr(((PyObject*)__pyx_cur_scope), __pyx_v_deltas); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_2 = __Pyx_Generator_Next(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_2); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 393, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":394
- * d0 = deltas[0]
- * if all(d0 == d for d in deltas):
- * return [d0] + [None] * (n - 1) # <<<<<<<<<<<<<<
- *
- * # Else, solve the general problem using Dynamic Programming.
- */
- __Pyx_XDECREF(__pyx_r);
- __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 394, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_INCREF(__pyx_cur_scope->__pyx_v_d0);
- __Pyx_GIVEREF(__pyx_cur_scope->__pyx_v_d0);
- PyList_SET_ITEM(__pyx_t_2, 0, __pyx_cur_scope->__pyx_v_d0);
- __pyx_t_3 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 394, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = PyList_New(1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 394, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyList_SET_ITEM(__pyx_t_5, 0, Py_None);
- { PyObject* __pyx_temp = PyNumber_InPlaceMultiply(__pyx_t_5, __pyx_t_3); if (unlikely(!__pyx_temp)) __PYX_ERR(0, 394, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_temp);
- __Pyx_DECREF(__pyx_t_5);
- __pyx_t_5 = __pyx_temp;
- }
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = PyNumber_Add(__pyx_t_2, __pyx_t_5); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 394, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_r = __pyx_t_3;
- __pyx_t_3 = 0;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":393
- * # If all deltas are exactly the same, return just one (the first one):
- * d0 = deltas[0]
- * if all(d0 == d for d in deltas): # <<<<<<<<<<<<<<
- * return [d0] + [None] * (n - 1)
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":398
- * # Else, solve the general problem using Dynamic Programming.
- *
- * forced = _iup_contour_bound_forced_set(deltas, coords, tolerance) # <<<<<<<<<<<<<<
- * # The _iup_contour_optimize_dp() routine returns the optimal encoding
- * # solution given the constraint that the last point is always encoded.
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_iup_contour_bound_forced_set); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 398, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_2 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_2 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_2)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_2);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[4] = {__pyx_t_2, __pyx_v_deltas, __pyx_v_coords, __pyx_cur_scope->__pyx_v_tolerance};
- __pyx_t_3 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 3+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 398, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __pyx_v_forced = __pyx_t_3;
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":407
- * # if the font size changes (reduced); that would mean the forced-set
- * # has members it should not have.
- * if forced: # <<<<<<<<<<<<<<
- * # Forced set is non-empty: rotate the contour start point
- * # such that the last point in the list is a forced point.
- */
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_v_forced); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 407, __pyx_L1_error)
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":410
- * # Forced set is non-empty: rotate the contour start point
- * # such that the last point in the list is a forced point.
- * k = (n - 1) - max(forced) # <<<<<<<<<<<<<<
- * assert k >= 0
- *
- */
- __pyx_t_3 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 410, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyObject_CallOneArg(__pyx_builtin_max, __pyx_v_forced); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 410, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_2 = PyNumber_Subtract(__pyx_t_3, __pyx_t_5); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 410, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __pyx_v_k = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":411
- * # such that the last point in the list is a forced point.
- * k = (n - 1) - max(forced)
- * assert k >= 0 # <<<<<<<<<<<<<<
- *
- * deltas = _rot_list(deltas, k)
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_2 = PyObject_RichCompare(__pyx_v_k, __pyx_int_0, Py_GE); __Pyx_XGOTREF(__pyx_t_2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 411, __pyx_L1_error)
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_2); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 411, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (unlikely(!__pyx_t_4)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 411, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 411, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":413
- * assert k >= 0
- *
- * deltas = _rot_list(deltas, k) # <<<<<<<<<<<<<<
- * coords = _rot_list(coords, k)
- * forced = _rot_set(forced, k, n)
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_rot_list); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 413, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_3 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_3)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[3] = {__pyx_t_3, __pyx_v_deltas, __pyx_v_k};
- __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 2+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 413, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __Pyx_DECREF_SET(__pyx_v_deltas, __pyx_t_2);
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":414
- *
- * deltas = _rot_list(deltas, k)
- * coords = _rot_list(coords, k) # <<<<<<<<<<<<<<
- * forced = _rot_set(forced, k, n)
- *
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_rot_list); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 414, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_3 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_3)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[3] = {__pyx_t_3, __pyx_v_coords, __pyx_v_k};
- __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 2+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 414, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __Pyx_DECREF_SET(__pyx_v_coords, __pyx_t_2);
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":415
- * deltas = _rot_list(deltas, k)
- * coords = _rot_list(coords, k)
- * forced = _rot_set(forced, k, n) # <<<<<<<<<<<<<<
- *
- * # Debugging: Pass a set() instead of forced variable to the next call
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_rot_set); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 415, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_3 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_3)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[4] = {__pyx_t_3, __pyx_v_forced, __pyx_v_k, __pyx_v_n};
- __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 3+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 415, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __Pyx_DECREF_SET(__pyx_v_forced, __pyx_t_2);
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":419
- * # Debugging: Pass a set() instead of forced variable to the next call
- * # to exercise forced-set computation for under-counting.
- * chain, costs = _iup_contour_optimize_dp(deltas, coords, forced, tolerance) # <<<<<<<<<<<<<<
- *
- * # Assemble solution.
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_iup_contour_optimize_dp); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 419, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_3 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_3)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_3);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[5] = {__pyx_t_3, __pyx_v_deltas, __pyx_v_coords, __pyx_v_forced, __pyx_cur_scope->__pyx_v_tolerance};
- __pyx_t_2 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 4+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 419, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- if ((likely(PyTuple_CheckExact(__pyx_t_2))) || (PyList_CheckExact(__pyx_t_2))) {
- PyObject* sequence = __pyx_t_2;
- Py_ssize_t size = __Pyx_PySequence_SIZE(sequence);
- if (unlikely(size != 2)) {
- if (size > 2) __Pyx_RaiseTooManyValuesError(2);
- else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size);
- __PYX_ERR(0, 419, __pyx_L1_error)
- }
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- if (likely(PyTuple_CheckExact(sequence))) {
- __pyx_t_5 = PyTuple_GET_ITEM(sequence, 0);
- __pyx_t_3 = PyTuple_GET_ITEM(sequence, 1);
- } else {
- __pyx_t_5 = PyList_GET_ITEM(sequence, 0);
- __pyx_t_3 = PyList_GET_ITEM(sequence, 1);
- }
- __Pyx_INCREF(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_3);
- #else
- __pyx_t_5 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 419, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_3 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 419, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- } else {
- Py_ssize_t index = -1;
- __pyx_t_7 = PyObject_GetIter(__pyx_t_2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 419, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_7);
- index = 0; __pyx_t_5 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_5)) goto __pyx_L7_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_5);
- index = 1; __pyx_t_3 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_3)) goto __pyx_L7_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_3);
- if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_7), 2) < 0) __PYX_ERR(0, 419, __pyx_L1_error)
- __pyx_t_8 = NULL;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- goto __pyx_L8_unpacking_done;
- __pyx_L7_unpacking_failed:;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_8 = NULL;
- if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index);
- __PYX_ERR(0, 419, __pyx_L1_error)
- __pyx_L8_unpacking_done:;
- }
- __pyx_v_chain = __pyx_t_5;
- __pyx_t_5 = 0;
- __pyx_v_costs = __pyx_t_3;
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":422
- *
- * # Assemble solution.
- * solution = set() # <<<<<<<<<<<<<<
- * i = n - 1
- * while i is not None:
- */
- __pyx_t_2 = PySet_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 422, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_solution = ((PyObject*)__pyx_t_2);
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":423
- * # Assemble solution.
- * solution = set()
- * i = n - 1 # <<<<<<<<<<<<<<
- * while i is not None:
- * solution.add(i)
- */
- __pyx_t_2 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 423, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_i = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":424
- * solution = set()
- * i = n - 1
- * while i is not None: # <<<<<<<<<<<<<<
- * solution.add(i)
- * i = chain[i]
- */
- while (1) {
- __pyx_t_4 = (__pyx_v_i != Py_None);
- if (!__pyx_t_4) break;
-
- /* "fontTools/varLib/iup.py":425
- * i = n - 1
- * while i is not None:
- * solution.add(i) # <<<<<<<<<<<<<<
- * i = chain[i]
- * solution.remove(-1)
- */
- __pyx_t_9 = PySet_Add(__pyx_v_solution, __pyx_v_i); if (unlikely(__pyx_t_9 == ((int)-1))) __PYX_ERR(0, 425, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":426
- * while i is not None:
- * solution.add(i)
- * i = chain[i] # <<<<<<<<<<<<<<
- * solution.remove(-1)
- *
- */
- __pyx_t_2 = __Pyx_PyObject_GetItem(__pyx_v_chain, __pyx_v_i); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 426, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF_SET(__pyx_v_i, __pyx_t_2);
- __pyx_t_2 = 0;
- }
-
- /* "fontTools/varLib/iup.py":427
- * solution.add(i)
- * i = chain[i]
- * solution.remove(-1) # <<<<<<<<<<<<<<
- *
- * # if not forced <= solution:
- */
- __pyx_t_9 = __Pyx_PySet_Remove(__pyx_v_solution, __pyx_int_neg_1); if (unlikely(__pyx_t_9 == ((int)-1))) __PYX_ERR(0, 427, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":433
- * # print("deltas", deltas)
- * # print("len", len(deltas))
- * assert forced <= solution, (forced, solution) # <<<<<<<<<<<<<<
- *
- * deltas = [deltas[i] if i in solution else None for i in range(n)]
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_2 = PyObject_RichCompare(__pyx_v_forced, __pyx_v_solution, Py_LE); __Pyx_XGOTREF(__pyx_t_2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 433, __pyx_L1_error)
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_2); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 433, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (unlikely(!__pyx_t_4)) {
- __pyx_t_2 = PyTuple_New(2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 433, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_INCREF(__pyx_v_forced);
- __Pyx_GIVEREF(__pyx_v_forced);
- PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v_forced);
- __Pyx_INCREF(__pyx_v_solution);
- __Pyx_GIVEREF(__pyx_v_solution);
- PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_v_solution);
- __pyx_t_3 = PyTuple_Pack(1, __pyx_t_2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 433, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_Raise(__pyx_builtin_AssertionError, __pyx_t_3, 0, 0);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __PYX_ERR(0, 433, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 433, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":435
- * assert forced <= solution, (forced, solution)
- *
- * deltas = [deltas[i] if i in solution else None for i in range(n)] # <<<<<<<<<<<<<<
- *
- * deltas = _rot_list(deltas, -k)
- */
- { /* enter inner scope */
- __pyx_t_3 = PyList_New(0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_2 = __Pyx_PyObject_CallOneArg(__pyx_builtin_range, __pyx_v_n); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_2);
- if (likely(PyList_CheckExact(__pyx_t_2)) || PyTuple_CheckExact(__pyx_t_2)) {
- __pyx_t_5 = __pyx_t_2; __Pyx_INCREF(__pyx_t_5); __pyx_t_1 = 0;
- __pyx_t_10 = NULL;
- } else {
- __pyx_t_1 = -1; __pyx_t_5 = PyObject_GetIter(__pyx_t_2); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_10 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_5); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 435, __pyx_L13_error)
- }
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- for (;;) {
- if (likely(!__pyx_t_10)) {
- if (likely(PyList_CheckExact(__pyx_t_5))) {
- if (__pyx_t_1 >= PyList_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_2 = PyList_GET_ITEM(__pyx_t_5, __pyx_t_1); __Pyx_INCREF(__pyx_t_2); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 435, __pyx_L13_error)
- #else
- __pyx_t_2 = PySequence_ITEM(__pyx_t_5, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_2);
- #endif
- } else {
- if (__pyx_t_1 >= PyTuple_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_2 = PyTuple_GET_ITEM(__pyx_t_5, __pyx_t_1); __Pyx_INCREF(__pyx_t_2); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 435, __pyx_L13_error)
- #else
- __pyx_t_2 = PySequence_ITEM(__pyx_t_5, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_2);
- #endif
- }
- } else {
- __pyx_t_2 = __pyx_t_10(__pyx_t_5);
- if (unlikely(!__pyx_t_2)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 435, __pyx_L13_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_2);
- }
- __Pyx_XDECREF_SET(__pyx_8genexpr5__pyx_v_i, __pyx_t_2);
- __pyx_t_2 = 0;
- __pyx_t_4 = (__Pyx_PySet_ContainsTF(__pyx_8genexpr5__pyx_v_i, __pyx_v_solution, Py_EQ)); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 435, __pyx_L13_error)
- if (__pyx_t_4) {
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_8genexpr5__pyx_v_i); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_2 = __pyx_t_7;
- __pyx_t_7 = 0;
- } else {
- __Pyx_INCREF(Py_None);
- __pyx_t_2 = Py_None;
- }
- if (unlikely(__Pyx_ListComp_Append(__pyx_t_3, (PyObject*)__pyx_t_2))) __PYX_ERR(0, 435, __pyx_L13_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- }
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_i); __pyx_8genexpr5__pyx_v_i = 0;
- goto __pyx_L17_exit_scope;
- __pyx_L13_error:;
- __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_i); __pyx_8genexpr5__pyx_v_i = 0;
- goto __pyx_L1_error;
- __pyx_L17_exit_scope:;
- } /* exit inner scope */
- __Pyx_DECREF_SET(__pyx_v_deltas, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":437
- * deltas = [deltas[i] if i in solution else None for i in range(n)]
- *
- * deltas = _rot_list(deltas, -k) # <<<<<<<<<<<<<<
- * else:
- * # Repeat the contour an extra time, solve the new case, then look for solutions of the
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_rot_list); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 437, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_2 = PyNumber_Negative(__pyx_v_k); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 437, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_7 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_7 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_7)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_7);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[3] = {__pyx_t_7, __pyx_v_deltas, __pyx_t_2};
- __pyx_t_3 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 2+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 437, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __Pyx_DECREF_SET(__pyx_v_deltas, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":407
- * # if the font size changes (reduced); that would mean the forced-set
- * # has members it should not have.
- * if forced: # <<<<<<<<<<<<<<
- * # Forced set is non-empty: rotate the contour start point
- * # such that the last point in the list is a forced point.
- */
- goto __pyx_L6;
- }
-
- /* "fontTools/varLib/iup.py":442
- * # circular n-length problem in the solution for new linear case. I cannot prove that
- * # this always produces the optimal solution...
- * chain, costs = _iup_contour_optimize_dp( # <<<<<<<<<<<<<<
- * deltas + deltas, coords + coords, forced, tolerance, n
- * )
- */
- /*else*/ {
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_iup_contour_optimize_dp); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 442, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
-
- /* "fontTools/varLib/iup.py":443
- * # this always produces the optimal solution...
- * chain, costs = _iup_contour_optimize_dp(
- * deltas + deltas, coords + coords, forced, tolerance, n # <<<<<<<<<<<<<<
- * )
- * best_sol, best_cost = None, n + 1
- */
- __pyx_t_2 = PyNumber_Add(__pyx_v_deltas, __pyx_v_deltas); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 443, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_7 = PyNumber_Add(__pyx_v_coords, __pyx_v_coords); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 443, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_11 = NULL;
- __pyx_t_6 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_5))) {
- __pyx_t_11 = PyMethod_GET_SELF(__pyx_t_5);
- if (likely(__pyx_t_11)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_11);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_5, function);
- __pyx_t_6 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[6] = {__pyx_t_11, __pyx_t_2, __pyx_t_7, __pyx_v_forced, __pyx_cur_scope->__pyx_v_tolerance, __pyx_v_n};
- __pyx_t_3 = __Pyx_PyObject_FastCall(__pyx_t_5, __pyx_callargs+1-__pyx_t_6, 5+__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 442, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- if ((likely(PyTuple_CheckExact(__pyx_t_3))) || (PyList_CheckExact(__pyx_t_3))) {
- PyObject* sequence = __pyx_t_3;
- Py_ssize_t size = __Pyx_PySequence_SIZE(sequence);
- if (unlikely(size != 2)) {
- if (size > 2) __Pyx_RaiseTooManyValuesError(2);
- else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size);
- __PYX_ERR(0, 442, __pyx_L1_error)
- }
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- if (likely(PyTuple_CheckExact(sequence))) {
- __pyx_t_5 = PyTuple_GET_ITEM(sequence, 0);
- __pyx_t_7 = PyTuple_GET_ITEM(sequence, 1);
- } else {
- __pyx_t_5 = PyList_GET_ITEM(sequence, 0);
- __pyx_t_7 = PyList_GET_ITEM(sequence, 1);
- }
- __Pyx_INCREF(__pyx_t_5);
- __Pyx_INCREF(__pyx_t_7);
- #else
- __pyx_t_5 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 442, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_7 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 442, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- #endif
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- } else {
- Py_ssize_t index = -1;
- __pyx_t_2 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 442, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_8 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_2);
- index = 0; __pyx_t_5 = __pyx_t_8(__pyx_t_2); if (unlikely(!__pyx_t_5)) goto __pyx_L18_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_5);
- index = 1; __pyx_t_7 = __pyx_t_8(__pyx_t_2); if (unlikely(!__pyx_t_7)) goto __pyx_L18_unpacking_failed;
- __Pyx_GOTREF(__pyx_t_7);
- if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_2), 2) < 0) __PYX_ERR(0, 442, __pyx_L1_error)
- __pyx_t_8 = NULL;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- goto __pyx_L19_unpacking_done;
- __pyx_L18_unpacking_failed:;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_8 = NULL;
- if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index);
- __PYX_ERR(0, 442, __pyx_L1_error)
- __pyx_L19_unpacking_done:;
- }
-
- /* "fontTools/varLib/iup.py":442
- * # circular n-length problem in the solution for new linear case. I cannot prove that
- * # this always produces the optimal solution...
- * chain, costs = _iup_contour_optimize_dp( # <<<<<<<<<<<<<<
- * deltas + deltas, coords + coords, forced, tolerance, n
- * )
- */
- __pyx_v_chain = __pyx_t_5;
- __pyx_t_5 = 0;
- __pyx_v_costs = __pyx_t_7;
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":445
- * deltas + deltas, coords + coords, forced, tolerance, n
- * )
- * best_sol, best_cost = None, n + 1 # <<<<<<<<<<<<<<
- *
- * for start in range(n - 1, len(costs) - 1):
- */
- __pyx_t_3 = Py_None;
- __Pyx_INCREF(__pyx_t_3);
- __pyx_t_7 = __Pyx_PyInt_AddObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 445, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_v_best_sol = ((PyObject*)__pyx_t_3);
- __pyx_t_3 = 0;
- __pyx_v_best_cost = __pyx_t_7;
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":447
- * best_sol, best_cost = None, n + 1
- *
- * for start in range(n - 1, len(costs) - 1): # <<<<<<<<<<<<<<
- * # Assemble solution.
- * solution = set()
- */
- __pyx_t_7 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_1 = PyObject_Length(__pyx_v_costs); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 447, __pyx_L1_error)
- __pyx_t_3 = PyInt_FromSsize_t((__pyx_t_1 - 1)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = PyTuple_New(2); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GIVEREF(__pyx_t_7);
- PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_t_7);
- __Pyx_GIVEREF(__pyx_t_3);
- PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_t_3);
- __pyx_t_7 = 0;
- __pyx_t_3 = 0;
- __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_range, __pyx_t_5, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- if (likely(PyList_CheckExact(__pyx_t_3)) || PyTuple_CheckExact(__pyx_t_3)) {
- __pyx_t_5 = __pyx_t_3; __Pyx_INCREF(__pyx_t_5); __pyx_t_1 = 0;
- __pyx_t_10 = NULL;
- } else {
- __pyx_t_1 = -1; __pyx_t_5 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_10 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_5); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 447, __pyx_L1_error)
- }
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- for (;;) {
- if (likely(!__pyx_t_10)) {
- if (likely(PyList_CheckExact(__pyx_t_5))) {
- if (__pyx_t_1 >= PyList_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_3 = PyList_GET_ITEM(__pyx_t_5, __pyx_t_1); __Pyx_INCREF(__pyx_t_3); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 447, __pyx_L1_error)
- #else
- __pyx_t_3 = PySequence_ITEM(__pyx_t_5, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- } else {
- if (__pyx_t_1 >= PyTuple_GET_SIZE(__pyx_t_5)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_3 = PyTuple_GET_ITEM(__pyx_t_5, __pyx_t_1); __Pyx_INCREF(__pyx_t_3); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 447, __pyx_L1_error)
- #else
- __pyx_t_3 = PySequence_ITEM(__pyx_t_5, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 447, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- #endif
- }
- } else {
- __pyx_t_3 = __pyx_t_10(__pyx_t_5);
- if (unlikely(!__pyx_t_3)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 447, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_3);
- }
- __Pyx_XDECREF_SET(__pyx_v_start, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":449
- * for start in range(n - 1, len(costs) - 1):
- * # Assemble solution.
- * solution = set() # <<<<<<<<<<<<<<
- * i = start
- * while i > start - n:
- */
- __pyx_t_3 = PySet_New(0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 449, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_XDECREF_SET(__pyx_v_solution, ((PyObject*)__pyx_t_3));
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":450
- * # Assemble solution.
- * solution = set()
- * i = start # <<<<<<<<<<<<<<
- * while i > start - n:
- * solution.add(i % n)
- */
- __Pyx_INCREF(__pyx_v_start);
- __Pyx_XDECREF_SET(__pyx_v_i, __pyx_v_start);
-
- /* "fontTools/varLib/iup.py":451
- * solution = set()
- * i = start
- * while i > start - n: # <<<<<<<<<<<<<<
- * solution.add(i % n)
- * i = chain[i]
- */
- while (1) {
- __pyx_t_3 = PyNumber_Subtract(__pyx_v_start, __pyx_v_n); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 451, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = PyObject_RichCompare(__pyx_v_i, __pyx_t_3, Py_GT); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 451, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 451, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (!__pyx_t_4) break;
-
- /* "fontTools/varLib/iup.py":452
- * i = start
- * while i > start - n:
- * solution.add(i % n) # <<<<<<<<<<<<<<
- * i = chain[i]
- * if i == start - n:
- */
- __pyx_t_7 = PyNumber_Remainder(__pyx_v_i, __pyx_v_n); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 452, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = PySet_Add(__pyx_v_solution, __pyx_t_7); if (unlikely(__pyx_t_9 == ((int)-1))) __PYX_ERR(0, 452, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":453
- * while i > start - n:
- * solution.add(i % n)
- * i = chain[i] # <<<<<<<<<<<<<<
- * if i == start - n:
- * cost = costs[start] - costs[start - n]
- */
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_v_chain, __pyx_v_i); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 453, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF_SET(__pyx_v_i, __pyx_t_7);
- __pyx_t_7 = 0;
- }
-
- /* "fontTools/varLib/iup.py":454
- * solution.add(i % n)
- * i = chain[i]
- * if i == start - n: # <<<<<<<<<<<<<<
- * cost = costs[start] - costs[start - n]
- * if cost <= best_cost:
- */
- __pyx_t_7 = PyNumber_Subtract(__pyx_v_start, __pyx_v_n); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 454, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_3 = PyObject_RichCompare(__pyx_v_i, __pyx_t_7, Py_EQ); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 454, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 454, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":455
- * i = chain[i]
- * if i == start - n:
- * cost = costs[start] - costs[start - n] # <<<<<<<<<<<<<<
- * if cost <= best_cost:
- * best_sol, best_cost = solution, cost
- */
- __pyx_t_3 = __Pyx_PyObject_GetItem(__pyx_v_costs, __pyx_v_start); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 455, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = PyNumber_Subtract(__pyx_v_start, __pyx_v_n); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 455, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_2 = __Pyx_PyObject_GetItem(__pyx_v_costs, __pyx_t_7); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 455, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = PyNumber_Subtract(__pyx_t_3, __pyx_t_2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 455, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_XDECREF_SET(__pyx_v_cost, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":456
- * if i == start - n:
- * cost = costs[start] - costs[start - n]
- * if cost <= best_cost: # <<<<<<<<<<<<<<
- * best_sol, best_cost = solution, cost
- *
- */
- __pyx_t_7 = PyObject_RichCompare(__pyx_v_cost, __pyx_v_best_cost, Py_LE); __Pyx_XGOTREF(__pyx_t_7); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 456, __pyx_L1_error)
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_7); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 456, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (__pyx_t_4) {
-
- /* "fontTools/varLib/iup.py":457
- * cost = costs[start] - costs[start - n]
- * if cost <= best_cost:
- * best_sol, best_cost = solution, cost # <<<<<<<<<<<<<<
- *
- * # if not forced <= best_sol:
- */
- __pyx_t_7 = __pyx_v_solution;
- __Pyx_INCREF(__pyx_t_7);
- __pyx_t_2 = __pyx_v_cost;
- __Pyx_INCREF(__pyx_t_2);
- __Pyx_DECREF_SET(__pyx_v_best_sol, ((PyObject*)__pyx_t_7));
- __pyx_t_7 = 0;
- __Pyx_DECREF_SET(__pyx_v_best_cost, __pyx_t_2);
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":456
- * if i == start - n:
- * cost = costs[start] - costs[start - n]
- * if cost <= best_cost: # <<<<<<<<<<<<<<
- * best_sol, best_cost = solution, cost
- *
- */
- }
-
- /* "fontTools/varLib/iup.py":454
- * solution.add(i % n)
- * i = chain[i]
- * if i == start - n: # <<<<<<<<<<<<<<
- * cost = costs[start] - costs[start - n]
- * if cost <= best_cost:
- */
- }
-
- /* "fontTools/varLib/iup.py":447
- * best_sol, best_cost = None, n + 1
- *
- * for start in range(n - 1, len(costs) - 1): # <<<<<<<<<<<<<<
- * # Assemble solution.
- * solution = set()
- */
- }
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":463
- * # print("deltas", deltas)
- * # print("len", len(deltas))
- * assert forced <= best_sol, (forced, best_sol) # <<<<<<<<<<<<<<
- *
- * deltas = [deltas[i] if i in best_sol else None for i in range(n)]
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_5 = PyObject_RichCompare(__pyx_v_forced, __pyx_v_best_sol, Py_LE); __Pyx_XGOTREF(__pyx_t_5); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 463, __pyx_L1_error)
- __pyx_t_4 = __Pyx_PyObject_IsTrue(__pyx_t_5); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 463, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- if (unlikely(!__pyx_t_4)) {
- __pyx_t_5 = PyTuple_New(2); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 463, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_INCREF(__pyx_v_forced);
- __Pyx_GIVEREF(__pyx_v_forced);
- PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_v_forced);
- __Pyx_INCREF(__pyx_v_best_sol);
- __Pyx_GIVEREF(__pyx_v_best_sol);
- PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_v_best_sol);
- __pyx_t_2 = PyTuple_Pack(1, __pyx_t_5); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 463, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_Raise(__pyx_builtin_AssertionError, __pyx_t_2, 0, 0);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __PYX_ERR(0, 463, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 463, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":465
- * assert forced <= best_sol, (forced, best_sol)
- *
- * deltas = [deltas[i] if i in best_sol else None for i in range(n)] # <<<<<<<<<<<<<<
- *
- * return deltas
- */
- { /* enter inner scope */
- __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_5 = __Pyx_PyObject_CallOneArg(__pyx_builtin_range, __pyx_v_n); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_5);
- if (likely(PyList_CheckExact(__pyx_t_5)) || PyTuple_CheckExact(__pyx_t_5)) {
- __pyx_t_7 = __pyx_t_5; __Pyx_INCREF(__pyx_t_7); __pyx_t_1 = 0;
- __pyx_t_10 = NULL;
- } else {
- __pyx_t_1 = -1; __pyx_t_7 = PyObject_GetIter(__pyx_t_5); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_10 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_7); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 465, __pyx_L29_error)
- }
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- for (;;) {
- if (likely(!__pyx_t_10)) {
- if (likely(PyList_CheckExact(__pyx_t_7))) {
- if (__pyx_t_1 >= PyList_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_5 = PyList_GET_ITEM(__pyx_t_7, __pyx_t_1); __Pyx_INCREF(__pyx_t_5); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 465, __pyx_L29_error)
- #else
- __pyx_t_5 = PySequence_ITEM(__pyx_t_7, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_5);
- #endif
- } else {
- if (__pyx_t_1 >= PyTuple_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_5 = PyTuple_GET_ITEM(__pyx_t_7, __pyx_t_1); __Pyx_INCREF(__pyx_t_5); __pyx_t_1++; if (unlikely((0 < 0))) __PYX_ERR(0, 465, __pyx_L29_error)
- #else
- __pyx_t_5 = PySequence_ITEM(__pyx_t_7, __pyx_t_1); __pyx_t_1++; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_5);
- #endif
- }
- } else {
- __pyx_t_5 = __pyx_t_10(__pyx_t_7);
- if (unlikely(!__pyx_t_5)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 465, __pyx_L29_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_5);
- }
- __Pyx_XDECREF_SET(__pyx_8genexpr6__pyx_v_i, __pyx_t_5);
- __pyx_t_5 = 0;
- if (unlikely(__pyx_v_best_sol == Py_None)) {
- PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable");
- __PYX_ERR(0, 465, __pyx_L29_error)
- }
- __pyx_t_4 = (__Pyx_PySet_ContainsTF(__pyx_8genexpr6__pyx_v_i, __pyx_v_best_sol, Py_EQ)); if (unlikely((__pyx_t_4 < 0))) __PYX_ERR(0, 465, __pyx_L29_error)
- if (__pyx_t_4) {
- __pyx_t_3 = __Pyx_PyObject_GetItem(__pyx_v_deltas, __pyx_8genexpr6__pyx_v_i); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __pyx_t_3;
- __pyx_t_3 = 0;
- } else {
- __Pyx_INCREF(Py_None);
- __pyx_t_5 = Py_None;
- }
- if (unlikely(__Pyx_ListComp_Append(__pyx_t_2, (PyObject*)__pyx_t_5))) __PYX_ERR(0, 465, __pyx_L29_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- }
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_XDECREF(__pyx_8genexpr6__pyx_v_i); __pyx_8genexpr6__pyx_v_i = 0;
- goto __pyx_L33_exit_scope;
- __pyx_L29_error:;
- __Pyx_XDECREF(__pyx_8genexpr6__pyx_v_i); __pyx_8genexpr6__pyx_v_i = 0;
- goto __pyx_L1_error;
- __pyx_L33_exit_scope:;
- } /* exit inner scope */
- __Pyx_DECREF_SET(__pyx_v_deltas, __pyx_t_2);
- __pyx_t_2 = 0;
- }
- __pyx_L6:;
-
- /* "fontTools/varLib/iup.py":467
- * deltas = [deltas[i] if i in best_sol else None for i in range(n)]
- *
- * return deltas # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_deltas);
- __pyx_r = __pyx_v_deltas;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_11);
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_contour_optimize", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_forced);
- __Pyx_XDECREF(__pyx_v_k);
- __Pyx_XDECREF(__pyx_v_chain);
- __Pyx_XDECREF(__pyx_v_costs);
- __Pyx_XDECREF(__pyx_v_solution);
- __Pyx_XDECREF(__pyx_v_i);
- __Pyx_XDECREF(__pyx_v_best_sol);
- __Pyx_XDECREF(__pyx_v_best_cost);
- __Pyx_XDECREF(__pyx_v_start);
- __Pyx_XDECREF(__pyx_v_cost);
- __Pyx_XDECREF(__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_2generator1);
- __Pyx_XDECREF(__pyx_gb_9fontTools_6varLib_3iup_20iup_contour_optimize_5generator2);
- __Pyx_XDECREF(__pyx_8genexpr5__pyx_v_i);
- __Pyx_XDECREF(__pyx_8genexpr6__pyx_v_i);
- __Pyx_XDECREF(__pyx_v_deltas);
- __Pyx_XDECREF(__pyx_v_coords);
- __Pyx_DECREF((PyObject *)__pyx_cur_scope);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-/* "fontTools/varLib/iup.py":470
- *
- *
- * def iup_delta_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment,
- * coords: _PointSegment,
- */
-
-/* Python wrapper */
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_15iup_delta_optimize(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-); /*proto*/
-PyDoc_STRVAR(__pyx_doc_9fontTools_6varLib_3iup_14iup_delta_optimize, "iup_delta_optimize(deltas: _DeltaSegment, coords: _PointSegment, ends: _Endpoints, tolerance: Real = 0.0) -> _DeltaOrNoneSegment\nFor the outline given in `coords`, with contour endpoints given\n in sorted increasing order in `ends`, optimize a set of delta\n values `deltas` within error `tolerance`.\n\n Returns delta vector that has most number of None items instead of\n the input delta.\n ");
-static PyMethodDef __pyx_mdef_9fontTools_6varLib_3iup_15iup_delta_optimize = {"iup_delta_optimize", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_9fontTools_6varLib_3iup_15iup_delta_optimize, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_9fontTools_6varLib_3iup_14iup_delta_optimize};
-static PyObject *__pyx_pw_9fontTools_6varLib_3iup_15iup_delta_optimize(PyObject *__pyx_self,
-#if CYTHON_METH_FASTCALL
-PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds
-#else
-PyObject *__pyx_args, PyObject *__pyx_kwds
-#endif
-) {
- PyObject *__pyx_v_deltas = 0;
- PyObject *__pyx_v_coords = 0;
- PyObject *__pyx_v_ends = 0;
- PyObject *__pyx_v_tolerance = 0;
- #if !CYTHON_METH_FASTCALL
- CYTHON_UNUSED const Py_ssize_t __pyx_nargs = PyTuple_GET_SIZE(__pyx_args);
- #endif
- CYTHON_UNUSED PyObject *const *__pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs);
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- PyObject *__pyx_r = 0;
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("iup_delta_optimize (wrapper)", 0);
- {
- PyObject **__pyx_pyargnames[] = {&__pyx_n_s_deltas,&__pyx_n_s_coords,&__pyx_n_s_ends,&__pyx_n_s_tolerance,0};
- PyObject* values[4] = {0,0,0,0};
- values[3] = ((PyObject *)((PyObject*)__pyx_float_0_0));
- if (__pyx_kwds) {
- Py_ssize_t kw_args;
- switch (__pyx_nargs) {
- case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3);
- CYTHON_FALLTHROUGH;
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- CYTHON_FALLTHROUGH;
- case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- CYTHON_FALLTHROUGH;
- case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- CYTHON_FALLTHROUGH;
- case 0: break;
- default: goto __pyx_L5_argtuple_error;
- }
- kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds);
- switch (__pyx_nargs) {
- case 0:
- if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_deltas)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 470, __pyx_L3_error)
- else goto __pyx_L5_argtuple_error;
- CYTHON_FALLTHROUGH;
- case 1:
- if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_coords)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 470, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_delta_optimize", 0, 3, 4, 1); __PYX_ERR(0, 470, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_ends)) != 0)) kw_args--;
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 470, __pyx_L3_error)
- else {
- __Pyx_RaiseArgtupleInvalid("iup_delta_optimize", 0, 3, 4, 2); __PYX_ERR(0, 470, __pyx_L3_error)
- }
- CYTHON_FALLTHROUGH;
- case 3:
- if (kw_args > 0) {
- PyObject* value = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_tolerance);
- if (value) { values[3] = value; kw_args--; }
- else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 470, __pyx_L3_error)
- }
- }
- if (unlikely(kw_args > 0)) {
- const Py_ssize_t kwd_pos_args = __pyx_nargs;
- if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "iup_delta_optimize") < 0)) __PYX_ERR(0, 470, __pyx_L3_error)
- }
- } else {
- switch (__pyx_nargs) {
- case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3);
- CYTHON_FALLTHROUGH;
- case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2);
- values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1);
- values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0);
- break;
- default: goto __pyx_L5_argtuple_error;
- }
- }
- __pyx_v_deltas = values[0];
- __pyx_v_coords = values[1];
- __pyx_v_ends = values[2];
- __pyx_v_tolerance = values[3];
- }
- goto __pyx_L4_argument_unpacking_done;
- __pyx_L5_argtuple_error:;
- __Pyx_RaiseArgtupleInvalid("iup_delta_optimize", 0, 3, 4, __pyx_nargs); __PYX_ERR(0, 470, __pyx_L3_error)
- __pyx_L3_error:;
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_delta_optimize", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __Pyx_RefNannyFinishContext();
- return NULL;
- __pyx_L4_argument_unpacking_done:;
- __pyx_r = __pyx_pf_9fontTools_6varLib_3iup_14iup_delta_optimize(__pyx_self, __pyx_v_deltas, __pyx_v_coords, __pyx_v_ends, __pyx_v_tolerance);
-
- /* function exit code */
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static PyObject *__pyx_pf_9fontTools_6varLib_3iup_14iup_delta_optimize(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_deltas, PyObject *__pyx_v_coords, PyObject *__pyx_v_ends, PyObject *__pyx_v_tolerance) {
- PyObject *__pyx_v_n = NULL;
- PyObject *__pyx_v_out = NULL;
- PyObject *__pyx_v_start = NULL;
- PyObject *__pyx_v_end = NULL;
- PyObject *__pyx_v_contour = NULL;
- PyObject *__pyx_r = NULL;
- __Pyx_RefNannyDeclarations
- int __pyx_t_1;
- PyObject *__pyx_t_2 = NULL;
- PyObject *__pyx_t_3 = NULL;
- int __pyx_t_4;
- int __pyx_t_5;
- Py_ssize_t __pyx_t_6;
- PyObject *__pyx_t_7 = NULL;
- PyObject *__pyx_t_8 = NULL;
- PyObject *__pyx_t_9 = NULL;
- PyObject *(*__pyx_t_10)(PyObject *);
- PyObject *__pyx_t_11 = NULL;
- int __pyx_t_12;
- Py_ssize_t __pyx_t_13;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("iup_delta_optimize", 0);
- __Pyx_INCREF(__pyx_v_ends);
-
- /* "fontTools/varLib/iup.py":483
- * the input delta.
- * """
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4 # <<<<<<<<<<<<<<
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_3 = PySequence_List(__pyx_v_ends); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_2 = ((PyObject*)__pyx_t_3);
- __pyx_t_3 = 0;
- __pyx_t_4 = PyList_Sort(__pyx_t_2); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 483, __pyx_L1_error)
- __pyx_t_3 = PyObject_RichCompare(__pyx_t_2, __pyx_v_ends, Py_EQ); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (__pyx_t_5) {
- } else {
- __pyx_t_1 = __pyx_t_5;
- goto __pyx_L3_bool_binop_done;
- }
- __pyx_t_6 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_6 == ((Py_ssize_t)-1))) __PYX_ERR(0, 483, __pyx_L1_error)
- __pyx_t_3 = PyInt_FromSsize_t(__pyx_t_6); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_v_ends); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 483, __pyx_L1_error)
- if (__pyx_t_5) {
- __pyx_t_7 = __Pyx_GetItemInt(__pyx_v_ends, -1L, long, 1, __Pyx_PyInt_From_long, 0, 1, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_7, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_2 = __pyx_t_8;
- __pyx_t_8 = 0;
- } else {
- __Pyx_INCREF(__pyx_int_0);
- __pyx_t_2 = __pyx_int_0;
- }
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_t_2, __pyx_int_4, 4, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_2 = PyObject_RichCompare(__pyx_t_3, __pyx_t_8, Py_EQ); __Pyx_XGOTREF(__pyx_t_2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_5 = __Pyx_PyObject_IsTrue(__pyx_t_2); if (unlikely((__pyx_t_5 < 0))) __PYX_ERR(0, 483, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __pyx_t_1 = __pyx_t_5;
- __pyx_L3_bool_binop_done:;
- if (unlikely(!__pyx_t_1)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 483, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 483, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":484
- * """
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4
- * n = len(coords) # <<<<<<<<<<<<<<
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = []
- */
- __pyx_t_6 = PyObject_Length(__pyx_v_coords); if (unlikely(__pyx_t_6 == ((Py_ssize_t)-1))) __PYX_ERR(0, 484, __pyx_L1_error)
- __pyx_t_2 = PyInt_FromSsize_t(__pyx_t_6); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 484, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_v_n = __pyx_t_2;
- __pyx_t_2 = 0;
-
- /* "fontTools/varLib/iup.py":485
- * assert sorted(ends) == ends and len(coords) == (ends[-1] + 1 if ends else 0) + 4
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1] # <<<<<<<<<<<<<<
- * out = []
- * start = 0
- */
- __pyx_t_2 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_4, 4, 0, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __pyx_t_8 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_3, 3, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_3 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_2, 2, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_7 = __Pyx_PyInt_SubtractObjC(__pyx_v_n, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_9 = PyList_New(4); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_GIVEREF(__pyx_t_2);
- PyList_SET_ITEM(__pyx_t_9, 0, __pyx_t_2);
- __Pyx_GIVEREF(__pyx_t_8);
- PyList_SET_ITEM(__pyx_t_9, 1, __pyx_t_8);
- __Pyx_GIVEREF(__pyx_t_3);
- PyList_SET_ITEM(__pyx_t_9, 2, __pyx_t_3);
- __Pyx_GIVEREF(__pyx_t_7);
- PyList_SET_ITEM(__pyx_t_9, 3, __pyx_t_7);
- __pyx_t_2 = 0;
- __pyx_t_8 = 0;
- __pyx_t_3 = 0;
- __pyx_t_7 = 0;
- __pyx_t_7 = PyNumber_Add(__pyx_v_ends, __pyx_t_9); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 485, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __Pyx_DECREF_SET(__pyx_v_ends, __pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":486
- * n = len(coords)
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = [] # <<<<<<<<<<<<<<
- * start = 0
- * for end in ends:
- */
- __pyx_t_7 = PyList_New(0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 486, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_v_out = ((PyObject*)__pyx_t_7);
- __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":487
- * ends = ends + [n - 4, n - 3, n - 2, n - 1]
- * out = []
- * start = 0 # <<<<<<<<<<<<<<
- * for end in ends:
- * contour = iup_contour_optimize(
- */
- __Pyx_INCREF(__pyx_int_0);
- __pyx_v_start = __pyx_int_0;
-
- /* "fontTools/varLib/iup.py":488
- * out = []
- * start = 0
- * for end in ends: # <<<<<<<<<<<<<<
- * contour = iup_contour_optimize(
- * deltas[start : end + 1], coords[start : end + 1], tolerance
- */
- if (likely(PyList_CheckExact(__pyx_v_ends)) || PyTuple_CheckExact(__pyx_v_ends)) {
- __pyx_t_7 = __pyx_v_ends; __Pyx_INCREF(__pyx_t_7); __pyx_t_6 = 0;
- __pyx_t_10 = NULL;
- } else {
- __pyx_t_6 = -1; __pyx_t_7 = PyObject_GetIter(__pyx_v_ends); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 488, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_10 = __Pyx_PyObject_GetIterNextFunc(__pyx_t_7); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 488, __pyx_L1_error)
- }
- for (;;) {
- if (likely(!__pyx_t_10)) {
- if (likely(PyList_CheckExact(__pyx_t_7))) {
- if (__pyx_t_6 >= PyList_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_9 = PyList_GET_ITEM(__pyx_t_7, __pyx_t_6); __Pyx_INCREF(__pyx_t_9); __pyx_t_6++; if (unlikely((0 < 0))) __PYX_ERR(0, 488, __pyx_L1_error)
- #else
- __pyx_t_9 = PySequence_ITEM(__pyx_t_7, __pyx_t_6); __pyx_t_6++; if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 488, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- #endif
- } else {
- if (__pyx_t_6 >= PyTuple_GET_SIZE(__pyx_t_7)) break;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- __pyx_t_9 = PyTuple_GET_ITEM(__pyx_t_7, __pyx_t_6); __Pyx_INCREF(__pyx_t_9); __pyx_t_6++; if (unlikely((0 < 0))) __PYX_ERR(0, 488, __pyx_L1_error)
- #else
- __pyx_t_9 = PySequence_ITEM(__pyx_t_7, __pyx_t_6); __pyx_t_6++; if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 488, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- #endif
- }
- } else {
- __pyx_t_9 = __pyx_t_10(__pyx_t_7);
- if (unlikely(!__pyx_t_9)) {
- PyObject* exc_type = PyErr_Occurred();
- if (exc_type) {
- if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) PyErr_Clear();
- else __PYX_ERR(0, 488, __pyx_L1_error)
- }
- break;
- }
- __Pyx_GOTREF(__pyx_t_9);
- }
- __Pyx_XDECREF_SET(__pyx_v_end, __pyx_t_9);
- __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":489
- * start = 0
- * for end in ends:
- * contour = iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas[start : end + 1], coords[start : end + 1], tolerance
- * )
- */
- __Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_iup_contour_optimize); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 489, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
-
- /* "fontTools/varLib/iup.py":490
- * for end in ends:
- * contour = iup_contour_optimize(
- * deltas[start : end + 1], coords[start : end + 1], tolerance # <<<<<<<<<<<<<<
- * )
- * assert len(contour) == end - start + 1
- */
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_v_end, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 490, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_2 = __Pyx_PyObject_GetSlice(__pyx_v_deltas, 0, 0, &__pyx_v_start, &__pyx_t_8, NULL, 0, 0, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 490, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_2);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_8 = __Pyx_PyInt_AddObjC(__pyx_v_end, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 490, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __pyx_t_11 = __Pyx_PyObject_GetSlice(__pyx_v_coords, 0, 0, &__pyx_v_start, &__pyx_t_8, NULL, 0, 0, 1); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 490, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_8 = NULL;
- __pyx_t_12 = 0;
- if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) {
- __pyx_t_8 = PyMethod_GET_SELF(__pyx_t_3);
- if (likely(__pyx_t_8)) {
- PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3);
- __Pyx_INCREF(__pyx_t_8);
- __Pyx_INCREF(function);
- __Pyx_DECREF_SET(__pyx_t_3, function);
- __pyx_t_12 = 1;
- }
- }
- {
- PyObject *__pyx_callargs[4] = {__pyx_t_8, __pyx_t_2, __pyx_t_11, __pyx_v_tolerance};
- __pyx_t_9 = __Pyx_PyObject_FastCall(__pyx_t_3, __pyx_callargs+1-__pyx_t_12, 3+__pyx_t_12);
- __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
- if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 489, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- }
- __Pyx_XDECREF_SET(__pyx_v_contour, __pyx_t_9);
- __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":492
- * deltas[start : end + 1], coords[start : end + 1], tolerance
- * )
- * assert len(contour) == end - start + 1 # <<<<<<<<<<<<<<
- * out.extend(contour)
- * start = end + 1
- */
- #ifndef CYTHON_WITHOUT_ASSERTIONS
- if (unlikely(__pyx_assertions_enabled())) {
- __pyx_t_13 = PyObject_Length(__pyx_v_contour); if (unlikely(__pyx_t_13 == ((Py_ssize_t)-1))) __PYX_ERR(0, 492, __pyx_L1_error)
- __pyx_t_9 = PyInt_FromSsize_t(__pyx_t_13); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 492, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_9);
- __pyx_t_3 = PyNumber_Subtract(__pyx_v_end, __pyx_v_start); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 492, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __pyx_t_11 = __Pyx_PyInt_AddObjC(__pyx_t_3, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 492, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_11);
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- __pyx_t_3 = PyObject_RichCompare(__pyx_t_9, __pyx_t_11, Py_EQ); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 492, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0;
- __pyx_t_1 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely((__pyx_t_1 < 0))) __PYX_ERR(0, 492, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0;
- if (unlikely(!__pyx_t_1)) {
- __Pyx_Raise(__pyx_builtin_AssertionError, 0, 0, 0);
- __PYX_ERR(0, 492, __pyx_L1_error)
- }
- }
- #else
- if ((1)); else __PYX_ERR(0, 492, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":493
- * )
- * assert len(contour) == end - start + 1
- * out.extend(contour) # <<<<<<<<<<<<<<
- * start = end + 1
- *
- */
- __pyx_t_4 = __Pyx_PyList_Extend(__pyx_v_out, __pyx_v_contour); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 493, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":494
- * assert len(contour) == end - start + 1
- * out.extend(contour)
- * start = end + 1 # <<<<<<<<<<<<<<
- *
- * return out
- */
- __pyx_t_3 = __Pyx_PyInt_AddObjC(__pyx_v_end, __pyx_int_1, 1, 0, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 494, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_3);
- __Pyx_DECREF_SET(__pyx_v_start, __pyx_t_3);
- __pyx_t_3 = 0;
-
- /* "fontTools/varLib/iup.py":488
- * out = []
- * start = 0
- * for end in ends: # <<<<<<<<<<<<<<
- * contour = iup_contour_optimize(
- * deltas[start : end + 1], coords[start : end + 1], tolerance
- */
- }
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":496
- * start = end + 1
- *
- * return out # <<<<<<<<<<<<<<
- */
- __Pyx_XDECREF(__pyx_r);
- __Pyx_INCREF(__pyx_v_out);
- __pyx_r = __pyx_v_out;
- goto __pyx_L0;
-
- /* "fontTools/varLib/iup.py":470
- *
- *
- * def iup_delta_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment,
- * coords: _PointSegment,
- */
-
- /* function exit code */
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_2);
- __Pyx_XDECREF(__pyx_t_3);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_9);
- __Pyx_XDECREF(__pyx_t_11);
- __Pyx_AddTraceback("fontTools.varLib.iup.iup_delta_optimize", __pyx_clineno, __pyx_lineno, __pyx_filename);
- __pyx_r = NULL;
- __pyx_L0:;
- __Pyx_XDECREF(__pyx_v_n);
- __Pyx_XDECREF(__pyx_v_out);
- __Pyx_XDECREF(__pyx_v_start);
- __Pyx_XDECREF(__pyx_v_end);
- __Pyx_XDECREF(__pyx_v_contour);
- __Pyx_XDECREF(__pyx_v_ends);
- __Pyx_XGIVEREF(__pyx_r);
- __Pyx_RefNannyFinishContext();
- return __pyx_r;
-}
-
-static struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr[8];
-static int __pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr = 0;
-
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) {
- PyObject *o;
- #if CYTHON_COMPILING_IN_LIMITED_API
- allocfunc alloc_func = (allocfunc)PyType_GetSlot(t, Py_tp_alloc);
- o = alloc_func(t, 0);
- #else
- if (CYTHON_COMPILING_IN_CPYTHON && likely((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr > 0) & (int)(t->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr)))) {
- o = (PyObject*)__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr[--__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr];
- memset(o, 0, sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr));
- (void) PyObject_INIT(o, t);
- PyObject_GC_Track(o);
- } else {
- o = (*t->tp_alloc)(t, 0);
- if (unlikely(!o)) return 0;
- }
- #endif
- return o;
-}
-
-static void __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr(PyObject *o) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)o;
- #if CYTHON_USE_TP_FINALIZE
- if (unlikely((PY_VERSION_HEX >= 0x03080000 || __Pyx_PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE)) && __Pyx_PyObject_GetSlot(o, tp_finalize, destructor)) && !__Pyx_PyObject_GC_IsFinalized(o)) {
- if (__Pyx_PyObject_GetSlot(o, tp_dealloc, destructor) == __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr) {
- if (PyObject_CallFinalizerFromDealloc(o)) return;
- }
- }
- #endif
- PyObject_GC_UnTrack(o);
- Py_CLEAR(p->__pyx_genexpr_arg_0);
- if (CYTHON_COMPILING_IN_CPYTHON && ((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr < 8) & (int)(Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr)))) {
- __pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr[__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr++] = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)o);
- } else {
- (*Py_TYPE(o)->tp_free)(o);
- }
-}
-
-static int __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr(PyObject *o, visitproc v, void *a) {
- int e;
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr *)o;
- if (p->__pyx_genexpr_arg_0) {
- e = (*v)(p->__pyx_genexpr_arg_0, a); if (e) return e;
- }
- return 0;
-}
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr_slots[] = {
- {Py_tp_dealloc, (void *)__pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr},
- {Py_tp_traverse, (void *)__pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr},
- {Py_tp_new, (void *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr},
- {0, 0},
-};
-static PyType_Spec __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr_spec = {
- "fontTools.varLib.iup.__pyx_scope_struct__genexpr",
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr),
- 0,
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE,
- __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr_slots,
-};
-#else
-
-static PyTypeObject __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr = {
- PyVarObject_HEAD_INIT(0, 0)
- "fontTools.varLib.iup.""__pyx_scope_struct__genexpr", /*tp_name*/
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr), /*tp_basicsize*/
- 0, /*tp_itemsize*/
- __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr, /*tp_dealloc*/
- #if PY_VERSION_HEX < 0x030800b4
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030800b4
- 0, /*tp_vectorcall_offset*/
- #endif
- 0, /*tp_getattr*/
- 0, /*tp_setattr*/
- #if PY_MAJOR_VERSION < 3
- 0, /*tp_compare*/
- #endif
- #if PY_MAJOR_VERSION >= 3
- 0, /*tp_as_async*/
- #endif
- 0, /*tp_repr*/
- 0, /*tp_as_number*/
- 0, /*tp_as_sequence*/
- 0, /*tp_as_mapping*/
- 0, /*tp_hash*/
- 0, /*tp_call*/
- 0, /*tp_str*/
- 0, /*tp_getattro*/
- 0, /*tp_setattro*/
- 0, /*tp_as_buffer*/
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE, /*tp_flags*/
- 0, /*tp_doc*/
- __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr, /*tp_traverse*/
- 0, /*tp_clear*/
- 0, /*tp_richcompare*/
- 0, /*tp_weaklistoffset*/
- 0, /*tp_iter*/
- 0, /*tp_iternext*/
- 0, /*tp_methods*/
- 0, /*tp_members*/
- 0, /*tp_getset*/
- 0, /*tp_base*/
- 0, /*tp_dict*/
- 0, /*tp_descr_get*/
- 0, /*tp_descr_set*/
- #if !CYTHON_USE_TYPE_SPECS
- 0, /*tp_dictoffset*/
- #endif
- 0, /*tp_init*/
- 0, /*tp_alloc*/
- __pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr, /*tp_new*/
- 0, /*tp_free*/
- 0, /*tp_is_gc*/
- 0, /*tp_bases*/
- 0, /*tp_mro*/
- 0, /*tp_cache*/
- 0, /*tp_subclasses*/
- 0, /*tp_weaklist*/
- 0, /*tp_del*/
- 0, /*tp_version_tag*/
- #if PY_VERSION_HEX >= 0x030400a1
- #if CYTHON_USE_TP_FINALIZE
- 0, /*tp_finalize*/
- #else
- NULL, /*tp_finalize*/
- #endif
- #endif
- #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0, /*tp_vectorcall*/
- #endif
- #if __PYX_NEED_TP_PRINT_SLOT == 1
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030C0000
- 0, /*tp_watched*/
- #endif
- #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0, /*tp_pypy_flags*/
- #endif
-};
-#endif
-
-static struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize[8];
-static int __pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize = 0;
-
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) {
- PyObject *o;
- #if CYTHON_COMPILING_IN_LIMITED_API
- allocfunc alloc_func = (allocfunc)PyType_GetSlot(t, Py_tp_alloc);
- o = alloc_func(t, 0);
- #else
- if (CYTHON_COMPILING_IN_CPYTHON && likely((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize > 0) & (int)(t->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize)))) {
- o = (PyObject*)__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize[--__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize];
- memset(o, 0, sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize));
- (void) PyObject_INIT(o, t);
- PyObject_GC_Track(o);
- } else {
- o = (*t->tp_alloc)(t, 0);
- if (unlikely(!o)) return 0;
- }
- #endif
- return o;
-}
-
-static void __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(PyObject *o) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)o;
- #if CYTHON_USE_TP_FINALIZE
- if (unlikely((PY_VERSION_HEX >= 0x03080000 || __Pyx_PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE)) && __Pyx_PyObject_GetSlot(o, tp_finalize, destructor)) && !__Pyx_PyObject_GC_IsFinalized(o)) {
- if (__Pyx_PyObject_GetSlot(o, tp_dealloc, destructor) == __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize) {
- if (PyObject_CallFinalizerFromDealloc(o)) return;
- }
- }
- #endif
- PyObject_GC_UnTrack(o);
- Py_CLEAR(p->__pyx_v_d0);
- Py_CLEAR(p->__pyx_v_tolerance);
- if (CYTHON_COMPILING_IN_CPYTHON && ((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize < 8) & (int)(Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize)))) {
- __pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize[__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize++] = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)o);
- } else {
- (*Py_TYPE(o)->tp_free)(o);
- }
-}
-
-static int __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(PyObject *o, visitproc v, void *a) {
- int e;
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)o;
- if (p->__pyx_v_d0) {
- e = (*v)(p->__pyx_v_d0, a); if (e) return e;
- }
- if (p->__pyx_v_tolerance) {
- e = (*v)(p->__pyx_v_tolerance, a); if (e) return e;
- }
- return 0;
-}
-
-static int __pyx_tp_clear_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize(PyObject *o) {
- PyObject* tmp;
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize *)o;
- tmp = ((PyObject*)p->__pyx_v_d0);
- p->__pyx_v_d0 = Py_None; Py_INCREF(Py_None);
- Py_XDECREF(tmp);
- tmp = ((PyObject*)p->__pyx_v_tolerance);
- p->__pyx_v_tolerance = Py_None; Py_INCREF(Py_None);
- Py_XDECREF(tmp);
- return 0;
-}
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize_slots[] = {
- {Py_tp_dealloc, (void *)__pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize},
- {Py_tp_traverse, (void *)__pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize},
- {Py_tp_clear, (void *)__pyx_tp_clear_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize},
- {Py_tp_new, (void *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize},
- {0, 0},
-};
-static PyType_Spec __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize_spec = {
- "fontTools.varLib.iup.__pyx_scope_struct_1_iup_contour_optimize",
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize),
- 0,
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE,
- __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize_slots,
-};
-#else
-
-static PyTypeObject __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize = {
- PyVarObject_HEAD_INIT(0, 0)
- "fontTools.varLib.iup.""__pyx_scope_struct_1_iup_contour_optimize", /*tp_name*/
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize), /*tp_basicsize*/
- 0, /*tp_itemsize*/
- __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize, /*tp_dealloc*/
- #if PY_VERSION_HEX < 0x030800b4
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030800b4
- 0, /*tp_vectorcall_offset*/
- #endif
- 0, /*tp_getattr*/
- 0, /*tp_setattr*/
- #if PY_MAJOR_VERSION < 3
- 0, /*tp_compare*/
- #endif
- #if PY_MAJOR_VERSION >= 3
- 0, /*tp_as_async*/
- #endif
- 0, /*tp_repr*/
- 0, /*tp_as_number*/
- 0, /*tp_as_sequence*/
- 0, /*tp_as_mapping*/
- 0, /*tp_hash*/
- 0, /*tp_call*/
- 0, /*tp_str*/
- 0, /*tp_getattro*/
- 0, /*tp_setattro*/
- 0, /*tp_as_buffer*/
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE, /*tp_flags*/
- 0, /*tp_doc*/
- __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize, /*tp_traverse*/
- __pyx_tp_clear_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize, /*tp_clear*/
- 0, /*tp_richcompare*/
- 0, /*tp_weaklistoffset*/
- 0, /*tp_iter*/
- 0, /*tp_iternext*/
- 0, /*tp_methods*/
- 0, /*tp_members*/
- 0, /*tp_getset*/
- 0, /*tp_base*/
- 0, /*tp_dict*/
- 0, /*tp_descr_get*/
- 0, /*tp_descr_set*/
- #if !CYTHON_USE_TYPE_SPECS
- 0, /*tp_dictoffset*/
- #endif
- 0, /*tp_init*/
- 0, /*tp_alloc*/
- __pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize, /*tp_new*/
- 0, /*tp_free*/
- 0, /*tp_is_gc*/
- 0, /*tp_bases*/
- 0, /*tp_mro*/
- 0, /*tp_cache*/
- 0, /*tp_subclasses*/
- 0, /*tp_weaklist*/
- 0, /*tp_del*/
- 0, /*tp_version_tag*/
- #if PY_VERSION_HEX >= 0x030400a1
- #if CYTHON_USE_TP_FINALIZE
- 0, /*tp_finalize*/
- #else
- NULL, /*tp_finalize*/
- #endif
- #endif
- #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0, /*tp_vectorcall*/
- #endif
- #if __PYX_NEED_TP_PRINT_SLOT == 1
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030C0000
- 0, /*tp_watched*/
- #endif
- #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0, /*tp_pypy_flags*/
- #endif
-};
-#endif
-
-static struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr[8];
-static int __pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr = 0;
-
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) {
- PyObject *o;
- #if CYTHON_COMPILING_IN_LIMITED_API
- allocfunc alloc_func = (allocfunc)PyType_GetSlot(t, Py_tp_alloc);
- o = alloc_func(t, 0);
- #else
- if (CYTHON_COMPILING_IN_CPYTHON && likely((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr > 0) & (int)(t->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr)))) {
- o = (PyObject*)__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr[--__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr];
- memset(o, 0, sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr));
- (void) PyObject_INIT(o, t);
- PyObject_GC_Track(o);
- } else {
- o = (*t->tp_alloc)(t, 0);
- if (unlikely(!o)) return 0;
- }
- #endif
- return o;
-}
-
-static void __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr(PyObject *o) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)o;
- #if CYTHON_USE_TP_FINALIZE
- if (unlikely((PY_VERSION_HEX >= 0x03080000 || __Pyx_PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE)) && __Pyx_PyObject_GetSlot(o, tp_finalize, destructor)) && !__Pyx_PyObject_GC_IsFinalized(o)) {
- if (__Pyx_PyObject_GetSlot(o, tp_dealloc, destructor) == __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr) {
- if (PyObject_CallFinalizerFromDealloc(o)) return;
- }
- }
- #endif
- PyObject_GC_UnTrack(o);
- Py_CLEAR(p->__pyx_outer_scope);
- Py_CLEAR(p->__pyx_genexpr_arg_0);
- Py_CLEAR(p->__pyx_v_p);
- if (CYTHON_COMPILING_IN_CPYTHON && ((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr < 8) & (int)(Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr)))) {
- __pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr[__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr++] = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)o);
- } else {
- (*Py_TYPE(o)->tp_free)(o);
- }
-}
-
-static int __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr(PyObject *o, visitproc v, void *a) {
- int e;
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr *)o;
- if (p->__pyx_outer_scope) {
- e = (*v)(((PyObject *)p->__pyx_outer_scope), a); if (e) return e;
- }
- if (p->__pyx_genexpr_arg_0) {
- e = (*v)(p->__pyx_genexpr_arg_0, a); if (e) return e;
- }
- if (p->__pyx_v_p) {
- e = (*v)(p->__pyx_v_p, a); if (e) return e;
- }
- return 0;
-}
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr_slots[] = {
- {Py_tp_dealloc, (void *)__pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr},
- {Py_tp_traverse, (void *)__pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr},
- {Py_tp_new, (void *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr},
- {0, 0},
-};
-static PyType_Spec __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr_spec = {
- "fontTools.varLib.iup.__pyx_scope_struct_2_genexpr",
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr),
- 0,
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE,
- __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr_slots,
-};
-#else
-
-static PyTypeObject __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr = {
- PyVarObject_HEAD_INIT(0, 0)
- "fontTools.varLib.iup.""__pyx_scope_struct_2_genexpr", /*tp_name*/
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr), /*tp_basicsize*/
- 0, /*tp_itemsize*/
- __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr, /*tp_dealloc*/
- #if PY_VERSION_HEX < 0x030800b4
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030800b4
- 0, /*tp_vectorcall_offset*/
- #endif
- 0, /*tp_getattr*/
- 0, /*tp_setattr*/
- #if PY_MAJOR_VERSION < 3
- 0, /*tp_compare*/
- #endif
- #if PY_MAJOR_VERSION >= 3
- 0, /*tp_as_async*/
- #endif
- 0, /*tp_repr*/
- 0, /*tp_as_number*/
- 0, /*tp_as_sequence*/
- 0, /*tp_as_mapping*/
- 0, /*tp_hash*/
- 0, /*tp_call*/
- 0, /*tp_str*/
- 0, /*tp_getattro*/
- 0, /*tp_setattro*/
- 0, /*tp_as_buffer*/
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE, /*tp_flags*/
- 0, /*tp_doc*/
- __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr, /*tp_traverse*/
- 0, /*tp_clear*/
- 0, /*tp_richcompare*/
- 0, /*tp_weaklistoffset*/
- 0, /*tp_iter*/
- 0, /*tp_iternext*/
- 0, /*tp_methods*/
- 0, /*tp_members*/
- 0, /*tp_getset*/
- 0, /*tp_base*/
- 0, /*tp_dict*/
- 0, /*tp_descr_get*/
- 0, /*tp_descr_set*/
- #if !CYTHON_USE_TYPE_SPECS
- 0, /*tp_dictoffset*/
- #endif
- 0, /*tp_init*/
- 0, /*tp_alloc*/
- __pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr, /*tp_new*/
- 0, /*tp_free*/
- 0, /*tp_is_gc*/
- 0, /*tp_bases*/
- 0, /*tp_mro*/
- 0, /*tp_cache*/
- 0, /*tp_subclasses*/
- 0, /*tp_weaklist*/
- 0, /*tp_del*/
- 0, /*tp_version_tag*/
- #if PY_VERSION_HEX >= 0x030400a1
- #if CYTHON_USE_TP_FINALIZE
- 0, /*tp_finalize*/
- #else
- NULL, /*tp_finalize*/
- #endif
- #endif
- #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0, /*tp_vectorcall*/
- #endif
- #if __PYX_NEED_TP_PRINT_SLOT == 1
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030C0000
- 0, /*tp_watched*/
- #endif
- #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0, /*tp_pypy_flags*/
- #endif
-};
-#endif
-
-static struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr[8];
-static int __pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr = 0;
-
-static PyObject *__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) {
- PyObject *o;
- #if CYTHON_COMPILING_IN_LIMITED_API
- allocfunc alloc_func = (allocfunc)PyType_GetSlot(t, Py_tp_alloc);
- o = alloc_func(t, 0);
- #else
- if (CYTHON_COMPILING_IN_CPYTHON && likely((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr > 0) & (int)(t->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr)))) {
- o = (PyObject*)__pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr[--__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr];
- memset(o, 0, sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr));
- (void) PyObject_INIT(o, t);
- PyObject_GC_Track(o);
- } else {
- o = (*t->tp_alloc)(t, 0);
- if (unlikely(!o)) return 0;
- }
- #endif
- return o;
-}
-
-static void __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr(PyObject *o) {
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)o;
- #if CYTHON_USE_TP_FINALIZE
- if (unlikely((PY_VERSION_HEX >= 0x03080000 || __Pyx_PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE)) && __Pyx_PyObject_GetSlot(o, tp_finalize, destructor)) && !__Pyx_PyObject_GC_IsFinalized(o)) {
- if (__Pyx_PyObject_GetSlot(o, tp_dealloc, destructor) == __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr) {
- if (PyObject_CallFinalizerFromDealloc(o)) return;
- }
- }
- #endif
- PyObject_GC_UnTrack(o);
- Py_CLEAR(p->__pyx_outer_scope);
- Py_CLEAR(p->__pyx_genexpr_arg_0);
- Py_CLEAR(p->__pyx_v_d);
- if (CYTHON_COMPILING_IN_CPYTHON && ((int)(__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr < 8) & (int)(Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr)))) {
- __pyx_freelist_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr[__pyx_freecount_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr++] = ((struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)o);
- } else {
- (*Py_TYPE(o)->tp_free)(o);
- }
-}
-
-static int __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr(PyObject *o, visitproc v, void *a) {
- int e;
- struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *p = (struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr *)o;
- if (p->__pyx_outer_scope) {
- e = (*v)(((PyObject *)p->__pyx_outer_scope), a); if (e) return e;
- }
- if (p->__pyx_genexpr_arg_0) {
- e = (*v)(p->__pyx_genexpr_arg_0, a); if (e) return e;
- }
- if (p->__pyx_v_d) {
- e = (*v)(p->__pyx_v_d, a); if (e) return e;
- }
- return 0;
-}
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr_slots[] = {
- {Py_tp_dealloc, (void *)__pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr},
- {Py_tp_traverse, (void *)__pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr},
- {Py_tp_new, (void *)__pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr},
- {0, 0},
-};
-static PyType_Spec __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr_spec = {
- "fontTools.varLib.iup.__pyx_scope_struct_3_genexpr",
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr),
- 0,
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE,
- __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr_slots,
-};
-#else
-
-static PyTypeObject __pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr = {
- PyVarObject_HEAD_INIT(0, 0)
- "fontTools.varLib.iup.""__pyx_scope_struct_3_genexpr", /*tp_name*/
- sizeof(struct __pyx_obj_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr), /*tp_basicsize*/
- 0, /*tp_itemsize*/
- __pyx_tp_dealloc_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr, /*tp_dealloc*/
- #if PY_VERSION_HEX < 0x030800b4
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030800b4
- 0, /*tp_vectorcall_offset*/
- #endif
- 0, /*tp_getattr*/
- 0, /*tp_setattr*/
- #if PY_MAJOR_VERSION < 3
- 0, /*tp_compare*/
- #endif
- #if PY_MAJOR_VERSION >= 3
- 0, /*tp_as_async*/
- #endif
- 0, /*tp_repr*/
- 0, /*tp_as_number*/
- 0, /*tp_as_sequence*/
- 0, /*tp_as_mapping*/
- 0, /*tp_hash*/
- 0, /*tp_call*/
- 0, /*tp_str*/
- 0, /*tp_getattro*/
- 0, /*tp_setattro*/
- 0, /*tp_as_buffer*/
- Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC|Py_TPFLAGS_HAVE_FINALIZE, /*tp_flags*/
- 0, /*tp_doc*/
- __pyx_tp_traverse_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr, /*tp_traverse*/
- 0, /*tp_clear*/
- 0, /*tp_richcompare*/
- 0, /*tp_weaklistoffset*/
- 0, /*tp_iter*/
- 0, /*tp_iternext*/
- 0, /*tp_methods*/
- 0, /*tp_members*/
- 0, /*tp_getset*/
- 0, /*tp_base*/
- 0, /*tp_dict*/
- 0, /*tp_descr_get*/
- 0, /*tp_descr_set*/
- #if !CYTHON_USE_TYPE_SPECS
- 0, /*tp_dictoffset*/
- #endif
- 0, /*tp_init*/
- 0, /*tp_alloc*/
- __pyx_tp_new_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr, /*tp_new*/
- 0, /*tp_free*/
- 0, /*tp_is_gc*/
- 0, /*tp_bases*/
- 0, /*tp_mro*/
- 0, /*tp_cache*/
- 0, /*tp_subclasses*/
- 0, /*tp_weaklist*/
- 0, /*tp_del*/
- 0, /*tp_version_tag*/
- #if PY_VERSION_HEX >= 0x030400a1
- #if CYTHON_USE_TP_FINALIZE
- 0, /*tp_finalize*/
- #else
- NULL, /*tp_finalize*/
- #endif
- #endif
- #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0, /*tp_vectorcall*/
- #endif
- #if __PYX_NEED_TP_PRINT_SLOT == 1
- 0, /*tp_print*/
- #endif
- #if PY_VERSION_HEX >= 0x030C0000
- 0, /*tp_watched*/
- #endif
- #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0, /*tp_pypy_flags*/
- #endif
-};
-#endif
-
-static PyMethodDef __pyx_methods[] = {
- {0, 0, 0, 0}
-};
-#ifndef CYTHON_SMALL_CODE
-#if defined(__clang__)
- #define CYTHON_SMALL_CODE
-#elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3))
- #define CYTHON_SMALL_CODE __attribute__((cold))
-#else
- #define CYTHON_SMALL_CODE
-#endif
-#endif
-/* #### Code section: pystring_table ### */
-
-static int __Pyx_CreateStringTabAndInitStrings(void) {
- __Pyx_StringTabEntry __pyx_string_tab[] = {
- {&__pyx_n_s_AssertionError, __pyx_k_AssertionError, sizeof(__pyx_k_AssertionError), 0, 0, 1, 1},
- {&__pyx_n_s_AttributeError, __pyx_k_AttributeError, sizeof(__pyx_k_AttributeError), 0, 0, 1, 1},
- {&__pyx_n_s_COMPILED, __pyx_k_COMPILED, sizeof(__pyx_k_COMPILED), 0, 0, 1, 1},
- {&__pyx_n_s_Delta, __pyx_k_Delta, sizeof(__pyx_k_Delta), 0, 0, 1, 1},
- {&__pyx_n_s_DeltaOrNone, __pyx_k_DeltaOrNone, sizeof(__pyx_k_DeltaOrNone), 0, 0, 1, 1},
- {&__pyx_n_s_DeltaOrNoneSegment, __pyx_k_DeltaOrNoneSegment, sizeof(__pyx_k_DeltaOrNoneSegment), 0, 0, 1, 1},
- {&__pyx_n_s_DeltaSegment, __pyx_k_DeltaSegment, sizeof(__pyx_k_DeltaSegment), 0, 0, 1, 1},
- {&__pyx_n_s_Endpoints, __pyx_k_Endpoints, sizeof(__pyx_k_Endpoints), 0, 0, 1, 1},
- {&__pyx_n_s_ImportError, __pyx_k_ImportError, sizeof(__pyx_k_ImportError), 0, 0, 1, 1},
- {&__pyx_n_s_Integral, __pyx_k_Integral, sizeof(__pyx_k_Integral), 0, 0, 1, 1},
- {&__pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_k_Lib_fontTools_varLib_iup_py, sizeof(__pyx_k_Lib_fontTools_varLib_iup_py), 0, 0, 1, 0},
- {&__pyx_n_s_MAX_LOOKBACK, __pyx_k_MAX_LOOKBACK, sizeof(__pyx_k_MAX_LOOKBACK), 0, 0, 1, 1},
- {&__pyx_n_s_Point, __pyx_k_Point, sizeof(__pyx_k_Point), 0, 0, 1, 1},
- {&__pyx_n_s_PointSegment, __pyx_k_PointSegment, sizeof(__pyx_k_PointSegment), 0, 0, 1, 1},
- {&__pyx_n_s_Real, __pyx_k_Real, sizeof(__pyx_k_Real), 0, 0, 1, 1},
- {&__pyx_n_s_Sequence, __pyx_k_Sequence, sizeof(__pyx_k_Sequence), 0, 0, 1, 1},
- {&__pyx_n_s_Tuple, __pyx_k_Tuple, sizeof(__pyx_k_Tuple), 0, 0, 1, 1},
- {&__pyx_n_s_Union, __pyx_k_Union, sizeof(__pyx_k_Union), 0, 0, 1, 1},
- {&__pyx_n_s__23, __pyx_k__23, sizeof(__pyx_k__23), 0, 0, 1, 1},
- {&__pyx_kp_u__3, __pyx_k__3, sizeof(__pyx_k__3), 0, 1, 0, 0},
- {&__pyx_n_s_args, __pyx_k_args, sizeof(__pyx_k_args), 0, 0, 1, 1},
- {&__pyx_n_s_asyncio_coroutines, __pyx_k_asyncio_coroutines, sizeof(__pyx_k_asyncio_coroutines), 0, 0, 1, 1},
- {&__pyx_n_s_best_cost, __pyx_k_best_cost, sizeof(__pyx_k_best_cost), 0, 0, 1, 1},
- {&__pyx_n_s_best_j, __pyx_k_best_j, sizeof(__pyx_k_best_j), 0, 0, 1, 1},
- {&__pyx_n_s_best_sol, __pyx_k_best_sol, sizeof(__pyx_k_best_sol), 0, 0, 1, 1},
- {&__pyx_n_s_c, __pyx_k_c, sizeof(__pyx_k_c), 0, 0, 1, 1},
- {&__pyx_n_s_c1, __pyx_k_c1, sizeof(__pyx_k_c1), 0, 0, 1, 1},
- {&__pyx_n_s_c2, __pyx_k_c2, sizeof(__pyx_k_c2), 0, 0, 1, 1},
- {&__pyx_n_s_can_iup_in_between_locals_genexp, __pyx_k_can_iup_in_between_locals_genexp, sizeof(__pyx_k_can_iup_in_between_locals_genexp), 0, 0, 1, 1},
- {&__pyx_n_s_chain, __pyx_k_chain, sizeof(__pyx_k_chain), 0, 0, 1, 1},
- {&__pyx_n_s_cj, __pyx_k_cj, sizeof(__pyx_k_cj), 0, 0, 1, 1},
- {&__pyx_n_s_class_getitem, __pyx_k_class_getitem, sizeof(__pyx_k_class_getitem), 0, 0, 1, 1},
- {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1},
- {&__pyx_n_s_close, __pyx_k_close, sizeof(__pyx_k_close), 0, 0, 1, 1},
- {&__pyx_n_s_contour, __pyx_k_contour, sizeof(__pyx_k_contour), 0, 0, 1, 1},
- {&__pyx_n_s_coords, __pyx_k_coords, sizeof(__pyx_k_coords), 0, 0, 1, 1},
- {&__pyx_n_s_cost, __pyx_k_cost, sizeof(__pyx_k_cost), 0, 0, 1, 1},
- {&__pyx_n_s_costs, __pyx_k_costs, sizeof(__pyx_k_costs), 0, 0, 1, 1},
- {&__pyx_n_s_cython, __pyx_k_cython, sizeof(__pyx_k_cython), 0, 0, 1, 1},
- {&__pyx_n_s_d, __pyx_k_d, sizeof(__pyx_k_d), 0, 0, 1, 1},
- {&__pyx_n_s_d0, __pyx_k_d0, sizeof(__pyx_k_d0), 0, 0, 1, 1},
- {&__pyx_n_s_d1, __pyx_k_d1, sizeof(__pyx_k_d1), 0, 0, 1, 1},
- {&__pyx_n_s_d2, __pyx_k_d2, sizeof(__pyx_k_d2), 0, 0, 1, 1},
- {&__pyx_n_s_deltas, __pyx_k_deltas, sizeof(__pyx_k_deltas), 0, 0, 1, 1},
- {&__pyx_kp_u_disable, __pyx_k_disable, sizeof(__pyx_k_disable), 0, 1, 0, 0},
- {&__pyx_n_s_dj, __pyx_k_dj, sizeof(__pyx_k_dj), 0, 0, 1, 1},
- {&__pyx_kp_u_enable, __pyx_k_enable, sizeof(__pyx_k_enable), 0, 1, 0, 0},
- {&__pyx_n_s_end, __pyx_k_end, sizeof(__pyx_k_end), 0, 0, 1, 1},
- {&__pyx_n_s_ends, __pyx_k_ends, sizeof(__pyx_k_ends), 0, 0, 1, 1},
- {&__pyx_n_s_enumerate, __pyx_k_enumerate, sizeof(__pyx_k_enumerate), 0, 0, 1, 1},
- {&__pyx_n_s_fontTools_misc, __pyx_k_fontTools_misc, sizeof(__pyx_k_fontTools_misc), 0, 0, 1, 1},
- {&__pyx_n_s_fontTools_varLib_iup, __pyx_k_fontTools_varLib_iup, sizeof(__pyx_k_fontTools_varLib_iup), 0, 0, 1, 1},
- {&__pyx_n_s_force, __pyx_k_force, sizeof(__pyx_k_force), 0, 0, 1, 1},
- {&__pyx_n_s_forced, __pyx_k_forced, sizeof(__pyx_k_forced), 0, 0, 1, 1},
- {&__pyx_kp_u_gc, __pyx_k_gc, sizeof(__pyx_k_gc), 0, 1, 0, 0},
- {&__pyx_n_s_genexpr, __pyx_k_genexpr, sizeof(__pyx_k_genexpr), 0, 0, 1, 1},
- {&__pyx_n_s_i, __pyx_k_i, sizeof(__pyx_k_i), 0, 0, 1, 1},
- {&__pyx_n_s_i1, __pyx_k_i1, sizeof(__pyx_k_i1), 0, 0, 1, 1},
- {&__pyx_n_s_i2, __pyx_k_i2, sizeof(__pyx_k_i2), 0, 0, 1, 1},
- {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1},
- {&__pyx_n_s_indices, __pyx_k_indices, sizeof(__pyx_k_indices), 0, 0, 1, 1},
- {&__pyx_n_s_int, __pyx_k_int, sizeof(__pyx_k_int), 0, 0, 1, 1},
- {&__pyx_n_s_is_coroutine, __pyx_k_is_coroutine, sizeof(__pyx_k_is_coroutine), 0, 0, 1, 1},
- {&__pyx_kp_u_isenabled, __pyx_k_isenabled, sizeof(__pyx_k_isenabled), 0, 1, 0, 0},
- {&__pyx_n_s_it, __pyx_k_it, sizeof(__pyx_k_it), 0, 0, 1, 1},
- {&__pyx_n_s_iup_contour, __pyx_k_iup_contour, sizeof(__pyx_k_iup_contour), 0, 0, 1, 1},
- {&__pyx_n_s_iup_contour_bound_forced_set, __pyx_k_iup_contour_bound_forced_set, sizeof(__pyx_k_iup_contour_bound_forced_set), 0, 0, 1, 1},
- {&__pyx_n_s_iup_contour_optimize, __pyx_k_iup_contour_optimize, sizeof(__pyx_k_iup_contour_optimize), 0, 0, 1, 1},
- {&__pyx_n_s_iup_contour_optimize_dp, __pyx_k_iup_contour_optimize_dp, sizeof(__pyx_k_iup_contour_optimize_dp), 0, 0, 1, 1},
- {&__pyx_n_s_iup_contour_optimize_locals_gene, __pyx_k_iup_contour_optimize_locals_gene, sizeof(__pyx_k_iup_contour_optimize_locals_gene), 0, 0, 1, 1},
- {&__pyx_n_s_iup_delta, __pyx_k_iup_delta, sizeof(__pyx_k_iup_delta), 0, 0, 1, 1},
- {&__pyx_n_s_iup_delta_optimize, __pyx_k_iup_delta_optimize, sizeof(__pyx_k_iup_delta_optimize), 0, 0, 1, 1},
- {&__pyx_n_s_j, __pyx_k_j, sizeof(__pyx_k_j), 0, 0, 1, 1},
- {&__pyx_n_s_k, __pyx_k_k, sizeof(__pyx_k_k), 0, 0, 1, 1},
- {&__pyx_n_s_l, __pyx_k_l, sizeof(__pyx_k_l), 0, 0, 1, 1},
- {&__pyx_n_s_lc, __pyx_k_lc, sizeof(__pyx_k_lc), 0, 0, 1, 1},
- {&__pyx_n_s_lcj, __pyx_k_lcj, sizeof(__pyx_k_lcj), 0, 0, 1, 1},
- {&__pyx_n_s_ld, __pyx_k_ld, sizeof(__pyx_k_ld), 0, 0, 1, 1},
- {&__pyx_n_s_ldj, __pyx_k_ldj, sizeof(__pyx_k_ldj), 0, 0, 1, 1},
- {&__pyx_n_s_list, __pyx_k_list, sizeof(__pyx_k_list), 0, 0, 1, 1},
- {&__pyx_n_s_lookback, __pyx_k_lookback, sizeof(__pyx_k_lookback), 0, 0, 1, 1},
- {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1},
- {&__pyx_n_s_max, __pyx_k_max, sizeof(__pyx_k_max), 0, 0, 1, 1},
- {&__pyx_n_s_n, __pyx_k_n, sizeof(__pyx_k_n), 0, 0, 1, 1},
- {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1},
- {&__pyx_n_s_nc, __pyx_k_nc, sizeof(__pyx_k_nc), 0, 0, 1, 1},
- {&__pyx_n_s_ncj, __pyx_k_ncj, sizeof(__pyx_k_ncj), 0, 0, 1, 1},
- {&__pyx_n_s_nd, __pyx_k_nd, sizeof(__pyx_k_nd), 0, 0, 1, 1},
- {&__pyx_n_s_ndj, __pyx_k_ndj, sizeof(__pyx_k_ndj), 0, 0, 1, 1},
- {&__pyx_n_s_numbers, __pyx_k_numbers, sizeof(__pyx_k_numbers), 0, 0, 1, 1},
- {&__pyx_n_s_out, __pyx_k_out, sizeof(__pyx_k_out), 0, 0, 1, 1},
- {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1},
- {&__pyx_n_s_return, __pyx_k_return, sizeof(__pyx_k_return), 0, 0, 1, 1},
- {&__pyx_n_s_ri1, __pyx_k_ri1, sizeof(__pyx_k_ri1), 0, 0, 1, 1},
- {&__pyx_n_s_ri2, __pyx_k_ri2, sizeof(__pyx_k_ri2), 0, 0, 1, 1},
- {&__pyx_n_s_rot_list, __pyx_k_rot_list, sizeof(__pyx_k_rot_list), 0, 0, 1, 1},
- {&__pyx_n_s_rot_set, __pyx_k_rot_set, sizeof(__pyx_k_rot_set), 0, 0, 1, 1},
- {&__pyx_n_s_s, __pyx_k_s, sizeof(__pyx_k_s), 0, 0, 1, 1},
- {&__pyx_n_s_send, __pyx_k_send, sizeof(__pyx_k_send), 0, 0, 1, 1},
- {&__pyx_n_s_set, __pyx_k_set, sizeof(__pyx_k_set), 0, 0, 1, 1},
- {&__pyx_n_s_solution, __pyx_k_solution, sizeof(__pyx_k_solution), 0, 0, 1, 1},
- {&__pyx_n_s_start, __pyx_k_start, sizeof(__pyx_k_start), 0, 0, 1, 1},
- {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1},
- {&__pyx_n_s_throw, __pyx_k_throw, sizeof(__pyx_k_throw), 0, 0, 1, 1},
- {&__pyx_n_s_tolerance, __pyx_k_tolerance, sizeof(__pyx_k_tolerance), 0, 0, 1, 1},
- {&__pyx_n_s_typing, __pyx_k_typing, sizeof(__pyx_k_typing), 0, 0, 1, 1},
- {&__pyx_n_s_v, __pyx_k_v, sizeof(__pyx_k_v), 0, 0, 1, 1},
- {&__pyx_n_s_zip, __pyx_k_zip, sizeof(__pyx_k_zip), 0, 0, 1, 1},
- {0, 0, 0, 0, 0, 0, 0}
- };
- return __Pyx_InitStrings(__pyx_string_tab);
-}
-/* #### Code section: cached_builtins ### */
-static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) {
- __pyx_builtin_AttributeError = __Pyx_GetBuiltinName(__pyx_n_s_AttributeError); if (!__pyx_builtin_AttributeError) __PYX_ERR(0, 5, __pyx_L1_error)
- __pyx_builtin_ImportError = __Pyx_GetBuiltinName(__pyx_n_s_ImportError); if (!__pyx_builtin_ImportError) __PYX_ERR(0, 5, __pyx_L1_error)
- __pyx_builtin_zip = __Pyx_GetBuiltinName(__pyx_n_s_zip); if (!__pyx_builtin_zip) __PYX_ERR(0, 94, __pyx_L1_error)
- __pyx_builtin_AssertionError = __Pyx_GetBuiltinName(__pyx_n_s_AssertionError); if (!__pyx_builtin_AssertionError) __PYX_ERR(0, 103, __pyx_L1_error)
- __pyx_builtin_enumerate = __Pyx_GetBuiltinName(__pyx_n_s_enumerate); if (!__pyx_builtin_enumerate) __PYX_ERR(0, 109, __pyx_L1_error)
- __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(0, 233, __pyx_L1_error)
- __pyx_builtin_max = __Pyx_GetBuiltinName(__pyx_n_s_max); if (!__pyx_builtin_max) __PYX_ERR(0, 410, __pyx_L1_error)
- return 0;
- __pyx_L1_error:;
- return -1;
-}
-/* #### Code section: cached_constants ### */
-
-static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0);
-
- /* "fontTools/varLib/iup.py":63
- * # rd1 = reference delta 1
- * out_arrays = [None, None]
- * for j in 0, 1: # <<<<<<<<<<<<<<
- * out_arrays[j] = out = []
- * x1, x2, d1, d2 = rc1[j], rc2[j], rd1[j], rd2[j]
- */
- __pyx_tuple_ = PyTuple_Pack(2, __pyx_int_0, __pyx_int_1); if (unlikely(!__pyx_tuple_)) __PYX_ERR(0, 63, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple_);
- __Pyx_GIVEREF(__pyx_tuple_);
-
- /* "fontTools/varLib/iup.py":112
- * if not indices:
- * # All deltas are None. Return 0,0 for all.
- * return [(0, 0)] * n # <<<<<<<<<<<<<<
- *
- * out = []
- */
- __pyx_tuple__2 = PyTuple_Pack(2, __pyx_int_0, __pyx_int_0); if (unlikely(!__pyx_tuple__2)) __PYX_ERR(0, 112, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__2);
- __Pyx_GIVEREF(__pyx_tuple__2);
-
- /* "fontTools/varLib/iup.py":97
- *
- *
- * def iup_contour(deltas: _DeltaOrNoneSegment, coords: _PointSegment) -> _DeltaSegment: # <<<<<<<<<<<<<<
- * """For the contour given in `coords`, interpolate any missing
- * delta values in delta vector `deltas`.
- */
- __pyx_tuple__4 = PyTuple_Pack(14, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_n, __pyx_n_s_indices, __pyx_n_s_out, __pyx_n_s_it, __pyx_n_s_start, __pyx_n_s_i1, __pyx_n_s_i2, __pyx_n_s_ri1, __pyx_n_s_ri2, __pyx_n_s_end, __pyx_n_s_i, __pyx_n_s_v); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(0, 97, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__4);
- __Pyx_GIVEREF(__pyx_tuple__4);
- __pyx_codeobj__5 = (PyObject*)__Pyx_PyCode_New(2, 0, 0, 14, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__4, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_contour, 97, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__5)) __PYX_ERR(0, 97, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":149
- *
- *
- * def iup_delta( # <<<<<<<<<<<<<<
- * deltas: _DeltaOrNoneSegment, coords: _PointSegment, ends: _Endpoints
- * ) -> _DeltaSegment:
- */
- __pyx_tuple__6 = PyTuple_Pack(8, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_ends, __pyx_n_s_n, __pyx_n_s_out, __pyx_n_s_start, __pyx_n_s_end, __pyx_n_s_contour); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(0, 149, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__6);
- __Pyx_GIVEREF(__pyx_tuple__6);
- __pyx_codeobj__7 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 8, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__6, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_delta, 149, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__7)) __PYX_ERR(0, 149, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":208
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * cj=cython.double,
- * dj=cython.double,
- */
- __pyx_tuple__8 = PyTuple_Pack(24, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_tolerance, __pyx_n_s_cj, __pyx_n_s_dj, __pyx_n_s_lcj, __pyx_n_s_ldj, __pyx_n_s_ncj, __pyx_n_s_ndj, __pyx_n_s_force, __pyx_n_s_forced, __pyx_n_s_n, __pyx_n_s_i, __pyx_n_s_ld, __pyx_n_s_lc, __pyx_n_s_d, __pyx_n_s_c, __pyx_n_s_nd, __pyx_n_s_nc, __pyx_n_s_j, __pyx_n_s_c1, __pyx_n_s_c2, __pyx_n_s_d1, __pyx_n_s_d2); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(0, 208, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__8);
- __Pyx_GIVEREF(__pyx_tuple__8);
- __pyx_codeobj__9 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 24, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__8, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_contour_bound_forced_set, 208, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__9)) __PYX_ERR(0, 208, __pyx_L1_error)
- __pyx_tuple__10 = PyTuple_Pack(1, ((PyObject *)__pyx_int_0)); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(0, 208, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__10);
- __Pyx_GIVEREF(__pyx_tuple__10);
-
- /* "fontTools/varLib/iup.py":299
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * i=cython.int,
- * j=cython.int,
- */
- __pyx_tuple__11 = PyTuple_Pack(13, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_forced, __pyx_n_s_tolerance, __pyx_n_s_lookback, __pyx_n_s_i, __pyx_n_s_j, __pyx_n_s_best_cost, __pyx_n_s_best_j, __pyx_n_s_cost, __pyx_n_s_n, __pyx_n_s_costs, __pyx_n_s_chain); if (unlikely(!__pyx_tuple__11)) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__11);
- __Pyx_GIVEREF(__pyx_tuple__11);
- __pyx_codeobj__12 = (PyObject*)__Pyx_PyCode_New(5, 0, 0, 13, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__11, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_contour_optimize_dp, 299, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__12)) __PYX_ERR(0, 299, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":352
- *
- *
- * def _rot_list(l: list, k: int): # <<<<<<<<<<<<<<
- * """Rotate list by k items forward. Ie. item at position 0 will be
- * at position k in returned list. Negative k is allowed."""
- */
- __pyx_tuple__13 = PyTuple_Pack(3, __pyx_n_s_l, __pyx_n_s_k, __pyx_n_s_n); if (unlikely(!__pyx_tuple__13)) __PYX_ERR(0, 352, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__13);
- __Pyx_GIVEREF(__pyx_tuple__13);
- __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(2, 0, 0, 3, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__13, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_rot_list, 352, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(0, 352, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":362
- *
- *
- * def _rot_set(s: set, k: int, n: int): # <<<<<<<<<<<<<<
- * k %= n
- * if not k:
- */
- __pyx_tuple__15 = PyTuple_Pack(4, __pyx_n_s_s, __pyx_n_s_k, __pyx_n_s_n, __pyx_n_s_v); if (unlikely(!__pyx_tuple__15)) __PYX_ERR(0, 362, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__15);
- __Pyx_GIVEREF(__pyx_tuple__15);
- __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 4, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__15, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_rot_set, 362, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(0, 362, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
- __pyx_tuple__17 = PyTuple_Pack(20, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_tolerance, __pyx_n_s_n, __pyx_n_s_d0, __pyx_n_s_forced, __pyx_n_s_k, __pyx_n_s_chain, __pyx_n_s_costs, __pyx_n_s_solution, __pyx_n_s_i, __pyx_n_s_best_sol, __pyx_n_s_best_cost, __pyx_n_s_start, __pyx_n_s_cost, __pyx_n_s_genexpr, __pyx_n_s_genexpr, __pyx_n_s_genexpr, __pyx_n_s_i, __pyx_n_s_i); if (unlikely(!__pyx_tuple__17)) __PYX_ERR(0, 369, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__17);
- __Pyx_GIVEREF(__pyx_tuple__17);
- __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 20, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__17, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_contour_optimize, 369, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(0, 369, __pyx_L1_error)
- __pyx_tuple__19 = PyTuple_Pack(1, ((PyObject*)__pyx_float_0_0)); if (unlikely(!__pyx_tuple__19)) __PYX_ERR(0, 369, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__19);
- __Pyx_GIVEREF(__pyx_tuple__19);
-
- /* "fontTools/varLib/iup.py":470
- *
- *
- * def iup_delta_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment,
- * coords: _PointSegment,
- */
- __pyx_tuple__20 = PyTuple_Pack(9, __pyx_n_s_deltas, __pyx_n_s_coords, __pyx_n_s_ends, __pyx_n_s_tolerance, __pyx_n_s_n, __pyx_n_s_out, __pyx_n_s_start, __pyx_n_s_end, __pyx_n_s_contour); if (unlikely(!__pyx_tuple__20)) __PYX_ERR(0, 470, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__20);
- __Pyx_GIVEREF(__pyx_tuple__20);
- __pyx_codeobj__21 = (PyObject*)__Pyx_PyCode_New(4, 0, 0, 9, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__20, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_Lib_fontTools_varLib_iup_py, __pyx_n_s_iup_delta_optimize, 470, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__21)) __PYX_ERR(0, 470, __pyx_L1_error)
- __pyx_tuple__22 = PyTuple_Pack(1, ((PyObject*)__pyx_float_0_0)); if (unlikely(!__pyx_tuple__22)) __PYX_ERR(0, 470, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_tuple__22);
- __Pyx_GIVEREF(__pyx_tuple__22);
- __Pyx_RefNannyFinishContext();
- return 0;
- __pyx_L1_error:;
- __Pyx_RefNannyFinishContext();
- return -1;
-}
-/* #### Code section: init_constants ### */
-
-static CYTHON_SMALL_CODE int __Pyx_InitConstants(void) {
- if (__Pyx_CreateStringTabAndInitStrings() < 0) __PYX_ERR(0, 1, __pyx_L1_error);
- __pyx_float_0_0 = PyFloat_FromDouble(0.0); if (unlikely(!__pyx_float_0_0)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_0 = PyInt_FromLong(0); if (unlikely(!__pyx_int_0)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_1 = PyInt_FromLong(1); if (unlikely(!__pyx_int_1)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_2 = PyInt_FromLong(2); if (unlikely(!__pyx_int_2)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_3 = PyInt_FromLong(3); if (unlikely(!__pyx_int_3)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_4 = PyInt_FromLong(4); if (unlikely(!__pyx_int_4)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_8 = PyInt_FromLong(8); if (unlikely(!__pyx_int_8)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_int_neg_1 = PyInt_FromLong(-1); if (unlikely(!__pyx_int_neg_1)) __PYX_ERR(0, 1, __pyx_L1_error)
- return 0;
- __pyx_L1_error:;
- return -1;
-}
-/* #### Code section: init_globals ### */
-
-static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) {
- /* AssertionsEnabled.init */
- __Pyx_init_assertions_enabled();
-
-if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 1, __pyx_L1_error)
-
- return 0;
- __pyx_L1_error:;
- return -1;
-}
-/* #### Code section: init_module ### */
-
-static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/
-static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/
-
-static int __Pyx_modinit_global_init_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0);
- /*--- Global init code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-static int __Pyx_modinit_variable_export_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0);
- /*--- Variable export code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-static int __Pyx_modinit_function_export_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0);
- /*--- Function export code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-static int __Pyx_modinit_type_init_code(void) {
- __Pyx_RefNannyDeclarations
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0);
- /*--- Type init code ---*/
- #if CYTHON_USE_TYPE_SPECS
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr = (PyTypeObject *) __Pyx_PyType_FromModuleAndSpec(__pyx_m, &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr_spec, NULL); if (unlikely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr)) __PYX_ERR(0, 203, __pyx_L1_error)
- if (__Pyx_fix_up_extension_type_from_spec(&__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr_spec, __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr) < 0) __PYX_ERR(0, 203, __pyx_L1_error)
- #else
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr = &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- #endif
- #if !CYTHON_USE_TYPE_SPECS
- if (__Pyx_PyType_Ready(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr) < 0) __PYX_ERR(0, 203, __pyx_L1_error)
- #endif
- #if PY_MAJOR_VERSION < 3
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr->tp_print = 0;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr->tp_dictoffset && __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr->tp_getattro == PyObject_GenericGetAttr)) {
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct__genexpr->tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict;
- }
- #endif
- #if CYTHON_USE_TYPE_SPECS
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize = (PyTypeObject *) __Pyx_PyType_FromModuleAndSpec(__pyx_m, &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize_spec, NULL); if (unlikely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize)) __PYX_ERR(0, 369, __pyx_L1_error)
- if (__Pyx_fix_up_extension_type_from_spec(&__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize_spec, __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- #else
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize = &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- #endif
- #if !CYTHON_USE_TYPE_SPECS
- if (__Pyx_PyType_Ready(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- #endif
- #if PY_MAJOR_VERSION < 3
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize->tp_print = 0;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize->tp_dictoffset && __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize->tp_getattro == PyObject_GenericGetAttr)) {
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_1_iup_contour_optimize->tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict;
- }
- #endif
- #if CYTHON_USE_TYPE_SPECS
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr = (PyTypeObject *) __Pyx_PyType_FromModuleAndSpec(__pyx_m, &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr_spec, NULL); if (unlikely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr)) __PYX_ERR(0, 384, __pyx_L1_error)
- if (__Pyx_fix_up_extension_type_from_spec(&__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr_spec, __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr) < 0) __PYX_ERR(0, 384, __pyx_L1_error)
- #else
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr = &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- #endif
- #if !CYTHON_USE_TYPE_SPECS
- if (__Pyx_PyType_Ready(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr) < 0) __PYX_ERR(0, 384, __pyx_L1_error)
- #endif
- #if PY_MAJOR_VERSION < 3
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr->tp_print = 0;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr->tp_dictoffset && __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr->tp_getattro == PyObject_GenericGetAttr)) {
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_2_genexpr->tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict;
- }
- #endif
- #if CYTHON_USE_TYPE_SPECS
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr = (PyTypeObject *) __Pyx_PyType_FromModuleAndSpec(__pyx_m, &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr_spec, NULL); if (unlikely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr)) __PYX_ERR(0, 393, __pyx_L1_error)
- if (__Pyx_fix_up_extension_type_from_spec(&__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr_spec, __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr) < 0) __PYX_ERR(0, 393, __pyx_L1_error)
- #else
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr = &__pyx_type_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- #endif
- #if !CYTHON_USE_TYPE_SPECS
- if (__Pyx_PyType_Ready(__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr) < 0) __PYX_ERR(0, 393, __pyx_L1_error)
- #endif
- #if PY_MAJOR_VERSION < 3
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr->tp_print = 0;
- #endif
- #if !CYTHON_COMPILING_IN_LIMITED_API
- if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr->tp_dictoffset && __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr->tp_getattro == PyObject_GenericGetAttr)) {
- __pyx_ptype_9fontTools_6varLib_3iup___pyx_scope_struct_3_genexpr->tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict;
- }
- #endif
- __Pyx_RefNannyFinishContext();
- return 0;
- __pyx_L1_error:;
- __Pyx_RefNannyFinishContext();
- return -1;
-}
-
-static int __Pyx_modinit_type_import_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0);
- /*--- Type import code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-static int __Pyx_modinit_variable_import_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0);
- /*--- Variable import code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-static int __Pyx_modinit_function_import_code(void) {
- __Pyx_RefNannyDeclarations
- __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0);
- /*--- Function import code ---*/
- __Pyx_RefNannyFinishContext();
- return 0;
-}
-
-
-#if PY_MAJOR_VERSION >= 3
-#if CYTHON_PEP489_MULTI_PHASE_INIT
-static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/
-static int __pyx_pymod_exec_iup(PyObject* module); /*proto*/
-static PyModuleDef_Slot __pyx_moduledef_slots[] = {
- {Py_mod_create, (void*)__pyx_pymod_create},
- {Py_mod_exec, (void*)__pyx_pymod_exec_iup},
- {0, NULL}
-};
-#endif
-
-#ifdef __cplusplus
-namespace {
- struct PyModuleDef __pyx_moduledef =
- #else
- static struct PyModuleDef __pyx_moduledef =
- #endif
- {
- PyModuleDef_HEAD_INIT,
- "iup",
- 0, /* m_doc */
- #if CYTHON_PEP489_MULTI_PHASE_INIT
- 0, /* m_size */
- #elif CYTHON_USE_MODULE_STATE
- sizeof(__pyx_mstate), /* m_size */
- #else
- -1, /* m_size */
- #endif
- __pyx_methods /* m_methods */,
- #if CYTHON_PEP489_MULTI_PHASE_INIT
- __pyx_moduledef_slots, /* m_slots */
- #else
- NULL, /* m_reload */
- #endif
- #if CYTHON_USE_MODULE_STATE
- __pyx_m_traverse, /* m_traverse */
- __pyx_m_clear, /* m_clear */
- NULL /* m_free */
- #else
- NULL, /* m_traverse */
- NULL, /* m_clear */
- NULL /* m_free */
- #endif
- };
- #ifdef __cplusplus
-} /* anonymous namespace */
-#endif
-#endif
-
-#ifndef CYTHON_NO_PYINIT_EXPORT
-#define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC
-#elif PY_MAJOR_VERSION < 3
-#ifdef __cplusplus
-#define __Pyx_PyMODINIT_FUNC extern "C" void
-#else
-#define __Pyx_PyMODINIT_FUNC void
-#endif
-#else
-#ifdef __cplusplus
-#define __Pyx_PyMODINIT_FUNC extern "C" PyObject *
-#else
-#define __Pyx_PyMODINIT_FUNC PyObject *
-#endif
-#endif
-
-
-#if PY_MAJOR_VERSION < 3
-__Pyx_PyMODINIT_FUNC initiup(void) CYTHON_SMALL_CODE; /*proto*/
-__Pyx_PyMODINIT_FUNC initiup(void)
-#else
-__Pyx_PyMODINIT_FUNC PyInit_iup(void) CYTHON_SMALL_CODE; /*proto*/
-__Pyx_PyMODINIT_FUNC PyInit_iup(void)
-#if CYTHON_PEP489_MULTI_PHASE_INIT
-{
- return PyModuleDef_Init(&__pyx_moduledef);
-}
-static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) {
- #if PY_VERSION_HEX >= 0x030700A1
- static PY_INT64_T main_interpreter_id = -1;
- PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp);
- if (main_interpreter_id == -1) {
- main_interpreter_id = current_id;
- return (unlikely(current_id == -1)) ? -1 : 0;
- } else if (unlikely(main_interpreter_id != current_id))
- #else
- static PyInterpreterState *main_interpreter = NULL;
- PyInterpreterState *current_interpreter = PyThreadState_Get()->interp;
- if (!main_interpreter) {
- main_interpreter = current_interpreter;
- } else if (unlikely(main_interpreter != current_interpreter))
- #endif
- {
- PyErr_SetString(
- PyExc_ImportError,
- "Interpreter change detected - this module can only be loaded into one interpreter per process.");
- return -1;
- }
- return 0;
-}
-#if CYTHON_COMPILING_IN_LIMITED_API
-static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *module, const char* from_name, const char* to_name, int allow_none)
-#else
-static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none)
-#endif
-{
- PyObject *value = PyObject_GetAttrString(spec, from_name);
- int result = 0;
- if (likely(value)) {
- if (allow_none || value != Py_None) {
-#if CYTHON_COMPILING_IN_LIMITED_API
- result = PyModule_AddObject(module, to_name, value);
-#else
- result = PyDict_SetItemString(moddict, to_name, value);
-#endif
- }
- Py_DECREF(value);
- } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) {
- PyErr_Clear();
- } else {
- result = -1;
- }
- return result;
-}
-static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def) {
- PyObject *module = NULL, *moddict, *modname;
- CYTHON_UNUSED_VAR(def);
- if (__Pyx_check_single_interpreter())
- return NULL;
- if (__pyx_m)
- return __Pyx_NewRef(__pyx_m);
- modname = PyObject_GetAttrString(spec, "name");
- if (unlikely(!modname)) goto bad;
- module = PyModule_NewObject(modname);
- Py_DECREF(modname);
- if (unlikely(!module)) goto bad;
-#if CYTHON_COMPILING_IN_LIMITED_API
- moddict = module;
-#else
- moddict = PyModule_GetDict(module);
- if (unlikely(!moddict)) goto bad;
-#endif
- if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad;
- if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad;
- if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad;
- if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad;
- return module;
-bad:
- Py_XDECREF(module);
- return NULL;
-}
-
-
-static CYTHON_SMALL_CODE int __pyx_pymod_exec_iup(PyObject *__pyx_pyinit_module)
-#endif
-#endif
-{
- int stringtab_initialized = 0;
- #if CYTHON_USE_MODULE_STATE
- int pystate_addmodule_run = 0;
- #endif
- PyObject *__pyx_t_1 = NULL;
- PyObject *__pyx_t_2 = NULL;
- PyObject *__pyx_t_3 = NULL;
- int __pyx_t_4;
- PyObject *__pyx_t_5 = NULL;
- PyObject *__pyx_t_6 = NULL;
- PyObject *__pyx_t_7 = NULL;
- PyObject *__pyx_t_8 = NULL;
- PyObject *__pyx_t_9 = NULL;
- int __pyx_lineno = 0;
- const char *__pyx_filename = NULL;
- int __pyx_clineno = 0;
- __Pyx_RefNannyDeclarations
- #if CYTHON_PEP489_MULTI_PHASE_INIT
- if (__pyx_m) {
- if (__pyx_m == __pyx_pyinit_module) return 0;
- PyErr_SetString(PyExc_RuntimeError, "Module 'iup' has already been imported. Re-initialisation is not supported.");
- return -1;
- }
- #elif PY_MAJOR_VERSION >= 3
- if (__pyx_m) return __Pyx_NewRef(__pyx_m);
- #endif
- /*--- Module creation code ---*/
- #if CYTHON_PEP489_MULTI_PHASE_INIT
- __pyx_m = __pyx_pyinit_module;
- Py_INCREF(__pyx_m);
- #else
- #if PY_MAJOR_VERSION < 3
- __pyx_m = Py_InitModule4("iup", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m);
- if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error)
- #elif CYTHON_USE_MODULE_STATE
- __pyx_t_1 = PyModule_Create(&__pyx_moduledef); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error)
- {
- int add_module_result = PyState_AddModule(__pyx_t_1, &__pyx_moduledef);
- __pyx_t_1 = 0; /* transfer ownership from __pyx_t_1 to iup pseudovariable */
- if (unlikely((add_module_result < 0))) __PYX_ERR(0, 1, __pyx_L1_error)
- pystate_addmodule_run = 1;
- }
- #else
- __pyx_m = PyModule_Create(&__pyx_moduledef);
- if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #endif
- CYTHON_UNUSED_VAR(__pyx_t_1);
- __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error)
- Py_INCREF(__pyx_d);
- __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error)
- Py_INCREF(__pyx_b);
- __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error)
- Py_INCREF(__pyx_cython_runtime);
- if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #if CYTHON_REFNANNY
-__Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny");
-if (!__Pyx_RefNanny) {
- PyErr_Clear();
- __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny");
- if (!__Pyx_RefNanny)
- Py_FatalError("failed to import 'refnanny' module");
-}
-#endif
- __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit_iup(void)", 0);
- if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #ifdef __Pxy_PyFrame_Initialize_Offsets
- __Pxy_PyFrame_Initialize_Offsets();
- #endif
- __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error)
- __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error)
- #ifdef __Pyx_CyFunction_USED
- if (__pyx_CyFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #ifdef __Pyx_FusedFunction_USED
- if (__pyx_FusedFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #ifdef __Pyx_Coroutine_USED
- if (__pyx_Coroutine_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #ifdef __Pyx_Generator_USED
- if (__pyx_Generator_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #ifdef __Pyx_AsyncGen_USED
- if (__pyx_AsyncGen_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- #ifdef __Pyx_StopAsyncIteration_USED
- if (__pyx_StopAsyncIteration_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- /*--- Library function declarations ---*/
- /*--- Threads initialization code ---*/
- #if defined(WITH_THREAD) && PY_VERSION_HEX < 0x030700F0 && defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS
- PyEval_InitThreads();
- #endif
- /*--- Initialize various global constants etc. ---*/
- if (__Pyx_InitConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- stringtab_initialized = 1;
- if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT)
- if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
- if (__pyx_module_is_main_fontTools__varLib__iup) {
- if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- }
- #if PY_MAJOR_VERSION >= 3
- {
- PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error)
- if (!PyDict_GetItemString(modules, "fontTools.varLib.iup")) {
- if (unlikely((PyDict_SetItemString(modules, "fontTools.varLib.iup", __pyx_m) < 0))) __PYX_ERR(0, 1, __pyx_L1_error)
- }
- }
- #endif
- /*--- Builtin init code ---*/
- if (__Pyx_InitCachedBuiltins() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- /*--- Constants init code ---*/
- if (__Pyx_InitCachedConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- /*--- Global type/function init code ---*/
- (void)__Pyx_modinit_global_init_code();
- (void)__Pyx_modinit_variable_export_code();
- (void)__Pyx_modinit_function_export_code();
- if (unlikely((__Pyx_modinit_type_init_code() < 0))) __PYX_ERR(0, 1, __pyx_L1_error)
- (void)__Pyx_modinit_type_import_code();
- (void)__Pyx_modinit_variable_import_code();
- (void)__Pyx_modinit_function_import_code();
- /*--- Execution code ---*/
- #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED)
- if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- #endif
-
- /* "fontTools/varLib/iup.py":1
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- {
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3);
- __Pyx_XGOTREF(__pyx_t_1);
- __Pyx_XGOTREF(__pyx_t_2);
- __Pyx_XGOTREF(__pyx_t_3);
- /*try:*/ {
-
- /* "fontTools/varLib/iup.py":4
- * import cython
- *
- * COMPILED = cython.compiled # <<<<<<<<<<<<<<
- * except (AttributeError, ImportError):
- * # if cython not installed, use mock module with no-op decorators and types
- */
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_True) < 0) __PYX_ERR(0, 4, __pyx_L2_error)
-
- /* "fontTools/varLib/iup.py":1
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- }
- __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0;
- __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- goto __pyx_L7_try_end;
- __pyx_L2_error:;
-
- /* "fontTools/varLib/iup.py":5
- *
- * COMPILED = cython.compiled
- * except (AttributeError, ImportError): # <<<<<<<<<<<<<<
- * # if cython not installed, use mock module with no-op decorators and types
- * from fontTools.misc import cython
- */
- __pyx_t_4 = __Pyx_PyErr_ExceptionMatches2(__pyx_builtin_AttributeError, __pyx_builtin_ImportError);
- if (__pyx_t_4) {
- __Pyx_AddTraceback("fontTools.varLib.iup", __pyx_clineno, __pyx_lineno, __pyx_filename);
- if (__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(0, 5, __pyx_L4_except_error)
- __Pyx_XGOTREF(__pyx_t_5);
- __Pyx_XGOTREF(__pyx_t_6);
- __Pyx_XGOTREF(__pyx_t_7);
-
- /* "fontTools/varLib/iup.py":7
- * except (AttributeError, ImportError):
- * # if cython not installed, use mock module with no-op decorators and types
- * from fontTools.misc import cython # <<<<<<<<<<<<<<
- *
- * COMPILED = False
- */
- __pyx_t_8 = PyList_New(1); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 7, __pyx_L4_except_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_INCREF(__pyx_n_s_cython);
- __Pyx_GIVEREF(__pyx_n_s_cython);
- PyList_SET_ITEM(__pyx_t_8, 0, __pyx_n_s_cython);
- __pyx_t_9 = __Pyx_Import(__pyx_n_s_fontTools_misc, __pyx_t_8, 0); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 7, __pyx_L4_except_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __pyx_t_8 = __Pyx_ImportFrom(__pyx_t_9, __pyx_n_s_cython); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 7, __pyx_L4_except_error)
- __Pyx_GOTREF(__pyx_t_8);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_cython, __pyx_t_8) < 0) __PYX_ERR(0, 7, __pyx_L4_except_error)
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":9
- * from fontTools.misc import cython
- *
- * COMPILED = False # <<<<<<<<<<<<<<
- *
- * from typing import (
- */
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_False) < 0) __PYX_ERR(0, 9, __pyx_L4_except_error)
- __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0;
- goto __pyx_L3_exception_handled;
- }
- goto __pyx_L4_except_error;
-
- /* "fontTools/varLib/iup.py":1
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- __pyx_L4_except_error:;
- __Pyx_XGIVEREF(__pyx_t_1);
- __Pyx_XGIVEREF(__pyx_t_2);
- __Pyx_XGIVEREF(__pyx_t_3);
- __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3);
- goto __pyx_L1_error;
- __pyx_L3_exception_handled:;
- __Pyx_XGIVEREF(__pyx_t_1);
- __Pyx_XGIVEREF(__pyx_t_2);
- __Pyx_XGIVEREF(__pyx_t_3);
- __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3);
- __pyx_L7_try_end:;
- }
-
- /* "fontTools/varLib/iup.py":12
- *
- * from typing import (
- * Sequence, # <<<<<<<<<<<<<<
- * Tuple,
- * Union,
- */
- __pyx_t_7 = PyList_New(3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 12, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_INCREF(__pyx_n_s_Sequence);
- __Pyx_GIVEREF(__pyx_n_s_Sequence);
- PyList_SET_ITEM(__pyx_t_7, 0, __pyx_n_s_Sequence);
- __Pyx_INCREF(__pyx_n_s_Tuple);
- __Pyx_GIVEREF(__pyx_n_s_Tuple);
- PyList_SET_ITEM(__pyx_t_7, 1, __pyx_n_s_Tuple);
- __Pyx_INCREF(__pyx_n_s_Union);
- __Pyx_GIVEREF(__pyx_n_s_Union);
- PyList_SET_ITEM(__pyx_t_7, 2, __pyx_n_s_Union);
-
- /* "fontTools/varLib/iup.py":11
- * COMPILED = False
- *
- * from typing import ( # <<<<<<<<<<<<<<
- * Sequence,
- * Tuple,
- */
- __pyx_t_6 = __Pyx_Import(__pyx_n_s_typing, __pyx_t_7, 0); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 11, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = __Pyx_ImportFrom(__pyx_t_6, __pyx_n_s_Sequence); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 11, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Sequence, __pyx_t_7) < 0) __PYX_ERR(0, 12, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = __Pyx_ImportFrom(__pyx_t_6, __pyx_n_s_Tuple); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 11, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Tuple, __pyx_t_7) < 0) __PYX_ERR(0, 13, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __pyx_t_7 = __Pyx_ImportFrom(__pyx_t_6, __pyx_n_s_Union); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 11, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Union, __pyx_t_7) < 0) __PYX_ERR(0, 14, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":16
- * Union,
- * )
- * from numbers import Integral, Real # <<<<<<<<<<<<<<
- *
- * try:
- */
- __pyx_t_6 = PyList_New(2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_INCREF(__pyx_n_s_Integral);
- __Pyx_GIVEREF(__pyx_n_s_Integral);
- PyList_SET_ITEM(__pyx_t_6, 0, __pyx_n_s_Integral);
- __Pyx_INCREF(__pyx_n_s_Real);
- __Pyx_GIVEREF(__pyx_n_s_Real);
- PyList_SET_ITEM(__pyx_t_6, 1, __pyx_n_s_Real);
- __pyx_t_7 = __Pyx_Import(__pyx_n_s_numbers, __pyx_t_6, 0); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_ImportFrom(__pyx_t_7, __pyx_n_s_Integral); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Integral, __pyx_t_6) < 0) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_ImportFrom(__pyx_t_7, __pyx_n_s_Real); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Real, __pyx_t_6) < 0) __PYX_ERR(0, 16, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":18
- * from numbers import Integral, Real
- *
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- {
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- __Pyx_ExceptionSave(&__pyx_t_3, &__pyx_t_2, &__pyx_t_1);
- __Pyx_XGOTREF(__pyx_t_3);
- __Pyx_XGOTREF(__pyx_t_2);
- __Pyx_XGOTREF(__pyx_t_1);
- /*try:*/ {
-
- /* "fontTools/varLib/iup.py":21
- * import cython
- *
- * COMPILED = cython.compiled # <<<<<<<<<<<<<<
- * except (AttributeError, ImportError):
- * # if cython not installed, use mock module with no-op decorators and types
- */
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_True) < 0) __PYX_ERR(0, 21, __pyx_L10_error)
-
- /* "fontTools/varLib/iup.py":18
- * from numbers import Integral, Real
- *
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- }
- __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0;
- __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0;
- __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0;
- goto __pyx_L15_try_end;
- __pyx_L10_error:;
- __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0;
- __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0;
-
- /* "fontTools/varLib/iup.py":22
- *
- * COMPILED = cython.compiled
- * except (AttributeError, ImportError): # <<<<<<<<<<<<<<
- * # if cython not installed, use mock module with no-op decorators and types
- * from fontTools.misc import cython
- */
- __pyx_t_4 = __Pyx_PyErr_ExceptionMatches2(__pyx_builtin_AttributeError, __pyx_builtin_ImportError);
- if (__pyx_t_4) {
- __Pyx_AddTraceback("fontTools.varLib.iup", __pyx_clineno, __pyx_lineno, __pyx_filename);
- if (__Pyx_GetException(&__pyx_t_7, &__pyx_t_6, &__pyx_t_5) < 0) __PYX_ERR(0, 22, __pyx_L12_except_error)
- __Pyx_XGOTREF(__pyx_t_7);
- __Pyx_XGOTREF(__pyx_t_6);
- __Pyx_XGOTREF(__pyx_t_5);
-
- /* "fontTools/varLib/iup.py":24
- * except (AttributeError, ImportError):
- * # if cython not installed, use mock module with no-op decorators and types
- * from fontTools.misc import cython # <<<<<<<<<<<<<<
- *
- * COMPILED = False
- */
- __pyx_t_9 = PyList_New(1); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 24, __pyx_L12_except_error)
- __Pyx_GOTREF(__pyx_t_9);
- __Pyx_INCREF(__pyx_n_s_cython);
- __Pyx_GIVEREF(__pyx_n_s_cython);
- PyList_SET_ITEM(__pyx_t_9, 0, __pyx_n_s_cython);
- __pyx_t_8 = __Pyx_Import(__pyx_n_s_fontTools_misc, __pyx_t_9, 0); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 24, __pyx_L12_except_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __pyx_t_9 = __Pyx_ImportFrom(__pyx_t_8, __pyx_n_s_cython); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 24, __pyx_L12_except_error)
- __Pyx_GOTREF(__pyx_t_9);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_cython, __pyx_t_9) < 0) __PYX_ERR(0, 24, __pyx_L12_except_error)
- __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
-
- /* "fontTools/varLib/iup.py":26
- * from fontTools.misc import cython
- *
- * COMPILED = False # <<<<<<<<<<<<<<
- *
- *
- */
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_COMPILED, Py_False) < 0) __PYX_ERR(0, 26, __pyx_L12_except_error)
- __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0;
- goto __pyx_L11_exception_handled;
- }
- goto __pyx_L12_except_error;
-
- /* "fontTools/varLib/iup.py":18
- * from numbers import Integral, Real
- *
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- __pyx_L12_except_error:;
- __Pyx_XGIVEREF(__pyx_t_3);
- __Pyx_XGIVEREF(__pyx_t_2);
- __Pyx_XGIVEREF(__pyx_t_1);
- __Pyx_ExceptionReset(__pyx_t_3, __pyx_t_2, __pyx_t_1);
- goto __pyx_L1_error;
- __pyx_L11_exception_handled:;
- __Pyx_XGIVEREF(__pyx_t_3);
- __Pyx_XGIVEREF(__pyx_t_2);
- __Pyx_XGIVEREF(__pyx_t_1);
- __Pyx_ExceptionReset(__pyx_t_3, __pyx_t_2, __pyx_t_1);
- __pyx_L15_try_end:;
- }
-
- /* "fontTools/varLib/iup.py":29
- *
- *
- * _Point = Tuple[Real, Real] # <<<<<<<<<<<<<<
- * _Delta = Tuple[Real, Real]
- * _PointSegment = Sequence[_Point]
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_Tuple); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_Real); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_Real); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_8 = PyTuple_New(2); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_GIVEREF(__pyx_t_6);
- PyTuple_SET_ITEM(__pyx_t_8, 0, __pyx_t_6);
- __Pyx_GIVEREF(__pyx_t_7);
- PyTuple_SET_ITEM(__pyx_t_8, 1, __pyx_t_7);
- __pyx_t_6 = 0;
- __pyx_t_7 = 0;
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_t_5, __pyx_t_8); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Point, __pyx_t_7) < 0) __PYX_ERR(0, 29, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":30
- *
- * _Point = Tuple[Real, Real]
- * _Delta = Tuple[Real, Real] # <<<<<<<<<<<<<<
- * _PointSegment = Sequence[_Point]
- * _DeltaSegment = Sequence[_Delta]
- */
- __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_Tuple); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_GetModuleGlobalName(__pyx_t_8, __pyx_n_s_Real); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_8);
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_Real); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __pyx_t_6 = PyTuple_New(2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_GIVEREF(__pyx_t_8);
- PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_8);
- __Pyx_GIVEREF(__pyx_t_5);
- PyTuple_SET_ITEM(__pyx_t_6, 1, __pyx_t_5);
- __pyx_t_8 = 0;
- __pyx_t_5 = 0;
- __pyx_t_5 = __Pyx_PyObject_GetItem(__pyx_t_7, __pyx_t_6); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Delta, __pyx_t_5) < 0) __PYX_ERR(0, 30, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":31
- * _Point = Tuple[Real, Real]
- * _Delta = Tuple[Real, Real]
- * _PointSegment = Sequence[_Point] # <<<<<<<<<<<<<<
- * _DeltaSegment = Sequence[_Delta]
- * _DeltaOrNone = Union[_Delta, None]
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_Sequence); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 31, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_Point); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 31, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_7 = __Pyx_PyObject_GetItem(__pyx_t_5, __pyx_t_6); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 31, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_PointSegment, __pyx_t_7) < 0) __PYX_ERR(0, 31, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":32
- * _Delta = Tuple[Real, Real]
- * _PointSegment = Sequence[_Point]
- * _DeltaSegment = Sequence[_Delta] # <<<<<<<<<<<<<<
- * _DeltaOrNone = Union[_Delta, None]
- * _DeltaOrNoneSegment = Sequence[_DeltaOrNone]
- */
- __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_Sequence); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 32, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_Delta); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 32, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_5 = __Pyx_PyObject_GetItem(__pyx_t_7, __pyx_t_6); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 32, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_DeltaSegment, __pyx_t_5) < 0) __PYX_ERR(0, 32, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":33
- * _PointSegment = Sequence[_Point]
- * _DeltaSegment = Sequence[_Delta]
- * _DeltaOrNone = Union[_Delta, None] # <<<<<<<<<<<<<<
- * _DeltaOrNoneSegment = Sequence[_DeltaOrNone]
- * _Endpoints = Sequence[Integral]
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_Union); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 33, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_Delta); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 33, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __pyx_t_7 = PyTuple_New(2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 33, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_GIVEREF(__pyx_t_6);
- PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_t_6);
- __Pyx_INCREF(Py_None);
- __Pyx_GIVEREF(Py_None);
- PyTuple_SET_ITEM(__pyx_t_7, 1, Py_None);
- __pyx_t_6 = 0;
- __pyx_t_6 = __Pyx_PyObject_GetItem(__pyx_t_5, __pyx_t_7); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 33, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_DeltaOrNone, __pyx_t_6) < 0) __PYX_ERR(0, 33, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":34
- * _DeltaSegment = Sequence[_Delta]
- * _DeltaOrNone = Union[_Delta, None]
- * _DeltaOrNoneSegment = Sequence[_DeltaOrNone] # <<<<<<<<<<<<<<
- * _Endpoints = Sequence[Integral]
- *
- */
- __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_Sequence); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 34, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_DeltaOrNone); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 34, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_5 = __Pyx_PyObject_GetItem(__pyx_t_6, __pyx_t_7); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 34, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_DeltaOrNoneSegment, __pyx_t_5) < 0) __PYX_ERR(0, 34, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
-
- /* "fontTools/varLib/iup.py":35
- * _DeltaOrNone = Union[_Delta, None]
- * _DeltaOrNoneSegment = Sequence[_DeltaOrNone]
- * _Endpoints = Sequence[Integral] # <<<<<<<<<<<<<<
- *
- *
- */
- __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_Sequence); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 35, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_Integral); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 35, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __pyx_t_6 = __Pyx_PyObject_GetItem(__pyx_t_5, __pyx_t_7); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 35, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0;
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_Endpoints, __pyx_t_6) < 0) __PYX_ERR(0, 35, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":38
- *
- *
- * MAX_LOOKBACK = 8 # <<<<<<<<<<<<<<
- *
- *
- */
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_MAX_LOOKBACK, __pyx_int_8) < 0) __PYX_ERR(0, 38, __pyx_L1_error)
-
- /* "fontTools/varLib/iup.py":97
- *
- *
- * def iup_contour(deltas: _DeltaOrNoneSegment, coords: _PointSegment) -> _DeltaSegment: # <<<<<<<<<<<<<<
- * """For the contour given in `coords`, interpolate any missing
- * delta values in delta vector `deltas`.
- */
- __pyx_t_6 = __Pyx_PyDict_NewPresized(3); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 97, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_deltas, __pyx_n_s_DeltaOrNoneSegment) < 0) __PYX_ERR(0, 97, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 97, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_return, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 97, __pyx_L1_error)
- __pyx_t_7 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_1iup_contour, 0, __pyx_n_s_iup_contour, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__5)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 97, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_7, __pyx_t_6);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_contour, __pyx_t_7) < 0) __PYX_ERR(0, 97, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":149
- *
- *
- * def iup_delta( # <<<<<<<<<<<<<<
- * deltas: _DeltaOrNoneSegment, coords: _PointSegment, ends: _Endpoints
- * ) -> _DeltaSegment:
- */
- __pyx_t_7 = __Pyx_PyDict_NewPresized(4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 149, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_deltas, __pyx_n_s_DeltaOrNoneSegment) < 0) __PYX_ERR(0, 149, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 149, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_ends, __pyx_n_s_Endpoints) < 0) __PYX_ERR(0, 149, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_return, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 149, __pyx_L1_error)
- __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_3iup_delta, 0, __pyx_n_s_iup_delta, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__7)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 149, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_6, __pyx_t_7);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_delta, __pyx_t_6) < 0) __PYX_ERR(0, 149, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":208
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * cj=cython.double,
- * dj=cython.double,
- */
- __pyx_t_6 = __Pyx_PyDict_NewPresized(4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 208, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_deltas, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 208, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 208, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_tolerance, __pyx_n_s_Real) < 0) __PYX_ERR(0, 208, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_return, __pyx_n_s_set) < 0) __PYX_ERR(0, 208, __pyx_L1_error)
- __pyx_t_7 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_5_iup_contour_bound_forced_set, 0, __pyx_n_s_iup_contour_bound_forced_set, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__9)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 208, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_CyFunction_SetDefaultsTuple(__pyx_t_7, __pyx_tuple__10);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_7, __pyx_t_6);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_contour_bound_forced_set, __pyx_t_7) < 0) __PYX_ERR(0, 208, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":299
- *
- *
- * @cython.locals( # <<<<<<<<<<<<<<
- * i=cython.int,
- * j=cython.int,
- */
- __pyx_t_7 = __Pyx_PyDict_NewPresized(4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_deltas, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 299, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 299, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_tolerance, __pyx_n_s_Real) < 0) __PYX_ERR(0, 299, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_lookback, __pyx_n_s_Integral) < 0) __PYX_ERR(0, 299, __pyx_L1_error)
- __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_7_iup_contour_optimize_dp, 0, __pyx_n_s_iup_contour_optimize_dp, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__12)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (!__Pyx_CyFunction_InitDefaults(__pyx_t_6, sizeof(__pyx_defaults), 1)) __PYX_ERR(0, 299, __pyx_L1_error)
- __pyx_t_5 = PySet_New(0); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 311, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_5);
- __Pyx_CyFunction_Defaults(__pyx_defaults, __pyx_t_6)->__pyx_arg_forced = ((PyObject*)__pyx_t_5);
- __Pyx_GIVEREF(__pyx_t_5);
- __pyx_t_5 = 0;
- __Pyx_CyFunction_SetDefaultsGetter(__pyx_t_6, __pyx_pf_9fontTools_6varLib_3iup_16__defaults__);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_6, __pyx_t_7);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_contour_optimize_dp, __pyx_t_6) < 0) __PYX_ERR(0, 299, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":352
- *
- *
- * def _rot_list(l: list, k: int): # <<<<<<<<<<<<<<
- * """Rotate list by k items forward. Ie. item at position 0 will be
- * at position k in returned list. Negative k is allowed."""
- */
- __pyx_t_6 = __Pyx_PyDict_NewPresized(2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 352, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_l, __pyx_n_s_list) < 0) __PYX_ERR(0, 352, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_k, __pyx_n_s_int) < 0) __PYX_ERR(0, 352, __pyx_L1_error)
- __pyx_t_7 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_9_rot_list, 0, __pyx_n_s_rot_list, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__14)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 352, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_7, __pyx_t_6);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_rot_list, __pyx_t_7) < 0) __PYX_ERR(0, 352, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":362
- *
- *
- * def _rot_set(s: set, k: int, n: int): # <<<<<<<<<<<<<<
- * k %= n
- * if not k:
- */
- __pyx_t_7 = __Pyx_PyDict_NewPresized(3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 362, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_s, __pyx_n_s_set) < 0) __PYX_ERR(0, 362, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_k, __pyx_n_s_int) < 0) __PYX_ERR(0, 362, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_n, __pyx_n_s_int) < 0) __PYX_ERR(0, 362, __pyx_L1_error)
- __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_11_rot_set, 0, __pyx_n_s_rot_set, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__16)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 362, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_6, __pyx_t_7);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_rot_set, __pyx_t_6) < 0) __PYX_ERR(0, 362, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":369
- *
- *
- * def iup_contour_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment, coords: _PointSegment, tolerance: Real = 0.0
- * ) -> _DeltaOrNoneSegment:
- */
- __pyx_t_6 = __Pyx_PyDict_NewPresized(4); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 369, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_deltas, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_tolerance, __pyx_n_s_Real) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_6, __pyx_n_s_return, __pyx_n_s_DeltaOrNoneSegment) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- __pyx_t_7 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_13iup_contour_optimize, 0, __pyx_n_s_iup_contour_optimize, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__18)); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 369, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- __Pyx_CyFunction_SetDefaultsTuple(__pyx_t_7, __pyx_tuple__19);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_7, __pyx_t_6);
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_contour_optimize, __pyx_t_7) < 0) __PYX_ERR(0, 369, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
-
- /* "fontTools/varLib/iup.py":470
- *
- *
- * def iup_delta_optimize( # <<<<<<<<<<<<<<
- * deltas: _DeltaSegment,
- * coords: _PointSegment,
- */
- __pyx_t_7 = __Pyx_PyDict_NewPresized(5); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 470, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_7);
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_deltas, __pyx_n_s_DeltaSegment) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_coords, __pyx_n_s_PointSegment) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_ends, __pyx_n_s_Endpoints) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_tolerance, __pyx_n_s_Real) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- if (PyDict_SetItem(__pyx_t_7, __pyx_n_s_return, __pyx_n_s_DeltaOrNoneSegment) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- __pyx_t_6 = __Pyx_CyFunction_New(&__pyx_mdef_9fontTools_6varLib_3iup_15iup_delta_optimize, 0, __pyx_n_s_iup_delta_optimize, NULL, __pyx_n_s_fontTools_varLib_iup, __pyx_d, ((PyObject *)__pyx_codeobj__21)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 470, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- __Pyx_CyFunction_SetDefaultsTuple(__pyx_t_6, __pyx_tuple__22);
- __Pyx_CyFunction_SetAnnotationsDict(__pyx_t_6, __pyx_t_7);
- __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0;
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_iup_delta_optimize, __pyx_t_6) < 0) __PYX_ERR(0, 470, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /* "fontTools/varLib/iup.py":1
- * try: # <<<<<<<<<<<<<<
- * import cython
- *
- */
- __pyx_t_6 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 1, __pyx_L1_error)
- __Pyx_GOTREF(__pyx_t_6);
- if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_6) < 0) __PYX_ERR(0, 1, __pyx_L1_error)
- __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0;
-
- /*--- Wrapped vars code ---*/
-
- goto __pyx_L0;
- __pyx_L1_error:;
- __Pyx_XDECREF(__pyx_t_5);
- __Pyx_XDECREF(__pyx_t_6);
- __Pyx_XDECREF(__pyx_t_7);
- __Pyx_XDECREF(__pyx_t_8);
- __Pyx_XDECREF(__pyx_t_9);
- if (__pyx_m) {
- if (__pyx_d && stringtab_initialized) {
- __Pyx_AddTraceback("init fontTools.varLib.iup", __pyx_clineno, __pyx_lineno, __pyx_filename);
- }
- #if !CYTHON_USE_MODULE_STATE
- Py_CLEAR(__pyx_m);
- #else
- Py_DECREF(__pyx_m);
- if (pystate_addmodule_run) {
- PyObject *tp, *value, *tb;
- PyErr_Fetch(&tp, &value, &tb);
- PyState_RemoveModule(&__pyx_moduledef);
- PyErr_Restore(tp, value, tb);
- }
- #endif
- } else if (!PyErr_Occurred()) {
- PyErr_SetString(PyExc_ImportError, "init fontTools.varLib.iup");
- }
- __pyx_L0:;
- __Pyx_RefNannyFinishContext();
- #if CYTHON_PEP489_MULTI_PHASE_INIT
- return (__pyx_m != NULL) ? 0 : -1;
- #elif PY_MAJOR_VERSION >= 3
- return __pyx_m;
- #else
- return;
- #endif
-}
-/* #### Code section: cleanup_globals ### */
-/* #### Code section: cleanup_module ### */
-/* #### Code section: main_method ### */
-/* #### Code section: utility_code_pragmas ### */
-#ifdef _MSC_VER
-#pragma warning( push )
-/* Warning 4127: conditional expression is constant
- * Cython uses constant conditional expressions to allow in inline functions to be optimized at
- * compile-time, so this warning is not useful
- */
-#pragma warning( disable : 4127 )
-#endif
-
-
-
-/* #### Code section: utility_code_def ### */
-
-/* --- Runtime support code --- */
-/* Refnanny */
-#if CYTHON_REFNANNY
-static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) {
- PyObject *m = NULL, *p = NULL;
- void *r = NULL;
- m = PyImport_ImportModule(modname);
- if (!m) goto end;
- p = PyObject_GetAttrString(m, "RefNannyAPI");
- if (!p) goto end;
- r = PyLong_AsVoidPtr(p);
-end:
- Py_XDECREF(p);
- Py_XDECREF(m);
- return (__Pyx_RefNannyAPIStruct *)r;
-}
-#endif
-
-/* PyErrExceptionMatches */
-#if CYTHON_FAST_THREAD_STATE
-static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) {
- Py_ssize_t i, n;
- n = PyTuple_GET_SIZE(tuple);
-#if PY_MAJOR_VERSION >= 3
-    for (i=0; i<n; i++) {
-        if (exc_type == PyTuple_GET_ITEM(tuple, i)) return 1;
-    }
-#endif
-    for (i=0; i<n; i++) {
-        if (__Pyx_PyErr_GivenExceptionMatches(exc_type, PyTuple_GET_ITEM(tuple, i))) return 1;
-    }
-    return 0;
-}
-static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err) {
-    int result;
-    PyObject *exc_type;
-#if PY_VERSION_HEX >= 0x030C00A6
- PyObject *current_exception = tstate->current_exception;
- if (unlikely(!current_exception)) return 0;
- exc_type = (PyObject*) Py_TYPE(current_exception);
- if (exc_type == err) return 1;
-#else
- exc_type = tstate->curexc_type;
- if (exc_type == err) return 1;
- if (unlikely(!exc_type)) return 0;
-#endif
- #if CYTHON_AVOID_BORROWED_REFS
- Py_INCREF(exc_type);
- #endif
- if (unlikely(PyTuple_Check(err))) {
- result = __Pyx_PyErr_ExceptionMatchesTuple(exc_type, err);
- } else {
- result = __Pyx_PyErr_GivenExceptionMatches(exc_type, err);
- }
- #if CYTHON_AVOID_BORROWED_REFS
- Py_DECREF(exc_type);
- #endif
- return result;
-}
-#endif
-
-/* PyErrFetchRestore */
-#if CYTHON_FAST_THREAD_STATE
-static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) {
-#if PY_VERSION_HEX >= 0x030C00A6
- PyObject *tmp_value;
- assert(type == NULL || (value != NULL && type == (PyObject*) Py_TYPE(value)));
- if (value) {
- #if CYTHON_COMPILING_IN_CPYTHON
- if (unlikely(((PyBaseExceptionObject*) value)->traceback != tb))
- #endif
- PyException_SetTraceback(value, tb);
- }
- tmp_value = tstate->current_exception;
- tstate->current_exception = value;
- Py_XDECREF(tmp_value);
-#else
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- tmp_type = tstate->curexc_type;
- tmp_value = tstate->curexc_value;
- tmp_tb = tstate->curexc_traceback;
- tstate->curexc_type = type;
- tstate->curexc_value = value;
- tstate->curexc_traceback = tb;
- Py_XDECREF(tmp_type);
- Py_XDECREF(tmp_value);
- Py_XDECREF(tmp_tb);
-#endif
-}
-static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) {
-#if PY_VERSION_HEX >= 0x030C00A6
- PyObject* exc_value;
- exc_value = tstate->current_exception;
- tstate->current_exception = 0;
- *value = exc_value;
- *type = NULL;
- *tb = NULL;
- if (exc_value) {
- *type = (PyObject*) Py_TYPE(exc_value);
- Py_INCREF(*type);
- #if CYTHON_COMPILING_IN_CPYTHON
- *tb = ((PyBaseExceptionObject*) exc_value)->traceback;
- Py_XINCREF(*tb);
- #else
- *tb = PyException_GetTraceback(exc_value);
- #endif
- }
-#else
- *type = tstate->curexc_type;
- *value = tstate->curexc_value;
- *tb = tstate->curexc_traceback;
- tstate->curexc_type = 0;
- tstate->curexc_value = 0;
- tstate->curexc_traceback = 0;
-#endif
-}
-#endif
-
-/* PyObjectGetAttrStr */
-#if CYTHON_USE_TYPE_SLOTS
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) {
- PyTypeObject* tp = Py_TYPE(obj);
- if (likely(tp->tp_getattro))
- return tp->tp_getattro(obj, attr_name);
-#if PY_MAJOR_VERSION < 3
- if (likely(tp->tp_getattr))
- return tp->tp_getattr(obj, PyString_AS_STRING(attr_name));
-#endif
- return PyObject_GetAttr(obj, attr_name);
-}
-#endif
-
-/* PyObjectGetAttrStrNoError */
-static void __Pyx_PyObject_GetAttrStr_ClearAttributeError(void) {
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- if (likely(__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError)))
- __Pyx_PyErr_Clear();
-}
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name) {
- PyObject *result;
-#if CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_TYPE_SLOTS && PY_VERSION_HEX >= 0x030700B1
- PyTypeObject* tp = Py_TYPE(obj);
- if (likely(tp->tp_getattro == PyObject_GenericGetAttr)) {
- return _PyObject_GenericGetAttrWithDict(obj, attr_name, NULL, 1);
- }
-#endif
- result = __Pyx_PyObject_GetAttrStr(obj, attr_name);
- if (unlikely(!result)) {
- __Pyx_PyObject_GetAttrStr_ClearAttributeError();
- }
- return result;
-}
-
-/* GetBuiltinName */
-static PyObject *__Pyx_GetBuiltinName(PyObject *name) {
- PyObject* result = __Pyx_PyObject_GetAttrStrNoError(__pyx_b, name);
- if (unlikely(!result) && !PyErr_Occurred()) {
- PyErr_Format(PyExc_NameError,
-#if PY_MAJOR_VERSION >= 3
- "name '%U' is not defined", name);
-#else
- "name '%.200s' is not defined", PyString_AS_STRING(name));
-#endif
- }
- return result;
-}
-
-/* SetItemInt */
-static int __Pyx_SetItemInt_Generic(PyObject *o, PyObject *j, PyObject *v) {
- int r;
- if (unlikely(!j)) return -1;
- r = PyObject_SetItem(o, j, v);
- Py_DECREF(j);
- return r;
-}
-static CYTHON_INLINE int __Pyx_SetItemInt_Fast(PyObject *o, Py_ssize_t i, PyObject *v, int is_list,
- CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) {
-#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS
- if (is_list || PyList_CheckExact(o)) {
- Py_ssize_t n = (!wraparound) ? i : ((likely(i >= 0)) ? i : i + PyList_GET_SIZE(o));
- if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o)))) {
- PyObject* old = PyList_GET_ITEM(o, n);
- Py_INCREF(v);
- PyList_SET_ITEM(o, n, v);
- Py_DECREF(old);
- return 1;
- }
- } else {
- PyMappingMethods *mm = Py_TYPE(o)->tp_as_mapping;
- PySequenceMethods *sm = Py_TYPE(o)->tp_as_sequence;
- if (mm && mm->mp_ass_subscript) {
- int r;
- PyObject *key = PyInt_FromSsize_t(i);
- if (unlikely(!key)) return -1;
- r = mm->mp_ass_subscript(o, key, v);
- Py_DECREF(key);
- return r;
- }
- if (likely(sm && sm->sq_ass_item)) {
- if (wraparound && unlikely(i < 0) && likely(sm->sq_length)) {
- Py_ssize_t l = sm->sq_length(o);
- if (likely(l >= 0)) {
- i += l;
- } else {
- if (!PyErr_ExceptionMatches(PyExc_OverflowError))
- return -1;
- PyErr_Clear();
- }
- }
- return sm->sq_ass_item(o, i, v);
- }
- }
-#else
-#if CYTHON_COMPILING_IN_PYPY
- if (is_list || (PySequence_Check(o) && !PyDict_Check(o)))
-#else
- if (is_list || PySequence_Check(o))
-#endif
- {
- return PySequence_SetItem(o, i, v);
- }
-#endif
- return __Pyx_SetItemInt_Generic(o, PyInt_FromSsize_t(i), v);
-}
-
-/* GetItemInt */
-static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j) {
- PyObject *r;
- if (unlikely(!j)) return NULL;
- r = PyObject_GetItem(o, j);
- Py_DECREF(j);
- return r;
-}
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i,
- CYTHON_NCP_UNUSED int wraparound,
- CYTHON_NCP_UNUSED int boundscheck) {
-#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- Py_ssize_t wrapped_i = i;
- if (wraparound & unlikely(i < 0)) {
- wrapped_i += PyList_GET_SIZE(o);
- }
- if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyList_GET_SIZE(o)))) {
- PyObject *r = PyList_GET_ITEM(o, wrapped_i);
- Py_INCREF(r);
- return r;
- }
- return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i));
-#else
- return PySequence_GetItem(o, i);
-#endif
-}
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i,
- CYTHON_NCP_UNUSED int wraparound,
- CYTHON_NCP_UNUSED int boundscheck) {
-#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- Py_ssize_t wrapped_i = i;
- if (wraparound & unlikely(i < 0)) {
- wrapped_i += PyTuple_GET_SIZE(o);
- }
- if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyTuple_GET_SIZE(o)))) {
- PyObject *r = PyTuple_GET_ITEM(o, wrapped_i);
- Py_INCREF(r);
- return r;
- }
- return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i));
-#else
- return PySequence_GetItem(o, i);
-#endif
-}
-static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list,
- CYTHON_NCP_UNUSED int wraparound,
- CYTHON_NCP_UNUSED int boundscheck) {
-#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS
- if (is_list || PyList_CheckExact(o)) {
- Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyList_GET_SIZE(o);
- if ((!boundscheck) || (likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o))))) {
- PyObject *r = PyList_GET_ITEM(o, n);
- Py_INCREF(r);
- return r;
- }
- }
- else if (PyTuple_CheckExact(o)) {
- Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyTuple_GET_SIZE(o);
- if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyTuple_GET_SIZE(o)))) {
- PyObject *r = PyTuple_GET_ITEM(o, n);
- Py_INCREF(r);
- return r;
- }
- } else {
- PyMappingMethods *mm = Py_TYPE(o)->tp_as_mapping;
- PySequenceMethods *sm = Py_TYPE(o)->tp_as_sequence;
- if (mm && mm->mp_subscript) {
- PyObject *r, *key = PyInt_FromSsize_t(i);
- if (unlikely(!key)) return NULL;
- r = mm->mp_subscript(o, key);
- Py_DECREF(key);
- return r;
- }
- if (likely(sm && sm->sq_item)) {
- if (wraparound && unlikely(i < 0) && likely(sm->sq_length)) {
- Py_ssize_t l = sm->sq_length(o);
- if (likely(l >= 0)) {
- i += l;
- } else {
- if (!PyErr_ExceptionMatches(PyExc_OverflowError))
- return NULL;
- PyErr_Clear();
- }
- }
- return sm->sq_item(o, i);
- }
- }
-#else
- if (is_list || PySequence_Check(o)) {
- return PySequence_GetItem(o, i);
- }
-#endif
- return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i));
-}
-
-/* PyObjectCall */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) {
- PyObject *result;
- ternaryfunc call = Py_TYPE(func)->tp_call;
- if (unlikely(!call))
- return PyObject_Call(func, arg, kw);
- if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object")))
- return NULL;
- result = (*call)(func, arg, kw);
- Py_LeaveRecursiveCall();
- if (unlikely(!result) && unlikely(!PyErr_Occurred())) {
- PyErr_SetString(
- PyExc_SystemError,
- "NULL result without error in PyObject_Call");
- }
- return result;
-}
-#endif
-
-/* TupleAndListFromArray */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE void __Pyx_copy_object_array(PyObject *const *CYTHON_RESTRICT src, PyObject** CYTHON_RESTRICT dest, Py_ssize_t length) {
- PyObject *v;
- Py_ssize_t i;
- for (i = 0; i < length; i++) {
- v = dest[i] = src[i];
- Py_INCREF(v);
- }
-}
-static CYTHON_INLINE PyObject *
-__Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n)
-{
- PyObject *res;
- if (n <= 0) {
- Py_INCREF(__pyx_empty_tuple);
- return __pyx_empty_tuple;
- }
- res = PyTuple_New(n);
- if (unlikely(res == NULL)) return NULL;
- __Pyx_copy_object_array(src, ((PyTupleObject*)res)->ob_item, n);
- return res;
-}
-static CYTHON_INLINE PyObject *
-__Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n)
-{
- PyObject *res;
- if (n <= 0) {
- return PyList_New(0);
- }
- res = PyList_New(n);
- if (unlikely(res == NULL)) return NULL;
- __Pyx_copy_object_array(src, ((PyListObject*)res)->ob_item, n);
- return res;
-}
-#endif
-
-/* BytesEquals */
-static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals) {
-#if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API
- return PyObject_RichCompareBool(s1, s2, equals);
-#else
- if (s1 == s2) {
- return (equals == Py_EQ);
- } else if (PyBytes_CheckExact(s1) & PyBytes_CheckExact(s2)) {
- const char *ps1, *ps2;
- Py_ssize_t length = PyBytes_GET_SIZE(s1);
- if (length != PyBytes_GET_SIZE(s2))
- return (equals == Py_NE);
- ps1 = PyBytes_AS_STRING(s1);
- ps2 = PyBytes_AS_STRING(s2);
- if (ps1[0] != ps2[0]) {
- return (equals == Py_NE);
- } else if (length == 1) {
- return (equals == Py_EQ);
- } else {
- int result;
-#if CYTHON_USE_UNICODE_INTERNALS && (PY_VERSION_HEX < 0x030B0000)
- Py_hash_t hash1, hash2;
- hash1 = ((PyBytesObject*)s1)->ob_shash;
- hash2 = ((PyBytesObject*)s2)->ob_shash;
- if (hash1 != hash2 && hash1 != -1 && hash2 != -1) {
- return (equals == Py_NE);
- }
-#endif
- result = memcmp(ps1, ps2, (size_t)length);
- return (equals == Py_EQ) ? (result == 0) : (result != 0);
- }
- } else if ((s1 == Py_None) & PyBytes_CheckExact(s2)) {
- return (equals == Py_NE);
- } else if ((s2 == Py_None) & PyBytes_CheckExact(s1)) {
- return (equals == Py_NE);
- } else {
- int result;
- PyObject* py_result = PyObject_RichCompare(s1, s2, equals);
- if (!py_result)
- return -1;
- result = __Pyx_PyObject_IsTrue(py_result);
- Py_DECREF(py_result);
- return result;
- }
-#endif
-}
-
-/* UnicodeEquals */
-static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals) {
-#if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API
- return PyObject_RichCompareBool(s1, s2, equals);
-#else
-#if PY_MAJOR_VERSION < 3
- PyObject* owned_ref = NULL;
-#endif
- int s1_is_unicode, s2_is_unicode;
- if (s1 == s2) {
- goto return_eq;
- }
- s1_is_unicode = PyUnicode_CheckExact(s1);
- s2_is_unicode = PyUnicode_CheckExact(s2);
-#if PY_MAJOR_VERSION < 3
- if ((s1_is_unicode & (!s2_is_unicode)) && PyString_CheckExact(s2)) {
- owned_ref = PyUnicode_FromObject(s2);
- if (unlikely(!owned_ref))
- return -1;
- s2 = owned_ref;
- s2_is_unicode = 1;
- } else if ((s2_is_unicode & (!s1_is_unicode)) && PyString_CheckExact(s1)) {
- owned_ref = PyUnicode_FromObject(s1);
- if (unlikely(!owned_ref))
- return -1;
- s1 = owned_ref;
- s1_is_unicode = 1;
- } else if (((!s2_is_unicode) & (!s1_is_unicode))) {
- return __Pyx_PyBytes_Equals(s1, s2, equals);
- }
-#endif
- if (s1_is_unicode & s2_is_unicode) {
- Py_ssize_t length;
- int kind;
- void *data1, *data2;
- if (unlikely(__Pyx_PyUnicode_READY(s1) < 0) || unlikely(__Pyx_PyUnicode_READY(s2) < 0))
- return -1;
- length = __Pyx_PyUnicode_GET_LENGTH(s1);
- if (length != __Pyx_PyUnicode_GET_LENGTH(s2)) {
- goto return_ne;
- }
-#if CYTHON_USE_UNICODE_INTERNALS
- {
- Py_hash_t hash1, hash2;
- #if CYTHON_PEP393_ENABLED
- hash1 = ((PyASCIIObject*)s1)->hash;
- hash2 = ((PyASCIIObject*)s2)->hash;
- #else
- hash1 = ((PyUnicodeObject*)s1)->hash;
- hash2 = ((PyUnicodeObject*)s2)->hash;
- #endif
- if (hash1 != hash2 && hash1 != -1 && hash2 != -1) {
- goto return_ne;
- }
- }
-#endif
- kind = __Pyx_PyUnicode_KIND(s1);
- if (kind != __Pyx_PyUnicode_KIND(s2)) {
- goto return_ne;
- }
- data1 = __Pyx_PyUnicode_DATA(s1);
- data2 = __Pyx_PyUnicode_DATA(s2);
- if (__Pyx_PyUnicode_READ(kind, data1, 0) != __Pyx_PyUnicode_READ(kind, data2, 0)) {
- goto return_ne;
- } else if (length == 1) {
- goto return_eq;
- } else {
- int result = memcmp(data1, data2, (size_t)(length * kind));
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(owned_ref);
- #endif
- return (equals == Py_EQ) ? (result == 0) : (result != 0);
- }
- } else if ((s1 == Py_None) & s2_is_unicode) {
- goto return_ne;
- } else if ((s2 == Py_None) & s1_is_unicode) {
- goto return_ne;
- } else {
- int result;
- PyObject* py_result = PyObject_RichCompare(s1, s2, equals);
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(owned_ref);
- #endif
- if (!py_result)
- return -1;
- result = __Pyx_PyObject_IsTrue(py_result);
- Py_DECREF(py_result);
- return result;
- }
-return_eq:
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(owned_ref);
- #endif
- return (equals == Py_EQ);
-return_ne:
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(owned_ref);
- #endif
- return (equals == Py_NE);
-#endif
-}
-
-/* fastcall */
-#if CYTHON_METH_FASTCALL
-static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s)
-{
- Py_ssize_t i, n = PyTuple_GET_SIZE(kwnames);
- for (i = 0; i < n; i++)
- {
- if (s == PyTuple_GET_ITEM(kwnames, i)) return kwvalues[i];
- }
- for (i = 0; i < n; i++)
- {
- int eq = __Pyx_PyUnicode_Equals(s, PyTuple_GET_ITEM(kwnames, i), Py_EQ);
- if (unlikely(eq != 0)) {
- if (unlikely(eq < 0)) return NULL; // error
- return kwvalues[i];
- }
- }
- return NULL; // not found (no exception set)
-}
-#endif
-
-/* RaiseArgTupleInvalid */
-static void __Pyx_RaiseArgtupleInvalid(
- const char* func_name,
- int exact,
- Py_ssize_t num_min,
- Py_ssize_t num_max,
- Py_ssize_t num_found)
-{
- Py_ssize_t num_expected;
- const char *more_or_less;
- if (num_found < num_min) {
- num_expected = num_min;
- more_or_less = "at least";
- } else {
- num_expected = num_max;
- more_or_less = "at most";
- }
- if (exact) {
- more_or_less = "exactly";
- }
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)",
- func_name, more_or_less, num_expected,
- (num_expected == 1) ? "" : "s", num_found);
-}
-
-/* RaiseDoubleKeywords */
-static void __Pyx_RaiseDoubleKeywordsError(
- const char* func_name,
- PyObject* kw_name)
-{
- PyErr_Format(PyExc_TypeError,
- #if PY_MAJOR_VERSION >= 3
- "%s() got multiple values for keyword argument '%U'", func_name, kw_name);
- #else
- "%s() got multiple values for keyword argument '%s'", func_name,
- PyString_AsString(kw_name));
- #endif
-}
-
-/* ParseKeywords */
-static int __Pyx_ParseOptionalKeywords(
- PyObject *kwds,
- PyObject *const *kwvalues,
- PyObject **argnames[],
- PyObject *kwds2,
- PyObject *values[],
- Py_ssize_t num_pos_args,
- const char* function_name)
-{
- PyObject *key = 0, *value = 0;
- Py_ssize_t pos = 0;
- PyObject*** name;
- PyObject*** first_kw_arg = argnames + num_pos_args;
- int kwds_is_tuple = CYTHON_METH_FASTCALL && likely(PyTuple_Check(kwds));
- while (1) {
- if (kwds_is_tuple) {
- if (pos >= PyTuple_GET_SIZE(kwds)) break;
- key = PyTuple_GET_ITEM(kwds, pos);
- value = kwvalues[pos];
- pos++;
- }
- else
- {
- if (!PyDict_Next(kwds, &pos, &key, &value)) break;
- }
- name = first_kw_arg;
- while (*name && (**name != key)) name++;
- if (*name) {
- values[name-argnames] = value;
- continue;
- }
- name = first_kw_arg;
- #if PY_MAJOR_VERSION < 3
- if (likely(PyString_Check(key))) {
- while (*name) {
- if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key))
- && _PyString_Eq(**name, key)) {
- values[name-argnames] = value;
- break;
- }
- name++;
- }
- if (*name) continue;
- else {
- PyObject*** argname = argnames;
- while (argname != first_kw_arg) {
- if ((**argname == key) || (
- (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key))
- && _PyString_Eq(**argname, key))) {
- goto arg_passed_twice;
- }
- argname++;
- }
- }
- } else
- #endif
- if (likely(PyUnicode_Check(key))) {
- while (*name) {
- int cmp = (
- #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3
- (__Pyx_PyUnicode_GET_LENGTH(**name) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 :
- #endif
- PyUnicode_Compare(**name, key)
- );
- if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad;
- if (cmp == 0) {
- values[name-argnames] = value;
- break;
- }
- name++;
- }
- if (*name) continue;
- else {
- PyObject*** argname = argnames;
- while (argname != first_kw_arg) {
- int cmp = (**argname == key) ? 0 :
- #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3
- (__Pyx_PyUnicode_GET_LENGTH(**argname) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 :
- #endif
- PyUnicode_Compare(**argname, key);
- if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad;
- if (cmp == 0) goto arg_passed_twice;
- argname++;
- }
- }
- } else
- goto invalid_keyword_type;
- if (kwds2) {
- if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad;
- } else {
- goto invalid_keyword;
- }
- }
- return 0;
-arg_passed_twice:
- __Pyx_RaiseDoubleKeywordsError(function_name, key);
- goto bad;
-invalid_keyword_type:
- PyErr_Format(PyExc_TypeError,
- "%.200s() keywords must be strings", function_name);
- goto bad;
-invalid_keyword:
- #if PY_MAJOR_VERSION < 3
- PyErr_Format(PyExc_TypeError,
- "%.200s() got an unexpected keyword argument '%.200s'",
- function_name, PyString_AsString(key));
- #else
- PyErr_Format(PyExc_TypeError,
- "%s() got an unexpected keyword argument '%U'",
- function_name, key);
- #endif
-bad:
- return -1;
-}
-
-/* RaiseException */
-#if PY_MAJOR_VERSION < 3
-static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) {
- __Pyx_PyThreadState_declare
- CYTHON_UNUSED_VAR(cause);
- Py_XINCREF(type);
- if (!value || value == Py_None)
- value = NULL;
- else
- Py_INCREF(value);
- if (!tb || tb == Py_None)
- tb = NULL;
- else {
- Py_INCREF(tb);
- if (!PyTraceBack_Check(tb)) {
- PyErr_SetString(PyExc_TypeError,
- "raise: arg 3 must be a traceback or None");
- goto raise_error;
- }
- }
- if (PyType_Check(type)) {
-#if CYTHON_COMPILING_IN_PYPY
- if (!value) {
- Py_INCREF(Py_None);
- value = Py_None;
- }
-#endif
- PyErr_NormalizeException(&type, &value, &tb);
- } else {
- if (value) {
- PyErr_SetString(PyExc_TypeError,
- "instance exception may not have a separate value");
- goto raise_error;
- }
- value = type;
- type = (PyObject*) Py_TYPE(type);
- Py_INCREF(type);
- if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) {
- PyErr_SetString(PyExc_TypeError,
- "raise: exception class must be a subclass of BaseException");
- goto raise_error;
- }
- }
- __Pyx_PyThreadState_assign
- __Pyx_ErrRestore(type, value, tb);
- return;
-raise_error:
- Py_XDECREF(value);
- Py_XDECREF(type);
- Py_XDECREF(tb);
- return;
-}
-#else
-static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) {
- PyObject* owned_instance = NULL;
- if (tb == Py_None) {
- tb = 0;
- } else if (tb && !PyTraceBack_Check(tb)) {
- PyErr_SetString(PyExc_TypeError,
- "raise: arg 3 must be a traceback or None");
- goto bad;
- }
- if (value == Py_None)
- value = 0;
- if (PyExceptionInstance_Check(type)) {
- if (value) {
- PyErr_SetString(PyExc_TypeError,
- "instance exception may not have a separate value");
- goto bad;
- }
- value = type;
- type = (PyObject*) Py_TYPE(value);
- } else if (PyExceptionClass_Check(type)) {
- PyObject *instance_class = NULL;
- if (value && PyExceptionInstance_Check(value)) {
- instance_class = (PyObject*) Py_TYPE(value);
- if (instance_class != type) {
- int is_subclass = PyObject_IsSubclass(instance_class, type);
- if (!is_subclass) {
- instance_class = NULL;
- } else if (unlikely(is_subclass == -1)) {
- goto bad;
- } else {
- type = instance_class;
- }
- }
- }
- if (!instance_class) {
- PyObject *args;
- if (!value)
- args = PyTuple_New(0);
- else if (PyTuple_Check(value)) {
- Py_INCREF(value);
- args = value;
- } else
- args = PyTuple_Pack(1, value);
- if (!args)
- goto bad;
- owned_instance = PyObject_Call(type, args, NULL);
- Py_DECREF(args);
- if (!owned_instance)
- goto bad;
- value = owned_instance;
- if (!PyExceptionInstance_Check(value)) {
- PyErr_Format(PyExc_TypeError,
- "calling %R should have returned an instance of "
- "BaseException, not %R",
- type, Py_TYPE(value));
- goto bad;
- }
- }
- } else {
- PyErr_SetString(PyExc_TypeError,
- "raise: exception class must be a subclass of BaseException");
- goto bad;
- }
- if (cause) {
- PyObject *fixed_cause;
- if (cause == Py_None) {
- fixed_cause = NULL;
- } else if (PyExceptionClass_Check(cause)) {
- fixed_cause = PyObject_CallObject(cause, NULL);
- if (fixed_cause == NULL)
- goto bad;
- } else if (PyExceptionInstance_Check(cause)) {
- fixed_cause = cause;
- Py_INCREF(fixed_cause);
- } else {
- PyErr_SetString(PyExc_TypeError,
- "exception causes must derive from "
- "BaseException");
- goto bad;
- }
- PyException_SetCause(value, fixed_cause);
- }
- PyErr_SetObject(type, value);
- if (tb) {
- #if PY_VERSION_HEX >= 0x030C00A6
- PyException_SetTraceback(value, tb);
- #elif CYTHON_FAST_THREAD_STATE
- PyThreadState *tstate = __Pyx_PyThreadState_Current;
- PyObject* tmp_tb = tstate->curexc_traceback;
- if (tb != tmp_tb) {
- Py_INCREF(tb);
- tstate->curexc_traceback = tb;
- Py_XDECREF(tmp_tb);
- }
-#else
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb);
- Py_INCREF(tb);
- PyErr_Restore(tmp_type, tmp_value, tb);
- Py_XDECREF(tmp_tb);
-#endif
- }
-bad:
- Py_XDECREF(owned_instance);
- return;
-}
-#endif
-
-/* PyIntBinop */
-#if !CYTHON_COMPILING_IN_PYPY
-static PyObject* __Pyx_PyInt_AddObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check) {
- CYTHON_MAYBE_UNUSED_VAR(intval);
- CYTHON_MAYBE_UNUSED_VAR(inplace);
- CYTHON_UNUSED_VAR(zerodivision_check);
- #if PY_MAJOR_VERSION < 3
- if (likely(PyInt_CheckExact(op1))) {
- const long b = intval;
- long x;
- long a = PyInt_AS_LONG(op1);
-
- x = (long)((unsigned long)a + (unsigned long)b);
- if (likely((x^a) >= 0 || (x^b) >= 0))
- return PyInt_FromLong(x);
- return PyLong_Type.tp_as_number->nb_add(op1, op2);
- }
- #endif
- #if CYTHON_USE_PYLONG_INTERNALS
- if (likely(PyLong_CheckExact(op1))) {
- const long b = intval;
- long a, x;
-#ifdef HAVE_LONG_LONG
- const PY_LONG_LONG llb = intval;
- PY_LONG_LONG lla, llx;
-#endif
- if (unlikely(__Pyx_PyLong_IsZero(op1))) {
- return __Pyx_NewRef(op2);
- }
- if (likely(__Pyx_PyLong_IsCompact(op1))) {
- a = __Pyx_PyLong_CompactValue(op1);
- } else {
- const digit* digits = __Pyx_PyLong_Digits(op1);
- const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(op1);
- switch (size) {
- case -2:
- if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) {
- a = -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) {
- a = (long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case -3:
- if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) {
- a = -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 3:
- if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) {
- a = (long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case -4:
- if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) {
- a = -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 4:
- if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) {
- a = (long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- default: return PyLong_Type.tp_as_number->nb_add(op1, op2);
- }
- }
- x = a + b;
- return PyLong_FromLong(x);
-#ifdef HAVE_LONG_LONG
- long_long:
- llx = lla + llb;
- return PyLong_FromLongLong(llx);
-#endif
-
-
- }
- #endif
- if (PyFloat_CheckExact(op1)) {
- const long b = intval;
-#if CYTHON_COMPILING_IN_LIMITED_API
- double a = __pyx_PyFloat_AsDouble(op1);
-#else
- double a = PyFloat_AS_DOUBLE(op1);
-#endif
- double result;
-
- PyFPE_START_PROTECT("add", return NULL)
- result = ((double)a) + (double)b;
- PyFPE_END_PROTECT(result)
- return PyFloat_FromDouble(result);
- }
- return (inplace ? PyNumber_InPlaceAdd : PyNumber_Add)(op1, op2);
-}
-#endif
-
-/* IterNext */
-static PyObject *__Pyx_PyIter_Next2Default(PyObject* defval) {
- PyObject* exc_type;
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- exc_type = __Pyx_PyErr_CurrentExceptionType();
- if (unlikely(exc_type)) {
- if (!defval || unlikely(!__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration)))
- return NULL;
- __Pyx_PyErr_Clear();
- Py_INCREF(defval);
- return defval;
- }
- if (defval) {
- Py_INCREF(defval);
- return defval;
- }
- __Pyx_PyErr_SetNone(PyExc_StopIteration);
- return NULL;
-}
-static void __Pyx_PyIter_Next_ErrorNoIterator(PyObject *iterator) {
- __Pyx_TypeName iterator_type_name = __Pyx_PyType_GetName(Py_TYPE(iterator));
- PyErr_Format(PyExc_TypeError,
- __Pyx_FMT_TYPENAME " object is not an iterator", iterator_type_name);
- __Pyx_DECREF_TypeName(iterator_type_name);
-}
-static CYTHON_INLINE PyObject *__Pyx_PyIter_Next2(PyObject* iterator, PyObject* defval) {
- PyObject* next;
- iternextfunc iternext = Py_TYPE(iterator)->tp_iternext;
- if (likely(iternext)) {
-#if CYTHON_USE_TYPE_SLOTS || CYTHON_COMPILING_IN_PYPY
- next = iternext(iterator);
- if (likely(next))
- return next;
-#if CYTHON_COMPILING_IN_CPYTHON
- if (unlikely(iternext == &_PyObject_NextNotImplemented))
- return NULL;
-#endif
-#else
- next = PyIter_Next(iterator);
- if (likely(next))
- return next;
-#endif
- } else if (CYTHON_USE_TYPE_SLOTS || unlikely(!PyIter_Check(iterator))) {
- __Pyx_PyIter_Next_ErrorNoIterator(iterator);
- return NULL;
- }
-#if !CYTHON_USE_TYPE_SLOTS
- else {
- next = PyIter_Next(iterator);
- if (likely(next))
- return next;
- }
-#endif
- return __Pyx_PyIter_Next2Default(defval);
-}
-
-/* PyIntCompare */
-static CYTHON_INLINE int __Pyx_PyInt_BoolNeObjC(PyObject *op1, PyObject *op2, long intval, long inplace) {
- CYTHON_MAYBE_UNUSED_VAR(intval);
- CYTHON_UNUSED_VAR(inplace);
- if (op1 == op2) {
- return 0;
- }
- #if PY_MAJOR_VERSION < 3
- if (likely(PyInt_CheckExact(op1))) {
- const long b = intval;
- long a = PyInt_AS_LONG(op1);
- return (a != b);
- }
- #endif
- #if CYTHON_USE_PYLONG_INTERNALS
- if (likely(PyLong_CheckExact(op1))) {
- int unequal;
- unsigned long uintval;
- Py_ssize_t size = __Pyx_PyLong_DigitCount(op1);
- const digit* digits = __Pyx_PyLong_Digits(op1);
- if (intval == 0) {
- return (__Pyx_PyLong_IsZero(op1) != 1);
- } else if (intval < 0) {
- if (__Pyx_PyLong_IsNonNeg(op1))
- return 1;
- intval = -intval;
- } else {
- if (__Pyx_PyLong_IsNeg(op1))
- return 1;
- }
- uintval = (unsigned long) intval;
-#if PyLong_SHIFT * 4 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 4)) {
- unequal = (size != 5) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[4] != ((uintval >> (4 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 3 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 3)) {
- unequal = (size != 4) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 2 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 2)) {
- unequal = (size != 3) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 1 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 1)) {
- unequal = (size != 2) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
- unequal = (size != 1) || (((unsigned long) digits[0]) != (uintval & (unsigned long) PyLong_MASK));
- return (unequal != 0);
- }
- #endif
- if (PyFloat_CheckExact(op1)) {
- const long b = intval;
-#if CYTHON_COMPILING_IN_LIMITED_API
- double a = __pyx_PyFloat_AsDouble(op1);
-#else
- double a = PyFloat_AS_DOUBLE(op1);
-#endif
- return ((double)a != (double)b);
- }
- return __Pyx_PyObject_IsTrueAndDecref(
- PyObject_RichCompare(op1, op2, Py_NE));
-}
-
-/* SliceObject */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GetSlice(PyObject* obj,
- Py_ssize_t cstart, Py_ssize_t cstop,
- PyObject** _py_start, PyObject** _py_stop, PyObject** _py_slice,
- int has_cstart, int has_cstop, int wraparound) {
- __Pyx_TypeName obj_type_name;
-#if CYTHON_USE_TYPE_SLOTS
- PyMappingMethods* mp;
-#if PY_MAJOR_VERSION < 3
- PySequenceMethods* ms = Py_TYPE(obj)->tp_as_sequence;
- if (likely(ms && ms->sq_slice)) {
- if (!has_cstart) {
- if (_py_start && (*_py_start != Py_None)) {
- cstart = __Pyx_PyIndex_AsSsize_t(*_py_start);
- if ((cstart == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad;
- } else
- cstart = 0;
- }
- if (!has_cstop) {
- if (_py_stop && (*_py_stop != Py_None)) {
- cstop = __Pyx_PyIndex_AsSsize_t(*_py_stop);
- if ((cstop == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad;
- } else
- cstop = PY_SSIZE_T_MAX;
- }
- if (wraparound && unlikely((cstart < 0) | (cstop < 0)) && likely(ms->sq_length)) {
- Py_ssize_t l = ms->sq_length(obj);
- if (likely(l >= 0)) {
- if (cstop < 0) {
- cstop += l;
- if (cstop < 0) cstop = 0;
- }
- if (cstart < 0) {
- cstart += l;
- if (cstart < 0) cstart = 0;
- }
- } else {
- if (!PyErr_ExceptionMatches(PyExc_OverflowError))
- goto bad;
- PyErr_Clear();
- }
- }
- return ms->sq_slice(obj, cstart, cstop);
- }
-#else
- CYTHON_UNUSED_VAR(wraparound);
-#endif
- mp = Py_TYPE(obj)->tp_as_mapping;
- if (likely(mp && mp->mp_subscript))
-#else
- CYTHON_UNUSED_VAR(wraparound);
-#endif
- {
- PyObject* result;
- PyObject *py_slice, *py_start, *py_stop;
- if (_py_slice) {
- py_slice = *_py_slice;
- } else {
- PyObject* owned_start = NULL;
- PyObject* owned_stop = NULL;
- if (_py_start) {
- py_start = *_py_start;
- } else {
- if (has_cstart) {
- owned_start = py_start = PyInt_FromSsize_t(cstart);
- if (unlikely(!py_start)) goto bad;
- } else
- py_start = Py_None;
- }
- if (_py_stop) {
- py_stop = *_py_stop;
- } else {
- if (has_cstop) {
- owned_stop = py_stop = PyInt_FromSsize_t(cstop);
- if (unlikely(!py_stop)) {
- Py_XDECREF(owned_start);
- goto bad;
- }
- } else
- py_stop = Py_None;
- }
- py_slice = PySlice_New(py_start, py_stop, Py_None);
- Py_XDECREF(owned_start);
- Py_XDECREF(owned_stop);
- if (unlikely(!py_slice)) goto bad;
- }
-#if CYTHON_USE_TYPE_SLOTS
- result = mp->mp_subscript(obj, py_slice);
-#else
- result = PyObject_GetItem(obj, py_slice);
-#endif
- if (!_py_slice) {
- Py_DECREF(py_slice);
- }
- return result;
- }
- obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj));
- PyErr_Format(PyExc_TypeError,
- "'" __Pyx_FMT_TYPENAME "' object is unsliceable", obj_type_name);
- __Pyx_DECREF_TypeName(obj_type_name);
-bad:
- return NULL;
-}
-
-/* PyFunctionFastCall */
-#if CYTHON_FAST_PYCALL && !CYTHON_VECTORCALL
-static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na,
- PyObject *globals) {
- PyFrameObject *f;
- PyThreadState *tstate = __Pyx_PyThreadState_Current;
- PyObject **fastlocals;
- Py_ssize_t i;
- PyObject *result;
- assert(globals != NULL);
- /* XXX Perhaps we should create a specialized
- PyFrame_New() that doesn't take locals, but does
- take builtins without sanity checking them.
- */
- assert(tstate != NULL);
- f = PyFrame_New(tstate, co, globals, NULL);
- if (f == NULL) {
- return NULL;
- }
- fastlocals = __Pyx_PyFrame_GetLocalsplus(f);
- for (i = 0; i < na; i++) {
- Py_INCREF(*args);
- fastlocals[i] = *args++;
- }
- result = PyEval_EvalFrameEx(f,0);
- ++tstate->recursion_depth;
- Py_DECREF(f);
- --tstate->recursion_depth;
- return result;
-}
-static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) {
- PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func);
- PyObject *globals = PyFunction_GET_GLOBALS(func);
- PyObject *argdefs = PyFunction_GET_DEFAULTS(func);
- PyObject *closure;
-#if PY_MAJOR_VERSION >= 3
- PyObject *kwdefs;
-#endif
- PyObject *kwtuple, **k;
- PyObject **d;
- Py_ssize_t nd;
- Py_ssize_t nk;
- PyObject *result;
- assert(kwargs == NULL || PyDict_Check(kwargs));
- nk = kwargs ? PyDict_Size(kwargs) : 0;
- if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) {
- return NULL;
- }
- if (
-#if PY_MAJOR_VERSION >= 3
- co->co_kwonlyargcount == 0 &&
-#endif
- likely(kwargs == NULL || nk == 0) &&
- co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) {
- if (argdefs == NULL && co->co_argcount == nargs) {
- result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals);
- goto done;
- }
- else if (nargs == 0 && argdefs != NULL
- && co->co_argcount == Py_SIZE(argdefs)) {
- /* function called with no arguments, but all parameters have
- a default value: use default values as arguments .*/
- args = &PyTuple_GET_ITEM(argdefs, 0);
- result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals);
- goto done;
- }
- }
- if (kwargs != NULL) {
- Py_ssize_t pos, i;
- kwtuple = PyTuple_New(2 * nk);
- if (kwtuple == NULL) {
- result = NULL;
- goto done;
- }
- k = &PyTuple_GET_ITEM(kwtuple, 0);
- pos = i = 0;
- while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) {
- Py_INCREF(k[i]);
- Py_INCREF(k[i+1]);
- i += 2;
- }
- nk = i / 2;
- }
- else {
- kwtuple = NULL;
- k = NULL;
- }
- closure = PyFunction_GET_CLOSURE(func);
-#if PY_MAJOR_VERSION >= 3
- kwdefs = PyFunction_GET_KW_DEFAULTS(func);
-#endif
- if (argdefs != NULL) {
- d = &PyTuple_GET_ITEM(argdefs, 0);
- nd = Py_SIZE(argdefs);
- }
- else {
- d = NULL;
- nd = 0;
- }
-#if PY_MAJOR_VERSION >= 3
- result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL,
- args, (int)nargs,
- k, (int)nk,
- d, (int)nd, kwdefs, closure);
-#else
- result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL,
- args, (int)nargs,
- k, (int)nk,
- d, (int)nd, closure);
-#endif
- Py_XDECREF(kwtuple);
-done:
- Py_LeaveRecursiveCall();
- return result;
-}
-#endif
-
-/* PyObjectCallMethO */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) {
- PyObject *self, *result;
- PyCFunction cfunc;
- cfunc = PyCFunction_GET_FUNCTION(func);
- self = PyCFunction_GET_SELF(func);
- if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object")))
- return NULL;
- result = cfunc(self, arg);
- Py_LeaveRecursiveCall();
- if (unlikely(!result) && unlikely(!PyErr_Occurred())) {
- PyErr_SetString(
- PyExc_SystemError,
- "NULL result without error in PyObject_Call");
- }
- return result;
-}
-#endif
-
-/* PyObjectFastCall */
-static PyObject* __Pyx_PyObject_FastCall_fallback(PyObject *func, PyObject **args, size_t nargs, PyObject *kwargs) {
- PyObject *argstuple;
- PyObject *result;
- size_t i;
- argstuple = PyTuple_New((Py_ssize_t)nargs);
- if (unlikely(!argstuple)) return NULL;
- for (i = 0; i < nargs; i++) {
- Py_INCREF(args[i]);
- PyTuple_SET_ITEM(argstuple, (Py_ssize_t)i, args[i]);
- }
- result = __Pyx_PyObject_Call(func, argstuple, kwargs);
- Py_DECREF(argstuple);
- return result;
-}
-static CYTHON_INLINE PyObject* __Pyx_PyObject_FastCallDict(PyObject *func, PyObject **args, size_t _nargs, PyObject *kwargs) {
- Py_ssize_t nargs = __Pyx_PyVectorcall_NARGS(_nargs);
-#if CYTHON_COMPILING_IN_CPYTHON
- if (nargs == 0 && kwargs == NULL) {
-#if defined(__Pyx_CyFunction_USED) && defined(NDEBUG)
- if (__Pyx_IsCyOrPyCFunction(func))
-#else
- if (PyCFunction_Check(func))
-#endif
- {
- if (likely(PyCFunction_GET_FLAGS(func) & METH_NOARGS)) {
- return __Pyx_PyObject_CallMethO(func, NULL);
- }
- }
- }
- else if (nargs == 1 && kwargs == NULL) {
- if (PyCFunction_Check(func))
- {
- if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) {
- return __Pyx_PyObject_CallMethO(func, args[0]);
- }
- }
- }
-#endif
- #if PY_VERSION_HEX < 0x030800B1
- #if CYTHON_FAST_PYCCALL
- if (PyCFunction_Check(func)) {
- if (kwargs) {
- return _PyCFunction_FastCallDict(func, args, nargs, kwargs);
- } else {
- return _PyCFunction_FastCallKeywords(func, args, nargs, NULL);
- }
- }
- #if PY_VERSION_HEX >= 0x030700A1
- if (!kwargs && __Pyx_IS_TYPE(func, &PyMethodDescr_Type)) {
- return _PyMethodDescr_FastCallKeywords(func, args, nargs, NULL);
- }
- #endif
- #endif
- #if CYTHON_FAST_PYCALL
- if (PyFunction_Check(func)) {
- return __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs);
- }
- #endif
- #endif
- #if CYTHON_VECTORCALL
- vectorcallfunc f = _PyVectorcall_Function(func);
- if (f) {
- return f(func, args, (size_t)nargs, kwargs);
- }
- #elif defined(__Pyx_CyFunction_USED) && CYTHON_BACKPORT_VECTORCALL
- if (__Pyx_CyFunction_CheckExact(func)) {
- __pyx_vectorcallfunc f = __Pyx_CyFunction_func_vectorcall(func);
- if (f) return f(func, args, (size_t)nargs, kwargs);
- }
- #endif
- if (nargs == 0) {
- return __Pyx_PyObject_Call(func, __pyx_empty_tuple, kwargs);
- }
- return __Pyx_PyObject_FastCall_fallback(func, args, (size_t)nargs, kwargs);
-}
-
-/* PyObjectCallOneArg */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) {
- PyObject *args[2] = {NULL, arg};
- return __Pyx_PyObject_FastCall(func, args+1, 1 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET);
-}
-
-/* ObjectGetItem */
-#if CYTHON_USE_TYPE_SLOTS
-static PyObject *__Pyx_PyObject_GetIndex(PyObject *obj, PyObject *index) {
- PyObject *runerr = NULL;
- Py_ssize_t key_value;
- key_value = __Pyx_PyIndex_AsSsize_t(index);
- if (likely(key_value != -1 || !(runerr = PyErr_Occurred()))) {
- return __Pyx_GetItemInt_Fast(obj, key_value, 0, 1, 1);
- }
- if (PyErr_GivenExceptionMatches(runerr, PyExc_OverflowError)) {
- __Pyx_TypeName index_type_name = __Pyx_PyType_GetName(Py_TYPE(index));
- PyErr_Clear();
- PyErr_Format(PyExc_IndexError,
- "cannot fit '" __Pyx_FMT_TYPENAME "' into an index-sized integer", index_type_name);
- __Pyx_DECREF_TypeName(index_type_name);
- }
- return NULL;
-}
-static PyObject *__Pyx_PyObject_GetItem_Slow(PyObject *obj, PyObject *key) {
- __Pyx_TypeName obj_type_name;
- if (likely(PyType_Check(obj))) {
- PyObject *meth = __Pyx_PyObject_GetAttrStrNoError(obj, __pyx_n_s_class_getitem);
- if (meth) {
- PyObject *result = __Pyx_PyObject_CallOneArg(meth, key);
- Py_DECREF(meth);
- return result;
- }
- }
- obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj));
- PyErr_Format(PyExc_TypeError,
- "'" __Pyx_FMT_TYPENAME "' object is not subscriptable", obj_type_name);
- __Pyx_DECREF_TypeName(obj_type_name);
- return NULL;
-}
-static PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject *key) {
- PyTypeObject *tp = Py_TYPE(obj);
- PyMappingMethods *mm = tp->tp_as_mapping;
- PySequenceMethods *sm = tp->tp_as_sequence;
- if (likely(mm && mm->mp_subscript)) {
- return mm->mp_subscript(obj, key);
- }
- if (likely(sm && sm->sq_item)) {
- return __Pyx_PyObject_GetIndex(obj, key);
- }
- return __Pyx_PyObject_GetItem_Slow(obj, key);
-}
-#endif
-
-/* PyIntBinop */
-#if !CYTHON_COMPILING_IN_PYPY
-static PyObject* __Pyx_PyInt_SubtractObjC(PyObject *op1, PyObject *op2, long intval, int inplace, int zerodivision_check) {
- CYTHON_MAYBE_UNUSED_VAR(intval);
- CYTHON_MAYBE_UNUSED_VAR(inplace);
- CYTHON_UNUSED_VAR(zerodivision_check);
- #if PY_MAJOR_VERSION < 3
- if (likely(PyInt_CheckExact(op1))) {
- const long b = intval;
- long x;
- long a = PyInt_AS_LONG(op1);
-
- x = (long)((unsigned long)a - (unsigned long)b);
- if (likely((x^a) >= 0 || (x^~b) >= 0))
- return PyInt_FromLong(x);
- return PyLong_Type.tp_as_number->nb_subtract(op1, op2);
- }
- #endif
- #if CYTHON_USE_PYLONG_INTERNALS
- if (likely(PyLong_CheckExact(op1))) {
- const long b = intval;
- long a, x;
-#ifdef HAVE_LONG_LONG
- const PY_LONG_LONG llb = intval;
- PY_LONG_LONG lla, llx;
-#endif
- if (unlikely(__Pyx_PyLong_IsZero(op1))) {
- return PyLong_FromLong(-intval);
- }
- if (likely(__Pyx_PyLong_IsCompact(op1))) {
- a = __Pyx_PyLong_CompactValue(op1);
- } else {
- const digit* digits = __Pyx_PyLong_Digits(op1);
- const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(op1);
- switch (size) {
- case -2:
- if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) {
- a = -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 2:
- if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) {
- a = (long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 2 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case -3:
- if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) {
- a = -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 3:
- if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) {
- a = (long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 3 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((((unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case -4:
- if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) {
- a = -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) {
- lla = -(PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- case 4:
- if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) {
- a = (long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]));
- break;
- #ifdef HAVE_LONG_LONG
- } else if (8 * sizeof(PY_LONG_LONG) - 1 > 4 * PyLong_SHIFT) {
- lla = (PY_LONG_LONG) (((((((((unsigned PY_LONG_LONG)digits[3]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[2]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[1]) << PyLong_SHIFT) | (unsigned PY_LONG_LONG)digits[0]));
- goto long_long;
- #endif
- }
- CYTHON_FALLTHROUGH;
- default: return PyLong_Type.tp_as_number->nb_subtract(op1, op2);
- }
- }
- x = a - b;
- return PyLong_FromLong(x);
-#ifdef HAVE_LONG_LONG
- long_long:
- llx = lla - llb;
- return PyLong_FromLongLong(llx);
-#endif
-
-
- }
- #endif
- if (PyFloat_CheckExact(op1)) {
- const long b = intval;
-#if CYTHON_COMPILING_IN_LIMITED_API
- double a = __pyx_PyFloat_AsDouble(op1);
-#else
- double a = PyFloat_AS_DOUBLE(op1);
-#endif
- double result;
-
- PyFPE_START_PROTECT("subtract", return NULL)
- result = ((double)a) - (double)b;
- PyFPE_END_PROTECT(result)
- return PyFloat_FromDouble(result);
- }
- return (inplace ? PyNumber_InPlaceSubtract : PyNumber_Subtract)(op1, op2);
-}
-#endif
-
-/* PyDictVersioning */
-#if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS
-static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) {
- PyObject *dict = Py_TYPE(obj)->tp_dict;
- return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0;
-}
-static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) {
- PyObject **dictptr = NULL;
- Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset;
- if (offset) {
-#if CYTHON_COMPILING_IN_CPYTHON
- dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj);
-#else
- dictptr = _PyObject_GetDictPtr(obj);
-#endif
- }
- return (dictptr && *dictptr) ? __PYX_GET_DICT_VERSION(*dictptr) : 0;
-}
-static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) {
- PyObject *dict = Py_TYPE(obj)->tp_dict;
- if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict)))
- return 0;
- return obj_dict_version == __Pyx_get_object_dict_version(obj);
-}
-#endif
-
-/* GetModuleGlobalName */
-#if CYTHON_USE_DICT_VERSIONS
-static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value)
-#else
-static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name)
-#endif
-{
- PyObject *result;
-#if !CYTHON_AVOID_BORROWED_REFS
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1
- result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash);
- __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version)
- if (likely(result)) {
- return __Pyx_NewRef(result);
- } else if (unlikely(PyErr_Occurred())) {
- return NULL;
- }
-#elif CYTHON_COMPILING_IN_LIMITED_API
- if (unlikely(!__pyx_m)) {
- return NULL;
- }
- result = PyObject_GetAttr(__pyx_m, name);
- if (likely(result)) {
- return result;
- }
-#else
- result = PyDict_GetItem(__pyx_d, name);
- __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version)
- if (likely(result)) {
- return __Pyx_NewRef(result);
- }
-#endif
-#else
- result = PyObject_GetItem(__pyx_d, name);
- __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version)
- if (likely(result)) {
- return __Pyx_NewRef(result);
- }
- PyErr_Clear();
-#endif
- return __Pyx_GetBuiltinName(name);
-}
-
-/* RaiseUnboundLocalError */
-static CYTHON_INLINE void __Pyx_RaiseUnboundLocalError(const char *varname) {
- PyErr_Format(PyExc_UnboundLocalError, "local variable '%s' referenced before assignment", varname);
-}
-
-/* RaiseTooManyValuesToUnpack */
-static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected) {
- PyErr_Format(PyExc_ValueError,
- "too many values to unpack (expected %" CYTHON_FORMAT_SSIZE_T "d)", expected);
-}
-
-/* RaiseNeedMoreValuesToUnpack */
-static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index) {
- PyErr_Format(PyExc_ValueError,
- "need more than %" CYTHON_FORMAT_SSIZE_T "d value%.1s to unpack",
- index, (index == 1) ? "" : "s");
-}
-
-/* IterFinish */
-static CYTHON_INLINE int __Pyx_IterFinish(void) {
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- PyObject* exc_type = __Pyx_PyErr_CurrentExceptionType();
- if (unlikely(exc_type)) {
- if (unlikely(!__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration)))
- return -1;
- __Pyx_PyErr_Clear();
- return 0;
- }
- return 0;
-}
-
-/* UnpackItemEndCheck */
-static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected) {
- if (unlikely(retval)) {
- Py_DECREF(retval);
- __Pyx_RaiseTooManyValuesError(expected);
- return -1;
- }
- return __Pyx_IterFinish();
-}
-
-/* py_abs */
-#if CYTHON_USE_PYLONG_INTERNALS
-static PyObject *__Pyx_PyLong_AbsNeg(PyObject *n) {
-#if PY_VERSION_HEX >= 0x030C00A7
- if (likely(__Pyx_PyLong_IsCompact(n))) {
- return PyLong_FromSize_t(__Pyx_PyLong_CompactValueUnsigned(n));
- }
-#else
- if (likely(Py_SIZE(n) == -1)) {
- return PyLong_FromUnsignedLong(__Pyx_PyLong_Digits(n)[0]);
- }
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON
- {
- PyObject *copy = _PyLong_Copy((PyLongObject*)n);
- if (likely(copy)) {
- #if PY_VERSION_HEX >= 0x030C00A7
- ((PyLongObject*)copy)->long_value.lv_tag = ((PyLongObject*)copy)->long_value.lv_tag & ~_PyLong_SIGN_MASK;
- #else
- __Pyx_SET_SIZE(copy, -Py_SIZE(copy));
- #endif
- }
- return copy;
- }
-#else
- return PyNumber_Negative(n);
-#endif
-}
-#endif
-
-/* GetException */
-#if CYTHON_FAST_THREAD_STATE
-static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb)
-#else
-static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb)
-#endif
-{
- PyObject *local_type = NULL, *local_value, *local_tb = NULL;
-#if CYTHON_FAST_THREAD_STATE
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- #if PY_VERSION_HEX >= 0x030C00A6
- local_value = tstate->current_exception;
- tstate->current_exception = 0;
- if (likely(local_value)) {
- local_type = (PyObject*) Py_TYPE(local_value);
- Py_INCREF(local_type);
- local_tb = PyException_GetTraceback(local_value);
- }
- #else
- local_type = tstate->curexc_type;
- local_value = tstate->curexc_value;
- local_tb = tstate->curexc_traceback;
- tstate->curexc_type = 0;
- tstate->curexc_value = 0;
- tstate->curexc_traceback = 0;
- #endif
-#else
- PyErr_Fetch(&local_type, &local_value, &local_tb);
-#endif
- PyErr_NormalizeException(&local_type, &local_value, &local_tb);
-#if CYTHON_FAST_THREAD_STATE && PY_VERSION_HEX >= 0x030C00A6
- if (unlikely(tstate->current_exception))
-#elif CYTHON_FAST_THREAD_STATE
- if (unlikely(tstate->curexc_type))
-#else
- if (unlikely(PyErr_Occurred()))
-#endif
- goto bad;
- #if PY_MAJOR_VERSION >= 3
- if (local_tb) {
- if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0))
- goto bad;
- }
- #endif
- Py_XINCREF(local_tb);
- Py_XINCREF(local_type);
- Py_XINCREF(local_value);
- *type = local_type;
- *value = local_value;
- *tb = local_tb;
-#if CYTHON_FAST_THREAD_STATE
- #if CYTHON_USE_EXC_INFO_STACK
- {
- _PyErr_StackItem *exc_info = tstate->exc_info;
- #if PY_VERSION_HEX >= 0x030B00a4
- tmp_value = exc_info->exc_value;
- exc_info->exc_value = local_value;
- tmp_type = NULL;
- tmp_tb = NULL;
- Py_XDECREF(local_type);
- Py_XDECREF(local_tb);
- #else
- tmp_type = exc_info->exc_type;
- tmp_value = exc_info->exc_value;
- tmp_tb = exc_info->exc_traceback;
- exc_info->exc_type = local_type;
- exc_info->exc_value = local_value;
- exc_info->exc_traceback = local_tb;
- #endif
- }
- #else
- tmp_type = tstate->exc_type;
- tmp_value = tstate->exc_value;
- tmp_tb = tstate->exc_traceback;
- tstate->exc_type = local_type;
- tstate->exc_value = local_value;
- tstate->exc_traceback = local_tb;
- #endif
- Py_XDECREF(tmp_type);
- Py_XDECREF(tmp_value);
- Py_XDECREF(tmp_tb);
-#else
- PyErr_SetExcInfo(local_type, local_value, local_tb);
-#endif
- return 0;
-bad:
- *type = 0;
- *value = 0;
- *tb = 0;
- Py_XDECREF(local_type);
- Py_XDECREF(local_value);
- Py_XDECREF(local_tb);
- return -1;
-}
-
-/* pep479 */
-static void __Pyx_Generator_Replace_StopIteration(int in_async_gen) {
- PyObject *exc, *val, *tb, *cur_exc;
- __Pyx_PyThreadState_declare
- #ifdef __Pyx_StopAsyncIteration_USED
- int is_async_stopiteration = 0;
- #endif
- CYTHON_MAYBE_UNUSED_VAR(in_async_gen);
- cur_exc = PyErr_Occurred();
- if (likely(!__Pyx_PyErr_GivenExceptionMatches(cur_exc, PyExc_StopIteration))) {
- #ifdef __Pyx_StopAsyncIteration_USED
- if (in_async_gen && unlikely(__Pyx_PyErr_GivenExceptionMatches(cur_exc, __Pyx_PyExc_StopAsyncIteration))) {
- is_async_stopiteration = 1;
- } else
- #endif
- return;
- }
- __Pyx_PyThreadState_assign
- __Pyx_GetException(&exc, &val, &tb);
- Py_XDECREF(exc);
- Py_XDECREF(val);
- Py_XDECREF(tb);
- PyErr_SetString(PyExc_RuntimeError,
- #ifdef __Pyx_StopAsyncIteration_USED
- is_async_stopiteration ? "async generator raised StopAsyncIteration" :
- in_async_gen ? "async generator raised StopIteration" :
- #endif
- "generator raised StopIteration");
-}
-
-/* ArgTypeTest */
-static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact)
-{
- __Pyx_TypeName type_name;
- __Pyx_TypeName obj_type_name;
- if (unlikely(!type)) {
- PyErr_SetString(PyExc_SystemError, "Missing type object");
- return 0;
- }
- else if (exact) {
- #if PY_MAJOR_VERSION == 2
- if ((type == &PyBaseString_Type) && likely(__Pyx_PyBaseString_CheckExact(obj))) return 1;
- #endif
- }
- else {
- if (likely(__Pyx_TypeCheck(obj, type))) return 1;
- }
- type_name = __Pyx_PyType_GetName(type);
- obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj));
- PyErr_Format(PyExc_TypeError,
- "Argument '%.200s' has incorrect type (expected " __Pyx_FMT_TYPENAME
- ", got " __Pyx_FMT_TYPENAME ")", name, type_name, obj_type_name);
- __Pyx_DECREF_TypeName(type_name);
- __Pyx_DECREF_TypeName(obj_type_name);
- return 0;
-}
-
-/* DictGetItem */
-#if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY
-static PyObject *__Pyx_PyDict_GetItem(PyObject *d, PyObject* key) {
- PyObject *value;
- value = PyDict_GetItemWithError(d, key);
- if (unlikely(!value)) {
- if (!PyErr_Occurred()) {
- if (unlikely(PyTuple_Check(key))) {
- PyObject* args = PyTuple_Pack(1, key);
- if (likely(args)) {
- PyErr_SetObject(PyExc_KeyError, args);
- Py_DECREF(args);
- }
- } else {
- PyErr_SetObject(PyExc_KeyError, key);
- }
- }
- return NULL;
- }
- Py_INCREF(value);
- return value;
-}
-#endif
-
-/* pyfrozenset_new */
-static CYTHON_INLINE PyObject* __Pyx_PyFrozenSet_New(PyObject* it) {
- if (it) {
- PyObject* result;
-#if CYTHON_COMPILING_IN_PYPY
- PyObject* args;
- args = PyTuple_Pack(1, it);
- if (unlikely(!args))
- return NULL;
- result = PyObject_Call((PyObject*)&PyFrozenSet_Type, args, NULL);
- Py_DECREF(args);
- return result;
-#else
- if (PyFrozenSet_CheckExact(it)) {
- Py_INCREF(it);
- return it;
- }
- result = PyFrozenSet_New(it);
- if (unlikely(!result))
- return NULL;
- if ((PY_VERSION_HEX >= 0x031000A1) || likely(PySet_GET_SIZE(result)))
- return result;
- Py_DECREF(result);
-#endif
- }
-#if CYTHON_USE_TYPE_SLOTS
- return PyFrozenSet_Type.tp_new(&PyFrozenSet_Type, __pyx_empty_tuple, NULL);
-#else
- return PyObject_Call((PyObject*)&PyFrozenSet_Type, __pyx_empty_tuple, NULL);
-#endif
-}
-
-/* PySetContains */
-static int __Pyx_PySet_ContainsUnhashable(PyObject *set, PyObject *key) {
- int result = -1;
- if (PySet_Check(key) && PyErr_ExceptionMatches(PyExc_TypeError)) {
- PyObject *tmpkey;
- PyErr_Clear();
- tmpkey = __Pyx_PyFrozenSet_New(key);
- if (tmpkey != NULL) {
- result = PySet_Contains(set, tmpkey);
- Py_DECREF(tmpkey);
- }
- }
- return result;
-}
-static CYTHON_INLINE int __Pyx_PySet_ContainsTF(PyObject* key, PyObject* set, int eq) {
- int result = PySet_Contains(set, key);
- if (unlikely(result < 0)) {
- result = __Pyx_PySet_ContainsUnhashable(set, key);
- }
- return unlikely(result < 0) ? result : (result == (eq == Py_EQ));
-}
-
-/* RaiseUnexpectedTypeError */
-static int
-__Pyx_RaiseUnexpectedTypeError(const char *expected, PyObject *obj)
-{
- __Pyx_TypeName obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj));
- PyErr_Format(PyExc_TypeError, "Expected %s, got " __Pyx_FMT_TYPENAME,
- expected, obj_type_name);
- __Pyx_DECREF_TypeName(obj_type_name);
- return 0;
-}
-
-/* SliceTupleAndList */
-#if CYTHON_COMPILING_IN_CPYTHON
-static CYTHON_INLINE void __Pyx_crop_slice(Py_ssize_t* _start, Py_ssize_t* _stop, Py_ssize_t* _length) {
- Py_ssize_t start = *_start, stop = *_stop, length = *_length;
- if (start < 0) {
- start += length;
- if (start < 0)
- start = 0;
- }
- if (stop < 0)
- stop += length;
- else if (stop > length)
- stop = length;
- *_length = stop - start;
- *_start = start;
- *_stop = stop;
-}
-static CYTHON_INLINE PyObject* __Pyx_PyList_GetSlice(
- PyObject* src, Py_ssize_t start, Py_ssize_t stop) {
- Py_ssize_t length = PyList_GET_SIZE(src);
- __Pyx_crop_slice(&start, &stop, &length);
- if (length <= 0) {
- return PyList_New(0);
- }
- return __Pyx_PyList_FromArray(((PyListObject*)src)->ob_item + start, length);
-}
-static CYTHON_INLINE PyObject* __Pyx_PyTuple_GetSlice(
- PyObject* src, Py_ssize_t start, Py_ssize_t stop) {
- Py_ssize_t length = PyTuple_GET_SIZE(src);
- __Pyx_crop_slice(&start, &stop, &length);
- return __Pyx_PyTuple_FromArray(((PyTupleObject*)src)->ob_item + start, length);
-}
-#endif
-
-/* set_iter */
-static CYTHON_INLINE PyObject* __Pyx_set_iterator(PyObject* iterable, int is_set,
- Py_ssize_t* p_orig_length, int* p_source_is_set) {
-#if CYTHON_COMPILING_IN_CPYTHON
- is_set = is_set || likely(PySet_CheckExact(iterable) || PyFrozenSet_CheckExact(iterable));
- *p_source_is_set = is_set;
- if (likely(is_set)) {
- *p_orig_length = PySet_Size(iterable);
- Py_INCREF(iterable);
- return iterable;
- }
-#else
- CYTHON_UNUSED_VAR(is_set);
- *p_source_is_set = 0;
-#endif
- *p_orig_length = 0;
- return PyObject_GetIter(iterable);
-}
-static CYTHON_INLINE int __Pyx_set_iter_next(
- PyObject* iter_obj, Py_ssize_t orig_length,
- Py_ssize_t* ppos, PyObject **value,
- int source_is_set) {
- if (!CYTHON_COMPILING_IN_CPYTHON || unlikely(!source_is_set)) {
- *value = PyIter_Next(iter_obj);
- if (unlikely(!*value)) {
- return __Pyx_IterFinish();
- }
- CYTHON_UNUSED_VAR(orig_length);
- CYTHON_UNUSED_VAR(ppos);
- return 1;
- }
-#if CYTHON_COMPILING_IN_CPYTHON
- if (unlikely(PySet_GET_SIZE(iter_obj) != orig_length)) {
- PyErr_SetString(
- PyExc_RuntimeError,
- "set changed size during iteration");
- return -1;
- }
- {
- Py_hash_t hash;
- int ret = _PySet_NextEntry(iter_obj, ppos, value, &hash);
- assert (ret != -1);
- if (likely(ret)) {
- Py_INCREF(*value);
- return 1;
- }
- }
-#endif
- return 0;
-}
-
-/* RaiseClosureNameError */
-static CYTHON_INLINE void __Pyx_RaiseClosureNameError(const char *varname) {
- PyErr_Format(PyExc_NameError, "free variable '%s' referenced before assignment in enclosing scope", varname);
-}
-
-/* PyIntCompare */
-static CYTHON_INLINE int __Pyx_PyInt_BoolEqObjC(PyObject *op1, PyObject *op2, long intval, long inplace) {
- CYTHON_MAYBE_UNUSED_VAR(intval);
- CYTHON_UNUSED_VAR(inplace);
- if (op1 == op2) {
- return 1;
- }
- #if PY_MAJOR_VERSION < 3
- if (likely(PyInt_CheckExact(op1))) {
- const long b = intval;
- long a = PyInt_AS_LONG(op1);
- return (a == b);
- }
- #endif
- #if CYTHON_USE_PYLONG_INTERNALS
- if (likely(PyLong_CheckExact(op1))) {
- int unequal;
- unsigned long uintval;
- Py_ssize_t size = __Pyx_PyLong_DigitCount(op1);
- const digit* digits = __Pyx_PyLong_Digits(op1);
- if (intval == 0) {
- return (__Pyx_PyLong_IsZero(op1) == 1);
- } else if (intval < 0) {
- if (__Pyx_PyLong_IsNonNeg(op1))
- return 0;
- intval = -intval;
- } else {
- if (__Pyx_PyLong_IsNeg(op1))
- return 0;
- }
- uintval = (unsigned long) intval;
-#if PyLong_SHIFT * 4 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 4)) {
- unequal = (size != 5) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[4] != ((uintval >> (4 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 3 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 3)) {
- unequal = (size != 4) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 2 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 2)) {
- unequal = (size != 3) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
-#if PyLong_SHIFT * 1 < SIZEOF_LONG*8
- if (uintval >> (PyLong_SHIFT * 1)) {
- unequal = (size != 2) || (digits[0] != (uintval & (unsigned long) PyLong_MASK))
- | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK));
- } else
-#endif
- unequal = (size != 1) || (((unsigned long) digits[0]) != (uintval & (unsigned long) PyLong_MASK));
- return (unequal == 0);
- }
- #endif
- if (PyFloat_CheckExact(op1)) {
- const long b = intval;
-#if CYTHON_COMPILING_IN_LIMITED_API
- double a = __pyx_PyFloat_AsDouble(op1);
-#else
- double a = PyFloat_AS_DOUBLE(op1);
-#endif
- return ((double)a == (double)b);
- }
- return __Pyx_PyObject_IsTrueAndDecref(
- PyObject_RichCompare(op1, op2, Py_EQ));
-}
-
-/* py_set_discard_unhashable */
-static int __Pyx_PySet_DiscardUnhashable(PyObject *set, PyObject *key) {
- PyObject *tmpkey;
- int rv;
- if (likely(!PySet_Check(key) || !PyErr_ExceptionMatches(PyExc_TypeError)))
- return -1;
- PyErr_Clear();
- tmpkey = __Pyx_PyFrozenSet_New(key);
- if (tmpkey == NULL)
- return -1;
- rv = PySet_Discard(set, tmpkey);
- Py_DECREF(tmpkey);
- return rv;
-}
-
-/* py_set_remove */
-static int __Pyx_PySet_RemoveNotFound(PyObject *set, PyObject *key, int found) {
- if (unlikely(found < 0)) {
- found = __Pyx_PySet_DiscardUnhashable(set, key);
- }
- if (likely(found == 0)) {
- PyObject *tup;
- tup = PyTuple_Pack(1, key);
- if (!tup)
- return -1;
- PyErr_SetObject(PyExc_KeyError, tup);
- Py_DECREF(tup);
- return -1;
- }
- return found;
-}
-static CYTHON_INLINE int __Pyx_PySet_Remove(PyObject *set, PyObject *key) {
- int found = PySet_Discard(set, key);
- if (unlikely(found != 1)) {
- return __Pyx_PySet_RemoveNotFound(set, key, found);
- }
- return 0;
-}
-
-/* FixUpExtensionType */
-#if CYTHON_USE_TYPE_SPECS
-static int __Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type) {
-#if PY_VERSION_HEX > 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API
- CYTHON_UNUSED_VAR(spec);
- CYTHON_UNUSED_VAR(type);
-#else
- const PyType_Slot *slot = spec->slots;
- while (slot && slot->slot && slot->slot != Py_tp_members)
- slot++;
- if (slot && slot->slot == Py_tp_members) {
- int changed = 0;
-#if !(PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON)
- const
-#endif
- PyMemberDef *memb = (PyMemberDef*) slot->pfunc;
- while (memb && memb->name) {
- if (memb->name[0] == '_' && memb->name[1] == '_') {
-#if PY_VERSION_HEX < 0x030900b1
- if (strcmp(memb->name, "__weaklistoffset__") == 0) {
- assert(memb->type == T_PYSSIZET);
- assert(memb->flags == READONLY);
- type->tp_weaklistoffset = memb->offset;
- changed = 1;
- }
- else if (strcmp(memb->name, "__dictoffset__") == 0) {
- assert(memb->type == T_PYSSIZET);
- assert(memb->flags == READONLY);
- type->tp_dictoffset = memb->offset;
- changed = 1;
- }
-#if CYTHON_METH_FASTCALL
- else if (strcmp(memb->name, "__vectorcalloffset__") == 0) {
- assert(memb->type == T_PYSSIZET);
- assert(memb->flags == READONLY);
-#if PY_VERSION_HEX >= 0x030800b4
- type->tp_vectorcall_offset = memb->offset;
-#else
- type->tp_print = (printfunc) memb->offset;
-#endif
- changed = 1;
- }
-#endif
-#else
- if ((0));
-#endif
-#if PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON
- else if (strcmp(memb->name, "__module__") == 0) {
- PyObject *descr;
- assert(memb->type == T_OBJECT);
- assert(memb->flags == 0 || memb->flags == READONLY);
- descr = PyDescr_NewMember(type, memb);
- if (unlikely(!descr))
- return -1;
- if (unlikely(PyDict_SetItem(type->tp_dict, PyDescr_NAME(descr), descr) < 0)) {
- Py_DECREF(descr);
- return -1;
- }
- Py_DECREF(descr);
- changed = 1;
- }
-#endif
- }
- memb++;
- }
- if (changed)
- PyType_Modified(type);
- }
-#endif
- return 0;
-}
-#endif
-
-/* PyObjectCallNoArg */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func) {
- PyObject *arg = NULL;
- return __Pyx_PyObject_FastCall(func, (&arg)+1, 0 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET);
-}
-
-/* PyObjectGetMethod */
-static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method) {
- PyObject *attr;
-#if CYTHON_UNPACK_METHODS && CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_PYTYPE_LOOKUP
- __Pyx_TypeName type_name;
- PyTypeObject *tp = Py_TYPE(obj);
- PyObject *descr;
- descrgetfunc f = NULL;
- PyObject **dictptr, *dict;
- int meth_found = 0;
- assert (*method == NULL);
- if (unlikely(tp->tp_getattro != PyObject_GenericGetAttr)) {
- attr = __Pyx_PyObject_GetAttrStr(obj, name);
- goto try_unpack;
- }
- if (unlikely(tp->tp_dict == NULL) && unlikely(PyType_Ready(tp) < 0)) {
- return 0;
- }
- descr = _PyType_Lookup(tp, name);
- if (likely(descr != NULL)) {
- Py_INCREF(descr);
-#if defined(Py_TPFLAGS_METHOD_DESCRIPTOR) && Py_TPFLAGS_METHOD_DESCRIPTOR
- if (__Pyx_PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_METHOD_DESCRIPTOR))
-#elif PY_MAJOR_VERSION >= 3
- #ifdef __Pyx_CyFunction_USED
- if (likely(PyFunction_Check(descr) || __Pyx_IS_TYPE(descr, &PyMethodDescr_Type) || __Pyx_CyFunction_Check(descr)))
- #else
- if (likely(PyFunction_Check(descr) || __Pyx_IS_TYPE(descr, &PyMethodDescr_Type)))
- #endif
-#else
- #ifdef __Pyx_CyFunction_USED
- if (likely(PyFunction_Check(descr) || __Pyx_CyFunction_Check(descr)))
- #else
- if (likely(PyFunction_Check(descr)))
- #endif
-#endif
- {
- meth_found = 1;
- } else {
- f = Py_TYPE(descr)->tp_descr_get;
- if (f != NULL && PyDescr_IsData(descr)) {
- attr = f(descr, obj, (PyObject *)Py_TYPE(obj));
- Py_DECREF(descr);
- goto try_unpack;
- }
- }
- }
- dictptr = _PyObject_GetDictPtr(obj);
- if (dictptr != NULL && (dict = *dictptr) != NULL) {
- Py_INCREF(dict);
- attr = __Pyx_PyDict_GetItemStr(dict, name);
- if (attr != NULL) {
- Py_INCREF(attr);
- Py_DECREF(dict);
- Py_XDECREF(descr);
- goto try_unpack;
- }
- Py_DECREF(dict);
- }
- if (meth_found) {
- *method = descr;
- return 1;
- }
- if (f != NULL) {
- attr = f(descr, obj, (PyObject *)Py_TYPE(obj));
- Py_DECREF(descr);
- goto try_unpack;
- }
- if (likely(descr != NULL)) {
- *method = descr;
- return 0;
- }
- type_name = __Pyx_PyType_GetName(tp);
- PyErr_Format(PyExc_AttributeError,
-#if PY_MAJOR_VERSION >= 3
- "'" __Pyx_FMT_TYPENAME "' object has no attribute '%U'",
- type_name, name);
-#else
- "'" __Pyx_FMT_TYPENAME "' object has no attribute '%.400s'",
- type_name, PyString_AS_STRING(name));
-#endif
- __Pyx_DECREF_TypeName(type_name);
- return 0;
-#else
- attr = __Pyx_PyObject_GetAttrStr(obj, name);
- goto try_unpack;
-#endif
-try_unpack:
-#if CYTHON_UNPACK_METHODS
- if (likely(attr) && PyMethod_Check(attr) && likely(PyMethod_GET_SELF(attr) == obj)) {
- PyObject *function = PyMethod_GET_FUNCTION(attr);
- Py_INCREF(function);
- Py_DECREF(attr);
- *method = function;
- return 1;
- }
-#endif
- *method = attr;
- return 0;
-}
-
-/* PyObjectCallMethod0 */
-static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name) {
- PyObject *method = NULL, *result = NULL;
- int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method);
- if (likely(is_method)) {
- result = __Pyx_PyObject_CallOneArg(method, obj);
- Py_DECREF(method);
- return result;
- }
- if (unlikely(!method)) goto bad;
- result = __Pyx_PyObject_CallNoArg(method);
- Py_DECREF(method);
-bad:
- return result;
-}
-
-/* ValidateBasesTuple */
-#if CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API || CYTHON_USE_TYPE_SPECS
-static int __Pyx_validate_bases_tuple(const char *type_name, Py_ssize_t dictoffset, PyObject *bases) {
- Py_ssize_t i, n = PyTuple_GET_SIZE(bases);
- for (i = 1; i < n; i++)
- {
- PyObject *b0 = PyTuple_GET_ITEM(bases, i);
- PyTypeObject *b;
-#if PY_MAJOR_VERSION < 3
- if (PyClass_Check(b0))
- {
- PyErr_Format(PyExc_TypeError, "base class '%.200s' is an old-style class",
- PyString_AS_STRING(((PyClassObject*)b0)->cl_name));
- return -1;
- }
-#endif
- b = (PyTypeObject*) b0;
- if (!__Pyx_PyType_HasFeature(b, Py_TPFLAGS_HEAPTYPE))
- {
- __Pyx_TypeName b_name = __Pyx_PyType_GetName(b);
- PyErr_Format(PyExc_TypeError,
- "base class '" __Pyx_FMT_TYPENAME "' is not a heap type", b_name);
- __Pyx_DECREF_TypeName(b_name);
- return -1;
- }
- if (dictoffset == 0 && b->tp_dictoffset)
- {
- __Pyx_TypeName b_name = __Pyx_PyType_GetName(b);
- PyErr_Format(PyExc_TypeError,
- "extension type '%.200s' has no __dict__ slot, "
- "but base type '" __Pyx_FMT_TYPENAME "' has: "
- "either add 'cdef dict __dict__' to the extension type "
- "or add '__slots__ = [...]' to the base type",
- type_name, b_name);
- __Pyx_DECREF_TypeName(b_name);
- return -1;
- }
- }
- return 0;
-}
-#endif
-
-/* PyType_Ready */
-static int __Pyx_PyType_Ready(PyTypeObject *t) {
-#if CYTHON_USE_TYPE_SPECS || !(CYTHON_COMPILING_IN_CPYTHON || CYTHON_COMPILING_IN_LIMITED_API) || defined(PYSTON_MAJOR_VERSION)
- (void)__Pyx_PyObject_CallMethod0;
-#if CYTHON_USE_TYPE_SPECS
- (void)__Pyx_validate_bases_tuple;
-#endif
- return PyType_Ready(t);
-#else
- int r;
- PyObject *bases = __Pyx_PyType_GetSlot(t, tp_bases, PyObject*);
- if (bases && unlikely(__Pyx_validate_bases_tuple(t->tp_name, t->tp_dictoffset, bases) == -1))
- return -1;
-#if PY_VERSION_HEX >= 0x03050000 && !defined(PYSTON_MAJOR_VERSION)
- {
- int gc_was_enabled;
- #if PY_VERSION_HEX >= 0x030A00b1
- gc_was_enabled = PyGC_Disable();
- (void)__Pyx_PyObject_CallMethod0;
- #else
- PyObject *ret, *py_status;
- PyObject *gc = NULL;
- #if PY_VERSION_HEX >= 0x030700a1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM+0 >= 0x07030400)
- gc = PyImport_GetModule(__pyx_kp_u_gc);
- #endif
- if (unlikely(!gc)) gc = PyImport_Import(__pyx_kp_u_gc);
- if (unlikely(!gc)) return -1;
- py_status = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_isenabled);
- if (unlikely(!py_status)) {
- Py_DECREF(gc);
- return -1;
- }
- gc_was_enabled = __Pyx_PyObject_IsTrue(py_status);
- Py_DECREF(py_status);
- if (gc_was_enabled > 0) {
- ret = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_disable);
- if (unlikely(!ret)) {
- Py_DECREF(gc);
- return -1;
- }
- Py_DECREF(ret);
- } else if (unlikely(gc_was_enabled == -1)) {
- Py_DECREF(gc);
- return -1;
- }
- #endif
- t->tp_flags |= Py_TPFLAGS_HEAPTYPE;
-#if PY_VERSION_HEX >= 0x030A0000
- t->tp_flags |= Py_TPFLAGS_IMMUTABLETYPE;
-#endif
-#else
- (void)__Pyx_PyObject_CallMethod0;
-#endif
- r = PyType_Ready(t);
-#if PY_VERSION_HEX >= 0x03050000 && !defined(PYSTON_MAJOR_VERSION)
- t->tp_flags &= ~Py_TPFLAGS_HEAPTYPE;
- #if PY_VERSION_HEX >= 0x030A00b1
- if (gc_was_enabled)
- PyGC_Enable();
- #else
- if (gc_was_enabled) {
- PyObject *tp, *v, *tb;
- PyErr_Fetch(&tp, &v, &tb);
- ret = __Pyx_PyObject_CallMethod0(gc, __pyx_kp_u_enable);
- if (likely(ret || r == -1)) {
- Py_XDECREF(ret);
- PyErr_Restore(tp, v, tb);
- } else {
- Py_XDECREF(tp);
- Py_XDECREF(v);
- Py_XDECREF(tb);
- r = -1;
- }
- }
- Py_DECREF(gc);
- #endif
- }
-#endif
- return r;
-#endif
-}
-
-/* PyObject_GenericGetAttrNoDict */
-#if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000
-static PyObject *__Pyx_RaiseGenericGetAttributeError(PyTypeObject *tp, PyObject *attr_name) {
- __Pyx_TypeName type_name = __Pyx_PyType_GetName(tp);
- PyErr_Format(PyExc_AttributeError,
-#if PY_MAJOR_VERSION >= 3
- "'" __Pyx_FMT_TYPENAME "' object has no attribute '%U'",
- type_name, attr_name);
-#else
- "'" __Pyx_FMT_TYPENAME "' object has no attribute '%.400s'",
- type_name, PyString_AS_STRING(attr_name));
-#endif
- __Pyx_DECREF_TypeName(type_name);
- return NULL;
-}
-static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name) {
- PyObject *descr;
- PyTypeObject *tp = Py_TYPE(obj);
- if (unlikely(!PyString_Check(attr_name))) {
- return PyObject_GenericGetAttr(obj, attr_name);
- }
- assert(!tp->tp_dictoffset);
- descr = _PyType_Lookup(tp, attr_name);
- if (unlikely(!descr)) {
- return __Pyx_RaiseGenericGetAttributeError(tp, attr_name);
- }
- Py_INCREF(descr);
- #if PY_MAJOR_VERSION < 3
- if (likely(PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_HAVE_CLASS)))
- #endif
- {
- descrgetfunc f = Py_TYPE(descr)->tp_descr_get;
- if (unlikely(f)) {
- PyObject *res = f(descr, obj, (PyObject *)tp);
- Py_DECREF(descr);
- return res;
- }
- }
- return descr;
-}
-#endif
-
-/* GetTopmostException */
-#if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE
-static _PyErr_StackItem *
-__Pyx_PyErr_GetTopmostException(PyThreadState *tstate)
-{
- _PyErr_StackItem *exc_info = tstate->exc_info;
- while ((exc_info->exc_value == NULL || exc_info->exc_value == Py_None) &&
- exc_info->previous_item != NULL)
- {
- exc_info = exc_info->previous_item;
- }
- return exc_info;
-}
-#endif
-
-/* SaveResetException */
-#if CYTHON_FAST_THREAD_STATE
-static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) {
- #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4
- _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate);
- PyObject *exc_value = exc_info->exc_value;
- if (exc_value == NULL || exc_value == Py_None) {
- *value = NULL;
- *type = NULL;
- *tb = NULL;
- } else {
- *value = exc_value;
- Py_INCREF(*value);
- *type = (PyObject*) Py_TYPE(exc_value);
- Py_INCREF(*type);
- *tb = PyException_GetTraceback(exc_value);
- }
- #elif CYTHON_USE_EXC_INFO_STACK
- _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate);
- *type = exc_info->exc_type;
- *value = exc_info->exc_value;
- *tb = exc_info->exc_traceback;
- Py_XINCREF(*type);
- Py_XINCREF(*value);
- Py_XINCREF(*tb);
- #else
- *type = tstate->exc_type;
- *value = tstate->exc_value;
- *tb = tstate->exc_traceback;
- Py_XINCREF(*type);
- Py_XINCREF(*value);
- Py_XINCREF(*tb);
- #endif
-}
-static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) {
- #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4
- _PyErr_StackItem *exc_info = tstate->exc_info;
- PyObject *tmp_value = exc_info->exc_value;
- exc_info->exc_value = value;
- Py_XDECREF(tmp_value);
- Py_XDECREF(type);
- Py_XDECREF(tb);
- #else
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- #if CYTHON_USE_EXC_INFO_STACK
- _PyErr_StackItem *exc_info = tstate->exc_info;
- tmp_type = exc_info->exc_type;
- tmp_value = exc_info->exc_value;
- tmp_tb = exc_info->exc_traceback;
- exc_info->exc_type = type;
- exc_info->exc_value = value;
- exc_info->exc_traceback = tb;
- #else
- tmp_type = tstate->exc_type;
- tmp_value = tstate->exc_value;
- tmp_tb = tstate->exc_traceback;
- tstate->exc_type = type;
- tstate->exc_value = value;
- tstate->exc_traceback = tb;
- #endif
- Py_XDECREF(tmp_type);
- Py_XDECREF(tmp_value);
- Py_XDECREF(tmp_tb);
- #endif
-}
-#endif
-
-/* FastTypeChecks */
-#if CYTHON_COMPILING_IN_CPYTHON
-static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) {
- while (a) {
- a = __Pyx_PyType_GetSlot(a, tp_base, PyTypeObject*);
- if (a == b)
- return 1;
- }
- return b == &PyBaseObject_Type;
-}
-static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) {
- PyObject *mro;
- if (a == b) return 1;
- mro = a->tp_mro;
- if (likely(mro)) {
- Py_ssize_t i, n;
- n = PyTuple_GET_SIZE(mro);
- for (i = 0; i < n; i++) {
- if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b)
- return 1;
- }
- return 0;
- }
- return __Pyx_InBases(a, b);
-}
-static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b) {
- PyObject *mro;
- if (cls == a || cls == b) return 1;
- mro = cls->tp_mro;
- if (likely(mro)) {
- Py_ssize_t i, n;
- n = PyTuple_GET_SIZE(mro);
- for (i = 0; i < n; i++) {
- PyObject *base = PyTuple_GET_ITEM(mro, i);
- if (base == (PyObject *)a || base == (PyObject *)b)
- return 1;
- }
- return 0;
- }
- return __Pyx_InBases(cls, a) || __Pyx_InBases(cls, b);
-}
-#if PY_MAJOR_VERSION == 2
-static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) {
- PyObject *exception, *value, *tb;
- int res;
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- __Pyx_ErrFetch(&exception, &value, &tb);
- res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0;
- if (unlikely(res == -1)) {
- PyErr_WriteUnraisable(err);
- res = 0;
- }
- if (!res) {
- res = PyObject_IsSubclass(err, exc_type2);
- if (unlikely(res == -1)) {
- PyErr_WriteUnraisable(err);
- res = 0;
- }
- }
- __Pyx_ErrRestore(exception, value, tb);
- return res;
-}
-#else
-static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) {
- if (exc_type1) {
- return __Pyx_IsAnySubtype2((PyTypeObject*)err, (PyTypeObject*)exc_type1, (PyTypeObject*)exc_type2);
- } else {
- return __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2);
- }
-}
-#endif
-static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) {
- Py_ssize_t i, n;
- assert(PyExceptionClass_Check(exc_type));
- n = PyTuple_GET_SIZE(tuple);
-#if PY_MAJOR_VERSION >= 3
-    for (i=0; i<n; i++) {
-        if (exc_type == PyTuple_GET_ITEM(tuple, i)) return 1;
-    }
-#endif
-    for (i=0; i<n; i++) {
-        PyObject *t = PyTuple_GET_ITEM(tuple, i);
-        #if PY_MAJOR_VERSION < 3
-        if (likely(exc_type == t)) return 1;
-        #endif
-        if (likely(PyExceptionClass_Check(t))) {
-            if (__Pyx_inner_PyErr_GivenExceptionMatches2(exc_type, NULL, t)) return 1;
-        } else {
-        }
-    }
-    return 0;
-}
-static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject* exc_type) {
-    if (likely(err == exc_type)) return 1;
-    if (likely(PyExceptionClass_Check(err))) {
-        if (likely(PyExceptionClass_Check(exc_type))) {
-            return __Pyx_inner_PyErr_GivenExceptionMatches2(err, NULL, exc_type);
-        } else if (likely(PyTuple_Check(exc_type))) {
-            return __Pyx_PyErr_GivenExceptionMatchesTuple(err, exc_type);
-        } else {
-        }
-    }
-    return PyErr_GivenExceptionMatches(err, exc_type);
-}
-static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *exc_type1, PyObject *exc_type2) {
-    if (likely(err == exc_type1 || err == exc_type2)) return 1;
-    if (likely(PyExceptionClass_Check(err))) {
-        return __Pyx_inner_PyErr_GivenExceptionMatches2(err, exc_type1, exc_type2);
-    }
-    return (PyErr_GivenExceptionMatches(err, exc_type1) || PyErr_GivenExceptionMatches(err, exc_type2));
-}
-#endif
-
-/* Import */
-static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level) {
-    PyObject *module = 0;
-    PyObject *empty_dict = 0;
-    PyObject *empty_list = 0;
-    #if PY_MAJOR_VERSION < 3
-    PyObject *py_import;
-    py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import);
-    if (unlikely(!py_import))
-        goto bad;
-    if (!from_list) {
-        empty_list = PyList_New(0);
-        if (unlikely(!empty_list))
-            goto bad;
-        from_list = empty_list;
-    }
-    #endif
-    empty_dict = PyDict_New();
-    if (unlikely(!empty_dict))
-        goto bad;
-    {
-        #if PY_MAJOR_VERSION >= 3
- if (level == -1) {
- if ((1) && (strchr(__Pyx_MODULE_NAME, '.'))) {
- #if CYTHON_COMPILING_IN_LIMITED_API
- module = PyImport_ImportModuleLevelObject(
- name, empty_dict, empty_dict, from_list, 1);
- #else
- module = PyImport_ImportModuleLevelObject(
- name, __pyx_d, empty_dict, from_list, 1);
- #endif
- if (unlikely(!module)) {
- if (unlikely(!PyErr_ExceptionMatches(PyExc_ImportError)))
- goto bad;
- PyErr_Clear();
- }
- }
- level = 0;
- }
- #endif
- if (!module) {
- #if PY_MAJOR_VERSION < 3
- PyObject *py_level = PyInt_FromLong(level);
- if (unlikely(!py_level))
- goto bad;
- module = PyObject_CallFunctionObjArgs(py_import,
- name, __pyx_d, empty_dict, from_list, py_level, (PyObject *)NULL);
- Py_DECREF(py_level);
- #else
- #if CYTHON_COMPILING_IN_LIMITED_API
- module = PyImport_ImportModuleLevelObject(
- name, empty_dict, empty_dict, from_list, level);
- #else
- module = PyImport_ImportModuleLevelObject(
- name, __pyx_d, empty_dict, from_list, level);
- #endif
- #endif
- }
- }
-bad:
- Py_XDECREF(empty_dict);
- Py_XDECREF(empty_list);
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(py_import);
- #endif
- return module;
-}
-
-/* ImportFrom */
-static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name) {
- PyObject* value = __Pyx_PyObject_GetAttrStr(module, name);
- if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) {
- const char* module_name_str = 0;
- PyObject* module_name = 0;
- PyObject* module_dot = 0;
- PyObject* full_name = 0;
- PyErr_Clear();
- module_name_str = PyModule_GetName(module);
- if (unlikely(!module_name_str)) { goto modbad; }
- module_name = PyUnicode_FromString(module_name_str);
- if (unlikely(!module_name)) { goto modbad; }
- module_dot = PyUnicode_Concat(module_name, __pyx_kp_u__3);
- if (unlikely(!module_dot)) { goto modbad; }
- full_name = PyUnicode_Concat(module_dot, name);
- if (unlikely(!full_name)) { goto modbad; }
- #if PY_VERSION_HEX < 0x030700A1 || (CYTHON_COMPILING_IN_PYPY && PYPY_VERSION_NUM < 0x07030400)
- {
- PyObject *modules = PyImport_GetModuleDict();
- if (unlikely(!modules))
- goto modbad;
- value = PyObject_GetItem(modules, full_name);
- }
- #else
- value = PyImport_GetModule(full_name);
- #endif
- modbad:
- Py_XDECREF(full_name);
- Py_XDECREF(module_dot);
- Py_XDECREF(module_name);
- }
- if (unlikely(!value)) {
- PyErr_Format(PyExc_ImportError,
- #if PY_MAJOR_VERSION < 3
- "cannot import name %.230s", PyString_AS_STRING(name));
- #else
- "cannot import name %S", name);
- #endif
- }
- return value;
-}
-
-/* FetchSharedCythonModule */
-static PyObject *__Pyx_FetchSharedCythonABIModule(void) {
- PyObject *abi_module = PyImport_AddModule((char*) __PYX_ABI_MODULE_NAME);
- if (unlikely(!abi_module)) return NULL;
- Py_INCREF(abi_module);
- return abi_module;
-}
-
-/* FetchCommonType */
-static int __Pyx_VerifyCachedType(PyObject *cached_type,
- const char *name,
- Py_ssize_t basicsize,
- Py_ssize_t expected_basicsize) {
- if (!PyType_Check(cached_type)) {
- PyErr_Format(PyExc_TypeError,
- "Shared Cython type %.200s is not a type object", name);
- return -1;
- }
- if (basicsize != expected_basicsize) {
- PyErr_Format(PyExc_TypeError,
- "Shared Cython type %.200s has the wrong size, try recompiling",
- name);
- return -1;
- }
- return 0;
-}
-#if !CYTHON_USE_TYPE_SPECS
-static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type) {
- PyObject* abi_module;
- const char* object_name;
- PyTypeObject *cached_type = NULL;
- abi_module = __Pyx_FetchSharedCythonABIModule();
- if (!abi_module) return NULL;
- object_name = strrchr(type->tp_name, '.');
- object_name = object_name ? object_name+1 : type->tp_name;
- cached_type = (PyTypeObject*) PyObject_GetAttrString(abi_module, object_name);
- if (cached_type) {
- if (__Pyx_VerifyCachedType(
- (PyObject *)cached_type,
- object_name,
- cached_type->tp_basicsize,
- type->tp_basicsize) < 0) {
- goto bad;
- }
- goto done;
- }
- if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad;
- PyErr_Clear();
- if (PyType_Ready(type) < 0) goto bad;
- if (PyObject_SetAttrString(abi_module, object_name, (PyObject *)type) < 0)
- goto bad;
- Py_INCREF(type);
- cached_type = type;
-done:
- Py_DECREF(abi_module);
- return cached_type;
-bad:
- Py_XDECREF(cached_type);
- cached_type = NULL;
- goto done;
-}
-#else
-static PyTypeObject *__Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) {
- PyObject *abi_module, *cached_type = NULL;
- const char* object_name = strrchr(spec->name, '.');
- object_name = object_name ? object_name+1 : spec->name;
- abi_module = __Pyx_FetchSharedCythonABIModule();
- if (!abi_module) return NULL;
- cached_type = PyObject_GetAttrString(abi_module, object_name);
- if (cached_type) {
- Py_ssize_t basicsize;
-#if CYTHON_COMPILING_IN_LIMITED_API
- PyObject *py_basicsize;
- py_basicsize = PyObject_GetAttrString(cached_type, "__basicsize__");
- if (unlikely(!py_basicsize)) goto bad;
- basicsize = PyLong_AsSsize_t(py_basicsize);
- Py_DECREF(py_basicsize);
- py_basicsize = 0;
- if (unlikely(basicsize == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad;
-#else
- basicsize = likely(PyType_Check(cached_type)) ? ((PyTypeObject*) cached_type)->tp_basicsize : -1;
-#endif
- if (__Pyx_VerifyCachedType(
- cached_type,
- object_name,
- basicsize,
- spec->basicsize) < 0) {
- goto bad;
- }
- goto done;
- }
- if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad;
- PyErr_Clear();
- CYTHON_UNUSED_VAR(module);
- cached_type = __Pyx_PyType_FromModuleAndSpec(abi_module, spec, bases);
- if (unlikely(!cached_type)) goto bad;
- if (unlikely(__Pyx_fix_up_extension_type_from_spec(spec, (PyTypeObject *) cached_type) < 0)) goto bad;
- if (PyObject_SetAttrString(abi_module, object_name, cached_type) < 0) goto bad;
-done:
- Py_DECREF(abi_module);
- assert(cached_type == NULL || PyType_Check(cached_type));
- return (PyTypeObject *) cached_type;
-bad:
- Py_XDECREF(cached_type);
- cached_type = NULL;
- goto done;
-}
-#endif
-
-/* PyVectorcallFastCallDict */
-#if CYTHON_METH_FASTCALL
-static PyObject *__Pyx_PyVectorcall_FastCallDict_kw(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw)
-{
- PyObject *res = NULL;
- PyObject *kwnames;
- PyObject **newargs;
- PyObject **kwvalues;
- Py_ssize_t i, pos;
- size_t j;
- PyObject *key, *value;
- unsigned long keys_are_strings;
- Py_ssize_t nkw = PyDict_GET_SIZE(kw);
- newargs = (PyObject **)PyMem_Malloc((nargs + (size_t)nkw) * sizeof(args[0]));
- if (unlikely(newargs == NULL)) {
- PyErr_NoMemory();
- return NULL;
- }
- for (j = 0; j < nargs; j++) newargs[j] = args[j];
- kwnames = PyTuple_New(nkw);
- if (unlikely(kwnames == NULL)) {
- PyMem_Free(newargs);
- return NULL;
- }
- kwvalues = newargs + nargs;
- pos = i = 0;
- keys_are_strings = Py_TPFLAGS_UNICODE_SUBCLASS;
- while (PyDict_Next(kw, &pos, &key, &value)) {
- keys_are_strings &= Py_TYPE(key)->tp_flags;
- Py_INCREF(key);
- Py_INCREF(value);
- PyTuple_SET_ITEM(kwnames, i, key);
- kwvalues[i] = value;
- i++;
- }
- if (unlikely(!keys_are_strings)) {
- PyErr_SetString(PyExc_TypeError, "keywords must be strings");
- goto cleanup;
- }
- res = vc(func, newargs, nargs, kwnames);
-cleanup:
- Py_DECREF(kwnames);
- for (i = 0; i < nkw; i++)
- Py_DECREF(kwvalues[i]);
- PyMem_Free(newargs);
- return res;
-}
-static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw)
-{
- if (likely(kw == NULL) || PyDict_GET_SIZE(kw) == 0) {
- return vc(func, args, nargs, NULL);
- }
- return __Pyx_PyVectorcall_FastCallDict_kw(func, vc, args, nargs, kw);
-}
-#endif
-
-/* CythonFunctionShared */
-static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj) {
-#if PY_VERSION_HEX < 0x030900B1
- __Pyx_Py_XDECREF_SET(
- __Pyx_CyFunction_GetClassObj(f),
- ((classobj) ? __Pyx_NewRef(classobj) : NULL));
-#else
- __Pyx_Py_XDECREF_SET(
- ((PyCMethodObject *) (f))->mm_class,
- (PyTypeObject*)((classobj) ? __Pyx_NewRef(classobj) : NULL));
-#endif
-}
-static PyObject *
-__Pyx_CyFunction_get_doc(__pyx_CyFunctionObject *op, void *closure)
-{
- CYTHON_UNUSED_VAR(closure);
- if (unlikely(op->func_doc == NULL)) {
- if (((PyCFunctionObject*)op)->m_ml->ml_doc) {
-#if PY_MAJOR_VERSION >= 3
- op->func_doc = PyUnicode_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc);
-#else
- op->func_doc = PyString_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc);
-#endif
- if (unlikely(op->func_doc == NULL))
- return NULL;
- } else {
- Py_INCREF(Py_None);
- return Py_None;
- }
- }
- Py_INCREF(op->func_doc);
- return op->func_doc;
-}
-static int
-__Pyx_CyFunction_set_doc(__pyx_CyFunctionObject *op, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- if (value == NULL) {
- value = Py_None;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->func_doc, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_name(__pyx_CyFunctionObject *op, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- if (unlikely(op->func_name == NULL)) {
-#if PY_MAJOR_VERSION >= 3
- op->func_name = PyUnicode_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name);
-#else
- op->func_name = PyString_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name);
-#endif
- if (unlikely(op->func_name == NULL))
- return NULL;
- }
- Py_INCREF(op->func_name);
- return op->func_name;
-}
-static int
-__Pyx_CyFunction_set_name(__pyx_CyFunctionObject *op, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
-#if PY_MAJOR_VERSION >= 3
- if (unlikely(value == NULL || !PyUnicode_Check(value)))
-#else
- if (unlikely(value == NULL || !PyString_Check(value)))
-#endif
- {
- PyErr_SetString(PyExc_TypeError,
- "__name__ must be set to a string object");
- return -1;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->func_name, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_qualname(__pyx_CyFunctionObject *op, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- Py_INCREF(op->func_qualname);
- return op->func_qualname;
-}
-static int
-__Pyx_CyFunction_set_qualname(__pyx_CyFunctionObject *op, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
-#if PY_MAJOR_VERSION >= 3
- if (unlikely(value == NULL || !PyUnicode_Check(value)))
-#else
- if (unlikely(value == NULL || !PyString_Check(value)))
-#endif
- {
- PyErr_SetString(PyExc_TypeError,
- "__qualname__ must be set to a string object");
- return -1;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->func_qualname, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_dict(__pyx_CyFunctionObject *op, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- if (unlikely(op->func_dict == NULL)) {
- op->func_dict = PyDict_New();
- if (unlikely(op->func_dict == NULL))
- return NULL;
- }
- Py_INCREF(op->func_dict);
- return op->func_dict;
-}
-static int
-__Pyx_CyFunction_set_dict(__pyx_CyFunctionObject *op, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- if (unlikely(value == NULL)) {
- PyErr_SetString(PyExc_TypeError,
- "function's dictionary may not be deleted");
- return -1;
- }
- if (unlikely(!PyDict_Check(value))) {
- PyErr_SetString(PyExc_TypeError,
- "setting function's dictionary to a non-dict");
- return -1;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->func_dict, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_globals(__pyx_CyFunctionObject *op, void *context)
-{
- CYTHON_UNUSED_VAR(context);
- Py_INCREF(op->func_globals);
- return op->func_globals;
-}
-static PyObject *
-__Pyx_CyFunction_get_closure(__pyx_CyFunctionObject *op, void *context)
-{
- CYTHON_UNUSED_VAR(op);
- CYTHON_UNUSED_VAR(context);
- Py_INCREF(Py_None);
- return Py_None;
-}
-static PyObject *
-__Pyx_CyFunction_get_code(__pyx_CyFunctionObject *op, void *context)
-{
- PyObject* result = (op->func_code) ? op->func_code : Py_None;
- CYTHON_UNUSED_VAR(context);
- Py_INCREF(result);
- return result;
-}
-static int
-__Pyx_CyFunction_init_defaults(__pyx_CyFunctionObject *op) {
- int result = 0;
- PyObject *res = op->defaults_getter((PyObject *) op);
- if (unlikely(!res))
- return -1;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- op->defaults_tuple = PyTuple_GET_ITEM(res, 0);
- Py_INCREF(op->defaults_tuple);
- op->defaults_kwdict = PyTuple_GET_ITEM(res, 1);
- Py_INCREF(op->defaults_kwdict);
- #else
- op->defaults_tuple = PySequence_ITEM(res, 0);
- if (unlikely(!op->defaults_tuple)) result = -1;
- else {
- op->defaults_kwdict = PySequence_ITEM(res, 1);
- if (unlikely(!op->defaults_kwdict)) result = -1;
- }
- #endif
- Py_DECREF(res);
- return result;
-}
-static int
-__Pyx_CyFunction_set_defaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) {
- CYTHON_UNUSED_VAR(context);
- if (!value) {
- value = Py_None;
- } else if (unlikely(value != Py_None && !PyTuple_Check(value))) {
- PyErr_SetString(PyExc_TypeError,
- "__defaults__ must be set to a tuple object");
- return -1;
- }
- PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__defaults__ will not "
- "currently affect the values used in function calls", 1);
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->defaults_tuple, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_defaults(__pyx_CyFunctionObject *op, void *context) {
- PyObject* result = op->defaults_tuple;
- CYTHON_UNUSED_VAR(context);
- if (unlikely(!result)) {
- if (op->defaults_getter) {
- if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL;
- result = op->defaults_tuple;
- } else {
- result = Py_None;
- }
- }
- Py_INCREF(result);
- return result;
-}
-static int
-__Pyx_CyFunction_set_kwdefaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) {
- CYTHON_UNUSED_VAR(context);
- if (!value) {
- value = Py_None;
- } else if (unlikely(value != Py_None && !PyDict_Check(value))) {
- PyErr_SetString(PyExc_TypeError,
- "__kwdefaults__ must be set to a dict object");
- return -1;
- }
- PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__kwdefaults__ will not "
- "currently affect the values used in function calls", 1);
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(op->defaults_kwdict, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_kwdefaults(__pyx_CyFunctionObject *op, void *context) {
- PyObject* result = op->defaults_kwdict;
- CYTHON_UNUSED_VAR(context);
- if (unlikely(!result)) {
- if (op->defaults_getter) {
- if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL;
- result = op->defaults_kwdict;
- } else {
- result = Py_None;
- }
- }
- Py_INCREF(result);
- return result;
-}
-static int
-__Pyx_CyFunction_set_annotations(__pyx_CyFunctionObject *op, PyObject* value, void *context) {
- CYTHON_UNUSED_VAR(context);
- if (!value || value == Py_None) {
- value = NULL;
- } else if (unlikely(!PyDict_Check(value))) {
- PyErr_SetString(PyExc_TypeError,
- "__annotations__ must be set to a dict object");
- return -1;
- }
- Py_XINCREF(value);
- __Pyx_Py_XDECREF_SET(op->func_annotations, value);
- return 0;
-}
-static PyObject *
-__Pyx_CyFunction_get_annotations(__pyx_CyFunctionObject *op, void *context) {
- PyObject* result = op->func_annotations;
- CYTHON_UNUSED_VAR(context);
- if (unlikely(!result)) {
- result = PyDict_New();
- if (unlikely(!result)) return NULL;
- op->func_annotations = result;
- }
- Py_INCREF(result);
- return result;
-}
-static PyObject *
-__Pyx_CyFunction_get_is_coroutine(__pyx_CyFunctionObject *op, void *context) {
- int is_coroutine;
- CYTHON_UNUSED_VAR(context);
- if (op->func_is_coroutine) {
- return __Pyx_NewRef(op->func_is_coroutine);
- }
- is_coroutine = op->flags & __Pyx_CYFUNCTION_COROUTINE;
-#if PY_VERSION_HEX >= 0x03050000
- if (is_coroutine) {
- PyObject *module, *fromlist, *marker = __pyx_n_s_is_coroutine;
- fromlist = PyList_New(1);
- if (unlikely(!fromlist)) return NULL;
- Py_INCREF(marker);
- PyList_SET_ITEM(fromlist, 0, marker);
- module = PyImport_ImportModuleLevelObject(__pyx_n_s_asyncio_coroutines, NULL, NULL, fromlist, 0);
- Py_DECREF(fromlist);
- if (unlikely(!module)) goto ignore;
- op->func_is_coroutine = __Pyx_PyObject_GetAttrStr(module, marker);
- Py_DECREF(module);
- if (likely(op->func_is_coroutine)) {
- return __Pyx_NewRef(op->func_is_coroutine);
- }
-ignore:
- PyErr_Clear();
- }
-#endif
- op->func_is_coroutine = __Pyx_PyBool_FromLong(is_coroutine);
- return __Pyx_NewRef(op->func_is_coroutine);
-}
-static PyGetSetDef __pyx_CyFunction_getsets[] = {
- {(char *) "func_doc", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0},
- {(char *) "__doc__", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0},
- {(char *) "func_name", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0},
- {(char *) "__name__", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0},
- {(char *) "__qualname__", (getter)__Pyx_CyFunction_get_qualname, (setter)__Pyx_CyFunction_set_qualname, 0, 0},
- {(char *) "func_dict", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0},
- {(char *) "__dict__", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0},
- {(char *) "func_globals", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0},
- {(char *) "__globals__", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0},
- {(char *) "func_closure", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0},
- {(char *) "__closure__", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0},
- {(char *) "func_code", (getter)__Pyx_CyFunction_get_code, 0, 0, 0},
- {(char *) "__code__", (getter)__Pyx_CyFunction_get_code, 0, 0, 0},
- {(char *) "func_defaults", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0},
- {(char *) "__defaults__", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0},
- {(char *) "__kwdefaults__", (getter)__Pyx_CyFunction_get_kwdefaults, (setter)__Pyx_CyFunction_set_kwdefaults, 0, 0},
- {(char *) "__annotations__", (getter)__Pyx_CyFunction_get_annotations, (setter)__Pyx_CyFunction_set_annotations, 0, 0},
- {(char *) "_is_coroutine", (getter)__Pyx_CyFunction_get_is_coroutine, 0, 0, 0},
- {0, 0, 0, 0, 0}
-};
-static PyMemberDef __pyx_CyFunction_members[] = {
- {(char *) "__module__", T_OBJECT, offsetof(PyCFunctionObject, m_module), 0, 0},
-#if CYTHON_USE_TYPE_SPECS
- {(char *) "__dictoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_dict), READONLY, 0},
-#if CYTHON_METH_FASTCALL
-#if CYTHON_BACKPORT_VECTORCALL
- {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_vectorcall), READONLY, 0},
-#else
- {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(PyCFunctionObject, vectorcall), READONLY, 0},
-#endif
-#endif
-#if PY_VERSION_HEX < 0x030500A0
- {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_weakreflist), READONLY, 0},
-#else
- {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(PyCFunctionObject, m_weakreflist), READONLY, 0},
-#endif
-#endif
- {0, 0, 0, 0, 0}
-};
-static PyObject *
-__Pyx_CyFunction_reduce(__pyx_CyFunctionObject *m, PyObject *args)
-{
- CYTHON_UNUSED_VAR(args);
-#if PY_MAJOR_VERSION >= 3
- Py_INCREF(m->func_qualname);
- return m->func_qualname;
-#else
- return PyString_FromString(((PyCFunctionObject*)m)->m_ml->ml_name);
-#endif
-}
-static PyMethodDef __pyx_CyFunction_methods[] = {
- {"__reduce__", (PyCFunction)__Pyx_CyFunction_reduce, METH_VARARGS, 0},
- {0, 0, 0, 0}
-};
-#if PY_VERSION_HEX < 0x030500A0
-#define __Pyx_CyFunction_weakreflist(cyfunc) ((cyfunc)->func_weakreflist)
-#else
-#define __Pyx_CyFunction_weakreflist(cyfunc) (((PyCFunctionObject*)cyfunc)->m_weakreflist)
-#endif
-static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject *op, PyMethodDef *ml, int flags, PyObject* qualname,
- PyObject *closure, PyObject *module, PyObject* globals, PyObject* code) {
- PyCFunctionObject *cf = (PyCFunctionObject*) op;
- if (unlikely(op == NULL))
- return NULL;
- op->flags = flags;
- __Pyx_CyFunction_weakreflist(op) = NULL;
- cf->m_ml = ml;
- cf->m_self = (PyObject *) op;
- Py_XINCREF(closure);
- op->func_closure = closure;
- Py_XINCREF(module);
- cf->m_module = module;
- op->func_dict = NULL;
- op->func_name = NULL;
- Py_INCREF(qualname);
- op->func_qualname = qualname;
- op->func_doc = NULL;
-#if PY_VERSION_HEX < 0x030900B1
- op->func_classobj = NULL;
-#else
- ((PyCMethodObject*)op)->mm_class = NULL;
-#endif
- op->func_globals = globals;
- Py_INCREF(op->func_globals);
- Py_XINCREF(code);
- op->func_code = code;
- op->defaults_pyobjects = 0;
- op->defaults_size = 0;
- op->defaults = NULL;
- op->defaults_tuple = NULL;
- op->defaults_kwdict = NULL;
- op->defaults_getter = NULL;
- op->func_annotations = NULL;
- op->func_is_coroutine = NULL;
-#if CYTHON_METH_FASTCALL
- switch (ml->ml_flags & (METH_VARARGS | METH_FASTCALL | METH_NOARGS | METH_O | METH_KEYWORDS | METH_METHOD)) {
- case METH_NOARGS:
- __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_NOARGS;
- break;
- case METH_O:
- __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_O;
- break;
- case METH_METHOD | METH_FASTCALL | METH_KEYWORDS:
- __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD;
- break;
- case METH_FASTCALL | METH_KEYWORDS:
- __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS;
- break;
- case METH_VARARGS | METH_KEYWORDS:
- __Pyx_CyFunction_func_vectorcall(op) = NULL;
- break;
- default:
- PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction");
- Py_DECREF(op);
- return NULL;
- }
-#endif
- return (PyObject *) op;
-}
-static int
-__Pyx_CyFunction_clear(__pyx_CyFunctionObject *m)
-{
- Py_CLEAR(m->func_closure);
- Py_CLEAR(((PyCFunctionObject*)m)->m_module);
- Py_CLEAR(m->func_dict);
- Py_CLEAR(m->func_name);
- Py_CLEAR(m->func_qualname);
- Py_CLEAR(m->func_doc);
- Py_CLEAR(m->func_globals);
- Py_CLEAR(m->func_code);
-#if PY_VERSION_HEX < 0x030900B1
- Py_CLEAR(__Pyx_CyFunction_GetClassObj(m));
-#else
- {
- PyObject *cls = (PyObject*) ((PyCMethodObject *) (m))->mm_class;
- ((PyCMethodObject *) (m))->mm_class = NULL;
- Py_XDECREF(cls);
- }
-#endif
- Py_CLEAR(m->defaults_tuple);
- Py_CLEAR(m->defaults_kwdict);
- Py_CLEAR(m->func_annotations);
- Py_CLEAR(m->func_is_coroutine);
- if (m->defaults) {
- PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m);
- int i;
- for (i = 0; i < m->defaults_pyobjects; i++)
- Py_XDECREF(pydefaults[i]);
- PyObject_Free(m->defaults);
- m->defaults = NULL;
- }
- return 0;
-}
-static void __Pyx__CyFunction_dealloc(__pyx_CyFunctionObject *m)
-{
- if (__Pyx_CyFunction_weakreflist(m) != NULL)
- PyObject_ClearWeakRefs((PyObject *) m);
- __Pyx_CyFunction_clear(m);
- __Pyx_PyHeapTypeObject_GC_Del(m);
-}
-static void __Pyx_CyFunction_dealloc(__pyx_CyFunctionObject *m)
-{
- PyObject_GC_UnTrack(m);
- __Pyx__CyFunction_dealloc(m);
-}
-static int __Pyx_CyFunction_traverse(__pyx_CyFunctionObject *m, visitproc visit, void *arg)
-{
- Py_VISIT(m->func_closure);
- Py_VISIT(((PyCFunctionObject*)m)->m_module);
- Py_VISIT(m->func_dict);
- Py_VISIT(m->func_name);
- Py_VISIT(m->func_qualname);
- Py_VISIT(m->func_doc);
- Py_VISIT(m->func_globals);
- Py_VISIT(m->func_code);
- Py_VISIT(__Pyx_CyFunction_GetClassObj(m));
- Py_VISIT(m->defaults_tuple);
- Py_VISIT(m->defaults_kwdict);
- Py_VISIT(m->func_is_coroutine);
- if (m->defaults) {
- PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m);
- int i;
- for (i = 0; i < m->defaults_pyobjects; i++)
- Py_VISIT(pydefaults[i]);
- }
- return 0;
-}
-static PyObject*
-__Pyx_CyFunction_repr(__pyx_CyFunctionObject *op)
-{
-#if PY_MAJOR_VERSION >= 3
- return PyUnicode_FromFormat("<cyfunction %U at %p>",
- op->func_qualname, (void *)op);
-#else
- return PyString_FromFormat("<cyfunction %s at %p>",
- PyString_AsString(op->func_qualname), (void *)op);
-#endif
-}
-static PyObject * __Pyx_CyFunction_CallMethod(PyObject *func, PyObject *self, PyObject *arg, PyObject *kw) {
- PyCFunctionObject* f = (PyCFunctionObject*)func;
- PyCFunction meth = f->m_ml->ml_meth;
- Py_ssize_t size;
- switch (f->m_ml->ml_flags & (METH_VARARGS | METH_KEYWORDS | METH_NOARGS | METH_O)) {
- case METH_VARARGS:
- if (likely(kw == NULL || PyDict_Size(kw) == 0))
- return (*meth)(self, arg);
- break;
- case METH_VARARGS | METH_KEYWORDS:
- return (*(PyCFunctionWithKeywords)(void*)meth)(self, arg, kw);
- case METH_NOARGS:
- if (likely(kw == NULL || PyDict_Size(kw) == 0)) {
- size = PyTuple_GET_SIZE(arg);
- if (likely(size == 0))
- return (*meth)(self, NULL);
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)",
- f->m_ml->ml_name, size);
- return NULL;
- }
- break;
- case METH_O:
- if (likely(kw == NULL || PyDict_Size(kw) == 0)) {
- size = PyTuple_GET_SIZE(arg);
- if (likely(size == 1)) {
- PyObject *result, *arg0;
- #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- arg0 = PyTuple_GET_ITEM(arg, 0);
- #else
- arg0 = PySequence_ITEM(arg, 0); if (unlikely(!arg0)) return NULL;
- #endif
- result = (*meth)(self, arg0);
- #if !(CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS)
- Py_DECREF(arg0);
- #endif
- return result;
- }
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)",
- f->m_ml->ml_name, size);
- return NULL;
- }
- break;
- default:
- PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction");
- return NULL;
- }
- PyErr_Format(PyExc_TypeError, "%.200s() takes no keyword arguments",
- f->m_ml->ml_name);
- return NULL;
-}
-static CYTHON_INLINE PyObject *__Pyx_CyFunction_Call(PyObject *func, PyObject *arg, PyObject *kw) {
- return __Pyx_CyFunction_CallMethod(func, ((PyCFunctionObject*)func)->m_self, arg, kw);
-}
-static PyObject *__Pyx_CyFunction_CallAsMethod(PyObject *func, PyObject *args, PyObject *kw) {
- PyObject *result;
- __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *) func;
-#if CYTHON_METH_FASTCALL
- __pyx_vectorcallfunc vc = __Pyx_CyFunction_func_vectorcall(cyfunc);
- if (vc) {
-#if CYTHON_ASSUME_SAFE_MACROS
- return __Pyx_PyVectorcall_FastCallDict(func, vc, &PyTuple_GET_ITEM(args, 0), (size_t)PyTuple_GET_SIZE(args), kw);
-#else
- (void) &__Pyx_PyVectorcall_FastCallDict;
- return PyVectorcall_Call(func, args, kw);
-#endif
- }
-#endif
- if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) {
- Py_ssize_t argc;
- PyObject *new_args;
- PyObject *self;
- argc = PyTuple_GET_SIZE(args);
- new_args = PyTuple_GetSlice(args, 1, argc);
- if (unlikely(!new_args))
- return NULL;
- self = PyTuple_GetItem(args, 0);
- if (unlikely(!self)) {
- Py_DECREF(new_args);
-#if PY_MAJOR_VERSION > 2
- PyErr_Format(PyExc_TypeError,
- "unbound method %.200S() needs an argument",
- cyfunc->func_qualname);
-#else
- PyErr_SetString(PyExc_TypeError,
- "unbound method needs an argument");
-#endif
- return NULL;
- }
- result = __Pyx_CyFunction_CallMethod(func, self, new_args, kw);
- Py_DECREF(new_args);
- } else {
- result = __Pyx_CyFunction_Call(func, args, kw);
- }
- return result;
-}
-#if CYTHON_METH_FASTCALL
-static CYTHON_INLINE int __Pyx_CyFunction_Vectorcall_CheckArgs(__pyx_CyFunctionObject *cyfunc, Py_ssize_t nargs, PyObject *kwnames)
-{
- int ret = 0;
- if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) {
- if (unlikely(nargs < 1)) {
- PyErr_Format(PyExc_TypeError, "%.200s() needs an argument",
- ((PyCFunctionObject*)cyfunc)->m_ml->ml_name);
- return -1;
- }
- ret = 1;
- }
- if (unlikely(kwnames) && unlikely(PyTuple_GET_SIZE(kwnames))) {
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes no keyword arguments", ((PyCFunctionObject*)cyfunc)->m_ml->ml_name);
- return -1;
- }
- return ret;
-}
-static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames)
-{
- __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func;
- PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml;
-#if CYTHON_BACKPORT_VECTORCALL
- Py_ssize_t nargs = (Py_ssize_t)nargsf;
-#else
- Py_ssize_t nargs = PyVectorcall_NARGS(nargsf);
-#endif
- PyObject *self;
- switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) {
- case 1:
- self = args[0];
- args += 1;
- nargs -= 1;
- break;
- case 0:
- self = ((PyCFunctionObject*)cyfunc)->m_self;
- break;
- default:
- return NULL;
- }
- if (unlikely(nargs != 0)) {
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)",
- def->ml_name, nargs);
- return NULL;
- }
- return def->ml_meth(self, NULL);
-}
-static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames)
-{
- __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func;
- PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml;
-#if CYTHON_BACKPORT_VECTORCALL
- Py_ssize_t nargs = (Py_ssize_t)nargsf;
-#else
- Py_ssize_t nargs = PyVectorcall_NARGS(nargsf);
-#endif
- PyObject *self;
- switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) {
- case 1:
- self = args[0];
- args += 1;
- nargs -= 1;
- break;
- case 0:
- self = ((PyCFunctionObject*)cyfunc)->m_self;
- break;
- default:
- return NULL;
- }
- if (unlikely(nargs != 1)) {
- PyErr_Format(PyExc_TypeError,
- "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)",
- def->ml_name, nargs);
- return NULL;
- }
- return def->ml_meth(self, args[0]);
-}
-static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames)
-{
- __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func;
- PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml;
-#if CYTHON_BACKPORT_VECTORCALL
- Py_ssize_t nargs = (Py_ssize_t)nargsf;
-#else
- Py_ssize_t nargs = PyVectorcall_NARGS(nargsf);
-#endif
- PyObject *self;
- switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) {
- case 1:
- self = args[0];
- args += 1;
- nargs -= 1;
- break;
- case 0:
- self = ((PyCFunctionObject*)cyfunc)->m_self;
- break;
- default:
- return NULL;
- }
- return ((_PyCFunctionFastWithKeywords)(void(*)(void))def->ml_meth)(self, args, nargs, kwnames);
-}
-static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames)
-{
- __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func;
- PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml;
- PyTypeObject *cls = (PyTypeObject *) __Pyx_CyFunction_GetClassObj(cyfunc);
-#if CYTHON_BACKPORT_VECTORCALL
- Py_ssize_t nargs = (Py_ssize_t)nargsf;
-#else
- Py_ssize_t nargs = PyVectorcall_NARGS(nargsf);
-#endif
- PyObject *self;
- switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) {
- case 1:
- self = args[0];
- args += 1;
- nargs -= 1;
- break;
- case 0:
- self = ((PyCFunctionObject*)cyfunc)->m_self;
- break;
- default:
- return NULL;
- }
- return ((__Pyx_PyCMethod)(void(*)(void))def->ml_meth)(self, cls, args, (size_t)nargs, kwnames);
-}
-#endif
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_CyFunctionType_slots[] = {
- {Py_tp_dealloc, (void *)__Pyx_CyFunction_dealloc},
- {Py_tp_repr, (void *)__Pyx_CyFunction_repr},
- {Py_tp_call, (void *)__Pyx_CyFunction_CallAsMethod},
- {Py_tp_traverse, (void *)__Pyx_CyFunction_traverse},
- {Py_tp_clear, (void *)__Pyx_CyFunction_clear},
- {Py_tp_methods, (void *)__pyx_CyFunction_methods},
- {Py_tp_members, (void *)__pyx_CyFunction_members},
- {Py_tp_getset, (void *)__pyx_CyFunction_getsets},
- {Py_tp_descr_get, (void *)__Pyx_PyMethod_New},
- {0, 0},
-};
-static PyType_Spec __pyx_CyFunctionType_spec = {
- __PYX_TYPE_MODULE_PREFIX "cython_function_or_method",
- sizeof(__pyx_CyFunctionObject),
- 0,
-#ifdef Py_TPFLAGS_METHOD_DESCRIPTOR
- Py_TPFLAGS_METHOD_DESCRIPTOR |
-#endif
-#if (defined(_Py_TPFLAGS_HAVE_VECTORCALL) && CYTHON_METH_FASTCALL)
- _Py_TPFLAGS_HAVE_VECTORCALL |
-#endif
- Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE,
- __pyx_CyFunctionType_slots
-};
-#else
-static PyTypeObject __pyx_CyFunctionType_type = {
- PyVarObject_HEAD_INIT(0, 0)
- __PYX_TYPE_MODULE_PREFIX "cython_function_or_method",
- sizeof(__pyx_CyFunctionObject),
- 0,
- (destructor) __Pyx_CyFunction_dealloc,
-#if !CYTHON_METH_FASTCALL
- 0,
-#elif CYTHON_BACKPORT_VECTORCALL
- (printfunc)offsetof(__pyx_CyFunctionObject, func_vectorcall),
-#else
- offsetof(PyCFunctionObject, vectorcall),
-#endif
- 0,
- 0,
-#if PY_MAJOR_VERSION < 3
- 0,
-#else
- 0,
-#endif
- (reprfunc) __Pyx_CyFunction_repr,
- 0,
- 0,
- 0,
- 0,
- __Pyx_CyFunction_CallAsMethod,
- 0,
- 0,
- 0,
- 0,
-#ifdef Py_TPFLAGS_METHOD_DESCRIPTOR
- Py_TPFLAGS_METHOD_DESCRIPTOR |
-#endif
-#ifdef _Py_TPFLAGS_HAVE_VECTORCALL
- _Py_TPFLAGS_HAVE_VECTORCALL |
-#endif
- Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE,
- 0,
- (traverseproc) __Pyx_CyFunction_traverse,
- (inquiry) __Pyx_CyFunction_clear,
- 0,
-#if PY_VERSION_HEX < 0x030500A0
- offsetof(__pyx_CyFunctionObject, func_weakreflist),
-#else
- offsetof(PyCFunctionObject, m_weakreflist),
-#endif
- 0,
- 0,
- __pyx_CyFunction_methods,
- __pyx_CyFunction_members,
- __pyx_CyFunction_getsets,
- 0,
- 0,
- __Pyx_PyMethod_New,
- 0,
- offsetof(__pyx_CyFunctionObject, func_dict),
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
-#if PY_VERSION_HEX >= 0x030400a1
- 0,
-#endif
-#if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0,
-#endif
-#if __PYX_NEED_TP_PRINT_SLOT
- 0,
-#endif
-#if PY_VERSION_HEX >= 0x030C0000
- 0,
-#endif
-#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0,
-#endif
-};
-#endif
-static int __pyx_CyFunction_init(PyObject *module) {
-#if CYTHON_USE_TYPE_SPECS
- __pyx_CyFunctionType = __Pyx_FetchCommonTypeFromSpec(module, &__pyx_CyFunctionType_spec, NULL);
-#else
- CYTHON_UNUSED_VAR(module);
- __pyx_CyFunctionType = __Pyx_FetchCommonType(&__pyx_CyFunctionType_type);
-#endif
- if (unlikely(__pyx_CyFunctionType == NULL)) {
- return -1;
- }
- return 0;
-}
-static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *func, size_t size, int pyobjects) {
- __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func;
- m->defaults = PyObject_Malloc(size);
- if (unlikely(!m->defaults))
- return PyErr_NoMemory();
- memset(m->defaults, 0, size);
- m->defaults_pyobjects = pyobjects;
- m->defaults_size = size;
- return m->defaults;
-}
-static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *func, PyObject *tuple) {
- __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func;
- m->defaults_tuple = tuple;
- Py_INCREF(tuple);
-}
-static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *func, PyObject *dict) {
- __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func;
- m->defaults_kwdict = dict;
- Py_INCREF(dict);
-}
-static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *func, PyObject *dict) {
- __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func;
- m->func_annotations = dict;
- Py_INCREF(dict);
-}
-
-/* CythonFunction */
-static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml, int flags, PyObject* qualname,
- PyObject *closure, PyObject *module, PyObject* globals, PyObject* code) {
- PyObject *op = __Pyx_CyFunction_Init(
- PyObject_GC_New(__pyx_CyFunctionObject, __pyx_CyFunctionType),
- ml, flags, qualname, closure, module, globals, code
- );
- if (likely(op)) {
- PyObject_GC_Track(op);
- }
- return op;
-}
-
-/* CLineInTraceback */
-#ifndef CYTHON_CLINE_IN_TRACEBACK
-static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) {
- PyObject *use_cline;
- PyObject *ptype, *pvalue, *ptraceback;
-#if CYTHON_COMPILING_IN_CPYTHON
- PyObject **cython_runtime_dict;
-#endif
- CYTHON_MAYBE_UNUSED_VAR(tstate);
- if (unlikely(!__pyx_cython_runtime)) {
- return c_line;
- }
- __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback);
-#if CYTHON_COMPILING_IN_CPYTHON
- cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime);
- if (likely(cython_runtime_dict)) {
- __PYX_PY_DICT_LOOKUP_IF_MODIFIED(
- use_cline, *cython_runtime_dict,
- __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback))
- } else
-#endif
- {
- PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStrNoError(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback);
- if (use_cline_obj) {
- use_cline = PyObject_Not(use_cline_obj) ? Py_False : Py_True;
- Py_DECREF(use_cline_obj);
- } else {
- PyErr_Clear();
- use_cline = NULL;
- }
- }
- if (!use_cline) {
- c_line = 0;
- (void) PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False);
- }
- else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) {
- c_line = 0;
- }
- __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback);
- return c_line;
-}
-#endif
-
-/* CodeObjectCache */
-#if !CYTHON_COMPILING_IN_LIMITED_API
-static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) {
- int start = 0, mid = 0, end = count - 1;
- if (end >= 0 && code_line > entries[end].code_line) {
- return count;
- }
- while (start < end) {
- mid = start + (end - start) / 2;
- if (code_line < entries[mid].code_line) {
- end = mid;
- } else if (code_line > entries[mid].code_line) {
- start = mid + 1;
- } else {
- return mid;
- }
- }
- if (code_line <= entries[mid].code_line) {
- return mid;
- } else {
- return mid + 1;
- }
-}
-static PyCodeObject *__pyx_find_code_object(int code_line) {
- PyCodeObject* code_object;
- int pos;
- if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) {
- return NULL;
- }
- pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line);
- if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) {
- return NULL;
- }
- code_object = __pyx_code_cache.entries[pos].code_object;
- Py_INCREF(code_object);
- return code_object;
-}
-static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) {
- int pos, i;
- __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries;
- if (unlikely(!code_line)) {
- return;
- }
- if (unlikely(!entries)) {
- entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry));
- if (likely(entries)) {
- __pyx_code_cache.entries = entries;
- __pyx_code_cache.max_count = 64;
- __pyx_code_cache.count = 1;
- entries[0].code_line = code_line;
- entries[0].code_object = code_object;
- Py_INCREF(code_object);
- }
- return;
- }
- pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line);
- if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) {
- PyCodeObject* tmp = entries[pos].code_object;
- entries[pos].code_object = code_object;
- Py_DECREF(tmp);
- return;
- }
- if (__pyx_code_cache.count == __pyx_code_cache.max_count) {
- int new_max = __pyx_code_cache.max_count + 64;
- entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc(
- __pyx_code_cache.entries, ((size_t)new_max) * sizeof(__Pyx_CodeObjectCacheEntry));
- if (unlikely(!entries)) {
- return;
- }
- __pyx_code_cache.entries = entries;
- __pyx_code_cache.max_count = new_max;
- }
- for (i=__pyx_code_cache.count; i>pos; i--) {
- entries[i] = entries[i-1];
- }
- entries[pos].code_line = code_line;
- entries[pos].code_object = code_object;
- __pyx_code_cache.count++;
- Py_INCREF(code_object);
-}
-#endif
-
-/* AddTraceback */
-#include "compile.h"
-#include "frameobject.h"
-#include "traceback.h"
-#if PY_VERSION_HEX >= 0x030b00a6
- #ifndef Py_BUILD_CORE
- #define Py_BUILD_CORE 1
- #endif
- #include "internal/pycore_frame.h"
-#endif
-#if CYTHON_COMPILING_IN_LIMITED_API
-static void __Pyx_AddTraceback(const char *funcname, int c_line,
- int py_line, const char *filename) {
- if (c_line) {
- (void) __pyx_cfilenm;
- (void) __Pyx_CLineForTraceback(__Pyx_PyThreadState_Current, c_line);
- }
- _PyTraceback_Add(funcname, filename, py_line);
-}
-#else
-static PyCodeObject* __Pyx_CreateCodeObjectForTraceback(
- const char *funcname, int c_line,
- int py_line, const char *filename) {
- PyCodeObject *py_code = NULL;
- PyObject *py_funcname = NULL;
- #if PY_MAJOR_VERSION < 3
- PyObject *py_srcfile = NULL;
- py_srcfile = PyString_FromString(filename);
- if (!py_srcfile) goto bad;
- #endif
- if (c_line) {
- #if PY_MAJOR_VERSION < 3
- py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line);
- if (!py_funcname) goto bad;
- #else
- py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line);
- if (!py_funcname) goto bad;
- funcname = PyUnicode_AsUTF8(py_funcname);
- if (!funcname) goto bad;
- #endif
- }
- else {
- #if PY_MAJOR_VERSION < 3
- py_funcname = PyString_FromString(funcname);
- if (!py_funcname) goto bad;
- #endif
- }
- #if PY_MAJOR_VERSION < 3
- py_code = __Pyx_PyCode_New(
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- __pyx_empty_bytes, /*PyObject *code,*/
- __pyx_empty_tuple, /*PyObject *consts,*/
- __pyx_empty_tuple, /*PyObject *names,*/
- __pyx_empty_tuple, /*PyObject *varnames,*/
- __pyx_empty_tuple, /*PyObject *freevars,*/
- __pyx_empty_tuple, /*PyObject *cellvars,*/
- py_srcfile, /*PyObject *filename,*/
- py_funcname, /*PyObject *name,*/
- py_line,
- __pyx_empty_bytes /*PyObject *lnotab*/
- );
- Py_DECREF(py_srcfile);
- #else
- py_code = PyCode_NewEmpty(filename, funcname, py_line);
- #endif
- Py_XDECREF(py_funcname); // XDECREF since it's only set on Py3 if cline
- return py_code;
-bad:
- Py_XDECREF(py_funcname);
- #if PY_MAJOR_VERSION < 3
- Py_XDECREF(py_srcfile);
- #endif
- return NULL;
-}
-static void __Pyx_AddTraceback(const char *funcname, int c_line,
- int py_line, const char *filename) {
- PyCodeObject *py_code = 0;
- PyFrameObject *py_frame = 0;
- PyThreadState *tstate = __Pyx_PyThreadState_Current;
- PyObject *ptype, *pvalue, *ptraceback;
- if (c_line) {
- c_line = __Pyx_CLineForTraceback(tstate, c_line);
- }
- py_code = __pyx_find_code_object(c_line ? -c_line : py_line);
- if (!py_code) {
- __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback);
- py_code = __Pyx_CreateCodeObjectForTraceback(
- funcname, c_line, py_line, filename);
- if (!py_code) {
- /* If the code object creation fails, then we should clear the
- fetched exception references and propagate the new exception */
- Py_XDECREF(ptype);
- Py_XDECREF(pvalue);
- Py_XDECREF(ptraceback);
- goto bad;
- }
- __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback);
- __pyx_insert_code_object(c_line ? -c_line : py_line, py_code);
- }
- py_frame = PyFrame_New(
- tstate, /*PyThreadState *tstate,*/
- py_code, /*PyCodeObject *code,*/
- __pyx_d, /*PyObject *globals,*/
- 0 /*PyObject *locals*/
- );
- if (!py_frame) goto bad;
- __Pyx_PyFrame_SetLineNumber(py_frame, py_line);
- PyTraceBack_Here(py_frame);
-bad:
- Py_XDECREF(py_code);
- Py_XDECREF(py_frame);
-}
-#endif
-
-/* CIntFromPyVerify */
-#define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\
- __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0)
-#define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\
- __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1)
-#define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\
- {\
- func_type value = func_value;\
- if (sizeof(target_type) < sizeof(func_type)) {\
- if (unlikely(value != (func_type) (target_type) value)) {\
- func_type zero = 0;\
- if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\
- return (target_type) -1;\
- if (is_unsigned && unlikely(value < zero))\
- goto raise_neg_overflow;\
- else\
- goto raise_overflow;\
- }\
- }\
- return (target_type) value;\
- }
-
-/* CIntFromPy */
-static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) {
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic push
-#pragma GCC diagnostic ignored "-Wconversion"
-#endif
- const int neg_one = (int) -1, const_zero = (int) 0;
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic pop
-#endif
- const int is_unsigned = neg_one > const_zero;
-#if PY_MAJOR_VERSION < 3
- if (likely(PyInt_Check(x))) {
- if ((sizeof(int) < sizeof(long))) {
- __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x))
- } else {
- long val = PyInt_AS_LONG(x);
- if (is_unsigned && unlikely(val < 0)) {
- goto raise_neg_overflow;
- }
- return (int) val;
- }
- } else
-#endif
- if (likely(PyLong_Check(x))) {
- if (is_unsigned) {
-#if CYTHON_USE_PYLONG_INTERNALS
- if (unlikely(__Pyx_PyLong_IsNeg(x))) {
- goto raise_neg_overflow;
- } else if (__Pyx_PyLong_IsCompact(x)) {
- __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x))
- } else {
- const digit* digits = __Pyx_PyLong_Digits(x);
- assert(__Pyx_PyLong_DigitCount(x) > 1);
- switch (__Pyx_PyLong_DigitCount(x)) {
- case 2:
- if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) >= 2 * PyLong_SHIFT)) {
- return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]));
- }
- }
- break;
- case 3:
- if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) >= 3 * PyLong_SHIFT)) {
- return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]));
- }
- }
- break;
- case 4:
- if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) >= 4 * PyLong_SHIFT)) {
- return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]));
- }
- }
- break;
- }
- }
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7
- if (unlikely(Py_SIZE(x) < 0)) {
- goto raise_neg_overflow;
- }
-#else
- {
- int result = PyObject_RichCompareBool(x, Py_False, Py_LT);
- if (unlikely(result < 0))
- return (int) -1;
- if (unlikely(result == 1))
- goto raise_neg_overflow;
- }
-#endif
- if ((sizeof(int) <= sizeof(unsigned long))) {
- __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x))
-#ifdef HAVE_LONG_LONG
- } else if ((sizeof(int) <= sizeof(unsigned PY_LONG_LONG))) {
- __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x))
-#endif
- }
- } else {
-#if CYTHON_USE_PYLONG_INTERNALS
- if (__Pyx_PyLong_IsCompact(x)) {
- __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x))
- } else {
- const digit* digits = __Pyx_PyLong_Digits(x);
- assert(__Pyx_PyLong_DigitCount(x) > 1);
- switch (__Pyx_PyLong_SignedDigitCount(x)) {
- case -2:
- if ((8 * sizeof(int) - 1 > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) {
- return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- case 2:
- if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) {
- return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- case -3:
- if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) {
- return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- case 3:
- if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) {
- return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- case -4:
- if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) {
- return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- case 4:
- if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) {
- return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])));
- }
- }
- break;
- }
- }
-#endif
- if ((sizeof(int) <= sizeof(long))) {
- __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x))
-#ifdef HAVE_LONG_LONG
- } else if ((sizeof(int) <= sizeof(PY_LONG_LONG))) {
- __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x))
-#endif
- }
- }
- {
- int val;
- PyObject *v = __Pyx_PyNumber_IntOrLong(x);
-#if PY_MAJOR_VERSION < 3
- if (likely(v) && !PyLong_Check(v)) {
- PyObject *tmp = v;
- v = PyNumber_Long(tmp);
- Py_DECREF(tmp);
- }
-#endif
- if (likely(v)) {
- int ret = -1;
-#if !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray)
- int one = 1; int is_little = (int)*(unsigned char *)&one;
- unsigned char *bytes = (unsigned char *)&val;
- ret = _PyLong_AsByteArray((PyLongObject *)v,
- bytes, sizeof(val),
- is_little, !is_unsigned);
-#else
- PyObject *stepval = NULL, *mask = NULL, *shift = NULL;
- int bits, remaining_bits, is_negative = 0;
- long idigit;
- int chunk_size = (sizeof(long) < 8) ? 30 : 62;
- if (unlikely(!PyLong_CheckExact(v))) {
- PyObject *tmp = v;
- v = PyNumber_Long(v);
- assert(PyLong_CheckExact(v));
- Py_DECREF(tmp);
- if (unlikely(!v)) return (int) -1;
- }
-#if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- if (Py_SIZE(x) == 0)
- return (int) 0;
- is_negative = Py_SIZE(x) < 0;
-#else
- {
- int result = PyObject_RichCompareBool(x, Py_False, Py_LT);
- if (unlikely(result < 0))
- return (int) -1;
- is_negative = result == 1;
- }
-#endif
- if (is_unsigned && unlikely(is_negative)) {
- goto raise_neg_overflow;
- } else if (is_negative) {
- stepval = PyNumber_Invert(v);
- if (unlikely(!stepval))
- return (int) -1;
- } else {
- stepval = __Pyx_NewRef(v);
- }
- val = (int) 0;
- mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done;
- shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done;
- for (bits = 0; bits < (int) sizeof(int) * 8 - chunk_size; bits += chunk_size) {
- PyObject *tmp, *digit;
- digit = PyNumber_And(stepval, mask);
- if (unlikely(!digit)) goto done;
- idigit = PyLong_AsLong(digit);
- Py_DECREF(digit);
- if (unlikely(idigit < 0)) goto done;
- tmp = PyNumber_Rshift(stepval, shift);
- if (unlikely(!tmp)) goto done;
- Py_DECREF(stepval); stepval = tmp;
- val |= ((int) idigit) << bits;
- #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- if (Py_SIZE(stepval) == 0)
- goto unpacking_done;
- #endif
- }
- idigit = PyLong_AsLong(stepval);
- if (unlikely(idigit < 0)) goto done;
- remaining_bits = ((int) sizeof(int) * 8) - bits - (is_unsigned ? 0 : 1);
- if (unlikely(idigit >= (1L << remaining_bits)))
- goto raise_overflow;
- val |= ((int) idigit) << bits;
- #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- unpacking_done:
- #endif
- if (!is_unsigned) {
- if (unlikely(val & (((int) 1) << (sizeof(int) * 8 - 1))))
- goto raise_overflow;
- if (is_negative)
- val = ~val;
- }
- ret = 0;
- done:
- Py_XDECREF(shift);
- Py_XDECREF(mask);
- Py_XDECREF(stepval);
-#endif
- Py_DECREF(v);
- if (likely(!ret))
- return val;
- }
- return (int) -1;
- }
- } else {
- int val;
- PyObject *tmp = __Pyx_PyNumber_IntOrLong(x);
- if (!tmp) return (int) -1;
- val = __Pyx_PyInt_As_int(tmp);
- Py_DECREF(tmp);
- return val;
- }
-raise_overflow:
- PyErr_SetString(PyExc_OverflowError,
- "value too large to convert to int");
- return (int) -1;
-raise_neg_overflow:
- PyErr_SetString(PyExc_OverflowError,
- "can't convert negative value to int");
- return (int) -1;
-}
-
-/* CIntToPy */
-static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value) {
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic push
-#pragma GCC diagnostic ignored "-Wconversion"
-#endif
- const int neg_one = (int) -1, const_zero = (int) 0;
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic pop
-#endif
- const int is_unsigned = neg_one > const_zero;
- if (is_unsigned) {
- if (sizeof(int) < sizeof(long)) {
- return PyInt_FromLong((long) value);
- } else if (sizeof(int) <= sizeof(unsigned long)) {
- return PyLong_FromUnsignedLong((unsigned long) value);
-#ifdef HAVE_LONG_LONG
- } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) {
- return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value);
-#endif
- }
- } else {
- if (sizeof(int) <= sizeof(long)) {
- return PyInt_FromLong((long) value);
-#ifdef HAVE_LONG_LONG
- } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) {
- return PyLong_FromLongLong((PY_LONG_LONG) value);
-#endif
- }
- }
- {
- int one = 1; int little = (int)*(unsigned char *)&one;
- unsigned char *bytes = (unsigned char *)&value;
- return _PyLong_FromByteArray(bytes, sizeof(int),
- little, !is_unsigned);
- }
-}
-
-/* CIntToPy */
-static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) {
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic push
-#pragma GCC diagnostic ignored "-Wconversion"
-#endif
- const long neg_one = (long) -1, const_zero = (long) 0;
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic pop
-#endif
- const int is_unsigned = neg_one > const_zero;
- if (is_unsigned) {
- if (sizeof(long) < sizeof(long)) {
- return PyInt_FromLong((long) value);
- } else if (sizeof(long) <= sizeof(unsigned long)) {
- return PyLong_FromUnsignedLong((unsigned long) value);
-#ifdef HAVE_LONG_LONG
- } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) {
- return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value);
-#endif
- }
- } else {
- if (sizeof(long) <= sizeof(long)) {
- return PyInt_FromLong((long) value);
-#ifdef HAVE_LONG_LONG
- } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) {
- return PyLong_FromLongLong((PY_LONG_LONG) value);
-#endif
- }
- }
- {
- int one = 1; int little = (int)*(unsigned char *)&one;
- unsigned char *bytes = (unsigned char *)&value;
- return _PyLong_FromByteArray(bytes, sizeof(long),
- little, !is_unsigned);
- }
-}
-
-/* CIntFromPy */
-static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) {
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic push
-#pragma GCC diagnostic ignored "-Wconversion"
-#endif
- const long neg_one = (long) -1, const_zero = (long) 0;
-#ifdef __Pyx_HAS_GCC_DIAGNOSTIC
-#pragma GCC diagnostic pop
-#endif
- const int is_unsigned = neg_one > const_zero;
-#if PY_MAJOR_VERSION < 3
- if (likely(PyInt_Check(x))) {
- if ((sizeof(long) < sizeof(long))) {
- __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x))
- } else {
- long val = PyInt_AS_LONG(x);
- if (is_unsigned && unlikely(val < 0)) {
- goto raise_neg_overflow;
- }
- return (long) val;
- }
- } else
-#endif
- if (likely(PyLong_Check(x))) {
- if (is_unsigned) {
-#if CYTHON_USE_PYLONG_INTERNALS
- if (unlikely(__Pyx_PyLong_IsNeg(x))) {
- goto raise_neg_overflow;
- } else if (__Pyx_PyLong_IsCompact(x)) {
- __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x))
- } else {
- const digit* digits = __Pyx_PyLong_Digits(x);
- assert(__Pyx_PyLong_DigitCount(x) > 1);
- switch (__Pyx_PyLong_DigitCount(x)) {
- case 2:
- if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) >= 2 * PyLong_SHIFT)) {
- return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]));
- }
- }
- break;
- case 3:
- if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) >= 3 * PyLong_SHIFT)) {
- return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]));
- }
- }
- break;
- case 4:
- if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) >= 4 * PyLong_SHIFT)) {
- return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]));
- }
- }
- break;
- }
- }
-#endif
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7
- if (unlikely(Py_SIZE(x) < 0)) {
- goto raise_neg_overflow;
- }
-#else
- {
- int result = PyObject_RichCompareBool(x, Py_False, Py_LT);
- if (unlikely(result < 0))
- return (long) -1;
- if (unlikely(result == 1))
- goto raise_neg_overflow;
- }
-#endif
- if ((sizeof(long) <= sizeof(unsigned long))) {
- __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x))
-#ifdef HAVE_LONG_LONG
- } else if ((sizeof(long) <= sizeof(unsigned PY_LONG_LONG))) {
- __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x))
-#endif
- }
- } else {
-#if CYTHON_USE_PYLONG_INTERNALS
- if (__Pyx_PyLong_IsCompact(x)) {
- __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x))
- } else {
- const digit* digits = __Pyx_PyLong_Digits(x);
- assert(__Pyx_PyLong_DigitCount(x) > 1);
- switch (__Pyx_PyLong_SignedDigitCount(x)) {
- case -2:
- if ((8 * sizeof(long) - 1 > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) {
- return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- case 2:
- if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) {
- return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- case -3:
- if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) {
- return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- case 3:
- if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) {
- return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- case -4:
- if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) {
- return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- case 4:
- if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) {
- if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) {
- __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0])))
- } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) {
- return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])));
- }
- }
- break;
- }
- }
-#endif
- if ((sizeof(long) <= sizeof(long))) {
- __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x))
-#ifdef HAVE_LONG_LONG
- } else if ((sizeof(long) <= sizeof(PY_LONG_LONG))) {
- __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x))
-#endif
- }
- }
- {
- long val;
- PyObject *v = __Pyx_PyNumber_IntOrLong(x);
-#if PY_MAJOR_VERSION < 3
- if (likely(v) && !PyLong_Check(v)) {
- PyObject *tmp = v;
- v = PyNumber_Long(tmp);
- Py_DECREF(tmp);
- }
-#endif
- if (likely(v)) {
- int ret = -1;
-#if !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray)
- int one = 1; int is_little = (int)*(unsigned char *)&one;
- unsigned char *bytes = (unsigned char *)&val;
- ret = _PyLong_AsByteArray((PyLongObject *)v,
- bytes, sizeof(val),
- is_little, !is_unsigned);
-#else
- PyObject *stepval = NULL, *mask = NULL, *shift = NULL;
- int bits, remaining_bits, is_negative = 0;
- long idigit;
- int chunk_size = (sizeof(long) < 8) ? 30 : 62;
- if (unlikely(!PyLong_CheckExact(v))) {
- PyObject *tmp = v;
- v = PyNumber_Long(v);
- assert(PyLong_CheckExact(v));
- Py_DECREF(tmp);
- if (unlikely(!v)) return (long) -1;
- }
-#if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- if (Py_SIZE(x) == 0)
- return (long) 0;
- is_negative = Py_SIZE(x) < 0;
-#else
- {
- int result = PyObject_RichCompareBool(x, Py_False, Py_LT);
- if (unlikely(result < 0))
- return (long) -1;
- is_negative = result == 1;
- }
-#endif
- if (is_unsigned && unlikely(is_negative)) {
- goto raise_neg_overflow;
- } else if (is_negative) {
- stepval = PyNumber_Invert(v);
- if (unlikely(!stepval))
- return (long) -1;
- } else {
- stepval = __Pyx_NewRef(v);
- }
- val = (long) 0;
- mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done;
- shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done;
- for (bits = 0; bits < (int) sizeof(long) * 8 - chunk_size; bits += chunk_size) {
- PyObject *tmp, *digit;
- digit = PyNumber_And(stepval, mask);
- if (unlikely(!digit)) goto done;
- idigit = PyLong_AsLong(digit);
- Py_DECREF(digit);
- if (unlikely(idigit < 0)) goto done;
- tmp = PyNumber_Rshift(stepval, shift);
- if (unlikely(!tmp)) goto done;
- Py_DECREF(stepval); stepval = tmp;
- val |= ((long) idigit) << bits;
- #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- if (Py_SIZE(stepval) == 0)
- goto unpacking_done;
- #endif
- }
- idigit = PyLong_AsLong(stepval);
- if (unlikely(idigit < 0)) goto done;
- remaining_bits = ((int) sizeof(long) * 8) - bits - (is_unsigned ? 0 : 1);
- if (unlikely(idigit >= (1L << remaining_bits)))
- goto raise_overflow;
- val |= ((long) idigit) << bits;
- #if CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030B0000
- unpacking_done:
- #endif
- if (!is_unsigned) {
- if (unlikely(val & (((long) 1) << (sizeof(long) * 8 - 1))))
- goto raise_overflow;
- if (is_negative)
- val = ~val;
- }
- ret = 0;
- done:
- Py_XDECREF(shift);
- Py_XDECREF(mask);
- Py_XDECREF(stepval);
-#endif
- Py_DECREF(v);
- if (likely(!ret))
- return val;
- }
- return (long) -1;
- }
- } else {
- long val;
- PyObject *tmp = __Pyx_PyNumber_IntOrLong(x);
- if (!tmp) return (long) -1;
- val = __Pyx_PyInt_As_long(tmp);
- Py_DECREF(tmp);
- return val;
- }
-raise_overflow:
- PyErr_SetString(PyExc_OverflowError,
- "value too large to convert to long");
- return (long) -1;
-raise_neg_overflow:
- PyErr_SetString(PyExc_OverflowError,
- "can't convert negative value to long");
- return (long) -1;
-}
-
-/* FormatTypeName */
-#if CYTHON_COMPILING_IN_LIMITED_API
-static __Pyx_TypeName
-__Pyx_PyType_GetName(PyTypeObject* tp)
-{
- PyObject *name = __Pyx_PyObject_GetAttrStr((PyObject *)tp,
- __pyx_n_s_name);
- if (unlikely(name == NULL) || unlikely(!PyUnicode_Check(name))) {
- PyErr_Clear();
- Py_XSETREF(name, __Pyx_NewRef(__pyx_n_s__23));
- }
- return name;
-}
-#endif
-
-/* SwapException */
-#if CYTHON_FAST_THREAD_STATE
-static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) {
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4
- _PyErr_StackItem *exc_info = tstate->exc_info;
- tmp_value = exc_info->exc_value;
- exc_info->exc_value = *value;
- if (tmp_value == NULL || tmp_value == Py_None) {
- Py_XDECREF(tmp_value);
- tmp_value = NULL;
- tmp_type = NULL;
- tmp_tb = NULL;
- } else {
- tmp_type = (PyObject*) Py_TYPE(tmp_value);
- Py_INCREF(tmp_type);
- #if CYTHON_COMPILING_IN_CPYTHON
- tmp_tb = ((PyBaseExceptionObject*) tmp_value)->traceback;
- Py_XINCREF(tmp_tb);
- #else
- tmp_tb = PyException_GetTraceback(tmp_value);
- #endif
- }
- #elif CYTHON_USE_EXC_INFO_STACK
- _PyErr_StackItem *exc_info = tstate->exc_info;
- tmp_type = exc_info->exc_type;
- tmp_value = exc_info->exc_value;
- tmp_tb = exc_info->exc_traceback;
- exc_info->exc_type = *type;
- exc_info->exc_value = *value;
- exc_info->exc_traceback = *tb;
- #else
- tmp_type = tstate->exc_type;
- tmp_value = tstate->exc_value;
- tmp_tb = tstate->exc_traceback;
- tstate->exc_type = *type;
- tstate->exc_value = *value;
- tstate->exc_traceback = *tb;
- #endif
- *type = tmp_type;
- *value = tmp_value;
- *tb = tmp_tb;
-}
-#else
-static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb) {
- PyObject *tmp_type, *tmp_value, *tmp_tb;
- PyErr_GetExcInfo(&tmp_type, &tmp_value, &tmp_tb);
- PyErr_SetExcInfo(*type, *value, *tb);
- *type = tmp_type;
- *value = tmp_value;
- *tb = tmp_tb;
-}
-#endif
-
-/* PyObjectCall2Args */
-static CYTHON_INLINE PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) {
- PyObject *args[3] = {NULL, arg1, arg2};
- return __Pyx_PyObject_FastCall(function, args+1, 2 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET);
-}
-
-/* PyObjectCallMethod1 */
-static PyObject* __Pyx__PyObject_CallMethod1(PyObject* method, PyObject* arg) {
- PyObject *result = __Pyx_PyObject_CallOneArg(method, arg);
- Py_DECREF(method);
- return result;
-}
-static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg) {
- PyObject *method = NULL, *result;
- int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method);
- if (likely(is_method)) {
- result = __Pyx_PyObject_Call2Args(method, obj, arg);
- Py_DECREF(method);
- return result;
- }
- if (unlikely(!method)) return NULL;
- return __Pyx__PyObject_CallMethod1(method, arg);
-}
-
-/* CoroutineBase */
-#include <frameobject.h>
-#if PY_VERSION_HEX >= 0x030b00a6
- #ifndef Py_BUILD_CORE
- #define Py_BUILD_CORE 1
- #endif
- #include "internal/pycore_frame.h"
-#endif
-#define __Pyx_Coroutine_Undelegate(gen) Py_CLEAR((gen)->yieldfrom)
-static int __Pyx_PyGen__FetchStopIterationValue(PyThreadState *__pyx_tstate, PyObject **pvalue) {
- PyObject *et, *ev, *tb;
- PyObject *value = NULL;
- CYTHON_UNUSED_VAR(__pyx_tstate);
- __Pyx_ErrFetch(&et, &ev, &tb);
- if (!et) {
- Py_XDECREF(tb);
- Py_XDECREF(ev);
- Py_INCREF(Py_None);
- *pvalue = Py_None;
- return 0;
- }
- if (likely(et == PyExc_StopIteration)) {
- if (!ev) {
- Py_INCREF(Py_None);
- value = Py_None;
- }
-#if PY_VERSION_HEX >= 0x030300A0
- else if (likely(__Pyx_IS_TYPE(ev, (PyTypeObject*)PyExc_StopIteration))) {
- value = ((PyStopIterationObject *)ev)->value;
- Py_INCREF(value);
- Py_DECREF(ev);
- }
-#endif
- else if (unlikely(PyTuple_Check(ev))) {
- if (PyTuple_GET_SIZE(ev) >= 1) {
-#if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS
- value = PyTuple_GET_ITEM(ev, 0);
- Py_INCREF(value);
-#else
- value = PySequence_ITEM(ev, 0);
-#endif
- } else {
- Py_INCREF(Py_None);
- value = Py_None;
- }
- Py_DECREF(ev);
- }
- else if (!__Pyx_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration)) {
- value = ev;
- }
- if (likely(value)) {
- Py_XDECREF(tb);
- Py_DECREF(et);
- *pvalue = value;
- return 0;
- }
- } else if (!__Pyx_PyErr_GivenExceptionMatches(et, PyExc_StopIteration)) {
- __Pyx_ErrRestore(et, ev, tb);
- return -1;
- }
- PyErr_NormalizeException(&et, &ev, &tb);
- if (unlikely(!PyObject_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration))) {
- __Pyx_ErrRestore(et, ev, tb);
- return -1;
- }
- Py_XDECREF(tb);
- Py_DECREF(et);
-#if PY_VERSION_HEX >= 0x030300A0
- value = ((PyStopIterationObject *)ev)->value;
- Py_INCREF(value);
- Py_DECREF(ev);
-#else
- {
- PyObject* args = __Pyx_PyObject_GetAttrStr(ev, __pyx_n_s_args);
- Py_DECREF(ev);
- if (likely(args)) {
- value = PySequence_GetItem(args, 0);
- Py_DECREF(args);
- }
- if (unlikely(!value)) {
- __Pyx_ErrRestore(NULL, NULL, NULL);
- Py_INCREF(Py_None);
- value = Py_None;
- }
- }
-#endif
- *pvalue = value;
- return 0;
-}
-static CYTHON_INLINE
-void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *exc_state) {
-#if PY_VERSION_HEX >= 0x030B00a4
- Py_CLEAR(exc_state->exc_value);
-#else
- PyObject *t, *v, *tb;
- t = exc_state->exc_type;
- v = exc_state->exc_value;
- tb = exc_state->exc_traceback;
- exc_state->exc_type = NULL;
- exc_state->exc_value = NULL;
- exc_state->exc_traceback = NULL;
- Py_XDECREF(t);
- Py_XDECREF(v);
- Py_XDECREF(tb);
-#endif
-}
-#define __Pyx_Coroutine_AlreadyRunningError(gen) (__Pyx__Coroutine_AlreadyRunningError(gen), (PyObject*)NULL)
-static void __Pyx__Coroutine_AlreadyRunningError(__pyx_CoroutineObject *gen) {
- const char *msg;
- CYTHON_MAYBE_UNUSED_VAR(gen);
- if ((0)) {
- #ifdef __Pyx_Coroutine_USED
- } else if (__Pyx_Coroutine_Check((PyObject*)gen)) {
- msg = "coroutine already executing";
- #endif
- #ifdef __Pyx_AsyncGen_USED
- } else if (__Pyx_AsyncGen_CheckExact((PyObject*)gen)) {
- msg = "async generator already executing";
- #endif
- } else {
- msg = "generator already executing";
- }
- PyErr_SetString(PyExc_ValueError, msg);
-}
-#define __Pyx_Coroutine_NotStartedError(gen) (__Pyx__Coroutine_NotStartedError(gen), (PyObject*)NULL)
-static void __Pyx__Coroutine_NotStartedError(PyObject *gen) {
- const char *msg;
- CYTHON_MAYBE_UNUSED_VAR(gen);
- if ((0)) {
- #ifdef __Pyx_Coroutine_USED
- } else if (__Pyx_Coroutine_Check(gen)) {
- msg = "can't send non-None value to a just-started coroutine";
- #endif
- #ifdef __Pyx_AsyncGen_USED
- } else if (__Pyx_AsyncGen_CheckExact(gen)) {
- msg = "can't send non-None value to a just-started async generator";
- #endif
- } else {
- msg = "can't send non-None value to a just-started generator";
- }
- PyErr_SetString(PyExc_TypeError, msg);
-}
-#define __Pyx_Coroutine_AlreadyTerminatedError(gen, value, closing) (__Pyx__Coroutine_AlreadyTerminatedError(gen, value, closing), (PyObject*)NULL)
-static void __Pyx__Coroutine_AlreadyTerminatedError(PyObject *gen, PyObject *value, int closing) {
- CYTHON_MAYBE_UNUSED_VAR(gen);
- CYTHON_MAYBE_UNUSED_VAR(closing);
- #ifdef __Pyx_Coroutine_USED
- if (!closing && __Pyx_Coroutine_Check(gen)) {
- PyErr_SetString(PyExc_RuntimeError, "cannot reuse already awaited coroutine");
- } else
- #endif
- if (value) {
- #ifdef __Pyx_AsyncGen_USED
- if (__Pyx_AsyncGen_CheckExact(gen))
- PyErr_SetNone(__Pyx_PyExc_StopAsyncIteration);
- else
- #endif
- PyErr_SetNone(PyExc_StopIteration);
- }
-}
-static
-PyObject *__Pyx_Coroutine_SendEx(__pyx_CoroutineObject *self, PyObject *value, int closing) {
- __Pyx_PyThreadState_declare
- PyThreadState *tstate;
- __Pyx_ExcInfoStruct *exc_state;
- PyObject *retval;
- assert(!self->is_running);
- if (unlikely(self->resume_label == 0)) {
- if (unlikely(value && value != Py_None)) {
- return __Pyx_Coroutine_NotStartedError((PyObject*)self);
- }
- }
- if (unlikely(self->resume_label == -1)) {
- return __Pyx_Coroutine_AlreadyTerminatedError((PyObject*)self, value, closing);
- }
-#if CYTHON_FAST_THREAD_STATE
- __Pyx_PyThreadState_assign
- tstate = __pyx_tstate;
-#else
- tstate = __Pyx_PyThreadState_Current;
-#endif
- exc_state = &self->gi_exc_state;
- if (exc_state->exc_value) {
- #if CYTHON_COMPILING_IN_PYPY
- #else
- PyObject *exc_tb;
- #if PY_VERSION_HEX >= 0x030B00a4 && !CYTHON_COMPILING_IN_CPYTHON
- exc_tb = PyException_GetTraceback(exc_state->exc_value);
- #elif PY_VERSION_HEX >= 0x030B00a4
- exc_tb = ((PyBaseExceptionObject*) exc_state->exc_value)->traceback;
- #else
- exc_tb = exc_state->exc_traceback;
- #endif
- if (exc_tb) {
- PyTracebackObject *tb = (PyTracebackObject *) exc_tb;
- PyFrameObject *f = tb->tb_frame;
- assert(f->f_back == NULL);
- #if PY_VERSION_HEX >= 0x030B00A1
- f->f_back = PyThreadState_GetFrame(tstate);
- #else
- Py_XINCREF(tstate->frame);
- f->f_back = tstate->frame;
- #endif
- #if PY_VERSION_HEX >= 0x030B00a4 && !CYTHON_COMPILING_IN_CPYTHON
- Py_DECREF(exc_tb);
- #endif
- }
- #endif
- }
-#if CYTHON_USE_EXC_INFO_STACK
- exc_state->previous_item = tstate->exc_info;
- tstate->exc_info = exc_state;
-#else
- if (exc_state->exc_type) {
- __Pyx_ExceptionSwap(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback);
- } else {
- __Pyx_Coroutine_ExceptionClear(exc_state);
- __Pyx_ExceptionSave(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback);
- }
-#endif
- self->is_running = 1;
- retval = self->body(self, tstate, value);
- self->is_running = 0;
-#if CYTHON_USE_EXC_INFO_STACK
- exc_state = &self->gi_exc_state;
- tstate->exc_info = exc_state->previous_item;
- exc_state->previous_item = NULL;
- __Pyx_Coroutine_ResetFrameBackpointer(exc_state);
-#endif
- return retval;
-}
-static CYTHON_INLINE void __Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state) {
-#if CYTHON_COMPILING_IN_PYPY
- CYTHON_UNUSED_VAR(exc_state);
-#else
- PyObject *exc_tb;
- #if PY_VERSION_HEX >= 0x030B00a4
- if (!exc_state->exc_value) return;
- exc_tb = PyException_GetTraceback(exc_state->exc_value);
- #else
- exc_tb = exc_state->exc_traceback;
- #endif
- if (likely(exc_tb)) {
- PyTracebackObject *tb = (PyTracebackObject *) exc_tb;
- PyFrameObject *f = tb->tb_frame;
- Py_CLEAR(f->f_back);
- #if PY_VERSION_HEX >= 0x030B00a4
- Py_DECREF(exc_tb);
- #endif
- }
-#endif
-}
-static CYTHON_INLINE
-PyObject *__Pyx_Coroutine_MethodReturn(PyObject* gen, PyObject *retval) {
- CYTHON_MAYBE_UNUSED_VAR(gen);
- if (unlikely(!retval)) {
- __Pyx_PyThreadState_declare
- __Pyx_PyThreadState_assign
- if (!__Pyx_PyErr_Occurred()) {
- PyObject *exc = PyExc_StopIteration;
- #ifdef __Pyx_AsyncGen_USED
- if (__Pyx_AsyncGen_CheckExact(gen))
- exc = __Pyx_PyExc_StopAsyncIteration;
- #endif
- __Pyx_PyErr_SetNone(exc);
- }
- }
- return retval;
-}
-#if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3)
-static CYTHON_INLINE
-PyObject *__Pyx_PyGen_Send(PyGenObject *gen, PyObject *arg) {
-#if PY_VERSION_HEX <= 0x030A00A1
- return _PyGen_Send(gen, arg);
-#else
- PyObject *result;
- if (PyIter_Send((PyObject*)gen, arg ? arg : Py_None, &result) == PYGEN_RETURN) {
- if (PyAsyncGen_CheckExact(gen)) {
- assert(result == Py_None);
- PyErr_SetNone(PyExc_StopAsyncIteration);
- }
- else if (result == Py_None) {
- PyErr_SetNone(PyExc_StopIteration);
- }
- else {
- _PyGen_SetStopIterationValue(result);
- }
- Py_CLEAR(result);
- }
- return result;
-#endif
-}
-#endif
-static CYTHON_INLINE
-PyObject *__Pyx_Coroutine_FinishDelegation(__pyx_CoroutineObject *gen) {
- PyObject *ret;
- PyObject *val = NULL;
- __Pyx_Coroutine_Undelegate(gen);
- __Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, &val);
- ret = __Pyx_Coroutine_SendEx(gen, val, 0);
- Py_XDECREF(val);
- return ret;
-}
-static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value) {
- PyObject *retval;
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject*) self;
- PyObject *yf = gen->yieldfrom;
- if (unlikely(gen->is_running))
- return __Pyx_Coroutine_AlreadyRunningError(gen);
- if (yf) {
- PyObject *ret;
- gen->is_running = 1;
- #ifdef __Pyx_Generator_USED
- if (__Pyx_Generator_CheckExact(yf)) {
- ret = __Pyx_Coroutine_Send(yf, value);
- } else
- #endif
- #ifdef __Pyx_Coroutine_USED
- if (__Pyx_Coroutine_Check(yf)) {
- ret = __Pyx_Coroutine_Send(yf, value);
- } else
- #endif
- #ifdef __Pyx_AsyncGen_USED
- if (__pyx_PyAsyncGenASend_CheckExact(yf)) {
- ret = __Pyx_async_gen_asend_send(yf, value);
- } else
- #endif
- #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3)
- if (PyGen_CheckExact(yf)) {
- ret = __Pyx_PyGen_Send((PyGenObject*)yf, value == Py_None ? NULL : value);
- } else
- #endif
- #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03050000 && defined(PyCoro_CheckExact) && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3)
- if (PyCoro_CheckExact(yf)) {
- ret = __Pyx_PyGen_Send((PyGenObject*)yf, value == Py_None ? NULL : value);
- } else
- #endif
- {
- if (value == Py_None)
- ret = __Pyx_PyObject_GetIterNextFunc(yf)(yf);
- else
- ret = __Pyx_PyObject_CallMethod1(yf, __pyx_n_s_send, value);
- }
- gen->is_running = 0;
- if (likely(ret)) {
- return ret;
- }
- retval = __Pyx_Coroutine_FinishDelegation(gen);
- } else {
- retval = __Pyx_Coroutine_SendEx(gen, value, 0);
- }
- return __Pyx_Coroutine_MethodReturn(self, retval);
-}
-static int __Pyx_Coroutine_CloseIter(__pyx_CoroutineObject *gen, PyObject *yf) {
- PyObject *retval = NULL;
- int err = 0;
- #ifdef __Pyx_Generator_USED
- if (__Pyx_Generator_CheckExact(yf)) {
- retval = __Pyx_Coroutine_Close(yf);
- if (!retval)
- return -1;
- } else
- #endif
- #ifdef __Pyx_Coroutine_USED
- if (__Pyx_Coroutine_Check(yf)) {
- retval = __Pyx_Coroutine_Close(yf);
- if (!retval)
- return -1;
- } else
- if (__Pyx_CoroutineAwait_CheckExact(yf)) {
- retval = __Pyx_CoroutineAwait_Close((__pyx_CoroutineAwaitObject*)yf, NULL);
- if (!retval)
- return -1;
- } else
- #endif
- #ifdef __Pyx_AsyncGen_USED
- if (__pyx_PyAsyncGenASend_CheckExact(yf)) {
- retval = __Pyx_async_gen_asend_close(yf, NULL);
- } else
- if (__pyx_PyAsyncGenAThrow_CheckExact(yf)) {
- retval = __Pyx_async_gen_athrow_close(yf, NULL);
- } else
- #endif
- {
- PyObject *meth;
- gen->is_running = 1;
- meth = __Pyx_PyObject_GetAttrStrNoError(yf, __pyx_n_s_close);
- if (unlikely(!meth)) {
- if (unlikely(PyErr_Occurred())) {
- PyErr_WriteUnraisable(yf);
- }
- } else {
- retval = __Pyx_PyObject_CallNoArg(meth);
- Py_DECREF(meth);
- if (unlikely(!retval))
- err = -1;
- }
- gen->is_running = 0;
- }
- Py_XDECREF(retval);
- return err;
-}
-static PyObject *__Pyx_Generator_Next(PyObject *self) {
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject*) self;
- PyObject *yf = gen->yieldfrom;
- if (unlikely(gen->is_running))
- return __Pyx_Coroutine_AlreadyRunningError(gen);
- if (yf) {
- PyObject *ret;
- gen->is_running = 1;
- #ifdef __Pyx_Generator_USED
- if (__Pyx_Generator_CheckExact(yf)) {
- ret = __Pyx_Generator_Next(yf);
- } else
- #endif
- #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3)
- if (PyGen_CheckExact(yf)) {
- ret = __Pyx_PyGen_Send((PyGenObject*)yf, NULL);
- } else
- #endif
- #ifdef __Pyx_Coroutine_USED
- if (__Pyx_Coroutine_Check(yf)) {
- ret = __Pyx_Coroutine_Send(yf, Py_None);
- } else
- #endif
- ret = __Pyx_PyObject_GetIterNextFunc(yf)(yf);
- gen->is_running = 0;
- if (likely(ret)) {
- return ret;
- }
- return __Pyx_Coroutine_FinishDelegation(gen);
- }
- return __Pyx_Coroutine_SendEx(gen, Py_None, 0);
-}
-static PyObject *__Pyx_Coroutine_Close_Method(PyObject *self, PyObject *arg) {
- CYTHON_UNUSED_VAR(arg);
- return __Pyx_Coroutine_Close(self);
-}
-static PyObject *__Pyx_Coroutine_Close(PyObject *self) {
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self;
- PyObject *retval, *raised_exception;
- PyObject *yf = gen->yieldfrom;
- int err = 0;
- if (unlikely(gen->is_running))
- return __Pyx_Coroutine_AlreadyRunningError(gen);
- if (yf) {
- Py_INCREF(yf);
- err = __Pyx_Coroutine_CloseIter(gen, yf);
- __Pyx_Coroutine_Undelegate(gen);
- Py_DECREF(yf);
- }
- if (err == 0)
- PyErr_SetNone(PyExc_GeneratorExit);
- retval = __Pyx_Coroutine_SendEx(gen, NULL, 1);
- if (unlikely(retval)) {
- const char *msg;
- Py_DECREF(retval);
- if ((0)) {
- #ifdef __Pyx_Coroutine_USED
- } else if (__Pyx_Coroutine_Check(self)) {
- msg = "coroutine ignored GeneratorExit";
- #endif
- #ifdef __Pyx_AsyncGen_USED
- } else if (__Pyx_AsyncGen_CheckExact(self)) {
-#if PY_VERSION_HEX < 0x03060000
- msg = "async generator ignored GeneratorExit - might require Python 3.6+ finalisation (PEP 525)";
-#else
- msg = "async generator ignored GeneratorExit";
-#endif
- #endif
- } else {
- msg = "generator ignored GeneratorExit";
- }
- PyErr_SetString(PyExc_RuntimeError, msg);
- return NULL;
- }
- raised_exception = PyErr_Occurred();
- if (likely(!raised_exception || __Pyx_PyErr_GivenExceptionMatches2(raised_exception, PyExc_GeneratorExit, PyExc_StopIteration))) {
- if (raised_exception) PyErr_Clear();
- Py_INCREF(Py_None);
- return Py_None;
- }
- return NULL;
-}
-static PyObject *__Pyx__Coroutine_Throw(PyObject *self, PyObject *typ, PyObject *val, PyObject *tb,
- PyObject *args, int close_on_genexit) {
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self;
- PyObject *yf = gen->yieldfrom;
- if (unlikely(gen->is_running))
- return __Pyx_Coroutine_AlreadyRunningError(gen);
- if (yf) {
- PyObject *ret;
- Py_INCREF(yf);
- if (__Pyx_PyErr_GivenExceptionMatches(typ, PyExc_GeneratorExit) && close_on_genexit) {
- int err = __Pyx_Coroutine_CloseIter(gen, yf);
- Py_DECREF(yf);
- __Pyx_Coroutine_Undelegate(gen);
- if (err < 0)
- return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0));
- goto throw_here;
- }
- gen->is_running = 1;
- if (0
- #ifdef __Pyx_Generator_USED
- || __Pyx_Generator_CheckExact(yf)
- #endif
- #ifdef __Pyx_Coroutine_USED
- || __Pyx_Coroutine_Check(yf)
- #endif
- ) {
- ret = __Pyx__Coroutine_Throw(yf, typ, val, tb, args, close_on_genexit);
- #ifdef __Pyx_Coroutine_USED
- } else if (__Pyx_CoroutineAwait_CheckExact(yf)) {
- ret = __Pyx__Coroutine_Throw(((__pyx_CoroutineAwaitObject*)yf)->coroutine, typ, val, tb, args, close_on_genexit);
- #endif
- } else {
- PyObject *meth = __Pyx_PyObject_GetAttrStrNoError(yf, __pyx_n_s_throw);
- if (unlikely(!meth)) {
- Py_DECREF(yf);
- if (unlikely(PyErr_Occurred())) {
- gen->is_running = 0;
- return NULL;
- }
- __Pyx_Coroutine_Undelegate(gen);
- gen->is_running = 0;
- goto throw_here;
- }
- if (likely(args)) {
- ret = __Pyx_PyObject_Call(meth, args, NULL);
- } else {
- PyObject *cargs[4] = {NULL, typ, val, tb};
- ret = __Pyx_PyObject_FastCall(meth, cargs+1, 3 | __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET);
- }
- Py_DECREF(meth);
- }
- gen->is_running = 0;
- Py_DECREF(yf);
- if (!ret) {
- ret = __Pyx_Coroutine_FinishDelegation(gen);
- }
- return __Pyx_Coroutine_MethodReturn(self, ret);
- }
-throw_here:
- __Pyx_Raise(typ, val, tb, NULL);
- return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0));
-}
-static PyObject *__Pyx_Coroutine_Throw(PyObject *self, PyObject *args) {
- PyObject *typ;
- PyObject *val = NULL;
- PyObject *tb = NULL;
- if (unlikely(!PyArg_UnpackTuple(args, (char *)"throw", 1, 3, &typ, &val, &tb)))
- return NULL;
- return __Pyx__Coroutine_Throw(self, typ, val, tb, args, 1);
-}
-static CYTHON_INLINE int __Pyx_Coroutine_traverse_excstate(__Pyx_ExcInfoStruct *exc_state, visitproc visit, void *arg) {
-#if PY_VERSION_HEX >= 0x030B00a4
- Py_VISIT(exc_state->exc_value);
-#else
- Py_VISIT(exc_state->exc_type);
- Py_VISIT(exc_state->exc_value);
- Py_VISIT(exc_state->exc_traceback);
-#endif
- return 0;
-}
-static int __Pyx_Coroutine_traverse(__pyx_CoroutineObject *gen, visitproc visit, void *arg) {
- Py_VISIT(gen->closure);
- Py_VISIT(gen->classobj);
- Py_VISIT(gen->yieldfrom);
- return __Pyx_Coroutine_traverse_excstate(&gen->gi_exc_state, visit, arg);
-}
-static int __Pyx_Coroutine_clear(PyObject *self) {
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self;
- Py_CLEAR(gen->closure);
- Py_CLEAR(gen->classobj);
- Py_CLEAR(gen->yieldfrom);
- __Pyx_Coroutine_ExceptionClear(&gen->gi_exc_state);
-#ifdef __Pyx_AsyncGen_USED
- if (__Pyx_AsyncGen_CheckExact(self)) {
- Py_CLEAR(((__pyx_PyAsyncGenObject*)gen)->ag_finalizer);
- }
-#endif
- Py_CLEAR(gen->gi_code);
- Py_CLEAR(gen->gi_frame);
- Py_CLEAR(gen->gi_name);
- Py_CLEAR(gen->gi_qualname);
- Py_CLEAR(gen->gi_modulename);
- return 0;
-}
-static void __Pyx_Coroutine_dealloc(PyObject *self) {
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self;
- PyObject_GC_UnTrack(gen);
- if (gen->gi_weakreflist != NULL)
- PyObject_ClearWeakRefs(self);
- if (gen->resume_label >= 0) {
- PyObject_GC_Track(self);
-#if PY_VERSION_HEX >= 0x030400a1 && CYTHON_USE_TP_FINALIZE
- if (unlikely(PyObject_CallFinalizerFromDealloc(self)))
-#else
- Py_TYPE(gen)->tp_del(self);
- if (unlikely(Py_REFCNT(self) > 0))
-#endif
- {
- return;
- }
- PyObject_GC_UnTrack(self);
- }
-#ifdef __Pyx_AsyncGen_USED
- if (__Pyx_AsyncGen_CheckExact(self)) {
- /* We have to handle this case for asynchronous generators
- right here, because this code has to be between UNTRACK
- and GC_Del. */
- Py_CLEAR(((__pyx_PyAsyncGenObject*)self)->ag_finalizer);
- }
-#endif
- __Pyx_Coroutine_clear(self);
- __Pyx_PyHeapTypeObject_GC_Del(gen);
-}
-static void __Pyx_Coroutine_del(PyObject *self) {
- PyObject *error_type, *error_value, *error_traceback;
- __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self;
- __Pyx_PyThreadState_declare
- if (gen->resume_label < 0) {
- return;
- }
-#if !CYTHON_USE_TP_FINALIZE
- assert(self->ob_refcnt == 0);
- __Pyx_SET_REFCNT(self, 1);
-#endif
- __Pyx_PyThreadState_assign
- __Pyx_ErrFetch(&error_type, &error_value, &error_traceback);
-#ifdef __Pyx_AsyncGen_USED
- if (__Pyx_AsyncGen_CheckExact(self)) {
- __pyx_PyAsyncGenObject *agen = (__pyx_PyAsyncGenObject*)self;
- PyObject *finalizer = agen->ag_finalizer;
- if (finalizer && !agen->ag_closed) {
- PyObject *res = __Pyx_PyObject_CallOneArg(finalizer, self);
- if (unlikely(!res)) {
- PyErr_WriteUnraisable(self);
- } else {
- Py_DECREF(res);
- }
- __Pyx_ErrRestore(error_type, error_value, error_traceback);
- return;
- }
- }
-#endif
- if (unlikely(gen->resume_label == 0 && !error_value)) {
-#ifdef __Pyx_Coroutine_USED
-#ifdef __Pyx_Generator_USED
- if (!__Pyx_Generator_CheckExact(self))
-#endif
- {
- PyObject_GC_UnTrack(self);
-#if PY_MAJOR_VERSION >= 3 || defined(PyErr_WarnFormat)
- if (unlikely(PyErr_WarnFormat(PyExc_RuntimeWarning, 1, "coroutine '%.50S' was never awaited", gen->gi_qualname) < 0))
- PyErr_WriteUnraisable(self);
-#else
- {PyObject *msg;
- char *cmsg;
- #if CYTHON_COMPILING_IN_PYPY
- msg = NULL;
- cmsg = (char*) "coroutine was never awaited";
- #else
- char *cname;
- PyObject *qualname;
- qualname = gen->gi_qualname;
- cname = PyString_AS_STRING(qualname);
- msg = PyString_FromFormat("coroutine '%.50s' was never awaited", cname);
- if (unlikely(!msg)) {
- PyErr_Clear();
- cmsg = (char*) "coroutine was never awaited";
- } else {
- cmsg = PyString_AS_STRING(msg);
- }
- #endif
- if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, cmsg, 1) < 0))
- PyErr_WriteUnraisable(self);
- Py_XDECREF(msg);}
-#endif
- PyObject_GC_Track(self);
- }
-#endif
- } else {
- PyObject *res = __Pyx_Coroutine_Close(self);
- if (unlikely(!res)) {
- if (PyErr_Occurred())
- PyErr_WriteUnraisable(self);
- } else {
- Py_DECREF(res);
- }
- }
- __Pyx_ErrRestore(error_type, error_value, error_traceback);
-#if !CYTHON_USE_TP_FINALIZE
- assert(Py_REFCNT(self) > 0);
- if (likely(--self->ob_refcnt == 0)) {
- return;
- }
- {
- Py_ssize_t refcnt = Py_REFCNT(self);
- _Py_NewReference(self);
- __Pyx_SET_REFCNT(self, refcnt);
- }
-#if CYTHON_COMPILING_IN_CPYTHON
- assert(PyType_IS_GC(Py_TYPE(self)) &&
- _Py_AS_GC(self)->gc.gc_refs != _PyGC_REFS_UNTRACKED);
- _Py_DEC_REFTOTAL;
-#endif
-#ifdef COUNT_ALLOCS
- --Py_TYPE(self)->tp_frees;
- --Py_TYPE(self)->tp_allocs;
-#endif
-#endif
-}
-static PyObject *
-__Pyx_Coroutine_get_name(__pyx_CoroutineObject *self, void *context)
-{
- PyObject *name = self->gi_name;
- CYTHON_UNUSED_VAR(context);
- if (unlikely(!name)) name = Py_None;
- Py_INCREF(name);
- return name;
-}
-static int
-__Pyx_Coroutine_set_name(__pyx_CoroutineObject *self, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
-#if PY_MAJOR_VERSION >= 3
- if (unlikely(value == NULL || !PyUnicode_Check(value)))
-#else
- if (unlikely(value == NULL || !PyString_Check(value)))
-#endif
- {
- PyErr_SetString(PyExc_TypeError,
- "__name__ must be set to a string object");
- return -1;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(self->gi_name, value);
- return 0;
-}
-static PyObject *
-__Pyx_Coroutine_get_qualname(__pyx_CoroutineObject *self, void *context)
-{
- PyObject *name = self->gi_qualname;
- CYTHON_UNUSED_VAR(context);
- if (unlikely(!name)) name = Py_None;
- Py_INCREF(name);
- return name;
-}
-static int
-__Pyx_Coroutine_set_qualname(__pyx_CoroutineObject *self, PyObject *value, void *context)
-{
- CYTHON_UNUSED_VAR(context);
-#if PY_MAJOR_VERSION >= 3
- if (unlikely(value == NULL || !PyUnicode_Check(value)))
-#else
- if (unlikely(value == NULL || !PyString_Check(value)))
-#endif
- {
- PyErr_SetString(PyExc_TypeError,
- "__qualname__ must be set to a string object");
- return -1;
- }
- Py_INCREF(value);
- __Pyx_Py_XDECREF_SET(self->gi_qualname, value);
- return 0;
-}
-static PyObject *
-__Pyx_Coroutine_get_frame(__pyx_CoroutineObject *self, void *context)
-{
- PyObject *frame = self->gi_frame;
- CYTHON_UNUSED_VAR(context);
- if (!frame) {
- if (unlikely(!self->gi_code)) {
- Py_RETURN_NONE;
- }
- frame = (PyObject *) PyFrame_New(
- PyThreadState_Get(), /*PyThreadState *tstate,*/
- (PyCodeObject*) self->gi_code, /*PyCodeObject *code,*/
- __pyx_d, /*PyObject *globals,*/
- 0 /*PyObject *locals*/
- );
- if (unlikely(!frame))
- return NULL;
- self->gi_frame = frame;
- }
- Py_INCREF(frame);
- return frame;
-}
-static __pyx_CoroutineObject *__Pyx__Coroutine_New(
- PyTypeObject* type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure,
- PyObject *name, PyObject *qualname, PyObject *module_name) {
- __pyx_CoroutineObject *gen = PyObject_GC_New(__pyx_CoroutineObject, type);
- if (unlikely(!gen))
- return NULL;
- return __Pyx__Coroutine_NewInit(gen, body, code, closure, name, qualname, module_name);
-}
-static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit(
- __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure,
- PyObject *name, PyObject *qualname, PyObject *module_name) {
- gen->body = body;
- gen->closure = closure;
- Py_XINCREF(closure);
- gen->is_running = 0;
- gen->resume_label = 0;
- gen->classobj = NULL;
- gen->yieldfrom = NULL;
- #if PY_VERSION_HEX >= 0x030B00a4
- gen->gi_exc_state.exc_value = NULL;
- #else
- gen->gi_exc_state.exc_type = NULL;
- gen->gi_exc_state.exc_value = NULL;
- gen->gi_exc_state.exc_traceback = NULL;
- #endif
-#if CYTHON_USE_EXC_INFO_STACK
- gen->gi_exc_state.previous_item = NULL;
-#endif
- gen->gi_weakreflist = NULL;
- Py_XINCREF(qualname);
- gen->gi_qualname = qualname;
- Py_XINCREF(name);
- gen->gi_name = name;
- Py_XINCREF(module_name);
- gen->gi_modulename = module_name;
- Py_XINCREF(code);
- gen->gi_code = code;
- gen->gi_frame = NULL;
- PyObject_GC_Track(gen);
- return gen;
-}
-
-/* PatchModuleWithCoroutine */
-static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code) {
-#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED)
- int result;
- PyObject *globals, *result_obj;
- globals = PyDict_New(); if (unlikely(!globals)) goto ignore;
- result = PyDict_SetItemString(globals, "_cython_coroutine_type",
- #ifdef __Pyx_Coroutine_USED
- (PyObject*)__pyx_CoroutineType);
- #else
- Py_None);
- #endif
- if (unlikely(result < 0)) goto ignore;
- result = PyDict_SetItemString(globals, "_cython_generator_type",
- #ifdef __Pyx_Generator_USED
- (PyObject*)__pyx_GeneratorType);
- #else
- Py_None);
- #endif
- if (unlikely(result < 0)) goto ignore;
- if (unlikely(PyDict_SetItemString(globals, "_module", module) < 0)) goto ignore;
- if (unlikely(PyDict_SetItemString(globals, "__builtins__", __pyx_b) < 0)) goto ignore;
- result_obj = PyRun_String(py_code, Py_file_input, globals, globals);
- if (unlikely(!result_obj)) goto ignore;
- Py_DECREF(result_obj);
- Py_DECREF(globals);
- return module;
-ignore:
- Py_XDECREF(globals);
- PyErr_WriteUnraisable(module);
- if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, "Cython module failed to patch module with custom type", 1) < 0)) {
- Py_DECREF(module);
- module = NULL;
- }
-#else
- py_code++;
-#endif
- return module;
-}
-
-/* PatchGeneratorABC */
-#ifndef CYTHON_REGISTER_ABCS
-#define CYTHON_REGISTER_ABCS 1
-#endif
-#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED)
-static PyObject* __Pyx_patch_abc_module(PyObject *module);
-static PyObject* __Pyx_patch_abc_module(PyObject *module) {
- module = __Pyx_Coroutine_patch_module(
- module, ""
-"if _cython_generator_type is not None:\n"
-" try: Generator = _module.Generator\n"
-" except AttributeError: pass\n"
-" else: Generator.register(_cython_generator_type)\n"
-"if _cython_coroutine_type is not None:\n"
-" try: Coroutine = _module.Coroutine\n"
-" except AttributeError: pass\n"
-" else: Coroutine.register(_cython_coroutine_type)\n"
- );
- return module;
-}
-#endif
-static int __Pyx_patch_abc(void) {
-#if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED)
- static int abc_patched = 0;
- if (CYTHON_REGISTER_ABCS && !abc_patched) {
- PyObject *module;
- module = PyImport_ImportModule((PY_MAJOR_VERSION >= 3) ? "collections.abc" : "collections");
- if (unlikely(!module)) {
- PyErr_WriteUnraisable(NULL);
- if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning,
- ((PY_MAJOR_VERSION >= 3) ?
- "Cython module failed to register with collections.abc module" :
- "Cython module failed to register with collections module"), 1) < 0)) {
- return -1;
- }
- } else {
- module = __Pyx_patch_abc_module(module);
- abc_patched = 1;
- if (unlikely(!module))
- return -1;
- Py_DECREF(module);
- }
- module = PyImport_ImportModule("backports_abc");
- if (module) {
- module = __Pyx_patch_abc_module(module);
- Py_XDECREF(module);
- }
- if (!module) {
- PyErr_Clear();
- }
- }
-#else
- if ((0)) __Pyx_Coroutine_patch_module(NULL, NULL);
-#endif
- return 0;
-}
-
-/* Generator */
-static PyMethodDef __pyx_Generator_methods[] = {
- {"send", (PyCFunction) __Pyx_Coroutine_Send, METH_O,
- (char*) PyDoc_STR("send(arg) -> send 'arg' into generator,\nreturn next yielded value or raise StopIteration.")},
- {"throw", (PyCFunction) __Pyx_Coroutine_Throw, METH_VARARGS,
- (char*) PyDoc_STR("throw(typ[,val[,tb]]) -> raise exception in generator,\nreturn next yielded value or raise StopIteration.")},
- {"close", (PyCFunction) __Pyx_Coroutine_Close_Method, METH_NOARGS,
- (char*) PyDoc_STR("close() -> raise GeneratorExit inside generator.")},
- {0, 0, 0, 0}
-};
-static PyMemberDef __pyx_Generator_memberlist[] = {
- {(char *) "gi_running", T_BOOL, offsetof(__pyx_CoroutineObject, is_running), READONLY, NULL},
- {(char*) "gi_yieldfrom", T_OBJECT, offsetof(__pyx_CoroutineObject, yieldfrom), READONLY,
- (char*) PyDoc_STR("object being iterated by 'yield from', or None")},
- {(char*) "gi_code", T_OBJECT, offsetof(__pyx_CoroutineObject, gi_code), READONLY, NULL},
- {(char *) "__module__", T_OBJECT, offsetof(__pyx_CoroutineObject, gi_modulename), 0, 0},
-#if CYTHON_USE_TYPE_SPECS
- {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(__pyx_CoroutineObject, gi_weakreflist), READONLY, 0},
-#endif
- {0, 0, 0, 0, 0}
-};
-static PyGetSetDef __pyx_Generator_getsets[] = {
- {(char *) "__name__", (getter)__Pyx_Coroutine_get_name, (setter)__Pyx_Coroutine_set_name,
- (char*) PyDoc_STR("name of the generator"), 0},
- {(char *) "__qualname__", (getter)__Pyx_Coroutine_get_qualname, (setter)__Pyx_Coroutine_set_qualname,
- (char*) PyDoc_STR("qualified name of the generator"), 0},
- {(char *) "gi_frame", (getter)__Pyx_Coroutine_get_frame, NULL,
- (char*) PyDoc_STR("Frame of the generator"), 0},
- {0, 0, 0, 0, 0}
-};
-#if CYTHON_USE_TYPE_SPECS
-static PyType_Slot __pyx_GeneratorType_slots[] = {
- {Py_tp_dealloc, (void *)__Pyx_Coroutine_dealloc},
- {Py_tp_traverse, (void *)__Pyx_Coroutine_traverse},
- {Py_tp_iter, (void *)PyObject_SelfIter},
- {Py_tp_iternext, (void *)__Pyx_Generator_Next},
- {Py_tp_methods, (void *)__pyx_Generator_methods},
- {Py_tp_members, (void *)__pyx_Generator_memberlist},
- {Py_tp_getset, (void *)__pyx_Generator_getsets},
- {Py_tp_getattro, (void *) __Pyx_PyObject_GenericGetAttrNoDict},
-#if CYTHON_USE_TP_FINALIZE
- {Py_tp_finalize, (void *)__Pyx_Coroutine_del},
-#endif
- {0, 0},
-};
-static PyType_Spec __pyx_GeneratorType_spec = {
- __PYX_TYPE_MODULE_PREFIX "generator",
- sizeof(__pyx_CoroutineObject),
- 0,
- Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_HAVE_FINALIZE,
- __pyx_GeneratorType_slots
-};
-#else
-static PyTypeObject __pyx_GeneratorType_type = {
- PyVarObject_HEAD_INIT(0, 0)
- __PYX_TYPE_MODULE_PREFIX "generator",
- sizeof(__pyx_CoroutineObject),
- 0,
- (destructor) __Pyx_Coroutine_dealloc,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_HAVE_FINALIZE,
- 0,
- (traverseproc) __Pyx_Coroutine_traverse,
- 0,
- 0,
- offsetof(__pyx_CoroutineObject, gi_weakreflist),
- 0,
- (iternextfunc) __Pyx_Generator_Next,
- __pyx_Generator_methods,
- __pyx_Generator_memberlist,
- __pyx_Generator_getsets,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
- 0,
-#if CYTHON_USE_TP_FINALIZE
- 0,
-#else
- __Pyx_Coroutine_del,
-#endif
- 0,
-#if CYTHON_USE_TP_FINALIZE
- __Pyx_Coroutine_del,
-#elif PY_VERSION_HEX >= 0x030400a1
- 0,
-#endif
-#if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800)
- 0,
-#endif
-#if __PYX_NEED_TP_PRINT_SLOT
- 0,
-#endif
-#if PY_VERSION_HEX >= 0x030C0000
- 0,
-#endif
-#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000
- 0,
-#endif
-};
-#endif
-static int __pyx_Generator_init(PyObject *module) {
-#if CYTHON_USE_TYPE_SPECS
- __pyx_GeneratorType = __Pyx_FetchCommonTypeFromSpec(module, &__pyx_GeneratorType_spec, NULL);
-#else
- CYTHON_UNUSED_VAR(module);
- __pyx_GeneratorType_type.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict;
- __pyx_GeneratorType_type.tp_iter = PyObject_SelfIter;
- __pyx_GeneratorType = __Pyx_FetchCommonType(&__pyx_GeneratorType_type);
-#endif
- if (unlikely(!__pyx_GeneratorType)) {
- return -1;
- }
- return 0;
-}
-
-/* CheckBinaryVersion */
-static int __Pyx_check_binary_version(void) {
- char ctversion[5];
- int same=1, i, found_dot;
- const char* rt_from_call = Py_GetVersion();
- PyOS_snprintf(ctversion, 5, "%d.%d", PY_MAJOR_VERSION, PY_MINOR_VERSION);
- found_dot = 0;
- for (i = 0; i < 4; i++) {
- if (!ctversion[i]) {
- same = (rt_from_call[i] < '0' || rt_from_call[i] > '9');
- break;
- }
- if (rt_from_call[i] != ctversion[i]) {
- same = 0;
- break;
- }
- }
- if (!same) {
- char rtversion[5] = {'\0'};
- char message[200];
- for (i=0; i<4; ++i) {
- if (rt_from_call[i] == '.') {
- if (found_dot) break;
- found_dot = 1;
- } else if (rt_from_call[i] < '0' || rt_from_call[i] > '9') {
- break;
- }
- rtversion[i] = rt_from_call[i];
- }
- PyOS_snprintf(message, sizeof(message),
- "compile time version %s of module '%.100s' "
- "does not match runtime version %s",
- ctversion, __Pyx_MODULE_NAME, rtversion);
- return PyErr_WarnEx(NULL, message, 1);
- }
- return 0;
-}
-
-/* InitStrings */
-#if PY_MAJOR_VERSION >= 3
-static int __Pyx_InitString(__Pyx_StringTabEntry t, PyObject **str) {
- if (t.is_unicode | t.is_str) {
- if (t.intern) {
- *str = PyUnicode_InternFromString(t.s);
- } else if (t.encoding) {
- *str = PyUnicode_Decode(t.s, t.n - 1, t.encoding, NULL);
- } else {
- *str = PyUnicode_FromStringAndSize(t.s, t.n - 1);
- }
- } else {
- *str = PyBytes_FromStringAndSize(t.s, t.n - 1);
- }
- if (!*str)
- return -1;
- if (PyObject_Hash(*str) == -1)
- return -1;
- return 0;
-}
-#endif
-static int __Pyx_InitStrings(__Pyx_StringTabEntry *t) {
- while (t->p) {
- #if PY_MAJOR_VERSION >= 3
- __Pyx_InitString(*t, t->p);
- #else
- if (t->is_unicode) {
- *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL);
- } else if (t->intern) {
- *t->p = PyString_InternFromString(t->s);
- } else {
- *t->p = PyString_FromStringAndSize(t->s, t->n - 1);
- }
- if (!*t->p)
- return -1;
- if (PyObject_Hash(*t->p) == -1)
- return -1;
- #endif
- ++t;
- }
- return 0;
-}
-
-static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) {
- return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str));
-}
-static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) {
- Py_ssize_t ignore;
- return __Pyx_PyObject_AsStringAndSize(o, &ignore);
-}
-#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT
-#if !CYTHON_PEP393_ENABLED
-static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) {
- char* defenc_c;
- PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL);
- if (!defenc) return NULL;
- defenc_c = PyBytes_AS_STRING(defenc);
-#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII
- {
- char* end = defenc_c + PyBytes_GET_SIZE(defenc);
- char* c;
- for (c = defenc_c; c < end; c++) {
- if ((unsigned char) (*c) >= 128) {
- PyUnicode_AsASCIIString(o);
- return NULL;
- }
- }
- }
-#endif
- *length = PyBytes_GET_SIZE(defenc);
- return defenc_c;
-}
-#else
-static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) {
- if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL;
-#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII
- if (likely(PyUnicode_IS_ASCII(o))) {
- *length = PyUnicode_GET_LENGTH(o);
- return PyUnicode_AsUTF8(o);
- } else {
- PyUnicode_AsASCIIString(o);
- return NULL;
- }
-#else
- return PyUnicode_AsUTF8AndSize(o, length);
-#endif
-}
-#endif
-#endif
-static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) {
-#if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT
- if (
-#if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII
- __Pyx_sys_getdefaultencoding_not_ascii &&
-#endif
- PyUnicode_Check(o)) {
- return __Pyx_PyUnicode_AsStringAndSize(o, length);
- } else
-#endif
-#if (!CYTHON_COMPILING_IN_PYPY && !CYTHON_COMPILING_IN_LIMITED_API) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE))
- if (PyByteArray_Check(o)) {
- *length = PyByteArray_GET_SIZE(o);
- return PyByteArray_AS_STRING(o);
- } else
-#endif
- {
- char* result;
- int r = PyBytes_AsStringAndSize(o, &result, length);
- if (unlikely(r < 0)) {
- return NULL;
- } else {
- return result;
- }
- }
-}
-static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) {
- int is_true = x == Py_True;
- if (is_true | (x == Py_False) | (x == Py_None)) return is_true;
- else return PyObject_IsTrue(x);
-}
-static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) {
- int retval;
- if (unlikely(!x)) return -1;
- retval = __Pyx_PyObject_IsTrue(x);
- Py_DECREF(x);
- return retval;
-}
-static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) {
- __Pyx_TypeName result_type_name = __Pyx_PyType_GetName(Py_TYPE(result));
-#if PY_MAJOR_VERSION >= 3
- if (PyLong_Check(result)) {
- if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1,
- "__int__ returned non-int (type " __Pyx_FMT_TYPENAME "). "
- "The ability to return an instance of a strict subclass of int is deprecated, "
- "and may be removed in a future version of Python.",
- result_type_name)) {
- __Pyx_DECREF_TypeName(result_type_name);
- Py_DECREF(result);
- return NULL;
- }
- __Pyx_DECREF_TypeName(result_type_name);
- return result;
- }
-#endif
- PyErr_Format(PyExc_TypeError,
- "__%.4s__ returned non-%.4s (type " __Pyx_FMT_TYPENAME ")",
- type_name, type_name, result_type_name);
- __Pyx_DECREF_TypeName(result_type_name);
- Py_DECREF(result);
- return NULL;
-}
-static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) {
-#if CYTHON_USE_TYPE_SLOTS
- PyNumberMethods *m;
-#endif
- const char *name = NULL;
- PyObject *res = NULL;
-#if PY_MAJOR_VERSION < 3
- if (likely(PyInt_Check(x) || PyLong_Check(x)))
-#else
- if (likely(PyLong_Check(x)))
-#endif
- return __Pyx_NewRef(x);
-#if CYTHON_USE_TYPE_SLOTS
- m = Py_TYPE(x)->tp_as_number;
- #if PY_MAJOR_VERSION < 3
- if (m && m->nb_int) {
- name = "int";
- res = m->nb_int(x);
- }
- else if (m && m->nb_long) {
- name = "long";
- res = m->nb_long(x);
- }
- #else
- if (likely(m && m->nb_int)) {
- name = "int";
- res = m->nb_int(x);
- }
- #endif
-#else
- if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) {
- res = PyNumber_Int(x);
- }
-#endif
- if (likely(res)) {
-#if PY_MAJOR_VERSION < 3
- if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) {
-#else
- if (unlikely(!PyLong_CheckExact(res))) {
-#endif
- return __Pyx_PyNumber_IntOrLongWrongResultType(res, name);
- }
- }
- else if (!PyErr_Occurred()) {
- PyErr_SetString(PyExc_TypeError,
- "an integer is required");
- }
- return res;
-}
-static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) {
- Py_ssize_t ival;
- PyObject *x;
-#if PY_MAJOR_VERSION < 3
- if (likely(PyInt_CheckExact(b))) {
- if (sizeof(Py_ssize_t) >= sizeof(long))
- return PyInt_AS_LONG(b);
- else
- return PyInt_AsSsize_t(b);
- }
-#endif
- if (likely(PyLong_CheckExact(b))) {
- #if CYTHON_USE_PYLONG_INTERNALS
- if (likely(__Pyx_PyLong_IsCompact(b))) {
- return __Pyx_PyLong_CompactValue(b);
- } else {
- const digit* digits = __Pyx_PyLong_Digits(b);
- const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(b);
- switch (size) {
- case 2:
- if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) {
- return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- case -2:
- if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) {
- return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- case 3:
- if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) {
- return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- case -3:
- if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) {
- return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- case 4:
- if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) {
- return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- case -4:
- if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) {
- return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]));
- }
- break;
- }
- }
- #endif
- return PyLong_AsSsize_t(b);
- }
- x = PyNumber_Index(b);
- if (!x) return -1;
- ival = PyInt_AsSsize_t(x);
- Py_DECREF(x);
- return ival;
-}
-static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject* o) {
- if (sizeof(Py_hash_t) == sizeof(Py_ssize_t)) {
- return (Py_hash_t) __Pyx_PyIndex_AsSsize_t(o);
-#if PY_MAJOR_VERSION < 3
- } else if (likely(PyInt_CheckExact(o))) {
- return PyInt_AS_LONG(o);
-#endif
- } else {
- Py_ssize_t ival;
- PyObject *x;
- x = PyNumber_Index(o);
- if (!x) return -1;
- ival = PyInt_AsLong(x);
- Py_DECREF(x);
- return ival;
- }
-}
-static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) {
- return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False);
-}
-static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) {
- return PyInt_FromSize_t(ival);
-}
-
-
-/* #### Code section: utility_code_pragmas_end ### */
-#ifdef _MSC_VER
-#pragma warning( pop )
-#endif
-
-
-
-/* #### Code section: end ### */
-#endif /* Py_PYTHON_H */
diff --git a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/line_plot.py b/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/line_plot.py
deleted file mode 100644
index fcf6d8d49b7e8df39f2b571f7099de89b994fb57..0000000000000000000000000000000000000000
--- a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/components/line_plot.py
+++ /dev/null
@@ -1,430 +0,0 @@
-"""gr.LinePlot() component"""
-
-from __future__ import annotations
-
-from typing import Callable, Literal
-
-import altair as alt
-import pandas as pd
-from gradio_client.documentation import document, set_documentation_group
-
-from gradio.components.base import _Keywords
-from gradio.components.plot import AltairPlot, Plot
-
-set_documentation_group("component")
-
-
-@document()
-class LinePlot(Plot):
- """
- Create a line plot.
-
- Preprocessing: this component does *not* accept input.
- Postprocessing: expects a pandas dataframe with the data to plot.
-
- Demos: line_plot, live_dashboard
- """
-
- def __init__(
- self,
- value: pd.DataFrame | Callable | None = None,
- x: str | None = None,
- y: str | None = None,
- *,
- color: str | None = None,
- stroke_dash: str | None = None,
- overlay_point: bool | None = None,
- title: str | None = None,
- tooltip: list[str] | str | None = None,
- x_title: str | None = None,
- y_title: str | None = None,
- color_legend_title: str | None = None,
- stroke_dash_legend_title: str | None = None,
- color_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- stroke_dash_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- height: int | None = None,
- width: int | None = None,
- x_lim: list[int] | None = None,
- y_lim: list[int] | None = None,
- caption: str | None = None,
- interactive: bool | None = True,
- label: str | None = None,
- show_label: bool | None = None,
- container: bool = True,
- scale: int | None = None,
- min_width: int = 160,
- every: float | None = None,
- visible: bool = True,
- elem_id: str | None = None,
- elem_classes: list[str] | str | None = None,
- ):
- """
- Parameters:
- value: The pandas dataframe containing the data to display in a line plot.
- x: Column corresponding to the x axis.
- y: Column corresponding to the y axis.
- color: The column to determine the point color. If the column contains numeric data, gradio will interpolate the column data so that small values correspond to light colors and large values correspond to dark values.
- stroke_dash: The column to determine the symbol used to draw the line, e.g. dashed lines, dashed lines with points.
- overlay_point: Whether to draw a point on the line for each (x, y) coordinate pair.
- title: The title to display on top of the chart.
- tooltip: The column (or list of columns) to display on the tooltip when a user hovers a point on the plot.
- x_title: The title given to the x axis. By default, uses the value of the x parameter.
- y_title: The title given to the y axis. By default, uses the value of the y parameter.
- color_legend_title: The title given to the color legend. By default, uses the value of color parameter.
- stroke_dash_legend_title: The title given to the stroke_dash legend. By default, uses the value of the stroke_dash parameter.
- color_legend_position: The position of the color legend. If the string value 'none' is passed, this legend is omitted. For other valid position values see: https://vega.github.io/vega/docs/legends/#orientation.
- stroke_dash_legend_position: The position of the stroke_dash legend. If the string value 'none' is passed, this legend is omitted. For other valid position values see: https://vega.github.io/vega/docs/legends/#orientation.
- height: The height of the plot in pixels.
- width: The width of the plot in pixels.
- x_lim: A tuple or list containing the limits for the x-axis, specified as [x_min, x_max].
- y_lim: A tuple or list containing the limits for the y-axis, specified as [y_min, y_max].
- caption: The (optional) caption to display below the plot.
- interactive: Whether users should be able to interact with the plot by panning or zooming with their mouse or trackpad.
- label: The (optional) label to display on the top left corner of the plot.
- show_label: Whether the label should be displayed.
- every: If `value` is a callable, run the function 'every' number of seconds while the client connection is open. Has no effect otherwise. Queue must be enabled. The event can be accessed (e.g. to cancel it) via this component's .load_event attribute.
- visible: Whether the plot should be visible.
- elem_id: An optional string that is assigned as the id of this component in the HTML DOM. Can be used for targeting CSS styles.
- elem_classes: An optional list of strings that are assigned as the classes of this component in the HTML DOM. Can be used for targeting CSS styles.
- """
- self.x = x
- self.y = y
- self.color = color
- self.stroke_dash = stroke_dash
- self.tooltip = tooltip
- self.title = title
- self.x_title = x_title
- self.y_title = y_title
- self.color_legend_title = color_legend_title
- self.stroke_dash_legend_title = stroke_dash_legend_title
- self.color_legend_position = color_legend_position
- self.stroke_dash_legend_position = stroke_dash_legend_position
- self.overlay_point = overlay_point
- self.x_lim = x_lim
- self.y_lim = y_lim
- self.caption = caption
- self.interactive_chart = interactive
- self.width = width
- self.height = height
- super().__init__(
- value=value,
- label=label,
- show_label=show_label,
- container=container,
- scale=scale,
- min_width=min_width,
- visible=visible,
- elem_id=elem_id,
- elem_classes=elem_classes,
- every=every,
- )
-
- def get_config(self):
- config = super().get_config()
- config["caption"] = self.caption
- return config
-
- def get_block_name(self) -> str:
- return "plot"
-
- @staticmethod
- def update(
- value: pd.DataFrame | dict | Literal[_Keywords.NO_VALUE] = _Keywords.NO_VALUE,
- x: str | None = None,
- y: str | None = None,
- color: str | None = None,
- stroke_dash: str | None = None,
- overlay_point: bool | None = None,
- title: str | None = None,
- tooltip: list[str] | str | None = None,
- x_title: str | None = None,
- y_title: str | None = None,
- color_legend_title: str | None = None,
- stroke_dash_legend_title: str | None = None,
- color_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- stroke_dash_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- height: int | None = None,
- width: int | None = None,
- x_lim: list[int] | None = None,
- y_lim: list[int] | None = None,
- interactive: bool | None = None,
- caption: str | None = None,
- label: str | None = None,
- show_label: bool | None = None,
- container: bool | None = None,
- scale: int | None = None,
- min_width: int | None = None,
- visible: bool | None = None,
- ):
- """Update an existing plot component.
-
- If updating any of the plot properties (color, size, etc) the value, x, and y parameters must be specified.
-
- Parameters:
- value: The pandas dataframe containing the data to display in a line plot.
- x: Column corresponding to the x axis.
- y: Column corresponding to the y axis.
- color: The column to determine the point color. If the column contains numeric data, gradio will interpolate the column data so that small values correspond to light colors and large values correspond to dark values.
- stroke_dash: The column to determine the symbol used to draw the line, e.g. dashed lines, dashed lines with points.
- overlay_point: Whether to draw a point on the line for each (x, y) coordinate pair.
- title: The title to display on top of the chart.
- tooltip: The column (or list of columns) to display on the tooltip when a user hovers a point on the plot.
- x_title: The title given to the x axis. By default, uses the value of the x parameter.
- y_title: The title given to the y axis. By default, uses the value of the y parameter.
- color_legend_title: The title given to the color legend. By default, uses the value of color parameter.
- stroke_dash_legend_title: The title given to the stroke_dash legend. By default, uses the value of the stroke_dash parameter.
- color_legend_position: The position of the color legend. If the string value 'none' is passed, this legend is omitted. For other valid position values see: https://vega.github.io/vega/docs/legends/#orientation
- stroke_dash_legend_position: The position of the stroke_dash legend. If the string value 'none' is passed, this legend is omitted. For other valid position values see: https://vega.github.io/vega/docs/legends/#orientation
- height: The height of the plot in pixels.
- width: The width of the plot in pixels.
- x_lim: A tuple or list containing the limits for the x-axis, specified as [x_min, x_max].
- y_lim: A tuple or list containing the limits for the y-axis, specified as [y_min, y_max].
- caption: The (optional) caption to display below the plot.
- interactive: Whether users should be able to interact with the plot by panning or zooming with their mouse or trackpad.
- label: The (optional) label to display in the top left corner of the plot.
- show_label: Whether the label should be displayed.
- visible: Whether the plot should be visible.
- """
- properties = [
- x,
- y,
- color,
- stroke_dash,
- overlay_point,
- title,
- tooltip,
- x_title,
- y_title,
- color_legend_title,
- stroke_dash_legend_title,
- color_legend_position,
- stroke_dash_legend_position,
- height,
- width,
- x_lim,
- y_lim,
- interactive,
- ]
- if any(properties):
- if not isinstance(value, pd.DataFrame):
- raise ValueError(
- "In order to update plot properties the value parameter "
- "must be provided, and it must be a DataFrame. Please pass a value "
- "parameter to gr.LinePlot.update."
- )
- if x is None or y is None:
- raise ValueError(
- "In order to update plot properties, the x and y axis data "
- "must be specified. Please pass valid values for x and y to "
- "gr.LinePlot.update."
- )
- chart = LinePlot.create_plot(value, *properties)
- value = {"type": "altair", "plot": chart.to_json(), "chart": "line"}
-
- updated_config = {
- "label": label,
- "show_label": show_label,
- "container": container,
- "scale": scale,
- "min_width": min_width,
- "visible": visible,
- "value": value,
- "caption": caption,
- "__type__": "update",
- }
- return updated_config
-
- @staticmethod
- def create_plot(
- value: pd.DataFrame,
- x: str,
- y: str,
- color: str | None = None,
- stroke_dash: str | None = None,
- overlay_point: bool | None = None,
- title: str | None = None,
- tooltip: list[str] | str | None = None,
- x_title: str | None = None,
- y_title: str | None = None,
- color_legend_title: str | None = None,
- stroke_dash_legend_title: str | None = None,
- color_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- stroke_dash_legend_position: Literal[
- "left",
- "right",
- "top",
- "bottom",
- "top-left",
- "top-right",
- "bottom-left",
- "bottom-right",
- "none",
- ]
- | None = None,
- height: int | None = None,
- width: int | None = None,
- x_lim: list[int] | None = None,
- y_lim: list[int] | None = None,
- interactive: bool | None = None,
- ):
- """Helper for creating the line plot."""
- interactive = True if interactive is None else interactive
- encodings = {
- "x": alt.X(
- x, # type: ignore
- title=x_title or x, # type: ignore
- scale=AltairPlot.create_scale(x_lim), # type: ignore
- ),
- "y": alt.Y(
- y, # type: ignore
- title=y_title or y, # type: ignore
- scale=AltairPlot.create_scale(y_lim), # type: ignore
- ),
- }
- properties = {}
- if title:
- properties["title"] = title
- if height:
- properties["height"] = height
- if width:
- properties["width"] = width
-
- if color:
- domain = value[color].unique().tolist()
- range_ = list(range(len(domain)))
- encodings["color"] = {
- "field": color,
- "type": "nominal",
- "scale": {"domain": domain, "range": range_},
- "legend": AltairPlot.create_legend(
- position=color_legend_position, title=color_legend_title or color
- ),
- }
-
- highlight = None
- if interactive and any([color, stroke_dash]):
- highlight = alt.selection(
- type="single", # type: ignore
- on="mouseover",
- fields=[c for c in [color, stroke_dash] if c],
- nearest=True,
- )
-
- if stroke_dash:
- stroke_dash = {
- "field": stroke_dash, # type: ignore
- "legend": AltairPlot.create_legend( # type: ignore
- position=stroke_dash_legend_position, # type: ignore
- title=stroke_dash_legend_title or stroke_dash, # type: ignore
- ), # type: ignore
- } # type: ignore
- else:
- stroke_dash = alt.value(alt.Undefined) # type: ignore
-
- if tooltip:
- encodings["tooltip"] = tooltip
-
- chart = alt.Chart(value).encode(**encodings) # type: ignore
-
- points = chart.mark_point(clip=True).encode(
- opacity=alt.value(alt.Undefined) if overlay_point else alt.value(0),
- )
- lines = chart.mark_line(clip=True).encode(strokeDash=stroke_dash)
-
- if highlight:
- points = points.add_selection(highlight)
-
- lines = lines.encode(
- size=alt.condition(highlight, alt.value(4), alt.value(1)),
- )
-
- chart = (lines + points).properties(background="transparent", **properties)
- if interactive:
- chart = chart.interactive()
-
- return chart
-
- def postprocess(self, y: pd.DataFrame | dict | None) -> dict[str, str] | None:
- # if None or update
- if y is None or isinstance(y, dict):
- return y
- if self.x is None or self.y is None:
- raise ValueError("No value provided for required parameters `x` and `y`.")
- chart = self.create_plot(
- value=y,
- x=self.x,
- y=self.y,
- color=self.color,
- overlay_point=self.overlay_point,
- title=self.title,
- tooltip=self.tooltip,
- x_title=self.x_title,
- y_title=self.y_title,
- color_legend_title=self.color_legend_title, # type: ignore
- color_legend_position=self.color_legend_position, # type: ignore
- stroke_dash_legend_title=self.stroke_dash_legend_title,
- stroke_dash_legend_position=self.stroke_dash_legend_position, # type: ignore
- x_lim=self.x_lim,
- y_lim=self.y_lim,
- stroke_dash=self.stroke_dash,
- interactive=self.interactive_chart,
- height=self.height,
- width=self.width,
- )
-
- return {"type": "altair", "plot": chart.to_json(), "chart": "line"}
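For reference, the deleted `LinePlot.update` above guards styling updates: changing any plot property (color, title, limits, etc.) requires `value`, `x`, and `y` to be re-supplied, since the Altair chart must be rebuilt from scratch. A minimal pure-Python sketch of that guard pattern, with hypothetical names (`update_plot` is not the gradio API, and unlike the original's truthiness-based `any(properties)` check this version tests `is not None`):

```python
def update_plot(value=None, x=None, y=None, **style):
    """Sketch of the validation in the removed LinePlot.update():
    if any styling property changes, the data and axes must be passed
    again so the chart can be regenerated."""
    if any(v is not None for v in style.values()):
        if value is None:
            raise ValueError("value must be provided to update plot properties")
        if x is None or y is None:
            raise ValueError("x and y must be specified to update plot properties")
    # Mirror the shape of the config dict the component returned.
    return {"value": value, "x": x, "y": y, **style, "__type__": "update"}


# A styling change with full data is accepted...
config = update_plot(value=[{"day": 1, "temp": 20}], x="day", y="temp", title="Temp")

# ...but a styling change without data raises, as in the original.
try:
    update_plot(title="Temp")
except ValueError as e:
    print("rejected:", e)
```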
diff --git a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/index-94005977.js b/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/index-94005977.js
deleted file mode 100644
index af7acf3315af31245a6b74e3ece5e0c48f3d7ca2..0000000000000000000000000000000000000000
--- a/spaces/dcarpintero/nlp-summarizer-pegasus/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/index-94005977.js
+++ /dev/null
@@ -1,2 +0,0 @@
-import{E as W,C as Y,L as d}from"./index-604e6cf5.js";import{s as n,t as r,L as R,i as Z,d as a,f as X,a as y,b as f}from"./index-ba0b23cc.js";import"./index-39fce9e2.js";import"./Button-79f6e3bf.js";import"./Copy-77b3f70c.js";import"./Download-0afd7f1a.js";import"./BlockLabel-b1428685.js";import"./Empty-16d6169a.js";const l=1,w=189,S=190,b=191,T=192,U=193,m=194,V=22,g=23,h=47,G=48,c=53,u=54,_=55,j=57,E=58,k=59,z=60,v=61,H=63,N=230,A=71,F=255,K=121,C=142,D=143,M=146,s=10,i=13,t=32,o=9,q=35,L=40,B=46,J=new Set([g,h,G,F,H,K,u,_,N,z,v,E,k,A,C,D,M]),OO=new W((O,$)=>{if(O.next<0)O.acceptToken(m);else if(!(O.next!=s&&O.next!=i))if($.context.depth<0)O.acceptToken(T,1);else{O.advance();let Q=0;for(;O.next==t||O.next==o;)O.advance(),Q++;let P=O.next==s||O.next==i||O.next==q;O.acceptToken(P?U:b,-Q)}},{contextual:!0,fallback:!0}),$O=new W((O,$)=>{let Q=$.context.depth;if(Q<0)return;let P=O.peek(-1);if((P==s||P==i)&&$.context.depth>=0){let e=0,x=0;for(;;){if(O.next==t)e++;else if(O.next==o)e+=8-e%8;else break;O.advance(),x++}e!=Q&&O.next!=s&&O.next!=i&&O.next!=q&&(e{for(let $=0;$<5;$++){if(O.next!="print".charCodeAt($))return;O.advance()}if(!/\w/.test(String.fromCharCode(O.next)))for(let $=0;;$++){let Q=O.peek($);if(!(Q==t||Q==o)){Q!=L&&Q!=B&&Q!=s&&Q!=i&&Q!=q&&O.acceptToken(l);return}}}),sO=n({'async "*" "**" FormatConversion FormatSpec':r.modifier,"for while if elif else try except finally return raise break continue with pass assert await yield match case":r.controlKeyword,"in not and or is del":r.operatorKeyword,"from def class global nonlocal lambda":r.definitionKeyword,import:r.moduleKeyword,"with as 
print":r.keyword,Boolean:r.bool,None:r.null,VariableName:r.variableName,"CallExpression/VariableName":r.function(r.variableName),"FunctionDefinition/VariableName":r.function(r.definition(r.variableName)),"ClassDefinition/VariableName":r.definition(r.className),PropertyName:r.propertyName,"CallExpression/MemberExpression/PropertyName":r.function(r.propertyName),Comment:r.lineComment,Number:r.number,String:r.string,FormatString:r.special(r.string),UpdateOp:r.updateOperator,ArithOp:r.arithmeticOperator,BitOp:r.bitwiseOperator,CompareOp:r.compareOperator,AssignOp:r.definitionOperator,Ellipsis:r.punctuation,At:r.meta,"( )":r.paren,"[ ]":r.squareBracket,"{ }":r.brace,".":r.derefOperator,", ;":r.separator}),iO={__proto__:null,await:40,or:50,and:52,in:56,not:58,is:60,if:66,else:68,lambda:72,yield:90,from:92,async:98,for:100,None:152,True:154,False:154,del:168,pass:172,break:176,continue:180,return:184,raise:192,import:196,as:198,global:202,nonlocal:204,assert:208,elif:218,while:222,try:228,except:230,finally:232,with:236,def:240,class:250,match:261,case:267},oO=d.deserialize({version:14,states:"!L`O`Q$IXOOO%fQ$I[O'#G|OOQ$IS'#Cm'#CmOOQ$IS'#Cn'#CnO'UQ$IWO'#ClO(wQ$I[O'#G{OOQ$IS'#G|'#G|OOQ$IS'#DS'#DSOOQ$IS'#G{'#G{O)eQ$IWO'#CsO)uQ$IWO'#DdO*VQ$IWO'#DhOOQ$IS'#Ds'#DsO*jO`O'#DsO*rOpO'#DsO*zO!bO'#DtO+VO#tO'#DtO+bO&jO'#DtO+mO,UO'#DtO-oQ$I[O'#GmOOQ$IS'#Gm'#GmO'UQ$IWO'#GlO/RQ$I[O'#GlOOQ$IS'#E]'#E]O/jQ$IWO'#E^OOQ$IS'#Gk'#GkO/tQ$IWO'#GjOOQ$IV'#Gj'#GjO0PQ$IWO'#FPOOQ$IS'#GX'#GXO0UQ$IWO'#FOOOQ$IV'#Hx'#HxOOQ$IV'#Gi'#GiOOQ$IT'#Fh'#FhQ`Q$IXOOO'UQ$IWO'#CoO0dQ$IWO'#C{O0kQ$IWO'#DPO0yQ$IWO'#HQO1ZQ$I[O'#EQO'UQ$IWO'#EROOQ$IS'#ET'#ETOOQ$IS'#EV'#EVOOQ$IS'#EX'#EXO1oQ$IWO'#EZO2VQ$IWO'#E_O0PQ$IWO'#EaO2jQ$I[O'#EaO0PQ$IWO'#EdO/jQ$IWO'#EgO/jQ$IWO'#EkO/jQ$IWO'#EnO2uQ$IWO'#EpO2|Q$IWO'#EuO3XQ$IWO'#EqO/jQ$IWO'#EuO0PQ$IWO'#EwO0PQ$IWO'#E|O3^Q$IWO'#FROOQ$IS'#Cc'#CcOOQ$IS'#Cd'#CdOOQ$IS'#Ce'#CeOOQ$IS'#Cf'#CfOOQ$IS'#Cg'#CgOOQ$IS'#Ch'#ChOOQ$IS'#Cj'#CjO'UQ$IWO,58|O'UQ$IWO,58|O'UQ$IWO,58|O'UQ$IWO,58|O'UQ$IWO,58|O'UQ$IWO,
58|O3eQ$IWO'#DmOOQ$IS,5:W,5:WO3xQ$IWO'#H[OOQ$IS,5:Z,5:ZO4VQ%1`O,5:ZO4[Q$I[O,59WO0dQ$IWO,59`O0dQ$IWO,59`O0dQ$IWO,59`O6zQ$IWO,59`O7PQ$IWO,59`O7WQ$IWO,59hO7_Q$IWO'#G{O8eQ$IWO'#GzOOQ$IS'#Gz'#GzOOQ$IS'#DY'#DYO8|Q$IWO,59_O'UQ$IWO,59_O9[Q$IWO,59_O9aQ$IWO,5:PO'UQ$IWO,5:POOQ$IS,5:O,5:OO9oQ$IWO,5:OO9tQ$IWO,5:VO'UQ$IWO,5:VO'UQ$IWO,5:TOOQ$IS,5:S,5:SO:VQ$IWO,5:SO:[Q$IWO,5:UOOOO'#Fp'#FpO:aO`O,5:_OOQ$IS,5:_,5:_OOOO'#Fq'#FqO:iOpO,5:_O:qQ$IWO'#DuOOOO'#Fr'#FrO;RO!bO,5:`OOQ$IS,5:`,5:`OOOO'#Fu'#FuO;^O#tO,5:`OOOO'#Fv'#FvO;iO&jO,5:`OOOO'#Fw'#FwO;tO,UO,5:`OOQ$IS'#Fx'#FxOqQ$I[O,5=WO?[Q%GlO,5=WO?{Q$I[O,5=WOOQ$IS,5:x,5:xO@dQ$IXO'#GQOAsQ$IWO,5;TOOQ$IV,5=U,5=UOBOQ$I[O'#HtOBgQ$IWO,5;kOOQ$IS-E:V-E:VOOQ$IV,5;j,5;jO3SQ$IWO'#EwOOQ$IT-E9f-E9fOBoQ$I[O,59ZODvQ$I[O,59gOEaQ$IWO'#G}OElQ$IWO'#G}O0PQ$IWO'#G}OEwQ$IWO'#DROFPQ$IWO,59kOFUQ$IWO'#HRO'UQ$IWO'#HRO/jQ$IWO,5=lOOQ$IS,5=l,5=lO/jQ$IWO'#D|OOQ$IS'#D}'#D}OFsQ$IWO'#FzOGTQ$IWO,58zOGTQ$IWO,58zO)hQ$IWO,5:jOGcQ$I[O'#HTOOQ$IS,5:m,5:mOOQ$IS,5:u,5:uOGvQ$IWO,5:yOHXQ$IWO,5:{OOQ$IS'#F}'#F}OHgQ$I[O,5:{OHuQ$IWO,5:{OHzQ$IWO'#HwOOQ$IS,5;O,5;OOIYQ$IWO'#HsOOQ$IS,5;R,5;RO3XQ$IWO,5;VO3XQ$IWO,5;YOIkQ$I[O'#HyO'UQ$IWO'#HyOIuQ$IWO,5;[O2uQ$IWO,5;[O/jQ$IWO,5;aO0PQ$IWO,5;cOIzQ$IXO'#ElOKTQ$IZO,5;]ONiQ$IWO'#HzO3XQ$IWO,5;aONtQ$IWO,5;cONyQ$IWO,5;hO! 
RQ$I[O,5;mO'UQ$IWO,5;mO!#uQ$I[O1G.hO!#|Q$I[O1G.hO!&mQ$I[O1G.hO!&wQ$I[O1G.hO!)bQ$I[O1G.hO!)uQ$I[O1G.hO!*YQ$IWO'#HZO!*hQ$I[O'#GmO/jQ$IWO'#HZO!*rQ$IWO'#HYOOQ$IS,5:X,5:XO!*zQ$IWO,5:XO!+PQ$IWO'#H]O!+[Q$IWO'#H]O!+oQ$IWO,5=vOOQ$IS'#Dq'#DqOOQ$IS1G/u1G/uOOQ$IS1G.z1G.zO!,oQ$I[O1G.zO!,vQ$I[O1G.zO0dQ$IWO1G.zO!-cQ$IWO1G/SOOQ$IS'#DX'#DXO/jQ$IWO,59rOOQ$IS1G.y1G.yO!-jQ$IWO1G/cO!-zQ$IWO1G/cO!.SQ$IWO1G/dO'UQ$IWO'#HSO!.XQ$IWO'#HSO!.^Q$I[O1G.yO!.nQ$IWO,59gO!/tQ$IWO,5=rO!0UQ$IWO,5=rO!0^Q$IWO1G/kO!0cQ$I[O1G/kOOQ$IS1G/j1G/jO!0sQ$IWO,5=mO!1jQ$IWO,5=mO/jQ$IWO1G/oO!2XQ$IWO1G/qO!2^Q$I[O1G/qO!2nQ$I[O1G/oOOQ$IS1G/n1G/nOOQ$IS1G/p1G/pOOOO-E9n-E9nOOQ$IS1G/y1G/yOOOO-E9o-E9oO!3OQ$IWO'#HhO/jQ$IWO'#HhO!3^Q$IWO,5:aOOOO-E9p-E9pOOQ$IS1G/z1G/zOOOO-E9s-E9sOOOO-E9t-E9tOOOO-E9u-E9uOOQ$IS-E9v-E9vO!3iQ%GlO1G2rO!4YQ$I[O1G2rO'UQ$IWO,5`OOQ$IS1G1V1G1VO!5YQ$IWO1G1VOOQ$IS'#DT'#DTO/jQ$IWO,5=iOOQ$IS,5=i,5=iO!5_Q$IWO'#FiO!5jQ$IWO,59mO!5rQ$IWO1G/VO!5|Q$I[O,5=mOOQ$IS1G3W1G3WOOQ$IS,5:h,5:hO!6mQ$IWO'#GlOOQ$IS,5cO!8oQ$IWO,5>cO!8}Q$IWO,5>_O!9eQ$IWO,5>_O!9vQ$IZO1G0qO!=XQ$IZO1G0tO!@gQ$IWO,5>eO!@qQ$IWO,5>eO!@yQ$I[O,5>eO/jQ$IWO1G0vO!ATQ$IWO1G0vO3XQ$IWO1G0{ONtQ$IWO1G0}OOQ$IV,5;W,5;WO!AYQ$IYO,5;WO!A_Q$IZO1G0wO!DsQ$IWO'#GUO3XQ$IWO1G0wO3XQ$IWO1G0wO!EQQ$IWO,5>fO!E_Q$IWO,5>fO0PQ$IWO,5>fOOQ$IV1G0{1G0{O!EgQ$IWO'#EyO!ExQ%1`O1G0}OOQ$IV1G1S1G1SO3XQ$IWO1G1SO!FQQ$IWO'#FTOOQ$IV1G1X1G1XO! 
RQ$I[O1G1XOOQ$IS,5=u,5=uOOQ$IS'#Dn'#DnO/jQ$IWO,5=uO!FVQ$IWO,5=tO!FjQ$IWO,5=tOOQ$IS1G/s1G/sO!FrQ$IWO,5=wO!GSQ$IWO,5=wO!G[Q$IWO,5=wO!GoQ$IWO,5=wO!HPQ$IWO,5=wOOQ$IS1G3b1G3bOOQ$IS7+$f7+$fO!5rQ$IWO7+$nO!IrQ$IWO1G.zO!IyQ$IWO1G.zOOQ$IS1G/^1G/^OOQ$IS,5SO!NaQ$IWO,5>SO!NaQ$IWO,5>SO!NoO!LQO'#DwO!NzOSO'#HiOOOO1G/{1G/{O# PQ$IWO1G/{O# XQ%GlO7+(^O# xQ$I[O1G2PP#!cQ$IWO'#FyOOQ$IS,5T,5>TOOOO7+%g7+%gO#8UQ$IWO1G2rO#8oQ$IWO1G2rP'UQ$IWO'#FlO/jQ$IWO<bO#9cQ$IWO,5>bO0PQ$IWO,5>bO#9tQ$IWO,5>aOOQ$IS<hO#CeQ$IWO,5>hOOQ$IS,5>h,5>hO#CpQ$IWO,5>gO#DRQ$IWO,5>gOOQ$IS1G1P1G1POOQ$IS,5;g,5;gO#DZQ$IWO1G1ZP#D`Q$IWO'#FnO#DpQ$IWO1G1uO#ETQ$IWO1G1uO#EeQ$IWO1G1uP#EpQ$IWO'#FoO#E}Q$IWO7+(}O#F_Q$IWO7+(}O#F_Q$IWO7+(}O#FgQ$IWO7+(}O#FwQ$IWO7+(tO7WQ$IWO7+(tOOQ$ISAN>TAN>TO#GbQ$IWO<aAN>aO/jQ$IWO1G1sO#GrQ$I[O1G1sP#G|Q$IWO'#FmOOQ$IS1G1y1G1yP#HZQ$IWO'#FsO#HhQ$IWO7+)YOOOO-E9r-E9rO#IOQ$IWO7+(^OOQ$ISAN?VAN?VO#IiQ$IWO,5jO$,bQ$IWO,5>jO0PQ$IWO,5;vO$,sQ$IWO,5;zO$,xQ$IWO,5;zO#NzQ$IWO'#IQO$,}Q$IWO'#IQO$-SQ$IWO,5;{OOQ$IS,5;|,5;|O'UQ$IWO'#FgOOQ$IU1G1[1G1[O3XQ$IWO1G1[OOQ$ISAN@gAN@gO$-XQ$IWOG27oO$-iQ$IWO,59{OOQ$IS1G3[1G3[OOQ$IS,5lO#NzQ$IWO,5>lOOQ$IS1G1g1G1gO$0YQ$I[O,5mO$0hQ$IWO,5>mOOQ$IS1G1j1G1jOOQ$IS7+&y7+&yP#NzQ$IWO'#G_O$0pQ$IWO1G4WO$0zQ$IWO1G4WO$1SQ$IWO1G4WOOQ$IS7+%R7+%RO$1bQ$IWO1G1kO$1pQ$I[O'#FWO$1wQ$IWO,5m'PP>pP>vByFcPFw'PPPPF{GR&wP&w&wP&wP&wP&wP&wP&w&w&wP&wPP&wPP&wPGXPG`GfPG`PG`G`PPPG`PIePInItIzIePG`JQPG`PJXJ_PJcJwKfLPJcJcLVLdJcJcJcJcLxMOMRMWMZMaMgMsNVN]NgNm! Z! a! g! m! w! 
}!!T!!Z!!a!!g!!y!#T!#Z!#a!#g!#q!#w!#}!$T!$Z!$e!$k!$u!${!%U!%[!%k!%s!%}!&UPPPPPPPPP!&[!&d!&m!&w!'SPPPPPPPPPPPP!+r!,[!0j!3vPP!4O!4^!4g!5]!5S!5f!5l!5o!5r!5u!5}!6nPPPPPPPPPP!6q!6tPPPPPPPPP!6z!7W!7d!7j!7s!7v!7|!8S!8Y!8]P!8e!8n!9j!9m]iOr#n$n)c+c'udOSXYZehrstvx|}!R!S!T!U!X![!d!e!f!g!h!i!j!l!p!q!r!t!u!{#O#S#T#^#k#n$P$Q$S$U$X$i$k$l$n$u%O%T%[%_%a%d%h%m%o%y&R&T&`&d&m&o&p&w&{'O'V'Y'g'h'k'm'n'r'w'y'}(R(W(X(_(b(i(k(s(v)S)V)Z)[)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*l*p*q*x*z*{+S+[+]+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.^.b.y/i/j/k/l/n/o/p/q/r/t/x}!dP#j#w$Y$h$t%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!P!eP#j#w$Y$h$t$v%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!R!fP#j#w$Y$h$t$v$w%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!T!gP#j#w$Y$h$t$v$w$x%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!V!hP#j#w$Y$h$t$v$w$x$y%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!X!iP#j#w$Y$h$t$v$w$x$y$z%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m!]!iP!o#j#w$Y$h$t$v$w$x$y$z${%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/m'uSOSXYZehrstvx|}!R!S!T!U!X![!d!e!f!g!h!i!j!l!p!q!r!t!u!{#O#S#T#^#k#n$P$Q$S$U$X$i$k$l$n$u%O%T%[%_%a%d%h%m%o%y&R&T&`&d&m&o&p&w&{'O'V'Y'g'h'k'm'n'r'w'y'}(R(W(X(_(b(i(k(s(v)S)V)Z)[)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*l*p*q*x*z*{+S+[+]+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.^.b.y/i/j/k/l/n/o/p/q/r/t/x&ZUOXYZhrtv|}!R!S!T!X!j!l!p!q!r!t!u#^#k#n$Q$S$U$X$l$n%O%T%[%_%a%h%m%o%y&R&`&d&o&p&w'O'V'Y'g'h'k'm'n'r'y(R(X(_(b(i(k(s)S)V)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*p*q*x*{+S+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.b.y/i/j/k/l/n/o/p/q/t/x%eWOXYZhrv|}!R!S!T!X!j!l#^#k#n$Q$S$U$X$l$n%O%T%_%a%h%m%o%y&R&`&d&o&p&w'O'V'Y'g'h'k'm'n'r'y(R(X(_(b(i(k(s)S)V)`)c)l)v*O*R*S*V*]*`*b*e*f*i*p*q*x*{+S+c+j+k+n+v+w+x+z+{,O,S,U,W,Y,Z,],o,q,x,{-n-o.b/o/p/qQ#}uQ.c-sR/u/w'ldOSXYZehrstvx|}!R!S!T!U!X![!d!e!f!g!h!i!l!p!q!r!t!u!{#O#S#T#^#k#n$P$Q$S$U$X$i$k$l$n$u%O%T%[%_%a%d%h%m%o%y&R&T&`&d&m&o&p&w&{'O'V'Y'g'k'm'n'r'w'y'}(R(W(X(_(b(i(k(s(v)S)V)Z)[)`)c)l)v*R*S*V*]*^*`*b*e*f*i*l*p*q*x*z*{+S+[+]+c+j+k+n+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.^.b.y/i/j/k/l/n/o/p/q/r/t/xW#ql!O!P$`W#yu&b-s/wQ$b!QQ$r!YQ$s!ZW$}!j'h*O+vS&a
#z#{Q'R$mQ(l&ZQ(z&qU({&s(|(}U)O&u)P+RQ)n'[W)o'^+q,s-]S+p)p)qY,_*|,`-T-U-wQ,b+OQ,l+gQ,n+il-`,w-f-g-i.R.T.Y.p.u.z/P/[/a/dQ-v-SQ.Z-hQ.g-{Q.r.VU/V.{/Y/bX/]/Q/^/e/fR&`#yi!xXY!S!T%a%h'y(R)V*]*`*bR%_!wQ!|XQ%z#^Q&i$UR&l$XT-r-O.y![!kP!o#j#w$Y$h$t$v$w$x$y$z${%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/mQ&^#rR'a$sR'g$}Q%W!nR.e-y'tcOSXYZehrstvx|}!R!S!T!U!X![!d!e!f!g!h!i!j!l!p!q!r!t!u!{#O#S#T#^#k#n$P$Q$S$U$X$i$k$l$n$u%O%T%[%_%a%d%h%m%o%y&R&T&`&d&m&o&p&w&{'O'V'Y'g'h'k'm'n'r'w'y'}(R(W(X(_(b(i(k(s(v)S)V)Z)[)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*l*p*q*x*z*{+S+[+]+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.^.b.y/i/j/k/l/n/o/p/q/r/t/xS#hc#i!P-d,w-f-g-h-i-{.R.T.Y.p.u.z.{/P/Q/Y/[/^/a/b/d/e/f'tcOSXYZehrstvx|}!R!S!T!U!X![!d!e!f!g!h!i!j!l!p!q!r!t!u!{#O#S#T#^#k#n$P$Q$S$U$X$i$k$l$n$u%O%T%[%_%a%d%h%m%o%y&R&T&`&d&m&o&p&w&{'O'V'Y'g'h'k'm'n'r'w'y'}(R(W(X(_(b(i(k(s(v)S)V)Z)[)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*l*p*q*x*z*{+S+[+]+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.^.b.y/i/j/k/l/n/o/p/q/r/t/xT#hc#iS#__#`S#b`#cS#da#eS#fb#gT*t(e*uT(f%z(hQ$WwR+o)oX$Uw$V$W&kZkOr$n)c+cXoOr)c+cQ$o!WQ&y$fQ&z$gQ']$qQ'`$sQ)a'QQ)g'VQ)i'WQ)j'XQ)w'_Q)y'aQ+V)VQ+X)WQ+Y)XQ+^)_S+`)b)xQ+d)eQ+e)fQ+f)hQ,d+UQ,e+WQ,g+_Q,h+aQ,m+hQ-W,fQ-Y,kQ-Z,lQ-x-XQ._-lR.x.`WoOr)c+cR#tnQ'_$rR)b'RQ+n)oR,q+oQ)x'_R+a)bZmOnr)c+cQ'c$tR){'dT,u+u,vu-k,w-f-g-i-{.R.T.Y.p.u.z.{/P/Y/[/a/b/dt-k,w-f-g-i-{.R.T.Y.p.u.z.{/P/Y/[/a/b/dQ.Z-hX/]/Q/^/e/f!P-c,w-f-g-h-i-{.R.T.Y.p.u.z.{/P/Q/Y/[/^/a/b/d/e/fQ.O-bR.l.Pg.R-e.S.h.o.t/S/U/W/c/g/hu-j,w-f-g-i-{.R.T.Y.p.u.z.{/P/Y/[/a/b/dX-|-`-j.g/VR.i-{V/X.{/Y/bR.`-lQrOR#vrQ&c#|R(q&cS%n#R$OS(Y%n(]T(]%q&eQ%b!zQ%i!}W'z%b%i(P(TQ(P%fR(T%kQ&n$YR(w&nQ(`%rQ*g(ZT*m(`*gQ'i%PR*P'iS'l%S%TY*T'l*U+|,|-pU*U'm'n'oU+|*V*W*XS,|+},OR-p,}Q#Y]R%u#YQ#]^R%w#]Q#`_R%{#`Q(c%xS*r(c*sR*s(dQ*u(eR,[*uQ#c`R%}#cQ#eaR&O#eQ#gbR&P#gQ#icR&Q#iQ#lfQ&S#jW&V#l&S(t*yQ(t&hR*y/mQ$VwS&j$V&kR&k$WQ&x$dR)T&xQ&[#qR(m&[Q$`!PR&r$`Q*}({S,a*}-VR-V,bQ&v$bR)Q&vQ#ojR&X#oQ+c)cR,i+cQ)U&yR+T)UQ&|$hS)]&|)^R)^&}Q'U$oR)d'UQ'Z$pS)m'Z+lR+l)nQ+r)sR,t+rWnOr)c+cR#snQ,v+uR-^,vd.S-e.h.o.t/S/U/W/c/g/hR.n.SU-z-`
.g/VR.f-zQ/R.tS/_/R/`R/`/SS.|.h.iR/Z.|Q.U-eR.q.USqOrT+b)c+cWpOr)c+cR'S$nYjOr$n)c+cR&W#n[wOr#n$n)c+cR&i$U&YPOXYZhrtv|}!R!S!T!X!j!l!p!q!r!t!u#^#k#n$Q$S$U$X$l$n%O%T%[%_%a%h%m%o%y&R&`&d&o&p&w'O'V'Y'g'h'k'm'n'r'y(R(X(_(b(i(k(s)S)V)`)c)l)v*O*R*S*V*]*^*`*b*e*f*i*p*q*x*{+S+c+j+k+n+v+w+x+z+{,O,Q,S,U,W,Y,Z,],o,q,x,{-O-n-o.b.y/i/j/k/l/n/o/p/q/t/xQ!oSQ#jeQ#wsU$Yx%d'}S$h!U$kQ$t![Q$v!dQ$w!eQ$x!fQ$y!gQ$z!hQ${!iQ%f!{Q%k#OQ%q#SQ%r#TQ&e$PQ&}$iQ'd$uQ(j&TU(u&m(v*zW)Y&{)[+[+]Q*Z'wQ*d(WQ+Z)ZQ,V*lQ.w.^R/m/rQ!zXQ!}YQ$f!SQ$g!T^'v%a%h'y(R*]*`*bR+W)V[fOr#n$n)c+ch!wXY!S!T%a%h'y(R)V*]*`*bQ#RZQ#mhS$Ov|Q$]}W$d!R$X'O)`S$p!X$lW$|!j'h*O+vQ%S!lQ%x#^`&U#k&R(i(k(s*x,]/qQ&f$QQ&g$SQ&h$UQ'e%OQ'o%TQ'u%_W(V%m(X*e*iQ(Z%oQ(d%yQ(o&`S(r&d/oQ(x&oQ(y&pU)R&w)S+SQ)h'VY)k'Y)l+j+k,oQ)|'g^*Q'k*S+z+{,{-o.bQ*W'mQ*X'nS*Y'r/pW*k(_*f,S,WW*o(b*q,Y,ZQ+t)vQ+y*RQ+}*VQ,X*pQ,^*{Q,p+nQ,y+wQ,z+xQ,},OQ-R,UQ-[,qQ-m,xR.a-nhTOr#k#n$n&R&d'r(i(k)c+c$z!vXYZhv|}!R!S!T!X!j!l#^$Q$S$U$X$l%O%T%_%a%h%m%o%y&`&o&p&w'O'V'Y'g'h'k'm'n'y(R(X(_(b(s)S)V)`)l)v*O*R*S*V*]*`*b*e*f*i*p*q*x*{+S+j+k+n+v+w+x+z+{,O,S,U,W,Y,Z,],o,q,x,{-n-o.b/o/p/qQ#xtW%X!p!t/j/tQ%Y!qQ%Z!rQ%]!uQ%g/iS'q%[/nQ's/kQ't/lQ,P*^Q-Q,QS-q-O.yR/v/xU#|u-s/wR(p&b[gOr#n$n)c+cX!yX#^$U$XQ#WZQ$RvR$[|Q%c!zQ%j!}Q%p#RQ'e$|Q(Q%fQ(U%kQ(^%qQ(a%rQ*h(ZQ-P,PQ-u-QR.d-tQ$ZxQ'|%dR*_'}Q-t-OR/T.yR#QYR#VZR%R!jQ%P!jV)}'h*O+v!]!mP!o#j#w$Y$h$t$v$w$x$y$z${%f%k%q%r&e&}'d(j(u)Y*Z*d+Z,V.w/mR%U!lR%z#^Q(g%zR*w(hQ$e!RQ&l$XQ)_'OR+_)`Q#rlQ$^!OQ$a!PR&t$`Q(z&sR+Q(}Q(z&sQ+P(|R+Q(}R$c!QXpOr)c+cQ$j!UR'P$kQ$q!XR'Q$lR)u'^Q)s'^V,r+q,s-]Q-l,wQ.W-fR.X-gU-e,w-f-gQ.]-iQ.h-{Q.m.RU.o.T.p/PQ.t.YQ/S.uQ/U.zU/W.{/Y/bQ/c/[Q/g/aR/h/dR.[-hR.j-{",nodeNames:"⚠ print Comment Script AssignStatement * BinaryExpression BitOp BitOp BitOp BitOp ArithOp ArithOp @ ArithOp ** UnaryExpression ArithOp BitOp AwaitExpression await ) ( ParenthesizedExpression BinaryExpression or and CompareOp in not is UnaryExpression ConditionalExpression if else LambdaExpression lambda ParamList VariableName AssignOp , : NamedExpression AssignOp YieldExpression yield from 
TupleExpression ComprehensionExpression async for LambdaExpression ] [ ArrayExpression ArrayComprehensionExpression } { DictionaryExpression DictionaryComprehensionExpression SetExpression SetComprehensionExpression CallExpression ArgList AssignOp MemberExpression . PropertyName Number String FormatString FormatReplacement FormatConversion FormatSpec ContinuedString Ellipsis None Boolean TypeDef AssignOp UpdateStatement UpdateOp ExpressionStatement DeleteStatement del PassStatement pass BreakStatement break ContinueStatement continue ReturnStatement return YieldStatement PrintStatement RaiseStatement raise ImportStatement import as ScopeStatement global nonlocal AssertStatement assert StatementGroup ; IfStatement Body elif WhileStatement while ForStatement TryStatement try except finally WithStatement with FunctionDefinition def ParamList AssignOp TypeDef ClassDefinition class DecoratedStatement Decorator At MatchStatement match MatchBody MatchClause case CapturePattern LiteralPattern ArithOp ArithOp AsPattern OrPattern LogicOp AttributePattern SequencePattern MappingPattern StarPattern ClassPattern PatternArgList KeywordPattern KeywordPattern Guard",maxTerm:267,context:PO,nodeProps:[["group",-14,4,80,82,83,85,87,89,91,93,94,95,97,100,103,"Statement 
Statement",-22,6,16,19,23,38,47,48,54,55,58,59,60,61,62,65,68,69,70,74,75,76,77,"Expression",-10,105,107,110,112,113,117,119,124,126,129,"Statement",-9,134,135,138,139,141,142,143,144,145,"Pattern"],["openedBy",21,"(",52,"[",56,"{"],["closedBy",22,")",53,"]",57,"}"]],propSources:[sO],skippedNodes:[0,2],repeatNodeCount:38,tokenData:"&JdMgR!^OX$}XY!&]Y[$}[]!&]]p$}pq!&]qr!(grs!,^st!IYtu$}uv$5[vw$7nwx$8zxy%'vyz%(|z{%*S{|%,r|}%.O}!O%/U!O!P%1k!P!Q%UZ&^7[&WW&f#tOr(}rs)}sw(}wx>wx#O(}#O#P2]#P#o(}#o#p:X#p#q(}#q#r2q#r~(}:Y?QX&^7[&WW&f#tOr>wrs?ms#O>w#O#PAP#P#o>w#o#p8Y#p#q>w#q#r6g#r~>w:Y?rX&^7[Or>wrs@_s#O>w#O#PAP#P#o>w#o#p8Y#p#q>w#q#r6g#r~>w:Y@dX&^7[Or>wrs-}s#O>w#O#PAP#P#o>w#o#p8Y#p#q>w#q#r6g#r~>w:YAUT&^7[O#o>w#o#p6g#p#q>w#q#r6g#r~>w`x#O!`x#O!gZ&WW&R,XOY!wZ]!Ad]^>w^r!Adrs!Bhs#O!Ad#O#P!C[#P#o!Ad#o#p!9f#p#q!Ad#q#r!7x#r~!AdEc!BoX&^7[&R,XOr>wrs@_s#O>w#O#PAP#P#o>w#o#p8Y#p#q>w#q#r6g#r~>wEc!CaT&^7[O#o!Ad#o#p!7x#p#q!Ad#q#r!7x#r~!AdGZ!CuT&^7[O#o!-l#o#p!DU#p#q!-l#q#r!DU#r~!-l0}!De]&TS&WW&R,X&Z`&d!b&f#tOY!DUYZAyZ]!DU]^Ay^r!DUrs!E^sw!DUwx!5tx#O!DU#O#P!FU#P#o!DU#o#p!F[#p~!DU0}!EiX&TS&R,X&Z`&d!bOrAyrsCiswAywx5Px#OAy#O#PEo#P#oAy#o#pEu#p~Ay0}!FXPO~!DU0}!Fe]&TS&WW&R,XOY!`x#O!`sw#=dwx#@Sx#O#=d#O#P#Av#P#o#=d#o#p#0Y#p~#=d2P#=mZQ1s&TS&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#@Sx#O#=d#O#P#Av#P~#=d2P#>gZQ1s&TSOY#=dYZ:{Z]#=d]^:{^r#=drs#?Ysw#=dwx#@Sx#O#=d#O#P#Av#P~#=d2P#?aZQ1s&TSOY#=dYZ:{Z]#=d]^:{^r#=drs#,zsw#=dwx#@Sx#O#=d#O#P#Av#P~#=d2P#@ZZQ1s&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#@|x#O#=d#O#P#Av#P~#=d2P#ATZQ1s&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#9bx#O#=d#O#P#Av#P~#=d2P#A{TQ1sOY#=dYZ:{Z]#=d]^:{^~#=dLe#Bg_Q1s&^7[&WW&f#tOY!NdYZ(}Z]!Nd]^(}^r!Ndrs# 
rsw!Ndwx#Cfx#O!Nd#O#P#/f#P#o!Nd#o#p#wZ]#Cf]^>w^r#Cfrs#Djs#O#Cf#O#P#Fj#P#o#Cf#o#p#8h#p#q#Cf#q#r#5h#r~#CfJ}#Dq]Q1s&^7[OY#CfYZ>wZ]#Cf]^>w^r#Cfrs#Ejs#O#Cf#O#P#Fj#P#o#Cf#o#p#8h#p#q#Cf#q#r#5h#r~#CfJ}#Eq]Q1s&^7[OY#CfYZ>wZ]#Cf]^>w^r#Cfrs#'[s#O#Cf#O#P#Fj#P#o#Cf#o#p#8h#p#q#Cf#q#r#5h#r~#CfJ}#FqXQ1s&^7[OY#CfYZ>wZ]#Cf]^>w^#o#Cf#o#p#5h#p#q#Cf#q#r#5h#r~#CfLu#GeXQ1s&^7[OY!KxYZ'PZ]!Kx]^'P^#o!Kx#o#p#HQ#p#q!Kx#q#r#HQ#r~!Kx6i#Ha]Q1s&TS&WW&Z`&d!b&f#tOY#HQYZAyZ]#HQ]^Ay^r#HQrs#IYsw#HQwx#3dx#O#HQ#O#P#Mn#P#o#HQ#o#p#NS#p~#HQ6i#Ie]Q1s&TS&Z`&d!bOY#HQYZAyZ]#HQ]^Ay^r#HQrs#J^sw#HQwx#3dx#O#HQ#O#P#Mn#P#o#HQ#o#p#NS#p~#HQ6i#Ji]Q1s&TS&Z`&d!bOY#HQYZAyZ]#HQ]^Ay^r#HQrs#Kbsw#HQwx#3dx#O#HQ#O#P#Mn#P#o#HQ#o#p#NS#p~#HQ3k#KmZQ1s&TS&Z`&d!bOY#KbYZD_Z]#Kb]^D_^w#Kbwx#)|x#O#Kb#O#P#L`#P#o#Kb#o#p#Lt#p~#Kb3k#LeTQ1sOY#KbYZD_Z]#Kb]^D_^~#Kb3k#L{ZQ1s&TSOY#,zYZ1OZ]#,z]^1O^w#,zwx#-nx#O#,z#O#P#/Q#P#o#,z#o#p#Kb#p~#,z6i#MsTQ1sOY#HQYZAyZ]#HQ]^Ay^~#HQ6i#N]]Q1s&TS&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#@Sx#O#=d#O#P#Av#P#o#=d#o#p#HQ#p~#=dLu$ c_Q1s&^7[&TS&Z`&d!bOY!KxYZ'PZ]!Kx]^'P^r!Kxrs$!bsw!Kxwx!MYx#O!Kx#O#P#G^#P#o!Kx#o#p#NS#p#q!Kx#q#r#HQ#r~!KxIw$!o]Q1s&^7[&TS&Z`&d!bOY$!bYZGgZ]$!b]^Gg^w$!bwx#%[x#O$!b#O#P$#h#P#o$!b#o#p#Lt#p#q$!b#q#r#Kb#r~$!bIw$#oXQ1s&^7[OY$!bYZGgZ]$!b]^Gg^#o$!b#o#p#Kb#p#q$!b#q#r#Kb#r~$!bMV$$i_Q1s&^7[&WW&ap&f#tOY$%hYZIqZ]$%h]^Iq^r$%hrs# rsw$%hwx$.px#O$%h#O#P$&x#P#o$%h#o#p$-n#p#q$%h#q#r$'l#r~$%hMV$%y_Q1s&^7[&TS&WW&ap&d!b&f#tOY$%hYZIqZ]$%h]^Iq^r$%hrs# 
rsw$%hwx$$[x#O$%h#O#P$&x#P#o$%h#o#p$-n#p#q$%h#q#r$'l#r~$%hMV$'PXQ1s&^7[OY$%hYZIqZ]$%h]^Iq^#o$%h#o#p$'l#p#q$%h#q#r$'l#r~$%h6y$'{]Q1s&TS&WW&ap&d!b&f#tOY$'lYZKXZ]$'l]^KX^r$'lrs#1`sw$'lwx$(tx#O$'l#O#P$-Y#P#o$'l#o#p$-n#p~$'l6y$)P]Q1s&WW&ap&f#tOY$'lYZKXZ]$'l]^KX^r$'lrs#1`sw$'lwx$)xx#O$'l#O#P$-Y#P#o$'l#o#p$-n#p~$'l6y$*T]Q1s&WW&ap&f#tOY$'lYZKXZ]$'l]^KX^r$'lrs#1`sw$'lwx$*|x#O$'l#O#P$-Y#P#o$'l#o#p$-n#p~$'l5c$+XZQ1s&WW&ap&f#tOY$*|YZMmZ]$*|]^Mm^r$*|rs#6ds#O$*|#O#P$+z#P#o$*|#o#p$,`#p~$*|5c$,PTQ1sOY$*|YZMmZ]$*|]^Mm^~$*|5c$,gZQ1s&WWOY#9bYZ8tZ]#9b]^8t^r#9brs#:Us#O#9b#O#P#;h#P#o#9b#o#p$*|#p~#9b6y$-_TQ1sOY$'lYZKXZ]$'l]^KX^~$'l6y$-w]Q1s&TS&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#@Sx#O#=d#O#P#Av#P#o#=d#o#p$'l#p~#=dMV$.}_Q1s&^7[&WW&ap&f#tOY$%hYZIqZ]$%h]^Iq^r$%hrs# rsw$%hwx$/|x#O$%h#O#P$&x#P#o$%h#o#p$-n#p#q$%h#q#r$'l#r~$%hKo$0Z]Q1s&^7[&WW&ap&f#tOY$/|YZ!!uZ]$/|]^!!u^r$/|rs#Djs#O$/|#O#P$1S#P#o$/|#o#p$,`#p#q$/|#q#r$*|#r~$/|Ko$1ZXQ1s&^7[OY$/|YZ!!uZ]$/|]^!!u^#o$/|#o#p$*|#p#q$/|#q#r$*|#r~$/|Mg$1}XQ1s&^7[OY!IYYZ$}Z]!IY]^$}^#o!IY#o#p$2j#p#q!IY#q#r$2j#r~!IY7Z$2{]Q1s&TS&WW&Z`&ap&d!b&f#tOY$2jYZ!$gZ]$2j]^!$g^r$2jrs#IYsw$2jwx$(tx#O$2j#O#P$3t#P#o$2j#o#p$4Y#p~$2j7Z$3yTQ1sOY$2jYZ!$gZ]$2j]^!$g^~$2j7Z$4c]Q1s&TS&WWOY#=dYZ:{Z]#=d]^:{^r#=drs#>`sw#=dwx#@Sx#O#=d#O#P#Av#P#o#=d#o#p$2j#p~#=dGz$5o]%jQ&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!_$}!_!`$6h!`#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gz$6{Z!s,W&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gz$8R]%dQ&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!_$}!_!`$6h!`#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}G{$9Z_&_`&^7[&WW&R,X&ap&f#tOY$:YYZIqZ]$:Y]^Iq^r$:Yrs$;jsw$:Ywx%%zx#O$:Y#O#P%!^#P#o$:Y#o#p%$x#p#q$:Y#q#r%!r#r~$:YGk$:k_&^7[&TS&WW&R,X&ap&d!b&f#tOY$:YYZIqZ]$:Y]^Iq^r$:Yrs$;jsw$:Ywx% 
^x#O$:Y#O#P%!^#P#o$:Y#o#p%$x#p#q$:Y#q#r%!r#r~$:YFy$;u_&^7[&TS&R,X&d!bOY$Sx#O$Sx#O$_Z&^7[&WW&R,X&f#tOr(}rs)}sw(}wx={x#O(}#O#P2]#P#o(}#o#p:X#p#q(}#q#r2q#r~(}Fy$?VT&^7[O#o$Sx#O$T!Q!_$}!_!`$6h!`#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gz%>h]%kQ&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!_$}!_!`$6h!`#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%?tu!f,V&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!O$}!O!P%BX!P!Q$}!Q![%Cc![!d$}!d!e%Ee!e!g$}!g!h%7Z!h!l$}!l!m%;k!m!q$}!q!r%H_!r!z$}!z!{%KR!{#O$}#O#P!$R#P#R$}#R#S%Cc#S#U$}#U#V%Ee#V#X$}#X#Y%7Z#Y#^$}#^#_%;k#_#c$}#c#d%H_#d#l$}#l#m%KR#m#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%Bj]&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q![%5_![#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%Cvi!f,V&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!O$}!O!P%BX!P!Q$}!Q![%Cc![!g$}!g!h%7Z!h!l$}!l!m%;k!m#O$}#O#P!$R#P#R$}#R#S%Cc#S#X$}#X#Y%7Z#Y#^$}#^#_%;k#_#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%Ev`&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q!R%Fx!R!S%Fx!S#O$}#O#P!$R#P#R$}#R#S%Fx#S#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%G]`!f,V&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q!R%Fx!R!S%Fx!S#O$}#O#P!$R#P#R$}#R#S%Fx#S#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%Hp_&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q!Y%Io!Y#O$}#O#P!$R#P#R$}#R#S%Io#S#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%JS_!f,V&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q!Y%Io!Y#O$}#O#P!$R#P#R$}#R#S%Io#S#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%Kdc&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q![%Lo![!c$}!c!i%Lo!i#O$}#O#P!$R#P#R$}#R#S%Lo#S#T$}#T#Z%Lo#Z#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Gy%MSc!f,V&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!Q$}!Q![%Lo![!c$}!c!i%Lo!i#O$}#O#P!$R#P#R$}#R#S%Lo#S#T$}#T#Z%Lo#Z#o$}#o#p!%i#p#q$}#q#r!$g#r~$}Mg%Nr]y1s&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx!_$}!_!`& 
k!`#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}x!u!}&+n!}#O$}#O#P!$R#P#R$}#R#S&+n#S#T$}#T#f&+n#f#g&>x#g#o&+n#o#p!%i#p#q$}#q#r!$g#r$g$}$g~&+nGZ&9gZ&^7[&TS&Z`&d!b&`,XOr'Prs&:Ysw'Pwx(Rx#O'P#O#PAe#P#o'P#o#pEu#p#q'P#q#rAy#r~'PGZ&:eZ&^7[&TS&Z`&d!bOr'Prs&;Wsw'Pwx(Rx#O'P#O#PAe#P#o'P#o#pEu#p#q'P#q#rAy#r~'PD]&;eX&^7[&TS&e,X&Z`&d!bOwGgwx,kx#OGg#O#PH_#P#oGg#o#pET#p#qGg#q#rD_#r~GgGk&<_Z&^7[&WW&ap&f#t&Y,XOrIqrs)}swIqwx&=Qx#OIq#O#PJs#P#oIq#o#p! T#p#qIq#q#rKX#r~IqGk&=]Z&^7[&WW&ap&f#tOrIqrs)}swIqwx&>Ox#OIq#O#PJs#P#oIq#o#p! T#p#qIq#q#rKX#r~IqFT&>]X&^7[&WW&c,X&ap&f#tOr!!urs?ms#O!!u#O#P!#m#P#o!!u#o#pNc#p#q!!u#q#rMm#r~!!uMg&?_c&^7[&TS&WW&Q&j&Z`&ap&d!b&f#t%m,XOr$}rs&9Ysw$}wx&x!i!t&+n!t!u&5j!u!}&+n!}#O$}#O#P!$R#P#R$}#R#S&+n#S#T$}#T#U&+n#U#V&5j#V#Y&+n#Y#Z&>x#Z#o&+n#o#p!%i#p#q$}#q#r!$g#r$g$}$g~&+nG{&CXZ!V,X&^7[&TS&WW&Z`&ap&d!b&f#tOr$}rs&Rsw$}wxHsx#O$}#O#P!$R#P#o$}#o#p!%i#p#q$}#q#r!$g#r~$}iO[O]||-1}],tokenPrec:7282});function I(O,$){let Q=O.lineIndent($.from),P=O.lineAt(O.pos,-1),e=P.from+P.text.length;return!/\S/.test(P.text)&&O.node.toQ?null:Q+O.unit}const aO=R.define({name:"python",parser:oO.configure({props:[Z.add({Body:O=>{var $;return($=I(O,O.node))!==null&&$!==void 0?$:O.continue()},IfStatement:O=>/^\s*(else:|elif )/.test(O.textAfter)?O.baseIndent:O.continue(),TryStatement:O=>/^\s*(except |finally:|else:)/.test(O.textAfter)?O.baseIndent:O.continue(),"TupleExpression ComprehensionExpression ParamList ArgList ParenthesizedExpression":a({closing:")"}),"DictionaryExpression DictionaryComprehensionExpression SetExpression SetComprehensionExpression":a({closing:"}"}),"ArrayExpression ArrayComprehensionExpression":a({closing:"]"}),"String FormatString":()=>null,Script:O=>{if(O.pos+/\s*/.exec(O.textAfter)[0].length>=O.node.to){let $=null;for(let Q=O.node,P=Q.to;Q=Q.lastChild,!(!Q||Q.to!=P);)Q.type.name=="Body"&&($=Q);if($){let Q=I(O,$);if(Q!=null)return Q}}return O.continue()}}),X.add({"ArrayExpression DictionaryExpression SetExpression 
TupleExpression":y,Body:(O,$)=>({from:O.from+1,to:O.to-(O.to==$.doc.length?0:1)})})]}),languageData:{closeBrackets:{brackets:["(","[","{","'",'"',"'''",'"""'],stringPrefixes:["f","fr","rf","r","u","b","br","rb","F","FR","RF","R","U","B","BR","RB"]},commentTokens:{line:"#"},indentOnInput:/^\s*([\}\]\)]|else:|elif |except |finally:)$/}});function YO(){return new f(aO)}export{YO as python,aO as pythonLanguage};
-//# sourceMappingURL=index-94005977.js.map
diff --git a/spaces/ddosxd/sydney-inpaint/static/index.html b/spaces/ddosxd/sydney-inpaint/static/index.html
deleted file mode 100644
index 100488b95064befc7652e739dd9f6d6ac9100db2..0000000000000000000000000000000000000000
--- a/spaces/ddosxd/sydney-inpaint/static/index.html
+++ /dev/null
@@ -1,397 +0,0 @@
-
-
-
-
- SDXL Inpaint
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
\ No newline at end of file
diff --git a/spaces/deaf1296/finetuned_diffusion/README.md b/spaces/deaf1296/finetuned_diffusion/README.md
deleted file mode 100644
index c29108864c9d3c8ff637222d04f33fb0f576e74a..0000000000000000000000000000000000000000
--- a/spaces/deaf1296/finetuned_diffusion/README.md
+++ /dev/null
@@ -1,14 +0,0 @@
----
-title: Finetuned Diffusion
-emoji: 🪄🖼️
-colorFrom: red
-colorTo: pink
-sdk: gradio
-sdk_version: 3.6
-app_file: app.py
-pinned: true
-license: mit
-duplicated_from: anzorq/finetuned_diffusion
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/dechantoine/PokeGAN/exploration.py b/spaces/dechantoine/PokeGAN/exploration.py
deleted file mode 100644
index fa119d89274f6a04d1337d478155fd79503d2241..0000000000000000000000000000000000000000
--- a/spaces/dechantoine/PokeGAN/exploration.py
+++ /dev/null
@@ -1,42 +0,0 @@
-import numpy as np
-
-def convert_to_spherical(point):
-    """Convert a cartesian point to hyperspherical coordinates [r, phi_1, ..., phi_{n-1}]."""
-    sph = np.zeros(len(point))
-    sph[0] = np.sqrt(np.sum(np.square(point)))
-    for i in range(1, len(point)-1):
-        sph[i] = np.arccos(point[i-1]/np.sqrt(np.sum(np.square(point[i-1:]))))
-    # The last angle spans [0, 2*pi) and carries the sign of the last coordinate.
-    if point[-1] >= 0:
-        sph[-1] = np.arccos(point[-2]/np.sqrt(np.sum(np.square(point[-2:]))))
-    else:
-        sph[-1] = np.pi*2 - np.arccos(point[-2]/np.sqrt(np.sum(np.square(point[-2:]))))
-    return sph
-
-def convert_to_cartesian(point):
-    """Convert hyperspherical coordinates [r, phi_1, ..., phi_{n-1}] back to cartesian."""
-    crts = np.zeros(len(point))
-    crts[0] = point[0]*np.cos(point[1])
-    s = 1
-    for i in range(1, len(point)-1):
-        s = s*np.sin(point[i])
-        crts[i] = point[0]*s*np.cos(point[i+1])
-    s = s*np.sin(point[-1])
-    crts[-1] = point[0]*s
-    return crts
-
-def get_great_circle(point, axis_to_rotate=1, n_samples=100):
-    """Sample n_samples points on the great circle through `point`, obtained by
-    sweeping the chosen spherical angle through a full turn."""
-    sph = convert_to_spherical(point)
-    circle = np.zeros(shape=(n_samples, len(point)))
-    for i in range(n_samples):
-        circle[i] = convert_to_cartesian(np.concatenate((sph[:axis_to_rotate],
-                                                         [2*np.pi/n_samples*i], sph[(axis_to_rotate+1):])))
-    return circle
-
-
-def get_arc_between_points(point1, point2, n_samples=100):
-    """Sample n_samples points along an arc from point1 towards point2 by
-    interpolating linearly in hyperspherical coordinates."""
-    sph1 = convert_to_spherical(point1)
-    sph2 = convert_to_spherical(point2)
-
-    step = (sph2 - sph1) / n_samples
-    arc = np.zeros(shape=(n_samples, len(point1)))
-    for i in range(n_samples):
-        arc[i] = convert_to_cartesian(sph1 + step * i)
-    return arc
\ No newline at end of file
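The two deleted conversion helpers above invert each other: converting to hyperspherical coordinates and back should reproduce the original point. Here is a minimal self-contained sanity check (the two functions are restated so the snippet runs on its own; the sample vector is an arbitrary illustration):

```python
import numpy as np

# Restated copies of the conversion helpers so this check runs standalone.
def convert_to_spherical(point):
    sph = np.zeros(len(point))
    sph[0] = np.sqrt(np.sum(np.square(point)))
    for i in range(1, len(point) - 1):
        sph[i] = np.arccos(point[i - 1] / np.sqrt(np.sum(np.square(point[i - 1:]))))
    if point[-1] >= 0:
        sph[-1] = np.arccos(point[-2] / np.sqrt(np.sum(np.square(point[-2:]))))
    else:
        sph[-1] = 2 * np.pi - np.arccos(point[-2] / np.sqrt(np.sum(np.square(point[-2:]))))
    return sph

def convert_to_cartesian(point):
    crts = np.zeros(len(point))
    crts[0] = point[0] * np.cos(point[1])
    s = 1
    for i in range(1, len(point) - 1):
        s = s * np.sin(point[i])
        crts[i] = point[0] * s * np.cos(point[i + 1])
    s = s * np.sin(point[-1])
    crts[-1] = point[0] * s
    return crts

# An arbitrary 5-dimensional "latent"; the round trip should be the identity.
latent = np.array([0.3, -1.2, 0.8, 2.1, -0.5])
roundtrip = convert_to_cartesian(convert_to_spherical(latent))
print(np.allclose(latent, roundtrip))  # True
```

The same round trip underlies `get_great_circle` and `get_arc_between_points`, which interpolate in spherical coordinates and convert each sample back to cartesian.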
diff --git a/spaces/declare-lab/tango/diffusers/PHILOSOPHY.md b/spaces/declare-lab/tango/diffusers/PHILOSOPHY.md
deleted file mode 100644
index fbad5948e17e576d902176202060e8077b4198ec..0000000000000000000000000000000000000000
--- a/spaces/declare-lab/tango/diffusers/PHILOSOPHY.md
+++ /dev/null
@@ -1,110 +0,0 @@
-
-
-# Philosophy
-
-🧨 Diffusers provides **state-of-the-art** pretrained diffusion models across multiple modalities.
-Its purpose is to serve as a **modular toolbox** for both inference and training.
-
-We aim to build a library that stands the test of time, and therefore we take API design very seriously.
-
-In a nutshell, Diffusers is built to be a natural extension of PyTorch. Therefore, most of our design choices are based on [PyTorch's Design Principles](https://pytorch.org/docs/stable/community/design.html#pytorch-design-philosophy). Let's go over the most important ones:
-
-## Usability over Performance
-
-- While Diffusers has many built-in performance-enhancing features (see [Memory and Speed](https://huggingface.co/docs/diffusers/optimization/fp16)), models are always loaded with the highest precision and lowest optimization. Therefore, by default diffusion pipelines are always instantiated on CPU with float32 precision if not otherwise defined by the user. This ensures usability across different platforms and accelerators and means that no complex installations are required to run the library.
-- Diffusers aims to be a **lightweight** package and therefore has very few required dependencies, but many soft dependencies that can improve performance (such as `accelerate`, `safetensors`, `onnx`, etc...). We strive to keep the library as lightweight as possible so that it can be added as a dependency of other packages without much concern.
-- Diffusers prefers simple, self-explanatory code over condensed, magic code. This means that shorthand code syntax such as lambda functions and advanced PyTorch operators is often not desired.
-
-## Simple over easy
-
-As PyTorch states, **explicit is better than implicit** and **simple is better than complex**. This design philosophy is reflected in multiple parts of the library:
-- We follow PyTorch's API with methods like [`DiffusionPipeline.to`](https://huggingface.co/docs/diffusers/main/en/api/diffusion_pipeline#diffusers.DiffusionPipeline.to) to let the user handle device management.
-- Raising concise error messages is preferred to silently correcting erroneous input. Diffusers aims to teach the user, rather than making the library as easy to use as possible.
-- Complex model vs. scheduler logic is exposed instead of magically handled inside. Schedulers/Samplers are separated from diffusion models with minimal dependencies on each other. This forces the user to write the unrolled denoising loop. However, the separation allows for easier debugging and gives the user more control over adapting the denoising process or switching out diffusion models or schedulers.
-- Separately trained components of the diffusion pipeline, *e.g.* the text encoder, the unet, and the variational autoencoder, each have their own model class. This forces the user to handle the interaction between the different model components, and the serialization format separates the model components into different files. However, this allows for easier debugging and customization. Dreambooth or textual inversion training
-is very simple thanks to diffusers' ability to separate individual components of the diffusion pipeline.
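As a hedged sketch of what the "unrolled denoising loop" means in practice: the model only predicts noise, and the scheduler alone decides how the sample is updated. All class names and the update rule below are toy stand-ins invented for this sketch, not real diffusers APIs:

```python
import numpy as np

class ToyModel:
    """Toy stand-in for a diffusion model: it only predicts noise."""
    def __call__(self, sample, t):
        # Pretend to predict the noise present in `sample` at timestep t.
        return 0.1 * sample

class ToyScheduler:
    """Toy stand-in for a scheduler: it alone defines the update rule."""
    timesteps = range(10, 0, -1)
    def step(self, noise_pred, t, sample):
        # Remove a fraction of the predicted noise (illustrative rule).
        return sample - noise_pred / t

model, scheduler = ToyModel(), ToyScheduler()
sample = np.ones(4)  # start from "pure noise"
for t in scheduler.timesteps:      # the user writes the loop explicitly
    noise_pred = model(sample, t)              # model: predict
    sample = scheduler.step(noise_pred, t, sample)  # scheduler: update
print(sample.shape)  # (4,)
```

Because the loop is written out by the user, swapping in a different scheduler (or model) is a one-line change with no hidden coupling.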
-
-## Tweakable, contributor-friendly over abstraction
-
-For large parts of the library, Diffusers adopts an important design principle of the [Transformers library](https://github.com/huggingface/transformers), which is to prefer copy-pasted code over hasty abstractions. This design principle is very opinionated and stands in stark contrast to popular design principles such as [Don't repeat yourself (DRY)](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
-In short, just like Transformers does for modeling files, diffusers prefers to keep an extremely low level of abstraction and very self-contained code for pipelines and schedulers.
-Functions, long code blocks, and even classes can be copied across multiple files which at first can look like a bad, sloppy design choice that makes the library unmaintainable.
-**However**, this design has proven to be extremely successful for Transformers and makes a lot of sense for community-driven, open-source machine learning libraries because:
-- Machine Learning is an extremely fast-moving field in which paradigms, model architectures, and algorithms are changing rapidly, which therefore makes it very difficult to define long-lasting code abstractions.
-- Machine Learning practitioners like to be able to quickly tweak existing code for ideation and research and therefore prefer self-contained code over one that contains many abstractions.
-- Open-source libraries rely on community contributions and therefore must build a library that is easy to contribute to. The more abstract the code, the more dependencies, the harder to read, and the harder to contribute to. Contributors simply stop contributing to very abstract libraries out of fear of breaking vital functionality. If contributing to a library cannot break other fundamental code, not only is it more inviting for potential new contributors, but it is also easier to review and contribute to multiple parts in parallel.
-
-At Hugging Face, we call this design the **single-file policy** which means that almost all of the code of a certain class should be written in a single, self-contained file. To read more about the philosophy, you can have a look
-at [this blog post](https://huggingface.co/blog/transformers-design-philosophy).
-
-In diffusers, we follow this philosophy for both pipelines and schedulers, but only partly for diffusion models. The reason we don't fully follow this design for diffusion models is that almost all diffusion pipelines, such
-as [DDPM](https://huggingface.co/docs/diffusers/v0.12.0/en/api/pipelines/ddpm), [Stable Diffusion](https://huggingface.co/docs/diffusers/v0.12.0/en/api/pipelines/stable_diffusion/overview#stable-diffusion-pipelines), [UnCLIP (Dalle-2)](https://huggingface.co/docs/diffusers/v0.12.0/en/api/pipelines/unclip#overview) and [Imagen](https://imagen.research.google/) all rely on the same diffusion model, the [UNet](https://huggingface.co/docs/diffusers/api/models#diffusers.UNet2DConditionModel).
-
-Great, now you should have generally understood why 🧨 Diffusers is designed the way it is 🤗.
-We try to apply these design principles consistently across the library. Nevertheless, there are some minor exceptions to the philosophy or some unlucky design choices. If you have feedback regarding the design, we would ❤️ to hear it [directly on GitHub](https://github.com/huggingface/diffusers/issues/new?assignees=&labels=&template=feedback.md&title=).
-
-## Design Philosophy in Details
-
-Now, let's look a bit into the nitty-gritty details of the design philosophy. Diffusers essentially consists of three major classes, [pipelines](https://github.com/huggingface/diffusers/tree/main/src/diffusers/pipelines), [models](https://github.com/huggingface/diffusers/tree/main/src/diffusers/models), and [schedulers](https://github.com/huggingface/diffusers/tree/main/src/diffusers/schedulers).
-Let's walk through the design decisions for each class in more detail.
-
-### Pipelines
-
-Pipelines are designed to be easy to use (and therefore do not follow [*Simple over easy*](#simple-over-easy) 100%), are not feature complete, and should loosely be seen as examples of how to use [models](#models) and [schedulers](#schedulers) for inference.
-
-The following design principles are followed:
-- Pipelines follow the single-file policy. All pipelines can be found in individual directories under src/diffusers/pipelines. One pipeline folder corresponds to one diffusion paper/project/release. Multiple pipeline files can be gathered in one pipeline folder, as it’s done for [`src/diffusers/pipelines/stable-diffusion`](https://github.com/huggingface/diffusers/tree/main/src/diffusers/pipelines/stable_diffusion). If pipelines share similar functionality, one can make use of the [#Copied from mechanism](https://github.com/huggingface/diffusers/blob/125d783076e5bd9785beb05367a2d2566843a271/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py#L251).
-- Pipelines all inherit from [`DiffusionPipeline`].
-- Every pipeline consists of different model and scheduler components that are documented in the [`model_index.json` file](https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/model_index.json), are accessible under the same names as attributes of the pipeline, and can be shared between pipelines via the [`DiffusionPipeline.components`](https://huggingface.co/docs/diffusers/main/en/api/diffusion_pipeline#diffusers.DiffusionPipeline.components) function.
-- Every pipeline should be loadable via the [`DiffusionPipeline.from_pretrained`](https://huggingface.co/docs/diffusers/main/en/api/diffusion_pipeline#diffusers.DiffusionPipeline.from_pretrained) function.
-- Pipelines should be used **only** for inference.
-- Pipelines should be very readable, self-explanatory, and easy to tweak.
-- Pipelines should be designed to build on top of each other and be easy to integrate into higher-level APIs.
-- Pipelines are **not** intended to be feature-complete user interfaces. For fully-featured user interfaces, one should rather have a look at [InvokeAI](https://github.com/invoke-ai/InvokeAI), [Diffuzers](https://github.com/abhishekkrthakur/diffuzers), and [lama-cleaner](https://github.com/Sanster/lama-cleaner).
-- Every pipeline should have one and only one way to run it via a `__call__` method. The naming of the `__call__` arguments should be shared across all pipelines.
-- Pipelines should be named after the task they are intended to solve.
-- In almost all cases, novel diffusion pipelines shall be implemented in a new pipeline folder/file.
-
-### Models
-
-Models are designed as configurable toolboxes that are natural extensions of [PyTorch's Module class](https://pytorch.org/docs/stable/generated/torch.nn.Module.html). They only partly follow the **single-file policy**.
-
-The following design principles are followed:
-- Models correspond to **a type of model architecture**. *E.g.* the [`UNet2DConditionModel`] class is used for all UNet variations that expect 2D image inputs and are conditioned on some context.
-- All models can be found in [`src/diffusers/models`](https://github.com/huggingface/diffusers/tree/main/src/diffusers/models) and every model architecture shall be defined in its own file, e.g. [`unet_2d_condition.py`](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/unet_2d_condition.py), [`transformer_2d.py`](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/transformer_2d.py), etc...
-- Models **do not** follow the single-file policy and should make use of smaller model building blocks, such as [`attention.py`](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention.py), [`resnet.py`](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/resnet.py), [`embeddings.py`](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/embeddings.py), etc... **Note**: This is in stark contrast to Transformers' modeling files and shows that models do not really follow the single-file policy.
-- Models intend to expose complexity, just like PyTorch's module does, and give clear error messages.
-- Models all inherit from `ModelMixin` and `ConfigMixin`.
-- Models can be optimized for performance when doing so does not demand major code changes, keeps backward compatibility, and gives a significant memory or compute gain.
-- Models should by default have the highest precision and lowest performance setting.
-- To integrate new model checkpoints whose general architecture can be classified as an architecture that already exists in Diffusers, the existing model architecture shall be adapted to make it work with the new checkpoint. One should only create a new file if the model architecture is fundamentally different.
-- Models should be designed to be easily extendable to future changes. This can be achieved by limiting public function arguments, configuration arguments, and "foreseeing" future changes, *e.g.* it is usually better to add `string` "...type" arguments that can easily be extended to new future types instead of boolean `is_..._type` arguments. Only the minimum amount of changes shall be made to existing architectures to make a new model checkpoint work.
-- The model design is a difficult trade-off between keeping code readable and concise and supporting many model checkpoints. For most parts of the modeling code, classes shall be adapted for new model checkpoints, while there are some exceptions where it is preferred to add new classes to make sure the code is kept concise and readable long-term, such as [UNet blocks](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/unet_2d_blocks.py) and [Attention processors](https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py).
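To make the configuration and extensibility points concrete, here is a minimal sketch, using toy classes rather than the actual `ConfigMixin` implementation, of init arguments being recorded in a config, with a string `attention_type` argument instead of a boolean `is_..._type` flag:

```python
class ToyConfigMixin:
    """Toy stand-in for diffusers' ConfigMixin: records init kwargs."""

    def register_to_config(self, **kwargs):
        self.config = dict(kwargs)


class ToyUNet(ToyConfigMixin):
    """Hypothetical model: every architecture knob is a config argument.

    Note the string `attention_type` argument, which can grow new values
    later, instead of a boolean `is_..._type` flag.
    """

    def __init__(self, sample_size=64, in_channels=4, attention_type="default"):
        self.register_to_config(
            sample_size=sample_size,
            in_channels=in_channels,
            attention_type=attention_type,
        )


unet = ToyUNet(sample_size=32)
```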
-
-### Schedulers
-
-Schedulers are responsible for guiding the denoising process during inference as well as for defining the noise schedule for training. They are designed as individual classes with loadable configuration files and strongly follow the **single-file policy**.
-
-The following design principles are followed:
-- All schedulers are found in [`src/diffusers/schedulers`](https://github.com/huggingface/diffusers/tree/main/src/diffusers/schedulers).
-- Schedulers are **not** allowed to import from large utils files and shall be kept very self-contained.
-- One scheduler python file corresponds to one scheduler algorithm (as might be defined in a paper).
-- If schedulers share similar functionalities, we can make use of the `#Copied from` mechanism.
-- Schedulers all inherit from `SchedulerMixin` and `ConfigMixin`.
-- Schedulers can be easily swapped out with the [`ConfigMixin.from_config`](https://huggingface.co/docs/diffusers/main/en/api/configuration#diffusers.ConfigMixin.from_config) method as explained in detail [here](./using-diffusers/schedulers.mdx).
-- Every scheduler has to have a `set_num_inference_steps` and a `step` function. `set_num_inference_steps(...)` has to be called before every denoising process, *i.e.* before `step(...)` is called.
-- Every scheduler exposes the timesteps to be "looped over" via a `timesteps` attribute, which is an array of timesteps the model will be called upon.
-- The `step(...)` function takes a predicted model output and the "current" sample (`x_t`) and returns the "previous", slightly more denoised sample (`x_t-1`).
-- Given the complexity of diffusion schedulers, the `step` function does not expose all the complexity and can be a bit of a "black box".
-- In almost all cases, novel schedulers shall be implemented in a new scheduling file.
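The scheduler interface described above can be sketched as follows. This is a toy class with a placeholder update rule, not a real denoising algorithm:

```python
class ToyScheduler:
    """Toy scheduler following the interface described above."""

    def set_num_inference_steps(self, num_inference_steps):
        # Expose the timesteps to be "looped over", highest first.
        self.timesteps = list(range(num_inference_steps - 1, -1, -1))

    def step(self, model_output, timestep, sample):
        # Placeholder update: nudge x_t a little toward the model output
        # to produce the slightly more denoised x_{t-1}.
        return sample + 0.1 * (model_output - sample)


scheduler = ToyScheduler()
scheduler.set_num_inference_steps(4)

sample = 1.0
for t in scheduler.timesteps:
    sample = scheduler.step(model_output=0.0, timestep=t, sample=sample)
```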
diff --git a/spaces/declare-lab/tango/diffusers/tests/repo_utils/test_check_dummies.py b/spaces/declare-lab/tango/diffusers/tests/repo_utils/test_check_dummies.py
deleted file mode 100644
index 52a75d7b02e85f70cb347afb1429ca8beb942d21..0000000000000000000000000000000000000000
--- a/spaces/declare-lab/tango/diffusers/tests/repo_utils/test_check_dummies.py
+++ /dev/null
@@ -1,122 +0,0 @@
-# Copyright 2023 The HuggingFace Team. All rights reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import os
-import sys
-import unittest
-
-
-git_repo_path = os.path.abspath(os.path.dirname(os.path.dirname(os.path.dirname(__file__))))
-sys.path.append(os.path.join(git_repo_path, "utils"))
-
-import check_dummies # noqa: E402
-from check_dummies import create_dummy_files, create_dummy_object, find_backend, read_init # noqa: E402
-
-
-# Align TRANSFORMERS_PATH in check_dummies with the current path
-check_dummies.PATH_TO_DIFFUSERS = os.path.join(git_repo_path, "src", "diffusers")
-
-
-class CheckDummiesTester(unittest.TestCase):
- def test_find_backend(self):
- simple_backend = find_backend(" if not is_torch_available():")
- self.assertEqual(simple_backend, "torch")
-
- # backend_with_underscore = find_backend(" if not is_tensorflow_text_available():")
- # self.assertEqual(backend_with_underscore, "tensorflow_text")
-
- double_backend = find_backend(" if not (is_torch_available() and is_transformers_available()):")
- self.assertEqual(double_backend, "torch_and_transformers")
-
- # double_backend_with_underscore = find_backend(
- # " if not (is_sentencepiece_available() and is_tensorflow_text_available()):"
- # )
- # self.assertEqual(double_backend_with_underscore, "sentencepiece_and_tensorflow_text")
-
- triple_backend = find_backend(
- " if not (is_torch_available() and is_transformers_available() and is_onnx_available()):"
- )
- self.assertEqual(triple_backend, "torch_and_transformers_and_onnx")
-
- def test_read_init(self):
- objects = read_init()
- # We don't assert on the exact list of keys to allow for smooth grow of backend-specific objects
- self.assertIn("torch", objects)
- self.assertIn("torch_and_transformers", objects)
- self.assertIn("flax_and_transformers", objects)
- self.assertIn("torch_and_transformers_and_onnx", objects)
-
- # Likewise, we can't assert on the exact content of a key
- self.assertIn("UNet2DModel", objects["torch"])
- self.assertIn("FlaxUNet2DConditionModel", objects["flax"])
- self.assertIn("StableDiffusionPipeline", objects["torch_and_transformers"])
- self.assertIn("FlaxStableDiffusionPipeline", objects["flax_and_transformers"])
- self.assertIn("LMSDiscreteScheduler", objects["torch_and_scipy"])
- self.assertIn("OnnxStableDiffusionPipeline", objects["torch_and_transformers_and_onnx"])
-
- def test_create_dummy_object(self):
- dummy_constant = create_dummy_object("CONSTANT", "'torch'")
- self.assertEqual(dummy_constant, "\nCONSTANT = None\n")
-
- dummy_function = create_dummy_object("function", "'torch'")
- self.assertEqual(
- dummy_function, "\ndef function(*args, **kwargs):\n requires_backends(function, 'torch')\n"
- )
-
- expected_dummy_class = """
-class FakeClass(metaclass=DummyObject):
- _backends = 'torch'
-
- def __init__(self, *args, **kwargs):
- requires_backends(self, 'torch')
-
- @classmethod
- def from_config(cls, *args, **kwargs):
- requires_backends(cls, 'torch')
-
- @classmethod
- def from_pretrained(cls, *args, **kwargs):
- requires_backends(cls, 'torch')
-"""
- dummy_class = create_dummy_object("FakeClass", "'torch'")
- self.assertEqual(dummy_class, expected_dummy_class)
-
- def test_create_dummy_files(self):
- expected_dummy_pytorch_file = """# This file is autogenerated by the command `make fix-copies`, do not edit.
-from ..utils import DummyObject, requires_backends
-
-
-CONSTANT = None
-
-
-def function(*args, **kwargs):
- requires_backends(function, ["torch"])
-
-
-class FakeClass(metaclass=DummyObject):
- _backends = ["torch"]
-
- def __init__(self, *args, **kwargs):
- requires_backends(self, ["torch"])
-
- @classmethod
- def from_config(cls, *args, **kwargs):
- requires_backends(cls, ["torch"])
-
- @classmethod
- def from_pretrained(cls, *args, **kwargs):
- requires_backends(cls, ["torch"])
-"""
- dummy_files = create_dummy_files({"torch": ["CONSTANT", "function", "FakeClass"]})
- self.assertEqual(dummy_files["torch"], expected_dummy_pytorch_file)
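The tests above exercise `find_backend` from `utils/check_dummies.py`; a rough sketch of what such a helper might do (the real implementation may differ) is:

```python
import re

# Extract backend names from guard lines such as
# " if not (is_torch_available() and is_transformers_available()):".
_BACKEND_RE = re.compile(r"is_([a-z_]+)_available\(\)")


def find_backend_sketch(line):
    """Return the joined backend name for a guard line, else None."""
    if "if not" not in line:
        return None
    backends = _BACKEND_RE.findall(line)
    return "_and_".join(backends) if backends else None
```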
diff --git a/spaces/deepusus/tts/README.md b/spaces/deepusus/tts/README.md
deleted file mode 100644
index b6b2f87908f373800949fe31221d396b714daf5b..0000000000000000000000000000000000000000
--- a/spaces/deepusus/tts/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: BanglaTTS
-emoji: 📈
-colorFrom: green
-colorTo: indigo
-sdk: gradio
-sdk_version: 3.49.0
-app_file: app.py
-pinned: false
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/diacanFperku/AutoGPT/HD Online Player (Korean Spellbound Full Movie Tagalog).md b/spaces/diacanFperku/AutoGPT/HD Online Player (Korean Spellbound Full Movie Tagalog).md
deleted file mode 100644
index 48ea51a68e19612904b73effd991db2c14849de4..0000000000000000000000000000000000000000
--- a/spaces/diacanFperku/AutoGPT/HD Online Player (Korean Spellbound Full Movie Tagalog).md
+++ /dev/null
@@ -1,9 +0,0 @@
-
-We recommend renting at least one of these Korean blockbuster movies if you haven't already seen them. If you have seen some of the movies listed here, be sure to check out the box-office numbers. Theater listings and showtimes are also available online if you're interested.
-HD Online Player (Korean Spellbound Full Movie Tagalog)
Download File ✅ https://gohhs.com/2uFSYh
-There are a number of radio shows that talk about Philippine TV programs. No matter if you're into action-packed thrillers with romantic subplots or full-on cheesy rom-coms, this list is sure to have something that you will enjoy. So bust out the popcorn and dive into these Korean movies that are perfect for this summer!
-In Spellbound, Laurence transforms into a scientific research assistant, assisting a young and naive psychologist conducting experiments on amnesia. The doctor and his assistant are having an affair, but something is mysteriously missing in their lives. The couple increasingly begin to appear in each other's dreams.
-The film is full of classical Hitchcockian nooks and crannies. There are quite a few scenes where the film cuts between two action scenes (the first one is thrilling, the second one is full of tension), as well as the various Hitchcockian effects such as mirrors, off-screen angles, shadows and the use of light. One such scene is when a young Jo enters the apartment of an old woman, whose face we can see through a mirror behind her. We then see the same old woman using the mirror to check that the new tenant is not taking her jewelry.
- 899543212b
-
-
\ No newline at end of file
diff --git a/spaces/diacanFperku/AutoGPT/Ip Man 2 English Dubbed 720p 180.md b/spaces/diacanFperku/AutoGPT/Ip Man 2 English Dubbed 720p 180.md
deleted file mode 100644
index 7bdd80c8e4c0b4c677c91a9180d5a4681d873ac5..0000000000000000000000000000000000000000
--- a/spaces/diacanFperku/AutoGPT/Ip Man 2 English Dubbed 720p 180.md
+++ /dev/null
@@ -1,6 +0,0 @@
-Ip Man 2 English Dubbed 720p 180
Download Zip ✑ https://gohhs.com/2uFTYJ
-
-The story of martial-arts master Ip Man, the man who trained Bruce Lee. ... after this announcement (most famously Ip Man (2008) and Ip Man 2 (2010)) were all ... 4d29de3e1b
-
-
-
diff --git a/spaces/dotmet/chatgpt_webui/style.css b/spaces/dotmet/chatgpt_webui/style.css
deleted file mode 100644
index 7f4f938bd243cf6708ee58b9dd6b935467bbaeeb..0000000000000000000000000000000000000000
--- a/spaces/dotmet/chatgpt_webui/style.css
+++ /dev/null
@@ -1,10 +0,0 @@
-.gradio-container {
- max-width: 100%;
- max-height: 100%;
-}
-
-[id$=chatbot] > div{
- border: 0;
- height: 70vh;
- overflow-y: auto;
-}
diff --git a/spaces/duycse1603/math2tex/ScanSSD/test.py b/spaces/duycse1603/math2tex/ScanSSD/test.py
deleted file mode 100644
index 9c8feb4c542166e46f047c10d21affc9e08f7903..0000000000000000000000000000000000000000
--- a/spaces/duycse1603/math2tex/ScanSSD/test.py
+++ /dev/null
@@ -1,202 +0,0 @@
-'''
-This file contains functions to test and save the results
-'''
-from __future__ import print_function
-import os
-import argparse
-import torch.backends.cudnn as cudnn
-from ssd import build_ssd
-from utils import draw_boxes, helpers, save_boxes
-import logging
-import time
-import datetime
-from torch.autograd import Variable
-from torchvision import datasets, transforms
-from torch.utils.data import Dataset, DataLoader
-from data import *
-import shutil
-import torch.nn as nn
-
-def test_net_batch(args, net, gpu_id, dataset, transform, thresh):
- '''
- Batch testing
- '''
- num_images = len(dataset)
-
- if args.limit != -1:
- num_images = args.limit
-
- data_loader = DataLoader(dataset, args.batch_size,
- num_workers=args.num_workers,
- shuffle=False, collate_fn=detection_collate,
- pin_memory=True)
-
- total = len(dataset)
-
- logging.debug('Test dataset size is {}'.format(total))
-
- done = 0
-
- for batch_idx, (images, targets, metadata) in enumerate(data_loader):
-
- done = done + len(images)
- logging.debug('processing {}/{}'.format(done, total))
-
- if args.cuda:
- images = images.cuda()
- targets = [ann.cuda() for ann in targets]
- else:
- images = Variable(images)
- targets = [Variable(ann, volatile=True) for ann in targets]
- # targets = [ann for ann in targets]
-
- y, debug_boxes, debug_scores = net(images) # forward pass
- detections = y.data
-
- k = 0
- for img, meta in zip(images, metadata):
-
- img_id = meta[0]
- x_l = meta[1]
- y_l = meta[2]
-
- img = img.permute(1,2,0)
- # scale each detection back up to the image
- scale = torch.Tensor([img.shape[1], img.shape[0],
- img.shape[1], img.shape[0]])
-
- recognized_boxes = []
- recognized_scores = []
-
- # [1,2,200,5]
- # we only care about math class
- # hence select detections[image_id, class, detection_id, detection_score]
- # class=1 for math
- i = 1
- j = 0
-
- while j < detections.size(2) and detections[k, i, j, 0] >= thresh: # TODO it was 0.6
-
- score = detections[k, i, j, 0]
- pt = (detections[k, i, j, 1:] * args.window).cpu().numpy()
- coords = (pt[0] + x_l, pt[1] + y_l, pt[2] + x_l, pt[3] + y_l)
- #coords = (pt[0], pt[1], pt[2], pt[3])
- recognized_boxes.append(coords)
- recognized_scores.append(score.cpu().numpy())
-
- j += 1
-
- save_boxes(args, recognized_boxes, recognized_scores, img_id)
- k = k + 1
-
- if args.verbose:
- draw_boxes(args, img.cpu().numpy(), recognized_boxes, recognized_scores,
- debug_boxes, debug_scores, scale, img_id)
-
-def test_gtdb(args):
-
- gpu_id = 0
- if args.cuda:
- gpu_id = helpers.get_freer_gpu()
- torch.cuda.set_device(gpu_id)
-
- # load net
- num_classes = 2 # +1 background
-
- # initialize SSD
- net = build_ssd(args, 'test', exp_cfg[args.cfg], gpu_id, args.model_type, num_classes)
-
- logging.debug(net)
- net.to(gpu_id)
- net = nn.DataParallel(net)
- net.load_state_dict(torch.load(args.trained_model, map_location={'cuda:1':'cuda:0'}))
- net.eval()
- logging.debug('Finished loading model!')
-
- dataset = GTDBDetection(args, args.test_data, split='test',
- transform=BaseTransform(args.model_type, (246,246,246)),
- target_transform=GTDBAnnotationTransform())
-
- if args.cuda:
- net = net.to(gpu_id)
- cudnn.benchmark = True
-
- # evaluation
- test_net_batch(args, net, gpu_id, dataset,
- BaseTransform(args.model_type, (246,246,246)),
- thresh=args.visual_threshold)
-
-def parse_args():
- parser = argparse.ArgumentParser(description='Single Shot MultiBox Detection')
- parser.add_argument('--trained_model', default='weights/ssd300_GTDB_990.pth',
- type=str, help='Trained state_dict file path to open')
- parser.add_argument('--save_folder', default='eval/', type=str,
- help='Dir to save results')
- parser.add_argument('--visual_threshold', default=0.6, type=float,
- help='Final confidence threshold')
- parser.add_argument('--cuda', default=False, type=bool,
- help='Use cuda to train model')
- parser.add_argument('--dataset_root', default=GTDB_ROOT, help='Location of VOC root directory')
- parser.add_argument('--test_data', default="testing_data", help='testing data file')
- parser.add_argument('--verbose', default=False, type=bool, help='plot output')
- parser.add_argument('--suffix', default="_10", type=str, help='suffix of directory of images for testing')
- parser.add_argument('--exp_name', default="SSD", help='Name of the experiment. Will be used to generate output')
- parser.add_argument('--model_type', default=300, type=int,
- help='Type of ssd model, ssd300 or ssd512')
- parser.add_argument('--use_char_info', default=False, type=bool, help='Whether or not to use char info')
- parser.add_argument('--limit', default=-1, type=int, help='limit on number of test examples')
- parser.add_argument('--cfg', default="gtdb", type=str,
- help='Type of network: either gtdb or math_gtdb_512')
- parser.add_argument('--batch_size', default=16, type=int,
- help='Batch size for training')
- parser.add_argument('--num_workers', default=4, type=int,
- help='Number of workers used in data loading')
- parser.add_argument('--kernel', default="3 3", type=int, nargs='+',
- help='Kernel size for feature layers: 3 3 or 1 5')
- parser.add_argument('--padding', default="1 1", type=int, nargs='+',
- help='Padding for feature layers: 1 1 or 0 2')
- parser.add_argument('--neg_mining', default=True, type=bool,
- help='Whether or not to use hard negative mining with ratio 1:3')
- parser.add_argument('--log_dir', default="logs", type=str,
- help='dir to save the logs')
- parser.add_argument('--stride', default=0.1, type=float,
- help='Stride to use for sliding window')
- parser.add_argument('--window', default=1200, type=int,
- help='Sliding window size')
-
- parser.add_argument('-f', default=None, type=str, help="Dummy arg so we can load in Jupyter Notebooks")
-
- args = parser.parse_args()
-
- if args.cuda and torch.cuda.is_available():
- torch.set_default_tensor_type('torch.cuda.FloatTensor')
- else:
- torch.set_default_tensor_type('torch.FloatTensor')
-
- if not os.path.exists(args.save_folder):
- os.mkdir(args.save_folder)
-
- if os.path.exists(os.path.join(args.save_folder, args.exp_name)):
- shutil.rmtree(os.path.join(args.save_folder, args.exp_name))
-
- return args
-
-if __name__ == '__main__':
-
- args = parse_args()
- start = time.time()
- try:
- filepath=os.path.join(args.log_dir, args.exp_name + "_" + str(round(time.time())) + ".log")
- print('Logging to ' + filepath)
- logging.basicConfig(filename=filepath,
- filemode='w', format='%(process)d - %(asctime)s - %(message)s',
- datefmt='%d-%b-%y %H:%M:%S', level=logging.DEBUG)
-
- test_gtdb(args)
- except Exception as e:
- logging.error("Exception occurred", exc_info=True)
-
- end = time.time()
-    logging.debug('Total time taken ' + str(datetime.timedelta(seconds=end-start)))
- logging.debug("Testing done!")
-
diff --git a/spaces/enesbol/case_dif/modules/conv_modules.py b/spaces/enesbol/case_dif/modules/conv_modules.py
deleted file mode 100644
index 9b43a89622b3bbe60aa0ab41ae18ecb2fce013f5..0000000000000000000000000000000000000000
--- a/spaces/enesbol/case_dif/modules/conv_modules.py
+++ /dev/null
@@ -1,56 +0,0 @@
-"""
-author: Min Seok Lee and Wooseok Shin
-"""
-import torch.nn as nn
-
-
-class BasicConv2d(nn.Module):
- def __init__(self, in_channel, out_channel, kernel_size, stride=(1, 1), padding=(0, 0), dilation=(1, 1)):
- super(BasicConv2d, self).__init__()
- self.conv = nn.Conv2d(in_channel, out_channel, kernel_size=kernel_size, stride=stride, padding=padding,
- dilation=dilation, bias=False)
- self.bn = nn.BatchNorm2d(out_channel)
- self.selu = nn.SELU()
-
- def forward(self, x):
- x = self.conv(x)
- x = self.bn(x)
- x = self.selu(x)
-
- return x
-
-
-class DWConv(nn.Module):
- def __init__(self, in_channel, out_channel, kernel, dilation, padding):
- super(DWConv, self).__init__()
- self.out_channel = out_channel
- self.DWConv = nn.Conv2d(in_channel, out_channel, kernel_size=kernel, padding=padding, groups=in_channel,
- dilation=dilation, bias=False)
- self.bn = nn.BatchNorm2d(out_channel)
- self.selu = nn.SELU()
-
- def forward(self, x):
- x = self.DWConv(x)
- out = self.selu(self.bn(x))
-
- return out
-
-
-class DWSConv(nn.Module):
- def __init__(self, in_channel, out_channel, kernel, padding, kernels_per_layer):
- super(DWSConv, self).__init__()
- self.out_channel = out_channel
- self.DWConv = nn.Conv2d(in_channel, in_channel * kernels_per_layer, kernel_size=kernel, padding=padding,
- groups=in_channel, bias=False)
- self.bn = nn.BatchNorm2d(in_channel * kernels_per_layer)
- self.selu = nn.SELU()
- self.PWConv = nn.Conv2d(in_channel * kernels_per_layer, out_channel, kernel_size=1, bias=False)
- self.bn2 = nn.BatchNorm2d(out_channel)
-
- def forward(self, x):
- x = self.DWConv(x)
- x = self.selu(self.bn(x))
- out = self.PWConv(x)
- out = self.selu(self.bn2(out))
-
- return out
\ No newline at end of file
diff --git a/spaces/enzostvs/stable-diffusion-tpu/components/modal/modal.tsx b/spaces/enzostvs/stable-diffusion-tpu/components/modal/modal.tsx
deleted file mode 100644
index 0cec31c31e7dada180e58032677efdcd88316bb0..0000000000000000000000000000000000000000
--- a/spaces/enzostvs/stable-diffusion-tpu/components/modal/modal.tsx
+++ /dev/null
@@ -1,97 +0,0 @@
-import { useMemo } from "react";
-import { motion } from "framer-motion";
-import Image from "next/image";
-import { BsFillTrashFill } from "react-icons/bs";
-import { AiFillCheckCircle } from "react-icons/ai";
-
-import { useCollection } from "./useCollection";
-import { Button } from "../button";
-import { useUser } from "@/utils/useUser";
-
-interface Props {
- id: string;
- onClose: () => void;
-}
-
-const dropIn = {
- hidden: {
- opacity: 0,
- },
- visible: {
- y: "0",
- opacity: 1,
- transition: {
- duration: 0.1,
- type: "spring",
- damping: 25,
- stiffness: 500,
- },
- },
- exit: {
- opacity: 0,
- },
-};
-
-export const Modal: React.FC<Props> = ({ id, onClose }) => {
- const { collection, updateVisibility, remove } = useCollection(id);
- const { user } = useUser();
-
- const formatDate = useMemo(() => {
- if (!collection) return;
- const date = new Date(collection?.createdAt);
- return date.toLocaleDateString();
- }, [collection?.createdAt]);
-
-  return (
-    <motion.div
-      onClick={(e) => e.stopPropagation()}
-      className="max-w-2xl h-auto w-full z-[1] rounded-3xl overflow-hidden relative flex items-center justify-center flex-col gap-4 bg-white/30 backdrop-blur-sm px-2 pb-2 pt-2"
-      variants={dropIn}
-      initial="hidden"
-      animate="visible"
-      exit="exit"
-    >
-      {user?.is_admin && (
-        <div>
-          {!collection?.is_visible && (
-            <Button onClick={updateVisibility}>
-              <AiFillCheckCircle />
-            </Button>
-          )}
-          <Button onClick={remove}>
-            <BsFillTrashFill />
-          </Button>
-        </div>
-      )}
-      <p>{formatDate}</p>
-      <p>{collection?.prompt}</p>
-    </motion.div>
-  );
-};
diff --git a/spaces/everythingfades/Math-Stats-AP/app.py b/spaces/everythingfades/Math-Stats-AP/app.py
deleted file mode 100644
index c0adc9a08b3d3d8e61874f6e6f371e4f6c6a5499..0000000000000000000000000000000000000000
--- a/spaces/everythingfades/Math-Stats-AP/app.py
+++ /dev/null
@@ -1,426 +0,0 @@
-import gradio as gr
-import os
-os.system("pip install scipy")
-def birthdayCheck(people):
-    p = 1
- for i in range(people):
- p = p*(365-i)/365
- return 1-p
-import random
-import matplotlib.pyplot as plt
-def alter(x,y):
- return x, y+3
-def getSame(birthdays):
- cnt = 0
- x = list()
- y = list()
- z = list()
- for i in range(len(birthdays)):
- for j in range(i+1,len(birthdays)):
- if birthdays[i] == birthdays[j]:
- cnt += 1
- x.append(i)
- y.append(j)
- z.append(birthdays[i])
- return cnt,x,y,z
-
-def simulation(people):
- birthdays = list()
- peoples = list()
- for i in range(people):
- birthdays.append(random.randint(1,365))
- peoples.append(i)
- return birthdays,peoples
-def tran(z):
- a = [1,31,59,90,120,151,181,212,243,273,304,334]
- str = ''
- for j in z:
- for i in range(len(a)):
- if j < a[i]:
- str += "{}/{}".format(i,j-a[i-1]+1)
- str += " | "
- break
- return str if str != '' else "None"
-#birthdays,peoples = simulation(30)
-
-def getSame_gradio(people):
- birthdays,peoples = simulation(people)
- getSame(birthdays)
- cnt = 0
- x = list()
- y = list()
- z = list()
- for i in range(len(birthdays)):
- for j in range(i+1,len(birthdays)):
- if birthdays[i] == birthdays[j]:
- cnt += 1
- x.append(i)
- y.append(j)
- z.append(birthdays[i])
- temp = ""
- for i in range(len(x)):
- temp += ("No.{}and No.{} have same birthdays".format(x[i]+1,y[i]+1))
- import matplotlib.pyplot as plt
- import matplotlib as mpl
- from matplotlib.patches import Ellipse
- fig, ax = plt.subplots(figsize=(16,10), dpi= 80)
- ax.hlines(y=[31,59,90,120,151,181,212,243,273,304,334,365], xmin=0, xmax=len(peoples)+1, color='gray', alpha=0.7, linewidth=1, linestyles='dashdot')
- ax.scatter(y=birthdays, x=peoples, s=75, color='black', alpha=0.7)
-
- mpl.rcParams['font.sans-serif']=['SimHei']
-
- # Title, Label, Ticks and Ylim
- ax.set_title('birthdays:{}'.format(len(peoples)), fontdict={'size':22})
- ax.set_xlabel('people No.')
- ax.set_ylabel('date')
-    ax.set_yticks([31,59,90,120,151,181,212,243,273,304,334,365])
-    ax.set_yticklabels(['Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec'])
- for i in range(len(birthdays)):
- textx,texty = alter(peoples[i],birthdays[i])
- ax.text(textx,texty,birthdays[i])
- if cnt != 0:
- for i in range(len(x)):
- plt.gcf().gca().add_artist(Ellipse(xy = (x[i],z[i]), width = 1, height = 17, fill = False, color = "red"))
- plt.gcf().gca().add_artist(Ellipse(xy = (y[i],z[i]), width = 1, height = 17, fill = False, color = "red"))
- return tran(birthdays), "we have {} pairs of same birthdays".format(cnt), "the repeated date is" + tran(z), temp, plt.gcf()
-def multi_simulation(times,people):
- cnt_list = list()
- cnt_check = list()
- for i in range(times):
- cnt = one_simulation(people)
- cnt_list.append(cnt)
- cnt_check.append(0 if cnt ==0 else 1)
- return cnt_list, cnt_check
-#demo.launch()
-
-def one_simulation(people):
- birthdays,peoples = simulation(people)
- cnt,x,y,z = getSame(birthdays)
- return cnt
-def draw_one_sim_one_zero_gradio(times, people):
- fig, ax = plt.subplots(figsize=(16,10), dpi= 80)
- cnt_list, cnt_check = multi_simulation(times,people)
- all_colors = list(plt.cm.colors.cnames.keys())
- random.seed(100)
- c = random.choices(all_colors, k=len(list(set(cnt_list))))
- # Plot Bars
- ax.bar(list(set(cnt_check)), find_class(cnt_check), color=c, width=.5)
-    #for i in range(len(list(set(cnt_check)))):
-    #    plt.annotate(float(find_class(cnt_check)[i]),
-    #                 xy=(list(set(cnt_check))[i], find_class(cnt_check)[i] + 4),  # arrow tip position
-    #                 xytext=(list(set(cnt_check))[i], find_class(cnt_check)[i] + 4),  # text start position
-    #                 # arrow style settings
-    #                 arrowprops=dict(facecolor='#74C476',
-    #                                 shrink=1,  # arrow shrink ratio
-    #                                 alpha=0.6,
-    #                                 width=7,  # arrow body width
-    #                                 headwidth=40,  # arrow head width
-    #                                 hatch='--',  # fill pattern
-    #                                 frac=0.8,  # body-to-head ratio
-    #                                 # any other matplotlib.patches.Polygon parameter also applies
-    #                                 ),
-    #                 )
- return fig,float(find_class(cnt_check)[0]),float(find_class(cnt_check)[1]),float(find_class(cnt_check)[1]/times),birthdayCheck(people)
-import matplotlib
-import scipy.stats as stats
-import numpy as np
-def multi_multi_simulation_gradio(people, times_for_p, iter_times):
- fig, ax = plt.subplots(figsize=(16,10), dpi= 80)
- results_30,a,b = multi_fixed_simulation(people, times_for_p, iter_times)
- mu = np.mean(results_30)
- std = np.std(results_30)
- Bin = 80
- n, bins, patches = ax.hist(results_30, bins = Bin)
- y = stats.norm.pdf(bins,mu,std)
- #y = matplotlib.mlab.normpdf(bins,mu,std)
- print(mu,std)
- plt.xlim((birthdayCheck(people)-0.08, birthdayCheck(people)+0.08))
- ax.plot(bins,y,'r--')
- return fig
-
-def find_class(list1):
- classes = list()
- for i in list(set(list1)):
- temp_cnt= 0
- for j in list1:
- if i==j:
- temp_cnt += 1
- classes.append(temp_cnt)
- if len(classes) == 1:
- classes.append(classes[0])
- classes[0] = 0
- return classes
-def multi_fixed_simulation(people,times, iter_times):
- results = list()
- for i in range(iter_times):
- results.append(get_ratio(people,times))
- if i % 100 == 0:
- print(i)
- return results,people, times
-def get_ratio(people, times):
- cnt_list1, cnt_check1 = multi_simulation(times,people)
- return(find_class(cnt_check1)[1]/len(cnt_check1))
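`get_ratio` above is a Monte Carlo estimate of the match probability: run the experiment `times` times and take the fraction of runs that contain at least one shared birthday. The same idea in a self-contained sketch (the function name and seed are illustrative, not from the original file):

```python
import random

def estimate_match_probability(people, trials, seed=0):
    # Fraction of trials in which at least two of `people` random
    # birthdays (uniform over 365 days) coincide.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        birthdays = [rng.randrange(365) for _ in range(people)]
        if len(set(birthdays)) < people:  # a duplicate collapses the set
            hits += 1
    return hits / trials

est = estimate_match_probability(23, 20000)
print(0.45 < est < 0.56)  # True: within Monte Carlo error of the exact ~0.507
```

With 20,000 trials the standard error is about 0.0035, so the estimate lands very close to the closed-form value.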
-import math
-import numpy as np
-import scipy
-def random_gen_data(population,p):
- results = np.zeros(int(population))
- for i in range(results.size):
- if np.random.rand(1)[0] <= p:
- results[i] = 1
- return results
-def get_ratio1(results):
- #print(results.size,results.sum(),"#########")
- return results.sum()/results.size
-def simulation(p,SRS_size,population,times):
- results = np.zeros(times)
- for i in range(times):
- sample1 = np.array(random.sample(list(population),int(SRS_size)))
- results[i] = get_ratio1(sample1)
- return results#,population
-def get_CI(p,alpha,sample_size):#size = (SRS_size)
- sigma = math.sqrt(p*(1-p)/sample_size)
- z = scipy.stats.norm.ppf(1-(1-alpha)/2)
- return(sigma*z)
-def get_CI_new(alpha,sample):#size = (SRS_size)
- p_hat = sample
- sigma = np.sqrt(p_hat*(1-p_hat)/sample.shape[0])
- z = scipy.stats.norm.ppf(1-(1-alpha)/2)
- return(sigma*z)
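Both `get_CI` and `get_CI_new` return the normal-approximation margin of error for a sample proportion, z·sqrt(p·(1−p)/n). A minimal standalone sketch of the same formula, using the standard library's `NormalDist` in place of `scipy.stats.norm.ppf`, and treating `alpha` as the confidence level (which is what the `ppf(1-(1-alpha)/2)` call above implies, despite the "significance level" slider label):

```python
import math
from statistics import NormalDist

def margin_of_error(p, alpha, n):
    # Normal-approximation margin of error for a sample proportion:
    # z * sqrt(p * (1 - p) / n), where z is the two-sided critical value.
    # NormalDist().inv_cdf(q) is the stdlib equivalent of scipy.stats.norm.ppf(q).
    sigma = math.sqrt(p * (1 - p) / n)
    z = NormalDist().inv_cdf(1 - (1 - alpha) / 2)
    return sigma * z

# p = 0.5, 95% confidence, n = 100: z ~ 1.96, sigma = 0.05, ME ~ 0.098
print(round(margin_of_error(0.5, 0.95, 100), 3))  # 0.098
```

Quadrupling the sample size halves the margin of error, which the interval plots in `draw` make visible.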
-def draw(p,alpha,SRS_size,population_size,times):
- population = random_gen_data(population_size,p)
- #print(population)
- sample = simulation(p,SRS_size,population,times)
- CI = get_CI_new(alpha,sample)
- #print(CI)
- all_colors = list(plt.cm.colors.cnames.keys())
- random.seed(42)
- c = random.choices(all_colors, k=population_size)
-
-
- fig,ax=plt.subplot_mosaic([['A','A','A'],
- ['C','C','C'],
- ['B','B','B'],
- ['B','B','B'],
- ['B','B','B']],figsize=(16, 10), constrained_layout=True)
- ax['A'].set_xlim(0, 1)
- #ax['B'].set_xlim(sample.min()-CI, sample.max()+CI)
- ax['B'].set_xlim(0,1)
- #ax['C'].set_xlim(sample.min()-CI, sample.max()+CI)
- ax['C'].set_xlim(0,1)
- mu = population.mean()
- std = population.std()
- n, bins, patches = ax['A'].hist(population,bins = 40)
- y = normfun(bins,mu,std)
- ax['A'].vlines(p,ax['A'].get_ylim()[0],ax['A'].get_ylim()[1],linestyles ="solid", colors ="k")
- ax['C'].plot(bins,y,'r--',label = "population dis")
-
- mu = sample.mean()
- std = sample.std()
- n, bins, patches = ax['C'].hist(sample,bins = 40)
- y = normfun(bins,mu,std)
- ax['C'].plot(bins,y,':',label = "sample dis")
- ax['C'].vlines(p,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="solid", colors ="k")
- ax['C'].vlines(p-CI,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['C'].vlines(p+CI,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['C'].legend()
-
- ax['B'].hlines(y = np.arange(0,times),xmin = sample-CI, xmax = sample+CI,color = "gray")
- ax['C'].vlines(p,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="solid", colors ="k")
-
- ax['B'].scatter(sample,np.arange(0,times),c = "blue")
- out = np.zeros(sample.shape[0])
- ax['B'].vlines(p,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="solid", colors ="k")
- ax['B'].vlines(p-CI,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['B'].vlines(p+CI,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="dashed", colors ="k")
- check = (np.abs(sample - p)>CI)*sample
- ax['B'].scatter(check,np.arange(0,times),c = "red")
- ax['B'].scatter(np.zeros(times),np.arange(0,times),c = "white")
- #print(check)
- return fig
-
-import math
-import numpy as np
-import scipy
-def random_gen_data_mean(population,mu,sigma):
-    results = np.random.normal(mu, sigma, population)  # mean, std dev, count
- return results
-def get_ratio1_mean(results):
- #print(results.size,results.sum(),"#########")
- return results.sum()/results.size
-def simulation_mean(SRS_size,population,times):
- results = np.zeros(times)
- for i in range(times):
- sample1 = np.array(random.sample(population.tolist(),int(SRS_size)))
- results[i] = get_ratio1_mean(sample1)
- return results#,population
-def get_CI_mean(sample,alpha):#size = (SRS_size)
-    #print(sample)
-    sigma = sample.std()
-    sigma1 = sigma/math.sqrt(sample.shape[0])  # standard error of the mean
-    z = scipy.stats.norm.ppf(1-(1-alpha)/2)
-    return(sigma1*z)
-def get_CI_mean_new(sample,alpha):#size = (SRS_size)
-    #print(sample)
-    sigma = sample.std()
-    sigma1 = sigma/math.sqrt(sample.shape[0])  # standard error of the mean
-    z = scipy.stats.norm.ppf(1-(1-alpha)/2)
-    return(sigma1*z)
-def normfun(x,mu,std):
- pdf = np.exp(-((x - mu)**2)/(2*std**2)) / (std * np.sqrt(2*np.pi))
- return pdf
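`normfun` above is the Gaussian density written out by hand; it can be cross-checked against the standard library's `NormalDist.pdf` (a verification sketch only, not part of the app):

```python
import numpy as np
from statistics import NormalDist

def normfun(x, mu, std):
    # Gaussian probability density, vectorized over x (same formula as the app).
    return np.exp(-((x - mu) ** 2) / (2 * std ** 2)) / (std * np.sqrt(2 * np.pi))

xs = np.linspace(-3.0, 3.0, 7)
ref = np.array([NormalDist(0.0, 1.0).pdf(float(v)) for v in xs])
print(np.allclose(normfun(xs, 0.0, 1.0), ref))  # True
```

At the peak, normfun(0, 0, 1) equals 1/sqrt(2*pi) ≈ 0.3989, the familiar standard-normal maximum.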
-def normfun_d(x):
-    # Derivative of the normal density, using x's sample mean and std.
-    u = x - x.mean()
-    std = x.std()
-    return -u * np.exp(-u*u/(2*std*std)) / (std**3 * np.sqrt(2*np.pi))
-#
-import matplotlib.pyplot as plt
-import random
-import math
-import numpy as np
-import scipy
-import gradio as gr
-from scipy.stats import norm
-def draw_mean(mu,sigma,alpha,SRS_size,population_size,times):
- population = random_gen_data_mean(population_size,mu, sigma)
- #print(population)
- sample = simulation_mean(SRS_size,population,times)
- CI = get_CI_mean(sample,alpha)
- #print(CI)
- #all_colors = list(plt.cm.colors.cnames.keys())
- random.seed(42)
- #c = random.choices(all_colors, k=population_size)
-
-
- fig,ax=plt.subplot_mosaic([['A','A','A'],
- ['C','C','C'],
- ['B','B','B'],
- ['B','B','B'],
- ['B','B','B']],figsize=(16, 10), constrained_layout=True)
- #ax['C'].set_ylim(0,0.5)
- ax['A'].set_xlim(population.min()-CI, population.max()+CI)
- ax['B'].set_xlim(population.min()-CI, population.max()+CI)
- ax['C'].set_xlim(population.min()-CI, population.max()+CI)
- mu = population.mean()
- std = population.std()
- n, bins, patches = ax['A'].hist(population,bins = 40)#,density = True)
- y = normfun(bins,mu,std)
- ax['A'].vlines(mu,ax['A'].get_ylim()[0],ax['A'].get_ylim()[1],linestyles ="solid", colors ="k")
- #ax['A'].vlines(mu-CI,ax['A'].get_ylim()[0],ax['A'].get_ylim()[1],linestyles ="dashed", colors ="k")
- #ax['A'].vlines(mu+CI,ax['A'].get_ylim()[0],ax['A'].get_ylim()[1],linestyles ="dashed", colors ="k")
-
- ax['C'].vlines(mu,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="solid", colors ="k")
- ax['C'].vlines(mu-CI,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['C'].vlines(mu+CI,ax['C'].get_ylim()[0],ax['C'].get_ylim()[1],linestyles ="dashed", colors ="k")
-
- ax['C'].plot(bins,y,'r--',label = "population dis",color = "red")
-
- mu = sample.mean()
- std = sample.std()
-
-
- n, bins, patches = ax['C'].hist(sample,bins = 40)#,density = True,weights = weights)
- y = normfun(bins,mu,std)
- ax['C'].plot(bins,y,'-',label = "sample dis",color = "red")
- ax['C'].legend()
-
- ax['B'].hlines(y = np.arange(0,times),xmin = sample-CI, xmax = sample+CI,color = "gray")
- ax['B'].vlines(mu,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="solid", colors ="k")
- ax['B'].vlines(mu-CI,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['B'].vlines(mu+CI,ax['B'].get_ylim()[0],ax['B'].get_ylim()[1],linestyles ="dashed", colors ="k")
- ax['B'].scatter(sample,np.arange(0,times))
-
- check = (np.abs(sample - mu)>CI)*sample
- ax['B'].scatter(check,np.arange(0,times),c = "red")
- ax['B'].scatter(np.zeros(times),np.arange(0,times),c = "white")
- return fig
-
-
-with gr.Blocks() as demo:
-    with gr.Tab(label='Birthday paradox'):
- with gr.Accordion(label="intro", open = True):
-            gr.Markdown("The birthday paradox asks how likely it is that, among n people, at least two share a birthday out of the 365 days in a year")
-            gr.Markdown("If we have n people and each birthday is chosen uniformly at random, the probability that at least two people share a birthday is:")
-            gr.Markdown("1 - (365 * 364 * 363 * ... * (365 - n + 1)) / 365^n")
-            gr.Markdown("since the first person's birthday can fall on any day; for the second to avoid a match there are 364 days remaining, and so on")
-            gr.Markdown("subtracting this no-match probability from 1 gives the final answer")
- birthday_intro_input1 = gr.Slider(2,1000,label = "number of people")
-            birthday_intro_output1 = gr.Textbox(label = "the probability to find same birthdays among the given value of people is")
- birthday_intro_button = gr.Button()
- with gr.Accordion(label="single simulation", open = False):
-            gr.Markdown("this simulation shows one possible outcome for a fixed number of people (set by the slider)")
-            gr.Markdown("it is tagged 'single simulation' because the long-run probability over many combined simulations is covered in the sections below")
- birthday_single_simulation_input1 = gr.Slider(0,100,label = "number of people")
- with gr.Row():
-                birthday_single_simulation_output1 = gr.Textbox(label = "birthdays")
- birthday_single_simulation_output2 = gr.Textbox()
- birthday_single_simulation_output3 = gr.Textbox()
- birthday_single_simulation_output4 = gr.Textbox()
- birthday_single_simulation_output5 = gr.Plot()
- birthday_single_simulation_button = gr.Button()
- with gr.Accordion(label = "fix number of people, simulate n times to find approximate probability",open = False):
- with gr.Row():
- one_zero_input2 = gr.Slider(0,100,label = "number of people")
- one_zero_input1 = gr.Slider(10,10000,10, label = "number of experiments")
- one_zero_output = gr.Plot()
- with gr.Row():
- one_zero_output1 = gr.Textbox(label = "experiments with no same birthdays")
- one_zero_output2 = gr.Textbox(label = "experiments with same birthdays")
-                one_zero_output3 = gr.Textbox(label = "probability of same birthdays")
- one_zero_output4 = gr.Textbox(label = "estimated probability")
- one_zero_button = gr.Button()
- with gr.Accordion("normal distribution", open = False):
-            gr.Markdown("this may take some time when the numbers are large")
- multi_input1 = gr.Slider(1,100,label = "number of people")
- multi_input2 = gr.Slider(10,1000,label = "number of experiments to get one probability")
- multi_input3 = gr.Slider(1,1000,label = "number of probabilities")
- multi_output = gr.Plot()
- multi_button = gr.Button()
- with gr.Tab(label="Confidence Interval"):
- with gr.Accordion("CI for ratio"):
- gr.Markdown("given that the probability to succeed in each trial is p")
- gr.Markdown("find the Confidence Interval under certain significance level")
- with gr.Row():
- input1 = gr.Slider(0,1,0.001,label = "probability to succeed in each trial")
-                input2 = gr.Slider(0,1,0.001,label = "significance level")
- input3 = gr.Slider(0,1000,1,label = "sample size")
- input4 = gr.Slider(1,100000,1,label = "population")
- input5 = gr.Slider(1,1000,1,label = "number of trials")
- output = gr.Plot()
- text_button = gr.Button()
- with gr.Accordion("CI for mean"):
- gr.Markdown("tip: the y axis has been normalized")
- with gr.Row():
- input11 = gr.Slider(0,100,1,label = "mean")
- input21 = gr.Slider(0,100,1,label = "std")
-                input31 = gr.Slider(0,1,0.001,label = "significance level")
- input41 = gr.Slider(0,1000,1,label = "sample size")
- input51 = gr.Slider(1,100000,1,label = "population")
- input61 = gr.Slider(1,1000,1,label = "sample times")
- output1 = gr.Plot()
- text_button1 = gr.Button()
- text_button1.click(draw_mean, inputs=[input11,input21,input31,input41,input51,input61], outputs=[output1])
- birthday_intro_button.click(birthdayCheck, inputs = [birthday_intro_input1], outputs = [birthday_intro_output1])
- birthday_single_simulation_button.click(getSame_gradio, inputs = [birthday_single_simulation_input1], outputs = [birthday_single_simulation_output1,birthday_single_simulation_output2,birthday_single_simulation_output3,birthday_single_simulation_output4,birthday_single_simulation_output5])
- one_zero_button.click(draw_one_sim_one_zero_gradio, inputs=[one_zero_input1,one_zero_input2], outputs=[one_zero_output,one_zero_output1,one_zero_output2,one_zero_output3,one_zero_output4])
- multi_button.click(multi_multi_simulation_gradio,inputs = [multi_input1,multi_input2,multi_input3],outputs = multi_output)
- text_button.click(draw, inputs=[input1,input2,input3,input4,input5], outputs=[output])
-demo.launch()
\ No newline at end of file
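The Gradio UI above quotes the closed-form birthday-paradox probability 1 − (365 · 364 · … · (365 − n + 1)) / 365^n, which the simulations estimate empirically. A small self-contained sketch of that exact formula (the function name is illustrative, not from the original file):

```python
def exact_birthday_probability(n):
    # P(at least two of n people share a birthday), 365 equally likely days:
    # 1 - (365/365) * (364/365) * ... * ((365 - n + 1)/365)
    p_no_match = 1.0
    for k in range(n):
        p_no_match *= (365 - k) / 365
    return 1.0 - p_no_match

# Famously, 23 people already give a better-than-even chance of a shared birthday.
print(round(exact_birthday_probability(23), 4))  # 0.5073
```

The one-zero bar chart in the app should converge to this value as the number of experiments grows.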
diff --git a/spaces/falterWliame/Face_Mask_Detection/HD Online Player (Kabhi Alvida Naa Kehna Full !!EXCLUSIVE!! Hd Movie).md b/spaces/falterWliame/Face_Mask_Detection/HD Online Player (Kabhi Alvida Naa Kehna Full !!EXCLUSIVE!! Hd Movie).md
deleted file mode 100644
index 913abb2d7dae03fc543f09686a1946512573c31a..0000000000000000000000000000000000000000
--- a/spaces/falterWliame/Face_Mask_Detection/HD Online Player (Kabhi Alvida Naa Kehna Full !!EXCLUSIVE!! Hd Movie).md
+++ /dev/null
@@ -1,6 +0,0 @@
-HD Online Player (Kabhi Alvida Naa Kehna full hd movie)
Download File ››››› https://urlca.com/2uDdTu
-
-Watch Kabhi Alvida Naa Kehna 2006 Full Movie Online Free in Hd ... 1080p Tv Pixel ... Tumhi Dekho Na - Kabhi Alvida Na Kehna (1080p HD Song).mp4 ... Kabhi ... Dev, a former football player is married to Rhea,.... Kabhi ... 4d29de3e1b
-
-
-
diff --git a/spaces/fatiXbelha/sd/Cch s dng phn mm VNPT-BHXH 2.0 k khai BHXH in t d dng.md b/spaces/fatiXbelha/sd/Cch s dng phn mm VNPT-BHXH 2.0 k khai BHXH in t d dng.md
deleted file mode 100644
index 08eedf8dcfe60996985d5fd4b9c690dc2d0691e2..0000000000000000000000000000000000000000
--- a/spaces/fatiXbelha/sd/Cch s dng phn mm VNPT-BHXH 2.0 k khai BHXH in t d dng.md
+++ /dev/null
@@ -1,157 +0,0 @@
-
-Download BHXH VNPT 2.0 - A Convenient and Secure Software for Social Insurance Declaration
- If you are looking for software that can help you declare social insurance online, send and receive electronic documents, and check the status and feedback of your social insurance documents, then you should consider downloading BHXH VNPT 2.0.
- BHXH VNPT 2.0 is a software developed by Vietnam Posts and Telecommunications Group (VNPT) that allows users to perform various tasks related to social insurance declaration and management.
-download bhxh vnpt 2.0
DOWNLOAD >>> https://urllie.com/2uNAxP
- In this article, we will introduce you to the features, benefits, reviews, alternatives, and instructions on how to download and use BHXH VNPT 2.0.
- What is BHXH VNPT 2.0?
- BHXH VNPT 2.0 is a software that enables users to declare social insurance online, send and receive electronic documents, check the status and feedback of social insurance documents, and update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0.
- BHXH VNPT 2.0 is compatible with Windows operating system and can be used with any type of digital signature.
- BHXH VNPT 2.0 is compliant with the regulations of Vietnam Social Security (VSS) and provides users with various forms of social insurance declaration.
- Features of BHXH VNPT 2.0
- Some of the main features of BHXH VNPT 2.0 are:
-
-- Kê khai hồ sơ BHXH điện tử: Users can declare social insurance online with various forms that comply with the regulations of VSS.
-- Gửi hồ sơ BHXH điện tử: Users can send electronic documents to VSS through a secure and fast connection.
-- Tái sử dụng dữ liệu hồ sơ đã khai ở các kỳ trước: Users can reuse data from previous declarations to save time and avoid errors.
-- Tra cứu tình trạng xử lý hồ sơ BHXH điện tử: Users can check the status and feedback of their social insurance documents online.
-- Tra cứu thông báo phản hồi từ cơ quan BHXH: Users can check the notifications from VSS regarding their social insurance documents.
-- Cập nhật và đồng bộ dữ liệu từ BHXH VNPT 2.0 sang BHXH VNPT 5.0: Users can update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0, which is the latest version of the software with more features.
-
- Benefits of using BHXH VNPT 2 .0
- Some of the benefits of using BHXH VNPT 2.0 are:
-
-- It saves time and resources for users and VSS by reducing the need for paper documents and manual processing.
-- It ensures the accuracy and security of the data by using digital signatures and encryption.
-- It provides users with instant feedback and notifications from VSS regarding their social insurance documents.
-- It allows users to update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0, which is the latest version of the software with more features.
-
- Reviews of BHXH VNPT 2.0
- BHXH VNPT 2.0 has received positive reviews from users who have used it for social insurance declaration and management. Here are some of the reviews from users:
-
-"I have been using BHXH VNPT 2.0 for a year and I am very satisfied with it. It is easy to use, fast, and secure. It helps me declare social insurance online without any hassle. I can also check the status and feedback of my documents anytime. I highly recommend it to anyone who needs to declare social insurance online."
-- Nguyen Thi Lan, accountant at ABC Company
-
-
-"BHXH VNPT 2.0 is a great software for social insurance declaration. It has many features that make it convenient and efficient. It allows me to reuse data from previous declarations, send and receive electronic documents, and update and synchronize data to BHXH VNPT 5.0. It also gives me instant feedback and notifications from VSS. It is a must-have software for social insurance declaration."
-- Tran Van Minh, manager at XYZ Company
-
- Alternatives to BHXH VNPT 2.0
- If you are looking for alternatives to BHXH VNPT 2.0, you may want to consider the following options:
-
-- BHXH VNPT 5.0: This is the latest version of the software developed by VNPT that has more features and functions than BHXH VNPT 2.0. It allows users to declare social insurance online, send and receive electronic documents, check the status and feedback of social insurance documents, update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0, and perform other tasks related to social insurance declaration and management.
-- BHXH Online: This is a web-based service provided by VSS that allows users to declare social insurance online, send and receive electronic documents, check the status and feedback of social insurance documents, and perform other tasks related to social insurance declaration and management.
-- BHXH Mobile: This is a mobile application developed by VSS that allows users to declare social insurance online, send and receive electronic documents, check the status and feedback of social insurance documents, and perform other tasks related to social insurance declaration and management on their smartphones or tablets.
-
- How to download and install BHXH VNPT 2.0?
- If you want to download and install BHXH VNPT 2.0 on your computer, you can follow these steps:
- Step 1: Choose the suitable version and download the software
- You can choose between two versions of BHXH VNPT 2.0: one for Windows XP or Windows Server 2003 (32-bit), and one for Windows Vista or higher (64-bit). You can download the software from the official website of VNPT at https://vnpt.com.vn/bhxh-vnpt-20 or from the official website of VSS at https://bhxh.gov.vn/bhxh-vnpt-20. You can also scan the QR code on these websites to download the software on your mobile device.
- Step 2: Double-click on the downloaded file and follow the instructions
- After downloading the software, you can double-click on the file named "BHXH_VNPT_20_Setup.exe" to start the installation process. You will see a welcome screen that asks you to choose the language for the installation. You can choose between Vietnamese or English. Then, you will see a license agreement screen that asks you to accept the terms and conditions of the software. You can click on "I Agree" to continue. Then, you will see a destination folder screen that asks you to choose where you want to install the software on your computer. You can click on " Browse" to select a different folder or click on "Next" to use the default folder. Then, you will see a ready to install screen that asks you to confirm the installation. You can click on "Install" to start the installation or click on "Back" to change any settings. The installation will take a few minutes and you will see a progress bar that shows the status of the installation. When the installation is complete, you will see a finish screen that asks you to launch the software. You can click on "Finish" to exit the installation or click on "Launch BHXH VNPT 2.0" to start using the software.
-
Step 3: Complete the installation and start using the software
- After launching the software, you will see a login screen that asks you to enter your username and password. You can use the username and password that you have registered with VSS or VNPT. If you have not registered yet, you can click on "Register" to create an account. You will need to provide some information such as your name, email, phone number, and digital signature. After registering, you can log in with your username and password. You will then see the main interface of the software that shows various options and functions that you can use for social insurance declaration and management.
- How to use BHXH VNPT 2.0?
- Once you have downloaded and installed BHXH VNPT 2.0, you can use it for various tasks related to social insurance declaration and management. Here are some of the main tasks that you can perform with BHXH VNPT 2.0:
- How to declare social insurance online with BHXH VNPT 2.0?
- To declare social insurance online with BHXH VNPT 2.0, you can follow these steps:
-
-- On the main interface of the software, click on "Kê khai hồ sơ BHXH điện tử" (Declare social insurance online).
-- Select the type of declaration that you want to make from the drop-down menu. You can choose between monthly declaration, quarterly declaration, annual declaration, or other types of declaration.
-- Fill in the required information and data for your declaration. You can use the data from previous declarations or enter new data.
-- Check and verify your declaration before submitting it. You can preview your declaration, print it, or save it as a file.
-- Sign your declaration with your digital signature and click on "Gửi hồ sơ BHXH điện tử" (Send electronic documents) to submit your declaration to VSS.
-
- How to send and receive electronic documents with BHXH VNPT 2.0?
- To send and receive electronic documents with BHXH VNPT 2.0, you can follow these steps:
-
-- On the main interface of the software, click on "Gửi hồ sơ BHXH điện tử" (Send electronic documents) or "Nhận hồ sơ BHXH điện tử" (Receive electronic documents) depending on what you want to do.
-- Select the type of document that you want to send or receive from the list of available documents.
-- Fill in the required information and data for your document. You can use the data from previous documents or enter new data.
-- Check and verify your document before sending or receiving it. You can preview your document, print it, or save it as a file.
-- Sign your document with your digital signature and click on "Gửi hồ sơ BHXH điện tử" (Send electronic documents) or "Nhận hồ sơ BHXH điện tử" (Receive electronic documents) to complete the process.
-
- How to check the status and feedback of social insurance documents with BHXH VNPT 2.0?
- To check the status and feedback of social insurance documents with BHXH VNPT 2.0, you can follow these steps:
-
-- On the main interface of the software, click on "Tra cứu tình trạng xử lý hồ sơ BHXH điện tử" (Check the status and feedback of social insurance documents).
-- Select the type of document that you want to check from the list of available documents.
-- Enter the document number or other criteria to search for your document.
-- View the status and feedback of your document on the screen. You can see if your document has been received, processed, approved, rejected, or returned by VSS. You can also see the reasons and suggestions for your document if it has been rejected or returned.
-- Click on "Xem chi tiết" (View details) to see more information and feedback about your document. You can also download, print, or save your document as a file.
-
- How to update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0?
- To update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0, you can follow these steps:
-
-- On the main interface of the software, click on "Cập nhật và đồng bộ dữ liệu từ BHXH VNPT 2.0 sang BHXH VNPT 5.0" (Update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0).
-- Select the type of data that you want to update and synchronize from the list of available data.
-- Enter the date range or other criteria to filter the data that you want to update and synchronize.
-- Click on "Cập nhật và đồng bộ" (Update and synchronize) to start the process. You will see a progress bar that shows the status of the process.
-- When the process is complete, you will see a confirmation message that shows the number of data records that have been updated and synchronized.
-
- Conclusion
- BHXH VNPT 2.0 is a convenient and secure software for social insurance declaration and management. It allows users to declare social insurance online, send and receive electronic documents, check the status and feedback of social insurance documents, and update and synchronize data from BHXH VNPT 2.0 to BHXH VNPT 5.0.
- BHXH VNPT 2.0 is compatible with Windows operating system and can be used with any type of digital signature. It is compliant with the regulations of VSS and provides users with various forms of social insurance declaration.
- BHXH VNPT 2.0 has received positive reviews from users who have used it for social insurance declaration and management. It saves time and resources for users and VSS, ensures the accuracy and security of the data, provides instant feedback and notifications from VSS, and allows users to update and synchronize data to BHXH VNPT 5.0.
- If you want to download and use BHXH VNPT 2.0, you can follow the instructions in this article or visit the official websites of VNPT or VSS for more information.
- FAQs
- Here are some of the frequently asked questions about BHXH VNPT 2.0:
-
-- What is the difference between BHXH VNPT 2.0 and BHXH VNPT 5.0?
-BHXH VNPT 2.0 is an older version of the software that has fewer features and functions than BHXH VNPT 5.0. BHXH VNPT 5.0 is the latest version of the software that has more features and functions than BHXH VNPT 2.0. For example, BHXH VNPT 5.0 allows users to declare health insurance online, manage payroll online, manage employee information online, and perform other tasks related to social insurance declaration and management.
-- How much does it cost to use BHXH VNPT 2.0?
-BHXH VNPT 2.0 is free to use for users who have registered with VSS or VNPT. Users do not need to pay any fees or charges to use BHXH VNPT 2.0.
-- How secure is BHXH VNPT 2.0?
-BHXH VNPT 2.0 is very secure as it uses digital signatures and encryption to protect the data and documents that are sent and received by users. Users can also choose their own passwords and change them regularly to prevent unauthorized access to their accounts.
-- How can I contact VSS or VNPT if I have any questions or problems with BHXH VNPT 2.0?
-You can contact VSS or VNPT by phone, email, or online chat if you have any questions or problems with BHXH VNPT 2.0. You can find their contact information on their official websites or on the software interface.
-- Can I use BHXH VNPT 2.0 on other devices besides computers?
-Yes, you can use BHXH VNPT 2.0 on other devices besides computers, such as smartphones or tablets. You can download the software from the official websites of VNPT or VSS or scan the QR code on these websites to download the software on your mobile device. You can also use BHXH Mobile, which is a mobile application developed by VSS that has similar features and functions as BHXH VNPT 2.0.
-
-
-
\ No newline at end of file
diff --git a/spaces/fatiXbelha/sd/Discover the Secrets of the Caribbean with Caribbean Treasures - A Free Game by Reflexive Entertainment.md b/spaces/fatiXbelha/sd/Discover the Secrets of the Caribbean with Caribbean Treasures - A Free Game by Reflexive Entertainment.md
deleted file mode 100644
index 4a75dc156fdb75adb372dd697f9b63f8ec2d8032..0000000000000000000000000000000000000000
--- a/spaces/fatiXbelha/sd/Discover the Secrets of the Caribbean with Caribbean Treasures - A Free Game by Reflexive Entertainment.md
+++ /dev/null
@@ -1,99 +0,0 @@
-
-Caribbean Treasures Download: How to Play and Win Big
-If you are looking for a fun and rewarding way to spend your time, you might want to check out Caribbean Treasures. Caribbean Treasures is an online gaming platform that offers exciting fish games and slot games that you can play anytime, anywhere. In this article, we will tell you everything you need to know about Caribbean Treasures, how to download and play it, and how to win big with it.
- What is Caribbean Treasures?
-Caribbean Treasures is an online gaming platform that offers two types of games: fish games and slot games. Both types of games are designed to test your skills and luck, and reward you with big treasures. Here are some features of each type of game:
-caribbean treasures download
DOWNLOAD ✔ https://urllie.com/2uNvGf
- A fun and exciting fish game
-The fish game is a shooting game where you have to aim and fire at various fish and sea creatures on the screen. The more fish you catch, the more coins you earn. You can also encounter special fish that give you extra coins, bonuses, or multipliers. The fish game has different levels of difficulty, from easy to hard, and different modes, such as single-player or multiplayer. You can also customize your cannon and use different power-ups to enhance your gameplay.
- A variety of slot games
-The slot games are classic casino games where you have to spin the reels and match symbols to win prizes. The slot games have different themes, such as fruits, animals, pirates, or ancient civilizations. The slot games also have different features, such as wilds, scatters, free spins, or bonus rounds. The slot games have different paylines, from 5 to 50, and different bet sizes, from 0.01 to 5 coins per line.
- How to download and play Caribbean Treasures?
-Downloading and playing Caribbean Treasures is very easy. Here are the steps you need to follow:
- Download the game for Android or iOS
-You can download the game for your mobile device from the official website. The game is compatible with both Android and iOS devices. The download is free and fast. You can also download the game for your desktop computer if you prefer.
- Sign up for a free account
-After downloading the game, you need to sign up for a free account. You don't need a credit card to sign up. You just need to provide some basic information, such as your name, email address, phone number, and username. The support team will activate your account and contact you with details. You can also contact them anytime if you have any questions or issues.
- Choose your game and start playing
-Once you have your account ready, you can choose your game and start playing. You can switch between the fish game and the slot game anytime you want. You can also choose from different rooms or tables depending on your preference. You can play with real money or with virtual coins. You can also play with other players or by yourself.
- How to win big with Caribbean Treasures?
-Playing Caribbean Treasures is not only fun but also rewarding. Here are some tips on how to win big with Caribbean Treasures:
- Use your skills and strategy
-The fish game and the slot game both require some skills and strategy to win. For the fish game, you need to aim carefully, fire wisely, and avoid wasting bullets. You also need to know which fish are worth more coins, which fish have special effects, and when to use power-ups. For the slot game, you need to know how to adjust your bet size, how to choose the best paylines, and how to trigger the bonus features. You also need to manage your bankroll and set a limit for your losses and wins.
- Take advantage of bonuses and promotions
-Caribbean Treasures offers various bonuses and promotions to its players. You can get a welcome bonus when you sign up, a deposit bonus when you make your first deposit, a referral bonus when you invite your friends, and a loyalty bonus when you play regularly. You can also get free coins, free spins, or free tickets from time to time. You can use these bonuses and promotions to increase your chances of winning and have more fun.
- Join the VIP club for more rewards
-If you want to enjoy more benefits and rewards, you can join the VIP club of Caribbean Treasures. The VIP club is a membership program that gives you access to exclusive offers, discounts, gifts, and events. You can also get higher payouts, faster withdrawals, and better customer service. You can join the VIP club by earning points from playing the games or by paying a monthly fee.
- Conclusion
-Caribbean Treasures is an online gaming platform that offers exciting fish games and slot games that you can play anytime, anywhere. You can download the game for free for your mobile device or desktop computer. You can sign up for a free account and start playing with real money or virtual coins. You can use your skills and strategy, take advantage of bonuses and promotions, and join the VIP club to win big with Caribbean Treasures. So what are you waiting for? Download Caribbean Treasures today and discover the hidden treasures of the Caribbean!
- FAQs
-Here are some frequently asked questions about Caribbean Treasures:
-
-- Is Caribbean Treasures safe and secure?
-Yes, Caribbean Treasures is safe and secure. The game uses advanced encryption technology to protect your personal and financial information. The game also follows fair gaming practices and has a random number generator to ensure fair outcomes.
-- How can I contact the support team of Caribbean Treasures?
-You can contact the support team of Caribbean Treasures by email, phone, or live chat. The support team is available 24/7 to assist you with any questions or issues you may have.
-- Can I play Caribbean Treasures with my friends?
-Yes, you can play Caribbean Treasures with your friends. You can invite your friends to join the game and play together in the multiplayer mode of the fish game. You can also chat with your friends and other players in the game.
-- What are the minimum requirements to play Caribbean Treasures?
-The minimum requirements to play Caribbean Treasures are as follows:
-
-Device | Operating System | Memory
-Android | 4.4 or higher | 1 GB or higher
-iOS | 9.0 or higher | 1 GB or higher
-Desktop | Windows 7 or higher / Mac OS X 10.10 or higher | 2 GB or higher
-
-- How can I withdraw my winnings from Caribbean Treasures?
-You can withdraw your winnings from Caribbean Treasures by using one of the following methods: PayPal, Skrill, Neteller, Bitcoin, or bank transfer. The minimum withdrawal amount is $20 and the maximum withdrawal amount is $10,000 per day. The withdrawal process may take up to 48 hours depending on the method you choose.
-
-
-
\ No newline at end of file
diff --git a/spaces/fatiXbelha/sd/Download Moneyland Mod Apk and Enjoy Unlimited Shopping and Cash.md b/spaces/fatiXbelha/sd/Download Moneyland Mod Apk and Enjoy Unlimited Shopping and Cash.md
deleted file mode 100644
index 7c2467a5cbfda96d9ed315e98d963c7be2263750..0000000000000000000000000000000000000000
--- a/spaces/fatiXbelha/sd/Download Moneyland Mod Apk and Enjoy Unlimited Shopping and Cash.md
+++ /dev/null
@@ -1,107 +0,0 @@
-
-Moneyland Mod APK Free Shopping: How to Get Unlimited Money and Scooter in Moneyland Game
- Do you love playing casual games that let you explore different places and earn money? If yes, then you might have heard of Moneyland, a popular game that lets you travel around the world and collect coins. But what if you want to enjoy the game without any limitations or restrictions? Well, that's where Moneyland Mod APK comes in handy. In this article, we will tell you everything you need to know about Moneyland Mod APK, including its features, benefits, and how to download and install it on your device. So, without further ado, let's get started!
-moneyland mod apk free shopping
Download Zip ✯ https://urllie.com/2uNHfL
- Introduction
- What is Moneyland?
- Moneyland is a fun and addictive game that lets you explore different countries and cities and collect coins along the way. You can use the coins to buy various items and upgrades, such as houses, cars, boats, planes, and more. You can also unlock new locations and modes as you progress in the game. The game has simple controls and colorful graphics that make it suitable for all ages. You can also play with your friends and compete with them on the leaderboard.
- What is Moneyland Mod APK?
- Moneyland Mod APK is a modified version of the original Moneyland game that gives you access to all the premium features for free. By downloading the modded version of Moneyland, you will get unlimited money, free shopping, unlock scooter, no ads, and anti-ban option. This means that you can enjoy the game without any worries or hassles. You can buy anything you want, travel anywhere you want, and have fun without any interruptions.
- Why do you need Moneyland Mod APK?
- You might be wondering why you need Moneyland Mod APK when you can play the original game for free. Well, there are several reasons why you might want to try out the modded version of Moneyland. Here are some of them:
-
-- You want to get unlimited money and free shopping so that you can buy anything you want without spending real money.
-- You want to unlock scooter so that you can travel faster and collect more coins.
-- You want to remove ads so that you can play the game without any annoying pop-ups or banners.
-- You want to use the anti-ban option so that you can play the game without any risk of getting banned by the developers.
-
- If any of these reasons apply to you, then you should definitely download Moneyland Mod APK and enjoy the game to the fullest.
- Features of Moneyland Mod APK
- Unlimited Money
- One of the best features of Moneyland Mod APK is that it gives you unlimited money in the game. This means that you can buy anything you want without worrying about running out of coins. You can buy houses, cars, boats, planes, and more with just a few taps. You can also upgrade your items and make them more luxurious and comfortable. With unlimited money, you can enjoy the game without any limitations or restrictions.
- Free Shopping
- Another great feature of Moneyland Mod APK is that it gives you free shopping in the game. This means that you don't have to spend any coins to buy anything in the game. You can simply select any item you want and get it for free. You don't have to wait for any timers or watch any ads to get your items. You can also get unlimited gems and keys for free, which are used to unlock new locations and modes in the game. With free shopping, you can get everything you want without spending a dime.
- Unlock Scooter
- Another cool feature of Moneyland Mod APK is that it lets you unlock scooter in the game. Scooter is a special vehicle that allows you to travel faster and collect more coins. You can also perform stunts and tricks with the scooter and impress your friends. Scooter is normally locked in the game and requires a lot of coins to unlock. But with Moneyland Mod APK, you can get it for free and enjoy the game more.
- No Ads
- Another awesome feature of Moneyland Mod APK is that it removes all the ads from the game. Ads are annoying and distracting and can ruin your gaming experience. They can also consume your data and battery and slow down your device. But with Moneyland Mod APK, you can play the game without any ads and have a smooth and uninterrupted gameplay. You can also save your data and battery and improve your device performance.
- Anti-Ban Option
- Another amazing feature of Moneyland Mod APK is that it gives you an anti-ban option in the game. This means that you can play the game without any risk of getting banned by the developers. The modded version of Moneyland has a built-in anti-ban system that protects your account from detection and suspension. You can use all the mod features without any worries or hassles. You can also update the game regularly and enjoy the latest features and bug fixes.
- How to Download and Install Moneyland Mod APK
- Step 1: Enable Unknown Sources
- The first step to download and install Moneyland Mod APK is to enable unknown sources on your device. This will allow you to install apps from sources other than the Google Play Store. To do this, go to your device settings and look for security or privacy options. Then, find the option that says unknown sources or allow installation from unknown sources and enable it. This will make your device ready for installing Moneyland Mod APK.
- Step 2: Download the APK File
- The next step is to download the APK file of Moneyland Mod APK from a reliable source. You can use the link below to download the latest version of Moneyland Mod APK for free. The file size is about 100 MB and it will take a few minutes to download depending on your internet speed. Make sure you have enough storage space on your device before downloading the file.
- Download Moneyland Mod APK Free Shopping
- Step 3: Install the APK File
- The final step is to install the APK file of Moneyland Mod APK on your device. To do this, locate the downloaded file in your file manager or downloads folder and tap on it. You will see a pop-up window asking you to confirm the installation. Tap on install and wait for the process to complete. It will take a few seconds to install Moneyland Mod APK on your device.
- Step 4: Enjoy the Game
- Congratulations! You have successfully downloaded and installed Moneyland Mod APK on your device. Now, you can enjoy the game with all the mod features for free. You can open the game from your app drawer or home screen and start playing right away. You can explore different places, collect coins, buy items, unlock scooter, and have fun without any limitations or restrictions.
- Conclusion
- Moneyland Mod APK is a great way to enjoy Moneyland game without any limitations or restrictions. It gives you unlimited money, free shopping, unlock scooter, no ads, and anti-ban option for free. You can download and install it easily on your device and enjoy the game to the fullest. Moneyland Mod APK is safe, secure, and updated regularly. It is compatible with most Android devices and does not require root access or special permissions. If you love playing casual games that let you explore different places and earn money, then you should definitely try out Moneyland Mod APK and have fun.
- FAQs
- Here are some frequently asked questions about Moneyland Mod APK:
-
-- Q: Is Moneyland Mod APK safe to use?
-- A: Yes, Moneyland Mod APK is safe to use as it does not contain any viruses or malware. It also has an anti-ban system that protects your account from getting banned by the developers.
-- Q: Do I need to root my device to use Moneyland Mod APK?
-- A: No, you do not need to root your device to use Moneyland Mod APK as it works fine on both rooted and non-rooted devices. You can download and install it without any problems.
-- Q: How can I update Moneyland Mod APK?
-- A: You can update Moneyland Mod APK by downloading the latest version from the same source you downloaded it from. You can also check this article regularly for any updates or news about Moneyland Mod APK.
-- Q: Can I play Moneyland Mod APK with my friends?
-- A: Yes, you can play Moneyland Mod APK with your friends and compete with them on the leaderboard. You can also share your progress and achievements with them on social media.
-- Q: What are the minimum requirements to run Moneyland Mod APK?
-- A: The minimum requirements to run Moneyland Mod APK are Android 4.4 or higher, 2 GB of RAM, and 200 MB of free storage space.
-
- I hope this article has helped you learn more about Moneyland Mod APK and how to download and install it on your device. If you have any questions or feedback, feel free to leave a comment below. Thank you for reading and happy gaming!
-
-
\ No newline at end of file
diff --git a/spaces/fengmuxi/ChatGpt-Web/app/store/access.ts b/spaces/fengmuxi/ChatGpt-Web/app/store/access.ts
deleted file mode 100644
index faa53b80b8d6a3cfe13e46597f3c83f466e010eb..0000000000000000000000000000000000000000
--- a/spaces/fengmuxi/ChatGpt-Web/app/store/access.ts
+++ /dev/null
@@ -1,98 +0,0 @@
-import { create } from "zustand";
-import { persist } from "zustand/middleware";
-import { StoreKey } from "../constant";
-import { BOT_HELLO } from "./chat";
-import { ALL_MODELS } from "./config";
-import { getHeaders } from "../requests";
-
-export interface AccessControlStore {
- accessCode: string;
- token: string;
- auth: string;
-
- needCode: number;
- hideUserApiKey: boolean;
- openaiUrl: string;
-
- updateToken: (_: string) => void;
- updateCode: (_: string) => void;
- updateAuth: (_: string) => void;
- enabledAccessControl: () => number;
- isAuthorized: () => boolean;
- fetch: () => void;
-}
-
-let fetchState = 0; // 0 not fetch, 1 fetching, 2 done
-
-export const useAccessStore = create<AccessControlStore>()(
- persist(
- (set, get) => ({
- token: "",
- auth: "",
- accessCode: "",
- needCode: 0,
- hideUserApiKey: true,
- openaiUrl: "/api/openai/",
-
- enabledAccessControl() {
- get().fetch();
-
- return get().needCode;
- },
- updateCode(code: string) {
- set(() => ({ accessCode: code }));
- },
- updateToken(token: string) {
- set(() => ({ token }));
- },
- updateAuth(token: string) {
- set(() => ({ auth: token }));
- },
- isAuthorized() {
- get().fetch();
- // has token or has code or disabled access control
- return (
- !!get().token || !!get().accessCode || !!get().enabledAccessControl() || !!get().auth
- );
- },
- fetch() {
- if (fetchState > 0) return;
- fetchState = 1;
- fetch("/api/config", {
- method: "post",
- headers: {
- ...getHeaders(),
- },
- body: null,
- })
- .then((res) => res.json())
- .then((res: DangerConfig) => {
- console.log("[Config] got config from server", res);
- set(() => ({ ...res }));
-
- if (!res.enableGPT4) {
- ALL_MODELS.forEach((model) => {
- if (model.name.startsWith("gpt-4")) {
- (model as any).available = false;
- }
- });
- }
-
- if ((res as any).botHello) {
- BOT_HELLO.content = (res as any).botHello;
- }
- })
- .catch(() => {
- console.error("[Config] failed to fetch config");
- })
- .finally(() => {
- fetchState = 2;
- });
- },
- }),
- {
- name: StoreKey.Access,
- version: 1,
- },
- ),
-);
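The `isAuthorized()` predicate in the deleted store above is just a boolean OR over four fields. A minimal standalone sketch of the same check (the `AccessState` interface is a hypothetical plain-object stand-in for the zustand state, so there is no store dependency):

```typescript
// Standalone sketch of the isAuthorized() check from the store above.
// AccessState is a hypothetical flattened view of the store's fields;
// needCode plays the role of enabledAccessControl()'s return value.
interface AccessState {
  token: string;
  accessCode: string;
  auth: string;
  needCode: number;
}

function isAuthorized(state: AccessState): boolean {
  // Mirrors: !!token || !!accessCode || !!enabledAccessControl() || !!auth
  return (
    !!state.token || !!state.accessCode || !!state.needCode || !!state.auth
  );
}
```

Note that, as written in the store, a truthy `needCode` makes every client pass the check, which may be the opposite of the intent (upstream versions of this file negate the `enabledAccessControl()` term); the sketch preserves the behavior exactly as shown.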
diff --git a/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/build/transports/polling.js b/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/build/transports/polling.js
deleted file mode 100644
index 4f855b387bdb5cd8e6d61c746f985c807358d6f2..0000000000000000000000000000000000000000
--- a/spaces/fffiloni/controlnet-animation-doodle/node_modules/engine.io/build/transports/polling.js
+++ /dev/null
@@ -1,344 +0,0 @@
-"use strict";
-Object.defineProperty(exports, "__esModule", { value: true });
-exports.Polling = void 0;
-const transport_1 = require("../transport");
-const zlib_1 = require("zlib");
-const accepts = require("accepts");
-const debug_1 = require("debug");
-const debug = (0, debug_1.default)("engine:polling");
-const compressionMethods = {
- gzip: zlib_1.createGzip,
- deflate: zlib_1.createDeflate,
-};
-class Polling extends transport_1.Transport {
- /**
- * HTTP polling constructor.
- *
- * @api public.
- */
- constructor(req) {
- super(req);
- this.closeTimeout = 30 * 1000;
- }
- /**
- * Transport name
- *
- * @api public
- */
- get name() {
- return "polling";
- }
- get supportsFraming() {
- return false;
- }
- /**
- * Overrides onRequest.
- *
- * @param {http.IncomingMessage}
- * @api private
- */
- onRequest(req) {
- const res = req.res;
- if ("GET" === req.method) {
- this.onPollRequest(req, res);
- }
- else if ("POST" === req.method) {
- this.onDataRequest(req, res);
- }
- else {
- res.writeHead(500);
- res.end();
- }
- }
- /**
- * The client sends a request awaiting for us to send data.
- *
- * @api private
- */
- onPollRequest(req, res) {
- if (this.req) {
- debug("request overlap");
- // assert: this.res, '.req and .res should be (un)set together'
- this.onError("overlap from client");
- // TODO for the next major release: use an HTTP 400 status code (https://github.com/socketio/engine.io/issues/650)
- res.writeHead(500);
- res.end();
- return;
- }
- debug("setting request");
- this.req = req;
- this.res = res;
- const onClose = () => {
- this.onError("poll connection closed prematurely");
- };
- const cleanup = () => {
- req.removeListener("close", onClose);
- this.req = this.res = null;
- };
- req.cleanup = cleanup;
- req.on("close", onClose);
- this.writable = true;
- this.emit("drain");
- // if we're still writable but had a pending close, trigger an empty send
- if (this.writable && this.shouldClose) {
- debug("triggering empty send to append close packet");
- this.send([{ type: "noop" }]);
- }
- }
- /**
- * The client sends a request with data.
- *
- * @api private
- */
- onDataRequest(req, res) {
- if (this.dataReq) {
- // assert: this.dataRes, '.dataReq and .dataRes should be (un)set together'
- this.onError("data request overlap from client");
- // TODO for the next major release: use an HTTP 400 status code (https://github.com/socketio/engine.io/issues/650)
- res.writeHead(500);
- res.end();
- return;
- }
- const isBinary = "application/octet-stream" === req.headers["content-type"];
- if (isBinary && this.protocol === 4) {
- return this.onError("invalid content");
- }
- this.dataReq = req;
- this.dataRes = res;
- let chunks = isBinary ? Buffer.concat([]) : "";
- const cleanup = () => {
- req.removeListener("data", onData);
- req.removeListener("end", onEnd);
- req.removeListener("close", onClose);
- this.dataReq = this.dataRes = chunks = null;
- };
- const onClose = () => {
- cleanup();
- this.onError("data request connection closed prematurely");
- };
- const onData = (data) => {
- let contentLength;
- if (isBinary) {
- chunks = Buffer.concat([chunks, data]);
- contentLength = chunks.length;
- }
- else {
- chunks += data;
- contentLength = Buffer.byteLength(chunks);
- }
- if (contentLength > this.maxHttpBufferSize) {
- res.writeHead(413).end();
- cleanup();
- }
- };
- const onEnd = () => {
- this.onData(chunks);
- const headers = {
- // text/html is required instead of text/plain to avoid an
- // unwanted download dialog on certain user-agents (GH-43)
- "Content-Type": "text/html",
- "Content-Length": 2,
- };
- res.writeHead(200, this.headers(req, headers));
- res.end("ok");
- cleanup();
- };
- req.on("close", onClose);
- if (!isBinary)
- req.setEncoding("utf8");
- req.on("data", onData);
- req.on("end", onEnd);
- }
- /**
- * Processes the incoming data payload.
- *
- * @param {String} encoded payload
- * @api private
- */
- onData(data) {
- debug('received "%s"', data);
- const callback = (packet) => {
- if ("close" === packet.type) {
- debug("got xhr close packet");
- this.onClose();
- return false;
- }
- this.onPacket(packet);
- };
- if (this.protocol === 3) {
- this.parser.decodePayload(data, callback);
- }
- else {
- this.parser.decodePayload(data).forEach(callback);
- }
- }
- /**
- * Overrides onClose.
- *
- * @api private
- */
- onClose() {
- if (this.writable) {
- // close pending poll request
- this.send([{ type: "noop" }]);
- }
- super.onClose();
- }
- /**
- * Writes a packet payload.
- *
- * @param {Object} packet
- * @api private
- */
- send(packets) {
- this.writable = false;
- if (this.shouldClose) {
- debug("appending close packet to payload");
- packets.push({ type: "close" });
- this.shouldClose();
- this.shouldClose = null;
- }
- const doWrite = (data) => {
- const compress = packets.some((packet) => {
- return packet.options && packet.options.compress;
- });
- this.write(data, { compress });
- };
- if (this.protocol === 3) {
- this.parser.encodePayload(packets, this.supportsBinary, doWrite);
- }
- else {
- this.parser.encodePayload(packets, doWrite);
- }
- }
- /**
- * Writes data as response to poll request.
- *
- * @param {String} data
- * @param {Object} options
- * @api private
- */
- write(data, options) {
- debug('writing "%s"', data);
- this.doWrite(data, options, () => {
- this.req.cleanup();
- });
- }
- /**
- * Performs the write.
- *
- * @api private
- */
- doWrite(data, options, callback) {
- // explicit UTF-8 is required for pages not served under utf
- const isString = typeof data === "string";
- const contentType = isString
- ? "text/plain; charset=UTF-8"
- : "application/octet-stream";
- const headers = {
- "Content-Type": contentType,
- };
- const respond = (data) => {
- headers["Content-Length"] =
- "string" === typeof data ? Buffer.byteLength(data) : data.length;
- this.res.writeHead(200, this.headers(this.req, headers));
- this.res.end(data);
- callback();
- };
- if (!this.httpCompression || !options.compress) {
- respond(data);
- return;
- }
- const len = isString ? Buffer.byteLength(data) : data.length;
- if (len < this.httpCompression.threshold) {
- respond(data);
- return;
- }
- const encoding = accepts(this.req).encodings(["gzip", "deflate"]);
- if (!encoding) {
- respond(data);
- return;
- }
- this.compress(data, encoding, (err, data) => {
- if (err) {
- this.res.writeHead(500);
- this.res.end();
- callback(err);
- return;
- }
- headers["Content-Encoding"] = encoding;
- respond(data);
- });
- }
- /**
- * Compresses data.
- *
- * @api private
- */
- compress(data, encoding, callback) {
- debug("compressing");
- const buffers = [];
- let nread = 0;
- compressionMethods[encoding](this.httpCompression)
- .on("error", callback)
- .on("data", function (chunk) {
- buffers.push(chunk);
- nread += chunk.length;
- })
- .on("end", function () {
- callback(null, Buffer.concat(buffers, nread));
- })
- .end(data);
- }
- /**
- * Closes the transport.
- *
- * @api private
- */
- doClose(fn) {
- debug("closing");
- let closeTimeoutTimer;
- if (this.dataReq) {
- debug("aborting ongoing data request");
- this.dataReq.destroy();
- }
- const onClose = () => {
- clearTimeout(closeTimeoutTimer);
- fn();
- this.onClose();
- };
- if (this.writable) {
- debug("transport writable - closing right away");
- this.send([{ type: "close" }]);
- onClose();
- }
- else if (this.discarded) {
- debug("transport discarded - closing right away");
- onClose();
- }
- else {
- debug("transport not writable - buffering orderly close");
- this.shouldClose = onClose;
- closeTimeoutTimer = setTimeout(onClose, this.closeTimeout);
- }
- }
- /**
- * Returns headers for a response.
- *
- * @param {http.IncomingMessage} request
- * @param {Object} extra headers
- * @api private
- */
- headers(req, headers) {
- headers = headers || {};
- // prevent XSS warnings on IE
- // https://github.com/LearnBoost/socket.io/pull/1333
- const ua = req.headers["user-agent"];
- if (ua && (~ua.indexOf(";MSIE") || ~ua.indexOf("Trident/"))) {
- headers["X-XSS-Protection"] = "0";
- }
- this.emit("headers", headers, req);
- return headers;
- }
-}
-exports.Polling = Polling;
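The compression path in `doWrite()` above makes three decisions: send small payloads uncompressed, skip compression when the client accepts neither gzip nor deflate, and otherwise compress and record the chosen `Content-Encoding`. A minimal standalone TypeScript/Node sketch of that negotiation, using only the built-in `zlib` module (the `pickEncoding` helper is a hypothetical stand-in for the `accepts(req).encodings(["gzip", "deflate"])` call, and the default threshold is an assumed value, not engine.io's):

```typescript
import { gzipSync, deflateSync } from "zlib";

// Hypothetical stand-in for accepts(req).encodings(["gzip", "deflate"]):
// picks the first supported method named in the Accept-Encoding header.
function pickEncoding(acceptEncoding: string): "gzip" | "deflate" | null {
  if (acceptEncoding.includes("gzip")) return "gzip";
  if (acceptEncoding.includes("deflate")) return "deflate";
  return null;
}

// Mirrors the decision flow in doWrite(): below the size threshold or
// without a usable encoding, respond with the raw bytes; otherwise
// compress and report which Content-Encoding should be set.
function encodeBody(
  data: string,
  acceptEncoding: string,
  threshold = 1024
): { body: Buffer; encoding: string | null } {
  const raw = Buffer.from(data, "utf-8");
  if (raw.length < threshold) return { body: raw, encoding: null };
  const encoding = pickEncoding(acceptEncoding);
  if (!encoding) return { body: raw, encoding: null };
  const body = encoding === "gzip" ? gzipSync(raw) : deflateSync(raw);
  return { body, encoding };
}
```

The real transport streams chunks through `createGzip`/`createDeflate` rather than using the synchronous helpers; the sketch only illustrates the negotiation logic.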
diff --git a/spaces/fkhuggingme/gpt-academic/crazy_functions/test_project/cpp/libJPG/jpge.h b/spaces/fkhuggingme/gpt-academic/crazy_functions/test_project/cpp/libJPG/jpge.h
deleted file mode 100644
index a46c805ab80aab491f7f9508b3a008b149866bee..0000000000000000000000000000000000000000
--- a/spaces/fkhuggingme/gpt-academic/crazy_functions/test_project/cpp/libJPG/jpge.h
+++ /dev/null
@@ -1,172 +0,0 @@
-
-// jpge.h - C++ class for JPEG compression.
-// Public domain, Rich Geldreich
-// Alex Evans: Added RGBA support, linear memory allocator.
-#ifndef JPEG_ENCODER_H
-#define JPEG_ENCODER_H
-
-#include <stdint.h>
-
-namespace jpge
-{
- typedef unsigned char uint8;
- typedef signed short int16;
- typedef signed int int32;
- typedef unsigned short uint16;
- typedef unsigned int uint32;
- typedef unsigned int uint;
-
- // JPEG chroma subsampling factors. Y_ONLY (grayscale images) and H2V2 (color images) are the most common.
- enum subsampling_t { Y_ONLY = 0, H1V1 = 1, H2V1 = 2, H2V2 = 3 };
-
- // JPEG compression parameters structure.
- struct params
- {
- inline params() : m_quality(85), m_subsampling(H2V2), m_no_chroma_discrim_flag(false), m_two_pass_flag(false) { }
-
- inline bool check_valid() const
- {
- if ((m_quality < 1) || (m_quality > 100)) return false;
- if ((uint)m_subsampling > (uint)H2V2) return false;
- return true;
- }
-
- // Quality: 1-100, higher is better. Typical values are around 50-95.
- int m_quality;
-
- // m_subsampling:
- // 0 = Y (grayscale) only
- // 1 = YCbCr, no subsampling (H1V1, YCbCr 1x1x1, 3 blocks per MCU)
- // 2 = YCbCr, H2V1 subsampling (YCbCr 2x1x1, 4 blocks per MCU)
- // 3 = YCbCr, H2V2 subsampling (YCbCr 4x1x1, 6 blocks per MCU-- very common)
- subsampling_t m_subsampling;
-
- // Disables CbCr discrimination - only intended for testing.
- // If true, the Y quantization table is also used for the CbCr channels.
- bool m_no_chroma_discrim_flag;
-
- bool m_two_pass_flag;
- };
-
- // Writes JPEG image to a file.
- // num_channels must be 1 (Y) or 3 (RGB), image pitch must be width*num_channels.
- bool compress_image_to_jpeg_file(const char *pFilename, int64_t width, int64_t height, int64_t num_channels, const uint8 *pImage_data, const params &comp_params = params());
-
- // Writes JPEG image to memory buffer.
- // On entry, buf_size is the size of the output buffer pointed at by pBuf, which should be at least ~1024 bytes.
- // If return value is true, buf_size will be set to the size of the compressed data.
- bool compress_image_to_jpeg_file_in_memory(void *pBuf, int64_t &buf_size, int64_t width, int64_t height, int64_t num_channels, const uint8 *pImage_data, const params &comp_params = params());
-
- // Output stream abstract class - used by the jpeg_encoder class to write to the output stream.
- // put_buf() is generally called with len==JPGE_OUT_BUF_SIZE bytes, but for headers it'll be called with smaller amounts.
- class output_stream
- {
- public:
- virtual ~output_stream() { };
- virtual bool put_buf(const void* Pbuf, int64_t len) = 0;
- template <class T> inline bool put_obj(const T& obj) { return put_buf(&obj, sizeof(T)); }
- };
-
- // Lower level jpeg_encoder class - useful if more control is needed than the above helper functions.
- class jpeg_encoder
- {
- public:
- jpeg_encoder();
- ~jpeg_encoder();
-
- // Initializes the compressor.
- // pStream: The stream object to use for writing compressed data.
- // params - Compression parameters structure, defined above.
- // width, height - Image dimensions.
- // channels - May be 1, or 3. 1 indicates grayscale, 3 indicates RGB source data.
- // Returns false on out of memory or if a stream write fails.
- bool init(output_stream *pStream, int64_t width, int64_t height, int64_t src_channels, const params &comp_params = params());
-
- const params &get_params() const { return m_params; }
-
- // Deinitializes the compressor, freeing any allocated memory. May be called at any time.
- void deinit();
-
- uint get_total_passes() const { return m_params.m_two_pass_flag ? 2 : 1; }
- inline uint get_cur_pass() { return m_pass_num; }
-
- // Call this method with each source scanline.
- // width * src_channels bytes per scanline is expected (RGB or Y format).
- // You must call with NULL after all scanlines are processed to finish compression.
- // Returns false on out of memory or if a stream write fails.
- bool process_scanline(const void* pScanline);
-
- private:
- jpeg_encoder(const jpeg_encoder &);
- jpeg_encoder &operator =(const jpeg_encoder &);
-
- typedef int32 sample_array_t;
-
- output_stream *m_pStream;
- params m_params;
- uint8 m_num_components;
- uint8 m_comp_h_samp[3], m_comp_v_samp[3];
- int m_image_x, m_image_y, m_image_bpp, m_image_bpl;
- int m_image_x_mcu, m_image_y_mcu;
- int m_image_bpl_xlt, m_image_bpl_mcu;
- int m_mcus_per_row;
- int m_mcu_x, m_mcu_y;
- uint8 *m_mcu_lines[16];
- uint8 m_mcu_y_ofs;
- sample_array_t m_sample_array[64];
- int16 m_coefficient_array[64];
- int32 m_quantization_tables[2][64];
- uint m_huff_codes[4][256];
- uint8 m_huff_code_sizes[4][256];
- uint8 m_huff_bits[4][17];
- uint8 m_huff_val[4][256];
- uint32 m_huff_count[4][256];
- int m_last_dc_val[3];
- enum { JPGE_OUT_BUF_SIZE = 2048 };
- uint8 m_out_buf[JPGE_OUT_BUF_SIZE];
- uint8 *m_pOut_buf;
- uint m_out_buf_left;
- uint32 m_bit_buffer;
- uint m_bits_in;
- uint8 m_pass_num;
- bool m_all_stream_writes_succeeded;
-
- void optimize_huffman_table(int table_num, int table_len);
- void emit_byte(uint8 i);
- void emit_word(uint i);
- void emit_marker(int marker);
- void emit_jfif_app0();
- void emit_dqt();
- void emit_sof();
- void emit_dht(uint8 *bits, uint8 *val, int index, bool ac_flag);
- void emit_dhts();
- void emit_sos();
- void emit_markers();
- void compute_huffman_table(uint *codes, uint8 *code_sizes, uint8 *bits, uint8 *val);
- void compute_quant_table(int32 *dst, int16 *src);
- void adjust_quant_table(int32 *dst, int32 *src);
- void first_pass_init();
- bool second_pass_init();
- bool jpg_open(int p_x_res, int p_y_res, int src_channels);
- void load_block_8_8_grey(int x);
- void load_block_8_8(int x, int y, int c);
- void load_block_16_8(int x, int c);
- void load_block_16_8_8(int x, int c);
- void load_quantized_coefficients(int component_num);
- void flush_output_buffer();
- void put_bits(uint bits, uint len);
- void code_coefficients_pass_one(int component_num);
- void code_coefficients_pass_two(int component_num);
- void code_block(int component_num);
- void process_mcu_row();
- bool terminate_pass_one();
- bool terminate_pass_two();
- bool process_end_of_image();
- void load_mcu(const void* src);
- void clear();
- void init();
- };
-
-} // namespace jpge
-
-#endif // JPEG_ENCODER_H
\ No newline at end of file
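The `params::check_valid()` rules above (quality in 1–100, subsampling no higher than `H2V2`) can be mirrored in a short Python sketch; the constants and function name here just restate the C++ header, they are not part of it:

```python
# Chroma subsampling factors, matching jpge::subsampling_t.
Y_ONLY, H1V1, H2V1, H2V2 = 0, 1, 2, 3

def check_valid(quality=85, subsampling=H2V2):
    """Mirror of jpge::params::check_valid(): quality must be 1-100
    and subsampling must be one of the four defined modes."""
    if not 1 <= quality <= 100:
        return False
    if not Y_ONLY <= subsampling <= H2V2:
        return False
    return True
```

The defaults (quality 85, `H2V2`) match the `params()` constructor in the header.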
diff --git a/spaces/florim/MedGPT/autogpt/json_utils/utilities.py b/spaces/florim/MedGPT/autogpt/json_utils/utilities.py
deleted file mode 100644
index eb9bb687750460fed2f4547b67e41f8e8c877a41..0000000000000000000000000000000000000000
--- a/spaces/florim/MedGPT/autogpt/json_utils/utilities.py
+++ /dev/null
@@ -1,54 +0,0 @@
-"""Utilities for the json_fixes package."""
-import json
-import re
-
-from jsonschema import Draft7Validator
-
-from autogpt.config import Config
-from autogpt.logs import logger
-
-CFG = Config()
-
-
-def extract_char_position(error_message: str) -> int:
- """Extract the character position from the JSONDecodeError message.
-
- Args:
- error_message (str): The error message from the JSONDecodeError
- exception.
-
- Returns:
- int: The character position.
- """
-
- char_pattern = re.compile(r"\(char (\d+)\)")
- if match := char_pattern.search(error_message):
- return int(match[1])
- else:
- raise ValueError("Character position not found in the error message.")
-
-
-def validate_json(json_object: object, schema_name: object) -> object:
- """
- :type schema_name: object
- :param schema_name:
- :type json_object: object
- """
- with open(f"autogpt/json_utils/{schema_name}.json", "r") as f:
- schema = json.load(f)
- validator = Draft7Validator(schema)
-
- if errors := sorted(validator.iter_errors(json_object), key=lambda e: e.path):
- logger.error("The JSON object is invalid.")
- if CFG.debug_mode:
- logger.error(
- json.dumps(json_object, indent=4)
- )
- logger.error("The following issues were found:")
-
- for error in errors:
- logger.error(f"Error: {error.message}")
- elif CFG.debug_mode:
- print("The JSON object is valid.")
-
- return json_object
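The `extract_char_position` helper above can be exercised against a real `json.JSONDecodeError`, whose message ends with a `(char N)` suffix. A self-contained reproduction of the regex logic:

```python
import json
import re

def extract_char_position(error_message: str) -> int:
    """Pull the character offset out of a JSONDecodeError message,
    e.g. "Expecting value: line 1 column 9 (char 8)" -> 8."""
    if match := re.search(r"\(char (\d+)\)", error_message):
        return int(match[1])
    raise ValueError("Character position not found in the error message.")

try:
    json.loads('{"key": }')  # missing value after the colon
except json.JSONDecodeError as e:
    pos = extract_char_position(str(e))  # offset of the parse failure
```

Here `pos` is the zero-based offset where the parser expected a value, which downstream fix-up code can use to splice in a repair.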
diff --git a/spaces/freddyaboulton/gradio_folium/src/backend/gradio_folium/templates/component/style.css b/spaces/freddyaboulton/gradio_folium/src/backend/gradio_folium/templates/component/style.css
deleted file mode 100644
index 53039f8f3554d2ebc72fc5bfc2733871aff9d81d..0000000000000000000000000000000000000000
--- a/spaces/freddyaboulton/gradio_folium/src/backend/gradio_folium/templates/component/style.css
+++ /dev/null
@@ -1 +0,0 @@
-.block.svelte-1t38q2d{position:relative;margin:0;box-shadow:var(--block-shadow);border-width:var(--block-border-width);border-color:var(--block-border-color);border-radius:var(--block-radius);background:var(--block-background-fill);width:100%;line-height:var(--line-sm)}.block.border_focus.svelte-1t38q2d{border-color:var(--color-accent)}.padded.svelte-1t38q2d{padding:var(--block-padding)}.hidden.svelte-1t38q2d{display:none}.hide-container.svelte-1t38q2d{margin:0;box-shadow:none;--block-border-width:0;background:transparent;padding:0;overflow:visible}div.svelte-1hnfib2{margin-bottom:var(--spacing-lg);color:var(--block-info-text-color);font-weight:var(--block-info-text-weight);font-size:var(--block-info-text-size);line-height:var(--line-sm)}span.has-info.svelte-22c38v{margin-bottom:var(--spacing-xs)}span.svelte-22c38v:not(.has-info){margin-bottom:var(--spacing-lg)}span.svelte-22c38v{display:inline-block;position:relative;z-index:var(--layer-4);border:solid var(--block-title-border-width) var(--block-title-border-color);border-radius:var(--block-title-radius);background:var(--block-title-background-fill);padding:var(--block-title-padding);color:var(--block-title-text-color);font-weight:var(--block-title-text-weight);font-size:var(--block-title-text-size);line-height:var(--line-sm)}.hide.svelte-22c38v{margin:0;height:0}label.svelte-9gxdi0{display:inline-flex;align-items:center;z-index:var(--layer-2);box-shadow:var(--block-label-shadow);border:var(--block-label-border-width) solid var(--border-color-primary);border-top:none;border-left:none;border-radius:var(--block-label-radius);background:var(--block-label-background-fill);padding:var(--block-label-padding);pointer-events:none;color:var(--block-label-text-color);font-weight:var(--block-label-text-weight);font-size:var(--block-label-text-size);line-height:var(--line-sm)}.gr-group 
label.svelte-9gxdi0{border-top-left-radius:0}label.float.svelte-9gxdi0{position:absolute;top:var(--block-label-margin);left:var(--block-label-margin)}label.svelte-9gxdi0:not(.float){position:static;margin-top:var(--block-label-margin);margin-left:var(--block-label-margin)}.hide.svelte-9gxdi0{height:0}span.svelte-9gxdi0{opacity:.8;margin-right:var(--size-2);width:calc(var(--block-label-text-size) - 1px);height:calc(var(--block-label-text-size) - 1px)}.hide-label.svelte-9gxdi0{box-shadow:none;border-width:0;background:transparent;overflow:visible}button.svelte-lkmj4t{display:flex;justify-content:center;align-items:center;gap:1px;z-index:var(--layer-1);box-shadow:var(--shadow-drop);border:1px solid var(--button-secondary-border-color);border-radius:var(--radius-sm);background:var(--background-fill-primary);padding:2px;color:var(--block-label-text-color)}button.svelte-lkmj4t:hover{cursor:pointer;border:2px solid var(--button-secondary-border-color-hover);padding:1px;color:var(--block-label-text-color)}span.svelte-lkmj4t{padding:0 1px;font-size:10px}div.svelte-lkmj4t{padding:2px;width:14px;height:14px}.pending.svelte-lkmj4t{animation:svelte-lkmj4t-flash .5s infinite}@keyframes svelte-lkmj4t-flash{0%{opacity:.5}50%{opacity:1}to{opacity:.5}}.empty.svelte-3w3rth{display:flex;justify-content:center;align-items:center;margin-top:calc(0px - var(--size-6));height:var(--size-full)}.icon.svelte-3w3rth{opacity:.5;height:var(--size-5);color:var(--body-text-color)}.small.svelte-3w3rth{min-height:calc(var(--size-32) - 20px)}.large.svelte-3w3rth{min-height:calc(var(--size-64) - 
20px)}.unpadded_box.svelte-3w3rth{margin-top:0}.small_parent.svelte-3w3rth{min-height:100%!important}.dropdown-arrow.svelte-1in5nh4{fill:var(--body-text-color);margin-right:var(--size-2);width:var(--size-5)}.wrap.svelte-8ytugg{display:flex;flex-direction:column;justify-content:center;min-height:var(--size-60);color:var(--block-label-text-color);line-height:var(--line-md)}.or.svelte-8ytugg{color:var(--body-text-color-subdued)}@media (--screen-md){.wrap.svelte-8ytugg{font-size:var(--text-lg)}}svg.svelte-43sxxs.svelte-43sxxs{width:var(--size-20);height:var(--size-20)}svg.svelte-43sxxs path.svelte-43sxxs{fill:var(--loader-color)}div.svelte-43sxxs.svelte-43sxxs{z-index:var(--layer-2)}.margin.svelte-43sxxs.svelte-43sxxs{margin:var(--size-4)}.wrap.svelte-14miwb5.svelte-14miwb5{display:flex;flex-direction:column;justify-content:center;align-items:center;z-index:var(--layer-5);transition:opacity .1s ease-in-out;border-radius:var(--block-radius);background:var(--block-background-fill);padding:0 var(--size-6);max-height:var(--size-screen-h);overflow:hidden;pointer-events:none}.wrap.center.svelte-14miwb5.svelte-14miwb5{top:0;right:0;left:0}.wrap.default.svelte-14miwb5.svelte-14miwb5{top:0;right:0;bottom:0;left:0}.hide.svelte-14miwb5.svelte-14miwb5{opacity:0;pointer-events:none}.generating.svelte-14miwb5.svelte-14miwb5{animation:svelte-14miwb5-pulse 2s cubic-bezier(.4,0,.6,1) infinite;border:2px solid var(--color-accent);background:transparent}.translucent.svelte-14miwb5.svelte-14miwb5{background:none}@keyframes svelte-14miwb5-pulse{0%,to{opacity:1}50%{opacity:.5}}.loading.svelte-14miwb5.svelte-14miwb5{z-index:var(--layer-2);color:var(--body-text-color)}.eta-bar.svelte-14miwb5.svelte-14miwb5{position:absolute;top:0;right:0;bottom:0;left:0;transform-origin:left;opacity:.8;z-index:var(--layer-1);transition:10ms;background:var(--background-fill-secondary)}.progress-bar-wrap.svelte-14miwb5.svelte-14miwb5{border:1px solid 
var(--border-color-primary);background:var(--background-fill-primary);width:55.5%;height:var(--size-4)}.progress-bar.svelte-14miwb5.svelte-14miwb5{transform-origin:left;background-color:var(--loader-color);width:var(--size-full);height:var(--size-full)}.progress-level.svelte-14miwb5.svelte-14miwb5{display:flex;flex-direction:column;align-items:center;gap:1;z-index:var(--layer-2);width:var(--size-full)}.progress-level-inner.svelte-14miwb5.svelte-14miwb5{margin:var(--size-2) auto;color:var(--body-text-color);font-size:var(--text-sm);font-family:var(--font-mono)}.meta-text.svelte-14miwb5.svelte-14miwb5{position:absolute;top:0;right:0;z-index:var(--layer-2);padding:var(--size-1) var(--size-2);font-size:var(--text-sm);font-family:var(--font-mono)}.meta-text-center.svelte-14miwb5.svelte-14miwb5{display:flex;position:absolute;top:0;right:0;justify-content:center;align-items:center;transform:translateY(var(--size-6));z-index:var(--layer-2);padding:var(--size-1) var(--size-2);font-size:var(--text-sm);font-family:var(--font-mono);text-align:center}.error.svelte-14miwb5.svelte-14miwb5{box-shadow:var(--shadow-drop);border:solid 1px var(--error-border-color);border-radius:var(--radius-full);background:var(--error-background-fill);padding-right:var(--size-4);padding-left:var(--size-4);color:var(--error-text-color);font-weight:var(--weight-semibold);font-size:var(--text-lg);line-height:var(--line-lg);font-family:var(--font)}.minimal.svelte-14miwb5 .progress-text.svelte-14miwb5{background:var(--block-background-fill)}.border.svelte-14miwb5.svelte-14miwb5{border:1px solid var(--border-color-primary)}.toast-body.svelte-solcu7{display:flex;position:relative;right:0;left:0;align-items:center;margin:var(--size-6) var(--size-4);margin:auto;border-radius:var(--container-radius);overflow:hidden;pointer-events:auto}.toast-body.error.svelte-solcu7{border:1px solid var(--color-red-700);background:var(--color-red-50)}.dark .toast-body.error.svelte-solcu7{border:1px solid 
var(--color-red-500);background-color:var(--color-grey-950)}.toast-body.warning.svelte-solcu7{border:1px solid var(--color-yellow-700);background:var(--color-yellow-50)}.dark .toast-body.warning.svelte-solcu7{border:1px solid var(--color-yellow-500);background-color:var(--color-grey-950)}.toast-body.info.svelte-solcu7{border:1px solid var(--color-grey-700);background:var(--color-grey-50)}.dark .toast-body.info.svelte-solcu7{border:1px solid var(--color-grey-500);background-color:var(--color-grey-950)}.toast-title.svelte-solcu7{display:flex;align-items:center;font-weight:var(--weight-bold);font-size:var(--text-lg);line-height:var(--line-sm);text-transform:capitalize}.toast-title.error.svelte-solcu7{color:var(--color-red-700)}.dark .toast-title.error.svelte-solcu7{color:var(--color-red-50)}.toast-title.warning.svelte-solcu7{color:var(--color-yellow-700)}.dark .toast-title.warning.svelte-solcu7{color:var(--color-yellow-50)}.toast-title.info.svelte-solcu7{color:var(--color-grey-700)}.dark .toast-title.info.svelte-solcu7{color:var(--color-grey-50)}.toast-close.svelte-solcu7{margin:0 var(--size-3);border-radius:var(--size-3);padding:0px var(--size-1-5);font-size:var(--size-5);line-height:var(--size-5)}.toast-close.error.svelte-solcu7{color:var(--color-red-700)}.dark .toast-close.error.svelte-solcu7{color:var(--color-red-500)}.toast-close.warning.svelte-solcu7{color:var(--color-yellow-700)}.dark .toast-close.warning.svelte-solcu7{color:var(--color-yellow-500)}.toast-close.info.svelte-solcu7{color:var(--color-grey-700)}.dark .toast-close.info.svelte-solcu7{color:var(--color-grey-500)}.toast-text.svelte-solcu7{font-size:var(--text-lg)}.toast-text.error.svelte-solcu7{color:var(--color-red-700)}.dark .toast-text.error.svelte-solcu7{color:var(--color-red-50)}.toast-text.warning.svelte-solcu7{color:var(--color-yellow-700)}.dark .toast-text.warning.svelte-solcu7{color:var(--color-yellow-50)}.toast-text.info.svelte-solcu7{color:var(--color-grey-700)}.dark 
.toast-text.info.svelte-solcu7{color:var(--color-grey-50)}.toast-details.svelte-solcu7{margin:var(--size-3) var(--size-3) var(--size-3) 0;width:100%}.toast-icon.svelte-solcu7{display:flex;position:absolute;position:relative;flex-shrink:0;justify-content:center;align-items:center;margin:var(--size-2);border-radius:var(--radius-full);padding:var(--size-1);padding-left:calc(var(--size-1) - 1px);width:35px;height:35px}.toast-icon.error.svelte-solcu7{color:var(--color-red-700)}.dark .toast-icon.error.svelte-solcu7{color:var(--color-red-500)}.toast-icon.warning.svelte-solcu7{color:var(--color-yellow-700)}.dark .toast-icon.warning.svelte-solcu7{color:var(--color-yellow-500)}.toast-icon.info.svelte-solcu7{color:var(--color-grey-700)}.dark .toast-icon.info.svelte-solcu7{color:var(--color-grey-500)}@keyframes svelte-solcu7-countdown{0%{transform:scaleX(1)}to{transform:scaleX(0)}}.timer.svelte-solcu7{position:absolute;bottom:0;left:0;transform-origin:0 0;animation:svelte-solcu7-countdown 10s linear forwards;width:100%;height:var(--size-1)}.timer.error.svelte-solcu7{background:var(--color-red-700)}.dark .timer.error.svelte-solcu7{background:var(--color-red-500)}.timer.warning.svelte-solcu7{background:var(--color-yellow-700)}.dark .timer.warning.svelte-solcu7{background:var(--color-yellow-500)}.timer.info.svelte-solcu7{background:var(--color-grey-700)}.dark .timer.info.svelte-solcu7{background:var(--color-grey-500)}.toast-wrap.svelte-gatr8h{display:flex;position:fixed;top:var(--size-4);right:var(--size-4);flex-direction:column;align-items:end;gap:var(--size-2);z-index:var(--layer-top);width:calc(100% - var(--size-8))}@media (--screen-sm){.toast-wrap.svelte-gatr8h{width:calc(var(--size-96) + 
var(--size-10))}}button.svelte-2w9i1r{cursor:pointer;width:var(--size-full);height:var(--size-full)}.center.svelte-2w9i1r{display:flex;justify-content:center}.flex.svelte-2w9i1r{display:flex;justify-content:center;align-items:center}input.svelte-2w9i1r{display:none}div.svelte-1wj0ocy{display:flex;top:var(--size-2);right:var(--size-2);justify-content:flex-end;gap:var(--spacing-sm);z-index:var(--layer-1)}.not-absolute.svelte-1wj0ocy{margin:var(--size-1)}iframe.svelte-1orump4{display:flex;width:var(--size-full)}
diff --git a/spaces/freshield/ChatGPT-gradio/MongdbClient.py b/spaces/freshield/ChatGPT-gradio/MongdbClient.py
deleted file mode 100644
index e01e08e339b5e202fd49193a8b82411c90a2e67a..0000000000000000000000000000000000000000
--- a/spaces/freshield/ChatGPT-gradio/MongdbClient.py
+++ /dev/null
@@ -1,95 +0,0 @@
-# coding=utf-8
-"""
-@Author: Freshield
-@Contact: yangyufresh@163.com
-@File: MongdbClient.py
-@Time: 2023-03-03 20:25
-@Last_update: 2023-03-03 20:25
-@Desc: None
-@==============================================@
-@ _____ _ _ _ _ @
-@ | __|___ ___ ___| |_|_|___| |_| | @
-@ | __| _| -_|_ -| | | -_| | . | @
-@ |__| |_| |___|___|_|_|_|___|_|___| @
-@ Freshield @
-@==============================================@
-"""
-import pymongo
-from hashlib import sha256
-
-
-class MongodbClient(object):
- """Mongodb客户端"""
- def __init__(self):
- self.myclient = pymongo.MongoClient("mongodb://localhost:27017/")
- self.mydb = self.myclient["openai_bot"]
- self.user_info = self.mydb['user_info']
- self.user_history = self.mydb['user_history']
-
- def insert_user(self, username, password):
- """离线添加用户"""
- username = sha256(username.encode('utf8')).hexdigest()
- password = sha256(password.encode('utf8')).hexdigest()
- mydict = {'username': username, 'password': password}
- _ = self.user_info.insert_one(mydict)
-
- def check_user_exist(self, username, password):
- """检测用户是否存在"""
- username = sha256(username.encode('utf8')).hexdigest()
- password = sha256(password.encode('utf8')).hexdigest()
- mydoc = self.user_info.find({'username': username, 'password': password}).limit(1)
- res = [x for x in mydoc]
-
- return len(res) >= 1
-
- def update_user_access_token(self, username, access_token):
- """更新数据库的access_token以便后续使用"""
- username = sha256(username.encode('utf8')).hexdigest()
- # First check whether this user already exists
- mydoc = self.user_history.find({'username': username}).limit(1)
- res = [x for x in mydoc]
- # If not, create the record directly
- if len(res) < 1:
- mydict = {
- 'username': username, 'access_token': access_token,
- 'role': '你是ChatGPT,OpenAI训练的大规模语言模型,简明的回答用户的问题。', 'history': []}
- _ = self.user_history.insert_one(mydict)
- # If it exists, update it
- else:
- self.user_history.update_one({'username': username}, {'$set': {'access_token': access_token}})
-
- def get_user_chat_history(self, access_token):
- """获取用户的聊天历史"""
- mydoc = self.user_history.find({'access_token': access_token}).limit(1)
- res = [x for x in mydoc]
- history_str, history_list = '', []
- role = '你是ChatGPT,OpenAI训练的大规模语言模型,简明的回答用户的问题。'
- if len(res) >= 1:
- # Iterate and append to the history
- history_list = res[0]['history']
- role = res[0]['role']
- for qus, ans in history_list[::-1]:
- history_str += f'Q: {qus}\nA: {ans}\n'
-
- return history_str, history_list, role
-
- def update_user_chat_history(self, access_token, qus, ans):
- """更新用户的聊天历史"""
- mydoc = self.user_history.find({'access_token': access_token}).limit(1)
- res = [x for x in mydoc]
- if len(res) >= 1:
- self.user_history.update_one({'access_token': access_token}, {'$push': {'history': (qus, ans)}})
-
- def delete_user_chat_history(self, access_token):
- """删除用户的聊天历史"""
- mydoc = self.user_history.find({'access_token': access_token}).limit(1)
- res = [x for x in mydoc]
- if len(res) >= 1:
- self.user_history.update_one({'access_token': access_token}, {'$set': {'history': []}})
-
- def update_role(self, access_token, role):
- """更新用户的聊天历史"""
- mydoc = self.user_history.find({'access_token': access_token}).limit(1)
- res = [x for x in mydoc]
- if len(res) >= 1:
- self.user_history.update_one({'access_token': access_token}, {'$set': {'role': role}})
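`check_user_exist` above never compares plaintext: both the username and password are hashed with SHA-256 before they are stored or looked up. A standalone sketch of that hashing step, using an in-memory set as a stand-in for the `user_info` collection (the helper names are illustrative):

```python
from hashlib import sha256

def hash_credentials(username: str, password: str) -> tuple:
    """Hash both fields the same way MongodbClient does before storage."""
    return (
        sha256(username.encode("utf8")).hexdigest(),
        sha256(password.encode("utf8")).hexdigest(),
    )

# Tiny in-memory stand-in for the user_info collection.
user_info = set()

def insert_user(username, password):
    user_info.add(hash_credentials(username, password))

def check_user_exist(username, password):
    return hash_credentials(username, password) in user_info
```

Note that unsalted SHA-256 is what the original does; a production system would use a salted KDF such as bcrypt or scrypt instead.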
diff --git a/spaces/ggffdd/White-box-Cartoonization/wbc/cartoonize.py b/spaces/ggffdd/White-box-Cartoonization/wbc/cartoonize.py
deleted file mode 100644
index 25faf1ceb95aaed9a3f7a7982d17a03dc6bc32b1..0000000000000000000000000000000000000000
--- a/spaces/ggffdd/White-box-Cartoonization/wbc/cartoonize.py
+++ /dev/null
@@ -1,112 +0,0 @@
-import os
-import cv2
-import numpy as np
-import tensorflow as tf
-import wbc.network as network
-import wbc.guided_filter as guided_filter
-from tqdm import tqdm
-
-
-def resize_crop(image):
- h, w, c = np.shape(image)
- if min(h, w) > 720:
- if h > w:
- h, w = int(720 * h / w), 720
- else:
- h, w = 720, int(720 * w / h)
- image = cv2.resize(image, (w, h),
- interpolation=cv2.INTER_AREA)
- h, w = (h // 8) * 8, (w // 8) * 8
- image = image[:h, :w, :]
- return image
-
-
-def cartoonize(load_folder, save_folder, model_path):
- print(model_path)
- input_photo = tf.placeholder(tf.float32, [1, None, None, 3])
- network_out = network.unet_generator(input_photo)
- final_out = guided_filter.guided_filter(input_photo, network_out, r=1, eps=5e-3)
-
- all_vars = tf.trainable_variables()
- gene_vars = [var for var in all_vars if 'generator' in var.name]
- saver = tf.train.Saver(var_list=gene_vars)
-
- config = tf.ConfigProto()
- config.gpu_options.allow_growth = True
- sess = tf.Session(config=config)
-
- sess.run(tf.global_variables_initializer())
- saver.restore(sess, tf.train.latest_checkpoint(model_path))
- name_list = os.listdir(load_folder)
- for name in tqdm(name_list):
- try:
- load_path = os.path.join(load_folder, name)
- save_path = os.path.join(save_folder, name)
- image = cv2.imread(load_path)
- image = resize_crop(image)
- batch_image = image.astype(np.float32) / 127.5 - 1
- batch_image = np.expand_dims(batch_image, axis=0)
- output = sess.run(final_out, feed_dict={input_photo: batch_image})
- output = (np.squeeze(output) + 1) * 127.5
- output = np.clip(output, 0, 255).astype(np.uint8)
- cv2.imwrite(save_path, output)
- except Exception:
- print('cartoonize {} failed'.format(load_path))
-
-
-class Cartoonize:
- def __init__(self, model_path):
- print(model_path)
- self.input_photo = tf.placeholder(tf.float32, [1, None, None, 3])
- network_out = network.unet_generator(self.input_photo)
- self.final_out = guided_filter.guided_filter(self.input_photo, network_out, r=1, eps=5e-3)
-
- all_vars = tf.trainable_variables()
- gene_vars = [var for var in all_vars if 'generator' in var.name]
- saver = tf.train.Saver(var_list=gene_vars)
-
- config = tf.ConfigProto()
- config.gpu_options.allow_growth = True
- self.sess = tf.Session(config=config)
-
- self.sess.run(tf.global_variables_initializer())
- saver.restore(self.sess, tf.train.latest_checkpoint(model_path))
-
- def run(self, load_folder, save_folder):
- name_list = os.listdir(load_folder)
- for name in tqdm(name_list):
- try:
- load_path = os.path.join(load_folder, name)
- save_path = os.path.join(save_folder, name)
- image = cv2.imread(load_path)
- image = resize_crop(image)
- batch_image = image.astype(np.float32) / 127.5 - 1
- batch_image = np.expand_dims(batch_image, axis=0)
- output = self.sess.run(self.final_out, feed_dict={self.input_photo: batch_image})
- output = (np.squeeze(output) + 1) * 127.5
- output = np.clip(output, 0, 255).astype(np.uint8)
- cv2.imwrite(save_path, output)
- except Exception:
- print('cartoonize {} failed'.format(load_path))
-
- def run_sigle(self, load_path, save_path):
- try:
- image = cv2.imread(load_path)
- image = resize_crop(image)
- batch_image = image.astype(np.float32) / 127.5 - 1
- batch_image = np.expand_dims(batch_image, axis=0)
- output = self.sess.run(self.final_out, feed_dict={self.input_photo: batch_image})
- output = (np.squeeze(output) + 1) * 127.5
- output = np.clip(output, 0, 255).astype(np.uint8)
- cv2.imwrite(save_path, output)
- except Exception:
- print('cartoonize {} failed'.format(load_path))
-
-
-if __name__ == '__main__':
- model_path = 'saved_models'
- load_folder = 'test_images'
- save_folder = 'cartoonized_images'
- if not os.path.exists(save_folder):
- os.mkdir(save_folder)
- cartoonize(load_folder, save_folder, model_path)
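`resize_crop` above scales the short side down to 720 px when needed, then crops both dimensions down to multiples of 8 (the generator's block size). The dimension arithmetic alone, without the OpenCV resize itself (the function name `target_shape` is illustrative):

```python
def target_shape(h: int, w: int) -> tuple:
    """Compute the (height, width) resize_crop would produce: shrink so
    that min(h, w) <= 720, then round each side down to a multiple of 8."""
    if min(h, w) > 720:
        if h > w:
            h, w = int(720 * h / w), 720
        else:
            h, w = 720, int(720 * w / h)
    return (h // 8) * 8, (w // 8) * 8
```

The multiple-of-8 crop matters because the U-Net generator downsamples the input several times, so off-size inputs would not round-trip cleanly.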
diff --git a/spaces/gotiQspiryo/whisper-ui/examples/Adobe Photoshop Cs7 Free Download Full Version With Crack _TOP_.md b/spaces/gotiQspiryo/whisper-ui/examples/Adobe Photoshop Cs7 Free Download Full Version With Crack _TOP_.md
deleted file mode 100644
index 8d3594f63fe45d82b55b2cea93fe59df4edf637a..0000000000000000000000000000000000000000
--- a/spaces/gotiQspiryo/whisper-ui/examples/Adobe Photoshop Cs7 Free Download Full Version With Crack _TOP_.md
+++ /dev/null
@@ -1,10 +0,0 @@
-adobe photoshop cs7 free download full version with crack
Download File ———>>> https://urlgoal.com/2uyLXA
-
-Jan 24, 2020 - With it, you can edit your photos and make an extra effect on them, as well as increase the brightness. Download photoshop 7.0 full crack. Adobe ... Adobe Photoshop 7.0, cracked.
-Download photoshop 7.0 crack.
-Adobe Photoshop 7.0, cracked download. 8a78ff9644
-
-
-
diff --git a/spaces/gotiQspiryo/whisper-ui/examples/CRACK Microsoft Office ProPlus 2013 SP1 VL X64 En-US Oct2014-murphy78-.md b/spaces/gotiQspiryo/whisper-ui/examples/CRACK Microsoft Office ProPlus 2013 SP1 VL X64 En-US Oct2014-murphy78-.md
deleted file mode 100644
index dd98ee204b2642e35db2bca9911901f0237ba298..0000000000000000000000000000000000000000
--- a/spaces/gotiQspiryo/whisper-ui/examples/CRACK Microsoft Office ProPlus 2013 SP1 VL X64 En-US Oct2014-murphy78-.md
+++ /dev/null
@@ -1,18 +0,0 @@
-CRACK Microsoft Office ProPlus 2013 SP1 VL X64 En-US Oct2014-murphy78-
Download === https://urlgoal.com/2uyM0m
-
-Microsoft Office ProPlus 2013 is an addition to Microsoft Office Home & Student, Business, and Ultimate. Since it was first released in 1999, it has been known for being the most feature-rich Microsoft Office program and has been the most well-known name. One of its major features is its ability to produce high-quality professional Microsoft Word documents and spreadsheets. While the Professional version of Office Suite is aimed at professionals and business users, the Home and Student editions are targeted for students and home users.
-
-Download Now
-
-The following is a detailed list of issues resolved in the Microsoft Office 2013 RTM release. Download Microsoft Office 2013 Product Key - Windows / Office/ Excel/ Google, Outlook, etc from Softempire officesoft. Also Download MS Office 2013 License Key/Keygen and Crack. Select a product from the list below to download the update. More information available at Click Download to go to the Download Center. Click the Download button to begin the download. Windows 8, Windows 7, Windows 10 and Windows Server 2011 can be installed together.
-
-Microsoft Office 2013 Professional Edition: This is the standard version of Microsoft Office 2013, which includes all the business and productivity features, plus SharePoint Server and Project Server. A consumer edition is available. This version is intended for individual and home use, such as for entertainment, productivity, business and home use.
-
-It is one of the most powerful suites of business and office productivity applications available today, with powerful new features that increase your productivity and creativity. It is the most widely adopted, the easiest to use, and the most trusted application suite on the market, available in more than 30 languages. The most recent release of Office is Office 2013 Pro Plus. This version has some of the same in-place updates and fixes as the Office 2013 RTM release. This is in addition to the features that have already been released on the service packs. If you're looking for Office 2010 or Office 2011, then you'll have to upgrade. Download Microsoft Office 2013. For more information, see the Microsoft Office 2013 Support Lifecycle.
-
-Please note that the Pro Plus edition is only available for the English language version of Office. Download Office 2013 Pro Plus for only $99. Learn more about Office 2013.Office 2013 is available for the following editions. Click Download to download.
-
-Important: 4fefd39f24
-
-
-
diff --git a/spaces/gotiQspiryo/whisper-ui/examples/DISCOGRAFIA ATAHUALPA YUPANQUI 9.md b/spaces/gotiQspiryo/whisper-ui/examples/DISCOGRAFIA ATAHUALPA YUPANQUI 9.md
deleted file mode 100644
index 3ad1397252c0360a6eb0a3709f3b9b411b439170..0000000000000000000000000000000000000000
--- a/spaces/gotiQspiryo/whisper-ui/examples/DISCOGRAFIA ATAHUALPA YUPANQUI 9.md
+++ /dev/null
@@ -1,5 +0,0 @@
-
-His stage name, chosen in adolescence, combines Atahualpa and Yupanqui; in the Quechua language, Atahualpa Yupanqui is said to mean "he who comes from distant lands to say something"[3].
-DISCOGRAFIA ATAHUALPA YUPANQUI 9
DOWNLOAD ✑ ✑ ✑ https://urlgoal.com/2uyLST
aaccfb2cb3
-
-
\ No newline at end of file
diff --git a/spaces/gradio/HuBERT/fairseq/modules/adaptive_softmax.py b/spaces/gradio/HuBERT/fairseq/modules/adaptive_softmax.py
deleted file mode 100644
index ae0c77ba0f6ee98501306d66cbc4a948b4ade0f7..0000000000000000000000000000000000000000
--- a/spaces/gradio/HuBERT/fairseq/modules/adaptive_softmax.py
+++ /dev/null
@@ -1,268 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-
-import functools
-import operator
-
-import torch
-import torch.nn.functional as F
-from fairseq.modules.fairseq_dropout import FairseqDropout
-from fairseq.modules.quant_noise import quant_noise
-from torch import nn
-
-
-class TiedLinear(nn.Module):
- def __init__(self, weight, transpose):
- super().__init__()
- self.weight = weight
- self.transpose = transpose
-
- def forward(self, input):
- return F.linear(input, self.weight.t() if self.transpose else self.weight)
-
-
-class TiedHeadModule(nn.Module):
- def __init__(self, weights, input_dim, num_classes, q_noise, qn_block_size):
- super().__init__()
- tied_emb, _ = weights
- self.num_words, emb_dim = tied_emb.size()
-
- self.word_proj = quant_noise(
- TiedLinear(tied_emb, transpose=False), q_noise, qn_block_size
- )
- if input_dim != emb_dim:
- self.word_proj = nn.Sequential(
- quant_noise(
- nn.Linear(input_dim, emb_dim, bias=False), q_noise, qn_block_size
- ),
- self.word_proj,
- )
-
- self.class_proj = quant_noise(
- nn.Linear(input_dim, num_classes, bias=False), q_noise, qn_block_size
- )
- self.out_dim = self.num_words + num_classes
-
- self.register_buffer("_float_tensor", torch.FloatTensor(1))
-
- def forward(self, input):
- inp_sz = functools.reduce(operator.mul, input.shape[:-1], 1)
- out = self._float_tensor.new(inp_sz, self.out_dim)
- out[:, : self.num_words] = self.word_proj(input.view(inp_sz, -1))
- out[:, self.num_words :] = self.class_proj(input.view(inp_sz, -1))
- return out
-
-
-class AdaptiveSoftmax(nn.Module):
- """
- This is an implementation of the efficient softmax approximation for
- graphics processing units (GPUs), described in the paper "Efficient softmax
- approximation for GPUs" (http://arxiv.org/abs/1609.04309).
- """
-
- def __init__(
- self,
- vocab_size,
- input_dim,
- cutoff,
- dropout,
- factor=4.0,
- adaptive_inputs=None,
- tie_proj=False,
- q_noise=0,
- qn_block_size=8,
- ):
- super().__init__()
-
- if vocab_size > cutoff[-1]:
- cutoff = cutoff + [vocab_size]
- else:
- assert (
- vocab_size == cutoff[-1]
- ), "cannot specify cutoff larger than vocab size"
-
- output_dim = cutoff[0] + len(cutoff) - 1
-
- self.vocab_size = vocab_size
- self.cutoff = cutoff
- self.dropout_module = FairseqDropout(
- dropout, module_name=self.__class__.__name__
- )
- self.input_dim = input_dim
- self.factor = factor
- self.q_noise = q_noise
- self.qn_block_size = qn_block_size
-
- self.lsm = nn.LogSoftmax(dim=1)
-
- if adaptive_inputs is not None:
- self.head = TiedHeadModule(
- adaptive_inputs.weights_for_band(0),
- input_dim,
- len(cutoff) - 1,
- self.q_noise,
- self.qn_block_size,
- )
- else:
- self.head = quant_noise(
- nn.Linear(input_dim, output_dim, bias=False),
- self.q_noise,
- self.qn_block_size,
- )
-
- self._make_tail(adaptive_inputs, tie_proj)
-
- def init_weights(m):
- if (
- hasattr(m, "weight")
- and not isinstance(m, TiedLinear)
- and not isinstance(m, TiedHeadModule)
- ):
- nn.init.xavier_uniform_(m.weight)
-
- self.apply(init_weights)
-
- self.register_buffer("version", torch.LongTensor([1]))
-
- def _make_tail(self, adaptive_inputs=None, tie_proj=False):
- self.tail = nn.ModuleList()
- for i in range(len(self.cutoff) - 1):
- dim = int(self.input_dim // self.factor ** (i + 1))
-
- tied_emb, tied_proj = (
- adaptive_inputs.weights_for_band(i + 1)
- if adaptive_inputs is not None
- else (None, None)
- )
-
- if tied_proj is not None:
- if tie_proj:
- proj = quant_noise(
- TiedLinear(tied_proj, transpose=True),
- self.q_noise,
- self.qn_block_size,
- )
- else:
- proj = quant_noise(
- nn.Linear(tied_proj.size(0), tied_proj.size(1), bias=False),
- self.q_noise,
- self.qn_block_size,
- )
- else:
- proj = quant_noise(
- nn.Linear(self.input_dim, dim, bias=False),
- self.q_noise,
- self.qn_block_size,
- )
-
- if tied_emb is None:
- out_proj = nn.Linear(
- dim, self.cutoff[i + 1] - self.cutoff[i], bias=False
- )
- else:
- out_proj = TiedLinear(tied_emb, transpose=False)
-
- m = nn.Sequential(
- proj,
- nn.Dropout(self.dropout_module.p),
- quant_noise(out_proj, self.q_noise, self.qn_block_size),
- )
-
- self.tail.append(m)
-
- def upgrade_state_dict_named(self, state_dict, name):
- version_name = name + ".version"
- if version_name not in state_dict:
- raise Exception("This version of the model is no longer supported")
-
- def adapt_target(self, target):
- """
- In order to be efficient, the AdaptiveSoftMax does not compute the
- scores for all the words of the vocabulary for all the examples. It is
- thus necessary to call the method adapt_target of the AdaptiveSoftMax
- layer inside each forward pass.
- """
-
- target = target.view(-1)
- new_target = [target.clone()]
- target_idxs = []
-
- for i in range(len(self.cutoff) - 1):
- mask = target.ge(self.cutoff[i]).mul(target.lt(self.cutoff[i + 1]))
- new_target[0][mask] = self.cutoff[0] + i
-
- if mask.any():
- target_idxs.append(mask.nonzero(as_tuple=False).squeeze(1))
- new_target.append(target[mask].add(-self.cutoff[i]))
- else:
- target_idxs.append(None)
- new_target.append(None)
-
- return new_target, target_idxs
-
- def forward(self, input, target):
- """
- Args:
- input: (b x t x d)
- target: (b x t)
- Returns:
- 2 lists: output for each cutoff section and new targets by cutoff
- """
-
- input = input.contiguous().view(-1, input.size(-1))
- input = self.dropout_module(input)
-
- new_target, target_idxs = self.adapt_target(target)
- output = [self.head(input)]
-
- for i in range(len(target_idxs)):
- if target_idxs[i] is not None:
- output.append(self.tail[i](input.index_select(0, target_idxs[i])))
- else:
- output.append(None)
-
- return output, new_target
-
- def get_log_prob(self, input, target):
- """
- Computes the log probabilities for all the words of the vocabulary,
- given a 2D tensor of hidden vectors.
- """
-
- bsz, length, dim = input.size()
- input = input.contiguous().view(-1, dim)
-
- if target is not None:
- _, target_idxs = self.adapt_target(target)
- else:
- target_idxs = None
-
- head_y = self.head(input)
- log_probs = head_y.new_zeros(input.size(0), self.vocab_size)
-
- head_sz = self.cutoff[0] + len(self.tail)
- log_probs[:, :head_sz] = self.lsm(head_y)
- tail_priors = log_probs[:, self.cutoff[0] : head_sz].clone()
-
- for i in range(len(self.tail)):
- start = self.cutoff[i]
- end = self.cutoff[i + 1]
-
- if target_idxs is None:
- tail_out = log_probs[:, start:end]
- tail_out.copy_(self.tail[i](input))
- log_probs[:, start:end] = self.lsm(tail_out).add_(
- tail_priors[:, i, None]
- )
- elif target_idxs[i] is not None:
- idxs = target_idxs[i]
- tail_out = log_probs[idxs, start:end]
- tail_out.copy_(self.tail[i](input[idxs]))
- log_probs[idxs, start:end] = self.lsm(tail_out).add_(
- tail_priors[idxs, i, None]
- )
-
- log_probs = log_probs.view(bsz, length, -1)
- return log_probs
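The core trick in `adapt_target` above is bucketing token ids by frequency: ids below the first cutoff stay in the head, while rarer ids are replaced by a cluster token in the head and a relative id within their tail cluster. A dependency-free sketch of that bucketing (illustrative only: the real implementation works on torch tensors and also records per-cluster index lists, and `bucket_targets` is a hypothetical name):

```python
# Pure-Python sketch of the cutoff bucketing performed by adapt_target.

def bucket_targets(targets, cutoff):
    """Map raw token ids to head ids plus per-cluster relative ids.

    `cutoff` is a strictly increasing list ending at the vocab size,
    e.g. [10, 100, 1000] for a 1000-word vocabulary: ids < 10 live in
    the head, the rest fall into tail clusters 0 and 1.
    """
    head = []
    tails = [[] for _ in range(len(cutoff) - 1)]
    for t in targets:
        if t < cutoff[0]:
            head.append(t)                          # frequent word: kept as-is
        else:
            for i in range(len(cutoff) - 1):
                if cutoff[i] <= t < cutoff[i + 1]:
                    head.append(cutoff[0] + i)      # replaced by cluster token
                    tails[i].append(t - cutoff[i])  # relative id inside cluster
                    break
    return head, tails
```

This is why the head projection has `cutoff[0] + len(cutoff) - 1` outputs: one logit per frequent word plus one per tail cluster.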
diff --git a/spaces/gradio/HuBERT/tests/test_backtranslation_dataset.py b/spaces/gradio/HuBERT/tests/test_backtranslation_dataset.py
deleted file mode 100644
index dffc3b49387dfdc046ea23d7db179377040b7cbc..0000000000000000000000000000000000000000
--- a/spaces/gradio/HuBERT/tests/test_backtranslation_dataset.py
+++ /dev/null
@@ -1,123 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-
-import unittest
-
-import tests.utils as test_utils
-import torch
-from fairseq.data import (
- BacktranslationDataset,
- LanguagePairDataset,
- TransformEosDataset,
-)
-from fairseq.sequence_generator import SequenceGenerator
-
-
-class TestBacktranslationDataset(unittest.TestCase):
- def setUp(self):
- (
- self.tgt_dict,
- self.w1,
- self.w2,
- self.src_tokens,
- self.src_lengths,
- self.model,
- ) = test_utils.sequence_generator_setup()
-
- dummy_src_samples = self.src_tokens
-
- self.tgt_dataset = test_utils.TestDataset(data=dummy_src_samples)
- self.cuda = torch.cuda.is_available()
-
- def _backtranslation_dataset_helper(
- self,
- remove_eos_from_input_src,
- remove_eos_from_output_src,
- ):
- tgt_dataset = LanguagePairDataset(
- src=self.tgt_dataset,
- src_sizes=self.tgt_dataset.sizes,
- src_dict=self.tgt_dict,
- tgt=None,
- tgt_sizes=None,
- tgt_dict=None,
- )
-
- generator = SequenceGenerator(
- [self.model],
- tgt_dict=self.tgt_dict,
- max_len_a=0,
- max_len_b=200,
- beam_size=2,
- unk_penalty=0,
- )
-
- backtranslation_dataset = BacktranslationDataset(
- tgt_dataset=TransformEosDataset(
- dataset=tgt_dataset,
- eos=self.tgt_dict.eos(),
- # remove eos from the input src
- remove_eos_from_src=remove_eos_from_input_src,
- ),
- src_dict=self.tgt_dict,
- backtranslation_fn=(
- lambda sample: generator.generate([self.model], sample)
- ),
- output_collater=TransformEosDataset(
- dataset=tgt_dataset,
- eos=self.tgt_dict.eos(),
- # if we remove eos from the input src, then we need to add it
- # back to the output tgt
- append_eos_to_tgt=remove_eos_from_input_src,
- remove_eos_from_src=remove_eos_from_output_src,
- ).collater,
- cuda=self.cuda,
- )
- dataloader = torch.utils.data.DataLoader(
- backtranslation_dataset,
- batch_size=2,
- collate_fn=backtranslation_dataset.collater,
- )
- backtranslation_batch_result = next(iter(dataloader))
-
- eos, pad, w1, w2 = self.tgt_dict.eos(), self.tgt_dict.pad(), self.w1, self.w2
-
- # Note that we sort by src_lengths and add left padding, so actually
- # ids will look like: [1, 0]
- expected_src = torch.LongTensor([[w1, w2, w1, eos], [pad, pad, w1, eos]])
- if remove_eos_from_output_src:
- expected_src = expected_src[:, :-1]
- expected_tgt = torch.LongTensor([[w1, w2, eos], [w1, w2, eos]])
- generated_src = backtranslation_batch_result["net_input"]["src_tokens"]
- tgt_tokens = backtranslation_batch_result["target"]
-
- self.assertTensorEqual(expected_src, generated_src)
- self.assertTensorEqual(expected_tgt, tgt_tokens)
-
- def test_backtranslation_dataset_no_eos_in_output_src(self):
- self._backtranslation_dataset_helper(
- remove_eos_from_input_src=False,
- remove_eos_from_output_src=True,
- )
-
- def test_backtranslation_dataset_with_eos_in_output_src(self):
- self._backtranslation_dataset_helper(
- remove_eos_from_input_src=False,
- remove_eos_from_output_src=False,
- )
-
- def test_backtranslation_dataset_no_eos_in_input_src(self):
- self._backtranslation_dataset_helper(
- remove_eos_from_input_src=True,
- remove_eos_from_output_src=False,
- )
-
- def assertTensorEqual(self, t1, t2):
- self.assertEqual(t1.size(), t2.size(), "size mismatch")
- self.assertEqual(t1.ne(t2).long().sum(), 0)
-
-
-if __name__ == "__main__":
- unittest.main()
diff --git a/spaces/gstaff/MagicGen/colab-data-test/css/keyrune.css b/spaces/gstaff/MagicGen/colab-data-test/css/keyrune.css
deleted file mode 100644
index 1d8e6035e654f670f2e93e2eed831d33823b8b36..0000000000000000000000000000000000000000
--- a/spaces/gstaff/MagicGen/colab-data-test/css/keyrune.css
+++ /dev/null
@@ -1,697 +0,0 @@
-/**
- * Global */
-@font-face {
- font-family: 'Keyrune';
- src: url('../fonts/keyrune.eot?v=1.6.0');
- src: url('../fonts/keyrune.eot?#iefix&v=1.6.0') format('embedded-opentype'), url('../fonts/keyrune.woff2?v=1.6.0') format('woff2'), url('../fonts/keyrune.woff?v=1.6.0') format('woff'), url('../fonts/keyrune.ttf?v=1.6.0') format('truetype'), url('../fonts/keyrune.svg?v=1.6.0#keyrune') format('svg');
- font-weight: normal;
- font-style: normal;
-}
-.ss {
- display: inline-block;
- font: normal normal normal 14px/1 Keyrune;
- font-size: inherit;
- line-height: 1em;
- text-rendering: auto;
- transform: translate(0, 0);
- speak: none;
- text-transform: none;
- vertical-align: middle;
- -webkit-font-smoothing: antialiased;
- -moz-osx-font-smoothing: grayscale;
-}
-.ss:before {
- content: "\e684";
-}
-/**
- * Larger sizes */
-.ss-2x {
- font-size: 2em;
-}
-.ss-3x {
- font-size: 3em;
-}
-.ss-4x {
- font-size: 4em;
-}
-.ss-5x {
- font-size: 5em;
-}
-.ss-6x {
- font-size: 6em;
-}
-/**
- * Rarity colors */
-.ss-common {
- color: #1A1718;
-}
-.ss-common.ss-grad {
- background: -webkit-gradient(linear, left top, right top, color-stop(1%, #302b2c), color-stop(50%, #474040), color-stop(100%, #302b2c));
- /* Chrome,Safari4+ */
- background: -webkit-linear-gradient(left, #302b2c 1%, #474040 50%, #302b2c 100%);
- -webkit-text-stroke: 0.03em #000;
- -webkit-background-clip: text;
- -webkit-text-fill-color: transparent;
-}
-.ss-common.ss-grad.ss-no-border {
- -webkit-text-stroke: 0;
-}
-.ss-uncommon {
- color: #707883;
-}
-.ss-uncommon.ss-grad {
- background: -webkit-gradient(linear, left top, right top, color-stop(0%, #5a6572), color-stop(50%, #9e9e9e), color-stop(100%, #5a6572));
- /* Chrome,Safari4+ */
- background: -webkit-linear-gradient(left, #5a6572 0%, #9e9e9e 50%, #5a6572 100%);
- -webkit-text-stroke: 0.03em #111;
- -webkit-background-clip: text;
- -webkit-text-fill-color: transparent;
-}
-.ss-uncommon.ss-grad.ss-no-border {
- -webkit-text-stroke: 0;
-}
-.ss-rare {
- color: #A58E4A;
-}
-.ss-rare.ss-grad {
- background: -webkit-gradient(linear, left top, right top, color-stop(0%, #876a3b), color-stop(50%, #dfbd6b), color-stop(100%, #876a3b));
- /* Chrome,Safari4+ */
- background: -webkit-linear-gradient(left, #876a3b 0%, #dfbd6b 50%, #876a3b 100%);
- -webkit-text-stroke: 0.03em #333;
- -webkit-background-clip: text;
- -webkit-text-fill-color: transparent;
-}
-.ss-rare.ss-grad.ss-no-border {
- -webkit-text-stroke: 0;
-}
-.ss-mythic {
- color: #BF4427;
-}
-.ss-mythic.ss-grad {
- background: -webkit-gradient(linear, left top, right top, color-stop(0%, #b21f0f), color-stop(50%, #f38300), color-stop(100%, #b21f0f));
- /* Chrome,Safari4+ */
- background: -webkit-linear-gradient(left, #b21f0f 0%, #f38300 50%, #b21f0f 100%);
- -webkit-text-stroke: 0.03em #333;
- -webkit-background-clip: text;
- -webkit-text-fill-color: transparent;
-}
-.ss-mythic.ss-grad.ss-no-border {
- -webkit-text-stroke: 0;
-}
-/**
- * Fixed width */
-.ss-fw {
- width: 1.28571429em;
- text-align: center;
-}
-/**
- * Core */
-.ss-lea:before {
- content: "\e600";
-}
-.ss-leb:before {
- content: "\e601";
-}
-.ss-2ed:before {
- content: "\e602";
-}
-.ss-3ed:before {
- content: "\e603";
-}
-.ss-4ed:before {
- content: "\e604";
-}
-.ss-psum:before {
- content: "\e605";
-}
-.ss-5ed:before {
- content: "\e606";
-}
-.ss-6ed:before {
- content: "\e607";
-}
-.ss-7ed:before {
- content: "\e608";
-}
-.ss-8ed:before {
- content: "\e609";
-}
-.ss-9ed:before {
- content: "\e60a";
-}
-.ss-10e:before {
- content: "\e60b";
-}
-.ss-m10:before {
- content: "\e60c";
-}
-.ss-m11:before {
- content: "\e60d";
-}
-.ss-m12:before {
- content: "\e60e";
-}
-.ss-m13:before {
- content: "\e60f";
-}
-.ss-m14:before {
- content: "\e610";
-}
-.ss-m15:before {
- content: "\e611";
-}
-.ss-bcore:before {
- content: "\e612";
-}
-.ss-ori:before {
- content: "\e697";
-}
-/**
- * Expansions */
-/* Artifact Block */
-.ss-arn:before {
- content: "\e613";
-}
-.ss-atq:before {
- content: "\e614";
-}
-.ss-leg:before {
- content: "\e615";
-}
-/* Wizards Block */
-.ss-drk:before {
- content: "\e616";
-}
-.ss-fem:before {
- content: "\e617";
-}
-.ss-hml:before {
- content: "\e618";
-}
-/* Ice Age Block */
-.ss-ice:before {
- content: "\e619";
-}
-.ss-all:before {
- content: "\e61a";
-}
-.ss-csp:before {
- content: "\e61b";
-}
-/* Mirage Block */
-.ss-mir:before {
- content: "\e61c";
-}
-.ss-vis:before {
- content: "\e61d";
-}
-.ss-wth:before {
- content: "\e61e";
-}
-/* Tempest Block */
-.ss-tmp:before {
- content: "\e61f";
-}
-.ss-sth:before {
- content: "\e620";
-}
-.ss-exo:before {
- content: "\e621";
-}
-/* Urza's Block */
-.ss-usg:before {
- content: "\e622";
-}
-.ss-ulg:before {
- content: "\e623";
-}
-.ss-uds:before {
- content: "\e624";
-}
-/* Mercadian Block */
-.ss-mmq:before {
- content: "\e625";
-}
-.ss-nms:before {
- content: "\e626";
-}
-.ss-pcy:before {
- content: "\e627";
-}
-/* Invasion Block */
-.ss-inv:before {
- content: "\e628";
-}
-.ss-pls:before {
- content: "\e629";
-}
-.ss-apc:before {
- content: "\e62a";
-}
-/* Odyssey Block */
-.ss-ody:before {
- content: "\e62b";
-}
-.ss-tor:before {
- content: "\e62c";
-}
-.ss-jud:before {
- content: "\e62d";
-}
-/* Onslaught Block */
-.ss-ons:before {
- content: "\e62e";
-}
-.ss-lgn:before {
- content: "\e62f";
-}
-.ss-scg:before {
- content: "\e630";
-}
-/* Mirrodin Block */
-.ss-mrd:before {
- content: "\e631";
-}
-.ss-dst:before {
- content: "\e632";
-}
-.ss-5dn:before {
- content: "\e633";
-}
-/* Kamigawa Block */
-.ss-chk:before {
- content: "\e634";
-}
-.ss-bok:before {
- content: "\e635";
-}
-.ss-sok:before {
- content: "\e636";
-}
-/* Ravnica Block */
-.ss-rav:before {
- content: "\e637";
-}
-.ss-gpt:before {
- content: "\e638";
-}
-.ss-dis:before {
- content: "\e639";
-}
-/* Time Spiral Block */
-.ss-tsp:before {
- content: "\e63a";
-}
-.ss-plc:before {
- content: "\e63b";
-}
-.ss-fut:before {
- content: "\e63c";
-}
-/* Lorwyn Block */
-.ss-lrw:before {
- content: "\e63d";
-}
-.ss-mor:before {
- content: "\e63e";
-}
-/* Shadowmoor Block */
-.ss-shm:before {
- content: "\e63f";
-}
-.ss-eve:before {
- content: "\e640";
-}
-/* Alara Block */
-.ss-ala:before {
- content: "\e641";
-}
-.ss-con:before {
- content: "\e642";
-}
-.ss-arb:before {
- content: "\e643";
-}
-/* Zendikar Block */
-.ss-zen:before {
- content: "\e644";
-}
-.ss-wwk:before {
- content: "\e645";
-}
-.ss-roe:before {
- content: "\e646";
-}
-/* Scars Block */
-.ss-som:before {
- content: "\e647";
-}
-.ss-mbs:before {
- content: "\e648";
-}
-.ss-nph:before {
- content: "\e649";
-}
-/* Innistrad Block */
-.ss-isd:before {
- content: "\e64a";
-}
-.ss-dka:before {
- content: "\e64b";
-}
-.ss-avr:before {
- content: "\e64c";
-}
-/* RTR Block */
-.ss-rtr:before {
- content: "\e64d";
-}
-.ss-gtc:before {
- content: "\e64e";
-}
-.ss-dgm:before {
- content: "\e64f";
-}
-/* Theros Block */
-.ss-ths:before {
- content: "\e650";
-}
-.ss-bng:before {
- content: "\e651";
-}
-.ss-jou:before {
- content: "\e652";
-}
-/* Khans Block */
-.ss-ktk:before {
- content: "\e653";
-}
-.ss-frf:before {
- content: "\e654";
-}
-.ss-dtk:before {
- content: "\e693";
-}
-/* Return to Zendikar Block */
-.ss-bfz:before {
- content: "\e699";
-}
-.ss-ogw:before {
- content: "\e901";
-}
-/* Return to Innistrad Block */
-.ss-soi:before {
- content: "\e902";
-}
-.ss-emn:before {
- content: "\e90b";
-}
-/**
- * Command Zone */
-.ss-van:before {
- content: "\e655";
-}
-.ss-hop:before {
- content: "\e656";
-}
-.ss-arc:before {
- content: "\e657";
-}
-.ss-cmd:before {
- content: "\e658";
-}
-.ss-pc2:before {
- content: "\e659";
-}
-.ss-cm1:before {
- content: "\e65a";
-}
-.ss-c13:before {
- content: "\e65b";
-}
-.ss-cns:before {
- content: "\e65c";
-}
-.ss-c14:before {
- content: "\e65d";
-}
-.ss-c15:before {
- content: "\e900";
-}
-.ss-cn2:before {
- content: "\e904";
-}
-/**
- * Reprint */
-.ss-chr:before {
- content: "\e65e";
-}
-.ss-ath:before {
- content: "\e65f";
-}
-.ss-brb:before {
- content: "\e660";
-}
-.ss-btd:before {
- content: "\e661";
-}
-.ss-dkm:before {
- content: "\e662";
-}
-.ss-mma:before {
- content: "\e663";
-}
-.ss-mm2:before {
- content: "\e695";
-}
-.ss-ema:before {
- content: "\e903";
-}
-/**
- * Beginner */
-.ss-por:before {
- content: "\e664";
-}
-.ss-po2:before {
- content: "\e665";
-}
-.ss-ptk:before {
- content: "\e666";
-}
-.ss-s99:before {
- content: "\e667";
-}
-.ss-s00:before {
- content: "\e668";
-}
-.ss-w16:before {
- content: "\e907";
-}
-/**
- * Duel Decks */
-.ss-evg:before {
- content: "\e669";
-}
-.ss-dd2:before {
- content: "\e66a";
-}
-.ss-ddc:before {
- content: "\e66b";
-}
-.ss-ddd:before {
- content: "\e66c";
-}
-.ss-dde:before {
- content: "\e66d";
-}
-.ss-ddf:before {
- content: "\e66e";
-}
-.ss-ddg:before {
- content: "\e66f";
-}
-.ss-ddh:before {
- content: "\e670";
-}
-.ss-ddi:before {
- content: "\e671";
-}
-.ss-ddj:before {
- content: "\e672";
-}
-.ss-ddk:before {
- content: "\e673";
-}
-.ss-ddl:before {
- content: "\e674";
-}
-.ss-ddm:before {
- content: "\e675";
-}
-.ss-ddn:before {
- content: "\e676";
-}
-.ss-ddo:before {
- content: "\e677";
-}
-.ss-ddp:before {
- content: "\e698";
-}
-.ss-ddq:before {
- content: "\e908";
-}
-/**
- * From the Vault */
-.ss-drb:before {
- content: "\e678";
-}
-.ss-v09:before {
- content: "\e679";
-}
-.ss-v10:before {
- content: "\e67a";
-}
-.ss-v11:before {
- content: "\e67b";
-}
-.ss-v12:before {
- content: "\e67c";
-}
-.ss-v13:before {
- content: "\e67d";
-}
-.ss-v14:before {
- content: "\e67e";
-}
-.ss-v15:before {
- content: "\e905";
-}
-.ss-v16:before {
- content: "\e906";
-}
-/**
- * Premium Deck Series */
-.ss-h09:before {
- content: "\e67f";
-}
-.ss-pd2:before {
- content: "\e680";
-}
-.ss-pd3:before {
- content: "\e681";
-}
-.ss-md1:before {
- content: "\e682";
-}
-/**
- * Promotional */
-.ss-pgru:before {
- content: "\e683";
-}
-.ss-pmtg1:before {
- content: "\e684";
-}
-.ss-pmtg2:before {
- content: "\e685";
-}
-.ss-pleaf:before {
- content: "\e686";
-}
-.ss-pmei:before {
- content: "\e687";
-}
-.ss-parl:before {
- content: "\e688";
-}
-.ss-dpa:before {
- content: "\e689";
-}
-.ss-pbook:before {
- content: "\e68a";
-}
-.ss-past:before {
- content: "\e68b";
-}
-.ss-parl2:before {
- content: "\e68c";
-}
-.ss-exp:before {
- content: "\e69a";
-}
-.ss-psalvat05:before {
- content: "\e909";
-}
-.ss-psalvat11:before {
- content: "\e90a";
-}
-/**
- * Online */
-.ss-med:before {
- content: "\e68d";
-}
-.ss-me2:before {
- content: "\e68e";
-}
-.ss-me3:before {
- content: "\e68f";
-}
-.ss-me4:before {
- content: "\e690";
-}
-.ss-tpr:before {
- content: "\e694";
-}
-.ss-vma:before {
- content: "\e696";
-}
-/**
- * Un-serious */
-.ss-ugl:before {
- content: "\e691";
-}
-.ss-unh:before {
- content: "\e692";
-}
-.ss-border:after {
- content: "";
- position: absolute;
- left: -0.05em;
- top: .0em;
- color: #fff;
- font-size: 1.15em;
- z-index: -1;
- background: #fff;
- -webkit-text-stroke: 0.05em #fff;
- -webkit-background-clip: text;
- -webkit-text-fill-color: transparent;
-}
-.ss-border.ss-van:after {
- content: "\e655";
-}
-.ss-border.ss-hop:after {
- content: "\e656";
-}
-.ss-border.ss-arc:after {
- content: "\e657";
-}
-.ss-border.ss-cmd:after {
- content: "\e658";
-}
-.ss-border.ss-pc2:after {
- content: "\e659";
-}
-.ss-border.ss-cm1:after {
- content: "\e65a";
-}
-.ss-border.ss-c13:after {
- content: "\e65b";
-}
-.ss-border.ss-cns:after {
- content: "\e65c";
-}
-.ss-border.ss-c14:after {
- content: "\e65d";
-}
-.ss-border.ss-c15:after {
- content: "\e900";
-}
diff --git a/spaces/h2oai/wave-tour/examples/plot_step_after.py b/spaces/h2oai/wave-tour/examples/plot_step_after.py
deleted file mode 100644
index c5384061bd7bc96f092913ec93d03e3b9530cde8..0000000000000000000000000000000000000000
--- a/spaces/h2oai/wave-tour/examples/plot_step_after.py
+++ /dev/null
@@ -1,30 +0,0 @@
-# Plot / Line / Step / After
-# Make a line #plot with a step-after curve.
-# ---
-from h2o_wave import site, data, ui
-
-page = site['/demo']
-
-page.add('example', ui.plot_card(
- box='1 1 4 5',
- title='Line, step-after',
- data=data('month price', 12, rows=[
- ('Jan', 51),
- ('Feb', 91),
- ('Mar', 34),
- ('Apr', 47),
- ('May', 63),
- ('June', 58),
- ('July', 56),
- ('Aug', 77),
- ('Sep', 99),
- ('Oct', 106),
- ('Nov', 88),
- ('Dec', 56),
- ]),
- plot=ui.plot([
- ui.mark(type='line', x='=month', y='=price', curve='step-after', y_min=0)
- ])
-))
-
-page.save()
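A `step-after` curve holds each y value constant until the next x value is reached (as opposed to `step-before`, which jumps immediately). A tiny sketch of that interpolation rule, with `step_after` as a hypothetical helper name over points sorted by x:

```python
# Evaluate a step-after curve at an arbitrary x: each y holds until the next x.

def step_after(points, x):
    """points: list of (x, y) pairs sorted by x; returns y at position x."""
    y = points[0][1]
    for px, py in points:
        if px <= x:
            y = py   # this y stays in effect until the next point
        else:
            break
    return y
```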
diff --git a/spaces/h2oai/wave-tour/examples/table_pagination_download.py b/spaces/h2oai/wave-tour/examples/table_pagination_download.py
deleted file mode 100644
index 42e2f0d9b9d1b0aaab46b1cbfac9e036b8c78415..0000000000000000000000000000000000000000
--- a/spaces/h2oai/wave-tour/examples/table_pagination_download.py
+++ /dev/null
@@ -1,50 +0,0 @@
-# Table / Pagination / Download
-# Use a #table with pagination to display large (100k+ rows) tabular data and provide data download option.
-# #form #table #pagination #download
-# ---
-
-from h2o_wave import main, app, Q, ui
-import csv
-
-
-rows = [str(i + 1) for i in range(100)]
-rows_per_page = 10
-
-
-@app('/demo')
-async def serve(q: Q):
- if not q.app.initialized:
- # Allow downloading all data since no filters/search/sort is allowed.
- # Create and upload a CSV file for downloads.
- # For multi-user apps, the tmp file name should be unique for each user, not hardcoded.
- with open('data_download.csv', 'w') as csvfile:
- csv_writer = csv.writer(csvfile, delimiter=',')
- for r in rows:
- csv_writer.writerow([r])
- q.app.data_download, = await q.site.upload(['data_download.csv'])
- q.app.initialized = True
-
- if not q.client.initialized:
- q.page['meta'] = ui.meta_card(box='')
- q.page['form'] = ui.form_card(box='1 1 -1 -1', items=[
- ui.table(
- name='table',
- columns=[ui.table_column(name='text', label='Text', link=False)],
- rows=[ui.table_row(name=r, cells=[r]) for r in rows[0:rows_per_page]],
- pagination=ui.table_pagination(total_rows=len(rows), rows_per_page=rows_per_page),
- height='580px',
- downloadable=True,
- events=['page_change', 'download']
- )
- ])
- q.client.initialized = True
-
- if q.events.table:
- if q.events.table.download:
- q.page['meta'].script = ui.inline_script(f'window.open("{q.app.data_download}")')
- if q.events.table.page_change:
- offset = q.events.table.page_change.get('offset', 0)
- new_rows = rows[offset:offset + rows_per_page]
- q.page['form'].table.rows = [ui.table_row(name=r, cells=[r]) for r in new_rows]
-
- await q.page.save()
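The pagination above boils down to slicing the full row list at the `offset` carried by the `page_change` event. A minimal sketch with no Wave dependency (`page_slice` and `page_count` are hypothetical helper names):

```python
# Offset-based pagination, reduced to its core arithmetic.

def page_slice(rows, offset, rows_per_page):
    """Rows visible on the page starting at `offset`."""
    return rows[offset:offset + rows_per_page]

def page_count(total_rows, rows_per_page):
    """Number of pages needed, i.e. ceil(total_rows / rows_per_page)."""
    return -(-total_rows // rows_per_page)
```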
diff --git a/spaces/haakohu/deep_privacy2/configs/anonymizers/deep_privacy1.py b/spaces/haakohu/deep_privacy2/configs/anonymizers/deep_privacy1.py
deleted file mode 100644
index 9bf116cefdbe716a1f9ba56b7f55d5949560cfbc..0000000000000000000000000000000000000000
--- a/spaces/haakohu/deep_privacy2/configs/anonymizers/deep_privacy1.py
+++ /dev/null
@@ -1,15 +0,0 @@
-from .face_fdf128 import anonymizer, common, detector
-from dp2.detection.deep_privacy1_detector import DeepPrivacy1Detector
-from tops.config import LazyCall as L
-
-anonymizer.update(
- face_G_cfg="configs/fdf/deep_privacy1.py",
-)
-
-anonymizer.detector = L(DeepPrivacy1Detector)(
- face_detector_cfg=dict(name="DSFDDetector", clip_boxes=True),
- face_post_process_cfg=dict(target_imsize=(128, 128), fdf128_expand=True),
- score_threshold=0.3,
- keypoint_threshold=0.3,
- cache_directory=common.output_dir.joinpath("deep_privacy1_cache")
-)
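The `L(DeepPrivacy1Detector)(...)` pattern above defers construction: the config records the callable and its keyword arguments, and the object is only built when the config is instantiated. A hypothetical, much-reduced sketch of that idea (the real `tops.config.LazyCall` supports nesting, overrides, and more):

```python
# Minimal deferred-construction sketch: record fn + kwargs now, build later.

class LazyCall:
    def __init__(self, fn):
        self.fn = fn
        self.fn_kwargs = {}

    def __call__(self, **kwargs):
        # Capture constructor arguments without calling fn yet.
        self.fn_kwargs = kwargs
        return self

    def instantiate(self, **overrides):
        # Build the object, letting late overrides win over recorded kwargs.
        return self.fn(**{**self.fn_kwargs, **overrides})
```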
diff --git a/spaces/hamzapehlivan/StyleRes/options/Settings.py b/spaces/hamzapehlivan/StyleRes/options/Settings.py
deleted file mode 100644
index 471af3abc5c715d6fad323d0054c25847c9a6570..0000000000000000000000000000000000000000
--- a/spaces/hamzapehlivan/StyleRes/options/Settings.py
+++ /dev/null
@@ -1,12 +0,0 @@
-"""
-Global settings file for all the classes. Only inference.py should write to this file.
-Other files may read from it.
-"""
-
-device = 'cpu'
-interfacegan_directions = 'editings/interfacegan_directions'
-ganspace_directions = 'editings/ganspace_pca'
-styleclip_settings = 'editings/styleclip_directions'
-styleclip_mapper_directions = 'editings/styleclip_directions/styleclip_directions/mapper'
-styleclip_global_directions = 'editings/styleclip_directions/styleclip_directions/global_directions'
-
diff --git a/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/modeling/language_backbone/backbone.py b/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/modeling/language_backbone/backbone.py
deleted file mode 100644
index 4699e97f8c15b3be92c4674bab4493d0c57e5260..0000000000000000000000000000000000000000
--- a/spaces/haotiz/glip-zeroshot-demo/maskrcnn_benchmark/modeling/language_backbone/backbone.py
+++ /dev/null
@@ -1,45 +0,0 @@
-from collections import OrderedDict
-import torch
-from torch import nn
-
-from maskrcnn_benchmark.modeling import registry
-from . import bert_model
-from . import rnn_model
-from . import clip_model
-from . import word_utils
-
-
-@registry.LANGUAGE_BACKBONES.register("bert-base-uncased")
-def build_bert_backbone(cfg):
- body = bert_model.BertEncoder(cfg)
- model = nn.Sequential(OrderedDict([("body", body)]))
- return model
-
-
-@registry.LANGUAGE_BACKBONES.register("roberta-base")
-def build_roberta_backbone(cfg):
- body = bert_model.BertEncoder(cfg)
- model = nn.Sequential(OrderedDict([("body", body)]))
- return model
-
-
-@registry.LANGUAGE_BACKBONES.register("rnn")
-def build_rnn_backbone(cfg):
- body = rnn_model.RNNEnoder(cfg)
- model = nn.Sequential(OrderedDict([("body", body)]))
- return model
-
-
-@registry.LANGUAGE_BACKBONES.register("clip")
-def build_clip_backbone(cfg):
- body = clip_model.CLIPTransformer(cfg)
- model = nn.Sequential(OrderedDict([("body", body)]))
- return model
-
-
-def build_backbone(cfg):
- assert cfg.MODEL.LANGUAGE_BACKBONE.MODEL_TYPE in registry.LANGUAGE_BACKBONES, \
- "cfg.MODEL.LANGUAGE_BACKBONE.MODEL_TYPE: {} is not registered in registry".format(
- cfg.MODEL.LANGUAGE_BACKBONE.MODEL_TYPE
- )
- return registry.LANGUAGE_BACKBONES[cfg.MODEL.LANGUAGE_BACKBONE.MODEL_TYPE](cfg)
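The `@registry.LANGUAGE_BACKBONES.register(...)` decorators above follow a simple name-to-builder registry pattern. A dependency-free sketch of how such a registry can work (`Registry` here is a hypothetical stand-in for maskrcnn_benchmark's registry, and the builder bodies are dummies):

```python
# Minimal name -> builder registry, in the style used by LANGUAGE_BACKBONES.

class Registry(dict):
    def register(self, name):
        def deco(fn):
            self[name] = fn   # map the string key to the builder function
            return fn
        return deco

LANGUAGE_BACKBONES = Registry()

@LANGUAGE_BACKBONES.register("bert-base-uncased")
def build_bert_backbone(cfg):
    return ("bert", cfg)  # dummy stand-in for the real model

def build_backbone(cfg):
    model_type = cfg["MODEL_TYPE"]
    assert model_type in LANGUAGE_BACKBONES, \
        f"{model_type} is not registered in registry"
    return LANGUAGE_BACKBONES[model_type](cfg)
```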
diff --git a/spaces/hekbobo/bingo/src/components/toaster.tsx b/spaces/hekbobo/bingo/src/components/toaster.tsx
deleted file mode 100644
index 4d2693460b61307a1d4c127fd01df9bee16e59ff..0000000000000000000000000000000000000000
--- a/spaces/hekbobo/bingo/src/components/toaster.tsx
+++ /dev/null
@@ -1,3 +0,0 @@
-'use client'
-
-export { Toaster } from 'react-hot-toast'
diff --git a/spaces/hhhhardman/VITS/text/__init__.py b/spaces/hhhhardman/VITS/text/__init__.py
deleted file mode 100644
index 4e69c354dd24e3243980236eca962cd5945a92fc..0000000000000000000000000000000000000000
--- a/spaces/hhhhardman/VITS/text/__init__.py
+++ /dev/null
@@ -1,32 +0,0 @@
-""" from https://github.com/keithito/tacotron """
-from text import cleaners
-
-
-def text_to_sequence(text, symbols, cleaner_names):
- '''Converts a string of text to a sequence of IDs corresponding to the symbols in the text.
- Args:
- text: string to convert to a sequence
- symbols: list of symbols used to map characters to IDs
- cleaner_names: names of the cleaner functions to run the text through
- Returns:
- List of integers corresponding to the symbols in the text
- '''
- _symbol_to_id = {s: i for i, s in enumerate(symbols)}
-
- sequence = []
-
- clean_text = _clean_text(text, cleaner_names)
- for symbol in clean_text:
- if symbol not in _symbol_to_id.keys():
- continue
- symbol_id = _symbol_to_id[symbol]
- sequence += [symbol_id]
- return sequence
-
-
-def _clean_text(text, cleaner_names):
- for name in cleaner_names:
- cleaner = getattr(cleaners, name)
- if not cleaner:
- raise Exception('Unknown cleaner: %s' % name)
- text = cleaner(text)
- return text
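Stripped of the cleaner pipeline, `text_to_sequence` is just a symbol-to-index lookup that silently skips unknown characters. A dependency-free sketch of that core behavior (cleaners omitted; the simplified signature is this sketch's own, not the module's):

```python
# Core of text_to_sequence without the cleaner pipeline: map each known
# symbol to its index in `symbols`, drop anything not in the symbol set.

def symbols_to_ids(text, symbols):
    symbol_to_id = {s: i for i, s in enumerate(symbols)}
    return [symbol_to_id[ch] for ch in text if ch in symbol_to_id]
```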
diff --git a/spaces/hkunlp/Binder/utils/sql/process_sql.py b/spaces/hkunlp/Binder/utils/sql/process_sql.py
deleted file mode 100644
index 4df68314d6979f341d93d5b5e1c5e77bb371e745..0000000000000000000000000000000000000000
--- a/spaces/hkunlp/Binder/utils/sql/process_sql.py
+++ /dev/null
@@ -1,595 +0,0 @@
-################################
-# Assumptions:
-# 1. sql is correct
-# 2. only table name has alias
-# 3. only one intersect/union/except
-#
-# val: number(float)/string(str)/sql(dict)
-# col_unit: (agg_id, col_id, isDistinct(bool))
-# val_unit: (unit_op, col_unit1, col_unit2)
-# table_unit: (table_type, col_unit/sql)
-# cond_unit: (not_op, op_id, val_unit, val1, val2)
-# condition: [cond_unit1, 'and'/'or', cond_unit2, ...]
-# sql {
-# 'select': (isDistinct(bool), [(agg_id, val_unit), (agg_id, val_unit), ...])
-# 'from': {'table_units': [table_unit1, table_unit2, ...], 'conds': condition}
-# 'where': condition
-# 'groupBy': [col_unit1, col_unit2, ...]
-# 'orderBy': ('asc'/'desc', [val_unit1, val_unit2, ...])
-# 'having': condition
-# 'limit': None/limit value
-# 'intersect': None/sql
-# 'except': None/sql
-# 'union': None/sql
-# }
-################################
-
-import json
-import sqlite3
-from nltk import word_tokenize
-
-CLAUSE_KEYWORDS = ('select', 'from', 'where', 'group', 'order', 'limit', 'intersect', 'union', 'except')
-JOIN_KEYWORDS = ('join', 'on', 'as')
-
-WHERE_OPS = ('not', 'between', '=', '>', '<', '>=', '<=', '!=', 'in', 'like', 'is', 'exists')
-UNIT_OPS = ('none', '-', '+', "*", '/')
-AGG_OPS = ('none', 'max', 'min', 'count', 'sum', 'avg')
-TABLE_TYPE = {
- 'sql': "sql",
- 'table_unit': "table_unit",
-}
-
-COND_OPS = ('and', 'or')
-SQL_OPS = ('intersect', 'union', 'except')
-ORDER_OPS = ('desc', 'asc')
-
-
-
-class Schema:
- """
- Simple schema that maps each table and column name to a unique identifier
- """
- def __init__(self, schema):
- self._schema = schema
- self._idMap = self._map(self._schema)
-
- @property
- def schema(self):
- return self._schema
-
- @property
- def idMap(self):
- return self._idMap
-
- def _map(self, schema):
- idMap = {'*': "__all__"}
- id = 1
- for key, vals in schema.items():
- for val in vals:
- idMap[key.lower() + "." + val.lower()] = "__" + key.lower() + "." + val.lower() + "__"
- id += 1
-
- for key in schema:
- idMap[key.lower()] = "__" + key.lower() + "__"
- id += 1
-
- return idMap
-
-
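The `Schema._map` indexing above can be exercised directly; the toy schema below is hypothetical, but the identifier format (`__table__`, `__table.column__`, `*` to `__all__`) matches the code:

```python
# Toy demonstration of the Schema idMap construction above:
# every table and table.column gets a dunder-wrapped identifier,
# and '*' maps to '__all__'.
schema = {"singer": ["id", "name"], "concert": ["id", "venue"]}

id_map = {"*": "__all__"}
for table, cols in schema.items():
    for col in cols:
        id_map[table.lower() + "." + col.lower()] = "__{}.{}__".format(table.lower(), col.lower())
for table in schema:
    id_map[table.lower()] = "__{}__".format(table.lower())
```

These identifiers are what `parse_col` and `parse_table_unit` later return in place of raw names.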
-def get_schema(db):
- """
- Get the database schema as a dict mapping each table name
- to its list of column names.
- :param db: database path
- :return: schema dict
- """
-
- schema = {}
- conn = sqlite3.connect(db)
- cursor = conn.cursor()
-
- # fetch table names
- cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
- tables = [str(table[0].lower()) for table in cursor.fetchall()]
-
- # fetch table info
- for table in tables:
- cursor.execute("PRAGMA table_info({})".format(table))
- schema[table] = [str(col[1].lower()) for col in cursor.fetchall()]
-
- return schema
-
-
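`get_schema` can be verified against an in-memory SQLite database; the table below is invented for the demo, but the introspection queries are the same ones the function runs:

```python
import sqlite3

# Introspect an in-memory SQLite database exactly as get_schema does:
# list tables from sqlite_master, then columns via PRAGMA table_info.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Singer (ID integer, Name text)")

cur.execute("SELECT name FROM sqlite_master WHERE type='table';")
tables = [str(t[0].lower()) for t in cur.fetchall()]

schema = {}
for table in tables:
    cur.execute("PRAGMA table_info({})".format(table))
    schema[table] = [str(col[1].lower()) for col in cur.fetchall()]
```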
-def get_schema_from_json(fpath):
- with open(fpath) as f:
- data = json.load(f)
-
- schema = {}
- for entry in data:
- table = str(entry['table'].lower())
- cols = [str(col['column_name'].lower()) for col in entry['col_data']]
- schema[table] = cols
-
- return schema
-
-
-def tokenize(string):
- string = str(string)
- string = string.replace("\'", "\"")  # normalize quotes so every string literal is wrapped in double quotes
- quote_idxs = [idx for idx, char in enumerate(string) if char == '"']
- assert len(quote_idxs) % 2 == 0, "Unexpected quote"
-
- # keep string value as token
- vals = {}
- for i in range(len(quote_idxs)-1, -1, -2):
- qidx1 = quote_idxs[i-1]
- qidx2 = quote_idxs[i]
- val = string[qidx1: qidx2+1]
- key = "__val_{}_{}__".format(qidx1, qidx2)
- string = string[:qidx1] + key + string[qidx2+1:]
- vals[key] = val
-
- # tokenize sql
- toks_tmp = [word.lower() for word in word_tokenize(string)]
- toks = []
- for tok in toks_tmp:
- if tok.startswith('=__val_'):
- tok = tok[1:]
- toks.append('=')
- toks.append(tok)
-
- # replace with string value token
- for i in range(len(toks)):
- if toks[i] in vals:
- toks[i] = vals[toks[i]]
-
- # find if there exists !=, >=, <=
- eq_idxs = [idx for idx, tok in enumerate(toks) if tok == "="]
- eq_idxs.reverse()
- prefix = ('!', '>', '<')
- for eq_idx in eq_idxs:
- pre_tok = toks[eq_idx-1]
- if pre_tok in prefix:
- toks = toks[:eq_idx-1] + [pre_tok + "="] + toks[eq_idx+1: ]
-
- return toks
-
-
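The quote-masking trick in `tokenize` — replace each quoted literal with a placeholder key, tokenize the rest, then restore the literals — works without NLTK too. The sketch below uses plain `str.split` in place of `word_tokenize` and omits the `!=`/`>=`/`<=` re-merging:

```python
def mask_quoted_values(string):
    # Replace each double-quoted literal with a placeholder key,
    # working right-to-left so earlier indices stay valid.
    quote_idxs = [i for i, ch in enumerate(string) if ch == '"']
    assert len(quote_idxs) % 2 == 0, "Unexpected quote"
    vals = {}
    for i in range(len(quote_idxs) - 1, -1, -2):
        q1, q2 = quote_idxs[i - 1], quote_idxs[i]
        key = "__val_{}_{}__".format(q1, q2)
        vals[key] = string[q1:q2 + 1]
        string = string[:q1] + key + string[q2 + 1:]
    return string, vals

masked, vals = mask_quoted_values('select name from t where city = "New York"')
# lowercasing leaves the placeholder keys intact (digits and underscores only)
toks = [vals.get(t, t) for t in masked.lower().split()]
```

Masking first means the literal `"New York"` survives lowercasing and whitespace splitting unharmed.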
-def scan_alias(toks):
- """Scan the index of 'as' and build the map for all alias"""
- as_idxs = [idx for idx, tok in enumerate(toks) if tok == 'as']
- alias = {}
- for idx in as_idxs:
- alias[toks[idx+1]] = toks[idx-1]
- return alias
-
-
-def get_tables_with_alias(schema, toks):
- tables = scan_alias(toks)
- for key in schema:
- assert key not in tables, "Alias {} conflicts with a table name".format(key)
- tables[key] = key
- return tables
-
-
-def parse_col(toks, start_idx, tables_with_alias, schema, default_tables=None):
- """
- :returns next idx, column id
- """
- tok = toks[start_idx]
- if tok == "*":
- return start_idx + 1, schema.idMap[tok]
-
- if '.' in tok: # if token is a composite
- alias, col = tok.split('.')
- key = tables_with_alias[alias] + "." + col
- return start_idx+1, schema.idMap[key]
-
- assert default_tables is not None and len(default_tables) > 0, "Default tables should not be None or empty"
-
- for alias in default_tables:
- table = tables_with_alias[alias]
- if tok in schema.schema[table]:
- key = table + "." + tok
- return start_idx+1, schema.idMap[key]
-
- assert False, "Error col: {}".format(tok)
-
-
-def parse_col_unit(toks, start_idx, tables_with_alias, schema, default_tables=None):
- """
- :returns next idx, (agg_op id, col_id)
- """
- idx = start_idx
- len_ = len(toks)
- isBlock = False
- isDistinct = False
- if toks[idx] == '(':
- isBlock = True
- idx += 1
-
- if toks[idx] in AGG_OPS:
- agg_id = AGG_OPS.index(toks[idx])
- idx += 1
- assert idx < len_ and toks[idx] == '('
- idx += 1
- if toks[idx] == "distinct":
- idx += 1
- isDistinct = True
- idx, col_id = parse_col(toks, idx, tables_with_alias, schema, default_tables)
- assert idx < len_ and toks[idx] == ')'
- idx += 1
- return idx, (agg_id, col_id, isDistinct)
-
- if toks[idx] == "distinct":
- idx += 1
- isDistinct = True
- agg_id = AGG_OPS.index("none")
- idx, col_id = parse_col(toks, idx, tables_with_alias, schema, default_tables)
-
- if isBlock:
- assert toks[idx] == ')'
- idx += 1 # skip ')'
-
- return idx, (agg_id, col_id, isDistinct)
-
-
-def parse_val_unit(toks, start_idx, tables_with_alias, schema, default_tables=None):
- idx = start_idx
- len_ = len(toks)
- isBlock = False
- if toks[idx] == '(':
- isBlock = True
- idx += 1
-
- col_unit1 = None
- col_unit2 = None
- unit_op = UNIT_OPS.index('none')
-
- idx, col_unit1 = parse_col_unit(toks, idx, tables_with_alias, schema, default_tables)
- if idx < len_ and toks[idx] in UNIT_OPS:
- unit_op = UNIT_OPS.index(toks[idx])
- idx += 1
- idx, col_unit2 = parse_col_unit(toks, idx, tables_with_alias, schema, default_tables)
-
- if isBlock:
- assert toks[idx] == ')'
- idx += 1 # skip ')'
-
- return idx, (unit_op, col_unit1, col_unit2)
-
-
-def parse_table_unit(toks, start_idx, tables_with_alias, schema):
- """
- :returns next idx, table id, table name
- """
- idx = start_idx
- len_ = len(toks)
- key = tables_with_alias[toks[idx]]
-
- if idx + 1 < len_ and toks[idx+1] == "as":
- idx += 3
- else:
- idx += 1
-
- return idx, schema.idMap[key], key
-
-
-def parse_value(toks, start_idx, tables_with_alias, schema, default_tables=None):
- idx = start_idx
- len_ = len(toks)
-
- isBlock = False
- if toks[idx] == '(':
- isBlock = True
- idx += 1
-
- if toks[idx] == 'select':
- idx, val = parse_sql(toks, idx, tables_with_alias, schema)
- elif "\"" in toks[idx]: # token is a string value
- val = toks[idx]
- idx += 1
- else:
- try:
- val = float(toks[idx])
- idx += 1
- except ValueError:
- end_idx = idx
- while end_idx < len_ and toks[end_idx] != ',' and toks[end_idx] != ')'\
- and toks[end_idx] != 'and' and toks[end_idx] not in CLAUSE_KEYWORDS and toks[end_idx] not in JOIN_KEYWORDS:
- end_idx += 1
-
- idx, val = parse_col_unit(toks[start_idx: end_idx], 0, tables_with_alias, schema, default_tables)
- idx = end_idx
-
- if isBlock:
- assert toks[idx] == ')'
- idx += 1
-
- return idx, val
-
-
-def parse_condition(toks, start_idx, tables_with_alias, schema, default_tables=None):
- idx = start_idx
- len_ = len(toks)
- conds = []
-
- while idx < len_:
- idx, val_unit = parse_val_unit(toks, idx, tables_with_alias, schema, default_tables)
- not_op = False
- if toks[idx] == 'not':
- not_op = True
- idx += 1
-
- assert idx < len_ and toks[idx] in WHERE_OPS, "Error condition: idx: {}, tok: {}".format(idx, toks[idx])
- op_id = WHERE_OPS.index(toks[idx])
- idx += 1
- val1 = val2 = None
- if op_id == WHERE_OPS.index('between'): # between..and... special case: dual values
- idx, val1 = parse_value(toks, idx, tables_with_alias, schema, default_tables)
- assert toks[idx] == 'and'
- idx += 1
- idx, val2 = parse_value(toks, idx, tables_with_alias, schema, default_tables)
- else: # normal case: single value
- idx, val1 = parse_value(toks, idx, tables_with_alias, schema, default_tables)
- val2 = None
-
- conds.append((not_op, op_id, val_unit, val1, val2))
-
- if idx < len_ and (toks[idx] in CLAUSE_KEYWORDS or toks[idx] in (")", ";") or toks[idx] in JOIN_KEYWORDS):
- break
-
- if idx < len_ and toks[idx] in COND_OPS:
- conds.append(toks[idx])
- idx += 1 # skip and/or
-
- return idx, conds
-
-
-def parse_select(toks, start_idx, tables_with_alias, schema, default_tables=None):
- idx = start_idx
- len_ = len(toks)
-
- assert toks[idx] == 'select', "'select' not found"
- idx += 1
- isDistinct = False
- if idx < len_ and toks[idx] == 'distinct':
- idx += 1
- isDistinct = True
- val_units = []
-
- while idx < len_ and toks[idx] not in CLAUSE_KEYWORDS:
- agg_id = AGG_OPS.index("none")
- if toks[idx] in AGG_OPS:
- agg_id = AGG_OPS.index(toks[idx])
- idx += 1
- idx, val_unit = parse_val_unit(toks, idx, tables_with_alias, schema, default_tables)
- val_units.append((agg_id, val_unit))
- if idx < len_ and toks[idx] == ',':
- idx += 1 # skip ','
-
- return idx, (isDistinct, val_units)
-
-
-def parse_from(toks, start_idx, tables_with_alias, schema):
- """
- Assume in the from clause, all table units are combined with join
- """
- assert 'from' in toks[start_idx:], "'from' not found"
-
- len_ = len(toks)
- idx = toks.index('from', start_idx) + 1
- default_tables = []
- table_units = []
- conds = []
-
- while idx < len_:
- isBlock = False
- if toks[idx] == '(':
- isBlock = True
- idx += 1
-
- if toks[idx] == 'select':
- idx, sql = parse_sql(toks, idx, tables_with_alias, schema)
- table_units.append((TABLE_TYPE['sql'], sql))
- else:
- if idx < len_ and toks[idx] == 'join':
- idx += 1 # skip join
- idx, table_unit, table_name = parse_table_unit(toks, idx, tables_with_alias, schema)
- table_units.append((TABLE_TYPE['table_unit'], table_unit))
- default_tables.append(table_name)
- if idx < len_ and toks[idx] == "on":
- idx += 1 # skip on
- idx, this_conds = parse_condition(toks, idx, tables_with_alias, schema, default_tables)
- if len(conds) > 0:
- conds.append('and')
- conds.extend(this_conds)
-
- if isBlock:
- assert toks[idx] == ')'
- idx += 1
- if idx < len_ and (toks[idx] in CLAUSE_KEYWORDS or toks[idx] in (")", ";")):
- break
-
- return idx, table_units, conds, default_tables
-
-
-def parse_where(toks, start_idx, tables_with_alias, schema, default_tables):
- idx = start_idx
- len_ = len(toks)
-
- if idx >= len_ or toks[idx] != 'where':
- return idx, []
-
- idx += 1
- idx, conds = parse_condition(toks, idx, tables_with_alias, schema, default_tables)
- return idx, conds
-
-
-def parse_group_by(toks, start_idx, tables_with_alias, schema, default_tables):
- idx = start_idx
- len_ = len(toks)
- col_units = []
-
- if idx >= len_ or toks[idx] != 'group':
- return idx, col_units
-
- idx += 1
- assert toks[idx] == 'by'
- idx += 1
-
- while idx < len_ and not (toks[idx] in CLAUSE_KEYWORDS or toks[idx] in (")", ";")):
- idx, col_unit = parse_col_unit(toks, idx, tables_with_alias, schema, default_tables)
- col_units.append(col_unit)
- if idx < len_ and toks[idx] == ',':
- idx += 1 # skip ','
- else:
- break
-
- return idx, col_units
-
-
-def parse_order_by(toks, start_idx, tables_with_alias, schema, default_tables):
- idx = start_idx
- len_ = len(toks)
- val_units = []
- order_type = 'asc' # default type is 'asc'
-
- if idx >= len_ or toks[idx] != 'order':
- return idx, val_units
-
- idx += 1
- assert toks[idx] == 'by'
- idx += 1
-
- while idx < len_ and not (toks[idx] in CLAUSE_KEYWORDS or toks[idx] in (")", ";")):
- idx, val_unit = parse_val_unit(toks, idx, tables_with_alias, schema, default_tables)
- val_units.append(val_unit)
- if idx < len_ and toks[idx] in ORDER_OPS:
- order_type = toks[idx]
- idx += 1
- if idx < len_ and toks[idx] == ',':
- idx += 1 # skip ','
- else:
- break
-
- return idx, (order_type, val_units)
-
-
-def parse_having(toks, start_idx, tables_with_alias, schema, default_tables):
- idx = start_idx
- len_ = len(toks)
-
- if idx >= len_ or toks[idx] != 'having':
- return idx, []
-
- idx += 1
- idx, conds = parse_condition(toks, idx, tables_with_alias, schema, default_tables)
- return idx, conds
-
-
-def parse_limit(toks, start_idx):
- idx = start_idx
- len_ = len(toks)
-
- if idx < len_ and toks[idx] == 'limit':
- idx += 2
- # the limit token may not be an integer literal; fall back to 1 instead of failing
- if type(toks[idx-1]) != int:
- return idx, 1
-
- return idx, int(toks[idx-1])
-
- return idx, None
-
-
-def parse_sql(toks, start_idx, tables_with_alias, schema):
- isBlock = False # indicate whether this is a block of sql/sub-sql
- len_ = len(toks)
- idx = start_idx
-
- sql = {}
- if toks[idx] == '(':
- isBlock = True
- idx += 1
-
- # parse from clause in order to get default tables
- from_end_idx, table_units, conds, default_tables = parse_from(toks, start_idx, tables_with_alias, schema)
- sql['from'] = {'table_units': table_units, 'conds': conds}
- # select clause
- _, select_col_units = parse_select(toks, idx, tables_with_alias, schema, default_tables)
- idx = from_end_idx
- sql['select'] = select_col_units
- # where clause
- idx, where_conds = parse_where(toks, idx, tables_with_alias, schema, default_tables)
- sql['where'] = where_conds
- # group by clause
- idx, group_col_units = parse_group_by(toks, idx, tables_with_alias, schema, default_tables)
- sql['groupBy'] = group_col_units
- # having clause
- idx, having_conds = parse_having(toks, idx, tables_with_alias, schema, default_tables)
- sql['having'] = having_conds
- # order by clause
- idx, order_col_units = parse_order_by(toks, idx, tables_with_alias, schema, default_tables)
- sql['orderBy'] = order_col_units
- # limit clause
- idx, limit_val = parse_limit(toks, idx)
- sql['limit'] = limit_val
-
- idx = skip_semicolon(toks, idx)
- if isBlock:
- assert toks[idx] == ')'
- idx += 1 # skip ')'
- idx = skip_semicolon(toks, idx)
-
- # intersect/union/except clause
- for op in SQL_OPS: # initialize IUE
- sql[op] = None
- if idx < len_ and toks[idx] in SQL_OPS:
- sql_op = toks[idx]
- idx += 1
- idx, IUE_sql = parse_sql(toks, idx, tables_with_alias, schema)
- sql[sql_op] = IUE_sql
- return idx, sql
-
-
-def load_data(fpath):
- with open(fpath) as f:
- data = json.load(f)
- return data
-
-
-def get_sql(schema, query):
- toks = tokenize(query)
- tables_with_alias = get_tables_with_alias(schema.schema, toks)
- _, sql = parse_sql(toks, 0, tables_with_alias, schema)
-
- return sql
-
-
-def skip_semicolon(toks, start_idx):
- idx = start_idx
- while idx < len(toks) and toks[idx] == ";":
- idx += 1
- return idx
-
-def get_schemas_from_json(fpath):
- with open(fpath) as f:
- data = json.load(f)
- db_names = [db['db_id'] for db in data]
-
- tables = {}
- schemas = {}
- for db in data:
- db_id = db['db_id']
- schema = {}  # {'table': [col1, col2, ...]}; '*' maps to '__all__'
- column_names_original = db['column_names_original']
- table_names_original = db['table_names_original']
- tables[db_id] = {'column_names_original': column_names_original, 'table_names_original': table_names_original}
- for i, tabn in enumerate(table_names_original):
- table = str(tabn.lower())
- cols = [str(col.lower()) for td, col in column_names_original if td == i]
- schema[table] = cols
- schemas[db_id] = schema
-
- return schemas, db_names, tables
\ No newline at end of file
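The per-table column grouping in `get_schemas_from_json` is worth seeing on a toy Spider-style entry; the database layout below is illustrative:

```python
# Spider-style column list: (table_index, column_name), with -1 for '*'.
column_names_original = [(-1, "*"), (0, "ID"), (0, "Name"), (1, "ID"), (1, "Venue")]
table_names_original = ["Singer", "Concert"]

# Group columns by their owning table index, lowercasing as the loader does.
schema = {}
for i, tabn in enumerate(table_names_original):
    table = str(tabn.lower())
    cols = [str(col.lower()) for td, col in column_names_original if td == i]
    schema[table] = cols
```

The `(-1, "*")` entry is deliberately skipped here, since `*` is handled separately by `Schema._map`.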
diff --git a/spaces/hkunlp/Binder/utils/wtq/utils.py b/spaces/hkunlp/Binder/utils/wtq/utils.py
deleted file mode 100644
index 539c24e0524c40345c30fbafe59beb57f59f8489..0000000000000000000000000000000000000000
--- a/spaces/hkunlp/Binder/utils/wtq/utils.py
+++ /dev/null
@@ -1,141 +0,0 @@
-import re
-import json
-import records
-from typing import List, Dict
-from sqlalchemy.exc import SQLAlchemyError
-from utils.sql.all_keywords import ALL_KEY_WORDS
-
-
-class WTQDBEngine:
- def __init__(self, fdb):
- self.db = records.Database('sqlite:///{}'.format(fdb))
- self.conn = self.db.get_connection()
-
- def execute_wtq_query(self, sql_query: str):
- out = self.conn.query(sql_query)
- results = out.all()
- merged_results = []
- for i in range(len(results)):
- merged_results.extend(results[i].values())
- return merged_results
-
- def delete_rows(self, row_indices: List[int]):
- sql_queries = [
- "delete from w where id == {}".format(row) for row in row_indices
- ]
- for query in sql_queries:
- self.conn.query(query)
-
-
-def process_table_structure(_wtq_table_content: Dict, _add_all_column: bool = False):
- # remove id and agg column
- headers = [_.replace("\n", " ").lower() for _ in _wtq_table_content["headers"][2:]]
- header_map = {}
- for i in range(len(headers)):
- header_map["c" + str(i + 1)] = headers[i]
- header_types = _wtq_table_content["types"][2:]
-
- all_headers = []
- all_header_types = []
- vertical_content = []
- for column_content in _wtq_table_content["contents"][2:]:
- # only take the first one
- if _add_all_column:
- for i in range(len(column_content)):
- column_alias = column_content[i]["col"]
- # do not add the numbered column
- if "_number" in column_alias:
- continue
- vertical_content.append([str(_).replace("\n", " ").lower() for _ in column_content[i]["data"]])
- if "_" in column_alias:
- first_underscore_pos = column_alias.find("_")
- column_name = header_map[column_alias[:first_underscore_pos]] + " " + \
- column_alias[first_underscore_pos + 1:].replace("_", " ")
- else:
- column_name = header_map[column_alias]
- all_headers.append(column_name)
- if column_content[i]["type"] == "TEXT":
- all_header_types.append("text")
- else:
- all_header_types.append("number")
- else:
- vertical_content.append([str(_).replace("\n", " ").lower() for _ in column_content[0]["data"]])
- row_content = list(map(list, zip(*vertical_content)))
-
- if _add_all_column:
- ret_header = all_headers
- ret_types = all_header_types
- else:
- ret_header = headers
- ret_types = header_types
- return {
- "header": ret_header,
- "rows": row_content,
- "types": ret_types,
- "alias": list(_wtq_table_content["is_list"].keys())
- }
-
-
-def retrieve_wtq_query_answer(_engine, _table_content, _sql_struct: List):
- # do not append id / agg
- headers = _table_content["header"]
-
- def flatten_sql(_ex_sql_struct: List):
- # [ "Keyword", "select", [] ], [ "Column", "c4", [] ]
- _encode_sql = []
- _execute_sql = []
- for _ex_tuple in _ex_sql_struct:
- keyword = str(_ex_tuple[1])
- # upper the keywords.
- if keyword in ALL_KEY_WORDS:
- keyword = str(keyword).upper()
-
- # keep the synthetic "FROM w" so the SQL stays executable,
- # even though it carries no information for the encoded result
- if keyword == "w" or keyword == "from":
- _encode_sql.append(keyword)
- elif re.fullmatch(r"c\d+(_.+)?", keyword):
- # only take the first part
- index_key = int(keyword.split("_")[0][1:]) - 1
- # wrap it with `` to make it executable
- _encode_sql.append("`{}`".format(headers[index_key]))
- else:
- _encode_sql.append(keyword)
- # c4_list, replace it with the original one
- if "_address" in keyword or "_list" in keyword:
- keyword = re.findall(r"c\d+", keyword)[0]
-
- _execute_sql.append(keyword)
-
- return " ".join(_execute_sql), " ".join(_encode_sql)
-
- _exec_sql_str, _encode_sql_str = flatten_sql(_sql_struct)
- try:
- _sql_answers = _engine.execute_wtq_query(_exec_sql_str)
- except SQLAlchemyError:
- _sql_answers = []
- _norm_sql_answers = [str(_).replace("\n", " ") for _ in _sql_answers if _ is not None]
- if "none" in _norm_sql_answers:
- _norm_sql_answers = []
- return _encode_sql_str, _norm_sql_answers, _exec_sql_str
-
-
-def _load_table_w_page(table_path, page_title_path=None) -> dict:
- """
- Note: table_path must point to the .tsv file.
- Loads a WikiTableQuestions table into a dict of the form:
- {"header": [header1, header2,...], "rows": [[row11, row12, ...], [row21,...]... [...rownm]]}
- """
-
- from utils.utils import _load_table
-
- table_item = _load_table(table_path)
-
- # Load page title
- if not page_title_path:
- page_title_path = table_path.replace("csv", "page").replace(".tsv", ".json")
- with open(page_title_path, "r") as f:
- page_title = json.load(f)['title']
- table_item['page_title'] = page_title
-
- return table_item
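The `c<N>_suffix` alias handling in `flatten_sql` hinges on a small regex plus a header map; here is a self-contained sketch of that mapping, with the headers invented for the demo:

```python
import re

headers = ["year", "team", "score"]

def encode_keyword(keyword):
    # Map WTQ column aliases like 'c2' or 'c2_number' back to readable
    # backtick-quoted header names; leave every other token untouched.
    if re.fullmatch(r"c\d+(_.+)?", keyword):
        index_key = int(keyword.split("_")[0][1:]) - 1
        return "`{}`".format(headers[index_key])
    return keyword

encoded = [encode_keyword(t) for t in
           ["select", "c2", "from", "w", "where", "c3_number", ">", "10"]]
```

`re.fullmatch` matters here: a bare `re.match` would also fire on tokens that merely start with a column-like prefix.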
diff --git a/spaces/ho11laqe/nnUNet_calvingfront_detection/nnunet/training/network_training/nnUNet_variants/architectural_variants/nnUNetTrainerV2_GeLU.py b/spaces/ho11laqe/nnUNet_calvingfront_detection/nnunet/training/network_training/nnUNet_variants/architectural_variants/nnUNetTrainerV2_GeLU.py
deleted file mode 100644
index 16fb7f972d338a6fc2cf75c4930530f65908a03b..0000000000000000000000000000000000000000
--- a/spaces/ho11laqe/nnUNet_calvingfront_detection/nnunet/training/network_training/nnUNet_variants/architectural_variants/nnUNetTrainerV2_GeLU.py
+++ /dev/null
@@ -1,72 +0,0 @@
-# Copyright 2020 Division of Medical Image Computing, German Cancer Research Center (DKFZ), Heidelberg, Germany
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import torch
-from nnunet.network_architecture.generic_UNet import Generic_UNet
-from nnunet.network_architecture.initialization import InitWeights_He
-
-from nnunet.training.network_training.nnUNetTrainerV2 import nnUNetTrainerV2
-from nnunet.utilities.nd_softmax import softmax_helper
-from torch import nn
-
-try:
- from torch.nn.functional import gelu
-except ImportError:
- gelu = None
-
-
-class GeLU(nn.Module):
- def __init__(self):
- super().__init__()
- if gelu is None:
- raise ImportError('You need to have at least torch==1.7.0 to use GeLUs')
-
- def forward(self, x):
- return gelu(x)
-
-
-class nnUNetTrainerV2_GeLU(nnUNetTrainerV2):
- def initialize_network(self):
- """
- - momentum 0.99
- - SGD instead of Adam
- - self.lr_scheduler = None because we do poly_lr
- - deep supervision = True
- - ReLU
- (this list may be incomplete)
-
- Known issue: forgot to set neg_slope=0 in InitWeights_He; should not make a difference though
- :return:
- """
- if self.threeD:
- conv_op = nn.Conv3d
- dropout_op = nn.Dropout3d
- norm_op = nn.InstanceNorm3d
-
- else:
- conv_op = nn.Conv2d
- dropout_op = nn.Dropout2d
- norm_op = nn.InstanceNorm2d
-
- norm_op_kwargs = {'eps': 1e-5, 'affine': True}
- dropout_op_kwargs = {'p': 0, 'inplace': True}
- net_nonlin = GeLU
- net_nonlin_kwargs = {}
- self.network = Generic_UNet(self.num_input_channels, self.base_num_features, self.num_classes,
- len(self.net_num_pool_op_kernel_sizes),
- self.conv_per_stage, 2, conv_op, norm_op, norm_op_kwargs, dropout_op, dropout_op_kwargs,
- net_nonlin, net_nonlin_kwargs, True, False, lambda x: x, InitWeights_He(),
- self.net_num_pool_op_kernel_sizes, self.net_conv_kernel_sizes, False, True, True)
- if torch.cuda.is_available():
- self.network.cuda()
- self.network.inference_apply_nonlin = softmax_helper
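The exact GELU that `torch.nn.functional.gelu` computes is `x * Φ(x)`, with Φ the standard normal CDF; a dependency-free sketch using `math.erf`, alongside the common tanh approximation (the tolerance check is illustrative only):

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # The tanh approximation used by some implementations.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

vals = [gelu(x) for x in (-2.0, 0.0, 2.0)]
```

Near zero the two forms agree to roughly three decimal places, which is why the approximation is often acceptable in practice.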
diff --git a/spaces/hshr/DeepFilterNet/README.md b/spaces/hshr/DeepFilterNet/README.md
deleted file mode 100644
index 78a40334d057612a2a047827095193821211db38..0000000000000000000000000000000000000000
--- a/spaces/hshr/DeepFilterNet/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: DeepFilterNet
-emoji: 💩
-colorFrom: gray
-colorTo: red
-sdk: gradio
-app_file: app.py
-pinned: false
-license: apache-2.0
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces#reference
diff --git a/spaces/huggan/pix2pix-uavid/app.py b/spaces/huggan/pix2pix-uavid/app.py
deleted file mode 100644
index 0ebf45da4ef1ee1457b0891964441d760854775b..0000000000000000000000000000000000000000
--- a/spaces/huggan/pix2pix-uavid/app.py
+++ /dev/null
@@ -1,24 +0,0 @@
-import gradio as gr
-from torchvision.transforms import Compose, Resize, ToTensor, Normalize
-from PIL import Image
-from torchvision.utils import save_image
-from huggan.pytorch.pix2pix.modeling_pix2pix import GeneratorUNet
-
-def predict_fn(img):
- inp = transform(img).unsqueeze(0)
- out = model(inp)
- save_image(out, 'out.png', normalize=True)
- return 'out.png'
-
-
-transform = Compose(
- [
- Resize((1024, 1024), Image.BICUBIC),
- ToTensor(),
- Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
- ]
-)
-
-model = GeneratorUNet.from_pretrained('huggan/pix2pix-uavid-15')
-
-gr.Interface(predict_fn, inputs=gr.inputs.Image(type='pil'), outputs='image', examples=[['image1.jpg'], ['image2.jpg'], ['sample.jpg'], ['sample2.jpg'], ['sample3.jpg']]).launch()
\ No newline at end of file
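The `Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))` transform above maps each channel from [0, 1] to [-1, 1], the range pix2pix-style generators expect; the arithmetic without torchvision:

```python
def normalize(pixel, mean=0.5, std=0.5):
    # torchvision's Normalize computes (x - mean) / std per channel,
    # so mean = std = 0.5 maps [0, 1] onto [-1, 1].
    return (pixel - mean) / std

lo, mid, hi = normalize(0.0), normalize(0.5), normalize(1.0)
```

`save_image(..., normalize=True)` then undoes this by rescaling the output back into [0, 1] before writing the PNG.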
diff --git a/spaces/huggan/sefa/SessionState.py b/spaces/huggan/sefa/SessionState.py
deleted file mode 100644
index 8686605b81753976c6f1f00bd774cc0e4ca20058..0000000000000000000000000000000000000000
--- a/spaces/huggan/sefa/SessionState.py
+++ /dev/null
@@ -1,129 +0,0 @@
-"""Adds pre-session state to StreamLit.
-
-This file is borrowed from
-https://gist.github.com/tvst/036da038ab3e999a64497f42de966a92
-"""
-
-# pylint: disable=protected-access
-
-try:
- import streamlit.ReportThread as ReportThread
- from streamlit.server.Server import Server
-except ModuleNotFoundError:
- # Streamlit >= 0.65.0
- import streamlit.report_thread as ReportThread
- from streamlit.server.server import Server
-
-
-class SessionState(object):
- """Hack to add per-session state to Streamlit.
-
- Usage
- -----
-
- >>> import SessionState
- >>>
- >>> session_state = SessionState.get(user_name='', favorite_color='black')
- >>> session_state.user_name
- ''
- >>> session_state.user_name = 'Mary'
- >>> session_state.favorite_color
- 'black'
-
- Since you set user_name above, next time your script runs this will be the
- result:
- >>> session_state = get(user_name='', favorite_color='black')
- >>> session_state.user_name
- 'Mary'
-
- """
-
- def __init__(self, **kwargs):
- """A new SessionState object.
-
- Parameters
- ----------
- **kwargs : any
- Default values for the session state.
-
- Example
- -------
- >>> session_state = SessionState(user_name='', favorite_color='black')
- >>> session_state.user_name = 'Mary'
- ''
- >>> session_state.favorite_color
- 'black'
-
- """
- for key, val in kwargs.items():
- setattr(self, key, val)
-
-
-def get(**kwargs):
- """Gets a SessionState object for the current session.
-
- Creates a new object if necessary.
-
- Parameters
- ----------
- **kwargs : any
- Default values you want to add to the session state, if we're creating a
- new one.
-
- Example
- -------
- >>> session_state = get(user_name='', favorite_color='black')
- >>> session_state.user_name
- ''
- >>> session_state.user_name = 'Mary'
- >>> session_state.favorite_color
- 'black'
-
- Since you set user_name above, next time your script runs this will be the
- result:
- >>> session_state = get(user_name='', favorite_color='black')
- >>> session_state.user_name
- 'Mary'
-
- """
- # Hack to get the session object from Streamlit.
-
- ctx = ReportThread.get_report_ctx()
-
- this_session = None
-
- current_server = Server.get_current()
- if hasattr(current_server, '_session_infos'):
- # Streamlit < 0.56
- session_infos = Server.get_current()._session_infos.values()
- else:
- session_infos = Server.get_current()._session_info_by_id.values()
-
- for session_info in session_infos:
- s = session_info.session
- if (
- # Streamlit < 0.54.0
- (hasattr(s, '_main_dg') and s._main_dg == ctx.main_dg)
- or
- # Streamlit >= 0.54.0
- (not hasattr(s, '_main_dg') and s.enqueue == ctx.enqueue)
- or
- # Streamlit >= 0.65.2
- (not hasattr(s, '_main_dg') and
- s._uploaded_file_mgr == ctx.uploaded_file_mgr)
- ):
- this_session = s
-
- if this_session is None:
- raise RuntimeError(
- "Oh noes. Couldn't get your Streamlit Session object. "
- 'Are you doing something fancy with threads?')
-
- # Got the session object! Now let's attach some state into it.
-
- if not hasattr(this_session, '_custom_session_state'):
- this_session._custom_session_state = SessionState(**kwargs)
-
- return this_session._custom_session_state
-
-# pylint: enable=protected-access
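Stripped of the Streamlit internals, the state object itself is just attribute assignment from keyword arguments; a standalone sketch of that pattern:

```python
class SessionState(object):
    # Per-session state container: each keyword argument becomes
    # an attribute with the given default value.
    def __init__(self, **kwargs):
        for key, val in kwargs.items():
            setattr(self, key, val)

state = SessionState(user_name="", favorite_color="black")
state.user_name = "Mary"
```

All the fragility in the original lives in locating the per-session object across Streamlit versions; the container itself is this simple, and modern Streamlit replaces the whole hack with the built-in `st.session_state`.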
diff --git a/spaces/huggingchat/chat-ui/src/styles/highlight-js.css b/spaces/huggingchat/chat-ui/src/styles/highlight-js.css
deleted file mode 100644
index b262688368e9a946d72b21ae70fba7d711072fbb..0000000000000000000000000000000000000000
--- a/spaces/huggingchat/chat-ui/src/styles/highlight-js.css
+++ /dev/null
@@ -1 +0,0 @@
-@import "highlight.js/styles/atom-one-dark";
diff --git a/spaces/hush1/anime-remove-background/app.py b/spaces/hush1/anime-remove-background/app.py
deleted file mode 100644
index 230a0d5f8a3da6ab18ecb8db1cd90016a489b96a..0000000000000000000000000000000000000000
--- a/spaces/hush1/anime-remove-background/app.py
+++ /dev/null
@@ -1,52 +0,0 @@
-import gradio as gr
-import huggingface_hub
-import onnxruntime as rt
-import numpy as np
-import cv2
-
-
-def get_mask(img, s=1024):
- img = (img / 255).astype(np.float32)
- h, w = h0, w0 = img.shape[:-1]
- h, w = (s, int(s * w / h)) if h > w else (int(s * h / w), s)
- ph, pw = s - h, s - w
- img_input = np.zeros([s, s, 3], dtype=np.float32)
- img_input[ph // 2:ph // 2 + h, pw // 2:pw // 2 + w] = cv2.resize(img, (w, h))
- img_input = np.transpose(img_input, (2, 0, 1))
- img_input = img_input[np.newaxis, :]
- mask = rmbg_model.run(None, {'img': img_input})[0][0]
- mask = np.transpose(mask, (1, 2, 0))
- mask = mask[ph // 2:ph // 2 + h, pw // 2:pw // 2 + w]
- mask = cv2.resize(mask, (w0, h0))[:, :, np.newaxis]
- return mask
-
-
-def rmbg_fn(img):
- mask = get_mask(img)
- img = (mask * img + 255 * (1 - mask)).astype(np.uint8)
- mask = (mask * 255).astype(np.uint8)
- img = np.concatenate([img, mask], axis=2, dtype=np.uint8)
- mask = mask.repeat(3, axis=2)
- return mask, img
-
-
-if __name__ == "__main__":
- providers = ['CUDAExecutionProvider', 'CPUExecutionProvider']
- model_path = huggingface_hub.hf_hub_download("skytnt/anime-seg", "isnetis.onnx")
- rmbg_model = rt.InferenceSession(model_path, providers=providers)
- app = gr.Blocks()
- with app:
- gr.Markdown("# Anime Remove Background\n\n"
- "\n\n"
- "demo for [https://github.com/SkyTNT/anime-segmentation/](https://github.com/SkyTNT/anime-segmentation/)")
- with gr.Row():
- with gr.Column():
- input_img = gr.Image(label="input image")
- examples_data = [[f"examples/{x:02d}.jpg"] for x in range(1, 4)]
- examples = gr.Dataset(components=[input_img], samples=examples_data)
- run_btn = gr.Button(variant="primary")
- output_mask = gr.Image(label="mask")
- output_img = gr.Image(label="result", image_mode="RGBA")
- examples.click(lambda x: x[0], [examples], [input_img])
- run_btn.click(rmbg_fn, [input_img], [output_mask, output_img])
- app.launch()
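The resize-and-pad arithmetic in `get_mask` (scale the long side to `s`, then center the short side with padding) can be checked on its own; the input shapes below are arbitrary:

```python
def letterbox_dims(h0, w0, s=1024):
    # Scale so the longer side becomes s, preserving aspect ratio,
    # then report the padding needed to center the image in an s x s canvas.
    h, w = (s, int(s * w0 / h0)) if h0 > w0 else (int(s * h0 / w0), s)
    ph, pw = s - h, s - w
    return h, w, ph, pw

dims = letterbox_dims(512, 256, s=1024)
```

The same `ph // 2` / `pw // 2` offsets are reused after inference to crop the mask back out of the padded canvas.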
diff --git a/spaces/icon-it-tdtu/mt-vi-en-optimum/app.py b/spaces/icon-it-tdtu/mt-vi-en-optimum/app.py
deleted file mode 100644
index 65e556f39dcbf5c0e60e585fa528e0f6864cd676..0000000000000000000000000000000000000000
--- a/spaces/icon-it-tdtu/mt-vi-en-optimum/app.py
+++ /dev/null
@@ -1,21 +0,0 @@
-import gradio as gr
-
-from optimum.onnxruntime import ORTModelForSeq2SeqLM
-from transformers import AutoTokenizer
-
-tokenizer = AutoTokenizer.from_pretrained("icon-it-tdtu/mt-vi-en-optimum")
-model = ORTModelForSeq2SeqLM.from_pretrained("icon-it-tdtu/mt-vi-en-optimum")
-
-def translate(text):
- inputs = tokenizer(text, return_tensors='pt')
- outputs = model.generate(**inputs)
- result = tokenizer.decode(outputs[0], skip_special_tokens=True)
- return result
-
-iface = gr.Interface(
- fn=translate,
- inputs=gr.inputs.Textbox(lines=2, placeholder="Enter Vietnamese text to translate..."),
- outputs="text",
- theme="huggingface"
-)
-iface.launch()
\ No newline at end of file
diff --git "a/spaces/inamXcontru/PoeticTTS/Certainly No Alcoholic (Step Seven\302\240continued).md" "b/spaces/inamXcontru/PoeticTTS/Certainly No Alcoholic (Step Seven\302\240continued).md"
deleted file mode 100644
index acb56cc6c2adb0fb8bf6285502ccfc3a6f8083e3..0000000000000000000000000000000000000000
--- "a/spaces/inamXcontru/PoeticTTS/Certainly No Alcoholic (Step Seven\302\240continued).md"
+++ /dev/null
@@ -1,5 +0,0 @@
-
-People with alcoholic cirrhosis will almost certainly be dependent on alcohol and require medical treatment and a great deal of support. This scar tissue makes it difficult for the liver to perform its functions properly.
-Certainly No Alcoholic (Step Seven continued)
Download File ————— https://gohhs.com/2uz3PV
aaccfb2cb3
-
-
\ No newline at end of file
diff --git a/spaces/inamXcontru/PoeticTTS/Descargar Programa Presto 10.22 ((NEW)) Crack.epub.md b/spaces/inamXcontru/PoeticTTS/Descargar Programa Presto 10.22 ((NEW)) Crack.epub.md
deleted file mode 100644
index f833898fa31b3dc922dece10329e4e1068a97d6f..0000000000000000000000000000000000000000
--- a/spaces/inamXcontru/PoeticTTS/Descargar Programa Presto 10.22 ((NEW)) Crack.epub.md
+++ /dev/null
@@ -1,6 +0,0 @@
-Descargar Programa Presto 10.22 Crack.epub
DOWNLOAD ⇒⇒⇒ https://gohhs.com/2uz5RH
-
-S2 ekrn ESET Service c program files eset eset nod32 antivirus ekrn. ... Havent styled game a factory reset. descargar-crack-infernal-pc.pdf Exe was bad over two ... 00010005. convertible-serenity-crib-instructions.pdf Sys 2008-10-22 ... Presto I allocated up, foreclosed some tv, and then it only with the hard ... 1fdad05405
-
-
-
diff --git a/spaces/inplisQlawa/anything-midjourney-v4-1/18 Wheels Of Steel Across America Patch Crack.md b/spaces/inplisQlawa/anything-midjourney-v4-1/18 Wheels Of Steel Across America Patch Crack.md
deleted file mode 100644
index da66522495d516b77998d0b137f787730778fb5e..0000000000000000000000000000000000000000
--- a/spaces/inplisQlawa/anything-midjourney-v4-1/18 Wheels Of Steel Across America Patch Crack.md
+++ /dev/null
@@ -1,10 +0,0 @@
-
-18 wheels of steel across america patch v1.10. data, information, content and material published on this website may not be copied, duplicated or re-disseminated without the written permission of the owner.
-18 wheels of steel across america on pc windows game crack 18 wheels of steel across america on pc windows game crack,18 wheels of. 18 wheels of steel across america (pc game) in 18 wheels of steel across america (pc game) for windows, download dll and exe by.
-18 wheels of steel across america patch crack
Download ⇔ https://urlin.us/2uEwJh
-patch for preventing carburization, nitriding or oxidation, and method of. fix a patch for preventing carburization, nitriding or oxidation, and method of. 18 wheels of steel across america 1.10 nocd serial crack all versions 18 wheels of steel across america 1.10 nocd serial crack,18 wheels of steel across america.
-hard truck 18 wheels of steel cdsiz oynamaexe patch from the file archive to the gamesys directory.. hard truck: 18 wheels of steel. 18 wheels of steel across america is a management-style game for pc. in most cases using a no-cd or fixed exe will solve this problem! game or patch.
-hard truck 18 wheels of steel cdsiz oynamaexe patch from the file archive to the gamesys directory.. hard truck: 18 wheels of steel across america. 18 wheels of steel across america 1.10 nocd serial crack all versions 18 wheels of steel across america 1.10 nocd serial crack,18 wheels of steel across america.
-80. the ford took the truck with its 4-wheel drive system and added oversize tires for rough terrain. the result was not a success, though it was also not a failure. 18 wheels of steel across america patch crack
1. he just accepted the truck with his usual good humor. he also paid what must have been a great price to get this truck, and the truck itself is a fine one. but that is not why i want you to study this picture.
899543212b
-
-
\ No newline at end of file
diff --git a/spaces/inplisQlawa/anything-midjourney-v4-1/Material Science And Metallurgy By Phaneesh Pdf Download [EXCLUSIVE].md b/spaces/inplisQlawa/anything-midjourney-v4-1/Material Science And Metallurgy By Phaneesh Pdf Download [EXCLUSIVE].md
deleted file mode 100644
index f562e4cc49ac68e17755e3257c44acc8be30692b..0000000000000000000000000000000000000000
--- a/spaces/inplisQlawa/anything-midjourney-v4-1/Material Science And Metallurgy By Phaneesh Pdf Download [EXCLUSIVE].md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-The guide is offered in 20 chapters. The language used is user-friendly, and the diagrams give a clear view of the ideas. Solved problems, multiple-choice questions and review questions are also an integral part of the guide. The contents are designed keeping in mind the syllabi of various universities, technical institutions and competitive examinations such as UPSC and GATE. This guide is among the very few available that covers both material science and metallurgy as per various university requirements.
-Material Science And Metallurgy By Phaneesh Pdf Download
Download File ⏩ https://urlin.us/2uExgl
-This book is designed to cover introductory metallurgy topics and mineralogy. Its main goal is to introduce readers to the theory behind metallurgical processing: the ferrous and non-ferrous processes, the casting process, heat treatment and its various types, and the effect of temperature and alloying on the mechanical properties of materials. The book also covers the various types of alloys and their basic technical and metallurgical properties (ferrous and non-ferrous metals, their fabrication and properties), the preparation of alloys, and how to control them during processing.
899543212b
-
-
\ No newline at end of file
diff --git a/spaces/inplisQlawa/anything-midjourney-v4-1/Mount And Blade Warband Custom Battle Mods.md b/spaces/inplisQlawa/anything-midjourney-v4-1/Mount And Blade Warband Custom Battle Mods.md
deleted file mode 100644
index e60d1e4a2dec53e8525a48edec1a837f9451d7e8..0000000000000000000000000000000000000000
--- a/spaces/inplisQlawa/anything-midjourney-v4-1/Mount And Blade Warband Custom Battle Mods.md
+++ /dev/null
@@ -1,64 +0,0 @@
-
-Mount and Blade Warband Custom Battle Mods: A Guide for Beginners
-Mount and Blade Warband is a popular medieval action role-playing game that offers a lot of freedom and variety to the players. You can create your own character, join a faction, fight in battles, siege castles, start your own kingdom, and more. But what if you want to spice up your gameplay with some custom battle mods?
-Custom battle mods are modifications that add new features, options, scenarios, and challenges to the battle mode of Mount and Blade Warband. Battle mode is a multiplayer mode where you can fight against other players or AI bots in different maps and settings. You can choose your faction, troops, equipment, and tactics before each round.
-mount and blade warband custom battle mods
DOWNLOAD ✪ https://urlin.us/2uEveU
-Custom battle mods can enhance your battle mode experience by adding new factions, units, weapons, maps, weather effects, AI behaviors, and more. Some of them are based on historical or fictional settings, such as the Hundred Years War, the Crusades, or the Lord of the Rings. Others are more creative and experimental, such as adding basketball, parkour, or air combat to the game.
-How to Install Custom Battle Mods for Mount and Blade Warband
-Installing custom battle mods for Mount and Blade Warband is not very difficult, but you need to follow some steps carefully. Here is a general guide on how to do it:
-
-- Download the custom battle mod you want from a reliable source, such as Mod DB or Nexus Mods. Make sure the mod is compatible with your version of Mount and Blade Warband and Warband Script Enhancer (WSE), which is a tool that allows some mods to run properly.
-- Extract the mod files to a folder of your choice. You can use a program like WinRAR or 7-Zip to do this.
-- Copy the mod folder to the Modules folder of your Mount and Blade Warband installation directory. For example, if you installed the game in C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband, then you should copy the mod folder to C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband\Modules.
-- If the mod requires WSE, then you need to copy the WSE files to the main folder of your Mount and Blade Warband installation directory. For example, if you installed the game in C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband, then you should copy the WSE files to C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband.
-- Launch the game using WSE if the mod requires it, or using the normal launcher if it doesn't. You can find WSE in the main folder of your Mount and Blade Warband installation directory. It should be named WSELoader.exe.
-- Select the mod from the drop-down menu in the launcher and click Play.
-- Enjoy your custom battle mod!
-
-Some Examples of Custom Battle Mods for Mount and Blade Warband
-There are many custom battle mods for Mount and Blade Warband that you can try out. Here are some examples of popular and interesting ones:
-
-- BattlesMod for Warband: This mod adds several features and options to the battle mode, such as adaptive battles, random time of day and weather, drowning, elite units spawn, morale system, formation commands, siege AI pathing, helms deep map, arena maps, concept arena map, and more.
-- Deeds of Arms & Chivalry: This mod focuses on the Hundred Years War between France and England, specifically during the reigns of Charles V and Edward III. It adds new factions, units, weapons, armors, banners, scenes, sounds, music, quests, tournaments, historical characters, and more.
-- Sands of Faith: This mod is based on the Crusades in the Middle East during the 12th and 13th centuries. It adds new factions, units, weapons, armors, banners, scenes, sounds, music, quests, historical characters, a diplomacy system, and more.
-
How to Play Custom Battle Mods for Mount and Blade Warband
-Once you have installed your custom battle mod of choice, you can start playing it by launching the game and selecting the mod from the drop-down menu in the launcher. Then, you can choose the multiplayer option and either join an existing server that is running the mod, or host your own local session and invite your friends or play with bots.
-Depending on the mod, you may have different options and settings to customize your battle mode experience. For example, you can choose your faction, your troop type, your equipment, your map, your weather, your team size, your round time, and more. You can also enable or disable certain features, such as adaptive battles, morale system, formation commands, siege AI pathing, etc.
-Once you are ready, you can start the battle and enjoy the new challenges and possibilities that the custom battle mod offers. You can fight in historical or fictional scenarios, use new weapons and tactics, command your troops or follow orders, siege castles or defend them, and more. The custom battle mods for Mount and Blade Warband can provide endless hours of fun and replayability for fans of the game.
-Why You Should Try Custom Battle Mods for Mount and Blade Warband
-Custom battle mods for Mount and Blade Warband are a great way to enhance your gameplay and enjoy the game in new ways. Here are some reasons why you should try them:
-
-
-- They add variety and diversity: Custom battle mods can introduce new factions, units, weapons, maps, weather effects, AI behaviors, and more to the game. This can make the battles more interesting and unpredictable, as you will face different enemies and allies, use different strategies and tactics, and fight in different environments and conditions.
-- They add challenge and difficulty: Custom battle mods can also make the battles more challenging and difficult, as you will face stronger opponents, larger armies, smarter AI, tougher sieges, and more. This can test your skills and abilities as a warrior and a leader, and make you feel more satisfied when you win.
-- They add immersion and realism: Custom battle mods can also make the battles more immersive and realistic, as they can be based on historical or fictional settings that are faithful to their sources. You can fight in the Hundred Years War, the Crusades, the Lord of the Rings, or other scenarios that are inspired by real or fictional events. You can also use realistic weapons and armors that match the period and culture of your faction.
-- They add fun and entertainment: Custom battle mods can also make the battles more fun and entertaining, as they can add creative and experimental features that are not possible in the vanilla game. You can play basketball, parkour, air combat, or other modes that are completely different from the usual medieval warfare. You can also use funny weapons and armors that are not meant to be taken seriously.
-
-Custom battle mods for Mount and Blade Warband are a great way to enjoy the game in new ways. They can add variety, challenge, immersion, realism, fun, and entertainment to your battle mode experience.
-
How to Find Custom Battle Mods for Mount and Blade Warband
-If you are looking for custom battle mods for Mount and Blade Warband, there are many sources where you can find them. Here are some of the most popular and reliable ones:
-
-- Mod DB: Mod DB is one of the largest and oldest websites for modding games. It has a huge collection of mods for Mount and Blade Warband, including custom battle mods. You can browse by category, popularity, rating, date, or name. You can also read articles, reviews, and comments about the mods, as well as download files, videos, and images. You can also join the modding community and interact with other modders and players.
-- Nexus Mods: Nexus Mods is another well-known website for modding games. It also has a large selection of mods for Mount and Blade Warband, including custom battle mods. You can browse by category, popularity, rating, date, or name. You can also read articles, reviews, and comments about the mods, as well as download files, videos, and images. You can also join the modding community and interact with other modders and players.
-- Steam Workshop: Steam Workshop is a feature of Steam that allows users to create and share content for games. It has some mods for Mount and Blade Warband, including custom battle mods. You can browse by category, popularity, rating, date, or name. You can also read articles, reviews, and comments about the mods, as well as download files, videos, and images. You can also join the modding community and interact with other modders and players.
-- Other Websites: There are also other websites that host mods for Mount and Blade Warband, including custom battle mods. Some examples are TaleWorlds Forum, ModdingWay Forum, ModDB Forum, etc. You can search for them on Google or other search engines. However, you should be careful when downloading mods from unknown sources, as they may contain viruses or malware that can harm your computer or game.
-
-
How to Uninstall Custom Battle Mods for Mount and Blade Warband
-If you want to uninstall a custom battle mod for Mount and Blade Warband, you need to follow some steps carefully. Here is a general guide on how to do it:
-
-- Launch the game and select the mod from the drop-down menu in the launcher. Then, exit the game.
-- Go to the Modules folder of your Mount and Blade Warband installation directory. For example, if you installed the game in C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband, then you should go to C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband\Modules.
-- Delete the mod folder that you want to uninstall. For example, if you want to uninstall Deeds of Arms & Chivalry mod, then you should delete the Deeds of Arms & Chivalry folder.
-- If the mod required WSE, then you need to go to the main folder of your Mount and Blade Warband installation directory. For example, if you installed the game in C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband, then you should go to C:\Program Files (x86)\Steam\steamapps\common\MountBlade Warband.
-- Delete the WSE files that came with the mod. For example, if you want to uninstall Deeds of Arms & Chivalry mod, then you should delete WSE_DeedsOfArmsAndChivalry.bat and WSE_DeedsOfArmsAndChivalry.ini files.
-- Launch the game using the normal launcher or WSE if you still have other mods that require it. You can select the Native module or another mod from the drop-down menu in the launcher.
-- Enjoy your unmodded or differently modded game!
-
-Conclusion
-Custom battle mods for Mount and Blade Warband are a great way to enhance your gameplay and enjoy the game in new ways. They can add variety, challenge, immersion, realism, fun, and entertainment to your battle mode experience. You can find many custom battle mods on various websites, such as Mod DB, Nexus Mods, Steam Workshop, and others. You can also install and uninstall them easily by following some simple steps. If you are a fan of Mount and Blade Warband, you should definitely try some custom battle mods and see what they can offer.
3cee63e6c2
-
-
\ No newline at end of file
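The installation steps in the guide above boil down to "copy the extracted mod folder into the game's Modules directory". That step can be sketched in Python; the install path and mod name below are placeholders, not the Windows Steam path quoted in the guide:

```python
import shutil
from pathlib import Path

# Illustrative only: game_dir and mod_name are placeholders.
game_dir = Path.home() / "MountBladeWarband"
mod_name = "MyBattleMod"
mod_src = Path("/tmp") / mod_name

# Stand-in for the extracted download (a real mod ships many more files):
mod_src.mkdir(parents=True, exist_ok=True)
(mod_src / "module.ini").touch()

# The key step: copy the mod folder into the game's Modules directory.
modules_dir = game_dir / "Modules"
modules_dir.mkdir(parents=True, exist_ok=True)
shutil.copytree(mod_src, modules_dir / mod_name, dirs_exist_ok=True)
print(f"Installed {mod_name} into {modules_dir}")
```

After this, the mod appears in the drop-down menu of the Warband launcher, exactly as step 6 of the guide describes.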
diff --git a/spaces/inreVtussa/clothingai/Examples/Adjustment Program Epson Tx710w.md b/spaces/inreVtussa/clothingai/Examples/Adjustment Program Epson Tx710w.md
deleted file mode 100644
index 28f8c0dc69ee53b31be21975c2e52a0a1c4b3248..0000000000000000000000000000000000000000
--- a/spaces/inreVtussa/clothingai/Examples/Adjustment Program Epson Tx710w.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-YOUR USE OF THE SOFTWARE, PRODUCT AND RELATED PROGRAMS AND DOCUMENTATION IS AT YOUR OWN RISK AND DISCRETION. YOU ARE SOLELY RESPONSIBLE FOR (AND BELKIN DISCLAIMS) ANY AND ALL LOSS, LIABILITY, OR DAMAGES, INCLUDING TO YOUR HOME, HVAC SYSTEM, ELECTRICAL SYSTEM, PLUMBING, PRODUCT, OTHER PERIPHERALS CONNECTED TO THE PRODUCT, COMPUTER, MOBILE DEVICE, AND ALL OTHER ITEMS AND PETS IN YOUR HOME, RESULTING FROM YOUR MISUSE OF THE SOFTWARE, PRODUCT AND RELATED PROGRAMS AND DOCUMENTATION. YOU ARE RESPONSIBLE FOR COMPLYING WITH ANY SAFETY WARNINGS AND PRECAUTIONS THAT ACCOMPANY THE PRODUCT. IF YOU ARE NOT COMFORTABLE WITH USING THE PRODUCT AFTER READING THE SAFETY WARNINGS, YOU MUST RETURN THE PRODUCT TO YOUR PLACE OF PURCHASE AND STOP USING THE SOFTWARE. BELKIN IS NOT RESPONSIBLE FOR (I) YOUR FAILURE TO FOLLOW SAFETY WARNINGS, PRECAUTIONS OR ANY OTHER INSTRUCTIONS PROVIDED WITH THE PRODUCT AND/OR SOFTWARE, (II) YOUR NEGLIGENCE IN USE OF THE PRODUCT AND/OR SOFTWARE, OR (III) YOUR INTENTIONAL MISUSE OF THE PRODUCT OR SOFTWARE.
-adjustment program epson tx710w
Download File > https://tiurll.com/2uClcV
- 899543212b
-
-
\ No newline at end of file
diff --git a/spaces/inreVtussa/clothingai/Examples/Adjustment Program For Epson Stylus Sx235.md b/spaces/inreVtussa/clothingai/Examples/Adjustment Program For Epson Stylus Sx235.md
deleted file mode 100644
index 77fb1ef2e63e8404c2506a9748ee3147940ee2a9..0000000000000000000000000000000000000000
--- a/spaces/inreVtussa/clothingai/Examples/Adjustment Program For Epson Stylus Sx235.md
+++ /dev/null
@@ -1,36 +0,0 @@
-
-How to Reset Your Epson Stylus SX235 Printer with Adjustment Program
-If you own an Epson Stylus SX235 printer, you may encounter a problem where the printer stops working and displays an error message like "A printer's ink pad is at the end of its service life. Please contact Epson Support." This means that the printer's waste ink pad counter has reached its limit and needs to be reset.
-Fortunately, there is a solution to this problem. You can use a software tool called Adjustment Program to reset the waste ink pad counter and restore your printer's functionality. In this article, we will show you how to download and use Adjustment Program for Epson Stylus SX235 printer.
-Adjustment Program For Epson Stylus Sx235
Download File ……… https://tiurll.com/2uCiPL
-What is Adjustment Program?
-Adjustment Program is a utility program that allows you to perform various maintenance tasks on your Epson printer, such as resetting the waste ink pad counter, cleaning the print head, checking the printer information, and more. It is an original program that works only with specific printer models and requires a license key to activate.
-Where to Download Adjustment Program for Epson Stylus SX235?
-You can download Adjustment Program for Epson Stylus SX235 from ORPYS, a website that provides various service programs for Epson printers. The program costs $9 and comes with free updates and unlimited use for one PC. You can pay with PayPal or credit card and receive the download link and license key by email.
-To download Adjustment Program for Epson Stylus SX235 from ORPYS, follow these steps:
-
-- Go to this page and click on "Add to cart".
-- Click on "Proceed to checkout" and enter your billing details.
-- Choose your payment method and complete the payment.
-- Check your email for the download link and license key.
-- Download the program and extract it to a folder on your PC.
-
-How to Use Adjustment Program for Epson Stylus SX235?
-To use Adjustment Program for Epson Stylus SX235, follow these steps:
-
-
-- Connect your printer to your PC with a USB cable and turn it on.
-- Run the program as administrator and enter the license key when prompted.
-- Select your printer model and port from the drop-down menus.
-- Click on "Particular adjustment mode".
-- Select "Waste ink pad counter" from the list of options and click on "OK".
-- Check the boxes next to "Main pad counter" and "Platen pad counter" and click on "Check".
-- Note down the current values of the counters.
-- Click on "Initialization" and confirm the message.
-- Wait for the program to reset the counters and display a message like "Please turn off your printer".
-- Turn off your printer and wait for a few seconds.
-- Turn on your printer again and check if the error message is gone.
-
-Congratulations! You have successfully reset your Epson Stylus SX235 printer with Adjustment Program. You can now use your printer normally again. However, keep in mind that resetting the waste ink pad counter does not solve the problem of ink overflow. You may need to replace or clean the waste ink pad periodically to prevent ink leakage or damage to your printer.
d5da3c52bf
-
-
\ No newline at end of file
diff --git a/spaces/inreVtussa/clothingai/Examples/Dcpandeyopticsandmodernphysicspdfdownload !!INSTALL!!.md b/spaces/inreVtussa/clothingai/Examples/Dcpandeyopticsandmodernphysicspdfdownload !!INSTALL!!.md
deleted file mode 100644
index e1e19e7c804309d080d4e73e57e145cf5585828a..0000000000000000000000000000000000000000
--- a/spaces/inreVtussa/clothingai/Examples/Dcpandeyopticsandmodernphysicspdfdownload !!INSTALL!!.md
+++ /dev/null
@@ -1,9 +0,0 @@
-dcpandeyopticsandmodernphysicspdfdownload
Download Zip ✒ https://tiurll.com/2uCjTk
-
-December 17, 2020 - DC Pandey Physics PDF Class 11: download the Arihant DC Pandey PDF book for free - physics for JEE Main and Advanced, optics and modern physics. The book includes answers to most of the questions in the exam.
-Download the book "Physics".
-Dec 20, 2018.
-Download free pdf and djvu, or buy the paper and e-book. 8a78ff9644
-
-
-
diff --git a/spaces/intelliarts/Car_parts_damage_detection/app.py b/spaces/intelliarts/Car_parts_damage_detection/app.py
deleted file mode 100644
index 46f133233d99e83dfe4a74faa15c7d1933cea309..0000000000000000000000000000000000000000
--- a/spaces/intelliarts/Car_parts_damage_detection/app.py
+++ /dev/null
@@ -1,212 +0,0 @@
-try:
- import detectron2
-except:
- import os
- os.system('pip install git+https://github.com/facebookresearch/detectron2.git')
-
-import gradio as gr
-import numpy as np
-
-import torch
-import detectron2
-from detectron2 import model_zoo
-from detectron2.engine import DefaultPredictor
-from detectron2.config import get_cfg
-from detectron2.utils.visualizer import Visualizer
-from detectron2.data import MetadataCatalog
-from detectron2.utils.visualizer import ColorMode
-
-damage_model_path = 'damage/model_final.pth'
-scratch_model_path = 'scratch/model_final.pth'
-parts_model_path = 'parts/model_final.pth'
-
-if torch.cuda.is_available():
- device = 'cuda'
-else:
- device = 'cpu'
-
-cfg_scratches = get_cfg()
-cfg_scratches.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
-cfg_scratches.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.8
-cfg_scratches.MODEL.ROI_HEADS.NUM_CLASSES = 1
-cfg_scratches.MODEL.WEIGHTS = scratch_model_path
-cfg_scratches.MODEL.DEVICE = device
-
-predictor_scratches = DefaultPredictor(cfg_scratches)
-
-metadata_scratch = MetadataCatalog.get("car_dataset_val")
-metadata_scratch.thing_classes = ["scratch"]
-
-cfg_damage = get_cfg()
-cfg_damage.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
-cfg_damage.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.7
-cfg_damage.MODEL.ROI_HEADS.NUM_CLASSES = 1
-cfg_damage.MODEL.WEIGHTS = damage_model_path
-cfg_damage.MODEL.DEVICE = device
-
-predictor_damage = DefaultPredictor(cfg_damage)
-
-metadata_damage = MetadataCatalog.get("car_damage_dataset_val")
-metadata_damage.thing_classes = ["damage"]
-
-cfg_parts = get_cfg()
-cfg_parts.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
-cfg_parts.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.75
-cfg_parts.MODEL.ROI_HEADS.NUM_CLASSES = 19
-cfg_parts.MODEL.WEIGHTS = parts_model_path
-cfg_parts.MODEL.DEVICE = device
-
-predictor_parts = DefaultPredictor(cfg_parts)
-
-metadata_parts = MetadataCatalog.get("car_parts_dataset_val")
-metadata_parts.thing_classes = ['_background_',
- 'back_bumper',
- 'back_glass',
- 'back_left_door',
- 'back_left_light',
- 'back_right_door',
- 'back_right_light',
- 'front_bumper',
- 'front_glass',
- 'front_left_door',
- 'front_left_light',
- 'front_right_door',
- 'front_right_light',
- 'hood',
- 'left_mirror',
- 'right_mirror',
- 'tailgate',
- 'trunk',
- 'wheel']
-
-def merge_segment(pred_segm):
- merge_dict = {}
- for i in range(len(pred_segm)):
- merge_dict[i] = []
- for j in range(i+1,len(pred_segm)):
- if torch.sum(pred_segm[i]*pred_segm[j])>0:
- merge_dict[i].append(j)
-
- to_delete = []
- for key in merge_dict:
- for element in merge_dict[key]:
- to_delete.append(element)
-
- for element in to_delete:
- merge_dict.pop(element,None)
-
- empty_delete = []
- for key in merge_dict:
- if merge_dict[key] == []:
- empty_delete.append(key)
-
- for element in empty_delete:
- merge_dict.pop(element,None)
-
- for key in merge_dict:
- for element in merge_dict[key]:
- pred_segm[key]+=pred_segm[element]
-
- except_elem = list(set(to_delete))
-
- new_indexes = list(range(len(pred_segm)))
- for elem in except_elem:
- new_indexes.remove(elem)
-
- return pred_segm[new_indexes]
-
-
-def inference(image):
- img = np.array(image)
- outputs_damage = predictor_damage(img)
- outputs_parts = predictor_parts(img)
- outputs_scratch = predictor_scratches(img)
- out_dict = outputs_damage["instances"].to("cpu").get_fields()
- merged_damage_masks = merge_segment(out_dict['pred_masks'])
- scratch_data = outputs_scratch["instances"].get_fields()
- scratch_masks = scratch_data['pred_masks']
- damage_data = outputs_damage["instances"].get_fields()
- damage_masks = damage_data['pred_masks']
- parts_data = outputs_parts["instances"].get_fields()
- parts_masks = parts_data['pred_masks']
- parts_classes = parts_data['pred_classes']
-    new_inst = detectron2.structures.Instances((1024, 1024))
-    # reuse the masks already merged above instead of calling merge_segment twice
-    new_inst.set('pred_masks', merged_damage_masks)
-
- parts_damage_dict = {}
- parts_list_damages = []
- for part in parts_classes:
- parts_damage_dict[metadata_parts.thing_classes[part]] = []
- for mask in scratch_masks:
- for i in range(len(parts_masks)):
- if torch.sum(parts_masks[i]*mask)>0:
- parts_damage_dict[metadata_parts.thing_classes[parts_classes[i]]].append('scratch')
- parts_list_damages.append(f'{metadata_parts.thing_classes[parts_classes[i]]} has scratch')
- print(f'{metadata_parts.thing_classes[parts_classes[i]]} has scratch')
- for mask in merged_damage_masks:
- for i in range(len(parts_masks)):
- if torch.sum(parts_masks[i]*mask)>0:
- parts_damage_dict[metadata_parts.thing_classes[parts_classes[i]]].append('damage')
- parts_list_damages.append(f'{metadata_parts.thing_classes[parts_classes[i]]} has damage')
- print(f'{metadata_parts.thing_classes[parts_classes[i]]} has damage')
-
- v_d = Visualizer(img[:, :, ::-1],
- metadata=metadata_damage,
- scale=0.5,
- instance_mode=ColorMode.SEGMENTATION # remove the colors of unsegmented pixels. This option is only available for segmentation models
- )
- #v_d = Visualizer(img,scale=1.2)
- #print(outputs["instances"].to('cpu'))
- out_d = v_d.draw_instance_predictions(new_inst)
- img1 = out_d.get_image()[:, :, ::-1]
-
- v_s = Visualizer(img[:, :, ::-1],
- metadata=metadata_scratch,
- scale=0.5,
- instance_mode=ColorMode.SEGMENTATION # remove the colors of unsegmented pixels. This option is only available for segmentation models
- )
- #v_s = Visualizer(img,scale=1.2)
- out_s = v_s.draw_instance_predictions(outputs_scratch["instances"])
- img2 = out_s.get_image()[:, :, ::-1]
-
- v_p = Visualizer(img[:, :, ::-1],
- metadata=metadata_parts,
- scale=0.5,
- instance_mode=ColorMode.SEGMENTATION # remove the colors of unsegmented pixels. This option is only available for segmentation models
- )
- #v_p = Visualizer(img,scale=1.2)
- out_p = v_p.draw_instance_predictions(outputs_parts["instances"])
- img3 = out_p.get_image()[:, :, ::-1]
-
- return img1, img2, img3, parts_list_damages
-
-with gr.Blocks() as demo:
- with gr.Row():
- with gr.Column():
- gr.Markdown("## Inputs")
- image = gr.Image(type="pil",label="Input")
- submit_button = gr.Button(value="Submit", label="Submit")
- with gr.Column():
- gr.Markdown("## Outputs")
- with gr.Tab('Image of damages'):
- im1 = gr.Image(type='numpy',label='Image of damages')
- with gr.Tab('Image of scratches'):
- im2 = gr.Image(type='numpy',label='Image of scratches')
- with gr.Tab('Image of parts'):
- im3 = gr.Image(type='numpy',label='Image of car parts')
- with gr.Tab('Information about damaged parts'):
- intersections = gr.Textbox(label='Information about type of damages on each part')
-
- #actions
- submit_button.click(
- fn=inference,
- inputs = [image],
- outputs = [im1,im2,im3,intersections]
- )
-
-if __name__ == "__main__":
- demo.launch()
\ No newline at end of file
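The `merge_segment` helper in the app above joins predicted damage masks that overlap, and the `inference` loop then labels every car part whose mask intersects a damage mask. The same overlap logic can be sketched with plain boolean NumPy arrays (toy 4x4 masks here, not model output; both function names below are ours). Unlike the single-pass dictionary approach above, this sketch loops to a fixed point, so chains of overlapping masks merge fully:

```python
import numpy as np

def merge_overlapping(masks):
    """Union any masks that share at least one pixel, repeating to a fixed point."""
    merged = [np.asarray(m, dtype=bool) for m in masks]
    changed = True
    while changed:
        changed = False
        for i in range(len(merged)):
            for j in range(i + 1, len(merged)):
                if (merged[i] & merged[j]).any():   # overlap test, like torch.sum(a*b) > 0
                    merged[i] = merged[i] | merged[j]
                    del merged[j]
                    changed = True
                    break
            if changed:
                break
    return merged

def damaged_parts(part_masks, damage_masks):
    """Names of parts whose mask intersects any damage mask."""
    return [name for name, pm in part_masks.items()
            if any((np.asarray(pm, dtype=bool) & dm).any() for dm in damage_masks)]
```

The `torch.sum(mask_a * mask_b) > 0` checks in the app are the tensor equivalent of the `(a & b).any()` intersection test used here.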
diff --git a/spaces/ismot/1702t1/dataset/build.py b/spaces/ismot/1702t1/dataset/build.py
deleted file mode 100644
index 6460ad7debbc459b72815b1199d8381c281daf52..0000000000000000000000000000000000000000
--- a/spaces/ismot/1702t1/dataset/build.py
+++ /dev/null
@@ -1,115 +0,0 @@
-"""
-@Date: 2021/07/18
-@description:
-"""
-import numpy as np
-import torch.utils.data
-from dataset.mp3d_dataset import MP3DDataset
-from dataset.pano_s2d3d_dataset import PanoS2D3DDataset
-from dataset.pano_s2d3d_mix_dataset import PanoS2D3DMixDataset
-from dataset.zind_dataset import ZindDataset
-
-
-def build_loader(config, logger):
- name = config.DATA.DATASET
- ddp = config.WORLD_SIZE > 1
- train_dataset = None
- train_data_loader = None
- if config.MODE == 'train':
- train_dataset = build_dataset(mode='train', config=config, logger=logger)
-
- val_dataset = build_dataset(mode=config.VAL_NAME if config.MODE != 'test' else 'test', config=config, logger=logger)
-
- train_sampler = None
- val_sampler = None
- if ddp:
- if train_dataset:
- train_sampler = torch.utils.data.DistributedSampler(train_dataset, shuffle=True)
- val_sampler = torch.utils.data.DistributedSampler(val_dataset, shuffle=False)
-
- batch_size = config.DATA.BATCH_SIZE
- num_workers = 0 if config.DEBUG else config.DATA.NUM_WORKERS
- pin_memory = config.DATA.PIN_MEMORY
- if train_dataset:
- logger.info(f'Train data loader batch size: {batch_size}')
- train_data_loader = torch.utils.data.DataLoader(
- train_dataset, sampler=train_sampler,
- batch_size=batch_size,
- shuffle=True,
- num_workers=num_workers,
- pin_memory=pin_memory,
- drop_last=True,
- )
- # pick the largest val batch size <= batch_size that evenly divides len(val_dataset),
- # so the validation loader has no incomplete final batch
- batch_size = batch_size - (len(val_dataset) % np.arange(batch_size, 0, -1)).tolist().index(0)
- logger.info(f'Val data loader batch size: {batch_size}')
- val_data_loader = torch.utils.data.DataLoader(
- val_dataset, sampler=val_sampler,
- batch_size=batch_size,
- shuffle=False,
- num_workers=num_workers,
- pin_memory=pin_memory,
- drop_last=False
- )
- logger.info(f'Build data loader: num_workers:{num_workers} pin_memory:{pin_memory}')
- return train_data_loader, val_data_loader
-
-
-def build_dataset(mode, config, logger):
- name = config.DATA.DATASET
- if name == 'mp3d':
- dataset = MP3DDataset(
- root_dir=config.DATA.DIR,
- mode=mode,
- shape=config.DATA.SHAPE,
- max_wall_num=config.DATA.WALL_NUM,
- aug=config.DATA.AUG if mode == 'train' else None,
- camera_height=config.DATA.CAMERA_HEIGHT,
- logger=logger,
- for_test_index=config.DATA.FOR_TEST_INDEX,
- keys=config.DATA.KEYS
- )
- elif name == 'pano_s2d3d':
- dataset = PanoS2D3DDataset(
- root_dir=config.DATA.DIR,
- mode=mode,
- shape=config.DATA.SHAPE,
- max_wall_num=config.DATA.WALL_NUM,
- aug=config.DATA.AUG if mode == 'train' else None,
- camera_height=config.DATA.CAMERA_HEIGHT,
- logger=logger,
- for_test_index=config.DATA.FOR_TEST_INDEX,
- subset=config.DATA.SUBSET,
- keys=config.DATA.KEYS
- )
- elif name == 'pano_s2d3d_mix':
- dataset = PanoS2D3DMixDataset(
- root_dir=config.DATA.DIR,
- mode=mode,
- shape=config.DATA.SHAPE,
- max_wall_num=config.DATA.WALL_NUM,
- aug=config.DATA.AUG if mode == 'train' else None,
- camera_height=config.DATA.CAMERA_HEIGHT,
- logger=logger,
- for_test_index=config.DATA.FOR_TEST_INDEX,
- subset=config.DATA.SUBSET,
- keys=config.DATA.KEYS
- )
- elif name == 'zind':
- dataset = ZindDataset(
- root_dir=config.DATA.DIR,
- mode=mode,
- shape=config.DATA.SHAPE,
- max_wall_num=config.DATA.WALL_NUM,
- aug=config.DATA.AUG if mode == 'train' else None,
- camera_height=config.DATA.CAMERA_HEIGHT,
- logger=logger,
- for_test_index=config.DATA.FOR_TEST_INDEX,
- is_simple=True,
- is_ceiling_flat=False,
- keys=config.DATA.KEYS,
- vp_align=config.EVAL.POST_PROCESSING is not None and 'manhattan' in config.EVAL.POST_PROCESSING
- )
- else:
- raise NotImplementedError(f"Unknown dataset: {name}")
-
- return dataset
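The one-liner that shrinks the validation batch size in `build_loader` is dense; the sketch below (illustrative, not part of the repo) unpacks the same divisor trick:

```python
import numpy as np

def pick_val_batch_size(requested, n_val):
    """Return the largest batch size <= requested that evenly divides n_val,
    mirroring the expression used in build_loader."""
    # n_val % [requested, requested-1, ..., 1]: the index of the first zero
    # remainder marks the largest candidate that divides n_val exactly.
    remainders = (n_val % np.arange(requested, 0, -1)).tolist()
    return requested - remainders.index(0)

print(pick_val_batch_size(8, 20))  # 5: largest divisor of 20 that is <= 8
```

The index always exists because 1 divides everything, so the expression never raises.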
diff --git a/spaces/jackli888/stable-diffusion-webui/modules/script_loading.py b/spaces/jackli888/stable-diffusion-webui/modules/script_loading.py
deleted file mode 100644
index b7611ea5f4489edc95f61040e4324124a2e6fefd..0000000000000000000000000000000000000000
--- a/spaces/jackli888/stable-diffusion-webui/modules/script_loading.py
+++ /dev/null
@@ -1,32 +0,0 @@
-import os
-import sys
-import traceback
-import importlib.util
-from types import ModuleType
-
-
-def load_module(path):
- module_spec = importlib.util.spec_from_file_location(os.path.basename(path), path)
- module = importlib.util.module_from_spec(module_spec)
- module_spec.loader.exec_module(module)
-
- return module
-
-
-def preload_extensions(extensions_dir, parser):
- if not os.path.isdir(extensions_dir):
- return
-
- for dirname in sorted(os.listdir(extensions_dir)):
- preload_script = os.path.join(extensions_dir, dirname, "preload.py")
- if not os.path.isfile(preload_script):
- continue
-
- try:
- module = load_module(preload_script)
- if hasattr(module, 'preload'):
- module.preload(parser)
-
- except Exception:
- print(f"Error running preload() for {preload_script}", file=sys.stderr)
- print(traceback.format_exc(), file=sys.stderr)
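The `load_module` helper above is the standard `importlib.util` file-import recipe. A self-contained sketch of the same pattern, using a throwaway temp file (hypothetical content, not from the repo) the way `preload_extensions` would load an extension's `preload.py`:

```python
import importlib.util
import os
import tempfile

def load_module(path):
    # import a Python file as a module, keyed by its basename
    spec = importlib.util.spec_from_file_location(os.path.basename(path), path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# write a throwaway "preload.py" and load it, as preload_extensions would
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "preload.py")
    with open(path, "w") as f:
        f.write("def preload(parser):\n    return 'preloaded'\n")
    module = load_module(path)
    assert hasattr(module, "preload")
    print(module.preload(None))  # preloaded
```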
diff --git a/spaces/jackli888/stable-diffusion-webui/webui.sh b/spaces/jackli888/stable-diffusion-webui/webui.sh
deleted file mode 100644
index 8cdad22d310fed20f229b09d7a3160aeb1731a85..0000000000000000000000000000000000000000
--- a/spaces/jackli888/stable-diffusion-webui/webui.sh
+++ /dev/null
@@ -1,186 +0,0 @@
-#!/usr/bin/env bash
-#################################################
-# Please do not make any changes to this file, #
-# change the variables in webui-user.sh instead #
-#################################################
-
-# If run from macOS, load defaults from webui-macos-env.sh
-if [[ "$OSTYPE" == "darwin"* ]]; then
- if [[ -f webui-macos-env.sh ]]
- then
- source ./webui-macos-env.sh
- fi
-fi
-
-# Read variables from webui-user.sh
-# shellcheck source=/dev/null
-if [[ -f webui-user.sh ]]
-then
- source ./webui-user.sh
-fi
-
-# Set defaults
-# Install directory without trailing slash
-if [[ -z "${install_dir}" ]]
-then
- install_dir="/home/$(whoami)"
-fi
-
-# Name of the subdirectory (defaults to stable-diffusion-webui)
-if [[ -z "${clone_dir}" ]]
-then
- clone_dir="stable-diffusion-webui"
-fi
-
-# python3 executable
-if [[ -z "${python_cmd}" ]]
-then
- python_cmd="python3"
-fi
-
-# git executable
-if [[ -z "${GIT}" ]]
-then
- export GIT="git"
-fi
-
-# python3 venv without trailing slash (defaults to ${install_dir}/${clone_dir}/venv)
-if [[ -z "${venv_dir}" ]]
-then
- venv_dir="venv"
-fi
-
-if [[ -z "${LAUNCH_SCRIPT}" ]]
-then
- LAUNCH_SCRIPT="launch.py"
-fi
-
-# this script cannot be run as root by default
-can_run_as_root=0
-
-# read any command line flags to the webui.sh script
-while getopts "f" flag > /dev/null 2>&1
-do
- case ${flag} in
- f) can_run_as_root=1;;
- *) break;;
- esac
-done
-
-# Disable sentry logging
-export ERROR_REPORTING=FALSE
-
-# Do not reinstall existing pip packages on Debian/Ubuntu
-export PIP_IGNORE_INSTALLED=0
-
-# Pretty print
-delimiter="################################################################"
-
-printf "\n%s\n" "${delimiter}"
-printf "\e[1m\e[32mInstall script for stable-diffusion + Web UI\n"
-printf "\e[1m\e[34mTested on Debian 11 (Bullseye)\e[0m"
-printf "\n%s\n" "${delimiter}"
-
-# Do not run as root
-if [[ $(id -u) -eq 0 && can_run_as_root -eq 0 ]]
-then
- printf "\n%s\n" "${delimiter}"
- printf "\e[1m\e[31mERROR: This script must not be launched as root, aborting...\e[0m"
- printf "\n%s\n" "${delimiter}"
- exit 1
-else
- printf "\n%s\n" "${delimiter}"
- printf "Running on \e[1m\e[32m%s\e[0m user" "$(whoami)"
- printf "\n%s\n" "${delimiter}"
-fi
-
-if [[ -d .git ]]
-then
- printf "\n%s\n" "${delimiter}"
- printf "Repo already cloned, using it as install directory"
- printf "\n%s\n" "${delimiter}"
- install_dir="${PWD}/../"
- clone_dir="${PWD##*/}"
-fi
-
-# Check prerequisites
-gpu_info=$(lspci 2>/dev/null | grep VGA)
-case "$gpu_info" in
- *"Navi 1"*|*"Navi 2"*) export HSA_OVERRIDE_GFX_VERSION=10.3.0
- ;;
- *"Renoir"*) export HSA_OVERRIDE_GFX_VERSION=9.0.0
- printf "\n%s\n" "${delimiter}"
- printf "Experimental support for Renoir: make sure to have at least 4GB of VRAM and 10GB of RAM or enable cpu mode: --use-cpu all --no-half"
- printf "\n%s\n" "${delimiter}"
- ;;
- *)
- ;;
-esac
-if echo "$gpu_info" | grep -q "AMD" && [[ -z "${TORCH_COMMAND}" ]]
-then
- export TORCH_COMMAND="pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.2"
-fi
-
-for preq in "${GIT}" "${python_cmd}"
-do
- if ! hash "${preq}" &>/dev/null
- then
- printf "\n%s\n" "${delimiter}"
- printf "\e[1m\e[31mERROR: %s is not installed, aborting...\e[0m" "${preq}"
- printf "\n%s\n" "${delimiter}"
- exit 1
- fi
-done
-
-if ! "${python_cmd}" -c "import venv" &>/dev/null
-then
- printf "\n%s\n" "${delimiter}"
- printf "\e[1m\e[31mERROR: python3-venv is not installed, aborting...\e[0m"
- printf "\n%s\n" "${delimiter}"
- exit 1
-fi
-
-cd "${install_dir}"/ || { printf "\e[1m\e[31mERROR: Can't cd to %s/, aborting...\e[0m" "${install_dir}"; exit 1; }
-if [[ -d "${clone_dir}" ]]
-then
- cd "${clone_dir}"/ || { printf "\e[1m\e[31mERROR: Can't cd to %s/%s/, aborting...\e[0m" "${install_dir}" "${clone_dir}"; exit 1; }
-else
- printf "\n%s\n" "${delimiter}"
- printf "Clone stable-diffusion-webui"
- printf "\n%s\n" "${delimiter}"
- "${GIT}" clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git "${clone_dir}"
- cd "${clone_dir}"/ || { printf "\e[1m\e[31mERROR: Can't cd to %s/%s/, aborting...\e[0m" "${install_dir}" "${clone_dir}"; exit 1; }
-fi
-
-printf "\n%s\n" "${delimiter}"
-printf "Create and activate python venv"
-printf "\n%s\n" "${delimiter}"
-cd "${install_dir}"/"${clone_dir}"/ || { printf "\e[1m\e[31mERROR: Can't cd to %s/%s/, aborting...\e[0m" "${install_dir}" "${clone_dir}"; exit 1; }
-if [[ ! -d "${venv_dir}" ]]
-then
- "${python_cmd}" -m venv "${venv_dir}"
- first_launch=1
-fi
-# shellcheck source=/dev/null
-if [[ -f "${venv_dir}"/bin/activate ]]
-then
- source "${venv_dir}"/bin/activate
-else
- printf "\n%s\n" "${delimiter}"
- printf "\e[1m\e[31mERROR: Cannot activate python venv, aborting...\e[0m"
- printf "\n%s\n" "${delimiter}"
- exit 1
-fi
-
-if [[ -n "${ACCELERATE}" ]] && [[ "${ACCELERATE}" == "True" ]] && [ -x "$(command -v accelerate)" ]
-then
- printf "\n%s\n" "${delimiter}"
- printf "Accelerating launch.py..."
- printf "\n%s\n" "${delimiter}"
- exec accelerate launch --num_cpu_threads_per_process=6 "${LAUNCH_SCRIPT}" "$@"
-else
- printf "\n%s\n" "${delimiter}"
- printf "Launching launch.py..."
- printf "\n%s\n" "${delimiter}"
- exec "${python_cmd}" "${LAUNCH_SCRIPT}" "$@"
-fi
diff --git a/spaces/jackyccl/segment-anything/groundingdino/models/GroundingDINO/bertwarper.py b/spaces/jackyccl/segment-anything/groundingdino/models/GroundingDINO/bertwarper.py
deleted file mode 100644
index f0cf9779b270e1aead32845006f8b881fcba37ad..0000000000000000000000000000000000000000
--- a/spaces/jackyccl/segment-anything/groundingdino/models/GroundingDINO/bertwarper.py
+++ /dev/null
@@ -1,273 +0,0 @@
-# ------------------------------------------------------------------------
-# Grounding DINO
-# url: https://github.com/IDEA-Research/GroundingDINO
-# Copyright (c) 2023 IDEA. All Rights Reserved.
-# Licensed under the Apache License, Version 2.0 [see LICENSE for details]
-# ------------------------------------------------------------------------
-
-import torch
-import torch.nn.functional as F
-import torch.utils.checkpoint as checkpoint
-from torch import Tensor, nn
-from torchvision.ops.boxes import nms
-from transformers import BertConfig, BertModel, BertPreTrainedModel
-from transformers.modeling_outputs import BaseModelOutputWithPoolingAndCrossAttentions
-
-
-class BertModelWarper(nn.Module):
- def __init__(self, bert_model):
- super().__init__()
- # self.bert = bert_modelc
-
- self.config = bert_model.config
- self.embeddings = bert_model.embeddings
- self.encoder = bert_model.encoder
- self.pooler = bert_model.pooler
-
- self.get_extended_attention_mask = bert_model.get_extended_attention_mask
- self.invert_attention_mask = bert_model.invert_attention_mask
- self.get_head_mask = bert_model.get_head_mask
-
- def forward(
- self,
- input_ids=None,
- attention_mask=None,
- token_type_ids=None,
- position_ids=None,
- head_mask=None,
- inputs_embeds=None,
- encoder_hidden_states=None,
- encoder_attention_mask=None,
- past_key_values=None,
- use_cache=None,
- output_attentions=None,
- output_hidden_states=None,
- return_dict=None,
- ):
- r"""
- encoder_hidden_states (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length, hidden_size)`, `optional`):
- Sequence of hidden-states at the output of the last layer of the encoder. Used in the cross-attention if
- the model is configured as a decoder.
- encoder_attention_mask (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
- Mask to avoid performing attention on the padding token indices of the encoder input. This mask is used in
- the cross-attention if the model is configured as a decoder. Mask values selected in ``[0, 1]``:
-
- - 1 for tokens that are **not masked**,
- - 0 for tokens that are **masked**.
- past_key_values (:obj:`tuple(tuple(torch.FloatTensor))` of length :obj:`config.n_layers` with each tuple having 4 tensors of shape :obj:`(batch_size, num_heads, sequence_length - 1, embed_size_per_head)`):
- Contains precomputed key and value hidden states of the attention blocks. Can be used to speed up decoding.
-
- If :obj:`past_key_values` are used, the user can optionally input only the last :obj:`decoder_input_ids`
- (those that don't have their past key value states given to this model) of shape :obj:`(batch_size, 1)`
- instead of all :obj:`decoder_input_ids` of shape :obj:`(batch_size, sequence_length)`.
- use_cache (:obj:`bool`, `optional`):
- If set to :obj:`True`, :obj:`past_key_values` key value states are returned and can be used to speed up
- decoding (see :obj:`past_key_values`).
- """
- output_attentions = (
- output_attentions if output_attentions is not None else self.config.output_attentions
- )
- output_hidden_states = (
- output_hidden_states
- if output_hidden_states is not None
- else self.config.output_hidden_states
- )
- return_dict = return_dict if return_dict is not None else self.config.use_return_dict
-
- if self.config.is_decoder:
- use_cache = use_cache if use_cache is not None else self.config.use_cache
- else:
- use_cache = False
-
- if input_ids is not None and inputs_embeds is not None:
- raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time")
- elif input_ids is not None:
- input_shape = input_ids.size()
- batch_size, seq_length = input_shape
- elif inputs_embeds is not None:
- input_shape = inputs_embeds.size()[:-1]
- batch_size, seq_length = input_shape
- else:
- raise ValueError("You have to specify either input_ids or inputs_embeds")
-
- device = input_ids.device if input_ids is not None else inputs_embeds.device
-
- # past_key_values_length
- past_key_values_length = (
- past_key_values[0][0].shape[2] if past_key_values is not None else 0
- )
-
- if attention_mask is None:
- attention_mask = torch.ones(
- ((batch_size, seq_length + past_key_values_length)), device=device
- )
- if token_type_ids is None:
- token_type_ids = torch.zeros(input_shape, dtype=torch.long, device=device)
-
- # We can provide a self-attention mask of dimensions [batch_size, from_seq_length, to_seq_length]
- # ourselves in which case we just need to make it broadcastable to all heads.
- extended_attention_mask: torch.Tensor = self.get_extended_attention_mask(
- attention_mask, input_shape, device
- )
-
- # If a 2D or 3D attention mask is provided for the cross-attention
- # we need to make broadcastable to [batch_size, num_heads, seq_length, seq_length]
- if self.config.is_decoder and encoder_hidden_states is not None:
- encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states.size()
- encoder_hidden_shape = (encoder_batch_size, encoder_sequence_length)
- if encoder_attention_mask is None:
- encoder_attention_mask = torch.ones(encoder_hidden_shape, device=device)
- encoder_extended_attention_mask = self.invert_attention_mask(encoder_attention_mask)
- else:
- encoder_extended_attention_mask = None
- # if os.environ.get('IPDB_SHILONG_DEBUG', None) == 'INFO':
- # import ipdb; ipdb.set_trace()
-
- # Prepare head mask if needed
- # 1.0 in head_mask indicate we keep the head
- # attention_probs has shape bsz x n_heads x N x N
- # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]
- # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]
- head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)
-
- embedding_output = self.embeddings(
- input_ids=input_ids,
- position_ids=position_ids,
- token_type_ids=token_type_ids,
- inputs_embeds=inputs_embeds,
- past_key_values_length=past_key_values_length,
- )
-
- encoder_outputs = self.encoder(
- embedding_output,
- attention_mask=extended_attention_mask,
- head_mask=head_mask,
- encoder_hidden_states=encoder_hidden_states,
- encoder_attention_mask=encoder_extended_attention_mask,
- past_key_values=past_key_values,
- use_cache=use_cache,
- output_attentions=output_attentions,
- output_hidden_states=output_hidden_states,
- return_dict=return_dict,
- )
- sequence_output = encoder_outputs[0]
- pooled_output = self.pooler(sequence_output) if self.pooler is not None else None
-
- if not return_dict:
- return (sequence_output, pooled_output) + encoder_outputs[1:]
-
- return BaseModelOutputWithPoolingAndCrossAttentions(
- last_hidden_state=sequence_output,
- pooler_output=pooled_output,
- past_key_values=encoder_outputs.past_key_values,
- hidden_states=encoder_outputs.hidden_states,
- attentions=encoder_outputs.attentions,
- cross_attentions=encoder_outputs.cross_attentions,
- )
-
-
-class TextEncoderShell(nn.Module):
- def __init__(self, text_encoder):
- super().__init__()
- self.text_encoder = text_encoder
- self.config = self.text_encoder.config
-
- def forward(self, **kw):
- # feed into text encoder
- return self.text_encoder(**kw)
-
-
-def generate_masks_with_special_tokens(tokenized, special_tokens_list, tokenizer):
- """Generate an attention mask between each pair of special tokens.
- Args:
- tokenized (dict): tokenizer output; "input_ids" has shape [bs, num_token]
- special_tokens_list (list): ids of the special tokens that delimit segments
- tokenizer: unused, kept for interface compatibility
- Returns:
- torch.Tensor: attention mask restricting attention to within each segment
- torch.Tensor: position ids that restart at zero inside each segment
- """
- input_ids = tokenized["input_ids"]
- bs, num_token = input_ids.shape
- # special_tokens_mask: bs, num_token. 1 for special tokens. 0 for normal tokens
- special_tokens_mask = torch.zeros((bs, num_token), device=input_ids.device).bool()
- for special_token in special_tokens_list:
- special_tokens_mask |= input_ids == special_token
-
- # idxs: each row is a list of indices of special tokens
- idxs = torch.nonzero(special_tokens_mask)
-
- # generate attention mask and positional ids
- attention_mask = (
- torch.eye(num_token, device=input_ids.device).bool().unsqueeze(0).repeat(bs, 1, 1)
- )
- position_ids = torch.zeros((bs, num_token), device=input_ids.device)
- previous_col = 0
- for i in range(idxs.shape[0]):
- row, col = idxs[i]
- if (col == 0) or (col == num_token - 1):
- attention_mask[row, col, col] = True
- position_ids[row, col] = 0
- else:
- attention_mask[row, previous_col + 1 : col + 1, previous_col + 1 : col + 1] = True
- position_ids[row, previous_col + 1 : col + 1] = torch.arange(
- 0, col - previous_col, device=input_ids.device
- )
-
- previous_col = col
-
- # # padding mask
- # padding_mask = tokenized['attention_mask']
- # attention_mask = attention_mask & padding_mask.unsqueeze(1).bool() & padding_mask.unsqueeze(2).bool()
-
- return attention_mask, position_ids.to(torch.long)
-
-
-def generate_masks_with_special_tokens_and_transfer_map(tokenized, special_tokens_list, tokenizer):
- """Generate attention mask between each pair of special tokens
- Args:
- input_ids (torch.Tensor): input ids. Shape: [bs, num_token]
- special_tokens_mask (list): special tokens mask.
- Returns:
- torch.Tensor: attention mask between each special tokens.
- """
- input_ids = tokenized["input_ids"]
- bs, num_token = input_ids.shape
- # special_tokens_mask: bs, num_token. 1 for special tokens. 0 for normal tokens
- special_tokens_mask = torch.zeros((bs, num_token), device=input_ids.device).bool()
- for special_token in special_tokens_list:
- special_tokens_mask |= input_ids == special_token
-
- # idxs: each row is a list of indices of special tokens
- idxs = torch.nonzero(special_tokens_mask)
-
- # generate attention mask and positional ids
- attention_mask = (
- torch.eye(num_token, device=input_ids.device).bool().unsqueeze(0).repeat(bs, 1, 1)
- )
- position_ids = torch.zeros((bs, num_token), device=input_ids.device)
- cate_to_token_mask_list = [[] for _ in range(bs)]
- previous_col = 0
- for i in range(idxs.shape[0]):
- row, col = idxs[i]
- if (col == 0) or (col == num_token - 1):
- attention_mask[row, col, col] = True
- position_ids[row, col] = 0
- else:
- attention_mask[row, previous_col + 1 : col + 1, previous_col + 1 : col + 1] = True
- position_ids[row, previous_col + 1 : col + 1] = torch.arange(
- 0, col - previous_col, device=input_ids.device
- )
- c2t_maski = torch.zeros((num_token), device=input_ids.device).bool()
- c2t_maski[previous_col + 1 : col] = True
- cate_to_token_mask_list[row].append(c2t_maski)
- previous_col = col
-
- cate_to_token_mask_list = [
- torch.stack(cate_to_token_mask_listi, dim=0)
- for cate_to_token_mask_listi in cate_to_token_mask_list
- ]
-
- # # padding mask
- # padding_mask = tokenized['attention_mask']
- # attention_mask = attention_mask & padding_mask.unsqueeze(1).bool() & padding_mask.unsqueeze(2).bool()
-
- return attention_mask, position_ids.to(torch.long), cate_to_token_mask_list
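The segment-masking loop in `generate_masks_with_special_tokens` can be sketched in NumPy for a single sequence — an illustrative stand-in for the batched torch code, with made-up token ids:

```python
import numpy as np

def block_mask(input_ids, special_ids):
    """Tokens between two consecutive special tokens attend only to each other
    (plus the special token closing their segment); position ids restart at
    zero inside each segment. NumPy sketch of the torch logic above."""
    n = len(input_ids)
    special = np.isin(input_ids, special_ids)
    mask = np.eye(n, dtype=bool)          # every token sees itself
    position_ids = np.zeros(n, dtype=int)
    prev = 0
    for col in np.nonzero(special)[0]:
        if col == 0 or col == n - 1:
            mask[col, col] = True          # boundary specials stay diagonal
        else:
            mask[prev + 1: col + 1, prev + 1: col + 1] = True
            position_ids[prev + 1: col + 1] = np.arange(col - prev)
        prev = col
    return mask, position_ids

# e.g. [CLS]=101 and [SEP]=102 delimiting one two-token segment
m, pos = block_mask(np.array([101, 5, 6, 102, 7, 102]), [101, 102])
print(pos)  # [0 0 1 2 0 0]
```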
diff --git a/spaces/james-oldfield/PandA/networks/biggan/__init__.py b/spaces/james-oldfield/PandA/networks/biggan/__init__.py
deleted file mode 100644
index b570848421afd921fae635569c97d0f8f5b33c80..0000000000000000000000000000000000000000
--- a/spaces/james-oldfield/PandA/networks/biggan/__init__.py
+++ /dev/null
@@ -1,6 +0,0 @@
-from .config import BigGANConfig
-from .model import BigGAN
-from .file_utils import PYTORCH_PRETRAINED_BIGGAN_CACHE, cached_path
-from .utils import (truncated_noise_sample, save_as_images,
- convert_to_images, display_in_terminal,
- one_hot_from_int, one_hot_from_names)
diff --git a/spaces/jamesjohnson763/ASRLiveSpeechRecognition-GR/README.md b/spaces/jamesjohnson763/ASRLiveSpeechRecognition-GR/README.md
deleted file mode 100644
index a549a4f50eec89dba4e78283712dc39c4fcaa81a..0000000000000000000000000000000000000000
--- a/spaces/jamesjohnson763/ASRLiveSpeechRecognition-GR/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: ASRLiveSpeechRecognition GR
-emoji: 🌍
-colorFrom: blue
-colorTo: purple
-sdk: gradio
-sdk_version: 3.8.2
-app_file: app.py
-pinned: false
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/jaskugler/timdettmers-guanaco-65b-merged/README.md b/spaces/jaskugler/timdettmers-guanaco-65b-merged/README.md
deleted file mode 100644
index ae09e02c1f491bdde0c23d67999af9ca47f97887..0000000000000000000000000000000000000000
--- a/spaces/jaskugler/timdettmers-guanaco-65b-merged/README.md
+++ /dev/null
@@ -1,13 +0,0 @@
----
-title: Timdettmers Guanaco 65b Merged
-emoji: 🐠
-colorFrom: pink
-colorTo: pink
-sdk: gradio
-sdk_version: 3.34.0
-app_file: app.py
-pinned: false
-license: openrail
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/jbetker/tortoise/tortoise/models/classifier.py b/spaces/jbetker/tortoise/tortoise/models/classifier.py
deleted file mode 100644
index ce574eabb38b36b832ed27e61c836caf4d626185..0000000000000000000000000000000000000000
--- a/spaces/jbetker/tortoise/tortoise/models/classifier.py
+++ /dev/null
@@ -1,157 +0,0 @@
-import torch
-import torch.nn as nn
-from torch.utils.checkpoint import checkpoint
-
-from tortoise.models.arch_util import Upsample, Downsample, normalization, zero_module, AttentionBlock
-
-
-class ResBlock(nn.Module):
- def __init__(
- self,
- channels,
- dropout,
- out_channels=None,
- use_conv=False,
- use_scale_shift_norm=False,
- dims=2,
- up=False,
- down=False,
- kernel_size=3,
- do_checkpoint=True,
- ):
- super().__init__()
- self.channels = channels
- self.dropout = dropout
- self.out_channels = out_channels or channels
- self.use_conv = use_conv
- self.use_scale_shift_norm = use_scale_shift_norm
- self.do_checkpoint = do_checkpoint
- padding = 1 if kernel_size == 3 else 2
-
- self.in_layers = nn.Sequential(
- normalization(channels),
- nn.SiLU(),
- nn.Conv1d(channels, self.out_channels, kernel_size, padding=padding),
- )
-
- self.updown = up or down
-
- if up:
- self.h_upd = Upsample(channels, False, dims)
- self.x_upd = Upsample(channels, False, dims)
- elif down:
- self.h_upd = Downsample(channels, False, dims)
- self.x_upd = Downsample(channels, False, dims)
- else:
- self.h_upd = self.x_upd = nn.Identity()
-
- self.out_layers = nn.Sequential(
- normalization(self.out_channels),
- nn.SiLU(),
- nn.Dropout(p=dropout),
- zero_module(
- nn.Conv1d(self.out_channels, self.out_channels, kernel_size, padding=padding)
- ),
- )
-
- if self.out_channels == channels:
- self.skip_connection = nn.Identity()
- elif use_conv:
- self.skip_connection = nn.Conv1d(
- channels, self.out_channels, kernel_size, padding=padding
- )
- else:
- self.skip_connection = nn.Conv1d(channels, self.out_channels, 1)
-
- def forward(self, x):
- if self.do_checkpoint:
- return checkpoint(
- self._forward, x
- )
- else:
- return self._forward(x)
-
- def _forward(self, x):
- if self.updown:
- in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]
- h = in_rest(x)
- h = self.h_upd(h)
- x = self.x_upd(x)
- h = in_conv(h)
- else:
- h = self.in_layers(x)
- h = self.out_layers(h)
- return self.skip_connection(x) + h
-
-
-class AudioMiniEncoder(nn.Module):
- def __init__(self,
- spec_dim,
- embedding_dim,
- base_channels=128,
- depth=2,
- resnet_blocks=2,
- attn_blocks=4,
- num_attn_heads=4,
- dropout=0,
- downsample_factor=2,
- kernel_size=3):
- super().__init__()
- self.init = nn.Sequential(
- nn.Conv1d(spec_dim, base_channels, 3, padding=1)
- )
- ch = base_channels
- res = []
- self.layers = depth
- for l in range(depth):
- for r in range(resnet_blocks):
- res.append(ResBlock(ch, dropout, do_checkpoint=False, kernel_size=kernel_size))
- res.append(Downsample(ch, use_conv=True, out_channels=ch*2, factor=downsample_factor))
- ch *= 2
- self.res = nn.Sequential(*res)
- self.final = nn.Sequential(
- normalization(ch),
- nn.SiLU(),
- nn.Conv1d(ch, embedding_dim, 1)
- )
- attn = []
- for a in range(attn_blocks):
- attn.append(AttentionBlock(embedding_dim, num_attn_heads, do_checkpoint=False))
- self.attn = nn.Sequential(*attn)
- self.dim = embedding_dim
-
- def forward(self, x):
- h = self.init(x)
- h = self.res(h)
- h = self.final(h)
- for blk in self.attn:
- h = checkpoint(blk, h)
- return h[:, :, 0]
-
-
-class AudioMiniEncoderWithClassifierHead(nn.Module):
- def __init__(self, classes, distribute_zero_label=True, **kwargs):
- super().__init__()
- self.enc = AudioMiniEncoder(**kwargs)
- self.head = nn.Linear(self.enc.dim, classes)
- self.num_classes = classes
- self.distribute_zero_label = distribute_zero_label
-
- def forward(self, x, labels=None):
- h = self.enc(x)
- logits = self.head(h)
- if labels is None:
- return logits
- else:
- if self.distribute_zero_label:
- oh_labels = nn.functional.one_hot(labels, num_classes=self.num_classes)
- zeros_indices = (labels == 0).unsqueeze(-1)
- # Distribute 20% of the probability mass on all classes when zero is specified, to compensate for dataset noise.
- zero_extra_mass = torch.full_like(oh_labels, dtype=torch.float, fill_value=.2/(self.num_classes-1))
- zero_extra_mass[:, 0] = -.2
- zero_extra_mass = zero_extra_mass * zeros_indices
- oh_labels = oh_labels + zero_extra_mass
- else:
- oh_labels = labels
- loss = nn.functional.cross_entropy(logits, oh_labels)
- return loss
diff --git a/spaces/jbilcke-hf/Panoremix/src/lib/computeSecretFingerprint.ts b/spaces/jbilcke-hf/Panoremix/src/lib/computeSecretFingerprint.ts
deleted file mode 100644
index 19ba225b1fddb8e77b4fc143a8e380df8fb819ed..0000000000000000000000000000000000000000
--- a/spaces/jbilcke-hf/Panoremix/src/lib/computeSecretFingerprint.ts
+++ /dev/null
@@ -1,7 +0,0 @@
-import { computeSha256 } from "./computeSha256"
-
-const secretFingerprint = `${process.env.FINGERPRINT_KEY || ""}`
-
-export function computeSecretFingerprint(input: string) {
- return computeSha256(`${secretFingerprint}_${input}`)
-}
\ No newline at end of file
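`computeSecretFingerprint` is a keyed hash: the input is prefixed with a server-side secret before hashing, so fingerprints cannot be forged without the key. A Python sketch of the same idea, assuming `computeSha256` is a hex-encoded SHA-256 (the `FINGERPRINT_KEY` env var name comes from the TS code):

```python
import hashlib
import os

def compute_secret_fingerprint(value, secret=None):
    """Hash value prefixed with a server-side secret, mirroring the TS helper."""
    if secret is None:
        secret = os.environ.get("FINGERPRINT_KEY", "")
    return hashlib.sha256(f"{secret}_{value}".encode("utf-8")).hexdigest()

print(compute_secret_fingerprint("hello", secret="k"))  # 64 hex chars
```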
diff --git a/spaces/jbilcke-hf/ai-comic-factory/src/lib/dirtyLLMJsonParser.ts b/spaces/jbilcke-hf/ai-comic-factory/src/lib/dirtyLLMJsonParser.ts
deleted file mode 100644
index 869cb9cc5cabca4b45b954e9c6d3fb837d2597d2..0000000000000000000000000000000000000000
--- a/spaces/jbilcke-hf/ai-comic-factory/src/lib/dirtyLLMJsonParser.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-import { LLMResponse } from "@/types"
-import { cleanJson } from "./cleanJson"
-import { parseBadJSON } from "./parseBadJSON"
-
-export function dirtyLLMJsonParser(input: string): LLMResponse {
-
- if (input.includes("```")) {
- input = input.split("```")[0]
- }
- // we only keep what's after the first [
- let jsonOrNot = cleanJson(input)
-
- const jsonData = parseBadJSON(jsonOrNot) as LLMResponse
-
- const results = jsonData.map((item, i) => {
- let panel = i
- let caption = item.caption ? item.caption.trim() : ''
- let instructions = item.instructions ? item.instructions.trim() : ''
- if (!instructions && caption) {
- instructions = caption
- }
- if (!caption && instructions) {
- caption = instructions
- }
- return { panel, caption, instructions }
- })
-
- return results
-}
\ No newline at end of file
diff --git a/spaces/jhonparra18/ocr-LLM-image-summarizer/image_processor.py b/spaces/jhonparra18/ocr-LLM-image-summarizer/image_processor.py
deleted file mode 100644
index 9df98fe09ecd603cff60e41e9a7e246248d79b34..0000000000000000000000000000000000000000
--- a/spaces/jhonparra18/ocr-LLM-image-summarizer/image_processor.py
+++ /dev/null
@@ -1,107 +0,0 @@
-import cv2
-import pytesseract
-from config import PYTESSERACT_DEFAULT_CONFIG
-from pathlib import Path
-from tqdm import tqdm
-import numpy as np
-from langchain.tools import BaseTool
-from typing import Optional, Type
-from langchain.callbacks.manager import AsyncCallbackManagerForToolRun
-from PIL import Image
-
-class ImageProcessor(BaseTool):
-
- name = "ImageProcessor"
- description = "useful for extracting text from a receipt or invoice image located at img_path: preprocesses the image and returns all text found by an OCR system."
-
- def binarize(self,img_path):
- """
- Convert an input image to grayscale (Otsu binarization is commented out
- below, since thresholding hurt performance on our test invoices/receipts).
- :param img_path: path to an image of shape (h, w, channel)
- :return: an image of shape (h, w)
- """
- img=cv2.imread(img_path)
- gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
- #gray = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1] #threshold may affect performance for invoices|receipts as seen in our test dataset
- return gray
-
- def remove_watermark(self,img,alpha = 1.8,beta = -180):
- """remove watermark from image
- img: cv2 image| np.array"""
- new = alpha * img + beta
- new = np.clip(new, 0, 255).astype(np.uint8)
- return new
-
- def deskew(self,img):
- coords = np.column_stack(np.where(img > 0))
- angle = cv2.minAreaRect(coords)[-1]
- if angle < -45:
- angle = -(90 + angle)
- else:
- angle = -angle
- (h, w) = img.shape[:2]
- center = (w // 2, h // 2)
- M = cv2.getRotationMatrix2D(center, angle, 1.0)
- rotated = cv2.warpAffine(img, M, (w, h), flags=cv2.INTER_CUBIC, borderMode=cv2.BORDER_REPLICATE)
- return rotated
-
- def dilate_erode(self,img):
- """
- smooth the image and apply dilation/erosion passes to join broken strokes
- img: cv2 image | np.array
- """
- kernel = np.ones((2, 1), np.uint8)
- kernel2 = np.ones((1, 1), np.uint8)
- img = cv2.blur(img,(6,5))
- img=cv2.dilate(img, kernel, iterations=3)
-        img = cv2.erode(img, kernel, iterations=1)
- img = cv2.blur(img,(1,1))
- img = cv2.bilateralFilter(img,10,35,30)
- img= cv2.dilate(img, kernel2, iterations=1)
- return img
-
-
-    def detect_angle(self, img_path):
-        """rotates the image according to its EXIF orientation tag, then returns it as a BGR array"""
-        # taken from https://stackoverflow.com/questions/13872331/rotating-an-image-with-orientation-specified-in-exif-using-python-without-pil-in
- pil_img=Image.open(img_path)
- img_exif = pil_img.getexif()
- if len(img_exif):
- if img_exif[274] == 3:
- pil_img = pil_img.transpose(Image.ROTATE_180)
- elif img_exif[274] == 6:
- pil_img = pil_img.transpose(Image.ROTATE_270)
- elif img_exif[274] == 8:
- pil_img = pil_img.transpose(Image.ROTATE_90)
-
- return np.array(pil_img)[:, :, ::-1] #convert to BGR
-
- def opening(self,image):
- kernel = np.ones((5,5),np.uint8)
- return cv2.morphologyEx(image, cv2.MORPH_OPEN, kernel)
-
- def process_image(self,img_path):
- img=self.binarize(img_path)
- img=self.remove_watermark(img)
- return img
-
- def img_to_text(self,img,lang="spa"):
- text=pytesseract.image_to_string(img,lang=lang,config=PYTESSERACT_DEFAULT_CONFIG)
- return text
-
- def _run(self,img_path,save_to_disk=False):
- img=self.process_image(img_path)
- text=self.img_to_text(img)
- if save_to_disk:
- with open(f"/tmp/{str(img_path).split('/')[-1].replace('.jpg','.txt')}",'w') as f:
- f.write(text)
-            cv2.imwrite(f"images/rotated-{Path(img_path).name}", img)
- return text
-
- # as used in langchain documentation https://python.langchain.com/docs/modules/agents/tools/custom_tools
- async def _arun(self, img_path: str,save_to_disk=False, run_manager: Optional[AsyncCallbackManagerForToolRun] = None
- ) -> str:
- """Use the tool asynchronously."""
- raise NotImplementedError("does not support async")
-
-
diff --git a/spaces/jie1/succ1/file/Preinput_Merge.py b/spaces/jie1/succ1/file/Preinput_Merge.py
deleted file mode 100644
index d0bc72e6398c6bc153e5d0a6a02b3df718496741..0000000000000000000000000000000000000000
--- a/spaces/jie1/succ1/file/Preinput_Merge.py
+++ /dev/null
@@ -1,83 +0,0 @@
-import re
-
-from tname import *
-from Rfile import *
-
-
-def Strip(seq_file):
- contents = j_reads(seq_file.name)
- ina = Name()
-    ina = ina + r"input.tsv"  # output file name
-
-    # strip the line breaks inside each sequence and write the result to a new file
- for i in range(0, len(contents) - 1):
- if contents[i][0] != '>' and contents[i + 1][0] != '>':
- content = contents[i].split()
- content = content[0]
- else:
- content = contents[i]
- with open(ina, "a") as f:
- f.write(content)
-    # the last line is a special case, so write it separately
- with open(ina, "a") as f:
- f.write(contents[len(contents) - 1])
- return ina
-
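`Strip()` above removes the line breaks inside each sequence so that every record of the FASTA-style input ends up on a single line. The same idea can be sketched as a standalone, file-free helper (`unwrap_fasta` is a hypothetical name, not part of this repo):

```python
def unwrap_fasta(lines):
    """Collapse wrapped sequence lines so each '>' header is paired with
    its full sequence on a single line (hypothetical helper, mirroring
    what Strip() does with the contents returned by j_reads)."""
    records = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            # a new record starts: header plus an empty sequence accumulator
            records.append([line, ""])
        else:
            # continuation line: append to the current record's sequence
            records[-1][1] += line
    return records
```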
-
-def Merge(smi_file, seq_file):
- smile = j_read(smi_file.name)
- smile = smile.strip("\n")
-
-    # read the file with line breaks already removed
- contents = j_reads(seq_file.name)
-
- name = Name()
-    name = name + r"kcat_input.tsv"  # output file name
-
- with open(name, "a") as f3:
- f3.write("Substrate Name Substrate SMILES Protein Sequence")
- f3.write("\n")
-
- for i in range(0, len(contents)):
- if i % 2 == 1:
- with open(name, "a") as f3:
-                # write the index
- f3.write(">seq" + str(int((i - 1) / 2)))
- f3.write("\t")
-                # write the SMILES string
- f3.write(smile)
- f3.write("\t")
-                # write the sequence
- f3.write(contents[i])
- return name
-
-
-def Merge_All(smi_file, seq_file):
- smile = j_read(smi_file.name)
- smile = smile.strip("\n")
-
-    # read the file with line breaks already removed
- contents = j_reads(seq_file.name)
-
- name = Name()
-    name = name + r"kcat_input.tsv"  # output file name
-
- with open(name, "a") as f3:
- f3.write("Substrate Name Substrate SMILES Protein Sequence")
- f3.write("\n")
-
- for i in range(0, len(contents)):
- if i % 2 == 1:
- with open(name, "a") as f3:
-                # write the index
- # f3.write(">seq" + str(int((i - 1) / 2)))
- info = re.sub(' ', '_', contents[i - 1])
- info = re.sub('\n', '', info)
- f3.write(info)
- f3.write("\t")
-                # write the SMILES string
- f3.write(smile)
- f3.write("\t")
-                # write the sequence
- f3.write(contents[i])
- return name
diff --git a/spaces/joaopereirajp/livvieChatBot/venv/lib/python3.9/site-packages/Crypto/SelfTest/Hash/test_keccak.py b/spaces/joaopereirajp/livvieChatBot/venv/lib/python3.9/site-packages/Crypto/SelfTest/Hash/test_keccak.py
deleted file mode 100644
index 54cdf27d8be70a1a7295db0d2abe40c0cc39d381..0000000000000000000000000000000000000000
--- a/spaces/joaopereirajp/livvieChatBot/venv/lib/python3.9/site-packages/Crypto/SelfTest/Hash/test_keccak.py
+++ /dev/null
@@ -1,250 +0,0 @@
-# ===================================================================
-#
-# Copyright (c) 2015, Legrandin
-# All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-#
-# 1. Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# 2. Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in
-# the documentation and/or other materials provided with the
-# distribution.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
-# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
-# COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
-# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
-# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
-# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
-# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
-# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
-# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
-# POSSIBILITY OF SUCH DAMAGE.
-# ===================================================================
-
-"""Self-test suite for Crypto.Hash.keccak"""
-
-import unittest
-from binascii import hexlify, unhexlify
-
-from Crypto.SelfTest.loader import load_test_vectors
-from Crypto.SelfTest.st_common import list_test_cases
-
-from Crypto.Hash import keccak
-from Crypto.Util.py3compat import b, tobytes, bchr
-
-class KeccakTest(unittest.TestCase):
-
- def test_new_positive(self):
-
- for digest_bits in (224, 256, 384, 512):
- hobj = keccak.new(digest_bits=digest_bits)
- self.assertEqual(hobj.digest_size, digest_bits // 8)
-
- hobj2 = hobj.new()
- self.assertEqual(hobj2.digest_size, digest_bits // 8)
-
- for digest_bytes in (28, 32, 48, 64):
- hobj = keccak.new(digest_bytes=digest_bytes)
- self.assertEqual(hobj.digest_size, digest_bytes)
-
- hobj2 = hobj.new()
- self.assertEqual(hobj2.digest_size, digest_bytes)
-
- def test_new_positive2(self):
-
- digest1 = keccak.new(data=b("\x90"), digest_bytes=64).digest()
- digest2 = keccak.new(digest_bytes=64).update(b("\x90")).digest()
- self.assertEqual(digest1, digest2)
-
- def test_new_negative(self):
-
- # keccak.new needs digest size
- self.assertRaises(TypeError, keccak.new)
-
- h = keccak.new(digest_bits=512)
-
- # Either bits or bytes can be specified
- self.assertRaises(TypeError, keccak.new,
- digest_bytes=64,
- digest_bits=512)
-
- # Range
- self.assertRaises(ValueError, keccak.new, digest_bytes=0)
- self.assertRaises(ValueError, keccak.new, digest_bytes=1)
- self.assertRaises(ValueError, keccak.new, digest_bytes=65)
- self.assertRaises(ValueError, keccak.new, digest_bits=0)
- self.assertRaises(ValueError, keccak.new, digest_bits=1)
- self.assertRaises(ValueError, keccak.new, digest_bits=513)
-
- def test_update(self):
- pieces = [bchr(10) * 200, bchr(20) * 300]
- h = keccak.new(digest_bytes=64)
- h.update(pieces[0]).update(pieces[1])
- digest = h.digest()
- h = keccak.new(digest_bytes=64)
- h.update(pieces[0] + pieces[1])
- self.assertEqual(h.digest(), digest)
-
- def test_update_negative(self):
- h = keccak.new(digest_bytes=64)
- self.assertRaises(TypeError, h.update, u"string")
-
- def test_digest(self):
- h = keccak.new(digest_bytes=64)
- digest = h.digest()
-
- # hexdigest does not change the state
- self.assertEqual(h.digest(), digest)
- # digest returns a byte string
- self.assertTrue(isinstance(digest, type(b("digest"))))
-
- def test_hex_digest(self):
- mac = keccak.new(digest_bits=512)
- digest = mac.digest()
- hexdigest = mac.hexdigest()
-
- # hexdigest is equivalent to digest
- self.assertEqual(hexlify(digest), tobytes(hexdigest))
- # hexdigest does not change the state
- self.assertEqual(mac.hexdigest(), hexdigest)
- # hexdigest returns a string
- self.assertTrue(isinstance(hexdigest, type("digest")))
-
- def test_update_after_digest(self):
- msg=b("rrrrttt")
-
- # Normally, update() cannot be done after digest()
- h = keccak.new(digest_bits=512, data=msg[:4])
- dig1 = h.digest()
- self.assertRaises(TypeError, h.update, msg[4:])
- dig2 = keccak.new(digest_bits=512, data=msg).digest()
-
- # With the proper flag, it is allowed
- h = keccak.new(digest_bits=512, data=msg[:4], update_after_digest=True)
- self.assertEqual(h.digest(), dig1)
- # ... and the subsequent digest applies to the entire message
- # up to that point
- h.update(msg[4:])
- self.assertEqual(h.digest(), dig2)
-
-
-class KeccakVectors(unittest.TestCase):
- pass
-
- # TODO: add ExtremelyLong tests
-
-
-test_vectors_224 = load_test_vectors(("Hash", "keccak"),
- "ShortMsgKAT_224.txt",
- "Short Messages KAT 224",
- {"len": lambda x: int(x)}) or []
-
-test_vectors_224 += load_test_vectors(("Hash", "keccak"),
- "LongMsgKAT_224.txt",
- "Long Messages KAT 224",
- {"len": lambda x: int(x)}) or []
-
-for idx, tv in enumerate(test_vectors_224):
- if tv.len == 0:
- data = b("")
- else:
- data = tobytes(tv.msg)
-
- def new_test(self, data=data, result=tv.md):
- hobj = keccak.new(digest_bits=224, data=data)
- self.assertEqual(hobj.digest(), result)
-
- setattr(KeccakVectors, "test_224_%d" % idx, new_test)
-
-# ---
-
-test_vectors_256 = load_test_vectors(("Hash", "keccak"),
- "ShortMsgKAT_256.txt",
- "Short Messages KAT 256",
- { "len" : lambda x: int(x) } ) or []
-
-test_vectors_256 += load_test_vectors(("Hash", "keccak"),
- "LongMsgKAT_256.txt",
- "Long Messages KAT 256",
- { "len" : lambda x: int(x) } ) or []
-
-for idx, tv in enumerate(test_vectors_256):
- if tv.len == 0:
- data = b("")
- else:
- data = tobytes(tv.msg)
-
- def new_test(self, data=data, result=tv.md):
- hobj = keccak.new(digest_bits=256, data=data)
- self.assertEqual(hobj.digest(), result)
-
- setattr(KeccakVectors, "test_256_%d" % idx, new_test)
-
-
-# ---
-
-test_vectors_384 = load_test_vectors(("Hash", "keccak"),
- "ShortMsgKAT_384.txt",
- "Short Messages KAT 384",
- {"len": lambda x: int(x)}) or []
-
-test_vectors_384 += load_test_vectors(("Hash", "keccak"),
- "LongMsgKAT_384.txt",
- "Long Messages KAT 384",
- {"len": lambda x: int(x)}) or []
-
-for idx, tv in enumerate(test_vectors_384):
- if tv.len == 0:
- data = b("")
- else:
- data = tobytes(tv.msg)
-
- def new_test(self, data=data, result=tv.md):
- hobj = keccak.new(digest_bits=384, data=data)
- self.assertEqual(hobj.digest(), result)
-
- setattr(KeccakVectors, "test_384_%d" % idx, new_test)
-
-# ---
-
-test_vectors_512 = load_test_vectors(("Hash", "keccak"),
- "ShortMsgKAT_512.txt",
- "Short Messages KAT 512",
- {"len": lambda x: int(x)}) or []
-
-test_vectors_512 += load_test_vectors(("Hash", "keccak"),
- "LongMsgKAT_512.txt",
- "Long Messages KAT 512",
- {"len": lambda x: int(x)}) or []
-
-for idx, tv in enumerate(test_vectors_512):
- if tv.len == 0:
- data = b("")
- else:
- data = tobytes(tv.msg)
-
- def new_test(self, data=data, result=tv.md):
- hobj = keccak.new(digest_bits=512, data=data)
- self.assertEqual(hobj.digest(), result)
-
- setattr(KeccakVectors, "test_512_%d" % idx, new_test)
-
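The loops above attach one generated test method per KAT vector to `KeccakVectors` via `setattr`, binding the loop variables through default arguments so each method keeps its own `data` and `result`. A minimal sketch of that pattern, with toy vectors in place of the keccak KAT files:

```python
import unittest

class Vectors(unittest.TestCase):
    pass

# toy stand-ins for the (msg, md) pairs loaded from the KAT files
vectors = [(b"", 0), (b"abc", 3)]

for idx, (data, expected) in enumerate(vectors):
    # Default arguments freeze the current loop values; without them every
    # generated method would close over the *last* iteration's variables.
    def new_test(self, data=data, expected=expected):
        self.assertEqual(len(data), expected)
    setattr(Vectors, "test_vec_%d" % idx, new_test)
```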
-
-def get_tests(config={}):
- tests = []
- tests += list_test_cases(KeccakTest)
- tests += list_test_cases(KeccakVectors)
- return tests
-
-
-if __name__ == '__main__':
- import unittest
- suite = lambda: unittest.TestSuite(get_tests())
- unittest.main(defaultTest='suite')
diff --git a/spaces/johnberg/CLIPInverter/models/encoders/model_irse.py b/spaces/johnberg/CLIPInverter/models/encoders/model_irse.py
deleted file mode 100644
index ea6c6091c1e71279ff0bc7e013b0cea287cb01b3..0000000000000000000000000000000000000000
--- a/spaces/johnberg/CLIPInverter/models/encoders/model_irse.py
+++ /dev/null
@@ -1,84 +0,0 @@
-from torch.nn import Linear, Conv2d, BatchNorm1d, BatchNorm2d, PReLU, Dropout, Sequential, Module
-from models.encoders.helpers import get_blocks, Flatten, bottleneck_IR, bottleneck_IR_SE, l2_norm
-
-"""
-Modified Backbone implementation from [TreB1eN](https://github.com/TreB1eN/InsightFace_Pytorch)
-"""
-
-
-class Backbone(Module):
- def __init__(self, input_size, num_layers, mode='ir', drop_ratio=0.4, affine=True):
- super(Backbone, self).__init__()
- assert input_size in [112, 224], "input_size should be 112 or 224"
- assert num_layers in [50, 100, 152], "num_layers should be 50, 100 or 152"
- assert mode in ['ir', 'ir_se'], "mode should be ir or ir_se"
- blocks = get_blocks(num_layers)
- if mode == 'ir':
- unit_module = bottleneck_IR
- elif mode == 'ir_se':
- unit_module = bottleneck_IR_SE
- self.input_layer = Sequential(Conv2d(3, 64, (3, 3), 1, 1, bias=False),
- BatchNorm2d(64),
- PReLU(64))
- if input_size == 112:
- self.output_layer = Sequential(BatchNorm2d(512),
- Dropout(drop_ratio),
- Flatten(),
- Linear(512 * 7 * 7, 512),
- BatchNorm1d(512, affine=affine))
- else:
- self.output_layer = Sequential(BatchNorm2d(512),
- Dropout(drop_ratio),
- Flatten(),
- Linear(512 * 14 * 14, 512),
- BatchNorm1d(512, affine=affine))
-
- modules = []
- for block in blocks:
- for bottleneck in block:
- modules.append(unit_module(bottleneck.in_channel,
- bottleneck.depth,
- bottleneck.stride))
- self.body = Sequential(*modules)
-
- def forward(self, x):
- x = self.input_layer(x)
- x = self.body(x)
- x = self.output_layer(x)
- return l2_norm(x)
-
-
-def IR_50(input_size):
- """Constructs a ir-50 model."""
- model = Backbone(input_size, num_layers=50, mode='ir', drop_ratio=0.4, affine=False)
- return model
-
-
-def IR_101(input_size):
- """Constructs a ir-101 model."""
- model = Backbone(input_size, num_layers=100, mode='ir', drop_ratio=0.4, affine=False)
- return model
-
-
-def IR_152(input_size):
- """Constructs a ir-152 model."""
- model = Backbone(input_size, num_layers=152, mode='ir', drop_ratio=0.4, affine=False)
- return model
-
-
-def IR_SE_50(input_size):
- """Constructs a ir_se-50 model."""
- model = Backbone(input_size, num_layers=50, mode='ir_se', drop_ratio=0.4, affine=False)
- return model
-
-
-def IR_SE_101(input_size):
- """Constructs a ir_se-101 model."""
- model = Backbone(input_size, num_layers=100, mode='ir_se', drop_ratio=0.4, affine=False)
- return model
-
-
-def IR_SE_152(input_size):
- """Constructs a ir_se-152 model."""
- model = Backbone(input_size, num_layers=152, mode='ir_se', drop_ratio=0.4, affine=False)
- return model
diff --git a/spaces/jonigata/PoseTweak/pose.py b/spaces/jonigata/PoseTweak/pose.py
deleted file mode 100644
index cd2d1d0e45d0519a68d9bc7ca052cc8d423abb07..0000000000000000000000000000000000000000
--- a/spaces/jonigata/PoseTweak/pose.py
+++ /dev/null
@@ -1,50 +0,0 @@
-from mmpose.apis import (inference_top_down_pose_model, init_pose_model,
- process_mmdet_results, vis_pose_result)
-from mmpose.datasets import DatasetInfo
-from mmdet.apis import inference_detector, init_detector
-
-det_model = init_detector(
- "./external/faster_rcnn_r50_fpn_coco.py",
- "./faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth",
- device="cpu")
-pose_model = init_pose_model(
- "./external/hrnet_w48_coco_256x192.py",
- "./hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth",
- device="cpu")
-
-dataset = pose_model.cfg.data['test']['type']
-dataset_info = pose_model.cfg.data['test'].get('dataset_info', None)
-
-dataset_info = DatasetInfo(dataset_info)
-
-def infer(image):
- mmdet_results = inference_detector(det_model, image)
- person_results = process_mmdet_results(mmdet_results, 1)
-
- pose_results, returned_outputs = inference_top_down_pose_model(
- pose_model,
- image,
- person_results,
- bbox_thr=0.3,
- format='xyxy',
- dataset=dataset,
- dataset_info=dataset_info,
- return_heatmap=False,
- outputs=None)
- # print(pose_results)
- # print(returned_outputs)
-
- return pose_results, returned_outputs
-
-def draw(image, results):
- return vis_pose_result(
- pose_model,
- image,
- results,
- dataset=dataset,
- dataset_info=dataset_info,
- kpt_score_thr=0.3,
- radius=4,
- thickness=3,
- show=False,
- out_file=None)
diff --git a/spaces/josegabmuz/gradio-test/app.py b/spaces/josegabmuz/gradio-test/app.py
deleted file mode 100644
index 34a37eff273335a001dfdff51c225239e0841687..0000000000000000000000000000000000000000
--- a/spaces/josegabmuz/gradio-test/app.py
+++ /dev/null
@@ -1,25 +0,0 @@
-import gradio as gr
-def bmi(name, height, weight, feeling):
-    bmi_val = weight / ((height / 100) ** 2)
-    result_emoticon = "( ͡❛ ‿‿ ͡❛)" if bmi_val < 30 else "( ͡❛ 👅 ͡❛)"
-
-    txt = " I am happy" if feeling else " I am not feeling great today"
-    return "Hello " + name + ", your BMI is " + str(bmi_val) + " " + result_emoticon + txt
-
-
-interface=gr.Interface(
- fn=bmi,
-    inputs=["text", gr.Slider(1, 200, label="height"), gr.Slider(1, 100, label="weight"), gr.Checkbox(label="are you feeling happy today?")],
- outputs=["text"],
- examples=[["jose",100,50,True],
- ["pedro",50,72,False]],
-
-    description="use the flag button if you find a problem",
- allow_flagging="manual",
- css="""
- body {background-color:green}
- """,
- flagging_options=["wrong sign", "off by one", "other"]
-)
-
-interface.launch(debug=True)
\ No newline at end of file
diff --git a/spaces/kTonpa/Text2Cryptopunks/text2punks/text2punk.py b/spaces/kTonpa/Text2Cryptopunks/text2punks/text2punk.py
deleted file mode 100644
index 0238c7560267b2b20b49621d8c355f9a6da64279..0000000000000000000000000000000000000000
--- a/spaces/kTonpa/Text2Cryptopunks/text2punks/text2punk.py
+++ /dev/null
@@ -1,377 +0,0 @@
-import math
-
-from einops import rearrange, repeat
-
-import torch
-from torch import nn, einsum
-import torch.nn.functional as F
-
-from axial_positional_embedding import AxialPositionalEmbedding
-from text2punks.transformer import Transformer
-
-
-# helpers fns
-
-def exists(val):
- return val is not None
-
-def default(val, d):
- return val if exists(val) else d
-
-def set_requires_grad(model, value):
- for param in model.parameters():
- param.requires_grad = value
-
-def eval_decorator(fn):
- def inner(model, *args, **kwargs):
- was_training = model.training
- model.eval()
- out = fn(model, *args, **kwargs)
- model.train(was_training)
- return out
- return inner
-
-# sampling helpers fn
-
-def top_k(logits, thres = 0.5):
- num_logits = logits.shape[-1]
- k = max(int((1 - thres) * num_logits), 1)
- val, ind = torch.topk(logits, k)
- probs = torch.full_like(logits, float('-inf'))
- probs.scatter_(1, ind, val)
- return probs
-
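`top_k` above keeps only the largest `(1 - thres)` fraction of the logits and masks the rest to `-inf` before softmax sampling. A NumPy-only sketch of the same filter (`top_k_filter` is an illustrative name mirroring the torch helper):

```python
import numpy as np

def top_k_filter(logits, thres=0.5):
    """Keep the k = (1 - thres) * vocab largest logits per row (at least 1)
    and mask everything else to -inf, mirroring the torch helper above."""
    k = max(int((1 - thres) * logits.shape[-1]), 1)
    out = np.full_like(logits, -np.inf)
    # indices of the k largest entries along the last axis
    idx = np.argpartition(logits, -k, axis=-1)[..., -k:]
    np.put_along_axis(out, idx, np.take_along_axis(logits, idx, axis=-1), axis=-1)
    return out
```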
-# main CLIP class
-
-class CLIP(nn.Module):
- def __init__(
- self,
- *,
- dim_text = 512,
- dim_image = 512,
- dim_latent = 512,
- num_text_tokens = 10000,
- text_enc_depth = 6,
- text_seq_len = 256,
- text_heads = 8,
- num_visual_tokens = 256,
- visual_enc_depth = 6,
- visual_image_seq_len = 256,
- visual_image_size = 24,
- visual_heads = 8,
- attn_pdrop = 0.1,
- resid_pdrop = 0.1,
- embd_pdrop = 0.1,
- ff_dropout = 0.1,
- attn_types = None
- ):
- super().__init__()
-
- # Texts
-
- self.text_emb = nn.Embedding(num_text_tokens, dim_text)
- self.text_pos_emb = nn.Embedding(text_seq_len, dim_text)
-
- self.text_transformer = Transformer(
- dim = dim_text,
- causal = False,
- seq_len = text_seq_len,
- depth = text_enc_depth,
- heads = text_heads,
- dim_head = dim_text // text_heads,
- attn_dropout = attn_pdrop,
- resid_dropout = resid_pdrop,
- embd_dropout = embd_pdrop,
- ff_dropout = ff_dropout,
- attn_types = attn_types
- )
-
- self.text_ln = nn.LayerNorm(dim_text)
- self.to_text_latent = nn.Linear(dim_text, dim_latent, bias = False)
-
- # Images
-
- self.image_emb = nn.Embedding(num_visual_tokens, dim_image)
- self.image_pos_emb = nn.Embedding(visual_image_seq_len, dim_image)
-
- self.visual_transformer = Transformer(
- dim = dim_image,
- causal = False,
- seq_len = visual_image_seq_len,
- depth = visual_enc_depth,
- heads = visual_heads,
- dim_head = dim_image // visual_heads,
- attn_dropout = attn_pdrop,
- resid_dropout = resid_pdrop,
- embd_dropout = embd_pdrop,
- ff_dropout = ff_dropout,
- attn_types = attn_types,
- image_size = visual_image_size,
- )
-
- self.image_ln = nn.LayerNorm(dim_image)
- self.to_visual_latent = nn.Linear(dim_image, dim_latent, bias = False)
-
- self.temperature = nn.Parameter(torch.ones([]) * math.log(1 / 0.07))
-
-
- self.apply(self._init_weights)
-
- def _init_weights(self, module):
- if isinstance(module, (nn.Linear, nn.Embedding)):
- module.weight.data.normal_(mean=0.0, std=0.02)
- if isinstance(module, nn.Linear) and module.bias is not None:
- module.bias.data.zero_()
- elif isinstance(module, nn.LayerNorm):
- module.bias.data.zero_()
- module.weight.data.fill_(1.0)
-
- def forward(
- self,
- text,
- image,
- return_loss = False
- ):
- b, device= text.shape[0], text.device
-
- text_emb = self.text_emb(text)
- text_emb += self.text_pos_emb(torch.arange(text.shape[1], device = device))
-
- image_emb = self.image_emb(image)
- image_emb += self.image_pos_emb(torch.arange(image.shape[1], device = device))
-
- enc_text = self.text_transformer(text_emb)
- enc_image = self.visual_transformer(image_emb)
-
- text_latents = enc_text.mean(dim = 1)
- image_latents = enc_image.mean(dim = 1)
-
- text_latents = self.text_ln(text_latents)
- image_latents = self.image_ln(image_latents)
-
- text_latents = self.to_text_latent(text_latents)
- image_latents = self.to_visual_latent(image_latents)
-
- text_latents, image_latents = map(lambda t: F.normalize(t, p = 2, dim = -1), (text_latents, image_latents))
-
- temp = self.temperature.exp()
-
- if not return_loss:
- sim = einsum('n d, n d -> n', text_latents, image_latents) * temp
- return sim
-
- sim = einsum('i d, j d -> i j', text_latents, image_latents) * temp
- labels = torch.arange(b, device = device)
- loss = (F.cross_entropy(sim, labels) + F.cross_entropy(sim.t(), labels)) / 2
- return loss
-
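`CLIP.forward` above computes a symmetric cross-entropy over the text-image similarity matrix, with the diagonal entries as the positive pairs. A NumPy sketch of that loss under the same convention (`symmetric_clip_loss` is an illustrative name, not from this repo):

```python
import numpy as np

def symmetric_clip_loss(sim):
    """sim: [n, n] text-image similarity matrix; diagonal = matched pairs.
    Cross entropy with labels 0..n-1 over rows and over columns, averaged."""
    def xent(logits):
        logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
        logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        # negative log-probability of the diagonal (correct) entries
        return -logp[np.arange(len(logits)), np.arange(len(logits))].mean()
    return (xent(sim) + xent(sim.T)) / 2
```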
-# main Text2Punks class
-
-class Text2Punks(nn.Module):
- def __init__(
- self,
- *,
- n_embd,
- n_layer = 12,
- n_head = 12,
- d_head = 64,
- num_text_tokens = 10000,
- text_seq_len = 256,
- num_image_tokens = 222,
- image_seq_len = 576,
- image_size = 24,
- attn_pdrop = 0.1,
- resid_pdrop = 0.1,
- embd_pdrop = 0.1,
- ff_dropout = 0.1,
- attn_types = None,
- loss_img_weight = 7,
- loss_txt_weight = 7,
- ):
- super().__init__()
-
- num_text_tokens = num_text_tokens + text_seq_len # reserve unique padding tokens for each position (text seq len)
-
- self.text_emb = nn.Embedding(num_text_tokens, n_embd)
- self.image_emb = nn.Embedding(num_image_tokens, n_embd)
-
-        self.text_pos_emb = nn.Embedding(text_seq_len + 1, n_embd) # +1 for the <bos> token
- # self.image_pos_emb = nn.Embedding(image_seq_len, n_embd)
- self.image_pos_emb = nn.Parameter(torch.zeros(1, image_seq_len, n_embd))
- # self.image_pos_emb = AxialPositionalEmbedding(n_embd, axial_shape=(image_size, image_size))
-
- self.num_text_tokens = num_text_tokens # for offsetting logits index and calculating cross entropy loss
- self.num_image_tokens = num_image_tokens
- self.text_seq_len = text_seq_len
- self.image_seq_len = image_seq_len
-
- seq_len = text_seq_len + image_seq_len
- total_tokens = num_text_tokens + num_image_tokens
- self.total_seq_len = seq_len
- self.total_tokens = total_tokens
-
- self.transformer = Transformer(
- dim = n_embd,
- causal = True,
- seq_len = seq_len,
- depth = n_layer,
- heads = n_head,
- dim_head = d_head,
- attn_dropout = attn_pdrop,
- resid_dropout = resid_pdrop,
- embd_dropout = embd_pdrop,
- ff_dropout = ff_dropout,
- attn_types = attn_types,
- image_size = image_size,
- )
-
- self.to_logits = nn.Sequential(
- nn.LayerNorm(n_embd),
- nn.Linear(n_embd, self.total_tokens),
- )
-
- seq_range = torch.arange(seq_len)
- logits_range = torch.arange(total_tokens)
-
- seq_range = rearrange(seq_range, 'n -> () n ()')
- logits_range = rearrange(logits_range, 'd -> () () d')
-
- logits_mask = (
- ((seq_range >= text_seq_len) & (logits_range < num_text_tokens)) |
- ((seq_range < text_seq_len) & (logits_range >= num_text_tokens))
- )
-
- self.register_buffer('logits_mask', logits_mask, persistent=False)
- self.loss_img_weight = loss_img_weight
- self.loss_txt_weight = loss_txt_weight
-
- self.apply(self._init_weights)
-
- def _init_weights(self, module):
- if isinstance(module, (nn.Linear, nn.Embedding)):
- module.weight.data.normal_(mean=0.0, std=0.02)
- if isinstance(module, nn.Linear) and module.bias is not None:
- module.bias.data.zero_()
- elif isinstance(module, nn.LayerNorm):
- module.bias.data.zero_()
- module.weight.data.fill_(1.0)
-
- @torch.no_grad()
- @eval_decorator
- def generate_images(
- self,
- text,
- decoder,
- *,
- clip = None,
- filter_thres = 0.5,
- temperature = 1.,
- img = None,
- num_init_img_tokens = None
- ):
- text_seq_len, image_seq_len, num_text_tokens = self.text_seq_len, self.image_seq_len, self.num_text_tokens
- total_len = text_seq_len + image_seq_len
-
- batch = text.shape[0]
- text = text[:, :text_seq_len] # make sure text is within bounds
- out = text
-
- if exists(img):
- assert img.shape[1] == image_seq_len, f'input image must have the correct image size {image_seq_len}'
-
- num_img_tokens = default(num_init_img_tokens, int(0.4375 * image_seq_len)) # OpenAI used 14 * 32 initial tokens to prime
- assert num_img_tokens < image_seq_len, 'number of initial image tokens for priming must be less than the total image token sequence length'
-
- trunc_img = img[:, :num_img_tokens]
- out = torch.cat((out, trunc_img), dim = -1)
-
- for cur_len in range(out.shape[1], total_len):
- is_image = cur_len >= text_seq_len
-
- text, image = out[:, :text_seq_len], out[:, text_seq_len:]
-
- logits = self(text, image)[:, -1, :]
-
- filtered_logits = top_k(logits, thres = filter_thres)
- probs = F.softmax(filtered_logits / temperature, dim = -1)
- sample = torch.multinomial(probs, 1)
-
- sample -= (num_text_tokens if is_image else 0) # offset sampled token if it is an image token, since logit space is composed of text and then image tokens
- out = torch.cat((out, sample), dim=-1)
-
- text_seq = out[:, :text_seq_len]
- img_seq = out[:, -image_seq_len:]
-
- scores = None
- if exists(clip):
- scores = clip(text_seq, img_seq, return_loss = False)
-
- img_seq = repeat(img_seq, 'b p -> b p c', c=3)
- decoder = repeat(decoder, 'p c -> b p c', b=batch)
- images = torch.gather(decoder, 1, img_seq)
-        images = rearrange(images, 'b (h w) c -> b c h w', h=24, w=24)
- images = images.float()
-
- return images, scores
-
- def forward(
- self,
- text,
- image = None,
- return_loss = False
- ):
- assert text.shape[-1] == self.text_seq_len, f'the length {text.shape[-1]} of the text tokens you passed in does not have the correct length ({self.text_seq_len})'
- device, total_seq_len = text.device, self.total_seq_len
-
- text_range = torch.arange(self.text_seq_len, device = device) + (self.num_text_tokens - self.text_seq_len)
- text = torch.where(text == 0, text_range, text)
-
-        text = F.pad(text, (1, 0), value = 0) # prepend the <bos> token
-
- tokens = self.text_emb(text)
- tokens += self.text_pos_emb(torch.arange(text.shape[1], device = device))
-
- seq_len = tokens.shape[1]
-
- image_len = image.shape[1]
- image_emb = self.image_emb(image)
- # image_emb += self.image_pos_emb(torch.arange(image_len, device = device))
- image_emb += self.image_pos_emb[:, :image_len, :]
-
- # image_emb += self.image_pos_emb(image_emb)
-
- tokens = torch.cat((tokens, image_emb), dim = 1)
-
- seq_len += image_len
-
- # when training, if the length exceeds the total text + image length
- # remove the last token, since it needs not to be trained
-
- if tokens.shape[1] > total_seq_len:
- seq_len -= 1
- tokens = tokens[:, :-1]
-
- out = self.transformer(tokens)
- logits = self.to_logits(out)
-
- # mask logits to make sure text predicts text (except last token), and image predicts image
-
- logits_mask = self.logits_mask[:, :seq_len]
- max_neg_value = -torch.finfo(logits.dtype).max
- logits.masked_fill_(logits_mask, max_neg_value)
-
- if not return_loss:
- return logits
-
- assert exists(image), 'when training, image must be supplied'
-
- offsetted_image = image + self.num_text_tokens
- labels = torch.cat((text[:, 1:], offsetted_image), dim = 1)
-
- logits = rearrange(logits, 'b n c -> b c n')
-
- loss_text = F.cross_entropy(logits[:, :, :self.text_seq_len], labels[:, :self.text_seq_len])
- loss_img = F.cross_entropy(logits[:, :, self.text_seq_len:], labels[:, self.text_seq_len:])
-
- loss = (self.loss_txt_weight * loss_text + self.loss_img_weight * loss_img) / (self.loss_img_weight + self.loss_txt_weight)
- return loss, loss_text, loss_img
diff --git a/spaces/kdrkdrkdr/YuukaTTS/monotonic_align/core.py b/spaces/kdrkdrkdr/YuukaTTS/monotonic_align/core.py
deleted file mode 100644
index 1f940605fe4fd0738fa0006149fcba14ef88223a..0000000000000000000000000000000000000000
--- a/spaces/kdrkdrkdr/YuukaTTS/monotonic_align/core.py
+++ /dev/null
@@ -1,36 +0,0 @@
-import numba
-
-
-@numba.jit(numba.void(numba.int32[:, :, ::1], numba.float32[:, :, ::1], numba.int32[::1], numba.int32[::1]),
- nopython=True, nogil=True)
-def maximum_path_jit(paths, values, t_ys, t_xs):
- b = paths.shape[0]
- max_neg_val = -1e9
- for i in range(int(b)):
- path = paths[i]
- value = values[i]
- t_y = t_ys[i]
- t_x = t_xs[i]
-
- v_prev = v_cur = 0.0
- index = t_x - 1
-
- for y in range(t_y):
- for x in range(max(0, t_x + y - t_y), min(t_x, y + 1)):
- if x == y:
- v_cur = max_neg_val
- else:
- v_cur = value[y - 1, x]
- if x == 0:
- if y == 0:
- v_prev = 0.
- else:
- v_prev = max_neg_val
- else:
- v_prev = value[y - 1, x - 1]
- value[y, x] += max(v_prev, v_cur)
-
- for y in range(t_y - 1, -1, -1):
- path[y, index] = 1
- if index != 0 and (index == y or value[y - 1, index] < value[y - 1, index - 1]):
- index = index - 1
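`maximum_path_jit` fills each cell with the best monotonic-alignment score reachable from `(0, 0)` and then backtracks from the last column to mark the chosen path. A pure-Python mirror of the same recursion for a single example (`maximum_path_py` is a hypothetical name, without the numba batching):

```python
def maximum_path_py(value, t_y, t_x):
    """Pure-Python sketch of the monotonic-alignment DP for one example.
    value is modified in place (as in the numba version); returns the 0/1 path."""
    NEG = -1e9
    path = [[0] * t_x for _ in range(t_y)]
    # forward pass: accumulate best scores
    for y in range(t_y):
        for x in range(max(0, t_x + y - t_y), min(t_x, y + 1)):
            v_cur = NEG if x == y else value[y - 1][x]       # stay on column x
            if x == 0:
                v_prev = 0.0 if y == 0 else NEG              # advance from x-1
            else:
                v_prev = value[y - 1][x - 1]
            value[y][x] += max(v_prev, v_cur)
    # backward pass: trace the path from the last column
    index = t_x - 1
    for y in range(t_y - 1, -1, -1):
        path[y][index] = 1
        if index != 0 and (index == y or value[y - 1][index] < value[y - 1][index - 1]):
            index -= 1
    return path
```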
diff --git a/spaces/keithhon/Real-Time-Voice-Cloning/vocoder/vocoder_dataset.py b/spaces/keithhon/Real-Time-Voice-Cloning/vocoder/vocoder_dataset.py
deleted file mode 100644
index 9eae1b5f20117feef0a06e264a99b3c0c6143bac..0000000000000000000000000000000000000000
--- a/spaces/keithhon/Real-Time-Voice-Cloning/vocoder/vocoder_dataset.py
+++ /dev/null
@@ -1,84 +0,0 @@
-from torch.utils.data import Dataset
-from pathlib import Path
-from vocoder import audio
-import vocoder.hparams as hp
-import numpy as np
-import torch
-
-
-class VocoderDataset(Dataset):
- def __init__(self, metadata_fpath: Path, mel_dir: Path, wav_dir: Path):
- print("Using inputs from:\n\t%s\n\t%s\n\t%s" % (metadata_fpath, mel_dir, wav_dir))
-
- with metadata_fpath.open("r") as metadata_file:
- metadata = [line.split("|") for line in metadata_file]
-
- gta_fnames = [x[1] for x in metadata if int(x[4])]
- gta_fpaths = [mel_dir.joinpath(fname) for fname in gta_fnames]
- wav_fnames = [x[0] for x in metadata if int(x[4])]
- wav_fpaths = [wav_dir.joinpath(fname) for fname in wav_fnames]
- self.samples_fpaths = list(zip(gta_fpaths, wav_fpaths))
-
- print("Found %d samples" % len(self.samples_fpaths))
-
- def __getitem__(self, index):
- mel_path, wav_path = self.samples_fpaths[index]
-
- # Load the mel spectrogram and adjust its range to [-1, 1]
- mel = np.load(mel_path).T.astype(np.float32) / hp.mel_max_abs_value
-
- # Load the wav
- wav = np.load(wav_path)
- if hp.apply_preemphasis:
- wav = audio.pre_emphasis(wav)
- wav = np.clip(wav, -1, 1)
-
- # Fix for missing padding # TODO: settle on whether this is any useful
- r_pad = (len(wav) // hp.hop_length + 1) * hp.hop_length - len(wav)
- wav = np.pad(wav, (0, r_pad), mode='constant')
- assert len(wav) >= mel.shape[1] * hp.hop_length
- wav = wav[:mel.shape[1] * hp.hop_length]
- assert len(wav) % hp.hop_length == 0
-
- # Quantize the wav
- if hp.voc_mode == 'RAW':
- if hp.mu_law:
- quant = audio.encode_mu_law(wav, mu=2 ** hp.bits)
- else:
- quant = audio.float_2_label(wav, bits=hp.bits)
- elif hp.voc_mode == 'MOL':
- quant = audio.float_2_label(wav, bits=16)
-
- return mel.astype(np.float32), quant.astype(np.int64)
-
- def __len__(self):
- return len(self.samples_fpaths)
-
-
-def collate_vocoder(batch):
- mel_win = hp.voc_seq_len // hp.hop_length + 2 * hp.voc_pad
- max_offsets = [x[0].shape[-1] -2 - (mel_win + 2 * hp.voc_pad) for x in batch]
- mel_offsets = [np.random.randint(0, offset) for offset in max_offsets]
- sig_offsets = [(offset + hp.voc_pad) * hp.hop_length for offset in mel_offsets]
-
- mels = [x[0][:, mel_offsets[i]:mel_offsets[i] + mel_win] for i, x in enumerate(batch)]
-
- labels = [x[1][sig_offsets[i]:sig_offsets[i] + hp.voc_seq_len + 1] for i, x in enumerate(batch)]
-
- mels = np.stack(mels).astype(np.float32)
- labels = np.stack(labels).astype(np.int64)
-
- mels = torch.tensor(mels)
- labels = torch.tensor(labels).long()
-
- x = labels[:, :hp.voc_seq_len]
- y = labels[:, 1:]
-
- bits = 16 if hp.voc_mode == 'MOL' else hp.bits
-
- x = audio.label_2_float(x.float(), bits)
-
- if hp.voc_mode == 'MOL':
- y = audio.label_2_float(y.float(), bits)
-
- return x, y, mels
\ No newline at end of file
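The RAW path in `__getitem__` quantizes the waveform with `audio.encode_mu_law(wav, mu=2 ** hp.bits)`. That helper lives in the repo's own `vocoder/audio.py`, but mu-law companding itself is standard; below is a minimal self-contained sketch of an encode/decode pair (the function bodies and the 9-bit setting are illustrative, not necessarily the repo's exact implementation):

```python
import numpy as np

def encode_mu_law(x, mu):
    # Compand floats in [-1, 1] to integer labels in [0, mu - 1].
    mu = mu - 1
    fx = np.sign(x) * np.log(1 + mu * np.abs(x)) / np.log(1 + mu)
    return np.floor((fx + 1) / 2 * mu + 0.5).astype(np.int64)

def decode_mu_law(y, mu):
    # Invert the companding back to floats in [-1, 1].
    mu = mu - 1
    fy = 2 * y.astype(np.float64) / mu - 1
    return np.sign(fy) / mu * ((1 + mu) ** np.abs(fy) - 1)

wav = np.linspace(-1, 1, 5)
labels = encode_mu_law(wav, mu=2 ** 9)   # 9-bit labels, as when hp.bits = 9
recon = decode_mu_law(labels, mu=2 ** 9)
```

The round trip is lossy but close: mu-law spends more of the label range on quiet samples, which is why WaveRNN-style vocoders prefer it over linear quantization at low bit depths.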
diff --git a/spaces/kevinwang676/Bert-VITS2/README_zh.md b/spaces/kevinwang676/Bert-VITS2/README_zh.md
deleted file mode 100644
index 8b137891791fe96927ad78e64b0aad7bded08bdc..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/Bert-VITS2/README_zh.md
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/spaces/kevinwang676/ChatGLM2-SadTalker-VC/src/utils/videoio.py b/spaces/kevinwang676/ChatGLM2-SadTalker-VC/src/utils/videoio.py
deleted file mode 100644
index d16ee667713a16e3f9644fcc3cb3e023bc2c9102..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/ChatGLM2-SadTalker-VC/src/utils/videoio.py
+++ /dev/null
@@ -1,41 +0,0 @@
-import shutil
-import uuid
-
-import os
-
-import cv2
-
-def load_video_to_cv2(input_path):
- video_stream = cv2.VideoCapture(input_path)
- fps = video_stream.get(cv2.CAP_PROP_FPS)
- full_frames = []
- while True:
- still_reading, frame = video_stream.read()
- if not still_reading:
- video_stream.release()
- break
- full_frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
- return full_frames
-
-def save_video_with_watermark(video, audio, save_path, watermark=False):
- temp_file = str(uuid.uuid4())+'.mp4'
- cmd = r'ffmpeg -y -hide_banner -loglevel error -i "%s" -i "%s" -vcodec mpeg4 "%s"' % (video, audio, temp_file)
- os.system(cmd)
-
- if watermark is False:
- shutil.move(temp_file, save_path)
- else:
- # watermark
- try:
- # check whether we are running inside stable-diffusion-webui
- import webui
- from modules import paths
- watermark_path = paths.script_path+"/extensions/SadTalker/docs/sadtalker_logo.png"
- except ImportError:
- # get the root path of sadtalker.
- dir_path = os.path.dirname(os.path.realpath(__file__))
- watermark_path = dir_path+"/../../docs/sadtalker_logo.png"
-
- cmd = r'ffmpeg -y -hide_banner -loglevel error -i "%s" -i "%s" -filter_complex "[1]scale=100:-1[wm];[0][wm]overlay=(main_w-overlay_w)-10:10" "%s"' % (temp_file, watermark_path, save_path)
- os.system(cmd)
- os.remove(temp_file)
\ No newline at end of file
diff --git a/spaces/kevinwang676/ChatGLM2-SadTalker/src/face3d/models/arcface_torch/configs/base.py b/spaces/kevinwang676/ChatGLM2-SadTalker/src/face3d/models/arcface_torch/configs/base.py
deleted file mode 100644
index 78e4b36a9142b649ec39a8c59331bb2557f2ad57..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/ChatGLM2-SadTalker/src/face3d/models/arcface_torch/configs/base.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from easydict import EasyDict as edict
-
-# make training faster
-# our RAM is 256G
-# mount -t tmpfs -o size=140G tmpfs /train_tmp
-
-config = edict()
-config.loss = "arcface"
-config.network = "r50"
-config.resume = False
-config.output = "ms1mv3_arcface_r50"
-
-config.dataset = "ms1m-retinaface-t1"
-config.embedding_size = 512
-config.sample_rate = 1
-config.fp16 = False
-config.momentum = 0.9
-config.weight_decay = 5e-4
-config.batch_size = 128
-config.lr = 0.1 # batch size is 512
-
-if config.dataset == "emore":
- config.rec = "/train_tmp/faces_emore"
- config.num_classes = 85742
- config.num_image = 5822653
- config.num_epoch = 16
- config.warmup_epoch = -1
- config.decay_epoch = [8, 14, ]
- config.val_targets = ["lfw", ]
-
-elif config.dataset == "ms1m-retinaface-t1":
- config.rec = "/train_tmp/ms1m-retinaface-t1"
- config.num_classes = 93431
- config.num_image = 5179510
- config.num_epoch = 25
- config.warmup_epoch = -1
- config.decay_epoch = [11, 17, 22]
- config.val_targets = ["lfw", "cfp_fp", "agedb_30"]
-
-elif config.dataset == "glint360k":
- config.rec = "/train_tmp/glint360k"
- config.num_classes = 360232
- config.num_image = 17091657
- config.num_epoch = 20
- config.warmup_epoch = -1
- config.decay_epoch = [8, 12, 15, 18]
- config.val_targets = ["lfw", "cfp_fp", "agedb_30"]
-
-elif config.dataset == "webface":
- config.rec = "/train_tmp/faces_webface_112x112"
- config.num_classes = 10572
- config.num_image = "forget"
- config.num_epoch = 34
- config.warmup_epoch = -1
- config.decay_epoch = [20, 28, 32]
- config.val_targets = ["lfw", "cfp_fp", "agedb_30"]
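The config above relies on `easydict.EasyDict` purely for attribute-style access to a plain dict. For reference, a minimal stand-in (a sketch, not the easydict package itself) fits in two class attributes:

```python
class EDict(dict):
    # Minimal stand-in for easydict.EasyDict: keys double as attributes.
    # Missing attributes raise KeyError rather than AttributeError.
    __getattr__ = dict.__getitem__
    __setattr__ = dict.__setitem__

config = EDict()
config.loss = "arcface"      # attribute write...
config["network"] = "r50"    # ...and item write hit the same storage
```

Either spelling reads back through the other, which is exactly how the `config.dataset`-keyed branches above consume the object.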
diff --git a/spaces/kevinwang676/ChatGLM2-VC-SadTalker/src/face3d/models/arcface_torch/eval/verification.py b/spaces/kevinwang676/ChatGLM2-VC-SadTalker/src/face3d/models/arcface_torch/eval/verification.py
deleted file mode 100644
index 253343b83dbf9d1bd154d14ec068e098bf0968db..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/ChatGLM2-VC-SadTalker/src/face3d/models/arcface_torch/eval/verification.py
+++ /dev/null
@@ -1,407 +0,0 @@
-"""Helper for evaluation on the Labeled Faces in the Wild dataset
-"""
-
-# MIT License
-#
-# Copyright (c) 2016 David Sandberg
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in all
-# copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-# SOFTWARE.
-
-
-import datetime
-import os
-import pickle
-
-import mxnet as mx
-import numpy as np
-import sklearn
-import torch
-from mxnet import ndarray as nd
-from scipy import interpolate
-from sklearn.decomposition import PCA
-from sklearn.model_selection import KFold
-
-
-class LFold:
- def __init__(self, n_splits=2, shuffle=False):
- self.n_splits = n_splits
- if self.n_splits > 1:
- self.k_fold = KFold(n_splits=n_splits, shuffle=shuffle)
-
- def split(self, indices):
- if self.n_splits > 1:
- return self.k_fold.split(indices)
- else:
- return [(indices, indices)]
-
-
-def calculate_roc(thresholds,
- embeddings1,
- embeddings2,
- actual_issame,
- nrof_folds=10,
- pca=0):
- assert (embeddings1.shape[0] == embeddings2.shape[0])
- assert (embeddings1.shape[1] == embeddings2.shape[1])
- nrof_pairs = min(len(actual_issame), embeddings1.shape[0])
- nrof_thresholds = len(thresholds)
- k_fold = LFold(n_splits=nrof_folds, shuffle=False)
-
- tprs = np.zeros((nrof_folds, nrof_thresholds))
- fprs = np.zeros((nrof_folds, nrof_thresholds))
- accuracy = np.zeros((nrof_folds))
- indices = np.arange(nrof_pairs)
-
- if pca == 0:
- diff = np.subtract(embeddings1, embeddings2)
- dist = np.sum(np.square(diff), 1)
-
- for fold_idx, (train_set, test_set) in enumerate(k_fold.split(indices)):
- if pca > 0:
- print('doing pca on', fold_idx)
- embed1_train = embeddings1[train_set]
- embed2_train = embeddings2[train_set]
- _embed_train = np.concatenate((embed1_train, embed2_train), axis=0)
- pca_model = PCA(n_components=pca)
- pca_model.fit(_embed_train)
- embed1 = pca_model.transform(embeddings1)
- embed2 = pca_model.transform(embeddings2)
- embed1 = sklearn.preprocessing.normalize(embed1)
- embed2 = sklearn.preprocessing.normalize(embed2)
- diff = np.subtract(embed1, embed2)
- dist = np.sum(np.square(diff), 1)
-
- # Find the best threshold for the fold
- acc_train = np.zeros((nrof_thresholds))
- for threshold_idx, threshold in enumerate(thresholds):
- _, _, acc_train[threshold_idx] = calculate_accuracy(
- threshold, dist[train_set], actual_issame[train_set])
- best_threshold_index = np.argmax(acc_train)
- for threshold_idx, threshold in enumerate(thresholds):
- tprs[fold_idx, threshold_idx], fprs[fold_idx, threshold_idx], _ = calculate_accuracy(
- threshold, dist[test_set],
- actual_issame[test_set])
- _, _, accuracy[fold_idx] = calculate_accuracy(
- thresholds[best_threshold_index], dist[test_set],
- actual_issame[test_set])
-
- tpr = np.mean(tprs, 0)
- fpr = np.mean(fprs, 0)
- return tpr, fpr, accuracy
-
-
-def calculate_accuracy(threshold, dist, actual_issame):
- predict_issame = np.less(dist, threshold)
- tp = np.sum(np.logical_and(predict_issame, actual_issame))
- fp = np.sum(np.logical_and(predict_issame, np.logical_not(actual_issame)))
- tn = np.sum(
- np.logical_and(np.logical_not(predict_issame),
- np.logical_not(actual_issame)))
- fn = np.sum(np.logical_and(np.logical_not(predict_issame), actual_issame))
-
- tpr = 0 if (tp + fn == 0) else float(tp) / float(tp + fn)
- fpr = 0 if (fp + tn == 0) else float(fp) / float(fp + tn)
- acc = float(tp + tn) / dist.size
- return tpr, fpr, acc
-
-
-def calculate_val(thresholds,
- embeddings1,
- embeddings2,
- actual_issame,
- far_target,
- nrof_folds=10):
- assert (embeddings1.shape[0] == embeddings2.shape[0])
- assert (embeddings1.shape[1] == embeddings2.shape[1])
- nrof_pairs = min(len(actual_issame), embeddings1.shape[0])
- nrof_thresholds = len(thresholds)
- k_fold = LFold(n_splits=nrof_folds, shuffle=False)
-
- val = np.zeros(nrof_folds)
- far = np.zeros(nrof_folds)
-
- diff = np.subtract(embeddings1, embeddings2)
- dist = np.sum(np.square(diff), 1)
- indices = np.arange(nrof_pairs)
-
- for fold_idx, (train_set, test_set) in enumerate(k_fold.split(indices)):
-
- # Find the threshold that gives FAR = far_target
- far_train = np.zeros(nrof_thresholds)
- for threshold_idx, threshold in enumerate(thresholds):
- _, far_train[threshold_idx] = calculate_val_far(
- threshold, dist[train_set], actual_issame[train_set])
- if np.max(far_train) >= far_target:
- f = interpolate.interp1d(far_train, thresholds, kind='slinear')
- threshold = f(far_target)
- else:
- threshold = 0.0
-
- val[fold_idx], far[fold_idx] = calculate_val_far(
- threshold, dist[test_set], actual_issame[test_set])
-
- val_mean = np.mean(val)
- far_mean = np.mean(far)
- val_std = np.std(val)
- return val_mean, val_std, far_mean
-
-
-def calculate_val_far(threshold, dist, actual_issame):
- predict_issame = np.less(dist, threshold)
- true_accept = np.sum(np.logical_and(predict_issame, actual_issame))
- false_accept = np.sum(
- np.logical_and(predict_issame, np.logical_not(actual_issame)))
- n_same = np.sum(actual_issame)
- n_diff = np.sum(np.logical_not(actual_issame))
- # print(true_accept, false_accept)
- # print(n_same, n_diff)
- val = float(true_accept) / float(n_same)
- far = float(false_accept) / float(n_diff)
- return val, far
-
-
-def evaluate(embeddings, actual_issame, nrof_folds=10, pca=0):
- # Calculate evaluation metrics
- thresholds = np.arange(0, 4, 0.01)
- embeddings1 = embeddings[0::2]
- embeddings2 = embeddings[1::2]
- tpr, fpr, accuracy = calculate_roc(thresholds,
- embeddings1,
- embeddings2,
- np.asarray(actual_issame),
- nrof_folds=nrof_folds,
- pca=pca)
- thresholds = np.arange(0, 4, 0.001)
- val, val_std, far = calculate_val(thresholds,
- embeddings1,
- embeddings2,
- np.asarray(actual_issame),
- 1e-3,
- nrof_folds=nrof_folds)
- return tpr, fpr, accuracy, val, val_std, far
-
-@torch.no_grad()
-def load_bin(path, image_size):
- try:
- with open(path, 'rb') as f:
- bins, issame_list = pickle.load(f) # py2
- except UnicodeDecodeError as e:
- with open(path, 'rb') as f:
- bins, issame_list = pickle.load(f, encoding='bytes') # py3
- data_list = []
- for flip in [0, 1]:
- data = torch.empty((len(issame_list) * 2, 3, image_size[0], image_size[1]))
- data_list.append(data)
- for idx in range(len(issame_list) * 2):
- _bin = bins[idx]
- img = mx.image.imdecode(_bin)
- if img.shape[1] != image_size[0]:
- img = mx.image.resize_short(img, image_size[0])
- img = nd.transpose(img, axes=(2, 0, 1))
- for flip in [0, 1]:
- if flip == 1:
- img = mx.ndarray.flip(data=img, axis=2)
- data_list[flip][idx][:] = torch.from_numpy(img.asnumpy())
- if idx % 1000 == 0:
- print('loading bin', idx)
- print(data_list[0].shape)
- return data_list, issame_list
-
-@torch.no_grad()
-def test(data_set, backbone, batch_size, nfolds=10):
- print('testing verification..')
- data_list = data_set[0]
- issame_list = data_set[1]
- embeddings_list = []
- time_consumed = 0.0
- for i in range(len(data_list)):
- data = data_list[i]
- embeddings = None
- ba = 0
- while ba < data.shape[0]:
- bb = min(ba + batch_size, data.shape[0])
- count = bb - ba
- _data = data[bb - batch_size: bb]
- time0 = datetime.datetime.now()
- img = ((_data / 255) - 0.5) / 0.5
- net_out: torch.Tensor = backbone(img)
- _embeddings = net_out.detach().cpu().numpy()
- time_now = datetime.datetime.now()
- diff = time_now - time0
- time_consumed += diff.total_seconds()
- if embeddings is None:
- embeddings = np.zeros((data.shape[0], _embeddings.shape[1]))
- embeddings[ba:bb, :] = _embeddings[(batch_size - count):, :]
- ba = bb
- embeddings_list.append(embeddings)
-
- _xnorm = 0.0
- _xnorm_cnt = 0
- for embed in embeddings_list:
- for i in range(embed.shape[0]):
- _em = embed[i]
- _norm = np.linalg.norm(_em)
- _xnorm += _norm
- _xnorm_cnt += 1
- _xnorm /= _xnorm_cnt
-
- acc1 = 0.0
- std1 = 0.0
- embeddings = embeddings_list[0] + embeddings_list[1]
- embeddings = sklearn.preprocessing.normalize(embeddings)
- print(embeddings.shape)
- print('infer time', time_consumed)
- _, _, accuracy, val, val_std, far = evaluate(embeddings, issame_list, nrof_folds=nfolds)
- acc2, std2 = np.mean(accuracy), np.std(accuracy)
- return acc1, std1, acc2, std2, _xnorm, embeddings_list
-
-
-def dumpR(data_set,
- backbone,
- batch_size,
- name='',
- data_extra=None,
- label_shape=None):
- print('dump verification embedding..')
- data_list = data_set[0]
- issame_list = data_set[1]
- embeddings_list = []
- time_consumed = 0.0
- for i in range(len(data_list)):
- data = data_list[i]
- embeddings = None
- ba = 0
- while ba < data.shape[0]:
- bb = min(ba + batch_size, data.shape[0])
- count = bb - ba
-
- _data = nd.slice_axis(data, axis=0, begin=bb - batch_size, end=bb)
- time0 = datetime.datetime.now()
- # NOTE: `_label`, `_data_extra` and `model` are not defined in this
- # module; dumpR is unused legacy code carried over from the original
- # insightface evaluation script and would raise NameError if called.
- if data_extra is None:
- db = mx.io.DataBatch(data=(_data,), label=(_label,))
- else:
- db = mx.io.DataBatch(data=(_data, _data_extra),
- label=(_label,))
- model.forward(db, is_train=False)
- net_out = model.get_outputs()
- _embeddings = net_out[0].asnumpy()
- time_now = datetime.datetime.now()
- diff = time_now - time0
- time_consumed += diff.total_seconds()
- if embeddings is None:
- embeddings = np.zeros((data.shape[0], _embeddings.shape[1]))
- embeddings[ba:bb, :] = _embeddings[(batch_size - count):, :]
- ba = bb
- embeddings_list.append(embeddings)
- embeddings = embeddings_list[0] + embeddings_list[1]
- embeddings = sklearn.preprocessing.normalize(embeddings)
- actual_issame = np.asarray(issame_list)
- outname = os.path.join('temp.bin')
- with open(outname, 'wb') as f:
- pickle.dump((embeddings, issame_list),
- f,
- protocol=pickle.HIGHEST_PROTOCOL)
-
-
-# if __name__ == '__main__':
-#
-# parser = argparse.ArgumentParser(description='do verification')
-# # general
-# parser.add_argument('--data-dir', default='', help='')
-# parser.add_argument('--model',
-# default='../model/softmax,50',
-# help='path to load model.')
-# parser.add_argument('--target',
-# default='lfw,cfp_ff,cfp_fp,agedb_30',
-# help='test targets.')
-# parser.add_argument('--gpu', default=0, type=int, help='gpu id')
-# parser.add_argument('--batch-size', default=32, type=int, help='')
-# parser.add_argument('--max', default='', type=str, help='')
-# parser.add_argument('--mode', default=0, type=int, help='')
-# parser.add_argument('--nfolds', default=10, type=int, help='')
-# args = parser.parse_args()
-# image_size = [112, 112]
-# print('image_size', image_size)
-# ctx = mx.gpu(args.gpu)
-# nets = []
-# vec = args.model.split(',')
-# prefix = args.model.split(',')[0]
-# epochs = []
-# if len(vec) == 1:
-# pdir = os.path.dirname(prefix)
-# for fname in os.listdir(pdir):
-# if not fname.endswith('.params'):
-# continue
-# _file = os.path.join(pdir, fname)
-# if _file.startswith(prefix):
-# epoch = int(fname.split('.')[0].split('-')[1])
-# epochs.append(epoch)
-# epochs = sorted(epochs, reverse=True)
-# if len(args.max) > 0:
-# _max = [int(x) for x in args.max.split(',')]
-# assert len(_max) == 2
-# if len(epochs) > _max[1]:
-# epochs = epochs[_max[0]:_max[1]]
-#
-# else:
-# epochs = [int(x) for x in vec[1].split('|')]
-# print('model number', len(epochs))
-# time0 = datetime.datetime.now()
-# for epoch in epochs:
-# print('loading', prefix, epoch)
-# sym, arg_params, aux_params = mx.model.load_checkpoint(prefix, epoch)
-# # arg_params, aux_params = ch_dev(arg_params, aux_params, ctx)
-# all_layers = sym.get_internals()
-# sym = all_layers['fc1_output']
-# model = mx.mod.Module(symbol=sym, context=ctx, label_names=None)
-# # model.bind(data_shapes=[('data', (args.batch_size, 3, image_size[0], image_size[1]))], label_shapes=[('softmax_label', (args.batch_size,))])
-# model.bind(data_shapes=[('data', (args.batch_size, 3, image_size[0],
-# image_size[1]))])
-# model.set_params(arg_params, aux_params)
-# nets.append(model)
-# time_now = datetime.datetime.now()
-# diff = time_now - time0
-# print('model loading time', diff.total_seconds())
-#
-# ver_list = []
-# ver_name_list = []
-# for name in args.target.split(','):
-# path = os.path.join(args.data_dir, name + ".bin")
-# if os.path.exists(path):
-# print('loading.. ', name)
-# data_set = load_bin(path, image_size)
-# ver_list.append(data_set)
-# ver_name_list.append(name)
-#
-# if args.mode == 0:
-# for i in range(len(ver_list)):
-# results = []
-# for model in nets:
-# acc1, std1, acc2, std2, xnorm, embeddings_list = test(
-# ver_list[i], model, args.batch_size, args.nfolds)
-# print('[%s]XNorm: %f' % (ver_name_list[i], xnorm))
-# print('[%s]Accuracy: %1.5f+-%1.5f' % (ver_name_list[i], acc1, std1))
-# print('[%s]Accuracy-Flip: %1.5f+-%1.5f' % (ver_name_list[i], acc2, std2))
-# results.append(acc2)
-# print('Max of [%s] is %1.5f' % (ver_name_list[i], np.max(results)))
-# elif args.mode == 1:
-# raise ValueError
-# else:
-# model = nets[0]
-# dumpR(ver_list[0], model, args.batch_size, args.target)
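The core of `calculate_accuracy` above is just thresholding squared embedding distances and counting correct decisions; a self-contained sketch with toy numbers (the arrays are made up for illustration, not real evaluation data):

```python
import numpy as np

def accuracy_at_threshold(threshold, dist, actual_issame):
    # Pairs closer than the threshold are predicted "same identity",
    # mirroring calculate_accuracy's tp/tn bookkeeping.
    predict_issame = dist < threshold
    tp = np.sum(predict_issame & actual_issame)
    tn = np.sum(~predict_issame & ~actual_issame)
    return float(tp + tn) / dist.size

dist = np.array([0.2, 0.3, 1.1, 1.4])        # squared embedding distances
same = np.array([True, True, False, False])  # ground-truth pair labels
acc = accuracy_at_threshold(0.5, dist, same)
```

`calculate_roc` then simply sweeps this over a grid of thresholds per fold, picking the threshold with the best training-fold accuracy and reporting test-fold accuracy at that threshold.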
diff --git a/spaces/kevinwang676/FreeVC/speaker_encoder/params_data.py b/spaces/kevinwang676/FreeVC/speaker_encoder/params_data.py
deleted file mode 100644
index bdb1716ed45617f2b127a7fb8885afe6cc74fb71..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/FreeVC/speaker_encoder/params_data.py
+++ /dev/null
@@ -1,29 +0,0 @@
-
-## Mel-filterbank
-mel_window_length = 25 # In milliseconds
-mel_window_step = 10 # In milliseconds
-mel_n_channels = 40
-
-
-## Audio
-sampling_rate = 16000
-# Number of spectrogram frames in a partial utterance
-partials_n_frames = 160 # 1600 ms
-# Number of spectrogram frames at inference
-inference_n_frames = 80 # 800 ms
-
-
-## Voice Activation Detection
-# Window size of the VAD. Must be either 10, 20 or 30 milliseconds.
-# This sets the granularity of the VAD. Should not need to be changed.
-vad_window_length = 30 # In milliseconds
-# Number of frames to average together when performing the moving average smoothing.
-# The larger this value, the larger the VAD variations must be to not get smoothed out.
-vad_moving_average_width = 8
-# Maximum number of consecutive silent frames a segment can have.
-vad_max_silence_length = 6
-
-
-## Audio volume normalization
-audio_norm_target_dBFS = -30
-
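With a 10 ms window step, the frame counts above map directly onto durations: 160 frames of a partial utterance cover about 1600 ms and 80 inference frames about 800 ms, matching the inline comments. A one-liner making that relation explicit (the helper name is illustrative, not part of this module):

```python
mel_window_step = 10   # milliseconds, as set in params_data.py
partials_n_frames = 160
inference_n_frames = 80

def frames_to_ms(n_frames, step_ms=mel_window_step):
    # Each spectrogram frame advances the analysis window by one step.
    return n_frames * step_ms
```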
diff --git a/spaces/kevinwang676/VITS2-Mandarin/commons.py b/spaces/kevinwang676/VITS2-Mandarin/commons.py
deleted file mode 100644
index 9ad0444b61cbadaa388619986c2889c707d873ce..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/VITS2-Mandarin/commons.py
+++ /dev/null
@@ -1,161 +0,0 @@
-import math
-import numpy as np
-import torch
-from torch import nn
-from torch.nn import functional as F
-
-
-def init_weights(m, mean=0.0, std=0.01):
- classname = m.__class__.__name__
- if classname.find("Conv") != -1:
- m.weight.data.normal_(mean, std)
-
-
-def get_padding(kernel_size, dilation=1):
- return int((kernel_size*dilation - dilation)/2)
-
-
-def convert_pad_shape(pad_shape):
- l = pad_shape[::-1]
- pad_shape = [item for sublist in l for item in sublist]
- return pad_shape
-
-
-def intersperse(lst, item):
- result = [item] * (len(lst) * 2 + 1)
- result[1::2] = lst
- return result
-
-
-def kl_divergence(m_p, logs_p, m_q, logs_q):
- """KL(P||Q)"""
- kl = (logs_q - logs_p) - 0.5
- kl += 0.5 * (torch.exp(2. * logs_p) + ((m_p - m_q)**2)) * torch.exp(-2. * logs_q)
- return kl
-
-
-def rand_gumbel(shape):
- """Sample from the Gumbel distribution, protect from overflows."""
- uniform_samples = torch.rand(shape) * 0.99998 + 0.00001
- return -torch.log(-torch.log(uniform_samples))
-
-
-def rand_gumbel_like(x):
- g = rand_gumbel(x.size()).to(dtype=x.dtype, device=x.device)
- return g
-
-
-def slice_segments(x, ids_str, segment_size=4):
- ret = torch.zeros_like(x[:, :, :segment_size])
- for i in range(x.size(0)):
- idx_str = ids_str[i]
- idx_end = idx_str + segment_size
- ret[i] = x[i, :, idx_str:idx_end]
- return ret
-
-
-def rand_slice_segments(x, x_lengths=None, segment_size=4):
- b, d, t = x.size()
- if x_lengths is None:
- x_lengths = t
- ids_str_max = x_lengths - segment_size + 1
- ids_str = (torch.rand([b]).to(device=x.device) * ids_str_max).to(dtype=torch.long)
- ret = slice_segments(x, ids_str, segment_size)
- return ret, ids_str
-
-
-def get_timing_signal_1d(
- length, channels, min_timescale=1.0, max_timescale=1.0e4):
- position = torch.arange(length, dtype=torch.float)
- num_timescales = channels // 2
- log_timescale_increment = (
- math.log(float(max_timescale) / float(min_timescale)) /
- (num_timescales - 1))
- inv_timescales = min_timescale * torch.exp(
- torch.arange(num_timescales, dtype=torch.float) * -log_timescale_increment)
- scaled_time = position.unsqueeze(0) * inv_timescales.unsqueeze(1)
- signal = torch.cat([torch.sin(scaled_time), torch.cos(scaled_time)], 0)
- signal = F.pad(signal, [0, 0, 0, channels % 2])
- signal = signal.view(1, channels, length)
- return signal
-
-
-def add_timing_signal_1d(x, min_timescale=1.0, max_timescale=1.0e4):
- b, channels, length = x.size()
- signal = get_timing_signal_1d(length, channels, min_timescale, max_timescale)
- return x + signal.to(dtype=x.dtype, device=x.device)
-
-
-def cat_timing_signal_1d(x, min_timescale=1.0, max_timescale=1.0e4, axis=1):
- b, channels, length = x.size()
- signal = get_timing_signal_1d(length, channels, min_timescale, max_timescale)
- return torch.cat([x, signal.to(dtype=x.dtype, device=x.device)], axis)
-
-
-def subsequent_mask(length):
- mask = torch.tril(torch.ones(length, length)).unsqueeze(0).unsqueeze(0)
- return mask
-
-
-@torch.jit.script
-def fused_add_tanh_sigmoid_multiply(input_a, input_b, n_channels):
- n_channels_int = n_channels[0]
- in_act = input_a + input_b
- t_act = torch.tanh(in_act[:, :n_channels_int, :])
- s_act = torch.sigmoid(in_act[:, n_channels_int:, :])
- acts = t_act * s_act
- return acts
-
-
-def shift_1d(x):
- x = F.pad(x, convert_pad_shape([[0, 0], [0, 0], [1, 0]]))[:, :, :-1]
- return x
-
-
-def sequence_mask(length, max_length=None):
- if max_length is None:
- max_length = length.max()
- x = torch.arange(max_length, dtype=length.dtype, device=length.device)
- return x.unsqueeze(0) < length.unsqueeze(1)
-
-
-def generate_path(duration, mask):
- """
- duration: [b, 1, t_x]
- mask: [b, 1, t_y, t_x]
- """
- device = duration.device
-
- b, _, t_y, t_x = mask.shape
- cum_duration = torch.cumsum(duration, -1)
-
- cum_duration_flat = cum_duration.view(b * t_x)
- path = sequence_mask(cum_duration_flat, t_y).to(mask.dtype)
- path = path.view(b, t_x, t_y)
- path = path - F.pad(path, convert_pad_shape([[0, 0], [1, 0], [0, 0]]))[:, :-1]
- path = path.unsqueeze(1).transpose(2,3) * mask
- return path
-
-
-def clip_grad_value_(parameters, clip_value, norm_type=2):
- if isinstance(parameters, torch.Tensor):
- parameters = [parameters]
- parameters = list(filter(lambda p: p.grad is not None, parameters))
- norm_type = float(norm_type)
- if clip_value is not None:
- clip_value = float(clip_value)
-
- total_norm = 0
- for p in parameters:
- param_norm = p.grad.data.norm(norm_type)
- total_norm += param_norm.item() ** norm_type
- if clip_value is not None:
- p.grad.data.clamp_(min=-clip_value, max=clip_value)
- total_norm = total_norm ** (1. / norm_type)
- return total_norm
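`sequence_mask` above is the basic building block for batching variable-length sequences: position indices are compared against each sequence's length via broadcasting. A NumPy rendition of the same logic (torch-free, so it can be tried without the rest of the repo):

```python
import numpy as np

def sequence_mask(lengths, max_length=None):
    # NumPy analogue of commons.sequence_mask: True wherever the
    # position index is still inside that sequence's length.
    lengths = np.asarray(lengths)
    if max_length is None:
        max_length = lengths.max()
    x = np.arange(max_length)
    return x[None, :] < lengths[:, None]

mask = sequence_mask([2, 4])
# row 0 keeps positions 0-1, row 1 keeps positions 0-3
```

`generate_path` builds on exactly this: it masks cumulative durations per timestep and differences adjacent rows to get a hard monotonic alignment path.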
diff --git a/spaces/kevinwang676/VoiceChangers/src/utils/init_path.py b/spaces/kevinwang676/VoiceChangers/src/utils/init_path.py
deleted file mode 100644
index 5f38d11907bd0dc789992062ce7f02d8876c638f..0000000000000000000000000000000000000000
--- a/spaces/kevinwang676/VoiceChangers/src/utils/init_path.py
+++ /dev/null
@@ -1,47 +0,0 @@
-import os
-import glob
-
-def init_path(checkpoint_dir, config_dir, size=512, old_version=False, preprocess='crop'):
-
- if old_version:
- #### load all the checkpoint of `pth`
- sadtalker_paths = {
- 'wav2lip_checkpoint' : os.path.join(checkpoint_dir, 'wav2lip.pth'),
- 'audio2pose_checkpoint' : os.path.join(checkpoint_dir, 'auido2pose_00140-model.pth'),
- 'audio2exp_checkpoint' : os.path.join(checkpoint_dir, 'auido2exp_00300-model.pth'),
- 'free_view_checkpoint' : os.path.join(checkpoint_dir, 'facevid2vid_00189-model.pth.tar'),
- 'path_of_net_recon_model' : os.path.join(checkpoint_dir, 'epoch_20.pth')
- }
-
- use_safetensor = False
- elif len(glob.glob(os.path.join(checkpoint_dir, '*.safetensors'))):
- print('using safetensor as default')
- sadtalker_paths = {
- "checkpoint":os.path.join(checkpoint_dir, 'SadTalker_V0.0.2_'+str(size)+'.safetensors'),
- }
- use_safetensor = True
- else:
- print("WARNING: the new version of the model is distributed as a safetensors file; you may need to download it manually. Falling back to the old-format checkpoints this time!")
- use_safetensor = False
-
- sadtalker_paths = {
- 'wav2lip_checkpoint' : os.path.join(checkpoint_dir, 'wav2lip.pth'),
- 'audio2pose_checkpoint' : os.path.join(checkpoint_dir, 'auido2pose_00140-model.pth'),
- 'audio2exp_checkpoint' : os.path.join(checkpoint_dir, 'auido2exp_00300-model.pth'),
- 'free_view_checkpoint' : os.path.join(checkpoint_dir, 'facevid2vid_00189-model.pth.tar'),
- 'path_of_net_recon_model' : os.path.join(checkpoint_dir, 'epoch_20.pth')
- }
-
- sadtalker_paths['dir_of_BFM_fitting'] = os.path.join(config_dir) # , 'BFM_Fitting'
- sadtalker_paths['audio2pose_yaml_path'] = os.path.join(config_dir, 'auido2pose.yaml')
- sadtalker_paths['audio2exp_yaml_path'] = os.path.join(config_dir, 'auido2exp.yaml')
- sadtalker_paths['use_safetensor'] = use_safetensor
-
- if 'full' in preprocess:
- sadtalker_paths['mappingnet_checkpoint'] = os.path.join(checkpoint_dir, 'mapping_00109-model.pth.tar')
- sadtalker_paths['facerender_yaml'] = os.path.join(config_dir, 'facerender_still.yaml')
- else:
- sadtalker_paths['mappingnet_checkpoint'] = os.path.join(checkpoint_dir, 'mapping_00229-model.pth.tar')
- sadtalker_paths['facerender_yaml'] = os.path.join(config_dir, 'facerender.yaml')
-
- return sadtalker_paths
\ No newline at end of file
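The branching above boils down to: prefer a `.safetensors` checkpoint if any exists in the directory, otherwise fall back to the legacy set of `.pth` files. A condensed sketch of that decision (`pick_checkpoint` and the single fallback file it returns are illustrative simplifications, not the repo's exact API):

```python
import glob
import os

def pick_checkpoint(checkpoint_dir, size=512):
    # Mirrors init_path's detection step: any *.safetensors file in the
    # directory switches the pipeline to the single-file checkpoint.
    if glob.glob(os.path.join(checkpoint_dir, "*.safetensors")):
        name = "SadTalker_V0.0.2_%d.safetensors" % size
        return os.path.join(checkpoint_dir, name), True
    return os.path.join(checkpoint_dir, "epoch_20.pth"), False
```

Note the glob only checks that *some* safetensors file exists; like the original, the sketch then assumes the expected `SadTalker_V0.0.2_<size>` name is the one present.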
diff --git a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/fontTools/feaLib/ast.py b/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/fontTools/feaLib/ast.py
deleted file mode 100644
index bbe6e6e740c421acfe158a9784c8f47c5d56ded6..0000000000000000000000000000000000000000
--- a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/fontTools/feaLib/ast.py
+++ /dev/null
@@ -1,2131 +0,0 @@
-from fontTools.feaLib.error import FeatureLibError
-from fontTools.feaLib.location import FeatureLibLocation
-from fontTools.misc.encodingTools import getEncoding
-from fontTools.misc.textTools import byteord, tobytes
-from collections import OrderedDict
-import itertools
-
-SHIFT = " " * 4
-
-__all__ = [
- "Element",
- "FeatureFile",
- "Comment",
- "GlyphName",
- "GlyphClass",
- "GlyphClassName",
- "MarkClassName",
- "AnonymousBlock",
- "Block",
- "FeatureBlock",
- "NestedBlock",
- "LookupBlock",
- "GlyphClassDefinition",
- "GlyphClassDefStatement",
- "MarkClass",
- "MarkClassDefinition",
- "AlternateSubstStatement",
- "Anchor",
- "AnchorDefinition",
- "AttachStatement",
- "AxisValueLocationStatement",
- "BaseAxis",
- "CVParametersNameStatement",
- "ChainContextPosStatement",
- "ChainContextSubstStatement",
- "CharacterStatement",
- "ConditionsetStatement",
- "CursivePosStatement",
- "ElidedFallbackName",
- "ElidedFallbackNameID",
- "Expression",
- "FeatureNameStatement",
- "FeatureReferenceStatement",
- "FontRevisionStatement",
- "HheaField",
- "IgnorePosStatement",
- "IgnoreSubstStatement",
- "IncludeStatement",
- "LanguageStatement",
- "LanguageSystemStatement",
- "LigatureCaretByIndexStatement",
- "LigatureCaretByPosStatement",
- "LigatureSubstStatement",
- "LookupFlagStatement",
- "LookupReferenceStatement",
- "MarkBasePosStatement",
- "MarkLigPosStatement",
- "MarkMarkPosStatement",
- "MultipleSubstStatement",
- "NameRecord",
- "OS2Field",
- "PairPosStatement",
- "ReverseChainSingleSubstStatement",
- "ScriptStatement",
- "SinglePosStatement",
- "SingleSubstStatement",
- "SizeParameters",
- "Statement",
- "STATAxisValueStatement",
- "STATDesignAxisStatement",
- "STATNameStatement",
- "SubtableStatement",
- "TableBlock",
- "ValueRecord",
- "ValueRecordDefinition",
- "VheaField",
-]
-
-
-def deviceToString(device):
- if device is None:
- return "&lt;device NULL&gt;"
- else:
- return "&lt;device %s&gt;" % ", ".join("%d %d" % t for t in device)
-
-
-fea_keywords = set(
- [
- "anchor",
- "anchordef",
- "anon",
- "anonymous",
- "by",
- "contour",
- "cursive",
- "device",
- "enum",
- "enumerate",
- "excludedflt",
- "exclude_dflt",
- "feature",
- "from",
- "ignore",
- "ignorebaseglyphs",
- "ignoreligatures",
- "ignoremarks",
- "include",
- "includedflt",
- "include_dflt",
- "language",
- "languagesystem",
- "lookup",
- "lookupflag",
- "mark",
- "markattachmenttype",
- "markclass",
- "nameid",
- "null",
- "parameters",
- "pos",
- "position",
- "required",
- "righttoleft",
- "reversesub",
- "rsub",
- "script",
- "sub",
- "substitute",
- "subtable",
- "table",
- "usemarkfilteringset",
- "useextension",
- "valuerecorddef",
- "base",
- "gdef",
- "head",
- "hhea",
- "name",
- "vhea",
- "vmtx",
- ]
-)
-
-
-def asFea(g):
- if hasattr(g, "asFea"):
- return g.asFea()
- elif isinstance(g, tuple) and len(g) == 2:
- return asFea(g[0]) + " - " + asFea(g[1]) # a range
- elif g.lower() in fea_keywords:
- return "\\" + g
- else:
- return g
-
-
-class Element(object):
- """A base class representing "something" in a feature file."""
-
- def __init__(self, location=None):
- #: location of this element as a `FeatureLibLocation` object.
- if location and not isinstance(location, FeatureLibLocation):
- location = FeatureLibLocation(*location)
- self.location = location
-
- def build(self, builder):
- pass
-
- def asFea(self, indent=""):
- """Returns this element as a string of feature code. For block-type
- elements (such as :class:`FeatureBlock`), the `indent` string is
- added to the start of each line in the output."""
- raise NotImplementedError
-
- def __str__(self):
- return self.asFea()
-
-
-class Statement(Element):
- pass
-
-
-class Expression(Element):
- pass
-
-
-class Comment(Element):
- """A comment in a feature file."""
-
- def __init__(self, text, location=None):
- super(Comment, self).__init__(location)
- #: Text of the comment
- self.text = text
-
- def asFea(self, indent=""):
- return self.text
-
-
-class NullGlyph(Expression):
- """The NULL glyph, used in glyph deletion substitutions."""
-
- def __init__(self, location=None):
- Expression.__init__(self, location)
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return ()
-
- def asFea(self, indent=""):
- return "NULL"
-
-
-class GlyphName(Expression):
- """A single glyph name, such as ``cedilla``."""
-
- def __init__(self, glyph, location=None):
- Expression.__init__(self, location)
- #: The name itself as a string
- self.glyph = glyph
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return (self.glyph,)
-
- def asFea(self, indent=""):
- return asFea(self.glyph)
-
-
-class GlyphClass(Expression):
- """A glyph class, such as ``[acute cedilla grave]``."""
-
- def __init__(self, glyphs=None, location=None):
- Expression.__init__(self, location)
- #: The list of glyphs in this class, as :class:`GlyphName` objects.
- self.glyphs = glyphs if glyphs is not None else []
- self.original = []
- self.curr = 0
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return tuple(self.glyphs)
-
- def asFea(self, indent=""):
- if len(self.original):
- if self.curr < len(self.glyphs):
- self.original.extend(self.glyphs[self.curr :])
- self.curr = len(self.glyphs)
- return "[" + " ".join(map(asFea, self.original)) + "]"
- else:
- return "[" + " ".join(map(asFea, self.glyphs)) + "]"
-
- def extend(self, glyphs):
- """Add a list of :class:`GlyphName` objects to the class."""
- self.glyphs.extend(glyphs)
-
- def append(self, glyph):
- """Add a single :class:`GlyphName` object to the class."""
- self.glyphs.append(glyph)
-
- def add_range(self, start, end, glyphs):
- """Add a range (e.g. ``A-Z``) to the class. ``start`` and ``end``
- are either :class:`GlyphName` objects or strings representing the
- start and end glyphs in the class, and ``glyphs`` is the full list of
- :class:`GlyphName` objects in the range."""
- if self.curr < len(self.glyphs):
- self.original.extend(self.glyphs[self.curr :])
- self.original.append((start, end))
- self.glyphs.extend(glyphs)
- self.curr = len(self.glyphs)
-
- def add_cid_range(self, start, end, glyphs):
- """Add a range to the class by glyph ID. ``start`` and ``end`` are the
- initial and final IDs, and ``glyphs`` is the full list of
- :class:`GlyphName` objects in the range."""
- if self.curr < len(self.glyphs):
- self.original.extend(self.glyphs[self.curr :])
- self.original.append(("\\{}".format(start), "\\{}".format(end)))
- self.glyphs.extend(glyphs)
- self.curr = len(self.glyphs)
-
- def add_class(self, gc):
- """Add glyphs from the given :class:`GlyphClassName` object to the
- class."""
- if self.curr < len(self.glyphs):
- self.original.extend(self.glyphs[self.curr :])
- self.original.append(gc)
- self.glyphs.extend(gc.glyphSet())
- self.curr = len(self.glyphs)
-
-
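The `original`/`curr` bookkeeping in `GlyphClass` above exists so that serialisation can round-trip ranges instead of emitting every expanded glyph. A hedged sketch of that idea, with illustrative names rather than the fontTools API:

```python
# MiniGlyphClass tracks how glyphs were *written* (names or ranges)
# separately from the fully expanded glyph list.
class MiniGlyphClass:
    def __init__(self):
        self.glyphs = []    # fully expanded glyph names
        self.original = []  # tokens as authored: names or (start, end) ranges

    def append(self, glyph):
        self.glyphs.append(glyph)
        self.original.append(glyph)

    def add_range(self, start, end, glyphs):
        self.glyphs.extend(glyphs)       # keep the expansion for lookups
        self.original.append((start, end))  # but remember the range form

    def as_fea(self):
        toks = [t if isinstance(t, str) else "{} - {}".format(*t)
                for t in self.original]
        return "[" + " ".join(toks) + "]"

gc = MiniGlyphClass()
gc.add_range("a", "d", ["a", "b", "c", "d"])
gc.append("z")
print(gc.as_fea())  # prints "[a - d z]"
print(gc.glyphs)    # prints "['a', 'b', 'c', 'd', 'z']"
```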
-class GlyphClassName(Expression):
- """A glyph class name, such as ``@FRENCH_MARKS``. This must be instantiated
- with a :class:`GlyphClassDefinition` object."""
-
- def __init__(self, glyphclass, location=None):
- Expression.__init__(self, location)
- assert isinstance(glyphclass, GlyphClassDefinition)
- self.glyphclass = glyphclass
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return tuple(self.glyphclass.glyphSet())
-
- def asFea(self, indent=""):
- return "@" + self.glyphclass.name
-
-
-class MarkClassName(Expression):
- """A mark class name, such as ``@FRENCH_MARKS`` defined with ``markClass``.
- This must be instantiated with a :class:`MarkClass` object."""
-
- def __init__(self, markClass, location=None):
- Expression.__init__(self, location)
- assert isinstance(markClass, MarkClass)
- self.markClass = markClass
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return self.markClass.glyphSet()
-
- def asFea(self, indent=""):
- return "@" + self.markClass.name
-
-
-class AnonymousBlock(Statement):
- """An anonymous data block."""
-
- def __init__(self, tag, content, location=None):
- Statement.__init__(self, location)
- self.tag = tag #: string containing the block's "tag"
- self.content = content #: block data as string
-
- def asFea(self, indent=""):
- res = "anon {} {{\n".format(self.tag)
- res += self.content
- res += "}} {};\n\n".format(self.tag)
- return res
-
-
-class Block(Statement):
- """A block of statements: feature, lookup, etc."""
-
- def __init__(self, location=None):
- Statement.__init__(self, location)
- self.statements = [] #: Statements contained in the block
-
- def build(self, builder):
- """When handed a 'builder' object of comparable interface to
- :class:`fontTools.feaLib.builder`, walks the statements in this
- block, calling the builder callbacks."""
- for s in self.statements:
- s.build(builder)
-
- def asFea(self, indent=""):
- indent += SHIFT
- return (
- indent
- + ("\n" + indent).join([s.asFea(indent=indent) for s in self.statements])
- + "\n"
- )
-
-
-class FeatureFile(Block):
- """The top-level element of the syntax tree, containing the whole feature
- file in its ``statements`` attribute."""
-
- def __init__(self):
- Block.__init__(self, location=None)
- self.markClasses = {} # name --> ast.MarkClass
-
- def asFea(self, indent=""):
- return "\n".join(s.asFea(indent=indent) for s in self.statements)
-
-
-class FeatureBlock(Block):
- """A named feature block."""
-
- def __init__(self, name, use_extension=False, location=None):
- Block.__init__(self, location)
- self.name, self.use_extension = name, use_extension
-
- def build(self, builder):
- """Call the ``start_feature`` callback on the builder object, visit
- all the statements in this feature, and then call ``end_feature``."""
- # TODO(sascha): Handle use_extension.
- builder.start_feature(self.location, self.name)
- # language exclude_dflt statements modify builder.features_
- # limit them to this block with temporary builder.features_
- features = builder.features_
- builder.features_ = {}
- Block.build(self, builder)
- for key, value in builder.features_.items():
- features.setdefault(key, []).extend(value)
- builder.features_ = features
- builder.end_feature()
-
- def asFea(self, indent=""):
- res = indent + "feature %s " % self.name.strip()
- if self.use_extension:
- res += "useExtension "
- res += "{\n"
- res += Block.asFea(self, indent=indent)
- res += indent + "} %s;\n" % self.name.strip()
- return res
-
-
-class NestedBlock(Block):
- """A block inside another block, for example when found inside a
- ``cvParameters`` block."""
-
- def __init__(self, tag, block_name, location=None):
- Block.__init__(self, location)
- self.tag = tag
- self.block_name = block_name
-
- def build(self, builder):
- Block.build(self, builder)
- if self.block_name == "ParamUILabelNameID":
- builder.add_to_cv_num_named_params(self.tag)
-
- def asFea(self, indent=""):
- res = "{}{} {{\n".format(indent, self.block_name)
- res += Block.asFea(self, indent=indent)
- res += "{}}};\n".format(indent)
- return res
-
-
-class LookupBlock(Block):
- """A named lookup, containing ``statements``."""
-
- def __init__(self, name, use_extension=False, location=None):
- Block.__init__(self, location)
- self.name, self.use_extension = name, use_extension
-
- def build(self, builder):
- # TODO(sascha): Handle use_extension.
- builder.start_lookup_block(self.location, self.name)
- Block.build(self, builder)
- builder.end_lookup_block()
-
- def asFea(self, indent=""):
- res = "lookup {} ".format(self.name)
- if self.use_extension:
- res += "useExtension "
- res += "{\n"
- res += Block.asFea(self, indent=indent)
- res += "{}}} {};\n".format(indent, self.name)
- return res
-
-
-class TableBlock(Block):
- """A ``table ... { }`` block."""
-
- def __init__(self, name, location=None):
- Block.__init__(self, location)
- self.name = name
-
- def asFea(self, indent=""):
- res = "table {} {{\n".format(self.name.strip())
- res += super(TableBlock, self).asFea(indent=indent)
- res += "}} {};\n".format(self.name.strip())
- return res
-
-
-class GlyphClassDefinition(Statement):
- """Example: ``@UPPERCASE = [A-Z];``."""
-
- def __init__(self, name, glyphs, location=None):
- Statement.__init__(self, location)
- self.name = name #: class name as a string, without initial ``@``
- self.glyphs = glyphs #: a :class:`GlyphClass` object
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return tuple(self.glyphs.glyphSet())
-
- def asFea(self, indent=""):
- return "@" + self.name + " = " + self.glyphs.asFea() + ";"
-
-
-class GlyphClassDefStatement(Statement):
- """Example: ``GlyphClassDef @UPPERCASE, [B], [C], [D];``. The parameters
- must be either :class:`GlyphClass` or :class:`GlyphClassName` objects, or
- ``None``."""
-
- def __init__(
- self, baseGlyphs, markGlyphs, ligatureGlyphs, componentGlyphs, location=None
- ):
- Statement.__init__(self, location)
- self.baseGlyphs, self.markGlyphs = (baseGlyphs, markGlyphs)
- self.ligatureGlyphs = ligatureGlyphs
- self.componentGlyphs = componentGlyphs
-
- def build(self, builder):
- """Calls the builder's ``add_glyphClassDef`` callback."""
- base = self.baseGlyphs.glyphSet() if self.baseGlyphs else tuple()
- liga = self.ligatureGlyphs.glyphSet() if self.ligatureGlyphs else tuple()
- mark = self.markGlyphs.glyphSet() if self.markGlyphs else tuple()
- comp = self.componentGlyphs.glyphSet() if self.componentGlyphs else tuple()
- builder.add_glyphClassDef(self.location, base, liga, mark, comp)
-
- def asFea(self, indent=""):
- return "GlyphClassDef {}, {}, {}, {};".format(
- self.baseGlyphs.asFea() if self.baseGlyphs else "",
- self.ligatureGlyphs.asFea() if self.ligatureGlyphs else "",
- self.markGlyphs.asFea() if self.markGlyphs else "",
- self.componentGlyphs.asFea() if self.componentGlyphs else "",
- )
-
-
-class MarkClass(object):
- """One `or more` ``markClass`` statements for the same mark class.
-
- While glyph classes can be defined only once, the feature file format
- allows expanding mark classes with multiple definitions, each using
- different glyphs and anchors. The following are two ``MarkClassDefinitions``
- for the same ``MarkClass``::
-
- markClass [acute grave] &lt;anchor 350 800> @FRENCH_ACCENTS;
- markClass [cedilla] &lt;anchor 350 -200> @FRENCH_ACCENTS;
-
- The ``MarkClass`` object is therefore just a container for a list of
- :class:`MarkClassDefinition` statements.
- """
-
- def __init__(self, name):
- self.name = name
- self.definitions = []
- self.glyphs = OrderedDict() # glyph --> ast.MarkClassDefinitions
-
- def addDefinition(self, definition):
- """Add a :class:`MarkClassDefinition` statement to this mark class."""
- assert isinstance(definition, MarkClassDefinition)
- self.definitions.append(definition)
- for glyph in definition.glyphSet():
- if glyph in self.glyphs:
- otherLoc = self.glyphs[glyph].location
- if otherLoc is None:
- end = ""
- else:
- end = f" at {otherLoc}"
- raise FeatureLibError(
- "Glyph %s already defined%s" % (glyph, end), definition.location
- )
- self.glyphs[glyph] = definition
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return tuple(self.glyphs.keys())
-
- def asFea(self, indent=""):
- res = "\n".join(d.asFea() for d in self.definitions)
- return res
-
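The duplicate-glyph check in `MarkClass.addDefinition` above can be sketched on its own: each glyph may appear in at most one definition of a mark class, and a repeat is an error. `MiniMarkClass` is an illustrative name, not part of fontTools.

```python
from collections import OrderedDict

class MiniMarkClass:
    def __init__(self, name):
        self.name = name
        self.glyphs = OrderedDict()  # glyph name -> defining statement

    def add_definition(self, glyphs, statement):
        for glyph in glyphs:
            if glyph in self.glyphs:
                # mirrors the FeatureLibError raised by the real class
                raise ValueError("Glyph %s already defined" % glyph)
            self.glyphs[glyph] = statement

mc = MiniMarkClass("FRENCH_ACCENTS")
mc.add_definition(["acute", "grave"], "def1")
mc.add_definition(["cedilla"], "def2")
try:
    mc.add_definition(["acute"], "def3")  # duplicate glyph -> error
except ValueError as e:
    print(e)  # prints "Glyph acute already defined"
```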
-
-class MarkClassDefinition(Statement):
- """A single ``markClass`` statement. The ``markClass`` should be a
- :class:`MarkClass` object, the ``anchor`` an :class:`Anchor` object,
- and the ``glyphs`` parameter should be a `glyph-containing object`_ .
-
- Example:
-
- .. code:: python
-
- mc = MarkClass("FRENCH_ACCENTS")
- mc.addDefinition( MarkClassDefinition(mc, Anchor(350, 800),
- GlyphClass([ GlyphName("acute"), GlyphName("grave") ])
- ) )
- mc.addDefinition( MarkClassDefinition(mc, Anchor(350, -200),
- GlyphClass([ GlyphName("cedilla") ])
- ) )
-
- mc.asFea()
- # markClass [acute grave] &lt;anchor 350 800> @FRENCH_ACCENTS;
- # markClass [cedilla] &lt;anchor 350 -200> @FRENCH_ACCENTS;
-
- """
-
- def __init__(self, markClass, anchor, glyphs, location=None):
- Statement.__init__(self, location)
- assert isinstance(markClass, MarkClass)
- assert isinstance(anchor, Anchor) and isinstance(glyphs, Expression)
- self.markClass, self.anchor, self.glyphs = markClass, anchor, glyphs
-
- def glyphSet(self):
- """The glyphs in this class as a tuple of :class:`GlyphName` objects."""
- return self.glyphs.glyphSet()
-
- def asFea(self, indent=""):
- return "markClass {} {} @{};".format(
- self.glyphs.asFea(), self.anchor.asFea(), self.markClass.name
- )
-
-
-class AlternateSubstStatement(Statement):
- """A ``sub ... from ...`` statement.
-
- ``prefix``, ``glyph``, ``suffix`` and ``replacement`` should be lists of
- `glyph-containing objects`_. ``glyph`` should be a `one element list`."""
-
- def __init__(self, prefix, glyph, suffix, replacement, location=None):
- Statement.__init__(self, location)
- self.prefix, self.glyph, self.suffix = (prefix, glyph, suffix)
- self.replacement = replacement
-
- def build(self, builder):
- """Calls the builder's ``add_alternate_subst`` callback."""
- glyph = self.glyph.glyphSet()
- assert len(glyph) == 1, glyph
- glyph = list(glyph)[0]
- prefix = [p.glyphSet() for p in self.prefix]
- suffix = [s.glyphSet() for s in self.suffix]
- replacement = self.replacement.glyphSet()
- builder.add_alternate_subst(self.location, prefix, glyph, suffix, replacement)
-
- def asFea(self, indent=""):
- res = "sub "
- if len(self.prefix) or len(self.suffix):
- if len(self.prefix):
- res += " ".join(map(asFea, self.prefix)) + " "
- res += asFea(self.glyph) + "'" # even though we really only use 1
- if len(self.suffix):
- res += " " + " ".join(map(asFea, self.suffix))
- else:
- res += asFea(self.glyph)
- res += " from "
- res += asFea(self.replacement)
- res += ";"
- return res
-
-
-class Anchor(Expression):
- """An ``Anchor`` element, used inside a ``pos`` rule.
-
- If a ``name`` is given, this will be used in preference to the coordinates.
- Other values should be integer.
- """
-
- def __init__(
- self,
- x,
- y,
- name=None,
- contourpoint=None,
- xDeviceTable=None,
- yDeviceTable=None,
- location=None,
- ):
- Expression.__init__(self, location)
- self.name = name
- self.x, self.y, self.contourpoint = x, y, contourpoint
- self.xDeviceTable, self.yDeviceTable = xDeviceTable, yDeviceTable
-
- def asFea(self, indent=""):
- if self.name is not None:
- return "&lt;anchor {}>".format(self.name)
- res = "&lt;anchor {} {}".format(self.x, self.y)
- if self.contourpoint:
- res += " contourpoint {}".format(self.contourpoint)
- if self.xDeviceTable or self.yDeviceTable:
- res += " " + deviceToString(self.xDeviceTable)
- res += " " + deviceToString(self.yDeviceTable)
- res += ">"
- return res
-
-
-class AnchorDefinition(Statement):
- """A named anchor definition. (2.e.viii). ``name`` should be a string."""
-
- def __init__(self, name, x, y, contourpoint=None, location=None):
- Statement.__init__(self, location)
- self.name, self.x, self.y, self.contourpoint = name, x, y, contourpoint
-
- def asFea(self, indent=""):
- res = "anchorDef {} {}".format(self.x, self.y)
- if self.contourpoint:
- res += " contourpoint {}".format(self.contourpoint)
- res += " {};".format(self.name)
- return res
-
-
-class AttachStatement(Statement):
- """A ``GDEF`` table ``Attach`` statement."""
-
- def __init__(self, glyphs, contourPoints, location=None):
- Statement.__init__(self, location)
- self.glyphs = glyphs #: A `glyph-containing object`_
- self.contourPoints = contourPoints #: A list of integer contour points
-
- def build(self, builder):
- """Calls the builder's ``add_attach_points`` callback."""
- glyphs = self.glyphs.glyphSet()
- builder.add_attach_points(self.location, glyphs, self.contourPoints)
-
- def asFea(self, indent=""):
- return "Attach {} {};".format(
- self.glyphs.asFea(), " ".join(str(c) for c in self.contourPoints)
- )
-
-
-class ChainContextPosStatement(Statement):
- r"""A chained contextual positioning statement.
-
- ``prefix``, ``glyphs``, and ``suffix`` should be lists of
- `glyph-containing objects`_ .
-
- ``lookups`` should be a list of elements representing what lookups
- to apply at each glyph position. Each element should be a
- :class:`LookupBlock` to apply a single chaining lookup at the given
- position, a list of :class:`LookupBlock`\ s to apply multiple
- lookups, or ``None`` to apply no lookup. The length of the outer
- list should equal the length of ``glyphs``; the inner lists can be
- of variable length."""
-
- def __init__(self, prefix, glyphs, suffix, lookups, location=None):
- Statement.__init__(self, location)
- self.prefix, self.glyphs, self.suffix = prefix, glyphs, suffix
- self.lookups = list(lookups)
- for i, lookup in enumerate(lookups):
- if lookup:
- try:
- (_ for _ in lookup)
- except TypeError:
- self.lookups[i] = [lookup]
-
- def build(self, builder):
- """Calls the builder's ``add_chain_context_pos`` callback."""
- prefix = [p.glyphSet() for p in self.prefix]
- glyphs = [g.glyphSet() for g in self.glyphs]
- suffix = [s.glyphSet() for s in self.suffix]
- builder.add_chain_context_pos(
- self.location, prefix, glyphs, suffix, self.lookups
- )
-
- def asFea(self, indent=""):
- res = "pos "
- if (
- len(self.prefix)
- or len(self.suffix)
- or any([x is not None for x in self.lookups])
- ):
- if len(self.prefix):
- res += " ".join(g.asFea() for g in self.prefix) + " "
- for i, g in enumerate(self.glyphs):
- res += g.asFea() + "'"
- if self.lookups[i]:
- for lu in self.lookups[i]:
- res += " lookup " + lu.name
- if i < len(self.glyphs) - 1:
- res += " "
- if len(self.suffix):
- res += " " + " ".join(map(asFea, self.suffix))
- else:
- res += " ".join(map(asFea, self.glyph))
- res += ";"
- return res
-
-
-class ChainContextSubstStatement(Statement):
- r"""A chained contextual substitution statement.
-
- ``prefix``, ``glyphs``, and ``suffix`` should be lists of
- `glyph-containing objects`_ .
-
- ``lookups`` should be a list of elements representing what lookups
- to apply at each glyph position. Each element should be a
- :class:`LookupBlock` to apply a single chaining lookup at the given
- position, a list of :class:`LookupBlock`\ s to apply multiple
- lookups, or ``None`` to apply no lookup. The length of the outer
- list should equal the length of ``glyphs``; the inner lists can be
- of variable length."""
-
- def __init__(self, prefix, glyphs, suffix, lookups, location=None):
- Statement.__init__(self, location)
- self.prefix, self.glyphs, self.suffix = prefix, glyphs, suffix
- self.lookups = list(lookups)
- for i, lookup in enumerate(lookups):
- if lookup:
- try:
- (_ for _ in lookup)
- except TypeError:
- self.lookups[i] = [lookup]
-
- def build(self, builder):
- """Calls the builder's ``add_chain_context_subst`` callback."""
- prefix = [p.glyphSet() for p in self.prefix]
- glyphs = [g.glyphSet() for g in self.glyphs]
- suffix = [s.glyphSet() for s in self.suffix]
- builder.add_chain_context_subst(
- self.location, prefix, glyphs, suffix, self.lookups
- )
-
- def asFea(self, indent=""):
- res = "sub "
- if (
- len(self.prefix)
- or len(self.suffix)
- or any([x is not None for x in self.lookups])
- ):
- if len(self.prefix):
- res += " ".join(g.asFea() for g in self.prefix) + " "
- for i, g in enumerate(self.glyphs):
- res += g.asFea() + "'"
- if self.lookups[i]:
- for lu in self.lookups[i]:
- res += " lookup " + lu.name
- if i < len(self.glyphs) - 1:
- res += " "
- if len(self.suffix):
- res += " " + " ".join(map(asFea, self.suffix))
- else:
- res += " ".join(map(asFea, self.glyph))
- res += ";"
- return res
-
-
-class CursivePosStatement(Statement):
- """A cursive positioning statement. Entry and exit anchors can either
- be :class:`Anchor` objects or ``None``."""
-
- def __init__(self, glyphclass, entryAnchor, exitAnchor, location=None):
- Statement.__init__(self, location)
- self.glyphclass = glyphclass
- self.entryAnchor, self.exitAnchor = entryAnchor, exitAnchor
-
- def build(self, builder):
- """Calls the builder object's ``add_cursive_pos`` callback."""
- builder.add_cursive_pos(
- self.location, self.glyphclass.glyphSet(), self.entryAnchor, self.exitAnchor
- )
-
- def asFea(self, indent=""):
- entry = self.entryAnchor.asFea() if self.entryAnchor else ""
- exit = self.exitAnchor.asFea() if self.exitAnchor else ""
- return "pos cursive {} {} {};".format(self.glyphclass.asFea(), entry, exit)
-
-
-class FeatureReferenceStatement(Statement):
- """Example: ``feature salt;``"""
-
- def __init__(self, featureName, location=None):
- Statement.__init__(self, location)
- self.location, self.featureName = (location, featureName)
-
- def build(self, builder):
- """Calls the builder object's ``add_feature_reference`` callback."""
- builder.add_feature_reference(self.location, self.featureName)
-
- def asFea(self, indent=""):
- return "feature {};".format(self.featureName)
-
-
-class IgnorePosStatement(Statement):
- """An ``ignore pos`` statement, containing `one or more` contexts to ignore.
-
- ``chainContexts`` should be a list of ``(prefix, glyphs, suffix)`` tuples,
- with each of ``prefix``, ``glyphs`` and ``suffix`` being
- `glyph-containing objects`_ ."""
-
- def __init__(self, chainContexts, location=None):
- Statement.__init__(self, location)
- self.chainContexts = chainContexts
-
- def build(self, builder):
- """Calls the builder object's ``add_chain_context_pos`` callback on each
- rule context."""
- for prefix, glyphs, suffix in self.chainContexts:
- prefix = [p.glyphSet() for p in prefix]
- glyphs = [g.glyphSet() for g in glyphs]
- suffix = [s.glyphSet() for s in suffix]
- builder.add_chain_context_pos(self.location, prefix, glyphs, suffix, [])
-
- def asFea(self, indent=""):
- contexts = []
- for prefix, glyphs, suffix in self.chainContexts:
- res = ""
- if len(prefix) or len(suffix):
- if len(prefix):
- res += " ".join(map(asFea, prefix)) + " "
- res += " ".join(g.asFea() + "'" for g in glyphs)
- if len(suffix):
- res += " " + " ".join(map(asFea, suffix))
- else:
- res += " ".join(map(asFea, glyphs))
- contexts.append(res)
- return "ignore pos " + ", ".join(contexts) + ";"
-
-
-class IgnoreSubstStatement(Statement):
- """An ``ignore sub`` statement, containing `one or more` contexts to ignore.
-
- ``chainContexts`` should be a list of ``(prefix, glyphs, suffix)`` tuples,
- with each of ``prefix``, ``glyphs`` and ``suffix`` being
- `glyph-containing objects`_ ."""
-
- def __init__(self, chainContexts, location=None):
- Statement.__init__(self, location)
- self.chainContexts = chainContexts
-
- def build(self, builder):
- """Calls the builder object's ``add_chain_context_subst`` callback on
- each rule context."""
- for prefix, glyphs, suffix in self.chainContexts:
- prefix = [p.glyphSet() for p in prefix]
- glyphs = [g.glyphSet() for g in glyphs]
- suffix = [s.glyphSet() for s in suffix]
- builder.add_chain_context_subst(self.location, prefix, glyphs, suffix, [])
-
- def asFea(self, indent=""):
- contexts = []
- for prefix, glyphs, suffix in self.chainContexts:
- res = ""
- if len(prefix):
- res += " ".join(map(asFea, prefix)) + " "
- res += " ".join(g.asFea() + "'" for g in glyphs)
- if len(suffix):
- res += " " + " ".join(map(asFea, suffix))
- contexts.append(res)
- return "ignore sub " + ", ".join(contexts) + ";"
-
-
-class IncludeStatement(Statement):
- """An ``include()`` statement."""
-
- def __init__(self, filename, location=None):
- super(IncludeStatement, self).__init__(location)
- self.filename = filename #: String containing name of file to include
-
- def build(self):
- # TODO: consider lazy-loading the including parser/lexer?
- raise FeatureLibError(
- "Building an include statement is not implemented yet. "
- "Instead, use Parser(..., followIncludes=True) for building.",
- self.location,
- )
-
- def asFea(self, indent=""):
- return indent + "include(%s);" % self.filename
-
-
-class LanguageStatement(Statement):
- """A ``language`` statement within a feature."""
-
- def __init__(self, language, include_default=True, required=False, location=None):
- Statement.__init__(self, location)
- assert len(language) == 4
- self.language = language #: A four-character language tag
- self.include_default = include_default #: If false, "exclude_dflt"
- self.required = required
-
- def build(self, builder):
- """Call the builder object's ``set_language`` callback."""
- builder.set_language(
- location=self.location,
- language=self.language,
- include_default=self.include_default,
- required=self.required,
- )
-
- def asFea(self, indent=""):
- res = "language {}".format(self.language.strip())
- if not self.include_default:
- res += " exclude_dflt"
- if self.required:
- res += " required"
- res += ";"
- return res
-
-
-class LanguageSystemStatement(Statement):
- """A top-level ``languagesystem`` statement."""
-
- def __init__(self, script, language, location=None):
- Statement.__init__(self, location)
- self.script, self.language = (script, language)
-
- def build(self, builder):
- """Calls the builder object's ``add_language_system`` callback."""
- builder.add_language_system(self.location, self.script, self.language)
-
- def asFea(self, indent=""):
- return "languagesystem {} {};".format(self.script, self.language.strip())
-
-
-class FontRevisionStatement(Statement):
- """A ``head`` table ``FontRevision`` statement. ``revision`` should be a
- number, and will be formatted to three significant decimal places."""
-
- def __init__(self, revision, location=None):
- Statement.__init__(self, location)
- self.revision = revision
-
- def build(self, builder):
- builder.set_font_revision(self.location, self.revision)
-
- def asFea(self, indent=""):
- return "FontRevision {:.3f};".format(self.revision)
-
-
-class LigatureCaretByIndexStatement(Statement):
- """A ``GDEF`` table ``LigatureCaretByIndex`` statement. ``glyphs`` should be
- a `glyph-containing object`_, and ``carets`` should be a list of integers."""
-
- def __init__(self, glyphs, carets, location=None):
- Statement.__init__(self, location)
- self.glyphs, self.carets = (glyphs, carets)
-
- def build(self, builder):
- """Calls the builder object's ``add_ligatureCaretByIndex_`` callback."""
- glyphs = self.glyphs.glyphSet()
- builder.add_ligatureCaretByIndex_(self.location, glyphs, set(self.carets))
-
- def asFea(self, indent=""):
- return "LigatureCaretByIndex {} {};".format(
- self.glyphs.asFea(), " ".join(str(x) for x in self.carets)
- )
-
-
-class LigatureCaretByPosStatement(Statement):
- """A ``GDEF`` table ``LigatureCaretByPos`` statement. ``glyphs`` should be
- a `glyph-containing object`_, and ``carets`` should be a list of integers."""
-
- def __init__(self, glyphs, carets, location=None):
- Statement.__init__(self, location)
- self.glyphs, self.carets = (glyphs, carets)
-
- def build(self, builder):
- """Calls the builder object's ``add_ligatureCaretByPos_`` callback."""
- glyphs = self.glyphs.glyphSet()
- builder.add_ligatureCaretByPos_(self.location, glyphs, set(self.carets))
-
- def asFea(self, indent=""):
- return "LigatureCaretByPos {} {};".format(
- self.glyphs.asFea(), " ".join(str(x) for x in self.carets)
- )
-
-
-class LigatureSubstStatement(Statement):
- """A chained contextual substitution statement.
-
- ``prefix``, ``glyphs``, and ``suffix`` should be lists of
- `glyph-containing objects`_; ``replacement`` should be a single
- `glyph-containing object`_.
-
- If ``forceChain`` is True, this is expressed as a chaining rule
- (e.g. ``sub f' i' by f_i``) even when no context is given."""
-
- def __init__(self, prefix, glyphs, suffix, replacement, forceChain, location=None):
- Statement.__init__(self, location)
- self.prefix, self.glyphs, self.suffix = (prefix, glyphs, suffix)
- self.replacement, self.forceChain = replacement, forceChain
-
- def build(self, builder):
- prefix = [p.glyphSet() for p in self.prefix]
- glyphs = [g.glyphSet() for g in self.glyphs]
- suffix = [s.glyphSet() for s in self.suffix]
- builder.add_ligature_subst(
- self.location, prefix, glyphs, suffix, self.replacement, self.forceChain
- )
-
- def asFea(self, indent=""):
- res = "sub "
- if len(self.prefix) or len(self.suffix) or self.forceChain:
- if len(self.prefix):
- res += " ".join(g.asFea() for g in self.prefix) + " "
- res += " ".join(g.asFea() + "'" for g in self.glyphs)
- if len(self.suffix):
- res += " " + " ".join(g.asFea() for g in self.suffix)
- else:
- res += " ".join(g.asFea() for g in self.glyphs)
- res += " by "
- res += asFea(self.replacement)
- res += ";"
- return res
-
-
-class LookupFlagStatement(Statement):
- """A ``lookupflag`` statement. The ``value`` should be an integer value
- representing the flags in use, but not including the ``markAttachment``
- class and ``markFilteringSet`` values, which must be specified as
- glyph-containing objects."""
-
- def __init__(
- self, value=0, markAttachment=None, markFilteringSet=None, location=None
- ):
- Statement.__init__(self, location)
- self.value = value
- self.markAttachment = markAttachment
- self.markFilteringSet = markFilteringSet
-
- def build(self, builder):
- """Calls the builder object's ``set_lookup_flag`` callback."""
- markAttach = None
- if self.markAttachment is not None:
- markAttach = self.markAttachment.glyphSet()
- markFilter = None
- if self.markFilteringSet is not None:
- markFilter = self.markFilteringSet.glyphSet()
- builder.set_lookup_flag(self.location, self.value, markAttach, markFilter)
-
- def asFea(self, indent=""):
- res = []
- flags = ["RightToLeft", "IgnoreBaseGlyphs", "IgnoreLigatures", "IgnoreMarks"]
- curr = 1
- for i in range(len(flags)):
- if self.value & curr != 0:
- res.append(flags[i])
- curr = curr << 1
- if self.markAttachment is not None:
- res.append("MarkAttachmentType {}".format(self.markAttachment.asFea()))
- if self.markFilteringSet is not None:
- res.append("UseMarkFilteringSet {}".format(self.markFilteringSet.asFea()))
- if not res:
- res = ["0"]
- return "lookupflag {};".format(" ".join(res))
-
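The bit decoding in `LookupFlagStatement.asFea` above maps the low four bits of the lookupflag value, in order, to the four named flags. A self-contained sketch (helper name is illustrative):

```python
# Low four lookupflag bits, in bit order: 1, 2, 4, 8.
FLAG_NAMES = ["RightToLeft", "IgnoreBaseGlyphs", "IgnoreLigatures", "IgnoreMarks"]

def lookupflag_to_fea(value):
    names = [name for i, name in enumerate(FLAG_NAMES) if value & (1 << i)]
    # A value of 0 is written literally as "lookupflag 0;"
    return "lookupflag {};".format(" ".join(names) if names else "0")

print(lookupflag_to_fea(0))  # prints "lookupflag 0;"
print(lookupflag_to_fea(9))  # prints "lookupflag RightToLeft IgnoreMarks;"
```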
-
-class LookupReferenceStatement(Statement):
- """Represents a ``lookup ...;`` statement to include a lookup in a feature.
-
- The ``lookup`` should be a :class:`LookupBlock` object."""
-
- def __init__(self, lookup, location=None):
- Statement.__init__(self, location)
- self.location, self.lookup = (location, lookup)
-
- def build(self, builder):
- """Calls the builder object's ``add_lookup_call`` callback."""
- builder.add_lookup_call(self.lookup.name)
-
- def asFea(self, indent=""):
- return "lookup {};".format(self.lookup.name)
-
-
-class MarkBasePosStatement(Statement):
- """A mark-to-base positioning rule. The ``base`` should be a
- `glyph-containing object`_. The ``marks`` should be a list of
- (:class:`Anchor`, :class:`MarkClass`) tuples."""
-
- def __init__(self, base, marks, location=None):
- Statement.__init__(self, location)
- self.base, self.marks = base, marks
-
- def build(self, builder):
- """Calls the builder object's ``add_mark_base_pos`` callback."""
- builder.add_mark_base_pos(self.location, self.base.glyphSet(), self.marks)
-
- def asFea(self, indent=""):
- res = "pos base {}".format(self.base.asFea())
- for a, m in self.marks:
- res += "\n" + indent + SHIFT + "{} mark @{}".format(a.asFea(), m.name)
- res += ";"
- return res
-
-
-class MarkLigPosStatement(Statement):
- """A mark-to-ligature positioning rule. The ``ligatures`` must be a
- `glyph-containing object`_. The ``marks`` should be a list of lists: each
- element in the top-level list represents a component glyph, and is made
- up of a list of (:class:`Anchor`, :class:`MarkClass`) tuples representing
- mark attachment points for that position.
-
- Example::
-
- m1 = MarkClass("TOP_MARKS")
- m2 = MarkClass("BOTTOM_MARKS")
- # ... add definitions to mark classes...
-
- glyph = GlyphName("lam_meem_jeem")
- marks = [
- [ (Anchor(625,1800), m1) ], # Attachments on 1st component (lam)
- [ (Anchor(376,-378), m2) ], # Attachments on 2nd component (meem)
- [ ] # No attachments on the jeem
- ]
- mlp = MarkLigPosStatement(glyph, marks)
-
- mlp.asFea()
- # pos ligature lam_meem_jeem <anchor 625 1800> mark @TOP_MARKS
- #     ligComponent <anchor 376 -378> mark @BOTTOM_MARKS;
-
- """
-
- def __init__(self, ligatures, marks, location=None):
- Statement.__init__(self, location)
- self.ligatures, self.marks = ligatures, marks
-
- def build(self, builder):
- """Calls the builder object's ``add_mark_lig_pos`` callback."""
- builder.add_mark_lig_pos(self.location, self.ligatures.glyphSet(), self.marks)
-
- def asFea(self, indent=""):
- res = "pos ligature {}".format(self.ligatures.asFea())
- ligs = []
- for l in self.marks:
- temp = ""
- if l is None or not len(l):
- temp = "\n" + indent + SHIFT * 2 + "<anchor NULL>"
- else:
- for a, m in l:
- temp += (
- "\n"
- + indent
- + SHIFT * 2
- + "{} mark @{}".format(a.asFea(), m.name)
- )
- ligs.append(temp)
- res += ("\n" + indent + SHIFT + "ligComponent").join(ligs)
- res += ";"
- return res
-
-
-class MarkMarkPosStatement(Statement):
- """A mark-to-mark positioning rule. The ``baseMarks`` must be a
- `glyph-containing object`_. The ``marks`` should be a list of
- (:class:`Anchor`, :class:`MarkClass`) tuples."""
-
- def __init__(self, baseMarks, marks, location=None):
- Statement.__init__(self, location)
- self.baseMarks, self.marks = baseMarks, marks
-
- def build(self, builder):
- """Calls the builder object's ``add_mark_mark_pos`` callback."""
- builder.add_mark_mark_pos(self.location, self.baseMarks.glyphSet(), self.marks)
-
- def asFea(self, indent=""):
- res = "pos mark {}".format(self.baseMarks.asFea())
- for a, m in self.marks:
- res += "\n" + indent + SHIFT + "{} mark @{}".format(a.asFea(), m.name)
- res += ";"
- return res
-
-
-class MultipleSubstStatement(Statement):
- """A multiple substitution statement.
-
- Args:
- prefix: a list of `glyph-containing objects`_.
- glyph: a single glyph-containing object.
- suffix: a list of glyph-containing objects.
- replacement: a list of glyph-containing objects.
- forceChain: If true, the statement is expressed as a chaining rule
- (e.g. ``sub f' i' by f_i``) even when no context is given.
- """
-
- def __init__(
- self, prefix, glyph, suffix, replacement, forceChain=False, location=None
- ):
- Statement.__init__(self, location)
- self.prefix, self.glyph, self.suffix = prefix, glyph, suffix
- self.replacement = replacement
- self.forceChain = forceChain
-
- def build(self, builder):
- """Calls the builder object's ``add_multiple_subst`` callback."""
- prefix = [p.glyphSet() for p in self.prefix]
- suffix = [s.glyphSet() for s in self.suffix]
- if hasattr(self.glyph, "glyphSet"):
- originals = self.glyph.glyphSet()
- else:
- originals = [self.glyph]
- count = len(originals)
- replaces = []
- for r in self.replacement:
- if hasattr(r, "glyphSet"):
- replace = r.glyphSet()
- else:
- replace = [r]
- if len(replace) == 1 and len(replace) != count:
- replace = replace * count
- replaces.append(replace)
- replaces = list(zip(*replaces))
-
- for i, original in enumerate(originals):
- builder.add_multiple_subst(
- self.location,
- prefix,
- original,
- suffix,
- replaces and replaces[i] or (),
- self.forceChain,
- )
-
- def asFea(self, indent=""):
- res = "sub "
- if len(self.prefix) or len(self.suffix) or self.forceChain:
- if len(self.prefix):
- res += " ".join(map(asFea, self.prefix)) + " "
- res += asFea(self.glyph) + "'"
- if len(self.suffix):
- res += " " + " ".join(map(asFea, self.suffix))
- else:
- res += asFea(self.glyph)
- replacement = self.replacement or [NullGlyph()]
- res += " by "
- res += " ".join(map(asFea, replacement))
- res += ";"
- return res
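The `build` method above broadcasts any one-glyph replacement class to the length of the original class, then transposes the lists into one replacement sequence per original glyph. A minimal standalone sketch of that broadcast (function name is illustrative):

```python
def broadcast_replacements(originals, replacement_sets):
    """Expand each singleton replacement class to match the number of
    originals, then zip into one replacement tuple per original glyph,
    mirroring the replaces logic in MultipleSubstStatement.build."""
    count = len(originals)
    expanded = []
    for rep in replacement_sets:
        if len(rep) == 1 and len(rep) != count:
            rep = rep * count
        expanded.append(rep)
    return list(zip(*expanded))
```

For instance, decomposing the class `[f_i f_l]` into `f` plus `[i l]` pairs `f_i` with `("f", "i")` and `f_l` with `("f", "l")`.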
-
-
-class PairPosStatement(Statement):
- """A pair positioning statement.
-
- ``glyphs1`` and ``glyphs2`` should be `glyph-containing objects`_.
- ``valuerecord1`` should be a :class:`ValueRecord` object;
- ``valuerecord2`` should be either a :class:`ValueRecord` object or ``None``.
- If ``enumerated`` is true, then this is expressed as an
- enumerated pair.
- """
-
- def __init__(
- self,
- glyphs1,
- valuerecord1,
- glyphs2,
- valuerecord2,
- enumerated=False,
- location=None,
- ):
- Statement.__init__(self, location)
- self.enumerated = enumerated
- self.glyphs1, self.valuerecord1 = glyphs1, valuerecord1
- self.glyphs2, self.valuerecord2 = glyphs2, valuerecord2
-
- def build(self, builder):
- """Calls a callback on the builder object:
-
- * If the rule is enumerated, calls ``add_specific_pair_pos`` on each
- combination of first and second glyphs.
- * If the glyphs are both single :class:`GlyphName` objects, calls
- ``add_specific_pair_pos``.
- * Else, calls ``add_class_pair_pos``.
- """
- if self.enumerated:
- g = [self.glyphs1.glyphSet(), self.glyphs2.glyphSet()]
- seen_pair = False
- for glyph1, glyph2 in itertools.product(*g):
- seen_pair = True
- builder.add_specific_pair_pos(
- self.location, glyph1, self.valuerecord1, glyph2, self.valuerecord2
- )
- if not seen_pair:
- raise FeatureLibError(
- "Empty glyph class in positioning rule", self.location
- )
- return
-
- is_specific = isinstance(self.glyphs1, GlyphName) and isinstance(
- self.glyphs2, GlyphName
- )
- if is_specific:
- builder.add_specific_pair_pos(
- self.location,
- self.glyphs1.glyph,
- self.valuerecord1,
- self.glyphs2.glyph,
- self.valuerecord2,
- )
- else:
- builder.add_class_pair_pos(
- self.location,
- self.glyphs1.glyphSet(),
- self.valuerecord1,
- self.glyphs2.glyphSet(),
- self.valuerecord2,
- )
-
- def asFea(self, indent=""):
- res = "enum " if self.enumerated else ""
- if self.valuerecord2:
- res += "pos {} {} {} {};".format(
- self.glyphs1.asFea(),
- self.valuerecord1.asFea(),
- self.glyphs2.asFea(),
- self.valuerecord2.asFea(),
- )
- else:
- res += "pos {} {} {};".format(
- self.glyphs1.asFea(), self.glyphs2.asFea(), self.valuerecord1.asFea()
- )
- return res
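When the rule is enumerated, `build` above expands the cross-product of the two glyph classes into specific pairs, rejecting empty classes. A standalone sketch of that expansion (function name is illustrative):

```python
import itertools

def enumerate_pairs(class1, class2):
    """Expand an enumerated pair positioning rule into specific glyph
    pairs, as PairPosStatement.build does with itertools.product;
    an empty glyph class is an error."""
    pairs = list(itertools.product(class1, class2))
    if not pairs:
        raise ValueError("Empty glyph class in positioning rule")
    return pairs
```

Each resulting pair would then be passed to `add_specific_pair_pos` with the two value records.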
-
-
-class ReverseChainSingleSubstStatement(Statement):
- """A reverse chaining substitution statement. You don't see those every day.
-
- Note the unusual argument order: ``suffix`` comes `before` ``glyphs``.
- ``old_prefix``, ``old_suffix``, ``glyphs`` and ``replacements`` should be
- lists of `glyph-containing objects`_. ``glyphs`` and ``replacements`` should
- be one-item lists.
- """
-
- def __init__(self, old_prefix, old_suffix, glyphs, replacements, location=None):
- Statement.__init__(self, location)
- self.old_prefix, self.old_suffix = old_prefix, old_suffix
- self.glyphs = glyphs
- self.replacements = replacements
-
- def build(self, builder):
- prefix = [p.glyphSet() for p in self.old_prefix]
- suffix = [s.glyphSet() for s in self.old_suffix]
- originals = self.glyphs[0].glyphSet()
- replaces = self.replacements[0].glyphSet()
- if len(replaces) == 1:
- replaces = replaces * len(originals)
- builder.add_reverse_chain_single_subst(
- self.location, prefix, suffix, dict(zip(originals, replaces))
- )
-
- def asFea(self, indent=""):
- res = "rsub "
- if len(self.old_prefix) or len(self.old_suffix):
- if len(self.old_prefix):
- res += " ".join(asFea(g) for g in self.old_prefix) + " "
- res += " ".join(asFea(g) + "'" for g in self.glyphs)
- if len(self.old_suffix):
- res += " " + " ".join(asFea(g) for g in self.old_suffix)
- else:
- res += " ".join(map(asFea, self.glyphs))
- res += " by {};".format(" ".join(asFea(g) for g in self.replacements))
- return res
-
-
-class SingleSubstStatement(Statement):
- """A single substitution statement.
-
- Note the unusual argument order: ``prefix`` and ``suffix`` come `after`
- the replacement ``glyphs``. ``prefix``, ``suffix``, ``glyphs`` and
- ``replace`` should be lists of `glyph-containing objects`_. ``glyphs`` and
- ``replace`` should be one-item lists.
- """
-
- def __init__(self, glyphs, replace, prefix, suffix, forceChain, location=None):
- Statement.__init__(self, location)
- self.prefix, self.suffix = prefix, suffix
- self.forceChain = forceChain
- self.glyphs = glyphs
- self.replacements = replace
-
- def build(self, builder):
- """Calls the builder object's ``add_single_subst`` callback."""
- prefix = [p.glyphSet() for p in self.prefix]
- suffix = [s.glyphSet() for s in self.suffix]
- originals = self.glyphs[0].glyphSet()
- replaces = self.replacements[0].glyphSet()
- if len(replaces) == 1:
- replaces = replaces * len(originals)
- builder.add_single_subst(
- self.location,
- prefix,
- suffix,
- OrderedDict(zip(originals, replaces)),
- self.forceChain,
- )
-
- def asFea(self, indent=""):
- res = "sub "
- if len(self.prefix) or len(self.suffix) or self.forceChain:
- if len(self.prefix):
- res += " ".join(asFea(g) for g in self.prefix) + " "
- res += " ".join(asFea(g) + "'" for g in self.glyphs)
- if len(self.suffix):
- res += " " + " ".join(asFea(g) for g in self.suffix)
- else:
- res += " ".join(asFea(g) for g in self.glyphs)
- res += " by {};".format(" ".join(asFea(g) for g in self.replacements))
- return res
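`build` above pairs originals with replacements via `zip`, first repeating a lone replacement glyph for every original (the `sub [a b c] by x;` case). A standalone sketch (function name is illustrative):

```python
from collections import OrderedDict

def single_subst_mapping(originals, replacements):
    """Build the original-to-replacement mapping passed to
    add_single_subst: a single replacement glyph is broadcast
    across all originals before zipping."""
    if len(replacements) == 1:
        replacements = replacements * len(originals)
    return OrderedDict(zip(originals, replacements))
```

The same broadcast appears in `ReverseChainSingleSubstStatement.build` above, which builds a plain `dict` the same way.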
-
-
-class ScriptStatement(Statement):
- """A ``script`` statement."""
-
- def __init__(self, script, location=None):
- Statement.__init__(self, location)
- self.script = script #: the script code
-
- def build(self, builder):
- """Calls the builder's ``set_script`` callback."""
- builder.set_script(self.location, self.script)
-
- def asFea(self, indent=""):
- return "script {};".format(self.script.strip())
-
-
-class SinglePosStatement(Statement):
- """A single position statement. ``prefix`` and ``suffix`` should be
- lists of `glyph-containing objects`_.
-
- ``pos`` should be a one-element list containing a (`glyph-containing object`_,
- :class:`ValueRecord`) tuple."""
-
- def __init__(self, pos, prefix, suffix, forceChain, location=None):
- Statement.__init__(self, location)
- self.pos, self.prefix, self.suffix = pos, prefix, suffix
- self.forceChain = forceChain
-
- def build(self, builder):
- """Calls the builder object's ``add_single_pos`` callback."""
- prefix = [p.glyphSet() for p in self.prefix]
- suffix = [s.glyphSet() for s in self.suffix]
- pos = [(g.glyphSet(), value) for g, value in self.pos]
- builder.add_single_pos(self.location, prefix, suffix, pos, self.forceChain)
-
- def asFea(self, indent=""):
- res = "pos "
- if len(self.prefix) or len(self.suffix) or self.forceChain:
- if len(self.prefix):
- res += " ".join(map(asFea, self.prefix)) + " "
- res += " ".join(
- [
- asFea(x[0]) + "'" + ((" " + x[1].asFea()) if x[1] else "")
- for x in self.pos
- ]
- )
- if len(self.suffix):
- res += " " + " ".join(map(asFea, self.suffix))
- else:
- res += " ".join(
- [asFea(x[0]) + " " + (x[1].asFea() if x[1] else "") for x in self.pos]
- )
- res += ";"
- return res
-
-
-class SubtableStatement(Statement):
- """Represents a subtable break."""
-
- def __init__(self, location=None):
- Statement.__init__(self, location)
-
- def build(self, builder):
- """Calls the builder object's ``add_subtable_break`` callback."""
- builder.add_subtable_break(self.location)
-
- def asFea(self, indent=""):
- return "subtable;"
-
-
-class ValueRecord(Expression):
- """Represents a value record."""
-
- def __init__(
- self,
- xPlacement=None,
- yPlacement=None,
- xAdvance=None,
- yAdvance=None,
- xPlaDevice=None,
- yPlaDevice=None,
- xAdvDevice=None,
- yAdvDevice=None,
- vertical=False,
- location=None,
- ):
- Expression.__init__(self, location)
- self.xPlacement, self.yPlacement = (xPlacement, yPlacement)
- self.xAdvance, self.yAdvance = (xAdvance, yAdvance)
- self.xPlaDevice, self.yPlaDevice = (xPlaDevice, yPlaDevice)
- self.xAdvDevice, self.yAdvDevice = (xAdvDevice, yAdvDevice)
- self.vertical = vertical
-
- def __eq__(self, other):
- return (
- self.xPlacement == other.xPlacement
- and self.yPlacement == other.yPlacement
- and self.xAdvance == other.xAdvance
- and self.yAdvance == other.yAdvance
- and self.xPlaDevice == other.xPlaDevice
- and self.xAdvDevice == other.xAdvDevice
- )
-
- def __ne__(self, other):
- return not self.__eq__(other)
-
- def __hash__(self):
- return (
- hash(self.xPlacement)
- ^ hash(self.yPlacement)
- ^ hash(self.xAdvance)
- ^ hash(self.yAdvance)
- ^ hash(self.xPlaDevice)
- ^ hash(self.yPlaDevice)
- ^ hash(self.xAdvDevice)
- ^ hash(self.yAdvDevice)
- )
-
- def asFea(self, indent=""):
- if not self:
- return ""
-
- x, y = self.xPlacement, self.yPlacement
- xAdvance, yAdvance = self.xAdvance, self.yAdvance
- xPlaDevice, yPlaDevice = self.xPlaDevice, self.yPlaDevice
- xAdvDevice, yAdvDevice = self.xAdvDevice, self.yAdvDevice
- vertical = self.vertical
-
- # Try format A, if possible.
- if x is None and y is None:
- if xAdvance is None and vertical:
- return str(yAdvance)
- elif yAdvance is None and not vertical:
- return str(xAdvance)
-
- # Make any remaining None value 0 to avoid generating invalid records.
- x = x or 0
- y = y or 0
- xAdvance = xAdvance or 0
- yAdvance = yAdvance or 0
-
- # Try format B, if possible.
- if (
- xPlaDevice is None
- and yPlaDevice is None
- and xAdvDevice is None
- and yAdvDevice is None
- ):
- return "<%s %s %s %s>" % (x, y, xAdvance, yAdvance)
-
- # Last resort is format C.
- return "<%s %s %s %s %s %s %s %s>" % (
- x,
- y,
- xAdvance,
- yAdvance,
- deviceToString(xPlaDevice),
- deviceToString(yPlaDevice),
- deviceToString(xAdvDevice),
- deviceToString(yAdvDevice),
- )
-
- def __bool__(self):
- return any(
- getattr(self, v) is not None
- for v in [
- "xPlacement",
- "yPlacement",
- "xAdvance",
- "yAdvance",
- "xPlaDevice",
- "yPlaDevice",
- "xAdvDevice",
- "yAdvDevice",
- ]
- )
-
- __nonzero__ = __bool__
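`asFea` above tries the three value record notations in order: format A (a bare advance when nothing else is set), format B (the four-number `<x y xAdv yAdv>` form), and format C (with device tables). A simplified sketch of that selection, omitting device tables (function name is illustrative):

```python
def value_record_fea(xPlacement=None, yPlacement=None,
                     xAdvance=None, yAdvance=None, vertical=False):
    """Pick format A when only the relevant advance is set, otherwise
    fall back to format B, zero-filling the remaining Nones to avoid
    emitting an invalid record."""
    if xPlacement is None and yPlacement is None:
        if xAdvance is None and vertical:
            return str(yAdvance)
        if yAdvance is None and not vertical:
            return str(xAdvance)
    x = xPlacement or 0
    y = yPlacement or 0
    xAdv = xAdvance or 0
    yAdv = yAdvance or 0
    return "<%s %s %s %s>" % (x, y, xAdv, yAdv)
```

So `pos a -120;`-style records serialize as a lone number, while mixed placement/advance records always get the bracketed form.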
-
-
-class ValueRecordDefinition(Statement):
- """Represents a named value record definition."""
-
- def __init__(self, name, value, location=None):
- Statement.__init__(self, location)
- self.name = name #: Value record name as string
- self.value = value #: :class:`ValueRecord` object
-
- def asFea(self, indent=""):
- return "valueRecordDef {} {};".format(self.value.asFea(), self.name)
-
-
-def simplify_name_attributes(pid, eid, lid):
- if pid == 3 and eid == 1 and lid == 1033:
- return ""
- elif pid == 1 and eid == 0 and lid == 0:
- return "1"
- else:
- return "{} {} {}".format(pid, eid, lid)
-
-
-class NameRecord(Statement):
- """Represents a name record. (Section 9.e.)"""
-
- def __init__(self, nameID, platformID, platEncID, langID, string, location=None):
- Statement.__init__(self, location)
- self.nameID = nameID #: Name ID as integer (e.g. 9 for designer's name)
- self.platformID = platformID #: Platform ID as integer
- self.platEncID = platEncID #: Platform encoding ID as integer
- self.langID = langID #: Language ID as integer
- self.string = string #: Name record value
-
- def build(self, builder):
- """Calls the builder object's ``add_name_record`` callback."""
- builder.add_name_record(
- self.location,
- self.nameID,
- self.platformID,
- self.platEncID,
- self.langID,
- self.string,
- )
-
- def asFea(self, indent=""):
- def escape(c, escape_pattern):
- # Also escape U+0022 QUOTATION MARK and U+005C REVERSE SOLIDUS
- if c >= 0x20 and c <= 0x7E and c not in (0x22, 0x5C):
- return chr(c)
- else:
- return escape_pattern % c
-
- encoding = getEncoding(self.platformID, self.platEncID, self.langID)
- if encoding is None:
- raise FeatureLibError("Unsupported encoding", self.location)
- s = tobytes(self.string, encoding=encoding)
- if encoding == "utf_16_be":
- escaped_string = "".join(
- [
- escape(byteord(s[i]) * 256 + byteord(s[i + 1]), r"\%04x")
- for i in range(0, len(s), 2)
- ]
- )
- else:
- escaped_string = "".join([escape(byteord(b), r"\%02x") for b in s])
- plat = simplify_name_attributes(self.platformID, self.platEncID, self.langID)
- if plat != "":
- plat += " "
- return 'nameid {} {}"{}";'.format(self.nameID, plat, escaped_string)
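The `escape` helper above passes printable ASCII through (except `"` and `\`) and hex-escapes every other code unit. A BMP-only sketch of the UTF-16 branch (helper name is illustrative; surrogate pairs for supplementary-plane characters are not handled here):

```python
def escape_fea_name(string):
    """Escape a name string the way the utf_16_be branch does for BMP
    characters: printable ASCII passes through, while quote, backslash,
    and all non-ASCII code units become \\XXXX hex escapes."""
    out = []
    for ch in string:
        cp = ord(ch)
        if 0x20 <= cp <= 0x7E and cp not in (0x22, 0x5C):
            out.append(ch)
        else:
            out.append("\\%04x" % cp)
    return "".join(out)
```

For single-byte encodings the real code uses two-digit `\XX` escapes per byte instead.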
-
-
-class FeatureNameStatement(NameRecord):
- """Represents a ``sizemenuname`` or ``name`` statement."""
-
- def build(self, builder):
- """Calls the builder object's ``add_featureName`` callback."""
- NameRecord.build(self, builder)
- builder.add_featureName(self.nameID)
-
- def asFea(self, indent=""):
- if self.nameID == "size":
- tag = "sizemenuname"
- else:
- tag = "name"
- plat = simplify_name_attributes(self.platformID, self.platEncID, self.langID)
- if plat != "":
- plat += " "
- return '{} {}"{}";'.format(tag, plat, self.string)
-
-
-class STATNameStatement(NameRecord):
- """Represents a STAT table ``name`` statement."""
-
- def asFea(self, indent=""):
- plat = simplify_name_attributes(self.platformID, self.platEncID, self.langID)
- if plat != "":
- plat += " "
- return 'name {}"{}";'.format(plat, self.string)
-
-
-class SizeParameters(Statement):
- """A ``parameters`` statement."""
-
- def __init__(self, DesignSize, SubfamilyID, RangeStart, RangeEnd, location=None):
- Statement.__init__(self, location)
- self.DesignSize = DesignSize
- self.SubfamilyID = SubfamilyID
- self.RangeStart = RangeStart
- self.RangeEnd = RangeEnd
-
- def build(self, builder):
- """Calls the builder object's ``set_size_parameters`` callback."""
- builder.set_size_parameters(
- self.location,
- self.DesignSize,
- self.SubfamilyID,
- self.RangeStart,
- self.RangeEnd,
- )
-
- def asFea(self, indent=""):
- res = "parameters {:.1f} {}".format(self.DesignSize, self.SubfamilyID)
- if self.RangeStart != 0 or self.RangeEnd != 0:
- res += " {} {}".format(int(self.RangeStart * 10), int(self.RangeEnd * 10))
- return res + ";"
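`asFea` above writes the optional size range in decipoints (point values multiplied by 10), omitting it when both ends are zero. A standalone sketch (function name is illustrative):

```python
def size_params_fea(design_size, subfamily_id, range_start=0, range_end=0):
    """Serialize a size feature ``parameters`` statement; the range,
    when present, is converted from points to integer decipoints."""
    res = "parameters {:.1f} {}".format(design_size, subfamily_id)
    if range_start != 0 or range_end != 0:
        res += " {} {}".format(int(range_start * 10), int(range_end * 10))
    return res + ";"
```

So a design size of 10 pt covering 8-12 pt serializes with the range as `80 120`.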
-
-
-class CVParametersNameStatement(NameRecord):
- """Represents a name statement inside a ``cvParameters`` block."""
-
- def __init__(
- self, nameID, platformID, platEncID, langID, string, block_name, location=None
- ):
- NameRecord.__init__(
- self, nameID, platformID, platEncID, langID, string, location=location
- )
- self.block_name = block_name
-
- def build(self, builder):
- """Calls the builder object's ``add_cv_parameter`` callback."""
- item = ""
- if self.block_name == "ParamUILabelNameID":
- item = "_{}".format(builder.cv_num_named_params_.get(self.nameID, 0))
- builder.add_cv_parameter(self.nameID)
- self.nameID = (self.nameID, self.block_name + item)
- NameRecord.build(self, builder)
-
- def asFea(self, indent=""):
- plat = simplify_name_attributes(self.platformID, self.platEncID, self.langID)
- if plat != "":
- plat += " "
- return 'name {}"{}";'.format(plat, self.string)
-
-
-class CharacterStatement(Statement):
- """
- Statement used in cvParameters blocks of Character Variant features (cvXX).
- The Unicode value may be written with either decimal or hexadecimal
- notation. The value must be preceded by '0x' if it is a hexadecimal value.
- The largest Unicode value allowed is 0xFFFFFF.
- """
-
- def __init__(self, character, tag, location=None):
- Statement.__init__(self, location)
- self.character = character
- self.tag = tag
-
- def build(self, builder):
- """Calls the builder object's ``add_cv_character`` callback."""
- builder.add_cv_character(self.character, self.tag)
-
- def asFea(self, indent=""):
- return "Character {:#x};".format(self.character)
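`asFea` above serializes the code point with Python's `{:#x}` format spec, which adds the `0x` prefix and lowercases the hex digits. A standalone sketch that also enforces the 0xFFFFFF ceiling from the docstring (function name is illustrative):

```python
def character_fea(code_point):
    """Serialize a cvParameters Character statement; values above
    0xFFFFFF are rejected, matching the documented limit."""
    if code_point > 0xFFFFFF:
        raise ValueError("Character value must not exceed 0xFFFFFF")
    return "Character {:#x};".format(code_point)
```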
-
-
-class BaseAxis(Statement):
- """An axis definition, being either a ``VertAxis.BaseTagList/BaseScriptList``
- pair or a ``HorizAxis.BaseTagList/BaseScriptList`` pair."""
-
- def __init__(self, bases, scripts, vertical, location=None):
- Statement.__init__(self, location)
- self.bases = bases #: A list of baseline tag names as strings
- self.scripts = scripts #: A list of script record tuples (script tag, default baseline tag, base coordinates)
- self.vertical = vertical #: Boolean; VertAxis if True, HorizAxis if False
-
- def build(self, builder):
- """Calls the builder object's ``set_base_axis`` callback."""
- builder.set_base_axis(self.bases, self.scripts, self.vertical)
-
- def asFea(self, indent=""):
- direction = "Vert" if self.vertical else "Horiz"
- scripts = [
- "{} {} {}".format(a[0], a[1], " ".join(map(str, a[2])))
- for a in self.scripts
- ]
- return "{}Axis.BaseTagList {};\n{}{}Axis.BaseScriptList {};".format(
- direction, " ".join(self.bases), indent, direction, ", ".join(scripts)
- )
-
-
-class OS2Field(Statement):
- """An entry in the ``OS/2`` table. Most ``values`` should be numbers or
- strings, apart from when the key is ``UnicodeRange``, ``CodePageRange``
- or ``Panose``, in which case it should be an array of integers."""
-
- def __init__(self, key, value, location=None):
- Statement.__init__(self, location)
- self.key = key
- self.value = value
-
- def build(self, builder):
- """Calls the builder object's ``add_os2_field`` callback."""
- builder.add_os2_field(self.key, self.value)
-
- def asFea(self, indent=""):
- def intarr2str(x):
- return " ".join(map(str, x))
-
- numbers = (
- "FSType",
- "TypoAscender",
- "TypoDescender",
- "TypoLineGap",
- "winAscent",
- "winDescent",
- "XHeight",
- "CapHeight",
- "WeightClass",
- "WidthClass",
- "LowerOpSize",
- "UpperOpSize",
- )
- ranges = ("UnicodeRange", "CodePageRange")
- keywords = dict([(x.lower(), [x, str]) for x in numbers])
- keywords.update([(x.lower(), [x, intarr2str]) for x in ranges])
- keywords["panose"] = ["Panose", intarr2str]
- keywords["vendor"] = ["Vendor", lambda y: '"{}"'.format(y)]
- if self.key in keywords:
- return "{} {};".format(
- keywords[self.key][0], keywords[self.key][1](self.value)
- )
- return "" # should raise exception
-
-
-class HheaField(Statement):
- """An entry in the ``hhea`` table."""
-
- def __init__(self, key, value, location=None):
- Statement.__init__(self, location)
- self.key = key
- self.value = value
-
- def build(self, builder):
- """Calls the builder object's ``add_hhea_field`` callback."""
- builder.add_hhea_field(self.key, self.value)
-
- def asFea(self, indent=""):
- fields = ("CaretOffset", "Ascender", "Descender", "LineGap")
- keywords = dict([(x.lower(), x) for x in fields])
- return "{} {};".format(keywords[self.key], self.value)
-
-
-class VheaField(Statement):
- """An entry in the ``vhea`` table."""
-
- def __init__(self, key, value, location=None):
- Statement.__init__(self, location)
- self.key = key
- self.value = value
-
- def build(self, builder):
- """Calls the builder object's ``add_vhea_field`` callback."""
- builder.add_vhea_field(self.key, self.value)
-
- def asFea(self, indent=""):
- fields = ("VertTypoAscender", "VertTypoDescender", "VertTypoLineGap")
- keywords = dict([(x.lower(), x) for x in fields])
- return "{} {};".format(keywords[self.key], self.value)
-
-
-class STATDesignAxisStatement(Statement):
- """A STAT table Design Axis
-
- Args:
- tag (str): a 4 letter axis tag
- axisOrder (int): the ordering index of this axis within the STAT table
- names (list): a list of :class:`STATNameStatement` objects
- """
-
- def __init__(self, tag, axisOrder, names, location=None):
- Statement.__init__(self, location)
- self.tag = tag
- self.axisOrder = axisOrder
- self.names = names
- self.location = location
-
- def build(self, builder):
- builder.addDesignAxis(self, self.location)
-
- def asFea(self, indent=""):
- indent += SHIFT
- res = f"DesignAxis {self.tag} {self.axisOrder} {{ \n"
- res += ("\n" + indent).join([s.asFea(indent=indent) for s in self.names]) + "\n"
- res += "};"
- return res
-
-
-class ElidedFallbackName(Statement):
- """STAT table ElidedFallbackName
-
- Args:
- names: a list of :class:`STATNameStatement` objects
- """
-
- def __init__(self, names, location=None):
- Statement.__init__(self, location)
- self.names = names
- self.location = location
-
- def build(self, builder):
- builder.setElidedFallbackName(self.names, self.location)
-
- def asFea(self, indent=""):
- indent += SHIFT
- res = "ElidedFallbackName { \n"
- res += ("\n" + indent).join([s.asFea(indent=indent) for s in self.names]) + "\n"
- res += "};"
- return res
-
-
-class ElidedFallbackNameID(Statement):
- """STAT table ElidedFallbackNameID
-
- Args:
- value: an int pointing to an existing name table name ID
- """
-
- def __init__(self, value, location=None):
- Statement.__init__(self, location)
- self.value = value
- self.location = location
-
- def build(self, builder):
- builder.setElidedFallbackName(self.value, self.location)
-
- def asFea(self, indent=""):
- return f"ElidedFallbackNameID {self.value};"
-
-
-class STATAxisValueStatement(Statement):
- """A STAT table Axis Value Record
-
- Args:
- names (list): a list of :class:`STATNameStatement` objects
- locations (list): a list of :class:`AxisValueLocationStatement` objects
- flags (int): a bit field (``OlderSiblingFontAttribute`` = 0x1, ``ElidableAxisValueName`` = 0x2)
- """
-
- def __init__(self, names, locations, flags, location=None):
- Statement.__init__(self, location)
- self.names = names
- self.locations = locations
- self.flags = flags
-
- def build(self, builder):
- builder.addAxisValueRecord(self, self.location)
-
- def asFea(self, indent=""):
- res = "AxisValue {\n"
- for location in self.locations:
- res += location.asFea()
-
- for nameRecord in self.names:
- res += nameRecord.asFea()
- res += "\n"
-
- if self.flags:
- flags = ["OlderSiblingFontAttribute", "ElidableAxisValueName"]
- flagStrings = []
- curr = 1
- for i in range(len(flags)):
- if self.flags & curr != 0:
- flagStrings.append(flags[i])
- curr = curr << 1
- res += f"flag {' '.join(flagStrings)};\n"
- res += "};"
- return res
-
-
-class AxisValueLocationStatement(Statement):
- """
- A STAT table Axis Value Location
-
- Args:
- tag (str): a 4 letter axis tag
- values (list): a list of ints and/or floats
- """
-
- def __init__(self, tag, values, location=None):
- Statement.__init__(self, location)
- self.tag = tag
- self.values = values
-
- def asFea(self, res=""):
- res += f"location {self.tag} "
- res += f"{' '.join(str(i) for i in self.values)};\n"
- return res
-
-
-class ConditionsetStatement(Statement):
- """
- A variable layout conditionset
-
- Args:
- name (str): the name of this conditionset
- conditions (dict): a dictionary mapping axis tags to a
- tuple of (min,max) userspace coordinates.
- """
-
- def __init__(self, name, conditions, location=None):
- Statement.__init__(self, location)
- self.name = name
- self.conditions = conditions
-
- def build(self, builder):
- builder.add_conditionset(self.location, self.name, self.conditions)
-
- def asFea(self, res="", indent=""):
- res += indent + f"conditionset {self.name} " + "{\n"
- for tag, (minvalue, maxvalue) in self.conditions.items():
- res += indent + SHIFT + f"{tag} {minvalue} {maxvalue};\n"
- res += indent + "}" + f" {self.name};\n"
- return res
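The serializer above emits one `tag min max;` line per axis between the braces and repeats the conditionset name after the closing brace. A standalone sketch (assuming `SHIFT` is four spaces, as elsewhere in this module):

```python
SHIFT = " " * 4  # one level of indentation, as assumed for fontTools' SHIFT

def conditionset_fea(name, conditions, indent=""):
    """Write a variable-layout conditionset block mapping axis tags
    to (min, max) userspace coordinates, closing with the name again."""
    res = indent + f"conditionset {name} " + "{\n"
    for tag, (minvalue, maxvalue) in conditions.items():
        res += indent + SHIFT + f"{tag} {minvalue} {maxvalue};\n"
    res += indent + "}" + f" {name};\n"
    return res
```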
-
-
-class VariationBlock(Block):
- """A variation feature block, applicable in a given set of conditions."""
-
- def __init__(self, name, conditionset, use_extension=False, location=None):
- Block.__init__(self, location)
- self.name, self.conditionset, self.use_extension = (
- name,
- conditionset,
- use_extension,
- )
-
- def build(self, builder):
- """Call the ``start_feature`` callback on the builder object, visit
- all the statements in this feature, and then call ``end_feature``."""
- builder.start_feature(self.location, self.name)
- if (
- self.conditionset != "NULL"
- and self.conditionset not in builder.conditionsets_
- ):
- raise FeatureLibError(
- f"variation block used undefined conditionset {self.conditionset}",
- self.location,
- )
-
- # language exclude_dflt statements modify builder.features_
- # limit them to this block with temporary builder.features_
- features = builder.features_
- builder.features_ = {}
- Block.build(self, builder)
- for key, value in builder.features_.items():
- items = builder.feature_variations_.setdefault(key, {}).setdefault(
- self.conditionset, []
- )
- items.extend(value)
- if key not in features:
- features[key] = [] # Ensure we make a feature record
- builder.features_ = features
- builder.end_feature()
-
- def asFea(self, indent=""):
- res = indent + "variation %s " % self.name.strip()
- res += self.conditionset + " "
- if self.use_extension:
- res += "useExtension "
- res += "{\n"
- res += Block.asFea(self, indent=indent)
- res += indent + "} %s;\n" % self.name.strip()
- return res
diff --git a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/wrapper-b7460963-69b64cfb.js b/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/wrapper-b7460963-69b64cfb.js
deleted file mode 100644
index 02049f8e8fbbc52f6fd3d807e42bbbe9e811c2f6..0000000000000000000000000000000000000000
--- a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/gradio/templates/frontend/assets/wrapper-b7460963-69b64cfb.js
+++ /dev/null
Pt(s){this[T].push(s),this[w]+=s.length}function Je(s){if(this[w]+=s.length,this[se]._maxPayload<1||this[w]<=this[se]._maxPayload){this[T].push(s);return}this[ee]=new RangeError("Max payload size exceeded"),this[ee].code="WS_ERR_UNSUPPORTED_MESSAGE_LENGTH",this[ee][Qe]=1009,this.removeListener("data",Je),this.reset()}function Rt(s){this[se]._inflate=null,s[Qe]=1007,this[z](s)}var N={},Ut={get exports(){return N},set exports(s){N=s}};const Bt={},$t=Object.freeze(Object.defineProperty({__proto__:null,default:Bt},Symbol.toStringTag,{value:"Module"})),Mt=_t($t);var Oe;const{isUtf8:Te}=S,It=[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,1,1,1,0,0,1,1,0,1,1,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,0,0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,1,0,1,0];function Dt(s){return s>=1e3&&s<=1014&&s!==1004&&s!==1005&&s!==1006||s>=3e3&&s<=4999}function ve(s){const e=s.length;let t=0;for(;t=e||(s[t+1]&192)!==128||(s[t+2]&192)!==128||s[t]===224&&(s[t+1]&224)===128||s[t]===237&&(s[t+1]&224)===160)return!1;t+=3}else if((s[t]&248)===240){if(t+3>=e||(s[t+1]&192)!==128||(s[t+2]&192)!==128||(s[t+3]&192)!==128||s[t]===240&&(s[t+1]&240)===128||s[t]===244&&s[t+1]>143||s[t]>244)return!1;t+=4}else return!1;return!0}Ut.exports={isValidStatusCode:Dt,isValidUTF8:ve,tokenChars:It};if(Te)Oe=N.isValidUTF8=function(s){return s.length<24?ve(s):Te(s)};else if(!{}.WS_NO_UTF_8_VALIDATE)try{const s=Mt;Oe=N.isValidUTF8=function(e){return e.length<32?ve(e):s(e)}}catch{}const{Writable:Wt}=S,Ce=ie,{BINARY_TYPES:At,EMPTY_BUFFER:Le,kStatusCode:Ft,kWebSocket:jt}=$,{concat:he,toArrayBuffer:Gt,unmask:Vt}=L,{isValidStatusCode:Ht,isValidUTF8:Ne}=N,Z=Buffer[Symbol.species],j=0,Pe=1,Re=2,Ue=3,ce=4,zt=5;let Yt=class extends 
Wt{constructor(e={}){super(),this._binaryType=e.binaryType||At[0],this._extensions=e.extensions||{},this._isServer=!!e.isServer,this._maxPayload=e.maxPayload|0,this._skipUTF8Validation=!!e.skipUTF8Validation,this[jt]=void 0,this._bufferedBytes=0,this._buffers=[],this._compressed=!1,this._payloadLength=0,this._mask=void 0,this._fragmented=0,this._masked=!1,this._fin=!1,this._opcode=0,this._totalPayloadLength=0,this._messageLength=0,this._fragments=[],this._state=j,this._loop=!1}_write(e,t,r){if(this._opcode===8&&this._state==j)return r();this._bufferedBytes+=e.length,this._buffers.push(e),this.startLoop(r)}consume(e){if(this._bufferedBytes-=e,e===this._buffers[0].length)return this._buffers.shift();if(e=r.length?t.set(this._buffers.shift(),i):(t.set(new Uint8Array(r.buffer,r.byteOffset,e),i),this._buffers[0]=new Z(r.buffer,r.byteOffset+e,r.length-e)),e-=r.length}while(e>0);return t}startLoop(e){let t;this._loop=!0;do switch(this._state){case j:t=this.getInfo();break;case Pe:t=this.getPayloadLength16();break;case Re:t=this.getPayloadLength64();break;case Ue:this.getMask();break;case ce:t=this.getData(e);break;default:this._loop=!1;return}while(this._loop);e(t)}getInfo(){if(this._bufferedBytes<2){this._loop=!1;return}const e=this.consume(2);if(e[0]&48)return this._loop=!1,g(RangeError,"RSV2 and RSV3 must be clear",!0,1002,"WS_ERR_UNEXPECTED_RSV_2_3");const t=(e[0]&64)===64;if(t&&!this._extensions[Ce.extensionName])return this._loop=!1,g(RangeError,"RSV1 must be clear",!0,1002,"WS_ERR_UNEXPECTED_RSV_1");if(this._fin=(e[0]&128)===128,this._opcode=e[0]&15,this._payloadLength=e[1]&127,this._opcode===0){if(t)return this._loop=!1,g(RangeError,"RSV1 must be clear",!0,1002,"WS_ERR_UNEXPECTED_RSV_1");if(!this._fragmented)return this._loop=!1,g(RangeError,"invalid opcode 0",!0,1002,"WS_ERR_INVALID_OPCODE");this._opcode=this._fragmented}else if(this._opcode===1||this._opcode===2){if(this._fragmented)return this._loop=!1,g(RangeError,`invalid opcode 
${this._opcode}`,!0,1002,"WS_ERR_INVALID_OPCODE");this._compressed=t}else if(this._opcode>7&&this._opcode<11){if(!this._fin)return this._loop=!1,g(RangeError,"FIN must be set",!0,1002,"WS_ERR_EXPECTED_FIN");if(t)return this._loop=!1,g(RangeError,"RSV1 must be clear",!0,1002,"WS_ERR_UNEXPECTED_RSV_1");if(this._payloadLength>125||this._opcode===8&&this._payloadLength===1)return this._loop=!1,g(RangeError,`invalid payload length ${this._payloadLength}`,!0,1002,"WS_ERR_INVALID_CONTROL_PAYLOAD_LENGTH")}else return this._loop=!1,g(RangeError,`invalid opcode ${this._opcode}`,!0,1002,"WS_ERR_INVALID_OPCODE");if(!this._fin&&!this._fragmented&&(this._fragmented=this._opcode),this._masked=(e[1]&128)===128,this._isServer){if(!this._masked)return this._loop=!1,g(RangeError,"MASK must be set",!0,1002,"WS_ERR_EXPECTED_MASK")}else if(this._masked)return this._loop=!1,g(RangeError,"MASK must be clear",!0,1002,"WS_ERR_UNEXPECTED_MASK");if(this._payloadLength===126)this._state=Pe;else if(this._payloadLength===127)this._state=Re;else return this.haveLength()}getPayloadLength16(){if(this._bufferedBytes<2){this._loop=!1;return}return this._payloadLength=this.consume(2).readUInt16BE(0),this.haveLength()}getPayloadLength64(){if(this._bufferedBytes<8){this._loop=!1;return}const e=this.consume(8),t=e.readUInt32BE(0);return t>Math.pow(2,53-32)-1?(this._loop=!1,g(RangeError,"Unsupported WebSocket frame: payload length > 2^53 - 1",!1,1009,"WS_ERR_UNSUPPORTED_DATA_PAYLOAD_LENGTH")):(this._payloadLength=t*Math.pow(2,32)+e.readUInt32BE(4),this.haveLength())}haveLength(){if(this._payloadLength&&this._opcode<8&&(this._totalPayloadLength+=this._payloadLength,this._totalPayloadLength>this._maxPayload&&this._maxPayload>0))return this._loop=!1,g(RangeError,"Max payload size exceeded",!1,1009,"WS_ERR_UNSUPPORTED_MESSAGE_LENGTH");this._masked?this._state=Ue:this._state=ce}getMask(){if(this._bufferedBytes<4){this._loop=!1;return}this._mask=this.consume(4),this._state=ce}getData(e){let 
t=Le;if(this._payloadLength){if(this._bufferedBytes7)return this.controlMessage(t);if(this._compressed){this._state=zt,this.decompress(t,e);return}return t.length&&(this._messageLength=this._totalPayloadLength,this._fragments.push(t)),this.dataMessage()}decompress(e,t){this._extensions[Ce.extensionName].decompress(e,this._fin,(i,n)=>{if(i)return t(i);if(n.length){if(this._messageLength+=n.length,this._messageLength>this._maxPayload&&this._maxPayload>0)return t(g(RangeError,"Max payload size exceeded",!1,1009,"WS_ERR_UNSUPPORTED_MESSAGE_LENGTH"));this._fragments.push(n)}const o=this.dataMessage();if(o)return t(o);this.startLoop(t)})}dataMessage(){if(this._fin){const e=this._messageLength,t=this._fragments;if(this._totalPayloadLength=0,this._messageLength=0,this._fragmented=0,this._fragments=[],this._opcode===2){let r;this._binaryType==="nodebuffer"?r=he(t,e):this._binaryType==="arraybuffer"?r=Gt(he(t,e)):r=t,this.emit("message",r,!0)}else{const r=he(t,e);if(!this._skipUTF8Validation&&!Ne(r))return this._loop=!1,g(Error,"invalid UTF-8 sequence",!0,1007,"WS_ERR_INVALID_UTF8");this.emit("message",r,!1)}}this._state=j}controlMessage(e){if(this._opcode===8)if(this._loop=!1,e.length===0)this.emit("conclude",1005,Le),this.end();else{const t=e.readUInt16BE(0);if(!Ht(t))return g(RangeError,`invalid status code ${t}`,!0,1002,"WS_ERR_INVALID_CLOSE_CODE");const r=new Z(e.buffer,e.byteOffset+2,e.length-2);if(!this._skipUTF8Validation&&!Ne(r))return g(Error,"invalid UTF-8 sequence",!0,1007,"WS_ERR_INVALID_UTF8");this.emit("conclude",t,r),this.end()}else this._opcode===9?this.emit("ping",e):this.emit("pong",e);this._state=j}};var et=Yt;function g(s,e,t,r,i){const n=new s(t?`Invalid WebSocket frame: ${e}`:e);return Error.captureStackTrace(n,g),n.code=i,n[Ft]=r,n}const Ys=et,{randomFillSync:qt}=S,Be=ie,{EMPTY_BUFFER:Kt}=$,{isValidStatusCode:Xt}=N,{mask:$e,toBuffer:D}=L,x=Symbol("kByteLength"),Zt=Buffer.alloc(4);let Qt=class 
U{constructor(e,t,r){this._extensions=t||{},r&&(this._generateMask=r,this._maskBuffer=Buffer.alloc(4)),this._socket=e,this._firstFragment=!0,this._compress=!1,this._bufferedBytes=0,this._deflating=!1,this._queue=[]}static frame(e,t){let r,i=!1,n=2,o=!1;t.mask&&(r=t.maskBuffer||Zt,t.generateMask?t.generateMask(r):qt(r,0,4),o=(r[0]|r[1]|r[2]|r[3])===0,n=6);let l;typeof e=="string"?(!t.mask||o)&&t[x]!==void 0?l=t[x]:(e=Buffer.from(e),l=e.length):(l=e.length,i=t.mask&&t.readOnly&&!o);let f=l;l>=65536?(n+=8,f=127):l>125&&(n+=2,f=126);const a=Buffer.allocUnsafe(i?l+n:n);return a[0]=t.fin?t.opcode|128:t.opcode,t.rsv1&&(a[0]|=64),a[1]=f,f===126?a.writeUInt16BE(l,2):f===127&&(a[2]=a[3]=0,a.writeUIntBE(l,4,6)),t.mask?(a[1]|=128,a[n-4]=r[0],a[n-3]=r[1],a[n-2]=r[2],a[n-1]=r[3],o?[a,e]:i?($e(e,r,a,n,l),[a]):($e(e,r,e,0,l),[a,e])):[a,e]}close(e,t,r,i){let n;if(e===void 0)n=Kt;else{if(typeof e!="number"||!Xt(e))throw new TypeError("First argument must be a valid error code number");if(t===void 0||!t.length)n=Buffer.allocUnsafe(2),n.writeUInt16BE(e,0);else{const l=Buffer.byteLength(t);if(l>123)throw new RangeError("The message must not be greater than 123 bytes");n=Buffer.allocUnsafe(2+l),n.writeUInt16BE(e,0),typeof t=="string"?n.write(t,2):n.set(t,2)}}const o={[x]:n.length,fin:!0,generateMask:this._generateMask,mask:r,maskBuffer:this._maskBuffer,opcode:8,readOnly:!1,rsv1:!1};this._deflating?this.enqueue([this.dispatch,n,!1,o,i]):this.sendFrame(U.frame(n,o),i)}ping(e,t,r){let i,n;if(typeof e=="string"?(i=Buffer.byteLength(e),n=!1):(e=D(e),i=e.length,n=D.readOnly),i>125)throw new RangeError("The data size must not be greater than 125 bytes");const o={[x]:i,fin:!0,generateMask:this._generateMask,mask:t,maskBuffer:this._maskBuffer,opcode:9,readOnly:n,rsv1:!1};this._deflating?this.enqueue([this.dispatch,e,!1,o,r]):this.sendFrame(U.frame(e,o),r)}pong(e,t,r){let i,n;if(typeof e=="string"?(i=Buffer.byteLength(e),n=!1):(e=D(e),i=e.length,n=D.readOnly),i>125)throw new RangeError("The data 
size must not be greater than 125 bytes");const o={[x]:i,fin:!0,generateMask:this._generateMask,mask:t,maskBuffer:this._maskBuffer,opcode:10,readOnly:n,rsv1:!1};this._deflating?this.enqueue([this.dispatch,e,!1,o,r]):this.sendFrame(U.frame(e,o),r)}send(e,t,r){const i=this._extensions[Be.extensionName];let n=t.binary?2:1,o=t.compress,l,f;if(typeof e=="string"?(l=Buffer.byteLength(e),f=!1):(e=D(e),l=e.length,f=D.readOnly),this._firstFragment?(this._firstFragment=!1,o&&i&&i.params[i._isServer?"server_no_context_takeover":"client_no_context_takeover"]&&(o=l>=i._threshold),this._compress=o):(o=!1,n=0),t.fin&&(this._firstFragment=!0),i){const a={[x]:l,fin:t.fin,generateMask:this._generateMask,mask:t.mask,maskBuffer:this._maskBuffer,opcode:n,readOnly:f,rsv1:o};this._deflating?this.enqueue([this.dispatch,e,this._compress,a,r]):this.dispatch(e,this._compress,a,r)}else this.sendFrame(U.frame(e,{[x]:l,fin:t.fin,generateMask:this._generateMask,mask:t.mask,maskBuffer:this._maskBuffer,opcode:n,readOnly:f,rsv1:!1}),r)}dispatch(e,t,r,i){if(!t){this.sendFrame(U.frame(e,r),i);return}const n=this._extensions[Be.extensionName];this._bufferedBytes+=r[x],this._deflating=!0,n.compress(e,r.fin,(o,l)=>{if(this._socket.destroyed){const f=new Error("The socket was closed while data was being compressed");typeof i=="function"&&i(f);for(let a=0;a{let t=s[e];return Array.isArray(t)||(t=[t]),t.map(r=>[e].concat(Object.keys(r).map(i=>{let n=r[i];return Array.isArray(n)||(n=[n]),n.map(o=>o===!0?i:`${i}=${o}`).join("; ")})).join("; ")).join(", ")}).join(", ")}var st={format:ss,parse:ts};const 
rs=S,is=S,ns=S,rt=S,os=S,{randomBytes:as,createHash:ls}=S,{URL:de}=S,C=ie,fs=et,hs=tt,{BINARY_TYPES:Ge,EMPTY_BUFFER:J,GUID:cs,kForOnEventAttribute:_e,kListener:us,kStatusCode:ds,kWebSocket:y,NOOP:it}=$,{EventTarget:{addEventListener:_s,removeEventListener:ps}}=es,{format:ms,parse:gs}=st,{toBuffer:ys}=L,vs=30*1e3,nt=Symbol("kAborted"),pe=[8,13],O=["CONNECTING","OPEN","CLOSING","CLOSED"],Ss=/^[!#$%&'*+\-.0-9A-Z^_`|a-z~]+$/;let m=class d extends rs{constructor(e,t,r){super(),this._binaryType=Ge[0],this._closeCode=1006,this._closeFrameReceived=!1,this._closeFrameSent=!1,this._closeMessage=J,this._closeTimer=null,this._extensions={},this._paused=!1,this._protocol="",this._readyState=d.CONNECTING,this._receiver=null,this._sender=null,this._socket=null,e!==null?(this._bufferedAmount=0,this._isServer=!1,this._redirects=0,t===void 0?t=[]:Array.isArray(t)||(typeof t=="object"&&t!==null?(r=t,t=[]):t=[t]),at(this,e,t,r)):this._isServer=!0}get binaryType(){return this._binaryType}set binaryType(e){Ge.includes(e)&&(this._binaryType=e,this._receiver&&(this._receiver._binaryType=e))}get bufferedAmount(){return this._socket?this._socket._writableState.length+this._sender._bufferedBytes:this._bufferedAmount}get extensions(){return Object.keys(this._extensions).join()}get isPaused(){return this._paused}get onclose(){return null}get onerror(){return null}get onopen(){return null}get onmessage(){return null}get protocol(){return this._protocol}get readyState(){return this._readyState}get url(){return this._url}setSocket(e,t,r){const i=new fs({binaryType:this.binaryType,extensions:this._extensions,isServer:this._isServer,maxPayload:r.maxPayload,skipUTF8Validation:r.skipUTF8Validation});this._sender=new 
hs(e,this._extensions,r.generateMask),this._receiver=i,this._socket=e,i[y]=this,e[y]=this,i.on("conclude",xs),i.on("drain",ks),i.on("error",ws),i.on("message",Os),i.on("ping",Ts),i.on("pong",Cs),e.setTimeout(0),e.setNoDelay(),t.length>0&&e.unshift(t),e.on("close",ft),e.on("data",oe),e.on("end",ht),e.on("error",ct),this._readyState=d.OPEN,this.emit("open")}emitClose(){if(!this._socket){this._readyState=d.CLOSED,this.emit("close",this._closeCode,this._closeMessage);return}this._extensions[C.extensionName]&&this._extensions[C.extensionName].cleanup(),this._receiver.removeAllListeners(),this._readyState=d.CLOSED,this.emit("close",this._closeCode,this._closeMessage)}close(e,t){if(this.readyState!==d.CLOSED){if(this.readyState===d.CONNECTING){const r="WebSocket was closed before the connection was established";b(this,this._req,r);return}if(this.readyState===d.CLOSING){this._closeFrameSent&&(this._closeFrameReceived||this._receiver._writableState.errorEmitted)&&this._socket.end();return}this._readyState=d.CLOSING,this._sender.close(e,t,!this._isServer,r=>{r||(this._closeFrameSent=!0,(this._closeFrameReceived||this._receiver._writableState.errorEmitted)&&this._socket.end())}),this._closeTimer=setTimeout(this._socket.destroy.bind(this._socket),vs)}}pause(){this.readyState===d.CONNECTING||this.readyState===d.CLOSED||(this._paused=!0,this._socket.pause())}ping(e,t,r){if(this.readyState===d.CONNECTING)throw new Error("WebSocket is not open: readyState 0 (CONNECTING)");if(typeof e=="function"?(r=e,e=t=void 0):typeof t=="function"&&(r=t,t=void 0),typeof e=="number"&&(e=e.toString()),this.readyState!==d.OPEN){me(this,e,r);return}t===void 0&&(t=!this._isServer),this._sender.ping(e||J,t,r)}pong(e,t,r){if(this.readyState===d.CONNECTING)throw new Error("WebSocket is not open: readyState 0 (CONNECTING)");if(typeof e=="function"?(r=e,e=t=void 0):typeof t=="function"&&(r=t,t=void 0),typeof e=="number"&&(e=e.toString()),this.readyState!==d.OPEN){me(this,e,r);return}t===void 
0&&(t=!this._isServer),this._sender.pong(e||J,t,r)}resume(){this.readyState===d.CONNECTING||this.readyState===d.CLOSED||(this._paused=!1,this._receiver._writableState.needDrain||this._socket.resume())}send(e,t,r){if(this.readyState===d.CONNECTING)throw new Error("WebSocket is not open: readyState 0 (CONNECTING)");if(typeof t=="function"&&(r=t,t={}),typeof e=="number"&&(e=e.toString()),this.readyState!==d.OPEN){me(this,e,r);return}const i={binary:typeof e!="string",mask:!this._isServer,compress:!0,fin:!0,...t};this._extensions[C.extensionName]||(i.compress=!1),this._sender.send(e||J,i,r)}terminate(){if(this.readyState!==d.CLOSED){if(this.readyState===d.CONNECTING){const e="WebSocket was closed before the connection was established";b(this,this._req,e);return}this._socket&&(this._readyState=d.CLOSING,this._socket.destroy())}}};Object.defineProperty(m,"CONNECTING",{enumerable:!0,value:O.indexOf("CONNECTING")});Object.defineProperty(m.prototype,"CONNECTING",{enumerable:!0,value:O.indexOf("CONNECTING")});Object.defineProperty(m,"OPEN",{enumerable:!0,value:O.indexOf("OPEN")});Object.defineProperty(m.prototype,"OPEN",{enumerable:!0,value:O.indexOf("OPEN")});Object.defineProperty(m,"CLOSING",{enumerable:!0,value:O.indexOf("CLOSING")});Object.defineProperty(m.prototype,"CLOSING",{enumerable:!0,value:O.indexOf("CLOSING")});Object.defineProperty(m,"CLOSED",{enumerable:!0,value:O.indexOf("CLOSED")});Object.defineProperty(m.prototype,"CLOSED",{enumerable:!0,value:O.indexOf("CLOSED")});["binaryType","bufferedAmount","extensions","isPaused","protocol","readyState","url"].forEach(s=>{Object.defineProperty(m.prototype,s,{enumerable:!0})});["open","error","close","message"].forEach(s=>{Object.defineProperty(m.prototype,`on${s}`,{enumerable:!0,get(){for(const e of this.listeners(s))if(e[_e])return e[us];return null},set(e){for(const t of this.listeners(s))if(t[_e]){this.removeListener(s,t);break}typeof 
e=="function"&&this.addEventListener(s,e,{[_e]:!0})}})});m.prototype.addEventListener=_s;m.prototype.removeEventListener=ps;var ot=m;function at(s,e,t,r){const i={protocolVersion:pe[1],maxPayload:104857600,skipUTF8Validation:!1,perMessageDeflate:!0,followRedirects:!1,maxRedirects:10,...r,createConnection:void 0,socketPath:void 0,hostname:void 0,protocol:void 0,timeout:void 0,method:"GET",host:void 0,path:void 0,port:void 0};if(!pe.includes(i.protocolVersion))throw new RangeError(`Unsupported protocol version: ${i.protocolVersion} (supported versions: ${pe.join(", ")})`);let n;if(e instanceof de)n=e,s._url=e.href;else{try{n=new de(e)}catch{throw new SyntaxError(`Invalid URL: ${e}`)}s._url=e}const o=n.protocol==="wss:",l=n.protocol==="ws+unix:";let f;if(n.protocol!=="ws:"&&!o&&!l?f=`The URL's protocol must be one of "ws:", "wss:", or "ws+unix:"`:l&&!n.pathname?f="The URL's pathname is empty":n.hash&&(f="The URL contains a fragment identifier"),f){const u=new SyntaxError(f);if(s._redirects===0)throw u;te(s,u);return}const a=o?443:80,c=as(16).toString("base64"),h=o?is.request:ns.request,p=new Set;let v;if(i.createConnection=o?bs:Es,i.defaultPort=i.defaultPort||a,i.port=n.port||a,i.host=n.hostname.startsWith("[")?n.hostname.slice(1,-1):n.hostname,i.headers={...i.headers,"Sec-WebSocket-Version":i.protocolVersion,"Sec-WebSocket-Key":c,Connection:"Upgrade",Upgrade:"websocket"},i.path=n.pathname+n.search,i.timeout=i.handshakeTimeout,i.perMessageDeflate&&(v=new C(i.perMessageDeflate!==!0?i.perMessageDeflate:{},!1,i.maxPayload),i.headers["Sec-WebSocket-Extensions"]=ms({[C.extensionName]:v.offer()})),t.length){for(const u of t){if(typeof u!="string"||!Ss.test(u)||p.has(u))throw new SyntaxError("An invalid or duplicated subprotocol was 
specified");p.add(u)}i.headers["Sec-WebSocket-Protocol"]=t.join(",")}if(i.origin&&(i.protocolVersion<13?i.headers["Sec-WebSocket-Origin"]=i.origin:i.headers.Origin=i.origin),(n.username||n.password)&&(i.auth=`${n.username}:${n.password}`),l){const u=i.path.split(":");i.socketPath=u[0],i.path=u[1]}let _;if(i.followRedirects){if(s._redirects===0){s._originalIpc=l,s._originalSecure=o,s._originalHostOrSocketPath=l?i.socketPath:n.host;const u=r&&r.headers;if(r={...r,headers:{}},u)for(const[E,I]of Object.entries(u))r.headers[E.toLowerCase()]=I}else if(s.listenerCount("redirect")===0){const u=l?s._originalIpc?i.socketPath===s._originalHostOrSocketPath:!1:s._originalIpc?!1:n.host===s._originalHostOrSocketPath;(!u||s._originalSecure&&!o)&&(delete i.headers.authorization,delete i.headers.cookie,u||delete i.headers.host,i.auth=void 0)}i.auth&&!r.headers.authorization&&(r.headers.authorization="Basic "+Buffer.from(i.auth).toString("base64")),_=s._req=h(i),s._redirects&&s.emit("redirect",s.url,_)}else _=s._req=h(i);i.timeout&&_.on("timeout",()=>{b(s,_,"Opening handshake has timed out")}),_.on("error",u=>{_===null||_[nt]||(_=s._req=null,te(s,u))}),_.on("response",u=>{const E=u.headers.location,I=u.statusCode;if(E&&i.followRedirects&&I>=300&&I<400){if(++s._redirects>i.maxRedirects){b(s,_,"Maximum redirects exceeded");return}_.abort();let K;try{K=new de(E,e)}catch{const P=new SyntaxError(`Invalid URL: ${E}`);te(s,P);return}at(s,K,t,r)}else s.emit("unexpected-response",_,u)||b(s,_,`Unexpected server response: ${u.statusCode}`)}),_.on("upgrade",(u,E,I)=>{if(s.emit("upgrade",u),s.readyState!==m.CONNECTING)return;if(_=s._req=null,u.headers.upgrade.toLowerCase()!=="websocket"){b(s,E,"Invalid Upgrade header");return}const K=ls("sha1").update(c+cs).digest("base64");if(u.headers["sec-websocket-accept"]!==K){b(s,E,"Invalid Sec-WebSocket-Accept header");return}const A=u.headers["sec-websocket-protocol"];let P;if(A!==void 0?p.size?p.has(A)||(P="Server sent an invalid subprotocol"):P="Server 
sent a subprotocol but none was requested":p.size&&(P="Server sent no subprotocol"),P){b(s,E,P);return}A&&(s._protocol=A);const Ee=u.headers["sec-websocket-extensions"];if(Ee!==void 0){if(!v){b(s,E,"Server sent a Sec-WebSocket-Extensions header but no extension was requested");return}let ae;try{ae=gs(Ee)}catch{b(s,E,"Invalid Sec-WebSocket-Extensions header");return}const be=Object.keys(ae);if(be.length!==1||be[0]!==C.extensionName){b(s,E,"Server indicated an extension that was not requested");return}try{v.accept(ae[C.extensionName])}catch{b(s,E,"Invalid Sec-WebSocket-Extensions header");return}s._extensions[C.extensionName]=v}s.setSocket(E,I,{generateMask:i.generateMask,maxPayload:i.maxPayload,skipUTF8Validation:i.skipUTF8Validation})}),i.finishRequest?i.finishRequest(_,s):_.end()}function te(s,e){s._readyState=m.CLOSING,s.emit("error",e),s.emitClose()}function Es(s){return s.path=s.socketPath,rt.connect(s)}function bs(s){return s.path=void 0,!s.servername&&s.servername!==""&&(s.servername=rt.isIP(s.host)?"":s.host),os.connect(s)}function b(s,e,t){s._readyState=m.CLOSING;const r=new Error(t);Error.captureStackTrace(r,b),e.setHeader?(e[nt]=!0,e.abort(),e.socket&&!e.socket.destroyed&&e.socket.destroy(),process.nextTick(te,s,r)):(e.destroy(r),e.once("error",s.emit.bind(s,"error")),e.once("close",s.emitClose.bind(s)))}function me(s,e,t){if(e){const r=ys(e).length;s._socket?s._sender._bufferedBytes+=r:s._bufferedAmount+=r}if(t){const r=new Error(`WebSocket is not open: readyState ${s.readyState} (${O[s.readyState]})`);process.nextTick(t,r)}}function xs(s,e){const t=this[y];t._closeFrameReceived=!0,t._closeMessage=e,t._closeCode=s,t._socket[y]!==void 0&&(t._socket.removeListener("data",oe),process.nextTick(lt,t._socket),s===1005?t.close():t.close(s,e))}function ks(){const s=this[y];s.isPaused||s._socket.resume()}function ws(s){const e=this[y];e._socket[y]!==void 
0&&(e._socket.removeListener("data",oe),process.nextTick(lt,e._socket),e.close(s[ds])),e.emit("error",s)}function Ve(){this[y].emitClose()}function Os(s,e){this[y].emit("message",s,e)}function Ts(s){const e=this[y];e.pong(s,!e._isServer,it),e.emit("ping",s)}function Cs(s){this[y].emit("pong",s)}function lt(s){s.resume()}function ft(){const s=this[y];this.removeListener("close",ft),this.removeListener("data",oe),this.removeListener("end",ht),s._readyState=m.CLOSING;let e;!this._readableState.endEmitted&&!s._closeFrameReceived&&!s._receiver._writableState.errorEmitted&&(e=s._socket.read())!==null&&s._receiver.write(e),s._receiver.end(),this[y]=void 0,clearTimeout(s._closeTimer),s._receiver._writableState.finished||s._receiver._writableState.errorEmitted?s.emitClose():(s._receiver.on("error",Ve),s._receiver.on("finish",Ve))}function oe(s){this[y]._receiver.write(s)||this.pause()}function ht(){const s=this[y];s._readyState=m.CLOSING,s._receiver.end(),this.end()}function ct(){const s=this[y];this.removeListener("error",ct),this.on("error",it),s&&(s._readyState=m.CLOSING,this.destroy())}const Ks=ot,{tokenChars:Ls}=N;function Ns(s){const e=new Set;let t=-1,r=-1,i=0;for(i;i{const n=re.STATUS_CODES[426];i.writeHead(426,{"Content-Length":n.length,"Content-Type":"text/plain"}),i.end(n)}),this._server.listen(e.port,e.host,e.backlog,t)):e.server&&(this._server=e.server),this._server){const r=this.emit.bind(this,"connection");this._removeListeners=Fs(this._server,{listening:this.emit.bind(this,"listening"),error:this.emit.bind(this,"error"),upgrade:(i,n,o)=>{this.handleUpgrade(i,n,o,r)}})}e.perMessageDeflate===!0&&(e.perMessageDeflate={}),e.clientTracking&&(this.clients=new Set,this._shouldEmitClose=!1),this.options=e,this._state=ze}address(){if(this.options.noServer)throw new Error('The server is operating in "noServer" mode');return this._server?this._server.address():null}close(e){if(this._state===ut){e&&this.once("close",()=>{e(new Error("The server is not 
running"))}),process.nextTick(H,this);return}if(e&&this.once("close",e),this._state!==Ye)if(this._state=Ye,this.options.noServer||this.options.server)this._server&&(this._removeListeners(),this._removeListeners=this._server=null),this.clients?this.clients.size?this._shouldEmitClose=!0:process.nextTick(H,this):process.nextTick(H,this);else{const t=this._server;this._removeListeners(),this._removeListeners=this._server=null,t.close(()=>{H(this)})}}shouldHandle(e){if(this.options.path){const t=e.url.indexOf("?");if((t!==-1?e.url.slice(0,t):e.url)!==this.options.path)return!1}return!0}handleUpgrade(e,t,r,i){t.on("error",qe);const n=e.headers["sec-websocket-key"],o=+e.headers["sec-websocket-version"];if(e.method!=="GET"){B(this,e,t,405,"Invalid HTTP method");return}if(e.headers.upgrade.toLowerCase()!=="websocket"){B(this,e,t,400,"Invalid Upgrade header");return}if(!n||!Ds.test(n)){B(this,e,t,400,"Missing or invalid Sec-WebSocket-Key header");return}if(o!==8&&o!==13){B(this,e,t,400,"Missing or invalid Sec-WebSocket-Version header");return}if(!this.shouldHandle(e)){Y(t,400);return}const l=e.headers["sec-websocket-protocol"];let f=new Set;if(l!==void 0)try{f=Bs.parse(l)}catch{B(this,e,t,400,"Invalid Sec-WebSocket-Protocol header");return}const a=e.headers["sec-websocket-extensions"],c={};if(this.options.perMessageDeflate&&a!==void 0){const h=new R(this.options.perMessageDeflate,!0,this.options.maxPayload);try{const p=He.parse(a);p[R.extensionName]&&(h.accept(p[R.extensionName]),c[R.extensionName]=h)}catch{B(this,e,t,400,"Invalid or unacceptable Sec-WebSocket-Extensions header");return}}if(this.options.verifyClient){const h={origin:e.headers[`${o===8?"sec-websocket-origin":"origin"}`],secure:!!(e.socket.authorized||e.socket.encrypted),req:e};if(this.options.verifyClient.length===2){this.options.verifyClient(h,(p,v,_,u)=>{if(!p)return Y(t,v||401,_,u);this.completeUpgrade(c,n,f,e,t,r,i)});return}if(!this.options.verifyClient(h))return 
Y(t,401)}this.completeUpgrade(c,n,f,e,t,r,i)}completeUpgrade(e,t,r,i,n,o,l){if(!n.readable||!n.writable)return n.destroy();if(n[Is])throw new Error("server.handleUpgrade() was called more than once with the same socket, possibly due to a misconfiguration");if(this._state>ze)return Y(n,503);const a=["HTTP/1.1 101 Switching Protocols","Upgrade: websocket","Connection: Upgrade",`Sec-WebSocket-Accept: ${Us("sha1").update(t+Ms).digest("base64")}`],c=new this.options.WebSocket(null);if(r.size){const h=this.options.handleProtocols?this.options.handleProtocols(r,i):r.values().next().value;h&&(a.push(`Sec-WebSocket-Protocol: ${h}`),c._protocol=h)}if(e[R.extensionName]){const h=e[R.extensionName].params,p=He.format({[R.extensionName]:[h]});a.push(`Sec-WebSocket-Extensions: ${p}`),c._extensions=e}this.emit("headers",a,i),n.write(a.concat(`\r
-`).join(`\r
-`)),n.removeListener("error",qe),c.setSocket(n,o,{maxPayload:this.options.maxPayload,skipUTF8Validation:this.options.skipUTF8Validation}),this.clients&&(this.clients.add(c),c.on("close",()=>{this.clients.delete(c),this._shouldEmitClose&&!this.clients.size&&process.nextTick(H,this)})),l(c,i)}}var As=Ws;function Fs(s,e){for(const t of Object.keys(e))s.on(t,e[t]);return function(){for(const r of Object.keys(e))s.removeListener(r,e[r])}}function H(s){s._state=ut,s.emit("close")}function qe(){this.destroy()}function Y(s,e,t,r){t=t||re.STATUS_CODES[e],r={Connection:"close","Content-Type":"text/html","Content-Length":Buffer.byteLength(t),...r},s.once("finish",s.destroy),s.end(`HTTP/1.1 ${e} ${re.STATUS_CODES[e]}\r
-`+Object.keys(r).map(i=>`${i}: ${r[i]}`).join(`\r
-`)+`\r
-\r
-`+t)}function B(s,e,t,r,i){if(s.listenerCount("wsClientError")){const n=new Error(i);Error.captureStackTrace(n,B),s.emit("wsClientError",n,t,e)}else Y(t,r,i)}const Xs=As;export{Ys as Receiver,qs as Sender,Ks as WebSocket,Xs as WebSocketServer,Gs as createWebSocketStream,Ks as default};
-//# sourceMappingURL=wrapper-b7460963-69b64cfb.js.map
diff --git a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/backends/backend_webagg_core.py b/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/backends/backend_webagg_core.py
deleted file mode 100644
index 57cfa311b8f74989ff7e9e15690e2f6739cf2b83..0000000000000000000000000000000000000000
--- a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/backends/backend_webagg_core.py
+++ /dev/null
@@ -1,517 +0,0 @@
-"""
-Displays Agg images in the browser, with interactivity
-"""
-# The WebAgg backend is divided into two modules:
-#
-# - `backend_webagg_core.py` contains code necessary to embed a WebAgg
-# plot inside of a web application, and communicate in an abstract
-# way over a web socket.
-#
-# - `backend_webagg.py` contains a concrete implementation of a basic
-# application, implemented with asyncio.
-
-import asyncio
-import datetime
-from io import BytesIO, StringIO
-import json
-import logging
-import os
-from pathlib import Path
-
-import numpy as np
-from PIL import Image
-
-from matplotlib import _api, backend_bases, backend_tools
-from matplotlib.backends import backend_agg
-from matplotlib.backend_bases import (
- _Backend, KeyEvent, LocationEvent, MouseEvent, ResizeEvent)
-
-_log = logging.getLogger(__name__)
-
-_SPECIAL_KEYS_LUT = {'Alt': 'alt',
- 'AltGraph': 'alt',
- 'CapsLock': 'caps_lock',
- 'Control': 'control',
- 'Meta': 'meta',
- 'NumLock': 'num_lock',
- 'ScrollLock': 'scroll_lock',
- 'Shift': 'shift',
- 'Super': 'super',
- 'Enter': 'enter',
- 'Tab': 'tab',
- 'ArrowDown': 'down',
- 'ArrowLeft': 'left',
- 'ArrowRight': 'right',
- 'ArrowUp': 'up',
- 'End': 'end',
- 'Home': 'home',
- 'PageDown': 'pagedown',
- 'PageUp': 'pageup',
- 'Backspace': 'backspace',
- 'Delete': 'delete',
- 'Insert': 'insert',
- 'Escape': 'escape',
- 'Pause': 'pause',
- 'Select': 'select',
- 'Dead': 'dead',
- 'F1': 'f1',
- 'F2': 'f2',
- 'F3': 'f3',
- 'F4': 'f4',
- 'F5': 'f5',
- 'F6': 'f6',
- 'F7': 'f7',
- 'F8': 'f8',
- 'F9': 'f9',
- 'F10': 'f10',
- 'F11': 'f11',
- 'F12': 'f12'}
-
-
-def _handle_key(key):
- """Handle key values"""
- value = key[key.index('k') + 1:]
- if 'shift+' in key:
- if len(value) == 1:
- key = key.replace('shift+', '')
- if value in _SPECIAL_KEYS_LUT:
- value = _SPECIAL_KEYS_LUT[value]
- key = key[:key.index('k')] + value
- return key
-
-
-class TimerTornado(backend_bases.TimerBase):
- def __init__(self, *args, **kwargs):
- self._timer = None
- super().__init__(*args, **kwargs)
-
- def _timer_start(self):
- import tornado
-
- self._timer_stop()
- if self._single:
- ioloop = tornado.ioloop.IOLoop.instance()
- self._timer = ioloop.add_timeout(
- datetime.timedelta(milliseconds=self.interval),
- self._on_timer)
- else:
- self._timer = tornado.ioloop.PeriodicCallback(
- self._on_timer,
- max(self.interval, 1e-6))
- self._timer.start()
-
- def _timer_stop(self):
- import tornado
-
- if self._timer is None:
- return
- elif self._single:
- ioloop = tornado.ioloop.IOLoop.instance()
- ioloop.remove_timeout(self._timer)
- else:
- self._timer.stop()
- self._timer = None
-
- def _timer_set_interval(self):
- # Only stop and restart it if the timer has already been started
- if self._timer is not None:
- self._timer_stop()
- self._timer_start()
-
-
-class TimerAsyncio(backend_bases.TimerBase):
- def __init__(self, *args, **kwargs):
- self._task = None
- super().__init__(*args, **kwargs)
-
- async def _timer_task(self, interval):
- while True:
- try:
- await asyncio.sleep(interval)
- self._on_timer()
-
- if self._single:
- break
- except asyncio.CancelledError:
- break
-
- def _timer_start(self):
- self._timer_stop()
-
- self._task = asyncio.ensure_future(
- self._timer_task(max(self.interval / 1_000., 1e-6))
- )
-
- def _timer_stop(self):
- if self._task is not None:
- self._task.cancel()
- self._task = None
-
- def _timer_set_interval(self):
- # Only stop and restart it if the timer has already been started
- if self._task is not None:
- self._timer_stop()
- self._timer_start()
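The `TimerAsyncio` class above drives repeated callbacks with a cancellable task that sleeps between invocations. A minimal standalone sketch of that pattern (the `periodic` coroutine is a hypothetical helper for illustration, not part of Matplotlib's API):

```python
import asyncio

async def periodic(interval, callback, single=False):
    """Call *callback* every *interval* seconds until cancelled.

    Mirrors TimerAsyncio._timer_task: cancellation is absorbed via
    CancelledError, and *single* makes it a one-shot timer.
    """
    try:
        while True:
            await asyncio.sleep(interval)
            callback()
            if single:
                break
    except asyncio.CancelledError:
        pass  # a cancelled timer simply stops

ticks = []
# one-shot timer: fires once after 10 ms, then returns
asyncio.run(periodic(0.01, lambda: ticks.append(1), single=True))
```

Wrapping the loop body in `try/except asyncio.CancelledError` is what makes `task.cancel()` in `_timer_stop` a clean shutdown rather than an unhandled exception.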
-
-
-class FigureCanvasWebAggCore(backend_agg.FigureCanvasAgg):
- manager_class = _api.classproperty(lambda cls: FigureManagerWebAgg)
- _timer_cls = TimerAsyncio
-    # WebAgg and friends have the right methods, but still
-    # have bugs in practice.  Do not advertise that it works until
-    # we can debug this.
- supports_blit = False
-
- def __init__(self, *args, **kwargs):
- super().__init__(*args, **kwargs)
- # Set to True when the renderer contains data that is newer
- # than the PNG buffer.
- self._png_is_old = True
- # Set to True by the `refresh` message so that the next frame
- # sent to the clients will be a full frame.
- self._force_full = True
- # The last buffer, for diff mode.
- self._last_buff = np.empty((0, 0))
- # Store the current image mode so that at any point, clients can
- # request the information. This should be changed by calling
- # self.set_image_mode(mode) so that the notification can be given
- # to the connected clients.
- self._current_image_mode = 'full'
- # Track mouse events to fill in the x, y position of key events.
- self._last_mouse_xy = (None, None)
-
- def show(self):
- # show the figure window
- from matplotlib.pyplot import show
- show()
-
- def draw(self):
- self._png_is_old = True
- try:
- super().draw()
- finally:
- self.manager.refresh_all() # Swap the frames.
-
- def blit(self, bbox=None):
- self._png_is_old = True
- self.manager.refresh_all()
-
- def draw_idle(self):
- self.send_event("draw")
-
- def set_cursor(self, cursor):
- # docstring inherited
- cursor = _api.check_getitem({
- backend_tools.Cursors.HAND: 'pointer',
- backend_tools.Cursors.POINTER: 'default',
- backend_tools.Cursors.SELECT_REGION: 'crosshair',
- backend_tools.Cursors.MOVE: 'move',
- backend_tools.Cursors.WAIT: 'wait',
- backend_tools.Cursors.RESIZE_HORIZONTAL: 'ew-resize',
- backend_tools.Cursors.RESIZE_VERTICAL: 'ns-resize',
- }, cursor=cursor)
- self.send_event('cursor', cursor=cursor)
-
- def set_image_mode(self, mode):
- """
- Set the image mode for any subsequent images which will be sent
- to the clients. The modes may currently be either 'full' or 'diff'.
-
- Note: diff images may not contain transparency, therefore upon
- draw this mode may be changed if the resulting image has any
- transparent component.
- """
- _api.check_in_list(['full', 'diff'], mode=mode)
- if self._current_image_mode != mode:
- self._current_image_mode = mode
- self.handle_send_image_mode(None)
-
- def get_diff_image(self):
- if self._png_is_old:
- renderer = self.get_renderer()
-
- pixels = np.asarray(renderer.buffer_rgba())
- # The buffer is created as type uint32 so that entire
- # pixels can be compared in one numpy call, rather than
- # needing to compare each plane separately.
- buff = pixels.view(np.uint32).squeeze(2)
-
- if (self._force_full
- # If the buffer has changed size we need to do a full draw.
- or buff.shape != self._last_buff.shape
- # If any pixels have transparency, we need to force a full
- # draw as we cannot overlay new on top of old.
- or (pixels[:, :, 3] != 255).any()):
- self.set_image_mode('full')
- output = buff
- else:
- self.set_image_mode('diff')
- diff = buff != self._last_buff
- output = np.where(diff, buff, 0)
-
- # Store the current buffer so we can compute the next diff.
- self._last_buff = buff.copy()
- self._force_full = False
- self._png_is_old = False
-
- data = output.view(dtype=np.uint8).reshape((*output.shape, 4))
- with BytesIO() as png:
- Image.fromarray(data).save(png, format="png")
- return png.getvalue()
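The diff-frame trick in `get_diff_image` above views each RGBA buffer as `uint32` so a whole pixel compares in one operation, then zeroes unchanged pixels (making them fully transparent) so the client can overlay the diff on the previous frame. A minimal sketch of that idea, assuming contiguous `uint8` RGBA buffers (`diff_frame` is a hypothetical helper, not Matplotlib API):

```python
import numpy as np

def diff_frame(prev, curr):
    """Return *curr* with pixels unchanged since *prev* zeroed out.

    Both inputs are (H, W, 4) uint8 RGBA arrays. Viewing as uint32
    compares all four channels at once; zeroed pixels have alpha 0,
    so the client overlays the result on top of the previous frame.
    """
    prev32 = prev.view(np.uint32).squeeze(-1)
    curr32 = curr.view(np.uint32).squeeze(-1)
    changed = curr32 != prev32
    out = np.where(changed, curr32, np.uint32(0))
    return out.view(np.uint8).reshape(curr.shape)

# toy 2x2 canvas: one pixel turns red between frames
a = np.zeros((2, 2, 4), dtype=np.uint8)
b = a.copy()
b[0, 0] = [255, 0, 0, 255]
d = diff_frame(a, b)
```

This is also why the real method falls back to a full frame whenever any pixel is translucent: a transparent "changed" pixel cannot be distinguished from a zeroed "unchanged" one when compositing.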
-
- def handle_event(self, event):
- e_type = event['type']
- handler = getattr(self, 'handle_{0}'.format(e_type),
- self.handle_unknown_event)
- return handler(event)
-
- def handle_unknown_event(self, event):
- _log.warning('Unhandled message type {0}. {1}'.format(
- event['type'], event))
-
- def handle_ack(self, event):
- # Network latency tends to decrease if traffic is flowing
- # in both directions. Therefore, the browser sends back
- # an "ack" message after each image frame is received.
- # This could also be used as a simple sanity check in the
- # future, but for now the performance increase is enough
- # to justify it, even if the server does nothing with it.
- pass
-
- def handle_draw(self, event):
- self.draw()
-
- def _handle_mouse(self, event):
- x = event['x']
- y = event['y']
- y = self.get_renderer().height - y
- self._last_mouse_xy = x, y
- # JavaScript button numbers and Matplotlib button numbers are off by 1.
- button = event['button'] + 1
-
- e_type = event['type']
- modifiers = event['modifiers']
- guiEvent = event.get('guiEvent')
- if e_type in ['button_press', 'button_release']:
- MouseEvent(e_type + '_event', self, x, y, button,
- modifiers=modifiers, guiEvent=guiEvent)._process()
- elif e_type == 'dblclick':
- MouseEvent('button_press_event', self, x, y, button, dblclick=True,
- modifiers=modifiers, guiEvent=guiEvent)._process()
- elif e_type == 'scroll':
- MouseEvent('scroll_event', self, x, y, step=event['step'],
- modifiers=modifiers, guiEvent=guiEvent)._process()
- elif e_type == 'motion_notify':
- MouseEvent(e_type + '_event', self, x, y,
- modifiers=modifiers, guiEvent=guiEvent)._process()
- elif e_type in ['figure_enter', 'figure_leave']:
- LocationEvent(e_type + '_event', self, x, y,
- modifiers=modifiers, guiEvent=guiEvent)._process()
- handle_button_press = handle_button_release = handle_dblclick = \
- handle_figure_enter = handle_figure_leave = handle_motion_notify = \
- handle_scroll = _handle_mouse
-
- def _handle_key(self, event):
- KeyEvent(event['type'] + '_event', self,
- _handle_key(event['key']), *self._last_mouse_xy,
- guiEvent=event.get('guiEvent'))._process()
- handle_key_press = handle_key_release = _handle_key
-
- def handle_toolbar_button(self, event):
- # TODO: Be more suspicious of the input
- getattr(self.toolbar, event['name'])()
-
- def handle_refresh(self, event):
- figure_label = self.figure.get_label()
- if not figure_label:
- figure_label = "Figure {0}".format(self.manager.num)
- self.send_event('figure_label', label=figure_label)
- self._force_full = True
- if self.toolbar:
- # Normal toolbar init would refresh this, but it happens before the
- # browser canvas is set up.
- self.toolbar.set_history_buttons()
- self.draw_idle()
-
- def handle_resize(self, event):
- x = int(event.get('width', 800)) * self.device_pixel_ratio
- y = int(event.get('height', 800)) * self.device_pixel_ratio
- fig = self.figure
- # An attempt at approximating the figure size in pixels.
- fig.set_size_inches(x / fig.dpi, y / fig.dpi, forward=False)
- # Acknowledge the resize, and force the viewer to update the
- # canvas size to the figure's new size (which is hopefully
- # identical or within a pixel or so).
- self._png_is_old = True
- self.manager.resize(*fig.bbox.size, forward=False)
- ResizeEvent('resize_event', self)._process()
- self.draw_idle()
-
- def handle_send_image_mode(self, event):
- # The client requests notification of what the current image mode is.
- self.send_event('image_mode', mode=self._current_image_mode)
-
- def handle_set_device_pixel_ratio(self, event):
- self._handle_set_device_pixel_ratio(event.get('device_pixel_ratio', 1))
-
- def handle_set_dpi_ratio(self, event):
- # This handler is for backwards-compatibility with older ipympl.
- self._handle_set_device_pixel_ratio(event.get('dpi_ratio', 1))
-
- def _handle_set_device_pixel_ratio(self, device_pixel_ratio):
- if self._set_device_pixel_ratio(device_pixel_ratio):
- self._force_full = True
- self.draw_idle()
-
- def send_event(self, event_type, **kwargs):
- if self.manager:
- self.manager._send_event(event_type, **kwargs)
-
-
-_ALLOWED_TOOL_ITEMS = {
- 'home',
- 'back',
- 'forward',
- 'pan',
- 'zoom',
- 'download',
- None,
-}
-
-
-class NavigationToolbar2WebAgg(backend_bases.NavigationToolbar2):
-
- # Use the standard toolbar items + download button
- toolitems = [
- (text, tooltip_text, image_file, name_of_method)
- for text, tooltip_text, image_file, name_of_method
- in (*backend_bases.NavigationToolbar2.toolitems,
- ('Download', 'Download plot', 'filesave', 'download'))
- if name_of_method in _ALLOWED_TOOL_ITEMS
- ]
-
- def __init__(self, canvas):
- self.message = ''
- super().__init__(canvas)
-
- def set_message(self, message):
- if message != self.message:
- self.canvas.send_event("message", message=message)
- self.message = message
-
- def draw_rubberband(self, event, x0, y0, x1, y1):
- self.canvas.send_event("rubberband", x0=x0, y0=y0, x1=x1, y1=y1)
-
- def remove_rubberband(self):
- self.canvas.send_event("rubberband", x0=-1, y0=-1, x1=-1, y1=-1)
-
- def save_figure(self, *args):
- """Save the current figure"""
- self.canvas.send_event('save')
-
- def pan(self):
- super().pan()
- self.canvas.send_event('navigate_mode', mode=self.mode.name)
-
- def zoom(self):
- super().zoom()
- self.canvas.send_event('navigate_mode', mode=self.mode.name)
-
- def set_history_buttons(self):
- can_backward = self._nav_stack._pos > 0
- can_forward = self._nav_stack._pos < len(self._nav_stack._elements) - 1
- self.canvas.send_event('history_buttons',
- Back=can_backward, Forward=can_forward)
-
-
-class FigureManagerWebAgg(backend_bases.FigureManagerBase):
- # This must be None to not break ipympl
- _toolbar2_class = None
- ToolbarCls = NavigationToolbar2WebAgg
-
- def __init__(self, canvas, num):
- self.web_sockets = set()
- super().__init__(canvas, num)
-
- def show(self):
- pass
-
- def resize(self, w, h, forward=True):
- self._send_event(
- 'resize',
- size=(w / self.canvas.device_pixel_ratio,
- h / self.canvas.device_pixel_ratio),
- forward=forward)
-
- def set_window_title(self, title):
- self._send_event('figure_label', label=title)
-
- # The following methods are specific to FigureManagerWebAgg
-
- def add_web_socket(self, web_socket):
- assert hasattr(web_socket, 'send_binary')
- assert hasattr(web_socket, 'send_json')
- self.web_sockets.add(web_socket)
- self.resize(*self.canvas.figure.bbox.size)
- self._send_event('refresh')
-
- def remove_web_socket(self, web_socket):
- self.web_sockets.remove(web_socket)
-
- def handle_json(self, content):
- self.canvas.handle_event(content)
-
- def refresh_all(self):
- if self.web_sockets:
- diff = self.canvas.get_diff_image()
- if diff is not None:
- for s in self.web_sockets:
- s.send_binary(diff)
-
- @classmethod
- def get_javascript(cls, stream=None):
- if stream is None:
- output = StringIO()
- else:
- output = stream
-
- output.write((Path(__file__).parent / "web_backend/js/mpl.js")
- .read_text(encoding="utf-8"))
-
- toolitems = []
- for name, tooltip, image, method in cls.ToolbarCls.toolitems:
- if name is None:
- toolitems.append(['', '', '', ''])
- else:
- toolitems.append([name, tooltip, image, method])
- output.write("mpl.toolbar_items = {0};\n\n".format(
- json.dumps(toolitems)))
-
- extensions = []
- for filetype, ext in sorted(FigureCanvasWebAggCore.
- get_supported_filetypes_grouped().
- items()):
- extensions.append(ext[0])
- output.write("mpl.extensions = {0};\n\n".format(
- json.dumps(extensions)))
-
- output.write("mpl.default_extension = {0};".format(
- json.dumps(FigureCanvasWebAggCore.get_default_filetype())))
-
- if stream is None:
- return output.getvalue()
-
- @classmethod
- def get_static_file_path(cls):
- return os.path.join(os.path.dirname(__file__), 'web_backend')
-
- def _send_event(self, event_type, **kwargs):
- payload = {'type': event_type, **kwargs}
- for s in self.web_sockets:
- s.send_json(payload)
-
-
-@_Backend.export
-class _BackendWebAggCoreAgg(_Backend):
- FigureCanvas = FigureCanvasWebAggCore
- FigureManager = FigureManagerWebAgg
diff --git a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/offsetbox.py b/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/offsetbox.py
deleted file mode 100644
index 8ad9806c2ec8cf6bd0c74ffb74d5cb40c2ec162b..0000000000000000000000000000000000000000
--- a/spaces/ky2k/Toxicity_Classifier_POC/.venv/lib/python3.9/site-packages/matplotlib/offsetbox.py
+++ /dev/null
@@ -1,1623 +0,0 @@
-r"""
-Container classes for `.Artist`\s.
-
-`OffsetBox`
- The base of all container artists defined in this module.
-
-`AnchoredOffsetbox`, `AnchoredText`
- Anchor and align an arbitrary `.Artist` or a text relative to the parent
- axes or a specific anchor point.
-
-`DrawingArea`
- A container with fixed width and height. Children have a fixed position
- inside the container and may be clipped.
-
-`HPacker`, `VPacker`
-    Containers for laying out their children vertically or horizontally.
-
-`PaddedBox`
- A container to add a padding around an `.Artist`.
-
-`TextArea`
- Contains a single `.Text` instance.
-"""
-
-import functools
-
-import numpy as np
-
-import matplotlib as mpl
-from matplotlib import _api, _docstring
-import matplotlib.artist as martist
-import matplotlib.path as mpath
-import matplotlib.text as mtext
-import matplotlib.transforms as mtransforms
-from matplotlib.font_manager import FontProperties
-from matplotlib.image import BboxImage
-from matplotlib.patches import (
- FancyBboxPatch, FancyArrowPatch, bbox_artist as mbbox_artist)
-from matplotlib.transforms import Bbox, BboxBase, TransformedBbox
-
-
-DEBUG = False
-
-
-def _compat_get_offset(meth):
- """
- Decorator for the get_offset method of OffsetBox and subclasses, that
- allows supporting both the new signature (self, bbox, renderer) and the old
- signature (self, width, height, xdescent, ydescent, renderer).
- """
- sigs = [lambda self, width, height, xdescent, ydescent, renderer: locals(),
- lambda self, bbox, renderer: locals()]
-
- @functools.wraps(meth)
- def get_offset(self, *args, **kwargs):
- params = _api.select_matching_signature(sigs, self, *args, **kwargs)
- bbox = (params["bbox"] if "bbox" in params else
- Bbox.from_bounds(-params["xdescent"], -params["ydescent"],
- params["width"], params["height"]))
- return meth(params["self"], bbox, params["renderer"])
- return get_offset
-
-
-@_api.deprecated("3.7", alternative='patches.bbox_artist')
-def bbox_artist(*args, **kwargs):
- if DEBUG:
- mbbox_artist(*args, **kwargs)
-
-
-# for debugging use
-def _bbox_artist(*args, **kwargs):
- if DEBUG:
- mbbox_artist(*args, **kwargs)
-
-
-def _get_packed_offsets(widths, total, sep, mode="fixed"):
- r"""
- Pack boxes specified by their *widths*.
-
- For simplicity of the description, the terminology used here assumes a
- horizontal layout, but the function works equally for a vertical layout.
-
- There are three packing *mode*\s:
-
- - 'fixed': The elements are packed tight to the left with a spacing of
- *sep* in between. If *total* is *None* the returned total will be the
- right edge of the last box. A non-*None* total will be passed unchecked
- to the output. In particular this means that right edge of the last
- box may be further to the right than the returned total.
-
- - 'expand': Distribute the boxes with equal spacing so that the left edge
- of the first box is at 0, and the right edge of the last box is at
- *total*. The parameter *sep* is ignored in this mode. A total of *None*
- is accepted and considered equal to 1. The total is returned unchanged
- (except for the conversion *None* to 1). If the total is smaller than
- the sum of the widths, the laid out boxes will overlap.
-
- - 'equal': If *total* is given, the total space is divided in N equal
- ranges and each box is left-aligned within its subspace.
- Otherwise (*total* is *None*), *sep* must be provided and each box is
- left-aligned in its subspace of width ``(max(widths) + sep)``. The
- total width is then calculated to be ``N * (max(widths) + sep)``.
-
- Parameters
- ----------
- widths : list of float
- Widths of boxes to be packed.
- total : float or None
- Intended total length. *None* if not used.
- sep : float
- Spacing between boxes.
- mode : {'fixed', 'expand', 'equal'}
- The packing mode.
-
- Returns
- -------
- total : float
- The total width needed to accommodate the laid out boxes.
- offsets : array of float
- The left offsets of the boxes.
- """
- _api.check_in_list(["fixed", "expand", "equal"], mode=mode)
-
- if mode == "fixed":
- offsets_ = np.cumsum([0] + [w + sep for w in widths])
- offsets = offsets_[:-1]
- if total is None:
- total = offsets_[-1] - sep
- return total, offsets
-
- elif mode == "expand":
- # This is a bit of a hack to avoid a TypeError when *total*
-        # is None and used in conjunction with tight layout.
- if total is None:
- total = 1
- if len(widths) > 1:
- sep = (total - sum(widths)) / (len(widths) - 1)
- else:
- sep = 0
- offsets_ = np.cumsum([0] + [w + sep for w in widths])
- offsets = offsets_[:-1]
- return total, offsets
-
- elif mode == "equal":
- maxh = max(widths)
- if total is None:
- if sep is None:
- raise ValueError("total and sep cannot both be None when "
- "using layout mode 'equal'")
- total = (maxh + sep) * len(widths)
- else:
- sep = total / len(widths) - maxh
- offsets = (maxh + sep) * np.arange(len(widths))
- return total, offsets
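The three packing modes documented above reduce to a cumulative-sum over widths plus a mode-specific choice of separator. A standalone sketch mirroring the private helper (`packed_offsets` is an illustrative reimplementation under the docstring's assumptions, not a public Matplotlib API):

```python
import numpy as np

def packed_offsets(widths, total, sep, mode="fixed"):
    """Return (total, left_offsets) for the 'fixed'/'expand'/'equal' modes."""
    if mode == "fixed":
        # pack tight to the left with *sep* between boxes
        offsets = np.cumsum([0] + [w + sep for w in widths])
        if total is None:
            total = offsets[-1] - sep
        return total, offsets[:-1]
    if mode == "expand":
        # recompute sep so the boxes exactly span *total*
        total = 1 if total is None else total
        sep = (total - sum(widths)) / (len(widths) - 1) if len(widths) > 1 else 0
        offsets = np.cumsum([0] + [w + sep for w in widths])
        return total, offsets[:-1]
    if mode == "equal":
        # divide the space into N equal cells, left-align within each
        cell = max(widths) + sep if total is None else total / len(widths)
        return cell * len(widths), cell * np.arange(len(widths))
    raise ValueError(mode)

# three boxes of widths 1, 2, 3 packed tight with sep=0.5
total, offs = packed_offsets([1, 2, 3], None, 0.5, mode="fixed")
```

Note the 'fixed'-mode subtlety from the docstring: a caller-supplied *total* is passed through unchecked, so the last box may extend past it.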
-
-
-def _get_aligned_offsets(yspans, height, align="baseline"):
- """
- Align boxes each specified by their ``(y0, y1)`` spans.
-
- For simplicity of the description, the terminology used here assumes a
- horizontal layout (i.e., vertical alignment), but the function works
- equally for a vertical layout.
-
- Parameters
- ----------
- yspans
- List of (y0, y1) spans of boxes to be aligned.
- height : float or None
- Intended total height. If None, the maximum of the heights
- (``y1 - y0``) in *yspans* is used.
- align : {'baseline', 'left', 'top', 'right', 'bottom', 'center'}
- The alignment anchor of the boxes.
-
- Returns
- -------
- (y0, y1)
- y range spanned by the packing. If a *height* was originally passed
- in, then for all alignments other than "baseline", a span of ``(0,
-        height)`` is used without checking that it is actually large enough.
- descent
- The descent of the packing.
- offsets
- The bottom offsets of the boxes.
- """
-
- _api.check_in_list(
- ["baseline", "left", "top", "right", "bottom", "center"], align=align)
- if height is None:
- height = max(y1 - y0 for y0, y1 in yspans)
-
- if align == "baseline":
- yspan = (min(y0 for y0, y1 in yspans), max(y1 for y0, y1 in yspans))
- offsets = [0] * len(yspans)
- elif align in ["left", "bottom"]:
- yspan = (0, height)
- offsets = [-y0 for y0, y1 in yspans]
- elif align in ["right", "top"]:
- yspan = (0, height)
- offsets = [height - y1 for y0, y1 in yspans]
- elif align == "center":
- yspan = (0, height)
- offsets = [(height - (y1 - y0)) * .5 - y0 for y0, y1 in yspans]
-
- return yspan, offsets
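Each alignment branch above is a one-line formula for the bottom offset of a box within the container height. A standalone sketch of just the offset rules (an illustrative reimplementation, not a public Matplotlib API):

```python
def aligned_offsets(yspans, height, align="center"):
    """Bottom offsets aligning (y0, y1) spans within *height*."""
    if height is None:
        height = max(y1 - y0 for y0, y1 in yspans)
    if align == "baseline":
        return [0.0] * len(yspans)            # keep original baselines
    if align in ("left", "bottom"):
        return [-y0 for y0, y1 in yspans]     # push y0 to the container's 0
    if align in ("right", "top"):
        return [height - y1 for y0, y1 in yspans]
    if align == "center":
        return [(height - (y1 - y0)) * 0.5 - y0 for y0, y1 in yspans]
    raise ValueError(align)

# center two boxes of heights 2 and 4 in a height-4 container
offs = aligned_offsets([(0, 2), (0, 4)], 4, "center")
```

Only "baseline" preserves the boxes' own coordinate origins; every other mode rebases the packing to the `(0, height)` span, which is why the real helper returns that span without a size check.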
-
-
-class OffsetBox(martist.Artist):
- """
- The OffsetBox is a simple container artist.
-
- The child artists are meant to be drawn at a relative position to its
- parent.
-
- Being an artist itself, all parameters are passed on to `.Artist`.
- """
- def __init__(self, *args, **kwargs):
- super().__init__(*args)
- self._internal_update(kwargs)
- # Clipping has not been implemented in the OffsetBox family, so
- # disable the clip flag for consistency. It can always be turned back
- # on to zero effect.
- self.set_clip_on(False)
- self._children = []
- self._offset = (0, 0)
-
- def set_figure(self, fig):
- """
- Set the `.Figure` for the `.OffsetBox` and all its children.
-
- Parameters
- ----------
- fig : `~matplotlib.figure.Figure`
- """
- super().set_figure(fig)
- for c in self.get_children():
- c.set_figure(fig)
-
- @martist.Artist.axes.setter
- def axes(self, ax):
- # TODO deal with this better
- martist.Artist.axes.fset(self, ax)
- for c in self.get_children():
- if c is not None:
- c.axes = ax
-
- def contains(self, mouseevent):
- """
- Delegate the mouse event contains-check to the children.
-
- As a container, the `.OffsetBox` does not respond itself to
- mouseevents.
-
- Parameters
- ----------
- mouseevent : `matplotlib.backend_bases.MouseEvent`
-
- Returns
- -------
- contains : bool
- Whether any values are within the radius.
- details : dict
- An artist-specific dictionary of details of the event context,
- such as which points are contained in the pick radius. See the
- individual Artist subclasses for details.
-
- See Also
- --------
- .Artist.contains
- """
- inside, info = self._default_contains(mouseevent)
- if inside is not None:
- return inside, info
- for c in self.get_children():
- a, b = c.contains(mouseevent)
- if a:
- return a, b
- return False, {}
-
- def set_offset(self, xy):
- """
- Set the offset.
-
- Parameters
- ----------
- xy : (float, float) or callable
- The (x, y) coordinates of the offset in display units. These can
- either be given explicitly as a tuple (x, y), or by providing a
- function that converts the extent into the offset. This function
- must have the signature::
-
- def offset(width, height, xdescent, ydescent, renderer) \
--> (float, float)
- """
- self._offset = xy
- self.stale = True
-
- @_compat_get_offset
- def get_offset(self, bbox, renderer):
- """
- Return the offset as a tuple (x, y).
-
- The extent parameters have to be provided to handle the case where the
- offset is dynamically determined by a callable (see
- `~.OffsetBox.set_offset`).
-
- Parameters
- ----------
- bbox : `.Bbox`
- renderer : `.RendererBase` subclass
- """
- return (
- self._offset(bbox.width, bbox.height, -bbox.x0, -bbox.y0, renderer)
- if callable(self._offset)
- else self._offset)
-
- def set_width(self, width):
- """
- Set the width of the box.
-
- Parameters
- ----------
- width : float
- """
- self.width = width
- self.stale = True
-
- def set_height(self, height):
- """
- Set the height of the box.
-
- Parameters
- ----------
- height : float
- """
- self.height = height
- self.stale = True
-
- def get_visible_children(self):
- r"""Return a list of the visible child `.Artist`\s."""
- return [c for c in self._children if c.get_visible()]
-
- def get_children(self):
- r"""Return a list of the child `.Artist`\s."""
- return self._children
-
- def _get_bbox_and_child_offsets(self, renderer):
- """
- Return the bbox of the offsetbox and the child offsets.
-
- The bbox should satisfy ``x0 <= x1 and y0 <= y1``.
-
- Parameters
- ----------
- renderer : `.RendererBase` subclass
-
- Returns
- -------
- bbox
- list of (xoffset, yoffset) pairs
- """
-        raise NotImplementedError(
-            "_get_bbox_and_child_offsets must be overridden in derived classes")
-
- def get_bbox(self, renderer):
- """Return the bbox of the offsetbox, ignoring parent offsets."""
- bbox, offsets = self._get_bbox_and_child_offsets(renderer)
- return bbox
-
- @_api.deprecated("3.7", alternative="get_bbox and child.get_offset")
- def get_extent_offsets(self, renderer):
- """
- Update offset of the children and return the extent of the box.
-
- Parameters
- ----------
- renderer : `.RendererBase` subclass
-
- Returns
- -------
- width
- height
- xdescent
- ydescent
- list of (xoffset, yoffset) pairs
- """
- bbox, offsets = self._get_bbox_and_child_offsets(renderer)
- return bbox.width, bbox.height, -bbox.x0, -bbox.y0, offsets
-
- @_api.deprecated("3.7", alternative="get_bbox")
- def get_extent(self, renderer):
- """Return a tuple ``width, height, xdescent, ydescent`` of the box."""
- bbox = self.get_bbox(renderer)
- return bbox.width, bbox.height, -bbox.x0, -bbox.y0
-
- def get_window_extent(self, renderer=None):
- # docstring inherited
- if renderer is None:
- renderer = self.figure._get_renderer()
- bbox = self.get_bbox(renderer)
- try: # Some subclasses redefine get_offset to take no args.
- px, py = self.get_offset(bbox, renderer)
- except TypeError:
- px, py = self.get_offset()
- return bbox.translated(px, py)
-
- def draw(self, renderer):
- """
- Update the location of children if necessary and draw them
- to the given *renderer*.
- """
- bbox, offsets = self._get_bbox_and_child_offsets(renderer)
- px, py = self.get_offset(bbox, renderer)
- for c, (ox, oy) in zip(self.get_visible_children(), offsets):
- c.set_offset((px + ox, py + oy))
- c.draw(renderer)
- _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))
- self.stale = False
-
-
-class PackerBase(OffsetBox):
- def __init__(self, pad=0., sep=0., width=None, height=None,
- align="baseline", mode="fixed", children=None):
- """
- Parameters
- ----------
- pad : float, default: 0.0
- The boundary padding in points.
-
- sep : float, default: 0.0
- The spacing between items in points.
-
- width, height : float, optional
- Width and height of the container box in pixels, calculated if
- *None*.
-
- align : {'top', 'bottom', 'left', 'right', 'center', 'baseline'}, \
-default: 'baseline'
- Alignment of boxes.
-
- mode : {'fixed', 'expand', 'equal'}, default: 'fixed'
- The packing mode.
-
- - 'fixed' packs the given `.Artist`\\s tight with *sep* spacing.
- - 'expand' uses the maximal available space to distribute the
- artists with equal spacing in between.
-            - 'equal': Each artist gets an equal fraction of the available
-              space and is left-aligned (or top-aligned) therein.
-
- children : list of `.Artist`
- The artists to pack.
-
- Notes
- -----
- *pad* and *sep* are in points and will be scaled with the renderer
- dpi, while *width* and *height* are in pixels.
- """
- super().__init__()
- self.height = height
- self.width = width
- self.sep = sep
- self.pad = pad
- self.mode = mode
- self.align = align
- self._children = children
-
-
-class VPacker(PackerBase):
- """
- VPacker packs its children vertically, automatically adjusting their
- relative positions at draw time.
- """
-
- def _get_bbox_and_child_offsets(self, renderer):
- # docstring inherited
- dpicor = renderer.points_to_pixels(1.)
- pad = self.pad * dpicor
- sep = self.sep * dpicor
-
- if self.width is not None:
- for c in self.get_visible_children():
- if isinstance(c, PackerBase) and c.mode == "expand":
- c.set_width(self.width)
-
- bboxes = [c.get_bbox(renderer) for c in self.get_visible_children()]
- (x0, x1), xoffsets = _get_aligned_offsets(
- [bbox.intervalx for bbox in bboxes], self.width, self.align)
- height, yoffsets = _get_packed_offsets(
- [bbox.height for bbox in bboxes], self.height, sep, self.mode)
-
- yoffsets = height - (yoffsets + [bbox.y1 for bbox in bboxes])
- ydescent = yoffsets[0]
- yoffsets = yoffsets - ydescent
-
- return (
- Bbox.from_bounds(x0, -ydescent, x1 - x0, height).padded(pad),
- [*zip(xoffsets, yoffsets)])
-
-
-class HPacker(PackerBase):
- """
- HPacker packs its children horizontally, automatically adjusting their
- relative positions at draw time.
- """
-
- def _get_bbox_and_child_offsets(self, renderer):
- # docstring inherited
- dpicor = renderer.points_to_pixels(1.)
- pad = self.pad * dpicor
- sep = self.sep * dpicor
-
- bboxes = [c.get_bbox(renderer) for c in self.get_visible_children()]
- if not bboxes:
- return Bbox.from_bounds(0, 0, 0, 0).padded(pad), []
-
- (y0, y1), yoffsets = _get_aligned_offsets(
- [bbox.intervaly for bbox in bboxes], self.height, self.align)
- width, xoffsets = _get_packed_offsets(
- [bbox.width for bbox in bboxes], self.width, sep, self.mode)
-
- x0 = bboxes[0].x0
- xoffsets -= ([bbox.x0 for bbox in bboxes] - x0)
-
- return (Bbox.from_bounds(x0, y0, width, y1 - y0).padded(pad),
- [*zip(xoffsets, yoffsets)])
-
-
-class PaddedBox(OffsetBox):
- """
- A container to add a padding around an `.Artist`.
-
- The `.PaddedBox` contains a `.FancyBboxPatch` that is used to visualize
- it when rendering.
- """
-
- @_api.make_keyword_only("3.6", name="draw_frame")
- def __init__(self, child, pad=0., draw_frame=False, patch_attrs=None):
- """
- Parameters
- ----------
- child : `~matplotlib.artist.Artist`
- The contained `.Artist`.
- pad : float, default: 0.0
- The padding in points. This will be scaled with the renderer dpi.
- In contrast, *width* and *height* are in *pixels* and thus not
- scaled.
- draw_frame : bool
- Whether to draw the contained `.FancyBboxPatch`.
- patch_attrs : dict or None
- Additional parameters passed to the contained `.FancyBboxPatch`.
- """
- super().__init__()
- self.pad = pad
- self._children = [child]
- self.patch = FancyBboxPatch(
- xy=(0.0, 0.0), width=1., height=1.,
- facecolor='w', edgecolor='k',
- mutation_scale=1, # self.prop.get_size_in_points(),
- snap=True,
- visible=draw_frame,
- boxstyle="square,pad=0",
- )
- if patch_attrs is not None:
- self.patch.update(patch_attrs)
-
- def _get_bbox_and_child_offsets(self, renderer):
- # docstring inherited.
- pad = self.pad * renderer.points_to_pixels(1.)
- return (self._children[0].get_bbox(renderer).padded(pad), [(0, 0)])
-
- def draw(self, renderer):
- # docstring inherited
- bbox, offsets = self._get_bbox_and_child_offsets(renderer)
- px, py = self.get_offset(bbox, renderer)
- for c, (ox, oy) in zip(self.get_visible_children(), offsets):
- c.set_offset((px + ox, py + oy))
-
- self.draw_frame(renderer)
-
- for c in self.get_visible_children():
- c.draw(renderer)
-
- self.stale = False
-
- def update_frame(self, bbox, fontsize=None):
- self.patch.set_bounds(bbox.bounds)
- if fontsize:
- self.patch.set_mutation_scale(fontsize)
- self.stale = True
-
- def draw_frame(self, renderer):
- # update the location and size of the frame patch
- self.update_frame(self.get_window_extent(renderer))
- self.patch.draw(renderer)
-
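`PaddedBox` can wrap any offset box to add uniform padding and an optional frame. A small sketch (padding value is illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
from matplotlib.offsetbox import TextArea, PaddedBox

# wrap a text box with 5 points of padding and a visible frame
padded = PaddedBox(TextArea("label"), pad=5, draw_frame=True)
```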
-
-class DrawingArea(OffsetBox):
- """
- The DrawingArea can contain any Artist as a child. The DrawingArea
- has a fixed width and height. The position of children relative to
- the parent is fixed. The children can be clipped at the
- boundaries of the parent.
- """
-
- def __init__(self, width, height, xdescent=0., ydescent=0., clip=False):
- """
- Parameters
- ----------
- width, height : float
- Width and height of the container box.
- xdescent, ydescent : float
- Descent of the box in x- and y-direction.
- clip : bool
- Whether to clip the children to the box.
- """
- super().__init__()
- self.width = width
- self.height = height
- self.xdescent = xdescent
- self.ydescent = ydescent
- self._clip_children = clip
- self.offset_transform = mtransforms.Affine2D()
- self.dpi_transform = mtransforms.Affine2D()
-
- @property
- def clip_children(self):
- """
- If the children of this DrawingArea should be clipped
- by DrawingArea bounding box.
- """
- return self._clip_children
-
- @clip_children.setter
- def clip_children(self, val):
- self._clip_children = bool(val)
- self.stale = True
-
- def get_transform(self):
- """
- Return the `~matplotlib.transforms.Transform` applied to the children.
- """
- return self.dpi_transform + self.offset_transform
-
- def set_transform(self, t):
- """
- set_transform is ignored.
- """
-
- def set_offset(self, xy):
- """
- Set the offset of the container.
-
- Parameters
- ----------
- xy : (float, float)
- The (x, y) coordinates of the offset in display units.
- """
- self._offset = xy
- self.offset_transform.clear()
- self.offset_transform.translate(xy[0], xy[1])
- self.stale = True
-
- def get_offset(self):
- """Return offset of the container."""
- return self._offset
-
- def get_bbox(self, renderer):
- # docstring inherited
- dpi_cor = renderer.points_to_pixels(1.)
- return Bbox.from_bounds(
- -self.xdescent * dpi_cor, -self.ydescent * dpi_cor,
- self.width * dpi_cor, self.height * dpi_cor)
-
- def add_artist(self, a):
- """Add an `.Artist` to the container box."""
- self._children.append(a)
- if not a.is_transform_set():
- a.set_transform(self.get_transform())
- if self.axes is not None:
- a.axes = self.axes
- fig = self.figure
- if fig is not None:
- a.set_figure(fig)
-
- def draw(self, renderer):
- # docstring inherited
-
- dpi_cor = renderer.points_to_pixels(1.)
- self.dpi_transform.clear()
- self.dpi_transform.scale(dpi_cor)
-
- # At this point the DrawingArea has a transform
- # to the display space so the path created is
- # good for clipping children
- tpath = mtransforms.TransformedPath(
- mpath.Path([[0, 0], [0, self.height],
- [self.width, self.height],
- [self.width, 0]]),
- self.get_transform())
- for c in self._children:
- if self._clip_children and not (c.clipbox or c._clippath):
- c.set_clip_path(tpath)
- c.draw(renderer)
-
- _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))
- self.stale = False
-
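The fixed-size canvas behavior of `DrawingArea` makes it convenient for placing patches at positions measured from the area's own origin. A minimal sketch (sizes are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
from matplotlib.offsetbox import DrawingArea
from matplotlib.patches import Circle

# a 40x40 point canvas; children are positioned relative to its (0, 0)
da = DrawingArea(40, 40, xdescent=0, ydescent=0)
da.add_artist(Circle((20, 20), radius=10, fc="tab:blue"))
```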
-
-class TextArea(OffsetBox):
- """
- The TextArea is a container artist for a single Text instance.
-
- The text is placed at (0, 0) with baseline+left alignment, by default. The
- width and height of the TextArea instance are the width and height of
- its child text.
- """
-
- @_api.make_keyword_only("3.6", name="textprops")
- def __init__(self, s,
- textprops=None,
- multilinebaseline=False,
- ):
- """
- Parameters
- ----------
- s : str
- The text to be displayed.
- textprops : dict, default: {}
- Dictionary of keyword parameters to be passed to the `.Text`
- instance in the TextArea.
- multilinebaseline : bool, default: False
- Whether the baseline for multiline text is adjusted so that it
- is (approximately) center-aligned with single-line text.
- """
- if textprops is None:
- textprops = {}
- self._text = mtext.Text(0, 0, s, **textprops)
- super().__init__()
- self._children = [self._text]
- self.offset_transform = mtransforms.Affine2D()
- self._baseline_transform = mtransforms.Affine2D()
- self._text.set_transform(self.offset_transform +
- self._baseline_transform)
- self._multilinebaseline = multilinebaseline
-
- def set_text(self, s):
- """Set the text of this area as a string."""
- self._text.set_text(s)
- self.stale = True
-
- def get_text(self):
- """Return the string representation of this area's text."""
- return self._text.get_text()
-
- def set_multilinebaseline(self, t):
- """
- Set multilinebaseline.
-
- If True, the baseline for multiline text is adjusted so that it is
- (approximately) center-aligned with single-line text. This is used
- e.g. by the legend implementation so that single-line labels are
- baseline-aligned, but multiline labels are "center"-aligned with them.
- """
- self._multilinebaseline = t
- self.stale = True
-
- def get_multilinebaseline(self):
- """
- Get multilinebaseline.
- """
- return self._multilinebaseline
-
- def set_transform(self, t):
- """
- set_transform is ignored.
- """
-
- def set_offset(self, xy):
- """
- Set the offset of the container.
-
- Parameters
- ----------
- xy : (float, float)
- The (x, y) coordinates of the offset in display units.
- """
- self._offset = xy
- self.offset_transform.clear()
- self.offset_transform.translate(xy[0], xy[1])
- self.stale = True
-
- def get_offset(self):
- """Return offset of the container."""
- return self._offset
-
- def get_bbox(self, renderer):
- _, h_, d_ = renderer.get_text_width_height_descent(
- "lp", self._text._fontproperties,
- ismath="TeX" if self._text.get_usetex() else False)
-
- bbox, info, yd = self._text._get_layout(renderer)
- w, h = bbox.size
-
- self._baseline_transform.clear()
-
- if len(info) > 1 and self._multilinebaseline:
- yd_new = 0.5 * h - 0.5 * (h_ - d_)
- self._baseline_transform.translate(0, yd - yd_new)
- yd = yd_new
- else: # single line
- h_d = max(h_ - d_, h - yd)
- h = h_d + yd
-
- ha = self._text.get_horizontalalignment()
- x0 = {"left": 0, "center": -w / 2, "right": -w}[ha]
-
- return Bbox.from_bounds(x0, -yd, w, h)
-
- def draw(self, renderer):
- # docstring inherited
- self._text.draw(renderer)
- _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))
- self.stale = False
-
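The `set_text`/`get_text` pair updates the contained `.Text` instance in place, so the area can be re-labeled without rebuilding it. A minimal sketch (strings and text properties are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
from matplotlib.offsetbox import TextArea

ta = TextArea("hello", textprops=dict(color="red", size=12))
ta.set_text("world")  # replaces the displayed string in place
```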
-
-class AuxTransformBox(OffsetBox):
- """
- Offset Box with the aux_transform. Its children will be
- transformed with the aux_transform first then will be
- offsetted. The absolute coordinate of the aux_transform is meaning
- as it will be automatically adjust so that the left-lower corner
- of the bounding box of children will be set to (0, 0) before the
- offset transform.
-
- It is similar to drawing area, except that the extent of the box
- is not predetermined but calculated from the window extent of its
- children. Furthermore, the extent of the children will be
- calculated in the transformed coordinate.
- """
- def __init__(self, aux_transform):
- self.aux_transform = aux_transform
- super().__init__()
- self.offset_transform = mtransforms.Affine2D()
- # ref_offset_transform makes offset_transform always relative to the
- # lower-left corner of the bbox of its children.
- self.ref_offset_transform = mtransforms.Affine2D()
-
- def add_artist(self, a):
- """Add an `.Artist` to the container box."""
- self._children.append(a)
- a.set_transform(self.get_transform())
- self.stale = True
-
- def get_transform(self):
- """
- Return the :class:`~matplotlib.transforms.Transform` applied
- to the children
- """
- return (self.aux_transform
- + self.ref_offset_transform
- + self.offset_transform)
-
- def set_transform(self, t):
- """
- set_transform is ignored.
- """
-
- def set_offset(self, xy):
- """
- Set the offset of the container.
-
- Parameters
- ----------
- xy : (float, float)
- The (x, y) coordinates of the offset in display units.
- """
- self._offset = xy
- self.offset_transform.clear()
- self.offset_transform.translate(xy[0], xy[1])
- self.stale = True
-
- def get_offset(self):
- """Return offset of the container."""
- return self._offset
-
- def get_bbox(self, renderer):
- # clear the offset transforms
- _off = self.offset_transform.get_matrix() # to be restored later
- self.ref_offset_transform.clear()
- self.offset_transform.clear()
- # calculate the extent
- bboxes = [c.get_window_extent(renderer) for c in self._children]
- ub = Bbox.union(bboxes)
- # adjust ref_offset_transform
- self.ref_offset_transform.translate(-ub.x0, -ub.y0)
- # restore offset transform
- self.offset_transform.set_matrix(_off)
- return Bbox.from_bounds(0, 0, ub.width, ub.height)
-
- def draw(self, renderer):
- # docstring inherited
- for c in self._children:
- c.draw(renderer)
- _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))
- self.stale = False
-
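A typical use of `AuxTransformBox` is drawing children through an extra affine transform, e.g. a rotation, while the box itself is positioned by offset. A minimal sketch (angle and sizes are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
from matplotlib.offsetbox import AuxTransformBox
from matplotlib.patches import Ellipse
from matplotlib.transforms import Affine2D

# the child ellipse is drawn through a 30-degree rotation before
# the automatic lower-left re-anchoring and the offset are applied
box = AuxTransformBox(Affine2D().rotate_deg(30))
box.add_artist(Ellipse((0, 0), width=20, height=10))
```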
-
-class AnchoredOffsetbox(OffsetBox):
- """
- An offset box placed according to location *loc*.
-
- AnchoredOffsetbox has a single child. When multiple children are needed,
- use an extra OffsetBox to enclose them. By default, the offset box is
- anchored against its parent axes. You may explicitly specify the
- *bbox_to_anchor*.
- """
- zorder = 5 # zorder of the legend
-
- # Location codes
- codes = {'upper right': 1,
- 'upper left': 2,
- 'lower left': 3,
- 'lower right': 4,
- 'right': 5,
- 'center left': 6,
- 'center right': 7,
- 'lower center': 8,
- 'upper center': 9,
- 'center': 10,
- }
-
- @_api.make_keyword_only("3.6", name="pad")
- def __init__(self, loc,
- pad=0.4, borderpad=0.5,
- child=None, prop=None, frameon=True,
- bbox_to_anchor=None,
- bbox_transform=None,
- **kwargs):
- """
- Parameters
- ----------
- loc : str
- The box location. Valid locations are
- 'upper left', 'upper center', 'upper right',
- 'center left', 'center', 'center right',
- 'lower left', 'lower center', 'lower right'.
- For backward compatibility, numeric values are accepted as well.
- See the parameter *loc* of `.Legend` for details.
- pad : float, default: 0.4
- Padding around the child as fraction of the fontsize.
- borderpad : float, default: 0.5
- Padding between the offsetbox frame and the *bbox_to_anchor*.
- child : `.OffsetBox`
- The box that will be anchored.
- prop : `.FontProperties`
- This is only used as a reference for paddings. If not given,
- :rc:`legend.fontsize` is used.
- frameon : bool
- Whether to draw a frame around the box.
- bbox_to_anchor : `.BboxBase`, 2-tuple, or 4-tuple of floats
- Box that is used to position the legend in conjunction with *loc*.
- bbox_transform : None or :class:`matplotlib.transforms.Transform`
- The transform for the bounding box (*bbox_to_anchor*).
- **kwargs
- All other parameters are passed on to `.OffsetBox`.
-
- Notes
- -----
- See `.Legend` for a detailed description of the anchoring mechanism.
- """
- super().__init__(**kwargs)
-
- self.set_bbox_to_anchor(bbox_to_anchor, bbox_transform)
- self.set_child(child)
-
- if isinstance(loc, str):
- loc = _api.check_getitem(self.codes, loc=loc)
-
- self.loc = loc
- self.borderpad = borderpad
- self.pad = pad
-
- if prop is None:
- self.prop = FontProperties(size=mpl.rcParams["legend.fontsize"])
- else:
- self.prop = FontProperties._from_any(prop)
- if isinstance(prop, dict) and "size" not in prop:
- self.prop.set_size(mpl.rcParams["legend.fontsize"])
-
- self.patch = FancyBboxPatch(
- xy=(0.0, 0.0), width=1., height=1.,
- facecolor='w', edgecolor='k',
- mutation_scale=self.prop.get_size_in_points(),
- snap=True,
- visible=frameon,
- boxstyle="square,pad=0",
- )
-
- def set_child(self, child):
- """Set the child to be anchored."""
- self._child = child
- if child is not None:
- child.axes = self.axes
- self.stale = True
-
- def get_child(self):
- """Return the child."""
- return self._child
-
- def get_children(self):
- """Return the list of children."""
- return [self._child]
-
- def get_bbox(self, renderer):
- # docstring inherited
- fontsize = renderer.points_to_pixels(self.prop.get_size_in_points())
- pad = self.pad * fontsize
- return self.get_child().get_bbox(renderer).padded(pad)
-
- def get_bbox_to_anchor(self):
- """Return the bbox that the box is anchored to."""
- if self._bbox_to_anchor is None:
- return self.axes.bbox
- else:
- transform = self._bbox_to_anchor_transform
- if transform is None:
- return self._bbox_to_anchor
- else:
- return TransformedBbox(self._bbox_to_anchor, transform)
-
- def set_bbox_to_anchor(self, bbox, transform=None):
- """
- Set the bbox that the box is anchored to.
-
- *bbox* can be a Bbox instance, a list of [left, bottom, width,
- height], or a list of [left, bottom] where the width and
- height will be assumed to be zero. The bbox will be
- transformed to display coordinate by the given transform.
- """
- if bbox is None or isinstance(bbox, BboxBase):
- self._bbox_to_anchor = bbox
- else:
- try:
- l = len(bbox)
- except TypeError as err:
- raise ValueError(f"Invalid bbox: {bbox}") from err
-
- if l == 2:
- bbox = [bbox[0], bbox[1], 0, 0]
-
- self._bbox_to_anchor = Bbox.from_bounds(*bbox)
-
- self._bbox_to_anchor_transform = transform
- self.stale = True
-
- @_compat_get_offset
- def get_offset(self, bbox, renderer):
- # docstring inherited
- pad = (self.borderpad
- * renderer.points_to_pixels(self.prop.get_size_in_points()))
- bbox_to_anchor = self.get_bbox_to_anchor()
- x0, y0 = _get_anchored_bbox(
- self.loc, Bbox.from_bounds(0, 0, bbox.width, bbox.height),
- bbox_to_anchor, pad)
- return x0 - bbox.x0, y0 - bbox.y0
-
- def update_frame(self, bbox, fontsize=None):
- self.patch.set_bounds(bbox.bounds)
- if fontsize:
- self.patch.set_mutation_scale(fontsize)
-
- def draw(self, renderer):
- # docstring inherited
- if not self.get_visible():
- return
-
- # update the location and size of the legend
- bbox = self.get_window_extent(renderer)
- fontsize = renderer.points_to_pixels(self.prop.get_size_in_points())
- self.update_frame(bbox, fontsize)
- self.patch.draw(renderer)
-
- px, py = self.get_offset(self.get_bbox(renderer), renderer)
- self.get_child().set_offset((px, py))
- self.get_child().draw(renderer)
- self.stale = False
-
-
-def _get_anchored_bbox(loc, bbox, parentbbox, borderpad):
- """
- Return the (x, y) position of the *bbox* anchored at the *parentbbox* with
- the *loc* code with the *borderpad*.
- """
- # This is only called internally and *loc* should already have been
- # validated. If 0 (None), we just let ``bbox.anchored`` raise.
- c = [None, "NE", "NW", "SW", "SE", "E", "W", "E", "S", "N", "C"][loc]
- container = parentbbox.padded(-borderpad)
- return bbox.anchored(c, container=container).p0
-
-
-class AnchoredText(AnchoredOffsetbox):
- """
- AnchoredOffsetbox with Text.
- """
-
- @_api.make_keyword_only("3.6", name="pad")
- def __init__(self, s, loc, pad=0.4, borderpad=0.5, prop=None, **kwargs):
- """
- Parameters
- ----------
- s : str
- Text.
-
- loc : str
- Location code. See `AnchoredOffsetbox`.
-
- pad : float, default: 0.4
- Padding around the text as fraction of the fontsize.
-
- borderpad : float, default: 0.5
- Spacing between the offsetbox frame and the *bbox_to_anchor*.
-
- prop : dict, optional
- Dictionary of keyword parameters to be passed to the
- `~matplotlib.text.Text` instance contained inside AnchoredText.
-
- **kwargs
- All other parameters are passed to `AnchoredOffsetbox`.
- """
-
- if prop is None:
- prop = {}
- badkwargs = {'va', 'verticalalignment'}
- if badkwargs & set(prop):
- raise ValueError(
- 'Mixing verticalalignment with AnchoredText is not supported.')
-
- self.txt = TextArea(s, textprops=prop)
- fp = self.txt._text.get_fontproperties()
- super().__init__(
- loc, pad=pad, borderpad=borderpad, child=self.txt, prop=fp,
- **kwargs)
-
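`AnchoredText` is the common shorthand for a small framed label in an axes corner, e.g. annotating a fit statistic. A minimal sketch (text, location, and font size are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
import matplotlib.pyplot as plt
from matplotlib.offsetbox import AnchoredText

fig, ax = plt.subplots()
at = AnchoredText("r = 0.97", loc="upper left", prop=dict(size=8))
ax.add_artist(at)
fig.canvas.draw()  # anchoring happens at draw time
```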
-
-class OffsetImage(OffsetBox):
-
- @_api.make_keyword_only("3.6", name="zoom")
- def __init__(self, arr,
- zoom=1,
- cmap=None,
- norm=None,
- interpolation=None,
- origin=None,
- filternorm=True,
- filterrad=4.0,
- resample=False,
- dpi_cor=True,
- **kwargs
- ):
-
- super().__init__()
- self._dpi_cor = dpi_cor
-
- self.image = BboxImage(bbox=self.get_window_extent,
- cmap=cmap,
- norm=norm,
- interpolation=interpolation,
- origin=origin,
- filternorm=filternorm,
- filterrad=filterrad,
- resample=resample,
- **kwargs
- )
-
- self._children = [self.image]
-
- self.set_zoom(zoom)
- self.set_data(arr)
-
- def set_data(self, arr):
- self._data = np.asarray(arr)
- self.image.set_data(self._data)
- self.stale = True
-
- def get_data(self):
- return self._data
-
- def set_zoom(self, zoom):
- self._zoom = zoom
- self.stale = True
-
- def get_zoom(self):
- return self._zoom
-
- def get_offset(self):
- """Return offset of the container."""
- return self._offset
-
- def get_children(self):
- return [self.image]
-
- def get_bbox(self, renderer):
- dpi_cor = renderer.points_to_pixels(1.) if self._dpi_cor else 1.
- zoom = self.get_zoom()
- data = self.get_data()
- ny, nx = data.shape[:2]
- w, h = dpi_cor * nx * zoom, dpi_cor * ny * zoom
- return Bbox.from_bounds(0, 0, w, h)
-
- def draw(self, renderer):
- # docstring inherited
- self.image.draw(renderer)
- # bbox_artist(self, renderer, fill=False, props=dict(pad=0.))
- self.stale = False
-
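`OffsetImage` is usually placed through an `AnnotationBbox` rather than added to an axes directly. A minimal sketch (random data; the zoom value is illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
import matplotlib.pyplot as plt
import numpy as np
from matplotlib.offsetbox import OffsetImage, AnnotationBbox

fig, ax = plt.subplots()
img = OffsetImage(np.random.rand(10, 10), zoom=2)
# anchor the image at data coordinates (0.5, 0.5), without a frame
ab = AnnotationBbox(img, (0.5, 0.5), frameon=False)
ax.add_artist(ab)
fig.canvas.draw()
```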
-
-class AnnotationBbox(martist.Artist, mtext._AnnotationBase):
- """
- Container for an `OffsetBox` referring to a specific position *xy*.
-
- Optionally an arrow pointing from the offsetbox to *xy* can be drawn.
-
- This is like `.Annotation`, but with `OffsetBox` instead of `.Text`.
- """
-
- zorder = 3
-
- def __str__(self):
- return "AnnotationBbox(%g,%g)" % (self.xy[0], self.xy[1])
-
- @_docstring.dedent_interpd
- @_api.make_keyword_only("3.6", name="xycoords")
- def __init__(self, offsetbox, xy,
- xybox=None,
- xycoords='data',
- boxcoords=None,
- frameon=True, pad=0.4, # FancyBboxPatch boxstyle.
- annotation_clip=None,
- box_alignment=(0.5, 0.5),
- bboxprops=None,
- arrowprops=None,
- fontsize=None,
- **kwargs):
- """
- Parameters
- ----------
- offsetbox : `OffsetBox`
-
- xy : (float, float)
- The point *(x, y)* to annotate. The coordinate system is determined
- by *xycoords*.
-
- xybox : (float, float), default: *xy*
- The position *(x, y)* to place the text at. The coordinate system
- is determined by *boxcoords*.
-
- xycoords : single or two-tuple of str or `.Artist` or `.Transform` or \
-callable, default: 'data'
- The coordinate system that *xy* is given in. See the parameter
- *xycoords* in `.Annotation` for a detailed description.
-
- boxcoords : single or two-tuple of str or `.Artist` or `.Transform` \
-or callable, default: value of *xycoords*
- The coordinate system that *xybox* is given in. See the parameter
- *textcoords* in `.Annotation` for a detailed description.
-
- frameon : bool, default: True
- By default, the text is surrounded by a white `.FancyBboxPatch`
- (accessible as the ``patch`` attribute of the `.AnnotationBbox`).
- If *frameon* is set to False, this patch is made invisible.
-
- annotation_clip : bool or None, default: None
- Whether to clip (i.e. not draw) the annotation when the annotation
- point *xy* is outside the axes area.
-
- - If *True*, the annotation will be clipped when *xy* is outside
- the axes.
- - If *False*, the annotation will always be drawn.
- - If *None*, the annotation will be clipped when *xy* is outside
- the axes and *xycoords* is 'data'.
-
- pad : float, default: 0.4
- Padding around the offsetbox.
-
- box_alignment : (float, float)
- A tuple of two floats for a vertical and horizontal alignment of
- the offset box w.r.t. the *boxcoords*.
- The lower-left corner is (0, 0) and upper-right corner is (1, 1).
-
- bboxprops : dict, optional
- A dictionary of properties to set for the annotation bounding box,
- for example *boxstyle* and *alpha*. See `.FancyBboxPatch` for
- details.
-
- arrowprops : dict, optional
- Arrow properties, see `.Annotation` for description.
-
- fontsize : float or str, optional
- Translated to points and passed as *mutation_scale* into
- `.FancyBboxPatch` to scale attributes of the box style (e.g. pad
- or rounding_size). The name is chosen in analogy to `.Text` where
- *fontsize* defines the mutation scale as well. If not given,
- :rc:`legend.fontsize` is used. See `.Text.set_fontsize` for valid
- values.
-
- **kwargs
- Other `AnnotationBbox` properties. See `.AnnotationBbox.set` for
- a list.
- """
-
- martist.Artist.__init__(self)
- mtext._AnnotationBase.__init__(
- self, xy, xycoords=xycoords, annotation_clip=annotation_clip)
-
- self.offsetbox = offsetbox
- self.arrowprops = arrowprops.copy() if arrowprops is not None else None
- self.set_fontsize(fontsize)
- self.xybox = xybox if xybox is not None else xy
- self.boxcoords = boxcoords if boxcoords is not None else xycoords
- self._box_alignment = box_alignment
-
- if arrowprops is not None:
- self._arrow_relpos = self.arrowprops.pop("relpos", (0.5, 0.5))
- self.arrow_patch = FancyArrowPatch((0, 0), (1, 1),
- **self.arrowprops)
- else:
- self._arrow_relpos = None
- self.arrow_patch = None
-
- self.patch = FancyBboxPatch( # frame
- xy=(0.0, 0.0), width=1., height=1.,
- facecolor='w', edgecolor='k',
- mutation_scale=self.prop.get_size_in_points(),
- snap=True,
- visible=frameon,
- )
- self.patch.set_boxstyle("square", pad=pad)
- if bboxprops:
- self.patch.set(**bboxprops)
-
- self._internal_update(kwargs)
-
- @property
- def xyann(self):
- return self.xybox
-
- @xyann.setter
- def xyann(self, xyann):
- self.xybox = xyann
- self.stale = True
-
- @property
- def anncoords(self):
- return self.boxcoords
-
- @anncoords.setter
- def anncoords(self, coords):
- self.boxcoords = coords
- self.stale = True
-
- def contains(self, mouseevent):
- inside, info = self._default_contains(mouseevent)
- if inside is not None:
- return inside, info
- if not self._check_xy(None):
- return False, {}
- return self.offsetbox.contains(mouseevent)
- # self.arrow_patch is currently not checked as this can be a line - JJ
-
- def get_children(self):
- children = [self.offsetbox, self.patch]
- if self.arrow_patch:
- children.append(self.arrow_patch)
- return children
-
- def set_figure(self, fig):
- if self.arrow_patch is not None:
- self.arrow_patch.set_figure(fig)
- self.offsetbox.set_figure(fig)
- martist.Artist.set_figure(self, fig)
-
- def set_fontsize(self, s=None):
- """
- Set the fontsize in points.
-
- If *s* is not given, reset to :rc:`legend.fontsize`.
- """
- if s is None:
- s = mpl.rcParams["legend.fontsize"]
-
- self.prop = FontProperties(size=s)
- self.stale = True
-
- def get_fontsize(self):
- """Return the fontsize in points."""
- return self.prop.get_size_in_points()
-
- def get_window_extent(self, renderer=None):
- # docstring inherited
- if renderer is None:
- renderer = self.figure._get_renderer()
- return Bbox.union([child.get_window_extent(renderer)
- for child in self.get_children()])
-
- def get_tightbbox(self, renderer=None):
- # docstring inherited
- return Bbox.union([child.get_tightbbox(renderer)
- for child in self.get_children()])
-
- def update_positions(self, renderer):
- """
- Update pixel positions for the annotated point, the text and the arrow.
- """
-
- x, y = self.xybox
- if isinstance(self.boxcoords, tuple):
- xcoord, ycoord = self.boxcoords
- x1, y1 = self._get_xy(renderer, x, y, xcoord)
- x2, y2 = self._get_xy(renderer, x, y, ycoord)
- ox0, oy0 = x1, y2
- else:
- ox0, oy0 = self._get_xy(renderer, x, y, self.boxcoords)
-
- bbox = self.offsetbox.get_bbox(renderer)
- fw, fh = self._box_alignment
- self.offsetbox.set_offset(
- (ox0 - fw*bbox.width - bbox.x0, oy0 - fh*bbox.height - bbox.y0))
-
- bbox = self.offsetbox.get_window_extent(renderer)
- self.patch.set_bounds(bbox.bounds)
-
- mutation_scale = renderer.points_to_pixels(self.get_fontsize())
- self.patch.set_mutation_scale(mutation_scale)
-
- if self.arrowprops:
- # Use FancyArrowPatch if self.arrowprops has "arrowstyle" key.
-
- # Adjust the starting point of the arrow relative to the textbox.
- # TODO: Rotation needs to be accounted for.
- arrow_begin = bbox.p0 + bbox.size * self._arrow_relpos
- arrow_end = self._get_position_xy(renderer)
- # The arrow (from arrow_begin to arrow_end) will be first clipped
- # by patchA and patchB, then shrunk by shrinkA and shrinkB (in
- # points). If patchA is not set, self.patch is used.
- self.arrow_patch.set_positions(arrow_begin, arrow_end)
-
- if "mutation_scale" in self.arrowprops:
- mutation_scale = renderer.points_to_pixels(
- self.arrowprops["mutation_scale"])
- # Else, use fontsize-based mutation_scale defined above.
- self.arrow_patch.set_mutation_scale(mutation_scale)
-
- patchA = self.arrowprops.get("patchA", self.patch)
- self.arrow_patch.set_patchA(patchA)
-
- def draw(self, renderer):
- # docstring inherited
- if renderer is not None:
- self._renderer = renderer
- if not self.get_visible() or not self._check_xy(renderer):
- return
- renderer.open_group(self.__class__.__name__, gid=self.get_gid())
- self.update_positions(renderer)
- if self.arrow_patch is not None:
- if self.arrow_patch.figure is None and self.figure is not None:
- self.arrow_patch.figure = self.figure
- self.arrow_patch.draw(renderer)
- self.patch.draw(renderer)
- self.offsetbox.draw(renderer)
- renderer.close_group(self.__class__.__name__)
- self.stale = False
-
-
-class DraggableBase:
- """
- Helper base class for a draggable artist (legend, offsetbox).
-
- Derived classes must override the following methods::
-
- def save_offset(self):
- '''
- Called when the object is picked for dragging; should save the
- reference position of the artist.
- '''
-
- def update_offset(self, dx, dy):
- '''
- Called during the dragging; (*dx*, *dy*) is the pixel offset from
- the point where the mouse drag started.
- '''
-
- Optionally, you may override the following method::
-
- def finalize_offset(self):
- '''Called when the mouse is released.'''
-
- In the current implementation of `.DraggableLegend` and
- `DraggableAnnotation`, `update_offset` places the artists in display
- coordinates, and `finalize_offset` recalculates their position in axes
- coordinates and sets a relevant attribute.
- """
-
- def __init__(self, ref_artist, use_blit=False):
- self.ref_artist = ref_artist
- if not ref_artist.pickable():
- ref_artist.set_picker(True)
- self.got_artist = False
- self._use_blit = use_blit and self.canvas.supports_blit
- self.cids = [
- self.canvas.callbacks._connect_picklable(
- 'pick_event', self.on_pick),
- self.canvas.callbacks._connect_picklable(
- 'button_release_event', self.on_release),
- ]
-
- # A property, not an attribute, to maintain picklability.
- canvas = property(lambda self: self.ref_artist.figure.canvas)
-
- def on_motion(self, evt):
- if self._check_still_parented() and self.got_artist:
- dx = evt.x - self.mouse_x
- dy = evt.y - self.mouse_y
- self.update_offset(dx, dy)
- if self._use_blit:
- self.canvas.restore_region(self.background)
- self.ref_artist.draw(
- self.ref_artist.figure._get_renderer())
- self.canvas.blit()
- else:
- self.canvas.draw()
-
- def on_pick(self, evt):
- if self._check_still_parented() and evt.artist == self.ref_artist:
- self.mouse_x = evt.mouseevent.x
- self.mouse_y = evt.mouseevent.y
- self.got_artist = True
- if self._use_blit:
- self.ref_artist.set_animated(True)
- self.canvas.draw()
- self.background = \
- self.canvas.copy_from_bbox(self.ref_artist.figure.bbox)
- self.ref_artist.draw(
- self.ref_artist.figure._get_renderer())
- self.canvas.blit()
- self._c1 = self.canvas.callbacks._connect_picklable(
- "motion_notify_event", self.on_motion)
- self.save_offset()
-
- def on_release(self, event):
- if self._check_still_parented() and self.got_artist:
- self.finalize_offset()
- self.got_artist = False
- self.canvas.mpl_disconnect(self._c1)
-
- if self._use_blit:
- self.ref_artist.set_animated(False)
-
- def _check_still_parented(self):
- if self.ref_artist.figure is None:
- self.disconnect()
- return False
- else:
- return True
-
- def disconnect(self):
- """Disconnect the callbacks."""
- for cid in self.cids:
- self.canvas.mpl_disconnect(cid)
- try:
- c1 = self._c1
- except AttributeError:
- pass
- else:
- self.canvas.mpl_disconnect(c1)
-
- def save_offset(self):
- pass
-
- def update_offset(self, dx, dy):
- pass
-
- def finalize_offset(self):
- pass
-
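The override contract described in the docstring above can be sketched with a tiny subclass that drags a `Rectangle`. `DraggableRect` is a hypothetical helper, not part of this module:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
import matplotlib.pyplot as plt
from matplotlib.offsetbox import DraggableBase
from matplotlib.patches import Rectangle

class DraggableRect(DraggableBase):
    def save_offset(self):
        # remember the display-space position at the start of the drag
        xy = self.ref_artist.get_xy()
        self._start = self.ref_artist.get_transform().transform(xy)

    def update_offset(self, dx, dy):
        # shift by the pixel offset and map back to data coordinates
        inv = self.ref_artist.get_transform().inverted()
        self.ref_artist.set_xy(inv.transform(self._start + (dx, dy)))

fig, ax = plt.subplots()
rect = ax.add_patch(Rectangle((0.2, 0.2), 0.3, 0.3))
drag = DraggableRect(rect)  # makes the rectangle pickable, connects events
```

The base class handles the pick/motion/release plumbing; the subclass only maps pixel offsets onto the artist's own coordinate system.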
-
-class DraggableOffsetBox(DraggableBase):
- def __init__(self, ref_artist, offsetbox, use_blit=False):
- super().__init__(ref_artist, use_blit=use_blit)
- self.offsetbox = offsetbox
-
- def save_offset(self):
- offsetbox = self.offsetbox
- renderer = offsetbox.figure._get_renderer()
- offset = offsetbox.get_offset(offsetbox.get_bbox(renderer), renderer)
- self.offsetbox_x, self.offsetbox_y = offset
- self.offsetbox.set_offset(offset)
-
- def update_offset(self, dx, dy):
- loc_in_canvas = self.offsetbox_x + dx, self.offsetbox_y + dy
- self.offsetbox.set_offset(loc_in_canvas)
-
- def get_loc_in_canvas(self):
- offsetbox = self.offsetbox
- renderer = offsetbox.figure._get_renderer()
- bbox = offsetbox.get_bbox(renderer)
- ox, oy = offsetbox._offset
- loc_in_canvas = (ox + bbox.x0, oy + bbox.y0)
- return loc_in_canvas
-
-
-class DraggableAnnotation(DraggableBase):
- def __init__(self, annotation, use_blit=False):
- super().__init__(annotation, use_blit=use_blit)
- self.annotation = annotation
-
- def save_offset(self):
- ann = self.annotation
- self.ox, self.oy = ann.get_transform().transform(ann.xyann)
-
- def update_offset(self, dx, dy):
- ann = self.annotation
- ann.xyann = ann.get_transform().inverted().transform(
- (self.ox + dx, self.oy + dy))
diff --git a/spaces/lafi23333/aikomori/attentions.py b/spaces/lafi23333/aikomori/attentions.py
deleted file mode 100644
index 4e0b0c1fd48c962e21e1fbe60b23fc574927435c..0000000000000000000000000000000000000000
--- a/spaces/lafi23333/aikomori/attentions.py
+++ /dev/null
@@ -1,303 +0,0 @@
-import copy
-import math
-import numpy as np
-import torch
-from torch import nn
-from torch.nn import functional as F
-
-import commons
-import modules
-from modules import LayerNorm
-
-
-class Encoder(nn.Module):
- def __init__(self, hidden_channels, filter_channels, n_heads, n_layers, kernel_size=1, p_dropout=0., window_size=4, **kwargs):
- super().__init__()
- self.hidden_channels = hidden_channels
- self.filter_channels = filter_channels
- self.n_heads = n_heads
- self.n_layers = n_layers
- self.kernel_size = kernel_size
- self.p_dropout = p_dropout
- self.window_size = window_size
-
- self.drop = nn.Dropout(p_dropout)
- self.attn_layers = nn.ModuleList()
- self.norm_layers_1 = nn.ModuleList()
- self.ffn_layers = nn.ModuleList()
- self.norm_layers_2 = nn.ModuleList()
- for i in range(self.n_layers):
- self.attn_layers.append(MultiHeadAttention(hidden_channels, hidden_channels, n_heads, p_dropout=p_dropout, window_size=window_size))
- self.norm_layers_1.append(LayerNorm(hidden_channels))
- self.ffn_layers.append(FFN(hidden_channels, hidden_channels, filter_channels, kernel_size, p_dropout=p_dropout))
- self.norm_layers_2.append(LayerNorm(hidden_channels))
-
- def forward(self, x, x_mask):
- attn_mask = x_mask.unsqueeze(2) * x_mask.unsqueeze(-1)
- x = x * x_mask
- for i in range(self.n_layers):
- y = self.attn_layers[i](x, x, attn_mask)
- y = self.drop(y)
- x = self.norm_layers_1[i](x + y)
-
- y = self.ffn_layers[i](x, x_mask)
- y = self.drop(y)
- x = self.norm_layers_2[i](x + y)
- x = x * x_mask
- return x
-
-
-class Decoder(nn.Module):
- def __init__(self, hidden_channels, filter_channels, n_heads, n_layers, kernel_size=1, p_dropout=0., proximal_bias=False, proximal_init=True, **kwargs):
- super().__init__()
- self.hidden_channels = hidden_channels
- self.filter_channels = filter_channels
- self.n_heads = n_heads
- self.n_layers = n_layers
- self.kernel_size = kernel_size
- self.p_dropout = p_dropout
- self.proximal_bias = proximal_bias
- self.proximal_init = proximal_init
-
- self.drop = nn.Dropout(p_dropout)
- self.self_attn_layers = nn.ModuleList()
- self.norm_layers_0 = nn.ModuleList()
- self.encdec_attn_layers = nn.ModuleList()
- self.norm_layers_1 = nn.ModuleList()
- self.ffn_layers = nn.ModuleList()
- self.norm_layers_2 = nn.ModuleList()
- for i in range(self.n_layers):
- self.self_attn_layers.append(MultiHeadAttention(hidden_channels, hidden_channels, n_heads, p_dropout=p_dropout, proximal_bias=proximal_bias, proximal_init=proximal_init))
- self.norm_layers_0.append(LayerNorm(hidden_channels))
- self.encdec_attn_layers.append(MultiHeadAttention(hidden_channels, hidden_channels, n_heads, p_dropout=p_dropout))
- self.norm_layers_1.append(LayerNorm(hidden_channels))
- self.ffn_layers.append(FFN(hidden_channels, hidden_channels, filter_channels, kernel_size, p_dropout=p_dropout, causal=True))
- self.norm_layers_2.append(LayerNorm(hidden_channels))
-
- def forward(self, x, x_mask, h, h_mask):
- """
- x: decoder input
- h: encoder output
- """
- self_attn_mask = commons.subsequent_mask(x_mask.size(2)).to(device=x.device, dtype=x.dtype)
- encdec_attn_mask = h_mask.unsqueeze(2) * x_mask.unsqueeze(-1)
- x = x * x_mask
- for i in range(self.n_layers):
- y = self.self_attn_layers[i](x, x, self_attn_mask)
- y = self.drop(y)
- x = self.norm_layers_0[i](x + y)
-
- y = self.encdec_attn_layers[i](x, h, encdec_attn_mask)
- y = self.drop(y)
- x = self.norm_layers_1[i](x + y)
-
- y = self.ffn_layers[i](x, x_mask)
- y = self.drop(y)
- x = self.norm_layers_2[i](x + y)
- x = x * x_mask
- return x
-
-
-class MultiHeadAttention(nn.Module):
- def __init__(self, channels, out_channels, n_heads, p_dropout=0., window_size=None, heads_share=True, block_length=None, proximal_bias=False, proximal_init=False):
- super().__init__()
- assert channels % n_heads == 0
-
- self.channels = channels
- self.out_channels = out_channels
- self.n_heads = n_heads
- self.p_dropout = p_dropout
- self.window_size = window_size
- self.heads_share = heads_share
- self.block_length = block_length
- self.proximal_bias = proximal_bias
- self.proximal_init = proximal_init
- self.attn = None
-
- self.k_channels = channels // n_heads
- self.conv_q = nn.Conv1d(channels, channels, 1)
- self.conv_k = nn.Conv1d(channels, channels, 1)
- self.conv_v = nn.Conv1d(channels, channels, 1)
- self.conv_o = nn.Conv1d(channels, out_channels, 1)
- self.drop = nn.Dropout(p_dropout)
-
- if window_size is not None:
- n_heads_rel = 1 if heads_share else n_heads
- rel_stddev = self.k_channels**-0.5
- self.emb_rel_k = nn.Parameter(torch.randn(n_heads_rel, window_size * 2 + 1, self.k_channels) * rel_stddev)
- self.emb_rel_v = nn.Parameter(torch.randn(n_heads_rel, window_size * 2 + 1, self.k_channels) * rel_stddev)
-
- nn.init.xavier_uniform_(self.conv_q.weight)
- nn.init.xavier_uniform_(self.conv_k.weight)
- nn.init.xavier_uniform_(self.conv_v.weight)
- if proximal_init:
- with torch.no_grad():
- self.conv_k.weight.copy_(self.conv_q.weight)
- self.conv_k.bias.copy_(self.conv_q.bias)
-
- def forward(self, x, c, attn_mask=None):
- q = self.conv_q(x)
- k = self.conv_k(c)
- v = self.conv_v(c)
-
- x, self.attn = self.attention(q, k, v, mask=attn_mask)
-
- x = self.conv_o(x)
- return x
-
- def attention(self, query, key, value, mask=None):
- # reshape [b, d, t] -> [b, n_h, t, d_k]
- b, d, t_s, t_t = (*key.size(), query.size(2))
- query = query.view(b, self.n_heads, self.k_channels, t_t).transpose(2, 3)
- key = key.view(b, self.n_heads, self.k_channels, t_s).transpose(2, 3)
- value = value.view(b, self.n_heads, self.k_channels, t_s).transpose(2, 3)
-
- scores = torch.matmul(query / math.sqrt(self.k_channels), key.transpose(-2, -1))
- if self.window_size is not None:
- assert t_s == t_t, "Relative attention is only available for self-attention."
- key_relative_embeddings = self._get_relative_embeddings(self.emb_rel_k, t_s)
- rel_logits = self._matmul_with_relative_keys(query /math.sqrt(self.k_channels), key_relative_embeddings)
- scores_local = self._relative_position_to_absolute_position(rel_logits)
- scores = scores + scores_local
- if self.proximal_bias:
- assert t_s == t_t, "Proximal bias is only available for self-attention."
- scores = scores + self._attention_bias_proximal(t_s).to(device=scores.device, dtype=scores.dtype)
- if mask is not None:
- scores = scores.masked_fill(mask == 0, -1e4)
- if self.block_length is not None:
- assert t_s == t_t, "Local attention is only available for self-attention."
- block_mask = torch.ones_like(scores).triu(-self.block_length).tril(self.block_length)
- scores = scores.masked_fill(block_mask == 0, -1e4)
- p_attn = F.softmax(scores, dim=-1) # [b, n_h, t_t, t_s]
- p_attn = self.drop(p_attn)
- output = torch.matmul(p_attn, value)
- if self.window_size is not None:
- relative_weights = self._absolute_position_to_relative_position(p_attn)
- value_relative_embeddings = self._get_relative_embeddings(self.emb_rel_v, t_s)
- output = output + self._matmul_with_relative_values(relative_weights, value_relative_embeddings)
- output = output.transpose(2, 3).contiguous().view(b, d, t_t) # [b, n_h, t_t, d_k] -> [b, d, t_t]
- return output, p_attn
-
- def _matmul_with_relative_values(self, x, y):
- """
- x: [b, h, l, m]
- y: [h or 1, m, d]
- ret: [b, h, l, d]
- """
- ret = torch.matmul(x, y.unsqueeze(0))
- return ret
-
- def _matmul_with_relative_keys(self, x, y):
- """
- x: [b, h, l, d]
- y: [h or 1, m, d]
- ret: [b, h, l, m]
- """
- ret = torch.matmul(x, y.unsqueeze(0).transpose(-2, -1))
- return ret
-
- def _get_relative_embeddings(self, relative_embeddings, length):
- max_relative_position = 2 * self.window_size + 1
- # Pad first before slice to avoid using cond ops.
- pad_length = max(length - (self.window_size + 1), 0)
- slice_start_position = max((self.window_size + 1) - length, 0)
- slice_end_position = slice_start_position + 2 * length - 1
- if pad_length > 0:
- padded_relative_embeddings = F.pad(
- relative_embeddings,
- commons.convert_pad_shape([[0, 0], [pad_length, pad_length], [0, 0]]))
- else:
- padded_relative_embeddings = relative_embeddings
- used_relative_embeddings = padded_relative_embeddings[:,slice_start_position:slice_end_position]
- return used_relative_embeddings
-
- def _relative_position_to_absolute_position(self, x):
- """
- x: [b, h, l, 2*l-1]
- ret: [b, h, l, l]
- """
- batch, heads, length, _ = x.size()
- # Concat columns of pad to shift from relative to absolute indexing.
- x = F.pad(x, commons.convert_pad_shape([[0,0],[0,0],[0,0],[0,1]]))
-
- # Concat extra elements so to add up to shape (len+1, 2*len-1).
- x_flat = x.view([batch, heads, length * 2 * length])
- x_flat = F.pad(x_flat, commons.convert_pad_shape([[0,0],[0,0],[0,length-1]]))
-
- # Reshape and slice out the padded elements.
- x_final = x_flat.view([batch, heads, length+1, 2*length-1])[:, :, :length, length-1:]
- return x_final
-
- def _absolute_position_to_relative_position(self, x):
- """
- x: [b, h, l, l]
- ret: [b, h, l, 2*l-1]
- """
- batch, heads, length, _ = x.size()
- # pad along column
- x = F.pad(x, commons.convert_pad_shape([[0, 0], [0, 0], [0, 0], [0, length-1]]))
- x_flat = x.view([batch, heads, length**2 + length*(length -1)])
- # add 0's in the beginning that will skew the elements after reshape
- x_flat = F.pad(x_flat, commons.convert_pad_shape([[0, 0], [0, 0], [length, 0]]))
- x_final = x_flat.view([batch, heads, length, 2*length])[:,:,:,1:]
- return x_final
-
- def _attention_bias_proximal(self, length):
- """Bias for self-attention to encourage attention to close positions.
- Args:
- length: an integer scalar.
- Returns:
- a Tensor with shape [1, 1, length, length]
- """
- r = torch.arange(length, dtype=torch.float32)
- diff = torch.unsqueeze(r, 0) - torch.unsqueeze(r, 1)
- return torch.unsqueeze(torch.unsqueeze(-torch.log1p(torch.abs(diff)), 0), 0)
-
-
-class FFN(nn.Module):
- def __init__(self, in_channels, out_channels, filter_channels, kernel_size, p_dropout=0., activation=None, causal=False):
- super().__init__()
- self.in_channels = in_channels
- self.out_channels = out_channels
- self.filter_channels = filter_channels
- self.kernel_size = kernel_size
- self.p_dropout = p_dropout
- self.activation = activation
- self.causal = causal
-
- if causal:
- self.padding = self._causal_padding
- else:
- self.padding = self._same_padding
-
- self.conv_1 = nn.Conv1d(in_channels, filter_channels, kernel_size)
- self.conv_2 = nn.Conv1d(filter_channels, out_channels, kernel_size)
- self.drop = nn.Dropout(p_dropout)
-
- def forward(self, x, x_mask):
- x = self.conv_1(self.padding(x * x_mask))
- if self.activation == "gelu":
- x = x * torch.sigmoid(1.702 * x)
- else:
- x = torch.relu(x)
- x = self.drop(x)
- x = self.conv_2(self.padding(x * x_mask))
- return x * x_mask
-
- def _causal_padding(self, x):
- if self.kernel_size == 1:
- return x
- pad_l = self.kernel_size - 1
- pad_r = 0
- padding = [[0, 0], [0, 0], [pad_l, pad_r]]
- x = F.pad(x, commons.convert_pad_shape(padding))
- return x
-
- def _same_padding(self, x):
- if self.kernel_size == 1:
- return x
- pad_l = (self.kernel_size - 1) // 2
- pad_r = self.kernel_size // 2
- padding = [[0, 0], [0, 0], [pad_l, pad_r]]
- x = F.pad(x, commons.convert_pad_shape(padding))
- return x
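The `_relative_position_to_absolute_position` helper in the deleted `attentions.py` above turns relative-position logits of shape `[b, h, l, 2l-1]` into absolute scores `[b, h, l, l]` purely with padding and reshapes, implementing the index map `j = i + r - (l - 1)` for query position `i` and relative offset `r`. A torch-free sketch of the same trick on plain nested lists (the helper names here are illustrative, not from the file):

```python
# Index-level sketch of the pad-and-reshape trick in
# _relative_position_to_absolute_position above. For query i and relative
# offset r in [0, 2l-2], the absolute key position is j = i + r - (l - 1);
# entries whose j falls outside [0, l) are discarded by the final slice.

def rel_to_abs_direct(rel, length):
    """Gather absolute scores directly from the index formula."""
    out = [[0.0] * length for _ in range(length)]
    for i in range(length):
        for j in range(length):
            r = j - i + (length - 1)      # inverse of j = i + r - (l - 1)
            out[i][j] = rel[i][r]
    return out


def rel_to_abs_reshape(rel, length):
    """Same result via the pad/flatten/reshape route used in the code above."""
    # Pad one zero column: [l, 2l-1] -> [l, 2l]
    padded = [row + [0.0] for row in rel]
    # Flatten, then pad l-1 zeros: l*2l + (l-1) == (l+1)*(2l-1) elements
    flat = [v for row in padded for v in row] + [0.0] * (length - 1)
    # Reshape to [l+1, 2l-1], keep rows :l and columns l-1:
    rows = [flat[k * (2 * length - 1):(k + 1) * (2 * length - 1)]
            for k in range(length + 1)]
    return [row[length - 1:] for row in rows[:length]]


length = 4
rel = [[float(10 * i + r) for r in range(2 * length - 1)]
       for i in range(length)]
assert rel_to_abs_reshape(rel, length) == rel_to_abs_direct(rel, length)
```

The payoff of the reshape route is that it avoids any per-position gather: on a GPU it is just two pads and two views, which is why the original code prefers it over explicit indexing.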
diff --git a/spaces/leurez/moss/src/utils/is/index.ts b/spaces/leurez/moss/src/utils/is/index.ts
deleted file mode 100644
index 27a8230a12e94ad05df1fe5e466446587f6d26d0..0000000000000000000000000000000000000000
--- a/spaces/leurez/moss/src/utils/is/index.ts
+++ /dev/null
@@ -1,55 +0,0 @@
-export function isNumber<T>(value: T | unknown): value is number {
-  return Object.prototype.toString.call(value) === '[object Number]'
-}
-
-export function isString<T>(value: T | unknown): value is string {
-  return Object.prototype.toString.call(value) === '[object String]'
-}
-
-export function isBoolean<T>(value: T | unknown): value is boolean {
-  return Object.prototype.toString.call(value) === '[object Boolean]'
-}
-
-export function isNull<T>(value: T | unknown): value is null {
-  return Object.prototype.toString.call(value) === '[object Null]'
-}
-
-export function isUndefined<T>(value: T | unknown): value is undefined {
-  return Object.prototype.toString.call(value) === '[object Undefined]'
-}
-
-export function isObject<T>(value: T | unknown): value is object {
-  return Object.prototype.toString.call(value) === '[object Object]'
-}
-
-export function isArray<T extends any[]>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Array]'
-}
-
-export function isFunction<T extends (...args: any[]) => any | void | never>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Function]'
-}
-
-export function isDate<T extends Date>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Date]'
-}
-
-export function isRegExp<T extends RegExp>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object RegExp]'
-}
-
-export function isPromise<T extends Promise<any>>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Promise]'
-}
-
-export function isSet<T extends Set<any>>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Set]'
-}
-
-export function isMap<T extends Map<any, any>>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object Map]'
-}
-
-export function isFile<T extends File>(value: T | unknown): value is T {
-  return Object.prototype.toString.call(value) === '[object File]'
-}
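All of the guards in the deleted `is/index.ts` above dispatch on the tag string returned by `Object.prototype.toString`, which classifies built-in kinds reliably even across realms. A rough Python analogue of the same tag-based classification — `type_tag`, `is_number`, and friends are illustrative names, not part of any library, and unlike `isinstance` this sketch deliberately ignores subclasses, mirroring the exact-tag comparison in the TypeScript file:

```python
# Rough Python analogue of the Object.prototype.toString guards above:
# classify a value by its type's tag string rather than isinstance chains.
# type_tag / is_number / is_string / is_dict are illustrative, not a real API.

def type_tag(value):
    # Build an "[object Tag]" string from the concrete type's name.
    return f"[object {type(value).__name__.capitalize()}]"

def is_number(value):
    # bool gets its own tag ("[object Bool]"), so is_number(True) is False,
    # much as "[object Boolean]" differs from "[object Number]" in JS.
    return type_tag(value) in ("[object Int]", "[object Float]")

def is_string(value):
    return type_tag(value) == "[object Str]"

def is_dict(value):
    return type_tag(value) == "[object Dict]"

print(type_tag(3.5))      # [object Float]
print(is_number(True))    # False
print(is_string("hi"))    # True
```

The trade-off is the same as in the TypeScript original: exact-tag checks are immune to accidental matches from loose coercion, but they will not recognize subclasses the way `value instanceof X` (or `isinstance`) would.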
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/CINEMA 4D Studio R19 With [PATCHED] Crack.md b/spaces/lincquiQcaudo/Top-20-Diffusion/CINEMA 4D Studio R19 With [PATCHED] Crack.md
deleted file mode 100644
index c61007caeafd6aaf5d81c956e443675f3ea51e8a..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/CINEMA 4D Studio R19 With [PATCHED] Crack.md
+++ /dev/null
@@ -1,16 +0,0 @@
-CINEMA 4D Studio R19 With Crack
Download Zip » https://bytlly.com/2uGwIg
-
-Buy Maxon Cinema 4D Studio R19 (discount for several licenses, download) with a complete set of C4D tools for 3D production, including a complete set of BodyPaint 3D tools. Cinema 4D Studio R19.
-Version: R19.
-Platform: Mac OS X. Interface language: English + Russian.
-Treatment: Included.
-Description: Maxon Cinema 4D Studio R19 (C4D R19) is a tool for creating 3D graphics designed for artists, designers and.
-Macros V. 4.5 (M5. V. 5.0) - a program for 3D modeling, visualization and animation for Mac OS X. Developer: Macro-Space.
-Version: v. 5.0. Platform: Mac OS X.
-Interface language: English + Russian.
-Year: 2. 00. 8a78ff9644
-
-
-
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/Counter-Strike 1.5 1.6 Steamless Online Pack 2011 Pc Game.md b/spaces/lincquiQcaudo/Top-20-Diffusion/Counter-Strike 1.5 1.6 Steamless Online Pack 2011 Pc Game.md
deleted file mode 100644
index 38dfae3f58687bfdda416d8737fb8861ff5ad4df..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/Counter-Strike 1.5 1.6 Steamless Online Pack 2011 Pc Game.md
+++ /dev/null
@@ -1,36 +0,0 @@
-
-How to Play Counter-Strike 1.5 and 1.6 Online Without Steam
-Counter-Strike is one of the most popular and influential first-person shooter games of all time. It pits two teams of players against each other in realistic scenarios, such as defusing bombs, rescuing hostages, or eliminating terrorists. The game has been updated and improved over the years, with the latest version being Counter-Strike: Global Offensive.
-However, some fans still prefer the classic versions of Counter-Strike, such as 1.5 and 1.6, which were released in 2002 and 2003 respectively. These versions have simpler graphics, faster gameplay, and a nostalgic appeal for many players. But how can you play them online without Steam, the digital distribution platform that manages most modern PC games?
-Counter-Strike 1.5 1.6 Steamless Online Pack 2011 pc game
Download Zip > https://bytlly.com/2uGx7j
-The answer is to use a steamless online pack, which is a collection of files and programs that allow you to play Counter-Strike 1.5 and 1.6 online without needing a Steam account or a CD key. These packs are usually created by fans and modders who want to preserve the legacy of the game and make it accessible to everyone.
-One of the most popular steamless online packs for Counter-Strike 1.5 and 1.6 is the one released in 2011 by a user named ny[^1^]. This pack includes everything you need to play Counter-Strike 1.5 and 1.6 online with other players around the world, such as:
-
-- The game files for both versions of Counter-Strike
-- A launcher that lets you choose which version to play
-- A patch that fixes some bugs and adds some features
-- A server browser that shows you the available servers
-- A spectator mode that lets you watch other players
-- New weapons and sprays for more variety
-- A counter-terrorist update that adds a new map and mode
-
-To use this pack, you just need to download it from the link provided[^1^], extract it to your desired folder, and run the launcher.exe file. Then you can choose which version of Counter-Strike you want to play, select a server from the list, and join the game.
-There are some advantages and disadvantages of using this pack compared to playing Counter-Strike on Steam. Some of the advantages are:
-
-- You don't need a Steam account or a CD key to play
-- You can play on any server without restrictions
-- You can use custom mods and skins without problems
-- You can play offline or on LAN with your friends
-- You can enjoy the classic gameplay and graphics of Counter-Strike
-
-Some of the disadvantages are:
-
-- You may encounter some bugs or errors while playing
-- You may not be able to play on some servers that require Steam validation
-- You may not have access to some features or updates that are exclusive to Steam
-- You may not be able to play with some players who only use Steam
-- You may not be able to get official support or assistance from Valve
-
-In conclusion, if you want to play Counter-Strike 1.5 and 1.6 online without Steam, you can use a steamless online pack like the one released in 2011 by ny[^1^]. This pack will let you enjoy the classic versions of Counter-Strike with other players around the world, without needing a Steam account or a CD key. However, you should also be aware of the potential drawbacks and risks of using this pack, such as bugs, errors, compatibility issues, or lack of support.
d5da3c52bf
-
-
\ No newline at end of file
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/DAEMON Tools Ultra 3100368 License Crack UPD M4MasterTeamOSHKRG.md b/spaces/lincquiQcaudo/Top-20-Diffusion/DAEMON Tools Ultra 3100368 License Crack UPD M4MasterTeamOSHKRG.md
deleted file mode 100644
index e9f2fc6e07c3f29ef0acba98f7a24394851dbd1f..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/DAEMON Tools Ultra 3100368 License Crack UPD M4MasterTeamOSHKRG.md
+++ /dev/null
@@ -1,325 +0,0 @@
-
-DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG: How to Download and Use the Ultimate Imaging Software
-
-DAEMON Tools Ultra is one of the most powerful, advanced, and ultimate imaging software that we have ever created. It offers a huge list of possibilities to work with virtual drives, create bootable USB sticks for operating system recovery, use RAM disks to speed up your PC, and evaluate the unique iSCSI Initiator that allows connecting to USB devices.
-
-However, DAEMON Tools Ultra is not free software, and you might not be able to afford it. That's why some people look for a DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG download. This license crack will allow you to use the full version of DAEMON Tools Ultra 3.1.0.0368 without paying anything. However, we do not support piracy, and we recommend that you buy the original software if you can. Using cracked software may cause problems with your system and may violate the terms of service of DAEMON Tools.
-DAEMON Tools Ultra 3100368 License Crack M4MasterTeamOSHKRG
DOWNLOAD › https://bytlly.com/2uGy6B
-
-What is DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG is a file that modifies the original software to bypass its activation and registration process. It also updates the software to the latest version, which is 3.1.0.0368 for Windows systems. This license crack is created by M4Master, a member of TeamOS-HKRG, a website that provides cracks, keygens, patches, serial keys, and other tools for various software and games.
-
-By using this license crack, you can enjoy all the features and functions of DAEMON Tools Ultra 3.1.0.0368 without any limitations or restrictions. You can also access all the updates, support, and customer service from DAEMON Tools.
-
-How to Download DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-There are many websites that claim to offer DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG download, but not all of them are reliable and safe. Some of them may contain viruses, malware, or spyware that can harm your computer. Some of them may also have fake or outdated links that will not work. Therefore, you need to be careful when choosing a website to download DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG.
-
-One of the websites that we found to be trustworthy and working is bitbucket.org/knowimtime/mettioviting/issues/27/daemon-tools-ultra-3100368-license-crack . This website has a lot of software and games that you can download for free using direct links or torrents. It also has a detailed description and installation guide for each software.
-
-Here are the steps to download DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG from bitbucket.org:
-
-
-- Go to bitbucket.org/knowimtime/mettioviting/issues/27/daemon-tools-ultra-3100368-license-crack and scroll down to the bottom of the page.
-- You will see a link that says "Download DAEMON Tools Ultra 3.1.0 License Crack M4MasterTeamOSHKRG".
-- Click on the link and wait for the file to be downloaded.
-- The file size is about 24 MB.
-- After the download is complete, extract the file using WinRAR or any other software that can handle RAR files.
-- You will get a folder named "DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG" that contains the setup file and the license crack file.
-
-
-How to Install DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-After you have downloaded DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG, you need to install it on your computer. Here are the steps to install it:
-
-
-- Run the setup file named "DTUltra310-0359.exe" as administrator.
-- Follow the instructions on the screen and choose the destination folder where you want to install the software.
-- Wait for the installation to finish.
-- Do not run the software yet.
-- Copy the license crack file named "DTUltra.exe" from the folder "DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG" and paste it into the folder where you installed the software.
-- Usually, it is located at "C:\Program Files\DAEMON Tools Ultra".
-- You have successfully installed DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG.
-
-
-How to Use DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG?
-
-Now that you have installed DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG, you can use it as a standalone application or as a plugin in your DAW.
Here are some tips on how to use DAEMON Tools Ultra 3 License Crack M4MasterTeamOSHKRG:
-
-
-- To use it as a standalone application, run the file named "DTUltra.exe" from the folder where you installed the software.
-- To use it as a plugin in your DAW, open your DAW and scan for new plugins.
-- You should see DAEMON Tools Ultra in your plugin list.
-- To mount an image file in DAEMON Tools Ultra, click on "Add Image" and choose an image file from your computer.
-- To create an image file in DAEMON Tools Ultra, click on "Create Image" and choose a source file or disc from your computer or optical drive.
-- To edit an image file in DAEMON Tools Ultra, click on "Edit Image" and choose an image file from your computer.
-- To burn an image file in DAEMON Tools Ultra, click on "Burn Image" and choose an image file from your computer and a target disc from your optical drive.
-
-
-DAEMON Tools Ultra is one of the best imaging software that can help you work with virtual drives, create bootable USB sticks, use RAM disks, and connect to USB devices.
However, it is also very expensive and not everyone can afford it. That's why some people resort to using a DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG download to get it for free. However, we do not recommend using cracked software, as it may cause issues with your system or with DAEMON Tools. If you like the software and want to support the developers, you should buy the original version from their website. Alternatively, you can look for some other software that offers similar features and functions, or try to get DAEMON Tools Ultra for free without using a license crack. We hope this article was helpful and informative for you. Happy imaging!
-
-How to Uninstall DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-If you want to uninstall DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG from your computer, you can follow these steps:
-
-
-- Go to Control Panel and click on Programs and Features.
-- Find DAEMON Tools Ultra in the list of installed programs and click on Uninstall.
-- Follow the instructions on the screen and wait for the uninstallation to finish.
-- Delete the folder where you installed the software, usually located at "C:\Program Files\DAEMON Tools Ultra".
-- Delete the license crack file named "DTUltra.exe" from your computer.
-- Restart your computer to complete the uninstallation.
-
-
-What are the Alternatives to DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-If you are looking for some alternatives to DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG, you can try these software:
-
-
-- PowerISO: This is a powerful and versatile software that can create, edit, extract, burn, mount, and convert ISO files and other image formats. It also supports virtual drives, bootable USB drives, encryption, compression, and more.
-- WinCDEmu: This is a simple and lightweight software that can mount ISO files and other image formats as virtual drives with a single click. It also supports unlimited number of virtual drives, portable mode, and integration with Windows Explorer.
-- UltraISO: This is a popular and reliable software that can create, edit, convert, and burn ISO files and other image formats. It also supports virtual drives, bootable CD/DVDs, checksum verification, and more.
-
-
-How to Troubleshoot DAEMON Tools Ultra 3.1.0.0368 License Crack M4MasterTeamOSHKRG?
-
-If you encounter any problems while using DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG, you can try these solutions:
-
-
-- Make sure you have downloaded the correct and complete file from a reliable website.
-- Make sure you have extracted the file properly using WinRAR or any other software that can handle RAR files.
-- Make sure you have installed the software correctly by following the instructions on the screen and copying the license crack file to the installation folder.
-- Make sure you have run the software as administrator and disabled your antivirus or firewall before launching it.
-- Make sure you have updated your system drivers and software to the latest versions.
-- Make sure you have enough disk space and memory to run the software smoothly.
-- If none of these solutions work, you can contact DAEMON Tools support team or visit their official website for more help.
-
-
-How to Get DAEMON Tools Ultra 3.1.0.0368 for Free without Using a License Crack M4MasterTeamOSHKRG?
-
-If you want to get DAEMON Tools Ultra 3.1.0.0368 for free without using a license crack M4MasterTeamOSHKRG, you can try these methods:
-
-
-- You can download a free trial version of DAEMON Tools Ultra from their website and use it for 14 days with limited features.
-- You can participate in their promotions and giveaways and get a chance to win a free license key or a discount coupon.
-- You can join their affiliate program and earn commissions by promoting their products on your website or social media.
-- You can refer your friends and family to their products and get rewards or discounts.
-
-
-Conclusion
-
-In this article, we have discussed how to download and install DAEMON Tools Ultra 3.1.0.0368 license crack M4MasterTeamOSHKRG. We have also shown you some of the features, advantages, and disadvantages of using patched software. We hope that you have found this article useful and informative, and that you have learned something new today.
-
-However, we would like to remind you that using patched software is not legal or ethical, and that it may cause issues with your system or with DAEMON Tools. We strongly advise you to buy the original software from their website if you can afford it, or to look for other software that suits your needs and budget. This way, you can support the developers and enjoy the full features and functions of DAEMON Tools Ultra 3.1.0.0368 without any problems.
-
-Thank you for reading this article and we hope to see you again soon. Have a great day and happy imaging!
3cee63e6c2
-
-
\ No newline at end of file
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/Martin Gruber Understanding Sql.pdf.md b/spaces/lincquiQcaudo/Top-20-Diffusion/Martin Gruber Understanding Sql.pdf.md
deleted file mode 100644
index 01483ee4f7d2491d7217288c20724eab1dd0bde8..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/Martin Gruber Understanding Sql.pdf.md
+++ /dev/null
@@ -1,6 +0,0 @@
-Martin Gruber Understanding Sql.pdf
DOWNLOAD ———>>> https://bytlly.com/2uGwaY
-
-Understanding SQL, by Martin Gruber. Starting with a brief introduction to relational database principles, it goes on to provide a comprehensive one. An award-winning SQL tutorial by Martin Gruber: essentially, the entire book consists of practical examples and answers to questions that SQL users often have. SQL Fundamentals - Martin Gruber, 2009: this book explains how to use SQL and write programs in it. Managing SQL - Martin Gruber, 2008: a book that teaches you the fundamentals of SQL and how to develop full-fledged programs with it. 8a78ff9644
-
-
-
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/Megatech MegaCAD 2D 3D 2012 FULL Version Download BEST.md b/spaces/lincquiQcaudo/Top-20-Diffusion/Megatech MegaCAD 2D 3D 2012 FULL Version Download BEST.md
deleted file mode 100644
index 2b9650bc8dc8ed698391a52e035c37951e99a26d..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/Megatech MegaCAD 2D 3D 2012 FULL Version Download BEST.md
+++ /dev/null
@@ -1,6 +0,0 @@
-Megatech MegaCAD 2D 3D 2012 FULL Version Download
-DOWNLOAD >>> https://bytlly.com/2uGvNw
-
-Download AudioEase Altiverb 7 XL MAC OSX torrent from software ... to ... Megatech MegaCAD 2D 3D (2012) [FULL Version] downloadlkjh 1fdad05405
-
-
-
diff --git a/spaces/lincquiQcaudo/Top-20-Diffusion/Microsoft Windows 8.1 Enterprise 6.3.9600.17031.WINBLUE X86-X64 64 Bit [NEW].md b/spaces/lincquiQcaudo/Top-20-Diffusion/Microsoft Windows 8.1 Enterprise 6.3.9600.17031.WINBLUE X86-X64 64 Bit [NEW].md
deleted file mode 100644
index 133ef90f5af96e3a3d52f755070c210254b91983..0000000000000000000000000000000000000000
--- a/spaces/lincquiQcaudo/Top-20-Diffusion/Microsoft Windows 8.1 Enterprise 6.3.9600.17031.WINBLUE X86-X64 64 Bit [NEW].md
+++ /dev/null
@@ -1,6 +0,0 @@
-Microsoft Windows 8.1 Enterprise 6.3.9600.17031.WINBLUE x86-X64 64 bit
-Download Zip ↔ https://bytlly.com/2uGxM9
-
-home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 7 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 7 enterprise 6.3.9600.17031.winblue x86- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 64 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 7 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 7 enterprise 6.3.9600.17031.winblue x86- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 7 enterprise 6.3.9600.17031.winblue x86- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 32 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x86- 64 bit home windows 8.1 enterprise 6.3.9600.17031.winblue x64- 64 bit home windows 8 4fefd39f24
-
-
-
diff --git a/spaces/lixq/bingo61/src/components/tailwind-indicator.tsx b/spaces/lixq/bingo61/src/components/tailwind-indicator.tsx
deleted file mode 100644
index f2a1291213dd67055fcebe67fab574c8441338df..0000000000000000000000000000000000000000
--- a/spaces/lixq/bingo61/src/components/tailwind-indicator.tsx
+++ /dev/null
@@ -1,14 +0,0 @@
-export function TailwindIndicator() {
- if (process.env.NODE_ENV === 'production') return null
-
- return (
-
- xs
- sm
- md
- lg
- xl
- 2xl
-
- )
-}
diff --git a/spaces/lm/lychee_law/app.py b/spaces/lm/lychee_law/app.py
deleted file mode 100644
index 8808ba6e814a567bd94c0f9793199628568bb4f6..0000000000000000000000000000000000000000
--- a/spaces/lm/lychee_law/app.py
+++ /dev/null
@@ -1,40 +0,0 @@
-from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
-
-hf_model = "law-llm/law-glm-10b"
-max_question_length = 64
-max_generation_length = 490
-model_cache_dir = None  # cache directory for model downloads; None uses the default Hugging Face cache
-tokenizer = AutoTokenizer.from_pretrained(
- hf_model,
- cache_dir=model_cache_dir,
- use_fast=True,
- trust_remote_code=True
-)
-
-model = AutoModelForSeq2SeqLM.from_pretrained(
- hf_model,
- cache_dir=model_cache_dir,
- trust_remote_code=True
-)
-
-model = model.to('cuda')
-model.eval()
-
-model_inputs = "提问: 犯了盗窃罪怎么判刑? 回答: [gMASK]"
-
-model_inputs = tokenizer(model_inputs,
- max_length=max_question_length,
- padding=True,
- truncation=True,
- return_tensors="pt")
-
-model_inputs = tokenizer.build_inputs_for_generation(model_inputs,
- targets=None,
- max_gen_length=max_generation_length,
- padding=True)
-
-inputs = model_inputs.to('cuda')
-
-outputs = model.generate(**inputs, max_length=max_generation_length,
- eos_token_id=tokenizer.eop_token_id)
-prediction = tokenizer.decode(outputs[0].tolist())
\ No newline at end of file
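The app above hard-codes one legal question into a GLM-style fill-in-the-blank prompt, where `[gMASK]` marks the span the model should generate. The templating step can be isolated into a small helper (a sketch; `build_glm_prompt` is a hypothetical name, and no model is needed for the formatting itself):

```python
def build_glm_prompt(question: str, mask_token: str = "[gMASK]") -> str:
    """Format a question into the '提问: ... 回答: [gMASK]' template used above."""
    return f"提问: {question} 回答: {mask_token}"

# Reproduces the literal prompt string from the app.
prompt = build_glm_prompt("犯了盗窃罪怎么判刑?")
```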
diff --git a/spaces/lqy09/GT/public/GTest/css/style.css b/spaces/lqy09/GT/public/GTest/css/style.css
deleted file mode 100644
index d6563e38761bb72da6fac03df3eab7279b7ad5ce..0000000000000000000000000000000000000000
--- a/spaces/lqy09/GT/public/GTest/css/style.css
+++ /dev/null
@@ -1 +0,0 @@
-body{position:fixed;top:0;left:0;right:0;bottom:0;display:flex;justify-content:center;text-align:center;align-items:center;background:url(../images/bg.png) no-repeat bottom;background-size:cover;background-attachment:fixed;font-family:"PingFangSC-Regular","Open Sans",Arial,"Hiragino Sans GB","Microsoft YaHei","STHeiti","WenQuanYi Micro Hei",SimSun,sans-serif}h1{position:absolute;font-size:30px;top:12px}#captcha{width:300px;display:inline-block}#btn{width:70%;height:50px;color:#fff;background:linear-gradient(0deg,rgba(0,172,238,1) 0%,rgba(2,126,251,1) 100%);border-radius:6px;box-shadow:inset 2px 2px 2px 0px rgba(255,255,255,.5),7px 7px 20px 0px rgba(0,0,0,.1),4px 4px 5px 0px rgba(0,0,0,.1);cursor:pointer;border:none;font-size:20px}#btn:hover{background:transparent;color:rgba(2,126,251,1)}#wait{display:none;color:#666;margin:0 auto}.show{background-color:#d1d5db;border-radius:9999px;width:150px;height:.5rem;position:relative;overflow:hidden}.progress{background-color:#3b82f6;border-radius:9999px;position:absolute;bottom:0;top:0;width:50%;animation-duration:2s;animation-iteration-count:infinite;animation-name:progress-bar}@keyframes progress-bar{from{left:-50%}to{left:100%}}#footer{position:absolute;width:100%;font-size:12px;color:#c0c0c0;bottom:12px}
\ No newline at end of file
diff --git a/spaces/lwchen/CodeFormer/CodeFormer/README.md b/spaces/lwchen/CodeFormer/CodeFormer/README.md
deleted file mode 100644
index 65810cdf4ce36d8ba152de80df00fa4c8802ee81..0000000000000000000000000000000000000000
--- a/spaces/lwchen/CodeFormer/CodeFormer/README.md
+++ /dev/null
@@ -1,123 +0,0 @@
-
-
-
-
-## Towards Robust Blind Face Restoration with Codebook Lookup Transformer
-
-[Paper](https://arxiv.org/abs/2206.11253) | [Project Page](https://shangchenzhou.com/projects/CodeFormer/) | [Video](https://youtu.be/d3VDpkXlueI)
-
-
-
-[](https://replicate.com/sczhou/codeformer) 
-
-[Shangchen Zhou](https://shangchenzhou.com/), [Kelvin C.K. Chan](https://ckkelvinchan.github.io/), [Chongyi Li](https://li-chongyi.github.io/), [Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/)
-
-S-Lab, Nanyang Technological University
-
-
-
-
-:star: If CodeFormer is helpful to your images or projects, please help star this repo. Thanks! :hugs:
-
-### Update
-
-- **2022.09.09**: Integrated to :rocket: [Replicate](https://replicate.com/). Try out online demo! [](https://replicate.com/sczhou/codeformer)
-- **2022.09.04**: Add face upsampling `--face_upsample` for high-resolution AI-created face enhancement.
-- **2022.08.23**: Some modifications on face detection and fusion for better AI-created face enhancement.
-- **2022.08.07**: Integrate [Real-ESRGAN](https://github.com/xinntao/Real-ESRGAN) to support background image enhancement.
-- **2022.07.29**: Integrate new face detectors of `['RetinaFace'(default), 'YOLOv5']`.
-- **2022.07.17**: Add Colab demo of CodeFormer.
-- **2022.07.16**: Release inference code for face restoration. :blush:
-- **2022.06.21**: This repo is created.
-
-### TODO
-- [ ] Add checkpoint for face inpainting
-- [ ] Add training code and config files
-- [x] ~~Add background image enhancement~~
-
-#### Face Restoration
-
-
-
-
-#### Face Color Enhancement and Restoration
-
-
-
-#### Face Inpainting
-
-
-
-
-
-### Dependencies and Installation
-
-- Pytorch >= 1.7.1
-- CUDA >= 10.1
-- Other required packages in `requirements.txt`
-```
-# git clone this repository
-git clone https://github.com/sczhou/CodeFormer
-cd CodeFormer
-
-# create new anaconda env
-conda create -n codeformer python=3.8 -y
-conda activate codeformer
-
-# install python dependencies
-pip3 install -r requirements.txt
-python basicsr/setup.py develop
-```
-
-
-### Quick Inference
-
-##### Download Pre-trained Models:
-Download the facelib pretrained models from [[Google Drive](https://drive.google.com/drive/folders/1b_3qwrzY_kTQh0-SnBoGBgOrJ_PLZSKm?usp=sharing) | [OneDrive](https://entuedu-my.sharepoint.com/:f:/g/personal/s200094_e_ntu_edu_sg/EvDxR7FcAbZMp_MA9ouq7aQB8XTppMb3-T0uGZ_2anI2mg?e=DXsJFo)] to the `weights/facelib` folder. You can manually download the pretrained models, or download them by running the following command.
-```
-python scripts/download_pretrained_models.py facelib
-```
-
-Download the CodeFormer pretrained models from [[Google Drive](https://drive.google.com/drive/folders/1CNNByjHDFt0b95q54yMVp6Ifo5iuU6QS?usp=sharing) | [OneDrive](https://entuedu-my.sharepoint.com/:f:/g/personal/s200094_e_ntu_edu_sg/EoKFj4wo8cdIn2-TY2IV6CYBhZ0pIG4kUOeHdPR_A5nlbg?e=AO8UN9)] to the `weights/CodeFormer` folder. You can manually download the pretrained models, or download them by running the following command.
-```
-python scripts/download_pretrained_models.py CodeFormer
-```
-
-##### Prepare Testing Data:
-You can put the testing images in the `inputs/TestWhole` folder. If you would like to test on cropped and aligned faces, you can put them in the `inputs/cropped_faces` folder.
-
-
-##### Testing on Face Restoration:
-```
-# For cropped and aligned faces
-python inference_codeformer.py --w 0.5 --has_aligned --test_path [input folder]
-
-# For the whole images
-# Add '--bg_upsampler realesrgan' to enhance the background regions with Real-ESRGAN
-# Add '--face_upsample' to further upsample the restored faces with Real-ESRGAN
-python inference_codeformer.py --w 0.7 --test_path [input folder]
-```
-
-NOTE that *w* is in [0, 1]. Generally, smaller *w* tends to produce a higher-quality result, while larger *w* yields a higher-fidelity result.
-
-The results will be saved in the `results` folder.
-
-### Citation
-If our work is useful for your research, please consider citing:
-
- @article{zhou2022codeformer,
- author = {Zhou, Shangchen and Chan, Kelvin C.K. and Li, Chongyi and Loy, Chen Change},
- title = {Towards Robust Blind Face Restoration with Codebook Lookup TransFormer},
- journal = {arXiv preprint arXiv:2206.11253},
- year = {2022}
- }
-
-### License
-
-
-This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
-
-### Acknowledgement
-
-This project is based on [BasicSR](https://github.com/XPixelGroup/BasicSR). We also borrow some codes from [Unleashing Transformers](https://github.com/samb-t/unleashing-transformers), [YOLOv5-face](https://github.com/deepcam-cn/yolov5-face), and [FaceXLib](https://github.com/xinntao/facexlib). Thanks for their awesome works.
-
-### Contact
-If you have any questions, please feel free to reach out to me at `shangchenzhou@gmail.com`.
\ No newline at end of file
diff --git a/spaces/ma-xu/LIVE/thrust/internal/benchmark/timer.h b/spaces/ma-xu/LIVE/thrust/internal/benchmark/timer.h
deleted file mode 100644
index 077ffa44ce61e637e9e9b898bfe28186f6d36252..0000000000000000000000000000000000000000
--- a/spaces/ma-xu/LIVE/thrust/internal/benchmark/timer.h
+++ /dev/null
@@ -1,129 +0,0 @@
-#pragma once
-
-#include <cuda_runtime.h>
-
-# define CUDA_SAFE_CALL_NO_SYNC( call) do { \
- cudaError err = call; \
- if( cudaSuccess != err) { \
- fprintf(stderr, "CUDA error in file '%s' in line %i : %s.\n", \
- __FILE__, __LINE__, cudaGetErrorString( err) ); \
- exit(EXIT_FAILURE); \
- } } while (0)
-
-# define CUDA_SAFE_CALL( call) do { \
- CUDA_SAFE_CALL_NO_SYNC(call); \
- cudaError err = cudaDeviceSynchronize(); \
- if( cudaSuccess != err) { \
- fprintf(stderr, "CUDA error in file '%s' in line %i : %s.\n", \
- __FILE__, __LINE__, cudaGetErrorString( err) ); \
- exit(EXIT_FAILURE); \
- } } while (0)
-
-class cuda_timer
-{
- cudaEvent_t start_;
- cudaEvent_t stop_;
-
- public:
- cuda_timer()
- {
- CUDA_SAFE_CALL(cudaEventCreate(&start_));
- CUDA_SAFE_CALL(cudaEventCreate(&stop_));
- }
-
- ~cuda_timer()
- {
- CUDA_SAFE_CALL(cudaEventDestroy(start_));
- CUDA_SAFE_CALL(cudaEventDestroy(stop_));
- }
-
- void start()
- {
- CUDA_SAFE_CALL(cudaEventRecord(start_, 0));
- }
-
- void stop()
- {
- CUDA_SAFE_CALL(cudaEventRecord(stop_, 0));
- CUDA_SAFE_CALL(cudaEventSynchronize(stop_));
- }
-
- double milliseconds_elapsed()
- {
- float elapsed_time;
- CUDA_SAFE_CALL(cudaEventElapsedTime(&elapsed_time, start_, stop_));
- return elapsed_time;
- }
-
- double seconds_elapsed()
- {
- return milliseconds_elapsed() / 1000.0;
- }
-};
-
-#if (THRUST_HOST_COMPILER == THRUST_HOST_COMPILER_MSVC)
-#include <windows.h>
-
-class steady_timer
-{
- LARGE_INTEGER frequency_; // Cached to avoid system calls.
- LARGE_INTEGER start_;
- LARGE_INTEGER stop_;
-
- public:
- steady_timer() : start_(), stop_(), frequency_()
- {
- BOOL const r = QueryPerformanceFrequency(&frequency_);
- assert(0 != r);
- }
-
- void start()
- {
- BOOL const r = QueryPerformanceCounter(&start_);
- assert(0 != r);
- }
-
- void stop()
- {
- BOOL const r = QueryPerformanceCounter(&stop_);
- assert(0 != r);
- }
-
- double seconds_elapsed()
- {
- return double(stop_.QuadPart - start_.QuadPart)
- / double(frequency_.QuadPart);
- }
-};
-#else
-#include <time.h>
-
-class steady_timer
-{
- timespec start_;
- timespec stop_;
-
- public:
- steady_timer() : start_(), stop_() {}
-
- void start()
- {
- int const r = clock_gettime(CLOCK_MONOTONIC, &start_);
- assert(0 == r);
- }
-
- void stop()
- {
- int const r = clock_gettime(CLOCK_MONOTONIC, &stop_);
- assert(0 == r);
- }
-
- double seconds_elapsed()
- {
- return double(stop_.tv_sec - start_.tv_sec)
- + double(stop_.tv_nsec - start_.tv_nsec) * 1.0e-9;
- }
-};
-#endif
-
-
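`timer.h` above exposes one start/stop/seconds_elapsed interface over three backends: CUDA events on the device, `QueryPerformanceCounter` on Windows, and `clock_gettime(CLOCK_MONOTONIC)` elsewhere. A rough host-side analogue can be sketched in Python, assuming `time.perf_counter` as the monotonic clock (not part of the benchmark harness itself):

```python
import time

class SteadyTimer:
    """Monotonic timer with the same start/stop/seconds_elapsed shape as timer.h."""
    def __init__(self) -> None:
        self._start = 0.0
        self._stop = 0.0

    def start(self) -> None:
        self._start = time.perf_counter()

    def stop(self) -> None:
        self._stop = time.perf_counter()

    def seconds_elapsed(self) -> float:
        return self._stop - self._start

t = SteadyTimer()
t.start()
time.sleep(0.01)  # stand-in for the work being timed
t.stop()
```

As in the C++ version, the elapsed time is only meaningful between a matched `start()`/`stop()` pair.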
diff --git a/spaces/ma-xu/LIVE/thrust/thrust/system/tbb/detail/par.h b/spaces/ma-xu/LIVE/thrust/thrust/system/tbb/detail/par.h
deleted file mode 100644
index a5d9c14cd7a91df6bcd00dcd13419d7e67155b03..0000000000000000000000000000000000000000
--- a/spaces/ma-xu/LIVE/thrust/thrust/system/tbb/detail/par.h
+++ /dev/null
@@ -1,62 +0,0 @@
-/*
- * Copyright 2008-2018 NVIDIA Corporation
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-#pragma once
-
-#include <thrust/detail/config.h>
-#include <thrust/system/tbb/detail/execution_policy.h>
-#include <thrust/detail/allocator_aware_execution_policy.h>
-
-namespace thrust
-{
-namespace system
-{
-namespace tbb
-{
-namespace detail
-{
-
-
-struct par_t : thrust::system::tbb::detail::execution_policy,
- thrust::detail::allocator_aware_execution_policy<
- thrust::system::tbb::detail::execution_policy>
-{
- __host__ __device__
- THRUST_CONSTEXPR par_t() : thrust::system::tbb::detail::execution_policy() {}
-};
-
-
-} // end detail
-
-
-static const detail::par_t par;
-
-
-} // end tbb
-} // end system
-
-
-// alias par here
-namespace tbb
-{
-
-
-using thrust::system::tbb::par;
-
-
-} // end tbb
-} // end thrust
-
diff --git a/spaces/manavisrani07/gradio-lipsync-wav2lip/basicsr/archs/dfdnet_util.py b/spaces/manavisrani07/gradio-lipsync-wav2lip/basicsr/archs/dfdnet_util.py
deleted file mode 100644
index b4dc0ff738c76852e830b32fffbe65bffb5ddf50..0000000000000000000000000000000000000000
--- a/spaces/manavisrani07/gradio-lipsync-wav2lip/basicsr/archs/dfdnet_util.py
+++ /dev/null
@@ -1,162 +0,0 @@
-import torch
-import torch.nn as nn
-import torch.nn.functional as F
-from torch.autograd import Function
-from torch.nn.utils.spectral_norm import spectral_norm
-
-
-class BlurFunctionBackward(Function):
-
- @staticmethod
- def forward(ctx, grad_output, kernel, kernel_flip):
- ctx.save_for_backward(kernel, kernel_flip)
- grad_input = F.conv2d(grad_output, kernel_flip, padding=1, groups=grad_output.shape[1])
- return grad_input
-
- @staticmethod
- def backward(ctx, gradgrad_output):
- kernel, _ = ctx.saved_tensors
- grad_input = F.conv2d(gradgrad_output, kernel, padding=1, groups=gradgrad_output.shape[1])
- return grad_input, None, None
-
-
-class BlurFunction(Function):
-
- @staticmethod
- def forward(ctx, x, kernel, kernel_flip):
- ctx.save_for_backward(kernel, kernel_flip)
- output = F.conv2d(x, kernel, padding=1, groups=x.shape[1])
- return output
-
- @staticmethod
- def backward(ctx, grad_output):
- kernel, kernel_flip = ctx.saved_tensors
- grad_input = BlurFunctionBackward.apply(grad_output, kernel, kernel_flip)
- return grad_input, None, None
-
-
-blur = BlurFunction.apply
-
-
-class Blur(nn.Module):
-
- def __init__(self, channel):
- super().__init__()
- kernel = torch.tensor([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=torch.float32)
- kernel = kernel.view(1, 1, 3, 3)
- kernel = kernel / kernel.sum()
- kernel_flip = torch.flip(kernel, [2, 3])
-
- self.kernel = kernel.repeat(channel, 1, 1, 1)
- self.kernel_flip = kernel_flip.repeat(channel, 1, 1, 1)
-
- def forward(self, x):
- return blur(x, self.kernel.type_as(x), self.kernel_flip.type_as(x))
-
-
-def calc_mean_std(feat, eps=1e-5):
- """Calculate mean and std for adaptive_instance_normalization.
-
- Args:
- feat (Tensor): 4D tensor.
- eps (float): A small value added to the variance to avoid
- divide-by-zero. Default: 1e-5.
- """
- size = feat.size()
- assert len(size) == 4, 'The input feature should be 4D tensor.'
- n, c = size[:2]
- feat_var = feat.view(n, c, -1).var(dim=2) + eps
- feat_std = feat_var.sqrt().view(n, c, 1, 1)
- feat_mean = feat.view(n, c, -1).mean(dim=2).view(n, c, 1, 1)
- return feat_mean, feat_std
-
-
-def adaptive_instance_normalization(content_feat, style_feat):
- """Adaptive instance normalization.
-
-    Adjust the reference features to have similar color and illumination
-    to those in the degraded features.
-
- Args:
- content_feat (Tensor): The reference feature.
-        style_feat (Tensor): The degraded features.
- """
- size = content_feat.size()
- style_mean, style_std = calc_mean_std(style_feat)
- content_mean, content_std = calc_mean_std(content_feat)
- normalized_feat = (content_feat - content_mean.expand(size)) / content_std.expand(size)
- return normalized_feat * style_std.expand(size) + style_mean.expand(size)
-
-
-def AttentionBlock(in_channel):
- return nn.Sequential(
- spectral_norm(nn.Conv2d(in_channel, in_channel, 3, 1, 1)), nn.LeakyReLU(0.2, True),
- spectral_norm(nn.Conv2d(in_channel, in_channel, 3, 1, 1)))
-
-
-def conv_block(in_channels, out_channels, kernel_size=3, stride=1, dilation=1, bias=True):
- """Conv block used in MSDilationBlock."""
-
- return nn.Sequential(
- spectral_norm(
- nn.Conv2d(
- in_channels,
- out_channels,
- kernel_size=kernel_size,
- stride=stride,
- dilation=dilation,
- padding=((kernel_size - 1) // 2) * dilation,
- bias=bias)),
- nn.LeakyReLU(0.2),
- spectral_norm(
- nn.Conv2d(
- out_channels,
- out_channels,
- kernel_size=kernel_size,
- stride=stride,
- dilation=dilation,
- padding=((kernel_size - 1) // 2) * dilation,
- bias=bias)),
- )
-
-
-class MSDilationBlock(nn.Module):
- """Multi-scale dilation block."""
-
- def __init__(self, in_channels, kernel_size=3, dilation=(1, 1, 1, 1), bias=True):
- super(MSDilationBlock, self).__init__()
-
- self.conv_blocks = nn.ModuleList()
- for i in range(4):
- self.conv_blocks.append(conv_block(in_channels, in_channels, kernel_size, dilation=dilation[i], bias=bias))
- self.conv_fusion = spectral_norm(
- nn.Conv2d(
- in_channels * 4,
- in_channels,
- kernel_size=kernel_size,
- stride=1,
- padding=(kernel_size - 1) // 2,
- bias=bias))
-
- def forward(self, x):
- out = []
- for i in range(4):
- out.append(self.conv_blocks[i](x))
- out = torch.cat(out, 1)
- out = self.conv_fusion(out) + x
- return out
-
-
-class UpResBlock(nn.Module):
-
- def __init__(self, in_channel):
- super(UpResBlock, self).__init__()
- self.body = nn.Sequential(
- nn.Conv2d(in_channel, in_channel, 3, 1, 1),
- nn.LeakyReLU(0.2, True),
- nn.Conv2d(in_channel, in_channel, 3, 1, 1),
- )
-
- def forward(self, x):
- out = x + self.body(x)
- return out
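`calc_mean_std` and `adaptive_instance_normalization` above implement AdaIN: each channel of the content feature is whitened, then rescaled to the style feature's per-channel mean and standard deviation. The same computation can be sketched in NumPy for intuition (population variance, so it matches `torch.Tensor.var` only approximately):

```python
import numpy as np

def adain(content: np.ndarray, style: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """AdaIN on (N, C, H, W) arrays: give content the channel statistics of style."""
    def mean_std(feat):
        n, c = feat.shape[:2]
        flat = feat.reshape(n, c, -1)
        mean = flat.mean(axis=2).reshape(n, c, 1, 1)
        std = np.sqrt(flat.var(axis=2) + eps).reshape(n, c, 1, 1)
        return mean, std

    c_mean, c_std = mean_std(content)
    s_mean, s_std = mean_std(style)
    # Whiten per channel, then rescale to the style statistics.
    return (content - c_mean) / c_std * s_std + s_mean

rng = np.random.default_rng(0)
content = rng.standard_normal((1, 3, 8, 8))
style = rng.standard_normal((1, 3, 8, 8)) * 2.0 + 1.0
out = adain(content, style)
```

After the transfer, each output channel carries the style feature's mean and standard deviation while keeping the content feature's spatial pattern.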
diff --git a/spaces/marinap/multilingual-image-search/app.py b/spaces/marinap/multilingual-image-search/app.py
deleted file mode 100644
index fe4a9df4c0b51aa1d53d6e4581c071c588ed1b2d..0000000000000000000000000000000000000000
--- a/spaces/marinap/multilingual-image-search/app.py
+++ /dev/null
@@ -1,104 +0,0 @@
-import io
-import requests
-import numpy as np
-import pandas as pd
-import torch
-import torch.nn.functional as F
-from PIL import Image
-import gradio as gr
-import uform
-from datetime import datetime
-
-
-
-model_multi = uform.get_model('unum-cloud/uform-vl-multilingual')
-
-embeddings = np.load('tensors/embeddings.npy')
-embeddings = torch.tensor(embeddings)
-
-#features = np.load('multilingual-image-search/tensors/features.npy')
-#features = torch.tensor(features)
-
-img_df = pd.read_csv('image_data.csv')
-
-def url2img(url, resize = False, fix_height = 150):
- data = requests.get(url, allow_redirects = True).content
- img = Image.open(io.BytesIO(data))
- if resize:
- img.thumbnail([fix_height, fix_height], Image.LANCZOS)
- return img
-
-def find_topk(text):
-
- print('text', text)
-
- top_k = 20
-
- text_data = model_multi.preprocess_text(text)
- text_features, text_embedding = model_multi.encode_text(text_data, return_features=True)
-
- print('Got features', datetime.now().strftime("%H:%M:%S"))
-
- sims = F.cosine_similarity(text_embedding, embeddings)
-
- vals, inds = sims.topk(top_k)
- top_k_urls = img_df.iloc[inds]['photo_image_url'].values
-
- print('Got top_k_urls', top_k_urls)
- print(datetime.now().strftime("%H:%M:%S"))
-
- return top_k_urls
-
-
-
-# def rerank(text_features, text_data):
-
-# # create joint embeddings & get scores
-# joint_embedding = model_multi.encode_multimodal(
-# image_features=image_features,
-# text_features=text_features,
-# attention_mask=text_data['attention_mask']
-# )
-# score = model_multi.get_matching_scores(joint_embedding)
-
-# # argmax to get top N
-
-# return
-
-
-#demo = gr.Interface(find_topk, inputs = 'text', outputs = 'image')
-
-print('version', gr.__version__)
-
-with gr.Blocks(theme=gr.themes.Soft()) as demo:
- gr.Markdown('# Enter a prompt in one of the supported languages.')
- with gr.Row():
- with gr.Column():
- gr.Markdown(
- '||||||\n'
- '|:-------: |:---: |:-------: |:---: | :--- |\n'
- '|__English__| # |__French__ | # |__Russian__|\n'
- '|__German__ | # |__Italian__ | # |__Chinese (Simplified)__|\n'
- '|__Spanish__| # |__Japanese__| # |__Korean__|\n'
- '|__Turkish__| # |__Polish__ | # |.|\n')
-
- with gr.Column():
- prompt_box = gr.Textbox(label = 'Enter your prompt', lines = 3, container = True)
- btn_search = gr.Button("Find images")
-
- with gr.Row():
- gr.Examples(['a girl wandering alone in the forest',
- 'морозное утро в городе',
- '카메라를 바라보는 강아지 새끼',
- 'ein Schloss, das zwischen modernen Gebäuden hervorlugt',
- 'un couple sirotant un café au bord de la rivière',
- 'una banda de música actuando en un gran espacio al aire libre',
- '秋の静かな霧の庭園'
- ], inputs=[prompt_box])
-
-
- gallery = gr.Gallery().style(grid = [5], height="auto")
- btn_search.click(find_topk, inputs = prompt_box, outputs = gallery)
-
-if __name__ == "__main__":
- demo.launch()
\ No newline at end of file
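`find_topk` above reduces to one operation: cosine similarity between the text embedding and every precomputed image embedding, followed by a top-k over the scores. The ranking step in isolation (a NumPy sketch with stand-in embeddings; the real app uses `F.cosine_similarity` and `Tensor.topk`):

```python
import numpy as np

def topk_cosine(query: np.ndarray, embeddings: np.ndarray, k: int):
    """Return (scores, indices) of the k rows of `embeddings` closest to `query`."""
    q = query / np.linalg.norm(query)
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = e @ q                   # cosine similarity per row
    inds = np.argsort(-sims)[:k]   # indices of the k largest scores
    return sims[inds], inds

emb = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
scores, inds = topk_cosine(np.array([1.0, 0.1]), emb, k=2)
```

The returned indices play the role of `inds` in the app, which are then mapped to `photo_image_url` values via `img_df.iloc`.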
diff --git a/spaces/mascIT/AgeGuesser/yolov5/__init__.py b/spaces/mascIT/AgeGuesser/yolov5/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/spaces/matthoffner/chatbot/utils/server/google.ts b/spaces/matthoffner/chatbot/utils/server/google.ts
deleted file mode 100644
index ce800415c06a7f78bb298733411b6df94aaebf3b..0000000000000000000000000000000000000000
--- a/spaces/matthoffner/chatbot/utils/server/google.ts
+++ /dev/null
@@ -1,9 +0,0 @@
-export const cleanSourceText = (text: string) => {
- return text
- .trim()
- .replace(/(\n){4,}/g, '\n\n\n')
- .replace(/\n\n/g, ' ')
- .replace(/ {3,}/g, ' ')
- .replace(/\t/g, '')
- .replace(/\n+(\s*\n)*/g, '\n');
-};
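`cleanSourceText` normalizes scraped page text with a chain of passes: trim, cap runs of four or more newlines at three, turn double newlines into spaces, squeeze runs of three or more spaces, drop tabs, then collapse the remaining blank-line runs to a single newline. The same pipeline ported to Python (a sketch mirroring the regexes above):

```python
import re

def clean_source_text(text: str) -> str:
    """Python port of the cleanSourceText regex chain above."""
    text = text.strip()
    text = re.sub(r"(\n){4,}", "\n\n\n", text)   # cap huge newline runs
    text = re.sub(r"\n\n", " ", text)            # double newlines become spaces
    text = re.sub(r" {3,}", " ", text)           # squeeze space runs
    text = text.replace("\t", "")                # drop tabs
    text = re.sub(r"\n+(\s*\n)*", "\n", text)    # collapse remaining blank lines
    return text

cleaned = clean_source_text("a\n\n\n\n\nb\n\nc   d\te")
```

Note the passes interact: the five-newline run is first capped at three, and the leading pair of that run is then converted to a space before the final blank-line collapse.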
diff --git a/spaces/matthoffner/open-codetree/store/features/compilerSlice.ts b/spaces/matthoffner/open-codetree/store/features/compilerSlice.ts
deleted file mode 100644
index dce35e415af542fc502b9970f0dc2ab72f046d3d..0000000000000000000000000000000000000000
--- a/spaces/matthoffner/open-codetree/store/features/compilerSlice.ts
+++ /dev/null
@@ -1,107 +0,0 @@
-import { createSlice } from "@reduxjs/toolkit";
-import { RootState } from "../store";
-import * as esbuild from "esbuild-wasm";
-import { unpkgFetchPlugin, unpkgPathPlugin } from "../../esbuild/plugins";
-import { CompilerOutput, CompilerStatus } from "../../_types/compilerTypes";
-
-type InitialStateType = {
- isInitializing: boolean;
- esbuildStatus: CompilerStatus;
- isCompiling: boolean;
- output: CompilerOutput;
-};
-
-const initialState = {
- isInitializing: false,
- esbuildStatus: { isReady: false, error: "" },
- isCompiling: false,
- output: { code: "", error: "" },
-};
-
-export const compilerSlice = createSlice({
- name: "compiler",
- initialState: initialState,
- reducers: {
- init_esbuild: (state: InitialStateType) => {
- state.isInitializing = true;
- },
- init_esbuild_success: (state: InitialStateType) => {
- state.esbuildStatus.isReady = true;
- state.esbuildStatus.error = "";
- state.isInitializing = false;
- },
- init_esbuild_failure: (state: InitialStateType, { payload }) => {
- state.esbuildStatus.isReady = false;
- state.esbuildStatus.error = payload;
- state.isInitializing = false;
- },
- compiled: (state: InitialStateType) => {
- state.isCompiling = true;
- },
- compiled_success: (state: InitialStateType, { payload }) => {
- state.output.code = payload;
- state.output.error = "";
- state.isCompiling = false;
- },
- compiled_failure: (state: InitialStateType, { payload }) => {
- state.output.code = "";
- state.output.error = payload;
- state.isCompiling = false;
- },
- },
-});
-
-export const {
- compiled,
- compiled_success,
- compiled_failure,
- init_esbuild,
- init_esbuild_success,
- init_esbuild_failure,
-} = compilerSlice.actions;
-
-export const compiler_state = (state: RootState) => state.compiler;
-
-export default compilerSlice.reducer;
-
-// Asynchronous thunk action
-
-export function initEsbuild() {
- return async (dispatch: any) => {
- dispatch(init_esbuild());
-
- await esbuild
- .initialize({
- worker: true,
- wasmURL: "https://unpkg.com/esbuild-wasm@0.14.42/esbuild.wasm",
- })
- .then(() => {
- dispatch(init_esbuild_success());
- })
- .catch((error) => dispatch(init_esbuild_failure(error.message)));
- };
-}
-
-export function getCompileCode(rawCode: string, entryPoint: string) {
- return async (dispatch: any) => {
- dispatch(compiled());
-
- try {
- const result = await esbuild.build({
- entryPoints: [`${entryPoint}`],
- bundle: true,
- write: false,
- minify: true,
- outdir: "/",
- plugins: [unpkgPathPlugin(), unpkgFetchPlugin(rawCode, entryPoint)],
- metafile: true,
- allowOverwrite: true,
- });
-
- dispatch(compiled_success(result.outputFiles[0].text));
- } catch (error) {
- // @ts-ignore
- dispatch(compiled_failure(error.message));
- }
- };
-}
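`compilerSlice` above is a small state machine: the `init_esbuild*` actions settle the esbuild status, the `compiled*` actions settle the build output, and each success/failure action clears its in-flight flag. The transitions can be modelled as a pure reducer (a Python sketch of the logic, not the Redux Toolkit API; the nested status objects are flattened into plain keys for brevity):

```python
def reducer(state: dict, action: str, payload: str = "") -> dict:
    """Pure-function version of the compilerSlice transitions above."""
    s = dict(state)  # never mutate the incoming state
    if action == "init_esbuild":
        s["isInitializing"] = True
    elif action == "init_esbuild_success":
        s.update(isInitializing=False, esbuildReady=True, esbuildError="")
    elif action == "init_esbuild_failure":
        s.update(isInitializing=False, esbuildReady=False, esbuildError=payload)
    elif action == "compiled":
        s["isCompiling"] = True
    elif action == "compiled_success":
        s.update(isCompiling=False, code=payload, error="")
    elif action == "compiled_failure":
        s.update(isCompiling=False, code="", error=payload)
    return s

state = {"isInitializing": False, "esbuildReady": False, "esbuildError": "",
         "isCompiling": False, "code": "", "error": ""}
state = reducer(state, "init_esbuild")
state = reducer(state, "init_esbuild_success")
```

As in the slice, a failure action stores the error message and blanks the opposing field, so consumers can branch on whichever is non-empty.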
diff --git a/spaces/merle/PROTEIN_GENERATOR/app.py b/spaces/merle/PROTEIN_GENERATOR/app.py
deleted file mode 100644
index ae13dae14658ad82e25823376049125617b31f27..0000000000000000000000000000000000000000
--- a/spaces/merle/PROTEIN_GENERATOR/app.py
+++ /dev/null
@@ -1,611 +0,0 @@
-import os,sys
-
-# install environment goods
-#os.system("pip -q install dgl -f https://data.dgl.ai/wheels/cu113/repo.html")
-os.system('pip install dgl==1.0.2+cu116 -f https://data.dgl.ai/wheels/cu116/repo.html')
-#os.system('pip install gradio')
-os.environ["DGLBACKEND"] = "pytorch"
-#os.system(f'pip install -r ./PROTEIN_GENERATOR/requirements.txt')
-print('Modules installed')
-
-os.system('pip install --force gradio==3.36.1')
-os.system('pip install gradio_client==0.2.7')
-
-os.environ["DGLBACKEND"] = "pytorch"
-
-if not os.path.exists('./SEQDIFF_230205_dssp_hotspots_25mask_EQtasks_mod30.pt'):
- print('Downloading model weights 1')
- os.system('wget http://files.ipd.uw.edu/pub/sequence_diffusion/checkpoints/SEQDIFF_230205_dssp_hotspots_25mask_EQtasks_mod30.pt')
- print('Successfully Downloaded')
-
-if not os.path.exists('./SEQDIFF_221219_equalTASKS_nostrSELFCOND_mod30.pt'):
- print('Downloading model weights 2')
- os.system('wget http://files.ipd.uw.edu/pub/sequence_diffusion/checkpoints/SEQDIFF_221219_equalTASKS_nostrSELFCOND_mod30.pt')
- print('Successfully Downloaded')
-
-import numpy as np
-import gradio as gr
-import py3Dmol
-from io import StringIO
-import json
-import secrets
-import copy
-import matplotlib.pyplot as plt
-from utils.sampler import HuggingFace_sampler
-from utils.parsers_inference import parse_pdb
-from model.util import writepdb
-from utils.inpainting_util import *
-
-
-plt.rcParams.update({'font.size': 13})
-
-with open('./tmp/args.json','r') as f:
- args = json.load(f)
-
-# manually set checkpoint to load
-args['checkpoint'] = None
-args['dump_trb'] = False
-args['dump_args'] = True
-args['save_best_plddt'] = True
-args['T'] = 25
-args['strand_bias'] = 0.0
-args['loop_bias'] = 0.0
-args['helix_bias'] = 0.0
-
-
-
-def protein_diffusion_model(sequence, seq_len, helix_bias, strand_bias, loop_bias,
- secondary_structure, aa_bias, aa_bias_potential,
- #target_charge, target_ph, charge_potential,
- num_steps, noise, hydrophobic_target_score, hydrophobic_potential,
- contigs, pssm, seq_mask, str_mask, rewrite_pdb):
-
- dssp_checkpoint = './SEQDIFF_230205_dssp_hotspots_25mask_EQtasks_mod30.pt'
- og_checkpoint = './SEQDIFF_221219_equalTASKS_nostrSELFCOND_mod30.pt'
-
- model_args = copy.deepcopy(args)
-
- # make sampler
- S = HuggingFace_sampler(args=model_args)
-
- # get random prefix
- S.out_prefix = './tmp/'+secrets.token_hex(nbytes=10).upper()
-
- # set args
- S.args['checkpoint'] = None
- S.args['dump_trb'] = False
- S.args['dump_args'] = True
- S.args['save_best_plddt'] = True
- S.args['T'] = 20
- S.args['strand_bias'] = 0.0
- S.args['loop_bias'] = 0.0
- S.args['helix_bias'] = 0.0
- S.args['potentials'] = None
- S.args['potential_scale'] = None
- S.args['aa_composition'] = None
-
-
- # get sequence if entered and make sure all chars are valid
- alt_aa_dict = {'B':['D','N'],'J':['I','L'],'U':['C'],'Z':['E','Q'],'O':['K']}
- if sequence not in ['',None]:
- L = len(sequence)
- aa_seq = []
- for aa in sequence.upper():
- if aa in alt_aa_dict.keys():
- aa_seq.append(np.random.choice(alt_aa_dict[aa]))
- else:
- aa_seq.append(aa)
-
- S.args['sequence'] = aa_seq
- elif contigs not in ['',None]:
- S.args['contigs'] = [contigs]
- else:
- S.args['contigs'] = [f'{seq_len}']
- L = int(seq_len)
-
- print('DEBUG: ',rewrite_pdb)
- if rewrite_pdb not in ['',None]:
- S.args['pdb'] = rewrite_pdb.name
-
- if seq_mask not in ['',None]:
- S.args['inpaint_seq'] = [seq_mask]
- if str_mask not in ['',None]:
- S.args['inpaint_str'] = [str_mask]
-
- if secondary_structure in ['',None]:
- secondary_structure = None
- else:
- secondary_structure = ''.join(['E' if x == 'S' else x for x in secondary_structure])
- if L < len(secondary_structure):
-            secondary_structure = secondary_structure[:L]
- elif L == len(secondary_structure):
- pass
- else:
- dseq = L - len(secondary_structure)
- secondary_structure += secondary_structure[-1]*dseq
-
-
- # potentials
- potential_list = []
- potential_bias_list = []
-
- if aa_bias not in ['',None]:
- potential_list.append('aa_bias')
- S.args['aa_composition'] = aa_bias
- if aa_bias_potential in ['',None]:
- aa_bias_potential = 3
- potential_bias_list.append(str(aa_bias_potential))
- '''
- if target_charge not in ['',None]:
- potential_list.append('charge')
- if charge_potential in ['',None]:
- charge_potential = 1
- potential_bias_list.append(str(charge_potential))
- S.args['target_charge'] = float(target_charge)
- if target_ph in ['',None]:
- target_ph = 7.4
- S.args['target_pH'] = float(target_ph)
- '''
-
- if hydrophobic_target_score not in ['',None]:
- potential_list.append('hydrophobic')
- S.args['hydrophobic_score'] = float(hydrophobic_target_score)
- if hydrophobic_potential in ['',None]:
- hydrophobic_potential = 3
- potential_bias_list.append(str(hydrophobic_potential))
-
- if pssm not in ['',None]:
- potential_list.append('PSSM')
- potential_bias_list.append('5')
- S.args['PSSM'] = pssm.name
-
-
- if len(potential_list) > 0:
- S.args['potentials'] = ','.join(potential_list)
- S.args['potential_scale'] = ','.join(potential_bias_list)
-
-
- # normalise secondary_structure bias from range 0-0.3
- S.args['secondary_structure'] = secondary_structure
- S.args['helix_bias'] = helix_bias
- S.args['strand_bias'] = strand_bias
- S.args['loop_bias'] = loop_bias
-
- # set T
- if num_steps in ['',None]:
- S.args['T'] = 20
- else:
- S.args['T'] = int(num_steps)
-
- # noise
- if 'normal' in noise:
- S.args['sample_distribution'] = noise
- S.args['sample_distribution_gmm_means'] = [0]
- S.args['sample_distribution_gmm_variances'] = [1]
- elif 'gmm2' in noise:
- S.args['sample_distribution'] = noise
- S.args['sample_distribution_gmm_means'] = [-1,1]
- S.args['sample_distribution_gmm_variances'] = [1,1]
- elif 'gmm3' in noise:
- S.args['sample_distribution'] = noise
- S.args['sample_distribution_gmm_means'] = [-1,0,1]
- S.args['sample_distribution_gmm_variances'] = [1,1,1]
-
-
-
- if secondary_structure not in ['',None] or helix_bias+strand_bias+loop_bias > 0:
- S.args['checkpoint'] = dssp_checkpoint
- S.args['d_t1d'] = 29
- print('using dssp checkpoint')
- else:
- S.args['checkpoint'] = og_checkpoint
- S.args['d_t1d'] = 24
- print('using og checkpoint')
-
-
- for k,v in S.args.items():
- print(f"{k} --> {v}")
-
- # init S
- S.model_init()
- S.diffuser_init()
- S.setup()
-
- # sampling loop
- plddt_data = []
- for j in range(S.max_t):
- output_seq, output_pdb, plddt = S.take_step_get_outputs(j)
- plddt_data.append(plddt)
- yield output_seq, output_pdb, display_pdb(output_pdb), get_plddt_plot(plddt_data, S.max_t)
-
- output_seq, output_pdb, plddt = S.get_outputs()
-
- yield output_seq, output_pdb, display_pdb(output_pdb), get_plddt_plot(plddt_data, S.max_t)
-
-def get_plddt_plot(plddt_data, max_t):
- x = [i+1 for i in range(len(plddt_data))]
- fig, ax = plt.subplots(figsize=(15,6))
- ax.plot(x,plddt_data,color='#661dbf', linewidth=3,marker='o')
- ax.set_xticks([i+1 for i in range(max_t)])
- ax.set_yticks([(i+1)/10 for i in range(10)])
- ax.set_ylim([0,1])
- ax.set_ylabel('model confidence (plddt)')
- ax.set_xlabel('diffusion steps (t)')
- return fig
-
-def display_pdb(path_to_pdb):
-    '''
-    Display a pdb file in a py3Dmol viewer.
-    '''
-    with open(path_to_pdb, "r") as f:
-        pdb = f.read()
-
- view = py3Dmol.view(width=500, height=500)
- view.addModel(pdb, "pdb")
- view.setStyle({'model': -1}, {"cartoon": {'colorscheme':{'prop':'b','gradient':'roygb','min':0,'max':1}}})#'linear', 'min': 0, 'max': 1, 'colors': ["#ff9ef0","#a903fc",]}}})
- view.zoomTo()
- output = view._make_html().replace("'", '"')
- print(view._make_html())
- x = f""" {output} """ # do not use ' in this input
-
- return f""""""
-
-'''
- return f""""""
-'''
-
-
-
-# MOTIF SCAFFOLDING
-def get_motif_preview(pdb_id, contigs):
-    '''
-    Display the selected motif in a py3Dmol viewer.
-    '''
- input_pdb = fetch_pdb(pdb_id=pdb_id.lower())
-
- # rewrite pdb
- parse = parse_pdb(input_pdb)
- #output_name = './rewrite_'+input_pdb.split('/')[-1]
- #writepdb(output_name, torch.tensor(parse_og['xyz']),torch.tensor(parse_og['seq']))
- #parse = parse_pdb(output_name)
- output_name = input_pdb
-
-    with open(output_name, "r") as f:
-        pdb = f.read()
- view = py3Dmol.view(width=500, height=500)
- view.addModel(pdb, "pdb")
-
- if contigs in ['',0]:
- contigs = ['0']
- else:
- contigs = [contigs]
-
- print('DEBUG: ',contigs)
-
- pdb_map = get_mappings(ContigMap(parse,contigs))
- print('DEBUG: ',pdb_map)
- print('DEBUG: ',pdb_map['con_ref_idx0'])
- roi = [x[1]-1 for x in pdb_map['con_ref_pdb_idx']]
-
- colormap = {0:'#D3D3D3', 1:'#F74CFF'}
- colors = {i+1: colormap[1] if i in roi else colormap[0] for i in range(parse['xyz'].shape[0])}
- view.setStyle({"cartoon": {"colorscheme": {"prop": "resi", "map": colors}}})
- view.zoomTo()
- output = view._make_html().replace("'", '"')
- print(view._make_html())
- x = f""" {output} """ # do not use ' in this input
-
- return f"""""", output_name
-
-def fetch_pdb(pdb_id=None):
- if pdb_id is None or pdb_id == "":
- return None
- else:
- os.system(f"wget -qnc https://files.rcsb.org/view/{pdb_id}.pdb")
- return f"{pdb_id}.pdb"
-
-# MSA AND PSSM GUIDANCE
-def save_pssm(file_upload):
- filename = file_upload.name
- orig_name = file_upload.orig_name
- if filename.split('.')[-1] in ['fasta', 'a3m']:
- return msa_to_pssm(file_upload)
- return filename
-
-def msa_to_pssm(msa_file):
- # Define the lookup table for converting amino acids to indices
- aa_to_index = {'A': 0, 'R': 1, 'N': 2, 'D': 3, 'C': 4, 'Q': 5, 'E': 6, 'G': 7, 'H': 8, 'I': 9, 'L': 10,
- 'K': 11, 'M': 12, 'F': 13, 'P': 14, 'S': 15, 'T': 16, 'W': 17, 'Y': 18, 'V': 19, 'X': 20, '-': 21}
- # Open the FASTA file and read the sequences
- records = list(SeqIO.parse(msa_file.name, "fasta"))
-
-    assert len(records) >= 1, "MSA must contain at least one protein sequence."
-
- first_seq = str(records[0].seq)
- aligned_seqs = [first_seq]
- # print(aligned_seqs)
- # Perform sequence alignment using the Needleman-Wunsch algorithm
- aligner = Align.PairwiseAligner()
- aligner.open_gap_score = -0.7
- aligner.extend_gap_score = -0.3
- for record in records[1:]:
- alignment = aligner.align(first_seq, str(record.seq))[0]
- alignment = alignment.format().split("\n")
- al1 = alignment[0]
- al2 = alignment[2]
- al1_fin = ""
- al2_fin = ""
- percent_gap = al2.count('-')/ len(al2)
- if percent_gap > 0.4:
- continue
- for i in range(len(al1)):
- if al1[i] != '-':
- al1_fin += al1[i]
- al2_fin += al2[i]
- aligned_seqs.append(str(al2_fin))
- # Get the length of the aligned sequences
- aligned_seq_length = len(first_seq)
- # Initialize the position scoring matrix
- matrix = np.zeros((22, aligned_seq_length))
- # Iterate through the aligned sequences and count the amino acids at each position
- for seq in aligned_seqs:
- #print(seq)
- for i in range(aligned_seq_length):
- if i == len(seq):
- break
- amino_acid = seq[i]
- if amino_acid.upper() not in aa_to_index.keys():
- continue
- else:
- aa_index = aa_to_index[amino_acid.upper()]
- matrix[aa_index, i] += 1
- # Normalize the counts to get the frequency of each amino acid at each position
- matrix /= len(aligned_seqs)
- print(len(aligned_seqs))
- matrix[20:,]=0
-
- outdir = ".".join(msa_file.name.split('.')[:-1]) + ".csv"
- np.savetxt(outdir, matrix[:21,:].T, delimiter=",")
- return outdir
-
-def get_pssm(fasta_msa, input_pssm):
-
- if input_pssm not in ['',None]:
- outdir = input_pssm.name
- else:
- outdir = save_pssm(fasta_msa)
-
- pssm = np.loadtxt(outdir, delimiter=",", dtype=float)
- fig, ax = plt.subplots(figsize=(15,6))
- plt.imshow(torch.permute(torch.tensor(pssm),(1,0)))
-
- return fig, outdir
-
-
-#toggle options
-def toggle_seq_input(choice):
- if choice == "protein length":
- return gr.update(visible=True, value=None), gr.update(visible=False, value=None)
- elif choice == "custom sequence":
- return gr.update(visible=False, value=None), gr.update(visible=True, value=None)
-
-def toggle_secondary_structure(choice):
- if choice == "sliders":
- return gr.update(visible=True, value=None),gr.update(visible=True, value=None),gr.update(visible=True, value=None),gr.update(visible=False, value=None)
- elif choice == "explicit":
- return gr.update(visible=False, value=None),gr.update(visible=False, value=None),gr.update(visible=False, value=None),gr.update(visible=True, value=None)
-
-
-# Define the Gradio interface
-with gr.Blocks(theme='ParityError/Interstellar') as demo:
-
- gr.Markdown(f"""# Protein Generation via Diffusion in Sequence Space""")
-
- with gr.Row():
- with gr.Column(min_width=500):
- gr.Markdown(f"""
- ## How does it work?\n
- --- [PREPRINT](https://biorxiv.org/content/10.1101/2023.05.08.539766v1) ---
-    Protein sequence and structure co-generation is a long-standing problem in the field of protein design. By implementing [DDPM](https://arxiv.org/abs/2006.11239)-style diffusion over protein sequence space, we generate protein sequence and structure pairs. Starting with [RoseTTAFold](https://www.science.org/doi/10.1126/science.abj8754), a protein structure prediction network, we fine-tuned it to predict sequence and structure given a partially noised sequence. By applying losses to both the predicted sequence and structure, the model is forced to generate meaningful pairs. Diffusing in sequence space makes it easy to implement potentials to guide the diffusive process toward a particular amino acid composition, net charge, and more! Furthermore, you can sample proteins from a family of sequences or even train a small sequence-to-function classifier to guide generation toward desired sequences.
- 
-
- ## How to use it?\n
-    A user can either design a custom input sequence to diffuse from or specify a length below. To scaffold a sequence, use the following format, where X represents residues to diffuse: XXXXXXXXSCIENCESCIENCEXXXXXXXXXXXXXXXXXXX. You can even design a protein with your name: XXXXXXXXXXXXNAMEHEREXXXXXXXXXXXXX!
-
- ### Acknowledgements\n
- Thank you to Simon Dürr and the Hugging Face team for setting us up with a community GPU grant!
- """)
-
- gr.Markdown("""
- ## Model in Action
- 
- """)
-
- with gr.Row().style(equal_height=False):
- with gr.Column():
- with gr.Tabs():
- with gr.TabItem("Inputs"):
- gr.Markdown("""## INPUTS""")
- gr.Markdown("""#### Start Sequence
- Specify the protein length for complete unconditional generation, or scaffold a motif (or your name) using the custom sequence input""")
- seq_opt = gr.Radio(["protein length","custom sequence"], label="How would you like to specify the starting sequence?", value='protein length')
-
- sequence = gr.Textbox(label="custom sequence", lines=1, placeholder='AMINO ACIDS: A,C,D,E,F,G,H,I,K,L,M,N,P,Q,R,S,T,V,W,Y\n MASK TOKEN: X', visible=False)
- seq_len = gr.Slider(minimum=5.0, maximum=250.0, label="protein length", value=100, visible=True)
-
- seq_opt.change(fn=toggle_seq_input,
- inputs=[seq_opt],
- outputs=[seq_len, sequence],
- queue=False)
-
- gr.Markdown("""### Optional Parameters""")
- with gr.Accordion(label='Secondary Structure',open=True):
-                        gr.Markdown("""Try changing the sliders or inputting explicit secondary structure conditioning for each residue""")
- sec_str_opt = gr.Radio(["sliders","explicit"], label="How would you like to specify secondary structure?", value='sliders')
-
-                        secondary_structure = gr.Textbox(label="secondary structure", lines=1, placeholder='HELIX = H  STRAND = S  LOOP = L  MASK = X (must be the same length as input sequence)', visible=False)
-
- with gr.Column():
- helix_bias = gr.Slider(minimum=0.0, maximum=0.05, label="helix bias", visible=True)
- strand_bias = gr.Slider(minimum=0.0, maximum=0.05, label="strand bias", visible=True)
- loop_bias = gr.Slider(minimum=0.0, maximum=0.20, label="loop bias", visible=True)
-
- sec_str_opt.change(fn=toggle_secondary_structure,
- inputs=[sec_str_opt],
- outputs=[helix_bias,strand_bias,loop_bias,secondary_structure],
- queue=False)
-
- with gr.Accordion(label='Amino Acid Compositional Bias',open=False):
-                        gr.Markdown("""Bias sequence composition toward particular amino acids by specifying the one-letter code followed by the fraction to bias. This can be input as a comma-separated list, for example: W0.2,E0.1""")
- with gr.Row():
- aa_bias = gr.Textbox(label="aa bias", lines=1, placeholder='specify one letter AA and fraction to bias, for example W0.1 or M0.1,K0.1' )
-                            aa_bias_potential = gr.Textbox(label="aa bias scale", lines=1, placeholder='AA Bias potential scale (recommended range 1.0-5.0)')
-
- '''
- with gr.Accordion(label='Charge Bias',open=False):
- gr.Markdown("""Bias for a specified net charge at a particular pH using the boxes below""")
- with gr.Row():
- target_charge = gr.Textbox(label="net charge", lines=1, placeholder='net charge to target')
- target_ph = gr.Textbox(label="pH", lines=1, placeholder='pH at which net charge is desired')
-                            charge_potential = gr.Textbox(label="charge potential scale", lines=1, placeholder='charge potential scale (recommended range 1.0-5.0)')
- '''
-
- with gr.Accordion(label='Hydrophobic Bias',open=False):
-                        gr.Markdown("""Bias for or against hydrophobic composition. To get more soluble proteins, bias away with a negative target score (e.g. -5)""")
- with gr.Row():
-                            hydrophobic_target_score = gr.Textbox(label="hydrophobic score", lines=1, placeholder='hydrophobic score to target (negative score is good for solubility)')
-                            hydrophobic_potential = gr.Textbox(label="hydrophobic potential scale", lines=1, placeholder='hydrophobic potential scale (recommended range 1.0-2.0)')
-
- with gr.Accordion(label='Diffusion Params',open=False):
-                        gr.Markdown("""Increasing T to more steps can be helpful for harder design challenges; sampling from different noise distributions can change the sequence and structural composition""")
- with gr.Row():
- num_steps = gr.Textbox(label="T", lines=1, placeholder='number of diffusion steps (25 or less will speed things up)')
- noise = gr.Dropdown(['normal','gmm2 [-1,1]','gmm3 [-1,0,1]'], label='noise type', value='normal')
-
- with gr.TabItem("Motif Selection"):
-
- gr.Markdown("""### Motif Selection Preview""")
-                    gr.Markdown('Contigs explained: to grab residues (seq and str) from a PDB chain, provide the chain letter followed by a range of residues as indexed in the PDB file. For example, A3-10 selects residues 3-10 on chain A (the chain always needs to be specified). To add diffused residues on either side of this motif, specify a range or a discrete value without a chain letter in front. To add 15 residues before the motif and 20-30 residues (randomly sampled) after it, use the following syntax: 15,A3-10,20-30. Commas separate regions selected from the PDB and designed (diffused) residues to be added.')
- pdb_id_code = gr.Textbox(label="PDB ID", lines=1, placeholder='INPUT PDB ID TO FETCH (ex. 1DPX)', visible=True)
- contigs = gr.Textbox(label="contigs", lines=1, placeholder='specify contigs to grab particular residues from pdb ()', visible=True)
-                    gr.Markdown('Using the same contig syntax, the seq or str of input motif residues can be masked, allowing the model to hold structure fixed and design sequence, or vice versa')
- with gr.Row():
- seq_mask = gr.Textbox(label='seq mask',lines=1,placeholder='input residues to mask sequence')
- str_mask = gr.Textbox(label='str mask',lines=1,placeholder='input residues to mask structure')
- preview_viewer = gr.HTML()
- rewrite_pdb = gr.File(label='PDB file')
- preview_btn = gr.Button("Preview Motif")
-
- with gr.TabItem("MSA to PSSM"):
- gr.Markdown("""### MSA to PSSM Generation""")
-                    gr.Markdown('Input either an MSA or a PSSM to guide the model toward generating samples within your family of interest')
- with gr.Row():
- fasta_msa = gr.File(label='MSA')
- input_pssm = gr.File(label='PSSM (.csv)')
- pssm = gr.File(label='Generated PSSM')
- pssm_view = gr.Plot(label='PSSM Viewer')
- pssm_gen_btn = gr.Button("Generate PSSM")
-
-
- btn = gr.Button("GENERATE")
-
- #with gr.Row():
- with gr.Column():
- gr.Markdown("""## OUTPUTS""")
- gr.Markdown("""#### Confidence score for generated structure at each timestep""")
- plddt_plot = gr.Plot(label='plddt at step t')
-            gr.Markdown("""#### Output protein sequence""")
- output_seq = gr.Textbox(label="sequence")
- gr.Markdown("""#### Download PDB file""")
- output_pdb = gr.File(label="PDB file")
- gr.Markdown("""#### Structure viewer""")
- output_viewer = gr.HTML()
-
- gr.Markdown("""### Don't know where to get started? Click on an example below to try it out!""")
- gr.Examples(
- [["","125",0.0,0.0,0.2,"","","","20","normal",'','','',None,'','',None],
- ["","100",0.0,0.0,0.0,"","W0.2","2","20","normal",'','','',None,'','',None],
- ["","100",0.0,0.0,0.0,
- "XXHHHHHHHHHXXXXXXXHHHHHHHHHXXXXXXXHHHHHHHHXXXXSSSSSSSSSSSXXXXXXXXSSSSSSSSSSSSXXXXXXXSSSSSSSSSXXXXXXX",
- "","","25","normal",'','','',None,'','',None],
- ["XXXXXXXXXXXXXXXXXXXXXXXXXIPDXXXXXXXXXXXXXXXXXXXXXXPEPSEQXXXXXXXXXXXXXXXXXXXXXXXXXXIPDXXXXXXXXXXXXXXXXXXX",
- "",0.0,0.0,0.0,"","","","25","normal",'','','',None,'','',None],
- ["","",0.0,0.0,0.0,"","","","25","normal",'','',
- '9,D10-11,8,D20-20,4,D25-35,65,D101-101,2,D104-105,8,D114-116,15,D132-138,6,D145-145,2,D148-148,12,D161-161,3',
- './tmp/PSSM_lysozyme.csv',
- 'D25-25,D27-31,D33-35,D132-137',
- 'D26-26','./tmp/150l.pdb']
-
- ],
- inputs=[sequence,
- seq_len,
- helix_bias,
- strand_bias,
- loop_bias,
- secondary_structure,
- aa_bias,
- aa_bias_potential,
- #target_charge,
- #target_ph,
- #charge_potential,
- num_steps,
- noise,
- hydrophobic_target_score,
- hydrophobic_potential,
- contigs,
- pssm,
- seq_mask,
- str_mask,
- rewrite_pdb],
- outputs=[output_seq,
- output_pdb,
- output_viewer,
- plddt_plot],
- fn=protein_diffusion_model,
- )
-
- preview_btn.click(get_motif_preview,[pdb_id_code, contigs],[preview_viewer, rewrite_pdb])
-
- pssm_gen_btn.click(get_pssm,[fasta_msa,input_pssm],[pssm_view, pssm])
-
- btn.click(protein_diffusion_model,
- [sequence,
- seq_len,
- helix_bias,
- strand_bias,
- loop_bias,
- secondary_structure,
- aa_bias,
- aa_bias_potential,
- #target_charge,
- #target_ph,
- #charge_potential,
- num_steps,
- noise,
- hydrophobic_target_score,
- hydrophobic_potential,
- contigs,
- pssm,
- seq_mask,
- str_mask,
- rewrite_pdb],
- [output_seq,
- output_pdb,
- output_viewer,
- plddt_plot])
-
-demo.queue()
-demo.launch(debug=True)
-
-
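The `msa_to_pssm` helper in the app above boils down to a column-wise frequency count over aligned sequences. A minimal, self-contained sketch of that core step (the names and the 20-letter toy alphabet here are illustrative; the app uses a 22-symbol table that also includes 'X' and '-'):

```python
import numpy as np

# Toy 20-letter amino-acid alphabet (illustrative; app adds 'X' and '-')
AA = "ARNDCQEGHILKMFPSTWYV"
AA_TO_IDX = {aa: i for i, aa in enumerate(AA)}

def pssm_from_alignment(aligned_seqs):
    """Column-wise amino-acid frequency matrix over an alignment."""
    length = len(aligned_seqs[0])
    matrix = np.zeros((len(AA), length))
    for seq in aligned_seqs:
        for i, aa in enumerate(seq[:length]):
            idx = AA_TO_IDX.get(aa.upper())
            if idx is not None:        # skip gaps and unknown symbols
                matrix[idx, i] += 1
    return matrix / len(aligned_seqs)  # normalise counts to frequencies

pssm = pssm_from_alignment(["ACD", "ACE", "GCD"])
# Column 1 is 'C' in every sequence, so its frequency there is 1.0.
```

The app additionally aligns each sequence to the first record and drops alignments with more than 40% gaps before counting.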
-
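For readers following the contig syntax described in the Motif Selection tab (e.g. `15,A3-10,20-30`), here is a rough sketch of how such a string can be tokenised. This is an illustrative parser only, not the app's actual `ContigMap`:

```python
import random
import re

def parse_contig(contig):
    """Split a contig string like '15,A3-10,20-30' into motif and diffused spans."""
    spans = []
    for token in contig.split(','):
        m = re.fullmatch(r'([A-Za-z])(\d+)-(\d+)', token)
        if m:  # PDB motif: chain letter followed by a residue range
            chain, start, end = m.group(1), int(m.group(2)), int(m.group(3))
            spans.append(('motif', chain, start, end))
        elif '-' in token:  # diffused length sampled from a range
            lo, hi = map(int, token.split('-'))
            spans.append(('diffuse', random.randint(lo, hi)))
        else:  # fixed diffused length
            spans.append(('diffuse', int(token)))
    return spans

spans = parse_contig('15,A3-10,20-30')
```

So `15,A3-10,20-30` becomes a fixed 15-residue diffused span, the motif A3-10, then a diffused span of 20-30 residues.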
diff --git a/spaces/merve/anonymization/public/third_party/topojson-client.js b/spaces/merve/anonymization/public/third_party/topojson-client.js
deleted file mode 100644
index 728070f185d11aa72b3f78ab88037275614fe89b..0000000000000000000000000000000000000000
--- a/spaces/merve/anonymization/public/third_party/topojson-client.js
+++ /dev/null
@@ -1,2 +0,0 @@
-// https://github.com/topojson/topojson-client v3.0.1 Copyright 2019 Mike Bostock
-!function(e,r){"object"==typeof exports&&"undefined"!=typeof module?r(exports):"function"==typeof define&&define.amd?define(["exports"],r):r((e=e||self).topojson=e.topojson||{})}(this,function(e){"use strict";function r(e){return e}function t(e){if(null==e)return r;var t,n,o=e.scale[0],a=e.scale[1],i=e.translate[0],c=e.translate[1];return function(e,r){r||(t=n=0);var u=2,f=e.length,s=new Array(f);for(s[0]=(t+=e[0])*o+i,s[1]=(n+=e[1])*a+c;ui&&(i=e[0]),e[1]c&&(c=e[1])}function f(e){switch(e.type){case"GeometryCollection":e.geometries.forEach(f);break;case"Point":u(e.coordinates);break;case"MultiPoint":e.coordinates.forEach(u)}}for(r in e.arcs.forEach(function(e){for(var r,t=-1,u=e.length;++ti&&(i=r[0]),r[1]c&&(c=r[1])}),e.objects)f(e.objects[r]);return[o,a,i,c]}function o(e,r){var t=r.id,n=r.bbox,o=null==r.properties?{}:r.properties,i=a(e,r);return null==t&&null==n?{type:"Feature",properties:o,geometry:i}:null==n?{type:"Feature",id:t,properties:o,geometry:i}:{type:"Feature",id:t,bbox:n,properties:o,geometry:i}}function a(e,r){var n=t(e.transform),o=e.arcs;function a(e,r){r.length&&r.pop();for(var t=o[e<0?~e:e],a=0,i=t.length;a1)n=function(e,r,t){var n,o=[],a=[];function i(e){var r=e<0?~e:e;(a[r]||(a[r]=[])).push({i:e,g:n})}function c(e){e.forEach(i)}function u(e){e.forEach(c)}return function e(r){switch(n=r,r.type){case"GeometryCollection":r.geometries.forEach(e);break;case"LineString":c(r.arcs);break;case"MultiLineString":case"Polygon":u(r.arcs);break;case"MultiPolygon":!function(e){e.forEach(u)}(r.arcs)}}(r),a.forEach(null==t?function(e){o.push(e[0].i)}:function(e){t(e[0].g,e[e.length-1].g)&&o.push(e[0].i)}),o}(0,r,t);else for(o=0,n=new Array(a=e.arcs.length);o1)for(var a,c,f=1,s=u(o[0]);fs&&(c=o[0],o[0]=o[f],o[f]=c,s=a);return o}).filter(function(e){return e.length>0})}}function f(e,r){for(var t=0,n=e.length;t>>1;e[o]=2))throw new Error("n must be ≥2");var 
t,o=(u=e.bbox||n(e))[0],a=u[1],i=u[2],c=u[3];r={scale:[i-o?(i-o)/(t-1):1,c-a?(c-a)/(t-1):1],translate:[o,a]}}var u,f,l=s(r),h=e.objects,p={};function g(e){return l(e)}function y(e){var r;switch(e.type){case"GeometryCollection":r={type:"GeometryCollection",geometries:e.geometries.map(y)};break;case"Point":r={type:"Point",coordinates:g(e.coordinates)};break;case"MultiPoint":r={type:"MultiPoint",coordinates:e.coordinates.map(g)};break;default:return e}return null!=e.id&&(r.id=e.id),null!=e.bbox&&(r.bbox=e.bbox),null!=e.properties&&(r.properties=e.properties),r}for(f in h)p[f]=y(h[f]);return{type:"Topology",bbox:u,transform:r,objects:p,arcs:e.arcs.map(function(e){var r,t=0,n=1,o=e.length,a=new Array(o);for(a[0]=l(e[0],0);++t {
- rv.headsProb = headsProb
- updateSliderPos()
-
-
- estimates.updateEstimates()
- estimates.render()
- }
-
- rv.updatePopulation = (population) => {
- rv.population = population
- updateSliderPos()
-
-
- var scale = d3.clamp(0, 13 / Math.sqrt(population), 1)
- sel.studentGroup.st({
-      transformOrigin: c.width/2 + 'px ' + 160 + 'px',
- transform: `scale(${scale})`
- })
-
- estimates.updateEstimates()
- estimates.render()
-
- sel.student.classed('inactive',(d, i) => i >= population)
- }
-
- rv.updatePopulationSlider = (val) => {
- rv.updatePopulation(val)
- }
-
- rv.updateNoiseSlider = (val) => {
- rv.updateHeadsProb(val)
- }
-
- var updateSliderPos = (function(){
- var width = d3.clamp(50, window.innerWidth/2 - 40, 145)
- var height = 30
- var color = '#007276'
-
- var sliderVals = {
- population: {
- key: 'population',
- textFn: d => rv.population + ' students' ,
- r: [144, 756],
- v: 144,
- stepFn: d => rv.updatePopulation(Math.round(d.v/2)*2),
- },
- headsProb: {
- key: 'headsProb',
- textFn: d => d3.format('.1%')(rv.headsProb) + ' chance of heads',
- r: [.2, .5],
- v: .5,
- stepFn: d => rv.updateHeadsProb(d.v),
- }
- }
- var sliders = [sliderVals.headsProb, sliderVals.population, sliderVals.headsProb]
- sliders.forEach(d => {
- d.s = d3.scaleLinear().domain(d.r).range([0, width])
- })
-
- var sliderSel = d3.selectAll('.slide-container-population,.slide-container-heads-prob').html('')
- .data(sliders)
- .classed('slider', true)
- .st({
- display: 'inline-block',
- width: width,
- paddingRight: (d, i) => i == 1 ? 40 : 0,
- marginTop: 20,
- })
-
- var textSel = sliderSel.append('div.slider-label-container')
- .st({marginBottom: -5})
-
- var svgSel = sliderSel.append('svg').at({width, height})
- .on('click', function(d){
- d.v = d.s.invert(d3.mouse(this)[0])
- d.stepFn(d)
- })
- .st({
- cursor: 'pointer'
- })
- .append('g').translate(height/2, 1)
- svgSel.append('rect').at({width, height, y: -height/2, fill: 'rgba(0,0,0,0)'})
-
- svgSel.append('path').at({
- d: `M 0 -.5 H ${width}`,
- stroke: color,
- strokeWidth: 1
- })
-
- var leftPathSel = svgSel.append('path').at({
- d: `M 0 -.5 H ${width}`,
- stroke: color,
- strokeWidth: 3
- })
-
-
- var drag = d3.drag()
- .on('drag', function(d){
- var x = d3.mouse(this)[0]
- d.v = d3.clamp(d3.min(d.r), d.s.invert(x), d3.max(d.r))
- d.stepFn(d)
- })
-
- var rectSel = svgSel.append('rect')
- .at({
- width: height/2 - 1,
- height: height/2 - 1,
- stroke: color,
- strokeWidth: 3,
- fill: '#fff',
- })
- .translate([-height/4, -height/4])
- .call(drag)
-
- return isDrag => {
- rectSel.at({x: d => Math.round(d.s(rv[d.key]))})
- textSel.text(d => d.textFn(d))
-
- leftPathSel.at({d: d => `M 0 -.5 H ${d.s(rv[d.key])}`})
- }
- })()
- updateSliderPos()
-
-
- return rv
-}
-
-
-
-
-if (window.init) window.init()
\ No newline at end of file
diff --git a/spaces/merve/fill-in-the-blank/source/measuring-diversity/sliders.js b/spaces/merve/fill-in-the-blank/source/measuring-diversity/sliders.js
deleted file mode 100644
index 13b03fa080fe5d1c2db81ef456242c0d856b0a0f..0000000000000000000000000000000000000000
--- a/spaces/merve/fill-in-the-blank/source/measuring-diversity/sliders.js
+++ /dev/null
@@ -1,206 +0,0 @@
-window.highlightColor = '#bf0bbf'
-
-window.makeSliders = function(metrics, sets, c, selectSet, drawRow, onRender){
-
- var width = 180
- var height = 30
- var color = '#000'
-
- var xScale = d3.scaleLinear().range([0, width]).domain([0, 1])
- .clamp(1)
-
- var sliderSel = c.svg.appendMany('g', metrics)
- .translate((d, i) => [-c.margin.left -10 , 130*i + 30])
- .on('click', function(d){
- d.target = xScale.invert(d3.mouse(this)[0])
- render()
- })
- .classed('slider', true)
- .st({cursor: 'pointer'})
-
- var textSel = sliderSel.append('text.slider-label-container')
- .at({y: -20, fontWeight: 500, textAnchor: 'middle', x: 180/2})
-
- sliderSel.append('rect')
- .at({width, height, y: -height/2, fill: 'rgba(0,0,0,0)'})
-
- sliderSel.append('path').at({
- d: `M 0 -.5 H ${width}`,
- stroke: color,
- strokeWidth: 1
- })
-
- var leftPathSel = sliderSel.append('path').at({
- d: `M 0 -.5 H ${width}`,
- stroke: color,
- strokeWidth: 3
- })
-
- var drag = d3.drag()
- .on('drag', function(d){
- var x = d3.mouse(this)[0]
- d.target = xScale.invert(x)
- render()
- })
-
- var circleSel = sliderSel.append('circle').call(drag)
- .at({r: 7, stroke: '#000'})
-
-
- var exSel = c.svg.append('g').translate([-c.margin.left -10, 400])
- .st({fontSize: 13})
-
- var curY = 0
- exSel.append('g')
- .append('text').text('The selected set is...')
-
- var selectedSetG = exSel.append('g.selected').translate([-10, curY += 15])
- .datum(sets[0])
- .call(drawRow)
-
- selectedSetG.select('.no-stroke').classed('selected', 1)
-
- curY += 25
- var exMetrics = exSel.appendMany('g', metrics)
- .translate(() => curY +=22, 1)
- .append('text').html(d => '10% small, 10% more than target')
-
- curY += 10
- var exMeanDiff = exSel.append('text').translate(() => curY +=22, 1)
- .at({textAnchor: 'end', x: 190})
- var exMaxDiff = exSel.append('text').translate(() => curY +=22, 1)
- .at({textAnchor: 'end', x: 190})
-
-
- // Make histogram data
- sliderSel.each(function(metric){
- var countKey = metric.key + '_count'
- sets.forEach(set => {
- var v = d3.sum(set, d => d[metric.field] == metric.key)
- set[countKey] = v / set.length
- })
-
- var byCountKey = d3.nestBy(sets, d => d[countKey])
-
- d3.range(.1, 1, .1).forEach(i => {
- if (byCountKey.some(d => d.key*100 == Math.round(i*100))) return
-
- var rv = []
- rv.key = i
- byCountKey.push(rv)
- })
-
- byCountKey.forEach(d => {
- d.metric = metric
- d.key = +d.key
- })
-
- var countSel = d3.select(this).append('g.histogram').lower()
- .translate(30, 1)
- .appendMany('g', byCountKey)
- .translate(d => xScale.clamp(0)(d.key - .05), 0)
- xScale.clamp(1)
-
- countSel.append('text')
- // .text(d => '10')
- .at({fontSize: 11, opacity: .7, y: -8, textAnchor: 'middle', x: 9.5})
- .text(d => d.key*100)
-
- countSel.append('path')
- .at({d: 'M 9.5 -18 V -30', stroke: '#ccc'})
-
- countSel
- .appendMany('rect.histogram-set', d => d)
- .at({width: 16, height: 4, x: 1.5, y: (d, i) => i*6})
- // .on('mouseover', selectSet)
- })
- var histogramSetSel = sliderSel.selectAll('rect.histogram-set')
- .st({cursor: 'default'})
-
- var axisSel = sliderSel.selectAll('.histogram text')
-
-
- var pinkSel = sliderSel.append('g')
- .at({r: 4, fill: highlightColor})
- .st({pointerEvents: 'none', opacity:0})
- pinkSel.append('path').at({stroke: highlightColor, d: 'M .5 0 V 15'})
- pinkSel.append('text').at({y: 30, textAnchor: 'middle'})
- pinkSel.append('text.score').at({y: 50, textAnchor: 'middle'})
-
-
- function render(){
- circleSel.at({cx: d => xScale(d.target)})
- // circleSel.at({cx: d => xScale(d.target)})
- textSel.text(d => (d.str + ' Target: ').replace('s ', ' ') + pctFmt(d.target))
-
- axisSel
- .classed('selected', false)
- // .text(function(d){
- // var str = Math.round(100*Math.abs(d.key - d.metric.target))
-
- // if (d.some(e => e.selected)){
- // d3.select(this).classed('selected', 1)
- // // str = str + '%'
- // }
-
- // return str
- // })
-
- leftPathSel.at({d: d => `M 0 -.5 H ${xScale(d.target)}`})
- metrics.forEach(d => {
- d.scoreScale = d3.scaleLinear()
- .domain([-.1, d.target, 1.1])
- .range([0, 1, 0])
- })
- histogramSetSel.st({fill: d => d === sets.selected ? highlightColor: '#bbb'})
-
- if (onRender) onRender()
-
- var shapes = sets.selected
-
- var metricVals = metrics.map(m => {
- return d3.sum(shapes, (d, i) => shapes[i][m.field] == m.key)/shapes.length
- })
-
- pinkSel.translate((d, i) => xScale(metricVals[i]), 0)
- pinkSel.select('text').text((d, i) => pctFmt(metricVals[i]))
- pinkSel.select('.score').text((d, i) => 'Difference: ' + Math.round(shapes.score[i]*100))
-
-
- selectedSetG.html('')
- .datum(sets.selected)
- .call(drawRow)
-
- selectedSetG.select('.no-stroke').classed('selected', 1)
-
- exMetrics
- .html((d, i) => {
- var target = d.target
- var actual = sets.selected[d.key + '_count']
- var diff = sets.selected.score[i]
-
- var str = d.str.replace('ls', 'l').replace('ns', 'n').toLowerCase()
-
- return `
- ${pctFmt(actual)}
- ${str},
- ${pctFmt(diff)}
- ${actual < target ? 'less' : 'more'} than target
- `
- })
- .at({textAnchor: 'end', x: 190})
-
- exMeanDiff
- .text('Mean Difference: ' + d3.format('.2%')(sets.selected['Utilitarian']/100))
-
- exMaxDiff
- .text('Max Difference: ' + measures[1].ppFn(sets.selected['score']).replace('%', '.00%'))
-
- }
-
- return {render}
-}
-
-
-// window.initColumns('#columns-height', metrics1, measures)
-// window.initColumns('#columns-height-disagree', metrics2, measures2)
diff --git a/spaces/merve/uncertainty-calibration/server-side/fill-in-the-blank/scatter-plot-colab/spearman-distribution/watch-files.js b/spaces/merve/uncertainty-calibration/server-side/fill-in-the-blank/scatter-plot-colab/spearman-distribution/watch-files.js
deleted file mode 100644
index 25d1fcfe5b17fa1e63323e0389379264463572af..0000000000000000000000000000000000000000
--- a/spaces/merve/uncertainty-calibration/server-side/fill-in-the-blank/scatter-plot-colab/spearman-distribution/watch-files.js
+++ /dev/null
@@ -1,88 +0,0 @@
-/* Copyright 2021 Google LLC. All Rights Reserved.
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-==============================================================================*/
-
-
-
-!(function(){
- function watchFile(path){
- var lastStr = ''
-
- console.log(path)
- function check(){
- d3.text(path + '?' + Math.random(), (err, nextStr) => {
- if (err){
- console.log(err)
- return check()
- }
-
- if (nextStr == lastStr) return
- lastStr = nextStr
-
- if (path.includes('.js')){
- // console.log('js', new Date())
- Function(nextStr.replace('\n', ';').replace('\n', ';'))()
- }
-
- if (path.includes('.css')){
- // console.log('css', new Date())
-
- Array.from(document.querySelectorAll('link'))
- .filter(d => d.href.includes(path) || d.href.includes('__hs_placeholder'))
- .filter((d, i) => i == 0)
- .forEach(d => d.href = path + '?' + Math.random())
-
- throw 'up'
- }
- })
-
- if (python_settings.isDev) setTimeout(check, 100)
- }
- check()
- }
-
- ;[
- '../spearman-compare/list.css',
- 'style.css',
- '../two-sentences/init-scatter.js',
- '../two-sentences/init-util.js',
- '../two-sentences/init-pair.js',
- 'init.js'
- ].forEach(filename => {
- var root = document.currentScript.src.replace('watch-files.js', '').split('?')[0]
- var path = root + filename
- console.log(filename)
-
- if (python_settings.isDev){
- watchFile(path)
- } else {
-
- if (path.includes('.js')){
- var node = document.createElement('script')
- node.setAttribute('src', path)
- document.body.appendChild(node)
- }
-
- if (path.includes('.css')){
- Array.from(document.querySelectorAll('link'))
- .filter(d => d.href.includes(path) || d.href.includes('__hs_placeholder'))
- .filter((d, i) => i == 0)
- .forEach(d => d.href = path + '?' + Math.random())
- }
- }
- })
-})()
-
-
-
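The deleted watcher polls each file with a cache-busting query string and reacts according to the file type: unchanged content is ignored, JavaScript is re-evaluated, and stylesheets are re-linked. A rough sketch of that decision logic (function and return-value names are my own, not from the original):

```python
def reload_action(path: str, last_str: str, next_str: str) -> str:
    """Mirror the poller's check(): do nothing when the fetched text is
    unchanged, re-evaluate JavaScript sources, re-link stylesheets."""
    if next_str == last_str:
        return "none"        # content unchanged, keep polling
    if ".js" in path:
        return "eval-js"     # Function(nextStr)() in the original
    if ".css" in path:
        return "reload-css"  # swap the <link> href with a fresh ?random
    return "none"
```

Note the original checks `.js` via substring match, so a path containing both `.js` and `.css` would be treated as JavaScript first.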
diff --git a/spaces/merve/uncertainty-calibration/source/fill-in-the-blank/data/cachekey2filename.js b/spaces/merve/uncertainty-calibration/source/fill-in-the-blank/data/cachekey2filename.js
deleted file mode 100644
index 85df2a5b1806c3853f4e12ab05b430af77c800f9..0000000000000000000000000000000000000000
--- a/spaces/merve/uncertainty-calibration/source/fill-in-the-blank/data/cachekey2filename.js
+++ /dev/null
@@ -1,19 +0,0 @@
-window.cacheKey2filename = {
- "{\"tokens\":[101,2000,2022,2030,2025,2000,2022,29623,2008,2003,1996,3160,29628,102]}embed_group_top": "tokens-101-2000-2022-2030-2025-2000-2022-29623-2008-2003-1996-3160-29628-102-embed-group-top.json",
- "{\"sentence\":\"In New York, they like to buy [MASK].\"}embed": "sentence-in-new-york-they-like-to-buy-mask-embed.json",
- "{\"sentence\":\"Elsie was born in the year of [MASK].\"}embed": "sentence-elsie-was-born-in-the-year-of-mask-embed.json",
- "{\"sentence\":\"Jim worked as a [MASK].\"}embed": "sentence-jim-worked-as-a-mask-embed.json",
- "{\"sentence\":\"The new nurse was named [MASK].\"}embed": "sentence-the-new-nurse-was-named-mask-embed.json",
- "{\"sentence\":\"The doctor performed CPR even though [MASK] knew it was too late.\"}embed_zari_cda": "sentence-the-doctor-performed-cpr-even-though-mask-knew-it-was-too-late-embed-zari-cda.json",
- "{\"sentence\":\"In 1908, he was employed as a [MASK].\"}embed": "sentence-in-1908-he-was-employed-as-a-mask-embed.json",
- "{\"sentence\":\"Jane worked as a [MASK].\"}embed": "sentence-jane-worked-as-a-mask-embed.json",
- "{\"sentence\":\"In Texas, they like to buy [MASK].\"}embed": "sentence-in-texas-they-like-to-buy-mask-embed.json",
- "{\"sentence\":\"Lauren was born in the year of [MASK].\"}embed": "sentence-lauren-was-born-in-the-year-of-mask-embed.json",
- "{\"sentence\":\"The new doctor was named [MASK].\"}embed": "sentence-the-new-doctor-was-named-mask-embed.json",
- "{\"sentence\":\"The nurse performed CPR even though [MASK] knew it was too late.\"}embed_zari_cda": "sentence-the-nurse-performed-cpr-even-though-mask-knew-it-was-too-late-embed-zari-cda.json",
- "{\"sentence\":\"In 1908, she was employed as a [MASK].\"}embed": "sentence-in-1908-she-was-employed-as-a-mask-embed.json",
- "{\"sentence\":\"In 2018, he was employed as a [MASK].\"}embed": "sentence-in-2018-he-was-employed-as-a-mask-embed.json",
- "{\"sentence\":\"In 2018, she was employed as a [MASK].\"}embed": "sentence-in-2018-she-was-employed-as-a-mask-embed.json",
- "{\"tokens\":[101,1999,2047,2259,29623,2027,2066,2000,4965,2477,29625,102]}embed_group_top": "tokens-101-1999-2047-2259-29623-2027-2066-2000-4965-2477-29625-102-embed-group-top.json",
- "{\"tokens\":[101,1999,3146,29623,2027,2066,2000,4965,2477,29625,102]}embed_group_top": "tokens-101-1999-3146-29623-2027-2066-2000-4965-2477-29625-102-embed-group-top.json"
-}
\ No newline at end of file
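The mapping above appears to follow a simple slug rule: lowercase the cache key, collapse every non-alphanumeric run into a hyphen, trim, and append `.json`. A sketch of that rule (my reconstruction, not necessarily the generator actually used):

```python
import re

def cache_key_to_filename(key: str) -> str:
    """Slugify a JSON cache key into a flat, filesystem-safe filename."""
    slug = re.sub(r"[^a-z0-9]+", "-", key.lower()).strip("-")
    return slug + ".json"

key = '{"sentence":"In New York, they like to buy [MASK]."}embed'
print(cache_key_to_filename(key))
# sentence-in-new-york-they-like-to-buy-mask-embed.json
```

This reproduces every entry in the table above, including the tokenized keys, whose underscores in `embed_group_top` also collapse to hyphens.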
diff --git a/spaces/miesnerjacob/Multi-task-NLP/README.md b/spaces/miesnerjacob/Multi-task-NLP/README.md
deleted file mode 100644
index 6093c5ae13af47fb1d72f5f6283e4ebc2baf9f10..0000000000000000000000000000000000000000
--- a/spaces/miesnerjacob/Multi-task-NLP/README.md
+++ /dev/null
@@ -1,17 +0,0 @@
----
-title: Multi Task NLP
-emoji: 🤖👽🔥💯
-sdk: streamlit
-sdk_version: 1.10.0
-app_file: app.py
-python_version: 3.9
-pinned: false
----
-
-# Multi-task NLP
-
-This application is a celebration of open source and of the power that programmers
-gain from those who give back to the community. It was built with Streamlit,
-Hugging Face Transformers, Transformers-Interpret, NLTK, and spaCy, among other
-open-source Python libraries and models.
-
-You can access a live version of this app here: https://huggingface.co/spaces/miesnerjacob/Multi-task-NLP
diff --git a/spaces/mmDigital/therapy-bot/README.md b/spaces/mmDigital/therapy-bot/README.md
deleted file mode 100644
index 8f307d90d3d50c557fcdedaf037c78bb5e0d2655..0000000000000000000000000000000000000000
--- a/spaces/mmDigital/therapy-bot/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: Therapy Bot
-emoji: 🌖
-colorFrom: red
-colorTo: pink
-sdk: gradio
-sdk_version: 3.24.1
-app_file: app.py
-pinned: false
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/mmlab-ntu/Segment-Any-RGBD/open_vocab_seg/utils/misc.py b/spaces/mmlab-ntu/Segment-Any-RGBD/open_vocab_seg/utils/misc.py
deleted file mode 100644
index a22d0a978c9cd89595c6e7c900885e1c148844b1..0000000000000000000000000000000000000000
--- a/spaces/mmlab-ntu/Segment-Any-RGBD/open_vocab_seg/utils/misc.py
+++ /dev/null
@@ -1,126 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# Modified by Bowen Cheng from https://github.com/facebookresearch/detr/blob/master/util/misc.py
-# Copyright (c) Meta Platforms, Inc. All Rights Reserved
-
-"""
-Misc functions, including distributed helpers.
-
-Mostly copy-paste from torchvision references.
-"""
-from typing import List, Optional
-
-import torch
-import torch.distributed as dist
-import torchvision
-from torch import Tensor
-
-
-def _max_by_axis(the_list):
- # type: (List[List[int]]) -> List[int]
- maxes = the_list[0]
- for sublist in the_list[1:]:
- for index, item in enumerate(sublist):
- maxes[index] = max(maxes[index], item)
- return maxes
-
-
-class NestedTensor(object):
- def __init__(self, tensors, mask: Optional[Tensor]):
- self.tensors = tensors
- self.mask = mask
-
- def to(self, device):
- # type: (Device) -> NestedTensor # noqa
- cast_tensor = self.tensors.to(device)
- mask = self.mask
- if mask is not None:
- assert mask is not None
- cast_mask = mask.to(device)
- else:
- cast_mask = None
- return NestedTensor(cast_tensor, cast_mask)
-
- def decompose(self):
- return self.tensors, self.mask
-
- def __repr__(self):
- return str(self.tensors)
-
-
-def nested_tensor_from_tensor_list(tensor_list: List[Tensor]):
- # TODO make this more general
- if tensor_list[0].ndim == 3:
- if torchvision._is_tracing():
- # nested_tensor_from_tensor_list() does not export well to ONNX
- # call _onnx_nested_tensor_from_tensor_list() instead
- return _onnx_nested_tensor_from_tensor_list(tensor_list)
-
- # TODO make it support different-sized images
- max_size = _max_by_axis([list(img.shape) for img in tensor_list])
- # min_size = tuple(min(s) for s in zip(*[img.shape for img in tensor_list]))
- batch_shape = [len(tensor_list)] + max_size
- b, c, h, w = batch_shape
- dtype = tensor_list[0].dtype
- device = tensor_list[0].device
- tensor = torch.zeros(batch_shape, dtype=dtype, device=device)
- mask = torch.ones((b, h, w), dtype=torch.bool, device=device)
- for img, pad_img, m in zip(tensor_list, tensor, mask):
- pad_img[: img.shape[0], : img.shape[1], : img.shape[2]].copy_(img)
- m[: img.shape[1], : img.shape[2]] = False
- else:
- raise ValueError("not supported")
- return NestedTensor(tensor, mask)
-
-
-# _onnx_nested_tensor_from_tensor_list() is an implementation of
-# nested_tensor_from_tensor_list() that is supported by ONNX tracing.
-@torch.jit.unused
-def _onnx_nested_tensor_from_tensor_list(tensor_list: List[Tensor]) -> NestedTensor:
- max_size = []
- for i in range(tensor_list[0].dim()):
- max_size_i = torch.max(
- torch.stack([img.shape[i] for img in tensor_list]).to(torch.float32)
- ).to(torch.int64)
- max_size.append(max_size_i)
- max_size = tuple(max_size)
-
- # work around for
- # pad_img[: img.shape[0], : img.shape[1], : img.shape[2]].copy_(img)
- # m[: img.shape[1], :img.shape[2]] = False
- # which is not yet supported in onnx
- padded_imgs = []
- padded_masks = []
- for img in tensor_list:
- padding = [(s1 - s2) for s1, s2 in zip(max_size, tuple(img.shape))]
- padded_img = torch.nn.functional.pad(
- img, (0, padding[2], 0, padding[1], 0, padding[0])
- )
- padded_imgs.append(padded_img)
-
- m = torch.zeros_like(img[0], dtype=torch.int, device=img.device)
- padded_mask = torch.nn.functional.pad(
- m, (0, padding[2], 0, padding[1]), "constant", 1
- )
- padded_masks.append(padded_mask.to(torch.bool))
-
- tensor = torch.stack(padded_imgs)
- mask = torch.stack(padded_masks)
-
- return NestedTensor(tensor, mask=mask)
-
-
-def is_dist_avail_and_initialized():
- if not dist.is_available():
- return False
- if not dist.is_initialized():
- return False
- return True
-
-def get_gt_binary_masks(gt_semseg):
- mask_ids = torch.unique(gt_semseg)
- gt_masks = []
- for id in mask_ids:
- if id != 255:
- gt_masks.append(gt_semseg == id)
- gt_masks = torch.stack(gt_masks).float()
- return gt_masks
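`_max_by_axis` computes the element-wise maximum across a list of image shapes; the result is the smallest batch shape every image fits into before padding. The same logic in plain Python, as an illustration:

```python
def max_by_axis(shapes):
    """Element-wise max over a list of shape lists, as _max_by_axis does;
    e.g. two images of (3, 224, 196) and (3, 180, 300) need a
    (3, 224, 300) canvas."""
    maxes = list(shapes[0])
    for sub in shapes[1:]:
        for i, v in enumerate(sub):
            maxes[i] = max(maxes[i], v)
    return maxes

print(max_by_axis([[3, 224, 196], [3, 180, 300]]))
# [3, 224, 300]
```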
diff --git a/spaces/mrdbourke/foodvision_big_video/model.py b/spaces/mrdbourke/foodvision_big_video/model.py
deleted file mode 100644
index 2060a8a6ae4f6692cc634c067a876cb1daea285b..0000000000000000000000000000000000000000
--- a/spaces/mrdbourke/foodvision_big_video/model.py
+++ /dev/null
@@ -1,24 +0,0 @@
-import torch
-import torchvision
-
-from torch import nn
-
-def create_effnetb2_model(num_classes:int=3, # default output classes = 3 (pizza, steak, sushi)
- seed:int=42):
- # 1, 2, 3 Create EffNetB2 pretrained weights, transforms and model
- weights = torchvision.models.EfficientNet_B2_Weights.DEFAULT
- transforms = weights.transforms()
- model = torchvision.models.efficientnet_b2(weights=weights)
-
- # 4. Freeze all layers in the base model
- for param in model.parameters():
- param.requires_grad = False
-
- # 5. Change classifier head with random seed for reproducibility
- torch.manual_seed(seed)
- model.classifier = nn.Sequential(
- nn.Dropout(p=0.3, inplace=True),
- nn.Linear(in_features=1408, out_features=num_classes)
- )
-
- return model, transforms
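Since the base model is frozen, only the replaced classifier head stays trainable, so the trainable-parameter count is just that of the new `Linear` layer (a weight per input-output pair plus one bias per class; `Dropout` adds none). A quick check, assuming the EfficientNet-B2 feature dimension of 1408 used above:

```python
def head_param_count(in_features: int, num_classes: int) -> int:
    """Trainable parameters of the swapped-in Linear head:
    in_features * num_classes weights plus num_classes biases."""
    return in_features * num_classes + num_classes

print(head_param_count(1408, 3))
# 4227
```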
diff --git a/spaces/mrm8488/llama-2-7b-chat-cpp/app.py b/spaces/mrm8488/llama-2-7b-chat-cpp/app.py
deleted file mode 100644
index 1860bb749ff9805ef3ac7886656a6f236f837695..0000000000000000000000000000000000000000
--- a/spaces/mrm8488/llama-2-7b-chat-cpp/app.py
+++ /dev/null
@@ -1,76 +0,0 @@
-import os
-import gradio as gr
-import copy
-import time
-import llama_cpp
-from llama_cpp import Llama
-from huggingface_hub import hf_hub_download
-
-
-llm = Llama(
- model_path=hf_hub_download(
- repo_id=os.environ.get("REPO_ID", "TheBloke/Llama-2-7B-Chat-GGML"),
- filename=os.environ.get("MODEL_FILE", "llama-2-7b-chat.ggmlv3.q5_0.bin"),
- ),
- n_ctx=2048,
- n_gpu_layers=50, # change n_gpu_layers if you have more or less VRAM
-)
-
-history = []
-
-system_prompt = """
-You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
-
-If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
-"""
-
-system_prompt_es = """
-Eres un asistente útil, respetuoso y honesto. Responde siempre en español de la manera más útil posible.
-"""
-
-
-def generate_text(message, history):
- temp = ""
-    input_prompt = f"[INST] <<SYS>>\n{system_prompt_es}\n<</SYS>>\n\n "
- for interaction in history:
- input_prompt = input_prompt + str(interaction[0]) + " [/INST] " + str(interaction[1]) + " [INST] "
-
- input_prompt = input_prompt + str(message) + " [/INST] "
-
- output = llm(
- input_prompt,
- temperature=0.15,
- top_p=0.1,
- top_k=40,
- repeat_penalty=1.1,
- max_tokens=1024,
- stop=[
- "<|prompter|>",
- "<|endoftext|>",
- "<|endoftext|> \n",
- "ASSISTANT:",
- "USER:",
- "SYSTEM:",
- ],
- stream=True,
- )
- for out in output:
- stream = copy.deepcopy(out)
- temp += stream["choices"][0]["text"]
- yield temp
-
- history = ["init", input_prompt]
-
-
-demo = gr.ChatInterface(
- generate_text,
- title="llama-2 🦙 7B chat cpp on GPU",
- description="Running LLM with https://github.com/abetlen/llama-cpp-python",
-    examples=["Cuenta cosas interesantes sobre la Región de Murcia", "¿Qué es una Chirigota?", "Si tengo 8 manzanas, compro una y vendo 7, ¿cuántas me quedan?"],
- cache_examples=True,
- retry_btn=None,
- undo_btn="Delete Previous",
- clear_btn="Clear",
-)
-demo.queue(concurrency_count=1, max_size=5)
-demo.launch()
\ No newline at end of file
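The chat loop above flattens the Gradio `(user, assistant)` history into a single Llama-2 style prompt. A standalone sketch of that assembly, using the standard `<<SYS>>` delimiters (the exact spacing here mirrors the loop, but treat it as illustrative):

```python
def build_prompt(system_prompt: str, history: list, message: str) -> str:
    """Flatten (user, assistant) turns into one Llama-2 chat prompt,
    mirroring the loop in generate_text above."""
    prompt = f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n "
    for user, assistant in history:
        prompt += f"{user} [/INST] {assistant} [INST] "
    prompt += f"{message} [/INST] "
    return prompt

print(build_prompt("Be brief.", [("hola", "hola, ¿qué tal?")], "adiós"))
```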
diff --git a/spaces/mrneuralnet/P-DFD/dataset/abstract_dataset.py b/spaces/mrneuralnet/P-DFD/dataset/abstract_dataset.py
deleted file mode 100644
index 5b029f8c2b3d0a41c383abfd235ceb41e028e8fc..0000000000000000000000000000000000000000
--- a/spaces/mrneuralnet/P-DFD/dataset/abstract_dataset.py
+++ /dev/null
@@ -1,41 +0,0 @@
-import cv2
-import torch
-import numpy as np
-from torchvision.datasets import VisionDataset
-import albumentations
-from albumentations import Compose
-from albumentations.pytorch.transforms import ToTensorV2
-
-
-class AbstractDataset(VisionDataset):
- def __init__(self, cfg, seed=2022, transforms=None, transform=None, target_transform=None):
- super(AbstractDataset, self).__init__(cfg['root'], transforms=transforms,
- transform=transform, target_transform=target_transform)
- # fix for re-production
- np.random.seed(seed)
-
- self.images = list()
- self.targets = list()
- self.split = cfg['split']
- if self.transforms is None:
- self.transforms = Compose(
- [getattr(albumentations, _['name'])(**_['params']) for _ in cfg['transforms']] +
- [ToTensorV2()]
- )
-
- def __len__(self):
- return len(self.images)
-
- def __getitem__(self, index):
- path = self.images[index]
- tgt = self.targets[index]
- return path, tgt
-
- def load_item(self, items):
- images = list()
- for item in items:
- img = cv2.imread(item)
- img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
- image = self.transforms(image=img)['image']
- images.append(image)
- return torch.stack(images, dim=0)
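The transform pipeline is built by name from the config via `getattr(albumentations, ...)`. The same config-driven construction pattern, sketched with a stand-in registry instead of the real library (class names and params here are illustrative):

```python
class Resize:
    def __init__(self, height, width):
        self.size = (height, width)

class HorizontalFlip:
    def __init__(self, p=0.5):
        self.p = p

# Stand-in for getattr(albumentations, name): look the class up by name,
# then instantiate it with the params dict from the config.
REGISTRY = {"Resize": Resize, "HorizontalFlip": HorizontalFlip}

cfg_transforms = [
    {"name": "Resize", "params": {"height": 224, "width": 224}},
    {"name": "HorizontalFlip", "params": {"p": 0.5}},
]

pipeline = [REGISTRY[t["name"]](**t["params"]) for t in cfg_transforms]
print(pipeline[0].size)  # (224, 224)
```

The upside of this pattern is that the augmentation pipeline lives entirely in the YAML/dict config; the downside is that a typo in `name` only fails at dataset construction time.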
diff --git a/spaces/mrrandom123/mattmdjaga-segformer_b2_clothes/README.md b/spaces/mrrandom123/mattmdjaga-segformer_b2_clothes/README.md
deleted file mode 100644
index 99d0b60a615aca44b1d3dcc122277ec2bcf32887..0000000000000000000000000000000000000000
--- a/spaces/mrrandom123/mattmdjaga-segformer_b2_clothes/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: Mattmdjaga-segformer B2 Clothes
-emoji: 🔥
-colorFrom: purple
-colorTo: indigo
-sdk: gradio
-sdk_version: 3.35.2
-app_file: app.py
-pinned: false
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
diff --git a/spaces/mshkdm/VToonify/vtoonify/model/raft/alt_cuda_corr/correlation.cpp b/spaces/mshkdm/VToonify/vtoonify/model/raft/alt_cuda_corr/correlation.cpp
deleted file mode 100644
index b01584d19edb99e7feec5f2e4c51169a1ed208db..0000000000000000000000000000000000000000
--- a/spaces/mshkdm/VToonify/vtoonify/model/raft/alt_cuda_corr/correlation.cpp
+++ /dev/null
@@ -1,54 +0,0 @@
-#include <torch/extension.h>
-#include <vector>
-
-// CUDA forward declarations
-std::vector<torch::Tensor> corr_cuda_forward(
- torch::Tensor fmap1,
- torch::Tensor fmap2,
- torch::Tensor coords,
- int radius);
-
-std::vector<torch::Tensor> corr_cuda_backward(
- torch::Tensor fmap1,
- torch::Tensor fmap2,
- torch::Tensor coords,
- torch::Tensor corr_grad,
- int radius);
-
-// C++ interface
-#define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
-#define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be contiguous")
-#define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
-
-std::vector<torch::Tensor> corr_forward(
- torch::Tensor fmap1,
- torch::Tensor fmap2,
- torch::Tensor coords,
- int radius) {
- CHECK_INPUT(fmap1);
- CHECK_INPUT(fmap2);
- CHECK_INPUT(coords);
-
- return corr_cuda_forward(fmap1, fmap2, coords, radius);
-}
-
-
-std::vector<torch::Tensor> corr_backward(
- torch::Tensor fmap1,
- torch::Tensor fmap2,
- torch::Tensor coords,
- torch::Tensor corr_grad,
- int radius) {
- CHECK_INPUT(fmap1);
- CHECK_INPUT(fmap2);
- CHECK_INPUT(coords);
- CHECK_INPUT(corr_grad);
-
- return corr_cuda_backward(fmap1, fmap2, coords, corr_grad, radius);
-}
-
-
-PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
- m.def("forward", &corr_forward, "CORR forward");
- m.def("backward", &corr_backward, "CORR backward");
-}
\ No newline at end of file
diff --git a/spaces/mshukor/UnIVAL/fairseq/examples/speech_synthesis/evaluation/__init__.py b/spaces/mshukor/UnIVAL/fairseq/examples/speech_synthesis/evaluation/__init__.py
deleted file mode 100644
index 6264236915a7269a4d920ee8213004374dd86a9a..0000000000000000000000000000000000000000
--- a/spaces/mshukor/UnIVAL/fairseq/examples/speech_synthesis/evaluation/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
diff --git a/spaces/mshukor/UnIVAL/fairseq/fairseq/models/transformer/transformer_encoder.py b/spaces/mshukor/UnIVAL/fairseq/fairseq/models/transformer/transformer_encoder.py
deleted file mode 100644
index f007776a6f3b7e6731edc01d95aa24eed255d0e8..0000000000000000000000000000000000000000
--- a/spaces/mshukor/UnIVAL/fairseq/fairseq/models/transformer/transformer_encoder.py
+++ /dev/null
@@ -1,341 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-
-import math
-from typing import Dict, List, Optional
-
-import torch
-import torch.nn as nn
-from fairseq import utils
-from fairseq.distributed import fsdp_wrap
-from fairseq.models import FairseqEncoder
-from fairseq.modules import (
- FairseqDropout,
- LayerDropModuleList,
- LayerNorm,
- PositionalEmbedding,
- SinusoidalPositionalEmbedding,
-)
-from fairseq.modules import transformer_layer
-from fairseq.modules.checkpoint_activations import checkpoint_wrapper
-from fairseq.modules.quant_noise import quant_noise as apply_quant_noise_
-from torch import Tensor
-from fairseq.models.transformer import (
- TransformerConfig,
-)
-
-
-# rewrite name for backward compatibility in `make_generation_fast_`
-def module_name_fordropout(module_name: str) -> str:
- if module_name == 'TransformerEncoderBase':
- return 'TransformerEncoder'
- else:
- return module_name
-
-
-class TransformerEncoderBase(FairseqEncoder):
- """
- Transformer encoder consisting of *cfg.encoder.layers* layers. Each layer
- is a :class:`TransformerEncoderLayer`.
-
- Args:
- args (argparse.Namespace): parsed command-line arguments
- dictionary (~fairseq.data.Dictionary): encoding dictionary
- embed_tokens (torch.nn.Embedding): input embedding
- """
-
- def __init__(self, cfg, dictionary, embed_tokens):
- self.cfg = cfg
- super().__init__(dictionary)
- self.register_buffer("version", torch.Tensor([3]))
-
- self.dropout_module = FairseqDropout(
- cfg.dropout, module_name=module_name_fordropout(self.__class__.__name__)
- )
- self.encoder_layerdrop = cfg.encoder.layerdrop
-
- embed_dim = embed_tokens.embedding_dim
- self.padding_idx = embed_tokens.padding_idx
- self.max_source_positions = cfg.max_source_positions
-
- self.embed_tokens = embed_tokens
-
- self.embed_scale = 1.0 if cfg.no_scale_embedding else math.sqrt(embed_dim)
-
- self.embed_positions = (
- PositionalEmbedding(
- cfg.max_source_positions,
- embed_dim,
- self.padding_idx,
- learned=cfg.encoder.learned_pos,
- )
- if not cfg.no_token_positional_embeddings
- else None
- )
- if cfg.layernorm_embedding:
- self.layernorm_embedding = LayerNorm(embed_dim, export=cfg.export)
- else:
- self.layernorm_embedding = None
-
- if not cfg.adaptive_input and cfg.quant_noise.pq > 0:
- self.quant_noise = apply_quant_noise_(
- nn.Linear(embed_dim, embed_dim, bias=False),
- cfg.quant_noise.pq,
- cfg.quant_noise.pq_block_size,
- )
- else:
- self.quant_noise = None
-
- if self.encoder_layerdrop > 0.0:
- self.layers = LayerDropModuleList(p=self.encoder_layerdrop)
- else:
- self.layers = nn.ModuleList([])
- self.layers.extend(
- [self.build_encoder_layer(cfg) for i in range(cfg.encoder.layers)]
- )
- self.num_layers = len(self.layers)
-
- if cfg.encoder.normalize_before:
- self.layer_norm = LayerNorm(embed_dim, export=cfg.export)
- else:
- self.layer_norm = None
-
- def build_encoder_layer(self, cfg):
- layer = transformer_layer.TransformerEncoderLayerBase(cfg)
- checkpoint = cfg.checkpoint_activations
- if checkpoint:
- offload_to_cpu = cfg.offload_activations
- layer = checkpoint_wrapper(layer, offload_to_cpu=offload_to_cpu)
- # if we are checkpointing, enforce that FSDP always wraps the
- # checkpointed layer, regardless of layer size
- min_params_to_wrap = cfg.min_params_to_wrap if not checkpoint else 0
- layer = fsdp_wrap(layer, min_num_params=min_params_to_wrap)
- return layer
-
- def forward_embedding(
- self, src_tokens, token_embedding: Optional[torch.Tensor] = None
- ):
- # embed tokens and positions
- if token_embedding is None:
- token_embedding = self.embed_tokens(src_tokens)
- x = embed = self.embed_scale * token_embedding
- if self.embed_positions is not None:
- x = embed + self.embed_positions(src_tokens)
- if self.layernorm_embedding is not None:
- x = self.layernorm_embedding(x)
- x = self.dropout_module(x)
- if self.quant_noise is not None:
- x = self.quant_noise(x)
- return x, embed
-
- def forward(
- self,
- src_tokens,
- src_lengths: Optional[torch.Tensor] = None,
- return_all_hiddens: bool = False,
- token_embeddings: Optional[torch.Tensor] = None,
- ):
- """
- Args:
- src_tokens (LongTensor): tokens in the source language of shape
- `(batch, src_len)`
- src_lengths (torch.LongTensor): lengths of each source sentence of
- shape `(batch)`
- return_all_hiddens (bool, optional): also return all of the
- intermediate hidden states (default: False).
- token_embeddings (torch.Tensor, optional): precomputed embeddings
- default `None` will recompute embeddings
-
- Returns:
- dict:
- - **encoder_out** (Tensor): the last encoder layer's output of
- shape `(src_len, batch, embed_dim)`
- - **encoder_padding_mask** (ByteTensor): the positions of
- padding elements of shape `(batch, src_len)`
- - **encoder_embedding** (Tensor): the (scaled) embedding lookup
- of shape `(batch, src_len, embed_dim)`
- - **encoder_states** (List[Tensor]): all intermediate
- hidden states of shape `(src_len, batch, embed_dim)`.
- Only populated if *return_all_hiddens* is True.
- """
- return self.forward_scriptable(
- src_tokens, src_lengths, return_all_hiddens, token_embeddings
- )
-
- # TorchScript doesn't support super() method so that the scriptable Subclass
- # can't access the base class model in Torchscript.
- # Current workaround is to add a helper function with different name and
- # call the helper function from scriptable Subclass.
- def forward_scriptable(
- self,
- src_tokens,
- src_lengths: Optional[torch.Tensor] = None,
- return_all_hiddens: bool = False,
- token_embeddings: Optional[torch.Tensor] = None,
- ):
- """
- Args:
- src_tokens (LongTensor): tokens in the source language of shape
- `(batch, src_len)`
- src_lengths (torch.LongTensor): lengths of each source sentence of
- shape `(batch)`
- return_all_hiddens (bool, optional): also return all of the
- intermediate hidden states (default: False).
- token_embeddings (torch.Tensor, optional): precomputed embeddings
- default `None` will recompute embeddings
-
- Returns:
- dict:
- - **encoder_out** (Tensor): the last encoder layer's output of
- shape `(src_len, batch, embed_dim)`
- - **encoder_padding_mask** (ByteTensor): the positions of
- padding elements of shape `(batch, src_len)`
- - **encoder_embedding** (Tensor): the (scaled) embedding lookup
- of shape `(batch, src_len, embed_dim)`
- - **encoder_states** (List[Tensor]): all intermediate
- hidden states of shape `(src_len, batch, embed_dim)`.
- Only populated if *return_all_hiddens* is True.
- """
- # compute padding mask
- encoder_padding_mask = src_tokens.eq(self.padding_idx)
- has_pads = src_tokens.device.type == "xla" or encoder_padding_mask.any()
-
- x, encoder_embedding = self.forward_embedding(src_tokens, token_embeddings)
-
- # account for padding while computing the representation
- if has_pads:
- x = x * (1 - encoder_padding_mask.unsqueeze(-1).type_as(x))
-
- # B x T x C -> T x B x C
- x = x.transpose(0, 1)
-
- encoder_states = []
-
- if return_all_hiddens:
- encoder_states.append(x)
-
- # encoder layers
- for layer in self.layers:
- x = layer(
- x, encoder_padding_mask=encoder_padding_mask if has_pads else None
- )
- if return_all_hiddens:
- assert encoder_states is not None
- encoder_states.append(x)
-
- if self.layer_norm is not None:
- x = self.layer_norm(x)
-
-        # The PyTorch Mobile lite interpreter does not support returning NamedTuple in
- # `forward` so we use a dictionary instead.
- # TorchScript does not support mixed values so the values are all lists.
- # The empty list is equivalent to None.
- src_lengths = src_tokens.ne(self.padding_idx).sum(dim=1, dtype=torch.int32).reshape(-1, 1).contiguous()
- return {
- "encoder_out": [x], # T x B x C
- "encoder_padding_mask": [encoder_padding_mask], # B x T
- "encoder_embedding": [encoder_embedding], # B x T x C
- "encoder_states": encoder_states, # List[T x B x C]
- "src_tokens": [],
- "src_lengths": [src_lengths],
- }
-
- @torch.jit.export
- def reorder_encoder_out(self, encoder_out: Dict[str, List[Tensor]], new_order):
- """
- Reorder encoder output according to *new_order*.
-
- Args:
- encoder_out: output from the ``forward()`` method
- new_order (LongTensor): desired order
-
- Returns:
- *encoder_out* rearranged according to *new_order*
- """
- if len(encoder_out["encoder_out"]) == 0:
- new_encoder_out = []
- else:
- new_encoder_out = [encoder_out["encoder_out"][0].index_select(1, new_order)]
- if len(encoder_out["encoder_padding_mask"]) == 0:
- new_encoder_padding_mask = []
- else:
- new_encoder_padding_mask = [
- encoder_out["encoder_padding_mask"][0].index_select(0, new_order)
- ]
- if len(encoder_out["encoder_embedding"]) == 0:
- new_encoder_embedding = []
- else:
- new_encoder_embedding = [
- encoder_out["encoder_embedding"][0].index_select(0, new_order)
- ]
-
- if len(encoder_out["src_tokens"]) == 0:
- src_tokens = []
- else:
- src_tokens = [(encoder_out["src_tokens"][0]).index_select(0, new_order)]
-
- if len(encoder_out["src_lengths"]) == 0:
- src_lengths = []
- else:
- src_lengths = [(encoder_out["src_lengths"][0]).index_select(0, new_order)]
-
- encoder_states = encoder_out["encoder_states"]
- if len(encoder_states) > 0:
- for idx, state in enumerate(encoder_states):
- encoder_states[idx] = state.index_select(1, new_order)
-
- return {
- "encoder_out": new_encoder_out, # T x B x C
- "encoder_padding_mask": new_encoder_padding_mask, # B x T
- "encoder_embedding": new_encoder_embedding, # B x T x C
- "encoder_states": encoder_states, # List[T x B x C]
- "src_tokens": src_tokens, # B x T
- "src_lengths": src_lengths, # B x 1
- }
-
- def max_positions(self):
- """Maximum input length supported by the encoder."""
- if self.embed_positions is None:
- return self.max_source_positions
- return min(self.max_source_positions, self.embed_positions.max_positions)
-
- def upgrade_state_dict_named(self, state_dict, name):
- """Upgrade a (possibly old) state dict for new versions of fairseq."""
- if isinstance(self.embed_positions, SinusoidalPositionalEmbedding):
- weights_key = "{}.embed_positions.weights".format(name)
- if weights_key in state_dict:
- print("deleting {0}".format(weights_key))
- del state_dict[weights_key]
- state_dict[
- "{}.embed_positions._float_tensor".format(name)
- ] = torch.FloatTensor(1)
- for i in range(self.num_layers):
- # update layer norms
- self.layers[i].upgrade_state_dict_named(
- state_dict, "{}.layers.{}".format(name, i)
- )
-
- version_key = "{}.version".format(name)
- if utils.item(state_dict.get(version_key, torch.Tensor([1]))[0]) < 2:
- # earlier checkpoints did not normalize after the stack of layers
- self.layer_norm = None
- self.normalize = False
- state_dict[version_key] = torch.Tensor([1])
- return state_dict
-
-
-class TransformerEncoder(TransformerEncoderBase):
- def __init__(self, args, dictionary, embed_tokens):
- self.args = args
- super().__init__(
- TransformerConfig.from_namespace(args),
- dictionary,
- embed_tokens,
- )
-
- def build_encoder_layer(self, args):
- return super().build_encoder_layer(
- TransformerConfig.from_namespace(args),
- )
diff --git a/spaces/mshukor/UnIVAL/fairseq/fairseq/models/wav2vec/wav2vec2.py b/spaces/mshukor/UnIVAL/fairseq/fairseq/models/wav2vec/wav2vec2.py
deleted file mode 100644
index 714fd3ab50443b8d15715b1cf5abd4eb517298c4..0000000000000000000000000000000000000000
--- a/spaces/mshukor/UnIVAL/fairseq/fairseq/models/wav2vec/wav2vec2.py
+++ /dev/null
@@ -1,1016 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-
-import math
-from dataclasses import dataclass, field
-from typing import List, Tuple
-
-import numpy as np
-import torch
-import torch.nn as nn
-import torch.nn.functional as F
-from fairseq import utils
-from fairseq.data.data_utils import compute_mask_indices
-from fairseq.dataclass import ChoiceEnum, FairseqDataclass
-from fairseq.models import BaseFairseqModel, register_model
-from fairseq.modules import (
- Fp32GroupNorm,
- Fp32LayerNorm,
- GradMultiply,
- GumbelVectorQuantizer,
- LayerNorm,
- MultiheadAttention,
- SamePad,
- TransposeLast,
-)
-from fairseq.modules.transformer_sentence_encoder import init_bert_params
-from fairseq.utils import buffered_arange, index_put, is_xla_tensor
-
-
-EXTRACTOR_MODE_CHOICES = ChoiceEnum(["default", "layer_norm"])
-MASKING_DISTRIBUTION_CHOICES = ChoiceEnum(["static", "uniform", "normal", "poisson"])
-
-
-@dataclass
-class Wav2Vec2Config(FairseqDataclass):
- extractor_mode: EXTRACTOR_MODE_CHOICES = field(
- default="default",
- metadata={
- "help": "mode for feature extractor. default has a single group norm with d "
- "groups in the first conv block, whereas layer_norm has layer norms in "
- "every block (meant to use with normalize=True)"
- },
- )
- encoder_layers: int = field(
- default=12, metadata={"help": "num encoder layers in the transformer"}
- )
- encoder_embed_dim: int = field(
- default=768, metadata={"help": "encoder embedding dimension"}
- )
- encoder_ffn_embed_dim: int = field(
- default=3072, metadata={"help": "encoder embedding dimension for FFN"}
- )
- encoder_attention_heads: int = field(
- default=12, metadata={"help": "num encoder attention heads"}
- )
- activation_fn: ChoiceEnum(utils.get_available_activation_fns()) = field(
- default="gelu", metadata={"help": "activation function to use"}
- )
-
- # dropouts
- dropout: float = field(
- default=0.1, metadata={"help": "dropout probability for the transformer"}
- )
- attention_dropout: float = field(
- default=0.1, metadata={"help": "dropout probability for attention weights"}
- )
- activation_dropout: float = field(
- default=0.0, metadata={"help": "dropout probability after activation in FFN"}
- )
- encoder_layerdrop: float = field(
-        default=0.0, metadata={"help": "probability of dropping a transformer layer"}
- )
- dropout_input: float = field(
- default=0.0,
- metadata={"help": "dropout to apply to the input (after feat extr)"},
- )
- dropout_features: float = field(
- default=0.0,
- metadata={"help": "dropout to apply to the features (after feat extr)"},
- )
-
- final_dim: int = field(
- default=0,
- metadata={
-            "help": "project final representations and targets to this many dimensions. "
-            "set to encoder_embed_dim if <= 0"
- },
- )
- layer_norm_first: bool = field(
- default=False, metadata={"help": "apply layernorm first in the transformer"}
- )
- conv_feature_layers: str = field(
- default="[(512, 10, 5)] + [(512, 3, 2)] * 4 + [(512,2,2)] + [(512,2,2)]",
- metadata={
- "help": "string describing convolutional feature extraction layers in form of a python list that contains "
- "[(dim, kernel_size, stride), ...]"
- },
- )
- conv_bias: bool = field(
- default=False, metadata={"help": "include bias in conv encoder"}
- )
- logit_temp: float = field(
- default=0.1, metadata={"help": "temperature to divide logits by"}
- )
- quantize_targets: bool = field(
- default=False, metadata={"help": "use quantized targets"}
- )
- quantize_input: bool = field(
- default=False, metadata={"help": "use quantized inputs"}
- )
- same_quantizer: bool = field(
- default=False, metadata={"help": "use same quantizer for inputs and targets"}
- )
- target_glu: bool = field(
- default=False, metadata={"help": "adds projection + glu to targets"}
- )
- feature_grad_mult: float = field(
- default=1.0, metadata={"help": "multiply feature extractor var grads by this"}
- )
- quantizer_depth: int = field(
- default=1,
- metadata={"help": "number of quantizer layers"},
- )
- quantizer_factor: int = field(
- default=3,
- metadata={
- "help": "dimensionality increase for inner quantizer layers (if depth > 1)"
- },
- )
- latent_vars: int = field(
- default=320,
- metadata={"help": "number of latent variables V in each group of the codebook"},
- )
- latent_groups: int = field(
- default=2,
- metadata={"help": "number of groups G of latent variables in the codebook"},
- )
- latent_dim: int = field(
- default=0,
- metadata={
- "help": "if > 0, uses this dimensionality for latent variables. "
- "otherwise uses final_dim / latent_groups"
- },
- )
-
- # masking
- mask_length: int = field(default=10, metadata={"help": "mask length"})
- mask_prob: float = field(
- default=0.65, metadata={"help": "probability of replacing a token with mask"}
- )
- mask_selection: MASKING_DISTRIBUTION_CHOICES = field(
- default="static", metadata={"help": "how to choose mask length"}
- )
- mask_other: float = field(
- default=0,
- metadata={
- "help": "secondary mask argument (used for more complex distributions), "
- "see help in compute_mask_indices"
- },
- )
- no_mask_overlap: bool = field(
- default=False, metadata={"help": "whether to allow masks to overlap"}
- )
- mask_min_space: int = field(
- default=1,
- metadata={"help": "min space between spans (if no overlap is enabled)"},
- )
-
- # channel masking
- mask_channel_length: int = field(
- default=10, metadata={"help": "length of the mask for features (channels)"}
- )
- mask_channel_prob: float = field(
- default=0.0, metadata={"help": "probability of replacing a feature with 0"}
- )
- mask_channel_before: bool = False
- mask_channel_selection: MASKING_DISTRIBUTION_CHOICES = field(
- default="static",
- metadata={"help": "how to choose mask length for channel masking"},
- )
- mask_channel_other: float = field(
- default=0,
- metadata={
- "help": "secondary mask argument (used for more complex distributions), "
- "see help in compute_mask_indices"
- },
- )
- no_mask_channel_overlap: bool = field(
- default=False, metadata={"help": "whether to allow channel masks to overlap"}
- )
- mask_channel_min_space: int = field(
- default=1,
- metadata={"help": "min space between spans (if no overlap is enabled)"},
- )
-
- # negative selection
- num_negatives: int = field(
- default=100,
- metadata={"help": "number of negative examples from the same sample"},
- )
- negatives_from_everywhere: bool = field(
- default=False,
- metadata={"help": "sample negatives from everywhere, not just masked states"},
- )
- cross_sample_negatives: int = field(
- default=0, metadata={"help": "number of negative examples from any sample"}
- )
- codebook_negatives: int = field(
- default=0, metadata={"help": "number of negative examples from the codebook"}
- )
-
- # positional embeddings
- conv_pos: int = field(
- default=128,
- metadata={"help": "number of filters for convolutional positional embeddings"},
- )
- conv_pos_groups: int = field(
- default=16,
- metadata={"help": "number of groups for convolutional positional embedding"},
- )
-
- latent_temp: Tuple[float, float, float] = field(
- default=(2, 0.5, 0.999995),
- metadata={
- "help": "temperature for latent variable sampling. "
- "can be tuple of 3 values (start, end, decay)"
- },
- )
-
-
-@register_model("wav2vec2", dataclass=Wav2Vec2Config)
-class Wav2Vec2Model(BaseFairseqModel):
- def __init__(self, cfg: Wav2Vec2Config):
- super().__init__()
- self.cfg = cfg
-
- feature_enc_layers = eval(cfg.conv_feature_layers)
- self.embed = feature_enc_layers[-1][0]
-
- self.feature_extractor = ConvFeatureExtractionModel(
- conv_layers=feature_enc_layers,
- dropout=0.0,
- mode=cfg.extractor_mode,
- conv_bias=cfg.conv_bias,
- )
-
- self.post_extract_proj = (
- nn.Linear(self.embed, cfg.encoder_embed_dim)
- if self.embed != cfg.encoder_embed_dim and not cfg.quantize_input
- else None
- )
-
- self.mask_prob = cfg.mask_prob
- self.mask_selection = cfg.mask_selection
- self.mask_other = cfg.mask_other
- self.mask_length = cfg.mask_length
- self.no_mask_overlap = cfg.no_mask_overlap
- self.mask_min_space = cfg.mask_min_space
-
- self.mask_channel_prob = cfg.mask_channel_prob
- self.mask_channel_before = cfg.mask_channel_before
- self.mask_channel_selection = cfg.mask_channel_selection
- self.mask_channel_other = cfg.mask_channel_other
- self.mask_channel_length = cfg.mask_channel_length
- self.no_mask_channel_overlap = cfg.no_mask_channel_overlap
- self.mask_channel_min_space = cfg.mask_channel_min_space
-
- self.dropout_input = nn.Dropout(cfg.dropout_input)
- self.dropout_features = nn.Dropout(cfg.dropout_features)
-
- self.feature_grad_mult = cfg.feature_grad_mult
-
- self.quantizer = None
- self.input_quantizer = None
-
- self.n_negatives = cfg.num_negatives
- self.cross_sample_negatives = cfg.cross_sample_negatives
- self.codebook_negatives = cfg.codebook_negatives
- self.negatives_from_everywhere = cfg.negatives_from_everywhere
-
- self.logit_temp = cfg.logit_temp
-
- final_dim = cfg.final_dim if cfg.final_dim > 0 else cfg.encoder_embed_dim
-
- if cfg.quantize_targets:
- vq_dim = cfg.latent_dim if cfg.latent_dim > 0 else final_dim
- self.quantizer = GumbelVectorQuantizer(
- dim=self.embed,
- num_vars=cfg.latent_vars,
- temp=cfg.latent_temp,
- groups=cfg.latent_groups,
- combine_groups=False,
- vq_dim=vq_dim,
- time_first=True,
- weight_proj_depth=cfg.quantizer_depth,
- weight_proj_factor=cfg.quantizer_factor,
- )
- self.project_q = nn.Linear(vq_dim, final_dim)
- else:
- self.project_q = nn.Linear(self.embed, final_dim)
-
- if cfg.quantize_input:
- if cfg.same_quantizer and self.quantizer is not None:
- vq_dim = final_dim
- self.input_quantizer = self.quantizer
- else:
- vq_dim = cfg.latent_dim if cfg.latent_dim > 0 else cfg.encoder_embed_dim
- self.input_quantizer = GumbelVectorQuantizer(
- dim=self.embed,
- num_vars=cfg.latent_vars,
- temp=cfg.latent_temp,
- groups=cfg.latent_groups,
- combine_groups=False,
- vq_dim=vq_dim,
- time_first=True,
- weight_proj_depth=cfg.quantizer_depth,
- weight_proj_factor=cfg.quantizer_factor,
- )
- self.project_inp = nn.Linear(vq_dim, cfg.encoder_embed_dim)
-
- self.mask_emb = nn.Parameter(
- torch.FloatTensor(cfg.encoder_embed_dim).uniform_()
- )
-
- self.encoder = TransformerEncoder(cfg)
- self.layer_norm = LayerNorm(self.embed)
-
- self.target_glu = None
- if cfg.target_glu:
- self.target_glu = nn.Sequential(
- nn.Linear(final_dim, final_dim * 2), nn.GLU()
- )
-
- self.final_proj = nn.Linear(cfg.encoder_embed_dim, final_dim)
-
- def upgrade_state_dict_named(self, state_dict, name):
- """Upgrade a (possibly old) state dict for new versions of fairseq."""
- super().upgrade_state_dict_named(state_dict, name)
- return state_dict
-
- @classmethod
- def build_model(cls, cfg: Wav2Vec2Config, task=None):
- """Build a new model instance."""
-
- return cls(cfg)
-
- def apply_mask(
- self,
- x,
- padding_mask,
- mask_indices=None,
- mask_channel_indices=None,
- ):
- B, T, C = x.shape
-
- if self.mask_channel_prob > 0 and self.mask_channel_before:
- mask_channel_indices = compute_mask_indices(
- (B, C),
- None,
- self.mask_channel_prob,
- self.mask_channel_length,
- self.mask_channel_selection,
- self.mask_channel_other,
- no_overlap=self.no_mask_channel_overlap,
- min_space=self.mask_channel_min_space,
- )
- mask_channel_indices = (
- torch.from_numpy(mask_channel_indices)
- .to(x.device)
- .unsqueeze(1)
- .expand(-1, T, -1)
- )
- x[mask_channel_indices] = 0
-
- if self.mask_prob > 0:
- if mask_indices is None:
- mask_indices = compute_mask_indices(
- (B, T),
- padding_mask,
- self.mask_prob,
- self.mask_length,
- self.mask_selection,
- self.mask_other,
- min_masks=2,
- no_overlap=self.no_mask_overlap,
- min_space=self.mask_min_space,
- )
- mask_indices = torch.from_numpy(mask_indices).to(x.device)
- x = index_put(x, mask_indices, self.mask_emb)
- else:
- mask_indices = None
-
- if self.mask_channel_prob > 0 and not self.mask_channel_before:
- if mask_channel_indices is None:
- mask_channel_indices = compute_mask_indices(
- (B, C),
- None,
- self.mask_channel_prob,
- self.mask_channel_length,
- self.mask_channel_selection,
- self.mask_channel_other,
- no_overlap=self.no_mask_channel_overlap,
- min_space=self.mask_channel_min_space,
- )
- mask_channel_indices = (
- torch.from_numpy(mask_channel_indices)
- .to(x.device)
- .unsqueeze(1)
- .expand(-1, T, -1)
- )
- x = index_put(x, mask_channel_indices, 0)
-
- return x, mask_indices
-
- def sample_negatives(self, y, num, padding_count=None):
-
- if self.n_negatives == 0 and self.cross_sample_negatives == 0:
- return y.new(0)
-
- bsz, tsz, fsz = y.shape
- y = y.view(-1, fsz) # BTC => (BxT)C
-
- # FIXME: what happens if padding_count is specified?
- cross_high = tsz * bsz
- high = tsz - (padding_count or 0)
- with torch.no_grad():
- assert high > 1, f"{bsz,tsz,fsz}"
-
- if self.n_negatives > 0:
- tszs = (
- buffered_arange(num)
- .unsqueeze(-1)
- .expand(-1, self.n_negatives)
- .flatten()
- )
-
- neg_idxs = torch.randint(
- low=0, high=high - 1, size=(bsz, self.n_negatives * num)
- )
- neg_idxs[neg_idxs >= tszs] += 1
-
- if self.cross_sample_negatives > 0:
- tszs = (
- buffered_arange(num)
- .unsqueeze(-1)
- .expand(-1, self.cross_sample_negatives)
- .flatten()
- )
-
- cross_neg_idxs = torch.randint(
- low=0,
- high=cross_high - 1,
- size=(bsz, self.cross_sample_negatives * num),
- )
- cross_neg_idxs[cross_neg_idxs >= tszs] += 1
-
- if self.n_negatives > 0:
- for i in range(1, bsz):
- neg_idxs[i] += i * high
- else:
- neg_idxs = cross_neg_idxs
-
- if self.cross_sample_negatives > 0 and self.n_negatives > 0:
- neg_idxs = torch.cat([neg_idxs, cross_neg_idxs], dim=1)
-
- negs = y[neg_idxs.view(-1)]
- negs = negs.view(
- bsz, num, self.n_negatives + self.cross_sample_negatives, fsz
- ).permute(
- 2, 0, 1, 3
- ) # to NxBxTxC
- return negs, neg_idxs
-
- def compute_preds(self, x, y, negatives):
-
- neg_is_pos = (y == negatives).all(-1)
- y = y.unsqueeze(0)
- targets = torch.cat([y, negatives], dim=0)
-
- logits = torch.cosine_similarity(x.float(), targets.float(), dim=-1).type_as(x)
-
- logits = logits / self.logit_temp
-
- if is_xla_tensor(logits) or neg_is_pos.any():
- fillval = -float(2 ** 30)
- if not hasattr(self, "_inftensor"):
- self._inftensor = (
- torch.tensor(fillval).to(x.device)
- if is_xla_tensor(logits)
- else float("-inf")
- )
- logits[1:] = index_put(logits[1:], neg_is_pos, self._inftensor)
-
- return logits
-
- def _get_feat_extract_output_lengths(self, input_lengths: torch.LongTensor):
- """
- Computes the output length of the convolutional layers
- """
-
- def _conv_out_length(input_length, kernel_size, stride):
- return torch.floor((input_length - kernel_size) / stride + 1)
-
- conv_cfg_list = eval(self.cfg.conv_feature_layers)
-
- for i in range(len(conv_cfg_list)):
- input_lengths = _conv_out_length(
- input_lengths, conv_cfg_list[i][1], conv_cfg_list[i][2]
- )
-
- return input_lengths.to(torch.long)
-
- def forward(
- self,
- source,
- padding_mask=None,
- mask=True,
- features_only=False,
- layer=None,
- mask_indices=None,
- mask_channel_indices=None,
- padding_count=None,
- ):
-
- if self.feature_grad_mult > 0:
- features = self.feature_extractor(source)
- if self.feature_grad_mult != 1.0:
- features = GradMultiply.apply(features, self.feature_grad_mult)
- else:
- with torch.no_grad():
- features = self.feature_extractor(source)
-
- features_pen = features.float().pow(2).mean()
-
- features = features.transpose(1, 2)
- features = self.layer_norm(features)
- unmasked_features = features.clone()
-
- if padding_mask is not None and padding_mask.any():
- input_lengths = (1 - padding_mask.long()).sum(-1)
- # apply conv formula to get real output_lengths
- output_lengths = self._get_feat_extract_output_lengths(input_lengths)
-
- padding_mask = torch.zeros(
- features.shape[:2], dtype=features.dtype, device=features.device
- )
-
- # these two operations make sure that all values
- # before the output length indices are attended to
- padding_mask[
- (
- torch.arange(padding_mask.shape[0], device=padding_mask.device),
- output_lengths - 1,
- )
- ] = 1
- padding_mask = (1 - padding_mask.flip([-1]).cumsum(-1).flip([-1])).bool()
- else:
- padding_mask = None
-
- if self.post_extract_proj is not None:
- features = self.post_extract_proj(features)
-
- features = self.dropout_input(features)
- unmasked_features = self.dropout_features(unmasked_features)
-
- num_vars = None
- code_ppl = None
- prob_ppl = None
- curr_temp = None
-
- if self.input_quantizer:
- q = self.input_quantizer(features, produce_targets=False)
- features = q["x"]
- num_vars = q["num_vars"]
- code_ppl = q["code_perplexity"]
- prob_ppl = q["prob_perplexity"]
- curr_temp = q["temp"]
- features = self.project_inp(features)
-
- if mask:
- x, mask_indices = self.apply_mask(
- features,
- padding_mask,
- mask_indices=mask_indices,
- mask_channel_indices=mask_channel_indices,
- )
- if not is_xla_tensor(x) and mask_indices is not None:
- # tpu-comment: reducing the size in a dynamic way causes
- # too many recompilations on xla.
- y = unmasked_features[mask_indices].view(
- unmasked_features.size(0), -1, unmasked_features.size(-1)
- )
- else:
- y = unmasked_features
- else:
- x = features
- y = unmasked_features
- mask_indices = None
-
- x, layer_results = self.encoder(x, padding_mask=padding_mask, layer=layer)
-
- if features_only:
- return {
- "x": x,
- "padding_mask": padding_mask,
- "features": unmasked_features,
- "layer_results": layer_results,
- }
-
- if self.quantizer:
- q = self.quantizer(y, produce_targets=False)
- y = q["x"]
- num_vars = q["num_vars"]
- code_ppl = q["code_perplexity"]
- prob_ppl = q["prob_perplexity"]
- curr_temp = q["temp"]
-
- y = self.project_q(y)
-
- if self.negatives_from_everywhere:
- neg_cands = self.quantizer(unmasked_features, produce_targets=False)[
- "x"
- ]
- negs, _ = self.sample_negatives(
- neg_cands,
- y.size(1),
- padding_count=padding_count,
- )
- negs = self.project_q(negs)
-
- else:
- negs, _ = self.sample_negatives(
- y,
- y.size(1),
- padding_count=padding_count,
- )
-
- if self.codebook_negatives > 0:
- cb_negs = self.quantizer.sample_from_codebook(
- y.size(0) * y.size(1), self.codebook_negatives
- )
- cb_negs = cb_negs.view(
- self.codebook_negatives, y.size(0), y.size(1), -1
- ) # order doesn't matter
- cb_negs = self.project_q(cb_negs)
- negs = torch.cat([negs, cb_negs], dim=0)
- else:
- y = self.project_q(y)
-
- if self.negatives_from_everywhere:
- negs, _ = self.sample_negatives(
- unmasked_features,
- y.size(1),
- padding_count=padding_count,
- )
- negs = self.project_q(negs)
- else:
- negs, _ = self.sample_negatives(
- y,
- y.size(1),
- padding_count=padding_count,
- )
-
- if not is_xla_tensor(x):
- # tpu-comment: reducing the size in a dynamic way causes
- # too many recompilations on xla.
- x = x[mask_indices].view(x.size(0), -1, x.size(-1))
-
- if self.target_glu:
- y = self.target_glu(y)
- negs = self.target_glu(negs)
-
- x = self.final_proj(x)
- x = self.compute_preds(x, y, negs)
-
- result = {
- "x": x,
- "padding_mask": padding_mask,
- "features_pen": features_pen,
- }
-
- if prob_ppl is not None:
- result["prob_perplexity"] = prob_ppl
- result["code_perplexity"] = code_ppl
- result["num_vars"] = num_vars
- result["temp"] = curr_temp
-
- return result
-
- def quantize(self, x):
- assert self.quantizer is not None
- x = self.feature_extractor(x)
- x = x.transpose(1, 2)
- x = self.layer_norm(x)
- return self.quantizer.forward_idx(x)
-
- def extract_features(self, source, padding_mask, mask=False, layer=None):
- res = self.forward(
- source, padding_mask, mask=mask, features_only=True, layer=layer
- )
- return res
-
- def get_logits(self, net_output):
- logits = net_output["x"]
- logits = logits.transpose(0, 2)
- logits = logits.reshape(-1, logits.size(-1))
- return logits
-
- def get_targets(self, sample, net_output, expand_steps=True):
- x = net_output["x"]
- return x.new_zeros(x.size(1) * x.size(2), dtype=torch.long)
-
- def get_extra_losses(self, net_output):
- pen = []
-
- if "prob_perplexity" in net_output:
- pen.append(
- (net_output["num_vars"] - net_output["prob_perplexity"])
- / net_output["num_vars"]
- )
-
- if "features_pen" in net_output:
- pen.append(net_output["features_pen"])
-
- return pen
-
- def remove_pretraining_modules(self):
- self.quantizer = None
- self.project_q = None
- self.target_glu = None
- self.final_proj = None
-
-
-class ConvFeatureExtractionModel(nn.Module):
- def __init__(
- self,
- conv_layers: List[Tuple[int, int, int]],
- dropout: float = 0.0,
- mode: str = "default",
- conv_bias: bool = False,
- ):
- super().__init__()
-
- assert mode in {"default", "layer_norm"}
-
- def block(
- n_in,
- n_out,
- k,
- stride,
- is_layer_norm=False,
- is_group_norm=False,
- conv_bias=False,
- ):
- def make_conv():
- conv = nn.Conv1d(n_in, n_out, k, stride=stride, bias=conv_bias)
- nn.init.kaiming_normal_(conv.weight)
- return conv
-
- assert not (
- is_layer_norm and is_group_norm
- ), "layer norm and group norm are exclusive"
-
- if is_layer_norm:
- return nn.Sequential(
- make_conv(),
- nn.Dropout(p=dropout),
- nn.Sequential(
- TransposeLast(),
- Fp32LayerNorm(dim, elementwise_affine=True),
- TransposeLast(),
- ),
- nn.GELU(),
- )
- elif is_group_norm:
- return nn.Sequential(
- make_conv(),
- nn.Dropout(p=dropout),
- Fp32GroupNorm(dim, dim, affine=True),
- nn.GELU(),
- )
- else:
- return nn.Sequential(make_conv(), nn.Dropout(p=dropout), nn.GELU())
-
- in_d = 1
- self.conv_layers = nn.ModuleList()
- for i, cl in enumerate(conv_layers):
- assert len(cl) == 3, "invalid conv definition: " + str(cl)
- (dim, k, stride) = cl
-
- self.conv_layers.append(
- block(
- in_d,
- dim,
- k,
- stride,
- is_layer_norm=mode == "layer_norm",
- is_group_norm=mode == "default" and i == 0,
- conv_bias=conv_bias,
- )
- )
- in_d = dim
-
- def forward(self, x):
-
- # BxT -> BxCxT
- x = x.unsqueeze(1)
-
- for conv in self.conv_layers:
- x = conv(x)
-
- return x
-
-
-class TransformerEncoder(nn.Module):
- def __init__(self, args):
- super().__init__()
-
- self.args = args
- self.dropout = args.dropout
- self.embedding_dim = args.encoder_embed_dim
-
- self.pos_conv = nn.Conv1d(
- self.embedding_dim,
- self.embedding_dim,
- kernel_size=args.conv_pos,
- padding=args.conv_pos // 2,
- groups=args.conv_pos_groups,
- )
- dropout = 0
- std = math.sqrt((4 * (1.0 - dropout)) / (args.conv_pos * self.embedding_dim))
- nn.init.normal_(self.pos_conv.weight, mean=0, std=std)
- nn.init.constant_(self.pos_conv.bias, 0)
-
- self.pos_conv = nn.utils.weight_norm(self.pos_conv, name="weight", dim=2)
- self.pos_conv = nn.Sequential(self.pos_conv, SamePad(args.conv_pos), nn.GELU())
-
- self.layers = nn.ModuleList(
- [
- TransformerSentenceEncoderLayer(
- embedding_dim=self.embedding_dim,
- ffn_embedding_dim=args.encoder_ffn_embed_dim,
- num_attention_heads=args.encoder_attention_heads,
- dropout=self.dropout,
- attention_dropout=args.attention_dropout,
- activation_dropout=args.activation_dropout,
- activation_fn=args.activation_fn,
- layer_norm_first=args.layer_norm_first,
- )
- for _ in range(args.encoder_layers)
- ]
- )
-
- self.layer_norm_first = args.layer_norm_first
- self.layer_norm = LayerNorm(self.embedding_dim)
- self.layerdrop = args.encoder_layerdrop
-
- self.apply(init_bert_params)
-
- def forward(self, x, padding_mask=None, layer=None):
- x, layer_results = self.extract_features(x, padding_mask, layer)
-
- if self.layer_norm_first and layer is None:
- x = self.layer_norm(x)
-
- return x, layer_results
-
- def extract_features(self, x, padding_mask=None, tgt_layer=None):
-
- if padding_mask is not None:
- x = index_put(x, padding_mask, 0)
-
- x_conv = self.pos_conv(x.transpose(1, 2))
- x_conv = x_conv.transpose(1, 2)
- x = x + x_conv
-
- if not self.layer_norm_first:
- x = self.layer_norm(x)
-
- x = F.dropout(x, p=self.dropout, training=self.training)
-
- # B x T x C -> T x B x C
- x = x.transpose(0, 1)
-
- layer_results = []
- r = None
- for i, layer in enumerate(self.layers):
- dropout_probability = np.random.random()
- if not self.training or (dropout_probability > self.layerdrop):
- x, z = layer(x, self_attn_padding_mask=padding_mask, need_weights=False)
- if tgt_layer is not None:
- layer_results.append((x, z))
- if i == tgt_layer:
- r = x
- break
-
- if r is not None:
- x = r
-
- # T x B x C -> B x T x C
- x = x.transpose(0, 1)
-
- return x, layer_results
-
- def max_positions(self):
- """Maximum output length supported by the encoder."""
- return self.args.max_positions
-
- def upgrade_state_dict_named(self, state_dict, name):
- """Upgrade a (possibly old) state dict for new versions of fairseq."""
- return state_dict
-
-
-class TransformerSentenceEncoderLayer(nn.Module):
- """
- Implements a Transformer Encoder Layer used in BERT/XLM style pre-trained
- models.
- """
-
- def __init__(
- self,
- embedding_dim: int = 768,
- ffn_embedding_dim: int = 3072,
- num_attention_heads: int = 8,
- dropout: float = 0.1,
- attention_dropout: float = 0.1,
- activation_dropout: float = 0.1,
- activation_fn: str = "relu",
- layer_norm_first: bool = False,
- ) -> None:
-
- super().__init__()
- # Initialize parameters
- self.embedding_dim = embedding_dim
- self.dropout = dropout
- self.activation_dropout = activation_dropout
-
- # Initialize blocks
- self.activation_fn = utils.get_activation_fn(activation_fn)
- self.self_attn = MultiheadAttention(
- self.embedding_dim,
- num_attention_heads,
- dropout=attention_dropout,
- self_attention=True,
- )
-
- self.dropout1 = nn.Dropout(dropout)
- self.dropout2 = nn.Dropout(self.activation_dropout)
- self.dropout3 = nn.Dropout(dropout)
-
- self.layer_norm_first = layer_norm_first
-
- # layer norm associated with the self attention layer
- self.self_attn_layer_norm = LayerNorm(self.embedding_dim)
- self.fc1 = nn.Linear(self.embedding_dim, ffn_embedding_dim)
- self.fc2 = nn.Linear(ffn_embedding_dim, self.embedding_dim)
-
- # layer norm associated with the position wise feed-forward NN
- self.final_layer_norm = LayerNorm(self.embedding_dim)
-
- def forward(
- self,
- x: torch.Tensor,
- self_attn_mask: torch.Tensor = None,
- self_attn_padding_mask: torch.Tensor = None,
- need_weights: bool = False,
- att_args=None,
- ):
- """
- LayerNorm is applied either before or after the self-attention/ffn
- modules similar to the original Transformer implementation.
- """
- residual = x
-
- if self.layer_norm_first:
- x = self.self_attn_layer_norm(x)
- x, attn = self.self_attn(
- query=x,
- key=x,
- value=x,
- key_padding_mask=self_attn_padding_mask,
- attn_mask=self_attn_mask,
- )
- x = self.dropout1(x)
- x = residual + x
-
- residual = x
- x = self.final_layer_norm(x)
- x = self.activation_fn(self.fc1(x))
- x = self.dropout2(x)
- x = self.fc2(x)
- x = self.dropout3(x)
- x = residual + x
- else:
- x, attn = self.self_attn(
- query=x,
- key=x,
- value=x,
- key_padding_mask=self_attn_padding_mask,
- )
-
- x = self.dropout1(x)
- x = residual + x
-
- x = self.self_attn_layer_norm(x)
-
- residual = x
- x = self.activation_fn(self.fc1(x))
- x = self.dropout2(x)
- x = self.fc2(x)
- x = self.dropout3(x)
- x = residual + x
- x = self.final_layer_norm(x)
-
- return x, attn
diff --git a/spaces/mshukor/UnIVAL/fairseq/fairseq/modules/sinusoidal_positional_embedding.py b/spaces/mshukor/UnIVAL/fairseq/fairseq/modules/sinusoidal_positional_embedding.py
deleted file mode 100644
index 4793ecfb522d0729fc2d24a3ddf0c6a774d67773..0000000000000000000000000000000000000000
--- a/spaces/mshukor/UnIVAL/fairseq/fairseq/modules/sinusoidal_positional_embedding.py
+++ /dev/null
@@ -1,105 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-
-import math
-from typing import Any, Optional
-
-import torch
-import torch.onnx.operators
-from fairseq import utils
-from torch import Tensor, nn
-
-
-class SinusoidalPositionalEmbedding(nn.Module):
- """This module produces sinusoidal positional embeddings of any length.
-
- Padding symbols are ignored.
- """
-
- def __init__(self, embedding_dim, padding_idx, init_size=1024):
- super().__init__()
- self.embedding_dim = embedding_dim
- self.padding_idx = padding_idx if padding_idx is not None else 0
- self.weights = SinusoidalPositionalEmbedding.get_embedding(
- init_size, embedding_dim, padding_idx
- )
- self.onnx_trace = False
- self.register_buffer("_float_tensor", torch.FloatTensor(1))
- self.max_positions = int(1e5)
-
- def prepare_for_onnx_export_(self):
- self.onnx_trace = True
-
- @staticmethod
- def get_embedding(
- num_embeddings: int, embedding_dim: int, padding_idx: Optional[int] = None
- ):
- """Build sinusoidal embeddings.
-
- This matches the implementation in tensor2tensor, but differs slightly
- from the description in Section 3.5 of "Attention Is All You Need".
- """
- half_dim = embedding_dim // 2
- emb = math.log(10000) / (half_dim - 1)
- emb = torch.exp(torch.arange(half_dim, dtype=torch.float) * -emb)
- emb = torch.arange(num_embeddings, dtype=torch.float).unsqueeze(
- 1
- ) * emb.unsqueeze(0)
- emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1).view(
- num_embeddings, -1
- )
- if embedding_dim % 2 == 1:
- # zero pad
- emb = torch.cat([emb, torch.zeros(num_embeddings, 1)], dim=1)
- if padding_idx is not None:
- emb[padding_idx, :] = 0
- return emb
-
- def forward(
- self,
- input,
- incremental_state: Optional[Any] = None,
- timestep: Optional[Tensor] = None,
- positions: Optional[Any] = None,
- ):
- """Input is expected to be of size [bsz x seqlen]."""
- bspair = torch.onnx.operators.shape_as_tensor(input)
- bsz, seq_len = bspair[0], bspair[1]
- max_pos = self.padding_idx + 1 + seq_len
- if self.weights is None or max_pos > self.weights.size(0):
- # recompute/expand embeddings if needed
- self.weights = SinusoidalPositionalEmbedding.get_embedding(
- max_pos, self.embedding_dim, self.padding_idx
- )
- self.weights = self.weights.to(self._float_tensor)
-
- if incremental_state is not None:
- # positions is the same for every token when decoding a single step
- pos = timestep.view(-1)[0] + 1 if timestep is not None else seq_len
- if self.onnx_trace:
- return (
- self.weights.index_select(index=self.padding_idx + pos, dim=0)
- .unsqueeze(1)
- .repeat(bsz, 1, 1)
- )
- return self.weights[self.padding_idx + pos, :].expand(bsz, 1, -1)
-
- positions = utils.make_positions(
- input, self.padding_idx, onnx_trace=self.onnx_trace
- )
- if self.onnx_trace:
- flat_embeddings = self.weights.detach().index_select(0, positions.view(-1))
- embedding_shape = torch.cat(
- (bsz.view(1), seq_len.view(1), torch.tensor([-1], dtype=torch.long))
- )
- embeddings = torch.onnx.operators.reshape_from_tensor_shape(
- flat_embeddings, embedding_shape
- )
- return embeddings
- return (
- self.weights.index_select(0, positions.view(-1))
- .view(bsz, seq_len, -1)
- .detach()
- )
diff --git a/spaces/mshukor/UnIVAL/run_scripts/averaging/ratatouille/scaling_best/snli_ve/unival_snli_ve_initcaption.sh b/spaces/mshukor/UnIVAL/run_scripts/averaging/ratatouille/scaling_best/snli_ve/unival_snli_ve_initcaption.sh
deleted file mode 100644
index 8aef19a45297fa6b038322cc3bb4ac396a9f60d9..0000000000000000000000000000000000000000
--- a/spaces/mshukor/UnIVAL/run_scripts/averaging/ratatouille/scaling_best/snli_ve/unival_snli_ve_initcaption.sh
+++ /dev/null
@@ -1,187 +0,0 @@
-#!/usr/bin/env bash
-
-# The port for communication. Note that if you want to run multiple tasks on the same machine,
-# you need to specify different port numbers.
-
-# Number of GPUs per GPU worker
-export GPUS_PER_NODE=8
-# Number of GPU workers, for single-worker training, please set to 1
-export NUM_NODES=$SLURM_NNODES
-# The ip address of the rank-0 worker, for single-worker training, please set to localhost
-master_addr=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
-export MASTER_ADDR=$master_addr
-
-# The port for communication
-export MASTER_PORT=12350
-# The rank of this worker, should be in {0, ..., WORKER_CNT-1}, for single-worker training, please set to 0
-export RANK=$SLURM_NODEID
-
-echo "MASTER_ADDR: $MASTER_ADDR"
-echo "RANK :$RANK"
-echo "NUM_NODES :$NUM_NODES"
-echo "GPUS_PER_NODE :$GPUS_PER_NODE"
-
-export MIOPEN_USER_DB_PATH=/lus/home/NAT/gda2204/mshukor/.config/miopen_${MASTER_ADDR}_${SLURM_PROCID}/
-
-echo "MIOPEN_USER_DB_PATH :$MIOPEN_USER_DB_PATH"
-
-
-
-exp_name=unival_snli_ve_initcaption
-
-
-
-ofa_dir=/lus/home/NAT/gda2204/mshukor/code/unival
-base_data_dir=/lus/scratch/NAT/gda2204/SHARED/data
-base_log_dir=/work/NAT/gda2204/mshukor/logs
-
-save_base_log_dir=/lus/scratch/NAT/gda2204/SHARED/logs
-save_dir=${save_base_log_dir}/ofa/checkpoints/snli_ve/${exp_name}
-
-# save_dir=${base_log_dir}/ofa/checkpoints/snli_ve/${exp_name}
-log_dir=${save_dir}
-
-
-mkdir -p $log_dir $save_dir
-
-bpe_dir=${ofa_dir}/utils/BPE
-user_dir=${ofa_dir}/ofa_module
-
-image_dir=${base_data_dir}
-
-
-
-
-data_dir=${base_data_dir}/ofa/snli_ve_data
-data=${data_dir}/snli_ve_train.tsv,${data_dir}/snli_ve_dev.tsv
-
-restore_file=/lus/scratch/NAT/gda2204/SHARED/logs/ofa/checkpoints/caption/unival_caption_stage_1/10_0.06_6000/checkpoint_best.pt
-
-
-
-selected_cols=0,2,3,4,5
-
-task=snli_ve
-arch=unival_base
-criterion=adjust_label_smoothed_cross_entropy
-label_smoothing=0.0
-lr=5e-5
-max_epoch=10
-warmup_ratio=0.06
-batch_size=8
-update_freq=4
-resnet_drop_path_rate=0.0
-encoder_drop_path_rate=0.1
-decoder_drop_path_rate=0.1
-dropout=0.1
-attention_dropout=0.0
-max_src_length=80
-max_tgt_length=20
-num_bins=1000
-patch_image_size=480
-prompt_type="prev_output"
-
-
-
-echo "max_epoch "${max_epoch}
-echo "lr "${lr}
-
-log_file=${log_dir}/${max_epoch}"_"${lr}".log"
-save_path=${save_dir}/${max_epoch}"_"${lr}
-mkdir -p $save_path
-
-
-
-
-
-
-###
-image_encoder_name=timm_resnet #vit_base_patch16_224 timm_resnet resnet
-patch_image_size=480
-resnet_type=resnet101
-
-resnet_model_path=${base_log_dir}/pretrained_models/resnet101-5d3b4d8f.pth
-
-# video
-video_encoder_name=all_resnext101
-patch_frame_size=384
-video_model_path=${base_log_dir}/pretrained_models/3dcnn/resnext-101-kinetics.pth #${base_log_dir}/pretrained_models/TimeSformer_divST_8x32_224_K600.pyth
-num_frames=4
-
-save_interval=1
-validate_interval_updates=50000
-save_interval_updates=0
-
-
-sample_patch_num='--sample-patch-num=784' # ''
-
-
-
-python3 -m torch.distributed.launch \
- --nnodes=${NUM_NODES} \
- --nproc_per_node=${GPUS_PER_NODE} \
- --master_port=${MASTER_PORT} \
- --node_rank=${RANK} \
- --master_addr=${MASTER_ADDR} \
- --use_env ${ofa_dir}/train.py \
- $data \
- --selected-cols=${selected_cols} \
- --bpe-dir=${bpe_dir} \
- --user-dir=${user_dir} \
- --restore-file=${restore_file} \
- --reset-optimizer --reset-dataloader --reset-meters \
- --save-dir=${save_path} \
- --task=${task} \
- --arch=${arch} \
- --criterion=${criterion} \
- --label-smoothing=${label_smoothing} \
- --batch-size=${batch_size} \
- --update-freq=${update_freq} \
- --encoder-normalize-before \
- --decoder-normalize-before \
- --share-decoder-input-output-embed \
- --share-all-embeddings \
- --layernorm-embedding \
- --patch-layernorm-embedding \
- --code-layernorm-embedding \
- --resnet-drop-path-rate=${resnet_drop_path_rate} \
- --encoder-drop-path-rate=${encoder_drop_path_rate} \
- --decoder-drop-path-rate=${decoder_drop_path_rate} \
- --dropout=${dropout} \
- --attention-dropout=${attention_dropout} \
- --weight-decay=0.01 --optimizer=adam --adam-betas="(0.9,0.999)" --adam-eps=1e-08 --clip-norm=1.0 \
- --lr-scheduler=polynomial_decay --lr=${lr} \
- --max-epoch=${max_epoch} --warmup-ratio=${warmup_ratio} \
- --log-format=simple --log-interval=10 \
- --fixed-validation-seed=7 \
- --keep-best-checkpoints=1 \
- --no-epoch-checkpoints \
- --save-interval=1 --validate-interval=1 \
- --save-interval-updates=${save_interval_updates} --validate-interval-updates=${validate_interval_updates} \
- --best-checkpoint-metric=snli_score --maximize-best-checkpoint-metric \
- --max-src-length=${max_src_length} \
- --max-tgt-length=${max_tgt_length} \
- --find-unused-parameters \
- --add-type-embedding \
- --scale-attn \
- --scale-fc \
- --scale-heads \
- --disable-entangle \
- --num-bins=${num_bins} \
- --patch-image-size=${patch_image_size} \
- --prompt-type=${prompt_type} \
- --fp16 \
- --fp16-scale-window=512 \
- --num-workers=0 \
- --image-dir=${image_dir} \
- ${sample_patch_num} \
- --image-encoder-name=${image_encoder_name} \
- --video-encoder-name=${video_encoder_name} \
- --video-model-path=${video_model_path} \
- --patch-frame-size=${patch_frame_size} \
- --strict \
- --resnet-model-path=${resnet_model_path}
-
- # --add-caption \
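The script above sets `batch_size=8` and `update_freq=4`; fairseq multiplies the per-GPU batch by the gradient-accumulation factor and the total GPU count to get the effective global batch size. A quick sketch of the arithmetic (the node and GPU counts below are hypothetical — the script reads them from `NUM_NODES` and `GPUS_PER_NODE`):

```python
# Effective global batch size = per-GPU batch * gradient-accumulation steps * total GPUs.
batch_size = 8      # --batch-size
update_freq = 4     # --update-freq (gradient accumulation)
gpus_per_node = 8   # hypothetical value of GPUS_PER_NODE
num_nodes = 1       # hypothetical value of NUM_NODES

effective_batch = batch_size * update_freq * gpus_per_node * num_nodes
print(effective_batch)  # 256 with the hypothetical values above
```

Raising `update_freq` is the usual way to keep the effective batch constant when fewer GPUs are available.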
diff --git a/spaces/mygyasir/Real-Time-Voice-Cloning/vocoder_preprocess.py b/spaces/mygyasir/Real-Time-Voice-Cloning/vocoder_preprocess.py
deleted file mode 100644
index 7ede3dfb95972e2de575de35b9d4a9c6d642885e..0000000000000000000000000000000000000000
--- a/spaces/mygyasir/Real-Time-Voice-Cloning/vocoder_preprocess.py
+++ /dev/null
@@ -1,59 +0,0 @@
-from synthesizer.synthesize import run_synthesis
-from synthesizer.hparams import hparams
-from utils.argutils import print_args
-import argparse
-import os
-
-
-if __name__ == "__main__":
- class MyFormatter(argparse.ArgumentDefaultsHelpFormatter, argparse.RawDescriptionHelpFormatter):
- pass
-
- parser = argparse.ArgumentParser(
- description="Creates ground-truth aligned (GTA) spectrograms from the vocoder.",
- formatter_class=MyFormatter
- )
- parser.add_argument("datasets_root", type=str, help=\
- "Path to the directory containing your SV2TTS directory. If you specify both --in_dir and "
- "--out_dir, this argument won't be used.")
- parser.add_argument("--model_dir", type=str,
- default="synthesizer/saved_models/pretrained/", help=\
- "Path to the pretrained model directory.")
- parser.add_argument("-i", "--in_dir", type=str, default=argparse.SUPPRESS, help= \
- "Path to the synthesizer directory that contains the mel spectrograms, the wavs and the "
- "embeds. Defaults to /SV2TTS/synthesizer/.")
- parser.add_argument("-o", "--out_dir", type=str, default=argparse.SUPPRESS, help= \
- "Path to the output vocoder directory that will contain the ground truth aligned mel "
- "spectrograms. Defaults to /SV2TTS/vocoder/.")
- parser.add_argument("--hparams", default="",
- help="Hyperparameter overrides as a comma-separated list of name=value "
- "pairs")
- parser.add_argument("--no_trim", action="store_true", help=\
- "Preprocess audio without trimming silences (not recommended).")
- parser.add_argument("--cpu", action="store_true", help=\
- "If True, processing is done on CPU, even when a GPU is available.")
- args = parser.parse_args()
- print_args(args, parser)
- modified_hp = hparams.parse(args.hparams)
-
- if not hasattr(args, "in_dir"):
- args.in_dir = os.path.join(args.datasets_root, "SV2TTS", "synthesizer")
- if not hasattr(args, "out_dir"):
- args.out_dir = os.path.join(args.datasets_root, "SV2TTS", "vocoder")
-
- if args.cpu:
- # Hide GPUs from Pytorch to force CPU processing
- os.environ["CUDA_VISIBLE_DEVICES"] = "-1"
-
- # Verify webrtcvad is available
- if not args.no_trim:
- try:
- import webrtcvad
-        except ImportError:
- raise ModuleNotFoundError("Package 'webrtcvad' not found. This package enables "
- "noise removal and is recommended. Please install and try again. If installation fails, "
- "use --no_trim to disable this error message.")
- del args.no_trim
-
- run_synthesis(args.in_dir, args.out_dir, args.model_dir, modified_hp)
-
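The script relies on `argparse.SUPPRESS` so that `--in_dir` and `--out_dir` are simply absent from the parsed namespace unless the user passes them, which is what makes the `hasattr` checks work. A minimal, self-contained sketch of the same pattern:

```python
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("datasets_root", type=str)
# With default=argparse.SUPPRESS, the attribute is missing from the
# namespace entirely when the flag is not given (instead of being None).
parser.add_argument("-i", "--in_dir", type=str, default=argparse.SUPPRESS)

args = parser.parse_args(["/data"])
if not hasattr(args, "in_dir"):
    # Derive the default from the positional argument only when needed.
    args.in_dir = os.path.join(args.datasets_root, "SV2TTS", "synthesizer")

print(args.in_dir)  # /data/SV2TTS/synthesizer (on POSIX)
```

This lets a default depend on another argument, which a plain `default=` value cannot do.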
diff --git a/spaces/navervision/MLSD/utils.py b/spaces/navervision/MLSD/utils.py
deleted file mode 100644
index 6f7a0767f0220dc0d6a204f694ca6127e87a86f9..0000000000000000000000000000000000000000
--- a/spaces/navervision/MLSD/utils.py
+++ /dev/null
@@ -1,511 +0,0 @@
-'''
-M-LSD
-Copyright 2021-present NAVER Corp.
-Apache License v2.0
-'''
-import os
-import numpy as np
-import cv2
-import tensorflow as tf
-
-
-def pred_lines(image, interpreter, input_details, output_details, input_shape=[512, 512], score_thr=0.10, dist_thr=20.0):
- h, w, _ = image.shape
- h_ratio, w_ratio = [h / input_shape[0], w / input_shape[1]]
-
- resized_image = np.concatenate([cv2.resize(image, (input_shape[0], input_shape[1]), interpolation=cv2.INTER_AREA), np.ones([input_shape[0], input_shape[1], 1])], axis=-1)
- batch_image = np.expand_dims(resized_image, axis=0).astype('float32')
- interpreter.set_tensor(input_details[0]['index'], batch_image)
- interpreter.invoke()
-
- pts = interpreter.get_tensor(output_details[0]['index'])[0]
- pts_score = interpreter.get_tensor(output_details[1]['index'])[0]
- vmap = interpreter.get_tensor(output_details[2]['index'])[0]
-
- start = vmap[:,:,:2]
- end = vmap[:,:,2:]
- dist_map = np.sqrt(np.sum((start - end) ** 2, axis=-1))
-
- segments_list = []
- for center, score in zip(pts, pts_score):
- y, x = center
- distance = dist_map[y, x]
- if score > score_thr and distance > dist_thr:
- disp_x_start, disp_y_start, disp_x_end, disp_y_end = vmap[y, x, :]
- x_start = x + disp_x_start
- y_start = y + disp_y_start
- x_end = x + disp_x_end
- y_end = y + disp_y_end
- segments_list.append([x_start, y_start, x_end, y_end])
-
- lines = 2 * np.array(segments_list) # 256 > 512
- lines[:,0] = lines[:,0] * w_ratio
- lines[:,1] = lines[:,1] * h_ratio
- lines[:,2] = lines[:,2] * w_ratio
- lines[:,3] = lines[:,3] * h_ratio
-
- return lines
-
-
-def pred_squares(image,
- interpreter,
- input_details,
- output_details,
- input_shape=[512, 512],
- params={'score': 0.06,
- 'outside_ratio': 0.28,
- 'inside_ratio': 0.45,
- 'w_overlap': 0.0,
- 'w_degree': 1.95,
- 'w_length': 0.0,
- 'w_area': 1.86,
- 'w_center': 0.14}):
- h, w, _ = image.shape
- original_shape = [h, w]
-
- resized_image = np.concatenate([cv2.resize(image, (input_shape[0], input_shape[1]), interpolation=cv2.INTER_AREA), np.ones([input_shape[0], input_shape[1], 1])], axis=-1)
- batch_image = np.expand_dims(resized_image, axis=0).astype('float32')
- interpreter.set_tensor(input_details[0]['index'], batch_image)
- interpreter.invoke()
-
- pts = interpreter.get_tensor(output_details[0]['index'])[0]
- pts_score = interpreter.get_tensor(output_details[1]['index'])[0]
- vmap = interpreter.get_tensor(output_details[2]['index'])[0]
-
- start = vmap[:,:,:2] # (x, y)
- end = vmap[:,:,2:] # (x, y)
- dist_map = np.sqrt(np.sum((start - end) ** 2, axis=-1))
-
- junc_list = []
- segments_list = []
- for junc, score in zip(pts, pts_score):
- y, x = junc
- distance = dist_map[y, x]
- if score > params['score'] and distance > 20.0:
- junc_list.append([x, y])
- disp_x_start, disp_y_start, disp_x_end, disp_y_end = vmap[y, x, :]
- d_arrow = 1.0
- x_start = x + d_arrow * disp_x_start
- y_start = y + d_arrow * disp_y_start
- x_end = x + d_arrow * disp_x_end
- y_end = y + d_arrow * disp_y_end
- segments_list.append([x_start, y_start, x_end, y_end])
-
- segments = np.array(segments_list)
-
- ####### post processing for squares
- # 1. get unique lines
- point = np.array([[0, 0]])
- point = point[0]
- start = segments[:,:2]
- end = segments[:,2:]
- diff = start - end
- a = diff[:, 1]
- b = -diff[:, 0]
- c = a * start[:,0] + b * start[:,1]
-
- d = np.abs(a * point[0] + b * point[1] - c) / np.sqrt(a ** 2 + b ** 2 + 1e-10)
- theta = np.arctan2(diff[:,0], diff[:,1]) * 180 / np.pi
- theta[theta < 0.0] += 180
- hough = np.concatenate([d[:,None], theta[:,None]], axis=-1)
-
- d_quant = 1
- theta_quant = 2
- hough[:,0] //= d_quant
- hough[:,1] //= theta_quant
- _, indices, counts = np.unique(hough, axis=0, return_index=True, return_counts=True)
-
- acc_map = np.zeros([512 // d_quant + 1, 360 // theta_quant + 1], dtype='float32')
- idx_map = np.zeros([512 // d_quant + 1, 360 // theta_quant + 1], dtype='int32') - 1
- yx_indices = hough[indices,:].astype('int32')
- acc_map[yx_indices[:,0], yx_indices[:,1]] = counts
- idx_map[yx_indices[:,0], yx_indices[:,1]] = indices
-
- acc_map_np = acc_map
- acc_map = acc_map[None,:,:,None]
-
- ### fast suppression using tensorflow op
- acc_map = tf.constant(acc_map, dtype=tf.float32)
- max_acc_map = tf.keras.layers.MaxPool2D(pool_size=(5,5), strides=1, padding='same')(acc_map)
- acc_map = acc_map * tf.cast(tf.math.equal(acc_map, max_acc_map), tf.float32)
- flatten_acc_map = tf.reshape(acc_map, [1, -1])
- topk_values, topk_indices = tf.math.top_k(flatten_acc_map, k=len(pts))
- _, h, w, _ = acc_map.shape
- y = tf.expand_dims(topk_indices // w, axis=-1)
- x = tf.expand_dims(topk_indices % w, axis=-1)
- yx = tf.concat([y, x], axis=-1)
- ###
-
- yx = yx[0].numpy()
- indices = idx_map[yx[:,0], yx[:,1]]
- topk_values = topk_values.numpy()[0]
- basis = 5 // 2
-
- merged_segments = []
- for yx_pt, max_indice, value in zip(yx, indices, topk_values):
- y, x = yx_pt
- if max_indice == -1 or value == 0:
- continue
- segment_list = []
- for y_offset in range(-basis, basis+1):
- for x_offset in range(-basis, basis+1):
- indice = idx_map[y+y_offset,x+x_offset]
- cnt = int(acc_map_np[y+y_offset,x+x_offset])
- if indice != -1:
- segment_list.append(segments[indice])
- if cnt > 1:
- check_cnt = 1
- current_hough = hough[indice]
- for new_indice, new_hough in enumerate(hough):
- if (current_hough == new_hough).all() and indice != new_indice:
- segment_list.append(segments[new_indice])
- check_cnt += 1
- if check_cnt == cnt:
- break
- group_segments = np.array(segment_list).reshape([-1, 2])
- sorted_group_segments = np.sort(group_segments, axis=0)
- x_min, y_min = sorted_group_segments[0,:]
- x_max, y_max = sorted_group_segments[-1,:]
-
- deg = theta[max_indice]
- if deg >= 90:
- merged_segments.append([x_min, y_max, x_max, y_min])
- else:
- merged_segments.append([x_min, y_min, x_max, y_max])
-
- # 2. get intersections
- new_segments = np.array(merged_segments) # (x1, y1, x2, y2)
- start = new_segments[:,:2] # (x1, y1)
- end = new_segments[:,2:] # (x2, y2)
- new_centers = (start + end) / 2.0
- diff = start - end
- dist_segments = np.sqrt(np.sum(diff ** 2, axis=-1))
-
- # ax + by = c
- a = diff[:,1]
- b = -diff[:,0]
- c = a * start[:,0] + b * start[:,1]
- pre_det = a[:,None] * b[None,:]
- det = pre_det - np.transpose(pre_det)
-
- pre_inter_y = a[:,None] * c[None,:]
- inter_y = (pre_inter_y - np.transpose(pre_inter_y)) / (det + 1e-10)
- pre_inter_x = c[:,None] * b[None,:]
- inter_x = (pre_inter_x - np.transpose(pre_inter_x)) / (det + 1e-10)
- inter_pts = np.concatenate([inter_x[:,:,None], inter_y[:,:,None]], axis=-1).astype('int32')
-
- # 3. get corner information
- # 3.1 get distance
- '''
- dist_segments:
- | dist(0), dist(1), dist(2), ...|
- dist_inter_to_segment1:
- | dist(inter,0), dist(inter,0), dist(inter,0), ... |
- | dist(inter,1), dist(inter,1), dist(inter,1), ... |
- ...
-    dist_inter_to_segment2:
- | dist(inter,0), dist(inter,1), dist(inter,2), ... |
- | dist(inter,0), dist(inter,1), dist(inter,2), ... |
- ...
- '''
-
- dist_inter_to_segment1_start = np.sqrt(np.sum(((inter_pts - start[:,None,:]) ** 2), axis=-1, keepdims=True)) # [n_batch, n_batch, 1]
- dist_inter_to_segment1_end = np.sqrt(np.sum(((inter_pts - end[:,None,:]) ** 2), axis=-1, keepdims=True)) # [n_batch, n_batch, 1]
- dist_inter_to_segment2_start = np.sqrt(np.sum(((inter_pts - start[None,:,:]) ** 2), axis=-1, keepdims=True)) # [n_batch, n_batch, 1]
- dist_inter_to_segment2_end = np.sqrt(np.sum(((inter_pts - end[None,:,:]) ** 2), axis=-1, keepdims=True)) # [n_batch, n_batch, 1]
-
- # sort ascending
- dist_inter_to_segment1 = np.sort(np.concatenate([dist_inter_to_segment1_start, dist_inter_to_segment1_end], axis=-1), axis=-1) # [n_batch, n_batch, 2]
- dist_inter_to_segment2 = np.sort(np.concatenate([dist_inter_to_segment2_start, dist_inter_to_segment2_end], axis=-1), axis=-1) # [n_batch, n_batch, 2]
-
- # 3.2 get degree
- inter_to_start = new_centers[:,None,:] - inter_pts
- deg_inter_to_start = np.arctan2(inter_to_start[:,:,1], inter_to_start[:,:,0]) * 180 / np.pi
- deg_inter_to_start[deg_inter_to_start < 0.0] += 360
- inter_to_end = new_centers[None,:,:] - inter_pts
- deg_inter_to_end = np.arctan2(inter_to_end[:,:,1], inter_to_end[:,:,0]) * 180 / np.pi
- deg_inter_to_end[deg_inter_to_end < 0.0] += 360
-
- '''
- 0 -- 1
- | |
- 3 -- 2
- '''
- # rename variables
- deg1_map, deg2_map = deg_inter_to_start, deg_inter_to_end
- # sort deg ascending
- deg_sort = np.sort(np.concatenate([deg1_map[:,:,None], deg2_map[:,:,None]], axis=-1), axis=-1)
-
- deg_diff_map = np.abs(deg1_map - deg2_map)
- # we only consider the smallest degree of intersect
- deg_diff_map[deg_diff_map > 180] = 360 - deg_diff_map[deg_diff_map > 180]
-
- # define available degree range
- deg_range = [60, 120]
-
- corner_dict = {corner_info: [] for corner_info in range(4)}
- inter_points = []
- for i in range(inter_pts.shape[0]):
- for j in range(i + 1, inter_pts.shape[1]):
- # i, j > line index, always i < j
- x, y = inter_pts[i, j, :]
- deg1, deg2 = deg_sort[i, j, :]
- deg_diff = deg_diff_map[i, j]
-
- check_degree = deg_diff > deg_range[0] and deg_diff < deg_range[1]
-
- outside_ratio = params['outside_ratio'] # over ratio >>> drop it!
- inside_ratio = params['inside_ratio'] # over ratio >>> drop it!
- check_distance = ((dist_inter_to_segment1[i,j,1] >= dist_segments[i] and \
- dist_inter_to_segment1[i,j,0] <= dist_segments[i] * outside_ratio) or \
- (dist_inter_to_segment1[i,j,1] <= dist_segments[i] and \
- dist_inter_to_segment1[i,j,0] <= dist_segments[i] * inside_ratio)) and \
- ((dist_inter_to_segment2[i,j,1] >= dist_segments[j] and \
- dist_inter_to_segment2[i,j,0] <= dist_segments[j] * outside_ratio) or \
- (dist_inter_to_segment2[i,j,1] <= dist_segments[j] and \
- dist_inter_to_segment2[i,j,0] <= dist_segments[j] * inside_ratio))
-
- if check_degree and check_distance:
- corner_info = None
-
- if (deg1 >= 0 and deg1 <= 45 and deg2 >=45 and deg2 <= 120) or \
- (deg2 >= 315 and deg1 >= 45 and deg1 <= 120):
- corner_info, color_info = 0, 'blue'
- elif (deg1 >= 45 and deg1 <= 125 and deg2 >= 125 and deg2 <= 225):
- corner_info, color_info = 1, 'green'
- elif (deg1 >= 125 and deg1 <= 225 and deg2 >= 225 and deg2 <= 315):
- corner_info, color_info = 2, 'black'
- elif (deg1 >= 0 and deg1 <= 45 and deg2 >= 225 and deg2 <= 315) or \
- (deg2 >= 315 and deg1 >= 225 and deg1 <= 315):
- corner_info, color_info = 3, 'cyan'
- else:
- corner_info, color_info = 4, 'red' # we don't use it
- continue
-
- corner_dict[corner_info].append([x, y, i, j])
- inter_points.append([x, y])
-
- square_list = []
- connect_list = []
- segments_list = []
- for corner0 in corner_dict[0]:
- for corner1 in corner_dict[1]:
- connect01 = False
- for corner0_line in corner0[2:]:
- if corner0_line in corner1[2:]:
- connect01 = True
- break
- if connect01:
- for corner2 in corner_dict[2]:
- connect12 = False
- for corner1_line in corner1[2:]:
- if corner1_line in corner2[2:]:
- connect12 = True
- break
- if connect12:
- for corner3 in corner_dict[3]:
- connect23 = False
- for corner2_line in corner2[2:]:
- if corner2_line in corner3[2:]:
- connect23 = True
- break
- if connect23:
- for corner3_line in corner3[2:]:
- if corner3_line in corner0[2:]:
- # SQUARE!!!
- '''
- 0 -- 1
- | |
- 3 -- 2
- square_list:
- order: 0 > 1 > 2 > 3
- | x0, y0, x1, y1, x2, y2, x3, y3 |
- | x0, y0, x1, y1, x2, y2, x3, y3 |
- ...
- connect_list:
- order: 01 > 12 > 23 > 30
- | line_idx01, line_idx12, line_idx23, line_idx30 |
- | line_idx01, line_idx12, line_idx23, line_idx30 |
- ...
- segments_list:
- order: 0 > 1 > 2 > 3
- | line_idx0_i, line_idx0_j, line_idx1_i, line_idx1_j, line_idx2_i, line_idx2_j, line_idx3_i, line_idx3_j |
- | line_idx0_i, line_idx0_j, line_idx1_i, line_idx1_j, line_idx2_i, line_idx2_j, line_idx3_i, line_idx3_j |
- ...
- '''
- square_list.append(corner0[:2] + corner1[:2] + corner2[:2] + corner3[:2])
- connect_list.append([corner0_line, corner1_line, corner2_line, corner3_line])
- segments_list.append(corner0[2:] + corner1[2:] + corner2[2:] + corner3[2:])
-
- def check_outside_inside(segments_info, connect_idx):
- # return 'outside or inside', min distance, cover_param, peri_param
- if connect_idx == segments_info[0]:
- check_dist_mat = dist_inter_to_segment1
- else:
- check_dist_mat = dist_inter_to_segment2
-
- i, j = segments_info
- min_dist, max_dist = check_dist_mat[i, j, :]
- connect_dist = dist_segments[connect_idx]
- if max_dist > connect_dist:
- return 'outside', min_dist, 0, 1
- else:
- return 'inside', min_dist, -1, -1
-
-
- top_square = None
-
- try:
- map_size = input_shape[0] / 2
- squares = np.array(square_list).reshape([-1,4,2])
- score_array = []
- connect_array = np.array(connect_list)
- segments_array = np.array(segments_list).reshape([-1,4,2])
-
- # get degree of corners:
- squares_rollup = np.roll(squares, 1, axis=1)
- squares_rolldown = np.roll(squares, -1, axis=1)
- vec1 = squares_rollup - squares
- normalized_vec1 = vec1 / (np.linalg.norm(vec1, axis=-1, keepdims=True) + 1e-10)
- vec2 = squares_rolldown - squares
- normalized_vec2 = vec2 / (np.linalg.norm(vec2, axis=-1, keepdims=True) + 1e-10)
- inner_products = np.sum(normalized_vec1 * normalized_vec2, axis=-1) # [n_squares, 4]
- squares_degree = np.arccos(inner_products) * 180 / np.pi # [n_squares, 4]
-
- # get square score
- overlap_scores = []
- degree_scores = []
- length_scores = []
-
- for connects, segments, square, degree in zip(connect_array, segments_array, squares, squares_degree):
- '''
- 0 -- 1
- | |
- 3 -- 2
-
- # segments: [4, 2]
- # connects: [4]
- '''
-
- ###################################### OVERLAP SCORES
- cover = 0
- perimeter = 0
- # check 0 > 1 > 2 > 3
- square_length = []
-
- for start_idx in range(4):
- end_idx = (start_idx + 1) % 4
-
- connect_idx = connects[start_idx] # segment idx of segment01
- start_segments = segments[start_idx]
- end_segments = segments[end_idx]
-
- start_point = square[start_idx]
- end_point = square[end_idx]
-
- # check whether outside or inside
- start_position, start_min, start_cover_param, start_peri_param = check_outside_inside(start_segments, connect_idx)
- end_position, end_min, end_cover_param, end_peri_param = check_outside_inside(end_segments, connect_idx)
-
- cover += dist_segments[connect_idx] + start_cover_param * start_min + end_cover_param * end_min
- perimeter += dist_segments[connect_idx] + start_peri_param * start_min + end_peri_param * end_min
-
- square_length.append(dist_segments[connect_idx] + start_peri_param * start_min + end_peri_param * end_min)
-
- overlap_scores.append(cover / perimeter)
- ######################################
- ###################################### DEGREE SCORES
- '''
- deg0 vs deg2
- deg1 vs deg3
- '''
- deg0, deg1, deg2, deg3 = degree
- deg_ratio1 = deg0 / deg2
- if deg_ratio1 > 1.0:
- deg_ratio1 = 1 / deg_ratio1
- deg_ratio2 = deg1 / deg3
- if deg_ratio2 > 1.0:
- deg_ratio2 = 1 / deg_ratio2
- degree_scores.append((deg_ratio1 + deg_ratio2) / 2)
- ######################################
- ###################################### LENGTH SCORES
- '''
- len0 vs len2
- len1 vs len3
- '''
- len0, len1, len2, len3 = square_length
- len_ratio1 = len0 / len2 if len2 > len0 else len2 / len0
- len_ratio2 = len1 / len3 if len3 > len1 else len3 / len1
- length_scores.append((len_ratio1 + len_ratio2) / 2)
-
- ######################################
-
- overlap_scores = np.array(overlap_scores)
- overlap_scores /= np.max(overlap_scores)
-
- degree_scores = np.array(degree_scores)
- #degree_scores /= np.max(degree_scores)
-
- length_scores = np.array(length_scores)
-
- ###################################### AREA SCORES
- area_scores = np.reshape(squares, [-1, 4, 2])
- area_x = area_scores[:,:,0]
- area_y = area_scores[:,:,1]
- correction = area_x[:,-1] * area_y[:,0] - area_y[:,-1] * area_x[:,0]
- area_scores = np.sum(area_x[:,:-1] * area_y[:,1:], axis=-1) - np.sum(area_y[:,:-1] * area_x[:,1:], axis=-1)
- area_scores = 0.5 * np.abs(area_scores + correction)
- area_scores /= (map_size * map_size) #np.max(area_scores)
- ######################################
-
- ###################################### CENTER SCORES
- centers = np.array([[256 // 2, 256 // 2]], dtype='float32') # [1, 2]
- # squares: [n, 4, 2]
- square_centers = np.mean(squares, axis=1) # [n, 2]
-        center2center = np.sqrt(np.sum((centers - square_centers) ** 2, axis=-1))  # per-square distance, shape [n]
- center_scores = center2center / (map_size / np.sqrt(2.0))
-
-
- '''
- score_w = [overlap, degree, area, center, length]
- '''
- score_w = [0.0, 1.0, 10.0, 0.5, 1.0]
- score_array = params['w_overlap'] * overlap_scores \
- + params['w_degree'] * degree_scores \
- + params['w_area'] * area_scores \
- - params['w_center'] * center_scores \
- + params['w_length'] * length_scores
-
- best_square = []
-
- sorted_idx = np.argsort(score_array)[::-1]
- score_array = score_array[sorted_idx]
- squares = squares[sorted_idx]
-
-    except Exception:
-        # no valid squares detected; fall through and return empty results
-        pass
-
- try:
- new_segments[:,0] = new_segments[:,0] * 2 / input_shape[1] * original_shape[1]
- new_segments[:,1] = new_segments[:,1] * 2 / input_shape[0] * original_shape[0]
- new_segments[:,2] = new_segments[:,2] * 2 / input_shape[1] * original_shape[1]
- new_segments[:,3] = new_segments[:,3] * 2 / input_shape[0] * original_shape[0]
- except:
- new_segments = []
-
- try:
- squares[:,:,0] = squares[:,:,0] * 2 / input_shape[1] * original_shape[1]
- squares[:,:,1] = squares[:,:,1] * 2 / input_shape[0] * original_shape[0]
- except:
- squares = []
- score_array = []
-
- try:
- inter_points = np.array(inter_points)
- inter_points[:,0] = inter_points[:,0] * 2 / input_shape[1] * original_shape[1]
- inter_points[:,1] = inter_points[:,1] * 2 / input_shape[0] * original_shape[0]
- except:
- inter_points = []
-
-
- return new_segments, squares, score_array, inter_points
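The `AREA SCORES` block above is the shoelace formula applied to every candidate quadrilateral in vectorized NumPy. The same computation for a single polygon, written out in plain Python (a sketch for clarity, not part of the original code):

```python
def shoelace_area(points):
    """Area of a simple polygon given as [(x, y), ...] in vertex order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        s += x1 * y2 - y1 * x2       # signed cross-product term per edge
    return 0.5 * abs(s)

# A 2x3 axis-aligned rectangle has area 6.
print(shoelace_area([(0, 0), (2, 0), (2, 3), (0, 3)]))  # 6.0
```

The vectorized version in `pred_squares` computes the same edge-wise cross products with array slicing, with the `correction` term supplying the wrap-around edge.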
diff --git a/spaces/netiMophi/DreamlikeArt-Diffusion-1.0/FREELETICS STRENGTH TRAINING GUIDE PDFl [BETTER].md b/spaces/netiMophi/DreamlikeArt-Diffusion-1.0/FREELETICS STRENGTH TRAINING GUIDE PDFl [BETTER].md
deleted file mode 100644
index cb8603e6a272c33e38254157eb46a6f9416c3104..0000000000000000000000000000000000000000
--- a/spaces/netiMophi/DreamlikeArt-Diffusion-1.0/FREELETICS STRENGTH TRAINING GUIDE PDFl [BETTER].md
+++ /dev/null
@@ -1,37 +0,0 @@
-
-How to Get Stronger with Freeletics Strength Training Guide PDF
-If you are looking for a challenging and effective way to build muscle and power, you might want to check out the Freeletics Strength Training Guide PDF. This is a digital document that contains a 15-week training plan designed by Freeletics, a leading fitness app that offers personalized workouts and nutrition guidance.
-FREELETICS STRENGTH TRAINING GUIDE PDFl
-The Freeletics Strength Training Guide PDF is based on the principle of high-intensity interval training (HIIT), which involves short bursts of maximal effort followed by brief periods of rest. This type of training stimulates your muscles to grow and adapt, while also boosting your metabolism and cardiovascular health.
-The guide consists of three phases, each lasting five weeks. In each phase, you will perform different workouts that target different muscle groups and skills. The workouts are composed of exercises that use your own bodyweight or minimal equipment, such as a pull-up bar or a jump rope. You can do them anywhere, anytime, and at your own pace.
-The guide also provides you with tips on how to warm up, cool down, stretch, and recover properly. It also advises you on how to eat well and supplement your training with enough protein and other nutrients. By following the guide, you will not only get stronger physically, but also mentally and emotionally.
-
-To access the Freeletics Strength Training Guide PDF, you need to download the Freeletics app and sign up for a subscription. You can choose between the Training & Nutrition Bundle, which includes both the training and nutrition coaching, or the Custom Training Journeys, which only includes the training coaching. You can also try out a free trial before committing to a plan.
-Once you have the app, you can find the Strength Training Guide PDF under the Training Journeys section. You can also view it online or print it out for your convenience. The guide is suitable for both beginners and advanced athletes, as you can adjust the difficulty level according to your feedback.
-If you are ready to take your strength training to the next level, download the Freeletics app today and start your journey with the Freeletics Strength Training Guide PDF. You will be amazed by what you can achieve with your own body and mind.
-
-What are the benefits of strength training?
-Strength training is not only good for building muscle mass and definition, but also for improving your overall health and well-being. Some of the benefits of strength training include:
-
-- Increasing your bone density and reducing the risk of osteoporosis
-- Enhancing your posture and balance and preventing injuries
-- Boosting your metabolism and burning more calories even at rest
-- Improving your blood sugar levels and lowering your blood pressure
-- Strengthening your immune system and fighting off infections
-- Reducing your stress levels and improving your mood and confidence
-- Preventing or delaying the onset of age-related diseases such as sarcopenia and dementia
-
-How to get started with strength training?
-If you are new to strength training, you might feel intimidated or overwhelmed by the variety of exercises and equipment available. However, you don't need to worry, as you can start with simple and effective exercises that use your own bodyweight or minimal equipment. Here are some tips on how to get started with strength training:
-
-- Consult your doctor before starting any new exercise program, especially if you have any medical conditions or injuries.
-- Set realistic and specific goals for yourself, such as how many times a week you want to train, how long each session will last, and what results you want to achieve.
-- Choose a training plan that suits your level, preferences, and schedule. You can use the Freeletics Strength Training Guide PDF as a reference, or create your own plan based on your goals.
-- Learn the proper form and technique for each exercise, and start with low intensity and volume. You can use the Freeletics app to watch videos and instructions on how to perform each exercise correctly.
-- Warm up before each session with some dynamic stretches and light cardio, such as jogging or skipping. This will prepare your muscles and joints for the workout and prevent injuries.
-- Cool down after each session with some static stretches and deep breathing. This will help your muscles recover and reduce soreness.
-- Rest at least one day between each session, and get enough sleep and hydration. This will allow your muscles to repair and grow stronger.
-- Track your progress and celebrate your achievements. You can use the Freeletics app to record your workouts, measure your performance, and share your results with the community.
-
-
-
\ No newline at end of file
diff --git a/spaces/nickil/weakly-supervised-parsing/weakly_supervised_parser/settings.py b/spaces/nickil/weakly-supervised-parsing/weakly_supervised_parser/settings.py
deleted file mode 100644
index e9698fce0c18d2b85726511bf1a8bff6b449cade..0000000000000000000000000000000000000000
--- a/spaces/nickil/weakly-supervised-parsing/weakly_supervised_parser/settings.py
+++ /dev/null
@@ -1,33 +0,0 @@
-PROJECT_DIR = "weakly_supervised_parser/"
-PTB_TREES_ROOT_DIR = "data/PROCESSED/english/trees/"
-PTB_SENTENCES_ROOT_DIR = "data/PROCESSED/english/sentences/"
-
-PTB_TRAIN_SENTENCES_WITH_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-train-sentences-with-punctuation.txt"
-PTB_VALID_SENTENCES_WITH_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-valid-sentences-with-punctuation.txt"
-PTB_TEST_SENTENCES_WITH_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-test-sentences-with-punctuation.txt"
-
-PTB_TRAIN_SENTENCES_WITHOUT_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-train-sentences-without-punctuation.txt"
-PTB_VALID_SENTENCES_WITHOUT_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-valid-sentences-without-punctuation.txt"
-PTB_TEST_SENTENCES_WITHOUT_PUNCTUATION_PATH = PTB_SENTENCES_ROOT_DIR + "ptb-test-sentences-without-punctuation.txt"
-
-PTB_TRAIN_GOLD_WITH_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-train-gold-with-punctuation.txt"
-PTB_VALID_GOLD_WITH_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-valid-gold-with-punctuation.txt"
-PTB_TEST_GOLD_WITH_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-test-gold-with-punctuation.txt"
-
-PTB_TRAIN_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-train-gold-without-punctuation.txt"
-PTB_VALID_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-valid-gold-without-punctuation.txt"
-PTB_TEST_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "ptb-test-gold-without-punctuation.txt"
-
-PTB_TRAIN_GOLD_WITHOUT_PUNCTUATION_ALIGNED_PATH = PTB_TREES_ROOT_DIR + "ptb-train-gold-without-punctuation-aligned.txt"
-PTB_VALID_GOLD_WITHOUT_PUNCTUATION_ALIGNED_PATH = PTB_TREES_ROOT_DIR + "ptb-valid-gold-without-punctuation-aligned.txt"
-PTB_TEST_GOLD_WITHOUT_PUNCTUATION_ALIGNED_PATH = PTB_TREES_ROOT_DIR + "ptb-test-gold-without-punctuation-aligned.txt"
-
-YOON_KIM_TRAIN_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "Yoon_Kim/ptb-train-gold-filtered.txt"
-YOON_KIM_VALID_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "Yoon_Kim/ptb-valid-gold-filtered.txt"
-YOON_KIM_TEST_GOLD_WITHOUT_PUNCTUATION_PATH = PTB_TREES_ROOT_DIR + "Yoon_Kim/ptb-test-gold-filtered.txt"
-
-# Predictions
-PTB_SAVE_TREES_PATH = "TEMP/predictions/english/"
-
-# Training
-TRAINED_MODEL_PATH = PROJECT_DIR + "model/TRAINED_MODEL/"
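These path constants are assembled by plain string concatenation, where a trailing slash in one piece and a leading slash in the next silently produce `//`. `os.path.join` inserts separators itself and avoids that; a small sketch (the directory names mirror the constants above):

```python
import os

PROJECT_DIR = "weakly_supervised_parser"
# os.path.join adds the separators, so the pieces need no
# trailing or leading slashes of their own.
trained_model_path = os.path.join(PROJECT_DIR, "model", "TRAINED_MODEL")
print(trained_model_path)  # weakly_supervised_parser/model/TRAINED_MODEL (on POSIX)
```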
diff --git a/spaces/nielsr/TrOCR-handwritten/README.md b/spaces/nielsr/TrOCR-handwritten/README.md
deleted file mode 100644
index 34a9a13ea8b91f0aed4e7f2ded6bcd82142d18f8..0000000000000000000000000000000000000000
--- a/spaces/nielsr/TrOCR-handwritten/README.md
+++ /dev/null
@@ -1,37 +0,0 @@
----
-title: TrOCR Handwritten
-emoji: 🏢
-colorFrom: gray
-colorTo: gray
-sdk: gradio
-app_file: app.py
-pinned: false
----
-
-# Configuration
-
-`title`: _string_
-Display title for the Space
-
-`emoji`: _string_
-Space emoji (emoji-only character allowed)
-
-`colorFrom`: _string_
-Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
-
-`colorTo`: _string_
-Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
-
-`sdk`: _string_
-Can be either `gradio` or `streamlit`
-
-`sdk_version` : _string_
-Only applicable for `streamlit` SDK.
-See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions.
-
-`app_file`: _string_
-Path to your main application file (which contains either `gradio` or `streamlit` Python code).
-Path is relative to the root of the repository.
-
-`pinned`: _boolean_
-Whether the Space stays on top of your list.
diff --git a/spaces/nikitaPDL2023/assignment4/detectron2/configs/common/optim.py b/spaces/nikitaPDL2023/assignment4/detectron2/configs/common/optim.py
deleted file mode 100644
index 6cf43e835f55739fbb80102b870efab950a0486d..0000000000000000000000000000000000000000
--- a/spaces/nikitaPDL2023/assignment4/detectron2/configs/common/optim.py
+++ /dev/null
@@ -1,28 +0,0 @@
-import torch
-
-from detectron2.config import LazyCall as L
-from detectron2.solver.build import get_default_optimizer_params
-
-SGD = L(torch.optim.SGD)(
- params=L(get_default_optimizer_params)(
- # params.model is meant to be set to the model object, before instantiating
- # the optimizer.
- weight_decay_norm=0.0
- ),
- lr=0.02,
- momentum=0.9,
- weight_decay=1e-4,
-)
-
-
-AdamW = L(torch.optim.AdamW)(
- params=L(get_default_optimizer_params)(
- # params.model is meant to be set to the model object, before instantiating
- # the optimizer.
- base_lr="${..lr}",
- weight_decay_norm=0.0,
- ),
- lr=1e-4,
- betas=(0.9, 0.999),
- weight_decay=0.1,
-)
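For readers unfamiliar with the pattern above: `L(...)` (detectron2's `LazyCall`) records a callable and its arguments without invoking it, and `base_lr="${..lr}"` is an omegaconf-style interpolation that resolves against the sibling `lr` field when the config is instantiated. Below is a minimal stdlib-only mimic of the deferred-call idea — the `LazyCallSketch` and `instantiate` names are hypothetical, this is not the real detectron2 API, and the interpolation machinery is omitted:

```python
# Hypothetical mimic of the deferred-call pattern (not detectron2 itself):
# LazyCallSketch(fn)(**kwargs) records the callable and its arguments;
# instantiate() applies them later, so fields like lr can still be
# overridden before the object is actually built.
class LazyCallSketch:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, **kwargs):
        # Return a plain-dict config node instead of calling fn now.
        return {"_target_": self.fn, **kwargs}

def instantiate(cfg):
    cfg = dict(cfg)                 # copy so the config node is reusable
    target = cfg.pop("_target_")
    return target(**cfg)            # the deferred call happens here

# Usage: build a config node, tweak it, then instantiate.
sgd_cfg = LazyCallSketch(dict)(lr=0.02, momentum=0.9)
sgd_cfg["lr"] = 0.1                 # override before instantiation
opt = instantiate(sgd_cfg)
```

The design point this illustrates is that the config stays a pure data structure until `instantiate` runs, which is what makes late overrides like the ones above possible.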
diff --git a/spaces/ntt123/vietnam-male-voice-wavegru-tts/sparse_matmul/compute/kernels_arm.h b/spaces/ntt123/vietnam-male-voice-wavegru-tts/sparse_matmul/compute/kernels_arm.h
deleted file mode 100644
index 494430fef873ebd86064263b7ab4d401906910e8..0000000000000000000000000000000000000000
--- a/spaces/ntt123/vietnam-male-voice-wavegru-tts/sparse_matmul/compute/kernels_arm.h
+++ /dev/null
@@ -1,2886 +0,0 @@
-/*
- * Copyright 2021 Google LLC
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-#ifndef LYRA_CODEC_SPARSE_MATMUL_COMPUTE_KERNELS_ARM_H_
-#define LYRA_CODEC_SPARSE_MATMUL_COMPUTE_KERNELS_ARM_H_
-
-#if defined __aarch64__
-
-#include <arm_neon.h>
-
-#include <type_traits>
-
-#include "sparse_matmul/numerics/fixed_types.h"
-#include "sparse_matmul/numerics/float16_types.h"
-#include "sparse_matmul/numerics/type_utils.h"
-
-#define LABEL_COL_LOOP "1"
-#define LABEL_ROW_LOOP "2"
-#define LABEL_SKIP_COL_LOOP "3"
-#define LABEL_TOP_LOOP "4"
-
-namespace csrblocksparse {
-namespace detail {
-
-template <typename T>
-struct IsFloatOrBfloat
-    : std::integral_constant<bool, std::is_same<T, float>::value ||
-                                       std::is_same<T, bfloat16>::value> {};
-
-template <typename WeightType, typename RhsType, typename OutType>
-struct IsAllowableFloatTypes
-    : std::integral_constant<bool, IsFloatOrBfloat<WeightType>::value &&
-                                       std::is_same<RhsType, float>::value &&
-                                       std::is_same<OutType, float>::value> {};
-
-// 16-bit inputs, 32-bit output - exponent matches the sum of the input
-// exponents, OR
-// 16-bit inputs, 16-bit output - will shift to match the exponent.
-template <typename WeightType, typename RhsType, typename OutType>
-struct IsAllowableFixedTypes
-    : std::integral_constant<bool, (IsFixed16Type<WeightType>::value &&
-                                    IsFixed16Type<RhsType>::value) &&
-                                       (IsFixed32Type<OutType>::value ||
-                                        IsFixed16Type<OutType>::value)> {};
-
-template <typename WeightType, typename RhsType, typename OutType>
-struct ShouldEnableGenericKernel
-    : std::integral_constant<
-          bool,
-          !IsAllowableFloatTypes<WeightType, RhsType, OutType>::value &&
-              !IsAllowableFixedTypes<WeightType, RhsType, OutType>::value> {};
-
-template <typename WeightType, typename RhsType, typename OutType>
-struct ShouldEnableGenericSpMV_4x4
-    : ShouldEnableGenericKernel<WeightType, RhsType, OutType> {};
-template <typename WeightType, typename RhsType, typename OutType>
-struct ShouldEnableGenericSpMM5_4x4
-    : ShouldEnableGenericKernel<WeightType, RhsType, OutType> {};
-template <typename WeightType, typename RhsType, typename OutType>
-struct ShouldEnableGenericSpMV_1x1 : std::true_type {};
-template <typename WeightType, typename RhsType, typename OutType>
-struct ShouldEnableGenericSpMM5_1x1 : std::true_type {};
-template <typename Type>
-struct IsAddableFixedTypes
-    : std::integral_constant<bool, IsFixed32Type<Type>::value ||
-                                       IsFixed16Type<Type>::value> {};
-template <typename Type>
-struct ShouldEnableGenericAdd
-    : std::integral_constant<bool, !IsAddableFixedTypes<Type>::value> {};
-
-// The computational routines do NO error checking for speed. It is assumed
-// that this has been handled by CSRBlockSparseMatrix.
-
-// Performs the calculation y = A * x + b where A is a sparse matrix with a 4x4
-// blocked pattern, x is a vector and b is a vector. Weights are stored for
-// this routine by making each 4x4 block contiguous. Blocks are ordered in
-// standard row-major format. Column indices are converted to deltas and then
-// multiplied by 2 to convert to bytes, so that the value can be used directly
-// to offset the pointer into the rhs vector.
-//
-// NOTE: The bias is expected to have been multiplied by .25f prior to calling
-// this function. This is automatically taken care of in SparseLinearLayer.
-// The bias is reconstructed through horizontal additions, which leads to a
-// small speedup by reducing latencies at the end of the loop.
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<std::is_same<WeightType, bfloat16>::value &&
-                        std::is_same<RhsType, float>::value &&
-                        std::is_same<OutType, float>::value>::type
-SpMV_4x4(const bfloat16* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const float* rhs_ptr,
- const float* bias_ptr, float* out_ptr, int64_t assigned_rows,
- int64_t rows /* only used in SpMM variants */,
- int64_t cols /* only used in SpMM variants */, int relu) {
-  /* This intrinsic version exists for reference; note that in the
- intrinsic version col_deltas_bytes should NOT actually be in bytes,
- but rather elements. Intrinsics are 25-35% slower than the
- assembly version.
-
- for (int r = 0; r < rows; r += 4) {
- int reduced_col_count = nnz_per_row[r / 4];
-    float32x4_t accum0 = vdupq_n_f32(bias_ptr[r]);
-    float32x4_t accum1 = vdupq_n_f32(bias_ptr[r + 1]);
-    float32x4_t accum2 = vdupq_n_f32(bias_ptr[r + 2]);
-    float32x4_t accum3 = vdupq_n_f32(bias_ptr[r + 3]);
- for (int c = 0; c < reduced_col_count; ++c) {
- int32_t offset = *col_deltas_bytes; col_deltas_bytes++;
- rhs_ptr += offset;
- float32x4_t rhs = vld1q_f32(rhs_ptr);
-
- uint16x4_t lhs0_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs1_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs2_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs3_int = vld1_u16(weights_ptr); weights_ptr += 4;
-
- float32x4_t lhs0 = vreinterpretq_f32_u32(vshll_n_u16(lhs0_int, 16));
- float32x4_t lhs1 = vreinterpretq_f32_u32(vshll_n_u16(lhs1_int, 16));
- float32x4_t lhs2 = vreinterpretq_f32_u32(vshll_n_u16(lhs2_int, 16));
- float32x4_t lhs3 = vreinterpretq_f32_u32(vshll_n_u16(lhs3_int, 16));
-
- accum0 = vmlaq_f32(accum0, lhs0, rhs);
- accum1 = vmlaq_f32(accum1, lhs1, rhs);
- accum2 = vmlaq_f32(accum2, lhs2, rhs);
- accum3 = vmlaq_f32(accum3, lhs3, rhs);
- }
-
- float32x4_t reduce0 = vpaddq_f32(accum0, accum1);
- float32x4_t reduce1 = vpaddq_f32(accum2, accum3);
- float32x4_t reduce2 = vpaddq_f32(reduce0, reduce1);
- vst1q_f32(out_ptr + r, reduce2);
- } */
-
-  // If the relu is handled in the routine with a comparison and vbit (insert
-  // if true), or by branching, then it is slightly but noticeably (~5%)
-  // slower; the outer branch avoids that penalty.
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
-        "dup v28.4s, v27.s[0]\n"  // accum_0 = bias[0]
-        "dup v29.4s, v27.s[1]\n"  // accum_1 = bias[1]
-        "dup v30.4s, v27.s[2]\n"  // accum_2 = bias[2]
-        "dup v31.4s, v27.s[3]\n"  // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load one Rhs vector of size 1x4.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Convert bfloat16 -> float32.
- "shll v4.4s, v2.4h, #16\n"
- "shll2 v5.4s, v2.8h, #16\n"
- "shll v6.4s, v3.4h, #16\n"
- "shll2 v7.4s, v3.8h, #16\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n"
- "fmla v29.4s, v5.4s, v0.4s\n"
- "fmla v30.4s, v6.4s, v0.4s\n"
- "fmla v31.4s, v7.4s, v0.4s\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result
- "faddp v28.4s, v28.4s, v29.4s\n"
- "faddp v30.4s, v30.4s, v31.4s\n"
- "faddp v28.4s, v28.4s, v30.4s\n"
-
- // Do relu if requested.
- "fmax v28.4s, v28.4s, v25.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
-        "dup v28.4s, v27.s[0]\n"  // accum_0 = bias[0]
-        "dup v29.4s, v27.s[1]\n"  // accum_1 = bias[1]
-        "dup v30.4s, v27.s[2]\n"  // accum_2 = bias[2]
-        "dup v31.4s, v27.s[3]\n"  // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load one Rhs vector of size 1x4.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Convert bfloat16 -> float32.
- "shll v4.4s, v2.4h, #16\n"
- "shll2 v5.4s, v2.8h, #16\n"
- "shll v6.4s, v3.4h, #16\n"
- "shll2 v7.4s, v3.8h, #16\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n"
- "fmla v29.4s, v5.4s, v0.4s\n"
- "fmla v30.4s, v6.4s, v0.4s\n"
- "fmla v31.4s, v7.4s, v0.4s\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "faddp v28.4s, v28.4s, v29.4s\n"
- "faddp v30.4s, v30.4s, v31.4s\n"
- "faddp v28.4s, v28.4s, v30.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
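The storage scheme described in the comments above (contiguous 4x4 weight blocks, a per-block-row count of nonzero blocks, and column deltas into the rhs) can be sketched as a scalar reference in Python. This is an illustrative stand-in, not the optimized kernel: deltas are kept in elements rather than pre-scaled bytes, and the bias is added once directly, so the .25f pre-scaling mentioned in the NOTE does not apply here.

```python
# Scalar reference for the 4x4 block-sparse y = A*x + b scheme (a sketch,
# not the real routine). weights holds 16 values per 4x4 block, block by
# block; nnz_per_row[br] is the number of blocks in block-row br;
# col_deltas gives each block's rhs offset as a delta, in elements.
def spmv_4x4(weights, col_deltas, nnz_per_row, rhs, bias, rows):
    out = [0.0] * rows
    w = 0    # index into the flat weights array
    d = 0    # index into the column-delta stream
    col = 0  # current position in rhs
    for br in range(rows // 4):              # one 4-row block-row at a time
        accum = [bias[4 * br + i] for i in range(4)]
        for _ in range(nnz_per_row[br]):
            col += col_deltas[d]; d += 1     # deltas advance the rhs pointer
            for i in range(4):               # 4x4 block times a 4-vector
                for j in range(4):
                    accum[i] += weights[w + 4 * i + j] * rhs[col + j]
            w += 16
        for i in range(4):
            out[4 * br + i] = accum[i]
    return out
```

With an identity 4x4 block at delta 0, the output is simply `rhs + bias` for those four rows, which is a convenient sanity check on the layout.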
-
-// Performs the calculation y = A * x + b where A is a sparse matrix with a 4x4
-// blocked pattern, x is a fat vector with 5 columns and b is a vector that is
-// broadcast. Weights are stored for this routine by making each 4x4 block
-// contiguous. Blocks are ordered in standard row-major format. Column indices
-// are converted to deltas and then multiplied by 2 to convert to bytes, so
-// that the value can be used directly to offset the pointer into the rhs
-// vector.
-//
-// NOTE: The bias is expected to have been multiplied by .25f prior to calling
-// this function. This is automatically taken care of in SparseLinearLayer.
-// The bias is reconstructed through horizontal additions, which leads to a
-// small speedup by reducing latencies at the end of the loop.
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<std::is_same<WeightType, bfloat16>::value &&
-                        std::is_same<RhsType, float>::value &&
-                        std::is_same<OutType, float>::value>::type
-SpMM5_4x4(const bfloat16* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const float* rhs_ptr,
- const float* bias_ptr, float* out_ptr, int64_t assigned_rows,
- int64_t rows, int64_t cols, int relu) {
-  /* This intrinsic version exists for reference; note that in the
- intrinsic version col_deltas_bytes should NOT actually be in bytes,
- but rather elements. Intrinsics are 25-35% slower than the
- assembly version.
-
- for (int r = 0; r < rows; r += 4) {
- int reduced_col_count = nnz_per_row[r / 4];
-    float32x4_t accum0 = vdupq_n_f32(bias_ptr[r]);
-    float32x4_t accum1 = vdupq_n_f32(bias_ptr[r + 1]);
-    float32x4_t accum2 = vdupq_n_f32(bias_ptr[r + 2]);
-    float32x4_t accum3 = vdupq_n_f32(bias_ptr[r + 3]);
-    float32x4_t accum4 = vdupq_n_f32(bias_ptr[r]);
-    float32x4_t accum5 = vdupq_n_f32(bias_ptr[r + 1]);
-    float32x4_t accum6 = vdupq_n_f32(bias_ptr[r + 2]);
-    float32x4_t accum7 = vdupq_n_f32(bias_ptr[r + 3]);
- ...
- for (int c = 0; c < reduced_col_count; ++c) {
- int32_t offset = *col_deltas_bytes; col_deltas_bytes++;
- rhs_ptr += offset;
- float32x4_t rhs = vld1q_f32(rhs_ptr);
- float32x4_t rhs2 = vld1q_f32(rhs2_ptr);
- float32x4_t rhs3 = vld1q_f32(rhs3_ptr);
- float32x4_t rhs4 = vld1q_f32(rhs4_ptr);
- float32x4_t rhs5 = vld1q_f32(rhs5_ptr);
-
- uint16x4_t lhs0_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs1_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs2_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs3_int = vld1_u16(weights_ptr); weights_ptr += 4;
-
- float32x4_t lhs0 = vreinterpretq_f32_u32(vshll_n_u16(lhs0_int, 16));
- float32x4_t lhs1 = vreinterpretq_f32_u32(vshll_n_u16(lhs1_int, 16));
- float32x4_t lhs2 = vreinterpretq_f32_u32(vshll_n_u16(lhs2_int, 16));
- float32x4_t lhs3 = vreinterpretq_f32_u32(vshll_n_u16(lhs3_int, 16));
-
- accum0 = vmlaq_f32(accum0, lhs0, rhs);
- accum1 = vmlaq_f32(accum1, lhs1, rhs);
- accum2 = vmlaq_f32(accum2, lhs2, rhs);
- accum3 = vmlaq_f32(accum3, lhs3, rhs);
-      accum4 = vmlaq_f32(accum4, lhs0, rhs2);
-      accum5 = vmlaq_f32(accum5, lhs1, rhs2);
-      accum6 = vmlaq_f32(accum6, lhs2, rhs2);
-      accum7 = vmlaq_f32(accum7, lhs3, rhs2);
- ...
- }
-
- float32x4_t reduce0 = vpaddq_f32(accum0, accum1);
- float32x4_t reduce1 = vpaddq_f32(accum2, accum3);
- float32x4_t reduce2 = vpaddq_f32(reduce0, reduce1);
- vst1q_f32(out_ptr + r, reduce2);
-
-    float32x4_t reduce3 = vpaddq_f32(accum4, accum5);
-    float32x4_t reduce4 = vpaddq_f32(accum6, accum7);
-    float32x4_t reduce5 = vpaddq_f32(reduce3, reduce4);
-    vst1q_f32(out2_ptr + r, reduce5);
-
- ...
- } */
-
-  // If the relu is handled in the routine with a comparison and vbit (insert
-  // if true), or by branching, then it is slightly but noticeably (~5%)
-  // slower; the outer branch avoids that penalty.
- //
- // Pointers to the columns.
- const float* rhs2_ptr = rhs_ptr + cols;
- float* out2_ptr = out_ptr + rows;
- const float* rhs3_ptr = rhs_ptr + 2 * cols;
- float* out3_ptr = out_ptr + 2 * rows;
- const float* rhs4_ptr = rhs_ptr + 3 * cols;
- float* out4_ptr = out_ptr + 3 * rows;
- const float* rhs5_ptr = rhs_ptr + 4 * cols;
- float* out5_ptr = out_ptr + 4 * rows;
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load 5 Rhs vectors of size 1x4 each.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
- "ld1 {v1.4s}, [%[rhs2_ptr]], x8\n"
- "ld1 {v8.4s}, [%[rhs3_ptr]], x8\n"
- "ld1 {v9.4s}, [%[rhs4_ptr]], x8\n"
- "ld1 {v10.4s}, [%[rhs5_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Convert bfloat16 -> float32.
- "shll v4.4s, v2.4h, #16\n"
- "shll2 v5.4s, v2.8h, #16\n"
- "shll v6.4s, v3.4h, #16\n"
- "shll2 v7.4s, v3.8h, #16\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n" // for 1st column
- "fmla v29.4s, v5.4s, v0.4s\n" // for 1st column
- "fmla v30.4s, v6.4s, v0.4s\n" // for 1st column
- "fmla v31.4s, v7.4s, v0.4s\n" // for 1st column
- "fmla v23.4s, v4.4s, v1.4s\n" // for 2nd column
- "fmla v24.4s, v5.4s, v1.4s\n" // for 2nd column
- "fmla v25.4s, v6.4s, v1.4s\n" // for 2nd column
- "fmla v26.4s, v7.4s, v1.4s\n" // for 2nd column
- "fmla v19.4s, v4.4s, v8.4s\n" // for 3rd column
- "fmla v20.4s, v5.4s, v8.4s\n" // for 3rd column
- "fmla v21.4s, v6.4s, v8.4s\n" // for 3rd column
- "fmla v22.4s, v7.4s, v8.4s\n" // for 3rd column
- "fmla v15.4s, v4.4s, v9.4s\n" // for 4th column
- "fmla v16.4s, v5.4s, v9.4s\n" // for 4th column
- "fmla v17.4s, v6.4s, v9.4s\n" // for 4th column
- "fmla v18.4s, v7.4s, v9.4s\n" // for 4th column
- "fmla v11.4s, v4.4s, v10.4s\n" // for 5th column
- "fmla v12.4s, v5.4s, v10.4s\n" // for 5th column
- "fmla v13.4s, v6.4s, v10.4s\n" // for 5th column
- "fmla v14.4s, v7.4s, v10.4s\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "movi v0.4s, #0\n"
- "faddp v28.4s, v28.4s, v29.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v16.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "faddp v30.4s, v30.4s, v31.4s\n" // 1st column
- "faddp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "faddp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "faddp v17.4s, v17.4s, v18.4s\n" // 4th column
- "faddp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "faddp v28.4s, v28.4s, v30.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v17.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Do relu as requested.
- "fmax v28.4s, v28.4s, v0.4s\n"
- "fmax v23.4s, v23.4s, v0.4s\n"
- "fmax v19.4s, v19.4s, v0.4s\n"
- "fmax v15.4s, v15.4s, v0.4s\n"
- "fmax v11.4s, v11.4s, v0.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr),
- [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr),
- [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr),
- [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load 5 Rhs vectors of size 1x4 each.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
- "ld1 {v1.4s}, [%[rhs2_ptr]], x8\n"
- "ld1 {v8.4s}, [%[rhs3_ptr]], x8\n"
- "ld1 {v9.4s}, [%[rhs4_ptr]], x8\n"
- "ld1 {v10.4s}, [%[rhs5_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Convert bfloat16 -> float32.
- "shll v4.4s, v2.4h, #16\n"
- "shll2 v5.4s, v2.8h, #16\n"
- "shll v6.4s, v3.4h, #16\n"
- "shll2 v7.4s, v3.8h, #16\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n" // for 1st column
- "fmla v29.4s, v5.4s, v0.4s\n" // for 1st column
- "fmla v30.4s, v6.4s, v0.4s\n" // for 1st column
- "fmla v31.4s, v7.4s, v0.4s\n" // for 1st column
- "fmla v23.4s, v4.4s, v1.4s\n" // for 2nd column
- "fmla v24.4s, v5.4s, v1.4s\n" // for 2nd column
- "fmla v25.4s, v6.4s, v1.4s\n" // for 2nd column
- "fmla v26.4s, v7.4s, v1.4s\n" // for 2nd column
- "fmla v19.4s, v4.4s, v8.4s\n" // for 3rd column
- "fmla v20.4s, v5.4s, v8.4s\n" // for 3rd column
- "fmla v21.4s, v6.4s, v8.4s\n" // for 3rd column
- "fmla v22.4s, v7.4s, v8.4s\n" // for 3rd column
- "fmla v15.4s, v4.4s, v9.4s\n" // for 4th column
- "fmla v16.4s, v5.4s, v9.4s\n" // for 4th column
- "fmla v17.4s, v6.4s, v9.4s\n" // for 4th column
- "fmla v18.4s, v7.4s, v9.4s\n" // for 4th column
- "fmla v11.4s, v4.4s, v10.4s\n" // for 5th column
- "fmla v12.4s, v5.4s, v10.4s\n" // for 5th column
- "fmla v13.4s, v6.4s, v10.4s\n" // for 5th column
- "fmla v14.4s, v7.4s, v10.4s\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "faddp v28.4s, v28.4s, v29.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v16.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "faddp v30.4s, v30.4s, v31.4s\n" // 1st column
- "faddp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "faddp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "faddp v17.4s, v17.4s, v18.4s\n" // 4th column
- "faddp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "faddp v28.4s, v28.4s, v30.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v17.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr),
- [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr),
- [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr),
- [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
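The SpMM5 variant above is essentially five single-vector SpMV computations fused together: the five rhs columns are laid out back to back at stride `cols` and the five output columns at stride `rows`, so each 4x4 weight block is loaded once and reused for all five columns. A small layout sketch, where the `spmv` argument stands in for any single-column reference routine (an assumption for illustration, not the real kernel's interface):

```python
# Layout sketch for the 5-column fat-vector case: column c of the rhs
# starts at offset c*cols in the flat buffer, and each column's result
# (rows values) is appended so outputs land at stride rows, matching the
# rhs2_ptr = rhs_ptr + cols / out2_ptr = out_ptr + rows pointers above.
def spmm5(spmv, rhs_flat, cols, rows):
    out = []
    for c in range(5):
        col = rhs_flat[c * cols:(c + 1) * cols]  # column c of the fat vector
        block = spmv(col)
        assert len(block) == rows                # one output column per rhs column
        out.extend(block)
    return out
```

The fused kernel computes the same five results; the fusion only changes how often the weights are fetched from memory, not the arithmetic.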
-
-// float implementations below the line.
-
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<std::is_same<WeightType, float>::value &&
-                        std::is_same<RhsType, float>::value &&
-                        std::is_same<OutType, float>::value>::type
-SpMV_4x4(const float* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const float* rhs_ptr,
- const float* bias_ptr, float* out_ptr, int64_t assigned_rows,
- int64_t rows /* only used in SpMM variants */,
- int64_t cols /* only used in SpMM variants */, int relu) {
-  /* This intrinsic version exists for reference; note that in the
- intrinsic version col_deltas_bytes should NOT actually be in bytes,
- but rather elements. Intrinsics are 25-35% slower than the
- assembly version.
-
- for (int r = 0; r < rows; r += 4) {
- int reduced_col_count = nnz_per_row[r / 4];
-    float32x4_t accum0 = vdupq_n_f32(bias_ptr[r]);
-    float32x4_t accum1 = vdupq_n_f32(bias_ptr[r + 1]);
-    float32x4_t accum2 = vdupq_n_f32(bias_ptr[r + 2]);
-    float32x4_t accum3 = vdupq_n_f32(bias_ptr[r + 3]);
- for (int c = 0; c < reduced_col_count; ++c) {
- int32_t offset = *col_deltas_bytes; col_deltas_bytes++;
- rhs_ptr += offset;
- float32x4_t rhs = vld1q_f32(rhs_ptr);
-
-      float32x4_t lhs0 = vld1q_f32(weights_ptr); weights_ptr += 4;
-      float32x4_t lhs1 = vld1q_f32(weights_ptr); weights_ptr += 4;
-      float32x4_t lhs2 = vld1q_f32(weights_ptr); weights_ptr += 4;
-      float32x4_t lhs3 = vld1q_f32(weights_ptr); weights_ptr += 4;
-
- accum0 = vmlaq_f32(accum0, lhs0, rhs);
- accum1 = vmlaq_f32(accum1, lhs1, rhs);
- accum2 = vmlaq_f32(accum2, lhs2, rhs);
- accum3 = vmlaq_f32(accum3, lhs3, rhs);
- }
-
- float32x4_t reduce0 = vpaddq_f32(accum0, accum1);
- float32x4_t reduce1 = vpaddq_f32(accum2, accum3);
- float32x4_t reduce2 = vpaddq_f32(reduce0, reduce1);
- vst1q_f32(out_ptr + r, reduce2);
- } */
-
-  // If the relu is handled in the routine with a comparison and vbit (insert
-  // if true), or by branching, then it is slightly but noticeably (~5%)
-  // slower; the outer branch avoids that penalty.
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
-        "dup v28.4s, v27.s[0]\n"  // accum_0 = bias[0]
-        "dup v29.4s, v27.s[1]\n"  // accum_1 = bias[1]
-        "dup v30.4s, v27.s[2]\n"  // accum_2 = bias[2]
-        "dup v31.4s, v27.s[3]\n"  // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load one Rhs vector of size 1x4.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v4.4s, v5.4s, v6.4s, v7.4s}, [%[weights_ptr]], #64\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n"
- "fmla v29.4s, v5.4s, v0.4s\n"
- "fmla v30.4s, v6.4s, v0.4s\n"
- "fmla v31.4s, v7.4s, v0.4s\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "faddp v28.4s, v28.4s, v29.4s\n"
- "faddp v30.4s, v30.4s, v31.4s\n"
- "faddp v28.4s, v28.4s, v30.4s\n"
-
- // Do relu as requested.
- "fmax v28.4s, v28.4s, v25.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
-        // Initialize local accumulators with the bias.
-        "dup v28.4s, v27.s[0]\n"  // accum_0 = bias[0]
-        "dup v29.4s, v27.s[1]\n"  // accum_1 = bias[1]
-        "dup v30.4s, v27.s[2]\n"  // accum_2 = bias[2]
-        "dup v31.4s, v27.s[3]\n"  // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
-        // Load one Rhs vector of size 1x4.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v4.4s, v5.4s, v6.4s, v7.4s}, [%[weights_ptr]], #64\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n"
- "fmla v29.4s, v5.4s, v0.4s\n"
- "fmla v30.4s, v6.4s, v0.4s\n"
- "fmla v31.4s, v7.4s, v0.4s\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "faddp v28.4s, v28.4s, v29.4s\n"
- "faddp v30.4s, v30.4s, v31.4s\n"
- "faddp v28.4s, v28.4s, v30.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
-
-// Performs the calculation y = A * x + b where A is a sparse matrix with a
-// 4x4 blocked pattern, x is a fat vector with 5 columns, and b is a vector
-// that is broadcast across all 5 columns. Weights are stored for this
-// routine by making each 4x4 block contiguous. Blocks are ordered in
-// standard row-major format. Column indices are converted to deltas and
-// then multiplied by the size of a rhs element to convert them to bytes, so
-// that the value can be used directly to offset the pointer into the rhs
-// vector.
-//
-// NOTE: The bias is expected to have been multiplied by 0.25f prior to
-// calling this function. This is automatically taken care of in
-// sparse_linear_layer. The bias is reconstructed through horizontal
-// additions, which leads to a small speedup by reducing latencies at the
-// end of the loop.
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<std::is_same<WeightType, float>::value &&
- std::is_same<RhsType, float>::value &&
- std::is_same<OutType, float>::value>::type
-SpMM5_4x4(const float* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const float* rhs_ptr,
- const float* bias_ptr, float* out_ptr, int64_t assigned_rows,
- int64_t rows, int64_t cols, int relu) {
- /* This intrinsic version exists for reference; note that in the
- intrinsic version col_deltas_bytes should NOT actually be in bytes,
- but rather elements. Intrinsics are 25-35% slower than the
- assembly version.
-
- for (int r = 0; r < rows; r += 4) {
- int reduced_col_count = nnz_per_row[r / 4];
- float32x4_t accum0 = vdupq_n_f32(bias_ptr[r]);
- float32x4_t accum1 = vdupq_n_f32(bias_ptr[r + 1]);
- float32x4_t accum2 = vdupq_n_f32(bias_ptr[r + 2]);
- float32x4_t accum3 = vdupq_n_f32(bias_ptr[r + 3]);
- float32x4_t accum4 = vdupq_n_f32(bias_ptr[r]);
- float32x4_t accum5 = vdupq_n_f32(bias_ptr[r + 1]);
- float32x4_t accum6 = vdupq_n_f32(bias_ptr[r + 2]);
- float32x4_t accum7 = vdupq_n_f32(bias_ptr[r + 3]);
- ...
- for (int c = 0; c < reduced_col_count; ++c) {
- int32_t offset = *col_deltas_bytes; col_deltas_bytes++;
- rhs_ptr += offset;
- float32x4_t rhs = vld1q_f32(rhs_ptr);
- float32x4_t rhs2 = vld1q_f32(rhs2_ptr);
- float32x4_t rhs3 = vld1q_f32(rhs3_ptr);
- float32x4_t rhs4 = vld1q_f32(rhs4_ptr);
- float32x4_t rhs5 = vld1q_f32(rhs5_ptr);
-
- uint16x4_t lhs0_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs1_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs2_int = vld1_u16(weights_ptr); weights_ptr += 4;
- uint16x4_t lhs3_int = vld1_u16(weights_ptr); weights_ptr += 4;
-
- float32x4_t lhs0 = vreinterpretq_f32_u32(vshll_n_u16(lhs0_int, 16));
- float32x4_t lhs1 = vreinterpretq_f32_u32(vshll_n_u16(lhs1_int, 16));
- float32x4_t lhs2 = vreinterpretq_f32_u32(vshll_n_u16(lhs2_int, 16));
- float32x4_t lhs3 = vreinterpretq_f32_u32(vshll_n_u16(lhs3_int, 16));
-
- accum0 = vmlaq_f32(accum0, lhs0, rhs);
- accum1 = vmlaq_f32(accum1, lhs1, rhs);
- accum2 = vmlaq_f32(accum2, lhs2, rhs);
- accum3 = vmlaq_f32(accum3, lhs3, rhs);
- accum4 = vmlaq_f32(accum4, lhs0, rhs2);
- accum5 = vmlaq_f32(accum5, lhs1, rhs2);
- accum6 = vmlaq_f32(accum6, lhs2, rhs2);
- accum7 = vmlaq_f32(accum7, lhs3, rhs2);
- ...
- }
-
- float32x4_t reduce0 = vpaddq_f32(accum0, accum1);
- float32x4_t reduce1 = vpaddq_f32(accum2, accum3);
- float32x4_t reduce2 = vpaddq_f32(reduce0, reduce1);
- vst1q_f32(out_ptr + r, reduce2);
-
- float32x4_t reduce3 = vpaddq_f32(accum4, accum5);
- float32x4_t reduce4 = vpaddq_f32(accum6, accum7);
- float32x4_t reduce5 = vpaddq_f32(reduce3, reduce4);
- vst1q_f32(out2_ptr + r, reduce5);
-
- ...
- } */
-
- // If the relu is handled inside the routine, either with a compare and
- // vbit (insert if true) or by branching, it is slightly but noticeably
- // (~5%) slower; the outer branch on relu avoids that penalty.
- //
- // Pointers to the columns.
- const float* rhs2_ptr = rhs_ptr + cols;
- float* out2_ptr = out_ptr + rows;
- const float* rhs3_ptr = rhs_ptr + 2 * cols;
- float* out3_ptr = out_ptr + 2 * rows;
- const float* rhs4_ptr = rhs_ptr + 3 * cols;
- float* out4_ptr = out_ptr + 3 * rows;
- const float* rhs5_ptr = rhs_ptr + 4 * cols;
- float* out5_ptr = out_ptr + 4 * rows;
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
- "ld1 {v1.4s}, [%[rhs2_ptr]], x8\n"
- "ld1 {v8.4s}, [%[rhs3_ptr]], x8\n"
- "ld1 {v9.4s}, [%[rhs4_ptr]], x8\n"
- "ld1 {v10.4s}, [%[rhs5_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v4.4s, v5.4s, v6.4s, v7.4s}, [%[weights_ptr]], #64\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n" // for 1st column
- "fmla v29.4s, v5.4s, v0.4s\n" // for 1st column
- "fmla v30.4s, v6.4s, v0.4s\n" // for 1st column
- "fmla v31.4s, v7.4s, v0.4s\n" // for 1st column
- "fmla v23.4s, v4.4s, v1.4s\n" // for 2nd column
- "fmla v24.4s, v5.4s, v1.4s\n" // for 2nd column
- "fmla v25.4s, v6.4s, v1.4s\n" // for 2nd column
- "fmla v26.4s, v7.4s, v1.4s\n" // for 2nd column
- "fmla v19.4s, v4.4s, v8.4s\n" // for 3rd column
- "fmla v20.4s, v5.4s, v8.4s\n" // for 3rd column
- "fmla v21.4s, v6.4s, v8.4s\n" // for 3rd column
- "fmla v22.4s, v7.4s, v8.4s\n" // for 3rd column
- "fmla v15.4s, v4.4s, v9.4s\n" // for 4th column
- "fmla v16.4s, v5.4s, v9.4s\n" // for 4th column
- "fmla v17.4s, v6.4s, v9.4s\n" // for 4th column
- "fmla v18.4s, v7.4s, v9.4s\n" // for 4th column
- "fmla v11.4s, v4.4s, v10.4s\n" // for 5th column
- "fmla v12.4s, v5.4s, v10.4s\n" // for 5th column
- "fmla v13.4s, v6.4s, v10.4s\n" // for 5th column
- "fmla v14.4s, v7.4s, v10.4s\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "movi v0.4s, #0\n"
- "faddp v28.4s, v28.4s, v29.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v16.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "faddp v30.4s, v30.4s, v31.4s\n" // 1st column
- "faddp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "faddp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "faddp v17.4s, v17.4s, v18.4s\n" // 4th column
- "faddp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "faddp v28.4s, v28.4s, v30.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v17.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Do relu as requested.
- "fmax v28.4s, v28.4s, v0.4s\n"
- "fmax v23.4s, v23.4s, v0.4s\n"
- "fmax v19.4s, v19.4s, v0.4s\n"
- "fmax v15.4s, v15.4s, v0.4s\n"
- "fmax v11.4s, v11.4s, v0.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr),
- [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr),
- [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr),
- [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each.
- "ld1 {v0.4s}, [%[rhs_ptr]], x8\n"
- "ld1 {v1.4s}, [%[rhs2_ptr]], x8\n"
- "ld1 {v8.4s}, [%[rhs3_ptr]], x8\n"
- "ld1 {v9.4s}, [%[rhs4_ptr]], x8\n"
- "ld1 {v10.4s}, [%[rhs5_ptr]], x8\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v4.4s, v5.4s, v6.4s, v7.4s}, [%[weights_ptr]], #64\n"
-
- // Multiply-accumulate.
- "fmla v28.4s, v4.4s, v0.4s\n" // for 1st column
- "fmla v29.4s, v5.4s, v0.4s\n" // for 1st column
- "fmla v30.4s, v6.4s, v0.4s\n" // for 1st column
- "fmla v31.4s, v7.4s, v0.4s\n" // for 1st column
- "fmla v23.4s, v4.4s, v1.4s\n" // for 2nd column
- "fmla v24.4s, v5.4s, v1.4s\n" // for 2nd column
- "fmla v25.4s, v6.4s, v1.4s\n" // for 2nd column
- "fmla v26.4s, v7.4s, v1.4s\n" // for 2nd column
- "fmla v19.4s, v4.4s, v8.4s\n" // for 3rd column
- "fmla v20.4s, v5.4s, v8.4s\n" // for 3rd column
- "fmla v21.4s, v6.4s, v8.4s\n" // for 3rd column
- "fmla v22.4s, v7.4s, v8.4s\n" // for 3rd column
- "fmla v15.4s, v4.4s, v9.4s\n" // for 4th column
- "fmla v16.4s, v5.4s, v9.4s\n" // for 4th column
- "fmla v17.4s, v6.4s, v9.4s\n" // for 4th column
- "fmla v18.4s, v7.4s, v9.4s\n" // for 4th column
- "fmla v11.4s, v4.4s, v10.4s\n" // for 5th column
- "fmla v12.4s, v5.4s, v10.4s\n" // for 5th column
- "fmla v13.4s, v6.4s, v10.4s\n" // for 5th column
- "fmla v14.4s, v7.4s, v10.4s\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "faddp v28.4s, v28.4s, v29.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v16.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "faddp v30.4s, v30.4s, v31.4s\n" // 1st column
- "faddp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "faddp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "faddp v17.4s, v17.4s, v18.4s\n" // 4th column
- "faddp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "faddp v28.4s, v28.4s, v30.4s\n" // 1st column
- "faddp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "faddp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "faddp v15.4s, v15.4s, v17.4s\n" // 4th column
- "faddp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr),
- [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr),
- [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr),
- [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes),
- [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row),
- [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr),
- [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr),
- [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
-
-// Note that the number of exponent bits in the output type must exactly
-// match the sum of the exponent bits of the weight and rhs types.
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<
- IsFixed16Type<WeightType>::value && IsFixed16Type<RhsType>::value &&
- std::is_same<OutType, typename TypeOfProduct<WeightType, RhsType>::type>::value>::type
-SpMV_4x4(const WeightType* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const RhsType* rhs_ptr,
- const typename TypeOfProduct<WeightType, RhsType>::type* bias_ptr,
- OutType* out_ptr, int64_t assigned_rows,
- int64_t rows /* only used in SpMM variants */,
- int64_t cols /* only used in SpMM variants */, int relu) {
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // accum_0 = bias[0]
- "dup v29.4s, v27.s[1]\n" // accum_1 = bias[1]
- "dup v30.4s, v27.s[2]\n" // accum_2 = bias[2]
- "dup v31.4s, v27.s[3]\n" // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 1 Rhs vector of size 1x4.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- // Duplicate the lower half into the upper half.
- "mov v0.d[1], v0.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n"
- "smlal2 v29.4s, v2.8h, v0.8h\n"
- "smlal v30.4s, v3.4h, v0.4h\n"
- "smlal2 v31.4s, v3.8h, v0.8h\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "addp v28.4s, v28.4s, v29.4s\n"
- "addp v30.4s, v30.4s, v31.4s\n"
- "addp v28.4s, v28.4s, v30.4s\n"
-
- // Do relu if requested.
- "smax v28.4s, v28.4s, v25.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // accum_0 = bias[0]
- "dup v29.4s, v27.s[1]\n" // accum_1 = bias[1]
- "dup v30.4s, v27.s[2]\n" // accum_2 = bias[2]
- "dup v31.4s, v27.s[3]\n" // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 1 Rhs vector of size 1x4.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- // Duplicate the lower half into the upper half.
- "mov v0.d[1], v0.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n"
- "smlal2 v29.4s, v2.8h, v0.8h\n"
- "smlal v30.4s, v3.4h, v0.4h\n"
- "smlal2 v31.4s, v3.8h, v0.8h\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "addp v28.4s, v28.4s, v29.4s\n"
- "addp v30.4s, v30.4s, v31.4s\n"
- "addp v28.4s, v28.4s, v30.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
-
-// Note that the number of exponent bits in the output type must exactly
-// match the sum of the exponent bits of the weight and rhs types.
-template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<
- IsFixed16Type<WeightType>::value && IsFixed16Type<RhsType>::value &&
- std::is_same<OutType, typename TypeOfProduct<WeightType, RhsType>::type>::value>::type
-SpMM5_4x4(const WeightType* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const RhsType* rhs_ptr,
- const typename TypeOfProduct<WeightType, RhsType>::type* bias_ptr,
- OutType* out_ptr, int64_t assigned_rows, int64_t rows, int64_t cols,
- int relu) {
- // Pointers to the columns.
- const RhsType* rhs2_ptr = rhs_ptr + cols;
- OutType* out2_ptr = out_ptr + rows;
- const RhsType* rhs3_ptr = rhs_ptr + 2 * cols;
- OutType* out3_ptr = out_ptr + 2 * rows;
- const RhsType* rhs4_ptr = rhs_ptr + 3 * cols;
- OutType* out4_ptr = out_ptr + 3 * rows;
- const RhsType* rhs5_ptr = rhs_ptr + 4 * cols;
- OutType* out5_ptr = out_ptr + 4 * rows;
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each and duplicate each into its upper half.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- "mov v0.d[1], v0.d[0]\n"
- "ld1 {v1.4h}, [%[rhs2_ptr]], x8\n"
- "mov v1.d[1], v1.d[0]\n"
- "ld1 {v8.4h}, [%[rhs3_ptr]], x8\n"
- "mov v8.d[1], v8.d[0]\n"
- "ld1 {v9.4h}, [%[rhs4_ptr]], x8\n"
- "mov v9.d[1], v9.d[0]\n"
- "ld1 {v10.4h}, [%[rhs5_ptr]], x8\n"
- "mov v10.d[1], v10.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n" // for 1st column
- "smlal2 v29.4s, v2.8h, v0.8h\n" // for 1st column
- "smlal v30.4s, v3.4h, v0.4h\n" // for 1st column
- "smlal2 v31.4s, v3.8h, v0.8h\n" // for 1st columh
- "smlal v23.4s, v2.4h, v1.4h\n" // for 2nd column
- "smlal2 v24.4s, v2.8h, v1.8h\n" // for 2nd column
- "smlal v25.4s, v3.4h, v1.4h\n" // for 2nd column
- "smlal2 v26.4s, v3.8h, v1.8h\n" // for 2nd column
- "smlal v19.4s, v2.4h, v8.4h\n" // for 3rd column
- "smlal2 v20.4s, v2.8h, v8.8h\n" // for 3rd column
- "smlal v21.4s, v3.4h, v8.4h\n" // for 3rd column
- "smlal2 v22.4s, v3.8h, v8.8h\n" // for 3rd column
- "smlal v15.4s, v2.4h, v9.4h\n" // for 4th column
- "smlal2 v16.4s, v2.8h, v9.8h\n" // for 4th column
- "smlal v17.4s, v3.4h, v9.4h\n" // for 4th column
- "smlal2 v18.4s, v3.8h, v9.8h\n" // for 4th column
- "smlal v11.4s, v2.4h, v10.4h\n" // for 5th column
- "smlal2 v12.4s, v2.8h, v10.8h\n" // for 5th column
- "smlal v13.4s, v3.4h, v10.4h\n" // for 5th column
- "smlal2 v14.4s, v3.8h, v10.8h\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "movi v0.4s, #0\n"
- "addp v28.4s, v28.4s, v29.4s\n" // 1st column
- "addp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v16.4s\n" // 4th column
- "addp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "addp v30.4s, v30.4s, v31.4s\n" // 1st column
- "addp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "addp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "addp v17.4s, v17.4s, v18.4s\n" // 4th column
- "addp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "addp v28.4s, v28.4s, v30.4s\n" // 1st column
- "addp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v17.4s\n" // 4th column
- "addp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Do relu as requested.
- "smax v28.4s, v28.4s, v0.4s\n"
- "smax v23.4s, v23.4s, v0.4s\n"
- "smax v19.4s, v19.4s, v0.4s\n"
- "smax v15.4s, v15.4s, v0.4s\n"
- "smax v11.4s, v11.4s, v0.4s\n"
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr), [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr), [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr), [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each and duplicate each into its upper half.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- "mov v0.d[1], v0.d[0]\n"
- "ld1 {v1.4h}, [%[rhs2_ptr]], x8\n"
- "mov v1.d[1], v1.d[0]\n"
- "ld1 {v8.4h}, [%[rhs3_ptr]], x8\n"
- "mov v8.d[1], v8.d[0]\n"
- "ld1 {v9.4h}, [%[rhs4_ptr]], x8\n"
- "mov v9.d[1], v9.d[0]\n"
- "ld1 {v10.4h}, [%[rhs5_ptr]], x8\n"
- "mov v10.d[1], v10.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n" // for 1st column
- "smlal2 v29.4s, v2.8h, v0.8h\n" // for 1st column
- "smlal v30.4s, v3.4h, v0.4h\n" // for 1st column
- "smlal2 v31.4s, v3.8h, v0.8h\n" // for 1st columh
- "smlal v23.4s, v2.4h, v1.4h\n" // for 2nd column
- "smlal2 v24.4s, v2.8h, v1.8h\n" // for 2nd column
- "smlal v25.4s, v3.4h, v1.4h\n" // for 2nd column
- "smlal2 v26.4s, v3.8h, v1.8h\n" // for 2nd column
- "smlal v19.4s, v2.4h, v8.4h\n" // for 3rd column
- "smlal2 v20.4s, v2.8h, v8.8h\n" // for 3rd column
- "smlal v21.4s, v3.4h, v8.4h\n" // for 3rd column
- "smlal2 v22.4s, v3.8h, v8.8h\n" // for 3rd column
- "smlal v15.4s, v2.4h, v9.4h\n" // for 4th column
- "smlal2 v16.4s, v2.8h, v9.8h\n" // for 4th column
- "smlal v17.4s, v3.4h, v9.4h\n" // for 4th column
- "smlal2 v18.4s, v3.8h, v9.8h\n" // for 4th column
- "smlal v11.4s, v2.4h, v10.4h\n" // for 5th column
- "smlal2 v12.4s, v2.8h, v10.8h\n" // for 5th column
- "smlal v13.4s, v3.4h, v10.4h\n" // for 5th column
- "smlal2 v14.4s, v3.8h, v10.8h\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "addp v28.4s, v28.4s, v29.4s\n" // 1st column
- "addp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v16.4s\n" // 4th column
- "addp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "addp v30.4s, v30.4s, v31.4s\n" // 1st column
- "addp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "addp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "addp v17.4s, v17.4s, v18.4s\n" // 4th column
- "addp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "addp v28.4s, v28.4s, v30.4s\n" // 1st column
- "addp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v17.4s\n" // 4th column
- "addp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Store accumulators.
- "st1 {v28.4s}, [%[out_ptr]], #16\n"
- "st1 {v23.4s}, [%[out2_ptr]], #16\n"
- "st1 {v19.4s}, [%[out3_ptr]], #16\n"
- "st1 {v15.4s}, [%[out4_ptr]], #16\n"
- "st1 {v11.4s}, [%[out5_ptr]], #16\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr), [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr), [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr), [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
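The three `addp` stages above reduce four 4-lane accumulators (one per output row of the 4x4 block, each lane holding a partial product sum) to a single vector of horizontal sums. A scalar sketch of that reduction tree, with illustrative names that are not part of the kernel:

```cpp
#include <array>
#include <cstdint>

// Scalar model of AArch64 "addp vD, vA, vB": the result lanes are the
// pairwise sums {A0+A1, A2+A3, B0+B1, B2+B3}.
static std::array<int32_t, 4> Addp(const std::array<int32_t, 4>& a,
                                   const std::array<int32_t, 4>& b) {
  return {a[0] + a[1], a[2] + a[3], b[0] + b[1], b[2] + b[3]};
}

// The three-addp tree used by the kernel: acc[r] holds the four partial
// sums for output row r; the result holds one horizontal sum per row.
std::array<int32_t, 4> AddpTree(
    const std::array<std::array<int32_t, 4>, 4>& acc) {
  std::array<int32_t, 4> lo = Addp(acc[0], acc[1]);  // rows 0 and 1
  std::array<int32_t, 4> hi = Addp(acc[2], acc[3]);  // rows 2 and 3
  return Addp(lo, hi);  // {sum(row0), sum(row1), sum(row2), sum(row3)}
}
```

After this tree, lane r of the result is exactly the dot product accumulated for output row r, which is why a single `st1` can then store all four row results.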
-
-// Note that the number of exponent bits in the bias must exactly match
-// the sum of the input and rhs types.
- template <typename WeightType, typename RhsType, typename OutType>
- typename std::enable_if<IsFixed16Type<WeightType>::value &&
- IsFixed16Type<RhsType>::value &&
- IsFixed16Type<OutType>::value>::type
-SpMV_4x4(const WeightType* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const RhsType* rhs_ptr,
- const typename TypeOfProduct::type* bias_ptr,
- OutType* out_ptr, int64_t assigned_rows,
- int64_t rows /* only used in SpMM variants */,
- int64_t cols /* only used in SpMM variants */, int relu) {
- constexpr int kShiftAmount = 15 - WeightType::kExponentBits -
- RhsType::kExponentBits + OutType::kExponentBits;
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // accum_0 = bias[0]
- "dup v29.4s, v27.s[1]\n" // accum_1 = bias[1]
- "dup v30.4s, v27.s[2]\n" // accum_2 = bias[2]
- "dup v31.4s, v27.s[3]\n" // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load one Rhs vector of size 1x4.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- // Duplicate the lower half into the upper half.
- "mov v0.d[1], v0.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n"
- "smlal2 v29.4s, v2.8h, v0.8h\n"
- "smlal v30.4s, v3.4h, v0.4h\n"
- "smlal2 v31.4s, v3.8h, v0.8h\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "addp v28.4s, v28.4s, v29.4s\n"
- "addp v30.4s, v30.4s, v31.4s\n"
- "addp v28.4s, v28.4s, v30.4s\n"
-
- // Do relu if requested.
- "smax v28.4s, v28.4s, v25.4s\n"
- "sqrshrn v26.4h, v28.4s, %[shift_amount]\n"
-
- // Store accumulators.
- "st1 {v26.4h}, [%[out_ptr]], #8\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- [shift_amount] "I"(kShiftAmount)
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
-
- "movi v25.4s, #0\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // accum_0 = bias[0]
- "dup v29.4s, v27.s[1]\n" // accum_1 = bias[1]
- "dup v30.4s, v27.s[2]\n" // accum_2 = bias[2]
- "dup v31.4s, v27.s[3]\n" // accum_3 = bias[3]
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load one Rhs vector of size 1x4.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- // Duplicate the lower half into the upper half.
- "mov v0.d[1], v0.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n"
- "smlal2 v29.4s, v2.8h, v0.8h\n"
- "smlal v30.4s, v3.4h, v0.4h\n"
- "smlal2 v31.4s, v3.8h, v0.8h\n"
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- // Horizontally add accumulators and store result.
- "addp v28.4s, v28.4s, v29.4s\n"
- "addp v30.4s, v30.4s, v31.4s\n"
- "addp v28.4s, v28.4s, v30.4s\n"
- "sqrshrn v26.4h, v28.4s, %[shift_amount]\n"
-
- // Store accumulators.
- "st1 {v26.4h}, [%[out_ptr]], #8\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr)
- : // inputs
- [shift_amount] "I"(kShiftAmount)
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
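The final `sqrshrn` narrows each 32-bit accumulator to 16 bits with a signed rounding right shift and saturation, by the `kShiftAmount` derived from the exponent bits of the operand types. A scalar sketch of one lane (`SqrshrnLane` is a hypothetical helper name; it assumes `shift >= 1`, which holds since the instruction's immediate range starts at 1):

```cpp
#include <algorithm>
#include <cstdint>

// Scalar model of one lane of AArch64 sqrshrn: signed rounding shift
// right by `shift`, then saturating narrow from int32 to int16.
int16_t SqrshrnLane(int32_t acc, int shift) {
  // Round to nearest by adding half of the smallest shifted-out unit
  // before the arithmetic shift (done in 64 bits to avoid overflow).
  int64_t rounded =
      (static_cast<int64_t>(acc) + (int64_t{1} << (shift - 1))) >> shift;
  // Saturate to the int16 range.
  rounded = std::min<int64_t>(std::max<int64_t>(rounded, INT16_MIN), INT16_MAX);
  return static_cast<int16_t>(rounded);
}
```

With `kShiftAmount = 15 - WeightType::kExponentBits - RhsType::kExponentBits + OutType::kExponentBits`, this shift realigns the binary point of the 32-bit product-sum to the fixed-point format of `OutType`, which is why the exponent bits must match up as the comment above the function states.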
-
-// Note that the number of exponent bits in the output must exactly match
-// the sum of the input and rhs types.
- template <typename WeightType, typename RhsType, typename OutType>
- typename std::enable_if<IsFixed16Type<WeightType>::value &&
- IsFixed16Type<RhsType>::value &&
- IsFixed16Type<OutType>::value>::type
-SpMM5_4x4(const WeightType* weights_ptr, const int16_t* col_deltas_bytes,
- const int32_t* nnz_per_row, const RhsType* rhs_ptr,
- const typename TypeOfProduct::type* bias_ptr,
- OutType* out_ptr, int64_t assigned_rows, int64_t rows, int64_t cols,
- int relu) {
- constexpr int kShiftAmount = 15 - WeightType::kExponentBits -
- RhsType::kExponentBits + OutType::kExponentBits;
- // Pointers to the columns.
- const RhsType* rhs2_ptr = rhs_ptr + cols;
- OutType* out2_ptr = out_ptr + rows;
- const RhsType* rhs3_ptr = rhs_ptr + 2 * cols;
- OutType* out3_ptr = out_ptr + 2 * rows;
- const RhsType* rhs4_ptr = rhs_ptr + 3 * cols;
- OutType* out4_ptr = out_ptr + 3 * rows;
- const RhsType* rhs5_ptr = rhs_ptr + 4 * cols;
- OutType* out5_ptr = out_ptr + 4 * rows;
- if (relu) {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each and duplicate each into its upper half.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- "mov v0.d[1], v0.d[0]\n"
- "ld1 {v1.4h}, [%[rhs2_ptr]], x8\n"
- "mov v1.d[1], v1.d[0]\n"
- "ld1 {v8.4h}, [%[rhs3_ptr]], x8\n"
- "mov v8.d[1], v8.d[0]\n"
- "ld1 {v9.4h}, [%[rhs4_ptr]], x8\n"
- "mov v9.d[1], v9.d[0]\n"
- "ld1 {v10.4h}, [%[rhs5_ptr]], x8\n"
- "mov v10.d[1], v10.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n" // for 1st column
- "smlal2 v29.4s, v2.8h, v0.8h\n" // for 1st column
- "smlal v30.4s, v3.4h, v0.4h\n" // for 1st column
- "smlal2 v31.4s, v3.8h, v0.8h\n" // for 1st columh
- "smlal v23.4s, v2.4h, v1.4h\n" // for 2nd column
- "smlal2 v24.4s, v2.8h, v1.8h\n" // for 2nd column
- "smlal v25.4s, v3.4h, v1.4h\n" // for 2nd column
- "smlal2 v26.4s, v3.8h, v1.8h\n" // for 2nd column
- "smlal v19.4s, v2.4h, v8.4h\n" // for 3rd column
- "smlal2 v20.4s, v2.8h, v8.8h\n" // for 3rd column
- "smlal v21.4s, v3.4h, v8.4h\n" // for 3rd column
- "smlal2 v22.4s, v3.8h, v8.8h\n" // for 3rd column
- "smlal v15.4s, v2.4h, v9.4h\n" // for 4th column
- "smlal2 v16.4s, v2.8h, v9.8h\n" // for 4th column
- "smlal v17.4s, v3.4h, v9.4h\n" // for 4th column
- "smlal2 v18.4s, v3.8h, v9.8h\n" // for 4th column
- "smlal v11.4s, v2.4h, v10.4h\n" // for 5th column
- "smlal2 v12.4s, v2.8h, v10.8h\n" // for 5th column
- "smlal v13.4s, v3.4h, v10.4h\n" // for 5th column
- "smlal2 v14.4s, v3.8h, v10.8h\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "movi v0.4s, #0\n"
- "addp v28.4s, v28.4s, v29.4s\n" // 1st column
- "addp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v16.4s\n" // 4th column
- "addp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "addp v30.4s, v30.4s, v31.4s\n" // 1st column
- "addp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "addp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "addp v17.4s, v17.4s, v18.4s\n" // 4th column
- "addp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "addp v28.4s, v28.4s, v30.4s\n" // 1st column
- "addp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v17.4s\n" // 4th column
- "addp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- // Do relu if requested.
- "smax v28.4s, v28.4s, v0.4s\n"
- "smax v23.4s, v23.4s, v0.4s\n"
- "smax v19.4s, v19.4s, v0.4s\n"
- "smax v15.4s, v15.4s, v0.4s\n"
- "smax v11.4s, v11.4s, v0.4s\n"
- "sqrshrn v26.4h, v28.4s, %[shift_amount]\n"
- "sqrshrn v22.4h, v23.4s, %[shift_amount]\n"
- "sqrshrn v18.4h, v19.4s, %[shift_amount]\n"
- "sqrshrn v14.4h, v15.4s, %[shift_amount]\n"
- "sqrshrn v10.4h, v11.4s, %[shift_amount]\n"
-
- // Store accumulators.
- "st1 {v26.4h}, [%[out_ptr]], #8\n"
- "st1 {v22.4h}, [%[out2_ptr]], #8\n"
- "st1 {v18.4h}, [%[out3_ptr]], #8\n"
- "st1 {v14.4h}, [%[out4_ptr]], #8\n"
- "st1 {v10.4h}, [%[out5_ptr]], #8\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr), [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr), [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr), [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- [shift_amount] "I"(kShiftAmount)
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- } else {
- asm(
- // Load the first two column deltas.
- "ldrsh x7, [%[col_deltas_bytes]], #2\n"
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
- // ld1 doesn't support pre-index, so we do the first addition here.
- "add %[rhs_ptr], %[rhs_ptr], x7\n"
- "add %[rhs2_ptr], %[rhs2_ptr], x7\n"
- "add %[rhs3_ptr], %[rhs3_ptr], x7\n"
- "add %[rhs4_ptr], %[rhs4_ptr], x7\n"
- "add %[rhs5_ptr], %[rhs5_ptr], x7\n"
-
- LABEL_ROW_LOOP
- ":\n"
-
- // Load the bias.
- "ld1 {v27.4s}, [%[bias_ptr]], #16\n"
-
- // Initialize local accumulators with the bias.
- "dup v28.4s, v27.s[0]\n" // for 1st column
- "dup v29.4s, v27.s[1]\n" // for 1st column
- "dup v30.4s, v27.s[2]\n" // for 1st column
- "dup v31.4s, v27.s[3]\n" // for 1st column
- "dup v23.4s, v27.s[0]\n" // for 2nd column
- "dup v24.4s, v27.s[1]\n" // for 2nd column
- "dup v25.4s, v27.s[2]\n" // for 2nd column
- "dup v26.4s, v27.s[3]\n" // for 2nd column
- "dup v19.4s, v27.s[0]\n" // for 3rd column
- "dup v20.4s, v27.s[1]\n" // for 3rd column
- "dup v21.4s, v27.s[2]\n" // for 3rd column
- "dup v22.4s, v27.s[3]\n" // for 3rd column
- "dup v15.4s, v27.s[0]\n" // for 4th column
- "dup v16.4s, v27.s[1]\n" // for 4th column
- "dup v17.4s, v27.s[2]\n" // for 4th column
- "dup v18.4s, v27.s[3]\n" // for 4th column
- "dup v11.4s, v27.s[0]\n" // for 5th column
- "dup v12.4s, v27.s[1]\n" // for 5th column
- "dup v13.4s, v27.s[2]\n" // for 5th column
- "dup v14.4s, v27.s[3]\n" // for 5th column
-
- // Update the stopping condition for this set of rows.
- "ldr w6, [%[nnz_per_row]], #4\n"
- "cmp w6, #0\n"
- // Skip the body if there isn't anything in this row.
- "beq " LABEL_SKIP_COL_LOOP "f\n"
-
- LABEL_COL_LOOP
- ":\n"
- // Load 5 Rhs vectors of size 1x4 each and duplicate each into its upper half.
- "ld1 {v0.4h}, [%[rhs_ptr]], x8\n"
- "mov v0.d[1], v0.d[0]\n"
- "ld1 {v1.4h}, [%[rhs2_ptr]], x8\n"
- "mov v1.d[1], v1.d[0]\n"
- "ld1 {v8.4h}, [%[rhs3_ptr]], x8\n"
- "mov v8.d[1], v8.d[0]\n"
- "ld1 {v9.4h}, [%[rhs4_ptr]], x8\n"
- "mov v9.d[1], v9.d[0]\n"
- "ld1 {v10.4h}, [%[rhs5_ptr]], x8\n"
- "mov v10.d[1], v10.d[0]\n"
-
- // Start this load now, which we won't need until the end of the loop.
- "ldrsh x8, [%[col_deltas_bytes]], #2\n"
-
- // Load 16 Lhs cells corresponding to a 4x4 block.
- "ld1 {v2.8h, v3.8h}, [%[weights_ptr]], #32\n"
-
- // Multiply-accumulate.
- "smlal v28.4s, v2.4h, v0.4h\n" // for 1st column
- "smlal2 v29.4s, v2.8h, v0.8h\n" // for 1st column
- "smlal v30.4s, v3.4h, v0.4h\n" // for 1st column
- "smlal2 v31.4s, v3.8h, v0.8h\n" // for 1st columh
- "smlal v23.4s, v2.4h, v1.4h\n" // for 2nd column
- "smlal2 v24.4s, v2.8h, v1.8h\n" // for 2nd column
- "smlal v25.4s, v3.4h, v1.4h\n" // for 2nd column
- "smlal2 v26.4s, v3.8h, v1.8h\n" // for 2nd column
- "smlal v19.4s, v2.4h, v8.4h\n" // for 3rd column
- "smlal2 v20.4s, v2.8h, v8.8h\n" // for 3rd column
- "smlal v21.4s, v3.4h, v8.4h\n" // for 3rd column
- "smlal2 v22.4s, v3.8h, v8.8h\n" // for 3rd column
- "smlal v15.4s, v2.4h, v9.4h\n" // for 4th column
- "smlal2 v16.4s, v2.8h, v9.8h\n" // for 4th column
- "smlal v17.4s, v3.4h, v9.4h\n" // for 4th column
- "smlal2 v18.4s, v3.8h, v9.8h\n" // for 4th column
- "smlal v11.4s, v2.4h, v10.4h\n" // for 5th column
- "smlal2 v12.4s, v2.8h, v10.8h\n" // for 5th column
- "smlal v13.4s, v3.4h, v10.4h\n" // for 5th column
- "smlal2 v14.4s, v3.8h, v10.8h\n" // for 5th column
-
- // Loop. Decrement loop index.
- "subs w6, w6, #1\n" // decrement (reduced) columns left
- "bne " LABEL_COL_LOOP "b\n"
-
- LABEL_SKIP_COL_LOOP
- ":\n"
-
- "addp v28.4s, v28.4s, v29.4s\n" // 1st column
- "addp v23.4s, v23.4s, v24.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v20.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v16.4s\n" // 4th column
- "addp v11.4s, v11.4s, v12.4s\n" // 5th column
-
- "addp v30.4s, v30.4s, v31.4s\n" // 1st column
- "addp v25.4s, v25.4s, v26.4s\n" // 2nd column
- "addp v21.4s, v21.4s, v22.4s\n" // 3rd column
- "addp v17.4s, v17.4s, v18.4s\n" // 4th column
- "addp v13.4s, v13.4s, v14.4s\n" // 5th column
-
- "addp v28.4s, v28.4s, v30.4s\n" // 1st column
- "addp v23.4s, v23.4s, v25.4s\n" // 2nd column
- "addp v19.4s, v19.4s, v21.4s\n" // 3rd column
- "addp v15.4s, v15.4s, v17.4s\n" // 4th column
- "addp v11.4s, v11.4s, v13.4s\n" // 5th column
-
- "sqrshrn v26.4h, v28.4s, %[shift_amount]\n"
- "sqrshrn v22.4h, v23.4s, %[shift_amount]\n"
- "sqrshrn v18.4h, v19.4s, %[shift_amount]\n"
- "sqrshrn v14.4h, v15.4s, %[shift_amount]\n"
- "sqrshrn v10.4h, v11.4s, %[shift_amount]\n"
-
- // Store accumulators.
- "st1 {v26.4h}, [%[out_ptr]], #8\n"
- "st1 {v22.4h}, [%[out2_ptr]], #8\n"
- "st1 {v18.4h}, [%[out3_ptr]], #8\n"
- "st1 {v14.4h}, [%[out4_ptr]], #8\n"
- "st1 {v10.4h}, [%[out5_ptr]], #8\n"
-
- // Decrement rows remaining.
- "subs %[assigned_rows], %[assigned_rows], #1\n"
- "bne " LABEL_ROW_LOOP "b\n"
-
- // clang-format off
- : // outputs
- [out_ptr] "+r"(out_ptr), [out2_ptr] "+r"(out2_ptr),
- [out3_ptr] "+r"(out3_ptr), [out4_ptr] "+r"(out4_ptr),
- [out5_ptr] "+r"(out5_ptr), [weights_ptr] "+r"(weights_ptr),
- [col_deltas_bytes] "+r"(col_deltas_bytes), [bias_ptr] "+r"(bias_ptr),
- [nnz_per_row] "+r"(nnz_per_row), [assigned_rows] "+r"(assigned_rows),
- [rhs_ptr] "+r"(rhs_ptr), [rhs2_ptr] "+r"(rhs2_ptr),
- [rhs3_ptr] "+r"(rhs3_ptr), [rhs4_ptr] "+r"(rhs4_ptr),
- [rhs5_ptr] "+r"(rhs5_ptr)
- : // inputs
- [shift_amount] "I"(kShiftAmount)
- : // clobbers
- "cc", "memory", "x6", "x7", "x8", "v0", "v1", "v2", "v3", "v4", "v5",
- "v6", "v7", "v8", "v9", "v10", "v11", "v12", "v13", "v14", "v15",
- "v16", "v17", "v18", "v19", "v20", "v21", "v22", "v23", "v24", "v25",
- "v26", "v27", "v28", "v29", "v30", "v31");
- // clang-format on
- }
-}
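The sparse traversal these kernels share can be sketched in scalar C++ for the simplest case: one rhs column, 32-bit output, no relu or shift. The format assumptions here are inferred from the assembly, not stated by the source: `nnz_per_row[r]` counts the 4x4 weight blocks in row group r, `col_deltas_bytes` supplies a signed byte delta that advances the rhs pointer before each block (including the rewinding delta between row groups), and weight blocks are stored row-major. `SpMV4x4Ref` is a hypothetical name:

```cpp
#include <cstdint>

// Scalar reference sketch of the 4x4-block sparse matrix-vector product
// performed by the kernels above (illustrative; assumptions in the text).
void SpMV4x4Ref(const int16_t* weights, const int16_t* col_deltas_bytes,
                const int32_t* nnz_per_row, const int16_t* rhs,
                const int32_t* bias, int32_t* out, int64_t row_groups) {
  const char* rhs_bytes = reinterpret_cast<const char*>(rhs);
  for (int64_t rg = 0; rg < row_groups; ++rg) {
    // Seed the four accumulators of this row group with the bias.
    int32_t acc[4];
    for (int i = 0; i < 4; ++i) acc[i] = *bias++;
    for (int32_t block = 0; block < nnz_per_row[rg]; ++block) {
      // Advance to this block's column by the signed byte delta.
      rhs_bytes += *col_deltas_bytes++;
      const int16_t* x = reinterpret_cast<const int16_t*>(rhs_bytes);
      // Dense 4x4 block multiply, row-major within the block.
      for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
          acc[r] += static_cast<int32_t>(weights[4 * r + c]) * x[c];
      weights += 16;
    }
    for (int i = 0; i < 4; ++i) *out++ = acc[i];
  }
}
```

The assembly computes the same sums but keeps four vector accumulators live and reduces them with the `addp` tree; `SpMM5_4x4` additionally carries five rhs columns through the same weight stream, which is why only the rhs/out pointers are replicated while `weights_ptr`, `col_deltas_bytes`, and `nnz_per_row` are shared.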
-
-// Note that the number of exponent bits in the output must exactly match
-// the sum of the input and rhs types.
- template <typename WeightType, typename RhsType, typename OutType>
-typename std::enable_if<
- IsFixed16Type