repo: string (length 26 to 115)
file: string (length 54 to 212)
language: string (2 distinct values)
license: string (16 distinct values)
content: string (length 19 to 1.07M)
https://github.com/felsenhower/kbs-typst
https://raw.githubusercontent.com/felsenhower/kbs-typst/master/examples/06.typ
typst
MIT License
#figure(
  image("typst.svg", width: 50%),
  caption: [ Typst ist toll. ]
)
https://github.com/dogeystamp/typst-templates
https://raw.githubusercontent.com/dogeystamp/typst-templates/master/general.typ
typst
The Unlicense
// general document
#import "main.typ": gen_preamble, doc_template, lref, source_code, status

#let template(
  title: none,
  authors: none,
  suffix: none,
  prefix: none,
  enable-footer: true,
  body
) = {
  doc_template(
    title: title,
    authors: authors,
    enable-footer: enable-footer,
    {
      gen_preamble(title: title, authors: authors, suffix: suffix, prefix: prefix)
      body
    })
}
https://github.com/VisualFP/docs
https://raw.githubusercontent.com/VisualFP/docs/main/SA/design_concept/abstract.typ
typst
= Abstract

Most visual programming tools used to introduce children & young adults to the programming world are based on the imperative paradigm. Existing tools based on functional programming either lack a good user experience or hide critical concepts of the functional paradigm. To address this gap, a visual, block-based tool for functional programming should be designed. This project aims to find a visual design for such a tool and then prove its feasibility in a proof-of-concept application.

Existing visual programming tools are examined before creating a visual design. The development of the design is approached in two iterations: in a first step, concept drafts are based on the researched tools and evaluated using a survey. Then, a new concept is created using the received feedback and implemented in a proof of concept.

The created design concept focuses on function composition, guided by type holes that indicate the type required for a valid function definition. The implemented application proves that the proposed concept for function composition works as envisioned. It includes an inference engine that determines the type of undefined parts of a function and is built using Electron.js & Haskell. It is recommended that an additional project be conducted to implement the missing features of the application so that it can be used in classrooms.

Keywords: Haskell, Functional Programming, Visual Programming

#pagebreak()
https://github.com/kilpkonn/CV
https://raw.githubusercontent.com/kilpkonn/CV/master/modules_en/projects.typ
typst
// Imports
#import "@preview/brilliant-cv:2.0.3": cvSection, cvEntry
#let metadata = toml("../metadata.toml")
#let cvSection = cvSection.with(metadata: metadata)
#let cvEntry = cvEntry.with(metadata: metadata)

#cvSection("Projects & Associations")

#cvEntry(
  title: [Open Source Contributor],
  society: [`rust-analyzer`],
  date: [2023 - 2024],
  location: [],
  description: list(
    [Implemented term search (proof search) functionality],
    [Worked with type unification, trait solving, borrow checking and code generation],
  ),
)

#cvEntry(
  title: [Open Source Contributor],
  society: [`veloren`],
  date: [2021 - 2022],
  location: [],
  description: list(
    [Improved camera clipping and camera behavior in general],
    [Experimented with continuous time drag calculation for physics]
  ),
)
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/036%20-%20Guilds%20of%20Ravnica/001_Under%20the%20Cover%20of%20Fog.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Under the Cover of Fog", set_name: "Guilds of Ravnica", story_date: datetime(day: 10, month: 10, year: 2018), author: "<NAME>", doc ) A silver-winged surveillance fly buzzes near my ear, and I resist the urge to shoo it away. Whoever had worked the magic on it had done a shoddy job, first-year mind-mage probably. Seems like the bug spends more time staring at me than helping track down weapons shipments. My first few weeks of working the docks I hadn't found much, but now not a day goes by without me uncovering a crate full of jewel-encrusted battle mallets, or bone-carved armor, or poison-infused knives. Tension is brewing in Ravnica, I'm sure of it, but House Dimir doesn't expect me to think. They expect me to work these covert jobs without getting caught. With crates stacked a dozen high and crammed into a maze of thin passageways, my job is simple—quick crowbar to the lid, crack the crate's seal, just enough to let the bug fly inside, then it zips back out, and we move onto the next~only this time, a gleam inside the crate catches my attention. "Buttress South Whiskey," the label reads, and without another thought, the whiskey bottle is in my hands. Expensive, enchanted, and aged in casks made from thousand-year-old trees poached from Selesnyan forests. Immoral? Maybe. Lucrative? Definitely. Serves them right for not sealing the crate with a stronger spell. The bug chirps at me, urging me on, but it's too late. My mind's already imagining the pile of gold zinos I could get for it. The long, slim bottle would fit nicely into the pocket of my trench coat. No one would notice. Suddenly, the bug whistles, then I look up, now all too aware of the approaching footsteps I should have been listening for. Sloppy, Merret, sloppy. 
Fog swirls, obscuring me from view, and in those last few moments of bought time, I shove the bottle snugly into the divot of packing straw, gently tap the lid closed, and then try to look inconspicuous. "Ah! Merret!" says <NAME>, my boss, arms crossed over his wide chest, horns scraping against the stacked crates on either side of him. He's half-man, half-bull, total grind-hard. "Just the guy I was looking for." "Sir?" I say, averting my eyes, trying to blend into my surroundings. Wishing I could become invisible. "Fog's too thick, and I've got a potential investor wanting to see the harbor. Clear it for me." "Can't Warwick do it?" I ask. A little fog I can handle, but despite a year of training, I don't have enough focus to clear the harbor. Can't concentrate hard enough to inflict nightmares or purge memories. As a covert agent of House Dimir, I don't have much to offer except the ability to work a crowbar of malintent. #figure(image("001_Under the Cover of Fog/01.jpg", width: 100%), caption: [Wall of Mist | Art by: Tianhua X], supplement: none, numbering: none) "Warwick's out. And Bender, too. You're all I've got." He looks me up and down, nostrils flaring. "Unfortunately." "Thanks for the vote of confidence." "How's this for confidence~you don't clear it and you don't get your pay for today?" "I'm on it, boss," I grumble. Should have taken the damn bottle. There's no way I'm going to get this fog cleared. Bills are overdue, wife and kids are hungry. Another day with docked pay and deeper in debt. I amble up to the very edge of the deepest pier and focus on the magic all around me. I pull, drawing the power in like an inhale of glass shards, and then release, a force from within me beating like thunder against the inside of my eardrums. Fog swirls, barely, clearing about halfway to the other side of the river, just enough to reveal a sleek, Simic schooner with spiral-embellished sails cutting through the water. Two merfolk keep pace beside the boat. 
The one in the lead turns toward me, scowls, then presses a webbed palm against the hull of the ship. Within seconds, the schooner fades into shimmering blue-green ripples, indistinguishable from the choppy river water unless you knew right where to look. <NAME> stomps his hooves, his deep, roaring laugh a near-perfect match to the blare of a fog horn. "Didn't see that, did we?" he says, turning to his investor, his mischievous smile stretched wide. "The cover of fog is a key selling point for the kinds of ships that sail through these parts, and as you'll come to find, it is a very profitable one. Tomorrow, I will show you the harbor. Tonight, we will drink to the beginnings of a new partnership!" <NAME> slaps his massive, furred hand against the investor's back, moving him along, but not before aiming a soul crushing glare down at me. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) My feet pad softly against the wet steps to my apartment building, avoiding the crunch of leaf litter piled in the corners. Tenement buildings crowd together, their spires jutting up like an enormous mouth full of pitted fangs. Sun doesn't shine here. Ever. Keyhole Village isn't the worst neighborhood we could have ended up in by far, but sometimes the gloom gets to me. Nine stories up, I sneak a peek into an open window. Our small kitchen looks like it's been hit by a rage spell, overturned bowls and measuring spoons spread across the counter. Tashi's balancing the baby on her hip as she conjures minor healing salves from a blend of arrowroot and boar spice to sell at market. She's working under the dim light of a single candle floating uncomfortably close to the loose fabric of her cloak—it's the green cloak with the golden leaves printed on the trim. I seem to remember it fit fine, once. I turn the knob and step inside. House Dimir has nothing on the traps that litter our floor. Wooden blocks sit in wait, ready to impale a bare foot with their sharp corners. 
A wheeled xylophone made of rib bones offers a fast track to a broken neck. I step around them, nearly second nature now, and get ready to break the news to my wife. "Merret! Finally," Tashi says, exasperated. She shoves the baby into my arms, who is almost a year old now, but he's still as fussy and listless as a newborn. He barely weighs anything at all, his nose a constant dribble of snot. Two seconds of holding him, and it's all over my lapels. "Daddy!" Soche, my oldest, comes barreling up to me, head hitting right in my gut. I bite back the pain as I force a smile onto my face. "Soche, shouldn't you be in bed?" I ask. "I wanted to see you, Daddy." "You've been good for your mother today?" "An absolute terror," my wife grates at me. "Broke a bottle of mat'ti root essence. Whole thing ruined! Where are we going to get money to replace that? Money to run the gas lamps so I'm not hunched over this candle all day? Money to feed the baby?" "I brought home a dozen apples yesterday," I remind her, hoping it will stall the next question. #emph[Where's the pay from today?] The job at the docks may be a covert assignment, but the money's real, and it's the only thing keeping us afloat. "They're mush, Merret. Market mush. Baby eats and eats and isn't getting any bigger. He needs real food. The kind you get from a proper grocer. Something that will fill him!" "I need filling, too!" Soche yells, patting her belly. "And mum!" "To bed!" my wife scolds her, and little feet pad against the stone floor. Soche ducks into her sleeping nook next to the unlit hearth, then buries herself in a mound of threadbare covers, tattered warming spells drifting off them like tufts of shed fur. "I~" I open my mouth, but for the first time, I notice how sunken my wife's face has become. A lump catches in my throat, and the words just won't come out. "I didn't—" "Get the food, Merret. I don't care how." She pries the baby from my arms, then starts enchanting her herbal mix again. 
I stand there for a moment, trying to figure out how #emph[this] has become my life. Fog seeps in from under the gap in the front door, twirls around me, like the dreariness of the streets has come to claim its stake inside my home. Inside me. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) Stealing from the grocer isn't nearly as easy as stealing from the market at Keyhole Downs. Oh, they're nice enough here. Seems I've got a personal escort, following five steps behind me, big smile on his face. I try to lose him, snaking up and down the aisles past a display of steaming minced elk pies, a floating pile of blemish-free fruits, and bins containing twelve different types of live maggots for the discerning Viashino. But no matter what I do, the market clerk is still there. I guess the same scarred face that says "don't mess with me" to the vendors of Keyhole Downs, screams "thief" here in this posh neighborhood. #figure(image("001_Under the Cover of Fog/02.png", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) I leave empty handed, but for all my luck, I hear that roar of laughter that has left me cringing on so many occasions. I look up and spot <NAME> and his investor friend coming out of an apartment four floors up—the building massive, top heavy, and drenched in cleansing spells so it's impervious to graffiti. I knew he lived around here, but I hadn't imagined his place being #emph[this] nice. Huge gas lamps cut through the murk, their light glinting off the silver sigils jutted out from the building's polished red stone. I watch as pedestrians scramble between the archways of one market to the next. An enormous indrik stomphowler trudges through the streets, muzzled with so much magic I can feel it sizzle where I'm standing. Throngs of workers cling to the web of harnesses strung across its back, returning home from far-flung districts. Typical evening rush hour. 
Armor-clad centurions in chainmail and sunburst helmets are stationed here, also, ensuring the night traffic remains the legitimate sort. I press closer into the shadows, and once I'm sure my boss is well on his way to the pub, I sneak up to his home. The spell on the door lock is tight. Much too difficult for me to break, but minotaurs, they're too beef-headed to ever think of themselves as potential targets. I round the building, make a quick hop to the balcony, and sure enough, find an unlocked window. I slip inside, like a sheet of fog, feet barely touching the expensive ceramic tiles beneath me. Doubt bites at me. Sure, I've pinched things from the market on occasion, from a few pockets, too, but I've never done anything like this. I nearly turn back, remembering the look of disappointment on my mentor's face as I'd failed to pull a single memory thread after six months of close instruction. "Maybe you aren't meant for House Dimir," she'd said to me. Well, not said. She'd jammed the thought into my mind, easy as breathing. And there it still sits, front and center. I shake it off. My father was a spy. And three of my aunts and an uncle. Sneaking runs in my family. I can do this. After a short trip up a narrow hallway, I find myself in the kitchen. A gas light burns on its lowest setting, just enough to cast a warm, yellow glow upon the cabinetry. There, on the counter, a basket of bread. I take a loaf, feeling how hearty it is, nearly a brick in my hand. It's perfect. But next to the basket, tucked in a wire rack, something else catches my attention. Elixirs, a dozen of them. I pull out one of the bottles, long and rectangular, and made of thick, artisan glass. "Elixir of Focus" the metallic label reads. Inside, blue liquid glistens like it's bathed in the purest moonlight. The bread, it's nice. It'll feed my family tonight, but this~just a few drops of this elixir could change our lives. I could strengthen my magic, prove myself on the docks. 
Work my way back into the favor of the guild. Just a few drops. My boss would never notice what I've taken. I pop the cork, and the smell wafts right up my nose~a soft, cottony scent like that of freshly washed blankets. I open my mouth, tilt the bottle. One drop. Two. Just one more, for good measure. But before the last drop hits my tongue, the lights flicker on full. My eyes go wide, and the elixir spills all over me, down my chin, seeping into my trench coat. I stand there, frozen like a statue as a minotaur enters the kitchen, her eyes half-closed, rollers in her hair, long robe draping nearly to her hooves. Never in my wildest dreams would I have imagined there was a person in all of Ravnica willing to wake up next to Grimbly Wothis every day. A real spy would have taken time to learn these things. How could I kid myself? I'm nothing close to a spy. I'm barely a thief. She yawns, and I see every single tooth in her gummy mouth. Nothing threatening in there, but I'm pretty sure she'd be able to snap me clear in half, if she put her mind to it. I stand there, completely exposed, not even daring to swirl up fog around me. She's half-asleep, half-aware, but I can guarantee she won't stay that way for long. She moves over to the counter across from me, pulls out a large metal bowl, and fills it to the brim with grass and barley. Then she scoops the bowl up into her hands and shuffles back toward me. But the elixir, I'm feeling it now. Scattered thoughts come into focus, and I start flexing muscles I never knew I had. My fingertips glow, and nearly forgotten spells suddenly sit upon my lips. I draw on magic, and her mind opens up to me like a map. I tug here, push there, and suddenly I'm invisible to her. She's inches from touching me, chewing, chewing, chewing~mouth open, eyes distant. #figure(image("001_Under the Cover of Fog/03.jpg", width: 100%), caption: [Unexplained Disappearance | Art by: Izzy], supplement: none, numbering: none) Guilt overwhelms me. 
I'd wasted so much of the elixir. I should apologize. Offer to pay it back. But we can't afford that kind of debt, especially with what her husband pays me. #emph[When] he pays me. Besides, if House Dimir finds out I'm this awful at espionage, I'd be disappeared for good. I'm doing the right thing, staying quiet. Even if I have to stand here all night. I suck in my breath and clench the bread loaf to my chest like it's my lifeline, taking comfort that it will soon feed my hungry child. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) A blast of magic erupts from my fingertips, fog leaps out of my way, and for the first time since I've worked the docks, the river is clear as far as the eye can see. It's not much of a sight—muddy waters, riddled with trash and clumps of invasive river plants. I can't help but wonder if the mystery would have been better for Grimbly Wothis than his investor now seeing the naked truth. It just isn't that great of a harbor, but that's no problem of mine. I grow antsy, all this power at my fingertips, wanting to show off a little in front of the other dockworkers. Yantis is operating the crane, a Viashino with sticky fingers, the kind great for pulling levers and coaxing gears. But his forked tongue has aimed more than a few reptilian curses my way, and a little payback sounds right on time. I recount the nightmare spell I'd been taught. It never materialized into more than haze before, but now, Yantis has ribbons steaming off his brain, just waiting for me to give them a yank. Power wells within me, so fast, so hard, I can't control it. Yantis screams, fighting all the terrifying nothings in front of him. The boom swivels left, the crate drops loose and goes tumbling, tumbling toward Grimbly Wothis and the investor standing at the edge of the dock. My boss sees the rogue crate, sees Yantis flailing, sees the last few shreds of nightmare spells drifting from my fingertips. 
He scowls at me, then pushes the investor into the river at the last second. He barely has time to jump in himself before the crate smashes right where they'd been standing. Glass cracks, and the sharp scent of good whiskey fills the air. The surveillance bug hums in my ear again, tiny wings flapping, eyes pointed right at me. Nope, there's nothing covert about ruining a thousand zinos worth of cargo. I wince. Losing my job, I could handle. But once House Dimir comes knocking at my door, it'll be like I never existed. Heh. Like they'd knock. Fast as I can, I run home. We'll have to pack up whatever we can and leave Keyhole, maybe go into hiding in the old Ghost Quarter or seek refuge at the ruins of Mahovana, claiming the treetops as our new home. I turn the knob on our front door so hard that the lock shatters, remnants of weak magic slipping away like wisps into the air. Tashi stands there, holding the baby, giant smile plastered on her face. "Merret! Merret, you've got to see this!" She holds the baby up. He's flourishing. Cheeks plump, his gummy smile glistening, and an undeniable spark in his eyes. "He's so strong, now. Feel his muscles. I think he's going to walk any day." And then she's pulling me close, kissing my cheek, telling me how she loves me and I can't even get a word in about how our lives are about to change, and not for the better. "Everything's going to be okay," she's saying, but me, I'm just staring at that bright blue elixir stain on the chunk of bread baby's been gnawing on. Watching how it shimmers, ever so slightly, like moonlight. Then baby sneezes, and every single candle in our apartment bursts into flame. Something's happened. Good or bad, I don't know. No time to think with the beating on our front door. I wedge my weight against it. <NAME> is hollering from the other side about how he knew it was me who caused the incident, and that I'd ruined his cargo as well as scared off his investor. 
They say merfolk cuss like you've never heard, but dock bosses have them beat, hooves down. With a broken lock, this door won't hold him back for long. I whisper for Tashi to hide in the cupboard with the baby, and for Soche to duck into her sleeping nook and cover herself with blankets. Me~there's no room left in our little hovel to hide. Doesn't matter anyway, because when that big hoof hits the flimsy door, splinters fly, and I take to the air, landing hard on my chin. It takes a moment for the fog #emph[inside] my head to clear, but soon as I'm able, I reach out between me and <NAME>, trying to pull those magic threads, trying to shield myself from sight, but it's useless. Now, <NAME> is standing over me, brow bent, his stare as sharp as the tips of his horns. Bits of flotsam cling to his body, and he smells like a striking combination of dank river and wet fur. "You owe me, Merret." He takes one look around my home and laughs his roaring laugh, as if the idea of me possessing anything of value were a huge joke. "I'd take it out of your pay, but you'd spend three lifetimes earning back the cost of that whiskey. Then I thought I'd just take it out on your hide, but it seems you #emph[do] have something of considerable value after all." My heart constricts in my chest and doesn't let go. I watch his eyes track to our kitchen. "I'll do anything," I tell him, scrambling between him and the cupboard. "Clear the harbor every waking hour. Double shifts. My wife! My wife will work, too. We'll pay off whatever we owe you, I promise." "I saw what that child did through the window, the trick with the candles." His hoof knocks across my shin and I bite back the pain. Another kick, right in the ribs, and I crumple into a ball. Then he's past me, throwing open the door to the cupboard. Tashi is inside, whimpering, the baby asleep against her chest. The sight of my wife suffering, of my child in danger, ignites my fury, and I'm up to my feet again. 
I conjure the magic~before it had been a chore, like sucking hard through a cracked straw, but now it enters me with a flow as unrestrained as the river. "A child like this is worth something," <NAME> says, attempting to pry the baby from my wife's arms. She bucks and bites and screams, and now the baby is awake and howling. The tips of my fingers dance with light, and the threads of my boss's mind open up to me. I pull and tug, weaving a nightmare, especially for him, constructed of his deepest fears. Now <NAME> is screaming, too, a piercing and perfectly pitched note that rattles the glass of our gas lamps. He fights the invisible foes before him, throwing pots and pans, tipping over chairs. He's stomping all over the place, not watching where he's going. My nerves go tight as he stomps closer to the pile of blankets Soche is hiding under. #emph[Those hooves] ~my focus wanes, just for a moment, but it's enough for Grimbly Wothis to throw off the nightmares and make a run for my son. #figure(image("001_Under the Cover of Fog/04.jpg", width: 100%), caption: [Smelt-Ward Minotaur | Art by: <NAME>], supplement: none, numbering: none) And like that, my baby is in Grimbly Wothis's arms, back arched, letting loose a heart-wrenching scream that tears me up inside. "As always, you've got no focus, Merret," Grimbly Wothis scolds me. "But we're even now." "Give me back my—" Grimbly Wothis raises his leg high, and for a moment, I'm mesmerized by the draw of all that firm muscle, then his hoof lands square in my mouth and my world explodes with pain. I catch blood in my cupped hands, but they can't hold all of it. I must have blacked out for a moment, because Wothis is already at the door, trying to maneuver his horns through the opening while the baby writhes and my wife grips at the fur on his thigh. With a hard shake, he flings her off. She goes flying and hits the side of a cabinet. Something cracks. Something that's not old wooden cabinetry. 
I focus as hard as I can, ignoring my child's screams and the awful whimpers coming from my wife. I pull at magic, trying to wrap a noose around my boss' thick neck, but the flow is back to a trickle now. Whatever he feels, it's no more than a scratch in his throat. He coughs once, then looks back at me. Laughs. "See you at the docks tomorrow, bright and—" His eyes go wide, his breathing chokes off. I look down at my fingers, dull as dirt. Not even a breath of magic stirs around me, but Grimbly Wothis has been gripped by the mind, I'm sure of it. I catch a glimpse of the intensity in my child's eyes. My son arches his back again, throws his arms up, and suddenly, he's gone. Disappeared. Vanished. "What did you do with my baby?" my wife screams out, gripping her broken ribs. My brave Soche has come out of her hiding spot, and now she's pitching wooden blocks at Wothis. One hits him square in the forehead. "Stop! You'll hit the baby!" I say, scrambling over, trying to see through the baby's cloak. I feel for him in my boss's arm, but there's nothing. Panic overwhelms me. Had he dropped him? Grimbly Wothis starts coughing, sucking in huge amounts of air as he regains his composure. Bloodshot eyes stare down at me. "Where's the baby?" he says, like he's accusing me of the baby's disappearance. I'm so angry, I can't think straight and punch him square in the jaw. His nostrils flare, and his eyes soften like I've just given him permission for this to be a real fight. My fists are up, and then we're scrapping, and I'm trying to push him towards the door, and he's trying to fight his way back in, and then Tashi screams the baby's name, and we all stop and stare. The baby's sitting there on the floor. He's got scratches on his arms and is holding a strange purple fruit shaped like a star. I've never seen anything like it. He puts it in his mouth, the bitter skin making his lip pucker tight. He drops the fruit, and then pushes up on all fours, about to crawl. 
Grimbly Wothis is trying to force his way past me, but I hold him back with all my might. "Go to mama," I tell the baby. "Go to mama!" But the baby isn't listening. His eyes are focused across the room. Then I see the near-shadow sitting in the armchair by the hearth. We all see it. Him. And I realize somewhere deep in the back of my brain that he's been sitting there a long, long time. He's draped in a flowing leather cloak, made from the hide of some beast that had gone extinct ages ago~he's regal, even upon the throne that is our rickety armchair. All of the magic in the room, in this apartment block, maybe in this whole neighborhood is flowing toward him, like a sinkhole that's suddenly opened up in the middle of an unsuspecting lake. I shake my head, trying to rid myself of improbable thoughts. Could this be Lazav? Lazav the Mastermind, Guildmaster of House Dimir? Every aching bone in my body wants to bow in his presence, though doing so would be the worst indiscretion I could make. The baby pushes up again, and suddenly he's standing~wobbling back and forth and back again, before taking his first timid step. He smiles for a moment, proud of himself, then takes another step, and another, until momentum gets the best of him, and he falls right into Lazav's arms. Lazav hoists the baby up into his lap. #figure(image("001_Under the Cover of Fog/05.jpg", width: 100%), caption: [Lazav, the Multifarious | Art by: <NAME>], supplement: none, numbering: none) "Any outstanding debts Merret owes you will be paid in full by the close of business tomorrow," Lazav says to my boss. "And in return, you will refrain from further contact with any member of this family. Isn't that right, <NAME>?" "Who the hell do you think you are?" <NAME> says, puffing up to his full breadth, head tilted forward, horns ready for a battle. "No one," Lazav says, his voice as hollow as a whisper, but there's nothing soft about it. 
He waves a hand and the entire room starts spinning, spells blazing silvery light in circles around the edges of our home. I cling to the floor, feeling like the weight of the world is pressing upon my lungs. It spins faster, faster—furniture quaking, walls shaking, windows warping and on the verge of shattering out of their panes. Then everything comes to a screeching halt. For a long moment, there is absolute silence, then <NAME> mutters, "Okay. Sounds good. Whatever you say," and stumbles dizzily out of the house, nearly toppling over the balcony railing. "Good," Lazav says, smiling at me now, my son happily gnawing on one of his knuckles. "This babe will astound us in every way you've disappointed us." "You won't have my son," I say, respectful, yet firm. "We don't want your son. At least not in that way. He will stay with you. You will raise him as you see fit. But in return for paying off your debts, we would like to ask that we send a tutor to your home to oversee his learning. Of course, we will also provide you a modest stipend so that you can adequately provide for his needs. And yours." My jaw has dropped. I go over to Tashi, pulling her gently into me. I try to push away some of her pain, then we just stare at each other, dumbfounded, each grasping for questions to ask and falling short. "Is my brother special?" comes Soche's voice, a terror-ridden peep. Lazav laughs a raspy laugh, like stones scratching against rib bone. Something in my brain twists sideways, my mind fogs over, and then all of a sudden, we're all laughing, and Great Aunt Bea is sitting in our armchair, bouncing the baby on her knee. Soche's playing a tune on her xylophone, and Tashi's in the kitchen, chopping up some strange purple fruit she must have gotten at the market. I go stand next to her, and she smiles at me, then places a bit of the sweet pulp on my tongue. As I chew, my jaw aches some, like I'd been punched in the mouth. 
"You're sure you're okay with my aunt staying with us for a while?" she asks. "Just until she gets back on her feet? She won't be much trouble, and she can help keep an eye on the baby while I get some work done." "Of course, it's okay. I like her," I say. "There's just something about her, you know? That wisdom that comes with old age? I think she'll be good for our family."
https://github.com/Toniolo-Marco/git-for-dummies
https://raw.githubusercontent.com/Toniolo-Marco/git-for-dummies/main/book/git-advanced.typ
typst
#import "@preview/fletcher:0.5.1" as fletcher: diagram, node, edge, shapes
#import fletcher.shapes: diamond
#import "components/gh-button.typ": gh_button
#import "components/git-graph.typ": branch_indicator, commit_node, connect_nodes, branch

#show ref: it => emph(text(blue)[#it])

= Git Advanced

== Git Clean

The `git clean` command is a destructive command that is normally used to remove *untracked files*. Files deleted by it cannot be recovered through git, which is why it requires the `-f` option in order to run. Several options can be combined with this command to obtain the desired result; here is the list:

```bash
➜ git clean           # Alone, this will always produce the output below
fatal: clean.requireForce is true and -f not given: refusing to clean
➜ git clean -n        # Preview the files that would be deleted
➜ git clean --dry-run # Same as -n
➜ git clean -d        # Remove untracked directories in addition to untracked files
➜ git clean -e <expr> # Exclude files matching the given pattern from being removed
➜ git clean -X        # Remove only files ignored by Git
➜ git clean -x        # Delete all untracked files, including those ignored by Git
➜ git clean -i        # Interactive mode
➜ git clean -f        # Actually execute git clean
➜ git clean -ff       # Also remove untracked directories that contain Git repositories
➜ git clean -q        # Suppress the output
```

== Git Revert

Staying on the topic of undoing changes, the `git revert` command is particularly useful. Essentially, what this command does is the opposite of a `git diff`: given a commit as a reference, the differences that commit introduced are applied in reverse, and a new commit is created automatically. One advantage of `git revert` is that it does not rewrite history, besides being _safe_: it is always possible to go back to the previous commit and resume from there, deleting the revert commit if it turns out to be unnecessary. 
As an example, let's use the one provided by Atlassian on the page dedicated to #link("https://www.atlassian.com/git/tutorials/undoing-changes/git-revert")[this command]. The code is reproduced below for convenience:

```bash
➜ git init .
Initialized empty Git repository in /git_revert_test/.git/
➜ touch demo_file
➜ git add demo_file
➜ git commit -am "initial commit"
[main (root-commit) 299b15f] initial commit
1 file changed, 0 insertions(+), 0 deletions(-)
create mode 100644 demo_file
➜ echo "initial content" >> demo_file
➜ git commit -am "add new content to demo file"
[main 3602d88] add new content to demo file
1 file changed, 1 insertion(+)
➜ echo "prepended line content" >> demo_file
➜ git commit -am "prepend content to demo file"
[main 86bb32e] prepend content to demo file
1 file changed, 1 insertion(+)
```

This series of commands should come as no surprise; the state looks like this: \ \ \
#figure(
  align(center)[
    #scale(85%)[
      #set text(10pt)
      #diagram(
        node-stroke: .1em,
        node-fill: none,
        spacing: 4em,
        mark-scale: 50%,
        // main branch
        branch(
          name:"main",
          color:blue,
          start:(0,0),
          length: 3,
          head:2,
          commits: ("initial commit", "add new content", "prepend content")
        ),
      )
    ]
  ],
)<git-to-revert>

Continuing with the suggested commands:

```bash
➜ git revert HEAD
[main b9cd081] Revert "prepend content to demo file"
1 file changed, 1 deletion(-)
```

Running the command will most likely open the editor, which lets us edit the commit message or keep the default one.
Either way, the result will be: \ \ \
#figure(
  align(center)[
    #scale(85%)[
      #set text(10pt)
      #diagram(
        node-stroke: .1em,
        node-fill: none,
        spacing: 4em,
        mark-scale: 50%,
        // main branch
        branch(
          name:"main",
          color:blue,
          start:(0,0),
          length: 4,
          head:3,
          commits: ("initial commit", "add new content", "prepend content", "Revert \"prepend ...\"")
        ),
      )
    ]
  ],
)<git-reverted>

The content of demo_file is now:

```bash
➜ cat demo_file
initial content
```

The direct consequences of the advantages listed above make this method the best choice when working in a team, since it never requires a `git push --force`.

== Git Reset

`git reset` is a rather nuanced command. Like the previous ones, it deals with undoing changes, and it is based on the concept of the _three trees_ (working directory, staging index, and commit history). In general, what it does is move HEAD to the commit passed as an argument:

#figure(
  align(center)[
    #scale(85%)[
      #stack(
        dir:ltr,
        spacing:15%,
        [
          #set text(10pt)
          #diagram(
            node-stroke: .1em,
            node-fill: none,
            spacing: 4em,
            mark-scale: 50%,
            // main branch
            branch(
              name:"main",
              color:blue,
              start:(0,0),
              length: 3,
              head:2,
              commits: ("","","")
            ),
          )
        ],
        [
          #set text(10pt)
          #diagram(
            node-stroke: .1em,
            node-fill: none,
            spacing: 4em,
            mark-scale: 50%,
            // main branch
            branch(
              name:"main",
              color:blue,
              start:(0,0),
              length: 3,
              head:0,
              commits: ("","","")
            ),
          )
        ]
      )
    ]
  ],
)

It comes with three main options: `--soft`, `--mixed` (the default), and `--hard`. These determine what happens to the changes contained in the commits that come after the one HEAD is moved to.

- `git reset --soft <commit-hash>` places all the changes from the later commits in the staging index. This option is very useful for combining the changes of the last n commits, which have not been pushed yet, into a single commit.
- `git reset --mixed <commit-hash>` keeps all those changes as unstaged
- `git reset --hard <commit-hash>` discards all the changes entirely.#footnote([The `git reset --hard` command, without arguments, discards all current changes (both staged and unstaged).])

== Git RefLog

The `git reflog` tool lists every movement of HEAD in the local repo, so in a way it represents the last chance to recover changes we have lost. Consider the situation after the revert: \ \ \
#figure(
  align(center)[
    #scale(85%)[
      #set text(10pt)
      #diagram(
        node-stroke: .1em,
        node-fill: none,
        spacing: 4em,
        mark-scale: 50%,
        // main branch
        branch(
          name:"main",
          color:blue,
          start:(0,0),
          length: 4,
          head:3,
          commits: ("initial commit", "add new content", "prepend content", "Revert \"prepend ...\"")
        ),
      )
    ]
  ],
)

Suppose we now make a mistake and run `git reset --hard HEAD~2` (instead of `HEAD~1`); we can search the output of `git reflog` for the commit with the message _"prepend content ..."_

#grid(
  columns:(2fr,5fr),
  column-gutter: 15%,
  [#scale(85%)[
    \ \ \
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      // main branch
      branch(
        name:"main",
        color:blue,
        start:(0,0),
        length: 2,
        head:1,
        commits: ("initial commit", "add new content",)
      ),
    )]
  ],
  [
    ```bash
    f0eb5e3 (HEAD -> main) HEAD@{0}: reset: moving to HEAD~2
    42fcfa5 HEAD@{1}: revert: Revert "prepend content to demo file"
    558d9f6 HEAD@{2}: commit: prepend content to demo file
    f0eb5e3 (HEAD -> main) HEAD@{3}: commit: add new content to demo file
    fd3f7bd HEAD@{4}: commit (initial): initial commit
    ```
  ]
)

With a bit of luck, we can thus recover the hash of the commit, which in our case is `558d9f6`.#footnote([If you check out a commit that is not the direct successor of the one HEAD currently points to, the intermediate commits are recovered as well.])

== Interactive Staging

Git lets us stage changes interactively @git-interactive-staging with the `git add -i` (or `git add --interactive`) command. The output is very detailed about the individual files and their state, and below it we get a menu with several commands:

```bash
➜ git add -i
           staged     unstaged path
  1:    unchanged        +0/-1 TODO
  2:    unchanged        +1/-1 index.html
  3:    unchanged        +5/-1 lib/simplegit.rb

*** Commands ***
  1: [s]tatus     2: [u]pdate     3: [r]evert     4: [a]dd untracked
  5: [p]atch      6: [d]iff       7: [q]uit       8: [h]elp
What now>
```

From this point on we are in an interactive shell; let's go through the options:
- `status`: shows the state of the files, including how many lines were added or removed.
- `update`: moves files into the staging area.
- `revert`: removes files from the staging area.
- `add untracked`: adds untracked files.
- `patch`: allows partial staging, working on individual parts of a file.
- `diff`: shows the differences between the index and HEAD.

Each of these commands operates on files selected by typing the corresponding number; to select several files, use a comma or a space as separator. The selected files are marked with an asterisk in the next output. Pressing Enter confirms the selection and brings us back to the main menu. Alternatively, an asterisk ("\*") selects all files at once, and the operation is applied to all of them.
- `quit`: exits the interactive shell.
- `help`: shows the available commands and their descriptions.

== Avoid useless commits with Git Stash

If you have ever written code, with or without git, you will have noticed that we often want to experiment with new ideas or make changes without creating a commit for every single modification. In many other cases we don't yet have enough material for a commit, but we still want to save the work done so far.
From what we have learned so far, we would have to make a commit and then use the `--amend` option to edit it with the new changes each time. In these cases we can use `git stash` instead. @git-stash-tutorial

The stash operation takes *all the changes to tracked files, plus the files in the staging area*, and saves them on a temporary stack. This brings us back to a clean working directory, in sync with HEAD, and lets us re-apply the changes at a later time. The command to use is `git stash push` (equivalent to `git stash` with no arguments).

== Squash

We may often have made unnecessary commits, or we may want a single commit for the feature we are developing, so as to have a cleaner, more understandable history and PRs that do not contain useless commits. The idea is to turn the workflow seen in @workflow[Figure] into a workflow like this:

#figure(
  align(center)[
    #scale(85%)[
      #set text(10pt)
      #diagram(
        node-stroke: .1em,
        node-fill: none,
        spacing: 4em,
        mark-scale: 50%,
        // master branch
        branch(
          name:"main",
          color:blue,
          start:(0,0),
          length: 7,
          commits: ("",none,none,none,none,none,"")),
        edge((7,0),(8,0),"--",stroke:2pt+blue), //... other commits
        // develop branch
        connect_nodes((1,0),(2,1),orange),
        branch(
          name:"develop",
          color:orange,
          start:(1,1),
          length:5,
          commits:("",none,none,"","")),
        connect_nodes((6,1),(7,0),orange),
        // feature branch
        connect_nodes((2,1),(4,2),yellow),
        branch(
          name:"feature",
          color:yellow,
          start:(3,2)),
        connect_nodes((4,2), (5,1),yellow),
        // 2nd feature branch
        connect_nodes((2,1),(4,3),teal),
        branch(
          name:"2nd feature",
          color:teal,
          start:(3,3)),
        connect_nodes((4,3), (6,1),teal),
      )
    ]
  ],
  caption: [Workflow Diagram with squashed commits]
)<squashed-commits-workflow>

In these cases we can use the `git rebase -i HEAD~n` command, where `n` is the number of commits we want to combine $(1,2,3 ...)$.
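Before walking through the interactive editor flow, here is a scriptable sketch of the same kind of squash, built on `git reset --soft` instead of `rebase -i` (the repo, file name, and messages are made up for the demo; the end state is equivalent when squashing the last n commits of the current branch):

```shell
# Hypothetical demo repo: squash the last 3 commits without the
# interactive editor, via `git reset --soft`.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q .
git config user.email "demo@example.com"
git config user.name "Demo"
git commit -qm "initial" --allow-empty
for n in 1 2 3; do
  echo "$n" >> numbers.txt
  git add numbers.txt
  git commit -qm "added $n"
done
git reset -q --soft HEAD~3        # keep the combined changes staged
git commit -qm "added some numbers"
git log --oneline                 # history now shows a single squashed commit
```

Unlike `rebase -i`, this keeps no record of the individual messages, so it fits best when a single new message is all you want.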
What we get depends on the text editor configured as the default; in any case, the steps are always the same. Here we use `vscode` with the `git-lens` and `git-graph` extensions. As a starting point we have the following log: \ \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(name:"main",color:blue,start:(0,0)),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        indicator-xy:(5,0),
        color:yellow,
        start:(1,0),
        length:3,
        head: 2,
        commits: ("added 1","added 2","added 3")
      ),
      // add-letters branch
      connect_nodes((1,0),(2,1),orange),
      branch(
        name:"add-letters",
        color:orange,
        start:(1,1),
        length: 5,
        alignment: bottom,
        commits:(
          "added one \"A\"",
          "added another \"A\"",
          "added 3rd \"A\"",
          "added \"A\"",
          "5 \"A\" in 5 commits")
      ),
    )
  ]
]\ \

From the last commit on main we created 2 distinct branches. We are currently on the last commit of the `add-numbers` branch and we want to combine its 3 commits into a single one. To do so we run `git rebase -i HEAD~3`, and our editor will show something like this:

#image("img/rebase-git-lens-1.png")

In general, for each commit we can choose to:
- `pick`: keep the commit
- `reword`: change the commit message
- `edit`: stop at the commit to make changes
- `squash`: combine the commit with the previous one
- `drop`: delete the commit #footnote("Note that this option may cause merge conflicts if one of the later commits in the branch depends on the changes of the dropped commit.")
- `fixup`: like `squash`, but the commit's own message is discarded

In this case we want to combine the 3 commits into one, so that a single commit describes the addition of the 3 numbers.

#image("img/rebase-git-lens-2.png")

At this point we save and *close the tab*; the interactive procedure will now offer different options depending on the choices we just made.
In this case it opens the `COMMIT_EDITMSG` file and asks us to write the message for the commit that will contain the 3 combined commits. A file like this will appear in a tab:

```bash
added 1

# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date:      Mon Sep 16 09:39:13 2024 +0200
#
# interactive rebase in progress; onto f3c8de5
# Last command done (1 command done):
#    reword 52ddcf4 added 1
# Next commands to do (2 remaining commands):
#    squash 6438329 added 2
#    squash aa86ec5 added 3
# You are currently editing a commit while rebasing branch 'add-numbers' on 'f3c8de5'.
#
# Changes to be committed:
#	modified:   README.md
#
```

Now we simply write the message we want, deleting the initial text, then save and close the tab. The last tab presented is a summary of the interactive rebase we have just performed; we close that one as well.#footnote("If you don't want the messages of the other commits, it is enough to comment them out with `#`.")

With that, the rebase is complete and we have a single commit describing the addition of the 3 numbers: \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(
        name:"main",
        color:blue,
        start:(0,0)
      ),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        color:yellow,
        start:(1,0),
        indicator-xy: (2.75,0),
        length: 1,
        head:0,
        commits: ("added some\n numbers",),
        alignment: top
      ),
      // add-letters branch
      connect_nodes((1,0),(2,1),orange),
      branch(
        name:"add-letters",
        color:orange,
        start:(1,1),
        length: 5,
        alignment: bottom,
        commits:(
          "added one \"A\"",
          "added another \"A\"",
          "added 3rd \"A\"",
          "added \"A\"",
          "5 \"A\" in 5 commits")
      ),
    )
  ]
]\ \

We could apply the same procedure to the `add-letters` branch and combine its 5 commits into a single one.
So we run `git switch add-letters` and then `git rebase -i HEAD~5`... at the end of the procedure we get a log like this: \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(
        name:"main",
        color:blue,
        start:(0,0)
      ),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        color:yellow,
        start:(1,0),
        indicator-xy: (2.75,0),
        length: 1,
        angle: 45deg,
        commits: ("added some\n numbers",),
        alignment: top
      ),
      // add-letters branch
      connect_nodes((1,0),(2,1),orange),
      branch(
        name:"add-letters",
        color:orange,
        start:(1,1),
        indicator-xy: (2.75,1),
        length: 1,
        head:0,
        alignment: bottom,
        angle: 45deg,
        commits:(
          "added multiple \n\"A\"",)
      ),
    )
  ]
]\ \

So far, however, we have always worked on local branches; what about remote ones?

- If the branch does not exist in the remote repository we want to push to, and we have the permissions, a simple `git push --set-upstream origin branch-name` is enough.
- Similarly, if the branch exists but we have not yet pushed any of the commits we want to _squash_, a plain `git push` will do.
- If we don't have push permissions, we will have to open a PR, or fork the repository and open a PR from our fork.
- If we have already pushed the commits we want to _squash_, after the rebase we will find ourselves in a situation like this: \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(
        name:"main",
        color:blue,
        start:(0,0)
      ),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        color:yellow,
        start:(1,0),
        indicator-xy: (2.75,0),
        length: 1,
        commits: ("added some\n numbers",),
        alignment: top
      ),
      // add-letters branch
      connect_nodes((1,0),(5,1),orange),
      branch(
        name:"add-letters",
        // remote:"origin",
        color:orange,
        start:(4,1),
        indicator-xy: (4.75,0.5),
        length: 1,
        head:0,
        alignment: bottom,
        commits:("added multiple \n\"B\"",)
      ),
      // remote add-letters branch
      connect_nodes((1,0),(2,2),green),
      branch(
        name:"my-fork/add-letters",
        color:green,
        start:(1,2),
        length: 3,
        alignment: bottom,
        commits: (
          "added multiple \n\"A\"",
          "removed all \n\"A\"",
          "added multiple \n\"B\"")
      ),
    )
  ]
]\ \

Note that the remote branch does not update automatically. If we attempt a `git push my-fork`, we get something like:

```bash
➜ git push my-fork
To https://github.com/account/repo.git
 ! [rejected]        add-letters -> add-letters (non-fast-forward)
error: failed to push some refs to 'https://github.com/account/repo.git'
hint: Updates were rejected because the tip of your current branch is behind
hint: its remote counterpart. If you want to integrate the remote changes,
hint: use 'git pull' before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.
```

Following the hints will get us nowhere in this case: for the same reason the `push` failed, the `git pull my-fork add-letters` command will not work either.
We nevertheless show the procedure git suggests:

```bash
➜ git pull my-fork add-letters
From https://github.com/account/repo
 * branch            add-letters -> FETCH_HEAD
hint: You have divergent branches and need to specify how to reconcile them.
hint: You can do so by running one of the following commands sometime before
hint: your next pull:
hint:
hint:   git config pull.rebase false  # merge
hint:   git config pull.rebase true   # rebase
hint:   git config pull.ff only       # fast-forward only
hint:
hint: You can replace "git config" with "git config --global" to set a default
hint: preference for all repositories. You can also pass --rebase, --no-rebase,
hint: or --ff-only on the command line to override the configured default per
hint: invocation.
fatal: Need to specify how to reconcile divergent branches.
```

If we apply the suggested configuration and retry the pull, we get:

```bash
➜ git pull my-fork add-letters
From https://github.com/account/repo
 * branch            add-letters -> FETCH_HEAD
hint: Diverging branches can't be fast-forwarded, you need to either:
hint:
hint:   git merge --no-ff
hint:
hint: or:
hint:
hint:   git rebase
hint:
hint: Disable this message with "git config advice.diverging false"
fatal: Not possible to fast-forward, aborting.
```

As we have already seen, a merge would produce a merge commit, while a rebase would replay one of the two branches on top of the other; neither of these is the solution we are looking for. In this case the simplest, and *riskiest*, solution is to _rewrite history_, that is, to use the `--force` option of `git push`.
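A slightly safer variant worth knowing is `git push --force-with-lease`, which refuses to overwrite the remote branch if someone else pushed to it since our last fetch. The following self-contained sketch reproduces the scenario above with a local bare repository standing in for the `my-fork` remote (all names are illustrative):

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/my-fork.git"            # stand-in for the fork
git init -q "$work/repo" && cd "$work/repo"
git config user.email "demo@example.com"
git config user.name "Demo"
git commit -qm "initial" --allow-empty
git branch -M add-letters
git remote add my-fork "$work/my-fork.git"
git push -q my-fork add-letters
git commit -qm 'added multiple "A"' --allow-empty
git push -q my-fork add-letters
# rewrite history locally, as the squash in this chapter does
git reset -q --hard HEAD~1
git commit -qm 'added multiple "B"' --allow-empty
git push -q my-fork add-letters 2>/dev/null \
  && echo "unexpected" || echo "rejected as expected (non-fast-forward)"
git push -q --force-with-lease my-fork add-letters   # lease still valid: succeeds
```

If a teammate had pushed to `add-letters` in the meantime, the last command would be rejected instead of silently discarding their work, which a plain `--force` would do.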
This completely rewrites the history of the remote repository and brings the corresponding remote branch to the same commit as the local one, giving: \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(
        name:"main",
        color:blue,
        start:(0,0)
      ),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        color:yellow,
        start:(1,0),
        indicator-xy: (2.75,0),
        length: 1,
        commits: ("added some\n numbers",),
        alignment: top
      ),
      // add-letters branch
      connect_nodes((1,0),(3,1),orange),
      branch(
        name:"add-letters",
        remote:"my-fork",
        color:orange,
        start:(2,1),
        indicator-xy: (4,1),
        length: 1,
        head:0,
        alignment: bottom,
        commits:("added multiple \n\"B\"",)
      ),
    )
  ]
]\ \

It is important to note that if other people are using that remote branch, they will no longer be able to push to it without forcing in turn, and so on. A "safer" solution would be to create a local branch and switch to it before performing the rebase.
This way, after the rebase, we have: \
#align(center)[
  #scale(90%,x:85%)[
    #set text(10pt)
    #diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch(
        name:"main",
        color:blue,
        start:(0,0)
      ),
      // add-numbers branch
      connect_nodes((1,0),(2,0),yellow),
      branch(
        name:"add-numbers",
        color:yellow,
        start:(1,0),
        indicator-xy: (2.75,0),
        length: 1,
        commits: ("added some\n numbers",),
        alignment: top
      ),
      // local branch
      connect_nodes((1,0),(5,1),orange),
      branch(
        name:"local-branch",
        // remote:"origin",
        color:orange,
        start:(4,1),
        indicator-xy: (4.75,0.5),
        length: 1,
        head:0,
        alignment: bottom,
        commits:("added multiple \n\"B\"",)
      ),
      // remote add-letters branch
      connect_nodes((1,0),(2,2),green),
      branch(
        name:"add-letters",
        remote:"my-fork",
        color:green,
        start:(1,2),
        length: 3,
        alignment: bottom,
        commits: (
          "added multiple \n\"A\"",
          "removed all \n\"A\"",
          "added multiple \n\"B\"")
      ),
    )
  ]
]\ \

This also lets us keep the intermediate commits around in case we need them: we will delete the branch once they are no longer needed.

//TODO: Other merging strategies
//TODO: Cherry-pick
https://github.com/danbalarin/vse-typst-template
https://raw.githubusercontent.com/danbalarin/vse-typst-template/main/lib/headings.typ
typst
#let heading-blocks = (
  none,
  (it) => {
    pagebreak(weak: true)
    block(below: 40pt)[
      #set align(left + horizon)
      #set text(20pt, weight: "black")
      #smallcaps(it)
    ]
  },
  (it) => {
    block(above: 24pt, below: 16pt)[
      #set align(left + horizon)
      #set text(14pt, weight: "bold")
      #smallcaps(it)
    ]
  },
  (it) => {
    block(above: 20pt, below: 9pt)[
      #set align(left + horizon)
      #set text(12pt, weight: "bold")
      #smallcaps(it)
    ]
  },
)
https://github.com/barddust/Kuafu
https://raw.githubusercontent.com/barddust/Kuafu/main/src/Logic/build.typ
typst
#{
  import "/config.typ": project
  import "/mathenv.typ": *
  show: mathenv-init
  project(
    "夸父:逻辑",
    "0.1",
    "Logic",
    (
      "intro.typ",
      "sentential-logic.typ",
    ),
    bio: false
  )
}
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cuti/0.1.0/lib.typ
typst
Apache License 2.0
#let fakebold(base-weight: none, s) = {
  set text(stroke: 0.02857em)
  set text(weight: base-weight) if base-weight != none
  s
}

#let regex-fakebold(reg-exp: ".", base-weight: none, s) = {
  show regex(reg-exp): it => {
    fakebold(base-weight: base-weight, it)
  }
  s
}

#let show-fakebold(reg-exp: ".", base-weight: none, s) = {
  show text.where(weight: "bold").or(strong): it => {
    regex-fakebold(reg-exp: reg-exp, base-weight: base-weight, it)
  }
  s
}

#let cn-fakebold(s) = {
  regex-fakebold(reg-exp: "[\p{script=Han} !-・〇-〰—]", base-weight: "regular", s)
}

#let show-cn-fakebold(s) = {
  show-fakebold(reg-exp: "[\p{script=Han} !-・〇-〰—]", base-weight: "regular", s)
}
https://github.com/jonathan-iksjssen/jx-style
https://raw.githubusercontent.com/jonathan-iksjssen/jx-style/main/README.md
markdown
# jx-style

**@joniksj**'s personal style package for his `typst` documents. Recommended for use if you are him, or if you want to make your documents look like his.

# Installation

Of course, to use this, you'll need [`typst`](https://github.com/typst/typst). If you haven't heard, `typst` is a newfangled document typesetting system akin to LaTeX, also exporting directly to `.pdf`, that uses more modern syntax *(thank god)*, familiar programming concepts, and a powerful styling engine, which makes it perfect for me, **@joniksj**, who wanted an alternative to Google Docs but didn't want to learn LaTeX.

## Dependencies

- An installation of **Typst**, either the binary from the [releases page](https://github.com/typst/typst/releases/) or from **@nvarner**'s `typst-lsp` for VS Code(ium)
- A text editor to actually test this out with (I recommend VS Code(ium) with `typst-lsp`)
- Git CLI *(optional, to make things easier)*

## Installation Instructions

<details>
<summary> With `git clone` </summary>

**You will need to use `git` for this.**

1. Navigate to your `typst` local packages directory, which is:
    - `$XDG_DATA_HOME/typst/packages` or `~/.local/share/typst/packages` on Linux
    - `~/Library/Application Support/typst/packages` on macOS
    - `%APPDATA%/typst/packages` on Windows
2. Make a folder called `local` *(or whatever you want to name your package namespace)*
3. In that `local` folder, open a terminal and run:
    ```bash
    git clone https://github.com/jonathan-iksjssen/jx-style.git
    ```
4. Make a new `.typ` file anywhere
5. Import the package with
    ```c
    #import "@local/jx-style:0.2.0": *
    ```
    Replace `local` with whatever you named your package namespace in step 2.
</details>

<details>
<summary> Without `git clone` </summary>

1. Download the `.zip` file of this repository.
2. Navigate to your `typst` local packages directory, which is:
    - `$XDG_DATA_HOME/typst/packages` or `~/.local/share/typst/packages` on Linux
    - `~/Library/Application Support/typst/packages` on macOS
    - `%APPDATA%/typst/packages` on Windows
3. Make a folder called `local` *(or whatever you want to name your package namespace)* and navigate into it.
4. Extract the contents of `jx-style-main.zip` directly into said folder.
5. If necessary, rename the `jx-style-main` folder to just `jx-style`.
6. Make a new `.typ` file anywhere
7. Import the package with
    ```c
    #import "@local/jx-style:0.2.0": *
    ```
    Replace `local` with whatever you named your package namespace in step 3.
</details>

# Parts

There are four main parts of this package that are important:

| File | Purpose |
|---|---|
| `irgot.typ` | *(is `irgot.json` in 0.1.0)* Contains the **colour schemes** that are possible for documents. *‹irgot›* means *‹colours›* in Icsine, my personal conlang. |
| `jx-style.typ` | The **main style file**, and the entry point to the package itself. Contains all the styles themselves, all that shabang. |
| `jx-date.typ` | My own **date string formatter**. Accessed by `jx-style.typ` to display dates how I like it. |
| `typst.toml` | The **manifest** for the package, in accordance with [typst/packages](https://github.com/typst/packages). |

## `irgot.typ`

A bit of backstory -- before I moved to making my documents with Typst,
https://github.com/0x1B05/nju_os
https://raw.githubusercontent.com/0x1B05/nju_os/main/book_notes/content/04_persistence_01_hardware%26%26raid.typ
typst
#import "../template.typ": *

#pagebreak()

= hardware&&raid

== Hard Disk Drives

=== The Interface

The drive consists of a large number of sectors (*512-byte blocks*), each of which can be read or written. On a disk with n sectors, the sectors are numbered from 0 to n − 1 (the address space). Multi-sector operations are possible; indeed, many file systems will read or write 4KB at a time (or more). However, when updating the disk, the only guarantee drive manufacturers make is that a single 512-byte write is *atomic*.

#tip("Tip")[
*a torn write*: if an untimely power loss occurs, only a portion of a larger write may complete
]

==== Assumptions

1. Accessing two blocks near one another within the drive's address space is faster than accessing two blocks that are far apart.
2. Accessing blocks in a contiguous chunk (i.e., a sequential read or write) is the fastest access mode, usually much faster than any more random access pattern.

=== Basic Geometry

#image("images/2024-02-29-20-24-44.png", width: 50%)

A disk may have one or more *platters*; each platter has 2 sides, each of which is called a *surface*. Data is encoded on each surface in concentric circles of sectors; one such concentric circle is called *a track*.

The process of reading and writing is accomplished by the *disk head*. The disk head is attached to a single *disk arm*, which moves across the surface to position the head over the desired track.

These platters are usually made of some hard material (such as aluminum), and then coated with *a thin magnetic layer* that enables the drive to persistently store bits.

The rate of rotation is often measured in *rotations per minute (RPM)*.

=== A Simple Disk Drive

Assume we have a simple disk with a single track:

#image("images/2024-02-29-20-30-28.png", width: 50%)

#tip("Tip")[
This track has just 12 sectors, each of which is 512 bytes in size, and the surface is rotating counter-clockwise.
]

==== Single-track Latency: The Rotational Delay

Imagine we receive a request to read block 0: we simply wait for the desired sector to rotate under the disk head (the *rotational delay*). In the example, if the full rotational delay is R, the disk incurs a rotational delay of about R/2 waiting for sector 0 to pass under the read/write head.

==== Multiple Tracks: Seek Time

#image("images/2024-02-29-20-33-54.png", width: 80%)

For a read to sector 11, the drive must first move the disk arm to the correct track (in this case, the outermost one), in a process known as a *seek*.

#tip("Tip")[
Seeks, together with rotations, are among the most costly disk operations.
]

The seek has several phases:
- an *acceleration* phase as the disk arm gets moving;
- *coasting* as the arm moves at full speed;
- *deceleration* as the arm slows down;
- *settling* as the head is carefully positioned over the correct track.

#tip("Tip")[
The *settling time* is often quite significant, e.g., 0.5 to 2 ms.
]

When sector 11 passes under the disk head, the final phase of I/O takes place, known as the *transfer*, where data is either read from or written to the surface. And thus we have a complete picture of I/O time: *seek -> rotation -> transfer*.

==== Some Other Details

===== track skew

Many drives employ some kind of *track skew* to make sure that sequential reads can be properly serviced even when crossing track boundaries.

#image("images/2024-02-29-20-39-22.png", width: 50%)

#tip("Tip")[
Sectors are often skewed like this because the head needs time to reposition when switching from one track to the next; without the skew, the next sequential sector would already have rotated past the head.
]

===== multi-zoned

Outer tracks tend to have more sectors than inner tracks. Such drives are often referred to as *multi-zoned* disk drives: the disk is organized into multiple zones, where a zone is a consecutive set of tracks on a surface. Each zone has the same number of sectors per track, and outer zones have more sectors than inner zones.
===== track buffer

An important part of any modern disk drive is its cache, for historical reasons sometimes called a track buffer (usually around 8 or 16 MB). For example, when reading a sector from the disk, the drive might decide to read in all of the sectors on that track and cache them in its memory.

On writes, the drive has a choice: should it acknowledge the write as completed when it has put the data in its memory, or only after the write has actually reached the disk? The former is called *write back* caching (fast but dangerous), the latter *write through*.

=== I/O Time: Doing The Math

$T_(I/O) = T_("seek") + T_("rotation") + T_("transfer")$

$R_(I/O) = ("Size"_("Transfer"))/(T_(I/O))$

#three-line-table[
  | \ | Cheetah 15K.5 | Barracuda |
  | ------------ | ------------- | --------- |
  | Capacity | 300 GB | 1 TB |
  | RPM | 15,000 | 7,200 |
  | Average Seek | 4 ms | 9 ms |
  | Max Transfer | 125 MB/s | 105 MB/s |
  | Platters | 4 | 4 |
  | Cache | 16 MB | 16/32 MB |
  | Connects via | SCSI | SATA |
]

==== random workload

Assume each 4 KB read occurs at a random location on disk. For the Cheetah: $T_("seek") = 4 "ms", T_"rotation" = 2 "ms", T_"transfer" = 30 "microsecs"$.

The average seek time (4 milliseconds) is simply the average time reported by the manufacturer; note that a full seek (from one end of the surface to the other) would likely take two or three times longer. The average rotational delay is calculated from the RPM directly: 15,000 RPM equals 250 RPS (rotations per second), so each rotation takes 4 ms; on average the disk waits half a rotation, hence 2 ms. Finally, the transfer time is just the size of the transfer divided by the peak transfer rate.

==== sequential workload

Assume the size of the transfer is 100 MB. Then $T_(I/O)$ for the Cheetah and Barracuda is about 800 ms and 950 ms, respectively. The rates of I/O are thus very nearly the peak transfer rates of 125 MB/s and 105 MB/s, respectively.
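The arithmetic above is easy to check in a few lines; a minimal sketch (drive figures taken from the Cheetah column of the table, treating MB as $10^6$ bytes):

```python
def io_time_ms(seek_ms, rpm, transfer_mb_s, size_mb):
    """T_I/O = T_seek + T_rotation + T_transfer, all in milliseconds."""
    rotation_ms = (60_000 / rpm) / 2            # average: half a rotation
    transfer_ms = size_mb / transfer_mb_s * 1000
    return seek_ms + rotation_ms + transfer_ms

def io_rate_mb_s(seek_ms, rpm, transfer_mb_s, size_mb):
    """R_I/O = Size_Transfer / T_I/O."""
    return size_mb / (io_time_ms(seek_ms, rpm, transfer_mb_s, size_mb) / 1000)

# Cheetah 15K.5: 4 ms average seek, 15,000 RPM, 125 MB/s peak transfer
print(io_rate_mb_s(4, 15_000, 125, 4 / 1000))   # random 4 KB read -> ~0.66 MB/s
print(io_rate_mb_s(4, 15_000, 125, 100))        # sequential 100 MB -> ~124 MB/s
```

Plugging in the Barracuda figures (9 ms, 7,200 RPM, 105 MB/s) reproduces the other column of the summary table below.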
#three-line-table[
  | \ | Cheetah | Barracuda |
  | --------------- | --------- | --------- |
  | R_(I/O) Random | 0.66 MB/s | 0.31 MB/s |
  | R_(I/O) Sequential | 125 MB/s | 105 MB/s |
]

Figure 37.6 summarizes these numbers.

#tip("Tip")[
When at all possible, transfer data to and from disks in a sequential manner.
]

In many books and papers, you will see average disk-seek time cited as being roughly one-third of the full seek time. Where does this come from? (Hint: the average distance between two uniformly random points on a segment is one third of its length.)

=== Disk Scheduling

Given a set of I/O requests, the *disk scheduler* examines the requests and decides which one to schedule next. The disk scheduler tries to follow *the principle of SJF (shortest job first)* in its operation.

==== SSTF: Shortest Seek Time First

SSTF orders the queue of I/O requests by track, *picking requests on the nearest track* to complete first. SSTF is not a panacea, for the following reasons. First, the drive geometry is not available to the host OS; rather, it sees an array of blocks (easily fixed with nearest-block-first (NBF) scheduling). The second problem is more fundamental: *starvation*.

==== Elevator (a.k.a. SCAN or C-SCAN)

To avoid starvation, the algorithm originally called *SCAN* simply moves back and forth across the disk, servicing requests in order across the tracks. Let's call a single pass across the disk (from outer to inner tracks, or inner to outer) a *sweep*. If a request arrives for a block on a track that has already been serviced on the current sweep, it is not handled immediately, but rather queued until the next sweep (in the other direction).

===== F-SCAN

Freeze the queue to be serviced while a sweep is in progress. This places requests that come in during the sweep into a queue to be serviced later.

===== C-SCAN

Short for Circular SCAN: the algorithm only sweeps from outer to inner, then resets at the outer track to begin again.
#tip("Tip")[
  Doing so is a bit more fair to the inner and outer tracks, as pure back-and-forth SCAN favors the middle tracks.
]

For reasons that should now be clear, the SCAN algorithm is sometimes referred to as the *elevator* algorithm, because it behaves like an elevator: it is either going up or going down, not servicing requests to floors based merely on which floor is closer.

In particular, SCAN (or even SSTF) does not actually adhere as closely to the principle of SJF as it could.

==== SPTF: Shortest Positioning Time First

#image("images/2024-02-29-21-05-46.png", width: 50%)

In the example, the head is currently positioned over sector 30 on the inner track. The scheduler thus has to decide: should it schedule sector 16 (on the middle track) or sector 8 (on the outer track) for its next request? The answer, of course, is "it depends".

#tip("Tip")[
  In engineering, it turns out "it depends" is almost always the answer, reflecting that trade-offs are part of the life of the engineer.
]

On modern drives, as we saw above, seek and rotation are roughly equivalent in cost (depending, of course, on the exact requests), and thus SPTF is useful and improves performance.

==== Other Scheduling Issues

===== Where is disk scheduling performed on modern systems?

The OS scheduler usually picks what it thinks the best few requests are (say 16) and issues them all to the disk; the disk then uses its internal knowledge of head position and detailed track layout information to service said requests in the best possible (SPTF) order.

===== I/O merging

For example, imagine a series of requests to read blocks 33, then 8, then 34. The scheduler should *merge* the requests for blocks 33 and 34 into a single two-block request.

===== how long should the system wait before issuing an I/O to disk?
- work-conserving: once the scheduler has even a single I/O pending, it immediately issues the request to the drive
- non-work-conserving: by waiting, a new and "better" request may arrive at the disk, and thus overall efficiency is increased

#tip("Tip")[
  When to wait, and for how long, can be tricky.
]

== Redundant Arrays of Inexpensive Disks (RAIDs)

I/O operations can be the bottleneck for the entire system.

The Redundant Array of Inexpensive Disks, better known as RAID, is a technique that uses multiple disks in concert to build a faster, bigger, and more reliable disk system.

Externally, a RAID looks like a disk: a group of blocks one can read or write. Internally, the RAID is a complex beast, consisting of multiple disks, memory (both volatile and non-), and one or more processors to manage the system.

RAIDs offer a number of advantages: performance, capacity, and reliability.

#tip("Tip")[
  With some form of redundancy, RAIDs can tolerate the loss of a disk and keep operating as if nothing were wrong.
]

When considering how to add new functionality to a system, one should always consider whether such functionality can be added transparently, in a way that demands no changes to the rest of the system. RAID is a perfect example, and transparency greatly improves the *deployability* of RAID.

Amazingly, RAIDs provide these advantages transparently to systems that use them, i.e., a RAID just looks like a big disk to the host system.

=== Interface And RAID Internals

When a file system issues a logical I/O request to the RAID, the RAID internally must calculate which disk (or disks) to access, and then issue one or more physical I/Os to do so.

Consider a RAID that keeps two copies of each block (each one on a separate disk); when writing to such a mirrored RAID system, the RAID will have to perform two physical I/Os for every one logical I/O it is issued.

A RAID system is often built as a separate hardware box, with a standard connection (e.g., SCSI or SATA) to a host.
At a high level, a RAID is very much a specialized computer system: it has a processor, memory, and disks; however, instead of running applications, it runs specialized software designed to operate the RAID. === Fault Model ==== fail-stop fault model In this model, a disk can be in exactly one of two states: working or failed. With a working disk, all blocks can be read or written. In contrast, when a disk has failed, we assume it is permanently lost. One critical aspect of the fail-stop model is what it assumes about fault detection. Specifically, when a disk has failed, we assume that this is easily detected. For now, we do not have to worry about more complex “silent” failures such as disk corruption. We also do not have to worry about a single block becoming inaccessible upon an otherwise working disk (sometimes called a latent sector error). === How To Evaluate A RAID The first axis is capacity; given a set of N disks each with B blocks, how much useful capacity is available to clients of the RAID? The second axis of evaluation is reliability. How many disk faults can the given design tolerate? The third axis is performance. === RAID Level 0: Striping The first RAID level is actually not a RAID level at all, in that there is no redundancy. RAID level 0, or striping as it is better known, serves as an excellent upper-bound on performance and capacity and thus is worth understanding. Stripe blocks across the disks of the system as follows (assume here a 4-disk array): #three-line-table[ | Disk 0 | Disk 1 | Disk 2 | Disk 3 | | ------ | ------ | ------ | ------ | | 0 | 1 | 2 | 3 | | 4 | 5 | 6 | 7 | | 8 | 9 | 10 | 11 | | 12 | 13 | 14 | 15 | ] #tip("Tip")[ We call the blocks in the same row a *stripe* ] This approach is designed to extract the most parallelism from the array when requests are made for contiguous chunks of the array. 
#three-line-table[
  | Disk 0 | Disk 1 | Disk 2 | Disk 3 |
  | ------ | ------ | ------ | ------ |
  | 0 | 2 | 4 | 6 |
  | 1 | 3 | 5 | 7 |
  | 8 | 10 | 12 | 14 |
  | 9 | 11 | 13 | 15 |
]

#tip("Tip")[
  Chunk size: 2 blocks.
]

We place two 4 KB blocks on each disk before moving on to the next disk. Thus, the chunk size of this RAID array is 8 KB, and a stripe thus consists of 4 chunks or 32 KB of data.

==== THE RAID MAPPING PROBLEM

Take the first striping example above (chunk size = 1 block = 4 KB):

Disk = A % number_of_disks

Offset = A / number_of_disks

Consider how these equations would be modified to support different chunk sizes.

==== Chunk Sizes

A small chunk size implies that many files will be striped across many disks:
- increasing the parallelism of reads and writes to a single file;
- but increasing the positioning time to access blocks across multiple disks, because the positioning time for the entire request is determined by the maximum of the positioning times of the requests across all drives.

A big chunk size:
- reduces such intra-file parallelism, and thus relies on multiple concurrent requests to achieve high throughput;
- but reduces positioning time; if, for example, a single file fits within a chunk and thus is placed on a single disk, the positioning time incurred while accessing it will just be the positioning time of a single disk.

For the rest of this discussion, we will assume that the array uses a chunk size of a single block (4 KB).

==== Back To RAID-0 Analysis

- capacity: given N disks each of size B blocks, striping delivers N·B blocks of useful capacity
- reliability: any disk failure will lead to data loss
- performance: all disks are utilized, often in parallel, to service user I/O requests

==== Evaluating RAID Performance

- The first metric is *single-request latency*
- The second is *steady-state throughput*

Two types of workloads: sequential and random. We will assume that a disk can transfer data at S MB/s under a sequential workload, and R MB/s under a random workload.
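The RAID-0 mapping equations above, generalized to an arbitrary chunk size, can be sketched in a few lines of Python (the helper name is my own; chunk size 1 reduces to the two equations in the text):

```python
def raid0_map(addr, num_disks, chunk_blocks=1):
    """Map logical block `addr` to (disk, physical block offset) in RAID-0."""
    chunk = addr // chunk_blocks            # which chunk the block falls in
    disk = chunk % num_disks                # chunk size 1: Disk = A % number_of_disks
    offset = (chunk // num_disks) * chunk_blocks + addr % chunk_blocks
    return disk, offset

# chunk size = 1 block, 4 disks (first table): block 10 -> disk 2, offset 2
print(raid0_map(10, 4))     # (2, 2)
# chunk size = 2 blocks (second table): block 10 -> disk 1, offset 2
print(raid0_map(10, 4, 2))  # (1, 2)
```

Both results match the striping tables: with chunk size 1, block 10 sits in disk 2's third row; with chunk size 2, it sits in disk 1's third row.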
In general, S is much greater than R (i.e., S ≫ R).

Assume a sequential transfer of 10 MB on average, and a random transfer of 10 KB on average. Also, assume the following disk characteristics:
- Average seek time: 7 ms
- Average rotational delay: 3 ms
- Transfer rate of disk: 50 MB/s

- S = Amount of Data / Time to access = 10 MB / 210 ms = 47.62 MB/s
- R = Amount of Data / Time to access = 10 KB / 10.195 ms = 0.981 MB/s

==== Back To RAID-0 Analysis, Again

From a latency perspective, the latency of a single-block request should be just about identical to that of a single disk.

From the perspective of steady-state throughput, sequential throughput equals N (the number of disks) multiplied by S (the sequential bandwidth of a single disk). For a large number of random I/Os, we can again use all of the disks, and thus obtain N · R MB/s.

=== RAID Level 1: Mirroring

With a mirrored system, we simply make more than one copy of each block in the system; each copy should be placed on a separate disk.

#three-line-table[
  | Disk 0 | Disk 1 | Disk 2 | Disk 3 |
  | ------ | ------ | ------ | ------ |
  | 0 | 0 | 1 | 1 |
  | 2 | 2 | 3 | 3 |
  | 4 | 4 | 5 | 5 |
  | 6 | 6 | 7 | 7 |
]

The arrangement above is a common one and is sometimes called *RAID-10* (or RAID 1+0) because it uses mirrored pairs (RAID-1) and then stripes (RAID-0) on top of them; another common arrangement is *RAID-01* (or RAID 0+1), which contains two large striping (RAID-0) arrays and then mirrors (RAID-1) on top of them.

When reading a block from a mirrored array, the RAID has a choice: it can read either copy. When writing a block, though, no such choice exists: the RAID must update both copies of the data, in order to preserve reliability.

#tip("Tip")[
  Do note, though, that these writes can take place in parallel; for example, a write to logical block 5 could proceed to disks 2 and 3 at the same time.
]

==== RAID-1 Analysis

== File System
https://github.com/Mc-Zen/quill
https://raw.githubusercontent.com/Mc-Zen/quill/main/tests/gates/permute/test.typ
typst
MIT License
#set page(width: auto, height: auto, margin: 0pt) #import "/src/quill.typ": * #quantum-circuit( 2, permute(1,0), permute(1,0), 1, permute(2,0,1), 2, [\ ], 2, 4, permute(1,0), 1, [\ ], 2, gate($H$), 5, ) #pagebreak() // Test separation parameter #quantum-circuit( 1, permute(1,0, separation: none), permute(1,0, separation: 2pt), permute(1,0, separation: red), 1, [\ ], 5 ) #pagebreak() // Test bend #quantum-circuit( 1, permute(1,0, bend: 0%), 1, [\ ], 3, )
https://github.com/atareao/fondos-productivos
https://raw.githubusercontent.com/atareao/fondos-productivos/master/README.md
markdown
MIT License
<h1 align="center">Welcome to Fondos de pantallas productivos 👋</h1>
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[![All Contributors](https://img.shields.io/badge/all_contributors-5-orange.svg?style=flat-square)](#-contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
<p>
  <img src="https://img.shields.io/badge/version-0.1-blue.svg?cacheSeconds=2592000" />
  <a href="https://twitter.com/atareao">
    <img alt="Twitter: atareao" src="https://img.shields.io/twitter/follow/atareao.svg?style=social" target="_blank" />
  </a>
</p>

> Wallpapers that boost your productivity by teaching you the keyboard shortcuts of the applications you use most often.

### 🏠 [Homepage](https://www.atareao.es)

## Converting the wallpapers

To convert the wallpapers from SVG to PNG, run the script in the `scripts` folder. One PNG wallpaper will be generated for every SVG image the script finds in the `src` directory.

The output formats are defined by the names of the directories in `fondos`. Each name must be two integers separated by an `x`, i.e. `widthxheight`; this is how the dimensions are determined. If you do not want a given format, simply delete the corresponding directory (or directories).

**Note**: Updating the scripts so they can be managed with `typst` is still pending.

### Dependencies

This script uses [Inkscape](https://inkscape.org/) to convert the SVG images to PNG. If Inkscape is not found, it falls back to [ImageMagick](https://imagemagick.org).
```sh
scripts/topng.sh
```

## 👤 Contributors ✨

Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):

<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
  <tr>
    <td align="center"><a href="https://www.atareao.es"><img src="https://avatars3.githubusercontent.com/u/298055?v=4" width="100px;" alt=""/><br /><sub><b><NAME></b></sub></a><br /><a href="https://github.com/atareao/fondos-productivos/commits?author=atareao" title="Code">💻</a></td>
    <td align="center"><a href="http://tomatesasesinos.com"><img src="https://avatars2.githubusercontent.com/u/1285451?v=4" width="100px;" alt=""/><br /><sub><b><NAME></b></sub></a><br /><a href="https://github.com/atareao/fondos-productivos/commits?author=mdtrooper" title="Code">💻</a></td>
    <td align="center"><a href="https://github.com/Marzal"><img src="https://avatars3.githubusercontent.com/u/2069735?v=4" width="100px;" alt=""/><br /><sub><b><NAME></b></sub></a><br /><a href="https://github.com/atareao/fondos-productivos/commits?author=Marzal" title="Code">💻</a></td>
    <td align="center"><a href="https://azamarro.github.io/"><img src="https://avatars2.githubusercontent.com/u/16717087?v=4" width="100px;" alt=""/><br /><sub><b>AZamarro</b></sub></a><br /><a href="https://github.com/atareao/fondos-productivos/commits?author=AZamarro" title="Code">💻</a></td>
    <td align="center"><a href="http://avpodcast.net/podcastlinux/"><img src="https://avatars3.githubusercontent.com/u/23723653?v=4" width="100px;" alt=""/><br /><sub><b>Podcast Linux</b></sub></a><br /><a href="https://github.com/atareao/fondos-productivos/commits?author=podcastlinux" title="Code">💻</a></td>
  </tr>
</table>

<!-- markdownlint-enable -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->

This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification.
Contributions of any kind are welcome!

## Show your support

Give this project a ⭐️ if it has helped you or improved your life!

***
_This README was generated with ❤️ by [readme-md-generator](https://github.com/kefranabg/readme-md-generator)_
https://github.com/EGmux/TheoryOfComputation
https://raw.githubusercontent.com/EGmux/TheoryOfComputation/master/unit3/main.typ
typst
#include "./complexity.typ"
https://github.com/jocap/typst-x-bar
https://raw.githubusercontent.com/jocap/typst-x-bar/master/lib.typ
typst
MIT License
// Exports // #let make-label #let make-category #let with-arrows #let with-node-spacing #{ let node-spacing(sty) = { let ns = measure(metadata("x-bar-node-spacing"), sty).width if ns == 0pt { 0.7*measure(block(height: 1em), sty).height } else { ns } } // The `dx' value attached via metadata to each node/branch is the amount of horizontal space that the node/branch should be offset. For simple (non-branching) nodes, this is half of the node's width. let getdx(x, sty) = { if type(x) == "content" { x = if repr(x.func()) == "style" { (x.func)(sty) } else { x } if repr(x.func()) == "sequence" and x.children.first().func() == metadata { x.children.first().value.dx } else { measure(x, sty).width/2 } } else { measure(x, sty).width/2 } } // The `children' value attached via metadata to each node/branch contains an array of pairs that specify the horizontal and vertical position of each terminal in the tree. This information is used by `with-arrows'. let getchildren(x, sty) = { if type(x) == "content" { x = if repr(x.func()) == "style" { (x.func)(sty) } else { x } if repr(x.func()) == "sequence" and x.children.first().func() == metadata { x.children.first().value.children } else { () } } else { () } } // At each extension of the tree, the positions in the `children' value are updated to account for the horizontal and vertical extension of the tree. let updatechildren(children, dx, dy) = { children.map(c => (c.at(0), (c.at(1).at(0)+dx, c.at(1).at(1)+dy))) } // Place `label' above `term'. If `term' is content returned by `node' or `branch', then position it correctly according to the attached metadata. 
let node(label, term, i: none) = { let max(a, b) = if a > b { a } else { b } if i != none { label = label + sub(i) } if term == none { label } else { style(sty => { let tdx = getdx(term, sty) let width = max(measure(label, sty).width, measure(term, sty).width) let (labeloffset, termoffset) = (0pt, 0pt) if tdx < measure(label, sty).width/2 { termoffset = measure(label, sty).width/2 - tdx } else { labeloffset = tdx - measure(label, sty).width/2 } let children = updatechildren(getchildren(term, sty), termoffset, measure(label, sty).height+0.27em) metadata(( dx: labeloffset + measure(label, sty).width/2, children: children)) stack( dir: ttb, spacing: 0.27em, move(dx: labeloffset, label), move(dx: termoffset, term)) }) } } // https://github.com/typst/typst/issues/2196#issuecomment-1728135476 let to-string(content) = { if content.has("text") { content.text } else if content.has("children") { content.children.map(to-string).join("") } else if content.has("body") { to-string(content.body) } else if content == [ ] { " " } } // Create a unary or binary branch between given `terms'. 
let branch(..terms) = style(sty => { if terms.pos().len() == 1 { let term = terms.pos().first() if term == [] { term = "" } let s = if type(term) == str { term } else if type(term) == content { to-string(if repr(term.func()) == "style" { (term.func)(sty) } else { term }) } else { "" } let roof = s.match(regex(" ")) != none let tdx = getdx(term, sty) let children = getchildren(term, sty) if children == () { children = ((term, (measure(term, sty).width/2, 0.27em+1em+measure(term, sty).height)),) } else { children = updatechildren(getchildren(term, sty), 0pt, 1em+0.27em) } metadata((dx: tdx, children: children)) stack(dir: ttb, spacing: 0.27em, if term != "" { if roof { polygon(stroke: 0.5pt, (0pt, 1em), (tdx, 0pt), (tdx*2, 1em)) } else { move(dx: tdx - 0.5pt, line(stroke: 0.5pt, length: 1em, angle: 90deg)) } }, term) } else if terms.pos().len() == 2 { let (left, right) = terms.pos() let leftwidth = measure(left, sty).width let rightwidth = measure(right, sty).width let width = leftwidth + rightwidth + node-spacing(sty) let leftdx = getdx(left, sty) let rightdx = getdx(right, sty) let bottom = stack(dir: ltr, spacing: node-spacing(sty), left, right) let labelmid = leftdx + ((width - rightwidth + rightdx) - leftdx)/2 let top = stack(dir: ltr, line(stroke: 0.5pt, start: (leftdx, 1em), end: (labelmid,0pt)), line(stroke: 0.5pt, start: (labelmid - leftdx, 1em), end: (0pt,0pt))) metadata(( dx: labelmid, children: updatechildren(getchildren(left, sty), 0pt, 1em+0.27em) + updatechildren(getchildren(right, sty), leftwidth+node-spacing(sty), 1em+0.27em))) stack(dir: ttb, spacing: 0.27em, top, bottom) } else { panic("x-bar/branch: only unary/binary branches supported") } }) // Using the `children' value, return the terminal in `tree' that is equal to `term'. This is used by `with-arrows'. 
let getchild(tree, term, sty) = { let children = (tree.func)(sty).children.at(0).value.children let found for child in children { if child.first() == term { found = child break } } if found != none { found.at(1) } } // Display `tree' with movement arrows between the terminals specified in `pairs'. with-arrows = (tree, ..pairs) => style(sty => { let pairs = pairs.pos() let lowest = 0pt let ys = () for pair in pairs { let (d, t) = pair.map(term => { let pos = getchild(tree, term, sty) if pos == none { panic("x-bar/with-arrows: could not find terminal " + repr(term)) } pos }) if d.at(0) > t.at(0) { let tmp = d d = t t = tmp } d.at(1) = d.at(1) + 0.36em t.at(1) = t.at(1) + 0.36em let y = t.at(1) + 1em while y in ys { y = y + 1em } ys.push(y) let p = path(stroke: (thickness: 0.5pt), (d.at(0), d.at(1)), (d.at(0), y), (t.at(0), y), (t.at(0), t.at(1))) place(p) let (w, h) = (0.27em, 0.36em) place(dx: d.at(0), dy: d.at(1), polygon(fill: black, stroke: 0.5pt, (-w/2, h), (0pt, 0pt), (w/2, h))) let low = measure(p, sty).height if low > lowest { lowest = low } } tree let diff = lowest - measure(tree, sty).height if diff > 0pt { v(diff) } }) with-node-spacing = (ns, body) => { show metadata.where(value: "x-bar-node-spacing"): h(ns) body } // Curry `node' with `label'. make-label = (label) => (..rest) => { if rest.pos().len() == 0 { node(label, none, ..rest.named()) } else { node(label, branch(..rest), ..rest.named()) } } // Create node functions for `label' at phrase-, bar- and head-level. 
make-category = (label) => ( make-label(label+"P"), make-label(label+"′"), make-label(label)) } #let _P = make-label(block(height: -0.53em)) #let Spec = make-label("Spec") #let (AdvP, Adv1, Adv0) = make-category("Adv") #let (AuxP, Aux1, Aux0) = make-category("Aux") #let (CP, C1, C0) = make-category("C") #let (DP, D1, D0) = make-category("D") #let (FP, F1, F0) = make-category("F") #let (FinP, Fin1, Fin0) = make-category("Fin") #let (ForceP, Force1, Force0) = make-category("Force") #let (IP, I1, I0) = make-category("I") #let (NP, N1, N0) = make-category("N") #let (AP, A1, A0) = make-category("A") #let (NegP, Neg1, Neg0) = make-category("Neg") #let (PP, P1, P0) = make-category("P") #let (PartP, Part1, Part0) = make-category("Part") #let (TP, T1, T0) = make-category("T") #let (VP, V1, V0) = make-category("V") #let (XP, X1, X0) = make-category("X") #let (vP, v1, v0) = make-category([_v_])
https://github.com/nixon-voxell/apu_rmct
https://raw.githubusercontent.com/nixon-voxell/apu_rmct/main/literature.typ
typst
// Global settings
#set page(paper: "a4")
#set par(
  justify: true,
)
#set text(
  font: ("Times New Roman"),
  lang: "en",
  size: 12pt,
  fallback: false,
  hyphenate: false
)
#set heading(numbering: "1.")
#show heading: it => block[
  #text(size: 12pt)[#it]
]
#show heading.where(level: 4) : it => block[
  #text(size: 12pt, style: "italic", weight: "regular")[#it]
]

// Cover page
#align(horizon)[
  #align(center)[
    #image("./images/apu_logo.png", width: 200pt)

    *INDIVIDUAL ASSIGNMENT*

    *RESEARCH METHODS FOR COMPUTING AND TECHNOLOGY*

    #table(
      columns: (1fr, 2fr),
      inset: 10pt,
      align: horizon,
      align(left)[*Student Name*], align(left)[<NAME>],
      align(left)[*TP Number*], align(left)[TP058994],
      align(left)[*Intake Code*], align(left)[APU2F2305CGD],
      align(left)[*Module Code*], align(left)[CT098-3-2-RMCT],
      align(left)[*Lecturer Name*], align(left)[Assoc. Prof. Ts. Dr. <NAME> Ike],
      align(left)[*Hand Out Date*], align(left)[7#super[th] November 2023],
      align(left)[*Hand In Date*], align(left)[19#super[th] February 2024],
    )
  ]
]

#pagebreak()

// ======================================
// Content page start
// ======================================

#align(center)[
  #text(size: 16pt)[*Exploring Deep Learning Approaches for Real-Time Interactive Character Animation*]
  // #text(size: 16pt)[*Improving Immersion of Real-Time Interactive Character Animation with Deep Learning*]

  <NAME>

  #link("mailto:<EMAIL>")
]

#show: rest => columns(2, rest)

// *_Abstract_--- Xxx*

// *_Index Terms_*
// Character Animation, Deep Learning, Neural Networks, Interactive, Real-time

// = Introduction

= Literature Review

== Introduction

Animation has evolved significantly from its origins to become a cornerstone of entertainment and communication. This review delves into three key areas.

The first section will discuss the evolution of animation. History is extremely important when it comes to understanding a field.
This section traces the advancement of animation over the past decades, providing a timeline and a potential future trajectory.

The following section will be a deep dive into the research that has been done to create the technologies needed to make interactive character animation possible. Understanding the underlying technologies that made interactive animation possible is key to new innovations.

The final section will be about the recent advancements in deep learning that could raise the standards of interactive character animation. This section explores what has been done and the potential future of the interactive character animation industry.

The sources used in this chapter will primarily be research articles.

== Evolution of Animation

Animation in its simplest form is a sequence of actions that, when played sequentially, produces an illusion of movement. In the beginning, all animations were offline, or pre-recorded in some form, before being displayed to an audience. As opposed to many real-time animations today, especially in games, offline animations are not interactable, and thus only fit the purposes of the film industry.

Animation production started off with hand-drawn animation. From the 1940s to the 1980s, hand-drawn animation was the main mode of output in the animation industry @lamotte2022discovering. An animation was produced frame by frame, requiring prodigious quantities of labor for the construction of a 24 frames per second film @baecker1969picture. It was slow and inefficient, but it was the only choice given the state of the technology at that time.

Computer-assisted animation started gaining popularity during the 1970s @lamotte2022discovering. Computer graphics systems strove to create a better experience to replace the drawing and painting process, widely known as the "Ink and Paint" process at that time.
TicTacToon was a method that proposed a paperless 2D animation production line @fekete1995tictactoon. Motion capture was also introduced, using potentiometers to track the movement of the human body @sturman1994brief @gleicher1999animation.

Starting around the 2000s, purely computer-generated imagery (CGI) became possible, as computer graphics systems had evolved to be able to render 3D scenes. A tremendous improvement in CGI can be seen in the film _Tron: Legacy_, which was released in December 2010 and in production since 2009. 3D games with interactive 3D animations, like _Halo: Combat Evolved_, _Gears of War_, and _Half-Life_, were also released. This marks a significant change in the animation industry.

== Interactive Character Animation

/*
- Methods of character skinning
- Animation blending
- State machine graph
- Inverse kinematic
*/

Interactive character animation is made up of multiple underlying technologies. It is a subset of animation where characters are typically animated using a rig which deforms a mesh made up of triangles that is rendered onto the screen in real-time. The end goal is to create a system that is capable of providing visual feedback of character movements for users in real-time applications.

=== Mesh Skinning

Skinning is the process of performing mesh deformation as a function of skeletal poses @rumman2016state. In character animation, it is important to adopt a skinning method that is high in fidelity and performance. This section will explore the various methods of skinning that have been developed over the years, as well as their pros and cons.

Linear Blend Skinning (LBS) is a commonly used method in character animation where each vertex of the character mesh is influenced by a weighted sum of the transformations of nearby bones @lander1998skin. It is used in _AAA_ game engines like Unity3D and Unreal Engine. LBS is known for its fast and simple algorithm that maps advantageously to the graphics hardware.
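As a sketch of the math behind LBS (the standard textbook formulation; this equation is my addition, not taken verbatim from the cited paper), the deformed position $v'$ of a vertex $v$ is a weighted sum of the vertex transformed by each influencing bone's transform $T_i$:

```latex
v' = \sum_{i=1}^{n} w_i \, T_i \, v ,
\qquad w_i \ge 0 , \qquad \sum_{i=1}^{n} w_i = 1
```

where the skinning weights $w_i$ form a convex combination. The well-known volume-loss artifact arises because linearly blending rotation matrices this way does not, in general, produce another rotation.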
Spherical Blend Skinning (SBS) is another skinning method, one that employs spherical interpolation to smoothly blend between bone transformations @kavan2005spherical. SBS aims to solve the "loss of volume" artifact that LBS brings despite its efficient algorithm.

To solve the computational and memory overhead that SBS brings, #cite(<kavan2007skinning>, form: "prose") propose Dual Quaternion Blending (DQB). DQB uses dual quaternions to represent both translation and rotation, allowing for more accurate and natural deformations of the character mesh. Unlike SBS, it does not require additional memory to cache rotation centers, and the method is extremely efficient. However, DQB comes with a limitation: it only supports rigid transformations and is not suitable for scaling or shearing effects.

=== Inverse Kinematics

Inverse kinematics (IK) is widely used in video games and robotics to create realistic poses within a defined constraint. In short, the ultimate goal of IK is to determine an appropriate joint configuration that allows the end effectors to reach a target position @aristidou2018inverse.

One use case of IK is animation retargeting, which maps movements between characters with different proportions @molla2017egocentric. In the context of interactive applications like games, IK can also be used to perform secondary motions on top of an already playing animation @ruuskanen2018inverse, for example, turning the head towards a point of interest, or moving the hand towards a target position.

At its current state, there are four main categories of IK methods:

+ *Analytical*

  Analytical IK solvers aim to determine all potential solutions based on mechanism lengths, initial posture, and rotation constraints. They often rely on assumptions to compute a single solution.

+ *Numerical*

  Numerical methods often require a set of iterations to achieve a satisfactory approximation by minimizing a predefined cost function.
+ *Data-Driven*

  Data-driven methods rely on large, accurate animation databases. Most data-driven methods employ some kind of machine learning algorithm to learn from the dataset.

+ *Hybrid*

  The hybrid method simply combines two or more different IK methods into a single solution.

=== Physics Based Character Animation

Physics-based character animation offers a completely new solution for developers to prioritize physics accuracy over animation precision. It forces characters to obey the laws of physics, preventing collisions between collidable objects and making characters interact with external forces such as gravity, pressure, etc. @ye2016physics.

Authoring physics-based character animation can be extremely hard, due to the unpredictability of the physical world. For example, a character might accidentally get hit by a physical object during runtime, resulting in unexpected movements or behaviors that disrupt the intended animation sequence. A major limitation of physics-based animation is the inability to precisely control the artistic intent for achieving a specific visual effect. Additionally, ensuring computational efficiency while simulating complex physical interactions adds another layer of challenge to the authoring process.

=== Animation System

Multiple animation clips are normally used in interactive environments to create a variety of dynamic motions. In a conference talk, #cite(<holden2018character>, form: "prose") mentioned that Assassin's Creed Origins had around 15,000 animations in the game. These animations need to be handled by an animation system that systematically selects the correct animation clips to sample depending on the current scenario.

Game engines like Unity3D use a hierarchical state machine (HSM) graph, shown in @mechanim. It controls the sampling of animation clips and the transitions between them. This allows developers to divide complex systems into smaller isolated modules @berg2023animation.
During runtime, the animation system will traverse the state machine graph and subsequently transition to the animation clip it reaches.

#figure(
  image("./images/unity mechanim.jpg"),
  caption: [Unity's Mecanim],
) <mechanim>

In some cases, animators would also like to mix and match different animation clips, for example, an in-between animation of walking and jogging to produce a slow jog. This can be achieved using a method called blend trees @berg2023animation.

#figure(
  image("./images/unity mechanim blend tree.png"),
  caption: [Blend tree.],
) <blendtree>

Another method, known as motion blending, is also used to apply motion trajectories onto the rig based on a weighted sum of multiple animation clips @menardais2004motion. This can create interesting motion dynamics, like a walking animation clip on the lower body and a punching animation on the upper body.

== Deep Learning in Animation

=== A Brief History of Deep Learning

In 1943, neurophysiologist <NAME> and mathematician <NAME> published an article titled _"A Logical Calculus of the Ideas Immanent in Nervous Activity"_ @mcculloch1943logical. In it, they described how neurons in the brain might work and modeled a simple neural network using electrical circuits. It was an attempt to understand how the human brain works and, more importantly, how it learns.

The first artificial neural network (ANN), called the Perceptron, was invented by <NAME> @rosenblatt1958perceptron. The Perceptron was able to model functions determined by linearly separable data. Activation functions were used to introduce non-linearity into the network, e.g. Sigmoid, ReLU, and Leaky ReLU @sharma2017activation.

Deep neural networks can solve many hard computational tasks, such as image recognition using convolutional neural networks (CNN) @lecun1998gradient. Recurrent neural networks (RNN) were also introduced to tackle sequential data @rumelhart1985learning.
An improved version of the RNN, known as the Long Short-Term Memory (LSTM), was proposed to solve more complex and longer sequential tasks @memory2010long. In an article titled _"Attention Is All You Need"_, the authors revolutionized the deep learning industry by introducing the Transformer model @vaswani2017attention. The Transformer model is capable of performing all kinds of tasks, from learning sequential data for human conversation, as in Llama 2, to speech recognition, as in Whisper @touvron2023llama @radford2023robust. === Using Deep Learning to Drive Character Animation /* - Cyclic based deep learning - Phased function neural network - Local motion phases - Deep phase: periodic autoencoders for learned motion phase manifold - Reinforcement learning based - Adversarial skill embeddings (ASE) - Learned motion matching */ ==== Motion Matching Traditional HSM methods tightly couple the animation data with the states @holden2018character. A better approach is to store all animation data in a database and tag each clip with its related traits, e.g. walk, idle, run, etc. Immediately, all of the animation data forms relations based on the tags the clips share. Getting an animation from the database can be done by querying the specific traits that are needed. To improve this system further, character data and important animation states can also be incorporated as traits in the animation database, e.g. the velocity of the character, the location of the hip bone, etc. This way, getting a specific animation clip becomes more of a matching system than a query system. Developers can now find the best-matching animation based on the current and desired character state alone, without the need for any state machines. This method is widely known as motion matching, a data-driven approach towards character animation @buttner2015motion.
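The core lookup can be sketched in a few lines. The clips and feature values below are made up for illustration; real systems match much richer feature vectors (trajectory samples, bone positions, velocities) and use acceleration structures rather than a linear scan:

```python
# Toy motion-matching lookup (illustrative; the clips and feature values
# here are invented, not taken from any real animation database).

def match(database, query):
    """Return the entry whose feature vector is closest to the query."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(database, key=lambda entry: dist(entry["features"], query))

# Each entry pairs an animation clip with character-state features,
# e.g. (forward speed, hip height).
database = [
    {"clip": "idle", "features": (0.0, 1.0)},
    {"clip": "walk", "features": (1.5, 1.0)},
    {"clip": "run",  "features": (4.0, 0.95)},
]

best = match(database, (1.3, 1.0))  # desired state: slow forward motion
```

The query is just the current and desired character state, so no state machine is consulted; the nearest neighbour in feature space decides which clip to sample next.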
#figure( image("./images/motion matching.png"), caption: [Motion matching.], ) <motionmatching> ==== Deep Learning The key idea of using neural networks is to generalize the problem and solve the scalability issue of many traditional animation systems @holden2018character. Beyond scalability, neural networks also prove to be fast and memory efficient: no animation data is required at runtime when running inference on a sufficiently trained network. Phase-Functioned Neural Network (PFNN) introduces the idea of using a phase function to generate the weights of a regression network, which in turn generates the animation @holden2017phase. However, not all animations have a phase, and labeling the phase variable for each animation can be a laborious task. To solve this, a new neural network architecture called Mode-Adaptive Neural Networks (MANN) was proposed, removing the phase label and replacing it with a gating network @zhang2018mode. Local Motion Phases (LMP) removes the dependency on a global phase variable in favor of multiple independent local phases for each bone @starke2020local. Each local phase is defined by the contact between a bone and the environment. Instead of defining phases manually, #cite(<starke2022deepphase>, form: "prose") proposes Deep Phase, a periodic autoencoder for learning motion phase manifolds automatically. The authors state that the learned motion phase can also potentially be used for motion matching and reinforcement learning. Learned motion matching presents a learned alternative to the highly flexible, low-preprocessing-time method of motion matching. It promises to retain high-quality animation and quick iteration times while preserving the scalability of a neural network approach @holden2020learned. Deep learning also tremendously benefits the world of physics-based animation.
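Before moving on to physics-based methods, the weight-generation idea behind PFNN-style networks can be sketched as follows. This toy version linearly blends two expert weight vectors according to a cyclic phase; the actual PFNN uses Catmull-Rom interpolation over four control points, so this is a deliberate simplification:

```python
# Simplified sketch of phase-conditioned weight blending in the spirit of
# PFNN/MANN: the regression network's weights are interpolated between
# "expert" weight sets according to a cyclic phase variable in [0, 1).
# Linear blending between neighbouring experts is used here for clarity.

def blend_weights(experts, phase):
    """experts: list of weight vectors; phase in [0, 1) cycles through them."""
    n = len(experts)
    pos = phase * n
    i, j = int(pos) % n, (int(pos) + 1) % n  # neighbouring experts
    t = pos - int(pos)                       # blend factor between them
    return [(1 - t) * a + t * b for a, b in zip(experts[i], experts[j])]

# Two tiny expert weight vectors; a real network would have one such set
# per layer with thousands of parameters.
experts = [[0.0, 1.0], [1.0, 0.0]]
w = blend_weights(experts, 0.25)  # weights halfway between the two experts
```

At each frame the blended weights parameterize the regression network that emits the pose, so the network's behaviour changes smoothly as the phase advances through the motion cycle.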
Deep Mimic uses deep reinforcement learning (RL) to learn control policies that imitate a variety of animation clips in a fully physics-simulated scenario @peng2018deepmimic. #cite(<peng2021amp>, form: "prose") proposes a fully automated adversarial RL system for physics-based character animation that imitates behaviors from unstructured datasets. In the following year, a better approach was introduced that allows physically simulated characters to learn reusable skill embeddings from large datasets of unstructured motion clips @peng2022ase. == Conclusion Creating a scalable interactive character animation system is still a challenging and ongoing research topic. While deep learning solves many of the scalability issues, every little change to the animation data requires an update to the network. Compared to systems like HSMs and motion matching, this update is not instantaneous and may require a huge amount of computational power. Moreover, new systems like motion matching and neural networks require animators to adapt and learn a new set of skills. Further studies are needed to create highly scalable animation systems that retain rich animation motion while shortening or even removing the retraining process of neural networks. = Methodology == Introduction This chapter discusses the target users suitable for this research, provides insight into the strategies used to sample and collect data, and outlines how meaningful conclusions will be drawn regarding the efficacy of deep learning for real-time interactive character animation. == Target user /* - Game developers - Animators - Gamers */ The target users for this study encompass four primary groups: game developers, animators, AI researchers, and gamers. Understanding the perspectives and requirements of each group is essential for designing deep learning approaches that are effective, user-friendly, and capable of enhancing the interactive character animation experience across various platforms.
+ *Game Developers*: These professionals play a crucial role in integrating character animation into interactive gaming environments. Their insights are invaluable for understanding the technical constraints, performance requirements, and integration challenges associated with deploying deep learning techniques in real-time scenarios. + *Animators*: Animators possess expertise in crafting compelling character animations that resonate with audiences. Their input is essential for evaluating the artistic quality, expressiveness, and fidelity of animations generated using deep learning methods. + *AI Researchers*: Creating effective deep neural network architectures is not a simple task. Their expertise can provide valuable insights into the latest advancements, methodologies, and challenges in applying deep learning techniques to character animation. + *Gamers*: Ultimately, the success of interactive character animation lies in its reception by gamers (or interactive application users). Understanding their preferences, expectations, and experiences with character animations can provide valuable feedback on the effectiveness and immersion of deep learning-driven animations. == Sampling method This study primarily uses a purposive sampling method so that participants can be selected based on their expertise and involvement in the field of character animation and interactive applications. This method ensures that only individuals with relevant knowledge and experience are included in the study. A total of 40 participants will be sampled, comprising 10 game developers, 10 animators, 10 AI researchers, and 10 gamers. This sampling approach ensures that diverse perspectives are represented, enhancing the richness and depth of the data collected. == Data collection method The data collection method used involves qualitative interviews.
In-depth interviews will be conducted with each participant, using a semi-structured approach to explore their perspectives, challenges, and expectations regarding real-time interactive character animation and the potential role of deep learning techniques. The semi-structured format also allows the flexibility to delve into specific topics of interest, providing rich insights from each interviewee's area of expertise. == Conclusion By employing purposive sampling and qualitative data collection techniques, this chapter aims to gather comprehensive insights from game developers, animators, AI researchers, and gamers. These insights will inform the development and refinement of deep learning techniques, ultimately enhancing the quality, realism, and interactivity of character animations in gaming and interactive media environments. #set par(first-line-indent: 0pt) // ====================================== // Bibliography start // ====================================== = References #bibliography("citation.bib", title: none, full: true, style: "apa")
https://github.com/lucannez64/Notes
https://raw.githubusercontent.com/lucannez64/Notes/master/Allemand_Bertolt_Brecht_Podcast.typ
typst
#import "template.typ": * // Take a look at the file `template.typ` in the file panel // to customize this template and discover how it works. #show: project.with( title: "Allemand Bertolt Brecht Podcast", authors: ( "<NAME>", ), date: "3 Décembre, 2023", ) #set heading(numbering: "1.1.") === Geburtstag <geburstag> Ich wurde am 10. Februar 1898 in Augsburg geboren. === Hintergrund <hintergrund> Nein, ich habe keinen klassischen Hintergrund als Schriftsteller, weil ich Medizin und Philosophie studiert habe. === Bedeutung <bedeutung> Diese Werke sind politisch und kulturell wichtig, da sie den Faschismus und den Kapitalismus anprangern und leicht verständlich waren. === Episches Theater <epischen-theater> Im epischen Theater wollte ich eine Distanz zwischen dem Zuschauer und dem Stück schaffen, um Reflexion zu ermöglichen. Das epische Theater zielt auf intellektuelle Teilnahme ab, während das dramatische Theater oft auf emotionale Identifikation setzt. === Exil <exil> Ich bin wegen des Aufstiegs des Nazismus ins Exil gegangen: Ich war 1933 im Exil in Prag und in Wien, dann in Dänemark, 1939 in Schweden, 1940 in Finnland und 1941 in den USA. Meine politischen Ideen und meine offene Kritik am Faschismus wurden nicht gerne gesehen. === DDR und Berliner Ensemble <rda-und-berliner-ensemble> Ich wohne in der DDR in Ost-Berlin und ich habe vor, mit meiner Frau Helene ein Theater namens Berliner Ensemble im Theater am Schiffbauerdamm zu gründen. Ich gründete 1949 das Berliner Ensemble im Theater am Schiffbauerdamm. Mein Leben sieht gut aus. === Die Dreigroschenoper und der aufhaltsame Aufstieg des Arturo Ui <die-dreigroschenoper-und-der-aufhaltsame-aufstieg-des-arturo-ui> Die Dreigroschenoper folgt den Abenteuern von <NAME>, dem Anführer einer Bande von Ganoven in London, mit der Musik von Kurt Weill. Der aufhaltsame Aufstieg des Arturo Ui hingegen ist eine tragische Fabel über den Aufstieg der Nazis, erzählt durch eine Metapher aus Chicago.
Ich denke, dass das leichte Verständnis der Werke sie bekannt gemacht hat. Ich habe den aufhaltsamen Aufstieg des Arturo Ui mit Margarete Steffin geschaffen. === Musik Dreigroschenoper <musik-dreigroschenoper> In der Dreigroschenoper hilft die Musik von <NAME>, die Themen zu unterstreichen. === Kommunismus <kommunismus> Ich bin ein Sympathisant der Kommunisten. Für mich ist der Kommunismus eine gerechtere Perspektive der Gesellschaft, und ich benutze den Klassenkampf oft, um die Ungerechtigkeit des Kapitalismus zu kritisieren. === Familie <familie> Ich habe eine Frau, <NAME>gel, und vier Kinder: Hanna, Frank, Stefan und Barbara. === Schreiben <schreiben> Für mich ist Schreiben dafür da, zum Nachdenken anzuregen. Ich möchte, dass meine Werke die Zuschauer inspirieren, soziale und politische Ungerechtigkeiten zu hinterfragen und kritisch zu denken. Ich prangere die Ungerechtigkeiten des Kapitalismus an. === Emotionen <emotionen> Im epischen Theater fühlt der Zuschauer keine Emotionen. === Nach dem Krieg <nach-dem-krieg> Nach dem Krieg wurde das Schreiben von Gedichten für mich wichtig, weil ich meine Traumata damit ausdrücken konnte.
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/040%20-%20Zendikar%20Rising/009_Episode%205%3A%20The%20Two%20Guardians.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Episode 5: The Two Guardians", set_name: "<NAME>", story_date: datetime(day: 30, month: 09, year: 2020), author: "<NAME>", doc ) As Nissa braced to fight the people she once considered allies, she wondered if she had made a grave mistake ever leaving Zendikar. Jace and Nahiri stood before her, breathing hard from their race through the Singing City. Behind her were the elementals of the Kazandu forest. Dozens and dozens strong. If Nissa had never become a planeswalker, her chest wouldn't be constricting right now with the pain and guilt of past mistakes and lost friendships. She wouldn't be mourning Gideon's death. Or the loss of Chandra's love. "How~how are you traveling~so fast?" Nahiri said. She was cut and bruised, and the rage on her face was unadulterated and brutally clear. On her hip was the satchel with the Lithoform Core. It pulsed mutely through the fabric. Nissa clenched her fists. On the other hand, if she'd never left Zendikar, if she never tried and failed and tried again, she wouldn't be standing here in front of the Singing City, defending her home when no one else would. "Zendikar is where I belong. It's the heart of my power and strength," Nissa said. "I know all the paths and how to use them. But you two"—she thought of the fern elemental Nahiri carelessly murdered in Akoum's Skyclave, and she felt the army of Kazandu's elementals behind her swell with her anger—"you will never understand. Leave my home." Jace tried to reason with her, but Nissa ignored him. It was Nahiri she focused on as the lithomancer yelled, "This is #emph[my] home, tree-dweller!" The elemental army instinctively tensed and grew closer to Nissa, ready to defend her with their lives. For a moment, Nissa was overwhelmed with gratitude toward these embodiments of Zendikar. Who found her in her exile. Who had taken her in when she was alone. Jace went still. He raised a magical ward. 
Elementals, these fragments of Zendikar's heart and soul, who stood by her, despite her mistakes and the damage she accidentally caused. Nahiri lifted her hands, and the stones of the Singing City began to tremble. Elementals, who taught her what being part of a family meant, what a family was supposed to be. Who came to her aid now, without her asking. #emph[Hers ] and not Nahiri's. #emph[What would Gideon do?] Nissa thought. #emph[He would tell you it's time to make choices for yourself.] "Defend Zendikar," she said to the elementals, in a voice lower than a whisper. But they heard. They understood. And like a wave crashing on a shore, they did. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) Nahiri always believed in the power of stone, the strength of it, that stone would outlast everything in the end. But for the first time in centuries, as dozens and dozens of elementals swarmed around her, she began to doubt the power of her lithomancy. Like Nissa, the elementals moved with impossible speed. Nahiri raised a column of stone a split second before a massive elemental shaped like a stomper crashed into her. It roared and, with a swipe of its leafy paw, smashed the pillar away. It snarled at her, and Nahiri screamed back. She swept out her arms, called on the stones, just as the leaf thing pounced. It was knocked away by a granite fist shooting up from the ground. Nahiri smiled. But her smile slipped away when she saw Nissa. The elf was standing midair on a mass of vines, arms outstretched, with ribbons of green energy swirling around her. And behind her~ Behind her was an elemental like none other. It was massive, shaped like an eagle but with a body made of jaddi roots, twisting and swirling. It spotted Nahiri and, quick as fury, surged toward her, beak wide and talons extended. Nahiri called the stones to defend her, but the creature's talons sunk into her shoulders. Nahiri cried out, in both surprise and pain.
It flapped its wings—once, twice—and began to lift her away. #emph[Like hell] , Nahiri thought, and snapped her wrists forward. Within moments, thirty glowing swords sunk into the jaddi eagle. It screamed and dropped her. Nahiri rolled out of the way and onto her feet, only to come face to face with a giant elemental made of water, complete with algae and fish swimming within it. "You can't be serious," Nahiri hissed and jumped out of the way as it aimed a watery kick toward her head. On and on it went. Nahiri caught snatches of Jace swearing and casting illusions of fires and Eldrazi broods, the elementals instinctively recoiling from the mirage. She was impressed. He was using Zendikar's fears both as a weapon and a shield. His tricks bought him enough time to dodge the barrage of beaks and maws, talons and thorns. But Nahiri knew they were barely managing to hold back the relentless assault. #emph[How is the tree dweller managing to do all of this? ] she thought. And for one terrible moment, Nahiri wondered if Nissa was right. If elementals were embodiments of the plane itself, then Zendikar had given the elf an army to fight with. While Nahiri fought with nothing. No, not nothing. She had strength and determination. She had mastered stone. She had survived for millennia. She was the guardian of the real, ancient Zendikar. She was the protector of the bedrock and foundation of this world. And she would stop this madness. With one fluid movement, Nahiri shoved an elemental made of rain and autumn-colored leaves away with a stone hand. She squared her shoulders, widened her stance, and lined up her shot. Nahiri brought her hands together with a #emph[clap] . And sent fifty glowing swords flying right at the elf. Nissa's eyes widened in surprise, but before the blades could strike, the giant jaddi root eagle appeared again—from where, Nahiri couldn't tell—and brushed all fifty weapons away with a sweep of its wing. 
#emph[Damn it] , Nahiri thought, calling the stones again. She tried to throw boulders at Nissa, to make the ground around the elf trap her, to send more swords. But Nissa's elementals defended her fiercely, as if they were more than just mindless tools. As if they knew they were fighting for their lives. #emph[How can anyone live a good life if your world is broken and failing? ] Nahiri thought with a scowl. She attacked again. And again. And again. When a giant griffin made out of the broken hedrons and the moss of the Singing City swallowed a dozen stone spears and seemingly grinned at her, Nahiri realized she had to try a different tactic. She ran. Dodging and blocking and weaving, Nahiri sprinted around the lunging paws, gnashing teeth, and striking thorns of elementals. She didn't stop until she reached the massive marble gates of the Singing City and pushed through them. She had to protect the Core. The stones in this ancient city would help her do just that. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) Jace had never before seen so many elementals in so many different forms. If they weren't attacking him, he would have been fascinated. But they #emph[were] attacking him, and it took his full skill and cunning to evade their blows and not damage them in return. He knew that if he hoped to win Nissa back as a friend and ally, he couldn't damage Zendikar. He had to get the Lithoform Core. He had to find a way to broker peace between the two guardians of Zendikar. From the corner of his eye, he saw Nahiri sprint into the Singing City. He knew that whatever the lithomancer was planning would not help any potential peace talks. Sweeping up both hands, Jace raised an illusion: a cloud of mist, thicker than natural, thick enough to disappear into, confusing the ivy and lichen elemental looming in front of him. Buying himself some time. Under this cover, Jace ran. 
He slipped into the Singing City moments before an ear-splitting roar of destruction bellowed behind him. He turned to see a massive wall of stone crushing the marble gates of the city, blocking the exit. Leaving Jace trapped within, where its eerie tune began to hum again. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) Nahiri could hear the elementals pounding against the Singing City's walls, banging their mud fists and moss wings uselessly at the stones. The sound pleased her. Nissa couldn't destroy her makeshift fortress, not while Nahiri was within it, using her lithomancy to hold it together. Still, the thought of all those nature monstrosities attacking her made her skin crawl with dread. Being surrounded by walls and the haunting tune of unseen voices made her stomach lurch. It reminded her too much of being trapped on Helvault again. She lifted her arms, summoning bedrock and sandstone. And like a dance, she made them rise, knit together, become stronger, harder than the original walls of the Singing City, building herself a massive, indestructible fort above the ruins. Her body ached with the effort, but she refused to let that foolish elf get her hands on the Core. Not when she was so close to healing Zendikar, to returning it back to the stable, even world she once knew. Within her fortress, the pounding of the elementals grew muffled and the City's song became a faint melody. Nahiri exhaled. She finally had a moment alone. "Nahiri." #emph[Damn it. ] She knew who it was before she even turned around. She recognized the pattern of Jace's footsteps on the stones. But she hadn't noticed them until now. She turned to see Jace moving toward her. "If you try to take the Core," Nahiri said, with deadly calm, "I'll add you to my collection of wall hangs." That made him stop. "I don't want to fight you," he said, raising his hands in a conciliatory gesture. "But~please, let's go to Ravnica. I think Nissa will listen to us there." 
"Oh, she'll listen," replied Nahiri, anger rising in her. "She'll listen and listen, and when it comes time to choose, she'll choose to let this world stay fractured and ruined." She clenched her fists and began to unbuild the roof of her fortress, giving her access to the open sky. Turning her gaze upward, she called the hedrons she made an age ago. She called every single one marooned around the Singing City. There were dozens. "No, Jace. The Core won't work on another plane. It stays here." "I don't want to fight you," he said again, and there was no aggression in his voice. But she heard what he wasn't saying. The silent half of that statement: #emph[But I will.] "Please," he said. But Nahiri was done. Done with these weak planeswalkers who couldn't see what was clearly in front of them. Her hands shook with emotion, and she used that energy to pull the hedrons down from the sky and hover above Jace. "Nahiri," said Jace, with alarm. The hedrons closed in around him and began to spin, confining him in their circle. "Listen, please!" Nahiri was done listening. She rose into the air, her fury and hurt fueling her. With a twirl of her fingers, blue energy engulfed her hands and she sent it through the hedrons, trapping Jace within the dangerous ring. Then, she commanded the ring to close. She wanted her face to be the last thing he ever saw. There was movement from the corner of her eye. She knew its shape, its posture, its cool and silent danger. Nahiri turned and was faced with her old mentor. Her sworn enemy. Sorin. He was standing on the fortress wall, a dozen feet away from her, at eye level. His long black jacket flowed out behind him. He was smiling. "What are you doing here?" said Nahiri through gritted teeth. Sorin didn't respond. He just lifted a hand in that dangerous way she knew so well. The slight movement that heralded a terrible attack. #emph[No, not you, too. ] Nahiri bared her teeth and screamed.
She sent a giant stone foot shooting up from the ground directly at the vampire's chest. Sorin disappeared in the rush of stones, and Nahiri exhaled. Then, an instant later, he reappeared, still smiling. Like nothing had happened at all. Nahiri blinked, confused. She reached for the stones under Sorin's feet and discovered that they weren't supporting the vampire's weight. #emph[This is an illusion, ] she realized. #emph[This is Jace.] But this realization came a moment too late. A mist flooded in around her, too thick to see through. She heard her hedrons clatter to the ground. Suddenly, her thoughts were not her own. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) #emph[It worked! ] Jace thought as the hedrons hit the surrounding ground. He could feel Nahiri's mind struggle against his control. He hated that it had come to this, but his choices were limited. The Singing City's eerie hum began to grow in volume. #emph[Better hurry. ] He wasn't sure if he could hold Nahiri's mind and a silencing spell simultaneously. Nahiri floated to the ground, and he commanded her to hold still. Cautiously, he approached. Reached into the satchel on her hip. Took the Lithoform Core. It glowed like a beacon in his hand, pulsing gently with the promise of power. The haunting song of the City swelled in volume, and Jace found himself filled with a sudden, inexplicable longing. He saw himself wielding the Core's energy, solving problems without needing to debate or fight with others. Without needing to throw himself or his friends in harm's way. With the Core, with a thought, he could easily change the world. All the worlds. #emph[No, that's not who I am. ] Jace pushed the temptation away. He groaned as Nahiri's mind thrashed against his control with renewed force. There was rage in her expression, in every line of her paralyzed body as she fought against him. His hold on her almost slipped, but Jace regained it at the last moment. "Let me out of this fortress. 
Lower the wall around the entrance," he commanded. Nahiri's mind balked at the order, but he heard the sound of stone toppling in the distance, the elementals' attack growing louder. Jace winced. They should be trying to find a solution for Zendikar together, not fighting against each other. He could take the Core to Ravnica right now. He should. Nahiri claimed that the Core only worked on this plane, but he wanted to test that theory, safely away from this already damaged world. He also knew that if he disappeared with the Core without telling Nissa, he would lose her trust forever. He both wanted her friendship and needed her in the battles to come. Jace wrapped the Core in his cloak and sprinted out of the Singing City, running as fast as his exhausted body could. The haunting song was growing louder now, stealing into his bones. Jace ran faster, faster than he realized he could. He needed to get to the entrance before Nahiri regained herself and sealed it up again. He needed to reach Nissa. He crossed the ruined marble gates an instant before his control on Nahiri's mind slipped and stone walls slammed up against the ancient city. #emph[Safely on the other side] , he thought with some satisfaction. He didn't see the giant limb of roots and green buds until it was on top of him. Until the elemental pinned him down with one of four massive hands and leaned over him, blocking out the sunlight. Jace gasped, recognizing Ashaya. "I need to talk to Nissa," he shouted. But Ashaya just increased the pressure on his sternum. Balling his fist, Jace created an illusion of fire around them, wild and consuming, hoping to create enough of a distraction to escape. But Ashaya wasn't fooled. The elemental calmly reached into Jace's cloak and pulled out the Lithoform Core. "Wait," Jace groaned. But the elemental didn't. It examined the artifact for a moment before tossing the Core over its shoulder. And into Nissa's waiting hands. 
#v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) She should destroy it. That was Nissa's initial thought as she held the Lithoform Core in her hands for the first time. #emph[Listen to me.] The thought was not hers, though the voice sounded familiar. She looked over to where Jace was struggling under Ashaya's grasp. His expression was pleading. Tentatively, she allowed Jace into her thoughts. #emph[Nissa, please, we need to stop this] , Jace thought. #emph[Call off the elementals.] #emph[If we stop, Jace, Nahiri will take the truce as an opportunity to overwhelm us. You've seen how ruthless she is.] There was a loud crash as the thick walls around the city began rearranging themselves. Nahiri appeared atop the stone chaos. The elementals flooded around her at once. #emph[Please,] Jace thought. #emph[Let's go to Ravnica. We can study the Core there together.] #emph[What makes you think we won't accidentally annihilate Ravnica? ] Nissa replied. #emph[I've seen the damage the Core can do. We should destroy it.] #emph[Nahiri said it won't work outside of Zendikar. It'll be safe to test it there.] In the distance, Nahiri began trapping elementals in prisons of stone, her movements focused, precise, and furious. Nissa's breath caught as four impenetrable walls shot up around a river elemental. #emph[Nahiri is not known for her truthfulness, Jace.] Gritting her teeth, Nissa thrust out her hands and sent a wave of green energy straight at Nahiri. #emph[Listen to me.] Nahiri screamed a battle cry and deflected Nissa's energy with a massive bedrock wall. #emph[The Gatewatch. We can use this] , thought Jace as he struggled against Ashaya's roots. #emph[There's something you don't know. I . . . we have other battles to face, Nissa.] #emph[The Gatewatch failed. We were supposed to protect the things we love. We couldn't even protect each other.
] Nissa's heart ached with the memory of Gideon's smiling face, at those tender and hopeful moments with Chandra. How, for a little while at least, among the other planeswalkers, Nissa felt like she belonged somewhere. #emph[You were like a family to me.] A hundred feet away, Nahiri fought her way forward, moving closer to where Nissa stood, while elemental after elemental fell victim to the kor's relentless attacks. No, no, no. Nissa couldn't lose this fight. The Core in her hands grew warmer. #emph[Listen to me.] "I am listening to you, Jace," she shouted. "You aren't listening to me!" #emph[Not him. Me.] The Core was flashing urgently in her hand. Nissa realized why the voice sounded so familiar. There was something in its cadence, as if the pulse, the vibrations and breath of Zendikar that she knew so well, had found its words. #emph[Who are you? ] she asked. #emph[I am me. I am you.] Fifty feet away, Nahiri smashed a stone foot into an earth elemental, catching it by surprise. It crumbled to its knees. Nissa shot a tangle of vines at Nahiri's ankles. #emph[Why are you only speaking now? ] she said to the Core. Nahiri dodged the vines with one elegant twist and jump, landing neatly on her feet. There was a small rumble in the land's rhythm, in its air. Nissa realized Zendikar was laughing. The chuckle from the Core matched the land's pulse. #emph[How? ] she asked. This was impossible. Confusing. Nissa didn't have time for a new mystery right now. Nahiri was close and coming closer. But if this was Zendikar, really Zendikar~ #emph[Nissa, please! Let me take the Core! ] Jace thought. Nissa ignored him. #emph[The object in your hand is a very old piece of me. It's full of power,] the voice from the Core replied. Nissa frowned, aimed a fresh attack at Nahiri. #emph[Why? Why would the ancient kor create this?] #emph[To undo damage.] Thirty feet away, Nahiri batted aside the second vine attack with a fence of sandstone.
She stalked forward, stopping her advance twenty feet away from Nissa. "Give me the Core, Nissa!" she shouted. #emph[Will you help me, Jace? ] Nissa thought. Jace nodded once, but even at a distance, she could tell he was planning something. A moment later, she felt tendrils of power slip into her head. Nissa realized in one horrified instant that Jace was trying to take control of her mind. She snapped the mental link between them and silently asked Ashaya to make sure Jace couldn't move. The elemental complied, piling all four limbs on the mage. Jace groaned. "I knew this plane when it was whole," Nahiri shouted, "and you want to cling onto the broken pieces of it!" Nissa studied her adversary, unsure of what to say. Nahiri was dusty and bleeding, but her anger and determination were indomitable. In that moment, Nissa realized how alone she was. #emph[What would Gideon do? ] she thought, then caught herself. #emph[No, what would I do?] #emph[Trust your strength,] whispered the power in her hands. "Broken doesn't mean weak, Nahiri," replied Nissa. "Broken doesn't mean that there isn't beauty or redemption." "So says the broken planeswalker," retorted Nahiri, "who destroys everything she touches." Nissa tightened her grip around the Core. The words stung~but not as badly as they once would have. Because behind Nahiri's cruel expression, Nissa saw fear. And in that moment, Nissa knew exactly what she would do. #emph[I will protect my home, my family. I will try and try again until I get this right.] "Broken doesn't mean a life is not worth living," Nissa said, standing tall, staring straight at the lithomancer. "You are what Zendikar once was, Nahiri. I am what it is now." Doubt flickered across Nahiri's face. It quickly faded and Nahiri snarled, raising her hands. Scores and scores of hedrons appeared, hovering in the air behind her. They began to twist and flow in a complex pattern, the energy sparking between them.
Every elemental on the battlefield cowered and shrank away. Nissa understood in that moment that Nahiri would destroy them all before she'd admit that she was wrong. She would dampen the essence of Zendikar's spirit just to tame it. If Nahiri was allowed to do whatever it was she planned, Nissa would be mourning the loss of yet another piece of Zendikar's battered soul. In her hand, the Core shone like a beacon. The hedrons spun around Nahiri faster and faster, gathering power. Like a storm just on the edge of breaking. #emph[What if I destroy] , Nissa thought, #emph[like Nahiri?] #emph[Trust your strength] , her home whispered. Nissa closed her eyes, took a deep breath, and imagined a better Zendikar. One not defined and hurting from the wounds of the Eldrazi. One not oozing from the poison they left behind. A healthier world, but still fragmented and dangerous and beautiful. The Core warmed and hummed in her palms. She felt Zendikar's leylines stretched out before her. And easily, so easily, Nissa's magic merged with the Core's power. She set it loose. There was a flash. There was a dull roar. A gust of wind struck Nissa, knocking the breath from her body, smelling of ash and rain. Of earth and streams. Of magic, ancient and terrible. The power from the Core collided with the hedrons in a shower of sparks and energy. The dull roar became a screaming bellow. The light became blinding. The air rushed away. Then there was nothing at all. Nissa opened her eyes slowly, terrified of the silence, the sudden emptiness she felt around her. Even the Core had gone quiet and dull in her hands. What she saw made her breath catch in her throat, and panic rose up in her chest. The Singing City was gone. Flattened into dust. So was a large swath of the forest. All reduced to ash. Across the battlefield, the elementals were lying motionless in the dust. "No," she whispered and rushed to the nearest one. 
A large jaddi tree embodiment, with delicate yellow flowers twining around its limbs. She dropped to her knees beside it, putting a hand on its rough bark skin. "No." Not again. Not again. The elemental stirred under her hand. It opened its eyes, blinking sleepily, and got to its feet, at first a bit shaky, but with more strength and confidence each passing second. It took her hand, gave it a squeeze, and Nissa felt it growing taller. Stronger. Tears pricked the corners of Nissa's eyes as all around the battlefield, elementals were rising, dusting themselves off, becoming fuller, more vibrant. She felt herself dropping the Core, heard it hit the ashy ground. But it didn't matter. The ancient artifact had grown silent. Its light had gone out. It had served its purpose, Nissa realized, smiling. It had undone damage. She closed her eyes and listened. She heard Nahiri picking herself up painfully from the ground. A dozen feet away from her, Jace was doing the same. Farther away, tender green jaddi roots were sprouting in the ruined forest. Farther than that, rich, unbroken earth was supplanting the diseased wastes that remained from the battle with Emrakul. And farther than that, Bala Ged was blooming again, growing, the forest coming back at speeds that only magic could accomplish. Zendikar was healing, turning into something healthier, stronger than it was before the battle with the Eldrazi. Though the scars were still there, they were memories now, not its defining features. For the first time in a long time, Nissa let out a genuine laugh, and she heard Zendikar laughing with her. #emph[Trust my strength] , she thought. Nissa called up her vines and grinned as they grew and twirled underneath her, raising her in the air. She turned east, and moving as quick as the wind, she followed the leylines of the land, flying through the forest, toward Bala Ged, traveling as only she knew how. Rushing forward, ahead, all while Zendikar hummed happily in her ears. 
Nissa was finally home. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) Jace picked up the inert Core and watched Nissa disappear. He considered calling after her but realized that it was no use. Mistakes were made here today and more than a few were his. Now, he understood how Nissa felt after the war in Ravnica. Around him, elementals were standing tall and healthy, swelling with vigor. But one by one, they melted back into the land, or disappeared into the jaddi trees. Something brushed against his boot. Startled, Jace stepped back, looked down. In the dust and wreckage around him, vines and young shoots were sprouting up from the destruction. Thriving and growing at a remarkable rate. #emph[Like a life bloom after the Roil] , he thought. He'd read about life blooms but had never seen one. "Is the power gone?" Nahiri asked, coming up beside him, kicking a vine. It took Jace a moment to realize she was talking about the lightless Core in his hands. "I don't know." "It wasn't hers to use," Nahiri said with disgust. "I think she was precisely the person who should have used this power," replied Jace. Nahiri scowled. "We need to apologize to Nissa," he said. "We were wrong." Nahiri scowled. "You think you can fix this?" she snapped. "With an apology? You made enemies again today, Jace. But that's your nature, isn't it? Whenever you try to do good, it just makes things worse." Jace didn't reply. He didn't try to argue as the ancient kor turned and planeswalked away. He was beginning to realize some battles weren't worth fighting. But some were. #emph[Nissa] , he thought. #emph[I'm so sorry. I should have listened better.] He had caused so much damage here, both to his friend and to the home she loved. And he knew the terrible guilt he felt in this moment would not lessen with time. 
So, as Jace stood in the dust of Zendikar with the dead Core in his hands, with new life clinging and wrapping its tender shoots around his boots, he hoped what Nissa said was true. That broken things could be redeemed.
https://github.com/Gekkio/gb-ctr
https://raw.githubusercontent.com/Gekkio/gb-ctr/main/gbctr.typ
typst
Creative Commons Attribution Share Alike 4.0 International
#import "common.typ": awesome-brands, monotext #let title = [Game Boy: Complete Technical Reference] #let date = datetime.today() #let config = json("config.json") #set document(title: title, author: ("gekkio"), date: date) #set par(justify: true) #set text(font: "Noto Sans") #show figure.where(kind: "register"): set figure.caption(position: top) #show raw: set text(1.25em, font: "Anonymous Pro", fallback: false) #set figure(numbering: (..nums) => [#counter(heading).display((..heading_nums) => heading_nums.pos().at(1))\.#nums.pos().at(0)]) #[ #set align(center) #set par(justify: false) #set page(margin: (x: 2cm, y: 5cm), footer-descent: 0%, footer: [ #link("http://creativecommons.org/licenses/by-sa/4.0/")[ #text(17pt)[ #awesome-brands[\u{f25e}] #awesome-brands[\u{f4e7}] #awesome-brands[\u{f4ef}] ] ]\ This work is licensed under a #link("http://creativecommons.org/licenses/by-sa/4.0/")[Creative Commons Attribution-ShareAlike 4.0 International License]. ]) #image("images/gbctr.svg", width: 5cm) #text(17pt)[#title] gekkio\ #monotext[#link("https://gekkio.fi")] #date.display("[month repr:long] [day padding:none], [year]") Revision #config.revision #if config.draft [\ DRAFT!] ] #set page( margin: (x: 2cm, y: 2cm), footer: [ #set text(10pt) #block(width: 100%)[ #set align(center) #if config.draft [ #place(left, text(style: "italic", [DRAFT! #config.revision])) ] #context { box(counter(page).display()) } ] ] ) #show heading: set block(above: 1.4em, below: 1em) #set heading(numbering: (..nums) => { let level = nums.pos().len() if level <= 2 { none } else { numbering("1.1.1.1.1", ..nums.pos().slice(1)) } }) #include("preface.typ") #pagebreak() #show outline: set heading(outlined: true) #show outline.entry: it => { if it.level == 1 { block(above: 20pt, below: 0pt, strong(it)) } else if it.level == 2 { strong(it) } else { it } } #outline(fill: repeat(" . 
"), indent: n => calc.max(0, n - 1) * 1em) #set heading(numbering: (..nums) => { let level = nums.pos().len() if level == 1 { numbering("I", ..nums) } else if level > 3 { none } else { numbering("1.1.1.1.1", ..nums.pos().slice(1)) } }) #let total-chapters = counter("total-chapters") #counter(heading).update(0) <maincontent> #[ #show heading.where(level: 1): it => [ #pagebreak() #set align(center) #text(21pt)[ #v(1fr) #block("Part " + counter(heading).display()) #block(it.body) #v(1fr) ] #context { let chapters = total-chapters.get().at(0) return counter(heading).update((part) => (part, chapters)) } ] #show heading.where(level: 2): it => [ #pagebreak() #block[ #text(17pt, [Chapter #counter(heading).display()]) ] #text(21pt, it.body) #v(1em) #total-chapters.step() #counter(figure).update(0) #counter(figure.where(kind: table)).update(0) #counter(figure.where(kind: "register")).update(0) ] = Game Boy console architecture #include "chapter/console/intro.typ" #include "chapter/console/clocks.typ" = Sharp SM83 CPU core #include "chapter/cpu/intro.typ" #include "chapter/cpu/simple.typ" #include "chapter/cpu/timing.typ" #include "chapter/cpu/instruction-set.typ" = Game Boy SoC peripherals and features #include "chapter/peripherals/boot-rom.typ" #include "chapter/peripherals/dma.typ" #include "chapter/peripherals/ppu.typ" #include "chapter/peripherals/p1.typ" #include "chapter/peripherals/serial.typ" = Game Boy game cartridges #include "chapter/cartridges/mbc1.typ" #include "chapter/cartridges/mbc2.typ" #include "chapter/cartridges/mbc3.typ" #include "chapter/cartridges/mbc30.typ" #include "chapter/cartridges/mbc5.typ" #include "chapter/cartridges/mbc6.typ" #include "chapter/cartridges/mbc7.typ" #include "chapter/cartridges/huc1.typ" #include "chapter/cartridges/huc3.typ" #include "chapter/cartridges/mmm01.typ" #include "chapter/cartridges/tama5.typ" ] #counter(heading).update(0) #set heading(numbering: (..nums) => { let level = nums.pos().len() if level == 1 { none } else { 
numbering("A.1.1.1.1", ..nums.pos().slice(1)) } }) #set figure(numbering: (..nums) => [#counter(heading).display((..heading_nums) => numbering("A", heading_nums.pos().at(1)))\.#nums.pos().at(0)]) #pagebreak() #text(21pt)[ #set align(center) #v(1fr) = Appendices #v(1fr) ] #show heading.where( level: 2 ): it => block[ #block[ #text(17pt, [Appendix #counter(heading).display()]) ] #text(21pt, it.body) #v(1em) ] #pagebreak() #include "appendix/opcode-tables.typ" #pagebreak() #include "appendix/memory-map.typ" #pagebreak() #include "appendix/external-bus.typ" #pagebreak() #include "appendix/pinouts.typ" #pagebreak() #bibliography("gbctr.yml")
https://github.com/Duarte0903/resume
https://raw.githubusercontent.com/Duarte0903/resume/main/template/coverletter.typ
typst
Other
#import "@preview/modern-cv:0.4.0": * #show: coverletter.with( author: ( firstname: "John", lastname: "Smith", email: "<EMAIL>", phone: "(+1) 111-111-1111", github: "DeveloperPaul123", linkedin: "<NAME>", address: "111 Example St. Apt. 111, Example City, EX 11111", positions: ( "Software Engineer", "Full Stack Developer", ), ), profile-picture: image("./profile.png"), language: "en", ) #hiring-entity-info(entity-info: ( target: "Company Recruitment Team", name: "Google, Inc.", street-address: "1600 AMPHITHEATRE PARKWAY", city: "MOUNTAIN VIEW, CA 94043", )) #letter-heading( job-position: "Software Engineer", addressee: "Sir or Madam", ) = About Me #coverletter-content[ #lorem(80) ] = Why Google? #coverletter-content[ #lorem(90) ] = Why Me? #coverletter-content[ #lorem(100) ]
https://github.com/ClazyChen/Table-Tennis-Rankings
https://raw.githubusercontent.com/ClazyChen/Table-Tennis-Rankings/main/history_CN/2011/MS-12.typ
typst
#set text(font: ("Courier New", "NSimSun")) #figure( caption: "Men's Singles (1 - 32)", table( columns: 4, [排名], [运动员], [国家/地区], [积分], [1], [马龙], [CHN], [3430], [2], [张继科], [CHN], [3293], [3], [蒂姆 波尔], [GER], [3161], [4], [王皓], [CHN], [3159], [5], [许昕], [CHN], [3123], [6], [王励勤], [CHN], [3111], [7], [马琳], [CHN], [3039], [8], [陈玘], [CHN], [3005], [9], [郝帅], [CHN], [2952], [10], [朱世赫], [KOR], [2931], [11], [迪米特里 奥恰洛夫], [GER], [2916], [12], [柳承敏], [KOR], [2886], [13], [吴尚垠], [KOR], [2869], [14], [SKACHKOV Kirill], [RUS], [2856], [15], [庄智渊], [TPE], [2855], [16], [水谷隼], [JPN], [2842], [17], [博扬 托基奇], [SLO], [2832], [18], [帕特里克 鲍姆], [GER], [2830], [19], [李尚洙], [KOR], [2825], [20], [高宁], [SGP], [2817], [21], [MATTENET Adrien], [FRA], [2813], [22], [闫安], [CHN], [2812], [23], [帕纳吉奥迪斯 吉奥尼斯], [GRE], [2798], [24], [米凯尔 梅兹], [DEN], [2783], [25], [LIVENTSOV Alexey], [RUS], [2782], [26], [岸川圣也], [JPN], [2773], [27], [弗拉基米尔 萨姆索诺夫], [BLR], [2747], [28], [SEO Hyundeok], [KOR], [2745], [29], [金珉锡], [KOR], [2733], [30], [林高远], [CHN], [2730], [31], [蒂亚戈 阿波罗尼亚], [POR], [2716], [32], [罗伯特 加尔多斯], [AUT], [2713], ) )#pagebreak() #set text(font: ("Courier New", "NSimSun")) #figure( caption: "Men's Singles (33 - 64)", table( columns: 4, [排名], [运动员], [国家/地区], [积分], [33], [巴斯蒂安 斯蒂格], [GER], [2705], [34], [丹羽孝希], [JPN], [2701], [35], [吉田海伟], [JPN], [2693], [36], [李廷佑], [KOR], [2687], [37], [克里斯蒂安 苏斯], [GER], [2682], [38], [KARAKASEVIC Aleksandar], [SRB], [2681], [39], [TAKAKIWA Taku], [JPN], [2672], [40], [维尔纳 施拉格], [AUT], [2669], [41], [#text(gray, "高礼泽")], [HKG], [2668], [42], [MONTEIRO Joao], [POR], [2667], [43], [GERELL Par], [SWE], [2666], [44], [阿德里安 克里桑], [ROU], [2654], [45], [RUBTSOV Igor], [RUS], [2653], [46], [CHO Eonrae], [KOR], [2652], [47], [陈建安], [TPE], [2650], [48], [马克斯 弗雷塔斯], [POR], [2640], [49], [阿列克谢 斯米尔诺夫], [RUS], [2633], [50], [利亚姆 皮切福德], [ENG], [2620], [51], [约尔根 佩尔森], [SWE], [2614], [52], [侯英超], [CHN], [2610], [53], [詹斯 伦德奎斯特], [SWE], [2606], [54], [TAN Ruiwu], [CRO], 
[2605], [55], [CHEN Feng], [SGP], [2597], [56], [WANG Zengyi], [POL], [2596], [57], [张一博], [JPN], [2594], [58], [松平健太], [JPN], [2592], [59], [佐兰 普里莫拉克], [CRO], [2589], [60], [艾曼纽 莱贝松], [FRA], [2584], [61], [陈卫星], [AUT], [2584], [62], [MATSUDAIRA Kenji], [JPN], [2581], [63], [SUCH Bartosz], [POL], [2580], [64], [VANG Bora], [TUR], [2580], ) )#pagebreak() #set text(font: ("Courier New", "NSimSun")) #figure( caption: "Men's Singles (65 - 96)", table( columns: 4, [排名], [运动员], [国家/地区], [积分], [65], [卡林尼科斯 格林卡], [GRE], [2580], [66], [YIN Hang], [CHN], [2577], [67], [诺沙迪 阿拉米扬], [IRI], [2575], [68], [JANG Song Man], [PRK], [2575], [69], [LEUNG Chu Yan], [HKG], [2567], [70], [PROKOPCOV Dmitrij], [CZE], [2564], [71], [HE Zhiwen], [ESP], [2563], [72], [#text(gray, "SONG Hongyuan")], [CHN], [2563], [73], [吉村真晴], [JPN], [2559], [74], [LI Ahmet], [TUR], [2559], [75], [卢文 菲鲁斯], [GER], [2558], [76], [让 米歇尔 赛弗], [BEL], [2552], [77], [KOSIBA Daniel], [HUN], [2546], [78], [ZHAN Jian], [SGP], [2544], [79], [丁祥恩], [KOR], [2543], [80], [HABESOHN Daniel], [AUT], [2542], [81], [西蒙 高兹], [FRA], [2540], [82], [江天一], [HKG], [2540], [83], [JAKAB Janos], [HUN], [2535], [84], [LI Ping], [QAT], [2531], [85], [沙拉特 卡马尔 阿昌塔], [IND], [2529], [86], [LIN Ju], [DOM], [2529], [87], [上田仁], [JPN], [2523], [88], [<NAME>], [SVK], [2522], [89], [YANG Zi], [SGP], [2521], [90], [张钰], [HKG], [2518], [91], [斯特凡 菲格尔], [AUT], [2518], [92], [HUNG Tzu-Hsiang], [TPE], [2518], [93], [安德烈 加奇尼], [CRO], [2518], [94], [郑荣植], [KOR], [2516], [95], [SIMONCIK Josef], [CZE], [2509], [96], [FEJER-KONNERTH Zoltan], [GER], [2504], ) )#pagebreak() #set text(font: ("Courier New", "NSimSun")) #figure( caption: "Men's Singles (97 - 128)", table( columns: 4, [排名], [运动员], [国家/地区], [积分], [97], [SHIBAEV Alexander], [RUS], [2501], [98], [KIM Junghoon], [KOR], [2500], [99], [<NAME>], [BRA], [2499], [100], [<NAME>], [JPN], [2497], [101], [<NAME>], [DOM], [2493], [102], [<NAME>], [SVK], [2491], [103], [KOSOWSKI Jakub], [POL], [2488], [104], 
[<NAME>], [ESP], [2488], [105], [<NAME>], [SGP], [2488], [106], [<NAME>], [ARG], [2487], [107], [<NAME>], [MEX], [2483], [108], [<NAME>], [BLR], [2482], [109], [奥马尔 阿萨尔], [EGY], [2481], [110], [<NAME>], [RUS], [2475], [111], [雅罗斯列夫 扎姆登科], [UKR], [2475], [112], [唐鹏], [HKG], [2471], [113], [TSUBOI Gustavo], [BRA], [2462], [114], [SVENSSON Robert], [SWE], [2462], [115], [<NAME>], [POL], [2456], [116], [<NAME>], [FRA], [2455], [117], [<NAME>], [ROU], [2451], [118], [<NAME>], [CZE], [2450], [119], [尹在荣], [KOR], [2447], [120], [PLATONOV Pavel], [BLR], [2445], [121], [李静], [HKG], [2445], [122], [<NAME>], [UKR], [2441], [123], [<NAME>], [ESP], [2434], [124], [BLASZCZYK Lucjan], [POL], [2433], [125], [#text(gray, "<NAME>")], [PRK], [2432], [126], [<NAME>], [CHN], [2430], [127], [<NAME>], [ESP], [2429], [128], [OYA Hidetoshi], [JPN], [2428], ) )
https://github.com/EgorGorshen/scripts-for-typst
https://raw.githubusercontent.com/EgorGorshen/scripts-for-typst/main/gause-algo.typ
typst
MIT License
#import "@preview/pyrunner:0.1.0" as py #let compiled = py.compile( ```python def gaussian_elimination(matrix): from fractions import Fraction rows = len(matrix) cols = len(matrix[0]) matrix = [[Fraction(element) for element in row] for row in matrix] for i in range(min(rows, cols)): max_row_index = i for k in range(i, rows): if abs(matrix[k][i]) > abs(matrix[max_row_index][i]): max_row_index = k if max_row_index != i: matrix[i], matrix[max_row_index] = matrix[max_row_index], matrix[i] yield matrix, {"method": "swap", "args": [i+1, max_row_index+1]} pivot = matrix[i][i] if pivot != 0: matrix[i] = [element / pivot for element in matrix[i]] yield matrix, {"method": "normalize", "args": [i+1, str(pivot)]} for j in range(rows): if j != i and matrix[j][i] != 0: factor = matrix[j][i] matrix[j] = [matrix[j][col] - factor * matrix[i][col] for col in range(cols)] yield matrix, {"method": "zero_elem", "args": [j + 1, i + 1, factor]} yield matrix, {"method": "done", "args": None} def matrix_to_typst(matrix) -> str: return "mat(\n\t\t" + ';\n\t\t'.join(list(map(lambda line: ', '.join(map(str, line)), matrix))) + "\n\t)" def gaussian_elimination_solution(matrix, name="A"): matrix_pair = [] was = matrix[::] for step, transformation in gaussian_elimination(matrix): res = '' from_ = matrix_to_typst(was) to = matrix_to_typst(step) args = transformation['args'] method = transformation['method'] if was == step and method != 'done': continue if method == "swap": res = from_ + f" |=>^({args[0]} <-> {args[1]}) " + to elif method == "normalize": res = from_ + f" |=>^({name}_({args[0]}) / ({args[1]})) " + to elif method == "zero_elem": res = from_ + f" |=>^({name}_({args[0]}) {'-' if args[2] >= 0 else '+'} {abs(args[2])} dot {name}_({args[1]})) " + to elif method == "done": res = to res = '$\n' + res + '\n\t$' matrix_pair.append(res) was = step[::] return '+ ' + '\n\n+ '.join(matrix_pair), was ```, ) #let gaussian_method_result(matrix, name: "A") = { let (text, result_mat) = 
py.call(compiled, "gaussian_elimination_solution", matrix, name) result_mat } #let gaussian_method_print(matrix, name: "A") = { let (text, result_mat) = py.call(compiled, "gaussian_elimination_solution", matrix, name) eval(text, mode: "markup") }
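A brief usage sketch for the two helpers defined above. The matrix values here are made-up illustrations, not taken from the original repository:

```typ
// Hypothetical usage of the helpers above: render the step-by-step
// elimination of an augmented 2x3 matrix, labelling its rows "A".
#gaussian_method_print(((2, 1, 5), (1, -1, 1)), name: "A")

// Or compute only the reduced matrix, without rendering the steps.
#let reduced = gaussian_method_result(((2, 1, 5), (1, -1, 1)))
```

Note that both helpers call the same compiled Python function; `gaussian_method_print` additionally evaluates the returned Typst markup so the enumerated steps appear in the document.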
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/lovelace/0.1.0/README.md
markdown
Apache License 2.0
# Lovelace This is a package for writing pseudocode in [Typst](https://typst.app/). It is named after the computer science pioneer [Ada Lovelace](https://en.wikipedia.org/wiki/Ada_Lovelace) and inspired by the [pseudo package](https://ctan.org/pkg/pseudo) for LaTeX. ![GitHub license](https://img.shields.io/github/license/andreasKroepelin/lovelace) ![GitHub release (latest by date)](https://img.shields.io/github/v/release/andreasKroepelin/lovelace) ![GitHub Repo stars](https://img.shields.io/github/stars/andreasKroepelin/lovelace) ## Usage Import the package using ```typ #import "@preview/lovelace:0.1.0": * ``` You should then call the setup function in a show rule _at the top of your document_: ```typ #show: setup-lovelace ``` You are then ready to go to typeset some pseudocode: ```typ #pseudocode( no-number, [*input:* integers $a$ and $b$], no-number, [*output:* greatest common divisor of $a$ and $b$], [*while* $a != b$ *do*], ind, [*if* $a > b$ *then*], ind, $a <- a - b$, ded, [*else*], ind, $b <- b - a$, ded, [*end*], ded, [*end*], [*return* $a$] ) ``` resulting in: ![euclid](examples/euclid.png) As you can see, every line of your pseudocode is represented by a single content argument. Additionally, we use `ind` and `ded` to control the indentation level: `ind` (indent) to go one level deeper, `ded` (dedent) to go one level back. Don't forget to put all the commas in between! The content of your pseudocode is up to you. This package does not assume any specific set of keywords or language constructs. For example, you might want to write something like ```typ #pseudocode( $x <- a$, [*repeat until convergence*], ind, $x <- (x + a/x) / 2$, ded, [*return* $x$] ) ``` ![custom-keywords](examples/custom-keywords.png) for some more abstract, less implementation concerned pseudocode that follows your own convention, most suitable to you. 
There are two other elements you can use as positional arguments to `#pseudocode`: `no-number` makes the next line have no line number (and also not be counted). This is useful for things like input and output (as seen above) or to introduce an empty line (i.e., you add `no-number, []` to the arguments). ### Referencing lines Finally, you can put labels there. They will be attached to the line number of the following line and can be used to reference that line later: ```typ #pseudocode( <line:eat>, [Eat], [Train], <line:sleep>, [Sleep], [*goto* @line:eat] ) @line:sleep is of particular importance. ``` ![goto](examples/goto.png) ### Algorithm as figure `#pseudocode` is great if you just want to show some lines of code. If you want to display a full algorithm with bells and whistles, you can use `#algorithm`: ```typ #algorithm( caption: [The Euclidean algorithm], pseudocode( no-number, [*input:* integers $a$ and $b$], no-number, [*output:* greatest common divisor of $a$ and $b$], [*while* $a != b$ *do*], ind, [*if* $a > b$ *then*], ind, $a <- a - b$, ded, [*else*], ind, $b <- b - a$, ded, [*end*], ded, [*end*], [*return* $a$] ) ) ``` resulting in: ![euclid-algorithm](examples/euclid-algorithm.png) `#algorithm` creates a figure with `kind: "lovelace"` so it gets its own counter and display. You can use optional arguments such as `placement` or `caption`, see [figure in the Typst docs](https://typst.app/docs/reference/meta/figure/#parameters). Note that such figures are only displayed correctly when you use the setup function mentioned above! ### Comments Again, the content of your pseudocode is completely up to you, and that includes comments. However, Lovelace provides a sensible `#comment` function you can use: ```typ #pseudocode( [A statement #comment[and a comment]], [Another statement #comment[and another comment]], ) ``` ![comment](examples/comment.png) ### Customisation Lovelace provides a couple of customisation options. 
First, the `pseudocode` function accepts optional keyword arguments: - `line-numbering`: `true` or `false`, whether to display line numbers, default `true` - `line-number-transform`: a function that takes in the line number as an integer and returns an arbitrary value that will be displayed instead of the line number, default `num => num` (identity function) - `indentation-guide-stroke`: a [stroke](https://typst.app/docs/reference/visualize/line/#parameters-stroke), defining how the indentation guides are displayed, default `none` (no lines) For example, let's use thin blue indentation guides and roman line numbering: ```typ #pseudocode( line-number-transform: num => numbering("i", num), indentation-guide-stroke: .5pt + aqua, no-number, [*input:* integers $a$ and $b$], no-number, [*output:* greatest common divisor of $a$ and $b$], [*while* $a != b$ *do*], ind, [*if* $a > b$ *then*], ind, $a <- a - b$, ded, [*else*], ind, $b <- b - a$, ded, [*end*], ded, [*end*], [*return* $a$] ) ``` resulting in: ![euclid-modified](examples/euclid-modified.png) Also, there are some optional arguments to `setup-lovelace`: - `line-number-style`: a function that takes content and returns content, used to display the line numbers in the pseudocode, default `text.with(size: .7em)`, note that this is different from the `line-number-transform` argument to `#pseudocode` as the latter has an effect on line numbers in references as well. - `line-number-supplement`: some content that is placed before the line number when referencing it, default `"Line"` If you want to avoid having to repeat all those configurations, here is what you can do. Suppose we always want German supplements (Zeile and Algorithmus instead of Line and Algorithm) and thin indentation guides. 
```typ #show: setup-lovelace.with(line-number-supplement: "Zeile") #let pseudocode = pseudocode.with(indentation-guide-stroke: .5pt) #let algorithm = algorithm.with(supplement: "Algorithmus") #algorithm( caption: [Spurwechsel nach links auf der Autobahn], pseudocode( <line:blinken>, [Links blinken], [In den linken Außenspiegel schauen], [*wenn* niemand nähert sich auf der linken Spur, *dann*], ind, [Spur wechseln], ded, [Blinker aus], ) ) Der Schritt in @line:blinken stellt offenbar für viele Verkehrsteilnehmer eine Herausforderung dar. ``` ![autobahn](examples/autobahn.png)
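As one more sketch, here is how the `line-number-style` option described above might be used; this example is not from the upstream gallery and only assumes the `setup-lovelace` signature documented in this README:

```typ
// Make line numbers slightly larger and italic via setup-lovelace.
// The function receives the rendered line-number content.
#show: setup-lovelace.with(
  line-number-style: num => text(size: .9em, style: "italic", num),
)

#pseudocode(
  [Read input],
  [Process it],
  [Write output],
)
```

Because `line-number-style` only changes how numbers are displayed inside the code listing, references created via labels still use the plain number (optionally transformed by `line-number-transform`).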
https://github.com/Toniolo-Marco/git-for-dummies
https://raw.githubusercontent.com/Toniolo-Marco/git-for-dummies/main/slides/animations/remote-example.typ
typst
#import "@preview/touying:0.5.2": * #import themes.university: * #import "@preview/cetz:0.2.2" #import "@preview/fletcher:0.5.1" as fletcher: node, edge #import "@preview/ctheorems:1.1.2": * #import "@preview/numbly:0.1.0": numbly #import "/slides/components/code-blocks.typ": code-block, window-titlebar #import "/slides/components/utils.typ": n_pagebreak #import "/slides/components/git-graph.typ": branch_indicator, commit_node, connect_nodes, branch #let fletcher-diagram = touying-reducer.with(reduce: fletcher.diagram, cover: fletcher.hide) #slide(repeat:6, self => [ #let (uncover, only, alternatives) = utils.methods(self) #only("1")[ Let's see a complete example: #align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch_indicator("my-fork/main", (4.5,1.5), blue), branch_indicator("origin/main", (0.75,0.5), blue), branch( // main branch name:"main", indicator-xy: (5.75,0.5), color:blue, start:(0,1), length:6, head: 5, commits:("","","",none,"","",) ), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ] We developed two different features, merged the _feature-2_ locally and pushed it on the fork. Next we completed _feature-1_ and merged it. ] #only("2")[ At this point we apply everything we have seen in this chapter: moving to the _main_ branch we push to our fork with the command: `git push my-fork`. 
#align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch_indicator("origin/main", (0.75,0.5), blue), branch( // main branch name:"main ", remote:"my-fork ", indicator-xy: (5.75,0.5), color:blue, start:(0,1), length:6, head: 5, commits:("","","",none,"","",) ), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ] Now we can proceed with our PR, choosing as is common to stay on our _main_ branch when we run the `gh pr create` command; that way the PR will come from that one. #footnote([You cannot have multiple open PRs coming from the same branch of the same fork.]) ] #only("3")[ Once the request is accepted, we can run the `git fetch origin` command to find out the most recent changes on the remote origin and we will be in this state: #align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch( // remote origin name:"origin/main", indicator-xy: (6,-0.5), color:lime, start:(0,-1), length:7, commits:("",none,none,none,none,none,"merge pr"), angle: 0deg ), connect_nodes((1,-1),(2,1),blue), branch( // main branch name:"main" , remote:"my-fork ", indicator-xy: (5.75,0.5), color:blue, start:(1,1), length:5, head: 4, commits:("","",none,"","") ), connect_nodes((6,1),(7,-1),blue,bend:-25deg), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, 
), connect_nodes((5,2),(6,1),teal), ) ] ] ] #only("4")[ This kind of graph is pretty normal, if we analyze it, we notice that _origin/main_ has as its first commit the last commit in common and as its last commit the merge commit. Fortunately for us, the project maintainers had already performed this merge. All that remains *now* is to *synchronize our fork and our local repository*. Both the web interface and the gh tool allow us to synchronize a branch of our fork with the most recent version of the original remote. The command to do this is: `gh repo sync owner/cli-fork -b BRANCH-NAME`@gh-sync. In our case the `BRANCH-NAME` will obviously be _main_. ] #only("5")[ To continue the example one step at a time and make sure that everything went as we expected, we can run `git fetch` again: #align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch_indicator("my-fork/main", (6,-0.75), lime), branch( // remote origin name:"origin/main", indicator-xy: (6,-0.45), color:lime, start:(0,-1), length:7, commits:("",none,none,none,none,none,"merge pr"), angle: 0deg ), connect_nodes((1,-1),(2,1),blue), branch( // main branch name:"main", indicator-xy: (5.75,0.5), color:blue, start:(1,1), length:5, head: 4, commits:("","",none,"","") ), connect_nodes((6,1),(7,-1),blue,bend:-25deg), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ] ] #only("6")[ If everything went as we expect the last thing left to do is to update the _main_ branch locally with `git pull` if we are on it, otherwise specifying branch and remote. 
#align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch( // remote origin name:"main", remote:("origin ","my-fork "), indicator-xy: (6,-0.5), color:lime, start:(0,-0.75), length:7, head: 6, commits:("",none,none,none,none,none,"merge pr"), angle: 0deg ), connect_nodes((1,-0.75),(2,1),blue), branch( // main branch name:"", indicator-xy: (5.75,0.5), color:blue, start:(1,1), length:5, commits:("","",none,"","") ), connect_nodes((6,1),(7,-0.75),blue,bend:-25deg), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ] ] ])
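The fork-and-PR workflow these slides walk through can be condensed into a single command sequence. The remotes `origin` and `my-fork` and the repository name `owner/cli-fork` are the slides' own examples and are assumed to be configured already:

```sh
# 1. Push the locally merged work to the fork
git push my-fork

# 2. Open a pull request from the fork's main branch (GitHub CLI)
gh pr create

# 3. After the PR is merged upstream, fetch the new commits
git fetch origin

# 4. Sync the fork's main branch with the upstream repository
gh repo sync owner/cli-fork -b main

# 5. Finally, update the local main branch
git checkout main
git pull
```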
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/rivet/0.1.0/gallery/example1.typ
typst
Apache License 2.0
#import "../src/lib.typ": schema, config #let example = schema.load("/gallery/example1.yaml") #schema.render(example, config: config.config( full-page: true ))
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/fauve-cdb/0.1.0/src/lib.typ
typst
Apache License 2.0
#import "cover-bg.typ": cover-bg #import "abstracts-bg.typ": abstracts-bg #import "utils.typ": balanced-cols #let school-color-recto = blue #let school-color-verso = rgb("0054a0") #let cover( title-en: "", title-fr: "", author: "", affiliation: "", defense-place: "", defense-date: "", jury-content: [], ) = { set page( margin: (left: 20mm, right: 25mm, top: 30mm, bottom: 30mm), numbering: none, ) set text(font: "<NAME>", fill: black) place(dx: -20mm, dy: 30mm, cover-bg(school-color-recto)) place(dx: 100mm, dy: -15mm, image("/assets/UR.png", width: 6cm)) place(dx: 0mm, dy: -15mm, image("/assets/logo.png", width: 7.5cm)) v(2.1cm) text(size: 2em, smallcaps[Thèse de doctorat de]) v(2.25cm) set text(fill: white) text(size: 1.5em, smallcaps[l'Université de Rennes]) v(.01cm) text(size: 1.2em)[ #smallcaps[École Doctorale N° 601] \ _Mathématiques, Télécommunications, Informatique, \ Signal, Systèmes, Électronique_ \ Spécialité : Informatique \ #v(.1cm) #h(.6cm) Par \ ] v(0em) h(.6cm) text(size: 1.9em)[*#author* \ ] v(.1cm) // Add a blue background with the width of the page context { let y-start = locate(<cover:title-en>).position().y - .5cm let y-end = locate(<cover:defense-info>).position().y + measure(query(<cover:defense-info>).first()).height + .5cm let height = 5em place( top + left, dy: y-start - page.margin.top, dx: -page.margin.left, float: false, block(width: page.width, height: y-end - y-start, fill: school-color-recto), ) } // Title + defense info block text(size: 1.6em)[*#title-en*<cover:title-en>] parbreak() text(size: 1.4em, title-fr) parbreak() text(size: 1.1em)[ *Thèse présentée et soutenue à #defense-place, le #defense-date* \ *Unité de recherche : #affiliation* <cover:defense-info> ] set text(fill: black) v(.5em) jury-content } #let abstracts( title-fr: "", keywords-fr: "", abstract-fr: [], title-en: "", keywords-en: "", abstract-en: [], ) = { set page( margin: (left: 20mm, right: 30mm, top: 30mm, bottom: 30mm), numbering: none, header: none, ) set 
text(font: "<NAME>", fill: black) pagebreak() pagebreak() place(dx: -20mm, dy: -65mm, abstracts-bg(school-color-verso)) place(dx: 100mm, dy: -15mm, image("/assets/UR.png", width: 6cm)) place(dx: 0mm, dy: -15mm, image("/assets/logo.png", width: 7.5cm)) v(2.7cm) line(length: 100%, stroke: .2cm + school-color-verso) v(.4cm) [ #text(school-color-verso)[*Titre :*] #title-fr *Mots clés :* #keywords-fr ] balanced-cols(2, gutter: 11pt)[*Résumé :* #abstract-fr] v(1cm) line(length: 100%, stroke: .2cm + school-color-verso) v(.4cm) [ #text(school-color-verso)[*Title:*] #title-en *Keywords:* #keywords-en ] balanced-cols(2, gutter: 11pt)[*Abstract:* #abstract-en] } #let matisse-thesis( jury-content: [], author: "", affiliation: "", title-en: "", title-fr: "", keywords-fr: "", keywords-en: "", abstract-en: [], abstract-fr: [], acknowledgements: [], defense-place: "", defense-date: "", draft: true, body, ) = { let draft-string = "" if draft { draft-string = "DRAFT - " } set document(author: author, title: draft-string + title-en) set heading(numbering: "1.") set page( "a4", numbering: (..numbers) => text( font: "New Computer Modern", size: 4.5mm, numbering("1", numbers.pos().at(0)), ), number-align: center, ) set par(justify: true) cover( title-en: draft-string + title-en, title-fr: draft-string + title-fr, author: author, affiliation: affiliation, defense-place: defense-place, defense-date: defense-date, jury-content: jury-content, ) set text(font: "New Computer Modern", fill: black) set page( margin: (outside: 20mm, inside: 30mm, top: 50mm, bottom: 50mm), header: context { // get the page number let i = counter(page).get().first() // if the page starts a chapter, display nothing let all-chapters = query(heading.where(level: 1)) if all-chapters.any(it => it.location().page() == i) { return } // if the page is odd, display the chapter if calc.odd(i) { let chapter-stack = query( selector(heading.where(level: 1)).before(here()), ) if chapter-stack != () { let last-chapter = 
chapter-stack.last() let title = last-chapter.body let nb = counter(heading).at(last-chapter.location()).first() text(0.35cm)[Chapter #nb -- _ #title _] //chapter-stack.first() } } // if the page is even, display the section if calc.even(i) { let chapter-stack = query( selector(heading.where(level: 2)).before(here()), ) if chapter-stack != () { let last-section = chapter-stack.last() let title = last-section.body let nb = counter(heading).at(last-section.location()).map(it => str(it)).join(".") align(right, text(0.35cm)[_ #nb. #title _]) //chapter-stack.first() } } // horizontal rule v(-.3cm) line(length: 100%, stroke: .2mm) }, ) // chapters show heading.where(level: 1): it => { // always start on odd pages // pagebreak(to: "odd") // if chaptering is enabled, display chapter number set align(right) v(-.8cm) if it.numbering != none { context text( smallcaps[Chapter #counter(heading).get().first() \ ], size: .45cm, weight: "regular", font: "New Computer Modern", ) v(0cm) } // chapter name text(smallcaps(it.body), font: "TeX Gyre Heros", size: .9cm) set align(left) // horizontal rule v(.7cm) line(length: 100%, stroke: .2mm) v(.7cm) } // table of contents show outline.entry.where(level: 1): it => { v(5mm, weak: true) strong(it) } // footnotes show footnote.entry: it => { let loc = it.note.location() numbering( "1. ", ..counter(footnote).at(loc), ) it.note.body } // show page number context counter(page).update(here().page()) body abstracts( title-fr: title-fr, keywords-fr: keywords-fr, abstract-fr: abstract-fr, title-en: title-en, keywords-en: keywords-en, abstract-en: abstract-en, ) }
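// A minimal usage sketch (an assumption, not part of this file: it supposes
// the template is importable as `lib.typ`; the argument names come from the
// `matisse-thesis` definition above, and all strings are placeholders):
//
// #import "lib.typ": matisse-thesis
//
// #show: matisse-thesis.with(
//   author: "Jane Doe",
//   title-en: "A Placeholder Title",
//   title-fr: "Un titre provisoire",
//   defense-place: "Rennes",
//   defense-date: "1 January 2025",
//   draft: true,
// )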
https://github.com/FkHiroki/ex-D2
https://raw.githubusercontent.com/FkHiroki/ex-D2/main/libs/mscs/lib.typ
typst
MIT No Attribution
// Workaround for the lack of an `std` scope. #let std-bibliography = bibliography #let mscs( title: [タイトル], authors: [著者], etitle: "", eauthors: "", abstract: none, keywords: (), bibliography: none, body ) = { // Set document metadata. set document(title: title) // Set the Fonts let gothic = ("MS PGothic", "Hiragino Kaku Gothic Pro", "IPAexGothic", "Noto Sans CJK JP") let mincho = ("MS PMincho", "Hiragino Mincho Pro", "IPAexMincho", "Noto Serif CJK JP") let english = ("Times New Roman", "New Computer Modern") // Configure the page. set page( paper: "a4", margin: (top: 20mm, bottom: 27mm, x: 20mm) ) set text(size: 10pt, font: mincho) // show regex("[0-9a-zA-Z]"): set text(font: "New Computer Modern Math") set par(leading: 0.55em, first-line-indent: 1em, justify: true) show par: set block(spacing: 0.55em) // Configure equation numbering and spacing. set math.equation(numbering: "(1)") show math.equation: set block(spacing: 0.55em) // Configure appearance of equation references show ref: it => { if it.element != none and it.element.func() == math.equation { // Override equation references. link(it.element.location(), numbering( it.element.numbering, ..counter(math.equation).at(it.element.location()) )) } else { // Other references as usual. it } } // Configure lists. set enum(indent: 10pt, body-indent: 9pt) set list(indent: 10pt, body-indent: 9pt) // Configure headings. set heading(numbering: "1.") show heading: it => locate(loc => { // Find out the final number of the heading counter. let levels = counter(heading).at(loc) let deepest = if levels != () { levels.last() } else { 1 } if it.level == 1 [ // First-level headings are centered smallcaps. // We don't want to number of the acknowledgment section. 
#set par(first-line-indent: 0pt) #let is-ack = it.body in ([謝辞], [Acknowledgment], [Acknowledgement]) #set text(if is-ack { 11pt } else { 11pt }, font: gothic) #v(20pt, weak: true) #if it.numbering != none and not is-ack { numbering("1.", ..levels) h(8pt, weak: true) } #it.body #v(13.75pt, weak: true) ] else [ // The other level headings are run-ins. #set par(first-line-indent: 0pt) #set text(10pt, weight: 400) #v(10pt, weak: true) #if it.numbering != none { numbering("1.", ..levels) h(8pt, weak: true) } #it.body #v(10pt, weak: true) ] }) // Configure figures. show figure.where(kind: table): set figure(placement: top, supplement: [Table]) show figure.where(kind: table): set figure.caption(position: top, separator: [: ]) show figure.where(kind: image): set figure(placement: top, supplement: [Fig.]) show figure.where(kind: image): set figure.caption(position: bottom, separator: [: ]) // Display the paper's title. align(center, text(16pt, title, weight: "bold", font: gothic)) v(18pt, weak: true) // Display the authors list. align(center, text(12pt, authors, font: mincho)) v(1.5em, weak: true) // Display the paper's title in English. align(center, text(12pt, etitle, weight: "bold", font: english)) v(1.5em, weak: true) // Display the authors list in English. align(center, text(12pt, eauthors, font: english)) v(1.5em, weak: true) // Display abstract and index terms. if abstract != none { grid( columns: (0.7cm, 1fr, 0.7cm), [], [ #set text(10pt, font: english) #set par(first-line-indent: 0pt) *Abstract--* #h(0.5em) #abstract #v(1em) *Key Words:* #keywords.join(", ") ], [] ) v(1em, weak: false) } // Start two column mode and configure paragraph properties. show: columns.with(2, gutter: 8mm) // Display the paper's contents. body // Display bibliography. if bibliography != none { show std-bibliography: set text(9pt) show regex("[0-9a-zA-Z]"): set text(font: english) set std-bibliography(title: align(center, text(11pt)[参 考 文 献]), style: "sice.csl") bibliography } }
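// A minimal usage sketch (an assumption, not part of this file: it supposes
// the template is importable as `lib.typ`; the argument names come from the
// `mscs` definition above, and all strings are placeholders):
//
// #import "lib.typ": mscs
//
// #show: mscs.with(
//   title: [A Placeholder Title],
//   authors: [A. Author],
//   etitle: "A Placeholder Title",
//   eauthors: "A. Author",
//   abstract: [A short abstract.],
//   keywords: ("keyword one", "keyword two"),
// )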
https://github.com/SillyFreak/typst-packages-old
https://raw.githubusercontent.com/SillyFreak/typst-packages-old/main/scrutinize/gallery/small-example.typ
typst
MIT License
#import "@preview/scrutinize:0.2.0": grading, question, questions // #import "../src/lib.typ" as scrutinize: grading, question, questions #import question: q #import questions: free-text-answer, single-choice, multiple-choice, with-solution // make the PDF reproducible to ease version control #set document(date: none) // toggle this comment or pass `--input solution=true` to produce a sample solution // #questions.solution.update(true) #set table(stroke: 0.5pt) #context [ #let total = grading.total-points(question.all()) The candidate achieved #h(3em) out of #total points. ] = Instructions #with-solution(true)[ Use a pen. For multiple choice questions, make a cross in the box, such as in this example: #pad(x: 5%)[ Which of these numbers are prime? #multiple-choice( (([1], false), ([2], true), ([3], true), ([4], false), ([5], true)), ) ] ] #show heading: it => [ #it.body #h(1fr) / #question.current().points ] #q(points: 2)[ = Question 1 Write an answer. #free-text-answer(height: 4cm)[ An answer ] ] #q(points: 1)[ = Question 2 Select the largest number: #single-choice( ([5], [20], [25], [10], [15]), 2, // 0-based index ) ]
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/visualize/shape-ellipse_00.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Default ellipse. #ellipse()
https://github.com/noahjutz/AD
https://raw.githubusercontent.com/noahjutz/AD/main/notizen/sortieralgorithmen/mergesort/recursion.typ
typst
#import "@preview/cetz:0.2.2" #import "/config.typ": theme #let row( nums, is_complete: false ) = table( columns: nums.len(), align: center + horizon, fill: if is_complete {theme.success_trans}, ..nums.map(n => str(n)) ) #let split(nums) = { let n = nums.len() let m = calc.div-euclid(n, 2) if nums.len() == 1 { return nums } let a1 = nums.slice(0, m) let a2 = nums.slice(m, n) return (nums, a1, a2) } #let deepmap(tuple) = { let l = () for x in tuple { if type(x.at(0)) == array { l.push(deepmap(x)) } else { l.push(row(x)) } } return l } #let mergesort_recursion(nums, spacing: 8pt) = { if nums.len() == 1 { return row(nums) } let (all, l, r) = split(nums) set block(breakable: false) stack( dir: ttb, spacing: 4pt, row(all), stack( dir: ltr, spacing: spacing, mergesort_recursion(l), mergesort_recursion(r), ), row(all.sorted(), is_complete: true) ) }
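// A minimal usage sketch (assumptions: this file is saved as
// `recursion.typ`, and the input is an array of comparable values as
// expected by the functions above):
//
// #import "recursion.typ": mergesort_recursion
// #mergesort_recursion((38, 27, 43, 3), spacing: 12pt)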
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/embiggen/0.0.1/README.md
markdown
Apache License 2.0
# embiggen Get LaTeX-like delimiter sizing in Typst! ## Usage ```typst #import "@preview/embiggen:0.0.1": * = embiggen Here's an equation of sorts: $ {lr(1/2x^2|)^(x=n)_(x=0) + (2x+3)} $ And here are some bigger versions of it: $ {big(1/2x^2|)^(x=n)_(x=0) + big((2x+3))} $ $ {Big(1/2x^2|)^(x=n)_(x=0) + Big((2x+3))} $ $ {bigg(1/2x^2|)^(x=n)_(x=0) + bigg((2x+3))} $ $ {Bigg(1/2x^2|)^(x=n)_(x=0) + Bigg((2x+3))} $ And now, some smaller versions (#text([#link("https://x.com/tsoding/status/1756517251497255167", "cAn YoUr LaTeX dO tHaT?")], fill: rgb(50, 20, 200), font: "Noto Mono")): $ small(1/2x^2|)^(x=n)_(x=0) $ $ Small(1/2x^2|)^(x=n)_(x=0) $ $ smalll(1/2x^2|)^(x=n)_(x=0) $ $ Smalll(1/2x^2|)^(x=n)_(x=0) $ ``` ## Functions ### big(...) Applies a scale factor of `125%` to `#lr`'s pre-determined scale. Delimiters are enlarged by this amount compared to what `#lr` would normally do. ### Big(...) Like `big(...)`, but applies a scale factor of `156.25%`. ### bigg(...) Like `big(...)`, but applies a scale factor of `195.313%`. ### Bigg(...) Like `big(...)`, but applies a scale factor of `244.141%`. ### small(...) Applies a scale factor of `80%` to `#lr`'s pre-determined scale. Delimiters are shrunk by this amount compared to what `#lr` would normally do. This does *not* exist in standard LaTeX, but is necessary in this package because these functions scale the output of `#lr`, so delimiter sizes will get larger depending on the content. ### Small(...) Like `small(...)`, but applies a scale factor of `64%`. ### smalll(...) Like `small(...)`, but applies a scale factor of `51.2%`. ### Smalll(...) Like `small(...)`, but applies a scale factor of `40.96%`.
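The scale factors above compose with ordinary `#lr` sizing; as a quick sketch (a hypothetical snippet, not taken from the package docs):

```typst
#import "@preview/embiggen:0.0.1": *

// The same parenthesized fraction at increasing delimiter scales
$ (x/2) quad big((x/2)) quad Big((x/2)) quad bigg((x/2)) quad Bigg((x/2)) $
```

Since each helper multiplies the size `#lr` would have chosen, the relative steps stay consistent no matter how tall the enclosed content is.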
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/page-binding_01.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test setting the binding explicitly. #set page(margin: (inside: 30pt)) #rect(width: 100%)[Bound] #pagebreak() #rect(width: 100%)[Left]
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/call-01.typ
typst
Other
// Trailing comma. #test(1 + 1, 2,) // Call function assigned to variable. #let alias = type #test(alias(alias), "function") // Callee expressions. #{ // Wrapped in parens. test((type)("hi"), "string") // Call the return value of a function. let adder(dx) = x => x + dx test(adder(2)(5), 7) }
https://github.com/iyno-org/Holistic-Clinical-Neurosurgery
https://raw.githubusercontent.com/iyno-org/Holistic-Clinical-Neurosurgery/main/1.%20Spinal%20Cord/General%20Malformations/main.typ
typst
#import "@preview/tufte-memo:0.1.2": * #show: template.with( title: [1. Spinal Cord: Gross and General Spinal Cord Malformations], shorttitle: [], authors: ( ( name: "<NAME>", ), ), document-number: [Draft 1], abstract: [In the first part, we explore the overall structure and organization of the spinal cord. This includes a detailed examination of its gross anatomy, such as the spinal cord's segments, the protective meninges, and the surrounding vertebral column. We will also cover common spinal procedures, offering insights into standard practices and techniques used in spinal surgeries. Understanding these foundational elements is crucial for grasping the complexities of spinal cord functions and pathologies.], publisher: [iyno.org], toc: true, draft: false, bib: bibliography("references.bib") ) #pagebreak() = Case 1 == Presentation #wideblock[A patient presents with a complaint of severe back pain. He is a student who recently completed a period of intensive studying, involving prolonged sitting at his desk. The patient acknowledges being in poor physical condition. On the day following his exams, he decided to clean his room, which he had neglected during his study period. While attempting to lift and move his desk to vacuum the floor, he experienced a sudden, sharp back pain that radiated to his right lower leg. In distress, he called 911. Paramedics responded promptly and transported him to the neurological department; fortunately, you are the neurosurgeon on call.] == Relevant Anatomical Background This case involves a lumbar disc herniation. To fully comprehend this condition, it's essential to first understand the basic structure and organization of the spinal cord. The spinal cord is housed within the vertebral canal, which is part of the vertebral column, and is protected by three layers of tissue called the meninges. See @fig:meninges.
Structurally, the spinal cord resembles a cylindrical extension of the brain, beginning at the brain's base and extending down to the lumbar region of the vertebral column.#note([Difference between the vertebral column, vertebral canal and the spinal cord. Vertebral Column: The bony structure that encases and protects the spinal cord. Vertebral Canal: The space within the vertebral column through which the spinal cord passes. Spinal Cord: The bundle of nervous tissue contained within the vertebral canal.], dy:-0.15in) The spinal cord is organized into nerve roots, each of which exits through openings in the vertebral column. An example of such an opening can be seen in @fig:cervical-vertebra. #note([ #figure( image("images/cervical-vertebra.png"), caption:[A figure showing a cross section through a vertebral bone @cervical-vertebra.] ) <fig:cervical-vertebra> ], dy:-3in, numbered:false) Throughout the entire length of the spinal cord, there are 31 pairs of spinal nerves, each consisting of an anterior (ventral or motor) root and a posterior (dorsal or sensory) root. Each of these roots also contains a dorsal root ganglion, which houses the cells that give rise to both peripheral and central nerve fibers, as shown in @fig:vertebral-column. == Lumbar Disk Herniation #wideblock[ Returning to the medical student's injury, the herniation occurred on the right side and was relatively small. This herniation took place between the L5 and S1 levels of the spinal cord, resulting in compression of the posterior (dorsal) roots. @fig:lumbar-disk-hernia illustrates the varying severities of disc herniations. It is clearly visible how the contents of the intervertebral disks may compress the spinal nerves. See @fig:intervertebral-disk for a visualization. The symptoms may be motor or sensory function abnormalities. ] #wideblock[ #figure( image("images/spinal-cord-overview.png"), caption:[The vertebral column and spinal nerves exiting from it @vertebral-column-overview.] 
) <fig:vertebral-column> ] #note([ #figure( image("images/lumbar-disk-lesions.jpg"), caption:[Lumbar Disk Herniation visualization @lumbar-disk-lesions.] ) <fig:lumbar-disk-hernia> ], dy:0.5in, numbered:false) Disk herniations occur most commonly in the lumbar region#note([see @fig:vertebral-column], dy:-0.3in), where a relatively mobile part of the vertebral column meets the relatively immobile sacral part. Herniation is also more common in this area because the entire weight of the head and the thorax, and any weight lifted by the upper limbs, is transmitted towards the legs through this region. @fig:lumbar-disk-hernia shows the pathology. The blue part in the intervertebral disk is the nucleus pulposus, while the white part is the annulus fibrosus. The nucleus pulposus can be seen being squeezed into the vertebral canal, where it compresses the nerves. #wideblock[ #figure( image("images/intervertebral-disk.svg"), caption:[Views of the Intervertebral disk. The substance that can cause hernia (nucleus pulposus) is clearly visible and labelled @intervertebral-disk.] ) <fig:intervertebral-disk> ] This can lead to pain being felt in the leg on the side where the nerve is being compressed. In the case of our student, the spinal nerves L5 and S1 were most probably compressed, leading to the pain he experienced. His condition is known as 'Sciatica'#note([Compression of the sensory roots will lead to pain being felt while compression of the motor roots will produce weakness of the muscles.], dy:-1.2in). = Case 2 == Presentation A man was involved in a motor vehicle accident, sustaining a head-on collision. First responders observed that his breathing was severely compromised. What is the major muscle controlling respiration and how is injury to the spinal cord related to breathing? == Relevant Anatomical Background The major muscle controlling respiration is the Diaphragm. It is located below the lungs and can be seen in @fig:diaphragm.
Its contraction leads to the increase in volume of the thoracic cavity which causes the lungs to fill up with air#note[Further anatomical details of the diaphragm are beyond the scope of this book, but more information can be found in Gray's Basic Anatomy, in the section on the Thorax under the heading Diaphragm]. #note([ #figure( image("images/diaphragm.svg"), caption:[Diagram showing the position of the diaphragm @thoracic-diaphragm.] ) <fig:diaphragm> ], dy:-3.3in, numbered:false, ) The diaphragm is innervated by the spinal nerves C3 to C5, known together as the phrenic nerves. These are the nerves that exit from the cervical spinal cord at the levels of the 5th, 6th and 7th vertebrae. A major point to note here is that the spinal nerve C3, for example, exits at the level of the C5 vertebra; see the table in the margin#note([ #table( columns: 2, [*Vertebrae*], [*Spinal Nerve*], [Cervical vertebrae], [Add 1], [Upper thoracic vertebrae], [Add 2], [Lower thoracic vertebrae], [Add 3], [10th thoracic vertebra], [L1 and 2], [11th thoracic vertebra], [L3 and 4], [12th thoracic vertebra], [L5], [First lumbar vertebra], [Sacral and coccygeal spinal nerves] ) ], dy:-1in). If the spinal cord is damaged above this level, control of the diaphragm is lost, which could lead to death. = Case 3 == Presentation A person complains of continuous and severe headaches, high fever, stiff neck and drowsiness#note([Most of the time, headaches and fever are common complaints from patients and can be safely treated by prescribing over-the-counter painkillers, but if they accompany neurological symptoms, such as drowsiness or excessive sleeping or confusion, then a neurologist must take precautions to rule out more serious underlying causes.], dy:0.5in). As a neurologist, how do you manage the patient? == Relevant Anatomical Background The person is suspected of having meningitis. It is an inflammation of the layers covering the brain and the spinal cord. @fig:meninges shows the layers over the surface of the brain.
The layers in order from outside to inside are: 1. Dura Mater 2. Arachnoid Mater 3. Pia Mater The Dura Mater is the toughest outer covering over the brain. It lies directly beneath the bone#note([Seeing @fig:meninges will help to visualize the concepts here.], dy:-0.1in). The Arachnoid Mater contains the subarachnoid space#note([Where the cerebrospinal fluid (CSF) circulates.], dy:0.1in). The Pia Mater is a thin layer that directly covers the surface of the brain and is usually transparent. A lumbar puncture procedure may be performed to withdraw a sample of CSF to check for infections (such as meningitis in our case), to inject drugs in response to infections, or to induce anesthesia#note[This is the case for having a painless childbirth. The mother will not feel the contractions during the first stage of labour. For more details, search for 'Caudal Analgesia for Labour'.]. One extremely important feature of our spinal cord is that the nervous tissue (of adults) ends at the level of the L1 vertebra, but the subarachnoid space (containing the CSF) extends until the level of S2#note([For a visualization of specific levels of the spinal cord, refer to @fig:vertebral-column], dy:0.1in). A needle inserted into this space will generally not damage the spinal nerves, as they will be pushed to one side owing to the fact that there is a lot of space here for the spinal nerves. #figure( image("images/lumbar-puncture.png"), caption: [To obtain a sample of CSF, a lumbar puncture or a spinal tap may be performed. @lumbar-puncture.] ) <fig:lumbar-puncture> @fig:lumbar-puncture-level shows the level of the lumbar puncture procedure. At the level of L4, the iliac crest of the hip bone can be felt. This is a safe site for this procedure. #note([ #figure( image("images/meninges.png"), caption: [Image with the 3 coverings of the nervous system labelled. Namely the Dura, Arachnoid and Pia Mater.
The subarachnoid space (the web-like space between the Arachnoid and Pia Mater) is also visible @meninges.] ) <fig:meninges> ], dy:-3in, numbered: false) #note([ #figure( image("images/level-of-lumbar-puncture.png"), caption: [An image showing important vertebral column levels @lumbar-puncture-level.] ) <fig:lumbar-puncture-level> ], numbered: false) After administering a small amount of local anesthetic, the physician can insert a spinal needle just above the L4 spinal level. The depth of needle insertion varies depending on the patient's physique. For example, in a child, the needle may only need to be inserted approximately 1 cm, while in an obese adult, it may need to be inserted up to 10 cm into the lumbar spine. This needle is then used to collect a small sample of cerebrospinal fluid (CSF) for laboratory examination. Additionally, the CSF pressure can be measured by attaching a manometer to the spinal needle. Deviations from the normal CSF pressure, which typically ranges from 60 to 150 mm of water, can indicate various medical conditions. Elevated or reduced CSF pressure may be associated with specific neurological or systemic disorders. Some causes of elevated CSF pressure are: Intracranial Masses, Hydrocephalus, Infections, Trauma, Vascular Issues, Idiopathic Intracranial Hypertension (IIH), Toxins and Metabolic Disorders. Some causes of lowered CSF pressure are: CSF Leak, Dehydration, Overdrainage of CSF, Certain Medications#note([As a side note, the specific causes of elevated and lowered CSF pressure will be discussed in detail in their respective chapters. For instance, intracranial masses, such as brain tumors, can obstruct CSF flow pathways, resulting in increased pressure. Similarly, hematomas, which are accumulations of blood within the cranial cavity, can compress brain tissue and subsequently elevate CSF pressure.], dy:-1in). #pagebreak()
https://github.com/taooceros/MATH-542-HW
https://raw.githubusercontent.com/taooceros/MATH-542-HW/main/HW2/HW2.typ
typst
#import "@local/homework-template:0.1.0": * // Take a look at the file `template.typ` in the file panel // to customize this template and discover how it works. #show: project.with( title: "Math 542 HW2", authors: ("<NAME>",), ) #let ( theorem, lemma, corollary, remark, proposition, example, proof ) = ( thm, lemma, corollary, remark, proposition, example, proof ) #show: thmrules = Chinese Remainder == 10.3.16 <10.3.16> For any left ideal $I$ of $R$ define $ I M = {sum_"finite" a_i m_i | a_i in I, m_i in M} $ to be the collection of all finite sums of elements of the form $a m$ where $a in I$ and $m in M$. This is a submodule of $M$. For any ideal $I$ of $R$ let $I M$ be the submodule defined above. Let $A_1, ..., A_k$ be any ideals in the ring $R$. Prove that the map $ phi : M -> M/(A_1 M) times ... times M/(A_k M) "defined by" m arrow.bar (m + A_1 M, ..., m + A_k M) $ is an $R$-module homomorphism with kernel $A_1 M sect A_2 M sect ... sect A_k M$. #solution[ Want to check $forall x, y in M : phi(x+y) = phi(x) + phi(y)$ and $forall x in M, r in R: phi(r x) = r phi(x)$. $forall x,y in M: phi(x+y) &= (x + y + A_1 M, ..., x + y + A_k M) \ &= (x + A_1 M, ..., x + A_k M) + (y + A_1 M, ..., y + A_k M) \ & = phi(x) + phi(y)$ $forall r in R, x in M: phi(r x) = (r x + A_1 M, ..., r x + A_k M) = r(x + A_1 M, ..., x + A_k M)$ because each submodule is invariant under the action of $R$. For $x$ to lie in the kernel, it needs to satisfy $forall i in [1,k] : x + A_i M = A_i M$, which means that $x in sect.big_i A_i M$. ] == 10.3.17 In the notation of @10.3.16, assume further that the ideals $A_1, ..., A_k$ are pairwise comaximal $(i.e. forall i != j : A_i + A_j = R)$. Prove that $ M/((A_1 ... A_k) M) cong M/(A_1 M) times ... times M/(A_k M) $ [See proof of the Chinese Remainder Theorem for rings in Section 7.6.] #solution[ Based on the proof of the Chinese Remainder Theorem for rings in Section 7.6, it suffices to check the case when $k = 2$.
Consider a map $phi : M -> M/(A M) times M/(B M)$ by sending $x arrow.bar (x + A M, x + B M)$. $phi$ is a module homomorphism based on @10.3.16. The kernel is clearly $(A M sect B M)$, and similar to the proof for rings, it suffices to check that when $A, B$ are comaximal, $(A B) M = (A sect B) M$. Because $A+B = R$, $exists x in A, y in B : x+y = 1$. $ forall (r_1 mod A M, r_2 mod B M) in M/(A M) times M/(B M) $ $ phi(x r_2 + y r_1) &= x phi(r_2) + y phi(r_1) \ &= x (r_2 mod A M, r_2 mod B M) + y (r_1 mod A M, r_1 mod B M) \ &= (0, r_2 mod B M) + (r_1 mod A M, 0) \ &= (r_1 mod A M, r_2 mod B M) $ Therefore $phi$ is surjective. It's clear that $(A B) M subset (A sect B) M$. Because $A+B = R$, $exists x in A, y in B : x+y = 1$. Thus $forall c in (A sect B) M : c = x c + y c in (A B) M$. ] = Fractions Suppose that $R$ is an integral domain and let $M$ be an $R$-module. Let $S$ be a multiplicatively closed subset of $R$ that includes $1$ and does not include $0$ (for instance complements of prime ideals). Let $S^(-1) M$ be the collection of symbols of the form $m/s$ where $m in M$ and $s in S$ and where we insist that $(s' dot m)/(s's) = m/s$ for any $s' in S$. This is an abelian group where we define addition by $m_1/s_1 + m_2/s_2 := (s_2 dot m_1 + s_1 dot m_2)/(s_1s_2)$ for $m_1, m_2 in M$ and $s_1, s_2 in S$. Note that $S^(-1)R$ is a ring if we additionally define multiplication by $r_1/s_1 dot r_2/s_2 := (r_1r_2)/(s_1s_2)$ for $r_1, r_2 in R$ and $s_1, s_2 in S$. Finally, we note that $S^(-1)M$ is an $S^(-1)R$-module where $r/(s_1) dot m/s_2 := (r dot m)/(s_1s_2)$. == Show that if $f : M_1 → M_2$ is a homomorphism of R-modules, then the map $S^(-1)f : S^(-1)M_1 → S^(-1)M_2$ sending $m/s arrow.bar f(m)/s$ is a homomorphism of $S^(-1)R$ modules. #let inv(x) = $#x^(-1)$ #solution[ Given that $f$ is a homomorphism of $R$-modules, the claim follows directly by checking addition and scalar multiplication.
$ forall m/s in S^(-1)M_1, r in R: inv(S)f(r m/s) = f(r m)/s = (r f(m))/s $ Addition is similar and omitted. ] == #let tensor = $times.circle$ If $S=R-{0}$, then note that $S^(-1)R$ is a field. Use this to show that $R^n$ and $R^m$ are not isomorphic if $n$ and $m$ are distinct positive integers. #solution[ Note $inv(S)R^n$ and $inv(S)R^m$ are $inv(S)R$-modules, and because $inv(S)R$ is a field, these are vector spaces, and thus their dimensions must match if they are isomorphic. Thus it suffices to see that $inv(S)R^m cong S^(-1)R^n$ if and only if $R^m cong R^n$, which is clear. ] == Let $S = R-{0}$ and consider the map $M -> (S^(-1)R ) tensor M$ that sends $m$ to $1 tensor m$. Show that its kernel is the torsion submodule of $M$. #solution[ Consider a map $phi$ from $S^(-1)R plus.circle M$ to $inv(S)M$ by sending $(r/s, m) arrow.bar (r m)/s$. Given the universal property of $tensor$, there must exist a unique $f$ from $S^(-1)R tensor M$ to $inv(S)M$ through which $phi$ factors. ] == Show that any linearly independent subset of $R^n$ can be extended to a linearly independent subset of size n. (The bonus problem shows that this result is not true when $R$ is not an integral domain). #solution[ Because $R$ is an integral domain, no zero divisors exist, so every submodule is torsion free, and thus free. Thus we can extend the linearly independent subset to a basis, which has size $n$. ] = Tensor == 10.4.2 Show that the element "$2 times.circle 1$" is $0 in ZZ times.circle_ZZ ZZ/(2ZZ)$ but is nonzero in $2ZZ times.circle_ZZ ZZ/(2ZZ)$. #solution[ We have $2 tensor 1 = 1 tensor 2 = 1 tensor 0 = 0$ in $ZZ tensor_ZZ ZZ/2ZZ$, since the $2$ can be moved across the tensor. However, in $2ZZ tensor_ZZ ZZ/2ZZ$, we cannot pull the $2$ out, because $1 in.not 2ZZ$. ] == 10.4.20 Let $I = (2,x)$ be the ideal generated by $2$ and $x$ in the ring $R = ZZ[x]$. Show that the element $2 tensor 2+x tensor x$ in $I tensor_R I$ is not a simple tensor, i.e.
cannot be written as $a tensor b$ for some $a,b in I$. #solution[ From the identity $(a+b)(a-b) = a^2 - b^2$, and since $tensor$ also satisfies the distributive rule, we would need $a=2, b=i x$. However, $i in.not ZZ[x]$, so this is impossible. ] = Duality Suppose that $R$ is commutative. Let $M, N$, and $U$ be $R$-modules. The _dual module_ of $M$ is defined to be $M^* := "Hom"_R (M, R)$. == Suppose that $(e_1, ..., e_n)$ is a basis, i.e. a linearly independent spanning set, for $M$. Define $e_i^* in M^*$ to be the homomorphism that sends $e_i$ to $1$ and all other $e_j (j != i)$ to $0$. Show that $(e_1^*,...,e_n^*)$ is a basis for $M^*$. #solution[ A homomorphism is uniquely determined by its values on the basis of $M$. For any $phi in M^*$, both $phi$ and $sum phi(e_i) e_i^*$ agree on every basis vector, so $phi = sum phi(e_i) e_i^*$ and $(e_1^*, ..., e_n^*)$ spans $M^*$. Evaluating a vanishing linear combination $sum a_i e_i^*$ at each $e_j$ gives $a_j = 0$, so the set is linearly independent. Therefore $(e_1^*, ..., e_n^*)$ is a basis for $M^*$. ] == Show that if $M$ is a free $R$-module of rank $n$, where $n$ is a positive integer, then $(M^*)^*$ is isomorphic to $M$. (Hint: Consider the map $M arrow.long (M^*)^*$ that sends $m in M$ to $"ev"_m$ where $"ev"_m : M^* -> R$ sends a homomorphism $phi : M -> R$ to $phi(m)$. To establish that this map is a surjection show that, in the notation of the preceding part, $((e_i)^*)^* = "ev"_e_i$.) #solution[ Consider the map $psi : M -> (M^*)^*$ sending $m arrow.bar "ev"_m$ where $"ev"_m : M^* -> R$ sends $(phi : M -> R) arrow.bar phi(m)$, an evaluation map. Then $"ev"_(e_i)$ sends $e_i^* arrow.bar e_i^* (e_i) = 1$ and every other $e_j^* (j != i)$ to $0$, which is exactly $(e_i^*)^*$. The only thing left to check is that $psi$ is a homomorphism, which is automatic given that this is evaluation of homomorphisms $phi : M -> R$. ] == Show that if $R$ is a field and $M$ and $N$ are finitely generated, then $"Hom"_R (M, N) cong M^* tensor N$ as $R$-modules.
Show that this is not necessarily such an isomorphism when $R=ZZ$ and $M$ and $N$ are finitely generated $ZZ$-modules. #solution[ Suppose $(e_i)$ is a finite generating set of $M$. Then by the same construction as in part (4.1), $(e_i^*)$ finitely generates $M^*$. If $R$ is a field, then $M, N$ are vector spaces, and thus free. Every element of $"Hom"_R (M, N)$ corresponds to a matrix of dimension $m times n$, where $m, n$ are the numbers of basis vectors of $M, N$. On the other hand, any bilinear map on $M plus.circle N$ can be written as $m^T A n$, where $A$ has dimension $m times n$. Thus this gives an isomorphism. Since the expression involves $m^T$, which is a map $M -> R$, the left-hand factor of the tensor must be $M^*$. This is not necessarily true for $ZZ$-modules. For example, take $M = N = ZZ/2ZZ$: then $M^* = "Hom"_ZZ (ZZ/2ZZ, ZZ) = 0$, so $M^* tensor N = 0$, while $"Hom"_ZZ (M, N) cong ZZ/2ZZ eq.not 0$. ] = Counterexample Do one of the following two problems: 10.3.24 or 10.3.26. == 10.3.24 For each positive integer $i$ let $M_i$ be the free $ZZ$-module $ZZ$, and let $M$ be the direct product $product_(i in Z^+) M_i$. Each element of $M$ is of the uniquely determined form $(a_1, a_2, a_3, ...)$ with $a_i in ZZ$ for all $i$. Let $N$ be the submodule of $M$ consisting of all such tuples with only finitely many nonzero $a_i$. Assume $M$ is a free $ZZ$-module with basis $cal(B)$. === Show that $N$ is countable. #solution[ It suffices to show that the diagonal argument that proves $RR$ is uncountable does not work here. Because all elements in $N$ contain only finitely many nonzero entries, the new element we retrieve from the diagonal plus 1 can only contain finitely many nonzero entries. However, this is not possible unless we have $9$ on the diagonal after some finitely many terms.
However, that contradicts how we enumerate the elements in $N$. Thus the diagonal plus 1 is not in $N$. ] === Show that there is some countable subset $cal(B)_1$ of $cal(B)$ such that $N$ is contained in the submodule, $N_1$, generated by $cal(B)_1$. Show also that $N_1$ is countable. #solution[ Because $cal(B)$ is a basis, $ forall n in N : exists c_i in ZZ: sum_(b_i in cal(B)) c_i b_i = n $ such that the number of $b_i$ used to represent $n$ is finite. Thus, taking the union of the $b_i$ required to cover $N$, we have a countable union of finite subsets, which is still countable. ] === #solution[ By definition of the quotient, $overline(M)$ can be generated by $cal(B) \\ cal(B)_1$, and thus is a free module. As a free module, every element in $overline(M)$ can be represented by a finite sum of elements in $cal(B) \\ cal(B)_1$, and such an element is divisible by $k$ if and only if $k$ divides all of its coefficients in this basis. ] === #solution[ The diagonal argument works here by flipping the sign of each diagonal element. Since $cal(S)$ is uncountable, it is not possible that $cal(S) subset N_1$. ] === #solution[ Given that $overline(s) in M/N_1$ and $N subset N_1$, we can add a linear combination of any elements that have finitely many nonzero entries. Given an integer $k$, it suffices to use $N$ to fill the gap for any entries in $overline(s)$ that have index less than $k$. Beyond that, every entry will have a factor of $k$. Thus $forall k in ZZ, exists m in M: overline(s) = k overline(m)$. ] = Bonus #solution[ To prove that $M$ is free, it suffices to show that the map to $M$ is injective. If $r x = 0$, then $r$ must map everything on the $x$ axis to $0$. Same for $r y = 0$. Then $(r x, r y) = 0$ implies that it is the trivial map from $CC^2 - {(0,0)} -> CC$. Thus $M$ is free. If $exists (u,v) in R^2$ that is linearly independent with $(x,y)$, then ]
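The surjectivity argument in the first solution is fully constructive: given Bézout coefficients $x + y = 1$ with $x in A, y in B$, the element $x r_2 + y r_1$ is an explicit preimage. A small numerical sketch of this over $R = M = ZZ$ with $A = (a)$, $B = (b)$ comaximal (illustrative only, not part of the assigned solutions):

```python
from math import gcd

def ext_gcd(p, q):
    """Extended Euclid: return (g, u, v) with u*p + v*q == g == gcd(p, q)."""
    if q == 0:
        return p, 1, 0
    g, s, t = ext_gcd(q, p % q)
    return g, t, s - (p // q) * t

def crt_preimage(r1, r2, a, b):
    """For comaximal ideals (a), (b) in ZZ, build a preimage of
    (r1 mod a, r2 mod b) under phi: ZZ/(a b) -> ZZ/(a) x ZZ/(b),
    exactly as in the proof: pick x in (a), y in (b) with x + y = 1,
    then x*r2 + y*r1 reduces to (r1 mod a, r2 mod b)."""
    assert gcd(a, b) == 1, "ideals must be comaximal"
    _, u, v = ext_gcd(a, b)
    x, y = u * a, v * b  # x = (0 mod a, 1 mod b), y = (1 mod a, 0 mod b)
    return (x * r2 + y * r1) % (a * b)

m = crt_preimage(3, 5, 4, 9)  # m == 23: 23 = 3 (mod 4), 23 = 5 (mod 9)
```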
https://github.com/Nikudanngo/typst-ja-resume-template
https://raw.githubusercontent.com/Nikudanngo/typst-ja-resume-template/main/README.md
markdown
MIT License
# Typst Resume Template ![PDF](https://img.shields.io/badge/Resume-PDF-blue)

## What is this?

A resume template built with [Typst](https://typst.app/).
See the rendered resume [here](/main.pdf).
Feel free to modify and use it.
It is not built to follow any particular official specification, so please point out anything that looks off!

## Formats referenced

[rireki-style](https://github.com/shigio/rireki-style)
[doda resume templates](https://doda.jp/guide/rireki/template/)
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-2900.typ
typst
Apache License 2.0
#let data = ( ("RIGHTWARDS TWO-HEADED ARROW WITH VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS TWO-HEADED ARROW WITH DOUBLE VERTICAL STROKE", "Sm", 0), ("LEFTWARDS DOUBLE ARROW WITH VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS DOUBLE ARROW WITH VERTICAL STROKE", "Sm", 0), ("LEFT RIGHT DOUBLE ARROW WITH VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS TWO-HEADED ARROW FROM BAR", "Sm", 0), ("LEFTWARDS DOUBLE ARROW FROM BAR", "Sm", 0), ("RIGHTWARDS DOUBLE ARROW FROM BAR", "Sm", 0), ("DOWNWARDS ARROW WITH HORIZONTAL STROKE", "Sm", 0), ("UPWARDS ARROW WITH HORIZONTAL STROKE", "Sm", 0), ("UPWARDS TRIPLE ARROW", "Sm", 0), ("DOWNWARDS TRIPLE ARROW", "Sm", 0), ("LEFTWARDS DOUBLE DASH ARROW", "Sm", 0), ("RIGHTWARDS DOUBLE DASH ARROW", "Sm", 0), ("LEFTWARDS TRIPLE DASH ARROW", "Sm", 0), ("RIGHTWARDS TRIPLE DASH ARROW", "Sm", 0), ("RIGHTWARDS TWO-HEADED TRIPLE DASH ARROW", "Sm", 0), ("RIGHTWARDS ARROW WITH DOTTED STEM", "Sm", 0), ("UPWARDS ARROW TO BAR", "Sm", 0), ("DOWNWARDS ARROW TO BAR", "Sm", 0), ("RIGHTWARDS ARROW WITH TAIL WITH VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS ARROW WITH TAIL WITH DOUBLE VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS TWO-HEADED ARROW WITH TAIL", "Sm", 0), ("RIGHTWARDS TWO-HEADED ARROW WITH TAIL WITH VERTICAL STROKE", "Sm", 0), ("RIGHTWARDS TWO-HEADED ARROW WITH TAIL WITH DOUBLE VERTICAL STROKE", "Sm", 0), ("LEFTWARDS ARROW-TAIL", "Sm", 0), ("RIGHTWARDS ARROW-TAIL", "Sm", 0), ("LEFTWARDS DOUBLE ARROW-TAIL", "Sm", 0), ("RIGHTWARDS DOUBLE ARROW-TAIL", "Sm", 0), ("LEFTWARDS ARROW TO BLACK DIAMOND", "Sm", 0), ("RIGHTWARDS ARROW TO BLACK DIAMOND", "Sm", 0), ("LEFTWARDS ARROW FROM BAR TO BLACK DIAMOND", "Sm", 0), ("RIGHTWARDS ARROW FROM BAR TO BLACK DIAMOND", "Sm", 0), ("NORTH WEST AND SOUTH EAST ARROW", "Sm", 0), ("NORTH EAST AND SOUTH WEST ARROW", "Sm", 0), ("NORTH WEST ARROW WITH HOOK", "Sm", 0), ("NORTH EAST ARROW WITH HOOK", "Sm", 0), ("SOUTH EAST ARROW WITH HOOK", "Sm", 0), ("SOUTH WEST ARROW WITH HOOK", "Sm", 0), ("NORTH WEST ARROW AND NORTH EAST ARROW", "Sm", 0), 
("NORTH EAST ARROW AND SOUTH EAST ARROW", "Sm", 0), ("SOUTH EAST ARROW AND SOUTH WEST ARROW", "Sm", 0), ("SOUTH WEST ARROW AND NORTH WEST ARROW", "Sm", 0), ("RISING DIAGONAL CROSSING FALLING DIAGONAL", "Sm", 0), ("FALLING DIAGONAL CROSSING RISING DIAGONAL", "Sm", 0), ("SOUTH EAST ARROW CROSSING NORTH EAST ARROW", "Sm", 0), ("NORTH EAST ARROW CROSSING SOUTH EAST ARROW", "Sm", 0), ("FALLING DIAGONAL CROSSING NORTH EAST ARROW", "Sm", 0), ("RISING DIAGONAL CROSSING SOUTH EAST ARROW", "Sm", 0), ("NORTH EAST ARROW CROSSING NORTH WEST ARROW", "Sm", 0), ("NORTH WEST ARROW CROSSING NORTH EAST ARROW", "Sm", 0), ("WAVE ARROW POINTING DIRECTLY RIGHT", "Sm", 0), ("ARROW POINTING RIGHTWARDS THEN CURVING UPWARDS", "Sm", 0), ("ARROW POINTING RIGHTWARDS THEN CURVING DOWNWARDS", "Sm", 0), ("ARROW POINTING DOWNWARDS THEN CURVING LEFTWARDS", "Sm", 0), ("ARROW POINTING DOWNWARDS THEN CURVING RIGHTWARDS", "Sm", 0), ("RIGHT-SIDE ARC CLOCKWISE ARROW", "Sm", 0), ("LEFT-SIDE ARC ANTICLOCKWISE ARROW", "Sm", 0), ("TOP ARC ANTICLOCKWISE ARROW", "Sm", 0), ("BOTTOM ARC ANTICLOCKWISE ARROW", "Sm", 0), ("TOP ARC CLOCKWISE ARROW WITH MINUS", "Sm", 0), ("TOP ARC ANTICLOCKWISE ARROW WITH PLUS", "Sm", 0), ("LOWER RIGHT SEMICIRCULAR CLOCKWISE ARROW", "Sm", 0), ("LOWER LEFT SEMICIRCULAR ANTICLOCKWISE ARROW", "Sm", 0), ("ANTICLOCKWISE CLOSED CIRCLE ARROW", "Sm", 0), ("CLOCKWISE CLOSED CIRCLE ARROW", "Sm", 0), ("RIGHTWARDS ARROW ABOVE SHORT LEFTWARDS ARROW", "Sm", 0), ("LEFTWARDS ARROW ABOVE SHORT RIGHTWARDS ARROW", "Sm", 0), ("SHORT RIGHTWARDS ARROW ABOVE LEFTWARDS ARROW", "Sm", 0), ("RIGHTWARDS ARROW WITH PLUS BELOW", "Sm", 0), ("LEFTWARDS ARROW WITH PLUS BELOW", "Sm", 0), ("RIGHTWARDS ARROW THROUGH X", "Sm", 0), ("LEFT RIGHT ARROW THROUGH SMALL CIRCLE", "Sm", 0), ("UPWARDS TWO-HEADED ARROW FROM SMALL CIRCLE", "Sm", 0), ("LEFT BARB UP RIGHT BARB DOWN HARPOON", "Sm", 0), ("LEFT BARB DOWN RIGHT BARB UP HARPOON", "Sm", 0), ("UP BARB RIGHT DOWN BARB LEFT HARPOON", "Sm", 0), ("UP BARB LEFT DOWN BARB RIGHT 
HARPOON", "Sm", 0), ("LEFT BARB UP RIGHT BARB UP HARPOON", "Sm", 0), ("UP BARB RIGHT DOWN BARB RIGHT HARPOON", "Sm", 0), ("LEFT BARB DOWN RIGHT BARB DOWN HARPOON", "Sm", 0), ("UP BARB LEFT DOWN BARB LEFT HARPOON", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB UP TO BAR", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB UP TO BAR", "Sm", 0), ("UPWARDS HARPOON WITH BARB RIGHT TO BAR", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB RIGHT TO BAR", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB DOWN TO BAR", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB DOWN TO BAR", "Sm", 0), ("UPWARDS HARPOON WITH BARB LEFT TO BAR", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB LEFT TO BAR", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB UP FROM BAR", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB UP FROM BAR", "Sm", 0), ("UPWARDS HARPOON WITH BARB RIGHT FROM BAR", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB RIGHT FROM BAR", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB DOWN FROM BAR", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB DOWN FROM BAR", "Sm", 0), ("UPWARDS HARPOON WITH BARB LEFT FROM BAR", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB LEFT FROM BAR", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB UP ABOVE LEFTWARDS HARPOON WITH BARB DOWN", "Sm", 0), ("UPWARDS HARPOON WITH BARB LEFT BESIDE UPWARDS HARPOON WITH BARB RIGHT", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB UP ABOVE RIGHTWARDS HARPOON WITH BARB DOWN", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB LEFT BESIDE DOWNWARDS HARPOON WITH BARB RIGHT", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB UP ABOVE RIGHTWARDS HARPOON WITH BARB UP", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB DOWN ABOVE RIGHTWARDS HARPOON WITH BARB DOWN", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB UP ABOVE LEFTWARDS HARPOON WITH BARB UP", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB DOWN ABOVE LEFTWARDS HARPOON WITH BARB DOWN", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB UP ABOVE LONG DASH", "Sm", 0), ("LEFTWARDS HARPOON WITH BARB DOWN BELOW LONG DASH", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB UP ABOVE LONG DASH", "Sm", 0), ("RIGHTWARDS HARPOON WITH BARB DOWN 
BELOW LONG DASH", "Sm", 0), ("UPWARDS HARPOON WITH BARB LEFT BESIDE DOWNWARDS HARPOON WITH BARB RIGHT", "Sm", 0), ("DOWNWARDS HARPOON WITH BARB LEFT BESIDE UPWARDS HARPOON WITH BARB RIGHT", "Sm", 0), ("RIGHT DOUBLE ARROW WITH ROUNDED HEAD", "Sm", 0), ("EQUALS SIGN ABOVE RIGHTWARDS ARROW", "Sm", 0), ("TILDE OPERATOR ABOVE RIGHTWARDS ARROW", "Sm", 0), ("LEFTWARDS ARROW ABOVE TILDE OPERATOR", "Sm", 0), ("RIGHTWARDS ARROW ABOVE TILDE OPERATOR", "Sm", 0), ("RIGHTWARDS ARROW ABOVE ALMOST EQUAL TO", "Sm", 0), ("LESS-THAN ABOVE LEFTWARDS ARROW", "Sm", 0), ("LEFTWARDS ARROW THROUGH LESS-THAN", "Sm", 0), ("GREATER-THAN ABOVE RIGHTWARDS ARROW", "Sm", 0), ("SUBSET ABOVE RIGHTWARDS ARROW", "Sm", 0), ("LEFTWARDS ARROW THROUGH SUBSET", "Sm", 0), ("SUPERSET ABOVE LEFTWARDS ARROW", "Sm", 0), ("LEFT FISH TAIL", "Sm", 0), ("RIGHT FISH TAIL", "Sm", 0), ("UP FISH TAIL", "Sm", 0), ("DOWN FISH TAIL", "Sm", 0), )
https://github.com/Kasci/LiturgicalBooks
https://raw.githubusercontent.com/Kasci/LiturgicalBooks/master/CU/texts.typ
typst
#let translation = ( "MINEA_OBS": "МИНІ́А Ѻ҆́БЩА", "M_PAN": "Пра́здникѡмъ Гдⷭ҇а на́шегѡ Їи҃са Хрⷭ҇та̀", "M_BOHORODICKA": "Пра́здникѡмъ Бг҃орѡ́дичнымъ", "M_KRIZ": "Чтⷭ҇но́мꙋ и҆ животворѧ́щемꙋ крⷭ҇тꙋ̀", "M_ANJELI": "Ст҃ы̑мъ а҆́гг҃лѡмъ и҆ про́чымъ безплѡ́тнымъ", "M_PREDCHODCA": "Іѡа́нна прⷣте́чи, прⷪ҇ро́ка и҆ крⷭ҇ти́телѧ гдⷭ҇нѧ", "M_SVATI_OTCOVIA": "Ст҃ы́хъ ѻ҆тє́цъ, на собо́ры", "M_PROROK_JEDEN": "Прⷪ҇ро́кꙋ є҆ди́номꙋ", "M_APOSTOL_JEDEN": "А҆пⷭ҇лꙋ є҆ди́номꙋ", "M_APOSTOL_VIAC": "Апⷭ҇лѡмъ двѣма̀ и҆ мнѡ́гимъ", "M_SVATITEL_JEDEN": "Ст҃и́телю є҆ди́номꙋ", "M_SVATITEL_VIAC": "Ст҃и́телємъ двѣма̀ и҆ мнѡ́гимъ", "M_PREPODOBNY_JEDEN": "Прпⷣбномꙋ є҆ди́номꙋ", "M_PREPODOBNY_VIAC": "Прпⷣбнымъ двѣма̀ и҆ мнѡ́гимъ", "M_MUCENIK_JEDEN": "Мч҃нкꙋ є҆ди́номꙋ", "M_MUCENIK_VIAC": "Мч҃нкѡмъ, двѣма̀ и҆ мнѡ́гимъ", "M_HIEROMUCENIK_JEDEN": "Сщ҃енномч҃нкꙋ є҆ди́номꙋ", "M_HIEROMUCENIK_VIAC": "Сщ҃енномч҃нкѡмъ двѣма̀, и҆ мнѡ́гимъ", "M_PREPODOBNY_MUCENIK_JEDEN": "Прпⷣбномч҃нкꙋ є҆ди́номꙋ", "M_PREPODOBNY_MUCENIK_VIAC": "Прпⷣбномч҃нкѡмъ двѣма̀ и҆ мнѡ́гимъ", "M_MUCENICA_JEDNA": "Мч҃нцѣ є҆ди́ной", "M_MUCENICA_VIAC": "Мч҃нцамъ, двѣма̀ и҆ мнѡ́гимъ", "M_PREPODOBNA_ZENA_JEDNA": "Прпⷣбнѣй женѣ̀ є҆ди́нѣй", "M_PREPODOBNA_ZENA_VIAC": "Прпⷣбнымъ жена́мъ двѣма̀ и҆ мнѡ́гимъ", "M_PREPODOBNA_MUCENICA_JEDNA": "Прпⷣбномч҃нцѣ є҆ди́ной", "M_SPOVEDNIK_JEDEN": "Сщ҃енноисповѣ́дникꙋ и҆ прпⷣбноисповѣ́дникꙋ", "M_DIVOTVORCA_JEDEN": "Безме́здникѡмъ и҆ чꙋдотво́рцємъ", // "M_":"", // "M_":"", "M_01_september": "МИНІ́А МѢ́СѦЦЪ СЕПТЕ́МВРІЙ", "M_02_oktober": "МИНІ́А МѢ́СѦЦЪ Ѻ҆КТѠ́ВРІЙ", "M_03_november": "МИНІ́А МѢ́СѦЦЪ НОЕ́МВРІЙ", "M_04_december": "МИНІ́А МѢ́СѦЦЪ ДЕКЕ́МВРІЙ", "M_05_januar": "МИНІ́А МѢ́СѦЦЪ І҆АННꙊА́РІЙ", "M_06_februar": "МИНІ́А МѢ́СѦЦЪ ФЕѴРꙊА́РІЙ", "M_07_marec": "МИНІ́А МѢ́СѦЦЪ МА́РТЪ", "M_08_april": "МИНІ́А МѢ́СѦЦЪ А҆ПРІ́ЛЛІЙ", "M_09_maj": "МИНІ́А МѢ́СѦЦЪ МА́ІЙ", "M_10_jun": "МИНІ́А МѢ́СѦЦЪ І҆Ꙋ́НІЙ", "M_11_jul": "МИНІ́А МѢ́СѦЦЪ І҆Ꙋ́ЛІЙ", "M_12_august": "МИНІ́А МѢ́СѦЦЪ А҆́ѴГꙊСТЪ", // 🕀🕁🕂🕃🕄 "M_NAR_BOHORODICKY": "🕀 Ржⷭ҇тво̀ 
прест҃ы́ѧ влⷣчцы на́шеѧ бцⷣы, и҆ приснодв҃ы марі́и.", "M_VOZDV_KRIZA": "🕀 Всемі́рное воздви́женїе честна́гѡ и҆ животворѧ́щагѡ крⷭ҇та̀.", "M_JAN_BOHOSLOV": "🕁 Преставле́нїе ст҃а́гѡ а҆пⷭ҇ла и҆ є҆ѵⷢ҇лі́ста І҆ѡа́нна бг҃осло́ва.", "M_POKROV": "🕀 Покро́въ прест҃ы́ѧ влⷣчцы на́шеѧ бцⷣы и҆ приснодв҃ы мр҃і́и.", "M_DEMETER": "🕁 Ст҃а́гѡ и҆ сла́внагѡ великомꙋ́ченика дими́трїа мѷрото́чца.", "M_MICHAL": "🕁 Собо́ръ ст҃а́гѡ а҆рхїстрати́га мїхаи́ла, и҆ про́чихъ безпло́тныхъ си́лъ.", "M_JOZAFAT": "🕁", "M_ZLATOUSTY": "🕁 И҆́же во свѧты́хъ ѻ҆ц҃а̀ на́шегѡ і҆ѡа́нна, а҆рхїепі́скопа кѡнстанті́нѧ гра́да, златоꙋ́стагѡ.", "M_VOVEDENIE": "🕀 Вхо́дъ во хра́мъ прест҃ы́ѧ влⷣчцы на́шеѧ бцⷣы и҆ приснодѣ́вы марі́и.", "M_SAVA": "🕁 Прпⷣбнагѡ и҆ бг҃оно́снагѡ ѻ҆тца̀ на́шегѡ са́ввы ѡ҆свѧще́ннагѡ.", "M_MIKULAS": "🕁 И҆́же во ст҃ы́хъ ѻ҆тца̀ на́шегѡ нїкола́а, а҆рхїепі́скопа мѷрлѷкі́йскихъ чꙋдотво́рца.", "M_POCATIE_BOHORODICKY": "🕁 Зача́тїе ст҃ы́ѧ а҆́нны, є҆гда̀ зача́тъ прест҃ꙋ́ю бцⷣꙋ.", "M_NARODENIE": "🕀 Є҆́же по пло́ти, ржⷭ҇тво̀ гдⷭ҇а бг҃а и҆ сп҃са на́шегѡ і҆и҃са хрⷭ҇та̀.", "M_ZHROM_BOHORODICKA": "🕃 Собо́ръ прест҃ы́ѧ бцⷣы", "M_OBREZANIE": "🕀 ", "M_BOHOZJAVENIE": "🕀 ", "M_ANTON": "🕁 ", "M_EUTMIOS": "🕁 ", "M_TRAJA_SVATITELIA": "🕁 ", "M_OBETOVANIE": "🕀 ", "M_ZVESTOVANIE": "🕀 ", "M_JURAJ": "🕁 ", "M_JAN_EVANJELISTA": "🕁 ", "M_NAR_JAN_KRSTITEL": "🕀 Рождество̀ честна́гѡ сла́внагѡ прⷪ҇ро́ка, предте́чи и҆ крести́телѧ Іѡа́нна.", "M_PETER_PAVOL": "🕀 Ст҃ы́хъ сла́вныхъ и҆ всехва́льныхъ и҆ первоверхо́вныхъ а҆пⷭ҇лъ, Петра̀ и҆ Па́ѵла", "M_PAVOL_GOJDIC": "🕁 ", "M_ELIAS": "🕁 Ст҃а́гѡ сла́внагѡ прⷪ҇ро́ка Илїѝ.", "M_PREMENENIE": "🕀 Ст҃о́е преѡбраже́нїе гдⷭ҇а бг҃а и҆ сп҃са на́шегѡ Іи҃са Хрⷭ҇та̀.", "M_ZOSNUTIE": "🕀 Оу҆спе́нїе прест҃ы́ѧ сла́вныѧ влⷣчцы на́шеѧ Бцⷣы и҆ приснодв҃ы Марі́и.", "SI": "Сла́ва, и҆ ны́нѣ:", "S": "Сла́ва:", "IN": "И́ ны́нѣ:", "PR": "пр", "PD": "под", "VV": "в", "ST": "ст", "HLAS": "Гла́съ", "TYZDEN": "Недѣ́лѧ ст҃а́гѡ поста̀", "Ne": "Недѣ́льа", "Po": "Понедѣ́льникъ", "Ut": "Вто́рникъ", "Sr": "Сре́да", "St": 
"Четверто́къ", "Pi": "Пѧто́къ", "So": "Сꙋббѡ́та", "M": "Ма́лая вече́рньа", "V": "Вече́рньа", "P": "Повече́рїе", "N": "Полꙋ́нощница", "U": "Оу҆́треньа", "L": "Лїтꙋргі́а", "I": "Из̾ѡбрази́тєльнаѧ", "So_V": "въ сꙋббѡ́тꙋ ве́чера", "So_N": "въ сꙋббѡ́тꙋ но́щи", "Ne_V": "въ недѣ́лю ве́чера", "Ne_N": "въ недѣ́лю но́щи", "Po_V": "въ понедѣ́льникъ ве́чера", "Po_N": "въ понедѣ́льникъ но́щи", "Ut_V": "въ вто́рникъ ве́чера", "Ut_N": "въ вто́рникъ но́щи", "Sr_V": "въ сре́дꙋ ве́чера", "Sr_N": "въ сре́дꙋ но́щи", "St_V": "въ четверто́къ ве́чера", "St_N": "въ четверто́къ но́щи", "Pi_V": "въ пѧто́къ ве́чера", "Pi_N": "въ пѧто́къ но́щи", "HOSPODI_VOZVACH": "Гдⷭ҇и воззва́хъ", "PARAMIE": "Чтє́нїѧ", "LITIA": "На лїті́и", "STICHOVNI": "На стїхо́внѣ стїхи́рꙋ", "TROPAR": "Тропа́рь", "PIESEN": "Пѣ́снь", "SIDALEN": "Сѣда́ленъ", "SIDALENY": "Сѣда́лнѣ", "SIDALEN_PO": "По стїхосло́вїи", "VELICANIE": "Велича́нїе", "YPAKOJ": "Ѵ҆пакоѝ", "STEPENNY": "Степє́нны", "ANTIFONY": "А҆нтїфѡ́ны", "ANTIFON": "Антїфѡ́нъ", "PROKIMEN": "Прокі́менъ", "STICH": "Сті́хъ", "ALLILUJA": "А҆ллилꙋ́їа", "KANON": "Канѡ́нъ", "KATAVASIA": "Катава́сїа", "KONDAK_IKOS": "Конда́къ и І҆́косъ", "KONDAK": "Конда́къ", "IKOS": "І҆́косъ", "CHVALITE": "На хвали́техъ стїхи̑ры", "BLAZENNA": "Бл҃жє́нна", "TROPAR_KONDAK": "Тропа́рь и Конда́къ", "50_STICHIRA": "Стїхи́ра по н҃-мъ ѱалмѣ̀", "SVITILEN": "Свѣти́ленъ", "PRICASTEN": "Прича́стенъ", "IRMOS": "І҆рмо́съ", "JEDINORODNY": "Є҆диноро́дный сн҃е и҆ сло́ве", "VCHOD": "Вхо́дное", "HV_MINEA": "Та́же стїхи̑ры 3 и҆з̾ мине́и и҆лѝ и҆з̾ мине́и ѻ҆́бщей.", "HV_NOTE": "Сла́ва: и҆з̾ мине́и, И҆ ны́нѣ: бг҃оро́диченъ, а҆́ще не бꙋ́детъ, Сла́ва: И҆ ны́нѣ: бг҃оро́диченъ", "HV_N_NOTE": "Бг҃оро́диченъ во ᲂу҆ста́вѣ", "T_NOTE": "Сла́ва: мине́и; И҆ ны́нѣ: бг҃оро́диченъ воскрⷭ҇ный" )
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/layout/pad-03.typ
typst
Other
// Test that padding adding up to 100% does not panic. #pad(50%)[]
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-1BC00.typ
typst
Apache License 2.0
#let data = ( ("DUPLOYAN LETTER H", "Lo", 0), ("DUPLOYAN LETTER X", "Lo", 0), ("DUPLOYAN LETTER P", "Lo", 0), ("DUPLOYAN LETTER T", "Lo", 0), ("DUPLOYAN LETTER F", "Lo", 0), ("DUPLOYAN LETTER K", "Lo", 0), ("DUPLOYAN LETTER L", "Lo", 0), ("DUPLOYAN LETTER B", "Lo", 0), ("DUPLOYAN LETTER D", "Lo", 0), ("DUPLOYAN LETTER V", "Lo", 0), ("DUPLOYAN LETTER G", "Lo", 0), ("DUPLOYAN LETTER R", "Lo", 0), ("DUPLOYAN LETTER P N", "Lo", 0), ("DUPLOYAN LETTER D S", "Lo", 0), ("DUPLOYAN LETTER F N", "Lo", 0), ("DUPLOYAN LETTER K M", "Lo", 0), ("DUPLOYAN LETTER R S", "Lo", 0), ("DUPLOYAN LETTER TH", "Lo", 0), ("DUPLOYAN LETTER SLOAN DH", "Lo", 0), ("DUPLOYAN LETTER DH", "Lo", 0), ("DUPLOYAN LETTER KK", "Lo", 0), ("DUPLOYAN LETTER SLOAN J", "Lo", 0), ("DUPLOYAN LETTER HL", "Lo", 0), ("DUPLOYAN LETTER LH", "Lo", 0), ("DUPLOYAN LETTER RH", "Lo", 0), ("DUPLOYAN LETTER M", "Lo", 0), ("DUPLOYAN LETTER N", "Lo", 0), ("DUPLOYAN LETTER J", "Lo", 0), ("DUPLOYAN LETTER S", "Lo", 0), ("DUPLOYAN LETTER M N", "Lo", 0), ("DUPLOYAN LETTER N M", "Lo", 0), ("DUPLOYAN LETTER J M", "Lo", 0), ("DUPLOYAN LETTER S J", "Lo", 0), ("DUPLOYAN LETTER M WITH DOT", "Lo", 0), ("DUPLOYAN LETTER N WITH DOT", "Lo", 0), ("DUPLOYAN LETTER J WITH DOT", "Lo", 0), ("DUPLOYAN LETTER J WITH DOTS INSIDE AND ABOVE", "Lo", 0), ("DUPLOYAN LETTER S WITH DOT", "Lo", 0), ("DUPLOYAN LETTER S WITH DOT BELOW", "Lo", 0), ("DUPLOYAN LETTER M S", "Lo", 0), ("DUPLOYAN LETTER N S", "Lo", 0), ("DUPLOYAN LETTER J S", "Lo", 0), ("DUPLOYAN LETTER S S", "Lo", 0), ("DUPLOYAN LETTER M N S", "Lo", 0), ("DUPLOYAN LETTER N M S", "Lo", 0), ("DUPLOYAN LETTER J M S", "Lo", 0), ("DUPLOYAN LETTER S J S", "Lo", 0), ("DUPLOYAN LETTER J S WITH DOT", "Lo", 0), ("DUPLOYAN LETTER J N", "Lo", 0), ("DUPLOYAN LETTER J N S", "Lo", 0), ("DUPLOYAN LETTER S T", "Lo", 0), ("DUPLOYAN LETTER S T R", "Lo", 0), ("DUPLOYAN LETTER S P", "Lo", 0), ("DUPLOYAN LETTER S P R", "Lo", 0), ("DUPLOYAN LETTER T S", "Lo", 0), ("DUPLOYAN LETTER T R S", "Lo", 0), ("DUPLOYAN LETTER 
W", "Lo", 0), ("DUPLOYAN LETTER WH", "Lo", 0), ("DUPLOYAN LETTER W R", "Lo", 0), ("DUPLOYAN LETTER S N", "Lo", 0), ("DUPLOYAN LETTER S M", "Lo", 0), ("DUPLOYAN LETTER K R S", "Lo", 0), ("DUPLOYAN LETTER G R S", "Lo", 0), ("DUPLOYAN LETTER S K", "Lo", 0), ("DUPLOYAN LETTER S K R", "Lo", 0), ("DUPLOYAN LETTER A", "Lo", 0), ("DUPLOYAN LETTER SLOAN OW", "Lo", 0), ("DUPLOYAN LETTER OA", "Lo", 0), ("DUPLOYAN LETTER O", "Lo", 0), ("DUPLOYAN LETTER AOU", "Lo", 0), ("DUPLOYAN LETTER I", "Lo", 0), ("DUPLOYAN LETTER E", "Lo", 0), ("DUPLOYAN LETTER IE", "Lo", 0), ("DUPLOYAN LETTER SHORT I", "Lo", 0), ("DUPLOYAN LETTER UI", "Lo", 0), ("DUPLOYAN LETTER EE", "Lo", 0), ("DUPLOYAN LETTER SLOAN EH", "Lo", 0), ("DUPLOYAN LETTER ROMANIAN I", "Lo", 0), ("DUPLOYAN LETTER SLOAN EE", "Lo", 0), ("DUPLOYAN LETTER LONG I", "Lo", 0), ("DUPLOYAN LETTER YE", "Lo", 0), ("DUPLOYAN LETTER U", "Lo", 0), ("DUPLOYAN LETTER EU", "Lo", 0), ("DUPLOYAN LETTER XW", "Lo", 0), ("DUPLOYAN LETTER U N", "Lo", 0), ("DUPLOYAN LETTER LONG U", "Lo", 0), ("DUPLOYAN LETTER ROMANIAN U", "Lo", 0), ("DUPLOYAN LETTER UH", "Lo", 0), ("DUPLOYAN LETTER SLOAN U", "Lo", 0), ("DUPLOYAN LETTER OOH", "Lo", 0), ("DUPLOYAN LETTER OW", "Lo", 0), ("DUPLOYAN LETTER OU", "Lo", 0), ("DUPLOYAN LETTER WA", "Lo", 0), ("DUPLOYAN LETTER WO", "Lo", 0), ("DUPLOYAN LETTER WI", "Lo", 0), ("DUPLOYAN LETTER WEI", "Lo", 0), ("DUPLOYAN LETTER WOW", "Lo", 0), ("DUPLOYAN LETTER NASAL U", "Lo", 0), ("DUPLOYAN LETTER NASAL O", "Lo", 0), ("DUPLOYAN LETTER NASAL I", "Lo", 0), ("DUPLOYAN LETTER NASAL A", "Lo", 0), ("DUPLOYAN LETTER PERNIN AN", "Lo", 0), ("DUPLOYAN LETTER PERNIN AM", "Lo", 0), ("DUPLOYAN LETTER SLOAN EN", "Lo", 0), ("DUPLOYAN LETTER SLOAN AN", "Lo", 0), ("DUPLOYAN LETTER SLOAN ON", "Lo", 0), ("DUPLOYAN LETTER VOCALIC M", "Lo", 0), (), (), (), (), (), ("DUPLOYAN AFFIX LEFT HORIZONTAL SECANT", "Lo", 0), ("DUPLOYAN AFFIX MID HORIZONTAL SECANT", "Lo", 0), ("DUPLOYAN AFFIX RIGHT HORIZONTAL SECANT", "Lo", 0), ("DUPLOYAN AFFIX LOW VERTICAL 
SECANT", "Lo", 0), ("DUPLOYAN AFFIX MID VERTICAL SECANT", "Lo", 0), ("DUPLOYAN AFFIX HIGH VERTICAL SECANT", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED SECANT", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED LEFT-TO-RIGHT SECANT", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED TANGENT", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED TAIL", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED E HOOK", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED I HOOK", "Lo", 0), ("DUPLOYAN AFFIX ATTACHED TANGENT HOOK", "Lo", 0), (), (), (), ("DUPLOYAN AFFIX HIGH ACUTE", "Lo", 0), ("DUPLOYAN AFFIX HIGH TIGHT ACUTE", "Lo", 0), ("DUPLOYAN AFFIX HIGH GRAVE", "Lo", 0), ("DUPLOYAN AFFIX HIGH LONG GRAVE", "Lo", 0), ("DUPLOYAN AFFIX HIGH DOT", "Lo", 0), ("DUPLOYAN AFFIX HIGH CIRCLE", "Lo", 0), ("DUPLOYAN AFFIX HIGH LINE", "Lo", 0), ("DUPLOYAN AFFIX HIGH WAVE", "Lo", 0), ("DUPLOYAN AFFIX HIGH VERTICAL", "Lo", 0), (), (), (), (), (), (), (), ("DUPLOYAN AFFIX LOW ACUTE", "Lo", 0), ("DUPLOYAN AFFIX LOW TIGHT ACUTE", "Lo", 0), ("DUPLOYAN AFFIX LOW GRAVE", "Lo", 0), ("DUPLOYAN AFFIX LOW LONG GRAVE", "Lo", 0), ("DUPLOYAN AFFIX LOW DOT", "Lo", 0), ("DUPLOYAN AFFIX LOW CIRCLE", "Lo", 0), ("DUPLOYAN AFFIX LOW LINE", "Lo", 0), ("DUPLOYAN AFFIX LOW WAVE", "Lo", 0), ("DUPLOYAN AFFIX LOW VERTICAL", "Lo", 0), ("DUPLOYAN AFFIX LOW ARROW", "Lo", 0), (), (), ("DUPLOYAN SIGN O WITH CROSS", "So", 0), ("DUPLOYAN THICK LETTER SELECTOR", "Mn", 0), ("DUPLOYAN DOUBLE MARK", "Mn", 1), ("DUPLOYAN PUNCTUATION CHINOOK FULL STOP", "Po", 0), )
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/bugs/cite-locate_00.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page #set page(width: 180pt) #set heading(numbering: "1") #outline( title: [List of Figures], target: figure.where(kind: image), ) #pagebreak() = Introduction <intro> #figure( rect[-- PIRATE --], caption: [A pirate @arrgh in @intro], ) #locate(loc => [Citation @distress on page #loc.page()]) #pagebreak() #bibliography("/assets/files/works.bib", style: "chicago-notes")
https://github.com/daniel-eder/typst-template-jku
https://raw.githubusercontent.com/daniel-eder/typst-template-jku/main/src/template/definitions/programme_types.typ
typst
// SPDX-FileCopyrightText: 2023 <NAME> // // SPDX-License-Identifier: Apache-2.0 #let programme_types = ( diploma: "Diploma Programme", doctorate: "Doctoral Programme", master: "Master's Programme" )
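A minimal usage sketch of this lookup table — the import path is an assumption and should be adjusted to the template's actual layout:

```typst
// Hypothetical import path; adjust to where this file lives in the template.
#import "definitions/programme_types.typ": programme_types

// Look up the display label for the selected programme kind.
#let programme_label = programme_types.at("master") // "Master's Programme"
```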
https://github.com/antran22/typst-cv-builder
https://raw.githubusercontent.com/antran22/typst-cv-builder/main/lib/resume/language.typ
typst
MIT License
#let ResumeLanguageSection(languages) = {
  stick_together(
    threshold: 60pt,
    [= Languages],
    stack(
      dir: ltr,
      spacing: 24pt,
      ..languages.map(language => {
        ResumeEntry(
          title: language.name,
          title-r: language.description,
          subtitle: language.date,
          subtitle-r: language.at("keywords", default: ()).join(", "),
        )
        ResumeItem(
          cmarker.render(language.summary)
        )
      })
    )
  )
}
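A hedged usage sketch of the section above — the entry fields (`name`, `description`, `date`, `keywords`, `summary`) are inferred from the `map` call, with `keywords` optional and `summary` passed through `cmarker.render`, so it may contain Markdown:

```typst
// Illustrative data only; field values are hypothetical.
#ResumeLanguageSection((
  (
    name: "English",
    description: "Professional working proficiency",
    date: "2015 – present",
    keywords: ("Business writing", "Presentations"),
    summary: "Daily working language in an *international* team.",
  ),
))
```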
https://github.com/LEXUGE/poincare
https://raw.githubusercontent.com/LEXUGE/poincare/main/src/notes/math_methods/main.typ
typst
MIT License
#import "@preview/physica:0.9.2": * #import "@preview/gentle-clues:0.4.0": * #import "@lexuge/templates:0.1.0": * #import shorthands: * #import pf3: * #show: simple.with( title: "Mathematical Methods", authors: ((name: "<NAME>", email: "<EMAIL>"),), ) #let unproven = text(red)[This is not proven yet.] #let unfinished = text(red)[This is not finished yet.] #pagebreak() = Vector Space and Hilbert Space == Vector Spaces We omit the definition of a general vector space. This can be found in @abdel[Definition 1.1] or @nadir[Page 12]. Unless otherwise stated, - we understand that $V$ means a vector space. - we use $FF$ to denote the underlying field of $V$. - $cal(L)(V, W)$ means the vector space of linear maps between the vector spaces $V$ and $W$. It has _nothing to do_ with $Lp$ spaces we define later. We also omit the definition of *linear dependence*; it can be found in @abdel[Definition 1.3]. Remember linear dependence on an infinite set is defined via linear combinations of finite subsets, because an infinite sum requires the notion of convergence and we don't even have a norm in such a general setting. We give the definition of the span of some set. #def( "Span", )[ Let $cal(X) subset.eq V$, $vb(v) in span cal(X)$ if $vb(v)$ can be expressed as a linear combination of $cal(X)$. Recall a linear combination is finite, so in particular, $ vb(v) = sum_(i=1)^N c_i vb(u)_i, vb(u)_i in cal(X) $ ] #thm( "Span gives a subspace", )[ For any $emptyset eq.not cal(X) subset.eq V$, $span cal(X)$ is a subspace of $V$. ] #proof[ #pfstep[$vb(0)$ is in $span cal(X)$][ #pfstep[There exists a vector $vb(v)$ in $cal(X)$][$cal(X)$ is not empty.] #pfstep[$vb(0) = 0 cdot vb(v) in span cal(X)$][$0 cdot vb(v)$ is a linear combination of $cal(X)$ and by definition of span it's included in $span cal(X)$.] ] #pfstep[Addition is closed in $span cal(X)$][ #pflet[$vb(v), vb(w) in span cal(X)$.] By definition of span, $vb(v), vb(w)$ can be expressed as linear combinations.
$ vb(v) &= sum_(i=1)^N c_i vb(a)_i, vb(a)_i in cal(X) \ vb(w) &= sum_(i=1)^M d_i vb(b)_i, vb(b)_i in cal(X) $ Thus $ vb(v) + vb(w) = sum_(i=1)^N c_i vb(a)_i + sum_(i=1)^M d_i vb(b)_i $ is still a linear combination of ${vb(a)_i, vb(b)_i} subset.eq cal(X)$. Thus $vb(v) + vb(w) in span cal(X)$. ] #pfstep( finished: true, )[Scalar multiplication is closed in $span cal(X)$][ #pflet[$vb(v) in span cal(X); vb(v) = sum_(i=1)^N c_i vb(a)_i, vb(a)_i in cal(X)$] Thus $lambda vb(v) = sum_(i=1)^N (lambda c_i) vb(a)_i$ is still a linear combination of $cal(X)$. Thus $lambda vb(v) in span cal(X)$. ] ] #def( "Basis and Dimension", )[ If there exists a linearly independent $cal(X) subset.eq V$ such that $span cal(X) = V$, then $cal(X)$ is a basis of $V$. If there exists a finite basis $cal(B)$, then $V$ is finite dimensional, otherwise $V$ is infinite dimensional. ] #thm( "Finite-dimensional space has a well-defined dimension", )[ Let $cal(B)_1, cal(B)_2$ be two bases of some finite-dimensional $V$, then the number of elements in $cal(B)_1$ is the same as that of $cal(B)_2$. And $V$ has no infinite basis. Thus we may define unambiguously the dimension of $V$ as the number of elements in any basis of $V$, denoted also as $dim V =: n$. #unproven ] #def( "Commonly used infinite-dimensional spaces", )[ Just to fix the notation, we define $ cal(P)_n (I) &:= { text("polynomials of order ") lt.eq n text("with domain " I subset.eq RR) } \ cal(P)(I) &:= { text("polynomials of any order with domain " I subset.eq RR) }\ C^(n) (I) &:= { text("functions with ") n text("-th derivative defined and continuous on " I) } $ And $ C^infinity (I) := sect.big_(n=0)^infinity C^(n) (I) $ ] == Non-degenerate Forms and Inner Product We want to keep the discussion general. So we will adopt the approach in @nadir[Section 2.6] which doesn't assume positive-definiteness for a metric. #def( "Non-degenerate Hermitian Form", )[ A Hermitian Form is a function $H: V times V to CC$ such that 1.
*Sesquilinear* $H$ is anti-linear#footnote[So $H(lambda vb(v), cdot) = macron(lambda) H(vb(v), cdot)$. Anti-linear in first argument is _natural_ for adoption in Dirac notation.] in the first argument, and linear in the second argument. 2. *Skew-symmetric* $H(vb(v), vb(w)) = overline(H(vb(w), vb(v)))$ 3. *Non-degenerate* For all $vb(0) eq.not vb(v) in V$, there exists $vb(w) in V$ such that $H(vb(v), vb(w)) eq.not 0$ ]<non-deg-hermitian> #def( "Metric", )[ Let $H$ be a non-degenerate Hermitian Form as defined above. If $H$ is indeed symmetric (i.e. $H: V times V to RR$)#footnote[which also implies underlying field is $RR$, as $RR in.rev H(vb(v), i vb(w)) = i H(vb(v), vb(w)) in CC$ wouldn't work.]. Then we say $H$ is also a metric. ] #info[ The reason we require "realness" for metric is more like a convention, and related to how later we lower and raise indices. In the language of Einstein notation, the "metric dual" $vb(tilde(v))$ of $vb(v)$ is given by $ tensor(tilde(v), -nu) = tensor(H, -mu, -nu) tensor(v, +mu) $ And if $H$ is not symmetric (underlying space is not real), then we have to write $ tensor(tilde(v), -nu) = tensor(H, -mu, -nu) overline(tensor(v, +mu)) $ instead. This is due to the anti-linear nature of the first argument of $H$. #text( red, )[This is actually gonna be complicated if not real. How will we think of $f(vb(v))$ in Einstein notation if $f$ is _anti-linear_ functional? Apparently $tensor(f, -mu) overline(tensor(v, +mu)) eq.not overline(tensor(f, -mu)) tensor(v, +mu)$. How do we explain such asymmetry?] The whole point is: we are calling $H$ the metric tensor, and by definition, a tensor must be multi-linear instead of sesquilinear. 
] #def( "Inner Product", )[ An inner product $braket(cdot, cdot): V times V to CC$ is a non-degenerate Hermitian form that is - *positive definite* $braket(vb(v), vb(v)) > 0$ for all $vb(v) eq.not vb(0)$ ]<inner-product> #info[ Later when we introduce more Dirac notation, we will see $braket(vb(v), vb(w))$ is best not thought of as the result of an inner product but rather as the dual vector $bra(vb(v))$ acting on the vector $ket(vb(w))$. However, the definition of $bra(vb(v))$ depends on the canonical identification of $V^*$ with $V$, which depends on the inner product#footnote[Actually positive-definiteness is not required, non-degeneracy is enough. And this is indeed the case for special relativity, where we have the canonical duality but not the inner product. And we can use metric to "raise/lower the indices".]. Therefore, to avoid circularity, we should for now think of $braket(vb(v), vb(w))$ as one piece: the inner product, rather than the composition $bra(vb(v))(ket(vb(w)))$ with lines and brackets removed. ] /*#warning[ Notation: we used $H$ to denote non-degenerate Hermitian form (@non-deg-hermitian) and $braket(cdot, cdot)$ for inner product @inner-product. However, we _will_ mix them together, so $braket(cdot, cdot)$ will not necessarily be ]*/ #thm( "Cauchy-Schwarz inequality", )[ For all $vb(v), vb(w) in V$ (where $V$ has an inner product), we have $ |braket(vb(v), vb(w))|^2 lt.eq braket(vb(v), vb(v)) braket(vb(w), vb(w)) $ #unproven ] There is another, _independent_ notion of norm for a vector space. #def( "Norm", )[ A norm $norm(cdot): V to RR$ is a function such that 1. *Scaling* $norm(lambda vb(a)) = |lambda| norm(vb(a))$. 2. *Positive-definiteness* $norm(vb(a)) gt.eq 0$, with equality if and only if $vb(a) = vb(0)$. 3. *Triangular inequality* $norm(vb(a) + vb(b)) lt.eq norm(vb(a)) + norm(vb(b))$.
]<norm-axioms> If a space has an inner product $braket(cdot, cdot)$, we can define an *induced norm* $norm(cdot)$ simply by $ norm(vb(v)) = sqrt(braket(vb(v), vb(v))) $ #thm( "Inner Product indeed induces norm", )[ The norm defined by $ norm(vb(v)) = sqrt(braket(vb(v), vb(v))) $ satisfies the @norm-axioms. #unproven ] Again, there exist many other interesting norms, like the $p$-norm, but they are not useful for our theory and applications. From the inner product and its induced norm, we have a very important notion of orthonormal basis. #def( "Orthonormal and orthogonal basis", )[ Let $V$ be equipped with a non-degenerate Hermitian form $H$. A basis $cal(B)$ is orthogonal if for any $vb(v), vb(w) in cal(B)$#footnote[We are formulating in such a weird way to contain the case where $cal(B)$ is uncountable. However, such a case will not occur in these notes, and is not useful in physics to my knowledge.], $ H(vb(v), vb(w)) eq.not 0 text(" if and only if ") vb(v) = vb(w) $ A basis is orthonormal if it's orthogonal and for any $vb(v) in cal(B)$, $H(vb(v), vb(v)) = plus.minus 1 $ ]<orthonormal-basis> #info[ Notice we defined orthonormal with $H(vb(v), vb(v)) = plus.minus 1$. This is to include useful physical examples like special relativity. Note also that @orthonormal-basis doesn't depend on an inner product! A non-degenerate Hermitian form is enough! ] We have a very beautiful (yet advanced to prove) result. #thm( "Every space has orthonormal basis", )[ Given any basis $cal(B)$, we can construct an orthonormal basis $cal(B')$. #text( red, )[we actually also have claim on number of positive and negative norm!] #unproven ] == Dual Space <sec-dual-space> #def( "Dual Space", )[ The dual space of a vector space $V$ is defined as $V' equiv cal(L)(V, FF)$. Vectors $f in V'$ are called dual vectors or linear functionals. The additive identity is the zero functional $0(vb(v)) := 0 in FF$.
] #def( "Dual Basis", )[ Given a basis ${ vb(v)_i }_(i=1)^N$ of $V$, its dual basis ${ vb(v)^i }_(i=1)^N$ are defined by $ vb(v)^i (vb(v)_j) = cases(0 "if" i eq.not j, 1 "if" i = j) $ ]<dual-basis> #info[ This $vb(v)^i (vb(v)_j)$ _is_ indeed the coordinate representation of the $(1,1)$ identity tensor $ II(vb(v), f) := f(vb(v)) $ under basis ${ vb(v)^j tp vb(v)_i }$. So in fact from the perspective of tensor component we can also write $ overbrace( vb(v)_j tp vb(v)^i (II) = II(vb(v)_j tp vb(v)^i), "think " II "as double dual", ) = II(vb(v)_j, vb(v)^i) = vb(v)^i (vb(v)_j) = tensor(delta, -j, +i) $ where $delta equiv II$, and $tensor(delta, -j, +i)$ is its component. #text(yellow)[Actually this is where the universal property kicks in?] ] An important property #thm("Dual Basis gives coordinates")[ For any $vb(w) = sum_(i=1)^N w^i vb(v)_i $, $ w^i = vb(v)^i (vb(w)) $ ]<dual-basis-give-coordinate> #proof[ Plug in the expansion of $vb(w)$ to the right-hand side and evaluate $vb(v)^i (vb(w))$ by @dual-basis. ] #thm[Dual Basis is a Basis of $V'$ when $V$ is finite-dimensional]<dual-basis-is-a-basis> #proof[ #pfstep[Dual Basis is linearly independent][ Consider the linear combination $sum_(i=1)^N a_i vb(v)^i = 0 in V'$, apply it to $vb(v)_k$ one by one $ 0 = sum_(i=1)^N a_i vb(v)^i (vb(v)_k) = sum_(i=1)^N a_i tensor(delta, +i, -k) = a_k =0 $ Thus $a_k = 0$ for all $k$. ] #pfstep[It spans $V'$][ #pfstep[For any $f in V'$, $f = sum_(i=1)^N f(vb(v)_i) vb(v)^i $][ (This is actually one example of the tensor contraction.) We verify by plugging in. Expand an arbitrary $vb(w) equiv sum_(i=1)^N w^i vb(v)_i in V$. Then $ f(vb(w)) &= f(sum_(i=1)^N w^i vb(v)_i) \ &= sum_(i=1)^N w^i f(vb(v)_i) \ &= sum_(i=1)^N f(vb(v)_i) vb(v)^i (vb(w)) \ &= (sum_(i=1)^N f(vb(v)_i) vb(v)^i ) (vb(w)) $ where the second last line is by @dual-basis-give-coordinate. 
] ] ] === Metric Dual #def( "Metric Dual", )[ If $V$ has a non-degenerate Hermitian form, then we can define an _anti-linear_ mapping $L: V to V'$ by $ tilde(vb(v))(vb(w) in V) equiv (L vb(v))(vb(w)) = H(vb(v), vb(w)) $ ]<metric-dual> Now $L$ has some important properties to make it work. #thm[$L$ is injective] #proof[ We need to show $L(vb(v)) = L(vb(w)) arrow.double vb(v) = vb(w)$ #pfstep[$tilde(vb(v)) = tilde(vb(w)) arrow.double H(vb(v) - vb(w), vb(a)) = 0$ for all $vb(a)$][ #pfstep[For all $vb(a)$, $tilde(vb(v))(vb(a)) = tilde(vb(w))(vb(a))$][By definition of $tilde(vb(v)) = tilde(vb(w))$] #pfstep[$H(vb(v) - vb(w), vb(a)) = 0$][ Expand $tilde(vb(v)) = tilde(vb(w))$ by definition and use anti-linearity in the first argument of $H$ ] ] #pfstep( finished: true, )[$vb(v) = vb(w)$][ By non-degeneracy of $H$ (see @non-deg-hermitian), if $vb(v) - vb(w) eq.not vb(0)$, then there exists $vb(a)$ such that $H(vb(v) - vb(w), vb(a)) eq.not 0$. However, this is not the case by Claim 1, a contradiction. ] ] #thm[Metric Dual of a basis is its dual basis if and only if orthonormal][ Let $cal(B) = {vb(v)_i}$ be a basis, then its dual basis $cal(B)'$ is equal to applying the metric dual to each of its basis vectors if and only if $cal(B)$ is orthonormal. ]<metric-dual-is-dual-basis> #proof[ #pfstep[Orthonormal $arrow.double$ metric dual is dual basis][ We have#footnote[Only under such a specific basis would the coordinate representation of $H$ be evaluated according to the Kronecker delta. In fact, there is no good definition of a (2,0) identity tensor, so $tensor(delta, -i, -j)$ should not be thought of as a coordinate representation of some tensor.]#footnote[We are _not_ raising the indices of $vb(v)$ to $tilde(vb(v))^i$ because this $i$ is not a component of $vb(v)$; instead, it's an index for the basis.] $ tilde(vb(v))_i (vb(v)_j) := H(vb(v)_i, vb(v)_j) = tensor(delta, -i, -j) $ Thus by definition of @dual-basis we know $tilde(vb(v))_i = vb(v)^i$.
] #pfstep( finished: true, )[Metric dual is dual basis $arrow.double$ orthonormal][ Metric dual is dual basis means#footnote[Again, as in previous footnotes, the indices positions etc doesn't match isn't an issue as the underlying tensors ($H$ and $II$) are not of the same type and are not equal.] $ tilde(vb(v))_i (vb(v)_j) := H(vb(v)_i, vb(v)_j) = vb(v)^i (vb(v)_j) = tensor(delta, -j, +i) $ Thus $cal(B)$ is orthonormal. ] ] #thm[$L$ is surjective if $V$ is finite-dimensional]<metric-dual-is-surjective> #proof[ #pfstep[$dim V' = dim V$][ By @dual-basis-is-a-basis, the basis of $V'$ has the same number of elements as basis of $V$. ] #pfstep( finished: true, )[$dim img L = dim V$][$dim img L = dim V - dim ker L = dim V = dim V'$] ] Thus, if $V$ is finite-dimensional, then #thm[$V$ and $V'$ are canonically isomorphic if $V$ is finite-dimensional] #proof[ They are canonically isomorphic through the bijective map $L: V to V'$. Notice how the definition of $L$ doesn't depends on choice of basis. ] === Double Dual We now have canonical identification of another space with $V$. #thm[$V''$ is canonically isomorphic to $V$ if $V$ is finite-dimensional][ We define the map $L: V to V''$#footnote[The $L$ here has nothing to do with the $L$ defined for metric dual] by $ L(vb(v))(phi) := phi(vb(v)) $ Prove this is bijective if $V$ is finite-dimensional. ]<double-dual-isomorphism> #proof[ #pfstep[$L$ is injective#footnote[The proof that we give actually applies even if $V$ is infinite dimensional. See also @inf-dim-dual-vector-kernel]][This is equivalent to proving $L(vb(v)) = 0$ if and only if $vb(v) = vb(0)$. The if part is evident, the only if part is as follows. #pfstep[If $phi(vb(v)) = 0$ for all $phi in V'$, then $vb(v) = vb(0)$][ If $vb(v)$ is non-zero, then we can extend a basis $cal(B)$ of $V$ from it, construct the dual basis of $cal(B)$. 
Then $vb(v)'(vb(v)) = 1 eq.not 0$, a contradiction. Thus $vb(v) = vb(0)$ ] ] #pfstep[$L$ is surjective][ #pfstep[$dim V = dim V' = dim V''$ in the finite-dimensional case][ By @dual-basis-is-a-basis, $dim V = dim V'$. Since $V'$ is also finite dimensional, we have $dim V' = dim V''$ as $V''$ is just the dual of $V'$. ] By the same argument as @metric-dual-is-surjective, we have $L$ being surjective as well. ] ] == $cal(L)^1, cal(L)^2$ Space Many of our problems and solutions are set in the $cal(L)^1, cal(L)^2$ spaces. A lot of mathematical constructs are needed to arrive at a rigorous theory. We thus will only pay attention to properties and subtleties important to our use. Specifically, integrals in these notes should be Lebesgue integrals in order for things to work. However, since we will not care about pathological cases and every Riemann integral gives the same result as the Lebesgue integral, we will just think of the Riemann integral (with some additional properties) anyway. #def( [$Lp$ spaces], )[ Let $I$ be some interval (possibly infinite) on $RR$, $f: I to CC$ is in $Lp(I)$ if $ integral_I |f(x)|^p differential(x) < oo $ ] Till now, $L1, L2$ are just sets. We need to show that they are actually vector spaces. Moreover, $L2$ will actually be an inner product (and thus also normed) space. Unfortunately, we cannot give a rigorous proof on integrability. However, we will do our best to give some insights on why it's a vector space. #thm([$L1$ spaces are vector spaces.])<L1-is-a-vector-space> #proof[ The "hard" part is to prove if $f,g in L1$, then $f+g in L1$. #pfstep[$|f(x) + g(x)| lt.eq |f(x)| + |g(x)|$][Triangular inequality in $RR$.] #pfstep( finished: true, )[if $|f(x) + g(x)|$ is integrable, then its integral is finite.][For the Riemann integral (and integrable $f,g$), if $f lt.eq g$, $integral_I f lt.eq integral_I g$. Thus $integral_I |f(x) + g(x)| lt.eq integral_I |f(x)| + |g(x)| < oo$ if $|f(x) + g(x)|$ is integrable.]
] #thm([$L2$ spaces are vector spaces.])<L2-is-a-vector-space> #proof[ Again, the hard part is to prove if $f,g in L2$, then $f+g in L2$. #pfstep( finished: true, )[$|f(x) + g(x)|^2 lt.eq 2(|f(x)|^2 + |g(x)|^2)$][ #pfstep[For any $a,b in CC$, $2|a||b| lt.eq |a|^2 + |b|^2$][ $ 0 &lt.eq (|a| - |b|)^2 \ &= |a|^2 + |b|^2 - 2|a||b| $ Thus $2|a||b| lt.eq |a|^2 + |b|^2$. ] #pfstep[For any $a in CC$, $|a| gt.eq Re(a)$][ $ |a|^2 gt.eq |a|^2 - Im(a)^2 = Re(a)^2 $ ] And $ |f(x) + g(x)|^2 &= |f(x)|^2 + |g(x)|^2 + overline(f(x)) g(x) + f(x) overline(g(x)) \ &= |f(x)|^2 + |g(x)|^2 + 2 Re(overline(f(x)) g(x)) \ &lt.eq |f(x)|^2 + |g(x)|^2 + 2 |overline(f(x)) g(x)| \ &= |f(x)|^2 + |g(x)|^2 + 2 |f(x)||g(x)| \ &lt.eq |f(x)|^2 + |g(x)|^2 + |f(x)|^2 + |g(x)|^2 = 2(|f(x)|^2 + |g(x)|^2) $ ] Then we can just follow the reasoning in @L1-is-a-vector-space. ] One further step is needed: we need to identify $f,g$ by $ f tilde g arrow.l.r.double integral_I |f - g|^p = 0 $ This equivalence relation gives equivalence classes $[f]$. So $f = g$ means $[f] = [g]$, which is equivalent to saying $f tilde g$. #text( red, )[Actually much is left out: how do we know integral will be well-defined for equivalence classes?] Under this identification, with Minkowski's inequality#footnote[For $L1$, our proof in @L1-is-a-vector-space is enough for proving the triangular inequality in @norm-axioms. However, for $L2$, our loose bound in @L2-is-a-vector-space $|f(x) + g(x)|^2 lt.eq 2(|f(x)|^2 + |g(x)|^2)$ is not enough.], we actually have $Lp$ as normed spaces, with norm $ norm(f) := (integral_I |f|^p)^(1/p) $ Now, we want to make $L2$ (which we use mainly) an inner product space.
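The $Lp$ norm just defined can be sanity-checked numerically. Below is a small sketch for $p = 2$ (the midpoint quadrature and the particular test functions are arbitrary choices, not from the text) verifying the scaling axiom and the triangle inequality:

```python
import numpy as np

# Midpoint-rule quadrature nodes on I = [0, 1]
N = 10_000
dx = 1.0 / N
x = (np.arange(N) + 0.5) * dx

def l2_norm(f):
    """norm(f) = (integral_I |f|^2)^(1/2), approximated by a Riemann sum."""
    return np.sqrt(np.sum(np.abs(f(x)) ** 2) * dx)

f = lambda t: np.sin(2 * np.pi * t) + 1j * t   # arbitrary complex-valued test functions
g = lambda t: np.exp(-t) * np.cos(5 * t)

# Scaling: ||c f|| = |c| ||f||
print(np.isclose(l2_norm(lambda t: (2 - 1j) * f(t)), abs(2 - 1j) * l2_norm(f)))
# Triangle inequality: ||f + g|| <= ||f|| + ||g||
print(l2_norm(lambda t: f(t) + g(t)) <= l2_norm(f) + l2_norm(g))
```

Both checks print `True`; with the norm in hand, we return to making $L2$ an inner product space.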
This is done by defining inner product $ braket(f, g) := integral_I overline(f) g $ #thm[$L2$ is an inner product space][ $ braket(f, g) := integral_I overline(f) g $ satisfies the @inner-product ] #proof[ #pfstep[$braket(f, g)$ is sesquilinear][ By linearity of the integral ] #pfstep[$braket(f, g)$ is skew-symmetric][ We have by definition of complex integral on real line that, $ integral_I f = integral_I Re(f) + i integral_I Im(f) $ Thus $ overline(braket(f, g)) = overline(integral_I overline(f)g) = integral_I f overline(g) = braket(g, f) $ by simple algebra. ] #pfstep( finished: true, )[$braket(f, f) gt.eq 0$ and takes equality only when $f=0$][ For Riemann integral, $f gt.eq 0$ implies $integral_I f gt.eq 0$. Thus $|f|^2 gt.eq 0$ means $braket(f, f) gt.eq 0$. By our identification, we know $braket(f, f) =0 arrow.r.double f tilde 0 arrow.r.double f = 0 $ by definition. ] ] #info[Riemann integral has integrability issues. In particular, we could have $f^2, g^2$ integrable while $f g$ don't, which makes inner product ill-defined. See @abdel[Exercise 1.21] for a simple example based on indicator function of $QQ$. But such subtleties don't affect the general applications.] === Relation between $L1, L2$ === Sequences and limits in $L2$ === Orthonormal basis in $L2$ #pagebreak() = Tensors There are two main concepts: - Tensor - Tensor Product $tp$ The idea is that any (multi-)linear object is a tensor. And tensor product is an operation that helps us to build multi-linear object out of linear objects (e.g. vector spaces). #info[ Without otherwise stated, in this section, all operator of the form $V_1 times V_2 times dots.c times V_n to W$ are (multi-)linear. ]<conv-all-multi-linear> == Tensors #def( "Tensor", )[ Let $V_1, V_2, dots V_n$ be finite dimensional vector spaces over the same field $FF$. 
A tensor is a multi-linear functional $ tau: V_1 times V_2 times dots.c times V_n to FF $ If in particular, $ V_1 times V_2 times dots.c times V_n = overbrace(V times V times dots.c times V, r "times") times underbrace(V' times V' times dots.c times V', s "times") $ where $r+s = n$ of course, then we call $tau$ a *type $(r,s)$ tensor on vector space $V$*. ]<def-tensor> #thm( "Tensors of the same signature form a vector space", )[ This means, $ cal(L)(V_1, V_2, dots.c, V_n) := \{ tau| tau: V_1 times V_2 times dots.c times V_n to FF \} $ is a vector space (with addition and scalar multiplication properly defined). We call such space tensor space. In particular, tensors of the type $(r,s)$ over the same vector space $V$ forms a space. We denote such space as $ tau^r_s (V) $ as we use such space a lot. ]<tensors-form-space> #proof[ We define the $+, cdot$ operator as usually defined for functions. It's straightforward to verify multi-linearity is preserved under these operations and all axioms are satisfied. ] == Tensor Products Now, much of the ink will be devoted to define the tensor product $tp$ and how this operator gives a unified way to construct tensor spaces. #def[Tensor product of two vectors][ Given $vb(v) in V, vb(w) in W$, define $ vb(v) tp vb(w): V' times W' &to FF\ (h, g) &sendto vb(v)(h) vb(w)(g) equiv h(vb(v)) g(vb(w)) $ where $vb(v)(h) vb(w)(g) equiv h(vb(v)) g(vb(w))$ is due to @double-dual-isomorphism. ]<tp-vectors> By @def-tensor, $vb(v) tp vb(w)$ is a tensor. And indeed it lives in the tensor space $cal(L)(V', W')$. We now are set to explore the dimension and basis for $cal(L)(V', W')$. #thm[Basis of tensor space][ Let $n = dim V, m = dim W$, ${vb(a)_i}_(i=1)^n$ be a basis of $V$, ${vb(b)_j}_(j=1)^m$ be a basis of $W$, the set $ { vb(a)_i tp vb(b)_j in cal(L)(V', W')} $ is a basis of the vector space $cal(L)(V', W')$ ]<basis-of-tensor-space> #proof[ Let ${vb(a)^i}, {vb(b)^j}$ be the dual basis (@dual-basis) of ${vb(a)_i}, {vb(b)_j}$. 
#pfstep[${ vb(a)_i tp vb(b)_j}$ is linearly independent][ Let $sum_(i,j) c_(i,j) vb(a)_i tp vb(b)_j = 0$. #pfstep[$c_(i,j) = 0$ for all $i,j$][ $ 0 = (sum_(i,j) c_(i,j) vb(a)_i tp vb(b)_j) (vb(a)^k, vb(b)^l) &= sum_(i,j) c_(i,j) vb(a)_i tp vb(b)_j (vb(a)^k, vb(b)^l) \ &= sum_(i,j) c_(i,j) vb(a)^k (vb(a)_i) vb(b)^l (vb(b)_j) \ &= c_(k,l) $ ] By definition of linear independence, ${ vb(a)_i tp vb(b)_j}$ is linearly independent. ] #pfstep( finished: true, )[${ vb(a)_i tp vb(b)_j}$ spans $cal(L)(V', W')$][ Let $tau in cal(L)(V', W')$. #pfstep[$tau = sum_(i,j) tau(vb(a)^i, vb(b)^j) vb(a)_i tp vb(b)_j$ ][ For all $h = sum_i h_i vb(a)^i, g = sum_j g_j vb(b)^j$, we have $ (sum_(i,j) tau(vb(a)^i, vb(b)^j) vb(a)_i tp vb(b)_j) (h,g) &= sum_(i,j) tau(vb(a)^i, vb(b)^j) vb(a)_i tp vb(b)_j (h,g) \ &= sum_(i,j) tau(vb(a)^i, vb(b)^j) vb(a)_i (h) vb(b)_j (g) \ &= sum_(i,j) tau(vb(a)^i, vb(b)^j) h_i g_j \ &= tau(sum_i h_i vb(a)^i, sum_j g_j vb(b)^j) = tau(h, g) $ ] where we used @dual-basis-give-coordinate in the third line. ] ] #remark[This proves that $dim cal(L)(V', W') = dim V' dim W' = dim V dim W$ for finite-dimensional $V, W$.] #remark[By switching $V'$ and $V$ and etc., we have also got $dim cal(L)(V, W) = dim V dim W$.] Now, we define the tensor product for two vector spaces. #def[Tensor Product for Two Vector Spaces][ Define $ V tp W := span { vb(v) tp vb(w) | vb(v) in V, vb(w) in W } $ ]<tp-spaces> And indeed #thm[$V tp W = cal(L)(V', W')$]<tp-gives-tensor-space> #proof[ #pfstep[$V tp W subset.eq cal(L)(V', W')$][ $vb(v) tp vb(w) in cal(L)(V', W')$ and $cal(L)(V', W')$ is a vector space, thus linear combinations of $vb(v) tp vb(w)$ still lives in $cal(L)(V', W')$. ] #pfstep( finished: true, )[$cal(L)(V', W') subset.eq V tp W$][ Clearly, ${ vb(a)_i tp vb(b)_j } subset V tp W$ where $vb(a)_i, vb(b)_j$ are defined as in @basis-of-tensor-space. 
By @basis-of-tensor-space, we know $ cal(L)(V', W') = span { vb(a)_i tp vb(b)_j } subset.eq span { vb(v) tp vb(w) | vb(v) in V, vb(w) in W } equiv V tp W $ ] ] #remark[Similarly, $ V' tp W' = cal(L)(V, W) $ ] This means, at least in the bilinear case, all tensors can be completely reconstructed from tensor product operation! Now, as it would later prove to be useful, we want to: - Define the dual of the tensor space. - Use tensor product to construct $cal(L)(V, W, Z)$. - For $tau^r_s (V)$ and $V$ with non-degenerate Hermitian Form $H$ (or inner product), we want to define inner product on $tau^r_s (V)$ == Universal Property Universal Property will be a promising tool that allows us to show $ (V tp W)' caniso V' tp W' $ in @dual-is-commutative-with-tp and $ (V tp W) tp Z caniso V tp (W tp Z) caniso cal(L)(V', W', Z') $ where $caniso$ means canonically isomorphic. #thm[Universal Property][ 1. Let $tau in cal(L)(V, W)$, there exists a unique function $hat(tau) in (V tp W)'$ such that $ hat(tau)(vb(v) tp vb(w)) = tau(vb(v), vb(w)) $ for all $vb(v),vb(w)$. 2. Let $hat(tau) in (V tp W)'$, there exists a unique $tau in cal(L)(V, W)$ such that $ tau(vb(v), vb(w)) = hat(tau)(vb(v) tp vb(w)) $ for all $vb(v),vb(w)$. ]<univ-prop-1> #proof[See @ladr[Theorem 9.79, page 375], though the actual proof is neither hard nor long.] This is immediately yields #thm[$(V tp W)' caniso V' tp W'$]<dual-is-commutative-with-tp> #proof[ We are basically showing that @univ-prop-1 gives us a canonical isomorphism between $cal(L)(V, W)$ and $(V tp W)'$. #pfstep[$dim cal(L)(V, W) = dim (V tp W)'$][ By @tp-gives-tensor-space, we have $ dim cal(L)(V, W) = dim V tp W $ Since it's finite-dimensional, we have $dim (V tp W)' = dim V tp W$ as well. ] Now universal property allows us to define a mapping $L_1: cal(L)(V, W) to (V tp W)'$. Define $ L_1 tau = hat(tau) $ where $tau, hat(tau)$ are given in @univ-prop-1 part 1. This is well-defined as part 1 asserts that for any $tau$, such $hat(tau)$ is unique. 
#pfstep[$L_1$ is injective][ If $L_1 tau = vb(0)$, then by definition of @univ-prop-1, $ tau(vb(v), vb(w)) = vb(0)(vb(v) tp vb(w)) = 0 $ for all $vb(v), vb(w)$ This means $tau = vb(0)$. Thus $ker L_1 = {vb(0) in cal(L)(V, W)}$. ] Since dimension match and $L_1$ is injective, we can use the same technique as in @metric-dual-is-surjective to prove $L_1$ is surjective as well. #pfstep(finished: true)[$(V tp W)' caniso V' tp W'$][ By @tp-gives-tensor-space, $V' tp W' = cal(L)(V, W)$, thus we have $ V' tp W' = cal(L)(V, W) caniso (V tp W)' $ ] ] #remark[This isomorphism is indeed very explicit. We know $vb(a)^i tp vb(b)^j in cal(L)(V, W)$. By definition of @univ-prop-1, $ (L_1 vb(a)^i tp vb(b)^j)( vb(a)_k tp vb(b)_l) &= vb(a)^i tp vb(b)^j (vb(a)_k, vb(b)_l) \ &= vb(a)^i (vb(a)_k) vb(b)^j (vb(b)_l) = tensor(delta, -k, +i) tensor(delta, -l, +j) $]<explicit-dual-basis> #idea[Therefore, _under isomorphism_, ${vb(a)^i tp vb(b)^j}$ is equivalent to the dual basis of ${vb(a)_i tp vb(b)_j}$! This is not a direct result and we derived it!] #info[ This is not just mathematical nit picking. This is indeed the mechanism that powers the following rule in Dirac notation: $ (ketbra(psi, phi))^dagger = ketbra(phi, psi) $ ] We can extend the universal property a bit and prove the associativity as well! #thm[Universal Property - Extended][ 1. Let $Gamma in cal(L)(V, W, Z)$, there exists a unique function $hat(Gamma) in cal(L)(V, W tp Z)$ such that $ hat(Gamma)(vb(v), vb(w) tp vb(z)) = Gamma(vb(v), vb(w), vb(z)) $ for all $vb(v),vb(w), vb(z)$. 2. Let $hat(Gamma) in cal(L)(V, W tp Z)$, there exists a unique $Gamma in cal(L)(V, W, Z)$ such that $ Gamma(vb(v), vb(w), vb(z)) = hat(Gamma)(vb(v),vb(w) tp vb(z)) $ for all $vb(v),vb(w), vb(z)$. ]<univ-prop-2> #proof[ #pfstep[Part 1 is correct][ Fix $vb(v)$ first, then $Gamma (vb(v), cdot, cdot) in cal(L)(W, Z)$. 
By @univ-prop-1 we can define a unique $hat(tau)_vb(v) in (W tp Z)'$ such that $ hat(tau)_vb(v) (vb(w) tp vb(z)) := Gamma (vb(v), vb(w), vb(z)) $ for all $vb(w), vb(z)$. And we define $ hat(Gamma)(vb(v), vb(w) tp vb(z)) := hat(tau)_vb(v) (vb(w) tp vb(z)) $ It remains to prove that $hat(Gamma)$ is bilinear. Linearity in the second argument (i.e. the $W tp Z$ space) is easy as we know $hat(tau)_vb(v)$ is linear as provided by @univ-prop-1. And for the first argument, $ hat(Gamma)(vb(v)_1 + c vb(v)_2, vb(w) tp vb(z)) &:= hat(tau)_(vb(v)_1 + c vb(v)_2) (vb(w) tp vb(z)) \ &= Gamma (vb(v)_1 + c vb(v)_2, vb(w), vb(z)) \ &= Gamma (vb(v)_1, vb(w), vb(z)) + c Gamma(vb(v)_2, vb(w), vb(z)) \ &= hat(tau)_(vb(v)_1) (vb(w) tp vb(z)) + c hat(tau)_(vb(v)_2) (vb(w) tp vb(z)) \ &= hat(Gamma)(vb(v)_1, vb(w) tp vb(z)) + c hat(Gamma)(vb(v)_2, vb(w) tp vb(z)) $ ] #pfstep( finished: true, )[Part 2 is correct][ We do the same construction as above. Fix $vb(v)$ for the moment, then $hat(Gamma) (vb(v), cdot) in (W tp Z)'$. By @univ-prop-1 we can define the unique bilinear functional $ tau_vb(v)(vb(w), vb(z)) := hat(Gamma) (vb(v), vb(w) tp vb(z)) $ And define $ Gamma (vb(v), vb(w), vb(z)) := tau_vb(v)(vb(w), vb(z)) $ This is well defined as for each $vb(v)$, $tau_vb(v)$ is unique. Now, to prove the trilinearity. The bilinearity in the $W, Z$ arguments is easy as $tau_vb(v)$ is bilinear as provided by @univ-prop-1. The linearity in the first argument is given by $ Gamma (vb(v)_1 + c vb(v)_2, vb(w), vb(z)) &= tau_(vb(v)_1+ c vb(v)_2) (vb(w), vb(z)) \ &= hat(Gamma) (vb(v)_1 + c vb(v)_2, vb(w) tp vb(z)) \ &= hat(Gamma) (vb(v)_1, vb(w) tp vb(z)) + c hat(Gamma) (vb(v)_2, vb(w) tp vb(z)) \ &= tau_(vb(v)_1) (vb(w), vb(z)) + c tau_(vb(v)_2) (vb(w), vb(z)) \ &= Gamma (vb(v)_1, vb(w), vb(z)) + c Gamma (vb(v)_2, vb(w), vb(z)) $ ] ] #remark[ The proof can be easily adapted into proving the statement $hat(Gamma) (V tp W, Z) dots$. So where we tensor stuff together doesn't matter for this proof.
]<univ-prop-2-remark> We have an easy result #thm[$dim cal(L)(V, W, Z) = dim V dim W dim Z$]<trilinear-dimension> #proof[See @ladr[Theorem 9.87, page 378]] And this gives #thm[$V' tp (W tp Z)' caniso cal(L)(V, W, Z)$]<trilinear-caniso> #proof[ Let $Gamma in cal(L)(V, W, Z)$, define $L_2: cal(L)(V, W, Z) to V' tp (W tp Z)'$ by $ L_2 Gamma = hat(Gamma) $ where $hat(Gamma)$ is given by @univ-prop-2. #pfstep[$L_2$ is injective][ By exact analogue to @dual-is-commutative-with-tp Claim 2. ] #pfstep(finished: true)[$dim cal(L)(V, W, Z) = dim V' tp (W tp Z)'$][ By @trilinear-dimension, we have $ dim cal(L)(V, W, Z) &= dim V dim W dim Z \ &= dim V dim (W tp Z) \ &= dim V' tp (W tp Z)' $ ] Since dimension match and $L_2$ is injective, we can use the same technique as in @metric-dual-is-surjective to prove $L_2$ is surjective as well. ] #remark[By @univ-prop-2-remark, we can also prove that indeed $ (V tp W)' tp Z' caniso cal(L)(V, W, Z) $ ]<trilinear-caniso-remark-pre> Now, we can post-compose $L_2$ with $L_1$ to get another canonical isomorphism: #thm[For all $tau in cal(L)(V, W, Z)$, we have the canonical isomorphism $ L_1 L_2: cal(L)(V, W, Z) &to (V tp (W tp Z))' \ tau &sendto hat(tau) $ where $L_1, L_2$ are defined in @dual-is-commutative-with-tp, @trilinear-caniso respectively, such that $ hat(tau)(vb(v) tp (vb(w) tp vb(z))) = tau(vb(v), vb(w), vb(z)) $ for all $vb(v) in V, vb(w) in W, vb(z) in Z$. ]<explicit-trilinear-caniso> #remark[Similar to @trilinear-caniso-remark-pre, we can also prove that indeed $ ((V tp W) tp Z)' caniso cal(L)(V, W, Z) $ ]<trilinear-caniso-remark> #remark[ This theorem gives us an explicit candidate for dual basis of ${vb(a)_i tp (vb(b)_j tp vb(c)_k)}$ where ${vb(a)_i}, {vb(b)_j}, {vb(c)_k}$ are basis of $V, W, Z$. 
If we define $ vb(a)^i tp vb(b)^j tp vb(c)^k (cdot, cdot, cdot) := vb(a)^i (cdot) vb(b)^j (cdot) vb(c)^k (cdot) $ Then you can verify ${ L_1 L_2 vb(a)^i tp vb(b)^j tp vb(c)^k }$ is a dual basis for ${vb(a)_i tp (vb(b)_j tp vb(c)_k)}$ To see why: $ (L_1 L_2 vb(a)^i tp vb(b)^j tp vb(c)^k) (vb(a)_l tp (vb(b)_m tp vb(c)_n)) &= (L_2 vb(a)^i tp vb(b)^j tp vb(c)^k) (vb(a)_l, (vb(b)_m tp vb(c)_n)) \ &= vb(a)^i tp vb(b)^j tp vb(c)^k (vb(a)_l, (vb(b)_m, vb(c)_n))\ &= tensor(delta, +i, -l) tensor(delta, +j, -m) tensor(delta, +k, -n) $ Alternatively, ${ L_1 vb(a)^i tp (L_1 vb(b)^j tp vb(c)^k) }$ is also the dual basis of ${vb(a)_i tp (vb(b)_j tp vb(c)_k)}$: $ overbrace( L_1 underbrace( vb(a)^i tp overbrace((L_1 underbrace(vb(b)^j tp vb(c)^k, W' tp Z')), (W tp Z)'), V' tp (W tp Z)' = cal(L)(V, W tp Z), ), (V tp (W tp Z))', ) (vb(a)_l tp (vb(b)_m tp vb(c)_n)) &= vb(a)^i tp (L_1 vb(b)^j tp vb(c)^k) (vb(a)_l, vb(b)_m tp vb(c)_n)\ &= vb(a)^i (vb(a)_l) (L_1 vb(b)^j tp vb(c)^k)(vb(b)_m tp vb(c)_n) \ &= tensor(delta, +i, -l) vb(b)^j tp vb(c)^k(vb(b)_m, vb(c)_n) \ &= tensor(delta, +i, -l) tensor(delta, +j, -m) tensor(delta, +k, -n) $ ]<explicit-construction-of-trilinear-dual> #thm[$V' tp (W' tp Z') caniso (V' tp W') tp Z' caniso cal(L)(V, W, Z)$] #proof[ #pfstep[$V' tp (W' tp Z') caniso (V tp (W tp Z))'$][ By @dual-is-commutative-with-tp, $V' tp (W' tp Z') caniso V' tp (W tp Z)'$. Apply again we get the desired result. ] #pfstep(finished: true)[$(V tp (W tp Z))' caniso cal(L)(V, W, Z)$][ By @trilinear-caniso ] Therefore, we have $ V' tp (W' tp Z') caniso (V tp (W tp Z))' caniso cal(L)(V, W, Z) $ By @trilinear-caniso-remark, we have the same result for $(V' tp W') tp Z'$. Thus, $ V' tp (W' tp Z') caniso (V' tp W') tp Z' caniso cal(L)(V, W, Z) $ ] #remark[ Equivalently, we have $ V tp (W tp Z) caniso (V tp W) tp Z caniso cal(L)(V', W', Z') $ ] #idea[ Thus tensor product between vector spaces is indeed associative! 
We may write $V tp W tp Z$ which is not ambiguous *up to a canonical isomorphism*. And they are all equivalent to $cal(L)(V', W', Z')$ under this canonical isomorphism. ] After having associativity, we can extend these results to "fourth order" $ ((V tp W) tp Y) tp Z caniso (V tp W) tp (Y tp Z) caniso V tp (W tp (Y tp Z)) $<eq-fourth-order-tp> And by analogue to @univ-prop-2, we could indeed show $ cal(L)(V' tp W', Y', Z') caniso cal(L)(V', W', Y', Z') $ and by @univ-prop-2, $ cal(L)(V' tp W', Y', Z') caniso cal(L)(V' tp W', Y' tp Z')$ and by @univ-prop-1, $ cal(L)(V' tp W', Y' tp Z') caniso ((V' tp W') tp (Y' tp Z'))' $ where by applying @dual-is-commutative-with-tp repeatedly, $ ((V' tp W') tp (Y' tp Z'))' caniso (V tp W) tp (Y tp Z) $ So all of the tensor product spaces in @eq-fourth-order-tp are indeed canonically isomorphic to $cal(L)(V', W', Y', Z')$ #conclusion[ So how should we think about and _use_ all of these after all? A few take-aways: - The canonical isomorphisms here are all powered by the universal properties @univ-prop-1, @univ-prop-2, which give @dual-is-commutative-with-tp, @trilinear-caniso respectively. - Canonical isomorphism basically gives us a natural, unambiguous way to specify items. *At the end of the day, you can just ignore the parentheses and commute dual and the tensor product, and plug in the things as you expect them to.* It's not useful to write out all the canonical identifications that make e.g. ${vb(a)^i tp (vb(b)^j tp vb(c)^k)}$ a dual basis as in @explicit-construction-of-trilinear-dual. And this is the reason why people even just define $V_1 tp dots.c tp V_n$ as $cal(L)(V_1, dots.c, V_n)$. However, our treatment is arguably better: - We only defined the tensor product once (@tp-vectors and @tp-spaces) and built up all the later laws from there. - We explicitly demonstrated that associativity of the tensor product and commutativity with the dual works.
] == Inner Product for Tensors We want to define an inner product for tensor product spaces (because we want to do quantum mechanics for composite systems!) #thm[The Natural Non-degenerate Hermitian Form for Tensor Product Spaces][ There exists a unique non-degenerate Hermitian form (@non-deg-hermitian) $H_(V tp W)$ on $V tp W$ such that $ H_(V tp W) (vb(v)_1 tp vb(w)_1, vb(v)_2 tp vb(w)_2) = H_V (vb(v)_1, vb(v)_2) H_W (vb(w)_1, vb(w)_2) $ where $H_V, H_W$ are the forms defined for $V, W$. We call this $H_(V tp W)$ *the natural form*. ]<natural-form-on-tp> #proof[See @ladr[Theorem 9.80, page 376]] #remark[This of course also works for inner products, as an inner product is just a special case of a Hermitian form] #remark[Do see the remark after @ladr[Theorem 9.80, page 376] on why you cannot just define the value of $H_(V tp W) (cdot, cdot)$ on separable tensors and use linearity from there] This theorem shows us that we can meaningfully talk about _the_ Hermitian form $H_(V tp W)$ such that $ H_(V tp W) (vb(v)_1 tp vb(w)_1, vb(v)_2 tp vb(w)_2) = H_V (vb(v)_1, vb(v)_2) H_W (vb(w)_1, vb(w)_2) $ And to do any practical calculation, we just expand the vectors into separable tensors and use the sesquilinearity of the form. Now, using @natural-form-on-tp, we have the natural form on $V tp (W tp Z)$ satisfying #footnote[We drop subscript on $H$ to avoid cluttering.]: $ H (vb(v)_1 tp (vb(w)_1 tp vb(z)_1), vb(v)_2 tp (vb(w)_2 tp vb(z)_2)) &= H (vb(v)_1, vb(v)_2) H (vb(w)_1 tp vb(z)_1, vb(w)_2 tp vb(z)_2) \ &= H (vb(v)_1, vb(v)_2) H (vb(w)_1, vb(w)_2) H (vb(z)_1, vb(z)_2) $ And actually for the natural form on $(V tp W) tp Z$, we have $ H ((vb(v)_1 tp vb(w)_1) tp vb(z)_1, (vb(v)_2 tp vb(w)_2) tp vb(z)_2) = H (vb(v)_1, vb(v)_2) H (vb(w)_1, vb(w)_2) H (vb(z)_1, vb(z)_2) $ So indeed from the point of view of the natural form, $(V tp W) tp Z$ and $V tp (W tp Z)$ are the same space.
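In coordinates, the defining factorization of the natural form can be checked numerically by modeling a separable tensor $vb(v) tp vb(w)$ as the Kronecker product of coordinate vectors. A minimal sketch, assuming the standard inner product on $CC^n$ plays the role of $H$ (the dimensions and random vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_vec(n):
    """A random vector in C^n."""
    return rng.normal(size=n) + 1j * rng.normal(size=n)

v1, v2 = rand_vec(3), rand_vec(3)   # vectors in V = C^3
w1, w2 = rand_vec(4), rand_vec(4)   # vectors in W = C^4

# H_{V⊗W}(v1 ⊗ w1, v2 ⊗ w2): np.vdot conjugates its first argument,
# matching anti-linearity in the first slot of H.
lhs = np.vdot(np.kron(v1, w1), np.kron(v2, w2))
rhs = np.vdot(v1, v2) * np.vdot(w1, w2)   # H_V(v1, v2) H_W(w1, w2)
print(np.isclose(lhs, rhs))   # True
```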
And indeed ${ vb(a)_i tp vb(b)_j }$ is orthonormal in $V tp W$ if and only if ${vb(a)_i}, {vb(b)_j}$ are orthonormal in $V,W$. === Metric Dual for Tensors Since we have a Hermitian form, we can define a canonical isomorphism between $V tp W$ and $(V tp W)'$ much like @metric-dual. #thm[Metric dual of $vb(a) tp vb(b)$][ Let $vb(a) in V, vb(b) in W$, then $L vb(a) tp vb(b) in (V tp W)'$ is equivalent to $tilde(vb(a)) tp tilde(vb(b))$, where $L$ has the meaning in @metric-dual. ] #proof[ By @metric-dual, we have $ H_(V tp W)(vb(a) tp vb(b), vb(v) tp vb(w)) &= H(vb(a), vb(v)) H(vb(b), vb(w)) \ &= tilde(vb(a))(vb(v)) tilde(vb(b))(vb(w)) \ &= (L_1 tilde(vb(a)) tp tilde(vb(b))) (vb(v) tp vb(w)) $ for any $vb(v) in V, vb(w) in W$, where $L_1$ is defined in @univ-prop-1. That is $ (L vb(a) tp vb(b))(vb(v) tp vb(w)) = (L_1 tilde(vb(a)) tp tilde(vb(b))) (vb(v) tp vb(w)) $ Therefore, by taking linear combinations of basis tensors of $V tp W$, any tensor $tau$ in $V tp W$ has $ (L vb(a) tp vb(b))(tau) = (L_1 tilde(vb(a)) tp tilde(vb(b))) (tau) $ ] #remark[ The proof is evaluating instead of _defining_ the value of $L vb(a) tp vb(b)$ on separable tensors. So we don't suffer from the issue discussed in the remark after @ladr[Theorem 9.80, page 376]. ] #remark[ Since in Dirac notation, $dagger$ means taking the metric dual of some tensor, we see naturally $ (ketbra(phi, psi))^dagger = ketbra(psi, phi) $ ] There is not much else to discuss here actually; everything works similarly as $V tp W$ is a vector space. However, one thing to point out is: as @metric-dual-is-dual-basis dictates, the metric duals of ${vb(a)_i tp vb(b)_j}$ are its dual basis. As pointed out in @explicit-dual-basis, this is equivalent to ${vb(a)^i tp vb(b)^j}$. == Change of Coordinates === Some "coordinates" are not the coordinates of a tensor == Contraction Contraction is probably the most important concept in this formalism that has practical use.
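As a coordinate-level preview (assuming $V = RR^4$ with arbitrary random data), the simplest contraction — pairing the single vector slot of a $(1,1)$ tensor with its single dual slot — is just the matrix trace, and it is basis independent:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))   # coordinates of a (1,1) tensor on V = R^4

# sum_k Gamma(a_k, a^k): contract the vector slot against the dual slot
contraction = np.einsum("kk->", A)   # identical to np.trace(A)

# Under a change of basis P the coordinates become P^{-1} A P,
# and the contraction is unchanged.
P = rng.normal(size=(4, 4))   # generically invertible
A_new = np.linalg.inv(P) @ A @ P
print(np.isclose(np.einsum("kk->", A_new), contraction))   # True
```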
Whenever we are manipulating tensors, we are manipulating multilinear functionals by applying and composing them with one another. For the sake of discussion, we will stay in the special case $tau^r_s (V)$ where things are formulated most easily. However, this concept of contraction can easily be generalized.
#def[Self-Contraction][
  Given a tensor $Gamma in tau^r_s(V)$, define the contraction
  $ cal(C)_(i,j): tau^r_s (V) &to tau^(r-1)_(s-1) (V) \
  Gamma &sendto sum_k Gamma( dots, underbrace(vb(a)_k, i"-th slot"), dots, underbrace(vb(a)^k, j"-th slot"), dots, ) $
  where ${vb(a)_k}$ is a (not necessarily orthonormal) basis of $V$, the $i$-th slot accepts a vector, and the $j$-th slot accepts a dual vector.
]<self-contraction>
#thm[@self-contraction is well-defined][
  The definition doesn't depend on our choice of basis ${vb(a)_k}$.
]
#proof[
  Let ${vb(b)_l}$ be another basis, then we have some numbers ${A^l_k}$ for conversion between these two bases such that
  $ vb(a)_k = sum_l A^l_k vb(b)_l $
  Writing $B = Inv(A)$, the corresponding dual bases are related by
  $ vb(a)^k = sum_m B^k_m vb(b)^m $
  By multi-linearity,
  $ sum_k Gamma(dots, vb(a)_k, dots, vb(a)^k, dots) &= sum_k Gamma(dots, sum_l A^l_k vb(b)_l, dots, sum_m B^k_m vb(b)^m, dots) \
  &= sum_(l, m) (sum_k A^l_k B^k_m) Gamma(dots, vb(b)_l, dots, vb(b)^m, dots) \
  &= sum_(l, m) delta^l_m Gamma(dots, vb(b)_l, dots, vb(b)^m, dots) \
  &= sum_l Gamma(dots, vb(b)_l, dots, vb(b)^l, dots) $
  since $sum_k A^l_k B^k_m = (A B)^l_m = delta^l_m$. So the two ways of computing the contraction agree.
]
Contraction can be understood as composition and/or partial composition. Specifically, a tensor in $tau^1_1 (V)$ is the same data as a linear map $V to V$, and its contraction $cal(C)_(1,1)$ is exactly the trace of that map; contracting a pair of slots of a higher-rank tensor amounts to composing (or partially composing) the corresponding multilinear maps.
== Metric Tensor and Raising/Lowering Indices
== Examples from Physics
= Dirac Notation
Dirac notation is an effective convention for writing linear algebra for quantum mechanics, relying on the following properties of the underlying space $V$:
- The space is an inner product space ($V$ is a Hilbert space, which is a complete complex inner product space)
- The dual space $V'$ is canonically isomorphic to $V$ through the inner product
These points are discussed in @sec-dual-space.
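The two properties above can be made concrete in coordinates: with respect to an orthonormal basis, a ket is a column of complex components and its bra (metric dual) acts by conjugating those components. A minimal, self-contained Python sketch illustrating this correspondence (the vectors and the example operator are arbitrary choices, not anything from the text):

```python
def braket(psi, phi):
    """<psi|phi>: the bra conjugates the components of psi."""
    return sum(p.conjugate() * q for p, q in zip(psi, phi))

def apply_op(A, phi):
    """A|phi> for an operator A given as a matrix (list of rows)."""
    return [sum(row[j] * phi[j] for j in range(len(phi))) for row in A]

def ketbra(psi, phi):
    """|psi><phi| as an outer product (a rank-one operator)."""
    return [[p * q.conjugate() for q in phi] for p in psi]

def dagger(M):
    """Conjugate transpose, i.e. the metric dual of an operator."""
    return [[M[j][i].conjugate() for j in range(len(M))]
            for i in range(len(M[0]))]

psi, phi = [1 + 0j, 1j], [1 + 0j, 0j]
A = [[0j, 1 + 0j], [1 + 0j, 0j]]   # example operator (Pauli X)

assert braket(psi, apply_op(A, phi)) == -1j          # <psi|A|phi>
assert dagger(ketbra(psi, phi)) == ketbra(phi, psi)  # (|psi><phi|)^dagger = |phi><psi|
```

The last assertion is the coordinate version of the $(ketbra(phi, psi))^dagger = ketbra(psi, phi)$ identity noted earlier.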
#def(
  "Dirac notation - Basics",
)[
  Given a finite dimensional vector space $V$ with a non-degenerate Hermitian form $H$ (@non-deg-hermitian), we write:
  - $ket("something")$ to represent a vector in $V$,
  - $bra("something")$ to represent the metric dual (@metric-dual) of the vector $ket("something")$. In other words,
  $ bra("something")(cdot) = H(ket("something"), cdot) $
]
Main advantages of Dirac notation compared to the usual $vb(v)$ notation include:
- Naming of the vector is very easy, and we can write $ket((n,l,m))$ to clearly label the eigenstate of some particle, instead of resorting to $vb(e)_(n,l,m)$.
- We don't need to write out the metric dual conversion mapping $L: V to V'$ explicitly every time.
In light of this definition, we can translate
$ bra(psi) (ket(phi)) &equiv L(ket(psi)) (ket(phi)) \
&equiv H(ket(psi), ket(phi)) $
and
$ bra(psi) (A ket(phi)) &equiv L(ket(psi)) (A ket(phi)) \
&equiv H(ket(psi), A ket(phi)) $
where $A: V to V$ is any operator. And for brevity we introduce the shorthands:
$ bra(psi) (ket(phi)) &to braket(psi, phi) \
bra(psi) (A ket(phi)) &to braket(psi, A, phi) $
and also shorthands like
$ ket(0) + ket(1) to ket(0+1) $
However, you will also see usage like
$ ketbra(psi, phi), ket(psi) ket(phi) $
This is actually understood as a tensor product
$ ketbra(psi, phi) equiv ket(psi) tp bra(phi), ket(psi) ket(phi) equiv ket(psi) tp ket(phi) $
which we will introduce now.
#pagebreak()
= Groups
= Sturm-Liouville Problem
= Fourier Transform
#pagebreak()
#bibliography("./bib.yaml", style: "ieee")
https://github.com/yhtq/Notes
https://raw.githubusercontent.com/yhtq/Notes/main/抽象代数/作业/hw3.typ
typst
#import "../../template.typ": * // Take a look at the file `template.typ` in the file panel // to customize this template and discover how it works. #show: note.with( title: "作业3", author: "YHTQ ", date: none, logo: none, withOutlined: false ) #set heading(outlined: false) = 9.25 (上周没有注意到我们周四要交周一的作业,这里补上) == P24 6. 设 $a, b$ 是两个不同的二阶元素,从而 $Inv(a) = a, Inv(b) = b$,显有: $ (a b a)^2 = a b a a b a = e $ - 若 $a b a$ 不为 $a, b$ 中一个,则已证毕 - 若 $a b a = a => a b = e => a = b$ 矛盾 - 若 $a b a = b => (a b)^2 = e$,此时 $a b$ 为另一个不同于 $a, b$(显然)的二阶元素 7. $ &(a b)^2 = a^2 b^2\ <=>& a(a b)b=a(b a)b\ <=>& a b = b a $ 从而结论显然。 若 $exp(G) = 2$,则恒有 $(a b)^2 = a^2 b^2 = e$,从而有 $G$ 交换 11. 若 $tilde$ 是等价关系: $a, b in S => a tilde e, b tilde e => a tilde e, e tilde b => a tilde b => a Inv(b)$\ 由熟知结论显有 $S$ 是子群; 若 $S$ 是子群: - 自反性:$e in S => a Inv(a) in S => a tilde a$ - 传递性:$a tilde b, b tilde c => a Inv(b) in S, b Inv(c) in S => a Inv(c) in S => a tilde c$ - 对称性:$a tilde b => a Inv(b) in S => Inv((a Inv(b))) in S=> b Inv(a) in S => b tilde a$ 从而确实是等价关系 14. 记 $e = (1), a = (12)(34), b = (13)(24). c = (14)(23)$,则: $ a^2 = b^2 = c^2 = e\ a b = (12)(34)(13)(24) = (23)(14) = c\ b a = (13)(24)(12)(34) = (14)(23) = c\ a c = (12)(34)(14)(23) = (13)(24) = b\ c a = (14)(23)(12)(34) = (13)(24) = b\ b c = (13)(24)(14)(23) = (12)(34) = a\ c b = (14)(23)(13)(24) = (12)(34) = a\ $ 以上事实足以给出题设集合是 $S_4$ 的子群。 容易发现四次单位根群是循环群,但本群不是,从而不同构。 15. 注意到: $ B^n = I \ A^2 = I\ B A = mat(0, e^((2 pi) / n);e^((-2 pi) / n), 0)\ A (B A) = mat(0, 1;1, 0) mat(0, e^((2 pi) / n);e^((-2 pi) / n), 0) = mat(e^((-2 pi) / n), 0;0,e^((2 pi) / n)) = B^(n-1) => B A = A B^(n-1)\ $ 以上事实足以给出所有 $A, B$ 及它们的逆所生成的元素恰为题设集合。\ 事实上,$D_(2n)$也是由满足以下关系的两个元素 $r, s$: $ r^n = s^2 = e\ r s = s r^(n-1)\ $ 所生成($r, s$ 分别对应旋转 $(2 pi)/n$ 度和对称)。从而构造映射 $phi$ 使得: $ phi(A) = s\ phi(B^i) = r^i\ phi(A B^i) = s r^i \ $ 容易验证它是同构 == P25 18. 
设 $ord(a b) = m, ord(b a) = n$,注意到: $ e = (a b)^m = a (b a)^(m-1)b => Inv(a) Inv(b) = (b a)^(m-1) => b a = (b a)^(m-1) => (b a)^m = e => m | n\ $ 另一侧同理,因此 $m = n$。 19. $ a = mat(0, -1;1, 0)\ a^2 = mat(-1, 0;0, -1) = -I => ord(a) = 4 $ $ b = mat(0, 1;-1, -1)\ b^2 = mat(-1, -1;1, 0)\ b^3 = mat(1, 0;0, 1) = I => ord(b) = 3 $ $ a b = mat(1, 1;0, 1) = (I + J_2)\ (a b)^n = (I + J_2)^n = I + n J_2 => ord(a b) = oo $ 20. 设 $G$ 中所有有限阶元素构成集合 $H$: - $a in H => ord(a) = ord(Inv(a))< inf => Inv(a) in H$ - $a, b in H => (a b)^(ord(a) ord(b)) = a^(ord(a) ord(b)) b^(ord(a) ord(b)) = e => ord(a b) < inf => a b in H$ 从而 $H$ 是子群。\ 21. 如若不然,设无限群 $G$ 只有有限个子群 $H_1, H_2, ..., H_n$。显然 $G$ 中不应有无限阶元素,否则它对应的循环群已有无穷个子群。考虑其中所有循环子群(这些子群都是有限群)之间的偏序关系: $ H_i <= H_j <=> H_i subset H_j \ $ 显然此偏序关系存在唯一最小元 ${e}$。\ 取该偏序关系下所有极大元构成集合 $A$ (由于 $G$ 不是循环群,故 $G in.not A$),显有: $ H_i in A <=> (H_i "是循环群") and (H_i < H_j => H_j "不是循环群") $ 我们断言对任意 $a in G$,均存在 $S in A$ 使得 $generatedBy(a) subset S$。这是因为$generatedBy(a)$ 是循环群,它在一些有限长的链上,从而取某条链上的极大元即可。\ 事实上,上述事实蕴含了 $union A = G$ ,但有限个有限集合的并一定是有限集合,这就与 $G$ 无限矛盾了。\ == P27 43. 设 $H$ 是 $A_4$ 的六阶子群。 显然 $H$ 中元素的阶只能为 $1, 2, 3, 6$。由前习题结论其中必有 $2$ 阶元素,而 $A_4$ 中 $2$ 阶元素只有 $3$ 个;另一方面,$A_4$ 中无 $6$ 阶元素,因此 $H$ 中也必有三阶元素。\ 显然三阶元素与它们的逆成对出现。因此 $2, 3$ 阶元素的数量必为 $(1, 4)$ 或 $(3, 2)$。\ - $2, 3$ 阶元素的数量为 $(1, 4)$,此时不妨设$(i_1i_2i_3), (i_1i_2i_4) in H$(显然四个元素中选出三个的两种选法恰由两个数重复,每种选法的所有排列只对应两个互逆的轮换,应当都在 $H$ 中),而: $ (i_1i_2i_3)(i_1i_2i_4) = (i_4i_1i_3) in H\ $ 这是另一个 $3$ 阶元素,矛盾。 - $2, 3$ 阶元素的数量为 $(3, 2)$,此时所有二阶元素均在 $H$ 中。不妨设 $(123) in H$,而: $ (123)(12)(34) = (341) in H\ $ 这是另一个 $3$ 阶元素,矛盾。 综上,$A_4$ 中不存在 $6$ 阶子群。 == P57 1. 由裴蜀定理知存在 $u, v$ 使得: $ u r+v s = 1 $ 并且 $gcd(u, s) = 1, gcd(v, r) = 1$ 从而取 $a = g ^ (u r), b = g^ (v b)$,显有: $ a b = g ^ (u r + v s) = g $ 且: $ ord(a) = ord(g^(u r)) = (r s) / gcd(r s, u r) = r / gcd(s, u) = s\ ord(b) = r "(同理可得)" $ 3. 设 $G = generatedBy(n_1/m_1\, n_2/m_2\, ... \, n_k/m_k)$。事实上,不妨设 $m_1 = m_2 = ... = m_k = m$ (取公分母即可),从而 $G = generatedBy(n_1/m\, n_2/m\, ... 
\, n_k/m)$,进而其中元素都形如 $n/m$。\ 定义: $ funcDef(phi, G, ZZ_m, n/m, n) $ 容易验证它是单同态,从而 $G tilde.eq im(phi) <= ZZ_m$,而 $ZZ_m$ 的子群都是循环群,从而 $G$ 也是循环群。 6. 必要性是显然的,只证明充分性。\ 定义: $ funcDef(phi, G, ZZ^+, a, min_(k in ZZ^+, k>=0) {G^k = generatedBy(a)}) $ - 显然若 $a$ 是有限阶元素,则有 $G^(phi(a)) = generatedBy(a)$ 知所有元素都是有限阶元素,从而 $exp(G)$ 有限。先证明这种情况: #lemma[][$exp(G) | phi(a) dot ord(a), space forall a in G$ ] #proof[ 由题意,对任意 $g in G$,有: $ g^phi(a) in generatedBy(a) $ 从而: $ e = (g^phi(a))^exp(generatedBy(a)) = (g^phi(a))^ord(a) = g^(phi(a) dot ord(a)) $ 这就说明了原命题。 ] #lemma[][ 若 $gcd(ord(a), phi(a)) = 1$,且 $G^i subset G^phi(a) = generatedBy(a)$。则 $G^(gcd(i, phi(a))) = generatedBy(a)$。进一步的,有 $phi(a) | i$ ] #proof[ 设 $d = gcd(i, phi(a))$,利用裴蜀定理,设 $u i + v phi(a) = d$。对 $forall g in G$,将有 $g^i = a^s, g^(phi(a)) = a^ t$。从而: $ g^d = g^(u i + v phi(a)) = (g^i)^u (g^(phi(a)))^v = a^(s u + t v) in generatedBy(a) $ 这说明 $G^d subset generatedBy(a)$。 另一方面,由于: $ gcd(d, ord(a)) = gcd(gcd(phi(a), i), ord(a)) = 1 $ 从而: $ generatedBy(a) = generatedBy(a^d) subset G^d$ ,可得 $G^d = generatedBy(a)$。 又由定义知 $phi(a) <= d$,从而 $d = phi(a) => phi(a) | i$ ] #corollary[][ 若 $gcd(ord(a), phi(a)) = 1$,则 $exp(G) = phi(a) ord(a)$ ] #proof[ 在引理 2 中取 $i = exp(G)$,可得 $phi(a) | exp(G)$,进而 $ord(a) phi(a) | exp(G)$。又由引理 1 知 $exp(G) = phi(a) ord(a)$ ] #corollary[][ 若 $gcd(ord(a), phi(a)) = 1$,且 $ord(b) | ord(a)$,则$generatedBy(x) <= generatedBy(a) <=> x in generatedBy(a)$ ] #proof[ 由: $ phi(a)ord(a) = exp(G) | phi(x) ord(x) | phi(x) ord(a) $ 知 $phi(a) | phi(x)$,进而: $ generatedBy(x) = G^phi(x) <= G^phi(a) = generatedBy(a) $ ] 取 $G$ 中阶最大元素 $a$,并取 $g^(phi(a)) = a$,则: $ ord(g) >= ord(g^(phi(a))) >= ord(g) $ 因此 $ord(g) = ord(g^(phi(a)))$,进而 $gcd(ord(g) = ord(a), phi(a)) = 1$。\ 由于 $g$ 也是最大阶元素,同样可得 $gcd(ord(g) , phi(g)) = 1$。\ 从而 $a, g$ 都满足引理 1,2,3 条件,易得 $generatedBy(a) = generatedBy(g)$。 我们的目标的是证明 $phi(a) = 1$。\ 如若不然,设素数 $p$ 满足 $p | phi(a)$,显然 $p divides.not ord(a)$。 #let p = $p$ 记 $R = union_(k=1)^(p-1) G^ (k phi(a) ord(a)/#p)$ 由于 $exp(G) 
> phi(a) ord(a)/#p$,故 $R != {e}$。显然其中任意非平凡元素均有 $ord(x) = p$ #proposition[][$forall x in R, ord(a x) = p ord(a)$] #proof[ + $R sect generatedBy(a) = {e}$ 这是因为设 $g in R sect generatedBy(a)$,将有: $ ord(g) divides #p, ord(g) divides ord(a) \ => ord(g) divides gcd(#p, ord(a)) = 1 => ord(g) = 1 => g = e $ + $forall y in G, ord(y) = p => y in R$ 事实上: $ phi(a)ord(a) | phi(y)ord(y) = p phi(y) => phi(a)/p ord(a) | phi(y) $ 因此 $phi(y) = k phi(a)/p ord(a) (1<= k < p)=> y in R$ + $R$ 中任意两元素可交换。 事实上,设 $x, y in R$,设 $phi(x) = k_1 phi(a)/p ord(a),phi(y) = k_2 phi(a)/p ord(a)$。 取 $g in G^(phi(a)/p ord(a)), g != e$,有: $ ord(g) = p => g^(k_1) != e, g^(k_2) != e $ 由于 $g^(k_1) in generatedBy(x), g^(k_2) in generatedBy(y)$,设: $ g^(k_1) = x^i, g^(k_2) = y^j $ 由于 $x^i, y^j$ 都是生成元,而它们可交换,因此 $generatedBy(x), generatedBy(y)$ 都可交换,进而 $x, y$ 可交换。 + $R$ 是群 $ a, b in R => (a b)^p = a^p b^p = e "(交换性)"=> ord(a b) | p => a b in R\ ord(Inv(a)) = ord(a) = p => Inv(a) in R $ + $forall c in R, d in generatedBy(a)$,有: $ c d Inv(c) in generatedBy(a)\ d c Inv(d) in R $ 这是因为 $ ord(c d Inv(c)) = ord(d) | ord(a)\ ord(d c Inv(d)) = ord(c) = p $ 由引理 2 和前述结论知 $c d Inv(c) in generatedBy(a), d c Inv(d) in R$ + $forall c in R, d in generatedBy(a) : c d = d c$ 注意到:(运用乘法封闭性和正规性) $ c d Inv(c) Inv(d) = (c d Inv(c)) Inv(d) in generatedBy(a)\ c d Inv(c) Inv(d) = c (d Inv(c) Inv(d)) in R $ 从而: $ c d Inv(c) Inv(d) in sect generatedBy(a) R => c d Inv(c) Inv(d) = e => c d = d c $ + $forall x in R, ord(a x) = p ord(a)$ 由于 $a x$ 可交换,$gcd(ord(a), ord(x)) = 1$,根据熟知定理: $ ord(a x) = p ord(a) $ ] 显然前述命题与 $ord(a)$ 的最大性矛盾,从而 $phi(a) = 1$,也即 $G = generatedBy(a)$ 是循环群,证毕。 - 若 $G$ 中所有元素都是无穷阶 任取 $G$ 中元素 $a$,并取 $g^phi(a) = a$。\ 断言 $phi(g) = 1$,否则再取 $x^phi(g) = g$,则有 $x^(phi(g) phi(a)) = a$。另一方面,由 $x^phi(a) in generatedBy(a)$ 知 $x^phi(a) = a^k = x^(k phi(g) phi(a))$,这与 $x$ 是无限阶元素矛盾,这就说明断言是正确的,也即 $phi(g) = 1$,进而由定义 $G= generatedBy(g)$ 是循环群。 11. 
只需证明所有对换 $(i j)$ 均可由题设两置换生成。记 $sigma = (123...n)$ //定义: //$ //funcDef(pi_k, A_k, S_n, a + generatedBy(k), (a, a+k, a+2k ...)) //$ // //由于 $(a, a+k, a+2k, ...) = (a+k, a+2k, a+3k, ...)$,故定义是良好的 // //#lemma[][ // $sigma^k = product_(H in A_k) pi(H)$ //] //#proof[ // 首先,由于 $A_k$ 中的子群两两不交,故 ${pi_k (H) | H in A_k}$ 也两两不交,进而可交换,因此$product_(H in A_k) pi_k (H)$ 是合理的。 \ // 注意到(其中加法是在 $Z_n$ 意义下): // $ // sigma^k (a) = sigma^(k-1) (sigma(a)) = sigma^(k-1) (a+1) = ... = a+k // $ // 从而 $sigma^k |_(H in A)$ 都是一个轮换,它恰好就是 $pi_k (H)$,因此容易得到 $sigma^k = product_(H in A_k) pi_k (H)$ // //] 注意到(其中加法是在 $Z_n$ 意义下): $ sigma^k (a) = sigma^(k-1) (sigma(a)) = sigma^(k-1) (a+1) = ... = a+k $ 从而考虑 $sigma^(i_1) (1) = i, sigma^(i-1)(2) = i+1$,将有: $ sigma^(i-1) (12) Inv((sigma^(i-1))) = (i space i+1) $ 进一步 $(i+1 space i+2)(i space i+1)Inv((i space i+2)) = (i space i+2)$,以此类推便可得任意 $(i space j)$ 均可由题设两置换生成,进而 $S_n$ 由题设两置换生成。 //#lemma[][若 $m$ 有限,则 $ord(a) = ord(b) <=> phi(a) = phi(b)$ //] //#proof[ // $arrow.l.double$:显然可得\ // $arrow.r.double$:记 $i = phi(a), j= phi(b)$,此时由 $|G^i| = |G^j|$,往证 $G^i = G^j$。 // // // //] // //#lemma[][ // 任取 $a, b, c in Inv(phi)(k)$,则有 $a (b Inv(c)) in Inv(phi)(k)$ //] //#proof[ // //] 12. 
记题设两种偶置换生成的群为 $A, B$,显有 $A, B <= A_n$。为了证明 $A, B$ 可生成 $A_n$,只需证明 $forall i, j, k, s, (i j)(k s) in A, B$。若 $n = 3$,结论是显然的,下设 $n >= 4$ - 先证明 $A$: 若假设 $i, j, k, s$ 两两不等,则: $ (12n)^2 = (1n 2) (1 j 2)(1 2 i)Inv((1 j 2)) = (j 1 i) = (1 i j) in A\ (1 s 2)(1 2 k)Inv((1 s 2)) = (s 1 k) = (1 k s) in A\ (1 k s)(1 i j)Inv((1 k s)) = (k i j) in A $ 这意味着所有三轮换都在 $A$ 中。因此若 $i, j, k, s$ 有相等数,则 $(i j)(k s)$ 是一个三轮换(或者恒等变换),在 $A$ 中。若它们两两不等在,则: $ (i k j)(i k s) = (i j)(k s) in A $ 从而 $A$ 可以生成 $A_n$。 - 再证明 $B$。注意到: $ (i space i+1 space i+2)(i-1 space i space i+1)Inv((i space i+1 space i+2)) = (i-1 space i+1 space i+2) in B\ (i-1 space i space i+1)(i-1 space i+1 space i+2)Inv((i-1 space i space i+1)) = (i space i-1 space i+2) \ = (i-1 space i+2 space i) in B\ (i-1 space i+2 space i)^2 = (i-1 space i space i+2) in B\ (i-2 space i-1 space i)^2 = (i-2 space i space i-1) in B\ (i-2 space i space i-1)(i-1 space i space i+2)Inv((i-2 space i space i-1)) = (i-2 space i-1 space i+2) in B\ $ 反复进行最后两步,可得 $(12i+2) in B$,从而 $A_n = A <= B <= A_n$,证毕。 == 补充题 1. 不妨设二阶群为 ${1, -1}$,$phi$ 为 $S_n -> {1, -1}$ 的一个同态\ 由 $11$ 题过程以及二阶群的交换性可以看出,所有对换应当均与 $(12)$ 有相同的像。 - $phi((12)) = 1$,此时所有对换的像均为 $1$,从而为平凡同态 $phi(x) = 1$ - $phi((12)) = -1$,此时所有对换的像均为 $-1$,从而易得 $ker(phi) = A_n$ 显然这就是唯二可能的同态。 2. 将 $sigma$ 分解为互不相交的轮换 $sigma_1 sigma_2 ... sigma_n$,显有: $ e = sigma^p = (sigma_1 sigma_2 ... sigma_n)^p = sigma_1^p sigma_2^p ... sigma_n^p $ 显然由于 $sigma_i^p$ 之间互不相交,从而设 $x$ 恰在 $sigma_i$ 中,则有 $sigma^p (x) = sigma_i^p (x)$。易得 $e = sigma_1^p sigma_2^p ... sigma_n^p$ 当且仅当 $forallSa(i, sigma_i^p = e)$,这就表明 $ord(sigma_i) | p => ord(sigma_i) = p$,进而它们都是长度为 $p$ 的轮换。\ = 9.28 1. 显然 $ord(g^m) = n$,因此 $generatedBy(g) = generatedBy(g^m) <= H$ 2. 
用归纳法,假设 $m-1$ 时结论成立 取自然同态: $ funcDef(pi, U(p^m), U(p^(m-1)), a, overline(a)) $ 由归纳假设,$U(p^(m-1))$ 是循环群,设 $U(p^(m-1)) = generatedBy(overline(a))$,则显有 $p^(m-2)(p-1) | ord(a)$。 若 $ord(a) = p^(m-1)(p-1)$,则结论已成立。下设 $ord(a) = p^(m-2)(p-1)$。 另一方面,易得 $|ker(pi)| = p$,从而 $ker(pi)$ 是循环群,设 $ker(pi) = generatedBy(b)$。 若 $m = 2$,则 $gcd(p^(m-2)(p-1), p) = 1$,从而显有 $ord(a b) = p^(m-1)(p-1)$,下设 $m >= 3$。 #proposition[][ $generatedBy(b) times generatedBy(a) tilde.eq U(P^m)$ ] #proof[ + 定义: $ funcDef(phi, generatedBy(b) times generatedBy(a), U(P^m), (x\, y), x y) $ 由于 $U(P^m)$ 交换,容易验证它是同态。 + $generatedBy(a) sect generatedBy(b) = {e}$ 由 $pi$ 是满射知 $pi|_(generatedBy(a))$ 是同构,因此: $ {e} = ker(pi|_(generatedBy(a))) = ker(pi) sect generatedBy(a) = generatedBy(a) sect generatedBy(b) $ + $ker(phi) = {e}$,从而 $phi$ 是单射。 $ a b = e => a = Inv(b) => a, b in generatedBy(a) sect generatedBy(b) => a = b = e => (a, b) = e $ + $|generatedBy(b) times generatedBy(a)| = |U(P^m)|$,从而结合单射知同构成立 ] 由上述事实,立得: $ forall g in U(p^m), exists x in generatedBy(a), y in generatedBy(b), g = x y => g^p = x^p y^p = x^p in generatedBy(a) $ 从而: $ (p^m - 1)^p = p^m - 1 in generatedBy(a)\ (p^(m-2) - 1)^p = - C_p^2 p^(2m-4) + p^(m-1) - 1 = p^(m-1) - 1 in generatedBy(a) \ "(注意到" p dot p^(2m-4) | C_p^2 p^(2m-4) "且" 2m - 3 >= m ")" $ 但 $pi(p^m - 1) = pi(p^(m-1) - 1) = p^(m-1)-1$,这与 $pi$ 是同构矛盾!\ 从而命题 2 之前的假设必有某些不成立,而不论哪个都说明 $U(p^m)$ 是循环群。 3. 若 $m | n$,注意到 $G$ 的 $m$ 阶子群存在且唯一,记唯一的 $m$ 阶子群为 $A$,断言 ${x | x^m = e} = A$。 事实上,$x^m = e => ord(x) | m => generatedBy(x)$ 与 $A$ 的某个子群同阶,这意味着 $generatedBy(x) subset A$,从而 $x in A$。\ 而 $A subset {x | x^m = e}$ 是显然的,这就证明了断言,从而解的数量恰为 $m$ 一般而言,记 $d = gcd(m, n)$,则 $x^d = e$ 的解数为 $d$。由裴蜀定理: $ x^m = e => e = x^(u m + v n) = x^d $ 从而 $x^d = e <=> x^m = e$,进而解的数量为 $d$。 4. 
不妨设 $G = ZZ_m$ 或 $ZZ$,定义: $ funcDef(phi, G, "Hom"(G), a, [x | x -> a x]) $ 我们断言: - $phi$ 是单射 事实上,若 $phi(a) = phi(b)$,则 $forall x in G, a x = b x$,从而 $a = b$。 - $phi$ 是满射 事实上,对于任意 $f in "Hom"(G)$,取 $f(1)$,则 $f(x) = f(1) x$,从而 $f = phi(f(1))$ - $phi(a) dot phi(b) = phi(a b)$ 对于任意 $x in G$,有: $ (phi(a) dot phi(b)) (x) = a (b x) = a b x = phi(a b) (x) $ - $(phi(a) + phi(b))(x) = phi(a + b) (x)$ 事实上: $ (phi(a) + phi(b))(x) = a x + b x = (a + b)x = phi(a + b) (x^2) $ 以上事实说明 $phi$ 是一个环同构,从而 $G$ 上的自同态环与 $ZZ$ 或 $ZZ_(|G|)$ 同构 5. 取 $r$ 为逆时针旋转 $pi / 2$,$s$ 为沿 $x$ 轴翻转,显然 $r^4 = e, s^2 = e$,且 $s r = r^3 s$,并有:\ $ D_4 = generatedBy(r\, s) $ 设 $phi$ 是 $D_4$ 上自同构,$D_4$ 中二阶元素分别为: $ s, r s, r^2 s, r^3 s $ (注意到 $(r^k s)^2 = r^k (s r^k s) = r^k (s r s)^k = r^(4 k) = e$) $D_4$ 中四阶元素分别为: $ r, r^3 $ 记: $ r' = [r -> r^3, s -> s]\ s' = [r -> r, s -> r s] $ 便有: $ ord(r') = 2, ord(s') = 4\ generatedBy(r') sect generatedBy(s') = {e}\ r' (s')^k = [r -> r^3, s -> r^(3 k) s] \ s' r' = [r -> r^3, s -> r s] = r' s^3\ $ 从而 $Aut(D_4) tilde.eq D_4$ 6. 显然 $ZZ_4$ 与 $ZZ_6$ 的自同构群都是 $ZZ_2$,但它们不同构。 = 10.9 24. 
二阶群中结论平凡,不妨设 $G$ 不是二阶群,从而 $H_1, H_2$ 非平凡。 设 $H_1, H_2$ 是 $G$ 是两个不同的指数为 $2$ 的子群。\ 有: $ |G : H_1 sect H_2| <= |G:H_1| |G:H_2| = 4 $ 且 $2 | |G: H_1 sect H_2|$。 这说明 $|G: H_1 sect H_2| = 2 "或 " 4$ - $|G: H_1 sect H_2| = 2$,又 $H_1 != H_2$,从而 $H_1 sect H_2$ 是异于 $H_1, H_2$ 的二阶子群。 - $|G: H_1 sect H_2| = 4$ 这将意味着 $H_1 H_2 = G$。从而 $H_1, H_2$ 不可能相互包含。 显然 $H_1 union H_2 != G$,否则任取两个非平凡元 $h_1 in H_1, h_2 in H_2$,将有: $ h_1 h_2 in H_1 => h_2 in H_1 => H_2 subset H_1 => H_1 = H_2 $ 或: $ h_1 h_2 in H_2 => h_1 in H_2 => H_1 subset H_2 => H_1 = H_2 $ 取 $h_1 in H_1 - H_2, h_2 in H_2 - H_1, h_3 in G - H_1 - H_2$\ 我们有(容易验证它们不在同一个陪集): $ G = (H_1 sect H_2) union h_1 (H_1 sect H_2) union h_2 (H_1 sect H_2) union h_3 (H_1 sect H_2)\ = (H_1 sect H_2) union (H_1 sect H_2) h_1 union (H_1 sect H_2) h_2 union (H_1 sect H_2) h_3 $ #proposition[][ $(H_1 sect H_2) union h_3 (H_1 sect H_2) <= G$ 且其指数为 $2$ ] #proof[ + $h_3 (H_1 sect H_2) = G - H_1 union H_2 = (H_1 sect H_2) h_3$ 注意到: $ (H_1 sect H_2) union (H_1 sect H_2) h_1 union (H_1 sect H_2) h_2 subset H_1 union H_2\ h_3 (H_1 sect H_2) in.not H_1, H_2 => h_3 (H_1 sect H_2) subset G - H_1 union H_2 $ 结合上面的陪集分解,结论显然成立(另一侧同理) + $e in (H_1 sect H_2) union h_3 (H_1 sect H_2)$ 显然成立 + $Inv(((H_1 sect H_2) union h_3 (H_1 sect H_2))) = (H_1 sect H_2) union h_3 (H_1 sect H_2)$ 若 $h in (H_1 sect H_2)$,显有 $Inv(h) in (H_1 sect H_2)$\ 若 $h_3 x in h_3 (H_1 sect H_2)$,则: $ Inv((h_3 x)) = Inv(h_3) Inv(x) in G - H_1 - H_2 => Inv((h_3 x)) in h_3 (H_1 sect H_2) $\ + $h_3^2 in H_1 sect H_2$ 考虑关于 $H_1$ 的陪集分解,将有: $ G = H_1 union h_3 H_1 $ 若 $h_3^2 in.not H_1 => h_3^2 in h_3 H_1 => h_3 in H_1$,矛盾! 
类似可得 $h_3 in H_2$,从而结论成立。 + $((H_1 sect H_2) union h_3 (H_1 sect H_2))^2 = (H_1 sect H_2) union h_3 (H_1 sect H_2)$ 设 $x, y, z in H_1 sect H_2$,注意到: - $x y in H_1 sect H_2$ - $x (h_3 y) in.not H_1$,否则 $h_3 in H_1$,同理 $x h_3 y not in H_2$。 - $x (h_3 y) in G - H_1 - H_2 => x h_3 y in h_3 (H_1 sect H_2)$ - 同理,$(h_3 x) y in h_3 (H_1 sect H_2)$ - $(h_3 x)(h_3 y) in H_1 sect H_2$,只需证 $h_3 x h_3 in H_1 sect H_2$,这是因为: $ x h_3 in (H_1 sect H_2) h_3 = h_3 (H_1 sect H_2) => exists z, x h_3 = h_3 z $ 从而: $ h_3 x h_3 = h_3^2 x in H_1 sect H_2 $ 以上事实足以说明子群关系。\ 由 $|(H_1 sect H_2) union h_3 (H_1 sect H_2)| = 2 |(H_1 sect H_2) union h_3 (H_1 sect H_2)|$ 易得其指数为 $2$ ] 上述命题给出异于 $H_1, H_2$ 的二阶子群,证毕。 30. #corollary[][ $g(H sect K) = (g H) sect (g K)$ ] #proof[ + $g(H sect K) subset (g H) sect (g K)$:显然 + $(g H) sect (g K) subset g(H sect K)$: $ x in (g H) sect (g K) => exists h in H, k in K, x = g h = g k \ => Inv(g) x = h = k in H sect K => x in g(H sect K) $ ] 由引理,结论是显然的。 31. 由 30 题结论,每个 $H sect K$ 的陪集都是 $H$ 的陪集与 $K$ 的陪集的交集,从而 $|G : H sect K|$ 不可能超过 $|G : H||G : K|$,证毕。 36. 注意到必有 $gcd(ord(g), m) = gcd(ord(h), m) = 1$,从而 $g, h$ 分别是 $generatedBy(g), generatedBy(h)$ 的生成元。\ 从而 $g^m = h^m => generatedBy(g^m) = generatedBy(h^m) => generatedBy(g) = generatedBy(h)$。\ 设 $g = h^k$,则有: $ g^m = h^(k m) = h^m => h^(m (k-1)) = e => ord(h) | m(k-1) => ord(h) | (k-1) $ 这说明 $g = h^k = h$\ 对于第二个结论,之前已经证明了唯一性,而存在性是由于 $x^m$ 是 $generatedBy(x)$ 的生成元,从而: $ exists k: (x^m)^k = x => (x^k)^m = x $ 38. $ H a = K b => H = K b Inv(a) := K c\ e in H => e in K c => Inv(c) in K => c in K => K c = K => H = K $ = 补充题 1. - $1 compose f(x) = f(x)$ - $(a b) compose f(x) = f(a b x) = a compose (b compose f(x))$ 从而这确实给出一个群作用。 2. 
- $1 compose z = z/1 = z$ - $(A B) compose z = A compose (B compose z)$ 记 $phi(vec(a, b)) = a / b$,注意到: $ phi(A vec(z, k)) = A compose (z / k) = A compose phi(A vec(z, k))\ (A B) compose z = phi((A B) vec(z, 1)) = phi(A (B vec(z, 1))) = A compose (phi(B vec(z, 1))) = A compose (B compose z) $ 这足以说明 $phi$ 是一个群作用。\ 断言它只有一个轨道,事实上: $ forall z_1 = x_1 + y_1 i, z_2 = x_2 + y_2 i in HH\ z_1 = x_1 + y_1 Inv(y_2)(z_2 - x_2) $ 略加整理即可找到对应的分式线性变换 3. 显有: $ R^* = R^+ union (-1) R^+ $ 容易验证它恰好是陪集分解,从而指数为 $2$ 4. 容易验证: $ x H = {z | |z| = |x|} $ 是复平面上过 $z$ 点的圆 5. 定义: $ funcDef(phi, H times K, (G -> G), (h, k), [g | g -> h g Inv(k)]) $ 验证 $phi$ 给出 $H times K$ 在 $G$ 上的一个群作用: - $phi(e)(g) = g$ - 验证结合律: $ phi((h_1, k_1)(h_2, k_2)) = [g | g -> h_1 h_2 g Inv(k_2) Inv(k_1)]\ phi(h_1, k_1) compose phi(h_2, k_2) = [g | g -> h_1 g Inv(k_1)] compose [g | g -> h_2 g Inv(k_2)] \ = [g | g -> h_1 h_2 g Inv(k_2) Inv(k_1)] $ 从而 $phi(h_1, k_1)(phi(h_2, k_2)(g)) = (phi(h_1, k_1) compose phi(h_2, k_2))(g) = phi((h_1, k_1)(h_2, k_2))(g)$ 足以说明这是一个群作用,从而 $G$ 可以写作轨道的无交并,而轨道恰好是双陪集。\ 只需注意到 $g K Inv(g) <= G$,从而: $ |H g K| = |H g K Inv(g)| = |H (g K Inv(g))| \ = (|H| |g K Inv(g)|) / (|H sect g K Inv(g)|) = (|H| |K|) / (|H sect g K Inv(g)|) = |K| |H : H sect g K Inv(g)| $ 另一侧同理 6. 考虑 $G$ 对 $K$ 的左陪集分解,设其代表元集为 $I$: $ G = union_(g in I) g K $ 显然同一陪集的元素在 $sigma$ 下的像相同,不同的陪集的元素在 $sigma$ 下的像不同,从而恰有: $ Inv(sigma)(sigma(H)) = union_(g in H) g K = H K $ 由群同态的性质: $ H <= G => sigma(H) <= sigma(G) \ H' <= sigma(G) => Inv(sigma)(H') <= G $ 可得 $H K <= G$
https://github.com/JvandeLocht/assignment-template-typst-hfh
https://raw.githubusercontent.com/JvandeLocht/assignment-template-typst-hfh/main/utils/feedback.typ
typst
MIT License
#let feedback( feedback: "", response: "", ) = { counter("feedback").step() block( breakable: false, ( par( justify: true, text(weight: 700, counter("feedback").display() + ". Feedback: ") + feedback + pad( left: 5%, top: 2mm, bottom: 5mm, text( fill: blue, text(weight: 700, "Response: ") + response ) ) ) ) ) }
https://github.com/pku-typst/ichigo
https://raw.githubusercontent.com/pku-typst/ichigo/main/src/themes/simple/lib.typ
typst
MIT License
#let theme(meta) = { return ( title: ( whole-page: () => { return [ #v(40pt) #align(center)[ #set text(font: ( "New Computer Modern", "Source Han Serif SC", )) #text(size: 28pt, weight: "bold")[ #meta.course-name ] #text(size: 18pt)[ #meta.serial-str ] #text(size: 12pt, font: "STFangsong")[ #meta.author-info ] ] #pagebreak(weak: true) ] }, simple: () => { return [ #v(10pt) #align(center)[ #set text(font: ( "New Computer Modern", "Source Han Serif SC", )) #text(size: 28pt, weight: "bold")[ #meta.course-name ] #text(size: 18pt)[ #meta.serial-str ] #text(size: 12pt, font: "STFangsong")[ #meta.author-info ] ] ] }, ), page-setting: ( header: () => none, footer: () => { let cur = context counter(page).get().at(0) let tot = context counter(page).final().at(0) return align(center)[ #cur / #tot ] }, ), fonts: ( heading: ( "New Computer Modern", "Source Han Serif SC", ), text: ( "New Computer Modern", "Source Han Serif SC", ), equation: ( "New Computer Modern Math", "Source Han Serif SC", ), ), ) }
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tools/support/README.md
markdown
Apache License 2.0
# Language support This VS Code extension provides minimal language support for Typst. It contains a syntax definition and a language configuration for comment toggling, autoclosing etc. The extension was created for development purposes only. It is not maintained and its grammar is not 100% accurate. For a more actively developed extension see the third-party [Typst LSP](https://github.com/nvarner/typst-lsp). ## Installation The simplest way to install this extension (and keep it up-to-date) is to add a symlink from `~/.vscode/extensions/typst-support` to `path/to/typst/tools/support`.
https://github.com/jujimeizuo/ZJSU-typst-template
https://raw.githubusercontent.com/jujimeizuo/ZJSU-typst-template/master/template/cover.typ
typst
Apache License 2.0
// 封面 #import "font.typ": * #import "../contents/info.typ": * #set page(footer: none) // 封面页码置0 #counter(page).update(0) #align(center)[ #v(20pt) #table( columns: (auto, auto, auto), align: horizon, stroke: none, [], [#image("./images/zjgsu_name.png", width: 90%, height: 9%)], ) #table( columns: (auto), rows: (auto, auto ,auto), stroke: none, gutter: 9pt, text( font: songti, size: font_size.xiaoyi, )[毕业论文(设计)正文], ) #table( columns: (auto, auto, auto), align: horizon, stroke: none, [#image("./images/zjgsu_logo.png", width: 50%)] ) #v(40pt) #grid( columns: (65pt, 60%), column-gutter: 1pt, rect(width: 100%, inset: 2pt, stroke: none, text( font: songti, size: font_size.sanhao, weight: "bold", overhang: false, "题 目:" )), rect( width: 90%, inset: 2pt, stroke: ( bottom: 1pt + black ), text( font: songti, size: font_size.sanhao, weight: "bold", bottom-edge: "descender" )[ #zh_title ] ) ) #v(70pt) #let info_value(body) = { rect( width: 80%, inset: 2pt, stroke: ( bottom: 1pt + black ), text( font: songti, size: font_size.sanhao, weight: "medium", bottom-edge: "descender" )[ #body ] ) } #let info_key(body) = { rect(width: 100%, inset: 2pt, stroke: none, text( font: heiti, size: font_size.sanhao, weight: "bold", overhang: false, body )) } #grid( columns: (80pt, 210pt), rows : (35pt, 35pt), //gutter: 3pt, // row : (auto ,auto, auto, auto, auto, auto), info_key("学  院"), info_value(college), info_key("专  业"), info_value(major), info_key("学  号"), info_value(student_id), info_key("学生姓名"), info_value(student_name), info_key("指导老师"), info_value(college_advisor), info_key("企业导师"), info_value(company_advisor), // info_key("起讫日期"), // info_value(start_and_end_date) ) ] #pagebreak() #counter(page).update(0) #align(center+horizon)[ #table( columns: (auto), rows: (auto, auto ,auto), stroke: none, gutter: 9pt, text( font: songti, size: font_size.xiaochu, )[毕业论文(设计)正文], text( font: songti, size: font_size.sanhao, )[题目:#zh_title] ) ]
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/meta/figure-caption_01.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test figure.caption element for specific figure kinds #show figure.caption.where(kind: table): underline #figure( [Not a table], caption: [Not underlined], ) #figure( table[A table], caption: [Underlined], )
https://github.com/heloineto/utfpr-tcc-template
https://raw.githubusercontent.com/heloineto/utfpr-tcc-template/main/template/approval-page.typ
typst
#let months = ( "janeiro", "fevereiro", "março", "abril", "maio", "junho", "julho", "agosto", "setembro", "outubro", "novembro", "dezembro", ) #let approver-field( name: "", degree: "", institution: "", ) = { [ #set text(size: 10pt, weight: "regular") #line(length: 100%, stroke: 0.5pt) #name #linebreak() #degree #linebreak() #institution ] } #let approval-page( title: "", authors: (), city: "", year: "", goal: [], approval-date: datetime.today(), approvers: (), ) = { [ #set align(center) #set text(weight: "bold") #block( width: 100%, height: 5em, (authors.map(author => upper(author)).join("\n")) ) #block( width: 100%, height: 6em, upper(title) ) #align(right)[ #block(width: 52.5%)[ #set text(size: 10pt, weight: "regular") #set align(left) #set par(justify: true) #goal ] ] #v(3em) #block( width: 100%, height: 4em, )[ #set text(size: 10pt, weight: "regular") Data de aprovação: #approval-date.display("[day]")/#months.at(approval-date.month() - 1)/#approval-date.display("[year]") ] #text(approvers.map(approver => approver-field( name: approver.name, degree: approver.degree, institution: approver.institution, ) ).join(v(3em))) #align(bottom)[ #upper(city) #linebreak() #year ] ] pagebreak() }
https://github.com/cspr-rad/kairos-spec
https://raw.githubusercontent.com/cspr-rad/kairos-spec/main/requirements/main.typ
typst
#let title = [ Kairos: Zero-knowledge Casper Transaction Scaling ] #let time_format = "[weekday] [month repr:long] [day padding:none], [year]" #set page(paper: "a4", numbering: "1", margin: (x: 3.2cm, y: 4.0cm)) #set heading(numbering: "1.") #set text( // font: "Linux Libertine", size: 12pt, ) #show link: underline #align( center, text( 21pt, )[ *#title* Requirements #align( center, text( 12pt, )[ <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME> ], ) #datetime.today().display(time_format) ], ) #outline(title: "Contents", indent: auto) #pagebreak() = Introduction The Casper blockchain's ecosystem is in need of a scaling solution to achieve a higher transaction throughput and continue to stay competitive. As a first step towards providing a trustless scaling solution, the goal of the initial version 0.1 of the Kairos project is to build a zero-knowledge (ZK) _validium_ @validium @validium-vs-rollup for payment transactions in a second layer (L2). This system will both enable a higher transaction throughput and lower gas fees. Here, _validium_ refers to a rollup where the data, such as account balances, are stored on L2 rather than on the Casper blockchain directly (L1). Additionally, Kairos V0.1 serves two other purposes: - It is the first step towards a cheap and frictionless NFT (non-fungible token) minting and transfer system aiding Casper to become _the_ blockchain to push the digital art industry forward. - The conciseness and complexity of its scope allow us to explore the problem space of L2 solutions that leverage zero-knowledge technology and integrate with Casper's L1. Furthermore, it allows the team to collaborate and grow together by building a production-grade system. Kairos V0.1 will support very few simple interactions and features. Users will be able to deposit, withdraw, and transfer funds. These interactions will be serviced by the L2 and verified and stored by the L1, leveraging zero-knowledge technology. 
In the remainder of this document, we will detail the requirements of such a system. In @overview (Product Overview) we describe the high-level features that Kairos V0.1 will support. Next, @requirements specifies the requirements based on the described interactions and features. We conclude the document with a glossary, which clarifies the terminology used throughout this document. Additionally, this document is accompanied by several blog posts detailing some of the design considerations in more detail, as listed in the bibliography @compare-zk-provers. = Product Overview<overview> To have a common denominator on the scope of Kairos V0.1, this section describes the high-level features it has to support. == User Characteristics The target audience comprises users familiar with blockchain technology and applications built on top of the Casper blockchain. == Product Constraints - The product's backend will be deployed on modern powerful machines equipped with a powerful graphics processing unit (GPU) and a large amount of working memory as well as persistent disk space. - The operating machines will have continuous access to the Internet. - The CLI will be deployed on modern, potentially less powerful hardware. - The product should be a centralized solution for this initial version 0.1. - The applied proving system should be zero knowledge. == Features The features in this section are associated with a tag, which is used as a prefix of the tags in @requirements (Requirements). === Deposit Tokens Into L2 System *[tag:F00]*: Users should be able to deposit tokens from their L1 account to their L2 account at any given time through a command line interface (CLI). === Withdraw Tokens From L2 System *[tag:F01]*: Users should be able to withdraw tokens from their L2 account to their L1 account at any given time through a CLI. 
=== Transfer Tokens Within the L2 System *[tag:F02]*: Users should be able to transfer tokens from their L2 account to another user's L2 account at any given time through a CLI. === Query Account Balances *[tag:F03]*: Anyone should be able to query any L2 account balances at any given time through a CLI. In particular, users can also query their personal L2 account balance. === Query Transaction Data *[tag:F04]*: Anyone should be able to query any L2 transactions at any given time through a CLI. === Verification *[tag:F05]*: Anyone should be able to verify deposits, withdrawals, or transactions either through a CLI or an application programming interface (API), i.e., in a machine-readable way. = Requirements <requirements> Based on the product overview given in the previous section, this section aims to describe testable functional requirements the system needs to meet. == Functional Requirements === Deposit Tokens Into L2 System - *[tag:F00-00]* Depositing an amount of `tokens`, where `tokens >= MIN_AMOUNT` must be accounted correctly: `new_account_balance = old_account_balance + tokens` - *[tag:F00-01]* Depositing an amount of `tokens`, where `tokens < MIN_AMOUNT` must not be executed at all - *[tag:F00-02]* A user depositing any valid amount (condition stated in F00-00) to their `L2 account` must only succeed if the user has signed the deposit transaction - *[tag:F00-03]* A user depositing any amount with a proper signature to another user's account must fail - *[tag:F00-04]* A deposit request shall not be replayable. 
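The F00 rules above lend themselves to a small executable sketch. The following Python snippet is purely illustrative: `MIN_AMOUNT`, the dictionary shape of a deposit, and the `signature_valid` flag are assumptions made for the sketch, not the actual Kairos V0.1 interfaces.

```python
# Illustrative sketch of the F00 deposit rules; names and data shapes are
# assumptions, not the real Kairos API.
MIN_AMOUNT = 10  # assumed threshold; the spec only requires that one exists

def apply_deposit(balances, seen_nonces, deposit, signature_valid):
    """Mutate `balances` only if the deposit satisfies F00-00..F00-04."""
    if deposit["amount"] < MIN_AMOUNT:           # F00-01: reject small amounts
        raise ValueError("amount below MIN_AMOUNT")
    if not signature_valid:                      # F00-02: owner must sign
        raise ValueError("missing or invalid signature")
    if deposit["signer"] != deposit["account"]:  # F00-03: own account only
        raise ValueError("cannot deposit into another user's account")
    if deposit["nonce"] in seen_nonces:          # F00-04: no replays
        raise ValueError("replayed deposit request")
    seen_nonces.add(deposit["nonce"])
    # F00-00: new_account_balance = old_account_balance + tokens
    balances[deposit["account"]] = balances.get(deposit["account"], 0) + deposit["amount"]
```

Note that the nonce set is only one possible replay-protection mechanism; the requirement itself does not prescribe one.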
=== Withdraw Tokens From L2 System - *[tag:F01-00]* Withdrawing an amount of `tokens`, where `user's L2 account balance >= tokens >= MIN_AMOUNT` must be accounted correctly: `new_account_balance = old_account_balance - tokens` - *[tag:F01-01]* Withdrawing an amount of `tokens`, where `tokens < MIN_AMOUNT` must not be executed at all - *[tag:F01-02]* Withdrawing an amount of `tokens`, where `tokens > user's L2 account balance` should not be possible - *[tag:F01-03]* Withdrawing a valid amount (condition stated in F01-00, F01-02) from the user's L2 account must be possible without the intermediary operator of the system - *[tag:F01-04]* Withdrawing a valid amount (condition stated in F01-00, F01-02) from the user's L2 account must succeed if the user has signed the withdrawal transaction - *[tag:F01-05]* Withdrawing any amount from another user's L2 account must not be possible - *[tag:F01-06]* A withdrawal request shall not be replayable. - *[tag:F01-07]* A withdrawal request must prevent double spending of tokens. === Transfer Tokens Within the L2 System - *[tag:F02-00]* Transferring an amount of `tokens` from `user1` to `user2`, where `user1's L2 account balance >= tokens >= MIN_AMOUNT` must be accounted correctly: `new_account_balance_user1 = old_account_balance_user1 - tokens` and `new_account_balance_user2 = old_account_balance_user2 + tokens` - *[tag:F02-01]* Transferring an amount of `tokens`, where `tokens < MIN_AMOUNT` must not be executed at all - *[tag:F02-02]* Transferring an amount of `tokens`, where `tokens > user's L2 account balance` must not be possible - *[tag:F02-03]* Transferring a valid amount (condition F02-00) to another user that does not have a registered L2 account yet must be possible. - *[tag:F02-04]* Transferring a valid amount (condition F02-00) to another user should only succeed if the user owning the funds has signed the transfer transaction - *[tag:F02-05]* A transfer request shall not be replayable. 
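The transfer accounting in F02-00 through F02-03 can be sketched in a few lines. This is a hedged illustration of the stated invariants only (debit sender, credit recipient, no overdraft, recipient account may not exist yet); the names and the in-memory balance map are assumptions, not the Kairos implementation.

```python
# Illustrative sketch of the F02 transfer accounting rules.
MIN_AMOUNT = 10  # assumed threshold, as in the deposit sketch

def apply_transfer(balances, amount, sender, recipient):
    if amount < MIN_AMOUNT:               # F02-01: reject small amounts
        raise ValueError("amount below MIN_AMOUNT")
    if balances.get(sender, 0) < amount:  # F02-02: no overdraft
        raise ValueError("insufficient balance")
    # F02-00: debit the sender, credit the recipient
    balances[sender] -= amount
    # F02-03: the recipient may not have a registered L2 account yet
    balances[recipient] = balances.get(recipient, 0) + amount
```

Because the sender is debited and the recipient credited by the same amount in one step, the total supply held on L2 is invariant under transfers, which is what makes the F02-00 accounting rule checkable.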
- *[tag:F02-06]* A transfer request must prevent double spending of tokens. === Query Account Balances - *[tag:F03-00]* A user must be able to see their L2 account balance when it's queried through the CLI - *[tag:F03-01]* Anyone must be able to obtain any L2 account balances when querying the CLI or API - *[tag:F03-02]* Account balances must be written by known, verified entities only - *[tag:F03-03]* Account balances must be updated immediately after the successful verification of correct deposit/withdraw/transfer interactions - *[tag:F03-04]* Account balances must not be updated if the verification of the proof of the interactions fails - *[tag:F03-05]* Account balances must be stored redundantly @data-redundancy === Query Transaction Data - *[tag:F04-00]* A user must be able to see their L2 transactions when they are queried through the CLI - *[tag:F04-01]* Anyone must be able to obtain any L2 transactions when querying the CLI or API - *[tag:F04-02]* Transaction data must be written by known, verified entities only - *[tag:F04-03]* Transaction data must be written immediately after the successful verification of correct deposit/withdraw/transfer interactions - *[tag:F04-04]* Transaction data must not be written if the verification of the proof of the interactions fails - *[tag:F04-05]* Transaction data must be stored redundantly @data-redundancy === Verification - *[tag:F05-00]* Anyone must be able to query and verify proofs of the system's state changes caused by deposit/withdraw/transfer interactions at any given time == Quality of Service === Performance - *[tag:QS00]* The CLI should respond to user interactions immediately. - *[tag:QS01]* The L2 should support a high parallel transaction throughput #footnote[Read @sequential-throughput for more insight into parallel vs. sequential transaction throughput.] 
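The F03-03/F03-04 and F04-03/F04-04 rules share one invariant: balances and transaction data change only after the proof of a batch of interactions verifies, and stay untouched otherwise. A minimal sketch of that commit-on-verify flow, with `verify_proof` standing in for the real zero-knowledge verifier (which this sketch deliberately does not model):

```python
# Illustrative commit-on-verify sketch for F03-03/F03-04 and F04-03/F04-04.
# The batch shape and verifier callback are assumptions, not Kairos APIs.
def commit_batch(balances, transactions, batch, verify_proof):
    if not verify_proof(batch["proof"]):
        # F03-04 / F04-04: reject the batch and leave all state untouched
        return False
    balances.update(batch["new_balances"])  # F03-03: balances updated ...
    transactions.extend(batch["txs"])       # F04-03: ... and txs recorded
    return True
```

A real system would persist both stores redundantly (F03-05, F04-05) and atomically, which a plain dictionary and list do not capture.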
=== Security - *[tag:QS02]* The application must not leak any private or sensitive information, such as private keys === Reusability - *[tag:QS03]* The L2's API must be designed in such a way that it's easy to swap out a client implementation - *[tag:QS04]* The whole system must be easy to extend with new features === Usability - *[tag:QS05]* The CLI should be designed in a user-friendly way = Glossary <glossary> / Validium: Please refer to @validium and @validium-vs-rollup / L1: The Casper blockchain as it currently runs. / L2: A layer built on top of the Casper blockchain, which leverages Casper's consensus algorithm and existing infrastructure for security purposes while adding scaling and/or privacy benefits / Zero knowledge proof (ZKP): A proof generated by a party A that convinces a party B that A is in possession of certain information X, without revealing X itself to B. These ZKPs provide some of the most exciting ways to build L2s with privacy controls and scalability. @zkp #bibliography("bibliography.yml")
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/array-15.typ
typst
Other
// Error: 2:16-2:18 missing argument: index #let numbers = () #numbers.insert()
https://github.com/Kasci/LiturgicalBooks
https://raw.githubusercontent.com/Kasci/LiturgicalBooks/master/SK/zalmy/Z053.typ
typst
Lord, save me for the sake of your name, and by your power defend my cause. \* God, hear my prayer and listen to the words of my mouth.

For the proud rise up against me and the violent seek my life; \* they do not keep God before their eyes.

But God is my helper, and the Lord sustains my life. \* Turn misfortune back upon my adversaries and scatter them, for you are faithful.

With joy I will bring you a sacrifice; \* I will praise your name, Lord, for you are good;

for you deliver me from every distress, \* and I can look down upon my enemies.
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/list-attach_05.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // ... even if forced to. Hello #list(tight: false)[A][B] World
https://github.com/mem-courses/calculus
https://raw.githubusercontent.com/mem-courses/calculus/main/homework-1/calculus-homework6.typ
typst
#import "../template.typ": * #show: project.with( title: "Calculus Homework #6", authors: (( name: "<NAME> (#47)", email: "<EMAIL>", phone: "3230104585" ),), date: "October 31, 2023", )

= P98 Exercises 2-1, Problem 12
#prob[Find the derivative: $display(y = 1/x^2 + root(3,x^2) + 1/root(3,x) + root(3,2))$]
$ y' = (x^(-2) + x^(2/3) + x^(-1/3) + root(3,2))' = -2 x^(-3) + 2/3 x^(-1/3) - 1/3 x^(-4/3) = - 2/(x^3) + 2/(3 root(3,x)) - 1/(3 x root(3,x)) $

= P98 Exercises 2-1, Problem 15
#prob[Find the derivative: $display(y = x^2 sin x + 2 x sin x - cos pi / 5)$]
$ y' = 2x sin x + x^2 cos x + 2 sin x + 2x cos x = 2 (x+1) sin x + x(x+2)cos x $

= P99 Exercises 2-1, Problem 18
#prob[Find the derivative: $display(y = sqrt(x)/(sqrt(x)+1))$]
$ y' = ((sqrt(x))' (sqrt(x) + 1) - (sqrt(x) + 1)' sqrt(x))/((sqrt(x) + 1)^2) = (1/(2 sqrt(x)) (sqrt(x) + 1) - 1/(2 sqrt(x)) sqrt(x))/(x + 2 sqrt(x) + 1) = 1/(2 x sqrt(x) + 4 x + 2 sqrt(x)) $

= P99 Exercises 2-1, Problem 21
#prob[Find the derivative: $display(y = 1/(1+sqrt(x)) - 1/(1-sqrt(x)))$]
$ y' &= frac(-(1+sqrt(x))', (1+sqrt(x))^2) - frac(-(1-sqrt(x))', (1-sqrt(x))^2) = -1/(2 sqrt(x)) (1/(1+sqrt(x))^2 + 1/(1-sqrt(x))^2)\ &= - frac((1+sqrt(x))^2 + (1-sqrt(x))^2, 2 (1-x)^2 sqrt(x)) = - frac(x+1, (1-x)^2 sqrt(x)) $

= P99 Exercises 2-1, Problem 24
#prob[Find the derivative: $display(y = frac(sin x - cos x, sin x + cos x))$]
$ y' & = frac( (sin x - cos x)' (sin x + cos x) - (sin x + cos x)' (sin x - cos x), (sin x + cos x)^2)\ &= frac( (cos x + sin x) (sin x + cos x) - (cos x - sin x) (sin x - cos x), sin^2 x + 2 sin x cos x + cos^2 x)\ &= 2/(1 + sin 2x) $

= P99 Exercises 2-1, Problem 27
#prob[Find the derivative: $display(y = (x^3 - 1/x^3 + 3)^4)$]
$ y' = 4 (x^3 - 1/x^3 + 3)^3 (x^3 - 1/x^3 + 3)' = 12 (x^3 - 1/x^3 + 3)^3 (x^2 + 1/x^4) $

= P99 Exercises 2-1, Problem 30
#prob[Find the derivative: $display(y = (3-2 sin x)^5)$]
$ y' = 5 (3 - 2 sin x)^4 (3 - 2 sin x)' = -10 cos x (3 - 2 sin x)^4 $

= P99 Exercises 2-1, Problem 33
#prob[Find the derivative: $display(y = sqrt(sin x))$]
$ y' = (cos x)/(2 sqrt(sin x)) $

= P99 Exercises 2-1, Problem 36
#prob[Find the derivative: $display(y = sin^2 x/2 + cos^2 x/2)$]
$ y' = 2 sin x/2 (sin x/2)' + 2 cos x/2 (cos x/2)' = 2 sin x/2 (1/2 cos x/2) + 2 cos x/2 (-1/2 sin x/2) = 0 $

= P99 Exercises 2-1, Problem 39
#prob[Find the derivative: $display(y = (frac(1 - cos x, 1 + cos x))^3)$]
$ y' &= 3(frac(1-cos x,1+cos x))^2 (frac(1-cos x,1+cos x))'\ &= 3(frac(1-cos x,1+cos x))^2 frac((1-cos x)'(1+cos x)-(1+cos x)'(1-cos x),(1+cos x)^2)\ &= frac(6 sin x (1-cos x)^2,(1+cos x)^4) $

= P99 Exercises 2-1, Problem 42
#prob[Find the derivative: $display(y = x sec^2 x - tan 2x)$]
$ y' = sec^2 x + x (2 sec x) (sec x)' - 2 sec^2 2x = sec^2 x + 2x sec^2 x tan x - 2 sec^2 2x $

= P99 Exercises 2-1, Problem 45
#prob[Find the derivative: $display(y = sqrt(x sin 2x))$]
$ y' = 1/(2 sqrt(x sin 2x)) (x sin 2x)' = (sin 2x + 2 x cos 2x)/(2 sqrt(x sin 2x)) $

= P99 Exercises 2-1, Problem 48
#prob[Find the derivative: $display(y = e^(2x + 3))$]
$ y' = e^(2x + 3) (2x + 3)' = 2 e^(2x + 3) $

= P99 Exercises 2-1, Problem 51
#prob[Find the derivative: $display(y = cos sqrt(x) + sqrt(cos x) + sqrt(cos sqrt(x)))$]
$ y' &= - sin sqrt(x) (sqrt(x))' + 1/(2 sqrt(cos x)) (cos x)' + 1/(2 sqrt(cos sqrt(x))) (cos sqrt(x))'\ &= -(sin sqrt(x))/(2 sqrt(x)) - (sin x)/(2 sqrt(cos x)) - (sin sqrt(x))/(4 sqrt(x cos sqrt(x)))\ $

= P99 Exercises 2-1, Problem 54
#prob[Find the derivative: $display(y = ln (sec x + tan x))$]
$ y' = (sec x + tan x)'/(sec x + tan x) = (sec x tan x + sec^2 x) / (sec x + tan x) = sec x $

= P99 Exercises 2-1, Problem 57
#prob[Find the derivative: $display(y = sqrt(1 + (ln x)^2))$]
$ y' = ((1 + (ln x)^2)')/(2 sqrt(1 + (ln x)^2)) = (2 ln x dot 1/x)/(2 sqrt(1 + (ln x)^2)) = (ln x)/(x sqrt(1 + (ln x)^2)) $

= P99 Exercises 2-1, Problem 60
#prob[Find the derivative: $display(y = arcsin 1/x)$]
$ y' = ((1/x)')/sqrt(1 - (1/x)^2) = (-1/x^2)/sqrt(1 - 1/(x^2)) = -1/(abs(x) sqrt(x^2 - 1)) $

= P99 Exercises 2-1, Problem 63
#prob[Find the derivative: $display(y = e^x sqrt(1 - e^(2x)) + arcsin e^x)$]
$ y' &= e^x sqrt(1 - e^(2x)) + e^x ((1 - e^(2x))')/(2 sqrt(1 - e^(2x))) + ((e^x)')/sqrt(1-e^(2x))\ &= (e^x (1 - e^(2x)) - e^(3x) + e^x)/sqrt(1-e^(2x))\ &= (2 e^x (1 - e^(2x)))/sqrt(1-e^(2x)) = 2 e^x sqrt(1-e^(2x)) $

= P100 Exercises 2-1, Problem 66
#prob[Find the derivative: $display(y = e^sqrt(ln x))$]
$ y' &= e^(sqrt(ln x)) (sqrt(ln x))' = e^(sqrt(ln x)) ((ln x)')/(2 sqrt(ln x)) = (e^sqrt(ln x))/(2x sqrt(ln x)) $

= P100 Exercises 2-1, Problem 69
#prob[Find the derivative: $display(y = x^x + x^(1/x))$]
$ y &= e^(x ln x) + e^(1/x ln x)\ => y' &= e^(x ln x) (x ln x)' + e^(1/x ln x) 
(1/x ln x)'\ &= x^x (1 + ln x) + x^(1/x) (-1/x^2 ln x + 1/x^2)\ &= x^x (1 + ln x) + x^(1/x - 2) (1 - ln x) $

= P100 Exercises 2-1, Problem 72
#prob[Let $display(y = arctan phi(x)/psi(x))$; find $display(dy/dx)$.]
$ y' = display((phi(x)/psi(x))')/(1 + display((phi(x)/psi(x))^2)) = (phi'(x) psi(x) - psi'(x) phi(x))/(phi^2(x) + psi^2(x)) $

= P100 Exercises 2-1, Problem 81(1)
#prob[Find the tangent and normal lines to $display(y = sin x)$ at the point $display(x = pi/4)$.]
$ dy/dx = cos x $
So at the point $x=display(pi/4)$:
- the tangent line is $display(y-sqrt(2)/2 = sqrt(2)/2 (x - pi/4) => y = sqrt(2)/2 x + sqrt(2)/2 - (pi sqrt(2))/8)$;
- the normal line is $display(y-sqrt(2)/2 = -sqrt(2) (x - pi/4) => y = -sqrt(2) x + sqrt(2)/2 + (pi sqrt(2))/4)$.

= P100 Exercises 2-1, Problem 81(3)
#prob[Find the tangent and normal lines to $display(e^(x y) - x^2 + y^3 = 0)$ at $display(x = 0)$.]
$ => (dif e^(x y))/dx - (dif x^2)/dx + (dif y^3)/dx = 0 => e^(x y) (y + x dy/dx)- 2x + 3 y^2 dy/dx = 0 $
When $x=0$, $y=-1$, and at this point $display(dy/dx = 1/3)$. Hence at $x=0$:
- the tangent line is $display(y+1 = 1/3(x - 0) => y = x/3 - 1)$;
- the normal line is $display(y+1 = -3(x - 0) => y = -3x - 1)$.

= P100 Exercises 2-1, Problem 82(1)
#prob[Find the derivative: $display(y = |x^3|)$]
For $x>=0$, $y' =(x^3)' = 3 x^2$; for $x<=0$, $y' = (- x^3)' = -3 x^2$. Since $y'_+ = y'_- = 0$ at $x=0$, the derivative exists on all of $RR$:
$ y' = cases( 3 x^2 \,quad x>=0, -3 x^2 \, quad x<0 ) $

= P100 Exercises 2-1, Problem 82(2)
#prob[Find the derivative: $display(y = x |x(x-1)|)$]
For $x>=1$ or $x<=0$, $y' = (x^2 (x-1))' = 2x (x-1) + x^2 = 3x^2 - 2x$; for $0<=x<=1$, $y' = (-x^2 (x-1))' = 2x - 3x^2$. Since $y'_+ != y'_-$ at $x=1$, $y$ is not differentiable at $x=1$. Elsewhere its derivative is:
$ y' = cases( 3x^2 - 2x \, quad x>1, 2x - 3x^2 \, quad 0<=x<1, 3x^2 - 2x \, quad x<0 ) $

= P100 Exercises 2-1, Problem 83(1)
#prob[Find the derivative: $display(f(x) = cases( x^3\,quad& x>0, x^2\,quad& x<=0, ))$]
For $x>0$, $f'(x) = 3x^2$; for $x<=0$, $f'(x) = 2x$. 
Since $f'_+(x) = f'_-(x) = 0$ at $x=0$, the derivative exists on all of $RR$:
$ f'(x) = cases( 3x^2 \,quad x>0, 2x \,quad x<=0 ) $

= P100 Exercises 2-1, Problem 83(2)
#prob[Find the derivative: $display(f(x) = cases( display(x^2 cos 1/x)\,quad& x!=0, 0\,quad& x=0, ))$]
For $x!=0$, $display(f'(x) = 2x cos(1/x) + x^2 (-sin(1/x)) (-1/x^2) = 2x cos(1/x) + sin(1/x))$.
At $x=0$, $display(f'(0) = lim_(x->0) (x^2 cos(1/x) - 0)/x = lim_(x->0) x cos(1/x) = 0)$, so $f$ is differentiable at $x=0$ even though $display(lim_(x->0) f'(x))$ does not exist, i.e. $f'$ is discontinuous at $x=0$.
Hence the derivative of $f(x)$ is:
$ f'(x) = cases( 2x cos(1/x) + sin(1/x) \,quad& x!=0, 0 \,quad& x=0 ) $
https://github.com/nhuongmh/nhuongmh.cv
https://raw.githubusercontent.com/nhuongmh/nhuongmh.cv/main/README.md
markdown
Do What The F*ck You Want To Public License
Using Typst template from <https://github.com/ice-kylin/typst-cv-miku> ## Usage 1. Read [typst](https://typst.app/docs/) documentation. 2. Install fonts needed by this template: - [kpfonts](https://ctan.org/pkg/kpfonts) - [Source Han Sans](https://github.com/adobe-fonts/source-han-sans) - [Source Han Serif](https://source.typekit.com/source-han-serif/cn/) 3. Modify `.typ` files to fit your needs. You may need to learn some basic typst syntax. 4. Compile to PDF - `typst c cv_1.typ` - `typst c cv_2.typ` ## By the way Small icon from Material Icons (Community). ## License Licensed under [WTFPL](http://www.wtfpl.net/).
https://github.com/coljac/swinburne-phd-typst
https://raw.githubusercontent.com/coljac/swinburne-phd-typst/main/template.typ
typst
/* Swinburne PhD Thesis template See github.com/coljac/swinburne-phd-typst for updates, documents and questions */ #let stt = state("stt", " ") #let sttf = state("sttf", 0) #let begun = state("begun", false) #let pb() = { stt.update(" ") pagebreak(to: "odd") } #let thesis( title: "My PhD Thesis", subtitle: "In the subject of Astronomy", author: "<NAME>", year: datetime.today().year(), for_binding: false, bibliography_file: "references.bib", frontmatter, content ) = { let pagemargins = ( top: 2.5cm, bottom: 2cm, right: 3.5cm, left: 3.5cm, ) if for_binding { pagemargins = (inside: 3.5cm, outside: 3cm, y: 1.75cm) } // Global stuff set page(paper: "a4", numbering: "i", margin: pagemargins, footer: locate(loc => { if sttf.at(loc) == 1 { align(center)[#counter(page).display("i")] sttf.update(0) } }), ) set par( leading: 1.5em, first-line-indent: 1em, justify: true ) set text(size: 11pt) show outline.entry.where( level: 1 ): it => { v(12pt, weak: true) strong(it) } set math.equation(numbering: (..nums) => "(" + counter(heading.where(level: 1)).display() + "." 
+ nums .pos() .map(str) .join(".") + ")") [ // TITLE PAGE // #set page(numbering: none) // #pagenumbers.update("i") #place(dy: 20%, box(width: 100%, align(center, image("CAS_logo.svg", width: 70%)))) #place(dy: 45%, box(width: 100%, stroke: 0pt, align(center, text(size: 33pt, title)))) #place(dy: 55%, box(width: 100%, stroke: 0pt, align(center, text(size: 20pt, "By " + author)))) #v(65%) #set text(size: 14pt) #align(center)[ Presented in fulfillment of the requirements of the degree of Doctor of Philosophy #v(2em) #text(size: 1.2em)[#year] #v(2em) School of Science, Computing and Engineering Technologies Swinburne University of Technology ] ] // end title page // FRONT MATTER // counter(heading).update(1) show heading.where(level: 1): it => { counter(math.equation).update(0) pb() if it.numbering == "1.1" { // This is regular a chapter heading locate(loc => { if not begun.at(loc) { begun.update(true) counter(page).update(1) } }) v(20%) align(right, smallcaps[#text(font: "Liberation Sans", size: 76pt, fill: gray, weight: 300, counter(heading).display())]) }// v(1.0em) // box(width: 100%, inset: 2pt, stroke: (bottom: 1pt+blue), align(right)[ #text(size: 30pt, weight: 400, it.body) ] v(1em) stt.update(it.body) } show heading.where(level: 2): it => { box(width: 100%, inset: (top: 1em, bottom: 1.0em), stroke: (bottom: 1pt+black))[ #text(size: 1.1em)[#it]] } [ #set heading(numbering: none) #frontmatter // #counter(page).update(1) #counter(heading).update(0) #stt.update(" ") #set page( header: locate(loc => { let l = [L] let r = [R] if calc.odd(loc.page()) { l = [_ #counter(heading).display() #h(0.8em) #stt.display() _] r = counter(page).display("1") } else { r = [_ #counter(heading).display() #h(0.8em) #stt.display() _] l = counter(page).display("1") } let leftbit = align(left)[#l] let rightbit = align(right)[#r] if stt.at(loc) != " " { box(width: 100%, stroke: (bottom: 1pt+black), inset: (bottom: 2mm))[ #grid(columns: (1fr, 1fr), [#leftbit], [#rightbit]) ]} else { 
sttf.update(1) } }) ) #outline(title: "Contents", indent: 2em, depth: 2) #outline( title: [List of Figures], target: figure.where(kind: image), ) #outline( title: [List of Tables], target: figure.where(kind: table), ) #set heading(numbering: "1.1") #show figure.caption: it => [#it #v(0.3em)] // #locate(loc => { // if calc.even(loc.page()) { // pagebreak() // } // }) #set page(numbering: "1") #counter(page).update(1) #set cite(form: "prose") #content #set heading(numbering: none) #bibliography(bibliography_file, style: "chicago-author-date") ] // End of body } // #let begin = [ // #set text(size: 14pt) // ] // ============================= // Helper functions - Citations // ============================= #let citep(pre: none, post: none,..citation) = { let strings = citation.pos().map( x => [#cite(x, form: "author")~#cite(x, form: "year")] ) let combined = ((pre,) + strings + (post,)).filter(x=>x != none).join("; ") [(#combined)] } #let citet = cite // #let abstract(content) = [ // #heading(outlined: true, level: 1, "Abstract") // ] // #outline(depth: 2) // #outline( // title: [List of Figures], // target: figure.where(kind: image), // ) // #pagebreak(to: "odd") // #outline( // title: [List of Tables], // target: figure.where(kind: table), // ) // #pagebreak(to: "odd") // #content] /* TODO: - Cite with ; between - e.g. in cite, etc. \cite[e.g.](colin2022) - bibliography style - captions set par(leading: 1.0em) - Par indent is wonky, also headings * Bibliography heading * Figure caption spacing still dodgy - subheadings indented as pars? * page headings * Odd, even page stuff * Odd, even margins * Page num at bottom on chapter pages * Bug: numerical pages in outline for front matter */
https://github.com/frectonz/the-pg-book
https://raw.githubusercontent.com/frectonz/the-pg-book/main/book/080.%20colleges.html.typ
typst
News from the Front

September 2007

A few weeks ago I had a thought so heretical that it really surprised me. It may not matter all that much where you go to college.

For me, as for a lot of middle class kids, getting into a good college was more or less the meaning of life when I was growing up. What was I? A student. To do that well meant to get good grades. Why did one have to get good grades? To get into a good college. And why did one want to do that? There seemed to be several reasons: you'd learn more, get better jobs, make more money. But it didn't matter exactly what the benefits would be. College was a bottleneck through which all your future prospects passed; everything would be better if you went to a better college.

A few weeks ago I realized that somewhere along the line I had stopped believing that.

What first set me thinking about this was the new trend of worrying obsessively about what kindergarten your kids go to. It seemed to me this couldn't possibly matter. Either it won't help your kid get into Harvard, or if it does, getting into Harvard won't mean much anymore. And then I thought: how much does it mean even now?

It turns out I have a lot of data about that. My three partners and I run a seed stage investment firm called Y Combinator. We invest when the company is just a couple guys and an idea. The idea doesn't matter much; it will change anyway. Most of our decision is based on the founders. The average founder is three years out of college. Many have just graduated; a few are still in school. So we're in much the same position as a graduate program, or a company hiring people right out of college. Except our choices are immediately and visibly tested. There are two possible outcomes for a startup: success or failure—and usually you know within a year which it will be.

The test applied to a startup is among the purest of real world tests. 
Success is decided by the market: you only succeed if users like what you've built. And users don't care where you went to college.

As well as having precisely measurable results, we have a lot of them. Instead of doing a small number of large deals like a traditional venture capital fund, we do a large number of small ones. We currently fund about 40 companies a year, selected from about 900 applications representing a total of about 2000 people. [1]

Between the volume of people we judge and the rapid, unequivocal test that's applied to our choices, Y Combinator has been an unprecedented opportunity for learning how to pick winners. One of the most surprising things we've learned is how little it matters where people went to college.

I thought I'd already been cured of caring about that. There's nothing like going to grad school at Harvard to cure you of any illusions you might have about the average Harvard undergrad. And yet Y Combinator showed us we were still overestimating people who'd been to elite colleges. We'd interview people from MIT or Harvard or Stanford and sometimes find ourselves thinking: they must be smarter than they seem. It took us a few iterations to learn to trust our senses.

Practically everyone thinks that someone who went to MIT or Harvard or Stanford must be smart. Even people who hate you for it believe it.

But when you think about what it means to have gone to an elite college, how could this be true? We're talking about a decision made by admissions officers—basically, HR people—based on a cursory examination of a huge pile of depressingly similar applications submitted by seventeen year olds. And what do they have to go on? An easily gamed standardized test; a short essay telling you what the kid thinks you want to hear; an interview with a random alum; a high school record that's largely an index of obedience. Who would rely on such a test?

And yet a lot of companies do. 
A lot of companies are very much influenced by where applicants went to college. How could they be? I think I know the answer to that.

There used to be a saying in the corporate world: "No one ever got fired for buying IBM." You no longer hear this about IBM specifically, but the idea is very much alive; there is a whole category of "enterprise" software companies that exist to take advantage of it. People buying technology for large organizations don't care if they pay a fortune for mediocre software. It's not their money. They just want to buy from a supplier who seems safe—a company with an established name, confident salesmen, impressive offices, and software that conforms to all the current fashions. Not necessarily a company that will deliver so much as one that, if they do let you down, will still seem to have been a prudent choice. So companies have evolved to fill that niche.

A recruiter at a big company is in much the same position as someone buying technology for one. If someone went to Stanford and is not obviously insane, they're probably a safe bet. And a safe bet is enough. No one ever measures recruiters by the later performance of people they turn down. [2]

I'm not saying, of course, that elite colleges have evolved to prey upon the weaknesses of large organizations the way enterprise software companies have. But they work as if they had. In addition to the power of the brand name, graduates of elite colleges have two critical qualities that plug right into the way large organizations work. They're good at doing what they're asked, since that's what it takes to please the adults who judge you at seventeen. And having been to an elite college makes them more confident.

Back in the days when people might spend their whole career at one big company, these qualities must have been very valuable. Graduates of elite colleges would have been capable, yet amenable to authority. 
And since individual performance is so hard to measure in large organizations, their own confidence would have been the starting point for their reputation.

Things are very different in the new world of startups. We couldn't save someone from the market's judgement even if we wanted to. And being charming and confident counts for nothing with users. All users care about is whether you make something they like. If you don't, you're dead.

Knowing that test is coming makes us work a lot harder to get the right answers than anyone would if they were merely hiring people. We can't afford to have any illusions about the predictors of success. And what we've found is that the variation between schools is so much smaller than the variation between individuals that it's negligible by comparison. We can learn more about someone in the first minute of talking to them than by knowing where they went to school.

It seems obvious when you put it that way. Look at the individual, not where they went to college. But that's a weaker statement than the idea I began with, that it doesn't matter much where a given individual goes to college. Don't you learn things at the best schools that you wouldn't learn at lesser places?

Apparently not. Obviously you can't prove this in the case of a single individual, but you can tell from aggregate evidence: you can't, without asking them, distinguish people who went to one school from those who went to another three times as far down the US News list. [3] Try it and see.

How can this be? Because how much you learn in college depends a lot more on you than the college. A determined party animal can get through the best school without learning anything. And someone with a real thirst for knowledge will be able to find a few smart people to learn from at a school that isn't prestigious at all. The other students are the biggest advantage of going to an elite college; you learn more from them than the professors. 
But you should be able to reproduce this at most colleges if you make a conscious effort to find smart friends. At most colleges you can find at least a handful of other smart students, and most people have only a handful of close friends in college anyway. [4] The odds of finding smart professors are even better. The curve for faculty is a lot flatter than for students, especially in math and the hard sciences; you have to go pretty far down the list of colleges before you stop finding smart professors in the math department.

So it's not surprising that we've found the relative prestige of different colleges useless in judging individuals. There's a lot of randomness in how colleges select people, and what they learn there depends much more on them than the college. Between these two sources of variation, the college someone went to doesn't mean a lot. It is to some degree a predictor of ability, but so weak that we regard it mainly as a source of error and try consciously to ignore it.

I doubt what we've discovered is an anomaly specific to startups. Probably people have always overestimated the importance of where one goes to college. We're just finally able to measure it.

The unfortunate thing is not just that people are judged by such a superficial test, but that so many judge themselves by it. A lot of people, probably the majority of people in America, have some amount of insecurity about where, or whether, they went to college. The tragedy of the situation is that by far the greatest liability of not having gone to the college you'd have liked is your own feeling that you're thereby lacking something. Colleges are a bit like exclusive clubs in this respect. There is only one real advantage to being a member of most exclusive clubs: you know you wouldn't be missing much if you weren't. When you're excluded, you can only imagine the advantages of being an insider. But invariably they're larger in your imagination than in real life.

So it is with colleges. 
Colleges differ, but they're nothing like the stamp of destiny so many imagine them to be. People aren't what some admissions officer decides about them at seventeen. They're what they make themselves.

Indeed, the great advantage of not caring where people went to college is not just that you can stop judging them (and yourself) by superficial measures, but that you can focus instead on what really matters. What matters is what you make of yourself. I think that's what we should tell kids. Their job isn't to get good grades so they can get into a good college, but to learn and do. And not just because that's more rewarding than worldly success. That will increasingly be the route to worldly success.

Notes

[1] Is what we measure worth measuring? I think so. You can get rich simply by being energetic and unscrupulous, but getting rich from a technology startup takes some amount of brains. It is just the kind of work the upper middle class values; it has about the same intellectual component as being a doctor.

[2] Actually, someone did, once. <NAME>'s wife Freada was in charge of HR at Lotus in the early years. (As he is at pains to point out, they did not become romantically involved till afterward.) At one point they worried Lotus was losing its startup edge and turning into a big company. So as an experiment she sent their recruiters the resumes of the first 40 employees, with identifying details changed. These were the people who had made Lotus into the star it was. Not one got an interview.

[3] The US News list? Surely no one trusts that. Even if the statistics they consider are useful, how do they decide on the relative weights? The reason the US News list is meaningful is precisely because they are so intellectually dishonest in that respect. There is no external source they can use to calibrate the weighting of the statistics they use; if there were, we could just use that instead. 
What they must do is adjust the weights till the top schools are the usual suspects in about the right order. So in effect what the US News list tells us is what the editors think the top schools are, which is probably not far from the conventional wisdom on the matter. The amusing thing is, because some schools work hard to game the system, the editors will have to keep tweaking their algorithm to get the rankings they want.

[4] Possible doesn't mean easy, of course. A smart student at a party school will inevitably be something of an outcast, just as he or she would be in most high schools.

Thanks to <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, and <NAME> for reading drafts of this.
https://github.com/yichenchong/game_theory_cw1
https://raw.githubusercontent.com/yichenchong/game_theory_cw1/main/report/main.typ
typst
#import "template.typ": * #show: ams-article.with( title: "Pure Strategy Equilibria in Generalisations of the Election Game", authors: ( ( name: "<NAME>", department: [Department of Computing], organization: [Imperial College London], location: [London, SW7 2AZ], email: "<EMAIL>", cid: "02015150" ), ( name: "<NAME>", department: [Department of Computing], organization: [Imperial College London], location: [London, SW7 2AZ], email: "<EMAIL>", cid: "01866869" ), ( name: "<NAME>", department: [Department of Mathematics], organization: [Imperial College London], location: [London, SW7 2AZ], email: "<EMAIL>", cid: "02017016" ), ), abstract: [ The Election Game is a discrete variation on the classic Hotelling-Downs Model, in which $N$ politicians decide where along a (discrete) political spectrum (of $R$ positions) to position themselves to attract the most voters. Other than the trivial variants ($N = 0$ and $N = 1$), the $N = 2$ variant also is relatively simple, with politicians gravitating towards the one or two median positions, in line with the median voter theorem. However, more surprising behaviours arise for $N >= 3$. This paper describes an original method used to find all pure strategy equilibria for each $N > 0$ with the original posited $R = 10$ (which also carries over to any other value of $R$), and produces a brief analysis of some of the more surprising and exciting portions of the results (including some deviations from "median" behaviour). ], bibliography-file: "refs.bib", ) In 1929, a brilliant mathematician and economic theorist by the name of <NAME> at Stanford University proposed an economic model based on space. He imagined two sellers as points on a line, selling their goods, and that consumers preferred nearer sellers. While he was mainly talking about competitive pricing in the face of this spatial model, he also pointed out that it was strategically advantageous for the sellers to move towards each other @stabcomp. 
Variations on this have been applied to explain ideas ranging from logistics to politics. In 2023, a different brilliant mathematics professor posed the following (paraphrased and slightly generalised) variation on the original Hotelling problem in a problem set. Consider the political spectrum modelled as some number of discrete positions, numbered 1 through $R$ ($R$ in the original problem is given to be $10$). The players of the game are candidates seeking to win as much of the public vote as possible. A uniform proportion of voters ($frac(1, R)$ of all voters) sits at each position, and voters will vote for whoever is closest to their position. In the event of a tie, the votes are split proportionally (e.g. if there is one candidate at pos. 1 and 3 candidates at pos. 3, with no candidates at pos. 2, half the voters at pos. 2 would vote for the candidate at pos. 1, and a sixth of them would vote for each candidate at pos. 3). @ps2

This paper addresses a generalisation of this problem by finding and analysing the set of pure equilibria for $R = 10$, $forall N in NN$, and for some other values of $N$ and $R$. The original problem's solutions @ps2sol also contain a remark about how the equilibria of the $N = 2$ case have the candidates tending towards the centre, an occurrence known as the "median voter theorem". We will explore how the generalised game deviates from this, as it does not meet the conditions for the theorem.

To find the set of all equilibria for $R = 10$, we will start with the surprisingly easier case of $N > R$, which will help us find the set for $N = R$. We will then explore how the payouts work and are calculated, gain the intuition for an efficient algorithm to determine whether a given portfolio of strategies constitutes a Nash equilibrium, and computationally iterate through our solution set to find all possible equilibria for $3 <= N < R$ with $R = 10$ (in fact, equilibria for all $R <= 12$ are shown in our repository).
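The tie-splitting rule above can be made concrete with a small brute-force simulator. This is a sketch we include for illustration (the function name and structure are ours, not the problem set's), using exact rational arithmetic:

```python
from fractions import Fraction

def payoffs(positions, R):
    """Vote share of each candidate, in units of 1/R of the electorate.

    `positions` lists each candidate's chosen position (1..R).  Voters at
    each position vote for the nearest occupied position(s); ties between
    positions, and between candidates sharing a position, split evenly.
    """
    shares = [Fraction(0)] * len(positions)
    occupied = sorted(set(positions))
    for voter in range(1, R + 1):
        best = min(abs(voter - p) for p in occupied)
        nearest = [p for p in occupied if abs(voter - p) == best]
        for p in nearest:
            crowd = [i for i, q in enumerate(positions) if q == p]
            for i in crowd:
                # this voter's single vote, split across tied positions
                # and across the candidates sharing each such position
                shares[i] += Fraction(1, len(nearest) * len(crowd))
    return shares
```

Running the worked example from the text (one candidate at position 1, three at position 3, on a 3-position spectrum) reproduces the stated split: the lone candidate receives the voter at position 1 plus half the voter at position 2, and each of the other three receives a sixth of position 2's voter plus a third of position 3's.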
Finally, this paper will present an analysis of some interesting findings.

= Preliminaries

The most important preliminary concept to address is that the game is a symmetric game. The symmetry here is not spatial in nature (even though the game does have a sort of spatial symmetry to it). Rather, it means that the payoffs for playing any strategy depend only on the strategies employed by other players @symgames. Intuitively, this must be true, as the problem did not define payouts for any specific player. Formally speaking, we define this as follows: for any permutation $pi$, $g_(pi(i)) (s_1, s_2, ..., s_i, ..., s_N) = g_i (s_(pi(1)), s_(pi(2)), ..., s_(pi(i)), ..., s_(pi(N)))$.

In practice, this also means that whether a given portfolio of strategies is at equilibrium depends only on the number of candidates who choose each strategy @symgames. As such, for the purposes of this paper, we will represent any portfolio of strategies with a vector of counts, where the value at position $i$ represents the number of candidates who chose strategy $i$; we will denote this vector $sigma in NN^R$. This is a very useful feature of the game, as it drastically reduces our solution space, while allowing us to use some interesting features and variations of the representation to capture important information about the equilibria.

Another important note is that we are not dealing with the trivial cases where $N = 0$ or $N = 1$, or even the proven case $N = 2$. The paper will assume that $N, R > 2$, unless stated otherwise.

Finally, in terms of units: although the original problem uses payoff units to represent percentage points, this paper will use 1 unit of payoff to represent $1 / R$ of the votes. This is without loss of generality, and is just used to simplify the maths. The verification algorithm actually uses units of $1 / (2R)$ of the vote to simplify the payoff equations, but the paper will use $1 / R$ as 1 unit unless otherwise stated.
#pagebreak()

= Pure Strategy Equilibria for $N >= R$

== Pure Strategy Equilibria for $N > R$

When looking at cases for $N > R$, a relatively intuitive conjecture, later proven, came to mind:

#proposition[
A pure strategy portfolio $sigma$ for $N > R$ represents an equilibrium if and only if $sigma$ is "_almost flat_", where almost flat is defined as having the property that all values of $sigma$ are either $floor(N / R)$ or $floor(N / R) + 1$, and $floor(dot)$ represents the floor (round down) function.
]

#proof[
We prove this is the case with a separate proof in each direction.
#v(0.4em)
_Proof that $sigma$ represents a pure strategy equilibrium if it is "almost flat":_
#set par(first-line-indent: 2em)
#set par(hanging-indent: 2em)
#v(0.25em)
Assume $sigma$ is almost flat and let $eta = floor(N / R)$. For $N >= R$, $N / R >= 1$, so $eta >= 1$, and therefore there must be at least one candidate at every position. Therefore, for any position $i$, the payoff of a candidate at position $i$ is $1 / sigma_i$, which is either $1 / eta$ or $1 / (eta + 1)$. If a candidate were to move to a different position $j$, his new payoff would be $1 / (sigma_j + 1)$, which is either $1 / (eta + 1)$ or $1 / (eta + 2)$. Since $eta >= 1$, we have that $1 / eta > 1 / (eta + 1) > 1 / (eta + 2)$, and therefore, $forall i, j, 1 / (sigma_j + 1) < 1 / (sigma_i)$. Therefore, with an almost flat $sigma$, there is no incentive for any candidate to change positions.
#v(0.7em)
#set par(first-line-indent: large-size)
#set par(hanging-indent: large-size)
_Proof that if $sigma$ represents a pure strategy equilibrium, it must be "almost flat":_
#set par(first-line-indent: 3em)
#set par(hanging-indent: 2em)
#v(0.25em)
Assuming that $sigma$ is not almost flat, we will prove that it is not a pure strategy equilibrium. First, consider the case where $forall i, sigma_i > 0$, i.e. every position has at least one candidate.
We will take $i : sigma_i = min_n {sigma_n}$ and $j : sigma_j = max_n {sigma_n}$. Since the mean value must be between the minimum and maximum values, we have that $sigma_i <= N / R <= sigma_j$. Since there exist values $sigma_n : sigma_n != floor(N / R) and sigma_n != floor(N / R) + 1$, we know that either $sigma_i < floor(N / R)$ or $sigma_j > floor(N / R) + 1$. Therefore, $sigma_j > sigma_i + 1$. The current payoff at $j$ is $1 / sigma_j$, and the payoff if a candidate from $j$ decides to switch positions to $i$ is $1 / (sigma_i + 1) > 1 / sigma_j$. Therefore, a candidate at $j$ would benefit from switching over to position $i$, meaning this is not a pure strategy equilibrium.

Next, consider the case where $exists i : sigma_i = 0$. We call $i$ a "_gap_" in the positions. The payout for switching to a gap is always at least $1$, as the voters on position $i$ would all vote for the only candidate that is on the position. We employ a probabilistic argument. On average, each candidate currently receives $R / N$ votes, which means that some candidates receive payouts $<= floor(R / N)$. Since $N > R$, these candidates receive $< 1$ votes, and would benefit from switching over to position $i$, where they would gain at least one vote. Therefore, $sigma$ also would not represent a pure strategy equilibrium.
]
#v(-2em)
#linebreak()

=== Set of Pure Strategy Equilibria $forall N > R$

With this proposition, we can now construct the set of pure strategy equilibria $forall N > R$. We know that there must be $N mod R$ positions with $floor(N / R) + 1$ candidates (necessary for there to be $N$ total candidates). Therefore, we have $vec(R, N mod R)$, as in $R$ choose $N mod R$, combinations of candidates fulfilling this. An interesting consequence is that for $N > 2R$, we can just find the solutions for $N - R$, and add a candidate to each position in each solution. The set of pure strategy equilibria for $R = 10, 11 <= N <= 20$ has been included.
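The almost-flat construction is direct to enumerate; the following sketch (the helper name is ours) generates exactly the $vec(R, N mod R)$ equilibria described above:

```python
from itertools import combinations
from math import comb

def almost_flat_equilibria(N, R):
    """All almost-flat count vectors sigma for N candidates over R positions
    (the pure strategy equilibria when N > R): floor(N/R) candidates at
    every position, plus one extra at each of N mod R chosen positions."""
    base, extra = divmod(N, R)
    return [
        [base + (i in bumped) for i in range(R)]
        for bumped in combinations(range(R), extra)
    ]
```

For $N = 13$, $R = 10$ this yields $vec(10, 3) = 120$ vectors, each summing to $13$ with every entry equal to $1$ or $2$.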
== Set of Pure Strategy Equilibria for $N = R$

Note that the only part of the proof for $N > R$ that does not apply to $N = R$ is the case where $exists i : sigma_i = 0$. For $N = R$, the average payout is exactly $1$, and therefore a portfolio with a gap is only an equilibrium if (1) every candidate has a payout of exactly $1$, and (2) the payout of moving into a gap is also exactly $1$. Any gap of size $>= 2$ would mean that the payoff of moving into it is at least $1.5$, violating (2). We can therefore assume all gaps are of size 1.

For a gap to exist when $N = R$, there must be a position with at least 2 candidates, by the pigeon-hole principle _(p.h.)_. $n$ candidates at a position not next to any gaps each get $1 / n$ votes; candidates at a position next to a single gap that is not at an extreme end of the spectrum each get $1.5 / n$ votes; candidates surrounded by gaps each get $2 / n$. The first case violates (1) whenever $n >= 2$, and the second violates it for any $n$; the third is only possible with $n = 2$, so each position has at most 2 candidates, and, letting $x$ be the number of positions with 2 candidates, there must be $x + 1$ gaps to surround them, contradicting _p.h._ Therefore, the only alternatives to the "almost flat" equilibrium have an extreme position as a gap, with 2 candidates immediately inside it (i.e. the first position is a gap and the second position has 2 candidates, and/or the last position is a gap and the second-to-last position has 2 candidates), meaning there are four equilibria for $N = R$: $sigma in {(0, 2, 1, ..., 1), (1, 1, ..., 1, 2, 0), (0, 2, 1, 1, ..., 1, 2, 0), (1, 1, ..., 1)}$.

#v(1em)
#pagebreak()

= Payouts and a Novel Efficient Algorithm for the Verification of Pure Strategy Equilibria

The payouts for the game are quite complicated, as the payout function for a candidate, given the strategies of all other candidates, is not smooth or continuous. As such, we could not find a closed-form characterisation of the set of equilibria in the generalised case.
However, we can build up to an algorithmic approach to determine whether a portfolio is at equilibrium, and generate all such portfolios. Let us start by computing the payouts for some number $I$ of new candidates, given a portfolio of other candidates' positions $sigma = [0, 0, ..., A, 0, ..., 0, B, 0, 0, ...]$, i.e. $sigma$ is zero apart from arbitrary positions $a, b$, where there are already $A$ and $B$ candidates respectively. We break down the payouts if the new candidates position themselves at different locations along the spectrum (let us call the new candidates' position $i$):

#table(
  columns: (0.2fr, 1fr),
  inset: 5pt,
  [*Positioning*], [*Payout*],
  [$i < a$], [Each new candidate's payout would be the number of votes on and to the left of $i$ per candidate ($i / I$) plus half the votes between $i$ and $a$ per candidate ($(a - i - 1) / (2I)$). Summing the two, you get $(a + i - 1) / (2I)$. Interestingly, as this is linear and increasing in $i$, the strategy of positioning $i = a - 1$ strictly dominates all other strategies $i < a$. Therefore, the optimal payoff for $i : i < a$ is $(a + (a - 1) - 1) / (2I) = (2a - 2) / (2I)$.],
  [$i = a$], [This is equivalent to placing $A + I$ candidates at position $a$ in a portfolio with only $B$ candidates at position $b$, giving us a payoff per candidate of $(a + b - 1) / (2(I + A))$.],
  [$a < i < b$\ (assuming $b > a + 1$)], [The total payout is equal to the vote at $i$ ($1$), plus half the votes between $a$ and $i$ ($(i - a - 1) / 2$), and half the votes between $i$ and $b$ ($(b - i - 1) / 2$). Summing together and dividing by $I$, we get $(b - a) / (2I)$ votes per candidate.],
  [$i = b$], [The spatial symmetry of the situation means we can use the same computations as when $i = a$, giving us $(2R + 1 - a - b) / (2(I + B))$.],
  [$i > b$], [As before, the payoff is $(2R + 1 - i - b) / (2I)$, but the optimal payout is $(2R - 2b) / (2I)$.]
)

Given this, our calculations show that the best available payouts for a new candidate at position $i$ are:

$ cases(
  (2a - 2) / (2I) "if" exists.not j : sigma_j > 0 and j < i", where" a "is the nearest non-empty position (optimally" i = a - 1")",
  (b - a) / (2I) "if" a < i < b and exists.not j : sigma_j > 0 and (a < j < i or i < j < b),
  (2R - 2b) / (2I) "if" exists.not j : sigma_j > 0 and j > i", where" b "is the nearest non-empty position (optimally" i = b + 1")".
) $

To compute the payouts given any portfolio, we create a list $chi$ of non-empty positions (not a vector, as it has arbitrary length, but we will still denote the value at the $i^(t h)$ position as $chi_i$), and another list $upsilon$ of the number of candidates at position $chi_i$. Note that $(chi, upsilon)$ fully describes all the information in $sigma$ and vice versa -- they are equivalent representations. Their length (which is equal) will be denoted by $l <= min(N, R)$. This allows us to ignore all positions with no candidates, as they do not affect the payoffs. We then create a new list using $chi$ as follows:

$ [-1, 2chi_1 - 1, 0, chi_2 - chi_1, 0, chi_3 - chi_2, 0, ..., chi_l - chi_(l-1), 0, 2R + 1 - 2chi_l, -1] $

and take a moving sum (with window size 3) $delta$ of that new list, so we get:

$ delta = [2chi_1 - 2, chi_1 + chi_2 - 1, chi_2 - chi_1, chi_3 - chi_1, chi_3 - chi_2, ...] $

You will notice that this list represents twice the total payoffs for candidates (the numerators of the piecewise function) if they position before $chi_1$ (optimally), at $chi_1$, between $chi_1$ and $chi_2$, at $chi_2$, and so on. Therefore, if we take $delta' = [delta_2, delta_4, delta_6, ...]$ and divide $delta' / (2upsilon)$ element-wise (all division between lists will be assumed to be element-wise), we get the list of current payoffs of the candidates at each position $chi_i$. If we take $upsilon' = [1, upsilon_1 + 1, 1, upsilon_2 + 1, 1, upsilon_3 + 1, ...]$, then $delta / (2upsilon')$ represents the payoffs for a new candidate if they position before $chi_1$, at $chi_1$, between $chi_1$ and $chi_2$, at $chi_2$, and so on.
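The moving-sum construction can be sketched directly (the helper name is ours; payoffs are returned in the half-vote units of $1 / (2R)$ mentioned in the preliminaries, i.e. the raw numerators divided by the candidate counts):

```python
def payoff_lists(sigma, R):
    """Moving-sum payoff construction.  Returns (current, mutated):
    `current[k]` is the payoff per candidate at the k-th occupied position
    chi_k, and `mutated` interleaves a new candidate's payoffs (before
    chi_1, at chi_1, between chi_1 and chi_2, at chi_2, ...), both in
    units of 1/(2R) of the vote."""
    chi = [i + 1 for i, c in enumerate(sigma) if c > 0]  # occupied positions
    ups = [c for c in sigma if c > 0]                    # candidates at each
    # interleaved list whose window-3 moving sum is delta
    base = [-1, 2 * chi[0] - 1]
    for a, b in zip(chi, chi[1:]):
        base += [0, b - a]
    base += [0, 2 * R + 1 - 2 * chi[-1], -1]
    delta = [sum(base[i:i + 3]) for i in range(len(base) - 2)]
    ups_prime = []
    for u in ups:
        ups_prime += [1, u + 1]
    ups_prime.append(1)
    current = [delta[2 * k + 1] / ups[k] for k in range(len(chi))]  # delta'/upsilon
    mutated = [d / u for d, u in zip(delta, ups_prime)]             # delta/upsilon'
    return current, mutated
```

For one candidate each at positions 3 and 7 with $R = 10$, this gives the candidate at 7 a current payoff of $11$ half-votes ($5.5$ votes), and a newcomer's best option is position 8, worth $6$ half-votes.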
== Naive algorithm

We can use these results to develop a naive algorithm to verify whether a given portfolio of strategies is a Nash equilibrium. We first convert $sigma$ into $(chi, upsilon)$ and compute $delta' / (2upsilon)$. For every $chi_i$, we compute $sigma^((i)) = (sigma_1, sigma_2, ..., sigma_(chi_i) - 1, sigma_(chi_i + 1), ..., sigma_R)$ (i.e. the modified portfolio where a candidate at $chi_i$ has recanted his decision, and can redecide his position). For every $sigma^((i))$, we can calculate its equivalent $(chi^((i)), upsilon^((i)))$ representation, and use that to calculate $delta^((i)) / (2 upsilon^((i))')$, which are the payoffs available if a candidate at $chi_i$ repositions. Therefore, $exists i : (delta' / (2upsilon))_i < max{delta^((i)) / (2 upsilon^((i))')}$ means that a candidate at $chi_i$ could reposition to achieve a higher payoff than his current payoff, and therefore $sigma$ represents an equilibrium if and only if $forall i, (delta' / (2upsilon))_i >= max{delta^((i)) / (2 upsilon^((i))')}$.

This algorithm is functionally correct, and can be used to quickly determine whether a given portfolio of strategies is a Nash equilibrium. However, it is far from ideal. The goal of this verification function is to search through an entire solution space and determine which solutions are equilibria. We will see that the solution space we are searching through has size on the order of $vec(R + N - 1, N)$, so this verification algorithm needs to be extremely efficient to be used on the entirety of a factorial-size search space. The algorithm in its current form has an asymptotic complexity of $O(l^2)$. Obviously, the algorithm could not do any better than $O(l)$, as it needs to, at least, iterate through and read all non-zero values of $sigma$, of which there are $l$, and we will show that it is possible to achieve that lower bound.
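A cruder but self-contained reference implementation of the same check (brute force over every possible move, rather than the $delta$-list bookkeeping described above; we use it to validate the faster versions) might look like this:

```python
from fractions import Fraction

def vote_shares(sigma, R):
    """Votes per candidate at each occupied position, given counts sigma."""
    occupied = [i + 1 for i, c in enumerate(sigma) if c > 0]
    share = {p: Fraction(0) for p in occupied}
    for voter in range(1, R + 1):
        d = min(abs(voter - p) for p in occupied)
        nearest = [p for p in occupied if abs(voter - p) == d]
        for p in nearest:
            # split the voter across tied positions, then across candidates
            share[p] += Fraction(1, len(nearest) * sigma[p - 1])
    return share

def is_equilibrium(sigma, R):
    """True iff no single candidate can strictly gain by repositioning."""
    share = vote_shares(sigma, R)
    for i in range(R):
        if sigma[i] == 0:
            continue
        for j in range(R):
            if j == i:
                continue
            moved = list(sigma)
            moved[i] -= 1
            moved[j] += 1
            if vote_shares(moved, R)[j + 1] > share[i + 1]:
                return False
    return True
```

On the $N = R = 10$ case this reproduces the set derived earlier: the flat vector and the gap-plus-pair variants pass, while e.g. two candidates at position 1 with position 10 empty does not.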
== A more efficient implementation

The key mathematical observation that leads to the major reduction in runtime is that many of the lists of mutated payoffs $delta^((i)) / (2 upsilon^((i))')$ are extremely similar. In fact, if you take the original $delta / (2upsilon')$, the only differences it has with any $delta^((i)) / (2 upsilon^((i))')$ are in the positions near $chi_i$; more specifically, between $chi_(i - 1)$ (or 1, if $i = 1$) and $chi_(i + 1)$ (or $R$, if $i = l$), inclusive. This makes sense -- the payoff at any position is only affected by the number of candidates at that position and the distance to the nearest non-empty positions. We will call $delta / (2upsilon')$ the "_general mutated payout list (GMPL)_", denoted $g'$, and $delta^((i)) / (2 upsilon^((i))')$ the "_specific mutated payout list (SMPL)_" for position $chi_i$, denoted $g^((i))$. Let us also denote the general mutated payoff describing position $i$ as $g'(i)$.

Let us compare the payoffs. First, let us address the case where $upsilon_i > 1$. As the position would still have $sigma_i^((i)) > 0$, it would still be $chi_i^((i))$, albeit with $upsilon_i^((i)) = upsilon_i - 1$. Since the only change is in $upsilon_i^((i))$, and $upsilon_i$ only affects the payout at point $i$, the SMPL only differs from the GMPL where it describes point $i$, where the SMPL's payout is equal to the current payout (if the candidate reconsiders his position but doesn't move, he'll have his current payout). Since the payout at point $i$ has a smaller denominator in the SMPL for the same numerator, the general mutated payout must be smaller than the current payout. That means that the maximum of the SMPL is greater than the current payout if and only if the maximum of the GMPL is greater than the current payout.

Next, supposing that $upsilon_i = 1$, we will first take the case where the $chi_i$ in question is not first or last (i.e. $i != 1 and i != l$).
There are five payoffs described in the GMPL pertaining to the region $[chi_(i-1), chi_(i+1)]$: at point $chi_(i-1)$, between points $chi_(i-1)$ and $chi_i$, at point $chi_i$, between points $chi_i$ and $chi_(i+1)$, and at point $chi_(i+1)$. For the interior of the region, $(chi_(i-1), chi_(i+1))$, these payouts are $(chi_i - chi_(i-1)) / 2$, $(chi_(i+1) - chi_(i-1)) / 4$, and $(chi_(i+1) - chi_i) / 2$ respectively. In the SMPL, any point in the region $(chi_(i-1), chi_(i+1))$ has the payoff $(chi_(i+1) - chi_(i-1)) / 2$, which, again, is equal to the current payout and greater than those particular mutated payouts. At the boundary of the region, points $chi_(i-1)$ and $chi_(i+1)$, the payout also is higher in the SMPL than in the general list (intuitively, they gain some of the votes that went to $chi_i$, but the proof requires breaking down each case of what lies beyond the boundary), and their specific payouts may be higher than the current payout. The same is true about the first and last points, where the only positions whose specific mutated payout may be higher than the current payout are at $chi_2 - 1$ or $chi_(l-1) + 1$, and $chi_2$ or $chi_(l-1)$.

In summary, this means that, for the purposes of the equilibrium check, the maximum of the SMPL for position $chi_i$ is:

$ max_(n in [1, R]) {g^((i))(n)} = cases(
  gamma ", if" upsilon_i > 1,
  max{gamma, g^((i))(chi_2 - 1), g^((i))(chi_2)} ", if " upsilon_i = 1 and i = 1,
  max{gamma, g^((i))(chi_(l-1) + 1), g^((i))(chi_(l-1))} ", if " upsilon_i = 1 and i = l,
  max{gamma, g^((i))(chi_(i-1)), g^((i))(chi_(i+1))} ", otherwise"
) \
"where" gamma = max_(n in [1, R]) {g'(n)} $

When computed, the specific mutated payouts that still need to be calculated actually turn out to be neat combinations of general mutated and current payouts that don't require much recomputation (e.g. in the final case of the above equation, $g^((i))(chi_(i-1)) = g'(chi_(i-1)) + (chi_(i+1) - chi_i) / (upsilon'_i)$).
This means that the Nash equilibrium can mostly be verified by calculating the maximum of the GMPL, along with up to two exceptions per $chi_i$. The new algorithm runs in $O(l)$ time.

Several other computational optimisations helped to reduce the runtime of this algorithm. These include writing the algorithm itself as a binding in C, and using sliding window techniques to compute the moving sums and payouts in only one iteration of $chi$ and $upsilon$. However, the computational techniques used are not the focus of this paper, and will not be discussed in too much detail here. The final implementation of our search and validation algorithms is in the GitHub repository @github.

== Searching the Solution Set

We find all equilibria for a given $N, R$ by taking a set containing potential portfolios of strategies and filtering it through the validation algorithm. We use Satisfiability Modulo Theories (SMT) libraries, specifically _Z3_, a solver developed by Microsoft Research, to efficiently iterate through a solution space defined by several crude constraints. However, the validation algorithm itself proved too unwieldy to write in the constraint format, and was therefore applied to the output of the SMT solver.

Maintaining a small enough search space was a crucial problem. The size of the set of all $sigma$ representing portfolios of $N, R$ is effectively the number of ways to sort $N$ indistinct items into $R$ distinct baskets, which is $vec(R + N - 1, N)$. This is quite a large set to search through, so we actually generate the portfolios using an analogous representation that is easier to apply some more constraints to. We consider $L in NN_+^N$ as a representation of $sigma$, where each $L_i$ represents a candidate's position. If you do away with permutations by only considering $L : forall i < N, L_i <= L_(i + 1)$, it is apparent that there is a bijection between the set of such $L$ and the set of $sigma$, and therefore they have the same cardinality.
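The bijection, and the resulting search-space size, are easy to sanity-check by enumeration (illustrative helpers, not the Z3-based generator we actually used):

```python
from itertools import combinations_with_replacement
from math import comb

def sorted_profiles(N, R):
    """All nondecreasing position lists L of N candidates on positions 1..R."""
    return list(combinations_with_replacement(range(1, R + 1), N))

def to_sigma(L, R):
    """Convert an L-representation to its count-vector sigma."""
    sigma = [0] * R
    for pos in L:
        sigma[pos - 1] += 1
    return sigma
```

For $N = 4$, $R = 10$ this yields $vec(13, 4) = 715$ candidate profiles before any pruning, each mapping to a distinct $sigma$.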
However, it is now easier to preemptively enforce some constraints on the set. For example, the constraint that the most extreme candidate in a certain direction cannot be two or more positions away from the second most extreme candidate in that direction (as it is advantageous for him to move medially) can be applied by enforcing that $L_2 - L_1 < 2$ and $L_N - L_(N-1) < 2$. Another constraint applied is related to the probabilistic method used in the $N > R$ proofs earlier: since the average payout per player is $R / N$, some player receives at most $R / N$; if $L$ contains a gap big enough that the payout of moving into it exceeds $R / N$, then $L$ cannot represent an equilibrium, as that player would benefit from moving into the gap. A gap of size $floor(R / N) + 1$ at the extremal values, or $2 floor(R / N) + 1$ otherwise, is large enough for this to happen, so we also enforce the following constraints: $L_1 <= floor(R / N) + 1$, $L_N >= R - floor(R / N)$, and $forall i in [2, N], L_i - L_(i-1) < 2 floor(R / N)$.

= Results and Analysis

We first ran our completed program over $3 <= R <= 12$ and $2 <= N < R$. The full results, in their $L$ representations, are in the GitHub repository @github in CSV format. The $R = 10$ results are also included in the table below.

#figure(caption: [Equilibria for $R = 10$, $3 <= N <= 9$], placement: bottom)[
  #box(height: 22em)[
    #columns(4, gutter: 0.1em)[
      #set text(size: 8pt)
      #let data() = {
        let output = ()
        for i in range(4, 10) {
          output.push([#i])
          let csv_name = "data/" + str(i) + ".csv"
          let rows = csv(csv_name).map(it => "(" + it.join(", ") + ")")
          let content = rows.map(it => {[#it]}).join("\n")
          output.push(content)
        }
        return output
      }
      #table(
        columns: (auto, auto),
        inset: 3pt,
        align: center,
        [*$N$*], [*$L$ in Equilibrium*],
        [3], [None],
        ..data()
      )
    ]
  ]
]

It is interesting to note that there is no equilibrium for $R = 10$, $N = 3$, but this fact can actually be derived from some of the constraints we had earlier.
$floor(10 / 3) = 3$, so we have $L_1 <= 4$ and $L_3 >= 7$. We also know that $L_2 - L_1 < 2$ and $L_3 - L_2 < 2$. These four constraints are clearly contradictory, so there can be no equilibrium for $N = 3$.

In fact, there are no equilibria for $N = 3$ for any $R > 4$. The constraints imply that $(R - floor(R / 3)) - (floor(R / 3) + 1) <= 2$, which gives us that $floor(R / 3) >= (R - 3) / 2$. As $(R - 3) / 2$ clearly outpaces $floor(R / 3)$, there is only a finite set of $R$ where this inequality holds. The only such $R$ are $1, 2, 3, 4, 5, 6, 7, 9$, of which the cases where $R <= N$ obviously have their almost flat solutions. Testing the remaining values shows that only $R = 4$ yields any equilibria. This value of $N$ is unusual, as the difference $< 2$ constraints prove to be incredibly restrictive, rendering an equilibrium with $N = 3$ impossible to achieve for large enough $R$. For $N = 4$, for example, these few constraints, even with the addition that gaps cannot be larger than $2 floor(R / 4) + 1$, still have solutions for all $R$.

Another interesting point is that, qualitatively, the "shape" of the equilibria appears to remain relatively constant for any given $N, R$ s.t. $N < R$. We can quantify how "constant" this shape is by constructing a metric space for the equilibria and measuring the average distance between any two different points in the set. To do this, we use the earth mover's distance (EMD) metric. Informally, the EMD is defined as follows: imagine two distributions as two ways of piling up dirt over their domain $D$; the EMD is the minimum cost of transforming one distribution into the other, if moving one unit of weight (in dirt) one unit of distance along $D$ is one unit of cost. @emd For our particular distributions, we normalise so that 1 unit of weight is $N$ candidates, and 1 unit of distance is $R$ positions (i.e.
the cost of moving a single candidate $x$ positions is $x / (N R)$). For example, let us compute the average distance of the set of equilibria for $N = 4, R = 10$ by hand. You can get from the first distribution in the table to the second by moving a candidate from position 4 to position 3, which has a cost of $1 / 40$. The cost of moving from the second to the third is likewise $1 / 40$, and the cost of moving from the first to the third is $1 / 20$ (as two candidates need to be moved one position each). Therefore, the average distance of the set is $1 / 30$ (let us denote this as $E_(4, 10) = 1 / 30$). This computation is much more tedious for larger sets, as the sets must be compared pairwise; we wrote code to find the average distance of the sets, which is also included in the GitHub repository @github.

For a control benchmark, we use the expected EMD between distributions generated where each candidate chooses a position uniformly at random. While the expected EMD is very difficult to compute mathematically, we can estimate it by taking a sample of possible distributions of $N$ candidates over $R$ positions (we took 100,000 samples) and taking their average. We denote these estimated values as $macron(d)_(N, R)$. Comparing these to the average distance within the set of equilibria for any given $N, R$, the data shows that the average distance is, in our experimental cases, significantly smaller than the expected EMD between random distributions, showing that the equilibria for a given $N, R$ have a distinct "shape". For example, we computed that $macron(d)_(4, 10) = 0.199965$, which is significantly larger than our average distance of $0.0333$. We have tabulated some other results from $R = 10$ below, but the pattern appears to continue for different $R$. The data for $R <= 12$ has been computed and is in the GitHub repository.
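For equal-sized sets of point masses on a line, the EMD reduces to matching the sorted position lists entry by entry; a sketch of the computation (helper names are ours, and the profiles used for illustration are stand-ins rather than the actual $N = 4$ equilibria):

```python
def emd(L1, L2, N, R):
    """Normalised earth mover's distance between two position lists of N
    candidates on R positions.  On a line with equal unit masses, the
    optimal transport matches the sorted lists entry by entry; moving one
    candidate one position costs 1/(N*R)."""
    moves = sum(abs(a - b) for a, b in zip(sorted(L1), sorted(L2)))
    return moves / (N * R)

def average_pairwise_emd(profiles, N, R):
    """Mean EMD over all unordered pairs of distinct profiles in a set."""
    pairs = [(a, b) for i, a in enumerate(profiles) for b in profiles[i + 1:]]
    return sum(emd(a, b, N, R) for a, b in pairs) / len(pairs)
```

With $N = 4$, $R = 10$, moving one candidate one step indeed costs $1 / 40$, and any three profiles pairwise $1 / 40$, $1 / 40$, and $1 / 20$ apart average to $1 / 30$, matching $E_(4, 10)$.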
#figure(caption: [Equilibrium Average Earth Mover Distances and Expected Earth Mover Distances for a given $N$, where $R = 10$])[
  #table(
    columns: 7,
    [*$N$*], [*4*], [*5*], [*6*], [*7*], [*8*], [*9*],
    [*$E_(N,10)$*], [$0.033$], [$0.033$], [$0.033$], [$0.0641$], [$0.047$], [$0.031$],
    [*$macron(d)_(N,10)$*], [$0.200$], [$0.182$], [$0.167$], [$0.156$], [$0.146$], [$0.139$],
  )
]

A final relatively surprising finding is that the model does not appear to exhibit "middling" behaviour: at equilibrium, the candidates are not all positioned near the median. A result called the "median voter theorem" predicts that many election models should be won by the candidate occupying the position nearest to the "median" voter @black @bc2014. That usually means that the distribution should be closer to the median position at equilibrium (as observed by Hotelling) @stabcomp, or at least have a candidate as close as possible to the median position, as getting closer to the median position would be beneficial. However, this is not the case here. The issue is that our model does not fulfill the criterion for the theorem: the "Condorcet winner criterion". An election model satisfies the Condorcet winner criterion when, if a "Condorcet winner" exists, the Condorcet winner is also the overall winner. A "Condorcet winner" is defined as a candidate who would win against each other candidate in a two-candidate election. @bc2014 In some sense, you can think of a Condorcet winner as the pairwise most popular candidate.

A simple counterexample shows that our model does not fulfill the Condorcet winner criterion. Let $L = (3, 5, 7)$. Obviously, the candidate at position 5 would win in the two-candidate elections defined by $(3, 5)$ and $(5, 7)$. However, in the three-candidate election, that candidate actually gets the fewest votes (only $2$), compared to $4.5$ votes for the candidate at $7$ and $3.5$ votes for the candidate at $3$. Intuitively, it makes sense why this means the median candidate will not always win.
A candidate at the median position may get "choked out" by candidates on either side, even though the median candidate may be more popular pairwise than every other candidate.

= Areas for further research

Mathematically, there are several possible areas of further research. The "distinct shape" of equilibria given $N, R$ could be compared with the shapes of other equilibria. We conjectured that this shape may be related to either the mixed strategy equilibria, or the equilibria if the positions were continuous rather than discrete, but we were only able to compute those equilibria for a few cases, and so did not find a way to experimentally or mathematically verify this. There are also probably other patterns between the sets of equilibria for different $N, R$ that may imply useful properties for computation. For example, the size of the set of equilibria for $N = R - 1$ appears to be $4R - 20$, which may suggest some sort of pattern in the cases where $N = R - 1$. Finding these patterns may be extremely useful, as cases like $N = 11, R = 12$ have proven extremely time intensive.

Computationally, the next step would be to try to transfer the validation formula into SMT-compatible constraints, to try to take advantage of potential optimisations from Z3's symbolic reasoning engine.

#pagebreak()
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/fruitify/0.1.0/README.md
markdown
Apache License 2.0
# Fruitify

Make your equations more fruity! This package automatically replaces any single letters in equations with fruit emoji. Refer to `example-documentation.typ` for more detail.

## Emoji support

At the time of writing, Typst does not yet have good emoji support for PDF. This means that even though this package works as intended, the output may look very wrong when exporting to PDF. It is therefore recommended to stick with PNG export for now.
https://github.com/kdog3682/mathematical
https://raw.githubusercontent.com/kdog3682/mathematical/main/0.1.0/src/draw/index.typ
typst
#import "shapes.typ" #import "utils.typ" #import "brace.typ": brace #import "@preview/cetz:0.2.2" #let canvas = cetz.canvas #let set-style = cetz.draw.set-style
https://github.com/r8vnhill/apunte-bibliotecas-de-software
https://raw.githubusercontent.com/r8vnhill/apunte-bibliotecas-de-software/main/Unit2/abstract_classes.typ
typst
== Abstract classes

An abstract class is a kind of class that is not complete on its own; that is, it cannot be instantiated directly. These classes are useful as a base for other classes, allowing common methods and properties to be shared while forcing derived classes to implement certain methods.

=== Characteristics of abstract classes

- *Not instantiable:* Unlike regular classes, you cannot create instances of an abstract class directly.
- *May hold state:* Unlike interfaces, abstract classes can have stateful properties.
- *Use as a type:* In general, it is recommended not to use abstract classes as the type of parameters, return values, or variables, in order to promote the use of interfaces and favor composition over inheritance.
- *Naming:* As a best practice, the name of an abstract class should begin with the word `Abstract` to make its purpose and nature clear.

=== Defining an abstract class

Here is an example of an abstract class that implements the `ReadWritePlayer` interface and declares an abstract method:

```kotlin
abstract class AbstractPlayer(
    override val name: String,
    override var lifePoints: Int
) : ReadWritePlayer {
    abstract fun attack(player: ReadWritePlayer)
}
```

=== Abstract methods

- *Explicit declaration:* Every abstract method must be explicitly declared as such. Abstract methods have no implementation in the abstract class and must be implemented by any non-abstract class that inherits from it.

=== Implementation example

A class that extends `AbstractPlayer` could look like this:

```kotlin
class Warrior(name: String, lifePoints: Int) : AbstractPlayer(name, lifePoints) {
    override fun attack(player: ReadWritePlayer) {
        player.lifePoints -= 10 // Simulates an attack by subtracting life points
    }
}
```

This example shows how the concrete class `Warrior` implements the `attack` method defined in `AbstractPlayer`. In doing so, it provides the concrete behavior for the abstract method.
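The pieces above can be combined into one runnable sketch. The `ReadWritePlayer` interface shown here is a minimal assumption (only the two members the examples actually use); everything else follows the examples above:

```kotlin
// Minimal interface assumed from the examples above.
interface ReadWritePlayer {
    val name: String
    var lifePoints: Int
}

// Abstract base: shares state, forces subclasses to implement attack().
abstract class AbstractPlayer(
    override val name: String,
    override var lifePoints: Int
) : ReadWritePlayer {
    abstract fun attack(player: ReadWritePlayer)
}

// Concrete subclass providing the behavior for the abstract method.
class Warrior(name: String, lifePoints: Int) : AbstractPlayer(name, lifePoints) {
    override fun attack(player: ReadWritePlayer) {
        player.lifePoints -= 10 // a hit costs 10 life points
    }
}

fun main() {
    // val p = AbstractPlayer("x", 100) // does not compile: abstract classes are not instantiable
    val attacker = Warrior("Aragorn", 100)
    val target = Warrior("Uruk", 50)
    attacker.attack(target)
    println(target.lifePoints) // 40
}
```

Uncommenting the first line of `main` produces a compile error, which is exactly the "not instantiable" property described above.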
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/viewers/gradient.typ
typst
Apache License 2.0
#show quote.where(block: true): it => { align(center)[ #rect( radius: 2mm, inset: 5mm, width: 100%, fill: gradient.linear(luma(240), luma(210)), )[ #place(top + left, dy: -7mm)[ #rect( radius: 1mm, width: 7mm, height: 5mm, fill: luma(120), )[#text(size: 20pt, baseline: -1.3pt, fill: white, font: "Trebuchet MS")[”]] ] #v(2mm) #align(left)[#emph(it.body)] #if it.has("attribution") and it.attribution != none { align(right)[-- #it.attribution] } ] ] } #quote(attribution: "some latin dude, probably", block: true)[#lorem(50)]
https://github.com/ufodauge/master_thesis
https://raw.githubusercontent.com/ufodauge/master_thesis/main/src/main.typ
typst
MIT License
#import "@preview/algorithmic:0.1.0" #import "@preview/ctheorems:1.1.0": * #import "template/index.typ": template #import algorithmic : algorithm #show: template.with( title : "卒業論文、修士論文のタイトル", student-number: "22MM305", author : "<NAME>", mentor : "山田 敏規", mentor-post : "准教授", laboratry : "山田研究室", font : "<NAME>", font-strong : "<NAME>", date : datetime.today(), references : bibliography( "references/index.bib" ), abstract: [ 本研究では、人工知能 (AI) を用いた新しい癌治療法の開発について検討した。具体的には、AI を用いて癌細胞の特定と分類を行い、その情報をもとに標的療法を行うという方法である。 標的療法は、従来の化学療法や放射線療法と比較して、副作用が少なく、より効果的な治療法として期待されている。しかし、癌細胞の種類や分子構造は多様であるため、標的となる分子を特定することが難しいという課題があった。 本研究では、AI を用いることで、この課題を解決できるのではないかと考えた。AI は、*膨大なデータからパターンを抽出することに長けている*ため、癌細胞の分子構造を特徴付けるパターンを特定することができると考えられる。 本研究では、実際に AI を用いて癌細胞の特定と分類を行うための手法を確立した。この手法を用いて、実際に癌細胞の分子構造を特徴付けるパターンを特定することに成功した。 In this study, we investigated the development of a new cancer treatment using artificial intelligence (AI). Specifically, we proposed a method of using AI to identify and classify cancer cells, and then using that information to perform targeted therapy. Targeted therapy is a promising treatment that is less likely to cause side effects than traditional chemotherapy or radiation therapy. However, cancer cells are diverse in their types and molecular structures, making it difficult to identify target molecules. In this study, we hypothesized that AI could be used to address this challenge. AI is *good at extracting patterns from large amounts of data*, so we thought it could be used to identify patterns that characterize the molecular structure of cancer cells. In this study, we established a method for using AI to identify and classify cancer cells. Using this method, we successfully identified patterns that characterize the molecular structure of cancer cells. 
(written by Google Bard) ], acknowledgement: [ 本研究の遂行にあたり、多大なるご協力を賜りました関係者の皆様に、深く感謝申し上げます。 特に、本研究の指導をしてくださった [指導教員の名前] 教授に、心よりの謝辞を申し上げます。また、本研究にご協力いただいた [共同研究者の名前] 氏、[共同研究者の名前] 氏にも、厚くお礼申し上げます。 本研究は、[研究費の名称] の助成を受けて行われました。ここに、そのご支援に感謝申し上げます。 We would like to express our sincere gratitude to all those who provided us with great assistance in the conduct of this study. In particular, we would like to express our deepest gratitude to Professor [Name of the supervisor] for his guidance on this study. We would also like to thank Mr. [Name of the co-researcher] and Ms. [Name of the co-researcher] for their cooperation in this study. This study was conducted with the support of the [Name of the funding source]. We would like to express our gratitude for their support. (written by Google Bard) ], ) = はじめに == 背景 背景 == 目的 目的 == 本論文の構成 構成 = 序論 typstはmarkdown likeなコーディングでpdf、ポスター、スライド等のドキュメントを作成できます。 rust言語で書かれており、コンパイルが latex に比べて早いのが特長です。 == typstは優秀だ ```typ こんな感じで @ss8843592 or #cite(<ss8843592>) と引用できます ``` こんな感じで @ss8843592 or #cite(<ss8843592>) と引用できます === エレガントに書ける ```typ $ mat(1, 2; 3, 4) $ <eq1> ``` と書くと $ a = mat(1, 2; 3, 4) $ <eq1> のように、 @eq1 を書くことができます。 また、 ```typ #figure( image("assets/image.png", width: 20%), caption: [サンプル画像] ) <img1> ``` とすれば #figure( image("assets/image.png", width: 20%), caption: [サンプル画像] ) <img1> @img1 を表示できますし、 ```typ #figure( table( columns: 4, [t], [1], [2], [3], [y], [0.3s], [0.4s], [0.8s], ), caption: [テーブル @madje2022programmable], ) <tbl1> ``` とすれば #figure( table( columns: 4, [t], [1], [2], [3], [y], [0.3s], [0.4s], [0.8s], ), caption: [テーブル @madje2022programmable], ) <tbl1> @tbl1 も表示できます。 = 先行研究 #figure( image("assets/image.png", width: 20%), caption: [Typst + git @madje2022programmable], ) <img2> === LATEX はコンパイルが遅い 本資料は、LATEX でコンパイルの待ち時間中に作りました。 他にも ```typ #include path.typ ``` とすれば、他ファイルを参照できるので、長い分量の本などを作成する際に、章ごとにファイルを分けるなどができるようになります。 便利なので広まれば良いなと思います。 詳しくは#link("https://typst.app/docs")[ 公式ドキュメント ]をご覧ください = 定義 
Typstでは関数定義が簡単であるので定理の書き方などをカスタマイズできます。 == 定義例 ```typ #let theorem = thmbox( "theorem", //identifier "定理", base_level: 1 ) #theorem("すごい定理")[ Typst はすごいのである。 ] <theorem> ``` #let theorem = thmbox( "theorem", "定理", base_level: 1 ) #theorem("すごい定理")[ Typst はすごいのである。 ] <theorem> ```typ #let lemma = thmbox( "theorem", //identifier "補題", base_level: 1, ) #lemma[ Texはさようならである。 ] <lemma> ``` #let lemma = thmbox( "theorem", "補題", base_level: 1, ) #lemma[ Texはさようならである。 ] <lemma> このように、@theorem 、 @lemma を定義できます 。\ カッコ内の引数に人名などを入れることができます。 また、identifierを変えれば、カウントはリセットされる。 identifier毎にカウントを柔軟に変えられるようにしてあるので、様々な論文の形式に対応できるはずです。 ```typ #let definition = thmbox( "definition", //identifier "定義", base_level: 1, stroke: black + 1pt ) #definition("Prime numbers")[ A natural number is called a _prime number_ if it is greater than $1$ and cannot be written as the product of two smaller natural numbers. ] <definition> ``` #let definition = thmbox( "definition", "定義", base_level: 1, stroke: black + 1pt, ) #definition[ Typst is a new markup-based typesetting system for the sciences. ] <definition> @definition のようにカウントがリセットされています。 ```typ #let corollary = thmbox( "corollary", "Corollary", base: "theorem", ) #corollary[ If $n$ divides two consecutive natural numbers, then $n = 1$. ] <corollary> ``` #let corollary = thmbox( "corollary", "Corollary", base: "theorem", ) #corollary[ If $n$ divides two consecutive natural numbers, then $n = 1$. 
] <corollary> baseにidentifierを入れることで@corollary のようにサブカウントを実現できます。 ```typ #let example = thmplain( "example", "Example" ).with(numbering: none) #example[ 数式は\$\$で囲む ] <example> ``` #let example = thmplain( "example", "例" ).with(numbering: none) #example[ 数式は\$\$で囲む ] <example> thmplain関数を使ってplain表現も可能です。 #algorithm({ import algorithmic: * Function("Binary-Search", args: ("A", "n", "v"), { Cmt[Initialize the search range] Assign[$l$][$1$] Assign[$r$][$n$] State[] While(cond: $l <= r$, { Assign([mid], FnI[floor][$(l + r)/2$]) If(cond: $A ["mid"] < v$, { Assign[$l$][$m + 1$] }) ElsIf(cond: [$A ["mid"] > v$], { Assign[$r$][$m - 1$] }) Else({ Return[$m$] }) }) Return[*null*] }) })
https://github.com/peterpf/modern-typst-resume
https://raw.githubusercontent.com/peterpf/modern-typst-resume/main/template/main.typ
typst
The Unlicense
#import "@preview/modern-resume:0.1.0": modern-resume, experience-work, experience-edu, project, pill #show: modern-resume.with( author: "<NAME>", job-title: "Data Scientist", bio: lorem(5), avatar: image("avatar.png"), contact-options: ( email: link("mailto:<EMAIL>")[<EMAIL>], mobile: "+43 1234 5678", location: "Austria", linkedin: link("https://www.linkedin.com/in/jdoe")[linkedin/jdoe], github: link("https://github.com/jdoe")[github.com/jdoe], website: link("https://jdoe.dev")[jdoe.dev], ), ) == Education #experience-edu( title: "Master's degree", subtitle: "University of Sciences", task-description: [ - Short summary of the most important courses - Explanation of master thesis topic ], date-from: "10/2021", date-to: "07/2023", ) #experience-edu( title: "Bachelor's degree", subtitle: "University of Sciences", task-description: [ - Short summary of the most important courses - Explanation of bachelor thesis topic ], date-from: "09/2018", date-to: "07/2021", ) #experience-edu( title: "College for Science", subtitle: "College of XY", task-description: [ - Short summary of the most important courses ], date-from: "09/2018", date-to: "07/2021", ) == Work experience #experience-work( title: "Data Scientist", subtitle: "Some Company", facility-description: "Company operating in sector XY", task-description: [ - Short summary of your responsibilities ], date-from: "08/2021", ) #experience-work( title: "Full Stack Software Engineer", subtitle: [#link("https://www.google.com")[Some IT Company]], facility-description: "Company operating in sector XY", task-description: [ - Short summary of your responsibilities ], date-from: "09/2018", date-to: "07/2021", ) #experience-work( title: "Internship", subtitle: [#link("https://www.google.com")[Some IT Company]], facility-description: "Company operating in sector XY", task-description: [ - Short summary of your responsibilities ], date-from: "09/2015", date-to: "07/2016", ) #colbreak() == Skills #pill("Teamwork", fill: true) 
#pill("Critical thinking", fill: true) #pill("Problem solving", fill: true) == Projects #project( title: [#link("https://www.google.com")[Project 1]], description: [ - #lorem(20) ], date-from: "08/2022", ) #project( title: "Project 2", subtitle: "Data Visualization, Data Engineering", description: [ - #lorem(20) ], date-from: "08/2022", date-to: "09/2022", ) == Certificates #project( title: "Certificate of XY", subtitle: "Issued by authority XY", date-from: "08/2022", date-to: "09/2022", ) #project( title: "Certificate of XY", subtitle: "Issued by authority XY", date-from: "05/2021", ) #project( title: "Certificate of XY", subtitle: "Issued by authority XY", ) == Languages #pill("German (native)") #pill("English (C1)") == Interests #pill("Maker-culture") #pill("Science") #pill("Sports")
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/041%20-%20Kaldheim/003_Episode%202%3A%20Awaken%20the%20Trolls.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Episode 2: Awaken the Trolls", set_name: "Kaldheim", story_date: datetime(day: 13, month: 01, year: 2021), author: "<NAME> & <NAME>", doc ) In the bottom of the longship of Cosima, Kaya lay back and watched the night sky above drift by. It was all she could really do; there were no oars on the ship, no rudder. As soon as she had stepped on board, it had lurched suddenly away from the docks, and she understood that when Alrund had told her the ship "would carry her where she needed to go," he hadn't meant she'd have any choice in the matter. Nothing to be done about it, then, except to lay back and think. #figure(image("003_Episode 2: Awaken the Trolls/01.jpg", width: 100%), caption: [Mistgate Pathway | Art by: Yeong-Hao Han], supplement: none, numbering: none) Normally, the realms of Kaldheim were no more closely linked than the individual planes—if anything, the gulf between them was more absolute, since Kaya's natural ability to planeswalk didn't allow her to cross between them. Even for the gods of this world, crossing the Cosmos was no small task. There were exceptions, according to Inga. Now and then, through mortal ingenuity or random chance, a temporary link between two realms would open—Omenpaths, they were called. It was the Doomskars people feared, though—celestial collisions that seemed invariably to lead to calamity. The last time Bretagard collided with Karfell, a frozen land of wraiths and walking corpses, a legion of the undead made it all the way to the Beskir Stronghold before it was defeated. Never in recorded history had the Doomskar bridged their realm with Immersturm, the realm of demons, but the consequences of such a thing were hard to imagine—the last time a single demon had made it to Bretagard, he went on a rampage so horrible that they named the bleakest, darkest part of the year after it. 
All in all, it sounded like precisely the sort of event she told herself she'd avoid from now on. #emph[Stay focused, Kaya. You've got a possibly extraplanar, certainly dangerous monster to find. Plenty to keep you busy.] A soft #emph[thump ] woke Kaya from a dreamless sleep, and her hand went to the hilt of her dagger before she realized where she was. Wait. Where #emph[was ] she, though? She sat up and winced at a stitch in her back. The longship may have been a powerful artifact, capable of sailing through the raw magical energies of the Cosmos, but that didn't make it any better as a bed. A thick mist had settled over the water behind her, swallowing up everything but the sound of the tide lapping at the stern. Ahead, the nose of the ship had run up on a muddy bank gnarled with roots. "My stop, huh?" said Kaya, to no one. She climbed out and immediately her boots sunk into the wet black earth. As she wondered if she should tie the ship to one of the thick roots curling off the edge of the shore, the boat lurched back into the waves as if it had been pushed. In moments, the longship had disappeared into the mist. "Thanks for the ride," she muttered. What exactly would she do if the monster hopped realms again? Well, she'd worry about that later. Grabbing a branch for the leverage, Kaya climbed up off the bank and into the forest. Kaya had spent plenty of time in old places. When you specialized in things that should be dead but weren't, life took you to quite the assortment of ancient tombs and forgotten cities. But never had she been in a #emph[wild] place that felt so very, very old. Every tree stooped low and grandfatherly; the youngest of them seemed to have already lived a handful of lifetimes. Here and there she came across collapsed stonework, scarcely recognizable under the moss that grew everywhere. Everything seemed a relic of a lost age, a concession to time's final victory. 
In an hour of walking, Kaya saw only one intact structure, a towering archway built of stone. It had to have housed the gates of some grand, sunken fortress once; that, or whoever had built this place needed their doorways twenty feet high. The forest seemed to go on forever. All the while, Kaya searched for the silvery, organic-looking veins of metal she'd seen in that cave deep in the Aldergard. #emph[I'd take a big, spooky footprint instead. Or maybe some claw marks. ] But there was nothing. No sign that the monster had been this way at all. Kaya had stopped to rest on the trunk of a fallen tree when she heard the chatter of distant voices. She was back on her feet in an instant. #emph[Thank this plane's shiny, glowing gods.] Odds were they wouldn't be as welcoming as the Omenseekers had been, but she could at least ask whoever it was for directions. Kaya pushed aside heavy, drooping branches and ducked under mossy overhangs, following the sound. Finally, she emerged into a clearing. At one end was a massive block of worked stone, covered in faded knotwork patterns and a scaled ridge of mushrooms. The rest of the clearing held a collection of strange, and loud, creatures. Hunched over, they stood about as tall as she was, meaning they'd probably be a good bit taller if they ever stood up straight. All of them were green—some a pale green, some a deeper shade, some an ugly mottled pattern—with long, dark hair wrapping their bony forms like a shawl and formidable tusks that clacked together as they opened and closed their mouths, speaking in a language she didn't understand. #emph[Trolls. ] Hadn't seen any on Kaldheim, yet, but there was no mistaking them. And, if the Omenseekers were to be believed, the local variety were a foul-tempered bunch. Thankfully, they seemed too distracted talking to, and occasionally smacking, each other to notice her. 
Kaya was retreating back the way she came, step by careful step, when a figure stepped out onto the massive block of stone. Instead of a troll, it was a man wearing a hood that jingled with discs of gold. At his belt hung a sheathed sword. Around the stone, four trolls stepped out from the shadows, bigger than any in the crowd. They were draped in rusty, ill-fitting suits of mail and all carried weapons of some kind—clubs, crude axes, broken swords. One of them knocked his axe against the stone block and barked something in a harsh and guttural voice. The chattering in the crowd went silent, and the hooded man gestured toward them with spread arms. "Friends," he said, in a low, sonorous voice. "You know my many names. I am called Trickster by some, Ruse-Forger by others. Some have called me the Prince of Mischief, some the God of Lies. All know me as Valki, and my first gift to you, the gift of languages, is free. Hear my words; understand them. What I have to tell you is of grave importance." #figure(image("003_Episode 2: Awaken the Trolls/02.jpg", width: 100%), caption: [Valki, God of Lies | Art by: Yongjae Choi], supplement: none, numbering: none) A god? Here? At least this one wasn't pretending to be an old man. Although#emph[, ] thought Kaya, there was something strange about him. Something she couldn't quite place. "A time of great strife approaches! Soon, a path will open to cruel and strange worlds, filled with creatures of great avarice and wickedness! If allowed, these savage peoples will burn the forests of Gnottvold to the ground! They will put the proud clans of trollkind to the sword!" Silence, and the occasional nervous clacking of teeth were all the response he received. "These foul invaders wish to"—he paused, as if seeking the right words—"they wish to seize the treasures from your warrens!" At that, the crowd exploded with angry shrieking. Valki allowed a few moments of this, before waving his hands for quiet. 
When no quiet was forthcoming, one of the big, armored brutes cracked a troll in the front row with his club, and the crowd went silent again. "There is but one solution to this—the clans of the Gnottvold must attack first! Too long have you been divided by petty rivalries! Strike as one, and none shall be able to stop you!" Then Kaya realized what she was seeing; Valki glimmered. It was subtle, at first, very different from the overflowing radiance that had spilled from Alrund. Easy to miss—but Kaya had hunted insubstantial enemies for a long time. She was used to spotting subtle currents of energy. What she was seeing was an illusion. And Kaya knew there was just no way she was seeing an illusion created by #emph[the God of Lies] . Quietly, Kaya ginned up a spell. Nothing fancy—a little purification, a little seeing-beyond-the-veil. Throw in a bit of wind, and~ She blew gently toward Valki, little motes of white light exiting her pursed lips. The spell tumbled forward, air swirling around it, whipping into a gale that blew the manes of the troll crowd to and fro. When it swept over Valki, it seemed to strip the Valki right off of him; in place of the God of Lies stood a red-skinned man with two prominent horns and a very surprised expression on his face. "Who dares to—? Show yourself!" he spat angrily. #emph[Bad idea] , thought Kaya. But, then again, how good had any of her ideas turned out to be so far? She stepped out from behind her tree. "Probably thought you could get away with a sloppy illusion, right?" said Kaya. "Dumb trolls won't know the difference. Bad luck for you, Tibalt." #figure(image("003_Episode 2: Awaken the Trolls/03.jpg", width: 100%), caption: [Tibalt, Cosmic Impostor | Art by: Yongjae Choi], supplement: none, numbering: none) The corner of his lip rose in a grin. The expression didn't seem to dampen any of that anger. "Sharp eyes on this one. Have we been acquainted, then?" "Nope. But—what is it they say? Your reputation precedes you." 
Oh, she'd heard plenty of stories about the devil planeswalker, and none of them were good. "You're too kind. And to whom do I owe the pleasure?" "Name's Kaya." "Hmm. Rings a bell. A sneak and a thief, if I remember correctly. A killer." "Quite the accusation, coming from you. What are you doing here?" Tibalt shrugged. "I could ask you the same question. We planeswalkers are meddlers by nature, aren't we? But as you can probably tell, I happened to be in the middle of something before you #emph[so rudely ] interrupted, so if you'll excuse me—kill her!" The gathered trolls looked between her and Tibalt, uncertain. The big ones by the stone block showed no hesitation, though; loping like animals, they bull-rushed their way through the crowd, sending smaller specimens flying. The first one to reach Kaya swung his axe at her with both hands, bellowing madly. It went straight through the phased section of her body, its momentum carrying him forward as he tumbled and tripped over a root. The second jabbed at her with a rusty, ancient-looking sword. She sidestepped it and shoved him hard. Just as he hit the massive tree next to her, she rendered him momentarily incorporeal; the result, when he phased back in, was an ugly tangle of green limbs sticking out of the trunk like hideous branches. The last two hovered on the edge of the crowd after that, clearly rethinking things after what had happened to their comrades. "Yeah," said Kaya. "I wouldn't." The trolls glanced at each other. A moment later, both dropped their weapons and ran. She looked up just in time to see Tibalt turn and run into the woods. #emph[Jerk's really going to make me chase him.] She followed him through a tangled knot of trees. Tibalt had a head start, but he couldn't turn his body insubstantial at will; slowly but surely, phasing through fallen trees and crumbling stone archways, she gained on him. 
Finally, in an open stretch of land between a series of mossy hills on one side and a few rickety wooden structures on another, she cut him off. He bent over to catch his breath. "You run like the #emph[devil] !" he said, laughing and wheezing. "We done?" said Kaya. "Tell me what you're doing here. What do you get out of riling up a bunch of trolls? What's in it for you?" "My dear," said Tibalt, giving her a look at his many sharp teeth. "Chaos is its own reward, and nothing puts a smile on my face like a bit of mayhem. But I can't see how it's any of your business. This isn't your home. These aren't your people." The thought had occurred to her, yeah. But she #emph[did ] have business here. "There's a monster in Kaldheim. Something from outside of the plane. You wouldn't have had anything to do with that, would you?" Tibalt cocked his head. "A #emph[monster?] Why, I'm shivering in my boots! I positively must find somewhere to hide! Let me just—" "You're not going anywhere, and your troll minions aren't around to help this time. Not like they were able to slow me down." "Oh, clearly not!" said Tibalt, grinning in a way that made Kaya uneasy. "At least, not the #emph[Hagi] variety. You proved yourself quite capable of dispatching them. But as far as their cousins, the Torga—well, I like their odds a bit better." He held two fingers to his mouth, then, and gave the loudest, most shrill whistle Kaya had ever heard. She clapped her hands over her ears and bent over, wincing. After it had passed, Kaya glanced around frantically, ready for a legion of trolls to charge out of the woods, but there seemed to be nothing but rolling, grassy hills and those collapsed structures shot through with wood-rot. "Looks like your big, bad troll friends aren't showing up," said Kaya. "Now let's—" A rumbling underfoot cut her off, and the hill closest to Tibalt got about a foot taller. His grin climbed a few inches, too. 
#figure(image("003_Episode 2: Awaken the Trolls/04.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) "Actually," said Tibalt, "it looks like your eyes aren't as keen as you believed them to be." One after another, they pulled free of the earth, showering the clearing with clumps of black dirt. On one side of her, the wooden structures seemed to collapse in reverse as a behemoth form pushed itself out of the ground, shaking loose errant pieces of wood. They were massive—twenty feet tall at least, the bony ridges along their body resembling nothing so much as features of the landscape. Of most keen interest to Kaya were their fists, each one roughly the size and shape of a boulder. In their long, lank hair grew mosses and grasses; on the one that had emerged from beneath the wooden structure, planks and beams hung off it like primitive armor. Set deep into their geological faces were pinprick-red eyes. One yawned as it rose, revealing a mouth of yellow, twisted fangs. "Torga trolls, you see, simply hate to be woken from a deep sleep," said Tibalt. "And once they have been, they've got an unfortunate tendency to rip apart anyone and anything in the vicinity." "Are you crazy?!" hissed Kaya, turning to face the trolls behind her. She counted six altogether. "They'll kill us both!" There was an unusual sound behind her, then—a keening, whistling noise, as if the air itself was being sharpened. She turned back to see that Tibalt had drawn his sword. It was a marvel, that was plain. Forged from some kind of glass, it seemed to contain a shifting spectrum of color she had only seen once before: spilling off of Alrund himself. Next to Tibalt was a hole in the world. There was no other way to put it; it hung in the air, the edges ragged and uneven and faintly glowing. Heat and sulfurous air seemed to pour out, and through the tear, Kaya glimpsed black earth, split with volcanic ruptures. Tibalt hefted the sword and grinned at her. 
"Works like a charm. I'd say I wish you luck, but that would make me a liar, wouldn't it?" With that, he stepped through the portal. Behind him, the edges came together and vanished—leaving Kaya with the trolls. She pulled her daggers free from their sheaths as slowly as she could. Maybe she could still get out of this without a fight. "Listen—the guy who woke you up, he just stepped out, but if you'll just give me a minute to explain myself—" One of the trolls swung at her open-palmed, like it was trying to squash a bug. Would have succeeded, too, if Kaya hadn't phased out of the way. Even clear of the blow, the impact rattled her teeth. "Okay," she said. "I tried." She jabbed one of her blades into the troll's arm—or, rather, she tried to. It felt almost exactly like trying to stab a hunk of rock. There was a ringing snap, and she watched the dagger she'd had since Tolvada break in two. The shock lasted only a moment, but that was long enough for the troll to sweep out its hand and knock her across the clearing. Her head was ringing when she pushed herself back to her feet. It had been a long time since she had been hit that hard. She flipped her remaining dagger tip-down, in a reversed grip. "I #emph[liked ] that blade." She'd tumbled right into squashing range of another troll; it swung at her with an uprooted tree, which she phased through. On the other side she slashed at its exposed leg; the blow scraped and skittered off the thick hide, leaving nothing but a thin scratch. "Come #emph[on] ," she said, dodging a backhand from a second troll. She rolled between the legs of a third, narrowly avoiding its clumsy attempt to grab her. #emph[Time to fight dirty.] Wrapping her blade in ethereal energy, Kaya jabbed it between two huge vertebrae and yanked her hand out just in time to have it rematerialize. Tricky timing—but she was rewarded with a deep bellow as the dagger solidified in its spine. With a tremendous crash, the troll hit the ground. "Who's next?" 
she said, turning to the others. Okay, she may have briefly disarmed herself with that little trick, but it was nothing she couldn't— Pain exploded along her left side, and then she was tumbling, rolling across the ground. The troll she had #emph[just ] put down—the one who had apparently swatted her—was lurching to his feet; she could see the scratch she'd put on his leg closing up. #emph[They heal too] , she thought, in between the waves of nausea. Why did everything on this plane heal? The other trolls roared and banged their fists on the ground, spread out in a semi-circle that blotted out the sun. One against six. She'd won fights with worse odds. But then again, she'd had weapons in those fights. One dagger broken; one embedded in an angry troll. Kaya took a deep breath, wincing at the jolt in her ribs. "Need a hand?" came a voice from her left. #figure(image("003_Episode 2: Awaken the Trolls/05.jpg", width: 100%), caption: [<NAME> | Art by: <NAME>], supplement: none, numbering: none) Leaning against one of the ancient, twisting trees of this place was a man with long braids of red hair. By his pointed ears, Kaya could see that he was an elf, but his body was packed with more muscle than she was used to seeing on their kind. He was clearly proud of it, too; despite the cold, he wore no shirt. Only a collection of charms hanging from necklaces and a pair of bracers, one of which was fixed with a brass knife-blade. There was something about his relaxed, easy pose that made him seem young, even for a people who always seemed young. "How long have you been standing there?" she said. "Long enough to see that you're not faring too well. Not that I blame you! A Torga troll is no easy opponent, let alone six. Lucky for you I happened to be passing by." That irked her. For a moment, Kaya turned away from the approaching trolls, who still had every intention of smashing her to paste. "Listen, kid, get out of here before you get hurt. I can handle myself." 
"I'm not so sure about that. After all, you've lost both of your blades, while I still have my secret weapon." "That the thing on your wrist?" "Oh, no. I meant this." He tossed up a small, flat stone. Caught it, made it tumble over his long fingers. Kaya blinked. "That's your secret weapon? A rock?" He only smiled and strolled toward the trolls as if he didn't have a care in the world. "Hey! Watch out!" she shouted. Stupid kid—making her save them both. Now she couldn't just run. She moved toward him, preparing to phase him out, but there was a lot of distance to cover. The trolls, it appeared, were equanimous; they were just as willing to tear this new opponent apart. As he came closer, one of them swung a mud-covered fist. He stepped out of the way without breaking pace. He was quick; she'd give him that. Even without the ability to turn insubstantial, catching the elf seemed an impossibility for the sluggish trolls. They'd slam the ground where he'd just stood while he danced to the side; they'd clap their hands together where he'd been a moment before, and he'd backflip away. It was like trying to catch smoke, or bottle lightning. More than once, Kaya thought she saw him linger a moment longer than he needed to, letting a blow from his foes miss by inches instead of miles. #emph[A show-off, then.] Meanwhile, a transformation was happening in the fist that gripped the rock; the skin of his arm and hand appeared to be growing polished and hard, turning almost exactly the same gray as the stone. As one of the trolls tried to stomp the nimble elf into the bedrock below the hill, he leaped suddenly forward. He didn't strike the creature with that brass blade affixed to his arm, though; he only touched the creature on the leg with his new stone hand. Suddenly, the same transformation that had covered the young elf's arm began to spread rapidly across the troll's leg. Its green-gray hide, already pitted and craggy, changed to rough stone. 
The rock moved in a ripple up its torso, creeping ahead with alarming speed. The lumbering creature had enough time to drop its tusked jaw open in surprise before the wave of stone swept up its face, the expression of surprise freezing in place. The troll with the tree swung it toward the elf in a wide arc; he vaulted straight over it, angling his body between two whip-like branches and tucking into a roll on the other side. He placed that stone-gray hand on the elbow of the troll; in moments, the whole creature was rock. He dodged another blow, turned another troll to stone, and another. It took less than a minute from beginning to end. When they were all defeated, the elf stood with his hands on his hips, staring proudly at the towering statues as if he'd carved them himself. He looked so self-satisfied, Kaya hated to admit that she was impressed. "Not bad, kid." He looked at her, his expression going sour. "Would you stop calling me that?" "What should I call you then?" "<NAME>. Prince of the elves of Skemfar. Greatest hero in all the realms. Your personal savior." "Tyvar, then," she said, trying not to roll her eyes. "I'm Kaya. I appreciate the help, but what's a #emph[great hero ] like yourself doing out in the middle of the woods? Any chance you were following me?" "Not you. Valki." "He's not Valki," said Kaya, walking over to where her blade had snapped. She slipped the metal end into the sheath; the hilt, she hung from her belt. "His name is Tibalt." Which of these trolls had her other dagger been embedded in? It was hard to say now—especially because they were all statues. She swept one hand through, probing carefully. It was stone all the way through. She swore under her breath. "Yes, I'd gathered that, thanks to your handy dispelling. I'd been suspicious of him for some time, though. Not long ago he came to see my brother in court. I don't know what lies he told Harald, but ever since that visit the elves have been preparing for war. 
There are rumors they mean to march against the gods themselves." She turned just in time to see that all the bravado and swagger he'd had before was gone. He looked young, and worried—a moment later he'd straightened up, but not quickly enough for her to miss it. If Tibalt was messing with his people, she guessed she couldn't blame him for being a little concerned. "But how the legions intend to cross into the realm of the gods, I don't know," he finished. #emph[Oh, ancients. ] "The Doomskar. Alrund said there was a Doomskar coming," said Kaya. At that, Tyvar looked as surprised as the troll statues behind him. "A Doomskar? And you heard this from Alrund himself?" "Yeah. Nice guy. Lent me a boat." "And—this Tibalt. He is an enemy of yours?" "Certainly not a friend. I don't know what he's up to, but it's trouble one way or another." "We'll pursue him together, then. Clearly you need my help," said Tyvar, smiling at her in a way that hadn't yet failed to piss her off. #emph[With an attitude like that] , she thought, #emph[this kid's gonna get himself killed. ] Not like that was her problem. "Listen, I've got other business. I can't go running after every villain that rears his ugly, horned head. Besides, I don't even know how we'd follow him." "What do you mean?" "He used a sword to open some kind of portal." "Did you see anything? Through it, on the other side," said Tyvar. "Not much. It was only open for a second," said Kaya, trying to think. "I remember seeing fire, though. And the ground looked like it had been charred black." "Immersturm," said Tyvar. The name dropped into her stomach like an iron weight; she'd heard Inga whisper stories about that place. #emph[The realm of demons.] Tyvar, bafflingly, seemed excited at the news. "Well, unless you happen to have a magic boat hanging around—" But Tyvar already had his eyes closed. He extended both hands in front of him, and Kaya reflexively took a step back. 
Slowly, in the surrounding air, currents of mana began to curl and twist into complex patterns of glowing knotwork. Kaya realized she had seen magic like this before—it had looked almost effortless when Alrund had opened a doorway into another realm, but the fundamentals were the same. When it opened into that shimmering nightscape of the Cosmos, she felt an odd decompression in her ears, as if all the air had suddenly gone out of the clearing. Tyvar finally opened his eyes: a doorway stood before them. "Where in the hells did you learn to do that?" breathed Kaya. "The sorcerers of Skemfar are experts of their craft. And you can count me an expert among experts," he said, grinning. "I've been all throughout the realms of Kaldheim. My natural gifts express themselves a bit differently in each one." She took a step closer, and something caught her eye. In the bundle of charms around his neck, among the bones and gems and little twisted bits of metal, was a small octahedron of dark stone. Covering the sides was a minute, precise etching—a design she had seen before. #emph[But not here.] "Ah," he said, catching her gaze. He held up the little shaped stone to the light. "Feel free to admire it. I found this one in a remote realm, one even the sagas don't speak of. It was called—" "Zendikar," she said, cutting him off. "Holy ancients. You're a planeswalker." His grin flagged, uncertain. "And what, exactly, is a planeswalker?"
https://github.com/catppuccin/typst
https://raw.githubusercontent.com/catppuccin/typst/main/src/flavors/catppuccin-frappe.typ
typst
MIT License
#let frappe = (
  name: "Frappé",
  emoji: "🪴",
  order: 1,
  dark: true,
  light: false,
  colors: (
    rosewater: (name: "Rosewater", order: 0, hex: "#f2d5cf", rgb: rgb(242, 213, 207), accent: true),
    flamingo: (name: "Flamingo", order: 1, hex: "#eebebe", rgb: rgb(238, 190, 190), accent: true),
    pink: (name: "Pink", order: 2, hex: "#f4b8e4", rgb: rgb(244, 184, 228), accent: true),
    mauve: (name: "Mauve", order: 3, hex: "#ca9ee6", rgb: rgb(202, 158, 230), accent: true),
    red: (name: "Red", order: 4, hex: "#e78284", rgb: rgb(231, 130, 132), accent: true),
    maroon: (name: "Maroon", order: 5, hex: "#ea999c", rgb: rgb(234, 153, 156), accent: true),
    peach: (name: "Peach", order: 6, hex: "#ef9f76", rgb: rgb(239, 159, 118), accent: true),
    yellow: (name: "Yellow", order: 7, hex: "#e5c890", rgb: rgb(229, 200, 144), accent: true),
    green: (name: "Green", order: 8, hex: "#a6d189", rgb: rgb(166, 209, 137), accent: true),
    teal: (name: "Teal", order: 9, hex: "#81c8be", rgb: rgb(129, 200, 190), accent: true),
    sky: (name: "Sky", order: 10, hex: "#99d1db", rgb: rgb(153, 209, 219), accent: true),
    sapphire: (name: "Sapphire", order: 11, hex: "#85c1dc", rgb: rgb(133, 193, 220), accent: true),
    blue: (name: "Blue", order: 12, hex: "#8caaee", rgb: rgb(140, 170, 238), accent: true),
    lavender: (name: "Lavender", order: 13, hex: "#babbf1", rgb: rgb(186, 187, 241), accent: true),
    text: (name: "Text", order: 14, hex: "#c6d0f5", rgb: rgb(198, 208, 245), accent: false),
    subtext1: (name: "Subtext 1", order: 15, hex: "#b5bfe2", rgb: rgb(181, 191, 226), accent: false),
    subtext0: (name: "Subtext 0", order: 16, hex: "#a5adce", rgb: rgb(165, 173, 206), accent: false),
    overlay2: (name: "Overlay 2", order: 17, hex: "#949cbb", rgb: rgb(148, 156, 187), accent: false),
    overlay1: (name: "Overlay 1", order: 18, hex: "#838ba7", rgb: rgb(131, 139, 167), accent: false),
    overlay0: (name: "Overlay 0", order: 19, hex: "#737994", rgb: rgb(115, 121, 148), accent: false),
    surface2: (name: "Surface 2", order: 20, hex: "#626880", rgb: rgb(98, 104, 128), accent: false),
    surface1: (name: "Surface 1", order: 21, hex: "#51576d", rgb: rgb(81, 87, 109), accent: false),
    surface0: (name: "Surface 0", order: 22, hex: "#414559", rgb: rgb(65, 69, 89), accent: false),
    base: (name: "Base", order: 23, hex: "#303446", rgb: rgb(48, 52, 70), accent: false),
    mantle: (name: "Mantle", order: 24, hex: "#292c3c", rgb: rgb(41, 44, 60), accent: false),
    crust: (name: "Crust", order: 25, hex: "#232634", rgb: rgb(35, 38, 52), accent: false),
  ),
)
https://github.com/mohe2015/not-tudabeamer-2023
https://raw.githubusercontent.com/mohe2015/not-tudabeamer-2023/main/template/main.typ
typst
The Unlicense
#import "@preview/not-tudabeamer-2023:0.1.0": *

#show: not-tudabeamer-2023-theme.with(
  config-info(
    title: [Title],
    short-title: [Title],
    subtitle: [Subtitle],
    author: "Author",
    short-author: "Author",
    date: datetime.today(),
    department: [Department],
    institute: [Institute],
    logo: text(fallback: true, size: 0.75in, emoji.cat.face)
    //logo: image("tuda_logo.svg", height: 100%)
  )
)

#title-slide()

#outline-slide()

= Section

== Subsection

- Some text
- More text
- This is pretty small, you may want to change it
- nested
  - bullet
  - points

= Another Section

== Another Subsection

- Some text
- More text
- This is pretty small, you may want to change it

= Another Section 2

= Another Section 3

= Another Section 4

= Another Section 5

= Another Section 6

= Another Section 7
https://github.com/noahjutz/AD
https://raw.githubusercontent.com/noahjutz/AD/main/components/timeline.typ
typst
#import "/config.typ": theme

#let timeline(body) = {
  set list(marker: circle(radius: 4pt, fill: black, stroke: none))
  show list: l => {
    layout(((width,)) => {
      let items = ()
      for (i, child) in l.children.enumerate() {
        let (height,) = measure(
          block(
            width: width,
            inset: (left: 16pt),
            child.body
          )
        )
        items.push(block(
          width: 8pt,
          height: height + if i < l.children.len() - 1 { 16pt },
          {
            set block(above: 0pt)
            circle(
              radius: 4pt,
              stroke: none,
              fill: theme.fg_light
            )
            align(center, block(
              width: .5pt,
              height: height + if i < l.children.len() - 1 { 8pt } else { -8pt },
              stroke: theme.fg_light
            ))
          }
        ))
        items.push(child.body)
      }
      grid(
        columns: (8pt, 1fr),
        column-gutter: 8pt,
        ..items
      )
    })
  }
  body
}
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/README.md
markdown
Apache License 2.0
# Typst Packages The package repository for Typst, where package authors submit their packages. The packages submitted here are available on [Typst Universe][universe]. ## Package format A package is a collection of Typst files and assets that can be imported as a unit. A `typst.toml` manifest with metadata is required at the root of a package. An example manifest could look like this: ```toml [package] name = "example" version = "0.1.0" entrypoint = "lib.typ" authors = ["The Typst Project Developers"] license = "MIT" description = "An example package." ``` Required by the compiler: - `name`: The package's identifier in its namespace. - `version`: The package's version as a full major-minor-patch triple. Package versioning should follow [SemVer]. - `entrypoint`: The path to the main Typst file that is evaluated when the package is imported. Required for submissions to this repository: - `authors`: A list of the package's authors. Each author can provide an email address, homepage, or GitHub handle in angle brackets. The latter must start with an `@` character, and URLs must start with `http://` or `https://`. - `license`: The package's license. Must contain a valid SPDX-2 expression describing one or multiple [OSI-approved][OSI] licenses. - `description`: A short description of the package. Double-check this for grammar and spelling mistakes as it will appear in the [package list][list]. Optional: - `homepage`: A link to the package's web presence, where there could be more details, an issue tracker, or something else. Will be linked to from the package list. - `repository`: A link to the repository where this package is developed. Will be linked to from the package list if there is no homepage. - `keywords`: An array of search keywords for the package. - `categories`: An array with up to three categories from the [list of categories][categories] to help users discover the package. 
- `disciplines`: An array of [disciplines] defining the target audience for which the package is useful. Should be empty if the package is generally applicable. - `compiler`: The minimum Typst compiler version required for this package to work. - `exclude`: An array of globs specifying files that should not be part of the published bundle that the compiler downloads when importing the package. To be used for large support files like images or PDF documentation that would otherwise unnecessarily increase the bundle size. Don't exclude the README or the LICENSE. Packages always live in folders named as `{name}/{version}`. The name and version in the folder name and manifest must match. Paths in a package are local to that package. Absolute paths start in the package root, while relative paths are relative to the file they are used in. ### Templates Packages can act as templates for user projects. In addition to the module that a regular package provides, a template package also contains a set of template files that Typst copies into the directory of a new project. In most cases, the template files should not include the styling code for the template. Instead, the template's entrypoint file should import a function from the package. Then, this function is used with a show rule to apply it to the rest of the document. Template packages (also informally called templates) must declare the `[template]` key in their `typst.toml` file. 
A template package's `typst.toml` could look like this: ```toml [package] name = "charged-ieee" version = "0.1.0" entrypoint = "lib.typ" authors = ["Typst GmbH <https://typst.app>"] license = "MIT-0" description = "An IEEE-style paper template to publish at conferences and journals for Electrical Engineering, Computer Science, and Computer Engineering" [template] path = "template" entrypoint = "main.typ" thumbnail = "thumbnail.png" ``` Required by the compiler: - `path`: The directory within the package that contains the files that should be copied into the user's new project directory. - `entrypoint`: A path _relative to the template's path_ that points to the file serving as the compilation target. This file will become the previewed file in the Typst web application. Required for submissions to this repository: - `thumbnail`: A path relative to the package's root that points to a PNG or lossless WebP thumbnail for the template. The thumbnail must depict one of the pages of the template **as initialized.** The longer edge of the image must be at least 1080px in length. Its file size must not exceed 3MB. Exporting a PNG at 250 DPI resolution is usually a good way to generate a thumbnail. You are encouraged to use [oxipng] to reduce the thumbnail's file size. The thumbnail will automatically be excluded from the package files and must not be referenced anywhere in the package. Template packages must specify at least one category in `package.categories`. If you're submitting a template, please test that it works locally on your system. The recommended workflow for this is as follows: - Add a symlink from `$XDG_DATA_HOME/typst/packages/preview` to the `preview` folder of your fork of this repository (see the section on [local packages](#local-packages)). - Run `typst init @preview/mypkg:version`. Note that you must manually specify the version as the package is not yet in the index, so the latest version won't be detected automatically. 
- Compile the freshly instantiated template. ### Third-party metadata Third-party tools can add their own entry under the `[tool]` section to attach their Typst-specific configuration to the manifest. ```toml [package] # ... [tool.mytool] foo = "bar" ``` ## Published packages This repository contains a collection of published packages. Due to its early and experimental nature, all packages in this repository are scoped in a `preview` namespace. A package that is stored in `packages/preview/{name}/{version}` in this repository will become available in Typst as `#import "@preview/{name}:{version}"`. You must always specify the full package version. You can use template packages to create new Typst projects with the CLI with the `typst init` command or the web application by clicking the _Start from template_ button. ### Submission guidelines To submit a package, simply make a pull request with the package to this repository. There are a few requirements for getting a package published, which are detailed below: - **Naming:** Package names should not be the obvious or canonical name for a package with that functionality (e.g. `slides` is forbidden, but `sliding` or `slitastic` would be ok). We have this rule because users will find packages with these canonical names first, creating an unfair advantage for the package author who claimed that name. Names should not include the word "typst" (as it is redundant). If they contain multiple words, names should use `kebab-case`. Look at existing packages and PRs to get a feel for what's allowed and what's not. *Additional guidance for template packages:* It is often desirable for template names to feature the name of the organization or publication the template is intended for. However, it is still important to us to accommodate multiple templates for the same purpose. Hence, template names shall consist of a unique, non-descriptive part followed by a descriptive part. 
For example, a template package for the fictitious _American Journal of Proceedings (AJP)_ could be called `organized-ajp` or `eternal-ajp`. Package names should be short and use the official entity abbreviation. Template authors are encouraged to add the full name of the affiliated entity as a keyword. The unamended entity name (e.g. `ajp`) is reserved for official template packages by their respective entities. Please make it clear in your PR if you are submitting an official package. We will then outline steps to authenticate you as a member of the affiliated organization. If you are an author of an original template not affiliated with any organization, only the standard package naming guidelines apply to you. - **Functionality:** Packages should conceivably be useful to other users and should expose their capabilities in a reasonable fashion. - **Documentation:** Packages must contain a `README.md` file documenting (at least briefly) what the package does and all definitions intended for usage by downstream users. Examples in the README should show how to use the package through an `@preview` import. If you have images in your README, you might want to check whether they also work in dark mode. Also consider running [`typos`][typos] through your package before release. - **Style:** No specific code style is mandated, but two spaces of indent and kebab-case for variable and function names are recommended. - **License:** Packages must be licensed under the terms of an [OSI-approved][OSI] license. In addition to specifying the license in the TOML manifest, a package must either contain a `LICENSE` file or link to one in its `README.md`. 
*Additional details for template packages:* If you expect the package license's provisions to apply to the contents of the template directory (used to scaffold a project) after being modified through normal use, especially if it still meets the _threshold of originality,_ you must ensure that users of your template can use and distribute the modified contents without restriction. In such cases, we recommend licensing at least the template directory under a license that requires neither attribution nor distribution of the license text. Such licenses include MIT-0 and Zero-Clause BSD. You can use an SPDX AND expression to selectively apply different licenses to parts of your package. In this case, the README or package files must make clear under which license they fall. If you explain the license distinction in the README file, you must not exclude it from the package. - **Size:** Packages should not contain large files or a large number of files. This will be judged on a case-by-case basis, but if it needs more than ten files, it should be well-motivated. To keep the package small and fast to download, please `exclude` images for the README or PDF files with documentation from the bundle. Alternatively, you can link to images hosted on a githubusercontent.com URL (just drag the image into an issue). - **Security:** Packages must not attempt to exploit the compiler or packaging implementation, in particular not to exfiltrate user data. - **Safety:** Names and package contents must be safe for work. This list may be extended over time as improvements/issues to the process are discovered. Given a good reason, we reserve the right to reject any package submission. When a package's PR has been merged and CI has completed, the package will be available for use. However, it can currently take a longer while until the package will be visible on [Typst Universe][universe]. We'll reduce this delay in the future. 
Once submitted, a package will not be changed or removed without good reason to prevent breakage for downstream consumers. By submitting a package, you agree that it is here to stay. If you discover a bug or issue, you can of course submit a new version of your package. There is one exception: Minor fixes to the documentation or TOML metadata of a package are allowed _if_ they can not affect the package in a way that might break downstream users. ### Downloads The Typst compiler downloads packages from the `preview` namespace on-demand. Once used, they are cached in `{cache-dir}/typst/packages/preview` where `{cache-dir}` is - `$XDG_CACHE_HOME` or `~/.cache` on Linux - `~/Library/Caches` on macOS - `%LOCALAPPDATA%` on Windows Importing a cached package does not result in network access. ## Local packages Want to install a package locally on your system without publishing it or experiment with it before publishing? You can store packages in `{data-dir}/typst/packages/{namespace}/{name}/{version}` to make them available locally on your system. Here, `{data-dir}` is - `$XDG_DATA_HOME` or `~/.local/share` on Linux - `~/Library/Application Support` on macOS - `%APPDATA%` on Windows Packages in the data directory have precedence over ones in the cache directory. While you can create arbitrary namespaces with folders, a good namespace for system-local packages is `local`: - Store a package in `~/.local/share/typst/packages/local/mypkg/1.0.0` - Import from it with `#import "@local/mypkg:1.0.0": *` Note that future iterations of Typst's package management may change/break this local setup. ## License The infrastructure around the package repository is licensed under the terms of the Apache-2.0 license. Packages in `packages/` are licensed under their respective license. 
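The local-package layout described above can be sketched as a short script. This is an illustrative sketch only: the package name `mypkg`, its `greet` function, and the file contents are hypothetical, and the paths shown are the Linux defaults (substitute the macOS or Windows data directory as listed above).

```shell
set -eu

# Folder layout must be {data-dir}/typst/packages/{namespace}/{name}/{version}.
PKG_DIR="${XDG_DATA_HOME:-$HOME/.local/share}/typst/packages/local/mypkg/1.0.0"
mkdir -p "$PKG_DIR"

# Minimal manifest: name and version must match the folder path.
cat > "$PKG_DIR/typst.toml" <<'EOF'
[package]
name = "mypkg"
version = "1.0.0"
entrypoint = "lib.typ"
EOF

# The entrypoint module exports whatever the package provides.
printf '%s\n' '#let greet(name) = [Hello, #name!]' > "$PKG_DIR/lib.typ"

# Any document on this system can now use:
#   #import "@local/mypkg:1.0.0": greet
```

Because the name and version are spelled out in both the folder path and the manifest, a mismatch between the two is the most common reason a local import fails to resolve.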
[list]: https://typst.app/universe/search/ [universe]: https://typst.app/universe/ [categories]: https://github.com/typst/packages/blob/main/CATEGORIES.md [disciplines]: https://github.com/typst/packages/blob/main/DISCIPLINES.md [SemVer]: https://semver.org/ [OSI]: https://opensource.org/licenses/ [typos]: https://github.com/crate-ci/typos [oxipng]: https://github.com/shssoichiro/oxipng
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/visualize/image_07.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page

//
// // Error: 2-22 unknown image format
// #image("./image.typ")
https://github.com/r8vnhill/apunte-bibliotecas-de-software
https://raw.githubusercontent.com/r8vnhill/apunte-bibliotecas-de-software/main/Unit2/enums.typ
typst
== Enumeraciones Vamos a desarrollar un sistema para una tienda en línea que necesita manejar distintos estados de pedidos: - Pendiente - Pagado - Enviado - Entregado - Cancelado === Primer Enfoque: Strings ``` fun handleOrderState(state: String) = when (state) { "Pending" -> println("Order is pending") "Paid" -> println("Order is paid") "Shipped" -> println("Order is shipped") "Delivered" -> println("Order is delivered") "Cancelled" -> println("Order is cancelled") else -> println("Unknown state") } ``` === Problemas con el uso de Strings Usar strings o enteros para representar estados puede llevar a varios problemas: - *Errores en tiempo de ejecución*: Debido a valores inválidos o mal escritos, los estados pueden no ser manejados correctamente. - *Manejo complicado*: El manejo de múltiples estados con estructuras de control puede volverse complicado y propenso a errores si los estados o las transiciones cambian. - *Falta de verificación en tiempo de compilación*: No hay una verificación en tiempo de compilación para las transiciones de estado, lo que puede resultar en errores difíciles de detectar. ``` fun main() { handleOrderState("Delibered") // Estado incorrecto } ``` === Segundo Enfoque: Enumeraciones Una enumeración (`enum`) es un tipo de dato especial que permite a los desarrolladores definir variables que pueden tomar uno de un conjunto fijo de constantes predefinidas. Las enumeraciones mejoran la legibilidad del código y reducen errores, al garantizar que las variables solo puedan contener uno de los valores definidos en el enum. Cada elemento de una enumeración puede actuar como una instancia de la enumeración. ==== Ventajas de las Enumeraciones - *Seguridad de Tipos*: Aseguran que solo se puedan asignar valores válidos a las variables del tipo enumerado, evitando errores en tiempo de ejecución. - *Legibilidad del Código*: Proporcionan nombres significativos para un conjunto de constantes, mejorando la claridad del código. 
- *Mantenibilidad*: Facilitan la actualización y mantenimiento del código, ya que los valores válidos están centralizados y son fácilmente auditables. ==== Definición de Enumeraciones Las enumeraciones se definen usando la palabra clave `enum class`. Aquí un ejemplo simple: ```kotlin enum class Bool { TRUE, FALSE } ``` Este `enum` define un tipo `Bool` que puede tener uno de dos valores: `TRUE` o `FALSE`. ==== `when` exhaustivo Un `when` es exhaustivo cuando cubre todas las posibilidades lógicas para la expresión que se está evaluando. Para enumeraciones, Kotlin fuerza que el `when` sea exhaustivo. ```kotlin enum class DeliveryState { PENDING, PAID, SHIPPED, DELIVERED, CANCELLED } ``` En el siguiente ejemplo, el `when` es exhaustivo porque cubre todos los posibles valores de la enumeración `DeliveryState`: ```kotlin fun handleOrderState(state: DeliveryState) = when (state) { DeliveryState.PENDING -> println("Order is pending") DeliveryState.PAID -> println("Order is paid") DeliveryState.SHIPPED -> println("Order is shipped") DeliveryState.DELIVERED -> println("Order is delivered") DeliveryState.CANCELLED -> println("Order is cancelled") else -> println("Unknown state") // Este else es redundante si el when es exhaustivo } ``` Cuando el `when` es exhaustivo, el bloque `else` puede ser omitido: ```kotlin fun handleOrderState(state: DeliveryState) = when (state) { DeliveryState.PENDING -> println("Order is pending") DeliveryState.PAID -> println("Order is paid") DeliveryState.SHIPPED -> println("Order is shipped") DeliveryState.DELIVERED -> println("Order is delivered") DeliveryState.CANCELLED -> println("Order is cancelled") } ``` Aquí tienes un ejemplo de uso en el método `main`: ```kotlin fun main() { handleOrderState(DeliveryState.PENDING) handleOrderState(DeliveryState.PAID) handleOrderState(DeliveryState.SHIPPED) handleOrderState(DeliveryState.DELIVERED) handleOrderState(DeliveryState.CANCELLED) } ``` En este código, la función `handleOrderState` maneja todos 
los posibles estados de un pedido utilizando un `when` exhaustivo, asegurando que no haya estados no manejados. Aquí tienes la versión mejorada del texto: ==== Métodos en enumeraciones Las enumeraciones pueden tener métodos abstractos que deben ser sobrescritos por cada uno de los casos de la enumeración. También pueden tener métodos concretos que serán heredados por cada uno de los elementos. ```kotlin enum class DeliveryState { PENDING { override fun signal() = "Order is pending" }, PAID { override fun signal() = "Order is paid" }, SHIPPED { override fun signal() = "Order is shipped" }, DELIVERED { override fun signal() = "Order is delivered" }, CANCELLED { override fun signal() = "Order is cancelled" }; // El ; es necesario si hay métodos o propiedades en la enumeración abstract fun signal(): String // Método concreto heredado por todos los elementos de la enumeración fun isFinalState() = this == DELIVERED || this == CANCELLED } ``` En este ejemplo, cada estado de la enumeración `DeliveryState` sobrescribe el método abstracto `signal`, proporcionando un mensaje específico para cada estado. Además, la enumeración tiene un método concreto `isFinalState` que determina si el estado es final (es decir, `DELIVERED` o `CANCELLED`). Este método concreto es heredado por todos los elementos de la enumeración. Aquí tienes la versión mejorada del texto: ==== Herencia en enumeraciones Las enumeraciones pueden implementar interfaces, pero no pueden ser heredadas. 
```kotlin // Definición de la interfaz State interface State { fun signal(): String } ``` ```kotlin // Implementación de la interfaz State por la enumeración DeliveryState enum class DeliveryState : State { PENDING { override fun signal() = "Order is pending" }, PAID { override fun signal() = "Order is paid" }, SHIPPED { override fun signal() = "Order is shipped" }, DELIVERED { override fun signal() = "Order is delivered" }, CANCELLED { override fun signal() = "Order is cancelled" }; // Método concreto heredado por todos los elementos de la enumeración fun isFinalState() = this == DELIVERED || this == CANCELLED } ``` En este ejemplo, la enumeración `DeliveryState` implementa la interfaz `State`, lo que obliga a cada uno de los estados de la enumeración a sobrescribir el método `signal`. Además, la enumeración incluye un método concreto `isFinalState` que determina si el estado es final (es decir, `DELIVERED` o `CANCELLED`). Este método concreto es heredado por todos los elementos de la enumeración. ``` fun handleOrderState(state: DeliveryState) = if (state.isFinalState()) { println("Final state: ${state.signal()}") } else { println("Non-final state: ${state.signal()}") } ``` Se puede acceder a todas las entradas con entries ``` fun listOrderStates() = DeliveryState.entries.forEach { println(it) } ``` Puedo “buscar” un enum con valueOf, si el valor no existe se arroja una excepción ``` fun getOrderState(name: String) = DeliveryState.valueOf(name) ``` #line(length: 100%) *Ejercicio: Interfaz y Enumeración de Acciones del Juego* Implementa una interfaz `GameAction`, una clase `Player` y una enumeración `GameEvent` que representen las acciones de un juego. *Instrucciones:* 1. *Interfaz `GameAction`*: - Define una interfaz `GameAction` que incluya un método `execute(player: Player)`. - El método `execute` debe afectar al estado de un jugador de acuerdo con el tipo de evento. 2. 
*Clase `Player`*:
   - Crea una clase `Player` que contenga dos propiedades: `healthPoints` y `manaPoints`.
   - La clase debe incluir métodos para aumentar y disminuir los puntos de salud (`healthPoints`) y los puntos de maná (`manaPoints`) en una cantidad dada.
   - No es necesario considerar casos de borde ni validaciones de datos.
3. *Enumeración `GameEvent`*:
   - Crea una enumeración `GameEvent` que implemente la interfaz `GameAction`.
   - Cada constante de la enumeración debe sobrescribir el método `execute` y definir cómo afecta al jugador.

#line(length: 100%)

==== Limitaciones de las Enumeraciones

- *Datos Asociados*: Los `enum` no pueden tener datos asociados específicos de instancia sin definirlos de manera estática para todos los estados. Esto significa que no puedes agregar información dinámica a cada instancia sin complicar el diseño.
- *Información Dinámica*: Tienen una capacidad limitada para manejar información dinámica, como un identificador de seguimiento para el estado `SHIPPED`. No es posible asignar datos específicos de instancia sin definirlos estáticamente para todos los estados.
- *Métodos Abstractos y Propiedades*: Aunque los `enum` pueden tener métodos abstractos que los estados individuales implementan, agregar nuevos métodos o propiedades que solo se aplican a algunos estados puede volverse complicado y menos intuitivo. Esto puede llevar a un diseño inconsistente y difícil de mantener.
- *Complejidad del Manejo del Estado*: Si el manejo del estado se vuelve más complejo y requiere más lógica y datos, mantener todo dentro de una definición de `enum` puede hacer que la clase sea demasiado pesada y difícil de mantener. En tales casos, es mejor considerar el uso de otras estructuras de datos más flexibles, como clases selladas.

Estas limitaciones pueden hacer que el uso de enumeraciones no sea la mejor opción para sistemas con estados muy dinámicos o complejos.
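Como boceto ilustrativo de la alternativa mencionada (los nombres `Shipped` y `trackingId` son suposiciones didácticas, no parte del material anterior), una clase sellada permite asociar datos propios a cada estado manteniendo un `when` exhaustivo:

```kotlin
// Boceto: con clases selladas, cada estado puede transportar sus propios datos
sealed class DeliveryState {
    object Pending : DeliveryState()
    object Paid : DeliveryState()
    data class Shipped(val trackingId: String) : DeliveryState() // dato dinámico por instancia
    object Delivered : DeliveryState()
    object Cancelled : DeliveryState()

    fun isFinalState() = this is Delivered || this is Cancelled
}

fun signal(state: DeliveryState): String = when (state) { // when exhaustivo: sin else
    is DeliveryState.Pending -> "Order is pending"
    is DeliveryState.Paid -> "Order is paid"
    is DeliveryState.Shipped -> "Order is shipped (tracking: ${state.trackingId})"
    is DeliveryState.Delivered -> "Order is delivered"
    is DeliveryState.Cancelled -> "Order is cancelled"
}
```

A diferencia del `enum`, aquí `Shipped` lleva un identificador de seguimiento sin obligar a los demás estados a declararlo.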
https://github.com/PhilChodrow/cv
https://raw.githubusercontent.com/PhilChodrow/cv/main/src/metadata.typ
typst
/* Personal information */ #let firstName = "<NAME>." #let lastName = "Chodrow" #let personalInfo = ( pronouns: "he/him", email: "<EMAIL>", homepage: "philchodrow.prof", // github: "PhilChodrow", location: "Middlebury, VT" //extraInfo: "", ) #let tagline = "Assistant Professor of Computer Science\nMiddlebury College" /* Layout settings */ #let cvLanguage = "en" #let accentColor = "burgundy" #let profilePhoto = "img/avatar.png" // Leave empty if profil photo is not needed #let varEntryOrganisationFirst = false // Choose whether to put company or position in bold #let varDisplayLogo = false // Choose whether to display organisation logo
https://github.com/7sDream/fonts-and-layout-zhCN
https://raw.githubusercontent.com/7sDream/fonts-and-layout-zhCN/master/chapters/05-features/testing.typ
typst
Other
#import "/template/template.typ": web-page-template #import "/template/components.typ": note #import "/lib/glossary.typ": tr #show: web-page-template // ## Building a testing environment == 构造测试环境 // XXX // 增加内容,当前原文缺 // ### Using hb-shape for feature testing === 使用 hb-shape 进行特性测试 // Now is a good time to introduce the `hb-shape` tool; it's a very handy utility for debugging and testing the application of OpenType features - how they affect the glyph stream, their effect on positioning, how they apply in different language and script combinations, and how they interact with each other. Learning to use `hb-shape`, which comes as part of the [HarfBuzz](http://harfbuzz.org) OpenType shaping engine, will help you with a host of OpenType-related problems. 现在是介绍 `hb-shape` 工具的好时机。这是一个非常实用的OpenType特性调试和测试工具。通过它你能看到特性如何改变#tr[glyph]流,影响#tr[glyph]的位置,在不同的语言#tr[script]环境下的不同效果,以及如何互相影响等。`hb-shape` 工具是OpenType#tr[shaping]引擎HarfBuzz @Unknown.HarfBuzz 中的一部分,它在你遇到有关OpenType的各种问题时都能帮助你。 #note[ // > If you're on Mac OS X, you can install the Harfbuzz utilities using homebrew, by calling `brew install harfbuzz` on the terminal. 如果你使用 Mac OS X,你可以通过 Homebrew 安装 HarfBuzz,只需在终端中执行:#linebreak()`brew install harfbuzz`。 ] // As we've mentioned, HarfBuzz is a shaping engine, typically used by layout applications. Shaping, as we know, is the process of taking a text, a font, and some parameters and producing a set of glyphs and their positions. `hb-shape` is a diagnostic tool which runs the shaping process for us and formats the output of the process in a number of different ways. We can use it to check the kern that we added in the previous section: HarfBuzz是#tr[shaping]引擎,常被需要进行文本#tr[layout]的应用使用。而#tr[shaping]的过程是输入文本、字体和一些其他参数,得到一系列#tr[glyph]和它们的位置的过程。`hb-shape`是一个会执行这一过程,然后将结果用不同的格式输出的诊断工具。我们可以用它来检查之前为测试字体添加的#tr[kern]: ```bash $ hb-shape TTXTest-Regular.otf 'AA' [A=0+580|A=1+580] ``` // This tells us that we have two "A" glyphs together. 
The first one is the first character in the input stream ("=0" - computer strings count from zero), and that it has a horizontal advance of 580 units ("+580"). The second one is the second character in the input stream ("=1") and also has an advance of 580 units.
这告诉我们,输出是两个`A`#tr[glyph]。其中第一个`A`来自输入文本中的第一个(输出中的`=0`表示第一个,因为计算机就是从0开始数数的)#tr[character],它拥有580单位的#tr[horizontal advance]。第二个`A`是输入文本中的第二个(`=1`)#tr[character],也是580单位的#tr[horizontal advance]。

// But...
但是……

```bash
$ hb-shape TTXTest-Regular.otf 'AB'
[A=0+530|B=1+618]
```

// when we have an "A" and a "B", the advance width of the "A" is only 530 units. In other words, the "B" is positioned 50 units left of where it would normally be placed; the "A" has, effectively, got 50 units narrower. In other other words, our kern worked.
当我们输入`AB`时,`A`的#tr[horizontal advance]变成了530。换句话说,也就是`B`会相对其常规位置向左移动50单位。也可以看作`A`变窄了50单位。我们的#tr[kern]成功生效了。

// We didn't need to tell HarfBuzz to do any kerning - the `kern` feature is on by default. We can explicitly turn it off by passing the `--features` option to `hb-shape`. `-<feature name>` turns off a feature and `+<feature name>` turns it on:
我们不用明确告诉HarfBuzz启用#tr[kern]功能,因为`kern`特性默认就是开启的。我们可以通过 `--features` 选项让 `hb-shape` 开启或关闭某种特性。使用 #text(ligatures: false)[`-<特性名>`] 可以关闭一个特性,`+<特性名>` 则可以开启。

```bash
$ hb-shape --features="-kern" TTXTest-Regular.otf 'AB'
[A=0+580|B=1+618]
```

// As you see in this case, the advance width of the "A" is back to 580 units, because the `ab` kern pair is not being applied in this case.
当关闭`kern`特性后,因为AB间的#tr[kern]不再生效,`A`#tr[glyph]的#tr[advance]就又回到了580单位。

#note[
// > We will see more of `hb-shape` in the next chapter, including examples of how it shows us positioning information.
下一章我们会更多地使用`hb-shape`,也会有一些用它来显示其他#tr[positioning]信息的例子。
]
https://github.com/dssgabriel/master-thesis
https://raw.githubusercontent.com/dssgabriel/master-thesis/main/src/main.typ
typst
Apache License 2.0
// Entry file #set page(paper: "a4") #set par(leading: 0.55em, first-line-indent: 1.8em, justify: true) #set list(indent: 0.8em) #set enum(indent: 0.8em) #set figure(numbering: "1-1") #include "titlepage.typ" #pagebreak() #set page(numbering: "1 / 1") #pagebreak() #outline(indent: true, fill: repeat[` ·`], title: "Table of contents") #pagebreak() // Blank page #pagebreak() #set heading(numbering: none, outlined: false) #include "acknowledgments.typ" #pagebreak() #include "cea.typ" #pagebreak() #set heading(numbering: "1.1", outlined: true) #include "chapters/1-introduction.typ" #pagebreak() #include "chapters/2-context.typ" #pagebreak() #include "chapters/3-contributions.typ" #pagebreak() #include "chapters/4-conclusion.typ" #pagebreak() #include "chapters/5-bibliography.typ" #pagebreak() #include "chapters/6-appendix.typ"
https://github.com/Isaac-Fate/booxtyp
https://raw.githubusercontent.com/Isaac-Fate/booxtyp/master/src/outline.typ
typst
Apache License 2.0
#import "colors.typ": color-schema #let outline-rules(body) = { set outline(title: [Contents], depth: 3, indent: auto) show outline.entry.where(level: 1): it => { // Add some space above v(12pt, weak: true) // Entry style // If the entry refers to a numbered chapter // then show "Chapter" at the beginning of the entry if it.element.numbering == "1.1" { text(weight: "bold", size: 12pt, fill: color-schema.blue.dark)[ Chapter #smallcaps(it) ] } else { // If the entry refers to an unnumbered chapter // then only show the title text(weight: "bold", size: 12pt, fill: color-schema.blue.dark)[ #smallcaps(it) ] } } show outline: it => { // Outline content it // Reset the page counter // at the end of the outline counter(page).update(0) } // The rest of the document body }
https://github.com/rlpundit/typst
https://raw.githubusercontent.com/rlpundit/typst/main/Typst/fr-Rapport/chaps/outro.typ
typst
MIT License
/* ------------------------------- NE PAS MODIFIER ------------------------------ */ #import "../common/metadata.typ": title #set page(header: smallcaps(title) + h(1fr) + emph("Conclusion générale") + line(length: 100%)) #text(white)[= Conclusion générale]#v(-1cm) /* ------------------------------------------------------------------------------ */ *Discussion* #lorem(64) *Perspectives* #lorem(32)
https://github.com/Myriad-Dreamin/tinymist
https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/docs/param_in_init.typ
typst
Apache License 2.0
// Docs for f. #let f(a) = { show it: it => /* ident after */ it; };
https://github.com/pedrofp4444/BD
https://raw.githubusercontent.com/pedrofp4444/BD/main/report/content/[3] Modelação Concetual/abordagem.typ
typst
#let abordagem = {
  [
    == Apresentação da Abordagem de Modelação Realizada

    Com a conclusão da definição dos requisitos, é então possível passar à organização da informação neles contida. Para isso, a equipa “Quatro em linha” submeteu-se ao desenvolvimento de um modelo concetual capaz de estruturar e representar coerentemente os dados essenciais para atender os requisitos definidos e, para isso, recorreu à criação de um diagrama ER.

    A escolha da ferramenta usada na construção do modelo concetual revela-se uma tarefa significativa no desenvolvimento da base de dados, sendo que a escolha certa consegue agilizar todo o processo, uma vez que a equipa desenvolvedora do sistema estaria totalmente integrada no mesmo. Para tal, decidiu-se utilizar uma ferramenta que se baseasse na notação Chen, visto ser a notação preferencial da equipa, sendo escolhida, para esse efeito, a ferramenta “brModelo”, fruto da sua clareza e facilidade na manipulação das entidades, atributos e relacionamentos.

    Para se proceder à modelação concetual é necessário, numa fase inicial, identificar todas as entidades únicas e relevantes destacadas pelos requisitos levantados, os relacionamentos entre as mesmas e os atributos tanto das entidades, como, possivelmente, dos relacionamentos. Identificadas estas componentes, será então possível proceder às identificações e caracterizações detalhadas de cada uma destas.
  ]
}
https://github.com/8LWXpg/jupyter2typst
https://raw.githubusercontent.com/8LWXpg/jupyter2typst/master/test/test3.typ
typst
MIT License
#import "template.typ": * #show: template #block[ #code-block("using Plots gr() default(fmt = :png) using DataFrames" , lang: "julia", count: 2) ] #block[ ] #block[ == Using Plots.jl Plots.jl outputs plots in different formats. It is written in #link("https://julialang.org")[Julia]: #image("img/e4f510a108a52350c25b6485f4c9058cdae2ccba.png") #image("img/c75b0d358982c06f338e70a3759a053a212d8278.png") ] #block[ #code-block("f(x) = sin(x) g(x) = cos(x) h(x) = tan(x)" , lang: "julia", count: 21) ] #block[ #result-block("h (generic function with 1 method)") ] #block[ #code-block("xs = LinRange(0, 2pi, 100)" , lang: "julia", count: 22) ] #block[ #result-block("100-element LinRange{Float64, Int64}: 0.0, 0.0634665, 0.126933, 0.1904, …, 6.09279, 6.15625, 6.21972, 6.28319") ] #block[ These are the trigonometric functions, $sin(x)$ $cos(x)$ $tan(x)$ According to Wikipedia, their graphs look like this: #image("img/e4f510a108a52350c25b6485f4c9058cdae2ccba.png") ] #block[ #code-block("plot(xs, [f, g, h]; ylim = (-2, 2), framestyle = :box, grid = false, palette = :tab10)" , lang: "julia", count: 23) ] #block[ #image("./img/ea844af1262c9c7267aaed6e3f6bb2a54b115ac9.png") ] #block[ Let\'s produce an error: ] #block[ #code-block("i(x)" , lang: "julia", count: 24) ] #block[ #result-block("UndefVarError: `i` not defined Stacktrace: [1] top-level scope @ In[24]:1") ] #block[ == Rich Outputs We can try some table outputs, for example: ] #block[ #code-block("df = DataFrame((col1 = [\"First\", \"Second\", \"Third\"], col2 = [1, 2, 3]))" , lang: "julia", count: 3) ] #block[ #result-block("3×2 DataFrame  Row │ col1  col2  │ String  Int64  ─────┼─────────────── 1 │ First 1 2 │ Second 2 3 │ Third 3") ]
https://github.com/FrightenedFoxCN/typst-math-chinese
https://raw.githubusercontent.com/FrightenedFoxCN/typst-math-chinese/main/test/basic-en.typ
typst
#import "../template-en.typ": *

#show: doc => conf(doc)

#outline()

= Introduction

This is a test for English documents. The outline can be directly generated by `outline` as usual.

_This block is emphasized._

$ a^2 + b^2 = c^2 $

*This block is bold*

== The following paragraph will be numbered directly

#lorem(50)

#lorem(100)

== Let's continue...

$ (a^2 + b^2)(c^2 + d^2) >= ((a c)^2 + (b d)^2)^2 $

== And for math blocks...

#def(supplement: "Objects")[The content of definition.]

Normal remarks can be made here.

#def(supplement: "Category")[
  #lorem(50)
]

#def[To use lists in the block, we require:
  - For markers in the bullet list
  - They are in the same color as the emphcolor of the block.
  - Say, blue here.
  + For numbered list we'd like to have this as well.
  + But I'm currently unaware of how to do it.
]

#thm(
  [The main theorem.],
  proof: [Its proof here.],
  supplement: "supplements"
) <thm_ref>

#thm(
  [A theorem without proof.],
  supplement: [supplements can be content as well]
)

#coro(
  [Counters are shared among theorems and corollaries.]
)

#lemma(
  [Lemmas as well.]
)

#ex(
  [This is an exercise. Too trivial to give the solution!],
  supplement: "remarks"
)

#ex(
  [Another exercise.],
  solution: [Solutions here!],
)

#rm[
  Another remark. #lorem(50)
]

#conj[
  Conjectures are also possible.
]

#eg(supplement: "An example")[
  We can place an example here.
]

== Now for code blocks

```rs
fn helloworld() {
    println!("helloworld!");
}
```

== Extra Functions

Use `#endofchapter()` to begin a new chapter.

#endofchapter()

= With labels zeroed out

#thm(
  [Labels will be zeroed out for the new chapter.],
  proof: [See code.]
)

#endofchapter()

#show: doc => set-appendix(doc)

= An appendix here.

== Alphabetical label for the secondary headings here.

#def[
  Other items in the appendices
]

#thm[
  will have their labels zeroed out as well.
]
https://github.com/RaphGL/ElectronicsFromBasics
https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap7/1_whats_a_series_parallel_circuit.typ
typst
Other
#import "../../core/core.typ"

=== What is a series-parallel circuit?

With simple series circuits, all components are connected end-to-end to form only one path for electrons to flow through the circuit:

#image("static/00082.png")

With simple parallel circuits, all components are connected between the same two sets of electrically common points, creating multiple paths for electrons to flow from one end of the battery to the other:

#image("static/00083.png")

With each of these two basic circuit configurations, we have specific sets of rules describing voltage, current, and resistance relationships.

#columns(2)[
  *Series Circuits:*
  - Voltage drops add to equal total voltage.
  - All components share the same (equal) current.
  - Resistances add to equal total resistance.

  #colbreak()

  *Parallel Circuits:*
  - All components share the same (equal) voltage.
  - Branch currents add to equal total current.
  - Resistances diminish to equal total resistance.
]

However, if circuit components are series-connected in some parts and parallel in others, we won't be able to apply a #emph[single] set of rules to every part of that circuit. Instead, we will have to identify which parts of that circuit are series and which parts are parallel, then selectively apply series and parallel rules as necessary to determine what is happening. Take the following circuit, for instance:

#image("static/00123.png")

#image("static/10126.png")

This circuit is neither simple series nor simple parallel. Rather, it contains elements of both. The current exits the bottom of the battery, splits up to travel through R#sub[3] and R#sub[4], rejoins, then splits up again to travel through R#sub[1] and R#sub[2], then rejoins again to return to the top of the battery. There exists more than one path for current to travel (not series), yet there are more than two sets of electrically common points in the circuit (not parallel). 
Because the circuit is a combination of both series and parallel, we cannot apply the rules for voltage, current, and resistance "across the table" to begin analysis like we could when the circuits were one way or the other. For instance, if the above circuit were simple series, we could just add up R#sub[1] through R#sub[4] to arrive at a total resistance, solve for total current, and then solve for all voltage drops. Likewise, if the above circuit were simple parallel, we could just solve for branch currents, add up branch currents to figure the total current, and then calculate total resistance from total voltage and total current. However, this circuit's solution will be more complex.

The table will still help us manage the different values for series-parallel combination circuits, but we'll have to be careful how and where we apply the different rules for series and parallel. Ohm's Law, of course, still works just the same for determining values within a vertical column in the table.

If we are able to identify which parts of the circuit are series and which parts are parallel, we can analyze it in stages, approaching each part one at a time, using the appropriate rules to determine the relationships of voltage, current, and resistance. The rest of this chapter will be devoted to showing you techniques for doing this.

#core.review[
  The rules of series and parallel circuits must be applied selectively to circuits containing both types of interconnections.
]
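The staged approach described above can be sketched numerically. The short Python snippet below uses made-up resistor and battery values (they are illustrative assumptions, not the values from the figures) to show the series and parallel rules being applied one stage at a time:

```python
def series(*resistances):
    # Series resistances simply add
    return sum(resistances)

def parallel(*resistances):
    # Parallel resistances diminish: reciprocal of the sum of reciprocals
    return 1 / sum(1 / r for r in resistances)

# Hypothetical values: R1 || R2 in series with R3 || R4, across a 24 V battery
R1, R2, R3, R4 = 100.0, 250.0, 350.0, 200.0
R_total = series(parallel(R1, R2), parallel(R3, R4))
I_total = 24.0 / R_total  # Ohm's Law applied to the whole circuit
```

Each `parallel(...)` call collapses one parallel section into a single equivalent resistance, after which the series rule finishes the reduction, mirroring the stage-by-stage analysis the rest of the chapter develops.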
https://github.com/darioglasl/Arbeiten-Vorlage-Typst
https://raw.githubusercontent.com/darioglasl/Arbeiten-Vorlage-Typst/main/Anhang/06_Jira/03_dev_stories.typ
typst
=== Open Dev-Stories

Hier sind die noch offenen Developer-Stories aufgeführt. Dies sind Stories, die nicht direkt den Nutzer betreffen, aber für die Entwicklung der Anwendung notwendig sind.

#figure(
  table(
    columns: (auto, auto, auto, auto),
    align: left,
    [*Issue Key*], [*Summary*], [*Prio*], [*Epic*],
    [DI-03], [Beispiel Dev-Story], [High], [Neues Epic],
  ),
  caption: "Übersicht der offenen Developer-Stories",
)
https://github.com/Kasci/LiturgicalBooks
https://raw.githubusercontent.com/Kasci/LiturgicalBooks/master/utils.typ
typst
#let project(title: "", authors: (), body) = { set page(paper:"a4", numbering: "1 / 1", number-align: center, ) set text(font: "Monomakh Unicode", lang: "cu") // HEADINGS show heading.where(level: 1): it => [ #align(center, text(0pt, rgb("dd1111"), upper(it))) ] show heading.where(level: 2): it => [ #align(center, text(20pt, rgb("dd5555"), it)) ] show heading.where(level: 3): it => [ #align(center, text(rgb("dd1111"), it)) ] // Title row. align(center)[ #block(text(weight: 700, 1.75em, title)) ] // Author information. pad( top: 0.25em, bottom: 0.25em, x: 1em, grid( columns: (1fr, 1fr), gutter: 1em, ), ) // Main body. set par(justify: true) body } #let centerNote(it) = { align(center, text(10pt, rgb("dd5555"), "("+it+")")) } // ----------------- // TWO/THREE COLS // ----------------- #let twoCols = false #let section2(txt) = { pad(left: -1em, right: 0.5em, (text(13pt, rgb("dd5555"))[#txt])) } #let section3(txt) = { pad(left: -1em, right: -13em, (text(13pt, rgb("dd5555"))[#txt])) } #let formatr3(r) = { if r.len() == 1 {("", section3(r.at(0)), "")} else {("", r.at(0), r.at(1))} } #let formatr2(r) = { if r.len() == 1 {("", section2(r.at(0)))} else {("", r.at(0))} } #let formatr(r) = { if twoCols { formatr2(r) } else { formatr3(r) } } #let make(values) = { table( columns: if twoCols {(1em, 1fr)} else {(1em, 1fr, 1fr)}, stroke: none, row-gutter: 0.3em, column-gutter: 0.5em, ..values.map(formatr).flatten() ) } #let lettrine(txt) = { let first = true for it in txt { if first { upper(text(rgb("dd5555"), it)) first = false } else {it} } }
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-FE10.typ
typst
Apache License 2.0
#let data = ( ("PRESENTATION FORM FOR VERTICAL COMMA", "Po", 0), ("PRESENTATION FORM FOR VERTICAL IDEOGRAPHIC COMMA", "Po", 0), ("PRESENTATION FORM FOR VERTICAL IDEOGRAPHIC FULL STOP", "Po", 0), ("PRESENTATION FORM FOR VERTICAL COLON", "Po", 0), ("PRESENTATION FORM FOR VERTICAL SEMICOLON", "Po", 0), ("PRESENTATION FORM FOR VERTICAL EXCLAMATION MARK", "Po", 0), ("PRESENTATION FORM FOR VERTICAL QUESTION MARK", "Po", 0), ("PRESENTATION FORM FOR VERTICAL LEFT WHITE LENTICULAR BRACKET", "Ps", 0), ("PRESENTATION FORM FOR VERTICAL RIGHT WHITE LENTICULAR BRAKCET", "Pe", 0), ("PRESENTATION FORM FOR VERTICAL HORIZONTAL ELLIPSIS", "Po", 0), )
https://github.com/SeniorMars/tree-sitter-typst
https://raw.githubusercontent.com/SeniorMars/tree-sitter-typst/main/examples/layout/container.typ
typst
MIT License
// Test the `box` and `block` containers. --- // Test box in paragraph. A #box[B \ C] D. // Test box with height. Spaced \ #box(height: 0.5cm) \ Apart --- // Test fr box. Hello #box(width: 1fr, rect(height: 0.7em, width: 100%)) World --- // Test block over multiple pages. #set page(height: 60pt) First! #block[ But, soft! what light through yonder window breaks? It is the east, and Juliet is the sun. ]
https://github.com/7sDream/fonts-and-layout-zhCN
https://raw.githubusercontent.com/7sDream/fonts-and-layout-zhCN/master/chapters/07-localisation/arabic-urdu-sindhi.typ
typst
Other
#import "/template/template.typ": web-page-template #import "/template/components.typ": note #import "/template/lang.typ": arabic, arabic-amiri #import "/lib/glossary.typ": tr #show: web-page-template // ## Arabic, Urdu and Sindhi == 阿拉伯语、乌尔都语、信德语 // In the various languages which make use of the Arabic script, there are sometimes locally expected variations of the glyph set - for instance, we mentioned the variant numbers four, five and seven in Urdu. The expected form of the letter heh differs in Urdu, Sindhi, Parkari and Kurdish. In Persian, a language-specific form of kaf (ک, U+06A9 ARABIC LETTER KEHEH) is preferred over the usual form of kaf (ك, U+0643 ARABIC LETTER KAF). All of these substitutions can be made using the language-specific `locl` feature trick we saw above. 在使用阿拉伯字母的各种语言中,有时会根据当地习惯采用不同的字形集合。我们之前提到过,在乌尔都语中数字`4`、`5`、`7`会产生变化。而乌尔都语、信德语、帕卡利语/*#footnote[译注:原文为Pakari,疑似为笔误。这里拙校为Parkari,为印度-雅利安语支下的一门语言,主要在巴基斯坦的信德省的塔帕卡和纳加派克县使用,详见其#link("https://en.wikipedia.org/wiki/Parkari_Koli_language", in-footnote: true)[维基百科]。此处参考一份#link("https://www.gcedclearinghouse.org/sites/default/files/resources/190162chi.pdf", in-footnote: true)[SIL文档的中译本]译作“帕卡利语”。]*/、库尔德语对于字母`heh`也有不同的写法。字母 `kaf U+0643 ARABIC LETTER KAF`(#arabic[ك])在波斯语中也有一个专属样式,会写成 `U+06A9 ARABIC LETTER KEHEH`(#arabic[ک])。所有这种类型的#tr[substitution]需求都可以使用我们上面介绍的`locl`特性中的技巧来实现。 #note[ // > "How do I know all this stuff?" Well, part of good type design is doing your research: looking into exemplars and documents showing local expectations, testing your designs with native readers, and so on. But there's also a growing number of people collating and documenting this kind of language-specific information. As we've mentioned, the Unicode Standard, and the original script encoding proposals for Unicode, give some background information on how the script works. I've also added a list of resources to the end of this chapter which collects some of the best sources for type design information. 
“我从哪能学到这些知识呢?”想做出好的设计,就得花一部分时间在研究上。比如广泛阅读当地文本,通过各种需求文档了解他们的期望,让母语读者参与测试你的设计等。现在也有越来越多的人在收集整理这种关于特定语言的文档和信息。比如Unicode标准,以及希望Unicode编码某种#tr[script]的原始提案,都会包含一些关于此#tr[scripts]的工作方式的背景信息。在本章末尾我也提供了一个在设计字体时的优秀参考资料列表。
]

#note[
// > One of those resources is <NAME>'s notes on variant Arabic characters in different scripts. He mentions there that some Persian documents may encode kaf with using U+0643, so fonts supporting Persian *may* wish to substitute kaf with the "keheh" form; other documents, however, might use U+06A9 to represent Persian kaf but retain the use of U+0643 to deliberately refer to the standard Arabic kaf - in which case you may *not* want to make that glyph substitution. Think about your intended audience when substituting encoded characters.
参考资料中有一项是Jonathan Kew写的关于阿拉伯#tr[character]在不同#tr[scripts]中产生的变化的笔记。他提到,有些波斯语的文档会使用`U+0643`来#tr[encoding]`kaf`字母。所以支持波斯语的字体*可能*会想把`kaf`#tr[substitution]为`keheh`的样子。但也有一些文档使用`U+06A9`来表示波斯语版的`kaf`,从而刻意的使`U+0643`只用于代表标准的阿拉伯字母`kaf`。这样的话就又不希望进行#tr[glyph]#tr[substitution]了。所以在实际#tr[substitution]#tr[character]前,要想想你的目标受众到底是哪些群体。
]

// Arabic fonts additionally vary depending on their adherence to calligraphic tradition. When connecting letters together into word forms, calligraphers will use a wide variety of ligatures, substitutions, and adjustments to positioning to create a natural and pleasing effect, and Arabic fonts will reflect this "fluidity" to a greater or lesser degree. As you consider the appropriate ways of writing different pairs of letters together, the more ligature forms you envision for your font, the more complex you can expect the feature processing to be.
阿拉伯字体还会因为它们对书法传统的坚持程度不同而产生差异。当字母组合成单词时,偏爱书法形式的设计师将会大量使用#tr[ligature]、#tr[substitution]、#tr[position]调整等手段来制作自然流畅的书法效果。阿拉伯字体或多或少都会有一些这种“流动性”。你越希望不同的字母对能适当的组合在一起,往字体里塞的#tr[ligature]越多,你的字体特性代码就会越复杂和难以处理。

// One important trick in Arabic feature programming is to make heavy use of chaining contextual substitutions instead of ligatures. 
Let's consider the word كِلَا (kilā, "both"). A simple rendering of this word, without any calligraphic substitutions, might look like this: (Glyphs from Khaled Hosny's *Amiri*.)
为阿拉伯字体进行特性设计时,一个重要的技巧是少用#tr[ligature],多用#tr[chaining]#tr[contextual]#tr[substitution]。考虑这个单词 #arabic[كِلَا](kilā,表示“都”的意思)。不使用任何书法风格时,它看上去应该是这样的(这里使用Khaled Hosny设计的Amiri字体中的#tr[glyph]):

#let raw-amiri = body => arabic-amiri(text(features: ("calt": 0), body))

#figure(
  placement: none,
)[#block(inset: (y: 2em))[
  #text(size: 5em)[#raw-amiri[كِلَا]]
]]

// Running
使用命令:

#[
#show regex(`\p{scx=Arabic}+`.text): raw-amiri

```bash
$ hb-shape --features='-calt' Amiri-Regular.ttf كِلَا
[uni0627.fina=4+229|uni064E=2@-208,0+0|uni0644.medi=2+197|uni0650=0@8,0+0|uni0643.init=0+659]
```
]

// confirms that no contextual shaping beyond the conversion into initial, medial and final forms is going on:
可以确定除了让#tr[character]采用正确的词首词中词尾形式外,#tr[shaper]没有进行其他操作。

// Obviously this is unacceptable. There are two ways we can improve this rendering. The first is the obvious substitution of the final lam-alif with the lam-alif ligature, like so:
// But the second, looking at the start of the word, is to form a kaf-lam ligature:
很明显这个显示效果不太好,我们可以用两种方式提升它。可以选择将词尾的`lam-alif`替换为#tr[ligature]形式(左图)。但是,如果我们关注词首,会发现它也可以形成一个`kaf-lam`#tr[ligature](右图)。

#figure(
  placement: none,
)[#grid(
  columns: 2,
  column-gutter: 10%,
  [#image("kila-2.png", height: 7em)],
  [#image("kila-3.png", height: 7em)]
)]

// Ah, but... what if we want to do both? If we use ligature substitutions like so:
啊哈!但是,我们不能全都要吗?如果使用#tr[ligature]#tr[substitution]的话:

```fea
feature calt {
    lookupflag IgnoreMarks;
    sub kaf-ar.init lam-ar.medi by kaf-lam.init; # 1
    sub lam-ar.medi alef-ar.fina by lam-alef.fina; # 2
} calt;
```

// what is going to happen? The shaper will work through the string, seeing the glyphs ` kaf-ar.init lam-ar.medi alef-ar.fina`. It sees the first pair of glyphs, and applies Rule 1 above, meaning that the new string is `kaf-lam.init alef-ar.fina`. 
It tries to match any rule against this new string, but nothing matches. 这样结果会是什么呢?#tr[shaper]将会遍历输入的字符串,处理`kaf-ar.init lam-ar.medi alef-ar.fina`这个#tr[glyph]序列。他会先看到第一对#tr[glyph],然后应用上面的第一条规则。现在序列就变成了`kaf-lam.init alef-ar.fina`。它尝试在新序列上应用规则,但没有规则能匹配上。 // Let's now rewrite this feature using chained contextual substitutions and glyph classes. Instead of creating a lam-alef ligature and a kaf-lam ligature, we split each ligature into two "marked" glyphs. Let's first do this for the lam-alef ligature. We design two glyphs, `alef-ar.fina.aleflam` and `lam-ar.medi.aleflam`, which look like this: 现在我们用#tr[chaining]#tr[contextual]#tr[substitution]和#tr[glyph]类来重写这个特性。我们不直接生成`lam-alef`和`kaf-lam`#tr[ligature],而是使用“标记”#tr[glyph]来分别代表它们。首先来处理`lam-alef`。我们设计两个#tr[glyph],`alef-ar.fina.aleflam` 和 `lam-ar.medi.aleflam`,见@figure:alef-lam。 #figure( caption: [] )[#image("alef-lam.png", width: 60%)] <figure:alef-lam> // and then we substitute each glyph by its related "half-ligature": 然后我们将每个#tr[glyph]都#tr[substitution]为对应的“半#tr[ligature]”形式: ```fea lookup LamAlef { sub lam-ar.medi by lam-ar.medi.aleflam; sub alef-ar.fina by alef-ar.fina.aleflam; } LamAlef; feature calt { lookupflag IgnoreMarks; sub lam-ar.medi' lookup LamAlef alef-ar.fina' lookup LamAlef; } ``` // Finally, we create our variant kaf, which we call `kaf-ar.init.lamkaf`, and now we can apply the kaf-lam substitution: 最后,设计一个`kaf`的变体`kaf-ar.init.lamkaf`,并编写`kaf-lam`的#tr[ligature]#tr[substitution]: ```fea feature calt { lookupflag IgnoreMarks; sub kaf-ar.init' lam.medi by kaf-ar.init.lamkaf; # 1 sub lam-ar.medi' lookup LamAlef alef-ar.fina' lookup LamAlef; # 2 } ``` // Now when the shaper sees kaf lam alef, what happens? Kaf and lam match rule 1, which substitutes the kaf for its special initial form. Next, lam alef matches rule 2, which chains into the "LamAlef" lookup; this converts the first glyph to `lam-ar.medi.aleflam` and the second to `alef-ar.fina.aleflam`. 
现在,如果#tr[shaper]看到`kaf lam alef`会是什么情况。首先`kam`和`lam`匹配规则1,这将`kaf`转变为我们设计的特殊形式`kaf-ar.init.lamkaf`。然后`lam`和`alef`匹配规则2,通过#tr[chaining]应用`LamAlef`,它将`lam`变成`lam-ar.medi.aleflam`,`alef`变成`alef-ar.fina.aleflam`。 // It's a little more convoluted, but this way we have a substitution arrangement that works not just by ligating a pair at a time, but which allows us to *continue* transforming glyphs across the string: alef-lam works, as does lam-kaf, but they also both work together. 虽然这个过程比较复杂,但通过这种组织方式,我们对#tr[substitution]可以有更加精细的控制权。现在我们不仅能将一对#tr[glyph]组合成它们的#tr[ligature]形式,而且可以在整个输入字符串中*持续*地进行#tr[glyph]转换:`alef-lam`#tr[ligature]正常形成,`lam-kaf`#tr[ligature]也可以正常形成,而且它们俩甚至可以同时出现。 #figure( placement: none, )[#block(inset: (top: 1em, bottom: 2.5em))[ #text(size: 5em)[#arabic-amiri[كِلَا]] ]]
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/package/local-link.typ
typst
Apache License 2.0
#import "@preview/typst-ts-templates:0.1.0": *;
https://github.com/ayoubelmhamdi/typst-phd-AI-Medical
https://raw.githubusercontent.com/ayoubelmhamdi/typst-phd-AI-Medical/master/chapters/ch20.typ
typst
MIT License
#import "../functions.typ": heading_center, images, italic, linkb, dots
#import "../tablex.typ": tablex, cellx, rowspanx, colspanx, hlinex

#let finchapiter = text(fill: rgb("#1E045B"), "■")

// #linebreak()
// #linebreak()
// #counter("tabl").update(n=>n+20)

= DÉTECTION DES NODULES PULMONAIRES DU CANCER.

== Introduction.

Le cancer du poumon figure parmi les principales causes de mortalité liées au cancer dans le monde entier #cite("national2011reduced"). La reconnaissance et le diagnostic précoces des nodules pulmonaires, petites masses de tissu dans les poumons, peuvent considérablement augmenter les taux de survie et le succès du traitement pour les individus atteints de cancer du poumon. Cependant, la détection et la classification de ces nodules pulmonaires représentent un défi de taille en raison de leur taille, forme, emplacement et caractéristiques physiques variables #cite("SetioTBBBC0DFGG16"). De plus, la majorité des nodules pulmonaires sont bénins ou non cancéreux, avec seulement un faible pourcentage classé comme malin ou cancéreux #cite("dou2017automated"). Ces conditions créent des complications pour la détection et la classification automatisées des nodules pulmonaires par le biais de modèles d'apprentissage automatique.

Dans cette étude, nous mettons en œuvre une expérience d'apprentissage automatique utilisant un modèle CNN pour déterminer si les nodules pulmonaires sont bénins ou malins à partir d'images de scans. Nous avons utilisé l'ensemble de données LUNA16 accessible au public #cite("SetioTBBBC0DFGG16") comprenant 888 scans CT de nodules annotés.

#figure(
  tablex(
    columns: 4,
    align: center + horizon,
    auto-vlines: false,
    repeat-header: false,
    [*CT Scans*], [*Nodules*], [*Malins*], [*Bénins*],
    [$888$], [$6692$], [$2526$], [$4166$],
  ),
  caption: [
    Le nombre total de nodules malins et bénins. 
], kind: "tabl", supplement: [#text(weight: "bold","Table")], ) Un total de 6692 nodules ont été isolés à partir de ces scans, dont seulement 2526 étaient malins et 4166 étaient bénins. Nous avons effectué l'entraînement du modèle CNN en utilisant l'ensemble d'entraînement et avons évalué ses performances sur l'ensemble de validation. == Méthode. Notre étude comprenait trois étapes principales : le prétraitement des données, le développement de l'algorithme de détection des nodules et l'évaluation des performances #cite("dou2017automated","ding2017accurate","armato2011lidc"). === Ressources. Les ressources de notre étude étaient des scans CT et des annotations provenant de l'ensemble de données LUNA16 #cite("SetioTBBBC0DFGG16"). LUNA16, un ensemble de scans CT accessible au public du Lung Database Consortium (LIDC) et de l'Image Database Resource Initiative (IDRI), comprend 888 scans CT avec une épaisseur de tranche inférieure à 3 mm et un espacement de pixel inférieur à 0,7 mm. Cet ensemble propose également deux fichiers CSV distincts contenant les détails des candidats et des annotations. Dans le fichier candidates_V2.csv, cinq colonnes sont illustrées : seriesuid, coordX, coordY, coordZ, et classe. Ici, le seriesuid fonctionne comme un identifiant unique pour chaque scan ; coordX, coordY, et coordZ représentent les coordonnées spatiales de chaque candidat en millimètres, et 'classe' fournit une catégorisation binaire, indiquant si le candidat est un nodule (1) ou non (0).
#figure( tablex( columns: 5, align: center + horizon, auto-vlines: false, repeat-header: false, [*seriesuid*], [*coordX*], [*coordY*], [*coordZ*], [*class*], [1.3.6...666836860], [68.42], [-74.48], [-288.7], [0], hlinex(stroke: 0.25pt), [1.3.6...666836860], [68.42], [-74.48], [-288.7], [0], hlinex(stroke: 0.25pt), [1.3.6...666836860], [-95.20936148], [-91.80940617], [-377.4263503], [0], ), caption: [Coordonnées et classe des candidats dans le dataset Luna16], kind: "tabl", supplement: [#text(weight: "bold","Table")], ) Le fichier annotations.csv est composé de cinq colonnes : seriesuid, coordX, coordY, coordZ, et diameter_mm, contenant respectivement l'identifiant unique du scan, les coordonnées spatiales de l'annotation en millimètres, et le diamètre de chaque annotation en millimètres. Ces annotations ont été marquées manuellement en se basant sur l'identification des nodules de plus de 3 mm de diamètre par quatre radiologistes indépendants #cite("dou2017automated") #cite("ding2017accurate","armato2011lidc"). #figure( tablex( columns: 5, align: center + horizon, auto-vlines: false, repeat-header: false, [*seriesuid*], [*coordX*], [*coordY*], [*coordZ*], [*diameter_mm*], [1.3.6.1....6860], [-128.6994211], [-175.3192718], [-298.3875064], [5.65147063], hlinex(stroke: 0.25pt), [1.3.6.1....6860], [103.7836509], [-211.9251487], [-227.12125], [4.224708481], hlinex(stroke: 0.25pt), [1.3.6.1....5208], [69.63901724], [-140.9445859], [876.3744957], [5.786347814], ), caption: [Annotations des nodules détectés dans le dataset Luna16], kind: "tabl", supplement: [#text(weight: "bold","Table")], ) === Scans CT avec Nodules Pulmonaires. Pour lire, traiter et représenter visuellement les scans CT montrant des nodules pulmonaires, nous avons mis en œuvre deux bibliothèques Python : SimpleITK#footnote("sous licence «Apache License 2.0.».") et matplotlib#footnote("sous licence «PSF License.».").
- SimpleITK#cite("Beare2018", "Yaniv2017", "Lowekamp2013") offre un point d'accès simplifié à l'Insight Segmentation and Registration Toolkit (ITK), un cadre construit pour l'analyse et le traitement d'images. - Matplotlib#cite("Hunter2007"), en revanche, offre des fonctionnalités pour la visualisation et l'amélioration des images. Avec SimpleITK, nous avons lu les fichiers de scan CT de l'ensemble de données LUNA16, convertissant ces images de leur format DICOM ou NIfTI en tableaux numériques multidimensionnels manipulables, appelés tableaux numpy. De plus, SimpleITK a été utilisé pour obtenir l'origine et l'espacement des images, définis comme les coordonnées de l'image et la taille du voxel, respectivement. Par la suite, nous avons rééchantillonné les images en utilisant SimpleITK, obtenant une taille de voxel uniforme de 1 mm x 1 mm x 1 mm, normalisé les valeurs de pixel à une plage de -1000 à 320 unités Hounsfield (HU), et appliqué un algorithme de segmentation pulmonaire pour isoler les régions pulmonaires des images. Nous avons utilisé matplotlib pour tracer et afficher les tranches de scan CT contenant des nodules, complétant ces images par des lignes blanches marquant les limites autour de chaque nodule pour souligner leur emplacement et leurs dimensions #cite("SetioTBBBC0DFGG16","dou2017automated","ding2017accurate"). Une fonction a été développée, acceptant en entrée un tableau de scan CT, un tableau numpy composé de coordonnées et de diamètres de nodules, l'origine et l'espacement de l'image, et quelques paramètres optionnels. Cette fonction itère sur le tableau de nodules, calculant les coordonnées de voxel pour chaque nodule en fonction des coordonnées physiques de l'image, de son origine et de son espacement. Par la suite, elle modifie le tableau de scan CT, incorporant les lignes blanches autour de chaque nodule, et conclut en créant un tracé pour afficher les tranches de scan CT contenant les nodules en utilisant matplotlib. 
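À titre d'illustration, la conversion des coordonnées physiques (en millimètres) vers des indices de voxels décrite ci-dessus peut être esquissée ainsi en Python pur (le nom de la fonction est hypothétique ; on suppose, cas le plus simple, un scan sans matrice de direction) :

```python
def coord_physique_vers_voxel(coord, origine, espacement):
    """Convertit une coordonnée physique (mm) en indices de voxel.

    Esquisse minimale : indice = (coordonnée - origine) / espacement,
    arrondi à l'entier le plus proche.
    """
    return [int(round((c - o) / e))
            for c, o, e in zip(coord, origine, espacement)]

# Exemple purement illustratif : nodule à (-128.7, -175.3, -298.4) mm
# dans un scan d'origine (-195.0, -195.0, -378.0) mm, espacement 1 mm isotrope.
voxel = coord_physique_vers_voxel((-128.7, -175.3, -298.4),
                                  (-195.0, -195.0, -378.0),
                                  (1.0, 1.0, 1.0))
print(voxel)  # [66, 20, 80]
```

C'est ce calcul, appliqué à chaque nodule, qui permet de dessiner les limites blanches aux bons indices de voxels dans le tableau du scan.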
La @Fig1 offre un exemple d'une tranche de scan CT, où le nodule est mis en évidence par des lignes blanches. #images( filename:"images/seg4.png", caption:[ Exemples d'une tranche de scan CT avec un nodule mis en évidence par des lignes blanches. ], width: 90% // ref: ) <Fig1> ==== Prétraitement des données. Dans la phase de prétraitement des données, les scans CT ont été transformés du format DICOM en tableaux (tenseurs). Cela a été suivi par le rééchantillonnage des images pour obtenir des dimensions de voxel uniformes de 1 mm x 1 mm x 1 mm, la normalisation des valeurs de pixel pour répondre à une plage de -1000 à 320 unités Hounsfield (HU), et enfin l'utilisation d'un algorithme de segmentation pulmonaire pour extraire les régions pulmonaires des images. #cite("dou2017automated","ding2017accurate","armato2011lidc"). #figure( tablex( columns: 4, align: center + horizon, auto-vlines: false, repeat-header: false, [], [*Train*], [*Test*], [*ALL*], [*Total Nodules*],[$4483$], [$2209$], [*6692*], ), caption: [ Le nombre total de nodules d'entraînement et de tests. ], kind: "tabl", supplement: [#text(weight: "bold","Table")], ) Après avoir préparé les images de scans CT, l'ensemble de données a été divisé en ensembles d'entraînement et de test. L'ensemble d'entraînement comprenait 67% des données(4483 nodules), tandis que l'ensemble de test comprenait les 33% restants(2209 nodules). La division a été effectuée en utilisant la fonction train_test_split de la bibliothèque sklearn, avec un paramètre random_state fixé à 42 pour assurer la reproductibilité. Le modèle a été entraîné sur l'ensemble d'entraînement et évalué sur l'ensemble de test. Les performances du modèle ont été mesurées en utilisant l'exactitude, le rappel et le score F1. ==== Développement de l'algorithme de détection des nodules. La construction de l'algorithme de détection des nodules a été divisée en plusieurs étapes impératives. 
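La normalisation des valeurs de pixel décrite dans la phase de prétraitement (fenêtrage entre -1000 et 320 HU) peut être esquissée ainsi (Python pur ; le choix de ramener ensuite les valeurs dans [0, 1] est une hypothèse d'illustration, le texte ne précisant que la plage HU) :

```python
HU_MIN, HU_MAX = -1000.0, 320.0

def normaliser_hu(valeurs_hu):
    """Tronque les valeurs HU à [-1000, 320] puis les ramène dans [0, 1]."""
    resultat = []
    for v in valeurs_hu:
        v = max(HU_MIN, min(HU_MAX, v))                    # troncature (clipping)
        resultat.append((v - HU_MIN) / (HU_MAX - HU_MIN))  # mise à l'échelle
    return resultat

print(normaliser_hu([-2000, -1000, 320, 1000]))  # [0.0, 0.0, 1.0, 1.0]
```

En pratique, cette opération s'applique élément par élément au tableau numpy du scan avant de le fournir au modèle.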
À sa base, l'algorithme reposait sur un modèle de réseau neuronal convolutif (CNN), chargé d'identifier les nodules à partir d'images de scans CT #cite("lin2017feature"). #images( filename:"images/model2.png", caption:[ La structure du modèle. ], width: 90% // ref: ) Le modèle conçu comprenait : - Une _couche d'entrée_ pour recevoir une image 2D avec une seule couleur de canal. - Deux couches convolutives (Couche convolutive 1 et 2) sont définies avec 32 filtres et une taille de noyau de 3x3. Les deux couches utilisent un padding 'same' pour préserver les dimensions spatiales de l'image. - Après chaque couche convolutive, une _Fonction d'activation ReLU_ est appliquée. - Après ces deux couches convolutives, une opération de MaxPooling2D est effectuée pour réduire les dimensions spatiales de moitié. - Un autre ensemble de deux couches convolutives (Couche convolutive 3 et 4) est défini, cette fois avec 64 filtres. Chaque couche est suivie d'une fonction d'activation ReLU. - Ceci est suivi d'une autre opération de MaxPooling2D. - Ensuite, un _GlobalAveragePooling2D_ est appliqué, agrégeant globalement par moyenne, permettant de réduire considérablement le nombre de paramètres du modèle. - Ensuite, une opération _Flatten_ est effectuée pour convertir le tenseur multidimensionnel en un vecteur 1D. - Ensuite, une _couche entièrement connectée_ (ou Dense) est appliquée avec deux neurones de sortie, correspondant aux deux classes cibles : la présence ou l'absence de nodules pulmonaires. - Enfin, une fonction softmax est utilisée comme fonction d'activation de la dernière couche pour effectuer une classification binaire, fournissant une distribution de probabilité sur les deux classes. Pour entraîner le modèle, l'optimiseur Adam a été utilisé avec un taux d'apprentissage de 0,001, une taille de lot de 40, et la fonction de perte d'entropie croisée binaire. 
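L'enchaînement des couches décrit ci-dessus peut se vérifier par un simple suivi des dimensions spatiales (esquisse en Python pur ; la taille d'entrée 64 × 64 est une hypothèse d'illustration, non précisée dans le texte) :

```python
def conv2d_same(taille):
    # padding 'same' : la dimension spatiale est conservée
    return taille

def maxpool2(taille):
    # MaxPooling2D 2x2 : la dimension spatiale est divisée par deux
    return taille // 2

# Hypothèse d'illustration : entrée 64x64, un seul canal.
taille = 64
for couche in (conv2d_same, conv2d_same, maxpool2,   # conv 1-2 (32 filtres) + pooling
               conv2d_same, conv2d_same, maxpool2):  # conv 3-4 (64 filtres) + pooling
    taille = couche(taille)

print(taille)  # 16 : cartes de 16x16 avant le GlobalAveragePooling2D
# Le GlobalAveragePooling2D réduit ensuite chaque carte 16x16 à un scalaire
# (soit un vecteur de 64 valeurs), puis la couche Dense produit 2 sorties (softmax).
```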
Cette fonction de perte mesure la divergence entre la probabilité prédite par le modèle et la vérité terrain pour chaque image. Elle est adaptée aux problèmes de classification binaire, comme celui de détecter la présence ou l'absence de nodules. L'entropie croisée binaire pénalise les prédictions erronées plus fortement que les prédictions correctes, ce qui encourage le modèle à apprendre à distinguer les nodules des non-nodules avec une grande confiance #cite("Goodfellowetal2016"). L'entraînement du modèle s'est étendu sur 100 époques #cite("SetioTBBBC0DFGG16"). == Résultats. === Évaluation des performances du modèle. Nous avons évalué le succès du modèle à travers son *exactitude* sur les ensembles de données d'entraînement et de validation. L'*exactitude* du modèle sur les données d'entraînement et de validation a été documentée à chaque étape du processus d'apprentissage #cite("SetioTBBBC0DFGG16"). Le terme *exactitude* fait référence à la capacité du modèle à prévoir correctement les résultats sur les données d'entraînement, tandis que l'*exactitude de validation* signifie la capacité du modèle à généraliser ses prédictions à de nouvelles données inédites, c'est-à-dire les données de validation. #images( filename:"images/class2.svg", caption:[ Évolution des précisions d’entraînement et de validation au cours de l’apprentissage. ], width: 100% // ref: ) En examinant les valeurs d'*exactitude* et d'*exactitude de validation* tout au long des étapes d'apprentissage, il est indiqué que le modèle acquiert des connaissances, comme on peut le voir à travers l'amélioration progressive des exactitudes d'entraînement et de validation. Le modèle commence avec des exactitudes relativement plus faibles, autour de $64%$, avant d'augmenter à plus de 89% et de terminer avec un score de 87% à la fin de l'entraînement. Cela démontre la capacité raffinée du modèle à catégoriser correctement un ratio considérable de cas. 
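La perte d'entropie croisée binaire évoquée plus haut peut être esquissée ainsi en Python pur (à titre d'illustration, avec des valeurs numériques arbitraires) :

```python
import math

def entropie_croisee_binaire(y_vrai, y_pred, eps=1e-12):
    """Perte moyenne d'entropie croisée binaire.

    y_vrai : étiquettes (0 ou 1) ; y_pred : probabilités prédites.
    eps évite log(0) ; esquisse minimale à but pédagogique.
    """
    total = 0.0
    for y, p in zip(y_vrai, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_vrai)

# Une prédiction confiante et correcte coûte peu ;
# une prédiction confiante mais fausse est fortement pénalisée.
print(round(entropie_croisee_binaire([1, 0], [0.9, 0.1]), 4))  # 0.1054
```

On voit bien la propriété soulignée dans le texte : plus la probabilité prédite s'éloigne de la vérité terrain, plus la pénalité croît, ce qui pousse le modèle à séparer nodules et non-nodules avec confiance.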
=== Métriques d'évaluation : Précision, Rappel et Score F1. #block()[ #set text(9pt, style: "italic") #grid( columns: (1fr, 2fr), rows: (auto), gutter: 3pt, figure( tablex( columns: 3, align: center + horizon, auto-vlines: false, repeat-header: false, [], [*Positive*], [ *Négative*], [*Vrai*], [$652$], [$1286$], hlinex(stroke: 0.25pt), [*Faux*], [$90$], [$199$], )+text(size: 2pt," "), caption: [La matrice de confusion.], kind: "tabl", supplement: [#text(weight: "bold","Table")], ), /* -------------------------*/ figure( tablex( columns: 4, align: center + horizon, auto-vlines: false, repeat-header: false, [], [*Précision*], [ *Rappel \ (sensibilité)*], [*F1-score*], // [*Class 0*], [$86%$], [$93%$], [$90%$], hlinex(stroke: 0.25pt), // [*Class 1*], [$88%$], [$77%$], [$82%$], hlinex(stroke: 0.25pt), [*Total*], [$87.8%$], [$76.6%$], [$81.8%$], )+text(size: 8pt," "), caption: [Précision, rappel et F1-score du modèle pour les classes 0 et 1], kind: "tabl", supplement: [#text(weight: "bold","Table")], ), ) ] La performance du modèle a aussi été évaluée à partir de _la matrice de confusion_, qui permet de calculer des métriques comme la *précision*, le *rappel (sensibilité)* et le *score F1*, en plus de l’*exactitude*. Ces mesures fournissent un aperçu plus large des performances du modèle, notamment quand il y a un déséquilibre des classes #cite("lin2017focal"). // #linebreak() #let VP="VP" #let FP="FP" #let FN="FN" - La *précision* représente la fraction des prédictions positives correctes (plus précisément, lorsque le modèle identifie correctement un nodule) sur toutes les prévisions positives faites par le modèle. Une précision élevée indique un faible taux de faux positifs du modèle. Le modèle a atteint une précision de $87.8%$. $ "précision" &= (VP) / (VP + FP) \ &= 652/(652+90) \ &= 87.8% $ // #images( // filename:"images/pre_recall2.png", // caption:[ // Précision et rappel (« recall »). 
La précision compte la proportion d'items pertinents parmi les items sélectionnés alors que le rappel compte la proportion d'items pertinents sélectionnés parmi tous les items pertinents sélectionnables. // ], // width: 60% // // ref: // ) - Le *Rappel (Sensibilité)*, aussi appelé taux de vrais positifs, est le rapport des prédictions positives correctes à tous les positifs réels. Un rappel élevé indique que le modèle a correctement identifié la majorité des cas positifs réels. Le modèle a atteint un rappel de $76.6%$. $ "rappel" &= (VP) / (VP + FN) \ &= 652/(652+199) \ &= 76.6% $ - Le *F1-score* est la moyenne harmonique de la précision et du rappel, fournissant une seule mesure qui équilibre ces métriques. Le modèle a obtenu un _score F1_ de $81.8%$. $ F_1 &= (2 VP)/(2VP + FP + FN) \ &= (2 times 652)/(2 times 652 + 90 + 199) \ &= 81.8% $ == Discussion. Les résultats illustrent que le modèle a performé de manière compétente dans l'identification des deux classes. En général, le modèle a performé de manière impressionnante en termes de précision, de rappel et de _score F1_. Un certain nombre de facteurs peuvent expliquer pourquoi le modèle montre une préférence pour identifier la classe 0 (pas de nodule) par rapport à la classe 1 (présence de nodule). L'une des principales explications pourrait résider dans la composition de l'ensemble de données. Si l'ensemble de données comprend un plus grand nombre d'exemples de la classe 0 que de la classe 1, le modèle pourrait devenir plus apte à identifier la classe dominante, entraînant une performance légèrement supérieure pour cette classe. Une autre explication possible pourrait être que la détection des nodules est une tâche plus difficile que la détection des non-nodules, car les nodules sont souvent petits, flous ou cachés par d'autres structures pulmonaires.
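Les métriques rapportées plus haut se vérifient numériquement à partir de la matrice de confusion (VP = 652, FP = 90, FN = 199), comme le montre cette esquisse en Python pur :

```python
def metriques(vp, fp, fn):
    """Précision, rappel et F1 à partir des comptes de la matrice de confusion."""
    precision = vp / (vp + fp)
    rappel = vp / (vp + fn)
    f1 = 2 * vp / (2 * vp + fp + fn)  # moyenne harmonique de précision et rappel
    return precision, rappel, f1

precision, rappel, f1 = metriques(652, 90, 199)
print(f"précision = {precision:.4f}, rappel = {rappel:.4f}, F1 = {f1:.4f}")
# précision = 0.8787, rappel = 0.7662, F1 = 0.8186
```

On retrouve bien les valeurs du texte : précision 87,8 %, rappel 76,6 % et score F1 81,8 %.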
Pour réduire le biais du modèle en faveur de la classe 0, il serait possible d’utiliser des techniques de rééquilibrage des classes, comme le suréchantillonnage ou le sous-échantillonnage, ou d’appliquer une focal loss, qui pénalise davantage les erreurs sur la classe minoritaire. De plus, des caractéristiques différentes entre les classes peuvent également conduire à des taux de détection différents. Des analyses ultérieures, telles que des examens approfondis des caractéristiques des données d’entrée, pourraient aider à clarifier exactement pourquoi ces différences de performances sont observées. Les solutions pourraient impliquer l'utilisation de techniques de prétraitement des images, comme la normalisation ou l’augmentation des données, pour améliorer la qualité et la diversité des données d’entrée. Pour résoudre ce problème, il faut une stratégie raffinée pour entraîner notre modèle et un indicateur de performance amélioré mieux que la précision. Les solutions potentielles comprennent : - La mise en œuvre de techniques d'augmentation des données pour augmenter le nombre d'échantillons malins dans notre ensemble de données #cite("SetioTBBBC0DFGG16"). - L'utilisation de techniques de suréchantillonnage ou de sous-échantillonnage pour obtenir un équilibre des classes dans notre ensemble de données #cite("lin2017focal"). 
#figure( tablex( columns: 3, align: center + horizon, auto-vlines: false, repeat-header: false, [], [*Précision*], [ *Rappel (sensibilité)*], [*Song et al.*], [$82%$], [$83%$], hlinex(stroke: 0.25pt), [*Nibali et al.*], [$89%$], [$91%$], hlinex(stroke: 0.25pt), [*Zhao et al.*], [$82%$], [$$], hlinex(stroke: 0.25pt), [*Nos modèles*], [$87%$], [$90%$], ), caption: [Comparaison avec d'autres études dans la tâche de classification des nodules#cite("Song2017", "Nibali2017", "Zhao2018").], kind: "tabl", supplement: [#text(weight: "bold","Table")], ) Dans notre travail ultérieur, nous visons à incorporer certaines de ces solutions afin d'améliorer les performances de notre modèle en classification des nodules pulmonaires, et de maîtriser la classification des sous-types de nodules, tels que solides, non-solides, partiellement solides, périfissuraux, calcifiés et spiculés. Différents traitements sont nécessaires pour différents types de nodules, ce qui rend leur détection précise encore plus pertinente pour un traitement réussi. == Conclusion. Nous avons utilisé le Deep-Learning pour détecter et classifier les nodules pulmonaires dans l'ensemble de données LUNA16. Le modèle a affronté des défis liés à la diversité des nodules en termes de taille, de forme et d'emplacement, ainsi qu'à une distribution inégale dans l'ensemble de données. Malgré ces difficultés, il a performé de manière satisfaisante, produisant des scores élevés, un bon rappel et un score F1 convaincant pour les nodules, qu'ils soient bénins ou malins. Le modèle a montré un léger avantage dans l'identification des non-nodules, probablement à cause du déséquilibre de classes dans l'ensemble de données. Nous envisageons des techniques d'augmentation des données et de rééquilibrage des classes pour remédier à ce problème. Les résultats de notre étude soulignent que le Deep-Learning est efficace pour la détection et la classification des nodules pulmonaires.
Il a le potentiel pour faciliter le diagnostic précoce du cancer du poumon, ce qui peut améliorer les chances de survie et l'efficacité du traitement. Nous cherchons à améliorer notre modèle pour perfectionner sa performance, en particulier dans la détection des sous-types de nodules pulmonaires. Pour cela, des recherches supplémentaires sont nécessaires. #finchapiter
https://github.com/mav3ri3k/public_notes
https://raw.githubusercontent.com/mav3ri3k/public_notes/main/main.typ
typst
#import "@preview/dvdtyp:1.0.0": * #import "@preview/fletcher:0.5.1" as fletcher: diagram, node, edge #set document(title: [So you want to build a Rust procedural macro with wasm], author: "<NAME>") #show link: name => box( fill: rgb("#f1ce33"), radius: 4pt, outset: (x: 2pt, y: 3pt), )[#underline[#text(fill: rgb("000000"))[#name]]] #show: dvdtyp.with( title: "So you want to build a Rust procedural macro with wasm", author: "mav3ri3k", ) #show raw: name => if name.block [ #block( fill: luma(230), inset: 4pt, radius: 4pt, )[#name] ] else [ #box( fill: luma(230), outset: (x: 2pt, y: 3pt), radius: 4pt, )[#name] ] #outline() = I am clueless, lets start! Where do you start ? Well I don’t see that the compiler allows it. Hmm.. I will do it myself then. :0 == How do you create procedural macros ? Lets start where all things Rust start: #link("https://doc.rust-lang.org/book/")[The Rust Book] #sym.arrow.r 19. Advanced Features #sym.arrow.r 19.5 Macros It teaches us a nice way to create a function type procedural macro. Following that, lets create a custom macro. ```rust // my_macro/src/lib.rs extern crate proc_macro; #[proc_macro] pub fn make_answer(_item: proc_macro::TokenStream) -> proc_macro::TokenStream { "fn answer() -> u32 { 42 }".parse().unwrap() } ``` *But what does it do ?* To see what our custom proc macro expands to, lets use a utility: #link("https://github.com/dtolnay/cargo-expand")[Cargo Expand]. It allows us to see how our procedural macros expand.
Then upon running ```bash $ cargo expand``` from the root of our project, we obtain: ```rust #![feature(prelude_import)] #[prelude_import] use std::prelude::rust_2021::*; #[macro_use] extern crate std; extern crate proc_macro; #[proc_macro] pub fn make_answer(_item: proc_macro::TokenStream) -> proc_macro::TokenStream { "fn answer() -> u32 { 42 }".parse().unwrap() } const _: () = { extern crate proc_macro; #[rustc_proc_macro_decls] #[used] #[allow(deprecated)] static _DECLS: &[proc_macro::bridge::client::ProcMacro] = &[ proc_macro::bridge::client::ProcMacro::bang("make_answer", make_answer), ]; }; ``` A lot of text is added to our code, but most of it looks familiar like ```rust #[prelude_import]```, ```rust extern crate std;```. We would expect them to be there. But the interesting part is here: ```rust static _DECLS: &[proc_macro::bridge::client::ProcMacro] = &[ proc_macro::bridge::client::ProcMacro::bang("make_answer", make_answer), ]; ``` By just reading names, we see - ```rust proc_macro```: obviously, - ```rust bang```: name internally used for function type macros, - ```rust "make_answer"```: name of our proc macro.\ Nothing surprising. But there is something new too: ```rust bridge::client```. At this point we have come deep enough. Our holy Rust Book does not tell us about the bridge. So lets try refering the #link("https://rustc-dev-guide.rust-lang.org/")[Rustc Dev Guide] #sym.arrow.r 36. Syntax and the AST #sym.arrow.r 36.2 Macro Expansion\ I am telling you, the rust is quite well documented compared to most other language. *There is a book for everything!* If you scroll *#highlight[all]* the way down, you might reach small section on Procedural Macros. But that does not matter to us. We are all chads here. We read the code. 
#image("chad.jpg") So it tells us about `rustc_expand::proc_macro` and `rustc_expand::proc_macro_server`. At this point we can piece together three words that we have come across: + `client` + `bridge` + `server` This likely means that procedural macros work in a server-client architecture. And you would be correct! == What is happening ? This is a good place to explain how proc macros work internally. Procedural Macros work on `TokenStream`. A TokenStream is just a stream of tokens. A token is just a group of characters which has a collective meaning. For example if we take ```rust let x = 2;```, we can say the tokens would look like: #diagram( node((0, 0), [let\ Keyword]), edge("-|>"), node((1, 0), [x\ Variable Name]), edge("-|>"), node((2, 0), [=\ Assignment Operator]), edge("-|>"), node((3, 0), [2\ Constant]), edge("-|>"), node((4, 0), [;\ Delimiter]), ) The token names here are representative of the logic rather than the actual names used in the compiler. The procedural macro takes in some `TokenStream` and outputs another `TokenStream` which replaces the original one. This "expansion" of the original TokenStream happens at compile time on the machine doing the compiling, not at runtime on the machine the code was built for. This is a unique problem while building the compiler. == The Chicken and the Egg problem + *What came first, the Chicken or the Egg ?*\ + *When the first ever compiler was made, how did they compile it?* + *Can a compiler compile the code of a compiler ?* #image("think.jpg") === Bootstrapping Bootstrapping is a technique for creating a self-compiling compiler, which is a compiler written in the same programming language it compiles. This is the same technique that the Rust compiler uses. The best analogy I can think of is how the Terminator uses his left arm to heal his right arm. In a similar fashion, the rustc compiler and std library continuously build each other until we have the final output.
#image("bootstrap.jpeg") Read more at: #link("https://jyn.dev/bootstrapping-rust-in-2023/ ")[Why is Rust's build system uniquely hard to use?] === Procedural Macro as part of library Proc macros are also part of the rust library. This means they have to be compatible between two different versions of the compiler. Therefore when the compiler calls the proc macro to run, the `TokenStream` is passed *serialized through a C ABI buffer*. And thus the reason proc macros use a server ( compiler frontend ) and client ( proc macro crate ) architecture through a bridge ( C ABI buffer ). #diagram( node-stroke: 1pt, node((0, 0), [Proc Macro Server\ (Compiler)], corner-radius: 2pt), edge("<|-|>", label: "Bridge: C ABI Buffer"), node((4, 0), [Proc Macro Client\ (Crate)], corner-radius: 2pt), ) This also means that a proc macro cannot have a dependency on any external crate. == Rustc_expand Lets look at the actual code for Rust's compiler. The entry point is #link("https://github.com/rust-lang/rust/blob/master/compiler/rustc_expand/src/proc_macro.rs")[rustc_expand::proc_macro]. Here ```rust fn expand``` gets called for all 3 types of proc macros. This creates an instance of the proc macro server defined at #link("https://github.com/rust-lang/rust/blob/master/compiler/rustc_expand/src/proc_macro_server.rs")[rustc_expand::proc_macro_server]. Then the actual client, being the proc macro crate, is called through the #link("https://github.com/rust-lang/rust/tree/master/library/proc_macro/src/bridge")[proc_macro::bridge]. = Add Support for wasm proc macro At this point we have explored all the words that we discovered through ```bash $ cargo expand```. We understand the overall structure and how the pieces interact. #problem[ But what about it ? All we want to do is add a way such that we can run a proc macro written in Rust. ] == Compile Proc Macro to Wasm Yes, so lets review. The first step to running a proc macro written in Rust is to build the proc macro to wasm. Lets do that.
For the macro we built earlier, run the command: ```bash $ cargo build --target wasm32-unknown-unknown # Output Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.06s ``` Voila! It builds!\ *But is there a `.wasm` file in `/target` ?* ```bash # Check yourself $ ls **/*.wasm #Output Pattern, file or folder not found ``` *No, none, nada, nil, null. What ?*\ Yes, you cannot build proc macros to `wasm` yet.\ *Currently this has been identified as lower on the list of priorities and thus no work has been done.* === Current Work Around Update your `lib.rs` file to: ```rust // my_macro/src/lib.rs extern crate proc_macro; #[no_mangle] #[export_name = "make_answer"] pub extern "C" fn make_answer(_item: proc_macro::TokenStream) -> proc_macro::TokenStream { "fn answer() -> u32 { 42 }".parse().unwrap() } ``` Just compile the `lib.rs` file to `wasm` using rustc. ```bash $ rustc src/lib.rs --extern proc_macro --target wasm32-unknown-unknown --crate-type lib ``` *This has some #highlight()[glaring] drawbacks as we will find later.* == Register a wasm crate Now that we have our wasm file, lets try using it the way other proc macros are used.\ If you already have some simple rust repo with a single proc macro dependency, you can try: ```bash $ cargo build -vv``` for super verbose output, which shows us what cargo does in the background: just calls to the holy rust compiler *rustc*.
You will see some stuff like: ```bash Compiling my-macro v0.1.0 (/Users/apurva/projects/proc-macro-server/my-macro) Running rustc --crate-name my_macro --edition=2021 lib.rs --crate-type proc-macro -C prefer-dynamic -C embed-bitcode=no -C metadata=60c0b140b17fe75a -C extra-filename=-60c0b140b17fe75a --out-dir /Users/apurva/projects/proc-macro-server/run-macro/target/debug/deps -C incremental=/Users/apurva/projects/proc-macro-server/run-macro/target/debug/incremental -L dependency=/Users/apurva/projects/proc-macro-server/run-macro/target/debug/deps --extern proc_macro` Compiling run-macro v0.1.0 (/Users/apurva/projects/proc-macro-server/run-macro) Running rustc --crate-name run_macro --edition=2021 src/main.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --diagnostic-width=100 --crate-type bin --emit=dep-info,link -C embed-bitcode=no -C debuginfo=2 -C split-debuginfo=unpacked -C metadata=3f481d1407db4a43 -C extra-filename=-3f481d1407db4a43 --out-dir /Users/apurva/projects/proc-macro-server/run-macro/target/debug/deps -C incremental=/Users/apurva/projects/proc-macro-server/run-macro/target/debug/incremental -L dependency=/Users/apurva/projects/proc-macro-server/run-macro/target/debug/deps --extern my_macro=/Users/apurva/projects/proc-macro-server/run-macro/target/debug/deps/libmy_macro-60c0b140b17fe75a.dylib` ``` *Too much garbage! I did not sign up for this.*\ *Calm you horses buddy.*\ The first compilation just means it is building the proc macro. The second call for compiling is when it actually build the crate and attaches our macro using the line:\ ```bash --extern my_macro=/some_file_path/libmy_macro-hash_for_incremental_comp.dylib ``` Along the same line lets try to use our wasm file by directly passing it through extern: ```bash $ rustc /some_rust_file.rs --extern my_macro=/some_path/my_macro.wasm # Output (Some error) ``` Well we can not just pass wasm files to the compiler. Back to rust compiler dev guide! 
#link("https://rustc-dev-guide.rust-lang.org/backend/libs-and-metadata.html")[Libraries and Metadata] tells us that currently it only accepts 3 types of files + rlib + dylib + rmeta So we need to also add #highlight[wasm] to this list. This has been done but with a #underline[caveat]. The `CrateLocator` works correctly and accepts a wasm file; however, in the next step we need to register the crate, which requires metadata. According to the guide ( true by the way, :D ):\ #quote[As crates are loaded, they are kept in the `CStore` with the crate metadata wrapped in the `CrateMetadata` struct.] We need `CrateMetadata`! And currently, when compiling to a wasm file, no metadata is attached to it. The #highlight[glaring] issue I told you about. The current hack is to just patch it all with made-up data. == Expand a Proc Macro *Finally! The meat of the matter!* So now that we have registered the wasm file, we can use it to expand our proc macro. We already know which part of the compiler is responsible: `rustc_expand::proc_macro`. Lets try to read a *simplified* expand function for `BangProcMacro`. Read through the comments for a small walkthrough. ```rust use rustc_ast::tokenstream::TokenStream; impl base::BangProcMacro for BangProcMacro { fn expand<'cx>( .. // takes in the stream of tokens defined by the compiler input: TokenStream, // expects a result with a new stream of tokens ) -> Result<TokenStream, ErrorGuaranteed> { .. // create instance of proc macro server let server = proc_macro_server::Rustc::new(); // Run main entry function for proc macro // which takes care of talking between server and client // returns new tokenstream self.client.run(server, input, ..) } } ``` Ok. Lets think again about what we want to do.
#diagram( node-stroke: 1pt, node((0, 0), [Proc Macro Server\ (Compiler)], corner-radius: 2pt), edge("<|-|>", label: "Bridge: C ABI Buffer"), node((4, 0), [Proc Macro Client\ (Crate)], corner-radius: 2pt), ) The only change to our above diagram is that now the *Proc Macro Client* is a wasm file. Which means we only need to change some logic for the client. So when do we create the client ? As hint again check the output of `$ cargo expand`. This is the function used to create a new client: ```rust impl Client<crate::TokenStream, crate::TokenStream> { pub const fn expand1(f: impl Fn(crate::TokenStream) -> crate::TokenStream + Copy) -> Self { Client { get_handle_counters: HandleCounters::get, run: super::selfless_reify::reify_to_extern_c_fn_hrt_bridge(move |bridge| { run_client(bridge, |input| f(crate::TokenStream(Some(input))).0) }), _marker: PhantomData, } } } ``` This is not meant to make sense for you. The important part is that the function takes in our proc macro function and creates a client using it. The current leading implementation is to create a thin shim function for this input function which internally runs the wasm blob. 
#diagram(
  node-stroke: 1pt,
  node((0, 0), [Proc Macro Server\ (Compiler)], corner-radius: 2pt),
  edge("<|-|>", label: "Bridge: C ABI Buffer"),
  node((4, 0), [Proc Macro Client\ (Crate)], corner-radius: 2pt),
  edge("<|-|>", label: "Shim function"),
  node((4, 1), [Run wasm blob], corner-radius: 2pt),
)

This looks like:

```rust
fn wasm_pm(ts: crate::TokenStream, path: PathBuf) -> crate::TokenStream {
    // call wasmtime using a shared library
    // and run the wasm blob internally
}

impl Client<crate::TokenStream, crate::TokenStream> {
    pub const fn expand_wasm(path: PathBuf) -> Self {
        let f = unsafe { wasm_pm };
        Client {
            get_handle_counters: HandleCounters::get,
            run: super::selfless_reify::reify_to_extern_c_fn_hrt_bridge(move |bridge| {
                run_client(bridge, |input| f(crate::TokenStream(Some(input)), path).0)
            }),
            _marker: PhantomData,
        }
    }
}
```

= What is the current state of the project?

*Ok mav, after reading through this for 10 hours, where are we at?*

I am at the final stages of getting the shim working. This has taken much longer than I personally expected. There can be many reasons:

+ #highlight[Skill Issue]
+ #highlight[Skill Issue]
+ #highlight[Skill Issue]
+ #highlight[Skill Issue]
+ #highlight[Skill Issue]

100. `libproc_macro` cannot have a dependency on any other crate, which means every low-level implementation has to be separately defined and used for `libproc_macro`. So I have gone through more low-level code than ever before in my life.

*Look out for an update on this soon.* After this, the efforts will be put into adding metadata when we compile a proc macro to wasm and properly registering it as a crate.
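To make the file-type discussion above concrete, here is a tiny, self-contained model of the kind of extension check that the `CrateLocator` change boils down to. This is my own illustrative sketch, not the actual rustc code: the real locator also matches filename prefixes, crate hashes, and target-specific dylib extensions.

```rust
use std::path::Path;

/// Hypothetical model of the file-type filter inside `CrateLocator`:
/// map a candidate file to the crate "flavor" the loader understands.
fn crate_flavor(path: &Path) -> Option<&'static str> {
    match path.extension()?.to_str()? {
        "rlib" => Some("rlib"),
        "rmeta" => Some("rmeta"),
        "so" | "dylib" | "dll" => Some("dylib"),
        // the new case this post is about
        "wasm" => Some("wasm"),
        _ => None,
    }
}

fn main() {
    assert_eq!(crate_flavor(Path::new("libfoo.rlib")), Some("rlib"));
    assert_eq!(crate_flavor(Path::new("my_macro.wasm")), Some("wasm"));
    assert_eq!(crate_flavor(Path::new("README.md")), None);
}
```

In this model, adding wasm support really is just one extra match arm; the hard part, as described above, is that everything downstream of the locator assumes the accepted file carries crate metadata.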
https://github.com/typst-community/valkyrie
https://raw.githubusercontent.com/typst-community/valkyrie/main/src/types/string.typ
typst
Other
#import "../base-type.typ": base-type #import "../assertions-util.typ": assert-base-type #import "../ctx.typ": z-ctx #import "../assertions-util.typ": * #import "../assertions/string.typ": matches /// Valkyrie schema generator for strings /// /// -> schema #let string( assertions: (), min: none, max: none, ..args, ) = { assert-positive-type(min, types: (int,), name: "Minimum length") assert-positive-type(max, types: (int,), name: "Maximum length") base-type(name: "string", types: (str,), ..args) + ( min: min, max: max, assertions: ( ( precondition: "min", condition: (self, it) => it.len() >= self.min, message: (self, it) => "Length must be at least " + str(self.min), ), ( precondition: "max", condition: (self, it) => it.len() <= self.max, message: (self, it) => "Length must be at most " + str(self.max), ), ..assertions, ), ) } #let email = string.with( description: "email", assertions: ( matches( regex("^[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+(\.[a-zA-Z0-9-]{2,3}){1,2}$"), message: (self, it) => "Must be an email address", ), ), ); #let ip = string.with( description: "ip", assertions: ( matches( regex("^(?:(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])\.){3}(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])$"), message: (self, it) => "Must be a valid IP address", ), ), );
https://github.com/tingerrr/finger-tree
https://raw.githubusercontent.com/tingerrr/finger-tree/main/thesis/README.md
markdown
# Project structure

The project is structured as follows:
- `src`: the language agnostic document source files like figures and bibliographies
- `src/de`: the German thesis document files
- `src/de/thesis.typ`: the entry point of the German thesis document
- `assets`: other relevant assets for compilation and styling

# Tooling

The following tooling is required for compilation:
- [typst][typst]

# Compiling

To compile the document into a PDF, run the following commands in the same directory as this README:
```
mkdir -p out
typst compile --root . src/de/thesis.typ out/thesis.pdf
```
First-time compilation requires an internet connection to cache the packages used for this document.

[typst]: https://typst.app
[ctf]: https://github.com/tingerrr/chiral-thesis-fhe
https://github.com/vimkat/typst-ohm
https://raw.githubusercontent.com/vimkat/typst-ohm/main/src/templates/presentation.typ
typst
MIT License
#import "_presentation.typ": ohm-theme, title-slide, section-slide, agenda-slide, slide, metadata-line
https://github.com/tingerrr/chiral-thesis-fhe
https://raw.githubusercontent.com/tingerrr/chiral-thesis-fhe/main/src/core/authors.typ
typst
#import "/src/packages.typ" as _pkg #import "/src/utils.typ" as _utils #let _std = ( link: link, ) #let _eat-title(value) = { let rest = value.trim(at: start) let (main, rest) = _utils.token.eat-any(rest, ( "Dr.", "Prof.", regex("Dipl-\\w+\\."), "M.", "B.", )) if main == none { return (err: _pkg.oxifmt.strfmt("unknown title at: `{}`", rest)) } if not _utils.token.is-boundary(rest) { return (err: _pkg.oxifmt.strfmt("unexpected input after title at: `{}`", rest)) } if main in ("Prof.", "Dr.", "M.", "B.") { if main == "Prof." { ((main: main, suffix: none), rest) } else if main == "Dr." { let (suffix, rest) = _utils.token.eat(rest.trim(at: start), regex("\\w+\\. *nat\\.")) if suffix != none { if not _utils.token.is-boundary(rest) { return (err: _pkg.oxifmt.strfmt("unexpected input after doctor suffix at: `{}`", rest)) } } ((main: main, suffix: suffix), rest) } else { let (suffix, rest) = _utils.token.eat(rest.trim(at: start), "Sc.") if suffix == none { return (err: _pkg.oxifmt.strfmt( "unknown {} suffix at `{}`", if main == "M."
{ "masters" } else { "bachelors" }, rest, )) } ((main: main, suffix: suffix), rest) } } else { if not _utils.token.is-boundary(rest) { return (err: _pkg.oxifmt.strfmt("unexpected input at: `{}`", rest)) } ((main: main, suffix: none), rest) } } #let parse-title(value) = { assert.eq( type(value), str, message: _pkg.oxifmt.strfmt("`value` must be a string, was {}", type(value)) ) let res = _eat-title(value) if type(res) == dictionary { panic(res.err) } let (title, rest) = res rest = rest.trim(at: start) if rest.len() != 0 { panic(_pkg.oxifmt.strfmt("unexpected input at `{}`", rest)) } title } #let parse-name(value) = { assert.eq( type(value), str, message: _pkg.oxifmt.strfmt("`value` must be a string, was {}", type(value)) ) value = value.trim().split(" ").filter(frag => frag.len() != 0) if value.len() == 0 { panic("name cannot be empty") } // if there's a comma it designates "last, first" style naming let part = none for (idx, frag) in value.enumerate() { if frag.ends-with(",") { if part != none { panic("a name may only contain one comma") } part = idx } } // remove the comma for the validation if part != none { value.at(part) = value.at(part).trim(",", repeat: false) part = part + 1 } // validate names // NOTE: using \w is too permissive, but this is not really an issue in most cases if not value.all(frag => frag.match(regex("^([\\w'\\-]+|[\\w]\\.)$")) != none) { panic(_pkg.oxifmt.strfmt("name contained invalid character: `{}`", value.join(" "))) } let (first, last) = if part != none { if value.len() == 1 { panic("first name may not be empty") } else { (value.slice(part), value.slice(0, part)) } } else { if value.len() == 1 { ((value.first(),), ()) } else { (value.slice(0, -1), (value.last(),)) } } ( first: first, last: last, ) } #let parse-author(value) = { assert.eq( type(value), str, message: _pkg.oxifmt.strfmt("`value` must be a string, was {}", type(value)) ) value = value.trim() let titles = () while true { let res = _eat-title(value) if type(res) == 
dictionary { break } else { let (title, rest) = res titles.push(title) value = rest } } value = value.trim().split(" ").filter(frag => frag.len() != 0) if value.len() == 0 { panic("author cannot be empty") } let email = if value.last().starts-with("<") and value.last().ends-with(">") { assert(value.len() != 1, message: "author must contain name") value .remove(value.len() - 1) .trim("<", repeat: false, at: start) .trim(">", repeat: false, at: end) } ( titles: titles, name: parse-name(value.join(" ")), email: email, ) } #let format-title(title, suffix: true) = { assert.eq( type(title), dictionary, message: _pkg.oxifmt.strfmt("`title` must be a title dictionary, was {}", type(title)), ) assert.eq( title.keys(), ("main", "suffix"), message: _pkg.oxifmt.strfmt( "`title` must contain `main` and `suffix`, contained {}", title.keys(), ), ) title.main if suffix and title.suffix != none { " " title.suffix } } #let format-name(name, abbreviate: false, last-first: false) = { assert.eq( type(name), dictionary, message: _pkg.oxifmt.strfmt("`name` must be a name dictionary, was {}", type(name)), ) assert.eq( name.keys(), ("first", "last"), message: _pkg.oxifmt.strfmt( "`name` must contain `first` and `last`, contained {}", name.keys(), ), ) let first = if abbreviate { name.first.map(n => if n.ends-with(".") { n } else { n.clusters().first() + "." 
}).join(" ") } else { name.first.join(" ") } let last = name.last.join(" ") if last-first { if name.last.len() != 0 { last ", " } first } else { first if name.last.len() != 0 { " " last } } } #let format-author( author, titles: true, title-suffix: true, abbreviate: false, email: true, link: false, ) = { assert.eq( type(author), dictionary, message: _pkg.oxifmt.strfmt("`author` must be an author dictionary, was {}", type(author)), ) assert.eq( author.keys(), ("titles", "name", "email"), message: _pkg.oxifmt.strfmt( "`author` must contain `titles`, `name` and `email`, contained {}", author.keys(), ), ) if titles { for title in author.titles { format-title(title, suffix: title-suffix) " " } } format-name(author.name, abbreviate: abbreviate) if email and author.email != none { // this will convert the output into content if link { " " (_std.link)("mailto:" + author.email, author.email) } else { " <" author.email ">" } } } #let prepare-author(author) = { if type(author) == str { parse-author(author) } else if type(author) == dictionary { // TODO: validate dictionary author } else { panic("only string and author dictionary are allowed as author") } } // TODO: check dict keys and values #let assert-author-valid(author) = { assert(type(author) in (str, dictionary)) }
https://github.com/cidremt/QFT_memo
https://raw.githubusercontent.com/cidremt/QFT_memo/main/main.typ
typst
#import ".config.typ":* #import "@preview/ouset:0.1.1":* #import "@preview/fletcher:0.4.3" as fletcher: diagram, node, edge #show: doc => init(doc)
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/scholarly-tauthesis/0.8.0/template/content/preface.typ
typst
Apache License 2.0
/*** preface.typ
 *
 * Write the preface (alkusanat) of your work here.
 *
 **/

This #link("https://typst.app/docs/")[typst] document template is intended to support writing theses in the technical fields at Tampere University. The template is based on the earlier one from the Tampere University of Technology, but it has been updated for use at Tampere University from 2019 onwards.

Versions $>=$ #version(0,12,0) of the typst compiler, which this template requires, are capable of creating PDF/A-2b-compliant PDF documents when run with the command
```shell
typst compile --pdf-standard a-2b template/main.typ
```
Therefore, the Tampere University Muuntaja service, used for transforming PDF documents into PDF/A format, might no longer be required when using this template. *However*, you should make sure that your document actually conforms to the PDF/A-2b standard before submitting it to the Tampere University archive *#link("https://trepo.tuni.fi/")[Trepo]*. This can be achieved by installing the verification program *#link("https://docs.verapdf.org/install/")[veraPDF]* and running the generated PDF file `main.pdf` through it. *If the verification program complains that the file does not conform to the PDF/A-2b standard, try feeding it to the PDF/A converter at https://muuntaja.tuni.fi before submitting it.*

Acknowledgements to those who contributed to the thesis are generally presented in the preface. It is not appropriate to criticize anyone in the preface, even though the preface will not affect your grade. The preface must fit on one page. Add the date, after which you have not made any revisions to the text, at the end of the preface.
https://github.com/dead-summer/math-notes
https://raw.githubusercontent.com/dead-summer/math-notes/main/notes/Analysis/ch1-measures/1-d-lebesgue-measure.typ
typst
#import "/book.typ": book-page
#import "../../../templates/conf.typ": *
#import "@preview/mitex:0.2.4": *

#show: thmrules.with(qed-symbol: $square$)
#show: book-page.with(title: "1-D Lebesgue Measure")

= 1-D Lebesgue Measure

Let $F: RR -> RR$ be increasing and right-continuous. Consider intervals of the "h-interval" type: sets of the form $]a, b]$ or $[a, +infinity[$ or $diameter$ for $-infinity <= a < b <= +infinity$.

#prp[
  $cal(A) = {"finite disjoint unions of h-intervals"}$ is an algebra, and
  $
  cases(
    mu_0(diameter) = 0,
    mu_0(union.sq.big_1^infinity (a_j, b_j]) := sum_1^infinity (F(b_j) - F(a_j))
  )
  $
  defines a premeasure on $cal(A)$.
]

Since $sigma(cal(A)) = cal(B)_(RR^1)$, we can apply the _Hahn extension_ to extend $mu_0$ to a unique Borel measure on $RR$, called the Lebesgue-Stieltjes measure $mu_F$.

#prp[
  $mu_F = mu_G <=> F = G + "constant"$.
]

We write $mu_F equiv mu$ and let $cal(M)_mu := {mu^* "-measurable subsets"}$, which is the domain of $macron(mu)$. Then $forall E in cal(M)_mu$,
$
mu(E) = inf{sum_1^infinity mu((a_j, b_j]) : E subset union.big_1^infinity (a_j, b_j]}.
$

#lem[
  $
  mu(E) = inf{sum_1^infinity mu((a_j, b_j)) : E subset union.big_1^infinity (a_j, b_j)}
  $
]

#prf[
  Suppose $E subset union.big_1^infinity (a_j, b_j)$. Since $(a_j, b_j) = union.big_1^infinity (a_j, b_j - 1/n] =: union.sq.big_(j, n = 1)^infinity I_(j, n)$ for some disjoint h-intervals $I_(j, n)$, we get
  $
  mu(E) <= mu(union.sq.big_(j, n = 1)^infinity I_(j, n)) = mu(union.big_1^infinity (a_j, b_j)).
  $
  On the other hand, by the definition of $mu(E)$, $forall epsilon > 0, exists {(a_j, b_j]}_1^infinity$, s.t.
  $
  mu(E) >= sum_1^infinity mu((a_j, b_j]) - epsilon,
  $
  where $mu((a_j, b_j]) = F(b_j) - F(a_j)$. Since $F$ is right-continuous, $exists delta_j > 0$, s.t.
  $
  F(b_j + delta_j) - F(b_j) < epsilon/2^j,\
  => mu((b_j, b_j + delta_j)) < epsilon/2^j .
$
Then $mu((a_j, b_j + delta_j)) <= mu((a_j, b_j]) + epsilon/2^j$, so
$
mu(E) >= sum_1^infinity mu((a_j, b_j]) - epsilon >= sum_1^infinity mu((a_j, b_j + delta_j)) - 2epsilon
$
where $E subset union.big_1^infinity (a_j, b_j + delta_j)$. Sending $epsilon -> 0$, the result follows.
]

Let $mu_F :=$ the (unique) measure on $sigma({"h-intervals"})$ obtained via the Hahn extension.

Convention: Denote by $mu_F$ also its own completion.

Recall that
$
mu(E) = inf{sum_(j=1)^(infinity) mu((a_j, b_j)): E subset union.big_(j=1)^(infinity) (a_j, b_j)}, quad forall E in cal(M)_mu,
$
and the fact that
$
U subset RR "is open" <=> U = union.sq.big_(j=1)^(infinity) I_j,
$
where ${I_j}$ are disjoint open intervals. We may deduce that $mu := mu_F$, the Lebesgue-Stieltjes measure, is regular.

#thm[
  Let $E in cal(M)_mu$ be arbitrary, then
  $
  mu(E) &= inf{mu(O): O "is open", O supset E} quad & "(outer regular)" \
  &= sup{mu(K): K "is compact", K subset E} quad & "(inner regular)"
  $
]

The proof is left as an exercise.

For sets $A, B$, $A Delta B := (A \\ B) union (B \\ A)$. In fact, (Borel set $E in cal(B)_RR$) $Delta$ ("good set") = null set.

#def[
  A $G_delta$-set is a countable intersection of open sets. An $F_sigma$-set is a countable union of closed sets.
]

"Every Borel set on $RR$ is nearly a (finite union of) open intervals." -- _Littlewood's First Principle_

#thm[
  Let $E subset RR$, TFAE (the following are equivalent):
  1. $E in cal(M)_mu$,
  2. $E = V \\ N_1$ with $V$ a $G_delta$-set, $mu(N_1) = 0$,
  3. $E = H union N_2$ with $H$ an $F_sigma$-set, $mu(N_2) = 0$.
]

#prf[
  2, 3 $=>$ 1: Since $mu$ is complete on $cal(M)_mu$ and $G_delta$-, $F_sigma$-sets are Borel sets, $E in cal(M)_mu$.

  1 $=>$ 2 & 3: By regularity of $mu$, $forall E in cal(M)_mu, forall j in NN, exists$ open set $O_j subset RR$, compact set $K_j subset RR$, s.t.
  $
  cases(
    K_j subset E subset O_j \
    mu(O_j) - 2^(-j) <= mu(E) <= mu(K_j) + 2^(-j) quad "if" mu(E) < infinity .
)
$
Then take $V = sect.big_(j=1)^(infinity) O_j, H = union.big_(j=1)^(infinity) K_j$.

TODO: the general case: $mu(E) = infinity$.
]

#def[
  When $F$ is the identity mapping, $mu_F = cal(L)^1$ is the Lebesgue measure on $RR$. Its domain $cal(M)_mu$ is the set of Lebesgue measurable sets.
]

In fact, we have $cal(M)_mu supset cal(B)_RR$, and the inclusion is strict.

Concerning Lebesgue measurability, here are some key (pathological) examples.

#exm[
  There are open, dense subsets of $(0, 1)$ of arbitrarily small $cal(L)^1$-measure. In other words, topologically big $arrow.r.double.not$ measure-theoretically big.
]

A specific example is as follows.

#exm[
  Let $q_1, q_2, ...$ be an enumeration of $QQ sect (0, 1)$. Fix any $epsilon$, and let
  $
  O = union.big_(j=1)^(infinity) (q_j - epsilon/2^j, q_j + epsilon/2^j)
  $
  then
  $
  cal(L)^1(O) <= sum_(j=1)^(infinity) mu((q_j - epsilon/2^j, q_j + epsilon/2^j)) <= 2 epsilon.
  $
]
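For completeness, here is the geometric-series computation behind the $2 epsilon$ bound in the last example. Note that dyadic radii of size $epsilon/2^j$ around each rational are essential: radii of size $epsilon/(2 j)$ would instead give the divergent harmonic series $sum epsilon/j$.
$
sum_(j=1)^(infinity) cal(L)^1((q_j - epsilon/2^j, q_j + epsilon/2^j)) = sum_(j=1)^(infinity) (2 epsilon)/(2^j) = 2 epsilon sum_(j=1)^(infinity) 2^(-j) = 2 epsilon.
$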
https://github.com/thanhdxuan/dacn-report
https://raw.githubusercontent.com/thanhdxuan/dacn-report/master/cns-report-02/contents/02-Kali.typ
typst
= Exploring Kali Linux and related tools

== What is Kali Linux?

Kali Linux is a Debian-based Linux distribution. Its goal is to gather security-testing tools in one operating system environment, making it more convenient for users to find software for penetration testing, hacking, attacking, and so on. Kali also provides packages that make installing and updating software fast, and it is compatible with many platforms, including phones and the cloud.

== Tool categories available on Kali Linux

- *Information*: Otrace, arping, auitomater, braa,..
- *Vulnerability*: bed, cisco-ocs, dhcpig, nmap, zmap,..
- *WebApps*: apache-users, burpsuite, fitmap, drib,..
- *Database*: bbqsql, jSQL Injection, sqlsus, sqlmap,...
- *Password*: <PASSWORD>, <PASSWORD>, <PASSWORD>, <PASSWORD>,..
- *Wireless*: aircrack-ng, bully, asleap, bluelog,..
- *Reversing*: clang, clang++, jad, javasnoop,..
- *Exploit*: armitage, searchsploit, termineter,..
- *Sniffing*: fiked, hamster, dsniff, darkstat,...
- *PostExploit*: backdor-factory, dbd, exe2hex,...
- *Forensics*: binwalk, affcat, dc3dd,...
- *Reporting*: cutycapt, maltego, faraday IDE,..
- *SETools*: u3-pwn, ghost phisher,...
- *Services*: beef start, beef stop,..

= Installing the Kali Linux virtual machine

== Installing VirtualBox

- *Step 01*: Download the installer from the VirtualBox homepage (#link("https://www.virtualbox.org/wiki/Downloads")):

#align(center)[
  #image("../images/virtualbox-download.png", width: 90%)
]

- *Step 02*: Run the downloaded installer and follow the instructions.

#align(center)[
  #image("../images/virtualbox-install-setup.png", width: 90%)
]

- *Step 03*: The result after installation:

#align(center)[
  #image("../images/virtualbox-install-successfully.png", width: 90%)
]

#pagebreak()

== Downloading and creating the Kali Linux virtual machine.
Steps and screenshots:

- *Step 01*: Download the installer from the Kali Linux homepage (#link("https://www.kali.org/get-kali/#kali-installer-images"))

#align(center)[
  #image("../images/kali-download.png", width: 90%)
]

- *Step 02*: Use VirtualBox to create a virtual machine from the downloaded image

Open *VirtualBox* $-->$ *_New_* $-->$ *name the virtual machine and select the image to create it from (the path to the downloaded .iso file)* $-->$ *_Next_*.

#align(center)[
  #image("../images/kali-install-step1.png", width: 90%)
]

#pagebreak()

- *Step 03*: Set parameters such as RAM, storage (disk size), number of cores, ...

#align(center)[
  #image("../images/kali-install-step2.png", width: 90%)
]

- *Step 04*: Start the virtual machine

Select the newly created VM $-->$ *_Start_*; the VM boots into the installer, where we configure settings such as *Languages, Keyboard, Datetime, Username, Password, Partition, ...*

- After installation is complete, the result looks like this:

#align(center)[
  #image("../images/kali-install-successfully.png", width: 90%)
]

#pagebreak()

= Gathering network information by scanning the network

== Using Nmap/Zenmap

*Nmap*

View the help information for `nmap`: ```sh nmap --help```

#show image: set align(center)
#image("../images/kali-nmap-info.png", width: 90%)

Check the IP address of the virtual machine: ```sh ifconfig```

#image("../images/kali-ifconfig.png", width: 90%)

$==>$ The IP address of the `Kali Linux` VM is `192.168.1.8`, and the network it is connected to is `192.168.1.0/24`. We therefore scan this network to find the active hosts.

```sh
nmap -sn 192.168.1.0/24
```

The result:

#image("../images/kali-scannetwork-result.png", width: 90%)

From the figure above, the following hosts are active: `192.168.1.1`, `192.168.1.5`, `192.168.1.6`, `192.168.1.8`, `192.168.1.108`.
Let's inspect an arbitrary host in detail with the following command:

```sh
nmap -A 192.168.1.108
```

Result:

#image("../images/kali-scannetwork-inspect01.png", width: 90%)
#image("../images/kali-scannetwork-inspect02.png", width: 90%)

From the two figures above, we can extract the following information about `192.168.1.108`:
- Open ports: `80/tcp`, `554/tcp`, `49152/tcp`
- The device is a camera
- ...

*Zenmap*

Like `nmap`, `zenmap` is a tool that provides a graphical interface for `nmap`; below are some screenshots of `zenmap`:

#image("../images/kali-zenmap01.png", width: 90%)
#image("../images/kali-zenmap02.png", width: 90%)

== Using Angry IP Scanner

*Installation*

```sh
# <package.deb> is downloaded from the AIS homepage
sudo dpkg -i <package.deb>
```

*Usage*

After installation, we can use it by selecting the IP range we want to scan.

#image("../images/kali-angscanner01.png", width: 90%)

== Assessing the severity of this type of attack

Using tools like the above against a website or server can reveal vulnerabilities such as: leaking the backend license/version, leaking database information (SQL injection), or leaking details about encryption and keys. Relying on those weaknesses, an attacker can plant malware, eavesdrop on traffic, and so on. For a personal machine, knowing the IP address lets an attacker profile the victim's behavior and information, and from there spoof the IP to impersonate the victim.

== Countermeasures

- Use firewalls and SDN (software-defined networking) to better control network traffic.
- For servers, avoid exposing license information, versions, installed packages, and database details, e.g. by using hashes and adding special conditions to queries.
- For end users, add extra authentication layers, restrict access to suspicious information and websites, block ads, etc.

= Eavesdropping on information and data

== Using Wireshark to capture and analyze packets.
Steps and screenshots:

*Wireshark*

The Wireshark start page lists the interfaces available for capture; we choose `eth0`, the interface corresponding to our machine's IP address.

#image("../images/kali-wireshark-start.png", width: 60%)

*Capture*

#image("../images/kali-wireshark-01.png", width: 80%)

Here we can see the packets traversing the network, with fields such as Source (the sender's IP), Destination (the receiver's IP), and Protocol; we can filter the results to make searching easier.

== Assessing the severity of this type of attack

Wireshark is commonly used in man-in-the-middle attacks, an extremely dangerous class of attack: the attacker captures our packets and can then analyze the important fields to read the messages and learn the encryption mechanisms, discovering the weaknesses and gaps on both the sending and the receiving side. If we use symmetric encryption and the key exchange does not meet strong security requirements, Wireshark can be used to capture the symmetric key, which can then be used for follow-up attacks such as replay attacks, spoofing, etc. If this is not detected in time, the target can be taken down, accessed without authorization, or have personal information leaked, causing considerable damage.

== Countermeasures

- Use public-key cryptography together with techniques such as hashing and digital signatures to hide messages and keep key material safe.
- Connect securely and avoid "free" public Wi-Fi; when visiting a website, check whether it has an SSL certificate, i.e. uses HTTPS.
- Use VPN encryption, creating a private tunnel/session for exchanging data.

= Installing a CentOS 7 server

== What is the CentOS operating system?

CentOS (Community Enterprise Operating System) is an open-source, completely free Linux distribution for enterprises. It is functionally compatible with Red Hat Enterprise Linux (RHEL).
CentOS not only helps enterprises build a server platform, it also provides an ideal environment for development work.

== Installing CentOS

_NOTE: CentOS Stream 8 is used here instead of 7 (this does not change the purpose of the lab)._

As with `Kali Linux`, we download the image, create a virtual machine in VirtualBox, and start it. On startup, the VM boots into the CentOS installer.

#image("../images/centos-install.png", width: 70%)
#image("../images/centos-install02.png", width: 70%)

The result after installation:

#image("../images/centos-install-successfully.png", width: 70%)

== Configuring CentOS and Kali Linux so they can "see" each other

To let CentOS and Kali Linux see each other, we put the two virtual machines on the same network. Open *VirtualBox* $-->$ *select the Kali VM* $-->$ *Settings* $-->$ *Network*. Under _Adapter 1_, choose _Attached to: Bridged Adapter_. Do the same for the CentOS VM.

Network information of the CentOS server after configuration:
- `inet 192.168.1.4/24`
- `brd 192.168.1.255`

#image("../images/centos-network.png", width: 70%)

We verify by pinging the CentOS server from Kali Linux:

```sh
ping 192.168.1.4
```

#image("../images/ping-success.png", width: 70%)

As we can see, the ping succeeds.

= Brute-force attack on the SSH service of the CentOS 7 server

== Using `hydra` on Kali Linux

Hydra is a powerful brute-force tool for quickly cracking system login passwords. We can use Hydra to iterate over a wordlist and brute-force a number of authentication services.
Imagine trying to guess passwords for a particular service by hand (SSH, a web application form, FTP, or SNMP): we can use Hydra instead to run through a password list, speeding up the process of identifying the correct password.

*Using `hydra`*

```sh
hydra -h
```

#image("../images/kali-hydra-info.png", width: 90%)

To use `hydra`, we need to determine:
- `target`: the IP address/URL of the server.
- `service`: a server runs many services, so we must determine which one to attack; each service corresponds to a different `port`.
- `login_info`: usernames and passwords (made up ourselves or taken from any source)

Some notable `option`s:
- `-l` <login> or `-L` <login.txt>: a single username, or a file containing usernames.
- `-p` <password> or `-P` <password.txt>: a single password, or a file containing passwords.
- `-t`: the number of parallel attack threads.

== Brute-forcing the SSH service of the CentOS 7 server with hydra and an existing dictionary

Create two files, `login.txt` and `password.txt`, with the contents shown in the figure, and launch the attack:

```sh
hydra -L login.txt -P password.txt ssh://192.168.1.4
```

#image("../images/kali-attack-01.png", width: 80%)

*Result:* The account `root` with password `2014486` was found.

== Generating a wordlist with crunch and brute-forcing the SSH service of the CentOS 7 server with hydra using the generated list

```sh
# crunch <min> <max> <charset> -t <pattern> -o <output_file>
# Assume we know for sure the username is root or admin
echo "root" >> login.txt
echo "admin" >> login.txt
# Use crunch to generate the password list
# Assume some information is already known (length = 7, ...) to save time
crunch 7 7 012468 -t 2014@@@ -o password.txt
```

#image("../images/kali-hydra-result.png", width: 80%)

It took `hydra` over 3 minutes and more than 300 attempts to find the password.
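As a rough sanity check on that attempt count, the keyspace of the crunch pattern can be computed directly: `2014@@@` fixes four characters and leaves three `@` positions, each drawn from the six-character set `012468`. The numbers below are plain arithmetic, not hydra's exact retry count, which also depends on the order in which credentials are tried:

```sh
# 6 charset characters, 3 free "@" positions in the pattern 2014@@@
charset_len=6
passwords=$((charset_len * charset_len * charset_len))   # 6^3 = 216
attempts=$((passwords * 2))                              # 2 usernames: root, admin
echo "$passwords candidate passwords, $attempts total login attempts"
```

With at most 432 login attempts in the worst case, finding the password after a few hundred tries is exactly what we would expect.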
== Assessing the severity of this type of attack

With this technique, an attacker can mount ping-of-death attacks and gain access to the server; from there they can impersonate the server and attack other servers, leading to DDoS attacks. If it is not stopped in time, your system may go down, or information may be leaked and identities impersonated.

Some ways to reduce the risk of being attacked: for servers and personal machines, use strong, hard-to-guess (long) passwords and avoid including sensitive, easily guessed information such as birth dates or phone numbers; install tools that limit the number of allowed SSH attempts to your machine; use additional authentication factors such as CAPTCHAs or face recognition; and avoid exposing your IP address.

= Mitigating brute-force attacks

== `fail2ban`

Fail2ban is rule-based software that works by monitoring the system's logs; on that basis, you can detect and block attacks on your server early. More specifically, it focuses on protecting SSH and pushing back the risk of brute-force attacks, and rules and other parameters can be set up for any service that writes a log file. In addition, fail2ban can parse log files and search for patterns, issue bans for a configured period of time, and it supports databases, among other things.

== Installing and configuring fail2ban for the SSH service on the CentOS server

*Installation*

```sh
sudo yum install fail2ban
```

*Configuration*

```sh
vi /etc/fail2ban/jail.local
```

```config
[sshd]
enabled = true
filter = sshd
action = iptables[name=SSH, port=ssh, protocol=tcp]
logpath = /var/log/secure.log
maxretry = 3
bantime = 3600
```

*Start the `fail2ban` service*

```sh
systemctl enable fail2ban
systemctl start fail2ban
```

*Check whether `fail2ban` is running*

```sh
systemctl status fail2ban
```

#image("../images/centos-status.png", width: 90%)

So `fail2ban` has been configured successfully.
== Brute-forcing the SSH service of the CentOS 7 server with hydra again, and the result:

We attack again with the earlier command.

#image("../images/kali-reattack.png", width: 90%)

Check the IPs banned by fail2ban:

```sh
fail2ban-client status sshd
```

#image("../images/centos-ban.png", width: 90%)

We see one banned IP address, `192.168.1.8`: our own Kali Linux machine.

Check the log:

#image("../images/centos-ban-log.png", width: 90%)

So `Kali Linux` (`192.168.1.8`) has been banned successfully.
https://github.com/soul667/typst
https://raw.githubusercontent.com/soul667/typst/main/PPT/typst-slides-fudan/themes/polylux/themes/metropolis.typ
typst
// This theme is inspired by https://github.com/matze/mtheme // The polylux-port was performed by https://github.com/Enivex // Consider using: // #set text(font: "Fira Sans", weight: "light", size: 20pt) // #show math.equation: set text(font: "Fira Math") // #set strong(delta: 100) // #set par(justify: true) #import "../logic.typ" #import "../helpers.typ" #let m-dark-teal = rgb("#23373b") #let m-light-brown = rgb("#eb811b") #let m-lighter-brown = rgb("#d6c6b7") #let m-extra-light-gray = white.darken(2%) #let m-footer = state("m-footer", []) #let m-cell = block.with( width: 100%, height: 100%, above: 0pt, below: 0pt, breakable: false ) #let m-progress-bar = helpers.polylux-progress( ratio => { grid( columns: (ratio * 100%, 1fr), m-cell(fill: m-light-brown), m-cell(fill: m-lighter-brown) ) }) #let metropolis-theme( aspect-ratio: "16-9", footer: [], body ) = { set page( paper: "presentation-" + aspect-ratio, margin: 0em, header: none, footer: none, ) m-footer.update(footer) body } #let title-slide( title: [], subtitle: none, author: none, date: none, extra: none, ) = { let content = { set text(fill: m-dark-teal) set align(horizon) block(width: 100%, inset: 2em, { text(size: 1.3em, strong(title)) if subtitle != none { linebreak() text(size: 0.9em, subtitle) } line(length: 100%, stroke: .05em + m-light-brown) set text(size: .8em) if author != none { block(spacing: 1em, author) } if date != none { block(spacing: 1em, date) } set text(size: .8em) if extra != none { block(spacing: 1em, extra) } }) } logic.polylux-slide(content) } #let slide(title: none, body) = { let header = { set align(top) if title != none { show: m-cell.with(fill: m-dark-teal, inset: 1em) set align(horizon) set text(fill: m-extra-light-gray, size: 1.2em) strong(title) } else { [] } } let footer = { set text(size: 0.8em) show: pad.with(.5em) set align(bottom) text(fill: m-dark-teal.lighten(40%), m-footer.display()) h(1fr) text(fill: m-dark-teal, logic.logical-slide.display()) } set page( header: header, 
footer: footer, margin: (top: 3em, bottom: 1em), fill: m-extra-light-gray, ) let content = { show: align.with(horizon) show: pad.with(2em) set text(fill: m-dark-teal) body } logic.polylux-slide(content) } #let new-section-slide(name) = { let content = { helpers.register-section(name) set align(horizon) show: pad.with(20%) set text(size: 1.5em) name block(height: 2pt, width: 100%, spacing: 0pt, m-progress-bar) } logic.polylux-slide(content) } #let focus-slide(body) = { set page(fill: m-dark-teal, margin: 2em) set text(fill: m-extra-light-gray, size: 1.5em) logic.polylux-slide(align(horizon + center, body)) } #let alert = text.with(fill: m-light-brown) #let metropolis-outline = helpers.polylux-outline(enum-args: (tight: false,))