repo | file | language | license | content
---|---|---|---|---
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/meta/footnote-invariant-00.typ | typst | Other | #set page(height: 120pt)
#lorem(13)
There #footnote(lorem(20))
|
https://github.com/leo1oel/CSAPP | https://raw.githubusercontent.com/leo1oel/CSAPP/main/Homework/Homework2.typ | typst | #import "template.typ": *
#import "@preview/codly:0.2.0": *
#show: project.with(
title: "Homework Set 2 - Practical Skills and Concurrency",
authors: (
"<NAME> 2023010747",
)
)
#show: codly-init.with()
#codly(languages: (
c: (name: "C", icon: none, color: none),
))
#show table: set align(center)
#show image: set align(center)
= Problem 1
Answer:
(1) `/^[\w.-]+@(mails?\.)?tsinghua\.edu\.cn$/`
(2) `/^[\w.-]+@(\w+\.)?tsinghua\.(edu|org)\.cn$/`
(3) `/^[\w.-]+@(?!mails?)(\w+)\.tsinghua\.edu\.cn$/`
(4) `sort departments.txt | uniq | wc -l`
= Problem 2
Answer:
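For reference, here is a minimal single-threaded C sketch of the enqueue sequence that the scenarios interleave. The `Node` struct, the sentinel `head`, and `queue_init` are assumptions added to make it runnable; only the three-step enqueue body follows the problem statement.

```c
#include <stdlib.h>

typedef struct Node { int val; struct Node *next; } Node;

static Node *head, *tail;

// head is a sentinel node; tail always points at the last node.
void queue_init(void) {
    head = malloc(sizeof(Node));
    head->next = NULL;
    tail = head;
}

// The enqueue sequence discussed below; each scenario is a different
// point at which a context switch between threads A and B occurs.
void enqueue(int v) {
    Node *newNode = malloc(sizeof(Node)); // newNode(X)
    newNode->val = v;
    newNode->next = NULL;
    Node *oldTail = tail;                 // oldTail = tail
    tail = newNode;                       // tail = newNode(X)
    oldTail->next = newNode;              // oldTail.next = newNode(X)
}
```

Run single-threaded, the three steps always leave the list consistent; the scenarios examine what happens when another thread runs between them.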
Scenario 1:
After $A$ creates `newNode(A)`, execution switches to $B$, and $B$ creates its own `newNode(B)`. $B$ sets `tail` to `newNode(B)` and links `oldTail.next` to its node.
After $B$ finishes, $A$ sets `oldTail = tail`, so $A$'s `oldTail` is `newNode(B)`. $A$ then links `oldTail.next` to its `newNode(A)`. As a result, `head` points to `newNode(B)` and `newNode(B)` points to `newNode(A)`, so there is no problem.
Scenario 2:
After $A$ sets `oldTail = tail`, execution switches to $B$. $B$ then creates its own `newNode(B)`, sets `tail` to `newNode(B)`, and links `oldTail.next` to its node.
Back in $A$, `oldTail` is the same node that $B$ used as its `oldTail`. $A$ links `oldTail.next` to its `newNode(A)`, so `newNode(A)` overwrites the link to `newNode(B)` and `newNode(B)` is lost.
Scenario 3:
After $A$ sets `tail = newNode(A)`, execution switches to $B$, and $B$ creates its own `newNode(B)`. $B$ sets `oldTail` to `tail`, which is `newNode(A)`, then sets `tail` to `newNode(B)` and links `oldTail.next` to its node, so `newNode(A)` points to `newNode(B)`.
After $B$ finishes, $A$ sets `oldTail.next = newNode(A)`. This `oldTail` is $A$'s `oldTail`, which is `head`. So `head` points to `newNode(A)` and `newNode(A)` points to `newNode(B)`; hence there is no problem.
= Problem 3
(1) No need to change the code. Because `lock.Acquire()` ensures that only one thread can modify the queue at a time (mutual exclusion).
(2)
```c
Lock lock;
Condition dataready;
Condition notFull;
int max_size;
Queue queue;
AddToQueue(item) {
    lock.Acquire();            // Get Lock
    while (queue.size() == max_size) {
        notFull.wait(&lock);   // If full, sleep
    }
    queue.enqueue(item);       // Add item
    dataready.signal();        // Signal any waiters
    lock.Release();            // Release Lock
}

RemoveFromQueue() {
    lock.Acquire();            // Get Lock
    while (queue.isEmpty()) {
        dataready.wait(&lock); // If nothing, sleep
    }
    item = queue.dequeue();    // Get next item
    notFull.signal();          // Signal any waiting producers
    lock.Release();            // Release Lock
    return(item);
}
```
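The pseudocode above can be made concrete; here is a hedged POSIX-threads sketch of the bounded buffer, assuming a fixed-size ring buffer of `int` (capacity 4 is arbitrary) in place of the abstract `Queue`.

```c
#include <pthread.h>

#define MAX_SIZE 4

static int buf[MAX_SIZE];
static int count = 0;   // items currently in the queue
static int front = 0;   // index of the oldest item
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t dataready = PTHREAD_COND_INITIALIZER;
static pthread_cond_t notFull = PTHREAD_COND_INITIALIZER;

void AddToQueue(int item) {
    pthread_mutex_lock(&lock);              // Get Lock
    while (count == MAX_SIZE)               // If full, sleep
        pthread_cond_wait(&notFull, &lock);
    buf[(front + count) % MAX_SIZE] = item; // Add item
    count++;
    pthread_cond_signal(&dataready);        // Signal any waiters
    pthread_mutex_unlock(&lock);            // Release Lock
}

int RemoveFromQueue(void) {
    pthread_mutex_lock(&lock);              // Get Lock
    while (count == 0)                      // If nothing, sleep
        pthread_cond_wait(&dataready, &lock);
    int item = buf[front];                  // Get next item
    front = (front + 1) % MAX_SIZE;
    count--;
    pthread_cond_signal(&notFull);          // A slot freed for producers
    pthread_mutex_unlock(&lock);            // Release Lock
    return item;
}
```

The `while` loops (rather than `if`) match the Mesa-style condition-variable semantics assumed in the pseudocode: a woken thread must re-check the predicate.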
(3)
```c
ReadFromQueue() {
    lock.Acquire();            // Get Lock
    while (queue.isEmpty()) {
        dataready.wait(&lock); // If nothing, sleep
    }
    item = queue.read();       // Peek without removing
    lock.Release();            // Release Lock
    return(item);
}
```
= Problem 4
```c
Semaphore lock = 1;
int owner = -1;

void RLock() {
    int cur_id = getMyTID();
    if (owner != cur_id) {
        lock.P();
        owner = cur_id;
    }
}

void RUnLock() {
    int cur_id = getMyTID();
    if (owner == cur_id) {
        owner = -1;
        lock.V();
    }
}
```
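A runnable POSIX sketch of this answer follows; since `pthread_t` is opaque, a `held` flag stands in for the `owner == -1` sentinel, and the unsynchronised owner check mirrors the pseudocode (a thread only ever compares against its own identity).

```c
#include <semaphore.h>
#include <pthread.h>

static sem_t lock;
static pthread_t owner;
static int held = 0;  // replaces the "owner == -1" sentinel

void rlock_init(void) {
    sem_init(&lock, 0, 1);
}

void RLock(void) {
    pthread_t me = pthread_self();
    if (!held || !pthread_equal(owner, me)) { // not already ours
        sem_wait(&lock);  // P()
        owner = me;
        held = 1;
    }
}

void RUnLock(void) {
    if (held && pthread_equal(owner, pthread_self())) {
        held = 0;         // owner = -1 in the pseudocode
        sem_post(&lock);  // V()
    }
}
```

A second `RLock()` by the owning thread is a no-op, so nested acquire/release pairs by the same thread do not deadlock.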
= Problem 5
(1)
```c
Semaphore barberReady = 0;
Semaphore accessWaitRoomSeats = 1;
Semaphore customerReady = 0;
int numberOfFreeWaitRoomSeats = N;

void Barber() {
    while (true) {
        customerReady.P();               // Sleep until a customer arrives
        accessWaitRoomSeats.P();
        numberOfFreeWaitRoomSeats += 1;  // One waiting seat is freed
        accessWaitRoomSeats.V();
        cutHair();                       // Cut customer's hair
        barberReady.V();
    }
}

void Customer() {
    accessWaitRoomSeats.P();
    if (numberOfFreeWaitRoomSeats > 0) {
        numberOfFreeWaitRoomSeats -= 1;  // Take a waiting seat
        customerReady.V();               // Wake the barber if asleep
        accessWaitRoomSeats.V();
        barberReady.P();                 // Wait for the barber
        getHairCut();                    // Customer gets haircut
    } else {
        accessWaitRoomSeats.V();
        leaveWithoutHaircut();           // No free seat: leave
    }
}
```
(2)
`accessWaitRoomSeats` is used for mutual exclusion, while `customerReady` and `barberReady` are used for scheduling constraints.
(3) Customer $A_0$ is having a haircut when customer $B$, who has a lot of hair, arrives. $B$ starts waiting, but every time $A_i$ is nearly finished, $A_(i+1)$ arrives with no hair at all and is served next. Thus, $B$ will wait forever.
Deadlock will not happen because the barber is always able to cut hair for the next customer.
Starvation becomes less likely if the barber randomly selects the next customer to serve.
We can also order the waiting room to avoid starvation altogether: a FIFO (First In, First Out) queueing discipline is effective, meaning the barber always picks the customer who arrived earliest. |
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/meta/document_03.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
//
// // This, too.
// // Error: 23-29 expected string, found integer
// #set document(author: (123,))
// What's up? |
https://github.com/nasyxx/lab-weekly-report | https://raw.githubusercontent.com/nasyxx/lab-weekly-report/master/example.typ | typst | // other packages in https://typst.app/docs/packages/
#import "@preview/chordx:0.2.0": *
#import "smwr.typ": smwr
#show: body => smwr("Nasy", datetime(year: 2024, month: 1, day: 5), body)
// abstract
This week's main tasks include:
- #lorem(10)
- #lorem(15)
- #lorem(5)
- Total time: 40 hours
= #lorem(10)
#lorem(100)
#let chart-chord = new-chart-chords(scale: 1.5)
#let chart-chord-round = new-chart-chords(style: "round", scale: 1.5)
// Style "normal"
#chart-chord(tabs: "x32o1o", fingers: "n32n1n")[C]
#chart-chord(tabs: "ooo3", fingers: "ooo3")[C]
// Style "round"
#chart-chord-round(tabs: "xn332n", fingers: "o13421", fret-number: 3, capos: "115")[Cm]
#chart-chord-round(tabs: "onnn", fingers: "n111", capos: "313")[Cm]
= #lorem(15)
#lorem(90)
#let piano-chord = new-piano-chords(scale: 1.5)
#let piano-chord-round = new-piano-chords(scale: 1.5, style: "round")
#piano-chord(layout: "F", keys: "<KEY>", color: blue)[B]
#piano-chord-round(layout: "F", keys: "<KEY>", color: red)[B]
= #lorem(5)
#lorem(120)
#let chord = new-single-chords(style: "italic", weight: "semibold")
#chord[Jingle][G][2] bells, jingle bells, jingle #chord[all][C][2] the #chord[way!][G][2] \
#chord[Oh][C][] what fun it #chord[is][G][] to ride \
In a #chord[one-horse][A7][2] open #chord[sleigh,][D7][3] hey!
= Time
// #image("./time.png")
- Total: 40 hours
- xxx: 10 hours
- xxx: 30 hours
= Future Plan
== Short term
#lorem(105)
== Long term
#lorem(40)
|
|
https://github.com/Amelia-Mowers/typst-tabut | https://raw.githubusercontent.com/Amelia-Mowers/typst-tabut/main/doc/example-snippets/tablex.typ | typst | MIT License | #import "@preview/tabut:<<VERSION>>": tabut-cells
#import "usd.typ": usd
#import "example-data/supplies.typ": supplies
#import "@preview/tablex:0.0.8": tablex, rowspanx, colspanx
#tablex(
auto-vlines: false,
header-rows: 2,
/* --- header --- */
rowspanx(2)[*Name*], colspanx(2)[*Price*], (), rowspanx(2)[*Quantity*],
(), [*Base*], [*W/Tax*], (),
/* -------------- */
..tabut-cells(
supplies,
(
(header: [], func: r => r.name),
(header: [], func: r => usd(r.price)),
(header: [], func: r => usd(r.price * 1.3)),
(header: [], func: r => r.quantity),
),
headers: false
)
) |
https://github.com/jimipj/kamk-typst-templates | https://raw.githubusercontent.com/jimipj/kamk-typst-templates/main/README.md | markdown | MIT License | # KAMK Typst Templates
Typst templates trying to copy the appearance of KAMK's official Word templates.
## Fonts
If you are **not** running Windows, you will have to install the [Calibri](https://wiki.debian.org/ppviewerFonts) font, or alternatively the [Carlito](https://fonts.google.com/specimen/Carlito) font.
Carlito is possibly already [packaged](https://pkgs.org/search/?q=carlito) for your system.
If you have correctly installed one of the fonts, you can ignore Typst's warning about the other one being missing.
## Project Plan
### Usage
Copy `comments.tmTheme`, `kamk-logo.png` and `project-plan.typ` to the directory of your Typst document,\
and then add something like the following piece of code at the start of it:
```typst
#import "project-plan.typ": project-plan
#show: doc => project-plan(
date: auto,
title: "Foobar",
doc
)
```
### Parameters
| Parameter | Type | Default | Description |
| ----------- | -------------------- | ----------------| ----------------------- |
| `author` | Array, String | `()` | |
| `course` | String | `"Opintojakso"` | |
| `cover` | String | `""` | Path to cover image. |
| `date` | Auto, Datetime, None | `none` | |
| `degree` | String | `"Tutkinto"` | |
| `keywords` | Array, String | `()` | |
| `monospace` | String | `"Monospace"` | Font used for raw text. |
| `title` | None, String | `none` | |
## License
MIT
|
https://github.com/catppuccin/typst | https://raw.githubusercontent.com/catppuccin/typst/main/src/lib.typ | typst | MIT License | #import "version.typ": version
#import "catppuccin.typ": catppuccin, themes, get-palette
#import "tidy/show-module.typ": show-module
|
https://github.com/Error-418-SWE/Documenti | https://raw.githubusercontent.com/Error-418-SWE/Documenti/src/3%20-%20PB/Documentazione%20interna/Verbali/24-04-21/24-04-21.typ | typst | #import "/template.typ": *
#show: project.with(
date: "21/04/24",
subTitle: "Retrospective and planning meeting",
docType: "verbale",
authors: (
"<NAME>",
),
timeStart: "15:00",
timeEnd: "16:00",
);
= Agenda
- Assessment of overall progress;
- Retrospective analysis;
- Application to take the second Product Baseline review;
- Planning.
= Assessment of overall progress
Sprint 24 closes with all planned tasks completed, matching what was laid out in the Sprint Backlog. The educational project is approaching its conclusion, as shown by the substantial reduction in the work left to do.
== Minutes
During the document review process, spelling errors were corrected in the following minutes:
- Minutes of 24/02/18;
- Minutes of 24/02/22;
- Minutes of 24/03/06;
- Minutes of 24/03/17.
== #st
The #cardin expressed a positive judgment of the #st_v document, deeming it "in line with expectations". This result allows the group to proceed with its application for the second Product Baseline review.
== #ndp
The #ndp_v document was updated in the _Measurement Process_ section, concerning the metric-tracking system.
To make the set of metrics in use clear, it was necessary to define a tracking system that assigns each metric a unique identifier code.
The review of the document has been completed.
== #pdq
The #pdq document underwent a restructuring of its content, organising the metrics and their values into separate JSON files. In this way, each metric is rendered automatically by functions implemented in Typst within the document.
This approach makes the presentation of metrics uniform throughout the document, ensuring greater stylistic consistency and easier maintenance.
The section on acceptance tests has also been completed.
== #glo
The #glo_v document saw the following terms updated:
- "Test di integrazione", now also recognised when written as "integration test";
- "Test di sistema", now also recognised when written as "system test".
== #man
Given the different nature and audience of the #man document, the group wrote a dedicated glossary for it.
= Retrospective analysis
Sprint 24 ended with all of its objectives achieved. The Sprint's positive performance is supported by the main metrics reported in the #pdq_v:
- the project CPI stands at 1.01, indicating positive project progress, above the acceptability threshold;
- the EAC rose from €12,968.71 to €12,969.18. The increase is negligible and does not affect the planned budget. The value of this metric is by now consolidated and, given the limited work remaining, no significant variations are expected;
- $"SEV" = "SPV"$, as also shown by the CPI metric, indicating positive progress of a project nearing its conclusion.
== Keep doing
The group, now close to the end of the project, has reached, as also noted in previous retrospectives, a satisfactory capacity for asynchronous work. The _way of working_ is consolidated and requires no further changes.
== Improvements
No issues were encountered during Sprint 24.
= Application to take the second Product Baseline review
On 17/04/2024 the group received a positive outcome for the first Product Baseline review, held with the #cardin. The group therefore intends to apply for the second review with the #vardanega by Wednesday 24/04/2024.
= Planning
#let table-json(data) = {
  let keys = data.at(0).keys()
  table(
    align: left,
    columns: keys.len(),
    ..keys,
    ..data.map(
      row => keys.map(
        key => row.at(key, default: [n/a])
      )
    ).flatten()
  )
}
#figure(caption: [Tasks planned for Sprint 25.],
table-json(json("tasks.json"))
) |
|
https://github.com/Dherse/masterproef | https://raw.githubusercontent.com/Dherse/masterproef/main/masterproef/parts/5_examples.typ | typst | #import "../ugent-template.typ": *
= Examples of photonic circuit programming <sec_examples>
Several different application areas were mentioned in
@photonic_processors_use_cases, and in this section, some of these areas will be
demonstrated using the @phos programming language. The examples are meant to be
mockups of real applications and not complete implementations, focusing solely
on the @phos part of their design. These examples will be explored in different
levels of detail depending on the complexity of the application and how much of
the capabilities of @phos they demonstrate. The full code, comments, and type
annotations are available starting at @anx_beam_forming.
== Beamforming
Optical beamforming is being used to build new solid-state LiDAR systems
@xu_fully_2022. These LiDARs need precise phase matching between all of the
produced optical signals, as any imprecision over the phase and delay will
negatively impact the precision of the overall system. Conveniently, @phos
offers an easy way to ensure that signals are phase and delay matched: the
`constrain` synthesisable block. It imposes a differential constraint over a
number of signals, in this case, as will be visible in @lst_beam_forming, it is
used to enforce equal phase into the modulators, and equal delay when going back
towards the outputs.
=== Theoretical background
Beam forming allows a system to control the directionality of a signal emitted
by its antennae. It requires multiple antennae at the transmitter. The
transmitter then controls the phases of the emitted signals to create
constructive interference in the desired direction of interest and destructive
interference in the others. This allows the transmitter to focus its signal in a
specific direction. This has several advantages: it allows a transmitter to reach longer distances at the same transmitted power, it can decrease interference with other transmitters, and it can increase the directional precision of a system, such as a LiDAR @van_veen_beamforming_1988 @zou_analog_2017.
#ufigure(
outline: [ Demonstration emission pattern of a beamforming system. ],
caption: [
Demonstration emission pattern of a beamforming system, showing the main lobe
and side lobes.
],
image(
"../assets/drawio/beam_forming_emission.png",
width: 60%,
alt: "Shows an ellipse with long length, representing the main lobe, followed by several smaller ellipses at its sides, representing the side lobes.",
),
)<fig_beam_forming_emission>
#pagebreak(weak: true)
=== PHÔS implementation
The @phos implementation relies on several key features of the @phos language.
It utilises the `split` function which is used to split a signal based on a list
of weights. These weights are provided by the `splat` function which creates a
list of $n$ elements all of the same value. Those signals are then constrained
to have the same phase before being phase modulated using the `modulate`
function. The resulting signals are then constrained to have the same delay
before being sent to the outputs. The code for this example is available at
@lst_beam_forming, with the fully commented code being available in
@anx_beam_forming.
==== Constrain
`constrain` is a synthesisable block that allows the user to create constraints
between signals. It can be used to impose one of two constraints, either a phase
constraint, matching the phases of the different signals, or a delay constraint,
matching the delays of the different signals. In this case, the phase constraint
is used to ensure that all of the signals have the same phase when reaching a
certain component, and the delay constraint is used to ensure that all of the
signals have the same delay. Recalling @sec_constraints, these constraints are
different due to the large order of magnitude difference between the frequency
of light and the frequency of modulated content; a phase shift on the light will
have a negligible impact on the modulated content, but a delay shift on the
light will have a large impact on the modulated content.
==== Modulate
`modulate` is a synthesisable block used to modulate an optical signal; it can perform either phase modulation or amplitude modulation. In the case of
amplitude modulation, the synthesis stage may create a @mzi to perform a
phase-to-amplitude conversion. In this example, it is used to modulate the
external phase shifts onto the optical signals.
==== Partial function
`set` allows the creation of a partial function where parts of the arguments
have already been filled. In this case, it is used to create a partial function
of `modulate` where the type of modulation is already set to phase modulation.
==== Zipping & mapping
`zip` allows two lists to be zipped together, creating a list of tuples, where
each tuple contains the elements of the two lists at the same index. `map`
allows a function to be applied to each element of a list. In this case, `zip`
is used to zip the list of phase shifts with the list of optical signals,
creating a list of tuples where each tuple contains an optical signal and a
phase shift. This list is then mapped to a function that sets the phase shift of
the optical signal to the phase shift of the tuple.
#ufigure(
outline: [ @phos implementation of a configurable optical beamforming system. ],
caption: [
@phos implementation of a configurable optical beamforming system. A fully
commented version is available in @anx_beam_forming.
],
)[
```phos
syn beam_forming(
input: optical,
phase_shifts: (electrical...),
) -> (optical...) {
input
|> split(splat(1.0, phase_shifts.len()))
|> constrain(d_phase = 0)
|> zip(phase_shifts)
|> map(set modulate(type_: Modulation::Phase))
|> constrain(d_delay = 0)
}
```
]<lst_beam_forming>
=== Results
The time-domain simulation can easily be performed using the constraint solver,
yielding the results shown in @fig_beam_forming. In this simulation, only four
channels were simulated, each with a time-dependent phase shift at a frequency
of 1 MHz. In the first simulation #link(<fig_beam_forming>)[(a)], the phases follow @eq_phase_shift, where $k$ refers to the channel number starting at zero; in the second simulation #link(<fig_beam_forming>)[(b)], they follow @eq_phase_shift_2. The simulation shows that the phase shifts are
correctly applied to the optical signals, and that the optical signals are
correctly constrained to have the same phase and delay.
$
phi_k (t) = (k dot pi)/3 + 2 pi dot 1 "MHz" dot t
$<eq_phase_shift>
$
phi_k (t) = sin((k dot pi)/3 + 2 pi dot 1 "MHz" dot t)
$<eq_phase_shift_2>
#ufigure(
outline: [ Simulation results of the beamforming system. ],
caption: [
Simulation results of the beamforming system, showing the time-dependent phase
shifts applied to the optical signals.
],
kind: image,
table(
columns: (auto, 1fr),
stroke: none,
align: center + horizon,
[(a)],
image(
"../assets/beam_forming.png",
width: 100%,
alt: "Shows a graph, the X-axis is the time in µs, of which 4 µs are shown, the x axis is the phase in radians, it shows all four signals being offset by pi/3 radians wrt. each other, and the phase shift being applied to each signal.",
),
[(b)],
image(
"../assets/beam_forming_sine.png",
width: 100%,
alt: "Shows a graph, the X-axis is the time in µs, of which 4 µs are shown, the x axis is the phase in radians, it shows all four signals being offset by pi/3 radians wrt. each other, and the phase shift being applied to each signal.",
),
),
) <fig_beam_forming>
#pagebreak(weak: true)
== Coherent 16-QAM transmitter
In this next example, a simple 16-#gloss("qam", short: true) transmitter will be
demonstrated along with simulation results. The code for this example is
available at @anx_coherent_transmitter. This example will first cover the
theoretical background needed to understand the motivation for the example and
the measurements taken, followed by the transmitter's modulation aspects.
=== Theoretical background
#udefinition(
footer: [ Adapted from @ref_qam. ],
)[
*QAM* refers to *quadrature amplitude modulation*, a modulation scheme where the
information is encoded in both the amplitude and the phase of the signal. The
*16* refers to the number of symbols in the constellation, which is the number
of different values the signal can take. In this case, the data encoded is 4
bits per symbol for 16 possible values.
]
In telecommunications, and especially in high-speed communication, engineers
need to be able to transmit as much information as possible in a given bandwidth
while still maintaining good immunity to noise and other impairments. One way to
achieve these higher throughputs is by using more advanced modulation schemes,
state-of-the-art in photonic communication being 64-#gloss("qam", short: true) @ishimura_64-qam_2022.
In this example, however, the state of the art will not be reproduced; the focus will instead be on a simpler 16-#gloss("qam", short: true) modulation scheme, based on the work by _Talkhooncheh, <NAME>, et al._ @talkhooncheh_200gbs_2023.
Modulations are often visualised using two types of diagrams: so-called _eye diagrams_ which
show the transitions between symbols, and _constellation diagrams_ which show
the actual symbols after sampling. These two visualisations are used to measure
the quality of the received signal and to visualise any impairment it might have
suffered during transmission. Eye diagrams are built by overlaying many
transitions between symbols over one another, slowly building a statistical
representation of the signal. Constellation diagrams are built by sampling the
signal at a given rate and plotting its magnitude and phase in a complex plane.
The resulting plot is a point cloud that can be used to visualise the symbols
that were transmitted.
Finally, the measure that will be used to quantify the quality of the
transmitter is not the @ber: since the measurement is taken at the transmitter's output, the bit error rate would be zero. Instead, the @evm is used. The @evm is a measure of the difference between the ideal constellation
and the actual constellation and is defined as in @eq_evm, with $N$ the number
of samples, $I_"err"$ and $Q_"err"$ the error in the in-phase and quadrature
components of the constellation, $"EVM"_"%"$ the @evm in percentage, and $"EVM Normalization Factor"$ a normalisation factor that depends on the modulation scheme used; for 16-#gloss("qam", short: true), it is the maximum magnitude of the constellation @keysight_technologies_evm. A
visualisation of EVM can be found in @fig_evm. With this definition, one can see
that the @evm measures the average distance between the ideal constellation and
the actual constellation and should, therefore, be minimised.
$
"EVM"_"%" = sqrt(1 / N sum_(i = 0)^(N - 1)(I_"err" [i]^2 + Q_"err" [i]^2)) / "EVM Normalization Factor" dot 100%
$ <eq_evm>
#ufigure(
outline: [ Error vector magnitude -- reference plot ],
caption: [
Error vector magnitude -- reference plot, showing the reference and measured IQ
points, and the @evm vector of the sample.
],
image(
"../assets/drawio/evm.png",
width: 60%,
alt: "Shows a 2D cartesian plot with the X-axis labelled as I, the Y-axis labelled as Q, the vector of the ideal IQ point is shown, and the measure IQ point. The EVM vector is drawn between the two points.",
),
) <fig_evm>
=== PHÔS implementation
The circuit being built is shown in @fig_qam_mod, with its code in
@anx_coherent_transmitter. It consists of a laser source, which, in the @phos
code shown in @lst_modulation, is considered to be external to the device. The
light is then split into four parts, two of which are split into one-quarter of
the total light, while the remaining two each receive half of the total light.
Each signal is then modulated; on a real chip, this could be done using an electro-absorption modulator (EAM) or a #gloss("mzi")-based modulator. The signals are then phase-shifted to form the _I_ and _Q_ modulation: the first two signals form the in-phase component, while the remaining two form the quadrature component. The four modulated signals are then combined and sent to the output.
#ufigure(
outline: [ 16-#gloss("qam", short: true) modulator circuit. ],
caption: [
16-#gloss("qam", short: true) modulator circuit, showing the splitter,
modulators, phase shifters, and interferometer.
],
image(
"../assets/drawio/qam_mod.png",
width: 100%,
alt: "Show a diagram showing a laser source being split into four parts, each part being modulated then phase shifted, finally all signals are combined together.",
),
)<fig_qam_mod>
The input signal is first split (line $#15$) into four parts with weights $1.0$, $1.0$, $0.5$, and $0.5$. These four signals are zipped (line $#16$), meaning that they are combined into values each containing one optical and one electrical signal. All of those values are then amplitude modulated (line $#17$). The second, third, and fourth signals are then phase shifted such that they are $0$, $90$, and $180$ degrees out of phase with the first signal (line $#18$). The four
signals are then interfered together (line $#19$) and sent to the output. The
resulting signal is a 16-#gloss("qam", short: true) modulated signal composed of
four binary values per symbol.
One can build a signal flow diagram from this code containing all of the
intrinsic operations and constraints. Note that the input was replaced with a
source of intensity in an arbitrary unit of $1.0$, with an @awgn with mean $0.0$ and
standard deviation $0.025$ added to it. The resulting signal flow diagram can be
found in @fig_qam_tx_sgd.
#ufigure(
outline: [ Signal flow diagram of a 16-#gloss("qam", short: true) modulator. ],
caption: [ Signal flow diagram of a 16-#gloss("qam", short: true) modulator, showing the
different components. ],
image(
"../assets/drawio/qam_tx_sgd.png",
width: 70%,
alt: "Shows a graph of the signal flow, starting at the input, then into a splitter, the four arms go respectively into, a modulator, a modulator and a phase shifter, a modulator and a phase shifter, and a modulator and a phase shifter. The four arms are then merged together and sent to the output.",
),
)<fig_qam_tx_sgd>
=== Results
This example is trivial for the constraint solver to simulate: with four $100 "Gb"\/"s"$ binary sources, it finishes simulating a 1 ns window in $45 "ms"$ on a recent _AMD_ @cpu.
In @fig_results, one can see the simulation results, showing the input signal
with its simulated noise in #link(<fig_results>)[(a)], the output signal in #link(<fig_results>)[(b)],
and the intermediary signals in #link(<fig_results>)[(c)] and #link(<fig_results>)[(d)].
Finally in #link(<fig_results>)[(e)], one can see the constellation, from which
the @evm can be calculated as $4.41%$, which is $-17.11 "dB"$ when expressed
logarithmically. At these speeds, with a fairly high noise, this can be
considered a good result as it leads to a @ber of zero.
#ufigure(
outline: [ Simulation results of a 16-#gloss("qam", short: true) modulator. ],
caption: [
Simulation results of a 16-#gloss("qam", short: true) modulator, showing the
input signal, the output signal, and the intermediary signals, along with the
constellation points, and reference points. The constellation points have been
normalized before being shown.
],
image(
"../assets/qam_constellation.png",
width: 100%,
alt: "Shows the input signal, as a mostly constant signal with some noise, then shows the output signal which is a 16-QAM modulated signal, with the constellation points shown. Also shows the intermediary A, B, C, and D signals.",
),
) <fig_results>
== Lattice filter
Lattice filters are a type of filter that can be easily built from @mzi[s] and couplers; they allow the user to build a filter whose frequency response matches their needs @ruocco_soi_2013. Specifically, lattice
filters are ideal components to use as they are completely passive, allowing for
very low power signal processing, of particular interest for microwave signal
processing in the optical domain @guan_cmos_2014. This example will shortly
discuss the theoretical background of lattice filters, then show how to build
such a filter in @phos, showing how easy and expressively the language can be
used to build such a filter. As the constraint solver is not yet able to solve
frequency domain problems, the filter will not be simulated, instead relying on
theoretical results. The general form of an @mzi based lattice filter can be
seen in @fig_mzi_lattice. This example is based off of _Ruocco et al._'s and _Guan et al._'s
works @ruocco_soi_2013 @guan_cmos_2014.
#ufigure(
outline: [ @mzi based lattice filter. ],
caption: [ @mzi based lattice filter built of three @mzi[s] with the same path length
difference. ],
image(
"../assets/drawio/mzi_lattice.png",
width: 75%,
alt: "Shows an MZI based lattice filter, built using three sections with path length difference and four couplers.",
),
)<fig_mzi_lattice>
=== Theoretical background
Lattice filters are built from two elements, couplers and sections with a length
difference. Assuming that there are no reflections, one can model these elements
as $2 times 2$ matrices, the first one being the coupler, and the second one
being a phase shifter. The _S_ matrix of a coupler can be seen in @eq_coupler,
where $tau_i$ corresponds to the coupling coefficient of the $i$-th coupler. The _S_ matrix
of a phase shifter can be seen in @eq_phase_shift_dif, where $beta$ corresponds
to the propagation constant of the waveguide, and $Delta L_i$ corresponds to the
length difference of the $i$-th section. The _S_ matrix of a complete lattice
filter can then be calculated by multiplying the _S_ matrices of the couplers
and sections together, as seen in @eq_lattice_filter.
$
S_("coupler", i) = mat(
delim: "[",
tau_i, -j dot sqrt(1 - tau_i^2);-j dot sqrt(1 - tau_i^2), tau_i,
)
$<eq_coupler>
$
S_("delay", i) = mat(delim: "[", e^(-j dot beta Delta L_i), 0;0, 1)
$<eq_phase_shift_dif>
$
S = S_("coupler", n+1) dot product^n_(i = 1) S_("delay", i) dot S_("coupler", i)
$<eq_lattice_filter>
#pagebreak(weak: true)
=== Building the filter
@mzi based lattice filters are very simple to build in @phos, assuming that one
has a function to compute the required coefficients, which would be part of a
filter synthesis toolbox, here named `filter_kind_coefficients`. One can then
build a filter with the code in @lst_lattice_filter. The code first computes the
coefficients, then folds over them, meaning that it iterates over them while
accumulating a result; in this case, the result is the final pair of output
signals. For each coefficient, the accumulator signals are coupled with the
computed coefficient and then constrained to the computed differential phase.
Finally, the last two signals are coupled together with the final computed
coefficient. The result is a filter with the frequency response of the
coefficients, which can be seen in @fig_lattice_filter, showing the theoretical
results of a 4th and an 8th order filter.
#ufigure(
outline: [ @mzi based lattice filter in @phos. ],
caption: [
@mzi based lattice filter in @phos, parametrically generated for the user, fully
commented example in @anx_lattice_filter.
],
)[
```phos
syn lattice_filter(a: optical, b: optical, filter_kind: FilterKind) -> (optical, optical) {
let (coeffs, final_coupler) = filter_kind_coefficients(filter_kind)
coeffs |> fold((a, b), |acc, (coeff, phase)| {
acc |> coupler(coeff) |> constrain(d_phase = phase)
}) |> coupler(final_coupler)
}
```
]<lst_lattice_filter>
#ufigure(
outline: [ Theoretical frequency response of a @mzi based lattice filter in @phos. ],
caption: [
Theoretical frequency response of a @mzi based lattice filter in @phos. Fourth
order example with coefficients:
$
(tau_i, Delta L_i) = (0.5, 30 "µm"),
(0.8, 30 "µm"),
(0.5, 30 "µm"),
(0.8, 30 "µm")
$
  and a final coupler with a coefficient of $tau_5 = 0.04$, in a waveguide with an
  effective refractive index of $n_"eff" = 2.4$. An additional 8th order filter is
  also shown, with the coefficients from the 4th order filter repeated twice.
],
image(
"../assets/lattice_filter.png",
width: 100%,
alt: "Shows the frequency response of a lattice filter, with the frequency response of the individual coefficients, and the final frequency response.",
),
) <fig_lattice_filter>
#pagebreak(weak: true)
== Analog matrix-vector multiplication
As previously mentioned in @photonic_processor, there are two major kinds of
programmable @pic[s], and while this work has mostly focused on recirculating
mesh-based photonic processors, they are capable of building the same circuits
as feedforward @pic[s]. A typical use case of feedforward meshes is
#gloss("mvm", long: true), a common machine learning operation that such meshes
can perform very quickly and efficiently. This example will demonstrate how such
a @mvm photonic circuit is built in @phos and how to use it to perform @mvm. The
circuit shown here is based on _Shokraneh et al._'s work @shokraneh_single_2019.
This circuit is built from individual @mzi[s] with an added phase shifter.
These groupings, which can be seen in @fig_mvm_mzi, are equivalent to the
photonic gates from which a photonic processor is built, see
@sec_photonic_proc_comp. For this circuit, they are configured in a triangular
shape, as can be seen in @fig_mvm_mzi_full. The circuit is built from 6 gates
and can multiply a vector of size $4$ with a $4 times 4$ matrix.
#ufigure(
outline: [ Diagram of a single @mzi gate used when building an @mvm circuit. ],
caption: [
Diagram of a single @mzi gate used when building an @mvm circuit. The @mzi gate
is built from two couplers and two phase shifters. The first coupler is used to
split the input signal into two, the second coupler is used to recombine the two
signals, the first phase shifter is used to add a phase shift to the top signal,
and the second phase shifter is used to add a phase shift to the bottom signal.
],
image(
"../assets/drawio/fig_mvm_mzi.png",
alt: "Shows a single MZI with a tunable phase shifter on one of its arms, and a phase shifter on one of its ports",
width: 50%,
),
) <fig_mvm_mzi>
#ufigure(
outline: [ Diagram of the full @mzi @mvm circuit. ],
caption: [
Diagram of the full @mzi @mvm circuit. With the inputs annotated $I_1$ through
to $I_4$, and the output annotated $Y_0$ through to $Y_3$. The circuit is built
from 6 @mzi gates, with four inputs and four outputs.
],
image(
"../assets/drawio/fig_mvm_mzi_full.png",
width: 100%,
    alt: "Shows a mesh of MZI gates, assembled in a triangular shape, with 3 at the top, then 2 in the middle, and 1 at the bottom.",
),
) <fig_mvm_mzi_full>
From these diagrams, it becomes clear that the matrix-vector multiplication is
not trivial: assuming that the final operation being performed is $Y = bold(M) dot X$,
where $Y$ and $X$ are vectors and $bold(M)$ is a matrix, the entries of $bold(M)$
cannot be mapped one-to-one onto the values of the phase shifters in the circuit.
The transformation from the matrix $bold(M)$ into the corresponding phase shifts
is not the focus of this thesis; therefore, the ones from _Shokraneh et al._'s
work will be used instead @shokraneh_single_2019. It is interesting to note
that, by performing the matrix multiplication in the analog domain, the circuit
can be made extremely fast, but noise and imprecision are also introduced.
Therefore, while this may not be a problem for machine-learning models that rely
on low-precision arithmetic, it would limit the circuit's use in applications
that depend on higher-precision arithmetic.
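A forward model of the mesh in @fig_mvm_mzi_full helps to see why the mapping is not one-to-one: each gate acts only on two adjacent modes, so the $4 times 4$ transfer matrix is a product of embedded $2 times 2$ blocks, and every matrix entry depends on several phase settings jointly. The Python sketch below uses one common @mzi convention and an assumed gate ordering; it is an illustration, not the decomposition used by _Shokraneh et al._

```python
import numpy as np

def mzi(theta, phi):
    # 2x2 transfer matrix of a single gate: two 50:50 couplers around an
    # internal phase shifter theta, plus an external phase shifter phi on
    # the top port (one common convention; others differ by phase factors)
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
    inner = np.diag([np.exp(1j * theta), 1.0])
    outer = np.diag([np.exp(1j * phi), 1.0])
    return bs @ inner @ bs @ outer

def embed(gate, mode, n=4):
    # Embed a 2x2 gate acting on modes (mode, mode + 1) into an n x n identity
    full = np.eye(n, dtype=complex)
    full[mode:mode + 2, mode:mode + 2] = gate
    return full

def mesh(settings):
    # Compose the six gates of a triangular 4x4 mesh; the mode-pair
    # ordering below is an assumption about the layout of the figure
    order = [2, 1, 0, 2, 1, 2]
    matrix = np.eye(4, dtype=complex)
    for (theta, phi), mode in zip(settings, order):
        matrix = embed(mzi(theta, phi), mode) @ matrix
    return matrix
```

Because every gate is unitary, the composed matrix is always unitary; this also makes the noise trade-off mentioned above concrete, since an error in any single phase setting perturbs entire columns of the realized matrix rather than one entry.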
The code to create this circuit in @phos is rather long and is therefore
available in @anx_matrix_vector. It can successfully be simulated using the
constraint solver. The tests were done with the following input vectors: $X = (0, 0, 0, 0), (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), "and" (0, 0, 0, 1)$.
From these values, one can verify that the circuit is indeed performing the
correct operation by checking that the first vector produces the zero vector and
that the other vectors return the corresponding column of the matrix. |
|
https://github.com/jdsee/htw_titlepage_typst | https://raw.githubusercontent.com/jdsee/htw_titlepage_typst/main/example.typ | typst | Apache License 2.0 | #import("htw_titlepage.typ"): *
#show: titlepage.with(
title: "A thesis written in Typst",
subtitle: "Bachelorarbeit",
subject: (name: "Angewandte Informatik", area: 4),
author: "<NAME>",
examiners: (
first: "Erstgutachter:in: Prof. Dr. X",
second: "Zweitgutachter:in: Dr. Y"
)
)
|
https://github.com/Tiggax/zakljucna_naloga | https://raw.githubusercontent.com/Tiggax/zakljucna_naloga/main/src/figures/neldermead.typ | typst | #import "@preview/cetz:0.2.2": canvas, plot, draw, vector
#let triangle = canvas(
{
import draw: *
line((0,0),(3,0),(1,2),(0,0), name: "line")
content("line.1%", anchor: "east")[$x_1$]
content("line.35%", anchor: "west", padding: .2)[$x_2$]
content("line.70%", anchor: "south", padding: .2)[$x_3$]
}
)
#let thdrn = canvas(
{
import draw: *
let x0 = (0,0,0)
let x1 = (0,0,1)
let x2 = (-2,0,1)
let x3 = (0,2,0)
line(x0,x1,x3,x0,x2,x3, name: "line")
line(x1,x2, stroke: (dash: "dotted"))
content(x0, anchor: "north", $x_0$)
content(x1, anchor: "west", $x_1$)
content(x2, anchor: "east", $x_2$)
content(x3, anchor: "south", padding: .05, $x_3$)
}
)
#let simplex = table(
columns: 2,
table.header([space],[geometrical representation]),
$RR^2$, triangle,
$RR^3$, thdrn
)
#simplex
#let reflection = canvas(
{
import draw: *
let ph = (0,0)
let pl = (2,0)
let ps = (1,1)
let c = ph.zip(pl).zip(ps).flatten().chunks(3).map(x =>
x.sum() / 3
)
line(ph,pl,ps, close: true, stroke: blue, name: "line")
content(ph,$p_h$, anchor: "east")
content(pl,$p_l$, anchor: "west", padding: .1)
content(ps,$p_s$, anchor: "south", padding: .1)
//circle(c, radius: .01)
//content(c, $overline(p)$, anchor: "north-west")
let pr = (3,1)
line(pl,ps, pr ,stroke: red, close: true)
line(ph,pr, stroke: (dash: "dashed"))
content(pr,$p_r$, anchor: "west", padding: .1)
}
)
#let contraction = canvas(
{
import draw: *
}
)
#let expansion = canvas(
{
import draw: *
}
)
#let operations = table(
columns: 2,
table.header([name], [visualization]),
[_reflection_], reflection,
[_contraction_], contraction,
[_expansion_], expansion,
)
#operations
#let graph = canvas(
{
import draw: *
rect((0,0),(5,3), name: "start")
content("start")[
Calculate initial $p_i$ and $y_i$
Determine $h$, calculate $overline(p)$
Form $P = (1 + alpha)overline(p) - alpha p_h$
Calculate $y$
]
rect((0,-1),(2,-2), name: "isy")
content("isy")[
is $y^* < y_l ?$
]
}
)
#import "@preview/fletcher:0.5.1" as fletcher: diagram, node, edge
#set text(size: .8em)
#let graph = diagram(
node((0,0))[start],
edge("->"),
node((0,1))[#rect[
Calculate initial $p_i$ and $y_i$
Determine $h$, calculate $overline(p)$
Form $p = (1 + alpha)overline(p) - alpha p_h$
Calculate $y$
]],
edge("->"),
node((0,2))[#rect[is $y^* < y_l$ ?]],
edge((),(0,3),"->",`yes`),
edge((),(1,2),"->",`no`),
node((0,3))[#rect[is $y^(**) < y_l$ ?]],
edge((),(0,4),"->",`yes`),
edge((),(1,3),`no`),
node((0,4))[#rect[Replace $p_h$ by $p^(**)$]],
edge((),(1,4)),
node((1,2))[#rect[is $y^(*) > y_i, i eq.not h $ ?]],
edge((),(2,2),"->",`yes`),
edge((),(1,3),`no`),
node((2,2))[#rect[is $y^(*) > y_h$ ?]],
edge((),(2.5,2),(2.5,2.5),"->",`no`),
edge((),(2,3),"->",`yes`),
node((2.5,2.5))[#rect[Replace $p_h$ by $p^*$]],
edge((),(2,2.5)),
node((2,3))[#rect[
Form $p^(**) = beta p_h + (1 - beta) overline(p)$
Calculate $y^(**)$
]],
edge((),(2,4)),
node((2,4))[#rect[is $y^(*) > y_h$ ?]],
edge((),(1.5,4),(1.5,4.5),"->",`no`),
edge((),(2.5,4),(2.5,4.5),"->",`yes`),
node((1.5,4.5))[#rect[Replace $p_h$ by $p^(**)$]],
edge((),(1.5,5)),
node((2.5,4.5))[#rect[Replace all $p_i$'s by $(p_i + p_l)/2$]],
edge((),(2.5,5),(1,5)),
edge((1,5),(1,5.5),"->"),
edge((1,3),(1,3.5),"->"),
node((1,3.5))[#rect[Replace $p_h$ by $p^(*)$]],
edge((),(1,4)),
edge((),(1,5)),
node((1,5.5))[#rect[Has minimum been reached?]],
edge((),(-1,5.5),(-1,1), (0,1),"->",`no`),
edge((),(2,5.5),"->",`yes`),
node((2,5.5))[#rect[EXIT]]
)
#graph |
|
https://github.com/8LWXpg/jupyter2typst | https://raw.githubusercontent.com/8LWXpg/jupyter2typst/master/test/test2.typ | typst | MIT License | #import "template.typ": *
#show: template
#block[
= 1.
== (a)
]
#block[
#code-block("using Plots
gr()
p = 0:0.01:1
I(p) = -p * log2(p)
H(p) = I(p) + I(1 - p)
plot(p, [I.(p), I.(1 .- p), H.(p)], label=[\"I(p)\" \"I(1-p)\" \"H(p)\"])"
, lang: "julia", count: 1)
]
#block[
#image("./img/985daaaacb50fe430f2a5eae9d74119bb590aea2.svg")
]
#block[
== (b)
]
#block[
#code-block("p = 0:0.01:1
I(p) = -p * log2(abs(p))
H(p1, p2) = I(p1) + I(p2) + I(1 - p1 - p2)
surface(p, p, H)"
, lang: "julia", count: 2)
]
#block[
#image("./img/e7da6cb820adc923140956e22a8c0d372f812a89.svg")
]
#block[
= 2.
The derivative image has a lower entropy than the original image, because most of its pixel values are close to zero and have a high probability. This means that the derivative image contains less information per pixel than the original image, and therefore it can be compressed more efficiently.
]
#block[
= 3.
== (a)
]
#block[
#code-block("using Optim
function quantize(f::Function, bits::Int, first::Real, last::Real)
min = optimize(f, first, last).minimum
max = -optimize(x -> -f(x), first, last).minimum
step = (max - min) / (2^bits - 1)
# return quantize function and error function
return [x -> min + step * round((f(x) - min) / step), x -> f(x) - min - step * round((f(x) - min) / step)]
end
bit = 3
f(x) = x
p1 = plot(f, -1, 1, label=\"f(x)\")
plot!(quantize(f, bit, -1, 1), -1, 1, label=[\"quantize(f, $bit)\" \"error\"], legend=:topleft)"
, lang: "julia", count: 3)
]
#block[
#image("./img/c99fc14339dd048ab2beb09d5d14b5bc121b9d0d.svg")
]
#block[
== (b)
]
#block[
#code-block("f(x) = sin(x)
p2 = plot(f, 0, 2π, label=\"f(x)\")
plot!(quantize(f, 3, 0, 2π), 0, 2π, label=[\"quantize(f, $bit)\" \"error\"])"
, lang: "julia", count: 4)
]
#block[
#image("./img/5b73b52a47ccaa96e7996b317c04e1a15ef968d6.svg")
]
|
https://github.com/WinstonMDP/math | https://raw.githubusercontent.com/WinstonMDP/math/main/exers/0.typ | typst | #import "../cfg.typ": *
#show: cfg
$
"Prove that"
lim_(x -> a) f(x) = A <-> all({x_n} subset.eq E without a):
lim_(n -> oo) x_n = a -> lim_(n -> oo) f(x_n) = A
$
That is,
$(
all(epsilon > 0) ex(delta > 0) all(x in E):
0 < abs(x - a) < delta -> abs(f(x) - A) < epsilon
) <-> \
(
all({x_n} subset.eq E without a):
(all(epsilon > 0) ex(N) all(n > N): abs(x_n - a) < epsilon) ->
all(epsilon > 0) ex(N) all(n > N): abs(f(x_n) - A) < epsilon
)$.
- To right.
$ex(delta > 0) all(x in E): 0 < abs(x - a) < delta -> abs(f(x) - A) < epsilon$.
$ex(N) all(n > N): abs(x_n - a) < delta$.
$all(n > N): 0 < abs(x_n - a) < delta$.
$all(n > N): f(x_n) - A < epsilon$.
- To left.
Suppose the opposite:
$ex(epsilon > 0) all(delta > 0) ex(x in E):
0 < abs(x - a) < delta and abs(f(x) - A) >= epsilon$.
$all(n) ex(x): 0 < abs(x - a) < 1/n and abs(f(x) - A) >= epsilon $.
$ex({x_n}) all(n): 0 < abs(x_n - a) < 1/n and abs(f(x_n) - A) >= epsilon$.
$lim_(n -> oo) x_n = a$.
$lim_(n -> oo) f(x_n) = A$.
    $ex(n): abs(f(x_n) - A) < epsilon$.
$bot$.
|
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/compiler/color.typ | typst | Apache License 2.0 | // Test color modification methods.
---
// Test CMYK color conversion.
#let c = cmyk(50%, 64%, 16%, 17%)
#stack(
dir: ltr,
spacing: 1fr,
rect(width: 1cm, fill: cmyk(69%, 11%, 69%, 41%)),
rect(width: 1cm, fill: c),
rect(width: 1cm, fill: c.negate()),
)
#for x in range(0, 11) {
box(square(size: 9pt, fill: c.lighten(x * 10%)))
}
#for x in range(0, 11) {
box(square(size: 9pt, fill: c.darken(x * 10%)))
}
---
// The the different color spaces
#let col = rgb(50%, 64%, 16%)
#box(square(size: 9pt, fill: col))
#box(square(size: 9pt, fill: rgb(col)))
#box(square(size: 9pt, fill: oklab(col)))
#box(square(size: 9pt, fill: oklch(col)))
#box(square(size: 9pt, fill: luma(col)))
#box(square(size: 9pt, fill: cmyk(col)))
#box(square(size: 9pt, fill: color.linear-rgb(col)))
#box(square(size: 9pt, fill: color.hsl(col)))
#box(square(size: 9pt, fill: color.hsv(col)))
---
// Test hue rotation
#let col = rgb(50%, 64%, 16%)
// Oklch
#for x in range(0, 11) {
box(square(size: 9pt, fill: rgb(col).rotate(x * 36deg)))
}
// HSL
#for x in range(0, 11) {
box(square(size: 9pt, fill: rgb(col).rotate(x * 36deg, space: color.hsl)))
}
// HSV
#for x in range(0, 11) {
box(square(size: 9pt, fill: rgb(col).rotate(x * 36deg, space: color.hsv)))
}
---
// Test saturation
#let col = color.hsl(180deg, 0%, 50%)
#for x in range(0, 11) {
box(square(size: 9pt, fill: col.saturate(x * 10%)))
}
#let col = color.hsl(180deg, 100%, 50%)
#for x in range(0, 11) {
box(square(size: 9pt, fill: col.desaturate(x * 10%)))
}
#let col = color.hsv(180deg, 0%, 50%)
#for x in range(0, 11) {
box(square(size: 9pt, fill: col.saturate(x * 10%)))
}
#let col = color.hsv(180deg, 100%, 50%)
#for x in range(0, 11) {
box(square(size: 9pt, fill: col.desaturate(x * 10%)))
}
---
// Test gray color modification.
// Ref: false
#test-repr(luma(20%).lighten(50%), luma(60%))
#test-repr(luma(80%).darken(20%), luma(64%))
#test-repr(luma(80%).negate(), luma(20%))
|
https://github.com/AU-Master-Thesis/thesis | https://raw.githubusercontent.com/AU-Master-Thesis/thesis/main/sections/5-discussion/study-1.typ | typst | MIT License | #import "../../lib/mod.typ": *
== #study.H-1.full.n <s.d.study-1>
// Hypothesis 1:
// Reimplementing the original GBP Planner in a modern, flexible multi-agent simulator framework using a modern programming language will enhance the software's scientific communication and extendibility. This redevelopment will not only ensure fidelity in reproducing established results but also improve user engagement and development through advanced tooling, thereby significantly upgrading the usability and functionality for multi-agent system simulations.
// 1. Something about the simulation framework, answering the first part of the hypothesis.
// 2. Something about the programming language, answering the second part of the hypothesis.
// 3. Something about the usability and functionality, answering the third part of the hypothesis.
// 4. Something about the scientific communication and extendibility, answering the fourth part of the hypothesis.
// 5. Something about the user engagement and development, answering the fifth part of the hypothesis.
// 6. Something about the reproducibility of established results, answering the sixth part of the hypothesis.
// 6.1. Talk about how the results of their paper seems magical, where the results obtainable by their release code is similar to that of ours, so either something has not been explained properly in the paper, or the results are not reproducible.
In this section all contributions of the #acr("MAGICS") simulator are discussed. Furthermore, the simulator's capabilities are compared to the original #gbpplanner paper@gbpplanner, and the provided `gbpplanner` code@gbpplanner-code. The discussion is structured in such a way that it deliberates how effectively the first hypothesis, #study.H-1.box, has been addressed. The discussion is divided into two parts; #nameref(<s.d.simulator>, "On The Simulator MAGICS") and #nameref(<s.d.reproducibility>, "On The Reproducibility").
// 1, 2, 3:
=== On The Simulator MAGICS <s.d.simulator>
The developed simulation framework, #acr("MAGICS"), is very capable and has extensive thought put into it in terms of usability and configurability. The Rust programming language was chosen for its performance and safety guarantees, its modern approach to software development, and its easy access to a flourishing ecosystem with a large repository of crates, such as the Bevy Engine@bevyengine framework. As mentioned earlier, the #acr("ECS") architecture is a core part of Bevy, and has shown to be very flexible and easily extendable. Furthermore, it allows for effortless parallelization of many parts of the simulation, which is a key feature for the #acr("GBP") inference process, as it is inherently parallelizable in the context of asynchronous message passing. However, even with the parallelization, #acr("MAGICS") has been unable to reach much better performance than the original #gbpplanner@gbpplanner. Nonetheless, #acr("MAGICS") excels in terms of usability and configurability, as an extensible user interface has been built, which both makes the process of tuning hyperparameters much smoother and makes each parameter's impact much easier to understand. This is a significant improvement over the original #gbpplanner, which provides a simple simulator that visualises the simulation but does not allow for much interactability.
#acr("MAGICS") provides the following contributions over the original #gbpplanner:
#[
#set par(first-line-indent: 0em)
#let colors = (
theme.lavender,
theme.teal,
theme.green,
theme.yellow
)
// *Simulation:*
#set enum(numbering: box-enum.with(prefix: "C-", color: colors.at(0)))
+ *Simulation:*
+ Ability to adjust parameters live during simulation.
+ Ability to on-the-fly load new scenarios, or reload the current one.
+ Ability to turn factors on and off, to see their individual impact.
+ A completely reproducible simulation, along with seedable #acr("PRNG").
#pagebreak(weak: true)
#set enum(numbering: box-enum.with(prefix: "C-", color: colors.at(1)), start: 2)
+ *Configuration:*
+ An environment declarative configuration format for simple scenario setup.
+ A highly flexible formation configuration format, to describe how to spawn robots, and supply their mission, in a concise and declarative way.
+ A more in-depth main configuration file, to initialize all the simulation parameters.
#set enum(numbering: box-enum.with(prefix: "C-", color: colors.at(2)), start: 3)
+ *Planner Extension:*
+ Global planning layer, the results of which are discussed in @s.d.study-3.
+ And extension of the factor graph, to include a new tracking factor, $f_t$, the results of which are discussed in @s.d.study-3.
#set enum(numbering: box-enum.with(prefix: "C-", color: colors.at(3)), start: 4)
  + *Graphical User Interface:*
+ A modern, #acr("UI") and #acr("UX").
+ A more flexible and configurable simulation framework.
+ Visualization of all aspects of the underlying #acr("GBP") process.
]
// #note.k[focus on something about the purpose of rust]
// closer to a modelling language
// better tool for setting up a large system
// leans closer to something like VDM, where logical constraints and system specifications can be written in a more formal way
In the context of developing complex software applications such as scientific simulators, strongly typed languages with extensive static guarantees offer significant benefits. The robust type system in Rust and its capability to prevent invalid states entirely make it a great choice for these types of applications. Although it is not a modeling language, it enables rigorous and formal system specifications akin to declarative modeling languages like VDM@vdm. This is particularly advantageous for large systems like #acr("MAGICS"), where many different parts of the system need to interact in a very specific way.
// where maintaining clear and organized code is crucial for ensuring precise interactions between various system components.
// #cursor
// The ability
// languages that are able to provide strong guarantees about invariants are beneficial for scientific simulations
// Due to Rusts nature, it is a language that emulates modelling languages to a certain estent, where code quickly becomes very imperative and messy, modelling languages like VDM, tend towards a declarative style of simply writing out system specifications in a rigorous and formal way. Although Rust is not a modelling language, it has a very strong type system, and many logical constraints can be imposed in a meta-programming kind of approach. These features come in very handy when developing a large system like #acr("MAGICS"), where many different parts of the system need to interact in a very specific way.
=== On The Reproducibility <s.d.reproducibility>
As has been presented in the results section #numref(<s.r.results.circle-obstacles>), the results of #acr("MAGICS"), the original paper@gbpplanner, and the code provided thereby@gbpplanner-code are shown in comparative tables; #numref(<t.network-experiment>)-#numref(<t.comms-failure-experiment>). However, the plots do not contain the direct numbers from @gbpplanner, as these were not available; see figures #numref(<f.circle-experiment-ldj>)-#numref(<f.qin-vs-qout>). While reading this comparison, it is recommended to keep the #gbpplanner paper open to be able to refer to its figures 4, 5, and 6.
// On the LDJ metric
Looking at the *#acr("LDJ")* metric, see @f.circle-experiment-ldj, it becomes evident that #acr("MAGICS") is unable to reproduce the results presented in @gbpplanner with the same parameters; lookahead multiple $l_m=3$ and time horizon $t_(K-1)=13.33s$. However, lowering the lookahead horizon to five seconds, #lm3-th5.s, brings the results of #acr("MAGICS") much closer, but still with a higher variance and a very similar, slowly decreasing trend as $N_R$ is increased, although with an offset of $tilde.op 2$ from the very first experiment with $N_R=5$.
// #att[Looking at this metric for the `gbpplanner` code#sp, it is possible to argue that #acr("MAGICS") performs similarly to `gbpplanner`, and the results of the paper are not reproducible with the `gbpplanner` either.]
// On the Distance Travelled metric
The *Distance Travelled* metric, see @f.circle-experiment-distance-travelled, shows a somewhat similar discrepancy when comparing #acr("MAGICS") to the original paper. Even with the best performing configuration of #lm3-th5.s, #acr("MAGICS") still shows a disparity compared to the original paper, achieving a higher variance and a faster increasing trend as $N_R$ is increased. One similarity between the two is that *Distance Travelled* seems to increase linearly with $N_R$.
// On the Makespan metric
Looking at the *Makespan* metric, see @f.obstacle-experiment-makespan, and comparing to Figure 6 of @gbpplanner, these results are not far off. #acr("MAGICS") performs even better than @gbpplanner when obstacles are introduced, with a more stable upward trend. Moreover, the makespan with obstacles stays closer to the Circle scenario without obstacles, while also deviating from it at a lower pace.
// On the Makespan metric
// The *Makespan* metric, see @f.obstacle-experiment-makespan, mimics the results of the *Distance Travelled* as expected, since one is a function of the other. Once again, #acr("MAGICS") pe
// On the Varying Network Connectivity experiment
// #att[merge this with *Communications Failure* experiment if results are sim]
// On the Communications Failure experiment
In the *Communications Failure* scenario, the communications radius $r_C$ was tested for values ${20, 40, 60, 80}m$. This thesis provides results for #lm3-th13.n and #lm3-th5.n for #acr("MAGICS"), and #lm3-th13.n for the original `gbpplanner` code, while comparing to the #gbpplanner paper. This is the experiment that makes it clear how unattainable the results of @gbpplanner are. #acr("MAGICS") performs relatively well when looking at the makespan metric for the #lm3-th5.n configuration, whereas #lm3-th13.n reached results much closer to those of `gbpplanner`. Either way, the decent makespan performance comes at the cost of a large number of interrobot collisions for both configurations, see @t.comms-failure-experiment. Hereby the results for #acr("MAGICS") with #lm3-th5.n can be validated against the `gbpplanner` code, which achieves reciprocally similar results with a high makespan and little to no collisions. That is, instead of trading off for a large number of collisions, the `gbpplanner` code takes a lot longer to solve the problem. Unfortunately, it was not possible to count the number of collisions for `gbpplanner`, but visually it seems to be nearly 0 for all levels of communication failure#footnote[See a demonstration of this at #link("https://youtu.be/MsuO7i2gRaI", "YouTube: Master Thesis - GBP Planner Communication Failure Rate 0.9")], similar to the results presented in the #gbpplanner paper. Nonetheless, the makespan for #acr("MAGICS") is much lower, and approaches @gbpplanner for a target speed of $10m"/"s$. For a target speed of $15m"/"s$ the makespan discrepancy is higher, starting at $tilde.op 135%$ of the #gbpplanner result with no communication failure, and increasing to $tilde.op 165%$ for a failure rate of $gamma = 70%$. An important note is that the results for $gamma = {80, 90}%$ are not available for #acr("MAGICS"), as the simulation was not able to complete within a reasonable time frame. 
It is the case that with such low communication the robots simply end up holding each other up, and the simulation approaches convergence incredibly slowly.
// It was previously seen that #lm3-th5.n gets relatively close to @gbpplanner, this is not the case for the *Communications Failure* and *Varying Network Connectivity* scenarios. Though, as mentioned these experiments have been run on the `gbpplanner` code, and looking at the makespan metric, #acr("MAGICS") seems to be performing marvolously
// it is once again possible to verify #acr("MAGICS") against the `gbpplanner` code, which achieves nearly the same results.
// f.circle-experiment-ldj
// f.circle-experiment-distance-travelled
// f.obstacle-experiment-makespan
// t.network-experiment
// f.qin-vs-qout
// t.qin-vs-qout
// t.comms-failure-experiment
Concluding on the reproducibility capabilities of #acr("MAGICS"): the results of the #gbpplanner paper@gbpplanner are reproducible neither by #acr("MAGICS") nor by the provided `gbpplanner` software@gbpplanner-code. However, the results of #acr("MAGICS") can be validated against `gbpplanner`, achieving strikingly similar results. Thus, it can be concluded that the simulator used to obtain the results in the #gbpplanner paper significantly differs from `gbpplanner` and is not described clearly enough in the paper to be reproducible.
// Thus is can be concluded that it seems the simulator used to obtain the results in the #gbpplanner paper, which differs from `gbpplanner` significantly, while not being described clearly in the paper to be reproducible.
// #jonas[this conclusion #sym.arrow.t came from your note, am I saying the correct thing even though I have changed the wording?: "conclusion: It seems the GBP planner paper is doing some work on their own provided simulator that is not well describe in the paper, and therefor nearly impossible to reproduce. "]
|
https://github.com/drupol/master-thesis | https://raw.githubusercontent.com/drupol/master-thesis/main/src/thesis/imports/workarounds.typ | typst | Other | #import "@preview/codelst:2.0.1": sourcecode, sourcefile
#let shell(body) = {
let body = raw(body)
let kinds = (
"$": green.darken(30%),
"#": blue.darken(10%),
">": luma(40%),
" ": luma(100%),
)
let lines = body.text.split("\n").map(line => {
if line.at(0, default: "") in kinds and line.at(1, default: "") == " " {
(line.at(0), line.slice(2))
} else {
(none, line)
}
})
set par(justify: false)
show raw.line: it => [
#let (kind, line) = lines.at(it.number - 1)
#if kind != none {
text(fill: kinds.at(kind), kind) + " " + it.body
} else {
text(fill: luma(50%), it.text)
}
]
show raw.line: set text(font: "Inconsolata Nerd Font Mono")
sourcecode(numbers-style: line-no => text(
fill: luma(160),
size: .5em,
str(line-no),
))[
#raw(lang: "sh", lines.map(((_, line)) => line).join("\n"))
]
}
#let LaTeX = {
[L];box(move(
dx: -4.2pt, dy: -1.2pt,
box(scale(65%)[A])
));box(move(
dx: -5.7pt, dy: 0pt,
[T]
));box(move(
dx: -7.0pt, dy: 2.7pt,
box(scale(100%)[E])
));box(move(
dx: -8.0pt, dy: 0pt,
[X]
));h(-8.0pt)
}
#let eg = content => {
[(e.g., #content)]
}
#show "LaTeX": LaTeX
|
https://github.com/maucejo/cnam_templates | https://raw.githubusercontent.com/maucejo/cnam_templates/main/src/presentation/_presentation_template.typ | typst | MIT License | #import "@preview/touying:0.5.2": *
#import "_slides.typ": *
#let cnam-presentation(
composante: "cnam",
color-set: "red",
..args,
body
) = {
show: touying-slides.with(
config-info(
title: none,
subtitle: none,
author: none,
over-title: none,
facade: "image"
),
config-page(
paper: "presentation-16-9",
header-ascent: 30%,
footer-descent: 30%,
margin: (top: 3em, bottom: 1.5em, x: 2em),
),
config-common(
slide-fn: slide,
new-section-slide-fn: new-section-slide,
),
config-methods(
init: (self: none, body) => {
let colors = color-theme.at(color-set)
set text(fill: colors.blue, font: "Raleway", size: 20pt, lang: "fr", ligatures: false)
set par(justify: true)
body
}
),
config-store(
align: align,
colors: color-theme.at(color-set),
color-theme-name: color-set,
composante: composante,
title-logo-height: 8%,
logo: image("../resources/logo/" + composante + ".png"),
),
..args,
)
body
} |
https://github.com/yomannnn/yomannnn.github.io | https://raw.githubusercontent.com/yomannnn/yomannnn.github.io/main/projects.typ | typst | #import "/book.typ": book-page
#show: book-page.with(title: "2023_11_02")
= My Projects |
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/text/space_04.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test that linebreak consumed surrounding spaces.
#align(center)[A \ B \ C]
|
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024 | https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/entries/brainstorm-auton-movement/entry.typ | typst | Creative Commons Attribution Share Alike 4.0 International | #import "/packages.typ": notebookinator
#import notebookinator: *
#import themes.radial.components: *
#show: create-body-entry.with(
title: "Brainstorm: Autonomous Movement",
type: "brainstorm",
date: datetime(year: 2023, month: 9, day: 11),
author: "<NAME>",
witness: "Violet Ridge",
)
In order to move the robot we need some kind of control loop. This is a process
that moves a system to a target. In this case our system is the robot, and our
target is the coordinate on the field that we are trying to move to. There are
two different types of control loops:
- Closed loop control
- Open loop control
Closed loop control checks the system state each time the loop runs. This means
that the system can correct for outside interference.
Open loop control does not take the current system state into consideration. It
calculates the required movement before starting, and then simply follows those
instructions. This means that it cannot correct for outside interference.
We stated in our goals that we wanted the robot to be able to correct for errors
in past movements, so open loop controllers are not an option.
There are a variety of closed loop controllers that are commonly used:
#grid(
columns: (1fr, 1fr),
gutter: 20pt,
[
== Bang Bang Controller
This controller only has three states:
- On
- Reversed
- Off
If the robot is facing the target, the controller outputs maximum power. If the
robot is facing away from the target (it overshot), the controller outputs
maximum reverse power. If the system is within acceptable error, the controller
outputs off.
#pro-con(pros: [
- Very simple to implement
], cons: [
- Very imprecise
])
],
image("./bang-bang.svg"),
[
== PID Controller
This controller is considerably more complex than the bang bang controller. It
accounts for the fact that real systems cannot stop instantaneously, and slows
down as it approaches the target. Its output is calculated by adding together
its three terms, P, I, and D.
#v(500pt)
#pro-con(pros: [
- Smooth, precise movement
], cons: [
- Harder to implement
- Doesn't work well with long movements.
])
],
image("./pid.svg"),
[
== Pure Pursuit
Pure pursuit is a complex algorithm that follows long, pre-generated paths. This
algorithm works by drawing an imaginary circle around the robot, and having it
move toward the intersection between that circle and the path.
#pro-con(pros: [
- Smooth, precise movement
- Performs very well with long paths.
], cons: [
- Extremely difficult to implement
- Doesn't work well with short movements
])
],
image("./pure-pursuit.svg"),
)
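
The PID update described above can be sketched in a few lines of code. This is
an illustrative sketch only, not our competition code; the gains, the error
value, and the loop period below are made-up numbers:

```rust
// Minimal sketch of a PID controller. All constants here are invented
// for illustration; real gains depend on the robot.
struct Pid {
    kp: f64,
    ki: f64,
    kd: f64,
    integral: f64,
    prev_error: f64,
}

impl Pid {
    fn new(kp: f64, ki: f64, kd: f64) -> Self {
        Pid { kp, ki, kd, integral: 0.0, prev_error: 0.0 }
    }

    // `error` is target minus measured position; `dt` is the loop period.
    fn step(&mut self, error: f64, dt: f64) -> f64 {
        self.integral += error * dt;                     // I: accumulated error
        let derivative = (error - self.prev_error) / dt; // D: rate of change
        self.prev_error = error;
        self.kp * error + self.ki * self.integral + self.kd * derivative
    }
}

fn main() {
    let mut pid = Pid::new(0.5, 0.1, 0.05);
    // first iteration: 10 units from the target, 20 ms loop period
    let power = pid.step(10.0, 0.02);
    println!("output: {power}");
}
```

In a real loop we would call `step` once per iteration, feeding it the current
error and applying its output as motor power.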
|
https://github.com/kazuyanagimoto/quarto-awesomecv-typst | https://raw.githubusercontent.com/kazuyanagimoto/quarto-awesomecv-typst/main/_extensions/awesomecv/typst-show.typ | typst | MIT License | // Typst custom formats typically consist of a 'typst-template.typ' (which is
// the source code for a typst template) and a 'typst-show.typ' which calls the
// template's function (forwarding Pandoc metadata values as required)
//
// This is an example 'typst-show.typ' file (based on the default template
// that ships with Quarto). It calls the typst function named 'article' which
// is defined in the 'typst-template.typ' file.
//
// If you are creating or packaging a custom typst template you will likely
// want to replace this file and 'typst-template.typ' entirely. You can find
// documentation on creating typst templates here and some examples here:
// - https://typst.app/docs/tutorial/making-a-template/
// - https://github.com/typst/templates
#show: resume.with(
$if(title)$
title: [$title$],
$endif$
$if(date)$
date: [$date$],
$endif$
$if(author)$
author: (
firstname: unescape_text("$author.firstname$"),
lastname: unescape_text("$author.lastname$"),
address: unescape_text("$author.address$"),
position: unescape_text("$author.position$"),
contacts: ($for(author.contacts)$(
text: unescape_text("$it.text$"),
url: unescape_text("$it.url$"),
icon: unescape_text("$it.icon$"),
)$sep$, $endfor$),
),
$endif$
$if(profile-photo)$
profile-photo: unescape_text("$profile-photo$"),
$endif$
) |
https://github.com/7sDream/fonts-and-layout-zhCN | https://raw.githubusercontent.com/7sDream/fonts-and-layout-zhCN/master/chapters/04-opentype/exploring/head.typ | typst | Other | #import "/template/template.typ": web-page-template
#import "/template/components.typ": note
#import "/lib/glossary.typ": tr
#show: web-page-template
// ### The `head` table
=== `head` 表
// `head` is a general header table with some computed metadata and other top-level information about the font as a whole:
`head`表中储存的是整个字体的基本信息和一些计算出的元数据:
```xml
<head>
<!-- 此表中的大部分数据会在编译时自动计算和更新 -->
<tableVersion value="1.0"/>
<fontRevision value="1.0"/>
<checkSumAdjustment value="0x9fe5c40f"/>
<magicNumber value="0x5f0f3cf5"/>
<flags value="00000000 00000011"/>
<unitsPerEm value="1000"/>
<created value="Tue Sep 20 15:02:17 2016"/>
<modified value="Tue Sep 20 15:02:17 2016"/>
<xMin value="93"/>
<yMin value="-200"/>
<xMax value="410"/>
<yMax value="800"/>
<macStyle value="00000000 00000000"/>
<lowestRecPPEM value="3"/>
<fontDirectionHint value="2"/>
<indexToLocFormat value="0"/>
<glyphDataFormat value="0"/>
</head>
```
// The most interesting values here for font designers and layout programmers are `unitsPerEm` through `macStyle`.
这些值中,字体设计师和排版程序开发者们最关心的是从`unitsPerEm`到`macStyle`的部分。
// The `unitsPerEm` value, which defines the scaling of the font to an em, must be a power of two for fonts using TrueType outlines. The most common values are 1000 for CFF fonts and 1024 for TrueType fonts; you may occasionally come across fonts with other values. (Open Sans, for instance, has an upem of 2048.)
`unitsPerEm`定义了字体中单位长度和`em`之间的比例关系。在使用TrueType#tr[outline]的字体中必须设置成2的整数次方,一般会使用1024。使用CFF表示法的字体则一般设置成1000。偶尔也会碰到使用其他值的字体,比如Open Sans字体的`unitsPerEm`值为2048。
#note[
// > If you are writing a font renderer, you should not make assumptions about this value!
在编写字体渲染器时不应对这个值有任何预先假设!
]
// `created` and `modified` are mostly self-explanatory; in OpenType's binary representation they are actually stored as seconds since January 1st 1904, (Mac versions prior to OS X used this as their *epoch*, or reference point.) but `ttx` has kindly converted this to a more readable time value.
`created`和`modified`这两个值的意义不言自明。但需要知道的是,它们在二进制表示中其实储存的是从1904年1月1日到目标时间的秒数。Mac 系统在 OS X 版本前用这个特定日期作为其时间零点,或者叫参考点。`ttx` 工具把这个秒数自动转换为了更加易读的日期时间表示。
// `xMin` through `yMax` represent the highest and lowest coordinates used in the font. In this case, the `.notdef` glyph - the only glyph with any outlines - stretched from -200 below the baseline to 800 units above it, has a left sidebearing of 93, and its right edge falls at X coordinate 410.
`xMin`到`yMax`这四个值告诉我们字体中使用的坐标的最大最小值。在本例中,唯一有实际#tr[outline]的`.notdef`#tr[glyph]在竖直方向上具有从-200到800的跨度(这里负数表示位置在#tr[baseline]之下,正数则是#tr[baseline]之上)。它的左#tr[sidebearing]为93,右边缘则位于X坐标410的位置。
// The `macStyle` value is a bit field, used, as its name implies, to determine the style of the font on Mac systems. It consists of two bytes; the one on the left is not used, and the bits in the one of the right have the following meanings:
`macStyle`值中的每个比特位表示一个状态开关,正如其名称所示,Mac系统用这个值来确定字体的样式。它包含两个字节,其中左侧字节的没有使用,右侧字节的每一位的含义如下:
/*
|-|----------|
|0|Bold |
|1|Italic |
|2|Underline |
|3|Outline |
|4|Shadow |
|5|Condensed |
|6|Extended |
|7|(unused) |
|-|-----------|
*/
#align(center, table(
columns: 2,
align: left,
..for (i, v) in (
[粗体],
[意大利体],
[下划线],
[#tr[outline]],
[阴影],
[窄体],
[宽体],
[(未使用)]
).enumerate() { (str(i), v,) }
))
// So a bold italic condensed font should have a `macStyle` value of `00000000 00100011` (remember that we count from the right in binary).
所以,一个粗窄意大利体字体的`macStyle`值就会是 `00000000 00100011`(注意字节中最低的位在最右边)。
|
https://github.com/fsr/rust-lessons | https://raw.githubusercontent.com/fsr/rust-lessons/master/src/lesson7.typ | typst | #import "slides.typ": *
#show: slides.with(
authors: ("<NAME>", "<NAME>"), short-authors: "H&A",
title: "Wer rastet, der rostet",
short-title: "Rust-Kurs Lesson 7",
subtitle: "Ein Rust-Kurs für Anfänger",
date: "Sommersemester 2023",
)
#show "theref": $arrow.double$
#show link: underline
#new-section("Generics")
#slide(title: "Generic functions")[
#columns(2, gutter: -10%)[
```rust
fn largest_i32(list: &[i32]) -> &i32 {
let mut largest = &list[0];
for item in list {
if item > largest {
largest = item;
}
}
largest
}
```
#colbreak()
```rust
fn main() {
let number_list = vec![34, 50, 25, 100, 65];
let result = largest_i32(&number_list);
println!("The largest number is {}", result);
}
```
]
]
#slide(title: "Generic functions")[
#columns(2, gutter: -10%)[
```rust
fn largest<T>(list: &[T]) -> &T {
let mut largest = &list[0];
for item in list {
if item > largest {
largest = item;
}
}
largest
}
```
#colbreak()
#alternatives[
```rust
fn main() {
let number_list = vec![34, 50, 25, 100, 65];
let result = largest(&number_list);
println!("The largest number is {}", result);
}
```
][
```rust
fn main() {
let number_list = vec![34, 50, 25, 100, 65];
// we can specify the type explicitly
// with the turbo fish syntax
let result = largest::<u8>(&number_list);
println!("The largest number is {}", result);
}
```
]
]
- this will not work (yet), we need the `PartialOrd` trait for comparison
]
#slide(title: "Generic structs")[
#columns(2, gutter: -15%)[
```rust
struct Point<T> {
x: T,
y: T,
}
fn main() {
let integer = Point { x: 5, y: 10 };
let unsigned: Point<u32> = Point { x: 9, y: 20 };
let float = Point { x: 1.0, y: 4.0 };
// This will not work. Why?
let wont_work = Point { x: 5, y: 4.0 };
}
```
#colbreak()
#begin(2)[
```rust
struct MultiPoint<T, U> {
x: T,
y: U,
}
```
]
]
]
#slide(title: "Generic methods")[
#alternatives[
```rust
struct Point {
x: u32,
y: u32,
}
impl Point {
fn x(&self) -> &u32 {
&self.x
}
}
```
][
```rust
struct Point<T> {
x: T,
y: T,
}
impl<T> Point<T> {
fn x(&self) -> &T {
&self.x
}
}
```
]
]
#slide(title: "Non generic methods for generic structs")[
```rust
struct Point<T> {
x: T,
y: T,
}
impl Point<i32> {
    fn manhattan_distance(&self, other: &Self) -> u32 {
        self.x.abs_diff(other.x) + self.y.abs_diff(other.y)
}
}
```
]
#new-section("Traits")
#slide(title: "Defining a trait")[
- defines some set of behavior
- types that implement a trait need to implement the behavior
#uncover(2)[
```rust
pub trait Display {
// Required method
fn fmt(&self, f: &mut Formatter<'_>) -> Result<(), Error>;
}
```
]
#uncover(3)[
```rust
// file: src/news.rs
pub trait Summary {
fn summarize(&self) -> String;
}
```
]
]
#slide(title: "Implementing a trait on a type")[
```rust
// file: src/news.rs
pub struct NewsArticle {
pub headline: String,
pub location: String,
pub author: String,
pub content: String,
}
```
#begin(2)[
```rust
impl Summary for NewsArticle {
fn summarize(&self) -> String {
format!("{}, by {} ({})", self.headline, self.author, self.location)
}
}
```
]
#begin(3)[
#text(15pt)[
Note: implementing an external trait on an external type is more complicated,
#link("https://rust-lang.github.io/rfcs/2451-re-rebalancing-coherence.html#concrete-orphan-rules")[see the orphan rules]
]
]
]
#slide(title: "Using a trait from another module")[
```rust
// file: src/main.rs
// to call a trait function, the trait needs to be in scope
use news::{Summary, NewsArticle};
fn main() {
let article = NewsArticle {
headline: String::from("Patina takes over"),
location: String::from("Dimmsdale"),
author: String::from("<NAME>"),
content: String::from("Protect your metal wares, everyone!"),
};
println!("1 new article: {}", article.summarize());
}
```
]
#slide(title: "Default implementations")[
- written in the trait definition
```rust
pub trait Summary {
fn summarize(&self) -> String {
String::from("(Read more...)")
}
}
```
#uncover(2)[
- implement the trait with an empty `impl` block
```rust
impl Summary for NewsArticle {}
```
]
]
#slide(title: "Default implementations")[
- allowed to call other methods on the same trait
```rust
pub trait Summary {
// this is a required method
fn summarize_author(&self) -> String;
// this method can be implemented but does not have to be
fn summarize(&self) -> String {
format!("(Read more from {}...)", self.summarize_author())
}
}
```
]
#slide(title: "Traits as Parameters")[
#alternatives[
    - using an explicit trait bound
```rust
pub fn notify<T: Summary>(item: &T) {
println!("Breaking news! {}", item.summarize());
}
```
][
- syntactic sugar for trait bounds
```rust
pub fn notify(item: &impl Summary) {
println!("Breaking news! {}", item.summarize());
}
```
][
- multiple trait bounds
```rust
// using a trait bound:
pub fn notify<T: Summary + Display>(item: &T) {...}
// using the `impl` syntax:
pub fn notify(item: &(impl Summary + Display)) {...}
```
][
- one more way to specify trait bounds
```rust
fn foo<T: Display + Clone, U: Clone + Debug>(t: &T, u: &U) -> i32 {...}
fn foo<T, U>(t: &T, u: &U) -> i32
where
T: Display + Clone,
U: Clone + Debug,
{...}
```
]
]
#slide(title: "Fixing the largest function")[
#alternatives[
```rust
fn largest<T>(list: &[T]) -> &T {
let mut largest = &list[0];
for item in list {
if item > largest {
largest = item;
}
}
largest
}
```
][
```rust
fn largest<T: std::cmp::PartialOrd>(list: &[T]) -> &T {
let mut largest = &list[0];
for item in list {
if item > largest {
largest = item;
}
}
largest
}
```
]
]
#slide(title: "Returning types that implement traits")[
#alternatives(position: top)[
```rust
fn returns_summarizable() -> impl Summary {...}
```
][
```rust
fn returns_summarizable() -> impl Summary {
NewsArticle {
headline: String::from("Patina takes over"),
location: String::from("Dimmsdale"),
author: String::from("<NAME>"),
content: String::from("Protect your metal wares, everyone!"),
}
}
```
]
```rust
fn main() {
let sumthing = returns_summarizable();
// we only know that the type of `sumthing` implements `Summarize`
// so we can only use methods from the `Summarize` trait
// (this is called an opaque type)
println!("{}", sumthing.summarize());
}
```
]
#slide(title: "Returning types that implement traits")[
- useful when returning a closure
```rust
fn returning_closure() -> impl Fn(i32) -> bool {
|x: i32| x % 3 == 0
}
```
]
#slide(title: "Implementing methods on specific trait bounds")[
#columns(gutter: -14%)[
```rust
use std::fmt::Display;
use std::cmp::PartialOrd;
struct Pair<T> {
x: T,
y: T,
}
impl<T> Pair<T> {
fn new(x: T, y: T) -> Self {
Self { x, y }
}
}
```
#colbreak()
#begin(2)[
```rust
impl<T: Display + PartialOrd> Pair<T> {
fn cmp_display(&self) {
if self.x >= self.y {
println!("x = {} is the largest", self.x);
} else {
println!("y = {} is the largest", self.y);
}
}
}
```
]
]
]
#slide(title: "Blanket implementations")[
```rust
impl<T: Display> ToString for T {
...
}
```
- read as: an implementation for `ToString` for every type that implements `Display`
theref call `to_string()` on every type that implements `Display`
]
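
#slide(title: "Blanket implementations: example")[
A runnable toy version of the same pattern. `Describe` is our own illustrative
trait, not something from the standard library:
```rust
use std::fmt::Display;

trait Describe {
    fn describe(&self) -> String;
}

// one impl block covers every type that implements Display
impl<T: Display> Describe for T {
    fn describe(&self) -> String {
        format!("value: {}", self)
    }
}

fn main() {
    println!("{}", 42.describe());   // i32 implements Display
    println!("{}", "hi".describe()); // &str does too
}
```
]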
#slide(title: "From and Into")[
#two-grid(right-width: 60%)[
#alternatives[
```rust
struct Meters(u32);
struct Millimeters(u32);
impl From<?> for ? {
fn from(value: ?) -> Self {
?
}
}
fn main() {
let m = Meters(1);
let mm: Millimeters = m.into();
println!("1m == {}", mm.0);
}
```
][
```rust
struct Meters(u32);
struct Millimeters(u32);
impl From<Meters> for Millimeters {
fn from(value: Meters) -> Self {
Millimeters(value.0 * 1000)
}
}
fn main() {
let m = Meters(1);
let mm: Millimeters = m.into();
println!("1m == {}", mm.0);
}
```
]
][
#alternatives[
- implement conversion (which direction do you need here?)
- test your conversion in main (how?)
][
1. `From<Meters> for Millimeters` or
2. `Into<Millimeters> for Meters`
- when implementing `From<Meters>`,
- you can use `mm... = m.into()` as well
- the `Temperature` task could also be done like this
]
]
]
#slide(title: "Extending Numbers")[
  - with `u32` you _cannot_ get up to `fib(196)`. How far do you get?
- how can we do more?
#alternatives(position: top)[
```rust
use std::ops::{???};
struct Number(u32, u32);
impl AddAssign<?> for ? {
fn add_assign(?????self, ???) -> ? { ? }
}
impl Add<?> for ? {
fn add(???) -> ? { ? }
}
fn main() {
println!("fib(200): {:?}", fib(200));
}
```
][
```rust
use std::ops::Add;
struct Number(u32, u32, u32, u32);
    impl Add<&Number> for &Number {
      type Output = Number;
      fn add(self, other: &Number) -> Number { ... }
    }
fn main() {
println!("fib(200): {:?}", fib(200));
}
```
]
]
#slide(title: "The Add trait")[
- has an associated type
- `Output` is associated to the `Add` trait
- is a generic type in the trait scope
```rust
pub trait Add<Rhs = Self> {
type Output;
// Required method
fn add(self, rhs: Rhs) -> Self::Output;
}
```
]
|
|
https://github.com/kotfind/hse-se-2-notes | https://raw.githubusercontent.com/kotfind/hse-se-2-notes/master/algo/seminars/main.typ | typst | #import "/utils/template.typ": conf
#import "/utils/datestamp.typ": datestamp
#show: body => conf(
title: "Алгоритмы",
subtitle: "Семинары",
author: "<NAME>, БПИ233",
year: [2024--2025],
body,
)
#datestamp("2024-09-09")
#include "./2024-09-09.typ"
#datestamp("2024-09-14")
#include "./2024-09-14.typ"
#datestamp("2024-09-21")
#include "2024-09-21.typ"
|
|
https://github.com/barrel111/readings | https://raw.githubusercontent.com/barrel111/readings/main/notes/cs-masterclass.typ | typst | #import "@local/preamble:0.1.0": *
#show: project.with(
course: "",
sem: "Summer",
title: "Cauchy-Schwarz Masterclass",
subtitle: "Notes",
authors: ("<NAME>",),
)
= Starting with Cauchy
== Proofs
#prop(
"Cauchy-Schwarz",
)[For $a_i, b_i in RR$ we have $ a_1b_1 + a_2b_2 + dots.c + a_n b_n <= sqrt(a_1^2 + a_2^2 + dots.c + a_n^2)sqrt(b_1^2 + b_2^2 + dots.c + b_n^2). $]
#proof(
[of Cauchy-Schwarz (via Induction)],
)[
Let $H(n)$ stand for the hypothesis that Cauchy's inequality is valid for $n$.
  The base case $H(1)$ is trivially true, since $a_1 b_1 <= abs(a_1) abs(b_1) = sqrt(a_1^2) sqrt(b_1^2)$.
The second base case $H(2)$ follows from observing that,
$ 0 <= (a_1 b_2 - b_1 a_2)^2 \ implies 2a_1 a_2 b_1 b_2 <= a_1^2 b_2^2 + b_1^2 a_2^2 \ implies a_1^2 b_1^2 + 2a_1 a_2 b_1 b_2 + a_2^2 b_2^2 <= a_1^2 b_2^2 + a_1^2 b_1^2 + a_2^2 b_2^2 + b_1^2 a_2^2 \ implies a_1 b_1 + a_2 b_2 <= sqrt(a_1^2 + a_2^2) sqrt(b_1^2 + b_2^2). $
Now, assuming $H(n)$ holds for some $n >= 1$ we wish to show that $H(n + 1)$ also
holds. Consider, using the inductive hypothesis,
$ a_1 b_1 + a_2b_2 + dots.c + a_n b_n + a_(n + 1)b_(n + 1) <= sqrt(a_1^2 + a_2^2 + dots.c + a_n^2) sqrt(b_1^2 + b_2^2 + dots.c + b_n^2) + a_(n + 1) b_(n + 1). $
Then, using $H(2)$,
$ <= sqrt(a_1^2 + a_2^2 + dots.c + a_n^2 + a_(n + 1)^2) sqrt(b_1^2 + b_2^2 + dots.c + b_n^2 + b_(n + 1)^2). $
Thus, $H(n + 1)$ also holds. By induction, Cauchy's inequality holds for all $n in NN$.
]
#lemma[ $ sum_(k = 1)^infinity a_k^2 < infinity " and " sum_(k = 1)^infinity b_k^2 < infinity " implies that " sum_(k = 1)^infinity |a_k b_k| < infinity. $ ]<additive-bound>
#proof(
"(without Cauchy-Schwarz)",
)[ We want to show that $a_k b_k$ is small whenever $a_k^2$ and $b_k^2$ are small.
The following observation gives us what we want,
$ 0 <= (x - y)^2 implies x y <= 1/2 (x^2 + y^2 ). $
We apply this to $x = |a_k|$ and $y = |b_k|$ and add the inequalities up,
#numbered_eq(
$sum_(k = 1)^infinity |a_k b_k| <= 1/2 sum_(k = 1)^infinity a_k^2 + 1/2 sum_(k = 1)^infinity b_k^2 < infinity.$,
) ]
#proof(
"of Cauchy-Schwarz (via eqn. (1))",
)[
  Assume that neither ${a_k}$ nor ${b_k}$ is identically zero. Then define the
sequences, $ hat(a)_k = a_k / (sum_(k = 1)^infinity a_k^2)^(1/2) #h(25pt) hat(b)_k = b_k / (sum_(k = 1)^infinity b_k^2)^(1/2). $
Now, apply eqn. (1) to ${hat(a)_k}$ and ${hat(b)_k}$.
$ sum_(k = 1)^infinity hat(a)_k hat(b)_k <= 1/2 sum_(k = 1)^infinity hat(a)_k^2 + 1/2 sum_(k = 1)^infinity hat(b)_k^2 = 1 \
implies sum_(k = 1)^infinity (a_k b_k)/((sum_(k = 1)^infinity a_k^2)^(1/2) (sum_(k = 1)^infinity b_k^2)^(1/2)) <= 1 \
implies sum_(k = 1)^infinity a_k b_k <= (sum_(k = 1)^infinity a_k^2)^(1/2) (sum_(k = 1)^infinity b_k^2)^(1/2). $
#remark[Normalization is a systematic way of getting from an _additive inequality_ to a _multiplicative inequality_.]
]
#prop[Equality holds in the Cauchy-Schwarz inequality iff the sequences ${a_k}$ and ${b_k}$ are
scalar multiples of one another.]
#proof[ We focus on the nontrivial case where neither of the sequences is identically
zero and where both $sum_(k = 1)^infinity a_k^2, sum_(k = 1)^infinity b_k^2$ are
finite.
The backward direction is easy to prove by a routine computation. We focus on
the forward direction. The equality #numbered_eq(
$ sum_(k = 1)^infinity a_k b_k = (sum_(k = 1)^infinity a_k^2)^(1/2) (sum_(k = 1)^infinity b_k^2)^(1/2) $,
) implies the equality (with $hat(a)_k, hat(b)_k$ defined as above) #numbered_eq(
$ sum_(k = 1)^infinity hat(a)_k hat(b)_k = 1/2 sum_(k = 1)^infinity hat(a)_k^2 + 1/2 sum_(k = 1)^infinity hat(b)_k^2 = 1. $,
)
By the two-term bound, $x y <= 1/2(x^2 + y^2)$ we also know that $ hat(a)_k hat(b)_k <= 1/2 hat(a)_k^2 + 1/2 hat(b)_k^2 " for all " k =1,2, dots, $
If any of these inequalities were strict, then we wouldn't get the equality in
eqn. (3). Thus, the equality in eqn. (2) holds for a nonzero series only when we
  have $hat(a)_k = hat(b)_k$ for all $k = 1, 2, dots$. By the definition of these
normalized values, we have that $ a_k = lambda b_k " for all " k = 1, 2, dots, $
  with $lambda$ given by the ratio
$ lambda = (sum_(j = 1)^infinity a_j^2)^(1/2) slash.big (sum_(j = 1)^infinity b_j^2)^(1/2). $ ]
== Notation and Generalizations
The Cauchy-Schwarz inequality can be quite compactly represented in the context
of an _inner product space_. We introduce the requisite material here.
#definition[Suppose $V$ is a real vector space. Then a function on $V times V$ defined by
the mapping $(bold(a), bold(b)) arrow.bar inner(bold(a), bold(b))$ is an _inner product_ and
we say that $(V, inner(dot.c, dot.c))$ is a _real inner product space_ provided
that the pair $(V, inner(dot.c, dot.c))$ has the following properties
+ $inner(bold(v), bold(w)) = inner(bold(w), bold(v))$ for all $bold(v), bold(w) in V$
+ $inner(bold(v), bold(v)) > 0$ for all nonzero $bold(v) in V$
+ $inner(alpha bold(v) + bold(u), bold(w)) = alpha inner(bold(v), bold(w)) + inner(bold(u), bold(w))$ for
all $alpha in RR$ and $bold(u), bold(v), bold(w) in V$
]
On $RR^n$, the following inner product is popular.
#definition(
"Euclidean Inner Product",
)[For $bold(a), bold(b) in RR^n$, $ inner(bold(a), bold(b)) = sum_(j = 1)^n a_j b_j $]
This lets us rewrite the Cauchy-Schwarz inequality succinctly.
#prop[For $bold(a), bold(b) in RR^n$ we have $ inner(bold(a), bold(b)) <= inner(bold(a), bold(a))^(1/2) inner(bold(b), bold(b))^(1/2). $]
Of course, there are other inner product spaces too!
#example[ On $RR^n$ the following weighted sum defines an inner product, $ inner(bold(a), bold(b)) = sum_(j = 1)^n a_j b_j w_j. $ ]
#example[ Consider the vector space $C [a, b]$ of real-valued continuous functions on the
bounded interval $[a, b]$. Then, for any continuous $w:[a, b] to RR$ such that $w(x) > 0$ for
all $x in [a, b]$, we can define an inner product on $C[a, b]$ by setting $ inner(f, g) = integral_a^b f(x)g(x)w(x) #h(2pt) d x. $ ]
This naturally leads us to ask whether Cauchy-Schwarz is true for all inner
product spaces.
#prop[Let $(V, inner(dot, dot))$ be an inner product space. Then, for any $bold(v), bold(w) in V$ we
have $ inner(bold(v), bold(w)) <= inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2). $ For
nonzero $bold(v), bold(w) in V$ we have $ inner(bold(v), bold(w)) = inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2) " if and only if " bold(v) = lambda bold(w) $ for
a nonzero constant $lambda$.]
#proof[ We try to use a variant of the additive method developed above. Consider, $ 0 <= inner(bold(v) - bold(w), bold(v) - bold(w)) \ implies inner(bold(v), bold(w)) <= 1/2 (inner(bold(v), bold(v)) + inner(bold(w), bold(w))). $
We normalize to convert this to a multiplicative bound. Since the inequality
holds trivially for $bold(v) = bold(0)$ or $bold(w) = bold(0)$, we assume that $bold(v), bold(w)$ are
nonzero. Define,
$ hat(bold(v)) = bold(v)/inner(bold(v), bold(v))^(1/2) #h(15pt) hat(bold(w)) = bold(w)/inner(bold(w), bold(w))^(1/2) #h(5pt) . $
We then have,
$ inner(hat(bold(v)), hat(bold(w))) <= 1/2 (inner(hat(bold(v)), hat(bold(v))) + inner(hat(bold(w)), hat(bold(w)))) = 1 \ implies inner(
bold(v)/(inner(bold(v), bold(v))^(1/2)),
bold(w)/(inner(bold(w), bold(w))^(1/2)),
) <= 1 \ implies inner(bold(v), bold(w)) <= inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2). $
Now, we deal with the necessary condition for equality. If $bold(v), bold(w)$ are
nonzero then the normalized vectors $hat(bold(v)), hat(bold(w))$ are well
defined. Furthermore, equality in the Cauchy-Schwarz inequality then gives us $inner(hat(bold(v)), hat(bold(w))) = 1.$ So,
$ inner(bold(hat(v)), bold(hat(w))) = 1/2 (inner(bold(hat(v)), bold(hat(v))) + inner(bold(hat(w)), bold(hat(w)))) \ implies inner(bold(hat(v)) - bold(hat(w)), bold(hat(v)) - bold(hat(w))) = 0 \ implies bold(hat(v)) = bold(hat(w)) $
Thus, $ bold(v) = lambda bold(w) " for " lambda = inner(bold(v), bold(v))^(1/2)/inner(bold(w), bold(w))^(1/2). $ ]
== Symmetry and Amplification
This material is from #link(
"https://web.archive.org/web/20240516183201/https://terrytao.wordpress.com/2007/09/05/amplification-arbitrage-and-the-tensor-power-trick/",
)[Terence Tao's blog].
We consider the general setting of a complex inner product space, $V$.// That is to say, $V$ is a complex inner product space that is complete with respect to the norm induced by the inner product.
In this context, the Cauchy-Schwarz inequality is given by
$ abs(inner(bold(v), bold(w))) <= inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2). $
To prove this, we start off with the additive bound,
$ 0 <= inner(bold(v) - bold(w), bold(v) - bold(w)) \ implies Re inner(bold(v), bold(w)) <= 1/2 (inner(bold(v), bold(v)) + inner(bold(w), bold(w))). $
This is a weaker bound than Cauchy-Schwarz as $Re inner(bold(v), bold(w)) <= abs(inner(bold(v), bold(w))) <= inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2) <= 1/2 (inner(bold(v), bold(v)) + inner(bold(w), bold(w)))$.
The last inequality follows from the AM-GM inequality (see next chapter for more
details).
We can amplify this additive bound by observing some symmetry imbalances.
Particularly, the phase rotation $bold(v) arrow.bar e^(i theta) bold(v)$ preserves
the right-hand side but not the left-hand side,
$ Re #h(2pt) e^(i theta) inner(bold(v), bold(w)) <= 1/2 (inner(bold(v), bold(v)) + inner(bold(w), bold(w))). $
We can choose any real $theta$ we want. To make the left-hand side as large as
possible, we choose $theta$ to cancel the phase of $inner(bold(v), bold(w))$.
This gets us,
$ abs(inner(bold(v), bold(w))) <= 1/2 (inner(bold(v), bold(v)) + inner(bold(w), bold(w))). $
Now, to strengthen the right-hand side we exploit a different symmetry, _homogenisation symmetry_.
Particularly, consider the map $(bold(v), bold(w)) arrow.bar (lambda bold(v), 1/lambda bold(w))$ for
a scalar $lambda > 0$. This gives us,
$ abs(inner(bold(v), bold(w))) <= lambda^2/2 inner(bold(v), bold(v)) + 1/(2 lambda^2) inner(bold(w), bold(w)). $
The choice of $lambda = sqrt(norm(bold(w)) slash norm(bold(v)))$ minimizes the
right-hand side. This gives us,
$ abs(inner(bold(v), bold(w))) <= inner(bold(v), bold(v))^(1/2) inner(bold(w), bold(w))^(1/2). $
== Yet Another Proof ™
This material is from #link(
"https://web.archive.org/web/20240215164458/https://www.dpmms.cam.ac.uk/~wtg10/csineq.html",
)[Timothy Gower's blog]. In this section, we will see a more motivated
development of a common proof for the Cauchy-Schwarz inequality (it is pretty
much the same proof as the one above).
Recall what we mean by the Cauchy-Schwarz result: For $a_k, b_k in RR$, we have $ sum_(k = 1)^infinity a_k b_k <= (sum_(k = 1)^infinity a_k^2)^(1/2) (sum_(k = 1)^infinity b_k^2)^(1/2) $ with
equality iff the sequences ${a_i}$ and ${b_i}$ are proportional.
The central idea for our proof will be trying to find a natural way to express
the fact that two sequences are proportional. One approach would be to say that
there exists a $lambda in RR$ such that $a_k = lambda b_k$ for every $k$.
However, why bother introducing an unknown variable $lambda$ unless we
absolutely have to? We could simply require all $a_k slash b_k$ to be equal.
Though, we may be worried about some $b_k$ being zero. We can resolve this by
simply saying that two sequences are proportional if $a_k b_j = a_j b_k$ for all $j, k$.
We want all of these terms to be zero (that is, $a_k b_j - a_j b_k = 0$ for all
$j, k$). This can be expressed by requiring the sum of all their squares to
be zero. So, sequences ${a_k}$ and ${b_k}$ are proportional iff $ sum_(k, j) (a_k b_j - a_j b_k)^2 = 0. $
Also note that the expression on the left is trivially at least zero. By
expanding out the left-hand side, we readily obtain both the Cauchy-Schwarz
inequality and the necessary condition for equality,
$ sum_(k, j) (a_k b_j - a_j b_k)^2 &= sum_(k, j) (a_k^2 b_j^2 - 2 a_k a_j b_k b_j + a_j^2 b_k^2) \ &= 2 sum_(k, j) a_k^2 b_j^2 - 2 sum_( k, j ) a_k b_k a_j b_j \ &= 2 sum_(k) a_k^2 sum_j b_j^2 - 2 (sum_(k) a_k b_k)^2. $
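
As a quick numerical sanity check (an example of our own with two terms), take
$a = (1, 2)$ and $b = (3, 4)$. Then
$ sum_(k, j) (a_k b_j - a_j b_k)^2 = 2 (a_1 b_2 - a_2 b_1)^2 = 2 (4 - 6)^2 = 8, $
while
$ 2 sum_(k) a_k^2 sum_j b_j^2 - 2 (sum_(k) a_k b_k)^2 = 2 dot 5 dot 25 - 2 dot 11^2 = 250 - 242 = 8, $
so the two sides agree.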
Now, we try to extend this idea to real inner product spaces. We want to show
that $ inner(bold(v), bold(w)) <= norm(bold(v)) norm(bold(w)) $ with equality
iff $bold(v)$ and $bold(w)$ are proportional with a positive constant. Again, we
motivate our proof by thinking in terms of expressing proportionality. A first
attempt is to say that $bold(v), bold(w)$ are proportional with a positive
constant iff $bold(v) slash norm(bold(v)) = bold(w) slash norm(bold(w))$ (note
how this doesn't work for proportionality in general, for example we could have $bold(v) = - bold(w)$).
As we did before, we can equivalently express this condition as requiring $norm(bold(w)) bold(v) - norm(bold(v)) bold(w) = 0$.
So that we may express this using inner products, we consider the squared
version of this: $(norm(bold(w)) bold(v) - norm(bold(v)) bold(w))^2 = 0$. Note
that the left-hand side is in fact always greater than or equal to zero. Then,
expanding the left hand side immediately gives us the Cauchy-Schwarz result,
$ (norm(bold(w)) bold(v) - norm(bold(v)) bold(w))^2 = 2norm(bold(v))^2norm(bold(w))^2 - 2 norm(bold(w)) norm(bold(v)) inner(bold(v), bold(w)). $
For a complex inner product space, $(norm(bold(w)) bold(v) - norm(bold(v)) bold(w))^2$ expands
as
$ (norm(bold(w)) bold(v) - norm(bold(v)) bold(w))^2 = 2 norm(bold(v))^2 norm(bold(w))^2 - norm(bold(w)) norm(bold(v)) (inner(bold(v), bold(w)) + inner(bold(w), bold(v))) $
Let $x$ be a complex number with modulus $abs(x) = 1$ and the property that $inner(bold(w), x bold(v))$ is
real and non-negative. Consequently, $inner(bold(w), x bold(v)) = abs(inner(bold(w), bold(v)))$.
We readily get that $abs(inner(bold(w), bold(v))) = inner(bold(w), x bold(v)) <= norm(bold(v)) norm(bold(w))$ with
equality iff $norm(bold(w)) bold(v) - norm(bold(v)) bold(w) = 0$.
= The AM-GM Inequality
|
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-10980.typ | typst | Apache License 2.0 | #let data = (
("MEROITIC HIEROGLYPHIC LETTER A", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER E", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER I", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER O", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER YA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER WA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER BA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER BA-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER PA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER MA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER NA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER NA-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER NE", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER NE-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER RA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER RA-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER LA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER KHA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER HHA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER SA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER SA-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER SE", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER KA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER QA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER TA", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER TA-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER TE", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER TE-2", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER TO", "Lo", 0),
("MEROITIC HIEROGLYPHIC LETTER DA", "Lo", 0),
("MEROITIC HIEROGLYPHIC SYMBOL VIDJ", "Lo", 0),
("MEROITIC HIEROGLYPHIC SYMBOL VIDJ-2", "Lo", 0),
)
|
https://github.com/dark-flames/apollo-typst | https://raw.githubusercontent.com/dark-flames/apollo-typst/main/packages/typst-apollo/lib.typ | typst | Apache License 2.0 | #import "./pages.typ"
#import "./book.typ" |
https://github.com/Mineorbit/cryptOgraph | https://raw.githubusercontent.com/Mineorbit/cryptOgraph/master/cryptograph.typ | typst | #let secpar = [#sym.lambda]
#let concat = [#sym.bar.v.double]
#let add_share(content) =
{
#sym.angle.l#content#sym.angle.r
}
|
|
https://github.com/loreanvictor/master-thesis | https://raw.githubusercontent.com/loreanvictor/master-thesis/main/README.md | markdown | MIT License | # thesis-template-typst
This repository provides a comprehensive Typst template for writing your Bachelor's or Master's thesis at the CIT School of TUM (Technical University of Munich). It includes two types of documents: a proposal template and a thesis template, both specifically designed for students in the field of Informatics. For more information about writing a thesis at the CIT School, please visit the [official CIT website](https://www.cit.tum.de/en/cit/studies/students/thesis-completing-your-studies/informatics/).
**Note:** This is only a template. You have to adapt the template to your thesis and discuss the structure of your thesis with your supervisor!
---
## Guidelines
__Please thorougly read our guidelines and hints on [confluence](https://confluence.ase.in.tum.de/display/EduResStud/How+to+thesis)!__ (TUM Login Required)
---
## Installation
For detailed installation instructions, please refer to the [official installation guide](https://github.com/typst/typst). Here, we provide basic steps for installing Typst's CLI:
- You can get sources and pre-built binaries from the [releases page](https://github.com/typst/typst/releases).
- Use package managers like `brew` or `pacman` to install Typst. Be aware that the versions in the package managers might lag behind the latest release.
- If you have a [Rust](https://rustup.rs/) toolchain installed, you can also install the latest development version.
Nix and Docker users, please refer to the official installation guide for detailed instructions.
## Usage
### Set thesis metadata
Fill in your thesis details in the `common/metadata.typ` file:
* Degree (Bachelor or Master)
* Your study program
* English and German title
* Advisor and supervisor
* Your name (without e-mail address or matriculation number)
* The start and submission date
### Build PDFs locally
Once you have installed Typst, you can use it like this:
```sh
# Creates `thesis.pdf` in working directory.
typst compile thesis.typ
# Creates `proposal.pdf` in working directory.
typst compile proposal.typ
# Creates PDF file at the desired path.
typst compile thesis.typ path/to/output.pdf
```
You can also watch source files and automatically recompile on changes. This is
faster than compiling from scratch each time because Typst has incremental
compilation.
```sh
# Watches source files and recompiles on changes.
typst watch thesis.typ
```
## Working with the Typst Web Editor
If you prefer an integrated IDE-like experience with autocompletion and instant preview, the Typst web editor allows you to import files directly into a new or existing document. Here's how you can do this:
1. Navigate to the [Typst Web Editor](https://typst.app/).
2. Create a new blank document.
3. Click on "File" on the top left menu, then "Upload File".
4. Select all .typ and .bib files along with the figures provided in this template repository.
**Note:** You can select multiple files to import. The editor will import and arrange all the files accordingly. Always ensure you have all the necessary .typ, .bib, and figures files you need for your document.
---
## Further Resources
- [Typst Documentation](https://typst.app/docs/)
- [Typst Guide for LaTeX Users](https://typst.app/docs/guides/guide-for-latex-users/)
- [Typst VS Code Extension (unofficial)](https://marketplace.visualstudio.com/items?itemName=nvarner.typst-lsp)
|
https://github.com/Jollywatt/typst-wordometer | https://raw.githubusercontent.com/Jollywatt/typst-wordometer/master/tests/custom-counters/test.typ | typst | MIT License | #import "/src/lib.typ": *
#set page(width: 15cm, height: auto)
#let el = [
Hello there are 10 vowels here.
]
#rect(el)
#word-count-of(el, counter: txt => (vowels: lower(txt).matches(regex("[aeiou]")).len()))
|
https://github.com/gianzamboni/cancionero | https://raw.githubusercontent.com/gianzamboni/cancionero/main/wip/entre-oraculos.typ | typst | #import "../theme/project.typ": *;
#cancion("Entre oráculos","<NAME>", withCords: true, notas: [
#grid(columns: (1fr, 1fr))[
*Transporte III* \
#rasgeo("p", "r", "r")
][
#grid(columns: (1fr, 1fr, 1fr))[
#drawScaledChord("Am")
][
#drawScaledChord("Dm")
][
#drawScaledChord("E")
]
]
])[
Uh uh uh uh uh uh uh uh
Oh oh oh oh oh oh oh oh
Uh oh oh oh oh oh oh oh
Uh oh oh oh oh oh oh oh
#seccion[A]
#chord[Navegantes][Am][1] al bajón,
No se quieren naufragar.
Es de #chord[muy][Dm][1] humilde humillación
#chord[Quien][E][4] muere al querer salvar
#seccion[A]
#chord[Que][Am][0] las olas sean carbón
Que ardan gotas en la piel
Se #chord[consumen][Dm][5] por querer gritar
#chord[Lo][E][0] que se queda en el mar
#colbreak()
#seccion[B]
#chord[Y][Dm][0] ante las altas #chord[corrientes][Am][6]
Matar #chord[insolentes][E][6] da Gracia y #chord[Fortuna][Am][5]
#chord[Cuando][Dm][0] las almas no #chord[mienten][Am][3]
Luchar la #chord[corriente][E][6] da muerte #chord[segura][Am][4]
#seccion[A]
#chord[Navegantes][Am][1] al timón
La tormenta va a empezar
Lanza rayos cuando se enteró
Que al cielo van a asalt#underline(offset: 2pt, stroke: color.red)[ar
#seccion[B]
Y] de estrellas pretenden valerse
Sonsacan la muerte fingiendo bravura
Si lo que quieren es suerte,
Mejor es cantar seduciendo a la Luna.
#colbreak()
Uh uh uh uh uh uh uh uh
Oh oh oh oh oh oh oh oh
Uh oh oh oh oh oh oh oh
Uh oh oh oh oh oh oh oh
#seccion[A]
#chord[Navegantes][Am][1] al bajón,
No se quieren naufragar.
Es de #chord[muy][Dm][1] humilde humillación
#chord[Quien][E][4] muere al querer salvar
#seccion[A]
#chord[Que][Am][0] las olas sean carbón
Que ardan gotas en la piel
Se #chord[consumen][Dm][5] por querer gritar
#chord[Lo][E][0] que se queda en el mar
#seccion[B]
#chord[Y][Dm][0] ante las altas #chord[corrientes][Am][6]
Matar #chord[insolentes][E][6] da Gracia y #chord[Fortuna][Am][5]
#chord[Cuando][Dm][0] las almas no #chord[mienten][Am][3]
Luchar la #chord[corriente][E][6] da muerte #chord[segura][Am][4]
#colbreak()
#seccion[A]
#chord[Navegantes][Am][1] al timón
La tormenta va a empezar
Lanza rayos cuando se enteró
Que al cielo van a asalt#underline(offset: 2pt, stroke: color.red)[ar
#seccion[B]
Y] de estrellas pretenden valerse
Sonsacan la muerte fingiendo bravura
Si lo que quieren es suerte,
Mejor es cantar seduciendo a la Luna.
#seccion[CODA C]
Brindo un ron por mi suerte
Que me escuche la lluvia
Que la muerte ignominia #niveles(("ja", 1), ("ma", 2), ("a", 3), ("a", 4), ("as", 3))
De mi cuero se cubra
] |
|
https://github.com/pku-typst/PKU-typst-template | https://raw.githubusercontent.com/pku-typst/PKU-typst-template/main/templates/通用/作业/themes/simple.typ | typst | MIT License | #import "@preview/linguify:0.4.1": load_ftl_data, linguify
#let languages = (
"zh",
"en",
)
#let lgf_db = eval(load_ftl_data("../L10n", languages))
#let linguify = linguify.with(from: lgf_db)
#let title(
course_name: none,
hw_no: none,
hw_week: none,
student_id: none,
student_name: none,
..args,
) = context {
let title = course_name
let subtitle = none
if hw_no != none {
subtitle = linguify("homeworkOrd", args: (ord: hw_no))
} else if hw_week != none {
subtitle = linguify("weekOrd", args: (weekNo: hw_week))
}
let author = [#student_id \ #student_name]
return [
#v(40pt)
#align(center)[
#set text(font: ("Microsoft Sans Serif", "SimHei"))
#text(size: 28pt, weight: "bold")[
#title
]
#text(size: 18pt)[
#subtitle
]
#text(size: 12pt, font: "Kaiti SC")[
#author
]
]
#pagebreak(weak: true)
]
}
#let other(doc) = {
set text(font: "Songti SC", size: 10pt)
doc
}
|
https://github.com/kdog3682/2024-typst | https://raw.githubusercontent.com/kdog3682/2024-typst/main/src/functions.typ | typst | #import "base-utils.typ": *
#import "prose.typ": prose-ref
// simple components
#let multiple-choice(
show-answer: false,
answer: "aaa",
lettering: "abcd",
choices: ("a", "b", "aaa", "xxx"),
layout: "vertical",
question
) = {
let letters = "ABCD"
let answerIndex = if exists(answer) { choices.position((x) => x == answer) } else { none }
set enum(numbering: (it) => {
let curIndex = it - 1
let index = letters.at(curIndex)
let indexContent = bold(index + ".")
if answerIndex != none and curIndex == answerIndex {
box(indexContent, fill: green.lighten(70%), radius: 50%,
inset: 3pt)
} else {
indexContent
}
}, spacing: 20pt, tight: false, body-indent: 10pt)
let trax((i, item)) = {
return if answerIndex == i {
text(weight: "bold", fill: green.lighten(25%), item)
} else {
item
}
}
let choicesContent = enum(..choices.map(markup).enumerate().map(trax))
question
v(10pt)
choicesContent
// let val = block(fill: yellow, grid(column-gutter: 20pt, columns: (170pt, 1fr), a, b))
}
// #mc([hi])
#{
// [aaa]
// flex(lorem(55), lorem(18), lorem(36))
// [bbb]
}
// #math-front-matter(title: "hi", aaa: "bo")
|
|
https://github.com/typst/package-check | https://raw.githubusercontent.com/typst/package-check/main/README.md | markdown | Apache License 2.0 | # Typst package check
A tool to report common errors in Typst packages.
This tool can be used in three ways:
- `typst-package-check check`, to check a single package, in the current directory.
- `typst-package-check check @preview/NAME:VERSION` to check a given package in a clone of the `typst/packages` repository.
This command should be run from the `packages` sub-directory. In that configuration, imports will be resolved in the local
clone of the repository, nothing will be fetched from the network.
- `typst-package-check server` to start an HTTP server that listens for GitHub webhooks and runs checks when a PR is opened against
`typst/packages` (or any repository with a similar structure).
## Using this tool
You can install this tool with Cargo:
```bash
cargo install --git https://github.com/typst/package-check.git
cd my-package
typst-package-check check
```
You can also run it with Nix:
```bash
nix run github:typst/package-check -- check
```
Finally a Docker image is available:
```bash
docker run -v .:/data ghcr.io/typst/package-check check
```
When running with Docker, `/data` is the directory in which the tool will look for files to check.
## Configuring the webhook handler
The following environment variables are used for configuration.
They are all mandatory when running the server that handles webhook.
`.env` is supported.
- `PACKAGES_DIR`, path to a local clone of `typst/packages`
- `GITHUB_APP_IDENTIFIER`, the ID of the GitHub app submitting reviews.
This app should have the `checks:write` permission.
- `GITHUB_WEBHOOK_SECRET`, the secret provided by GitHub when enabling webhook handling.
- `GITHUB_PRIVATE_KEY`, the private key of the GitHub app, in PEM format.
Directly in the environment variable, not a path to an external file.
Note that you can (and should probably) use double-quotes in the `.env` file for multi-line variables.
|
https://github.com/juicebox-systems/ceremony | https://raw.githubusercontent.com/juicebox-systems/ceremony/main/instructions/README.md | markdown | MIT License | # Ceremony Instructions
This directory contains the ceremony instructions in source form.
## Build the PDF
You can build using Docker, which creates reproducible output:
```sh
./build.sh
```
Or, install [Typst](https://github.com/typst/typst) and then run:
```sh
typst compile --root .. ceremony.typ
```
This way is likely to use different fonts and therefore result in a byte-wise
different PDF.
|
https://github.com/HiiGHoVuTi/requin | https://raw.githubusercontent.com/HiiGHoVuTi/requin/main/calc/ens_fct.typ | typst | #import "../lib.typ": *
#show heading: heading_fct
On se propose ici de créer une structure de donnée permettant de représenter un ensemble infini sous la forme d’une fonction de `'a -> bool` déterministe et qui *termine toujours*. On fixe dans cet exercice $X$ un ensemble quelconque. On défini donc le type suivant en OCaml :
```ml
type 'a set = 'a -> bool;;
```
#question(0)[Expliciter la bijection entre $cal(P)(X)$ et ${0,1}^X$]
#correct([
C'est, si on pose $bold(1)_E$ l'indicatrice de l'ensemble $E$, $ phi : cases(
cal(P)(X) &--> {0,1}^X,
A &|-> bold(1)_A
) $
])
#question(0)[ Donner une fonction ```ml make_finite: 'a list -> 'a set``` qui à une liste finie renvoie l'ensemble de ses valeurs]
#correct([
```ml
let rec make_finite li x = match li with
    | [] -> false
    | e::q when e=x -> true
    | _::q -> make_finite q x
```
])
#question(1)[Construire des objets de type `int set` pour représenter les ensembles suivants :
- L’ensemble des nombres pairs
- $PP$, l’ensemble des nombres premiers
- L’image d’une fonction fixée `f` positive et strictement croissante
]
#correct([
```ml
let first x = x mod 2 = 0;;
let second x =
    let rec aux p = p*p > x || (x mod p <> 0 && aux (p+1)) in
    x >= 2 && aux 2;;
let third x =
let rec aux i = if f i >= x then f i = x else aux (i+1) in
aux 0;;
```
])
#question(1)[ Donner le code d’une fonction `union` qui réalise l’union de deux ensembles.]
#correct([
```ml let union f1 f2 x = f1 x || f2 x```
])
#question(2)[ Donner un ensemble que l'on ne pourra pas représenter par notre structure. Le nombre d'ensembles non représentables est-il fini ? Dénombrable ? Indénombrable ?]
#correct([
L'ensemble des codes (`string`) OCaml qui terminent n'est pas représentable, car il n'existe pas d'algorithme permettant de savoir si un code donné termine ou non (problème de l'arrêt). \
Étant donné qu'à chaque élément représentable on peut associer le code d'une fonction définissant l'objet, le nombre d'ensembles représentables est dénombrable. Or il y a un nombre indénombrable d'ensembles, donc il y a un nombre indénombrable d'ensembles non représentables.
])
// #correct([])
#question(3)[ Soit ```ml val P: int set set```, montrer qu’il existe un $N in NN$ tel que pour tout ```ml val x: int set```, `P x` ne regarde que les entrées inférieure à $N$ dans `x`. Est-ce vrai avec ```ml val P: (int -> int) set``` ?]
#correct([
On fait l'arbre des évaluations de l'argument de `P` : chaque nœud correspond à une évaluation de l'argument, et les branches correspondent au choix pris par l'algorithme selon que la réponse est oui ou non.\
Comme le calcul termine toujours et que le nombre de fils de chaque nœud est fini et borné par 2, on peut démontrer (ce n'est pas trivial) que l'arbre a un nombre borné de nœuds. Le max des nœuds nous donne alors le $N$.
Pour la démonstration : par l'absurde, supposons que l'arbre a un nombre infini de nœuds. Alors à la racine, un des 2 sous-arbres a un nombre infini de nœuds et on se déplace dedans. Un peu à la Weierstrass, on crée un chemin dans l'arbre. Sauf que ce chemin doit s'arrêter car il correspond à un calcul, et tous les calculs terminent. Comme le chemin se termine, il n'y a pas un nombre infini de nœuds, absurde.
Ce n'est pas vrai pour ```ml val P: (int -> int) set``` car dans ce cas, l'arbre est à branchement infini et peut donc avoir un nombre infini de nœuds. Par exemple, pour la fonction ```ml let P f = f (f 0) = 42```, l'entrée `f 0` consultée n'est pas majorée indépendamment de `f`.
])
#question(4)[ Écrire une fonction ```ml val f : (int set set) -> int set``` qui à un `int set set` non vide associe un de ces éléments.]
#correct([
Le code est plus difficile à faire que la preuve que ce soit possible. En effet, il nous suffit de simuler tous les ensembles `int set` d'entrée plus bas que $N$, et de stocker tous les résultats dans un tableau.
Code de "Tito" <NAME>, adapté du blog en Haskell de Martin Escardo :
```ml
let cons : bool -> int set -> int set = fun x u ->
  function
  | 0 -> x
  | i -> u (i-1)

let rec f : int set set -> (int set) = fun p ->
  let u0 = try_first_bit false p in
  if p u0 then u0
  else try_first_bit true p
and try_first_bit : bool -> int set set -> int set = fun b p ->
  cons b (fun i -> f (fun u -> p (cons b u)) i)
```
]) |
|
https://github.com/chen-qingyu/Typst-Code | https://raw.githubusercontent.com/chen-qingyu/Typst-Code/master/integral%202.typ | typst | #let LF = {v(3em); linebreak()}
$
& integral sqrt(1+x^2) dif x space ("let" x = tan t)
LF
=& I
LF
=& integral sec t dif tan t
LF
=& sec t tan t - integral tan t dif sec t
LF
=& sec t tan t - integral tan^2 t sec t dif t
LF
=& sec t tan t - integral (sec^2 t - 1) sec t dif t
LF
=& sec t tan t - integral sec^3 t dif t + integral sec t dif t
LF
=& sec t tan t - I + ln|sec t + tan t| + C
LF
LF
=> I =& 1/2 (sec t tan t + ln|sec t + tan t|) + C
LF
=& 1/2 (x sqrt(1+x^2) + ln(x + sqrt(1+x^2))) + C
LF
$
|
|
https://github.com/nogula/tufte-memo | https://raw.githubusercontent.com/nogula/tufte-memo/main/README.md | markdown | MIT License | # tufte-memo
A memo document template inspired by the design of <NAME> books for the Typst typesetting program.
For usage, see the usage guide [here](https://github.com/nogula/tufte-memo/blob/main/template/main.pdf).
The template provides handy functions: `template`, `note`, and `wideblock`. To create a document with this template, use:
```typst
#import "@preview/tufte-memo:0.1.2": *
#show: template.with(
title: [Document Title],
authors: (
(
name: "<NAME>",
role: "Optional Role Line",
affiliation: "Optional Affiliation Line",
email: "<EMAIL>"
),
)
)
...
```
additional configuration information is available in the usage guide.
The `note()` function provides the ability to produce sidenotes next to the main body content. It can be called simply with `#note[...]`. Additionally, `wideblock()` expands the width of its content to fill the full 6.5-inch-wide space, rather than be compressed in to a four-inch column. It is simply called with `wideblock[...]`.
|
https://github.com/mem-courses/calculus | https://raw.githubusercontent.com/mem-courses/calculus/main/note-2/5.曲线积分与曲面积分.typ | typst | #import "../template.typ": *
#show: project.with(
course: "Calculus II",
course_fullname: "Calculus (A) II",
course_code: "821T0160",
semester: "Spring-Summer 2024",
title: "Note #5: 曲线积分与曲面积分",
authors: (
(
name: "memset0",
email: "<EMAIL>",
id: "3230104585",
),
),
date: "April 9, 2024",
)
= 第一类曲线积分
== 第一类曲线积分的概念
*物理背景*:曲线段的质量。
#definition[
若曲线 $L = {(x(t),y(t),z(t)) | t in [a,b]}$ 是 $RR^3$ 上的光滑曲线,则此曲线的#def[弧微分]公式为
$
dif s = sqrt((x' (t))^2 + (y'(t))^2 + (z'(t))^2) dif t
$
]
#definition[
设 $Gamma$ 是空间(或平面)中的一段以 $A,B$ 为端点的光滑曲线,$f(P)$ 为定义在 $Gamma$ 上的有界函数。把 $Gamma$ 分割成任意 $n$ 个小段 $Delta l_1,Delta l_2,dots.c,Delta l_n$,$Delta l_i$ 的长度仍用 $Delta l_i$ 来表示。记 $lambda = display(max_(1<=i<=n) {Delta l_i})$。$forall P_i in Delta l_i$,若和式极限
$
lim_(lambda->0) sum_(i=1)^n f(P_i) Delta l_i
$
存在,且极限值与曲线 $Gamma$ 的分法及 $P_i$ 点的取法无关,则称上述极限为 $f(P)$ 在 $Gamma$ 上的#def[第一类曲线积分]。
]
== 平面曲线积分的计算法
#theorem[
设平面曲线 $Gamma$ 的参数方程为 $display(cases(
x = x(t),
y = y(t)
) space alpha <= t <= beta)$,其中 $x'(t), space y'(t)$ 在 $[alpha,beta]$ 上连续,则弧微分为 $display(dif l = sqrt(x'^2 (t) + y'^2 (t)) dif t)$。
设 $f(x,y)$ 为 $Gamma$ 上的连续函数,则
$
int_Gamma f(x,y) dif l = int_alpha^beta f(x(t), y(t)) sqrt(x'^2 (t) + y'^2 (t)) dif t
$
]
#note[
计算第一类曲线积分时,不要忘记利用对称性化简,可参考二重积分利用对称性化简的部分。
]
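#note[
  例如,设 $L$ 为圆周 $x^2 + y^2 = a^2$,由对称性有 $display(int_L x^2 dif l = int_L y^2 dif l)$,故
  $
  int_L x^2 dif l = 1/2 int_L (x^2 + y^2) dif l = 1/2 a^2 dot.c 2 pi a = pi a^3
  $
]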
#note[
若曲线 $Gamma$ 的方程为 $r=r(theta),space theta in [alpha,beta]$,则
$
int_Gamma f(x,y) dif s = int_alpha^beta f(r cos theta, r sin theta) sqrt(r^2 (theta) + r'^2 (theta)) dif theta
$
]
#tip[
【空间质线的转动惯量】
设有空间质线 $Gamma$,其线密度为连续函数 $mu(x,y,z)$,则质线关于 $L$ 轴的转动惯量为
$
I_L = int_Gamma overline(P P_L)^2 mu(x,y,z) dif l
$
其中,$overline(P P_L)$ 为点 $P(x,y,z)$ 到 $L$ 轴的距离。特殊的,质线关于 $x$ 轴的转动惯量为
$
I_x = int_Gamma (y^2 + z^2) mu(x,y,z) dif l
$
关于 $y,z$ 轴的转动惯量形式也相仿。
]
= 第一类曲面积分
== 第一类曲面积分的概念
*物理背景*:曲面薄壳的质量。
#definition[
设 $S$ 是空间中的一张有界光滑曲面,$f(x,y,z)$ 为定义在 $S$ 上的有界函数。将 $S$ 分成互不相交的 $n$ 个小块 $Delta S_1,Delta S_2,dots.c,Delta S_n$,$Delta S_i$ 的面积仍旧用 $Delta S_i$ 来表示,记 $lambda= max_(1<=i<=n) {Delta S_i "的直径"}$。$forall P_i (xi_i,eta_i,zeta_i) in Delta S_i$,若极限
$
lim_(lambda->0) sum_(i=1)^n f(xi_i,eta_i,zeta_i) Delta S_i
$
存在,且极限值与区域 $S$ 的分割方法及 $P_i$ 的取法无关,则称上述极限为函数 $f(x,y,z)$ 在 $S$ 上的第一类曲面积分,记作
$
iintb(S) f(x,y,z) dif S
$
]
== 第一类曲面积分的计算:微元法
投影到 $x O y$ 平面:$forall dif S subset S$,$forall P(x,y,z(x,y)) in dif S$,设 $dif sigma$ 为 $dif S$ 在 $x O y$ 平面上的投影,则曲面 $S$ 在该点处的法矢量为
$
arrow(n) = pm {(diff z) / (diff x),(diff z) / (diff y),-1}
$
则 $arrow(n)$ 与 $z$ 轴正向夹角 $gamma$ 的余弦为:
$
cos gamma = pm 1 / display(sqrt(1+((diff z)/(diff x))^2+((diff z)/(diff y))^2))
$
则
$
dif S = (dif sigma) / abs(cos gamma) = sqrt(1+((diff z)/(diff x))^2 + ((diff z)/(diff y))^2) dif sigma
$
#conclusion[
设 $f(x,y,z)$ 在曲面 $S$ 上连续,若 $S:z=z(x,y),space (x,y) in sigma_(x y)$,则
$
iintb(S) f(x,y,z) dif S
= iintb(sigma_(x y)) f(x,y,z(x,y)) sqrt(1+((diff z)/(diff x))^2+((diff z)/(diff y))^2) dif sigma
$
同理,可以得到投影到 $y O z$ 平面和 $x O z$ 平面的类似结论。
]
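#note[
  例如,设 $S$ 为上半球面 $z = sqrt(a^2 - x^2 - y^2)$,则 $display(sqrt(1+((diff z)/(diff x))^2+((diff z)/(diff y))^2) = a/z)$,故
  $
  iintb(S) z dif S = iintb(sigma_(x y)) z dot.c a/z dif sigma = a dot.c pi a^2 = pi a^3
  $
]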
== 点函数积分
#definition[
设 $Omega$ 为有界形体,$partial Omega$ 为 $Omega$ 的边界。若 $partial Omega subset Omega$,则称 $Omega$ 是#def[闭形体]。
]
#definition[
设 $Omega$ 是#def[可度量的],则称 $Omega$ 是可求长(或面积、体积)的。对应的长度(或面积、体积)称为 $Omega$ 的#def[量度]。
]
#definition[
设 $Omega$ 是 $n$ 维有界闭形体,$f(P)$ 是定义在 $Omega$ 上的有界函数。将 $Omega$ 分割成 $n$ 个小子形体:$Delta Omega_1,Delta Omega_2,dots.c,Delta Omega_n$,$Delta Omega_i$ 的量度仍旧用 $Delta Omega_i$ 来表示,$Delta Omega_i$ 的直径用 $lambda_i$ 来表示,$i=1,2,dots.c,n$。记 $lambda=max{lambda_i | i=1,2,dots.c,n}$。$forall P_i in Delta Omega_i$,作和 $display(sum_(i=1)^n f(P_i) Delta Omega_i)$,若极限
$
lim_(lambda->0) sum_(i=1)^n f(P_i) Delta Omega_i
$
存在,且与 $Omega$ 的分割方法及 $P_i$ 的取法无关,则称此极限为点函数 $f(P)$ 在区域 $Omega$ 上的积分,记作
$
int_Omega f(P) dif Omega
$
]
\
我们学过的点函数积分有以下五种:
1. #[
一元函数定积分:设 $Omega = [a,b] subset RR^1$,$f(P) = f(x), space x in [a,b]$,则
$
int_Omega f(P) dif Omega = int_a^b f(x) dif x
$
]
2. #[
二重积分:设 $Omega = sigma subset RR^2$,$f(P)=f(x,y), space (x,y) in sigma$,则
$
int_Omega f(P) dif Omega = iintb(sigma) f(x,y) dif sigma
$
]
3. #[
三重积分:设 $Omega = V subset RR^3$,$f(P) = f(x,y,z), space (x,y,z) in V$,则
$
int_Omega f(P) dif Omega = iiintb(V) f(x,y,z) dif V
$
]
4. #[
第一类曲线积分(对弧长的积分):设 $Omega = Gamma subset RR^3$,$f(P) = f(x,y,z) , space (x,y,z) in Gamma$,则
$
int_Omega f(P) dif Omega = int_Gamma f(x,y,z) dif s
$
]
5. #[
第一类曲面积分(对面积的积分):设 $Omega = S subset RR^3$,$f(P) = f(x,y,z), space (x,y,z) in S$,则
$
int_Omega f(P) dif Omega = iintb(S) f(x,y,z) dif S
$
]
= 第二类曲线积分
== 第二类曲线积分的概念
*物理背景*:变力沿曲线所作的功。
#definition(name: [第二类曲线积分])[
设 $Gamma$ 是一条以 $A,B$ 为端点的光滑曲线,并指定从 $A$ 到 $B$ 的曲线方向。在 $Gamma$ 上任取一点 $M(x,y,z)$,作曲线的单位切矢量
$
arrow(T^circle.small) = arrow(T^circle.small) (x,y,z) = cos alpha arrow(i) + cos beta arrow(j) + cos gamma arrow(k)
$
其方向与指定的曲线方向一致。又设
$
arrow(A) = arrow(A) (x,y,z) = P(x,y,z) arrow(i) + Q(x,y,z) arrow(j) + R(x,y,z) arrow(k)
$
其中,$P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 是定义在 $Gamma$ 上的有界函数。
$
arrow(A) dot arrow(T^circle.small) = P(x,y,z) cos alpha + Q(x,y,z) cos beta + R(x,y,z) cos gamma
$
那么
$
int_Gamma arrow(A) dot arrow(T^circle.small) dif l
&= int_Gamma P(x,y,z) cos alpha dif l + int_Gamma Q(x,y,z) cos beta dif l + int_Gamma R(x,y,z) cos gamma dif l\
&= int_Gamma P(x,y,z) dif x + Q(x,y,z) dif y + R(x,y,z) dif z
$
上式称为函数 $P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 沿曲线 $Gamma$ 从点 $A$ 到点 $B$ 的#def[第二类曲线积分],或称为#def[对坐标的曲线积分],或称为#def[关于弧长元素投影的积分]。
]
#definition[
另记 $arrow(dif l) = arrow(T^circle.small) dif l$,则 $arrow(dif l) = {cos alpha dif l, cos beta dif l, cos gamma dif l} = {dx, dy,dz}$,称为#def[有向弧长元素]。
]
== 第二类曲线积分的性质
#property(name: [线性性质])[
$display(
int_Gamma (a arrow(F) + beta arrow(G)) dif l = alpha int_Gamma arrow(F) dif l + beta int_Gamma arrow(G) dif l
)$,其中 $alpha,beta$ 是常数。
]
#property(name: [弧段可加性])[
设 $Gamma = Gamma_1 union Gamma_2$,$Gamma_1$ 与 $Gamma_2$ 没有公共内点,且 $Gamma_1,Gamma_2$ 的方向都与 $Gamma$ 的方向一致,则有 $display(
int_Gamma arrow(F) arrow(dif l) = int_(Gamma_1) arrow(F) arrow(dif l) + int_(Gamma_2) arrow(F) arrow(dif l)
)$。
]
#property[
设曲线 $l$ 的端点为 $A,B$,则 $display(int_(Gamma_(A B)) arrow(F) arrow(dif l) = - int_(Gamma_(B A)) arrow(F) arrow(dif l))$。即改变曲线积分的方向,其结果添负号。
]
== 第二类曲线积分的计算
#theorem[
设空间光滑曲线 $Gamma_(A B)$ 的参数方程为 $x = x(t); space y = y(t); space z = z(t)$,起点 $A$ 所对应的参数值为 $t_A$,终点 $B$ 所对应的参数值为 $t_B$,且 $P(x,y,z),space Q(x,y,z),space R(x,y,z)$ 在 $Gamma_(A B)$ 上连续,则
$
& int_(Gamma_(A B)) P(x,y,z) dif x + Q(x,y,z) dif y + R(x,y,z) dif z\
=& int_(t_A)^(t_B) (P(x(t),y(t),z(t)) x'(t) + Q(x(t),y(t),z(t)) y'(t) + R(x(t),y(t),z(t)) z'(t))
dif t \
$
]
#tip[
【求力场 $arrow(F)$ 对运动质点所作的功 $W$】
1. 利用 $arrow(F) = abs(F) arrow(F^circle.small)$,求出 $arrow(F) = P arrow(i) + Q arrow(j) + R arrow(k)$。
2. 求出质点运动路径 $Gamma_(A B)$ 的参数方程。
3. 写出功 $W$ 的积分表达式 $display(W = int_(Gamma_(A B)) P dx + Q dy + R dz)$ 并计算。
]
#tip[
【求 $I = display(int_L P dx + Q dy)$ 的步骤】(涉及到下文的格林公式和路径无关性)
1. 先判断 $display((diff P)/(diff y) = (diff Q)/(diff x))$ 是否成立。若成立,则利用路径无关性的性质计算(注意:要求在所选路径上,$P,Q$ 及其偏导数连续)
2. 若不成立,但 $display((diff Q)/(diff x) - (diff P)/(diff y))$ 较简单;
2.1. 若 $L$ 封闭且 $P,Q$ 及其偏导数在 $L$ 所围的区域连续时,直接用格林公式。
2.2. 若非闭,则添加简单曲线使其变成封闭曲线,再用格林公式。要求添加的简单曲线与 $L$ 所围的区域上,$P,Q$ 及其偏导数连续。
]
== 格林公式
#theorem[
设 $D$ 是一个平面有界闭区域,它的边界 $Gamma$ 由有限条分段光滑的曲线组成。函数 $P(x,y)$,$Q(x,y)$ 在 $D$ 上连续,并且具有连续的偏导数,则
#set math.mat(delim: "|")
$
intcb(Gamma) P dx + Q dy
= iintb(D) ((diff Q) / (diff x) - (diff P) / (diff y)) dif sigma
defeq iintb(D) mat(
display(diff/(diff x)), display(diff/(diff y));
P, Q
) dif sigma
$
#set math.mat(delim: "(")
TBD:正向
#proof[
#grid(
columns: (3.5fr, 1fr),
[
(i) 若 $D$ 是二维平面上的简单闭区域(既是 $x$ 型区域又是 $y$ 型区域,即通过 $x$ 轴上的任一点,作平行于坐标轴的直线,这条直线与 $D$ 的边界曲线 $Gamma$ 至多有两个交点,但允许其中有一段是平行于坐标轴的直线段),这时,可设
$
D = {(x,y) | y_1(x)<=y<=y_2(x),space a<=x<=b}。
$
先证:
$
intcb(Gamma) P dif x = - iintb(D) (diff P) / (diff y) dif sigma。
$
],
[
// #align(center, image("images/2024-06-07-21-49-14.png", width: 100%))
],
)
设 $Gamma_1$ 是区域 $D$ 下方的一段边界曲线,$Gamma_2$ 是上方的一段,$Gamma_3$ 是垂直的一段,则
$
intcb(Gamma) P dif x
&= intcb(Gamma_1) P dif x + intcb(Gamma_2) P dif x + intcb(Gamma_3) P dif x\
&= int_a^b P(x,y_1(x)) dx + int_b^a P(x,y_2(x)) dx + 0\
&= int_a^b P(x,y_1(x)) dx - int_a^b P(x,y_2(x)) dx
$
$
- iintb(D) (diff P) / (diff y) dif sigma
&= -int_a^b dx int_(y_1(x))^(y_2(x)) (diff P) / (diff y) dif y
= - int_a^b atpos(P(x,y), y=y_1(x), y=y_2(x)) dx\
&= - int_a^b P(x,y_2(x)) dx + int_a^b P(x,y_1(x)) dx
$
同理,$display(intcb(Gamma) Q dif y = iintb(D) (diff Q) / (diff x) dif sigma)$。两式相加即得
$
intcb(Gamma) P dx + Q dy = iintb(D) ((diff Q) / (diff x) - (diff P) / (diff y)) dif sigma
$
(ii) 若 $D$ 是一条按段光滑的闭曲线 $Gamma$ 围成的,则可用几段光滑曲线将 $D$ 分成有限个满足 (i) 的区域,然后应用 (i) 中的方法可推得相应的格林公式。
(iii) 若 $D$ 是由多条曲线所围成的,同样可以应用类似方法。
]
]
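#note[
  例如,取 $P = -y$,$Q = x$,则 $display((diff Q)/(diff x) - (diff P)/(diff y) = 2)$,由格林公式得平面区域 $D$ 的面积公式
  $
  sigma(D) = 1/2 intcb(Gamma) x dif y - y dif x
  $
  对椭圆 $x = a cos t, space y = b sin t, space t in [0, 2pi]$,有 $x dif y - y dif x = a b dif t$,故椭圆面积为 $display(1/2 int_0^(2 pi) a b dif t) = pi a b$。
]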
== 平面曲线积分与路径无关性
#theorem(name: [平面曲线积分与路径无关的四个条件])[
设 $D$ 是一个平面单连通区域,若函数 $P(x,y)$ 和 $Q(x,y)$ 在区域 $D$ 上连续,且具有连续的一阶偏导数,则以下四个条件等价:
(1) 沿 $D$ 内任一分段光滑的封闭曲线 $L$,有 $display(intc_L P dx + Q dy = 0)$。
(2) 对 $D$ 内任一分段光滑曲线 $Gamma_(A B)$,$display(int_(Gamma_(A B)) P dx + Q dy)$ 与路径 $Gamma$ 无关,只与起点 $A$ 与终点 $B$ 的位置有关。
(3) $P dx + Q dy$ 是 $D$ 内某一函数 $u(x,y)$ 的全微分,即存在 $u(x,y),space (x,y) in D$,使 $dif u = P dx + Q dy$。
(4) $display((diff P)/(diff y) = (diff Q)/(diff x))$,$forall (x,y) in D$。
#proof[
#record("2024-05-30第3-5节 00:15:00")
]
#tip[
【原函数的求法】
若曲线积分与路径 $Gamma$ 无关,只与起点 $A$ 和终点 $B$ 的位置有关,则
$
u(x,y)
&= int_((x_0,y_0))^((x,y)) P(x,y) dx + Q (x,y) dy + C
= int_((x_0,y_0))^((x,y_0)) + int_((x,y_0))^((x,y)) + C\
&= int_(x_0)^x P(x,y_0) dx + int_(y_0)^y Q(x,y) dy + C
$
]
]
#theorem(name: [平面曲线积分的牛莱公式])[
若 $dif u (x,y) = P dx + Q dy$,则
$
int_(Gamma_(A B)) P dx + Q dy
= int_A^B P dx + Q dy
= int_A^B dif u(x,y)
= atpos(u(x,y), A(x_1,y_1), B(x_2,y_2))
= u(x_2,y_2) - u(x_1,y_1)
$
#proof[
#record("2024-05-30第3-5节 00:32:47")
]
]
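#note[
  例如,$P = 2 x y$,$Q = x^2$ 满足 $display((diff P)/(diff y) = 2x = (diff Q)/(diff x))$,且 $dif (x^2 y) = 2 x y dif x + x^2 dif y$,故
  $
  int_((0,0))^((1,2)) 2 x y dif x + x^2 dif y = atpos(x^2 y, (0,0), (1,2)) = 2
  $
]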
== 空间曲线积分与路径无关性
#definition(name: [空间线(面)单连通区域])[
区域 $V$ 称为#def[线(面)单连通区域],如果 $V$ 内任一封闭曲线(面)可以不经过 $V$ 以外的点而连续收缩于 $V$ 中的一点。
#note[
也就是说这一空间线(面)的内部没有洞,否则包含这个洞的空间线(面)就不可能连续收缩于一点。
]
]
#theorem(name: [空间曲线积分与路径无关的四个条件])[
设 $V$ 是一个空间线单连通区域,若函数 $P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 在区域 $V$ 上连续,且具有连续的一阶偏导数,则以下四个条件等价:
(1) 对 $V$ 内任一分段光滑的封闭曲线 $L$,有 $display(intc_L P dx + Q dy + R dz) = 0$。
(2) 对 $V$ 内任一分段光滑曲线 $Gamma_(A B)$,$display(int_(Gamma_(A B)) P dx + Q dy + R dz)$ 与路径 $Gamma$ 无关,只与起点 $A$ 和终点 $B$ 的位置有关。
(3) 存在 $u(x,y,z),space (x,y,z) in V$,使 $du = P dx + Q dy + R dz$。这时,我们称 $u(x,y,z)$ 为 $P dx + Q dy + R dz$ 的一个原函数。
(4) $display((diff P)/(diff y) = (diff Q)/(diff x))$,$display((diff Q)/(diff z) = (diff R)/(diff y))$,$display((diff R)/(diff x) = (diff P)/(diff z))$,$forall (x,y,z) in V$。
#tip[
【原函数的求法】
既然曲线积分与路径 $Gamma$ 无关,只与起点 $A(x_0,y_0,z_0)$ 和终点 $B(x,y,z)$ 的位置有关,我们不妨设其沿着一条最简单的路径运动。
$
u(x,y,z) =& int_((x_0,y_0,z_0))^((x,y,z)) P(x,y,z) dx + Q(x,y,z) dy + R(x,y,z) dz + C\
=& int_((x_0,y_0,z_0))^((x,y_0,z_0)) + int_((x_0,y_0,z_0))^((x,y,z_0)) + int_((x_0,y_0,z_0))^((x,y,z)) + C\
=& int_(x_0)^x P(x,y_0,z_0) dx + int_(y_0)^y Q(x,y,z_0) dy + int_(z_0)^z R(x,y,z) dz + C
$
]
]
#theorem(name: [空间曲线积分的牛莱公式])[
若 $du (x,y,z) = P dx + Q dy + R dz$,则
$
int_(Gamma_(A B)) P dx + Q dy + R dz
=& int_A^B P dx + Q dy + R dz
= int_A^B du (x,y,z)
= atpos(u(x,y,z), A, B)
= u(B) - u(A)
$
]
= 第二类曲面积分
== 第二类曲面积分的概念
*物理背景*:流速场中流体通过某定侧曲面的流量。
#definition[
设 $S$ 是一个光滑曲面,则 $S$ 上处处都有连续变动的切平面和法线。$forall M in S$,曲面 $S$ 在点 $M$ 处的法线有两个方向;当取定一个方向为正向时,另一个方向为负向。过点 $M$ 作曲面 $S$ 的法矢量 $arrow(n)$。在曲面 $S$ 上取定一点 $M_0$,当动点 $M$ 从 $M_0$ 出发沿曲面不越过边界的任一封闭曲线连续移动且回到原来的位置,若其指向也不变,则称这种曲面是#def[双侧曲面],否则称这种曲面为#def[单侧曲面]。
]
#definition[
指定了法线方向的双侧曲面,称为#def[定侧曲面]。这里,我们只讨论双侧曲面。
]
#definition(name: [第二类曲面积分])[
设 $S$ 是一个有界的光滑定侧曲面,$forall M(x,y,z) in S$,点 $M$ 处的沿曲面指定侧的单位法矢量为
$
arrow(n^circle.small) = arrow(n^circle.small)(x,y,z) = cos alpha arrow(i) + cos beta arrow(j) + cos gamma arrow(k)
$
又设
$
arrow(A) = arrow(A)(x,y,z) = P(x,y,z) arrow(i) + Q(x,y,z) arrow(j) + R(x,y,z) arrow(k)
$
其中函数 $P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 是定义在 $S$ 上的有界函数。则
$
arrow(A) dot arrow(n^circle.small) = P(x,y,z) cos alpha + Q(x,y,z) cos beta + R(x,y,z) cos gamma
$
所以
#set math.mat(delim: "|")
$
iintb(S) arrow(A) dot arrow(n^circle.small) dif S
&= iintb(S) P(x,y,z) cos alpha dif S + iintb(S) Q(x,y,z) cos beta dif S + iintb(S) R(x,y,z) cos gamma dif S\
&= iintb(S) P(x,y,z) dy dz + iintb(S) Q(x,y,z) dz dx + iintb(S) R(x,y,z) dx dy\
&defeq iintb(S) P(x,y,z) dy dz + Q(x,y,z) dz dx + R(x,y,z) dx dy
= iintb(S) mat(
dy dz, dz dx, dx dy;
P, Q, R
) dif S \
&defeq arrow(A) dot arrow(dif S)\
$
#set math.mat(delim: "(")
上式称为函数 $P(x,y,z),space Q(x,y,z), space R(x,y,z)$ 沿曲面 $S$ 指定侧的#def[第二类曲面积分],也称为#def[对坐标的积分]。
]
== 第二类曲面积分的性质
#property(name: [线性性质])[
若 $alpha,beta$ 为常数,则有 $
iintb(S) (alpha arrow(A) + beta arrow(B)) arrow(dif S)
= alpha iintb(S) arrow(A) dot arrow(dif S)
+ beta iintb(S) arrow(B) dot arrow(dif S)
$
]
#property(name: [对定侧曲面的可加性])[
若曲面 $S$ 分为两个曲面 $S_1$ 与 $S_2$,满足 $S = S_1 union S_2$,且 $S_1$ 与 $S_2$ 没有公共内点,但不改变曲面的侧,则
$
iintb(S) arrow(A) dot arrow(dif S)
= iintb(S_1) arrow(A) dot arrow(dif S)
+ iintb(S_2) arrow(A) dot arrow(dif S)
$
]
#property(name: [方向性])[
若 $S^-$ 表示曲面 $S$ 的另一侧,则
$
iintb(S) arrow(A) dot arrow(dif S)
= - iintb(S^-) arrow(A) dot arrow(dif S)
$
#proof[
这是因为曲面不同的侧,每点处的 $arrow(n^circle.small)$ 方向恰好相反,故面积的投影相差一个负号。
]
]
== 第二类曲面积分的计算
第二类曲面积分的三个部分要分开计算,下面以 $display(iintb(S) R(x,y,z) dx dy)$ 为例。其中 $dx dy = cos gamma dif S$。考虑到:
$
dx dy = cos gamma dif S = cases(
dif sigma\,&quad cos gamma > 0,
0\,&quad cos gamma = 0,
-dif sigma\,&quad cos gamma < 0
)
$
我们可以得到
#conclusion[
$
iintb(S) R(x,y,z) dx dy
= cases(
display(iintb(sigma_(x y)) R(x,y,z(x,y)) dif sigma \,&quad gamma in [0,pi/2)),
display(0 \,&quad gamma = pi/2),
display(-iintb(sigma_(x y)) R(x,y,z(x,y)) dif sigma \,&quad gamma in (pi/2,pi]),
)
$
]
== 高斯公式
格林公式建立了沿封闭曲线的第二类曲线积分与二重积分的联系,类似地,沿空间闭曲面的第二类曲面积分和三重积分也有类似的联系。
#theorem(name: [高斯公式])[
设 $V$ 是一个空间有界闭区域,它的边界 $S$ 由有限多个分片光滑曲面所围成。函数 $P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 在 $V$ 上连续,且具有连续的偏导数,则
$
iintcb(S) P dy dz + Q dz dx + R dx dy
= iiintb(V) ((diff P) / (diff x) + (diff Q) / (diff y) + (diff R) / (diff z)) dif V
$
其中上式左端的曲面 $S$ 取外侧,$cos alpha, cos beta, cos gamma$ 是曲面 $S$ 外法线的方向余弦。
]
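#note[
  例如,设 $S$ 为球面 $x^2 + y^2 + z^2 = a^2$ 的外侧,取 $P = x, space Q = y, space R = z$,则由高斯公式
  $
  iintcb(S) x dif y dif z + y dif z dif x + z dif x dif y = iiintb(V) 3 dif V = 3 dot.c 4/3 pi a^3 = 4 pi a^3
  $
]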
== Stokes 公式
#theorem(name: [Stokes 公式])[
设函数 $P(x,y,z)$,$Q(x,y,z)$,$R(x,y,z)$ 及其一阶偏导数在空间区域 $Omega$ 上连续,$S$ 是 $Omega$ 内的一张光滑曲面,曲面 $S$ 的边界曲线 $L$ 是分段光滑的连续曲线,$S$ 的法线方向与 $L$ 的方向符合右手法则(即人在 $S$ 的正侧沿 $L$ 行走时,$S$ 总位于他的左边),则
#set math.mat(delim: "|")
$
intcb(L) P dx + Q dy + R dz
=& iintb(S) ((diff R) / (diff y) - (diff Q) / (diff z)) dy dz
+ ((diff P) / (diff z) - (diff R) / (diff x)) dz dx
+ ((diff Q) / (diff x) - (diff P) / (diff y)) dx dy\
=& iintb(S) mat(
dy dz, dz dx, dx dy;
display(diff/(diff x)), display(diff/(diff y)), display(diff/(diff z));
P, Q, R)
= iintb(S) mat(
cos alpha, cos beta, cos gamma;
display(diff/(diff x)), display(diff/(diff y)), display(diff/(diff z));
P, Q, R) dif S
$
#set math.mat(delim: "(")
#note[
Stokes 公式中的曲面 $S$ 是以 $L$ 为边界的有侧光滑曲面。左端的积分值与以 $L$ 为边界的光滑曲面 $S$ 的形状无关。因此在计算中,可以选择最简单的曲面来求积分。
]
]
== 场论初步
=== 通量与散度
// TBD:https://classroom.zju.edu.cn/livingroom?course_id=60204&sub_id=1157684&tenant_code=112
#definition(name: [散度])[
设 $arrow(A)(x,y,z) = P(x,y,z) arrow(i)$
]
=== 矢量场的旋度
#definition(name: [矢量场的循环量、环量])[
在矢量场 $arrow(A) (M)$ 中,矢量 $arrow(A) (M)$ 沿有向封闭曲线 $L$ 的曲线积分 $display(intc_L arrow(A) dot dif arrow(l))$ 称为矢量场 $arrow(A) (M)$ 沿封闭曲线 $L$ 的#def[循环量]。
]
#definition(name: [平均循环量、平均环量密度])[
  设 $L$ 所围的曲面为 $S$,其面积也记为 $S$,且 $L$ 的方向与 $S$ 的法矢量 $arrow(n)$ 的方向符合右手法则,则 $display(display(intc_L arrow(A) dot dif arrow(l))/(S))$ 称为矢量场 $arrow(A) (M)$ 沿封闭曲线 $L$ 的绕法矢量 $arrow(n)$ 的#def[平均循环量],即循环量关于面积的平均变化率。
]
#definition(name: [环量密度])[
设 $arrow(A) = arrow(A) (M)$ 是一个矢量场,$L$ 是场中的一条封闭光滑曲线,$S$ 是以 $L$ 为边界的任意光滑曲面,其面积也记为 $S$,$L$ 的方向与曲面 $S$ 的法矢量 $arrow(n)$ 的方向符合右手法则,如果平均循环量 $display(display(intc_L arrow(A) dot dif arrow(l))/(S))$ 当曲面 $S$ 按任意方式无限收缩于点 $M$ 时,极限 $display(lim_(S -> M) display(intc_L arrow(A) dot dif arrow(l))/S)$ 存在,则称此极限为矢量场 $arrow(A) (M)$ 在点 $M$ 处绕 $arrow(n)$ 的#def[环量密度]。
]
#definition(name: [旋度])[
  设矢量场 $arrow(A) (x,y,z) = P(x,y,z) arrow(i) + Q(x,y,z) arrow(j) + R(x,y,z) arrow(k)$ 满足 Stokes 公式的条件,则
$
intc_L arrow(A) dot dif arrow(l)
&= intc_L P dx + Q dy + R dz\
&= iintb(S) ((diff R) / (diff y) - (diff Q) / (diff z)) dy dz
+ ((diff P) / (diff z) - (diff R) / (diff x)) dz dx
+ ((diff Q) / (diff x) - (diff P) / (diff y)) dx dy\
&= iintb(S) (
((diff R) / (diff y) - (diff Q) / (diff z)) arrow(i)
+ ((diff P) / (diff z) - (diff R) / (diff x)) arrow(j)
+ ((diff Q) / (diff x) - (diff P) / (diff y)) arrow(k)
    ) dot (dy dz arrow(i) + dz dx arrow(j) + dx dy arrow(k))\
&defeq iintb(S) rot arrow(A) dot arrow(dif S)
= iintb(S) rot arrow(A) dot arrow(n^circle.small) dif S
$
即称
#set math.mat(delim: "|")
$
rot arrow(A)
= ((diff R) / (diff y) - (diff Q) / (diff z)) arrow(i)
+ ((diff P) / (diff z) - (diff R) / (diff x)) arrow(j)
+ ((diff Q) / (diff x) - (diff P) / (diff y)) arrow(k)
defeq mat(
arrow(i), arrow(j), arrow(k);
display(diff/(diff x)), display(diff/(diff y)), display(diff/(diff z));
P, Q, R;
)
$
#set math.mat(delim: "(")
为矢量场 $arrow(A)$ 在点 $M$ 处的#def[旋度]。
]
#theorem(name: [旋度与环量密度的关系])[
矢量场 $arrow(A) (M)$ 在点 $M$ 处绕 $arrow(n)$ 的环量密度
$
display(lim_(S->M) display(intc_L arrow(A) dot dif arrow(l))/S = lim_(S->M) display(iintb(S) rot arrow(A) dot arrow(n^circle.small) dif S)/S)
$
] |
|
https://github.com/LDemetrios/Typst4k | https://raw.githubusercontent.com/LDemetrios/Typst4k/master/src/test/resources/suite/foundations/assert.typ | typst | --- assert-fail ---
// Test failing assertions.
// Error: 2-16 assertion failed
#assert(1 == 2)
--- assert-fail-message ---
// Test failing assertions.
// Error: 2-51 assertion failed: two is smaller than one
#assert(2 < 1, message: "two is smaller than one")
--- assert-bad-type ---
// Test failing assertions.
// Error: 9-15 expected boolean, found string
#assert("true")
--- assert-eq-fail ---
// Test failing assertions.
// Error: 2-19 equality assertion failed: value 10 was not equal to 11
#assert.eq(10, 11)
--- assert-eq-fail-message ---
// Test failing assertions.
// Error: 2-55 equality assertion failed: 10 and 12 are not equal
#assert.eq(10, 12, message: "10 and 12 are not equal")
--- assert-ne-fail ---
// Test failing assertions.
// Error: 2-19 inequality assertion failed: value 11 was equal to 11
#assert.ne(11, 11)
--- assert-ne-fail-message ---
// Test failing assertions.
// Error: 2-57 inequality assertion failed: must be different from 11
#assert.ne(11, 11, message: "must be different from 11")
--- assert-success ---
// Test successful assertions.
#assert(5 > 3)
#assert.eq(15, 15)
#assert.ne(10, 12)
|
|
https://github.com/augustebaum/tenrose | https://raw.githubusercontent.com/augustebaum/tenrose/main/internals.typ | typst | MIT License | #let plugin = plugin("tenrose.wasm")
#let double-precision = 1000
#let length-to-int(value) = {
calc.round(value * double-precision / 1pt)
}
#let int-to-length(value) = {
value / double-precision * 1pt
}
/// Encodes a 32-bytes integer into big-endian bytes.
#let encode-int(value) = {
bytes((
calc.rem(calc.quo(value, 0x1000000), 0x100),
calc.rem(calc.quo(value, 0x10000), 0x100),
calc.rem(calc.quo(value, 0x100), 0x100),
calc.rem(calc.quo(value, 0x1), 0x100),
))
}
/// Decodes a big-endian integer from the given bytes.
#let decode-int(bytes) = {
let result = 0
for byte in array(bytes) {
result = result * 256 + byte
}
return result
}
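// The two helpers above define a simple wire format: lengths are fixed-point
// (scaled by `double-precision = 1000`), and integers are packed as four
// big-endian bytes. A minimal Python model of that round trip (illustrative
// only, not part of the Typst source; all names are hypothetical):

```python
DOUBLE_PRECISION = 1000  # mirrors the double-precision constant above

def length_to_int(pt: float) -> int:
    # fixed-point encoding: round(value * 1000 / 1pt)
    return round(pt * DOUBLE_PRECISION)

def encode_int(value: int) -> bytes:
    # 32-bit big-endian, like encode-int
    return bytes([(value >> s) & 0xFF for s in (24, 16, 8, 0)])

def decode_int(buf: bytes) -> int:
    # fold bytes back into an integer, like decode-int
    out = 0
    for b in buf:
        out = out * 256 + b
    return out

# A 12.345pt length survives the round trip with 1/1000 pt precision.
n = length_to_int(12.345)
assert decode_int(encode_int(n)) == n
```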
/// Encodes an array of integers into bytes.
#let encode-int-array(arr) = {
bytes(
arr
.map(encode-int)
.map(array)
.flatten()
)
}
/// Encodes an array of strings into bytes.
#let encode-string-array(strings) = {
bytes(strings.map(string => array(bytes(string)) + (0,)).flatten())
}
/// Transforms bytes into an array whose elements are all `bytes` with the
/// specified length.
#let group-bytes(buffer, group-len) = {
assert(calc.rem(buffer.len(), group-len) == 0)
array(buffer).fold((), (acc, x) => {
if acc.len() != 0 and acc.last().len() < group-len {
acc.last().push(x)
acc
} else {
acc + ((x,),)
}
}).map(bytes)
}
/// Group elements of the array in pairs.
#let array-to-pairs(arr) = {
assert(calc.even(arr.len()))
arr.fold((), (acc, x) => {
if acc.len() != 0 and acc.last().len() < 2 {
acc.last().push(x)
acc
} else {
acc + ((x,),)
}
})
}
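// `group-bytes` and `array-to-pairs` only re-chunk a flat buffer into
// fixed-size groups; in Python the same reshaping is a slice loop
// (illustrative sketch, hypothetical names):

```python
def group_bytes(buf: bytes, group_len: int):
    # split a buffer into fixed-size byte groups, like group-bytes above
    assert len(buf) % group_len == 0
    return [buf[i:i + group_len] for i in range(0, len(buf), group_len)]

def to_pairs(xs):
    # pair up consecutive elements, like array-to-pairs above
    assert len(xs) % 2 == 0
    return [xs[i:i + 2] for i in range(0, len(xs), 2)]

# e.g. four decoded coordinates become two (x, y) points
assert to_pairs([1, 2, 3, 4]) == [[1, 2], [3, 4]]
```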
/// Get an array of evaluated labels from a graph.
#let get-labels(manual-label-names, dot) = {
let encoded-labels = plugin.get_labels(
encode-int(manual-label-names.len()),
encode-string-array(manual-label-names),
bytes(dot),
)
let encoded-label-array = array(encoded-labels).split(0).slice(0, -1).map(bytes)
encoded-label-array.map(encoded-label => {
let mode = str(encoded-label.slice(0, 1))
let label-str = str(encoded-label.slice(1))
if mode == "t" {
[#label-str]
} else if mode == "m" {
math.equation(eval(mode: "math", label-str))
} else {
      panic("Internal Diagraph error: Unsupported mode: `" + mode + "`")
}
})
}
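// The label payload returned by the plugin is a sequence of zero-terminated
// chunks, each starting with a one-byte mode ("t" for text, "m" for math)
// followed by the label text. A hedged Python model of the parsing done in
// get-labels (hypothetical names, not part of the Typst source):

```python
def parse_labels(payload: bytes):
    # Each label is "<mode byte><utf-8 text>\0"; the trailing empty chunk is
    # dropped, mirroring array(...).split(0).slice(0, -1) above.
    labels = []
    for chunk in payload.split(b"\x00")[:-1]:
        mode, text = chr(chunk[0]), chunk[1:].decode("utf-8")
        labels.append((mode, text))
    return labels

assert parse_labels(b"tA\x00mx^2\x00") == [("t", "A"), ("m", "x^2")]
```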
/// Encodes the dimensions of labels into bytes.
#let encode-label-dimensions(styles, labels) = {
encode-int-array(
labels
.map(label => {
let dimensions = measure(label, styles)
(
length-to-int(dimensions.width),
length-to-int(dimensions.height),
)
})
.flatten()
)
}
/// Converts any relative length to an absolute length.
#let relative-to-absolute(value, styles, container-dimension) = {
if type(value) == relative {
let absolute-part = relative-to-absolute(value.length, styles, container-dimension)
let ratio-part = relative-to-absolute(value.ratio, styles, container-dimension)
return absolute-part + ratio-part
}
if type(value) == length {
return value.abs + value.em * measure(line(length: 1em), styles).width
}
if type(value) == ratio {
return value * container-dimension
}
panic("Expected relative length, found " + str(type(value)))
}
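// `relative-to-absolute` resolves a relative length into points by summing its
// absolute part, its em part scaled by the current font size, and its ratio
// part scaled by the container dimension. A hedged numeric model (hypothetical
// Python, all lengths in pt):

```python
def relative_to_absolute(abs_pt, em, ratio, font_size_pt, container_pt):
    # abs + em * (1em in pt) + ratio * container, as in the function above
    return abs_pt + em * font_size_pt + ratio * container_pt

# 10pt + 1em (with a 12pt font) + 50% of a 200pt container -> 122pt
assert relative_to_absolute(10, 1, 0.5, 12, 200) == 122
```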
/// Renders a graph with Graphviz.
#let render(
/// A string containing Dot code.
dot,
/// Nodes whose name appear in this dictionary will have their label
/// overridden with the corresponding content. Defaults to an empty
/// dictionary.
labels: (:),
/// The name of the engine to generate the graph with. Defaults to `"dot"`.
engine: "dot",
/// The width of the image to display. If set to `auto` (the default), will be
/// the width of the generated SVG or, if the height is set to a value, it
/// will be scaled to keep the aspect ratio.
width: auto,
/// The height of the image to display. If set to `auto` (the default), will
/// be the height of the generated SVG or if the width is set to a value, it
/// will be scaled to keep the aspect ratio.
height: auto,
/// Whether to hide parts of the graph that extend beyond its frame. Defaults
/// to `true`.
clip: true,
/// A color or gradient to fill the background with. If set to `none` (the
/// default), the background will be transparent.
background: none,
) = {
let manual-labels = labels.values()
let manual-label-names = labels.keys()
let manual-label-count = manual-labels.len()
let native-labels = get-labels(manual-label-names, dot)
let native-label-count = native-labels.len()
layout(((width: container-width, height: container-height)) => style(styles => {
let font-size = measure(line(length: 1em), styles).width
let output = plugin.render(
encode-int(length-to-int(font-size)),
bytes(dot),
encode-label-dimensions(styles, native-labels),
encode-label-dimensions(styles, manual-labels),
encode-string-array(manual-label-names),
bytes(engine),
)
if output.at(0) != 0 {
return {
show: highlight.with(fill: red)
set text(white)
raw(block: true, str(output))
}
}
let integer-size = output.at(1)
output = output.slice(2)
// Get native label coordinates.
let native-label-coordinates-size = 2 * native-label-count * integer-size
let native-label-coordinates = array-to-pairs(
group-bytes(output.slice(0, native-label-coordinates-size), integer-size)
.map(decode-int)
.map(int-to-length)
)
output = output.slice(native-label-coordinates-size)
// Get manual label coordinates.
let manual-label-coordinate-sets = ()
for manual-label-index in range(manual-label-count) {
let coordinate-set = ()
let use-count = decode-int(output.slice(0, integer-size))
output = output.slice(integer-size)
for i in range(use-count) {
coordinate-set.push(
(output.slice(0, integer-size), output.slice(integer-size, 2 * integer-size))
.map(decode-int)
.map(int-to-length)
)
output = output.slice(integer-size * 2)
}
manual-label-coordinate-sets.push(coordinate-set)
}
// Get SVG dimensions.
let svg-width = int-to-length(decode-int(output.slice(0, integer-size)))
  let svg-height = int-to-length(decode-int(output.slice(integer-size, integer-size * 2)))
output = output.slice(integer-size * 2)
let final-width = if width == auto {
svg-width
} else {
relative-to-absolute(width, styles, container-width)
}
let final-height = if height == auto {
svg-height
} else {
relative-to-absolute(height, styles, container-height)
}
if width == auto and height != auto {
let ratio = final-height / svg-height
final-width = svg-width * ratio
} else if width != auto and height == auto {
let ratio = final-width / svg-width
final-height = svg-height * ratio
}
set align(top + left)
// Rescale the final image to the desired size.
show: block.with(
width: final-width,
height: final-height,
clip: clip,
breakable: false,
)
show: scale.with(
origin: top + left,
x: final-width / svg-width * 100%,
y: final-height / svg-height * 100%,
)
// Construct the graph and its labels.
show: block.with(width: svg-width, height: svg-height, fill: background)
// Display SVG.
image.decode(
output,
format: "svg",
width: svg-width,
height: svg-height,
)
// Place native labels.
for (label, coordinates) in native-labels.zip(native-label-coordinates) {
let (x, y) = coordinates
let label-dimensions = measure(label, styles)
place(
top + left,
dx: x - label-dimensions.width / 2,
dy: final-height - y - label-dimensions.height / 2 - (final-height - svg-height),
label,
)
}
// Place manual labels.
for (label, coordinate-set) in manual-labels.zip(manual-label-coordinate-sets) {
let label-dimensions = measure(label, styles)
for (x, y) in coordinate-set {
place(
top + left,
dx: x - label-dimensions.width / 2,
dy: final-height - y - label-dimensions.height / 2 - (final-height - svg-height),
label,
)
}
}
}))
}
|
https://github.com/rdboyes/resume | https://raw.githubusercontent.com/rdboyes/resume/main/modules_zh/professional.typ | typst | // Imports
#import "@preview/brilliant-cv:2.0.2": cvSection, cvEntry
#let metadata = toml("../metadata.toml")
#let cvSection = cvSection.with(metadata: metadata)
#let cvEntry = cvEntry.with(metadata: metadata)
#cvSection("职业经历")
#cvEntry(
title: [数据科学主管],
society: [XYZ 公司],
logo: image("../src/logos/xyz_corp.png"),
date: [2020 - 现在],
location: [旧金山, CA],
description: list(
[领导数据科学家和分析师团队,开发和实施数据驱动的策略,开发预测模型和算法以支持组织内部的决策],
[与高级管理团队合作,确定商业机会并推动增长,实施数据治理、质量和安全的最佳实践],
),
)
#cvEntry(
title: [数据分析师],
society: [ABC 公司],
logo: image("../src/logos/abc_company.png"),
date: [2017 - 2020],
location: [纽约, NY],
description: list(
[使用 SQL 和 Python 分析大型数据集,与跨职能团队合作以识别商业洞见],
[使用 Tableau 创建数据可视化和仪表板,使用 AWS 开发和维护数据管道],
),
)
#cvEntry(
title: [数据分析实习生],
society: [PQR 公司],
logo: image("../src/logos/pqr_corp.png"),
date: [2017年夏季],
location: [芝加哥, IL],
description: list(
[协助使用 Python 和 Excel 进行数据清洗、处理和分析,参与团队会议并为项目规划和执行做出贡献],
[开发数据可视化和报告以向利益相关者传达洞见,与其他实习生和团队成员合作以按时并高质量完成项目],
),
)
|
|
https://github.com/tuto193/typst-uos-thesis | https://raw.githubusercontent.com/tuto193/typst-uos-thesis/main/template.typ | typst | MIT License | #import "languages.typ": dict
// https://github.com/zagoli/simple-typst-thesis/blob/main/template.typ
// Small fancy stuff for creating tables
#let fancy-align(col, row) = {
if row == 0 { center }
else if col == 0 {left + horizon}
else {right + horizon}
}
#let fancy-fill(col, row) = {
if row == 0 {gray}
else if calc.odd(row) {silver}
else {white}
}
#let fancy-stroke = 0.5pt + black
// If you want to cite with "et al." when there are more than 2 authors,
// use only for form:"prose" or form:"author". Otherwise just use normal citing
#let cite-et-al(label-string, form: "prose", supplement: "") = {
let label-c = if type(label-string) == "string" {
label(label-string)
} else {label-string}
if supplement != ""{
cite(label-c, form: form, style: "trends", supplement: [#supplement])
} else {
cite(label-c, form: form, style: "trends")
}
}
// This is just a shorthand for #cite(label("my_citation")...)
// Citing using strings ("my_citation") in case the citation key has some characters
// that mess the key recognition (such as: "+").
#let cite-string(label-string, supplement: "", form: "", style: "ieee") = {
let actual-form = if form != "" { form } else { "normal" }
if supplement == "" {
cite(label(label-string), form: actual-form, style: style)
} else {
cite(label(label-string), form: actual-form, style: style, supplement: supplement)
}
}
#let build-main-header(main-heading-content) = {
[
#align(center, smallcaps(main-heading-content))
#line(length: 100%)
]
}
#let build-secondary-header(main-heading-content, current-page, page-counter) = {
let page-string = str(page-counter)
if calc.even(current-page) {
[
#page-string #h(1fr) #main-heading-content
#line(length: 100%)
]
} else {
[
#main-heading-content #h(1fr) #page-string
#line(length: 100%)
]
}
}
#let build-footer-with-title(document-title) = {
[
#line(length: 100%)
#align(center, smallcaps(document-title))
]
}
#let build-empty() = {
[]
}
#let get-nice-date(lang: "en") = {
let today = datetime.today()
let nice-month = dict.at(lang).at("months").at(today.month() - 1)
// (
// "January",
// "February",
// "March",
// "April",
// "May",
// "June",
// "July",
// "August",
// "September",
// "October",
// "November",
// "December"
// ).at(today.month() - 1)
[
#nice-month #today.display("[year]")
]
}
// To know if the secondary heading appears after the main heading
#let is-after(secondary-heading, main-heading) = {
let secHeadPos = secondary-heading.location().position()
let mainHeadPos = main-heading.location().position()
if (secHeadPos.at("page") > mainHeadPos.at("page")) {
return true
}
if (secHeadPos.at("page") == mainHeadPos.at("page")) {
return secHeadPos.at("y") > mainHeadPos.at("y")
}
return false
}
#let print-proclamation(city, lang: "en") = {
set page(header: none, numbering: none)
heading(dict.at(lang).at("proclamation").at("title"), numbering: none)
[
#dict.at(lang).at("proclamation").at("contents")
// I hereby confirm that I wrote this thesis independently and that I have not made use of any other resources or means than those indicated.
]
v(1.0cm)
align(right)[
#line(length: 35%)
#city, #get-nice-date(lang: lang)
]
pagebreak(weak: true)
}
#let build-with-number(current-abs-page, page-counter, is-main-heading: false) = {
let number-string = str(page-counter)
if is-main-heading {
[
#align(center, number-string)
]
} else if calc.even(current-abs-page) {
[
#align(left, number-string)
]
} else {
[
#align(right, number-string)
]
}
}
#let get-footer(document-title) = {
locate(loc => {
// Find if there is a level 1 heading on the current page
let next-main-heading = query(selector(heading).after(loc), loc).find(headIt => {
headIt.location().page() == loc.page() and headIt.level == 1
})
if (next-main-heading != none) {
      return build-with-number(loc.page(), counter(page).at(loc).at(0))
}
// Find the last previous level 1 heading -- at this point surely there's one :-)
let last-main-heading = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level == 1
}).last()
if last-main-heading.location().page() == loc.page() {
return build-with-number(loc.page(), counter(page).at(loc).at(0), is-main-heading: true)
}
// Find if the last level > 1 heading in previous pages
let previous-secondary-heading-array = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level > 1
})
let last-secondary-heading = if (previous-secondary-heading-array.len() != 0) {
previous-secondary-heading-array.last()
} else {none}
// Find if the last secondary heading exists and if it's after the last main heading
if (last-secondary-heading != none and is-after(last-secondary-heading, last-main-heading)) {
return build-footer-with-title(document-title)
}
return build-footer-with-title(document-title)
})
}
#let get-header(double-sided) = {
// counter(footnote).update(0)
locate(loc => {
// Find if there is a level 1 heading on the current page
let next-main-heading = query(selector(heading).after(loc), loc).find(headIt => {
headIt.location().page() == loc.page() and headIt.level == 1
})
if (next-main-heading != none) {
return build-empty()
}
// Find the last previous level 1 heading -- at this point surely there's one :-)
let last-main-heading = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level == 1
}).last()
let lastMainIndex = counter(heading).at(loc).at(0)
// Find if the last level > 1 heading in previous pages
let previous-secondary-heading-array = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level > 1
})
let last-secondary-heading = if (previous-secondary-heading-array.len() != 0) {
previous-secondary-heading-array.last()
} else {none}
// Find if the last secondary heading exists and if it's after the last main heading
if (last-secondary-heading != none and is-after(last-secondary-heading, last-main-heading)) {
let headingText = if (calc.even(loc.page())) {
str(lastMainIndex) + ". " + last-main-heading.body
} else {
      numbering("1.1", ..counter(heading).at(last-secondary-heading.location())) + ". " + last-secondary-heading.body
}
// Always make the absolute-page an odd number, if there not printing
// double-sided, so the page-number always comes on the right side
let absolute-page = if not double-sided {1} else {loc.page()}
return build-secondary-header(headingText, absolute-page, counter(page).at(loc).at(0))
}
let heading-title = str(last-main-heading.body.text)
let headingText = if heading-title.find("Bibliography") != none {
"Bibliography"
} else {
str(lastMainIndex) + ". " + last-main-heading.body
}
// Always make the absolute-page an odd number, if there not printing
// double-sided, so the page-number always comes on the right side
let absolute-page = if not double-sided {1} else {loc.page()}
return build-secondary-header(headingText, absolute-page, counter(page).at(loc).at(0))
})
}
#let invisible-heading(level: 1, numbering: none, supplement: auto,
outlined: true, content) = {
// show heading.where(level: level): set text(size: 0em, color: red)
show heading.where(level: level): it => block[]
text(size: 0pt)[
#heading(level: level, numbering: numbering, supplement: supplement, outlined: outlined)[#content]
]
}
#let small-title(content, outlined: true) = {
align(center)[
// #show heading.where(level: 1): set text(size: 0.85em)
#show heading.where(level: 1): it => block[
#set text(size: 0.85em)
#it.body
]
#heading(
outlined: outlined,
numbering: none,
content
// text(0.85em,content),
)
#v(5mm)
]
}
#let GLS_PREFIX = "gls-auto-"
#let print-glossary(glossaries, name, bold: true) = {
let to_print = ()
for (key, value) in glossaries.at(name).pairs() {
// let (abbr, full) = value
let abbr = value.at(0)
let full = value.at(1)
to_print.push([#if bold [*#abbr*] else [#abbr] #label(GLS_PREFIX + key)])
to_print.push(full)
}
grid(
columns: 2,
gutter: 3mm,
..to_print
)
}
#let GLOSSARIES = state("glossaries", (:))
#let PRINTED_GLOSSARIES = state("printed_glossaries", ())
#let gls(name) = {
let contents = locate(loc => {
let glossaries = GLOSSARIES.at(loc)
for table in glossaries.values() {
if name in table.keys() {
if table.at(name).len() > 2 {
link(label(GLS_PREFIX + name))[#table.at(name).at(2)]
} else if name not in PRINTED_GLOSSARIES.at(loc) {
link(label(GLS_PREFIX + name))[#table.at(name).at(1) (#table.at(name).at(0))]
} else {
link(label(GLS_PREFIX + name))[#table.at(name).at(0)]
}
break
}
}
}
)
contents
PRINTED_GLOSSARIES.update(curr => {
if name not in curr {
curr.push(name)
}
curr
})
// [#glossaries]
}
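// The gls() rule above expands an abbreviation in full on first use and prints
// only the short form afterwards (unless the entry carries a fixed third
// display form). A hedged Python model of that first-use logic (hypothetical
// names, illustrative only):

```python
printed = set()  # mirrors the PRINTED_GLOSSARIES state
table = {"cpu": ("CPU", "central processing unit")}  # mirrors GLOSSARIES

def gls(name):
    abbr, full = table[name]
    if name in printed:
        return abbr                # later uses: short form only
    printed.add(name)
    return f"{full} ({abbr})"      # first use: "full (ABBR)"

assert gls("cpu") == "central processing unit (CPU)"
assert gls("cpu") == "CPU"
```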
#let todays-date = datetime.today()
#let city = ""
#let project(
title: "",
degree: "bachelor",
lang: "en",
abstract: "",
author: "",
registration-number: "",
email: "",
  institute: "Institute of Very Cool People",
city: "Osnabrück",
logo: "images/logo.png",
bibliography-path: "",
bibliography-style: "ieee",
first-supervisor: "Prof. 1",
second-supervisor: "Person 2",
glossaries: (abbreviation: (:),),
double-sided : true,
body
) = {
let margin = (bottom: 1.135in+0.4in, top: 1.125in+0.4in)
if double-sided {
margin.insert("outside", 1.0in)
margin.insert("inside", 1.3in)
} else {
margin.insert("left", 1.3in)
margin.insert("right", 1.0in)
}
set page(
paper: "a4",
// margin: (outside: 1.0in, inside: 1.3in, bottom:1.125in+0.4in, top: 1.125in + 0.4in),
margin: margin,
header-ascent: 0.4in,
footer-descent: 0.3in,
// binding: left
)
// city = city
// Set the document's basic properties.
set document(author: author, title: title)
// set text(font: "Linux Libertine", lang: "en")
set text(
size: 12pt,
// font: "Times_New_Roman",
font: "Linux Libertine",
// stretch: 120%,
lang: "en"
)
// Make sure that Figures' numbers are bold
// show figure.caption: it => [
// #strong(it.supplement) #strong(it.counter.display(it.numbering)) : #it.body
// // #strong(repr(it.fields().pairs()))
// ]
let fig-params = (
"placement",
"caption",
"kind",
"supplement",
"numbering",
"gap",
"outlined",
)
show figure.caption: emph
// Maximum show rule depth exceeded. Do something else for the moment
// show figure: it => {
// let args = for p in fig-params {
// let it = it.fields()
// if p in it { ((p): it.at(p)) }
// }
// let body = it.body
// if it.kind == image {
// args.supplement = dict.at(lang).at("fig")
// } else if it.kind == table {
// args.supplement = dict.at(land).at("tab")
// }
// figure(..args, body)
// }
// show figure.where(kind: image): set figure(supplement: dict.at(lang).at("fig"))
// show figure.where(kind: table): set figure(supplement: dict.at(lang).at("tab"))
// show figure.where(kind: image): set figure(supplement: dict.at(lang).at("fig"))
// show figure.caption: it => {
// [#it.supplement #it.counter #it.numbering #text(style: "normal", it.body)]
// }
// show.fi
show math.equation: set text(weight: 400)
// set math.equation(numbering: "(1.1)") // Currently not directly supported by typst
set math.equation(numbering: "(1)")
set heading(numbering: "1.1")
set par(justify: true)
// show heading.where(level: 1): set text(size: 24pt)
show heading.where(level: 2): set text(size: 18pt)
show heading.where(level: 3): set text(size: 14pt)
show outline.entry.where(level: 1): it => {
v(16pt, weak: true)
strong(it)
set outline(
fill: none
)
}
show outline.entry.where(level: 2): it => {
it
}
// show link: set text(fill: blue)
show ref: it => {
let eq = math.equation
let hd = heading
let el = it.element
if el != none {
if el.func() == eq {
// Override equation references.
link(el.label)[#numbering(
el.numbering,
..counter(eq).at(el.location())
)]
} else if el.func() == hd {
// headings
text(fill: green.darken(60%))[#it]
} else if el.func() == figure {
// figures
text(fill: blue.darken(60%))[#it]
} else if el.func() == table {
// table
text(fill: red.darken(70%))[#it]
} else if el.func() == ref {
// ref
text(fill: gray.darken(60%))[#it]
}
} else {
// Other references as usual.
// text(fill: gray.darken(60%))[#it]
it
}
}
show cite: set text(fill: gray.darken(60%))
// Make Raw/Code text nicely themed
// set raw(theme: "../tmThemes/gruvbox.tmTheme")
// set raw(syntaxes: "../tmThemes/GDScript.sublime-syntax")
// show raw.where(lang: "gdscript"): it =>{
// it.syntaxes:
// syntaxes = "../tmThemes/GDScript.sublime-syntax"
// }
// Make a nice fill for block-code
show raw.where(block: true): block.with(
// fill: rgb("#1d2433"),
fill: luma(220),
inset: 8pt,
radius: 5pt,
// text(fill: rgb("#a2aabc"), it)
)
show raw.where(block: false): box.with(
fill: luma(235),
inset: (x: 1pt, y: 0pt),
outset: (y: 3pt),
radius: 2pt,
)
// Title page.
// v(0.25fr)
// Logo
if logo != none {
v(0.25fr)
align(center, image(logo, width: 26%))
v(0.20fr)
} else {
v(0.45fr)
}
align(center)[
#text(1.2em, weight: 600, institute)
]
v(0.20fr)
align(center)[
#text(style: "italic")[#dict.at(lang).at("degree").at(degree)]
]
align(center)[
#text(1.8em, weight: 700, title)
]
// Author information.
align(center)[
#text(size: 14pt)[#author] \
#if registration-number != "" [#registration-number \ ]
#v(2cm)
// #datetime.today().display("[month] [year]")
#get-nice-date(lang: lang)
#v(2cm)
#table(columns: 2, stroke: none, align: left,
[#dict.at(lang).at("supervisors").at("first"):], [#first-supervisor],
[#dict.at(lang).at("supervisors").at("second"):], [#second-supervisor],
)
]
pagebreak()
if abstract != "" {
small-title([Abstract])
abstract
// v(1.618fr)
pagebreak()
}
show heading.where(level: 1): it => [
#set text(size: 24pt)
#v(1.5in)
#par(first-line-indent: 0pt)[#it.body]
#v(1.5cm)
]
// Table of contents.
heading(dict.at(lang).at("index"), numbering: none, outlined: false)
outline(
title: none,
depth: 3, indent: true,
fill: repeat(" . ")
)
pagebreak()
///*
/// Lists...
set page(numbering: "i", number-align: center)
heading(dict.at(lang).at("lists").at("figures"), numbering: none)
outline(
title: none,
depth: 3, indent: true,
target: figure.where(kind: image),
)
pagebreak()
heading(dict.at(lang).at("lists").at("tables"), numbering: none)
outline(
title: none,
depth: 3, indent: true,
target: figure.where(kind: table)
)
pagebreak()
//*/
GLOSSARIES.update(glossaries)
heading(
outlined: true,
numbering: none,
text(dict.at(lang).at("lists").at("glossary")),
)
print-glossary(glossaries, "abbreviation", bold: true)
pagebreak()
// heading(
// outlined: true,
// numbering: none,
// text("List of Symbols"),
// )
// print-glossary(glossaries, "symbol", bold: false)
// Main body.
set page(numbering: "1", number-align: bottom)
set par(first-line-indent: 20pt)
set page(header: get-header(double-sided), footer: get-footer(title))
counter(page).update(1)
// set gls(glossaries: glossaries)
show heading: set heading(supplement: [#dict.at(lang).at("sections").at("other")])
show heading.where(level: 1): set heading(supplement: [#dict.at(lang).at("sections").at("main")])
show heading.where(level: 1): it => [
// #pagebreak(weak: true)
#set text(size: 24pt)
#v(1.5in)
#block[
#if it.numbering != none [
#dict.at(lang).at("sections").at("main") #counter(heading).display()
#v(0.5cm)
]
#par(first-line-indent: 0pt)[#it.body]
]
#v(1.5cm, weak: true)
]
// show heading: it => [
// #if it.level > 1 [
// #set it.supplement(dict.at(lang).at(sections).at(other))
// ]
// ]
show heading.where(level: 2): it => [
#set text(size: 18pt)
#v(1cm, weak: true)
#block[
#if it.numbering != none [
#counter(heading).display()
]
#it.body
]
#v(1cm, weak: true)
]
show heading.where(level: 2): set text(size: 18pt)
show heading.where(level: 3): set text(size: 14pt)
///
set page(numbering: "1", number-align: center)
// set footnote(numbering: "*")
body
// Bibliography
pagebreak(weak: true)
bibliography(
bibliography-path,
title: [#dict.at(lang).at("bib")],
// style: "american-physics-society"
style: bibliography-style,
// style: "thieme"
)
pagebreak(weak: true)
// Proclamation
print-proclamation(city, lang: lang)
}
|
https://github.com/WinstonMDP/math | https://raw.githubusercontent.com/WinstonMDP/math/main/exers/l.typ | typst | #import "../cfg.typ": *
#show: cfg
$
"Prove that"
all("convergent series with positive members" sum_(n = 1)^oo a_n): \
A_n = sqrt(sum_(k = n)^oo a_k) - sqrt(sum_(k = n + 1)^oo a_k) ->
sum_(n = 1)^oo A_n "converges" and a_n =_(n -> oo) o(A_n)
$
$A_1 + A_2 + A_3 + ... + A_n =
sqrt(sum_(k = 1)^oo a_k) - sqrt(sum_(k = 2)^oo a_k) +
sqrt(sum_(k = 2)^oo a_k) - sqrt(sum_(k = 3)^oo a_k) +
sqrt(sum_(k = 3)^oo a_k) - sqrt(sum_(k = 4)^oo a_k) +
... =
sqrt(sum_(k = 1)^oo a_k) - sqrt(sum_(k = n + 1)^oo a_k) ->_(n -> oo)
sqrt(sum_(k = 1)^oo a_k)$
$A_n =
(sum_(k = n)^oo a_k - sum_(k = n + 1)^oo a_k)/
(sqrt(sum_(k = n)^oo a_k) + sqrt(sum_(k = n + 1)^oo a_k)) =
a_n/(sqrt(sum_(k = n)^oo a_k) + sqrt(sum_(k = n + 1)^oo a_k))$
$a_n/A_n =
sqrt(sum_(k = n)^oo a_k) + sqrt(sum_(k = n + 1)^oo a_k) ->_(n -> oo)
0$
$qed$
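As a numeric sanity check of the telescoping identity and of $a_n =_(n -> oo) o(A_n)$, one can take the concrete convergent series $a_k = 2^(-k)$, whose tails have the closed form $sum_(k = n)^oo 2^(-k) = 2^(1 - n)$ (illustrative Python, not part of the proof):

```python
import math

def tail(n):
    # closed-form tail of a_k = 2**-k: sum over k >= n equals 2**(1-n)
    return 2.0 ** (1 - n)

# A_n = sqrt(tail(n)) - sqrt(tail(n+1)), as in the exercise
A = {n: math.sqrt(tail(n)) - math.sqrt(tail(n + 1)) for n in range(1, 31)}

# Partial sums telescope: A_1 + ... + A_30 = sqrt(tail(1)) - sqrt(tail(31))
s = sum(A[n] for n in range(1, 31))
assert abs(s - (math.sqrt(tail(1)) - math.sqrt(tail(31)))) < 1e-12

# a_n = o(A_n): the ratio a_n/A_n = sqrt(tail(n)) + sqrt(tail(n+1)) shrinks
ratio = lambda n: (2.0 ** -n) / A[n]
assert ratio(25) < ratio(5) / 100
```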
|
|
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/046%20-%20Streets%20of%20New%20Capenna/004_What%20You%20Expect%20to%20See.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"What You Expect to See",
set_name: "Streets of New Capenna",
story_date: datetime(day: 29, month: 03, year: 2022),
author: "<NAME>",
doc
)
The sun glittered off the glass and steel buildings of New Capenna, and a pair of green and red birds chased each other through the air. Kamiz paused in the shadows while her apprentice stepped into the sun and gaped.
#figure(image("004_What You Expect to See/01.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"Look, Kamiz!" Queza pointed. "They're beautiful."
Kamiz sighed. "Erithacus rubecula. They're robins."
"They're artists. Even you can appreciate beauty, can't you?"
"I appreciate facts, kid."
The robins sang as Kamiz and Queza entered Tivit's fortune-teller shop, locking the door behind them. Floating crystal balls lit the room, illuminating the Obscura sigil—hand, keyhole, and dagger. Kamiz appreciated the display of loyalty.
"Ah, Kamiz: Raffine's right-hand cephalid." Tivit padded on heavy paws and folded their bright white wings. "And Queza! The right-hand's right hand!"
"How's business, Tivit?"
"Busy. Passing through the back?"
Kamiz nodded. "Keep the door locked." She pushed aside the hanging tapestries covering the entrance to the vent shaft underground. The tunnels let Kamiz pass through New Capenna covertly—how she preferred it. One of the perks of being Raffine's spymaster: minimal interaction with people. Let Queza be personable, Kamiz had no time for it.
"Can't we stay for tea, Kamiz? I'd love to have my leaves read."
"No."
"Tivit, look into the future. Any chance Kamiz ever smiles?"
"Outlook not so good, sorry."
Kamiz tapped her foot. "Queza."
"She's mad now," said Queza through stage whispers. "The marks around her eyes pulse blue-green when she's mad. I'm coming!"
"I see glowering and silent disapproval in your future, Queza," Tivit called. "I see~no, I see~"
#figure(image("004_What You Expect to See/02.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
Tivit threw their head back and screamed. The floating crystals shattered mid-air as blue smoke twisted through the room like a hurricane. Kamiz smelled metal in the air, a storm building in the corners. Darkness suffocated all light, including Kamiz and Queza's bioluminescent markings. Silence descended, heavy and taut.
Tivit's open wings crackled with electricity. Light blazed from their eyes and mouth, knocking the two cephalids back. A chorus roared from Tivit's throat. Kamiz cast a spherical lens into the room, to record. She'd spent too much time with Raffine not to recognize a prophecy.
"The kitten hiding the cream~Corvidae, flitting and pecking~Gold in the water~Hiding the cream, the kitten!"
The room shook, Kamiz held her ground. Halos formed in the swirling light. She tried to look for Queza, but the cyclone of power forced her eyes shut. The sound rose to a crescendo~then stopped.
Shattered glass and shredded books fell. Sunlight streamed through Tivit's modest windows, and the sphinx seer collapsed.
"Queza?" Kamiz blinked her watering eyes. Her apprentice lay sprawled beneath a bookcase, cradling her head. Kamiz cast the viewing spell as she rushed over. Bruises and scrapes lit up across Queza's body—none serious.
"I'm fine. Kamiz~that was~"
Kamiz called the spherical lens back to her hand. "I'm sending the recording to Raffine now."
Queza rubbed her head, filled a water carafe for Tivit and pushed them to rehydrate. "Kamiz, I've read every account of Raffine's visions, from the founding. This was exactly like them. This is big. Different."
Kamiz kept her face blank. There was an explanation. There was nothing she did not know, only things she did not know yet.
The reporting amulet against her throat vibrated: another seer reporting a vision. The amulet shook again. It buzzed and rattled against her skin, heating with the friction, as though the stone inside would crack. Kamiz fumbled the amulet in her palm and pressed it to display the reports.
Magic windows flashed blue before her, revealing fortune-tellers, soothsayers, and augurs from Park Heights. From the Mezzio, the Caldaia. All reporting the same thing.
"Halos~Halos~Kitten hiding the cream."
"The kitten hiding the cream. Gold~"
"In the water, gold. Corvidae~Halos, halos~"
"Halos everywhere~"
"The same vision?" Queza whispered. She gripped a milky-white talisman with a halo inside. A token from the superstitious angel-faithful?
"Put that away," Kamiz muttered.
Queza stared at the reports. "What does it mean?"
Kamiz's wrist summons-charm tightened.
"We'll know soon enough." Kamiz straightened her cuffs and walked to the back tapestries. Raffine was calling her to the Cloud Spire.
#figure(image("004_What You Expect to See/03.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The elevator lifted from the city depths, through girder streets, past dazzling buildings, and into the clouds. Kamiz sent Queza to the Interfactorium's map room to begin analysis. Queza was a daydreamer, but also the best Exactor in the Obscura. Once or twice, she'd caught details Kamiz had missed—no small feat.
The elevator doors opened like beetle wings.
Whispers swirled through Raffine's sanctum; New Capenna's secrets and schemes. The glass walls let the powerful sphinx demon overlook her city; the stained glass above depicted her rise to power, how she joined with the archdemons as the angels fell, and how they enhanced her visionary capabilities.
Ghostly images of reports, crimes, and movements floated through the room. The sphinx demon paced, scattering the whispers and reports into mist. Every time, Raffine's presence was a kick in the gills. She was older than memory, powerful and mysterious. With her infernal visions, Raffine knew everything.
Kamiz dropped to her knee.
Like a storm, Raffine unfurled her wings, banishing the swirling reports and images to the corners. Her voice filled the room like thunder.
"Report."
Kamiz made a point never to tell Raffine something unless she was sure. "You've seen Tivit's vision. Other seers in New Capenna saw it at the same time. We're mapping them to pinpoint a pattern. The common elements include—"
"Do not rehearse the contents of a vision to a seer. Kittens splashing in milk, gold in the water, halos, halos, I have seen it. I am not interested in the details of your craft. I want to know who sent this vision!"
Kamiz clenched her jaw to keep from flinching. Did she say sent? Raffine's visions came from the archdemons. Who else could've sent a vision of this magnitude?
"Someone attacked the visionaries of New Capenna. Unacceptable."
The implications hit Kamiz in her stomachs. She'd warded the Cloud Spire herself. If a spellcaster managed to reach every seer, including Raffine~it meant her wards had failed. Someone got to Raffine, and it was her fault.
Kamiz stood. "I will find your answer."
"Quickly." Raffine retreated into her whispers and shadows. "Do this personally, Kamiz. Feet on the pavement. My trust is in you, alone. For now."
Kamiz swallowed the lump in her throat. "Understood."
#figure(image("004_What You Expect to See/04.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The Interfactorium walls shone brighter than a Cabaretti party. Queza gathered information orbs in her palms and flicked them to the map. Two side searches gathered data in one corner, and she traced lines of light between disconnected areas. If Queza thought they were connected, it was worth investigating.
"Each dot is a seer who experienced one of those visions." Queza pointed to the map. "Every one on our payroll, and a good number who don't know they are. Same vision, same time. No exceptions."
"Add a dot to the Cloud Spire." Kamiz grimaced and kept her coat on. She'd be going to the city to investigate soon. Feet on the pavement. "Raffine gave me her report. Same vision, same time."
Queza snapped the report into place. "I'm researching the vision's images: kitten hiding cream, gold in the water. Corvidae, which I thought referred to crows, could be any number of birds: crows, magpies, jackdaws, ravens~"
"Cut that search," Kamiz said. "We're investigating who sent the vision, not the vision contents."
"~Sent?"
"We're treating this like an attack. Someone cast a spell or poisoned the proverbial visionary waters."
Queza pursed her lips and rubbed her talisman. "Kamiz, I'm telling you, this is exactly like Raffine's visions at the founding. The big ones! It's not—"
"Why do you have that?"
Queza looked at her talisman. "You don't believe in the angels?"
"What is there to believe or disbelieve? Angels are a historical fact. They existed. They helped build New Capenna. They left."
Queza shrugged. "Some have faith they will return. Don't you suspect there's more to things than what we can see, some higher power?"
"We are the Obscura spymasters, Queza. We question everything and believe nothing without proof. Give that to me."
Queza handed over the talisman.
"Raffine says it's an attack. I know you've studied New Capenna's founding, but Raffine was there. The people who didn't listen to her are dead. This is the direction we're going."
The markings along Queza's neck pulsed a deep violet. She nodded and turned to the map. "Understood."
Kamiz pushed on. "Did anything significant occur right before the vision hit?" She called the aven surveillance reports from the stack in the corner and limited the search to the hour prior. There. An explosion in the Mezzio.
Queza zoomed into the map. "An abandoned warehouse." She spread an arc of reports over her head, chose one, and cast the others to the side. "One person's been in the area this week." The enhanced picture revealed a leonin woman, orange tabby fur with pale yellow eyes. She wore a trench coat, a smart fedora, and a suspicious expression, like she was sniffing for a story. Lacey Lanine: reporter.
"She works for <NAME>, at the Capenna Herald."
"Sounds like the kitten hiding the cream, boss," Queza said. "Lacey and Denry have a secret scoop about something. Or they created their own scoop, to beat out the other papers."
Hmm. Kamiz knew reporters, she used to be one, before she found a job where she didn't have to talk to people. Reporters would do anything to sell papers. "Plausible theory, but—"
"A theory is no good without evidence. I know."
Good girl. "Look into the abandoned warehouse."
Queza minimized the maps and sanitized the room of secrets. "Call Oskar?"
Good choice. Kamiz liked Oskar—his trashcan treasure hunts and sewer salvages revealed a lot about people. Plus, he hung out with rats to avoid people. They were kindred souls. She cast the calling spell and brought Oskar into view, then pulled his image into the room. He blinked at the bare walls.
"How do you do this boss? It feels like I'm here."
"It looks like you are, too!" Queza passed her hand through Oskar's shoulder.
"Basic illusion. I suggest a few details, your mind fills in the rest. People see what they expect to see."
Queza and Oskar passed hands through each other again, and grinned. "New hat, Queza?"
"It is! I'm trying a new style before the Crescendo. Are you going?"
"To the Crescendo? Me?" He shuffled. "Hey, I found two more stained-glass halos for you. I'll send the locations."
"Great!"
Kamiz cleared her throat.
"What's up, boss?" Oskar wiped his nose on the back of his hand.
"Abandoned building exploded this morning. What do you know?" She pulled up the map.
"That building ain't abandoned, boss. Based on the trash, people've been inside for a week. And it didn't explode. It imploded."
She glanced at Queza. Feet on the pavement. Kamiz replaced her hat.
"I'm sending an Echo Reader team to view the implosion site. Queza and I will head there soon." She closed the illusion, then handed Queza her coat. "Why is Oskar finding halo-signs for you?"
Queza winced. "I'm an Exactor, Kamiz, you pay me to notice things."
Kamiz tilted her head. "Go on."
"Halo imagery's been popping up in the architecture. Without evidence of construction. I'm waiting to see if a pattern emerges. Oskar's my lookout."
Kamiz grunted. That is indeed what she paid Queza to do. "Very well. Question everything, follow your hunches—"
"And verify with evidence. I know, Kamiz."
"Good. Let's go find a reporter."
#figure(image("004_What You Expect to See/05.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
<NAME> tried not to jump when two Obscura agents appeared in his office.
"Yeah, I'll talk, I got nothing to hide." Denry ran his claws through a well-manicured mane. "Lacey Lanine? No, she didn't have work down there. Lacey does fluff pieces for the society pages. Nothing important."
The tips of Kamiz's tentacles prickled. If this was how Denry talked about Lacey, the reporter would be looking to prove herself with some big scoop. Motive to attack the seers?
"I'll show ya, here's her latest article. A preview of the Crescendo menu and entertainments. A line or two from <NAME> and <NAME>, see? Harmless. Anyway, she ain't been in today. Probably out with her lady friend all night."
"Lady friend?" Queza asked.
"One of those straightlaced Brokers. Lagrita or something."
"Lagrella?"
"Yeah, Lagrella!"
Kamiz and Queza rolled their eyes together. Lagrella the Magpie was a Broker lieutenant. If she was hanging around Lacey Lanine it was because she was manipulating her. A subtle push to control the Capenna Herald, most likely.
"Contact us when you see Lacey," Kamiz said. "If you don't, we'll know."
Denry swallowed hard. Kamiz and Queza left the building.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The Echo Readers stood in the warehouse rubble with their eyes closed, reading the spectral afterimages. Kamiz squinted through the dust hanging in the air.
"There's a four-person team working in darkness," the Readers intoned in unison. "Their faces are shadowed."
"Shadowed?" Queza asked.
"They're under a spell," Kamiz explained.
"They cast dozens of spells, aimed at a bottle in the warehouse center," the Readers continued. "Every spell fizzled, until this morning. Something stuck, and it devoured the room." The Echo Readers raised their heads, eyes still closed. "Everything. Building imploded, and then—" The Readers shielded their spectral eyes. "Shadow burst from the explosion."
Kamiz put her hands in her pockets. "Do you see <NAME>?"
The Echo Readers pointed to a northern side street. "There. On the first night, she entered the warehouse from that direction, handed them a bottle of Halo, and left."
"From the Vantoleone," Queza whispered. "Where she was writing an article about the Crescendo. Not out with a lady friend, eh?"
Kamiz nodded. Red herring. "Denry underestimated her. She's ambitious. Tried to make her own story. Stole Halo from the Cabaretti and arranged for experimentation. Now she'll publish her exposé about how all the seers in New Capenna were compromised."
"Easy fix," Queza said. "Give her name to a Ruiner. By morning, no paper will touch anything written by <NAME>."
Kamiz frowned. This would answer Raffine's question. She had the who, but not enough of the how and why. Halo did enhance spells. But enough to cover the entire city, even to the Cloud Spire? Something wasn't adding up.
She tapped into the Obscura network and summoned a mentalist. "Find <NAME>. I need to pick her brain."
Her reporting talisman shook. Oskar's face appeared. He pulled back to show the figure swaying beside him. Lacey.
"Someone beat you to it, boss. Found her sitting in a dumpster in a stupor. Mind wiped."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The mentalist dusted off her palms. "I can't get anything. Her mind is blank. Whoever did this was thorough."
Kamiz grunted. "Take her to the infirmary."
"You got it, boss." She guided Lacey into a car.
Queza and Oskar sifted through the trash Lacey left behind. Rats scurried from the dumpster, bringing offerings. The alley was familiar, they were just down the street from Tivit's shop. Kamiz and Queza might've passed by just this morning. Did it mean anything that Lacey was so close to Tivit? Unlikely. Meant more that she was so far away from the blast site.
"These are connected to Lacey." Oskar handed Kamiz a notepad, a business card, and a tight scroll. All three were blank.
#figure(image("004_What You Expect to See/06.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"Even her notes are scrubbed," Queza said.
Kamiz sighed. "Lacey might be a victim of her own plot. The Halo-infused spell put images into seers' minds and wiped her memory. She survived the implosion, but the magic hit her brain at point-blank range."
Good theory, it fit the evidence. Did the Echo Readers mention Lacey being nearby for the implosion?
Kamiz's head snapped up. Glass shattered nearby. Metal crashed against wood. Tivit's shop. Kamiz shoved the blank scraps into her pocket and sprinted. Queza caught up in a few steps, her coat's dagger-shaped fringe tinkled as she moved.
They pushed through screaming, panicked pedestrians to see flame burst from Tivit's shop. The balcony above the shop entrance buckled on one side, a breath away from crashing to the ground. The storefront's stained glass rained on the sidewalk. A crowd of looky-loos bunched up to watch the destruction, slowing Kamiz and Queza's rush to the front. Tivit shook in their entryway, terrified and stunned. The balcony overhead creaked and showered sparks.
"Tivit!" Queza leapt forward as the balcony broke from the wall. They would both be crushed! Kamiz projected her magic forward, gripping Tivit's thick fur and Queza's long coat, ripping them out from under the balcony. It crashed to the pavement. The fire spread, attracting more voyeurs.
Kamiz strained to see where she'd thrown the two. Tivit lifted their head, but Queza's coat lay flat and empty. She'd grabbed the coat and missed Queza. Kamiz stared at the flaming balcony. No movement. The crowd clogged the street, blocking her attempts to get closer. Light flashes signaled the arrival of the Brokers, ready to offer protection for a price.
Kamiz's markings flared blue-green. Queza.
Magic gathered in a blue storm around her hands. "Oskar!" she shouted, but he was already in motion, relaying orders to his rats. The rodents surged, feeding on his fear and fury. Kamiz's spell lifted them, enhanced them, splashed them down upon the crowd.
Illusion. The crowd saw giant flying rats and their minds filled in the rest. Everyone scattered, including the Broker's contract mages. Tivit, shaking off the shock, called water down to extinguish the flames. Kamiz rushed to the fallen balcony and flipped the charred wood to find her apprentice. Oskar sifted through the ash, cutting his fingers on broken glass. "No, no, no~" he muttered.
Nothing. No Queza. Kamiz's three hearts stopped inside her chest. She'd failed Raffine, and now her apprentice, her right hand, her—
"Oof, ouch."
Kamiz held her breath. A thin hand emerged from the gap between two buildings. A hand with bioluminescent markings.
"Queza!" Oskar jumped to his feet. Inch by inch, Queza poured herself out from the small gap. She shook herself and let her body pop back into shape.
"No bones," Queza cracked a smile. "Cephalids excel at getting in and out of tight spots. Oh!"
Kamiz pulled the young woman into a tight hug. A moment passed. Queza returned it. "Thought you were dead, kid," Kamiz muttered.
"Not dead, boss. Just squished."
Kamiz cleared her throat and let go. She straightened Queza's collar and looked her over. She was fine. Smart kid. Kept her wits about her. She never should've worried.
Oskar shuffled foot to foot, unaccustomed to talking with non-rodents. Queza tucked a wavy tentacle into place. "Alright, Oskar?"
"Your finger webs are singed," he said.
"Yours are bleeding," she answered.
He shrugged. "Glass."
"Yeah."
Kamiz rolled her eyes. Across the street, the exhausted Tivit talked with a man in a well-tailored, armored pinstripe suit. He waved a scroll in Tivit's face. A contract.
Broker.
Kamiz spread her webbed fingers and viewed the man through a window of magic. Contract magic circled the scroll like chains. Kamiz plucked mirrored glass from the debris and projected it between the agent and Tivit. The binding magic reflected to the agent and fizzled. He scowled at Kamiz and scurried away.
Queza gasped. "Quick thinking, boss."
"Mirrors expose the loopholes in contract magic. Then you can exploit them. The agent knew he was beat. Always keep a mirror handy. Tivit? Never sign a contract with a Broker. They're vultures. They see minor property damage and rope you into a protection racket."
Tivit ruffled their feathers, indignant. "I was confused for a moment."
"Who did this?"
"Riveteers," Tivit said. "After the vision this morning, I took a sleeping potion to calm my nerves. When I woke up, the new stained glass was above my door. They said it was scab work and smashed it. Ah, my shop!"
Queza rushed to Tivit's side. "The stained glass appeared? What did it look like?"
Tivit waved their paw, bringing up a picture. Queza gasped.
The stained glass was a halo.
"Queza," Kamiz started.
Queza sent report requests to all agents. "Pictures of the buildings where the visions occurred this morning."
Their talismans shook. Images floated around them like bubbles. One halo. Another. Halos, halos.
"Halos, everywhere," Queza whispered. "It's the vision, Kamiz."
"We're not looking into~"
Her excitement grew. "Raffine was right," she said. "Someone did send these visions. Not Lacey Lanine, or <NAME>. It's a message from the angels!"
Kamiz waited for the punchline.
It wasn't coming.
"Enough, Queza."
"But!"
"Enough!"
Oskar and Tivit froze. Queza set her jaw.
"Have I taught you nothing?" Her eyes narrowed. "You didn't take a moment to ask how or when these halos were crafted? Or consider interviewing the glaziers in the Caldaia? Your first thought is folktales? Your angel obsession makes you sloppy."
"What about you?" Queza blurted. "You don't even follow your own rules!"
"What?"
"Have you questioned everyone, Kamiz? Your objectivity falls away at certain altitudes. I've been collecting evidence. Are you examining your assumptions? Are you questioning Raffine?"
No sound but their pounding hearts.
Kamiz straightened her coat. "You're off this case."
Queza exhaled like she'd been punched. "What?"
"Give me your sigils and talismans. You're not ready for this. Report back to the Kallock Library for Exactor duties."
Queza glared, her markings flashed purple and violet. "Fine." She pulled the talisman from her neck, shucked the sigil bracelets from her wrists and slammed them into Kamiz's open palm. Kamiz turned away to let her know she was forgotten.
"Tivit, we'll send a crew to clean up. Oskar, good work today."
"Sure, boss," Oskar muttered.
Kamiz pretended not to see Queza in her peripheral vision, fists balled and jaw clenched. She'd been wrong about the girl, as she'd been wrong about the Cloud Spire's wards. Queza was not her problem anymore.
Time to report to Raffine.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Kamiz lifted her forehead from the elevator's glass wall and sighed. Blackbirds flew past in a tight vee. She spoke into her bracelet before the elevator doors opened. "Check it out and report back to me."
"Understood," the aven surveillance chief responded. She closed the link and took a cleansing breath. The doors opened. Raffine paced as before. For a moment she looked~smaller. Diminished. As she knelt, Kamiz inspected the sanctum's magnificent architecture. The walls, the carvings. The stained glass.
#figure(image("004_What You Expect to See/07.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"Stand and report."
"We—I found the vision catalyst. Spellcasters using Halo. It killed them and destroyed a building." Wiped a reporter's mind, she almost said, but she made a point never to tell Raffine something unless she was sure. She fingered the notepad and scroll in her pocket. "The Halo was stolen from the Cabaretti."
"The spellcasters?"
"Dead, evaporated. The only witness was a reporter, <NAME>. Her mind was wiped."
Kamiz waited for Raffine to call out her passive voice, but the sphinx demon ignored it. She seemed more relieved that there was any explanation.
Kamiz knelt again, arms spread. "I made a mistake, boss. I warded the Cloud Spire, but this spell got through somehow."
The report chimed in the rune by her ear. "Kamiz? I looped the Spire three times and compared it to prior photograms. Everything is the same. Looking into your second question now."
Kamiz sat back on her heels. Raffine stopped pacing.
"What?"
"I came here to tell you I made a mistake, boss. And I did."
Raffine shook her wings. "You're forgiven. Re-establish the wards, check them again. That will be all."
"That wasn't my mistake." The clues came into focus. "My mistake was assuming every seer saw the same vision. They didn't. Everyone saw a kitten hiding a bowl of cream. You said it was kittens splashing in milk."
Raffine halted her pacing. "Hiding, splashing, they are the same."
"You didn't see the same vision." Kamiz stood. "Because you didn't see any vision at all."
Raffine drew up like a storm. "You dare?"
"No halos, boss." Kamiz pointed at the stained glass around the room. "Not a single halo here in the Cloud Spire. Halos, halos everywhere, but none here. That's why you're scared. For the first time someone sent a vision, and they didn't send it to you."
Raffine's wings drooped.
"Boss?" The second report came to her ear. "You were right. She was in the dumpster all night. Nowhere near the implosion."
Kamiz looked Raffine in the eyes. "We gotta deal with this vision. Something big is coming, and we need to protect the family. I can't do that if you hide the truth from me." She turned to the elevator.
"Where are you going?" Raffine demanded.
"To get justice for an innocent reporter. <NAME> wasn't at the building when it imploded. The spell didn't wipe her memory."
"Where was she?"
Kamiz pulled the scroll from her pocket. "Out with her lady friend."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
<NAME> was a calm oasis in the heart of the chaotic city. Lagrella stood between orderly rows of palm-leaved bushes, admiring the winged architecture. She tapped a toe in the blue-white mist spilling from the fountain. Kamiz stepped from the shadows, alone, but ready.
"Congratulations, Spymaster," she said. "You found me."
The mist swirled underfoot. Lagrella the Magpie. Corvidae, flitting and pecking. "Are Brokers always this sloppy?"
"Pardon?" Lagrella said.
"A window gets scratched, and there's a Broker signing people up for protection. An entire building implodes in the heart of the Mezzio, and not a single Broker comes sniffing? You might as well have signed your name. Once I found your girlfriend, it was basically a confession."
"You mean Lacey?" Lagrella twisted the heavy rings on her fingers. "Not my girlfriend. If I could've robbed the Cabaretti myself, I would have, but the Cabaretti don't let anyone into the kitchen. Except harmless reporters for the society pages."
Kamiz felt the puzzle pieces fit into place. The kitten hiding the cream wasn't Lacey or Denry Klin. It was the Cabaretti. Lagrella's magic pooled around Kamiz's ankles, hiding in the mist. She stood still.
"Alas," Lagrella continued. "I only love two things. My fish tank, and the Prophecy."
Kamiz knew she decorated the tank with contract breakers, dipped in gold and silver. Gold in the water. "Prophecy?" The Magpie's magic danced along Kamiz's wrists, stinging her tentacles. She kept her face blank.
Lagrella pulled a scroll and fountain pen from her coat. "The Prophecy is locked in a vault below Nido Sanctuary. A demon foretold New Capenna's destruction. When the city's Halo supply is gone, we die. I don't mind telling you, soon you won't remember."
#figure(image("004_What You Expect to See/08.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"You believe this?"
"Raffine's right hand questions a demon's prophecy? I discovered something recently. The Cabaretti figured out how to make Halo. Imagine. I sent Lacey to get me a sample. You know the rest."
Magic tightened around Kamiz's wrists. "They didn't agree to die for you."
"You'd be surprised what people agree to when they're desperate." Lagrella's smile dropped. "For example, you're going to tell me the vision that's shaken the Obscura. It came the moment the building collapsed. I know you know what they saw, Kamiz. Tell me!"
Hmm. Lagrella didn't know the vision's contents. The images were real, not manufactured. Kamiz blinked. A flash of light caught her eye.
Light from a mirror.
Lagrella didn't see what Kamiz saw, but of course not. People see what they expect to see.
"Why care that the Cabaretti can make Halo?"
Lagrella gaped. "How do you not understand? There's an angel somewhere in New Capenna! Only angels can create Halo! I'm trying to save this city! I opened a door, and before those worthless wizards failed, a message got through. You will tell me what your fortune-tellers saw! Sign here."
Kamiz gasped. Lagrella's magic jerked her hands forward. A pen materialized in her fingers, poised above Lagrella's contract. Ink threatened to spill from the tip. The ties of magic pushed her hand toward the paper. "You're forcing me to sign a contract? I thought Brokers were bound by justice and ethics."
"I'll do anything to save New Capenna."
"No~"
"Yes. I'll add you to my collection, Kamiz. Silver, I think. Sign."
Sweat dripped under her collar. Lagrella boosted the spell, forcing her hand to the paper. Kamiz pulled her hand back, grunting, fighting to not sign away her mind. She fought against it, pulled back~!
Her hand fell through the paper.
"You get all that, Queza?"
A cheery voice piped behind her. "I did, boss. Doomsday Prophecy, fresh Halo, angels. All recorded and sent to the Cloud Spire."
Lagrella whirled about, searching for the hidden cephalid. "Kamiz is bound! She's signed my contract!"
"This contract?" Kamiz lifted the scroll and the pen. They dissolved into mist.
Lagrella fell back a step. "What?"
"This isn't real, Broker. Not the paper, or the pen. This isn't even Mergino Fountain. Look, the bushes are identical, it's not my best work. You saw what you wanted to see. Just like a magpie, distracted by shiny objects."
"My magic circles your neck, cephalid."
"It looks that way, doesn't it?" Queza stepped into the illusion. "My boss showed me a trick earlier. If you can see the loophole, you can exploit it." She lifted the mirror she'd brought. "Magically forcing someone to sign a contract leaves some big loopholes."
The magic ensnaring Kamiz fell away, an optical illusion. Lagrella looked from her hands to Kamiz.
"Shift your perspective," Kamiz said. "You're the one in the trap."
Kamiz's illusion fell at once. The fountain, plants, and architecture disappeared. They stood in a bare, windowless room. Queza stood before Lagrella, using her mirror to reflect the magic back at her, snaring her like a spiderweb. She cried out at her hand, holding the fountain pen, poised above her own deadly contract.
"Wipe your own mind with your signature or nullify Lacey's contract." She pulled the scroll from her pocket. "Your choice."
"Nullify a contract? Never!"
Kamiz pushed Lagrella's hand toward the paper. "It's amazing what people will agree to when they're desperate."
Lagrella bared her teeth, hissed, and kicked against her magical restraints. A furious scream ripped through her throat, then she slumped in defeat. "Fine," she seethed. Kamiz replaced the scroll with Lacey's contract, and Lagrella's signature cracked the paper in two.
"Nice doing business." Kamiz feigned a salute and signaled Queza to the door.
"Kamiz, you can't leave me here!" Lagrella called out. "I can't move!"
"Be glad you aren't dipped in gold, Magpie. It's your magic. Untangle it yourself."
Queza closed the door behind them.
"Lacey's waking up." Kamiz touched the rune in her ear. "Memory restored."
"Why'd you help Lacey?"
Kamiz shrugged. "Got a soft spot for hapless reporters. Used to be one." She paused. "I wasn't expecting you to back me up."
Queza rubbed her arm. "Oskar realized Lacey's scroll was a Broker contract. I connected the dots. Followed you here."
Kamiz cleared her throat. "You were right."
Queza held her breath.
"I checked. No halos on the Cloud Spire. Raffine didn't see the vision."
"You checked?"
"Question and verify. You were right." She stepped toward the shops, where they would find an entrance to the vent shaft underground. "Come on, kid. Time to report to our boss."
Queza tilted her head. "You fired me."
"Because I wasn't listening." She met Queza's eyes. "You were looking at facts. I should've trusted your analysis. You were right. And if someone in New Capenna is making their own Halo, and angels are returning—Raffine will need you to talk her through it."
She pulled Queza's talisman and sigils from her pockets. Queza took them reverently.
"This is also yours." Kamiz lifted the angel talisman. It twisted in the dusk, halo twirling within.
Queza quirked a smile. "Does this mean you believe the angels will return?"
Kamiz laughed. "Ridiculous. But maybe~maybe I don't know everything."
Queza grinned. "I'll take it."
A pair of blue feathered birds chased each other in a spiral around steel columns. Their birdsong echoed off the stained glass, piercing even the shadows of New Capenna.
"Kamiz?"
"Just appreciating the beauty," she said. "Let's go."
https://github.com/Enter-tainer/typstyle | https://raw.githubusercontent.com/Enter-tainer/typstyle/master/tests/assets/unit/destruction.typ | typst | Apache License 2.0
#let (n, ..) = layout-node(node, 0, ctx)
#grid(
columns: (1fr,) * calc.min(3, authors.len()),
gutter: 1em,
..authors.map(author => align(center, text(font: author-font, author))),
)
#let books = (
Shakespeare: "Hamlet",
Homer: ("The Odyssey", "The Iliad"),
Austen: "Persuasion",
)
#let (Austen,) = books
Austen wrote #Austen.
#let (Homer: h) = books
Homer wrote #h.
|
https://github.com/7sDream/fonts-and-layout-zhCN | https://raw.githubusercontent.com/7sDream/fonts-and-layout-zhCN/master/chapters/06-features-2/substitution/manjari.typ | typst | Other | #import "/lib/draw.typ": *
#import "/template/lang.typ": malayalam
#let start = (0, 0)
#let end = (900, 350)
#let graph = with-unit((ux, uy) => {
// mesh(start, end, (100, 100))
txt(malayalam[\u{0D15}\u{0D4D} + \u{0D38} = \u{0D15}\u{0D4D}\u{0D38}], (0, 350), anchor: "lt", size: 144 * ux)
txt(malayalam[\u{0D15}\u{0D4D} + \u{0D38} + \u{0D41} = \u{0D15}\u{0D4D}\u{0D38}\u{0D41}], (0, 160), anchor: "lt", size: 120 * ux)
})
#canvas(end, width: 80%, graph)
|
https://github.com/francescoo22/masters-thesis | https://raw.githubusercontent.com/francescoo22/masters-thesis/main/structure.typ | typst | // Frontmatter
#include "./preface/firstpage.typ"
#include "./preface/copyright.typ"
#include "./preface/abstract.typ"
#include "./preface/acknowledgements.typ"
#include "./preface/table-of-contents.typ"
// Mainmatter
#counter(page).update(1)
#set heading(numbering: "1.1", supplement: [Chapter])
// TODO: decide whether to have this or not
// #set par(first-line-indent: 1em)
#include "./chapters/1-Introduction.typ"
#include "./chapters/2-Background.typ"
#include "./chapters/3-Related-Work.typ"
#include "./chapters/4-Annotations-Kotlin.typ"
#include "./chapters/5-Annotation-System.typ"
#include "./chapters/6-Encoding.typ"
#include "./chapters/7-Conclusion.typ"
// Bibliography
#include("./appendix/bibliography/bibliography.typ")
// Appendix
#set heading(numbering: "A.1", supplement: "Appendix")
#counter(heading).update(0)
#include("./appendix/full-rules-set.typ") |
|
https://github.com/WinstonMDP/math | https://raw.githubusercontent.com/WinstonMDP/math/main/exers/4.typ | typst | #import "../cfg.typ": *
#show: cfg
$ "Prove that" all(x) in II: lim_(QQ in.rev r -> x) a^r = a^x $
$a^x = sup_(QQ in.rev r < x) a^r = inf_(QQ in.rev r > x) a^r$
$ex(r' < x): a^x - a^r' < epsilon$
$ex(r'' > x): a^r'' - a^x < epsilon$
$all(r in (r', r'')): abs(a^x - a^r) < epsilon$
|
|
https://github.com/ivhacks/resume-template | https://raw.githubusercontent.com/ivhacks/resume-template/main/resume.typ | typst | #let author = "<NAME>"
#let accent_color = rgb("#d43520")
#let text_size = 11.3pt
#let name_font = "P052"
#let main_font = "Lato"
#let bullet_baseline = 1.6pt
#let contact_info(linkedin, email, phone) = {
set text(size: 12pt)
// Personal Info
align(center, [
#h(1fr)
#box(baseline: 25%, image(height: 16pt, "svg/linkedin.svg"))
#link("https://www." + linkedin)[#linkedin]
#h(1fr)
#box(baseline: 27%, image(height: 16pt, "svg/phone.svg"))
#text([#phone])
#h(1fr)
#box(baseline: 29%, image(height: 16pt, "svg/email.svg"))
#link("mailto:" + email)[#email]
#h(1fr)
])
}
#let section_header(title, accent_color) = {
text(16pt, weight: "bold", fill: accent_color)[
#title
]
v(-13pt)
line(length: 100%, stroke: 2pt + accent_color)
}
// Sets document metadata
#set document(author: author, title: author)
// Disable ligatures so ATS systems do not get confused when parsing fonts.
#set text(lang: "en", ligatures: false)
#set page(paper: "us-letter", margin: (x: 1cm, y: 1cm))
// Link styles
#show link: underline
#set text(font: name_font)
#align(center, text(28pt, weight: "bold")[
#author
])
#set text(font: main_font, size: text_size)
#v(-25pt)
#set list(marker: [
#text(7pt, weight: "bold", fill: accent_color, baseline: bullet_baseline)[▪]
], body-indent: 0.4em)
#contact_info("linkedin.com/in/robberbob", "<EMAIL>", "346-420-1337")
// Work ********************************
#v(-12pt)
#section_header("Work Experience", accent_color)
#v(-7pt)
// First company
#text(15pt, weight: "bold")[
Plastroltech Inc.
]
#h(1fr)
#text(text_size + 1pt, baseline: -1pt)[
$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$
$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$$space$
$space$$space$$space$$space$
Arvada, CO
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
8/2024 $dash.en$ Present
]
#pad(
left: 0.4cm,
[
#v(-4pt)
#text(14pt, weight: "bold")[
Lead Principal Chief B2B SAAS Sales Specialist
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
8/2024 $dash.en$ Present
]
- Architected 46.9% product growth year over year by leveraging breakthrough snowball waterfall landfills
- Orchestrated migration to web3 cloud-based AI platform powered by Kubernetes
- Spearheaded synergistic cross-functional initiatives, leveraging dynamic paradigms to optimize holistic scalability and drive unprecedented stakeholder-centric innovation across the value stream ecosystem
],
)
// Second company
#text(15pt, weight: "bold")[
Doofenshmirtz Evil Incorporated
]
#h(1fr)
#text(text_size + 1pt, baseline: -1pt)[
Tri-state area
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
2021 $dash.en$ 2024
]
#pad(
left: 0.4cm,
[
#v(-4pt)
#text(14pt, weight: "bold")[
Minion
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
6/2023 $dash.en$ 8/2024
]
- Pioneered disruptive evil-innovation frameworks, aligning villainous strategic objectives with core doomsday deliverables to maximize nefarious ROI and optimize cross-functional henchman synergy.
- Led a cross-discipline team of minions to operationalize cutting-edge destruction tools, leveraging integrated malicious architectures to drive forward-thinking chaos at scale while ensuring sustained evil transformation.
- Architected next-gen evil initiatives by orchestrating a holistic, villain-centric roadmap, enabling dynamic disarray and iterative havoc delivery within the competitive landscape of global villainy.
#v(-4pt)
#text(14pt, weight: "bold")[
Chief Executive Officer of Unscheduled Asset Reallocation
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
Summers of 2022 & 2023
]
- Robbed banks
],
)
// Education ********************************
#v(-9pt)
#section_header("Education", accent_color)
#v(-7pt)
#text(15pt, weight: "bold")[
Tri-State State University
]
#h(1fr)
$space$$space$$space$$space$$space$$space$$space$$space$
#text(text_size + 1pt, baseline: -1pt)[
Hawaii
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
2019 $dash.en$ 2023
]
#v(-7pt)
Master of Science in Art
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
5/2023
]
#v(-7pt)
Bachelor of Arts in Science
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
12/2022
]
#pad(
left: 0.4cm,
[
#v(-4pt)
#text(14pt, weight: "bold")[
Founder & President: Heckathon Club
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
2019 $dash.en$ 2023
]
- 9001x heckathon winner
- Made competitors cry on a regular basis
],
)
// Projects ********************************
#v(-9pt)
#section_header("Projects", accent_color)
#v(-7pt)
#text(15pt, weight: "bold")[
Anti-speedbump car
]
#h(1fr)
#text(text_size + 1pt, style: "italic", baseline: -1pt)[
8/2023 $dash.en$ 9/2023
]
- Designed 48-inch active air suspension system
- Vehicle can cruise through parking lots and school zones at 120mph
// Skills ********************************
#v(-9pt)
#section_header("Skills", accent_color)
#v(-7pt)
Evil, Minecraft command block programming, having W rizz |
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/scrutinize/0.1.0/src/question.typ | typst | Apache License 2.0 | #let _label = <scrutinize-question>
#let _builtin_counter = counter
#let _metadata_to_dict(m) = (..m.value, location: m.location())
/// The question counter
///
/// Example:
///
/// ```typ
/// #show heading: it => [Question #question.counter.display()]
/// ```
///
/// -> counter
#let counter = _builtin_counter(_label)
/// Adds a question with its metadata, and renders it.
/// The questions can later be accessed using the other functions in this module.
///
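/// Example (here `points` is an arbitrary metadata field chosen for
/// illustration; any named arguments are stored as metadata):
///
/// ```typ
/// #q(points: 2)[What is the capital of France?]
/// ```
///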
/// - body (content): the content to be displayed for this question
/// - ..args (string): only named parameters: values to be added to the question's metadata
/// -> content
#let q(
body,
..args,
) = {
assert(args.pos().len() == 0)
[#metadata((body: body, ..args.named())) #_label]
body
}
/// Locates the most recently defined question;
/// within a @@q() call, that is the question _currently_ being defined.
///
/// If a function is provided as a parameter, the located question's metadata is used
/// to call it and content is returned.
/// If a location is provided instead, the question's metadata is located there and returned directly.
///
/// Example:
///
/// ```typ
/// #question.current(q => [This question is worth #q.points points.])
///
/// #locate(loc => {
/// let points = question.current(loc).points
/// // note that `points` is an integer, not a content!
/// let points-with-extra = points + 1
/// // but eventually, `locate()` will convert to content
/// [I may award up to #points-with-extra points for great answers!]
/// })
/// ```
///
/// - func-or-loc (function, location): either a function that receives metadata and returns content, or the location at which to locate the question
/// -> content, dictionary
#let current(func-or-loc) = {
let inner(loc) = {
let q = query(selector(_label).before(loc), loc).last()
_metadata_to_dict(q)
}
if type(func-or-loc) == function {
let func = func-or-loc
// find value, transform it into content
locate(loc => func(inner(loc)))
} else if type(func-or-loc) == location {
let loc = func-or-loc
// find value, return it
inner(loc)
} else {
panic("function or location expected")
}
}
/// Locates all questions in the document, which can then be used to create grading keys etc.
/// The array of question metadata is used to call the provided function.
///
/// If a function is provided as a parameter, the array of located questions' metadata is used
/// to call it and content is returned.
/// If a location is provided instead, it is used to retrieve the metadata and they are returned directly.
///
/// Example:
///
/// ```typ
/// #question.all(qs => [There are #qs.len() questions.])
///
/// #locate(loc => {
/// let qs = question.all(loc)
/// // note that `qs` is an array, not a content!
/// // but eventually, `locate()` will convert to content
/// [The first question is worth #qs.first().points points!]
/// })
/// ```
///
/// - func-or-loc (function, location): either a function that receives metadata and returns content, or the location at which to locate the question
/// -> content, array
#let all(func-or-loc) = {
let inner(loc) = {
let qs = query(_label, loc)
qs.map(_metadata_to_dict)
}
if type(func-or-loc) == function {
let func = func-or-loc
// find value, transform it into content
locate(loc => func(inner(loc)))
} else if type(func-or-loc) == location {
let loc = func-or-loc
// find value, return it
inner(loc)
} else {
panic("function or location expected")
}
}
|
https://github.com/thudep/award-cert-printer | https://raw.githubusercontent.com/thudep/award-cert-printer/master/template.typ | typst | #set page(paper: "a4",flipped: true, margin: (x: 0em,y:0em),background: image("bg.svg"))
#let tsinghua_purple=rgb(106,8,116)
#set text(tracking: 1.5pt)
#set underline(offset: 4pt)
#place(center,dy:12em, text(font: "Source Han Serif", lang: "zh", region: "cn",[*获奖证书*], size: 40pt,fill:tsinghua_purple))
#place(left,dy:19em,dx:14%, text(font: "Source Han Serif", lang: "zh", region: "cn",strong(underline[#include "name.typ"]), size: 28pt))
#place(left,dy:19em,dx:26%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*同学*], size: 24pt, fill:tsinghua_purple))
#place(center,dy:28em, text(font: "Source Han Serif", lang: "zh", region: "cn",strong(underline[#include "prize.typ"]), size: 28pt))
#place(center,dy:24em,dx:-20%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*你在*], size: 22pt,fill:tsinghua_purple))
#place(center,dy:24em, text(font: "Source Han Serif", lang: "zh", region: "cn",strong(underline[#include "contest.typ"]), size: 22pt))
#place(center,dy:24em,dx:21%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*中荣获*], size: 22pt,fill:tsinghua_purple))
#place(left,dy:32em,dx:14%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*特发此证,以资鼓励*], size: 22pt,fill:tsinghua_purple))
#place(left,dy:39em,dx:66%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*清华大学工程物理系*], size: 20pt,fill:tsinghua_purple))
#place(center,dy:35em,dx:-24%, image("dep.jpg",width: 9em))
#place(left,dy:47em,dx:14%, text(font: "Source Han Serif", lang: "zh", region: "cn",[*证书摘要*], size: 14pt,fill:tsinghua_purple))
// see https://github.com/typst/typst/issues/2196#issuecomment-1728135476
#let try_to_string(content) = {
if content.has("text") {
content.text
} else if content.has("children") {
content.children.map(try_to_string).join("")
} else if content.has("body") {
try_to_string(content.body)
} else if content == [ ] {
" "
}
}
#place(left,dy:47em,dx:22%, text(raw(try_to_string([#include "fingerprint.typ"])), size: 16pt, fill:tsinghua_purple))
|
|
https://github.com/YunkaiZhang233/a-level-further-maths-topic-questions-david-game | https://raw.githubusercontent.com/YunkaiZhang233/a-level-further-maths-topic-questions-david-game/main/core-pure.typ | typst | #import "template.typ": *
#import "shortcut.typ": *
#let title = "Core Pure Topic Questions"
#let author = "<NAME>"
#let course_id = "Further Mathematics"
#let instructor = "<NAME>"
#let school_name = "David Game College"
#let written_time = "Spring 2024"
#show: assignment_class.with(title, author, course_id, instructor, school_name, written_time)
#set enum(numbering: "a)")
#set heading(numbering: "1.1.")
#outline(indent: auto)
#pagebreak(weak: false)
#problem_counter.update(0)
#preface
#topic[Proofs]
#topic[Complex Numbers]
#topic[Matrices]
#topic[Further Algebra]
#topic[Further Calculus]
except volumes of revolution, #a2_all
#topic[Further Vectors]
#topic[Polar Coordinates]
#a2_all
#topic[Hyperbolic Functions]
#a2_all
#topic[Differential Equations]
#a2_all
|
|
https://github.com/soul667/typst | https://raw.githubusercontent.com/soul667/typst/main/PPT/typst-slides-fudan/themes/polylux/book/src/themes/gallery/simple.md | markdown | # Simple theme

This theme is rather unobtrusive and might still be considered bare-bones.
It uses a minimal amount of colour and lets you define your slides' content very
freely.
Use it via
```typ
#import "@preview/polylux:0.2.0": *
#import themes.simple: *
#show: simple-theme.with(...)
```
`simple` uses regular headings for sections.
Unless specified otherwise, first level headings create new sections.
The regular `#outline` is configured such that it lists the names of all sections.
Second level headings are supposed to be used as slide titles and introduce some
spacing below them.
Text is configured to have a base font size of 25 pt.
## Options for initialisation
`simple-theme` accepts the following optional keyword arguments:
- `aspect-ratio`: the aspect ratio of the slides, either `"16-9"` or `"4-3"`,
default is `"16-9"`
- `footer`: text to display in the footer of every slide, default is `[]`
- `background`: background colour, default is `white`
- `foreground`: text colour, default is `black`
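For instance, a presentation could be initialised as follows (the footer text
is just an illustration):

```typ
#import "@preview/polylux:0.2.0": *
#import themes.simple: *

#show: simple-theme.with(
  aspect-ratio: "16-9",
  footer: [My talk],
)
```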
## Slide functions
`simple` provides the following custom slide functions:
```typ
#centered-slide[
...
]
```
A slide where the content is positioned in the center of the slide.
Not suitable for content that exceeds one page.
---
```typ
#title-slide[
...
]
```
Same as `centered-slide` but makes heading of level 1 not outlined, so that the
presentation title does not show up in the outline.
Not suitable for content that exceeds one page.
---
```typ
#slide[
...
]
```
Decorates the provided content with a header containing the current section (if
any) and a footer containing some custom text and the slide number.
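A minimal slide could look like this (title and body are placeholders; the
second-level heading becomes the slide title, as described above):

```typ
#slide[
  == A slide title

  Some content.
]
```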
---
```typ
#focus-slide(foreground: ..., background: ...)[
...
]
```
Draw attention with this variant where the content is displayed centered and text
is enlarged.
Optionally accepts a foreground colour (default: `white`) and background color
(default: `aqua.darken(50%)`).
Not suitable for content that exceeds one page.
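For example (the colours shown are just the documented defaults):

```typ
#focus-slide(background: aqua.darken(50%), foreground: white)[
  *An important point!*
]
```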
## Example code
The image at the top is created by the following code:
```typ
#import "@preview/polylux:0.2.0": *
{{#include simple.typ:3:}}
```
|
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/math/attach-p1.typ | typst | Apache License 2.0 | // Test t and b attachments, part 1.
---
// Test basics, postscripts.
$f_x + t^b + V_1^2 + attach(A, t: alpha, b: beta)$
---
// Test basics, prescripts. Notably, the upper and lower prescripts' content need to be
// aligned on the right edge of their bounding boxes, not on the left as in postscripts.
$
attach(upright(O), bl: 8, tl: 16, br: 2, tr: 2-),
attach("Pb", bl: 82, tl: 207) + attach(upright(e), bl: -1, tl: 0) + macron(v)_e \
$
---
// A mixture of attachment positioning schemes.
$
attach(a, tl: u), attach(a, tr: v), attach(a, bl: x),
attach(a, br: y), limits(a)^t, limits(a)_b \
attach(a, tr: v, t: t),
attach(a, tr: v, br: y),
attach(a, br: y, b: b),
attach(limits(a), b: b, bl: x),
attach(a, tl: u, bl: x),
attach(limits(a), t: t, tl: u) \
attach(a, tl: u, tr: v),
attach(limits(a), t: t, br: y),
attach(limits(a), b: b, tr: v),
attach(a, bl: x, br: y),
attach(limits(a), b: b, tl: u),
attach(limits(a), t: t, bl: u),
limits(a)^t_b \
attach(a, tl: u, tr: v, bl: x, br: y),
attach(limits(a), t: t, bl: x, br: y, b: b),
attach(limits(a), t: t, tl: u, tr: v, b: b),
attach(limits(a), tl: u, bl: x, t: t, b: b),
attach(limits(a), t: t, b: b, tr: v, br: y),
attach(a, tl: u, t: t, tr: v, bl: x, b: b, br: y)
$
---
// Test function call after subscript.
$pi_1(Y), a_f(x), a^zeta (x), a^abs(b)_sqrt(c) \
a^subset.eq (x), a_(zeta(x)), pi_(1(Y)), a^(abs(b))_(sqrt(c))$
---
// Test associativity and scaling.
$ 1/(V^2^3^4^5),
1/attach(V, tl: attach(2, tl: attach(3, tl: attach(4, tl: 5)))),
attach(Omega,
tl: attach(2, tl: attach(3, tl: attach(4, tl: 5))),
tr: attach(2, tr: attach(3, tr: attach(4, tr: 5))),
bl: attach(2, bl: attach(3, bl: attach(4, bl: 5))),
br: attach(2, br: attach(3, br: attach(4, br: 5))),
)
$
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/grid-styling_03.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test inset.
#grid(
columns: (1fr,) * 3,
stroke: 2pt + rgb("333"),
inset: 5pt,
[A], [B], [C], [], [], [D \ E \ F \ \ \ G], [H],
)
#grid(
columns: 3,
inset: 10pt,
fill: blue,
[A], [B], [C]
)
#grid(
columns: 3,
inset: (y: 10pt),
[A], [B], [C]
)
#grid(
columns: 3,
inset: (left: 20pt, rest: 10pt),
stroke: 3pt + red,
[A], [B], [C]
)
#grid(
columns: 2,
inset: (
left: 20pt,
right: 5pt,
top: 10pt,
bottom: 3pt,
),
[A],
[B],
)
|
https://github.com/MatheSchool/typst-g-exam | https://raw.githubusercontent.com/MatheSchool/typst-g-exam/develop/examples/exam-mathematics-sugar.typ | typst | MIT License | #import "../src/lib.typ": *
#show: g-exam.with(
author: (
name: "<NAME>, <NAME>,",
email: "<EMAIL>",
watermark: "Teacher: Möbius",
),
school: (
name: "Sunrise Secondary School",
logo: read("./logo.png", encoding: none),
),
exam-info: (
academic-period: "Academic year 2023/2024",
academic-level: "1st Secondary Education",
academic-subject: "Mathematics",
number: "2nd Assessment 1st Exam",
content: "Radicals and fractions",
model: "Model A",
),
language: "en",
decimal-separator: ",",
// date: "November 21, 2023",
// show-student-data: "first-page",
show-student-data: "odd-pages",
// show-student-data: none,
show-grade-table: true,
question-points-position: right,
question-text-parameters: (size: 22pt, font:"OpenDyslexic"),
clarifications: (
[This test must be performed with a blue or black non-erasable pen.],
[Cheating, talking, getting up from the chair or disturbing the rest of the class can be reasons for withdrawal from the test, which will be valued with a zero.],
[All operations must appear, it is not enough to just indicate the result.],
)
)
=2? Calculate the following operations and simplify if possible:
==? $display(5/12 dot 9/15=)$
#v(1fr)
==? $display(10 dot 9/15=)$
#v(1fr)
==? $display(5/12 : 4/15=)$
#v(1fr)
==? $display(2 : 5/3 =)$
#v(1fr)
#pagebreak()
#g-question(points: 2)[Calculate the following operations and simplify if possible:]
==? $display(4/11+5/11-2/11=)$
#v(1fr)
#g-subquestion[$display(3+2/5=)$]
#v(1fr)
#g-subquestion[$display(7/12+2/9=)$]
#v(1fr)
#g-subquestion[$display(1-9/13=)$]
#v(1fr)
#pagebreak()
=2? Calculate the following operations and simplify if possible:
==? $display(3/5 - (1-7/10) = )$
#v(1fr)
==? $display((3-5/3) dot (2-7/5) =)$
#v(1fr)
#pagebreak()
#g-question(points: 2)[Sort the following fractions from highest to lowest:
\ \
#align(center, [$ 2/3 ; 3/8 ; 4/6 ; 1/2 $])
#v(1fr)
]
#g-question(points: 2)[In a garden we have 20 red, 10 white and 15 yellow rose bushes.]
#g-subquestion[What fraction does each color represent?]
#v(1fr)
#g-subquestion[If we have pruned the red rose bushes, what fraction do we have left to prune?]
#v(1fr)
=2? #lorem(30)
==? #lorem(35)
// #v(1fr)
==1? #lorem(130)
// #v(1fr)
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/bugs/math-realize_01.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
$ x^2 #hide[$(>= phi.alt) union y^2 0$] z^2 $
Hello #hide[there $x$]
and #hide[$ f(x) := x^2 $]
|
https://github.com/agarmu/typst-templates | https://raw.githubusercontent.com/agarmu/typst-templates/main/apa/template.typ | typst | MIT License | #let script-size-rel = 0.75;
#let footnote-size-rel = 0.933;
#let small-size-rel = 0.90;
#let apa-paper(
title: "Paper Title",
running-head: "RUNNING HEAD",
subtitle: "subtitle",
first-page-omits-running-head: true,
abstract: none,
keywords: (),
authors: (),
affiliation: [],
bibliography-file: none,
fontname: "Times New Roman",
normal-size: 12pt,
script-size: none,
footnote-size: none,
small-size: none,
paper-size: "us-letter",
margin-left: none,
margin-right: none,
margin-top: none,
margin-bottom: none,
body
) = {
// get font sizes!
  let scr-sz = if script-size == none { normal-size * script-size-rel } else { script-size };
let fn-sz = if footnote-size == none { normal-size * footnote-size-rel } else { footnote-size };
let sm-sz = if small-size == none { normal-size * small-size-rel } else { small-size };
// get margins!
let mg-l = if margin-left == none { 1in } else { margin-left };
let mg-r = if margin-right == none { 1in } else { margin-right };
let mg-t = if margin-top == none { 1in } else { margin-top };
let mg-b = if margin-bottom == none { 1in } else { margin-bottom };
// get author names
let names = authors.map(x => x.name);
let namestr = if names.len() == 2 { names.join(" and ") } else { names.join(", ", last: ", and ")}
let keywordstr = keywords.join(", ")
// set metadata
set document(title: title, author: names)
// set text
set text(size: normal-size, font: fontname)
// set page settings
set page(
paper: paper-size,
margin: (top: mg-t, bottom: mg-b, left: mg-l, right: mg-r),
header-ascent: normal-size,
header: locate(location => {
let pagenumber = counter(page).at(location).first()
grid(
columns: (1fr, 1em),
align(
left,
upper(
if running-head == none or (pagenumber == 1 and first-page-omits-running-head) { [] } else { running-head }
),
),
align(
right,
[#pagenumber]
)
)
})
)
// set heading settings
set heading(outlined: true, bookmarked: true)
show heading: it => {
if it.level == 1 {
set align(center)
[
#set par(first-line-indent: 0.0in, justify: true)
#text(size: normal-size, weight: "bold", it.body)
#linebreak()
]
} else if it.level == 2 {
[
#set par(first-line-indent: 0.0in, justify: true)
#text(size: normal-size, weight: "bold", it.body)
#linebreak()
]
} else if it.level == 3 {
[
#set par(first-line-indent: 0.0in, justify: true)
#text(size: normal-size, weight: "bold", style: "italic", it.body)
#linebreak()
]
} else if it.level == 4 {
[
#set par(first-line-indent: 0.0in, justify: true)
#text(size: normal-size, weight: "bold", style: "italic", underline(it.body))
]
} else {
[
#set par(first-line-indent: 0.0in, justify: true)
#text(size: normal-size, weight: "regular", style: "italic", underline(it.body))
]
}
}
// set list settings
set list(indent: (normal-size / 12pt) * 0.25in, body-indent: 0.5 * normal-size, spacing: 2em)
set enum(indent: (normal-size / 12pt) * 0.25in, body-indent: 0.5 * normal-size, spacing: 2em)
// set equation settings
show math.equation: set block(below: 0.67 * normal-size, above: 0.75 * normal-size)
show math.equation: set text(weight: 400)
// set bibliography style
set bibliography(style: "american-psychological-association", title: "References")
// set line spacing
set par(leading: 2em)
show heading: set block(above: 2em, below: 2em)
// title page
v(15 * normal-size, weak: false)
align(center,
{
text(size: normal-size, weight: "bold", title)
v(1 * normal-size, weak: false)
if subtitle == none { [] } else { text(size: normal-size, subtitle) }
v(3 * normal-size, weak: false)
text(size: normal-size, namestr) + [ \ ]
affiliation
}
)
pagebreak()
if abstract != none {
[
= Abstract
#set par(first-line-indent: 0.5in, justify: true)
#abstract
#if keywords.len() > 0 { emph("Keywords: ") + keywordstr}
]
pagebreak()
}
set par(first-line-indent: 0.5in, justify: true)
body
}
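
// A minimal usage sketch (values are illustrative; each `authors` entry needs
// a `name` field, since the template accesses `x.name` above):
//
// #show: apa-paper.with(
//   title: "An Example Paper",
//   running-head: "EXAMPLE",
//   authors: ((name: "A. Author"),),
//   affiliation: [Some University],
// )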
|
https://github.com/AxiomOfChoices/Typst | https://raw.githubusercontent.com/AxiomOfChoices/Typst/master/Courses/Math%2018_155%20-%20Differential%20Analysis%201/Assignments/Midterm.typ | typst | #import "/Templates/generic.typ": latex, header
#import "@preview/ctheorems:1.1.0": *
#import "/Templates/math.typ": *
#import "/Templates/assignment.typ": *
#show: doc => header(title: "Midterm", name: "<NAME>", doc)
#show: latex
#show: NumberingAfter
#show: thmrules
#let col(x, clr) = text(fill: clr)[$#x$]
#let pb() = {
pagebreak(weak: true)
}
#set page(numbering: "1")
#let bar(el) = $overline(#el)$
#set enum(numbering: "a)")
= Question
== Statement
+ Let $L := span(delta_x, x in RR^n)$ be the space of all finite linear combinations of $delta$-functions. Prove that $L$ is dense in $cal(D)' (RR^n)$.
+ Let $W := span(delta_x - delta_y, x, y in RR^n)$. Is $W$ dense in $cal(D)' (RR^n)$? Is $W$ dense in $ cal(E)' (RR^n)$.
== Solution
+ Using the Wikipedia page for test functions and distributions, we know that $C_c^infinity (RR^n)$ is reflexive, so it is equal to its own bidual. Now assume that $ov(L) != cal(D)'(RR^n)$ and pick some distribution $u$ which is not in $ov(L)$. We can use Hahn-Banach to create a continuous functional on $cal(D)' (RR^n)$ which is zero on $ov(L)$ and non-zero on $u$; by the biduality we thus have a function $f in C_c^infinity (RR^n)$ such that $pair(delta_x, f) = 0$ for all $delta_x$ and also $pair(u, f) != 0$. But this is clearly impossible, since $pair(delta_x, f) = f(x) = 0$ for all $x$ forces $f = 0$ and hence $pair(u, f) = 0$. So $ov(L) = cal(D)'(RR^n)$.
+ In fact, $W$ is dense in $cal(D)' (RR^n)$, we will show this by showing it is dense in $L$, which we know to be dense in $cal(D)' (RR^n)$. To see this let $delta_x$ be arbitrary and consider the sequence $delta_(x) - delta_((m,0,...,0)) in W$. For any function $phi in C_c^infinity (RR^n)$ we have that for large enough $m$, $(m,0,...,0)$ is outside the support of $phi$ so
$
pair(delta_x - delta_((m,0,...,0)), phi) -> pair(delta_x, phi)
$
  for all $phi$. However, $W$ is not dense in $cal(E)' (RR^n)$: consider the function $1 in C^infinity (RR^n)$; for any $u in W$ we have $pair(u, 1) = 0$ by definition, and this persists under limits. But, for example, any non-negative bump function $psi$ considered as an element of $cal(E)' (RR^n)$ through integration has $pair(psi, 1) = integral psi != 0$. Hence $W$ is not dense in $cal(E)' (RR^n)$.
= Question
== Statement
Let $u_1,...,u_n$ be non-zero homogeneous distributions in $cal(D)' (RR^n)$ and let the degree of $u_k$ be a real number $lambda_k$.
Assuming that $lambda_k != lambda_ell$ for $k != ell$, show that $u_1,...,u_n$ are linearly independent.
== Solution
We prove this by induction on $n$; clearly it is true for $n = 1$. Now for $n + 1$ let us assume that
$
sum_(k=1)^(n+1) c_k u_k (x) = 0
$
in the sense of distributions. Clearly $0$ is homogeneous of any degree so we also have
$
sum_(k=1)^(n+1) 2^(lambda_k) c_k u_k (x) =
sum_(k=1)^(n+1) c_k u_k (2 x) = 0
$
now consider the weighted sum
$
2^(lambda_(n+1))
(sum_(k=1)^(n+1) c_k u_k (x)) -
(sum_(k=1)^(n+1) 2^(lambda_k) c_k u_k (x))
&=
0
\
sum_(k=1)^(n+1) (2^(lambda_(n+1)) - 2^(lambda_k)) c_k u_k (x)
&=
0
\
sum_(k=1)^(n) (2^(lambda_(n+1)) - 2^(lambda_k)) c_k u_k (x)
&=
0
$
Now this is a linear combination of $u_1,...,u_n$, which by the inductive hypothesis are linearly independent, so the coefficients are all zero, namely
$
(2^(lambda_(n+1)) - 2^(lambda_k)) c_k = 0
$
for all $k$ with $1 <= k <= n$. Now because the $lambda_k$'s are distinct and $2^x$ is injective on $RR$, the brackets here never vanish, and so $c_k$ must vanish. We thus have that
$
c_(n+1) u_(n+1) (x) = 0
$
but since $u_(n+1)$ is non-zero we thus know that $c_(n+1)$ is zero and so $u_1,...,u_(n+1)$ are independent.
= Question
== Statement
Let $f$ be a measurable function on $RR$ satisfying $abs(f(x)) <= abs(x)^(-2024)$ for $x in RR backslash {0}$. Show that there is $u in cal(D)' (RR)$ such that $u = f$ on $(-infinity, 0) union (0, + infinity)$.
== Solution
Let $V$ be the subspace of $C_c^infinity (RR)$ defined by
$
V := { phi : phi^(k) (0) = 0, k <= 2024} = {phi : pair(delta_0^(k), phi) = 0, k <= 2024 }
$
which is clearly closed as an intersection of closed kernels.
For any $phi in V$ we have $phi = x^(2024) psi$ for some smooth compactly supported function $psi$, so we have
$
abs(integral f phi) = abs(integral f x^2024 psi) <= integral abs(psi)
$
also if $phi$ is supported within some compact set $K$ then so is $psi$ so we have
$
integral abs(psi) <= C_K norm(phi)_(C^(2025) (K))
$
for some constant $C_K$, hence the functional on $V$ defined by
$
pair(u, phi) = integral f phi
$
is a well defined continuous linear functional.
Now consider the linear map $g : C_c^infinity (RR) -> C_c^infinity (RR)$ defined by first fixing a bump function
$eta$ which is supported on $[-1,1]$ and identically 1 on $[-0.5,0.5]$, and then setting
$
g(phi) = phi - eta dot (sum_(i=0)^(2024) (phi^((i)) (0))/(i!) x^i)
$
one can easily check that the image of this map is contained in $V$ and that $g$ restricts to the identity on $V$. Since this map amounts to evaluating the $delta$ functional and its derivatives at $phi$, it is clearly continuous, so $g$ is a continuous linear projection onto $V$.
I now claim that the distribution $pair(u', phi) := pair(u, g(phi))$ is the distribution we are looking for. To see this, let $phi$ be compactly supported on $(-infinity, 0) union (0, infinity)$; then $supp phi seq K seq (-infinity, 0) union (0, infinity)$ for some compact $K$ bounded away from $0$, and so $phi (x) = 0$ for all $x in (-epsilon, epsilon)$ for sufficiently small $epsilon$.
We then just notice that for such a function, $phi^i (0) = 0$ for all $i in NN$ so $phi in V$ and thus $g(phi) = phi$. We thus have
$
pair(u', phi) = pair(u, g(phi)) = pair(u, phi) = integral f phi = pair(f, phi)
$
as was desired.
= Question
== Statement
Let $R_theta = mat(cos theta, - sin theta; sin theta, cos theta)$ denote the rotation in $RR^2$ with angle $theta$. For a distribution $u in cal(D)' (RR^2)$, define $u compose R_theta$ by $pair(u compose R_theta, phi) = pair(u, phi compose R_(-theta))$. The distribution $u$ is called radial if $u compose R_theta = u$ for all $theta$. Prove that any radial distribution $u in cal(D)' (RR^2)$ satisfies
$
x (partial u)/(partial y) - y (partial u)/(partial x) = 0.
$
== Solution
We have that for a radial distribution $u$
$
0
=
pair((u - u compose R_theta)/theta, phi)
=
pair(u, (phi - phi compose R_(-theta))/theta).
$
But now we have
$
(phi - phi compose R_(-theta))/theta
=
(phi(x,y) - phi(x cos theta + y sin theta, - x sin theta + y cos theta))/theta
$
so the limit as $theta -> 0$ is equal to, by definition,
$
- partial/(partial theta)|_(theta = 0) phi(x cos theta + y sin theta, - x sin theta + y cos theta).
$
But now by the chain rule this is equal to
$
- (partial_x phi)|_(x,y) (partial_theta|_(theta = 0) (x cos theta + y sin theta)) -
(partial_y phi)|_(x,y) (partial_theta|_(theta = 0) (-x sin theta + y cos theta))
= \
- (partial_x phi)|_(x,y) (y)
- (partial_y phi)|_(x,y) (-x)
$
hence we have
$
lim_(theta -> 0) (phi - phi compose R_(-theta))/theta = x (partial phi)/(partial y) - y (partial phi)/(partial x).
$
The difference quotients converge to this limit in $C_c^infinity$ (their supports all lie in one fixed compact set), so by continuity of $u$ we have, for every $phi$,
$
pair(u, x (partial phi)/(partial y) - y (partial phi)/(partial x)) = 0
$
and so moving the derivatives onto $u$ we get
$
0
= pair((partial (y u))/(partial x) - (partial (x u))/(partial y), phi)
= pair(y (partial u)/(partial x) - x (partial u)/(partial y), phi)
$
which is exactly what we wanted to show.
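As a sanity check of the statement itself (separate from the distributional argument), a classically radial smooth function $u(x, y) = f(x^2 + y^2)$ is visibly annihilated by $x partial_y - y partial_x$; a short sympy verification:

```python
# Sanity check: for a smooth radial function u(x, y) = f(x^2 + y^2),
# the vector field x*d/dy - y*d/dx annihilates u.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Function('f')
u = f(x**2 + y**2)  # generic smooth radial function

expr = x*sp.diff(u, y) - y*sp.diff(u, x)  # x u_y - y u_x
print(sp.simplify(expr))  # 0
```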
= Question
== Statement
Let $L = sum_(abs(alpha) <= N) a_alpha partial^alpha$ be a non-zero linear differential operator in $RR^n$ with constant coefficients.
Prove that if $u in cal(E)' (RR^n)$ satisfies $L u = 0$ in the space of distributions, then $u = 0$.
== Solution
We know by the definition of the distributional derivative that
$
pair(L u, phi) = pair(u, ov(L) phi)
$
where $ov(L)$ is defined by $sum_(abs(alpha) <= N) (-1)^(abs(alpha)) a_alpha partial^(ov(alpha))$ and $ov(alpha)$ is the multi-index $alpha$ reversed (here we regard $alpha$ as an ordered tuple of differentiation directions; since partial derivatives commute, the reversal is in fact immaterial).
Now take $phi = e^(i k x)$ as our test function (this is legitimate: $u in cal(E)' (RR^n)$ has compact support, so it extends to a continuous functional on all of $C^infinity (RR^n)$); we then have
$
pair(u, ov(L) e^(i k x)) = 0.
$
Now it is clear that $ov(L) e^(i k x)$ is equal to $p(k_1,...,k_n) e^(i k x)$ for some polynomial $p$, because each partial derivative $partial_(x_j)$ applied to $e^(i k x)$ just brings down a factor of $i k_j$.
We thus have
$
p(k_1,...,k_n) pair(u, e^(i k x))
= pair(u, p(k_1,...,k_n) e^(i k x))
= 0
$
for all $k = (k_1,...,k_n) in RR^n$. Now one can check by inspection that
$
p(k_1,...,k_n) = sum_(abs(alpha) <= N) (-i)^(abs(alpha)) a_(alpha) k^(ov(alpha))
$
where the monomials $k^alpha$ are defined by
$
k^(alpha) = product_(i=1)^abs(alpha) k_(alpha_i)
$
Now from this representation, we see that because $L$ is a non-zero operator, $p$ is a non-zero polynomial, thus we have that $p(k_1,...,k_n) != 0$ on an open dense set.
But now we see that $pair(u, e^(i k x)) = 0$ for $k$ in an open dense set. This quantity is continuous in $k$, because $e^(i k_m x) -> e^(i k x)$ in $C^infinity$ on compact sets whenever $k_m -> k$, and $u$ is compactly supported. We thus know that $pair(u, e^(i k x)) = 0$ for every $k$. But this says precisely that the Fourier transform of the compactly supported distribution $u$ vanishes identically, and hence $u = 0$ by Fourier inversion.
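As an aside, the computational fact used above, that a constant-coefficient operator acts on $e^(i k x)$ by multiplication by a polynomial $p(k)$, is easy to verify symbolically; the operator below is an arbitrary illustrative choice, not the $L$ of the problem:

```python
# Sanity check: applying a constant-coefficient differential operator to
# e^{i k . x} returns p(k) * e^{i k . x} for a polynomial p, since each
# derivative brings down a factor of i*k_j.  Sample operator: d_xx + 2 d_y + 3.
import sympy as sp

x, y, k1, k2 = sp.symbols('x y k1 k2')
phi = sp.exp(sp.I*(k1*x + k2*y))

L_phi = sp.diff(phi, x, 2) + 2*sp.diff(phi, y) + 3*phi
p = sp.simplify(L_phi / phi)  # the symbol polynomial p(k1, k2)
print(sp.expand(p))           # a polynomial in k1, k2 (no x, y dependence)
```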
https://github.com/chamik/gympl-skripta https://raw.githubusercontent.com/chamik/gympl-skripta/main/cj-dila/18-lovci-hlav.typ typst Creative Commons Attribution Share Alike 4.0 International

#import "/helper.typ": dilo
#dilo("Lovci hlav", "lovci", "<NAME>", "<NAME>", "contemporary", "Norway", "2011", "epic", "Nordic crime fiction")
#columns(2, gutter: 1em)[
*Theme*\
Saving one's own life in the face of a corporate conspiracy.
*Motifs* -- murder, infidelity, theft, revenge
*Setting* -- the present day, over the course of a few days, Norway (Oslo and its surroundings)
*Characters* \
_<NAME>_ -- a headhunter, a short-statured egomaniac with a flamboyant lifestyle \
_<NAME>_ -- his wife, runs an art gallery, longs for a child \
_Class Greve_ -- a former elite soldier hired to take over as director of a rival company that makes GPS trackers \
_<NAME>_ -- helps Roger with the burglaries, works at a security agency \
_Lotte_ -- _Roger_'s mistress
*Composition* -- a prologue and an epilogue, five parts
*Narrator* -- first person, from the protagonist's point of view; he conceals certain facts
*Language*\
mostly standard, with vulgarisms, colloquial speech, Norwegian names, and professional HR terms taken from English
*Plot*\
_<NAME>_ is a headhunter -- he places the best possible people into executive positions at companies. He is also an art thief, and he picks his victims from among the people he recommends. His latest candidate is _Class Greve_, whom he wants to recommend as head of a GPS company (_Class_ is the former head of another technology firm, which makes him the perfect candidate). _C_ owns the painting Lov kance (<NAME>), which is very valuable, and _R_ cannot resist. With the help of his friend _Ove_, _R_ breaks into _Class_'s house and steals the painting. Elated, he calls his wife _Diana_, only to discover her phone ringing under _Class_'s bed. He senses the infidelity, and because of it decides not to recommend _Class_. He tells his wife, who informs _Class_. _Class_ tries to sedate _Roger_, but drugs _Ove_ instead. _R_ thinks _Ove_ has died and throws him into a river, only to pull him right back out; he is convinced _Ove_ has been poisoned, yet refuses to call him an ambulance, and in a struggle at _Ove_'s home he shoots him. _Lotte_ has put GPS trackers in his hair (she is _Class_'s former mistress), so when he tries to hide in the countryside, _C_ tracks him down. He has to hide in an outhouse (where he finds out that _C_ has no testicles), kills _C_'s dog, and tries to escape on a combine harvester. At the hospital _C_, disguised as a doctor, nearly gets him, but the police arrest _Roger_ first. _C_ rams their car with a truck; everyone except _R_ dies. _R_ cuts off his hair, kills _Lotte_, and arranges with _Diana_ to swap three of _Class_'s cartridges for blanks. Posing as a police officer, he collects the hair from the morgue and lures _C_ to _Ove_'s home. A shootout follows, which _R_ wins (and thanks to dead _O_ at the scene, he is cleared of the blame). _R_ has a child with _Diana_.
*Literary-historical context*\
The popularity of Nordic novels in recent years; contemporary society. Poe (@havran[]) is the founder of the detective story.
]
#pagebreak()
*Excerpt*
"And your wife's name is…" I leafed through his papers and put on the irritated expression that signals to candidates that I expect them to take the initiative.
"Camilla. We have been married for ten years. We have two children. They go to school."
"And how would you describe your marriage?" I asked without looking up. I gave him two long seconds, and when even then he could not come up with an answer, I continued: "Do you think you are still married, even though for the past six years you have spent two thirds of your waking life at work?"
I looked up. The dismay in his eyes was expected. I am inconsistent. A balanced life. Commitment. It does not add up.
It took four seconds before he answered. Which is at least a second longer than it should be.
"I hope so."
A confident, practiced smile. But not practiced well enough. Not for me. He had turned my own words against me, and I would have noted that as a plus if it had been deliberate irony. In this case, unfortunately, he had unwittingly imitated the words of a man he considered his superior. POOR SELF-IMAGE, I noted. Moreover, he "hoped"; he did not know it, did not voice his visions, did not read from the crystal ball, did not grasp that the basic requirement for any manager is the ability to pass for a clairvoyant.
NOT AN IMPROVISER. NOT A CRISIS MANAGER.
"Does your wife work?"
"Yes. At a law firm in the city centre."
"Every day from nine to four?"
"Yes."
#pagebreak()
https://github.com/Jollywatt/typst-fletcher https://raw.githubusercontent.com/Jollywatt/typst-fletcher/master/src/shapes.typ typst MIT License

#import "deps.typ": cetz
#import cetz: draw, vector
/// The standard rectangle node shape.
///
/// A string `"rect"` or the element function `rect` given to
/// #the-param[node][shape] are interpreted as this shape.
///
/// #diagram(
/// node-stroke: green,
/// node-fill: green.lighten(90%),
/// node((0,0), `rect`, shape: fletcher.shapes.rect)
/// )
///
#let rect(node, extrude) = {
let r = node.corner-radius
let (w, h) = node.size.map(i => i/2 + extrude)
draw.rect(
(-w, -h), (+w, +h),
radius: if r != none { r + extrude },
)
}
/// The standard circle node shape.
///
/// A string `"circle"` or the element function `circle` given to
/// #the-param[node][shape] are interpreted as this shape.
///
/// #diagram(
/// node-stroke: red,
/// node-fill: red.lighten(90%),
/// node((0,0), `circle`, shape: fletcher.shapes.circle)
/// )
///
#let circle(node, extrude) = draw.circle((0, 0), radius: node.radius + extrude)
/// An elliptical node shape.
///
/// #diagram(
/// node-stroke: orange,
/// node-fill: orange.lighten(90%),
/// node((0,0), `ellipse`, shape: fletcher.shapes.ellipse)
/// )
///
/// - scale (number): Scale factor for ellipse radii.
#let ellipse(node, extrude, scale: 1) = {
draw.circle(
(0, 0),
radius: vector.scale(node.size, 0.5).map(x => x*scale + extrude),
)
}
/// A capsule node shape.
///
/// #diagram(
/// node-stroke: teal,
/// node-fill: teal.lighten(90%),
/// node((0,0), `pill`, shape: fletcher.shapes.pill)
/// )
///
#let pill(node, extrude) = {
let size = node.size.map(i => i + 2*extrude)
draw.rect(
vector.scale(size, -0.5),
vector.scale(size, +0.5),
radius: calc.min(..size)/2,
)
}
/// A slanted rectangle node shape.
///
/// #diagram(
/// node-stroke: olive,
/// node-fill: olive.lighten(90%),
/// node((0,0), `parallelogram`, shape: fletcher.shapes.parallelogram)
/// )
///
/// - angle (angle): Angle of the slant, `0deg` is a rectangle. Don't set to
/// `90deg` unless you want your document to be larger than the solar system.
///
/// - fit (number): Adjusts how comfortably the parallelogram fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.parallelogram.with(fit: fit, angle: 35deg)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: olive,
/// fill: olive.lighten(90%),
/// ))
/// h(5mm)
/// }
#let parallelogram(node, extrude, flip: false, angle: 20deg, fit: 0.8) = {
let (w, h) = node.size
if flip { (w, h) = (h, w) }
let (x, y) = (w/2 + extrude*calc.cos(angle), h/2 + extrude)
let δ = h/2*calc.tan(angle)
let μ = extrude*calc.tan(angle)
x += δ*fit
let verts = (
(-x - μ, -y),
(+x - δ, -y),
(+x + μ, +y),
(-x + δ, +y),
)
if flip { verts = verts.map(((i, j)) => (j, i)) }
draw.line(..verts, close: true)
}
/// An isosceles trapezium node shape.
///
/// #diagram(
/// node-stroke: green,
/// node-fill: green.lighten(90%),
/// node((0,0), `trapezium`, shape: fletcher.shapes.trapezium)
/// )
///
/// - angle (angle): Angle of the slant, `0deg` is a rectangle. Don't set to
/// `90deg` unless you want your document to be larger than the solar system.
///
/// - fit (number): Adjusts how comfortably the trapezium fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.trapezium.with(fit: fit, angle: 35deg)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: green,
/// fill: green.lighten(90%),
/// ))
/// h(5mm)
/// }
///
/// - dir (top, bottom, left, right): The side the shorter parallel edge is on.
#let trapezium(node, extrude, dir: top, angle: 20deg, fit: 0.8) = {
assert(dir in (top, bottom, left, right))
let flip = dir in (right, left) // flip along diagonal line x = y
let rotate = dir in (bottom, left) // rotate 180deg
let (w, h) = node.size
if flip { (w, h) = (h, w) }
let (x, y) = (w/2 + extrude*calc.cos(angle), h/2 + extrude)
let δ = h/2*calc.tan(angle)
let μ = extrude*calc.tan(angle)
x += δ*fit
let verts = (
(-x - μ, -y),
(+x + μ, -y),
(+x - δ, +y),
(-x + δ, +y),
)
if flip { verts = verts.map(((i, j)) => (j, i)) }
if rotate { verts = verts.map(((i, j)) => (-i, -j)) }
draw.line(..verts, close: true)
}
/// A rhombus node shape.
///
/// #diagram(
/// node-stroke: purple,
/// node-fill: purple.lighten(90%),
/// node((0,0), `diamond`, shape: fletcher.shapes.diamond)
/// )
///
/// - fit (number): Adjusts how comfortably the diamond fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.diamond.with(fit: fit)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: purple,
/// fill: purple.lighten(90%),
/// ))
/// h(5mm)
/// }
#let diamond(node, extrude, fit: 0.5) = {
let (w, h) = node.size
let φ = calc.atan2(w/1pt, h/1pt)
let x = w/2*(1 + fit) + extrude/calc.sin(φ)
let y = h/2*(1 + fit) + extrude/calc.cos(φ)
draw.line(
(-x, 0pt),
(0pt, -y),
(+x, 0pt),
(0pt, +y),
close: true,
)
}
/// An isosceles triangle node shape.
///
/// One of #param[triangle][angle] or #param[triangle][aspect] may be given, but
/// not both. The triangle's base coincides with the label's base and widens to
/// enclose the label; see https://www.desmos.com/calculator/i4i9svunj4.
///
/// #diagram(
/// node-stroke: fuchsia,
/// node-fill: fuchsia.lighten(90%),
/// node((0,0), `triangle`, shape: fletcher.shapes.triangle)
/// )
///
/// - dir (top, bottom, left, right): Direction the triangle points.
/// - aspect (number, auto): Aspect ratio of triangle, or the ratio of its base
/// to its height.
/// - angle (angle, auto): Angle of the triangle opposite the base.
/// - fit (number): Adjusts how comfortably the triangle fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.triangle.with(fit: fit, angle: 120deg)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: fuchsia,
/// fill: fuchsia.lighten(90%),
/// ))
/// h(5mm)
/// }
#let triangle(node, extrude, dir: top, angle: auto, aspect: auto, fit: 0.8) = {
assert(dir in (top, bottom, left, right))
let flip = dir in (right, left) // flip along diagonal line x = y
let rotate = dir in (bottom, left) // rotate 180deg
let (w, h) = node.size
if flip { (w, h) = (h, w) }
if angle == auto and aspect == auto { aspect = w/h }
if angle == auto { angle = 2*calc.atan(aspect/2) }
if aspect == auto { aspect = 2*calc.tan(angle/2) }
let a = aspect*h/2 + fit*w/2
let b = (a + fit*w/2)/aspect
a += extrude*calc.tan(45deg + angle/4)
b += extrude/calc.cos(90deg - angle/2)
let verts = (
(-a, -h/2 - extrude),
(+a, -h/2 - extrude),
(0, +b),
)
if flip { verts = verts.map(((i, j)) => (j, i)) }
if rotate { verts = verts.map(((i, j)) => (-i, -j)) }
draw.line(..verts, close: true)
}
/// A pentagonal house-like node shape.
///
/// #diagram(
/// node-stroke: eastern,
/// node-fill: eastern.lighten(90%),
/// node((0,0), `house`, shape: fletcher.shapes.house)
/// )
///
/// - dir (top, bottom, left, right): Direction of the roof of the house.
/// - angle (angle): The slant of the roof. A plain rectangle is `0deg`, and
/// `90deg` is a skyscraper stretching past Pluto.
#let house(node, extrude, dir: top, angle: 10deg) = {
let flip = dir in (right, left) // flip along diagonal line x = y
let rotate = dir in (bottom, left) // rotate 180deg
let (w, h) = node.size
if flip { (w, h) = (h, w) }
let (x, y) = (w/2 + extrude, h/2 + extrude)
let a = h/2 + extrude*calc.tan(45deg - angle/2)
let b = h/2 + w/2*calc.tan(angle) + extrude/calc.cos(angle)
let verts = (
(-x, -y),
(-x, a),
(0pt, b),
(+x, a),
(+x, -y),
)
if flip { verts = verts.map(((i, j)) => (j, i)) }
if rotate { verts = verts.map(((i, j)) => (-i, -j)) }
draw.line(..verts, close: true)
}
/// A chevron node shape.
///
/// #diagram(
/// node-stroke: yellow,
/// node-fill: yellow.lighten(90%),
/// node((0,0), `chevron`, shape: fletcher.shapes.chevron)
/// )
///
/// - dir (top, bottom, left, right): Direction the chevron points.
/// - angle (angle): The slant of the arrow. A plain rectangle is `0deg`.
/// - fit (number): Adjusts how comfortably the chevron fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.chevron.with(fit: fit)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: yellow,
/// fill: yellow.lighten(90%),
/// ))
/// h(5mm)
/// }
#let chevron(node, extrude, dir: right, angle: 30deg, fit: 0.8) = {
let flip = dir in (right, left) // flip along diagonal line x = y
let rotate = dir in (bottom, left) // rotate 180deg
let (w, h) = node.size
if flip { (w, h) = (h, w) }
let x = w/2 + extrude
let c = w/2*calc.tan(angle)
let α = extrude*calc.tan(45deg - angle/2)
let β = extrude*calc.tan(45deg + angle/2)
let ɣ = extrude/calc.cos(angle) - c
let y = h/2 + c*fit
let verts = (
(-x, +y + α - c),
(0pt, +y + ɣ + c),
(+x, +y + α - c),
(+x, -y - β),
(0pt, -y - ɣ),
(-x, -y - β),
)
if flip { verts = verts.map(((i, j)) => (j, i)) }
if rotate { verts = verts.map(((i, j)) => (-i, -j)) }
draw.line(..verts, close: true)
}
/// An (irregular) hexagon node shape.
///
/// #diagram(
/// node-stroke: aqua,
/// node-fill: aqua.lighten(90%),
/// node((0,0), `hexagon`, shape: fletcher.shapes.hexagon)
/// )
///
/// - angle (angle): Half the exterior angle, `0deg` being a rectangle.
/// - fit (number): Adjusts how comfortably the hexagon fits the label's bounding box.
///
/// #for (i, fit) in (0, 0.5, 1).enumerate() {
/// let s = fletcher.shapes.hexagon.with(fit: fit)
/// let l = box(
/// stroke: (dash: "dashed", thickness: 0.5pt),
/// inset: 10pt,
/// raw("fit: " + repr(fit)),
/// )
/// diagram(node((i, 0), l,
/// inset: 0pt,
/// shape: s,
/// stroke: aqua,
/// fill: aqua.lighten(90%),
/// ))
/// h(5mm)
/// }
#let hexagon(node, extrude, angle: 30deg, fit: 0.8) = {
let (w, h) = node.size
let f = h/2*calc.tan(angle)*(1 - fit)
let x = w/2 + extrude*calc.tan(45deg - angle/2) - f
let y = h/2 + extrude
let z = y*calc.tan(angle)
draw.line(
(+x, -y),
(+x + z, 0pt),
(+x, +y),
(-x, +y),
(-x - z, 0pt),
(-x, -y),
close: true,
)
}
/// A truncated rectangle node shape.
///
/// #diagram(
/// node-stroke: maroon,
/// node-fill: maroon.lighten(90%),
/// node((0,0), `octagon`, shape: fletcher.shapes.octagon)
/// )
///
/// - truncate (number, length): Size of the truncated corners. A number is
/// interpreted as a multiple of the smaller of the node's width or height.
#let octagon(node, extrude, truncate: 0.5) = {
let (w, h) = node.size
let (x, y) = (w/2 + extrude, h/2 + extrude)
let d
if type(truncate) == length { d = truncate }
else { d = truncate*calc.min(w/2, h/2)}
d += extrude*0.5857864376 // (1 - calc.tan(calc.pi/8))
draw.line(
(-x + d, -y ),
(-x , -y + d),
(-x , +y - d),
(-x + d, +y ),
(+x - d, +y ),
(+x , +y - d),
(+x , -y + d),
(+x - d, -y ),
close: true,
)
}
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/020%20-%20Prologue%20to%20Battle%20for%20Zendikar/004_Catching%20Up.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"Catching Up",
set_name: "Prologue to Battle for Zendikar",
story_date: datetime(day: 22, month: 07, year: 2015),
author: "<NAME>",
doc
)
#emph[The mind mage <NAME> is many things to many people. Chief among his current responsibilities is that of the Living Guildpact, the magically empowered arbiter of inter-guild conflicts on the city-plane of Ravnica. But he has made many other promises and taken on many other problems—and each of those unfinished puzzles tugs at his mind.]
#emph[Some, perhaps, more than others.]
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Jace smiled tightly as the Golgari delegation shambled out of the room. He muttered a quick spell to clear out the fungal-rot smell of the esteemed ambassadors and their zombie attendants.
As soon as the door shut behind them, Jace's smile dropped, and he sat down on the large wooden desk he'd finally gotten around to installing. The desk creaked, and he frowned. He still needed a nice big chair to collapse into. Leather. Something expensive.
"Tell me that was the last for the day," he said.
"I would never perjure myself, even at your order," said his bailiff, Lavinia—rather archly, he thought.
#figure(image("004_Catching Up/01.png", width: 100%), caption: [Jace, Telepath Unbound | Art by Jaime Jones], supplement: none, numbering: none)
He groaned. It wasn't that the work was hard. Quite the opposite. It was a lot of work, and hardly any challenge.
"But," she continued, "as it happens, in this case I can say truthfully that that was the last of today's appointments. Of course, tomorrow's petitioners are already lining up."
No more sunlight streamed through the high windows of the Chamber of the Guildpact. When had he last eaten?
"They'll have to wait," he said. "Maybe I can solve all their problems, but I can't do it all in one day."
He turned toward her. She looked prim as ever. He scowled.
"You're not even tired, are you? People probably talk about <NAME>'s illusionary bailiff . . . What human could stand for twelve hours in full ceremonial armor and show no sign of it?"
She turned and looked him up and down.
"You'd have more endurance if you exercised once in a while, you know," she said. She was smiling, but that didn't mean she wasn't serious.
"Noted."
He turned to leave.
"Guildpact," said Lavinia. He turned. "Get some rest."
#figure(image("004_Catching Up/02.jpg", width: 100%), caption: [Lavinia of the Tenth | Art by <NAME>ai], supplement: none, numbering: none)
"Coffee," said Jace. "The Living Guildpact rules that coffee is an acceptable substitution for rest, as specified in subsection . . . whatever."
Lavinia had too much discipline to roll her eyes at him, but she did shake her head as he walked out of the room.
Down several twisting corridors, Jace ducked through a secret hallway to his personal apartments. No one knew about the secret hallway except him and Lavinia, and even Lavinia didn't know how to open it. There were stories on many planes of tyrants who, to preserve the secrets of their tombs and castles, killed architects or cut out their tongues. Jace had cleanly excised the knowledge from his builders' minds—far kinder, he told himself, though it didn't always feel that way.
His apartments were a mess of diagrams, projects in progress, and half-eaten meals. An illusionary depiction of a Zendikar hedron hovered, its runes tauntingly undeciphered. Globes and maps of various planes were marked with pins showing locations of import. The horn of an Onakke ogre rested on a draft copy of some dull piece of Azorius legislation.
#figure(image("004_Catching Up/03.jpg", width: 100%), caption: [Jace's Sanctum | Art by Adam Paquette], supplement: none, numbering: none)
Jace didn't have servants—too much risk, and it made him uncomfortable besides—but he occasionally summoned a menial illusion to clean the place up, usually when he was expecting company. And he did, occasionally, entertain, despite the secrecy of the apartments. The door was actually an Izzet-made teleportal, and he changed the location of its other end regularly. He could come and go as he pleased, could even have guests, and the mystery of the Living Guildpact only deepened.
He blinked, bleary-eyed. What had he been doing?
Right. Coffee.
There was a knock at the door.
Well, not really. But there was a knock on a door somewhere in the Seventh District, carried to his ears by the portal that linked his door to that one. And that was every bit as odd.
He pulled his hood up around his face, gathered mana, and carefully approached the doorway, keeping a spell ready to dispel the portal if necessary. In the meantime, he cast a spell that would let him see what was on the other side.
All of this paranoid preparation was probably unnecessary. It was probably just some confused citizen knocking on the wrong door down in the Seventh. At worst, it might be—
#figure(image("004_Catching Up/04.png", width: 100%), caption: [Liliana, Defiant Necromancer | Art by Karla Ortiz], supplement: none, numbering: none)
—Liliana?
He gaped.
Jace hadn't seen <NAME> since the day he'd realized she was playing him, and he'd skipped out on their rendezvous—after enduring mortal danger, the deaths of friends, and literal torture, all at least partially on her account. She was an amoral, self-serving death mage who'd sought him out on the orders of the dragon Planeswalker, <NAME>. She was also the first real lover he'd ever had, and he'd tried, in the time since, not to pine for her. He knew better.
The necromancer stood before an unmarked door miles away, unattended as far as he could tell. She held herself proudly, but she glanced from side to side occasionally, as though she was nervous. Or wary.
Or betraying him. Again.
An illusion? Through the portal, it was difficult to tell. If so, it was entirely convincing, right down to an irritated tapping of her left foot.
He shouldn't answer it. Whether it was really her or not, it was almost certainly a trap—and even if she didn't have plans to betray him, again, life with Liliana had a way of turning to rot in a hurry. He knew better.
He sighed, rendered himself invisible, and summoned an illusionary duplicate. The duplicate opened the door, with a telekinetic nudge from him.
"Liliana?" he said, out of the duplicate's mouth, painting a surprised look on its face. "What are—"
She casually walked right through the illusionary Jace.
"Can I come in?" she said over her shoulder.
Jace frowned, shoved the door shut, and dispelled his invisibility, his confused-looking illusionary double, and the teleportal for good measure. He hurried after her.
"What if I said no?"
"You didn't," she said.
He walked around her and got in her way. She looked past him, surveying the apartment.
"Lovely place. Shame what you've done with it."
She looked #emph[exactly] the same. But then, she would, wouldn't she? No less than four demonic contracts saw to that, etched in fell runes on her perfect skin. He'd always hated those etchings, tried not to—not to touch them.
Finally, she looked him in the eye.
"Hello, Jace."
Jace was not accustomed to noticing people's eyes. He didn't need them to read intentions, and while he'd learned to look at people's eyes when he talked to them, he'd never really learned to pay attention to them. But Liliana's eyes he remembered, old and violet-gray and full of the promise of danger. He tried to hold her gaze now, but found he couldn't stand the memories that stirred up. His eyes finally settled on her nose, the only place he could find that didn't make him some manner of uncomfortable.
"Nothing you can say will make me trust you," he said. "Not after you betrayed me."
She rolled her eyes. Her scent hit him, lilac and cinnamon masking the barest hint of something rotten and strange.
"You're the one who stood me up," she said.
"Yes, after you betrayed me!"
"That's ancient history," she said, picking up the Onakke horn and toying with it. "I'm not working for Bolas anymore, and I never meant you any harm."
"And shall I verify that?" he asked, taking the horn from her and setting it down. "Or do you still have your little protective measures?"
He had thought he'd read her mind when they first met, but she'd spoofed his telepathic abilities somehow. He had his suspicions, and the fact that she'd been secretly working for a millennia-old dragon archmage at the time was chief among them.
She said nothing, but reached out, slowly, toward his face. Part of him wanted to flinch from her touch. Part of him wanted to do very much the opposite. He settled for holding still. But she did not touch him, only took the edge of his hood between two fingers and pushed it back. She appraised him for a moment.
#figure(image("004_Catching Up/05.jpg", width: 100%), caption: [Jace, Architect of Thought | Art by <NAME>], supplement: none, numbering: none)
"You look older," she said.
"I'm not sure how to take that."
"At your age, dear, it's an unambiguous compliment." She cocked her head. "Have you started combing your hair?"
He smoothed his hair self-consciously, just for a moment, then withdrew his hand. He had, in fact, started combing it. Not that his hair was any of her business. He scowled.
"I'm guessing," he said, "that you didn't go to the considerable trouble of finding me just to critique my appearance. So let's get to business. How did you find me, and who else knows?"
She sighed theatrically.
"I hired a very good spy at a very high price," she said. "And nobody else knows, because his corpse is currently shambling around the Seventh trying to find me."
"Damn it!" he said. "You're talking about a Ravnican citizen."
"Don't fret. I made sure he deserved it, just for you," she said. "He's got a file at New Prahv as long as your arm: murder, arson, theft, extortion—and plenty of awful stuff the Azorius don't even know about. I did your friends at the Senate a favor."
"A warrant is supposed to lead to a trial," he snapped. "Not a summary execution! I have to think about that kind of thing now. I am the law—I #emph[literally] am the Law. I—Damn it, why are you smiling?"
"L<NAME>."
He sucked in a breath through gritted teeth.
"Ooh, yeah, he's a real bastard."
"Was," she said, smirking.
He sighed.
"Fine. It's not like I've never worked outside the law, even as the Guildpact."
They were still standing, slightly too close to each other, in his messy front room.
"Well?" she said. "Is the inquisition over?"
"Not just yet," he said. "What did you do to <NAME>?"
#figure(image("004_Catching Up/06.jpg", width: 100%), caption: [In Garruk's Wake | Art by Chase Stone], supplement: none, numbering: none)
"Oh," she said. "That."
"That."
"Can I at least sit down?"
He shrugged and gestured to one of the high-backed chairs that surrounded his table, but she walked around the table and flopped onto his couch. He didn't like looming over her, but he didn't want to sit next to her, so he dragged a chair over from the table and sat. She stared at him expectantly.
"Garruk," he prodded.
"Garruk." She frowned. "Not much to tell."
"So tell it."
"He attacked me," she said. "I won. I guess he's carrying a grudge."
"No."
She blinked those ancient violet eyes.
"No?"
"Tell me about the Chain Veil," said Jace.
"Oh," she said, looking away. "#emph[That] ."
He waited.
"It'll be easier if you tell me what you already know," she said.
"It'll be more informative if I don't."
In fact he already knew a great deal about the Chain Veil, its properties, and Liliana's run-ins with Garruk. But he was curious how much she'd be willing to tell him. And he did, if he was being perfectly honest, enjoy watching her squirm.
"Fine," she said. "It's a very powerful, very ancient artifact."
"Evil, too," he interjected.
"Yes, thank you," she said, rolling her eyes. "One of my demonic creditors sent me after it, as part of my servitude. I decided to use it to earn my freedom. The hard way."
#figure(image("004_Catching Up/07.jpg", width: 100%), caption: [Kothophed, Soul Hoarder | Art by <NAME>], supplement: none, numbering: none)
"You honestly think you can take on four demons—"
"Two," she said.
"What?"
"Two down," she said, holding up two fingers and grinning. "Two to go."
"Oh," he said. "That . . . changes things."
"Doesn't it?"
He'd intended, long ago, to help her find a way out of her contracts—to learn who she really was, beneath the desperation and the lies. Now she was halfway out without his help . . . and mired in something that might be worse.
"What did you do to Garruk?"
"The Veil is cursed," she said. "It was created to turn someone into a vessel for the resurrection of a long-dead race. But that's too much power for one soul to bear. It kills its users if they're not strong enough, I think."
"You #emph[think] ?"
"What can I say? I've been so busy with all this demon-slaying, I haven't exactly had time to visit the library."
"Fine," he said. "You don't seem dead."
"Nope," she said. Her eyes twinkled. "I'm too strong."
"You know what happens to the ones it doesn't kill, right?"
Her face fell—maybe the only honest emotion she'd shown since she walked in.
"Yes," she said. "Demons."
The Veil's power was overwhelming, transforming even its strongest wearers into monsters.
"And that's what Garruk is becoming. Has become, maybe. But not you."
"Not me," she said. "I don't know if it was my contracts or my necromancy. Or maybe I managed to pass the curse to him, right after I picked the thing up. Whatever the reason, he's the one turning into a monster. And I'm not. No more than I ever was, anyway."
"Alright," he said. "You're still alive, you're still human, and you're down two demons. So what's the problem?"
She arched an eyebrow.
"Who said there's a problem?"
"Lili, what are you doing here?"
She pouted.
"Can't I just drop by to see an old friend?"
"Stop it," he snapped. "We've been a lot of things, but we have #emph[never] been friends."
Silence, then. Her eyes hardened.
"I—"
"Don't," she said.
He shut his mouth.
"You're right," she said. "And for what it's worth, I'm sorry. I'm sorry for what you went through. I'm even sorry for what's happening to Garruk, if that will make you feel better."
She flopped her head back on the pillow and sighed.
"I don't know, Jace. I guess I was hoping we could . . . start over."
She lifted her head. Her eyes held his.
"Starting over is the first trick I learned," he said, forcing a smile. He lifted one hand and set it glowing, like it often did when he worked his mind magic. When he erased memories. "Just say the word . . ."
#figure(image("004_Catching Up/08.png", width: 100%), caption: [Jace, the Living Guildpact | Art by Chase Stone], supplement: none, numbering: none)
"No," she said. "Not like that."
She frowned and spread her hands in a helpless shrug. He had a hard time believing she was genuinely flustered, but she was putting on a convincing performance.
"Just . . . this conversation, at least?" she said. "Start over?"
"Well, it's too late to start with you not barging into my home."
"Fair," she said. "So where do we start?"
"How about with you apologizing for barging into my home?"
Her demeanor shifted—demure and contrite, hands folded primly in her lap, expression carefully guarded. But her eyes were playful.
"I'm #emph[so] sorry to barge in on you like this," she said, with exaggerated propriety. "I was in town, and I just couldn't resist dropping by. I deeply regret the unpleasantness of our last encounter, and I hope we can make a fresh start."
It was a game. Everything was a game with her, and he was tired of playing. He knew better. But if he didn't find out what she was up to, she would just get him into trouble some other way. And she wasn't the only one who could play games.
"What a pleasant surprise!" he said. "It's delightful to see you again—not at all suspicious or unwelcome. What sort of fresh start did you have in mind?"
She grinned wickedly.
"Buy me dinner?"
He snorted.
She smiled serenely.
"You're serious," he said.
She grinned.
"I'm always serious."
More games. More deceptions.
He knew better.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The pair strolled through Ravnica's fashionable Second District arm in arm. It was a warm night, and the streets were busy.
"So what's it like?" asked Liliana. "Being the Guildpact?"
"Exhausting," said Jace. "Everyone wants a piece of you. You're pulled in ten different directions, all the time."
#figure(image("004_Catching Up/09.jpg", width: 100%), caption: [Art by <NAME>], supplement: none, numbering: none)
"Sounds terrible," said Liliana. "Four was bad enough. Hells, being pulled in any directions is more than bad enough."
"The guilds aren't my masters," said Jace. "More like . . . clients. I have more freedom now than I did when I was part of Tezzeret's Consortium, that's for sure."
"But you're not the king," said Liliana. "You don't make the law. You're bound by it."
He shrugged.
"I wouldn't want to be a king," he said. "But yes. It can be . . . confining."
"Sir!" said a round little woman holding a basket of roses. "Sir! Buy a flower for your girlfriend?"
"She's not my—"
"Say no more, sir!" said the woman with a wink. "But a flower's always a fine gift for a lady."
"She's #emph[not ] a—"
Liliana elbowed him in the ribs.
"Of course," said Jace. He handed the woman a zino, told her to keep the change, and presented the rose to Liliana with a flourish.
"Sir!" said the woman, already working the couple behind them. "Sir! A flower for your boyfriend there?"
Liliana took the flower delicately and stared at it. In moments, it withered and dried into a blackened husk. She tucked it in her raven hair and smiled at him.
"Do you ever get tired of being difficult?" he asked.
She flashed a dizzying grin.
"Never."
They arrived.
Milena's was one of the finest restaurants in the Second, seating by reservation only. Jace exchanged a few quiet words with the maître d'—an efficient, ratlike little man named Valko—and the Living Guildpact and his guest were escorted to a table for two out on the patio, complete with candles.
"Good to know you're not above abusing your power," said Liliana.
He pulled a chair out for her, and she sat.
"I spend ten hours a day listening to zoning disputes and damage claims," said Jace, seating himself. "A table at a nice restaurant on short notice is the least this city can do in return."
"And you have this kind of money?" asked Liliana, ogling the menu.
"They usually comp it," he said. He tried to sound embarrassed, mainly because he was. But being the Guildpact wasn't easy, and it wasn't safe, and he wasn't ashamed of taking advantage of the few perks of the office. Not very, anyway.
"Of course," she said. "It's the least they can do."
They ordered, and Liliana didn't hold back—nor had he expected her to. A bottle of an expensive red Kasarda, Decamillennial vintage, rounded things out, and Jace wove a quick spell of silence to give them some privacy.
"This is a far cry from the dives we used to hide out in," said Liliana. "What was that awful little place called? The Bitter End?"
He raised a glass.
"To leaving the past . . . in the past."
She took a sip, then set her glass down quickly.
"I heard about what you did," she said. "Trying to stop Garruk."
"Oh," he said. "That."
"It was risky," she said. "I didn't think you'd do that for me."
"I didn't do it for you," said Jace. "Garruk's becoming a threat to every Planeswalker."
"Listen to yourself," she said, shaking her head. "<NAME>, defender of the Multiverse. You can't admit you're worried about me without pretending to be worried about literally everyone."
"Should I be worried about you?"
Anger clouded her face. She reached into the folds of skirt at her hip, and Jace spent a panicked half-second preparing a counterspell before he saw what she was doing.
#figure(image("004_Catching Up/10.jpg", width: 100%), caption: [The Chain Veil | Art by <NAME>], supplement: none, numbering: none)
The thing she withdrew could only be the Chain Veil. A cacophony of unintelligible whispers filled his head, just for a moment, until he tuned it out—whatever that was, it was her business, not his. Its links were a burnished gold and exquisitely crafted, so fine it appeared to have the texture of silk. It looked heavy, and it took on an unnatural gleam in the dim light of the restaurant. It was beautiful, and enticing, and dangerous.
His hand reached out, almost reflexively. She jerked the Veil back, out of his reach, a sudden and undignified movement.
"Afraid I'll take it from you?" he asked with bemusement.
She met his gaze, and for a fleeting moment he saw pain and fear and pleading in those ancient eyes.
"Afraid of what it might do to you," she said quietly. "And anyway, you can't take it, even if I wanted you to. Do you understand yet? What it is?"
Can't? Was it bonded to her somehow? Or did it just have its hooks into her that badly? He'd believe it, in either case.
"I'm starting to," he said.
The way the candlelight flickered off the thing was somehow sinister.
"If you're not going to let me look at it, put it away," he said. "It makes my skin crawl."
She tucked it away again.
"Mine too," she whispered.
The candles flickered.
"It sounds like maybe things aren't quite under control."
He understood, now, why she'd come. Play on his emotions and his curiosity all at once. Herself in need, and a puzzle to be solved—two things she knew he'd have a hard time resisting. And maybe, just maybe, she was right.
But he was going to make her ask.
Her eyes were pools of darkness.
"Jace, I . . ."
There was a commotion at the front of the restaurant, where the patio opened onto the street. Jace turned sharply, ready to cast any of half a dozen protective spells.
A tall, broad-shouldered man stood in the street, arguing with Valko. He wore armor, hard-used but well-maintained, and he was covered in blood and dirt and some unidentifiable muck. He pointed toward Jace. He was toward the edge of Jace's easy telepathic range, but a combination of lip-reading and surface thoughts told Jace what the man was saying: #emph[I need to speak to the Guildpact] .
He flashed a Boros insignia, pushed past the flustered maître d', and walked up to their table. He was quite a bit taller than Jace, with tawny skin but strikingly bright eyes.
"<NAME>," he said. "I need your help."
The man matched the description of a Planeswalker Jace had heard about, one who was planeswalking on and off of Ravnica with unusual regularity.
#figure(image("004_Catching Up/11.png", width: 100%), caption: [Gideon, Champion of Justice | Art by David Rapoza], supplement: none, numbering: none)
Valko hurried up behind the man.
"Guildpact," said Valko. "I'm so sorry. He says it's guild business—"
"No I didn't," said the man. "I just showed you my badge."
"I'm off-duty," said Jace. Planeswalker or no, this man's troubles weren't Jace's to solve. "Come to the Hall of the Guildpact in the morning and get on the docket, and in a few days—"
"It's about a place called Zendikar," said the man.
Liliana looked like she'd swallowed a nail.
"Sir," said Valko. "Whatever your business, your attire is entirely unacceptable. I must insist—"
"He can stay," said Jace. "If you're worried about appearances, I'll slip an invisibility spell over this whole table."
"That," said Valko, "will make it exceedingly difficult to bring you your dinner."
"It won't cover the smell, either," said Liliana.
"I'll make it up to you," said Jace, and shooed Valko away.
"What about me?" said Liliana.
"My name is Gideon," said the man. He glanced at Liliana.
"She knows," said Jace. "Have a seat."
"I'd rather stand," said Gideon.
Jace stood up. It was an error. He still had to crane his neck to look Gideon in the eye, and now the size difference between them was glaringly obvious. He hated feeling small. Hated it.
"Now that you've thoroughly ruined my evening," said Jace, "how about you get to the point?"
Gideon's eyes narrowed.
"Have you actually been to Zendikar?"
"Yes," said Jace. "It didn't go well."
"Sea Gate has fallen."
"What?" said Jace. "When? How?"
"Hours ago," said Gideon. "Maybe less. I left before it was over, but the place was doomed. And as for how . . . What do you know of the Eldrazi?"
"They'd just emerged when I was last there. I saw one, shortly before I left," said Jace. 'Saw one,' that was one way to put it. 'Inadvertently released them from millennia of imprisonment to terrorize Zendikar,' that was another. Jace wondered if Gideon knew. "I know some scholars at Sea Gate. Any word of them?"
"Their archives were lost," said Gideon. "That's why I came to find you. They were close to some kind of breakthrough with the hedrons, something that could fight the Eldrazi. And you have a reputation for solving puzzles."
#figure(image("004_Catching Up/12.jpg", width: 100%), caption: [Talent of the Telepath | Art by Peter Mohrbacher], supplement: none, numbering: none)
A quick dive into the man's mind confirmed that he was telling the truth.
"The hedron network?" said Jace. "What kind of breakthrough?"
"I don't know," said Gideon. "They called it the 'puzzle of leylines,' and they believe it’s connected to the Eldrazi. Will you come with me and solve it?"
"Leylines!" said Jace. His first instinct was to reach for his notes, but of course they were back at his apartments. "I'd never tied the hedrons to leylines. That has . . . implications."
He rubbed his forehead. The Eldrazi were his responsibility, in a way. He'd spent some time since then researching them, researching the hedrons. But he had so many other responsibilities!
"If you know Zendikar, and you've seen the Eldrazi, then you know how serious this is," said Gideon. "I know you'll do the right thing."
Liliana drained her glass, shoved her chair back, and walked past Jace.
"Lili, wait—"
She kept walking.
"Give me a minute," he said to Gideon.
He ran after her, matched her pace. He knew better than to try to grab her arm—that was a good way to end up at the healer's.
"Liliana!"
She stopped and faced him, eyes bright with rage.
"I seek you out after all this time," she said. "I open up to you. And now, after all we've been through together, you're ready to walk off with some undercooked side of beef from Sunhome, just because he #emph[asked] ?"
"What's happening on Zendikar . . ." he said. "It's my fault. Sort of. It was unintentional, and I suspect I was manipulated, but the fact remains, those Eldrazi things are loose because I walked into something without understanding it."
"So now you're going to dive right back in," she said. "What are you waiting for?"
"You could come with us," he said.
"Excuse me?"
"Come with us," said Jace. "Put your skills to use fighting some actual monsters. Maybe you can make an ally of this Gideon guy."
"No," said Liliana. "Some of us don't go borrowing trouble when we already have more than enough."
"I'm not leaving until morning," said Jace. "Think it over. Come to the Hall if you change your mind."
"No."
"You could wait for me on Ravnica, then," said Jace. "Whatever research he needs me to do, it won't take long. I'll come back. We can continue our conversation. And if you ever get around to telling me why you're here, we can talk about what happens next."
"You're out of your mind," she said. "I've got demons to kill."
"Fine," said Jace. "Good luck with that. And Liliana?"
She waited.
"He #emph[asked.] #emph["]
She plucked the dead rose out of her hair and tossed it at his feet, then turned on her heel and walked away.
Jace bent to pick up the withered flower as Gideon's heavy footfalls approached behind him.
"Finished?" said Gideon.
Jace turned, ready to snap at him, but Gideon's face was so earnest, and so haggard, that Jace couldn't muster the anger. Liliana was bad news anyway. He knew better.
#figure(image("004_Catching Up/13.jpg", width: 100%), caption: [Art by <NAME>], supplement: none, numbering: none)
"Finished," said Jace. "Come on. I know a good healer who can patch you up."
"No time," said Gideon. "We have to go."
"I'm not leaving the plane until morning," said Jace. "I need to make arrangements, and I need to get my notes. And you! You can't help Zendikar if you drop dead of exhaustion. You need to get some rest."
Gideon stared down at him for a long moment.
"Fine," Gideon said at last. "Take me to this healer."
"Tell me about the Eldrazi," said Jace.
He took a step, but Gideon stopped him with one hand on Jace's shoulder. Jace reached up and deliberately pushed Gideon's hand off of him.
Gideon glanced at the withered rose that Jace was still twirling between his fingers. "Do I have your full attention?"
"Of course," said Jace. "Tell me everything you know."
He dropped the dead rose on the cobblestones and fell in beside Gideon.
He knew better.
|
|
https://github.com/skyl4b/typst-templates | https://raw.githubusercontent.com/skyl4b/typst-templates/main/README.md | markdown | MIT License | # typst-templates
Some academic typst templates for university related work
|
https://github.com/hchap1/Typst-Renderer-Discord-Bot | https://raw.githubusercontent.com/hchap1/Typst-Renderer-Discord-Bot/main/buf.typ | typst | #set text(fill: rgb("eeeeee"))
#set page(fill: rgb("313338"), width: auto, height: auto, margin: (top: 0.3cm, bottom: 0.3cm, left: 0.3cm, right: 0.3cm))
$2$
|
|
https://github.com/xhalo32/constructive-logic-course | https://raw.githubusercontent.com/xhalo32/constructive-logic-course/master/notes/muistiinpanot.typ | typst | Formalisointi-kappaleen tehtäväideoita:
- Löydä tyyppi todistukselle
- Täytä todistus
- Muodosta sanalliselle lauseelle formaali versio
Muuta formalisointi-kappaleeseen:
- Lean-blueprint
- Suuret formalisointiprojektit
- Fermat'n suuri lause ja korollaareja |
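As an illustration of the first task idea ("find the type of a proof"), such an exercise might look like this in Lean 4 (a sketch, not taken from the course material):

```lean
-- Exercise: find the proposition (type) proved by this term.
-- Here the answer is modus ponens: p → (p → q) → q.
example : ∀ (p q : Prop), p → (p → q) → q :=
  fun _ _ hp hpq => hpq hp
```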
|
https://github.com/RakuJa/Space_Communication_Notes | https://raw.githubusercontent.com/RakuJa/Space_Communication_Notes/main/template.typ | typst | MIT License | #import "@preview/chic-hdr:0.4.0": *
#import "@preview/physica:0.9.3": *
#let buildMainHeader(mainHeadingContent) = {
[
#align(center, smallcaps(mainHeadingContent))
#line(length: 100%)
]
}
#let buildSecondaryHeader(mainHeadingContent, secondaryHeadingContent) = {
[
#smallcaps(mainHeadingContent) #h(1fr) #emph(secondaryHeadingContent)
#line(length: 100%)
]
}
#let isAfter(secondaryHeading, mainHeading) = {
let secHeadPos = secondaryHeading.location().position()
let mainHeadPos = mainHeading.location().position()
if (secHeadPos.at("page") > mainHeadPos.at("page")) {
return true
}
if (secHeadPos.at("page") == mainHeadPos.at("page")) {
return secHeadPos.at("y") > mainHeadPos.at("y")
}
return false
}
#let getHeader() = {
locate(loc => {
// Find if there is a level 1 heading on the current page
let nextMainHeading = query(selector(heading).after(loc), loc).find(headIt => {
headIt.location().page() == loc.page() and headIt.level == 1
})
if (nextMainHeading != none) {
return buildMainHeader(nextMainHeading.body)
}
// Find the last previous level 1 heading -- at this point surely there's one :-)
let lastMainHeading = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level == 1
}).last()
// Find if the last level > 1 heading in previous pages
let previousSecondaryHeadingArray = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level > 1
})
let lastSecondaryHeading = if (previousSecondaryHeadingArray.len() != 0) {previousSecondaryHeadingArray.last()} else {none}
// Find if the last secondary heading exists and if it's after the last main heading
if (lastSecondaryHeading != none and isAfter(lastSecondaryHeading, lastMainHeading)) {
return buildSecondaryHeader(lastMainHeading.body, lastSecondaryHeading.body)
}
return buildMainHeader(lastMainHeading.body)
})
}
// Project part
#let project(
title: "",
authors: (),
subtitle: "",
objective: "",
department: "" ,
institute: "",
year: "",
logo: none,
abstract: none,
subject: "",
degree: "Master of Science",
stream: "Cybersecurity",
guide: (),
body
) = {
// Set the document's basic properties.
set document(
author: authors.map(a => a.name),
title: title)
set page(
paper: "a4",
margin: (
top: 1in,
bottom: 1in,
left: 1in,
right: 1in
)
)
set text(font: "Times New Roman", lang: "en")
set par(justify: true)
// Title row.
v(.10fr)
align(center)[
#text(12pt, strong(smallcaps(subject)))
\ \ #text(30pt, weight: 900, smallcaps(title))
\ #text(14pt, weight: 200, subtitle)
]
// Degree Part
align(center)[
#text(12pt, strong(degree))
\ #text(12pt, strong("in"))
\ #text(12pt, strong(stream))
#v(1cm)
#text(12pt, strong("Submitted By,"))
]
pad(
top: 2em,
for i in range(calc.ceil(authors.len() / 3)) {
let end = calc.min((i + 1) * 3, authors.len())
let is-last = authors.len() == end
let slice = authors.slice(i * 3, end)
grid(
columns: slice.len() * (1fr,),
gutter: 12pt,
..slice.map(author => align(center, {
text(12pt, strong(author.name))
if "rollno" in author [
\ #author.rollno
]
if "regno" in author [
\ #author.regno
]
if "department" in author [
\ #author.department
]
if "email" in author [
\ #link("mailto:" + author.email)
]
}))
)
if not is-last {
v(16pt, weak: true)
}
}
)
v(1cm)
align(center)[
#text(12pt, "Under the guidance of,")
\ #text(14pt, smallcaps(strong(guide.name)))
\ #text(12pt, smallcaps(guide.designation+","))
\ #text(12pt, smallcaps(guide.department))
]
v(0.75fr)
if logo != none {
align(center)[
#image(logo, width: 26%)
\ #text(12pt, strong(smallcaps(department)))
\ #text(14pt, institute)
\ #text("Accademic Year: " + year)
]
} else {
align(center)[
#text(12pt, strong(smallcaps(department)))
\ #text(14pt, institute)
\ #text("Accademic Year: " + year)
]
}
set page(margin: (
top: 1in,
bottom: 1in,
left: 1.5in,
right: 1in
))
if abstract != none {
pagebreak()
align(right)[
#text(34pt, underline(smallcaps(strong("Abstract"))))
]
set par(justify: true)
abstract
}
pagebreak()
outline(depth: 3, indent: true)
// Formatting the headings
// General First and then specific headings
show heading: it => [
#set align(left)
#set text(14pt)
#block(smallcaps(it.body))
]
show heading.where(level: 1): it => [
#pagebreak(weak: true)
#set align(center)
#set text(30pt)
#underline(extent: 2pt)[
#block(smallcaps(it.body))
#v(0em)
]
]
show heading.where(level:2): it => [
#set text(24pt)
#block(counter(heading).display() + " " + smallcaps(it.body))
]
show heading.where(level:3): it => [
#set text(20pt)
#block(counter(heading).display() + " " + smallcaps(it.body))
]
show heading.where(level:4): it => [
#set text(16pt)
#block(smallcaps(it.body))
]
show heading.where(level:5): it => [
#set text(14pt)
#block(smallcaps(it.body))
]
// Main body.
set par(justify: true)
set heading(numbering: "1.1")
counter(page).update(1)
set page(header: getHeader())
set page(numbering: "1", number-align: center)
body
} |
https://github.com/vncsb/desec-typst-template | https://raw.githubusercontent.com/vncsb/desec-typst-template/main/cover.typ | typst | #show link: underline
#set page(
paper: "us-letter",
margin: 0pt,
background: image("images/cover.jpg", width: 101%),
)
#v(26cm)
#h(7cm)
#place(left, dx: 4.5cm)[
#set text(font: "Roboto", size: 10pt)
#align(center)[
*CONFIDENCIAL* \
Copyright © Desec Security \
#link("(https://www.desecsecurity.com)")
]
]
|
|
https://github.com/jassielof/typst-templates | https://raw.githubusercontent.com/jassielof/typst-templates/main/upsa-bo/estudio-de-factibilidad/template/capítulos/4.estudio%20de%20la%20materia%20prima%20e%20insumos.typ | typst | MIT License | = Estudio de la Materia Prima e Insumos
== Identificación
=== Nombre Científico y Comercial
=== Características Organolépticas
=== Características Físico-Químicas
=== Atributos de Calidad a Analizar
=== Métodos de Toma de Muestra
=== Equipos para el Análisis
=== Atributos de Aceptación
== Procesos de Transporte, Manipuleo y Almacenamiento
=== Transporte
=== Manipuleo
=== Almacenamiento
== Balance Oferta-Demanda
=== Proyección de la Oferta
=== Proyección de la Demanda
=== Balance
== Selección de Proveedores
=== Condiciones de los Proveedores
=== Cálculo del Costo en Almacén
=== Selección del Proveedor Principal y Alternativo
|
https://github.com/ClazyChen/Table-Tennis-Rankings | https://raw.githubusercontent.com/ClazyChen/Table-Tennis-Rankings/main/history_CN/2019/WS-05.typ | typst |
#set text(font: ("Courier New", "NSimSun"))
#figure(
caption: "Women's Singles (1 - 32)",
table(
columns: 4,
    [Rank], [Player], [Country/Region], [Points],
[1], [陈梦], [CHN], [3419],
[2], [刘诗雯], [CHN], [3408],
[3], [朱雨玲], [MAC], [3408],
[4], [丁宁], [CHN], [3312],
[5], [王曼昱], [CHN], [3310],
[6], [孙颖莎], [CHN], [3176],
[7], [木子], [CHN], [3136],
[8], [伊藤美诚], [JPN], [3097],
[9], [陈幸同], [CHN], [3096],
[10], [何卓佳], [CHN], [3092],
[11], [#text(gray, "文佳")], [CHN], [3073],
[12], [杜凯琹], [HKG], [3071],
[13], [石川佳纯], [JPN], [3062],
[14], [武杨], [CHN], [3031],
[15], [冯亚兰], [CHN], [3017],
[16], [加藤美优], [JPN], [3013],
[17], [顾玉婷], [CHN], [3012],
[18], [平野美宇], [JPN], [3006],
[19], [王艺迪], [CHN], [2998],
[20], [#text(gray, "刘高阳")], [CHN], [2992],
[21], [早田希娜], [JPN], [2989],
[22], [#text(gray, "胡丽梅")], [CHN], [2972],
[23], [芝田沙季], [JPN], [2967],
[24], [冯天薇], [SGP], [2964],
[25], [韩莹], [GER], [2960],
[26], [金宋依], [PRK], [2949],
[27], [陈可], [CHN], [2928],
[28], [桥本帆乃香], [JPN], [2917],
[29], [李倩], [POL], [2903],
[30], [佐藤瞳], [JPN], [2891],
[31], [徐孝元], [KOR], [2882],
[32], [郑怡静], [TPE], [2878],
)
)#pagebreak()
#set text(font: ("Courier New", "NSimSun"))
#figure(
caption: "Women's Singles (33 - 64)",
table(
columns: 4,
    [Rank], [Player], [Country/Region], [Points],
[33], [张瑞], [CHN], [2871],
[34], [于梦雨], [SGP], [2867],
[35], [安藤南], [JPN], [2866],
[36], [CHA Hyo Sim], [PRK], [2857],
[37], [孙铭阳], [CHN], [2854],
[38], [张蔷], [CHN], [2852],
[39], [傅玉], [POR], [2852],
[40], [GU Ruochen], [CHN], [2849],
[41], [佩特丽莎 索尔佳], [GER], [2837],
[42], [石洵瑶], [CHN], [2831],
[43], [伯纳黛特 斯佐科斯], [ROU], [2828],
[44], [车晓曦], [CHN], [2822],
[45], [杨晓欣], [MON], [2822],
[46], [LIU Xi], [CHN], [2815],
[47], [田志希], [KOR], [2812],
[48], [KIM Nam Hae], [PRK], [2807],
[49], [PESOTSKA Margaryta], [UKR], [2805],
[50], [侯美玲], [TUR], [2804],
[51], [索菲亚 波尔卡诺娃], [AUT], [2801],
[52], [陈思羽], [TPE], [2799],
[53], [单晓娜], [GER], [2788],
[54], [李皓晴], [HKG], [2781],
[55], [伊丽莎白 萨玛拉], [ROU], [2781],
[56], [梁夏银], [KOR], [2778],
[57], [长崎美柚], [JPN], [2773],
[58], [SOO Wai Yam Minnie], [HKG], [2771],
[59], [阿德里安娜 迪亚兹], [PUR], [2771],
[60], [张墨], [CAN], [2756],
[61], [李佳燚], [CHN], [2753],
[62], [CHENG Hsien-Tzu], [TPE], [2751],
[63], [崔孝珠], [KOR], [2745],
[64], [布里特 伊尔兰德], [NED], [2743],
)
)#pagebreak()
#set text(font: ("Courier New", "NSimSun"))
#figure(
caption: "Women's Singles (65 - 96)",
table(
columns: 4,
    [Rank], [Player], [Country/Region], [Points],
[65], [李佼], [NED], [2738],
[66], [森樱], [JPN], [2737],
[67], [浜本由惟], [JPN], [2732],
[68], [李洁], [NED], [2730],
[69], [<NAME>], [JPN], [2722],
[70], [SAWETTABUT Suthasini], [THA], [2707],
[71], [范思琦], [CHN], [2706],
[72], [EKHOLM Matilda], [SWE], [2699],
[73], [刘佳], [AUT], [2698],
[74], [李芬], [SWE], [2695],
[75], [<NAME>], [CZE], [2689],
[76], [木原美悠], [JPN], [2682],
[77], [#text(gray, "<NAME>")], [JPN], [2680],
[78], [#text(gray, "<NAME>ayuan")], [CHN], [2677],
[79], [<NAME>], [JPN], [2676],
[80], [刘斐], [CHN], [2676],
[81], [李时温], [KOR], [2675],
[82], [GRZYBOWSKA-FRANC Katarzyna], [POL], [2673],
[83], [李恩惠], [KOR], [2671],
[84], [金河英], [KOR], [2666],
[85], [妮娜 米特兰姆], [GER], [2665],
[86], [YOO Eunchong], [KOR], [2665],
[87], [MAEDA Miyu], [JPN], [2661],
[88], [LIU Hsing-Yin], [TPE], [2659],
[89], [HUANG Yingqi], [CHN], [2656],
[90], [LIN Ye], [SGP], [2656],
[91], [LANG Kristin], [GER], [2646],
[92], [森田美咲], [JPN], [2642],
[93], [KIM Youjin], [KOR], [2642],
[94], [SOLJA Amelie], [AUT], [2637],
[95], [SOMA Yumeno], [JPN], [2634],
[96], [LIU Xin], [CHN], [2633],
)
)#pagebreak()
#set text(font: ("Courier New", "NSimSun"))
#figure(
caption: "Women's Singles (97 - 128)",
table(
columns: 4,
    [Rank], [Player], [Country/Region], [Points],
[97], [钱天一], [CHN], [2632],
[98], [曾尖], [SGP], [2631],
[99], [大藤沙月], [JPN], [2630],
[100], [YOON Hyobin], [KOR], [2627],
[101], [WU Yue], [USA], [2618],
[102], [<NAME>], [HUN], [2617],
[103], [申裕斌], [KOR], [2611],
[104], [<NAME>], [SVK], [2609],
[105], [<NAME>], [UKR], [2604],
[106], [张安], [USA], [2604],
[107], [邵杰妮], [POR], [2603],
[108], [WINTER Sabine], [GER], [2603],
[109], [HUANG Yi-Hua], [TPE], [2602],
[110], [#text(gray, "KATO Kyoka")], [JPN], [2594],
[111], [倪夏莲], [LUX], [2594],
[112], [NG Wing Nam], [HKG], [2591],
[113], [POTA Georgina], [HUN], [2587],
[114], [NOSKOVA Yana], [RUS], [2584],
[115], [<NAME>], [HUN], [2584],
[116], [李昱谆], [TPE], [2584],
[117], [<NAME>], [ROU], [2582],
[118], [MIKHAILOVA Polina], [RUS], [2579],
[119], [玛妮卡 巴特拉], [IND], [2577],
[120], [DIACONU Adina], [ROU], [2569],
[121], [郭雨涵], [CHN], [2568],
[122], [玛利亚 肖], [ESP], [2566],
[123], [HUANG Yu-Wen], [TPE], [2565],
[124], [#text(gray, "SO Eka")], [JPN], [2565],
[125], [#text(gray, "CHOE Hyon Hwa")], [PRK], [2564],
[126], [KIM Jiho], [KOR], [2562],
[127], [笹尾明日香], [JPN], [2561],
[128], [TIAN Yuan], [CRO], [2561],
)
) |
|
https://github.com/Enter-tainer/typst-preview | https://raw.githubusercontent.com/Enter-tainer/typst-preview/main/docs/config.typ | typst | MIT License | #import "./book.typ": book-page
#import "./templates/gh-page.typ": page-width, is-dark-theme
#import "@preview/fontawesome:0.1.0": *
#import "@preview/colorful-boxes:1.1.0": *
#show: book-page.with(title: "Configuration")
#show link: underline
= Extension Configuration Options
The following are the available options for configuring the typst-preview extension:
#let pkg_json = json("../addons/vscode/package.json")
#let config_item(key, cfg) = [
+ *#raw(key)*:
- Type: #raw(cfg.type)
#if cfg.type == "array" [
- Items: #raw(cfg.items.type)
- Description: #eval(cfg.items.description, mode: "markup")
]
- Description: #eval(cfg.description, mode: "markup")
#if cfg.at("enum", default: none) != none [
- Valid values: #for (i, item) in cfg.enum.enumerate() [
- #raw(item): #eval(cfg.enumDescriptions.at(i), mode: "markup")
]
]
#if type(cfg.default) == "string" {
if cfg.default != "" [
- Default: #raw(cfg.default)
] else [
- Default: `""`
]
} else if type(cfg.default) == "array" [
- Default: [#cfg.default.join(",")]
] else [
- Default: #cfg.default
]
]
#for (key, cfg) in pkg_json.contributes.configuration.properties {
config_item(key, cfg)
}
|
https://github.com/andreasKroepelin/TypstJlyfish.jl | https://raw.githubusercontent.com/andreasKroepelin/TypstJlyfish.jl/main/typst/lib.typ | typst | MIT License | #import "jlyfish.typ": jl, jl-raw, jl-pkg, read-julia-output
|
https://github.com/frectonz/the-pg-book | https://raw.githubusercontent.com/frectonz/the-pg-book/main/book/016.%20spam.html.typ | typst | spam.html
A Plan for Spam
Like to build things? Try Hacker
News.
August 2002(This article describes the spam-filtering techniques
used in the spamproof web-based mail reader we
built to exercise Arc. An
improved algorithm is described in Better
Bayesian Filtering.)I think it's possible to stop spam, and that
content-based filters are the way to do it.
The Achilles heel of the spammers is their message.
They can circumvent any other barrier you set up. They have so far, at
least. But they have to deliver their message, whatever it
is. If we can write software that recognizes their messages,
there is no way they can get around that._ _ _To the recipient, spam is easily recognizable. If you hired
someone to read your mail and discard the spam, they would
have little trouble doing it. How much do we have
to do, short of AI, to automate this process?

I think we will be able to solve the problem with fairly
simple algorithms. In fact, I've found that you can filter
present-day spam acceptably well using nothing more than a
Bayesian combination of the spam probabilities of individual
words. Using a slightly tweaked (as described below) Bayesian
filter, we now miss less than 5 per 1000 spams, with 0 false positives.

The statistical approach is not usually the first one people
try when they write spam filters. Most hackers' first instinct is
to try to write software that recognizes individual properties of
spam. You look at spams
and you think, the gall of these guys to try sending me mail
that begins "Dear Friend" or has a subject line that's all
uppercase and ends in eight exclamation points. I can filter
out that stuff with about one line of code.

And so you do,
and in the beginning it works. A few simple rules will take
a big bite out of your incoming spam. Merely looking
for the word "click" will catch 79.7% of the
emails in my spam corpus, with only 1.2% false positives.

I spent about six months writing software that looked for
individual spam features before I tried the statistical
approach. What I found was that recognizing that last few
percent of spams got very hard, and that as I
made the filters stricter I got more false positives.

False positives are innocent emails that get mistakenly
identified as spams.
For most users,
missing legitimate email is
an order of magnitude worse than receiving spam, so a
filter that yields false positives is like an acne cure
that carries a risk of death to the patient.

The more spam a user gets, the less
likely he'll be to notice one innocent mail sitting in his
spam folder. And strangely enough, the better your spam filters get,
the more dangerous false positives become, because when the
filters are really good, users will be more likely to
ignore everything they catch.

I don't know why I avoided trying the statistical approach
for so long. I think it was because I got addicted to
trying to identify spam features myself, as if I were playing
some kind of competitive game with the spammers. (Nonhackers
don't often realize this, but most hackers are very competitive.)
When I did try statistical analysis, I
found immediately that it was much cleverer than I had been.
It discovered, of course, that terms like "virtumundo" and
"teens" were good indicators of spam. But it also
discovered that "per" and "FL" and "ff0000" are good
indicators of spam. In fact, "ff0000" (html for bright red)
turns out to be as good an indicator of spam as any
pornographic term.

_ _ _

Here's a sketch of how I do statistical filtering. I start
with one corpus of spam and one of nonspam mail. At the
moment each one has about 4000 messages in it. I scan
the entire text, including headers and embedded html
and javascript, of each message in each corpus.
I currently consider alphanumeric characters,
dashes, apostrophes, and dollar signs to be part of tokens,
and everything else to be a token separator. (There is
probably room for improvement here.) I ignore tokens that
are all digits, and I also ignore html comments, not even
considering them as token separators.

I count the number
of times each token (ignoring case, currently) occurs in
each corpus. At this stage I end up with two large hash
tables, one for each corpus, mapping tokens to number
of occurrences.

Next I create a third hash table, this time mapping
each token to the probability that an email containing it is a spam,
which I calculate as follows [1]:
(let ((g (* 2 (or (gethash word good) 0)))
(b (or (gethash word bad) 0)))
(unless (< (+ g b) 5)
(max .01
(min .99 (float (/ (min 1 (/ b nbad))
(+ (min 1 (/ g ngood))
(min 1 (/ b nbad)))))))))
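For readers who don't read Lisp, a rough Python transcription of the snippet above — with descriptive parameter names of my own choosing — would be:

```python
def word_prob(good_count, bad_count, ngood, nbad):
    # Double the good counts to bias against false positives.
    g = 2 * good_count
    b = bad_count
    if g + b < 5:
        return None  # too rare to assign a probability
    return max(0.01, min(0.99, min(1.0, b / nbad) /
                         (min(1.0, g / ngood) + min(1.0, b / nbad))))
```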
where word is the token whose probability we're
calculating, good and bad are the hash tables
I created in the first step, and ngood and nbad
are the number of nonspam and spam messages respectively.

I explained this as code to show a couple of important details.
I want to bias the probabilities slightly to avoid false
positives, and by trial and error I've found that a good
way to do it is to double all the numbers in good.
This helps to distinguish between words that occasionally
do occur in legitimate email and words that almost never do.
I only consider words that occur more than five times in
total (actually, because of the doubling, occurring three
times in nonspam mail would be enough). And then there is
the question of what probability to assign to words that
occur in one corpus but not the other. Again by trial and
error I chose .01 and .99. There may be room for tuning
here, but as the corpus grows such tuning will happen
automatically anyway.

The especially observant will notice that while I consider
each corpus to be a single long stream of text for purposes
of counting occurrences, I use the number of emails in
each, rather than their combined length, as the divisor
in calculating spam probabilities. This adds another
slight bias to protect against false positives.

When new mail arrives, it is scanned into tokens, and
the most interesting fifteen tokens, where interesting is
measured by how far their spam probability is from a
neutral .5, are used to calculate the probability that
the mail is spam. If probs
is a list of the fifteen individual probabilities, you
calculate the
combined probability thus:
(let ((prod (apply #'* probs)))
(/ prod (+ prod (apply #'* (mapcar #'(lambda (x)
(- 1 x))
probs)))))
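In Python, again as a rough equivalent:

```python
from math import prod

def combined_prob(probs):
    # Naive-Bayes combination of the individual word probabilities.
    p = prod(probs)
    q = prod(1 - x for x in probs)
    return p / (p + q)
```

So two words with probabilities .97 and .99 combine, absent other evidence, to about .9997 — the same arithmetic as the "sex"/"sexy" example later in this essay.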
One question that arises in
practice is what probability to assign to a word you've
never seen, i.e. one that doesn't occur in the hash table
of word probabilities. I've found, again by trial and
error, that .4 is a good number to use. If you've never
seen a word before, it is probably fairly innocent; spam
words tend to be all too familiar.

There are examples of this algorithm being applied to
actual emails in an appendix at the end.

I treat mail as spam if the algorithm above gives it a
probability of more than .9 of being spam. But in practice
it would not matter much where I put this threshold, because
few probabilities end up in the middle of the range.

_ _ _

One great advantage of the statistical approach is that you
don't have to read so many spams. Over the past six months,
I've read literally thousands of spams, and it is really
kind of demoralizing. <NAME> said if you compete
with slaves you become a slave, and there is something
similarly degrading about competing with spammers. To
recognize individual spam features you have to try to get
into the mind of the spammer, and frankly I want to spend
as little time inside the minds of spammers as possible.

But the real advantage of the Bayesian approach, of course,
is that you know what
you're measuring. Feature-recognizing filters like
SpamAssassin assign a spam "score" to email. The Bayesian
approach assigns an actual probability. The problem with
a "score" is that no one knows what it means. The user
doesn't know what it means, but worse still, neither does
the developer of the filter. How many points should an
email get for having the word "sex" in it? A probability
can of course be mistaken, but there is little ambiguity
about what it means, or how evidence should be combined
to calculate it. Based on my corpus, "sex" indicates
a .97 probability of the containing email being a spam,
whereas "sexy" indicates .99 probability.
And Bayes' Rule, equally unambiguous, says that an email
containing both words would, in the (unlikely)
absence of any other evidence, have a 99.97% chance of
being a spam.

Because it is measuring probabilities, the Bayesian approach
considers all the evidence in the email, both good and bad.
Words that occur disproportionately rarely
in spam (like "though" or "tonight" or "apparently")
contribute as much to decreasing the probability as
bad words like "unsubscribe" and "opt-in" do to
increasing it. So an otherwise innocent email that happens
to include the word "sex" is not going to get tagged as spam.

Ideally, of course, the probabilities should be calculated
individually for each user. I get a lot of email containing
the word "Lisp", and (so far) no spam that does. So a word
like that is effectively a kind of password for sending
mail to me. In my earlier spam-filtering software, the user
could set up a list of such words and mail containing
them would automatically get past the filters. On my
list I put words like "Lisp" and also my zipcode, so
that (otherwise rather spammy-sounding) receipts from
online orders would get through. I thought I was being
very clever, but I found that the Bayesian filter did the
same thing for me, and moreover discovered a lot of words I
hadn't thought of.

When I said at the start that our filters let through less than
5 spams per 1000 with 0 false positives, I'm talking about
filtering my mail based on a corpus of my mail. But these
numbers are not misleading, because that is the approach I'm
advocating: filter each user's mail based on the spam and
nonspam mail he receives. Essentially, each user should
have two delete buttons, ordinary delete and delete-as-spam.
Anything deleted as spam goes into the spam corpus,
and everything else goes into the nonspam corpus.

You could start
users with a seed filter, but ultimately each user should have
his own per-word probabilities based on the actual mail he
receives. This (a) makes the filters more effective, (b) lets
each user decide their own precise definition of spam,
and (c) perhaps best of all makes it hard for spammers
to tune mails to get through the filters. If a lot of the
brain of the filter is in the individual databases, then
merely tuning spams to get through the seed filters
won't guarantee anything about how well they'll get through
individual users' varying and much more trained filters.

Content-based spam filtering is often combined with a whitelist,
a list of senders whose mail can be accepted with no filtering.
One easy way to build such a
whitelist is to keep a list of every address the user has
ever sent mail to. If a mail reader has a delete-as-spam
button then you could also add the from address
of every email the user has deleted as ordinary trash.

I'm an advocate of whitelists, but more as a way to save
computation than as a way to improve filtering. I used to think that
whitelists would make filtering easier, because you'd
only have to filter email from people you'd never heard
from, and someone sending you mail for the first time is
constrained by convention in what they can say to you.
Someone you already know might send you an email talking about sex,
but someone sending you mail for the first time would not
be likely to. The problem is, people can have more than one
email address, so a new from-address doesn't guarantee that
the sender is writing to you for the first time.
It is not unusual
for an old friend (especially if he is a hacker) to suddenly
send you an email with a new from-address, so you can't
risk false positives by filtering mail from unknown
addresses especially stringently.

In a sense, though, my filters do themselves embody a kind
of whitelist (and blacklist) because they are based on
entire messages, including the headers. So to that
extent they "know" the email addresses of trusted senders
and even the routes by which mail gets from them to me.
And they know the same about spam, including the server
names, mailer versions, and protocols.

_ _ _

If I thought that I could keep up current rates of spam
filtering, I would consider this problem solved. But it
doesn't mean much to be able to filter out most present-day
spam, because spam evolves.
Indeed, most
antispam techniques so far have been like pesticides that
do nothing more than create a new, resistant strain of bugs.

I'm more hopeful about Bayesian filters, because they evolve
with the spam. So as spammers start using "c0ck"
instead of "cock" to evade simple-minded spam filters
based on individual words, Bayesian filters automatically
notice. Indeed, "c0ck" is far more damning evidence than
"cock", and Bayesian filters know precisely how much more.

Still, anyone who proposes a plan for spam filtering has to
be able to answer the question: if the spammers knew
exactly what you were doing,
how well could they get past you? For example, I think that if
checksum-based spam filtering becomes a serious obstacle,
the spammers will just
switch to mad-lib techniques for generating message bodies.

To beat Bayesian filters, it would not be enough for spammers
to make their emails unique or to stop using individual
naughty words. They'd have to make their mails indistinguishable
from your ordinary mail. And this I think would severely
constrain them. Spam is mostly sales
pitches, so unless your regular mail is all sales pitches,
spams will inevitably have a different character. And
the spammers would also, of course, have to change (and keep
changing) their whole infrastructure, because otherwise
the headers would look as bad to the Bayesian filters as ever,
no matter what they did to the message body. I don't know
enough about the infrastructure that spammers use to know
how hard it would be to make the headers look innocent, but
my guess is that it would be even harder than making the
message look innocent.

Assuming they could solve the problem of the headers,
the spam of the future will probably look something like
this:
Hey there. Thought you should check out the following:
http://www.27meg.com/foo
because that is about as much sales pitch as content-based
filtering will leave the spammer room to make. (Indeed, it
will be hard even to get this past filters, because if everything
else in the email is neutral, the spam probability will hinge on
the url, and it will take some effort to make that look neutral.)

Spammers range from businesses running so-called
opt-in lists who don't even try to conceal their identities,
to guys who hijack mail servers to send out spams promoting
porn sites. If we use filtering to whittle their
options down to mails like the one above, that should
pretty much put the spammers on the "legitimate" end of
the spectrum out of business; they feel obliged
by various state laws to include boilerplate about why
their spam is not spam, and how to cancel your
"subscription," and that kind of text is easy to
recognize.

(I used to think it was naive to believe that stricter laws
would decrease spam. Now I think that while stricter laws
may not decrease the amount of spam that spammers send,
they can certainly help filters to decrease the amount of
spam that recipients actually see.)

All along the spectrum, if you restrict the sales pitches spammers
can make, you will inevitably tend to put them out of
business. That word business is an important one to
remember. The spammers are businessmen. They send spam because
it works. It works because although the response rate
is abominably low (at best 15 per million, vs 3000 per
million for a catalog mailing), the cost, to them, is
practically nothing. The cost is enormous for the recipients,
about 5 man-weeks for each million recipients who spend
a second to delete the spam, but the spammer
doesn't have to pay that.

Sending spam does cost the spammer something, though. [2]
So the lower we can get the
response rate-- whether by filtering, or by using filters to force
spammers to dilute their pitches-- the fewer businesses will find it
worth their while to send spam.

The reason the spammers use the kinds of
sales
pitches that they do is to increase response rates.
This is possibly even more disgusting
than getting inside the mind of a spammer,
but let's take a quick look inside the mind of someone
who responds to a spam. This person is either
astonishingly credulous or deeply in denial about their
sexual interests. In either case, repulsive or
idiotic as the spam seems to us, it is exciting
to them. The spammers wouldn't say these things if they
didn't sound exciting. And "thought you
should check out the following" is just not going to
have nearly the pull with the spam recipient as
the kinds of things that spammers say now.
Result: if it can't contain exciting sales pitches,
spam becomes less effective as a marketing vehicle,
and fewer businesses want to use it.

That is the big win in the end. I started writing spam
filtering software because I didn't want to have to look at
the stuff anymore.
But if we get good enough at filtering
out spam, it will stop working, and the spammers
will actually stop sending it.

_ _ _

Of all the approaches to fighting spam, from software to laws,
I believe Bayesian filtering will be the single most
effective. But I also
think that the more different kinds of antispam efforts
we undertake, the better, because any measure that
constrains spammers will tend to make filtering easier.
And even within the world of content-based filtering, I think
it will be a good thing if there are many different kinds
of software being used simultaneously. The more different
filters there are, the harder it will be for
spammers to tune spams to get through them.

Appendix: Examples of Filtering

Here is an example of a spam that arrived while I was writing
this article. The fifteen most interesting words in this spam are:
qvp0045
indira
mx-05
intimail
$7500
freeyankeedom
cdo
bluefoxmedia
jpg
unsecured
platinum
3d0
qves
7c5
7c266675
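Picking the "most interesting" tokens can be sketched like this (a Python sketch; unseen tokens get the .4 described above):

```python
def most_interesting(probs_by_token, tokens, n=15):
    # Rank tokens by distance from a neutral 0.5;
    # tokens we've never seen get 0.4.
    ps = [probs_by_token.get(t, 0.4) for t in set(tokens)]
    ps.sort(key=lambda p: abs(p - 0.5), reverse=True)
    return ps[:n]
```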
The words are a mix of stuff from the headers and from the
message body, which is typical of spam. Also typical of spam
is that every one of these words has a spam probability,
in my database, of .99. In fact there are more than fifteen words
with probabilities of .99, and these are just the first
fifteen seen.

Unfortunately that makes this email a boring example of
the use of Bayes' Rule. To see an interesting variety of
probabilities we have to look at this actually quite
atypical spam.

The fifteen most interesting words in this spam, with their probabilities,
are:
madam 0.99
promotion 0.99
republic 0.99
shortest 0.047225013
mandatory 0.047225013
standardization 0.07347802
sorry 0.08221981
supported 0.09019077
people's 0.09019077
enter 0.9075001
quality 0.8921298
organization 0.12454646
investment 0.8568143
very 0.14758544
valuable 0.82347786
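As a quick Python check, feeding these fifteen numbers through the combination formula given earlier:

```python
from math import prod

probs = [0.99, 0.99, 0.99, 0.047225013, 0.047225013,
         0.07347802, 0.08221981, 0.09019077, 0.09019077,
         0.9075001, 0.8921298, 0.12454646, 0.8568143,
         0.14758544, 0.82347786]
p = prod(probs)
q = prod(1 - x for x in probs)
print(p / (p + q))  # roughly .9027, as reported below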
This time the evidence is a mix of good and bad. A word like
"shortest" is almost as much evidence for innocence as a
word like "madam" or "promotion" is for guilt. But still the
case for guilt is stronger. If you combine these numbers
according to Bayes' Rule, the resulting probability is .9027.

"Madam" is obviously from spams beginning
"Dear Sir or Madam." They're not very common, but the
word "madam" never occurs in my legitimate email, and
it's all about the ratio.

"Republic" scores high because
it often shows up in Nigerian scam emails, and also occurs once
or twice in spams referring to Korea and South Africa.
You might say that it's
an accident that it thus helps identify this spam. But I've
found when examining spam probabilities that there are
a lot of these accidents, and they have an uncanny tendency to
push things in the right direction rather than the wrong one.
In this case, it is not entirely a coincidence that the word
"Republic" occurs in Nigerian scam emails and this spam.
There is a whole class of dubious business propositions involving
less developed countries, and these in turn are more likely
to have names that specify explicitly (because they aren't) that they
are republics. [3]

On the other hand, "enter" is a genuine miss. It occurs
mostly in unsubscribe instructions, but here is used in a
completely innocent way. Fortunately the statistical approach is
fairly robust, and can tolerate quite a lot of misses
before the results start to be thrown off.

For comparison,
here is an example of that rare bird, a spam that
gets through the filters. Why? Because by sheer chance it happens
to be loaded with words that occur in my actual email:
perl 0.01
python 0.01
tcl 0.01
scripting 0.01
morris 0.01
graham 0.01491078
guarantee 0.9762507
cgi 0.9734398
paul 0.027040077
quite 0.030676773
pop3 0.042199217
various 0.06080265
prices 0.9359873
managed 0.06451222
difficult 0.071706355
There are a couple pieces of good news here. First, this mail
probably wouldn't get through the filters of someone who didn't
happen to specialize in programming languages and have a good
friend called Morris. For the average user, all the top five words here
would be neutral and would not contribute to the spam probability.

Second, I think filtering based on word pairs
(see below) might well
catch this one: "cost effective", "setup fee", "money back" -- pretty
incriminating stuff. And of course if they continued to spam me
(or a network I was part of), "Hostex" itself would be
recognized as a spam term.

Finally, here is an innocent email.
Its fifteen most interesting words are as follows:
continuation 0.01
describe 0.01
continuations 0.01
example 0.033600237
programming 0.05214485
i'm 0.055427782
examples 0.07972858
color 0.9189189
localhost 0.09883721
hi 0.116539136
california 0.84421706
same 0.15981844
spot 0.1654587
us-ascii 0.16804294
what 0.19212411
Most of the words here indicate the mail is an innocent one.
There are two bad smelling words, "color"
(spammers love colored fonts) and "California"
(which occurs in testimonials and also in menus in
forms), but they are not enough to outweigh obviously
innocent words like "continuation" and "example".

It's interesting that "describe" rates as so thoroughly
innocent. It hasn't occurred in a
single one of my 4000 spams. The data turns out to be
full of such surprises. One of the things you learn
when you analyze spam texts is how
narrow a subset of the language spammers operate in. It's
that fact, together with the equally characteristic vocabulary
of any individual user's mail, that makes Bayesian filtering
a good bet.

Appendix: More Ideas

One idea that I haven't tried yet is to filter based on
word pairs, or even triples, rather than individual words.
This should yield a much sharper estimate of the probability.
For example, in my current database, the word "offers"
has a probability of .96. If you based the probabilities
on word pairs, you'd end up with "special offers"
and "valuable offers" having probabilities of .99
and, say, "approach offers" (as in "this approach offers")
having a probability of .1 or less.

The reason I haven't done this is that filtering based on
individual words already works so well. But it does
mean that there is room to tighten the filters if spam
gets harder to detect.
(Curiously, a filter based on word pairs would be
in effect a Markov-chaining text generator running
in reverse.)

Specific spam features (e.g. not seeing the recipient's
address in the to: field) do of course have value in
recognizing spam. They can be considered in this
algorithm by treating them as virtual words. I'll probably
do this in future versions, at least for a handful of the
most egregious spam indicators. Feature-recognizing
spam filters are right in many details; what they lack
is an overall discipline for combining evidence.

Recognizing nonspam features may be more important than
recognizing spam features. False positives are such a
worry that they demand extraordinary measures. I will
probably in future versions add a second level of testing
designed specifically to avoid false positives. If a
mail triggers this second level of filters it will be accepted
even if its spam probability is above the threshold.

I don't expect this second level of filtering to be Bayesian.
It will inevitably
be not only ad hoc, but based on guesses, because the number of
false positives will not tend to be large enough to notice patterns.
(It is just as well, anyway, if a backup system doesn't rely on the same
technology as the primary system.)

Another thing I may try in the future is to focus extra attention
on specific parts of the email. For example, about 95% of current
spam includes the url of a site they want
you to visit. (The remaining 5% want you to call a phone number,
reply by email or to a US mail address, or in a few
cases to buy a certain stock.) The url is in such cases
practically enough by itself to determine whether the email
is spam.

Domain names differ from the rest of the text in
a (non-German) email in that they often consist of several
words stuck together. Though computationally expensive
in the general case, it might be worth trying to
decompose them. If a filter has never seen the
token "xxxporn" before it will have an individual spam
probability of .4, whereas "xxx" and "porn" individually
have probabilities (in my corpus) of .9889 and .99
respectively, and a combined probability of .9998.

I expect decomposing domain names to become more
important as spammers are gradually forced to stop using
incriminating words in the text of their messages. (A url
with an ip address is of course an extremely incriminating sign,
except in the mail of a few sysadmins.)

It might be a good idea to have a cooperatively maintained
list of urls promoted by spammers. We'd need a trust metric
of the type studied by <NAME> to prevent malicious
or incompetent submissions, but if we had such a thing it
would provide a boost to any filtering software. It would
also be a convenient basis for boycotts.

Another way to test dubious urls would be to send out a
crawler to look at the site before the user looked at the
email mentioning it. You could use a Bayesian filter to
rate the site just as you would an email, and whatever
was found on the site could be included in calculating
the probability of the email being a spam. A url that led
to a redirect would of course be especially suspicious.

One cooperative project that I think really would be a good
idea would be to accumulate a giant corpus of spam. A large,
clean corpus is the key to making Bayesian filtering work
well. Bayesian filters could actually use the corpus as
input. But such a corpus would be useful for other kinds
of filters too, because it could be used to test them.

Creating such a corpus poses some technical problems. We'd
need trust metrics to prevent malicious or incompetent
submissions, of course. We'd also need ways of erasing
personal information (not just to-addresses and ccs, but
also e.g. the arguments to unsubscribe urls, which often
encode the to-address) from mails in the corpus. If anyone
wants to take on this project, it would be a good thing for
the world.

Appendix: Defining Spam

I think there is a rough
consensus on what spam is, but it would be useful to have
an explicit definition. We'll need to do this if we want to establish
a central corpus of spam, or even to compare spam filtering
rates meaningfully.

To start with, spam is not unsolicited commercial email.
If someone in my neighborhood heard that I was looking for an old
Raleigh three-speed in good condition, and sent me an email
offering to sell me one, I'd be delighted, and yet this
email would be both commercial and unsolicited. The
defining feature of spam (in fact, its raison d'etre)
is not that it is unsolicited, but that it is automated.

It is merely incidental, too, that spam is usually commercial.
If someone started sending mass email to support some political
cause, for example, it would be just as much spam as email
promoting a porn site.

I propose we define spam as unsolicited automated email.
This definition thus includes some email
that many legal definitions of spam don't. Legal definitions
of spam, influenced presumably by lobbyists, tend to exclude
mail sent by companies that have an "existing relationship" with
the recipient. But buying something from a company, for
example, does not imply that you have solicited
ongoing email from them.
If I order something from an online
store, and they then send me a stream of spam, it's still
spam.

Companies sending spam often give you a way to "unsubscribe,"
or ask you to go to their site and change your "account
preferences" if you want to stop getting spam. This is
not enough to stop the mail from being spam. Not opting out
is not the same as opting in. Unless the
recipient explicitly checked a clearly labelled box (whose
default was no) asking to receive the email, then it is spam.

In some business relationships, you do implicitly solicit
certain kinds of mail. When you order online, I think you
implicitly solicit a receipt, and notification when the
order ships.
I don't mind when Verisign sends me mail warning that
a domain name is about to expire (at least, if they are the
actual
registrar for it). But when Verisign sends me
email offering a FREE Guide to Building My
E-Commerce Web Site, that's spam.

Notes:

[1] The examples in this article are translated
into Common Lisp for, believe it or not, greater accessibility.
The application described here is one that we wrote in order to
test a new Lisp dialect called Arc that is
not yet released.

[2] Currently the lowest rate seems to be about $200 to send a million spams.
That's very cheap, 1/50th of a cent per spam.
But filtering out 95%
of spam, for example, would increase the spammers' cost to reach
a given audience by a factor of 20. Few can have
margins big enough to absorb that.

[3] As a rule of thumb, the more qualifiers there are before the
name of a country, the more corrupt the rulers. A
country called The Socialist People's Democratic Republic
of X is probably the last place in the world you'd want to live.
Thanks to <NAME> for reading drafts of this; <NAME> (who is
also writing the production Arc interpreter) for several good ideas about
filtering and for creating our mail infrastructure; <NAME>,
<NAME> and <NAME> for many discussions about spam; Raph
Levien for advice about trust metrics; and <NAME>
and <NAME> for advice about statistics.
You'll find this essay and 14 others in
Hackers & Painters.
More Info:

- Plan for Spam FAQ
- Better Bayesian Filtering
- Filters that Fight Back
- Will Filters Kill Spam?
- Japanese Translation
- Spanish Translation
- Chinese Translation
- Probability
- Spam is Different
- Filters vs. Blacklists
- Trust Metrics
- Filtering Research
- Microsoft Patent
- Slashdot Article
- The Wrong Way
- LWN: Filter Comparison
- CRM114 gets 99.87%
https://github.com/td-org-uit-no/assignment-template-typst | https://raw.githubusercontent.com/td-org-uit-no/assignment-template-typst/main/template.typ | typst | // This function gets your whole document as its `body`.
#let template(
// The Assignment's title.
title: [Assignment Title],
// An array of authors. For each author you can specify a name,
// department, organization, location, and email. Everything but
// but the name is optional.
authors: (),
// The assignment's running header
header: [Assignment Header],
// The assignment's abstract. Can be omitted if you don't have one.
abstract: none,
// A list of index terms to display after the abstract.
index-terms: (),
// The article's paper size. Also affects the margins.
paper-size: "a4",
// The path to a bibliography file if you want to cite some external
// works.
bibliography-file: none,
// The assignments's content.
body,
) = {
// Set document metadata.
set document(title: title, author: authors.map(author => author.name))
// Set the body font.
set text(font: "STIX Two Text", size: 10pt)
// Configure the page.
set page(
paper: paper-size,
// The margins depend on the paper size.
margin: if paper-size == "a4" {
(x: 41.5pt, top: 80.51pt, bottom: 89.51pt)
} else {
(x: (50pt / 216mm) * 100%, top: (55pt / 279mm) * 100%, bottom: (64pt / 279mm) * 100%)
},
header: header,
numbering: "1/1",
)
// Configure equation numbering and spacing.
set math.equation(numbering: "(1)", supplement: [])
show math.equation: set block(spacing: 0.65em)
// Configure figures and tables.
set figure(supplement: [])
show figure: it => {
set text(8pt)
set align(center)
if it.kind == image [
#box[
#it.body
#v(10pt, weak: true)
Fig#it.caption
]
] else if it.kind == table [
#box[
Table#it.caption
#v(10pt, weak: true)
#it.body
]
] else [
...
]
}
// Configure appearance of equation references
show ref: it => {
if it.element != none and it.element.func() == math.equation {
// Override equation references.
link(
it.element.location(),
numbering(
it.element.numbering,
..counter(math.equation).at(it.element.location()),
),
)
} else {
// Other references as usual.
it
}
}
// Configure Tables
set table(stroke: 0.5pt)
show table: set text(8pt)
// Configure lists.
set enum(indent: 10pt, body-indent: 9pt)
set list(indent: 10pt, body-indent: 9pt)
// Configure headings.
set heading(numbering: "I.A.1.")
show heading: it => locate(loc => {
// Find out the final number of the heading counter.
let levels = counter(heading).at(loc)
let deepest = if levels != () {
levels.last()
} else {
1
}
set text(10pt, weight: 400)
if it.level == 1 [
// First-level headings are centered smallcaps.
    // We don't want to number the acknowledgment section.
#let is-ack = it.body in ([Acknowledgment], [Acknowledgement])
#set align(center)
#set text(if is-ack {
10pt
} else {
12pt
})
#show: smallcaps
#v(20pt, weak: true)
#if it.numbering != none and not is-ack {
numbering("I.", deepest)
h(7pt, weak: true)
}
#it.body
#v(13.75pt, weak: true)
] else if it.level == 2 [
// Second-level headings are run-ins.
#set par(first-line-indent: 0pt)
#set text(style: "italic")
#v(10pt, weak: true)
#if it.numbering != none {
numbering("A.", deepest)
h(7pt, weak: true)
}
#it.body
#v(10pt, weak: true)
] else [
// Third level headings are run-ins too, but different.
#if it.level == 3 {
numbering("1)", deepest)
[ ]
}
_#(it.body):_
]
})
// Display the assignments's title.
v(3pt, weak: true)
align(center, text(22pt, title))
v(8.35mm, weak: true)
// Display the authors list.
for i in range(calc.ceil(authors.len() / 4)) {
let end = calc.min((i + 1) * 4, authors.len())
let is-last = authors.len() == end
let slice = authors.slice(i * 4, end)
grid(columns: slice.len() * (
1fr,
), gutter: 12pt, ..slice.map(author => align(
center,
{
text(12pt, author.name)
if "department" in author [
\ #emph(author.department)
]
if "organization" in author [
\ #emph(author.organization)
]
if "location" in author [
\ #author.location
]
if "email" in author [
\ #link("mailto:" + author.email)
]
if "git" in author [
\ #author.git
]
},
)))
if not is-last {
v(16pt, weak: true)
}
}
v(40pt, weak: true)
// Start two column mode and configure paragraph properties.
show: columns.with(2, gutter: 12pt)
set par(justify: true, first-line-indent: 1em)
show par: set block(spacing: 0.65em)
// Display abstract and index terms.
if abstract != none [
// #set par(leading: 0.5em)
#set text(weight: 700, size: 9pt)
#h(1em) _Abstract_---#abstract
#if index-terms != () [
#v(9pt)
#set text(weight: 700, size: 9pt, style: "italic")
#h(1em)_Index terms_---#index-terms.join(", ")
]
#v(2pt)
]
// Display the assignment's contents.
body
// Display bibliography.
if bibliography-file != none {
show bibliography: set text(8pt)
bibliography(
bibliography-file,
title: text(10pt)[References],
style: "ieee",
)
}
}
#let setup-authors(authors) = {
pad(
top: 0.5em,
bottom: 0.5em,
x: 2em,
grid(
columns: (1fr,) * calc.min(3, authors.len()),
gutter: 1em,
..authors.map(author => align(center)[
*#author.name* \
#author.email
]),
),
)
}
#import "config.typ": *
#show: dtt.with(title: "Dependent Theory of Types")
#setup-authors(((
name: "<NAME>",
email: "<EMAIL>"
),))
#set heading(numbering: "1.")
#outline(depth: 2, indent: auto)
#import "@preview/fletcher:0.4.3" as fletcher: diagram, node, edge
#let cedge(..args) = edge(label-side: center, ..args)
#set quote(block: true)
= Introduction
This is an extremely syntax-minded development of some meta-level dependent type theory,
through which I wish to convey an interesting perspective.
Experienced readers will immediately know what I'm trying to do in this development, but I will not spoil it here.
The type theory I consider here is not designed to be implementable (i.e. have decidable type checking)
or practical, but rather intended to be a reasoning framework about constructions.
I will also try to avoid set theoretic terminologies as much as possible,
and restrict the prerequisites to only mathematical maturity.
In the whole development, I will assume nameless representation of variables, and treat them informally as if they are named.
For readers who are unfamiliar with logic, here are two notions that will be used frequently:
#definition("Derivable")[
A _derivable_ judgment in a type theory is a judgment one may derive using the typing rules.
]
#definition("Admissible")[
An _admissible_ rule in a type theory is a rule that can be proved at the meta level by doing case analysis on the premises.
]
The usual definition of admissibility is that it does not add new theorems to the theory,
but I personally find it too much of a characterization, and is far from something we can easily verify.
In practice, it's most likely the case that admissible rules are proved by case analysis on the premises,
which is usually clearer how to do.
= Substitution Calculus <sec:subst-calculus>
The goal of this chapter is to define a _substitution calculus_, which is a dependent type theory with a well-behaved substitution operation.
== Judgments
// CwF
#definition("Judgment schema")[
We assume the following judgment schemas of type theories:
+ $Γ ⊢$ means $Γ$ is a well-formed context.
+ $Γ ⊢ A$ means $A$ is a well-typed type in context $Γ$.
+ $Γ ⊢ A ≡ B$ means $A$ and $B$ are equal types in $Γ$.
+ $Γ ⊢ a : A$ means $a$ is a well-typed term of type $A$ in $Γ$.
+ $Γ ⊢ a ≡ b : A$ means $a$ and $b$ are equal terms of type $A$ in $Γ$.
+ $Γ ⊢ σ : Δ$ means $σ$ is a substitution object from $Γ$ to $Δ$.
+ $Γ ⊢ σ ≡ γ : Δ$ means $σ$ and $γ$ are equal substitution objects from $Γ$ to $Δ$.
]
The equality relation imposed by the judgments are called _judgmental equality_, which is the meta-level equality we will be working with throughout the development.
In fact, we don't necessarily need $Γ ⊢ A$, $Γ ⊢ a : A$, and $Γ ⊢ σ : Δ$, as they can be seen as the reflexive cases of the equality judgments, but we keep them regardless for better readability.
Some notational conventions:
+ For empty contexts and substitutions, we overload the symbol $·$ to represent both of them, usually wrapped in parentheses.
+ When a part of a judgment is clear from the context and writing it down will significantly distract the reader, we omit it. For instance, when talking about the equality between some terms, we may omit the context and the type.
// A <NAME> thing
#definition("Presuppositions")[
The judgments come with _presuppositions_ that are always assumed:
+ $Γ ⊢ A$ presupposes $Γ ⊢$.
+ $Γ ⊢ A ≡ B$ presupposes $Γ ⊢ A$ and $Γ ⊢ B$.
+ $Γ ⊢ a : A$ presupposes $Γ ⊢ A$.
+ $Γ ⊢ a ≡ b : A$ presupposes $Γ ⊢ a : A$ and $Γ ⊢ b : A$.
+ $Γ ⊢ σ : Δ$ presupposes $Γ ⊢$ and $Δ ⊢$.
+ $Γ ⊢ σ ≡ γ : Δ$ presupposes $Γ ⊢ σ : Δ$ and $Γ ⊢ γ : Δ$.
When we write down a rule that derives a judgment, we implicitly assume that the presuppositions are in the premises.
] <def:presup>
// For expert readers: unless explicitly stated otherwise, the type theory we consider will be structural type theories without modalities or type universes -- so that all type formers are well-behaved and simple.
#definition[We assume judgmental equality to be _reflexive_:
$ Γ ⊢ A ≡ A #h(2em) Γ ⊢ a ≡ a : A $
] <def:refl:jeq>
#definition[We assume judgmental equality to be _substitutive_.]
This is very hard to spell out formally in a general setting, but it basically means that we can substitute equal terms in any judgment.
We provide two example special cases of this principle to illustrate its meaning:
#corollary[
We assume the equality judgments to be symmetric, and transitive:
$ (Γ ⊢ A ≡ B)/(Γ ⊢ B ≡ A) #h(2em) (Γ ⊢ a ≡ b : A)/(Γ ⊢ b ≡ a : A) \
(Γ ⊢ A ≡ B #h(2em) Γ ⊢ B ≡ C)/(Γ ⊢ A ≡ C) \
(Γ ⊢ a ≡ b : A #h(2em) Γ ⊢ b ≡ c : A)/(Γ ⊢ a ≡ c : A)
$
]
#proof[
- Symmetry: $Γ ⊢ A ≡ B$ so we can replace $B$ with $A$, and the goal becomes $Γ ⊢ A ≡ A$,
which holds by @def:refl:jeq. The one for terms is similar.
- Transitivity: $Γ ⊢ A ≡ B$, so we can replace $B$ with $A$ so the other premise becomes $Γ ⊢ A ≡ C$,
which is equal to the goal.
]
#corollary[
Typing of terms is up to judgmental equality of types:
$ (Γ ⊢ A ≡ B #h(2em) Γ ⊢ a:B)/(Γ ⊢ a:A) $
] <def:typing:jeq>
#proof[$Γ ⊢ A ≡ B$ so we can replace $B$ with $A$ in the premise, which makes it equal to the goal.]
Furthermore, we assume all the congruence rules (i.e. all functions are pure) for the equality judgments, which are omitted everywhere for brevity.
== Contexts and Substitutions
// Context comprehension
#definition("Context")[
A well formed context is inductively generated by the following rules:
$ (·) ⊢ #h(2em) (Γ⊢A)/(Γ,x:A⊢) $
The symbol $·$ denotes an empty context (with parenthesis for clarification purpose),
and $Γ,x:A$ denotes the adding $x:A$ to $Γ$.
]
When the variable in the context is insignificant, we may omit it, and simply write $Γ,A$.
// Base-change functors
#definition("Substitution action")[
For a substitution object $Γ ⊢ σ : Δ$, we define the _action_ of substitution on types and terms as follows:
$ (Δ ⊢ A)/(Γ ⊢ A[σ]) #h(2em) (Δ ⊢ a : A)/(Γ ⊢ a[σ] : A[σ]) \
(Δ ⊢ A ≡ B)/(Γ ⊢ A[σ] ≡ B[σ]) #h(2em) (Δ ⊢ a ≡ b : A)/(Γ ⊢ a[σ] ≡ b[σ] : A[σ]) $
]
In PFPL, $A[σ]$ is denoted $hat(σ)(A)$.
Note that the exact behavior of this operation is not specified yet.
Next, we define _substitution object_, which is a list of "term-variable" pairs ($a"/"x$), meaning _replacing the variable $x$ with the term $a$_. A list of such pairs looks like $(a"/"x, b"/"y, c"/"z,...)$, meaning that we intend to perform all of these substitutions.
#definition("Substitution object")[
A substitution object is inductively generated by the following rules:
+ $ Γ ⊢ (·) : (·) $
Similar to contexts, $·$ denotes an empty substitution object, and the type of an empty substitution object is the empty context.
+ $ (Γ⊢σ:Δ #h(2em) Γ⊢A \ Γ⊢a:A[σ])/(Γ⊢(σ,a "/" x):(Δ,x:A)) $
When the $x$ in $(σ,a "/" x)$ is clear from the context, we may omit it, and simply write $(σ,a)$.
]
So, contexts are telescopic lists of types, and substitutions are telescopic lists of terms which can be used to substitute variables.
For equality of substitutions, we intend to equate them according to their actions. In other words, two substitutions are equal if they act the same way on types and terms.
#definition("Substitution extensionality")[
If for every $Δ ⊢ A$, $Γ ⊢ A[σ] ≡ A[γ]$, and for every $Δ ⊢ a : A$, $Γ ⊢ a[σ] ≡ a[γ] : A[σ]$, then:
$ Γ ⊢ σ ≡ γ : Δ $
] <def:subst:ext>
We assume substitution to have some commonly expected properties,
which includes having an identity and compose associatively.
But in order to define those, we need to define variables introduction.
#definition("Containment")[
We define the judgment $x:A ∈_n Γ$ to mean that $x$ is the $n$-th variable in the context $Γ$,
counting from the left, and is of type $A$. The judgment is generated by the following rules:
+ $x:A ∈_n (Γ,x:A)$ for the length of $Γ$ being $n$,
+ $ (x:A ∈_n Γ)/(x:A ∈_n (Γ,y:B)) $
Extending the context does not change the level.
]
#example[
+ $x:A ∈_0 (x:A)$
+ $x:A ∈_0 (x:A,y:B)$
+ $y:B ∈_1 (x:A,y:B)$
]
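To make the containment judgment concrete, here is a small sketch in Lean 4. The names `Ctx` and `In` are ours, and we flatten types to an arbitrary `Ty` instead of modeling dependency:

```lean
-- Contexts as snoc-lists of types; `In A n Γ` models `x : A ∈ₙ Γ`,
-- where n counts from the left (a de Bruijn level).
inductive Ctx (Ty : Type) where
  | nil : Ctx Ty
  | ext : Ctx Ty → Ty → Ctx Ty

def Ctx.len {Ty : Type} : Ctx Ty → Nat
  | .nil => 0
  | .ext Γ _ => Γ.len + 1

inductive In {Ty : Type} (A : Ty) : Nat → Ctx Ty → Prop where
  | here  {Γ : Ctx Ty} : In A Γ.len (Γ.ext A)
  | there {Γ : Ctx Ty} {B : Ty} {n : Nat} : In A n Γ → In A n (Γ.ext B)

-- The three examples from the text:
example {Ty : Type} (A : Ty) : In A 0 (Ctx.nil.ext A) := .here
example {Ty : Type} (A B : Ty) : In A 0 ((Ctx.nil.ext A).ext B) := .there .here
example {Ty : Type} (A B : Ty) : In B 1 ((Ctx.nil.ext A).ext B) := .here
```

Note how `here` records the length of the remaining context as the level, so extending the context (`there`) indeed does not change it.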
For readers familiar with implementation of type theories,
this is the same as de Bruijn levels, i.e. de Bruijn indices counted from the other end of the context.
We are using a locally nameless approach à la McBride, which uses levels for variables from the context.
#definition("Variable")[
We assume the following rule:
$ (x:A ∈_n Γ)/(Γ ⊢ x:A) $
such that $x[σ]$ is defined as the $n$-th term in the substitution object $σ$.
]
// Identity morphism
#construction("Identity substitution")[
For any context $Γ$, we define $Γ ⊢ id_Γ : Γ$ to be the following substitution object by induction on $Γ$:
+ $Γ = (·)$, then $id_((·)) := (·)$.
+ $Γ = Γ',x:A$, then $id_(Γ',x:A) := (id_(Γ'),x)$, where $Γ ⊢ x:A$.
] <cons:id:subst>
#lemma[Identity substitution actions are identity functions:
$ Γ ⊢ A[id_Γ] ≡ A #h(2em) Γ ⊢ a[id_Γ] ≡ a : A $]
// Composition of morphisms
#construction("Composition of substitutions")[
For any substitution objects $Γ ⊢ σ : Δ$ and $Δ ⊢ γ : Θ$, we denote $Γ ⊢ (γ∘σ) : Θ$ to be the substitution object formed by induction on $γ$:
+ $γ = (·)$, which implies $Θ = (·)$, we define $(·∘σ) = σ$.
+ $γ = (γ',a)$, which implies $Θ = (Θ',x:A)$, we define $((γ',a)∘σ) = ((γ'∘σ),a[σ])$.
]
#lemma[Composition of substitutions is associative: $ (γ∘σ)∘ρ ≡ γ∘(σ∘ρ) $]
#lemma[Composition of substitutions is unital: $ (id∘σ) ≡ (σ∘id) ≡ σ $]
#lemma[Composition of substitutions commutes with substitution action: $ A[γ∘σ] ≡ A[σ][γ] #h(2em) a[γ∘σ] ≡ a[σ][γ] $]
Note that the order of composition of substitutions is reversed when applying them as actions.
#definition("Context isomorphism")[
A substitution $Γ ⊢ σ : Δ$ is called a _context isomorphism_ if there exists $Δ ⊢ γ : Γ$ such that $σ∘γ ≡ id_Δ$ and $γ∘σ ≡ id_Γ$.
We denote isomorphic contexts as $Γ ≃ Δ$.
]
#lemma[Composition of context isomorphisms will also be context isomorphisms.]
#proof[By composing their inverse to get the inverse of the composite.]
// Display maps
#construction("Projection")[For any type $Γ⊢A$,
we define $Γ,x:A ⊢π_A : Γ$ to be the identity substitution object $id_Γ$ weakened by $A$.]
An alternative way to think about $π_A$ is that it is the substitution
object that deletes the last variable from the context, and acts as the identity substitution otherwise.
== Structural properties
Since we already have weakening, we further require that the weakening substitution is an inclusion.
#definition("Weakening")[
The substitution action induced by any projection is an identity:
$ (Γ⊢A #h(2em) Γ⊢B)/(Γ,x:A⊢B[π_A] ≡ B) #h(2em)
(Γ⊢b:B)/(Γ,x:A⊢b[π_A]≡b : B)
$
]
#lesson("Pain")[If weakening is an inclusion, the variable rule becomes very easy to write down,
we can simply say: $ Γ ⊢ x : A $
if $A$ is not at the end of the context and weakening is not an inclusion,
we would have to write: $ Γ ⊢ x : A[π_B_1][π_B_2]... $
where $π_B_1,π_B_2,...$ are the projections that delete the types before $A$.]
In our case, since weakening substitutions behave like inclusions, we can omit all of them.
#theorem("Exchange")[
For types $Γ⊢A$ and $Γ⊢B$, we have the following context isomorphism:
$ Γ,A,B ≃ Γ,B,A $
]
#proof[$Γ,x:A,y:B ⊢ (id_Γ,y,x) : (Γ,B,A)$.]
#lesson("Tears")[If weakening is not an inclusion, the above will be very painful to write down!
For instance, the context expression
$ Γ,x:A,y:B $
does not make sense,
because to extend the context $Γ,A$ with $B$, we need $Γ,A ⊢ B$, as opposed to what we have, which is $Γ ⊢ B$.
So, we need to apply a weakening to get $Γ,A ⊢ B[π_A]$,
and the context is actually $Γ,A,B[π_A]$, and we need to construct the $σ$ in
$ Γ,A,B[π_A] ⊢ σ : Γ,B,A[π_B] $
We begin with weakening $id_Γ$ into the context, and it's
$ Γ,x:A,y:B[π_A] ⊢ id_Γ [π_A][π_B[π_A]] : Γ $
and we need to append $x$ and $y$ to the end of it, which is even worse.]
== Conclusion
In this chapter, we have postulated the basic structures needed for a well-behaved _substitution calculus_, aka a _dependent type theory_,
which will be used as the foundational framework for the rest of the development.
Importantly, we have shown weakening to be an inclusion.
As a side remark, an alternative to presuppositions @def:presup is to use rules like these:
$ (Γ ⊢ a : A)/(Γ ⊢ A) $
It is up to preference and formalism to choose between the two styles.
We use presuppositions to avoid having to prove rules like the above.
= Structures I <sec:strut-1>
The goal of this chapter is to define some simple structures inside dependent type theories.
We assume readers to know some less formal terminologies, such as introduction rules, elimination rules, term formers, $β$-rules, $η$-laws, etc., which are common in type theory literature.
== Nullary connectives
// Terminal object
#definition("Unit")[
We say a type theory has a _unit type_ if it has the following constructions:
+ _Formation_: $ · ⊢ top $
+ _Introduction_: $ · ⊢ ★ : top $
such that the following rules hold:
+ The fact that the formation of unit type is preserved by substitution:
$ Γ ⊢ top[σ] ≡ top $
+ The $η$-law: $ (Γ ⊢ a : top)/(Γ ⊢ a ≡ ★ : top) $
] <def:rule:unit>
#lemma[The introduction of unit type is preserved by substitution:
$ Γ ⊢ ★[σ] ≡ ★ : top $] <lem:subst:unit>
#proof[Because $Γ ⊢ ★[σ] : top$, and by the $η$-law.]
In any type theory, as long as we can assign $top$ and $★$ to an existing construction, we consider this type theory to have unit type.
#example[
The boolean type cannot be used to define a unit type, as it has two distinct terms, so the $η$-law does not hold.
]
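As a sanity check, these rules can be observed in a proof assistant. Here is a sketch assuming Lean 4, whose `Unit` type satisfies the $η$-law definitionally:

```lean
-- Introduction: the canonical inhabitant, playing the role of ★.
example : Unit := Unit.unit

-- η-law: every term of `Unit` is equal to ★; in Lean 4 this is
-- definitional (structure eta), so `rfl` suffices.
example (a : Unit) : a = Unit.unit := rfl
```

Substitution stability is invisible here, since Lean handles substitution at the meta level.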
#lemma[
The projection of a unit type, $Γ,x:top ⊢ π_top : Γ$, is a context isomorphism.]
#proof[The inverse is given by the identity substitution extended with the introduction of $top$: $ Γ ⊢ (id_Γ,★) : (Γ,x:top) $]
In fact, this can be used alternatively to define a unit type.
// Initial object
#definition("Empty")[
We say a type theory has _empty type_ if it has the following constructions:
+ _Formation_: $ Γ ⊢ bot $
+ _Elimination_:
$ (Γ, x:bot ⊢ u: A)/(Γ, x: bot ⊢ elim_bot (x) : A)
$
such that the following rules hold:
+ The fact that empty is preserved by substitution:
$ Γ ⊢ bot[σ] ≡ bot $
+ The $η$-law:
$ (Γ, x:bot ⊢ u: A)/(Γ, x: bot ⊢ u ≡ elim_bot (x) : A)
$
]
The $η$-law of the empty type says _any term_ in a context with a $bot$ in it is equivalent to $elim_bot (x)$.
Similarly we can state a theorem similar to @lem:subst:unit:
#lemma[The elimination of empty type is preserved by substitution:
$ (Δ,x:bot ⊢ a:A #h(2em) Γ ⊢ σ : Δ #h(2em) σ' := (σ,x slash x))/
(Γ,x:bot ⊢ a[σ'] ≡ elim_bot (x) : A[σ']) $] <lem_subst_empty>
#proof[By the typing of the extended substitution object we know $ Γ,x:bot ⊢ σ' : (Δ,x:bot) $
therefore the substitution is well-typed and $ Γ,x:bot ⊢ a[σ'] : A[σ'] $
and by the $η$-law.]
#lemma[Contexts extended by $bot$ are pairwise isomorphic.
In other words, for all $Γ ⊢$ and $Δ ⊢$, we have a context isomorphism between $Γ, x:bot ⊢$ and $Δ, x:bot ⊢$.]
#proof[
The isomorphism $Γ, x:bot ⊢ σ : (Δ,x:bot)$ is given by a list of $elim_bot (x)$,
whose inverse is given alike.
]
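The rules for the empty type also have a familiar counterpart; a sketch assuming Lean 4's `Empty` type:

```lean
-- Elimination: from an inhabitant of the empty type we can produce anything.
example (α : Type) (x : Empty) : α := nomatch x

-- Consequence of the η-law: any two terms agree once ⊥ is in the context.
example (α : Type) (x : Empty) (a b : α) : a = b := nomatch x
```

The second example is the semantic content of the $η$-law above: all terms in a context containing $bot$ are identified.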
Before proceeding further, we briefly describe the intended way to use these definitions.
There might be a type theory that does not directly define a unit type, but as long as it can provide the data and prove the equations needed by @def:rule:unit, we can say it has a unit type, and may start using the rules of unit type in the type theory.
This is a form of _abstraction_, where we care only about how types are intended to be used, not how they are implemented,
and we use the abstracted rules which usually leads to lighter notations, shorter theorems and proofs, more efficient communications, and more general results.
== Product
// Cartesian product
#definition("Product")[
We say a type theory has _product types_ if it has the following constructions:
+ _Formation_:
$ (Γ⊢A #h(2em) Γ⊢B)/(Γ ⊢ A × B) $
+ _Introduction_:
$ (Γ ⊢ a:A #h(2em) Γ ⊢ b:B)/(Γ ⊢ ⟨a, b⟩ : A × B) $
+ _Elimination_:
$ (Γ ⊢ p : A × B)/(Γ ⊢ p.1 : A)
#h(2em)
(Γ ⊢ p : A × B)/(Γ ⊢ p.2 : B)
$
such that the following rules hold:
+ The fact that product is preserved by substitution:
$ Γ ⊢ (A × B)[σ] ≡ A[σ] × B[σ] $
+ The fact that projections are preserved by substitution:
$ Γ ⊢ p.1[σ] ≡ p[σ].1 : A \
Γ ⊢ p.2[σ] ≡ p[σ].2 : B $
+ The $β$-rules:
$ (Γ ⊢ a:A #h(2em) Γ ⊢ b:B)/(Γ ⊢ ⟨a,b⟩.1 ≡ a : A) \
(Γ ⊢ a:A #h(2em) Γ ⊢ b:B)/(Γ ⊢ ⟨a,b⟩.2 ≡ b : B)
$
+ The $η$-law:
$ (Γ ⊢ p : A × B)/(Γ ⊢ p ≡ ⟨p.1, p.2⟩ : A × B)
$
] <def:rule:product>
#lemma("Product extensionality")[
The following rule is derivable:
$ (Γ ⊢ a.1 ≡ b.1 : A #h(2em) Γ ⊢ a.2 ≡ b.2 : B)/
(Γ ⊢ a ≡ b : A × B)
$
] <lem:product:ext>
#proof[By the $η$-law, what we need to show is equivalently $⟨a.1, a.2⟩ ≡ ⟨b.1, b.2⟩$, which follows by congruence of equality.]
// Beck--Chevalley condition
#lemma[The introduction of product type is preserved by substitution:
$ Γ ⊢ ⟨a,b⟩[σ] ≡ ⟨a[σ], b[σ]⟩ : A[σ] × B[σ] $] <lem_subst_product>
#proof[
Let $u := ⟨a,b⟩[σ]$. By @lem:product:ext, the goal is equivalently $u.1 ≡ a[σ]$ and $u.2 ≡ b[σ]$.
Since projection is preserved by substitution, we have $(⟨a,b⟩[σ]).1 ≡ (⟨a,b⟩.1)[σ] ≡ a[σ]$, hence $u.1 ≡ a[σ]$, likewise $u.2 ≡ b[σ]$.
]
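Again as a sanity check, Lean 4's product type satisfies all of the rules above, and in fact definitionally; a sketch:

```lean
-- β-rules: projections of a pair compute definitionally.
example (α β : Type) (a : α) (b : β) : (a, b).1 = a := rfl
example (α β : Type) (a : α) (b : β) : (a, b).2 = b := rfl

-- η-law: a pair is the pair of its projections, also definitionally
-- (structure eta), which also gives product extensionality.
example (α β : Type) (p : α × β) : p = (p.1, p.2) := rfl
```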
== Extensional equality
Before diving into more complicated dependently-typed structures, we first introduce a very simple type -- the extensional equality type.
// Equalizers
#definition("Equality")[
We say a type theory has _extensional equality type_ if it has the following constructions:
+ _Formation_:
$ (Γ ⊢ A #h(2em) Γ ⊢ a:A #h(2em) Γ ⊢ b:A)/
(Γ ⊢ a =_A b) $
+ _Introduction_:
$ Γ ⊢ refl_a : a =_A a $
such that the following rules hold:
+ The fact that equality type is preserved by substitution:
$ Γ ⊢ (a =_A b)[σ] ≡ (a[σ] =_(A[σ]) b[σ]) $
+ The _elimination rule_, also known as _equality reflection_:
$ (Γ ⊢ p : a =_A b)/(Γ ⊢ a ≡ b : A) $
+ The $η$-law:
$ Γ ⊢ (p ≡ refl_a) : (a =_A a) $
]
Before stating any properties of extensional equality, observe that in the $η$-law, we do not have a premise $Γ ⊢ p : a =_A b$.
This is because we have _presuppositions_ (@def:presup), implying that this is already assumed when we _state_ the conclusion.
#lemma("Uniqueness")[The following judgment is _derivable_:
$ (Γ ⊢ p : a =_A b #h(2em) Γ ⊢ q : a =_A b)/
(Γ ⊢ p ≡ q : a =_A b)
$] <lem:refl:uniqueness>
#proof[
By elimination of equality, we know $Γ ⊢ a ≡ b : A$, hence it suffices to show:
$ Γ ⊢ p ≡ q : a =_A a $
By $η$-law, both $p$ and $q$ are equal to $refl_a$.
]
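A word of caution when comparing with proof assistants: Lean's equality type is intensional, so equality reflection is not available there. The uniqueness lemma, however, has a direct counterpart via definitional proof irrelevance:

```lean
-- `a = b` lives in `Prop`, so any two of its proofs are equal,
-- and the equality even holds by `rfl` (definitional proof irrelevance).
example (α : Type) (a b : α) (p q : a = b) : p = q := rfl
```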
#lemma[Having _extensional equality type_ and any closed term $· ⊢ a:A$ implies having a _unit type_.]
#proof[Let $top := (a =_A a)$ and $★ := refl_a$.]
== Conclusion
In this chapter, we have defined some simply-typed structures inside dependent type theories, including unit type, empty type, product type, and extensional equality type.
In the next chapter, we will seek to generalize some of these structures into a more general construction.
= Compilers
The goal of this chapter is to define well-typed translations between type theories, aka compilers.
Before talking about translations between type theories,
we first need to make explicit what data give rise to a type theory, and then we define how to translate between them.
== Conventions
We consider a type theory to be a substitution calculus (@sec:subst-calculus) plus a set of postulated rules,
denoted using bold font, e.g. $bold(A), bold(B)$, or $bold("TT")$.
In the presence of multiple type theories, we write $Γ ⊢^bold(A) ...$ to mean that this judgment happens in type theory $bold(A)$.
Recall that in the last chapter we introduced a couple of _structures_ of type theories, defined to be having some data and the ability to derive some rules.
When postulating rules, we might just say "$bold(A)$ has product type (@def:rule:product)" to say that we are postulating all the rules needed by product type in $bold(A)$.
The following are some example definitions of type theories:
#example[
- The empty type theory $bold(0)$ has the empty set of postulated rules.
- The unit type theory $bold(1)$ has a unit type.
- Alternatively, another unit type theory has a unit type and product types.
] <ex:empty:unit:fp>
#lemma[In the empty type theory, there is only one context -- the empty one.]
#lemma[In the unit type theory, all contexts except the empty one are isomorphic.]
== Translations
#definition("Compiler")[
A _compiler_ from type theory $bold(A)$ to type theory $bold(B)$, denoted $bold(F):bold(A) → bold(B)$, consists of the following data:
+ A context $Δ$ in $bold(B)$, which we map the empty context in $bold(A)$ to,
+ A pair of functions, called _translations_, both denoted $[| - |]_bold(F)$
  (i.e. for input $A$, they produce the output $[| A |]_bold(F)$), which map the types and terms of $bold(A)$ to those of $bold(B)$,
+ In addition to that, we define $[| Γ |]$ to be iteratively translating the types inside $Γ$ and pushing them onto $Δ$ -- the translation of the empty context;
such that:
+ For $Γ ⊢^bold(A) A$, the judgment $[| Γ |]_bold(F) ⊢^bold(B) [| A |]_bold(F)$ must be derivable,
+ For $Γ ⊢^bold(A) t : A$, the judgment $[| Γ |]_bold(F) ⊢^bold(B) [| t |]_bold(F) : [| A |]_bold(F)$ must be derivable,
+ If $Γ ⊢^bold(A) A ≡ B$ is derivable, then $[| Γ |]_bold(F) ⊢^bold(B) [| A |]_bold(F) ≡ [| B |]_bold(F)$ is derivable,
+ If $Γ ⊢^bold(A) t ≡ u : A$ is derivable, then $[| Γ |]_bold(F) ⊢^bold(B) [| t |]_bold(F) ≡ [| u |]_bold(F) : [| A |]_bold(F)$ is derivable.
] <def:compiler>
By default, we assume the empty context is translated into the empty context.
When there is only one compiler in the context, we might omit the subscript $bold(F)$.
Observe that presuppositions commute with compilations:
#align(center)[#diagram(cell-size: 15mm, $
Γ ⊢^bold(A) t: A
edge("rr", #[presupposes])
edge("d", [| - |], ->)
&& Γ ⊢^bold(A) A
edge("d", [| - |], ->) \
[| Γ |] ⊢^bold(B) [| t |] : [| A |]
edge("rr", #[presupposes])
&& [| Γ |] ⊢^bold(B) [| A |]
$)]
So, when translating the rules, we do not have to do additional work to ensure that the presuppositions are satisfied.
#example[
For every type theory $bold(A)$, there exists a compiler from $bold(A)$ to the unit type theory $bold(1)$ (@ex:empty:unit:fp), by compiling all types and terms as the unit type and its introduction rule.
]
#example[
For every type theory $bold(A)$, there exists a compiler from the empty type theory $bold(0)$ to $bold(A)$, because there is nothing to compile.
]
Normally, the rules for term and type formers are always postulated, not proved to be admissible,
since we have in mind that typing derivations are in correspondence with proof terms: a proof term is a canonical representation of its _derivation_, so it directly indicates the existence of one.
In the definition of a compiler, we require that the translated judgments are derivable,
not admissible, and the rationale is due to the following construction we wish to be well-defined:
#construction("Composition")[
If $bold(F):bold(A) → bold(B)$ and $bold(G):bold(B) → bold(C)$ are compilers,
then the _composition_ of them, denoted $bold(G) ∘ bold(F):bold(A) → bold(C)$, is a compiler,
defined as follows:
1. For $Γ ⊢^bold(A) A$, define $[| A |]_(bold(G) ∘ bold(F)) = [| [| A |]_bold(F) |]_bold(G)$,
2. For $Γ ⊢^bold(A) t : A$, define $[| t |]_(bold(G) ∘ bold(F)) = [| [| t |]_bold(F) |]_bold(G)$.
The judgmental equalities hold immediately.
]
If we only require the judgmental equalities to be admissible, they wouldn't be further preserved under translation.
#construction("Identity")[
For every type theory $bold(A)$, there exists an _identity compiler_ $id_bold(A) :bold(A) → bold(A)$
such that $[| A |]_id_bold(A) = A$ and $[| t |]_id_bold(A) = t$.]
The keyword $id$ is intentionally not bold, to be consistent with the other identities.
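To make the identity and composition constructions concrete, here is a deliberately coarse sketch in Lean 4, collapsing a compiler to a single translation function and ignoring contexts, judgments, and equations; the names `Compiler`, `trans`, and `idc` are ours:

```lean
structure Compiler (A B : Type) where
  trans : A → B  -- the translation [| - |]

def Compiler.comp {A B C : Type} (G : Compiler B C) (F : Compiler A B) :
    Compiler A C :=
  ⟨fun t => G.trans (F.trans t)⟩  -- translate with F first, then with G

def Compiler.idc (A : Type) : Compiler A A := ⟨fun t => t⟩

-- For this simplification, unitality holds on the nose:
example {A B : Type} (F : Compiler A B) (t : A) :
    ((Compiler.idc B).comp F).trans t = F.trans t := rfl
```

In the real definition, the laws only hold up to the equivalence of compilers developed below, which is why we need a notion of equivalence at all.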
Then, it is tempting to state the unital and associativity laws for compilers,
but to do so we first need a notion of equality between compilers,
which is roughly that they send the same types and terms to the same types and terms.
However, this is not immediately easy, because we care about using types abstractly,
without caring how they are implemented:
for instance, if two compilers translate something using a unit type,
where one uses a distinguished unit type and the other a unit type implemented by some other types,
we still consider them to be the same.
== Equivalences
To start, we need to specify the equivalence between $Γ⊢A$ and $Γ⊢B$,
which we intend to do by defining a type-theoretic bijection between their terms.
#definition("Type isomorphism")[
For types $Γ⊢A$ and $Γ⊢B$, a _type isomorphism_ (or _isomorphism_ for short) between them is a pair of terms $Γ,x:A ⊢ b:B$ and $Γ,x:B ⊢ a:A$ such that the following equations are derivable:
$ Γ,x:A ⊢ x ≡ a[b slash x] : A \
Γ,x:B ⊢ x ≡ b[a slash x] : B $
We denote isomorphic type as a judgment $Γ⊢A ≃ B$.
]
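A simply-typed shadow of this definition can be sketched in Lean 4; the name `Iso` is ours, and the real definition is context-indexed, which we ignore here:

```lean
-- An isomorphism as a pair of maps with both round-trips.
structure Iso (A B : Type) where
  fwd : A → B
  bwd : B → A
  bwd_fwd : ∀ a, bwd (fwd a) = a
  fwd_bwd : ∀ b, fwd (bwd b) = b

-- Isomorphisms compose, mirroring the lemma for context isomorphisms.
def Iso.comp {A B C : Type} (g : Iso B C) (f : Iso A B) : Iso A C where
  fwd := fun a => g.fwd (f.fwd a)
  bwd := fun c => f.bwd (g.bwd c)
  bwd_fwd := fun a => by rw [g.bwd_fwd, f.bwd_fwd]
  fwd_bwd := fun c => by rw [f.fwd_bwd, g.fwd_bwd]
```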
We wish isomorphic types to _behave_ the same in type theory.
Then, we can talk about the equivalence between compilers:
#definition("Equivalent compilers")[
Two compilers $bold(F):bold(A) → bold(B)$ and $bold(G):bold(A) → bold(B)$ are _equivalent_ if for every type $Γ ⊢^bold(A) A$, we have:
+ a context isomorphism $[| Γ |]_bold(F) ⊢ σ : [| Γ |]_bold(G)$,
+ a type isomorphism $[| Γ |]_bold(F) ⊢ [| A |]_bold(F) ≃ [| A |]_bold(G) [σ]$.
We denote equivalent compilers as $bold(F) ≃ bold(G)$.]
#lesson[It is common for a type to have infinitely many instances,
and in that case they are always countably many,
as the terms we can write down are essentially _abstract syntax trees_, and trees are countable.
However, there will still be infinite types that are not isomorphic,
since the definition of type isomorphism is an _internal_ isomorphism,
i.e. the isomorphism needs to be _inside_ the type theory.]
Then, we can define the desired unital and associativity laws for compilers:
#lemma[For compiler $bold(F):bold(A) → bold(B)$, we have:
$ (bold(F) ∘ id_bold(A)) ≃ (id_bold(B) ∘ bold(F)) ≃ bold(F) $]
#lemma[For compilers $bold(F):bold(A) → bold(B)$, $bold(G):bold(B) → bold(C)$, and $bold(H):bold(C) → bold(D)$, we have:
$ bold(H) ∘ (bold(G) ∘ bold(F)) ≃ (bold(H) ∘ bold(G)) ∘ bold(F) $]
We also need to establish the equivalence between type theories,
and to do so we need to consider the following:
+ We wish the equivalence to be weak:
using different implementations of an "abstract" type should not affect the equivalence,
+ If we translate $Γ⊢^bold(A) A$ into $Γ' ⊢^bold(B) A'$, we wish the terms to be translated so that:
+ Different terms get translated into different terms,
+ Every term of $A'$ is the translation of some term of $A$.
Putting all of these conditions together, we can form a sensible notion of equivalence between type theories.
// Essentially surjective
#definition("Surjective")[We say a compiler $bold(F):bold(A) → bold(B)$ to be _surjective_
if for every type $Γ' ⊢^bold(B) B$ there exists a type $Γ ⊢^bold(A) A$ such that:
+ there exists a context isomorphism $Γ' ⊢ σ : [| Γ |]_bold(F)$,
+ $Γ' ⊢ B ≃ [| A |]_bold(F)[σ]$.]
// Fully faithful
#definition("Injective")[Consider a compiler $bold(F):bold(A) → bold(B)$ and a type $Γ ⊢^bold(A) A$.
We say $bold(F)$ to be:
+ _full_ if for every $[| Γ |] ⊢^bold(B) u : [| A |]$,
there exists $Γ ⊢^bold(A) v : A$ such that $[| Γ |] ⊢^bold(B) u ≡ [| v |] : [| A |]$,
+ _faithful_ if $[| Γ |] ⊢^bold(B) [| u |] ≡ [| v |] : [| A |]$ implies $Γ ⊢^bold(A) u ≡ v : A$.
A compiler is _injective_ if it is both full and faithful.]
#definition("Equivalence")[We say a compiler to be an equivalence between type theories if it is surjective and injective.]
== Conclusion
In this chapter, we defined the notion of a compiler between type theories, which is a sensible _structure-preserving_ map between them, as it preserves the derivability of judgments.
Then, we described a couple of properties of compilers, and used them to define equivalences between type theories.
= Structures, revisited
The goal of this chapter is to revisit the structures defined in @sec:strut-1 with an eye toward generalization.
We start the section with a reflection on the definition of "having a unit type" (@def:rule:unit).
For a type theory to have a unit type, the following needs to be true:
For every context $Γ$,
+ there is a type $Γ ⊢ top$,
+ there is a term $Γ ⊢ ★ : top$
such that every term of this type is equal to it,
+ and this whole thing is preserved by substitution.
For product types, we can rephrase its definition in a similar way,
but with the presence of rule premises, they are more complicated:
For every context $Γ$ and types $Γ ⊢ A$ and $Γ ⊢ B$,
+ there is a type $Γ ⊢ A × B$,
+ for every pair of terms $Γ ⊢ t : A$ and $Γ ⊢ u : B$, there is a term $Γ ⊢ ⟨t, u⟩ : A × B$
such that every term of this type can be written as such a pair,
+ and this whole thing is preserved by substitution.
Note that the fact that all terms can be written as such a pair is known as all terms _factor through_ the introduction rule.
Similarly for the empty type, all terms in contexts where $bot$ is present _factor through_ the elimination rule.
There seem to be a lot of things in common:
For every context $Γ$ and types $Γ ⊢ 🤔$,
+ there is a type $Γ ⊢ ✨$,
+ for every tuple of terms $Γ ⊢ ... : 🤔$, there is a term $Γ ⊢ ... : ✨$
such that every term of this type factors through its introduction,
+ and this whole thing is preserved by substitution.
Now, the real question arises: can we generalize this, and how do we do that?
== Raw structures
We start by thinking about products.
Given any $Γ⊢A$ and $Γ⊢B$, we may rephrase the introduction of product $A×B$ as having another type $Γ⊢X$ with two _pseudo-projections_:
$ Γ,x:X ⊢ a: A #h(2em) Γ,x:X ⊢ b: B $
which give us back the original $A$ and $B$.
This motivates the following definition.
// Cones for products
#definition("Raw product")[
Given any $Γ⊢A$ and $Γ⊢B$. A _raw product_ consists of the following data:
+ A type $Γ⊢X$,
+ Two terms $Γ,x:X ⊢ a: A$ and $Γ,x:X ⊢ b: B$.
We denote a raw product as $(X, a, b)$.
]
Then, the product $A×B$ is a type which we can always introduce an instance of from these pieces of information, like this:
$ Γ,x:X ⊢ ⟨a,b⟩ : A×B $
Note that the product $A×B$ can also be used to make a "raw product", namely $(A×B, x.1, x.2)$.
The special feature of this legitimate product is that it has an introduction rule that transforms any raw product into it.
Now, we can redefine the product without assuming its pre-existing rules.
#definition("Product")[
The product $(A×B,x.1,x.2)$ is a raw product such that
for every other raw product $(X,a,b)$, there exists a _unique_ term, called the _constructor_:
$ Γ, x:X ⊢ h : A×B $
such that:
$ Γ, x:X ⊢ a ≡ h.1 : A \
Γ, x:X ⊢ b ≡ h.2 : B
$
where $h.1$ is the result of the substitution $x.1 [h slash x]$, and similarly for $h.2$.
] <def:ct:product>
The idea is that constructing a term of type $A×B$ must go through its introduction rule.
We can depict the conditions in @def:ct:product as a commutative diagram.
In context $Γ$, we have:
#align(center)[#diagram(cell-size: 15mm, $
&X cedge("lb", a, ->)
cedge("rb", b, ->)
cedge("d", h, "-->")
& \
A &A×B cedge("l", .1, ->)
cedge("r", .2, ->)
& B
$)]
This is rather like _characterizing_ the product type, instead of _defining_ it.
Now, it is tempting to define another type in a similar vibe.
We start by trying the unit type.
#definition("Raw unit")[
A _raw unit_ consists of the following data:
+ A type $Γ⊢X$,
+ A term $Γ⊢u: X$.
We denote a raw unit as $(X, u)$.
]
Then $(top, ★)$ is an instance of such a raw unit,
and we can characterize the unit type as follows:
#definition("Unit")[
The unit type is a raw unit such that
for every other raw unit $(X,u)$, there exists a _unique_ term, called the _constructor_:
$ Γ ⊢ h : top $
such that:
$ Γ ⊢ u ≡ h : top $
] <def:ct:unit>
It is clear that this coincides with the original definition of the unit type,
where $h$ is just another name for $★$!
== Limit of Compilers
Now, we further generalize the idea of raw structures.
The data in a raw product in type theory $bold(A)$ can be represented as a _cone_,
which is defined below.
#definition("Schema of a product")[
Consider a dependent type theory $bold(D)$ with the following rules:
$ ·⊢A #h(2em) ·⊢B $
The _schema_ of a product in type theory $bold(A)$ is a compiler $bold(F) : bold(D) → bold(A)$.
] <def:schema:product>
Essentially, a schema _chooses_ two types $Γ⊢[| A |]_bold(F)$ and $Γ⊢[| B |]_bold(F)$
in $bold(A)$ for the base context $Γ=[| · |]_bold(F)$.
#definition("Schema of a unit")[
The _schema_ of a unit in type theory $bold(A)$ is a compiler $bold(F) : bold(0) → bold(A)$,
where $bold(0)$ is the empty type theory.
] <def:schema:unit>
#definition("Cone")[
A _cone_ of a schema $bold(F) : bold(D) → bold(A)$ consists of the following data,
where we denote the base context as $Γ=[| · |]_bold(F)$:
+ A type $Γ⊢X$,
+ for every type $Δ⊢A$ in $bold(D)$,
a substitution $Γ,x:X ⊢ a_A : [| Δ,A |]_bold(F)$,
+ such that the diagram of all $a_A$ and the terms interpreted by $bold(F)$ commutes.
We denote a cone as $Cone(bold(F), Γ⊢X)$, and refer to the diagram mentioned above
as the diagram of this cone.
]
In the above two cases, $Δ$ is always $·$, so the substitution $a_A$ is really just a term.
A _cone_ of the schema in @def:schema:product corresponds to the following diagram:
#align(center)[#diagram(cell-size: 15mm, $
[| A |] &X cedge("l", x.1, ->)
cedge("r", x.2, ->)
& [| B |]
$)]
Since there are no directed paths that share the same source and target,
the diagram always commutes.
Usually, there will be some term in the image of $bold(F)$,
and in those cases, we will have a nontrivial commutative diagram.
A cone of the schema in @def:schema:unit is just a type $Γ⊢X$.
With the notion of cones, we can define the notion of _limits_,
which should coincide with the original definition of the types (in our case, products and the unit type):
#definition("Limit")[
The _limit_ of the cones of a schema $bold(F) : bold(D) → bold(A)$
is a cone $Cone(bold(F), Γ⊢X)$ such that for every other cone
from the same context $Cone(bold(F), Γ⊢A)$, there is a unique term:
$ Γ,x:A ⊢ h: X $
such that the diagram of both cones and $h$ commutes.
] <def:limit>
Now, let us check that a product is the limit of the schema in @def:schema:product.
A cone for the schema consists of the following data:
+ A type $Γ⊢X$ (where $Γ$ is the base context),
+ For $Γ⊢A$, a term $Γ,x:X ⊢ a_A : A$ (we write $·,A$ as $A$ for simplicity),
+ For $Γ⊢B$, a term $Γ,x:X ⊢ b_B : B$.
The limit of these cones matches precisely with @def:ct:product.
Similarly, a unit type is the limit of the schema in @def:schema:unit,
and we leave the construction as an exercise.
Here is another fun fact: the extensional equality is also a limit!
Let's look at the following schema:
TODO
|
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/ops-invalid-10.typ | typst | Other | // Error: 3-15 cannot divide by zero
#(15deg / 0deg)
|
https://github.com/sitandr/typst-examples-book | https://raw.githubusercontent.com/sitandr/typst-examples-book/main/src/snippets/labels.md | markdown | MIT License | # Labels
## Get chapter of label
```typ
#let ref-heading(label) = context {
let elems = query(label)
if elems.len() != 1 {
panic("found multiple elements")
}
let element = elems.first()
if element.func() != heading {
panic("label must target heading")
}
link(label, element.body)
}
= Design <design>
#lorem(20)
= Implementation
In #ref-heading(<design>), we discussed...
```
## Allow missing references
```typ
// author: Enivex
#set heading(numbering: "1.")
#let myref(label) = context {
if query(label).len() != 0 {
ref(label)
} else {
// missing reference
text(fill: red)[???]
}
}
= Second <test2>
#myref(<test>)
#myref(<test2>)
```
|
https://github.com/3akev/autofletcher | https://raw.githubusercontent.com/3akev/autofletcher/main/autofletcher.typ | typst | MIT License | #import "@preview/fletcher:0.4.5" as fletcher: diagram, node, edge
// math helpers
#let vecadd(v1, v2) = v1.zip(v2).map(x => x.sum())
#let vecmult(v1, v2) = v1.zip(v2).map(x => x.product())
#let vecmultx(v, s) = (v.at(0) * s, v.at(1))
/// Calculates the relative position of a child node, like in a tree
///
/// Don't call this directly; instead, pass this as a parameter to `place-nodes`.
///
/// - i (int): The index of the child node
/// - num-total (int): The total number of children
#let tree-placer(i, num-total) = {
let idx = i - int((num-total - 1)/2)
return (idx, 1)
}
/// Returns a placer that places children in a circular arc
///
/// It appears this breaks spread, probably because it uses
/// fractional coordinates. Also, don't mix it with other non-fractional
/// placers. It messes up the graph
///
/// - start (angle, float): The starting angle of the arc
/// - length (angle, float): The length of the arc
/// - radius (float): The radius of the circle
#let arc-placer(start, length: 2*calc.pi, radius: 1) = {
if type(start) == angle {
start = start.rad()
}
if type(length) == angle {
length = length.rad()
}
let length = calc.clamp(length, 0, 2 * calc.pi)
let r = (radius, radius)
let circular-placer(i, num-total) = {
// if it's not a full circle, we subtract one from the total number of
// children cuz i is 0-indexed, but num-total is 1-indexed (sort of), so
// that leaves the last "slot" unused. this is useful when it's a full
// circle, but not when it's an arc
if length != 2*calc.pi and num-total > 1 {
num-total = num-total - 1
}
let angle = start + length * i / num-total
let vec = (calc.cos(angle), calc.sin(angle))
return vecmult(r, vec)
}
return circular-placer
}
/// A pre-defined arc placer that places children in a full circle.
#let circle-placer = arc-placer(0, length: 2 * calc.pi)
/// Returns a generic placer, where children are placed according to the given
/// relative positions. If more children are present than there are positions, positions
/// are repeated.
///
/// This is probably sufficient for most use cases.
///
/// - ..placements (coordinates): Relative positions to assign to children
/// -> function
#let placer(..placements) = {
let tab = placements.pos()
let discrete-placer(i, num-total) = {
return tab.at(calc.rem(i, tab.len()))
}
return discrete-placer
}
/// Calculates the positions of `num-children` children of `parent` node.
///
/// Returns a pair of arrays. The first array contains the coordinates of the
/// children, and the second array contains the nodes partially applied with
/// the calculated positions.
///
/// - parent (coordinates): The coordinates of the parent node
/// - num-children (int): The number of children to place
/// - placer (function): The function to calculate the relative positions of the children
/// - spread (int): A multiplier for the x coordinate, "spreads"
/// children out. Increase this for high parent nodes.
/// -> (array of coordinates + array of nodes)
#let place-nodes(parent, num-children, placer, spread: 1) = {
let coords = ()
let children = ()
for i in range(0, num-children) {
let rel-vec = placer(i, num-children)
let rel-vec = vecmultx(rel-vec, spread)
let pos = vecadd(parent, rel-vec)
coords = coords + (pos, )
children = children + (node.with(pos),)
}
return (coords, children)
}
/// Convenience function that draws edges between a parent node and its
/// children, given the coordinates of the parent and children.
///
/// - parent (coordinates): The coordinates of the parent node
/// - children (array of coordinates): The coordinates of the children nodes
/// - ..options (any): Additional options to pass to `edge`
///
#let edges(parent, children, ..options) = {
for child in children {
edge(parent, child, ..options.pos(), ..options.named())
}
}
|
https://github.com/LDemetrios/Conspects-4sem | https://raw.githubusercontent.com/LDemetrios/Conspects-4sem/master/typst/styles/theme-dispatch.typ | typst | #import "/typst/styles/themes/sepia.typ": *
|
|
https://github.com/tzx/NNJR | https://raw.githubusercontent.com/tzx/NNJR/main/resume_yaml.typ | typst | MIT License | #import "yml.typ": yml_resume
#let resume_data = yaml("example.yml")
#yml_resume(resume_data)
|
https://github.com/matnut2/SlidesBSc | https://raw.githubusercontent.com/matnut2/SlidesBSc/master/README.md | markdown | # polylux-unipd
[Polylux](https://github.com/andreasKroepelin/polylux) theme inspired by [beamer-padova](https://www.math.unipd.it/~burattin/other/tema-latex-beamer-padova/), roughly based on [typst-slides-unipd](https://github.com/SkiFire13/typst-slides-unipd) and [polylux-university](https://github.com/andreasKroepelin/polylux/blob/main/themes/university.typ).
I'm too lazy to write a proper documentation so you'll have to figure it out yourself but an [example](./slides.typ) is provided, you can see it built [here](./slides.pdf). Beware that the example is kinda broken on some long titles, but nothing show-stopping :)
|
|
https://github.com/r8vnhill/apunte-bibliotecas-de-software | https://raw.githubusercontent.com/r8vnhill/apunte-bibliotecas-de-software/main/Unit2/infix.typ | typst | == Funciones Infijas
Las funciones infijas en Kotlin permiten llamar a una función sin usar un punto y paréntesis, haciendo el código más legible.
Estas funciones solo requieren un argumento y tienen ciertos requisitos para ser definidas.
=== Requisitos para Funciones Infijas
- Debe ser miembro de una clase o una extensión de una clase.
- Debe tener un solo parámetro.
- Debe estar marcada con el modificador `infix`.
=== Ejemplo de Función Infija en una Clase
```kotlin
class Point(val x: Int, val y: Int) {
infix fun shift(dx: Int) = Point(x + dx, y)
}
```
In this example, `shift` is an infix function that shifts the point along the x axis by an amount `dx`.
```kotlin
fun main() {
val point = Point(1, 2)
val newPoint = point shift 3
println("Point shifted to x=${newPoint.x}, y=${newPoint.y}")
}
```
In the code above, the infix function `shift` is called without parentheses, making the code more concise and readable.
=== Example of an Infix Function as an Extension
It is also possible to define infix functions as extensions of a class.
```kotlin
class Point(val x: Int, val y: Int)
infix fun Point.shift(dx: Int) = Point(x + dx, y)
```
Here, `shift` is defined as an infix extension function for the `Point` class.
```kotlin
fun main() {
val point = Point(1, 2)
val newPoint = point shift 3
println("Point shifted to x=${newPoint.x}, y=${newPoint.y}")
}
```
This example demonstrates how an infix extension function can be used in the same way as an infix member function of a class.
|
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/docs/tinymist/frontend/main.typ | typst | Apache License 2.0 | #import "mod.typ": *
#show: book-page.with(title: "Tinymist Editor Frontends")
Leveraging the interface of LSP, tinymist provides frontends to each editor, located in the #link("https://github.com/Myriad-Dreamin/tinymist/tree/main/editors")[editor folders]. They are minimal, meaning that LSP should finish its main LSP features as many as possible without help of editor frontends. The editor frontends just enhances your code experience. For example, the vscode frontend takes responsibility on providing some nice editor tools. It is recommended to install these editors frontend for your editors.
Check the following chapters for uses:
- #cross-link("/frontend/vscode.typ")[VS Cod(e,ium)]
- #cross-link("/frontend/neovim.typ")[NeoVim]
- #cross-link("/frontend/emacs.typ")[Emacs]
- #cross-link("/frontend/sublime-text.typ")[Sublime Text]
- #cross-link("/frontend/helix.typ")[Helix]
- #cross-link("/frontend/zed.typ")[Zed]
|
https://github.com/pauladam94/curryst | https://raw.githubusercontent.com/pauladam94/curryst/main/examples/rule-as-premise.typ | typst | MIT License | #import "../curryst.typ": rule, proof-tree
#set document(date: none)
#set page(width: auto, height: auto, margin: 0.5cm, fill: white)
#proof-tree(
rule(
name: $R$,
$C_1 or C_2 or C_3$,
rule(
name: $A$,
$C_1 or C_2 or L$,
rule(
$C_1 or L$,
$Pi_1$,
),
),
rule(
$C_2 or overline(L)$,
$Pi_2$,
),
)
)
|
https://github.com/RaphGL/ElectronicsFromBasics | https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap6/2_kirchhoffs_voltage_law.typ | typst | Other | #import "../../core/core.typ"
=== Kirchhoff\'s Voltage Law (KVL)
Let\'s take another look at our example series circuit, this time
numbering the points in the circuit for voltage reference:
#image("static/00110.png")
If we were to connect a voltmeter between points 2 and 1, red test lead
to point 2 and black test lead to point 1, the meter would register +45
volts. Typically the \"+\" sign is not shown, but rather implied, for
positive readings in digital meter displays. However, for this lesson
the polarity of the voltage reading is very important and so I will show
positive numbers explicitly:
$ E_(2 - 1) = +45 V $
When a voltage is specified with a double subscript (the characters
\"2-1\" in the notation \"E#sub[2-1]\"), it means the voltage at the
first point (2) as measured in reference to the second point (1). A
voltage specified as \"E#sub[cg]\" would mean the voltage as indicated
by a digital meter with the red test lead on point \"c\" and the black
test lead on point \"g\": the voltage at \"c\" in reference to \"g\".
#image("static/00435.png")
If we were to take that same voltmeter and measure the voltage drop
across each resistor, stepping around the circuit in a clockwise
direction with the red test lead of our meter on the point ahead and the
black test lead on the point behind, we would obtain the following
readings:
$
E_(3-2) = -10 V \
E_(4-3) = -20 V \
E_(1-4) = -15 V
$
#image("static/00436.png")
We should already be familiar with the general principle for series
circuits stating that individual voltage drops add up to the total
applied voltage, but measuring voltage drops in this manner and paying
attention to the polarity (mathematical sign) of the readings reveals
another facet of this principle: that the voltages measured as such all
add up to zero:
#image("static/10108.png")
This principle is known as #emph[Kirchhoff\'s Voltage Law] (discovered
in 1847 by <NAME>, a German physicist), and it can be
stated as such:
#quote[
The algebraic sum of all voltages in a loop must equal zero
]
By #emph[algebraic], I mean accounting for signs (polarities) as well as
magnitudes. By #emph[loop], I mean any path traced from one point in a
circuit around to other points in that circuit, and finally back to the
initial point. In the above example the loop was formed by following
points in this order: 1-2-3-4-1. It doesn\'t matter which point we start
at or which direction we proceed in tracing the loop; the voltage sum
will still equal zero. To demonstrate, we can tally up the voltages in
loop 3-2-1-4-3 of the same circuit:
#image("static/10109.png")
This may make more sense if we re-draw our example series circuit so
that all components are represented in a straight line:
#image("static/00111.png")
It\'s still the same series circuit, just with the components arranged
in a different form. Notice the polarities of the resistor voltage drops
with respect to the battery: the battery\'s voltage is negative on the
left and positive on the right, whereas all the resistor voltage drops
are oriented the other way: positive on the left and negative on the
right. This is because the resistors are #emph[resisting] the flow of
electrons being pushed by the battery. In other words, the \"push\"
exerted by the resistors #emph[against] the flow of electrons
#emph[must] be in a direction opposite the source of electromotive
force.
Here we see what a digital voltmeter would indicate across each
component in this circuit, black lead on the left and red lead on the
right, as laid out in horizontal fashion:
#image("static/00112.png")
If we were to take that same voltmeter and read voltage across
combinations of components, starting with only R#sub[1] on the left and
progressing across the whole string of components, we will see how the
voltages add algebraically (to zero):
#image("static/00113.png")
The fact that series voltages add up should be no mystery, but we notice
that the #emph[polarity] of these voltages makes a lot of difference in
how the figures add. While reading voltage across R#sub[1],
R#sub[1]--R#sub[2], and R#sub[1]--R#sub[2]--R#sub[3] (I\'m using a
\"double-dash\" symbol \"--\" to represent the #emph[series] connection
between resistors R#sub[1], R#sub[2], and R#sub[3]), we see how the
voltages measure successively larger (albeit negative) magnitudes,
because the polarities of the individual voltage drops are in the same
orientation (positive left, negative right). The sum of the voltage
drops across R#sub[1], R#sub[2], and R#sub[3] equals 45 volts, which is
the same as the battery\'s output, except that the battery\'s polarity
is opposite that of the resistor voltage drops (negative left, positive
right), so we end up with 0 volts measured across the whole string of
components.
That we should end up with exactly 0 volts across the whole string
should be no mystery, either. Looking at the circuit, we can see that
the far left of the string (left side of R#sub[1]: point number 2) is
directly connected to the far right of the string (right side of
battery: point number 2), as necessary to complete the circuit. Since
these two points are directly connected, they are #emph[electrically
common] to each other. And, as such, the voltage between those two
electrically common points #emph[must] be zero.
Kirchhoff\'s Voltage Law (sometimes denoted as #emph[KVL] for short)
will work for #emph[any] circuit configuration at all, not just simple
series. Note how it works for this parallel circuit:
#image("static/00114.png")
Being a parallel circuit, the voltage across every resistor is the same
as the supply voltage: 6 volts. Tallying up voltages around loop
2-3-4-5-6-7-2, we get:
#image("static/10110.png")
Note how I label the final (sum) voltage as E#sub[2-2]. Since we began
our loop-stepping sequence at point 2 and ended at point 2, the
algebraic sum of those voltages will be the same as the voltage measured
between the same point (E#sub[2-2]), which of course must be zero.
The fact that this circuit is parallel instead of series has nothing to
do with the validity of Kirchhoff\'s Voltage Law. For that matter, the
circuit could be a \"black box\" -- its component configuration
completely hidden from our view, with only a set of exposed terminals
for us to measure voltage between -- and KVL would still hold true:
#image("static/00115.png")
Try any order of steps from any terminal in the above diagram, stepping
around back to the original terminal, and you\'ll find that the
algebraic sum of the voltages #emph[always] equals zero.
Furthermore, the \"loop\" we trace for KVL doesn\'t even have to be a
real current path in the closed-circuit sense of the word. All we have
to do to comply with KVL is to begin and end at the same point in the
circuit, tallying voltage drops and polarities as we go between the next
and the last point. Consider this absurd example, tracing \"loop\"
2-3-6-3-2 in the same parallel resistor circuit:
#image("static/00114.png")
#image("static/10111.png")
KVL can be used to determine an unknown voltage in a complex circuit,
where all other voltages around a particular \"loop\" are known. Take
the following complex circuit (actually two series circuits joined by a
single wire at the bottom) as an example:
#image("static/00116.png")
To make the problem simpler, I\'ve omitted resistance values and simply
given voltage drops across each resistor. The two series circuits share
a common wire between them (wire 7-8-9-10), making voltage measurements
#emph[between] the two circuits possible. If we wanted to determine the
voltage between points 4 and 3, we could set up a KVL equation with the
voltage between those points as the unknown:
$
&E_(4-3) + E_(9-4) + E_(8-9) + E_(3-8) = 0 \
&E_(4-3) + 12 + 0 + 20 = 0 \
&E_(4-3) + 32 = 0 \
&E_(4-3) = -32 V
$
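The same bookkeeping can be done numerically: summing the known drops around the loop and solving for the unknown. A quick sketch of the arithmetic above (the variable names are illustrative):

```python
# Known voltages stepping around loop 3-4-9-8-3 (red lead ahead,
# black lead behind), with the unknown E_4-3 left out.
known_drops = [12, 0, 20]   # E_9-4, E_8-9, E_3-8

# KVL: E_4-3 + 12 + 0 + 20 = 0, so the unknown is minus the sum.
e_4_3 = -sum(known_drops)
print(e_4_3)  # -32 (volts): point 3 is positive with respect to point 4
```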
#image("static/00358.png")
#image("static/00359.png")
#image("static/00360.png")
#image("static/00361.png")
Stepping around the loop 3-4-9-8-3, we write the voltage drop figures as
a digital voltmeter would register them, measuring with the red test
lead on the point ahead and black test lead on the point behind as we
progress around the loop. Therefore, the voltage from point 9 to point 4
is a positive (+) 12 volts because the \"red lead\" is on point 9 and
the \"black lead\" is on point 4. The voltage from point 3 to point 8 is
a positive (+) 20 volts because the \"red lead\" is on point 3 and the
\"black lead\" is on point 8. The voltage from point 8 to point 9 is
zero, of course, because those two points are electrically common.
Our final answer for the voltage from point 4 to point 3 is a negative
(-) 32 volts, telling us that point 3 is actually positive with respect
to point 4, precisely what a digital voltmeter would indicate with the
red lead on point 4 and the black lead on point 3:
#image("static/00117.png")
In other words, the initial placement of our \"meter leads\" in this KVL
problem was \"backwards.\" Had we generated our KVL equation starting
with E#sub[3-4] instead of E#sub[4-3], stepping around the same loop
with the opposite meter lead orientation, the final answer would have
been E#sub[3-4] \= +32 volts:
#image("static/00437.png")
It is important to realize that neither approach is \"wrong.\" In both
cases, we arrive at the correct assessment of voltage between the two
points, 3 and 4: point 3 is positive with respect to point 4, and the
voltage between them is 32 volts.
#core.review[
#quote(block: true, attribution: [Kirchhoff\'s Voltage Law (KVL)])[
The algebraic sum of all voltages in a loop must equal zero
]
]
|
https://github.com/RiccardoTonioloDev/Bachelor-Thesis | https://raw.githubusercontent.com/RiccardoTonioloDev/Bachelor-Thesis/main/appendix/bibliography/bibliography.typ | typst | Other | #pagebreak(to: "odd")
// Hayagriva format
#bibliography("bibliography.yml") |
https://github.com/Henriquelay/pathsec-checker | https://raw.githubusercontent.com/Henriquelay/pathsec-checker/main/presentation/figures/detour_boxes.typ | typst | #set page(width: auto, height: auto, margin: (x: 0pt, y: 0pt))
#set text(font: "DejaVu Sans Mono")
#let myswitch(name, digest, expected) = {
box[
#table(
columns: 2,
fill: if (digest == expected) {
lime
} else {
red
},
align: (right, left),
[name], [#name],
[digest], [#digest],
[expected], [#expected],
)
]
}
#table(
columns: 4,
stroke: none,
inset: 3pt,
myswitch([e1], [0xBADDC0DE], [0xBADDC0DE]),
myswitch([s1], [0x3EF96770], [0x3EF96770]),
myswitch([s2], [0x2DCA9942], [0x2DCA9942]),
myswitch([s3], [0x11797334], [0x11797334]),
myswitch([s4], [0x98081E3E], [0x98081E3E]),
myswitch([s5], [0x3332E012], [0x3332E012]),
myswitch([s555], [0x90A0DF94], [0x22996AFD]),
myswitch([s7], [0xBEBE4372], [0x8FA3987D]),
myswitch([s8], [0x5AAFA7F2], [0xF4B50950]),
myswitch([s9], [0x649B8554], [0xD0C29E67]),
myswitch([s10], [0xF46427BF], [0x13FF41C1]),
)
#pagebreak()
#table(
columns: 4,
stroke: none,
inset: 3pt,
myswitch([e10], [0xDEADBEEF], [0xDEADBEEF]),
myswitch([s10], [0x5F45C4E5], [0x5F45C4E5]),
myswitch([s9], [0x4D34AD25], [0x4D34AD25]),
myswitch([s8], [0x602BAA4E], [0x602BAA4E]),
myswitch([s7], [0x96F1275B], [0x96F1275B]),
myswitch([s555], [0xF247A607], [0x377923F8]),
myswitch([s5], [0x1871A1A6], [0x1CE1F48B]),
myswitch([s4], [0x311B656F], [0xC179BFAC]),
myswitch([s3], [0x0D2C0646], [0xB9A3B130]),
myswitch([s2], [0x804DC63F], [0xEAD6AF39]),
myswitch([s1], [0x4422E397], [0xBCA3D63A]),
)
|
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/hover/builtin_var2.typ | typst | Apache License 2.0 |
#(/* ident after */ sys.version);
|
https://github.com/barddust/Kuafu | https://raw.githubusercontent.com/barddust/Kuafu/main/src/BeforeMathematics/logic.typ | typst | #import "/mathenv.typ": *
= Mathematical Logic
== Statements
#definition(name: "statements")[
*Statements* are one of the basic sentences in linguistics. A statement consists of at least two objects:
+ The subject, say $x$;
+ The predicate, which a description or a property to modify the subject, say $P$.
Then the statement is expressed by the symbol $P(x)$. If we just care about the whole sentence, not the relation between subject and these descriptions, the statement is usually writen as a capitalized letter, say $A$.
]
#example[
The sentence "I am a human" can be separated into (1) the subject "I", denoted as a letter say $a$, (2) the predicate "X is a human", translated to $P(X)$. Replacing variable $X$ by the subject $a$, "I am a human" is translated to $P(a)$.
Or just a symbol $A$. This form is used a lot:
$
A: #text[I am a human.]
$
sometimes with quotes wrapped around the sentence.
]
#remark[
  A statement must be either true or false. The truth of a statement can change under different circumstances, e.g., "Today is hot" is true if one is suffering from sunlight; it is false when one is walking on a chilly day.
  A statement cannot be true and false at the same time. Suppose someone says "I am lying"; call this sentence $A$. If $A$ is true, the man is indeed telling a lie, but the lie is $A$ itself, so $A$ is false. This seems a little weird, so suppose instead that $A$ is false. Then the man is not telling a lie, i.e., he is telling the truth, so $A$ should be true after all. Whatever truth value $A$ starts with, we reach a contradiction. In this case, $A$: "I am lying" is not a statement.
]
#remark[
  It is not always easy to find out the truth of a sentence, e.g., $B: x > 1$. Suppose $x$ is some real number. If $x=3$, then $B$ is true, since $3 > 1$ is obvious. If $x=0$, however, $B$ becomes false, since $0 < 1$. Without any more information about $x$, we can hardly tell the truth of $B$. What we do know exactly is that once $x$ is picked, there are only two cases: $x$ is either greater than 1 or not, and these two cases cannot occur at the same time.
]
#remark[
If a statement $A$ is true, we may write $A=T$, or $A:T$, or $A=1$, i.e., $A$ equals to #strong[T]rue, or binary 1. Similarly, if $A$ is false, we may write $A=F$, or $A:F$, or $A=0$: $A$ equals to #strong[F]alse, or binary 0.
]
== Composite of Statements
#definition(name: "conjunction")[
  Let $A,B$ be both statements. We call "$A$ and $B$" the *conjunction* of $A$ and $B$, denoted as $A and B$. $A and B$ is true if $A$ and $B$ are both true, and false otherwise.
#figure(
table(
columns: 3,
[$A$], [$B$], [$A and B$],
[0], [0], [0],
[0], [1], [0],
[1], [0], [0],
[1], [1], [1],
),
caption: [Truth table of conjunction],
)
]
#example[
"I am such tall and handsome", is the conjunction of "I am tall" and "I am handsome". Conjunction means they both happen at the same time.
]
#definition(name: "disjunction")[
  Let $A,B$ be both statements. We call "$A$ or $B$" the *disjunction* of $A$ and $B$, denoted as $A or B$. $A or B$ is true if either $A$ is true or $B$ is true, or both of them are true. $A or B$ is false only if both $A$ and $B$ are false.
#figure(
table(
columns: 3,
[$A$], [$B$], [$A or B$],
[0], [0], [0],
[0], [1], [1],
[1], [0], [1],
[1], [1], [1],
),
caption: [Truth table of disjunction],
)
]
#example[
"He loves her or she loves him" is the disjunction of "He loves her" and "She loves him". When the disjunction is true, we can obatin the following three assertion, which could happen at the same time, or just some of them happen:
+ "He loves her"
+ "She loves him"
+ "They love each other", i.e., "He loves her" and "She loves him" occur at the same time.
]
#definition(name: "negation")[
  For a statement $A$, the negation of $A$ is a statement with the opposite truth value to $A$, denoted as $not A$, read as "not $A$".
#figure(
table(
columns: 2,
table.header([$A$], [$not A$]),
[0], [1],
[1], [0],
),
caption: [Truth table of negation],
)
]
#example[
  The negation pertains to the predicate, not the subject. For example, let $A$ be the statement "I am full". The negation $not A$ is "I am not full", where the subject is still "I".
]
#remark[
  Negation has higher precedence than the other two binary operations. For example, in $A and not B$, let $A$ be True and $B$ be False. We first consider $not B$, which is True. The conjunction $A and not B$ is then True, since both of its operands are true.
]
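The precedence rule mirrors how most programming languages parse boolean expressions. A small illustration in Python:

```python
A, B = True, False

# Negation binds tighter than "and": A and not B parses as A and (not B).
# With A = True and B = False, not B is True, so the conjunction is True.
result = A and not B
assert result == (A and (not B))
```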
== Implication
#definition(name: "implication")[
  Let $A,B$ be statements. We use $A => B$ to represent the statement "$A$ implies $B$", which means "If $A$, then $B$" in everyday language. $A => B$ is false only when $A$ is true but $B$ is false.
#figure(
table(
columns: 3,
[$A$], [$B$], [$A => B$],
[0], [0], [1],
[0], [1], [1],
[1], [0], [0],
[1], [1], [1],
),
caption: [Truth table of implication],
)
]
#remark[
  The truth of implication is not so intuitive. $A$ is a condition, and $B$ is a conclusion.
  If the condition holds and the consequence occurs accordingly, the implication behaves as promised, and we say it is true. But if the consequence fails to occur when the condition holds, we say the implication is false.
  What if the condition does not hold at all? In that situation we have no reason to deny, refuse, or give up the implication, so we _have to_ admit that it is true.
]
#remark[
Most of mathematical statements, called _propositions_, consists of many implications.
]
#definition(name: "equivalence")[
  Let $A,B$ be statements. We use $A <=> B$ to represent the statement "$A$ is equivalent to $B$". $A <=> B$ is true if $A$ has the same truth value as $B$.
#figure(
table(
columns: 3,
[$A$], [$B$], [$A <=> B$],
[0], [0], [1],
[0], [1], [0],
[1], [0], [0],
[1], [1], [1],
),
caption: [Truth table of equivalence],
)
]
#remark[
  Equivalence captures what we mean when we say that $A$ equals $B$. We usually say "$A$ if and only if $B$" for equivalence, or simply "iff".
]
#proposition[
These two statements are equivalent:
- $p => q$
- $not p or q$
] <impeq>
#proof[
List the truth table containing these two:
#figure(
table(
columns: 5,
[$p$], [$q$], [$p => q$], [$not p$], [$not p or q$],
[0], [0], [1], [1], [1],
[0], [1], [1], [1], [1],
[1], [0], [0], [0], [1],
[1], [1], [1], [0], [1],
),
)
  It is obvious that $p => q$ has the same truth values as $not p or q$, i.e., they are equivalent, and we may write
  $
  (p => q) <=> (not p or q)
  $
]
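Instead of writing the table by hand, the equivalence can be checked mechanically by enumerating all assignments. A small sketch in Python:

```python
from itertools import product

# "p implies q" defined via its truth table: false only when p is true
# and q is false.
def implies(p, q):
    return not (p and not q)

# Check the equivalence with (not p) or q on all four assignments.
for p, q in product([False, True], repeat=2):
    assert implies(p, q) == ((not p) or q)
```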
== Arithmetics of Statemets
#axiom[
For any two statements $A$ and $B$,
+ $A and B <=> B and A$
+ $A or B <=> B or A$
]
#proposition[
For any two statements $A$ and $B$,
+ $(A and B) and C <=> A and (B and C)$
+ $(A or B) or C <=> A or (B or C)$
+ $(A and B) or C <=> (A or C) and (B or C)$
+ $(A or B) and C <=> (A and C) or (B and C)$
]
#proof[
  All the proofs can be done by listing truth tables.
#figure(
table(
columns: 11,
[$A$], [$B$], [$C$],
[$A and B$], [$B and C$], [$(A and B) and C$], [$A and (B and C)$],
[$A or B$], [$B or C$], [$(A or B) or C$], [$A or (B or C)$],
[ ], [ ], [ ], [ ], [ ], [ ], [ ], [ ], [ ],[ ], [ ],
[ ], [ ], [1], [ ], [ ], [ ], [ ], [ ], [1],[1], [1],
[ ], [1], [ ], [ ], [ ], [ ], [ ], [1], [1],[1], [1],
[ ], [1], [1], [ ], [1], [ ], [ ], [1], [1],[1], [1],
[1], [ ], [ ], [ ], [ ], [ ], [ ], [1], [ ],[1], [1],
[1], [ ], [1], [ ], [ ], [ ], [ ], [1], [1],[1], [1],
[1], [1], [ ], [1], [ ], [ ], [ ], [1], [1],[1], [1],
[1], [1], [1], [1], [1], [1], [1], [1], [1],[1], [1],
fill: (x, y) => {
if x in (5,6) {aqua} else if x in (9,10) {teal}
}
),
caption: [Truth table for 1 and 2],
)
#figure(
table(
columns: 8,
[$A$], [$B$], [$C$],
[$A and B$], [$(A and B) or C$], [$A or C$], [$B or C$], [$(A or C) and (B or C)$],
[ ], [ ], [ ], [ ], [ ], [ ], [ ], [ ],
[ ], [ ], [1], [ ], [1], [1], [1], [1],
[ ], [1], [ ], [ ], [ ], [ ], [1], [ ],
[ ], [1], [1], [ ], [1], [1], [1], [1],
[1], [ ], [ ], [ ], [ ], [1], [ ], [ ],
[1], [ ], [1], [ ], [1], [1], [1], [1],
[1], [1], [ ], [1], [1], [1], [1], [1],
[1], [1], [1], [1], [1], [1], [1], [1],
fill: (x, y) => {
if x in (4,7) {aqua}
}
),
caption: [Truth table for 3],
)
#figure(
table(
columns: 8,
[$A$], [$B$], [$C$],
[$A or B$], [$(A or B) and C$], [$A and C$], [$B and C$], [$(A and C) or (B and C)$],
[ ], [ ], [ ], [ ], [ ], [ ], [ ], [ ],
[ ], [ ], [1], [ ], [ ], [ ], [ ], [ ],
[ ], [1], [ ], [1], [ ], [ ], [ ], [ ],
[ ], [1], [1], [1], [1], [ ], [1], [1],
[1], [ ], [ ], [1], [ ], [ ], [ ], [ ],
[1], [ ], [1], [1], [1], [1], [ ], [1],
[1], [1], [ ], [1], [ ], [ ], [ ], [ ],
[1], [1], [1], [1], [1], [1], [1], [1],
fill: (x, y) => {
if x in (4,7) {teal}
}
),
caption: [Truth table for 4],
)
As shown above, columns filled with the same color have exactly the same truth values, i.e., those two propositions are equivalent.
]
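The four laws can also be verified by brute force. This Python sketch (names are illustrative) walks the same eight rows as the tables above and checks each law:

```python
from itertools import product

def verify_laws():
    # Check associativity (laws 1, 2) and distributivity (laws 3, 4)
    # over all 8 assignments of A, B, C.
    for a, b, c in product([False, True], repeat=3):
        assert ((a and b) and c) == (a and (b and c))        # law 1
        assert ((a or b) or c) == (a or (b or c))            # law 2
        assert ((a and b) or c) == ((a or c) and (b or c))   # law 3
        assert ((a or b) and c) == ((a and c) or (b and c))  # law 4
    return True

print(verify_laws())  # True
```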
#theorem(name: "De Morgan's laws")[
For any two statements $A$ and $B$,
- $not (A and B) <=> not A or not B$
- $not (A or B) <=> not A and not B$
]
#proof[
De Morgan's laws can likewise be proved by truth tables.
#figure(
table(
columns: 7,
[$A$], [$B$],
[$A and B$], [$not (A and B)$],
[$not A$], [$not B$], [$not A or not B$],
[ ], [ ], [ ], [1], [1], [1], [1],
[ ], [1], [ ], [1], [1], [ ], [1],
[1], [ ], [ ], [1], [ ], [1], [1],
[1], [1], [1], [ ], [ ], [ ], [ ],
fill: (x, y) => if x in (3,6) {aqua}
),
)
#figure(
table(
columns: 7,
[$A$], [$B$],
[$A or B$], [$not (A or B)$],
[$not A$], [$not B$], [$not A and not B$],
[ ], [ ], [ ], [1], [1], [1], [1],
[ ], [1], [1], [ ], [1], [ ], [ ],
[1], [ ], [1], [ ], [ ], [1], [ ],
[1], [1], [1], [ ], [ ], [ ], [ ],
fill: (x, y) => if x in (3,6) {teal}
),
caption: [Truth table for the Morgan's laws],
)
Columns filled with the same color in the tables above have exactly the same truth values, so the two propositions in each law are equivalent.
]
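The same mechanical check works for De Morgan's laws; the Python sketch below (illustrative only) enumerates all four rows:

```python
from itertools import product

def de_morgan_holds():
    # not(A and B) <=> (not A) or (not B), and
    # not(A or B)  <=> (not A) and (not B), on every row.
    return all(
        (not (a and b)) == ((not a) or (not b))
        and (not (a or b)) == ((not a) and (not b))
        for a, b in product([False, True], repeat=2)
    )

print(de_morgan_holds())  # True
```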
== Quantifiers
#definition(name: "universal quantifiers")[
Let $P(x)$ be some statement depending on a free variable $x$. "$P(x)$ is true for all objects $x$" is called a *universal proposition*, denoted as
$ forall x P(x) $
A universal proposition is true iff (if and only if) $P(x)$ is true for every object $x$, regardless of the exact value of $x$.
]
#example[
"Square of all real numbers is non-negative" is expressed by $forall x(x in RR => x^2 >= 0)$, or simply $forall x in RR (x^2 >= 0)$.
There are two ways to symbolize "Human will die". (1) Let $H(x)$ be a proposition "$x$ is a human", and $D(x)$ be "$x$ will die", then it looks like $forall x (H(x) => D(x))$. (2) Let $H$ be a set containing all human, and $D(x)$ be the same, then the proposition should be like $forall x in H (D(x))$.
]
#definition(name: "existential quantifiers")[
Let $P(x)$ be some statement depending on a free variable $x$. "$P(x)$ is true for some object $x$" is called an *existential proposition*, denoted as
$ exists x P(x) $
An existential proposition is true iff (if and only if) there is at least one object $x$ such that $P(x)$ is true.
]
#example[
"Not all swans are white" means there is at least one swan which is not white. How many of them. We have no idea exactly, the certain thing we know is there exists *at least one*. The symbolic express is $exists x in S (not W(x))$, where $S$ is the set containing all swans, and $W(x)$ tells that "$x$ is white". Another expression is $exists x (S(x) => W(x))$, where $S(x)$ means "$x$ is a swan".
]
#axiom[
- $not(forall x P(x)) <=> exists x(not(P(x)))$
- $not(exists x Q(x)) <=> forall x(not(Q(x)))$
]
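On a finite domain the axiom can be checked directly. In the Python sketch below (the domain and predicates are invented for illustration), `all()` plays the role of the universal quantifier and `any()` the existential one:

```python
domain = range(-3, 4)          # a small finite domain
P = lambda x: x * x >= 0       # holds for every element
W = lambda x: x > 0            # holds for some elements only

# not(forall x P(x)) <=> exists x not P(x)
assert (not all(P(x) for x in domain)) == any(not P(x) for x in domain)

# not(exists x W(x)) <=> forall x not W(x)
assert (not any(W(x) for x in domain)) == all(not W(x) for x in domain)

print("quantifier negation verified")
```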
#remark[
The negation of a universal proposition is an existential one, and vice versa. E.g., the negation of "All swans are white" is "There exists at least one swan which is not white".
]
// Source: https://github.com/nafkhanzam/typst-common (src/touying-themes/its-mooc.typ)
#import "its-theme.typ": *
#let its-mooc-theme(
title,
subtitle,
author: [<NAME>, S.T., M.T.],
institution: [
Department of Informatics \
Faculty of Intelligent Electrical and Informatics Technology \
Institut Teknologi Sepuluh Nopember
],
logo: image("its-logo.png", width: 4em),
copyright: [
#sym.copyright #datetime.today().year() All rights reserved
],
with-end: true,
..args,
body,
) = [
#show: university-theme.with(
footer-a: none,
footer-b: none,
footer-c: none,
footer-columns: (0%,),
progress-bar: false,
config-info(
title: title,
subtitle: subtitle,
author: author,
institution: institution,
logo: logo,
copyright: copyright,
),
config-methods(init: (self: none, body) => {
show: university-init.with(self: self)
set page(background: image("its-mooc-electics-background.jpg"))
body
}),
config-colors(
primary: white,
secondary: white,
tertiary: white,
neutral-lightest: white,
neutral-darkest: white,
),
..args,
)
#set line(stroke: white)
#show link: text.with(fill: rgb("#176B87"))
#show link: underline
#title-slide()
#body
#if with-end {
new-section-slide(level: 1, numbered: false)[End of #subtitle]
}
]
// Source: https://github.com/dismint/docmint (religion/quiz1.typ)
#import "template.typ": *
#show: template.with(
title: "Quiz 1 Prep",
subtitle: "24.05"
)
= Audi
== Secular Motivation
// Source: https://github.com/dldyou/Operation-System (typst/template.typ)
#let team_name = "dldyou"
#let title = "운영체제 기말 정리본"
#let sub_title = "2024-1"
#let authors = (
team_name,
)
#let head = {
[
#text(weight: 700)[#team_name]
#text(weight: 400)[#sub_title]
#h(1fr)
#text(weight: 400)[#title]
#line(length: 100%, stroke: 0.2pt)
]
}
#let prompt(content, lang:"md") = {
box(
inset: 15pt,
width: auto,
fill: rgb(247, 246, 243, 50%),
)[#text(content, font: "Cascadia Mono", size: 0.8em)]
}
#let _returns = {
text("Returns:", font:"Cascadia Mono", weight: 600, size:9pt)
}
#let _params = {
text("Parameters:", font:"Cascadia Mono", weight: 600, size:9pt)
}
#let _details = {
text("Details:", font:"Cascadia Mono", weight: 600, size:9pt)
}
#let project(title: "", authors: (), logo: none, body) = {
set text(9pt, font: "Pretendard")
set heading(numbering: "1.")
set page(columns: 1, numbering: "1 / 1", number-align: center, header: head, margin: 5em)
show outline.entry.where(level: 1): it => {
v(25pt, weak:true)
strong(it)
}
show heading : it => { it; v(0.5em);}
align(center)[
#block(text(weight: 800, 1.75em, title))
]
pad(
top: 0.5em,
bottom: 0.5em,
x: 2em,
grid(
columns: (1fr,) * calc.min(1, authors.len()),
gutter: 1em,
..authors.map(author => align(center, author)),
),
)
set par(justify: true)
outline(title: "목 차", depth: 5, indent: 1em, fill: repeat(text(weight: 700)[.#h(0.5em)]))
set page(columns: 2, numbering: "1 / 1", number-align: center, header: head)
set text(8pt, font: "Pretendard")
body
}
// Source: https://github.com/JakMobius/courses (mipt-os-basic-2024/sem05/cheat-sheet/main.typ)
#import "../../theme/asm.typ": *
#import "../../theme/theme.typ": *
#set page(width: 297mm, height: 210mm)
#set page(margin: 10mm)
#place(bottom + center, dy: 0.45cm)[
Архитектура компьютеров и операционные системы, МФТИ, 10 октября 2024.
]
#grid(columns: (50%, 50%),
[
= Семинар 5 - Ассемблер AArch64.
== Задача
Реализуйте функцию #raw("_print_list") на ассемблере AArch64. Функция принимает указатель на первый элемент связного списка и должна вывести в консоль содержимое полей #raw("element") всех элементов используя функцию #raw("puts").
== Заголовочный файл:
#code(numbers: true, ```c
typedef struct list_entry {
struct list_entry* next;
const char* element;
} list_entry;
void print_list(list_entry* first);
```)
== Заготовка:
#lightasmtable(numbers: true, inset: 0.2em, ```asm
_print_list:
sub sp, sp, 32
stp x29, x30, [sp, +16]
mov x29, sp
# <ваш код>
# Заготовка: печать первого элемента
ldr x0, [x0, 8] # x0 = first->element
bl _puts # puts(first->element)
# </ваш код>
ldp x29, x30, [sp, +16]
add sp, sp, 32
ret
```)
], [
#show heading: (content) => {
set block(below: 0.6em, above: 0.8em)
content
}
#set table(stroke: none, inset: (x: 5pt, y: 2pt))
== Инструкции обработки данных
#table(columns: 2,
[#lightasm("ADD <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn + Rm;")],
[#lightasm("SUB <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn - Rm;")],
[#lightasm("MUL <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn * Rm;")]
)
== Инструкции load / store
#table(columns: 2,
[#lightasm("LDR <Rt>, [<Rn>]"):], [#raw(lang: "c", "Rt = *Rn;")],
[#lightasm("LDR <Rt>, [<Rn>, <imm>]"):], [#raw(lang: "c", "Rt = *(Rn + imm);")],
[#lightasm("LDR <Rt>, [<Rn>, <imm>]!"):], [#raw(lang: "c", "Rt = *(Rn += imm);")],
[#lightasm("LDR <Rt>, [<Rn>], <imm>"):], [#raw(lang: "c", "Rt = *(Rn); Rn += imm;")],
[#lightasm("STR <Rt>, [<Rn>]"):], [#raw(lang: "c", "*Rn = Rt;")],
[#lightasm("STR <Rt>, [<Rn>, <imm>]"):], [#raw(lang: "c", "*(Rn + imm) = Rt;")],
[#lightasm("STR <Rt>, [<Rn>, <imm>]!"):], [#raw(lang: "c", "*(Rn += imm) = Rt;")],
[#lightasm("STR <Rt>, [<Rn>], <imm>"):], [#raw(lang: "c", "*(Rn) = Rt; Rn += imm;")],
[], [],
[#lightasm("STP <Ra>, <Rb> -||-"):], [Аналогично, но сохраняет два регистра],
[#lightasm("LDP <Ra>, <Rb> -||-"):], [Аналогично, но загружает два регистра],
)
== Инструкции перехода
#table(columns: 2,
[#lightasm("B <label>"):], [#raw(lang: "c", "goto label")],
[#lightasm("BL <label>"):], [#lightasm("call"), или #raw(lang: "c", "goto label") + #raw("r30 = <return address>")],
[#lightasm("RET"):], [#mnemonic("return"), или #raw(lang: "c", "goto r30")]
)
== Логические инструкции
#table(columns: 2,
[#lightasm("AND <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn & Rm;")],
[#lightasm("ORR <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn | Rm;")],
[#lightasm("EOR <Rd>, <Rn>, <Rm>"):], [#raw(lang: "c", "Rd = Rn ^ Rm;")]
)
== Условные суффиксы
Чтобы изменить флаги, *добавь суффикс #mnemonic("s")*
#table(columns: 4,
[#lightasm(".EQ"):], [Если равно],
[#lightasm(".NE"):], [Если не равно],
[#lightasm(".HS"):], [Если беззнаково $>=$],
[#lightasm(".LO"):], [Если беззнаково $<$],
[#lightasm(".MI"):], [Если отрицательно],
[#lightasm(".PL"):], [Если положительно или ноль],
[#lightasm(".VS"):], [Если переполнение],
[#lightasm(".VC"):], [Если нет переполнения],
[#lightasm(".HI"):], [Если беззнаково $>$],
[#lightasm(".LS"):], [Если беззнаково $<=$],
[#lightasm(".GE"):], [Если знаково $>=$],
[#lightasm(".LT"):], [Если знаково $<$],
[#lightasm(".GT"):], [Если знаково $>$],
[#lightasm(".LE"):], [Если знаково $<=$]
)
]) |
|
https://github.com/isaacew/aiaa-typst | https://raw.githubusercontent.com/isaacew/aiaa-typst/master/main.typ | typst | //***************************************************************
// AIAA TYPST TEMPLATE
//
// The author of this work hereby waives all claim of copyright
// (economic and moral) in this work and immediately places it
// in the public domain; it may be used, distorted or
// in any manner whatsoever without further attribution or notice
// to the creator. The author is not responsible for any liability
// from the usage or dissemination of this code.
//
// Author: <NAME>, <NAME>
// Date: 06 NOV 2023
// BAMDONE!
//***************************************************************
#import "template.typ": *
#show: aiaa.with(
title: "Preparation of Papers for AIAA Technical Conferences",
bibliography-file: "refs.bib",
authors-and-affiliations: (
(
name:"<NAME>",
job:"Insert Job Title",
department:"Department Name",
aiaa:"and AIAA Member Grade (if any) for first author"
),
(
name:"<NAME>.",
job:"Insert Job Title",
department:"Department Name",
aiaa:"and AIAA Member Grade (if any) for second author"
),
(
institution:"Business or Academic Affiliation's Full Name 1",
city:"City",
state:"State",
zip:"Zip Code",
country:"Country"
),
(
name:"<NAME>",
job:"Insert Job Title",
department:"Department Name",
aiaa:"and AIAA grade (if any) for third author"
),
(
institution:"Business or Academic Affiliation's Full Name 2",
city:"City",
state:"State",
zip:"Zip Code",
country:"Country"
),
(
name:"Fourth <NAME>",
job:"Insert Job Title",
department:"Department Name",
aiaa:"and AIAA grade (if any) for fourth author"
),
(
institution:"Business or Academic Affiliation's Full Name 3",
city:"City",
state:"State",
zip:"Zip Code",
country:"Country"
)
),
abstract: [These instructions give you guidelines for preparing papers for AIAA Technical Papers. Use this document as a template if you are using Typst. Otherwise, use this document as an instruction set. Define all symbols used in the abstract. Do not cite references in the abstract. The footnote on the first page should list the Job Title and AIAA Member Grade for each author, if known. Authors do not have to be AIAA members.]
)
#nomenclature(
([$A$], [amplitude of oscillation]),
([$a$], [cylinder diameter]),
($C_p$, "coefficient of pressure"),
($C_x$, "force coefficient in the x direction"),
($C_y$, "force coefficient in the y direction"),
($c$, "chord"),
($d t$, "time stamp"),
($F_x$, [$X$ component of the resultant pressure force acting on the vehicle]),
($F_y$, [$Y$ component of the resultant pressure force acting on the vehicle]),
($f, g$, "generic functions"),
($h$, "height"),
($i$, "time index during navigation"),
($j$, "waypoint index"),
($K$, "trailing-edge (TE) nondimensional angular deflection rate")
)
= Introduction
#dropcap(
height: 2,
gutter: 1pt,
hanging-indent: 0em,
justify: true,
)[T #smallcaps([his]) document is a template for Typst. If you are reading a hard-copy or .pdf version of this document, download the AIAA Meeting Papers Template from the Meeting Paper Author page at www.aiaa.org or from the Technical Presenter Resources page for the appropriate AIAA forum, and use it to prepare your manuscript.]
Authors using Microsoft Word will first need to save the AIAA Meeting Papers Template.dotx file in the “Templates” directory of their hard drive. To do so, simply open the AIAA Meeting Papers Template.dotx file and then click “File>Save As:” to save the template. [Note: Windows users will need to indicate “Save as Type>Document Template (\*.dot)” when asked in the dialogue box; Mac users should save the file in the “My Templates” directory.] To create a new document using this template, use the command “File>New>From Template” (Windows) or “File>Project Gallery>My Templates” (Mac). To create your formatted manuscript, type your own text over sections of the Template, or cut and paste from another document and then use the available markup styles. Note that special formatting such as subscripts, superscripts, and italics may be lost when you copy your text into the template. See Section V for more detailed formatting guidelines.
= Procedure for Paper Submission
All manuscripts are to be submitted electronically to the ScholarOne Abstracts site created for each conference. The manuscript upload will be enabled several weeks after acceptance notices have been sent. Presenting authors of accepted papers will receive an email with instructions when manuscript submission opens. It is important that presenting authors keep their email addresses up-to-date so they do not miss this notice.
Before completing manuscript submission, submitters must also select the copyright statement that will appear on the paper, and complete other acknowledgments. It is also necessary to click both the “Accept” and “Save” buttons to complete a submission. All completed manuscript submissions will be confirmed by email. Completed submissions will also have a status of “Accepted” at the top of your manuscript submission page. All files must be in pdf format. Please be sure that all security settings are removed from the pdf file before uploading to ensure proper processing of your manuscript file.
= General Guidelines
The following section outlines general (nonformatting) guidelines to follow. These guidelines are applicable to all authors (except as noted), and include information on the policies and practices relevant to the publication of your manuscript.
== Publication by AIAA
Your manuscript cannot be published by AIAA if:
+ It has been published previously or
+ The work contains copyright-infringing material or
+ An appropriate copyright statement has not yet been selected.
== Paper Review and Visa Considerations
It is the responsibility of the author to obtain any required government or company reviews for their papers in advance of publication. Start early to determine if the reviews are required; this process can take several weeks.
If you plan to attend an AIAA Forum, technical conference or professional development course held in the United States and you require a visa for travel, it is incumbent upon you to apply for a visa with the U.S. embassy (consular division) or consulate with ample time for processing. To avoid bureaucratic problems, AIAA strongly suggests that you submit your formal application to U.S. authorities a minimum of 120 days in advance of the date of anticipated travel.
Prospective conference and course attendees requiring a visa to travel to the United States should first contact AIAA to request an official letter of invitation. This letter and a copy of the conference call for papers should be presented along with the required documentation to the U.S. consular officials as part of the formal application process. AIAA cannot directly intervene with the U.S. Department of State, consular offices, or embassies on behalf of individuals applying for visas. A letter of invitation can be requested by completing the Visa Invitation Letter Request Form at https://www.aiaa.org/events-learning/Forums or you may contact the Event Registrar at <EMAIL> for more information.
== Control ID Number vs Paper
Your paper was assigned a control ID number at the time you submitted your abstract. It is critical that you reference the tracking number and conference name when contacting AIAA regarding your submission. The control ID number is not the final AIAA paper number. The paper number, which appears in the format AIAA-20XX-XXXX, will be used to refer to your paper in the program and in any publication format. It will not be assigned until shortly before the conference. *Do not include a paper number anywhere on your paper, as this number will be stamped automatically in the top right corner of your paper at the time of processing.*
== Copyright
Before AIAA can print or publish any paper, the copyright information must be completed in the submission system. Failure to complete the electronic form correctly could result in your paper not being published. The following fields must be completed:
+ Clearance Statement
+ Non-Infringement Statement
+ Publication Status Statement
+ One Copyright Assignment Statement (Select either A, B, C, or D)
Be sure to read the copyright statements carefully. AIAA requires a copyright transfer from the author(s) to AIAA or a license to publish and distribute your material; government authors can assert that the work is in the public domain. If you are not sure which copyright statement to use, contact your legal department. Refer to AIAA’s Rights and Permissions page at www.aiaa.org for more information; AIAA cannot help you determine which statement to use. Do not include a copyright statement anywhere on your paper. The correct statement will be stamped automatically at the time of processing.
== Submission Deadlines
Manuscripts will be accepted for upload to the system from the receipt of the email invitation until the deadline set for the conference. You will be notified of the specific manuscript submission deadline in your acceptance letter, and the deadline will also be listed on the conference web page at AIAA. Do not upload a draft version of your manuscript with the intent to upload a final version later. *Please review your manuscript very carefully before completing your submission to ensure that your paper is complete and final in all respects. Once the manuscript deadline has passed, you will be locked out of the manuscript site, so it is critical that you upload a final, carefully proofed document.*
Online conference proceedings will be made accessible to attendees who have registered for the “full conference” when the conference opens. Once the proceedings are published online, the conference papers will be considered the version of record and may not be removed or replaced. Changes to published papers can be made available through the Crossmark feature, where corrections and updates are accessed by clicking the Crossmark icon available on every paper published in Aerospace Research Central.
The opportunity to submit Crossmark updates will be provided to presenting authors starting the first day of the conference through 2000 hrs/8 pm Eastern Time, seven business days after the last day of the conference. The proceedings will be updated with Crossmark updates shortly after that date. AIAA will NOT accept changes and/or change requests that solely correct grammatical errors, spelling errors, or errors in formatting. All corrections should be for editorially significant changes where the change affects interpretation or crediting of the work.
To ensure conference quality, session chairs will enforce a "no paper, no podium" rule. This policy is intended to eliminate no-shows, to improve the quality of the conference for all participants, and to ensure that the published proceedings accurately represent the presentations made at a conference.
= Detailed Formatting Instructions
The styles and formats for the AIAA Papers Template have been incorporated into the structure of this document. If you are using Microsoft Word 2001 or later, please use this template to prepare your manuscript. For authors that prefer using LaTeX, AIAA has partnered with Overleaf to provide an online editor to create your manuscript in LaTex. Please visit the AIAA LaTex site for instructions. Regardless of which program you use to prepare your manuscript, please use the formatting instructions contained in this document as a guide.
If you are using the AIAA Meeting Papers Template.dotx file to prepare your manuscript, you can simply type your own text over sections of this document or cut and paste from another document and use the available markup styles. If you choose to cut and paste, select the text from your original Word document and choose Edit>Copy. (Do not select your title and author information, since the document spacing may be affected. It is a simple task to reenter your title and author information in the template.) Open the template file. Place your cursor in the text area of the template and select Edit>Paste Special. When the Paste Special box opens, choose “unformatted text” or “keep source formatting.” Please note that special formatting (e.g., subscripts, superscripts, italics) may be lost when you copy your text into the template. Use italics for emphasis; do not underline. Use the “Print Layout” feature from the “View” menu bar (View>Print Layout) to see the most accurate representation of how your final paper will appear.
== Document Text
The default font for AIAA papers is Times New Roman, 10-point size. In the electronic template, use the “Text” or “Normal” style from the pull-down menu to format all primary text for your manuscript. The first line of every paragraph should be indented, and all lines should be single-spaced. Default margins are 1” on all sides. In the electronic version of this template, all margins and other formatting is preset. There should be no additional lines between paragraphs.
Extended quotes, such as this example, are to be used when material being cited is longer than a few sentences, or the standard quotation format is not practical. In this Word template, the appropriate style is “Extended Quote” from the drop-down menu. Extended quotes are to be in Times New Roman, 9-point font, indented 0.4” and full justified.
NOTE: If you are using the electronic template to format your manuscript, the required spacing and formatting will be applied automatically, simply by using the appropriate style designation from the pull-down menu.
== Headings
The title of your paper should be typed in bold, 24-point type, with capital and lower-case letters, and centered at the top of the page. The names of the authors, business or academic affiliation, city, and state/province should follow on separate lines below the title. The names of authors with the same affiliation can be listed on the same line above their collective affiliation information. Author names are centered, and affiliations are centered and in italic type immediately below the author names. The affiliation line for each author is to include that author’s city, state, and zip/postal code (or city, province, zip/postal code and country, as appropriate). The first-page footnotes (lower left-hand side) contain the job title and department name, street address/mail stop, and AIAA member grade for each author. Author email addresses may be included also.
Major headings (“Heading 1” in the template style list) are bold 11-point font, centered, and numbered with Roman numerals.
Subheadings (“Heading 2” in the template style list) are bold, flush left, and numbered with capital letters. Sub-Subheadings (“Heading 3” in the template style list) are italic, flush left, and numbered (1. 2. 3. etc.)
== Abstract
The abstract should appear at the beginning of your paper. It should be one paragraph long (not an introduction) and complete in itself (no reference numbers). It should indicate subjects dealt with in the paper and state the objectives of the investigation. Newly observed facts and conclusions of the experiment or argument discussed in the paper must be stated in summary form; readers should not have to read the paper to understand the abstract. The abstract should be bold, indented 3 picas (1/2”) on each side, and separated from the rest of the document by blank lines above and below the abstract text.
== Nomenclatures
Papers with many symbols may benefit from a nomenclature list that defines all symbols with units, inserted between the abstract and the introduction. If one is used, it must contain all the symbology used in the manuscript, and the definitions should not be repeated in the text. In all cases, identify the symbols used if they are not widely recognized in the profession. Define acronyms in the text, not in the nomenclature.
== Footnotes and References
Footnotes, where they appear, should be placed above the 1” margin at the bottom of the page. To insert footnotes into the template, use the Insert>Footnote feature from the main menu as necessary. Numbered footnotes as formatted automatically in the template are acceptable, but superscript symbols are the preferred AIAA style, in the sequence, \*, \†, \‡, \§, \¶, \#, \*\*. \†\†, \‡\‡, \§\§, etc.
List and number all references at the end of the paper. Corresponding bracketed numbers are used to cite references in the text [1], unless the citation is an integral part of the sentence (e.g., “It is shown in Ref. [2] that…”) or follows a mathematical expression: “A2 + B = C (Ref. [3]).” For multiple citations, separate reference numbers with commas [4, 5], or use a dash to show a range [6-8]. Reference citations in the text should be in numerical order.
In the reference list, give all authors’ names; do not use “et al.” unless there are more than 10 authors. Papers that have not been published should be cited as “unpublished”; papers that have been submitted or accepted for publication should be cited as “submitted for publication.” Private communications and personal websites should appear as footnotes rather than in the reference list.
References should be cited according to the standard publication reference style (for examples, see the “References” section of this template). Never edit titles in references to conform to AIAA style of spellings, abbreviations, etc. Names and locations of publishers should be listed; month and year should be included for reports and papers. For papers published in translation journals, please give the English citation first, followed by the original foreign language citation.
== Images, Figures, and Tables
All artwork, captions, figures, graphs, and tables will be reproduced exactly as submitted. Be sure to position any figures, tables, graphs, or pictures as you want them printed. AIAA will not be responsible for incorporating your figures, tables, etc. (Company logos and identification numbers are not permitted on your illustrations.)
Do not insert your tables and figures in text boxes. Figures should have no background, borders, or outlines. In the electronic template, use the “Figure” style from the pull-down formatting menu to type caption text. You may also insert the caption by going to the References menu and choosing Insert Caption. Make sure the label is “Fig.,” and type your caption text in the box provided. Captions are bold with a single tab (no hyphen or other character) between the figure number and figure description.
#figure(
image("Picture1.png", width:50%),
caption: [
Magnetization as a function of applied fields.
],
) <a-figure>
Place figure captions below all figures; place table titles above the tables. If your figure has multiple parts, include the labels “a),” “b),” etc. below and to the left of each part, above the figure caption. Please verify that the figures and tables you mention in the text actually exist. Please do not include captions as part of the figures, and do not put captions in separate text boxes linked to the figures. When citing a figure in the text, use the abbreviation “Fig.” except at the beginning of a sentence. Do not abbreviate “Table.” Number each different type of illustration (i.e., figures, tables, images) sequentially with relation to other illustrations of the same type.
Figure axis labels are often a source of confusion. Use words rather than symbols. As in the example to the right, write the quantity “Magnetization” rather than just “M.” Do not enclose units in parenthesis, but rather separate them from the preceding text by commas. Do not label axes only with units. As in Fig. 1, for example, write “Magnetization, kA/m” not just “kA/m.” Do not label axes with a ratio of quantities and units. For example, write “Temperature, K,” not “Temperature/K.”
Multipliers can be especially confusing. Write “Magnetization, kA/m” or “Magnetization, 103 A/m.” Do not write “Magnetization (A/m) x 1000” because the reader would not then know whether the top axis label in Fig. 1 meant 16000 A/m or 0.016 A/m. Figure labels must be legible, and all text within figures should be uniform in style and size, no smaller than 8-point type.
== Equations, Numbers, Symbols, and Abbreviations
$ integral_0^(r_2) F(r,phi) dif r dif phi &= [sigma r_2 \/ (2 mu_0)] \ & dot integral_0^infinity exp(-lambda | z_j - z_i | ) lambda^(-1) J_1 (lambda r_2) J_0 (lambda r_i) dif lambda $ <equation1>
Be sure that the symbols in your equation are defined before the equation appears, or immediately following. Italicize symbols (an italic _T_ might refer to temperature, but an upright T is the unit tesla). Refer to “Eq. (1),” not “(1)” or “equation (1)” except at the beginning of a sentence: “Equation (1) is…” Equations can be labeled other than “Eq.” should they represent inequalities, matrices, or boundary conditions. If what is represented is really more than one equation, the abbreviation “Eqs.” can be used.
An example of using Equations can be used as follows: It can be referred to as @equation1. Note that this will work when it's an in-sentence equation. However, when a sentence begins with Equation, you will need to use the following to make it work:
#bEquation([@equation1]) when it starts at the beginning of a sentence.
Define abbreviations and acronyms the first time they are used in the text, even after they have already been defined in the abstract. Very common abbreviations such as AIAA, SI, ac, and dc do not have to be defined. Abbreviations that incorporate periods should not have spaces: write “P.R.,” not “P. R.” Delete periods between initials if the abbreviation has three or more initials; e.g., U.N. but ESA. Do not use abbreviations in the title unless they are unavoidable (for instance, “AIAA” in the title of this article).
== General Grammar and Preferred Usage
Use only one space after periods or colons. Hyphenate complex modifiers: “zero-field-cooled magnetization.” Avoid dangling participles, such as, “Using Eq. (1), the potential was calculated.” [It is not clear who or what used Eq. (1).] Write instead “The potential was calculated using Eq. (1),” or “Using Eq. (1), we calculated the potential.”
Use a zero before decimal points: “0.25,” not “.25.” Use “cm#super[3],” not “cc.” Indicate sample dimensions as “0.1 cm x 0.2 cm,” not “0.1 x 0.2 cm#super[2].” The preferred abbreviation for “seconds” is “s,” not “sec.” Do not mix complete spellings and abbreviations of units: use “Wb/m#super[2]” or “webers per square meter,” not “webers/m#super[2].” When expressing a range of values, write “7 to 9” or “7-9,” not “7~9.”
A parenthetical statement at the end of a sentence is punctuated outside of the closing parenthesis (like this). (A parenthetical sentence is punctuated within parenthesis.) In American English, periods and commas are placed within quotation marks, like “this period.” Other punctuation is “outside”! Avoid contractions; for example, write “do not” instead of “don’t.” The serial comma is preferred: “A, B, and C” instead of “A, B and C.”
If you wish, you may write in the first person singular or plural and use the active voice (“I observed that…” or “We observed that…” instead of “It was observed that…”). Remember to check spelling. If your native language is not English, please ask a native English-speaking colleague to proofread your paper.
The word “data” is plural, not singular (i.e., “data are,” not “data is”). The subscript for the permeability of vacuum µ#sub[0] is zero, not a lowercase letter “o.” The term for residual magnetization is “remanence”; the adjective is “remanent”; do not write “remnance” or “remnant.” The word “micrometer” is preferred over “micron” when spelling out this unit of measure. A graph within a graph is an “inset,” not an “insert.” The word “alternatively” is preferred to the word “alternately” (unless you really mean something that alternates). Use the word “whereas” instead of “while” (unless you are referring to simultaneous events). Do not use the word “essentially” to mean “approximately” or “effectively.” Do not use the word “issue” as a euphemism for “problem.” When compositions are not specified, separate chemical symbols by en-dashes; for example, “NiMn” indicates the intermetallic compound Ni#sub[0.5]Mn#sub[0.5] whereas “Ni–Mn” indicates an alloy of some composition Ni#sub[x]Mn#sub[1-x].
Be aware of the different meanings of the homophones “affect” (usually a verb) and “effect” (usually a noun), “complement” and “compliment,” “discreet” and “discrete,” “principal” (e.g., “principal investigator”) and “principle” (e.g., “principle of measurement”). Do not confuse “imply” and “infer.”
Prefixes such as “non,” “sub,” “micro,” “multi,” and “ultra” are not independent words; they should be joined to the words they modify, usually without a hyphen. There is no period after the “et” in the abbreviation “et al.” The abbreviation “i.e.,” means “that is,” and the abbreviation “e.g.,” means “for example” (these abbreviations are not italicized).
= Conclusions
A conclusion section is not required, though it is preferred. Although a conclusion may review the main points of the paper, do not replicate the abstract as the conclusion. A conclusion might elaborate on the importance of the work or suggest applications and extensions. Note that the conclusion section is the last section of the paper that should be numbered. The appendix (if present), acknowledgment, and references should be listed without numbers.
= Appendix
An appendix, if needed, should appear before the acknowledgments.
= Acknowledgments
An Acknowledgments section, if used, immediately precedes the References. Sponsorship information and funding data are included here. The preferred spelling of the word “acknowledgment” in American English is without the “e” after the “g.” Avoid expressions such as “One of us (S.B.A.) would like to thank…” Instead, write “F. A. Author thanks…”
= References
The following pages are intended to provide examples of the different reference types. All references should be in 9-point font, with the first line flush left and reference numbers inserted in brackets. You are not required to indicate the type of reference; different types are shown here for illustrative purposes only. The DOI (digital object identifier) should be incorporated in every reference for which it is available (see Ref. 1 sample); for more information on DOIs, visit www.doi.org or www.crossref.org. @example @Vatistas1986 @tolkien54
https://github.com/ckunte/m-one | https://raw.githubusercontent.com/ckunte/m-one/master/inc/stormsafety.typ | typst | = Storm safety
Jacket resting on its bottom remains exposed to oncoming waves in its pre and partially piled states during installation. In these states, it is susceptible to sliding and overturning from environmental actions. It is deemed storm-safe when secured with sufficient number of piles to withstand installation wave environment.
There is a prevalent practice in the industry where jackets on-bottom are commonly assessed for stability by applying 1-year return period environment. This is because piling in well-understood soils takes days (not weeks or months) to complete. As a result, the probability (p) of encountering a design wave@faltinsen_1990 within piling duration is sufficiently low.
The probability of encountering a design wave during piling is given by:
$ p = 1 - e^(- L / T) $
where,
- L -- piling duration
- T -- return period of design wave
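The relation above can be sketched in a few lines of Python (a minimal example; the function and variable names are ours):

```python
import math

def encounter_probability(piling_days, return_period_days):
    """Probability of meeting the design wave at least once while
    the jacket sits on-bottom: p = 1 - exp(-L / T)."""
    return 1.0 - math.exp(-piling_days / return_period_days)

# 10 days of piling against a 1-year (365-day) return-period wave ...
p_1yr = encounter_probability(10, 365)
# ... versus the same duration against a 10-year design wave.
p_10yr = encounter_probability(10, 3650)
```

Designing for a longer return period (larger T) shrinks _p_ roughly in proportion, which is the lever discussed below.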
As @pew illustrates, increase in piling duration increases the chance of encountering a design wave non-linearly. And so the best way to lower _p_ is to design the jacket to withstand wave environment with higher return periods, especially if piling duration to achieve storm safety cannot be reduced.
We experienced this issue first-hand in 2019, as our piling durations were expected to be unconventionally long, and we were able to steer the criteria for design in time towards storm-safety.
#figure(
image("/img/ep.svg", width: 100%),
caption: [
The probability of encountering a design wave during piling
],
) <pew>
#v(1em)
#let stormsafety = read("/src/stormsafety.py")
#{linebreak();raw(stormsafety, lang: "python")}
$ - * - $
https://github.com/ticks-tan/wtu-typst | https://raw.githubusercontent.com/ticks-tan/wtu-typst/main/template/wtu-essays.typ | typst | #import "@preview/codelst:2.0.0": sourcecode
#import "@preview/tablem:0.1.0": tablem
// Word font-size names mapped to pt
#let FontSize = (
初号: 42pt,
小初: 36pt,
一号: 26pt,
小一: 24pt,
二号: 22pt,
小二: 18pt,
三号: 16pt,
小三: 15pt,
四号: 14pt,
中四: 13pt,
小四: 12pt,
五号: 10.5pt,
小五: 9pt,
六号: 7.5pt,
小六: 6.5pt,
七号: 5.5pt,
小七: 5pt,
)
// Fonts used in the template
#let Font = (
宋体: ("SimSun", "Times New Roman"),
仿宋: ("FangSong", "Times New Roman"),
黑体: ("SimHei", "Times New Roman"),
楷体: ("KaiTi", "Times New Roman"),
EN: ("Times New Roman"),
Code: ("Times New Roman", "SimSun"),
)
// Cover part
#let CoverPart = 1;
// Table-of-contents / abstract part
#let AbstractPart = 2;
// Main body part
#let ContentPart = 3;
#let partCounter = counter("part")
#let chapterCounter = counter("chapter")
// Appendix flag
#let appendixState = state("appendix", false)
// Code listing counter
#let codeCounter = counter(figure.where(kind: "code"))
// Image counter
#let imageCounter = counter(figure.where(kind: image))
// Table counter
#let tableCounter = counter(figure.where(kind: table))
// Equation counter
#let equationCounter = counter(math.equation)
/* ------------------------------------------------------- */
// Start the appendix
#let StartAppendix() = {
appendixState.update(true)
// reset chapter and heading counters
chapterCounter.update(0)
counter(heading).update(0)
}
// Bibliography settings (GB/T 7714-2015 numeric style)
#let GBTBib = arguments(title: "参考文献:", style: "gb-7714-2015-numeric")
// Code block figure
#let CodeBlock(code, caption: "") = {
figure(
sourcecode(code),
caption: caption, kind: "code", supplement: ""
)
}
// Markdown-style table
#let Table(content, caption: "") = {
figure(
tablem(content),
caption: caption
)
}
// Equation figure
#let MathBlock(content, caption: "") = {
figure(
content,
caption: caption,
kind: "equation",
supplement: ""
)
}
/* ----------------------------------------------------- */
// Convert a non-negative integer to a Chinese numeral
#let ChineseIntNumber(num) = if num <= 10 {
("零", "一", "二", "三", "四", "五", "六", "七", "八", "九", "十").at(num)
}else if num < 100 {
// num % 10
if calc.rem(num, 10) == 0 {
// floor the quotient
ChineseIntNumber(calc.floor(num / 10)) + "十"
} else if num < 20 {
"十" + ChineseIntNumber(calc.rem(num, 10))
} else {
ChineseIntNumber(calc.floor(num / 10)) + "十" + ChineseIntNumber(calc.rem(num, 10))
}
}
// Chinese variant of numbering
#let ChineseNumbering(..nums, location: none, bracket: false) = locate(loc => {
let realLoc = if location != none { location } else { loc }
if not appendixState.at(realLoc) {
// // top-level chapter number
// if nums.pos().len() == 1 {
// "第" + ChineseIntNumber(nums.pos().first()) + "章"
// } else {
// // numeric version
numbering(if bracket {"(1.1)"} else {"1.1"}, ..nums)
// }
} else {
if nums.pos().len() == 1 {
"附录 " + numbering("A.1", ..nums)
} else {
numbering(if bracket {"(A.1)"} else {"A.1"}, ..nums)
}
}
})
// Chinese table of contents
#let ChineseOutline(title: "目 录", depth: none, indent: false) = {
set par(first-line-indent: 0em, leading: 0em)
// outline title
align(center)[
#heading(numbering: none, outlined: false)[
#text(size: FontSize.三号, font: Font.黑体)[#title]
]
]
let CustomText(level, body) = {
if level == 1 {
text(size: FontSize.小三, font: Font.黑体, body)
} else if level == 2 {
text(size: FontSize.四号, font: Font.黑体, body)
} else {
text(size: FontSize.四号, font: Font.楷体, body)
}
}
locate(loc => {
// find all outlined headings
let elements = query(heading.where(outlined: true), loc)
for ele in elements {
let eleLoc = ele.location()
if partCounter.at(eleLoc).first() < AbstractPart and ele.numbering == none { continue }
if depth != none and ele.level > depth { continue }
let number = if ele.numbering != none {
if ele.numbering == ChineseNumbering {
ChineseNumbering(..counter(heading).at(eleLoc), location: eleLoc)
} else {
numbering(ele.numbering, ..counter(heading).at(eleLoc))
}
h(1em)
}
let line = {
if indent {
// horizontal indent
h(1em * (ele.level - 1))
}
// extra vertical space before level-1 entries; weak: collapses when consecutive
if ele.level == 1 {
v(0.5em, weak: true)
}
if number != none {
style(styles => {
let width = measure(number,styles).width
box(
width: width,
link(ele.location(), CustomText(ele.level, number))
)
})
}
link(eleLoc, CustomText(ele.level, ele.body))
if ele.level == 1 {
box(width: 1fr, h(1em) + box(width: 1fr) + h(1em))
} else {
box(width: 1fr, h(1em) + box(width: 1fr, repeat[.]) + h(1em))
}
let footer = query(selector(<__footer__>).after(eleLoc), eleLoc);
let page_number = if footer != () {
counter(page).at(footer.first().location()).first()
}else {
0
}
link(eleLoc, if ele.level == 1 {
strong(str(page_number))
} else {
str(page_number)
})
linebreak()
v(-0.5em)
}
line
}
})
}
/* ----------------------------------------------------- */
#let conf(
header: "2020届毕业设计论文",
zhTitle: "中文标题",
enTitle: "English Title",
colleges: "某学院",
major: "某专业",
class: "年级班级",
studentId: "学号",
zhAuthor: "张三",
enAuthor: "<NAME>",
teacher: "指导老师",
date: "某年某月某日",
zhAbstract: [],
zhKeywords: (),
enAbstract: [],
enKeywords: (),
lineSpacing: 1.5em,
outlineDepth: 3,
doc,
) = {
// page setup
set page(
paper: "a4",
// page header
header: locate(loc => {
set text(FontSize.五号, font: Font.宋体)
align(center, ("武汉纺织大学" + header))
v(-1em)
line(length: 100%)
}),
// page footer
footer: locate(loc => {
[
#set text(FontSize.小五)
#set align(center)
#if partCounter.at(loc).first() < AbstractPart /*or query(selector(heading).after(loc), loc).len() == 0*/ {
// Skip
} else {
let headers = query(selector(heading).before(loc), loc)
let part = partCounter.at(headers.last().location()).first()
[
#if part < ContentPart {
numbering("I", counter(page).at(loc).first())
} else {
str(counter(page).at(loc).first())
}
]
}
#label("__footer__")
]
}),
)
// style settings
set heading(numbering: ChineseNumbering)
set figure(
numbering: (..nums) => locate(loc => {
set text(font: Font.宋体, size: FontSize.五号)
if not appendixState.at(loc) {
numbering("1.1", chapterCounter.at(loc).first(), ..nums)
} else {
numbering("A.1", chapterCounter.at(loc).first(), ..nums)
}
})
)
set math.equation(
numbering: (..nums) => locate(loc => {
set text(font: Font.宋体, size: FontSize.五号)
if not appendixState.at(loc) {
numbering("(1.1)", chapterCounter.at(loc).first(), ..nums)
} else {
numbering("(A.1)", chapterCounter.at(loc).first(), ..nums)
}
})
)
// indent ordered and unordered lists by 2 characters
set list(indent: 2em)
set enum(indent: 2em)
show strong: it => text(font: Font.黑体, weight: "semibold", it.body)
show emph: it => text(font: Font.楷体, style: "italic", it.body)
show par: set block(spacing: lineSpacing)
show raw: set text(font: Font.Code)
// custom heading rendering
show heading: it => [
#set par(first-line-indent: 0em)
#let sizedheading(it, size, font) = [
#set text(size: size, font: font)
// one line of space before the heading
#v(1em)
#if it.numbering != none {
counter(heading).display()
// h(1em)
}
#text(size: size, font: font)[
#it.body
]
// one line of space after the heading
// #v(1em)
]
#if it.level == 1 {
locate(loc => {
if it.numbering != none and partCounter.at(loc).first() < ContentPart {
// entering the main body part
partCounter.update(ContentPart)
counter(page).update(1)
}
})
// restart chapter numbering
if it.numbering != none {
chapterCounter.step()
}
codeCounter.update(())
imageCounter.update(())
tableCounter.update(())
equationCounter.update(())
// set align(center)
sizedheading(it, FontSize.小三, Font.黑体)
} else {
if it.level == 2 {
sizedheading(it, FontSize.四号, Font.黑体)
} else if it.level == 3 {
sizedheading(it, FontSize.四号, Font.楷体)
} else {
sizedheading(it, FontSize.小四, Font.宋体)
}
}
]
show figure: it => [
#set align(center)
#if not it.has("kind") {
it
} else if it.kind == image {
it.body
[
#text(size: FontSize.五号, font: Font.宋体)[
#(it.caption)
]
]
} else if it.kind == table {
[
#text(size: FontSize.五号, font: Font.宋体)[
#(it.caption)
]
]
it.body
} else if it.kind == "code" {
[
#text(size: FontSize.五号, font: Font.宋体)[
代码#(it.caption)
]
]
it.body
}
]
show ref: it => {
if it.element == none {
it
} else {
h(0em, weak: true)
let item = it.element
let itLoc = item.location();
if item.func() == math.equation {
link(itLoc, [
式
#ChineseNumbering(
chapterCounter.at(itLoc).first(),
equationCounter.at(itLoc).first(),
location: itLoc,
bracket: true,
)
])
} else if item.func() == figure {
if item.kind == image {
link(itLoc, [
图
#ChineseNumbering(
chapterCounter.at(itLoc).first(),
imageCounter.at(itLoc).first(),
location: itLoc,
bracket: true,
)
])
} else if item.kind == table {
link(itLoc, [
表
#ChineseNumbering(
chapterCounter.at(itLoc).first(),
tableCounter.at(itLoc).first(),
location: itLoc,
bracket: true,
)
])
} else if item.kind == "code" {
link(itLoc, [
代码
#ChineseNumbering(
chapterCounter.at(itLoc).first(),
codeCounter.at(itLoc).first(),
location: itLoc,
bracket: true,
)
])
}
} else if item.func() == heading {
if item.level == 1 {
link(itLoc, ChineseNumbering(..counter(heading).at(itLoc), location: itLoc))
} else {
link(itLoc, [
节
#ChineseNumbering(..counter(heading).at(itLoc), location: itLoc)
])
}
}
h(0em, weak: true)
}
}
let FieldName(name) = [
#set align(right + top)
#strong(name)
]
let FieldValue(value) = [
#set align(center + horizon)
#set text(font: Font.楷体)
#grid(
rows: (auto, auto),
row-gutter: 0.2em,
value,
line(length: 100%)
)
]
// cover page
set text(size: FontSize.一号, font: Font.宋体, lang: "zh")
set align(center + top)
v(2em)
box(width: 100%)[
#grid(
columns: (1fr),
rows: (auto, auto),
gutter: 1em,
align(center)[#image("wtu_logo.png", height: 2.4em, fit: "contain")],
align(center)[#image("wtu_txt.png", height: 2.6em, fit: "contain")]
)
]
linebreak()
v(0.5em)
text(font: Font.仿宋)[#strong(header)]
set align(center + horizon)
set text(size: FontSize.三号)
v(60pt)
grid(
columns: (80pt, 280pt),
row-gutter: 1em,
FieldName(text("题") + h(2em) + text("目:")), FieldValue(zhTitle),
FieldName(text("学") + h(2em) + text("院:")), FieldValue(colleges),
FieldName(text("专") + h(2em) + text("业:")), FieldValue(major),
FieldName(text("年级班级:")), FieldValue(class),
FieldName(text("学") + h(2em) + text("号:")), FieldValue(studentId),
FieldName(text("姓") + h(2em) + text("名:")), FieldValue(zhAuthor),
FieldName(text("指导老师:")), FieldValue(teacher)
)
v(60pt)
text(size: FontSize.小二)[#date]
pagebreak(weak: true)
// abstract section
set align(left + top)
// Chinese abstract
partCounter.update(AbstractPart)
counter(page).update(1)
par(justify: true, first-line-indent: 2em, leading: lineSpacing)[
// center the abstract heading
#heading(numbering: none, outlined: false)[
#align(center)[
#text(size: FontSize.三号, font: Font.黑体, "摘 要")
]
]
#linebreak()
#par(
leading: 1.5em,
first-line-indent: 2em
)[
#text(size: FontSize.小四, font: Font.宋体)[
#zhAbstract
]
]
#v(2em)
#set par(first-line-indent: 0em)
#set text(size: FontSize.小四, font: Font.黑体)
关键词:
#text(size: FontSize.小四, font: Font.宋体)[
#zhKeywords.join(", ")
]
#v(2em)
]
pagebreak(weak: true)
// English abstract
par(justify: true, first-line-indent: 2em, leading: lineSpacing)[
#heading(numbering: none, outlined: false)[
#align(center)[
#text(size: FontSize.三号, font: Font.EN, weight: "bold", "Abstract")
]
]
#linebreak()
#par(
leading: 1.5em,
first-line-indent: 2em
)[
#text(size: FontSize.小四, font: Font.EN, weight: "regular")[
#enAbstract
]
]
#v(2em)
#set par(first-line-indent: 0em)
#text(size: FontSize.小四, font: Font.EN, weight: "bold", "Keywords: ")
#text(size: FontSize.小四, font: Font.EN)[
#enKeywords.join(", ")
]
#v(2em)
]
pagebreak(weak: true)
// outline section
ChineseOutline(depth: outlineDepth, indent: true)
pagebreak(weak: true)
set align(left + top)
par(justify: true, first-line-indent: 2em, leading: lineSpacing)[
#set text(size: FontSize.小四, font: Font.宋体)
#doc
]
partCounter.update(ContentPart)
// acknowledgments: TODO ...
}
https://github.com/shiki-01/typst | https://raw.githubusercontent.com/shiki-01/typst/main/lib/conf.typ | typst | #import "component/comment.typ": comment
#import "component/title.typ": head
#import "component/description.typ": description
#import "@preview/whalogen:0.1.0": ce as whalogen
#let come(title, type, body) =[ #comment(title, type, body) ]
#let desc(name, body) = [ #description(name, body) ]
#let ce(body) = [ $#whalogen(body)$ ]
#let light(body) = [
#highlight(
top-edge: "x-height",
fill: rgb("#ffff00"),
[*#body*]
)
]
#let conf(
title: none,
date: none,
doc,
) = {
set heading(numbering: "1.")
show heading: it => [
#if it.level < 2 {pad(top: 0pt,[])}
#pad(
bottom: -40pt,
[#it\ ]
)
#if it.level < 2 {line(length: 100%,stroke: rgb("#eee"))}
#pad(bottom: -20pt, [])
#if it.level >= 2 {
[#pad(bottom: 10pt,[])]
}
]
set text(
font: "M PLUS 1",
)
show raw.where(block: false): box.with(
fill: luma(240),
inset: (x: 3pt, y: 0pt),
outset: (y: 3pt),
radius: 2pt,
)
show raw.where(block: true): block.with(
width: 100%,
fill: luma(240),
stroke: 0.7pt + rgb("#dddddd"),
inset: 10pt,
radius: 4pt,
)
show raw: set text(
font: "M PLUS 1 Code",
)
set table(
inset: 10pt,
fill: (_, y) => if calc.even(y) and y != 0 { rgb("#f8f8f8") },
stroke: 0.7pt + rgb("#dddddd")
)
show table.cell.where(y:0): strong
import "@preview/quick-maths:0.1.0": shorthands
show: shorthands.with(
($+-$, $plus.minus$),
($|-$, math.tack),
($<=$, math.arrow.l.double) // replaces '<=' with a left double arrow (⇐)
)
head(title, date)[
#doc
]
}
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/visualize/image_02.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test all three fit modes.
#set page(height: 50pt, margin: 0pt)
#grid(
columns: (1fr, 1fr, 1fr),
rows: 100%,
gutter: 3pt,
image("/assets/files/tiger.jpg", width: 100%, height: 100%, fit: "contain"),
image("/assets/files/tiger.jpg", width: 100%, height: 100%, fit: "cover"),
image("/assets/files/monkey.svg", width: 100%, height: 100%, fit: "stretch"),
)
https://github.com/rxt1077/it610 | https://raw.githubusercontent.com/rxt1077/it610/master/markup/exercises/hello-k8s.typ | typst | #import "/templates/exercise.typ": exercise, code, admonition
#show: doc => exercise(
course-name: "Systems Administration",
exercise-name: "Hello Kubernetes",
doc,
)
== Goals
+ Start the built-in version of Kubernetes in Docker Desktop
+ Build a custom Docker image for a web app
+ Create a Kubernetes deployment for the web app
+ Create a Kubernetes service to access the web app
== Kubernetes in Docker Desktop
Docker Desktop now comes with a built-in version of Kubernetes.
To start it, make sure that you have it selected in the `Settings` control:
#align(center, rect(image("/images/docker-k8s.png", width: 75%)))
You can see if Kubernetes is running in the bottom left of Docker Desktop.
If you have trouble, you may have to click the `Reset Kubernetes Cluster` button.
== Building a Custom Docker Image
At this point, we've built custom Docker images multiple times so this should be familiar to you.
Our `Dockerfile` as well as other supporting files are in the `exercises/hello-k8s` directory of the class git repo.
#code([
```console
$ cd exercises/hello-k8s
exercises/hello-k8s$ ls
Dockerfile elf11.zip index.html
exercises/hello-k8s$ docker build -t elf:v1 . <1>
<snip>
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:01eee4b43dd28560b1a0f7dfa57957baa07772559065454f5f42a957d95b7031 0.0s
=> => naming to docker.io/library/elf 0.0s
```
], callouts: (
(<1>, [
Note that we're also giving this image a version tag (v1).
Kubernetes in Docker Desktop requires this to load a local image.
]),
))
== Creating a Kubernetes Deployment
Now we'll use kubectl to create a deployment in Kubernetes, check to make sure it's running, and see if it started any pods:
#code[
```console
$ kubectl create deployment hello-k8s --image=elf:v1
deployment.apps/hello-k8s created
$ kubectl get deployments
NAME READY UP-TO-DATE AVAILABLE AGE
hello-k8s 1/1 1 1 2m42s
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
hello-k8s-b9f677dd8-mvtts 1/1 Running 0 7s
```
]
Notice that our deployment, named `hello-k8s`, caused a pod to start running.
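For reference, the imperative `kubectl create deployment` command corresponds roughly to the following declarative manifest (a sketch: the label keys and container name mirror what `kubectl` generates by default — compare with `kubectl get -o yaml deployment hello-k8s`):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-k8s
  labels:
    app: hello-k8s
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hello-k8s
  template:
    metadata:
      labels:
        app: hello-k8s
    spec:
      containers:
      - name: elf          # kubectl names the container after the image
        image: elf:v1
```

Saved as `hello-k8s.yaml`, running `kubectl apply -f hello-k8s.yaml` would create the same deployment.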
== Creating a Kubernetes Service
We have a deployment running and that deployment has started a pod to run our Docker image.
If we want to be able to connect to the pod, we will need to set up a `service`.
Let's use the `kubectl expose` command to make that service:
#code[
```console
$ kubectl expose deployment hello-k8s --type=LoadBalancer --port=8000
service/hello-k8s exposed
$ kubectl get services
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
hello-k8s LoadBalancer 10.100.129.66 localhost 8000:30246/TCP 36s
kubernetes ClusterIP 10.96.0.1 <none> 443/TCP 100m
```
]
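Likewise, the `kubectl expose` command above is roughly equivalent to this Service manifest (a sketch; the selector reuses the deployment's `app` label):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: hello-k8s
spec:
  type: LoadBalancer
  selector:
    app: hello-k8s
  ports:
  - port: 8000        # port exposed by the service
    targetPort: 8000  # port the web app listens on inside the pod
```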
== Deliverables
Now that you have your web app up and running, open a web browser and go to #link("http://localhost:8000").
All you have to submit in the textbox for this assignment is the _name_ of the game you're playing!
== Wrapping Up
Now by running these `kubectl` commands you've created a few objects with default values.
You can actually view _all_ the options for the things that were created with the `get -o yaml` command which literally means print out the YAML that defines that object.
Here's some commands you can try:
#code[
```console
$ kubectl get -o yaml deployment hello-k8s
apiVersion: apps/v1
kind: Deployment
metadata:
annotations:
deployment.kubernetes.io/revision: "1"
creationTimestamp: "2024-10-21T18:49:27Z"
<snip>
$ kubectl get -o yaml service hello-k8s
apiVersion: v1
kind: Service
metadata:
creationTimestamp: "2024-10-21T20:27:23Z"
labels:
app: hello-k8s
name: hello-k8s
<snip>
```
]
As you can see there are a lot of options for these objects!
To delete the `Deployment` and the `Service`, which will delete the one `Pod` created, use the following commands:
#code[
```console
$ kubectl delete deployment hello-k8s
deployment.apps "hello-k8s" deleted
$ kubectl delete service hello-k8s
service "hello-k8s" deleted
```
]
https://github.com/arthurcadore/eng-telecom-workbook | https://raw.githubusercontent.com/arthurcadore/eng-telecom-workbook/main/semester-8/COM2/homework1/homework.typ | typst | MIT License | #import "@preview/klaro-ifsc-sj:0.1.0": report
#import "@preview/codelst:2.0.1": sourcecode
#show heading: set block(below: 1.5em)
#show par: set block(spacing: 1.5em)
#set text(font: "Arial", size: 12pt)
#set text(lang: "pt")
#set page(
footer: "Telecommunications Engineering - IFSC-SJ",
)
#show: doc => report(
title: "Antenna Diversity",
subtitle: "Communication Systems II",
authors: ("<NAME>",),
date: "October 14, 2024",
doc,
)
= Introduction
The purpose of this document is to present the antenna diversity techniques used in wireless communication systems. Antenna diversity is a technique that improves the quality of the signal sent by the transmitter, reducing the bit error probability and increasing the efficiency of the communication link.
= Antenna Diversity Techniques
== Maximum Ratio Combining (MRC)
MRC is a signal-combining technique that uses weights proportional to the power of the signal received on each antenna in order to maximize the SNR (Signal-to-Noise Ratio).
With an SNR higher than that of a single-antenna system, MRC improves the quality of the received signal and reduces the bit error probability. MRC consists of the following steps:
=== Reception:
The first step of MRC is the reception of the signals transmitted to the antennas. Here we consider that the signal received on each antenna consists of the desired signal plus additive noise.
Each signal corresponds to one antenna; that is, with 2 antennas there are 2 received signals.
=== Combining Gain:
Each received signal is assigned a gain, computed as the complex conjugate of the channel coefficient of its antenna, so that:
$
alpha_i = h_i^\*
$
Where:
- $alpha_i$ is the gain applied to the signal received on antenna i
- $h_i$ is the channel coefficient of antenna i
=== Signal Combining:
The combined value of all received signals is obtained by summing all the weighted input signals:
$
y = sum^N_(i=1) alpha_i x_i
$
Where:
- $y$ is the combined signal
- $N$ is the number of antennas
- $alpha_i$ is the gain applied to the signal received on antenna i
- $x_i$ is the signal received on antenna i
=== Resulting SNR:
The total SNR can be obtained by summing the individual SNRs of the branches:
$
"SNR"_("total") = sum^N_(i=1) "SNR"_i
$
Where:
- $"SNR"_("total")$ is the signal-to-noise ratio of the combined signal
- $"SNR"_i$ is the signal-to-noise ratio of the signal received on antenna i
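The weighting and combining steps can be sketched with plain complex arithmetic (a minimal, noiseless illustration; all names and values are ours):

```python
def mrc_combine(h, x):
    """Maximum ratio combining: weight each branch by the conjugate
    of its channel coefficient and sum the results."""
    return sum(hi.conjugate() * xi for hi, xi in zip(h, x))

# Noiseless check: if x_i = h_i * s, then y = s * sum(|h_i|^2),
# i.e. the branches add coherently.
h = [0.8 + 0.3j, 0.2 - 0.9j]          # channel coefficients (assumed values)
s = 1 - 1j                            # transmitted symbol
x = [hi * s for hi in h]              # received branch signals, no noise
y = mrc_combine(h, x)
gain = sum(abs(hi) ** 2 for hi in h)  # coherent channel gain
```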
== Selection Combining (SC)
SC is an antenna diversity technique that selects the signal with the highest SNR and uses only that branch for actual signal reception.
Thus, with 3 receive chains, for example, SC selects the signal with the best quality (highest SNR) among the 3 antennas to be used when interpreting the transmitted data.
=== Reception:
The first step of SC is the reception of the signals transmitted to the antennas. Here we consider that the signal received on each antenna consists of the desired signal plus additive noise.
=== Quality Evaluation:
To evaluate the quality of each received signal, its SNR is computed by simply dividing the signal power by the additive noise power:
$
"SNR"_i = (P_s)/(P_n)
$
Or by subtracting the two, when the SNR is given in dB:
$
"SNR"_i ("dB") = P_s - P_n
$
=== Signal Selection:
The signal used for sampling, quantization, and interpretation of the data is the one with the highest SNR among all received signals:
$
y = max("SNR"_i)
$
=== Resulting SNR:
Finally, the total SNR of the system is given by the SNR of the selected signal:
$
"SNR"_("total") = "SNR"_("max")
$
== Equal Gain Combining (EGC)
EGC is an antenna diversity technique that combines the signals received on all antennas with the same gain, i.e., without weighting each branch in proportion to its received signal power. This technique still increases the SNR of the received signal relative to a single-antenna system.
=== Reception:
The first step of EGC is the reception of the signals transmitted to the antennas. As before, the signal received on each antenna consists of the desired signal plus additive noise.
=== Signal Combining:
The combined value of all received signals is obtained by summing the input signals with equal weights:
$
y = sum^N_(i=1) x_i
$
At this point it is essential that the input signals be in phase; otherwise the combination may cancel signal components, reducing the SNR of the combined signal instead of increasing it and degrading the received signal quality.
=== Resulting SNR:
For branches with equal noise power, the SNR after co-phased equal-gain combining is
$
"SNR"_("total") = 1/N (sum^N_(i=1) sqrt("SNR"_i))^2
$
This is always greater than the SNR of any single branch (though slightly below the MRC sum), so the quality of the received signal improves.
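The output SNRs of the three combiners can be compared numerically (a sketch with names of our choosing; for co-phased EGC with equal branch noise powers a common closed form is (1/N)·(Σ√SNR_i)², slightly below the MRC sum):

```python
def mrc_snr(gammas):
    """MRC output SNR: the sum of the branch SNRs (linear scale)."""
    return sum(gammas)

def sc_snr(gammas):
    """Selection combining: use only the best branch."""
    return max(gammas)

def egc_snr(gammas):
    """Co-phased equal-gain combining with equal branch noise powers."""
    n = len(gammas)
    return sum(g ** 0.5 for g in gammas) ** 2 / n

branch_snrs = [4.0, 1.0]  # linear (not dB) branch SNRs, assumed values
# MRC: 5.0, SC: 4.0, EGC: (2 + 1)^2 / 2 = 4.5
```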
== Alamouti
Alamouti is a transmit diversity technique that uses space-time block coding to transmit two symbols over two consecutive time slots.
The technique is based on transmitting the two symbols on different antennas and then, in the following slot, transmitting their conjugates.
The receive antennas capture the transmitted signals and, through a combining step, are able to recover the original transmitted symbols, forming a kind of "crossing" between the transmitted and received signals.
=== Transmission:
Suppose there are two symbols to be transmitted, $x_1$ and $x_2$. Alamouti transmits them over two (or more) different antennas, operating differently in different time slots. For example:
==== Slot t=1:
In this time slot:
- $A_1$ transmits $x_1$
- $A_2$ transmits $x_2$
==== Slot t=2 (the next slot):
Now the transmission uses the conjugates of the symbols sent in the previous slot, i.e.:
- $A_1$ transmits $-x_2^*$
- $A_2$ transmits $x_1^*$
The process repeats for every message sent by the transmitter using the Alamouti scheme; each pair of symbols occupies two time slots, providing the space-time redundancy needed to recover the original signal.
=== Reception:
During reception, the receive antennas observe the signals sent in slots t=1 and t=2 and, through a combining step, recover the originally transmitted symbols:
$
X_1 = h_1 x_1 + h_2 x_2 + n_1
$
$
X_2 = h_1 (-x_2^*) + h_2 x_1^* + n_2
$
Where:
- $X_1$ and $X_2$ are the received signals
- $x_1$ and $x_2$ are the original transmitted symbols
- $h_1$ and $h_2$ are the channel coefficients seen by the receiver
- $n_1$ and $n_2$ are the additive noise terms
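The two-slot scheme can be checked end to end in a few lines (a noiseless sketch; the decoder expressions are the standard Alamouti combiner, and all names and values are ours):

```python
def alamouti_tx(x1, x2):
    """Symbols sent by (A1, A2) in slots t=1 and t=2."""
    return (x1, x2), (-x2.conjugate(), x1.conjugate())

def alamouti_rx(r1, r2, h1, h2):
    """Combine the two received slots; outputs come scaled by |h1|^2 + |h2|^2."""
    x1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    x2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    return x1_hat, x2_hat

h1, h2 = 0.6 + 0.8j, 1.1 - 0.2j    # channel coefficients (assumed values)
x1, x2 = 1 + 1j, -1 + 1j           # transmitted symbols
(s11, s21), (s12, s22) = alamouti_tx(x1, x2)
r1 = h1 * s11 + h2 * s21           # received in slot t=1 (no noise)
r2 = h1 * s12 + h2 * s22           # received in slot t=2 (no noise)
g = abs(h1) ** 2 + abs(h2) ** 2    # combined channel gain
x1_hat, x2_hat = alamouti_rx(r1, r2, h1, h2)
```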
= Conclusion:
Antenna diversity techniques are fundamental to improving the quality of the received signal in wireless communication systems. MRC, SC, EGC, and Alamouti are examples of techniques that exploit antenna diversity to increase the SNR of the received signal, reducing the bit error probability and improving communication quality.
= References
- #link("https://www.researchgate.net/profile/Mohamed_Mourad_Lafifi/post/if_anyone_can_support_with_matlab_code_to_plot_the_CDF_of_SINR_in_massive_MIMO/attachment/59d64a8479197b80779a4cc7/AS%3A475667000238080%401490419258391/download/SampleChapters+Wireless+Communications.pdf", "<NAME>. (2005). Wireless Communications. Cambridge University Press.")
- #link("https://ieeexplore.ieee.org/document/730453", "<NAME>. (1998). A simple transmit diversity technique for wireless communications. IEEE Journal on Selected Areas in Communications, 16(8), 1451-1458.")