https://github.com/maucejo/elsearticle
https://raw.githubusercontent.com/maucejo/elsearticle/main/README.md
markdown
MIT License
# Elsearticle template [![Generic badge](https://img.shields.io/badge/Version-0.3.0-cornflowerblue.svg)]() [![MIT License](https://img.shields.io/badge/License-MIT-forestgreen)](https://github.com/maucejo/elsearticle/blob/main/LICENSE) [![User Manual](https://img.shields.io/badge/doc-.pdf-mediumpurple)](https://github.com/maucejo/elsearticle/blob/main/docs/manual.pdf) `elsearticle` is a Typst template that aims to mimic the Elsevier article LaTeX class, a.k.a. `elsearticle.cls`, provided by Elsevier to format manuscripts properly for submission to their journals. ## Basic usage This section provides the minimal amount of information needed to get started with the template. For more detailed information, see the [manual](https://github.com/maucejo/elsearticle/blob/main/docs/manual.pdf). To use the `elsearticle` template, include the following line at the beginning of your `.typ` file: ```typ #import "@preview/elsearticle:0.3.0": * ``` ### Initializing the template After importing `elsearticle`, you have to initialize the template via a show rule with the `#elsearticle()` command. This function takes the following optional named arguments: * `title`: Title of the paper * `author`: List of the authors of the paper * `abstract`: Abstract of the paper * `journal`: Name of the journal * `keywords`: List of keywords of the paper * `format`: Format of the paper. Possible values are `preprint`, `review`, `1p`, `3p`, and `5p` * `numcol`: Number of columns of the paper. Possible values are 1 and 2 * `line-numbering`: Enable line numbering. Possible values are `true` and `false` ## Additional features The `elsearticle` template provides additional features to help you format your document properly. 
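For reference, a complete initialization using the options listed above under Basic usage might look as follows. This is a hypothetical sketch: all values are invented, and the structure of each `author` entry (`name`, `affiliation`) is an assumption; check the manual for the exact fields.

```typ
#import "@preview/elsearticle:0.3.0": *

// Hypothetical values; the exact author-entry keys may differ, see the manual.
#show: elsearticle.with(
  title: "An Example Title",
  author: (
    (name: "J. Doe", affiliation: "Example University"),
  ),
  abstract: [A short abstract of the paper.],
  journal: "Journal of Examples",
  keywords: ("Typst", "template"),
  format: "preprint",
  numcol: 1,
  line-numbering: true,
)
```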
### Appendix To activate the appendix environment, all you have to do is place the following command in your document: ```typ #show: appendix // Appendix content here ``` ### Subfigures Subfigures are not a built-in feature of Typst, but the `elsearticle` template provides a way to handle them. It is based on the `subpar` package, which allows you to create subfigures and properly reference them. ```typ #subfigure( figure(image("image1.png"), caption: []), <figa>, figure(image("image2.png"), caption: []), <figb>, columns: (1fr, 1fr), caption: [(a) Left image and (b) Right image], label: <fig> ) ``` ### Equations The `elsearticle` template provides the `#nonumeq()` function to create unnumbered equations. It can be used as follows: ```typ #nonumeq[$ y = f(x) $ ] ``` ## Roadmap *Article format* - [x] Preprint - [x] Review - [x] 1p - [x] 3p - [x] 5p *Environment* - [x] Implementation of the `appendix` environment *Figures and tables* - [x] Implementation of the `subfigure` environment *Equations* - [x] Proper referencing of equations w.r.t. the context - [ ] Numbering each equation of a system as "(1a)" -- _Ongoing discussions at the Typst dev level_ *Other features* - [x] Line numbering -- Use the built-in `par.line` function available from Typst v0.12 ## License MIT licensed Copyright (C) 2024 <NAME> (maucejo)
https://github.com/mem-courses/calculus
https://raw.githubusercontent.com/mem-courses/calculus/main/note-1/3.导数与微分.typ
typst
#import "../template.typ": * #show: project.with( course: "Calculus I", course_fullname: "Calculus (A) I", course_code: "821T0150", semester: "Autumn-Winter 2023", title: "Note #3: Derivatives and Differentials", authors: ( ( name: "memset0", email: "<EMAIL>", id: "3230104585", ), ), date: "October 31, 2023", ) #let def(x) = text("【" + x + "】", weight: "bold") #let deft(x) = text("【" + x + "】", weight: "bold", fill: rgb("#FFFFFF")) = Derivatives == Definition of the derivative #definition[ Let $f(x)$ be defined in a neighborhood of the point $x_0$. If $display(lim_(x->x_0)frac(f(x)-f(x_0),x-x_0) = lim_(Delta x->0)frac(Delta y,Delta x))$ exists, then $f(x)$ is said to be differentiable at $x_0$, and this limit is called the #bb[derivative] of $y=f(x)$ at the point $x_0$. ] #definition[ If a function is differentiable at every point of an open interval $I$, it is said to be differentiable on $I$. The new function formed by the derivative values is called the #bb[derivative function]. ] #theorem[ The function $y=f(x)$ is differentiable at the point $x_0$ if and only if the left derivative $f'_-(x_0)$ and the right derivative $f'_+(x_0)$ both exist and are equal. ] #theorem[ Differentiability implies continuity, but continuity does not imply differentiability. (For example, $y=abs(x)$ is continuous at $x=0$ but not differentiable there.) ] == Definition of higher-order derivatives #definition[ If the derivative $y'=f'(x)$ of the function $y=f(x)$ is itself differentiable, then the derivative of $f'(x)$ is called the #bb[second derivative] of $f(x)$, written $y''$ or $display(ddy/(dx^2))$. The #bb[$n$-th derivative] of $y=f(x)$ is written $y^((n))$ or $display(dny/(dx^n))$. ] == Differentiation === Derivative formulas for the basic elementary functions $ & (C)'=0 quad quad quad quad quad quad quad quad quad quad && (x^mu)' = mu x^(mu - 1)\ & (sin x)' = cos x && (cos x)' = -sin x\ & (tan x)' = sec^2 x && (cot x)' = -csc^2 x\ & (sec x)' = sec x tan x && (csc x)' = -csc x cot x\ & (a^x)' = a^x ln a && (e^x)' = e^x\ & (log_a x)' = display(1/(x ln a)) && (ln x)' = display(1/x)\ & (arcsin x)' = display(1/sqrt(1-x^2)) && (arccos x)' = -display(1/sqrt(1-x^2))\ & (arctan x)' = display(1/(1+x^2)) && (arccot x)' = -display(1/(1+x^2))\ $ #theorem[ Elementary functions are differentiable on their intervals of definition, and their derivatives are again elementary functions. #proof[ From the constructive definition and the two important limits one obtains $(C)'=0$, $(sin x)' = cos x$, $(ln x)' = 1/x$; the differentiation rules then yield the formulas for the remaining basic elementary functions. ] ] === Arithmetic rules of differentiation #theorem[ The functions $u(x)$ and $v(x)$ both have derivatives at $x$ $=>$ the sum, difference, product, and quotient (except where $v(x)=0$) of $u(x)$ and $v(x)$ are differentiable at the point $x$, and: 1. $[u(x) pm v(x)]' = u'(x) pm v'(x)$ 2. $[u(x) v(x)]' = u'(x) v(x) + u(x) v'(x)$ 3. 
$display([u(x)/v(x)]' = (u'(x) v(x) - u(x) v'(x))/(v^2(x))) quad (v(x) != 0)$ #proof[ Proof of the product rule: let $f(x) = u(x) v(x)$; then $ f'(x) &= lim_(h->0)(f(x+h)-f(x)) / h = lim_(h->0)(u(x+h) v(x+h) - u(x) v(x)) / h\ &= lim_(h->0) [frac(u(x+h)-u(x),h) v(x+h) + u(x) frac(v(x+h) - v(x), h)]\ &= u'(x) v(x) + u(x) v'(x)\ $ Proof of the quotient rule: let $f(x) = display(u(x)/v(x))$; then $ f'(x) &= lim_(h->0)(f(x+h) - f(x)) / h = lim_(h->0)(frac(u(x+h),v(x+h))-frac(u(x),v(x))) / (h)\ &= lim_(h->0) [frac( (u(x+h)-u(x))/h v(x) - u(x) (v(x+h) - v(x))/h ,v(x+h) v(x))]\ &= (u'(x) v(x) - u(x) v'(x)) / (v^2(x)) $ ] ] === The inverse function rule #theorem[ Let $y=f(x)$ be the inverse of $x=f^(-1)(y)$, where $f^(-1)(y)$ is monotonic and differentiable in some neighborhood of $y$ and $[f^(-1)(y)]' != 0$. Then: $ f'(x) = 1 / ([f^(-1)(y)]') sp "or, equivalently," sp dy / dx = 1 / (sp dx / dy sp) $ #example[ 1. Let $y = arcsin x$; then $x = sin y,sp y in (display(-pi/2\,pi/2))$, so: $ (arcsin x)' &= 1 / ((sin y)') = 1 / (cos y) = 1 / (sqrt(1 - sin^2 y)) = 1 / sqrt(1-x^2)\ (arccos x)' &= (pi / 2 - arcsin x)' = -1 / sqrt(1-x^2)\ (arctan x)' &= 1 / ((tan y)') = cos^2 y = 1 / (1 + tan^2 y) = 1 / (1+x^2)\ (arccot x)' &= (pi / 2 - arctan x)' = -1 / (1+x^2)\ $ 2. 
Let $y=a^x$; then $x = log_a y,sp y in (0,+oo)$, so: $ (a^x)' = 1 / ((log_a y)') = y ln a = a^x ln a $ ] ] === The chain rule #theorem[ If $u=g(x)$ is differentiable at the point $x$ and $y=f(u)$ is differentiable at the point $u=g(x)$ $=>$ the composite function $y=f[g(x)]$ is differentiable at the point $x$, and $display(dy/dx = f'(u) g'(x))$. #note[For multiply nested composite functions: differentiate layer by layer, from the outside in.] #proof[ $y=f(u)$ is differentiable at the point $u$ $=>$ $display(lim_(Du -> 0) Dy/Du = f'(u))$ $=>$ $Dy = f'(u) Du + alpha Du$ (where $alpha->0$ as $Du->0$) $=>$ $display(Dy / Dx = f'(u) Du/Dx + alpha Du/Dx) quad (Delta x!=0)$ Since $u=g(x)$ is differentiable at the point $x$, it is continuous at $x$ $=>$ as $Dx->0$ we have $Du->0$, and hence also $alpha->0$. $=> display(dy/dx = lim_(Dx->0) Dy/Dx = lim_(Dx->0) (f'(u) Du/Dx + alpha Du/Dx) = f'(u) g'(x))$ One could state this in the form $display(Dy/Dx = Dy/Du dot Du/Dx)$, but that is not rigorous ($Du$ may be $0$). ] ] = Differentials #definition[ Let $y=f(x)$ be defined in some neighborhood $U(x)$ of $x$. If $Dy=f(x+Dx)-f(x)$ can be written as $ Dy=A Dx+o(Dx) quad (Dx->0) $ where $A$ does not depend on $Dx$, then $y=f(x)$ is said to be differentiable (in the sense of differentials) at the point $x$. $A Dx$ is the #bb[linear principal part] of $Dy$, and it is called the #bb[differential] of $y=f(x)$ at the point $x$, written $dy=A Dx$. ] == Relation between the differential and the derivative #theorem[ $y=f(x)$ has a differential at the point $x_0$ *if and only if* $y=f(x)$ has a derivative at $x_0$, in which case $A=f'(x_0)$, i.e. $ dy=f'(x)dx <==> dy / dx=f'(x) $ Hence the derivative is also called the differential quotient: $display(dy/dx)$ may be viewed as the quotient of $dy$ by $dx$. #caution[This conclusion does not hold for higher-order derivatives.] #proof[ (Sufficiency) Given that $y=f(x)$ has a derivative at $x_0$: $ lim_(x->x_0) Dy / Dx=f'(x_0) => Dy / Dx=f'(x_0)+alpha $ with $display(lim_(Dx->0) alpha=0)$, hence $ Dy = f'(x_0) Dx+alpha Dx=f'(x_0) Dx+ o(Dx) $ (Necessity) Given that $y=f(x)$ has a differential at $x_0$: $ Dy=f(x_0+Dx)-f(x_0) = A Dx+o(Dx)\ => lim_(Dx->0) Dy / Dx=lim_(Dx->0) (A+(o(Dx)) / Dx) = A $ Hence $f(x)$ has a derivative at $x_0$ and $f'(x_0) = A$. ] ] = Techniques and Applications of Differentiation == Computing higher-order derivatives === Common higher-order derivative formulas $ & (1 / (a+x))^((n)) = (-1)^n (n!) / (a+x)^(n+1) quad quad quad quad && (1 / (a-x))^((n)) = (n!) / (a-x)^(n+1)\ $ === The Leibniz formula If the functions $u=u(x)$ and $v=v(x)$ both have $n$-th derivatives, then: $ (u v)^((n)) = & u^((n)) v + n u^((n-1)) v' + (n(n-1)) / 2 u^((n-2)) v'' + dots.c + binom(n,k) u^((n-k)) v^((k)) + dots.c + u v^((n)) $ === The Taylor formula method The Taylor formula of $f(x)$ at $x=0$ is: $ f(x) = f(0) + f'(0) x + (f''(0)) / (2!) x^2 + dots.c + (f^((n))(0)) / (n!) x^n + R_n(x) $ Setting $display((f^((n))(0))/(n!))$ equal to the coefficient of the $x^n$ term of the Maclaurin expansion then yields $f^((n)) (0)$. == Derivatives of parametric curves If the rectangular coordinates $(x,y)$ of every point on a curve can each be expressed as a function of a third variable $t$, i.e. $display(cases(x=phi(t),y=psi(t)))$, then the following hold. 1. If $x=phi(t)$ and $y=psi(t)$ are differentiable and $phi'(t)!=0$, then: $ dy / dx = (dif psi(t)) / (dif phi(t)) = (psi'(t) dt) / (phi'(t) dt) = (psi'(t)) / (phi'(t)) $ 2. If $x=phi(t)$ and $y=psi(t)$ are twice differentiable and $phi'(t)!=0$, then: $ ddy / (dx^2) = (dif (display(dy/dx))) / dx = (dif (display((psi'(t))/(phi'(t))))) / (dif phi(t)) = display((psi''(t)phi'(t) - psi'(t)phi''(t))/((phi'(t))^2)dt) / (phi'(t)dt) = (psi''(t)phi'(t) - psi'(t)phi''(t)) / ((phi'(t))^3) $ #warning[Note that unlike $display(dy/dx)$, the symbol $display(ddy/(dx^2))$ must be read as a whole; reading it as $(dx)^2$ and cancelling directly against other differentials may lead to errors.] == Derivatives in polar coordinates The rectangular coordinates $(x,y)$ and the polar coordinates $(r,theta)$ of the same point satisfy: $ x=r cos theta, quad y=r sin theta, quad r^2 = x^2+y^2, quad tan theta=y / x sp (x!=0) $ #example[ #problem[ Find the equation of the normal line to the curve $r=a sin 2 theta$ ($a$ a constant) at $theta=display(pi/4)$. ] #solution[ Put $display(cases( x=r cos theta=a sin 2theta cos theta=2a sin theta cos^2 theta, y=r sin theta=a sin 2theta sin theta=2a sin^2 theta cos theta, ))$, so $ dy / dx = display(dy\/dif theta) / display(dx\/dif theta) = display(4a sin theta cos^2 theta - 2a sin^3 theta) / display(2a cos^3 theta - 4a sin^2 theta cos theta) $ Substituting $display(theta=pi/4)$ gives $display(dy/dx = -1)$, so the normal line is $display(y-sqrt(2)/2 a = x-sqrt(2)/2 a) => y=x$. ] ]
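As an added worked example of the Leibniz formula from the notes above (my own illustration, written in the notes' `#example` style): take $u = e^x$ and $v = x^2$, so that all derivatives of $v$ beyond the second vanish and the sum collapses to three terms.

```typ
#example[
  Compute $(x^2 e^x)^((n))$. Take $u = e^x$, $v = x^2$; since $v' = 2x$, $v'' = 2$,
  and $v^((k)) = 0$ for $k >= 3$, the Leibniz formula leaves only three terms:
  $ (x^2 e^x)^((n)) = e^x x^2 + n e^x dot 2x + (n(n-1))/2 e^x dot 2
    = e^x (x^2 + 2n x + n(n-1)). $
]
```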
https://github.com/jomaway/typst-bytefield
https://raw.githubusercontent.com/jomaway/typst-bytefield/main/lib/types.typ
typst
MIT License
// bf-field #let bf-field(type, index, data: none) = ( bf-type: "bf-field", field-type: type, field-index: index, data: data, ) // data-field holds information about a field inside the main grid. #let data-field(index, size, start, end, label, format: none) = { bf-field("data-field", index, data: ( size: size, range: (start: start, end: end), label: label, format: format, ) ) } // note-field holds information about a field outside (left or right of) the main grid. #let note-field(index, side, level:0, label, format: none, row: auto, rowspan: 1) = { bf-field("note-field", index, data: ( side: side, row: row, rowspan: rowspan, level: level, label: label, format: format, // TODO ) ) } // header-field holds information about a complete header row. Usually this is the top-level header. #let header-field(start: auto, end: auto, autofill: auto, numbers: (), labels: (:), ..args) = { // header-field must have index 0. bf-field("header-field", none, data: ( // This is at the moment always 0 - (bpr), but in the future there might be header fields between data rows. range: (start: start, end: end), // Defines which numbers should be shown. Possibly none or an array of numbers. numbers: numbers, // Defines which labels should be shown. Dict of number and content. labels: labels, // Defines which numbers should be calculated automatically autofill: autofill, // Defines the format of the bitheader. format: ( // Defines the angle of the labels angle: args.named().at("angle", default: -60deg), // Defines the text-size for both numbers and labels. text-size: args.named().at("text-size",default: auto), // Defines if a marker should be shown marker: args.named().at("marker", default: true), // Defines the background color. fill: args.named().at("fill", default: none), // Defines the border stroke. stroke: args.named().at("stroke", default: none), ) ) ) } // bf-cell holds all information which is necessary for cell positioning inside the table. 
#let bf-cell(type, grid: center, x: auto, y: auto, colspan: 1, rowspan: 1, label: none, cell-index: (auto, auto), format: auto) = ( bf-type: "bf-cell", cell-type: type, // cell index is a tuple (field-index, slice-index) cell-index: cell-index, // position specifies the grid and the position inside the grid. position: ( grid: grid, x: x, y: y, ), span: ( rows: rowspan, cols: colspan, ), // The text which will be shown. label: label, // Specified format for the label, like fill, stroke, align, inset, ... format: format, )
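To illustrate how these constructors are meant to compose, here is a hypothetical sketch. The import path and the `"data-cell"` type string are assumptions not taken from this file; only the argument lists match the definitions above.

```typ
#import "types.typ": data-field, bf-cell

// A 16-bit data field labelled "length", occupying bits 0..15.
#let len-field = data-field(0, 16, 0, 15, [length])

// A cell for that field, placed in the main grid at column 0, row 0,
// spanning all 16 columns of a 16-bit-per-row layout.
#let len-cell = bf-cell(
  "data-cell",
  x: 0, y: 0,
  colspan: 16,
  label: [length],
  cell-index: (0, 0),
)
```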
https://github.com/skyzh/chicv
https://raw.githubusercontent.com/skyzh/chicv/master/lib.typ
typst
Creative Commons Zero v1.0 Universal
// This file is intentionally left blank because // the template uses the default Typst configuration and does // not add any new directives.
https://github.com/ShapeLayer/ucpc-solutions__typst
https://raw.githubusercontent.com/ShapeLayer/ucpc-solutions__typst/main/lib/utils/make-problem.typ
typst
Other
#import "/lib/i18n.typ": en-us #import "/lib/colors.typ": color #import "/lib/utils/make-prob-meta.typ": make-prob-meta #let make-problem( id: none, title: none, tags: (), difficulty: none, authors: (), stat-open: ( submit-count: -1, ac-count: -1, ac-ratio: -1, first-solver: none, first-solve-time: -1, ), stat-onsite: none, pallete: ( primary: color.bluegray.at(2), secondary: white, ), i18n: en-us.make-problem, body ) = { set page(margin: (top: 3em)) set text(size: 20pt) [= #text(fill: pallete.primary, size: 1.35em)[#id. #title]] [\ ] make-prob-meta( tags: tags, difficulty: difficulty, authors: authors, stat-open: stat-open, stat-onsite: stat-onsite, i18n: i18n.make-prob-meta ) set page( header: text( size: .8em )[ #text(fill: pallete.primary)[ #text(weight: "bold")[#id\.] #title ] ] ) set align(horizon) set list(marker: [#text(font: "inter", fill: pallete.primary)[✓]]) body }
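A hypothetical call might look as follows. All values are invented for illustration; the argument names and the `stat-open` keys follow the signature above, and the trailing content block supplies the `body` parameter.

```typ
#import "/lib/utils/make-problem.typ": make-problem

#make-problem(
  id: "A",
  title: "Hello, Judge!",
  tags: ("implementation",),
  difficulty: "Easy",
  authors: ("problem setter",),
  stat-open: (
    submit-count: 120,
    ac-count: 60,
    ac-ratio: 50,
    first-solver: "team42",
    first-solve-time: 17,
  ),
)[
  Print `Hello, Judge!` to standard output.
]
```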
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/layout/block-sizing.typ
typst
Apache License 2.0
// Test blocks with fixed height. --- #set page(height: 100pt) #set align(center) #lorem(10) #block(width: 80%, height: 60pt, fill: aqua) #lorem(6) #block( breakable: false, width: 100%, inset: 4pt, fill: aqua, lorem(8) + colbreak(), ) --- // Layout inside a block with certain dimensions should provide those dimensions. #set page(height: 120pt) #block(width: 60pt, height: 80pt, layout(size => [ This block has a width of #size.width and height of #size.height ]))
https://github.com/EpicEricEE/typst-plugins
https://raw.githubusercontent.com/EpicEricEE/typst-plugins/master/united/examples/numbers.typ
typst
#import "../src/lib.typ": num #set raw(lang: "typ") #set text(size: 14pt) #set table( inset: 0.7em, fill: (x, y) => if y == 0 { luma(230) } ) #set page( width: auto, height: auto, margin: 1em, background: pad(0.5pt, box( width: 100%, height: 100%, radius: 4pt, fill: white, stroke: white.darken(10%), )), ) #table( columns: 2, [*Input*], [*Output*], [`#num("12345")`], [#num("12345")], [`#num(3.14159)`], [#num(3.14159)], [`#num[3.5e5]`], [#num[3.5e5]], [`#num[2.5+-0.5]`], [#num[2.5+-0.5]], [`#num[3+2-1 e-3]`], [#num[3+2-1 e-3]], [`$num(2.53(12)e+7)$`], [$num(2.53(12)e+7)$], )
https://github.com/kdog3682/2024-typst
https://raw.githubusercontent.com/kdog3682/2024-typst/main/src/draft-shsat-ws-2-small-version.typ
typst
#set page(margin: 0.75in, paper: "us-letter") #let dialogue-item() = { lorem(30) parbreak() linebreak() } #set text(size: 12pt) #let holder(..items) = { enum(..items) } #{ let num = 1 let a = block({ lorem(20) }) let b = block({ let letters = "ABCD" set enum(numbering: (it) => { let x = letters.at(it - 1) + "." text(weight: "bold", x) }, spacing: 39pt, tight: false, body-indent: 10pt) enum( lorem(3), lorem(3), lorem(3), ) }) let val = block(breakable: false, fill: yellow.lighten(100%), grid(column-gutter: 19pt, columns: (170pt, 1fr), a, b)) set enum(full: false, numbering: (x) => { text(weight: "bold", str(x) + ".") }, tight: false, spacing: 40pt) //holder(val, val, val) // v(-30pt) [= Extra SHSAT Math Practice ] v(0pt) let name = [Michelle] let names = [<NAME> | Alvin | <NAME>] let grouping = [ *Student Group*: #names ] style(styles => { let length = measure(grouping, styles).width //line(stroke: (dash: "loosely-dotted"), length: length) }) grouping v(40pt) v(20pt) show par: set block(spacing: 0.7em) //set par(leading: 2.3em) // doesnt do anything dialogue-item() dialogue-item() dialogue-item() } #let flex(a, b) = { let widths = (150pt, auto) return block(breakable: false, grid(column-gutter: 19pt, columns: widths, a, b) ) } #let question-container(title, body) = { // aasdasd let header = { let expr = box([*Q#question.number* -- *#question.source*\ ]) expr h(1fr) box({ text(fill: blue, emph("percentages")) //rect(outset: 0pt, radius: 5pt, inset: 5pt, text(size: 8pt, "percentages")) }) v(-5pt) //line(length: 170pt) line(length: 100%) } rect(stroke: (dash: "dotted"), radius: 20pt, inset: 20pt, { header body }) }
https://github.com/floriandejonckheere/utu-thesis
https://raw.githubusercontent.com/floriandejonckheere/utu-thesis/master/thesis/figures/06-automated-modularization/metrics.typ
typst
#import "@preview/cetz:0.2.2" #let metrics = yaml("/bibliography/literature-review.yml").at("categories").at("metrics") #let total = metrics.values().sum().len() #let data = ( ([Cohesion], metrics.at("cohesion").len()), ([Coupling], metrics.at("coupling").len()), ([Other], metrics.at("other").len()), ([#v(1em)#h(3em)Modularity], metrics.at("modularity").len()), ([#v(1em)#h(7em)Network overhead], metrics.at("overhead").len()), ([#v(2em)#h(4em)Complexity], metrics.at("complexity").len()), ([#v(2.75em)#h(3em)None], metrics.at("none").len()), ([#v(2em)#h(9em)Resource usage], metrics.at("resource").len()), ) #cetz.canvas({ import cetz.chart import cetz.draw: * let colors = ( cmyk(0%, 75%, 79%, 0%), cmyk(0%, 32%, 23%, 26%), cmyk(29%, 26%, 0%, 28%), cmyk(80%, 29%, 0%, 20%), cmyk(65%, 0%, 2%, 35%), cmyk(64%, 0%, 38%, 24%), cmyk(34%, 0%, 60%, 18%), cmyk(8%, 0%, 68%, 15%), ) chart.piechart( data, clockwise: false, value-key: 1, label-key: 0, radius: 3, slice-style: colors, inner-radius: 1, inner-label: (content: (value, label) => [#text(white, str(calc.round(100 * value / total, digits: 0)) + "%")], radius: 110%), outer-label: (content: (value, label) => [#label], radius: 130%)) })
https://github.com/SkymanOne/zk-learning
https://raw.githubusercontent.com/SkymanOne/zk-learning/main/notes/zkp_programming/zkp_programming.typ
typst
#import "../base.typ": * #show: note = Programming ZKPs Here is the general overview of how to get from an idea to a ZKP. #align(center, diagram( spacing: 5em, node-stroke: 0.5pt, node((0, 0), [*Concept*]), edge("->", [Coding]), node((1, 0), [*Program*],), edge("->", [Compiler]), node((2, 0), [*R1CS \ Arithmetic circuit*]), edge("->", [Setup]), node((3, 0), [*Params*]), edge("->", [Prove]), node((4, 0), [*ZKP*]), ) ) = Arithmetic Circuits An arithmetic circuit is a concrete instance of a predicate $phi$ that the prover tries to prove over inputs $x, w$. Arithmetic circuits operate over a prime field, where - $p$ is a large prime - $ZZ_p$ is the set of integers mod $p$, a prime field - The operations performed on the field are: $+, times, eq (mod p)$ - e.g. in $ZZ_5$ - $4 + 5 = 9 mod 5 = 4$ - $4 times 4 = 16 mod 5 = 1$ One way of viewing arithmetic circuits is as systems of field equations over a prime field: - $w_0 times w_0 times w_0 eq x$ - $w_1 times w_1 eq x$ = Rank 1 Constraint Systems (R1CS) The most common representation of ACs for ZKPs. *Representation:* - _x_: field elements $x_1, ..., x_l$ - _w_: field elements $w_1, ..., w_(m-l-1)$ - $phi$: _n_ constraints (equations) of the form - $alpha times beta eq gamma$ - where $alpha, beta, gamma$ are *affine* (linear, with an optional constant added) combinations of variables Examples: - $w_2 times (w_3 - w_2 - 1) = x_1$ - $alpha = w_2 \ beta = (w_3 - w_2 - 1) ("it's affine") \ gamma = x_1$ - $w_2 times w_2 = w_2$ - $cancel(w_2 times w_2 times w_2 = x_1)$ Here three variables are multiplied, so this is not an equation of the acceptable format. Instead, we can transform it into 2 equations by introducing another variable: - $w_2 times w_2 = w_4$ - $w_4 times w_2 = x_1$ We can also represent an R1CS in matrix form, where: - _x_: vector of $ell$ field elements - _w_: vector of $m - ell - 1$ field elements - $phi$: matrices $A, B, C in ZZ^(n times m)_p$ s.t. 
- $z = (1 || x || w) in ZZ^m_p$, where $||$ means concatenation - which hold when $A z circle.tiny B z = C z$, where $circle.tiny$ is the element-wise product. An example of the element-wise product: $ A circle.tiny B = mat( a, b; c, d; ) circle.tiny mat( e, f; g, h; ) = mat( a times e, b times f; c times g, d times h; ) $ When forming the product $A z$, every row of _A_ defines an affine combination of the variables _x_ and _w_. So, every row in _A_, _B_ and _C_ defines a single rank-1 constraint. == Example of writing an AC as R1CS Given the following circuit. #align(center, diagram( spacing: 2em, node-stroke: 0.5pt, node((0, 0), [$w_0$], name: <w0>, stroke: 0pt), node((0, 2), [$w_1$], name: <w1>, stroke: 0pt), node((0, 3), [$x_0$], name: <x0>, stroke: 0pt), node((1, 1), [$times$], name: <mul1>, shape: circle), node((2, 2), [$+$], name: <add>, shape: circle), node((3, 3), [$eq$], name: <eq>, shape: circle), node((1, 4), [$times$], name: <mul2>, shape: circle), edge(<w0>, <mul1>, "->"), edge(<w1>, <mul1>, "->"), edge(<w1>, <mul2>, "->"), edge(<x0>, <add>, "->"), edge(<x0>, <mul2>, "->"), edge(<mul1>, <add>, "->", [$w_2$]), edge(<mul2>, <eq>, "->", [$w_4$]), edge(<add>, <eq>, "->", [$w_3$]), ) ) We can transform it to R1CS using the following procedure: 1. Introduce intermediate witness (_w_) variables 2. Rewrite the equations - $w_0 times w_1 = w_2$ - $w_3 = w_2 + x_0$ ($beta$ is _1_, therefore omitted) - $w_1 times x_0 = w_4$ - $w_3 = w_4$ = HDLs for R1CS As an HDL (hardware description language) we are going to use Circom. *In an HDL, the objects are:* - Wires - Gates - Circuits/Subcircuits #pagebreak() *The actions are:* - Connect wires - Create sub-circuits (you cannot call functions or mutate variables) Circom is an HDL for R1CS: - Wires: R1CS vars - Gates: R1CS constraints It sets values to vars and creates R1CS constraints. 
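To make the matrix form concrete, here is my own worked sketch for the four constraints derived in the circuit example above, using the variable ordering $z = (1, x_0, w_0, w_1, w_2, w_3, w_4)$ (the ordering itself is an assumption; the notes do not fix one).

```typ
$ A = mat(
  0, 0, 1, 0, 0, 0, 0;
  0, 1, 0, 0, 1, 0, 0;
  0, 0, 0, 1, 0, 0, 0;
  0, 0, 0, 0, 0, 1, 0;
), quad
B = mat(
  0, 0, 0, 1, 0, 0, 0;
  1, 0, 0, 0, 0, 0, 0;
  0, 1, 0, 0, 0, 0, 0;
  1, 0, 0, 0, 0, 0, 0;
), quad
C = mat(
  0, 0, 0, 0, 1, 0, 0;
  0, 0, 0, 0, 0, 1, 0;
  0, 0, 0, 0, 0, 0, 1;
  0, 0, 0, 0, 0, 0, 1;
) $
```

Row by row, $A z circle.tiny B z = C z$ then reproduces $w_0 times w_1 = w_2$, $(w_2 + x_0) times 1 = w_3$, $w_1 times x_0 = w_4$, and $w_3 times 1 = w_4$.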
=== Circom Let's look at a basic example: ```circom template Multiply() { signal input x; // signal is a wire signal input y; signal output z; z <-- x * y; // set the signal value z === x * y; // creates a constraint; must be rank-1 // OR z <== x * y; } component main {public [x]} = Multiply(); ``` `===` creates a constraint, which must be rank-1: one side must be linear and the other side quadratic. - `template` is a subcircuit. - `public [x]` declares that `x` is a public input in the instance of the template. === Circom Metaprogramming Circom has the following metaprogramming features: - template args - Signal arrays - Vars - Mutable - Not signals - Evaluated at compile-time - Loops - If statements - Array access ```circom template RepeatedSquaring(n) { signal input x; signal output y; signal xs[n+1]; xs[0] <== x; for (var i = 0; i < n; i++) { xs[i+1] <== xs[i] * xs[i]; } y <== xs[n]; } component main {public [x]} = RepeatedSquaring(1000); ``` === Circom Witness Computation and Sub-circuits Witness computation is more general than R1CS: `<--` is more general than `===`; you can assign any value, since it just sets the value and does not create a constraint. ```circom template NonZero() { signal input in; signal inverse; inverse <-- 1 / in; // not R1CS: just computes the witness 1 === in * inverse; // is R1CS: creates the constraint } ``` `component`s hold sub-circuits - Access inputs/outputs with dot notation ```circom template Main() { signal input a; signal input b; component nz = NonZero(); nz.in <== a; 0 === a * b; } ```
https://github.com/hugo-b-r/insa-template-typst
https://raw.githubusercontent.com/hugo-b-r/insa-template-typst/master/README.md
markdown
# Templates for INSA ## Introduction These are my custom templates that I want to use at INSA de Lyon, a French engineering school. ## Thanks Special thanks to the [typst](typst.app) team for providing such a good product! ## Copyright Everything is licensed under the MIT license, except the `assets` directory. Each asset, image, or file in general located in the `assets` directory belongs to its owner. Here is a non-exhaustive list: - `logo_insa.png` belongs to Groupe INSA and INSA de Lyon - `apple_logo.jpg` belongs to Apple Inc.
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/math/syntax-00.typ
typst
Other
// Test Unicode math. $ ∑_(i=0)^ℕ a ∘ b = \u{2211}_(i=0)^NN a compose b $
https://github.com/alkeryn/cv
https://raw.githubusercontent.com/alkeryn/cv/master/cv.typ
typst
#import "./lib.typ": * #show : template #set text(size: 0.98em) = #text(size: 25pt)[<NAME>] #text(size: 15pt)[Software Engineer] #[ #set text(size: 0.9em) #grid( columns: 2, gutter: 2em )[ Born in July 1998\ Mulhouse, France\ #link("mailto:<EMAIL>")\ +33631036304 ][ #link("github.com/alkeryn")\ #link("linkedin.com/in/pierre-louis-braun98/") ] ] #show heading.where(level: 1) : set text(nicered) #place(top + right, dy: -0.5cm)[#circleclip(image("IMG_1518_croped_small.jpg", height:100%), 2.2cm)] = Skills: #move(dx: -0.8em)[ #set text(size: 0.8em) #set list(indent: 1em) #grid(columns: 2)[ #set list(indent: 0.5em, marker: none) #set align(right) - Languages: - - - Databases: - Network: - Web: - Misc: ][ - *Rust*, *Python*, *Linux*, *Bash*, *C*, *C++*, *Nix* - KVM, QEMU, Proxmox, Libvirt, LXC, LXD, Docker, Kubernetes, Nix - PostgreSQL, MySQL, CockroachDB, ScyllaDB, Redis, SQLite, MongoDB - nmap, scapy, ss, ip, netcat, SSH, TCP/IP and UDP knowledge - Nginx, Apache, *JavaScript*, PHP, HTML5, CSS, WebAssembly - Reverse engineering, static & dynamic analysis, binary exploitation, writing exploits\ Data-oriented programming/design, Machine learning, Arch Linux, NixOS, *Git* ]] = Work Experience: #work("Hut8", "Senior Software Engineer", "January 2024 - Now")[ - Writing high-performance software to orchestrate hundreds of thousands of computers in *Rust* - Including gathering data from those computers, making thousands of requests per second - Writing software to do integrity checks on databases - Writing software to backfill data from one database to another (influx, bigquery) - Writing software to curtail computers based on power price and other factors - On-call response to emergencies ] #work("Vozforge", "Founder and Software Engineer", "May 2020 - Now")[ - Wrote an app that allows using an Android tablet as a graphic tablet on Windows and Linux; this involved writing drivers#cnt("w") and a custom TCP and UDP binary protocol from scratch. 
The server was initially written in *C++* and later rewritten in *Rust*; the client, running on the tablet, was written in *Kotlin*. - Writing a modular *Rust* backend framework using *Actix*; this includes modules and libraries for authentication, geospatial queries#cnt("w"), account management, messaging, notifications, and more. The databases used are *PostgreSQL*, *ScyllaDB*#cnt("w"), and *Redis*. - Wrote a visualization web UI to compare Google *S2* to Uber *H3* using WebAssembly #set text(size: 0.8em) + *KMDF* on Windows, libinput on Linux + Using Google S2 cells and ScyllaDB, the library allows for horizontally scalable realtime geospatial queries such as "getting 100 users in a radius of 100km ordered by distance" + ScyllaDB is a C++ rewrite of *Cassandra* made by mostly the same people, but with high performance as a goal. ] #work("Everdreamsoft", "Full Stack Developer", "October 2022 - July 2023")[ - Optimized and added features to an in-house database - Wrote benchmarking tools for prototyping optimizations | *Rust* - Wrote a migration tool for blockchain data from *MongoDB* to *PostgreSQL* using *Rust* - Lead backend developer for the ChainChronicles project, an NFT subscription service | *PHP* - Developed a *Go* microservice for account synchronization with Stripe's API - Contributed to the Wakweli blockchain MVP | *Go, PostgreSQL* - Developed an in-house secret sharing application using Shamir's Secret Sharing and AES-GCM | *Go, Wails, PostgreSQL* - Researched the Ory Kratos authentication service ] #v(0.5em) #columns(3)[ = Education: #work("Computer Science", "UHA 4.0 (Mulhouse, France)", "2017-2020")[] #colbreak() #set text(size: 0.8em) = Achievements: One of the winners of the 2019 DGSE Richelieu hacking CTF. 
It involved steganography, cryptography, reverse engineering, and binary exploitation. = Languages: French and English #colbreak() = References: Available upon request = Personal Project: Game server scanner written in *Rust* that can scan and get metadata of\ > 200k servers in < 10s across the whole IPv4 range. #v(1fr) #align(right)[see older work experiences on #link(<old>)[page 2]] ] #pagebreak() = Older Work Experience: <old> #work("PSA Finance", "Cybersecurity Consultant and Manager / Executive", "July 2022 - October 2022")[ - Worked as a Cybersecurity Consultant and Manager for the *PSA Finance* group (the bank of *Peugeot* and *Citroën*) as a service provider from *Sogeti* (part of *Capgemini*) - Conducted network vulnerability scans on thousands of servers in the local network using *Qualys* - Performed penetration testing and analyzed third-party penetration test reports - Wrote vulnerability reports and mitigation strategies based on the aforementioned penetration tests - Developed a *Python* tool to match vulnerabilities from the NIST NVD vulnerability database ]
https://github.com/rabotaem-incorporated/probability-theory-notes
https://raw.githubusercontent.com/rabotaem-incorporated/probability-theory-notes/master/sections/04-discrete-random-processes/03-markov-chains.typ
typst
#import "../../utils/core.typ": * == Markov Chains #def[ $Y$ is an at most countable set, the _state space_. A sequence $xi_n: Omega --> Y$ is a Markov chain if $ P lr(size: #1.2em, (xi_n = a_n | underbrace(xi_0 = a_0\, ...\, xi_(n - 1) = a_(n - 1), "whenever this occurs\nwith positive probability"))) = P(xi_n = a_n | xi_(n - 1) = a_(n - 1)). $ that is, roughly speaking, each random variable depends on the previous one and not on the rest. ] #example[ Random walk on $ZZ$. The $eta_n$ are independent, $eta_n = plus.minus 1$ with probabilities $p$ and $1 - p$. $ xi_n = eta_1 + eta_2 + ... + eta_n ==> xi_n = xi_(n - 1) + eta_n. $ ] #example[ Suppose there is some device that is either working or broken. If it is working, it can break with probability $p$. If it is broken, it can be repaired with probability $q$. #align(center)[ #automaton( ( wr: (br: "p", wr: "1 - p"), br: (wr: "q", br: "1 - q") ), initial: none, final: (), style: ( transition: (curve: 2), state: (radius: 2), wr-wr: (curve: 1.5), br-br: (curve: 1.5), ), labels: ( wr: image("../../img/working.jpg", width: 80pt, height: 80pt), br: image("../../img/not-working.jpg", width: 80pt, height: 80pt), ) ) ] ] #notice[ The probability of being in a given state at time $n$ is determined by the initial distribution $pi_0$ and the transition probabilities $P(xi_n = b | xi_(n - 1) = a) =: p_n (a, b)$. ] #denote[ $pi_n (a) := P(xi_n = a)$. ] #def[ A Markov chain is _homogeneous_ if the $p_n (a, b)$ do not depend on time, that is, $p_n (a, b) = p_(a, b)$ for all $n$. ] #th[ For a homogeneous Markov chain, $ P(xi_0 = a_0, xi_1 = a_1, ..., xi_n = a_n) = pi_0 (a_0) dot p_(a_0 a_1) dot p_(a_1 a_2) dot ... dot p_(a_(n - 1) a_n). $ Such an event is called a _trajectory_. ] #proof[ Obvious by induction. 
] #th(name: "existence")[ If $pi_0: Y --> [0, 1]$ is a probability distribution on $Y$ (that is, $sum_(y in Y) pi_0 (y) = 1$) and transition probabilities $p: Y times Y --> [0, 1]$ are given such that $sum_(y in Y) p(a, y) = 1$, then there exists a probability space $(Omega, Ff, P)$ and a Markov chain $xi_n: Omega --> Y$ with initial distribution $pi_0$ and transition probabilities $p_(a, b)$. ] #proof[ Not hard, but fiddly. Omitted. ] #denote[ $P$ is the matrix $(p_(a, b))_(a, b in Y)$. In general, an infinite one. ] #th[ $pi_n = pi_0 dot P^n$. For some reason, probability theory customarily writes distributions as row vectors. ] #proof[ The base case is obvious. We prove the induction step $n ~~> n + 1$. $ pi_(n + 1) (b) = P(xi_(n + 1) = b) = sum_(y in Y) P(xi_(n + 1) = b | xi_n = y) P(xi_n = y) = sum_(y in Y) pi_n (y) p_(y b) = (pi_n P) (b). $ ] #denote[ $ P(xi_n = b | xi_0 = a) = p_(a b) (n). $ ] #def[ $pi: Y --> [0, 1]$ is a _distribution_ on $Y$ if $sum_(y in Y) pi(y) = 1$. $pi$ is a _stationary distribution_ if $pi P = pi$. ] #example[ A stationary distribution does not always exist, for example for the random walk on $ZZ$ with symmetric transition probabilities. Suppose a stationary distribution $pi$ existed. Then $ P(xi_1 = b) = sum_(y in ZZ) P(xi_1 = b | xi_0 = y) P(xi_0 = y) newline(=) P(xi_0 = b - 1) dot 1/2 + P(xi_0 = b + 1) dot 1/2 = (pi(b - 1) + pi(b + 1))/2. $ Hence for every integer $b$, $ 2pi (b) = pi (b - 1) + pi (b + 1) ==> pi(b + 1) - pi(b) = pi(b) - pi(b - 1) ==> pi(y + 1) - pi(y) =: c. $ If $c > 0$, then $ pi(n) = (pi(n) - pi(n - 1)) + pi(n - 1) = c + pi(n - 1) = ... = n c + pi(0). $ But then for any $c > 0$ the probability would exceed $1$ from some $n$ on. Similarly for $c < 0$ (for large $n$ it would be negative). Hence $c = 0$. But that is also absurd: $pi equiv const$, the same probability at every integer point, so the probabilities cannot sum to $1$. 
] #th(name: "<NAME>")[ Пусть $Y$ --- конечное, $p_(a, b) > 0$ для любых $a, b in Y$ (то есть из чего угодно можно попасть куда угодно с положительной вероятностью). Тогда сущесвует единственное стационарное распределение. Более того, $pi(b) = lim_(n -> oo) p_(a, b) (n)$. Более того, сущесвует $q < 1$ такое, что $ abs(pi(b) - p_(a, b) (n)) <= C q^n $ для любых $a, b in Y$. ] #proof[ Пусть $\#Y = m$. Рассмотрим $RR^m$ с нормой --- сумма модулей координат: $norm(x) = abs(x_1) + ... + abs(x_m)$. Рассмотрим отображение $T: RR^m --> RR^m$ вида $x arrow.bar.long (x^T P)^T = P^T x$ --- это линейное отображение. Будем смотреть на него на $ K := {x in RR^m : x_1, ..., x_m >= 0 "и " x_1 + ... + x_m = 1} $ --- это компакт. Тогда $T: K --> K$. Проверим, что это сжатие. $ norm(T x - T y) <= q norm(x - y). $ Надо доказать, что $norm(T z) <= q norm(z)$, где $z = x - y$, и $q < 1$. Пусть $delta := min_(i, j) p_(i, j) > 0$. $ (T z)_j = sum_(i = 1)^m p_(i j) z_i = sum_(i = 1)^m (p_(i j) - delta) z_i + delta sum_(i = 1)^m z_i. $ Сумма $z_i$ это $0$ (какая-та сумма разностей $x - y$, а сумма $x_i$ и $y_i$ обе 1). Оценим норму $ norm(T z) = sum_(j = 1)^m abs((T z)_j) = sum_(j = 1)^m abs(sum_(i = 1)^m (p_(i j) - delta) z_i) <= sum_(j = 1)^m sum_(i = 1)^m (p_(i j) - delta) abs(z_i) newline(=) sum_(i = 1)^m underbrace((sum_(j = 1)^m (p_(i j) - delta)), 1 - m delta) abs(z_i) = (1 - m delta) sum_(i = 1)^m abs(z_i) = (1 - m delta) norm(z). $ Положим $q := (1 - m delta) < 1$. Примерим теорему Банаха о сжатии. По ней 1. Существует неподвижная точка. 2. Любая последовательность итераций сходится к ней с хорошей скоростью. Что и требовалось. ] #notice[ Мы требуем $p_(a, b) > 0$, но это немного сильно: достаточно просить, что через сколько-то шагов (но ограниченное количество) все вершины достижимы. Просто надо будет рассматривать теорему Маркова каждые $M$ шагов. ] #def[ $a$ и $b$ --- состояния цепи. $p_(a, b) (n)$ --- вероятность попасть из $a$ в $b$ за $n$ шагов. 
- $b$ _достижимо_ из $a$, если сущесвует $n in NN$ такое, что $p_(a, b) (n) > 0$. - $a$ --- _существенное_, если для любого $b$ достжимого из $a$, есть достижимость $a$ из $b$. ] #exercise[ Цепь конечная. Доказать, что существует хотя бы одно существенное состояние. ] #def[ $a$ и $b$ _сообщающиеся_, если $a$ достижимо из $b$ и $b$ достижимо из $a$. ] #def[ $a$ --- _нулевое_, если $p_(a, a) (n) -->_(n -> oo) 0$. ] #denote[ $f_a (n) := P(xi_n = a, xi_(n - 1) != a, xi_(n - 2) != a, ..., xi_1 != a | xi_0 = a)$. Это вероятность того, что первый возврат случился на $n$-м шаге. $f_a (0) = 0$. ] #def[ $a$ --- _возвратное_, если $sum_(n = 1)^oo f_a (n) = 1$. Обозначим $F_a := sum_(n = 1)^oo f_a (n)$ --- вероятность возврата. ] #th(name: "критерий возвратности", label: "recurrent-criterion")[ $a$ возвратно тогда только тогда, когда $ sum_(n = 1)^oo p_(a, a) (n) = +oo, $ и если $a$ невозвратно, то $F_a = P_a/(1 + P_a)$, где $P_a := sum_(n = 1)^oo p_(a, a) (n)$. ] #proof[ Положим $ P(t) := sum_(n = 0)^oo p_(a, a) (n) t^n, quad F(t) := sum_(n = 0)^oo f_a (n) t^n. $ Тогда при $n >= 1$ $ p_(a, a) (n) = sum_(k = 0)^n f_a (k) p_(a, a) (n - k). $ Это свертка последовательностей: $ P(t) - 1 = sum_(n = 1)^oo p_(a, a) (n) t^n = sum_(n = 1)^oo sum_(k = 0)^n f_a (k) p_(a, a) (n - k) t^n = F(t) P(t). $ Значит $ F(t) = (P(t) - 1)/(P(t)). $ - Пусть $P_a = sum_(n = 1)^oo p_(a, a) (n) < +oo$. Тогда $P(t) -->_(t -> 1-) P_a + 1$ по теореме Абеля. $F(t) = (P(t) - 1)/(P(t)) -->_(t -> 1-) P_a/(P_a + 1)$, а еще $F(t) -->_(t -> 1-) F_a$. Значит они равны. - Пусть $P_a = +oo$. Тогда $F_a <-- F(t) = 1-1/(P(t)) --> 1$, так как $P(t) -->_(t->1-) +oo$. ] #follow[ Невозвратное состояние всегда нулевое. ] #th(name: "солидарности")[ Пусть $a$ и $b$ сообщающиеся. Тогда 1. Либо $a$ и $b$ нулевые, либо $a$ и $b$ ненулевые. 2. Либо $a$ и $b$ возвратные, либо $a$ и $b$ невозвратные. ] #proof[ 1. Пусть $a$ нулевое. Тогда $p_(a, a) (n) --> 0$. 
$a$ и $b$ сообщающиеся, поэтому существуют $i$ и $j$ такие, что $p_(a, b) (i) > 0$, и $p_(b, a) (j) > 0$. Вероятность попать из $a$ в $a$ не меньше, чем по маршруту $a ~~> b ~~> b ~~> a$. То есть $0 <-- p_(a, a) (i + n + j) >= p_(a, b) (i) dot p_(b, b) (n) dot p_(b, a) (j)$. Это стремится к нулю только если $p_(b, b) (n) --> 0$. 2. Пусть $b$ возвратное. Тогда $sum_(n = 1)^oo p_(b, b) (n) = +oo$ по критерию возвратности. Проверим расходимость такого же ряда для $a$: $sum_(n = 1)^oo p_(a, a) (n) >= sum_(n = 1)^oo p_(a, a) (n + i + j) >= p_(a, b) (i) dot p_(b, a) (j) sum_(n = 1)^oo p_(b, b) (n)$. Значит этот ряд тоже расходящийся, и $a$ --- возвратное. ]
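The contraction argument in the theorem above is constructive: for a finite chain with all $p_(a, b) > 0$, iterating $pi_(n + 1) = pi_n P$ from any starting distribution converges geometrically to the unique stationary distribution. A minimal numerical sketch in plain Python (the 2-state "working/broken device" chain with $p = 0.3$, $q = 0.5$ is an illustrative choice of mine, not taken from the notes):

```python
# A 2-state chain with strictly positive transition probabilities,
# modelled on the working/broken device example: p = 0.3, q = 0.5.
P = [[0.7, 0.3],   # P(working -> working), P(working -> broken)
     [0.5, 0.5]]   # P(broken -> working),  P(broken -> broken)

def step(pi, P):
    """One step of the chain: pi_{n+1} = pi_n * P (row-vector convention)."""
    m = len(P)
    return [sum(pi[i] * P[i][j] for i in range(m)) for j in range(m)]

def stationary(P, iters=200):
    """Power iteration; converges since T is a contraction with q = 1 - m*delta."""
    pi = [1.0] + [0.0] * (len(P) - 1)   # arbitrary starting distribution
    for _ in range(iters):
        pi = step(pi, P)
    return pi

pi = stationary(P)
# pi is a fixed point: pi * P = pi, i.e. pi is stationary.
assert all(abs(a - b) < 1e-12 for a, b in zip(pi, step(pi, P)))
# For this chain the exact answer is (q/(p+q), p/(p+q)) = (0.625, 0.375).
assert abs(pi[0] - 0.625) < 1e-9 and abs(pi[1] - 0.375) < 1e-9
```

Here the contraction coefficient is $q = 1 - m delta = 1 - 2 dot 0.3 = 0.4$, so the distance to $pi$ shrinks by at least that factor per step, matching the $C q^n$ bound in the theorem.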
https://github.com/Skimmeroni/Appunti
https://raw.githubusercontent.com/Skimmeroni/Appunti/main/Metodi%20Algebrici/Interi/Diofantee.typ
typst
Creative Commons Zero v1.0 Universal
#import "../Metodi_defs.typ": *

A *Diophantine equation* is an equation of the form:

$ a x + b y = c " with" a, b, c, x, y in ZZ " and " a, b, c != 0 $

where $a, b, c$ are the _known terms_ and $x, y$ are the _unknowns_. Since $x$ and $y$ are integers, the _solutions_ of such an equation are exactly the pairs $(x_(0), y_(0)) in ZZ times ZZ$ such that $a x_(0) + b y_(0) = c$.

#example[
  Consider the Diophantine equation $6x + 5y = 3$. The pairs $(3, −3)$ and $(8, −9)$ are two of its solutions.
]

#theorem("Necessary and sufficient condition for the solvability of Diophantine equations")[
  Consider the Diophantine equation $a x + b y = c$, with nonzero known terms $a, b, c in ZZ$ and unknowns $x, y in ZZ$. The equation admits a solution if and only if $"MCD"(a, b) | c$.
] <Diophantine-solutions-exist>

#proof[
  Suppose $a x + b y = c$ admits some solution $(x_(0), y_(0)) in ZZ times ZZ$. Then $a x_(0) + b y_(0) = c$ must hold. Since $"MCD"(a, b) | a x_(0) + b y_(0)$, we get $"MCD"(a, b) | c$. Therefore, if a Diophantine equation $a x + b y = c$ is solvable, then $"MCD"(a, b) | c$.

  Conversely, suppose that $"MCD"(a, b) | c$ holds for the Diophantine equation $a x + b y = c$. This amounts to saying that $c = "MCD"(a, b) tilde(c)$ for some $tilde(c) in ZZ$. By Bézout's identity there exist $s, t in ZZ$ such that $"MCD"(a, b) = a s + b t$. Substituting into the previous equation gives $c = (a s + b t) tilde(c) = a s tilde(c) + b t tilde(c)$. Setting $x_(0) = s tilde(c)$ and $y_(0) = t tilde(c)$, we get $c = a x_(0) + b y_(0)$. Since $(x_(0), y_(0)) in ZZ times ZZ$, this pair is a solution of the equation. Therefore, if $"MCD"(a, b) | c$ holds for the Diophantine equation $a x + b y = c$, then the equation has (at least) one solution.
]

#example[
  Consider the Diophantine equation $74 x + 22 y = 10$. Does it admit a solution? To find out, compute $"MCD"(a, b)$:

  #set math.mat(delim: none)
  $ mat(
    74 & = 22 dot 3 + 8;
    22 & = 8 dot 2 + 6;
    8 & = 6 dot 1 + 2;
    6 & = 2 dot 3;
  ) $

  Hence $"MCD"(74, 22) = 2$. Since $2 | 10$, the equation admits a solution.
]

#corollary("Finding a particular solution of a Diophantine equation")[
  Consider the solvable Diophantine equation $a x + b y = c$, with nonzero known terms $a, b, c in ZZ$ and unknowns $x, y in ZZ$. A particular solution $(x_(0), y_(0)) in ZZ times ZZ$ of this equation can be obtained from Bézout's identity with $a$ and $b$ as known terms.
] <Diophantine-one-solution>

#proof[
  Let $a x + b y = "MCD"(a, b)$ be Bézout's identity for $a$ and $b$. Multiplying both sides by some $tilde(c) in ZZ$, we get $(a x + b y) tilde(c) = a x tilde(c) + b y tilde(c) = "MCD"(a, b) tilde(c)$. Substituting $x tilde(c) = x_(0)$, $y tilde(c) = y_(0)$ and $"MCD"(a, b) tilde(c) = c$, we obtain $a x_(0) + b y_(0) = c$. This is a Diophantine equation, since all its coefficients are integers, and the pair $(x_(0), y_(0))$ is a solution of it. The equation is indeed solvable because, since $"MCD"(a, b) tilde(c) = c$, we have $"MCD"(a, b) | c$.
]

The @Diophantine-one-solution suggests that, to obtain a particular solution of a solvable Diophantine equation $a x + b y = c$, it suffices to find a particular solution of Bézout's identity with $a$ and $b$ as known terms and multiply the result by $frac(c, "MCD"(a, b))$.

#example[
  Consider the solvable Diophantine equation $74 x + 22 y = 10$. It has already been computed that $"MCD"(74, 22) = 2$, so Bézout's identity with $74$ and $22$ as known terms is $74 x' + 22 y' = 2$. Find one of its particular solutions $(x_(0) ', y_(0) ')$:

  #set math.mat(delim: none)
  $ mat(
    74 & = 22 dot 3 + 8 => a = 3 b + 8 => a - 3 b = 8;
    22 & = 8 dot 2 + 6 => b = 2(a - 3 b) + 6 => 7 b - 2a = 6;
    8 & = 6 dot 1 + 2 => (a - 3 b) = (7 b - 2 a) + 2 => 3 a - 10 b = 2;
  ) $

  So $(x_(0)', y_(0)') = (3, -10)$. Since $frac(10, "MCD"(74, 22)) = 5$, a particular solution of the Diophantine equation $74 x + 22 y = 10$ is $(15, -50)$.
]

#theorem("Solutions of a Diophantine equation")[
  Consider the solvable Diophantine equation $a x + b y = c$, with nonzero known terms $a, b, c in ZZ$ and unknowns $x, y in ZZ$. If the pair $(x_(0), y_(0)) in ZZ times ZZ$ is a solution of this equation, then its solutions are exactly the pairs $(x_(h), y_(h)) in ZZ times ZZ$ constructed as follows:

  $ x_(h) = x_(0) + h (frac(b, "MCD"(a, b))) space space space y_(h) = y_(0) - h (frac(a, "MCD"(a, b))) " with" h in ZZ $
] <Diophantine-all-solutions>

#proof[
  The pairs $(x_(h), y_(h))$ so constructed are certainly solutions of $a x + b y = c$, since substituting gives:

  $ a x_(h) + b y_(h) = c & => a(x_(0) + h (frac(b, "MCD"(a, b)))) + b(y_(0) - h (frac(a, "MCD"(a, b)))) = c \ & => a x_(0) + cancel(frac(a h b, "MCD"(a, b))) + b y_(0) - cancel(frac(a h b, "MCD"(a, b))) = c => a x_(0) + b y_(0) = c $

  Conversely, let $(overline(x), overline(y))$ be a generic solution of $a x + b y = c$. Since $(x_(0), y_(0))$ is one as well, we can write:

  #set math.mat(delim: none)
  $ a overline(x) + b overline(y) = c = a x_(0) + b y_(0) => a(overline(x) - x_(0)) = -b(overline(y) - y_(0)) => overline(a) (overline(x) - x_(0)) = overline(b) (y_(0) - overline(y)) space space "with" space space mat(
    overline(a) = frac(a, "MCD"(a, b));
    overline(b) = frac(b, "MCD"(a, b))
  ) $

  From this expression we deduce that $overline(a) | overline(b) (y_(0) − overline(y))$; since $overline(a)$ and $overline(b)$ are coprime, it follows that $overline(a) | y_(0) − overline(y)$. But then there exists some $h in ZZ$ such that $y_(0) − overline(y) = h overline(a)$, that is, $overline(y) = y_(0) - h overline(a)$. Substituting into the previous expression:

  $ overline(a) (overline(x) - x_(0)) = overline(b) (cancel(y_(0)) - cancel(y_(0)) + h overline(a)) => cancel(overline(a)) (overline(x) - x_(0)) = overline(b) h cancel(overline(a)) => overline(x) - x_(0) = overline(b) h => overline(x) = x_(0) + overline(b) h $

  Substituting the values of $overline(a)$ and $overline(b)$ back into their respective formulas, we obtain the form given in the statement of the theorem:

  $ overline(x) = x_(0) + h (frac(b, "MCD"(a, b))) space space space overline(y) = y_(0) - h (frac(a, "MCD"(a, b))) " with" h in ZZ $

  Since $(overline(x), overline(y)) in ZZ times ZZ$ is a generic solution, every solution can be expressed in this form.
]

#example[
  Consider the solvable Diophantine equation $74 x + 22 y = 10$, for which the particular solution $(15, -50)$ is known and $"MCD"(74, 22) = 2$. Since $frac(74, 2) = 37$ and $frac(22, 2) = 11$, we obtain the family of solutions $(x_(h), y_(h)) in ZZ times ZZ$:

  $ x_(h) = 15 + 11 h space space space y_(h) = -50 - 37 h " with" h in ZZ $
]
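The recipe in the results above (run Euclid's algorithm to get $"MCD"(a, b)$ and a Bézout pair, scale by $frac(c, "MCD"(a, b))$, then generate the family $x_(0) + h overline(b)$, $y_(0) - h overline(a)$) is mechanical enough to code. A short sketch in Python; the function names are mine, not from the notes:

```python
def extended_gcd(a, b):
    """Return (g, s, t) with g = gcd(a, b) and a*s + b*t = g (Bezout's identity)."""
    if b == 0:
        return (a, 1, 0)
    g, s, t = extended_gcd(b, a % b)
    # gcd(a, b) = gcd(b, a mod b); back-substitute the Bezout coefficients.
    return (g, t, s - (a // b) * t)

def solve_diophantine(a, b, c):
    """Particular solution (x0, y0) of a*x + b*y = c, or None if gcd(a, b) does not divide c."""
    g, s, t = extended_gcd(a, b)
    if c % g != 0:
        return None          # the necessary and sufficient condition fails
    k = c // g               # the c-tilde factor from the corollary
    return (s * k, t * k)

# The worked example from the notes: 74x + 22y = 10, with gcd(74, 22) = 2.
x0, y0 = solve_diophantine(74, 22, 10)
assert 74 * x0 + 22 * y0 == 10
# Full family of solutions: x = x0 + 11h, y = y0 - 37h (11 = 22/2, 37 = 74/2).
for h in range(-3, 4):
    assert 74 * (x0 + 11 * h) + 22 * (y0 - 37 * h) == 10
```

For this input the back-substitution happens to return the same pair as the notes, $(x_(0), y_(0)) = (15, -50)$; in general any Bézout pair works, and the resulting solution family is the same.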
https://github.com/TLouf/CV
https://raw.githubusercontent.com/TLouf/CV/main/CV.typ
typst
#import "template.typ": * #set page( margin: (left: 1.4cm, right: 1.4cm, top: 1.4cm, bottom: 1.4cm), background: place(left + top, rect( fill: light_blue, height: 100%, width: 11.5mm, )) ) #set text(12pt, font: "CMU Sans serif") #rect( width: 100%, height: 30mm, fill: dark_blue, stroke: (thickness: 5pt, paint: pale_green), outset: (x: 100pt, top: 100pt), inset: (y: -14pt) )[ #table( columns: (37%, 28%, 25%), stroke: 0pt, inset: 0pt, gutter: 5%, align: horizon, table( rows: (26pt, 6pt, 20pt), stroke: 0pt, inset: 0pt, gutter: 10pt, align: (center + bottom, center + horizon, center + top), text(26pt, weight: "bold", fill: white)[<NAME>], line(length: 100%, stroke: (thickness: 3pt, paint: pale_green)), text(20pt, weight: "bold", fill: white)[Postdoc researcher], ), contact_cell[ #text(12pt, fill: white)[ #par(leading: 5pt)[ === Profiles #link("https://github.com/TLouf")[#gh_icon TLouf] \ #link("https://tlouf.github.io")[#link_icon tlouf.github.io] \ #link("https://orcid.org/0000-0002-8785-8063")[#orcid_icon 0000-0002-8785-8063] ] ] ], contact_cell[ #text(12pt, fill: white)[ #par(leading: 5pt)[ === Contact #link("mailto:<EMAIL>")[#mail_icon <EMAIL>] \ #link("https://twitter.com/t_louf")[#twitter_icon t_louf] \ #link("https://fosstodon.org/@tlouf")[#mastodon_icon <EMAIL>] ] ] ] ) ] #cv_section("Past experiences") #dated_heading([=== Postdoctoral researcher], [Oct 2023 - Now]) <NAME>, Trento, Italy \ I was hired to work on the AI4Trust European project to better understand and tackle multimodal online mis/disinformation. #dated_heading([=== Data science intern], [Jan 2018 - Jul 2018]) HousingAnywhere, Rotterdam, Netherlands \ I analyzed operational data from a large SQL database to provide insights to other teams and also carried out machine learning projects at an exploratory phase. #dated_heading([=== Research intern], [May 2017 - Oct 2017]) CEA Irfu, Saclay, France \ I investigated the potential of Micromegas gaseous detectors for X-ray spectro-polarimetry. 
I performed experiments to test its performance and analysed data collected at the SOLEIL synchrotron to calibrate a detector. #v(1fr) #cv_section("Education") #dated_heading([=== PhD programme in Physics of Complex Systems], [Nov 2019 - Sep 2023]) #v(1em) - Institute for Cross-Disciplinary Physics and Complex Systems (IFISC), Palma, Spain \ With <NAME> and <NAME> as supervisors of my thesis, entitled "Complexity in sociolinguistics: exploring the interplay between geography, culture and the social fabric". \ María de Maeztu Unit of Excellence PhD position. - #dated_heading( [Department for Network and Data Science (DNDS), CEU, Vienna, Austria], [#text(10pt)[Sep 2022 - Dec 2022]] ) Research stay with <NAME> for a project investigating the interplay between socio-economic inequalities and language. Funded with a competitive mobility grant from Santander. #dated_heading([=== MSc in Physics], [Sep 2018 - Sep 2019]) Imperial College, London, UK \ 12-month taught programme with selected courses in Complexity & Networks, Computational Physics, Atmospheric Physics, General Relativity (among others). \ Thesis with Dr. <NAME> entitled "An axiomatic study of spatial interaction models". #dated_heading([=== Diplôme d'Ingénieur], [Sep 2015 - Sep 2019]) Ecole Centrale de Lyon, France \ Broad engineering training with selected courses in quantum physics, nuclear engineering and observation of matter. Last year at Imperial College as part of a joint degree programme. 
#dated_heading([=== Preparatory classes to Grandes Écoles exams], [Sep 2013 - Sep 2015]) <NAME>, Paris, France #pagebreak() #cv_section("Publications") #v(-1em) #for work in yaml("me.yaml").at("references").sorted( key: w => str(w.issued.at(0).at("year", default: w.issued.at(0).at("literal", default: ""))) ).rev() { if work.type in ("chapter", "book", "article", "article-journal", "paper-conference") { par(leading: 0pt)[ #text(0pt, fill: white)[#cite(label(work.id))]] } } #bibliography("me.bib", title: none, style: "ieee") #v(1fr) #cv_section("Talks") #for talk in json("talks.json").sorted(key: it => it.date).rev() { if "venueurl" in talk { link(talk.venueurl)[=== #talk.venue #talk.date.slice(0, 4)] } else [ === #talk.venue #talk.date.slice(0, 4) ] [ #talk.title \ #talk.authors ] } #v(1fr) // don't break page within short sections #block(breakable: false)[ #cv_section("Technical skills") === Programming (Python) (Geospatial) data processing, (interactive) visualization, parallel computing on a server cluster, natural language processing, machine learning. I publish all code used in my projects in GitHub repositories, develop small packages (querier, spylt) and contribute occasionally to libraries like #link("https://github.com/geopandas/geopandas/pulls?q=author%3Atlouf")[GeoPandas#super_ext_link_icon], #link("https://github.com/pandas-dev/pandas/pulls?q=author%3Atlouf")[pandas#super_ext_link_icon] and #link("https://github.com/RaRe-Technologies/gensim/pulls?q=author%3Atlouf")[Gensim#super_ext_link_icon]. === Programming (others) Version control with git, Linux proficiency, documentation writing with Sphinx, Rust (beginner). === Communication Fluent in French, English and Spanish. Proficient with LaTeX, Typst, Inkscape, GIMP. 
#cv_section("Academic service") - Elected member of the advisory board of the #link("http://yrcss.cssociety.org/")[young researchers of the Complex Systems Society], 2023-2025 - Co-organiser of the #link("https://sites.google.com/view/css-ccs23/home")[Computational Social Science satellite of the Conference on Complex Systems], starting 2023 - Served as a reviewer for Physica A and the Journal of Linguistic Geography. - Helped organise the Conference on Complex Systems 2022 in Palma. #cv_section("References") - Dr. <NAME> (PhD supervisor), IFISC, #link("mailto:<EMAIL>") - Prof. <NAME> (PhD supervisor), IFISC, #link("mailto:<EMAIL>") - Dr. <NAME> (collaborator), CEU DNDS, #link("mailto:<EMAIL>") ]
https://github.com/kazewong/lecture-notes
https://raw.githubusercontent.com/kazewong/lecture-notes/main/Engineering/SoftwareEngineeringForDataScience/lab/style.typ
typst
#let style_template(doc) = [ #set text( font: "Times New Roman", size: 11pt ) #show heading.where( level: 1, ): it => text( size: 18pt, weight: "extrabold", it.body, ) #show raw.where(block: false): box.with( fill: luma(240), inset: (x: 3pt, y: 0pt), outset: (y: 3pt), radius: 2pt, ) #show link: underline #doc ]
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/058%20-%20Duskmourn%3A%20House%20of%20Horror/009_Episode%205%3A%20Don't%20Give%20In.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Episode 5: Don't Give In", set_name: "Duskmourn: House of Horror", story_date: datetime(day: 28, month: 01, year: 2024), author: "<NAME>", doc ) To Tyvar's transmutation magic, the substance of the floor and walls was identical to the flesh of the monster that had attacked them. It was unnerving to think they were walking through the body of something alive and hostile. Still, as Tyvar wrapped them in the body of the House, they were able to cross rooms and navigate halls without being attacked. He dropped the camouflage whenever either of them started to feel unfamiliar emotions tugging at the back of their minds, keeping them free from the transformation. Zimone had to admire the skill with which he cloaked and uncloaked them, walking the thin line between safety and loss of self. During a pause, she asked how he knew what to do, and watched as his expression turned pained. "During the assault on New Phyrexia, I watched many of my allies lose themselves forever," he said, voice hollow. "Upon my return home, to my fair Kaldheim, I saw that same fate befall too many. I saw the World Tree burn. Koma, who I had revered, whose favor I had sought, fell to that cruel transformation. I felt it spread, across two worlds, and I know what the process feels like to the magic of my own changes. I simply wait to feel like the world is ending, and when it does, I let the magic go." "I'm sorry," said Zimone. "I didn't mean to—" "But, we should be moving along," he said, jovial once more. He put one hand against the wall and the other on her shoulder, and the flesh of the House flowed over them, and sorrow and grief and loss were suddenly far away, hidden behind a veil of nightmares. On they moved, through shifting rooms and portrait galleries, past another library, this one with empty shelves and scraps of paper littering the floor like blood-stained confetti. 
Zimone still looked at it longingly, like she wanted to stop and reassemble the slaughtered stories of this dead place from the bodies they had left behind. Tyvar flickered their concealment off and then on again, buying them time, and they moved on, into an impossible room: It was large and cozy, outfitted with several small couches and large, overstuffed chairs, the styles of furnishing apparently chosen to blur the lines between the two. Shelves lined the walls, filled with a mixture of books and knickknacks, and pleasant pictures hung on the cream-colored walls. There were no strange smells or unlikely shadows, no bloodstains or nightmares. It was just … pleasant, the sort of place either of them would have been perfectly willing to doze an afternoon away. Sunlight streamed through the two closed windows, bathing the teenage girl who sat on one of the larger couches, a diary open on her knee. She wrote a line, then paused, sticking the end of her pen in her mouth as she looked thoughtfully out the window at the quiet-looking street below. She had a tumbler of amber liquid filled with ice chips, tea maybe, resting on a low nearby table, and looked as peaceful as she was out of place. This room didn't belong here. It was antithetical to the House around it, and it shouldn't have existed. Zimone gasped, grabbing Tyvar's arm in her surprise. He turned to look at her, frowning slightly. "I know her," whispered Zimone. "She was in the picture that I found in the first parlor, all the way back when we first came inside! But that picture was very old … she's younger than I am. How …" She fumbled for her detector, raising and sweeping it in front of her like she was scanning the area. "There's no active time magic here. How can she be younger than I am?" "Perhaps she's a ghost?" suggested Tyvar. "A spirit of some sort? She doesn't seem to be aware of our presence, and even wrapped in the body of this house, I'm difficult to overlook." Zimone looked at him unbelievingly. 
"Are you bragging about how hot you are? Like, #emph[right now] ?" "I am simply stating facts—and look. She's leaving." The girl, who still hadn't noticed them, rose, taking drink and diary with her as she left the room. The transformation was terrible and swift. Clouds immediately choked the sky outside, swallowing the sunlight; the wallpaper began to peel; and as the curtains slithered shut across the now-dark windows, blood began to bubble from the floor, marking her footprints. "That doesn't seem good," said Tyvar. "No, it's not," said Zimone. "Come on." She took off after the girl, and this time it was Tyvar who was forced to follow. They caught up with her in the next room, a small conservatory that was coming to life around her, dead plants turning green and standing up straight, ambitious vines releasing their grasp on walls and furnishings as they retreated into their pots. Zimone watched all of this with an eagle eye. "I think I understand," she said, slowly. "Whatever she is, she's repelling the House. It can't touch her. I just don't know why—and I think that's why she can't see us. If the House can't attack her in this room, we should be safe here. Let the transmutation go." Tyvar nodded and released the spell. The two returned to ordinary flesh, and Zimone cleared her throat. "Hello?" she called. The girl turned, seeming to see them for the first time. Confronted with a strange woman wearing unfamiliar clothing and carrying a boxy scanner, and a shirtless, barefoot elf, she did the only reasonable thing: she dropped her drink and screamed. "Oh, no!" yelped Zimone. "We're not here to hurt you! We just wanted to talk to you." "What are you doing in my house?" demanded the girl. "Wandering lost, trying to evade an almost certain death, and searching for our friends, who were separated from us by a maliciously appearing wall," said Tyvar. The girl just looked more confused. "I'm sorry," said Zimone. 
"We were following an unexplained phenomenon, and we wound up here. I'm Zimone. You are …?" "Marina," said the girl. "<NAME>. Unless you're here to kill me, and then I'm about to run outside to scream for the watch." #figure(image("009_Episode 5: Don't Give In/01.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) "A lovely name," said Tyvar. "Both of them. I am <NAME>, prince of Skemfar. We aren't here to kill you. May we speak?" "I … suppose," said Marina. "How did you get in here?" "Through the door," said Zimone. "We need to know what happened to the House." "What do you mean, what happened to the house?" Marina frowned, apparently confused. "It's the house, same as it's always been. Are you all right?" "The House has #emph[always] had monsters popping out of its walls and trying to eat people?" asked Zimone, horrified. Marina stared at her. "What? No!" "I found a book in the library about the history of this place, and it didn't mention the monsters, either," said Zimone. "So, something clearly happened." "Oh, you mean #emph[An Architectural Accounting] ? That was started by the last person to live here. Bit of a weirdo, by all accounts—she felt a house should be treated with the respect you would show a living thing. She lived here for many years. Died here, too. When we bought the place, the realtor asked us to update it—it was one of the conditions of sale, that we keep the book current with our occupancy. Mom and Dad thought it was silly, but I thought it was sort of sweet. Everything deserves to be respected for its place in the world." "That's the book," said Zimone. #figure(image("009_Episode 5: Don't Give In/02.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) "Was that the one with the academically unpleasant magic annotated in the margins?" asked Tyvar. Marina flushed red. "I was just taking notes," she said. Zimone frowned. "That was #emph[you] ? 
Marina, that kind of magic is—well, it's dangerous. People could be seriously hurt, or worse, if you used anything like that. #emph[Did] you use something like that? Is that what happened to the House?" Marina responded like a much younger child being told that she'd been naughty: she clamped her hands over her ears and screwed her eyes tightly shut, chanting, "You're not real, you're not here, you're not real, you're not here." While the square of floor directly under her feet remained unchanged, the rest of the room began to warp and twist, the walls opening into holes from which crawled pustulant nightmare beasts, their long, multijointed limbs reaching for Tyvar and Zimone. Zimone took a sharp step toward Marina, reaching for the girl, only to be stopped by Tyvar grabbing hold of the back of her shirt. He ran, pulling her with him. Deeper into the House they went, the beasts in pursuit, until they managed to whip around a corner and flatten themselves up against a wall, letting the House-flesh creep over them as Tyvar's magic took hold again. Panting, they pushed free and peeked back into the hall, watching as the house-spawn thundered furiously past. "The House protects her," said Tyvar. "Looks like it," said Zimone. "She has something to do with whatever happened here, I know she does. And we have the proof." Tyvar frowned. "Proof?" Zimone held up the small, tattered book she had grabbed before being pulled out of the conservatory. "I got her diary," she said. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) "Normally, I would view this as a massive invasion of privacy," said Zimone. It had taken them three rooms to find an unbroken table where they could settle to review her find. "I #emph[looked] at Rootha's diary once, and she went on this whole little monologue about how easily people burn. I'm not even sure she was trying to scare me by the end. She was mostly just reassuring herself, and fire keeps her calm. 
Anyway, diaries are supposed to be top secret, don't look, not ever, but I think our circumstances are a little unique." "You seem to be talking yourself into it," said Tyvar. "Do you have diaries on Kaldheim?" "Among my people, if a thing is not meant to be repeated, it should not be written down," he said. "Stories and sagas are for sharing, and secrets are for swallowing, to keep them safe from other eyes." "Oh. Well, it's a good thing for us that she keeps a diary, because we need to know," said Zimone, and opened the little book to the beginning. Marina's handwriting was crisp and reasonably clear. Zimone began to read. #emph[I] #emph[ don't ] want#emph[ to move, but what I want doesn't matter, because we're moving. It's "better for Dad's work," and I'll be going to "a great new school" where they're sure I'll make "so many wonderful new friends."] #emph[Yeah. Because sixteen years in our old neighborhood with only the other weirdos to show for it absolutely has me primed to be the social butterfly of a new scholastic environment. I'm more of a social moth. Stick to the shadows, stay out of the way, and hope I don't get swatted before I can find a lantern to immolate myself on. Not like they'd notice, which, as much as they're trying to pretend I'm not here]  #emph[… ] "It goes on like this for a few pages," said Zimone, flipping forward. "I think Marina was really lonely, and sort of scared by the move, so she pretended she didn't care. But wait, here …" She began to read again. #emph[The presence I felt in the basement when we first moved in, I felt it again, so I went down to see what I could find. This time, it talked to me! His name is Valgavoth—Val—and he was summoned here and bound by one of the owners before us. They thought he would be like the little service spirits people call to do simple tasks, and when they realized how big he was, how powerful, they panicked and ran, but they didn't release him. He's been here this whole time, alone. 
He says he can give me a list of books that might help me figure out how to free him]  #emph[… ] "And I think this is where she started doing the academically questionable research," said Zimone. "There's some really dense stuff here, about demonology and necromancy and spirit-binding, and it all adds up to bad news. But this Valgavoth was captive, and for the House to be this active, he must still be here." "Even now?" asked Tyvar, seemingly enthralled. "Read on, skald, and tell me the tale." "You're very weird," said Zimone. #emph[I read the books, and I think I could let him go, but I]  #emph[… I don't really want to anymore. School's still awful, and maybe it's mean of me, but Val is the only real friend I've made since we moved here. I don't want to lose him. Also, he's been locked up for a long, long time, and he's pretty mad about it, even though he tries to pretend he's not when we're talking. I think if I let him out, he might hurt a lot of people before he leaves. I don't want to hurt a lot of people.] "So, Marina was essentially a good person at one point, even if she was making poor academic decisions," said Zimone. "It seems." "Something must have changed," said Zimone. She rubbed the back of her head. "My thoughts are starting to itch. Let's be ourselves for a few seconds?" Tyvar nodded, dropping the transmutation. The crushing weight of the House's regard rushed back in, and after a moment, Zimone said: "Okay, put it back. I don't want to keep reading while the House can see us." #emph[Today was the worst day yet. Some of the girls in my necrobiology class decided to corner me after class and they—] #emph[What they—called me—pushed me—the wall.] "A whole bunch of this has been scribbled out," said Zimone. "But what's left looks pretty bad. There're tear stains on the paper. I think they hurt her bad enough to make her cry." #emph[Val says he can make them pay for what they did. He can make them suffer. I don't care anymore. I can't live like this.] 
#emph[I'll invite them over tomorrow after school.] Zimone flipped to the next entry, barely breathing: #emph[What have I done?] #emph[He ] #emph[took them. Hands reached out of the walls and grabbed them, and they were screaming and screaming and ] changing#emph[, like he was moving underneath their skin, and then they stopped screaming and he pulled them into the walls and they were gone. Nothing left at all.] #emph[I did this. He's trapped: no matter how angry he is or how much he wants to hurt people, he couldn't have done it without me. I did this to them. I let this happen.] #emph[I can't sleep. The house keeps creaking, and the walls are pulsing, like they're trying to breathe. I keep thinking I can hear them moving in there, trapped inside the house, trying to get away.] #emph[I did this.] Zimone looked up. "Oh, no." She turned to the next entry: #emph[The houses to either side are gone, and our house is bigger now. I think Val is the house, somehow, after all this time, and he's hungry and he's angry and I gave him the power to start eating the world around him. I can't let this go on. I have to talk to him. I have to find a way to save myself, to save my parents. Maybe I can't save the world, but I don't have to be entirely a monster, do I?] #emph[It's not too late.] Zimone closed the diary, staring at the wall as she set it to the side. "Those rituals she was researching … with the power from four people's lives, Valgavoth would have been able to expand his reach exponentially. Trap more people inside himself, and then do it again, and again, and again. Until there was nothing left outside." "What do you mean?" "I don't think there's a plane left outside the House." Zimone turned to look at Tyvar, face strange under her mask of horror-house, eyes wide and terrified. "I think at this point, mathematically speaking, Valgavoth is all." 
#v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) The outdoor "room" where they had encountered the dancing wickerfolk was connected to more of the same, "rooms" containing forests and thorn breaks and desolate hillsides. Nashi led them deeper and deeper, through environments that should never have been contained in this manner. The Wanderer thought she heard a river running in the distance, water breaking over stones, and as impossible as the sound was, she wanted to follow it to its source. She wanted to #emph[know] . Every time it started to feel like they had managed to leave the House, there would be a glimpse of wall or a glint of light off a half-hidden window; however natural these environments felt, they were all fully contained. Niko shuddered at the sight of an earthen mound studded with bones and surrounded by trees, so like the cairns of Kaldheim, so unlike the tombs of Theros. The differences didn't matter. They looked at the similarities, and they knew a graveyard when they saw one. The group walked on, Nashi still leading the way, although Winter nodded encouragingly when anyone looked to him, indicating that they were on the right track. Forest bled into thicket into briar, until they emerged into a broad meadow too impossibly large to be contained. Huts were scattered across the rolling hills, some with smoke rising from their chimneys, making gray streaks in the pale air. Somewhere during their journey, the lights had come back on, slowly at first, getting brighter by degrees, until they could see every inch of the terrible landscape around them. Even in the "outside," the House continued to haunt them. Screaming faces were etched in the bark of looming trees—and after the wickerfolk, it was impossible to say whether the trees screamed because they had grown that way, or because they had once been survivors and were still intelligent and aware of their frozen fates. 
The brush that rose from the hillside in stunted clusters was suspiciously bony, giving the impression that it might close on a foot or snag a hem at any moment. This was not a good place, no matter how fresh the air seemed, or how merrily the rivers ran. Winter made a small, unhappy sound. The Wanderer and Niko looked to him, and he shook his head. "This is the Valley of Serenity," he said. "The Cult of Valgavoth lives here. We should go back." "My mother has been calling me from this direction," said Nashi. "I know we're in the right place." "But—" "You don't have to come. I didn't ask you to follow me here." "We're going with you, whatever that means," said the Wanderer. She shot Winter a challenging look. Winter sighed. "Don't say I didn't warn you," he said, and the trio walked on. The Wanderer's glimmer still drifted around her shoulders, circling her as she walked. Niko pulled a shard out of the air, spinning it between their fingers and looking thoughtfully at Winter. "Who's this cult you're talking about?" they asked. "The Cult of Valgavoth," said Winter. "They claim Duskmourn was created by an entity summoned by their ancestors, one who took root in this House and grew to swallow everything. The demon called Valgavoth is as trapped here as the rest of us, but he doesn't mind so much anymore, since he eats well. This is a perfect feeding ground for something like him, fertile and flourishing. The cult hunts survivors, both the ones who were born here and the people like me and my friends—or you. They catch us when they can, and drag us away to either be converted or given to Valgavoth. Either way, the ones they capture never come back." "They sound like charming people." "They're not." "I was being sarcastic." "I know." "Then why—" Niko caught themself, shaking their head. "Never mind. This cult is bad news, then?" "The very worst," said Winter direly. Niko looked at him. 
"After the wickerfolk, the razorkin, and the glitch ghosts, #emph[this] is what you want to call the very worst?" "You'll see," said Winter. He sighed. "I was hoping it would take us longer to encounter them." "So, you knew we'd encounter them." "They're inevitable in Duskmourn. They, like their Devouring Father, are everywhere." Niko frowned and followed Nashi and the Wanderer through a stone tunnel shaped like a doorway, into the next room. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) It looked like a natural cavern worn out of rock by centuries of erosion, the walls rough and uneven, the ceiling bristling with stalactites. Some matching stalagmites grew up from the floor, but most of those showed the signs of intentional reshaping, their tops chipped off and smoothed out to make flat surfaces supporting sacramental lanterns, bowls, and even a large stone slab altar made by setting a sheet of hand-chipped quartz atop four leveled stalagmites. All those things were set dressings, immutable facts of the environment, and nowhere near so disturbing as what the room contained. Half a dozen humanoid figures in long robes shaped like moth's wings, clean and well mended but tattered at the hems. It took Niko a moment to realize why the condition of their clothing was so ominous. There were no patches. No stains. They had the luxury of caring for themselves in a way that Winter never had, and by extension, the rest of the survivors wandering the House would have been denied. In this place, cleanliness was virtually a declaration of power. Chrysalises hung around the edges of the room, hard, angular things that gave the impression of natural and unnatural geometry at the same time, like they were shaped from platonic solids dredged out of another dimension. They were painted in shades of green and brown, and as Niko watched, one of them twitched, moved by something from inside. It was unsettling. It hurt their eyes to look at for too long. 
Three people in clothing more like Winter's were bound at the center of the room, struggling weakly against the ropes that held them. One had a vicious-looking gash on her leg, cutting through layers of fabric to the flesh beneath; the other two appeared to be unharmed. One of the robed figures—who held a leather-bound book close to his chest—was preaching to them. "The Devouring Father has not yet refused your service," he said, voice sonorous and rolling, clearly pitched to beguile. "Be reborn in his name, and you may yet be transformed in glory, cleaned of your fear by the Gift of the Threshold. Wouldn't it be glorious, children, to no longer be afraid? To no longer walk wrapped in the chains of your weakness, unable to stand proud and confident beneath his eaves?" The injured survivor began to cry, noisily. "Yes," she said, through her tears. "It would be so nice. I've been so scared." "Hush," hissed one of her companions. "They have our gear, but we can still get out of here. We can survive this." "We #emph[can't] ! I told you and I told you, but you didn't listen, because you wanted to be brave. Well, being brave isn't the same thing as being unafraid." She looked at the speaker, eyes wide and wet. "I don't want to be afraid anymore." "Take her," said the speaker. The other figures—cultists—moved forward, cutting her free. One of them kissed her forehead. Another took her arm. As a group, they pulled her toward the nearest chrysalis, which was split down the middle, and began to ease her inside. Niko couldn't watch any longer. They jumped onto the nearest stalagmite, a shard appearing in either hand. "Hey!" they yelled. "Leave her alone!" Slow and deliberate, the speaker turned to look at them, and then at their companions. "Amazing," he said. "You actually did it." The Wanderer began to step forward, only to feel a strange, placid feeling seep over her. 
It wasn't enough to drop her—she had more willpower than that—but it made the hands that suddenly grabbed her arms from either side feel as though they were made from iron. Nashi moved to her side, and was restrained in a similar fashion, as were the other nezumi, leaving only Niko and Winter free. Niko dropped down from the stalagmite and moved to flank Winter, clearly intending to help the man avoid capture. Winter only bowed his head, saying nothing, as the cultists rose up and grabbed Niko in turn. The speaker moved toward the pair as Niko struggled, trying to break free. He smiled and placed an approving hand on Winter's shoulder. "You shall be most favored," he said. "At this point, I better be," said Winter, and stepped backward, pulling a loose stone from the wall. A slab of solid granite crashed down between him and the others, sealing them in the room with the cultists. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) It was an unfair fight to begin with. One of the greatest swordswomen in the Multiverse, a javelin thrower who couldn't miss, and Nashi, who had learned survival on the streets of Kamigawa, running with people who didn't have his safe home to retreat to, against seven cultists. There was no way for them to lose. But then, their leader raised his book and blew across the pages, and a thick layer of dust rolled off the paper, gleaming silver in the light as it coated everything around it. Their limbs became heavy, and their movements became slow, and then they blacked out briefly, falling into a nothingness that was purely outside of and distinct from sleep. The Wanderer was the first to wake. Being jerked back and forth across the Multiverse by her own spark for years had left her better equipped than most to recover from sudden shocks to the system, and she was able to shake off the lingering effects of the sedative dust to find herself held up by loops of some white, cottony material, pinning her tightly to a post. 
They were in a different cavern now, this one larger and darker, with jagged outcroppings of rock on the wall. She could see her companions, tied to stone posts of their own around the edges of the room. One wall was taken up by a vast, pulsing cocoon that throbbed like the beating of a heart, the whisper-soft sound of its contractions and expansions echoing through the chamber. At the center of the room was another altar, and on the altar, a square device like the one Winter carried, like the ones they'd seen abandoned around the House. The top was open, and Tamiyo's scroll echo floated there, flickering in and out of focus, like it hurt her to be manifest. Her shoulders were slumped, and her head was bowed, and she spoke in a voice too soft for the Wanderer to hear to the man in front of her, who was writing down her every word. "Such delicious stories," said a voice to her left. She whipped her head around, as far as she could, and saw the man who'd been leading the ritual in the first cavern standing beside her, book still in his hands. "She was a glorious discovery, a jewel beyond all price, and we are beyond honored to have her." "She's not a thing to be #emph[taken] ," spat the Wanderer. "And yet we took her," said the man, almost jovially. "The Devouring Father has no interest in leaving Duskmourn. This has been a glorious cocoon to feed and nurture him, and he has grown strong. But the space outside the walls has changed recently, and now he can spread his blessings even further, to more and more new worlds. He no longer needs to strain with all his might to open a door. All he needs is to know that they exist." The Wanderer tried to struggle against her bonds. She could hear Nashi and Niko beginning to stir. "You're stealing her stories. You're giving them to a #emph[monster] ." "If you wish to remain heretics, I cannot save you, but we would be gloriously glad if you would join us," said the man, sounding almost sorry. "The offer is made. 
You need only accept." The Wanderer scowled, shaking her head. The man sighed. "A pity and a waste, but you have time to change your answer." Book under his arm, he walked away from her, toward the altar where the memory of Tamiyo helplessly recounted her life's work for a cruel audience. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) The sounds of Ravnica were like an assault after the eerie quiet of Duskmourn. Kaito spun around, wary of the possibility of attack. He was in the alley where he had first met Zimone, where this had all begun—#emph[had he been aiming for Ravnica? Had he been aiming for anything? He'd been desperate to escape before it was too late]  #emph[… ] Something bit into his clenched palm. He forced it open. A piece of bloodstained wood rested in his hand, a parting gift from Duskmourn. Kaito scowled at it, then cast his awareness out across the Blind Eternities, searching for the tainted foundation of the House. Grasping hold, he attempted to throw himself across the void, out into the infinite— And nothing happened. Panic momentarily gripped him, hand snapping shut on the splinter as fear bubbled up from the pit of his stomach. #emph[Had the time come for his spark to blow out as well, leaving him as dependent on the Omenpaths as so many of the others? Was he trapped? Was Kamigawa without a defender who could react with any proper speed?] Panic put that same speed into his heels as he hurried down the alley toward the spot where Niv-Mizzet and the others had set up camp. Aminatou and Etrata saw him and waved for him to stop, but he continued onward, heading for the quarantine zone. He was almost there when Niv-Mizzet closed a great talon on his shoulder, pulling him to a halt. "Kaito, what happened?" he asked, almost gently. Kaito stopped. "We were separated, and then Jace … Jace was there." "#emph[Beleren] ?" demanded Niv-Mizzet. "He #emph[left] me." Kaito looked at Niv-Mizzet, offended. 
"He said he was sorry, and he #emph[left] me." Aminatou and Etrata had reached them by that point, Yoshimaru on Aminatou's heels. She reached toward Kaito, then appeared to think better of it and pulled her hand back. "What you fear hasn't come to pass," she said. "You're still lit, and not yet blown to embers. The trouble isn't in you. It's in the House." "The House?" asked Kaito. "It knows you now. It knows you're more trouble than you're worth. It doesn't want you inside its walls. Your friend's fate depends on the people who are already with him. If you chose wisely, he'll be free." Proft walked up to the small group, moving with less urgency than the others. He looked to Kaito's hand, which still clutched the broken-off piece of Duskmourn. "If you would come with me, I think I have an idea that might help us," he said, and motioned Kaito forward. Lacking any better ideas, Kaito followed him back over to the workstations on the other side of the courtyard. Time was passing. And it wasn't on their side.
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/songb/0.1.0/tests/song.typ
typst
Apache License 2.0
// SPDX-FileCopyrightText: 2024 <NAME> <<EMAIL>>
//
// SPDX-License-Identifier: CC0-1.0

#import "../song.typ": song as s, chorus, verse, index-by-letter
#import "../autobreak.typ": autobreak

#set page("a5")

#columns(2)[
  = Songs
  #index-by-letter(<song>)
  #colbreak()
  = Singers
  #index-by-letter(<singer>)
]
#pagebreak()

#let song(
  title: none,
  title-index: none,
  singer: none,
  singer-index: none,
  references: (),
  doc,
) = {
  autobreak(s(
    title: title,
    title-index: title-index,
    singer: singer,
    singer-index: singer-index,
    references: references,
    doc,
  ))
  v(-1.2em)
}

#song(
  title: "First song",
)[
  #verse[Premier couplet]
  #chorus[Refrain]
  #verse[Deuxième couplet]
  Bridge
  #verse[Troisième couplet]
]

#song(
  title: "Second song",
)[
  #verse[Premier couplet]
  #chorus[Refrain]
  #verse[Deuxième couplet]
  Bridge
  #verse[Troisième couplet]
]

#counter(heading).update(99)

#song(
  title: "French song",
  singer: "<NAME>",
)[
  #verse[Premier couplet]
  #chorus[Refrain]
  #verse[Deuxième couplet]
  Bridge
  #verse[Troisième couplet]
  #lorem(255)
]

#song(
  title: "German song",
)[
  #verse[Premier couplet]
  #chorus[Refrain]
  #verse[Deuxième couplet]
  Bridge
  #verse[Troisième couplet]
]

#context [#metadata((
  name: "Additional <song>",
  sortable: "Additional",
  references: none,
  counter: counter(heading).get().first() + 1,
)) <song>]

#context [#metadata((
  name: "Additional <singer>",
  sortable: "Additional",
  references: none,
  counter: counter(heading).get().first() + 1,
)) <singer>]

#song(
  title: "German song",
  singer: "<NAME>",
  references: ("again to test repetition in the index",),
)[]

#song(
  title: "Pseudo German song",
  title-index: "German song",
  references: (
    "with a title-index of \"German song\"",
    "to set it up somewhere else in the index",
  ),
)[]

#song(
  title: "Hidden song",
  title-index: "",
  references: ("with an empty title-index to hide from the index",),
)[
  With a reference
]

#song(
  title: ['74 --- '75],
  singer: [O' Connells],
  singer-index: "#",
)[
  Typography should not be lost in the index!\
  Another paragraph

  And a last one\
  And a last one\
  And a last one\
]

#song(
  title: "song with a lowercase",
)[]

#song(
  title: "1, 2, 3",
  singer: "<NAME>",
)[]

#song(
  title: "Et 1, et 2, et 3 zéro!",
  title-index: "#1 et 2",
  singer: "<NAME>",
)[]

#song(
  title: [ The German song ],
  title-index: "German song (The)",
)[]
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cheda-seu-thesis/0.2.0/README.md
markdown
Apache License 2.0
# "cheda" Southeast University Thesis Template (cheda-seu-thesis)

A Typst re-creation of Southeast University's "Undergraduate Graduation Design (Thesis) Report" template and "Graduate Degree Thesis" template.

> [!IMPORTANT]
>
> This is an unofficial, community-made template; the university may not recognize it.
>
> It may still contain a number of formatting issues.
>
> Use this template at your own risk.

The versions on Typst Universe and in the Web App may not be updated promptly. Visit https://github.com/csimide/SEU-Typst-Template for more information about this template, known issues, and the development roadmap.

## Usage

### Web App

> [!NOTE]
>
> The Web App lacks the SimSun (宋体), SimHei (黑体), and STZhongsong (华文中宋) fonts. To use the template online, download them from https://github.com/csimide/SEU-Typst-Template/tree/master/fonts and upload them to the Web App manually.
>
> **Because of this font issue, the Web App is not recommended!!**

In the Web App, choose `Start from a template`, then select `cheda-seu-thesis`.

Afterwards, follow the instructions inside the files.

### Local editing

Run `typst init @preview/cheda-seu-thesis:0.2.0` to obtain the starter files, then follow the instructions inside them.

Tips on editing Typst files in a local environment are available at https://github.com/nju-lug/modern-nju-thesis#vs-code-%E6%9C%AC%E5%9C%B0%E7%BC%96%E8%BE%91%E6%8E%A8%E8%8D%90 .

## Development and license

Except for the files noted below, this project is released under the MIT License.

- Files under `demo/demo_image` come from the undergraduate graduation design template of the SEU Office of Academic Affairs.
- Files under `seu-thesis/assets/` are derived from the SEU Office of Academic Affairs template or taken from SEU's visual identity materials.
https://github.com/kotfind/hse-se-2-notes
https://raw.githubusercontent.com/kotfind/hse-se-2-notes/master/os/seminars/2024-09-06.typ
typst
= Introduction

Instructor: Maxim

#figure(
  table(
    columns: 2,
    table.header[*Score*][*Condition*],
    [max], [during the seminar],
    [0.7max], [on the day of the seminar],
    [0.5max], [within the next six days],
    [0], [after the deadline],
  ),
  caption: [Penalties for late submission of tasks],
)

== Monolithic system vs. microkernel

A microkernel is composed of separate modules.

A monolithic system is a single, monolithic unit.

== POSIX

A standard defining the interface between an application program and the system.

== System calls and the libc library

A system call is the mechanism by which an application program uses the capabilities of the kernel.
https://github.com/fyuniv/typstModernCV
https://raw.githubusercontent.com/fyuniv/typstModernCV/main/lib.typ
typst
MIT License
// const icons
#let icon_size = 0.8em
#let orcid_icon = box(height: icon_size, image("icons/orcid.svg"))
#let github_icon = box(height: icon_size, image("icons/github.svg"))
#let phone_icon = box(height: icon_size, image("icons/phone.svg"))
#let email_icon = box(height: icon_size, image("icons/email.svg"))
#let web_icon = box(height: icon_size, image("icons/web.svg"))

#let github_link = "https://github.com/"
#let orcid_link = "https://orcid.org/"

#let icon_link(icon, web, account_name) = {
  align(horizon)[
    #icon #link(web + account_name, account_name)
  ]
}

// Resume layout
#let left_column_width = 16%
#let column_gutter = 1.2em

#let resume(
  title: "",
  author: (:),
  date: datetime.today().display("[month repr:long] [day], [year]"),
  theme: (:),
  body
) = {
  set text(
    font: theme.font,
    lang: "en",
    size: 11pt,
    fallback: true
  )

  // Set up theme color
  let primary_color = {
    if (theme.color == "") { rgb("#4B0082") }
    if (theme.color == "blue") { rgb("#0D8FCD") }
    if (theme.color == "orange") { rgb("#CC5500") }
  }
  let accent_color = {
    if (theme.color == "") { rgb("#228B22") }
    if (theme.color == "blue") { rgb("#333333") }
    if (theme.color == "orange") { rgb("#422921") }
  }

  set document(
    author: author.firstname + " " + author.lastname,
    title: title,
  )

  // Set page style
  set page(
    paper: "us-letter",
    margin: (left: 0.8in, right: 0.8in, top: 0.6in, bottom: 0.6in),
    footer: [
      #set text(fill: accent_color, size: 8pt)
      #grid(
        columns: (1fr,) * 3,
        align: (left, center, right),
        smallcaps[#date],
        counter(page).display("1 / 1"),
        smallcaps[
          #author.firstname #author.lastname
          #sym.dot.c
          #title
        ],
      )
    ],
  )

  // Set paragraph spacing
  show par: set block(above: 0.8em, below: 1em)
  set par(justify: true)

  // Set heading styles
  set heading(
    numbering: none,
    outlined: false,
  )
  show heading.where(level: 1): it => {
    set text(
      size: 1.2em,
      weight: "semibold"
    )
    grid(
      columns: (left_column_width, 1fr),
      column-gutter: 1.2em,
      align(horizon, line(stroke: 5pt + primary_color, length: 100%)),
      pad(left: -0.5em, text(fill: primary_color, it.body))
    )
  }
  show heading.where(level: 2): it => {
    set text(primary_color, size: 1.1em, style: "italic", weight: "regular")
    grid(
      columns: (left_column_width, 1fr),
      align: (right, left),
      column-gutter: 1.2em,
      row-gutter: 0.5em,
      [],
      pad(left: -0.25em, it.body)
    )
  }
  show heading.where(level: 3): it => {
    set text(size: 10pt, weight: "regular")
    smallcaps[#it.body]
  }

  // Set name style
  let name = {
    set text(
      size: 2.5em,
      weight: "bold",
      fill: primary_color
    )
    pad(bottom: 0.5em)[
      #author.firstname #h(0.2em) #author.lastname
    ]
  }

  // Set position style
  let positions = {
    set text(
      accent_color,
      size: 0.8em,
      weight: "regular"
    )
    smallcaps(
      author.positions.join(
        text[#" "#sym.dot.c#" "]
      )
    )
  }

  // Set address style
  let address = {
    set text(
      size: 0.8em,
      weight: "bold",
      style: "italic",
    )
    align(right, author.address)
  }

  // Set contact information styles
  let contactinfo = {
    align(right)[
      #set text(size: 0.8em, weight: "regular", style: "normal")
      #align(horizon)[
        #if (author.phone != "") {
          phone_icon
          box(inset: (left: 6pt), author.phone)
        }
        #if (author.email != "") {
          email_icon
          box(inset: (left: 6pt), link("mailto:" + author.email)[#author.email])
        }
      ]
    ]
  }

  let socialinfo = {
    set text(size: 0.9em, weight: "regular", style: "normal")
    grid(
      columns: (1fr,) + (auto,) * 3,
      column-gutter: 1.5em,
      align: (left, right, right, right),
      [],
      if (author.web != "") { icon_link(web_icon, "", author.web) },
      if (author.orcid != "") { icon_link(orcid_icon, orcid_link, author.orcid) },
      if (author.github != "") { icon_link(github_icon, github_link, author.github) }
    )
  }

  grid(
    columns: (1fr, 1fr),
    align: (left, right),
    { name; positions },
    { address; contactinfo },
  )
  socialinfo
  body
}

// Resume-list: marker will be placed on the left
#let resume_list(body) = {
  set text(size: 1em, weight: 400)
  set par(leading: 0.8em)
  set enum(
    numbering: n => {
      box(width: left_column_width, align(right)[#n.])
    },
    body-indent: 1.2em
  )
  show list.item: it => {
    grid(
      columns: (left_column_width, 1fr),
      align: (right, left),
      column-gutter: 1.2em,
      row-gutter: 0.5em,
      $bullet$,
      it.body
    )
  }
  body
}

// Resume entry. Displayed as:  date  title     university
//                              []    location  description
#let resume_content(body) = {
  grid(
    columns: (left_column_width, auto, 1fr),
    align: (right, left, right),
    column-gutter: 1.2em,
    row-gutter: 0.5em,
    [],
    body
  )
}

#let resume_entry(
  date: "",
  title: "",
  department: "",
  university: "",
  location: "",
  description: ""
) = {
  grid(
    columns: (left_column_width, auto, 1fr),
    align: (right, left, right),
    column-gutter: 1.2em,
    row-gutter: 0.65em,
    grid.cell(
      rowspan: 2,
      date,
    ),
    strong(title),
    department,
    emph(university),
    location,
    [],
    description
  )
}

// Reverse the numbering of enum items. Shared by frozolotl; a reimplementation
// of enum that can achieve the same effect is available at
// https://gist.github.com/frozolotl/1eeafa5ff4a38b2aab412743bd9c1ded.
#let reverse(it) = {
  let len = it.children.filter(child => child.func() == enum.item).len()
  set enum(
    numbering: n => box(width: left_column_width, align(right)[#(1 + len - n).])
  )
  it
}

// ---- End of Resume Template ----
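// ---------------------------------------------------------------
// Hypothetical usage sketch (an editorial addition, not part of the
// original template): every personal value below is illustrative.
// In a real document you would `#import` this file rather than
// append to it; the calls sit here only so the sketch is
// self-contained against the definitions above.
#show: resume.with(
  title: "Curriculum Vitae",
  author: (
    firstname: "Ada",
    lastname: "Lovelace",
    positions: ("Analyst", "Lecturer"),
    address: "London, United Kingdom",
    phone: "+44 0000 000000",
    email: "ada@example.org",
    web: "example.org",
    orcid: "0000-0000-0000-0000",
    github: "ada",
  ),
  theme: (font: "Libertinus Serif", color: "blue"),
)

= Experience

#resume_entry(
  date: "1843",
  title: "Translator and annotator",
  department: "Notes on the Analytical Engine",
  university: "Scientific Memoirs",
  location: "London",
  description: "Annotated translation of Menabrea's memoir.",
)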
https://github.com/Pablo-Gonzalez-Calderon/showybox-package
https://raw.githubusercontent.com/Pablo-Gonzalez-Calderon/showybox-package/main/examples/examples.typ
typst
MIT License
#import "@preview/codelst:1.0.0": * #import "../showy.typ": * #set text(font: "HK Grotesk", size: 12pt) #set heading(numbering: "I.") = Stokes' theorem example #sourcecode(```typ #showybox( title: "Stokes' theorem", frame: ( border-color: blue, title-color: blue.lighten(30%), body-color: blue.lighten(95%), footer-color: blue.lighten(80%) ), footer: "Information extracted from a well-known public encyclopedia" )[ Let $Sigma$ be a smooth oriented surface in $RR^3$ with boundary $diff Sigma equiv Gamma$. If a vector field $bold(F)(x,y,z)=(F_x (x,y,z), F_y (x,y,z), F_z (x,y,z))$ is defined and has continuous first order partial derivatives in a region containing $Sigma$, then $ integral.double_Sigma (bold(nabla) times bold(F)) dot bold(Sigma) = integral.cont_(diff Sigma) bold(F) dot dif bold(Gamma) $ ] ```) #showybox( title: "Stokes' theorem", frame: ( border-color: blue, title-color: blue.lighten(30%), body-color: blue.lighten(95%), footer-color: blue.lighten(80%) ), footer: "Information extracted from a well-known public encyclopedia" )[ Let $Sigma$ be a smooth oriented surface in $RR^3$ with boundary $diff Sigma equiv Gamma$. If a vector field $bold(F)(x,y,z)=(F_x (x,y,z), F_y (x,y,z), F_z (x,y,z))$ is defined and has continuous first order partial derivatives in a region containing $Sigma$, then $ integral.double_Sigma (bold(nabla) times bold(F)) dot bold(Sigma) = integral.cont_(diff Sigma) bold(F) dot dif bold(Gamma) $ ] = Gauss's Law example #sourcecode( ```typ #showybox( frame: ( border-color: red.darken(30%), title-color: red.darken(30%), radius: 0pt, thickness: 2pt, body-inset: 2em, dash: "densely-dash-dotted" ), title: "Gauss's Law" )[ The net electric flux through any hypothetical closed surface is equal to $1/epsilon_0$ times the net electric charge enclosed within that closed surface. The closed surface is also referred to as Gaussian surface. 
In its integral form: $ Phi_E = integral.surf_S bold(E) dot dif bold(A) = Q/epsilon_0 $ ] ``` ) #showybox( frame: ( border-color: red.darken(30%), title-color: red.darken(30%), radius: 0pt, thickness: 2pt, body-inset: 2em, dash: "densely-dash-dotted" ), title: "Gauss's Law" )[ The net electric flux through any hypothetical closed surface is equal to $1/epsilon_0$ times the net electric charge enclosed within that closed surface. The closed surface is also referred to as Gaussian surface. In its integral form: $ Phi_E = integral.surf_S bold(E) dot dif bold(A) = Q/epsilon_0 $ ] = Carnot's cycle efficency example #sourcecode( ```typ #showybox( title-style: ( weight: 900, color: red.darken(40%), sep-thickness: 0pt, align: center ), frame: ( title-color: red.lighten(80%), border-color: red.darken(40%), thickness: (left: 1pt), radius: 0pt ), title: "Carnot cycle's efficency" )[ Inside a Carnot cycle, the efficiency $eta$ is defined to be: $ eta = W/Q_H = frac(Q_H + Q_C, Q_H) = 1 - T_C/T_H $ ] ``` ) #showybox( title-style: ( weight: 900, color: red.darken(40%), sep-thickness: 0pt, align: center ), frame: ( title-color: red.lighten(80%), border-color: red.darken(40%), thickness: (left: 1pt), radius: 0pt ), title: "Carnot cycle's efficency" )[ Inside a Carnot cycle, the efficiency $eta$ is defined to be: $ eta = W/Q_H = frac(Q_H + Q_C, Q_H) = 1 - T_C/T_H $ ] = Clairaut's theorem example #sourcecode( ```typ #showybox( title-style: ( boxed-style: ( anchor: ( x: center, y: horizon ), radius: (top-left: 10pt, bottom-right: 10pt, rest: 0pt), ) ), frame: ( title-color: green.darken(40%), body-color: green.lighten(80%), footer-color: green.lighten(60%), border-color: green.darken(60%), radius: (top-left: 10pt, bottom-right: 10pt, rest: 0pt) ), title: "Clairaut's theorem", footer: text(size: 10pt, weight: 600, emph("This will be useful every time you want to interchange partial derivatives in the future.")) )[ Let $f: A arrow RR$ with $A subset RR^n$ an open set such that its cross 
derivatives of any order exist and are continuous in $A$. Then for any point $(a_1, a_2, ..., a_n) in A$ it is true that $ frac(diff^n f, diff x_i ... diff x_j)(a_1, a_2, ..., a_n) = frac(diff^n f, diff x_j ... diff x_i)(a_1, a_2, ..., a_n) $ ] ``` ) #showybox( title-style: ( boxed-style: ( anchor: ( x: center, y: horizon ), radius: (top-left: 10pt, bottom-right: 10pt, rest: 0pt), ) ), frame: ( title-color: green.darken(40%), body-color: green.lighten(80%), footer-color: green.lighten(60%), border-color: green.darken(60%), radius: (top-left: 10pt, bottom-right: 10pt, rest: 0pt) ), title: "Clairaut's theorem", footer: text(size: 10pt, weight: 600, emph("This will be useful every time you want to interchange partial derivatives in the future.")) )[ Let $f: A arrow RR$ with $A subset RR^n$ an open set such that its cross derivatives of any order exist and are continuous in $A$. Then for any point $(a_1, a_2, ..., a_n) in A$ it is true that $ frac(diff^n f, diff x_i ... diff x_j)(a_1, a_2, ..., a_n) = frac(diff^n f, diff x_j ... diff x_i)(a_1, a_2, ..., a_n) $ ] = Divergence theorem example #sourcecode( ```typ #showybox( footer-style: ( sep-thickness: 0pt, align: right, color: black ), title: "Divergence theorem", footer: [ In the case of $n=3$, $V$ represents a volumne in three-dimensional space, and $diff V = S$ its surface ] )[ Suppose $V$ is a subset of $RR^n$ which is compact and has a piecewise smooth boundary $S$ (also indicated with $diff V = S$). 
If $bold(F)$ is a continuously differentiable vector field defined on a neighborhood of $V$, then: $ integral.triple_V (bold(nabla) dot bold(F)) dif V = integral.surf_S (bold(F) dot bold(hat(n))) dif S $ ]``` ) #showybox( footer-style: ( sep-thickness: 0pt, align: right, color: black ), title: "Divergence theorem", footer: [ In the case of $n=3$, $V$ represents a volumne in three-dimensional space, and $diff V = S$ its surface ] )[ Suppose $V$ is a subset of $RR^n$ which is compact and has a piecewise smooth boundary $S$ (also indicated with $diff V = S$). If $bold(F)$ is a continuously differentiable vector field defined on a neighborhood of $V$, then: $ integral.triple_V (bold(nabla) dot bold(F)) dif V = integral.surf_S (bold(F) dot bold(hat(n))) dif S $ ] = Coulomb's law example #sourcecode( ```typ #showybox( shadow: ( color: yellow.lighten(55%), offset: 3pt ), frame: ( title-color: red.darken(30%), border-color: red.darken(30%), body-color: red.lighten(80%) ), title: "Coulomb's law" )[ Coulomb's law in vector form states that the electrostatic force $bold(F)$ experienced by a charge $q_1$ at position $bold(r)$ in the vecinity of another charge $q_2$ at position $bold(r')$, in a vacuum is equal to $ bold(F) = frac(q_1 q_2, 4 pi epsilon_0) frac(bold(r) - bold(r'), bar.v bold(r) - bold(r') bar.v^3) $ ]``` ) #showybox( shadow: ( color: yellow.lighten(55%), offset: 3pt ), frame: ( title-color: red.darken(30%), border-color: red.darken(30%), body-color: red.lighten(80%) ), title: "Coulomb's law" )[ Coulomb's law in vector form states that the electrostatic force $bold(F)$ experienced by a charge $q_1$ at position $bold(r)$ in the vecinity of another charge $q_2$ at position $bold(r')$, in a vacuum is equal to $ bold(F) = frac(q_1 q_2, 4 pi epsilon_0) frac(bold(r) - bold(r'), bar.v bold(r) - bold(r') bar.v^3) $ ] = Newton's second law example #sourcecode( ```typ #block( height: 4.5cm, inset: 1em, fill: luma(250), stroke: luma(200), breakable: false, columns(2)[ 
#showybox( title-style: ( boxed-style: ( anchor: (x: center, y: horizon) ) ), breakable: true, width: 90%, align: center, title: "Newton's second law" )[ If a body of mass $m$ experiences an acceleration $bold(a)$ due to a net force $sum bold(F)$, this acceleration is related to the mass and force by the following equation: $ bold(a) = frac(sum bold(F), m) $ ] ] )``` ) #block( height: 4.5cm, inset: 1em, fill: luma(250), stroke: luma(200), breakable: false, columns(2)[ #showybox( title-style: ( boxed-style: ( anchor: (x: center, y: horizon) ) ), breakable: true, width: 90%, align: center, title: "Newton's second law" )[ If a body of mass $m$ experiences an acceleration $bold(a)$ due to a net force $sum bold(F)$, this acceleration is related to the mass and force by the following equation: $ bold(a) = frac(sum bold(F), m) $ ] ] ) = Encapsulation example #sourcecode( ```typ #showybox( title: "Parent container", lorem(10), columns(2)[ #showybox( title-style: (boxed-style: (:)), title: "Child 1", lorem(10) ) #colbreak() #showybox( title-style: (boxed-style: (:)), title: "Child 2", lorem(10) ) ] )``` ) #showybox( title: "Parent container", lorem(10), columns(2)[ #showybox( title-style: (boxed-style: (:)), title: "Child 1", lorem(10) ) #colbreak() #showybox( title-style: (boxed-style: (:)), title: "Child 2", lorem(10) ) ] )
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/compiler/recursion.typ
typst
Apache License 2.0
// Test recursive function calls.
// Ref: false

---
// Test with named function.
#let fib(n) = {
  if n <= 2 { 1 }
  else { fib(n - 1) + fib(n - 2) }
}

#test(fib(10), 55)

---
// Test with unnamed function.
// Error: 17-18 unknown variable: f
#let f = (n) => f(n - 1)
#f(10)

---
// Test capturing with named function.
#let f = 10
#let f() = f
#test(type(f()), function)

---
// Test capturing with unnamed function.
#let f = 10
#let f = () => f
#test(type(f()), int)

---
// Error: 15-21 maximum function call depth exceeded
#let rec(n) = rec(n) + 1
#rec(1)

---
#let f(x) = "hello"
#let f(x) = if x != none { f(none) } else { "world" }
#test(f(1), "world")
https://github.com/RaphGL/ElectronicsFromBasics
https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap8/2_voltmeter_design.typ
typst
Other
#import "../../core/core.typ"

=== Voltmeter design

As was stated earlier, most meter movements are sensitive devices. Some D'Arsonval movements have full-scale deflection current ratings as little as 50 µA, with an (internal) wire resistance of less than 1000 Ω. This makes for a voltmeter with a full-scale rating of only 50 millivolts (50 µA X 1000 Ω)! In order to build voltmeters with practical (higher voltage) scales from such sensitive movements, we need to find some way to reduce the measured quantity of voltage down to a level the movement can handle.

Let's start our example problems with a D'Arsonval meter movement having a full-scale deflection rating of 1 mA and a coil resistance of 500 Ω:

#image("static/00150.png")

Using Ohm's Law ($E=I R$), we can determine how much voltage will drive this meter movement directly to full scale:

$
E &= I R \
E &= 1"mA" times 500 Omega \
E &= 0.5 "volts" \
$

If all we wanted was a meter that could measure 1/2 of a volt, the bare meter movement we have here would suffice. But to measure greater levels of voltage, something more is needed. To get an effective voltmeter range in excess of 1/2 volt, we'll need to design a circuit allowing only a precise proportion of measured voltage to drop across the meter movement. This will extend the meter movement's range to higher voltages. Correspondingly, we will need to re-label the scale on the meter face to indicate its new measurement range with this proportioning circuit connected.

But how do we create the necessary proportioning circuit? Well, if our intention is to allow this meter movement to measure a greater #emph[voltage] than it does now, what we need is a #emph[voltage divider] circuit to proportion the total measured voltage into a lesser fraction across the meter movement's connection points.
Knowing that voltage divider circuits are built from #emph[series] resistances, we'll connect a resistor in series with the meter movement (using the movement's own internal resistance as the second resistance in the divider):

#image("static/00151.png")

The series resistor is called a "multiplier" resistor because it #emph[multiplies] the working range of the meter movement as it proportionately divides the measured voltage across it. Determining the required multiplier resistance value is an easy task if you're familiar with series circuit analysis.

For example, let's determine the necessary multiplier value to make this 1 mA, 500 Ω movement read exactly full-scale at an applied voltage of 10 volts. To do this, we first need to set up an E/I/R table for the two series components:

#image("static/10151.png")

Knowing that the movement will be at full-scale with 1 mA of current going through it, and that we want this to happen at an applied (total series circuit) voltage of 10 volts, we can fill in the table as such:

#image("static/10152.png")

There are a couple of ways to determine the resistance value of the multiplier. One way is to determine total circuit resistance using Ohm's Law in the "total" column ($R=E/I$), then subtract the 500 Ω of the movement to arrive at the value for the multiplier:

#image("static/10153.png")

Another way to figure the same value of resistance would be to determine voltage drop across the movement at full-scale deflection ($E=I R$), then subtract that voltage drop from the total to arrive at the voltage across the multiplier resistor. Finally, Ohm's Law could be used again to determine resistance ($R=E/I$) for the multiplier:

#image("static/10154.png")

Either way provides the same answer ($9.5 k Omega$), and one method could be used as verification for the other, to check accuracy of work.
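The multiplier calculation just described is easy to check numerically. The following Python sketch (ours, for illustration; the function name is not from the book) computes the multiplier resistance from the desired full-scale voltage, the movement's full-scale current, and the movement's coil resistance:

```python
# Illustrative sketch: multiplier resistance for a series voltmeter.
# Total circuit resistance follows from Ohm's Law (R = E / I); the
# multiplier is whatever remains after the movement's own resistance.

def multiplier_resistance(v_full_scale, i_full_scale, r_movement):
    """Series multiplier needed so the movement hits full scale at v_full_scale."""
    r_total = v_full_scale / i_full_scale  # Ohm's Law: R = E / I
    return r_total - r_movement

# 10 V range with the 1 mA, 500 ohm movement from the example
print(round(multiplier_resistance(10, 1e-3, 500)))  # 9500
```

For the example movement and a 10 volt range, this reproduces the same 9.5 kΩ answer obtained from the E/I/R table.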
#image("static/00152.png")

With exactly 10 volts applied between the meter test leads (from some battery or precision power supply), there will be exactly 1 mA of current through the meter movement, as restricted by the "multiplier" resistor and the movement's own internal resistance. Exactly 1/2 volt will be dropped across the resistance of the movement's wire coil, and the needle will be pointing precisely at full-scale. Having re-labeled the scale to read from 0 to 10 V (instead of 0 to 1 mA), anyone viewing the scale will interpret its indication as ten volts. Please take note that the meter user does not have to be aware at all that the movement itself is actually measuring just a fraction of that ten volts from the external source. All that matters to the user is that the circuit as a whole functions to accurately display the total, applied voltage.

This is how practical electrical meters are designed and used: a sensitive meter movement is built to operate with as little voltage and current as possible for maximum sensitivity, then it is "fooled" by some sort of divider circuit built of precision resistors so that it indicates full-scale when a much larger voltage or current is impressed on the circuit as a whole.

We have examined the design of a simple voltmeter here. Ammeters follow the same general rule, except that parallel-connected "shunt" resistors are used to create a #emph[current divider] circuit as opposed to the series-connected #emph[voltage divider] "multiplier" resistors used for voltmeter designs.

Generally, it is useful to have multiple ranges established for an electromechanical meter such as this, allowing it to read a broad range of voltages with a single movement mechanism. This is accomplished through the use of a multi-pole switch and several multiplier resistors, each one sized for a particular voltage range:

#image("static/00153.png")

The five-position switch makes contact with only one resistor at a time.
In the bottom (full clockwise) position, it makes contact with no resistor at all, providing an "off" setting. Each resistor is sized to provide a particular full-scale range for the voltmeter, all based on the particular rating of the meter movement (1 mA, 500 Ω). The end result is a voltmeter with four different full-scale ranges of measurement. Of course, in order to make this work sensibly, the meter movement's scale must be equipped with labels appropriate for each range.

With such a meter design, each resistor value is determined by the same technique, using a known total voltage, movement full-scale deflection rating, and movement resistance. For a voltmeter with ranges of 1 volt, 10 volts, 100 volts, and 1000 volts, the multiplier resistances would be as follows:

#image("static/00154.png")

Note the multiplier resistor values used for these ranges, and how odd they are. It is highly unlikely that a 999.5 kΩ precision resistor will ever be found in a parts bin, so voltmeter designers often opt for a variation of the above design which uses more common resistor values:

#image("static/00155.png")

With each successively higher voltage range, more multiplier resistors are pressed into service by the selector switch, making their series resistances add for the necessary total. For example, with the range selector switch set to the 1000 volt position, we need a total multiplier resistance value of 999.5 kΩ. With this meter design, that's exactly what we'll get:

$
R_"Total" &= R_4 + R_3 + R_2 + R_1 \
R_"Total" &= 900 k Omega + 90 k Omega + 9 k Omega + 500 Omega \
R_"Total" &= 999.5 k Omega \
$

The advantage, of course, is that the individual multiplier resistor values are more common (900k, 90k, 9k) than some of the odd values in the first design (999.5k, 99.5k, 9.5k). From the perspective of the meter user, however, there will be no discernible difference in function.
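The chain resistor values can be derived the same way for any set of ranges. This short Python sketch (ours, for illustration) computes each successive chain resistor as the difference between consecutive total circuit resistances:

```python
# Illustrative sketch: series "chain" resistors for a multi-range
# voltmeter, assuming the 1 mA, 500 ohm movement from the example.

def chain_resistors(ranges, i_full_scale, r_movement):
    """Return the chain resistors (R1, R2, ...) whose running sums give
    the required multiplier for each successive voltage range."""
    resistors = []
    previous_total = r_movement
    for v in sorted(ranges):
        r_total = v / i_full_scale  # total circuit resistance for this range
        resistors.append(r_total - previous_total)
        previous_total = r_total
    return resistors

values = chain_resistors([1, 10, 100, 1000], 1e-3, 500)
print([round(r) for r in values])  # [500, 9000, 90000, 900000]
```

For ranges of 1, 10, 100, and 1000 volts this reproduces the common values 500 Ω, 9 kΩ, 90 kΩ, and 900 kΩ, whose sum plus the movement resistance gives the 1 MΩ total needed on the 1000 volt range.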
#core.review[
  Extended voltmeter ranges are created for sensitive meter movements by adding series "multiplier" resistors to the movement circuit, providing a precise voltage division ratio.
]
https://github.com/AlyamanMas/QuranKindle
https://raw.githubusercontent.com/AlyamanMas/QuranKindle/master/template.typ
typst
// The project function defines how your document looks.
// It takes your content and some metadata and formats it.
// Go ahead and customize it to your liking!
#let project(title: "", authors: (), logo: none, body) = {
  // Set the document's basic properties.
  set document(author: authors, title: title)
  set page(
    width: 4.9in,
    height: 6.8in,
    numbering: "1",
    number-align: center,
    margin: 0em,
  )
  set text(font: "Amiri", lang: "ar", size: 1.3em)
  show table: set align(center)

  // Set paragraph spacing.
  //show par: set block(above: 1.2em, below: 1.2em)
  //set par(leading: 0.75em)

  // Title page.
  // The page can contain a logo if you pass one with `logo: "logo.png"`.
  //v(0.6fr)
  //if logo != none {
  //  align(right, image(logo, width: 26%))
  //}
  //v(9.6fr)
  //
  //text(2em, weight: 700, title)
  //
  //// Author information.
  //pad(
  //  top: 0.7em,
  //  right: 20%,
  //  grid(
  //    columns: (1fr,) * calc.min(3, authors.len()),
  //    gutter: 1em,
  //    ..authors.map(author => align(start, strong(author))),
  //  ),
  //)
  //v(2.4fr)
  //pagebreak()

  // Table of contents.
  //outline(depth: 3, indent: true)
  //pagebreak()

  // Main body.
  //set par(justify: true)
  body
}
https://github.com/michelebanfi/typst-terraform
https://raw.githubusercontent.com/michelebanfi/typst-terraform/main/Templates/Report/main.typ
typst
#import "template.typ" : *

#show: project.with(
  title: {{TITLE}},
  authors: (
    (name: "<NAME>", affiliation: ""),
    (name: "<NAME>", affiliation: ""),
  ),
  abstract: [],
  date: "",
)
https://github.com/mgoulao/IST-MSc-Thesis-Typst-Template
https://raw.githubusercontent.com/mgoulao/IST-MSc-Thesis-Typst-Template/main/section_1.typ
typst
= Introduction

#lorem(60)

== In this paper

#lorem(20)

#figure(
  image("images/neural-network.png", width: 40%),
  caption: [A simple neural network]
)

#lorem(20)

#figure(
  table(columns: (auto, auto),
    [Cell 1], [Cell 2],
    [Cell 3], [Cell 4],
  ),
  caption: [An example of a table]
) <table-1>

=== Contributions

#lorem(40)

#cite("LeCun2015DeepL")
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/layout/page-01.typ
typst
Other
// Just empty page with styles.
// Should result in one green-colored A11 page.
#page("a11", flipped: true, fill: green)[]
https://github.com/Functional-Bus-Description-Language/Specification
https://raw.githubusercontent.com/Functional-Bus-Description-Language/Specification/master/src/grouping.typ
typst
#pagebreak()

= Grouping <grouping>

Grouping is a feature of the FBDL used to inform a compiler that particular functionalities might be accessed together, and their register location must meet additional constraints. This is achieved using the group functionality described in @group.

The following functionalities can be grouped:

- config,
- group,
- irq,
- mask,
- param,
- return,
- static,
- status.

However, not all groupable functionalities can belong to the same group. For example, a config shall not be grouped with a param, as param is an inner functionality of a proc, and config instantiation within a proc is invalid.

#block(breakable: false)[
  The following snippet presents three grouped configs.
  #pad(left: 1em)[
    ```fbd
    Main bus
      Config_Group group
        type cfg_t; width = 8;
        A cfg_t
        B cfg_t
        C cfg_t
    ```
  ]
]

Any FBDL compliant compiler must place all three configs (`A`, `B`, `C`) in the same register.

== Single register group

A single register group is a group of functionalities that fit a single register. The overall width of all functionalities is not greater than the single register width. In such a case, all functionalities must be placed in the same register. The specification does not impose any specific order of the functionalities within the register, and it is left to the compiler implementation.

The following listing presents an example bus description with three single register groups.

#block(breakable: false)[
  #pad(left: 1em)[
    ```fbd
    Main bus
      Read_Write group
        C0 config; width = 16
        M0 mask; width = 15
      Mixed group
        C1 config; width = 16
        S11 static; width = 8
        S12 status; width = 8
      Read_Only group
        S21 status; width = 4
        S22 status; width = 7
    ```
  ]
]

All functionalities of the `Read_Write` group can be both read and written. The code generated by a compiler for the requester must provide means for reading/writing the whole group as well as for reading/writing particular functionalities of the group.
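The single register criterion can be illustrated with a short Python sketch. This is not part of the FBDL specification or of any compiler; a 32-bit register width is assumed here, matching the register diagrams in this chapter:

```python
# Illustrative sketch (not part of FBDL): a group fits a single register
# when the overall width of its functionalities does not exceed the
# register width (32 bits assumed, as in this chapter's diagrams).

def is_single_register_group(widths, register_width=32):
    """True when all functionalities fit one register of the given width."""
    return sum(widths) <= register_width

# Read_Write group: C0 (16) + M0 (15) = 31 bits -> single register group
print(is_single_register_group([16, 15]))          # True
# Multireg group below: four 10-bit functionalities = 40 bits -> multi register
print(is_single_register_group([10, 10, 10, 10]))  # False
```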
The `Mixed` group contains functionality that can be read and written (`C1`), as well as functionalities that can only be read (`S11`, `S12`). The code generated by a compiler for the requester must provide a means for reading all readable functionalities and writing all writable functionalities. It is valid even if the group has single readable or single writable functionality. The compiler must also generate means for reading/writing particular functionalities of the group. In the case of `Mixed` group this will result in two means doing exactly the same (writing the `C1` config). However, it is up to the user to decide which of the means should be used. If it makes sense, it is perfectly valid to use both of them in different contexts. All functionalities of the `Read_Only` group are read-only. In this case, the compiler must generate a mean only for reading the group. It must also generate means for reading particular functionalities. == Multi register group A multi register group is a group with functionalities that overall width is greater than the width of a single register. The specification does not impose any order of functionalities or registers in such cases, and it is left to the compiler implementation. However, the compiler must not split functionalities narrower or equal to the register width into multiple registers. This implies that any functionality with a width not greater than the register width is always read or written using single read or write access. The following snippet presents a bus description with one multi register group. #block(breakable: false)[ #pad(left: 1em)[ ```fbd Main bus Multireg group C config; width = 10 M mask; width = 10 SC static; width = 10 SS status; width = 10 ``` ] ] The compiler must generate code for the requester allowing to write all writable functionalities of the group as well as the code allowing reading all readable functionalities of the group. 
It must also generate means for reading or writing particular functionalities.

There are multiple ways to place functionalities from the above example into registers. The following snippet presents one possible way.

#block(breakable: false)[
  #pad(left: 1em)[
    ```
    Nth register                    Nth + 1 register
    -----------------------------   ----------------------
    || C | M | SC | 2 bits gap ||   || SS | 22 bits gap ||
    -----------------------------   ----------------------
    ```
  ]
]

However, the above arrangement might not be optimal if there is a need to read both `SC` and `SS` at the same time, as it would require reading two registers, not a single one. The below listing presents how to group elements within the group using subgroups.

#block(breakable: false)[
  #pad(left: 1em)[
    ```fbd
    Main bus
      User group
        RW group
          C config; width = 10
          M mask; width = 10
        Read_Only group
          SC static; width = 10
          SS status; width = 10
    ```
  ]
]

The set of possible functionalities placements within the registers is now limited as the `RW` and `Read_Only` subgroups are registerified before the `User` group. The below snippet shows a possible arrangement.

#block(breakable: false)[
  #pad(left: 1em)[
    ```
    Nth register                Nth + 1 register
    -------------------------   ---------------------------
    || C | M | 12 bits gap ||   || SC | SS | 12 bits gap ||
    -------------------------   ---------------------------
    ```
  ]
]
It must also provide a mean for reading an arbitrary number of elements for all readable functionalities starting from an arbitrary index.

The specification does not define what happens on access to the elements with an index greater than the length of some arrays. This is because some of the target languages support special data types indicating that the value is absent (for example, `None` - Python, `Option` - Rust), while others use for this purpose completely valid values (`0` - C, Go).

=== Single register array group

The single register array group is an array group with overall single elements width not greater than the width of a single register. The below listing presents an example bus description with a single register array group.

#block(breakable: false)[
  #pad(left: 1em)[
    ```fbd
    Main bus
      Single_Reg_Array group
        type cfg_t config; width = 8;
        A [1]cfg_t
        B [2]cfg_t
        C [3]cfg_t
        D [3]status; width = 8
    ```
  ]
]

In the case of a single register array group all elements with corresponding indices must be placed in the same register. Elements with consecutive indexes must be placed in consecutive registers. The below snippet presents a possible arrangement of elements for the example bus.

#block(breakable: false)[
  #pad(left: 1em)[
    ```
    Nth register
    -------------------------------
    || D[0] | C[0] | B[0] | A[0] ||
    -------------------------------

    Nth + 1 register
    -------------------------------------
    || D[1] | C[1] | B[1] | 8 bits gap ||
    -------------------------------------

    Nth + 2 register
    -------------------------------
    || D[2] | C[2] | 16 bits gap ||
    -------------------------------
    ```
  ]
]

=== Multi register array group

The multi register array group is an array group with overall single elements width greater than the width of a single register. The below listing presents an example bus description with a multi register array group.
#block(breakable: false)[
  #pad(left: 1em)[
    ```fbd
    Main bus
      Multi_Reg_Array group
        A [1]config; width = 16
        B [2]config; width = 12
        C [2]config; width = 12
    ```
  ]
]

In the case of multi register array group all elements with corresponding indices must be placed in consecutive registers. Also all elements with consecutive indexes must be placed in consecutive registers. Such a requirement guarantees that block access can always be used. The below snippet presents a possible arrangement of elements for the example bus.

#block(breakable: false)[
  #pad(left: 1em)[
    ```
    Nth register                     Nth + 1 register
    ------------------------------   ------------------------
    || C[0] | B[0] | 8 bits gap ||   || A[0] | 16 bits gap ||
    ------------------------------   ------------------------

    Nth + 2 register                 Nth + 3 register
    ------------------------------   ------------------------------
    || C[1] | B[1] | 8 bits gap ||   || C[2] | B[2] | 8 bits gap ||
    ------------------------------   ------------------------------
    ```
  ]
]

=== Mixed group

A mixed group is a group with both single functionalities and array functionalities. The below listing presents an example bus description with a mixed group.

#block(breakable: false)[
  #pad(left: 1em)[
    ```fbd
    Main bus
      Mixed group
        C config; width = 10
        M mask; width = 7
        S status; width = 8
        CA [3]config; width = 10
        SA [3]config; width = 12
    ```
  ]
]

In case of a mixed group, array functionalities shall be registerified as the first ones assuming a pure array group. Single functionalities shall be later placed in the gaps created during array registerification. If there are no gaps, or gaps are not wide enough, then all remaining single functionalities shall be registerified as a single register group or multi register group. If the gaps are wide enough to place single functionalities there, but for some reason it is not desired, then a subgroup can be defined to group single functionalities of the mixed group as the first ones.
The below snippet presents a possible arrangement of elements for the example bus. #block(breakable: false)[ #pad(left: 1em)[ ``` Nth register Nth + 1 register ----------------------- ------------------------------------ || CA[0] | SA[0] | C || || CA[1] | SA[1] | M | 3 bits gap || ----------------------- ------------------------------------ Nth + 2 register ------------------------------------ || CA[2] | SA[2] | S | 2 bits gap || ------------------------------------ ``` ] ] == Irq group The irq group is used for interrupt grouping. Grouped irqs have a common interrupt consumer signal. Irqs belonging to the same group might have different values of the producer trigger (in-trigger property), but all of them must have the same value for the consumer trigger (out-trigger property). In the case of level-triggered interrupt consumer the information on the interrupt source can be read from the interrupt group flag register. #block(breakable: false)[ The below snippet shows an example of an irq group for level-sensitive interrupt consumer. #pad(left: 1em)[ ```fbd Main bus IRQ group type irq_t irq; add-enable = true IRQ0 irq_t IRQ1 irq_t; clear = "On Read" IRQ2 irq_t; in-trigger = "Edge" IRQ3 irq_t; in-trigger = "Edge"; clear = "On Read" ``` ] ] The picture below presents a possible logical block diagram of the `IRQ` group with level trigger for the interrupt consumer and enable register. The `"Clear On Read"` signal is driven on every Flag Register read. The `"Explicit Clear"` signal must be driven when the requester calls a means for clearing given interrupt flags. Probably the easiest form of the `"Explicit Clear"` implementation is clear on Flag Register write, where the clear bit-mask is the value of the data bus. The Flag Register is to some extent a virtual register, as it has an address, but it does not have any storage elements. 
The flag is stored in the interrupt producer in case of a level-triggered producer or in the Edge Detector in case of an edge-triggered producer. #set align(center) #image("images/irq-group.png", width: 70%) #set align(left) == Param and return groups Param and return groups are used to group proc or stream parameters or returns. Such a kind of grouping may be necessary for performance optimizations, as the requester may store parameters or returns in a single list or in multiple distinct lists. Param and return groups help to avoid data reshuffling before or after the access. Param and return groups are independent. The below snippet presents a valid description with a single proc with one param and one return group. #block(breakable: false)[ #pad(left: 1em)[ ```fbd Main bus P proc Params group p1 param p2 param Returns group r1 return r2 return ``` ] ] Param and return groups may contain subgroups. The below snippet presents examples of two invalid and two valid params and returns grouping. #block(breakable: false)[ #pad(left: 1em)[ ```fbd Main bus # Param and return groups must have distinct names. Invalid1 proc G group p1 param p2 param G group r1 return r2 return # Param and return must not be placed in the same group. Invalid2 proc G group p param r return # Param and return groups might have subgroups. Valid1 Params group p1 param Subgroup group p2 param p3 param # Params and returns in different groups might have the same names. Valid2 Params group x param y param Returns group x return y return ``` ] ]
https://github.com/RolfBremer/in-dexter
https://raw.githubusercontent.com/RolfBremer/in-dexter/main/tests/RangeDelimiterTest.typ
typst
Apache License 2.0
#import "../in-dexter.typ": *

#set page("a6", flipped: true, numbering: "1")

_A Test with alternate range-delimiter._

#index(indexType: indexTypes.Start)[Entry1]

#pagebreak()
#pagebreak()

#index(indexType: indexTypes.End)[Entry1]

#pagebreak()

= Index

#columns(2)[
  #make-index(
    use-bang-grouping: true,
    use-page-counter: true,
    sort-order: upper,
    range-delimiter: [ to ]
  )
]

#columns(2)[
  #make-index(
    use-bang-grouping: true,
    use-page-counter: true,
    sort-order: upper,
    range-delimiter: [ --- ]
  )
]
https://github.com/remigerme/typst-polytechnique
https://raw.githubusercontent.com/remigerme/typst-polytechnique/main/README.md
markdown
MIT License
# Typst Polytechnique package

A Typst package for Polytechnique student reports.

For a short introduction to Typst features and detailed information about the package, check the [guide](https://github.com/remigerme/typst-polytechnique/blob/main/guide.pdf) (available from the repo only).

## Usage

If you want to use it locally, make sure you have the font "New Computer Modern Sans" installed.

Define the variables at the top of the template:

```typc
#let title = "Rapport de stage en entreprise sur plusieurs lignes automatiquement"
#let subtitle = "Un sous-titre pour expliquer ce titre"
#let logo = image("path/to/my-logo.png")
#let logo-horizontal = true
#let short-title = "Rapport de stage"
#let authors = ("<NAME>")
#let date-start = datetime(year: 2024, month: 06, day: 05)
#let date-end = datetime(year: 2024, month: 09, day: 05)
#let despair-mode = false

#set text(lang: "fr")
```

These variables will be used for the PDF metadata, the default cover page, and the default header.

## Contributing

Contributions are welcome! See the [contribution guidelines](CONTRIBUTING.md).

## Todo

- [ ] heading not at the end of a page: might be tricky
- [x] first line indent
- [ ] better spacing between elements
- [x] handle logos on cover page
- [x] ~~handle logos on header~~: feature canceled
https://github.com/elteammate/typst-compiler
https://raw.githubusercontent.com/elteammate/typst-compiler/main/src/ZZZ-TODO.typ
typst
#set list(marker: none)

= TODO

#let resolved = text(green, sym.checkmark)
#let low = text(yellow, sym.circle.filled)
#let med = text(orange, sym.circle.filled)
#let high = text(red, sym.circle.filled)

== Lexer
- #resolved Works ok-ish????

== Lexer Postprocessor
- #med Handle dot call on new line
- #low Handle `else` on new line
- #high Semicolon after function is added to content block after

== Parser
- #med Merge `member_access` and `call` into `method_call`

== Parser Postprocessor
- #low Add more checks to params, too boring though

== Typesystem
- #resolved Rewrite with LALR(1) parser
- #resolved Make function into parametrized type
- #resolved Special type for object
- #low Aliases
- #low Derive alias from constructors
- #high Utils for type comparison

== IR generator
- #resolved Strongly typed function call
- #low Optimize joining with empty content
- #med Implicit arithmetic conversions
- #med Determine call type
- #low Unvirtualize non-closure calls
- #med Capturing values in closure slots
- #med Capturing external functions and values
- #low Sink parameters
- #high lvalues
- #high arrays & tuples
- #low `array` times `int` and others
- #high dicts & objects
- #high A LOT MORE AST NODE TYPES
- #low Reset loop stack when entering the function
- #high Basic control flow graph analysis
- #low Warning in unreachable code
- #low Don't allocate stack variable for `none` typed args

== x86
- #high Reuse parameters on stack as local variables
- #high Replace `mul` with `imul`

== LLVM
Not started
https://github.com/mrwunderbar666/typst-apa7ish
https://raw.githubusercontent.com/mrwunderbar666/typst-apa7ish/main/template/main.typ
typst
MIT License
#import "@preview/apa7-ish:0.1.0": * // #import "../src/apa7ish.typ": * #show: conf.with( title: "A Title is all you need", subtitle: "The impact of a good title on a paper's citations count", documenttype: "Research Article", anonymous: false, authors: ( (name: "<NAME>", email: "<EMAIL>", affiliation: "University of Pennsylvania", postal: "Department of Communication, University of Pennsylvania, Philadelphia, PA 19104", orcid: "0000-1111-1111-1111", corresponding: true), (name: "<NAME>", affiliation: "Google Brain", orcid: "0000-1111-1111-1111"), (name: "<NAME>", affiliation: "University of Pennsylvania", orcid: "0000-1111-1111-1111") ), abstract: [Title of a scientific paper is an important element that conveys the main message of the study to the readers. In this study, we investigate the impact of paper titles on citation count, and propose that the title alone has the highest impact on citation count. Using a dataset of 1000 scientific papers from various disciplines, we analyzed the correlation between the characteristics of paper titles, such as length, clarity, novelty, and citation count. Our results show that papers with clear, concise, and novel titles tend to receive more citations than those with longer or less informative titles. Moreover, we found that papers with creative and attention-grabbing titles tend to attract more readers and citations, which supports our hypothesis that the title alone has the highest impact on citation count. Our findings suggest that researchers should pay more attention to crafting effective titles that accurately and creatively summarize the main message of their research, as it can have a significant impact on the success and visibility of their work. ], date: "October 20, 2024", keywords: [content analysis, citation, bibliometrics], funding: [This research received funding by the Ministry of Silly Walks (Grant ID: 123456).] 
) = Introduction Title of a scientific paper is an important element @teixeira2015importance that conveys the main message of the study to the readers @hartley2019academic. In this study, we investigate the impact of paper titles on citation count, and propose that the title alone has the highest impact on citation count. Using a dataset of 1000 scientific papers from various disciplines, we analyzed the correlation between the characteristics of paper titles, such as length, clarity, novelty, and citation count @li2019correlation. Our results show that papers with clear, concise, and novel titles tend to receive more citations than those with longer or less informative titles @west2013role. Moreover, we found that papers with creative and attention-grabbing titles tend to attract more readers and citations, which supports our hypothesis that the title alone has the highest impact on citation count. Our findings suggest that researchers should pay more attention to crafting effective titles that accurately and creatively summarize the main message of their research, as it can have a significant impact on the success and visibility of their work. = Declaration of Interest Statement #label("declaration-of-interest-statement") The authors report there are no competing interests to declare. #pagebreak() #bibliography("example.bib")
https://github.com/dismint/docmint
https://raw.githubusercontent.com/dismint/docmint/main/security/smashing.typ
typst
#import "template.typ": *

#show: template.with(
  title: "Smashing the Stack in the 21st Century",
  subtitle: "6.566"
)

= Introduction

The stack stores the current state of the program as it runs, capturing things such as local variables, the return address, etc. Visualize the stack as exactly that, with the bottom being a high memory address where it starts, and additional frames growing upward toward lower addresses.

When `main` calls a `function`, the following generally happens:

+ The return address is pushed onto the stack.
+ Before setting the new `function` base pointer (`%rbp`), we store the old `main` base pointer on the stack.
+ Set the `function` base pointer to the new value.
+ Add all the local variables.

Note that the `%rsp` register stores the current location of the top of the stack (the topmost element).

= Smashing the Stack

Now consider the following short C program:

```C
#include <string.h>
#include <stdio.h>

void copy(char *str) {
    char buffer[16];
    strcpy(buffer, str);
    printf("%s\n", buffer);
}

int main (int argc, char **argv) {
    copy(argv[1]);
    return 0;
}
```

We can easily cause a *buffer overflow* by giving a `str` argument longer than 16 bytes. This means that we can inject data into the rest of the stack: other local variables, the stored `%rbp`, or, most dangerously, the return address.

Thus it becomes possible for us to reach parts of the code that may be unreachable in a regular run of the software. Imagine the developers have a prototype function that is not yet connected in any way to the actual codebase. An attack can smash the stack up to the return address, setting it to point elsewhere in the program, perhaps even to that sensitive function.

However, stack smashing can be even more dangerous. We've been focusing primarily on the smashing of the return address, but there is an additional layer of risk.
In particular, the first part of our stack smash can be some bytes that are actually a program the attacker has created, and then the subsequent return address smash can point back to the location of this code, meaning we can run completely foreign software with this type of attack.

#define(
  title: "Shell Code"
)[
  This injected code is often called *shell code*, because we usually want to open a shell where we can then run more commands.
]

This means we could potentially delete files, extract sensitive information, and in general gain complete control over the system the software is running on.

= Return to libc

What happens if the stack we're trying to smash is not executable? That is, what if it is no longer possible for us to inject our own code and run it on the stack? Then we have to borrow other parts of the existing code and patch them together for our use.

Suppose that we wish to `unlink` a file from the system. Without an executable stack, this will require two steps.

+ Find the `unlink` system call somewhere in the existing codebase. It may come from the source code or an imported library. We will jump to this (return to libc) in order to perform the desired action.
+ In order to use the `unlink` we previously found, we need to load arguments into the correct registers. However, since we can't inject code, this is something that must also be scraped together from the existing codebase.

The last part of this attack is to find a ROP (return-oriented programming) gadget that loads the correct argument registers from a place that we can smash with the stack.
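To make the payload layout concrete, here is a minimal sketch in Python of how an attacker might assemble an overflow string for the 16-byte `buffer` in `copy()` above. The return address used here is purely hypothetical; in a real attack it would be read from the target binary (e.g. with `objdump` or a debugger), and modern mitigations such as ASLR and stack canaries would also have to be dealt with.

```python
# Sketch: overflow payload for the 16-byte `buffer` in copy().
# The target address below is made up for illustration only.
import struct

BUF_SIZE = 16           # char buffer[16] in copy()
SAVED_RBP = 8           # the saved base pointer sits just past the buffer
FAKE_RET = 0x00401136   # hypothetical address of a gadget or hidden function

payload = b"A" * BUF_SIZE              # fill the buffer itself
payload += b"B" * SAVED_RBP            # clobber the saved %rbp with filler
payload += struct.pack("<Q", FAKE_RET)  # little-endian 64-bit return address

# The payload would be passed as argv[1], e.g.:
#   ./vulnerable "$(python3 exploit.py)"
print(len(payload))  # 32 bytes: 16 buffer + 8 saved rbp + 8 return address
```

The same `struct.pack` pattern extends naturally to a return-to-libc or ROP chain: instead of one forged return address, the tail of the payload becomes a sequence of gadget addresses and argument values.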
https://github.com/AU-Master-Thesis/thesis
https://raw.githubusercontent.com/AU-Master-Thesis/thesis/main/sections/4-results/study-2.typ
typst
MIT License
#import "../../lib/mod.typ": *
#pagebreak(weak: true)
== #study.H-2.full.n <s.r.study-2>

// This section presents all results pertaining to the first contribution, that is the MAGICS sim-
// ulator, along with it's capabilities to reproduce the results of the GBP planner[8]. Sections
// 5.3.1-5.3.4 present the results of the experiments conducted in scenarios S-1 to S-5 respec-
// tively. For a description of these scenarios see the previous section 5.2. Scenarios.

This section presents results related to the second contribution: studying the effects of internal and external iterations on the factorgraphs' joint optimization. @s.r.iteration-amount-plots presents the results of the _Iteration Amount_ experiment carried out in #boxed[*S-8*]. It is followed by @s.r.schedules, which discusses the results of the _Iteration Schedules_ experiment described in #boxed[*S-9*].

// #todo[talk about this section not as enhancement but as ablation study]
// #k[
//   Construct experiment that test the effect of varying the number of variables.
//   Especially for sharp corners like 90 degree turns in the junction.
// ]
// #pagebreak(weak: true)

=== Iteration Amount <s.r.iteration-amount-plots>

@f.iteration-amount-plots shows the results of the experiment across four different metrics. A clear pattern shared across all four is that a single internal iteration $M_I = 1$ combined with a single external iteration $M_E = 1$ is not enough to properly solve for the optimum across the factorgraphs. Another general trend is that as both $M_I$ and $M_E$ increase, each metric improves. For both the #acr("LDJ") and makespan metrics the 2D gradient is clearly visible, with both a high $M_I$ and a high $M_E$ needed to reach the optimum. For the _distance travelled_ metric, only a couple of iterations are needed to reach the optimum plateau. _Finished at difference_ shows a noisier histogram, with values fluctuating a lot more for $M_I < 10 and M_E < 10$.
All four metrics clearly show that a large $M_E$ when $M_I in [1,2]$ is severely detrimental.
// - exponentially more
// // performs significantly poorer than any of the other combinations.

As both $M_I$ and $M_E$ increase, each metric quickly converges toward an optimum plateau, from which no further increase appears to result in an improvement. Across all metrics, performance is generally suboptimal when $M_I = 1 and M_E > 1$ or $M_I > 1 and M_E = 1$, although for $M_I in {5, 13, 21} and M_E = 1$ or $M_I = 1 and M_E in {5, 13, 21}$ no drawback appears. \
// Both the distance travelled metric and finished at difference contains noticeable outliers at $M_I = 8 and M_E = 1$ and $M_I = 3 and M_E = 1$ respectively. Contrary to both the #acr("LDJ") and makespan
// // Although for the #acr("LDJ"), makespan and finished at difference, $5, 13, 21$
// // #line(length: 100%, stroke: 1em + red)
// // - that multiple iterative steps are required
// // - repeating pattern of ... slightly worse results
// // // best performance is achieved if both $M_I$ and $M_E$ a
// // // Across all metrics, lower values of external iterations ($M_E$) consistently lead to poorer performance.
// Increasing internal iterations ($M_I$) generally enhances performance, but optimal results are achieved with a combination of higher values for both $M_I$ and $M_E$.
// // A first observation is that the

#let cell(c) = box(rect(height: 0.7em, width: 0.7em, stroke: c, fill: c), baseline: 0.05em)
// #let cropping = -4mm
#let cropping = (
  left: -4mm,
  bottom: -11mm,
  x: -2mm,
  y: -7mm
)

#figure(
  grid(
    columns: (39%, 36%),
    align: center + horizon,
    // row-gutter: -0.5mm,
    block(
      clip: true,
      pad(
        bottom: cropping.bottom,
        x: cropping.x,
        y: cropping.y,
        image("../../figures/plots/iteration-amount-average-ldj.svg"),
      ),
    ),
    block(
      clip: true,
      pad(
        left: cropping.left,
        bottom: cropping.bottom,
        x: cropping.x,
        y: cropping.y,
        image("../../figures/plots/iteration-amount-average-makespan.svg"),
      ),
    ),
    block(
      clip: true,
      pad(
        x: cropping.x,
        y: cropping.y,
        image("../../figures/plots/iteration-amount-average-distance-travelled.svg"),
      ),
    ),
    block(
      clip: true,
      pad(
        left: cropping.left,
        x: cropping.x,
        y: cropping.y,
        image("../../figures/plots/iteration-amount-largest-time-differense.svg"),
      ),
    ),
  ),
  caption: [
    Results of varying internal iterations $M_I$ and external iterations $M_E$, split across four metrics: #acr("LDJ"), makespan, distance travelled, and largest time difference. Each metric is plotted as a 2D histogram with the $x$- and $y$-axes being $M_I$ and $M_E$ respectively. The gradient colors are chosen such that the orange end of the spectrum marks cells #cell(theme.peach) which performed poorly for that metric, while blue #cell(theme.lavender) indicates cells with good metric values.
  ]
) <f.iteration-amount-plots>

=== Iteration Schedules <s.r.schedules>

For the Schedules experiment, three of the considered metrics are measured. @f.schedules-experiment-ldj plots the #acr("LDJ") metric as a grouped histogram for each schedule. @f.schedules-experiment-makespan and #numref(<f.schedules-experiment-distance-travelled>) similarly show the _makespan_ and _distance travelled_ metrics respectively. The plots show that for all three metrics the choice of schedule does not have a significant effect.
When looking across all metrics and schedules, it is difficult to assess whether the number of external iterations $M_E$ has a beneficial effect. For the _Centered_, _Late as Possible_ and _Soon as Possible_ schedules in @f.schedules-experiment-ldj, increasing $M_E$ appears to result in an improvement. But for others, such as _Half Start, Half End_ and _Late as Possible_ in @f.schedules-experiment-distance-travelled, no clear relation is observable. _Half Start, Half End_ is the best performing schedule in all three metrics, albeit only slightly.

// interleave evenly
// soon as possible
// late as possible
// half at beginning, half at end
// centered

#let handles = (
  (
    label: [*Centered*],
    color: theme.mauve,
    alpha: 0%,
    space: 1em,
  ),
  (
    label: [*Half Start, Half End*],
    color: theme.maroon,
    alpha: 0%,
    space: 1em,
  ),
  (
    label: [*Interleave Evenly*],
    color: theme.yellow,
    alpha: 0%,
    space: 1em,
  ),
  (
    label: [*Late as Possible*],
    color: theme.teal,
    alpha: 0%,
    space: 1em,
  ),
  (
    label: [*Soon as Possible*],
    color: theme.lavender,
    alpha: 0%,
    space: 0em,
  ),
)

#let schedules-legend = legend(
  handles,
  direction: ttb,
  fill: theme.base
)

#figure(
  grid(
    columns: (3.5fr, 1fr),
    align: horizon,
    block(
      clip: true,
      pad(
        x: -3mm,
        y: -3mm,
        image("../../figures/plots/schedules-experiment-ldj.svg"),
      ),
    ),
    schedules-legend,
    // std-block(
    //   inset: 0pt,
    //   height: 34mm,
    //   align(
    //     horizon,
    //     schedules-legend
    //   )
    // ),
  ),
  caption: [
    #acr("LDJ") metric for each schedule. Each column in a schedule group represents a different value of external iterations $M_E$, sorted in ascending order. The left-most column is $M_E = 5$, and the right-most is $M_E = 25$. Atop each group is the average value of the five columns shown.
  ]
  // #acr("LDJ")]
) <f.schedules-experiment-ldj>

#figure(
  // image("../../figures/plots/schedules-experiment-makespan.svg"),
  grid(
    columns: 2,
    align: horizon,
    block(
      clip: true,
      pad(
        x: -3mm,
        y: -3mm,
        image("../../figures/plots/schedules-experiment-makespan.svg"),
      ),
    ),
    schedules-legend
  ),
  caption: [Makespan metric for each schedule. See @f.schedules-experiment-ldj for details about how to interpret the different elements of the plot. Each column in a schedule group represents a different value of external iterations $M_E$, sorted in ascending order. The left-most column is $M_E = 5$, and the right-most is $M_E = 25$. Atop each group is the average value of the five columns shown.]
) <f.schedules-experiment-makespan>

#figure(
  // image("../../figures/plots/schedules-experiment-distance-travelled.svg"),
  grid(
    columns: 2,
    align: horizon,
    block(
      clip: true,
      pad(
        x: -3mm,
        y: -3mm,
        image("../../figures/plots/schedules-experiment-distance-travelled.svg"),
      ),
    ),
    schedules-legend
  ),
  caption: [Distance travelled metric for each schedule. See @f.schedules-experiment-ldj for details about how to interpret the different elements of the plot. Each column in a schedule group represents a different value of external iterations $M_E$, sorted in ascending order. The left-most column is $M_E = 5$, and the right-most is $M_E = 25$. Atop each group is the average value of the five columns shown.]
) <f.schedules-experiment-distance-travelled>
https://github.com/gongke6642/tuling
https://raw.githubusercontent.com/gongke6642/tuling/main/布局/pad/pad.typ
typst
#image("屏幕截图 2024-04-16 162108.png") #image("屏幕截图 2024-04-16 162318.png") #image("屏幕截图 2024-04-16 162347.png")
https://github.com/RiccardoTonioloDev/Bachelor-Thesis
https://raw.githubusercontent.com/RiccardoTonioloDev/Bachelor-Thesis/main/chapters/pyxinet.typ
typst
Other
#pagebreak(to: "odd")
#import "../config/functions.typ": *

= PyXiNet <ch:pyxinet>

PyXiNet (*Py*\ramidal *Xi* *Net*\work) is a family of models that attempts to combine the different solutions discussed so far, in order to maximize effectiveness and efficiency on the @MDE task. As the name suggests, the network is strongly based on both #link(<ch:pydnet>)[PyDNet] and #link(<ch:xinet>)[XiNet]: the former, in particular its second version given its excellent baseline _performance_, sets the direction for the overall architectural style, while the latter is used to try to improve the network's @encoder. This chapter therefore presents the experimental, exploratory approach taken in testing hypotheses and progressively improving and refining the proposed architectures.

== PyXiNet *$alpha$*

PyXiNet $alpha$ represents the first attempt at using XiNet as the @encoder. Two models were built, named $alpha" I"$ and $alpha" II"$.

#block([
The resulting architectures are the following:
#figure(image("../images/architectures/PyXiNet-a1.drawio.png",width:350pt),caption: [Architecture of PyXiNet $alpha" I"$])
],breakable: false,width: 100%)

#figure(image("../images/architectures/PyXiNet-a2.drawio.png",width:350pt),caption: [Architecture of PyXiNet $alpha" II"$])

Note that each XiNet is followed by a transposed convolution. As already discussed in the #link(<ch:xinet>)[XiNet] chapter, this network reduces the spatial dimensionality (height and width) once at the beginning, and then once for each pair of blocks inside the network. The transposed convolution ensures that the spatial dimensionality matches what the original PDV1 @encoder would have produced. As in PDV1, the transposed convolution is followed by a _ReLU_ activation function with a slope of 0.2 for the negative part.
#block([
Since they were trained together, each model was meant to answer different questions:
- Does XiNet, when used as an @encoder, lead to good results?
],breakable: false,width: 100%)
- With the same number of pyramid levels, can performance match or exceed that of PDV2?
- If one level is removed from the pyramid, how is performance affected?
- How does the use of XiNet impact the number of parameters and the inference time?
  - Note that to answer this question, compared to the previous evaluation tables, two new evaluation metrics were introduced: the number of parameters ($\#$p) and the inference time in seconds (Inf. (s)), computed by averaging the inference time over 10 images fed to the model sequentially rather than in batches, on an Intel i7-7700 _CPU_.

#block([
Training the two models produced the following results:
#ext_eval_table(
  (
    (name: [PDV1], vals: (1971624.0,0.15,0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
    (name: [PDV2], vals: (716680.0,0.10,0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
    (name: [PyXiNet $alpha" I"$],vals:(429661.0,0.14,0.17,1.632,6.412,0.269,0.757,0.903,0.958)),
    (name: [PyXiNet $alpha" II"$],vals:(709885.0,0.12,0.168,1.684,6.243,0.259,0.777,0.913,0.960))
  ),
  2,
  [PDV1 and PDV2 vs. PyXiNet $alpha$],
)
],breakable: false,width: 100%)

As can be seen, although the number of parameters is around that of PDV2, if not lower, all the other metrics got considerably worse, suggesting that this is perhaps not the best way to use XiNet as an @encoder.

== PyXiNet *$beta$*

Given the failure of the $alpha$ family, the $beta$ family tries to make better use of XiNet, with the goal of improving at least one evaluation metric. As noted in the XiNet chapter, the networks proposed in the paper consist of at least five XiConv blocks.
This suggests that depth is an essential ingredient for using it successfully. The $beta$ family consists of four variants, two with three levels and two with four levels, all with a slight variation on the traditional pyramid architecture, in order to answer the following questions:
- Does depth help XiNet improve the model's _performance_?
- Is XiNet an effective @encoder?
- Can running the @encoder stages in parallel make inference faster than having them in series?

#block([
The proposed architectures are therefore the following:
#figure(image("../images/architectures/PyXiNet-b1.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta" I"$])
],breakable: false,width: 100%)

#figure(image("../images/architectures/PyXiNet-b2.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta" II"$])

#figure(image("../images/architectures/PyXiNet-b3.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta" III"$])

#figure(image("../images/architectures/PyXiNet-b4.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta" IV"$])

#block([
As can be seen, all architectures in the $beta$ family have their @encoder stages in parallel. This was done for two reasons:
+ To see whether parallelizing the neural network graph improves the inference time, by being able to run the various levels in parallel (instead of making each level wait for the levels above it);
+ To allow the XiNet-based @encoder stages to become deeper. As explained in the #link(<ch:xinet>)[XiNet] chapter, each successive pair of XiConv blocks halves the spatial dimensionality (height and width). Since each level no longer depends on the previous one, we can exploit this property so that each new @encoder takes care of rescaling the resolution of its own level.
],breakable: false,width: 100%)

The $beta" I"$ model takes inspiration from $alpha" I"$, but simply parallelizes the @encoder stages so as to allow longer XiNets afterwards. Compared to $beta" I"$, the $beta" II"$ model checks whether a possible problem could be the use of the transposed convolution as the way to adjust the tensor resolution. To verify this, its XiNets are one block shorter than in $beta" I"$, and to obtain the same number of channels as before, a convolution with a $1times 1$ @kernel is used.

The $beta" III"$ and $beta" IV"$ models simply add one more level to the pyramid with respect to $beta" I"$ and $beta" II"$, to check whether, with the same number of levels, performance similar to or better than that of PDV2 can be achieved. The two networks, however, use different approaches to add the new level:
- $beta" III"$ adds a traditional PDV1 @encoder placed in series with the @encoder of the level above, in order to save on the number of parameters (since a XiNet six XiConv blocks deep significantly increases the size of the network);
- $beta" IV"$ instead adds a further XiNet, one block longer than that of its previous level, followed by a convolution with a $1 times 1$ @kernel to resize the channels.
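As a side note on the inference-time metric Inf. (s) reported in the evaluation tables (the mean over 10 images passed to the model sequentially, not in batches, on CPU), a minimal measurement sketch could look like the following. Here `model` and the dummy input are placeholders, not the thesis' actual benchmarking code.

```python
# Sketch of the "Inf. (s)" metric: average single-image inference time
# over 10 sequential forward passes. `model` is a stand-in for any of
# the networks discussed in this chapter.
import time

def mean_inference_time(model, images):
    times = []
    for img in images:
        start = time.perf_counter()
        model(img)                    # one forward pass, batch size 1
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

if __name__ == "__main__":
    dummy = lambda img: sum(img)      # placeholder for a network forward pass
    imgs = [[0.0] * 1000 for _ in range(10)]
    print(f"{mean_inference_time(dummy, imgs):.6f} s")
```

Timing each image separately, rather than one batched call, matches the embedded use case where frames arrive one at a time.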
#block([
With the architectures discussed above I obtained the following results:
#ext_eval_table(
  (
    (name: [PDV1], vals: (1971624.0,0.15,0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
    (name: [PDV2], vals: (716680.0,0.10,0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
    (name: [PyXiNet $beta" I"$],vals:(941638.0,0.16,0.156,1.546,6.259,0.251,0.791,0.921,0.965)),
    (name: [PyXiNet $beta" II"$],vals:(481654.0,0.14,0.168,1.558,6.327,0.259,0.762,0.910,0.963)),
    (name: [PyXiNet $beta" III"$],vals:(1246422.0,0.16,0.148,1.442,6.093,0.241,0.803,0.926,0.967)),
    (name: [PyXiNet $beta" IV"$],vals:(1446014.0,0.18,0.146,1.433,6.161,0.241,0.802,0.926,0.967)),
  ),
  2,
  [PDV1 and PDV2 vs. PyXiNet $beta$],
)
],breakable: false,width: 100%)

In this case the results improve on all metrics for the $beta" III"$ and $beta" IV"$ models, except for the inference time and the number of parameters. Note, however, that although the number of parameters of $beta" III"$ is not lower than that of PDV2, it is about $tilde 37%$ lower than that of PDV1, and the inference times of these two models are very similar. This is therefore a good starting point for applying attention mechanisms capable of further improving the model's performance.

#block([
== PyXiNet *$MM$*

Aware that _self attention_ is quite expensive in terms of time, and therefore not a viable option in contexts where the inference time must be short and the computational power limited, the $MM$ family simply tries out this attention mechanism to verify what added value it can bring in improving the evaluation metrics, without any claim of being applicable as a solution for the _embedded_ @MDE use case.
],breakable: false,width: 100%)

In an attempt to implement a _self attention_ with a lower computational impact than the standard attention module presented in @sa and covered in @sa:ch, I built a block called the _Light Self Attention Module_ (LSAM) which, compared to the _Self Attention Module_ (SAM) discussed earlier, applies _area_ interpolation to resize the resolution of the input tensor. Two versions were built, in order to understand where to apply the _area_ interpolation so as to obtain the best results.

#block([
The architectures of the two _LSAM_ versions are the following:
#figure(image("../images/architectures/Attention-lsam1.drawio.png",width:300pt),caption: [Architecture of _LSAM V1_])
],breakable: false,width: 100%)

#figure(image("../images/architectures/Attention-lsam2.drawio.png",width:300pt),caption: [Architecture of _LSAM V2_])

In the first case the attention is applied in the reduced-resolution setting; in the second case the aim is to exploit the greater amount of information present in the input, applying the attention in the full-resolution setting. Tensor normalization was then added, as recommended in @layernorm.

Model $beta" IV"$ was chosen as the starting point for the $MM$ family, precisely to see how much its inference time degrades and how much its evaluation metrics improve.
#block([
The architectures of the $MM$ family are the following:
#figure(image("../images/architectures/PyXiNet-m1.drawio.png",width:350pt),caption: [Architecture of PyXiNet $MM" I"$])
],breakable: false,width: 100%)

#figure(image("../images/architectures/PyXiNet-m2.drawio.png",width:350pt),caption: [Architecture of PyXiNet $MM" II"$])

#figure(image("../images/architectures/PyXiNet-m3.drawio.png",width:350pt),caption: [Architecture of PyXiNet $MM" III"$])

#figure(image("../images/architectures/PyXiNet-m4.drawio.png",width:350pt),caption: [Architecture of PyXiNet $MM" IV"$])

As can be observed, the attention block was placed in the @encoder of the first level, since this is the level operating at the highest resolution, and therefore on the information closest to the model's original input. Both _LSAM_ versions were placed both before and after the concatenation, in order to find the better of the two positions.

#block([
The results obtained for the $MM$ family are reported below:
#ext_eval_table(
  (
    (name: [PDV1], vals: (1971624.0,0.15,0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
    (name: [PDV2], vals: (716680.0,0.10,0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
    (name: [PyXiNet $beta" IV"$],vals:(1446014.0,0.18,0.146,1.433,6.161,0.241,0.802,0.926,0.967)),
    (name: [PyXiNet $MM" I"$],vals:(1970643.0,0.36,0.147,1.351,5.98,0.244,0.8,0.926,0.967)),
    (name: [PyXiNet $MM" II"$],vals:(2233197.0,0.38,0.14,1.289,5.771,0.234,0.814,0.933,0.969)),
    (name: [PyXiNet $MM" III"$],vals:(1708499.0,0.35,0.141,1.279,5.851,0.239,0.808,0.927,0.968)),
    (name: [PyXiNet $MM" IV"$],vals:(1839981.0,0.36,0.145,1.25,5.885,0.242,0.798,0.926,0.967)),
  ),
  3,
  [PDV1, PDV2 and PyXiNet $beta" IV"$ vs. PyXiNet $MM$],
)
],breakable: false,width: 100%)

As the results show, the _performance_ of the whole $MM$ family significantly exceeds that of $beta" IV"$.
One can also observe, however, that the inference time has at least doubled with respect to $beta" IV"$ for all models. The results also show that only in the $MM" I"$ - $MM" II"$ pair did placing the _LSAM_ block after the concatenation lead to better results; in the $MM" III"$ - $MM" IV"$ pair the exact opposite happened.

#block([
== PyXiNet *$beta$*CBAM

As discussed in the _#link(<ch:attention>)[Attention]_ chapter in @cbam:ch, one of the main advantages of the CBAM block is its efficient use of resources while still guaranteeing a good _performance_ improvement.
],breakable: false,width: 100%)

We know, however, that as reported in @CBAM, it is particularly well suited to implementation within _ResNet_-style networks. This information guided the design of a new @decoder, called, fittingly, the _CBAM Decoder_ (CBAMD). The @decoder was chosen as the place to apply the block multiple times, since it mirrors the architecture of a bottleneck convolutional neural network.

#block([
The architecture of the new @decoder is as follows:
#figure(image("../images/architectures/Attention-cbamd.drawio.png",width:350pt),caption: [Architecture of the CBAMD @decoder])
],breakable: false,width: 100%)

Note the residual connection of the _input_ between two successive convolutions. As the name suggests, the $beta$CBAM family of models is based on the $beta$ models, in particular $beta" III"$ and $beta" IV"$, which were the models with the best results obtained (not counting the results obtained by applying _self attention_).
#block([
The architectures of the two models resulting from the choices just discussed are the following:
#figure(image("../images/architectures/PyXiNet-bcbam1.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta$CBAM I])
],breakable: false,width: 100%)

#figure(image("../images/architectures/PyXiNet-bcbam2.drawio.png",width:350pt),caption: [Architecture of PyXiNet $beta$CBAM II])

These models were created to verify the actual benefit this attention module can bring, relative to how much it degrades the inference time and increases the total number of parameters. Note that the CBAMD block was used as the @decoder of the first level only. This choice was made so as not to increase the parameters and the inference time too much, concentrating the block only on the highest resolution (the resolution that will actually be used in a real-world context).

#block([
The results of these two models are reported in the following table:
#ext_eval_table(
  (
    (name: [PDV1], vals: (1971624.0,0.15,0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
    (name: [PDV2], vals: (716680.0,0.10,0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
    (name: [PyXiNet $beta"CBAM I"$],vals:(1250797.0,0.19,0.143,1.296,5.91,0.239,0.805,0.928,0.968)),
    (name: [PyXiNet $beta"CBAM II"$],vals:(1450389.0,0.23,0.147,1.379,5.974,0.239,0.806,0.927,0.968)),
  ),
  2,
  [PDV1 and PDV2 vs. PyXiNet $beta$CBAM],
  res: 102pt
)
],breakable: false,width: 100%)

As can be seen, the results of this family of models are very promising: they considerably improve the _performance_ of the starting model without impacting the inference time too heavily. The better of the two models ($beta"CBAM I"$) even manages to stay under 0.20 s of average inference time (allowing a _framerate_ of about 5 _fps_), with a number of parameters about $tilde 37%$ lower than that of PDV1.
An interesting observation is that, compared to their counterparts in the $beta$ family, adding the various CBAM blocks contributed at most a $tilde 0.4%$ increase in the number of parameters.

=== _CBAM PyDNet_

Given the considerable progress the CBAM block enables, it is natural to question the usefulness of the XiNet-based @encoder. To verify this hypothesis, the original PDV2 architecture was taken, in order to check whether placing a CBAM block before each convolution of the network, with residuals passed from the _output_ of the previous convolution, could yield interesting _performance_.

#block([
The @decoder therefore remains the CBAMD block, now used on all levels, while the @encoder (_CBAME_) is as follows:
#figure(image("../images/architectures/Attention-cbame.drawio.png",width:250pt),caption: [Architecture of the _CBAME_ @encoder])
],breakable: false,width: 100%)

#block([
The architecture of the resulting model is therefore the following:
#figure(image("../images/architectures/PyDNet-m.drawio.png",width:350pt),caption: [Architecture of _CBAM PyDNet_])
],breakable: false,width: 100%)

#block([
The results of this model are reported below:
#ext_eval_table(
  (
    (name: [PDV1], vals: (1971624.0,0.15,0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
    (name: [PDV2], vals: (716680.0,0.10,0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
    (name: [CBAM PyDNet],vals:(746673.0,0.28,0.167,1.722,6.509,0.251,0.776,0.916,0.965)),
  ),
  2,
  [PDV1 and PDV2 vs. _CBAM PyDNet_],
  res: 102pt
)
],breakable: false,width: 100%)

As can be seen, placing a CBAM block at each convolution only heavily degrades the model's performance on most evaluation metrics. It follows that the use of XiNet is certainly responsible for part of the improvement in the predictions of the models where it was used.
https://github.com/npikall/typst-templates
https://raw.githubusercontent.com/npikall/typst-templates/main/dev.typ
typst
// Preamble
// Import template
#import "templates/thesis-bui.typ": conf, maketitle

// Set variables
#let title = [Testing Typst Templates]
#let author = [<NAME>]
#let date = datetime.today().display("[day].[month].[year]")
#let thesis-type = [Bachelorthesis]
#let email = [<EMAIL>]
#let matrnr = [123456789]
#let abstract = [#lorem(60)]

// Set document metadata for the actual pdf
#set document(title: title, author: "J.G.")

// Using the configuration
#show: doc => conf(doc, language: "en")//, title:title)

//%%%%%%%%%%%%%%
// Main Document
//%%%%%%%%%%%%%%

// Make the title
#maketitle(
  title: title,
  author: author,
  date: date,
  language: "en",
  thesis-type: thesis-type,
  email: email,
  matrnr: matrnr,
  // logo: "../features/Logo.png",
)

// Table of Contents
//#outline(title: [Table of Contents], indent: auto)

// Set document from here on to have 2 columns
//#show: rest => columns(2, rest)

= Introduction
This file is used to test the Typst templates. Just import the templates at the top of this document and see what effects they have on the document.
#lorem(40)

= Methods
#lorem(70)

== Tables
#lorem(50)

#figure(
  table(
    columns: 3,
    align: (left, center, right),
    table.hline(),
    table.header([Substance], [Subcritical °C], [Supercritical °C],),
    table.hline(),
    [Hydrochloric Acid],[12.0], [92.1],
    [Sodium Myreth Sulfate],[16.6], [104],
    table.hline(start: 1),
    [Potassium Hydroxide], table.cell(colspan: 2)[24.7],
    table.hline(),
  ),
  caption: [This is the caption of a table])<tab:Tabelle>

See @tab:Tabelle
#lorem(50)

== Formulas
#lorem(50)

$ K_t = (1 - (R^2 dot tau)/(c_a + nu dot tan delta))^4 dot k_1 $

#lorem(50)

== Figures and Images
#lorem(50)

#figure(
  image("features/Yousuf-Karsh-Winston-Churchill-1941.jpg", width: 40%)
  , caption: [This is the caption of a figure]
)<fig:winston>

See @fig:winston
#lorem(50)

== Lists
#lorem(40)

Here is a very important list:
- Milk
- Spices
- Salt
- Pepper
- Black
- White

#lorem(50)

+ Tick
+ Trick
+ Track

#lorem(40)

== Functions
#lorem(50)

#for value in range(2,12, step:2) [
  - Band #value
]

== Code
#lorem(30)

#figure(
  [
    #align(center)[#line(length: 40%)]
    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 10, 100)
    y = np.sin(x)

    fig, ax = plt.subplots()
    ax.plot(x, y)
    plt.show()
    ```
    #align(center)[#line(length: 40%)]],
  caption: [This is the caption of a figure]
)<fig:code>

== Citations
#lorem(50)

=== Einstein
#lorem(20)@einstein

=== Dirac
#lorem(30)@dirac

=== Knuth
#lorem(30)@knuthwebsite

= Results
#lorem(200)

= Discussion
#lorem(120)

#outline(target: figure, title: [List of Figures])

#bibliography("features/ref.bib", title: [References])
https://github.com/han0126/MCM-test
https://raw.githubusercontent.com/han0126/MCM-test/main/2024亚太杯typst/chapter/chapter4.typ
typst
#import "../template/template.typ": * = 问题一的模型建立与求解 == 问题分析 通过全面而深入的定量分析,我们研究了二十个关键因素与洪水发生概率之间的关联性,这些因素包括季风强度、地形排水特性、河流管理措施、森林砍伐程度、城市化进程、气候变化趋势、大坝的质量状况、河流淤积现象、农业实践的影响、土地侵蚀程度、防灾措施的有效性、排水系统的效能、海岸线的脆弱性、滑坡风险、流域管理状况、基础设施的老化程度、人口分布及密度、湿地生态系统的健康状态、城乡规划的合理性以及政府政策导向。我们运用统计学工具,计算了这些因素与洪水发生概率之间的皮尔逊相关系数,这是一种衡量两个变量之间线性关系强度和方向的指标。通过这种方法,我们能够量化评估每一个因素对洪水风险的贡献度,确定哪些是主要的驱动因素,哪些影响较弱,从而揭示了它们在洪水生成机制中的相对重要性。 在数据分析的基础上,我们深入探究了各因素影响洪水概率的内在机理。例如,季风强度的增加可能会导致降雨量的显著上升,进而增加洪水发生的可能性;城市化的扩张往往伴随着绿地的减少,减少了水的自然渗透能力,增加了地表径流,从而加剧了洪水风险;而有效的河流管理和合理的流域规划,则可以降低洪水的概率。通过对这些因素的综合考量,我们得出了关于洪水风险的全新认识。 == 指标相关性判断 首先将$20$个指标分别设为随机变量$x_i(i=1,2,dots,20)$,设洪水发生概率为$Y$。随机变量之间的相关系数矩阵或者协方差矩阵可以反应随机变量之间的相关,如果相关系数越大,那么随机变量之间的相关性越强;或$c o v(x_i,Y),c o v(Y,x_i)$越趋近于$display(sqrt(D(x_i))sqrt(D(Y)))$,$x_i$与$Y$的相关性越强。 根据相关系数计算公式:($x_i, Y$为两个随机变量) $ R=display((c o v(x_i,Y))/(sqrt(D(x_i))sqrt(D(Y)))) $ 得到相关系数矩阵:$display(mat(1, R;R,1))$,通过观察随机变量$x$与$Y$的相关系数$R$的大小就能够判断两者的相关性,即第$i$项指标与洪水发生概率的相关性。其中相关系数$R$值越大,则表明第$i$项指标与洪水发生概率的相关性越强。根据协方差中$c o v(x_i,Y)$或$c o v(Y,x_i)$越趋近于$display(sqrt(D(x_i))sqrt(D(Y)))$,则随机变量$x_i$与$Y$的相关性越强。 现基于`SPSS`软件可分析得到各项指标之间的相关性: #img( image("../figures/1.jpg", width: 90%), caption: "相关矩阵热力图" ) 通过对表中详尽的数据进行细致分析,我们可以得出结论,各项指标之间的相关性表现得相当微弱。这意味着,在统计意义上,这些指标可以被视为几乎互不影响的独立变量。这种独立性表明,每一项指标的变化似乎并不直接影响其他指标的数值,它们各自遵循着自己的变化规律和模式,没有显示出明显的相互作用或连锁反应。这一发现对于理解洪水风险评估模型中的变量关系具有重要意义,它提示我们在构建预测模型时,可以将这些指标视为独立的输入因子,简化模型结构,提高计算效率。 然而,尽管各项指标彼此之间的直接联系较为松散,但它们与洪水发生概率之间的关联性却呈现出不同的面貌。通过计算各项指标与洪水概率之间的相关系数,我们发现某些指标与洪水的发生存在正相关或负相关的关系。例如,降雨量的增加可能与洪水概率呈正相关,而有效的排水系统则可能与洪水概率呈负相关。这些发现有助于我们识别出那些对洪水风险有显著影响的关键因素,为进一步的风险管理和减灾措施提供了科学依据。 #img( image("../figures/2.jpg", width: 75%), caption: "相关系数" ) 结合上述分析,基础设施恶化、季风强度、地形排水这三个因素对洪水概率影响相对最大;而海岸脆弱性、排水系统、侵蚀这三个因素对洪水概率影响相对最小。 == 合理建议与措施 针对基础设施恶化、季风强度、地形排水等问题,现给出一下建议和措施: 1. 加强基础设施建设和维护: #h(2em)定期检查和修复堤坝、水库、排水系统和其他水利工程设施。 升级老旧的排水系统,增加其容量和效率。 在关键区域建立防洪墙或提高现有防洪墙的高度和强度。 确保水闸和泵站的正常运行,以便在需要时迅速排放积水。 2. 
改善城市规划和管理: #h(2em)限制在洪水易发区域的开发建设,保持足够的绿地和开放空间以促进雨水渗透。 实施雨水收集和利用系统,减少对下水道系统的压力。 采用透水性材料铺设道路和人行道,增加地面的雨水吸收能力。 3. 生态保护和恢复: #h(2em)保护和恢复湿地、河流沿岸和上游森林,这些自然生态系统可以减缓洪水流速并吸收多余水分。 实施梯田耕作、植被覆盖等措施,减少山坡地的水土流失,降低下游洪水风险。 4. 地形改造: #h(2em)在地形允许的情况下,进行土地平整和地形调整,以改善排水条件。 在低洼地区建立蓄水池或人工湿地,用于暂时储存洪水,减轻主排水系统的压力。\ \
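The Pearson correlation computation described in this section, $R = "cov"(x_i, Y) \/ (sqrt(D(x_i)) sqrt(D(Y)))$, can be sketched in a few lines of Python. This is a minimal illustration using the population formula and made-up sample data, not the SPSS pipeline used in the paper.

```python
# Minimal sketch of the Pearson correlation R = cov(x, Y) / (sqrt(D(x)) * sqrt(D(Y)))
# described above. Pure Python; the sample data is made up for illustration.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    return cov / (sx * sy)

# A perfectly linear increasing relationship gives R = 1.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # → 1.0
```

In practice one would compute this for each of the twenty indicators against the flood-probability column and rank the absolute values, which is what the heat map in the figures summarizes.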
https://github.com/Shedward/dnd-charbook
https://raw.githubusercontent.com/Shedward/dnd-charbook/main/dnd/page/proficiencies.typ
typst
#import "../core/core.typ": * #let proficiencies(..proficiencies) = page( header: section[Tools Proficiencies] )[ #columned(separator: false)[ #let proficiencyBlock(proficiency) = [ #set text(hyphenate: false) #set text(top-edge: 0.5em) #abilityHeader(proficiency.title)\ #if proficiency.source != none [ #abilitySource(proficiency.source) ] #for item in proficiency.items [ - #item\ ] #for action in proficiency.actions [ #par[ #table( stroke: none, inset: 0pt, gutter: 0.75em, columns: (1fr, auto), [*#actionName(action.name)*], mapOrNone( action.dc, v => actionName[dc #v] ), ..arrayOrNone(action.body).map(v => table.cell(colspan: 2, v)) ) ] ] #for skillEffect in proficiency.skillsEffects [ #par[ *#skillEffect.skills.pos().join(", ")*: #skillEffect.body\ ] ] ] #for proficiency in proficiencies.pos() { if type(proficiency) == "content" { proficiency } else if type(proficiency) == "dictionary" { [ #proficiencyBlock(proficiency)\ ] } else { panic("Not supported proficiency type " + type(proficiency)) } } ] ]
https://github.com/exdevutem/taller-git
https://raw.githubusercontent.com/exdevutem/taller-git/main/src/first-steps.typ
typst
= Primeros pasos Dependiendo de tu terminal, y la configuración de este, es posible que recibas información de que estás en un directorio que corresponde a un repositorio de Github por algún mensaje en tu prompt. Para el caso mío, mi prompt me indica la siguiente información, visible en @repo - El usuario activo, en este caso 'rafael'. - El directorio en el que estoy, en este caso '.../taller-github'. - La rama de git en la que estoy, en este caso 'main'. - La hora al momento de crearse el prompt. Las 17:00 en este caso. #figure( image("../assets/first-steps/directory-list.png"), caption: [Lista de archivos de este repositorio.], ) <repo> La forma en que este prompt sabe que estamos en un repositorio de git, es por la existencia de una carpeta oculta llamada `.git`, visible como primer item en @repo, que contiene toda la información de los cambios hechos en este repositorio. Bajo este punto, el repositorio de esta guía no ha recibido ningún cambio, por lo que documentaré el proceso en que este primer cambio será hecho. Como referencia de la estructura de este directorio, se adjunta @tree #figure( image("../assets/first-steps/tree.png"), caption: [Estructura del repositorio que mantiene este documento.], ) <tree> Lo primero que haré será verificar el estado del directorio con el comando `git status`. Este comando me muestra todos los archivos creados, modificados o borrados del directorio *que aún no han sido guardados en un commit*. #figure( image("../assets/first-steps/status.png"), caption: [Output del comando status de git], ) <status> Para el caso de @status, ningún archivo ha sido agregado al repositorio, por lo que iniciaré agregando los archivos `.typ` con el comando `git add *.typ`, y ver como esto modifica el output de `git status`. El resultado puede verse en @status2. Comparando con @status, vemos que git reconoce los archivos `main.typ` y `template.typ` como archivos que van a ser cometidos (apareciendo como verde). 
Los archivos del directorio `../assets/first-steps` y el archivo `main.pdf` no han sido agregados al repositorio (apareciendo de color rojo).

#figure(
  image("../assets/first-steps/new status.png"),
  caption: [Output del comando status tras agregar los archivos `.typ`],
) <status2>

Comúnmente verás gente que, al momento de agregar archivos, ejecutará el comando `git add .`, o similarmente, `git add *`. Estos comandos agregan "todos" los archivos que git status reconoce. Para este caso puntual, ¡no quiero hacer esto!, pues no quiero que git siga los cambios que haga sobre el archivo `main.pdf`. Este pequeño problema lo podremos solucionar prontamente.

Con estos cambios agregados, lo siguiente a hacer será `cometer` estos cambios a mi repositorio con el comando `git commit -m "Commit inicial"`. El resultado es una lista de los cambios que se han hecho en comparación al último commit hecho. En este caso, se ha creado el archivo `main.typ`, y el archivo `template.typ`.

#figure(
  image("../assets/first-steps/commit inicial.png"),
  caption: [Output del comando `git commit`.],
) <commit>

El último paso típico que nos queda es empujar nuestros cambios a nuestro repositorio remoto de GitHub.
Al momento de crear un repositorio nuevo, sin archivos, github nos provee un par de instrucciones para poder sincronizar nuestro repositorio local a nuestro repositorio remoto: #figure( image("../assets/first-steps/instrucciones github.png"), caption: [Instrucciones provistas por GitHub al crear un repositorio vacío.], ) <instrucciones> En nuestro caso, nuestro repositorio no está vacío, y ya contiene el primer commit, por lo que podemos saltarnos varios de estos pasos y simplemente ingresar los comandos de agregar un repositorio remoto y empujar los cambios, tal y como se ve en @push #figure( image("../assets/first-steps/push.png"), caption: [Output al momento de agregar un 'remoto' nuevo, y empujar los cambios], ) <push> Finalmente, a modo de resumen, se agrega la línea de comandos al momento de hacer este proceso entero. Este nuevo commit consta de todos los cambios hechos hasta ahora, que incluyen toda la sección de primeros pasos. #figure( image("../assets/first-steps/proceso resumido.png"), caption: [Output de terminal al momento de seguir todo el procedimiento descrito aquí, con los cambios creados hasta ahora.], ) <resumen> #figure( image("../assets/first-steps/commit avanzado.png"), caption: [Si no se especifica un mensaje corto, Git abrirá el editor por defecto para crear uno más completo, tal y como se muestra en esta figura.], ) <commit-avanzado>
https://github.com/LauHuiQi0920/3007DataVisualizationTeamCyan
https://raw.githubusercontent.com/LauHuiQi0920/3007DataVisualizationTeamCyan/main/_extensions/_extensions/poster/typst-show.typ
typst
// Typst custom formats typically consist of a 'typst-template.typ' (which is // the source code for a typst template) and a 'typst-show.typ' which calls the // template's function (forwarding Pandoc metadata values as required) // // This is an example 'typst-show.typ' file (based on the default template // that ships with Quarto). It calls the typst function named 'article' which // is defined in the 'typst-template.typ' file. // // If you are creating or packaging a custom typst template you will likely // want to replace this file and 'typst-template.typ' entirely. You can find // documentation on creating typst templates here and some examples here: // - https://typst.app/docs/tutorial/making-a-template/ // - https://github.com/typst/templates #show: doc => poster( $if(title)$ title: [$title$], $endif$ // TODO: use Quarto's normalized metadata. $if(poster-authors)$ authors: [$poster-authors$], $endif$ $if(departments)$ departments: [$departments$], $endif$ $if(size)$ size: "$size$", $endif$ // Institution logo. $if(institution-logo)$ univ_logo: "$institution-logo$", $endif$ // Footer text. // For instance, Name of Conference, Date, Location. // or Course Name, Date, Instructor. $if(footer-text)$ footer_text: [$footer-text$], $endif$ // Any URL, like a link to the conference website. $if(footer-url)$ footer_url: [$footer-url$], $endif$ // Emails of the authors. $if(footer-emails)$ footer_email_ids: [$footer-emails$], $endif$ // Color of the footer. $if(footer-color)$ footer_color: "$footer-color$", $endif$ // Text color of the footer. $if(footer-text-color)$ footer_text_color: "$footer-text-color$", $endif$ // DEFAULTS // ======== // For 3-column posters, these are generally good defaults. // Tested on 36in x 24in, 48in x 36in, and 36in x 48in posters. // For 2-column posters, you may need to tweak these values. // See ./examples/example_2_column_18_24.typ for an example. // Any keywords or index terms that you want to highlight at the beginning. 
$if(keywords)$ keywords: ($for(keywords)$"$it$"$sep$, $endfor$), $endif$ // Number of columns in the poster. $if(num-columns)$ num_columns: $num-columns$, $endif$ // University logo's scale (in %). $if(univ-logo-scale)$ univ_logo_scale: $univ-logo-scale$, $endif$ // University logo's column size (in in). $if(univ-logo-column-size)$ univ_logo_column_size: $univ-logo-column-size$, $endif$ // Title and authors' column size (in in). $if(title-column-size)$ title_column_size: $title-column-size$, $endif$ // Poster title's font size (in pt). $if(title-font-size)$ title_font_size: $title-font-size$, $endif$ // Authors' font size (in pt). $if(authors-font-size)$ authors_font_size: $authors-font-size$, $endif$ // Footer's URL and email font size (in pt). $if(footer-url-font-size)$ footer_url_font_size: $footer-url-font-size$, $endif$ // Footer's text font size (in pt). $if(footer-text-font-size)$ footer_text_font_size: [$footer-text-font-size$], $endif$ doc, )
https://github.com/CreakZ/mirea-algorithms
https://raw.githubusercontent.com/CreakZ/mirea-algorithms/master/reports/layouts.typ
typst
#let head1(theme) = {
  set align(center)
  text(size: 16pt, [#theme])
}

#let head2(theme) = {
  set align(center)
  text(size: 14pt, [#theme])
}

#let un_heading(theme) = {
  align(center,
    text(size: 16pt, [*#theme* ])
  )
}

#let indent = h(1.25cm)
#let tab = h(0.5cm)

// Workaround: haven't yet found how to left-align the caption
#let tab_caption(num, capt) = {
  align(
    left,
    [Таблица #num -- #capt]
  )
}

// Templates for formatting the bibliography
#let bibliography(
  author,
  title,
  responsibility,
  edit_place,
  pub_house,
  edit_year,
  pages,
  ISBN
) = {
  text([#author. #title / #responsibility. -- #edit_place: #pub_house, #edit_year. -- #pages. -- #ISBN])
}

#let internet(
  title,
  website,
  URL,
  address_date
) = {
  text([#title \/\/ #website URL: #URL (дата обращения: #address_date)])
}
https://github.com/WShohei/presentation
https://raw.githubusercontent.com/WShohei/presentation/master/lab/0515/slide/main.typ
typst
#import "templates/template.typ": * #show: doc => slides(doc) #show link: l => underline(l) #slide( title: "環境構築(雑)", pad( x: 2em, )[ ローカルで動かせなくても聞けるようにはしたつもりですが、\ 自分で実行したい場合は、 - Poetryがあれば、 ```bash poetry install ``` で必要なライブラリをインストールできます。\ (versionが合わない⇒次ページ) - requirements.txtがあるので、 ```bash pip install -r requirements.txt ``` でも大丈夫だと思います。 \ (Ryeは使ったことないので分からないです...) ]) #slide( title: "環境構築(雑)", pad( x: 2em, )[ バージョンが合わない場合は、 ```bash pyenv install 3.9.{いくつでも大丈夫なはず} # 3.9.13で動作確認 pyenv local 3.9.{} ``` などでPythonのバージョンを変更してください。 その後、 ```bash poetry install pip install -r requirements.txt ``` で必要なライブラリをインストールしてください。 ] ) #slide( title: "topics", pad( x: 2em, )[ tensorの基本から始めてモデル学習の流れまでを説明します。 + Tensors + AutoGrad + Datasets & Dataloaders + Neural Networks (Linear, Conv1dまで) + Optimizing Models + おまけ + Homework ]) = Tensors #slide( title: "", pad( x: 2em, )[ - PyTorchではtensorというデータ構造が中心的な役割を果たす - Numpyのndarrayに似ているが、以下のような特徴がある - GPUを使った計算が可能 - 自動微分が可能 - tensorの初期化は以下に示すような方法がある ], ) #slide( title: "Tensor Initialization", pad( x: 2em, )[ ```python import torch import numpy as np ``` - データから直接tensorを作成する ```python data = [[1, 2], [3, 4]] x_data = torch.tensor(data) ``` - Numpyのndarrayからtensorを作成する ```python np_array = np.array(data) x_np = torch.from_numpy(np_array) ``` - 他のtensorから新しいtensorを作成する ```python x_ones = torch.ones_like(x_data) x_rand = torch.rand_like(x_data, dtype=torch.float) ``` ], ) #slide( title: "Tensor Initialization", pad( x: 2em, )[ - ランダムな値で初期化されたtensorを作成する ```python shape = (2, 3,) rand_tensor = torch.rand(shape) ones_tensor = torch.ones(shape) zeros_tensor = torch.zeros(shape) ``` - deviceを指定してtensorを作成する ```python dtype = torch.float device = torch.device('cpu') # or 'cuda' etc. 
tensor = torch.ones(shape, dtype=dtype, device=device) ``` ], ) #slide( title: "Tensor Operations", pad( x: 2em, )[ - tensorはNumpyのndarrayと同様に演算が可能 ```python x = torch.ones(2, 2) y = torch.ones(2, 2) z = x + y # +, -, *, /, @ などNumpyと同様の演算子が使える print(z) # tensor([[2., 2.], # [2., 2.]]) z[:, 1] = 0 # slicingも可能 print(z) # tensor([[2., 0.], # [2., 0.]]) ``` ] ) #slide( title: "Sending Tensors to GPU", pad( x: 2em, )[ - tensorは`to`メソッドを使ってGPUに送ることができる ```python tensor = torch.ones(4, 4) if torch.cuda.is_available(): tensor = tensor.to('cuda') # send to default GPU tensor = tensor.to('cuda:0') # send to GPU 0 ``` - tensorは`device`属性を使ってGPUにあるかどうかを確認できる ```python print(tensor.device) # cuda:0 ``` ], ) == AutoGrad #slide( title: "", pad( x: 2em, )[ - PyTorchの最も重要な機能の一つは自動微分機能 - tensorは`requires_grad=True`を指定することで、そのtensorに対する操作を追跡し、 微分を計算することができる ```python x = torch.ones(2, 2, requires_grad=True) ``` - 微分を計算するには`backward`メソッドを呼び出す - 微分を計算するためには、 + `backward`メソッドを呼び出すtensorがスカラーであるか、 + あるいは`backward`メソッドに引数としてweightを表すtensorを渡す必要がある ], ) #slide( title: "Example", pad( x: 2em, )[ - `backward`メソッドを呼び出す例 (`autograd/ex1.py`) ```python x = torch.ones(2, 2, requires_grad=True) y = x + 2 # tensor([[3., 3.], # [3., 3.]], grad_fn=<AddBackward0>) z = y * y * 3 # = 3(x + 2)^2 # tensor([[27., 27.], # [27., 27.]], grad_fn=<MulBackward0>) out = z.mean() # tensor(27., grad_fn=<MeanBackward0>) out.backward() print(x.grad) # d(out)/dx = 6(x + 2)/4 = 4.5 # tensor([[4.5000, 4.5000], # [4.5000, 4.5000]]) ``` ] ) #slide( title: "Example", pad( x: 2em, )[ - `backward`メソッドに引数を渡す例1 (`autograd/ex2.py`) ```python x = torch.ones(2, 2, requires_grad=True) y = x + 2 z = y * y * 3 out = z # tensor([[27., 27.], [27., 27.]]) out.backward(gradient=torch.ones(2, 2)) # 各要素に対する重みを指定 # d(out)/dx = 6(x + 2) = 18 print(x.grad) # tensor([[18., 18.], # [18., 18.]]) ``` ] ) #slide( title: "Example", pad( x: 2em, )[ - `backward`メソッドに引数を渡す例2 (`autograd/ex3.py`) ```python x = torch.ones(2, 2, requires_grad=True) y = 
x + 2 z = y * y * 3 out = z # tensor([[27., 27.], [27., 27.]]) out.backward(gradient=torch.tensor([[1, 2], [3, 4]])) print(x.grad) # tensor([[18., 36.], # [54., 72.]]) ``` ] ) #slide( title: "非スカラーのTensorの微分", pad( x: 2em, )[ 上述の通り、`backward`メソッドを非スカラーのtensorに対して呼び出す時には、 `gradient`引数を指定する必要がある\ 数学的な意味での微分と、異なる動作をすることに注意\ ]) #slide( title: "非スカラーのTensorの微分", pad( x: 2em, )[ $bold(x) := vec(x_1, x_2, dots.v, x_n), bold(W) = mat( w_(11), w_(12), dots.h, w_(1n) ; w_(21), w_(22), dots.h, w_(2n) ; dots.v , dots.v , dots.down , dots.v ; w_(m 1), w_(m 2), dots.h, w_(m n) ; ), bold(y) = bold(W) bold(x) := vec(y_1, y_2, dots.v, y_m)$ としたとき、\ $ y_i = sum_(l=1)^(n) w_(i l) x_l $ $W$ の各要素 $w_(j k)$ による $bold(y)$ の微分を求めると、 $ (partial y_i)/(partial w_(j k)) = delta_(i j) x_k $ という3階のテンソルが得られる\ ] ) #slide( title: "非スカラーのTensorの微分", pad( x: 2em, )[ NNのweightを更新したい場合、上述のような3階テンソルではなく、\ 2階(weightの次元に一致)テンソルを得る必要がある\ 実際にautogradで得られるのは、\ $ (partial bold(y)) / (partial bold(W)) = mat( x_1, x_2, dots.h, x_n ; x_1, x_2, dots.h, x_n ; dots.v , dots.v , dots.down , dots.v ; x_1, x_2, dots.h, x_n ; ) $ となる。(ref. 
`diff_tensor.py`) ] ) #slide( title: "非スカラーのTensorの微分", pad( x: 2em, )[ #set text(size: 20pt) 行列積の微分は次のように計算できる。\ Lossを$L$ (スカラー)、$bold(Y) = bold(W) bold(X)$ としたとき、\ $ (partial L) / (partial bold(W)) & = (partial L) / (partial w_(i j))\ & = (partial L) / (partial y_(i k)) (partial y_(i k)) / (partial w_(i j)) space (because y_(i k) = w_(i j) x_(j k))\ & = (partial L) / (partial y_(i k)) x_(j k) space (because (partial y_(i k)) / (partial w_(i j)) = x_(j k))\ & = (partial L) / (partial bold(Y)) bold(X)^T $ 非スカラーのtensorに対する微分を計算する際に指定する`gradient`引数は、\ ここでの$(partial L) / (partial bold(Y))$に相当する部分である ] ) #slide( title: "Disabling Autograd", pad( x: 2em, )[ - `requires_grad`が`True`のtensorに対しては、その計算は追跡されるが、`torch.no_grad`ブロック内では追跡を無効にすることができる(`autograd/no_grad.py`) ```python x = torch.ones(2, 2, requires_grad=True) print(x.requires_grad) # True with torch.no_grad(): print((x ** 2).requires_grad) # False ``` ] ) #slide( title: "Disabling Autograd", pad( x: 2em, )[ - `detach()` メソッドを使うことでも追跡を無効にすることができる ```python x = torch.ones(2, 2, requires_grad=True) y = x.detach() print(y.requires_grad) # False ``` ユースケースとしては、 + ファインチューニング時に一部のパラメータを固定する場合 + `forward`の結果のみが必要な場合の高速化 などがある ] ) #slide( title: "Computational Graph", pad( x: 2em, )[ PyTorchではAutoGradのために計算グラフが構築される - tensorは`grad_fn`属性を持ち、そのtensorを作成した演算を記録している - 各演算の微分があらかじめ定義されており、逆伝播時にはそれを使って微分を計算する - `backward`メソッドを呼ぶと各変数tensorの`grad`属性に微分が格納される - 計算グラフは動的であり、学習を行いながらモデルの構造を変更することが可能 ] ) #split_slide( title: "Computational Graph", pad( x: 2em, )[ `torchviz`を使って計算グラフを可視化することができる (`autograd/computational_graph.py`) ```python import torchviz import torch x = torch.ones(2, 2, requires_grad=True) y = x * x * x * x out = y.mean() out.backward() dot = torchviz.make_dot(out, params=dict(x=x, y=y)) dot.render("graph", format="png") ``` ], [ #image("images/graph.png") ], left_ratio: 2.4fr, right_ratio: 1fr, ) = Datasets & Dataloaders #slide(title: "Datasets & Dataloaders", [ - PyTorchでは、データセットとデータローダーを使ってデータを扱う - 
データセットはデータを格納し、データローダーはデータセットからバッチを取得する - データセットは`torch.utils.data.Dataset`クラスを継承して作成する - データローダーは`torch.utils.data.DataLoader`クラスを使って作成する ]) #slide(title: "Example (MNIST)", [ 1. ライブラリのインポート ```python import torch from torch.utils.data import DataLoader from torchvision import datasets, transforms ```]) #slide(title: "Example (MNIST)", [ #set text(size: 21pt) 2. データセットのダウンロードと変換\ ```python transform = transforms.Compose([ transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,)) ]) train_dataset = datasets.MNIST( root='./data', train=True, download=True, # rootにデータがない場合にダウンロードする transform=transform ) test_dataset = datasets.MNIST( root='./data', train=False, download=True, transform=transform ) ``` ]) #slide(title: "Example (MNIST)", [ 3. データローダーの作成\ ```python train_loader = DataLoader( dataset=train_dataset, batch_size=64, shuffle=True ) test_loader = DataLoader( dataset=test_dataset, batch_size=64, shuffle=False ) ```\ ]) #slide(title: "Custom Dataset", [ - 自前のデータでDatasetを作ることも可能 - `torch.utils.data.Dataset`クラスを継承して、\ `__len__`メソッドと`__getitem__`メソッドを実装する ]) #slide(title: "Datasetの作成", [ ステップ1: CustomDatasetクラスの作成\ ```python class CustomDataset(Dataset): def __init__(self, data, labels, transform=None): self.data = data self.labels = labels self.transform = transform def __len__(self): return len(self.data) def __getitem__(self, idx): sample = self.data[idx] label = self.labels[idx] if self.transform: sample = self.transform(sample) # 前処理 return sample, label ``` ]) #slide(title: "DataLoaderの使用", [ ステップ2: Datasetの初期化\ ```python data = ... # your data labels = ... 
# your labels dataset = CustomDataset(data, labels) ``` ステップ3: DataLoaderの作成\ ```python from torch.utils.data import DataLoader dataloader = DataLoader(dataset, batch_size=4, shuffle=True) for batch_data, batch_labels in dataloader: # training code here ``` ]) #slide(title: "__len__", [ `__len__`メソッドは、データセットのサイズを返す。\ ```python class CustomDataset(Dataset): def __init__(self, data): self.data = data def __len__(self): return len(self.data) ``` データローダーがデータセットの終了を確認するために使用される。\ ]) #slide(title: "__getitem__", [ `__getitem__`メソッドは、データセットから\ 特定のインデックスにあるサンプルを取得する。\ ```python class CustomDataset(Dataset): def __init__(self, data): self.data = data def __getitem__(self, idx): return self.data[idx] ``` このメソッドを実装することで、\ データセットから特定のデータポイントを取得できる。\ データローダーが各バッチのデータを読み込む際に使用される。\ ]) = Neural Networks #slide( title: "Neural Networks", pad( x: 2em, )[ - Pytorchでは`torch.nn`にニューラルネットワークの構築に必要な\ モジュールが含まれている - `torch.nn.Module`クラスを継承して、\ `__init__`メソッドと`forward`メソッドを実装する ]) #split_slide( title: "Example", pad( x: 2em, )[ #set text(size: 20pt) ```python class NeuralNetwork(nn.Module): def __init__(self): super(NeuralNetwork, self).__init__() self.flatten = nn.Flatten() self.linear_relu_stack = nn.Sequential( nn.Linear(28*28, 512), nn.ReLU(), nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10) ) def forward(self, x): x = self.flatten(x) logits = self.linear_relu_stack(x) return logits ``` ], [ #set text(size: 20pt) - `__init__`メソッドでは、\ ネットワークの構造を定義する - `forward`メソッドでは、\ データがネットワークを通過するときの処理を定義する ], left_ratio: 1.7fr,right_ratio: 1.02fr) #slide(title: "Model Parameters", pad( x: 2em, )[ nn.Modlueをsubclassに持つモデルのパラメータは、\ `parameters()`や`named_parameters()`メソッドを使って取得できる\ (自前のパラメータは`nn.Parameter`を使って作成する必要) ```python model = NeuralNetwork() for name, param in model.named_parameters(): print(f"Layer: {name} | Size: {param.size()} | Values : {param[:2]} \n") # Layer: linear_relu_stack.0.weight | Size: torch.Size([512, 784]) | Values : tensor([[ 0.0022, -0.0021, ...], # ... 
``` ]) //#slide( // title: "nn.Flatten", pad( // x: 2em, // )[ // ```python // torch.nn.Flatten(start_dim=1, end_dim=-1) // ``` // `start_dim`から`end_dim`までの次元を平坦化する // ```python // x = torch.rand(1, 28, 28) // flatten = nn.Flatten() // flatten(x).shape // # torch.Size([1, 784]) // ``` // ] //) #slide(title: "nn.Linear", pad( x: 2em, )[ ```python torch.nn.Linear(in_features, out_features, bias=True) ``` 入力の線形変換を行う $y = x A^T + b$\ - `in_features`: 入力のサイズ - `out_features`: 出力のサイズ - `bias`: バイアス項を含めるかどうか ```python linear = nn.Linear(20, 30) input = torch.randn(128, 20) output = linear(input) print(output.size()) # torch.Size([128, 30]) ``` ]) #slide( title: "nn.Conv1d", pad( x: 2em, )[ ```python torch.nn.Conv1d(in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True) ``` 1次元の畳み込みを行う\ - `in_channels`: 入力のチャンネル数 - `out_channels`: 出力のチャンネル数 - `kernel_size`: カーネルのサイズ - `stride`: ストライド (default: 1) - `padding`: パディング (default: 0) - `dilation`: カーネルの間隔 (default: 1) - `groups`: グループ数 (default: 1) - `bias`: バイアス項を含めるかどうか (default: True) ]) #slide( title: "nn.Conv1d", pad( x: 2em, )[ `dilation`はカーネルの間隔を指定する\ 通常のカーネル ```python [1, 2, 3] ``` dilation=2の場合 ```python [1, 0, 2, 0, 3] ``` ]) #slide( title: "nn.Conv1d", pad( x: 2em, )[ `groups`は入力と出力のチャンネルをグループに分割する\ `in_channels=4, out_channels=8, groups=2`の場合\ ```python Group 1: Input channels [0, 1] -> Output channels [0, 1, 2, 3] Group 2: Input channels [2, 3] -> Output channels [4, 5, 6, 7] ``` ]) #slide( title: "nn.Conv1d", pad( x: 2em, )[ 出力サイズは以下のように計算される\ $ O = floor((I + 2 times P - D times (K - 1) - 1) / S) + 1 \ (O: "output size", I: "input size", D: "dilation" \ P: "padding", K: "kernel size", S: "stride") $ ]) #slide( title: "nn.Conv1d", pad( x: 2em, )[ ```python # 入力チャンネル数3、出力チャンネル数6、カーネルサイズ5の1次元畳み込みレイヤー conv1d_layer = nn.Conv1d(in_channels=3, out_channels=6, kernel_size=5) # ダミー入力データ (バッチサイズ10, チャンネル数3, シーケンス長50) input_data = torch.randn(10, 3, 50) # 畳み込みレイヤーを通してデータを渡す output_data 
= conv1d_layer(input_data) print(output_data.shape) # torch.Size([10, 6, 46]) ``` ]) = Optimizing Models #slide( title: "Optimizing Models", pad( x: 2em, )[ - PyTorchでは、`torch.optim`に最適化アルゴリズムが実装されている - モデルのパラメータを更新するためには、\ `torch.optim.Optimizer`クラスを使う - `torch.optim`モジュールには、\ SGD、Adam、RMSpropなどの最適化アルゴリズムが含まれている ]) #slide( title: "Example", pad( x: 2em, )[ ```python model = NeuralNetwork() optimizer = torch.optim.SGD(model.parameters(), lr=1e-3) ``` - `model.parameters()`でモデルのパラメータを取得し、\ `lr`で学習率を指定して最適化アルゴリズムを初期化する ```python # Inside the training loop optimizer.zero_grad() # 勾配を初期化 loss_fn = nn.CrossEntropyLoss() loss = loss_fn(model(data), target) loss.backward() # 勾配を計算 optimizer.step() # parameterのtensor.gradに基づいてパラメータを更新 ``` ]) #slide( title: "Optimizer", pad( x: 2em, )[ 全体的な学習の流れは`train/training_ex.py`を参照 ] ) = おまけ (pytorchの内部実装) #slide( title: "", pad( x: 2em, )[ #set text(size: 20pt) #link("https://github.com/pytorch/pytorch/blob/main/torch/__init__.py#L1514")[`torch/__init__.py`]には次のような部分がある ```python for name in dir(_C._VariableFunctions): if name.startswith('__') or name in PRIVATE_OPS: continue obj = getattr(_C._VariableFunctions, name) obj.__module__ = 'torch' # Hide some APIs that should not be public if name == "segment_reduce": # TODO: Once the undocumented FC window is passed, remove the line bellow globals()[name] = obj name = "_" + name globals()[name] = obj if not name.startswith("_"): __all__.append(name) ``` ]) #slide( title: "", pad( x: 2em, )[ pytorchの実装はほとんどがCUDA C++で書かれており、`_C`の実体は`torch._C.so`でありコンパイル時に生成される\ コンパイル前のソースの多くは#link("https://github.com/pytorch/pytorch/tree/main/aten/src/ATen")[ここ]にある。 いろいろ遡っていくと、最終的に非オープンソースのcuDNNの関数が呼ばれているところ(たとえば #link("https://github.com/pytorch/pytorch/blob/37596769d8b42beba104e14d149cebe0dfd75d12/aten/src/ATen/native/cudnn/Conv_v7.cpp#L697")[ここ]まで辿り着けます。 ]) = Homework #slide( title: "", pad( x: 2em, )[ + 演算子オーバーロードを用いて、autogradが可能なTensor classを実装する + 1で作ったTensorに対して計算グラフの出力を行えるように`draw_graph`メソッドを実装する 
(`graphviz`を使うと便利) + `nn.Conv2d`の実装 ]) = References #slide( title: "References", pad( x: 2em, )[ + PyTorch Documentation (https://pytorch.org/docs/stable/index.html) + PyTorch Tutorials (https://pytorch.org/tutorials/) ] )
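The Conv1d output-size formula quoted in the slides, $O = floor((I + 2P - D(K-1) - 1) \/ S) + 1$, can be sanity-checked without installing PyTorch. The sketch below (helper names are my own) compares the closed-form formula against directly counting the positions a dilated kernel can occupy.

```python
# Sanity check (pure Python, no PyTorch) of the Conv1d output-size formula
# from the slides: O = floor((I + 2P - D*(K - 1) - 1) / S) + 1.
import math

def conv1d_out_len(I, K, S=1, P=0, D=1):
    return math.floor((I + 2 * P - D * (K - 1) - 1) / S) + 1

def count_positions(I, K, S=1, P=0, D=1):
    # Count output positions by sliding the (dilated) kernel directly.
    padded = I + 2 * P
    span = D * (K - 1) + 1  # effective width of a dilated kernel
    return len(range(0, padded - span + 1, S))

# Matches the slide example: input length 50, kernel size 5 -> output length 46
# (the torch.Size([10, 6, 46]) shown in the nn.Conv1d example).
print(conv1d_out_len(50, 5))  # → 46
```

Both functions agree for any stride, padding, and dilation combination where the kernel fits, which is a quick way to predict output shapes before building a model.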
https://github.com/BreakingLead/note
https://raw.githubusercontent.com/BreakingLead/note/main/Math/chebyshev/chebyshev.typ
typst
#import "../../template-mathnote.typ": *

= Chebyshev Polynomials

Observe:
+ $cos(x) = cos x$
+ $cos(2x) = 2cos^2 x - 1$
+ $cos(3x) = 4cos^3 x - 3cos x$
+ $cos(4x) = 8cos^4 x - 8cos^2 x + 1$

#theorem[
  Recursive Expression
][
  Define $T_(n)(cos theta) := cos(n theta)$. Then
  $ T_(n)(x) = 2x T_(n-1) (x) - T_(n-2) (x) $
][
  _Proof:_
  $
  cos((n+1)x) &= cos(n x + x) \
  &= cos(n x)cos(x)-sin(n x)sin(x) \
  &= 2 cos(n x)cos(x) - [cos(n x)cos(x)+sin(n x )sin(x)] \
  &= 2 cos(n x)cos(x) - cos((n-1) x) \
  \
  T_(n+1)(cos x) &= 2T_n (cos x) - T_(n-1)(cos x) => \
  T_(n)(x) &= 2x T_(n-1) (x) - T_(n-2) (x)
  $
]

Properties of Chebyshev polynomials are listed here:

- Property 1: #h(1em) $T_n (x) = cos (n arccos x)$.

- Property 2: #h(1em) The coefficient of the highest-order term $x^n$ of $T_n (x)$ is $2^(n-1)$.

  _Proof:_ straightforward by induction.

- Property 3: #h(1em) $|T_n (x)| <= 1$ for $x in [-1,1]$.

- Property 4: #h(1em) $T_n (x)$ has exactly $n$ zeros in $[-1,1]$, namely $x_k=cos(((2k-1)pi)/(2n)),k=1,2,3,...,n$.

  _Proof:_
  $
  T_n (cos theta) = cos (n theta) \
  T_n (cos theta) = 0 <=> cos (n theta) = 0 => theta = ((2k-1) pi)/(2n) (k in ZZ) \
  T_n (x) = 0 => x = cos (((2k-1) pi)/(2n)) (k=1,2,3,...n)
  $

- Property 5: #h(1em) $T_n (x)$ has $n+1$ extreme points in $[-1,1]$, namely $x^*_k=cos((k pi)/n),k=0,1,2,...,n$, and the values at these points alternate between $+1$ and $-1$.

  _Proof:_ same as above.

= Chebyshev Approximation

#theorem[1][
  Let $Q_n (x)$ be a monic polynomial of degree $n$ whose $n$ roots $x_1,x_2,...,x_n$ all lie in $(-1,1)$. Pick points $t_0,t_1,t_2,...,t_n$ interleaving $-1,x_1,x_2,...,x_n,1$, i.e. each $t_i$ lies in the open interval between two consecutive roots (counting $-1$ and $1$ as roots for the moment). Then for every monic polynomial $R(x)$ of degree $n$ we have
  $ max_(x in [-1,1]) |R(x)| >= min_(0 <= i <= n) |Q(t_i)| $
][
  _Proof:_ By contradiction. Suppose
  $
  exists R(x), max_(x in [-1,1]) |R(x)| < min_(0 <= i <= n) |Q(t_i)| = M \
  => |R(x)| < M (forall x in [-1,1]) \
  $
  Let $T(x) := R(x) - Q(x)$. Since $|R(t_i)| < M <= |Q(t_i)|$ and the signs of $Q(t_0),Q(t_1),...,Q(t_n)$ alternate (each $t_i$ sits between consecutive roots of $Q$), the values $T(t_0),T(t_1),...,T(t_n)$ must alternate in sign as well (easy to see from a sketch).

  Hence $T(x)$ has at least $n$ roots in $[-1,1]$, so a nonzero $T$ would have degree at least $n$ and a nonzero $x^n$ coefficient. However, $R(x) = x^n + ...$ and $Q(x) = x^n + ...$, so the $x^n$ terms cancel in $T(x)$, a contradiction. The claim follows. $qed$
]
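The recursion $T_n (x) = 2x T_(n-1)(x) - T_(n-2)(x)$ derived above can be checked numerically against the defining identity $T_n (cos theta) = cos(n theta)$. A minimal Python sketch (names are illustrative):

```python
# Numerical check of the Chebyshev recursion T_n(x) = 2x T_(n-1)(x) - T_(n-2)(x)
# against the defining identity T_n(cos(theta)) = cos(n*theta). Pure Python.
import math

def chebyshev_T(n, x):
    t_prev, t = 1.0, x  # T_0 = 1, T_1 = x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2 * x * t - t_prev
    return t

theta = 0.7
for n in range(6):
    assert abs(chebyshev_T(n, math.cos(theta)) - math.cos(n * theta)) < 1e-12
print("recursion matches cos(n*theta)")
```

The same loop also reproduces the explicit polynomials listed at the top, e.g. $T_3(x) = 4x^3 - 3x$.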
https://github.com/HernandoR/lz-brilliant-cv
https://raw.githubusercontent.com/HernandoR/lz-brilliant-cv/main/modules_SinglePage/skills.typ
typst
Apache License 2.0
#import "../brilliant-CV/template.typ": *

#cvSection("Skills")

#cvSkill(
  type: [Languages],
  info: languageSwitch((
    "en": [English - Professional fluency #hBar() Mandarin - Native],
    "zh": [英语 - 专业流利 #hBar() 中文 - 母语]
  )),
)

#cvSkill(
  type: [Tech Stack],
  info: [Python (Pandas/Numpy) #hBar() Machine Learning (PyTorch) #hBar() Git #hBar() Tableau]
)

// #cvSkill(
//   type: [Personal Interests],
//   info: [Swimming #hBar() Cooking #hBar() Reading]
// )
https://github.com/typst-doc-cn/tutorial
https://raw.githubusercontent.com/typst-doc-cn/tutorial/main/src/intermediate/mod.typ
typst
Apache License 2.0
#import "/src/book.typ" #import "../mod.typ": code as _code, exec-code as _exec-code, refs, typst-func, pro-tip #import "/typ/templates/page.typ": main-color #import "/typ/embedded-typst/lib.typ": svg-doc, default-fonts, default-cjk-fonts #let eval-local(it, scope, res) = if res != none { res } else { eval(it.text, mode: "markup", scope: scope) } #let exec-code(it, scope: (:), res: none, ..args) = _exec-code( it, res: eval-local(it, scope, res), ..args) #let code(it, scope: (:), res: none, ..args) = _code( it, res: eval-local(it, scope, res), ..args) #let frames(code, cjk-fonts: false, code-as: none, prelude: none) = { if code-as != none { code-as } else { code } if prelude != none { code-as = if code-as == none { code } code = prelude.text + "\n" + code.text } let fonts = if cjk-fonts { (..default-cjk-fonts(), ..default-fonts()) } grid(columns: (1fr, 1fr), ..svg-doc(code, fonts: fonts).pages.map(data => image.decode(data)).map(rect)) } #let frames-cjk = frames.with(cjk-fonts: true)
https://github.com/Dherse/typst-brrr
https://raw.githubusercontent.com/Dherse/typst-brrr/master/samples/masterproef/elems/acronyms.typ
typst
#import "./template.typ": todo, section #let glossary_entries = state("glossary_entries", (:)) #let query_labels_with_key(loc, key, before: false) = { if before { query(selector(label("glossary:" + key)).before(loc, inclusive: false), loc) } else { query(selector(label("glossary:" + key)), loc) } } #let gloss(key, suffix: none, short: auto, long: auto) = { locate(loc => { let glossary_entries = glossary_entries.final(loc); if key in glossary_entries { let entry = glossary_entries.at(key) let gloss = query_labels_with_key(loc, key, before: true) let in_preface(l) = section.at(l) == "preface"; let is_first_in_preface = gloss.map((x) => x.location()).find((x) => in_preface(x)) == none; let is_first = gloss.map((x) => x.location()).find((x) => not in_preface(x)) == none; let long = if ((in_preface(loc) and is_first_in_preface)) or long == true { [ (#emph(entry.long))] } else if (not in_preface(loc) and (is_first and short != true)) or long == true { [ (#emph(entry.long))] } else { none } [ #link(label(entry.key))[#entry.short#suffix#long] #label("glossary:" + entry.key) ] } else { todo("Glossary entry not found: " + key) } }) } #let glossary(title: "Glossary", entries, body) = { [ #heading(title) <glossary> ] glossary_entries.update((x) => { for entry in entries { x.insert(entry.key, (key: entry.key, short: entry.short, long: entry.long)) } x }) let elems = (); for entry in entries.sorted(key: (x) => x.key) { elems.push[ #heading(smallcaps(entry.short), level: 99) #label(entry.key) ] elems.push[ #emph(entry.long) #box(width: 1fr, repeat[.]) #locate(loc => { query_labels_with_key(loc, entry.key) .map((x) => x.location()) .dedup(key: (x) => x.page()) .sorted(key: (x) => x.page()) .map((x) => link(x)[#numbering(x.page-numbering(), ..counter(page).at(x))]) .join(", ") }) ] } table( columns: (auto, 1fr), inset: 5pt, stroke: none, fill: (_, row) => { if calc.odd(row) { luma(240) } else { white } }, align: horizon, ..elems ) show ref: r => { if r.element != none and 
r.element.func() == heading and r.element.level == 99 { gloss(str(r.target), suffix: r.citation.supplement) } else { r } } body };
https://github.com/Joelius300/hslu-typst-template
https://raw.githubusercontent.com/Joelius300/hslu-typst-template/main/chapters/07_reflection-and-outlook.typ
typst
MIT License
= Reflection and Outlook

Reflection on your own work, unsolved problems, further ideas. Important; this is often given short shrift in BAAs.

#pagebreak()
https://github.com/sitandr/typst-examples-book
https://raw.githubusercontent.com/sitandr/typst-examples-book/main/src/typstonomicon/math_display.md
markdown
MIT License
# Make all math display math

<div class="warning">

May slightly interfere with math blocks.

</div>

```typ
// author: eric1102
#show math.equation: it => {
  if it.body.fields().at("size", default: none) != "display" {
    return math.display(it)
  }
  it
}

Inline math: $sum_(n=0)^oo e^(x^2 - n/x^2)$\
Some other text on new line.

$ sum_(n=0)^oo e^(x^2 - n/x^2) $
```
https://github.com/heinwol/master-thesis
https://raw.githubusercontent.com/heinwol/master-thesis/main/typst/template.typ
typst
#import "./typst-packages/packages/preview/ctheorems/1.1.2/lib.typ": * #import "@preview/sourcerer:0.2.1": code as code_ // indentation hack from https://github.com/typst/typst/issues/311#issuecomment-2104447655 #let indent = 1.25cm #let styled = [#set text(red)].func() #let space = [ ].func() #let sequence = [].func() #let turn-on-first-line-indentation( doc, last-is-heading: false, // space and parbreak are ignored indent-already-added: false, ) = { if doc.has("children") { for (i, elem) in doc.children.enumerate() { let element = elem.func() if element == text { let previous-elem = doc.children.at(i - 1) if i == 0 or last-is-heading or previous-elem.func() == parbreak { if not indent-already-added { indent-already-added = true h(indent) } } elem } else if element == heading { indent-already-added = false last-is-heading = true elem } else if element == space { elem } else if element == parbreak { indent-already-added = false elem } else if element == sequence { turn-on-first-line-indentation( elem, last-is-heading: last-is-heading, indent-already-added: indent-already-added, ) } else if element == styled { styled( turn-on-first-line-indentation( elem.child, last-is-heading: last-is-heading, indent-already-added: indent-already-added, ), elem.styles, ) } else { indent-already-added = false last-is-heading = false elem } } } else { doc } } #let code(..args) = code_(..args) // ------------ #let default_thm_args = arguments( separator: [. 
#h(0.2em)], inset: (top: 0.2em, left: 0em, right: 0em, bottom: 0.2em), padding: (top: 0em, bottom: 0em), ) #let thmbox = thmbox.with(..default_thm_args) #let thmplain = thmplain.with(..default_thm_args) #let thmproof = thmproof.with(..default_thm_args) #let theorem = thmplain( "theorem", [#h(indent) Теорема], titlefmt: strong, supplement: none, base_level: 1, // ) #let corollary = thmplain( "corollary", [#h(indent) Следствие], base: "theorem", titlefmt: strong, ) #let remark = thmplain( "remark", [#h(indent) Замечание], titlefmt: strong, ).with(numbering: none) #let lemma = thmplain( "lemma", [#h(indent) Лемма], titlefmt: strong, supplement: none, base_level: 1, // ) #let claim = thmplain( "claim", [#h(indent) Утверждение], titlefmt: strong, supplement: none, base_level: 1, // ) #let proof = thmproof("proof", [#h(indent) Доказательство]) #let definition = thmplain( "definition", [#h(indent) Определение], base_level: 1, // take only the first level from the base titlefmt: strong, supplement: none, // separator: none, //[#h(0.1em):#h(0.2em)], ) // ----------- #let wrap-thm-with-indentation(env) = { (..args, doc) => env(..args, turn-on-first-line-indentation(doc)) } // #let theorem = wrap-thm-with-indentation(theorem) // #let corollary = wrap-thm-with-indentation(corollary) // #let remark = wrap-thm-with-indentation(remark) // #let claim = wrap-thm-with-indentation(claim) // #let lemma = wrap-thm-with-indentation(lemma) // #let proof = wrap-thm-with-indentation(proof) // #let definition = wrap-thm-with-indentation(definition) // ----------- #let template(body) = { // set document(author: "dds", title: "ds") // Set the basic text properties. set text( font: "Liberation Serif", lang: "ru", size: 14pt, // fallback: true, hyphenate: false, overhang: false, ) // Set the basic page properties. 
set page( paper: "a4", number-align: center, margin: (top: 20.5mm, bottom: 20.5mm, left: 30.5mm, right: 10.5mm), numbering: "1", // footer: rect(fill: aqua)[Footer], ) counter(page).update(1) // Set the basic paragraph properties. set par( leading: 1.25em, justify: true, // first-line-indent: 1.25em, // hanging-indent: 1.25em, ) set linebreak(justify: true) // Additionally styling for list. set enum(indent: 0cm) set list(indent: 0cm) set heading(numbering: "1.1") show heading: set align(center) show heading: it => { it v(1em) } show heading.where(level: 1): it => { pagebreak() it } show heading.where(level: 3): set heading(numbering: none, outlined: false) // show math.equation: set text(font: "New Computer Modern Math", fallback: false) // show math.equation: context repr(bold.font) //set text(font: "Arial", fallback: false) // show raw: set text(font: "Fira Code") set math.equation( // numbering: (num => "(" + ((counter(heading).get().at(0),) + (num,)).map(str).join(".") + ")") ) set math.equation(supplement: none) show math.cases: set align(left) // show math.colon: math.class("punctuation", math.colon) show math.equation: e => { show math.colon: $math.class("punctuation", math.colon) thin$ box(e) } set ref(supplement: it => { if it.func() == figure { "рис." } else { it.supplement } }) set grid(column-gutter: 5pt) show figure.caption: set par(leading: 1em) // show figure.where(kind: 1): "" // set figure(supplement: "рис.") // see https://github.com/typst/typst/issues/311#issuecomment-1722331318 show regex("^\s*!!\s*"): context h(indent) show <nonum>: set heading(numbering: none) show <nonum>: set math.equation(numbering: none) show: turn-on-first-line-indentation body }
https://github.com/ckunte/m-one
https://raw.githubusercontent.com/ckunte/m-one/master/inc/fenders.typ
typst
= Berthing fenders

To avoid downtime in LNG carrier berthing, I was looking to evaluate the adequacy of breasting dolphins at an age-old jetty (see @bd), coupled temporarily with floating pneumatic fenders (FPF), while new air block fenders (ABF) were procured and installed as replacements.

#figure(
  image("/img/bdolphin.jpg", width: 100%),
  caption: [Aerial view of breasting dolphins],
) <bd>

To assess the fenders, I was given two performance curves: one of an ABF from the 70s, and the other of an FPF, furnished by its vendor. I've put the two together in @fsbs to illustrate how similar they look at first glance, while in fact being quite different. Sharp eyes will quickly notice the inconsistent units, unequal ordinates, and interchanged twin axes. It was obvious that I needed to put all four curves onto a single graph to avoid the optical illusion.

#figure(
  image("/img/abfpf.jpg", width: 80%),
  caption: [ABF, FPF performance curves],
) <fsbs>

#figure(
  image("/img/cf.png", width: 80%),
  caption: [Fender performances],
) <fc>

With no data except these plots, I had to digitize them, convert to consistent units, and interpolate between data points to generate the comparison in @fc. Note the effect units have on curve slopes. By visual inspection of the first image alone, I would not have picked up on the fact that ABFs outperform FPFs by a factor of 4 at their respective maximum deflections, and by a factor of 2 at equivalent deflections, or that FPFs are demonstrably softer than ABFs. Just for fun, I've also added a cell fender to the mix.

where:

- EVD curves correspond to Energy v. Displacement,
- RVD curves correspond to Reaction v. Displacement

Code for plotting the performance curves of air block, floating pneumatic, and cell fenders is as follows.

#let pcurves = read("/src/pcurves.py")
#{linebreak();raw(pcurves, lang: "python")}

$ - * - $
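The digitize-and-interpolate step described above can be sketched as follows. The deflection and energy values here are hypothetical stand-ins, not the digitized vendor data, and the actual plotting code lives in `/src/pcurves.py`; the point is only how `numpy.interp` resamples two differently spaced curves onto a common deflection grid so they can be compared point for point.

```python
import numpy as np

# Digitized (deflection, energy) points -- hypothetical values standing in
# for the curves read off the vendor plots, after unit conversion.
abf_defl = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])    # % deflection
abf_energy = np.array([0.0, 5.0, 18.0, 45.0, 90.0, 160.0])  # kN*m (made up)

fpf_defl = np.array([0.0, 15.0, 30.0, 45.0, 60.0])
fpf_energy = np.array([0.0, 3.0, 10.0, 22.0, 40.0])

# Resample both curves onto a common deflection grid so they can be
# plotted and compared on a single graph.
grid = np.linspace(0.0, 50.0, 11)
abf_on_grid = np.interp(grid, abf_defl, abf_energy)
fpf_on_grid = np.interp(grid, fpf_defl, fpf_energy)

# Performance ratio at equivalent deflections (skip 0% to avoid 0/0).
ratio = abf_on_grid[1:] / fpf_on_grid[1:]
print(ratio.round(2))
```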
https://github.com/SWATEngineering/Docs
https://raw.githubusercontent.com/SWATEngineering/Docs/main/src/2_RTB/VerbaliEsterni/VerbaleEsterno_231206/meta.typ
typst
MIT License
#let data_incontro = "06-12-2023"
#let inizio_incontro = "15:30"
#let fine_incontro = "16:45"
#let luogo_incontro = "Sede Sync Lab, via <NAME>, 28"
#let company = "Sync Lab"
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/visualize/stroke_03.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page

// Dashing
#line(length: 60pt, stroke: (paint: red, thickness: 1pt, dash: ("dot", 1pt)))
#v(3pt)
#line(length: 60pt, stroke: (paint: red, thickness: 1pt, dash: ("dot", 1pt, 4pt, 2pt)))
#v(3pt)
#line(length: 60pt, stroke: (paint: red, thickness: 1pt, dash: (array: ("dot", 1pt, 4pt, 2pt), phase: 5pt)))
#v(3pt)
#line(length: 60pt, stroke: (paint: red, thickness: 1pt, dash: ()))
#v(3pt)
#line(length: 60pt, stroke: (paint: red, thickness: 1pt, dash: (1pt, 3pt, 9pt)))
https://github.com/Jollywatt/typst-wordometer
https://raw.githubusercontent.com/Jollywatt/typst-wordometer/master/tests/issues/test.typ
typst
MIT License
#import "/src/lib.typ": *
#set page(width: 15cm, height: auto)

= Issue #link("https://github.com/Jollywatt/typst-wordometer/issues/1", `#1`)

Figures might have:
- no `caption` field, or
- a `caption` field with the value `none`.

#let el = rect[
  #figure([Hello from the figure body.], caption: none)
  Ciao.
]

#el

#word-count-of(el)
https://github.com/RubixDev/typst-i-figured
https://raw.githubusercontent.com/RubixDev/typst-i-figured/main/README.md
markdown
MIT License
# I figured

Configurable figure numbering per section.

## Examples

### Basic

Have a look at the source [here](./examples/basic.typ).

![Example: basic](./examples/basic.png)

### Two levels deep

Have a look at the source [here](./examples/level-two.typ).

![Example: two levels deep](./examples/level-two.png)

## Usage

The package mainly consists of two customizable show rules, which set up all the numbering. There is also an additional function to make showing an outline of figures easier.

Because the [`show-figure()`](#show-figure) function must internally create another figure element, attached labels cannot directly be used for references. To circumvent this, a new label is attached to the internal figure, with the same name but prefixed with `fig:`, `tbl:`, or `lst:` for images (and all other types of generic figures), tables, and raw code figures (aka listings) respectively. These new labels can be used for referencing without problems.

```typ
// import the package
#import "@preview/i-figured:0.2.4"

// make sure you have some heading numbering set
#set heading(numbering: "1.")

// apply the show rules (these can be customized)
#show heading: i-figured.reset-counters
#show figure: i-figured.show-figure

// show an outline
#i-figured.outline()

= Hello World

#figure([hi], caption: [Bye World.]) <bye>

// when referencing, the label names must be prefixed with `fig:`, `tbl:`,
// or `lst:` depending on the figure kind. @fig:bye displays the text "hi".
```

## Reference

### `reset-counters`

Reset all figure counters. To be used in a heading show rule like `#show heading: i-figured.reset-counters`.

```typ
#let reset-counters(
  it,
  level: 1,
  extra-kinds: (),
  equations: true,
  return-orig-heading: true,
) = { .. }
```

**Arguments:**

- `it`: [`content`] &mdash; The heading element from the show rule.
- `level`: [`int`] &mdash; At which heading level to reset the counters. A value of `2` will cause the counters to be reset at level two _and_ level one headings.
- `extra-kinds`: [`array`] of ([`str`] or [`function`]) &mdash; Additional custom figure kinds. If you have any figures with a `kind` other than `image`, `table`, or `raw`, you must add the `kind` here for its counter to be reset.
- `equations`: [`bool`] &mdash; Whether the counter for math equations should be reset.
- `return-orig-heading`: [`bool`] &mdash; Whether the original heading element should be included in the returned content. Set this to false if you manually want to construct a heading instead of using the default.

**Returns:** [`content`] &mdash; The unmodified heading.

### `show-figure`

Show a figure with per-section numbering. To be used in a figure show rule like `#show figure: i-figured.show-figure`.

```typ
#let show-figure(
  it,
  level: 1,
  zero-fill: true,
  leading-zero: true,
  numbering: "1.1",
  extra-prefixes: (:),
  fallback-prefix: "fig:",
) = { .. }
```

**Arguments:**

- `it`: [`content`] &mdash; The figure element from the show rule.
- `level`: [`int`] &mdash; How many levels of the current heading counter should be added in front. Note that you can control this individually from the `level` parameter on [`reset-counters()`](#reset-counters).
- `zero-fill`: [`bool`] &mdash; If `true` and assuming a `level` of `2`, a figure after a `1.` heading but before a `1.1.` heading will show `1.0.1` as numbering, else the middle zero is excluded. Note that if set to `false`, not all figure numberings are guaranteed to have the same length.
- `leading-zero`: [`bool`] &mdash; Whether figures before the first top-level heading should have a leading `0`. Note that if set to `false`, not all figure numberings are guaranteed to have the same length.
- `numbering`: [`str`] or [`function`] &mdash; The actual numbering pattern to use for the figures.
- `extra-prefixes`: [`dictionary`] of [`str`] to [`str`] pairs &mdash; Additional label prefixes. This can optionally be used to specify prefixes for custom figure kinds, otherwise they will also use the fallback prefix.
- `fallback-prefix`: [`str`] &mdash; The label prefix to use for figure kinds which don't have another prefix set.

**Returns:** [`content`] &mdash; The modified figure.

### `show-equation`

Show a math equation with per-section numbering. To be used in a show rule like `#show math.equation: i-figured.show-equation`.

```typ
#let show-equation(
  it,
  level: 1,
  zero-fill: true,
  leading-zero: true,
  numbering: "(1.1)",
  prefix: "eqt:",
  only-labeled: false,
  unnumbered-label: "-",
) = { .. }
```

**Arguments:**

For the arguments `level`, `zero-fill`, `leading-zero`, and `numbering` refer to [`show-figure()`](#show-figure).

- `it`: [`content`] &mdash; The equation element from the show rule.
- `prefix`: [`str`] &mdash; The label prefix to use for all equations.
- `only-labeled`: [`bool`] &mdash; Whether only equations with labels should be numbered.
- `unnumbered-label`: [`str`] &mdash; A label to explicitly disable numbering for an equation.

**Returns:** [`content`] &mdash; The modified equation.

### `outline`

Show the outline for a kind of figure. This is just the same as calling `outline(target: figure.where(kind: i-figured._prefix + repr(target-kind)), ..)`, the function just exists for convenience and clarity.

```typ
#let outline(target-kind: image, title: [List of Figures], ..args) = { .. }
```

**Arguments:**

- `target-kind`: [`str`] or [`function`] &mdash; Which kind of figure to list.
- `title`: [`content`] or `none` &mdash; The title of the outline.
- `..args` &mdash; Other arguments to pass to the underlying [`outline()`](https://typst.app/docs/reference/meta/outline/) call.

**Returns:** [`content`] &mdash; The outline element.
[`str`]: https://typst.app/docs/reference/foundations/str/
[`int`]: https://typst.app/docs/reference/foundations/int/
[`bool`]: https://typst.app/docs/reference/foundations/bool/
[`content`]: https://typst.app/docs/reference/foundations/content/
[`function`]: https://typst.app/docs/reference/foundations/function/
[`array`]: https://typst.app/docs/reference/foundations/array/
[`dictionary`]: https://typst.app/docs/reference/foundations/dictionary/

## Acknowledgements

The core code is based on code from [@PgBiel](https://github.com/PgBiel) (`@PgSuper` on Discord) and [@aagolovanov](https://github.com/aagolovanov) (`@aag.` on Discord). Specifically from [this message](https://discord.com/channels/1054443721975922748/1088371919725793360/1158534418760224809) and the conversation around [here](https://discord.com/channels/1054443721975922748/1088371919725793360/1159172567282749561).
https://github.com/Error-418-SWE/Documenti
https://raw.githubusercontent.com/Error-418-SWE/Documenti/src/3%20-%20PB/Documentazione%20interna/Verbali/24-03-24/24-03-24.typ
typst
#import "/template.typ": *

#show: project.with(
  date: "24/03/24",
  subTitle: "Retrospective and planning meeting",
  docType: "verbale",
  authors: (
    "<NAME>",
  ),
  reviewers: (
    "<NAME>",
  ),
  timeStart: "15:10",
  timeEnd: "16:05",
);

= Agenda

- Assessment of overall progress;
- Retrospective analysis;
- Next meeting;
- Planning.

= Assessment of overall progress <avanzamento>

During Sprint 20, most of the planned tasks were completed and development of the MVP was nearly finished. On Friday 22/03/2024, the external meeting with the Proponent took place, providing an update on the state of the work and presenting the latest progress on the MVP.

== #adr

The update and extension of the document were completed; after a final check by the group members, it was sent to #cardin for review.

== #man

The following work on the #man was completed:
- removal of the system and hardware requirements section, as it was considered of little interest to the end user, leaving only the browser requirements;
- drafting of the following sections of the usage instructions:
  - Environment startup and configuration;
  - Movement in the three-dimensional environment;
  - Zone creation;
  - Settings.

== #ndp

The following work on the #ndp was completed:
- revision of the Transition process;
- update of the WMS3 repository description;
- update of the SVG for the document automation workflow.

== #pdp

The following work on the #pdp was completed:
- drafting of the Sprint 19 budget;
- drafting of the Sprint 19 final balance;
- drafting of the Sprint 20 budget.

== #st

The system and hardware requirements section previously included in the #man was added, and the section on satisfied requirements was drafted.
== Coding

The coding activity saw the following work completed:
- implementation of the 3D zone component;
- implementation of zone placement in the 3D environment;
- implementation of a check on data loading from the database;
- implementation of moving a product from one bin to another via _drag and drop_;
- creation of the floor plan from an SVG;
- implementation of product search;
- implementation of zone search;
- implementation of loading elements into the list of unplaced products;
- adaptation of zone coordinates to the environment created from the SVG;
- fixing of dimensional bugs on bins;
- implementation of the Factory pattern for search;
- resolution of performance issues;
- implementation of camera movement via keyboard;
- implementation of camera movement via keyboard while _dragging_ a product;
- limitation of panning and trucking;
- graphical UI improvements.

= Retrospective analysis

Sprint 20's performance is supported by the main metrics reported in the #pdq\:
- the project CPI remains constant at 1.00, an optimal value;
- the EAC goes from € 12,990.31 to € 12,990.86, thus remaining nearly constant. It continues to represent an optimal value;
- $"SEV" >= "SPV"$ indicates a positive project trend.

Further details on the metric values and their analysis are available in the #pdq_v and #pdp_v documents.

== Keep doing <keep-doing>

The group is satisfied with how it is working, in particular regarding the development of the MVP, which is almost complete.

== Improvements <improvements>

No critical issues were found.

= Next meeting

The next retrospective meeting, scheduled for Sunday 31/03/2024 at 15:00, is moved up to the morning of Saturday 30/03/2024 because of the Easter holidays. The meeting time will be set in the coming days via an internal poll on Discord.
= Planning <pianificazione>

#show figure: set block(breakable: true)

#let table-json(data) = {
  let keys = data.at(0).keys()
  table(
    align: left,
    columns: keys.len(),
    ..keys,
    ..data.map(
      row => keys.map(
        key => row.at(key, default: [n/a])
      )
    ).flatten()
  )
}

#figure(
  caption: [Tasks planned for Sprint 21.],
  table-json(json("tasks.json"))
)
https://github.com/tingerrr/anti-matter
https://raw.githubusercontent.com/tingerrr/anti-matter/main/src/rules.typ
typst
MIT License
/// A function which displays an `outline.entry` using its default show rule _with_ the given page
/// numbering function.
///
/// This can be used with its default parameters to revert the outline show rule in `anti-matter`.
///
/// - entry (outline.entry): the outline entry to display
/// - func (function): transforms a location to a page number
/// -> content
#let outline-entry(entry, func: loc => loc.page()) = {
  link(entry.element.location(), entry.body)
  if entry.fill != none {
    [ ]
    box(width: 1fr, entry.fill)
    [ ]
  } else {
    h(1fr)
  }
  link(entry.element.location())[#func(entry.element.location())]
}
https://github.com/angelcerveraroldan/notes
https://raw.githubusercontent.com/angelcerveraroldan/notes/main/abstact_algebra/notes/rings/intro.typ
typst
#import "../../../preamble.typ" : *
#import "@preview/fletcher:0.5.1" as fletcher: diagram, node, edge

#let iso = $tilde.equiv$

#def(title:"Ring and Field", [
  A ring is a set together with two binary operations $(R, +, dot)$, where $dot$ and $+$ are associative, $+$ is commutative, and $dot$ need not have an inverse for every element. If a multiplicative inverse does exist for every nonzero element (and $dot$ is commutative), then we call it a field. Both rings and fields must also satisfy the distributive laws:

  $ (a + b) dot c &= a c + b c \
    c dot (a + b) &= c a + c b $
])
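A definition like this can be sanity-checked mechanically. The sketch below (not part of the original notes) exhaustively verifies the ring axioms for the small finite ring Z/6Z, which is a ring but not a field, since 2 has no multiplicative inverse mod 6:

```python
# Exhaustively check the ring axioms for Z/6Z.
n = 6
R = range(n)
add = lambda a, b: (a + b) % n
mul = lambda a, b: (a * b) % n

for a in R:
    for b in R:
        assert add(a, b) == add(b, a)                          # + commutative
        for c in R:
            assert add(add(a, b), c) == add(a, add(b, c))      # + associative
            assert mul(mul(a, b), c) == mul(a, mul(b, c))      # * associative
            assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c))  # left distributivity
            assert mul(add(a, b), c) == add(mul(a, c), mul(b, c))  # right distributivity

# Not a field: 2*x mod 6 is never 1, so 2 has no multiplicative inverse.
assert all(mul(2, x) != 1 for x in R)
print("Z/6Z satisfies the ring axioms but is not a field")
```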
https://github.com/kyliancc/mathematics
https://raw.githubusercontent.com/kyliancc/mathematics/main/Infomation%20and%20Entropy/cpt5/cpt5.typ
typst
#set page("a4")

#align(left, text(25pt)[
  *Information and Entropy*
])
#align(left, text(20pt)[
  *Chapter 5: Probability*
])

= 1. Events

Here are some concepts:

- *Universal event*: $p("universal event") = 1$
- *Null event*: $p("null event") = 0$
- *Mutually exclusive*: $A sect B = emptyset$
- *Exhaustive*: $A union B = S$, where $S$ is the sample space.
- *Partition*: Both mutually exclusive and exhaustive.

= 2. Joint Events and Conditional Probabilities

Assume events $A$ and $B$ are independent; then the probability of the *joint event* can be found:

$ p(A, B) = p(A) p(B) $

In general, we can use conditional probabilities to describe it:

$ p(A, B) = p(B) p(A | B) = p(A) p(B | A) $

Rearranging this identity gives Bayes' Theorem.

= 3. Averages

The average value (expected value) $E_(a v)$ can be found as:

$ E_(a v) = sum_i p(A_i) c_i $

where $A_i$ is an event, and $c_i$ is its actual value.

= 4. Information

We learn information when we find out that an event has happened. The information learned from outcome $i$ is:

$ log_2((1)/(p(A_i))) $

Over all outcomes, we can take the average information:

$ I = sum_i p(A_i) log_2((1)/(p(A_i))) $

This is called the *entropy* of a source. *Basically, the more evenly distributed the event probabilities are, the higher the entropy.* It's like playing the guess-the-number game: if the number is between 1 and 100, the best first guess is 50, so we learn whether it's above or below 50.
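The entropy formula above is easy to check numerically. A minimal sketch (the function name `entropy` is ours, not from the chapter):

```python
import math

def entropy(probs):
    """Average information I = sum_i p_i * log2(1/p_i), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))   # -> 1.0

# A biased coin is less uncertain, so the entropy is below 1 bit.
print(entropy([0.9, 0.1]))

# A certain event carries no information at all.
print(entropy([1.0]))        # -> 0.0

# A uniform choice among 100 numbers needs log2(100) ~ 6.64 bits, which is
# why halving guesses (start at 50) pin the number down in about 7 steps.
print(entropy([1 / 100] * 100))
```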
https://github.com/Enter-tainer/wavy
https://raw.githubusercontent.com/Enter-tainer/wavy/master/typst-package/wavy.typ
typst
MIT License
#import "@preview/jogs:0.2.1": compile-js, call-js-function

#let wavy-src = read("./wavy.js")
#let wavy-bytecode = compile-js(wavy-src)

#let render(src, ..args) = {
  let result = call-js-function(wavy-bytecode, "wavy", src)
  image.decode(result, ..args)
}
https://github.com/mdgrs/resume-typst
https://raw.githubusercontent.com/mdgrs/resume-typst/main/modules/skills.typ
typst
Apache License 2.0
#import "../brilliant-CV/template.typ": *

#cvSection("Skills")

#cvSkill(
  type: [Languages],
  info: [English -- Native #hBar() French -- Native #hBar() Italian -- C1]
)

#cvSkill(
  type: [Tech Stack],
  info: [Scala #hBar() Python #hBar() Python Data #hBar() R #hBar() SQL]
)

#cvSkill(
  type: [Personal Interests],
  info: [Mountaineering #hBar() Cycling]
)
https://github.com/Enter-tainer/zint-wasi
https://raw.githubusercontent.com/Enter-tainer/zint-wasi/master/typst-package/manual.typ
typst
MIT License
#import "./lib.typ": * #import "./lib.typ" #import "@preview/tidy:0.2.0" #import "@preview/tablex:0.0.8": tablex, colspanx, cellx #import "./tidy_style.typ" #set page( paper: "a4", margin: ( y: 1em, x: 2em, ), ) // shortcut for underlined links #let l(dest, body) = underline(link(dest, body)) #{ set align(center) heading(level: 1, text(size: 17pt)[tiaoma]) [A barcode generator for typst that provides type safe API bindings for #l("https://zint.org.uk")[Zint] (#l("https://github.com/zint/zint")[GitHub]) library through a WASM #l("https://typst.app/docs/reference/foundations/plugin/")[plugin]. ] } #v(10pt) See #l("https://zint.org.uk/manual")[official Zint manual] for a more in-depth description of supported functionality. = API Some generators require additional configuration (such as composite codes), this can be achieved by passing #l(<options>)[options] to Zint. #let ty-links = (:) #let typst-type(v, doc-links: (:)) = { let ty-box(v) = { tidy_style.show-type( v, style-args: (colors: tidy_style.colors), docs: ty-links + doc-links, ) } for (i, e) in v.split(",").map(ty-box).enumerate() { if i > 0 { h(2pt) text(size: 9pt)[or] h(2pt) } e } } #let typst-val(v, ty: none) = { let content = raw(v) let fg = black let ty = ty if v.starts-with("\"") { ty = "str" } else if v == "true" or v == "false" { ty = "bool" } else if v == "none" { ty = "none" } else if v.starts-with("-") or ( "0", "1", "2", "3", "4", "5", "6", "7", "8", "9", ).contains(v.first()) { ty = "float" } if ty != none { fg = tidy_style.colors.at(ty, default: none) fg = fg.darken(50%).saturate(70%) } text(fill: fg, content) } #let t-style = ( auto-vlines: false, stroke: gray.lighten(60%), ) #let reference-table-style(head-rows: 1, key-column: true) = ( ..t-style, fill: (col, row) => if row < head-rows { let additional = 10% * (head-rows - 1 - row) blue.lighten(80% - additional) } else if col == 0 and key-column == true { blue.lighten(90%) } else { none }, ) #let docs = tidy.parse-module( read("lib.typ"), name: 
"tiaoma", scope: (tiaoma: lib, typst-type: typst-type, typst-val: typst-val, l: l), ) #tidy.show-module( docs, first-heading-level: 1, style: tidy_style, show-module-name: false, show-outline: false, ) == Shortcut functions Most barcodes are supported through #l(<examples>)[shortcut functions]. They accept the same arguments as `barcode` function but don't require `symbology` to be specified. == Zint configuration <options> All exported functions support optionally providing the `options` dictionary which is passed to Zint. This provides means to fully configure generated images. The following values are valid for the `options` dictionary: #let detailed(dest, content) = link(dest, underline(content)) #tablex( columns: (auto, 100pt, 1fr, auto), align: (center + horizon, center + horizon, left + horizon, center + horizon), ..reference-table-style(), [*Field*], [*Type*], [*Description*], [*Default*], [height], typst-type("float"), [Barcode height in X-dimensions (ignored for fixed-width barcodes)], typst-val("none"), [scale], typst-type("float"), [Scale factor when printing barcode, i.e. 
adjusts X-dimension], typst-val("1.0"), [whitespace-width], typst-type("int"), [Width in X-dimensions of whitespace to left & right of barcode], typst-val("0"), [whitespace-height], typst-type("int"), [Height in X-dimensions of whitespace above & below the barcode], typst-val("0"), [border-width], typst-type("int"), [Size of border in X-dimensions], typst-val("0"), detailed(<output_options>, "output-options"), typst-type( "int,array,dictionary", doc-links: ( "int": <output_options>, "array": <output_options_arr>, "dictionary": <output_options_dict>, ), ), [Various output parameters (bind, box etc, see below)], typst-val("0"), [fg-color], typst-type("color"), [foreground color], typst-val("black"), [bg-color], typst-type("color"), [background color], typst-val("white"), [primary], typst-type("str"), [Primary message data (MaxiCode, Composite)], typst-val("\"\""), [option-1], typst-type("int"), [Symbol-specific options (see #l("https://zint.org.uk/manual")[manual])], typst-val("-1"), [option-2], typst-type("int"), [Symbol-specific options (see #l("https://zint.org.uk/manual")[manual])], typst-val("0"), detailed(<opt_3>, "option-3"), typst-type("int,str"), [Symbol-specific options (see #l("https://zint.org.uk/manual")[manual])], typst-val("0"), [show-hrt], typst-type("bool"), [Whether to show Human Readable Text (HRT)], typst-val("true"), detailed(<input_mode>, "input-mode"), typst-type( "int,string,array,dictionary", doc-links: ( "int": <input_mode>, "string": <input_mode_str>, "array": <input_mode_arr>, "dictionary": <input_mode_dict>, ), ), [Encoding of input data], typst-val("0"), [eci], typst-type("int"), [Extended Channel Interpretation.], typst-val("0"), [dot-size], typst-type("float"), [Size of dots used in BARCODE_DOTTY_MODE.], typst-val("4.0 / 5.0"), [text-gap], typst-type("float"), [Gap between barcode and text (HRT) in X-dimensions.], typst-val("1.0"), [guard-descent], typst-type("float"), [Height in X-dimensions that EAN/UPC guard bars descend.], 
typst-val("5.0"), ) #pagebreak() === Input Mode <input_mode> Input mode options allow specifying how `Zint` should handle input data. `Zint` uses #typst-type("int") bitflags for these, but `tiaoma` allows you to specify them using several other formats as documented below. The following options are supported: #tablex( columns: (auto, auto, auto, 1fr), align: (left + horizon, center + horizon, center + horizon, left + horizon), ..reference-table-style(head-rows: 2), colspanx(4, cellx(align: center)[*Input format (mutually exclusive)*]), (), (), (), [*Constant*], typst-type("int"), typst-type("str"), [*Description*], raw("DATA_MODE"), typst-val("0"), typst-val("\"data\""), [Use full 8-bit range interpreted as binary data.], raw("UNICODE_MODE"), typst-val("1"), typst-val("\"unicode\""), [Use UTF-8 input.], raw("GS1_MODE"), typst-val("2"), typst-val("\"gs1\""), [Encode GS1 data using FNC1 characters.], ) #tablex( columns: (auto, auto, auto, 1fr), align: (left + horizon, center + horizon, center + horizon, left + horizon), ..reference-table-style(head-rows: 2), colspanx(4, cellx(align: center)[*Behavior customization*]), (), (), (), [*Constant*], typst-type("int"), typst-type("str"), [*Description*], raw("ESCAPE_MODE"), typst-val("8"), typst-val("\"escape\""), [Process input data for escape sequences.], raw("GS1PARENS_MODE"), typst-val("16"), typst-val("\"gs1-parentheses\""), [Parentheses (round brackets) used in GS1 data instead of square brackets to delimit Application Identifiers (parentheses must not otherwise occur in the data).], raw("GS1NOCHECK_MODE"), typst-val("32"), typst-val("\"gs1-no-check\""), [Do not check GS1 data for validity, i.e. suppress checks for valid AIs and data lengths. Invalid characters (e.g. 
control characters, extended ASCII characters) are still checked for.], raw("HEIGHTPERROW_MODE"), typst-val("64"), typst-val("\"height-per-row\""), [Interpret the `height` variable as per-row rather than as overall height.], raw("FAST_MODE"), typst-val("128"), typst-val("\"fast\""), [Use faster if less optimal encodation for symbologies that support it (currently Data Matrix only).], raw("EXTRA_ESCAPE_MODE"), typst-val("256"), typst-val("\"extra-escape\""), [Undocumented.], ) ==== String Value <input_mode_str> `input_mode` of #typst-type("str") type is assumed to be a _input format_ value from the first table. ==== Array Value <input_mode_arr> `input_mode` of #typst-type("array") type is assumed to be a list of #typst-type("str") values from the above tables; individual constants will be converted to #typst-type("int")s and unioned together. ==== Dictionary Value <input_mode_dict> `input_mode` of #typst-type("dictionary") type is assumed to be #typst-type("str")-#typst-type("bool") pairs where keys are constants from the above table. Additionally, _input format_ can be specified as a #typst-type("str") value paired to #typst-val("\"format\"") key. 
In other words, columns of the following table are equivalent: #tablex( columns: (1fr, 1fr, 1fr, 1fr), align: ( center + horizon, center + horizon, center + horizon, center + horizon, ), ..reference-table-style(key-column: false), [#typst-type("dictionary")], [#typst-type("array")], [#typst-type("str")], [#typst-type("int")], [ (#typst-val("\"format\""): #typst-val("\"data\"")) ], [(#typst-val("\"data\""))], typst-val("\"data\""), typst-val("0"), [(#typst-val("\"unicode\""): #typst-val("true"))], [(#typst-val("\"unicode\""))], typst-val("\"unicode\""), typst-val("1"), [ (#typst-val("\"gs1\""): #typst-val("true"), #typst-val("\"gs1-no-check\""): #typst-val("true")) ], [(#typst-val("\"gs1\""), #typst-val("\"gs1-no-check\""))], [N/A], typst-val("34"), ) === Output Options <output_options> Output options allow specifying how `Zint` should generate the barcode/symbol. #tablex( columns: (auto, auto, auto, 1fr), align: (left + horizon, center + horizon, center + horizon, left + horizon), ..reference-table-style(), [*Constant*], typst-type("int"), typst-type("str"), [*Description*], raw("BARCODE_BIND_TOP"), typst-val("1"), typst-val("\"barcode-bind-top\""), [Boundary bar _above_ the symbol and between rows if stacking multiple symbols.], raw("BARCODE_BIND"), typst-val("2"), typst-val("\"barcode-bind\""), [Boundary bars _above_ and _below_ the symbol and between rows if stacking multiple symbols.], raw("BARCODE_BOX"), typst-val("4"), typst-val("\"barcode-box\""), [Add a box surrounding the symbol and whitespace.], //[BARCODE_STDOUT], [8], typst-val("\"barcode-stdout\""), [Output to stdout], //[READER_INIT], [16], typst-val("\"reader-init\""), [Reader Initialisation (Programming)], raw("SMALL_TEXT"), typst-val("32"), typst-val("\"small-text\""), [Use a smaller font for the Human Readable Text.], raw("BOLD_TEXT"), typst-val("64"), typst-val("\"bold-text\""), [Embolden the Human Readable Text.], raw("CMYK_COLOUR"), typst-val("128"), typst-val("\"cmyk-color\""), [Select the 
CMYK colour space option for Encapsulated PostScript and TIF files.],
  raw("BARCODE_DOTTY_MODE"), typst-val("256"), typst-val("\"barcode-dotty-mode\""), [Plot a matrix symbol using dots rather than squares.],
  raw("GS1_GS_SEPARATOR"), typst-val("512"), typst-val("\"gs1-gs-separator\""), [Use GS instead of FNC1 as GS1 separator (Data Matrix only).],
  //[OUT_BUFFER_INTERMEDIATE], [1024], typst-val("\"out-buffer-intermediate\""), [Return ASCII values in bitmap buffer (OUT_BUFFER only)],
  raw("BARCODE_QUIET_ZONES"), typst-val("2048"), typst-val("\"barcode-quiet-zones\""), [Add compliant quiet zones (additional to any specified whitespace).],
  raw("BARCODE_NO_QUIET_ZONES"), typst-val("4096"), typst-val("\"barcode-no-quiet-zones\""), [Disable quiet zones, notably those with defaults.],
  raw("COMPLIANT_HEIGHT"), typst-val("8192"), typst-val("\"compliant-height\""), [Warn if height not compliant and use standard height (if any) as default.],
  raw("EANUPC_GUARD_WHITESPACE"), typst-val("16384"), typst-val("\"ean-upc-guard-whitespace\""), [Add quiet zone indicators ("<" / ">") to HRT whitespace (EAN/UPC).],
  raw("EMBED_VECTOR_FONT"), typst-val("32768"), typst-val("\"embed-vector-font\""), [Embed font in vector output.],
)

==== Array Value <output_options_arr>

`output_options` of #typst-type("array") type is assumed to be #typst-type("str") values from the above table.

==== Dictionary Value <output_options_dict>

`output_options` of #typst-type("dictionary") type is assumed to be #typst-type("str")-#typst-type("bool") pairs where keys are constants from the above table.

=== Option 3 <opt_3>

As there are constants associated with `option_3` values, this package allows specifying the value as either an #typst-type("int") or a #typst-type("str").
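For instance, forcing automatic size selection to square Data Matrix versions could look like the sketch below. The `options` parameter and the `option-3` key name are assumptions about the API surface, not confirmed signatures.

```typ
// Hypothetical: select DM_SQUARE (100) by name or by value.
#data-matrix("1234567890", options: (option-3: "square"))
#data-matrix("1234567890", options: (option-3: 100))
```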
The following table documents supported values and their #typst-type("str") representations: #tablex( columns: (auto, auto, auto, 1fr), align: (left + horizon, center + horizon, center + horizon, left + horizon), ..reference-table-style(), [*Constant*], [#typst-type("int")], [#typst-type("str")], [*Description*], raw("DM_SQUARE"), typst-val("100"), typst-val("\"square\""), [Only consider square versions on automatic symbol size selection], raw("DM_DMRE"), typst-val("101"), typst-val("\"rect\""), [Consider DMRE versions on automatic symbol size selection], raw("DM_ISO_144"), typst-val("128"), typst-val("\"iso-144\""), [Use ISO instead of "de facto" format for 144x144 (i.e. don't skew ECC)], raw("ZINT_FULL_MULTIBYTE"), typst-val("200"), typst-val("\"full-multibyte\""), [Enable Kanji/Hanzi compression for Latin-1 & binary data], raw("ULTRA_COMPRESSION"), typst-val("128"), typst-val("\"compression\""), [Enable Ultracode compression *(experimental)*], ) #pagebreak() = Examples <examples> #let entry-height = 4.5em #let entry-gutter = 1em #let example-entry(name, code-block, preview, ..extra) = block( breakable: true, inset: 5pt, fill: gray.lighten(90%), radius: 5pt, )[ #heading(level: 3, name) #if extra.pos().len() > 0 { stack(dir: ttb, spacing: 5pt, ..extra) } *Example:* #block( inset: 5pt, fill: white, stroke: gray.lighten(60%), radius: 2pt, width: 100%, text(size: 8pt, par(linebreaks: "simple", code-block)), ) *Result:* #block( height: entry-height, width: 100%, align(center + horizon, preview), ) ] #let example-of-simple(name, func-name, func, data, ..extra) = example-entry( name, raw( "#" + func-name + "(\"" + data + "\")", block: true, lang: "typ", ), func(data, fit: "contain"), ..extra, ) // Only works when entire content fits on a single page #let section(..content) = [ #v(1em) #grid( columns: (1fr, 1fr), column-gutter: entry-gutter, row-gutter: 7pt, ..content ) #v(2em) ] #show "<dbar>": [GS1 DataBar] #show "(chk)": [w/ Check Digit] #show "(cc)": [Composite Code] 
#show "(omni)": [Omnidirectional]
#show "(omn)": [Omnidirectional]
#show "(stk)": [Stacked]
#show "(exp)": [Expanded]
#show "(ltd)": [Limited]

// There are a lot of page breaks here because there's currently no sane way
// of making #section properly lay out example blocks without cutting some of
// them off. There's no way of balancing #columns, and #grid/#table doesn't
// consistently break onto a new page when it should.

== EAN (European Article Number)

#section(
  example-of-simple("EANX", "eanx", eanx, "1234567890"),
  example-of-simple("EAN-14", "ean14", ean14, "1234567890"),
  example-of-simple("EAN-13", "eanx", eanx, "6975004310001"),
  example-of-simple("EAN-8", "eanx", eanx, "12345564"),
  example-of-simple("EAN-5", "eanx", eanx, "12345"),
  example-of-simple("EAN-2", "eanx", eanx, "12"),
  // example for "EAN (cc)"
)

#pagebreak()

== PDF417

#section(
  example-of-simple("Micro PDF417", "micro-pdf417", micro-pdf417, "1234567890"),
  example-of-simple("PDF417", "pdf417", pdf417, "1234567890"),
  example-of-simple("Compact PDF417", "pdf417-comp", pdf417-comp, "1234567890"),
)

#pagebreak()

== GS1

#section(
  example-of-simple("GS1-128", "gs1-128", gs1-128, "[01]98898765432106"),
  example-of-simple("<dbar> Omnidirectional", "dbar-omn", dbar-omn, "98898765432106"),
  example-of-simple("<dbar> (ltd)", "dbar-ltd", dbar-ltd, "988987654321"),
  example-of-simple("<dbar> (exp)", "dbar-exp", dbar-exp, "[01]98898765432106"),
  example-of-simple("<dbar> (stk)", "dbar-stk", dbar-stk, "1234567890"),
  example-of-simple("<dbar> (stk) (omn)", "dbar-omn-stk", dbar-omn-stk, "1234567890"),
  // example for "<dbar> (exp) (stk)"
  // example for "<dbar> (omn) (cc)"
  // example for "<dbar> (omn) (cc)"
  // example for "<dbar> (ltd) (cc)"
  // example for "<dbar> (exp) (cc)"
  // example for "<dbar> (stk) (cc)"
  // example for "<dbar> (omn) (stk) (cc)"
  // example for "<dbar> (exp) (stk) (cc)"
)

// TODO: Remove once utilities and above examples are provided
Zint supports (omn), (ltd), (exp), (stk) and (cc) variants of
GS1. See #l(<options>)[configuration] section for information on how to use them. #pagebreak() == C25 #section( example-of-simple("Standard", "c25-standard", c25-standard, "123"), example-of-simple("Interleaved", "c25-inter", c25-inter, "1234567890"), example-of-simple("IATA", "c25-iata", c25-iata, "1234567890"), example-of-simple("Data Logic", "c25-logic", c25-logic, "1234567890"), example-of-simple("Industrial", "c25-ind", c25-ind, "1234567890"), ) #pagebreak() == UPC (Universal Product Code) #section( example-of-simple("UPC-A", "upca", upca, "01234500006"), example-of-simple("UPC-A (chk)", "upca-chk", upca-chk, "012345000065"), example-of-simple("UPC-E", "upce", upce, "123456"), example-of-simple("UPC-E (chk)", "upce-chk", upce-chk, "12345670"), // example for "UPC-A (cc)" // example for "UPC-E (cc)" ) #pagebreak() == HIBC (Health Industry Barcodes) #section( example-of-simple("Code 128", "hibc-128", hibc-128, "1234567890"), example-of-simple("Code 39", "hibc-39", hibc-39, "1234567890"), example-of-simple("Data Matrix", "hibc-dm", hibc-dm, "1234567890"), example-of-simple("QR", "hibc-qr", hibc-qr, "1234567890"), example-of-simple("PDF417", "hibc-pdf", hibc-pdf, "1234567890"), example-of-simple("Micro PDF417", "hibc-mic-pdf", hibc-mic-pdf, "1234567890"), example-of-simple( "Codablock-F", "hibc-codablock-f", hibc-codablock-f, "1234567890", ), example-of-simple("Aztec", "hibc-aztec", hibc-aztec, "1234567890"), ) #pagebreak() == Postal #section( example-of-simple( "Australia Post Redirection", "aus-redirect", aus-redirect, "12345678", ), example-of-simple( "Australia Post Reply Paid", "aus-reply", aus-reply, "12345678", ), example-of-simple( "Australia Post Routing", "aus-route", aus-route, "12345678", ), example-of-simple( "Australia Post Standard Customer", "aus-post", aus-post, "12345678", ), example-of-simple( "Brazilian CEPNet Postal Code", "cepnet", cepnet, "1234567890", ), example-of-simple("DAFT Code", "daft", daft, "DAFTFDATATFDTFAD"), example-of-simple( 
"Deutsche Post Identcode", "dp-ident", dp-ident, "1234567890",
  ),
  example-of-simple(
    "Deutsche Post Leitcode", "dp-leitcode", dp-leitcode, "1234567890",
  ),
  example-of-simple(
    "Deutscher Paket Dienst", "dpd", dpd, "0123456789012345678901234567",
  ),
  example-of-simple("Dutch Post KIX Code", "kix", kix, "1234567890"),
)

#section(
  example-of-simple(
    "Japanese Postal Code", "japan-post", japan-post, "1234567890",
  ),
  example-of-simple("Korea Post", "korea-post", korea-post, "123456"),
  example-of-simple("POSTNET", "postnet", postnet, "1234567890"),
  example-entry(
    "Royal Mail 2D Mailmark (CMDM)",
    raw(
      "#mailmark-2d(\n\t32, 32,\n\t\"JGB 011123456712345678CW14NJ1T 0EC2M2QS REFERENCE1234567890QWERTYUIOPASDFGHJKLZXCVBNM\"\n)",
      block: true,
      lang: "typ",
    ),
    mailmark-2d(
      32, 32,
      "JGB 011123456712345678CW14NJ1T 0EC2M2QS REFERENCE1234567890QWERTYUIOPASDFGHJKLZXCVBNM",
    ),
  ),
  example-of-simple(
    "Royal Mail 4-State Customer Code", "rm4scc", rm4scc, "1234567890",
  ),
  example-of-simple(
    "Royal Mail 4-State Mailmark", "mailmark-4s", mailmark-4s, "21B2254800659JW5O9QA6Y",
  ),
  example-of-simple(
    "Universal Postal Union S10", "upus10", upus10, "RR072705659PL",
  ),
  example-of-simple(
    "UPNQR (Univerzalnega Plačilnega Naloga QR)", "upnqr", upnqr, "1234567890",
  ),
  example-of-simple(
    "USPS Intelligent Mail", "usps-imail", usps-imail, "01300123456123456789",
  ),
)

#pagebreak()

== Other Generic Codes

#section(
  example-of-simple("Aztec Code", "aztec", aztec, "1234567890"),
  example-of-simple("Aztec Rune", "azrune", azrune, "122"),
  example-of-simple("Channel Code", "channel", channel, "123456"),
  example-of-simple("Codabar", "codabar", codabar, "A123456789B"),
  example-of-simple("Codablock-F", "codablock-f", codablock-f, "1234567890"),
  example-of-simple("Code 11", "code11", code11, "0123452"),
  example-of-simple("Code 16k", "code16k", code16k, "1234567890"),
  example-of-simple("Code 32", "code32", code32, "12345678"),
  example-of-simple("Code 39", "code39", code39, "1234567890"),
  example-of-simple("Code 49",
"code49", code49, "1234567890"), ) #section( example-of-simple("Code 128", "code128", code128, "1234567890"), example-of-simple("Code 128 (AB)", "code128ab", code128ab, "1234567890"), example-of-simple("Code One", "code-one", code-one, "1234567890"), example-of-simple( "Data Matrix (ECC200)", "data-matrix", data-matrix, "1234567890", ), example-of-simple("DotCode", "dotcode", dotcode, "1234567890"), example-of-simple("Extended Code 39", "ex-code39", ex-code39, "1234567890"), example-of-simple("Grid Matrix", "grid-matrix", grid-matrix, "1234567890"), example-of-simple( "Han Xin (Chinese Sensible)", "hanxin", hanxin, "abc123全ň全漄", ), example-of-simple("IBM BC412 (SEMI T1-95)", "bc412", bc412, "1234567890"), example-of-simple("ISBN", "isbnx", isbnx, "9789861817286"), ) #pagebreak() #section( example-of-simple("ITF-14", "itf14", itf14, "1234567890"), example-of-simple("LOGMARS", "logmars", logmars, "1234567890"), example-of-simple("MaxiCode", "maxicode", maxicode, "1234567890"), example-of-simple("Micro QR", "micro-qr", micro-qr, "1234567890"), example-of-simple("MSI Plessey", "msi-plessey", msi-plessey, "1234567890"), example-of-simple("NVE-18 (SSCC-18)", "nve18", nve18, "1234567890"), example-of-simple("Pharmacode One-Track", "pharma", pharma, "123456"), example-of-simple( "Pharmacode Two-Track", "pharma-two", pharma-two, "12345678", ), example-of-simple("Pharmazentralnummer", "pzn", pzn, "12345678"), example-of-simple("Planet", "planet", planet, "1234567890"), ) #pagebreak() #section( example-of-simple("Plessey", "plessey", plessey, "1234567890"), example-of-simple("QR Code", "qrcode", qrcode, "1234567890"), example-of-simple( "Rectangular Micro QR Code (rMQR)", "rmqr", rmqr, "1234567890", ), example-of-simple( "Telepen Numeric", "telepen-num", telepen-num, "1234567890", ), example-of-simple("Telepen", "telepen", telepen, "ABCD12345"), example-of-simple("Ultracode", "ultra", ultra, "1234567890"), example-of-simple( "Vehicle Identification Number", "vin", vin, 
"2GNFLGE30D6201432", ), example-of-simple("Facing Identification Mark", "fim", fim, "A"), example-of-simple("Flattermarken", "flat", flat, "123")[ Used for marking book covers to indicate volume order. ], ) = Symbology Values <symbology> Following symbology values are supported: #grid( columns: (1fr, 1fr, 1fr, 1fr, 1fr, 1fr), gutter: 5pt, typst-val("\"Code11\""), typst-val("\"C25Standard\""), typst-val("\"C25Inter\""), typst-val("\"C25IATA\""), typst-val("\"C25Logic\""), typst-val("\"C25Ind\""), typst-val("\"Code39\""), typst-val("\"ExCode39\""), typst-val("\"EANX\""), typst-val("\"EANXChk\""), typst-val("\"GS1128\""), typst-val("\"Codabar\""), typst-val("\"Code128\""), typst-val("\"DPLEIT\""), typst-val("\"DPIDENT\""), typst-val("\"Code16k\""), typst-val("\"Code49\""), typst-val("\"Code93\""), typst-val("\"Flat\""), typst-val("\"DBarOmn\""), typst-val("\"DBarLtd\""), typst-val("\"DBarExp\""), typst-val("\"Telepen\""), typst-val("\"UPCA\""), typst-val("\"UPCAChk\""), typst-val("\"UPCE\""), typst-val("\"UPCEChk\""), typst-val("\"Postnet\""), typst-val("\"MSIPlessey\""), typst-val("\"FIM\""), typst-val("\"Logmars\""), typst-val("\"Pharma\""), typst-val("\"PZN\""), typst-val("\"PharmaTwo\""), typst-val("\"CEPNet\""), typst-val("\"PDF417\""), typst-val("\"PDF417Comp\""), typst-val("\"MaxiCode\""), typst-val("\"QRCode\""), typst-val("\"Code128AB\""), typst-val("\"AusPost\""), typst-val("\"AusReply\""), typst-val("\"AusRoute\""), typst-val("\"AusRedirect\""), typst-val("\"ISBNX\""), typst-val("\"RM4SCC\""), typst-val("\"DataMatrix\""), typst-val("\"EAN14\""), typst-val("\"VIN\""), typst-val("\"CodablockF\""), typst-val("\"NVE18\""), typst-val("\"JapanPost\""), typst-val("\"KoreaPost\""), typst-val("\"DBarStk\""), typst-val("\"DBarOmnStk\""), typst-val("\"DBarExpStk\""), typst-val("\"Planet\""), typst-val("\"MicroPDF417\""), typst-val("\"USPSIMail\""), typst-val("\"Plessey\""), typst-val("\"TelepenNum\""), typst-val("\"ITF14\""), typst-val("\"KIX\""), 
typst-val("\"Aztec\""), typst-val("\"DAFT\""), typst-val("\"DPD\""), typst-val("\"MicroQR\""), typst-val("\"HIBC128\""), typst-val("\"HIBC39\""), typst-val("\"HIBCDM\""), typst-val("\"HIBCQR\""), typst-val("\"HIBCPDF\""), typst-val("\"HIBCMicPDF\""), typst-val("\"HIBCCodablockF\""), typst-val("\"HIBCAztec\""), typst-val("\"DotCode\""), typst-val("\"HanXin\""), typst-val("\"Mailmark2D\""), typst-val("\"UPUS10\""), typst-val("\"Mailmark4S\""), typst-val("\"AzRune\""), typst-val("\"Code32\""), typst-val("\"EANXCC\""), typst-val("\"GS1128CC\""), typst-val("\"DBarOmnCC\""), typst-val("\"DBarLtdCC\""), typst-val("\"DBarExpCC\""), typst-val("\"UPCACC\""), typst-val("\"UPCECC\""), typst-val("\"DBarStkCC\""), typst-val("\"DBarOmnStkCC\""), typst-val("\"DBarExpStkCC\""), typst-val("\"Channel\""), typst-val("\"CodeOne\""), typst-val("\"GridMatrix\""), typst-val("\"UPNQR\""), typst-val("\"Ultra\""), typst-val("\"RMQR\""), typst-val("\"BC412\""), )
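A symbology value from the list above is typically handed to the generic constructor. As a hedged sketch (assuming the package exposes a generic `barcode` function that takes the data and a symbology name, parallel to the per-symbology helpers used in the examples):

```typ
// Select the symbology by its string value; intended to match the
// dedicated #qrcode(...) helper shown earlier.
#barcode("1234567890", "QRCode")
```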
#import "/meta-environments/env-templates.typ": * #show: doc => policy_document( title: "Code of Conduct", authors: ("Protohaven Board",), draft: false, date: datetime( year: 2024, month: 4, day: 1, ), doc ) #include "/common-policy/code_of_conduct.typ"
# Resume

This resume is built with [Typst](https://typst.app/docs), a new markup-based typesetting system designed as an alternative both to advanced tools like LaTeX and to simpler tools like Word and Google Docs.

The generated [PDF](https://leo1003.github.io/resume/Resume-PDF/resume.pdf) is available online.

Rendered resume:

![](https://leo1003.github.io/resume/Resume-PNG/resume-1.png)

## Credit

The template used in this resume is modified from [bamboovir/typst-resume-template](https://github.com/bamboovir/typst-resume-template), and the layout is inspired by [Harkunwar/attractive-typst-resume](https://github.com/Harkunwar/attractive-typst-resume).
#import "../libs/template.typ": *

= Basic Category Theory <cat-theory>

This section is a crash course in category theory. The reader is advised to take the Category Theory course concurrently and/or refer to other materials, e.g. @awodey.

== Basic Definitions

#definition[
  A *category* $cal(C)$ consists of
  - A collection of *objects* $ob cC$ and
  - For every pair of objects $X, Y in ob cC$, a collection of *morphisms* $hom_cC (X, Y)$, where for $f in Hom(C)(X, Y)$ we denote $f: X->Y$ or $X ->^f Y$ and say $X$ is the *domain* of $f$ and $Y$ is the *codomain* of $f$;
  such that
  - For every object $X$, there exists an *identity morphism* $id_X in Hom(C) (X, X)$;
  - For every pair of morphisms $f : X -> Y$ and $g : Y -> Z$, there exists a *composite morphism* $g oo f : X -> Z$,
  subject to the axioms:
  - For every morphism $f : X -> Y$, we have $id_Y oo f = f oo id_X = f$;
  - For every triple of morphisms $f : X -> Y$, $g : Y -> Z$ and $h: Z -> W$, we have $(h oo g) oo f = h oo (g oo f)$, which we simply denote as $h oo g oo f$.
]

#notation[
  We usually write $X in cC$ when we mean $X in ob cC$. We sometimes denote $Hom(C)(X, X)$ as $End(C) (X)$ (the *endomorphisms* of $X$). We might simply write $hom$ instead of $Hom(C)$ if the underlying category is clear from the context.
]

#definition[
  A category $cC$ is *locally small* if for every $X, Y in cC$, $Hom(C) (X, Y)$ is a set. A category $cC$ is *small* if it is locally small and further $ob cC$ is a set.
]

// #remark[ These definitions above are to avoid set-theoretic size issues, which we shall not delve into. They are employed when necessary to ensure that we do not run into paradoxes.
// ]

#example[
  A *discrete category* $cC$ is one where
  $ hom_cC (X, Y) = cases({id_X} quad &X = Y, nothing quad &X != Y) $
  It does not contain more information than $ob cC$, so it can be simply regarded as a collection of objects, or a set when $cC$ is small.
]

#example[
  If $ob cC = {x}$, then $hom_cC (x, x)$ is a *monoid*.
] // #remark[ If you have never heard of monoids before, the above can be seen as the definition of a monoid. In general, a category is a "generalised" monoid because in a category you can only compose two morphisms $f, g$ in certain situations (namely, when the codomain of $f$ and the domain of $g$ match), whereas composition is allowed for any two elements of a monoid. // ] #example[ The following are the main categories we will be working with. - The category $Set$ has objects which are sets and morphisms which are functions between sets. Notice in category theory we avoid talking directly about elements of a set, because a set, which is an object of the category $Set$, is "atomic" or inseparable. - Let $k$ be a field. The category $veck$ has objects which are vector spaces over $k$ and morphisms which are linear transformations between vector spaces. We often denote $hom_veck$ as $homk$. In particular, for any $V, W in veck$, $homk (V, W)$ is also a vector space. - Let $R$ be a ring. The category $RMod$ has objects which are #lrms and morphisms which are module homomorphisms. Similarly, we have the category $ModR$ of #rrms. We often denote $hom_RMod$ or $hom_ModR$ as $homr$; it should be clear from the context which one we are referring to. - The category $Grp$ has objects which are groups and morphisms which are group homomorphisms. Similarly, we have the category $Ab$ of abelian groups. // $veck, Set, $ left/right $R$-modules, bimodules, topological spaces, etc. ] #definition[ Let $cC, cD$ be categories. The *product category* $cC times cD$ consists of objects $(C, D)$ for $C in cC$ and $D in cD$, and morphisms $(f, g) : (C_1 ,D_1 )-> (C_2, D_2)$ for $f : C_1 -> C_2$ and $g: D_1 -> D_2$. ] #definition[ A morphism $f: B-> C$ is *monic* (or a *monomorphism*) if for any $e_1, e_2 : A -> B$ such that $f compose e_1 = f compose e_2$ we have $e_1 = e_2$. 
A morphism $f: B->C$ is *epic* (or an *epimorphism*) if for any $g_1, g_2 : C-> D$ such that $g_1 compose f = g_2 compose f$ we have $g_1 = g_2$. ] #note[ $f: B-> C$ is monic if and only if the induced map $(f oo -) : hom_cC (A, B) -> Hom(C) (A, C)$ is injective for any $A$, and $f : B-> C$ is epic if and only if the induced map $(- oo f) : Hom(C) (C, D) -> Hom(C) (B, D)$ is injective for any $D$. ] #example[ In $Set$, a monomorphism is equivalent to a one-to-one map and an epimorphism is equivalent to an onto map. ] #example[ In the category of commutative rings, $ZZ -> QQ$ is both monic and epic. Note that if two maps agree on $ZZ->R$, they must also agree on $QQ -> R$, since a ring homomorphism $f: QQ -> R$ is uniquely determined by $f(1)$. ] #example[ In the category of commutative rings, for any ring $R$ and its ideal $I$, $R -> R\/I$ is epic. ] // [Any localisation in ring is epic? #TODO] == Categories with a Zero Object #definition[ An *initial object* $I$ of category $cC$ is an object such that for any $A in ob cC$, there exists a unique morphism $I -> A$. A *final object* $T$ is an object such that for any $A in ob cC$ there exists a unique morphism $A -> T$. ] #example[ In $Set$, an initial object is equivalent to an empty set, while a final object is equivalent to a one-element (or singleton) set. ] #definition[ A *zero object* $0$ is both initial and final. ] #example[ In $RMod$, a zero object is equivalent to the zero module. ] #proposition[ If there is a zero object in the category, then for any $B, C in cC$ we have a *zero morphism* $0 in hom_cC (B, C)$ which factors through the zero object. 
// https://tikzcd-typst-editor.pages.dev/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRACEQBfU9TXfIRQAmclVqMWbAMLdeIDNjwEiARlKrx9Zq0Qhi3cTCgBzeEVAAzAE4QAtkjIgcEJKp5XbDxE5dJRIAxYYLogUHRwABbGcp72-tR+iOqBwaHhUTFcFFxAA // #image("imgs/4.png", width: 30%) ] <zero-factor> #proof[ #align(center, commutative-diagram( node-padding: (40pt, 40pt), node((0, 0), [$B$]), node((0, 2), [$C$]), node((1, 1), [$0$]), arr((0, 0), (0, 2), [$0$]), arr((0, 0), (1, 1), [$exists!$], "dashed"), arr((1, 1), (0, 2), [$exists!$], "dashed"), )) It is clear from the commutative diagram. ] #notation[ In a commutative diagram, two paths with the same starting and ending points correspond to two equal morphisms. ] #notation[ We (ab)use the notation $0$ to denote both a zero object and a zero morphism. ] #definition[ In a category with a zero object, a *kernel* of $f: B->C$ is a morphism $i: A-> B$ such that $f compose i = 0$ in a universal way. That is, for any $i' : A'-> B$ such that $f compose i' = 0$, there exists a unique morphism $h : A' -> A$ such that $i' = i oo h$. We denote $i = ker(f)$. 
Diagrammatically,
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZARgBpiBdUkANwEMAbAVxiRAEEQBfU9TXfIRQBGUsKq1GLNgCFuvEBmx4CRAMzkJ9Zq0QgAwvL7LBRAAybq26XvYByI4v4qhyUWa1TdIM9wkwoAHN4IlAAMwAnCABbJDIQHAgkUUkdNjDHSJikCwSkxHjrbyxMqNjEDTyc6gYsMG8oOjgACwCQKy82GAAPLDgcOAACAEJS7IrqRLiOtL0sBx5wsuqqxAAWRZAs8rXJ-OFN7aRKqfWuCi4gA
#align(center, commutative-diagram(
  node-padding: (50pt, 50pt),
  node((1, 1), [$A$]),
  node((2, 2), [$B$]),
  node((1, 3), [$C$]),
  node((1, 0), [$A'$]),
  node((0, 2), [$0$]),
  arr((2, 2), (1, 3), [$f$]),
  arr((1, 1), (2, 2), [$i$]),
  arr((1, 0), (1, 1), [$exists !$], "dashed"),
  arr((1, 0), (2, 2), [$i'$], label-pos: -1em),
  arr((1, 1), (0, 2), []),
  arr((0, 2), (1, 3), []),
  arr((1, 0), (0, 2), []),
))
// #v(20pt)
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRAEEQBfU9TXfIRQBGclVqMWbAELdeIDNjwEiAJjHV6zVohABhOXyWCiZYeK1Td7AOTdxMKAHN4RUADMAThAC2SUSA4EEjqEtps7oYgXr5IZIHBiAGWOiBYUTF+iPFBIdQARjBgUEgALACcmpKpxBneWQDM1LnZ1AxYYKlQdHAAFo4gVeG6MAAeWHA4cAAEAIR1sYhNCf5DVml2PB71SMstoSlstVwUXEA
// #align(center, commutative-diagram(
//   node-padding: (40pt, 40pt),
//   node((0, 0), [$A$]),
//   node((0, 1), [$B$]),
//   node((0, 2), [$C$]),
//   node((1, 0), [$A'$]),
//   arr((0, 1), (0, 2), [$f$]),
//   arr((0, 0), (0, 1), [$i$]),
//   arr((0, 0), (0, 2), [$0$], curve: 35deg),
//   arr((1, 0), (0, 0), [$exists !$], label-pos: 1.5em, "dashed"),
//   arr((1, 0), (0, 1), [$i'$]),
//   arr((1, 0), (0, 2), [$0$], label-pos: -1em),
// ))
// #image("imgs/5.png", width: 30%)
]

#notation[
  Sometimes, people might also say the object $A$ in the above definition is the kernel of $f$ when the morphism $i$ is clear, and write $A = ker(f)$. However, this easily leads to confusion later on, so this note adopts the following non-standard notation: we write $A = Ker(f)$ (with a capital K) when we mean the object and $i = ker(f)$ when we mean the morphism. Hence, we would have
  $ Ker(f) -->^(ker(f)) B ->^f C $
  such that $f oo ker(f) = 0$ in a universal way.
Similar notations will be used for concepts we define later.
] <ker-notation>

// #example[
//   In $Set$, a kernel of $f: X -> Y$ is the inclusion map
//   $
//     i: { x in X | f(x) = 0} arrow.hook X
//   $
//   Hence we might also say the set ${ x in X | f(x) = 0}$ is a kernel of $f$.
// ]

#example[
  In $veck$, kernels are kernels.
]

#theorem[
  A kernel is a monomorphism.
]
// (This indicates that $A$ is the "biggest" subobject (to be defined!) of $B$ to be mapped to zero by $f$.)

#definition[
  A *cokernel* of $f: B->C$ is a morphism $j: C-> D$ such that $j compose f = 0$ in a universal way. We denote $j = coker(f)$ and $D = Coker(f)$.]

#theorem[
  A cokernel is an epimorphism.
]

#example[
  In $veck$, the cokernel of $T: V -> W$ is the quotient map $W -> W \/ im T$.
]

#lemma[
  Let $A$ be any object. Then for the unique morphism $f: A -> 0$, we have $ker(f) = id_A$ and $coker(f) = id_0 = 0$. Dually, for $g: 0->A$, we have $ker(g) = 0$ and $coker(g) = id_A$.
]

#definition[
  The *opposite category* of $cC$ is a category $cC^op$ where $ob cC^op = ob cC$ and $hom_(cC^op)(x, y) = hom_cC (y, x)$.
]

#proposition[A morphism $f: B->C$ is monic in $cC$ if and only if $f^op : C -> B$ is epic in $cC^op$.]

We say that "monic" and "epic" are *dual* concepts. Similarly, "initial objects" and "final objects" are dual; "kernels" and "cokernels" are dual.

== Products and Coproducts

#definition[
  Let ${C_i | i in I}$ be a family of objects, then their *product* $product_(i in I) C_i$ is an object such that there exist $pi_j : product_(i in I) C_i -> C_j$ for all $j in I$ in a universal way. That is, for any object $D$ with morphisms $g_j : D -> C_j$ for all $j in I$, there exists a unique morphism $h : D -> product_(i in I) C_i$ such that $g_j = pi_j oo h$ for all $j in I$.
// #image("imgs/6.png", width: 50%) // https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZ<KEY> #align(center, commutative-diagram( node((0, 0), [$D$]), node((0, 1), [$product_(i in I) C_i$]), node((1, 1), [$C_j$]), arr((0, 0), (0, 1), [$exists !$], "dashed"), arr((0, 0), (1, 1), [$g_j$]), arr((0, 1), (1, 1), [$pi_j$]), )) The *coproduct* of ${C_i | i in I}$ is defined as their product in the opposite category $C^op$. // https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMA<KEY> #align(center, commutative-diagram( node((0, 0), [$D$]), node((0, 1), [$product.co_(i in I) C_i$]), node((1, 1), [$C_j$]), arr((0, 1), (0, 0), label-pos: -1em, [$exists !$], "dashed"), arr((1, 1), (0, 0), label-pos: -1em, [$f_j$]), arr((1, 1), (0, 1), label-pos: -1em, [$i_j$]), )) // Products in $cC^op$ are coproducts $product.co_(i in I) C_i$: // #image("imgs/7.png", width: 50%) ] #example[ In $Set$, let ${X_i | i in I}$ be a family of sets. $ product_(i in I) X_i = {(x_i)_(i in I) | x_i in X_i} $ and $product.co_(i in I) X_i$ is the disjoint union. ] #remark[ We need to use tuples here for the ordering of elements; suppose we want to use sets only, then it can be messy and arbitrary! This is an advantage of the language of category theory over that of set theory. ] #proposition[ $ Hom(C) (A, product C_i) bij product Hom(C) (A, C_i) $ ] #proof[ For any $C_i$, there exists $pi_i : product C_i -> C_i$ satisfying the universal property. Define $phi: Hom(C) (A, product C_i) -> product Hom(C) (A, C_i)$ as $ f |-> (pi_i compose f)_i = (pi_1 compose f, ..., pi_n compose f) $ #align(center, commutative-diagram( node((0, 0), [$A$]), node((0, 1), [$product C_i$]), node((1, 1), [$C_i$]), arr((0, 0), (0, 1), [$f$]), arr((0, 0), (1, 1), [$g_i$]), arr((0, 1), (1, 1), [$pi_i$]), )) Any $(g_i)_i in product Hom(C) (A, C_i)$ can be factorised as $(pi_i compose f')_i$ for some unique $f': A -> product C_i$ due to the universal property of the product. 
The existence of $f'$ ensures that $phi$ is surjective and the uniqueness of $f'$ ensures injectivity. Thus $phi$ is a bijection.
// #image("imgs/8.png", width: 50%)
]

#proposition[We have
  $ Hom(C) (product.co C_i, A) bij product Hom(C) (C_i, A). $
]

#proof[
  This is similar to the above case: we just reverse all the arrows.
  #align(center, commutative-diagram(
    node((0, 0), [$A$]),
    node((0, 1), [$product.co C_i$]),
    node((1, 1), [$C_i$]),
    arr((0, 1), (0, 0), label-pos: -1em, [$f$]),
    arr((1, 1), (0, 0), label-pos: -1em, [$g_i$]),
    arr((1, 1), (0, 1), label-pos: -1em, [$i_i$]),
  ))
  Notice the asymmetry here: the right-hand side is still a product, not a coproduct, because it remains a tuple of arrows.
]

== Functors and Natural Transformations

#definition[
  Let $cC$, $cD$ be categories. A *functor* $F: cC -> cD$ consists of
  - A map of objects $ob cC -> ob cD$;
  - #fw[For every pair of objects $C_1, C_2 in cC$, a map of morphisms
      $ Hom(C) (C_1, C_2) -> Hom(D) (F(C_1), F(C_2)) $
    ]
  subject to preserving morphism composition and identity morphisms.
]

#definition[
  Now we can define $bd("Cat")$, the category of all (small) categories, where $ob bd("Cat")$ are small categories and $hom_Cat (cC, cD)$ are functors between $cC$ and $cD$.
]

#definition[
  Suppose $F, G: cC -> cD$, then a *natural transformation* $alpha: F => G$ is defined by a collection of morphisms in $cD$ indexed by $x in ob cC$:
  $ {alpha_x: F(x) -> G(x)}_(x in ob cC) $
  such that for every morphism $f : x -> x'$ in $cC$, the diagram commutes:
  // https://tikzcd-typst-editor.pages.dev/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRADEAKADwEoQAvqXSZc+QigCM5KrUYs2AcR78hI7HgJEyk2fWatEHHgHJVwkBg3ii03dX0Kjy7mcGyYUAObwioAGYAThAAtkhkIDgQSJJqIEGhMdRRSADMDvKGIMr+5gHBYYgRKYgATBkGbFy5ghYJheWR0YipAhQCQA
  #align(center, commutative-diagram(
    node((0, 0), [$F(x)$]),
    node((0, 1), [$G(x)$]),
    node((1, 0), [$F(x')$]),
    node((1, 1), [$G(x')$]),
    arr((0, 0), (0, 1), [$alpha_x$]),
    arr((0, 1), (1, 1), label-pos: 1.5em, [$G(f)$]),
    arr((0, 0), (1, 0), label-pos: 1.5em, [$F(f)$]),
    arr((1, 0), (1, 1), [$alpha_(x')$]),
  ))
  // #image("imgs/9.png", width: 50%)
]

#definition[
  The *functor category* $"Fun"(cC, cD)$ is a category where the objects are functors $cC -> cD$ and the morphisms are natural transformations.
]

#remark[
  In $Cat$, the hom-sets are not only sets but also categories, which means that $Cat$ is a *2-category*.
]
#endlec(2)
== Adjoint Functors
#definition[
Functors $L : cA arrows.rl cB : R$ are *adjoint* if for all $A in cA, B in cB$ there exists a bijection $ tau_(A B) : Hom(B)(L(A), B) bij Hom(A) (A, R(B)) $ such that for any $f: A-> A'$ and $g: B-> B'$, the diagram commutes:
// #image("imgs/10.png")
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyx<KEY>BdUk<KEY>
#align(center, commutative-diagram(
node-padding: (40pt, 40pt),
node((0, 0), [$Hom(B) (L(A'), B)$]),
node((0, 1), [$Hom(B) (L(A), B)$]),
node((0, 2), [$Hom(B) (L(A), B')$]),
node((1, 0), [$Hom(A) (A', R(B))$]),
node((1, 1), [$Hom(A) (A, R(B))$]),
node((1, 2), [$Hom(A) (A, R(B'))$]),
arr((0, 0), (0, 1), []),
arr((0, 1), (0, 2), []),
arr((0, 0), (1, 0), [$tau_(A'B)$]),
arr((0, 1), (1, 1), [$tau_(A B)$]),
arr((0, 2), (1, 2), [$tau_(A B')$]),
arr((1, 0), (1, 1), []),
arr((1, 1), (1, 2), []),
))
]
#remark[
Recall that in linear algebra we have $angle.l T v, w angle.r = angle.l v, T^* w angle.r$, which is where the name "adjoint" comes from.
]
#remark[
Equivalently, $tau$ is a natural isomorphism between $Hom(B) (L(-), -)$ and $Hom(A) (-, R(-))$, both of which are functors $cA^op times cB -> Set$. Note that $cA^op$ is used here because $Hom(A)(-, B)$ is a contravariant functor.
]
// What's a product category? It's just pairs of objects and pairs of morphisms.
#example[
$"Free"$ is the left adjoint of $"Forget"$. For example, we define the functors between $veck$ and $Set$:
$ "Forget": veck &-> Set \
(V, +, dot) &|-> V $
$ "Free" : Set &-> veck \
X &|-> k[X] $
Then we have:
$ hom_(veck)(k[X], W) &iso hom_(Set) (X, "Forget"(W))\
T &|-> T|_X \
"linearly extended" f &arrow.l.bar f $
// Forget: $Grp -> Set$. Free: $Set -> Grp$. Also happens.
]
== Equivalence of Categories
#definition[
In a category $cC$, objects $X, Y$ are *isomorphic* if there exist $f: X-> Y$ and $g: Y -> X$ such that $f compose g = id_Y$ and $g compose f = id_X$. We say that $f$ and $g$ are *isomorphisms*.
] In the functor category, an isomorphism (which is a natural transformation between functors) is often called a *natural isomorphism*. Consider $Cat$, then two small categories $cC$ and $cD$ are isomorphic if there are functors $F: cC-> cD$ and $G: cD-> cC$ such that $F compose G = Id$ and $G compose F = Id$. However, this rarely happens. We hence introduce the following weaker condition. #definition[ Two categories $cC$ and $cD$ are *equivalent* if there are functors $F: cC-> cD$ and $G: cD-> cC$ such that there exist natural isomorphisms $epsilon: F G => Id$ and $eta: Id => G F$. In this way $F(G(X)) iso X$ instead of $F(G(X))=X$. ] It does not really matter here if we write $F G => Id$ or $Id => F G$ (the same for $G F$) because it is a natural isomorphism, but the above way of writing is to ensure consistency with an alternative definition of adjoint functors. #remark[ Let $X, Y in Top$ and $f: X arrows.lr Y : g$ be continuous maps. If $f compose g tilde id $ and $g compose f tilde id$ then $X, Y$ are homotopy equivalent. Natural transformations are similar to the notion of homotopy. ] == Limits and Colimits #definition[ Let $I$ be a small category and $F: I -> cA$ be a functor. Then $F$ is called a *diagram*. Denote $F(i) = F_i$ for all $i in I$. A *cone* of $F$ is an object $C$ of $cA$ with morphisms ${f_i : C -> F_i}_(i in I)$, such that for any $alpha : j -> i $ in $I$, // #image("imgs/11.png", width: 30%) // https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRABkQBfU9TXfIRRkAjFVqMWbAGIB9AFbdeIDNjwEiI0mOr1mrRCDlZu4mFADm8IqABmAJwgBbJGRA4ISLRP1s0WBSU7RxdENw8kACZdSQMVAJMeYOcvagjEaJ8pQ2kACkY0AAs6AEpTLiA #align(center, commutative-diagram( node-padding: (50pt, 50pt), node((0, 0), [$C$]), node((1, 0), [$F_j$]), node((1, 1), [$F_i$]), arr((0, 0), (1, 0), [$f_j$]), arr((0, 0), (1, 1), [$f_i$]), arr((1, 0), (1, 1), [$F(alpha)$]), )) commutes. 
A limit is a universal cone; namely, $L$ is a *limit* of $F$ if it is a cone of $F$ with ${pi_i : L -> F_i}_(i in I)$ such that for any cone $C$ of $F$ with ${f_i : C-> F_i}_(i in I)$, there exists a unique morphism $h : C -> L$ such that $f_i = pi_i oo h$ for all $i in I$. We denote $L = lim_I F$.
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBoBGAXVJADcBDAGwFcYkQAZEAX1PU1z5CKMgCZqdJq3YAxAPoArHnxAZseAkXKlxNBiza<KEY>
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((1, 0), [$L$]),
node((2, 0), [$F_j$]),
node((2, 1), [$F_i$]),
node((0, 0), [$C$]),
arr((1, 0), (2, 0), [$pi_j$]),
arr((1, 0), (2, 1), [$pi_i$]),
arr((2, 0), (2, 1), [$F(alpha)$]),
arr((0, 0), (2, 0), [$f_j$], curve: -30deg, label-pos: right),
arr((0, 0), (2, 1), [$f_i$], curve: 30deg),
arr((0, 0), (1, 0), [$exists !$], "dashed"),
))
// (Any $L$ that makes the diagram commute is called a cone and being universal means that it's a final object in the category of cones.)
// #image("imgs/12.png", width: 30%)
// If such $L$ exists then we call it the limit of $F$ or $lim_cal(I) F$.
]
#notation[
Sometimes we write $L = lim F_i$ when $I$ is clear from the context or is not important.
]
Dually, we define the colimit of $F$. This concept is important enough to be restated as follows.
#definition[
Let $I$ be a small category and $F: I -> cA$ be a diagram. Denote $F(i) = F_i$ for all $i in I$. A *cocone* of $F$ is an object $C$ of $cA$ with morphisms ${f_i : F_i -> C}_(i in I)$, such that for any $alpha : j -> i $ in $I$,
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRABkQBfU9TXfIRRkAjFVqMWbAGIB9AFbdeIDNjwEiI0mOr1mrRCD<KEY>
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((0, 0), [$C$]),
node((1, 0), [$F_j$]),
node((1, 1), [$F_i$]),
arr((1, 0), (0, 0), [$f_j$], label-pos: 1.5em),
arr((1, 1), (0, 0), [$f_i$], label-pos: -1em),
arr((1, 0), (1, 1), [$F(alpha)$]),
))
commutes.
A colimit is a universal cocone; namely, $L$ is a *colimit* of $F$ if it is a cocone of $F$ with ${pi_i : F_i -> L}_(i in I)$ such that for any cocone $C$ of $F$ with ${f_i : F_i -> C}_(i in I)$, there exists a unique morphism $h : L -> C$ such that $f_i = h oo pi_i$ for all $i in I$. We denote $L = colim_I F$.
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBoBGAXVJADcBDAGwFcYkQAZEAX1PU1z5CKMgCZqdJq3YAxAPoArHnxAZseAkXKlxNBizaIQ8rMv7qhRMsQn7pRgMI8JMKAHN4RUADMAThABbJG0QHAgkMkkDdjQsRTMQP0CkURowiL0pQ1U4014ffyDEEPTEVKj7YwAKJjQAC3oASgSkopLwxABmGgAjGDAoJE7Iu2zveJpGej<KEY>hhvHBbClLSO7pA+gaQ<KEY>VWjNCdy<KEY>EFPKmAAPLDgcOAAAgAhM5uEA
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((1, 0), [$L$]),
node((2, 0), [$F_j$]),
node((2, 1), [$F_i$]),
node((0, 0), [$C$]),
arr((2, 0), (1, 0), [$pi_j$], label-pos: -1em),
arr((2, 1), (1, 0), [$pi_i$], label-pos: -1em),
arr((2, 0), (2, 1), [$F(alpha)$]),
arr((2, 0), (0, 0), [$f_j$], curve: 30deg, label-pos: 1em),
arr((2, 1), (0, 0), [$f_i$], label-pos: 1em, curve: -30deg),
arr((1, 0), (0, 0), [$exists !$], label-pos: -1em, "dashed"),
))
]
#proposition[
If any limit or colimit exists, then it is unique up to a unique isomorphism.
]
#notation[
Hence we usually say "the" limit (or kernel, product, etc.) instead of "a" limit of a diagram.
]
#example[
If $I$ is a discrete category, then $lim_I F = product_(i in I) F_i$ is the product and $colim_I F = product.co_(i in I) F_i$ is the coproduct.
]
#example[Let $I = circle.filled arrows.rr circle.filled$ be the category with two objects and two parallel morphisms between them. Let $F : I-> cC$ be a functor which maps $I$ to $ A arrows.rr^f_g B $ in $cC$. Then when $lim_I F$ exists, we have two associated morphisms $h: lim_I F -> A$ and $h' : lim_I F -> B$, such that $f oo h = h' = g oo h$. We define the *equaliser* of $f$ and $g$ as this $h : lim_I F -> A$, denoted as $Eq(f, g)$.
We also dually define the *coequaliser* of $f$ and $g$ using $colim_I F$, denoted as $Coeq(f, g)$, such that $Coeq(f, g) oo f = Coeq(f, g) oo g$. Continuing with @ker-notation, we have
$ EQ(f, g) -->^(Eq(f, g)) A arrows.rr^f_g B -->^(Coeq(f, g)) COeq(f, g). $
]
#proposition[
In a category with a zero object, $Eq(f, 0) = ker f $ and $Coeq (f, 0) = coker f$.
]
#proposition[
An equaliser is a monomorphism. A coequaliser is an epimorphism.
]
// #image("imgs/15.png")
// Coequaliser is just another direction.
#proposition[
Let $L : cA arrows.lr cB : R$ be an adjunction. Then
$ L(colim A_i) iso colim L (A_i) \
R(lim B_i) iso lim R(B_i) $
]
#proof[
Take $X in cB$.
$ hom_cB (L(colim A_i), X) iso hom_cA (colim A_i, R(X)) iso lim hom_cA (A_i, R(X)) \
iso lim hom_cB (L (A_i), X) iso hom_cB (colim L (A_i), X). $
When we move a colimit out of $hom$, it becomes a limit. (We have already seen this for products and coproducts.) We then apply the Yoneda Lemma to conclude that $L(colim A_i)$ and $colim L(A_i)$ are isomorphic.
]
#remark[
Left adjoints preserve colimits and right adjoints preserve limits. In particular, left adjoints preserve cokernels and are right exact; right adjoints preserve kernels and are left exact (to be defined later).
]
#proposition[
A category $cC$ has all finite limits #iff it has finite products and equalisers.
] <all-finite-limits>
#proof[@awodey[Proposition 5.21].]
== Subobjects and Quotient Objects
@awodey[Section 5.1]. This section offers some new vocabulary to describe things we have already seen.
#definition[
Let $A$ be an object of category $cC$. A *subobject* of $A$ is a monomorphism $u : S -> A$.
Given two subobjects $u : S-> A$ and $v : T->A$ of $A$, we define the relation of *inclusion* of subobjects by $u subset.eq v$ if and only if there exists $f : S -> T$ such that $u = v oo f$. Such $f$ is unique if it exists, since $v$ is a monomorphism.
We say two subobjects $u : S-> A$ and $v : T->A$ of $A$ are *equivalent* if $u subset.eq v$ and $v subset.eq u$.
]
#proposition[
Let $u : S-> A$ and $v : T->A$ be two equivalent subobjects of $A$. Then $S$ and $T$ are isomorphic objects.
]
#notation[
Sometimes instead of saying $u: S-> A$ is a subobject of $A$, we may say $S$ is a subobject of $A$ when the monomorphism $u$ is clear from the context.
]
#proposition[
In category $cC$, $i: A->B$ is the equaliser of $f, g: B-> C$ if and only if $i$ is the largest subobject of $B$ such that $f oo i = g oo i$. In particular, $i: A->B$ is the kernel of $f: B-> C$ if and only if $i$ is the largest subobject of $B$ such that $f oo i = 0$.
]
The dual concept of subobjects is *quotient objects*.
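To make the equaliser construction concrete, here is a small Python sketch (my own illustration, not part of the notes) of the equaliser in the category of finite sets, where $Eq(f, g)$ is the largest subset on which the two maps agree and the equalising arrow is the inclusion:

```python
# Equaliser in the category of finite sets, as a sketch (hypothetical helper,
# not a general categorical implementation): Eq(f, g) is the largest subset
# E of A with f and g agreeing on E; the equalising arrow is the inclusion E -> A.
def equaliser(A, f, g):
    return {x for x in A if f(x) == g(x)}

A = {-2, -1, 0, 1, 2}
f = lambda x: x * x   # f(x) = x^2
g = lambda x: x + 2   # g(x) = x + 2

E = equaliser(A, f, g)  # x^2 == x + 2 exactly for x in {-1, 2}
```

By construction the inclusion of `E` is a monomorphism, matching the proposition that equalisers are monomorphisms.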
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/typearea/0.1.0/README.md
markdown
Apache License 2.0
# typst-typearea

A KOMA-Script inspired package to better configure your typearea and margins.

```typst
#import "@preview/typearea:0.1.0": typearea

#show: typearea.with(
  paper: "a4",
  div: 9,
  bcor: 11mm,
)

= Hello World
```

## Reference

`typearea` accepts the following options:

### two-sided

Whether the document is two-sided. Defaults to `true`.

### bcor

Binding correction. Defaults to `0pt`. Additional margin on the inside of a page when two-sided is true. If two-sided is false it will be on the left or right side, depending on the value of `binding`. A `binding` value of `auto` will currently default to `left`.

### div

How many equal parts to split the page into. Controls the margins. Defaults to `9`. The top and bottom margin will always be one and two parts respectively. In two-sided mode the inside margin will be one part and the outside margin two parts, so the combined margins between the text on the left page and the text on the right page are the same as the margin from the outer edge of the text to the outer edge of the page. In one-sided mode the left and right margin will take 1.5 parts each.

### ..rest

All other arguments are passed on to `page()` as is. You can see which arguments `page()` accepts in the [typst reference for the page function](https://typst.app/docs/reference/layout/page/).
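As a rough illustration of the `div` rule described above, here is a Python sketch of the resulting margins. The function name and the exact interaction with `bcor` are my own assumptions, not part of the package:

```python
def typearea_margins(width, height, div=9, bcor=0.0, two_sided=True):
    """Sketch of the DIV margin rule described above (assumed semantics).

    Each axis is split into `div` equal parts: top margin = 1 part,
    bottom = 2 parts; two-sided: inside = 1 part (plus bcor), outside = 2
    parts; one-sided: left = right = 1.5 parts each.
    """
    part_w = (width - bcor) / div
    part_h = height / div
    margins = {"top": part_h, "bottom": 2 * part_h}
    if two_sided:
        margins["inside"] = part_w + bcor
        margins["outside"] = 2 * part_w
    else:
        margins["left"] = 1.5 * part_w + bcor
        margins["right"] = 1.5 * part_w
    return margins

# A4 in millimetres, div = 9, no binding correction:
m = typearea_margins(210, 297, div=9)
```

Note how the two-sided invariant holds: the two inside margins together equal one outside margin.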
https://github.com/almarzn/portfolio
https://raw.githubusercontent.com/almarzn/portfolio/main/templates/typst/.template/shared/sizes.typ
typst
#let size-scale(base: 12pt, ratio: 1.1) = ( small: base * calc.pow(ratio, -1), p: base * calc.pow(ratio, 0), h6: base * calc.pow(ratio, 1), h5: base * calc.pow(ratio, 2), h4: base * calc.pow(ratio, 3), h3: base * calc.pow(ratio, 4), h2: base * calc.pow(ratio, 5), h1: base * calc.pow(ratio, 6), ) #let minor-second-scale = size-scale.with(ratio: 1.067) #let major-second-scale = size-scale.with(ratio: 1.125) #let minor-third-scale = size-scale.with(ratio: 1.200) #let major-third-scale = size-scale.with(ratio: 1.250) #let perfect-fourth-scale = size-scale.with(ratio: 1.333) #let scale = major-second-scale(base: 12pt)
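The modular scale above is easy to sanity-check outside Typst; this Python sketch mirrors `size-scale` (the port is mine, not part of the template):

```python
def size_scale(base=12.0, ratio=1.1):
    """Python mirror of the Typst `size-scale` helper: level n = base * ratio**n,
    with `small` at n = -1, `p` at n = 0, and h6..h1 at n = 1..6."""
    names = ["small", "p", "h6", "h5", "h4", "h3", "h2", "h1"]
    return {name: base * ratio ** n for n, name in enumerate(names, start=-1)}

# The major-second scale used above: ratio 1.125, base 12pt.
scale = size_scale(base=12.0, ratio=1.125)
```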
https://github.com/EpicEricEE/typst-plugins
https://raw.githubusercontent.com/EpicEricEE/typst-plugins/master/equate/src/lib.typ
typst
#import "equate.typ": equate
https://github.com/LDemetrios/Typst4k
https://raw.githubusercontent.com/LDemetrios/Typst4k/master/src/test/resources/suite/layout/inline/linebreak.typ
typst
// Test line breaks. --- linebreak-overflow --- // Test overlong word that is not directly after a hard break. This is a spaceexceedinglylongy. --- linebreak-overflow-double --- // Test two overlong words in a row. Supercalifragilisticexpialidocious Expialigoricmetrioxidation. --- linebreak-hyphen-nbsp --- // Test for non-breaking space and hyphen. There are non\u{2011}breaking~characters. --- linebreak-narrow-nbsp --- // Test for narrow non-breaking space. #show "_": sym.space.nobreak.narrow 0.1_g, 1_g, 10_g, 100_g, 1_000_g, 10_000_g, 100_000_g, 1_000_000_g --- linebreak-shape-run --- // Test that there are no unwanted line break opportunities on run change. This is partly emp#emph[has]ized. --- linebreak-manual --- Hard #linebreak() break. --- linebreak-manual-directly-after-automatic --- // Test hard break directly after normal break. Hard break directly after \ normal break. --- linebreak-manual-consecutive --- // Test consecutive breaks. Two consecutive \ \ breaks and three \ \ more. --- linebreak-manual-trailing-multiple --- // Test forcing an empty trailing line. Trailing break \ \ --- linebreak-manual-justified --- // Test justified breaks. #set par(justify: true) With a soft #linebreak(justify: true) break you can force a break without #linebreak(justify: true) breaking justification. #linebreak(justify: false) Nice! --- linebreak-thai --- // Test linebreak for East Asian languages ทีวีตรวจทานนอร์ทแฟรีเลคเชอร์โกลด์อัลบัมเชอร์รี่เย้วสโตร์กฤษณ์เคลมเยอบีร่าพ่อค้าบลูเบอร์รี่สหัสวรรษโฮปแคนูโยโย่จูนสตรอว์เบอร์รีซื่อบื้อเยนแบ็กโฮเป็นไงโดนัททอมสเตริโอแคนูวิทย์แดรี่โดนัทวิทย์แอปพริคอทเซอร์ไพรส์ไฮบริดกิฟท์อินเตอร์โซนเซอร์วิสเทียมทานโคโยตี้ม็อบเที่ยงคืนบุญคุณ --- linebreak-cite-punctuation --- // Test punctuation after citations. #set page(width: 162pt) They can look for the details in @netwok, which is the authoritative source. #bibliography("/assets/bib/works.bib") --- linebreak-math-punctuation --- // Test punctuation after math equations. 
#set page(width: 85pt)
We prove $1 < 2$. \
We prove $1 < 2$! \
We prove $1 < 2$? \
We prove $1 < 2$, \
We prove $1 < 2$; \
We prove $1 < 2$: \
We prove $1 < 2$- \
We prove $1 < 2$– \
We prove $1 < 2$— \

--- linebreak-link ---
#link("https://example.com/(ab") \
#link("https://example.com/(ab)") \
#link("https://example.com/(paren)") \
#link("https://example.com/paren)") \
#link("https://hi.com/%%%%%%%%abcdef") \

--- linebreak-link-justify ---
#set page(width: 240pt)
#set par(justify: true)
Here's a link https://url.com/data/extern12840%data_urlenc and then there are more links #link("www.url.com/data/extern12840%data_urlenc") in my text of links http://mydataurl/hash/12098541029831025981024980124124214/incremental/progress%linkdata_information_setup_my_link_just_never_stops_going/on?query=false

--- linebreak-link-end ---
// Ensure that there's no unconditional break at the end of a link.
#set page(width: 180pt, height: auto, margin: auto)
#set text(11pt)
For info see #link("https://myhost.tld").

--- issue-2105-linebreak-tofu ---
#linebreak()中文

--- issue-3082-chinese-punctuation ---
#set text(font: "Noto Serif CJK TC", lang: "zh")
#set page(width: 230pt)
課有手冬,朱得過已誰卜服見以大您即乙太邊良,因且行肉因和拉幸,念姐遠米巴急(abc0),松黃貫誰。

--- issue-80-emoji-linebreak ---
// Test that there are no linebreaks in composite emoji (issue #80).
#set page(width: 50pt, height: auto)
#h(99%) 🏳️‍🌈 🏳️‍🌈

--- issue-hyphenate-in-link ---
#set par(justify: true)

// The `linebreak()` function accidentally generated out-of-order breakpoints
// for links because it now splits on word boundaries. We avoid the link markup
// syntax because its show rule interferes.
#"http://creativecommons.org/licenses/by-nc-sa/4.0/"

--- issue-4468-linebreak-thai ---
// In this bug, empty-range glyphs at line break boundaries could be duplicated.
// This happens for Thai specifically because it has both // - line break opportunities // - shaping that results in multiple glyphs in the same cluster #set text(font: "Noto Sans Thai") #h(85pt) งบิก
https://github.com/vaucher-leo/template-tb-typst
https://raw.githubusercontent.com/vaucher-leo/template-tb-typst/main/fonctions.typ
typst
MIT License
#let todayDate(lang) = {
  let actualDay = datetime.today().display("[day]")
  if actualDay.first() == "0" {
    actualDay = actualDay.last()
  }
  let actualMonth = datetime.today().display("[month repr:long]")
  if lang == "FR" {
    // Convert month to French via a lookup table
    let months = (
      "January": "janvier",
      "February": "février",
      "March": "mars",
      "April": "avril",
      "May": "mai",
      "June": "juin",
      "July": "juillet",
      "August": "août",
      "September": "septembre",
      "October": "octobre",
      "November": "novembre",
      "December": "décembre",
    )
    actualMonth = months.at(actualMonth)
  }
  let actualYear = datetime.today().display("[year]")
  text(actualDay + " " + actualMonth + " " + actualYear)
}
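The same table-lookup idea, sketched in Python for comparison (the function name and signature are illustrative only):

```python
import datetime

# English -> French month names, mirroring the Typst conversion above.
FRENCH_MONTHS = {
    "January": "janvier", "February": "février", "March": "mars",
    "April": "avril", "May": "mai", "June": "juin", "July": "juillet",
    "August": "août", "September": "septembre", "October": "octobre",
    "November": "novembre", "December": "décembre",
}

def today_date(lang, today=None):
    """Format a date as "5 août 2024" (FR) or "5 August 2024" (otherwise)."""
    today = today or datetime.date.today()
    month = today.strftime("%B")  # English month name under the default C locale
    if lang == "FR":
        month = FRENCH_MONTHS.get(month, month)
    return f"{today.day} {month} {today.year}"
```

Like the Typst version, the day is printed without a leading zero (`today.day` is an integer).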
https://github.com/seapat/markup-resume
https://raw.githubusercontent.com/seapat/markup-resume/main/README.md
markdown
Apache License 2.0
# Markup-Resume

This project allows you to create a resume (including cover letter) from a markup file (toml, yaml, json, xml) using minimal amounts of `typst` code.

The aim of this project is to allow defining a flexible structure so that it can be used to write an internship application just as well as an academic cv or grant application.

This is my first time using typst; please let me know if you spot something in the code that is bad or can be improved.

## Instructions

1. Add [markup-resume-lib](https://github.com/seapat/markup-resume-lib) as a submodule to your own repo
    - `git submodule add https://github.com/seapat/markup-resume-lib`
2. Look at [`example.toml`](./example.toml) and [`assets/example.pdf`](./assets/example.pdf) to learn what keywords render which part of the final cv.
3. Check out [`example.typ`](./example.typ) to see how to import your data and what functions you need to call to render a specific part of the cv.
4. Create your own!
    - Instead of `.toml` you can use one of the formats that is [supported by typst](https://typst.app/docs/reference/data-loading/)
    - Alternatively you can also create a dict in typst.
    - Please note that the keywords for the different structural elements are hard-coded and all others will be ignored.

## Example

| Cover Letter | Resume |
| :---: | :---: |
| ![CL](./assets/example-1.png) | ![CV](./assets/example-2.png) |

## TODO

- Smart URL parsing
- Ensure some spacing properties
- Support for proper citations (i.e.
`.bib` files)
- Support custom icons
  - `icon` per entry, should use some method to insert icons before each entry title among other places
  - [example1](https://github.com/duskmoon314/typst-fontawesome), [example2](https://github.com/Bi0T1N/typst-social), [example3](https://github.com/duskmoon314/typst-fontawesome)
- Use commandline args to provide data directly to the compiler
  - alleviate the need to write your own typst code by providing a universal `main.typ` that takes one or more files as input
  - commandline arguments are an upcoming/planned feature

## Credit

- [Picture for the example](https://pixabay.com/vectors/profile-picture-woman-business-woman-7416279/) by Pixabay-user [Winterflow](https://pixabay.com/users/winterflower-17292963/)
https://github.com/daskol/typst-templates
https://raw.githubusercontent.com/daskol/typst-templates/main/jmlr/appendix.typ
typst
MIT License
// There are no numbers in sample paper. #set math.equation(numbering: none) // The first nameless section in the appendix. = <app:theorem> // Note: in this sample, the section number is hard-coded in. Following proper // LaTeX conventions, it should properly be coded as a reference: // In this appendix we prove the following theorem from // Section~\ref{sec:textree-generalization}: In this appendix we prove the following theorem from Section~6.2: *Theorem* #emph[Let $u,v,w$ be discrete variables such that $v, w$ do not co-occur with $u$ (i.e., $u !=0 arrow.r.double v = w = 0$ in a given dataset $cal(D)$). Let $N_(v 0), N_(w 0)$ be the number of data points for which $v=0, w=0$ respectively, and let $I_(u v), I_(u w)$ be the respective empirical mutual information values based on the sample $cal(D)$. Then $ N_(v 0) > N_(w 0) arrow.r.double I_(u v) <= I_(u w) $ with equality only if $u$ is identically 0.] $square.filled$ // The second section in the appendix. = #block[ *Proof.* We use the notation: $ P_(v)(i) = N_v^i / N, space.en i != 0; space.en P_(v 0) equiv P_v(0) = 1 - sum_(i != 0) P_v(i). $ These values represent the (empirical) probabilities of $v$ taking value $i != 0$ and 0 respectively. Entropies will be denoted by $H$. We aim to show that $(diff I_(u v)) / (diff P_(v 0)) < 0$.... ] _Remainder omitted in this sample. See #link("http://www.jmlr.org/papers/") for full paper._
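The theorem can be checked numerically on a toy dataset. The following Python sketch (my own illustration, not from the paper) computes empirical mutual information from counts and verifies the stated inequality when the preconditions hold:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) of two paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# v and w never co-occur with u (u != 0 implies v = w = 0), and N_v0 > N_w0,
# so the theorem predicts I_uv <= I_uw.
u = [1, 1, 0, 0, 0]
v = [0, 0, 1, 0, 0]  # N_v0 = 4
w = [0, 0, 1, 1, 0]  # N_w0 = 3
```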
https://github.com/flechonn/typst
https://raw.githubusercontent.com/flechonn/typst/main/data/doc3.typ
typst
= 1. Exercise: Apply Newton's second law.

Solution: $F = m * a$

Hint level: Beginner

Consider an object of mass $m$ subject to a force $F$. Apply Newton's second law to determine the acceleration $a$ of the object.

= 2. Exercise: Solve a uniform circular motion problem.

Solution: $a_c = v^2 / r$

Hint level: Intermediate

An object moves in a circle at constant speed $v$ with radius $r$. Use the uniform circular motion equation to compute the centripetal acceleration $a_c$ of the object.
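Both formulas are straightforward to evaluate numerically; a minimal Python sketch (function names are illustrative only):

```python
def newton_acceleration(force, mass):
    """Newton's second law: a = F / m."""
    return force / mass

def centripetal_acceleration(speed, radius):
    """Uniform circular motion: a_c = v**2 / r."""
    return speed ** 2 / radius
```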
https://github.com/Student-Smart-Printing-Service-HCMUT/ssps-docs
https://raw.githubusercontent.com/Student-Smart-Printing-Service-HCMUT/ssps-docs/main/contents/categories/task4/4.2.typ
typst
Apache License 2.0
== Documentation and folders for the repository

In both the frontend and backend repositories, we continuously added to and updated the documentation and folders supporting the project's development, such as .github (which stores information about each change), frontend-related folders such as components, pages, ..., and backend-related folders such as prisma, ... To keep this content up to date in the repositories, we use the commands provided by git, such as git add, git commit, git push, and pull requests, to push changes from a local branch to a remote branch and to merge between remote branches.
https://github.com/jneug/typst-nassi
https://raw.githubusercontent.com/jneug/typst-nassi/main/assets/example-cetz.typ
typst
MIT License
#import "@preview/cetz:0.2.2" #import "../src/nassi.typ" #set page(width: auto, height:auto, margin: 5mm) #let strukt = nassi.parse(``` function inorder(tree t) if t has left child inorder(left child of t) end if process(root of t) if t has right child inorder(right child of t) end if end function ```.text) #figure( cetz.canvas({ import nassi.draw: diagram import cetz.draw: * diagram((4,4), strukt) circle((rel:(.65,0), to:"nassi.e1-text"), stroke:red, fill:red.transparentize(80%), radius:.5, name: "a") content((5,5), "current subtree", name: "b", frame:"rect", padding:.1, stroke:red, fill:red.transparentize(90%)) line("a", "b", stroke:red) content((1,2), "recursion", name: "rec", frame:"rect", padding:.1, stroke:red, fill:red.transparentize(90%)) line("rec", (rel:(.15,0), to:"nassi.e3.west"), stroke:red, mark:(end: ">")) line("rec", (rel:(.15,0), to:"nassi.e7.west"), stroke:red, mark:(end: ">")) content((17, 2.7), [empty\ branches], name: "empty", frame:"rect", padding:.1, stroke:red, fill:red.transparentize(90%)) line("empty", (rel:(.15,0), to:"nassi.e4-text.east"), stroke:red, mark:(end: ">")) line("empty", (rel:(.15,0), to:"nassi.e8"), stroke:red, mark:(end: ">")) }), caption: "Nassi-Shneiderman diagram of a inorder tree traversal." ) // #cetz.canvas({ // import nassi.draw: diagram // import nassi.elements: * // import cetz.draw: * // diagram((0,0), { // branch("a > b", { // empty(name: "empty-1") // }, { // empty(name: "empty-2") // }) // }) // arc-through( // "nassi.empty-1", // ( // rel:(0,-.5), // to: ( // a:"nassi.empty-1", // b:"nassi.empty-2", // number:.5 // ) // ), // "nassi.empty-2", // mark:(start:">", end:">"), // stroke:red+2pt // ) // })
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/compiler/show-recursive.typ
typst
Apache License 2.0
// Test recursive show rules. --- // Test basic identity. #show heading: it => it = Heading --- // Test more recipes down the chain. #show list: scale.with(origin: left, x: 80%) #show heading: [] #show enum: [] - Actual - Tight - List = Nope --- // Test show rule in function. #let starwars(body) = { show list: it => block({ stack(dir: ltr, text(red, it), 1fr, scale(x: -100%, text(blue, it)), ) }) body } - Normal list #starwars[ - Star - Wars - List ] - Normal list --- // Test multi-recursion with nested lists. #set rect(inset: 3pt) #show list: rect.with(stroke: blue) #show list: rect.with(stroke: red) #show list: block - List - Nested - List - Recursive!
https://github.com/katamyra/Notes
https://raw.githubusercontent.com/katamyra/Notes/main/Compiled%20School%20Notes/CS2110/Modules/VonNeumann.typ
typst
#import "../../../template.typ": *

= The Von Neumann Model

== Basic Components

A *computer program* consists of a set of instructions, each specifying a well defined piece of work for the computer to carry out. The *instruction* is the smallest piece of work specified in a computer program.

#theorem[
The *von Neumann model* consists of 5 parts:
+ Memory
+ Processing Unit
+ Input
+ Output
+ Control Unit
]

== Memory

A typical memory for today's computer systems is $2^34$ by 8 bits, meaning it has $2^34$ distinct memory locations (its address space), with 8 bits of data (its addressability).

#definition[
*MAR*: Memory Address Register. To read the contents of a memory location, we place the address of that location in the MAR and interrogate the computer's memory. The information stored in the location having that address is then placed in the *memory data register (MDR)*.

In order to write to this location instead, we do the same process but with _write enable on_.
]

== Processing Unit

The actual processing of information in the computer is done by the *processing unit*. The simplest processing unit is the *Arithmetic and Logic Unit (ALU)*, which can add and perform basic bitwise operations.

#definition()[
ALUs normally process data elements of a fixed size known as the *word length* of the computer, and the data elements are known as *words*. Most modern day processors have 64 bit word lengths, but the LC-3 has a 16 bit word length.
]

#note[
Most computers provide a small amount of storage close to the ALU to allow results to be temporarily stored while doing calculations. The simplest form of quick storage is a *register* with the same length as the word length.
]

== Input and Output

For the LC-3, the only input is the keyboard, and the only output is the monitor.

== Control Unit

#definition[
*Control Unit*: the component in charge of making sure all the other parts of the computer play together. The control unit keeps track of where we are within the process of executing a program.
]

To keep track of which instruction is being executed, the control unit has an *instruction register* which contains that information. The control unit also has a *program counter*, which is a register pointing to the "next" instruction to be carried out.

== Instruction Processing

The central idea of computer processing is that the program and data are both stored as sequences of bits in the computer's memory, and the program is executed one instruction at a time under the direction of the control unit.

=== The Instruction

The most basic unit of processing is the *instruction*. There are three types of instructions:

#definition[
*Operate instructions*: instructions that operate on data. For the LC-3, these are the ADD, AND, and NOT instructions.
]

#definition[
*Data movement instructions*: instructions that move information between the processing unit, memory, and input/output devices.
]

#definition[
*Control instructions*: instructions that are necessary for altering the sequential processing of instructions.
]

=== Instruction Cycle

Instructions are carried out by the control unit in a systematic, step-by-step manner. There are six sequential phases, each of which requires zero or more steps.

`FETCH
DECODE
EVALUATE ADDRESS
FETCH OPERANDS
EXECUTE
STORE RESULT
`

#definition[
*Fetch phase*: obtains the next instruction from memory and loads it into the instruction register (IR) of the control unit.

The instructions are stored in computer memory. In order to carry out the work of an instruction, we must identify where it is. The PC contains the address of the next instruction to be processed.

Steps:
+ The MAR is loaded with the contents of the PC, and the PC is incremented
+ The memory is interrogated, which results in the next instruction being placed by the memory into the MDR
+ The IR is loaded with the contents of the MDR
]

#definition[
*Decode phase*: examines the instruction in order to figure out what the microarchitecture needs to do.
In the LC-3, a decoder figures out which of the 16 opcodes is being used.
]

#definition[
*Execute phase*: carries out the execution of the instruction.
]

There are other phases but we don't really need them for this class (?)
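The phases above can be sketched as a toy fetch/decode/execute loop. This Python example uses a made-up three-instruction machine (not the real LC-3 ISA) to show how the MAR, MDR, IR, and PC interact:

```python
def run(memory, pc=0):
    """Toy von Neumann cycle: FETCH (MAR <- PC, MDR <- mem[MAR], IR <- MDR),
    DECODE, EXECUTE, repeated until HALT. Hypothetical ISA, not real LC-3."""
    acc = 0                   # a single accumulator register
    while True:
        mar = pc              # FETCH step 1: MAR <- PC, and the PC is incremented
        pc += 1
        mdr = memory[mar]     # FETCH step 2: memory is interrogated into the MDR
        ir = mdr              # FETCH step 3: IR <- MDR
        op, arg = ir          # DECODE: pick apart the instruction
        if op == "ADD":       # EXECUTE
            acc += arg
        elif op == "NOT":
            acc = ~acc
        elif op == "HALT":
            return acc

program = [("ADD", 5), ("ADD", 2), ("HALT", 0)]
```

Note that the program itself lives in `memory`, alongside any data — the defining feature of the von Neumann model.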
https://github.com/RaphGL/ElectronicsFromBasics
https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap2/3_power_in_electric_circuits.typ
typst
Other
#import "../../core/core.typ" === Power in electric circuits In addition to voltage and current, there is another measure of free electron activity in a circuit: power. First, we need to understand just what power is before we analyze it in any circuits. Power is a measure of how much work can be performed in a given amount of time. Work is generally defined in terms of the lifting of a weight against the pull of gravity. The heavier the weight and/or the higher it is lifted, the more work has been done. Power is a measure of how rapidly a standard amount of work is done. For American automobiles, engine power is rated in a unit called "horsepower," invented initially as a way for steam engine manufacturers to quantify the working ability of their machines in terms of the most common power source of their day: horses. One horsepower is defined in British units as 550 ft-lbs of work per second of time. The power of a car's engine won't indicate how tall of a hill it can climb or how much weight it can tow, but it will indicate how fast it can climb a specific hill or tow a specific weight. The power of a mechanical engine is a function of both the engine's speed and its torque provided at the output shaft. Speed of an engine's output shaft is measured in revolutions per minute, or RPM. Torque is the amount of twisting force produced by the engine, and it is usually measured in pound-feet, or lb-ft (not to be confused with foot-pounds or ft-lbs, which is the unit for work). Neither speed nor torque alone is a measure of an engine's power. A 100 horsepower diesel tractor engine will turn relatively slowly, but provide great amounts of torque. A 100 horsepower motorcycle engine will turn very fast, but provide relatively little torque. Both will produce 100 horsepower, but at different speeds and different torques. 
The equation for shaft horsepower is simple: #align(center)[ #box[ $ "Horsepower" = (2 pi S T) / (33,000) $ / S: shaft speed in rpm / T: shaft torque in lb-ft ] ] Notice how there are only two variable terms on the right-hand side of the equation, S and T. All the other terms on that side are constant: 2, $pi$, and 33,000 are all constants (they do not change in value). The horsepower varies only with changes in speed and torque, nothing else. We can re-write the equation to show this relationship: #align(center)[ #box[ $ "Horsepower" prop S T $ / $prop$: This symbol means proportional ] ] Because the unit of the "horsepower" doesn't coincide exactly with speed in revolutions per minute multiplied by torque in pound-feet, we can't say that horsepower equals ST. However, they are proportional to one another. As the mathematical product of ST changes, the value for horsepower will change by the same proportion. In electric circuits, power is a function of both voltage and current. Not surprisingly, this relationship bears striking resemblance to the "proportional" horsepower formula above: $ P = I E $ In this case, however, power (P) is exactly equal to current (I) multiplied by voltage (E), rather than merely being proportional to IE. When using this formula, the unit of measurement for power is the _watt_, abbreviated with the letter "W." It must be understood that neither voltage nor current by themselves constitute power. Rather, power is the combination of both voltage and current in a circuit. Remember that voltage is the specific work (or potential energy) per unit charge, while current is the rate at which electric charges move through a conductor. Voltage (specific work) is analogous to the work done in lifting a weight against the pull of gravity. Current (rate) is analogous to the speed at which that weight is lifted. Together as a product (multiplication), voltage (work) and current (rate) constitute power. 
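The proportionality and the exact electrical formula above are easy to check numerically. A minimal Python sketch (the function names and the specific speed/torque figures are mine, chosen to illustrate the text's tractor-vs-motorcycle comparison):

```python
import math

def shaft_horsepower(speed_rpm, torque_lb_ft):
    """Horsepower = (2 * pi * S * T) / 33,000, with S in RPM and T in lb-ft."""
    return (2 * math.pi * speed_rpm * torque_lb_ft) / 33_000

def electric_power_watts(current_amps, voltage_volts):
    """P = I * E: power in watts is exactly current times voltage."""
    return current_amps * voltage_volts

# A slow, high-torque engine and a fast, low-torque engine can deliver
# the same horsepower -- the diesel tractor vs. motorcycle comparison.
print(round(shaft_horsepower(1800, 291.8)))   # about 100 hp (slow shaft, high torque)
print(round(shaft_horsepower(10000, 52.5)))   # about 100 hp (fast shaft, low torque)

# Likewise in a circuit: 2 A at 50 V and 10 A at 10 V both dissipate 100 W.
print(electric_power_watts(2, 50))
print(electric_power_watts(10, 10))
```

Note that `shaft_horsepower` is only proportional to the product S times T (the 2, pi, and 33,000 are constants), while `electric_power_watts` is an exact equality, just as the text states.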
Just as in the case of the diesel tractor engine and the motorcycle engine, a circuit with high voltage and low current may be dissipating the same amount of power as a circuit with low voltage and high current. Neither the amount of voltage alone nor the amount of current alone indicates the amount of power in an electric circuit. In an open circuit, where voltage is present between the terminals of the source and there is zero current, there is zero power dissipated, no matter how great that voltage may be. Since $P = I E$ and $I=0$ and anything multiplied by zero is zero, the power dissipated in any open circuit must be zero. Likewise, if we were to have a short circuit constructed of a loop of superconducting wire (absolutely zero resistance), we could have a condition of current in the loop with zero voltage, and likewise no power would be dissipated. Since $P = I E$ and $E=0$ and anything multiplied by zero is zero, the power dissipated in a superconducting loop must be zero. (We'll be exploring the topic of superconductivity in a later chapter). Whether we measure power in the unit of "horsepower" or the unit of "watt," we're still talking about the same thing: how much work can be done in a given amount of time. The two units are not numerically equal, but they express the same kind of thing. In fact, European automobile manufacturers typically advertise their engine power in terms of kilowatts (kW), or thousands of watts, instead of horsepower! These two units of power are related to each other by a simple conversion formula: $ 1 "Horsepower" = 745.7 "Watts" $ So, our 100 horsepower diesel and motorcycle engines could also be rated as "74570 watt" engines, or more properly, as "74.57 kilowatt" engines. In European engineering specifications, this rating would be the norm rather than the exception. #core.review[ - Power is the measure of how much work can be done in a given amount of time. 
- Mechanical power is commonly measured (in America) in "horsepower." - Electrical power is almost always measured in "watts," and it can be calculated by the formula P = IE. - Electrical power is a product of both voltage and current, not either one separately. - Horsepower and watts are merely two different units for describing the same kind of physical measurement, with 1 horsepower equaling 745.7 watts. ]
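The horsepower-to-watt conversion in this section can likewise be sanity-checked in code. A small Python sketch (the helper names are my own, not from the text):

```python
WATTS_PER_HORSEPOWER = 745.7  # 1 horsepower = 745.7 watts

def hp_to_watts(hp):
    """Convert mechanical horsepower to watts."""
    return hp * WATTS_PER_HORSEPOWER

def watts_to_hp(watts):
    """Convert watts to mechanical horsepower."""
    return watts / WATTS_PER_HORSEPOWER

# The chapter's 100 horsepower engines rate as 74,570 W, i.e. 74.57 kW.
print(round(hp_to_watts(100), 1))          # 74570.0 watts
print(round(hp_to_watts(100) / 1000, 2))   # 74.57 kW
```

Both units describe the same physical quantity (work per unit time), so the conversion is a simple scaling in either direction.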
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/columns_03.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test the expansion behaviour. #set page(height: 2.5cm, width: 7.05cm) #rect(inset: 6pt, columns(2, [ ABC \ BCD #colbreak() DEF ]))
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/037%20-%20Ravnica%20Allegiance/013_The%20Gathering%20Storm%3A%20Chapter%2019.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "The Gathering Storm: Chapter 19", set_name: "Ravnica Allegiance", story_date: datetime(day: 16, month: 10, year: 2019), author: "<NAME>", doc ) The Beacon Tower was on the corner of an otherwise nondescript residential block, close enough to New Prahv that most of the residents were clerks and functionaries who worked in that massive complex. On an ordinary day, Ral might have seen vendors selling street food or newspapers, carriages transporting the wealthier residents, and a sea of pedestrians huddled under umbrellas against the autumn rains. The tower itself had been used by a nearby scriptorium as extra storage space until Ral’s team had displaced their boxes of paper and replaced them with crystals and mizzium wire. Today, of course, was anything but normal. By the time Ral reached the rendezvous, a block away from the tower, the battle between dragons was in full swing. Whatever attack Niv-Mizzet had unleashed against Bolas—a light so bright it had hurt the eyes, even through closed eyelids—had raised an enormous cloud of dust and debris, obscuring the horizon in that direction. The flashes and crackles of magic indicated the conflict was ongoing. In the rest of the district, everyone seemed to have gone a little mad. Most ordinary citizens had bolted for their cellars, ignorant of the true importance of what was happening. Others had taken to the streets in mobs, demanding answers from the guilds or clashing with whoever they thought was the enemy. There were brawls and looting, all the more so because the Azorius forces who would normally be deployed against such chaos seemed to have completely vanished. Boros Legion troops were deployed to push back against the panic, but they were spread thin, and the Gruul were making a bad situation worse. Bands of rampaging anarchs had boiled out of the rubble belts, attacking the Boros posts or slipping past to wreak havoc in the city. 
The rest of the guilds had tightened their defenses in response, leaving the city a collection of armed camps, while everyone watched the dragons slug it out and tried to imagine what might come of it. #emph[What would they think if they knew it was just a distraction?] <NAME>, hurling himself against the invader <NAME>, all to buy time for Ral to climb a few stories and press a few keys on an incomprehensibly complex machine. #emph[I wouldn’t believe it either.] But <NAME> clearly understood, or at least had received detailed instructions. Ral, standing in an alley between a cake shop and a haberdasher, peered around the corner and grimaced. The Azorius troops that were missing from the rest of the Tenth District were present in force here. Squads of arresters manned makeshift barricades all through the streets near the tower, hundreds of them, backed up by hussars on horseback and a swarm of thopters hovering overhead. #emph[This] , Ral thought, #emph[is not going to be easy.] On the horizon, there was a flash of light, followed a few seconds later by a dull boom fading to a roll of thunder. Ral glanced up at the clouds overhead, but though dark and heavy they showed no inclination to add any natural pyrotechnics to the draconic maelstrom. He looked over his shoulder, found the alley still empty, and sourly went back to surveying the defenses. "Hey," Kaya said, behind him. Ral restrained himself with an effort. "Sneaking up on people is bad manners at the best of times. Right now it’s a good way to get yourself electrocuted." "Sorry," she said. "Force of habit. I got your note." "That’s something, anyway." Ral straightened and turned. Kaya was dressed in the practical fighting outfit he’d first encountered her in, without any Orzhov regalia, and her plain daggers hung at her sides. "Take a look and tell me what you think." She leaned out into the street briefly, and gave a low whistle. "That’s a lot of swords. What’s in that tower, anyway?" 
"Maybe our last chance." Ral glanced at the seething, flashing mass of smoke and cloud that was the ongoing battle. "I’m open to suggestions." "Is it something we can steal?" Kaya said. "I could get into the tower through the buildings behind it, I think." "Unfortunately, I have to get in there myself," Ral said. "And I’m sure there are guards in there, too." "Then we have a problem," Kaya said. "Any chance of reinforcements from Orzhov?" "I brought everything I could pry loose at short notice," Kaya said. She pointed upward, and Ral craned his neck. The rooftops over the alley were thick with ugly, misshapen stone faces. #emph[Gargoyles.] "They’re quiet, and they obey orders. Anything else means having a debate with the hierophants, which I didn’t think we had time for. Tomik said he would do his best." Ral felt a pang, which he suppressed ruthlessly. #emph[Time for that later.] "If they could cause enough confusion," he said, "we might be able to make it to the front doors, but—" "My mates!" A set of rapidly-approaching footsteps resolved in Hekara, moving at speed. Then Ral found himself being #emph[hugged] , which was not an experience he was particularly interested in. He put one hand on Hekara’s forehead to pry her away, and she happily transferred herself to Kaya, who bore the embrace with better humor. "I see you got my note too," Ral said. "Yup!" Hekara let go of Kaya and turned back to him, beaming. "I was down there waiting on His Flamingosity when it turned up, and he told me to go help you out. So here I am!" Hekara had been curiously absent the last few days, given her usual reluctance to leave Ral’s side. Ral had refused to worry about her. #emph[Worry more about whoever she happens to.] Still, he had to admit it was a relief to have her back under his eye. "I heard we’ve got some smashing to do," Hekara said. "All those iron-brains over there, right?" "More or less," Ral said. "We need to get to the tower. Do you have any ideas?" 
"I have lots of ideas!" Hekara said. "Did I tell you the one about the funny duck who wears pants?" "#emph[Relevant] ideas," Ral amended, exchanging a glance with Kaya, who looked more amused than he was. "Maaaaybe," Hekara said. "A relevant is one of those big gray things with the ears like a giant loxodon, right?" "Hekara," Kaya said gently, as Ral ground his teeth. "How do we get past the guards?" "Oh!" she said. "That. Just wait a minute." She cupped a hand to her ear, listening, and in the stillness Ral heard a few discordant notes. "I brought some friends." #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) At first it was just a sort of wheezing, gasping noise, as though someone were playing the accordion. As it grew, it became clear that someone #emph[was] playing the accordion, and that they weren’t very good at it. Then, as the level of sound grew louder, the listener perceived that not only was the accordion player not particularly skilled, but the instrument itself seemed to be mortally wounded, discharging great #emph[blats] of sound at semi-random intervals. It was quickly joined by a chorus of brass trumpets, no two in tune, and a phalanx of drummers, none of whom had shared their ideas concerning what the beat should be. It was, in short, a cacophony, but a very deliberate one, a wall of discordant noise that somehow combined to produce a weird, lurching melody. It was captivating in its awfulness, swelling and falling, nearly coming together and then collapsing back into its component parts. A tiny man came around the corner, barely four feet high, dressed in an outlandish gold suit and juggling for all he was worth. A whirling galaxy of balls filled the air above him, interspersed with knives, axes, and rolling pins, and his hands were a blur as he expertly caught these objects and flung them back into the air on long, looping trajectories that were somehow in time with the tuneless music playing behind him. 
The juggler was followed by a pair of tumbling girls in huge metal hoops, which wobbled down the street like spinning coins, their spangled occupants upside-down half the time. Behind them came a rank of drummers, six abreast, with three more standing on their shoulders. Behind #emph[that] came a platform nearly as wide as the street and as long as several carriages. It was carried by a row of large, burly creatures on each side, ogres, minotaurs, and any other species tall and broad enough, all decorated in glittering red and black and adorned with gold and silver ornaments. Atop the platform, a pair of goblins capered with the foreshadowed accordion, which it turned out was not so much damaged as heavily and inexpertly modified with a huge tube and extra set of bellows. Trumpet players in motley strolled in a circle around the moving stage, periodically turning to reverse direction with much comedic stumbling and whacking one another with their ungainly instruments. More jugglers dodged through the fray, tossing unlikely things to one another and slinging insults at the trumpeters as they nearly tripped them up. More performers flanked the stage, jumping and tumbling, whirling long silk scarves, and blowing long gouts of fire into the air. Another rank of drummers brought up the rear, all of them larger creatures carrying deeper bass drums, providing a pounding underbeat. The heavy footfalls of the stage bearers merged with the deep booms to sound like an army on the march. "What," Ral said, raising his voice to be heard, "is that supposed to be, exactly?" "Master Panjandrum’s Extraordinary Carnival of Delights!" Hekara said, bouncing excitedly. "His Rakkness told them to come give us a hand. Aren’t they great?" "They’re certainly #emph[loud] ," Ral said, as the stage went by. "Sorry," Kaya said, watching a provocatively costumed woman bend in an unlikely direction and blow kisses made of colored smoke. "I don’t think I’m keeping up. These are our reinforcements? 
A circus?" "With Rakdos, a circus is never just a circus," Ral said. "Come on, let’s stay close. Can your gargoyles deal with the thopters?" Kaya nodded and shouted something up to the rooftops. A moment later, the flock of gargoyles took flight, circling the tower. "They’ll wait for us to start," Kaya said, jogging to keep up with Ral. "Whatever it is we’re doing." "Just watch," Ral said. "And get ready to run." Hekara bounced along beside him, clapping her hands completely out of time with the music. The Azorius soldiers, arrayed behind their barricades, could scarcely have missed the approach of the moving stage and its phalanx of performers. Apparently, though, they weren’t clear in their own mind what to do about it, because there was a great deal of running about and consultation before an officer hurried down the street, waving his arms. "Gentlemen!" he shouted. "This area is under the direct control of the Senate, in accordance with Resolution 3842, concerning emergencies and appropriate conduct. Furthermore, your . . . entertainment ought to have been registered in advance with the Bureau of Street Use, and all relevant officials would have been notified. I’m afraid I’m going to have to ask you to disperse." "Oh, dear." A man Ral hadn’t noticed before unfolded himself from the front of the stage. "Unfolded" was exactly the right word—Ral had never seen a human so elongated. He was head and shoulders taller than Ral himself, but skeletally thin, with limbs that looked like dry sticks. A formal suit hung off him as though it were on a washing line, looking ridiculous, and a too-small hat sat absurdly on his bulging skull. His face was painted dead white, with lips and eyes outlined in brilliant crimson. "That’s Master Panjandrum," Hekara confided. Master Panjandrum stepped off the edge of the stage, foot coming down smoothly on the back of a tumbler who contorted herself to make a stool. Even at ground level, Panjandrum towered over the Azorius officer. 
Beside him, the little juggler was still in full swing, miscellaneous objects whirling above him in an endless loop. "It’s really too bad," the circus master said. "The lads will be so disappointed. What do you think, lads?" He raised his voice. "They say we have to go home!" The music came to an abrupt, discordant halt. The drummers stopped, the trumpeters froze mid-note, and the accordion went quiet after one last discordant #emph[blat] . There was a moment of silence, and then a hundred voices shouted in chorus. "#emph[The show must go on!] " "Well," Panjandrum said, as the music started up again. "There you have it." "W—what?" The arrester narrowed his eyes. "Now see here—" Then he stopped, because one of the objects from the juggler’s whirling collection—a metal ball about the size of a fist—had gotten away from him and fallen from a considerable height to land square on the officer’s head. The man toppled bonelessly to the cobbles. "Oops," Panjandrum said. His painted smile drew up into a huge grin. Hekara, still bouncing, elbowed Ral in the ribs. "This is where it gets good." "Captain!" someone shouted, from back in the Azorius ranks. A uniformed woman rose from cover, stepping forward, only to fall back with a butcher knife embedded in her eye. The little jester became a blur, objects spinning out of his hands into the ranks of Azorius troops. Knives, plates whose rims turned out to be sharp as razors, beanbags that burst into swarms of tiny silver darts, and even more unlikely weapons rained down. As one, the front rank of drummers smashed their instruments over their knees, revealing long, bladed whips stored inside. Those standing on the shoulders of the others jumped down, their new weapons swinging in wide, deadly arcs. Behind them, the trumpeters raised their instruments to their shoulders and pulled hidden triggers, causing them to spit steel-headed crossbow bolts. "Fire!" someone shouted, at the base of the tower. "Return fire!" 
Crossbows #emph[zinged] , sending a rain of quarrels into the travelling circus and sending performers crashing to the cobblestones. One man, struck in the midst of spitting fire, exploded in a spectacular ball of flame. A bolt hit one of the tumbling women as she hurtled through the air, and she spun with its momentum, executing a perfect landing with arms outstretched before taking a long bow and then falling over dead. "Everyone’s a critic," <NAME> muttered, ducking amidst the hail of fire. "Show ’em what we do to critics, lads!" The Rakdos performers gave a roar and surged forward, letting the portable stage fall to the ground. Azorius arresters rose from cover to meet them, swords drawn, and battle was joined. Kaya looked on in disbelief, then turned to Hekara. "Are all your circuses like this?" "Not #emph[all] ," Hekara said, pondering. "Sometimes they have tigers!" "Remind me to skip that one," Kaya said, drawing her daggers. Overhead, gargoyles swooped and dove, tangling with the hovering thopters. "Shall we?" #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) The square was chaos. Tumbling acrobats with bladed fingers slashed and spun, jugglers hurled their weapons, and a squad of ogres in clown makeup laid into the Azorius troopers in a way that suggested they were not at all amused. A group of hussars charged, sabers slashing. One of them cut a juggler’s head clean off, only for the decapitated performer to pop back up a moment later and reveal himself to be two goblins in a long overcoat. Ral, Kaya, and Hekara worked their way through the press, heading for the front door of the tower. For the most part, the Azorius troops ignored them in favor of more obvious threats. Hekara capered delightedly, conjuring her long, thin blades out of nothing and hurling them in every direction, finding eyes, throats, and gaps in armor. 
Kaya took the lead, daggers out, and when an arrester came at her she simply let him pass through in a burst of purple light, then planted a knife in his back while he tried to work out what happened. Bits of thopter were raining down, gears and smashed crystals dropping steadily around them. The flying machines fought back, with spinning drills and electric sparks, and the occasional broken gargoyle fell as well, breaking apart and coming down as a rain of gravel. Ral looked up, not at the aerial melee, but at the flashes and glows in the clouds farther off, trying to gauge how that much larger fight was going. It was impossible to see anything from here, other than that it was still in progress. #emph[We’re not out of time yet.] In front of the tower door, a rank of disciplined arresters with heavy shields stood in front of a robed mage, who shouted commands that mostly went unheard. They caught sight of Ral and the others, and raised their shields in time to deflect a rain of knives from Hekara. "Out of the way," Ral snarled, raising his hands. Lightning crackled and spat from his fingers. "This tower is off limits," the mage shouted, raising his hands. White light rose around him in neat concentric circles. "You are forbidden to cross the threshold, by order of the Senate!" Glyphs glowed and spun, giving the lawmage’s words the force of magic. Ral sent a bolt of lightning at him, but it broke against the ward. He set his jaw. "We haven’t got time for this," he said. "Hekara, can one of your friend—" "I’ll handle it," Kaya said. "Hold their attention." "Right!" Hekara said. She capered forward, summoning more blades. Kaya took a deep breath, then sank into the earth with a purple flash. Ral shrugged, and sent another bolt at the lawmage. The man twisted his hands, reinforcing his ward. Another bolt, and another, achieved just as little, and Ral saw the mage’s confidence growing. He gestured his soldiers forward. 
Ral was the only one who saw Kaya step out of the ground, gasping for breath. Before the mage knew she was there, she was reaching around him, bringing her dagger across his throat. He fell in a tide of blood, and the spell shivered and vanished. Ral raised his hands and felt power flowing from his accumulator, gathering for a moment in his gauntlets before leaping out to play across the entire rank of Azorius soldiers. They collapsed like dominoes, and Ral and Hekara hopped lightly over the line of armored bodies. The door rattled when Ral tried it, but didn’t move. He frowned, and glanced over his shoulder. The square was still full of Rakdos performers locked in combat with Azorius troops, but that wouldn’t last forever—reinforcements were almost certainly on the way. "Stand back," he said, raising his hands. "Hang on a minute," Kaya said. She stepped up to the door, stuck her arm straight through it, and fumbled around for a moment. A heavy #emph[thump] indicated she’d dislodged the bar from the other side, and she pulled it open. "Much easier." "That’s handy!" Hekara said. "Hey, what would happen if you put your head through, right, and then someone tried to open the door, and—" "I try not to think about it." Kaya stepped into the dark space beyond. "This place looks empty." "The beacon is at the top," Ral said. He gestured Hekara inside, and closed and barred the door behind them. The tower was, in fact, largely empty, with a single broad staircase winding around its outer rim. It had once possessed more internal floors—the stone supports for the wooden floorboards were still there—but the Izzet engineers had ripped them out to make it easier to lift components up to the top with cranes. Looking straight up, Ral could see the underside of a complicated mass of machinery, interlocking gears, great hanging loops of mizzium cable, and crackling crystal accumulators. "I mean, I would have thought they’d have guards in here too," Kaya said. "If it’s so important." 
"They may be waiting for us at the top as well," Ral said. "Be careful." "We gotta walk? Didn’t you say there was one of those lifter things?" Hekara said. "It was more of a catapult, if I recall correctly," Ral said. "I think they took it back to Nivix when they finished the work." "Awww," Hekara said. "That sounds #emph[awesome] ." They started up the stairs, Kaya taking the lead with daggers drawn. Halfway up the first turn, Ral held up a hand, staring at the curving staircase ahead of them. "Something moved," he said, concentrating. "Watch closely." A ball of fizzing electricity appeared above his hand, and Ral blew on it gently. It drifted forward, expanding into a field of power, not strong enough to do anything more than raise the hairs on someone’s skin. But it #emph[did] outline everything in front of them with a brief crackling aura—the walls, the stairs, a loop of hanging coil— —and a dozen strange, spindly, six-legged #emph[things] . Kaya tensed as the creatures stood up. They weren’t invisible, exactly, just expertly camouflaged, their flat metal surfaces shifting color and hue to blend in with the stone behind them. They had long, asymmetrical bodies, with lean, stilt-like legs that twisted and hooked weirdly. "Here are your guards," Ral said. "What are they?" Hekara said. "Constructs," Ral muttered. Hekara cocked her head. "I thought those were all big and covered in gears. These are sort of cute." "These are Tezzeret’s creations," Ral said. "Whatever they are," Kaya said, "we have to get past them, don’t we?" She dropped into a crouch. "Let’s get on with it." The leading construct came forward, legs clicking on the stone. Kaya ran at it, daggers extended, and it raised a limb to spear her on the needle-like point. By the time the blow came, though, she was gone, twisting sideways and phasing through one of the thing’s other limbs to attack the next machine in line. 
Her daggers plunged into its side, points slipping through its steel skin with a flare of purple light to wreak havoc on its interior workings. Hekara put on the mad grin she reserved for hurting people or breaking things and conjured a brace of daggers from the air. The first pair simply bounced off the construct’s tough outer covering, so the razorwitch created another pair, sharp and narrow as ice picks, and darted forward. She ducked under the lead construct, stabbing upward and driving her weapons into its belly. Ral followed her example. A lightning bolt would just slide over the things’ metal skins, so he concentrated his energy in his gauntlets, holding a ball of plasma above his palm until it glowed white-hot. When a construct lurched toward him, he dodged its slash and slammed the concentrated energy against it. What passed for its head burst apart in a shower of superheated metal, and the thing stumbled drunkenly sideways off the stair, hitting the floor of the tower far below and bursting into a mass of twisted metal. Up ahead, Kaya was dismantling another machine with her daggers, and Hekara was keeping one occupied by punching it full of tiny holes. When her picks broke in her hands, she simply summoned new ones and kept at it, staying away from the construct’s counter-strokes with contemptuous ease. Ral came in from behind and fried the thing with a touch, leaping over its collapsing body to intercept another before it could skewer Hekara from behind. Another construct fell off the stairs, four of its six legs detached already. In a few more moments of frantic action, the way was clear. "Nothing like a good fight with your mates, yeah?" Hekara looked at her two companions with a broad grin. "It can certainly be invigorating," Kaya allowed, with a small smile. "That can’t be the last of them," Ral said. He looked up and shook his head. "Something’s waiting for us up there." "Then we’ll take ’em out, too!" Hekara said. "Come on." 
#v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) The last curve of the stairs was within the machinery of the beacon, so they were flanked by banks of coils and accumulators, control panels and hanging loops of wire. None of it #emph[looked] damaged—Ral knew the core of the beacon was well protected, but he still worried Bolas’s forces might have attempted to disable it. #emph[Apparently not. Their attention must be elsewhere.] Where the stone tower ended, the stairs emerged onto a flat deck that formed the base of a broad copper dome. The machinery of the beacon was concentrated in the center of the room, arrayed around a single vast resonating crystal. That was the heart of the thing, the technology Ral had salvaged from Project Lightning Bug, vastly scaled up and inverted. When the proper current was applied, it would become a blazing torch, shining forth into the Multiverse. At least in theory, #emph[every] Planeswalker would be able to see it, and find their way to Ravnica. In front of the core was a keyboard, like a piano’s, complete with ivory keys. #emph[The security lockout.] That was the final safeguard. If Ral keyed in the sequence he’d chosen, what felt like a hundred years ago, then the beacon would activate. And that activation was designed to be irrevocable—it would burn until its fuel ran out. #emph[Almost there.] Unfortunately, the rest of the room wasn’t empty. Steel pillars stood at regular intervals, supporting a complicated mesh of wires, conduits, and elaborate gear-trains. Some of the equipment passed through grates in the floor to connect with things down below; other wires reached for the walls, exiting through other grates that gave a dim view of the darkened sky outside. In the midst of this jungle, between them and the security keyboard, two women stood side by side. 
Lavinia had traded her hooded cloak for a bright set of blue and gold Azorius armor, and stood with one hand on her sword. Beside her, Vraska was all in black, her tendrils already writhing. The gorgon looked over the three of them with a toothy, contemptuous smile. "Well," she said, "it took you long enough."
https://github.com/frectonz/the-pg-book
https://raw.githubusercontent.com/frectonz/the-pg-book/main/book/078.%20head.html.typ
typst
Holding a Program in One's Head

August 2007

A good programmer working intensively on his own code can hold it in his mind the way a mathematician holds a problem he's working on. Mathematicians don't answer questions by working them out on paper the way schoolchildren are taught to. They do more in their heads: they try to understand a problem space well enough that they can walk around it the way you can walk around the memory of the house you grew up in. At its best programming is the same. You hold the whole program in your head, and you can manipulate it at will.

That's particularly valuable at the start of a project, because initially the most important thing is to be able to change what you're doing. Not just to solve the problem in a different way, but to change the problem you're solving.

Your code is your understanding of the problem you're exploring. So it's only when you have your code in your head that you really understand the problem.

It's not easy to get a program into your head. If you leave a project for a few months, it can take days to really understand it again when you return to it. Even when you're actively working on a program it can take half an hour to load into your head when you start work each day. And that's in the best case. Ordinary programmers working in typical office conditions never enter this mode. Or to put it more dramatically, ordinary programmers working in typical office conditions never really understand the problems they're solving.

Even the best programmers don't always have the whole program they're working on loaded into their heads. But there are things you can do to help:

Avoid distractions. Distractions are bad for many types of work, but especially bad for programming, because programmers tend to operate at the limit of the detail they can handle.

The danger of a distraction depends not on how long it is, but on how much it scrambles your brain. 
A programmer can leave the office and go and get a sandwich without losing the code in his head. But the wrong kind of interruption can wipe your brain in 30 seconds.

Oddly enough, scheduled distractions may be worse than unscheduled ones. If you know you have a meeting in an hour, you don't even start working on something hard.

Work in long stretches. Since there's a fixed cost each time you start working on a program, it's more efficient to work in a few long sessions than many short ones. There will of course come a point where you get stupid because you're tired. This varies from person to person. I've heard of people hacking for 36 hours straight, but the most I've ever been able to manage is about 18, and I work best in chunks of no more than 12.

The optimum is not the limit you can physically endure. There's an advantage as well as a cost of breaking up a project. Sometimes when you return to a problem after a rest, you find your unconscious mind has left an answer waiting for you.

Use succinct languages. More powerful programming languages make programs shorter. And programmers seem to think of programs at least partially in the language they're using to write them. The more succinct the language, the shorter the program, and the easier it is to load and keep in your head.

You can magnify the effect of a powerful language by using a style called bottom-up programming, where you write programs in multiple layers, the lower ones acting as programming languages for those above. If you do this right, you only have to keep the topmost layer in your head.

Keep rewriting your program. Rewriting a program often yields a cleaner design. But it would have advantages even if it didn't: you have to understand a program completely to rewrite it, so there is no better way to get one loaded into your head.

Write rereadable code. All programmers know it's good to write readable code. But you yourself are the most important reader.
Especially in the beginning; a prototype is a conversation with yourself. And when writing for yourself you have different priorities. If you're writing for other people, you may not want to make code too dense. Some parts of a program may be easiest to read if you spread things out, like an introductory textbook. Whereas if you're writing code to make it easy to reload into your head, it may be best to go for brevity.

Work in small groups. When you manipulate a program in your head, your vision tends to stop at the edge of the code you own. Other parts you don't understand as well, and more importantly, can't take liberties with. So the smaller the number of programmers, the more completely a project can mutate. If there's just one programmer, as there often is at first, you can do all-encompassing redesigns.

Don't have multiple people editing the same piece of code. You never understand other people's code as well as your own. No matter how thoroughly you've read it, you've only read it, not written it. So if a piece of code is written by multiple authors, none of them understand it as well as a single author would.

And of course you can't safely redesign something other people are working on. It's not just that you'd have to ask permission. You don't even let yourself think of such things. Redesigning code with several authors is like changing laws; redesigning code you alone control is like seeing the other interpretation of an ambiguous image.

If you want to put several people to work on a project, divide it into components and give each to one person.

Start small. A program gets easier to hold in your head as you become familiar with it. You can start to treat parts as black boxes once you feel confident you've fully explored them. But when you first start working on a project, you're forced to see everything. If you start with too big a problem, you may never quite be able to encompass it.
So if you need to write a big, complex program, the best way to begin may not be to write a spec for it, but to write a prototype that solves a subset of the problem. Whatever the advantages of planning, they're often outweighed by the advantages of being able to keep a program in your head.

It's striking how often programmers manage to hit all eight points by accident. Someone has an idea for a new project, but because it's not officially sanctioned, he has to do it in off hours—which turn out to be more productive because there are no distractions. Driven by his enthusiasm for the new project he works on it for many hours at a stretch. Because it's initially just an experiment, instead of a "production" language he uses a mere "scripting" language—which is in fact far more powerful. He completely rewrites the program several times; that wouldn't be justifiable for an official project, but this is a labor of love and he wants it to be perfect. And since no one is going to see it except him, he omits any comments except the note-to-self variety. He works in a small group perforce, because he either hasn't told anyone else about the idea yet, or it seems so unpromising that no one else is allowed to work on it. Even if there is a group, they couldn't have multiple people editing the same code, because it changes too fast for that to be possible. And the project starts small because the idea is small at first; he just has some cool hack he wants to try out.

Even more striking are the number of officially sanctioned projects that manage to do all eight things wrong. In fact, if you look at the way software gets written in most organizations, it's almost as if they were deliberately trying to do things wrong. In a sense, they are. One of the defining qualities of organizations since there have been such a thing is to treat individuals as interchangeable parts. This works well for more parallelizable tasks, like fighting wars.
For most of history a well-drilled army of professional soldiers could be counted on to beat an army of individual warriors, no matter how valorous. But having ideas is not very parallelizable. And that's what programs are: ideas.

It's not merely true that organizations dislike the idea of depending on individual genius, it's a tautology. It's part of the definition of an organization not to. Of our current concept of an organization, at least.

Maybe we could define a new kind of organization that combined the efforts of individuals without requiring them to be interchangeable. Arguably a market is such a form of organization, though it may be more accurate to describe a market as a degenerate case—as what you get by default when organization isn't possible.

Probably the best we'll do is some kind of hack, like making the programming parts of an organization work differently from the rest. Perhaps the optimal solution is for big companies not even to try to develop ideas in house, but simply to buy them. But regardless of what the solution turns out to be, the first step is to realize there's a problem. There is a contradiction in the very phrase "software company." The two words are pulling in opposite directions. Any good programmer in a large organization is going to be at odds with it, because organizations are designed to prevent what programmers strive for.

Good programmers manage to get a lot done anyway. But often it requires practically an act of rebellion against the organizations that employ them. Perhaps it will help if more people understand that the way programmers behave is driven by the demands of the work they do. It's not because they're irresponsible that they work in long binges during which they blow off all other obligations, plunge straight into programming instead of writing specs first, and rewrite code that already works.
It's not because they're unfriendly that they prefer to work alone, or growl at people who pop their head in the door to say hello. This apparently random collection of annoying habits has a single explanation: the power of holding a program in one's head.

Whether or not understanding this can help large organizations, it can certainly help their competitors. The weakest point in big companies is that they don't let individual programmers do great work. So if you're a little startup, this is the place to attack them. Take on the kind of problems that have to be solved in one big brain.

Thanks to <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, and <NAME> for reading drafts of this.
https://github.com/RiccardoTonioloDev/Bachelor-Thesis
https://raw.githubusercontent.com/RiccardoTonioloDev/Bachelor-Thesis/main/chapters/xinet.typ
typst
Other
#pagebreak(to: "odd")

= XiNet <ch:xinet>

This chapter discusses XiNet @xinet, a deep, parameterized convolutional neural network aimed at energy efficiency for _computer vision_ tasks. The _paper_ first presents XiConv, a parameterized convolutional block that combines alternative techniques to improve energy efficiency over a traditional convolution, and then XiNet, a neural network that combines XiConv blocks to obtain the best possible results. Accordingly, the following sections cover:
- @arc:xinet: The architecture of the XiConv block and of the XiNet network;
- @val:xinet: The validation of the results reported in the paper.

== Architecture <arc:xinet>

The XiConv convolutional block is called parametric because two parameters can be set on it:
- $alpha$: the channel-reduction coefficient. For example, if a XiConv is used with $C_"in"=16$ and $C_"out"=32$ and $alpha$ is set to 0.4, the actual accepted input and output channels will be $C_"in"=floor(alpha 16) = 6$ and $C_"out"=floor(alpha 32)= 12$ respectively;
- $gamma$: the compression coefficient. This is because the main convolution performed by the XiConv block is actually split into two steps:
  + Compress the number of channels of the _input_ from $alpha C_"in"$ to $(alpha C_"in") / gamma$ via a _pointwise_ convolution (i.e. with a @kernel of size $1times 1$);
  + Then apply the main convolution, with a $3times 3$ @kernel, which brings the number of channels from $(alpha C_"in") / gamma$ to $alpha C_"out"$.
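The channel arithmetic of the two steps above can be sketched in a few lines of Python. This is purely illustrative (the function name is hypothetical and not part of the paper's code), assuming an integer compression coefficient:

```python
import math

def xiconv_channels(c_in: int, c_out: int, alpha: float, gamma: int):
    """Channel counts of a XiConv block (illustrative sketch).

    alpha scales the nominal input/output channels; gamma compresses the
    channels before the main 3x3 convolution.
    """
    eff_in = math.floor(alpha * c_in)    # channels after alpha reduction
    eff_out = math.floor(alpha * c_out)
    compressed = eff_in // gamma         # channels after the 1x1 compression
    return eff_in, compressed, eff_out

# Example from the text: C_in=16, C_out=32, alpha=0.4 (gamma=2 is assumed)
print(xiconv_channels(16, 32, 0.4, 2))  # -> (6, 3, 12)
```

With these values the main $3 times 3$ convolution only operates on 3 channels instead of 16, which is where the energy saving comes from.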
#block([
Between the compression convolution and the main one there is a tensor addition between the _output_ of the compression convolution and the original _input_ passed to the network, which is _broadcast_ and properly resized in its channels (via a _pointwise_ convolution) and in its spatial dimensions (via an adaptive average _pooling_) by the following processing block:
#figure(image("../images/architectures/XiNet-broadcast.drawio.png",width:200pt),caption: [Architecture of the processing block for the _broadcasted input_ in @xinet])
],breakable: false,width: 100%)
#linebreak()
Finally, after the main convolution, a mixed attention block is applied, combining channel attention with spatial attention; its _output_ multiplies the _output_ of the main convolution via @hadamard (thus applying the computed attention), yielding the final _output_ of the whole block.
#block([
The architecture is as follows:
#figure(image("../images/architectures/XiNet-XiConv.drawio.png",width:350pt),caption: [Architecture of the XiConv proposed in @xinet])
],breakable: false,width: 100%)
The XiNet network, in turn, is called parametric because the following parameters can be set on it:
- $beta$: the coefficient that controls the trade-off between the number of parameters and the number of operations.
This is because, with the exception of the first XiConv block of the network, the blocks follow the formula below to compute their _output_ channels (and consequently the _input_ channels of the next block): $ C_"out"^i = 4 ceil(alpha 2^(D_i-2)(1+((beta-1)i)/N)C_"out"^0) $ Where:
- $C_"out"^i$: the number of _output_ channels of the $i$-th block;
- $D_i$: the number of times the _input_ has been spatially halved before the $i$-th block;
  - This is because, for each successive pair of XiConv blocks, the first block applies its main convolution with a @stride of 2 (halving the height and width of the _input_ tensor);
  - Two XiConv blocks are always added at the beginning of the network, regardless of the number of XiConv blocks specified, so there is always at least one halving of the dimensions.
- $N$: the number of XiConv blocks used in the network.
#block([
The architecture of the network is therefore the following:
#figure(image("../images/architectures/XiNet.drawio.png",width:250pt),caption: [Architecture of XiNet, composed of $N$ XiConv blocks])
],breakable: false,width: 100%)
As can be seen from the figure, and as mentioned earlier, the _input_ of the network is then _broadcast_ to every XiConv block that composes it.

== Validation <val:xinet>

Unfortunately, no _benchmark_ is reported for performance on the _CIFAR10_ _dataset_; a comparison with the state of the art can nevertheless be made to see how far the network deviates from it.
The result was therefore verified by performing the following steps:
#block([- Clone the _repository_ from @gh with the command: ```bash
git clone https://github.com/micromind-toolkit/micromind.git
```],breakable: false,width: 100%)
#block([- Install the _micromind_ package (the package that provides XiNet) locally, with the following commands: ```bash
# Enter the cloned directory
cd ./micromind/
# Install the micromind package
pip install -e .
# Install additional requirements and dependencies
pip install -r ./recipes/image_classification/extra_requirements.txt
```],breakable: false,width: 100%)
#block([- Run the training with subsequent evaluation using the following commands: ```bash
# Enter the correct folder
cd ./recipes/image_classification/
# Run the training with subsequent evaluation
python train.py cfg/xinet.py
```],breakable: false,width: 100%)
Following the above commands, I obtained an _accuracy_ of 81.44% with $tilde$7.8 million parameters. The model presented in @topcifar tops the leaderboards with an _accuracy_ of 99.61% and 11 million parameters. We can therefore observe how, albeit with $tilde$3 million fewer parameters, XiNet comes close to state-of-the-art results. Of course, the goal of XiNet is not to win on _accuracy_, but to minimize the energy footprint of the model while keeping performance as close as possible to the best-rated models. The results are therefore satisfactory for the purpose of exploring, with this module, possible alternative solutions that improve on PDV1 and PDV2 in the field of @MDE.
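The channel-growth formula from the architecture section can also be evaluated numerically. The following Python sketch is illustrative only (the parameter values are made up, and this is not code from the micromind repository):

```python
import math

def block_out_channels(i, c_out0, alpha, beta, n_blocks, halvings):
    """Output channels of the i-th XiConv block, per the formula in the text.

    halvings is D_i: how many times the input has been spatially halved
    before block i.
    """
    growth = 1 + (beta - 1) * i / n_blocks
    return 4 * math.ceil(alpha * 2 ** (halvings - 2) * growth * c_out0)

# Illustrative values (not taken from the paper): alpha=1.0, beta=2.0,
# with a halving after every second block.
channels = [block_out_channels(i, c_out0=16, alpha=1.0, beta=2.0,
                               n_blocks=8, halvings=1 + i // 2)
            for i in range(8)]
print(channels)
```

The resulting list grows monotonically: both the $beta$-controlled growth term and the halving count $D_i$ increase with block index, so later blocks have more channels.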
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024
https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/README.md
markdown
Creative Commons Attribution Share Alike 4.0 International
# 53E Notebook 2023-2024

[![Build Typst document](https://github.com/Area-53-Robotics/53E-Notebook/actions/workflows/build.yml/badge.svg)](https://github.com/Area-53-Robotics/53E-Notebook/actions/workflows/build.yml) ![GitHub](https://img.shields.io/github/license/Area-53-Robotics/53E-notebook)

Welcome to 53E's notebook for the Over Under season.

## Rendered PDF

You can download the rendered version of this notebook as a PDF under [releases](https://github.com/Area-53-Robotics/53E-Notebook/releases). We will make a release after each tournament we go to.

Alternatively, you can download the latest version, which gets built automatically after every commit. You can find it by going to the ["Actions"](https://github.com/Area-53-Robotics/53E-Notebook/actions) tab on this repository and clicking the latest action run. You will then be able to find a zip file containing the PDF under "Artifacts".

## How to Compile

1. Install the required fonts, located in the `assets/fonts` folder.
2. Install [Typst](https://github.com/typst/typst#installation)
3. Install the [Notebookinator](https://github.com/BattleCh1cken/notebookinator)
4. Clone the repository

   ```sh
   git clone https://github.com/Area-53-Robotics/53E-Notebook.git
   cd 53E-Notebook
   ```

5. Compile the project

   ```sh
   typst compile main.typ
   ```

## Project Structure

- `main.typ`: The entry point for the notebook
- `assets/`: All of the images and fonts that don't belong with a certain entry
- `entries/`: Contains all of the entries in the notebook
- `entries.typ`: Contains a list, which `#includes` all of the entries in the notebook
- `glossary.typ`: All of the glossary entries
- `packages.typ`: All of the external packages used by the notebook

## Making New Entries

To create a new entry you first need to create a new `.typ` file in the `entries/` folder. There are two different ways this project organizes entries: either they are organized by project section (for example flywheel or intake), or they are single entries.
> Words or phrases like this: \<word here\> should be substituted with a different word or phrase.

To create a single entry, create a file with this path: `entries/<entry-name>/entry.typ`. To create an entry in a group, create a file with the following path: `entries/<group-name>/<entry_name>.typ`

Once you've done this, you need to create an entry in the file.

```typ
#import "/packages.typ": notebookinator
#import notebookinator: *
#import themes.radial.components: *

#show: create-body-entry.with(
  title: "<EDP Stage>: <your title here>",
  type: "<EDP Stage>",
  date: datetime(year: 1982, month: 1, day: 1),
)

Write your content here.
```

Make sure to read through the [Notebookinator](https://github.com/BattleCh1cken/notebookinator) documentation to see what components and options you can use while writing entries. You can also look at other existing entries as examples.

Once you're happy with your entry, you'll need to add it to the entry index at `entries/entries.typ`, so that the `main.typ` file is aware of it. Add the file to `entries/entries.typ` like this:

```typ
// Do this if it's a single entry:
#include "./<entry_name>/entry.typ"

// Do this if it's a group:
#include "./<group_name>/<entry_name>.typ"
```

The order that the notebook renders the entries is dependent on the order that they're placed in this file, so make sure to put the `#include` in the correct spot.

## Style Guide

The following rules should be followed when writing entries:

### Tense

All entries should be written in the past tense. For example:

> Today we built the drivetrain.

### Lists

All lists should begin with capitalized letters. For example:

> - Number 1
> - Number 2
> - Number 3

### Numbers

All numbers should be expressed in Arabic numerals rather than spelled out. For example:

> We drank 3 liters of water.

### Pronouns and Names

Use "we" rather than referring to individuals in the squad. For example:

> Today we programmed the PID for the catapult.
If you need to be more specific, refer to the job of the people performing the task. For example:

> The programmers programmed a program.

If pronouns are needed, use the gender-neutral "they/them". For example:

> If the driver needs information, they should notify the drive team.

## Thanks to

- The Typst developers
- The awesome people in the [Typst discord](https://discord.gg/2uDybryKPe)
- [Tablex](https://github.com/PgBiel/typst-tablex/)
- [Material Icons](https://fonts.google.com/icons)
- [Open Colors](https://yeun.github.io/open-color/)

## License

The content of this notebook is licensed under a [Creative Commons Attribution-ShareAlike 4.0 International License][cc-by-sa].

[![CC BY-SA 4.0][cc-by-sa-image]][cc-by-sa]

[cc-by-sa]: http://creativecommons.org/licenses/by-sa/4.0/
[cc-by-sa-image]: https://licensebuttons.net/l/by-sa/4.0/88x31.png
https://github.com/Ngan-Ngoc-Dang-Nguyen/thesis
https://raw.githubusercontent.com/Ngan-Ngoc-Dang-Nguyen/thesis/main/hoi-nghi-Nha-Trang/macros.typ
typst
#import "@preview/gentle-clues:0.7.1": *
// #import "@preview/diagraph:0.2.0": *
// #import "@preview/minitoc:0.1.0": *
// #import "../../src/tybank.typ": *
// #import "@preview/moremath:0.1.0": ind

#show: gentle-clues.with(breakable: true)
#set page(numbering: "1")
#set heading(numbering: "1.1.")
#set text(font: "New Computer Modern")
#set par(justify: true)
#set enum(numbering: "a)")
#set math.equation(numbering: "(1)")
// #set ref(supplement: it => [(#counter("equation"))])

#let title(x) = [#align(center)[#text(size: 1.7em, fill: black)[*#x*]] ]

#let aka = [a.k.a.]
#let eg = [e.g.]
#let ie = [i.e.]
#let etal = [et al.]
#let wrt = [w.r.t.]
#let remLE(x) = text(fill: blue, size: 0.75em)[[*LE:* #x]]
#let fbar = $overline(f)$
#let gbar = $overline(g)$

// math macros
#let innerprod(x) = $angle.l #x angle.r$
#let polar(x) = $#x degree$
#let conj(x) = $#x^*$
#let xopt = $x^star$
#let uopt = $u^star$
#let rowmatrix = $r$
#let IndexNonZero = $I_(!= 0)$
#let IndexNonSaturated = $I_(circle)$
#let PLASSO = $P_"L"$
#let PSVM = $P_"SVM"$
#let RRbar = $overline(RR)$
#let phibar = $overline(phi)$
#let one = $bb(1)$
#let ind(x) = $Iota(#x)$
#let example(x) = [*Example.* _#x _]
#let goal(x) = box(stroke: blue, inset: 0.5em)[*GOAL:* _#x _]

//--------------- eqref
// #let reffmt = it => box(stroke: rgb("#ff0000") + 0.5pt, outset: 2pt)[#smallcaps[#it]]
#let reffmt = it => text(fill: blue)[#it]
#show ref: reffmt
#let linkfmt = it => [#underline(text(blue)[#it])]
#import "eqref.typ": *
#let eqref = eqref.with(style: reffmt) // set defaults
#let customEqfmt = (nums) => [#box[Eq. (#numbering("1.1", ..nums))]]
#let customEqref = eqref.with(fmt: customEqfmt, style: emph) // alternate options
//--------------- eqref
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/chronos/0.1.0/gallery/notes.typ
typst
Apache License 2.0
#import "/src/lib.typ" as chronos: *

#set page(width: auto, height: auto)

#chronos.diagram({
  _par("a", display-name: "Alice")
  _par("b", display-name: "Bob")
  _seq("a", "b", comment: [hello])
  _note("left", [this is a first note])
  _seq("b", "a", comment: [ok])
  _note("right", [this is another note])
  _seq("b", "b", comment: [I am thinking])
  _note("left", [a note\ can also be defined\ on several lines])
})

#pagebreak()

#chronos.diagram({
  _par("a", display-name: "Alice")
  _par("b", display-name: "Bob")
  _note("left", [This is displayed\ left of Alice.], pos: "a", color: rgb("#00FFFF"))
  _note("right", [This is displayed right of Alice.], pos: "a")
  _note("over", [This is displayed over Alice.], pos: "a")
  _note("over", [This is displayed\ over Bob and Alice.], pos: ("a", "b"), color: rgb("#FFAAAA"))
  _note("over", [This is yet another\ example of\ a long note.], pos: ("a", "b"))
})

#pagebreak()

#chronos.diagram({
  _par("caller")
  _par("server")
  _seq("caller", "server", comment: [conReq])
  _note("over", [idle], pos: "caller", shape: "hex")
  _seq("server", "caller", comment: [conConf])
  _note("over", ["r" as rectangle\ "h" as hexagon], pos: "server", shape: "rect")
  _note("over", [this is\ on several\ lines], pos: "server", shape: "rect")
  _note("over", [this is\ on several\ lines], pos: "caller", shape: "hex")
})

#pagebreak()

#chronos.diagram({
  _par("a", display-name: "Alice")
  _par("b", display-name: "Bob")
  _par("c", display-name: "Charlie")
  _seq("a", "b", comment: [m1])
  _seq("b", "c", comment: [m2])
  _note("over", [Old method for note over all part.
with:\ `note over FirstPart, LastPart`.], pos: ("a", "c"))
  _note("across", [New method with:\ `note across`.])
  _seq("b", "a")
  _note("across", [Note across all part.], shape: "hex")
})

#pagebreak()

#chronos.diagram({
  _par("a", display-name: "Alice")
  _par("b", display-name: "Bob")
  _note("over", [initial state of Alice], pos: "a")
  _note("over", [initial state of Bob], pos: "b")
  _seq("b", "a", comment: [hello])
})

#chronos.diagram({
  _par("a", display-name: "Alice")
  _par("b", display-name: "Bob")
  _par("c", display-name: "Charlie")
  _par("d", display-name: "Donald")
  _par("e", display-name: "Eddie")
  _note("over", [initial state of Alice], pos: "a")
  _note("over", [initial state of Bob the builder], pos: "b", aligned: true)
  _note("over", [Note 1], pos: "a")
  _note("over", [Note 2], pos: "b", aligned: true)
  _note("over", [Note 3], pos: "c", aligned: true)
  _seq("a", "d")
  _note("over", [this is an extremely long note], pos: ("d", "e"))
})

#pagebreak()

#chronos.diagram({
  _par("a", display-name: [Alice])
  _par("b", display-name: [The *Famous* Bob])
  _seq("a", "b", comment: [hello #strike([there])])
  _gap()
  _seq("b", "a", comment: [ok])
  _note("left", [
    This is *bold*\
    This is _italics_\
    This is `monospaced`\
    This is #strike([stroked])\
    This is #underline([underlined])\
    This is #underline([waved])\
  ])
  _seq("a", "b", comment: [A _well formatted_ message])
  _note("right", [
    This is #box(text([displayed], size: 18pt), fill: rgb("#5F9EA0"))\
    #underline([left of]) Alice.
], pos: "a")
  _note("left", [
    #underline([This], stroke: red) is #text([displayed], fill: rgb("#118888"))\
    *#text([left of], fill: rgb("#800080")) #strike([Alice], stroke: red) Bob.*
  ], pos: "b")
  _note("over", [
    #underline([This is hosted], stroke: rgb("#FF33FF")) by #box(baseline: 50%, image("gitea.png", width: 1cm, height: 1cm, fit: "contain"))
  ], pos: ("a", "b"))
})

// TODO
/*
#pagebreak()
#chronos.diagram({
  _par("a", display-name: [Alice])
  _par("b", display-name: [Bob])
  _seq("a", "b", comment: [Hello])
  _note("left", [This is a note])
  _seq("[", "a", comment: [Test])
  _note("left", [This is also a note])
})*/

#pagebreak()

#chronos.diagram({
  _seq("Bob", "Alice", comment: [Hello])
  _evt("Other", "create")
})
https://github.com/dyc3/senior-design
https://raw.githubusercontent.com/dyc3/senior-design/main/api-proxy.typ
typst
= REST API Proxy <Chapter::RestApiProxy>

== Forwarding API Requests

When a REST API request is made to OpenTogetherTube, the request is first received by the load balancer, which acts as the reverse proxy for API requests. The load balancer then selects one of the OTT monoliths based on the current load-balancing algorithm and forwards the request to it. The monolith processes the request and sends a response back to the load balancer, which returns it to the client that made the original request. If the request requires interacting with a specific room, the load balancer forwards the request to the monolith that has the room.

@Figure::activity-http-proxy shows how HTTP requests must be processed by the Balancer.

#figure(
  image("figures/http-req-flow.svg", width: 60%),
  caption: "Flowchart for the flow of HTTP requests as they are processed by the system."
) <Figure::http-req-flow>

#figure(
  image("figures/activity-http-proxy.svg", width: 60%),
  caption: "Activity diagram for how HTTP requests must be processed by the Balancer."
) <Figure::activity-http-proxy>

#figure(
  image("figures/create-endpoint-activity.svg", width: 40%),
  caption: "Activity diagram for how invalid parameters are handled in the Create endpoint."
) <Figure::create-endpoint-activity>

== Forwarding Requests Credentials

The OpenTogetherTube platform uses tokens to authenticate and authorize users. When a user logs in, the server generates a token that is stored in the browser's local storage and included in all subsequent requests made by the user. The server verifies the token to ensure that the user is authenticated and has permission to access the requested video stream. The token is used to look up the session information in Redis, which allows the server to retrieve the user's permissions and session state.

== OpenAPI Endpoints Route Specification

@tbl:endpoints shows each API endpoint and the route it will lead to.
Additionally, it describes the function of each endpoint.

#{
  let api = yaml("api.yaml")
  let rows = ()
  for (path, methods) in api.paths {
    for (method, endpoint) in methods {
      if method == "parameters" { continue }
      let category = endpoint.at("tags", default: ("other",)).at(0)
      let description = endpoint.at("summary", default: "No summary")
      let row = (category, path, upper(method), description)
      rows = rows + (row,)
    }
  }
  figure(
    table(
      columns: (auto, auto, auto, auto),
      inset: 5pt,
      align: horizon,
      [Category], [Endpoint], [Methods], [Description],
      ..rows.flatten()
    ),
    caption: "API Endpoints"
  )
} <tbl:endpoints>

The routing between categories can be determined based on the relationships between the endpoints and the logical flow of the application. For example, endpoints within the "Room" category may have routes connecting to each other within the category, while endpoints in different categories may have separate routes.

API specifications:

- All `/api/room/:roomName` endpoints are routed to a specific monolith based on the `:roomName` parameter.
- `/api/room/generate` and `/api/room/create` are routed to the monolith with the least number of rooms.
- `/api/room/list` is a special case and will list rooms from all monoliths.
- `/api/status` and `/api/status/metrics` should not be forwarded, as their responses are specific to each monolith and are only used for monitoring purposes.
- All other endpoints are stateless and can be routed to any available monolith.
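The routing rules above can be sketched as a small dispatch function. This is an illustrative Python model of the decision logic only (names and data structures are hypothetical, not the Balancer's actual code), assuming each monolith advertises the set of rooms it hosts:

```python
def route(path, monoliths):
    """Pick a monolith for an API request, following the routing rules.

    monoliths maps a monolith id to the set of room names it hosts.
    Returns None for endpoints that must not be forwarded, and the
    string "all" for the aggregate /api/room/list case.
    """
    if path.startswith("/api/status"):
        return None  # monolith-specific, used only for monitoring
    parts = path.strip("/").split("/")
    if parts[:2] == ["api", "room"] and len(parts) >= 3:
        name = parts[2]
        if name in ("generate", "create"):
            # route to the monolith hosting the fewest rooms
            return min(monoliths, key=lambda m: len(monoliths[m]))
        if name == "list":
            return "all"  # special case: aggregate across all monoliths
        for m, rooms in monoliths.items():
            if name in rooms:
                return m  # room-specific: go to the owning monolith
    # stateless endpoint: any monolith will do (pick deterministically here)
    return sorted(monoliths)[0]

monoliths = {"m1": {"foo", "bar"}, "m2": {"baz"}}
print(route("/api/room/foo", monoliths))     # -> m1 (owns the room)
print(route("/api/room/create", monoliths))  # -> m2 (fewest rooms)
print(route("/api/status", monoliths))       # -> None (not forwarded)
```

A real balancer would resolve room ownership from shared state rather than a local dictionary, but the branch structure mirrors the specification above.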
https://github.com/tingerrr/masters-thesis
https://raw.githubusercontent.com/tingerrr/masters-thesis/main/src/de/chapters/7-conclusion.typ
typst
#import "/src/util.typ": *
#import "/src/figures.typ"

= Conclusion
From the benchmarks in @chap:benchmarks it can be concluded that persistent tree data structures can deliver better performance, above all in the worst case. In their current implementation, 2-3 finger trees are not a good choice for the storage data structure of T4gl arrays. A simple B-tree implementation without optimizations was able to offer performance comparable to or better than QMaps in the examined scenarios. Considering further scenarios, it is expected that this trend will persist and can drastically improve the worst-case time behavior of T4gl arrays.

= Optimizations
Various optimizations can improve the performance of the 2-3 finger tree implementation. However, it is unclear whether they are sufficient to reach the performance of the persistent B-tree implementation, which was implemented in a similarly unoptimized way.

== Path Copying
The `insert` implementation of the 2-3 finger trees relies on a simple but slow sequence of `split`, `push` and `concat`. It is possible, however, to implement a variant based on path copying and internal overflow instead. Inserting a leaf node at an unsafe level can then lead to at most one new node per level, similar to the worst case of `push`.

== Lazy Evaluation
Deferring operations through lazy evaluation has a direct influence on the amortized complexities of the deque operations @bib:hp-06[p. 7]. Since only the worst-case complexities are relevant for the real-time analysis of the data structure, this was neglected here. For a general improvement of the average complexities of the implementation, however, the use of lazy evaluation is indispensable.

== Generalization & Cache Efficiency
The cache of a CPU is a small memory between the CPU and the RAM which is generally faster to read from and write to.
If a value is not in the CPU cache, reading an address in RAM will in most cases also load the surrounding memory into the cache. This is one of the reasons why arrays are considered particularly fast data structures: repeated reads and writes within the same memory region can often be served from the cache. In the presence of indirection, i.e. the use of pointers as in tree structures, reads and writes can more often target regions that are not in the cache; this is called a cache miss.

@sec:finger-tree:generic describes an attempt to increase the cache efficiency of finger trees by reducing the depth of the trees through higher branching factors. Due to the lower depth, the recursive algorithms that follow the tree nodes should cause cache misses less often.

For various parts of the generalized branching factors of finger trees, no proofs could be provided. However, no proofs were found or worked out either that rule out the generalization to higher branching factors entirely. Depending on the state of these proofs, generalized variants of finger trees could be used in T4gl in the future. It is unclear whether the effort of the generalization is outweighed by the improved cache efficiency.

The poor results of the 2-3 finger tree appear to be a direct consequence of the naive implementation, since the benchmarks given in @bib:hp-06[p. 20] show excellent performance. It is unclear, however, how strongly lazy evaluation in Haskell influences the results of those benchmarks.

== Inheritance & Virtual Dispatch
If a class in C++ is inherited from and has overridable methods, these methods are considered virtual. If an inheritable class has a method without an implementation, the method, as well as the class itself, is considered abstract.
When a virtual method is called, the correct implementation must first be looked up at runtime in a so-called virtual table, to which every abstract class and its derived classes refer. This impedes optimizations that are important for the CPU, such as branch prediction, instruction caching or instruction prefetching.

The definition given in @lst:finger-tree uses inheritance from the class `FingerTree` to represent the different variants. As a consequence, finger trees can no longer be used directly: a `FingerTree` instance by itself is useless without the fields and implementation of the derived variants. Instances of `FingerTree` must be passed through indirection, since they generally refer to their derived variant. Operations on the different variants of `FingerTree` would have to be performed either by carefully casting the pointers or through a uniform API based on virtual methods. The former is unergonomic and error-prone; with the latter, `FingerTree` inevitably becomes a virtual class, which implies:
- that virtual dispatch must be used for many methods,
- and that every access to a `FingerTree` must first resolve the indirection (pointer dereferencing).
The latter is also not sensible for all operations; some operations, such as `split`, are only sensibly implemented for one variant.

To avoid that every call of essential functions such as `push` and `pop` has to fall back on virtual dispatch, cache misses can be avoided by deliberately casting to the correct variant instead of using virtual methods. The class can be selected by carrying along a discriminator that indicates which variant is referred to. This avoids both the virtual table pointer in every instance and the double indirection it entails.
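A sketch of what this discriminator approach could look like (the layout and names are assumptions, not the actual implementation):

```cpp
#include <cassert>
#include <cstddef>

// A one-byte tag replaces the virtual table pointer; each variant embeds
// the tagged base as its first member, so a base pointer can be cast to
// the correct variant after inspecting the tag (valid for standard-layout
// types, where the base subobject is pointer-interconvertible).
enum class Kind : unsigned char { Empty, Single, Deep };

struct FingerTree { Kind kind; };

struct Empty  { FingerTree base{Kind::Empty}; };
struct Single { FingerTree base{Kind::Single}; int value{}; };
struct Deep   { FingerTree base{Kind::Deep};   std::size_t size{}; /* digits, spine, ... */ };

// A "virtual" operation becomes a switch over the tag, which the compiler
// can inline; no vtable lookup or extra dereference is needed.
std::size_t tree_size(const FingerTree* t) {
    switch (t->kind) {
        case Kind::Empty:  return 0;
        case Kind::Single: return 1;
        case Kind::Deep:   return reinterpret_cast<const Deep*>(t)->size;
    }
    return 0;
}
```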
Particularly frequent paths, such as those of the `Deep` variant, can be hinted to the CPU as hot in order to favor them during branch prediction.

== Memory Layout
In C++, every data type has two metrics relevant for memory: its size and its alignment. The alignment of a data type specifies at which addresses in memory a value may be placed. If the alignment of a type `T` is 8, its values can only be placed at addresses that are multiples of 8, i.e. `0x0`, `0x8`, `0x10`, `0x18` and so on. Since compound data types consist of other data types, these must also be laid out correctly in memory; their alignment affects that of the compound data type. Furthermore, fields are laid out in declaration order. So that the alignments of the individual field types are respected, the compiler inserts unused bytes between fields where necessary; this is called padding. By clever ordering of the fields, padding bytes can be filled with smaller fields. Reducing padding reduces the size of the compound data type, and reducing the size of a data type increases the number of elements that the CPU can load into its cache.

== Specialized Allocators
One possible representation of graphs is to store the nodes in arrays and their connections in adjacency lists. This allows more nodes to be loaded into the CPU cache than with recursive definitions. Since trees are merely a special form of graphs, this can also be applied to most tree data structures. However, it also means that all nodes managed by a graph would have to be copied whenever the graph is copied. That conflicts with the concept of path copying in persistent trees.
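The array representation just described could be sketched like this (the types and names are illustrative assumptions):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Nodes live contiguously in one vector and refer to each other via
// indices instead of pointers, so traversals stay within one memory region.
struct Node {
    int value;
    std::vector<std::size_t> children;  // adjacency list of child indices
};

struct Tree {
    std::vector<Node> nodes;  // index 0 is the root

    // Appends a node and links it to its parent by index.
    std::size_t add(int value, std::size_t parent) {
        nodes.push_back({value, {}});
        std::size_t idx = nodes.size() - 1;
        if (idx != 0) nodes[parent].children.push_back(idx);
        return idx;
    }
};
```

Copying such a `Tree` copies the entire node array, which is exactly the conflict with path copying noted above.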
An alternative that could place the nodes of a tree close together in memory without destroying the structure of the trees are allocators that manage a small memory region for the nodes of the trees. This way, the cache efficiency of 2-3 finger trees could be increased without particularly large changes to their implementation.

== Invisible Persistence <sec:invis-pers>
Since the persistence of 2-3 finger trees is hidden behind the API of the classes, they can also forgo persistence whenever this cannot affect other instances. If an instance of type `T` is the only referent of the instance referred to by `_repr`, it can safely write into that instance directly without making a copy first. This is the core principle of many data structures based on CoW persistence, among them the Qt data structures originally used in T4gl.

However, this influences how such a class can be used. If an instance `t1` is created and a reference or pointer `&t1` is passed to another thread, a copy `t2` of `t1` can be created on the other thread between the check `_repr.use_count() == 1` and the direct write into `*_repr`. In other words, the `use_count` can change between the check and the write access. As a consequence, the write becomes visible in both instances `t1` and `t2`, even though it happened after the copy of `t1`. To be able to exploit that an instance is the sole referent of the data in `*_repr`, no references or pointers to instances of `T` may therefore be passed to other threads. Instead, such instances should be copied in the originating thread before being handed to another thread.
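A minimal sketch of this CoW scheme (the name `_repr` follows the text; the concrete storage type and class name are assumptions):

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <vector>

// Mutating operations detach first: a copy is made only if the
// representation is shared with another instance.
class Array {
public:
    Array() : _repr(std::make_shared<std::vector<int>>()) {}

    int get(std::size_t i) const { return (*_repr)[i]; }
    std::size_t size() const { return _repr->size(); }

    void push(int v) {
        detach();            // copy only if someone else also refers to *_repr
        _repr->push_back(v); // now safe to mutate in place
    }

private:
    void detach() {
        // NOTE: only sound if no reference/pointer to this instance is
        // visible to another thread, as argued above -- use_count() can
        // otherwise change between this check and the write.
        if (_repr.use_count() != 1)
            _repr = std::make_shared<std::vector<int>>(*_repr);
    }

    std::shared_ptr<std::vector<int>> _repr;
};
```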