repo | file | language | license | content
---|---|---|---|---|
https://github.com/floriandejonckheere/utu-thesis | https://raw.githubusercontent.com/floriandejonckheere/utu-thesis/master/thesis/chapters/07-proposed-solution/03-requirements.typ | typst | #import "/helpers.typ": *
== Requirements
Our approach needs to fulfill certain requirements.
We make a distinction between functional and non-functional requirements.
In software engineering, functional requirements describe the functionality the application must provide, shaping its design in a functional way @software_engineering_body_of_knowledge_2001.
Non-functional requirements are additional requirements imposed at design time that do not directly affect the functionality of the application @software_engineering_body_of_knowledge_2001.
The functional requirements we prioritized for our proposed solution are:
- *Quality*: the solution provides a high-quality decomposition of the monolith application, based on a set of quality metrics
- *Automation*: the solution automates the decomposition process as much as possible
- *Visual*: the solution can output the proposed decomposition in a visual manner, to aid understanding of the process and the results
The non-functional requirements identified for our solution are:
- *Performance*: the solution performs the analysis, decomposition, and evaluation reasonably fast
|
|
https://github.com/sitandr/typst-examples-book | https://raw.githubusercontent.com/sitandr/typst-examples-book/main/src/typstonomicon/try_catch.md | markdown | MIT License | # Try & Catch
```typ
// author: laurmaedje
// Renders an image or a placeholder if it doesn't exist.
// Don't try this at home, kids!
#let maybe-image(path, ..args) = context {
let path-label = label(path)
let first-time = query((context {}).func()).len() == 0
if first-time or query(path-label).len() > 0 {
[#image(path, ..args)#path-label]
} else {
rect(width: 50%, height: 5em, fill: luma(235), stroke: 1pt)[
#set align(center + horizon)
Could not find #raw(path)
]
}
}
#maybe-image("../tiger.jpg")
#maybe-image("../tiger1.jpg")
```
|
https://github.com/eusebe/sorbonne-slides-typst | https://raw.githubusercontent.com/eusebe/sorbonne-slides-typst/main/README.md | markdown | # sorbonne-slides-typst
A Typst slide template for Sorbonne University based on polylux
|
|
https://github.com/ShapeLayer/ucpc-solutions__typst | https://raw.githubusercontent.com/ShapeLayer/ucpc-solutions__typst/main/docs/utils.md | markdown | Other | ---
title: ucpc.utils
supports: ["0.1.0"]
---
```typst
#import "/lib/ucpc.typ": utils
utils.
```
These are utilities prepared for quickly creating various formats. Using them, you can create cover pages, problem metadata, and problem descriptions.
## make-hero
```typst
make-hero(
title: str | content,
subtitle?: str | content,
fgcolor?: color,
bgcolor?: color,
height?: relative,
authors?: array<str>,
datetime?: str | content,
)
```
Generates a hero cover page.
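For example, a cover page could be produced as in the following minimal sketch; the argument values here are placeholders for illustration, not part of the documented API:
```typst
#import "/lib/ucpc.typ": utils

// Hypothetical values, for illustration only.
#utils.make-hero(
  title: [UCPC 2024],
  subtitle: [Solutions & Editorial],
  authors: ("Alice", "Bob"),
  datetime: "2024-07-20",
)
```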
## make-prob-meta
```typst
make-prob-meta(
tags?: array<str>,
difficulty?: str | content,
authors?: array<str> | content,
stat-open?: (
submit-count: int,
ac-count: int,
ac-ratio: float,
first-solver: str | content,
first-solve-time: int,
),
stat-onsite?: (
submit-count: int,
ac-count: int,
ac-ratio: float,
first-solver: str | content,
first-solve-time: int,
),
i18n?: i18n-dictionary.make-prob-meta
)
```
## make-prob-overview
```typst
make-prob-overview(
font-size?: length,
i18n?: i18n-dictionary.make-prob-overview,
..items,
)
```
## make-problem
```typst
make-problem(
id: str,
title: str,
tags?: array<str>,
difficulty?: str | content,
authors?: array<str> | content,
stat-open?: (
submit-count: int,
ac-count: int,
ac-ratio: float,
first-solver: str | content,
first-solve-time: int,
),
stat-onsite?: (
submit-count: int,
ac-count: int,
ac-ratio: float,
first-solver: str | content,
first-solve-time: int,
),
pallete?: (
primary: color,
secondary: color,
),
i18n: i18n-dictionary.make-prob-meta,
body
)
```
|
https://github.com/typst-jp/typst-jp.github.io | https://raw.githubusercontent.com/typst-jp/typst-jp.github.io/main/docs/tutorial/1-writing.md | markdown | Apache License 2.0 | ---
description: This is the Typst tutorial.
---
# Writing in Typst
Let's get started! Suppose you've been assigned to write a technical report for university. It will contain prose, mathematical formulas, headings, and figures.
To start writing, you first create a new project in the Typst app. Moving to the editor, you are greeted by two panels: a source panel where you compose your document, and a preview panel where the rendered document is displayed.

You already have a good angle for your report in mind, so let's start by writing the introduction.
Enter some text in the editor panel. You'll notice that the text immediately appears on the preview page.
```example
In this report, we will explore the
various factors that influence fluid
dynamics in glaciers and how they
contribute to the formation and
behaviour of these natural structures.
```
_Throughout this tutorial, we'll show code examples like this one. Just like in the app, the first panel contains markup and the second panel shows a preview. We shrunk the page to fit the examples so you can see what's happening._
The next step is to add a heading and emphasize some text.
In Typst, the most frequently used formatting is expressed with simple markup. To add a heading, enter the `=` character. To emphasize text with italics, enclose it in `[_underscores_]`.
```example
= Introduction
In this report, we will explore the
various factors that influence _fluid
dynamics_ in glaciers and how they
contribute to the formation and
behaviour of these natural structures.
```
That was easy! To add a new paragraph, just add a blank line between two lines of text.
If a paragraph needs a subheading, create it by typing `==` instead of `=`.
The number of `=` characters determines the nesting level of the heading.
Next, let's analyze a few of the factors that influence glacier dynamics.
To do that, we use a numbered list. For each item of the list, we type a `+` character at the beginning of the line.
Typst then numbers the items automatically.
```example
+ The climate
+ The topography
+ The geology
```
To add a bulleted list instead, use the `-` character in place of the `+` character.
Lists can also be nested.
For example, to add a sublist to the first item of the list above, indent it.
```example
+ The climate
- Temperature
- Precipitation
+ The topography
+ The geology
```
## Adding a figure {#figure}
You think your report would benefit from a figure, so let's add one.
Typst supports images in the PNG, JPEG, GIF, and SVG formats.
To add an image file to your project, first open the _file panel_ by clicking the box icon in the left sidebar.
It shows a list of all files in your project.
Currently, there is only one: the main Typst file you are writing in.
To upload another file, click the arrow button in the top-right corner.
This opens the upload dialog, where you can pick files to upload from your computer.
Select an image file for your report.

As we have seen before, certain symbols (called _markup_) carry specific meaning in Typst.
We can use `=`, `-`, `+`, and `_` to create headings, lists, and emphasized text, respectively.
However, assigning a special symbol to everything we might want to insert into a document would quickly become cryptic and unwieldy.
For this reason, Typst reserves markup symbols only for the most common formatting tasks; everything else is inserted with _functions_.
To display an image on the page, we use Typst's [`image`] function.
```example
#image("glacier.jpg")
```
In general, a function produces some output for a set of _arguments_.
When you _call_ a function within markup, you provide the arguments, and Typst inserts the result (the function's _return value_) into the document.
In our case, the `image` function takes a single argument: the path to the image file.
To call a function in markup, we first type the `#` character, immediately followed by the name of the function.
Then, we enclose the arguments in parentheses.
Typst recognizes various data types within argument lists.
Our image's file path is a short [string]($str), so we need to enclose it in double quotes.
The inserted image spans the whole width of the page. To change that, pass the `width` argument to the `image` function.
This is a _named_ argument, specified in the form `name: value`.
If there are multiple arguments, they are separated by commas, so we first need to put a comma after the file path we specified earlier.
```example
#image("glacier.jpg", width: 70%)
```
The `width` argument is a [relative length]($relative).
In the example above, we specified a percentage, determining that the image shall take up `{70%}` of the page's width.
We could also have specified an absolute value like `{1cm}` or `{0.7in}`.
Just like text, the image is aligned at the left side of the page by default. It's also lacking a caption.
Let's fix that with the [figure]($figure) function.
This function takes the figure's content as an ordinary, positional argument, and optionally the figure's caption (`caption`) as a named argument.
Within the argument list of the `figure` function, Typst is already in code mode.
This means you now have to remove the `#` sign before the `image` function call.
The `#` sign is only needed directly in markup, to distinguish text from function calls.
A caption can contain arbitrary markup.
To pass markup as an argument to a function, enclose it in square brackets `[ ]`. This construct, markup enclosed in square brackets, is called a _content block_.
```example
#figure(
image("glacier.jpg", width: 70%),
caption: [
_Glaciers_ form an important part
of the earth's climate system.
],
)
```
As you continue writing your report, you will eventually want to reference the figure you just inserted.
To do that, first attach a label to the figure.
A label is a name that uniquely identifies an element in your document. Add one after the figure by writing a name enclosed in angle brackets `< >`.
You can then reference the figure in your text by writing the `[@]` symbol followed by the label name.
Headings and equations can also be labelled to make them referenceable.
```example
Glaciers as the one shown in
@glaciers will cease to exist if
we don't take action soon!
#figure(
image("glacier.jpg", width: 70%),
caption: [
_Glaciers_ form an important part
of the earth's climate system.
],
) <glaciers>
```
<div class="info-box">
So far, we have passed content blocks (markup enclosed in square brackets `[ ]`) and strings (text enclosed in double quotes `" "`) to functions.
Both seem to contain text, so what's the difference?
A content block can contain text, but also any other kind of markup, function calls, and more.
A string, on the other hand, is really just a _sequence of characters_ and nothing else.
For example, the `image` function expects a path to an image file as its argument. Passing it a paragraph of text or another image would make no sense.
That's why only strings are allowed as the `image` function's argument, not arbitrary markup.
Conversely, strings can be written wherever a content block is expected, because a string is just a sequence of characters, and a sequence of characters is a valid kind of content.
</div>
## Adding a bibliography {#bibliography}
When writing a report, you need to back up your claims.
You can add a bibliography to your document with the [`bibliography`]($bibliography) function.
This function expects a path to a bibliography file as its argument.
Typst's native bibliography format is [Hayagriva](https://github.com/typst/hayagriva/blob/main/docs/file-format.md), but for compatibility you can also use BibLaTeX files.
Since a classmate has already done a literature survey and sent you a `.bib` file, you'll use that one.
Open the file panel and upload the file so it's accessible in the Typst app.
Once the document contains a bibliography, you can cite works from it in the text.
Citations use the same syntax as references to a label. By default, a work first appears in Typst's bibliography section at the point where it is cited in the text for the first time.
Typst supports various citation and bibliography styles; see the [reference]($bibliography.style) for details.
```example
= Methods
We follow the glacier melting models
established in @glacier-melt.
#bibliography("works.bib")
```
## Maths {#maths}
After fleshing out the methods section, you move on to the core of the document: your equations.
Typst has built-in mathematical typesetting and uses its own math notation.
Let's start with a simple equation. To let Typst know that it should expect a mathematical expression, we wrap it in `[$]` signs.
```example
The equation $Q = rho A v + C$
defines the glacial flow rate.
```
The equation is typeset inline, placed on the same line as the surrounding text.
To put it on its own line instead, insert a single space after the opening and before the closing of the equation.
```example
The flow rate of a glacier is
defined by the following equation:
$ Q = rho A v + C $
```
You may have noticed that Typst displays the single letters `Q`, `A`, `v`, and `C` as-is, while it converts `rho` into a Greek letter.
In math mode, single letters are always shown verbatim, whereas multiple consecutive letters are interpreted as symbols, variables, or function names.
To indicate a multiplication between distinct single letters (with the multiplication sign omitted), insert a space between them.
To get a variable that consists of multiple letters, enclose its name in quotes.
```example
The flow rate of a glacier is given
by the following equation:
$ Q = rho A v + "time offset" $
```
Your report also needs a sum formula.
We can use the `sum` symbol and then specify the range of the summation with a subscript and a superscript.
```example
Total displaced soil by glacial flow:
$ 7.32 beta +
sum_(i=0)^nabla Q_i / 2 $
```
To add a subscript to a symbol or variable, enter the `_` character followed by the subscript.
Similarly, use the `^` character for a superscript.
If a subscript or superscript consists of multiple elements, you must enclose it in round parentheses.
The example above also shows how to insert fractions:
simply place a `/` character between the numerator and the denominator, and Typst automatically converts it into a fraction.
Parentheses are resolved smartly, so you can enter an expression with nested parentheses just as you would in a programming language or on a scientific calculator, and
Typst will interpret the parenthesized subexpressions appropriately and replace them automatically.
```example
Total displaced soil by glacial flow:
$ 7.32 beta +
sum_(i=0)^nabla
(Q_i (a_i - epsilon)) / 2 $
```
Not every mathematical concept has special syntax.
Instead, we use functions, just like the `image` function we saw earlier.
For example, to insert a column vector, we can use the `vec` function.
Within math mode, function calls don't need to start with the `#` character.
```example
$ v := vec(x_1, x_2, x_3) $
```
Some functions are only available within math mode.
For example, the [`cal`]($math.cal) function is used to typeset the calligraphic letters commonly used for sets.
For a complete list of the functions that math mode provides, see the [math section of the reference]($category/math).
One more thing: many symbols, such as arrows, come in many variants.
To select a specific variant, append a modifier to the symbol's category name that indicates the concrete kind of symbol.
```example
$ a arrow.squiggly b $
```
This notation is also available in markup mode, but there the symbol name must be prefixed with `#sym.`.
See the [symbols section]($category/symbols/sym) for a list of all available symbols.
## Review {#review}
You have now learned how to write a basic document in Typst: how to emphasize text, write lists, insert images, place content, and typeset mathematical expressions. You also learned about Typst's functions.
There are many more kinds of content that Typst lets you insert into a document, such as [tables]($table), [shapes]($category/visualize), and [code blocks]($raw). To learn more about these and other features, consult the [reference]($reference).
For now, you have finished writing your report.
You can save a PDF by clicking the download button in the top-right corner.
However, you may feel the report still looks a little plain.
In the next section, we'll learn how to customize the look of our document.
|
https://github.com/omroali/SegmentationsFeaturingAndTracking | https://raw.githubusercontent.com/omroali/SegmentationsFeaturingAndTracking/main/Report/code.typ | typst | #import "@preview/problemst:0.1.0": pset
#show: pset.with(
class: "Computer Vision", student: "<NAME> - 28587497", title: "Assignment 1", date: datetime.today(),
)
== image_segmentation.py
```python
import os
import cv2
from cv2.typing import MatLike
import numpy as np
from segmentation.utils import fill
import math
class ImageSegmentation:
def __init__(self, image_path: str, save_dir: str = None):
self.processing_data = []
self.image_path = image_path
self.image = cv2.imread(image_path)
self.processing_images = []
self.save_dir = save_dir
def log_image_processing(self, image, operation: str):
"""log the image processing"""
self.processing_data.append(operation)
self.processing_images.append(image)
def gblur(self, image, ksize=(3, 3), iterations=1):
"""apply gaussian blur to the image"""
blur = image.copy()
for _ in range(iterations):
blur = cv2.GaussianBlur(blur, ksize, cv2.BORDER_DEFAULT)
self.log_image_processing(blur, f"gblur,kernel:{ksize},iterations:{iterations}")
return blur
def mblur(self, image, ksize=3, iterations=1):
"""apply gaussian blur to the image"""
blur = image.copy()
for _ in range(iterations):
blur = cv2.medianBlur(blur, ksize)
self.log_image_processing(
blur, f"medianblur,kernel:{ksize},iterations:{iterations}"
)
return blur
def adaptive_threshold(self, image, blockSize=15, C=3):
"""apply adaptive threshold to the image"""
image = image.copy()
adaptive_gaussian_threshold = cv2.adaptiveThreshold(
src=image,
maxValue=255,
adaptiveMethod=cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
thresholdType=cv2.THRESH_BINARY,
blockSize=blockSize,
```
```python
C=C,
)
self.log_image_processing(
adaptive_gaussian_threshold,
f"adaptive_threshold,blockSize:{blockSize},C:{C}",
)
return adaptive_gaussian_threshold
    def dilate(self, image, kernel=(3, 3), iterations=1, op=cv2.MORPH_ELLIPSE):
        """apply dilation to the image"""
        image = image.copy()
        kernel = cv2.getStructuringElement(op, kernel)
        dilation = cv2.dilate(
            src=image,
            kernel=kernel,
            iterations=iterations,
        )
        self.log_image_processing(
            dilation,
            f"dilate,kernel:{kernel},iterations:{iterations}",
        )
        return dilation
    def erode(self, image, kernel=(3, 3), iterations=1, op=cv2.MORPH_ELLIPSE):
        """apply erosion to the image"""
        image = image.copy()
        kernel = cv2.getStructuringElement(op, kernel)
        erosion = cv2.erode(
            src=image,
            kernel=kernel,
            iterations=iterations,
        )
        self.log_image_processing(
            erosion,
            f"erode,kernel:{kernel},iterations:{iterations}",
        )
        return erosion
def closing(self, image, kernel=(5, 5), iterations=10):
"""apply closing to the image"""
image = image.copy()
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, kernel)
closing = cv2.morphologyEx(
src=image,
op=cv2.MORPH_CLOSE,
kernel=kernel,
iterations=iterations,
)
self.log_image_processing(
closing,
f"closing,kernel:{kernel},iterations:{iterations}",
)
return closing
```
```python
def opening(self, image, kernel=(5, 5), iterations=1, op=cv2.MORPH_ELLIPSE):
"""apply opening to the image"""
image = image.copy()
kernel = cv2.getStructuringElement(op, kernel)
opening = cv2.morphologyEx(
src=image,
op=cv2.MORPH_OPEN,
kernel=kernel,
iterations=iterations,
)
self.log_image_processing(
opening,
f"opening,kernel:{kernel},iterations:{iterations}",
)
return opening
    def generic_filter(self, image, kernel, iterations=1, custom_msg="generic_filter"):
result = image.copy()
for i in range(iterations):
result = cv2.filter2D(result, -1, kernel)
self.log_image_processing(
result, f"{custom_msg},kernel:{kernel},iterations:{iterations}"
)
return result
def dilate_and_erode(
self, image, k_d, i_d, k_e, i_e, iterations=1, op=cv2.MORPH_ELLIPSE
):
image = image.copy()
for _ in range(iterations):
for _ in range(i_d):
image = self.dilate(image, (k_d, k_d), op=op)
for _ in range(i_e):
image = self.erode(image, (k_e, k_e), op=op)
self.log_image_processing(
image,
f"dilate_and_erode,k_d:{(k_d,k_d)},i_d={i_d},k_e:{(k_e, k_e)},i_e={i_e},iterations:{iterations}",
)
return image
def fill_image(self, image_data, name, show=True):
self.log_image_processing(
image_data[name],
f"fill_{name}",
)
image_data[f"fill_{name}"] = {
"image": fill(image_data[name]["image"].copy()),
"show": show,
}
```
```python
def find_ball_contours(
self,
image,
circ_thresh,
min_area=400,
max_area=4900,
convex_hull=False,
):
img = image.copy()
cnts = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnts = cnts[0] if len(cnts) == 2 else cnts[1]
blank_image = np.zeros(img.shape, dtype=img.dtype)
for c in cnts:
# Calculate properties
peri = cv2.arcLength(c, True)
# Douglas-Peucker algorithm
approx = cv2.approxPolyDP(c, 0.0001 * peri, True)
# applying a convex hull
            if convex_hull:
c = cv2.convexHull(c)
# get contour area
area = cv2.contourArea(c)
if area == 0:
continue # Skip to the next iteration if area is zero
circularity = 4 * math.pi * area / (peri**2)
if (
(len(approx) > 5)
and (area > min_area and area < max_area)
and circularity > circ_thresh
):
cv2.drawContours(blank_image, [c], -1, (255), cv2.FILLED)
return blank_image
@staticmethod
def preprocessing(image):
image_data = {}
image_data["original"] = {
"image": image.image,
"show": True,
}
image_data["grayscale"] = {
"image": cv2.cvtColor(image.image, cv2.COLOR_BGRA2GRAY),
"show": False,
}
```
```python
image_data["hsv"] = {
"image": cv2.cvtColor(image.image.copy(), cv2.COLOR_BGR2HSV),
"show": False,
}
(_, _, intensity) = cv2.split(image_data["hsv"]["image"])
image_data["intensity"] = {
"image": intensity,
"show": False,
}
image_data["gblur"] = {
"image": image.gblur(
image_data["intensity"]["image"], ksize=(3, 3), iterations=2
),
"show": False,
}
image_data["blur"] = {
"image": image.mblur(
image_data["intensity"]["image"], ksize=3, iterations=2
),
"show": False,
}
intensity_threshold = cv2.threshold(
image_data["intensity"]["image"], 125, 255, cv2.THRESH_BINARY
)[1]
image_data["intensity_threshold"] = {
"image": intensity_threshold,
"show": False,
}
name = "adap_gaus_thrsh"
image_data[name] = {
"image": image.adaptive_threshold(
image=image_data["blur"]["image"].copy(),
blockSize=19,
C=5,
),
"show": False,
}
image_data["open"] = {
"image": image.opening(
image=image_data["adap_gaus_thrsh"]["image"].copy(),
kernel=(5, 5),
iterations=4,
),
"show": False,
}
image_data["dilate"] = {
"image": image.dilate(
image=image_data["open"]["image"].copy(),
kernel=(3, 3),
iterations=2,
),
"show": False,
```
```python
}
image_data["erode"] = {
"image": image.erode(
image=image_data["open"]["image"].copy(),
kernel=(3, 3),
iterations=2,
),
"show": True,
}
fill_erode = image.fill_image(image_data, "erode")
image_data["dilate_and_erode"] = {
"image": image.dilate_and_erode(
image_data["fill_erode"]["image"],
k_d=4,
i_d=5,
k_e=5,
i_e=2,
iterations=1,
),
"show": False,
}
contours = image.find_ball_contours(
cv2.bitwise_not(image_data["dilate_and_erode"]["image"]),
0.32,
)
image_data["contours"] = {
"image": contours,
"show": False,
}
image_data["im_1"] = {
"image": cv2.bitwise_not(
image_data["intensity_threshold"]["image"],
),
"show": False,
}
image_data["im_2"] = {
"image": cv2.bitwise_not(
image_data["contours"]["image"],
),
"show": False,
}
image_data["segmentation_before_recontour"] = {
"image": cv2.bitwise_not(
cv2.bitwise_or(
image_data["im_1"]["image"], image_data["im_2"]["image"]
),
),
"show": True,
}
recontours = image.find_ball_contours(
```
```python
image_data["segmentation_before_recontour"]["image"],
0.0,
min_area=100,
max_area=4900,
convex_hull=True,
)
image_data["convex_hull"] = {
"image": recontours,
"show": True,
}
image_data["opening_after_segmentation"] = {
"image": image.opening(
image_data["convex_hull"]["image"],
kernel=(3, 3),
iterations=5,
),
"show": True,
}
image_data["segmentation"] = {
"image": image.find_ball_contours(
image_data["opening_after_segmentation"]["image"],
0.72,
250,
5000,
True,
),
"show": True,
}
return image_data
```
#pagebreak()
== utils.py
```python
import os
import glob
from natsort import natsorted
import numpy as np
import matplotlib.pyplot as plt
import cv2
def get_images_and_masks_in_path(folder_path):
images = sorted(filter(os.path.isfile, glob.glob(folder_path + "/*")))
image_list = []
mask_list = []
for file_path in images:
if "data.txt" not in file_path:
if "GT" not in file_path:
image_list.append(file_path)
else:
mask_list.append(file_path)
return natsorted(image_list), natsorted(mask_list)
# sourced and modified from https://stackoverflow.com/a/67992521
def img_is_color(img):
if len(img.shape) == 3:
# Check the color channels to see if they're all the same.
c1, c2, c3 = img[:, :, 0], img[:, :, 1], img[:, :, 2]
if (c1 == c2).all() and (c2 == c3).all():
return True
return False
from heapq import nlargest, nsmallest
def dice_score(processed_images, masks, save_path):
eval = []
score_dict = {}
for idx, image in enumerate(processed_images):
score = dice_similarity_score(image, masks[idx], save_path)
score_dict[image] = score
if len(eval) == 0 or max(eval) < score:
max_score = score
max_score_image = image
if len(eval) == 0 or min(eval) > score:
min_score = score
min_score_image = image
eval.append(score)
avg_score = sum(eval) / len(eval)
max_text = f"Max Score: {max_score} - {max_score_image}\n"
min_text = f"Min Score: {min_score} - {min_score_image}\n"
avg_text = f"Avg Score: {avg_score}\n"
```
```python
print("--- " + save_path + "\n")
print(max_text)
print(min_text)
print(avg_text)
print("---")
FiveHighest = nlargest(5, score_dict, key=score_dict.get)
FiveLowest = nsmallest(5, score_dict, key=score_dict.get)
with open(f"{save_path}/dice_score.txt", "w") as f:
f.write("---\n")
f.write(max_text)
f.write(min_text)
f.write(avg_text)
f.write("---\n")
f.write("Scores:\n")
for idx, score in enumerate(eval):
f.write(f"\t{score}\t{masks[idx]}\n")
f.write("---\n")
f.write("5 highest:\n")
for v in FiveHighest:
f.write(f"{v}, {score_dict[v]}\n")
f.write("---\n")
f.write("5 lowest:\n")
for v in FiveLowest:
f.write(f"{v}, {score_dict[v]}\n")
frame_numbers = [extract_frame_number(key) for key in score_dict.keys()]
plt.figure(figsize=(12, 3))
plt.bar(frame_numbers, score_dict.values(), color="c")
plt.title("Dice Score for Each Image Frame")
plt.xlabel("Image Frame")
plt.ylabel("Dice Similarity Similarity Score")
plt.ylim([0.8, 1])
plt.xticks(
frame_numbers, rotation=90
) # Rotate the x-axis labels for better readability
plt.grid(True)
plt.tight_layout() # Adjust the layout for better readability
plt.savefig(f"Report/assets/dice_score_barchart.png")
# standard deviation
std_dev = np.std(eval)
print(f"Standard Deviation: {std_dev}")
mean = np.mean(eval)
print(f"Mean: {mean}")
    # plot the score distribution as a violin plot
plt.figure(figsize=(12, 3))
plt.violinplot(eval, vert=False, showmeans=True)
plt.title("Dice Score Distribution")
plt.xlabel("Dice Similarity Score")
plt.grid(True)
plt.tight_layout()
plt.text(0.83, 0.9, f'Standard Deviation: {std_dev:.2f}', transform=plt.gca().transAxes)
```
```python
plt.text(0.83, 0.80, f'Mean: {mean:.2f}', transform=plt.gca().transAxes)
plt.savefig(f"Report/assets/dice_score_violin.png")
def extract_frame_number(path):
components = path.split("/")
filename = components[-1]
parts = filename.split("-")
frame_number_part = parts[-1]
frame_number = frame_number_part.split(".")[0]
return int(frame_number)
def dice_similarity_score(seg_path, mask_path, save_path):
seg = cv2.threshold(cv2.imread(seg_path), 127, 255, cv2.THRESH_BINARY)[1]
mask = cv2.threshold(cv2.imread(mask_path), 127, 255, cv2.THRESH_BINARY)[1]
intersection = cv2.bitwise_and(seg, mask)
dice_score = 2.0 * intersection.sum() / (seg.sum() + mask.sum())
difference = cv2.bitwise_not(cv2.bitwise_or(cv2.bitwise_not(seg), mask))
cv2.imwrite(save_path + f"/difference_ds_{dice_score}.jpg", difference)
return dice_score
def show_image_list(
image_dict: dict = {},
list_cmaps=None,
grid=False,
num_cols=2,
figsize=(20, 10),
title_fontsize=12,
save_path=None,
):
list_titles, list_images = list(image_dict.keys()), list(image_dict.values())
assert isinstance(list_images, list)
assert len(list_images) > 0
assert isinstance(list_images[0], np.ndarray)
if list_titles is not None:
assert isinstance(list_titles, list)
assert len(list_images) == len(list_titles), "%d imgs != %d titles" % (
len(list_images),
len(list_titles),
)
if list_cmaps is not None:
assert isinstance(list_cmaps, list)
assert len(list_images) == len(list_cmaps), "%d imgs != %d cmaps" % (
len(list_images),
len(list_cmaps),
)
```
```python
num_images = len(list_images)
num_cols = min(num_images, num_cols)
num_rows = int(num_images / num_cols) + (1 if num_images % num_cols != 0 else 0)
# Create a grid of subplots.
fig, axes = plt.subplots(num_rows, num_cols, figsize=figsize)
# Create list of axes for easy iteration.
if isinstance(axes, np.ndarray):
list_axes = list(axes.flat)
else:
list_axes = [axes]
for i in range(num_images):
img = list_images[i]
title = list_titles[i] if list_titles is not None else "Image %d" % (i)
cmap = (
list_cmaps[i]
if list_cmaps is not None
else (None if img_is_color(img) else "gray")
)
list_axes[i].imshow(img, cmap=cmap)
list_axes[i].set_title(title, fontsize=title_fontsize)
list_axes[i].grid(grid)
list_axes[i].axis("off")
for i in range(num_images, len(list_axes)):
list_axes[i].set_visible(False)
fig.tight_layout()
if save_path is not None:
fig.savefig(save_path)
plt.close(fig)
def fill(img):
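    # Fill enclosed holes: invert the image, redraw every detected contour filled, then invert back.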
des = cv2.bitwise_not(img.copy())
contour, hier = cv2.findContours(des, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
for cnt in contour:
cv2.drawContours(des, [cnt], 0, 255, -1)
return cv2.bitwise_not(des)
```
#pagebreak()
== seg_main.py
```python
import os
import cv2
from tqdm import tqdm
from datetime import datetime
from segmentation.image_segmentation import ImageSegmentation
from segmentation.utils import (
dice_score,
get_images_and_masks_in_path,
show_image_list,
)
import multiprocessing as mp
dir_path = os.path.dirname(os.path.realpath(__file__))
path = "data/ball_frames"
def store_image_data(log_data, time: datetime):
"""method to store in a text file the image data for processing"""
check_path = os.path.exists(f"process_data/{time}/data.txt")
if not check_path:
with open(f"process_data/{time}/data.txt", "w") as f:
for log in log_data:
f.write(f"{log}\n")
def process_image(inputs: list) -> None:
"""method to process the image"""
[image_path, save, time, save_dir] = inputs
image = ImageSegmentation(image_path, save_dir)
data = image.preprocessing(image)
processed_images = {}
for key in data.keys():
if data[key]["show"] is not False:
processed_images[key] = data[key]["image"]
log_data = image.processing_data
name = os.path.splitext(os.path.basename(image_path))[0]
save_path = None
if save:
save_path = f"{save_dir}/{name}"
if not os.path.exists(save_dir):
os.mkdir(save_dir)
store_image_data(log_data, time)
if data["segmentation"]["image"] is not None:
segmentation_path = f"{save_dir}/segmentation/"
if not os.path.exists(segmentation_path):
os.mkdir(segmentation_path)
seg_path = f"{segmentation_path}{os.path.basename(image.image_path)}"
cv2.imwrite(seg_path, data["segmentation"]["image"])
```
```python
show_image_list(
image_dict=processed_images,
figsize=(10, 10),
save_path=save_path,
)
def process_all_images(images, save=False):
time = datetime.now().isoformat("_", timespec="seconds")
save_path = f"process_data/{time}"
seg_path = f"{save_path}/segmentation"
with mp.Pool() as pool:
inputs = [[image, save, time, save_path] for image in images]
list(
tqdm(
pool.imap_unordered(process_image, inputs, chunksize=4),
total=len(images),
)
)
pool.close()
pool.join()
return save_path, seg_path
def main():
images, masks = get_images_and_masks_in_path(path)
processed_image_path, seg_path = process_all_images(images, True)
processed_images, _ = get_images_and_masks_in_path(seg_path)
dice_score(processed_images, masks, seg_path)
if __name__ == "__main__":
main()
```
#pagebreak()
== seg_main.py
```python
import os
import re
import cv2
from cv2.gapi import bitwise_and
from matplotlib import pyplot as plt
from matplotlib.artist import get
from segmentation.utils import get_images_and_masks_in_path
import numpy as np
from segmentation.utils import fill
import math
from skimage.feature import graycomatrix, graycoprops
BALL_SMALL = "Tennis"
BALL_MEDIUM = "Football"
BALL_LARGE = "American\nFootball"
def shape_features_eval(contour):
area = cv2.contourArea(contour)
# getting non-compactness
perimeter = cv2.arcLength(contour, closed=True)
non_compactness = 1 - (4 * math.pi * area) / (perimeter**2)
# getting solidity
convex_hull = cv2.convexHull(contour)
convex_area = cv2.contourArea(convex_hull)
solidity = area / convex_area
# getting circularity
circularity = (4 * math.pi * area) / (perimeter**2)
# getting eccentricity
ellipse = cv2.fitEllipse(contour)
a = max(ellipse[1])
b = min(ellipse[1])
eccentricity = (1 - (b**2) / (a**2)) ** 0.5
return {
"non_compactness": non_compactness,
"solidity": solidity,
"circularity": circularity,
"eccentricity": eccentricity,
}
def texture_features_eval(patch):
# # Define the co-occurrence matrix parameters
distances = [1]
angles = np.radians([0, 45, 90, 135])
levels = 256
symmetric = True
```
```python
normed = True
glcm = graycomatrix(
patch, distances, angles, levels, symmetric=symmetric, normed=normed
)
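    # Drop the first row and column of the GLCM, i.e., co-occurrence pairs involving gray level 0 (the masked-out background).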
filt_glcm = glcm[1:, 1:, :, :]
# Calculate the Haralick features
asm = graycoprops(filt_glcm, "ASM").flatten()
contrast = graycoprops(filt_glcm, "contrast").flatten()
correlation = graycoprops(filt_glcm, "correlation").flatten()
# Calculate the feature average and range across the 4 orientations
asm_avg = np.mean(asm)
contrast_avg = np.mean(contrast)
correlation_avg = np.mean(correlation)
asm_range = np.ptp(asm)
contrast_range = np.ptp(contrast)
correlation_range = np.ptp(correlation)
return {
"asm": asm,
"contrast": contrast,
"correlation": correlation,
"asm_avg": asm_avg,
"contrast_avg": contrast_avg,
"correlation_avg": correlation_avg,
"asm_range": asm_range,
"contrast_range": contrast_range,
"correlation_range": correlation_range,
}
def initialise_channels_features():
def initialise_channel_texture_features():
return {
"asm": [],
"contrast": [],
"correlation": [],
"asm_avg": [],
"contrast_avg": [],
"correlation_avg": [],
"asm_range": [],
"contrast_range": [],
"correlation_range": [],
}
return {
"blue": initialise_channel_texture_features(),
"green": initialise_channel_texture_features(),
"red": initialise_channel_texture_features(),
}
def initialise_shape_features():
return {
"non_compactness": [],
"solidity": [],
```
```python
"circularity": [],
"eccentricity": [],
}
def get_all_features_balls(path):
features = {
BALL_LARGE: {
"shape_features": initialise_shape_features(),
"texture_features": initialise_channels_features(),
},
BALL_MEDIUM: {
"shape_features": initialise_shape_features(),
"texture_features": initialise_channels_features(),
},
BALL_SMALL: {
"shape_features": initialise_shape_features(),
"texture_features": initialise_channels_features(),
},
}
images, masks = get_images_and_masks_in_path(path)
for idx, _ in enumerate(images):
image = images[idx]
mask = masks[idx]
msk = cv2.imread(mask, cv2.IMREAD_GRAYSCALE)
_, msk = cv2.threshold(msk, 127, 255, cv2.THRESH_BINARY)
        # overlay the binary mask over its RGB counterpart
img = cv2.imread(image)
img = cv2.bitwise_and(cv2.cvtColor(msk, cv2.COLOR_GRAY2BGR), img)
contours, _ = cv2.findContours(msk, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
for contour in contours:
area = cv2.contourArea(contour)
ball_img = np.zeros(msk.shape, dtype=np.uint8)
cv2.drawContours(ball_img, contour, -1, (255, 255, 255), -1)
fill_img = cv2.bitwise_not(fill(cv2.bitwise_not(ball_img)))
rgb_fill = cv2.bitwise_and(cv2.cvtColor(fill_img, cv2.COLOR_GRAY2BGR), img)
out = fill_img.copy()
out_colour = rgb_fill.copy()
# Now crop image to ball size
(y, x) = np.where(fill_img == 255)
(topy, topx) = (np.min(y), np.min(x))
(bottomy, bottomx) = (np.max(y), np.max(x))
padding = 3
out = out[
topy - padding : bottomy + padding, topx - padding : bottomx + padding
]
out_colour = out_colour[
```
```python
topy - padding : bottomy + padding, topx - padding : bottomx + padding
]
# getting ball features
shape_features = shape_features_eval(contour)
texture_features_colour = {
"blue": texture_features_eval(out_colour[:, :, 0]),
"green": texture_features_eval(out_colour[:, :, 1]),
"red": texture_features_eval(out_colour[:, :, 2]),
}
# segmenting ball by using area
            if area > 1300: # American football
append_ball = BALL_LARGE
            elif area > 500: # football (soccer)
append_ball = BALL_MEDIUM
else: # tennis ball
append_ball = BALL_SMALL
for key in shape_features:
features[append_ball]["shape_features"][key].append(shape_features[key])
for colour in texture_features_colour.keys():
for colour_feature in texture_features_colour[colour]:
features[append_ball]["texture_features"][colour][
colour_feature
].append(texture_features_colour[colour][colour_feature])
return features
def feature_stats(features, ball, colours=["blue", "green", "red"]):
def get_stats(array):
return {
"mean": np.mean(array),
"std": np.std(array),
"min": np.min(array),
"max": np.max(array),
}
def get_ball_shape_stats(features, ball):
feature_find = ["non_compactness", "solidity", "circularity", "eccentricity"]
return {
feature: get_stats(features[ball]["shape_features"][feature])
for feature in feature_find
}
def get_ball_texture_stats(features, ball, colour):
feature_find = ["asm_avg", "contrast_avg", "correlation_avg"]
return {
texture: get_stats(features[ball]["texture_features"][colour][texture])
for texture in feature_find
}
```
```python
stats = {
ball: {
"shape_features": get_ball_shape_stats(features, ball),
"texture_features": {
colour: get_ball_texture_stats(features, ball, colour)
for colour in colours
},
},
}
return stats
def get_histogram(data, Title):
"""
data {ball: values}
"""
for ball, values in data.items():
plt.figure(figsize=(3,3))
plt.hist(values, bins=20, alpha=0.5, label=ball)
plt.xlabel(Title)
plt.ylabel("Frequency")
plt.legend()
plt.tight_layout()
plt.savefig("Report/assets/features/"+ Title + "_histogram_" + ball.replace("\n", "_"))
# plt.show()
if __name__ == "__main__":
features = get_all_features_balls("data/ball_frames")
balls = [
BALL_SMALL,
BALL_MEDIUM,
BALL_LARGE,
]
non_compactness = {
ball: features[ball]["shape_features"]["non_compactness"] for ball in balls
}
solidity = {ball: features[ball]["shape_features"]["solidity"] for ball in balls}
circularity = {
ball: features[ball]["shape_features"]["circularity"] for ball in balls
}
eccentricity = {
ball: features[ball]["shape_features"]["eccentricity"] for ball in balls
}
get_histogram(non_compactness, "Non-Compactness")
    get_histogram(solidity, "Solidity")
get_histogram(circularity, "Circularity")
```
```python
get_histogram(eccentricity, "Eccentricity")
channel_colours = ["red", "green", "blue"]
def get_ch_features(feature_name):
return {
colour: {
ball: features[ball]["texture_features"][colour][feature_name]
for ball in balls
}
for colour in channel_colours
}
def get_ch_stats(feature_data, colours=channel_colours):
return [[feature_data[colour][ball] for ball in balls] for colour in colours]
asm_avg = get_ch_features("asm_avg")
contrast_avg = get_ch_features("contrast_avg")
correlation_avg = get_ch_features("correlation_avg")
asm_range = get_ch_features("asm_range")
asm_data = get_ch_stats(asm_avg)
contrast_data = get_ch_stats(contrast_avg)
correlation_data = get_ch_stats(correlation_avg)
asm_range_data = get_ch_stats(asm_range)
asm_title = "ASM Avg"
contrast_title = "Contrast Avg"
correlation_title = "Correlation Avg"
asm_range_title = "ASM Range Avg"
plt_colours = ["yellow", "white", "orange"]
channels = ["Red Channel", "Green Channel", "Blue Channel"]
plt.figure()
def get_boxplot(data, title, colours=plt_colours, rows=3, columns=3, offset=0):
channels = ["Red Channel", "Green Channel", "Blue Channel"]
fig = plt.figure(figsize=(8,3)) # Get the Figure object
fig.suptitle(title) # Set the overall title
for i, d in enumerate(data):
ax = plt.subplot(rows, columns, i + offset + 1)
ax.set_facecolor(channel_colours[i])
ax.patch.set_alpha(0.5)
violins = plt.violinplot(
d, showmeans=True, showmedians=False, showextrema=False
)
for j, pc in enumerate(violins["bodies"]):
pc.set_facecolor(colours[j])
pc.set_edgecolor("black")
pc.set_alpha(0.2)
plt.xticks([1, 2, 3], balls, rotation=45)
plt.title(channels[i])
```
```python
def get_boxplot_specific(data, title, i, colours=plt_colours):
plt.figure(figsize=(2.5,6))
d = data[i]
violins = plt.violinplot(
d, showmeans=True, showmedians=False, showextrema=False
)
for j, pc in enumerate(violins["bodies"]):
pc.set_facecolor(colours[j])
pc.set_edgecolor("black")
pc.set_alpha(0.5)
plt.xticks([1, 2, 3], balls, rotation=45)
plt.title(title + '\n' + channels[i])
ax = plt.gca() # Get the current Axes instance
ax.set_facecolor(channel_colours[i]) # Set the background color
ax.patch.set_alpha(0.1) # Set the alpha value
columns = 3
rows = 1
get_boxplot_specific(asm_data, asm_title, 2)
plt.tight_layout()
plt.savefig("Report/assets/features/asm_data_blue_channel")
plt.close()
get_boxplot_specific(asm_range_data, asm_range_title, 2)
plt.tight_layout()
plt.savefig("Report/assets/features/asm_range_data_blue_channel")
plt.close()
get_boxplot_specific(contrast_data, contrast_title, 0)
plt.tight_layout()
plt.savefig("Report/assets/features/contrast_data_red_channel")
plt.close()
get_boxplot_specific(correlation_data, correlation_title, 1)
plt.tight_layout()
plt.savefig("Report/assets/features/correlation_green_channel")
plt.close()
```
#pagebreak()
== tracking_main.py
```python
from matplotlib import pyplot as plt
import numpy as np
def kalman_predict(x, P, F, Q):
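    # Prediction step: propagate the state mean through the motion model F
    # and grow the covariance as F P F^T + Q (process noise).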
xp = F * x
Pp = F * P * F.T + Q
return xp, Pp
def kalman_update(x, P, H, R, z):
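    # Update step: S is the innovation covariance and K the Kalman gain;
    # correct the predicted state with the measurement residual (z - zp).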
S = H * P * H.T + R
K = P * H.T * np.linalg.inv(S)
zp = H * x
xe = x + K * (z - zp)
Pe = P - K * H * P
return xe, Pe
def kalman_tracking(
z,
x01=0.0,
x02=0.0,
x03=0.0,
x04=0.0,
dt=0.5,
nx=0.16,
ny=0.36,
nvx=0.16,
nvy=0.36,
nu=0.25,
nv=0.25,
kq=1,
kr=1,
):
# Constant Velocity
F = np.matrix([[1, dt, 0, 0], [0, 1, 0, 0], [0, 0, 1, dt], [0, 0, 0, 1]])
# Cartesian observation model
H = np.matrix([[1, 0, 0, 0], [0, 0, 1, 0]])
# Motion Noise Model
Q = kq*np.matrix([[nx, 0, 0, 0], [0, nvx, 0, 0], [0, 0, ny, 0], [0, 0, 0, nvy]])
# Measurement Noise Model
R = kr*np.matrix([[nu, 0], [0, nv]])
x = np.matrix([x01, x02, x03, x04]).T
P = Q
N = len(z[0])
s = np.zeros((4, N))
```
```python
for i in range(N):
xp, Pp = kalman_predict(x, P, F, Q)
x, P = kalman_update(xp, Pp, H, R, z[:, i])
        # Store the full state [x, vx, y, vy]; the positions are at indices 0 and 2.
        s[:, i] = np.array(x).flatten()
    px = s[0, :]
    py = s[2, :]
return px, py
def rms(x, y, px, py):
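    # Root-mean-square Euclidean distance between the true trajectory (x, y) and the estimate (px, py).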
return np.sqrt(1/len(px) * (np.sum((x - px)**2 + (y - py)**2)))
def mean(x, y, px, py):
return np.mean(np.sqrt((x - px)**2 + (y - py)**2))
if __name__ == "__main__":
x = np.genfromtxt("data/x.csv", delimiter=",")
y = np.genfromtxt("data/y.csv", delimiter=",")
na = np.genfromtxt("data/na.csv", delimiter=",")
nb = np.genfromtxt("data/nb.csv", delimiter=",")
z = np.stack((na, nb))
dt = 0.5
nx = 160.0
ny = 0.00036
nvx = 0.00016
nvy = 0.00036
nu = 0.00025
nv = 0.00025
px1, py1 = kalman_tracking(z=z)
nx = 0.16 * 10
ny = 0.36
nvx = 0.16 * 0.0175
nvy = 0.36 * 0.0175
nu = 0.25
nv = 0.25 * 0.001
kq = 0.0175
kr = 0.0015
px2, py2 = kalman_tracking(
nx=nx,
ny=ny,
nvx=nvx,
nvy=nvy,
nu=nu,
nv=nv,
kq=kq,
kr=kr,
z=z,
)
```
```python
plt.figure(figsize=(12, 5))
plt.plot(x, y, label='trajectory')
    plt.plot(px1, py1, label=f'initial prediction, rms={round(rms(x, y, px1, py1), 3)}')
print(f'initial rms={round(rms(x, y, px1, py1), 3)}, mean={round(mean(x, y, px1, py1), 3)}')
plt.plot(px2, py2,label=f'optimised prediction, rms={round(rms(x, y, px2, py2), 3)}')
print(f'optimised rms={round(rms(x, y, px2, py2), 3)}, mean={round(mean(x, y, px2, py2), 3)}')
plt.scatter(na, nb,marker='x',c='k',label=f'noisy data, rms={round(rms(x, y, na, nb), 3)}')
print(f'noise rms={round(rms(x, y, na, nb), 3)}, mean={round(mean(x, y, na, nb), 3)}')
plt.legend()
plt.title("Kalman Filter")
plt.savefig("Report/assets/tracking/kalman_filter.png")
# plt.show()
``` |
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/georges-yetyp/0.1.0/template/main.typ | typst | Apache License 2.0 | #import "@preview/georges-yetyp:0.1.0": rapport
#show: rapport.with(
nom: "Georgette Lacourgette",
titre: "Titre du stage",
entreprise: (
nom: "Nom de l'entreprise",
adresse: [
12 rue de la Chartreuse, \
38000 Grenoble, \
France
],
logo: image("logo.png", height: 4em),
),
responsable: (
nom: "<NAME>",
fonction: "CTO",
téléphone: "+33 6 66 66 66 66",
email: "<EMAIL>"
),
tuteur: (
nom: "<NAME>",
téléphone: "+33 6 66 66 66 66",
email: "<EMAIL>"
),
référent: (
nom: "<NAME>",
téléphone: "+33 6 66 66 66 66",
email: "<EMAIL>"
),
résumé: [
#lorem(100)
#lorem(20)
],
glossaire: [
/ Georges : Prénom de la mascotte de l'école.
]
)
= Introduction
== Présentation de l'entreprise
#lorem(30)
#lorem(50)
#figure(
image("logo.png"),
caption: [Le logo de l'entreprise]
)
#lorem(100)
== Mes missions
#lorem(50)
#figure(
```rust
fn main() {
println!("Hello world!");
}
```,
caption: [Le fameux programme "Hello world"]
)
#lorem(130) |
https://github.com/TimPaasche/Typst.Template.CoverLetter | https://raw.githubusercontent.com/TimPaasche/Typst.Template.CoverLetter/master/README.md | markdown | MIT License | # Typst.Template.CoverLetter |
https://github.com/Amelia-Mowers/typst-tabut | https://raw.githubusercontent.com/Amelia-Mowers/typst-tabut/main/doc/example-snippets/rearrange.typ | typst | MIT License | #import "@preview/tabut:<<VERSION>>": tabut
#import "example-data/supplies.typ": supplies
#tabut(
supplies,
(
(header: [Price], func: r => r.price), // This column is moved to the front
(header: [Name], func: r => r.name),
(header: [Name 2], func: r => r.name), // copied
// (header: [Quantity], func: r => r.quantity), // removed via comment
)
) |
https://github.com/JarKz/math_analysis_with_typst | https://raw.githubusercontent.com/JarKz/math_analysis_with_typst/main/groups/fifth.typ | typst | MIT License | = Fifth group of questions
1. *Definition of the proper integral depending on a parameter, its continuity and integration.*
2. *Differentiation of the proper integral depending on a parameter (the Leibniz rule).*
3. *Definition of the improper integral depending on a parameter.*
4. *Uniform convergence of an improper integral depending on a parameter. The Cauchy criterion.*
5. *The Weierstrass test for uniform convergence of an improper integral depending on a parameter.*
6. *The Dirichlet test for uniform convergence of an improper integral depending on a parameter.*
7. *Continuity of the improper integral depending on a parameter.*
8. *Integration of the improper integral depending on a parameter.*
9. *Differentiation of the improper integral depending on a parameter.*
10. *The Dirichlet integral.*
11. *The Poisson integral.*
12. *Euler's beta function and its properties.*
13. *Euler's gamma function and its properties.*
14. *The relation between the beta and gamma functions.*
15. *Definition of the $n$-dimensional measure of a set.*
16. *Sets of measure zero.*
17. *Cubable (Jordan-measurable) sets.*
18. *Definition of the multiple integral.*
19. *Darboux sums. Criterion for the existence of the multiple integral.*
20. *A sufficient condition for the existence of the multiple integral.*
21. *Properties of the multiple integral.*
22. *Reduction of a double integral to an iterated integral.*
23. *Computing areas and volumes with the double integral.*
24. *Reduction of an $n$-fold integral to an iterated integral.*
25. *The geometric meaning of the absolute value of the Jacobian.*
26. *Change of variables in a double integral.*
27. *Change of variables in a multiple integral.*
28. *Line integrals of the first kind.*
29. *Line integrals of the second kind.*
30. *Green's formula.*
31. *Line integrals independent of the path of integration.*
32. *The notion of a surface.*
33. *Surface area.*
34. *Orientation of a smooth surface.*
35. *Surface integrals of the first kind.*
36. *Surface integrals of the second kind.*
37. *Definitions of gradient, divergence, and curl.*
38. *Circulation and flux of a vector field.*
39. *The Ostrogradsky (divergence) formula.*
40. *Stokes' formula.*
41. *Solenoidal and potential vector fields.*
|
https://github.com/max-niederman/MTH311 | https://raw.githubusercontent.com/max-niederman/MTH311/main/ivt.typ | typst | #import "./lib.typ": *
#show: common.with(title: "Intermediate Value Theorem")
= Induction on the Nonnegative Reals
Let $A$ be a subset of $RR$ such that
- If $[0, x) subset.eq A$, then $x in A$.
- For any $a$ in $A$, there exists some $b in RR$ such that $a < b$ and $(a, b) subset.eq A$.
Then $A supset.eq [0, oo)$.
_Proof_:
Consider the set $A^c = [0, oo) backslash A$,
and assume by contradiction that it is nonempty.
It is bounded below by zero, as all elements of $[0, oo)$ are nonnegative.
Therefore, $inf A^c$ exists and is a nonnegative real number.
Numbers in the interval $[0, inf A^c)$ cannot be in $A^c$,
as $inf A^c$ is a lower bound on $A^c$.
However, such numbers are in $[0, oo)$ and thus also in $[0, oo) backslash A^c = A$.
That is, $[0, inf A^c) subset.eq A$.
By condition one, we have $[0, inf A^c] subset.eq A$.
We then apply condition two to $inf A^c$ to find some $b in RR$ such that $inf A^c < b$ and $(inf A^c, b) subset.eq A$.
Taking the union of the two intervals, we have $[0, b) subset.eq A$, with $inf A^c < b$.
Now take some $x in A^c$.
It must be greater than or equal to $b$,
because if it were less than $b$, it would be in $[0, b)$ and therefore in $A$.
Therefore, $b$ is a lower bound on $A^c$,
which contradicts $inf A^c < b$.
Hence, the assumption that $A^c$ is nonempty must be false,
and $A supset.eq [0, oo)$.
#sym.qed
#colbreak()
= Increasing Nonnegative Domain
For any $L in RR$ and $b in [0, oo)$, and some function $f : RR -> RR$,
if $f$ is continuous on $[0, b]$ and
$f(0) <= L <= f(b)$,
then there exists some $c in [0, b]$ such that $f(c) = L$.
_Proof_:
Consider the set
$
B =
{ b in [0, oo) | &f "continuous on" [0, b] \
&and f(0) <= L <= f(b) \
&quad => exists c in [0, b], f(c) = L } "."
$
That is, the set of $b$ values for which the proposition holds w.r.t. $f$ and $L$.
== Limit Case
Assume that there is some $b$ such that $[0, b) subset.eq B$,
and furthermore that $f$ is continuous on $[0, b]$ and $f(0) <= L <= f(b)$.
If $L = f(b)$, then we have that there exists $c$ such that $f(c) = L$ trivially.
Otherwise, $L < f(b)$ and so $0 < f(b) - L$.
Then, because $f$ is continuous at $b$, we can apply the limit definition to find that there exists $delta > 0$ such that for all $x in RR$,
$ |b - x| < delta => |f(b) - f(x)| < f(b) - L "." $
Define $b' = max(0, b - delta/2)$.
Observe that
$
b - delta/2 &<= b' \
b - b' &<= delta/2 \
b - b' &< delta ","
$
and because $b' <= b$,
$
abs(b - b') &< delta "."
$
Now we apply the limit to get
$
abs(f(b) - f(b')) &< f(b) - L \
f(b) - f(b') &< f(b) - L \
- f(b') &< - L \
L &< f(b') "."
$
We already know that $f(0) <= L$, so
$ f(0) <= L < f(b') "," $
and because $b' in [0, b)$, we have $b' in B$,
so there exists $c in [0, b']$ such that
$ f(c) = L "." $
Furthermore, $[0, b']$ is a subset of $[0, b]$,
so that $c$ also satisfies the proposition for $b$.
== Successor Case
Assume that $b in B$.
That is, if $f$ is continuous on $[0, b]$ and $f(0) <= L <= f(b)$,
then there exists $c in [0, b]$ such that $f(c) = L$.
=== Case 1: $L <= f(b)$
Let $b' = b + 1$.
Consider any $x in (b, b')$,
and assume that $f$ is continuous on $[0, x]$ and $f(0) <= L <= f(x)$.
Because $x > b$, we have that $f$ is continuous on $[0, b]$ and $f(0) <= L$.
Furthermore, $L <= f(b)$, so the conditions for $b$ are satisfied.
Therefore, there exists $c in [0, b] subset.eq [0, x]$ such that $f(c) = L$.
=== Case 2: $f(b) < L$, $f$ discontinuous at $b$
Let $b' = b + 1$.
For any $x in (b, b')$, we know that $f$ is not continuous on the interval $[0, x]$
because it is not continuous at $b in [0, x]$.
Therefore, it follows from the principle of explosion that the continuity of that interval
implies the existence of some $c in [0, x]$ such that $f(c) = L$.
=== Case 3: $f(b) < L$, $f$ continuous at $b$
Define $epsilon = L - f(b)$.
Because $L > f(b)$, we have $epsilon > 0$.
Applying the limit of $f$ at $b$, we find that there exists $delta > 0$ such that for all $x in RR$,
$ |b - x| < delta => |f(b) - f(x)| < epsilon "." $
Let $b' = b + delta$ and consider any $x in (b, b')$.
Then
$
b <& x &<& b + delta \
0 <& x - b &<& delta \
-delta <& x - b &<& delta \
& abs(x - b) &<& delta \
& abs(b - x) &<& delta "."
$
Therefore,
$
abs(f(b) - f(x)) &< epsilon \
abs(f(x) - f(b)) &< epsilon \
f(x) - f(b) &< epsilon \
f(x) - f(b) &< L - f(b) \
f(x) &< L "."
$
So it would be a contradiction to have $L <= f(x)$.
Once again, the membership condition holds for $x$ by the principle of explosion.
== Conclusion
By applying the limit and successor cases to the induction result,
we have that $B supset.eq [0, oo)$.
Because $B subset.eq [0, oo)$ as well, we have that $B = [0, oo)$.
Hence, the proposition holds for all $b in [0, oo)$.
#sym.qed |
|
https://github.com/alberto-lazari/cns-report | https://raw.githubusercontent.com/alberto-lazari/cns-report/main/introduction.typ | typst | = Introduction
== Related works
In recent years, the widespread adoption of Quick Response (QR) codes has introduced new challenges for the security and robustness of mobile applications.
While QR codes can be used to provide simple and immediate access to information, they also present potential security vulnerabilities that malicious actors can exploit, since they are just another form of input for an application.
The paper "If Youâre Scanning This, Itâs Too Late! A QR Code-Based Fuzzing Methodology to Identify Input Vulnerabilities in Mobile Apps." by @carboni2023if made a first effort towards an automated fuzzing-based methodology able to address this problem.
Their proposal required a real smartphone to run the tests on, which limited automation and the reproducibility of the results, since these depend on various device-specific factors; it also caused some hardware-related problems.
== Contributions
Our work #cite(<QRFuzz-2>, form: "normal") builds upon the @carboni2023if research, which introduced a fuzzing framework designed to uncover QR code input vulnerabilities in mobile apps.
However, to improve the efficiency of the proposed approach with the aim of making the testing more scalable, we propose an alternative design.
Our proposal is based on virtualization of the mobile phone, which allows for a more detailed configuration of the environment and, potentially, for the creation of multiple parallel testing sessions, without the need for large quantities of dedicated hardware.
Our framework also uses a virtual camera, linked to the virtual device, on which QR codes can be displayed virtually, cutting out the hardware-related problems the previous proposal suffers from.
By virtualizing the testing infrastructure, we aim to overcome the limitations of traditional physical devices, providing researchers and developers with a more versatile platform for uncovering potential vulnerabilities in Android applications that provide QR code interactions.
The introduced virtualization not only improves the efficiency of the fuzzing process but also simplifies the adoption of this fuzzing methodology for new users.
The source code of our system is freely available on GitHub #footnote(link("https://github.com/albertolazari/qrfuzz")).
#v(5em)
== Report's structure
In the following sections of this report, we present the details of our virtualization approach.
We provide a comprehensive overview of the previous work (@previous_work), critically analyze the key aspects of its original design (@critical_aspects), and illustrate the details of our approach (@our_approach).
Subsequent sections discuss the technological and implementation details of webcam (@webcam_virtualization) and device virtualization (@device_virtualization), how the entire process is automated (@automation), results of the performed tests (@test_results) and conclude with insights into future possible actions that could further improve this field (@future_work). |
|
https://github.com/maucejo/book_template | https://raw.githubusercontent.com/maucejo/book_template/main/template/main.typ | typst | MIT License | #import "../src/book.typ": *
#show: book.with(
author: "<NAME>",
commity: (
(
name: "<NAME>",
position: "Professeur des Universités",
affiliation: "Streeling university",
role: "Rapporteur",
),
(
name: "<NAME>",
position: "Maître de conférences - HDR",
affiliation: "Synnax University",
role: "Rapporteur"
),
(
name: "<NAME>",
position: "Maître de conférences",
affiliation: "Beltegeuse University",
role: "Examinateur"
),
(
name: "<NAME>",
position: "Maître de conférences",
affiliation: "Caladan University",
role: "Examinateur"
),
),
lang: "fr"
)
#show: front-matter
#include "front_matter/front_main.typ"
#show: main-matter
#tableofcontents()
#listoffigures()
#listoftables()
#include "chapters/ch_main.typ"
// #bibliography("bibliography/sample.yml")
#bibliography("bibliography/sample.bib")
#show: appendix
#include "appendix/app_main.typ"
#back-cover(resume: lorem(100), abstract: lorem(100)) |
https://github.com/Blezz-tech/math-typst | https://raw.githubusercontent.com/Blezz-tech/math-typst/main/Картинки/Демо вариант 2024/Задание 01.3.typ | typst | #import "@preview/cetz:0.1.2"
#import "/lib/my_cetz.typ": defaultStyle
#set align(center)
#cetz.canvas(length: 0.5cm, {
import cetz.draw: *
import cetz.angle: angle
set-style(..defaultStyle)
let (A, B, C, D, O) = ((0,4), (-10,0), (0,-4), (10,0), (0,0))
angle(B, A, O, label: text([$13 degree$], size: 8pt) )
angle(B, C, O, label: text([$13 degree$], size: 8pt) )
angle(C, B, D, radius: 0.6)
angle(C, B, D, radius: 0.45)
line(A, B, C, D, close: true)
circle(O, fill: blue, stroke: (paint: blue), radius: 0.07)
content(O, $ O $, anchor: "top-left")
content(A, $ A $, anchor: "bottom")
content(B, $ B $, anchor: "right")
content(C, $ C $, anchor: "top")
content(D, $ D $, anchor: "left")
line(B, D, stroke: (dash: "dashed", paint: blue))
line(A, C, stroke: (dash: "dashed", paint: blue))
})
|
|
https://github.com/EpicEricEE/typst-marge | https://raw.githubusercontent.com/EpicEricEE/typst-marge/main/tests/overlap/side/test.typ | typst | MIT License | #import "/src/lib.typ": sidenote
#set par(justify: true)
#set page(width: 12cm, height: auto, margin: (x: 4cm, rest: 5mm))
#let sidenote = sidenote.with(numbering: "1")
#lorem(8)
#sidenote(side: right)[This sidenote is on the right side.]
#lorem(2)
#sidenote(side: left)[
This one is on the left side and thus doesn't overlap with the previous one.
]
#lorem(6)
#sidenote(side: left)[
This is also on the left side and does overlap.
]
#lorem(28)
|
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/055%20-%20Murders%20at%20Karlov%20Manor/006_Episode%206%3A%20Explosions%20of%20Genius.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"Episode 6: Explosions of Genius",
set_name: "Murders at Karlov Manor",
story_date: datetime(day: 12, month: 01, year: 2024),
author: "<NAME>",
doc
)
"Where are we going, exactly?"
"All will be clear in short order."
"Mmmm âŠ" Etrata tapped her chin and grabbed Proft's arm. Before he could react, she pulled him into a nearby alley, spun him around, and pressed his back against the wall. He blinked as she loomed over him. She wasn't taller than he was; looming should have been impossible, and yet, she was accomplishing it.
"No," she said in a level, reasonable tone. "All will be clear right now, or we're not going any farther. I've had to prevent one attempt on your life. If it hadn't been someone who knows me, that would #emph[not] have ended well for you. So, you're going to tell me where we're going, or we're not going there."
Proft raised an eyebrow. She didn't flinch. After a long, strained silence, he sighed and glanced toward the street, apparently checking to see if their little altercation had attracted any unwanted attention.
Seeing that they were alone, he turned back to Etrata. "I'll need you to release me first."
Grudgingly, she let go of his arm. Proft rubbed the spot she'd been gripping as he shook his head.
"Is such violence a common means of discussion among House Dimir?" he asked. "It seems a bit ⊠blunt."
"Only when the person we're dealing with is immune to seeing sense," she said. "Where are we going?"
"The powder I found in your chamber is like nothing I've encountered before. We need to have it analyzed by someone we can trust not to be working against us."
"You told me we were going to see a friend of yours," said Etrata. "Not that it would involve walking down the street in broad daylight. I'm a fugitive."
"Yes, and apparently I'm a marked man. Your point would be?" Something in Etrata's expression must have told Proft he was on thin ice, because he continued: "Kylox is ⊠a sensitive individual. After some issues with espionage and patent theft, he has become very attuned to anything that smacks of subterfuge or deceit. With him, you're #emph[more] likely to be seen and intercepted if you attempt to come in via a back route or hidden passage than if you approach openly."
Etrata blinked. "That may be the most ridiculous thing I've ever heard."
"He's brilliant, really, just paranoidâif it can be called paranoia when every one of his wild assertions has eventually proven true. Regardless, any other route could do us material harm."
"And we're going to him, not someone else, because �"
"We can't go to the Agency without revealing my involvement in your escape. We can't go to the guilds until we know who set an assassin on my trail. This is someone I trust implicitly, who fears discovery enough to understand any requests we make for circumspection, who will ask few questions. There's none better in this city, I assure you."
Etrata frowned. "I still don't like this open of an approach."
"We're nearly there."
"You need to start telling me things. You can't hold them back just because you want to look clever when you reveal them."
Proft smiled, very faintly. "I'll take that under consideration."
They left the alley for the street. At the corner, Proft turned, then turned again, heading down a narrow lane between two shops. Etrata followed. When the lane branched, he turned again onto an even narrower sub-street barely wide enough to allow the shops on either side to open their doors.
The street ended at a blank wall. Proft touched the wall, almost in imitation of Etrata at her hideout, then backtracked three storefronts to a door with a small plaque identifying it as a bookkeeper's office. He rapped his knuckles against it and stepped back, waiting. After several seconds ticked by without anything changing, he frowned and knocked again.
He waited longer this time. When he began to step forward again, Etrata held up a hand to stop him.
"I'm sturdier than you are," she explained, moving to try the doorknob. "If this thing is set up to shock or poison me or something, I'll probably be fine. You wouldn't be."
"A valid reason, if not one I particularly care forâoh, hello." The doorknob twisted easily when she turned it, and when she let go, the door swung silently inward. "That's unusual. He never leaves the door unlocked."
"Lovely," said Etrata. "So we're probably walking into a trap."
"I certainly hope not," said Proft. He pushed past her, stepping into the darkness beyond.
Etrata sighed and followed.
As soon as they were both inside, a series of linked tubes around the edges of the room lit up as the lightning elemental contained inside began darting back and forth, filling the air with a jittering, flickering light. There was a break in one of the containment units, which probably explained the flickering; under ideal conditions, the elemental's light would have been steady enough to work by. The light was still enough to reveal a small workshop, the sort of place maintained by a single inventor for their private use, cozy and compact—and destroyed.
It looked as if an entire gang of people had passed through, smashing everything they could get their hands on. Scraps of paper and bits of blueprint littered the floor. The active tubes were clearly a backup system; larger tubes lower on the wall had been smashed, adding shards of glass to the debris.
Proft moved to the center of the room, not saying a word. The crunch of glass under his feet and Etrata's quiet exclamation of general dismay were both swallowed by the silence. Proft stopped to take another look around before pressing his index fingers together, resting them against his chin, and bowing his head in apparent concentration.
Thin blue lines spread outward from his feet, racing across the room to crawl up the walls and across the ceiling. They met there, knotting together in an elaborate network of delicate tangles. The space between them lit up blue-white, until the entire room was bathed in a magical glow, Proft at the center.
#figure(image("006_Episode 6: Explosions of Genius/01.png.jpg", width: 100%), caption: [Art by: Daarken], supplement: none, numbering: none)
"Hmm," he said, lowering his hands. "This isn't correct."
The light pulsed, and the workshop was no longer destroyed. It was perfect and pristine, cluttered as Izzet workshops almost always were, but with no sign that anything more dramatic than a late-night brainstorming session had ever happened here. The light steadied, flicker fading as the containment system was restored. Slowly, Proft began pacing around the outside of the room, occasionally ruffling through a stack of light-limned papers or adjusting a pencil that Etrata knew for a fact wasn't actually there anymore. Finally, he paused in front of a patch of wall, squinting at it before looking at the floor.
"Etrata, your assistance, if you would be so kind." He snapped his fingers, and the light shattered around them, replaced by the wreckage of the workshop as it actually existed. Etrata hurried across the room to stand beside him.
He was looking at the floor when she got there, or more specifically, at a capsized bookshelf covering a large section of the floor. "What do you need?" asked Etrata.
"I need this moved."
"And you think I can do it?"
"I do."
It was a simple statement, made with such assurance that Etrata was bending to move the bookshelf aside almost before she realized she was going to agree. It was made of good, sturdy hardwood and had clearly been designed to stand up to the rigors of the workshop; as she hoisted it off the floor, books and small boxes fell from the shelves, landing around her feet.
Moving the bookshelf revealed a rumpled rug, which had been almost entirely obscured. Proft nodded satisfaction and crouched down.
"This is normally where you would say 'thank you,'" said Etrata.
Proft ignored her, flipping back the edge of the rug and revealing ⊠absolutely nothing.
"Your secret door was Izzet work," he said, standing again as she set the bookshelf on its base. "Try the same patterns here, if you would be so kind."
Etrata looked at him suspiciously but got down on her knees and began drumming her fingers against the floor. The first several taps sounded solid. The next sounded almost hollow. She looked up again.
"No one builds a hidey-hole like an Izzet inventor, but once they have a mechanism that works, they tend to keep it until someone else manages to come up with something better," said Proft, taking a step back to give her room. "It's always amused me that Kylox was so angry about spies in his last shared lab. He's as big a thief as the rest of them."
A square portion of floor dropped several inches with a loud click. Etrata kept drumming, rolling her eyes at the same time. "Oh, and I suppose you like having people who are smarter than you around while #emph[you're] trying to work?"
"I wouldn't know," said Proft, as the floor dropped again, this time all the way, revealing a trapdoor. He stepped forward, offering Etrata a hand up. "It's never happened. Shall we descend?"
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The trapdoor led to a ladder; the ladder led down into the boilerpits, distant firelight illuminating the web of tangled pipes and exposed steam vents. The air was hot, heavy, and remarkably fragrant.
Proft took a satisfied breath, then coughed. "Breathe shallowly," he advised. "This isn't the sewer, but filth still sinks, regardless of the intent."
"I've been here before," said Etrata. "Which way?"
"Kylox never leaves Izzet-controlled territory if he can help it," said Proft. "This pipe runs in two directions. That way, he leaves the district. This way, he moves deeper in." He began walking away from the district. Etrata followed without hesitation.
They had gone about ten feet when Etrata grabbed Proft by the shoulder. He stopped, looking back at her. She gestured downward. He glanced at the ground.
"Yes, the tripwire. I saw it," he said.
"It's too obvious. There's probablyâ"
"A pressure plate on the other side. I anticipated that."
She threw up her hands. "Why am I trying so hard to keep you alive? Clearly, you don't need it."
"No, but it's sweet of you." Proft turned, scanning the pipes until he found a break in the pattern. "This way. I believe we'll find our host in short order."
They squeezed through the gap he'd spotted, following the series of bends on the other side until it opened into a larger chamber. There, hunched over a makeshift drafting desk and writing furiously, was a red viashino, facial scales reflecting the pallid light of the lantern on the desk's edge.
"Hello, Kylox," said Proft. "You've looked better."
Kylox's head jerked up, eyes widening behind their magnifying lenses. "Alquist!" he exclaimed, dropping his pen.
Seen more closely, he obviously hadn't escaped the destruction of his workshop unscathed. He was missing several scales, his clothing was torn and disarrayed, and the short spikes atop his head looked as if they'd been brushed backward, creating a prickly hedgehog effect.
"How did you �" he asked, then stopped. "Why am I asking? You'll have some nonsensical answer that ends with you here whether I want you here or not. What do you want?"
"I found a substance I want you to analyze," said Proft, as calm as if this were a perfectly normal place to have a conversation about science.
Kylox didn't appear to agree. He gaped at Proft. "Get out," he said.
"What?"
"Get out. No matter how many favors I owe you, I'm not doing this right now." He cast an anxious glance at the passage they'd come through. "Were you followed?"
"No," said Etrata with certainty.
"How did you get here?"
"The door in your workshop," said Proft. "Really, Kylox, if you would justâ"
"How did you open that? Whyâno, never mind." Kylox rose, gathering an armful of papers from the desk, tail swaying as he moved toward a small shelf jammed below a row of pipes. "I don't have the equipment here to do what you need, Proft. Go. I'll send a message when it's safe."
"If you tell us what happened, perhaps we can help," said Proft.
"You can't help," said Kylox, glancing around as he stacked his papers. Everything about the inventor radiated anxiety: the way he stood, the way he moved, the tenor of his voice. "I was workingâI was working on something secret. Something no one was meant to know."
"What sort of something?" asked Proft.
Kylox whirled, exploding into motion. The spikes on his head rose in agitation. "No, no! Not you! I can't tell #emph[you] ! I can tell Ezrim. Only Ezrim. Can you get me to him without being seen? Is that within your power, oh great detective?"
The way he spoke the words, they weren't praise. He turned them into a cutting insult, and Etrata glanced to Proft to see how the man would react.
His expression hadn't changed, and didn't as the sound of footsteps echoed along the tunnels, racing toward their location. Kylox's eyes widened.
"They're here," he moaned. Then, in a much softer voice, he commanded, "Leave me. #emph[Hide!] "
Proft moved then, grabbing Etrata by the arm and pulling her with him across the room to a bank of unusually dense piping. He let her go to grab a section of the grid, pulling down and yanking it toward himself. It swung outward, and he jumped inside, Etrata close behind.
When closed, the false wall of piping was solid, with only a few gaps through which they could see the room. They watched in silence as a group of goblins poured through another hidden passageway, surrounding Kylox, who flinched away from them. Proft tensed. The goblins produced thin lengths of mizzium chain, wrapping them around Kylox, who said nothing and only allowed himself to be taken.
Proft slumped against the wall of their narrow hiding space. Etrata kept her eyes on the attack. Something moved in the corner, catching Proft's attention, and he glanced toward it, watching a spider creep down the wall and vanish back into shadow.
When he looked back, the goblins were gone, and Kylox was gone with them. He frowned and gestured for Etrata to open the wall again.
The two stepped back into the now abandoned chamber.
"Do you allow all your friends to be taken captive like that?" demanded Etrata.
"Kylox is a coward in many ways," said Proft, scanning the area. "He wouldn't have suggested we hide if he thought we could help him. We were well outnumbered. He doesn't know your capabilities, but by allowing himself to be taken, we can now follow and hopefully recover him. Come along." He strode across the room, heading for the entry the goblins had used.
On the way, he paused by the shelf Kylox had been next to, just long enough to grab a small, unornamented wooden box and tuck it under his arm. Etrata frowned. He didn't pause, and in the end, she had to follow.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Being back on the busy streets of Ravnica was more of a relief than Kaya would have believed before visiting the moor outside of Vitu-Ghazi. Ravnica was supposed to be a place of constant sound and motion, life without end, even when it ended. Open spaces and green places were for Kaldheim and Dominaria, not for here. Not for the plane that had become her home.
She knew these crowds. Even after everything, she knew these people, knew the way they moved and hurried from place to place ⊠knew when something broke the pattern. The people behind them weren't moving the right way. She set a hand on Kellan's arm, guiding him toward the nearest alley.
He started turning toward her, and she tightened her grip, still facing forward.
"Say nothing," she said pleasantly. "We're being followed."
Kellan blinked, letting her lead him away from the crowd. Once they were in the alley, they turned, waiting.
It was a short wait. A group of six people in long, dark robes followed them, too quickly to have arrived by accident, and fanned out to surround the pair. One of them had a hefty hammer. Kaya frowned.
"What is this?" she asked. "An ambush, or a staring contest?"
The nearest robed figure lunged. Kaya danced back and kept moving as the other five joined the fray, all six attacking at once.
Not in unison, but not in the convenient one-by-one pattern so many groups seemed to use, either. Three grabbed for Kaya, the other three lunging for Kellan. Kaya turned partially insubstantial, letting the first attacker charge right through her, his own momentum carrying him into the nearby wall. He impacted with a sickening crunch.
#figure(image("006_Episode 6: Explosions of Genius/02.png.jpg", width: 100%), caption: [Art by: Durion], supplement: none, numbering: none)
Drawing her daggers, Kaya focused on the others who had decided she was the better target, shifting her weight to her rear foot while she waited for them to come at her. They were both substantially larger than she was, making speed her best asset in this fight. Speed, and the ability to turn insubstantial. It was almost exhilarating, having something as straightforward as a simple alley brawl to worry about. She spun and wove, letting them reach for her, striking when they got too close. She dropped the first almost before the fight had been joined in earnest.
The one with the hammer was down, felled by a blow to the back of the neck, before she had a chance to check on Kellan. Like her, he was down to his last opponent; the other two were on the ground, their faces pressed to the alley floor. Kellan had his swords drawn and was matching blade for blade with his remaining attacker, who held a pair of vicious-looking knives. Kaya kicked her remaining opponent, first in the knee, then in the groin. He folded like a broken ladder, and she kicked him in the head for good measure before she started toward Kellan. Then she stopped dead, blinking.
Kellan had hooked his blades around the attacker's knives and disarmed him with expert skill, leaving the man looking helplessly around for something else he could use as a weapon. Before he could find it, Kellan slammed his shoulder into the man's chest, knocking him back.
Kaya moved toward the man, who was at least still conscious, and wrapped one hand around his throat, spinning her dagger in the other hand as she pinned him to the wall, trapping one arm against his body. "Hello," she said. "We're your intended victims. You want to tell us why you came after us?"
The man moved his free arm quickly enough that if he'd been holding another knife, he could have hurt her grievously. Instead, he shoved something that looked like a tiny green sprout into his mouth, triumph lighting up his eyes as he swallowed.
"What was that?" Kaya demanded.
"I don't know, but the ones on the ground just did the same thing," said Kellan.
Kaya could only stare as the flesh of the man in front of her softened and turned green, growing plush as it transmuted into moss beneath her hand. Then he dissolved, moss scattering across the alley floor, leaving his empty robe to drop to the ground. Kaya danced backward with a wordless sound of disgust, shaking his remains from her fingers. They didn't stick, and no signs of the same transmutation marred her skin.
Kellan was retching. Kaya turned to face him. All the other attackers had transformed into the same mossy scatter. Kellan bent forward, hands on his knees, and paused.
"Kaya, come over here."
"Why?"
"Because I need you to see something."
Careful not to step in the moss, Kaya moved to Kellan's side. He drew one of his bracers and stooped, shifting the cloak with the tip. "Look," he said.
A tuft of white fur tipped in gray clung to the fabric. Kaya straightened, staring at him. Kellan did the same, nodding very slightly as he did.
They were still staring at each other when an Agency thopter zipped into the alley. It flew to a stop between them before projecting a holo-message of Ezrim in his office, looking at them sternly.
"Your recent altercation has been noted," it said. "Boros officers are on their way to your location. Secure the scene and return to headquarters."
"Yes, sir," said Kellan automatically. The thopter zipped away as he pulled a pair of weighted ovals out of his pocket, gesturing for Kaya to step out of the alley as he affixed one to either side of the entrance. As soon as he let go, a ribbon of pure light extended between the two.
"Agency barrier wards," said Kellan. "They'll let our investigators through, but no one else. Come on."
Together, they moved down the street, moving through the crowds swiftly and without further challenge.
The street outside the Agency headquarters was clear, the previous swarm of gossiping agents dissipated. Kaya still ducked behind Kellan as they climbed off their mounts and made for the doors, letting the actual, official agent precede her into the building.
The ghost of Agrus Kos was waiting in the foyer. "The boss wants you," he said as they entered. "Says it's urgent. Doubly so for you, ma'am." He nodded to Kaya, a sympathetic look on his faintly translucent face.
That was enough to tell Kaya what Ezrim had for her, and she hurried down the hall, leaving Kellan and Agrus Kos to follow. Ezrim's door was closed when she arrived. She didn't bother knocking but simply walked straight through.
Ezrim was behind his desk. He looked up when she appeared, seemingly unsurprised by her entrance. "Thank you for coming so quickly," he said. "Though I have a door for a reason."
There was a knock at the door. Ezrim glanced toward it.
"Some people remember manners," he said. "Come in!"
Kellan slipped into the room, Agrus Kos close behind. "You called for us?"
"Yes." Ezrim returned his attention to Kaya. "Teysa's killer has been apprehended by the Azorius."
Kaya's legs felt suddenly weak. She grabbed the edge of a bookshelf to hold herself up. "Sir?"
"A low-level hitman, no guild affiliation," said Ezrim. "He swears he doesn't know what happened. One minute he was walking through the Eighth District, and the next, he was in Karlov Manor, covered in Teysa's blood. He ran. Someone saw him leaving the area, and the Azorius were called. They have him in custody."
Kaya and Kellan were gaping at him when the door slammed open and Aurelia appeared, dragging a thrashing woman in Rakdos colors by the hair. The woman had been tied up, hands secured behind her back, but she fought like she thought she could break free. Aurelia half-threw her to the floor, wings spread in indignation.
"She was coming for #emph[me] ," she snapped, voice cold as the grave. "She killed ten of my guards before I stopped her."
"You #emph[cheated] ," snapped the woman. "Not supposed to bring wings to a ground fight. Naughty and nasty and not playing fair."
Aurelia ignored her, absorbed in her own fury. "This is the one they call Massacre Girl, and her presence proves the Cult of Rakdos is behind all this senseless slaughter. We should have known. I'll gather the Legion, and we'll march—"
If the Boros Legion marched to war against another guild with the city in such a delicate state of recovery, everything would collapse. The Dimir were missing, and the Golgari were in self-imposed exile. Ravnica couldn't afford to lose another guild.
This might not be her home anymore. That didn't mean Kaya wanted to see the place burn.
"Wait," said Kaya desperately.
"Better listen, bird-lady," said Massacre Girl snidely.
Kaya resisted the urge to kick her.
"These agents have been to see Judith of the Rakdos and were returning to make their report," said Ezrim.
"Agents?" Aurelia looked at Kaya quizzically, rage temporarily dimmed by confusion.
"For this case," said Kaya. "I'm more neutral than many. Massacre Girl. Why did you attack the Boros warleader when you knew the possible consequences?"
"I don't know," said Massacre Girl. "I don't remember anything before she was sweeping my legs out from under me and stepping on my chest. I didn't even get paid."
Kaya turned back to Aurelia. "You see, this could be connected to the other attack. In both cases, the assailant didn't remember the act, or know to evade the aftermath. Warleader, we spoke to Judith, and she didn't have the demeanor of a guilty person. If anything, she was helpful. She directed us toward a lead, which we're presently investigating. Please, we need time before open accusations are made against another guild."
"Even in the face of your own loss, you would press for patience," said Aurelia.
"You would do the same, if you were thinking clearly," said <NAME>os. "Listen to her. She speaks sense."
Aurelia frowned at him. "#emph[You] would advise #emph[me] ?"
"You sent me to oversee. I'm overseeing." He looked at her calmly. "They need time."
Aurelia closed her wings, still frowning. "Twenty-four hours, no more, and the assassin stays with us," she said. "If another prisoner is lost, heads will roll."
#figure(image("006_Episode 6: Explosions of Genius/03.png.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"That's all we'll need," said Kaya with evident relief.
Aurelia gathered her prisoner and her pride and swept away. As soon as she was gone, Kaya sagged. Kellan put out an arm to steady her.
"It's fine," she said, waving him away. "Just ⊠having a killer means this isn't a trick. Teysa's really gone."
One more person she hadn't saved, one more friend she'd never see again—not in the same way. Teysa's ghost might come back, but that wouldn't undo the damage. Kaya rubbed her face with one hand. Being someone she cared about was starting to feel like a dangerous proposition.
"We have twenty-four hours," she said, lowering her hand. "Let's get to work."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
The goblins who had captured Kylox clearly hadn't realized anyone was watching; they made no effort to cover their tracks as they passed through the boilerpits to their own ladder to the street above. Proft and Etrata stayed back far enough not to be seen and followed them out into the early evening air.
The kidnappers made no effort to hide their thrashing captive, either, but no one looked too closely or stopped to ask them what they were doing. Proft and Etrata continued to follow, not interfering, as the goblins carried Kylox to a disreputable-looking pawnshop. They exchanged a look then hurried to the store, stopping outside.
Proft produced something that looked like a small trumpet from inside his jacket, pressing one end to his ear and the other to the glass. Etrata began to ask him a question. He waved her off then pressed a finger to his lips, signaling her to stay quiet.
Inside, the familiar, faintly nasal voice of Krenko rang out clear and true: "What do you know?"
"Nothing!" Kylox replied. "I'm notâI don't understand what you think Iâ"
"The #emph[killings] , what do you know about the #emph[killings] ?" Krenko sniffed. "I know enough to know I'm at risk here. You're #emph[going] to tell me everything you know."
Proft lowered the trumpet. "That sounded like a threat to me," he said, looking to Etrata. "Those guards, do you think you could take them?"
Etrata looked mildly offended. "I'm a professional."
"Excellent," Proft said and kicked the door open.
Etrata surged into the room like a shadowy tide, Proft strolling along behind her.
"That will be quite enough of that," he said mildly as Etrata disarmed the first of the goblin guards. Krenko squawked in surprise, moving so two more of his men could cover him, only to watch them go down as well. The Dimir assassin moved with restrained grace, and in moments, all six guards were on the ground, not moving.
Etrata moved to start untying Kylox, while Proft focused on Krenko. "What," he asked, "are you doing?"
"Iâimportant people have been dying!" said Krenko. "#emph[I'm] important! I could be next! #emph[He] "âhe indicated Kyloxâ"was talking about doing work for important people, but he wouldn't work for me! He might know something! He's #emph[going] to tell me!"
"I did tell you," said Kylox, rubbing his wrists as he stood. "I don't know anything. Alquist, thank you. I hoped you'd understand what I was asking for."
Proft didn't have time to respond before the window smashed in and a bulky man in laborer's clothes crashed into the room. He charged for Krenko, swinging a dagger—and ran into Kylox first. There was a strangled gasp as the viashino fell out of the way and slid, motionless, to the floor. Proft moved to his friend while Etrata slashed at the curled grip of the attacker, knocking the dagger loose. She jumped onto his back, then, and wrapped an arm around his throat.
#figure(image("006_Episode 6: Explosions of Genius/04.png.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"Krenko, you useless pile, the chains!" she shouted.
Surprise broke through Krenko's look of terror, and he hurried to get the chain that had been used to tie up Kylox, tossing it to Etrata. She squeezed the man's neck a little tighter, then slid down and began tying him up quickly, immobilizing him.
When she turned, Proft was there, a bleak look on his face. "Kylox?" she asked.
He shook his head.
"I'm so sorry."
"As am I." He stepped toward the attacker. "Why are you here?"
The man didn't answer, only snarled at the cowering Krenko. Proft frowned.
"His eyes aren't focused, Etrata," he said. "See?"
"His pupils are too dilated," she said. "He's clearly intoxicated."
"Perhaps âŠ" Proft glanced over his shoulder. "We need to break the fugue somehow."
"Allow me," Etrata said and stepped in front of the man, locking her gaze on his own.
There was no outward display of her psychic abilities, but he jerked back, pupils returning to a more normal state as he tried to recoil from her and the fear she had induced.
"What am I doing here?" he demanded, nearly sounding panicked. "This isn't the florist. My husband's going to kill me!"
"As I suspected." Proft turned to Etrata. "People are being brainwashed into these attacks. They can't be held responsible, as you can't. Someone is doing this. And I am #emph[going] to find out who."
|
|
https://github.com/VisualFP/docs | https://raw.githubusercontent.com/VisualFP/docs/main/SA/design_concept/appendix/questionnaire_answers.typ | typst | #import "@preview/tablex:0.0.5": tablex, cellx
= Design Iteration One - Survey Results <design_iteration_one_survey_results>
The design evaluation questionnaire (as described in @design_eval_questionnaire) was given to 7 students and experienced programmers.
These are the results:
#let questionnaireResult(
participant,
participantDescription,
answersPerConcept,
generalComments: []
) = {
  [=== Survey Results from #participant] // the heading_increase doesn't seem to affect this
participantDescription
for conceptAnswers in answersPerConcept {
let (concept, meaningAnswer, lookAnswer, teachingAnswer, scalingAnswer, suggestionsAndComments) = conceptAnswers
    heading(level: 5, numbering: none, concept) // the heading_increase doesn't seem to affect this
figure(
tablex(
columns: 2,
stroke: .5pt,
cellx(align: center + horizon)[*Question*],
cellx(align: center + horizon)[*Answer*],
cellx()[*Were you able to understand the meaning of the boxes and arrows?*],
cellx(conceptAnswers.meaningAnswer),
cellx()[*Do you find the concept nice to look at?*],
cellx(conceptAnswers.lookAnswer),
      cellx()[*Could you imagine teaching functional programming using this visualization?*],
cellx(conceptAnswers.teachingAnswer),
cellx()[*Could you imagine how the concept scales to more complex expressions?*],
cellx(conceptAnswers.scalingAnswer),
cellx()[*Do you have any suggestions for improvement or general comments on the concept?*],
cellx(conceptAnswers.suggestionsAndComments),
),
kind: "table",
supplement: "Table",
caption: "Design questionnaire answers for " + concept + " design from " + participant
)
}
  if (generalComments != "" and generalComments != []) {
heading(level: 5, numbering: none, "General Comments")
generalComments
}
}
#questionnaireResult(
"Prof. Dr. <NAME>",
"Prof. Dr. <NAME> is a lecturer at OST and advisor of this project.",
(
(
concept: "Flo-inspired",
meaningAnswer: "Not really. Semantics & the arrows are unclear (insertion or reverse result)",
lookAnswer: "Not really.",
teachingAnswer: "Not really. The arrows obsucre the denotational semantics.",
scalingAnswer: "Yes. The arrows allow blocks to remain small.",
suggestionsAndComments: "",
),
(
concept: "Scratch-inspired",
meaningAnswer: "Somewhat better than the Flo-inspired version.",
lookAnswer: "Somewhat better than the Flo-inspired version.",
teachingAnswer: "Somewhat better than the Flo-inspired version.",
scalingAnswer: "Somewhat better than the Flo-inspired version.",
suggestionsAndComments: "Without types, one has no guidance on which blocks fit where",
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "Better than the other two, but not quite there yet.",
lookAnswer: "Better than the other two, but not quite there yet.",
teachingAnswer: "Better than the other two, but not quite there yet.",
scalingAnswer: "Better than the other two, but not quite there yet.",
suggestionsAndComments: "",
)
),
generalComments: [
The questionnaire may not do full justice to the visual programming methods since it only reviews the end state, and not the method of programming.
All methods seem to have a "bottom-up" strategy on constructing programs (i.e. start with small steps with what is available, and tinker with it until you come up with something that you can use).
The imperative paradigm forces one to do this (top level blocks are always ";", and therefore uninteresting).
In FP, we are able to design our programs "top-down", starting with a specification (type definition at least).
This specification often admits an interesting top-level function (e.g. filter, map), with further "holes" that can similarly be filled successively.
It may be a good idea to design the VP tool to support the method we want people to learn for "how to design programs" (see "recipe for defining functions" & video on "Schreib dein program").
There are huge parallels between programming & constructing formal proofs (Curry-Howard-Lambek isomorphism) that can be a mental aid in designing such a tool - even if one does not immediately expose this to the beginner (please don't).
The more I think about it, the more I am under the impression the VP tool and concept should support the existing recommended methodology and process of designing (functional) programs.
This process has been quite well thought out, and does not need to be re-invented.
What I feel is missing, when doing this in a textual form, is that the "visual model" of what this text should look like in the minds of the learners is not immediately visible.
A visual tool can help learners build the correct visual model/intuition faster.
Once this visual model/intuition is finally in place, the tool will offer little benefit and become tedious to use.
The users will then switch to the textual representation, but still always have the visual model in mind.
]
)
#questionnaireResult(
"<NAME>",
"<NAME> is a scientific employee at the institute for software at OST",
(
(
concept: "Flo-inspired",
meaningAnswer: [
Mostly. I was first wondering why the arrow in the "Product of Numbers" example goes from the interim-result-ellipse 'Num a => a' to the argument slot of "(\*):apply", instead of the product-block as with all other cases where functions are passed as parameters.
But then I realised that the result of the function call with xs is passed and not the function itself.
],
lookAnswer: "No, too noisy.",
teachingAnswer: "Perhaps, but only as an aid to show certain signatures of a partial expression, not in general to teach functional programming from the ground up.",
scalingAnswer: "It'll get very complex very fast.",
suggestionsAndComments: [
- Move type-signatures into the blocks instead of above them
- Make type-signatures hideable
- Option to switch between curried-interpretation and n-ary-function interpretation
]
),
(
concept: "Scratch-inspired",
meaningAnswer: "I think so.",
lookAnswer: "Yes",
teachingAnswer: "Yes, but I don't see an advantage compared to a pretty AST.",
scalingAnswer: "It'll look like a mountain-skyline.",
suggestionsAndComments: "Highlight which argument-instances belong to which argument-bindings when hovering over them.",
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "Mostly, but I'm not sure if I understood everything right.",
lookAnswer: "Yes",
teachingAnswer: "Perhaps, but only as an aid to show certain signatures of a partial expression, not in general to teach functional programming from the ground up.",
scalingAnswer: "It would probably get complex too, but probably not as complex as the other two designs.",
suggestionsAndComments: [
- Put the function-types next to the function name, so that there is no danger of confusing them.
- Your approach for pattern-matching nicely shows that you don't have access to parts of a pattern that aren't named. But somehow the way it's visualized seems strange to me and is somewhat unsatisfying. But I don't know how to do it better.
]
)
),
generalComments: [
I quite like the box-arrow diagrams in "The state monad" in "Programming in Haskell" by Graham Hutton (second edition, chapter 12.3 Monads, pages 168 - 141).
I don't know how well that approach generalises without overloading it like the Flo-inspired examples.
In contrast to your examples the diagrams from the book show the data flow (but not how calls are plugged together syntactically).
]
)
#questionnaireResult(
"<NAME>",
"<NAME> is a third-year software-development apprentice at Vontobel.",
(
(
concept: "Flo-inspired",
meaningAnswer: "No, but I assume that the squares are some kind of input?",
lookAnswer: "If I understood this concept, I assume that I would've thought that it looked to complicated.",
teachingAnswer: "",
scalingAnswer: "",
suggestionsAndComments: ""
),
(
concept: "Scratch-inspired",
meaningAnswer: "I think I understood this concept the most.",
lookAnswer: "Yes",
teachingAnswer: "Probably.",
scalingAnswer: "I think complex expressions would take up a wide space and would be very complicated to understand.",
suggestionsAndComments: "Keep the explanation (like in the first example) of the boxes (definition, declaration & parameters)",
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "No.",
lookAnswer: "",
teachingAnswer: "",
scalingAnswer: "",
suggestionsAndComments: ""
)
)
)
#questionnaireResult(
"<NAME>",
"<NAME> is a student at OST and has visited the functional programming lecture.",
(
(
concept: "Flo-inspired",
meaningAnswer: "",
lookAnswer: "No, very confusing with too many arrows and annotations.",
teachingAnswer: "",
scalingAnswer: "",
suggestionsAndComments: ""
),
(
concept: "Scratch-inspired",
meaningAnswer: "",
lookAnswer: "Yes",
teachingAnswer: "",
scalingAnswer: "",
suggestionsAndComments: [
- No type-annotations, so it's difficult to tell what goes where
- Type-Hole isn't intuitive
- Operators should be treated like any other function
],
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "",
lookAnswer: "Yes",
teachingAnswer: "",
scalingAnswer: "",
suggestionsAndComments: ""
)
),
generalComments: "It would be nice to have 'referential-transparency', i.e. hovering over a block to see the type of a specific argument."
)
#questionnaireResult(
"<NAME>",
"<NAME> is a technical employee at the institute for software at OST",
(
(
concept: "Flo-inspired",
meaningAnswer: "I don't know Flo and for me it is not a very obvious notation. I can guess the semantics though.",
lookAnswer: "It looks a bit cluttered to me.",
teachingAnswer: "I think I would visualize it differently.",
scalingAnswer: "It will probably clutter quite fast, I already find 'Product of Numbers' hard to read. I don't see a simple way to split it into multiple parts.",
suggestionsAndComments: "Maybe multiple argument functions can have the argument in the same block instead of the :apply notation? I understand that this is to highlight currying, but I think you could also explain this by only highlighting the empty argument boxes. This would reduce clutter and make it more scalable."
),
(
concept: "Scratch-inspired",
meaningAnswer: "I find this quite easy to read. The only confusing bits I find are the type annotations (purple), especially because it mixes up constraints and types, but also because it could be interpreted as being part of the lower layer (i.e. in 'Map Add 5 Function' it could be interpreted as describing the (+) and not the 5).",
lookAnswer: "Yes, it looks clean and colorful.",
teachingAnswer: "Yes.",
scalingAnswer: "It seems to clutter up less fast, and even then, it could be possible to split it up into multiple towers with references to each other (maybe when visualizing Haskell code, definitions in 'where' or in a let expression could be a separate tower, this would also solve the problem of multiple references.",
suggestionsAndComments: [
- Type annotations: There could be a separate type annotation tower that can be enabled or disabled. Or it should be more obvious where the type annotation applies. At the moment it looks like the types are arguments to the function (which is actually the case in GHC Core or with the TypeApplications extension, but not in normal Haskell). Constraints should be ignored or handled differently.
- Infix functions should look like +, not (+), if they are visualized in an infix way.
],
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "I find this one difficult to read. I especially have difficulty with the apparent mix-up of types and values. It seems that the last part of an arrow chain is the return type, and the rest is a value if present and a type if partially applied? I like the currying visualization with nested boxes though.",
lookAnswer: "It looks more formal than Scratch-inspired, which to me is a disadvantage. It also has more text.",
teachingAnswer: "No, I find it difficult to describe the semantics of single components. Maybe I'd be able to if you gave me an explanation of their meaning.",
scalingAnswer: "I guess it would be possible to use cross references. It looks less cluttered than the Flo -inspired one.",
suggestionsAndComments: "It seems like the single component semantics are not entirely consistent here."
)
),
generalComments: [
- I think it is important to have clear and simple semantics for single components of your visualization. In order to ensure this, it may be useful to think about reduction rules for your visualization.
- I like your use of color and how it distinguishes different things (types, value, arguments, ...)
- Type polymorphism and constraints seems to be a challenge to visualize. For polymorphic types, TypeApplications may be a useful inspiration (i.e. receive types as a different kind of argument to functions). Constraints could maybe then be applied to these kinds of type arguments. Con of this approach is that in Haskell, you don't pass types as arguments.
- Do you also plan on visualizing type definitions?
- My vote is on a Scratch-inspired version.
]
)
#questionnaireResult(
"<NAME>",
"<NAME> is a master student & scientific assistant at the institute for software at OST",
(
(
concept: "Flo-inspired",
meaningAnswer: [
        Yes, but I am not sure whether it can be understood without Haskell experience.
        Also, for comprehensibility I would have drawn the arrows from bottom to top (see the first box).
        I don't want to go backwards from the result; I apply one argument after the other and arrive at the result at the end.
        (However, if the arguments can be dragged in like that in the UI, bottom-to-top makes more sense.)
      ],
      lookAnswer: "In principle yes, but it quickly becomes cluttered. It would need more color, and the arrows could be styled differently depending on their functionality.",
      teachingAnswer: "As it is now, rather not, since it is too cluttered. But once it is a bit more mature, I think so. It can then probably be revealed or assembled step by step.",
      scalingAnswer: "I think it just keeps getting more and more cluttered...",
      suggestionsAndComments: "I don't find the recursion very understandable. You cannot see that product is called recursively. I would have labeled the match box with product as well and worked with color. The ::Num a -> a box confuses me. I would also prefer to do the application in one box."
),
(
concept: "Scratch-inspired",
meaningAnswer: "Ja, ich finde hier sieht man am besten, wie die Parameter in einander verschachtelt sind",
lookAnswer: "Ja, die Farben sind mega gut fÌrs VerstÀndnis und es ist sehr Ìbersichtlich. Rein visuell der beste Vorschlag.",
teachingAnswer: "Gut ist hier, dass man sieht wie man Schritt fÌr Schritt etwas einblenden könnte. Ich weiss jedoch nicht, ob es wirklich einen Mehrwert gegenÌber dem Code bietet... Bzw. Man sieht wie im Code die ZusammenhÀnge nicht ganz",
scalingAnswer: "Ich könnte mir vorstellen, dass es schnell zu Ìberladen wird",
suggestionsAndComments: [
- Type annotations: There could be a separate type annotation tower that can be enabled or disabled. Or it should be more obvious where the type annotation applies. At the moment it looks like the types are arguments to the function (which is actually the case in GHC Core or with the TypeApplications extension, but not in normal Haskell). Constraints should be ignored or handled differently.
- Infix functions should look like +, not (+), if they are visualized in an infix way.
],
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "Ich finde es schlechter verstÀndlich als der erste Vorschlag. Ich könnte mir aber vorstellen, dass eine Kombination aus diesem und dem ersten funktionieren könnte.",
lookAnswer: "Farben und Boxen finde ich gut und dass die Applikation und der Zusammenhang zwischen Argumenten und den Typen besser sichtbar ist. Aber es sieht irgendwie zu mathematisch aus :) Ich könnte mir vorstellen, dass das Personen abschrecken könnte",
teachingAnswer: "So nicht unbedingt. Aber wenn man es mit dem ersten Vorschlag verbinden wÃŒrde, denke ich schon",
scalingAnswer: "Ich glaube, es wird mega kompliziert mit der Verschachtelung. Ich finde die Pfeile beim ersten Vorschlag besser",
suggestionsAndComments: "It seems like the single component semantics are not entirely consistent here."
)
),
generalComments: [
#figure(
image("../static/general_comments_eliane_schmidli_1.png")
)
#figure(
image("../static/general_comments_eliane_schmidli_2.png")
)
]
)
#questionnaireResult(
"<NAME>",
"<NAME> is a scientific assistant at the institute for software at OST",
(
(
concept: "Flo-inspired",
meaningAnswer: "Mostly. It is confusing, that the input (e.g.) a and output (results) have the same arrow direction. It is not clear where to begin and how the data 'flows'",
lookAnswer: "No. In my opinion it looks more complicated than the code",
teachingAnswer: "No",
scalingAnswer: "No. More complex would probably look more messy",
suggestionsAndComments: "If grey boxes are type only, draw just a line or place it inside. But use no arrow"
),
(
concept: "Scratch-inspired",
meaningAnswer: "The match-case are where confusing to me. But the rest yes",
lookAnswer: "Better than Flo. Cleaner and smaler. It has some structure visible",
teachingAnswer: "Rather no",
scalingAnswer: "Yes (at least better than the others)",
suggestionsAndComments: "Maybe an other syntax for match-case to dinstinguish between functions names",
),
(
concept: "Haskell Function-Notation inspired",
meaningAnswer: "No",
lookAnswer: "No, gets to big/messy soon",
teachingAnswer: "No",
scalingAnswer: "No, gets big very soon",
suggestionsAndComments: ""
)
),
generalComments: "Maybe something like a tree structure (similar to expression trees) that goes from top to bottom? It would may be some kind of mix between Flo and Scratch. Make a own symbol for match-cases (to distinguish from functions). Make sure it is tidy (same thing on same height level etc.)"
)
#pagebreak() |
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/bugs/linebreak-no-justifiables.typ | typst | Apache License 2.0 | // Test breaking a line without justifiables.
---
#set par(justify: true)
#block(width: 1cm, fill: aqua, lorem(2))
|
https://github.com/HitZhouzhou/SecondYear_FirstSemester | https://raw.githubusercontent.com/HitZhouzhou/SecondYear_FirstSemester/main/algorithm/4homework/template.typ | typst | //---------------------------------------------------------------------
//-------------------------------need to modify------------------------
#let heiti = ("Noto Sans CJK SC", "Times New Roman")
#let songti = ("Noto Serif CJK SC", "Times New Roman")
#let mono = ("FiraCode Nerd Font Mono", "Sarasa Mono SC","Courier New", "Courier", "Noto Serif CJK SC")
//---------------------------------------------------------------------
// some handy functions
#let equation_num(_) = {
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("equation-chapter" + str(chapt))
let n = c.at(loc).at(0)
"(" + str(chapt) + "-" + str(n + 1) + ")"
})
}
#let table_num(_) = {
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("table-chapter" + str(chapt))
let n = c.at(loc).at(0)
str(chapt) + "-" + str(n + 1)
})
}
#let image_num(_) = {
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("image-chapter" + str(chapt))
let n = c.at(loc).at(0)
str(chapt) + "-" + str(n + 1)
})
}
#let equation(equation, caption: "") = {
figure(
equation,
caption: caption,
    supplement: [公匏],
numbering: equation_num,
kind: "equation",
)
}
#let tbl(tbl, caption: "") = {
figure(
tbl,
caption: caption,
    supplement: [衚],
numbering: table_num,
kind: "table",
)
}
#let img(img, caption: "") = {
figure(
img,
caption: caption,
    supplement: [囟],
numbering: image_num,
kind: "image",
)
}
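// Usage sketch for the helpers above (file names and captions are
// hypothetical; the calls match the signatures defined in this file):
//   #img(image("figures/flow.png"), caption: "Algorithm flowchart")
//   #tbl(table(columns: 2, [input], [output]), caption: "Test cases")
//   #equation($ T(n) = 2 T(n/2) + O(n) $, caption: "Recurrence")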
#let empty_par() = {
v(-1em)
box()
}
#let project(
logopath: "",
subject: "",
labname: "",
kwargs: (),
firstlineindent: 0em,
body,
) = {
  // When referencing, the numbering of figures/tables/equations can be wrong,
  // so we resolve the referenced element manually
show ref: it => {
if it.element != none and it.element.func() == figure {
let el = it.element
let loc = el.location()
let chapt = counter(heading).at(loc).at(0)
      // jump to the referenced element automatically
link(loc)[#if el.kind == "image" or el.kind == "table" {
        // each chapter has its own counter
let num = counter(el.kind + "-chapter" + str(chapt)).at(loc).at(0) + 1
it.element.supplement
" "
str(chapt)
"-"
str(num)
} else if el.kind == "equation" {
        // wrap the equation number in '(' and ')'
let num = counter(el.kind + "-chapter" + str(chapt)).at(loc).at(0) + 1
it.element.supplement
" ("
str(chapt)
"-"
str(num)
")"
} else {
it
}
]
} else {
it
}
}
  // styles for figures, tables, and equations
show figure: it => {
set align(center)
if it.kind == "image" {
set text(font: heiti, size: 12pt)
it.body
it.supplement
" " + it.counter.display(it.numbering)
" " + it.caption
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("image-chapter" + str(chapt))
c.step()
})
} else if it.kind == "table" {
set text(font: songti, size: 12pt)
it.body
set text(font: heiti, size: 12pt)
it.supplement
" " + it.counter.display(it.numbering)
" " + it.caption
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("table-chapter" + str(chapt))
c.step()
})
} else if it.kind == "equation" {
      // use a large column ratio to center the equation and right-align its number
grid(
columns: (20fr, 1fr),
it.body,
align(center + horizon,
it.counter.display(it.numbering)
)
)
locate(loc => {
let chapt = counter(heading).at(loc).at(0)
let c = counter("equation-chapter" + str(chapt))
c.step()
})
} else {
it
}
}
set page(paper: "a4", margin: (
top: 2.5cm,
bottom: 2cm,
left: 2cm,
right: 2cm
))
  // cover page
align(center)[
#v(30pt)
#image(logopath, width: 100%)
#v(50pt)
#text(
size: 36pt,
font: songti,
weight: "bold"
    )[《#subject》#linebreak()#labname]
#set align(bottom)
#let info_value(body) = {
rect(
width: 100%,
inset: 2pt,
stroke: (
bottom: 1pt + black
),
text(
font: songti,
size: 16pt,
bottom-edge: "descender"
)[
#body
]
)
}
#let info_key(body) = {
rect(width: 100%, inset: 2pt,
stroke: none,
text(
font: songti,
size: 16pt,
body
))
}
#let pair_into(pair) = {
let (key, value) = pair
      (info_key(key + ""), info_value(value))
}
#grid(
columns: (70pt, 240pt),
rows: (40pt, 40pt),
gutter: 3pt,
..kwargs.pairs().map(pair_into).flatten()
)
]
counter(page).update(1)
set page(
header: {
set text(font: songti, 10pt, baseline: 8pt, spacing: 3pt)
set align(center)
      [《#subject》#labname]
line(length: 100%, stroke: 0.7pt)
},
footer: {
set align(center)
text(font: songti, 10pt, baseline: -3pt,
counter(page).display("1"))
// grid(
// columns: (5fr, 1fr, 5fr),
// line(length: 100%, stroke: 0.7pt),
// text(font: songti, 10pt, baseline: -3pt,
// counter(page).display("1")
// ),
// line(length: 100%, stroke: 0.7pt)
// )
}
)
set text(font: songti, 12pt)
set par(justify: true, leading: 1.24em, first-line-indent: firstlineindent)
show par: set block(spacing: 1.24em)
// TODO
  // currently simply hard-coded
set heading(
numbering: (..nums) => {
let vars = nums.pos()
if vars.len() == 1 {
numbering("äžã ", vars.last())
} else {
numbering("1. ", vars.last())
}
},
)
show heading.where(level: 1): it => {
// set align(center)
set text(weight: "bold", font: heiti, size: 18pt)
set block(spacing: 1.5em)
it
}
show heading.where(level: 2): it => {
set text(weight: "bold", font: heiti, size: 14pt)
set block(above: 1.5em, below: 1.5em)
it
}
  // the first paragraph after a heading is not indented; manually append a box
show heading: it => {
set text(weight: "bold", font: heiti, size: 12pt)
set block(above: 1.5em, below: 1.5em)
it
} + empty_par()
counter(page).update(1)
  // inline code
show raw.where(block: false): it => {
set text(font: mono, 12pt)
it
}
show raw.where(block: false): box.with(
fill: rgb(217, 217, 217, 1),
inset: (x: 3pt, y: 0pt),
outset: (y: 3pt),
radius: 2pt
)
  // code blocks
  // the paragraph immediately after has no indent, so append an empty paragraph
show raw.where(block: true): it => {
set text(font: mono, 10pt)
set block(inset: 5pt, fill: rgb(217, 217, 217, 1), radius: 4pt, width: 100%)
set par(leading: 0.62em, first-line-indent: 0em)
it
} + empty_par()
  // unordered lists
show list: it => {
it
} + empty_par()
  // ordered lists
show enum: it => {
it
} + empty_par()
body
}
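// A minimal usage sketch for this template (all values are hypothetical;
// note that kwargs must be a dictionary, since .pairs() is called on it):
//
// #show: project.with(
//   logopath: "assets/logo.png",
//   subject: "Algorithm Design and Analysis",
//   labname: "Homework Report",
//   kwargs: ("Name": "Zhang San", "Student ID": "2021000000"),
//   firstlineindent: 2em,
// )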
|
|
https://github.com/csimide/SEU-Typst-Template | https://raw.githubusercontent.com/csimide/SEU-Typst-Template/master/README.md | markdown | MIT License | # äžå倧åŠè®ºææš¡æ¿
äœ¿çš Typst å€å»äžå倧åŠãæ¬ç§æ¯äžè®Ÿè®¡ïŒè®ºæïŒæ¥åãæš¡æ¿åãç ç©¶çåŠäœè®ºæãæš¡æ¿ã
è¯·åš [`init-files`](./init-files/) ç®åœå
æ¥ç Demo PDFã
> [!IMPORTANT]
>
> This is an unofficial, community-made template; there is a risk that the university will not accept it.
>
> Although this template tries its best to replicate the original Word template, many formatting issues may remain.
>
> Typst is a young tool that is still under active development and may introduce major changes; please use the latest version of this template together with the Typst version it recommends.
> [!CAUTION]
>
> This template made breaking changes from [`0.2.2`](https://github.com/csimide/SEU-Typst-Template/tree/c44b5172178c0c2380b322e50931750e2d761168) to `0.3.0`. For detailed information about this change, see the [changelog](CHANGELOG.md).
- [Southeast University Thesis Template](#southeast-university-thesis-template)
  - [Usage](#usage)
    - [Local Usage](#local-usage)
    - [Web App](#web-app)
  - [Template Contents](#template-contents)
    - [Graduate Degree Thesis Template](#graduate-degree-thesis-template)
    - [Undergraduate Graduation Design (Thesis) Report Template](#undergraduate-graduation-design-thesis-report-template)
  - [Known Issues](#known-issues)
    - [Bibliography](#bibliography)
  - [Related Links](#related-links)
  - [Development & License](#development--license)
    - [Derivative Development](#derivative-development)
## Usage
This template requires Typst 0.11.x to compile.
This template has been published to Typst Universe, so it can be initialized with `typst init` or edited in the Web App. The version on Typst Universe may not be the latest; if you need the newest version of the template, get it from this repo.
### Local Usage
First install all the fonts in the `fonts` directory. Then you can use this template in either of two ways:
- Download/clone all files of this repo and edit the example files in the `init-files` directory.
- Use `typst init @preview/cheda-seu-thesis:0.3.0` to fetch this template together with its initialization files.
Afterwards, edit the example files to produce the thesis you want. Instructions for both thesis formats are included in the corresponding example documents.
If you use VSCode as your editor, you can try the [Tinymist](https://marketplace.visualstudio.com/items?itemName=nvarner.typst-lsp) and [Typst Preview](https://marketplace.visualstudio.com/items?itemName=mgt19937.typst-preview) extensions. If you need to sync files between local machines and the cloud, you can try the [Typst Sync](https://marketplace.visualstudio.com/items?itemName=OrangeX4.vscode-typst-sync) extension. For more editing tips, see <https://github.com/nju-lug/modern-nju-thesis#vs-code-%E6%9C%AC%E5%9C%B0%E7%BC%96%E8%BE%91%E6%8E%A8%E8%8D%90>.
### Web App
> [!NOTE]
>
> Due to font issues, editing this template in the Web App is not recommended.
Open <https://typst.app/universe/package/cheda-seu-thesis> and click `Create project in app`, or choose `Start from a template` in the Web App and then select `cheda-seu-thesis`.
Then upload **all** fonts from <https://github.com/csimide/SEU-Typst-Template/tree/master/fonts> to the root directory of this project in the Typst Web App. Note that every time you open this project afterwards, the browser will spend a long time downloading these fonts from the Typst Web App server, so the experience is rather poor.
Finally, follow the instructions in the file that opens automatically.
## Template contents

### Graduate degree thesis template

This Typst template follows the [SEU Graduate Degree Thesis Formatting Regulations](https://seugs.seu.edu.cn/2023/0424/c26669a442680/page.htm) and takes the [SEUThesis template](https://ctan.math.utah.edu/ctan/tex-archive/macros/latex/contrib/seuthesis/seuthesis.pdf) as a reference.
Current progress:

- Document parts
  - [x] Cover
  - [x] Chinese and English title pages
  - [x] Chinese and English abstracts
  - [x] Table of contents
  - [x] Glossary
  - [x] Main body
  - [x] Acknowledgements
  - [x] Bibliography
  - [x] Appendix
  - [ ] Index
  - [ ] About the author
  - [ ] Back matter
  - [ ] Back cover
- Features
  - [ ] Blind-review mode
  - [x] Page numbering: Roman numerals before the main body, Arabic numerals from the main body onwards
  - [x] Figure/table numbering format for the main body and appendix, as required by the graduate school
  - [x] Equation placement: two CJK characters from the left side of the page
  - [x] Equation numbering: at the lower right of the equation
  - [x] Blank pages inserted so that new chapters always start on odd pages
  - [x] Headers: odd pages show the chapter number and title, even pages show fixed content
  - [x] Bibliography: bilingual display supported
### Undergraduate graduation design (thesis) report template

This Typst template imitates the SEU undergraduate graduation design (thesis) report template (January 2024). The original template can be downloaded from the Academic Affairs Office website ([September 2019 version](https://jwc.seu.edu.cn/2021/1108/c21686a389963/page.htm), [January 2024 version](https://jwc.seu.edu.cn/2024/0117/c21686a479303/page.htm)).

Current progress:

- Document parts
  - [x] Cover
  - [x] Chinese and English abstracts
  - [x] Table of contents
  - [x] Main body
  - [x] Bibliography
  - [x] Appendix
  - [x] Acknowledgements
  - [ ] Back cover
- Features
  - [ ] Blind-review mode
  - [x] Page numbering: Roman numerals before the main body, Arabic numerals from the main body onwards
  - [x] Figure/table numbering format for the main body and appendix, as required for undergraduate theses
  - [x] Equation placement: two CJK characters from the left side of the page
  - [x] Equation numbering: vertically centered at the right side
  - [x] Headers: fixed content
  - [x] Bibliography: bilingual display supported
  - [ ] ~~"Continued table" captions~~ The template provided by the Academic Affairs Office gives no example of a "continued table" style, so this is deliberately left unimplemented.

> [!NOTE]
>
> Also take a look at the neighboring project <https://github.com/TideDra/seu-thesis-typst/>, which also implements the undergraduate graduation design (thesis) report template in Typst and additionally provides a thesis-translation template. Its implementation details differ from this template's; choose according to your preference.
## Known issues

- The first paragraph of Chinese text is sometimes auto-indented and sometimes not. If it is not, use `#h(2em)` to manually indent by two characters.
- The bibliography format does not fully meet the requirements. See the bibliography section below.
- Line spacing and margins still need further tuning.
### Bibliography

The bibliography format does not fully meet the requirements. Compared with the format the university requires, Typst's built-in GB/T 7714-2015 numeric style has the following issues:

1. When there are many authors, the university requires the abbreviation `et al.` for English references and `ç` for Chinese ones, but Typst can currently only display a single language.

   **A:** This is caused by Typst's CSL parser not supporting CSL-M.

   <details>
   <summary>Details</summary>

   - Implementing this feature in CSL requires the multiple-`layout` feature of the [CSL-M](https://citeproc-js.readthedocs.io/en/latest/csl-m/index.html#cs-layout-extension) extension, and Typst does not support CSL-M extensions. See https://github.com/typst/typst/issues/2793 and https://github.com/typst/citationberg/issues/5.
   - Typst currently ignores the `language` field in BibTeX/CSL. See https://github.com/typst/hayagriva/pull/126.

   For the above reasons, it is currently hard to use native Typst facilities to switch between `et al.` and `ç` automatically by language.

   </details>

   OrangeX4 and others wrote a `bilingual-bibliography` function based on search-and-replace, as an attempt to use different keywords for Chinese and Western references until Typst supports CSL-M.

   The demo documents of this template already cite with `bilingual-bibliography`; see them for usage. Note that this feature is still being tested and may well have bugs; see https://github.com/csimide/SEU-Typst-Template/issues/1.

   > See https://github.com/nju-lug/modern-nju-thesis/issues/3 for more discussion on bilingual bibliographies.
   >
   > This template once tried https://github.com/csimide/cslper as its bilingual bibliography implementation.
2. In the university's examples, except for pure electronic resources, references do not include `OL`, access dates, DOIs, or links even if they come from online channels. Typst's built-in GB/T 7714-2015 numeric style, however, adds the `OL` marker and the link/DOI to every reference whose bib entry defines them.

   **A:** This is because the university's standard is not fully consistent with GB/T 7714-2015.

   Use `style: "./seu-thesis/gb-t-7714-2015-numeric-seu.csl"`, which automatically decides, based on the reference type, whether to show the `OL` marker and the link/DOI.

   > This CSL is modified from <https://github.com/redleafnew/Chinese-STD-GB-T-7714-related-csl/blob/main/003gb-t-7714-2015-numeric-bilingual-no-url-doi.csl>.
   >
   > The original file is shared under the CC-BY-SA 3.0 license.
3. Author name capitalization (and other details) does not match the university's examples.

4. For degree theses, the university requires citations of other degree theses to use the form `[D]: [å士åŠäœè®ºæ].` (i.e., with the thesis subtype), but the template shows only `[D]` without the subtype.

5. For degree theses, the university's examples use full-width punctuation, e.g. full-width square brackets and full-width periods.

6. Citation entries lose a `. `, as in `[M]2nd ed`.

   **A to 3-6:** The university uses a dialect of GB/T 7714-2015 (a senior student once jokingly called it GB/T 7714-SEU). No existing CSL perfectly matches the university's requirements (different schools' requirements also differ), so a compliant CSL file will be written later.

   **Update 2024-05-02:** A CSL has now been hacked together. Typst's CSL support is, frankly, quirky. The current status:

   - Issue 3 is fixed;
   - Issue 4 is fixed in the degree-thesis CSL, but Typst seems not to support this field, so it cannot be displayed;
   - Issue 5 will not be fixed for now (newly published degree theses I checked also use half-width punctuation);
   - Issue 6 seems to be an issue with Typst's CSL support; the CSL file shipped with this template includes extra workarounds, so the `. ` should no longer be lost.
7. When citing another degree thesis, GB/T 7714-2015 and the undergraduate/graduate requirements all expect `Place: School name, Year.`, but the template does not display this item.

   **A:** Typst supports neither `school`/`institution` as aliases of `publisher`, nor parsing `institution` in CSL (https://github.com/typst/hayagriva/issues/112). To fix this, manually edit the corresponding entry in the bib file: below the line `school = {School name},`, add a line `publisher = {School name},`.
<details>
<summary>Example</summary>
```biblatex
@phdthesis{Example1,
type = {{ç¡å£«åŠäœè®ºæ}},
title = {{æžé±Œèæ¯äžçTypstæš¡æ¿äœ¿çšç ç©¶}},
author = {<NAME>},
year = {2024},
langid = {chinese},
address = {å京},
school = {äžå倧åŠ},
publisher = {äžå倧åŠ},
}
```
</details>
8. Consecutive citations in the main body are not collapsed correctly (for example, citing 1 2 3 4 should display as [1-4], but displays as [1,4]).

   **A:** A temporary workaround is to change `after-collapse-delimiter=","` to `after-collapse-delimiter="-"` in the CSL file. The CSL file shipped with this template already includes this change. See https://github.com/typst/hayagriva/issues/154 for details.

   https://github.com/typst/hayagriva/pull/176 attempts to fix this bug. **Once that bug is fixed, revert the temporary CSL change above.**
## Related links

- Typst Touying slide theme for Southeast University by QuadnucYard - https://github.com/QuadnucYard/touying-theme-seu
- SEU Typst undergraduate thesis template and thesis-translation template by Geary.Z (TideDra) - https://github.com/TideDra/seu-thesis-typst

## Development and license

If you run into any problem while using this template, please file an issue. PRs are welcome. Requests for other templates can also be raised in issues.
Except for the specially noted files below, this project is licensed under the MIT License.

- Files under `init-files/demo_image/` come from the SEU Academic Affairs Office undergraduate thesis template.
- Files under `seu-thesis/assets/` are derived from the Academic Affairs Office templates, or taken from the SEU visual identity website.
- Files under `fonts` are the fonts used by this template.
- `äžåå€§åŠæ¬ç§æ¯äžè®Ÿè®¡ïŒè®ºæïŒåèæš¡æ¿ (2024幎1æä¿®è®¢).docx` is the thesis template provided by the Academic Affairs Office.
### Derived works

Derivative templates are welcome. Before deriving, it helps to understand the main features of this template and the files involved:

- Rather fiddly figure/table/equation numbering (figures and tables use different numbering formats, and the appendix even uses a different format from the main body; figure captions sit below figures while table captions sit above tables; equations are not horizontally centered; equation numbers are not vertically centered on the right).
  - Already implemented via `i-figured`.
- (Graduate degree thesis only) different headers on odd and even pages, and no chapter name in the header of intentionally inserted blank pages.
  - This feature lives in `seu-thesis/parts/main-body-degree-fn.typ`.
  - `chic-hdr` is recommended over the hand-rolled approach; for historical reasons this template has not switched to it.
- Bilingual bibliography (automatic `et al.` and `ç`):
  - This feature comes from `bilingual-bibliography`; the relevant file is `seu-thesis/utils/bilingual-bibliography.typ`.
  - For more about `bilingual-bibliography`, see https://github.com/nju-lug/modern-nju-thesis/issues/3

> [!NOTE]
>
> This template contains a lot of hand-rolled code, and my coding skills are mediocre; use with caution.
|
https://github.com/katamyra/Notes | https://raw.githubusercontent.com/katamyra/Notes/main/Compiled%20School%20Notes/CS1332/Modules/BasicSorts.typ | typst | #import "../../../template.typ": *
= Basic Sorts
== Insertion Sort
#theorem[
At iteration j, everything before index j is sorted. "Insert" the item at index j into the sorted array from 0 to j - 1 by swapping leftwards.
]
For insertion sort, we maintain a subarray at the front of the main array that is always sorted. Then, for each element, we place it into its correct location in this sorted subarray.
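To make the procedure concrete, here is a minimal sketch in Typst's scripting language (the function name and sample values are ours, not from the lecture):

```typ
#let insertion-sort(values) = {
  let a = values // arrays are value types, so this is a copy
  for j in range(1, a.len()) {
    let key = a.at(j)
    let i = j - 1
    // Shift strictly larger items right; using > (not >=) keeps equal
    // items in their original order, i.e., the sort is stable.
    while i >= 0 and a.at(i) > key {
      a.at(i + 1) = a.at(i)
      i -= 1
    }
    a.at(i + 1) = key
  }
  a
}
#insertion-sort((2, 4, 6, 8, 6)) // displays (2, 4, 6, 6, 8)
```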
#note[
  For *stability*, do not swap items with the same value!

  Example:

  #let values = (2, 4, 6, 8, 6)
  #values

  When we reach the second 6, we swap it with 8 but not with the first 6, in order to maintain stability.

  Another note: in insertion sort, before the last iteration, it is possible that none of the items are in their final positions.
#theorem[
*Time Complexity*
  *Best*: already sorted: $O(n)$
  - We still need to look at every data item to make sure everything is in order.

  *Worst*: reverse sorted order: $O(n^2)$
  - Each element needs to be swapped all the way to the front (i.e., moved the maximum distance).
]
== Selection Sort
#theorem[
For each index in the array, find the largest/smallest item in the unsorted part of the array, and then place it into that position repeatedly until sorted.
]
In this case, each element is placed directly into its final spot in the array. This is different from insertion sort, where we can't guarantee that anything is in its final spot until the end. A matching sketch follows below.
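Here is the corresponding sketch, using the same conventions as the insertion-sort example above:

```typ
#let selection-sort(values) = {
  let a = values
  for i in range(a.len()) {
    // Find the index of the smallest item in the unsorted part a[i..].
    let m = i
    for j in range(i + 1, a.len()) {
      if a.at(j) < a.at(m) { m = j }
    }
    // One swap places a[m] into its final position i.
    let t = a.at(i)
    a.at(i) = a.at(m)
    a.at(m) = t
  }
  a
}
```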
#theorem[
*Time Complexity*
*Best*: $O(n^2)$
*Worst*: $O(n^2)$
]
|
|
https://github.com/soul667/typst | https://raw.githubusercontent.com/soul667/typst/main/PPT/typst-slides-fudan/themes/polylux/book/src/dynamic/complex.md | markdown | # Complex display rules
There are multiple options to define more complex display rules than a single
number.
### Array
The simplest extension is to use an array.
For example
```typ
{{#include rule-array.typ:5:}}
```
results in:

The array elements can actually themselves be any kind of rule that is explained
on this page.
### Interval
You can also provide a (bounded or half-bounded) interval in the form of a
dictionary with a `beginning` and/or an `until` key:
```typ
{{#include rule-interval.typ:5:}}
```
results in:

In the last case, you would not need to use `#only` anyway, obviously.
### Convenient syntax as strings
In principle, you can specify every rule using numbers, arrays, and intervals.
However, consider having to write
```typ
#uncover(((until: 2), 4, (beginning: 6, until: 8), (beginning: 10)))[polylux]
```
That's only fun the first time.
Therefore, we provide a convenient alternative.
You can equivalently write:
```typ
{{#include rule-string.typ:6}}
```
which results in:

Much better, right?
The spaces are optional, so just use them if you find it more readable.
Unless you are creating those function calls programmatically, it is a good
recommendation to use the single-number syntax (`#only(1)[...]`) if that
suffices and the string syntax for any more complex use case.
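As a concrete illustration (ours, not from the included snippet), the long call from above collapses to:

```typ
#uncover("-2, 4, 6-8, 10-")[polylux]
```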
|
|
https://github.com/typst-community/mantodea | https://raw.githubusercontent.com/typst-community/mantodea/main/src/style.typ | typst | MIT License | #import "_pkg.typ"
#import "_valid.typ"
#import "theme.typ" as _theme
/// The default style applied over the whole document.
///
/// - theme (theme): The theme to use for the document styling.
/// -> function
#let default(
theme: _theme.default,
_validate: true,
) = body => {
let theme = theme
if _validate {
import _valid as z
theme = z.parse(theme, _theme.schema(), scope: ("theme",))
}
set page(
numbering: "1",
header: context {
let section = _pkg.hydra.hydra(2, display: (_, it) => {
numbering("1.1", ..counter(heading).at(it.location()).slice(1))
[ ]
it.body
})
align(center, emph(section))
},
)
set text(12pt, font: theme.fonts.text, lang: "en")
set par(justify: true)
set heading(numbering: "I.1.")
show heading: it => {
let scale = if it.level == 1 {
1.8
} else if it.level == 2 {
1.4
} else if it.level == 3 {
1.2
} else {
1.0
}
let size = 1em * scale;
let above = if it.level == 1 { 1.8em } else { 1.44em } / scale;
let below = 0.75em / scale;
set text(size, font: theme.fonts.headings)
set block(above: above, below: below)
if it.level == 1 {
pagebreak(weak: true)
block({
if it.numbering != none {
text(fill: theme.colors.primary, {
[Part ]
counter(heading).display()
})
linebreak()
}
it.body
})
} else {
block({
if it.numbering != none {
text(fill: theme.colors.primary, counter(heading).display())
[ ]
}
it.body
})
}
}
show raw: set text(9pt, font: theme.fonts.code)
show figure.where(kind: raw): set block(breakable: true)
body
}
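// A minimal usage sketch (ours; it assumes this file is used as `style.typ`
// alongside the package's `theme.typ`):
//
//   #import "style.typ"
//   #import "theme.typ"
//   #show: style.default(theme: theme.default)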
|
https://github.com/tiankaima/typst-notes | https://raw.githubusercontent.com/tiankaima/typst-notes/master/7e1810-algo_hw/hw3.typ | typst | #import "utils.typ": *
== HW 3 (Week 4)
Due: 2024.03.31
#rev1_note[
  + Review: the three ways to traverse a binary tree: preorder, inorder, and postorder.
  + Review: binary search trees.
    - A binary search tree is a binary tree in which every node $x$ has a key $"key"[x]$, a pointer $p[x]$ to its parent, and pointers $"left"[x]$ and $"right"[x]$ to its left and right children. BST properties:
      + For any node $x$, *all* keys in its left subtree are less than $"key"[x]$.
      + For any node $x$, *all* keys in its right subtree are greater than $"key"[x]$.
    - The inorder traversal of a BST is a sorted sequence. Moreover, a BST can be reconstructed from its preorder or postorder traversal, but the inorder traversal alone does not uniquely determine the tree.
    - Search logic for the predecessor:
      - If the left child is non-empty, we only need to find the maximum of the left subtree (go right and down as far as possible).
      - If the left child is empty, walk upwards to the first parent reached from the right, i.e., the first ancestor for which the current node is a right child; as long as the current node is a left child, keep walking up.
      - Return that parent. If we reach the root while still coming from the left, the starting node is already at the bottom-left of the whole tree and has no predecessor; return nil.
  + Review: red-black trees.
    On top of a BST, add a color field whose value is red or black. Red-black tree properties:
    - Every node is either red or black.
    - The root is black.
    - Every leaf (nil node) is black.
    - If a node is red, then both of its children are black.
    - For every node, all simple paths from the node to its descendant leaves contain the same number of black nodes (equal black-height).
  + Review: inversion pairs
    $
      \#{(i,j) | i < j quad and quad A[i] > A[j]}
    $
  + Review: interval trees.
    We extend the structure of a red-black tree to store a set of intervals $A^((i))=[t^((i))_1, t^((i))_2]$. Like real numbers, intervals obey a trichotomy: for two intervals $A, B$, either $A sect B != emptyset$, or $A$ lies to the left/right of $B$, and these three cases are mutually exclusive.

    An interval tree uses the left (low) endpoint of each interval as its sort key, and additionally maintains $x.max$, the maximum right (high) endpoint over all intervals in the subtree rooted at the current node. It is built in a transfer-equation-like fashion, and maintenance only requires bottom-up updates, so no operation exceeds the complexity of an ordinary red-black tree.
]
=== Question 12.2-3
Write the `TREE-PREDECESSOR` procedure (which is symmetric to `TREE-SUCCESSOR`).
#ans[
```txt
TREE-PREDECESSOR(x)
if x.left != nil
return TREE-MAXIMUM(x.left)
y = x.p
while y != nil and x == y.left
x = y
y = y.p
return y
```
]
=== Question 13.1-5
Show that the longest simple path from a node $x$ in a red-black tree to a descendant leaf has length at most twice that of the shortest simple path from node $x$ to a descendant leaf.
#rev1_note[
  Prove that in a red-black tree, the longest simple path from a node $x$ to a descendant leaf has length at most twice that of the shortest such path.

  The answer below in fact shows: any two such paths contain the same number of black nodes; since every red node has black children, the number of red nodes on a path is at most its black-height. Hence the shortest path is no shorter than the black-height, and the longest path is no longer than twice the black-height, which proves the claim.
]
#ans[
Consider the longest simple path $(a_1, ... ,a_s)$ and the shortest simple path $(b_1, ... ,b_t)$; they contain equal numbers of black nodes (Property 5).

Neither path can contain two consecutive red nodes (Property 4).

Thus at most $floor((s - 1) / 2)$ of the nodes in the longest path are red, so $ t >= ceil((s+1)/2). $ If, by way of contradiction, we had $s > 2t$, then $ t >= ceil((s+1) / 2) >= ceil(t+1) = t+1, $ which is a contradiction.
]
=== Question 17.1-7
Show how to use an order-statistic tree to count the number of inversions in an array of $n$ distinct elements in $O(n lg n)$ time.
#rev1_note[
èèæç
§åŠäžæ¹åŒå»é计ç®:
$
"Inv"(j)=\#{(i,j) | i < j quad and quad A[i] > A[j]}\
"TotalInv" = sum_(j=1)^(n) "Inv"(j)
$
æç
§è¿æ ·çæè·¯, $"Inv"(j)$åªäŸèµå $A[1:j]$ åºåäžå
çŽ , å
·äœç诎, $"Inv"(j)$ åªè· $A[j]$ åš $A[1:j]$ çæåçžå
³, è®°äœ $r(j)$. é£ä¹æä»¬æ:
$
"Inv"(j) = j - r(j) >=0
$
è¿æ ·çæè·¯äžæå
¥æåºçæè·¯æ¯äžèŽç, åœ $A[1:j-1]$ å·²ç»æ¯æåºæ°ç»æ¶, $A[j]$ æ°çæå
¥äœçœ® ($r(j)$) æå³çäž $j - r(j)$ 䞪å
çŽ äº€æ¢äºäœçœ®, å³äžºååçéåºæ°.
æ¯æ¬¡æå
¥æ¶éŽåæ¥è¯¢äœçœ®æ¶éŽæçšæ¶éŽéœæ¯ $O(log k)$. æ»çšæ¶ $O(log n!)=O(n log n)$
]
#ans[
Building an order-statistic tree by inserting the $n$ elements one by one takes $O(n lg n)$ time. Each time we insert an element, we use $"OS-RANK"$ to compute its rank among the elements inserted so far; the difference between its index and its rank is the number of inversions it contributes, and summing these gives the total.
]
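To spell the procedure out, here is a sketch in the same pseudocode style as Question 12.2-3 above (`OS-INSERT` and `OS-RANK` are the standard order-statistic-tree operations; the remaining names are ours):

```txt
COUNT-INVERSIONS(A, n)
  T = empty order-statistic tree
  total = 0
  for j = 1 to n
      OS-INSERT(T, A[j])
      r = OS-RANK(T, A[j])      // rank of A[j] among A[1..j], O(lg j)
      total = total + (j - r)   // contributes Inv(j) = j - r(j)
  return total
```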
=== Question 17.3-2
Describe an efficient algorithm that, given an interval $i$, returns an interval overlapping $i$ that has the minimum low endpoint, or $T."nil"$ if no such interval exists.
#ans[
Modify the usual interval search not to break out of the loop when an overlap is found, but to keep track of the minimum low endpoint among overlapping intervals. Return the interval with the minimum low endpoint if one is found; otherwise return $T."nil"$.
] |
|
https://github.com/leiserfg/fervojo | https://raw.githubusercontent.com/leiserfg/fervojo/master/typst-package/examples/simple.typ | typst | MIT License | #import "@preview/fervojo:0.1.0": *
= The rendered svg
#render(```
{[`select-stmt` ["WITH" <!, "RECURSIVE"> 'common-table-expression'*","]?#`Common table expressions`],
<{[["SELECT" <!, "DISTINCT", "ALL"> 'result-column'*","]#`Projection clause` ["FROM" <'table-or-subquery'*",", 'join-clause'>]?#`From clause`],
[["WHERE" 'expr']?#`Where clause` ["GROUP" "BY" 'expr'*"," ["HAVING" 'expr']?]?#`Grouping`]
}#`Single SELECT`,
["VALUES" ["(" 'expr'*"," ")"]*","]#`Literal values`
>*['compound-operator'#[`E.g. "SELECT` "UNION" `SELECT"`]]#`Compounded SELECT`,
[["ORDER" "BY" 'ordering-item'*","]?#`Ordering` ["LIMIT" 'expr' <!, ["OFFSET"? 'expr']>]?#`Limiting`]
}
```)
= The default.css
#text(str(default-css()))
|
https://github.com/sitandr/typst-examples-book | https://raw.githubusercontent.com/sitandr/typst-examples-book/main/src/packages/code.md | markdown | MIT License | # Code
## `codly`
> See docs [there](https://github.com/Dherse/codly)
``````typ
#import "@preview/codly:0.1.0": codly-init, codly, disable-codly
#show: codly-init.with()
#codly(languages: (
typst: (name: "Typst", color: rgb("#41A241"), icon: none),
),
breakable: false
)
```typst
#import "@preview/codly:0.1.0": codly-init
#show: codly-init.with()
```
// Still formatted!
```rust
pub fn main() {
println!("Hello, world!");
}
```
#disable-codly()
``````
## Codelst
``````typ
#import "@preview/codelst:2.0.0": sourcecode
#sourcecode[```typ
#show "ArtosFlow": name => box[
#box(image(
"logo.svg",
height: 0.7em,
))
#name
]
This report is embedded in the
ArtosFlow project. ArtosFlow is a
project of the Artos Institute.
```]
`````` |
https://github.com/maxi0604/typst-builder | https://raw.githubusercontent.com/maxi0604/typst-builder/main/example.typ | typst | MIT License | For example, you could have files on the root directory.
|
https://github.com/EricWay1024/Homological-Algebra-Notes | https://raw.githubusercontent.com/EricWay1024/Homological-Algebra-Notes/master/ha/2-ab.typ | typst | #import "../libs/template.typ": *
= Abelian Categories
<ab-cat>
== $Ab$-enriched Categories
We have seen, for example, that in $veck$ every hom-set is not only a collection (or set) of morphisms but also carries additional structure: it forms a vector space. This leads to the idea of enriched categories, where enriching means equipping the hom-sets with such additional structure. The following is an instance where every hom-set is an abelian group.
#definition[
We call a category $cC$ *$Ab$-enriched* if every $Hom(C)(X, Y)$ is a abelian group, subject to bilinear morphism composition, namely $ (f + g) compose h = f compose h + g compose h quad "and" quad f compose (k + h) = f compose k + f compose h $
for all $f, g : Y-> Z$ and $h, k : X->Y$.
]
#remark[
An equivalent way to put the bilinearity is the following: the composition mappings $ c_(X Y Z): Hom(C)(X, Y) times Hom(C)(Y, Z) -> Hom(C)(X, Z), quad (f, g) mapsto g oo f $
are group homomorphisms in each variable @borceux[Definition 1.2.1].
]
// The abelian group structure on hom-sets means that we are allowed to add two morphisms (as above) in $Hom(C)(X, Y)$.
#definition[
Let $cC$ be an $Ab$-enriched category and $X, Y in cC$. The *zero morphism* $0 in Hom(C)(X, Y)$ is defined as the identity of the abelian group $Hom(C) (X, Y)$.
]
However, note that an $Ab$-enriched category needs not have a zero object, so this is actually a redefinition of a zero morphism from @zero-factor. We will see later that the two definitions match when the zero object is present. Since group homomorphisms map identity to identity, we have the following:
#proposition[
In an *Ab*-enriched category, let $X->^g Y->^f Z->^h W$. If $f$ is a zero morphism, then $f oo g$ and $h oo f$ are zero morphisms.
]
<zero-composition>
#endlec(3)
We can also define functors between $Ab$-enriched categories which respect the abelian group structures of the hom-set:
#definition[
If $cC, cD$ are $Ab$-enriched, we call $F : cC -> cD$ an *additive functor* if $ Hom(C)(X, Y) -> Hom(D)(F(X), F(Y)) $ is a group homomorphism for any $X, Y in cC$.
]
#proposition[
If $cC$ is an *Ab*-enriched category, then so is $cC^op$.
]
#proof[
The definition is self-dual. Namely, reversing all the arrows in $cC$ breaks neither the group structure on hom-sets nor the bilinear morphism composition.
]
An $Ab$-enriched category need not have a zero object. Nevertheless, once it has an initial or final object, it has a zero object, as shown below.
#proposition[Let $*$ be an object in an *Ab*-enriched category, then the followings are equivalent:
+ $*$ is a final object;
+ $*$ is an initial object;
+ $*$ is a zero object.
]
<ab-zero>
#proof[
(3) $=>$ (1) and (3) $=>$ (2) are obvious. We only prove (1) $=>$ (3); (2) $=>$ (3) follows from duality.
Suppose $*$ is a terminal object and let $id_* : * -> *$ be the unique morphism in the abelian group of $Hom(C)(*, *)$, and so $id_* = 0$.
For any object $A$ and $f : * -> A$ (because $Hom(C)(*, A) $ contains at least the zero morphism), we have $ f = f compose id_* = f compose 0 = 0 in Hom(C)(*, A). $
So there is a unique morphism from $*$ to $A$ and therefore $*$ is also initial.
]
// This also includes the case of the empty product and coprodut, namely any final object is initial and thus zero.
In fact, a final object is an empty product and an initial object an empty coproduct, and the previous result can be generalised.
#proposition[
In an *Ab*-enriched category $cC$, let $X_1, X_2$ be two objects. Then
+ If the product $X_1 times X_2$ exists, then the coproduct $X_1 union.sq X_2$ also exists and is isomorphic to $X_1 times X_2$;
+ If the coproduct $X_1 union.sq X_2$ exists, then the product $X_1 times X_2$ also exists and is isomorphic to $X_1 union.sq X_2$.
]
<ab-product>
#proof[@notes[Proposition 3.7]
// , @li[Theorem 3.4.9]
and @borceux[Proposition 1.2.4]. We prove statement (1) and leave (2) to duality.
Suppose the product $X_1 times X_2$ exists with projections $p_k colon X_1 times X_2 arrow.r X_k$. By definition of products, there are unique morphisms $i_k colon X_k arrow.r X_1 times X_2$ such that the following diagrams commute.
#align(center, commutative-diagram(
node-padding: (80pt, 50pt),
node((1, 1), [$X_1 times X_2$]),
node((1, 0), [$X_1$]),
node((1, 2), [$X_2$]),
node((0, 0), [$X_1$]),
node((2, 2), [$X_2$]),
arr((1, 1), (1, 0), [$p_1$], label-pos: -1em),
arr((1, 1), (1, 2), [$p_2$]),
arr((0, 0), (1, 0), [$id_(X_1)$]),
arr((0, 0), (1, 2), [$0$], curve: 30deg),
arr((0, 0), (1, 1), [$exists ! i_1$], label-pos: 1em, "dashed"),
arr((2, 2), (1, 2), [$id_(X_2)$], label-pos: -1.5em),
arr((2, 2), (1, 0), [$0$], label-pos: -1em, curve: 30deg),
arr((2, 2), (1, 1), [$exists ! i_2$], label-pos: -1em, "dashed"),
))
Explicitly, we have for $j, k in {1, 2}$, $ p_j oo i_k = cases(id_(X_j) quad &"if " j = k, 0 quad &"otherwise") $
// #image("imgs/16.png")
Then we have $ p_1 compose lr((i_1 p_1 plus i_2 p_2)) eq p_1 comma quad p_2 compose lr((i_1 p_1 plus i_2 p_2)) eq p_2. $
By definition of products, $id_(X_1 times X_2) $ is the unique morphism $h : X_1 times X_2 -> X_1 times X_2$ with $p_k compose h eq p_k$ for each $k$, so $i_1 p_1 plus i_2 p_2 eq id_(X_1 times X_2)$. We claim that
$ X_1 rgt(i_1) X_1 times X_2 lft(i_2) X_2 $ is a universal cocone and thus a coproduct. Suppose
$X_1 rgt(f_1) A lft(f_2) X_2 $
is another cocone. Then we have a map
$ phi eq f_1 compose p_1 plus f_2 compose p_2 colon X_1 times X_2 arrow.r A $
such that for $k = 1, 2$, $phi oo i_k = f_k $.
This gives a commutative diagram
// #align(center,image("../imgs/2023-10-29-11-34-35.png",width:30%))
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRAA0B9ARhAF9S6TLnyEUAJnJVajFmy7j+gkBmx4CRblOr1mrRBx4ACPAFt4RhUqFrRm0t2m65BgIL9pMKAHN4RUABmAE4QpkhkIDgQSJIyemxYPNYgwaFIWpHRiLHO+iCJigKBIWGIEVFIAMw6snkBSUUpJenUFYjVcS4pnIXKqaWxbR0MWGB5UHRwABZeIDXxBmhTWB58QA
#align(center, commutative-diagram(
node-padding: (60pt, 50pt),
node((0, 0), [$X_1$]),
node((0, 2), [$X_2$]),
node((0, 1), [$X_1 times X_2$]),
node((1, 1), [$A$]),
arr((0, 0), (0, 1), [$i_1$]),
arr((0, 2), (0, 1), [$i_2$], label-pos: right),
arr((0, 0), (1, 1), [$f_1$], label-pos: right),
arr((0, 2), (1, 1), [$f_2$]),
arr((0, 1), (1, 1), [$phi$], "dashed"),
))
It remains to show that $phi$ is unique. To see this, note that for any
such $phi$ we have $ phi & eq phi compose id_(X_1 times X_2)\
& eq phi compose lr((i_1 p_1 plus i_2 p_2))\
& eq phi i_1 compose p_1 plus phi i_2 compose p_2\
& eq f_1 compose p_1 plus f_2 compose p_2 dot.basic $
]
#definition[
Let $cC$ be an $Ab$-enriched category and let $X_1, X_2 in cC$. The *biproduct* of $X_1$ and $X_2$ is an object $X_1 xor X_2$ with morphisms $p_k : X_1 xor X_2 -> X_k$ and $i_k : X_k -> X_1 xor X_2 $ for $k = 1, 2$, such that
- $p_k oo i_k = 1_(X_k)$;
- $p_j oo i_k = 0 $ for $k != j$;
- $i_1 oo p_1 + i_2 oo p_2 = 1_(X_1 xor X_2)$.
// - $X_1 xor X_2$ with $(p_1, p_2)$ is a product of $X_1$ and $X_2$;
// - $X_1 xor X_2$ with $(i_1, i_2)$ is a coproduct of $X_1$ and $X_2$.
// If $X$ and $Y$ has a product (or a coproduct) in $cC$, then it is called the *biproduct* of $X$ and $Y$, denoted as $X xor Y$.
]
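As a sanity check (our own remark, anticipating the module categories discussed below): in $RMod$, the direct sum $M_1 xor M_2$ with the usual inclusions and projections is a biproduct; for instance, $i_1 oo p_1 + i_2 oo p_2$ sends $(m_1, m_2)$ to $(m_1, 0) + (0, m_2) = (m_1, m_2)$, so it is indeed the identity.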
#corollary[
In an $Ab$-enriched category, a binary biproduct is both a product and a coproduct, and a binary product (or a binary coproduct) is a biproduct.
]
#proof[
This follows from the proof of @ab-product.
]
// We can show that $x union.sq y iso x times y$ and we use the notation of a biproduct $x ds y$ to denote both.
#remark[This extends to all _finite_ products and coproducts but
does not extend to _infinite_ products or coproducts.
]
#lemma[
In an $Ab$-enriched category, an additive functor preserves biproducts.
]
<additive-preserve-biproduct>
#proof[
Notice that an additive functor preserves identity morphisms, zero morphisms, morphism compositions and morphism additions, and they are all we need in the definition of biproducts.
]
Being able to add and subtract parallel morphisms means we can rephrase the definitions for a monomorphism and epimorphism.
#proposition[
In an $Ab$-enriched category $cC$, $f : B-> C$ is a monomorphism if and only if $f oo u = 0$ implies $u = 0$ for all $u : A -> B$.
Dually, $f : B -> C$ is an epimorphism if and only if $v oo f = 0$ implies $v = 0$ for all $v : C -> D$.
]
<ab-mono>
#proof[
$f : B -> C$ is a monomorphism, if and only if $(f oo -) : hom_cC (A, B) -> Hom(C) (A, C)$ is injective for any $A$, if and only if $(f oo -)$ (as a $ZZ$-homomorphism) has kernel $0$.
]
== Additive Categories
Inspired by @ab-zero and @ab-product, we naturally define the following:
#definition[
An $Ab$-enriched category $cC$ is *additive* if it has all finite biproducts, including the zero object.
]
Now we can reconcile the two definitions we have had for zero morphisms.
#proposition[
In an additive category $cC$, let $f: A->B$. Then $f$ is the identity of $Hom(C) (A, B)$ if and only if it can be factored as $A -> 0 -> B$.
]
#proof[
Since $Hom(C) (A, 0)$ has an unique element $h$, it must be the identity of the group. Similarly, $Hom(C) (0, B)$ contains only the identity $g$. The composition $g oo h$ is the identity of $Hom(C) (A, B)$ by @zero-composition.
]
#proposition[
In an additive category, if a monomorphism $i : A-> B$ is a zero morphism, then $A$ is the zero object.
Dually, if an epimorphism $p : C -> D$ is a zero morphism, then $D$ is the zero object.
]
<additive-mono-zero>
#proof[
Take any $X$ and $u : X -> A$, we have
$
X arrow^u A ->^i B.
$
$i = 0$, so $i oo u = 0$; but since $i$ is monic, $u = 0$ by @ab-mono.
Therefore there is a unique (zero) morphism from any $X$ to $A$, so $A$ is final and thus zero.
]
#proposition[@rotman[Proposition 5.89].
Let $f colon A arrow.r B$ be a morphism in an additive
category $cal(C)$. If $ker f$ exists, then $f$ is monic if and only if $ker f eq 0$.
Dually, if $coker f$ exists, then $f$ is epic if and only $coker f eq 0$.
]
<additive-ker>
#proof[
Let $ker f$ be $i : K -> A$. Suppose $i = 0$. Since we know a kernel is a monomorphism, by @additive-mono-zero, $K = 0$. To show that $f$ is monic, take any $u : X -> A$ such that $f oo u = 0$. Then by the universal property of a kernel, there exists a unique morphism $h : X -> K$ such that $u = i oo h$. Thus $u$ factors through $K = 0$, so $u = 0$, proving $f$ is monic by @ab-mono.
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRAGkQBfU9TXfIRQBGclVqMWbAILdeIDNjwEiAJjHV6zVohAAhOXyWCiZYeK1TdADW7iYUAObwioAGYAnCAFskAZmocCCQyEAYsMB0QKDo4AAsHEE1JKLjDEE8ff0DgxFEJbTYmdMzfPJykdQKrDJKvMtCgpHzLKKw7LiA
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((0, 0), [$K$]),
node((0, 1), [$A$]),
node((0, 2), [$B$]),
node((1, 0), [$X$]),
arr((1, 0), (0, 0), [$h$], label-pos: 1em, "dashed"),
arr((1, 0), (0, 1), [$u$], label-pos: -1em),
arr((0, 1), (0, 2), [$f$]),
arr((0, 0), (0, 1), [$i$]),
))
On the other hand, suppose $f$ is monic. Then $ker f = 0$ directly follows from @ab-mono.
// We refer to the diagrams in the definitions of kernel and
// cokernel. Let ker $u$ be $iota colon K arrow.r A$, and assume that
// $iota eq 0$. If $g colon X arrow.r A$ satisfies $u g eq 0$, then the
// universal property of kernel provides a morphism
// $theta colon X arrow.r K$ with $g eq iota theta eq 0$ \(because
// $iota eq 0$). Hence, $u$ is monic. Conversely, if $u$ is monic,
// consider $ K arrows.rr^iota_0 A arrow.r^u B dot.basic $
// Since $u iota eq 0 eq u 0$, we have $iota eq 0$. The proof for
// epimorphisms and cokers is dual.
// #TODO modify
]
== Pre-abelian Categories
Now inspired by @additive-ker, we define the following:
#definition[
An additive category $cC$ is *pre-abelian* if any morphism has a kernel and a cokernel.
]
#corollary[
Let $f$ be a morphism in a pre-abelian category. $f$ is monic if and only if $ker f$ = 0. Dually, $f$ is epic if and only if $coker f = 0$.
]
<pre-add-mono>
In fact, we get more than just kernels and cokernels:
#proposition[
A pre-abelian category has all finite limits and colimits.
]
#proof[
Let $cC$ be a pre-abelian category. Since
$Eq(f, g) = ker(f - g)$, $cC$ has all equalisers and coequalisers. We also know that $cC$ has all finite products and coproducts as an additive category. Thus it has all finite limits and colimits by @all-finite-limits.
]
#proposition[
If $cC$ is pre-abelian, for every morphism $f : X-> Y$, there exists a unique morphism $G -> D$ as shown below.
// // https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRAGsYAnAAgAoAZgEoQAX1LpMufIRQBGclVqMWbABpiJIDNjwEiAJkXV6zVohABNTZN0yiAZmPKzbAMYROXQSPG3p+vKkckqmqhYeXnxRwr5aOgGyyEYhJirmHNx8kVmxYkowUADm8ESgAlwQALZIZCA4EEhyfiAV1U3UDUhGLuGtNq2VNYg9XYgOLW3DCvWNiAAsk0NIAKydcwZL7Qvrq9QMWGAZUHRwABaFIGmuFjAAHlhwOHA8AIT5okA
// #align(center, commutative-diagram(
// node-padding: (50pt, 50pt),
// node((0, 0), [$ker (f)$]),
// node((0, 1), [$X$]),
// node((0, 2), [$Y$]),
// node((0, 3), [$coker(f)$]),
// node((1, 1), [$coker(ker(f))$]),
// node((1, 2), [$ker(coker(f))$]),
// arr((0, 0), (0, 1), []),
// arr((0, 1), (0, 2), [$f$]),
// arr((0, 2), (0, 3), []),
// arr((0, 1), (1, 1), []),
// arr((1, 2), (0, 2), []),
// arr((1, 1), (1, 2), [$exists !$], "dashed"),
// ))
#align(center, commutative-diagram(
node((0, 0), [$K$]),
node((0, 1), [$X$]),
node((0, 2), [$Y$]),
node((0, 3), [$C$]),
node((1, 1), [$G$]),
node((1, 2), [$D$]),
arr((0, 0), (0, 1), [$ker(f)$]),
arr((0, 1), (0, 2), [$f$]),
arr((0, 2), (0, 3), [$coker(f)$]),
arr((0, 1), (1, 1), [$coker (ker (f))$], label-pos: -3.5em),
arr((1, 2), (0, 2), [$ker(coker(f))$], label-pos: -3.5em),
arr((1, 1), (1, 2), [$exists !$], "dashed"),
))
]
<pre-ab-morphism>
#proof[
Since $coker(f) oo f = 0$, by the universal property of kernel, there exists $c : X -> D$ such that $f = ker(coker(f)) oo c$. Since $f oo ker(f) = 0$, we have $ker(coker(f)) oo c oo ker(f) = 0$. Now notice $ker(coker(f))$ is monic, and hence by @pre-add-mono, $ker(ker(coker(f))) = 0$. By the universal property of kernel again, there exists $d : K -> 0$ such that $c oo ker(f) = ker(ker(coker(f))) oo d$. Thus $c oo ker(f)$ factors through the zero object and thus is $0$. The desired morphism is obtained from the universal property of cokernel.
#align(center, commutative-diagram(
node((0, 0), [$K$]),
node((0, 1), [$X$]),
node((0, 2), [$Y$]),
node((0, 3), [$C$]),
node((1, 1), [$G$]),
node((1, 2), [$D$]),
node((2, 2), [$0$]),
arr((0, 0), (0, 1), [$ker(f)$]),
arr((0, 1), (0, 2), [$f$]),
arr((0, 2), (0, 3), [$coker(f)$]),
arr((0, 1), (1, 1), [$coker (ker (f))$], label-pos: 0),
arr((1, 2), (0, 2), [$ker(coker(f))$], label-pos: -3.5em),
arr((1, 1), (1, 2), [$exists !$], "dashed"),
arr((0, 1), (1, 2), [$c$]),
arr((2, 2), (1, 2), [$ker(ker(coker(f)))$], label-pos: -4.5em),
arr((0, 0), (2, 2), [$d$], curve: -40deg)
))
]
#definition[In a pre-abelian category, we define the *coimage* of a morphism $f$ as $ coim (f) = coker(ker(f)) $ and *image* of $f$ as $ im(f) = ker(coker(f)). $ Continuing with @ker-notation, we have $G = Coim(f)$ and $D = IM(f)$ in the above diagram.
We call $f$ *strict* if the map $Coim (f) -> IM f$ is an isomorphism.
]
== Abelian Categories
#definition[
A pre-abelian category is *abelian* if all morphisms are strict.
]
#corollary[
In an abelian category, every morphism $f : X-> Y$ has a factorisation
$
X ->^g IM (f) ->^h Y,
$
where $g$ is an epimorphism and $h$ is a monomorphism.
]
#proof[
Notice $g = coker(ker(f)) = coim(f)$ and $h = ker(coker(f)) = im(f)$.
]
We can always write $f = im(f) oo coim(f)$ and consider $im(f)$ as a subobject of $Y$.
#remark[
The followings are two equivalent definitions of an abelian category:
- A pre-abelian category where every monomorphism is a kernel and every epimorphism is a cokernel;
- A pre-abelian category where every monomorphism is the kernel of its cokernel and every epimorphism is the cokernel of its kernel.
]
We prove part of the equivalence:
#proposition[
In an abelian category, every monomorphism is the kernel of its cokernel, and every epimorphism is the cokernel of its kernel.
]
#proof[
Use the diagram in the proof of @pre-ab-morphism. Let $f$ be a monomorphism, then $ker(f) = 0$ and $K = 0$. It is not too hard to find $G = X$ and $coker(ker(f)) = id_X$. Since $D$ and $G$ are isomorphic, we see that $X$ is isomorphic to $D$ and thus $f = ker(coker(f))$.
]
#remark[
Now it is time to give a list of properties that abelian categories have, packing everything we have picked up along the way:
- Every hom-set is an abelian group subject to bilinear morphism composition;
- It has a zero object and has a zero morphism between any two objects, which is the identity of the abelian group and factors through $0$;
- It has all limits and colimits;
- Any finite product and coproduct coincide as the biproduct;
- $f$ is monic if and only if $f oo u = 0$ implies $u = 0$, if and only if $ker f = 0$, if and only if $f = im(f)$;
- $g$ is epic if and only if $v oo g = 0$ implies $v = 0$, if and only if $coker g = 0$, if and only if $g = coim(g)$;
- $f$ is monic and $f = 0$ implies the domain of $f$ is $0$;
- $g$ is epic and $g = 0$ implies the codomain of $g$ is $0$;
- $Coim(f) -> IM(f)$ is an isomorphism;
- Any $f$ can be factorised as $f = ker(coker(f)) oo coker(ker(f)) = im(f) oo coim(f)$.
]
// Remark. This is equivalent to: (The converses are always true in any category.) This is equivalent to every mono is the kernel of its cokernel and every epi is the cokernel of its kernel. (? TODO)
We now introduce the most important member in the family of abelian categories.
#proposition[
For any ring $R$, the category $RMod$ is an abelian category. In particular, $Ab$ is an abelian category.
]
#proof[
($RMod$ is $Ab$-enriched.) For any $A, B in RMod$, the set $homr(A, B)$ of module homomorphisms $A -> B$ can be naturally seen as an abelian group under pointwise addition. It is easy to check that the composition is bilinear.
($RMod$ is additive.) We know that the direct sum exists as a coproduct for any finite family of modules $(M_i)_(i in I)$ in $RMod$.
($RMod$ is pre-abelian.) Let $f : A -> B$ be a morphism in $RMod$. Then
$
Ker(f) = {a in A : f(a) = 0}
$
with $ker(f) : Ker(f) -> A$ being the inclusion map, is a categorical kernel. Also,
$
Coker(f) = B over IM(f)
$
where $IM(f) = {f(a) in B : a in A}$, with $coker(f) : B -> Coker(f)$ being the quotient map, is a categorical cokernel.
($RMod$ is abelian.) We find
$
Coker(ker(f)) = A over Ker(f) iso IM(f)
$
by the First Isomorphism Theorem and
$
Ker(coker(f)) = IM(f)
$
by construction. Hence the image and coimage coincide up to isomorphism, i.e., any $f$ is strict.
]
#remark[
Note that the product and coproduct of a family $(M_i)_(i in I)$ coincide when $I$ is finite but differ when $I$ is infinite:
$ union.sq.big _(i in I) M_i = plus.circle.big_(i in I) M_i = {(m_i) _(i in I) | m_i in M_i, m_i = 0 "for almost all" i}, $
$ product _( i in I) M_i = {(m_i) _(i in I) | m_i in M_i}. $
]
#proposition[
In $RMod$, a monomorphism is equivalent to an injective homomorphism and an epimorphism is equivalent to a surjective homomorphism.
]
#example[If $cA$ is an abelian category and $cC$ is any small category, then the category of functors $Fun(cC, cA)$ is abelian.]
#example[
The category of Banach spaces over $RR$ is not an abelian category, but a *quasi-abelian category*.
// We have $V attach(arrow.r.hook, t: i) W$ which are open. Then $coker i = W \/ overline(V)$. Then $ker coker i = overline(V) != V$. (The closure of $V$.)
// This is an example of quasi-abelian categories.
]
== Exact Sequences and Functors
#note[
All discussions in this section are limited to an abelian category.
]
We have trekked a long way to establish abelian categories.
The key element that we seek from an abelian category is the notion of exactness:
#definition[
In an abelian category, a sequence of maps $A attach(->, t: f) B attach(->, t: g) C $ is called *exact* at $B$ if $ker g = im f$ (as equivalent subobjects of $B$).
]
#definition[
In an abelian category, a *short exact sequence* $0 -> A attach(->, t: f) B attach(->, t: g) C -> 0$ is exact at $A$, $B$ and $C$, or "exact everywhere".
]
#lemma[
$im (0 -> A) = 0$ and $im(A -> 0) = 0$.
]
#proposition[
$0 -> A attach(->, t: f) B attach(->, t: g) C -> 0$ is a #sest if and only if $f$ is monic, $g$ is epic, and $ker g = im f$.
]
#proof[
- Exactness at $A$ $<=>$ $ker f = im (0 -> A) = 0$ $<=>$ $f$ is monic.
- Exactness at $B$ $<=>$ $ker g = im f$.
- Exactness at $C$ $<=>$ $im g = ker (C -> 0) = id_C$ $<=>$ $g = coim (g )$ $<=>$ $g$ is epic.
]
#proposition[
If $ses(A, B, C, f:f, g:g)$ is a #sest, then $f = ker g$ and $g = coker f$.
]
#proof[
$f$ is monic, so $f = im(f) = ker(g)$. $g$ is epic, so $g = coim(g) = coker(ker(g)) = coker(f)$.
]
#corollary[
$ses(A, B, C, f:f, g:g)$ can be rewritten as
$
ses(IM(f), B, Coker(f), f:"", g:coker(f))
$ or
$
ses(Ker(g), B, Coim(g), f:ker(g), g:"").
$
]
#proposition[
If $A->^f B->C->D->^g E$ is an exact sequence, then
$
ses(Coker(f), C, Ker(g))
$
is a #sest.
]
<five-to-ses>
#definition[
A #sest $ses(A, B, C)$ is *split* if $B$ is isomorphic to $A ds C$.
// #image("imgs/19.png")
]
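#example[
  A standard pair of examples in $Ab$: the sequence $0 -> ZZ -> ZZ ds (ZZ over 2 ZZ) -> ZZ over 2 ZZ -> 0$, built from the inclusion and the projection, is split by construction. In contrast, $0 -> ZZ ->^(times 2) ZZ -> ZZ over 2 ZZ -> 0$ is exact but not split, since $ZZ$ is not isomorphic to $ZZ ds (ZZ over 2 ZZ)$: the latter contains an element of order $2$.
]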
// #lemma[
// An additive functor preserves split short exact sequences.
// ]
// <additive-preserve-split>
// #proof[
// This follows from @additive-preserve-biproduct.
// ]
#lemma("Splitting Lemma")[
Let $ses(A, B, C, f:f, g:g)$ be a short exact sequence. The followings are equivalent:
+ The short exact sequence is split;
+ There exists a *retraction*#footnote[The terms "retraction" and "section" come from algebraic topology, but for our purpose they are nothing more than certain morphisms.] $r: B->A$ such that $r oo f = id_A$;
+ There exists a *section* $s : C -> B$ such that $g oo s = id_C$.
]
<splitting-lemma>
#proof[
// #TODO https://math.stackexchange.com/questions/748699/abstract-nonsense-proof-of-the-splitting-lemma
Although it is possible to give a purely category-theoretic proof, as can be seen in @splitting-lemma-doc, we give a proof in $RMod$, which is in fact sufficient in view of @metatheorem.
(1) $=>$ (2) and (1) $=>$ (3) are trivial by the definition of biproducts.
(2) $=>$ (1). We first claim that $B = IM f + Ker r$. Take any $b in B$, then plainly $b = f r(b) + (b - f r(b)) $. Since $r (b - f r(b)) = r (b) - r f r (b) = 0$, we have $b - f r(b) in Ker r$. Also obviously $f r (b) in IM f$.
We further claim that $B = IM f ds Ker r$. Suppose $b in IM f sect Ker r$, then there exists $a in A$ such that $b = f(a)$; also $r (b) = 0$. Then $0 = r f (a) =a$, so $b = f(a) = 0$.
Now we claim that $Ker r iso C$; in particular, the restriction $g|_(Ker r) : Ker r -> C$ is an isomorphism. Take any $c in C$, then since $g$ is a surjection, there exists some $f(a) + k in B$, where $a in A$ and $k in Ker r$, such that $g (f(a) + k) = c$. Note that $g f(a) = 0$, because $f(a) in IM f = Ker g$ by exactness at $B$, so for any $c in C$, there exists $k in Ker r$ such that $g(k) = c$. Thus $g|_(Ker r)$ is surjective. On the other hand, if $g(k) = 0$ for $k in Ker r$, then $k in Ker g = IM f$, but $IM f sect Ker r = {0}$, so $k = 0$. Thus $g|_(Ker r)$ is injective.
Finally, observe that $f$ is an injection, so $IM(f) iso A$.
(3) $=>$ (1). The proof is similar to the above and thus omitted.
]
#corollary[Let $M, S, T$ be $R$-modules.
- If $M = S ds T$ and $S subset.eq N subset.eq M$, then $N = S ds (N sect T)$.
- If $M = S ds T$ and $S' subset.eq S$, then $M over S' = S over S' ds (T + S') over S'$.
]
<split-sub>
#proof[
@rotman[Corollary 2.24].
]
#definition[
An additive functor $F: cC -> cD$ is called
- *right exact* if the exactness of $A-> B-> C-> 0$ implies the exactness of $F(A) -> F(B) -> F(C) -> 0 $;
- *left exact* if the exactness of $0-> A-> B-> C$ implies the exactness of $0 -> F(A) -> F(B) -> F(C) $;
- *exact* if the exactness of $0->A->B->C->0$ implies the exactness of $ses(F(A), F(B), F(C))$,
for any $A, B, C in cC$.
]
#remark[
By definition, _right exactness preserves cokernels_, since $C$ is the cokernel of the map $A -> B$ and $F(C)$ is the cokernel of the map $F(A) -> F(B)$. Similarly, _left exactness preserves kernels_.
]
#lemma[
Let $cA$ be an abelian category. Let $M in cA$. The functor $ Hom(A)(M, -): cA -> Ab $ is left exact.
]
<hom-left-exact>
#proof[
Let $0->A->^f B->^g C$ be exact in $cA$, then we want to prove
$ 0 -> Hom(A)(M, A) ->^(f oo -) Hom(A)(M, B) ->^(g oo -) Hom(A)(M, C) $
is exact in $Ab$.
Exactness at $Hom(A) (M, A)$ is equivalent to $(f oo -) $ being monic, so let us calculate $Ker(f oo -)$. Let $u in Hom(A)(M, A)$ such that $(f oo -) (u) = 0$, i.e., $f oo u = 0$. But $f$ is monic, so $u = 0$, and thus $Ker(f oo -) = 0$ and $(f oo -)$ is monic.
Exactness at $Hom(A) (M, B)$ is equivalent to $Ker(g oo -) = IM(f oo -)$. To show that $Ker(g oo -) subset.eq IM(f oo -)$, let $ v in Ker(g oo -)$. Then $v : M -> B$ such that $g oo v = 0$. Note that $A = Ker(g)$ and $f = ker(g)$, so by the universal property of kernel, there exists $h : M -> A$ such that $v = f oo h$, hence $v in IM(f oo -)$. On the other hand, to show that $IM(f oo -) subset.eq Ker(g oo -)$, notice that if $v in IM (f oo -)$, then $v = f oo h$ for some $h$ and then $g oo v = g oo f oo h = 0$ since $g oo f = 0$.
]
// #TODO how to understand $f oo -$
#remark[
The functor $Hom(A) (M, -)$ fails to be exact in general because it does not necessarily send an epimorphism to an epimorphism. For a counterexample, let $cA = Ab$ (where an epimorphism is equivalent to a surjective homomorphism) and $M = ZZ over 2 ZZ$. The quotient map $h: ZZ -> ZZ over 4 ZZ $ is an surjective homomorphism. On the other hand, for any abelian group $A$, an element in $hom_Ab (ZZ over 2 ZZ, A)$ (i.e., a group homomorphism $ZZ over 2ZZ -> A$) is uniquely determined by an element in $A$ with order $2$. Hence $hom_Ab ( ZZ over 2 ZZ, ZZ) = 0$ and $hom_Ab ( ZZ over 2 ZZ, ZZ over 4ZZ) = ZZ over 2ZZ$, and we see the induced map $ (h oo -) : hom_Ab ( ZZ over 2 ZZ, ZZ) -> hom_Ab ( ZZ over 2 ZZ, ZZ over 4ZZ) $ cannot be surjective.
]
#corollary[Dually, $Hom(A) (-, M): cA^op -> Ab$ is also left exact. ] <hom-left-exact-2>
#note[
What does left exactness mean for a contravariant functor? If $X -> Y -> Z -> 0$ is exact in $cA$, then $0 -> Z -> Y -> X$ is exact in $cA^op$, and $0 -> Hom(A)(Z, M) -> Hom(A)(Y, M) -> Hom(A)(X, M)$ is exact in $Ab$.
]
#endlec(4)
== Projective and Injective Objects
#definition[
Let $cA$ be an abelian category. An object $P$ is called *projective* if $Hom(A) (P, -)$ is exact.
Dually, an object $I$ is called *injective* if $Hom(A) (-, I)$ is exact.
]
In other words, $P$ is projective if for any #sest $ses(X, Y, Z)$ in $cA$, $ ses(Hom(A)(P, X), Hom(A)(P, Y), Hom(A)(P, Z)) $ is a #sest.
#proposition[
The followings are equivalent:
1. $P$ is a projective object;
2. For any epimorphism $h : Y -> Z$, the induced map $(h oo -) : Hom(A) (P, Y) -> Hom(A) (P, Z)$ is surjective;
3. For any epimorphism $h : Y-> Z$ and any morphism $f : P -> Z$, there exists (not necessarily unique) $g : P -> Y$ such that $f = h oo g$, i.e. the following commutes (which we refer to as the *lifting property*):
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZARgBoAGAXVJADcBDAGwFcYkQAFEAX1PU1z5CKAEyli1Ok1btyPPiAzY8BImQk0GLNohAAtef2VCi5cZK0zdATR6SYUAObwioAGYAnCAFskZkDgQSGJS2uxuhiCePsE0gUjEvO5evogAzHFBiCGWOiAAFpHRqf7x6TSMWGB5UPRw+Q4gmtJ5MAAeWHA4CNyU3EA
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((0, 1), [$P$]),
node((1, 2), [$0$]),
node((1, 1), [$Z$]),
node((1, 0), [$Y$]),
arr((0, 1), (1, 1), [$f$]),
arr((1, 1), (1, 2), []),
arr((1, 0), (1, 1), [$h$]),
arr((0, 1), (1, 0), [$exists g$], "dashed"),
))
4. Any #sest $ses(A, B, P)$ splits.
]
<projective-split>
#proof[
(1) $=>$ (2) is obvious; (2) $=>$ (1) by @hom-left-exact.
(2) $<=>$ (3) is also obvious.
(3) $=>$ (4).
#align(center, commutative-diagram(
node-padding: (50pt, 40pt),
node((1, 1), [$A$]),
node((1, 2), [$B$]),
node((1, 3), [$P$]),
node((1, 4), [$0$]),
node((1, 0), [$0$]),
node((0, 3), [$P$]),
arr((0, 3), (1, 3), [$id_P$]),
arr((1, 2), (1, 3), [$g$]),
arr((0, 3), (1, 2), [$s$], label-pos: -1em, "dashed"),
arr((1, 0), (1, 1), []),
arr((1, 1), (1, 2), []),
arr((1, 3), (1, 4), []),
))
Since $g : B-> P$ is an epimorphism, we can always find $s : P -> B$ such that $g oo s= id_P$ by the lifting property. Then (4) holds by @splitting-lemma[Splitting Lemma].
(4) $=>$ (3). See @ses-split-projective.
]
#corollary[Dually, the followings are equivalent:
1. $I$ is injective;
2. For any monomorphism $h: X->Y$, the induced map $(- oo h) : Hom(A) (Y, I) -> Hom(A) (X, I)$ is surjective;
3. For any monomorphism $h: X->Y$ and any $f: X->I$, there exists $g: Y->I$ such that $f = g oo h$, i.e., the following commutes (which we refer to as the *extension property*):
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZARgBpiBdUkANwEMAbAVxiRAEkQBfU9TXfIRQAGUsKq1GLNsO68QGbHgJEy46vWatEIABpy+SwUQBMYiZuk6AmtwkwoAc3hFQAMwBOEALZIzIHAgkUUktNjcDEE8fJDIAoMQTHncvX0TqQKQAZg0pbRAAC0jotJz44OoGLDB8qDo4AocQXLCdGAAPLDgcOAACRzsuIA
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((1, 1), [$I$]),
node((0, 0), [$0$]),
node((0, 1), [$X$]),
node((0, 2), [$Y$]),
arr((0, 1), (1, 1), [$f$]),
arr((0, 0), (0, 1), []),
arr((0, 1), (0, 2), [$h$]),
arr((0, 2), (1, 1), [$exists g$], "dashed"),
))
4. Any #sest $ses(I, A, B)$ splits.
]
== Categories of Modules
#proposition[
Ring $R$ viewed as an object in $RMod$ is projective.
]
#proof[ It is equivalent to saying that the functor $homr (R, -)$ is exact. In fact,
$homr (R, M) iso M$, because any module morphism $phi : R -> M$ is entirely determined by $phi(1_R)$. Hence, applying $homr (R, -)$ to any #sest $ses(M, M', M'')$ yields a sequence isomorphic to the one we started with, which is exact.
]
// #corollary[
// Any free module $R^(ds I)$ is projective.
// ]
// #proof[
// The proof is similar as above. #TODO
// ]
#note[In $RMod$, we have
$ homr (R, plus.circle.big_(i in I) M_i) = plus.circle.big_(i in I) M_i = plus.circle.big_(i in I) homr (R, M_i). $
This does not follow from the universal property of the direct sum; it holds because $R$ is special: a homomorphism out of $R$ is determined by the image of $1_R$, which is an element of the direct sum and therefore has only finitely many non-zero components.
]
#definition[
Let $cA$ be an additive category. We call an object $C$ *compact* if the canonical morphism $ product.co_(i in I) Hom(A) (C, G_i) -> Hom(A)(C, product.co_(i in I) G_i) $
is an isomorphism for any family ${G_i}_(i in I)$ of objects in $cA$ such that $product.co_(i in I) G_i$ exists.
]
#remark[
You might find different definitions for an arbitrary category (not necessarily additive), but they are equivalent under the additive context.
]
#definition[
In a category $cC$ with coproducts, an object $G$ is called a *generator* if for any $X in cC$, there is an epimorphism
$product.co_I G -> X -> 0$.
]
#lemma[
$R$ is a generator of $RMod$.
]
#proof[
Recall @module-generator.
]
#lemma[
In an abelian category $cA$, any hom-set
$hom_cA (X, Y)$ can be seen as a right module over ring $End(A)(X)$, or equivalently a left module over $End(A)(X)^op$.
]
#proof[
First notice $End(A)(X)$ is indeed a ring with composition as multiplication.
Take any $m in Hom(A)(X, Y)$ and $r in End(A)(X)$.
Define the multiplication $m r$ as $m oo r in Hom(A)(X, Y)$. It is easy to verify that this makes $Hom(A) (X, Y)$ a right module over $End(A)(X)$.
]
#theorem("Morita's Theorem")[
Let $cA$ be an abelian category. Assume $cA$ has (small) coproducts. Assume that $P$ is a compact, projective generator. Let ring $R = End(A) (P)$, then the functor $ Hom(A)(P, -) : cA -> ModR $ is an equivalence of categories.
]
#note[
If $cA = SMod$ for some ring $S$, we have observed that $S$ (as an object of $SMod$) is a compact, projective generator. In this case, $R = end_S (S)$. We observe that any module homomorphism $phi: S -> S$ is uniquely determined by $phi(1) in S$ with $phi(s) = s phi(1)$, and the composition of two homomorphisms $phi_1 , phi_2 : S-> S$ is in the opposite direction of multiplication in $S$: $ phi_1 (phi_2(s)) = s phi_2(1) phi_1(1) $
Therefore, $R = end_S (S) = S^op$. Thus, indeed, we have $SMod$ is equivalent to $ModR$, which is $Mod$-$S^op$.
]
// #remark[
// Using the definition of equivalence, you want to construct another functor in the opposite direction and show their composites are natural isomorphic to identity functors. Alternatively, you might also prove that the functor is fully faithful and essentially surjective, if you can.
// ]
#proof[
@rotman[Theorem 5.55] and @pareigis[p. 211].
// https://cornellmath.wordpress.com/2008/04/10/abelian-categories-and-module-categories/
Denote $ F:=Hom(A)(P, -) : cA -> ModR$.
Using the definition of categorical equivalence, we want to construct another functor $G : ModR -> cA$ and show $F G$ and $G F$ are naturally isomorphic to identity functors. We see that in this way $G$ should be left adjoint to $F$, so $G$ must preserves colimits and in particular be right exact.
Inspired by the discussion above, we define $G$ in the following way. We first set $G(R) = P$ and $G(R^(ds I)) = P^(ds I)$. Any morphism $f: R^(ds J) -> R^(ds I)$ can be represented by a (possibly infinite) matrix with entries $a_(i j) in R$ for all $i in I$ and $j in J$. However, notice that $R = End(A) (P)$ by definition and thus the same matrix $(a_(i j))_(i in I, j in J)$ can also be seen as a morphism $P^(ds J) -> P^(ds I)$, which is defined to be $G(f)$.
Now, for any $R$-module $M$, we can find a presentation
$
R^(ds J) ->^f R^(ds I) -> M -> 0
$
Under $G$, this becomes
$
P^(ds J) ->^(G(f)) P^(ds I) -> G(M) -> 0
$
where we define $G(M) = Coker(G(f))$. It can be verified that $G$ is a functor.
// TODO ?
Since $P$ is a projective object, $F$ is exact and preserves cokernels; since $P$ is compact, $F$ preserves direct sums. On the other hand, $G$ is right exact and preserves direct sums by construction. Hence the composites $F G$ and $G F$ are right exact and preserves direct sums.
Now we check $F G$ and $G F$ are naturally isomorphic to identity functors.
For $F G : ModR -> ModR$, we have $ F G (R) = F (P) = hom_cA (P, P) = R $
and hence $F G(R^(ds I)) = R^(ds I)$. Now for any $M in ModR$, there is a commutative diagram
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((0, 0), [$R^(ds J)$]),
node((0, 1), [$R^(ds I)$]),
node((0, 2), [$M$]),
node((1, 0), [$F G ( R^(ds J) )$]),
node((1, 1), [$F G (R^(ds I))$]),
node((1, 2), [$F G (M)$]),
node((0, 3), [$0$]),
node((1, 3), [$0$]),
arr((0, 0), (1, 0), []),
arr((0, 1), (1, 1), []),
arr((0, 2), (1, 2), []),
arr((0, 0), (0, 1), []),
arr((0, 1), (0, 2), []),
arr((0, 2), (0, 3), []),
arr((1, 0), (1, 1), []),
arr((1, 1), (1, 2), []),
arr((1, 2), (1, 3), []),
))
Since $F G$ preserves cokernels, we see that $F G(M) iso M$. Hence $F G$ is naturally isomorphic to the identity functor of $ModR$.
For $G F: cA -> cA$, we have
$G F (P) = G( R) = P
$,
so $ G F (P^(ds I)) =P^( ds I)$. Now take any $X in cA$, since $P$ is a generator, we can find
$
P^(ds J) -> P^(ds I) -> X -> 0
$
A similar argument as before gives the result.
// #TODO review
]
#remark[
$cA$ can have more than one compact, projective generator, say $P_1$ and $P_2$. Then $cA = End(A) (P_1)^op hyph Mod = End(A) (P_2)^op hyph Mod$, where the rings $End(A) (P_1)$ and $End(A) (P_2)$ are not necessarily isomorphic. This is *Morita equivalence* of rings.

For example, consider $veck$ for some field $k$. Then $k$ and $k^n$ are both compact, projective generators of $veck$, so the module categories over $k$ and over $M_n (k)$ ($n times n$ matrices over $k$) are both equivalent to $veck$.
// #TODO
]
#theorem("Freyd-Mitchell Embedding Theorem")[
If $cA$ is a small abelian category, there is a ring $R$ and an exact, fully faithful embedding functor $cA -> RMod$.
]
<metatheorem>
#proof[
// Using Yoneda embeddings. $cA -> Fun(cA^op, Ab)$. (?)
@weibel[p. 25].
]
This theorem indicates that we can embed an abstract category into a concrete one. From a practical perspective, we can prove any reasonable statements for $RMod$ and they will also hold for abelian categories. An example is the following.
#lemma("Snake Lemma")[
Suppose we have a commutative diagram of objects in an abelian category or $RMod$
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((1, 1), [$A$]),
node((1, 2), [$B$]),
node((1, 3), [$C$]),
node((0, 1), [$A'$]),
node((0, 2), [$B'$]),
node((0, 3), [$C'$]),
node((0, 4), [$0$]),
node((1, 0), [$0$]),
arr((0, 1), (1, 1), [$f$]),
arr((0, 2), (1, 2), [$g$]),
arr((0, 3), (1, 3), [$h$]),
arr((0, 1), (0, 2), [$i'$]),
arr((0, 2), (0, 3), [$p'$]),
arr((0, 3), (0, 4), []),
arr((1, 0), (1, 1), []),
arr((1, 1), (1, 2), [$i$]),
arr((1, 2), (1, 3), [$p$]),
))
// #image("imgs/23.png")
such that the rows are exact, then there is an exact sequence
$ Ker f -> Ker g -> Ker h attach(->, t: diff) Coker f -> Coker g -> Coker h $
where the *connecting (homo)morphism* $diff$ is given by a well-defined formula $ diff(c') = i^(-1) g p'^(-1) (c') + IM(f) $ where $p'^(-1)$ means finding some element $b' in B'$ such that $p'(b') = c'$ and so on.
Further, if $A' -> B'$ is monic, so is $Ker f -> Ker g$.
If $B -> C$ is epic, so is $Coker g -> Coker h$.
]
<snake>
#proof[A detailed proof can be found in @snake-lemma-doc.
We have the following commutative diagram:
#v(20pt)
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((2, 1), [$A$]),
node((2, 2), [$B$]),
node((2, 3), [$C$]),
node((1, 1), [$A'$]),
node((1, 2), [$B'$]),
node((1, 3), [$C'$]),
node((0, 1), [$Ker f$]),
node((0, 2), [$Ker g$]),
node((0, 3), [$Ker h$]),
node((3, 1), [$Coker f$]),
node((3, 2), [$Coker g$]),
node((3, 3), [$Coker h$]),
node((1, 4), [$0$]),
node((2, 0), [$0$]),
arr((0, 1), (1, 1), []),
arr((1, 1), (2, 1), [$f$]),
arr((2, 1), (3, 1), []),
arr((0, 2), (1, 2), []),
arr((1, 2), (2, 2), [$g$]),
arr((2, 2), (3, 2), []),
arr((0, 3), (1, 3), []),
arr((1, 3), (2, 3), [$h$]),
arr((2, 3), (3, 3), []),
arr((1, 1), (1, 2), [$i'$]),
arr((1, 2), (1, 3), [$p'$]),
arr((1, 3), (1, 4), []),
arr((2, 0), (2, 1), []),
arr((2, 1), (2, 2), [$i$]),
arr((2, 2), (2, 3), [$p$]),
arr((0, 3), (3, 1), [$diff$], curve: -68deg, "dashed"),
arr((3, 1), (3, 2), [$j$]),
arr((3, 2), (3, 3), [$q$]),
arr((0, 1), (0, 2), [$j'$]),
arr((0, 2), (0, 3), [$q'$]),
))
In the first row, consider map $j' := i'|_(Ker f) : Ker f -> B'$. We claim that $j' : Ker f -> Ker g$. Indeed, take any $a' in Ker f subset.eq A'$, we have
$ g(j'(a')) = g(i'(a')) = i(f(a')) = i(0) = 0. $
Then $j'(a') in Ker g$ and thus $j' : Ker f -> Ker g$. Similarly, $q' := p'|_(Ker g) : Ker g -> Ker h$. We then see the first row is exact because of the exactness of $A' -> B' -> C'$. Also, if $i'$ is an injection, i.e., $Ker(i') = 0$, then obviously $Ker(j') = 0$.
In the last row, define $j : Coker(f) -> Coker(g)$ as $a + IM(f) |-> i(a) + IM(g)$ for any $a in A$. We claim that this map is well-defined. If $a_1, a_2 in A$ such that $a_1 + IM(f) = a_2 + IM(f)$, then $a_1 - a_2 in IM(f)$, thus there exists $a' in A'$ so that $a_1 - a_2 = f(a')$. Then
$i(a_1 - a_2) = i(f(a')) = g(i'(a')) in IM(g). $
Then
$ j(a_1 + IM(f)) = i(a_1) + IM(g) = i(a_2) + IM(g) = j(a_2 + IM(f)). $
So $j$ is well-defined. Similarly, we can define $q : Coker g -> Coker h$ and show the exactness of the last row. We can also see that the surjection of $p$ implies the surjection of $q$.
Now all arrows except $diff$ are clear.
Pick any $c' in Ker h subset.eq C'$.
Since $p'$ is surjective, there exists $b' in B'$ so that $p'(b') = c'$.
Now $0 = h(c') = h(p'(b')) = p(g(b')), $ so $g(b') in Ker p = IM i$, and there exists unique $a in A$ such that $i(a) = g(b')$.
We thus define $diff: Ker h -> Coker f$ as $diff(c') = a + IM(f). $
We claim this is a well-defined function.
Then it suffices to show for any two choices $b'_1, b'_2$ of $b'$ and corresponding choices $a_1, a_2$ of $a$, $diff (c')$ gives the same value. Since $p'(b'_1) = p'(b'_2) = c'$, we have $b'_1 - b'_2 in Ker(p') = IM(i')$. Thus we can write $b'_1 - b'_2 = i'(a')$ for some $a' in A'$. Then
$i(a_1 - a_2) = g(b'_1 - b'_2) = g(i'(a')) = i (f (a')), $
but $i$ is injective, and hence $a_1 - a_2 = f(a') in IM f$.
We omit the proof of the exactness at $Ker h$ and $Coker f$.
// See @li[Theorem 6.8.6].
]
#endlec(5) |
|
https://github.com/jneug/schule-typst | https://raw.githubusercontent.com/jneug/schule-typst/main/src/kl.typ | typst | MIT License | #import "core/document.typ"
#import "core/layout.typ": base-header, header-left, header-right
#import "_imports.typ": *
#let grading-table = (
"0": .0,
"1": .20,
"2": .27,
"3": .33,
"4": .40,
"5": .45,
"6": .50,
"7": .55,
"8": .60,
"9": .65,
"10": .70,
"11": .75,
"12": .80,
"13": .85,
"14": .90,
"15": .95,
)
#let ewh(exercises) = {
v(8mm)
[Name: #box(stroke:(bottom:.6pt+black), width:6cm)]
// TODO: Should the grading table be created here?
ex.grading.display-expectations-table-expanded(exercises)
v(4mm)
align(right, [*Note:* #box(stroke:(bottom:.6pt+black), width:4cm)])
align(right, [Datum, Unterschrift: #box(stroke:(bottom:.6pt+black), width:4cm)])
v(1fr)
align(
center,
ex.grading.display-grading-table(
exercises,
grading-table,
),
)
}
#let kl-title(
doc,
) = block(
below: 0.65em,
width: 100%,
rect(
width: 100%,
stroke: (
right: 10pt + theme.muted,
bottom: 10pt + theme.muted,
),
inset: -2pt,
)[
#rect(width: 100%, stroke: 2pt + black, fill: white, inset: 0.25em)[
#set align(center)
#set text(fill: theme.text.title)
#heading(
level: 1,
outlined: false,
bookmarked: false,
)[
#smallcaps(doc.title) (#doc.duration Minuten)
]
#v(-1em)
#heading(level: 2, outlined: false, bookmarked: false)[
#args.if-none(doc.subject, () => [])
#args.if-none(doc.class, () => [])
#(doc.author-abbr)()
]
#v(0.25em)
]
],
)
#let klausur(
ewh: ewh,
..args,
body,
) = {
let (doc, page-init, tpl) = base-template(
type: "KL",
type-long: "Klausur",
_tpl: (
options: (
duration: t.integer(default: 180),
split-expectations: t.boolean(default: false),
),
aliases: (
dauer: "duration",
erwartungen-einzeln: "split-expectations",
),
),
fontsize: 10pt,
title-block: kl-title,
..args,
body,
)
{
show: page-init.with(header: base-header.with(rule: true))
tpl
}
if doc.solutions == "page" {
show: page-init.with(header-center: (..) => [= Lösungen])
context ex.solutions.display-solutions-page(ex.get-exercises())
}
{
show: page-init.with(
header-center: (..) => [= Erwartungshorizont],
footer: (..) => [],
)
context ewh(ex.get-exercises())
}
}
// TODO: Rework this. Maybe add "pre-pages" and "post-pages" as conecpt in base-template / document?
#let deckblatt(message: [Klausuren und Informationen fÃŒr die Aufsicht]) = [
#v(.5fr)
#align(center)[
#text(4em, font: theme.fonts.sans, weight: "bold")[
#the-number. #the-type #the-subject
]
#text(3em, font: theme.fonts.sans, weight: "bold")[
#sym.tilde #the-class #sym.tilde
]
#v(4em)
#text(3em, font: theme.fonts.sans, weight: "bold")[
#document.use-value("date", d => d.display())
// #options.display(
// "datum",
// format: dt => if dt != none {
// ("Sonntag", "Montag", "Dienstag", "Mittwoch", "Donnerstag", "Freitag", "Samstag").at(dt.weekday())
// dt.display(", [day].[month].[year]")
// },
// )
]
#v(2em)
#text(2em, weight: 400, message)
#v(2em)
#block()[
#set text(1.2em)
#set align(right)
// / Beginn: #luecke(width: 2cm) Uhr
// / Abgabe: #luecke(width: 2cm) Uhr
]
]
#v(1fr)
#grid(
columns: (1fr, 1fr),
gutter: 3cm,
[*Anwesend:*], [*Abwesend:*],
)
#v(1fr)
#pagebreak()
]
#let teilaufgabe = teilaufgabe.with(points-format: ex.points-format-join)
|
https://github.com/Quaternijkon/Typst_FLOW | https://raw.githubusercontent.com/Quaternijkon/Typst_FLOW/main/src/magic.typ | typst | // ---------------------------------------------------------------------
// List, Enum, and Terms
// ---------------------------------------------------------------------
/// Align the list marker with the baseline of the first line of the list item.
///
/// Usage: `#show: align-list-marker-with-baseline`
#let align-list-marker-with-baseline(body) = {
show list.item: it => {
let current-marker = {
set text(fill: text.fill)
if type(list.marker) == array {
list.marker.at(0)
} else {
list.marker
}
}
let hanging-indent = measure(current-marker).width + .6em + .3pt
set terms(hanging-indent: hanging-indent)
if type(list.marker) == array {
terms.item(
current-marker,
{
// set the value of list.marker in a loop
set list(marker: list.marker.slice(1) + (list.marker.at(0),))
it.body
},
)
} else {
terms.item(current-marker, it.body)
}
}
body
}
/// Scale the font size of the list items.
///
/// Usage: `#show: scale-list-items.with(scale: .75)`
///
/// - `scale` (number): The ratio of the font size of the current level to the font size of the upper level.
#let scale-list-items(
scale: .75,
body,
) = {
show list.where().or(enum.where().or(terms)): it => {
show list.where().or(enum.where().or(terms)): set text(scale * 1em)
it
}
body
}
/// Make the list, enum, or terms nontight by default.
///
/// Usage: `#show list: nontight(list)`
#let nontight(lst) = {
let fields = lst.fields()
fields.remove("children")
fields.tight = false
return (lst.func())(..fields, ..lst.children)
}
/// Make the list, enum, and terms nontight by default.
///
/// Usage: `#show: nontight-list-enum-and-terms`
#let nontight-list-enum-and-terms(body) = {
show list.where(tight: true): nontight
show enum.where(tight: true): nontight
show terms.where(tight: true): nontight
body
}
// ---------------------------------------------------------------------
// Bibliography
// ---------------------------------------------------------------------
#let bibliography-counter = counter("footer-bibliography-counter")
#let bibliography-state = state("footer-bibliography-state", ())
#let bibliography-map = state("footer-bibliography-map", (:))
/// Display the bibliography as footnote.
///
/// Usage: `#show: magic.bibliography-as-footnote.with(bibliography("ref.bib"))`
///
/// Notice: You cannot use the same key twice in the same document, unless you use the escape option like `@key[-]`.
///
/// - numbering (string): The numbering format of the bibliography in the footnote.
///
/// - escape (content): The escape string which will be used to escape the cite key, in order to avoid the conflict of the same key.
///
/// - bibliography (bibliography): The bibliography argument. You should use the `bibliography` function to define the bibliography like `bibliography("ref.bib")`.
#let bibliography-as-footnote(numbering: "[1]", escape: [-], bibliography, body) = {
show cite: it => if it.supplement != escape {
box({
place(hide(it))
context {
let bibitem = bibliography-state.final().at(bibliography-counter.get().at(0))
footnote(numbering: numbering, bibitem)
bibliography-map.update(map => {
map.insert(str(it.key), bibitem)
map
})
}
bibliography-counter.step()
})
} else {
footnote(numbering: numbering, context bibliography-map.final().at(str(it.key)))
}
// Record the bibliography items.
{
show grid: it => {
bibliography-state.update(
range(it.children.len()).filter(i => calc.rem(i, 2) == 1).map(i => it.children.at(i).body),
)
}
place(hide(bibliography))
}
body
}
/// Display the bibliography.
///
/// You can avoid `multiple bibliographies are not yet supported` error by using this function.
///
/// Usage: `#magic.bibliography()`
#let bibliography(title: auto) = {
context {
let title = title
let bibitems = bibliography-state.final()
if title == auto {
if text.lang == "zh" {
title = "åèæç®"
} else {
title = "Bibliography"
}
}
if title != none {
heading(title)
v(.45em)
}
grid(
columns: (auto, 1fr),
column-gutter: .7em,
row-gutter: 1.2em,
..range(bibitems.len()).map(i => (numbering("[1]", i + 1), bibitems.at(i))).flatten(),
)
}
} |
|
https://github.com/Trebor-Huang/HomotopyHistory | https://raw.githubusercontent.com/Trebor-Huang/HomotopyHistory/main/infcat.typ | typst | #import "common.typ": *
= æ ç©·èçŽ
æ ç©·èçŽçåºæ¬æ³æ³éåžžç®æŽ. 对象ä¹éŽææå°, æå°ä¹éŽæ 2-æå°, 2-æå°ä¹éŽæ 3-æå°, 以æ€ç±»æš. 产çè¿ç§æŠå¿µæå€éåå²åšæº, æ¥äžæ¥æä»¬äŸæ¬¡å¯¹åäžªåšæºçåå²çº¿æ¡è¿è¡æ¢³ç.
== èçŽçå±é
=== å䌊èçŽ
ææç©ºéŽçèçŽäŸ¿äºç ç©¶æææ§èŽš, äŸåŠ $sans("Top")$ çåæå°±æ¯ææç©ºéŽçåè. äœæ¯åäŒŠè®ºäžæŽåžžè§çæ¯ææç©ºéŽçå䌊çä»·, å³ååš $f : X -> Y$, $g : Y -> X$ äœ¿åŸ $f compose g$ äž $g compose f$ éœäžæååœæ°å䌊. è¿èªç¶åŒåºäº*å䌊èçŽ*çå®ä¹. $sans("Ho")(sans("Top"))$ çå¯¹è±¡æ¯ææç©ºéŽ, èæå°æ¯è¿ç»æ å°åšå䌊äžççä»·ç±». è¿äžªèçŽäžçåæå°±æ¯å䌊çä»·.
ç¶è, è¿ç§å®ä¹æè¯žå€é®é¢. äŸåŠå
¶äžçæéäžäœæéå¹¶äžå¯¹åºå䌊æéäžäœæé, çè³åšåŸå€æ
åµäžéœæ ¹æ¬äžååš. è¿æ¯å 䞺åå»å䌊å
³ç³»æåŒäºæŽé«é¶çä¿¡æ¯, å æ€åç§å䌊æäœéœéŸä»¥æ£ç¡®è¿è¡. æš¡åèçŽæ¯äžç§é¿å
åå»å䌊, å©çšé¢å€æ·»å çç»æåš $sans("Top")$ äžè¿è¡å䌊æäœçåæ³.
åæ ·çç°è±¡åšä»£æ°äžä¹æåºç°. åšéŸå€åœ¢çç ç©¶äž, åç§åœå对åè°çŸ€ç圱åå¯ä»¥ç±å
¶å¯Œåºåœåæè¿°. äœæ¯è¿ç§æè¿°éåžžå€æ, äŸåŠå€äžªåœåå€å对éŸå€åœ¢ç圱åéèŠç¹æçè°±åºå衚述. ç©¶å
¶æ ¹æ¬, æ¯å¯Œåºåœåååè°æäœå¯ŒèŽä¿¡æ¯ç䞢倱. ç¶è, 富åºåœåçæé äž, æå°é¢è§£æ¯äžå¯äžç, ä»
ä»
åšéŸå䌊æä¹äžå¯äž, å æ€æéèŠååè°ç¡®ä¿è¯å®ä¹. å æ€, ç± Grothendieck äžå
¶åŠç <NAME> åš 1960 幎代èèäºéŸå€åœ¢èçŽ $"Ch"(A)$ åå»éŸå䌊åŸå°èçŽ $cal(K)(A)$, äžå±éšåæåæ (å³ä¿æææåè°çŸ€çæ å°, ç±»æ¯åŒ±å䌊çä»·) åŸå°çèçŽ $cal(D)(A)$. è¿äºèçŽå¯ä»¥ç±»äŒŒäºæš¡åèçŽäžæ ·èµäºé¢å€çç»æ, ç§°äœ*äžè§èçŽ*.
äºå®äž, ä»£æ°ææäžä¹åºç°äºäžäžªéåžžéèŠçäžè§èçŽ, ä¹å°±æ¯è°±çå䌊èçŽ.
åŸæ©å°±åç°äºäžåè°äžå䌊ä¹éŽçå
³ç³»
$ H^n (X; G) tilde.equiv [X, K(G, n)] $
å
¶äž $[X, Y]$ 衚瀺 $X -> Y$ æ å°åšå䌊å
³ç³»äžççä»·ç±». $K(G, n)$ æ¯äžç±»ç¹æ®ç空éŽ, ç§°äœ *EilenbergâMac Lane 空éŽ*. 对äºä»»äœå¹¿ä¹çäžåè°ç论 $h^bullet$, åç± <NAME> åš 1962 幎ç»åºäºèåç*å¯è¡šå®ç*, å³å®æ»æ¯å¯ä»¥åæ $ h^n (X) tilde.equiv [X, S_n] $
å
¶äž $S_n$ æ¯äžå空éŽ, ææ å° $S_n -> Omega S_(n+1)$ (ä¹çä»·äº $Sigma S_n -> S_(n+1)$, å 䞺è¿äž€äžªåœå䌎é). è¿ç§ç»æç§°äœ*è°±* (spectrum). è¿äžªæŠå¿µç± Elon Lages Lima åš 1958 幎æååŒå
¥.
æéèŠç广ä¹äžåè°çè®ºå°±æ¯ $K$-ç论, å©çšææç©ºéŽäžçåéäžç»åºäžåè°çŸ€. 䜿çšå®åé空éŽççæ¬ç§°äžº $K upright(O)$, å®å¯¹åºçè°±æéåžžæè¶£çåšææ§ç»æ, 读è
å¯ä»¥ç¿»å°å°é¢é¡µæ¬£èµ. è¿ç§ç»æç§°äœ *Bott åšææ§*.
åšåèšäžååºççé¢åäŒŠçŸ€è¡šæ Œ, ç»å¿è§å¯å¯ä»¥çå°æ²¿çå¯¹è§æ¹ååŸå³äžç§»åšæ¶, æ»æ¯äŒæ¶æå°äžäžªåºå®ç矀, ç§°äœçé¢ç*çš³å®å䌊矀*. è¿æ¯ç±å䌊ç Freuthendal 纬æ¬å®çä¿è¯ç, 峿»¡è¶³ç¹å®æ¡ä»¶çç©ºéŽ $X$, äžæå纬æ¬åŸå°äžåå䌊矀 $ pi_n (X), pi_(n+1) (Sigma X), pi_(n+2) (Sigma^2 X), dots $ æç»æ»æ¯è¶äºçš³å®. è¿äžªçŸ€è®°äœ $pi_n^s (X)$. çš³å®å䌊矀ä¹äºå䌊矀, æ£åŠè°±ä¹äºææç©ºéŽ. æ¢èšä¹, è°±æ¯çš³å®å䌊矀è¿ç§ä»£æ°ç»æçææå¯¹åº. äŸåŠç颿æè°± $SS$, å
¶çš³å®å䌊矀ä¹åšå°é¢é¡µååº. è°±äžçå䌊论称äœçš³å®å䌊论, å®çžèŸäºå䌊论èèšæŽå®¹æå计ç®.
$sans("Ho")(sans("Spec"))$ 乿æäžè§èçŽ, è¿è¯Žæè¿äºé¢ååºç°çé®é¢æ¯çžéç, å æ€äžäžªç»äžçè§£å³æ¹æ¡å°äŒæç€ºå䌊论äžçæ·±å±ç»æ.
=== é«ç»Žä»£æ°
äžåè°æéèŠçç»æå°±æ¯æ¯ç§¯ $H^n (X) times.circle H^m (X) -> H^(n+m) (X)$. è¿ç¿»è¯å°è°±äž, 对åºçæ¯*ç¯è°±* (ring spectrum). è°±äžå¯ä»¥å®ä¹çŒ©ç§¯ $X and Y$, ç±»æ¯äº€æ¢çŸ€çåŒ é积. 猩积çåäœå
æ¯çè°± $SS$, ç±»æ¯äº€æ¢çŸ€ $ZZ$. ç¯è°±å°±æ¯æºåžŠäºä¹æ³ $mu : X and X -> X$ äžåäœå
$eta : SS -> X$ çè°±, 䜿åŸä¹æ³åäœåŸäžç»ååŸåšå䌊æä¹äžæç«.
类䌌çå¯¹è±¡è¿æ H 空éŽ, å°±æ¯åžŠç¹ææç©ºéŽ $X$ äžç»å®è¿ç»æ å° $m : X times X -> X$, 满足 $m(*, -)$ äž $m(-, *)$ éœåäŒŠäºæåæ å°. è¿å¯ä»¥çäœå䌊æä¹äžç矀å
¬ç匱åç.
尜管åšåŸå€æ
åµäžåªéèŠåŒ±åçå
¬çå³å¯, äœæ¯äžäºæé äžéèŠæŽå®æŽçå
¬çæèœäœç°èŸå¥œçæ§èŽš. äŸåŠ, ç¯è°±äžå䞪å
çŽ ç乿³äŒæäºç§ç»åæ¹åŒ, ç»ååŸç»åºå䌊äºèŸ¹åœ¢
#box(width: 100%)[
#set align(center)
#fletcher.diagram(spacing: (0cm, 1.5cm), node-defocus: 0, $
& (x dot y) dot (z dot w) edge("dr", bend: #20deg) & \
x dot (y dot (z dot w)) edge("ur", bend: #20deg) edge("d") && ((x dot y) dot z) dot w \
x dot ((y dot z) dot w) edge("rr") && (x dot (y dot z)) dot w edge("u")
$)]
æ€äºèŸ¹åœ¢äžäžå®å¯çŒ©, å³å®ä»¬å°œç®¡ç»å, äœæ¯ç»åçâæ¹åŒäžå¯äžâ. èŠæ±äºæ€äºèŸ¹åœ¢å¯çŒ©å, äºäžªå
çŽ ç乿³åäŒææäžäžªä¹é¢äœ, 以æ€ç±»æš. è¿äžç³»åå äœäœç§°äœ *Stasheff å€è圢*. 1963 幎, Stasheff å©çšå®æåºäºææ ç©·é«ç»Žçç»ååŸçè¿ç®, ç§°äœ $A_oo$ 代æ°. èè¥åå äžäº€æ¢åŸ, å°±åŸå° $E_oo$ 代æ°. è¿äºä»£æ°çç ç©¶ç§°äœ âçŸäžœæ°ä»£æ°â. #footnote[ãçŸäžœæ°äžçãæ¯äžæ¬å乿éŠå°è¯Ž.]
== é«ç»ŽèçŽ
èªèçŽè®ºæåºä»¥æ¥, å®å°±ä»¥èœå€ç®æŽå°è¡šèŸŸå䞪æ°åŠåæ¯äžéèçæœè±¡ç»æèé»å. ç¶è讜åºçæ¯, èçŽè®ºæ¬èº«åŽæ æ³èªç¶å°çº³å
¥èçŽçç ç©¶æ¡æ¶äž. åŠæå°èçŽè§äœå¯¹è±¡, åœåè§äœæå°, 就応ç¥äºåœåä¹éŽèªç¶åæçä¿¡æ¯. åŸå€æ
åµäžå®é
äžåœåå¹¶äžå®å
šçžç, èæ¯çžå·®äžäžªèªç¶åæ. åŠæåªèèåœååšèªç¶åæäžççä»·ç±», å°±äŒäž $sans("Ho")(sans("Top"))$ äžæ ·åºç°è®žå€é®é¢.
äºå®äž, å
šäœèçŽææçæ¯äžäžª *2-èçŽ* $sans("Cat")$, èéèçŽ ââ ä¹ç§°äœ 1-èçŽ. 2-èçŽé€äºå¯¹è±¡äžæå°ä¹å€, è¿ææå°ä¹éŽç 2-æå°. å
·äœæ¥è¯Ž, æ¯äž€äžªå¯¹è±¡ä¹éŽçæå°äž 2-æå°ææäžäžª 1-èçŽ $hom(X, Y)$. ææçæå° $id in hom(X, X)$, äžäžæäºå
åœå $hom(Y, Z) times hom(X, Y) -> hom(X, Z)$ 衚瀺 1-æå°çå€å. å¯¹äº $sans("Cat")$ æ¥è¯Ž, $hom(X, Y)$ å°±æ¯åœåäžèªç¶æå°ææçèçŽ.
1-æå°å€åçç»ååŸäžèœçŽæ¥çšçžçå
³ç³»æè¿°, å 䞺æä»¬è®€äžº âçžåâ ç 1-æå°åŸåŸçžå·®äžäžªåæ. å æ€æä»¬åŒå
¥èªç¶åæ
$ alpha_(f,g,h) &:& f compose (g compose h) &tilde.equiv (f compose g) compose h \
lambda_f &:& id compose f &tilde.equiv f \
rho_f &:& f compose id &tilde.equiv f $
æä»¬æ¯æ¬¡åšæ°åŠå¯¹è±¡äžåŒå
¥æ å°, å°±éèŠèèè¿äºæ å°æ¯åŠæ»¡è¶³çåŒ. è¿é, è¿äºèªç¶åæä¹éèŠæ»¡è¶³äžäºé¢å€ççåŒ (ç±äºè¿äºåææ¯ 1-èçŽ $hom(X, Y)$ äžç, æä»¥å¯ä»¥çŽæ¥è°è®ºçžç). äŸåŠå䞪æå°çå€åæäºç§æ¹åŒ, å¯ä»¥çš $alpha_(bullet, bullet, bullet)$ è¿æ¥å®ä»¬:
#box(width: 100%)[
#set align(center)
#fletcher.diagram(spacing: (0cm, 1.5cm), node-defocus: 0, $
& (f compose g) compose (h compose k) edge("dr", ->, bend: #20deg) & \
f compose (g compose (h compose k)) edge("ur", ->, bend: #20deg) edge("d", ->) && ((f compose g) compose h) compose k \
f compose ((g compose h) compose k) edge("rr", ->) && (f compose (g compose h)) compose k edge("u", ->)
$)]
åŸè¡šäžäž€æ¡è·¯åŸå€åéèŠçžç. å¯¹äº $lambda_bullet, rho_bullet$ ä¹åèªæäžäžªäžè§åœ¢çåŒ. è¿äºå€æççåŒç§°äœ*è莯æ§* (coherence) çåŒ. 2-èçŽçç论æ¶å»éèŠå€çè¿äºé«é¶çåŒ, å æ€çšæŸç¹ç. 读è
å¯ä»¥è¯åŸéªè¯ä»»äœ 1-èçŽå¯ä»¥çäœ 2-èçŽ, å³å° $hom(X, Y)$ ä»éååçº§äžºåªææåæå°çèçŽ.
å®ä¹ 2-èçŽä¹éŽçåœåæ¶, åæ ·éèŠç»åºäžäºè莯æ§çåŒ. è¿æ¶æä»¬éèŠè¯æäžäžªå
蟹圢åŸè¡šäžäž€äžªå蟹圢åŸè¡šåå«äº€æ¢. èªç¶åæ¢åéèŠäžç³»å亀æ¢åŸè¡š. æå, 2-èçŽäžçèªç¶åæ¢ä¹éŽè¿ææŽé«å±çæ å°, ç§°äœ*è°æŽ*.
2-èçŽçè®ºåš 1967 å¹Žç± <NAME> æåº. è¿æ¯äžºäºè¡šè¿°ä»£æ°å äœçäžéç论äžèªç¶åºç°çç»æ. çš 2-èçŽçè¯èš, å¯ä»¥å°å®è¡šè¿°äžºåœå $F : cal(B) -> sans("Cat")$, å
¶äž $cal(B)$ æ¯ 1-èçŽè§äœéåç 2-èçŽ. äœæ¯ç±äºèµ·å没æ 2-èçŽçå·¥å
·, Grothendieck å°å
¶è¡šè¿°äžº 1-èçŽçåœå $p : cal(E) -> cal(B)$, çšæ¯äžªå¯¹è±¡ $X in cal(B)$ äžç纀绎代æ $F(X)$. 2-èçŽå¯ä»¥æŽå çŽæ¥å°è¡šè¿°æ€äº.
èœç¶æäºè®žç¹ç, 2-èçŽçç论ä»ç¶åšäººç±»æäœçèåŽå
. äœæ¯ 3-èçŽçå€æåºŠè¿
éè¶
è¿äºçè§£äžè®°å¿çé床. åš @tricategory äž, 3-èçŽçå®ä¹å°±å æ®äº 6 页, èå äžå¯¹åºçåœåãèªç¶åæ¢ãè°æŽ, è¿æè°æŽä¹éŽçæ å° ââ ç§°äœ*埮æ°* ââ 蟟å°äºæäººç 19 页. è¿ä»
ä»
æ¯çœåå®ä¹, è¿æªåŒå§è¯æä»»äœæ§èŽšæç»åºä»»äœæé . æŸç¶, è¿ç§åæ³æ¯æ æ³æç»ç. å æ€, ç»å€§éšåçèçŽè®ºç ç©¶éœå±éåš $n$-èçŽ $(n <= 2)$ äž.
æ¢ç¶ 3-èçŽçç论就已ç»éŸä»¥çè§£, åŸé¿äžæ®µæ¶éŽå
, æ ç©·èçŽéœè¢«è®€äžºæ¯æŽå äžå¯è§Šç¢°ç对象. æ ç©·èçŽç论çäžå€§èŽ¡ç®å°±æ¯æç Žäºè¿äžè®€ç¥, å¹¶äž¥æ Œå°å»ºç«äºæ ç©·èçŽè®ºçåç§å¯ä»¥ç±»æ¯ 1-èçŽè®ºçå®ç.
== äœäžºç©ºéŽ?
=== ç»åææ
äžçŽä»¥æ¥, å䌊论çç ç©¶åå¶äºç©ºéŽçæ§èŽš. 讞å€å®çéœæ¯å
åšæ§èŽšèŸå¥œç空éŽäžå»ºç«ç. èµ·å䜿çšçæ¯ (å äœ) å纯å€åœ¢. ç¹ã线段ãäžè§åœ¢ãåé¢äœç $n$ 绎æšå¹¿ç§°äœ*å纯圢* (simplex, ç®ç§°å圢). åŠææäžªç©ºéŽè¢«äžæå纯圢åç©ºéŽ $cal(K)$ èŠç, äœ¿åŸæ¯äžªå纯圢 $s in cal(K)$ çé¢ä¹å±äº $cal(K)$, å¹¶äžä»»äœäž€äžªå纯圢ç亀é $s_1 sect s_2$ èŠä¹ç©º, èŠä¹æ¯ $s_1$ äž $s_2$ çé¢, 就称 $cal(K)$ æ¯è¿äžªç©ºéŽäžç*å纯å€åœ¢*ç»æ. æµåœ¢çäžè§ååå°±æ¯ææµåœ¢äžçå纯å€åœ¢ç»æ. åšææç©ºéŽè¿æªåºç°çæ¶å, å纯å€åœ¢å°±å¯ä»¥çŽæ¥äœäžºç©ºéŽçå®ä¹.
æä»¬ä¹å¯ä»¥è±çŠ»ææç©ºéŽèå®å
šæœè±¡çèèå纯å€åœ¢. 泚æå°å纯å€åœ¢äž, é¡¶ç¹éå®å
šå³å®äºå纯圢. å äžºåŠææäž€äžª $n$ 绎å圢çé¡¶ç¹çžå, é£ä¹å®ä»¬çžäº€éèŠæ¯è¿äž€äžªå圢çé¢, ççŸ.
å æ€æä»¬å¯ä»¥è®Ÿæäžäžªéå $V$ 衚瀺å€åœ¢çé¡¶ç¹, å¹¶äžæäžææéé空åé $cal(K) subset.eq cal(P)(V)$ æè¿°äºå€åœ¢çå纯圢éå. å
¶äžéèŠæ¯äžªé¡¶ç¹çåç¹é ${v} in cal(K)$, 衚瀺é¶ç»Žå圢, å¹¶äžä»»äœå圢 $s in cal(K)$ çé空åé (å äœæä¹æ¯å圢çé¢) ä¹å±äº $cal(K)$. è¿ç§°äœ*æœè±¡å纯å€åœ¢*. è¿å¯ä»¥è§äœæ¯å纯å€åœ¢çèåŸ, å¯ä»¥æç
§æœè±¡å纯å€åœ¢çæå¯ŒæŒèŽŽå äœäœåŸå°å äœå纯å€åœ¢.
è¿ç§å€åœ¢çŠæ¢å€æçç²åæ
åµåºç°. äŸåŠå $SS^1$ å¿
é¡»ç±äžæ¡çº¿æ®µææ, äžèœçŽæ¥ç±äžæ¡çº¿æ®µå°äž€äžªé¡¶ç¹ç²å; ç $SS^2$ åæå°éèŠå䞪äžè§åœ¢ææç©ºå¿åé¢äœç¶. å æ€åšåå
·äœè®¡ç®æ¶åŸåŸäŒéèŠåŸå€å圢, æ¯èŸç¹ç. åæ¶, åšæé å纯å€åœ¢æ¶åŸåŸäŒéå°äžæ»¡è¶³æ¡ä»¶çæ
åµ, å°±éèŠé¢å€åŒå
¥æäœä¿®æ£. Eilenberg äž Zilber @semisimplicial åš 1949 幎åŒå
¥äºæŽå 宜æŸçå€åœ¢, ç§°äœ*åå纯å€åœ¢* (semi-simplicial complex). å 䞺å圢äžåç±é¡¶ç¹å³å®, $n$ 绎å圢äžèœçäœé¡¶ç¹éçåé, èæ¯éèŠé¢å€ç¡®å®äžäžªéå $K_n$. å
¶æ $(n+1)$ 䞪 $(n-1)$ 绎é¢, å æ€æåœæ° $sigma_i : K_n -> K_(n-1)$, å
¶äž $0 <= i <= n$. 䞺äºä¿è¯æŒæ¥å
³ç³»æ£ç¡®, è¿äºåœæ°éèŠæ»¡è¶³äžäºçåŒ.
$n$ 绎å纯圢ç $k$ ç»Žé¢æ $binom(n+1,k+1)$ 䞪, åŠææä»¬è®° $[n] = {0, dots, n}$, å°±å $[k] -> [n]$ çåè°åå°äžäžå¯¹åº. æä»¬å¯ä»¥çšèçŽè¯èšéæ°è¡šèŸŸåå纯å€åœ¢, å³ $Delta_"inj"^"op" -> sans("Set")$ çåœå. å
¶äž $Delta_"inj"$ æ¯ $[1], [2], dots$ äžå®ä»¬ä¹éŽçåè°åå°ææçèçŽ. è¿äºåå°å¯ä»¥è®€äžºæ¯æä»£å圢çæäžªé¢ç圢åŒç¬Šå·.
åš @semisimplicial äž, Eilenberg äž Zilber è¿è®šè®ºäºåŠäžç±»å¯¹è±¡, ç§°äœ*å®å€åå纯å€åœ¢* (complete semi-simplicial complex). åšåå纯å€åœ¢çåºç¡äžåŒå
¥äº*éå*çå圢. äŸåŠé¿åºŠäžºé¶ç线段就æ¯éå $1$-å圢. å®å€åå纯å€åœ¢å¯ä»¥è¡šè¿°äžº $Delta^"op" -> sans("Set")$ çåœå, å
¶äž $Delta$ æ¯ $[1], [2], dots$ äžå®ä»¬ä¹éŽçå
šäœåè°æ å°ææçèçŽ. æ¯äžªåå纯å€åœ¢éœå¯ä»¥é æ·»å éåçå圢ææå®å€åå纯å€åœ¢, äœæ¯åä¹äžç¶. äŸåŠçé¢ $SS^2$ å¯ä»¥è¡šè¿°äžºæ°å¥œæäžäžªééå $2$-å圢, å¹¶äžå®çäžäžªé¢éœæ¯éåç $1$-å圢.
ç±äºè¿äºç»ææ¬èŽšäžåªéèŠéåä¹éŽçæ å°å
³ç³», äžéèŠææä¿¡æ¯å°±å¯ä»¥è®šè®º, å æ€åœä»£äžè¬ç§°ä¹äžºåå纯é, èäžç§°å€åœ¢. åæ¶, ç±äºå®å€åå纯å€åœ¢éåžžéèŠ, 䜿çšè¿çšäžéæžç®å, æç»åç§°å䞺*å纯é* (simplicial set). ç»å®å纯éæåå纯é, æ»èœä»¥ä¹äžºèåŸ, æé åºå¯¹åºçææç©ºéŽ, ç§°äœå
¶*å äœå®ç°*.
å幎, Whitehead å®ä¹äºæŽå 宜æŸç *CW å€åœ¢*, æç§°èè
å€åœ¢ (cellular complex). å
¶äž, ç²æ¥äžéèŠæ²¿çå圢çé¢, èå¯ä»¥éæç²æ¥. å æ€, å圢çç»æå°±äžåéèŠ, å¯ä»¥çŽæ¥æ¿æ¢æå®å¿çäœ. äŸåŠ, å¯ä»¥å°åççèŸ¹çæ²¿çåç»äž€åç²æ¥åŸå° $RR PP^2$. Whitehead è¯æäºèåç Whitehead å®ç, å³å¯¹äº CW å€åœ¢æ¥è¯Ž, åŠæ $f : X -> Y$ 诱富äºå䌊矀çåæ, é£ä¹å°±äžå®ååšå䌊æä¹äžçéæ å°. æ¢å¥è¯è¯Ž, 匱å䌊çä»·å¯ä»¥æšåºå䌊çä»·. å æ€, CW å€åœ¢äžæ¹é¢éåžžçµæŽ», åŠäžæ¹é¢æé€äºåšå䌊è§è§äžç
æçææç©ºéŽ. äŸåŠ *Warsaw å*, ç± $sin(1/x)$ æ²çº¿äž ${(0, y) | -1 <= y <= 1}$ ç²æ¥èæ.\
#box(width: 100%)[
#set align(center)
#import cetz.draw: *
#cetz.canvas(
{
cetz.plot.plot(name: "plot", size: (20/calc.pi, 2), axis-style: none, x-tick-step: none, y-tick-step: none, {
cetz.plot.add(domain: (calc.sqrt(2*calc.pi), 8), samples: 400, t => (1/(t*t), calc.sin(t*t)), style: (stroke: black))
cetz.plot.add(domain: (calc.sqrt(2*calc.pi), 8), samples: 400, t => (-1/(t*t), -calc.sin(t*t)), style: (stroke: black))
cetz.plot.add-anchor("center", (0,0))
cetz.plot.add-anchor("left", (-1/(2*calc.pi),0))
cetz.plot.add-anchor("left-up", (-1/(2*calc.pi)-1/(4*calc.pi*calc.pi), 1))
cetz.plot.add-anchor("right", (1/(2*calc.pi),0))
cetz.plot.add-anchor("right-down", (1/(2*calc.pi)+1/(4*calc.pi*calc.pi), -1))
})
rect((rel: (20/63, 1), to: "plot.center"), (rel: (-20/63, -1), to: "plot.center"), stroke: none, fill: black)
bezier("plot.left", (0,-1), "plot.left-up", (-2,-1))
line((0,-1), (20/calc.pi, -1))
bezier("plot.right", (20/calc.pi, -1), "plot.right-down", (20/calc.pi + 0.5, -1))
}
)]
è¿æ¡æ²çº¿ææå䌊矀å䞺é¶, å æ€ä»å䌊çè§è§æ²¡æä»»äœå¯è¯å«çæ§èŽš. äœæ¯å®å¹¶äžå¯çŒ©. CW å€åœ¢è§é¿äºè¿ç§æ
åµ, äœæ¯ä»»äœææç©ºéŽéœäžæäžª CW å€åœ¢åŒ±å䌊çä»· (å¯¹äº Warsaw åæ¥è¯Ž, å®äžåç¹ç©ºéŽåŒ±å䌊çä»·). å æ€, CW å€åœ¢æäžºäºä»£æ°ææäžå䌊论äžå¯æçŒºçå·¥å
·.
<NAME> åš 1956 幎起系ç»å°åå±äºå纯éäžçå䌊论. å°œç®¡æ²¡æææçæŠå¿µ, å䌊çåç§å
³é®ç»æéœå¯ä»¥åšå纯éäžäœç°. ä»å䌊çè§åºŠ, å纯éèçŽ $sans("sSet")$ äžææç©ºéŽçèçŽ $sans("Top")$ æ¯çä»·ç. äž¥æ Œæ¥è¯Ž, å°±æ¯è¿äž€è
ææçæš¡åèçŽä¹éŽæ Quillen çä»·. å æ€, æä»¬ä¹å¯ä»¥æå纯éç§°äœ â空éŽâ.
// Kan æåºäºäžç§æ»¡è¶³ç¹æ®æ¡ä»¶çå纯é, ç§°äœ *Kan å€åœ¢*. å° $n$ 绎å圢çäœå纯é, è®°äœ $Delta^n$. å°å
¶æå»äžå¿äžå
¶äžäžäžªé¢, å°±åŸå°äºâè§â $Lambda^n_i$, å
¶äž $0 <= i <= n$ 衚瀺æå»äºç¬¬å 䞪é¢. ç»å®å纯圢 $X$, åŠæå¯¹äºä»»äœè§ $Lambda^n_i -> X$, éœååšå¡«å
$Delta^n -> X$, 就称å
¶äžº Kan å€åœ¢. æ¢å¥è¯è¯Ž, ææåŠåŸæç€ºçäº€æ¢æ¹éœå¯ä»¥å¡«å
¥å¯¹è§çº¿.
=== æ 穷矀è
åšä»£æ°ææçåŒç«¯æåºçåºæ¬çŸ€çæŠå¿µ, éèŠéæ©äžäžªåºç¹. å¯ä»¥ç»åºäžäžªäžéæ©åºç¹ççæ¬, ç§°äœ*åºæ¬çŸ€è*. 矀è峿ææå°éœå¯éçèçŽ. 矀å¯ä»¥çäœåªæäžäžªå¯¹è±¡ç矀è, 矀çå
çŽ å¯¹åºè¿äžªå¯¹è±¡å°èªèº«çæå°. åºæ¬çŸ€èçå¯¹è±¡æ¯ææç©ºéŽäžçç¹, èæå°æ¯éè·¯çå䌊çä»·ç±». é£ä¹åºæ¬çŸ€èäžå¯¹è±¡åšåæå
³ç³»äžççä»·ç±»å°±æ¯ææç©ºéŽçéè·¯è¿é忝 $pi_0 (X)$. å æ€, åºæ¬çŸ€èåæ¶å
å«äº $pi_0(X)$ äž $pi_1 (X)$ çä¿¡æ¯, å¯ä»¥è®°äœ $pi_(<= 1) (X)$.
泚æå°äžºäºéè·¯å€åçç»ååŸççåŒ, è¿åå»äºéè·¯ä¹éŽçå䌊, å æ€æå€±äºäžäºä¿¡æ¯. å¯¹äºæ²¡ææŽé«ç»Žçç»æçç©ºéŽ ââ å³ $pi_n (X) = 0$ $(n > 1)$, ç§°äœ *1-æªæ*ç©ºéŽ ââ æ¥è¯Ž, åºæ¬çŸ€èå°±å
å«äºå
šéšçä¿¡æ¯. äž¥æ Œæ¥è¯Ž, çŸ€èææç 2-èçŽ $sans("Grpd") arrow.hook sans("Cat")$ äž 1-æªæææç©ºéŽãè¿ç»æ å°ãæ å°å䌊ççä»·ç±»ææç 2-èçŽ $sans("Top")_(<= 1)$ æ¯çä»·ç.
äžäžæ¥æšå¹¿å°±æ¯ 2-矀è, å³æææ å°éœå¯éç 2-èçŽ, 以æ€ç±»æš. åŠæèŠèŠçå
šäœç©ºéŽèäžæå€±ä»»äœä¿¡æ¯, èªç¶å°±éèŠå¯¹ $n$-矀èåæé $n -> oo$. è¿æ ·, å°±èœå®ä¹ âåºæ¬æ 穷矀èâ, çŽæ¥åéè·¯èäžåå»ä»»äœå䌊. è¿å°±äœ¿åŸç©ºéŽçæŠå¿µå®å
šæè±ææ, ä»èä¹é¿å
äºç
æææç©ºéŽçé®é¢.
Grothendieck åšèåçæçš¿ã远寻å ã@PursuingStacks äžæåºçšæ 穷矀èäœäžºç©ºéŽççä»·å®ä¹. âå â æä»£çå°±æ¯ç°ä»£æç§°çæ 穷矀è. æ 穷矀èäžç©ºéŽççä»·æ§ç§°äœ*å䌊å讟*. äœæ¯åŠäžæç€º, 1-æªæçå䌊å讟éèŠçšå° 2-èçŽççä»·, å æ€ä»
ä»
æ¯è¡šè¿°åºå®æŽçå䌊å讟就å¿
é¡»å®ä¹æ ç©·èçŽä¹éŽççä»·, æŽäžçšè¯Žè¯æäº. å æ€, æ ç©·èçŽçç论ä¹äŒç»åº (å䌊æä¹äž) 空éŽçæ¬èŽšçå¯ç€º.
== æ ç©·èçŽ
è¿äžæ¡çº¿çåå±, æç»èåå¬çäºæ ç©·èçŽçç ç©¶.
1973 幎, Boardman äž Vogt @quasicategory åŒå
¥äº*æèçŽ* (quasicategory), æ å¿çæ ç©·èçŽè®ºçåŒç«¯. è¿æ¯åºäºå纯éçå®ä¹. å纯éäžçé¡¶ç¹ä»£è¡šå¯¹è±¡, 蟹代衚æå°. äžè§åœ¢è¡šç€ºäž€äžªæå°çå€åæ¯ç¬¬äžäžªæå°. 泚æè¿æå³ç䞀䞪æå°å€åå¯ä»¥äžå¯äž, å®ä»¬åªéèŠåšæŽé«ç»Žçå䌊æä¹äžå¯äžå³å¯. åæ¶, äžæ¡èŸ¹ä¹éŽå¯ä»¥æå€äºäžäžªäžè§åœ¢, 衚瀺å®ä»¬ä¹éŽçå䌊å¯ä»¥éå¹³å¡. è¿ç§æ¹åšäœ¿åŸæ 穷绎çè莯åŸå¯ä»¥ç»äžå°æè¿°. æ ç©·èçŽçæäœå°±åæäºæééçäžäºç»åé®é¢.
éå, æäœæ ç©·èçŽæéçå·¥å
·ç𳿥åå±. æ ç©·èçŽçå
¶ä»å®ä¹ä¹éæžåºç°. äŸåŠ $sans("sSet")$-å
å®èçŽ, $sans("Top")$-å
å®èçŽçç. è¿äºå®ä¹éœæ¯äºçžçä»·ç.
è¿å
¥ 21 äžçºª, <NAME> ååºäºãé«ç»Žæè±¡è®ºã@HTT, ç³»ç»å°æè¿°äºæ ç©·èçŽäžæ ç©·æè±¡çç论. åæ¶, ãé«ç»Žä»£æ°ã@HA å°äžè§èçŽçç论æå±äžºçš³å®æ ç©·èçŽ. åšè¿äžªæ¡æ¶äž, å䌊èçŽå¯ä»¥çäœæ¯æ ç©·èçŽå
å«ä¿¡æ¯çæªæ, æ£åŠåºæ¬çŸ€æ¯ææç©ºéŽæå
å«ä¿¡æ¯çæªæ. è¿æ ·, ä¹åæå°çå䌊å°éŸå°±è¿åèè§£.
|
|
https://github.com/TeunSpithoven/Signals-And-Embedded-Systems | https://raw.githubusercontent.com/TeunSpithoven/Signals-And-Embedded-Systems/main/starter.typ | typst | // CHANGE THIS TO THE CORRECT PATH
#import "./template/fhict-template.typ": *
#import "./components/terms.typ": term_list
#show: fhict_doc.with(
title: "",
subtitle: "",
authors: (
(
name: "dsdss",
),
),
version-history: (
(
version: "",
date: "",
author: [ddd],
changes: "",
),
),
pre-toc: [#include "./components/pre-toc.typ"],
// bibliography-file: "my-sources.bib",
glossary-terms: term_list,
)
Hoi @banaan
|
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/bugs/table-lines_00.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
#set page(height: 50pt)
Hello
#table(
columns: 4,
[1], [2], [3], [4]
)
|
https://github.com/ukihot/igonna | https://raw.githubusercontent.com/ukihot/igonna/main/articles/shell-work/reg.typ | typst | == æ£èŠè¡šçŸ
æ£èŠè¡šçŸã¯ãæååã®ãã¿ãŒã³ã衚çŸããããã®èšæ³ã§ããã
ããã«ãããç¹å®ã®æ¡ä»¶ãæºããæååã广çã«æ€çŽ¢ãããã眮æãããããããšãã§ããã
åºæ¬çãªæ£èŠè¡šçŸã以äžã«ç€ºããïŒæ£èŠè¡šçŸã®ããšã«ã³ãã³`:`ãä»äžããããæ³šæããŠã»ããã
=== Character ClassïŒæå
- [abc]: ããããäžã€
- => a, b, c
- [^abc]: å«ãŸãªã
- => d ãªã©
=== MetacharactersïŒã¡ã¿æå
- `.`: ä»»æã®1æå
- `\n`: æ¹è¡ã³ãŒã
- `^`: è¡ã®å
é
- `$`: è¡ã®æ«å°Ÿ
- `\d`: åè§æ°å
- `\s`: 空çœ
- `\S`: 空çœä»¥å€ãã¹ãŠ
- `\w`:åè§è±æ°åãšã¢ã³ããŒã¹ã³ã¢
- `\l`:åè§è±å°æå
- `\u`:åè§è±å€§æå
=== QuantifiersïŒéæå®
- `*`: çŽåã®èŠçŽ ã0å以äžç¹°ãè¿ãïŒã¯ã€ã«ãã«ãŒããšãããïŒ
- `+`: çŽåã®èŠçŽ ã1å以äžç¹°ãè¿ã
- `?`: çŽåã®èŠçŽ ã0åãŸãã¯1å
- {n}: çŽåã®ãã¿ãŒã³ãnåç¹°ãè¿ã
=== Grouping : ã°ã«ãŒãã³ã°
- (): æ¬åŒ§å
ã®ãã¿ãŒã³ã1ã€ã®èŠçŽ ãšããŠæ±ã
== æ£èŠè¡šçŸã®äŸ
é»è©±çªå·ã¯ä»¥äžã®æ£èŠè¡šçŸãé©çšã§ããã
```re
^(\d{3}-\d{4}-\d{4}|\d{10})$
```
ãã€ãã³ãå«ã圢åŒïŒäŸ: 123-4567-8901ïŒãŸãã¯ãã€ãã³ãªãã®åœ¢åŒïŒäŸ: 1234567890ïŒã«ãããããã |
|
https://github.com/Complex2-Liu/macmo | https://raw.githubusercontent.com/Complex2-Liu/macmo/main/contests/2023/content/problem-12.typ | typst | #import "../lib/math.typ": problem, proof, note, ans
#let triangle = math.class("normal", sym.triangle.stroked.t)
#problem[
åŠäžåŸæç€º, 讟 $H$ äž $O$ åå«äžºäžè§åœ¢ $A B C$ çåå¿åå€å¿.
è¯ç¡®å®åéå $arrow(O A) + arrow(O B) + arrow(O C) - arrow(O H)$,
å
¶äž $arrow(O A)$, $arrow(O B)$, $arrow(O C)$ å $arrow(O H)$ 䞺åé,
å¹¶ç»åºè¯æ.
]
#proof[
äºå®äžåŠæäœ å°è¯è¿çšå€æ°å€§æ³æ¥çž IMO å äœé¢, äœ åºè¯¥åŸçæè¿æ ·äžäžªæ§èŽš:
$h = a + b + c$, æä»¥è¿é¢ççæ¡å°±æ¯ $0$.
#align(center)[
#image("../diagram-12.svg", width: 40%)
]
éŠå
讟 $arrow(O D) = arrow(O A) + arrow(O B)$, å 䞺 $triangle A O B$
æ¯çè
°äžè§åœ¢, æä»¥ $D$ å®é
äžæ¯ $O$ å
³äº $A B$ ç relfection.
äžé¢ææåäžäºå¹³é¢å äœåžžè§çåºæ¬åŸåœ¢åæ§èŽš, äžäžªåæ ŒçåŠçåºè¯¥å¯¹æ€æ¯èŸçæ:
#enum(numbering: "(1)", indent: 1em)[
$C H D O$ æ¯å¹³è¡å蟹圢.
][
$2 O M = C H$.
][
讟 $C'$ æ¯ $C O$ äžå€æ¥åç亀ç¹, å $C'$ å®é
äžæ¯ $H$ å
³äº $A B$
çäžç¹ç reflection.
][
$H$ å
³äº $A B$ ç relfection èœåšå€æ¥åäž.
]
è¿äžåçæ§èŽšéœæºäº $H$ äž $O$ çè§å
±èœ, i.e. $angle A C O = angle B C H$.
]
#note[
è¿éæä»¬ä»¥ç¹ $C$ 䞺_äž»è§è§_, å®é
äžäœ 以 $A, B$ 䞺䞻è§è§ä¹èœåŸå°å¹³è¡çç»è®º.
]
/* vim: set ft=typst: */
|
|
https://github.com/lcharleux/LCharleux_Teaching_Typst | https://raw.githubusercontent.com/lcharleux/LCharleux_Teaching_Typst/main/README.md | markdown | MIT License | # Support de cours
Auteur: <NAME> (<EMAIL>)
Ce dépÎt contient des éléments de cours écrits avec Typst.
## Eléments disponibles
### [**MECA510** & **MECA512**] Rappels communs sur les vecteurs et les torseurs pour
- [MECA510-512_Rappels.pdf](https://github.com/lcharleux/LCharleux_Teaching_Typst/raw/outputs/MECA510-512_Rappels.pdf)
### [**MECA510**] Statique des solides
- [MECA510_Statique.pdf](https://github.com/lcharleux/LCharleux_Teaching_Typst/raw/outputs/MECA510_Statique.pdf)
### [**MECA512**] Cinématique des solides indéformables
- [MECA512_Cinematique.pdf](https://github.com/lcharleux/LCharleux_Teaching_Typst/raw/outputs/MECA512_Cinematique.pdf)
### Divers essais de figures et d'environnements Typst
- [demo.pdf](https://github.com/lcharleux/LCharleux_Teaching_Typst/raw/outputs/demo.pdf) |
https://github.com/EpicEricEE/typst-marge | https://raw.githubusercontent.com/EpicEricEE/typst-marge/main/tests/overlap/multiple/test.typ | typst | MIT License | #import "/src/lib.typ": sidenote
#set par(justify: true)
#set page(width: 8cm, height: 20cm, margin: (outside: 4.5cm, rest: 5mm))
#let sidenote = sidenote.with(numbering: "1")
#lorem(5)
#sidenote[This is the first sidenote.]
#for n in range(13) [
oh #sidenote[This one is moved down to prevent overlap.]
]
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/let-19.typ | typst | Other | // Ref: false
// Destructuring with an empty sink.
#let (a: _, ..b) = (a: 1)
#test(b, (:))
|
https://github.com/Mendrejk/thesis | https://raw.githubusercontent.com/Mendrejk/thesis/master/typst/main.typ | typst | #set page(margin: (
top: 0cm,
bottom: 0cm,
x: 0cm,
))
#image("pd_mgr_pl.docx.svg")
#set page(margin: (
top: 2.5cm,
bottom: 2.5cm,
x: 2.5cm,
))
// #set text(font: "Montserrat")
#set text(
font: "Satoshi",
size: 12pt
)
#set par(justify: true)
#show link: underline
#set text(lang: "PL")
#set page(numbering: "1")
#set heading(numbering: "1.1)")
#import "@preview/big-todo:0.2.0": *
// Strona tytuÅowa
// #set align(center)
// #text(size: 22pt)[
// Politechnika WrocÅawska\
// WydziaÅ Informatyki i Telekomunikacji
// ]
// #line(length: 100%)
// #align(left)[
// Kierunek: *IST* \
// SpecjalnoÅÄ: *PSI*
// ]
// #block(spacing: 1em)#set text(font: "Atkinson Hyperlegible")
// #text(size: 32pt)[
// PRACA DYPLOMOWA \
// MAGISTERSKA
// ]
// *Analiza moÅŒliwoÅci wykorzystania metod uczenia maszynowego w rekonstrukcji nagraÅ dźwiÄkowych*
// #block(spacing: 1em)
// <NAME>
// #block(spacing: 1em)
// Opiekun pracy
// *Dr inŌ <NAME>*
// #set align(left)
// #pagebreak(weak: true)
#pagebreak(weak: true)
#outline(indent: true)
#pagebreak(weak: true)
= WstÄp
== TÅo historyczne nagraÅ dźwiÄkowych
Historia rejestracji dźwiÄku siÄga poÅowy XIX wieku, kiedy to w 1857 roku <NAME> skonstruowaÅ fonoautograf - pierwsze urzÄ
dzenie zdolne do zapisywania dźwiÄku @first-recorded-sound. ChoÄ fonoautograf nie umoÅŒliwiaÅ odtwarzania zarejestrowanych dźwiÄków, stanowiÅ przeÅom w dziedzinie akustyki i zapoczÄ
tkowaÅ erÄ nagraÅ dźwiÄkowych. Pierwszym nagraniem uznawanym za moÅŒliwe do odtworzenia byÅa francuska piosenka ludowa "<NAME>", zarejestrowana przez Scotta w 1860 roku @first-recorded-sound.
Kolejnym kamieniem milowym w historii rejestracji dźwiÄku byÅo wynalezienie fonografu przez <NAME> w 1877 roku. UrzÄ
dzenie to nie tylko zapisywaÅo dźwiÄk, ale równieÅŒ umoÅŒliwiaÅo jego odtwarzanie, co otworzyÅo drogÄ do komercjalizacji nagraÅ dźwiÄkowych @edison-phonograph. W nastÄpnych dekadach technologia nagrywania ewoluowaÅa, przechodzÄ
c przez etapy takie jak pÅyty gramofonowe, taÅmy magnetyczne, aÅŒ po cyfrowe noÅniki dźwiÄku @sound-recording-history.
Kluczowe momenty w historii rejestracji dźwiÄku obejmujÄ
:
1. 1888 - Wprowadzenie pÅyt gramofonowych przez <NAME> @berliner-gramophone
2. 1920-1930 - Rozwój nagraÅ elektrycznych, znaczÄ
co poprawiajÄ
cych jakoÅÄ dźwiÄku @electrical-recording
3. 1948 - Pojawienie siÄ pÅyt dÅugograjÄ
cych (LP) @lp-record
4. 1963 - Wprowadzenie kaset kompaktowych przez Philips @compact-cassette
5. 1982 - Komercjalizacja pÅyt CD, rozpoczynajÄ
ca erÄ cyfrowÄ
w muzyce @cd-introduction
Rozwój technologii nagrywania miaÅ ogromny wpÅyw na jakoÅÄ i dostÄpnoÅÄ nagraÅ muzycznych. Wczesne nagrania charakteryzowaÅy siÄ ograniczonym pasmem czÄstotliwoÅci, wysokim poziomem szumów i znieksztaÅceÅ @early-recording-limitations. Wraz z postÄpem technologicznym, jakoÅÄ dźwiÄku stopniowo siÄ poprawiaÅa, osiÄ
gajÄ
c szczyt w erze cyfrowej. JednoczeÅnie, ewolucja noÅników dźwiÄku od pÅyt gramofonowych przez kasety magnetyczne po pliki cyfrowe, znaczÄ
co zwiÄkszyÅa dostÄpnoÅÄ muzyki dla szerokiego grona odbiorców @music-accessibility.
Warto zauwaÅŒyÄ, ÅŒe mimo znacznego postÄpu technologicznego, wiele historycznych nagraÅ o ogromnej wartoÅci kulturowej i artystycznej wciÄ
ÅŒ pozostaje w formie, która nie oddaje peÅni ich oryginalnego brzmienia. Stwarza to potrzebÄ rozwoju zaawansowanych technik rekonstrukcji i restauracji nagraÅ, co stanowi jedno z gÅównych wyzwaÅ wspóÅczesnej inÅŒynierii dźwiÄku i muzykologii @13 @historical-remastering-challenges.
#pagebreak(weak: true)
== Problematyka jakoÅci historycznych nagraÅ
Historyczne nagrania dźwiÄkowe, mimo ich ogromnej wartoÅci kulturowej i artystycznej, czÄsto charakteryzujÄ
siÄ niskÄ
jakoÅciÄ
dźwiÄku, co stanowi istotne wyzwanie dla wspóÅczesnych sÅuchaczy i badaczy. Problematyka ta wynika z kilku kluczowych czynników.
Ograniczenia wczesnych technologii nagrywania stanowiÅy gÅównÄ
przeszkodÄ w uzyskiwaniu wysokiej jakoÅci dźwiÄku. Wczesne urzÄ
dzenia rejestrujÄ
ce charakteryzowaÅy siÄ wÄ
skim pasmem przenoszenia, co skutkowaÅo utratÄ
zarówno niskich, jak i wysokich czÄstotliwoÅci @early-recording-limitations. Typowe pasmo przenoszenia dla nagraÅ z poczÄ
tku XX wieku wynosiÅo zaledwie 250-2500 Hz, co znaczÄ
co ograniczaÅo peÅniÄ brzmienia instrumentów i gÅosu ludzkiego @audio-bandwidth-history. Ponadto, pierwsze systemy nagrywajÄ
ce wprowadzaÅy znaczne szumy i znieksztaÅcenia do rejestrowanego materiaÅu, co byÅo spowodowane niedoskonaÅoÅciami mechanicznymi i elektrycznymi ówczesnych urzÄ
dzeÅ @noise-in-early-recordings.
WpÅyw warunków przechowywania na degradacjÄ noÅników jest kolejnym istotnym czynnikiem wpÅywajÄ
cym na jakoÅÄ historycznych nagraÅ. NoÅniki analogowe, takie jak pÅyty gramofonowe czy taÅmy magnetyczne, sÄ
szczególnie podatne na uszkodzenia fizyczne i chemiczne @analog-media-degradation. Ekspozycja na wilgoÄ, ekstremalne temperatury czy zanieczyszczenia powietrza moÅŒe prowadziÄ do nieodwracalnych zmian w strukturze noÅnika, co przekÅada siÄ na pogorszenie jakoÅci odtwarzanego dźwiÄku. W przypadku taÅm magnetycznych, zjawisko print-through, polegajÄ
ce na przenoszeniu sygnaÅu magnetycznego miÄdzy sÄ
siednimi warstwami taÅmy, moÅŒe wprowadzaÄ dodatkowe znieksztaÅcenia @print-through-effect.
PrzykÅady znaczÄ
cych nagraÅ historycznych o niskiej jakoÅci dźwiÄku sÄ
liczne i obejmujÄ
wiele kluczowych momentów w historii muzyki. Jednym z najbardziej znanych jest nagranie <NAME> wykonujÄ
cego fragment swojego "TaÅca wÄgierskiego nr 1" z 1889 roku, które jest najstarszym znanym nagraniem muzyki powaÅŒnej @brahms-recording. Nagranie to, mimo swojej ogromnej wartoÅci historycznej, charakteryzuje siÄ wysokim poziomem szumów i znieksztaÅceÅ. Innym przykÅadem sÄ
wczesne nagrania bluesa, takie jak "Crazy Blues" <NAME> z 1920 roku, które pomimo przeÅomowego znaczenia dla rozwoju gatunku, cechujÄ
siÄ ograniczonym pasmem czÄstotliwoÅci i obecnoÅciÄ
szumów tÅa @early-blues-recordings.
Wyzwania zwiÄ
zane z odtwarzaniem i konserwacjÄ
starych nagraÅ sÄ
zÅoÅŒone i wymagajÄ
interdyscyplinarnego podejÅcia. Odtwarzanie historycznych noÅników czÄsto wymaga specjalistycznego sprzÄtu, który sam w sobie moÅŒe byÄ trudny do utrzymania w dobrym stanie @playback-equipment-challenges. Proces digitalizacji, choÄ kluczowy dla zachowania dziedzictwa audio, niesie ze sobÄ
ryzyko wprowadzenia nowych znieksztaÅceÅ lub utraty subtelnych niuansów oryginalnego nagrania @music-digitization-challenges. Ponadto, konserwacja fizycznych noÅników wymaga stworzenia odpowiednich warunków przechowywania, co moÅŒe byÄ kosztowne i logistycznie skomplikowane @preservation-storage-requirements.
Dodatkowo, etyczne aspekty restauracji nagraÅ historycznych stanowiÄ
przedmiot debaty w Årodowisku muzycznym i konserwatorskim. Pytanie o to, jak daleko moÅŒna posunÄ
Ä siÄ w procesie cyfrowej rekonstrukcji bez naruszenia integralnoÅci oryginalnego dzieÅa, pozostaje otwarte @ethical-considerations-in-audio-restoration.
Problematyka jakoÅci historycznych nagraÅ stanowi zatem nie tylko wyzwanie techniczne, ale równieÅŒ kulturowe i etyczne. Rozwój zaawansowanych technik rekonstrukcji audio, w tym metod opartych na sztucznej inteligencji, otwiera nowe moÅŒliwoÅci w zakresie przywracania i zachowania dziedzictwa dźwiÄkowego, jednoczeÅnie stawiajÄ
c przed badaczami i konserwatorami nowe pytania dotyczÄ
ce granic ingerencji w historyczny materiaÅ @ai-in-audio-restoration.
#linebreak()
== Znaczenie rekonstrukcji nagraÅ muzycznych
Rekonstrukcja historycznych nagraÅ muzycznych odgrywa kluczowÄ
rolÄ w zachowaniu i promowaniu dziedzictwa kulturowego, oferujÄ
c szereg korzyÅci zarówno dla badaczy, jak i dla szerszej publicznoÅci.
WartoÅÄ kulturowa i historyczna archiwów muzycznych jest nieoceniona. Nagrania dźwiÄkowe stanowiÄ
unikalne Åwiadectwo rozwoju muzyki, technik wykonawczych i zmian w stylistyce muzycznej na przestrzeni lat @cultural-value-of-music-archives. Rekonstrukcja tych nagraÅ pozwala na zachowanie i udostÄpnienie szerszemu gronu odbiorców dzieÅ, które w przeciwnym razie mogÅyby zostaÄ zapomniane lub staÄ siÄ niedostÄpne ze wzglÄdu na degradacjÄ noÅników @preservation-of-audio-heritage. Ponadto, zrekonstruowane nagrania umoÅŒliwiajÄ
wspóÅczesnym sÅuchaczom doÅwiadczenie wykonaÅ legendarnych artystów w jakoÅci zbliÅŒonej do oryginalnej, co ma ogromne znaczenie dla zrozumienia historii muzyki i ewolucji stylów wykonawczych @historical-performance-practice.
Rola zrekonstruowanych nagraÅ w badaniach muzykologicznych jest fundamentalna. Wysokiej jakoÅci rekonstrukcje pozwalajÄ
naukowcom na dokÅadnÄ
analizÄ technik wykonawczych, interpretacji i stylów muzycznych z przeszÅoÅci @musicological-research-methods. UmoÅŒliwiajÄ
one równieÅŒ badanie ewolucji praktyk wykonawczych oraz porównywanie róŌnych interpretacji tego samego utworu na przestrzeni lat @performance-practice-evolution. W przypadku kompozytorów, którzy sami wykonywali swoje dzieÅa, zrekonstruowane nagrania stanowiÄ
bezcenne źródÅo informacji o intencjach twórców @composer-performances.
WpÅyw jakoÅci nagraÅ na percepcjÄ i popularnoÅÄ utworów muzycznych jest znaczÄ
cy. Badania wskazujÄ
, ÅŒe sÅuchacze sÄ
bardziej skÅonni do pozytywnego odbioru i czÄstszego sÅuchania nagraÅ o wyÅŒszej jakoÅci dźwiÄku @music-audio-quality-perception. Rekonstrukcja historycznych nagraÅ moÅŒe przyczyniÄ siÄ do zwiÄkszenia ich dostÄpnoÅci i atrakcyjnoÅci dla wspóÅczesnych odbiorców, potencjalnie prowadzÄ
c do odkrycia na nowo zapomnianych artystów lub utworów @rediscovery-of-forgotten-music. Ponadto, poprawa jakoÅci dźwiÄku moÅŒe pomóc w lepszym zrozumieniu i docenieniu niuansów wykonania, które mogÅy byÄ wczeÅniej niezauwaÅŒalne ze wzglÄdu na ograniczenia techniczne oryginalnych nagraÅ @nuance-in-restored-recordings.
#pagebreak(weak: true)
Ekonomiczne aspekty rekonstrukcji nagraÅ sÄ
równieÅŒ istotne. Rynek remasterów i zrekonstruowanych nagraÅ historycznych stanowi znaczÄ
cy segment przemysÅu muzycznego @remastered-recordings-market. Wydawnictwa specjalizujÄ
ce siÄ w tego typu projektach, takie jak "<NAME>" czy "Naxos Historical", odnoszÄ
sukcesy komercyjne, co Åwiadczy o istnieniu popytu na wysokiej jakoÅci wersje klasycznych nagraÅ @classical-music-reissues. Ponadto, rekonstrukcja nagraÅ moÅŒe prowadziÄ do powstania nowych źródeÅ przychodów dla artystów lub ich spadkobierców, a takÅŒe instytucji kulturalnych posiadajÄ
cych prawa do historycznych nagraÅ @remastered-recordings-market @remastering-education.
Warto równieÅŒ zauwaÅŒyÄ, ÅŒe rekonstrukcja nagraÅ muzycznych ma istotne znaczenie dla edukacji muzycznej. Zrekonstruowane nagrania historyczne mogÄ
sÅuÅŒyÄ jako cenne narzÄdzie dydaktyczne, umoÅŒliwiajÄ
c studentom muzyki bezpoÅredni kontakt z wykonaniami wybitnych artystów z przeszÅoÅci i pomagajÄ
c w zrozumieniu ewolucji stylów muzycznych.
PodsumowujÄ
c, znaczenie rekonstrukcji nagraÅ muzycznych wykracza daleko poza samÄ
poprawÄ jakoÅci dźwiÄku. Jest to proces o fundamentalnym znaczeniu dla zachowania dziedzictwa kulturowego, wspierania badaÅ naukowych, edukacji muzycznej oraz rozwoju przemysÅu muzycznego. W miarÄ jak technologie rekonstrukcji dźwiÄku, w tym metody oparte na sztucznej inteligencji, stajÄ
siÄ coraz bardziej zaawansowane, moÅŒna oczekiwaÄ, ÅŒe ich rola w przywracaniu i promowaniu historycznych nagraÅ bÄdzie nadal rosÅa, przynoszÄ
c korzyÅci zarówno dla Åwiata nauki, jak i dla miÅoÅników muzyki na caÅym Åwiecie @future-of-audio-restoration.
#linebreak()
== Cel i zakres pracy
GÅównym celem pracy jest dogÅÄbna analiza i ocena potencjaÅu metod uczenia maszynowego w dziedzinie rekonstrukcji nagraÅ dźwiÄkowych. Szczególny nacisk poÅoÅŒony zostanie na zbadanie efektywnoÅci zaawansowanych technik sztucznej inteligencji w przywracaniu jakoÅci historycznym nagraniom muzycznym, koncentrujÄ
c siÄ na wyzwaniach, które tradycyjne metody przetwarzania sygnaÅów audio czÄsto pozostawiajÄ
nierozwiÄ
zane @ai-in-audio-restoration.
W ramach pracy zostanie poÅoÅŒony nacisk na trzy kluczowe zagadnienia:
1. Eliminacja szumów i zakÅóceÅ typowych dla historycznych nagraÅ @noise-reduction-techniques.
2. Poszerzanie pasma czÄstotliwoÅciowego w celu wzbogacenia brzmienia nagraÅ o ograniczonym paÅmie @bandwidth-extension-methods.
3. Rekonstrukcja uszkodzonych fragmentów audio, co jest szczególnie istotne w przypadku wielu historycznych nagraŠ@audio-inpainting-techniques.
PodejÅcie badawcze opiera siÄ na implementacji i analizie wybranych metod uczenia maszynowego, skupiajÄ
c siÄ na architekturze *Generatywnych Sieci Przeciwstawnych* (Generative Adversarial Networks - GAN) @gan-in-audio-processing. Wybór tej architektury wynika z jej udokumentowanej skutecznoÅci w zadaniach generatywnych i rekonstrukcyjnych w innych powiÄ
zanych dziedzinach, takich jak przetwarzanie obrazów @gan-image-restoration.
W ramach badaÅ zostanie podjÄta próba opracowania zaawansowanego modelu GAN, który bÄdzie wykorzystywaÅ strukturÄ enkoder-dekoder z poÅÄ
czeniami skip dla generatora oraz architekturÄ konwolucyjnÄ
dla dyskryminatora. Zastosowany zostanie szereg technik normalizacji i regularyzacji, takich jak Batch Normalization, Spectral Normalization czy Gradient Penalty, celem poprawy stabilnoÅci i wydajnoÅci treningu. Kluczowym elementem bÄdzie wykorzystanie kompleksowego zestawu funkcji strat, obejmujÄ
cego Adversarial Loss, Content Loss, oraz specjalistyczne funkcje straty dla domen audio, jak Spectral Convergence Loss czy Phase-Aware Loss. Zaimplemenotwane zostanÄ
zaawansowane techniki optymalizacji, w tym Adam Optimizer z niestandardowymi parametrami oraz dynamiczne dostosowywanie wspóÅczynnika uczenia.
Metodologia badawcza obejmuje kilka kluczowych etapów. Na poczÄ
tku przygotowany zostanie obszerny zestaw danych treningowych, skÅadajÄ
cy siÄ z par nagraÅ oryginalnych i ich zdegradowanych wersji. NastÄpnie zaimplementowane zostanÄ
róŌnorodne warianty architektury GAN, dostosowane do specyfiki przetwarzania sygnaÅów audio. Proces treningu bÄdzie wykorzystywaÅ zaawansowane techniki, takie jak augmentacja danych czy przetwarzanie na spektrogramach STFT. Ostatnim etapem bÄdzie wszechstronna ewaluacja wyników, ÅÄ
czÄ
ca wiele obiektywnych metryk jakoÅci audio.
Oczekiwane rezultaty pracy obejmujÄ
kompleksowÄ
ocenÄ skutecznoÅci proponowanych metod uczenia maszynowego w zadaniach rekonstrukcji nagraÅ audio, w zestawieniu z tradycyjnymi technikami przetwarzania sygnaÅów. Przeprowadzona zostanie szczegóÅowa analizÄ wpÅywu róŌnych komponentów architektury i parametrów modeli na jakoÅÄ rekonstrukcji. Istotnym elementem bÄdzie identyfikacja mocnych stron i ograniczeÅ metod opartych na AI w kontekÅcie specyficznych wyzwaÅ zwiÄ
zanych z restauracjÄ
historycznych nagraÅ muzycznych. Na podstawie uzyskanych wyników, zostanÄ
sformuÅowane wnioski dotyczÄ
ce potencjaÅu sztucznej inteligencji w dziedzinie rekonstrukcji nagraÅ, wraz z rekomendacjami dla przyszÅych badaÅ i zastosowaÅ praktycznych @ai-remastering-ethics.
Realizacja powyÅŒszych celów ma potencjaÅ nie tylko do znaczÄ
cego wkÅadu w dziedzinÄ przetwarzania sygnaÅów audio i uczenia maszynowego, ale równieÅŒ do praktycznego zastosowania w procesach restauracji i zachowania dziedzictwa muzycznego @preservation-of-audio-heritage. Wyniki pracy mogÄ
znaleÅºÄ zastosowanie w instytucjach kulturalnych, archiwach dźwiÄkowych oraz w przemyÅle muzycznym, przyczyniajÄ
c siÄ do lepszego zachowania i udostÄpnienia cennych nagraÅ historycznych szerokiej publicznoÅci.
#pagebreak(weak: true)
= Zagadnienie poprawy jakoÅci sygnaÅów dźwiÄkowych
#linebreak()
== Charakterystyka znieksztaÅceÅ w nagraniach muzycznych
Zagadnienie poprawy jakoÅci sygnaÅów dźwiÄkowych jest ÅciÅle zwiÄ
zane z charakterystykÄ
znieksztaÅceÅ wystÄpujÄ
cych w nagraniach muzycznych. Zrozumienie natury tych znieksztaÅceÅ jest kluczowe dla opracowania skutecznych metod ich redukcji lub eliminacji.
GÅówne typy znieksztaÅceÅ wystÄpujÄ
cych w nagraniach audio obejmujÄ
szereg problemów, które Szczotka @1 identyfikuje w swojej pracy, w tym szumy, brakujÄ
ce dane, intermodulacjÄ i flutter. Szczególnie istotnym problemem, zwÅaszcza w przypadku historycznych nagraÅ, jest ograniczenie pasma czÄstotliwoÅciowego, co stanowi gÅówny przedmiot badaÅ w pracy nad BEHM-GAN @9. Wczesne systemy rejestracji dźwiÄku czÄsto byÅy w stanie uchwyciÄ jedynie wÄ
ski zakres czÄstotliwoÅci, co prowadziÅo do utraty wielu detali dźwiÄkowych, szczególnie w zakresie wysokich i niskich tonów. Ponadto, jak wskazujÄ
badania nad rekonstrukcjÄ
mocno skompresowanych plików audio @11, kompresja moÅŒe wprowadzaÄ dodatkowe znieksztaÅcenia, które znaczÄ
co wpÅywajÄ
na jakoÅÄ dźwiÄku.
ZnieksztaÅcenia nieliniowe stanowiÄ
kolejnÄ
kategoriÄ problemów, które mogÄ
powaÅŒnie wpÅynÄ
Ä na jakoÅÄ nagrania. MogÄ
one wynikaÄ z niedoskonaÅoÅci w procesie nagrywania, odtwarzania lub konwersji sygnaÅu. Efektem tych znieksztaÅceÅ moÅŒe byÄ wprowadzenie niepoÅŒÄ
danych harmonicznych skÅadowych lub intermodulacji, co prowadzi do zmiany charakteru dźwiÄku @analog-media-degradation.
W przypadku historycznych nagraÅ na noÅnikach analogowych, takich jak pÅyty winylowe czy taÅmy magnetyczne, czÄsto wystÄpujÄ
specyficzne rodzaje znieksztaÅceÅ. Na przykÅad, efekt print-through w taÅmach magnetycznych moÅŒe prowadziÄ do pojawienia siÄ echa lub przesÅuchów miÄdzy sÄ
siednimi warstwami taÅmy @print-through-effect. Z kolei w przypadku pÅyt winylowych, charakterystyczne trzaski i szumy powierzchniowe sÄ
nieodÅÄ
cznym elementem, który moÅŒe znaczÄ
co wpÅywaÄ na odbiór muzyki.
WpÅyw tych znieksztaÅceÅ na percepcjÄ muzyki i jej wartoÅÄ artystycznÄ
jest znaczÄ
cy. Badania pokazujÄ
, ÅŒe jakoÅÄ dźwiÄku ma istotny wpÅyw na to, jak sÅuchacze odbierajÄ
i oceniajÄ
muzykÄ @audio-quality-perception. ZnieksztaÅcenia mogÄ
maskowaÄ subtelne niuanse wykonania, zmieniaÄ barwÄ instrumentów czy gÅosu, a w skrajnych przypadkach caÅkowicie znieksztaÅcaÄ intencje artystyczne twórców.
W kontekÅcie historycznych nagraÅ, znieksztaÅcenia mogÄ
stanowiÄ barierÄ w peÅnym docenieniu wartoÅci artystycznej i kulturowej danego dzieÅa. Nawet niewielkie poprawy w jakoÅci dźwiÄku mogÄ
znaczÄ
co wpÅynÄ
Ä na odbiór i interpretacjÄ wykonania.
#pagebreak(weak: true)
JednoczeÅnie warto zauwaÅŒyÄ, ÅŒe niektóre rodzaje znieksztaÅceÅ, szczególnie te charakterystyczne dla okreÅlonych epok czy technologii nagrywania, mogÄ
byÄ postrzegane jako element autentycznoÅci nagrania. To stawia przed procesem rekonstrukcji dźwiÄku wyzwanie znalezienia równowagi miÄdzy poprawÄ
jakoÅci a zachowaniem historycznego charakteru nagrania @ethical-considerations-in-audio-restoration.
Zrozumienie charakterystyki znieksztaÅceÅ w nagraniach muzycznych jest kluczowym krokiem w opracowaniu skutecznych metod ich redukcji. W kolejnych czÄÅciach pracy skupiÄ siÄ na tym, jak zaawansowane techniki uczenia maszynowego, w szczególnoÅci sieci GAN, mogÄ
byÄ wykorzystane do adresowania tych problemów, jednoczeÅnie starajÄ
c siÄ zachowaÄ artystycznÄ
integralnoÅÄ oryginalnych nagraÅ.
#linebreak()
=== Szumy i trzaski
Szumy i trzaski stanowiÄ
jeden z najbardziej powszechnych problemów w historycznych nagraniach. ŹródÅa szumów sÄ
róŌnorodne i obejmujÄ
ograniczenia sprzÄtowe, takie jak szum termiczny w elektronice, oraz zakÅócenia elektromagnetyczne pochodzÄ
ce z otoczenia lub samego sprzÄtu nagrywajÄ
cego @noise-in-early-recordings. Charakterystyka trzasków jest czÄsto zwiÄ
zana z przyczynami mechanicznymi, takimi jak uszkodzenia powierzchni pÅyt winylowych, lub elektronicznymi, wynikajÄ
cymi z niedoskonaÅoÅci w procesie zapisu lub odtwarzania.
WpÅyw szumów i trzasków na jakoÅÄ odsÅuchu jest znaczÄ
cy. MogÄ
one maskowaÄ subtelne detale muzyczne, zmniejszaÄ dynamikÄ nagrania oraz powodowaÄ zmÄczenie sÅuchacza. W skrajnych przypadkach, intensywne szumy lub czÄste trzaski mogÄ
caÅkowicie zaburzyÄ odbiór muzyki, czyniÄ
c nagranie trudnym lub niemoÅŒliwym do sÅuchania @audio-quality-perception.
#linebreak()
=== Ograniczenia pasma czÄstotliwoÅciowego
Historyczne ograniczenia w rejestrowaniu peÅnego spektrum czÄstotliwoÅci sÄ
jednym z kluczowych wyzwaÅ w rekonstrukcji nagraÅ. Wczesne systemy nagrywania czÄsto byÅy w stanie zarejestrowaÄ jedynie wÄ
ski zakres czÄstotliwoÅci, typowo miÄdzy 250 Hz a 2500 Hz @audio-bandwidth-history. To ograniczenie miaÅo powaÅŒne konsekwencje dla brzmienia instrumentów i wokalu, prowadzÄ
c do utraty zarówno niskich tonów, nadajÄ
cych muzyce gÅÄbiÄ i ciepÅo, jak i wysokich czÄstotliwoÅci, odpowiedzialnych za klarownoÅÄ i przestrzennoÅÄ dźwiÄku.
Znaczenie szerokiego pasma dla naturalnoÅci i peÅni dźwiÄku jest trudne do przecenienia. WspóÅczesne badania pokazujÄ
, ÅŒe ludzkie ucho jest zdolne do percepcji dźwiÄków w zakresie od okoÅo 20 Hz do 20 kHz, choÄ z wiekiem górna granica czÄsto siÄ obniÅŒa. PeÅne odtworzenie tego zakresu jest kluczowe dla realistycznego oddania brzmienia instrumentów i gÅosu ludzkiego. Rekonstrukcja szerokiego pasma czÄstotliwoÅciowego w historycznych nagraniach stanowi zatem jedno z gÅównych zadaÅ w procesie ich restauracji, co odzwierciedlajÄ
badania nad technikami takimi jak BEHM-GAN @9.
#pagebreak(weak: true)
=== ZnieksztaÅcenia nieliniowe
ZnieksztaÅcenia nieliniowe stanowiÄ
szczególnie zÅoÅŒonÄ
kategoriÄ problemów w rekonstrukcji nagraÅ audio. Definiuje siÄ je jako odstÄpstwa od idealnej, liniowej relacji miÄdzy sygnaÅem wejÅciowym a wyjÅciowym w systemie audio. Przyczyny tych znieksztaÅceÅ mogÄ
byÄ róŌnorodne, obejmujÄ
c miÄdzy innymi nasycenie magnetyczne w taÅmach analogowych, nieliniowÄ
charakterystykÄ lamp elektronowych w starszym sprzÄcie nagrywajÄ
cym, czy teŌ ograniczenia mechaniczne w przetwornikach @analog-media-degradation.
WpÅyw znieksztaÅceÅ nieliniowych na nagrania jest znaczÄ
cy i czÄsto subtelny. ProwadzÄ
one do powstania dodatkowych harmonicznych skÅadowych dźwiÄku, które nie byÅy obecne w oryginalnym sygnale, oraz do zjawiska intermodulacji, gdzie róŌne czÄstotliwoÅci wejÅciowe generujÄ
nowe, niepoÅŒÄ
dane tony. W rezultacie, brzmienie instrumentów moÅŒe ulec zmianie, a czystoÅÄ i przejrzystoÅÄ nagrania zostaje zaburzona. W niektórych przypadkach, zwÅaszcza w muzyce elektronicznej czy rockowej, pewne formy znieksztaÅceÅ nieliniowych mogÄ
byÄ celowo wprowadzane dla uzyskania poÅŒÄ
danego efektu artystycznego.
Korekcja znieksztaÅceÅ nieliniowych stanowi jedno z najwiÄkszych wyzwaÅ w procesie rekonstrukcji audio. W przeciwieÅstwie do znieksztaÅceÅ liniowych, które moÅŒna stosunkowo Åatwo skorygowaÄ za pomocÄ
filtrów, znieksztaÅcenia nieliniowe wymagajÄ
bardziej zaawansowanych technik. Tradycyjne metody czÄsto okazujÄ
siÄ niewystarczajÄ
ce, co skÅania badaczy do poszukiwania rozwiÄ
zaÅ opartych na uczeniu maszynowym, takich jak adaptacyjne modelowanie nieliniowoÅci czy zastosowanie gÅÄbokich sieci neuronowych @11. TrudnoÅÄ polega na tym, ÅŒe korekta tych znieksztaÅceÅ wymaga precyzyjnego odtworzenia oryginalnego sygnaÅu, co jest szczególnie skomplikowane w przypadku historycznych nagraÅ, gdzie brakuje referencyjnego materiaÅu wysokiej jakoÅci.
#linebreak()
== Tradycyjne metody poprawy jakoÅci nagraÅ
Ewolucja technik restauracji nagraÅ audio przeszÅa znaczÄ
cÄ
transformacjÄ od prostych metod analogowych do zaawansowanych technik cyfrowych. PoczÄ
tkowo, restauracja nagraÅ opieraÅa siÄ gÅównie na fizycznej konserwacji noÅników i optymalizacji sprzÄtu odtwarzajÄ
cego. Wraz z rozwojem technologii cyfrowej, pojawiÅy siÄ nowe moÅŒliwoÅci manipulacji sygnaÅem audio, co znaczÄ
co rozszerzyÅo arsenaÅ narzÄdzi dostÄpnych dla inÅŒynierów dźwiÄku @6. Nogales i inni w swojej pracy porównujÄ
efektywnoÅÄ klasycznych metod filtracji, takich jak filtr Wienera, z nowoczesnymi technikami gÅÄbokiego uczenia, ilustrujÄ
c tÄ ewolucjÄ.
Jednak tradycyjne metody, mimo swojej skutecznoÅci w wielu przypadkach, majÄ
pewne ograniczenia. GÅównym problemem jest trudnoÅÄ w selektywnym usuwaniu szumów bez wpÅywu na oryginalny sygnaÅ muzyczny. Ponadto, rekonstrukcja utraconych lub znieksztaÅconych czÄstotliwoÅci czÄsto prowadzi do artefaktów dźwiÄkowych, które mogÄ
byÄ równie niepoÅŒÄ
dane jak oryginalne znieksztaÅcenia. Cheddad i Cheddad @5 w swoich badaniach nad aktywnÄ
rekonstrukcjÄ
utraconych sygnaÅów audio podkreÅlajÄ
te ograniczenia, proponujÄ
c jednoczeÅnie nowe podejÅcia uzupeÅniajÄ
ce klasyczne techniki restauracji.
=== Filtracja cyfrowa
Filtracja cyfrowa stanowi podstawÄ wielu technik restauracji audio. WyróŌniamy trzy podstawowe typy filtrów: dolnoprzepustowe, górnoprzepustowe i pasmowe. Dai i inni @8 w swoich badaniach nad super-rozdzielczoÅciÄ
sygnaÅów muzycznych pokazujÄ
, jak tradycyjne metody filtracji mogÄ
byÄ rozszerzone i ulepszone dziÄki zastosowaniu uczenia maszynowego.
Zastosowanie filtracji w redukcji szumów polega na identyfikacji i selektywnym tÅumieniu czÄstotliwoÅci, w których dominuje szum. W korekcji czÄstotliwoÅciowej, filtry sÄ
uÅŒywane do wzmacniania lub osÅabiania okreÅlonych zakresów czÄstotliwoÅci, co pozwala na poprawÄ balansu tonalnego nagrania.
Wady filtracji cyfrowej obejmujÄ
ryzyko wprowadzenia artefaktów dźwiÄkowych, zwÅaszcza przy agresywnym filtrowaniu, oraz potencjalnÄ
utratÄ subtelnych detali muzycznych. ZaletÄ
jest natomiast precyzja i powtarzalnoÅÄ procesu, a takÅŒe moÅŒliwoÅÄ niedestrukcyjnej edycji.
#linebreak()
=== Remasterowanie
Remasterowanie to proces poprawy jakoÅci istniejÄ
cego nagrania, czÄsto z wykorzystaniem nowoczesnych technologii cyfrowych. Celem remasteringu jest poprawa ogólnej jakoÅci dźwiÄku, zwiÄkszenie gÅoÅnoÅci do wspóÅczesnych standardów oraz dostosowanie brzmienia do wspóÅczesnych systemów odtwarzania.
Typowe etapy remasteringu obejmujÄ
normalizacjÄ, kompresjÄ i korekcjÄ EQ. Moliner i VÀlimÀki @8 w swojej pracy nad BEHM-GAN pokazujÄ
, jak nowoczesne techniki mogÄ
byÄ wykorzystane do przezwyciÄÅŒenia ograniczeÅ tradycyjnego remasteringu, szczególnie w kontekÅcie rekonstrukcji wysokich czÄstotliwoÅci w historycznych nagraniach muzycznych.
Kontrowersje wokóŠremasteringu czÄsto dotyczÄ
konfliktu miÄdzy zachowaniem autentycznoÅci oryginalnego nagrania a dÄ
ÅŒeniem do poprawy jakoÅci dźwiÄku. Lattner i Nistal @11 w swoich badaniach nad stochastycznÄ
restauracjÄ
mocno skompresowanych plików audio pokazujÄ
, jak zaawansowane techniki mogÄ
byÄ wykorzystane do poprawy jakoÅci nagraÅ bez utraty ich oryginalnego charakteru, co stanowi istotny gÅos w debacie o autentycznoÅci vs. jakoÅÄ dźwiÄku.
Mimo swoich ograniczeÅ, tradycyjne metody poprawy jakoÅci nagraÅ wciÄ
ÅŒ odgrywajÄ
istotnÄ
rolÄ w procesie restauracji audio. JednakÅŒe, rosnÄ
ca zÅoÅŒonoÅÄ wyzwaÅ zwiÄ
zanych z restauracjÄ
historycznych nagraÅ skÅania badaczy do poszukiwania bardziej zaawansowanych rozwiÄ
zaÅ, w tym metod opartych na sztucznej inteligencji, które mogÄ
przezwyciÄÅŒyÄ niektóre z ograniczeÅ tradycyjnych technik.
#pagebreak(weak: true)
== Wyzwania w rekonstrukcji historycznych nagraÅ muzycznych
Proces rekonstrukcji historycznych nagraÅ muzycznych stawia przed badaczami szereg zÅoÅŒonych wyzwaÅ, wymagajÄ
cych interdyscyplinarnego podejÅcia i zaawansowanych technik przetwarzania sygnaÅów.
Fundamentalnym problemem jest brak oryginalnych, wysokiej jakoÅci źródeÅ dźwiÄku. Wiele historycznych nagraÅ przetrwaÅo jedynie w formie znacznie zdegradowanej, czÄsto na noÅnikach analogowych, które same ulegÅy deterioracji @analog-media-degradation. Szczotka @1 zwraca uwagÄ, ÅŒe niedobór niezakÅóconych sygnaÅów referencyjnych komplikuje proces uczenia modeli rekonstrukcyjnych, zmuszajÄ
c do opracowywania zaawansowanych metod symulacji degradacji dźwiÄku.
Identyfikacja i separacja poszczególnych instrumentów w nagraniach historycznych stanowi kolejne istotne wyzwanie. Dai i wspóÅpracownicy @8 podkreÅlajÄ
znaczenie tego aspektu, szczególnie w kontekÅcie rekonstrukcji zÅoÅŒonych utworów orkiestrowych, gdzie ograniczenia wczesnych systemów nagrywania czÄsto prowadziÅy do nakÅadania siÄ ÅcieÅŒek instrumentalnych.
Kluczowym dylematem jest zachowanie autentycznoÅci brzmienia przy jednoczesnej poprawie jakoÅci. Moliner i VÀlimÀki @9 akcentujÄ
potrzebÄ znalezienia równowagi miÄdzy poprawÄ
technicznej jakoÅci dźwiÄku a utrzymaniem charakterystycznego, historycznego brzmienia nagrania. Zbyt agresywna ingerencja moÅŒe prowadziÄ do utraty autentycznoÅci i kontekstu historycznego.
The ethical aspects of intervening in historical recordings raise controversy in the musical and conservation communities. Lattner and Nistal @11 address the question of how far the original recording may permissibly be modified, arguing for the cautious application of advanced reconstruction techniques.
Technical limitations in reproducing the original sound stem from fundamental differences between historical and modern audio technologies. Cheddad @5 points out the difficulty of faithfully reproducing the acoustic characteristics of old concert halls or the specifics of historical instruments.
The complexity of the challenges involved in reconstructing historical music recordings calls for a comprehensive approach. Integrating advanced signal processing techniques, machine learning methods, musicological knowledge, and ethical reflection is crucial for solving the problems encountered. Research by Nogales et al. @6 points to the need for continuous refinement of existing methods and the development of new solutions. The future of historical recording reconstruction depends on researchers' ability to create innovative techniques that can meet the unique demands of each historical musical work while preserving its authenticity and artistic value.
#pagebreak(weak: true)
= Artificial intelligence methods for improving the quality of audio recordings
#linebreak()
== Overview of machine learning techniques in audio processing
The development of machine learning methods in recent years has brought significant progress in the processing and analysis of audio signals. These techniques are increasingly used to improve recording quality, reconstruct damaged fragments, and extract information from audio signals.
#linebreak()
=== Evolution of machine learning applications in audio
The use of machine learning in audio processing dates back to the 1990s, when simple statistical models began to be applied to tasks such as music genre classification and speech recognition @4. With the growth of computing power and progress in artificial neural networks, interest in these techniques for audio analysis and synthesis increased rapidly.
A breakthrough came with the application of deep neural networks, which made it possible to model complex dependencies in audio signals. Studies showed that deep convolutional networks can effectively extract the characteristic features of sounds, opening the way to more advanced applications such as source separation and recording enhancement.
In recent years, generative models such as GANs (Generative Adversarial Networks) and diffusion models have gained increasing popularity, enabling not only the analysis but also the synthesis of high-quality audio signals @8. These advanced techniques are used to reconstruct damaged recordings and to extend the frequency bandwidth of old audio registrations.
#linebreak()
=== Classification of the main approaches: supervised, unsupervised, semi-supervised
Three main machine learning approaches can be distinguished in audio signal processing:
a) Supervised learning:
In this approach, the model learns from pairs of input data and expected outputs. In audio, this may include learning a mapping between noisy and clean recordings for denoising, or classifying instruments from labeled sound samples. An example of supervised learning is the work of Nogales A. et al. @6, in which the authors used convolutional neural networks to reconstruct damaged audio recordings.
b) Unsupervised learning:
Unsupervised techniques focus on discovering hidden structure in data without using labels. In audio, this may include clustering similar sounds or extracting characteristic features without prior knowledge of their meaning.
c) Semi-supervised learning:
This approach combines elements of supervised and unsupervised learning, using both labeled and unlabeled data. It is particularly useful when only a limited number of labeled samples is available, which is often the case for historical audio recordings.
#linebreak()
=== The role of audio representations in machine learning: spectrograms, MFCC features, raw samples
The choice of an appropriate audio representation is crucial to the effectiveness of machine learning models in audio processing tasks.
a) Spectrograms:
They present the distribution of signal frequencies over time, allowing analysis of both temporal and spectral features. Spectrograms are particularly useful in tasks such as source separation or recording enhancement. In @8, the authors used logarithmic spectrograms as input to a GAN model, achieving good results in extending the frequency bandwidth of music recordings.
b) MFCC features (Mel-Frequency Cepstral Coefficients):
They represent the spectral characteristics of a sound in a way that approximates the human auditory system. MFCCs are often used in classification and speech recognition tasks. Studies have shown that MFCC features can be used effectively to assess the quality of reconstructed historical recordings.
c) Raw samples:
Some models, particularly those based on convolutional networks, can operate directly on raw audio samples. This approach eliminates the need for manual feature engineering, allowing the model to discover relevant patterns in the signal on its own.
The choice of representation depends on the specifics of the task and the model architecture. Hybrid approaches, combining different representations to obtain better results, are also increasingly common.
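
To illustrate these three representations in practice, the sketch below loads a recording and derives a magnitude spectrogram and MFCC features alongside the raw samples. The file path is a placeholder and the parameter values are common defaults, not settings taken from the experiments described later in this thesis.

```python
import librosa
import numpy as np

# c) Raw samples: load the unprocessed 1D signal (placeholder path);
# sr=None keeps the file's native sampling rate
audio, sr = librosa.load("example.wav", sr=None)

# a) Spectrogram: magnitude of the short-time Fourier transform
spectrogram = np.abs(librosa.stft(audio, n_fft=2048, hop_length=512))

# b) MFCC features: a compact, perceptually motivated spectral description
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)

print(audio.shape, spectrogram.shape, mfcc.shape)
```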
#pagebreak(weak: true)
Machine learning techniques offer a wide spectrum of possibilities for processing and improving the quality of audio signals. The evolution of these methods, from simple statistical models to advanced generative networks, makes it possible to solve increasingly complex problems in the reconstruction and enhancement of audio recordings. In audio signal processing, the appropriate choice of approach (supervised, unsupervised, or semi-supervised) and of audio representation is of key importance. Sound decisions in this respect allow the potential of machine learning to be fully exploited, which translates into the effectiveness and efficiency of the developed solutions. Progress in this field opens new possibilities for preserving and restoring the cultural heritage embodied in historical audio recordings.
#linebreak()
== Neural networks in audio tasks
Neural networks have become a fundamental tool in audio signal processing, offering unmatched flexibility and the ability to model complex dependencies. Their adaptive nature allows them to automatically extract relevant features from raw audio data, making them highly effective across a wide range of applications, from sound classification to advanced speech synthesis.
The variety of neural network architectures makes it possible to match the solution to the specifics of a given audio task. Convolutional neural networks (CNNs) are particularly effective at analyzing local patterns in spectrograms, while recurrent neural networks (RNNs) excel at modeling long-term temporal dependencies. Autoencoders, in turn, are used for signal compression and denoising, offering dimensionality reduction while preserving the key characteristics of the sound.
The effectiveness of individual architectures can vary considerably depending on the task. Empirical studies indicate that hybrid approaches, combining the strengths of different network types, often yield the best results in complex audio processing scenarios.
#linebreak()
=== Convolutional neural networks (CNNs)
Convolutional neural networks have revolutionized the way audio signals are analyzed. Their unique architecture, inspired by the biological visual system, has proven remarkably effective at extracting hierarchical features from time-frequency representations of sound.
In audio analysis, CNNs most often operate on spectrograms, treating them as two-dimensional "images" of sound. Convolutional layers act as filters, extracting local spectral patterns that may correspond to specific acoustic features such as chords, formants, or instrument characteristics.
Sound classification and speech recognition are areas where CNNs are particularly effective. In tasks such as music genre identification or keyword detection, these networks can automatically learn to recognize the relevant spectral features, often outperforming traditional methods based on hand-crafted features.
Adaptations of CNNs to the specifics of audio data include, among others, dilated convolutions. This technique increases the network's receptive field without increasing the number of parameters, which is particularly useful for modeling long-term temporal dependencies in audio signals. Dilated CNNs have been applied, for example, in real-time audio generation.
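
The sketch below illustrates this idea under simplified assumptions: stacking 1D convolutions whose dilation rate doubles at each layer widens the receptive field exponentially while the number of parameters per layer stays constant. The layer sizes are arbitrary and purely illustrative.

```python
import torch
import torch.nn as nn

class DilatedAudioStack(nn.Module):
    """A stack of dilated 1D convolutions over raw audio (batch, 1, time)."""
    def __init__(self, channels=32, num_layers=4):
        super().__init__()
        layers = [nn.Conv1d(1, channels, kernel_size=3, padding=1)]
        for i in range(num_layers):
            d = 2 ** i  # dilation: 1, 2, 4, 8, ...
            layers += [nn.Conv1d(channels, channels, kernel_size=3,
                                 dilation=d, padding=d),
                       nn.ReLU()]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

x = torch.randn(1, 1, 16000)         # one second of audio at 16 kHz
print(DilatedAudioStack()(x).shape)  # torch.Size([1, 32, 16000])
```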
=== Recurrent neural networks (RNNs)
Recurrent neural networks stand out for their ability to process sequential data, which makes them a natural choice for audio signal analysis. Their architecture, based on feedback loops, allows temporal context to be taken into account in audio processing, which is crucial in many tasks such as music modeling or continuous speech recognition.
LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) units are popular "successors" to classic RNNs that solve the vanishing gradient problem. These advanced recurrent units can efficiently process long audio sequences while retaining information about distant temporal dependencies.
In speech synthesis, LSTM-based models have shown the ability to generate natural-sounding utterances that capture prosodic nuances. In music modeling, recurrent networks have been used to generate chord sequences or compose melodies, capturing complex harmonic and rhythmic structures.
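
As a minimal illustration of sequence modeling with such units, the sketch below uses an LSTM to predict the next feature frame from a sequence; the dimensions are arbitrary and serve only to show the input/output contract of a recurrent layer.

```python
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    """Predicts the next feature frame from a sequence (batch, time, features)."""
    def __init__(self, num_features=40, hidden_size=128):
        super().__init__()
        self.lstm = nn.LSTM(num_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_features)

    def forward(self, x):
        out, _ = self.lstm(x)          # hidden states for every time step
        return self.head(out[:, -1])   # predict the frame after the last one

frames = torch.randn(8, 100, 40)           # 8 sequences of 100 feature frames
print(NextFramePredictor()(frames).shape)  # torch.Size([8, 40])
```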
#linebreak()
=== Autoencoders
Autoencoders are a fascinating class of neural networks whose main task is to learn an efficient, compressed representation of the input data. In audio, this dimensionality-reduction capability opens up a range of possibilities, from signal compression to advanced denoising techniques.
A classic autoencoder consists of an encoder, which "squeezes" the input into a lower dimension, and a decoder, which attempts to reconstruct the original data from this compressed representation. In audio applications, autoencoders can learn representations that preserve the key characteristics of the sound while eliminating noise and unwanted artifacts.
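
A minimal sketch of this encoder-decoder structure for fixed-length audio frames is given below; the frame length and bottleneck size are arbitrary illustrative choices, not values used in the implementation described later.

```python
import torch
import torch.nn as nn

class AudioAutoencoder(nn.Module):
    """Compresses a 1024-sample frame to a 64-dim code and reconstructs it."""
    def __init__(self, frame_len=1024, code_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(frame_len, 256), nn.ReLU(),
            nn.Linear(256, code_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 256), nn.ReLU(),
            nn.Linear(256, frame_len), nn.Tanh(),  # output kept in [-1, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

frame = torch.randn(16, 1024)
recon = AudioAutoencoder()(frame)
loss = nn.functional.mse_loss(recon, frame)  # reconstruction objective
```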
#pagebreak(weak: true)
Variational autoencoders (VAEs) go a step further, introducing an element of randomness into the encoding process. This makes them particularly useful for generating new, unique sounds that retain the characteristics of the training data. VAEs have been applied, among other things, to the synthesis of speech and sound effects.
Convolutional autoencoders (CAEs) combine the advantages of autoencoders and CNNs, making them effective in tasks involving spectrogram processing. Their ability to extract local spectral features while reducing dimensionality makes them a valuable tool for denoising and restoring audio recordings.
#linebreak()
== Generative adversarial networks (GANs) in the audio context
Generative adversarial networks (GANs) are an innovative machine learning architecture that has revolutionized the generation and processing of data, including audio signals. The basic idea behind GANs rests on the "rivalry" of two neural networks: a generator, which creates new data, and a discriminator, which judges its authenticity. Originally developed for images, this concept has been successfully adapted to the audio domain, opening new possibilities in sound synthesis and manipulation.
For audio data, the GAN architecture requires a specific approach. The generator often operates on time-frequency representations such as spectrograms, creating new "images" of sound. The discriminator, in turn, analyzes these representations, learning to distinguish between authentic and generated samples. A key challenge is to ensure that the generated spectrograms are not only visually realistic but also translate into coherent, natural sound after conversion back to the time domain.
GAN applications in audio are remarkably diverse. In sound synthesis, these networks can generate realistic sound effects or even entire musical pieces, imitating the styles of specific artists. In audio super-resolution tasks, GANs show an impressive ability to reconstruct high frequencies in band-limited recordings, which is useful in the restoration of historical material. Audio style transfer, inspired by similar techniques in image processing, allows sonic characteristics to be transferred between recordings, opening fascinating possibilities in music production.
Training GANs on audio signals brings its own specific challenges. Training instability, characteristic of GANs, is particularly problematic in the audio domain, where even minor artifacts can significantly affect perceptual quality. Designing loss functions that account for the specifics of human hearing is another challenge. Moreover, ensuring phase coherence in the generated spectrograms requires additional techniques, such as incorporating phase information or generating directly in the time domain.
#pagebreak(weak: true)
== Diffusion models in audio reconstruction
Diffusion models represent a novel approach to data generation that has gained enormous popularity in audio processing in recent years. At its core lies the idea of gradually adding noise to data and then learning the reverse process, noise removal, which leads to the generation of new, high-quality samples.
Sound generation in diffusion models can be divided into two stages. In the first, the forward process, Gaussian noise is gradually added to the original audio signal until pure noise remains. In the second, the reverse process, the model learns to remove this noise step by step, starting from a random noise sample and gradually transforming it into a realistic audio signal. This unique architecture enables the generation of highly detailed, high-quality sound.
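
The forward (noising) stage can be written in closed form. The sketch below corrupts a clean signal x_0 to an arbitrary step t under a simple linear variance schedule; the schedule values are illustrative only.

```python
import torch

T = 1000                               # number of diffusion steps
betas = torch.linspace(1e-4, 0.02, T)  # illustrative linear noise schedule
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def forward_diffuse(x0, t):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(a_t) * x0, (1 - a_t) * I)."""
    a_t = alphas_cumprod[t]
    noise = torch.randn_like(x0)
    x_t = torch.sqrt(a_t) * x0 + torch.sqrt(1.0 - a_t) * noise
    return x_t, noise  # the model is trained to predict `noise` from x_t

x0 = torch.randn(1, 16000)   # stand-in for one second of clean audio
x_t, eps = forward_diffuse(x0, t=500)
```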
Applications of diffusion models in audio reconstruction and synthesis are promising. In reconstructing damaged recordings, these models show the ability to "fill in" missing fragments in a way that is consistent with the rest of the recording. In speech synthesis, diffusion models can generate remarkably natural and expressive utterances that capture subtle prosodic nuances.
Compared to GANs, diffusion models offer several important advantages for audio tasks. Above all, their training is more stable and predictable, which translates into consistently high-quality samples. Diffusion models are also better at modeling data diversity, avoiding the "mode collapse" problem characteristic of GANs. The price of these advantages, however, is usually a longer generation time, which may limit their use in real-time applications.
Current achievements of diffusion models for audio are impressive. Models such as WaveGrad and DiffWave demonstrate high quality in speech synthesis, often surpassing autoregressive models. In music, diffusion models deliver results in instrumental and vocal generation while preserving remarkable timbral detail.
Techniques for combining diffusion models with other architectures, such as transformers, are being explored to better model long-term dependencies in audio signals. Growing interest in multimodal diffusion models opens up possibilities for audio synthesis correlated with other modalities, such as images or text.
Both GANs and diffusion models represent breakthrough approaches to sound generation and reconstruction, and each offers unique advantages. Their further development will undoubtedly contribute to progress in fields such as historical recording restoration, speech synthesis, and music production, opening new horizons in the processing and generation of audio signals.
= Applications of artificial intelligence methods in the reconstruction of music recordings
#linebreak()
== General overview of practical AI applications in recording restoration
The use of artificial intelligence (AI) in the reconstruction of music recordings continues to gain popularity. Traditional restoration techniques, such as analog and digital filters, had their limitations, especially for complex musical signals. Modern AI methods, including deep learning and generative adversarial networks (GANs), offer new possibilities for restoring damaged and degraded music recordings, improving their quality in ways that were previously impossible.
For example, the work of Dai et al. shows how GANs can be used to increase the resolution of musical signals, leading to a more precise and detailed reconstruction of the sound @8. In turn, the research presented by Nogales et al. uses deep autoencoders to restore recording quality, outperforming traditional methods such as Wiener filtering @6.
#linebreak()
== Comparing the effectiveness of AI methods with traditional techniques
Traditional methods of music recording reconstruction, such as Wiener filters or DSP-based interpolation, are widely used, but their effectiveness is limited. The introduction of AI techniques, particularly deep neural networks, has significantly improved the quality of playback and reconstruction.
One example is the use of GANs to improve the quality of heavily compressed MP3 files. An MDPI article shows how stochastic GAN-based generators can produce samples closer to the original than traditional DSP methods, especially for percussive sounds and high frequencies @11. In addition, methods such as non-negative matrix factorization (NMF) and deep neural networks (DNNs) have been applied to restore historical piano recordings, as shown by the work on reconstructing an 1889 recording of Johannes Brahms @4.
#pagebreak(weak: true)
== The impact of AI progress on recording reconstruction capabilities
Progress in AI, especially the development of diffusion models and GANs, has opened new possibilities in the reconstruction of music recordings. These models can generate high-quality audio even from damaged and heavily compressed sources. An article on diffusion models for audio restoration provides a comprehensive overview of the topic, highlighting their ability to generate natural-sounding audio samples @13.
#linebreak()
== Noise and interference removal
Removing noise and interference from music recordings is a key challenge in the sound reconstruction process, particularly for methods based on artificial intelligence. Music recordings can be exposed to various types of noise, such as background noise, impulsive disturbances, and artifacts introduced during analog-to-digital conversion. To remove them effectively, it is necessary to understand the characteristics of each type of noise as well as its impact on the listener's perception of sound quality.
In recent years, AI-based methods, including neural networks, have gained popularity for identifying and separating noise from the musical signal. The use of autoencoders and GANs has proven particularly effective in denoising recordings, as confirmed by numerous studies @14. Autoencoders, thanks to their ability to compress and reconstruct data, make it possible to extract the essential features of the signal while eliminating unwanted noise. GANs, in turn, consisting of a generator and a discriminator, allow more realistic reconstructions of the audio signal, making it possible to retain more musical detail during noise removal @14.
Comparisons of different neural network architectures in denoising tasks have shown that traditional methods based on spectral filtering fall short of modern deep learning approaches. One example is the use of residual blocks and normalization techniques in the network architecture, which leads to a significant improvement in the quality of the denoised sound @14.
Nevertheless, preserving musical detail during noise removal remains a significant problem. Deep learning networks often tend to remove not only the noise but also subtle musical nuances, which can cause the recording to lose its original character. To minimize this effect, advanced loss functions such as a perceptual loss or a signal-to-noise ratio loss are used, helping to retain as much of the original sonic detail as possible @15.
#pagebreak(weak: true)
== Frequency bandwidth extension
Extending the frequency bandwidth of historical recordings is a significant technological and research challenge, aimed at improving sound quality while preserving the integrity of the original material.
#linebreak()
=== The problem of limited bandwidth in historical recordings
Because of the technological limitations of the recording systems of their time, historical recordings are often characterized by a limited frequency response, which results in the loss of higher frequencies and, consequently, an impoverished sound. Such recordings are usually digitized and then processed to recover as much of the lost information as possible. Bandwidth extension becomes a key tool here, allowing a fuller sound to be restored and thus coming closer to the artist's original intent.
#linebreak()
=== AI techniques for estimating and synthesizing missing high frequencies
The use of artificial intelligence, particularly machine learning techniques, has brought new possibilities for reconstructing missing information in historical recordings. An example is Blind Audio Bandwidth Extension (BABE), which uses a diffusion model to estimate the missing high frequencies in band-limited recordings. Operating in a zero-shot setting, the model can realistically recreate the lost parts of the frequency spectrum without requiring knowledge of the details of the signal degradation @16. Subjective tests confirmed that BABE significantly improves the sound quality of historical recordings @16.
#linebreak()
=== Application of GANs in spectral super-resolution
Generative adversarial networks (GANs) have found wide application in audio processing, including bandwidth extension. The BEHM-GAN method uses GANs to extend the frequency bandwidth of music recordings from the early twentieth century. GANs enable the realistic synthesis of missing high frequencies, which translates into a considerable improvement in the perceptual quality of the sound @9.
#pagebreak(weak: true)
=== Methods for assessing the quality of extended bandwidth
Assessing sound quality after bandwidth extension is a key stage of the process. For historical recordings this assessment is especially important, since adding new information can affect the original character of the recording. Both objective and subjective methods are therefore used. One example is preference tests, in which listeners rate the sound in terms of its coherence and naturalness @16.
#linebreak()
=== Ethical aspects of adding new information to historical recordings
Adding new information to historical recordings raises a number of ethical questions. The main issue is how far the original material may be modified without losing its authenticity. Bandwidth extension using AI and GANs must be carried out with respect for the original work, so as to preserve its integrity and avoid changes that could be perceived as manipulation of the original @17 @3.
#linebreak()
== Filling in missing fragments
#linebreak()
=== Causes and characteristics of gaps
Gaps in music recordings can have various causes, such as physical damage to the media, digitization errors, or deliberate cuts made during editing. The characteristics of these gaps are equally varied, from short, almost imperceptible interruptions to longer fragments that significantly affect the integrity of the musical work. Reconstructing missing fragments has therefore become a key task in the conservation and restoration of audio recordings.
#linebreak()
=== AI methods for interpolating missing fragments
In recent years, significant progress has been made in artificial intelligence, particularly in the interpolation of missing audio data. These methods use advanced machine learning models capable of recreating missing audio samples in a way that is difficult to distinguish from the original. For example, techniques based on autoregressive models, such as recurrent neural networks (RNNs) and Long Short-Term Memory (LSTM) networks, make it possible to predict missing samples from the existing audio context, leading to a more natural reconstruction @18.
#linebreak()
=== Using musical context in gap reconstruction
These models can effectively exploit the musical context by analyzing melodic, rhythmic, and harmonic structures, allowing gaps to be filled precisely while preserving the coherence and naturalness of the recording. An important aspect here is also the assessment of the musical coherence of the reconstructed fragments, which can be carried out both subjectively, through listening tests, and objectively, using analytical tools @1.
#linebreak()
== Improving the quality of heavily compressed audio files
Lossy compression formats such as MP3, AAC, and OGG are commonly used to reduce audio file size. However, this process inherently involves the loss of some information, which affects sound quality. In particular, artifacts may appear, such as missing detail in the higher frequencies or distorted percussion, which negatively affect the listening experience @8.
To counteract these problems, AI-based techniques are being developed. One promising approach is the use of deep neural networks that can identify and reduce compression artifacts. For example, models based on the U-Net and Wave-U-Net architectures can effectively improve sound quality, especially for heavily compressed recordings @6.
The use of generative adversarial networks (GANs) opens new possibilities for recovering detail lost during compression. GANs can generate the missing parts of the audio signal realistically, allowing a considerable improvement in music quality. Research shows that GANs are particularly effective at increasing the frequency resolution of recordings and improving the sound quality of compressed MP3 files @11.
An essential part of these processes is the proper training of the AI models. Training is performed on pairs of recordings before and after compression, which allows the models to learn to reproduce the lost details. A challenge, however, is generalizing these models to different compression formats, as the algorithms may perform differently depending on the compression type. Further research is needed to ensure that these technologies work effectively across a wide range of formats @10 @12.
#pagebreak(weak: true)
= Characteristics of the chosen method: GANs in the reconstruction of music recordings
Generative adversarial networks (GANs) are a powerful tool for processing and reconstructing audio signals. They consist of a generator, which creates new samples, and a discriminator, which evaluates their quality. In the audio context, GANs make it possible to restore distorted signals by identifying and correcting lost details, using techniques such as the short-time Fourier transform (STFT) @19. GANs were chosen as the main method analyzed in this thesis because of their above-average ability to recover signal features lost through compression and other distortions @11.
#linebreak()
== GAN architecture for audio tasks
In audio processing tasks, the generative adversarial network (GAN) architecture consists of two main components: a generator and a discriminator. The generator is responsible for creating new data samples intended to imitate real data, while the discriminator evaluates these samples, trying to distinguish generated data from real data. The process consists of iteratively training both components: the generator learns to produce increasingly realistic data, and the discriminator becomes increasingly sophisticated at detecting artificially generated data @6.
Adapting the GAN architecture to the specifics of audio data often involves one-dimensional (1D) convolutions, which are better suited to processing audio signals than traditional two-dimensional convolutions. In audio GAN models, 1D convolutions allow the temporal continuity of the audio signal to be modeled more effectively, which is crucial for preserving a natural sound @20.
Spectrograms and time-frequency representations play a significant role in GAN architectures for audio tasks. Spectrograms, which represent the audio signal in the time-frequency domain, are often used as input to the generator or discriminator. Such representations allow GAN models to better capture characteristic acoustic patterns, which translates into higher-quality generated audio signals @21.
Examples of concrete GAN implementations for music recording reconstruction include approaches such as the Dual-Step-U-Net, which combine conventional deep learning techniques with GANs. In such models, eliminating the disturbances present in analog recordings is particularly important, and this is achieved through deep learning on real, noisy audio data @22.
#pagebreak(weak: true)
== The GAN training process
Generative adversarial networks (GANs) are based on the principle of adversarial learning, in which two neural networks, a generator and a discriminator, compete with each other. The generator tries to create data samples that imitate real ones, while the discriminator judges whether a sample comes from the generator or from the original data. This process leads to the gradual refinement of both models: the generator becomes better at producing realistic data, and the discriminator at telling them apart @4.
Training GANs on audio data brings specific challenges. Given the diversity of audio signals and their representations, such as spectrograms or raw waveforms, training requires careful matching of the network architecture and the loss function. One problem is maintaining signal continuity and avoiding vanishing gradients, which can destabilize the learning process @6. To stabilize training in audio processing, techniques such as spectral normalization are often used. It allows better control of the network's weight values and prevents exploding gradients, which is crucial for the long-term stability of the model @21.
Hyperparameter selection strategies, such as the batch size, the learning rate, and the choice of optimizer, are crucial for effective GAN training. In audio processing it is also important to manage the training process by monitoring the model's progress and adjusting parameters dynamically to avoid problems such as overfitting @1.
#linebreak()
== Loss functions and quality metrics
In GANs applied to audio reconstruction, loss functions play a key role in shaping the quality of the generated data. The most commonly used are the adversarial loss, which drives the adversarial training of the generator and discriminator, and the reconstruction loss, which helps to reproduce the details of the audio signal precisely @8.
For the objective assessment of audio reconstruction quality, metrics such as PESQ (Perceptual Evaluation of Speech Quality) and STOI (Short-Time Objective Intelligibility) are used. These metrics allow a quantitative comparison of the original and reconstructed signals, which is essential for evaluating the effectiveness of GAN models in audio reconstruction tasks @23. In addition to objective metrics, subjective evaluation methods, such as listening tests conducted by experts, are often used to assess the quality of generated audio samples. These methods take into account the perception of human hearing, which is indispensable when judging the quality of music recordings @12.
#pagebreak(weak: true)
== Modifications and extensions of the standard GAN architecture
Despite its many advantages, the standard GAN architecture has a number of limitations that can affect the effectiveness and stability of models, especially in audio processing tasks. The motivation for modifying the standard GAN architecture stems from the need to overcome these limitations, such as vanishing gradients, slow convergence, and difficulties in generating high-quality samples in complex data spaces.
Over recent years, many GAN variants have been developed and adapted to the specific requirements of audio processing tasks. The most important variants used in audio include the Conditional GAN, which allows controlled data generation based on additional information, and WaveGAN, which is particularly useful for generating raw audio signals. These variants introduce unique modifications that increase their effectiveness in specific applications @24.
#linebreak()
=== Conditional GAN
The Conditional GAN (cGAN) introduces the concept of conditional generation, in which the data generation process is controlled by additional information, such as class labels or other data attributes. This makes it possible to obtain more precise results tailored to the specific requirements of the task. In audio reconstruction, a cGAN can be used to control the reconstruction parameters, allowing a better match to the original signal or adaptation to specific conditions @25.
Applications of Conditional GANs in recording reconstruction include, among others, extending the frequency bandwidth of the sound, which is particularly important for recordings of limited quality. cGAN models are also used for audio data augmentation, which increases the amount of training data and improves results in tasks such as sound-based diagnostics @26.
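
A minimal sketch of the conditioning mechanism is shown below: a class label is embedded and concatenated with the noise vector before generation, which is the core idea of a cGAN. All sizes are illustrative and unrelated to the models discussed above.

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Generates an audio frame conditioned on a class label."""
    def __init__(self, noise_dim=64, num_classes=10, out_len=1024):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 16)  # label -> dense vector
        self.net = nn.Sequential(
            nn.Linear(noise_dim + 16, 256), nn.ReLU(),
            nn.Linear(256, out_len), nn.Tanh(),
        )

    def forward(self, z, labels):
        cond = self.embed(labels)                    # (batch, 16)
        return self.net(torch.cat([z, cond], dim=1))

z = torch.randn(4, 64)
labels = torch.tensor([0, 3, 7, 1])
fake_frames = ConditionalGenerator()(z, labels)      # (4, 1024)
```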
#linebreak()
=== CycleGAN
CycleGAN is a generative model that can learn transformations between different data domains without paired training samples. Unlike traditional supervised methods, CycleGAN uses a mechanism of two generative cycles (a forward cycle and a backward cycle), which enables unsupervised learning. The main idea is that the model learns to map data from one domain to the other in such a way that the original data can be recovered by the reverse transformation.
#pagebreak(weak: true)
CycleGAN has found wide application in audio style transfer and voice conversion. Thanks to its ability to learn from unpaired data, the model has been used to change the musical style of pieces and to convert voices between different speakers. CycleGAN has also been used to transform the sounds of ship engines @27.
CycleGAN offers great potential for reconstructing recordings without training pairs. Because it can work with unpaired data, the model can be used to restore audio signals when no paired training samples exist, making it exceptionally useful for reconstructing historical recordings and other complex audio tasks @27.
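
The sketch below illustrates, in a deliberately simplified form, the cycle-consistency objective at the heart of CycleGAN: mapping a sample to the other domain and back should reproduce the input. The two generators are stand-ins for arbitrary domain-translation networks.

```python
import torch
import torch.nn as nn

# Stand-in generators: G maps domain A -> B, F maps domain B -> A
G = nn.Sequential(nn.Linear(1024, 1024), nn.Tanh())
F = nn.Sequential(nn.Linear(1024, 1024), nn.Tanh())
l1 = nn.L1Loss()

real_a = torch.randn(8, 1024)  # e.g. frames of degraded audio
real_b = torch.randn(8, 1024)  # e.g. frames of clean audio (unpaired)

# Cycle consistency: A -> B -> A and B -> A -> B should return the input
cycle_loss = l1(F(G(real_a)), real_a) + l1(G(F(real_b)), real_b)

# In the full model this term is added to the adversarial losses of both
# domains, weighted by a coefficient (commonly denoted lambda).
```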
#linebreak()
=== Progressive GAN
The Progressive GAN (ProGAN) introduces the innovative concept of gradually increasing the resolution of the generated data. Instead of training the model at the target resolution from the start, ProGAN begins at a low resolution and progressively adds new layers that increase the level of detail of the generated data. This approach yields more stable and realistic results, which is particularly beneficial for generating complex data such as sound @28.
Adaptations of the Progressive GAN to the audio domain rely on the same concept of progressively increasing resolution, applied to audio signals. This approach makes it possible to generate audio samples of ever higher quality, which matters in sound synthesis, where precision and quality are key @29.
ProGAN is used in many tasks involving the generation of high-quality audio samples. One example is its application to speech synthesis, where the gradual improvement of the generated signals yields more natural and realistic audio samples @29.
#pagebreak(weak: true)
= Implementation and experiments
#linebreak()
== Dataset description
The research on applying GANs to the reconstruction of audio recordings used the *MusicNet Dataset* @musicnet, consisting of high-quality recordings of classical music. This genre was chosen for its harmonic and dynamic complexity, which poses a significant challenge for reconstruction algorithms. The absence of vocals in classical music recordings made it possible to focus on the pure reconstruction of musical signals.
The data preparation process consisted of several key stages, implemented in the scripts `to_mp3.py`, `to_vinyl_crackle.py`, `generate_stfts.py`, and `data_preparation.py`. First, the source WAV files were converted to MP3 at 320 kbit/s, preserving high audio quality while reducing file size. Long recordings were then split into exactly 10-second fragments, providing a uniform input structure for the neural network.
#figure(
```python
import os
from pydub import AudioSegment


def process_file(file_info):
input_path, output_dir = file_info
filename = os.path.basename(input_path)
file_id = os.path.splitext(filename)[0]
audio = AudioSegment.from_wav(input_path)
segment_length = 10 * 1000 # 10 seconds in milliseconds
    num_full_segments = len(audio) // segment_length  # number of complete 10 s segments
segments = []
for i in range(num_full_segments):
start = i * segment_length
segment = audio[start:start + segment_length]
output_filename = f"{file_id}-{i}.mp3"
output_path = os.path.join(output_dir, output_filename)
segment.export(output_path, format="mp3", bitrate="320k")
segments.append(output_filename)
# Check if there's a remaining segment and if it's exactly 10 seconds
remaining_audio = audio[num_full_segments * segment_length:]
if len(remaining_audio) == segment_length:
output_filename = f"{file_id}-{num_full_segments}.mp3"
output_path = os.path.join(output_dir, output_filename)
remaining_audio.export(output_path, format="mp3", bitrate="320k")
segments.append(output_filename)
return filename, segments
```,
caption: "Data preparation procedure"
)
An important element of data preparation was augmentation, aimed at simulating effects characteristic of historical recordings. For this purpose, an algorithm was developed that generates noise with a spectral profile close to that of authentic vinyl records. This process, implemented in the script `to_vinyl_crackle.py`, generated a variety of effects, such as pops, crackles, and scratches, using signal processing techniques including band-pass filtering.
#figure(
```python
import numpy as np
from scipy import signal


def generate_vinyl_crackle(duration_ms, sample_rate):
num_samples = int(duration_ms * sample_rate / 1000)
samples = np.zeros(num_samples)
event_density = 0.0001
event_positions = np.random.randint(0, num_samples, int(num_samples * event_density))
for pos in event_positions:
event_type = np.random.choice(['pop', 'crackle', 'scratch'])
if event_type == 'pop':
duration = np.random.randint(5, 15)
event = np.random.exponential(0.01, duration)
event = event * np.hanning(duration)
elif event_type == 'crackle':
duration = np.random.randint(20, 50)
event = np.random.normal(0, 0.01, duration)
event = event * (np.random.random(duration) > 0.7)
else: # scratch
duration = np.random.randint(50, 200)
event = np.random.normal(0, 0.05, duration)
event = event * np.hanning(duration)
end_pos = min(pos + len(event), num_samples)
samples[pos:end_pos] += event[:end_pos - pos]
nyquist = sample_rate / 2
low = 500 / nyquist
high = 7000 / nyquist
b, a = signal.butter(3, [low, high], btype='band')
samples = signal.lfilter(b, a, samples)
samples = samples / np.max(np.abs(samples))
return samples
```,
caption: "Vinyl crackle generation procedure"
)
#pagebreak(weak: true)
The next stage transformed the audio signals into a time-frequency representation using the *short-time Fourier transform (STFT)*. The script `generate_stfts.py` performed this task, applying the `librosa.stft()` function with an analysis window of 2048 samples and a hop of 512 samples. In addition, a `signed square root` scaling technique was applied, which reduced the signal's dynamic range while preserving phase information.
#figure(
```python
import logging
import os

import librosa
import numpy as np

# signed_sqrt is defined elsewhere in the script (see the sketch below)


def process_audio_file(file_path, output_dir, window_size=2048, hop_size=512):
try:
base_name = os.path.splitext(os.path.basename(file_path))[0]
output_file = os.path.join(output_dir, f"{base_name}_stft.npz")
if os.path.exists(output_file):
return False, (0, 0)
audio, sr = librosa.load(file_path, sr=None)
stft = librosa.stft(audio, n_fft=window_size, hop_length=hop_size)
stft_scaled = signed_sqrt(stft.real) + 1j * signed_sqrt(stft.imag)
np.savez_compressed(output_file, stft=stft_scaled, sr=sr, window_size=window_size, hop_size=hop_size)
return True, stft_scaled.shape
except Exception as e:
logging.error(f"Error processing {file_path}: {str(e)}")
return False, (0, 0)
```,
caption: "Short-time Fourier transform generation procedure"
)
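
The `signed_sqrt` helper used above is not shown in the excerpt. A plausible minimal implementation, consistent with its described role of compressing the dynamic range while preserving sign, is sketched below as an assumption rather than the project's verbatim code.

```python
import numpy as np

def signed_sqrt(x):
    """Square root of the magnitude with the original sign retained."""
    return np.sign(x) * np.sqrt(np.abs(x))
```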
In the data normalization and scaling process, implemented in the `STFTDataset` class, amplitudes were normalized to the range [-1, 1]. The magnitude and phase of the spectrograms were processed separately, which preserved the full information about the time-frequency structure of the signal.
#pagebreak(weak: true)
#figure(
```python
import numpy as np
import torch
from torch.utils.data import Dataset


class STFTDataset(Dataset):
    def __init__(self, clean_files, noisy_files):
        self.clean_files = clean_files
        self.noisy_files = noisy_files

    def __len__(self):
        return len(self.clean_files)
def __getitem__(self, idx):
clean_file = self.clean_files[idx]
noisy_file = self.noisy_files[idx]
clean_stft = np.load(clean_file)['stft']
noisy_stft = np.load(noisy_file)['stft']
clean_mag, clean_phase = np.abs(clean_stft), np.angle(clean_stft)
noisy_mag, noisy_phase = np.abs(noisy_stft), np.angle(noisy_stft)
clean_mag_original = clean_mag.copy()
noisy_mag_original = noisy_mag.copy()
clean_mag_norm = (clean_mag - np.min(clean_mag)) / (np.max(clean_mag) - np.min(clean_mag)) * 2 - 1
noisy_mag_norm = (noisy_mag - np.min(noisy_mag)) / (np.max(noisy_mag) - np.min(noisy_mag)) * 2 - 1
clean_data_norm = np.stack([clean_mag_norm, clean_phase], axis=0)
noisy_data_norm = np.stack([noisy_mag_norm, noisy_phase], axis=0)
clean_data_original = np.stack([clean_mag_original, clean_phase], axis=0)
noisy_data_original = np.stack([noisy_mag_original, noisy_phase], axis=0)
return (torch.from_numpy(noisy_data_norm).float(),
torch.from_numpy(clean_data_norm).float(),
torch.from_numpy(noisy_data_original).float(),
torch.from_numpy(clean_data_original).float())
```,
caption: "Class storing the STFT datasets"
)
The final dataset, prepared by the `prepare_data()` function in the script `data_preparation.py`, consisted of pairs of recordings: original, high-quality fragments and their counterparts with simulated vinyl distortions. The data were split into training and validation sets using the `train_test_split()` function from the scikit-learn library, ensuring that both sets were representative.
It is worth emphasizing that the entire data preparation process was optimized for computational efficiency. Parallel processing with `ProcessPoolExecutor` was used, which considerably sped up the processing of a large number of audio files. In addition, memory management techniques were applied, processing the data in smaller portions, which allowed the available hardware resources to be used as efficiently as possible.
== Architecture of the proposed model
As part of the research on audio recording reconstruction, a generative adversarial network (GAN) architecture was developed, consisting of a generator and a discriminator. The model was designed for the efficient processing of STFT spectrograms, which serve as the input data representation.
#linebreak()
=== Generator
The generator, the key element of the architecture, uses a *modified U-Net structure*. This architecture was chosen for its effectiveness in image processing tasks, which can be adapted to spectrogram analysis. However, a number of modifications were introduced to tailor the model to the specifics of audio reconstruction:
Unlike the classic U-Net, spectral normalization was applied in the convolutional layers, which helps stabilize GAN training and improves the quality of the generated results.
#figure(
```python
def encoder_block(self, in_channels, out_channels):
return nn.Sequential(
spectral_norm(nn.Conv2d(in_channels, out_channels, 3, stride=2, padding=1)),
nn.BatchNorm2d(out_channels),
nn.LeakyReLU(0.2)
)
```,
caption: "Encoder building block of the Generator architecture"
)
The central part of the generator is a bottleneck consisting of three residual blocks. This modification allows better processing of features at a high level of abstraction.
#figure(
```python
import torch.nn as nn
from torch.nn.utils import spectral_norm


class ResidualBlock(nn.Module):
def __init__(self, channels):
super().__init__()
self.conv1 = spectral_norm(nn.Conv2d(channels, channels, 3, padding=1))
self.conv2 = spectral_norm(nn.Conv2d(channels, channels, 3, padding=1))
self.norm1 = nn.BatchNorm2d(channels)
self.norm2 = nn.BatchNorm2d(channels)
self.relu = nn.LeakyReLU(0.2)
def forward(self, x):
residual = x
x = self.relu(self.norm1(self.conv1(x)))
x = self.norm2(self.conv2(x))
return x + residual
```,
caption: "Residual building block of the Generator architecture"
)
#pagebreak(weak: true)
Instead of the standard ReLU activation function, LeakyReLU was used, which helps avoid the "dying neurons" problem and improves gradient flow through the network.
The decoder was adapted to the specifics of audio data through transposed convolution layers with spectral normalization:
#figure(
```python
def decoder_block(self, in_channels, out_channels):
return nn.Sequential(
spectral_norm(nn.ConvTranspose2d(in_channels, out_channels, 4, stride=2, padding=1)),
nn.BatchNorm2d(out_channels),
nn.LeakyReLU(0.2)
)
```,
caption: "Decoder building block of the Generator architecture"
)
As in the classic U-Net, skip connections were used between the corresponding layers of the encoder and decoder. This is crucial for preserving fine details in the reconstructed recordings.
These modifications produced an architecture that combines the advantages of U-Net with the specific requirements of audio reconstruction in a GAN setting.
#linebreak()
=== Discriminator
The discriminator, the second key component of the GAN architecture, uses a convolutional structure. It consists of five discriminator blocks, each containing a convolutional layer with spectral normalization and a LeakyReLU activation function:
#figure(
```python
def discriminator_block(self, in_channels, out_channels):
return nn.Sequential(
spectral_norm(nn.Conv2d(in_channels, out_channels, 4, stride=2, padding=1)),
nn.LeakyReLU(0.2)
)
```,
caption: "Building block of the Discriminator architecture"
)
To improve training stability and the quality of the generated results, several normalization techniques were applied in the architecture. Spectral normalization was used in all convolutional layers, in both the generator and the discriminator. This technique effectively controls gradient dynamics, contributing to a more stable learning process.
In addition, batch normalization was applied in the generator after each convolutional layer. This method normalizes the activations within a mini-batch, which helps reduce internal covariate shift and speeds up model convergence.
#pagebreak(weak: true)
The use of STFT spectrograms as the input representation is a key element of the proposed architecture. These spectrograms, computed with the short-time Fourier transform, provide a rich time-frequency representation of the audio signal. Such a representation allows the model to process both amplitude and phase information effectively, which is crucial for audio reconstruction tasks.
#linebreak()
== Training and optimization process
In the training and optimization of the developed GAN model, a number of techniques were applied to improve training stability and the quality of the generated results.
The implementation of the loss functions was a key element of the optimization process. The model used a combination of different loss functions, each with an assigned weight, which allowed the learning process to be steered precisely:
#figure(
```python
self.loss_weights = {
'adversarial': 2.5,
'content': 10.0,
'spectral_convergence': 0.1,
'spectral_flatness': 0.1,
'phase_aware': 0.1,
'multi_resolution_stft': 1.0,
'perceptual': 0.1,
'time_frequency': 1.0,
'snr': 1.0
}
```,
caption: "Structure of the loss-function weights"
)
Adversarial loss (hinge loss): This function forms the basis of adversarial training in the GAN architecture. The implementation uses the hinge loss variant, which promotes more stable learning of the generator and discriminator. For the generator, this loss seeks to maximize the probability that the discriminator classifies the generated samples as real. For the discriminator, the goal is to maximize the margin between real and fake samples.
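
A compact sketch of this hinge formulation is given below; it follows the standard definition from the GAN literature and is not a verbatim excerpt from the project code.

```python
import torch
import torch.nn.functional as F

def d_hinge_loss(real_scores, fake_scores):
    # Discriminator: push real scores above +1 and fake scores below -1
    return (F.relu(1.0 - real_scores).mean()
            + F.relu(1.0 + fake_scores).mean())

def g_hinge_loss(fake_scores):
    # Generator: raise the discriminator's score for generated samples
    return -fake_scores.mean()
```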
Content loss (L1 or L2): This loss measures the direct difference between the generated signal and the target signal. The implementation allows a choice between the L1 norm (mean absolute difference) and the L2 norm (mean squared difference). The L1 norm is often preferred in signal processing tasks, as it is less sensitive to extreme values.
#pagebreak(weak: true)
Spectral convergence loss: This function measures the spectral similarity between the generated and target signals. It is particularly relevant for audio reconstruction, as it focuses on preserving the frequency characteristics of the signal. It is computed as the ratio of the norm of the difference between the spectra to the norm of the original spectrum.
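
Expressed in code, this definition might look as follows; the sketch assumes magnitude spectrograms as inputs and uses the Frobenius norm, in line with the common formulation.

```python
import torch

def spectral_convergence_loss(mag_generated, mag_target, eps=1e-8):
    # || |S_target| - |S_generated| ||_F / || |S_target| ||_F
    return (torch.norm(mag_target - mag_generated, p="fro")
            / (torch.norm(mag_target, p="fro") + eps))
```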
Spectral flatness loss: This loss evaluates the difference in spectral "flatness" between the generated and target signals. Spectral flatness measures how evenly the signal's energy is distributed across frequency. It is particularly useful for preserving the overall tonal character of the reconstructed sound.
Phase-aware loss: This function takes into account both the amplitude and the phase of the signal. This is crucial in audio reconstruction, where preserving correct phase relationships is essential for a natural-sounding result. It consists of two components: an amplitude loss and a phase loss.
Multi-resolution STFT loss: This function analyzes the signal at several time-frequency scales. It uses short-time Fourier transforms with different window sizes, capturing both short- and long-term structures in the audio signal.
Time-frequency loss: This function combines losses in the time and frequency domains. It accounts both for direct differences between time-domain samples and for differences in the signal's frequency representation.
Signal-to-noise ratio (SNR) loss: This loss is based on the classic measure of signal quality, the signal-to-noise ratio. In audio reconstruction, the "noise" is the difference between the generated and target signals. The function promotes generating signals with a high SNR, which translates into better perceptual quality.
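
As an illustration, an SNR-based loss can be written as the negative SNR in decibels, so that minimizing the loss maximizes the ratio; this sketch follows that convention rather than the project's exact code.

```python
import torch

def snr_loss(generated, target, eps=1e-8):
    noise = target - generated
    snr_db = 10.0 * torch.log10(
        target.pow(2).sum() / (noise.pow(2).sum() + eps) + eps)
    return -snr_db  # lower loss corresponds to a higher SNR
```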
Perceptual loss: Using the pre-trained *VGGish* @vggish feature extraction model, this function compares high-level representations of the generated and target signals. This makes it possible to account for more abstract, perceptually relevant characteristics of the sound, going beyond simple comparisons of amplitudes or spectra.
Model optimization was based on the Adam algorithm with non-standard beta parameters:
#figure(
```python
g_optimizer = optim.Adam(gan.generator.parameters(), lr=g_lr, betas=(0.0, 0.9))
d_optimizer = optim.Adam(gan.discriminator.parameters(), lr=d_lr, betas=(0.0, 0.9))
```,
caption: "Adam optimizer parameters"
)
#pagebreak(weak: true)
Because of the model's size and the large training files, batch-size problems were encountered due to exceeding the GPU's memory limit. For this reason, gradient accumulation was applied, which effectively increased the batch size without increasing memory usage:
#figure(
```python
g_loss = g_loss / self.accumulation_steps
g_loss.backward()
if (self.current_step + 1) % self.accumulation_steps == 0:
torch.nn.utils.clip_grad_norm_(self.generator.parameters(), max_norm=1.0)
self.g_optimizer.step()
```,
caption: "Gradient accumulation technique"
)
Dynamic adjustment of the learning rate was implemented based on the loss values:
#figure(
```python
if self.g_loss_ma > self.g_loss_threshold:
for param_group in self.g_optimizer.param_groups:
param_group['lr'] *= 1.01
```,
caption: "Dynamic learning rate"
)
Regularization techniques were applied to improve training stability. A gradient penalty was added to the discriminator's loss function:
#figure(
```python
gp = self.gradient_penalty(real_target_norm, generated_audio_norm.detach())
d_loss = d_loss + 10 * gp
```,
caption: "Gradient penalty based on the Wasserstein GAN"
)
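
The `gradient_penalty` method itself is not shown in the excerpt above. A typical WGAN-GP-style implementation, with which the call appears consistent, is sketched below as an assumption rather than the project's verbatim code.

```python
import torch

def gradient_penalty(discriminator, real, fake):
    # Interpolate between real and generated samples
    alpha = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    mixed = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = discriminator(mixed)
    grads = torch.autograd.grad(outputs=scores.sum(), inputs=mixed,
                                create_graph=True)[0]
    # Penalize deviation of the per-sample gradient norm from 1
    return ((grads.view(grads.size(0), -1).norm(2, dim=1) - 1) ** 2).mean()
```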
Instance noise with an annealing mechanism was used to gradually reduce the noise added to the input data during training:
#figure(
```python
self.instance_noise *= self.instance_noise_anneal_rate
```,
caption: "Mechanism for gradually annealing the noise added to the Discriminator's input data"
)
Monitoring and visualization of the training process were implemented with dedicated callback methods. LossVisualizationCallback allowed the various components of the loss function to be tracked and visualized in real time:
#figure(
```python
loss_visualization_callback = LossVisualizationCallback(log_dir=run_log_dir)
loss_visualization_callback.on_epoch_end(epoch, combined_losses)
```,
caption: "Loss visualization methods"
)
#pagebreak(weak: true)
The early stopping implementation made it possible to halt training automatically when results stopped improving:
#figure(
```python
early_stopping_callback = EarlyStoppingCallback(patience=5, verbose=True, delta=0.01,
path=os.path.join(run_log_dir, 'best_model.pt'))
if early_stopping_callback(epoch, val_loss, gan):
print("Early stopping triggered")
break
```,
caption: "Early stopping mechanism"
)
#pagebreak(weak: true)
Saving checkpoints preserved the model state at regular intervals, which made it possible to resume training after an unexpected interruption:
#figure(
```python
checkpoint_callback = CheckpointCallback(checkpoint_dir)
checkpoint_callback(epoch, gan)
```,
caption: "Model checkpointing mechanism"
)
#linebreak()
== Characteristics of the source code
The project structure and module organization of the developed audio reconstruction system reflect a modular, functional approach to the problem. The project is divided into several key modules, each responsible for a specific aspect of audio data processing and analysis.
The main modules of the project are:
1. `models.py`: Contains the definitions of the `Generator`, `Discriminator`, and `AudioEnhancementGAN` classes, which form the core of the GAN architecture.
2. `losses.py`: Implements the various loss functions used during training, including `adversarial_loss`, `content_loss`, `spectral_convergence_loss`, and others.
3. `data_preparation.py`: Responsible for preparing and processing the input data; contains the `STFTDataset` class for handling STFT spectrograms.
4. `callbacks.py`: Implements mechanisms for monitoring and visualizing the training process, including `LossVisualizationCallback` and `CheckpointCallback`.
5. `main.py`: The entry point of the application, integrating all components and implementing the training logic.
The key classes and functions in the GAN implementation are:
- The `AudioEnhancementGAN` class: the central class of the project, integrating the generator and the discriminator and implementing the training logic.
- The `Generator` and `Discriminator` classes: implement the architectures of the generator and the discriminator, respectively.
- The `generator_loss` function: implements the generator's loss function, combining the various loss components.
The project makes extensive use of the PyTorch, librosa, and pydub libraries:
- PyTorch serves as the main framework for implementing and training the neural networks.
- Librosa is used for advanced audio signal processing, in particular for computing and manipulating STFT spectrograms.
- Pydub is used during data preparation, enabling conversion and manipulation of audio files.
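As an illustration of this division of labor, a minimal preprocessing sketch might look as follows; the file names and parameters are assumptions, not the project's actual configuration:
```python
import librosa
import numpy as np
from pydub import AudioSegment

# Convert an MP3 to WAV with pydub (paths are illustrative).
AudioSegment.from_mp3("track.mp3").export("track.wav", format="wav")

# Load the audio and compute an STFT spectrogram with librosa.
y, sr = librosa.load("track.wav", sr=22050, mono=True)
stft = librosa.stft(y, n_fft=1024, hop_length=256)
magnitude = np.abs(stft)   # input feature for the network
phase = np.angle(stft)     # kept for later signal reconstruction
```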
Parallel processing and memory management mechanisms were implemented to optimize performance:
- `ProcessPoolExecutor` and `ThreadPoolExecutor` are used for parallel processing of audio files;
- the number of worker processes is adjusted dynamically to the available CPU resources;
- RAM usage is limited by processing the data in smaller chunks.
A minimal sketch of this approach is shown below.
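The following sketch illustrates how these mechanisms could fit together; the `process_file` helper and the chunk size are illustrative assumptions rather than the exact project code:
```python
import os
from concurrent.futures import ProcessPoolExecutor

def process_file(path):
    # Placeholder for the actual per-file preprocessing (load, STFT, save).
    ...

def process_in_chunks(paths, chunk_size=64):
    # Adjust the worker count to the available CPU cores.
    workers = max(1, (os.cpu_count() or 2) - 1)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Limit RAM usage by handling the file list in smaller chunks.
        for i in range(0, len(paths), chunk_size):
            list(pool.map(process_file, paths[i:i + chunk_size]))
```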
The command-line interface (CLI) for the scripts was implemented with the `argparse` module, which allows flexible configuration of the training parameters:
#figure(
```python
parser = argparse.ArgumentParser(description='Audio Enhancement GAN')
parser.add_argument('--no-cuda', action='store_true', default=False,
help='disables CUDA training')
parser.add_argument('--batch-size', type=int, default=16, metavar='N',
help='input batch size for training (default: 16)')
parser.add_argument('--epochs', type=int, default=50, metavar='N',
help='number of epochs to train (default: 50)')
args = parser.parse_args()
```,
caption: "Command-line argument mechanism"
)
#pagebreak(weak:true)
== Experimental methodology
The experiments were designed to evaluate the effectiveness of the developed GAN model in reconstructing audio recordings, with particular emphasis on removing the noise characteristic of vinyl records. The main goal was to examine the model's ability to reproduce the original audio signals by improving quality and removing artifacts from the samples.
The experiments carried out were the following:
1. Training the model on a dataset containing pairs of recordings: the original (clean) version and a version with added vinyl noise.
2. Validating the model on a separate dataset that was not available during training.
3. Generating reconstructions for selected test samples and analyzing the results.
Evaluation combined objective audio quality metrics with validation on a separate dataset:
1. Objective metrics (see the sketch after this list):
- Signal-to-Noise Ratio (SNR): measures the ratio of signal power to noise power.
- Spectral Convergence: assesses the spectral similarity between the original and the reconstructed signal.
- Perceptual Evaluation of Audio Quality (PEAQ): simulates a subjective assessment of sound quality.
2. Validation on a separate dataset:
- Cross-validation was used, splitting the dataset into a training part and a validation part.
- Validation losses were monitored during training to avoid overfitting.
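For reference, the two signal-level metrics can be computed as in the following minimal NumPy sketch (assuming aligned signals of equal length; PEAQ requires dedicated tooling and is omitted):
```python
import numpy as np

def snr_db(reference, estimate, eps=1e-12):
    # Ratio of signal power to the power of the residual error, in dB.
    noise = reference - estimate
    return 10 * np.log10((reference ** 2).sum() / ((noise ** 2).sum() + eps))

def spectral_convergence(ref_mag, est_mag, eps=1e-12):
    # Frobenius-norm distance between STFT magnitudes, relative to the reference.
    return np.linalg.norm(ref_mag - est_mag) / (np.linalg.norm(ref_mag) + eps)
```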
Because of the *failure of the study*, a subjective *evaluation by listening was not possible*. Under normal circumstances this process would include:
- A panel of evaluators drawn from cohorts as diverse as possible assessing the reconstruction quality.
- Blind AB tests comparing the original recordings with the reconstructions.
- Assessment of parameters such as sound clarity, preservation of musical detail, and overall quality.
#pagebreak(weak: true)
Experiments were also carried out to optimize the architecture and hyperparameters of the GAN model. For the generator, various network configurations were tested, modifying the number and size of the convolutional layers and experimenting with different types of residual blocks. Similar modifications were made to the discriminator architecture, where the influence of network depth on result quality was verified, along with the effectiveness of different normalization techniques, with particular attention to spectral normalization. The optimization process also included tuning key hyperparameters, such as the learning rates of the generator and the discriminator. The influence of the batch size and of the number of gradient accumulation steps on the stability and efficiency of training was examined as well. The goal of these comprehensive experiments was to find the optimal model configuration, providing the best results in the audio reconstruction task while preserving training stability and computational efficiency.
To analyze the influence of individual components on reconstruction quality, targeted studies of different aspects of the model were carried out. For the loss functions, experiments examined the effect of different weights assigned to the individual components and the effectiveness of terms such as the spectral flatness loss or the phase-aware loss. The effectiveness of different normalization techniques was also verified, including batch normalization and instance normalization in the generator.
Experimental results were analyzed by visualizing STFT spectrograms, monitoring the loss curves across epochs, and analyzing the algorithm's metrics during training. Although the study did not achieve the expected results in terms of audio reconstruction quality, it provided valuable insight into the behavior of GAN models in audio signal processing and indicated potential directions for further research and improvements.
#pagebreak(weak: true)
= Analysis of results
In the study of audio recording reconstruction with GANs, objective analysis methods were applied to assess the effectiveness of the developed model. The evaluation process focused on three main aspects: *analysis of STFT spectrogram visualizations*, *observation of the loss functions* during training, and *listening tests using the inverse STFT*.
Analysis of the STFT spectrogram visualizations was the key element of the objective evaluation. Comparing the spectrograms of the original recordings, their versions with added vinyl noise, and the reconstructions generated by the GAN allowed direct observation of the model's effectiveness in removing the characteristic distortions.
#figure(
image("fig1.png"),
caption: [Spectrogram of the original fragment]
) <fig1>
#figure(
image("fig2.png"),
caption: [Spectrogram of the fragment with added vinyl noise]
) <fig2>
#figure(
image("fig3.png"),
caption: [Spectrogram of a reconstructed fragment in the early phase of training]
) <fig3>
#figure(
image("fig4.png"),
caption: [Spectrogram of a reconstructed fragment in the final phase of training]
) <fig4>
Analysis of these visualizations revealed that the model was able to reduce, to some extent, the crackles characteristic of vinyl records, visible as a reduction of the vertical lines on the spectrograms that represent sudden, short-lived disturbances. At the same time, the model learned to emphasize the waves representing the high-pitched sounds present in the original sample. *The model was learning in the right direction, moving toward the original samples, but it was unable to remove the noise it had generated itself in the early phases of training.*
Observation of the loss functions during training was the second important aspect of the objective evaluation. This analysis made it possible to track the progress of the model's reconstruction ability over the training epochs. Example loss values obtained during training are shown below:
#figure(
table(
columns: 10,
[epoch], [0], [1], [2], [...], [14], [15], [16], [...], [20],
[loss value], [249], [197], [169], [...], [44], [49], [45], [...], [46]
)
)
#pagebreak(weak: true)
These values reveal an interesting trend. In the early phases of training there is a significant drop in the loss, suggesting that the model was effectively learning to reduce reconstruction errors. However, around epochs 14 to 16 the loss reached a minimum (around 44 to 45) and stopped decreasing significantly, remaining at a similar level in subsequent epochs.
This behavior is worrying, given that the quality of the audio reconstruction remained unsatisfactory. It suggests that the model reached a *local minimum* that did not correspond to a satisfactory solution of the reconstruction problem. In other words, the loss function stopped providing useful information for further optimization of the model, even though the model was not yet able to generate high-quality audio reconstructions.
The third element of the objective evaluation consisted of listening tests based on the inverse STFT. In this process, the spectrograms generated by the model were converted back into audio signals in MP3 format. This method was intended to allow a direct auditory assessment of reconstruction quality, complementing the visual and numerical analysis.
The results of these listening tests turned out to be *extremely unsatisfactory*. In all trials, regardless of the training stage or the model configuration, the resulting audio signals consisted exclusively of unintelligible noise. None of the generated samples exhibited anything resembling music or any recognizable sounds. This observation is the most striking evidence of the model's ineffectiveness in reconstructing music recordings, underscoring the significant gap between the partial improvements visible on the spectrograms and the actual sound quality perceived by the human ear.
#linebreak()
Observation of the changes in the spectrograms during training revealed some interesting tendencies. In the early phases of training, the model showed the ability to partially reduce the most visible artifacts, such as the vertical lines representing the crackles characteristic of vinyl records. However, in many training runs, as training progressed, worrying phenomena appeared:
#figure(
image("fig5.png"),
caption: [Spectrogram of a reconstructed fragment at the start of the faulty learning phase]
) <fig5>
#figure(
image("fig6.png"),
caption: [Spectrogram of a reconstructed fragment at the end of the faulty learning phase]
) <fig6>
The visualization above shows that after a dozen or so epochs of training the model erroneously learned to replace values with zeros, corresponding to sounds at the 0 dB level.
A discussion of why the quality improvements were insufficient to yield usable audio must take several key aspects into account:
1. Complexity of the task: Reconstructing full music recordings turned out to be far more complicated than initially assumed. The model had to not only remove noise but also recreate subtle musical details, which proved to be a challenge beyond the capabilities of the current architecture.
2. Inadequacy of the loss functions: Despite the variety of loss functions used, they may not have fully reflected the perceptual aspects of sound quality. This could have steered the model's optimization in a direction that did not translate directly into audible quality improvements.
3. Architectural limitations: The GAN architecture used, despite its complexity, may not have been sufficiently adapted to the specifics of audio signal reconstruction. In particular, the model may have struggled to preserve phase coherence, which is crucial for natural-sounding audio.
4. Problems with the training data: The quality and representativeness of the training data may have been insufficient to teach the model effective reconstruction. In particular, the simulated vinyl noise may not have fully captured the complexity of the real distortions found in historical recordings.
5. GAN training instability: The training instability characteristic of GAN architectures may have led to convergence problems, which showed up as inconsistent quality of the generated spectrograms across different phases of training.
In summary, the objective evaluation showed that the developed GAN model, despite some promising aspects visible in the spectrogram analysis, was unable to reach a satisfactory level of audio recording reconstruction. The complete absence of recognizable musical elements in the generated audio samples highlights the deep gap between the partial improvements observed in the frequency domain and the actual perception of the sound.
This situation underscores the complexity of music recording reconstruction and points to the need for further, in-depth research in this area. Future work should focus on:
1. Improving the model architecture, with particular attention to preserving the phase coherence of the signal.
2. Developing more advanced loss functions that better reflect the perceptual aspects of sound quality.
3. Increasing the size and diversity of the training dataset, possibly including real rather than only simulated distortions.
4. Exploring alternative approaches to generative audio modeling, such as autoregressive or diffusion models.
Although the current results did not meet expectations in terms of practical usefulness, they are a valuable contribution to understanding the challenges of applying machine learning techniques to audio recording reconstruction, and they open new paths for future research in this field.
#pagebreak(weak: true)
= Conclusions and perspectives
#linebreak()
== Summary of achieved results
This thesis set out to investigate the applicability of machine learning methods, with particular emphasis on Generative Adversarial Networks (GANs), to the reconstruction of audio recordings. The main goal was to design and implement a GAN model capable of improving the quality of historical music recordings, with special focus on removing the noise characteristic of vinyl records and extending the frequency band.
The experiments provided valuable insight into the potential and the limitations of GANs in this context. Key results include partial success in reducing the crackles characteristic of vinyl records, visible in the STFT spectrograms of the generated samples. The model showed the ability to learn certain aspects of reconstruction, reflected in the gradual improvement of the generated spectrograms over the course of training.
However, the assessment of the proposed GAN method's effectiveness in recording reconstruction revealed significant limitations. Despite promising results visible in the frequency domain, attempts to convert the reconstructed spectrograms back to the time domain did not produce satisfactory results. The generated audio signals were characterized by high levels of noise and distortion, which made their subjective evaluation by listening impossible.
Comparing the achieved results with the initial assumptions and expectations, it must be acknowledged that the study did not meet all of its goals. The assumed ability to generate high-quality reconstructions perceptually close to the originals was not achieved. The GAN model, despite its complexity and the use of advanced optimization techniques, was unable to fully recreate the subtleties and musical details necessary for satisfying sound quality.
Nevertheless, the research provided valuable information about the challenges of applying machine learning to audio reconstruction. Key problems were identified, such as the difficulty of preserving the phase coherence of the signal and the limitations related to noise reduction in the generated recordings. These observations are an important contribution to understanding the complexity of music recording reconstruction and open the way to further, more targeted research in this field.
In summary, although the proposed GAN method did not achieve all of the goals set for practical recording reconstruction, the research deepened the understanding of machine learning applications in audio signal processing. The challenges and limitations identified are a valuable starting point for further work on improving AI-based techniques for audio recording reconstruction.
== Limitations of the proposed method
During the research on applying GANs to audio recording reconstruction, a number of significant challenges and limitations were encountered that affected the final results. Identifying and analyzing these difficulties is key to understanding the current limitations of the method and to setting directions for further research.
One of the main challenges turned out to be the complexity of reconstructing full music recordings. Unlike simpler tasks, such as removing a single type of disturbance, full reconstruction requires addressing many aspects of sound quality simultaneously. The model had to not only remove noise and crackles but also restore lost frequencies and preserve musical coherence, which proved to be a task beyond the capabilities of the developed architecture.
An important factor limiting reconstruction effectiveness was the quality and representativeness of the training data. Despite efforts to create a diverse dataset, the simulated distortions may not have fully reflected the complexity of the real damage present in historical recordings. This limitation may have led to insufficient generalization of the model to real cases.
The complexity of the GAN architecture, although theoretically beneficial, in practice contributed to a number of problems. The training instability characteristic of GANs proved particularly problematic for audio data. Balancing the training of the generator and the discriminator was difficult, often leading to suboptimal results or to mode collapse.
Specific problems were encountered in the audio domain that do not occur, or are less significant, in other areas of machine learning. Preserving phase coherence in the reconstructed signals turned out to be a key challenge. Even small errors in phase reconstruction led to significant perceptual distortions, which was particularly visible when converting spectrograms back to the time domain.
Hardware and computational issues also proved to be serious constraints. The Radeon 6950 XTX graphics card with 16 GB of VRAM used in the research, despite its high specifications, repeatedly turned out to be insufficient for effective training of the model on the full dataset. Out-of-memory problems forced a reduction of the batch size or the use of techniques such as gradient accumulation, which in turn affected the stability and efficiency of training. The long training time, reaching dozens of hours for a full run, significantly limited the ability to experiment with different model configurations and hyperparameters.
#pagebreak(weak: true)
An additional challenge was the interpretation of intermediate results. Despite the observed improvements in the frequency representations (spectrograms), translating these improvements into perceptual sound quality proved far from obvious. This suggests that the loss functions used may not have fully reflected the aspects relevant to human sound perception.
Finally, attention should be drawn to the limitations inherent in the machine learning approach itself. The model, learning from the provided examples, may have had difficulty reconstructing rare or unique musical elements that were not well represented in the training set.
#linebreak()
== Potential directions for further research
Despite the limitations encountered, this research opens a number of interesting paths for further work on audio recording reconstruction with machine learning methods. Future research could focus on several key areas.
Regarding the GAN architecture, a promising direction is to explore more advanced variants, such as Progressive GAN or StyleGAN, which have shown impressive results in image generation. Adapting these architectures to the audio domain could potentially overcome some of the problems encountered, particularly regarding training stability and the quality of the generated results. Moreover, it is worth considering techniques such as self-attention or transformer blocks in the generator, which could improve the model's ability to capture long-range dependencies in musical signals.
Diffusion models, which have recently gained significant popularity in generative tasks, are an alternative approach worth exploring. Models such as DDPM (Denoising Diffusion Probabilistic Models) could prove particularly effective for audio reconstruction because of their ability to gradually remove noise from data. Their potential for generating high-quality audio samples and their training stability make them an attractive alternative to traditional GANs.
An important research direction should also be a deeper integration of domain knowledge from audio signal processing into the machine learning process. One could consider developing specialized network layers that explicitly model acoustic phenomena, such as sound wave propagation or resonance. Implementing advanced frequency analysis techniques, such as the wavelet transform or cepstral analysis, directly in the neural network architecture could significantly improve its ability to precisely reconstruct musical signals.
#pagebreak(weak: true)
Future research should also extend the experiments to a wider range of musical genres and recording types. Studying the effectiveness of the models in reconstructing vocal recordings, electronic music, or live concert recordings remains particularly interesting.
#linebreak()
== Implications for the future of music recording reconstruction
The development of AI-based techniques for music recording reconstruction carries significant implications for the preservation of cultural heritage. The potential of AI in this context is enormous: advanced algorithms can not only bring historical recordings back to life but also make them available to a wider audience in unprecedented quality. The ability to automatically improve the quality of thousands of hours of archival recordings opens new perspectives for researchers, musicologists, and music lovers, enabling a deeper understanding and appreciation of humanity's musical heritage.
At the same time, applying AI to the reconstruction of historical recordings raises important ethical questions. It is crucial to balance quality improvement against preserving the authenticity of the original. Overly aggressive intervention by AI algorithms can distort the historical sound, blurring the line between reconstruction and reinterpretation. It is therefore necessary to develop ethical standards and guidelines to govern the use of AI in this area, ensuring respect for the artistic and historical integrity of the reconstructed works.
Looking ahead, dynamic development of AI technology in audio processing can be expected. We can anticipate increasingly sophisticated models capable of even more accurate analysis and reconstruction of audio signals. Hybrid systems combining classical signal processing techniques with advanced machine learning algorithms are also likely to emerge, which may lead to breakthroughs in reconstruction quality and efficiency.
The impact of advanced reconstruction techniques on the music industry and on archival practice will most likely be significant. We can expect growing interest in remastering and re-releasing historical recordings in improved quality. This, in turn, may influence publishing strategies and business models in the music industry. For archives and cultural institutions, new reconstruction technologies may revolutionize how audio collections are stored and shared, potentially democratizing access to musical heritage.
#pagebreak(weak: true)
In summary, although the current research on using AI for music recording reconstruction has encountered certain limitations, the prospects for the future are extremely promising. Further development in this field has the potential not only for significant technological progress but also for a profound transformation of our approach to preserving and interpreting musical heritage. A balanced approach will be key, maximizing the benefits of new technologies while maintaining respect for the integrity and authenticity of historical recordings.
#pagebreak(weak: true)
#outline(
title: [List of symbols],
target: figure,
)
#pagebreak(weak: true)
#bibliography("bibliography.yml")
|
|
https://github.com/hei-templates/hevs-typsttemplate-thesis | https://raw.githubusercontent.com/hei-templates/hevs-typsttemplate-thesis/main/02-main/01-abstract.typ | typst | MIT License |
#pagebreak()
#heading(numbering:none)[Abstract] <sec:abstract>
#lorem(50)
#lorem(50)
#lorem(50)
#v(2em)
_*Keywords*_: _keyword 1_, _keyword 2_, _keyword 3_ |
https://github.com/0x1B05/nju_os | https://raw.githubusercontent.com/0x1B05/nju_os/main/book_notes/content/01_intro.typ | typst | #import "../template.typ": *
= intro
- demo code: https://github.com/remzi-arpacidusseau/ostep-code
- homework: https://github.com/remzi-arpacidusseau/ostep-homework
- projects: https://github.com/remzi-arpacidusseau/ostep-projects
The book is about *virtualization*, *concurrency*, and *persistence* of the operating system.
== So what happens when a program runs?
A running program does one very simple thing: it executes instructions.
The processor fetches an instruction from memory, decodes it, and executes it. After it is done with this instruction, the processor moves on to the next instruction, and so on, and so on, until the program finally completes. This is the Von Neumann model of computing.
Meanwhile, a lot of other wild things are going on, all with the primary goal of making the system *easy to use*. The OS is in charge of making sure the system operates correctly and efficiently in an easy-to-use manner.
== virtualization
The OS takes a physical resource (such as the processor, or memory, or a disk) and transforms it into a more general, powerful, and easy-to-use virtual form of itself.
A typical OS, in fact, exports a few hundred *system calls* that are available to applications. Because the OS provides these calls to run programs, access memory and devices, and other related actions, we also sometimes say that the OS provides *a standard library* to applications.
The OS is sometimes known as a resource manager.
=== Virtualizing The CPU
```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
#include <assert.h>
#include "common.h"
int main(int argc, char *argv[]) {
if (argc != 2) {
fprintf(stderr, "usage: cpu <string>\n");
exit(1);
}
char *str = argv[1];
while (1) {
Spin(1);
printf("%s\n", str);
}
return 0;
}
```
```sh
prompt> ./cpu A & ; ./cpu B & ; ./cpu C & ; ./cpu D &
[1] 7353
[2] 7354
[3] 7355
[4] 7356
A
B
D
C
A
B
D
C
A
C
B
D
...
```
=== Virtualizing Memory
```c
#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
#include "common.h"
int main(int argc, char *argv[]) {
int *p = malloc(sizeof(int)); // a1
assert(p != NULL);
printf("(%d) address pointed to by p: %p\n", getpid(), p); // a2
*p = 0; // a3
while (1) {
Spin(1);
*p = *p + 1;
printf("(%d) p: %d\n", getpid(), *p); // a4
}
return 0;
}
```
```sh
prompt> ./mem &; ./mem &
[1] 24113
[2] 24114
(24113) address pointed to by p: 0x200000
(24114) address pointed to by p: 0x200000
(24113) p: 1
(24114) p: 1
(24114) p: 2
(24113) p: 2
(24113) p: 3
(24114) p: 3
(24113) p: 4
(24114) p: 4
...
```
Each process accesses its own private *virtual address space* (sometimes just called its *address space*), which the OS somehow maps onto the physical memory of the machine.
== Concurrency
```c
#include <stdio.h>
#include <stdlib.h>
#include "common.h"
#include "common_threads.h"
volatile int counter = 0;
int loops;
void *worker(void *arg) {
int i;
for (i = 0; i < loops; i++) {
counter++;
}
return NULL;
}
int main(int argc, char *argv[]) {
if (argc != 2) {
fprintf(stderr, "usage: threads <loops>\n");
exit(1);
}
loops = atoi(argv[1]);
pthread_t p1, p2;
printf("Initial value : %d\n", counter);
Pthread_create(&p1, NULL, worker, NULL);
Pthread_create(&p2, NULL, worker, NULL);
Pthread_join(p1, NULL);
Pthread_join(p2, NULL);
printf("Final value : %d\n", counter);
return 0;
}
```
```sh
prompt> gcc -o thread thread.c -Wall -pthread
prompt> ./thread 1000
Initial value : 0
Final value : 2000
```
```sh
prompt> ./thread 100000
Initial value : 0
Final value : 143012 // huh??
prompt> ./thread 100000
Initial value : 0
Final value : 137298 // what the??
```
A key part of the program above, where the shared counter is incremented, takes three instructions: one to load the value of the counter from memory into a register, one to increment it, and one to store it back into memory. Because these three instructions do not execute atomically (all at once), strange things can happen.
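To make the lost-update interleaving concrete, here is an illustrative sketch (x86-style pseudo-assembly in the comments; the exact instructions depend on the compiler):
```c
/*
 * counter++ compiles to three separate steps, roughly:
 *
 *   mov  eax, [counter]   ; (1) load the shared value into a register
 *   add  eax, 1           ; (2) increment the register
 *   mov  [counter], eax   ; (3) store the register back to memory
 *
 * If thread A runs (1) and then thread B runs (1)-(3) before A stores,
 * A's store overwrites B's update and one increment is lost. That is
 * why the final value above can be less than 2 * loops.
 */
```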
== persistence
The software in the operating system that usually manages the disk is called the *file system*. It is assumed that often times, users will want to *share* information that is in files.
```c
#include <stdio.h>
#include <unistd.h>
#include <assert.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <string.h>
int main(int argc, char *argv[]) {
int fd = open("/tmp/file", O_WRONLY | O_CREAT | O_TRUNC, S_IRUSR | S_IWUSR);
assert(fd >= 0);
char buffer[20];
sprintf(buffer, "hello world\n");
int rc = write(fd, buffer, strlen(buffer));
assert(rc == (strlen(buffer)));
fsync(fd);
close(fd);
return 0;
}
```
The file system has to do a fair bit of work: first figuring out where on disk this new data will reside, and then keeping track of it in various structures the file system maintains. Doing so requires issuing I/O requests to the underlying storage device, to either read existing structures or update (write) them. As anyone who has written a *device driver* knows, getting a device to do something on your behalf is an intricate and detailed process. It requires a deep knowledge of the low-level device interface and its exact semantics. Fortunately, the OS provides a standard and simple way to access devices through its system calls. Thus, the OS is sometimes seen as a *standard library*.
To handle the problems of system crashes during writes, most file systems incorporate some kind of intricate write protocol, such as *journaling* or *copy-on-write*, carefully ordering writes to disk to ensure that if a failure occurs during the write sequence, the system can recover to reasonable state afterwards.
== Design Goals
One goal in designing and implementing an operating system is to provide *high performance*; another way to say this is our goal is to *minimize the overheads* of the OS. Virtualization and making the system easy to use are well worth it, but not at any cost; thus, we must strive to provide virtualization and other OS features without excessive overheads. These overheads arise in a number of forms: extra time (more instructions) and extra space (in memory or on disk).
Another goal will be to provide *protection* between applications, as well as between the OS and applications.Protection is at the heart of one of the main principles underlying an operating system, which is that of *isolation*; isolating processes from one another is the key to protection and thus underlies much of what an OS must do.
The operating system must also run *non-stop*; when it fails, all applications running on the system fail as well. Because of this dependence, operating systems often strive to provide a high degree of *reliability*.
Other goals: energy-efficiency, security, mobility...
== History
- Early Operating Systems: Just Libraries
- -> Beyond Libraries: Protection
- -> The Era of Multiprogramming
- -> The Modern Era
|
|
https://github.com/xsro/xsro.github.io | https://raw.githubusercontent.com/xsro/xsro.github.io/zola/typst/Control-for-Integrator-Systems/part2.typ | typst | #import "template.typ": template
#show: template.with(
title:[*Sliding Mode Control for Integrator Systems*],
part:[*part 2*: Second and High Order Sliding Mode Control]
)
#include "9signum.typ"
#include "8sta.typ"
#include "7homo.typ"
#include "6highorder.typ"
|
|
https://github.com/soul667/typst | https://raw.githubusercontent.com/soul667/typst/main/PPT/typst-slides-fudan/themes/polylux/book/src/themes/gallery/clean.md | markdown | # Clean theme

This theme is a bit more opinionated than the `simple` theme but still supposed
to be an off-the-shelf solution that fits many use cases.
Use it via
```typ
#import "@preview/polylux:0.2.0": *
#import themes.clean: *
#show: clean-theme.with(...)
```
`clean` uses polylux' section handling, the regular `#outline()` will not work
properly, use `#polylux-outline()` instead.
Text is configured to have a base font size of 25 pt.
## Options for initialisation
`clean-theme` accepts the following optional keyword arguments:
- `aspect-ratio`: the aspect ratio of the slides, either `"16-9"` or `"4-3"`,
default is `"16-9"`
- `footer`: text to display in the footer of every slide, default is `[]`
- `short-title`: short form of the presentation title, to be displayed on every
slide, default: `none`
- `logo`: some content (most likely an image) used as a logo on every slide,
default: `none`
- `color`: colour of decorative lines, default: `teal`
## Slide functions
`clean` provides the following custom slide functions:
```typ
#title-slide(...)
```
Creates a title slide where title and subtitle are put between decorative lines,
along with logos and author and date infos.
Accepts the following keyword arguments:
- `title`: title of the presentation, default: `none`
- `subtitle`: subtitle of the presentation, default: `none`
- `authors`: authors of presentation, can be an array of contents or a single
content, will be displayed in a grid, default: `()`
- `date`: date of the presentation, default: `none`
- `watermark`: some content (most likely an image) used as a watermark behind
the titlepage, default: `none`
- `secondlogo`: some content (most likely an image) used as a second logo opposite
to the regular logo on the title page, default: `none`
Does not accept additional content.
---
```typ
#slide(...)[
...
][
...
]
```
Decorates the provided content with a header containing the current section (if
any), the short title of the presentation, and the logo; and a footer containing
some custom text and the slide number.
Accepts an arbitrary amount of content blocks, they are placed next to each other
as columns.
Configure using the `columns` and `gutter` keyword arguments.
Pass the slide title as a keyword argument `title`.
Accepts the following keyword arguments:
- `title`: title of the slide, default: `none`,
- `columns`: propagated to `grid` for placing the body columns, default: array
filled with as many `1fr` as there are content blocks
- `gutter`: propagated to `grid` for placing the body columns, default: `1em`
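For instance, a minimal two-column slide might look like this (the content is illustrative):

```typ
#slide(title: "Comparison", columns: (2fr, 1fr))[
  Main points go in the wide left column.
][
  A side note goes in the narrow right column.
]
```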
---
```typ
#focus-slide(foreground: ..., background: ...)[
...
]
```
Draw attention with this variant where the content is displayed centered and text
is enlarged.
Optionally accepts a foreground colour (default: `white`) and background color
(default: `teal`).
Not suitable for content that exceeds one page.
---
```typ
#new-section-slide(name)
```
Start a new section with the given `name` (string or content, positional argument).
Creates a slide with this name in the center, a decorative line below, and
nothing else.
Use `#polylux-outline()` to display all sections, similarly to how you would use
`#outline()` otherwise.
## Example code
The image at the top is created by the following code:
```typ
#import "@preview/polylux:0.2.0": *
{{#include clean.typ:3:}}
```
|
|
https://github.com/kokkonisd/typst-phd-template | https://raw.githubusercontent.com/kokkonisd/typst-phd-template/main/src/common.typ | typst | The Unlicense | #import "colors.typ": *
#import "fonts.typ": *
// Setup a message box for warnings, notes etc.
//
// Parameters:
// - primary-color: the primary color of the box.
// - secondary-color: the secondary color of the box.
// - icon: the icon used in the box (single character).
// - title: the title of the box.
// - message: the message displayed in the box.
// - width: the width of the box.
#let message-box(primary-color, secondary-color, icon, title, message, width: 100%) = block(
radius: 4pt,
fill: primary-color,
breakable: false,
)[
#set text(fill: white)
#show raw: set text(fill: black)
#block(
radius: (top: 4pt),
fill: secondary-color,
inset: (x: 10pt, y: 4pt),
width: width,
)[
#set align(left)
#box[
#set align(center + horizon)
#box[
#circle(fill: primary-color, inset: 0.5pt)[
#text(fill: secondary-color)[#strong[#icon]]
]
]
#box(inset: 3pt)[#emph[#title]]
]
]
#block(inset: 10pt, above: 0pt, width: width)[
#message
]
]
// Create a warning box.
//
// Parameters:
// - message: the message to put in the warning box.
// - width: the width of the warning box.
#let warn(message, width: 100%) = message-box(
WARN_PRIMARY_COLOR, WARN_SECONDARY_COLOR, "!", "Warning", message, width: width
)
// Create a note box.
//
// Parameters:
// - message: the message to put in the note box.
// - width: the width of the warning box.
#let note(message, width: 100%) = message-box(
NOTE_PRIMARY_COLOR, NOTE_SECONDARY_COLOR, "i", "Note", message, width: width
)
// Create a code snippet.
//
// Parameters:
// - source: the raw source of the snippet.
// - file: the file (or title, context etc) of the snippet.
#let code(source, file: none) = block(breakable: false)[
#if file != none [
#block(
radius: (top: 4pt),
inset: (x: 10pt, y: 4pt),
below: 0pt,
fill: MAIN_COLOR,
)[
#text(
fill: CODE_SNIPPET_COLOR,
size: 0.7em
)[
#emph[#file]
]
]
]
#block(
radius: (
bottom: 4pt,
top-right: 4pt,
top-left: if file != none { 0pt } else { 4pt },
),
inset: 10pt,
fill: CODE_SNIPPET_COLOR,
width: 100%
)[
#raw(source.text, lang: source.at("lang", default: none))
]
]
|
https://github.com/crd2333/crd2333.github.io | https://raw.githubusercontent.com/crd2333/crd2333.github.io/main/src/docs/Courses/æ°æ®ç»æäžç®æ³/ADSæéé¢.typ | typst | ---
order: 4
---
#import "/src/components/TypstTemplate/lib.typ": *
#show: project.with(
  title: "Advanced Data Structures and Algorithm Analysis",
  lang: "en",
)
#let Q(body1, body2) = [
#question(body1)
#note(caption: "Answer", body2)
]
= Advanced Data Structures and Algorithm Analysis: Error-Prone Problems
== HW 1
#question()[
For the result of accessing the keys 3, 9, 1, 5 in order in the splay tree in the following figure, which one of the following statements is FALSE?
]
#note(caption: "Answer")[
An absurd multiple-choice question that boils down to simulating splay operations; nothing much to say, and the options are omitted here. Good practice for fast, accurate simulation.
#grid(
columns: (auto, auto),
fig("/public/assets/Courses/ADS/æéé¢/img-2024-02-28-22-11-23.png", width: 80%),
fig("/public/assets/Courses/ADS/æéé¢/img-2024-02-28-22-10-48.png", width: 50%),
)
]
#question()[Consider the following buffer management problem. Initially the buffer size (the number of blocks) is one. Each block can accommodate exactly one item. As soon as a new item arrives, check if there is an available block. If yes, put the item into the block, induced a cost of one. Otherwise, the buffer size is doubled, and then the item is able to put into. Moreover, the old items have to be moved into the new buffer so it costs $k+1$ to make this insertion, where k is the number of old items. Clearly, if there are $N$ items, the worst-case cost for one insertion can be $Omega(N)$. To show that the average cost is $O(1)$, let us turn to the amortized analysis. To simplify the problem, assume that the buffer is full after all the $N$ items are placed. Which of the following potential functions works?
/ A.: The number of items currently in the buffer.
/ B.: The opposite number of items currently in the buffer.
/ C.: The number of available blocks currently in the buffer.
/ D.: The opposite number of available blocks in the buffer
]
#note(caption: "Answer")[
The answer is D. It comes down to A versus D, and A fails for the following reason. With $Phi$ = (number of items in the buffer), a non-expanding insertion has amortized cost $1 + 1 = 2$, but an expanding insertion with $k$ old items has actual cost $k + 1$ while $Phi$ only increases by $1$, giving amortized cost $k + 2$, which is not $O(1)$. With D, $Phi$ = the opposite (negative) of the number of available blocks: a non-expanding insertion uses up one free block, so the amortized cost is $1 + 1 = 2$; an expanding insertion starts with $0$ free blocks and ends with $k - 1$ of them, so $Delta Phi = -(k - 1)$ and the amortized cost is $(k + 1) - (k - 1) = 2$. Hence D yields $O(1)$ amortized cost.
]
== HW 2
#question()[
Insert 3, 1, 4, 5, 9, 2, 6, 8, 7, 0 into an initially empty 2-3 tree (with splitting). Which one of the following statements is FALSE?
]
#note(caption: "Answer")[
Simulate the insertions; the final tree may look like this:
#fig("/public/assets/Courses/ADS/æéé¢/img-2024-03-12-23-11-31.png", width: 50%)
]
#question()[
Which of the following statements concerning a B+ tree of order M is TRUE? \
A. the root always has between 2 and M children\
...
]
#note(caption: "Answer")[
Watch the definition: the root is either a leaf or has between 2 and M children.
]
== HW 3
#question()[
When evaluating the performance of data retrieval, it is important to measure the relevancy of the answer set. (T/F)
]
#note(caption: "Answer")[
F. For data retrieval, the relevancy of the whole answer set is beside the point; this one is a bit of a trap question.
]
== HW 4
#question()[
The result of inserting keys $1$ to $2^k- 1$ for any $k>4$ in order into an initially empty skew heap is always a full binary tree.
]
#note(caption: "Answer", breakable: false)[
It is not obvious why $k > 4$ is required; the claim seems to hold for all $k >= 2$ already. The tree for $k = 4$ is shown below:
#syntree("[1 [3 [7 15 11] [5 13 9]] [2 [6 14 10] [4 12 8]]]")
(one can work out the pattern by hand to save time)
]
== HW 7
#question()[
Which one of the following is the lowest upper bound of $T(N)$. $T(N)$ for the following recursion $T(N) = 2T(sqrt(N)) + log N$?
]
#note(caption: "Answer")[
çæ¡æ¯ $O(log N log log N)$ãæ³šæå°åœæ°å¹¶éå
žå圢åŒïŒæä»¥èŠå
æ¢å
$m = log N$ïŒåŸå° $T(2^m) = 2T(2^(m/2)) + m$ãå讟 $G(m)=T(2^m)$ïŒåæ $G(m)=2G(m\/2)+m$ïŒä¹å°±æ¯å¯¹åéæ¢äžªå
ïŒåå¯¹åœæ°æ¢äžªå
ïŒ\
ç±äž»å®ç(1)ç¥ $G(m)=O(m log m)$ïŒæä»¥ $T(N)=O(log N log log N)$
]
== HW 9
#question()[
Let $S$ be the set of activities in Activity Selection Problem. Then the earliest finish activity $a_m$ must be included in all the maximum-size subset of mutually compatible activities of $S$.
]
#note(caption: "Answer")[
F. Note: this greedy choice is guaranteed to belong to *some* maximum-size solution, but not necessarily to *every* one.
]
== HW 10
#question()[
All NP problems are decidable.
]
#note(caption: "Answer")[
T.
]
== HW 11
#question()[
To approximate a maximum spanning tree $T$ of an undirected graph $G=(V,E)$ with distinct edge weights $w(u,v)$ on each edge $(u,v) in E$, let's denote the set of maximum-weight edges incident on each vertex by $S$. Also let $w(E')=sum_{(u,v) in E'} w(u,v)$ for any edge set $E'$. Which of the following statements is TRUE?
- A. $S=T$ for any graph $G$
- B. $S != T$ for any graph $G$
- C. $w(T) >= w(S)\/2$ for any graph $G$
- D. None of the above
]
#note(caption: "Note")[
æææ¯ïŒå¯¹æ¯äžªé¡¶ç¹éå®çæå€§èŸ¹ïŒè¿æ ·éåºæ¥çç»æäžäžå®æ¯æå€§çææ ïŒçè³äžäžå®èéïŒäœè¶³å€è¿äŒŒãA å B é项çèµ·æ¥äºæ¥ïŒå®é
äžå¯ä»¥å嫿é åºåäŸã
#fig("/public/assets/Courses/ADS/æéé¢/img-2024-05-10-19-20-37.png", width: 50%)
对 C é项ïŒç±äºãäžäŒ
]
== HW 12
#Q(
[
Greedy method is a special case of local search.
],
[
    F. Greedy can roughly be described as making, at each step, the locally best decision based on the currently available information. Local search, by contrast, starts from an initial solution and explores new candidate solutions through local perturbations. A common variant is "k-exchange" local search: swap some parts of the current solution and test whether the move yields a better solution.
]
)
#Q(
[A bipartite graph $G$ is one whose vertex set can be partitioned into two sets $A$ and $B$, such that each edge in the graph goes between a vertex in $A$ and a vertex in $B$. Matching $M$ in $G$ is a set of edges that have no end points in common. Maximum Bipartite Matching Problem finds a matching with the greatest number of edges (over all matching).\
Consider the following Gradient Ascent Algorithm:
```
As long as there is an edge whose endpoints are unmatched, add it to the current matching. When there is no longer such an edge, terminate with a locally optimal matching.
```
Let $M_1$ and $M_2$ be matchings in a bipartite graph $G$. Which of the following statements is true?
- A. This gradient ascent algorithm never returns the maximum matching.
- B. Suppose that $|M_1|>2|M_2|$. Then there must be an edge $e$ in $M_1$ such that $M_2 union {e}$ is a matching in $G$.
- C. Any locally optimal matching returned by the gradient ascent algorithm in a bipartite graph $G$ is at most half as large as a maximum matching in $G$.
- D. All of the above
],
[
äžäŒã
]
)
== HW15
#Q(
[
If only one tape drive is available to perform the external sorting, then the tape access time for any algorithm will be $Omega(N^2)$.
],
[
çæ¡æ¯ TãäžæïŒè·æ°æ®åºçç论æç¹äžäžæ ·ïŒèäžäžºä»ä¹ $Omega(N^2)$ 没æèèå
åå€§å° $M$(æè
诎åå¹¶è·¯æ° $k$)ïŒ
]
)
= Review
== Chap 1
- Let $h_i$ denote the minimum number of nodes of an AVL tree of height $i$; then $h_i = h_(i-1) + h_(i-2) + 1$ with $h_(-1) = 0, h_0 = 1$. For example, for height $h = 6$ we get $h_6 = 33$.
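- As a quick check (a standard identity, stated here without proof): $h_i + 1$ satisfies the Fibonacci recurrence, so $h_i = F_(i+3) - 1$ with $F_1 = F_2 = 1$; indeed $F_9 = 34$ gives $h_6 = 33$.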
|
|
https://github.com/kdog3682/mathematical | https://raw.githubusercontent.com/kdog3682/mathematical/main/0.1.0/src/geometry/shapes.typ | typst |
#import "@preview/cetz:0.2.2"
#let get-rect-coordinates(w, h) = {
let a = (0, 0)
let b = (0, h)
let c = (w, h)
let d = (w, 0)
return (a, b, c, d)
}
// Assumed sketch for a previously empty stub: pick the two corners that the
// brace should span, based on where it is placed relative to the shape.
#let get-brace-points(points, place) = {
  let (a, b, c, d) = points
  if place == "below" {
    (d, a) // span the bottom edge, oriented so the brace opens downwards
  } else {
    (b, c) // span the top edge
  }
}

#let brace(points, c, place: "below") = {
  let (a, b) = get-brace-points(points, place)
  cetz.decorations.brace(a, b, stroke: 0.5pt, name: "brace")
  // Label the brace at its "content" anchor (the original referenced an
  // undefined `resolve-content` helper and an unknown "brace.k" anchor).
  cetz.draw.content("brace.content", [#c])
}
#let square(length) = {
let points = get-rect-coordinates(length, length)
cetz.draw.rect(points.at(0), points.at(2))
brace(points, length, place: "below")
}
#let rectangle(w, h) = {
  // Assumed completion of a truncated function: draw the rectangle from its
  // corner coordinates and label its bottom edge with the width.
  let points = get-rect-coordinates(w, h)
  cetz.draw.rect(points.at(0), points.at(2))
  brace(points, w, place: "below")
}

// Example usage (inside a cetz canvas): "Small squares have side length 2."
// #square(2)
|
|
https://github.com/FrightenedFoxCN/cetz-cd | https://raw.githubusercontent.com/FrightenedFoxCN/cetz-cd/main/src/arrows.typ | typst | #import "@preview/cetz:0.1.2"
#import "utils.typ": *
// Here we calculate the array-related information
#let resolve-arrow-string(string) = {
let res = (0, 0)
for i in string {
if i == "u" {
res.at(1) -= 1
} else if i == "d" {
res.at(1) += 1
} else if i == "l" {
res.at(0) -= 1
} else if i == "r" {
res.at(0) += 1
} else {
return none
}
}
res
}
// this is the *internal* representation of the arrow
#let cd-arrow(start, // start point
end, // end point
style, // style of the arrow
text, // text attached to the arrow
text-size, // size of the text, this should be passed here due to the limitation of content type
swapped, // if the text is on the right; on the left default
bent, // if the arrow is bent, a degree is given here
offset) = ( // offset of the arrow from the centerline
start: start,
end: end,
style: style,
text: text,
text-size: text-size,
swapped: swapped,
bent: bent,
offset: offset
)
// These helpers construct the internal representation defined above; the
// constructor is `cd-arrow` (the bare name `arrow` is undefined here).
#let default-arrow(start, end) = cd-arrow(start, end, "-", none, (0, 0), false, 0, 0)

#let default-arrow-with-text(start, end, text, text-size) = cd-arrow(start, end, "-", text, text-size, false, 0, 0)
#let draw-arrow(arrow) = {
cetz.draw.line(arrow.start, arrow.end, mark: (fill: black, end: ">"), stroke: (thickness: 0.5pt))
// place the text
if arrow.text != none {
// for the visualize effect, the text should be a bit nearer to the start point of the arrow
let text-position = add2d(scale2d(0.55, arrow.start), scale2d(0.45, arrow.end))
// cetz.draw.circle(text-position, radius:.08)
// slope of the arrow
let slope = 0.
let normal = (0., 0.)
let tangent = (0., 0.)
if arrow.end.at(0) - arrow.start.at(0) != 0 {
slope = (arrow.end.at(1) - arrow.start.at(1)) / (arrow.end.at(0) - arrow.start.at(0))
normal = normalize2d((-slope, 1.))
tangent = normalize2d((1., slope))
} else {
normal = (1., 0.)
tangent = (0., 1.)
}
if arrow.swapped != true {
text-position = add2d(text-position, scale2d(0.2, normal))
text-position = add2d(text-position, scale2d(0.5, mult2d(normal, arrow.text-size)))
} else {
text-position = add2d(text-position, scale2d(-0.2, normal))
text-position = add2d(text-position, scale2d(-0.5, mult2d(normal, arrow.text-size)))
}
// cetz.draw.circle(text-position, radius:.05)
cetz.draw.content(text-position, arrow.text)
}
}
// for the user, arr is used to create the arrow
#let arr(direction,
style : "-", // style of the arrow
text : none, // text attached to the arrow
swapped: false,
bent : 0., // if the arrow is bent, a degree is given here
offset : 0.) = ( // offset of the arrow from the centerline
direction: resolve-arrow-string(direction),
style: style,
text: text,
swapped: swapped,
bent: bent,
offset: offset
)
#let parse-arrow(content) = {
content.split("\\")
.map(a => {a.trim(" ")})
.map(a => a.split("&")
.map(a => a.trim(" ").split(",")
.map(a => arr(a.trim(" ")))))
} |
|
https://github.com/ntjess/toolbox | https://raw.githubusercontent.com/ntjess/toolbox/main/cetz-plus/git-graph.typ | typst | #import "@preview/cetz:0.2.0"
#let d = cetz.draw
#let offset(anchor, x: 0, y: 0) = {
(v => cetz.vector.add(v, (x, y)), anchor)
}
#let default-colors = (red, orange, yellow, green, blue, purple, fuchsia, gray)
#let color-boxed(..args) = {
set text(0.8em)
box(
inset: (y: 0.25em, x: 0.1em),
fill: yellow.lighten(80%),
stroke: black + 0.5pt,
radius: 0.2em,
..args
)
}
#let _layers = (
LANES: -4,
BRANCH: -3,
GRAPH: -2,
COMMIT: 1,
TAG: 1,
)
#let _git-graph-defaults = (
default-branch-colors: default-colors,
branches: (:),
active-branch: "main",
commit-id: 0,
commit-spacing: 0.8,
ref-branch-map: (:),
lane-spacing: 2,
lane-style: (
stroke: (paint: gray, dash: "dashed")
),
graph-style: (
stroke: (thickness: 0.25em),
radius: 0.25
),
commit-style: (
decorator: color-boxed,
spacing: 0.8,
angle: 45deg,
),
tag-style: (
decorator: color-boxed.with(fill: blue.lighten(75%), stroke: black),
angle: -45deg
),
)
#let _is-empty(content) = {
content == "" or content == [] or content.has("text") and content.text == ""
}
#let graph-props(func) = {
d.get-ctx(ctx => {
let props = ctx.git-graph
props.ctx = ctx
func(props)
})
}
#let set-graph-props(func) = {
d.set-ctx(ctx => {
ctx.git-graph = func(ctx.git-graph)
ctx
})
}
#let branch-props(func, branch: auto) = {
graph-props(props => {
let branch = branch
if branch == auto {
branch = props.active-branch
}
if branch not in props.branches {
panic("Branch `" + branch + "` does not exist")
}
let sub-props = props.branches.at(branch)
props.name = branch
func(props + sub-props)
})
}
#let background-lanes() = {
graph-props(props => {
for (name, branch-props) in props.branches.pairs() {
let (ctx, latest-commit) = cetz.coordinate.resolve(props.ctx, "head")
let end = offset(name, y: latest-commit.at(1) - props.commit-spacing)
d.on-layer(_layers.LANES, d.line(name, end, ..props.lane-style, anchor: "north"))
}
})
}
#let _branch-line(src, dst, color) = {
// Easier than a merge line since src is guaranteed to be left of dst
graph-props(props => {
let ctx = props.ctx
let (ctx, a, b) = cetz.coordinate.resolve(ctx, src, dst)
assert(
a.at(0) < b.at(0) and a.at(1) >= b.at(1),
message: "source branch must start before destination branch"
)
let radius = props.graph-style.radius
let stroke = (stroke: (paint: color, ..props.graph-style.stroke))
d.line(offset(b, y: -b.at(1)), b, ..stroke)
d.merge-path(..stroke, {
d.line(
src, (b.at(0) - radius, a.at(1)),
)
d.arc((), start: 90deg, delta: -90deg, radius: radius)
})
})
}
#let branch(name, color: auto, colors: default-colors) = {
if type(name) != str {
name = name.text
}
set-graph-props(props => {
let branches = props.branches
if name in branches {
panic("Branch `" + name + "` already exists")
}
let color = color
let n-cur = branches.len()
if color == auto {
color = colors.at(calc.rem(n-cur, colors.len()))
}
branches.insert(name, (fill: color, lane: n-cur))
props.branches = branches
props.head = name
props.active-branch = name
props
})
let styled(..args) = {
set text(weight: "bold", fill: white)
rect(radius: 0.25em, ..args)
}
branch-props(props => {
d.content((props.lane * props.lane-spacing, 0), styled(name, fill: props.branches.at(name).fill), name: name, anchor: "west")
})
branch-props(props => {
let new-head = name
if props.commit-id > 0 {
let (_, head-pos, lane-pos) = cetz.coordinate.resolve(props.ctx, "head", name)
let join-loc = (lane-pos.at(0), head-pos.at(1) - props.commit-spacing)
if head-pos.at(1) < 0 {
d.on-layer(-props.lane + _layers.BRANCH, _branch-line("head", join-loc, props.fill))
}
new-head = (lane-pos.at(0), head-pos.at(1))
}
d.anchor("head", new-head)
d.anchor(name + "/head", new-head)
})
}
#let checkout(branch) = {
set-graph-props(props => {
if branch not in props.branches {
panic("Branch `" + branch + "` does not exist")
}
props.active-branch = branch
props
})
d.get-ctx(ctx => {
d.anchor("head", branch + "/head")
})
}
#let commit(message, branch: auto) = {
if branch != auto {
checkout(branch)
}
set-graph-props(props => {
props.commit-id = props.commit-id + 1
props.ref-branch-map.insert(str(props.commit-id), props.active-branch)
props
})
let on-graph = d.on-layer.with(_layers.GRAPH)
let on-branch = d.on-layer.with(_layers.BRANCH)
branch-props(props => {
let txt = props.commit-style.at("decorator")(message)
let (_, lane-pos) = cetz.coordinate.resolve(props.ctx, "head")
d.anchor("head", (lane-pos.at(0), -props.commit-id * props.commit-spacing))
on-graph(d.content("head", circle(fill: props.fill, radius: 0.5em), name: "circ"))
on-branch(
d.line(props.name, "head", stroke: (paint: props.fill, ..props.graph-style.stroke))
)
if not _is-empty(message) {
let rot = props.commit-style.at("angle")
d.content("circ.south-west", txt, anchor: "east", angle: rot)
}
})
graph-props(props => {
d.anchor(props.active-branch + "/head", "head")
d.anchor("commit-id-" + str(props.commit-id), "head")
})
}
#let tag(message) = {
graph-props(props => {
let txt = props.tag-style.at("decorator")(message)
let rot = props.tag-style.at("angle")
d.content("head", txt, anchor: "west", angle: rot, padding: 0.75em)
})
}
#let _merge-line(src, dest, color) = {
// A line with a quarter-circle turn from src to dest branch
let radius = 0.5em
graph-props(props => {
let ctx = props.ctx
let (ctx, a, b) = cetz.coordinate.resolve(ctx, src, dest)
assert(
calc.abs(a.at(1)) < calc.abs(b.at(1)),
message: "Destination branch must be below source branch"
)
let radius = props.graph-style.radius
let p = d.merge-path(stroke: (paint: color, ..props.graph-style.stroke), {
d.line(src, (a.at(0), b.at(1) + radius))
if a.at(0) < b.at(0) {
d.arc((), start: 180deg, delta: 90deg, radius: radius)
} else {
d.arc((), start: 0deg, delta: -90deg, radius: radius)
}
d.line((), b)
})
d.on-layer(_layers.BRANCH, p)
})
}
#let merge(commit-id, message: []) = {
commit(message)
d.on-layer(_layers.GRAPH, d.circle((), radius: 0.35em, fill: white, stroke: none))
graph-props(props => {
let commit-id = commit-id
let refs = props.ref-branch-map
if commit-id.replace("/head", "") in props.branches {
commit-id = commit-id + "/head"
refs.insert(commit-id, commit-id.split("/").at(0))
} else if commit-id not in refs {
panic("Commit ref `" + commit-id + "` does not exist")
}
let src-branch = refs.at(commit-id)
if src-branch == props.active-branch {
panic(
"Cannot merge branch into itself. head is already at `" + src-branch
+ "`, and commit `" + commit-id + "` belongs to the same branch.
Perhaps you forgot to checkout a different branch before merging?"
)
}
let branch-props = props.branches.at(src-branch)
_merge-line(commit-id, "head", branch-props.fill)
})
}
#let git-graph(graph, name: none, ..style) = {
d.set-ctx(ctx => {
ctx.git-graph = _git-graph-defaults
ctx
})
d.group(name: name, graph)
} |
|
https://github.com/Coekjan/typst-upgrade | https://raw.githubusercontent.com/Coekjan/typst-upgrade/master/tests/exception1/entry.typ | typst | MIT License | #import non-string: *
#import "module1.typ"
#import "@non-preview/package:1.0.0"
|
https://github.com/mismorgano/UG-DifferentialGeometry23 | https://raw.githubusercontent.com/mismorgano/UG-DifferentialGeometry23/main/Tareas/Tarea-01/Tarea-01.typ | typst |
#let title = [
Differential Geometry\
Homework 1
]
#let author = [
<NAME>
]
#let book = [
Differential Geometry of Curves and Surfaces
]
#set text(12pt,font: "New Computer Modern")
#set enum(numbering: "a)")
#set math.equation(numbering: "(1)", supplement: [Eq.])
#align(center, text(17pt)[
*#title*\
#author
])
From the book *#book*.
== Problems
*Problem 1* _Let $alpha :I -> RR^3$ be a parametrized curve with $alpha'(t) != 0$ for all
$t in I$. Show that $norm(alpha (t))$ is a nonzero constant if and only if $alpha (t)$
is orthogonal to $alpha ' (t)$ for all $t in I$._
*Solution:*
Since $norm(alpha(t))^2 = angle.l alpha(t), alpha(t)angle.r$, we have
$ norm(alpha(t))^2 ' = angle.l alpha(t), alpha(t)angle.r' = 2 angle.l alpha(t), alpha'(t)angle.r. $ <dot-der>
Moreover, note that
$norm(alpha(t))$ is constant if and only if $norm(alpha(t))^2$ is constant.
Hence,
if $norm(alpha(t))$ is a nonzero constant, we get $alpha(t) != bold(0)$; by
hypothesis $alpha'(t)!= bold(0)$, and by @dot-der we get
$2angle.l alpha(t), alpha'(t) angle.r =0$, which implies that $alpha(t)$ and $alpha'(t)$ are orthogonal.
Conversely, if $alpha(t)$ and $alpha'(t)$ are orthogonal,
then by @dot-der we get $norm(alpha(t))^2 ' =0$, which implies that $norm(alpha(t))$
is constant, and nonzero since $alpha(t)!=0$.
*Problem 2* #emph[Let $alpha: (0, pi) -> RR^2$ be given by
#footnote[The parametrization in the book appears to be incorrect.]
$ alpha(t) = (
sin(t), ln(cot(t/2)) - cos(t)
), $
where $t$ is the angle between the $y$ axis and the vector $alpha(t)$. The trace of $alpha$ is called the
*tractrix*. Show that
+ $alpha$ is a differentiable regular curve except at $t=pi/2$;
+ the length of the segment of the tangent line to the tractrix between the point of tangency and the $y$ axis
  is always 1. ]
*Proof:*
a) First note that
$ alpha'(t) &= (cos(t), sin(t) - 1/2 csc^2(t/2) 1/cot(t/2)) \
&= (cos(t), sin(t) -1/sin(t)), $
so $alpha$ is differentiable. Moreover, $alpha'(t) = bold(0)$ if and only if $cos(t) = 0$ for $t in (0,pi)$,
which happens if and only if $t=pi/2$; it follows that $alpha$ is regular and differentiable except at $t=pi/2$.
b) Given $t in (0, pi) without {pi/2}$, the tangent line to the tractrix through
$alpha(t)$ is $alpha(t) + lambda alpha'(t)$. Since we want $alpha(t)+lambda alpha'(t)$
to meet the $y$ axis, its first coordinate must vanish, that is,
$ sin(t) + lambda cos(t) = 0, $
which implies $lambda = -sin(t)/cos(t)$. Since $alpha(t)$ is the point of tangency,
$lambda alpha'(t)$ is the segment of the tangent line joining the point of tangency to the $y$ axis,
so its length is $norm(lambda alpha'(t))$. We can see that
$ norm(lambda alpha'(t))^2 &=lambda^2( cos^2(t) + sin^2(t)-2 + 1/(sin^2(t)) )\
&=lambda^2(1/(sin^2(t)) -1) \
&=lambda^2((1-sin^2(t))/(sin^2(t)))\
&=(sin^2(t))/(cos^2(t)) dot (cos^2(t))/(sin^2(t))\
&=1, $
and therefore $norm(lambda alpha'(t)) = 1$, as desired.
*Problem 3*
#emph[Show that the equation of a plane passing through three non-collinear points $p_1 = (x_1, y_1, z_1)$,
$p_2 = (x_2, y_2, z_2)$, $p_3 = (x_3, y_3, z_3)$ is given by
$ (p-p_1) and (p - p_2) dot (p-p_3) = 0, $
where $p=(x, y, z)$ is an arbitrary point of the plane.]
*Proof:*
First let us look at the "idea" behind the formula. We know that a plane is determined by
a point $P_0$ on the plane and a vector $n$ normal to the plane,
since for any other point $P$ on the plane we must have $angle.l P_0-P, n angle.r = 0$.
In our case, $p-p_1$ and $p - p_2$ are vectors in the plane that span it, so
$(p-p_1) and (p - p_2)$ is normal to the plane, and since $p - p_3$ lies in the plane, we must
have $(p-p_1) and (p-p_2) dot (p-p_3) = 0$. |
|
https://github.com/Meisenheimer/Notes | https://raw.githubusercontent.com/Meisenheimer/Notes/main/src/Analysis.typ | typst | MIT License | #import "@local/math:1.0.0": *
= Analysis
== Calculus
=== Mean value theorem
#env("Theorem", name: "Rolle's theorem")[
Given $n gt.eq 2$ and $f in C^(n-1)([a, b])$ with $f^((n))(x)$ exists at each point of $(a, b)$, suppose that $f(x_0) = dots.c f(x_n) = 0$ for $a lt.eq x_0 < dots.c < x_n lt.eq b$, then there is a point $xi in (a, b)$ such that $f^((n))(xi) = 0$.
]
#env("Theorem", name: "Lagrange's mean value theorem")[
Given $f in C^1([a, b])$, then there exists $xi in (a, b)$ such that
$ f^prime (xi) = (f(b) - f(a)) / (b-a). $
]
#env("Theorem", name: "Cauchy's mean value theorem")[
Given $f, g in C^1([a, b])$, then there exists $xi in (a, b)$ such that
$ (f(b) - f(a)) g^prime (xi) = (g(b) - g(a)) f^prime (xi). $
If $g(a) eq.not g(b)$ and $g(xi) eq.not 0$, this is equivalent to
$ (f^prime (xi)) / (g^prime (xi)) = (f(b) - f(a)) / (g(b) - g(a)). $
]
#env("Theorem", name: "First mean value theorems for definite integrals")[
Given $f in C([a, b])$ and $g$ integrable and does not change sign on $[a, b]$, then there exists $xi$ in $(a, b)$ such that
$ integral_a^b f(x) g(x) upright(d) x = f(xi) integral_a^b g(x) upright(d) x. $
]
#env("Theorem", name: "Second mean value theorems for definite integrals")[
Given $f$ a integrable function and $g$ a positive monotonically decreasing function, then there exists $xi$ in $(a, b)$ such that
$ integral_a^b f(x) g(x) upright(d) x = g(a) integral_a^xi f(x) upright(d) x. $
If $g$ is a positive monotonically increasing function, then there exists $xi$ in $(a, b)$ such that
$ integral_a^b f(x) g(x) upright(d) x = g(b) integral_xi^b f(x) upright(d) x. $
If $g$ is a monotonically function, then there exists $xi$ in $(a, b)$ such that
$ integral_a^b f(x) g(x) upright(d) x = g(a) integral_a^xi f(x) upright(d) x + g(b) integral_xi^b f(x) upright(d) x. $
]
=== Series
#env("Definition")[
  A series $sum_(n=1)^infinity a_n$ is *absolutely convergent* if the series of absolute values $sum_(n=1)^infinity |a_n|$ converges.
]
#env("Theorem")[
  If a series is absolutely convergent, then any reordering of it converges to the same limit.
]
#env("Theorem", name: "n-th term test")[
  If $limits(lim)_(n -> infinity) a_n eq.not 0$, then the series diverges.
]
#env("Theorem", name: "Direct comparison test")[
If $sum_(n=1)^infinity b_n$ is convergent and exists $N > 0$, for all $n > N$, $0 lt.eq a_n lt.eq b_n$, then $sum_(n=1)^infinity a_n$ is convergent; if $sum_(n=1)^infinity b_n$ is divergent and exists $N > 0$, for all $n > N$, $0 lt.eq b_n lt.eq a_n$, then $sum_(n=1)^infinity a_n$ is divergent.
]
#env("Theorem", name: "Limit comparison test")[
  Given two series $sum_(n=1)^infinity a_n$ and $sum_(n=1)^infinity b_n$ with $a_n gt.eq 0, b_n > 0$. If $limits(lim)_(n -> infinity) a_n / b_n = c in (0, infinity)$, then either both series converge or both series diverge.
]
#env("Theorem", name: "Ratio test")[
Given $sum_(n=1)^infinity a_n$ and
$ R = limsup_(n -> infinity) abs(a_(n+1) / a_n), r = liminf_(n -> infinity) abs(a_(n+1) / a_n), $
if $R < 1$, then the series converges absolutely; if $r > 1$, then the series diverges.
]
#env("Theorem", name: "Root test")[
Given $sum_(n=1)^infinity a_n$ and
$ R = limsup_(n -> infinity) (|a_n|)^(1/n), $
if $R < 1$, then the series converges absolutely; if $R > 1$, then the series diverges.
]
#env("Theorem", name: "Integral test")[
Given $sum_(n=1)^infinity f(n)$ where $f$ is monotone decreasing, then the series converges iff the improper integral
$ integral_1^infinity f(x) upright(d) x $
is finite. In particular,
$ integral_1^infinity f(x) upright(d) x lt.eq sum_(n=1)^infinity f(n) lt.eq f(1) + integral_1^infinity f(x) upright(d) x $
]
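For example, taking $f(x) = 1/x$ gives $integral_1^infinity x^(-1) upright(d) x = infinity$, so the harmonic series $sum_(n=1)^infinity 1/n$ diverges; taking $f(x) = 1/x^2$ gives a finite integral, so $sum_(n=1)^infinity 1/n^2$ converges.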
#env("Theorem", name: "Alternating series test")[
Given $sum_(n=1)^infinity (-1)^n a_n$ where $a_n$ are all positive or negative, then the series converges if $|a_n|$ decreases monotonically and $limits(lim)_(n -> infinity) a_n = 0$.
]
=== Multivariable calculus
#env("Theorem", name: "Green's theorem")[
Let $Omega$ be the region in a plane with $partial Omega$ a positively oriented, piecewise smooth, simple closed curve. If $P$ and $Q$ are functions of $(x, y)$ defined on an open region containing $Omega$ and have continuous partial derivatives there, then
$ integral.cont_(partial Omega) (P upright(d) x + Q upright(d) y) = integral.double_Omega ((partial Q) / (partial x) - (partial P) / (partial y)) upright(d) x upright(d) y $
where the path of integration along $C$ is anticlockwise.
]
#env("Theorem", name: "Stokes' theorem")[
Let $Omega$ be a smooth oriented surface in $RR^3$ with $partial Omega$ a piecewise smooth, simple closed curve. If $mathbf(F) (x,y,z) = (F_x (x,y,z), F_y (x,y,z), F_z (x,y,z))$ is defined and has continuous first order partial derivatives in a region containing $Omega$, then
$ integral.double_Omega (nabla times mathbf(F)) dot.c upright(d) S(x) = integral.cont_(partial Omega) mathbf(F) dot.c upright(d) x $
]
#env("Theorem", name: "Gauss-Green theorem (Divergence theorem)")[
For a bounded open set $Omega in RR^n$ that $partial Omega in C^1$ and a function $mathbf(F)(mathbf(x)) = (F_1 (mathbf(x)), dots, F_n (mathbf(x))): overline(Omega)-> RR^n$ satisfies $mathbf(F)(mathbf(x)) in C^1(Omega) sect C(overline(Omega))$,
$ integral_Omega "div" mathbf(F)(mathbf(x)) upright(d) mathbf(x) = integral_(partial Omega) mathbf(F)(mathbf(x)) dot mathbf(n) upright(d) S(x), $
where $mathbf(n)$ is outward pointing unit normal vector at $partial Omega$.
]
#env("Definition")[
An *implicit function* is a function of the form
$ F(x_1, dots, x_n) = 0, $
where $x_1, dots, x_n$ are variables.
]
#env("Theorem")[
Let $F(mathbf(x), mathbf(y)): RR^(n+m) -> RR^m$ be a differentiable function of two variables, and $(mathbf(x)_0, mathbf(y)_0)$ the point that $F(mathbf(x)_0, mathbf(y)_0) = mathbf(0)$. If the Jacobian matrix
$ J_(F, mathbf(y)) (mathbf(x)_0, mathbf(y)_0) = ((partial F_i) / (partial y_j) (mathbf(x)_0, mathbf(y)_0)) $
is invertible, then there exists an open set $Omega subset.eq RR^n$ containing $mathbf(x)_0$ such that there exists a unique function $f: Omega -> RR^m$ such that $f(mathbf(x)_0) = mathbf(y)_0$ and $F(mathbf(x), f(mathbf(y))) = mathbf(0)$ for all $mathbf(x) in Omega$.
Moreover, $f$ is continuously differentiable and, denoting the left-hand panel of the Jacobian matrix shown in the previous section as
$ J_(F, mathbf(x)) (mathbf(x)_0, mathbf(y)_0) = ((partial F_i) / (partial x_j) (mathbf(x)_0, mathbf(y)_0)), $
the Jacobian matrix of partial derivatives of $f$ in $Omega$ is given by
$ ((partial f_i) / (partial x_j) (mathbf(x)))_(m times n) = -(J_(F, mathbf(y)) (mathbf(x), f(mathbf(x))))_(m times m)^(-1) (J_(F, mathbf(x)) (mathbf(x), f(mathbf(x))))_(m times n) . $
]
== Real Analysis
=== Lebesgue Measure
#env("Definition")[
  Given a bounded interval $I in RR$, denote by $cal(l)(I)$ the *length* of the interval, defined as the distance between its endpoints,
$ cal(l)([a, b]) = cal(l)((a, b)) = b - a. $
]
#env("Definition")[
For any subset $E subset RR$, the *Lebesgue outer measure* $m^*(E)$ is defined as
  $ m^*(E) = inf { sum_(i=1)^infinity cal(l)(I_i): {I_i}_(i=1)^infinity " is a sequence of open intervals that " E subset union.big_(i=1)^infinity I_i }. $
]
#env("Theorem")[
If $E_1 subset E_2 subset RR$, then $m^*(E_1) lt.eq m^*(E_2)$.
]
#env("Theorem")[
Given an interval $I subset RR$, $m^*(I) = cal(l)(I)$.
]
#env("Theorem")[
Given ${E_i subset RR}_(i=1)^n$, $m^*(union.big_(i=1)^n E_i) lt.eq sum_(i=1)^n m^*(E_i)$.
]
#env("Definition")[
The sets $E$ are said to be *Lebesgue-measurable* if
  $ forall A subset RR, m^*(A) = m^*(A sect E) + m^*(A sect (RR backslash E)) $
and its Lebesgue measure is defined as its Lebesgue outer measure: $m(E) = m^*(E)$.
]
#env("Theorem")[
The set of all measurable sets $E subset RR$ forms a $sigma$-algebra $cal(F)$ where
- $cal(F)$ contains the sample space: $RR in cal(F)$;
- $cal(F)$ is closed under complements: if $A in cal(F)$, then also $(RR backslash A) in cal(F)$;
- $cal(F)$ is closed under countable unions: if $A_i in cal(F), i = 1, dots$, then also $(union_(i=1)^infinity A_i) in cal(F)$.
]
#env("Definition")[
A *measurable space* is a tuple $(X, cal(F))$ consisting of an arbitrary non-empty set $X$ and a $sigma$-algebra $cal(F) subset.eq 2^X$.
]
== Complex Analysis
#env("Definition")[
Given an open set $Omega$ and a function $f(z): Omega -> CC$, the *derivative* of $f(z)$ at a point $z_0 in Omega$ is defined as the limits
$ f^prime (z) = lim_(z -> z_0) (f(z) - f(z_0))/(z - z_0), $
and the function is said to be *complex differentiable* at $z_0$.
]
#env("Definition")[
A function $f(z)$ is holomorphic on an open set $Omega$ if it is complex differentiable at every point of $Omega$.
]
#env("Theorem")[
  If a complex function $f(x + mathbf(i) y) = u(x, y) + mathbf(i) v(x, y)$ is holomorphic, then $u$ and $v$ have first partial derivatives, and satisfy the Cauchy-Riemann equations,
  $ (partial u)/(partial x) = (partial v)/(partial y) " and " (partial u)/(partial y) = -(partial v)/(partial x), $
or equivalently,
$ (partial f) / (partial overline(z)) = 0. $
]
#env("Theorem", name: "Cauchy's integral theorem")[
Given a simply connected domain $Omega$ and a holomorphic function $f(z)$ on it, for any simply closed contour $C$ in $Omega$,
  $ integral_C f(z) upright(d) z = 0. $
]
// #env("Theorem", name: "Cauchy's integral formula")[
// ]
#env("Theorem", name: "Residue formula")[
Suppose that $f$ is holomorphic in an open set containing a toy contour $gamma$ and its interior, except for some points $z_1, dots, z_n$ inside $gamma$, then
$ integral_gamma f(z) upright(d) z = 2 pi mathbf(i) sum_(k=1)^n upright("res")_(z_k) f, $
where for a pole $z_0$ of order $n$,
$ upright("res")_(z_0) f = lim_(z -> z_0) 1/((n-1)!) (upright(d)/(upright(d)z))^(n-1) (z - z_0)^n f(z). $
]
== Important Inequalities
=== Fundamental inequality
#env("Theorem", name: "Fundamental inequality")[
  $ forall a, b in RR^+, 2 / (1/a + 1/b) <= sqrt(a b) <= (a+b)/2 <= sqrt((a^2 + b^2) / 2), " equality holds iff " a = b. $
]
=== Triangle inequality
#env("Theorem", name: "Triangle inequality")[
$ & a, b in CC, & & bar.v|a| - |b|bar.v <= |a plus.minus b| <= |a| + |b|, \
& mathbf(a), mathbf(b) in RR^n, & & bar.v||mathbf(a)|| - ||mathbf(b)||bar.v <= ||mathbf(a) plus.minus mathbf(b)|| <= ||mathbf(a)|| + ||mathbf(b)||. $
]
=== Bernoulli inequality
#env("Theorem", name: "Bernoulli inequality")[
$ & forall x in (-1, +infinity), forall a in [1, +infinity), & & (1 + x)^a >= 1 + a x, \
& forall x in (-1, +infinity), forall a in (0, 1), & & (1 + x)^a <= 1 + a x, \
& forall x in (-1, +infinity), forall a in (-1, 0), & & (1 + x)^a >= 1 + a x, \
  & forall x_i gt.eq -1 " of the same sign", i in \{1, dots, n\}, & & product_(i=1)^n (1 + x_i) >= 1 + sum_(i=1)^n x_i, \
& forall y >= x > 0, & & (1 + x)^y >= (1 + y)^x. $
]
=== Jensen's inequality
#env("Theorem", name: "Jensen's inequality")[
For a real convex function $f(x): [a, b] -> RR$, numbers $x_1 dots, x_n in [a, b]$ and weights $a_1, dots, a_n$, the Jensen's inequality can be start as
$ (sum_(i=1)^n a_i f(x_i)) / (sum_(i=1)^n a_i) >= f ( (sum_(i=1)^n a_i x_i) / (sum_(i=1)^n a_i) ). $
And for concave function $f$,
$ (sum_(i=1)^n a_i f(x_i)) / (sum_(i=1)^n a_i) <= f ( (sum_(i=1)^n a_i x_i) / (sum_(i=1)^n a_i) ). $
Equality holds iff $x_1 = dots.c = x_n$ or $f$ is linear on $[a, b]$.
]
=== Cauchy-Schwarz inequality
#env("Theorem", name: "Cauchy-Schwarz inequality")[
  \ *Discrete form.* For real numbers $a_1, dots a_n, b_1, dots b_n in RR, n >= 2$
  $ sum_(i=1)^n a_i^2 sum_(i=1)^n b_i^2 >= ( sum_(i=1)^n a_i b_i )^2. $
Equality holds iff $a_1 / b_1 = dots.c = a_n / b_n$ or $a_i = 0$ or $b_i = 0$.
\ *Inner product form.* For a inner product space $V$ with a norm induced by the inner product,
$ forall mathbf(a), mathbf(b) in V ||mathbf(a)|| dot.c ||mathbf(b)|| >= |angle.l mathbf(a), mathbf(b) angle.r|. $
Equality holds iff $exists k in RR, " s.t. " k mathbf(a) = mathbf(b) " or " mathbf(a) = k mathbf(b)$.
\ *Probability form.* For random variables $X$ and $Y$,
$ sqrt(E(X^2)) dot.c sqrt(E(Y^2)) >= |E(X Y)|. $
Equality holds iff $exists k in RR, " s.t. " k X = Y " or " X = k Y$.
\ *Integral form.* For integrable functions $f, g in L^2(Omega)$,
$ (integral_Omega f^2(x) upright(d) x) (integral_Omega g^2(x) upright(d) x) >= ( integral_Omega f(x) g(x) upright(d) x )^2. $
Equality holds iff $exists k in RR, " s.t. " k f(x) = g(x) " or " f(x) = k g(x)$.
]
=== Hölder's inequality
#env("Theorem", name: "Hölder's inequality")[
\ *Discrete form.* For real numbers $a_1, dots a_n, b_1, dots b_n in RR, n >= 2$ and $p, q in [1, +infinity)$ that $(1/p) + (1/q) = 1$,
$ ( sum_(i=1)^n a_i^p )^(1/p) ( sum_(i=1)^n b_i^q )^(1/q) >= ( sum_(i=1)^n a_i b_i ). $
Equality holds iff $exists c_1, c_2 in RR, c_1^2 + c_2^2 eq.not 0, " s.t. " c_1 a_i^p = c_2 b_i^q$.
\ *Integral form.* For functions $f in L^p (Omega), g in L^q (Omega)$ and $p, q in [1, +infinity)$ that $1/p + 1/q = 1$,
$ ( integral_Omega |f(x)|^p upright(d) x )^(1/p) ( integral_Omega |g(x)|^q upright(d) x )^(1/q) >= integral_Omega f(x) g(x) upright(d) x. $
]
=== Young's inequality
#env("Theorem", name: "Young's inequality")[
For $p, q in [1, +infinity)$ that $1/p + 1/q = 1$,
$ forall a, b in RR^*, a^p / p + b^q / q >= a b. $
Equality holds iff $a^p = b^q$.
]
=== Minkowski inequality
#env("Theorem", name: "Minkowski inequality")[
For a metric space $S$,
$ forall f, g in L^p (S), p in [1, +infinity], ||f||_p + ||g||_p >= ||f + g||_p. $
For $p in (1, +infinity)$, equality holds iff $exists k >= 0, " s.t. " f = k g$ or $k f = g$.
]
== Special Functions
=== Gaussian function
#env("Definition")[
A *Gaussian function*, or a Gaussian, is a function of the form
$ f(x) = a exp(-((x - b)^2) / (2 c^2)), $
where $a in RR^+$ is the height of the curve's peak, $b in RR$ is the position of the center of the peak and $c in RR^+$ is the standard deviation or the Gaussian root mean square width.
]
#env("Theorem")[
The integral of a Gaussian is
$ integral_(-infinity)^(+infinity) a exp(-((x - b)^2)/ (2 c^2)) upright("d") x = a c sqrt(2 pi). $
]
#env("Definition")[
A *normal distribution* or a *Gaussian distribution* is a continuous probability distribution of the form
  $ f_(mu, sigma) (x) = 1 / (sigma sqrt(2 pi)) exp(-((x - mu)^2)/(2 sigma^2)), $
where $mu$ is the mean and $sigma$ is the standard deviation.
]
=== Dirac delta function
#env("Definition")[
The *Dirac delta function* centered at $overline(x)$ is
  $ delta(x - overline(x)) = lim_(epsilon -> 0) f_(overline(x), epsilon) (x), $
where $f_(overline(x), epsilon)$ is a normal distribution with its mean at $overline(x)$ and its standard deviation as $epsilon$.
]
#env("Theorem")[
The Dirac delta function satisfies
  $ delta(x - overline(x)) =& cases(+infinity\, & #h(1em) x = overline(x), 0\, & #h(1em) x eq.not overline(x),) #h(1em) integral_(-infinity)^x delta(t - overline(x)) upright("d") t =& cases(1\, & #h(1em) x >= overline(x), 0\, & #h(1em) x < overline(x)) $
  where $H(x) = integral_(-infinity)^x delta(t - overline(x)) upright("d") t$ is called the *Heaviside function* or *step function*.
]
#env("Theorem")[
If $f: RR -> RR$ is continuous, then
$ integral_(-infinity)^(+infinity) delta(x - overline(x)) f(x) upright("d") x = f(overline(x)). $
]
=== Gamma function
#env("Definition")[
The *Gamma function* defined on $CC$ is
$ Gamma(z) = integral_0^(+infinity) t^(z-1) e^(-t) upright("d") t, $
where $upright("Re") (z) > 0$.
]
#env("Theorem")[
The Gamma function satisfies
  $ & forall x in CC " with " upright("Re")(x) > 0, & & Gamma(x + 1) = x Gamma(x), \ & forall n in NN^*, & & Gamma(n) = (n - 1)!. $
]
#env("Theorem")[
The Gamma function satisfies
$ forall x in (0, 1), Gamma(1 - x) Gamma(x) = pi / sin(pi x), $
which implies
$ Gamma(1/2) = sqrt(pi). $
]
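For example, combining the two theorems above gives $Gamma(3/2) = 1/2 Gamma(1/2) = sqrt(pi)/2$.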
=== Beta Function
#env("Definition")[
For $p, q in RR^+$, the *Beta function* is defined as
$ B(p, q) = integral_0^1 x^(p-1) (1 - x)^(q-1) upright("d") x. $
]
#env("Theorem")[
The Beta function satisfies
$ forall p, q in RR^+, B(p, q) = B(q, p) = (Gamma(p) Gamma(q)) / Gamma(p + q). $
]
#env("Theorem")[
The Beta function satisfies
$ & forall p > 0, forall q > 1, & & B(p, q) = (q - 1) / (p + q - 1) B(p, q - 1), \
& forall p > 1, forall q > 0, & & B(p, q) = (p - 1) / (p + q - 1) B(p - 1, q), \
& forall p > 1, forall q > 1, & & B(p, q) = ((p - 1)(q - 1)) / ((p + q - 1)(p + q - 2)) B(p - 1, q - 1). $
]
|
https://github.com/xlxs4/cv | https://raw.githubusercontent.com/xlxs4/cv/main/xlxs4.typ | typst | MIT License | #import "cv.typ/cv.typ": *
// Load CV data from YAML
#let cvdata = yaml("xlxs4.yml")
#let uservars = (
headingfont: "Linux Libertine",
bodyfont: "Linux Libertine",
fontsize: 10pt,
linespacing: 6pt,
showAddress: true,
showNumber: true,
)
#let customrules(doc) = {
doc
}
#let cvinit(doc) = {
doc = setrules(uservars, doc)
doc = showrules(uservars, doc)
doc = customrules(doc)
doc
}
#show: doc => cvinit(doc)
#cvheading(cvdata, uservars)
#cvwork(cvdata)
#cvexperience(cvdata)
#cveducation(cvdata)
#cvskills(cvdata)
#cvconferences(cvdata)
#endnote
|
https://github.com/hweissi/tugraz-typst-theme | https://raw.githubusercontent.com/hweissi/tugraz-typst-theme/main/tugraz-polylux.typ | typst | // This theme is inspired by https://github.com/matze/mtheme
// The polylux-port was performed by https://github.com/Enivex
// Consider using:
// #set text(font: "Fira Sans", weight: "light", size: 20pt)
// #show math.equation: set text(font: "Fira Math")
// #set strong(delta: 100)
// #set par(justify: true)
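// A minimal usage sketch (assuming this file is saved as "tugraz-polylux.typ"
// next to the presentation; it uses only the functions defined below):
//
// #import "tugraz-polylux.typ": *
// #show: tugraz-theme.with(aspect-ratio: "16-9", footer: [My talk])
// #title-slide(title: [My talk], author: [Jane Doe], date: [2024-05-01])
// #slide(title: [First slide])[Hello, TU Graz!]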
#import "@preview/polylux:0.3.1": *
#import "@preview/showybox:2.0.1": showybox
#let m-dark-teal = rgb("#1A1A1A")
#let m-light-brown = rgb("#F70146")
#let m-lighter-brown = rgb("#e0c0c5")
#let m-extra-light-gray = white.darken(2%)
#let m-footer-background = rgb("#dfdfdf")
#let m-footer = state("m-footer", [])
#let m-page-progress-bar = state("m-page-progress-bar", [])
#let m-cell = block.with(
width: 100%,
height: 100%,
above: 0pt,
below: 0pt,
breakable: false
)
#let m-progress-bar = utils.polylux-progress( ratio => {
grid(
columns: (ratio * 100%, 1fr),
m-cell(fill: m-light-brown),
m-cell(fill: m-lighter-brown)
)
})
#let tugraz-theme(
aspect-ratio: "16-9",
footer: [],
progress-bar: true,
body
) = {
set page(
paper: "presentation-" + aspect-ratio,
fill: m-extra-light-gray,
margin: 0em,
header: none,
footer: none,
)
set list(marker: (text(size: 1.7em, "â¢"), text(size: 1.5em, "â¢"), text(size: 1.3em, "â¢")))
m-footer.update(footer)
if progress-bar {
m-page-progress-bar.update(m-progress-bar)
}
body
}
#let title-slide(
title: [],
subtitle: none,
author: none,
date: none,
extra: none,
) = {
let content = {
set text(fill: m-dark-teal)
set align(horizon)
place(top + right, pad(30%, image("./TU_Graz.svg", format: "svg", width: 20%)))
block(width: 100%, inset: 2em, {
text(size: 1.8em, strong(title), weight: "regular", fill: m-light-brown)
if subtitle != none {
linebreak()
linebreak()
text(size: 0.8em, subtitle)
}
line(length: 100%, stroke: .05em + m-light-brown)
set text(size: .8em)
if author != none {
block(spacing: 1em, text(weight: "medium", author))
}
if date != none {
block(spacing: 1em, date)
}
set text(size: .8em)
if extra != none {
block(spacing: 1em, extra)
}
})
}
logic.polylux-slide(content)
}
#let slide(title: none, body) = {
let header = {
set align(top)
if title != none {
show: m-cell.with(fill: m-dark-teal, inset: 1em)
set align(horizon)
set text(fill: m-extra-light-gray, size: 1.2em)
strong(title)
h(1fr)
box(pad(bottom: .75em, text(size: .5em, "www.tugraz.at", weight: "medium")))
box(pad(left: .2em, bottom: .75em, square(fill: m-light-brown, size: .3em)))
} else {
h(1fr)
box(pad(bottom: .75em, text(size: .5em, "www.tugraz.at", weight: "medium")))
box(pad(left: .2em, bottom: .75em, square(fill: m-light-brown, size: .3em)))
}
}
let footer = {
set text(size: 0.7em)
show: m-cell.with(fill: m-footer-background)
set align(horizon)
box(align(left+top, square(height: 100%, fill: m-light-brown, align(horizon+center, text(fill: m-extra-light-gray, weight: "regular", size: .9em, logic.logical-slide.display())))))
h(1fr)
box(height: 100%, pad(1.5em, text(fill: m-dark-teal, size: .9em, m-footer.display())))
place(bottom, block(height: 2pt, width: 100%, m-page-progress-bar.display()))
}
set page(
header: header,
footer: footer,
margin: (top: 3em, bottom: 2em),
fill: m-extra-light-gray,
)
let content = {
show: align.with(horizon)
show: pad.with(left: 2em, 1em)
set text(fill: m-dark-teal)
body
}
logic.polylux-slide(content, max-repetitions: 15)
}
#let new-section-slide(name) = {
let content = {
utils.register-section(name)
set align(horizon)
show: pad.with(20%)
set text(size: 1.5em)
name
block(height: 2pt, width: 100%, spacing: 0pt, m-progress-bar)
}
logic.polylux-slide(content)
}
#let focus-slide(body) = {
set page(fill: m-dark-teal, margin: 2em)
set text(fill: m-extra-light-gray, size: 1.5em)
logic.polylux-slide(align(horizon + center, body))
}
#let alert = text.with(fill: m-light-brown)
#let numbering-func(..nums) = {
box(
square(fill: m-light-brown, height: 1.2em,
align(center, text(fill: m-extra-light-gray, weight: "medium",
nums.pos().map(str).join(".")))
))
}
#let metropolis-outline = utils.polylux-outline(enum-args: (tight: false, numbering: numbering-func))
#let quotebox = showybox.with(frame: (
border-color: m-dark-teal,
footer-color: m-light-brown.darken(25%).desaturate(10%),
body-color: m-light-brown.lighten(80%),
radius: 0pt
),
footer-style: (
color: m-extra-light-gray,
weight: "light",
align: right
),
shadow: (
offset: 5pt
)
)
#let defbox = showybox.with(frame: (
border-color: m-dark-teal.lighten(20%),
title-color: m-extra-light-gray.darken(30%),
body-color: m-extra-light-gray.darken(10%),
radius: 0pt
),
title-style: (
color: m-light-brown.darken(20%),
weight: "medium",
align: left
),
shadow: (
offset: 4pt
)
) |
|
https://github.com/0x1B05/algorithm-journey | https://raw.githubusercontent.com/0x1B05/algorithm-journey/main/practice/note/content/åšæè§å.typ | typst | #import "../template.typ": *
= Dynamic Programming
== From Recursion to 1-D Dynamic Programming
#tip("Tip")[
  Problems 1 through 4 all start from a recursive solution, which is then gradually turned into a dynamic-programming implementation.
]
=== #link(
  "https://leetcode.cn/problems/minimum-cost-for-tickets/",
)[Problem 2: Minimum Cost For Tickets]
In a country popular for train travel, you have planned some train travelling one year in advance.
The days of the year in which you will travel are given as an array `days`; each element is an integer from 1 to 365.
Train tickets are sold in three different ways:
- a 1-day pass is sold for `costs[0]` dollars,
- a 7-day pass is sold for `costs[1]` dollars,
- a 30-day pass is sold for `costs[2]` dollars.
A pass allows that many days of consecutive travel. For example, if we get a 7-day pass on day 2, then we can travel for 7 days: days 2, 3, ..., 8.
Return the minimum number of dollars you need to travel every day in the given list `days`.
#example(
  "Example",
)[
  - Input: `days = [1,4,6,7,8,20]`, `costs = [2,7,15]`
  - Output: `11`
  - Explanation:
    - For example, here is one way to buy passes that lets you complete your travel plan:
    - On day 1, you buy a 1-day pass for `costs[0] = $2`, which covers day 1.
    - On day 3, you buy a 7-day pass for `costs[1] = $7`, which covers days 3, 4, ..., 9.
    - On day 20, you buy a 1-day pass for `costs[0] = $2`, which covers day 20.
    - In total, you spend \$11 and cover every day of your plan.
]
==== Solution
#code(
  caption: [Solution],
)[
```java
// memoized search ("dp cache")
public static int mincostTickets(int[] days, int[] costs) {
    int[] dp = new int[days.length + 1];
    for (int i = 0; i < dp.length; i++) {
        dp[i] = Integer.MAX_VALUE;
    }
    int ans = f1(days, costs, 0, dp);
    return ans;
}

// currently at days[cur...]; minimum cost to cover all remaining travel days
public static int f1(int[] days, int[] costs, int cur, int[] dp) {
    // no travel days left
    if (cur >= days.length) {
        return 0;
    }
    if (dp[cur] < Integer.MAX_VALUE) {
        return dp[cur];
    }
    int ans = Integer.MAX_VALUE;
    for (int k = 0; k < costs.length; k++) {
        // costs[0], 1-day pass  -> f(days, costs, cur + 1)
        // costs[1], 7-day pass  -> f(days, costs, i), where i is the first index with days[i] >= days[cur] + 7
        // costs[2], 30-day pass -> f(days, costs, i), where i is the first index with days[i] >= days[cur] + 30
        if (k == 0) {
            // 1-day pass
            ans = Math.min(ans, costs[0] + f1(days, costs, cur + 1, dp));
        } else if (k == 1) {
            // 7-day pass
            int i = 0;
            for (i = cur + 1; i < days.length && days[i] < days[cur] + 7; i++)
                ;
            ans = Math.min(ans, costs[1] + f1(days, costs, i, dp));
        } else {
            // 30-day pass
            int i = 0;
            for (i = cur + 1; i < days.length && days[i] < days[cur] + 30; i++)
                ;
            ans = Math.min(ans, costs[2] + f1(days, costs, i, dp));
        }
    }
    dp[cur] = ans;
    return ans;
}

// strict table dependency (bottom-up)
public static int MAX_DAYS = 366;

public static int mincostTickets2(int[] days, int[] costs) {
    int n = days.length;
    int[] dp = new int[n + 1];
    Arrays.fill(dp, 0, n + 1, Integer.MAX_VALUE);
    dp[n] = 0;
    for (int cur = n - 1; cur >= 0; cur--) {
        for (int k = 0; k < costs.length; k++) {
            if (k == 0) {
                // 1-day pass
                dp[cur] = Math.min(dp[cur], costs[0] + dp[cur + 1]);
            } else if (k == 1) {
                // 7-day pass
                int i = 0;
                for (i = cur + 1; i < days.length && days[i] < days[cur] + 7; i++)
                    ;
                dp[cur] = Math.min(dp[cur], costs[1] + dp[i]);
            } else {
                // 30-day pass
                int i = 0;
                for (i = cur + 1; i < days.length && days[i] < days[cur] + 30; i++)
                    ;
                dp[cur] = Math.min(dp[cur], costs[2] + dp[i]);
            }
        }
    }
    return dp[0];
}
```
]
Zuo's handling of the three pass types is more concise here and worth studying:
#code(caption: [Zuo's approach])[
```java
// whichever version is submitted, carry this array: indices 0 1 2
public static int[] durations = { 1, 7, 30 };
...
for (int k = 0, j = i; k < 3; k++) {
    // k is the pass type: 0 1 2
    while (j < days.length && days[i] + durations[k] > days[j]) {
        // pass type 2 lasts the longest: 30 days,
        // so this while loop runs at most 30 times;
        // the enumeration step can be treated as O(1)
        j++;
    }
    ans = Math.min(ans, costs[k] + f1(days, costs, j));
}
...
```
]
=== #link(
  "https://leetcode.cn/problems/decode-ways/description/",
)[Problem 3: Decode Ways]
A message containing letters from A-Z can be encoded into numbers using the following mapping:
```
'A' -> "1"
'B' -> "2"
...
'Z' -> "26"
```
To decode an encoded message, all the digits must be grouped and then mapped back into letters using the reverse of the mapping above (there may be multiple ways). For example, `"11106"`
can be mapped into:
- `"AAJF"` with the grouping `(1 1 10 6)`
- `"KJF"` with the grouping `(11 10 6)`
Note that the grouping `(1 11 06)` is invalid, because `"06"` cannot be mapped into `"F"`: `"6"` and `"06"` are not equivalent under the mapping.
Given a non-empty string `s` containing only digits, return the number of ways to decode it.
#tip("Tip")[
  The test data guarantee that the answer fits in a 32-bit integer.
]
==== Solution
#code(
  caption: [Solution],
)[
```java
public static int numDecodings(String s) {
    int[] dp = new int[s.length() + 1];
    Arrays.fill(dp, -1);
    int ans = f(s, 0, dp);
    return ans;
}

// number of ways to decode s[cur...]
public static int f(String s, int cur, int[] dp) {
    if (cur == s.length()) {
        return 1;
    }
    if (dp[cur] != -1) {
        return dp[cur];
    }
    int ans = 0;
    // decode the current digit on its own
    if (s.charAt(cur) == '0') {
        return 0;
    } else {
        ans += f(s, cur + 1, dp);
        // decode the current digit together with the next one (the pair must be <= 26)
        if (cur + 1 < s.length() && ((s.charAt(cur) - '0') * 10 + (s.charAt(cur + 1) - '0')) <= 26) {
            ans += f(s, cur + 2, dp);
        }
    }
    dp[cur] = ans;
    return ans;
}

public static int numDecodings2(String s) {
    int n = s.length();
    int[] dp = new int[n + 1];
    Arrays.fill(dp, 0);
    dp[n] = 1;
    for (int cur = n - 1; cur >= 0; cur--) {
        if (s.charAt(cur) == '0') {
            dp[cur] = 0;
        } else {
            dp[cur] += dp[cur + 1];
            // take the current digit together with the next one (the pair must be <= 26)
            if (cur + 1 < n && ((s.charAt(cur) - '0') * 10 + (s.charAt(cur + 1) - '0')) <= 26) {
                dp[cur] += dp[cur + 2];
            }
        }
    }
    return dp[0];
}
```
]
Zuo also gives a space-compressed version: rolling two variables is enough to produce the result, which is quite elegant.
#code(
  caption: [Space compression],
)[
```java
// strict position dependency + space compression
public static int numDecodings3(String s) {
    // dp[i+1]
    int next = 1;
    // dp[i+2]
    int nextNext = 0;
    for (int i = s.length() - 1, cur; i >= 0; i--) {
        if (s.charAt(i) == '0') {
            cur = 0;
        } else {
            cur = next;
            if (i + 1 < s.length() && ((s.charAt(i) - '0') * 10 + s.charAt(i + 1) - '0') <= 26) {
                cur += nextNext;
            }
        }
        nextNext = next;
        next = cur;
    }
    return next;
}
```
]
=== #link("https://leetcode.cn/problems/decode-ways-ii/")[Problem 4: Decode Ways II]
A message containing letters from A-Z can be encoded into numbers using the following mapping:
```
'A' -> "1"
'B' -> "2"
...
'Z' -> "26"
```
To decode an encoded message, all the digits must be grouped and then mapped back into letters using the reverse of the mapping above (there may be multiple ways). For example, `"11106"`
can be mapped into:
- `"AAJF"` with the grouping `(1 1 10 6)`
- `"KJF"` with the grouping `(11 10 6)`
Note that the grouping `(1 11 06)` is invalid, because `"06"` cannot be mapped into `"F"`.
In addition to the mapping above, the encoded message may contain the `'*'` character, which can represent any digit from `'1'` to `'9'` (`'0'` excluded). For example, the encoded message `"1*"` may represent any of the messages `"11"`, `"12"`, `"13"`, `"14"`, `"15"`, `"16"`, `"17"`, `"18"` or `"19"`; decoding `"1*"` counts the decodings of all of those messages.
Given a string `s` consisting of digits and `'*'` characters, return the number of ways to decode it.
Since the answer may be very large, return it modulo `10^9 + 7`.
==== Solution
#code(
  caption: [Solution],
)[
```java
public static int numDecodings(String s) {
    long[] dp = new long[s.length() + 1];
    Arrays.fill(dp, -1);
    long ans = f(s, 0, dp);
    return (int) ans;
}

public static long MOD = 1000000007;

// number of ways to decode s[cur...]
public static long f(String s, int cur, long[] dp) {
    if (cur == s.length()) {
        return 1;
    }
    if (dp[cur] != -1) {
        return dp[cur];
    }
    long ans = 0;
    if (s.charAt(cur) == '0') {
        return 0;
    } else {
        // the current character is not '0'; the cases are:
        // take only the current position (a digit or '*'), or
        // take the current position together with the next one,
        //   where the first is 1/2 or '*', and the second is 0-6, 7-9 or '*'
        int cur_char = s.charAt(cur) - '0';
        if (cur_char > 0 && cur_char < 10) {
            // a plain digit
            ans += f(s, cur + 1, dp);
        } else {
            // '*': stands for any of 1..9
            ans += 9 * f(s, cur + 1, dp);
        }
        // take the current position together with the next one
        boolean oneOr2 = cur_char == 1 || cur_char == 2;
        boolean curStar = cur_char == ('*' - '0');
        if (cur + 1 < s.length()) {
            // first position is 1/2, second is 0-6 / 7-9 / '*'
            int next_char = s.charAt(cur + 1) - '0';
            boolean isNum = next_char >= 0 && next_char <= 9;
            boolean lessThan6 = isNum && next_char <= 6;
            boolean nextStar = next_char == ('*' - '0');
            if (oneOr2) {
                if ((cur_char == 2 && lessThan6)
                        || (cur_char == 1 && isNum)) {
                    ans += f(s, cur + 2, dp);
                } else if (nextStar) {
                    if (cur_char == 1) {
                        // "1*" stands for 11..19
                        ans += 9 * f(s, cur + 2, dp);
                    } else if (cur_char == 2) {
                        // "2*" stands for 21..26
                        ans += 6 * f(s, cur + 2, dp);
                    }
                }
            // first position is '*' (standing for 1 or 2), second is a digit or '*'
            } else if (curStar) {
                if (isNum) {
                    if (lessThan6) {
                        ans += 2 * f(s, cur + 2, dp);
                    } else {
                        ans += f(s, cur + 2, dp);
                    }
                } else if (nextStar) {
                    // "**" stands for 11..19 and 21..26
                    ans += 15 * f(s, cur + 2, dp);
                }
            }
        }
    }
    ans = (ans + MOD) % MOD;
    dp[cur] = ans;
    return ans;
}

public static int numDecodings2(String s) {
    int n = s.length();
    long[] dp = new long[n + 1];
    Arrays.fill(dp, 0);
    dp[n] = 1;
    for (int cur = n - 1; cur >= 0; cur--) {
        if (s.charAt(cur) == '0') {
            dp[cur] = 0;
        } else {
            // same case analysis as above:
            // take only the current position (a digit or '*'),
            // or take it together with the next one
            int cur_char = s.charAt(cur) - '0';
            if (cur_char > 0 && cur_char < 10) {
                // a plain digit
                dp[cur] += dp[cur + 1];
            } else {
                // '*'
                dp[cur] += 9 * dp[cur + 1];
            }
            boolean oneOr2 = cur_char == 1 || cur_char == 2;
            boolean curStar = cur_char == ('*' - '0');
            if (cur + 1 < s.length()) {
                // first position is 1/2, second is 0-6 / 7-9 / '*'
                int next_char = s.charAt(cur + 1) - '0';
                boolean isNum = next_char >= 0 && next_char <= 9;
                boolean lessThan6 = isNum && next_char <= 6;
                boolean nextStar = next_char == ('*' - '0');
                if (oneOr2) {
                    if ((cur_char == 2 && lessThan6)
                            || (cur_char == 1 && isNum)) {
                        dp[cur] += dp[cur + 2];
                    } else if (nextStar) {
                        if (cur_char == 1) {
                            dp[cur] += 9 * dp[cur + 2];
                        } else if (cur_char == 2) {
                            dp[cur] += 6 * dp[cur + 2];
                        }
                    }
                // first position is '*' (standing for 1 or 2), second is a digit or '*'
                } else if (curStar) {
                    if (isNum) {
                        if (lessThan6) {
                            dp[cur] += 2 * dp[cur + 2];
                        } else {
                            dp[cur] += dp[cur + 2];
                        }
                    } else if (nextStar) {
                        dp[cur] += 15 * dp[cur + 2];
                    }
                }
            }
        }
        dp[cur] = (dp[cur] + MOD) % MOD;
    }
    return (int) dp[0];
}
```
]
Rolling variable updates can again be applied here for space compression.
=== #link(
  "https://leetcode.cn/problems/ugly-number-ii/description/",
)[Problem 5: Ugly Number II]
#tip(
  "Tip",
)[
  Once the conversion from recursion to dynamic programming is familiar, problems can be analyzed purely from the dynamic-programming point of view. Problems 5 through 8 are analyzed and optimized purely in that way.
]
#definition(
  "Definition",
)[
  An *ugly number* is a positive integer whose prime factors are limited to `2`, `3` and `5`. Given an integer `n`, find and return the `n`-th
  ugly number.
]
#example("Example")[
  - Input: `n = 10`
  - Output: `12`
  - Explanation: `[1, 2, 3, 4, 5, 6, 8, 9, 10, 12]` is the sequence of the first 10 ugly numbers.
]
#example("Example")[
  - Input: `n = 1`
  - Output: `1`
  - Explanation: `1` is typically treated as an ugly number.
]
==== Solution
Method 1 (most brute force): enumerate the natural numbers one by one and test each of them.

Method 2 (less brute force): every ugly number is some earlier ugly number times 2, 3 or 5. Starting from 1,
multiply every known ugly number by 2, 3 and 5 and take the smallest candidate not yet chosen. In order: 1 -> {2, 3, 5}, the smallest is 2; 2 -> {4, 6, 10},
and excluding 2 the smallest remaining candidate is 3; 3 -> {6, 9, 15}, and excluding 2 and 3 the smallest remaining is 4; and so on.
The decision rule of method 2 is: among all candidates, take the smallest one that is larger than every ugly number found so far.

Method 3 (optimal): start from 1 and keep three pointers, one for ×2, one for ×3 and one for ×5. Take the smallest of the three products as the next ugly number,
then advance the pointer(s) that produced it to the next ugly number.
Initially all three pointers sit at position 1: among p1(×2 = 2), p2(×3 = 3), p3(×5 = 5) the smallest product is 2, so 2 is the next ugly number; pointer
p1 no longer needs to sit at position 1 and moves to position 2. Now among p1(×2 = 4), p2(×3 = 3), p3(×5 = 5)
the smallest product is 3, so 3 is next, and p2 moves to position 2, and so on.
The key question to ask is which candidates can no longer become the next ugly number once a pointer has moved past them.
#code(caption: [Solution])[
```java
public static int nthUglyNumber(int n) {
    int[] uglyNums = new int[n + 1];
    uglyNums[1] = 1;
    // p2, p3, p5 are the positions the *2, *3, *5 pointers currently sit at
    int p2 = 1, p3 = 1, p5 = 1;
    for (int i = 2; i <= n; i++) {
        int a = uglyNums[p2] * 2;
        int b = uglyNums[p3] * 3;
        int c = uglyNums[p5] * 5;
        // the current ugly number
        int cur = Math.min(Math.min(a, b), c);
        if (cur == a) {
            ++p2;
        }
        if (cur == b) {
            ++p3;
        }
        if (cur == c) {
            ++p5;
        }
        uglyNums[i] = cur;
    }
    return uglyNums[n];
}
```
]
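Each of the three pointers only ever moves forward across the `uglyNums` array, so all pointer movements together take at most $3n$ steps; with $O(1)$ work per produced number, the three-pointer method runs in $O(n)$ time and $O(n)$ space.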
=== #link(
  "https://leetcode.cn/problems/longest-valid-parentheses/description/",
)[Problem 6: Longest Valid Parentheses]
Given a string containing just the characters `'('` and `')'`, find the length of the longest valid (well-formed and contiguous) parentheses
substring.
#example("Example")[
  - Input: `s = "(()"`
  - Output: `2`
  - Explanation: the longest valid parentheses substring is `"()"`
]
==== Solution
#code(caption: [Solution])[
```java
public static int longestValidParentheses(String str) {
    char[] s = str.toCharArray();
    // dp[0...n-1]
    // dp[i] : length of the longest valid substring that ends exactly at position i
    int[] dp = new int[s.length];
    if (s.length > 0) {
        dp[0] = 0;
    }
    int ans = 0;
    for (int cur = 1; cur < dp.length; cur++) {
        if (s[cur] == ')') {
            int tmp = cur - dp[cur - 1] - 1;
            if (tmp >= 0 && s[tmp] == '(') {
                // one extra hop to the left is enough; if tmp lands exactly
                // on the boundary there is nothing further to add
                dp[cur] = dp[cur - 1] + 2 + (tmp > 0 ? dp[tmp - 1] : 0);
            } else {
                dp[cur] = 0;
            }
        } else {
            dp[cur] = 0;
        }
        ans = Math.max(dp[cur], ans);
    }
    return ans;
}
```
]
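Tracing the example `s = "(()"`: `dp[0] = 0`; index 1 holds `'('`, so `dp[1] = 0`; at index 2 (`')'`), `tmp = 2 - dp[1] - 1 = 1` and `s[1] == '('`, so `dp[2] = dp[1] + 2 + dp[0] = 2`, which is the answer.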
=== #link(
  "https://leetcode.cn/problems/unique-substrings-in-wraparound-string/description/",
)[Problem 7: Unique Substrings in Wraparound String]
Define the string `base` to be the infinite wraparound string of `"abcdefghijklmnopqrstuvwxyz"`, so `base` looks like this: `"...zabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcd...."`.
Given a string `s`, count and return the number of unique non-empty substrings of `s` that appear in `base`.
#example(
  "Example",
)[
  - Input: `s = "zab"`
  - Output: `6`
  - Explanation: six
    substrings of `s` (`"z"`, `"a"`, `"b"`, `"za"`, `"ab"`, and `"zab"`) appear in `base`.
]
==== Solution
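The notes leave this solution empty, so what follows is a sketch of the standard dynamic-programming approach for this problem (a reconstruction, not Zuo's original code): for each ending letter, record the length of the longest run of `s` ending at the current position that is contiguous in the wraparound alphabet. A qualifying substring is fully determined by its last letter together with its length, so summing the 26 per-letter maxima counts each distinct substring exactly once.
#code(caption: [Solution (sketch)])[
```java
public static int findSubstringInWraproundString(String s) {
    char[] cs = s.toCharArray();
    // dp[c] : length of the longest wraparound-consecutive run in s
    //         that ends with the letter 'a' + c
    int[] dp = new int[26];
    int len = 0;
    for (int i = 0; i < cs.length; i++) {
        // cs[i] extends the run iff it is the wraparound successor of cs[i-1]
        if (i > 0 && (cs[i] - cs[i - 1] == 1 || cs[i - 1] - cs[i] == 25)) {
            len++;
        } else {
            len = 1;
        }
        dp[cs[i] - 'a'] = Math.max(dp[cs[i] - 'a'], len);
    }
    int ans = 0;
    for (int cnt : dp) {
        ans += cnt;
    }
    return ans;
}
```
]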
=== Summary
Knowing how to compute an algorithm vs. knowing how to try:

When a recursion, during expansion, repeatedly calls the same subproblem, that repeated-call recursion benefits from dynamic programming.
If every expansion is a different subproblem, or repeated calls are rare, there is no point in converting to dynamic programming.

Every dynamic-programming problem corresponds to some recursion with repeated calls,
so every dynamic-programming problem can be approached starting from recursion, gradually arriving at the dynamic-programming method.

The *trying strategy* is exactly the
*transition equation*, just written differently. Starting from a *trying* formulation is recommended, because the code is easy to write, and once a wrong try is discovered, the cost of rethinking the recursion is low.

The rough pipeline of dynamic programming:

Design a good recursive try (methods, experience and fixed patterns matter a lot; notes on the order of expansion come later) ->
memoized search (top-down dynamic programming);
if the enumeration cost per state is low, one can often stop here ->
strict position-dependency dynamic programming (bottom-up dynamic programming), mostly as preparation for the step below ->
further space optimization (space compression), which exists for 1-D, 2-D and higher-dimensional dynamic programming ->
further enumeration optimization, i.e. time optimization (not covered in this session, but tied to a great deal of later material).

Solving one problem may admit many trying strategies;
among the many strategies, several may exhibit repeated calls and can be converted to dynamic programming;
among the convertible ones, some are better and some are worse.
Deciding which dynamic program is optimal depends on the concrete data scale given by the problem;
once the optimal method is implemented, a whole family of follow-up optimization tricks may still apply.
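To make the pipeline concrete, here is a tiny end-to-end illustration on Fibonacci (not one of this chapter's problems; it is used only because it is the smallest recursion with repeated subproblem calls): the same function as a memoized search, then as a strict bottom-up table, then space-compressed.
#code(caption: [The pipeline on a toy example])[
```java
// 1) memoized search (top-down dynamic programming);
//    call with a dp array of length n + 1 filled with -1
public static long fib1(int n, long[] dp) {
    if (n <= 1) {
        return n;
    }
    if (dp[n] != -1) {
        return dp[n];
    }
    long ans = fib1(n - 1, dp) + fib1(n - 2, dp);
    dp[n] = ans;
    return ans;
}

// 2) strict position dependency (bottom-up table)
public static long fib2(int n) {
    if (n <= 1) {
        return n;
    }
    long[] dp = new long[n + 1];
    dp[1] = 1;
    for (int i = 2; i <= n; i++) {
        dp[i] = dp[i - 1] + dp[i - 2];
    }
    return dp[n];
}

// 3) space compression: each cell depends only on the previous two,
//    so two rolling variables replace the whole table
public static long fib3(int n) {
    if (n <= 1) {
        return n;
    }
    long prev = 0, cur = 1;
    for (int i = 2; i <= n; i++) {
        long next = prev + cur;
        prev = cur;
        cur = next;
    }
    return cur;
}
```
]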
== From Recursion to 2-D Dynamic Programming
- If the trying function has 1 mutable parameter that fully determines the return value, it can be turned into a 1-D dynamic-programming table.
- If the trying function has 2 mutable parameters that fully determine the return value, it can be turned into a 2-D dynamic-programming implementation.

For 1-D, 2-D, 3-D and even higher-dimensional dynamic-programming problems, the rough process is the same:
1. Write the recursive try.
2. Memoized search (top-down dynamic programming).
3. Strict position-dependency dynamic programming (bottom-up dynamic programming).
4. Further space and time optimizations.

The size of the dynamic-programming table is the product of the numbers of possible values of the mutable parameters.
The time complexity of a dynamic-programming method is the table size times the enumeration cost per cell (a worked instance appears at the end of this section).

2-D dynamic programming still requires organizing the dependency relations between the cells of the table; the dependencies are often
easier to see by drawing the table, making them more intuitive. The process is then still filling from simple cells toward complex cells,
i.e. strict position-dependency (bottom-up) dynamic programming.

The space-compression trick for 2-D dynamic programming is not hard and will come up again and again in later sessions,
but different problems have different dependency patterns, so the concrete dependencies must be drawn out carefully
before the space-compressed implementation is written.

Make sure the *mutable parameters are of simple type (no more complex than int)* and that the recursion's
*return value is fully determined by them*, guaranteeing that
these parameters fully capture the influence of all earlier decisions on the remaining process, before converting to dynamic programming.

No matter how many dimensions, it often pays to offset the recursion's definition (e.g. defining states by prefix length) to avoid lots of boundary-case discussion later;
anticipating this takes some experience.
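For instance, in the longest-common-subsequence problem below, the two mutable parameters are prefix lengths taking $n + 1$ and $m + 1$ possible values, and each cell is filled with $O(1)$ work, so the table has $(n + 1)(m + 1)$ cells and the whole method runs in $O(n m)$ time.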
=== #link(
  "https://leetcode.cn/problems/minimum-path-sum/description/",
)[Problem 1: Minimum Path Sum]
Given an `m x n` grid `grid` filled with non-negative integers, find a path from the top-left corner to the bottom-right corner that minimizes the sum of the numbers along the path.
Note: you can only move either one step down or one step right at a time.
#code(caption: [ é¢ç® 1 : æå°è·¯åŸå ])[
```java
public int minPathSum(int[][] grid) {
// æŽåéåœ
public static int minPathSum1(int[][] grid) {
return f1(grid, grid.length - 1, grid[0].length - 1);
}
// ä»(0,0)å°(i,j)æå°è·¯åŸå
// äžå®æ¯æ¬¡åªèœå峿è
åäž
public static int f1(int[][] grid, int i, int j) {
if (i == 0 && j == 0) {
return grid[0][0];
}
int up = Integer.MAX_VALUE;
int left = Integer.MAX_VALUE;
if (i - 1 >= 0) {
up = f1(grid, i - 1, j);
}
if (j - 1 >= 0) {
left = f1(grid, i, j - 1);
}
return grid[i][j] + Math.min(up, left);
}
// è®°å¿åæçŽ¢
public static int minPathSum2(int[][] grid) {
int n = grid.length;
int m = grid[0].length;
int[][] dp = new int[n][m];
for (int i = 0; i < n; i++) {
for (int j = 0; j < m; j++) {
dp[i][j] = -1;
}
}
return f2(grid, grid.length - 1, grid[0].length - 1, dp);
}
public static int f2(int[][] grid, int i, int j, int[][] dp) {
if (dp[i][j] != -1) {
return dp[i][j];
}
int ans;
if (i == 0 && j == 0) {
ans = grid[0][0];
} else {
int up = Integer.MAX_VALUE;
int left = Integer.MAX_VALUE;
if (i - 1 >= 0) {
up = f2(grid, i - 1, j, dp);
}
if (j - 1 >= 0) {
left = f2(grid, i, j - 1, dp);
}
ans = grid[i][j] + Math.min(up, left);
}
dp[i][j] = ans;
return ans;
}
// äž¥æ Œäœçœ®äŸèµçåšæè§å
public static int minPathSum3(int[][] grid) {
int n = grid.length;
int m = grid[0].length;
int[][] dp = new int[n][m];
dp[0][0] = grid[0][0];
for (int i = 1; i < n; i++) {
dp[i][0] = dp[i - 1][0] + grid[i][0];
}
for (int j = 1; j < m; j++) {
dp[0][j] = dp[0][j - 1] + grid[0][j];
}
for (int i = 1; i < n; i++) {
for (int j = 1; j < m; j++) {
dp[i][j] = Math.min(dp[i - 1][j], dp[i][j - 1]) + grid[i][j];
}
}
return dp[n - 1][m - 1];
}
// äž¥æ Œäœçœ®äŸèµçåšæè§å + 空éŽå猩æå·§
public static int minPathSum4(int[][] grid) {
int n = grid.length;
int m = grid[0].length;
// å
让dp衚ïŒåææ³è±¡äžç衚ç第0è¡çæ°æ®
int[] dp = new int[m];
dp[0] = grid[0][0];
for (int j = 1; j < m; j++) {
dp[j] = dp[j - 1] + grid[0][j];
}
for (int i = 1; i < n; i++) {
// i = 1ïŒdpè¡šåææ³è±¡äžäºç»Žè¡šç第1è¡çæ°æ®
// i = 2ïŒdpè¡šåææ³è±¡äžäºç»Žè¡šç第2è¡çæ°æ®
// i = 3ïŒdpè¡šåææ³è±¡äžäºç»Žè¡šç第3è¡çæ°æ®
// ...
// i = n-1ïŒdpè¡šåææ³è±¡äžäºç»Žè¡šç第n-1è¡çæ°æ®
dp[0] += grid[i][0];
for (int j = 1; j < m; j++) {
dp[j] = Math.min(dp[j - 1], dp[j]) + grid[i][j];
}
}
return dp[m - 1];
}
}
```
]
#tip("Tip")[
æ¹æåšæè§åçæ¶å, å¹¶äžèœæ`dp`éœåå§å䞺-1
]
#tip(
"Tip",
)[
èœæ¹æåšæè§åçéåœïŒç»äžç¹åŸïŒ
- *å³å®è¿ååŒçå¯ååæ°ç±»ååŸåŸéœæ¯èŸç®åïŒäžè¬äžäŒæ¯ int ç±»åæŽå€æã䞺ä»ä¹ïŒ*
- ä»è¿äžªè§åºŠïŒå¯ä»¥è§£é
*垊路åŸçéåœïŒå¯ååæ°ç±»å倿ïŒïŒäžéåæè
诎没æå¿
èŠæ¹æåšæè§å* é¢ç® 2
å°±æ¯è¯Žæè¿äžç¹ç
]
=== #link(
  "https://leetcode.cn/problems/word-search/description/",
)[Problem 2: Word Search]
Given an `m x n` grid of characters `board` and a string `word`, return `true` if `word` exists in the grid and `false` otherwise.
The word must be constructed from letters of sequentially adjacent cells, where "adjacent" cells are horizontally or vertically neighboring. The same cell may not be used more than once.
#tip("Tip")[
  Problems of this kind usually come with small data sizes.
]
#code(
  caption: [Problem 2: Word Search],
)[
```java
public static boolean exist(char[][] board, String word) {
    char[] w = word.toCharArray();
    int m = board.length;
    int n = board[0].length;
    for (int i = 0; i < m; i++) {
        for (int j = 0; j < n; j++) {
            if (f(board, w, i, j, 0)) {
                return true;
            }
        }
    }
    return false;
}

// standing at (i, j) and trying to match w[k]: can w[k...] be matched from here?
public static boolean f(char[][] board, char[] w, int i, int j, int k) {
    // the whole word has been matched
    if (k == w.length) {
        return true;
    }
    // out of bounds or mismatch
    if (i < 0 || i == board.length || j < 0 || j == board[0].length || board[i][j] != w[k]) {
        return false;
    }
    // in bounds and board[i][j] == w[k]; mark the cell so it is not reused
    char tmp = board[i][j];
    board[i][j] = 0;
    boolean ans = f(board, w, i + 1, j, k + 1)
            || f(board, w, i - 1, j, k + 1)
            || f(board, w, i, j + 1, k + 1)
            || f(board, w, i, j - 1, k + 1);
    board[i][j] = tmp;
    return ans;
}
```
]
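Because a visited cell is temporarily overwritten with `0`, no path ever reuses a cell, and after the first character each step has at most 3 useful directions (the fourth leads straight back to the just-cleared cell). The worst-case running time is therefore on the order of $O(m dot n dot 3^k)$ for a word of length $k$, which is workable only because, as the tip above notes, such problems come with small data sizes.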
=== #link(
  "https://leetcode.cn/problems/longest-common-subsequence/description/",
)[Problem 3: Longest Common Subsequence]
Given two strings `text1` and `text2`, return the length of their longest
common subsequence.
If there is no common subsequence, return `0`.
A subsequence of a string is a new string generated from the original string with some characters (possibly none) deleted, without changing the relative order of the remaining characters.
For example, `"ace"` is a subsequence of `"abcde"`, but `"aec"` is not.
A common subsequence of two strings is a subsequence that both strings share.
#tip("Tip")[
  - Input: `text1 = "abcde"`, `text2 = "ace"`
  - Output: `3`
  - Explanation: the longest common subsequence is `"ace"`, whose length is `3`.
]
==== Solution
Enumerating the possibilities is the key to dynamic programming. One very common model: with the prefix lengths fixed,
discuss the possibilities according to the last characters. (This is worth remembering as
a recurring pattern.)
#code(
  caption: [Problem 3: Longest Common Subsequence],
)[
```java
public class Code03_LongestCommonSubsequence {
    public static int longestCommonSubsequence(String text1, String text2) {
        char[] t1 = text1.toCharArray();
        char[] t2 = text2.toCharArray();
        return f(t1, t2, t1.length - 1, t2.length - 1);
    }

    // length of the longest common subsequence of t1[0..i1] and t2[0..i2]
    public static int f(char[] t1, char[] t2, int i1, int i2) {
        if (i1 < 0 || i2 < 0) {
            return 0;
        }
        // the common subsequence uses neither ending character
        int p1 = f(t1, t2, i1 - 1, i2 - 1);
        // use one ending character but not the other
        int p2 = f(t1, t2, i1, i2 - 1);
        int p3 = f(t1, t2, i1 - 1, i2);
        // use both ending characters (only possible if they are equal)
        int p4 = t1[i1] == t2[i2] ? (p1 + 1) : 0;
        return Math.max(Math.max(p1, p2), Math.max(p3, p4));
    }

    // to avoid lots of boundary discussion, it is often better not to define
    // the try by indices but by prefix lengths,
    // because a length can naturally be 0, while out-of-range
    // indices are awkward to discuss
    public static int longestCommonSubsequence2(String str1, String str2) {
        char[] s1 = str1.toCharArray();
        char[] s2 = str2.toCharArray();
        int n = s1.length;
        int m = s2.length;
        return f2(s1, s2, n, m);
    }

    // longest common subsequence length of
    // the length-len1 prefix of s1 and the length-len2 prefix of s2
    public static int f2(char[] s1, char[] s2, int len1, int len2) {
        if (len1 == 0 || len2 == 0) {
            return 0;
        }
        int ans;
        if (s1[len1 - 1] == s2[len2 - 1]) {
            ans = f2(s1, s2, len1 - 1, len2 - 1) + 1;
        } else {
            ans = Math.max(f2(s1, s2, len1 - 1, len2), f2(s1, s2, len1, len2 - 1));
        }
        return ans;
    }

    // memoized search
    public static int longestCommonSubsequence3(String str1, String str2) {
        char[] s1 = str1.toCharArray();
        char[] s2 = str2.toCharArray();
        int n = s1.length;
        int m = s2.length;
        int[][] dp = new int[n + 1][m + 1];
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= m; j++) {
                dp[i][j] = -1;
            }
        }
        return f3(s1, s2, n, m, dp);
    }

    public static int f3(char[] s1, char[] s2, int len1, int len2, int[][] dp) {
        if (len1 == 0 || len2 == 0) {
            return 0;
        }
        if (dp[len1][len2] != -1) {
            return dp[len1][len2];
        }
        int ans;
        if (s1[len1 - 1] == s2[len2 - 1]) {
            ans = f3(s1, s2, len1 - 1, len2 - 1, dp) + 1;
        } else {
            ans = Math.max(f3(s1, s2, len1 - 1, len2, dp), f3(s1, s2, len1, len2 - 1, dp));
        }
        dp[len1][len2] = ans;
        return ans;
    }

    // strict position-dependency dynamic programming
    public static int longestCommonSubsequence4(String str1, String str2) {
        char[] s1 = str1.toCharArray();
        char[] s2 = str2.toCharArray();
        int n = s1.length;
        int m = s2.length;
        int[][] dp = new int[n + 1][m + 1];
        for (int len1 = 1; len1 <= n; len1++) {
            for (int len2 = 1; len2 <= m; len2++) {
                if (s1[len1 - 1] == s2[len2 - 1]) {
                    dp[len1][len2] = 1 + dp[len1 - 1][len2 - 1];
                } else {
                    dp[len1][len2] = Math.max(dp[len1 - 1][len2], dp[len1][len2 - 1]);
                }
            }
        }
        return dp[n][m];
    }

    // strict position-dependency dynamic programming + space compression
    public static int longestCommonSubsequence5(String str1, String str2) {
        char[] s1, s2;
        if (str1.length() >= str2.length()) {
            s1 = str1.toCharArray();
            s2 = str2.toCharArray();
        } else {
            s1 = str2.toCharArray();
            s2 = str1.toCharArray();
        }
        int n = s1.length;
        int m = s2.length;
        int[] dp = new int[m + 1];
        for (int len1 = 1; len1 <= n; len1++) {
            int leftUp = 0, backup;
            for (int len2 = 1; len2 <= m; len2++) {
                backup = dp[len2];
                if (s1[len1 - 1] == s2[len2 - 1]) {
                    dp[len2] = 1 + leftUp;
                } else {
                    dp[len2] = Math.max(dp[len2], dp[len2 - 1]);
                }
                leftUp = backup;
            }
        }
        return dp[m];
    }
}
```
]
=== #link(
  "https://leetcode.cn/problems/longest-palindromic-subsequence/description/",
)[Problem 4: Longest Palindromic Subsequence]
#definition(
  "Definition",
)[
  A subsequence is a sequence formed by deleting some characters (possibly none) without changing the order of the remaining characters.
]
Given a string `s`, find the longest palindromic
subsequence in it and return that subsequence's length.
#example("Example")[
  - Input: `s = "bbbab"`
  - Output: `4`
  - Explanation: one possible longest palindromic subsequence is `"bbbb"`.
]
#example("Example")[
  - Input: `s = "cbbd"`
  - Output: `2`
  - Explanation: one possible longest palindromic subsequence is `"bb"`.
]
#tip("Tip")[
  - 1 <= `s.length` <= 1000
  - `s` consists only of lowercase English letters
]
#code(caption: [é¢ç®4: æé¿åæååºå])[
```java
public class Code04_LongestPalindromicSubsequence {
public static int longestPalindromeSubseq1(String str) {
char[] s = str.toCharArray();
int n = s.length;
int[][] dp = new int[n][n];
int ans = f1(s, 0, n - 1, dp);
return ans;
}
// l...r æé¿åæååºå
public static int f1(char[] s, int l, int r, int[][] dp) {
if (l == r) {
return 1;
}
if (l + 1 == r) {
return s[l] == s[r] ? 2 : 1;
}
if (dp[l][r] != 0) {
return dp[l][r];
}
int ans;
if (s[l] == s[r]) {
ans = 2 + f(s, l + 1, r - 1, dp);
} else {
ans = Math.max(f(s, l + 1, r, dp), f(s, l, r - 1, dp));
}
dp[l][r] = ans;
return ans;
}
// äž¥æ Œè¡šç»æ
public static int longestPalindromeSubseq3(String str) {
char[] s = str.toCharArray();
int n = s.length;
int[][] dp = new int[n][n];
// l䞺è¡ïŒr䞺å -> r >= l
for (int l = n - 1; l >= 0; l--) {
// l=r 对è§çº¿æ¯1
dp[l][l] = 1;
// l+1 = r
if (l + 1 < n) {
dp[l][l + 1] = s[l] == s[l + 1] ? 2 : 1;
}
for (int r = l + 2; r < n; r++) {
if (s[l] == s[r]) {
dp[l][r] = 2 + dp[l - 1][r - 1];
} else {
dp[l][r] = Math.max(dp[l - 1][r], dp[l][r - 1]);
}
}
}
return dp[0][n - 1];
}
}
```
]
=== #link(
  "https://www.nowcoder.com/practice/aaefe5896cce4204b276e213e725f3ea",
)[Problem 5: Counting Binary Tree Structures]
#definition("Definition")[
  The height of a tree is defined as the maximum number of nodes on any root-to-leaf path.
]
Xiaoqiang has `n` nodes and wants you to count how many structurally different binary trees have exactly `n` nodes and height at most `m`. Since the answer can be huge, output it modulo `1e9+7`.
- Input: two positive integers `n` and `m`, with `1 <= m <= n <= 50`.
- Output: a single number, the count of valid tree structures.
#example("Example")[
  - Input: `3 3`
  - Output: `5`
]
#code(
caption: [é¢ç®5: äºåæ çç»ææ°],
)[
```java
public class Code05_NodenHeightNotLargerThanm {
public static int MAXN = 51;
public static int MAXM = 51;
public static int MOD = 1000000007;
public static long[][] dp = new long[MAXN][MAXM];
public static void main(String[] args) throws IOException {
BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
StreamTokenizer in = new StreamTokenizer(br);
PrintWriter out = new PrintWriter(new OutputStreamWriter(System.out));
while (in.nextToken() != StreamTokenizer.TT_EOF) {
int n = (int) in.nval;
in.nextToken();
int m = (int) in.nval;
for (int i = 0; i < n+1; i++) {
for (int j = 0; j < m+1; j++) {
dp[i][j] = -1;
}
}
out.println(compute(n, m));
}
out.flush();
out.close();
br.close();
}
// n䞪èç¹ïŒé«åºŠäžè¶
è¿m
// è®°å¿åæçŽ¢
public static int compute(int n, int m) {
if (n == 0) {
return 1;
}
// n>0
if (m == 0) {
return 0;
}
if (dp[n][m] != -1) {
return (int) dp[n][m];
}
long ans = 0;
// left å 䞪 right å 䞪 å€Žå æäžäžª
for (int i = 0; i < n; i++) {
int left = compute(i, m - 1) % MOD;
int right = compute(n - i - 1, m - 1) % MOD;
ans = (ans + ((long) left * right) % MOD) % MOD;
}
dp[n][m] = ans;
return (int) ans;
}
}
```
]
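The recurrence being memoized is
$ "count"(n, m) = sum_(i=0)^(n-1) "count"(i, m-1) dot "count"(n-1-i, m-1), $
with $"count"(0, m) = 1$ and $"count"(n, 0) = 0$ for $n > 0$. For the sample input `3 3`, every shape of a 3-node binary tree has height at most 3, so the answer is the Catalan number $C_3 = 5$.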
=== #link(
  "https://leetcode.cn/problems/longest-increasing-path-in-a-matrix/",
)[Problem 6: Longest Increasing Path in a Matrix]
Given an `m x n` integer matrix `matrix`, find the length of the longest strictly increasing path in it.
From each cell you can move in four directions: up, down, left or right. You may not move diagonally or outside the boundary (no wrap-around).
#example("Example")[
  - Input:
    ```
    matrix =
    [[9,9,4],
    [6,6,8],
    [2,1,1]]
    ```
  - Output: `4`
  - Explanation: the longest increasing path is `[1, 2, 6, 9]`.
]
#tip("Tip")[
  - m == matrix.length
  - n == matrix[i].length
  - 1 <= m, n <= 200
  - 0 <= matrix[i][j] <= 2^31 - 1
]
==== Solution
#code(caption: [Problem 6: Longest Increasing Path in a Matrix])[
```java
public class Code06_LongestIncreasingPath {
    // brute-force recursion
    public static int longestIncreasingPath1(int[][] grid) {
        int n = grid.length;
        int m = grid[0].length;
        int max = Integer.MIN_VALUE;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < m; j++) {
                max = Math.max(max, f1(grid, i, j));
            }
        }
        return max;
    }

    // length of the longest increasing path starting from (i, j)
    public static int f1(int[][] grid, int i, int j) {
        int next = 0;
        if (i > 0 && grid[i][j] < grid[i - 1][j]) {
            next = Math.max(next, f1(grid, i - 1, j));
        }
        if (i + 1 < grid.length && grid[i][j] < grid[i + 1][j]) {
            next = Math.max(next, f1(grid, i + 1, j));
        }
        if (j > 0 && grid[i][j] < grid[i][j - 1]) {
            next = Math.max(next, f1(grid, i, j - 1));
        }
        if (j + 1 < grid[0].length && grid[i][j] < grid[i][j + 1]) {
            next = Math.max(next, f1(grid, i, j + 1));
        }
        return next + 1;
    }

    // memoized search
    public static int longestIncreasingPath2(int[][] grid) {
        int n = grid.length;
        int m = grid[0].length;
        int[][] dp = new int[n][m];
        int ans = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < m; j++) {
                ans = Math.max(ans, f2(grid, i, j, dp));
            }
        }
        return ans;
    }

    public static int f2(int[][] grid, int i, int j, int[][] dp) {
        if (dp[i][j] != 0) {
            return dp[i][j];
        }
        int next = 0;
        if (i > 0 && grid[i][j] < grid[i - 1][j]) {
            next = Math.max(next, f2(grid, i - 1, j, dp));
        }
        if (i + 1 < grid.length && grid[i][j] < grid[i + 1][j]) {
            next = Math.max(next, f2(grid, i + 1, j, dp));
        }
        if (j > 0 && grid[i][j] < grid[i][j - 1]) {
            next = Math.max(next, f2(grid, i, j - 1, dp));
        }
        if (j + 1 < grid[0].length && grid[i][j] < grid[i][j + 1]) {
            next = Math.max(next, f2(grid, i, j + 1, dp));
        }
        dp[i][j] = next + 1;
        return next + 1;
    }
}
```
]
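Memoization makes this $O(n m)$: the path is strictly increasing, so no cycle can occur and no separate "visited" bookkeeping is needed; each cell's answer is computed once, and each computation only inspects up to four neighbors.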
=== #link(
  "https://leetcode.cn/problems/distinct-subsequences/description/",
)[Problem 7: Distinct Subsequences]
Given two strings `s` and `t`, count the number of distinct subsequences of `s` that equal `t`, and return the result modulo `10^9 + 7`.
#example("Example")[
  - Input: `s = "rabbbit", t = "rabbit"`
  - Output: `3`
  - Explanation:
    - As shown below, there are 3 ways to get `"rabbit"` from `s`:
    - *rabb* b *it*
    - *rab* b *bit*
    - *ra* b *bbit*
]
#code(
caption: [é¢ç®7: äžåçååºå],
)[
```java
public class Code01_DistinctSubsequences {
// çŽæ¥åšæè§å
public static int numDistinct(String s, String t) {
int MOD = 1000000007;
char[] s1 = s.toCharArray();
char[] s2 = t.toCharArray();
int len1 = s1.length;
int len2 = s2.length;
// 衚瀺åçŒé¿åºŠ(䜿çšäžæ äŒæåŸå€èŸ¹çæ
åµç讚论)
int[][] dp = new int[len1 + 1][len2 + 1];
// å°è¯: s1[...p1] å·²ç»å¹é
äºx䞪 s2[...p2]
// s1[0] -> s2[0] = 1 s2[1....] 0
// s2[0] -> s1[0] = 1 s1[1....] 1 (ååºåçæç空é)
for (int i = 0; i < dp.length; i++) {
dp[i][0] = 1;
}
// æç
§æ«å°Ÿè®šè®ºå¯èœæ§,s1[...p1] s2[...p2]
// äžèŠs1çæåäžäžª, å°±èŠs1[...p1-1] s2[...p2]
// å¯¹åº dp[p1-1][p2](äž)
// èŠs1çæåäžäžªïŒå°±èŠèŠæ±s1[p1] == s2[p2], æ¥äžæ¥s1[...p1-1] s2[...p2-1]
// 对åºdp[p1-1][p2-1](å·Šäž)
// 以äžäž€ç§å¯èœæ§çžå
// -------->填衚
for (int i = 1; i <= len1; i++) {
for (int j = 1; j <= len2; j++) {
int p1 = dp[i-1][j]%MOD;
// å 䞺代衚é¿åºŠæä»¥åšäžæ çæ¶åèŠ-1ïŒäžç¶äŒè¶ç
int p2 = s1[i-1]==s2[j-1]?( dp[i-1][j-1]%MOD ):0;
dp[i][j] = (p1+p2)%MOD;
}
}
return dp[len1][len2];
}
// 空éŽå猩
public static int numDistinct2(String str, String target){
int MOD = 1000000007;
char[] s1 = s.toCharArray();
char[] s2 = t.toCharArray();
int len1 = s1.length;
int len2 = s2.length;
// 衚瀺åçŒé¿åºŠ(䜿çšäžæ äŒæåŸå€èŸ¹çæ
åµç讚论)
int[] dp = new int[len2 + 1];
dp[0] = 1;
// -------->填衚
for (int i = 1; i <= len1; i++) {
int leftUp = 1, backup;
for (int j = 1; j <= len2; j++) {
backup = dp[j];
// äž
int p1 = dp[j]%MOD;
// å·Šäž
int p2 = s1[i-1]==s2[j-1]?( leftUp%MOD ):0;
// åœå
dp[j] = (p1+p2)%MOD;
leftUp = backup;
}
}
return dp[len2];
}
}
```
]
=== #link(
  "https://leetcode.cn/problems/edit-distance/description/",
)[Problem 8: Edit Distance]
Given two words `word1` and `word2`, return the minimum total cost of the operations required to convert `word1` into `word2`.
The following three operations are permitted on a word:
- insert a character (cost `a`)
- delete a character (cost `b`)
- replace a character (cost `c`)
#example("Example")[
  - Input: `word1 = "horse", word2 = "ros", a=b=c=1`
  - Output: `3`
  - Explanation:
    - horse -> rorse (replace 'h' with 'r')
    - rorse -> rose (remove 'r')
    - rose -> ros (remove 'e')
]
==== è§£ç
1. `s1[i-1]` åäžïŒ
 - `s1[i-1]` å¯¹åº `s2[j-1]`ïŒ
  1. è¥ `s1[i-1]==s2[j-1]`ïŒåªé `s1[0...i-2]` æå® `s2[0...j-2]`ïŒå³ `dp[i-1][j-1]`
  2. è¥ `s1[i-1]!=s2[j-1]`ïŒå° `s1[i-1]` çŽæ¥æ¿æ¢æ `s2[j-1]`ïŒå³ `dp[i-1][j-1]+æ¿æ¢ä»£ä»·`
 - `s1[i-1]` äžå¯¹åº `s2[j-1]`ïŒ
  - ç± `s1[...i-1]` å
æå® `s2[...j-2]`ïŒæåéè¿æå
¥æå® `s2[j-1]`ïŒå³ `dp[i][j-1]+æå
¥ä»£ä»·`
2. `s1[i-1]` äžåäžïŒçŽæ¥å é€ `s1[i-1]`ïŒå³ `dp[i-1][j]+å é€ä»£ä»·`
å®æŽçé掚åŒæŽçåšäžæ¹ã
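è®° $a, b, c$ å嫿¯æå
¥ãå é€ãæ¿æ¢ç代价ïŒå°äžè¿°åç±»åå¹¶ïŒå³æ¯äžæ¹ä»£ç äžçèœ¬ç§»æ¹çšïŒïŒ
$
dp[i][j] = cases(
min(dp[i-1][j-1], space dp[i][j-1] + a, space dp[i-1][j] + b) & s_1[i-1] = s_2[j-1],
min(dp[i-1][j-1] + c, space dp[i][j-1] + a, space dp[i-1][j] + b) space& s_1[i-1] != s_2[j-1]
)
$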
#code(caption: [é¢ç®8: çŒèŸè·çŠ»])[
```java
public class Code02_EditDistance {
public int minDistance(String word1, String word2) {
return editDistance2(word1, word2, 1, 1, 1);
}
// ååå°è¯ç
// a : str1äžæå
¥1䞪å笊ç代价
// b : str1äžå é€1䞪å笊ç代价
// c : str1äžæ¹å1䞪å笊ç代价
// è¿åä»str1蜬åæstr2çæäœä»£ä»·
public static int editDistance1(String str1, String str2, int a, int b, int c) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
int m = s2.length;
// dp[i][j] :
// s1[åçŒé¿åºŠäžºi]æ³åæs2[åçŒé¿åºŠäžºj]ïŒè³å°ä»åºå€å°ä»£ä»·
int[][] dp = new int[n + 1][m + 1];
for (int i = 1; i <= n; i++) {
dp[i][0] = i * b;
}
for (int j = 1; j <= m; j++) {
dp[0][j] = j * a;
}
for (int i = 1; i <= n; i++) {
for (int j = 1; j <= m; j++) {
int p1 = Integer.MAX_VALUE;
if (s1[i - 1] == s2[j - 1]) {
p1 = dp[i - 1][j - 1];
}
int p2 = Integer.MAX_VALUE;
if (s1[i - 1] != s2[j - 1]) {
p2 = dp[i - 1][j - 1] + c;
}
int p3 = dp[i][j - 1] + a;
int p4 = dp[i - 1][j] + b;
dp[i][j] = Math.min(Math.min(p1, p2), Math.min(p3, p4));
}
}
return dp[n][m];
}
// æäžŸå°äŒåç
public static int editDistance2(String str1, String str2, int a, int b, int c) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
int m = s2.length;
// dp[i][j] :
// s1[åçŒé¿åºŠäžºi]æ³åæs2[åçŒé¿åºŠäžºj]ïŒè³å°ä»åºå€å°ä»£ä»·
int[][] dp = new int[n + 1][m + 1];
for (int i = 1; i <= n; i++) {
dp[i][0] = i * b;
}
for (int j = 1; j <= m; j++) {
dp[0][j] = j * a;
}
for (int i = 1; i <= n; i++) {
for (int j = 1; j <= m; j++) {
if (s1[i - 1] == s2[j - 1]) {
dp[i][j] = dp[i - 1][j - 1];
} else {
dp[i][j] = Math.min(Math.min(dp[i - 1][j] + b, dp[i][j - 1] + a), dp[i - 1][j - 1] + c);
}
}
}
return dp[n][m];
}
// 空éŽå猩
public static int editDistance3(String str1, String str2, int a, int b, int c) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
int m = s2.length;
int[] dp = new int[m + 1];
for (int j = 1; j <= m; j++) {
dp[j] = j * a;
}
for (int i = 1, leftUp, backUp; i <= n; i++) {
leftUp = (i - 1) * b;
dp[0] = i * b;
for (int j = 1; j <= m; j++) {
backUp = dp[j];
if (s1[i - 1] == s2[j - 1]) {
dp[j] = leftUp;
} else {
dp[j] = Math.min(Math.min(dp[j] + b, dp[j - 1] + a), leftUp + c);
}
leftUp = backUp;
}
}
return dp[m];
}
}
```
]
== ä»éåœå
¥æäžäœåšæè§å
- å°è¯åœæ°æ1䞪å¯ååæ°å¯ä»¥å®å
šå³å®è¿ååŒïŒè¿èå¯ä»¥æ¹åº1ç»Žåšæè§å衚çå®ç°
- å°è¯åœæ°æ2䞪å¯ååæ°å¯ä»¥å®å
šå³å®è¿ååŒïŒé£ä¹å°±å¯ä»¥æ¹åº2ç»Žåšæè§åçå®ç°
- å°è¯åœæ°æ3䞪å¯ååæ°å¯ä»¥å®å
šå³å®è¿ååŒïŒé£ä¹å°±å¯ä»¥æ¹åº3ç»Žåšæè§åçå®ç°
倧äœè¿çšéœæ¯ïŒäžæç»æäžäžªæèŽæ°åçæå°ç€ºäŸïŒïŒ
+ ååºå°è¯éåœ
+ è®°å¿åæçŽ¢(ä»é¡¶å°åºçåšæè§å)
+ äž¥æ Œäœçœ®äŸèµçåšæè§å(ä»åºå°é¡¶çåšæè§å)
+ 空éŽãæ¶éŽçæŽå€äŒå
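äžé¢çšæèŽæ°åæèµ·è¿å䞪é¶æ®µãè¿æ¯äžäžªç€ºæçæå°äŸåïŒäžåé¢çé¢ç®æ å
³ïŒåªçšæ¥å±ç€ºæµçšæ¬èº«ïŒïŒ
#code(caption: [ä»éåœå°ç©ºéŽäŒåïŒæèŽæ°å瀺äŸ])[
```java
public class FibStages {
    // 1. ååºå°è¯éåœ
    public static int f1(int n) {
        if (n <= 1) {
            return n;
        }
        return f1(n - 1) + f1(n - 2);
    }
    // 2. è®°å¿åæçŽ¢(ä»é¡¶å°åº)ïŒè°çšåå° dp å
šéšåå§å䞺 -1
    public static int f2(int n, int[] dp) {
        if (n <= 1) {
            return n;
        }
        if (dp[n] != -1) {
            return dp[n];
        }
        dp[n] = f2(n - 1, dp) + f2(n - 2, dp);
        return dp[n];
    }
    // 3. äž¥æ Œäœçœ®äŸèµçåšæè§å(ä»åºå°é¡¶)
    public static int f3(int n) {
        if (n <= 1) {
            return n;
        }
        int[] dp = new int[n + 1];
        dp[1] = 1;
        for (int i = 2; i <= n; i++) {
            dp[i] = dp[i - 1] + dp[i - 2];
        }
        return dp[n];
    }
    // 4. 空éŽäŒåïŒåªä¿çæ»åšçäž€äžªåé
    public static int f4(int n) {
        if (n <= 1) {
            return n;
        }
        int prev = 0, cur = 1;
        for (int i = 2; i <= n; i++) {
            int next = prev + cur;
            prev = cur;
            cur = next;
        }
        return cur;
    }
}
```
]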
=== #link("https://leetcode.cn/problems/ones-and-zeroes/")[é¢ç®1: äžåé¶]
ç»äœ äžäžªäºè¿å¶å笊䞲æ°ç» `strs` åäž€äžªæŽæ° `m` å `n` ã
è¯·äœ æŸåºå¹¶è¿å `strs` çæå€§åéçé¿åºŠïŒè¯¥åéäž æå€ æ `m` 䞪 `0` å `n` 䞪 `1` ã
#example("Example")[
- èŸå
¥ïŒ`strs = ["10", "0001", "111001", "1", "0"]`, `m = 5`, `n = 3`
- èŸåºïŒ`4`
- è§£éïŒæå€æ 5 䞪 0 å 3 䞪 1 çæå€§å鿝 {"10","0001","1","0"} ïŒå æ€çæ¡æ¯ 4 ã
å
¶ä»æ»¡è¶³é¢æäœèŸå°çåéå
æ¬ `{"0001","1"}` å `{"10","1","0"}` ã`{"111001"}` äžæ»¡è¶³é¢æïŒå 䞺å®å« `4` 䞪 `1` ïŒå€§äº `n` çåŒ `3` ã
]
#example("Example")[
- èŸå
¥ïŒ`strs = ["10", "0", "1"]`, `m = 1`, `n = 1`
- èŸåºïŒ`2`
- è§£éïŒæå€§çå鿝 `{"0", "1"}` ïŒæä»¥çæ¡æ¯ `2` ã
]
#tip("Tip")[
- `1 <= strs.length <= 600`
- `1 <= strs[i].length <= 100`
- `strs[i]` ä»
ç± `'0'` å `'1'` ç»æ
- `1 <= m, n <= 100`
]
==== è§£ç
#code(caption: [é¢ç®1: äžåé¶])[
```java
public class Code01_OnesAndZeroes {
public static int zeros;
public static int ones;
public static void count(String str){
zeros = 0;
ones = 0;
for (int i = 0; i < str.length(); i++) {
if(str.charAt(i)=='0'){
zeros++;
}else if(str.charAt(i)=='1'){
ones++;
}
}
}
public static int findMaxForm1(String[] strs, int m, int n) {
return f1(strs, 0, m, n);
}
// strs[i...] èªç±éæ©ïŒ0çæ°éäžè¶
è¿zïŒ1çæ°éäžè¶
è¿o
// æå€èœéæ©å€å°å笊䞲
public static int f1(String[] strs, int cur, int z, int o){
if(cur==strs.length){
return 0;
}
// äžéæ©åœåå笊䞲
int p1 = f1(strs, cur+1, z, o);
// éæ©åœåå笊䞲
int p2 = 0;
count(strs[cur]);
if(zeros<=z && ones<=o){
p2 = 1 + f1(strs, cur+1, z-zeros, o-ones);
}
return Math.max(p1, p2);
}
// è®°å¿åæçŽ¢
public static int findMaxForm2(String[] strs, int m, int n) {
int[][][] dp = new int[strs.length][m+1][n+1];
for (int i = 0; i < dp.length; i++) {
for (int j = 0; j < dp[0].length; j++) {
for (int k = 0; k < dp[0][0].length; k++) {
dp[i][j][k] = -1;
}
}
}
return f2(dp, strs, 0, m, n);
}
public static int f2(int[][][] dp, String[] strs, int cur, int z, int o){
if(cur==strs.length){
return 0;
}
if(dp[cur][z][o]!=-1){
return dp[cur][z][o];
}
// äžéæ©åœåå笊䞲
int p1 = f2(dp, strs, cur+1, z, o);
// éæ©åœåå笊䞲
int p2 = 0;
count(strs[cur]);
if(zeros<=z && ones<=o){
p2 = 1 + f2(dp, strs, cur+1, z-zeros, o-ones);
}
dp[cur][z][o] = Math.max(p1, p2);
return dp[cur][z][o];
}
// äž¥æ Œè¡šäŸèµ
public static int findMaxForm3(String[] strs, int m, int n) {
int len = strs.length;
// æ¥å° strs[cur...]ïŒèŠ0çæ°é<=mïŒ1çæ°é<=n çæå€§é¿åºŠ
int[][][] dp = new int[len+1][m+1][n+1];
// base case: dp[len][..][..]=0
// æ¯äžå±äŸèµäžäžå±
for (int cur = len-1; cur >= 0; cur--) {
count(strs[cur]);
for (int z = 0; z <= m; z++) {
for (int o = 0; o <= n; o++) {
int p1 = dp[cur+1][z][o];
int p2 = 0;
if(z>=zeros && o >=ones){
p2 = 1 + dp[cur+1][z-zeros][o-ones];
}
dp[cur][z][o] = Math.max(p1, p2);
}
}
}
return dp[0][m][n];
}
// 空éŽå猩
public static int findMaxForm4(String[] strs, int m, int n) {
// 代衚cur==len
int[][] dp = new int[m+1][n+1];
// 第iå±äŸèµç¬¬i+1å±çåœåäœçœ®ïŒä»¥åå
¶å·Šäžæ¹çæäžªåŒ
// ä»å³äžå°å·Šäžè¿è¡ç©ºéŽå猩
for (String s : strs) {
// æ¯äžªåç¬Šäž²éæžéåå³å¯
// æŽæ°æ¯äžå±ç衚
// åä¹åçéåæ²¡æåºå«
count(s);
for (int z = m; z >= zeros; z--) {
for (int o = n; o >= ones; o--) {
dp[z][o] = Math.max(dp[z][o], 1 + dp[z - zeros][o - ones]);
}
}
}
return dp[m][n];
}
}
```
]
=== #link("https://leetcode.cn/problems/profitable-schemes/")[é¢ç®2: çå©è®¡å]
éå¢éæ `n` ååå·¥ïŒä»ä»¬å¯ä»¥å®æåç§åæ ·çå·¥äœåé 婿¶Šã第 `i` ç§å·¥äœäŒäº§ç `profit[i]` ç婿¶ŠïŒå®èŠæ± `group[i]` åæåå
±ååäžãåŠææååäžäºå
¶äžäžé¡¹å·¥äœïŒå°±äžèœåäžåŠäžé¡¹å·¥äœãå·¥äœçä»»äœè³å°äº§ç `minProfit` 婿¶Šçåé称䞺 çå©è®¡å ãå¹¶äžå·¥äœçæåæ»æ°æå€äžº `n` ã
æå€å°ç§è®¡åå¯ä»¥éæ©ïŒå äžºçæ¡åŸå€§ïŒæä»¥ è¿åç»ææš¡ `10^9 + 7` çåŒã
#example("Example")[
- èŸå
¥ïŒ`n = 5`, `minProfit = 3`, `group = [2,2]`, `profit = [2,3]`
- èŸåºïŒ`2`
- è§£éïŒè³å°äº§ç 3 ç婿¶ŠïŒè¯¥éå¢å¯ä»¥å®æå·¥äœ 0 åå·¥äœ 1 ïŒæä»
å®æå·¥äœ 1 ã
æ»çæ¥è¯ŽïŒæäž€ç§è®¡åã
]
#example("Example")[
- èŸå
¥ïŒ`n = 10`, `minProfit = 5`, `group = [2,3,5]`, `profit = [6,7,8]`
- èŸåºïŒ`7`
- è§£éïŒè³å°äº§ç `5` ç婿¶ŠïŒåªèŠå®æå
¶äžäžç§å·¥äœå°±è¡ïŒæä»¥è¯¥éå¢å¯ä»¥å®æä»»äœå·¥äœã
æ 7 ç§å¯èœç计åïŒ(0)ïŒ(1)ïŒ(2)ïŒ(0,1)ïŒ(0,2)ïŒ(1,2)ïŒä»¥å (0,1,2) ã
]
#tip("Tip")[
- `1 <= n <= 100`
- `0 <= minProfit <= 100`
- `1 <= group.length <= 100`
- `1 <= group[i] <= 100`
- `profit.length == group.length`
- `0 <= profit[i] <= 100`
]
==== è§£ç
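å
æè®°å¿åæçŽ¢æçšçé掚åäžæ¥ïŒäžé¢ä»£ç äžç `f2` å®ç°çå°±æ¯è¿äžªåŒïŒïŒè®° $f(j, n, r)$ 衚瀺ä»ç¬¬ $j$ 仜工äœåŒå§ãè¿å© $n$ 䞪人ãè¿é $r$ 婿¶Šæ¶çæ¹æ¡æ°ïŒ
$
f(j, n, r) = f(j + 1, n, r) + f(j + 1, n - "group"[j], max(0, r - "profit"[j]))
$
å
¶äžç¬¬äºé¡¹ä»
åš $n >= "group"[j]$ æ¶è®¡å
¥ïŒèŸ¹çæ¯å·¥äœéå®æ¶è¥ $r <= 0$ è®° $1$ïŒåŠå记 $0$ã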
#code(caption: [é¢ç®2: çå©è®¡å])[
```java
public class Code02_ProfitableSchemes {
public static int MOD = 1000000007;
public int profitableSchemes1(int n, int minProfit, int[] group, int[] profit) {
return f1(0, group, profit,n, minProfit);
}
// æ¥å°ç¬¬job仜工äœ,èŠæ±å©äžçå·¥äœn䞪人è³å°äº§çminProfitç婿¶Š
// è¿åæ¹æ¡æ°
public static int f1(int job, int[] group, int[] profit,int n, int minProfit){
int len = profit.length;
// åŠææ²¡äººäºïŒæè
å·¥äœéå®äº
if(n < 0 || job==len){
return minProfit > 0 ? 0:1;
}
// äžååœåè¿ä»œå·¥äœ
int p1 = f1(job+1, group, profit, n, minProfit);
// ååœåè¿ä»œå·¥äœ
int p2 = 0;
if(n-group[job]>=0){
p2= f1(job+1, group, profit, n-group[job], minProfit-profit[job]);
}
return p1+p2;
}
public int profitableSchemes2(int n, int minProfit, int[] group, int[] profit) {
int len = profit.length;
int[][][] dp = new int[len+1][n+1][minProfit+1];
for (int i = 0; i < dp.length; i++) {
for (int j = 0; j < dp[0].length; j++) {
for (int k = 0; k < dp[0][0].length; k++) {
dp[i][j][k] = -1;
}
}
}
return f2(dp, 0, group, profit,n, minProfit);
}
public static int f2(int[][][] dp, int job, int[] group, int[] profit,int n, int minProfit){
int len = profit.length;
// åŠææ²¡äººäºïŒæè
å·¥äœéå®äº
if(n < 0 || job==len){
return minProfit > 0 ? 0:1;
}
if(dp[job][n][minProfit]!=-1){
return dp[job][n][minProfit];
}
// äžååœåè¿ä»œå·¥äœ
int p1 = f2(dp, job+1, group, profit, n, minProfit);
// ååœåè¿ä»œå·¥äœ
int p2 = 0;
if(n-group[job]>=0){
p2= f2(dp, job+1, group, profit, n-group[job], Math.max(0,minProfit-profit[job]));
}
dp[job][n][minProfit] = (p1+p2) % MOD;
return dp[job][n][minProfit];
}
// äž¥æ Œè¡šç»æ+空éŽå猩
public int profitableSchemes3(int n, int minProfit, int[] group, int[] profit) {
int len = profit.length;
// i = 没æå·¥äœçæ¶åïŒi == g.length
int[][] dp = new int[n + 1][minProfit + 1];
// å·¥äœéå®ä¹åïŒè¿æäººäœæ¯å·²ç»äžçšåçå©
for (int person = 0; person <= n; person++) {
dp[person][0] = 1;
}
for (int job = len-1; job >= 0; job--) {
for (int person = n; person >= 0; person--) {
for (int prof = minProfit; prof >= 0; prof--) {
int p1 = dp[person][prof];
int p2 = 0;
if((person-group[job])>=0){
p2 = dp[person-group[job]][Math.max(0,prof-profit[job])];
}
dp[person][prof] = (p1+p2)%MOD;
}
}
}
return dp[n][minProfit];
}
}
```
]
=== #link("https://leetcode.cn/problems/knight-probability-in-chessboard/")[é¢ç®3: éªå£«åšæ£çäžçæŠç]
åšäžäžª `n x n` çåœé
è±¡æ£æ£çäžïŒäžäžªéªå£«ä»åå
æ Œ `(row, column)` åŒå§ïŒå¹¶å°è¯è¿è¡ `k` 次移åšãè¡å忝 ä» `0` åŒå§ çïŒæä»¥å·Šäžåå
æ Œæ¯ `(0,0)` ïŒå³äžåå
æ Œæ¯ `(n - 1, n - 1)` ã
åœé
è±¡æ£éªå£«æ `8` ç§å¯èœçèµ°æ³ïŒç±»äŒŒäžåœè±¡æ£äžéЬçèµ°æ³ïŒãæ¯æ¬¡ç§»åšåšåºæ¬æ¹åäžæ¯äž€äžªåå
æ ŒïŒç¶ååšæ£äº€æ¹åäžæ¯äžäžªåå
æ Œãæ¯æ¬¡éªå£«èŠç§»åšæ¶ïŒå®éœäŒéæºä» 8 ç§å¯èœçç§»åšäžéæ©äžç§ïŒå³äœ¿æ£åäŒçŠ»åŒæ£çïŒïŒç¶åç§»åšå°é£éãéªå£«ç»§ç»ç§»åšïŒçŽå°å®èµ°äº `k` æ¥æçŠ»åŒäºæ£çã
è¿å éªå£«åšæ£çåæ¢ç§»åšåä»çåšæ£çäžçæŠç ã
#example("Example")[
- èŸå
¥: `n = 3`, `k = 2`, `row = 0`, `column = 0`
- èŸåº: `0.0625`
- è§£é: æäž€æ¥(å°`(1,2)`ïŒ`(2,1)`)å¯ä»¥è®©éªå£«çåšæ£çäžã
åšæ¯äžäžªäœçœ®äžïŒä¹æäž€ç§ç§»åšå¯ä»¥è®©éªå£«çåšæ£çäžãéªå£«çåšæ£çäžçæ»æŠçæ¯0.0625ã
]
#example("Example")[
- èŸå
¥: `n = 1`, `k = 0`, `row = 0`, `column = 0`
- èŸåº: `1.00000`
]
#tip("Tip")[
- `1 <= n <= 25`
- `0 <= k <= 100`
- `0 <= row, column <= n - 1`
]
==== è§£ç
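代ç äžçè®°å¿åæçŽ¢å®ç°çæ£æ¯äžé¢çé掚ïŒè®° $P(i, j, k)$ 䞺ä» $(i, j)$ åºåãè¿å© $k$ æ¥æ¶æç»çåšæ£çäžçæŠçïŒè¶ççäœçœ®è§äžº $0$ïŒïŒ
$
P(i, j, 0) = 1, space P(i, j, k) = 1/8 sum_((d_x, d_y)) P(i + d_x, j + d_y, k - 1)
$
å
¶äžæ±å蟩å éªå£«ç 8 ç§èµ°æ³ã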
#code(caption: [éªå£«åšæ£çäžçæŠç - è§£ç])[
```java
public class Code03_KnightProbabilityInChessboard {
public double knightProbability(int n, int k, int row, int col) {
double[][][] dp = new double[n][n][k + 1];
for (int t = 0; t <= k; t++) {
for (int i = 0; i < n; i++) {
for (int j = 0; j < n; j++) {
dp[i][j][t] = -1;
}
}
}
return f(n, row, col, k, dp);
}
// ä»(i,j)åºåè¿ækæ¥èŠèµ°ïŒè¿åæååšæ£çäžçæŠç
public static double f(int n, int i, int j, int k, double[][][] dp) {
if (i < 0 || i >= n || j < 0 || j >= n) {
return 0;
}
if (dp[i][j][k] != -1) {
return dp[i][j][k];
}
double ans = 0;
if(k==0){
return 1;
}else{
ans += (f(n, i - 2, j + 1, k - 1, dp) / 8);
ans += (f(n, i - 1, j + 2, k - 1, dp) / 8);
ans += (f(n, i + 1, j + 2, k - 1, dp) / 8);
ans += (f(n, i + 2, j + 1, k - 1, dp) / 8);
ans += (f(n, i + 2, j - 1, k - 1, dp) / 8);
ans += (f(n, i + 1, j - 2, k - 1, dp) / 8);
ans += (f(n, i - 1, j - 2, k - 1, dp) / 8);
ans += (f(n, i - 2, j - 1, k - 1, dp) / 8);
}
dp[i][j][k] = ans;
return ans;
}
}
```
]
=== #link("https://leetcode.cn/problems/paths-in-matrix-whose-sum-is-divisible-by-k/")[é¢ç®4: ç©éµäžåèœè¢« K æŽé€çè·¯åŸ]
ç»äœ äžäžªäžæ ä» 0 åŒå§ç `m x n` æŽæ°ç©éµ `grid` åäžäžªæŽæ° k ãäœ ä»èµ·ç¹ `(0, 0)` åºåïŒæ¯äžæ¥åªèœåŸ äž æè
åŸ å³ ïŒäœ æ³èŠå°èŸŸç»ç¹ `(m - 1, n - 1)` ã
è¯·äœ è¿åè·¯åŸåèœè¢« `k` æŽé€çè·¯åŸæ°ç®ïŒç±äºçæ¡å¯èœåŸå€§ïŒè¿åçæ¡å¯¹ `10^9 + 7` åäœ çç»æã
#example("Example")[
- èŸå
¥ïŒ
```
grid = [
[5,2,4],
[3,0,5],
[0,7,2]
]
k = 3
```
- èŸåºïŒ2
- è§£éïŒæäž€æ¡è·¯åŸæ»¡è¶³è·¯åŸäžå
çŽ çåèœè¢« `k` æŽé€ã
- ç¬¬äžæ¡è·¯åŸå䞺 5 + 2 + 4 + 5 + 2 = 18 ïŒèœè¢« 3 æŽé€ã
- ç¬¬äºæ¡è·¯åŸå䞺 5 + 3 + 0 + 5 + 2 = 15 ïŒèœè¢« 3 æŽé€ã
]
#example("Example")[
- èŸå
¥ïŒ`grid = [[0,0]]`, `k = 5`
- èŸåºïŒ`1`
- è§£éïŒçº¢è²æ 泚çè·¯åŸå䞺 0 + 0 = 0 ïŒèœè¢« 5 æŽé€ã
]
#example("Example")[
- èŸå
¥ïŒ`grid = [[7,3,4,9],[2,3,6,2],[2,3,7,0]]`, `k = 1`
- èŸåºïŒ`10`
- è§£éïŒæ¯äžªæ°åéœèœè¢« 1 æŽé€ïŒæä»¥æ¯äžæ¡è·¯åŸçåéœèœè¢« `k` æŽé€ã
]
#tip("Tip")[
- `m == grid.length`
- `n == grid[i].length`
- `1 <= m, n <= 5 * 10^4`
- `1 <= m * n <= 5 * 10^4`
- `0 <= grid[i][j] <= 100`
- `1 <= k <= 50`
]
==== è§£ç
`(k + r - (grid[x][y] % k)) % k`è§£æïŒ
æ¥å°`(x, y)`äœçœ®ïŒåœåäœçœ®å äžå©äžçèŠååºäœæ°rã
#example("Example")[
- `k=7`,`r=3`:åœåå äžå©äžçèŠååºäœæ°`3`
- åœ`grid[x][y]%k = 2<3`,å©äžçèŠäœ`1`
- åœ`grid[x][y]%k = 4>3`,å©äžçèŠäœ`7+3-4=6`
]
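顺䟿ç»äžæ®µå¯çŽæ¥è¿è¡çå°éªè¯ïŒäžé¢ç®æ 代ç æ å
³ïŒä»
çšæ¥æ£æ¥è¿äžªååŒïŒïŒå¯¹ä»»æ $0 <= r < k$ éœæ `(grid[x][y] + need) % k == r`ïŒ
#code(caption: [need ååŒçå°éªè¯])[
```java
public class NeedCheck {
    public static void main(String[] args) {
        int k = 7;
        // v 代è¡š grid[x][y] çåŒ
        for (int v = 0; v <= 20; v++) {
            for (int r = 0; r < k; r++) {
                int need = (k + r - (v % k)) % k;
                // åœåæ Œå v å äžäœæ°äžº need çåç»è·¯åŸïŒæ»äœæ°å°±æ¯ r
                if ((v + need) % k != r) {
                    throw new AssertionError("ååŒäžæç«");
                }
            }
        }
        System.out.println("æææ
åµéªè¯éè¿");
    }
}
```
]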
#code(caption: [K æŽé€çè·¯åŸ - è§£ç])[
```java
public class Code04_PathsDivisibleByK {
public static int MOD = 1000000007;
public static int numberOfPaths1(int[][] grid, int k) {
return f1(grid, k, 0, 0, 0);
}
// ä»(i,j)åºåïŒæç»äžå®èŠèµ°å°å³äžè§(n-1,m-1)ïŒæå€å°æ¡è·¯åŸïŒçޝå å%kæ¯r
public static int f1(int[][] grid, int k, int x, int y, int r){
int n = grid.length;
int m = grid[0].length;
if (x==n-1 && y==m-1) {
return grid[x][y]%k==r?1:0;
}
int ans = 0;
int need = (k+r-(grid[x][y]%k))%k;
int p1=0, p2=0;
// åäžèµ°
if(x+1<n){
p1 = f1(grid, k, x+1, y, need);
}
// åå³èµ°
if(y+1<m){
p2 = f1(grid, k, x, y+1, need);
}
ans = (p1+p2)%MOD;
return ans;
}
public static int numberOfPaths2(int[][] grid, int k) {
int n = grid.length;
int m = grid[0].length;
int[][][] dp = new int[k][n][m];
for (int r = 0; r < k; r++) {
for (int i = 0; i < n; i++) {
for (int j = 0; j < m; j++) {
dp[r][i][j] = -1;
}
}
}
return f2(dp, grid, k, 0, 0, 0);
}
// ä»(i,j)åºåïŒæç»äžå®èŠèµ°å°å³äžè§(n-1,m-1)ïŒæå€å°æ¡è·¯åŸïŒçޝå å%kæ¯r
public static int f2(int[][][] dp, int[][] grid, int k, int x, int y, int r){
int n = grid.length;
int m = grid[0].length;
if(dp[r][x][y]!=-1){
return dp[r][x][y];
}
if (x==n-1 && y==m-1) {
return grid[x][y]%k==r?1:0;
}
int need = (k+r-grid[x][y]%k)%k;
int p1=0, p2=0;
// åäžèµ°
if(x+1<n){
p1 = f2(dp, grid, k, x+1, y, need);
}
// åå³èµ°
if(y+1<m){
p2 = f2(dp, grid, k, x, y+1, need);
}
dp[r][x][y] = (p1+p2)%MOD;
return dp[r][x][y];
}
public static int numberOfPaths3(int[][] grid, int k) {
int n = grid.length;
int m = grid[0].length;
// ä»(i,j)åºåïŒæç»äžå®èŠèµ°å°å³äžè§(n-1,m-1)ïŒæå€å°æ¡è·¯åŸïŒçޝå å%kæ¯r
int[][][] dp = new int[n][m][k];
dp[n-1][m-1][grid[n-1][m-1]%k] = 1;
// æåäžåïŒä»äžå°äžäŸèµ
for (int i = n-2; i >= 0; i--) {
for (int r = 0; r < k; r++) {
int need = (k+r-grid[i][m-1]%k)%k;
dp[i][m-1][r] = dp[i+1][m-1][need];
}
}
// æåäžè¡ïŒä»å³å°å·ŠäŸèµ
for (int j = m-2; j >= 0; j--) {
for (int r = 0; r < k; r++) {
int need = (k+r-grid[n-1][j]%k)%k;
dp[n-1][j][r] = dp[n-1][j+1][need];
}
}
for (int i = n-2; i >= 0; i--) {
for (int j = m-2; j >= 0; j--) {
for (int r = 0; r < k; r++) {
int need = (k+r-grid[i][j]%k)%k;
// äŸèµå³èŸ¹
int p1 = dp[i][j+1][need];
// äŸèµäžèŸ¹
int p2 = dp[i+1][j][need];
dp[i][j][r] = (p1+p2)%MOD;
}
}
}
return dp[0][0][0];
}
}
```
]
#tip("Tip")[
衚äŸèµå¯ä»¥çæäžäžªäºç»Žåæ ïŒæ¯äžªåæ æ Œåéé¢æäžªkå±çæåã
]
=== #link("https://leetcode.cn/problems/scramble-string/")[é¢ç®5: æ°ä¹±å笊䞲]
䜿çšäžé¢æè¿°çç®æ³å¯ä»¥æ°ä¹±å笊䞲 `s` åŸå°å笊䞲 `t` ïŒ
+ åŠæå笊䞲çé¿åºŠäžº `1` ïŒç®æ³åæ¢
+ åŠæå笊䞲çé¿åºŠ `> 1` ïŒæ§è¡äžè¿°æ¥éª€ïŒ
+ åšäžäžªéæºäžæ å€å°å笊䞲åå²æäž€äžªé空çåå笊䞲ãå³ïŒåŠæå·²ç¥å笊䞲 `s` ïŒåå¯ä»¥å°å
¶åæäž€äžªåå笊䞲 `x` å `y` ïŒäžæ»¡è¶³ `s = x + y` ã
+ éæº å³å®æ¯èŠã亀æ¢äž€äžªåå笊䞲ãè¿æ¯èŠãä¿æè¿äž€äžªåå笊䞲ç顺åºäžåããå³ïŒåšæ§è¡è¿äžæ¥éª€ä¹åïŒ`s` å¯èœæ¯ `s = x + y` æè
 `s = y + x` ã
+ åš `x` å `y` è¿äž€äžªåå笊䞲äžç»§ç»ä»æ¥éª€ 1 åŒå§éåœæ§è¡æ€ç®æ³ã
ç»äœ 䞀䞪 é¿åºŠçžç çå笊䞲 `s1` å `s2`ïŒå€æ `s2` æ¯åŠæ¯ `s1` çæ°ä¹±å笊䞲ãåŠææ¯ïŒè¿å `true` ïŒåŠåïŒè¿å `false` ã
#example("Example")[
- èŸå
¥ïŒ`s1 = "great"`, `s2 = "rgeat"`
- èŸåºïŒ`true`
- è§£éïŒ`s1` äžå¯èœåççäžç§æ
圢æ¯ïŒ
```
"great" --> "gr/eat" // åšäžäžªéæºäžæ å€åå²åŸå°äž€äžªåå笊䞲
"gr/eat" --> "gr/eat" // éæºå³å®ïŒãä¿æè¿äž€äžªåå笊䞲ç顺åºäžåã
"gr/eat" --> "g/r / e/at" // åšåå笊䞲äžéåœæ§è¡æ€ç®æ³ã䞀䞪åå笊䞲åå«åšéæºäžæ å€è¿è¡äžèœ®åå²
"g/r / e/at" --> "r/g / e/at" // éæºå³å®ïŒç¬¬äžç»ã亀æ¢äž€äžªåå笊䞲ãïŒç¬¬äºç»ãä¿æè¿äž€äžªåå笊䞲ç顺åºäžåã
"r/g / e/at" --> "r/g / e/ a/t" // ç»§ç»éåœæ§è¡æ€ç®æ³ïŒå° "at" åå²åŸå° "a/t"
"r/g / e/ a/t" --> "r/g / e/ a/t" // éæºå³å®ïŒãä¿æè¿äž€äžªåå笊䞲ç顺åºäžåã
```
ç®æ³ç»æ¢ïŒç»æå笊䞲å `s2` çžåïŒéœæ¯ `"rgeat"`
è¿æ¯äžç§èœå€æ°ä¹± `s1` åŸå° `s2` çæ
圢ïŒå¯ä»¥è®€äžº `s2` æ¯ `s1` çæ°ä¹±å笊䞲ïŒè¿å `true`
]
#example("Example")[
- èŸå
¥ïŒ`s1 = "abcde"`, `s2 = "caebd"`
- èŸåºïŒ`false`
]
#example("Example")[
- èŸå
¥ïŒ`s1 = "a"`, `s2 = "a"`
- èŸåºïŒ`true`
]
#tip("Tip")[
- s1.length == s2.length
- 1 <= s1.length <= 30
- s1 å s2 ç±å°åè±æåæ¯ç»æ
]
==== è§£ç
åŠæäž€äžªå笊䞲å笊ç§ç±»äžæ ·ïŒå¯¹åºçæ°éä¹äžæ ·ïŒäž€äžªæ¯åŠäžå®äºäžºæ°ä¹±äž²å¢ïŒ
äžäžå®ïŒ
#example("Example")[
- s1: `abcd`
+ `a bcd`
+ `ab cd`
+ `abc d`
- s2: `cadb`
äžè®ºåé£ç§åå²ïŒéœæ æ³æ¹é
åº `cadb`ïŒå¯è§å笊ç§ç±»äžæ°éçžåå¹¶äžè¶³ä»¥ä¿è¯æ°ä¹±å
³ç³»ã
]
#code(caption: [æ°ä¹±å笊䞲 - è§£ç])[
```java
public class Code05_ScrambleString {
public static boolean isScramble1(String str1, String str2) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
return f1(s1, 0, n - 1, s2, 0, n - 1);
}
// s1[l1....r1]
// s2[l2....r2]
// ä¿è¯l1....r1äžl2....r2
// æ¯äžæ¯æ°ä¹±äž²çå
³ç³»
public static boolean f1(char[] s1, int l1, int r1, char[] s2, int l2, int r2) {
if (l1 == r1) {
// s1[l1..r1]
// s2[l2..r2]
return s1[l1] == s2[l2];
}
// s1[l1..i][i+1....r1]
// s2[l2..j][j+1....r2]
// äžäº€éå»è®šè®ºæ°ä¹±å
³ç³»
for (int i = l1, j = l2; i < r1; i++, j++) {
if (f1(s1, l1, i, s2, l2, j) && f1(s1, i + 1, r1, s2, j + 1, r2)) {
return true;
}
}
// 亀éå»è®šè®ºæ°ä¹±å
³ç³»
// s1[l1..........i][i+1...r1]
// s2[l2...j-1][j..........r2]
for (int i = l1, j = r2; i < r1; i++, j--) {
if (f1(s1, l1, i, s2, j, r2) && f1(s1, i + 1, r1, s2, l2, j - 1)) {
return true;
}
}
return false;
}
// äŸç¶æ¯æŽåå°è¯ïŒåªäžè¿æå䞪å¯ååæ°ç猩æäºäžäžª(l1ãl2ãlen)
public static boolean isScramble2(String str1, String str2) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
return f2(s1, s2, 0, 0, n);
}
public static boolean f2(char[] s1, char[] s2, int l1, int l2, int len) {
if (len == 1) {
return s1[l1] == s2[l2];
}
// s1[l1.......] len
// s2[l2.......] len
// å·Š : k䞪 å³: len - k 䞪
for (int k = 1; k < len; k++) {
if (f2(s1, s2, l1, l2, k) && f2(s1, s2, l1 + k, l2 + k, len - k)) {
return true;
}
}
// 亀éïŒ
for (int i = l1 + 1, j = l2 + len - 1, k = 1; k < len; i++, j--, k++) {
if (f2(s1, s2, l1, j, k) && f2(s1, s2, i, l2, len - k)) {
return true;
}
}
return false;
}
public static boolean isScramble3(String str1, String str2) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
// dp[l1][l2][len] : int 0 -> 没å±åŒè¿
// dp[l1][l2][len] : int -1 -> å±åŒè¿ïŒè¿åçç»ææ¯false
// dp[l1][l2][len] : int 1 -> å±åŒè¿ïŒè¿åçç»ææ¯true
int[][][] dp = new int[n][n][n + 1];
return f3(s1, s2, 0, 0, n, dp);
}
public static boolean f3(char[] s1, char[] s2, int l1, int l2, int len, int[][][] dp) {
if (len == 1) {
return s1[l1] == s2[l2];
}
if (dp[l1][l2][len] != 0) {
return dp[l1][l2][len] == 1;
}
boolean ans = false;
for (int k = 1; k < len; k++) {
if (f3(s1, s2, l1, l2, k, dp) && f3(s1, s2, l1 + k, l2 + k, len - k, dp)) {
ans = true;
break;
}
}
if (!ans) {
for (int i = l1 + 1, j = l2 + len - 1, k = 1; k < len; i++, j--, k++) {
if (f3(s1, s2, l1, j, k, dp) && f3(s1, s2, i, l2, len - k, dp)) {
ans = true;
break;
}
}
}
dp[l1][l2][len] = ans ? 1 : -1;
return ans;
}
public static boolean isScramble4(String str1, String str2) {
char[] s1 = str1.toCharArray();
char[] s2 = str2.toCharArray();
int n = s1.length;
boolean[][][] dp = new boolean[n][n][n + 1];
// å¡«å len=1 å±çææäœçœ®
for (int l1 = 0; l1 < n; l1++) {
for (int l2 = 0; l2 < n; l2++) {
dp[l1][l2][1] = s1[l1] == s2[l2];
}
}
for (int len = 2; len <= n; len++) {
// 泚æåŠäžçèŸ¹çæ¡ä»¶ : l1 <= n - len l2 <= n - len
for (int l1 = 0; l1 <= n - len; l1++) {
for (int l2 = 0; l2 <= n - len; l2++) {
for (int k = 1; k < len; k++) {
if (dp[l1][l2][k] && dp[l1 + k][l2 + k][len - k]) {
dp[l1][l2][len] = true;
break;
}
}
if (!dp[l1][l2][len]) {
for (int i = l1 + 1, j = l2 + len - 1, k = 1; k < len; i++, j--, k++) {
if (dp[l1][j][k] && dp[i][l2][len - k]) {
dp[l1][l2][len] = true;
break;
}
}
}
}
}
}
return dp[0][0][n];
}
}
```
]
|
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/math/attach-p1_00.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test basics, postscripts.
$f_x + t^b + V_1^2 + attach(A, t: alpha, b: beta)$
|
https://github.com/yhtq/Notes | https://raw.githubusercontent.com/yhtq/Notes/main/æœè±¡ä»£æ°/ç« è/æš¡ãå.typ | typst | #import "../../template.typ": *
#show: note.with(
title: "æœè±¡ä»£æ°/代æ°åŠåºç¡",
author: "YHTQ",
date: none,
logo: none,
withOutlined: false
)
#let chapterThree = [
= æš¡
== åºæ¬æŠå¿µ
ç±»æ¯çº¿æ§ç©ºéŽæ¯åäœçšäºäº€æ¢çŸ€ïŒæš¡æ¯ç¯äœçšäºäº€æ¢çŸ€
== äž»çæ³æŽç¯äžæéçææš¡çåç±»å®ç
æ¬èäžïŒææç $R$ éœæ¯ PID
#lemma[][
讟 $p, q in R$ æ¯äžçžäŒŽççŽ å
ïŒ$r, s in NN$ïŒåïŒ
- è¥ $M tilde.eq R$ïŒå $quotient(M, p^r M) tilde.eq quotient(R, (p^r))$
- è¥ $M tilde.eq quotient(R, p^s R)$ïŒåïŒ
$
p^r M = cases(
quotient(p^r R, p^s R) space& r <= s,
{0} & r > s
)
$
- è¥ $M tilde.eq quotient(R, q^s R)$ïŒåïŒ
$
p^r M = M
$
]
#proof[
+
+
+ 泚æå°åš PID äžïŒ$(p^r, q^s) = (1)$ã讟 $a p^r + b q^s = 1$ïŒå $m = a m p^r + b m q^s in (p^r + (q^s)) M = p^r M$
]
= å
== åºæ¬æŠå¿µ
åçå®ä¹åé¢å·²ç»ç»åºäºã对äºåçèå¯ïŒæä»¬ä»æç®åçååŒå§ïŒå³ç¹åŸäžºçŽ æ°ççŽ åïŒåä»å®ä»¬åºåæé æ°çåã
#definition[ç¹åŸ][
对å $F$ïŒè¥ååšæ£æŽæ° $n$ 䜿åŸ:
$
n dot 1 = 0
$
åç§°æ»¡è¶³èŠæ±çæå°æŽæ° $n$ 䞺å $F$ çç¹åŸïŒè®°äœ $"char"(F)$ïŒåŠåç§° $F$ çç¹åŸäžº $0$ã
]
#corollary[][
讟 $"char"(F) = n$ïŒå $forall x in F$ïŒæ $n dot x = 0$
]
#theorem[][
åçç¹åŸäžå®æ¯çŽ æ°
]
#proof[
$(n m) dot 1 = 0 <=> (n dot 1)(m dot 1) = 0 <=> n dot 1 = 0 or m dot 1 = 0$ïŒè¡šæåçç¹åŸäžå®äžå¯åè§£ïŒè¿èäžå®æ¯çŽ æ°ã
]
#corollary[][
ææç¹åŸäžº $p$ çåéœå
å« $ZZ_p$ïŒç¹åŸäžº $0$ çåéœå
å« $QQ$
]
#lemma[][
åéŽçåæäžå®æ¯åå°
]
#proof[
æŸç¶ååæäžå®æ¯ç¯åæïŒè¿è $ker(phi)$ æ¯ $F$ ççæ³ïŒç±äº $F$ æ¯åïŒ$ker(phi)$ åªèœæ¯ $F$ æè
$\{0\}$ïŒè $phi$ äžå¯èœæ¯ $0$ æ å°ïŒå æ€ $ker(phi) = \{0\}$ïŒå³ $phi$ æ¯åå°ã
]
#remark[][
è¿äžªåŒçç»åºè¥æä»¬æäžäžªååæ $F -> K$ïŒå $F$ å¯ä»¥çäœ $K$ çååïŒè¿èæä»¬å¯ä»¥å° $K$ çäœ $F$ çæ©åŒ åïŒè¿å°±åŒåºäºåæ©åŒ çæŠå¿µã
]
== åæ©åŒ
仿Šå¿µäžæ©åŒ å
³ç³»åºæ¬å°±æ¯ååå
³ç³»ïŒåªæ¯æä»¬åšç ç©¶åæ¶åŸåŸä»å°è³å€§ïŒå æ€æäºååçæ©åŒ å
³ç³»ïŒ
#definition[åæ©åŒ ][
讟 $F$ æ¯ $K$ çååïŒåç§° $K$ æ¯ $F$ çäžäžªåæ©åŒ ïŒå¹¶ç§° $F$ æ¯äžäžªåºåãè¥ $F subset E subset K$ éœæ¯åïŒåç§° $E$ æ¯äžéŽåã
æ€æ¶ïŒ$K$ æ¯äžäžª $F$ åé空éŽïŒè¿èç§°åæ©åŒ $quotient(K, F)$ çæ¬¡æ°ïŒ
$
[K : F] := dim_F K
$
]
#theorem[][
è¥ $F subset E subset K$ïŒåïŒ
$
[K : F] = [K : E] [E : F]
$
]
#proof[
讟 $m = [K : E], n = [E : F] $\
æŸ $alpha_1, alpha_2, ..., alpha_m$ æ¯ $K$ åš $E$ äžçäžç»åºïŒ$beta_1, beta_2, ..., beta_n$ æ¯ $E$ åš $F$ äžçäžç»åºïŒ\
å对任æ $k in K$ïŒååš $lambda_i in E$ïŒäœ¿åŸïŒ
$
k = sum_(i = 1)^m lambda_i beta_i
$
è对 $lambda_i in E$ïŒååš $mu_(i j) in K$ïŒäœ¿åŸïŒ
$
lambda_i = sum_(j = 1)^n = mu_(i j) alpha_j
$
è¿èïŒ
$
k = sum_(i = 1)^m sum_(j = 1)^n mu_(i j) alpha_j beta_i
$
衚æ $alpha_i, beta_j$ ææäžç»çæå
ïŒåªéè¯æå®ä»¬çº¿æ§æ å
³ïŒäºå®äžïŒ
$
0 = sum_(i = 1)^m sum_(j = 1)^n mu_(i j) alpha_j beta_i => sum_(i = 1)^m (sum_(j = 1)^n mu_(i j) alpha_j) beta_i = 0 \
=> sum_(j = 1)^n mu_(i j) alpha_j = 0, space forall i \
=> mu_(i j) = 0, space forall i, j
$
]
#example[][
讟 $F$ æ¯åïŒå $F[x]$ æ¯äž»çæ³æŽç¯ãåå
¶äžäžªäžå¯çºŠå
$p(x)$ïŒå $quotient(F[x], (p(x)))$ æ¯åïŒå®åœç¶ $F$ çäžäžªåæ©åŒ ïŒäž $[quotient(F[x], (p(x))) : F] = deg(p(x))$\
æŽéèŠçæ¯ä»¥äžåŒçïŒ
#lemma[][
什 $theta = x + (p) in quotient(F[x], (p(x)))$
åå®å°æäžºæ¹çš
$
p(z) = 0, z in quotient(F[x], (p(x)))
$
çæ ¹
]
#proof[
讟 $p(x) = sum_i a^i x^i$ïŒåïŒ
$
p(theta) = sum_i a^i (x + (p))^i = sum_i a^i x^i + (p) = p(x) + (p) = 0
$
]
衚ææä»¬å¥œåççæ¯æ·»å äºäžäžªæ ¹ã
]
#remark[][
$quotient(RR, (x^2 + 1))$ æç
§äžé¢çä»ç»æäžºäº $RR$ çäžäžªæ©åŒ åïŒå®å°±æ¯ $CC$ãäœäºå®äž $CC$ äžæå€é¡¹åŒ $x^2 + 1$ ç䞀䞪èäžæ¯äžäžªæ ¹
]
#definition[][
讟 $F subset K$ éœæ¯åïŒ$a_i in K$ïŒåè®°ïŒ
$
F(a_1, a_2, ..., a_n)
$
䞺å
å« $F, a_i$ ç $K$ äžçæå°ååïŒç§°äžº $F$ å $a_i$ ççæåã\
ç±æé䞪å
çŽ çæçå称䞺æéçæ\
ç¹å«çïŒå䞪å
çŽ çæçåæäžºåæ©åŒ
]
#example[][
- $QQ(sqrt(2), sqrt(3)) = QQ(sqrt(2) + sqrt(3))$
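è¿éé¡ºæç»äžäžªçŽæ¥éªè¯ïŒäžé¢çå±åŒè®¡ç®æ¯åæ³šæ·»å çæ žéªïŒïŒä»€ $theta = sqrt(2) + sqrt(3)$ïŒå±åŒå¯åŸïŒ
$
theta^3 = 11 sqrt(2) + 9 sqrt(3) => sqrt(2) = (theta^3 - 9 theta)/2 in QQ(theta), space sqrt(3) = theta - sqrt(2) in QQ(theta)
$
åæ¹åçå
å«å
³ç³»æ¯æŸç¶çïŒæ
äž€äŸ§çžçã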
]
#theorem[][
讟 $K = F(alpha)$ïŒå以äžäž€è
æäžåªæäžäžªæç«ïŒ
- $1, alpha, alpha^2, ..., alpha^n, ...$ åš $F$ äžå
šéšçº¿æ§æ å
³ïŒè¿è $F(alpha) tilde.eq "Frac"(F[x])$ïŒæ€æ¶ç§° $alpha$ åš $F$ äžè¶
è¶
- $1, alpha, alpha^2, ..., alpha^n, ...$ åš $F$ äžçº¿æ§çžå
³ïŒè¿èäœ¿åŸ $f(alpha) = 0$ ç $F$ äžå€é¡¹åŒææäžäžªçæ³ïŒè¿èæ¯äž»çæ³ãåå
¶å¯äžçæå
$m(x)$ïŒåç§° $m(x)$ 䞺 $alpha$ åš $F$ äžçæå°å€é¡¹åŒïŒå®äžå®äžå¯çºŠïŒè®°äœ $m_alpha(x)$ïŒå¹¶æïŒ
$
F(alpha) tilde.eq quotient(F[x], (m_alpha (x)))
$
æ€æ¶ç§° $alpha$ åš $F$ äžä»£æ°ïŒalgebraicïŒ
]
#proof[
- å¯¹äºæ
åµäžïŒæé ç¯åæïŒ
$
funcDef(phi, F[x], K, f(x), f(alpha))
$
ç±é¢è®ŸïŒ$phi$ æ¯åå°ïŒè¿èå®å¯ä»¥å»¶æå° $phi': Frac(F[x]) -> K$ïŒææåäžçåæïŒåœç¶æ¯åå°ïŒå æ€ïŒ
$
Frac(F[x]) tilde.eq im(phi') = K
$
- å¯¹äºæ
åµäºïŒåªéè¯æ $m_alpha (x)$ ç¡®å®äžå¯çºŠãäºå®äžïŒ
$
funcDef(phi, F[x], K, f(x), f(alpha))
$
ç»åºç¯äžç满åæïŒè¿èïŒ
$
quotient(F[x], ker(phi)) tilde.eq im(phi) = K
$
è $K$ æ¯åïŒåæ¶ $ker(phi) = (m_alpha (x))$ïŒè¡šæ $m_alpha (x)$ ç¡®å®äžå¯çºŠã
]
#definition[][
讟 $K = F(alpha_1, alpha_2, ..., alpha_i, ...)$ïŒè¥æ¯äžäžªå
çŽ éœæ¯ä»£æ°çïŒåç§° $quotient(K, F)$ æ¯ä»£æ°çïŒåŠå称䞺è¶
è¶ç
]
#proposition[][
以äžäž€è
çä»·ïŒ
- $[K : F]$ æé
- $quotient(K, F)$ æ¯æéçæäžä»£æ°ç
]
#proof[
- $1 => 2$ æ¯å®¹æçïŒåæ¶æä»¬æïŒ
$
deg(m_alpha (x)) = [F(alpha), F] | [K, F]
$
- åŠäžäŸ§ç¥æŸå€æïŒ
#lemma[][
讟 $F subset E subset K$ïŒ$alpha in K$ïŒ$alpha$ åš $E, F$ äžçæå°å€é¡¹åŒåå«äžº $m_E (x), m_F (x)$ïŒåïŒ
$
m_(F) (x) = 0 in E => m_(E) (x) | m_(F) (x) in E[x]
$
]
#corollary[][
$[E(alpha) : E] <= [F(alpha): F]$
]
#definition[][
讟 $F subset E_i subset K$ïŒåè®°ïŒ
$
E_1 E_2 ... E_n
$
䞺å
å« $E_i$ çæå°çåïŒç§°äžº $E_i$ çå€å
]
#lemma[][
讟 $[E_i, F]$ æéïŒåïŒ
$
[E_1 E_2 : F] <= [E_1 : F] [E_2 : F]
$
]
#proof[
讟 $E_1 = F(alpha_1, alpha_2, ..., alpha_n)$ïŒåç±äžé¢çåŒçïŒ
$
[E_2(alpha_1) : E_2] <= [F(alpha_1) : F]\
[E_2(alpha_1, alpha_2) : E_2(alpha_1)] <= [F(alpha_1, alpha_2) : F(alpha_1)]\
$
ç±äºåæ©åŒ çæ¬¡æ°å¯ä¹ïŒå·Šå³äŸ§å
šéšä¹èµ·æ¥åŸå°ïŒ
$
[E_2(alpha_1, alpha_2, ..., alpha_n) : E_2] <= [F(alpha_1, alpha_2, ..., alpha_n) : F]\
<=> [E_1 E_2 : E_2] <= [E_1 : F] \
=> [E_1 E_2 : F] = [E_1 E_2 : E_2] [E_2 : F] <= [E_1 : F] [E_2 : F]
$
]
ç±è¿äºåŒçïŒè®Ÿ $K = F(alpha_1, alpha_2, ..., alpha_n)$ïŒåïŒ
$
[K : F] = [F(alpha_1) F(alpha_2) ... F(alpha_n) : F]<= product [F(alpha_i) : F]
$
è¯æ¯
]
#corollary[代æ°éå
][
讟 $alpha, beta in K$ åš $F$ äžä»£æ°ïŒåïŒ
$
alpha plus.minus beta, alpha beta, alpha / beta
$
éœåš $F(alpha, beta)$ ä¹äžïŒç±äžé¢çå®çè¿æ¯æéæ©åŒ ïŒè¿èè¿äºå
çŽ éœåš $F$ äžä»£æ°ïŒä»è $K$ äžææåš $F$ äžä»£æ°çå
çŽ ææ $K$ çäžäžªååïŒç§°äžº $K$ åš $F$ äžç代æ°éå
]
#theorem[][
è¥ $quotient(K, E)$ äž $quotient(E, F)$ éœæ¯ä»£æ°çïŒå $quotient(K, F)$ 乿¯ä»£æ°ç
]
== åè£å
#definition[åè£å][
ç»å®å $F$ 以åå
¶äžçå€é¡¹åŒ $f in F[x]$ïŒè¿éå¹¶äžèŠæ±äžå¯çºŠïŒãäžäžªåæ©åŒ $quotient(K, F)$ 称䞺 $f(x)$ çåè£åïŒåŠæïŒ
- $f(x)$ åš $K[x]$ äžå¯ä»¥åæäžæ¬¡å€é¡¹åŒçä¹ç§¯ïŒæè
è¯Žæ°æ $deg f$ äžªæ ¹ $alpha_i$
- $K = F[alpha_1, alpha_2, ..., alpha_(deg f)]$
]
#remark[][
讟 $F <= E <= K, f(x) in F[x]$ïŒ$f(x)$ åš $F$ äžçåè£åæ¯ $K$ïŒåå®åš $E$ äžçåè£ååœç¶ä¹æ¯ $K$ã
]<cor1>
#theorem[][
åæ©åŒ äžå®æ¯æéæ©åŒ ãæŽè¿äžæ¥ïŒ$f(x) in F[x]$ïŒåè£å䞺 $K$ïŒåïŒ
$
[K : F] <= n!
$
]
#proof[
对 $f(x)$ çæ¬¡æ°åœçº³ïŒå讟 $deg(f) < n$ çæ
圢éœå·²æç«ã\
讟 $f(x) = p(x)k(x)$ïŒå
¶äž $p(x)$ äžå¯çºŠã什ïŒ
$
E = quotient(F[x], (p(x)))
$
åš $E$ äžïŒ$p(x) in E[x]$ åœç¶ææ ¹ïŒå æ€ $f(x)$ ä¹ææ ¹ïŒè®ŸïŒ
$
f(x) = (x - theta)g(x)
$
ç±åœçº³æ³ïŒååšåè£åïŒ
$
quotient(K, E)
$
满足ïŒ
$
[K : E] <= (n-1)!
$
ä»èåœç¶ $quotient(K, F)$ æ¯ $f(x)$ çåè£åïŒäžïŒ
$
[K : F] <= n!
$
]
#example[ååå][
å $QQ$ äžïŒå€é¡¹åŒ $x^n - 1$ çåè£å称䞺 $n$ 次åååã
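äŸåŠ $n = 4$ æ¶ $x^4 - 1 = (x - 1)(x + 1)(x^2 + 1)$ïŒæ
åæ¬¡åååå°±æ¯ $QQ(i)$ïŒå®æ¯ $QQ$ çäºæ¬¡æ©åŒ ïŒè¿æ¯äŸå®ä¹çŽæ¥è®¡ç®çå°äŸåïŒåæ³šè¡¥å
ïŒã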
]
#lemma[][
讟 $eta$ æ¯ååæïŒ$p(x)$ æ¯äžå¯çºŠå€é¡¹åŒïŒå $eta(p(x))$ åœç¶ä¹æ¯äžå¯çºŠå€é¡¹åŒïŒè¿äžæ¥ïŒ
$
quotient(F[x], (q(x))) tilde.eq quotient(eta(F[x]), (eta(p(x))))
$
]
#example[][
- $a + b sqrt(2) -> a - b sqrt(2)$ æ¯ $Q[sqrt(2)]$ çèªåæ
$Q((sqrt(5+sqrt(2)))) tilde.eq quotient(QQ(sqrt(2)), (x^2 - 5 -sqrt(2))) tilde.eq quotient(QQ(sqrt(2)), (x^2 - 5 +sqrt(2))) tilde.eq Q((sqrt(5-sqrt(2))))$
]
#lemma[][
讟 $eta$ æ¯ååæïŒ$f(x) in F[x], f'(x) = eta(f(x))$ã讟 $E, E'$ å嫿¯ $f(x), f'(x)$ åšåèªåäžçåè£åïŒé£ä¹ååšïŒäžäžå®å¯äžïŒåæ $sigma$ 䜿åŸäž€äžªåè£ååæ
]
#proof[
æä»¬çæè·¯æ¯ç»åºäžäžªæ åçåè£åæé ïŒåè¯æå®ä»¬çäžèŽæ§ã\
æä»¬è¯æå 区çåœé¢ïŒ
#lemma[][
讟 $eta:F -> F'$ æ¯åæïŒ$E, E'$ æ¯ $E, F'$ çäžäžªæ©åäœ¿åŸ $E$ æ¯ $f(x) in F[x] $ïŒæäºæ ¹çæïŒ$f'(x) = eta(f(x)) in F'[x]$ åšå
¶äžåè£ïŒåååšåæ $sigma'$ å° $E$ åµå
¥ $E'$
]
#proof[
]
]
#theorem[][
讟 $F <= E, E' <= K$, $E, E'$ éœæ¯ $f(x) in F[x]$ åš $F$ äžçåè£åïŒåïŒ
$
E = E'
$
]
#corollary[][
讟 $F <= E <= K$ïŒ$sigma$ æ¯ $K$ äžçèªåæïŒ$E$ æ¯æäžªåè£åïŒäž $sigma|_F = id$ïŒåïŒ
$
sigma(E) = E
$
]<lemma1>
== æ£è§æ©åŒ
#definition[æ£è§æ©åŒ ][
äžäžªä»£æ°åæ©åŒ $quotient(K, F)$ 称䞺æ£è§æ©åŒ ïŒåŠæïŒ
- 对 $F[x]$ äžææäžå¯çºŠå€é¡¹åŒ $p(x)$ïŒåªèŠå®åš $K$ äžæäžäžªæ ¹ïŒäŸ¿æææïŒçäºæ¬¡æ°ïŒçæ ¹
]
è¿äžªæ¡ä»¶ç䌌åŸåŒºïŒäœäºå®äžå¹¶äžç¶ïŒäžé¢çå®ç䟿诎æäºè¿ç¹
#theorem[][
äžäžªæéåæ©åŒ $quotient(K, F)$ æ¯æ£è§æ©åŒ åœäžä»
åœå®æ¯æäžªå€é¡¹åŒ $f(x)$ çåè£å
]
#proof[
- 讟 $K = F(alpha_i)$ æ¯æ£è§æ©åŒ ïŒæŸç¶å®å°±æ¯ $f(x) = product_i m_(alpha_i) (x)$ çåè£å
- åŠäžäŸ§æ¯äžå¹³å¡çã讟 $K$ æ¯ $f(x)$ çåè£åïŒ$p(x)$ æ¯æäžªäžå¯çºŠå€é¡¹åŒïŒå®åš $K$ äžææ ¹ $alpha$ã讟 $L$ æ¯ $p(x)$ åš $K$ äžçåè£åïŒåŸè¯ïŒ
$
L = K
$
- éŠå
ïŒ$K <= L$ æ¯æŸç¶çã
- å $beta in L$ æ¯åŠäžäžª $p(x)$ åš $L$ äžçæ ¹ïŒè¥ $p$ æ¯äžæ¬¡å€é¡¹åŒïŒç»è®ºæ¯æŸç¶çïŒïŒæä»¬åç° $L$ å°åæ¶æäžº $F(alpha), F(beta)$ çåè£åïŒå æ€ïŒ
$
F(alpha) tilde.eq F(beta)
$
å¹¶äžè¯±å¯Œåº $L$ äžéå¹³å¡èªåæ $eta$, $eta(F(alpha)) = eta(F(beta)). eta|_F = id$ãäœç± @lemma1ïŒè¿æå³çïŒ
$
eta(K) = K
$
è $alpha in K => beta in K$ïŒè¿è $K$ æ¥æ $p(x)$ åš $L$ äžææçæ ¹ïŒä»èç»è®ºæç«
]
#corollary[][
讟 $quotient(K, F)$ æ¯æéæ£è§æ©åŒ ïŒ$F <= E <= K$ïŒå $quotient(K, E)$ 乿¯æ£è§æ©åŒ ïŒ$quotient(F, E)$ æªå¿
ïŒ
]
#proof[
åå¿ @cor1ïŒç»è®ºæ¯å®¹æç
]
#definition[æ£è§éå
][
ç§° $quotient(L, K)$ æ¯ $quotient(K, F)$ çæ£è§éå
ïŒåŠæïŒ
- $quotient(L, F)$ æ¯æ£è§æ©åŒ
- è¥ $quotient(L', F)$ 乿¯æ£è§æ©åŒ ïŒ$L' subset L$ïŒå $L = L'$
]
#lemma[][
æéåæ©åŒ $quotient(K, F)$ çæ£è§éå
åšåææä¹äžååšäžå¯äžïŒå
·äœçåæå¯èœå€ç§ïŒ
]
#proof[
- ååšæ§ïŒè®Ÿ $F = F(alpha_i)$ïŒåå $product_i m_(alpha_i) (x)$ çäžäžªåè£å $L$ïŒå®¹æéªè¯ $quotient(L, K)$ å°±æ¯æ£è§éå
- å¯äžæ§ïŒè®Ÿ $quotient(L', K)$ æ¯æ£è§éå
ïŒä»åäžé¢ç $f(x)$ïŒå¹¶å $L$ å°±æ¯å®çåè£åãæŸç¶ $f(x)$ åš $L'$ äžåè£ãç±ä¹åçå®çïŒååš $L -> L'$ çåµå
¥ãç±å®ä¹çç¬¬äºæ¡ïŒäžåºææ¯ $L'$ æŽå°çæ£è§æ©åŒ ïŒå æ€$L' tilde.eq L$
]
== å¯åæ©åŒ
#definition[å®å
šå][
ç§°äžäžªç¹åŸ $p$ çæéå $F$ 䞺å®å
šåïŒåŠææ å°ïŒ
$
funcDef(sigma, F, F, x, x^p)
$
æ¯åæãæ¢èšä¹ïŒææå
çŽ éœå¯ä»¥åŒ $p$ æ¬¡æ ¹å·
]
#definition[圢åŒå¯Œæ°][
ç§°åäžå€é¡¹åŒ $f(x)$ ç圢åŒå¯Œæ°äžº $f'(x)$ïŒåŠåå©çšå¯Œæ°æ³å计ç®å
¶å¯Œæ°äžæ ·ã圢åŒå¯Œæ°æ»¡è¶³å¯Œæ°è®¡ç®çæææ§èŽšã
]
#proposition[éæ ¹å€å«æ³][
åäžå€é¡¹åŒ $f(x)$ ïŒåšå
¶åè£åäžïŒæéæ ¹åœäžä»
åœ $f(x), f'(x)$ äžäºçŽ
]<éæ ¹å€å«æ³>
#proof[
å©çšåäžå€é¡¹åŒç¯æ¯äž»çæ³æŽç¯ïŒ$(f(x), f'(x)) = (d(x)) => d(x) | f(x), d(x) | f'(x)$
]
#lemma[][
å®å
šåçä»£æ°æ©åŒ è¿æ¯å®å
šå
]
#proof[
讟 $char F = p$ã对任æ $alpha in K$ïŒå°æïŒ
$
alpha^(1/p) in E = F(alpha)
$
ä»èåå°äºæéæ©åŒ æ
圢ã\
æä»¬æïŒ
$
[E : sigma(E)][sigma[E] : sigma[F]] = [E : F][F : sigma(F)]
$
èæŸç¶ $[sigma[E] : sigma[F]] = [E : F]$ïŒå æ€ïŒ
$
[E : sigma(E)] = [F : sigma(F)]
$
]
#corollary[][
$quotient(K, F)$ æ¯æéæ©åŒ ïŒ$K$ å®å
š $=> K$ å®å
šãäœåšä»£æ°æ©åŒ äžè¿äžªäºå®äžæç«ã
]
#definition[][
è¥ $f(x) in F[x]$ æ¯äžå¯çºŠå€é¡¹åŒïŒåšå
¶åè£åäžïŒ
- è¥ $f(x)$ æéæ ¹ïŒåç§° $f$ æ¯äžå¯åç
- è¥ $f(x)$ æ éæ ¹ïŒåç§° $f$ æ¯å¯åç
]
#proposition[][
äžå¯çºŠå€é¡¹åŒ $f$ äžå¯ååœäžä»
åœ $f'(x) = 0$
]
#proof[
ç»å $f$ äžå¯çºŠåä¹åçéæ ¹å€å«æ³ @éæ ¹å€å«æ³ïŒç»è®ºæ¯å®¹æç
]
#corollary[][
- åšç¹åŸ $0$ çåäžïŒææäžå¯çºŠå€é¡¹åŒéœæ¯å¯åç
- åšç¹åŸ $p$ çåäžïŒææäžå¯åå€é¡¹åŒéœåœ¢åŠ $g(x^p)$ïŒäžå®å
šåäžæ²¡æäžå¯åå€é¡¹åŒ
]
#proof[
(1) æ¯æŸç¶çïŒå¯¹äº (2)ïŒä»€ïŒ
$
D(sum_i a_i x^i) = sum_i i a_i x^(i-1) = 0 => i a_i = 0 => p | i or a_i = 0
$
å æ€å¯¹äºææäžæ¯ $p$ çåæ°ïŒå $i != 0$ïŒè¿è $a_i = 0$
åæ¶ïŒè®Ÿ $F$ æ¯å®å
šåïŒå¯è®ŸïŒ
$
sum_i a_i x^(p i) = sum_i b_i^p x^(p i)
$
泚æå°åšç¹åŸ $p$ çåäžïŒäžå®æïŒ
$
sum_i b_i^p x^(p i) = (sum_i b_i x^i)^p
$
è¿äžå€é¡¹åŒäžå¯çºŠççŸïŒ
]
#corollary[][
åšç¹åŸ $p$ çåäžïŒææäžå¯çºŠå€é¡¹åŒéœåœ¢åŠïŒ
$
g(x^(p^e))
$
å
¶äž $g(x)$ å·²ç»æ¯å¯åäžå¯çºŠå€é¡¹åŒãåæ¶åš $g$ çåè£åäžïŒ$f$ æ°æ $deg(g)$ 䞪äžåçæ ¹ã
]<é¶ç¹äžªæ°>
#proof[
å°äžé¢çæšè®ºè¿äžæ¥è¿è¡ïŒè¥å·²ç»å¯å就忢ïŒè¥äžå¯å就继ç»è¿è¡ïŒæç»ç±äºæ¬¡æ°æéäžå®å¯ä»¥åŸå°ç±»äŒŒç圢åŒã
对äºåè
ïŒæ³šæå°ïŒ
$
g(x) = product_(i=1)^n (x - alpha_i)\
f(x) == product_(i=1)^n (x^(p^e) - alpha_i) = product_(i=1)^n (x - alpha_i^(1/(p^e)))^(p^e)
$
ä»èç»è®ºæç«
]
#definition[][
åšä»£æ°æ©åŒ $quotient(K, F)$ äžïŒ
- è¥ $alpha in K$ åš $F$ äžæå°å€é¡¹åŒå¯å/äžå¯åïŒåç§° $alpha$ å¯å/äžå¯å
- è¥ææå
çŽ éœå¯åïŒåç§°åæ©åŒ å¯åïŒåŠåæäžºäžå¯å
]
#proposition[][
ç»å®åæ©åŒ $quotient(K, E), quotient(E, F)$ïŒè¥ $alpha$ åš $quotient(K, F)$ äžå¯åïŒååš $quotient(K, E)$ å¯å
]
#proof[
ç±äºäž€äžªåæ©åŒ äžæå°å€é¡¹åŒææŽé€å
³ç³»ïŒç»è®ºæ¯å®¹æç
]
#theorem[][
- 讟 $alpha$ åš $F$ å¯åïŒå $F(alpha)$ æ¯å¯åæ©åŒ
- è®Ÿä»£æ°æ©åŒ $quotient(K, E), quotient(E, F)$ å¯åïŒåæ©åŒ $quotient(K, F)$ ä¹å¯å
]<å¯åæ©åŒ å®ç>
#proof[
è¿äžªå®çæ¯åŸäžå¹³å¡çïŒå
论述äžç¹æ³æ³ãæä»¬å¯ä»¥æŸå°äžäžªåæ©åŒ çæ£è§éå
$M$ïŒèèææåæ $phi: K -> M$ äž $phi_F = id$ ææçéå $Hom_F (K, M)$ïŒè¿äžªéåæçéèŠçåèœã
#lemma[][
讟 $K = F(alpha)$, $alpha$ çæå°å€é¡¹åŒäžº $m(x) = g(x^(p^e))$ïŒå
¶äž $g(x)$ æ¯å¯åäžå¯çºŠå€é¡¹åŒïŒåïŒ
$
|Hom_F (K, M)| = deg g(x) <= [F(alpha) : F]
$
åçåœäžä»
åœ $alpha$ å¯å
]
#proof[
æŸç¶è¿æ ·çåæç± $alpha$ çåå¯äžç¡®å®ãåæ¶ïŒç±äº $phi(m(x)) = m(x)$ïŒå æ€ $phi(alpha)$ äžå®ä¹æ¯ $m(x)$ çæ ¹ã@é¶ç¹äžªæ° ç»åºäºè¿æ ·é¶ç¹çäžªæ°æ°äžº $deg g(x)$ãåæ¶ïŒæä»¬æïŒ
$
[F(alpha) : F] = deg(m(x)) >= deg(g(x))
$
ä»èåçåœäžä»
åœ $m(x) = g(x)$ïŒä¹å³ $m(x)$ å¯åïŒçä»·äº $alpha$ å¯å
]
#corollary[][
讟æéåæ©åŒ $quotient(K, F)$ äžå
¶æ£è§éå
$quotient(M, F)$ïŒåæïŒ
$
|Hom_F (K, M)| <= [K : F]
$
äžåçåœäžä»
åœïŒäž€è
çä»·ïŒïŒ
+ $K = F(alpha_1, alpha_2, ..., alpha_n)$ äžæ¯äžªå
çŽ éœå¯å
+ $quotient(K, F)$ å¯å
]<lemma-1>
#proof[
泚æå°ïŒ
$
|Hom_F (F(alpha_1), M)| <= [F(alpha_1) : F]\
|Hom_(F(alpha_1)) (F (alpha_1, alpha_2), M)| <= [F(alpha_1, alpha_2) : F(alpha_1)]
$
äžéŸéªè¯äž€èŸ¹éœæ»¡è¶³ä¹æ§ïŒå æ€ïŒ
$
|Hom_F (F (alpha_1, alpha_2), M)| <= [F(alpha_1, alpha_2) : F]
$
以æ€ç±»æšå³åŸçä»·æ¡ä»¶ 1ãå¯¹äº 2ïŒæ³šæå°ä»»äœäžå¯åå
çŽ éœäžå®äŒç Žåçäºå·ïŒå æ€ç»è®ºä¹æ¯å¯¹çã
]
è³æ€ïŒæä»¬è¯æäº @å¯åæ©åŒ å®ç ç第äžéšåã
åèè第äºéšåïŒåŠææ©åŒ æ¯æéæ©åŒ ç»è®ºå·²ç»å®¹æäºïŒèæä»¬å¯¹äºå¯åæ©åŒ çèŠæ±æ¯æ¯äžªå
çŽ éœå¯åïŒæ¯äžªå
çŽ æåžŠæçä¿¡æ¯éæ¯æéçãæŽäž¥æ Œç诎ïŒè®Ÿ $alpha in K$ïŒåå®åš $E$ äžçæå°å€é¡¹åŒ $m_E (x)$ã\
$m_E (x)$ çç³»æ°æŸç¶åªææé䞪ïŒèèè¿äºç³»æ°ææç $F$ çæéæ©åŒ ïŒ
$
E' = F(a_1, a_2, ..., a_n)\
K' = F(a_1, a_2, ..., a_n, alpha)
$
å¯ä»¥è¯æ $F -> E', E' -> K'$ éœæ¯å¯åæ©åŒ ïŒç¹å«çïŒ$alpha$ å¯å
]
#theorem[æéåçåç±»][
- 讟 $F$ æ¯ç¹åŸ $p$ çåïŒåïŒ
$
|F| = p^([F : ZZ_p])
$
- ç»å®æéåçå
çŽ äžªæ° $n$ïŒååšåææä¹äžååšå¯äžçåæ»¡è¶³ $|F| = n$ïŒäžäžªå
žèæ¯ $x^(p^n) - x in ZZ_p [x]$ çåè£å
]
#proof[
1 æ¯æŸç¶çã讟 $|F| = p^n$ïŒå乿³çŸ€ $F^times$ çé¶æ°äžº $p^n - 1$ïŒä»èïŒ
$
a^(p^n-1) = 1 <=> x^(p^n) - x = 0 space forall a in F
$
åæ¶ $0$ 乿¯ $x^(p^n) - x = 0$ çæ ¹ïŒå æ€ $F$ äžææå
çŽ æ°äžºå€é¡¹åŒ $x^(p^n) - x = 0$ çæ ¹ïŒå æ€ $F$ å°±æ¯å®çåè£åã\
åä¹ïŒç±äº $D(x^(p^n) - x) != 0$ïŒæ
åšå®çåè£å宿²¡æéæ ¹ïŒè¿èå®çåè£åè³å°æ $p^n$ 䞪å
çŽ ã\
]
#theorem[Primitive element theorem, æ¬åå
å®ç][
- äžäžªæéå¯åæ©åŒ äžå®å¯ä»¥ç±å¯äžå
çŽ çæ
- 讟 $alpha, beta$ 代æ°äžå¯åïŒåäžå®ååš $eta$ äœ¿åŸ $F(alpha, beta) = F(beta)$
]<PET>
#proof[
åªè¯æ 2ïŒä»è 1 æ¯æŸç¶ç
èè $eta = alpha + c beta$ãäºå®äžïŒå¯ä»¥æè§å°å¯¹äºç»å€§å€æ° $c$ éœæç«ïŒåªéèŠèº²åŒé£äºè¿æ°æå·®çéšåã
- å¯¹äºæéåïŒæŸç¶æéåéœæ¯å®å
šå
#lemma[][
讟 $|F| = p^n$ïŒå $F$ äžååš $p^m$ é¶åååœäžä»
åœ $m | n$ïŒäžè¿æ ·çå忝äžäžªåè£åïŒè¿èå¯äž
]
#proof[
ç±äºæ©åææçº¿æ§ç©ºéŽïŒå¿
èŠæ§æ¯æŸç¶ãåä¹ïŒæ³šæå°ïŒ
$
x^(p^n) - x = (x^(p^m) - x)(x^(p^n-1)-1)/(x^(p^m-1)-1)
:= f(x) g(x)
$
ä»è $f(x)$ çåè£å $F_(p^m)$ åœç¶å¯ä»¥è¿äžæ¥åè£äžº $f(x)g(x)$ çåè£å $F$ïŒä»èç»è®ºæç«ã
]
- å讟 $F$ æ¯æ ç©·åïŒ
讟 $f, g$ æ¯ $alpha, beta$ çæå°å€é¡¹åŒïŒå¹¶å $E$ 䞺 $f g$ çåè£åïŒåäž $f, g$ çäžåçæ ¹åå«äžºïŒ
$
alpha_1 = alpha, alpha_2, ..., alpha_n\
beta_1 = beta, beta_2, ..., beta_m
$
æä»¬åžæå¯¹ææç $i, k$ åäžçäº $1$ ç $j$ïŒéœæïŒ
$
alpha_i + c beta_1 eq.not alpha_k + c beta_j
$
è¿åªæé€äºæéå€äžª $c$ïŒæä»¬è¿ææ ç©·å€äžª $c$ïŒå¯çšïŒæä»¬åªéè¯æè¿äº $c$ éœæ¯å¥œçïŒä¹å³ïŒ
$
F(alpha + c beta := eta) = F(alpha, beta)
$
åªéèŠè®© $alpha, beta$ èœåš $F(eta)$ äžïŒèèïŒ
$
f_1(x) = f(eta - c x)\
$
ç±åé¢çæ¡ä»¶æä»¬å¯ä»¥ç¥éïŒåš $E$ äž $f_1, g$ æå¯äžçå
Œ
±æ ¹ $beta_1$ïŒåŠå $alpha_1 + c beta_1 = alpha_k + c beta_j$ïŒ
åæ¶ïŒç±å¯åæ§ç¥ $g(x)$ åš $E$ äžäžäŒæéæ ¹ïŒè¿èïŒ
$
(f_1, g) = (x - beta_1)
$
ç¶èæå€§å
¬å åŒåšåæ©åŒ äžä¿æäžåïŒå æ€ $x - beta_1$ 乿¯å®ä»¬åš $F(eta)$ äžçæå€§å
¬å åŒïŒè¿è $beta_1 in F(eta)$ïŒå æ€ä¹æŸæ $alpha_1 in F(eta)$ïŒè¯æ¯ïŒ
]
#remark[][
对äºäžå¯åæ©åŒ ïŒæªå¿
èœä¿è¯çŒ©åå°äžäžªãäºå®äžïŒèèïŒ
$
F = F_p (x, y)\
K = F_p (x^(1/p), y^(1/p))
$
å $[K : F] = p^2$ïŒè $K$ äžä»»äœå
çŽ çæå°å€é¡¹åŒç次æ°äžå®äžè¶
è¿ $p$ïŒè¿èäžå¯èœç±äžäžªå
çŽ çæã
]
@lemma-1 æ¯åŸéèŠçïŒåšä¹åç䌜çœåç论äžåæ ·èµ·çéèŠäœçš
== 䌜çœåç论
#definition[䌜çœåæ©åŒ ][
ç§°äžäžªåæ©åŒ æ¯äŒœçœåæ©åŒ ïŒåŠææ©åŒ æ¯å¯åæ£è§ç
]
#definition[䌜çœå矀][
ç»å®äŒœçœåæ©åŒ $quotient(K, F)$ïŒç§°ææ $K$ çæ»¡è¶³ $phi|_(F) = id$ çèªåæ $phi$ ææç矀䞺䌜çœå矀ïŒè®°äœ $Gal(quotient(K, F))$
]
#proposition[][
$|Gal(quotient(K, F))| = [K : F]$
]
#proof[
è¿æ¯ @lemma-1 çèªç¶ç»æ
]
#example[åäºæ¬¡æ©åŒ ][
$QQ -> QQ(sqrt(2), sqrt(3))$ïŒå
¶äŒœçœå矀æå䞪å
çŽ ïŒ
$
sqrt(2) -> plus.minus sqrt(2)\
sqrt(3) -> plus.minus sqrt(3)
$
è¿äžªçŸ€åœç¶å°±æ¯åå
矀 $ZZ_2 times ZZ_2$\
åæ¶ïŒäžäžªéå¹³å¡å
çŽ åå«æäžº $QQ(sqrt(2)), QQ(sqrt(3)), QQ(sqrt(6))$ ççš³å®åïŒè¿äžäžªåéœæ¯åæ©åŒ çäžéŽååã
]
ä»äžé¢çäŸåå¯ä»¥çå°ïŒäŒœçœå矀çå矀å¯èœæ¯æäžªååççš³å®åïŒä»¥äžçå®ç诎æäºè¿äžªåŸäžå¹³å¡çäºå®ã
#theorem[äž»å®ç/䌜çœåå®ç][
讟 $quotient(K, F)$ æ¯æé䌜çœåæ©åŒ ïŒ$G$ æ¯å
¶äŒœçœå矀ãåïŒ
+ åšä»¥äžéåä¹éŽååšäžäžå¯¹åºïŒ
$
{E | K <= E <= F} &<-> {H | H <= G}\
E &-> Gal(quotient(K, E)) = {phi in Isom(K) | phi|_E = id}
$
åŸåŸå° $H <= G$ çåè®°äœ $K^(H)$
+ äžé¢çäžäžå¯¹åºäžïŒçŸ€è¶å°ïŒåè¶å€§ïŒä¹å³ïŒ
$
H_1 <= H_2 <=> K^(H_1) >= K^(H_2)
$
+ $|H| = [K : K^H], [G:H]=[K^H:F]$
+ 讟 $E = K^H$ïŒåä»»å $g in G$ïŒæïŒ
$
g(E) = K^(conjugateRight(g, H))
$
+ $H norS G$ åœäžä»
åœ $quotient(K^H, F)$ æ¯æ£è§æ©åŒ ïŒåæ¶æä»¬æïŒ
$
quotient(G, H) tilde.eq Gal(quotient(K^H, F))
$
+ 讟 $E_1, E_2$ åå«å¯¹åº $H_1, H_2$ïŒåïŒ
$
E_1 E_2 <-> H_1 sect H_2\
E_1 sect E_2 <-> generatedBy(H_1\, H_2)
$
]
#proof[
å
ç»åºç¬¬äžæ¡çè¯æïŒè¿æ¯çžå¯¹æå°éŸæäžå¹³å¡çäžæ¡ã\
ç±äŒœçœåæ©åŒ çèŠæ±ïŒæä»¬å¯ä»¥å讟 $K$ æ¯æäžªå¯åå€é¡¹åŒ $f(x) in F[x]$ çåè£åã\
- ç»å® $H <= G$ïŒå
æŸåºè¢« $H$ äžåçåå $E$ïŒåè¯æ $Gal(quotient(K, E)) = H$\
- éŠå
ïŒåœç¶æ $H <= Gal(quotient(K, E))$ïŒæä»¬åªéèŠè¯æäžäŒå€åºæ¥äžäºå
çŽ ãæä»¬çæ³æ³æ¯éè¿è®¡æ°è§£å³ïŒä¹å³æä»¬åžæè¯æïŒ
$
|H| >= |Gal(quotient(K, E))| = [K : E]
$<equ_c>
è¿ä¹æ¯äž»å®çæéŸçéšåïŒè¿äžªäºå®æäž€äžªè¯æïŒ
+ å©çš @PETïŒå¯ä»¥å讟 $E -> K$ æ¯åæ©åŒ ïŒå讟 $alpha$ æ¯çæå
\
èèå€é¡¹åŒïŒ
$
f(x) = product_(sigma in H) (x - sigma(alpha))
$
ç±äºå®çææç³»æ°éœè¢« $H$ äžææå
çŽ åºå®ïŒåœç¶æ $f(x) in E[x]$ãå æ€ïŒå®æ¯ $alpha$ åš $E$ äžçé¶åå€é¡¹åŒä»èæïŒ
$
m_alpha (x) | f(x)
$
è $deg(m_alpha (x)) = [K : E], deg(f(x)) = |H|$ïŒäžåŒå³è¡šæ@equ_c æç«
+ #lemma[Artin][
讟 $H = {sigma_i}$ïŒä»€ $a_i$ æ¯ $K$ äž $n+1$ 䞪å
çŽ ïŒåå®ä»¬åš $E$ äžçº¿æ§çžå
³ã
]
#proof[
èèç©éµïŒ
$
(sigma_i (a_j))_(n times (n+1))
$
æŸç¶ïŒå®çååéåœç¶åš $K$ äžçº¿æ§çžå
³ãæä»¬åžæè¯æå®çååéåš $E$ äžä¹çº¿æ§çžå
³ïŒè¿èè§å¯ç¬¬äžè¡ç«åŸåç»è®ºã\
è®°å
¶ååé䞺 $alpha_i$ïŒäžåŠšå讟ïŒ
$
exists r: alpha_1, alpha_2, ..., alpha_r åš K "äžçº¿æ§çžå
³"ïŒäœ\
alpha_1, alpha_2, ..., alpha_(r+1) åš K "äžçº¿æ§æ å
³"
$
讟ïŒ
$
alpha_(r+1) = (alpha_1, alpha_2, ..., alpha_r)vec(k_1, k_2, dots.v ,k_r)
$
åïŒ
$
sigma(alpha_(r+1)) = (sigma(alpha_1), sigma(alpha_2), ..., sigma(alpha_r))vec(sigma(k_1), sigma(k_2), dots.v, sigma(k_r))
$
ç¶èïŒæä»¬æ³šæå° $sigma$ äœçšäº $alpha_i$ æ 鿝äžäžªïŒçžåçïŒå亀æ¢ïŒå æ€äžåŠšè®ŸïŒ
$
sigma(alpha_i) = A alpha_i, forall i = 1, 2, ..., n
$
è¿éç $A$ æ¯äžäžªåçç©éµïŒåœç¶æ¯å¯éçãä»èäžåŒå䞺ïŒ
$
A alpha_(r+1) = (A alpha_1, A alpha_2, ..., A alpha_r)vec(sigma(k_1), sigma(k_2), dots.v, sigma(k_r))\
=> A alpha_(r+1) = A(alpha_1, alpha_2, ..., alpha_r)vec(sigma(k_1), sigma(k_2), dots.v, sigma(k_r))\
=> alpha_(r+1) = (alpha_1, alpha_2, ..., alpha_r)vec(sigma(k_1), sigma(k_2), dots.v, sigma(k_r))
$
äœç±äº $alpha_(r+1)$ 被衚åºçæ¹åŒåºè¯¥æ¯å¯äžçïŒè¿è¡šæïŒ
$
sigma(k_i) = k_i, forall i = 1, 2, ..., r
$
è¿æ ·çæäœå¯¹ææ $sigma in H$ éœæç«ïŒå æ€ææçç³»æ°è¢« $H$ äžææå
çŽ ä¿æäžåšïŒè¿èïŒ
$
k_i in E, forall i = 1, 2, ..., r
$
è¿å°±è¯æçº¿æ§çžå
³æ§åš $E$ äžä¹æç«ïŒåç»è®ºåŸè¯
]
æäºè¿äžªåŒçïŒç»è®ºæ¯æŸç¶ç
- åä¹ïŒç»å®äžéŽå $E$ïŒæŸå°åºå® $E$ äžåšç䌜çœå矀 $H <= G$ïŒæä»¬åžæè¯æè¢« $H$ åºå®çåå $E' = E$
- éŠå
ïŒåœç¶æ $E <= E'$
- å
¶æ¬¡ïŒæä»¬è¿æ¯æ¥è®¡ç®äžäžæ©åŒ 次æ°ïŒå©çšäžé¢çç»æïŒæ³šæå°ïŒ
$
[K : E] = |H| = [K : E']
$
è¿åœç¶å°±è¡šæ $E = E'$
]
#proof[
+ åé¢å·²ç»è¯æ
+
è¥ $H_1 <= H_2$ïŒåœç¶æ $K^(H_1) >= K^(H_2)$\
è¥ $E_1 <= E_2$ïŒåœç¶æ $Gal(K, E_2) <= Gal(K, E_1)$
+ 泚æå° $K -> F$ å¯åæ£è§èŽå«ç $H -> F$ å¯åæ£è§ïŒå æ€ $H = Gal(quotient(K, K^H))$ åœç¶æäžè¿°ç»è®º
+
$
x in K^(conjugateRight(g, H)) <=> conjugateRight(g, H)x = x \
<=> H Inv(g) x = Inv(g) x <=>Inv(g) x in K^H <=> x in g(K^H)
$
+ åå¿ @lemma1ïŒæä»¬èŠåçäºæ
åŸç±»äŒŒãäºå®äžïŒ@lemma1 åè¯æä»¬æ£è§æ©åŒ äžå®å¯¹åºæ£è§å矀ïŒåè¿æ¥åªéè¯ææ£è§åçŸ€å¯¹åºæ£è§æ©åŒ ã\
䞺æ€ïŒå $f(x) in F[x]$ïŒå®åš $K^H$ äžæé¶ç¹ïŒåªéè¯æå®åè£ãå
è¯æäžäžªåŒçïŒ
#lemma[][
è¥ $quotient(K, F)$ æ¯äŒœçœåæ©åŒ ïŒäž $F[x]$ äžäžå¯çºŠå€é¡¹åŒ $f(x)$ åš $K$ äžåè£ã讟å
¶äžäžªæ ¹äžº $alpha$ïŒåå®çæææ ¹æ°äžº $Gal(quotient(K, F)) alpha$
]<lemma-polynomial>
#proof[
éŠå
ïŒæŸç¶ $Gal(quotient(K, F)) alpha$ åœç¶æ¯å€é¡¹åŒçæ ¹ïŒå 䞺以䌜çœå矀äžçå
çŽ äœçšäº $f(x)$ äžæ¹åææç³»æ°\
什 $g(x) = product_(sigma in Gal(quotient(K, F))) (x - sigma(a))$ïŒæ³šæå° $Gal(quotient(K, F))$ äžææå
çŽ ä¿æ $g(x)$ äžåïŒå æ€å®æ¯ $F$ ç³»æ°å€é¡¹åŒãåæ¶æŸæ $f, g$ äœäžº $K[x]$ äžå€é¡¹åŒäžäºçŽ ïŒè¿èäœäžº $F[x]$ äžå€é¡¹åŒä¹äžäºçŽ ãè $f(x)$ æ¯äžå¯çºŠå€é¡¹åŒïŒç»åº $f(x) | g(x)$ïŒè¿å°±è¯æäºåŠäžæ¹é¢ã
]
åå°åç»è®ºïŒåŒçåè¯æä»¬ $f(x)$ ïŒåš $K$ïŒäžçææé¶ç¹æ°äžº $Gal(quotient(K, F))alpha$ïŒäœåŠäžæ¹é¢ïŒ
$
sigma(alpha) in K^(conjugateRight(sigma, H)) = K^H
$
è¿å°±è¯æäº $K^H$ å·²ç»å
å« $f(x)$ çæææ ¹ã\
æåïŒåæ¬¡ç± @lemma1ïŒä»»äœ $Gal(quotient(K, F))$ éœäŒä¿æ $K^H$ çš³å®ïŒå æ€æèªç¶çåæïŒ
$
eta: Gal(quotient(K, F)) -> Gal(quotient(K^H, F))
$
- $ker eta = {sigma in Isom(K) | sigma|_(K^H) = id} = Gal(quotient(K, K^H)) = H$
- 计æ°åç°å®ä¹æ¯æ»¡å°ïŒå æ€åæå®çå³åŸåç»è®º
+ 泚æå° $phi|_(E_1 E_2) = id <=> phi|_(E_1) = id and phi|_(E_2) = id$ïŒå æ€ç¬¬äžæ¡æç«ã\
对äºç¬¬äºæ¡ïŒæä»¬ä»åŠäžæ¹åèèã$K^(generatedBy(H_1\, H_2))$ å°±æ¯è¡šç€ºè¢«ææ $H_1, H_2$ äžçå
çŽ éœåºå®çå
çŽ ææçååïŒåœç¶å°±æ¯ $E_1 sect E_2$
]
#remark[][
@lemma-polynomial æ¯ååéèŠçïŒæäºè¿äžªåŒçïŒæä»¬åŸåŸä¹æäŒœçœå矀äžçå
çŽ ç§°äœå
±èœã
]
#proposition[][
讟 $quotient(K, F)$ æ¯äŒœçœåæ©åŒ ïŒ$quotient(E, F)$ æ¯ä»»æåæ©åŒ ïŒåïŒ
- $quotient(K E, E)$ æ¯äŒœçœåæ©åŒ
- $Gal(quotient(K E, E)) tilde.eq Gal(quotient(K, K sect E))$
]
#proof[
ç±é¢è®ŸïŒè®Ÿ $K$ æ¯ $F$ äžæäžªå¯åå€é¡¹åŒ $f(x)$ çåè£åã\
å° $f(x)$ çäœ $E$ äžå€é¡¹åŒïŒåœç¶å®æ¯å¯åçïŒåå®çåè£å䞺 $K'$\
泚æå°ïŒ
+ $E <= K'$
+ $f(x) in F[x]$ åš $K'$ äžåè£ïŒä»è $K <= K'$
è¿å°±è¯Žæ $K E <= K'$\
å
¶æ¬¡ïŒåœç¶æ $f(x) in E[x]$ åš $K E$ äžåè£ïŒå æ€ïŒ
$
K' = K E
$
è¿å°±è¯æäºç¬¬äžæ¡ã\
ç»åºåææ å°
$
Phi: Gal(quotient(K E, E)) -> Gal(quotient(K, K sect E))\
sigma -> sigma|_K
$
- åå°æ§èŽšå®¹æéªè¯
- æ»¡å°æ§èŽšæ¯èŸå°éŸïŒæä»¬éèŠå©çšäŒœçœåç论è¿åã讟 $H = im Phi <= Gal(quotient(K, K sect E))$ïŒå $K^H <= K$ïŒåªééªè¯ $K^H <= E$\
ä»»å $sigma in Gal(quotient(K E, E))$ïŒç±äº $sigma|_K$ ä¿æ $K^H$ äžåšïŒåœç¶æ $sigma$ ä¿æ $K^H$ äžåš\
äœåŠäžæ¹é¢ïŒè¢« $Gal(quotient(K E, E))$ äžææå
çŽ ä¿æäžåšçåååœç¶åªèœå«äº $E$ïŒè¿è $K^H <= E => K^H <= K sect E => K^H = K sect E$
]
#example[][
è¿äžªåœé¢äžïŒæäžäžªæ©åŒ æ¯äŒœçœåæ©åŒ çæ¡ä»¶æ¯å¿
èŠçãå讟åšåœé¢äžïŒææ¡ä»¶æ¹æ $quotient(K E, F)$ æ¯äŒœçœåçïŒäž
$
K sect E = F
$
é£ä¹æä»¬å°æïŒ
$
Gal(quotient(K E, E)) = H_1, Gal(quotient(K E, K)) = H_2\
Gal(quotient(K E, F)) = generatedBy(H_1\, H_2)\
Gal(quotient(K E, K E)) = {1} = H_1 sect H_2
$
æ€æ¶äžºäºåŸå°å¥œçç»è®ºïŒæä»¬åœç¶åžæ $H_1, H_2$ è³å°äžäžªæ¯æ£è§åçŸ€ïŒæ¢èšä¹ïŒå
¶äžäžéšåæ¯äŒœçœåæ©åŒ ïŒïŒè¿èç«å»æïŒ
$
quotient(generatedBy(H_1\, H_2), H_1) tilde.eq H_2
$
]
#theorem[][
讟 $quotient(K_1, F), quotient(K_2, F)$ éœæ¯äŒœçœåæ©åŒ ïŒåïŒ
- $K_1 sect K_2$ æ¯äŒœçœåç
- $K_1 K_2 $ æ¯äŒœçœåç
-
$
Gal(quotient(K_1 K_2, F)) tilde.eq {(g_1, g_2) in Gal(quotient(K_1, F)) times Gal(quotient(K_2, F))| g_1|_(K_1 sect K_2) = g_2|_(K_1 sect K_2)}
$
- è¥è¿æ $K_1 sect K_2 = F$ïŒåïŒ
$
Gal(quotient(K_1 K_2, F)) tilde.eq Gal(quotient(K_1, F)) times Gal(quotient(K_2, F))
$
]
#proof[
- åªéæç
§å®ä¹éªè¯æ£è§å³å¯
- 泚æå° $K_1, K_2$ æ¯ $f_1 (x), f_2 (x)$ åè£åïŒå $K_1 K_2$ å°±æ¯ $f_1 (x) f_2 (x)$ çåè£å
]
#example[][
åšäŒœçœåæ©åŒ $QQ -> QQ(root(3, 2), omega)$ äžïŒäŒœçœå矀æ°äžº $S_3$ïŒ$QQ$ äžäžå¯çºŠå€é¡¹åŒ $x^2 - 2$ çäžäžªæ ¹çå
šçœ®æ¢ïŒ\
å
¶äž $(23)$ åºå®äº $QQ(root(3, 2))$ïŒäžæ¯æ£è§æ©åŒ ã$(123)$ åºå®äº $omega$ïŒ$omega = (root(3, 2)omega)/root(3, 2) = (root(3, 2)omega^2)/(root(3, 2)omega)$ïŒïŒè¿æ¯æ£è§æ©åŒ ïŒäºæ¬¡æ©åŒ 忣è§ïŒïŒåæ¶å¯¹åºççŸ€ä¹æ¯æ£è§åçŸ€ïŒææ°äžº $2$ çææ°åæ£è§ïŒ
]
对äºäžè¬çå¯ååæ©åŒ $quotient(K, F)$ïŒæä»¬åŸåŸå¯ä»¥åå®çæ£è§éå
$quotient(L, F)$ïŒæ€æ¶å®æäžºäžäžªäŒœçœåæ©åŒ ãå讟 $K -> L$ 对åºå矀 $H$ïŒå $quotient(K, F)$ çæ§èŽšæç§æä¹äžå°±æ¯éªéç©ºéŽ $G:H$ äžçæ§èŽšã
#example[][
èè $QQ[root(4, 2)]$ïŒå®çæ£è§éå
æ¯ $QQ[root(4, 2), i]$ãå®ç䌜çœåçŸ€æ¯ $D_4$ïŒå䞪顶ç¹åå«äžº $plus.minus root(4, 2), plus.minus root(4, 2)i$\
äºå®äžïŒå
¶äžç $r = root(4, 2) -> root(4, 2)i, s = i -> -i$\
æä»¬æ³èŠå¯»æŸå矀 ${1, s r}$ åºå®çååïŒæ³šæå°ïŒ
$
s r(root(4, 2) + s r(root(4, 2))) = root(4, 2) + s r(root(4, 2))
$
å æ€ $root(4, 2) + s r(root(4, 2))$ å¯èœæ¯æä»¬èŠæŸçåäžçäžäžªå
çŽ ïŒå®å°±æ¯ $(1-i)root(4, 2)$ãå®åš $QQ$ äžæå°å€é¡¹åŒäžº $4$ æ¬¡ïŒæŸç¶æ¯ $8$ çå åïŒäœäžäŒæ¯ $1, 2, 8$ïŒïŒå æ€å®å°±æ¯äžäžªçæå
]
#theorem[æéåæ©åŒ ç䌜çœå矀][
讟 $F_(q) -> F_(q^m)$ æ¯äŒœçœåæ©åŒ ïŒåå®ç䌜çœå矀就æ¯ïŒ
$
a -> a^q
$
å
¶äž $q$ æ¯å¯æ¯å°Œå
çŽ ïŒè¿äžå®æ¯äžªåŸªç¯çŸ€ã
]
#definition[é¿èŽå°æ©åŒ ][
ç§°äžäžªäŒœçœåæ©åŒ æ¯é¿èŽå°æ©åŒ ïŒåŠæå®ç䌜çœå矀æ¯é¿èŽå°çŸ€
]
#definition[ååæ©åŒ ][
- çç¥ $n$ 次åäœæ ¹ææåŸªç¯çŸ€ïŒå®ççæå
被称䞺æ¬ååäœæ ¹ïŒè®°äœ $zeta_n^a$ãæŸç¶è¿æ ·ççæå
æ°æ $phi(n)$ 䞪ã
å®ä¹ïŒ
$
Phi_n (x) = product_a (x - zeta_n^a) in CC[x]
$
è¿æ ·çå€é¡¹åŒè¢«ç§°äžºååå€é¡¹åŒïŒæ€æ¶æïŒ
$
x^n - 1 = product_i (x - omega_n^i) = product_(d | n)Phi_n (x)
$
äºå®äžïŒç±æ€å¯ä»¥åœçº³è¯æ $Phi_n (x) in ZZ[x]$ \
äžé¢çå®ç衚æ $Phi_n (x)$ äžå¯çºŠïŒè¿èæäžº $zeta_n^a$ çæå°å€é¡¹åŒãåœ¢åŠ $QQ[zeta_n^a]$ çæ©åŒ è¢«ç§°äžºååæ©åŒ ïŒå®ç䌜çœå矀æ°äžºïŒ
$
(ZZ_n)^times
$
]
#theorem[][
$Phi_n (x)$ åš $ZZ[x], QQ[x]$ éœäžå¯çºŠïŒè¿èæäžº $zeta_n^a$ çæå°å€é¡¹åŒ
]
#proof[
ç±é«æ¯åŒçïŒåªéèŠè¯æåš $ZZ[x]$ äžå¯çºŠå³å¯ã
- å $zeta$ ïŒåš $Phi_n$ çåè£åäžïŒæ¯äžäžª $n$ 次åäœæ ¹\ åªéèŠè¯æå®å°±æ¯æå°å€é¡¹åŒ $m(x)$ïŒæŸæ $m(x) | Phi_n (x)$ã\
æä»¬çç®æ æ¯è¯æ $m(x)$ å
å« $Phi_n (x)$ çæææ ¹ãèæ³šæå° $Phi_n (x)$ çæææ ¹éœåœ¢åŠïŒ
$
zeta^a, a = p_1^(alpha_1)p_2^(alpha_2)...,p_n^(alpha_n)
$
䞺äºè¯æè¿äºå
çŽ éœæ¯ $m(x)$ çæ ¹ïŒäºå®äžæ¯æ¬¡åªéæ·»å äžäžªçŽ å åãç±äºææç $n$ 次åäœæ ¹éœæ¯å¯¹ç§°çïŒè¿æ ·çæ·»å åœç¶å¯ä»¥äžçŽè¿è¡äžå»ïŒå æ€æä»¬åªéèŠäžé¢çåŒçïŒ
#lemma[][
讟 $p$ æ¯äžäžªäžæ¯ $n$ çå åççŽ æ°ïŒå $zeta^p$ 乿¯ $f(x)$ çæ ¹
]
#proof[
åŠè¥äžç¶ïŒä»€ $g(x)$ æ¯ $zeta^p$ çæå°å€é¡¹åŒãæŸç¶ $f(x)$ åºè¯¥äž $g(x)$ äºçŽ ã\
æ€æ¶ååå€é¡¹åŒ $Phi_n (x)$ å°æ¥æäž€äžªäºçŽ çå å $f(x)$ å $g(x)$ ã\
äœæ¯ïŒæä»¬æïŒ
$
g(zeta^p) = 0 => g(x^p) "以 " zeta "䞺äžäžªæ ¹" => m(x) | g(x^p)
$
什 $g(x^p) = f(x)k(x), k(x) in ZZ[x]$\
åèªç¶åæïŒ $phi: ZZ -> ZZ_p$ïŒæä»¬æïŒ
$
phi(g(x^p)) = phi(f(x)) phi(k(x))
$
ç¶èæä»¬æ³šæå°ïŒä»»äœå€é¡¹åŒ $mod p $ éœæïŒ
$
phi(g(x^p)) = phi((g(x)))^p
$
ä»èïŒ
$
phi((g(x))^p) = phi(f(x)k(x))
$
è¿è¡šæïŒåš $ZZ_p [x]$ äžïŒ$f(x)$ äž $g(x)$ æå
Œ
±å åã\
äœåæä»¬æïŒ
$
phi(f(x)g(x)) | phi(Phi_n (x))
$
ç±äº $phi(f(x)), phi(g(x))$ æå
Œ
±å åïŒ$phi(Phi_n (x))$ å°åšå®çåè£åäžæéæ ¹ïŒå èç±å®ä¹ïŒ$x^n - 1 in ZZ_p [x]$ å°åšåè£åäžæéæ ¹ã\
äœç±éæ ¹å€å«æ³ïŒè¿æ¯èè°¬çã
]
ååæ©åŒ æ¯äžäžªéåžžå
·äœèåŒºå€§çæ©åŒ ïŒå¯ä»¥åŒåºåŸå€æè¶£çç»æã
#corollary[][
ä»»ææé亀æ¢çŸ€éœæ¯æäžªäŒœçœåæ©åŒ ç䌜çœå矀ã
]
#proof[
ç±äºæé亀æ¢çŸ€æç»æå®çïŒä»€ïŒ
$
G = quotient(ZZ, n_1) times quotient(ZZ, n_2) times ... times quotient(ZZ, n_k)
$
ç±çå©å
é·å®çïŒååš $p_i$ 䜿åŸïŒ
$
n_i = 1 mod p_i
$
åè¿æ ·é åºç $(ZZ_(p_1 p_2 ... p_n))^times$ æ°å¥œå¯¹åºäžäžªååæ©åŒ ïŒç䌜çœå矀ïŒïŒè $G$ æäžºè¿äžªäŒœçœå矀çäžäžªæ£è§å矀ïŒä»èåœç¶ä¹æ¯æäžªäŒœçœåæ©åŒ ç䌜çœå矀ã
]
#example[][
æŸå°äžäžªæ¬¡æ°äžº $3$ ç $QQ$ äžçåŸªç¯æ©åŒ ïŒäŒœçœå矀䞺埪ç¯çŸ€ïŒ\
泚æå°ååæ©åŒ $QQ(zeta_7)$ ç䌜çœå矀æ°å¥œæ¯ $ZZ_6$ïŒå æ€åå®çäžäžªäºé¶åŸªç¯å矀ïŒè¿äžªåŸªç¯å矀æåºå®çåå $F$ 峿»¡è¶³ïŒ
$
[QQ(zeta_7) : F] = 2\
[F : QQ] = 3\
$
äžïŒ
$
Gal(quotient(F, QQ)) tilde.eq quotient(ZZ_6, ZZ_2) = ZZ_3
$
]
#theorem[Kronecker - Weber][
$QQ$ äžä»»ææéé¿èŽå°æ©åŒ éœæ¯æäžªååæ©åŒ çåæ©åŒ
]
è¿äžªå®çæ¯é垞区倧çïŒåç»å»¶äŒžåºè¯žå€å·¥äœè®šè®ºèœåŠæç±»äŒŒçå·¥äœå»¶äŒžå° $QQ$ 以å€çåäžã
]
== å€é¡¹åŒç䌜çœå矀
è¿å
¥äž»é¢ä¹åïŒæä»¬éèŠåå€äžäºå·¥å
·
#definition[矀ç¹åŸ][
ç»å®äžäžªäº€æ¢çŸ€ $G$ 并讟 $L$ æ¯åïŒ$H$ çäžäžªåŒåš $L$ äžçç¹åŸæ¯æ $H$ å° $L^times$ çäžäžªçŸ€åæ
]
#theorem[Artin][
å讟 $kai_i$ æ¯çŸ€ $G$ åš $L$ äžçäžåç¹åŸïŒåå®ä»¬äœäžº$H -> L^times$ çåœæ°æ¯çº¿æ§æ å
³ç
]<Artin-linear-independent>
#proof[
å讟å®ä»¬çº¿æ§çžå
³ïŒé£ä¹æä»¬äžåŠšå讟 $kai_1, ..., kai_(r-1)$ æ¯æå€§æ å
³ç»ïŒå¹¶æïŒ
$
kai_r (h) = sum_i a_i kai_i (h) ,forall h in H
$<linear_relation>
æ¢ç¶å®ä»¬æ¯äžåçç¹åŸïŒå¯è®ŸïŒ
$
kai_1 (h_0) != kai_r (h_0)
$
ç±äºä»»ææ§ïŒæä»¬ç¥éïŒ
$
kai_r (h h_0) = sum_i a_i kai_i (h h_0)
$
äœç¹åŸæ¯çŸ€åæïŒå æ€ïŒ
$
kai_r (h) kai_r (h_0) = sum_i a_i kai_i (h) kai_i (h_0)
$
泚æå°åäžç乿³çŸ€äžå« $0$ïŒå æ€ $kai_r (h_0) !=0$ïŒäžåŒå°ç»åºäžäžªäžåäº@linear_relation ç线æ§è¡šåºïŒççŸïŒ
]
#definition[åŸªç¯æ©åŒ ][
ç§°äžäžªäŒœçœåæ©åŒ æ¯åŸªç¯çïŒåŠæå®ç䌜çœå矀æ¯åŸªç¯çŸ€
]
#theorem[Kummer][
- å讟 $char(F)$ äžæ¯ $n$ çå åïŒäž $F$ å
嫿æ $n$ 次åäœæ ¹ïŒåä»»å $a in F$ïŒ$K = F(root(n, a))$ æ¯åŸªç¯æ©åŒ ïŒäžæ©åŒ æ¬¡æ°æ¯ $n$ çå å
- å讟 $char(F)$ äžæ¯ $n$ çå åïŒäž $F$ å
嫿æ $n$ 次åäœæ ¹ïŒåææ $F$ äžçåŸªç¯æ©åŒ éœç±æ·»å æäžª $root(n, a)$ åŸå°
]<Kummer-theorem>
#proof[
+ 泚æå°ïŒ
$
x^n - a = product_i (x - root(n, a) xi_n^i)
$
åæ¶ç±æ¡ä»¶ïŒ$x^n - 1$ æ éæ ¹ïŒå æ€è¿äºåäœæ ¹äž€äž€äžçïŒåæ¶äžåŒæ¯å¯åå€é¡¹åŒïŒåš $K$ äžå®å
šåè£ïŒè¿è $K$ æ¯åè£åïŒæ©åŒ æ¯æ£è§æ©åŒ ãè¿äžå¹¶ç»åºæ©åŒ æ¯äŒœçœåæ©åŒ \
䞺äºå³å®çŸ€ççç»æïŒèèä»»äœ $F-$ èªåæïŒæŸç¶è¿æ ·çèªåæç± $root(n, a)$ çåå¯äžç¡®å®ïŒäžäžå®æ $root(n, a)$ éå° $root(n, a) xi_n^i$ äžæäžäžªã\
ç»åºæ å°ïŒ
$
funcDef(lambda, Gal(quotient(K, F)), {xi_n^i}, sigma, sigma(root(n, a))/root(n, a))
$
- äžé¢å·²ç»è¯Žæ $F-$ èªåæ $sigma$ ç± $lambda(sigma)$ å¯äžç¡®å®ïŒä»è宿¯åå°
- 计ç®éªè¯ $lambda$ æ¯çŸ€åæïŒ
$
sigma compose tau(root(n, a)) = sigma(xi_n^(lambda(tau))root(n, a)) = tau(xi_n^(lambda(tau))) tau(root(n, a)) = xi_n^(lambda(tau))tau(root(n, a))\
= xi_n^(lambda(tau) + lambda(sigma)) root(n, a)
$
足以诎æåæ
å æ€ç±åæå®çïŒç»è®ºæç«ãå¯¹äºæ©åŒ 次æ°ïŒç±äºæäžäžª $n$ 次é¶åå€é¡¹åŒïŒç»è®ºæŸç¶ã
+ è¯æçæ žå¿åœç¶æ¯æŸå°äžäžª $a$\
å讟䌜çœå矀䞺 $generatedBy(sigma)$ïŒå¯¹äºä»»æ $alpha$ïŒä»€:
$
b = sum_i^n xi_n^(i-1) sigma^(i-1)(alpha)
$
- éŠå
è¯æååšäžäžª $alpha$ 䜿åŸäžåŒéé¶ãäºå®äžïŒ@Artin-linear-independent åè¯æä»¬ $L^times -> L^times$ ç矀ç¹åŸïŒ
$
1, sigma, sigma^2, ..., sigma^(n-1)
$
çº¿æ§æ å
³ïŒä»èç»è®ºåœç¶æ¯æ£ç¡®ç
- è¿äžªæé çèµ·æ¥æç¹åªå€·ææïŒå®é
äžæä»¬çæ³æ³æ¯æä»¬é¢æ³ç $root(n, a)$ åºè¯¥æ»¡è¶³ïŒ
$
sigma(root(n, a)) = xi_n^i root(n, a)
$
äžåŒå³æ¯ä»è¿ä»£çè§åºŠæé åºäºè¿æ ·äžäžªçåŒ:
$
sigma(b) = xi_n^(-1) b
$
å $a = b^n$ïŒæ³šæå°ïŒ
$
sigma^i (b) = xi_n^(-i) b
$
衚æäŒœçœå矀 $Gal(quotient(K, F))$ äžæ²¡æä»»äœåçŸ€ä¿æ $b$ äžåšïŒè¿è没æä»»äœäžéŽåïŒä¹å³ïŒ
$
K = F(b) = F(root(n, a))
$
]
#definition[坿 ¹åŒæ±è§£][
讟 $F -> F$ æ¯ä»£æ°æ©åŒ ïŒåç§° $K$ å¯è¢«æ ¹åŒè¡šç€ºïŒåŠæååšåçéŸïŒ
$
F = K_0 subset K_1 subset K_2 ..., subset K_s = K
$
满足æ¯äžäžªæ©åŒ éœæ¯æ·»å æäžª $root(n_i, a_i)$ åŸå°çåæ©åŒ
]
è¿äžªå®ä¹äžå宿©åŒ æ¯äŒœçœåçïŒäœæä»¬æ»å¯ä»¥å䌜çœåéå
#proposition[][
å $quotient(K, F)$ ç䌜çœåéå
$L$ïŒååæ©åŒ $L$ 乿»¡è¶³äžé¢çéŸæ¡ä»¶
]
#proof[
å䌜çœåéå
对åºäŒœçœå矀 $H$ïŒæ³šæå°äŒœçœåéå
å¯ç±ïŒ
$
L = product_(sigma in H) sigma(K)
$
åŸå°ã\
å
¶æ¬¡ïŒå¯¹äºä»»æ $sigma$ïŒæ³šæå°ïŒ
$
F -> sigma(K_1)
$
åœç¶æ¯ç±æ·»å æäžªæ ¹åŒåŸå°çåæ©åŒ ïŒé£ä¹ïŒ
$
K_1 -> K_1 sigma(K_1)
$
乿¯ç±æ·»å æäžªæ ¹åŒåŸå°çåæ©åŒ ïŒäŸæ¬¡ç±»æšå³åŸç»è®º
]
#definition[][
ç§°äžäžªäžå¯çºŠå€é¡¹åŒç䌜çœå矀䞺ç±å®çæçåè£åäœäžºåæ©åŒ 产çç䌜çœå矀
]
#theorem[][
äžäžªäžå¯çºŠå€é¡¹åŒçæäžªæ ¹å¯è¢«æ ¹åŒæ±è§£åœäžä»
åœå®ç䌜çœå矀å¯è§£
]
#proof[
åšè¯æäžïŒäžåŠšè®Ÿå¯æ ¹åŒæ±è§£çåéŸæç»äº§ç䌜çœåæ©åŒ
- å¿
èŠæ§ïŒ
讟 $K_(i+1) = K_i (root(n_i, a_i))$ïŒäžºäºæ¹äŸ¿æä»¬å°ææéèŠç $n$ 次åäœæ ¹æ·»å è¿å»ïŒä»€ïŒ
$
K_(i)^' = K_i (xi_(n_i))
$
æç»æ¹çšåœç¶åš $L'$ å¯è§£ã\
泚æå°æ¯äžª $K_i^' -> K_(i+1)^'$ éœæ¯äŒœçœåçïŒèªç¶æä»¬æïŒ
$
G_(i+1) norS G_i
$
åæ¶ç± @Kummer-theorem ïŒæ¯äžªåçŸ€éœæ¯åŸªç¯çŸ€ïŒè¿è $F -> L'$ ç䌜çœå矀å¯è§£ã $L -> F$ äœäžºå䌜çœåæ©åŒ ïŒå®ç䌜çœå矀æäžº $G$ çæ£è§å矀ïŒåœç¶ä¹å¯è§£ã
- å
åæ§ïŒ
什 $F'$ æ¯ $F$ æ·»å è¿ææå¯èœéèŠçåäœæ ¹ $xi_|G|$ïŒèèæ°çæ©åŒ ïŒ
$
F' -> K F'
$
å®ç䌜çœå矀æ¯ïŒåæäºïŒå䌜çœå矀çå矀ïŒåœç¶ä¹å¯è§£ã\
åæ¶ïŒå®çå¯è§£çŸ€éŸå©çšäŒœçœåç论å°èœ¬å䞺äžç³»ååŸªç¯æ©åŒ ãç± @Kummer-theorem ç论ïŒè¿åœäžä»
åœæ¯äžäžªå¯¹åºç矀æ©åŒ éœæ¯ç±åŒ $n$ æ¬¡æ ¹å·çåæ©åŒ çæïŒè¿èç»è®ºæ£ç¡®ã
]
ä¹åïŒæä»¬è®šè®ºåŠäœçæ£çèèå€é¡¹åŒçå¯è§£æ§ã泚æå°äžå¯çºŠå€é¡¹åŒç䌜çœå矀åœç¶åšæææ ¹äžæäœçšïŒäž
- äœçšæ¯åçïŒäžäŒä¿ææäžªæ ¹äžå
- äœçšæ¯äŒ éçïŒåŠåäžäžªèœšéäžæææ ¹å¯ä»¥ææäžåå€é¡¹åŒçå åïŒççŸ
#lemma[][
讟 $F$ ç¹åŸé¶ïŒæäžªåæ©åŒ åœ¢åŠ $F(x_1, x_2, ..., x_n) := M$ïŒ$x_i$ äºäžçžåãåææå
³äº $x_i$ ç对称å€é¡¹åŒ $s_i$ïŒä»€ïŒ
$
L = F(s_1, s_2, ..., s_n)
$
æŸç¶ $L <= M$ïŒæŽè¿äžæ¥ïŒ$M$ æ¯ $L$ äžæ éæ ¹å€é¡¹åŒçåè£åïŒè¿èæ¯äŒœçœåæ©åŒ ã\
å®ç䌜çœåçŸ€å°æ¯ $S_n$ çäžäžªå矀ïŒäºå®äžïŒå®ç䌜çœåçŸ€å°±æ¯ $S_n$
]
#proof[
]
æä»¬ç¥é $S_n$ åœç¶æäžäžªæ£è§å矀 $A_n$ïŒè¿äžªæ£è§å矀对åºå°åªäžªåæ©åŒ å¢ïŒäºå®äžïŒä»€ïŒ
$
D := product_(1 <=i < j <= n) (x_i - x_j)
$
äžéŸåç° $S_n$ äžä¿æå®äžåççå
çŽ æ°å¥œå°±æ¯ $A_n$ äžçææå
çŽ ã
#lemma[][
åšç¹åŸ $0$ çåäžïŒåäžå¯çºŠå€é¡¹åŒçåè£æ©åŒ $F -> K$ïŒåæ ·å®ä¹å€å«åŒ $D$ïŒåïŒ
$
D in F <=> Gal(quotient(K, F)) <= A_n
$
]
#proof[
$G sect A_n = A_n <=> F(D) = F$
]
#example[äžæ¬¡æ¹çšçæ±æ ¹å
¬åŒ][
ç»å®äžæ¬¡æ¹çšïŒå讟å€é¡¹åŒå·²ç»äžå¯çºŠïŒïŒ
$
x^3 + p x + q = 0
$
ç±çº¿æ§ä»£æ°çæ¹æ³ïŒå¯ä»¥æ±åºå®çå€å«åŒçå¹³æ¹ïŒ
$
D^2 = - 4 p^3 - 27 q^2
$
ç±åŒçïŒ
- $D^2$ äžºå¹³æ¹æ°æ¶ïŒ$G$ æ¯ $A_3$ çå矀ïŒåªèœæ¯ $A_3 = ZZ_3$
- åŠåïŒ$G subset.not A_3$ïŒåå äžºå®æ¯äŒ éçïŒå æ€å®åªèœæ¯ $S_3$\
æä»¬åœç¶å¯ä»¥æ·»å è¿ $D = sqrt(D^2)$ïŒæ€æ¶ $G$ å°±æ¯ $ZZ_3$ïŒä»èæ©åŒ æäžºåŸªç¯æ©åŒ ïŒ@Kummer-theorem 衚æäžå®å¯ä»¥æ·»å æäžªåŒçäžæ¬¡æ ¹å·å®ç°ïŒæä»¬æŸå°è¿äžªåŒå³å¯ã
å $omega$ æ¯äžæ¬¡åäœæ ¹ïŒè®Ÿ $alpha, beta, gamma$ æ¯æ¹çšçäžäžªæ ¹ïŒç±»äŒŒ Kummer ç论çè¯æäžïŒå®ä¹ïŒ
$
theta_1 = alpha + omega sigma(alpha) + omega^2 sigma^2(alpha)\
= alpha + omega beta + omega^2 gamma\
theta_2 = alpha + omega^2 beta + omega gamma\
theta_3 = alpha + beta + gamma = 0
$
æä»¬åœç¶ç¥é $theta_1^3, theta_2^3$ éœæ¯åäžç°æçæ°ïŒè¿èåªèŠæ·»å è¿ $root(3, theta_1), root(3, theta_2)$ïŒäžé¢äžåŒéœæäžºçº¿æ§æ¹çšïŒè§£ä¹å³å¯ã
]
== æ 穷䌜çœåç论
æä»¬å°è¯å©çšäžäºæå·§å°äŒœçœåç论æ©å
è³æ ç©·æ
圢
#definition[éåæé/æå°æé (Inverse Limit)][
- 讟ååšäžåéåä¹éŽç满å°ïŒ
$
A_1 <-^(f_1) A_2 <-^(f_2) A_3 ...
$
åå®ä¹éåæé/æå°æéïŒææ¶ä¹çŽæ¥ç§°äžºæéïŒäžºïŒ
$
inverseLimit(n) A_n = {(a_1, a_2, ...) in product_n A_n | f_n(a_(n+1)) = a_n}
$
- åšäžé¢çå®ä¹äžïŒå°éåæ¹äžºçŸ€/ç¯/å/...ïŒæ»¡å°æ¹äžºæ»¡åæïŒåå¯ç±»äŒŒå®ä¹éåæé
]
#example[p- è¿æ°å][
讟 $p$ 䞺äžäžªçŽ æ°ïŒä»€ $A_n := quotient(ZZ, p^n ZZ)$ïŒæŸç¶ $A_(n)$ äž $A_(n+1)$ ä¹éŽååšèªç¶çæ»¡åæ $f_n$\
什 $ZZ_p = inverseLimit(n) A_n$\
è¿åœç¶æ¯äº€æ¢ç¯ïŒå¹¶äžå¯ä»¥éªè¯åªèŠ $x_0 != 0, p$ ïŒæ¯äžª $x_i$ éœå°äž $p$ äºçŽ ïŒè¿èå®å°±æ¯å¯éå
ã\
å
¶äžçå
çŽ å¯ä»¥æŽæŸåŒçåäœïŒ
$
x = a_0 + a_1 p + a_2 p^2 + ...... space a_i in {0, 1, ..., p-1}
$
]
#lemma[][
讟æç¯äžçéåæé $R = inverseLimit(n) R_n$ïŒæä»¬å°æïŒ
$
R^times = inverseLimit(n) R_n^times
$
]
#proof[
泚æå° $f_n (R^times_(n+1)) subset R_n^times$ã\
å¯¹äº $a = (a_i) in R^times$ïŒå $b = (b_i = Inv(a_i))$ãåªéèŠéªè¯ $f_n (b_(n+1)) = b_n$ãäºå®äžïŒ
$
1 = f_n (1) = f_n (a_(n+1) b_(n+1)) = f_n (a_(n+1)) f_n (b_(n+1)) = a_n f_n (b_(n+1))
$
ç±éå
çå¯äžæ§ïŒè¿è¡šæ $f_n (b_(n+1)) = b_n$ïŒå æ€ $b in R$ äž $a b = 1$ïŒè¿è $a, b in R^times$
]
#example[][
什 $R = inverseLimit(n) quotient(CC[x], (x^n))$ïŒå®å
¶å®å°±æ¯ $CC$ äžç圢åŒå¹çº§æ°ç¯ïŒæ¯äžé¡¹éœæ¯æéå€é¡¹åŒïŒéåèµ·æ¥å°±æ¯äžäžªæ ç©·å€é¡¹åŒïŒä¹å°±æ¯äžäžªæ³°åå±åŒåŒã\
]
#definition[æšå¹¿çéåæé][
ç§°äžäžªååºé $I$ æ¯æ»€åçïŒåŠæ $forall i, j in I, exists k, k > i and k > j$\
å讟æä»¬æäžåéå/矀/ç¯/... $A_i, i in I$ïŒå¹¶äžå¯¹ $j > i, exists phi_(j i) : A_j -> A_i$ æ¯åæïŒäœ¿åŸïŒ
$
phi_(k j) compose phi_(j i) = phi_(k i)
$
åç§°ä¹äžºäžäžªååç³»ç»ãæ€æ¶æä»¬å®ä¹éåæéïŒ
$
inverseLimit(i in I) = {(a_i) | a_i in A_i, forall j > i, phi_(j i)(a_j) = a_i}
$
]
#proposition[][
讟 $lambda_i: B -> A_i$ æ¯åæå¹¶äžæ»¡è¶³:
$
forall j > i, phi_(j i) compose lambda_j = lambda_i
$
åå°ååšåæå° $B$ æ å
¥ $inverseLimit(i in I) A_i$
]
#example[][
åæŽé€å
³ç³»äœäžº $ZZ$ äžç滀è¿çååºå
³ç³»ïŒå¹¶åå
¶äžèªç¶çåæïŒå®ä¹ïŒ
$
ZZ^(\^) = inverseLimit(n) quotient(ZZ, n ZZ)
$
]
#lemma[][
$ZZ^(\^) tilde.eq product_(p "is prime") ZZ_p$
]
#proof[
å®ä¹:
$
phi_1 : ZZ^(\^) -> product_(p "is prime") ZZ_p\
(a_n) -> (a_(p^r))_r
$
$
phi_2: product_(p "is prime") ZZ_p -> ZZ^(\^)\
phi_2 = pi_1 : ZZ_n -> ZZ_n^(\^) compose pi_2: product_(p "is prime") ZZ_p -> ZZ_n
$
]
#definition[ååæéçææ][
对äºååæé $inverseLimit(i) A_i = A$ïŒå®ä¹å
¶æææ¯ä¹ç§¯ææçéå¶
]
#theorem[][
If every $A_i$ is a Hausdorff space, then $A$ is a Hausdorff space as well
]
#definition[Topological group][
A group $G$ is called a topological group if it is a topological space and the group operations are continuous
]
#proposition[][
- If $U subset G$ is open, then for all $g, h in G$ the set $g U h$ is open as well
- If $H <= G$ is an open subgroup, then it is also closed
- If $G$ is a compact group, then an open subgroup $H$ has finite index
]
#definition[Profinite group][
A topological group $G$ is profinite if it is a filtered inverse limit of finite groups
]
#lemma[][
If $G$ is a profinite group, then:
$
G tilde.eq inverseLimit(H norS G "open") quotient(G, H)
$
]
#proof[
The map $G -> inverseLimit(H norS G "open") quotient(G, H)$ is natural\
Conversely, suppose:
$
G = inverseLimit(i in I) G_i
$
Considering $pi_i: G -> G_i$, we have $ker pi_i norS G$, and we can construct the map in the other direction:
$
inverseLimit(H norS G "open") quotient(G, H) -> quotient(G, ker pi_i) -> G_i
$
]
An infinite Galois extension is, without question, the union of all of its finite Galois subextensions; the crux is how to define the Galois group
#definition[][
For an infinite-dimensional (algebraic) Galois extension, define:
$
Gal(quotient(K, F)) := inverseLimit(quotient(E, F) "finite Galois") Gal(quotient(E, F))
$
]
#example[][
$QQ(xi_(p^infinity)) = QQ(xi_(p^n), n in NN)$; its Galois group is the inverse limit of the Galois groups of the finite cyclotomic levels, which is of course $ZZ_p^times$
]
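Spelling the identification out (an added step; the middle equality is standard cyclotomic theory):
$
Gal(quotient(QQ(xi_(p^infinity)), QQ)) = inverseLimit(n) Gal(quotient(QQ(xi_(p^n)), QQ)) = inverseLimit(n) (quotient(ZZ, p^n ZZ))^times = ZZ_p^times
$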
#lemma[][
$Gal(quotient(K, F)) = {"autoiso" sigma: K -> K | sigma_F = id}$
]
#proof[
For the finite Galois subextensions $quotient(E, F)$, the inverse limit of their Galois groups is on the one hand the Galois group of the full extension, and on the other hand it clearly corresponds to all automorphisms of $K$ that fix $F$; hence the conclusion holds
]
#theorem[Main theorem][
Certain subgroups of the Galois group are in one-to-one correspondence with the subextensions; concretely:
$
"closed subgroups" <-> "subextensions"\
"open subgroups" <-> "finite subextensions"\
"normal subgroups" <-> "Galois subextensions"\
$
]
== Algebraic closure and transcendental extensions
#definition[Algebraically closed][
A field $F$ is algebraically closed if every polynomial over it has a root, and hence every polynomial splits. Equivalently, it has no non-trivial algebraic extensions
]
#definition[Algebraic closure][
An algebraic closure of a field $F$ is an algebraic extension $F -> algClosure(F)$ such that $F^"alg"$ is algebraically closed
]
Analogously one can define the separable closure and the like
#theorem[][
- Every field has an algebraic closure, which is unique up to isomorphism
- The algebraic closure of a field is exactly the splitting field of all polynomials over it
]
#proof[
We only prove 2; 1 involves some purely set-theoretic issues that we skip here\
Let $F'$ be the splitting field of all polynomials over $F$, that is, the compositum of all finite algebraic extensions, and suppose $F'$ still had a non-trivial irreducible polynomial:
$
sum_i a_i x^i
$
Then the extension $F(a_0, a_1, a_2, ..., a_n, alpha)$ is a finite algebraic extension of $F$; by definition it is contained in $F'$, which shows that $F'$ is algebraically closed
]
#definition[Transcendental extension][
Suppose ${x_1, x_2, ..., x_n, ...}$ are algebraically independent over $F$, that is, there is no (multivariate) polynomial over $F$ with $f(x_1, x_2, ..., x_n, ...) = 0$\
Then $F(x_1, x_2, ..., x_n, ...)$ is called a transcendental extension. In particular:
$
F(x_1, x_2, ..., x_n, ...) tilde.eq (F[x_1, x_2, ..., x_n, ...])/(F[x_1, x_2, ..., x_n, ...])
$
where a polynomial in infinitely many variables means an arbitrary polynomial in finitely many of them
]
#definition[Transcendence basis][
In a transcendental extension, a maximal algebraically independent family is called a transcendence basis; clearly this is equivalent to every element of the extension being algebraic over the subfield the family generates
]
#theorem[][
For any transcendental extension, a transcendence basis exists, and all transcendence bases have the same cardinality
]
#remark[][
A transcendence basis does not generate the transcendental extension; for instance ${x}, {x^2}$ are both transcendence bases, but clearly:
$
F(x) != F(x^2)
$
]
#definition[Purely transcendental extension][
A transcendental extension $quotient(K, F)$ is called purely transcendental if there is an algebraically independent family $S$ such that $K = F(S)$
]
Next we want to state the so-called Hilbert Nullstellensatz; for this we need some commutative algebra
#definition[Radical][
Let $I$ be an ideal of a commutative ring; its radical is defined as:
$
sqrt(I) = union_(i=1)^(+infinity) {f in R | f^i in I}
$
It is again an ideal of the ring\
If $I = sqrt(I)$, then $I$ is called radical
]
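A standard example, added for concreteness: in $k[x]$ we have $sqrt((x^2)) = (x)$, and $(x)$ itself is radical since it is prime.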
Clearly, if $I$ is an ideal of a polynomial ring, then the common zeros of all polynomials in $I$ coincide with the common zeros of all polynomials in $sqrt(I)$
#theorem[Hilbert's Nullstellensatz][
- Suppose $K$ is algebraically closed; then the maximal ideals of the polynomial ring $K[x_1, x_2, ..., x_n]$ are exactly those of the form $(x_1 - a_1, ..., x_n - a_n), a_i in K$
- Suppose $K$ is algebraically closed; then every ideal $I$ satisfies:
$
I(Z(I)) = sqrt(I)
$
]
#proof[
Our strategy is the so-called Noether normalization, which is analogous to algebraic field extensions. From now on, all rings we speak of are commutative.
#definition[][
Let $A$ be a subring of $B$. An element $x$ of $B$ is called integral over $A$ if:
there exists a monic polynomial:
$
x^n + a_(n-1)x^(n-1) +...+a_0
$
such that $x$ is one of its roots
]
#remark[][
- We require the polynomial to be monic; over a field this makes no difference, but over a ring the extra requirement is essential
- We have not introduced the notion of a minimal polynomial, but many notions still generalize
]
#proposition[][
The following are equivalent:
- $x$ is integral over $A$
- $A[x]$ (here $x$ means this element $x$, not a free indeterminate) is a finitely generated $A$-module
- $exists C <= B, A[x] <= C$ and $C$ is finitely generated over $A$
]
#proof[
- 2 -> 3: obvious
- 1 -> 2: \
Note that:
$
x^n = -(a_(n-1)x^(n-1) +...+a_0)
$
so $A[x]$ is exactly the $A$-module generated by $1, x, x^2, ..., x^(n-1)$
- 3 -> 1 is slightly more involved:\
Suppose $C$ is finitely generated by $e_1, e_2, ..., e_n$; we would like to imitate the idea of the characteristic or annihilator polynomial from linear algebra\
Suppose:
$
x(e_1, e_2, ..., e_n) = (e_1, e_2, ..., e_n) A
$
Note that this matrix $A$ need not be unique, but it always exists\
Note that the identity above is really:
$
(e_1, e_2, ..., e_n) (x I - A) = 0
$
By linear algebra, taking the adjugate matrix $B^*$ of $x I - A$, we get:
$
0 = (e_1, e_2, ..., e_n) (x I - A) B^* = |x I - A| (e_1, e_2, ..., e_n)
$
Since $e_1, e_2, ..., e_n$ is a generating family, there should be a column vector $X$ such that:
$
1 = (e_1, e_2, ..., e_n)X
$
Hence:
$
|x I - A| = |x I - A| (e_1, e_2, ..., e_n) X = 0
$
and $|x I - A|$ is exactly the monic polynomial we wanted
]
#corollary[][
- If $x_1, x_2, ..., x_n$ are integral over $A$, then $A[x_1, x_2, ..., x_n]$ is a finitely generated $A$-module
- The set of all elements integral over $A$ forms a subring containing $A$, called the integral closure of $A$. If $A$ already contains all elements integral over it, then $A$ is called integrally closed in $B$.\ If every element of $B$ is integral over $A$, then $B$ is called integral over $A$.
- If $A subset B subset C$ with $B$ integral over $A$ and $C$ integral over $B$, then $C$ is integral over $A$
- The integral closure of $A$ in $B$ is integrally closed
]
#proof[
- As in the single-element case, this is obvious
- Note that $A[x, y]$ is a finitely generated $A$-module and $x plus.minus y, x y$ all lie in it; by the previous proposition these elements are integral, so the set of integral elements is closed under addition and multiplication
- Simply imitate the proof of the corresponding statement for fields
- Suppose the integral closure $C$ of $A$ admitted a further integral element $x$; by the statements above $x$ is integral over $A$ as well, hence $x in C$
]
The so-called Noether normalization can be thought of as measuring how much "transcendence degree" a $k$-algebra carries over a ring it is integral over
#theorem[Noether normalization][
Let $k$ be a field and $R$ a finitely generated $k$-algebra, that is, $R = quotient(k[x_1, x_2, ..., x_n], I)$ for some ideal $I$.\
Then there exist $r <= n$ and an embedding homomorphism:
$
phi: k[y] = k[x_1, x_2, ..., x_r] -> R
$
such that $R$ is integral over $k[y]$
]
#proof[
Induct on $n$, assuming the case $n - 1$ has been proven\
Pick a nonzero element $f$ in $I$\
If $f = x_(n)^i + ...$ were monic, we could substitute directly. Reality does not allow us to do so, but with a rather clever change of variables we can arrange it.\
Set $z_i = x_i - x_(1)^(r_(i-1)), i=2, 3...$, where the values $r_i$ will be determined later\
Consider the isomorphism $psi: k[x_1, x_2, ..., x_n] -> k[x_1, z_2, ..., z_n]$; $psi(f)$ is the result of the change of variables; write $I' = psi(I)$\
For $r_i$ sufficiently large, the highest-order term of $psi(f)$ will be $x_1^N$ (whose coefficient we may take to be 1)\
Next we remove $x_1$ from $quotient(k[x_1, z_2, ..., z_n], I')$, which naturally produces an embedding
$
phi' : quotient(k[z_1, z_2, ..., z_n], I') -> quotient(k[x, z_1, z_2, ..., z_n], I'')
$
On the other hand, one can verify that this embedding is a finitely generated integral extension; by the induction hypothesis the former is integral over $k[y]$, hence $R$ is integral over $k[y]$ as well
]
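A standard illustration of this change of variables, added here for concreteness: take $R = quotient(k[x, y], (x y - 1))$. The element $y = Inv(x)$ is not integral over $k[x]$, but after the substitution $z = y - x$ the relation becomes $x^2 + z x - 1 = 0$, which is monic in $x$; hence $R$ is integral over $k[z]$.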
#lemma[][
Let $R$ be a field and $S$ a subring such that $R$ is integral over $S$; then $S$ is a field
]
#proof[
Take any $x in S$; since $Inv(x) in R$, we may write:
$
0 = f(Inv(x)) = x^(-n) + a_(n-1) x^(-n+1) + ... + a_0\
x^(-1) = - a_(n-1) - a_(n-2) x - ... - a_0 x^(n-1)
$
hence the conclusion holds
]
Finally we can return to the proof of Hilbert's Nullstellensatz.\
Suppose $M$ is a maximal ideal of $k[x_1, x_2, ..., x_n]$; then $quotient(k[x_1, x_2, ..., x_n], M)$ is a field.\
By Noether normalization we can find $k[y]$ over which $quotient(k[x_1, x_2, ..., x_n], M)$ is integral. But by the lemma, $k[y]$ is a field, so $k[y] = k$, and the extension:
$
k -> quotient(k[x_1, x_2, ..., x_n], M)
$
is a finite extension, which gives the result.\
For the second claim, note that:
$
f in sqrt(I) => f^n in I => f^n(Z(I)) = 0 => f(Z(I)) = 0 => f in I(Z(I))
$
The other direction of the proof needs a trick from algebraic geometry. Suppose $I = (f_1, f_2, ..., f_n)$ and $g$ satisfies:
$
forall a in k^n, f_1 (a) = f_2 (a) = ... = f_n (a) = 0 => g(a) = 0
$
Consider the ideal of $k[x_1, x_2, ..., x_(n+1)]$:
$
J = I k[x_1, x_2, ..., x_(n+1)] + (1 - g x_(n+1))
$
One finds that the zero set of $J$ is empty. Algebro-geometrically this forces $J = (1)$, because:
- Suppose $J != (1)$; then $J$ can be enlarged to a maximal ideal $M$. By the first part of the Nullstellensatz:
$
M = (x_1 - a_1, x_2 - a_2, ..., x_(n+1) - a_(n+1))
$
Consider the natural map $phi: quotient(k[x_1, x_2, ..., x_(n+1)], J) -> quotient(k[x_1, x_2, ..., x_(n+1)], M) = k$, which exists because $M$ is an ideal containing $J$;\
on the left side $f_i = 0, 1 - g x_(n+1) = 0$, while on the right this means $f_i (a) = 0$ and $1 - g(a) a_(n+1) = 0$; but $f_i (a) = 0$ forces $g(a) = 0$, so $1 = 0$, a contradiction!
Therefore $J = (1)$, and so:
$
1 = sum_i h_i f_i + h (1 - g x_(n+1)) =^("substituting" x_(n+1) = Inv(g)) sum_i h_i f_i (x_1, x_2, ..., x_n, Inv(g))
$
Clearing denominators yields $g^l in (f_1, f_2, ..., f_n)$
]
#theorem[Hilbert's Nullstellensatz, final version][
- There is a one-to-one correspondence:
$
{"algebraic subsets of " k^n} &<-> {"radical ideals of " k[x_1, x_2, ..., x_n]}\
Z &-> I(Z)\
Z(I) &<- I
$
- $I_1 subset I_2 <=> Z(I_1) supset Z(I_2)$
- $Z(I_1 + I_2) = Z(I_1) sect Z(I_2)$
- $Z(I_1 sect I_2) = Z(I_1) union Z(I_2)$
]
]
#chapterThree |
|
https://github.com/protohaven/printed_materials | https://raw.githubusercontent.com/protohaven/printed_materials/main/common-policy/shop_rules.typ | typst | = Shop Rules
== Be Safe
- Get safety clearances
- Wear protective equipment
- Watch and reset equipment after use
- Never use equipment that is red-tagged
== Take Care of Each Other
- Be aware of your surroundings
- Don't use a tool if it poses a danger to someone else
== Take Care of the Tools
- Get tool clearances
- Do not alter or use equipment beyond limits
- Notify staff when maintenance is needed
== Keep the Shop Clean
- Clean up after yourself
- Return tools to their original locations |
|
https://github.com/Area-53-Robotics/53A-Notebook-Over-Under-2023-2024 | https://raw.githubusercontent.com/Area-53-Robotics/53A-Notebook-Over-Under-2023-2024/master/Vex%20Robotics%2053A%20Notebook%202023%20-%202024/Entries/Misc.%20Entry/Scrum-Master-Evaluation.typ | typst | #set page(header: [ VR
#h(1fr)
November 4, 2023
])
= SCRUM MASTER EVALUATION
\
== Key Problems
#block(
width: 100%,
fill: rgb("FFEAE8"),
inset: 8pt,
radius: 4pt,
[
=== Catapult Malfunctions
- Unsure why catapult got stuck
- Catapult didn't have enough range
> Couldn't shoot triballs across middle barrier
> The gear ratio didn't provide enough torque to pull back enough rubber bands to shoot triballs over the middle barrier
- Ratchet is not strong enough
> Catapult would occasionally turn against the ratchet
> Randomly released
=== Intake Malfunctions
- Intake was bending inwards towards triballs
- This may have worsened with wear on intake during matches
- Bending caused too much compression on triballs
- Force of wheels spinning couldn't overcome the force of compression on the triball
=== Autonomous
- Our autonomous programs were not very reliable
- Require more tuning and testing in the weeks before our next tournament
],
)
== Sprint Timeline
#block(
width: 100%,
fill: rgb("EEEEFF"),
inset: 8pt,
radius: 4pt,
[
- First sprint ended with our first tournament of the season
- Next sprint will last until our next tournament: December 2, 2023
- 7 official meetings until tournament
- Plan may change if
+ team schedules more/fewer meetings
+ 53A meets alone
\
])
== Timeline
\
#box(height: 450pt,
columns(2)[
#set par(justify: true)
#set align(center)
#rect[11/10/23]
#line(end: (0%, 5%))
#rect[11/11/23]
#line(end: (0%, 5%))
#rect[11/17/23]
#line(end: (0%, 5%))
#rect[11/18/23]
#line(end: (0%, 5%))
#rect[11/24/23]
#line(end: (0%, 5%))
#rect[11/25/23]
#line(end: (0%, 5%))
#rect[12/1/23]
#set align(left)
#rect[- Brainstorm intake and catapult solutions
- Design new catapult
]
#line(end: (0%, 1%))
#rect[- Assemble new catapult
- Design new intake
]
#line(end: (0%, 2%))
#rect[- Test catapult
- Assemble new intake
]
#line(end: (0%, 1%))
#rect[- Test intake
- Test + tune autonomous
]
#line(end: (0%, 1.5%))
#rect[- Test + tune autonomous
- Driver practice
]
#line(end: (0%, 1%))
#rect[- Test + tune autonomous
- Driver practice
]
#line(end: (0%, 1%))
#rect[- Test + tune autonomous
- Driver practice
]
]
)
|
|
https://github.com/Big-Ouden/ensiie_rapport | https://raw.githubusercontent.com/Big-Ouden/ensiie_rapport/main/0.1.5/NOTES.md | markdown | # Building notes
How to build locally and deploy to the Typst app and Typst templates.
## Image quantization
Quantize images to reduce their size, which is nice for reducing the template size.
```bash
pngquant *.png --ext .png --force
```
## Integration to the official repos
- Create a symlink in `~/.cache/typst/packages` that points to this repository. For this:
```bash
ln -s ~/git/modern-isc-report ~/.cache/typst/packages/preview/isc-hei-report/0.1.5
```
This prevents the download of packages and uses the local versions of this package.
- For local testing and development, once the step above has been done, you can simply build from the `template` directory using `typst watch report.typ`
- Additional testing can be conducted to see if the template instance works correctly with
```bash
typst init @preview/isc-hei-report:0.1.5
```
- Copy the content of this repo to the `typst-template` repository using
```bash
cp * -R ~/git/packages/packages/preview/isc-hei-report/0.1.5/
```
- Create PR as usual.
|
|
https://github.com/goshakowska/Typstdiff | https://raw.githubusercontent.com/goshakowska/Typstdiff/main/tests/test_working_types/strong/strong_updated.typ | typst | *first*\
#strong[second_updated]\
*third_updated*\
#strong[forth] |
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/glossarium/0.4.0/README.md | markdown | Apache License 2.0 | # Typst glossary
> Glossarium is based in great part on the work of [<NAME>](https://github.com/Dherse) from his master thesis available at: <https://github.com/Dherse/masterproef>. His glossary is available under the MIT license [here](https://github.com/Dherse/masterproef/blob/main/elems/acronyms.typ).
Glossarium is a simple, easily customizable typst glossary inspired by the [LaTeX glossaries package](https://www.ctan.org/pkg/glossaries). You can see various examples showcasing the different features in the `examples` folder.

## Manual
### Import and setup
This manual assumes you have a good enough understanding of typst markup and scripting.
For Typst 0.6.0 or later import the package from the typst preview repository:
```typ
#import "@preview/glossarium:0.4.0": make-glossary, print-glossary, gls, glspl
```
For Typst before 0.6.0 or to use **glossarium** as a local module, download the package files into your project folder and import `glossarium.typ`:
```typ
#import "glossarium.typ": make-glossary, print-glossary, gls, glspl
```
After importing the package and before making any calls to `gls`,` print-glossary` or `glspl`, please ***MAKE SURE*** you add this line
```typ
#show: make-glossary
```
> *WHY DO WE NEED THAT?*: In order to be able to create references to the terms in your glossary using the typst ref syntax `@key`, glossarium needs to set up some [show rules](https://typst.app/docs/tutorial/advanced-styling/) before any references exist. This is due to the way typst works; there is no workaround.
>
> Therefore I recommend that you always put the `#show: ...` statement on the line just below the `#import` statement.
### Printing the glossary
First we have to define the terms.
A term is a [dictionary](https://typst.app/docs/reference/types/dictionary/) composed of 2 required and 5 optional elements:
- `key` (string) *required, case-sensitive, unique*: used to reference the term.
- `short` (string) *required*: the short form of the term replacing the term citation.
- `long` (string or content) *optional*: The long form of the term, displayed in the glossary and on the first citation of the term.
- `desc` (string or content) *optional*: The description of the term.
- `plural` (string or content) *optional*: The pluralized short form of the term.
- `longplural` (string or content) *optional*: The pluralized long form of the term.
- `group` (string) *optional, case-sensitive*: The group the term belongs to. The terms are displayed by groups in the glossary.
Then the terms are passed as a list to `print-glossary`
```typ
#print-glossary(
(
// minimal term
(key: "kuleuven", short: "KU Leuven"),
// a term with a long form and a group
(key: "unamur", short: "UNamur", long: "Namur University", group: "Universities"),
// a term with a markup description
(
key: "oidc",
short: "OIDC",
long: "OpenID Connect",
desc: [OpenID is an open standard and decentralized authentication protocol promoted by the non-profit
#link("https://en.wikipedia.org/wiki/OpenID#OpenID_Foundation")[OpenID Foundation].],
group: "Accronyms",
),
// a term with a short plural
(
key: "potato",
short: "potato",
// "plural" will be used when "short" should be pluralized
plural: "potatoes",
desc: [#lorem(10)],
),
// a term with a long plural
(
key: "dm",
short: "DM",
long: "diagonal matrix",
// "longplural" will be used when "long" should be pluralized
longplural: "diagonal matrices",
desc: "Probably some math stuff idk",
),
)
)
```
By default, the terms that are not referenced in the document are not shown in the glossary, you can force their appearance by setting the `show-all` argument to true.
You can also disable the back-references by setting the parameter `disable-back-references` to `true`.
Group page breaks can be enabled by setting the parameter `enable-group-pagebreak` to `true`.
You can call this function from anywhere in your document.
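For reference, here is a minimal sketch combining these options (assuming `my-glossary` is a hypothetical name for the array of terms defined above):

```typ
#print-glossary(
  my-glossary,
  // also display terms that are never referenced
  show-all: true,
  // hide the page numbers at the end of each description
  disable-back-references: true,
  // insert a page break after each group
  enable-group-pagebreak: true,
)
```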
### Referencing terms.
Referencing terms is done by the key of the term, using either the `gls` function or the reference syntax.
```typ
// referencing the OIDC term using gls
#gls("oidc")
// displaying the long form forcibly
#gls("oidc", long: true)
// referencing the OIDC term using the reference syntax
@oidc
```
#### Handling plurals
You can use the `glspl` function and the reference supplements to pluralize terms.
The `plural` key will be used when `short` should be pluralized and `longplural` will be used when `long` should be pluralized. If the `plural` key is missing then glossarium will add an 's' at the end of the short form as a fallback.
```typ
#glspl("potato")
```
Please look at the examples regarding plurals.
#### Overriding the text shown
You can also override the text displayed by setting the `display` argument.
```typ
#gls("oidc", display: "whatever you want")
```
## Final tips
I recommend setting a show rule for the links so that your readers understand that they can click on the references to go to the term in the glossary.
```typ
#show link: set text(fill: blue.darken(60%))
// links are now blue !
```
## Changelog
### 0.4.0
- Support for plurals has been implemented, showcased in [examples/plural-example/main.typ](examples/plural-example). Contributed by [@St0wy](https://github.com/St0wy).
- The behavior of the gls and glspl functions has been altered regarding calls on undefined glossary keys. They now cause panics. Contributed by [@St0wy](https://github.com/St0wy).
### 0.3.0
- Introducing support for grouping terms in the glossary. Use the optional and case-sensitive key `group` to assign terms to specific groups. The appearance of the glossary can be customized with the new parameter `enable-group-pagebreak`, allowing users to insert page breaks between groups for better organization. Contributed by [indicatelovelace](https://github.com/indicatelovelace).
### 0.2.6
#### Added
- A new boolean parameter `disable-back-references` has been introduced. If set to true, it disables the back-references (the page number at the end of the description of each term). Please note that disabling back-references only disables the display of the page number; if you don't have any references to your glossary terms, they won't show up unless the parameter `show-all` has been set to true.
### 0.2.5
#### Fixed
- Fixed a bug where there were two spaces after a reference. Contributed by [@drupol](https://github.com/drupol)
### 0.2.4
#### Fixed
- Fixed a bug where the reference would show a long ref even when "long" was set to false. Contributed by [@dscso](https://github.com/dscso)
#### Changed
- The glossary appearance has been improved slightly. Contributed by [@JuliDi](https://github.com/JuliDi)
### Previous versions did not have a changelog entry
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/meta/bibliography-01.typ | typst | Other | #set page(width: 200pt)
= Details
See also #cite(<distress>, supplement: [p. 22]), @arrgh[p. 4], and @distress[p. 5].
#bibliography("/works.bib")
|
https://github.com/Servostar/dhbw-abb-typst-template | https://raw.githubusercontent.com/Servostar/dhbw-abb-typst-template/main/src/pages/prerelease-note.typ | typst | MIT License | // .--------------------------------------------------------------------------.
// |                         Preliminary release Notice                       |
// '--------------------------------------------------------------------------'
// Author: <NAME>
// Edited: 28.06.2024
// License: MIT
#let new_prerelease_note(config) = (
context {
pagebreak(weak: true)
let thesis = config.thesis
let author = config.author
if text.lang == "de" [
#heading("Vorabfassung")
] else if text.lang == "en" [
#heading("Preliminary Version")
]
v(1em)
if text.lang == "de" [
Bei dieser Ausgabe der Arbeit mit dem Thema
] else if text.lang == "en" [
This edition of the work with the subject
]
v(1em)
set align(center)
text(weight: "bold", thesis.title)
if thesis.subtitle != none {
linebreak()
thesis.subtitle
}
set align(left)
v(1em)
set par(justify: true)
if text.lang == "de" [
handelt es sich _nicht_ um die fertige Fassung. Das Dokument kann Inhaltliche-, Grammatikalische- sowie Format-Fehler enthalten. Das Dokument ist im Rahmen der Aufgabenstellung von Seiten der #author.university nicht zur Bewertung freigegeben und ein anderer Verwendungszweck als eine Vorschau ist nicht gestattet.
] else if text.lang == "en" [
is not the final version. The document may contain errors in content, grammar and formatting. The document may not be released for evaluation to #author.university as part of the assignment, and any use other than a preview is not permitted.
]
v(1em)
h(1em)
[#author.name, #datetime.today().display()]
}
)
|
https://github.com/antonWetzel/Masterarbeit | https://raw.githubusercontent.com/antonWetzel/Masterarbeit/main/arbeit/segmentierung.typ | typst | #import "setup.typ": *
= Segmentation of Forest Regions <seperierung_in_segmente>
== Procedure
For the segmentation, all points are divided into parallel slices of equal width along the height axis. The slices are then processed one at a time from top to bottom to determine the segments. To this end, the points within a slice are grouped into connected regions. From these regions, the coordinates of the trees that exist in the current slice are determined. The points in the slice are then assigned to the nearest tree.
== Determining Regions <segmentierung_bereiche_chapter>
#let points = (
(0, 1.9),
(-0.5, 2.0),
(-0.3, 2.3),
(0.4, 2.5),
(0.7, 2.1),
(0.6, 1.8),
(-0.1, 1.6),
(1.8, 1.0),
(2.3, 1.1),
(2.7, 1.4),
(1.8, 0.8),
(1.9, 0.4),
(2.5, 0.5),
(-0.5, -0.9),
(-1.2, -1.1),
(-0.8, -1.3),
(-1.1, -1.9),
(-0.5, -1.8),
)
For each slice, convex connected regions are determined such that the points in different regions are at least a minimum distance apart. Starting from an empty set of regions, every point is added to the set. If a point is completely contained in a region, the region is not extended. If the point lies outside, but closer than the minimum distance to one of the regions, that region is extended. If the point is farther away from all regions so far, a new region is started.
This produces regions as in @segmentierung_bereiche. A treetop gives rise to a small region. Where several trees touch each other, the associated points are merged into one larger region.
// BR06-ALS
#figure(
caption: [Example of the computed segments for two consecutive slices.],
grid(
columns: 1 * 2,
gutter: 1em,
subfigure(rect(image("../images/test_5-areas.svg"), inset: 0pt), caption: [Higher slice]),
subfigure(rect(image("../images/test_6-areas.svg"), inset: 0pt), caption: [Lower slice]),
),
) <segmentierung_bereiche>
During the computation, all current regions are stored in a list. A region is a list of corner points. The corners are ordered such that, for a given corner, the next point along the boundary of the region is the next point in the list. For the last point, the first point in the list is the next corner.
To add a point to a region, the distance from the point to every edge is determined as in @segmentierung_add_point, and edges with a positive distance are replaced.
#figure(
caption: [Adding a new corner point to a region.],
grid(
columns: 2,
subfigure(
caption: [Computing the distance],
cetz.canvas(length: 1cm, {
import cetz.draw: *
set-style(stroke: black)
line((-2, 0), (5, 0), stroke: white)
line((-2, -1), (5, -1), stroke: white)
line((-2, -2), (3, 3), stroke: white)
line((5, -2), (0, 3), stroke: white)
line((-1, -1), (0, 0), (3, 0), (4, -1), close: true, stroke: black, fill: silver)
line((0, 0), (3, 0), name: "edge", mark: (end: ">", fill: black))
line((-1, -1), (0, 0))
line((3, 0), (4, -1))
content("edge.start", anchor: "north", $a$, padding: 0.15)
circle((-1, -1), fill: black, stroke: none, radius: 0.1)
circle((4, -1), fill: black, stroke: none, radius: 0.1)
circle("edge.end", fill: black, stroke: none, radius: 0.1)
content("edge.end", anchor: "north", $b$, padding: 0.15)
content((1.5, 0), anchor: "north", $d$, padding: 0.15)
line((0, 0), (0, 2), name: "out", mark: (end: ">", fill: black))
content((0, 1), $o$, anchor: "east", padding: 0.1)
line((0, 0), (2, 1.5), stroke: gray, mark: (end: ">", fill: gray))
content((2, 1.5), anchor: "south", padding: 0.15, $p$)
line((2, 0), (2, 1.5), stroke: gray)
content((2, 0.75), anchor: "west", padding: 0.1, $o dot (p - a)$)
circle((2, 1.5), fill: black, stroke: none, radius: 0.1)
circle("edge.start", fill: black, stroke: none, radius: 0.1)
}),
), subfigure(
caption: [Adding the point],
cetz.canvas(length: 1cm, {
import cetz.draw: *
set-style(stroke: black)
line((-2, 0), (5, 0), stroke: silver)
line((-2, -1), (5, -1), stroke: silver)
line((-2, -2), (3, 3), stroke: silver)
line((5, -2), (0, 3), stroke: silver)
line((-1, -1), (0, 0), (3, 0), (4, -1), close: true, stroke: none, fill: silver)
line((0, 0), (3, 0), (4, -1), stroke: red)
line((0, 0), (2, 1.5), (4, -1), stroke: green)
line((0, 0), (-1, -1), (4, -1), stroke: black)
circle((2, 1.5), fill: green, stroke: none, radius: 0.1)
content((2, 1.5), anchor: "south", padding: 0.15, $p$)
circle((-1, -1), fill: black, stroke: none, radius: 0.1)
circle((4, -1), fill: black, stroke: none, radius: 0.1)
circle((0, 0), fill: black, stroke: none, radius: 0.1)
circle((3, 0), fill: red, stroke: none, radius: 0.1)
}),
),
),
) <segmentierung_add_point>
To compute the distance from a point to a region, the largest positive distance from the point to all edges is computed. For every edge with corners $a = (a_x, a_y)$ and $b = (b_x, b_y)$, first the vector $d = (d_x, d_y) = b - a$ is computed. The normalized vector $o =1 / (norm(d)) dot (d_y, -d_x)$ is orthogonal to $d$ and points out of the region, as long as $a$ lies before $b$ in clockwise order on the boundary. For a point $p$, the distance to the edge can then be computed with the dot product $o dot (p - a)$. If the point lies on the inner side of the edge, the distance is negative.
To add a point to a region, all edges for which the point lies outside are removed, and two new edges to the point are added. For this, those corners are removed for which the new point lies outside of both adjacent edges. At the position where the corners were removed, the new corner is inserted instead.
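The following Python sketch illustrates the signed edge distance and the insertion of a new corner as described above. It is an illustration added for clarity, not the implementation used in this work; degenerate regions with fewer than three corners are not handled.

```python
import math

def edge_distance(a, b, p):
    # Signed distance from point p to the edge from corner a to corner b.
    # The outward normal o = (d_y, -d_x) / |d| assumes clockwise corner
    # order, so the result is positive when p lies outside the edge.
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(dx, dy)
    ox, oy = dy / length, -dx / length
    return ox * (p[0] - a[0]) + oy * (p[1] - a[1])

def insert_point(corners, p):
    # Update a convex region (list of clockwise corners) with point p.
    n = len(corners)
    outside = [edge_distance(corners[i], corners[(i + 1) % n], p) > 0.0
               for i in range(n)]
    if not any(outside):
        return corners  # p lies inside the region, nothing to do
    # first edge of the contiguous run of edges that see p on the outside
    j = next(i for i in range(n) if outside[i] and not outside[i - 1])
    # skip the corners whose two adjacent edges are both removed
    i = (j + 1) % n
    while outside[i]:
        i = (i + 1) % n
    # keep the remaining corners in order and splice in the new corner
    kept = []
    while True:
        kept.append(corners[i])
        if i == j:
            break
        i = (i + 1) % n
    return kept + [p]
```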
#let area_figure(centers) = {
import cetz.draw: *
set-style(stroke: black)
let points = (
(0.0, 0.0),
(1.0, 0.0),
(1.5, 0.6),
(1.2, 0.8),
(0.2, 0.8),
(-0.1, 0.7),
(-0.5, 0.5),
(-0.5, 0.3),
(-0.4, 0.1),
)
let center = (0.0, 0.0)
let total_area = 0.0
for i in range(1, points.len() - 1) {
let a = points.at(0)
let b = points.at(i)
let c = points.at(i + 1)
let c = ((a.at(0) + b.at(0) + c.at(0)) / 3.0, (a.at(1) + b.at(1) + c.at(1)) / 3.0)
let area = (b.at(0) * c.at(1) - b.at(1) * c.at(0)) / 2.0
if centers {
circle(c, fill: gray, stroke: none, radius: 0.1cm)
}
total_area += area;
center = (center.at(0) + c.at(0) * area, center.at(1) + c.at(1) * area)
}
let center = (center.at(0) / total_area, center.at(1) / total_area)
for i in range(0, points.len() - 1) {
line(points.at(i), points.at(i + 1))
}
line(points.last(), points.at(0))
for i in range(2, points.len() - 1) {
line(points.at(0), points.at(i), stroke: gray)
}
for p in points {
circle(p, fill: black, stroke: none, radius: 0.1cm)
}
if centers {
circle(center, fill: green, stroke: none, radius: 0.1cm)
let fake_center = (0.0, 0.0)
for point in points {
fake_center = (fake_center.at(0) + point.at(0), fake_center.at(1) + point.at(1))
}
fake_center = (fake_center.at(0) / points.len(), fake_center.at(1) / points.len())
circle(fake_center, fill: red, stroke: none, radius: 0.1cm)
}
}
== Determining Coordinates
The coordinates of the trees in the current slice are sought. The set of coordinates starts as the empty set for the highest slice. At every slice, the set of coordinates is updated with the regions that were found. To this end, the area and the centroid are computed for all regions in the current slice, as in @segmentierung_schwerpunkt.
#figure(
caption: [Area and centroid of a convex region.],
grid(
columns: 1 *2,
gutter: 1em,
subfigure(
cetz.canvas(length: 3.0cm, area_figure(false)),
caption: [Subdividing the region completely into triangles],
),
subfigure(
cetz.canvas(length: 3.0cm, area_figure(true)),
caption: [Combining the values of the triangles],
),
),
) <segmentierung_schwerpunkt>
Because the regions are convex, they can be trivially subdivided into triangles. For this, an arbitrary corner is chosen, and for every edge that does not belong to this corner, a triangle is formed. The area of the region is the sum of the areas of the triangles. For the centroid, the centroid of every triangle is computed and weighted with the area of the corresponding triangle.
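A possible implementation of this computation (an added sketch, assuming the corner list from above; not the implementation used in this work):

```python
def area_and_centroid(corners):
    # Fan triangulation from corners[0]; valid because the region is convex.
    ax, ay = corners[0]
    total_area = 0.0
    cx = cy = 0.0
    for b, c in zip(corners[1:], corners[2:]):
        # signed area of the triangle (corners[0], b, c)
        area = ((b[0] - ax) * (c[1] - ay) - (b[1] - ay) * (c[0] - ax)) / 2.0
        # centroid of the triangle, weighted by its area
        cx += (ax + b[0] + c[0]) / 3.0 * area
        cy += (ay + b[1] + c[1]) / 3.0 * area
        total_area += area
    return total_area, (cx / total_area, cy / total_area)
```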
Because each region is the convex hull of all of its points, regions can overlap even though the points of different regions are at least the minimum distance apart. When new points are added, the regions are iterated sequentially, so for overlapping regions the first one is preferred. As a result, the first region grows while the later regions stay small. After all points have been added, the regions are therefore filtered. All regions whose center lies inside another region, or whose area is smaller than a threshold, are removed.
Afterwards, the coordinates from the previous slices are updated with the centroids of the regions of the current slice. For every coordinate, the nearest centroid closer than twice the maximum distance is determined. If a nearby centroid is found, the coordinate is updated with the position of the centroid. If no nearby centroid exists, the position stays the same.
For all centroids that are not close to one of the previous coordinates, a new segment is started. For this, the centroid is added to the list of coordinates.
== Assigning Points
From the coordinates, the Voronoi diagram is computed, which partitions space into cells such that all points in the cell of a coordinate are closest to that coordinate. For every point, the corresponding cell in the Voronoi diagram is determined, and the point is assigned to the corresponding segment. An example of such a partition can be seen in @segmentierung_voronoi.
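Since looking up the Voronoi cell of a point is equivalent to finding its nearest coordinate, the assignment within a slice can be sketched as follows (illustrative only; a real implementation would use an acceleration structure instead of the brute-force search):

```python
def assign_points(points, coordinates):
    # Assign every point to the segment of its nearest coordinate,
    # which is exactly a lookup in the Voronoi diagram of the coordinates.
    segments = [[] for _ in coordinates]
    for p in points:
        best = min(range(len(coordinates)),
                   key=lambda i: (p[0] - coordinates[i][0]) ** 2
                               + (p[1] - coordinates[i][1]) ** 2)
        segments[best].append(p)
    return segments
```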
#figure(
caption: [Computed coordinates for the points with the associated regions and the Voronoi diagram.],
grid(
columns: 1 * 2,
gutter: 1em,
subfigure(rect(image("../images/test_5-moved.svg"), inset: 0pt), caption: [Higher slice]),
subfigure(rect(image("../images/test_6-moved.svg"), inset: 0pt), caption: [Lower slice]),
),
) <segmentierung_voronoi>
The procedure is carried out for all slices from top to bottom, whereby all points are assigned to segments. In @segment_example, the segmentation of a point cloud is shown. The different segments are marked by different colorings of the associated points.
#figure(
caption: [Segmentation of a forest region.],
image("../images/auto-crop/segments-br05-als.png"),
) <segment_example>
|
|
https://github.com/0x1B05/english | https://raw.githubusercontent.com/0x1B05/english/main/cnn10/content/20240304.typ | typst | #import "../template.typ": *
= 20240304
What's up lovely people! Hope you've had an awesome weekend. Let's start this week strong with some motivation Monday. Remember #strike[complacency] #underline[complacency] is the constant enemy, so let's learn one thing or do something that #strike[makes] #underline[will make] us a little better today than we were yesterday. I'm Coy Wire, this is CNN10.
== Korean doctors
We start with doctors protesting--thousands of doctors taking to the street in #strike[South of Korea] #underline[_Seoul, South Korea_], expressing support for the many more thousands of doctors who have been on #strike[straight] #underline[*strike*] for nearly two weeks. This all because the government's plan to increase medical school admissions. The plan includes increasing the country's medical school #strike[and ...] #underline[*enrollment*] by 2,000#underline[,] starting in the 2025 _academic year_. That will bring the total to about 5,000 per year. The government says this plan is to help #strike[me] #underline[meet] the challenging #strike[health care] #underline[*healthcare*] demands #strike[the] #underline[as] South Korea faces, one of the lowest doctor #underline[to] population ratios for a developed country. But doctors disagree with government's method _in believes_ their medical school system can not _handle of_ #strike[fast] #underline[vast] increased educating and training new medical students. Also they're concerned that proposed plan does not include _staffing_ in specific fields that have already been #strike[seen] #underline[seeing] a shortage such as #underline[*pediatrics*] and the emergency departments. Doctors are also worried that the addition of new doctors will lead to increase public medical expenses. But this protest have not really won public support. A recent survey showed a majority of the South #strike[Korea's] #underline[Koreans] #strike[outsiding] #underline[are siding] with government's plan #strike[was some critical sane.] #underline[with some critics saying,] doctors are just worried about receiving a lower income now with more competition in the field. With the strike _ongoing_, the country health ministry says the government has allowed military doctors and nurses to perform some medical procedures#strike[,] normally performed by these striking doctors.
=== words, phrases and sentences
==== words
- _academic year_
- _Seoul_
- _staff_
- _pediatric_
- _expense_
- _ongoing_
==== phrases
- _meet demands_
- _on strike_
- _a to b ratios_
- _in believes that ..._
- _a majority of ..._
- _side with_
- _perform procedures_
=== Dictation
==== Original text
What's up lovely people! Hope you've had an awesome weekend. Let's start this week strong with some motivation Monday. Remember complacency is the constant enemy, so let's learn one thing or do something that will make us a little better today than we were yesterday. I'm <NAME>, this is CNN10.
We start with doctors protesting--thousands of doctors taking to the street in Seoul, South Korea, expressing support for the many more thousands of doctors who have been on strike for nearly two weeks. This all because the government's plan to increase medical school admissions. The plan includes increasing the country's medical school enrollment by 2,000, starting in the 2025 academic year. That will bring the total to about 5,000 per year. The government says this plan is to help meet the challenging healthcare demands as South Korea faces, one of the lowest doctor to population ratios for a developed country. But doctors disagree with government's method in believes their medical school system can not handle of vast increased educating and training new medical students. Also they're concerned that proposed plan does not include staffing in specific fields that have already been seeing a shortage such as pediatrics and the emergency departments. Doctors are also worried that the addition of new doctors will lead to increase public medical expenses. But this protest have not really won public support. A recent survey showed a majority of the South Koreans are siding with government's plan with some critics saying, doctors are just worried about receiving a lower income now with more competition in the field. With the strike ongoing, the country health ministry says the government has allowed military doctors and nurses to perform some medical procedures normally performed by these striking doctors.
==== Reference translation
Hello everyone! Hope you all had a great weekend. Let's start this new week strong with some Motivation Monday. Remember, complacency is our constant enemy, so let's learn one thing or do something that makes us a little better today than we were yesterday. I'm Coy Wire, and this is CNN10.
We start with doctors protesting: thousands of doctors taking to the streets of Seoul, South Korea, expressing support for the many thousands more doctors who have been on strike for nearly two weeks. All of this is because of the government's plan to increase medical school admissions. The plan includes raising the country's medical school enrollment by 2,000 starting in the 2025 academic year, bringing the total to about 5,000 per year. The government says the plan is meant to help meet the challenging healthcare demands South Korea faces, as it has one of the lowest doctor-to-population ratios among developed countries. But doctors disagree with the government's approach, believing their medical school system cannot handle educating and training new medical students on such a vastly increased scale. They are also concerned that the proposed plan does not include staffing for specific fields that are already seeing shortages, such as pediatrics and emergency departments. Doctors also worry that adding new doctors will increase public medical expenses. But the protest has not really won public support. A recent survey showed that a majority of South Koreans side with the government's plan, with some critics saying doctors are simply worried about earning a lower income once there is more competition in the field. With the strike ongoing, the country's health ministry says the government has allowed military doctors and nurses to perform some medical procedures normally performed by the striking doctors.
==== 1st
What's up lovely people. Hope you #underline[have] had a happy weekend. Let's start this week from this energetic Monday. Remember complacency is our constant enemy, so let's #strike[study] #underline[learn] one #underline[thing] or do something that #strike[makes] #underline[will make] us a little better than we were yesterday. I'm Coy Wire, this is CNN10.
We start with #strike[the] doctors protesting, thousands of doctors #strike[walked] #underline[taking] to the street #strike[of Soer] #underline[in Seoul, South Korea], expressing the support of #underline[many more] thousands of doctors who have been on strike for #strike[about] #underline[nearly] 2 weeks. #underline[This] all because the government #strike[are going] #underline[is plan] to increase #strike[the medical students enrollment] #underline[medical school admissions]. The plan includes #strike[increasement] #underline[increasing] #strike[about 2,000 students in medical school] #underline[the country's medical enrollment by 2,000, starting in the 2025 academic year.]#strike[, leading to the total number of the enrollment in medical school arriving about 5,000.]#underline[That will bring the total to about 5,000 per year.] The government says, the plan aims at helping Korean meet the challenging #strike[medical] #underline[*healthcare*] demands they are facing#strike[, for Korean has a lowest doctor ratio of population among the developed country.] #underline[, one of the lowest doctor to population ratios for a developed coutry.] But doctors #strike[don't agree] #underline[disagree] with #strike[the method the government are taking] #underline[the government's method]#strike[, they think] #underline[in believes] the medical system can't #strike[burden the increasement of new medical students who need to be educated and trained on such a scale.] #underline[_handle of vast increased educating and training new new medical students._] #strike[They also concern, the plan doesn't include the staff of the particular field that has a lack of people such as ... and emergency...] #underline[_They're concerned_ that proposed plan does not include staffing in specific fields that have already been seeing a shortage such as pediatrics and the emergency departments.] #strike[They also concern the cost for public medical ... will increase after the plan.] #underline[They are also worried that *the addition of new doctors* will lead to increase *public medical expense*.] But this protest doesn't win #strike[the support of people] #underline[public support]. A recent survey shows #strike[most Koreans] #underline[a majority of the South Koreans] #strike[support this plan. Some] #underline[*are siding with* government's plan with some] critics says, doctors are just #strike[concerning] #underline[worried about] #strike[their income will decrease after the more competition come in this field] #underline[receiving a lower income now with more competition in the field]. #strike[When the strike is on]#underline[With the strike ongoing], the #underline[*country health ministry*] says, the government has allowed the military doctors and nurses to #strike[do] #underline[perform] some medical #strike[processes] #underline[procedures] #strike[usually] #underline[*normally*] #strike[executed] #underline[performed] by these #strike[doctors on strike] #underline[striking doctors].
== Weather
All right, let's turn now to weather as we first head to #underline[*Sierra Nevada*] mountain region#underline[, where *blizzard*] conditions continue to *slam* the area. Dangerous#underline[, *fierce*] winds #underline[of] more than 75 miles per hour#underline[,] _as well as_ the heavy snowfall #strike[threaten the ... of] #underline[*threatened* to *dump up* to] ten feet of snow in the mountain, has forced the officials to close down roads and impact travel. Many #strike[results] #underline[resorts] on the California#underline[-Nevada border] are also shut down, and about 6 million people in that region #underline[are in] a winter alert.
And let's head to another region, the #underline[*Texas Panhandle*] continuing to be #underline[*ravaged*] by the extremely hot and dry conditions, as the biggest #underline[*inferno*] in Texas history #underline[rages on.] The #strike[smoke ..] #underline[Smokehouse Creek] Fire has been #strike[burnt] #underline[burning] for nearly a week now, and destroyed more than 1 million #underline[acres] in the #strike[state of taxies] #underline[State of Texas] alone. It's a scary and #underline[*devastating*] reality many #strike[taxiers] #underline[Texans] are dealing with. Let's turn to our Camila Bernal for more:
It's been windy. It is hot, a lot hotter than it's been over the last couple of days. And it's #underline[dry,] conditions #underline[that] are making #underline[it] extremely difficult for firefighters in this area that still battling the largest #underline[wild]fire in Texas history#strike[ containment. It's] #underline[. *Containment* is] still very low, so there's a lot of work to be done. And #strike[at] #underline[in] the meantime, you have that #strike[graving] #underline[grieving] process beginning for a lot of families that have lost everything#strike[,] #underline[but] the cleanup process. And like, you #underline[are] seeing behind me #strike[is] #underline[at] Johnson house, and it's been difficult to do that cleanup _because the wind have been so high_. But #strike[nevertheless] #underline[nonetheless], #strike[you've seen] #underline[you're seeing] them right now they are trying to #underline[*sift* that *debris*], trying to look for #strike[every] #underline[any] jewellery, anything they can find of #strike[the] #underline[what was] left of their home, the home that they have built for over 20 years. I want #underline[want you to listen to what Ronnie] Johnson told me when #strike[you] #underline[he] first got here to his home after the fire.
We came back at 10:30 that night. #underline[We kind of *snuck through some ranches* to drive up here and see it gone. This was pretty tough.]
And you can hear the emotion#strike[ that's] #underline[. It's] been so difficult for members of this community. It also had a huge impact on the cattle industry here, because 85% percent of #strike[states] #underline[state's] cattle is raised here in #underline[the Panhandle.] And so a lot of #underline[*ranchers*] are also having to start from zero and have shared the struggle both emotionally and #underline[*financially*]. They know it's going to take a long time to get back where they were before those fires.
=== words, phrases and sentences
==== words
- _blizzard_
- _fierce_
- _threaten_
- _ravage_
- _inferno_
- _devastating_
- _containment_
- _sift_
- _debris_
- _sneak (snuck in U.S.)_
- _ranch_
- _cattle_
==== phrases
- _slam the area_
- _dump up_
- _rage on_
- _in the meantime_
==== sentences
- Dangerous, fierce winds of more than 75 miles per hour, _as well as_ the heavy snowfall threatened to dump up to ten feet of snow in the mountain, has forced the officials to close down roads and impact on travel.
- The Texas Panhandle continuing to be *ravaged* by the extremely hot and dry conditions, as the biggest *inferno* in Texas history _rages on_.
- The wind have been so *high*.
- A lot of ranchers have shared the struggle both emotionally and financially.
=== Dictation
==== Original text
All right, let's turn now to weather as we first head to Sierra Nevada mountain region, where blizzard conditions continue to slam the area. Dangerous, fierce winds of more than 75 miles per hour, as well as the heavy snowfall threatened to dump up to ten feet of snow in the mountain, has forced the officials to close down roads and impact travel. Many resorts on the California-Nevada border are also shut down, and about 6 million people in that region are in a winter alert.
And let's head to another region, the Texas Panhandle continuing to be ravaged by the extremely hot and dry conditions, as the biggest inferno in Texas history rages on. The Smokehouse Creek Fire has been burning for nearly a week now, and destroyed more than 1 million acres in the State of Texas alone. It's a scary and devastating reality many Texans are dealing with. Let's turn to our Camila Bernal for more:
It's been windy. It is hot, a lot hotter than it's been over the last couple of days. And it's dry, conditions that are making it extremely difficult for firefighters in this area that still battling the largest wildfire in Texas history . Containment is still very low, so there's a lot of work to be done. And in the meantime, you have that grieving process beginning for a lot of families that have lost everything but the cleanup process. And like, you are seeing behind me at Johnson house, and it's been difficult to do that cleanup because the wind have been so high. But nonetheless, you're seeing them right now they are trying to sift that debris, trying to look for any jewellery, anything they can find of what was left of their home, the home that they have built for over 20 years. I want want you to listen to what <NAME> told me when he first got here to his home after the fire.
We came back at 10:30 that night. We kind of snuck through some ranches to drive up here and see it gone. This was pretty tough.
And you can hear the emotion . It's been so difficult for members of this community. It also had a huge impact on the cattle industry here, because 85% percent of state's cattle is raised here in the Panhandle. And so a lot of ranchers are also having to start from zero and have shared the struggle both emotionally and financially. They know it's going to take a long time to get back where they were before those fires.
==== Reference translation
All right, let's turn now to the weather as we first head to the Sierra Nevada mountain region, where blizzard conditions continue to slam the area. Dangerous, fierce winds of more than 75 miles per hour, along with heavy snowfall that could dump up to ten feet of snow in the mountains, have forced officials to close roads and have impacted travel. Many resorts on the California-Nevada border have also shut down, and about 6 million people in the region are under a winter alert.
Now let's head to another region: the Texas Panhandle continues to be ravaged by extremely hot and dry conditions as the biggest inferno in Texas history rages on. The Smokehouse Creek Fire has been burning for nearly a week now and has destroyed more than 1 million acres in the State of Texas alone. It's a scary and devastating reality that many Texans are dealing with. Let's turn to our Camila Bernal for more:
It's been windy. It is hot, a lot hotter than it has been over the last couple of days. And it's dry, conditions that are making it extremely difficult for the firefighters in this area who are still battling the largest wildfire in Texas history. Containment is still very low, so there's a lot of work to be done. In the meantime, the grieving process is beginning for a lot of families that have lost everything, and the cleanup process has started. As you can see behind me at the Johnson house, it's been difficult to do that cleanup because the winds have been so high. But nonetheless, you're seeing them right now trying to sift through the debris, trying to look for any jewelry, anything they can find of what was left of their home, the home they built over more than 20 years. I want you to listen to what <NAME> told me when he first got back to his home after the fire.
We came back at 10:30 that night. We kind of snuck through some ranches to drive up here, and saw it gone. It was pretty tough.
And you can hear the emotion. It's been so difficult for the members of this community. It has also had a huge impact on the cattle industry here, because 85% of the state's cattle is raised here in the Panhandle. So a lot of ranchers are also having to start from zero and are bearing the struggle both emotionally and financially. They know it's going to take a long time to get back to where they were before these fires.
==== 1st
Now let's turn #underline[now] to weather#strike[. First we come] #underline[as we first head] to the Sierra Nevada mountain region, where #strike[snowstorm] #underline[*blizzard*] conditions #strike[raged on] #underline[slam the area] constantly. #strike[The wind blew over 75 miles per hour and the snowstorm continuously threatened, which may dump up snow over 10 feet and] #underline[Dangerous, fierce winds of more than 75 miles per hour, as well as the heavy snowfall threatened to dump up to ten feet of snow in the mountain,] has forced the officials #underline[to] closed down the road and impact travel. Many resorts on board of California-Nevada are #strike[closed] #underline[*shut down*]#strike[. There are] #underline[and] about 6 million people #underline[in that region are] in #strike[winter alarm] #underline[a winter alert].
Then we turn to another region, the Texas Panhandle #strike[in ... are threatened by the extreme hot and dry conditions for the mountain fire constantly raging on.] #underline[continuing to be ravaged by the extremely hot and dry conditions, as the biggest inferno in Texas history rages on.] The Smokehouse Creek Fire has been #strike[on fire] #underline[burning] for #strike[about] #underline[nearly] a week, #strike[destroying] #underline[and destroyed] more than 1 million acres in the State of Texas alone. #strike[This is a miserable and destructive fact that many ... are confronting.] #underline[It's a scary and devastating reality many Texans are dealing with.] Let's #strike[hear] #underline[turn to] the report from <NAME> #underline[for more]:
The wind blows heavily. #strike[The air] #underline[It] is very hot, #underline[a lot] hotter than #strike[the days before] #underline[it's been over the last couple of days]. And it is #strike[so] dry #strike[that firefighters in this region are facing such difficulties and they are still combating with the largest mountain fire in ...] #underline[, conditions that are making it extremely difficult for firefighters in this area that still battling the largest wildfire in Texas history.] #strike[But the fire-controlling process] #underline[Containment] is very slow, so there's a lot of work to #strike[do] #underline[be done]. #strike[Simultaneously] #underline[In the meantime], #strike[many families begin to pain for the loss of everything, the cleanup starting.] #underline[you have that grieving process beginning for a lot of families that have lost everything but the cleaner process.] #strike[As you see behind me was the ..., the cleanup becomes more difficult for the heavy wind.] #underline[And like, you're seeing behind me at Johnson house, and it's been difficult to do that cleanup because the wind have been so high.] But nonetheless, you #strike[can see] #underline[are seeing] they are trying to #strike[clean the remains] #underline[*sift* that debris], trying to find any jewellery or anything #strike[else] #underline[they can find of what was left of their home]#strike[. This is their 20 years' house.] #underline[, the home they've built for over 20 years.] I want you to listen to what <NAME> told me when he got home first after the fire.
We came back at 10:30 #strike[at] #underline[that] night. We snuck through some #strike[farms] #underline[ranches], and got here, then #strike[found the house missing] #underline[saw it gone]. It's pretty tough.
You can hear their emotion. #strike[All are] #underline[It's been so] tough #strike[to] #underline[for] the members of the community. And that also has an impact on the #strike[cow] #underline[cattle] industry, because 85% of California's cattle #strike[are] #underline[is] raised here in the Panhandle. So many owner of farms have to from zero, who are burdening huge press. They know it #strike[takes] #underline[will take] a long time to #strike[recover] #underline[get back where they were before those fires].
== Supersonic aircraft
All right, you might remember we were talking about the speed of sound about a month ago#strike[. When] #underline[when] we mentioned the development of new supersonic plane. #strike[we will take it] #underline[We're taking] a closer look today#strike[. And] #underline[at] what #strike[air spacing] #underline[*aerospace* and] defense company, *Lockheed Martin* has built #strike[an ...] #underline[and *debuted*] for NASA. The experimental plane is called X-59, and it's a #strike[quite] #underline[quiet] supersonic aircraft. NASA is looking to possibly #strike[evolutionize] #underline[revolutionize] the air #strike[travelling] #underline[travel] industry#strike[ by] #underline[. And by] that#underline[, I] mean go really fast, so fast that it could travel faster than the speed of sound. Take a look #underline[at] this piece, *profiling* the X-59:
This is the X-59 *quest*, a new plane built by <NAME> for NASA. NASA is hoping it would be #strike[a solover of] #underline[able to solve a] problem that has stopped commercial air plane from flying really really fast, that's because:
When #strike[the] #underline[an] aircraft goes supersonic, faster than the speed of sound, it create shockwaves. When I heard the sonic boom for the first time, I finally understood why it was such a big deal. If something's travelling #strike[supersonicly] #underline[*supersonically*], you can see it coming but you won't hear it at all until it has already passed you.
As you could imagine, sonic booms can be #strike[desruptive] #underline[*disruptive*] to life on the ground, so #strike[... 131953] #underline[disruptive that in 1973], the United States government #strike[alone] #underline[along] with much of the world #strike[ban] #underline[banned] all #strike[Soviet...] #underline[*civilian*] aircraft from supersonic #underline[flight] overland.
For the U.S., it's a speed limit.
The only #underline[civilian] jet #underline[to *ferry*] passengers faster than the speed of sound was the *Concorde*. It is #strike[the] only able to fly over the ocean supersonic. So you could do London to New York, but you had to slow down before you got to New York #underline[and be *subsonic*.] So the #strike[arrival ... is] #underline[*routes* were very] limited. The Concorde was a *celebrated* #strike[technology] #underline[technical] achievement, and _an exclusive luxury_ for its passengers. Limited #underline[roots] and heavy fuels consumption #underline[meant that] operating the Concorde was not #strike[good. Business however] #underline[good business. However], it #underline[stopped flying] in 2003. But if supersonic airplane could fly overland, they can fly anywhere, and the *business proposition* changes. NASA and <NAME> are hoping this crazy looking airplane is a step in that direction. It is designed to go supersonic without making the loud sonic boom, we've #underline[been *accustomed*] to hearing for decades now. To understand how this might work, it's important to understand how a sonic boom happens. Just like a #strike[boom] #underline[boat], when the #strike[boom] #underline[boat] is moving, you get the #strike[way coming...] #underline[weight coming off of it], and it's #strike[worthy] #underline[with it] the entire time #strike[of] #underline[it's] travelling. Similarly for supersonic aircraft, the shockwaves *come off* the aircraft and travel to the ground the entire time #underline[it's flying supersonic.] So when engineers designed the X-59, #strike[the use of] #underline[they used] smooth and long #strike[air dynamic] #underline[aerodynamic] lines to limit their shockwaves from reaching the ground. The engine is above the wing, and that's so #underline[that] the shock wave from the engine isn't able to go down to the ground #strike[when it] #underline[. It] just goes up.
It is still #underline[*audible*] like a quiet #underline[*thump*.] And most people, if them hear it, won't even notice it.
NASA is aiming for a first test flight #strike[to] #underline[of] the X-59 in the spring of 2024. *Poetically*, over the #underline[the same patch] of California desert where *Chuck Yeager* first #strike[brought ...] #underline[*broke the sound barrier*] in the X-1.
NASA #underline[can take] that data, #strike[they ...] #underline[and then] be able to present that to the #strike[regular] #underline[*regulatory*] authorities to actually *repeal* the overland supersonic #underline[limitations] #strike[they have to] #underline[or was that we have today.]
A #strike[changing regulation] #underline[change in regulation] could #strike[leave] #underline[lead to] a #strike[surgeon] #underline[*surge* of] investment #strike[to] #underline[in] the next generation of commercial supersonic aircraft, making travel faster than the speed of sound#underline[,] more accessible to everyone. And that could send massive shockwaves to the air travel industry. Shockwaves that the NASA and Lockheed Martin hope will sound more like #underline[*thump*.]
=== words, phrases and sentences
==== words
- _supersonic_
- _aerospace_
- _debut_
- _profile_
- _supersonically_
- _disruptive_
- _civilian_
- _ferry_
- _subsonic_
- _route_
- _exclusive_
- _celebrated_
- _business proposition_
- _aerodynamic_
- _audible_
- _thump_
- _poetically_
- _regulatory_
- _repeal_
- _regulation_
==== phrases
- _come off_
- _a surge of_
==== sentences
- _This crazy looking airplane is a step in that direction._
- _Similarly for supersonic aircraft,..._
- _That could send massive shockwaves to the air travel industry._
=== åè¯
==== åæ
All right, you might remember we were talking about the speed of sound about a month ago when we mentioned the development of new supersonic plane. We're taking a closer look today at what aerospace and defense company, Lockheed Martin has built and debuted for NASA. The experimental plane is called X-59, and it's a quiet supersonic aircraft. NASA is looking to possibly revolutionize the air travel industry . And by that, I mean go really fast, so fast that it could travel faster than the speed of sound. Take a look at this piece, profiling the X-59:
This is the X-59 quest, a new plane built by Lockheed Martin for NASA. NASA is hoping it would be able to solve a problem that has stopped commercial air plane from flying really really fast, that's because:
When an aircraft goes supersonic, faster than the speed of sound, it creates shockwaves. When I heard the sonic boom for the first time, I finally understood why it was such a big deal. If something's travelling supersonically, you can see it coming but you won't hear it at all until it has already passed you.
As you could imagine, sonic booms can be disruptive to life on the ground, so disruptive that in 1973, the United States government along with much of the world banned all civilian aircraft from supersonic flight overland. For the U.S., it's a speed limit.
The only civilian jet to ferry passengers faster than the speed of sound was the Concorde. It is only able to fly over the ocean supersonic. So you could do London to New York, but you had to slow down before you got to New York and be subsonic. So the routes were very limited. The Concorde was a celebrated technical achievement, and an exclusive luxury for its passengers. Limited routes and heavy fuel consumption meant that operating the Concorde was not good business. However, it stopped flying in 2003. But if supersonic airplane could fly overland, they can fly anywhere, and the business proposition changes.
NASA and Lockheed Martin are hoping this crazy looking airplane is a step in that direction. It is designed to go supersonic without making the loud sonic boom, we've been accustomed to hearing for decades now. To understand how this might work, it's important to understand how a sonic boom happens. Just like a boat, when the boat is moving, you get the weight coming off it, and it's with it the entire time it's travelling. Similarly for supersonic aircraft, the shockwaves come off the aircraft and travel to the ground the entire time it's flying supersonic. So when engineers designed the X-59, they used smooth and long aerodynamic lines to limit their shockwaves from reaching the ground. The engine is above the wing, and that's so that the shock wave from the engine isn't able to go down to the ground . It just goes up.
It is still audible like a quiet thump. And most people, if they hear it, won't even notice it.
NASA is aiming for a first test flight of the X-59 in the spring of 2024. Poetically, over the same patch of California desert where Chuck Yeager first broke the sound barrier in the X-1.
NASA can take that data, and then be able to present that to the regulatory authorities to actually repeal the overland supersonic limitations or laws that we have today.
A change in regulation could lead to a surge of investment in the next generation of commercial supersonic aircraft, making travel faster than the speed of sound, more accessible to everyone. And that could send massive shockwaves to the air travel industry. Shockwaves that the NASA and Lockheed Martin hope will sound more like thump.
==== åèç¿»è¯
ä¹è®žäœ è¿è®°åŸå€§çºŠäžäžªæåæä»¬è°è®ºè¿å£°éæ¶ïŒæå°äºæ°è¶
é³é飿ºçåŒåïŒä»å€©æä»¬å°æŽè¯Šç»å°äºè§£äžäžèªç©ºèªå€©ååœé²å
¬åžLockheed Martin䞺çŸåœåœå®¶èªç©ºèªå€©å±ïŒNASAïŒå»ºé åéŠæ¬¡äº®çžçæ°åå®éªé£æºX-59ãè¿æ¶å®éªé£æºè¢«ç§°äžºX-59ïŒæ¯äžç§éé³è¶
é³é飿ºãNASAåžæèœå€åœ»åºæ¹åèªç©ºæ
è¡äžïŒæçæææ¯ïŒå®ç°çæ£çå¿«éãåŠæ€ä¹å¿«ïŒä»¥è³äºèœå€æ¯å£°éæŽå¿«å°è¡è¿ã让æä»¬æ¥çäžäžè¿ç¯ä»ç»X-59çæç« ïŒ
è¿å°±æ¯X-59æ¢çŽ¢è
ïŒäžæ¶ç±Lockheed Martin䞺NASA建é çæ°å飿ºãNASAåžæå®èœå€è§£å³åçšé£æºäžè®©é£è¡éåžžå¿«çé®é¢ïŒåå æ¯ïŒ
åœé£æºè¶
é³éè¡è¿æ¶ïŒå¿«äºå£°éïŒäŒäº§çå²å»æ³¢ãåœæç¬¬äžæ¬¡å¬å°é³çæ¶ïŒæç»äºæçœäžºä»ä¹è¿æ¯äžªå€§é®é¢äºãåŠææç©ä»¥è¶
é³éè¡è¿ïŒäœ å¯ä»¥çå°å®ïŒäœçŽå°å®å·²ç»ç»è¿äœ äºïŒäœ æäŒå¬å°å£°é³ã
äœ å¯ä»¥æ³è±¡ïŒé³çäŒå¯¹å°é¢çæŽ»é æå¹²æ°ã1973幎ïŒçŸåœæ¿åºäžäžçäžè®žå€åœå®¶äžèµ·çŠæ¢æææ°çšé£æºåšéå°äžè¿è¡è¶
é³éé£è¡ã对äºçŸåœæ¥è¯ŽïŒè¿æ¯äžç§é床éå¶ã
å¯äžäžæ¶è¿éä¹å®¢è¶
é³éçæ°çšå·æ°åŒé£æºæ¯åååŒé£æº(Concorde)ãå®åªèœåšæµ·äžè¿è¡è¶
é³éé£è¡ãæä»¥äœ å¯ä»¥ä»äŒŠæŠé£åŸçºœçºŠïŒäœäœ å¿
é¡»åšæµèŸŸçºœçºŠä¹ååéïŒèœ¬äžºæ¬¡é³éé£è¡ãå æ€ïŒèªçº¿éåžžæéãConcordeæ¯äžé¡¹å€åèµèªçææ¯æå°±ïŒä¹æ¯å
¶ä¹å®¢çç¬ç¹å¥¢äŸäº«åãæéçèªçº¿å髿²¹èæå³çè¿è¥åååŒé£æºå¹¶äžåç®ãç¶èïŒå®åš2003å¹Žåæ¢é£è¡ãäœæ¯ïŒåŠæè¶
é³é飿ºèœå€åšéå°äžé£è¡ïŒå®ä»¬å¯ä»¥é£åŸä»»äœå°æ¹ïŒåäžææ¡å°±äŒåçååã
NASAåLockheed Martinåžæè¿æ¶çèµ·æ¥åŸç¯çç飿ºæ¯æçè¿äžªæ¹åè¿åºçäžæ¥ãå®è¢«è®Ÿè®¡äžºåšè¶
é³éè¡è¿æ¶äžååºæä»¬å å幎æ¥ä¹ 以䞺垞ç巚倧é³çãèŠçè§£è¿ç§åçïŒäºè§£å£°çæ¯åŠäœåççåŸéèŠãå°±åäžèè¹ïŒåœè¹åšç§»åšæ¶ïŒå®åžŠçè¹èº«äžçééäžèµ·ç§»åšãåæ ·ïŒå¯¹äºè¶
é³é飿ºïŒå²å»æ³¢ä»é£æºäžè±èœå¹¶åšæŽäžªè¶
é³éé£è¡æéŽäŒ æå°å°é¢ãå æ€ïŒåœå·¥çšåžè®Ÿè®¡X-59æ¶ïŒä»ä»¬äœ¿çšäºå
æ»äžé¿ç空æ°åšååŠçº¿æ¡æ¥éå¶å²å»æ³¢å°èŸŸå°é¢ãååšæºäºæºç¿Œäžæ¹ïŒè¿æ ·ååšæºäº§ççå²å»æ³¢å°±æ æ³äŒ æå°å°é¢ïŒåªäŒåäžäŒ æãå®ä»ç¶æ¯å¯å¬å°çïŒåäžäžªèœ»åŸ®çéé声ã倧倿°äººïŒåŠæå¬å°äºïŒçè³éœäžäŒæ³šæå°ã
NASA计åäº2024幎æ¥å£è¿è¡X-59çéŠæ¬¡è¯é£ãè¯æå°ïŒå°±åšå å©çŠå°Œäºæ²æŒ çåäžåå°æ¹ïŒ<NAME>éŠæ¬¡åšX-1äžçªç Žé³éæ¶ãNASAå¯ä»¥å©çšè¿äºæ°æ®ïŒç¶ååçç®¡æºææäŸïŒä»¥å®é
åºé€æä»¬ä»å€©æé¢äžŽçè¶
é³éé£è¡éå¶ã
æ³è§çååå¯èœäŒå¯ŒèŽäžäžä»£åçšè¶
é³é飿ºçæèµæ¿å¢ïŒäœ¿è¶
é³éé£è¡å¯¹æ¯äžªäººéœæŽå æ®éãè¿å¯èœäŒç»èªç©ºæ
è¡äžåžŠæ¥å·šå€§çå²å»æ³¢ãNASAåLockheed Martinåžæè¿äºå²å»æ³¢å¬èµ·æ¥æŽåæ¯éé声ã
==== 1st
Maybe you still remember #strike[we've talked] #underline[we were talking] about the speed of sound #strike[last month] #underline[about a month ago]#strike[,] #underline[when] we mentioned the development of supersonic plane. Today we #strike[are going to learn more about the new experimental plane X-59 which debuted and was built by Lockheed Martin]#underline[are taking a closer look today at what aerospace and defense company, Lockheed Martin has built and debuted] for NASA. This experimental plane was called X-59, a kind of #underline[quiet] supersonic plane. NASA is #strike[hoping] #underline[looking] to revolutionize the #strike[plane travelling] #underline[air travel] industry#strike[, which means implement the real speed] #underline[. And by that, I mean go really fast], so fast that it #strike[can drive] #underline[could travel] faster than the speed of sound. Let's take a look at #strike[the article about] #underline[this piece, profiling] X-59:
This is the X-59 quest, a new kind of plane built by Lockheed Martin for NASA. NASA is hoping it #strike[can] #underline[would] solve the problem #strike[of the business plane speed limitation] #underline[that has stopped commercial air plane from flying really really fast], that's because:
When a #strike[plane] #underline[aircraft] #strike[travels supersonicly] #underline[goes supersonic], faster than the speed of sound, it will create shockwaves. When I first heard of the #strike[shockwaves] #underline[sonic boom], I finally understood why it's #strike[a big problem] #underline[such a big deal]. If something #strike[travels] #underline[is travelling] supersonicly, you can see it #underline[coming], but you won't hear anything until it passed you.
As you #strike[can] #underline[could] imagine, #strike[shockwaves] #underline[sonic booms] #strike[will] #underline[can] be disruptive to the life on the ground#strike[. American] #underline[, so disruptive that in 1973, the United States] government along with #strike[many other countries across] #underline[much of] the world #strike[prohibited] #underline[banned] #strike[the] #underline[all] civilian aircraft from #strike[travelling supersonicly over the land in 1973] #underline[supersonic flight overland]. #strike[To] #underline[For] the U.S., it's a kind of speed limit.
The only civilian jet #strike[that is allowed] to #strike[carry] #underline[*ferry*] passengers is Concorde, but it is only enable to travel supersonic over the sea. So you #strike[can fly from] #underline[could do] London to New York, but before arriving New York, you #strike[must] #underline[got to] slow down #underline[and be subsonic], owing to which the routes are limited. Concorde is a #underline[*celebrated*] technical achievement#strike[ full of credit] #underline[, and an exclusive luxury for its passengers]. Limited routes and high fuel consumption #underline[meant that operating the Concorde was not good business.] However it #strike[has been] stopped flying #strike[from] #underline[in] 2003. But if supersonic aircraft could fly #strike[over the land] #underline[overland], they can fly anywhere, and the #strike[future of business] #underline[business proposition] will change.
NASA and <NAME> are hoping this crazy looking airplane is a step in that direction. It is designed #underline[without making the loud sonic boom we've been accustomed to for decades]. To understand #strike[this principle] #underline[how this might work], it is important to #strike[know how shockwaves are made] #underline[how a sonic boom happens]. Just like a boat, #strike[it moves along along with its weight] #underline[when the boat is moving, you get the weight coming off it and it's with it the entire time it's travelling]. Similarly, #strike[to] #underline[for] supersonic aircraft, shockwaves come off the plane and #strike[transport] #underline[travel] to the ground #strike[when] #underline[the entire time] it is flying. So when engineers are designing X-59, they limited shockwaves #strike[arriving at] #underline[from reaching the] ground with smooth and long aerodynamic lines. The engine is above the wing, so shockwaves from it #strike[can't spread] #underline[isn't able to go down] to the ground #strike[which] #underline[. It] only can go #strike[above] #underline[up].
It is still audible, like a slight thump. Most people won't notice it even they have heard of it.
NASA #strike[are planing] #underline[is aiming for] #strike[the first fly] #underline[a first flight] of X-59 in the spring of 2024. Poetically, #strike[in] #underline[over] the same #strike[place] #underline[patch] of California desert, where Chuck Yeager first broke the sound barrier in the X-1.
NASA can #strike[exploit it can provide] #underline[take that data, and then be able to present] it to the #strike[regulation organization] #underline[regulatory authorities] to repeal the #underline[overland supersonic] limitations #strike[in the speed of supersonic plane] we are confronting today.
The changes in regulation could lead to a #strike[boom in ...] #underline[surge of investment] in the next generation of #strike[business] #underline[commercial] supersonic aircraft, #strike[which makes] #underline[making travel faster than the speed of sound], more accessible to everyone. It has a potential to #strike[bring a] #underline[send a massive] shockwave to the air travel industry. NASA and Lockheed Martin hopes #strike[these shockwaves are more like thump] #underline[will sound more like thump].
|
|
https://github.com/hitszosa/universal-hit-thesis | https://raw.githubusercontent.com/hitszosa/universal-hit-thesis/main/harbin/bachelor/pages/achievement.typ | typst | MIT License | #import "../config/constants.typ": special-chapter-titles
#let achievement(
content,
) = [
#heading(special-chapter-titles.ææ, level: 1, numbering: none)
#content
] |
https://github.com/Otto-AA/dashy-todo | https://raw.githubusercontent.com/Otto-AA/dashy-todo/main/README.md | markdown | MIT No Attribution | # Dashy TODO
Create TODO comments, which are displayed at the sides of the page.

## Limitations
Currently, there is no prevention of TODOs being rendered on top of each other. See [here](https://github.com/Otto-AA/dashy-todo/issues/1) for more information.
## Usage
The package provides a `todo(message, position: auto | left | right)` method. Call it anywhere you need a todo message.
```typst
#import "@preview/dashy-todo:0.0.1": todo
// It automatically goes to the closer side (left or right)
A todo on the left #todo[On the left].
// You can specify a side if you want to
#todo(position: right)[Also right]
// You can add arbitrary content
#todo[We need to fix the $lim_(x -> oo)$ equation. See #link("https://example.com")[example.com]]
// And you can create an outline for the TODOs
#outline(title: "TODOs", target: figure.where(kind: "todo"))
```
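Since the todo body is ordinary content, you can also style just parts of it, e.g. a highlighted tag (a small sketch that only uses the documented API; the wording of the tag is just an example):
```typst
#todo[#text(red)[urgent:] check this paragraph]
```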
## Styling
You can modify the text by wrapping it, e.g.:
```typst
#let small-todo = (..args) => text(size: 0.6em)[#todo(..args)]
#small-todo[This will be in fine print]
```
|
https://github.com/Mouwrice/resume | https://raw.githubusercontent.com/Mouwrice/resume/main/metadata.typ | typst | // NOTICE: Copy this file to your root folder.
/* Personal Information */
#let firstName = "Maurice"
#let lastName = "<NAME>"
#let personalInfo = (
github: "Mouwrice",
phone: "+32 492 45 01 71",
email: "<EMAIL>",
linkedin: "maurice-van-wassenhove",
)
/* Language-specific */
// Add your own languages while the keys must match the varLanguage variable
#let headerQuoteInternational = (
"": [Master of Science in Computer Science Engineering & Bachelor of Science in Computer Science],
"nl": [Master in Computer Science Engineering & Bachelor in de Informatica],
)
#let cvFooterInternational = (
"": "Curriculum vitae",
"nl": "Curriculum vitae"
)
#let letterFooterInternational = (
"": "Cover Letter",
"nl": "Motivatie brief",
)
/* Layout Setting */
#let awesomeColor = "darknight" // Optional: skyblue, red, nephritis, concrete, darknight
#let profilePhoto = "../src/Maurice_2022_rounded.png" // Leave blank if profile photo is not needed
#let varLanguage = "" // INFO: value must matches folder suffix; i.e "zh" -> "./modules_zh"
#let varEntrySocietyFirst = false // Decide if you want to put your company in bold or your position in bold
#let varDisplayLogo = true // Decide if you want to display organisation logo or not
|
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/meta/state_00.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
#let s = state("hey", "a")
#let double(it) = 2 * it
#s.update(double)
#s.update(double)
$ 2 + 3 $
#s.update(double)
Is: #s.display(),
Was: #locate(location => {
let it = query(math.equation, location).first()
s.at(it.location())
}).
|
https://github.com/songguokunsgg/HUNNU-Typst-Master-Thesis | https://raw.githubusercontent.com/songguokunsgg/HUNNU-Typst-Master-Thesis/master/README.md | markdown | MIT License | # æ¹ååžè倧åŠç¡å£«åŠäœè®ºæ
åºäºåäº¬å€§åŠæ¯äžè®ºæïŒè®Ÿè®¡ïŒç Typst æš¡æ¿ïŒèœå€ç®æŽãå¿«éãæç»çæ PDF æ ŒåŒçæ¯äžè®ºæãæè°¢åäœè
çå享ã
## 泚æäºé¡¹
- å é€äºæ¬ç§çåå士çžå
³çæä»¶ïŒå äžºæ¬æ ¡çæ¬ç¡åæš¡æ¿å·®è·èŸå€§ïŒæ æ³éçšã
- 该暡æ¿å¹¶é宿¹æš¡æ¿ïŒèæ¯æ°éŽæš¡æ¿ïŒ**ååšäžè¢«è®€å¯çé£é©**ã请æååå€å¥œæ¹æ¡ BïŒäŸåŠ WordãTeX LiveãMacTeX çã
- ç§»é€äºå项ç®äžåŒé¢æ¥åçžå
³å
容ã
- äžæš¡æ¿çé¢çžå
³çæä»¶éœåš hnu-thesis æä»¶å€¹å
ïŒå»ºè®®æ®é䜿çšè
äžèŠä¿®æ¹æ€æä»¶å€¹ïŒåäœæéçå
容éœå¯ä»¥ååš thesis.typ æä»¶äžãåŠæåç»ææš¡æ¿æŽæ°ïŒåºæ¬äžäŒåæïŒïŒäœ å¯ä»¥çŽæ¥äœ¿çš `git pull` æåïŒç¶ååå¹¶å²çªå³å¯ã
## äœè
çè¯
æ¯äžå£ïŒæèªå·±ä¹åšäœ¿çšè¿äžªæš¡æ¿å论æïŒåçè¿çšéåžžçèæïŒèäžå€§éšåé®é¢éœå·²ç»åŸå°äºè§£å³ãäœæ¯ç®åç¡®å®ååšäžäºèœåèåŽä¹å€çé®é¢ïŒæ åè§£å³ïŒ
1. åèæç®ïŒè¿äžªççæ¯**æå€§ç硬䌀**ïŒåªæç°åš Typst æ¯æäºèªå®ä¹æ ·åŒïŒäœå¯çšèåŽååæéïŒäžæ¯æ CSL-M çå€ layoutïŒãæä»¥éŸä»¥å®ç°äžè±æç®æ··æïŒäºå®äžåªæåè¯èšçŒæéœèœéŸå®ç°ïŒåœäœ äœ¿çš `#set text(lang: "en")`ïŒæ¶ïŒâåèæç®âäŒåæè±æïŒã宿¹äžè§£å³è¿äžªé®é¢ïŒäžåœåŠçå°±æ æ³æ£åžžäœ¿çš Typst å论æã
2. png åŸçé®é¢ïŒå°é¢é¡µç logo é¢è§æ£åžžïŒäœæ¯éšå PDF 蜯件æåŒäŒéäœã
3. 没æ checkedboxïŒçŽæ¥å¯ŒèŽå°é¢äžå¯çšã
4. ç®åœçŒæé®é¢ïŒåŠæ ¡èŠæ±æ»ç»äžå±æäžå â第 X ç« âïŒäœæ¯æ»ç»å屿åå«çŒ 1 2 èïŒææè§è¿äžªé®é¢å¯ä»¥è§£å³ïŒäœæ¯æ²¡æŸå°åæ³ã
5. åŸè¡šè泚ïŒç¡¬äŒ€ïŒæ æ³å®ç°ïŒæé ä¿®æ¹åäœå€§å°å®ç°äºã
6. ååŸïŒæ²¡æè¿äžªåèœã
æ¬æš¡æ¿åšåäº¬å€§åŠæš¡æ¿åºç¡äžåŒåïŒæ°å¢åä¿®æ¹äºåŸå€éçšäºæ¬æ ¡çåºæ¯ïŒäŸåŠç®æ³ãäžçº¿è¡šãç®åœæ ŒåŒãå°é¢é¡µã页ç亀éçŒæã
è¿æ¯æçæåäžæ¬¡æŽæ°ïŒç±äºä»¥äžé®é¢æ æ³è§£å³ïŒæå¿
é¡»æå·²ç»åå®çå皿蜬æ¢äžº LaTex 䜿çšïŒäžèœæ¿åéåŒç©ç¬ïŒãäœæä»ç¶æè°¢ TypstïŒç»äºæåŸå¥œçåçš¿åäœäœéªãè¿ä»œæš¡æ¿åšç°é¶æ®µå·²ç»åºæ¬å®åïŒåç»çåèœéèŠç宿¹æŽæ°ïŒæä»¥æäžäŒåæŽæ°äºãææ¯äžåïŒåŠææå¯¹ Typst æå
Žè¶£çååŠïŒæ¬¢è¿ç»§ç»å®åè¿äžªæš¡æ¿ïŒè®©æ¹åžå€§çåäœååŠå€äžäžªéæ©ã
åŠæç°é¶æ®µäœ æ³äœ¿çšè¿ä»œæš¡æ¿ïŒæšèäœäžºå皿䜿çšïŒæç»æçè¿æ¯äœ¿çš Latex æŽå¥œã
## å£å¿
- Typst æ¯äžéšæ°ççæçæ è®°è¯èšïŒè¿åäžå°å Word æ LaTeX äžæ ·æççš³å®ã
- 仿éšåæªèœè§£å³çé®é¢ïŒéèŠçå° Typst 宿¹æ¯æïŒå¯èœè¿éèŠå 䞪æïŒ
- 没æäŒªç²äœïŒæ æ³ç»éšååäœå ç²ïŒäŸåŠã楷äœãåã仿å®ãã
- å·²æ¯æ [èªå®ä¹ CSL æ ·åŒ](https://typst.app/docs/reference/meta/bibliography/)ã
## äŒå¿
Typst æ¯å¯çšäºåºççå¯çŒçšæ è®°è¯èšïŒæ¥æåéãåœæ°äžå
管ççç°ä»£çŒçšè¯èšçç¹æ§ïŒæ³šéäºç§åŠåäœ (science writing)ïŒå®äœäž LaTeX çžäŒŒã
- **è¯æ³ç®æŽ**ïŒäžæéŸåºŠè· Markdown çžåœïŒææ¬æºç å¯è¯»æ§é«ïŒäžäŒå LaTeX äžæ ·å
æ¥çåææ äžè±æ¬å·ã
- **çŒè¯é床快**ïŒTypst äœ¿çš Rust è¯èšçŒåïŒå³ typ(e+ru)stïŒç®æ è¿è¡å¹³å°æ¯ WASMïŒå³æµè§åšæ¬å°çŠ»çº¿è¿è¡ïŒä¹å¯ä»¥çŒè¯æåœä»€è¡å·¥å
·ïŒéçšäžç§ **å¢éçŒè¯** ç®æ³åäžç§æçºŠæççé¢çŒåæ¹æ¡ïŒ**ææ¡£é¿åºŠåºæ¬äžäŒåœ±åçŒè¯é床ïŒäžçŒè¯é床äžåžžè§ Markdown æž²æåŒææž²æé床çžåœ**ã
- **ç¯å¢æå»ºç®å**ïŒäžéèŠå LaTeX äžæ ·æè
Ÿå 䞪 G çåŒåç¯å¢ïŒåçæ¯æäžæ¥é©çéæäžè¯èšïŒæ 论æ¯å®æ¹ Web App åšçº¿çŒèŸïŒè¿æ¯äœ¿çš VS Code å®è£
æä»¶æ¬å°åŒåïŒéœæ¯ **å³åŒå³çš**ã
- **ç°ä»£çŒçšè¯èš**ïŒTypst æ¯å¯çšäºåºççå¯çŒçšæ è®°è¯èšïŒæ¥æ **åéãåœæ°ãå
管çäžéè¯¯æ£æ¥** çç°ä»£çŒçšè¯èšçç¹æ§ïŒåæ¶ä¹æäŸäº **éå
** çç¹æ§ïŒäŸ¿äºè¿è¡ **åœæ°åŒçŒçš**ã以åå
æ¬äº `[æ è®°æš¡åŒ]`ã`{èæ¬æš¡åŒ}` äž `$æ°åŠæš¡åŒ$` çå€ç§æš¡åŒçäœçšåïŒå¹¶äžå®ä»¬å¯ä»¥äžé深床å°ã亀äºå°åµå¥ãå¹¶äžéè¿ **å
管ç**ïŒäœ äžåéèŠå TexLive äžæ ·åšæ¬å°å®è£
äžå€§å å¹¶äžå¿
èŠçå®å
ïŒèæ¯ **æéèªåšä»äºç«¯äžèœœ**ã
å¯ä»¥åèåäœè
åäžæå»ºåç¿»è¯ç [Typst äžæææ¡£çœç«](https://typst-doc-cn.github.io/docs/) è¿
éå
¥éšã
## 䜿çš
**æ®éçšæ·åªéèŠä¿®æ¹æ ¹ç®åœäžç `thesis.typ` æä»¶å³å¯ïŒåºæ¬å¯ä»¥æ»¡è¶³äœ çææéæ±ïŒ`hnu-thesis` ç®åœäžç代ç å¯ä»¥çšäºåæ°æ¥é
ïŒäœæ¯ç论äžäœ äžåºè¯¥å¯¹å
¶è¿è¡æŽæ¹ã**
åŠæäœ è®€äžºäžèœæ»¡è¶³äœ çéæ±ïŒå¯ä»¥å
æ¥é
åé¢çéšåã
### åšçº¿çŒèŸ
Typst æäŸäºå®æ¹ç Web AppïŒæ¯æå Overleaf äžæ ·åšçº¿çŒèŸïŒ<https://typst.app/>
**äœæ¯ Web App 并没æå®è£
æ¬å° Windows æ MacOS ææ¥æçåäœïŒæä»¥åäœäžå¯èœååšå·®åŒïŒæä»¥æšèæ¬å°çŒèŸïŒ**
PS: èœç¶äž Overleaf çèµ·æ¥çžäŒŒïŒäœæ¯å®ä»¬åºå±åçå¹¶äžçžåãOverleaf æ¯åšåå°æå¡åšè¿è¡äºäžäžª LaTeX çŒè¯åšïŒæ¬èŽšäžæ¯è®¡ç®å¯éåçæå¡ïŒè Typst åªéèŠåšæµè§åšç«¯äœ¿çš WASM ææ¯æ§è¡ïŒæ¬èŽšäžæ¯ IO å¯éåçæå¡ïŒæä»¥å¯¹æå¡åšåååŸå°ïŒåªéèŠèŽèŽ£æä»¶çäºååšäžåäœåæ¥åèœïŒã
### æ¬å°çŒèŸïŒæšèïŒ
1. å
鿬项ç®ïŒæè
çŽæ¥éè¿ [GitHub Releases](https://gitee.com/songguokunsgg/hnu-thesis-typst) 页é¢äžèœœã
```bash
git clone https://github.com/songguokunsgg/HUNNU-Typst-Master-Thesis.git
```
2. åš [VS Code](https://code.visualstudio.com/) äžæåŒè¯¥ç®åœã
3. åš VS Code äžå®è£
[Typst LSP](https://marketplace.visualstudio.com/items?itemName=nvarner.typst-lsp) å [Typst Preview](https://marketplace.visualstudio.com/items?itemName=mgt19937.typst-preview) æä»¶ãåè
èŽèŽ£è¯æ³é«äº®åéè¯¯æ£æ¥ïŒåè
èŽèŽ£é¢è§ã
- 乿šèäžèœœ [Typst Companion](https://marketplace.visualstudio.com/items?itemName=CalebFiggers.typst-companion) æä»¶ïŒå
¶æäŸäºäŸåŠ `Ctrl + B` è¿è¡å ç²ç䟿æ·çå¿«æ·é®ã
- äœ è¿å¯ä»¥äžèœœæåŒåç [Typst Sync](https://marketplace.visualstudio.com/items?itemName=OrangeX4.vscode-typst-sync) å [Typst Sympy Calculator](https://marketplace.visualstudio.com/items?itemName=OrangeX4.vscode-typst-sympy-calculator) æä»¶ïŒåè
æäŸäºæ¬å°å
çäºåæ¥åèœïŒåè
æäŸäºåºäº Typst è¯æ³çç§åŠè®¡ç®åšåèœã
4. æäž `Shift + Ctrl + P`ïŒç¶åèŸå
¥åœä»€ `Typst Preview: Preview current file`ïŒå³å¯ **忥å¢éæž²æäžé¢è§**ïŒè¿æäŸäº **å
æ ååå®äœåèœ**ã
### ç¹æ§ / 路线åŸ
- **è¯Žæææ¡£**
- [ ] çŒåæŽè¯Šç»çè¯Žæææ¡£ïŒåç»èèäœ¿çš [tidy](https://github.com/typst/packages/tree/main/packages/preview/tidy/0.1.0) çŒåïŒäœ ç°åšå¯ä»¥å
åè [NJUThesis](https://mirror-hk.koddos.net/CTAN/macros/unicodetex/latex/njuthesis/njuthesis.pdf) çææ¡£ïŒåæ°å€§äœä¿æäžèŽïŒæè
çŽæ¥æ¥é
å¯¹åºæºç åœæ°çåæ°
- **ç±»åæ£æ¥**
- [ ] åºè¯¥å¯¹ææåœæ°å
¥åè¿è¡ç±»åæ£æ¥ïŒåæ¶æ¥é
- **å
šå±é
眮**
- [x] 类䌌 LaTeX äžç `documentclass` çå
šå±ä¿¡æ¯é
眮
- [x] **ç²å®¡æš¡åŒ**ïŒå°äžªäººä¿¡æ¯æ¿æ¢æå°é»æ¡ïŒå¹¶äžéèèŽè°¢é¡µé¢ïŒè®ºææäº€é¶æ®µäœ¿çš
- [x] **å颿š¡åŒ**ïŒäŒå å
¥ç©ºçœé¡µïŒäŸ¿äºæå°
- [x] **èªå®ä¹åäœé
çœ®**ïŒå¯ä»¥é
眮ãå®äœãããé»äœãäžã楷äœãçåäœå¯¹åºçå
·äœåäœ
- [ ] **åäœè§£èŠå**ïŒå°åäœé
眮è¿äžæ¥è§£èŠåïŒè®©çšå°åäœçå°æ¹å äžäžå±åäœåç§°é
眮项ïŒä»ãæ é¢ïŒå®äœïŒã-ãå
·äœåäœãéæäžºãæ é¢ã-ãå®äœã-ãå
·äœåäœãïŒ
- [x] **æ°åŠåäœé
眮**ïŒæš¡æ¿äžæäŸé
眮ïŒçšæ·å¯ä»¥èªå·±äœ¿çš `#show math.equation: set text(font: "Fira Math")` æŽæ¹
- **æš¡æ¿**
- [x] ç ç©¶çæš¡æ¿
- [x] å°é¢
- [x] 声æé¡µ
- [x] æèŠ
- [x] 页ç
- [ ] åœå®¶åŸä¹ŠéŠå°é¢
- [ ] åºçææä¹Š
- **çŒå·**
- [x] åèšäœ¿çšçœé©¬æ°åçŒå·
- [x] éåœäœ¿çšçœé©¬æ°åçŒå·
- [x] è¡šæ Œäœ¿çš `1.1` æ ŒåŒè¿è¡çŒå·
- [x] æ°åŠå
¬åŒäœ¿çš `(1.1)` æ ŒåŒè¿è¡çŒå·
- **ç¯å¢**
- [x] å®çç¯å¢ïŒè¿äžªå¯ä»¥èªå·±ä¿®æ¹é
眮ïŒ
- [x] ç®æ³ç¯å¢ïŒç®åçŸè§çšåºŠèŸå·®ïŒäœåªæè¿äžªå
å¯çšïŒ
## Q&A
### æäžäŒ LaTeXïŒå¯ä»¥çšè¿äžªæš¡æ¿å论æåïŒ
å¯ä»¥ã
åŠæäœ äžå
³æ³šæš¡æ¿çå
·äœå®ç°åçïŒäœ å¯ä»¥çš Markdown Like çè¯æ³è¿è¡çŒåïŒåªéèŠæç
§æš¡æ¿çç»æçŒåå³å¯ã
### æäžäŒçŒçšïŒå¯ä»¥çšè¿äžªæš¡æ¿å论æåïŒ
åæ ·å¯ä»¥ã
åŠæä»
ä»
æ¯åœææ¯å
¥éšäžæ¬Ÿç±»äŒŒäº Markdown çè¯èšïŒçžä¿¡äœ¿çšè¯¥æš¡æ¿çäœéªäŒæ¯äœ¿çš Word çŒåæŽå¥œã
### 䞺ä»ä¹æçåäœæ²¡ææŸç€ºåºæ¥ïŒèæ¯äžäžªäžªãè±è
åãïŒ
è¿æ¯å 䞺æ¬å°æ²¡æå¯¹åºçåäœïŒ**è¿ç§æ
åµç»åžžåçåš MacOS çãæ¥·äœãæŸç€ºäž**ã
äœ åºè¯¥å®è£
æ¬ç®åœäžç `fonts` éçææåäœïŒéé¢å
å«äºå¯ä»¥å
莹åçšçãæ¹æ£æ¥·äœãåãæ¹æ£ä»¿å®ãïŒç¶ååéæ°æž²ææµè¯å³å¯ã
äœ å¯ä»¥äœ¿çš `#fonts-display-page()` æŸç€ºäžäžªåäœæž²ææµè¯é¡µé¢ïŒæ¥ç对åºçåäœæ¯åŠæŸç€ºæåã
åŠæè¿æ¯äžèœæåïŒäœ å¯ä»¥æç
§æš¡æ¿éç诎æèªè¡é
眮åäœïŒäŸåŠ
```typst
#let (...) = documentclass(
fonts: (楷äœ: ("Times New Roman", "FZKai-Z03S")),
)
```
å
æ¯å¡«åè±æåäœïŒç¶ååå¡«åäœ éèŠçãæ¥·äœãäžæåäœã
**åäœåç§°å¯ä»¥éè¿ `typst fonts` åœä»€æ¥è¯¢ã**
åŠææŸäžå°äœ æéèŠçåäœïŒå¯èœæ¯å 䞺 **该åäœåäœïŒVariantsïŒæ°éè¿å°**ïŒå¯ŒèŽ Typst æ æ³è¯å«å°è¯¥äžæåäœã
### 䞺ä»ä¹æ¥·äœæ æ³å ç²ïŒ
å 䞺äžè¬é»è®€å®è£
çãæ¥·äœãåªææ ååéçåäœïŒæ²¡æå ç²çæ¬çåäœïŒåæç²æ¥·çåäœå¹¶äžæ¯å
莹åçšçïŒïŒè Typst åæ²¡æå®ç°äŒªç²äœïŒFake BoldïŒç®æ³ïŒæä»¥å¯ŒèŽæ æ³æ£åžžå ç²ã
ç®åæè¿æ²¡æŸå°äžäžªæ¯èŸå¥œçè§£å³æ¹æ³ã
### åŠä¹ Typst éèŠå€ä¹
ïŒ
äžè¬èèšïŒä»
ä»
è¿è¡ç®åççŒåïŒäžå
³æ³šåžå±çè¯ïŒäœ å¯ä»¥æåŒæš¡æ¿å°±åŒå§åäºã
åŠæäœ æ³è¿äžæ¥åŠä¹ Typst çè¯æ³ïŒäŸåŠåŠäœæç¯åžå±ïŒåŠäœè®Ÿçœ®é¡µè页ççïŒäžè¬åªéèŠå äžªå°æ¶å°±èœåŠäŒã
åŠæäœ è¿æ³åŠä¹ Typst çã[å
ä¿¡æ¯](https://typst-doc-cn.github.io/docs/reference/meta/)ãéšåïŒè¿èèœå€çŒåèªå·±çæš¡æ¿ïŒäžè¬èèšéèŠå å€©çæ¶éŽé
读ææ¡£ïŒä»¥åä»äººçŒåçæš¡æ¿ä»£ç ã
åŠæäœ æ Python æ JavaScript çèæ¬è¯èšççŒåç»éªïŒäºè§£è¿åœæ°åŒçŒçšãå®ãæ ·åŒãç»ä»¶ååŒåçæŠå¿µïŒå
¥éšé床äŒå¿«åŸå€ã
### ææçŒå LaTeX çç»éªïŒåŠäœå¿«éå
¥éšïŒ
å¯ä»¥åè [é¢å LaTeX çšæ·ç Typst å
¥éšæå](https://typst-doc-cn.github.io/docs/guides/guide-for-latex-users/)ã
### ç®å Typst æåªäºç¬¬äžæ¹å
åæš¡æ¿ïŒ
å¯ä»¥åè [ç¬¬äžæ¹å
](https://typst-doc-cn.github.io/docs/packages/)ã[Awesome Typst Links](https://github.com/qjcg/awesome-typst) å [Awesome Typst åè¡šäžæç](https://github.com/typst-doc-cn/awesome-typst-cn)ã
### 䞺ä»ä¹åªæäžäžª thesis.typ æä»¶ïŒæ²¡ææç« èåå€äžªæä»¶ïŒ
å 䞺 Typst **è¯æ³è¶³å€ç®æŽ**ã**çŒè¯é床足å€å¿«**ãå¹¶äž **æ¥æå
æ ç¹å»å€ååéŸæ¥åèœ**ã
è¯æ³ç®æŽçå¥œå€æ¯ïŒå³äœ¿æææå
容éœååšåäžäžªæä»¶ïŒäœ ä¹å¯ä»¥åŸç®åå°å蟚åºå䞪éšåçå
容ã
çŒè¯é床足å€å¿«çå¥œå€æ¯ïŒäœ äžåéèŠå LaTeX äžæ ·ïŒå°å
容忣åšå 䞪æä»¶ïŒå¹¶éè¿æ³šéçæ¹åŒæé«çŒè¯é床ã
å
æ ç¹å»å€ååéŸæ¥åèœïŒäœ¿åŸäœ å¯ä»¥çŽæ¥æåšé¢è§çªå£å°äœ æ³èŠçäœçœ®ïŒç¶åçšéŒ æ ç¹å»å³å¯å°èŸŸå¯¹åºæºç æåšäœçœ®ã
è¿æäžäžªå¥œå€æ¯ïŒåäžªæºæä»¶äŸ¿äºåæ¥åå享ã
å³äœ¿äœ è¿æ¯æ³èŠåæå äžªç« èïŒä¹æ¯å¯ä»¥çïŒTypst æ¯æäœ äœ¿çš `#import` å `#include` è¯æ³å°å
¶ä»æä»¶çå
容富å
¥æçœ®å
¥ãäœ å¯ä»¥æ°å»ºæä»¶å€¹ `chapters`ïŒç¶åå°åäžªç« èçæºæä»¶æŸè¿å»ïŒç¶åéè¿ `#include` 眮å
¥ `thesis.typ` éïŒå¯è§äžé¢ç瀺äŸïŒã
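äžé¢æ¯äžäžªæè¿ç§ç»ç»æ¹åŒæåç« èçæå°ç€ºäŸïŒä»
äœç€ºæïŒ`chapters` äžçæä»¶åæ¯å讟çïŒè¯·æç
§èªå·±çç« èåœåïŒïŒ
```typst
// thesis.typ ïŒç€ºæïŒ
#include "chapters/chapter1.typ"
#include "chapters/chapter2.typ"
#include "chapters/chapter3.typ"
```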
### æåŠäœæŽæ¹é¡µé¢äžçæ ·åŒïŒå
·äœçè¯æ³æ¯æä¹æ ·çïŒ
ç论äžäœ å¹¶äžéèŠæŽæ¹ `hnu-thesis` ç®åœäžçä»»äœæä»¶ïŒæ è®ºæ¯æ ·åŒè¿æ¯å
¶ä»çé
眮ïŒäœ éœå¯ä»¥åš `thesis.typ` æä»¶å
ä¿®æ¹åœæ°åæ°å®ç°æŽæ¹ãå
·äœçæŽæ¹æ¹åŒå¯ä»¥é
读 `hnu-thesis` ç®åœäžçæä»¶çåœæ°åæ°ã
äŸåŠïŒæ³èп޿¹é¡µé¢èŸ¹è·äžº `50pt`ïŒåªéèŠå°
```typst
#show: doc
```
æ¹äžº
```typst
#show: doc.with(margin: (x: 50pt))
```
å³å¯ã
åç»æä¹äŒçŒåäžäžªæŽè¯Šç»çææ¡£ïŒå¯èœäŒèèäœ¿çš [tidy](https://github.com/typst/packages/tree/main/packages/preview/tidy/0.1.0) æ¥çŒåã
åŠæäœ é
读äºé£äºåœæ°çåæ°ïŒä»ç¶äžç¥éåŠäœä¿®æ¹åŸå°äœ éèŠçæ ·åŒïŒæ¬¢è¿æåº IssueïŒåªèŠæè¿°æž
æ¥é®é¢å³å¯ã
æè
乿¬¢è¿å 矀讚论ïŒ943622984
### 该暡æ¿åå
¶ä»ç°å Typst äžæè®ºææš¡æ¿çåºå«ïŒ
å
¶ä»ç°åç Typst äžæè®ºææš¡æ¿å€§å€éœæ¯åš 2023 幎 7 æä»œä¹åïŒTypst Version 0.6 ä¹åïŒåŒåçïŒåœæ¶ Typst è¿äžäžå€æçïŒçè³è¿ **å
管ç** åèœéœè¿æ²¡æïŒå æ€åœæ¶ç Typst äžæè®ºææš¡æ¿çåŒåè
åºæ¬éœæ¯èªå·±ä»å€ŽåäºäžééèŠçåèœ/åœæ°ïŒå æ€é æäº **代ç èŠå床é«**ã**æå€§å©é¢æ¡åŒä»£ç **ã**éå€é 蜮å** äž **éŸä»¥èªå®ä¹æ ·åŒ** çé®é¢ã
该暡æ¿æ¯åš 2023 幎 10 ïœ 11 æä»œïŒTypst Version 0.9 æ¶ïŒåŒåçïŒæ€æ¶ Typst è¯æ³åºæ¬çš³å®ïŒå¹¶äžæäŸäº **å
管ç** åèœïŒå æ€èœå€åå°åŸå€äžå¿
èŠç代ç ã
å¹¶äžæå¯¹æš¡æ¿çæä»¶æ¶æè¿è¡äºè§£èŠïŒäž»èŠåäžºäº `utils`ã`templates` å `layouts` äžäžªç®åœïŒè¿äžäžªç®åœå¯ä»¥çåæçåŒåè
æåïŒå¹¶äžäœ¿çš **éå
** ç¹æ§å®ç°äºç±»äŒŒäžå¯åå
šå±åéçå
šå±é
眮èœåïŒå³æš¡æ¿äžç `documentclass` åœæ°ç±»ã
## License
This project is licensed under the MIT License.
|
https://github.com/Han-duoduo/mathPater-typest-template | https://raw.githubusercontent.com/Han-duoduo/mathPater-typest-template/main/chapter/chap3.typ | typst | Apache License 2.0 | = æš¡åå讟
+ å讟æ¯äž€äžªå°åºçäž»æ§è¿æ¥è®Ÿå€éœèœäºçžæ¥æ¶å°ä¿¡å·ïŒå³è¿äž€äžªå°åºäžºé»åºã
+ åšèèPCIæš¡3å¹²æ°çæ
åµäžïŒäž€äžªåé¢é»å°åº ïŒ çä¿¡å·åŒºåºŠçå·®å°äºçäºç»å®éšéã
+ åè®Ÿéæ°åé
PCIåäžäŒäº§çå
¶ä»ä»£ä»·ã
+ æ¯æ¬¡PCIçåé
æ¯ç¬ç«çïŒåšéå€å€æ¬¡å®éªåå¯ä»¥çææ¯ååååžçïŒå æ€å¯ä»¥éçšååååžçéæºæ°è¿è¡æš¡æ
|
https://github.com/SkymanOne/zk-learning | https://raw.githubusercontent.com/SkymanOne/zk-learning/main/notes/intro/intro.typ | typst | #import "../base.typ": *
#show: note
= Introduction to Zero Knowledge Proofs
There is a prover and a verifier.
`String` = proof submitted by the prover
Verifier rejects or accepts the proof (i.e. the `String`)
In CS we talk about proofs that can be verified in *polynomial* time $->$ *NP proofs*
- The `string` is short
- Polynomial time constraints
Given claim `x`, the length of the proof `w` should be of polynomial length.
More formally: `|w|` = polynomial in `|x|`.
The verifier has a *polynomial* time constraint for execution.
`V` _accepts_ `w` if $V(x,w) = 1$, otherwise _rejects_.
#align(center,
diagram(
spacing: 8em,
node((0,0), [*Prover* \ Unconstrained runtime]),
edge("->", [Proof *w*]),
node((1,0), [*Verifier* \ Verifies in polynomial time of claim `x`]),
)
)
= Efficiently verifiable proofs
We can formalise the input as:
- Language $L$ - a set of binary strings
More formally:
#def([
$L$ is an *NP* language (or NP decision problem), if there is a verifier $V$ that runs in time polynomial in the length of the claim $x$, where
- *Completeness [True claims have short proofs]* \
if $x in L$, there is a polynomially (of $|x|$) long witness $w in {0,1}^*$ st $V(x,w) = 1$
- *Soundness [False claims have no proofs]* \
if $x in.not L$, there is no witness. That is, for all $w in {0,1}^*, V(x,w)=0$
])
= Proving Quadratic Residue #footnote[https://mathworld.wolfram.com/QuadraticResidue.html]
*Goal*: proof that $y$ is a quadratic residue $mod N$.
#align(center,
diagram(
spacing: 15em,
node((0,0), [*Prover*]),
edge("->", [Proof = $sqrt(y) mod N in Z^*_N$]),
node((1,0), [*Verifier*]),
)
)
*The gist:* Prove (or persuade) the verifier that the prover could prove the statement.
Adding additional components - *Interactive* and *Probabilistic* proofs
- *Interaction* - verifier engages in the _non-trivial_ interaction with the prover
- *Randomness* - verifier is randomised, and can accept an invalid proof with a small probability (quantified).
#align(center,
diagram(
spacing: 5em,
node-stroke: 0.5pt,
node(enclose: ((0,0), (0,1)), [*Prover*]),
node(enclose: ((1,0), (1,1)), [*Verifier*]),
edge((0, 0), (1, 0), "->", [Answer]),
edge((0, 0.3), (1, 0.3), "<-", [Question]),
edge((0, 0.6), (1, 0.6), "->", [Answer]),
edge((0, 0.9), (1, 0.9), "<-", [Question]),
edge((0, 1), (1, 1), "..", []),
)
)
*Examples*: https://medium.com/@houzier.saurav/non-technical-101-on-zero-knowledge-proofs-cab77d671a40
*Here is the interactive proof for the quadratic residue:*
1. Prover chooses a random $r$ s.t. $1 †r †N$ s.t. $gcd(r, N) = 1$
2. Prover sends $s$, s.t. $s = r^2 mod N$
- If the prover sends $sqrt(s) mod N$ and $sqrt(s times y) mod N$ later, then the verifier could deduce $x = sqrt(y) mod N$
- Instead, the prover gives either one of the roots.
3. Verifier randomly chooses $b in {0, 1}$
4. if $b=0$: the prover sends $z=r$, otherwise $z=r times sqrt(y) mod N$
5. Verifier accepts the proof only if $z^2 = s y^b mod N$
- if $b=0$ then $z^2 = s mod N$
- otherwise, $z^2 = s y mod N$
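As a quick sanity check of the two verification cases, here is a tiny script with hard-coded toy numbers (the values $N = 33$, $y = 16$, $r = 5$ are illustrative, standing in for the random choices):
```typc
let N = 33
let y = 16 // 16 = 4 * 4 mod 33, so y is a quadratic residue
let r = 5
let s = calc.rem(r * r, N) // s = 25
// case b = 0: z = r, and z^2 = s (mod N)
assert(calc.rem(r * r, N) == s)
// case b = 1: z = r * sqrt(y) = 5 * 4 = 20, and z^2 = s * y (mod N)
assert(calc.rem(20 * 20, N) == calc.rem(s * y, N))
```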
During the first try, the probability of error is $1/2$, after $k$ interactions => it is $(1/2)^k$.
*Note*: at the beginning of each interaction, the Prover generates a new $r$ which is $sqrt(s) mod N$
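For intuition on the $(1/2)^k$ soundness error, a one-line sketch of how quickly it shrinks (plain arithmetic, nothing protocol-specific):
```typc
let err(k) = calc.pow(0.5, k)
// err(1) = 0.5, err(10) = 0.0009765625, err(20) is below one in a million
```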
Assessing the properties:
- *Completeness*: if the claim is true, the verifier will accept it.
- *Soundness*: if the claim is false, $forall$provers, $P("Verifier accepts") †1/2$
- After $k$ interactions: $forall$provers, $P("Verifier accepts") †(1/2)^k$
= How did the previous example work?
Sending both parts of the QR would prove that the prover *could* solve the original equation. Additionally, sending just $s$ reveals no knowledge about the original solution.
The ability of the prover to provide the solution to either part persuades the verifier that it could solve the original equation.
= Interactive Proofs
#align(center,
diagram(
spacing: 5em,
node-stroke: 0.5pt,
node(enclose: ((0,0), (0,1)), [*Prover*]),
node(enclose: ((1,0), (1,1)), [*Verifier*]),
node(enclose: ((0,1.5), (1,1.5)), [*Claim / Theorem X*]),
edge((0, 0), (1, 0), "->", [$a_1$]),
edge((0, 0.3), (1, 0.3), "<-", [$q_1$]),
edge((0, 0.6), (1, 0.6), "->", [$a_2$]),
edge((0, 0.9), (1, 0.9), "<-", [$q_2$]),
edge((0, 1), (1, 1), "..", []),
)
)
#def([
$(P,V)$ is an interactive proof for $L$, if $V$ is probabilistic $"poly"(|x|)$
and
- *Completeness*: if $x in L$, $P[(P,V)(x) = "accept"] ⥠c$
- *Soundness*: if $x in.not L$ for every $P^*$, $P[(P^*,V)(x) = "accept"] †s$
We want $c$ to be as close to 1 as possible, and $s$ to be as negligible as possible.
Equivalent as long as $c - s ⥠1/"poly"(|x|)$
])
#def([
Class of interaction proof languages = {$L$ for which there is an interactive proof}
])
= Zero-Knowledge views
The intuition behind zero knowledge interactions:
For any true statement, what the verifier can compute *after* the interaction is the same as what it could have computed *before* the interaction. In other words, the verifier *does not gain any more information* (i.e. knowledge) after the verification interaction. This should hold true *even for malicious verifiers.*
View is essentially a set of traces containing submitted questions $q^*$ to the verifier and received answers $a^*$ from it.
After the interaction $V$ learns:
- Theorem (T) is true, $x in L$
- A *view* of interactions
#def([
*$"view"_v(P,V)[x]$* = ${(q_1, a_1), (q_2, a_2), ...}$, which is a probibility distributions over random values of $V$ and $P$.
])
In the case of $P$, the random value it selects for QR is $r$; in the case of $V$ it is the random bit $b$ that determines whether $P$ reveals $r$ or $r sqrt(y) mod N$
== Simulation Paradigm
We can say that V's view gives no further knowledge to it, if it could have simulated it on its own s.t. the simulated view and the real view are _computationally indistinguishable_.
What that means is that we can have a polynomial-time "distinguisher" that receives traces sampled from either of the views; if it cannot tell them apart with probability meaningfully better than 50/50, we can say that the views are _computationally indistinguishable_.
#align(center,
diagram(
spacing: 10em,
node-stroke: 0.5pt,
node(enclose: ((0, 0), (0,1)), [The poly-time \ Distinguisher]),
node(enclose: ((1, 0), (1, 1))),
node((1, 0.2), [Real view], shape: rect),
node((1, 0.8), [Simulated \ view]),
edge((0,0.2) , (1, 0.2), "<-", [sample]),
edge((0,0.8) , (1, 0.8), "<-", [sample])
)
)
More formally:
#align(center,
diagram(
spacing: 10em,
node-stroke: 0.5pt,
node(enclose: ((0, 0), (0,1)), [The poly-time \ Distinguisher]),
node(enclose: ((1, 0), (1, 1))),
node((1, 0.2), [$D_1$ - k-bit strings]),
node((1, 0.8), [$D_2$ - k-bit strings]),
edge((0,0.5) , (1, 0.5), "<-", [sample]),
)
)
#def([
For all distinguisher algorithms, even after receiving a polynomial number of samples from $D_b$, if $P("Distinguisher guesses " b) < 1/2 + c$ where $c$ is a negligible constant, \ then $D_1 .. D_k$ are *computationally indistinguishable*
])
= Zero-Knowledge Definition
#def([
An Interactive Protocol (P,V) is zero-knowledge for a language $L$ if there exists a *PPT* (Probabilistic Polynomial Time) algorithm *Sim* (a simulator) such that for every $x in L$, the following two probability distributions are
*poly-time* indistinguishable:
1. $"view"_v(P,V)[x] = "traces"$
2. $"Sim"(x, 1^lambda)$
$(P,V)$ is a zero-knowledge interactive protocol if it is *complete*, *sound* and *zero-knowledge*.
])
Flavours of Zero-Knowledge:
- Computationally indistinguishable distributions = CZK
- Perfectly identical distributions = PZK
- Statistically close distributions = SZK
= Simulator Example
For QR, the view is presented as $"view"_v(P,V): (s, b, z)$
Simulator workflow:
1. Pick a random $b$
2. Pick a random $z in Z^*_N$
3. Compute $s = z^2 "/" y^b$
4. Output $(s, z, b)$
We would see that $(s, z, b)$ is distributed identically to the real view.
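To make this concrete, here is one simulated tuple with the same toy numbers as above (illustrative values; $31 = y^(-1) mod 33$ since $16 times 31 = 496 = 15 times 33 + 1$):
```typc
let N = 33
let y = 16
// simulator picks b = 1 and z = 5 at random, then solves for s = z^2 / y^b
let s = calc.rem(5 * 5 * 31, N) // s = 16
// the verification equation z^2 = s * y^b (mod N) holds by construction
assert(calc.rem(5 * 5, N) == calc.rem(s * y, N))
```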
*For adversarial verifier $V^*$:*
1. Pick a random $b$
2. Pick a random $z in Z^*_N$
3. Compute $s = z^2 "/" y^b$
4. If $V*(N, y, s) = b$, then output $(s, z, b)$, otherwise go to step 1.
Within 2 iterations in expectation, we would still reach an identical distribution.
= Proof of knowledge
The prover convinces the verifier not only of the theorem but also that it knows the witness $x$.
Using that information, we can construct the knowledge extractor using the rewinding technique.
More formally:
#def([
Consider $L_R = {x : exists w "s.t." R(x, w) = "accept"}$ for poly-time relation $R$.
$(P,V)$ is a proof of knowledge (POK) for $L_R$ if $exists$ PPT (knowledge) extractor algorithm $E$ s.t. $forall x in L$ in expected poly-time $E^(P)(x)$ outputs $w$ s.t. $V(x, w)$ = accept
])
#align(center,
diagram(
spacing: 12em,
node-stroke: 0.5pt,
node(enclose: ((0,0), (0,1)), [*Prover*]),
node(enclose: ((1,0), (1,1)), [*Verifier*]),
edge((0, 0), (1, 0), "->", [$s=r^2 mod n$]),
edge((0, 0.2), (1, 0.2), "<-", [$b=0$]),
edge((0, 0.4), (1, 0.4), "->", [$r$]),
edge((0, 0.7), (1, 0.7), "<-", [_In the same cycle of $s$_ \ $b=1$]),
edge((0, 0.9), (1, 0.9), "->", [$r times sqrt(y) mod N$]),
)
)
From $r$ and $r times sqrt(y) mod N$ obtained for the same $s$, the extractor can compute $sqrt(y) mod N$, hence extract the knowledge.
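With the same toy numbers, the extractor's final step is a single modular division (illustrative values; $20 = 5^(-1) mod 33$ since $5 times 20 = 3 times 33 + 1$):
```typc
let N = 33
let z1 = 5 // answer for b = 0: r
let z2 = 20 // answer for b = 1 on the same s: r * sqrt(y) mod N
let z1-inv = 20 // inverse of z1 modulo 33
// sqrt(y) = z2 * z1^(-1) mod N
assert(calc.rem(z2 * z1-inv, N) == 4) // 4 is sqrt(y): 4 * 4 = 16 = y (mod 33)
```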
= All of NP is in Zero-Knowledge
#def([
*Theorem[GMW86,Naor]*: If one-way functions exist, then every
$L in "NP"$ has computational zero knowledge interactive proofs
])
Shown that an NP-Complete Problem has a ZK interactive Proof.
[GMW87] Showed ZK interactive proof for G3-COLOR using bit commitments.
= Reading
#link("https://link.springer.com/chapter/10.1007/3-540-47721-7_11", "How to Prove All NP Statements in Zero-Knowledge and a Methodology of Cryptographic Protocol Design (Extended Abstract)") |
|
https://github.com/Karolinskis/KTU-typst | https://raw.githubusercontent.com/Karolinskis/KTU-typst/main/README.md | markdown | # KTU Typst Thesis template
Unofficial Typst template for Kaunas University of Technology.
## Fonts
The following fonts are included in the `fonts` directory:
- Times New Roman Bold Italic
- Times New Roman Bold
- Times New Roman Italic
- Times New Roman
## Contributing
Contributions are welcome. If you see any mistakes or want to add something, feel free to open an issue or a pull request.
|
|
https://github.com/tiankaima/typst-notes | https://raw.githubusercontent.com/tiankaima/typst-notes/master/b47475-2024_lug_sfd_poster/main.typ | typst | #let light_green = cmyk(75%, 0%, 100%, 0%)
#let green = cmyk(100%, 0%, 100%, 20%)
#let light_blue = cmyk(40%, 3%, 0%, 25%)
#let blue = cmyk(72%, 30%, 0%, 70%)
#let mark(body) = box(clip: true, stroke: 0.2cm + red, body)
#let mark(body) = box(body)
#set page(width: 80cm, height: 180cm, fill: light_blue, margin: 0cm)
#let sfd-img = image("imgs/sfd_logo.svg", height: 20cm)
#let sfd-text = text(size: 200pt, font: "Prism", fill: green, stroke: 5pt + white)[
#show par: set block(spacing: 3cm)
Software
Freedom
Day
]
#let sfd-sym = pad(
top: 5cm,
grid(
columns: (auto, auto),
align: bottom + left,
inset: (x: 2cm, y: 0cm),
mark(sfd-img), mark(sfd-text),
),
)
#let sfd-text-cn = pad(
top: 5cm,
bottom: 2cm,
text(size: 240pt, font: "baotuxiaobaiti", fill: white)[
蜯件èªç±æ¥
],
)
#let date-location-box = pad(
2cm,
rect(
radius: 20cm,
width: 40cm,
height: 6cm,
fill: green,
align(
center + horizon,
text(size: 120pt, font: "Nanum Pen Script", fill: white)[
2024.09.21 3C101
],
),
),
)
#let keynotes = box(width: 100%, fill: blue)[
#set align(left)
#set text(size: 80pt, fill: white, font: ("linux libertine", "Source Han Serif SC"))
#let title(it) = text(size: 100pt, font: "Trajan Pro", it)
#let author(it) = text(size: 60pt, underline(it), weight: "bold")
#let details(it) = text(size: 60pt, fill: gray.lighten(10%), it)
#pad(x: 3cm, y: 3.5cm)[
#title[
Keynotes:
]
- $TT$ypst 101 #author[\@tiankaima]
#details[
è§åŸ LaTeX #strike[è¿äºç¬šé]ïŒMarkdown åèœäžå€ïŒæ¥è¯è¯æ°çæçç³»ç» TypstïŒ
åšå享äžïŒæä»¬å°ä»ç» Typst çåºæ¬äœ¿ç𿹿³ïŒå¹¶å±ç€ºäžäºé«çº§åèœã
]
- çµä¿¡5Gäžçœæå¡ç®ä» #author[\@james]
#details[
åŠæ ¡æåŒé5Gååäžçœæå¡ïŒä»¥æ¹äŸ¿åžçåšåœå
䜿çšç§»åšé讯çœç»è®¿é®æ ¡åçœå
èµæºãè¿éç®åä»ç»ææ ¡äžåœçµä¿¡5Gååäžçœæå¡ç建讟è¿å±åææ¯ç»èã
]
- æç Debian maintainer ä¹è·¯ #author[\@äºæ³¢ from PLCT]
#details[
åšæ¬æ¬¡å享äžïŒ äºæ³¢å°ä»ç»äžäžªç€Ÿåºå°çœåŠäœéè¿ä¿®å
ãæå
äžæ¥æ¥æäžº Debian maintainerçè³ Debian developer.
]
- `tracexec`: äŒé
å°è¿œèžª exec ç³»ç»è°çš #author[\@ä»»é¹é£]
#details[
åšé
ç¯å¢æä»æºç å®è£
蜯件æ¶ïŒæä»¬åžžåžžäŒéå°åç§å¥æªçé误ã
è¿äºéè¯¯åšæ²¡æäžäžæçæ
åµäžå¯èœéŸä»¥çè§£ïŒéåžžæä»¬äŒæ³èŠç¥éåºéçåœä»€åæ§è¡ç¯å¢å°åºæ¯ä»ä¹ã
æå°ä»ç»åå±ç€º `tracexec` åŠäœäŒé
å°è§£å³è¿ç±»é®é¢ïŒä»¥å `tracexec` çåŠäžå€§åŠçšïŒåå©å€çšåºè°è¯ã
]
]
#pad(x: 3cm, bottom: 5cm)[
#title[
Lightning Talks:
]
...
]
]
#let caption-text = [
#v(3cm)
#show: it => pad(x: 5cm, it)
#set text(size: 60pt, fill: white, font: ("linux libertine", "Source Han Serif SC"))
#set align(left)
#grid(
columns: (3fr, 1fr),
align: top + left,
[
#pad(left: 1cm)[
äžåœç§åŠææ¯å€§åŠ Linux çšæ·åäŒæ¯ç±äžåœç§åŠææ¯å€§åŠåšæ ¡ç GNU/Linux ç±å¥œè
åèµ·å¹¶ç»æçå¢äœïŒæšåšèåç§å€§ç GNU/Linux 䜿çšè
ïŒæå»ºä¿¡æ¯äº€æµå
±äº«çå¹³å°ïŒå®£äŒ èªç±èœ¯ä»¶çä»·åŒïŒæé«èªç±èœ¯ä»¶ç€Ÿåºæåæ°åŽïŒæšå¹¿èªç±èœ¯ä»¶çåºçšã
]
#set text(size: 40pt)
#let size = 10cm
#grid(
columns: (size, size, size, size),
rows: (size, 2cm),
inset: (x: 1cm, y: 0.1cm),
align: center,
image("imgs/qrcodes/埮信å
¬äŒå·.svg"),
image("imgs/qrcodes/QQå
¬äŒå·.svg"),
image("imgs/qrcodes/çœç«.svg"),
image("imgs/qrcodes/鿥ç§å€§.svg"),
"埮信å
¬äŒå·", "QQå
¬äŒå·", "çœç«", "鿥ç§å€§",
)
],
pad(
x: 2cm,
image("imgs/logo-white.svg", height: 14cm),
),
)
]
#align(center)[
#mark[
#sfd-sym
]
#mark[
#sfd-text-cn
]
#mark[
#date-location-box
]
#mark[
#keynotes
]
#mark[
#caption-text
]
]
|
|
https://github.com/SillyFreak/typst-stack-pointer | https://raw.githubusercontent.com/SillyFreak/typst-stack-pointer/main/docs/manual.typ | typst | MIT License | #import "template.typ" as template: *
#import "/src/lib.typ" as sp
#let package-meta = toml("/typst.toml").package
#let date = datetime(year: 2024, month: 2, day: 23)
#show: manual(
title: "Stack Pointer",
// subtitle: "...",
authors: package-meta.authors.map(a => a.split("<").at(0).trim()),
abstract: [
_Stack Pointer_ is a library for visualizing the execution of (imperative) computer programs, particularly in terms of effects on the call stack: stack frames and local variables therein.
],
url: package-meta.repository,
version: package-meta.version,
date: date,
)
// the scope for evaluating expressions and documentation
#let scope = (sp: sp)
#let exec-example(sim-code, typst-code, render) = {
let preamble = ```typc
import sp: *
```
let steps = eval(mode: "code", scope: scope, preamble.text + typst-code.text)
block(breakable: false, {
grid(columns: (2fr, 3fr), sim-code, typst-code)
})
render(steps)
}
#let exec-grid-render(columns: none, steps) = {
if columns == none { columns = steps.len() }
let steps = steps.enumerate(start: 1)
let render((n, step), width: auto, height: auto) = block(
width: width,
height: height,
fill: gray.lighten(80%),
radius: 0.2em,
inset: 0.4em,
breakable: false,
{
[Step #n: ]
if step.step.line != none [line #step.step.line]
else [(no line)]
v(-0.5em)
list(..step.state.stack.map(frame => {
frame.name
if frame.vars.len() != 0 {
[: ]
frame.vars.pairs().map(((name, value)) => [#name~=~#value]).join[, ]
}
}))
}
)
context {
let rows = range(steps.len(), step: columns).map(i => {
steps.slice(i, calc.min(steps.len(), i + columns))
})
grid(
columns: (1fr,) * columns,
column-gutter: 0.4em,
row-gutter: 0.4em,
..for row in rows {
let height = row.map(step => measure(render(step)).height).fold(0pt, calc.max)
row.map(render.with(width: 100%, height: height))
}
)
}
}
#show raw: it => {
if it.lang == none {
let (text, lang: _, lines: _, theme: _, ..fields) = it.fields()
raw(text, lang: "typc", ..fields)
} else {
it
}
}
#let ref-line(n) = {
show raw: set text(fill: gray, size: 0.9em)
raw(lang: "", "("+str(n)+")")
}
= Introduction
_Stack Pointer_ provides tools for visualizing program execution. Its main use case is for presentations using frameworks such as #link("https://polylux.dev/book/")[Polylux], but it tries to be as general as possible.
There are two main concepts that underpin Stack Pointer: _effects_ and _steps_. An effect is something that changes program state, for example a variable assignment. Right now, the effects that are modeled by this library are limited to ones that affect the stack, giving it its name, but more possibilities would be interesting as well. A step is an instant during program execution at which the current program state should be visualized.
Collectively, these (and a few similar but distinct entities) make up _execution sequences_. In this documentation, _sequence item_ or just _item_ means either an effect or a step, or depending on context also one of these similar entities.
Stack Pointer helps build these execution sequences, calculating the list of states at the steps of interest, and visualizing these states.
= Examples
As the typical use case is presentations, where the program execution would be visualized as a sequence of slides, the examples here will necessarily be displayed differently. So, don't let the basic appearance here fool you: it is up to you how to display Stack Pointer's results.
== Setting a variable
The possibly simplest example would be visualizing assigning a single variable, then returning. Let's do this and look at what's happening exactly:
#exec-example(
```c
int main() {
int a = 0;
return 0;
}
```,
```typc
execute({
l(1, call("main"))
l(2); l(2, push("a", 0))
l(3); l(none, ret())
})
```,
exec-grid-render.with(columns: 5),
)
A lot is going on here: we're using #ref-fn("execute()") to get the result of an execution sequence, which we created with multiple #ref-fn("l()") calls. Each of these calls first applies zero or more effects, then a step that can be used to visualize the execution state. We use steps without effects to change to lines 2 and 3 before showing what these do, for example. For line 1, we don't do that and instead immediately add the `call("main")` effect, because we want the main stack frame from the beginning. For the final step that returns from main, we don't put a line number because it's not meaningful anymore. Not specifying a line number is also useful for visualizing calls into library functions.
The simple visualization here -- listing the five steps (one per `l()` call) in a grid -- is _not_ part of Stack Pointer itself; it's done directly in this manual in a way that's appropriate for this format. The information in each step -- namely, the line numbers, stack frames, and local variables for each stack frame -- _are_ provided by Stack Pointer though.
== Low-level functions for effects and steps
The `l()` function is usually the appropriate way for defining execution sequences, but sometimes it makes sense to drop down a level. Here's the same code, represented with only the low-level functions of Stack Pointer.
#exec-example(
```c
int main() {
int a = 0;
return 0;
}
```,
```typc
execute({
call("main"); step(line: 1)
step(line: 2)
push("a", 0); step(line: 2)
step(line: 3)
ret(); step(line: none)
})
```,
exec-grid-render.with(columns: 5),
)
Apart from the individual function calls, one difference can be seen: the #ref-fn("step()") function takes only named arguments; in fact, a bare step doesn't even _have_ to have a line; `l()` just has it because it's very common to need it. For use cases where line numbers are not needed but `l()` seems like a better fit, just define your own variant using the #ref-fn("bare-l()") helper, e.g.: `let l(id, ..args) = bare-l(id: id, ..args)` -- now you can call this with `id` as a positional parameter.
== Calling functions (the manual way)
Regarding the call stack, a single function is not particularly interesting; the fun begins when there are function calls and thus multiple stack frames. Without additional high-level function tools, this can be achieved as follows:
#exec-example(
```c
int main() {
foo(0);
return 0;
}
void foo(int x) {
return;
}
```,
```typc
execute({
let foo(x) = {
l(6, call("foo"), push("x", x))
l(7); ret() // (1)
}
let main() = {
l(1, call("main"))
l(2); foo(0); l(2) // (2)
l(3); l(none, ret()) // (3)
}
main()
})
```,
exec-grid-render.with(columns: 5),
)
As soon as we're working with functions, it makes sense to represent every example function with an actual Typst function. `foo()` comes first because of how Typst resolves names (for mutually recursive functions this doesn't work, and Typst currently #link("https://github.com/typst/typst/issues/744")[doesn't seem to support it]). The execution sequence is ultimately constructed by calling `main()` once.
The `foo()` function contains its own #ref-fn("call()") and #ref-fn("ret()") effects, so that these don't need to be handled at every call site. There's a small asymmetry though: the return effect at #ref-line(1) is not added via `l()` and thus not immediately followed by a step. After returning, the line where execution resumes depends on the caller, so the next step is generated by `main()` at #ref-line(2), with a line within the C `main()` function.
An exception to this is the `main()` function itself: at #ref-line(3), we generate a step (with line `none`) because here it is clear where execution will resume - or rather, that it _won't_ resume.
== Calling functions (the convenient way)
Some of this function setup is still boilerplate, so Stack Pointer provides a simpler way, using #ref-fn("func()"):
#exec-example(
```c
int main() {
foo(0);
return 0;
}
void foo(int x) {
return;
}
```,
```typc
execute({
let foo(x) = func("foo", 5, l => {
l(0, push("x", x))
l(1)
})
let main() = func("main", 1, l => {
l(0)
l(1); foo(0); l(1)
l(2)
})
main(); l(none)
})
```,
exec-grid-render.with(columns: 5),
)
`func()` brings two conveniences: one, you put your implementation into a closure that gets an `l()` function that interprets line numbers relative to a first line number (for example, line 5 for `foo()`). This makes it easier to adapt the Typst simulation if you change the code example it refers to. Two, the `call()` and `ret()` effects don't need to be applied manually.
The downside is that this uniform handling means that we needed to manually add the last step after returning from `main()`.
== Using return values from functions
The convenience of mapping example functions to Typst functions comes in part from mirroring the logic: instead of having to track specific parameter values in specific calls, just pass exactly those values in Typst calls. Return values are an important part of this, but they need a bit of care:
#exec-example(
```c
int main() {
int x = foo();
return 0;
}
int foo() {
return 0;
}
```,
```typc
execute({
let foo() = func("foo", 6, l => {
l(0)
l(1); retval(0) // (1)
})
let main() = func("main", 1, l => {
l(0)
l(1)
let (x, ..rest) = foo(); rest // (2)
l(1, push("x", x))
l(2)
})
main(); l(none)
})
```,
exec-grid-render.with(columns: 5),
)
In line #ref-line(1) we have the first piece of the puzzle, the #ref-fn("retval()") function. This function is emitted by the implementation closure as if it was a sequence item, but it must be handled before `execute()` could see it because it isn't actually one. In line #ref-line(2) the caller, who normally receives an array of items, now also receives the return value as the first element of that array. By destructuring, the first element is removed, and then the rest of the array needs to be emitted so that these items are part of the complete execution sequence.
== Displaying execution state
Until now the examples have concerned themselves with the actual execution of programs; how to get from #ref-fn("execute()")'s result to the gray boxes in this documentation was not addressed yet. This is potentially different between documents, and Stack Pointer doesn't do a lot for you here yet; still, here's one example for how execution state could be displayed.
The following is a function that takes the example code and _one_ step as returned by #ref-fn("execute()"). Below, you see how it looks when the four last steps of the variable assignment example are rendered next to each other using this function:
#{
import sp: *
let code = ```c
int main() {
int a = 0;
return 0;
}
```
let steps = execute({
let main() = func("main", 1, l => {
l(0)
let a = 0
l(1); push("a", a); l(1)
l(2)
})
main(); l(none)
})
let steps = steps
let render-code = ```typc
let render(code, step) = {
let line = step.step.line
let stack = step.state.stack
block[
#code
#if line != none {
place(
top,
// dimensions are hard-coded for this specific situation
// don't take this part as inspiration, please ;)
dx: 0.47em, dy: 0.15em + (line - 1) * 1.22em,
circle(radius: 0.5em)
)
}
]
block(inset: (x: 0.9em))[
Stack: #parbreak()
#if stack.len() == 0 [(empty)]
#list(..stack.map(frame => {
frame.name
if frame.vars.len() != 0 {
[: ]
frame.vars.pairs().map(((name, value)) => [#name~=~#value]).join[, ]
}
}))
]
}
```
let render = eval(mode: "code", render-code.text + "; render")
render-code
grid(columns: (1fr,) * 4, ..range(1, 5).map(i => render(code, steps.at(i))))
}
A more typical situation would probably put the steps on individual slides. In polylux, for example, the `only()` function can be used to only show some information (the current line markers, the stack state) on specific subslides. To do so, it makes sense to first `enumerate(start: 1)` the steps, so that each step has a subslide index attached to it. The gallery has a complete example of using Stack Pointer with Polylux.
= Module reference
Functions that return sequence items, or similar values like #ref-fn("retval()"), return a value of the following shape: `((type: "..", ..),)` -- that is, an array with a single element. That element is a dictionary with some string `type`, and any other payload fields depending on the type. The payload fields are exactly the parameters of the following helper functions, unless specified otherwise.
#module(
read("/src/lib.typ"),
name: "stack-pointer",
label-prefix: none,
scope: scope,
show-module-name: false,
)
== Effects
Effects are in a separate module because they have the greatest potential for extension,
however they are also re-exported from the main module.
#module(
read("/src/effects.typ"),
name: "stack-pointer.effects",
label-prefix: none,
scope: scope,
show-module-name: false,
)
|
https://github.com/TempContainer/typnotes | https://raw.githubusercontent.com/TempContainer/typnotes/main/opti/con_set.typ | typst | #import "/book.typ": book-page
#import "/templates/my-template.typ": *
#show: book-page.with(title: "Convex Sets")
#show: template
= Hello, typst
Sample page äžææµè¯.
```cpp
#include <iostream>
int main() {
std::cout << "Hello World!\n";
return 0;
}
``` |
|
https://github.com/piepert/typst-seminar | https://raw.githubusercontent.com/piepert/typst-seminar/main/Beispiele/UniHausaufgabe/a01_antwort.typ | typst | #import "template.typ": adt, task, subtask, project, pointed
#show: project.with(
title: "1",
authors: (
(name: "<NAME>", matnr: "123456789"),
(name: "<NAME>", matnr: "987654321"),
(name: "<NAME>", matnr: "123459876")
),
date: "Sonntag, 16.04.2023")
#task[Objekt-orientierte Programmierung (OOP)][Beantworten Sie die folgenden Fragen zu grundlegenden Konzepten der objekt-orientierten Programmierung in Java.]
#subtask([Was ist ein _Objekt_? Was ist eine _Klasse_? Worin besteht der Unterschied?], 2, [
Ein Objekt ist hÀufigerweise eine abstrahierte ReprÀsentation realer Dinge. Klassen sind eine Kategorie gleichartiger Objekte. Objekte sind Instanzen von Klassen, wÀhrend Klassen die BauplÀne fÌr Objekte darstellen.
])
#subtask([Wie wird der Zustand eines Objektes gespeichert? Wie wird sein Verhalten reprÀsentiert?], 2, [
Der Zustand eines Objektes wird durch die Werte seine Attribute gespeichert. Das Verhalten wird durch die Methoden der Klasse beschrieben.
])
#subtask([Wie werden Objekte in Java erzeugt? Was ist ein _Konstruktor_? Wie und wann werden Objekte in Java wieder zerstört?], 3, [
Objekte werden mit dem `new`-Keyword generiert. Der Konstruktor initialisiert das durch `new` generierte Objekt. Durch den Garbage-Collector werden Objekte dann erzeugt, wenn sie nicht mehr verwendet werden, d.h. z.B. wenn keine weiteren Referenzen mehr auf das Objekt existieren.
])
#subtask([Was ist _Vererbung_? Wie deklariert man in Java eine Klasse, die von einer anderen Klasse erbt?], 2, [
Vererbung bezeichnet eine hierarchische Relation zwischen Klassen, in der eine Klasse $B$, die _Unterklasse_, von einer Klasse $A$, die _Oberklasse_, Attribute, Methoden und Implementierungen erbt. Damit kann $B$ als ein $A$ behandelt werden. In der Klassendefinition kann man mittels dem Keyword `extends` nach dem Namen der Klasse eine andere Klasse angeben, von der sie erbt.
```java
class B extends A {
...
}
```
])
#subtask([Was sind #emph([abstrakte Klassen]) und #emph([Methoden])?], 2, [
Abstrake Klassen sind Klassen, von denen keine Objekte erzeugt werden können. Abstrakte Methoden sind Methoden ohne Implementierung, die in jeder Unterklasse implementiert werden mÌssen.
])
#subtask([Was sind _Interfaces_? Wozu benötigt man sie in Java?], 3, [
Interfaces sind Àhnlich zu abstrakten Klassen, die ausschlieÃlich aus abstrakten Methoden bestehen. In Java kann eine Klasse nur von maximal einer anderen Klasse erben, jedoch von beliebig vielen Interfaces. FÃŒr Mehrfachvererbung sind Interfaces ein Ausweg.
])
#subtask([Was sind _Exceptions_? Wie behandelt man sie in Java?], 2, [
Exceptions sind Fehlerereignisse, die, wenn sie nicht behandelt werden, das Programm stoppen. Sie werden durch `try`-`catch`-Anweisungen behandelt. Jede `try`-`catch`-Anweisungen besteht aus einem `try`-Block und beliebig vielen `catch`-Blöcken, die eine bestimmte Exception $E$ behandeln. Im `try`-Block wird der auftretende Fehler abgefangen und der passende `catch`-Fall ausgefÌhrt. Ist kein passender `catch`-Fall vorhanden, wurde sie nicht behandelt und das Programm wird gestoppt.
```java
public static int parseInput(String input) {
try {
return Integer.parseInt(input);
} catch (NumberFormatException e) {
return -1;
}
}
```
])
#subtask([Was sind _Generics_? Welche Vorteile bieten sie?], 2, [
Generics machen es möglich, Methoden und Klassen fÌr verschiedene Datentypen gleichzeitig zu definieren. Wenn keine besondere Eigenschafft eines Datentyps $T$ benutzt wird, kann man die Klasse bzw. die Methode $A$ mithilfe von Generics auf eine Klasse bzw. Methode #emph("A<T>") generalisieren. Anstelle des Namens des Datentyps kann dann ein beliebiger, in der Methoden- oder Klassendefinition definierter, Platzhalter verwendet werden
```java
import java.util.ArrayList;
public class GenericsExample {
public static <T> ArrayList<T> addElement(ArrayList<T> a, T b) {
a.add(b);
return a;
}
public static void main(String[] args) {
ArrayList<Integer> l1 = new ArrayList<Integer>();
l1.add(1);
ArrayList<String> l2 = new ArrayList<String>();
l2.add("Hello");
addElement(l1, 2);
addElement(l2, "World");
}
}
```
])
#task[Complex Numbers][Abstract data types (ADTs) and objects both encapsulate data together with the operations that access it. There are many similarities, but also important differences.]
#subtask([Specify the ADT "Complex Number" for representing complex numbers. It shall be possible to create a complex number, to add, subtract, multiply and divide two complex numbers, and to determine the real and imaginary part of a complex number!], 3, [
#show math.equation: it => emph(it)
#let complex = "complex"
#let add = "add"
#let sub = "sub"
#let mul = "mul"
#let div = "div"
#let Radd = "Radd"
#let Rsub = "Rsub"
#let Rmul = "Rmul"
#let Rdiv = "Rdiv"
#let re = "re"
#let im = "im"
An abstract data type $CC$ for "Complex Number", where $Radd$, $Rsub$, $Rdiv$ and $Rmul$ denote the addition, subtraction, division and multiplication of real numbers:
#adt(
(
$RR$,
),
(
$complex: RR times RR -> CC$,
$add: CC times CC -> CC$,
$sub: CC times CC -> CC$,
$mul: CC times CC -> CC$,
$div: CC times CC -> CC$,
$re: CC -> RR$,
$im: CC -> RR$,
),
$forall r_1, r_2, i_1, i_2 in RR circle.filled.small$,
(
$add(complex(r_1, i_1), complex(r_2, i_2)) = complex(r_1 + r_2, i_1 + i_2)$,
$sub(complex(r_1, i_1), complex(r_2, i_2)) = complex(r_1 - r_2, i_1 - i_2)$,
$mul(complex(r_1, i_1), complex(r_2, i_2)) = complex(r_1 r_2 - i_1 i_2, r_1 i_2 + r_2 i_1)$,
$div(complex(r_1, i_1), complex(r_2, i_2)) = complex((r_1 r_2 + i_1 i_2)/(r_2 r_2 + i_2 i_2), (i_1 r_2 - r_1 i_2)/(r_2 r_2 + i_2 i_2))$,
$re(complex(r_1, i_1)) = r_1$,
$im(complex(r_1, i_1)) = i_1$
)
)
])
#subtask([How can the method declarations of an object-oriented implementation be derived from the signature of the ADT Complex? How do you proceed?], 3, [
The ADT is the specification of a class. The sorts it uses appear as the data types involved; $RR$ corresponds to `double` here, and $CC$ is the class `Complex` itself. The operations become the method signatures of the class, where the domain of the mapping is represented by the parameters and the codomain is the return type of the method. The rules are the implementation of the respective methods.
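A minimal sketch of the derived declarations (only the shape of the signatures matters here; the complete implementation is in `src/Complex.java`):
```java
// Each ADT operation becomes a method: the first CC argument becomes the
// receiver, remaining arguments become parameters, RR becomes double.
class Complex {
    private final double re, im; // state: one value from RR x RR

    Complex(double re, double im) { // complex: RR x RR -> CC
        this.re = re;
        this.im = im;
    }

    double realPart() { return re; } // re: CC -> RR

    double imaginaryPart() { return im; } // im: CC -> RR

    Complex add(Complex o) { // add: CC x CC -> CC
        return new Complex(re + o.re, im + o.im);
    }
}
```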
])
#subtask([Create an object-oriented implementation in Java! Use the file Complex.java for this!], 4, [
See also `src/Complex.java`.
#raw(block: true, lang: "java", read("src/Complex.java"))
]) |
|
https://github.com/SWATEngineering/Docs | https://raw.githubusercontent.com/SWATEngineering/Docs/main/src/3_PB/VerbaliEsterni/VerbaleEsterno_240320/content.typ | typst | MIT License | #import "meta.typ": inizio_incontro, fine_incontro, luogo_incontro, company
#import "functions.typ": glossary, team
#let participants = csv("participants.csv")
#let participants_company = csv("participants_company.csv")
= Participants
/ Meeting start: #inizio_incontro
/ Meeting end: #fine_incontro
/ Meeting place: #luogo_incontro
== Participants from #team
#table(
columns: (3fr, 1fr),
[*Name*], [*Attendance duration*],
..participants.flatten()
)
== Participants from #emph[#company]
#for member in participants_company.flatten() [
- #member
]
= Meeting Summary
/*************************************/
/* INSERT CONTENT BELOW */
/*************************************/
During the meeting the team shared the project's progress with the Proponent and planned what will likely be the last meeting before the PB presentation.
== Product progress
The meeting began with a brief review of the work done by the team since the previous SAL, focused on implementing the unit tests for the simulators and the integration tests of the whole system across its components. Both kinds of tests were carried out with dedicated Python libraries: for the simulation component, statement coverage and branch coverage above 80% were reached.
== Showcase website
The new aesthetic and structural changes made to the showcase website #link("https://swatengineering.github.io/") were presented, such as the new file-ordering method and the icon design. The Proponent appreciated its form and contents, especially on seeing it for the first time since the start of the project.
== Commitments made
Considering the launch of the second batch of the educational project, the Proponent stressed the need to respect the deadlines previously set by the team in order to complete the project on schedule. This would allow the company to manage the application process for the new groups, now very close, effectively. The team was also asked to prepare a brief evaluation of the role played by the Proponent along the path that led to the completion of the project, in order to identify possible improvements and guarantee an optimal experience for the new groups. For example, the Proponent has already decided to stop using the Element application in favor of Discord, considered more widespread and functional.
== Planning of the next SAL
For the next SAL, the two dates of Thursday 28/03/2024 and Tuesday 02/04/2024 were proposed, with times still to be defined, directly at the Proponent's headquarters. It is our responsibility to communicate our preference by email. However, the meeting on Tuesday 02/04/2024 would include the participation of <NAME>, and would therefore be preferable. During that meeting the team aims to obtain the formal approval of the software product built as the MVP; in light of the team's stated preference not to carry out the third CA review, it would effectively be the last external meeting. |
|
https://github.com/ryuryu-ymj/mannot | https://raw.githubusercontent.com/ryuryu-ymj/mannot/main/src/util.typ | typst | MIT License | #let copy-stroke(_stroke, args) = {
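  // Build a copy of `_stroke`, overriding any fields that are supplied in the `args` dictionary.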
let s = stroke(_stroke)
return stroke((
paint: args.at("paint", default: s.paint),
thickness: args.at("thickness", default: s.thickness),
cap: args.at("cap", default: s.cap),
join: args.at("join", default: s.join),
dash: args.at("dash", default: s.dash),
miter-limit: args.at("miter-limit", default: s.miter-limit),
))
}
|
https://github.com/chendaohan/bevy_tutorials_typ | https://raw.githubusercontent.com/chendaohan/bevy_tutorials_typ/main/10_coordinate_system/coordinate_system.typ | typst | #set page(fill: rgb(35, 35, 38, 255), height: auto, paper: "a3")
#set text(fill: color.hsv(0deg, 0%, 90%, 100%), size: 22pt, font: "Microsoft YaHei")
#set raw(theme: "themes/Material-Theme.tmTheme")
= 1. 2D and 3D Scenes
Bevy uses a right-handed, Y-up coordinate system in the game world. For consistency, the 3D and 2D coordinate systems are the same.
- the X axis runs from left to right (+X points right)
- the Y axis runs from bottom to top (+Y points up)
- the Z axis runs from far to near (+Z points toward you)
By default, the origin is at the center of the screen.
This is a right-handed coordinate system. You can visualize the three axes with your right hand: thumb = X, index finger = Y, middle finger = Z.
#image("images/handedness.png")
= 2. UI
For UI, Bevy follows the same conventions as most other UI toolkits and the web:
- the origin is at the top-left corner of the screen
- the Y axis points downward
- the X axis runs from the left edge of the screen to the right edge
- the Y axis runs from the top edge of the screen to the bottom edge
= 3. Cursor and Screen
The cursor position and any other window (screen-space) coordinates follow the same conventions as UI. |
|
https://github.com/hexWars/resume | https://raw.githubusercontent.com/hexWars/resume/main/README-zh.md | markdown | MIT License |
# typst-resume-template

This project is a resume template built with Typst, inspired by this [website](https://satnaing.dev/blog).
## Preview
| | |
|:---:|:---:|
|  |  |
## Usage
Commonly used SVG files are already stored in the `typst` folder; the template file is `typst/resume.typ`. Enter your resume content in `typst/main.typ`.
You can download this project and upload the `typst` folder to the Typst web app to use it.
### Modify the page parameters
```typst
#set page(margin: (top: 15mm, bottom: 15mm))
#set text(font: "Linux Libertine", lang: "zh", 1em)
#set par(leading: 0.58em)
```
This covers the font size, language, top margin, bottom margin, and so on.
### Change the color
Modify the `theme_color` parameter.
### Modify the vertical line
If you want to modify that vertical line, you can find
```typst
line(
start: (0%, 11.1%),
end: (0%, 0%),
length: 4cm,
stroke: 6pt + theme_color,
)
```
and modify it there.
### Divider
The divider can be used, for example, in the education section to separate multiple degrees.
## License
Format is MIT but all the data is owned by hexWars
|
https://github.com/nasyxx/lab-weekly-report | https://raw.githubusercontent.com/nasyxx/lab-weekly-report/master/smwr.typ | typst | // SMILE Lab Weekly Report Template
// by Nasy <EMAIL>
// Version 0.1.0
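// Example usage (sketch):
// #show: doc => smwr("Nasy", datetime(year: 2024, month: 1, day: 8), doc)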
#let smwr(author, date, body) = {
let title = [#author's Weekly Report]
let last_date = date - duration(weeks: 1)
let date_format = "[month repr:short] [day], [year]"
set document(title: title, author: author, date: date)
set page(paper: "us-letter", numbering: "1")
set text(font: "EB Garamond", fallback: true)
set par(justify: true)
show heading: it => {
text(weight: "semibold", smallcaps(it.body))
}
align(left, text(2em, weight: "semibold", smallcaps[#title]))
align(left, text(1em, weight: "medium", [
#last_date.display(date_format) ---
#date.display(date_format + ", Week [week_number]")
]))
// abstract
heading(outlined: false, numbering: none, text(1.2em, smallcaps[Abstract], weight: "medium"))
body
}
|
|
https://github.com/SillyFreak/typst-scrutinize | https://raw.githubusercontent.com/SillyFreak/typst-scrutinize/main/src/lib.typ | typst | MIT License | #import "grading.typ"
#import "task.typ"
#import "solution.typ"
#import "task-kinds/mod.typ" as task-kinds
|
https://github.com/EpicEricEE/typst-equate | https://raw.githubusercontent.com/EpicEricEE/typst-equate/master/src/equate.typ | typst | MIT License | // Element function for alignment points.
#let align-point = $&$.body.func()
// Element function for sequences.
#let sequence = $a b$.body.func()
// Element function for a counter update.
#let counter-update = counter(math.equation).update(1).func()
// State for tracking whether equate is enabled.
#let equate-state = state("equate/enabled", 0)
// State for tracking the sub-numbering property.
#let sub-numbering-state = state("equate/sub-numbering", false)
// State for tracking whether we're in a shared alignment block.
#let share-align-state = state("equate/share-align", 0)
// State for tracking whether we're in a nested equation.
#let nested-state = state("equate/nested-depth", 0)
// Show rule necessary for referencing equation lines, as the number is not
// stored in a counter, but as metadata in a figure.
#let equate-ref(it) = {
if it.element == none { return it }
if it.element.func() != figure { return it }
if it.element.kind != math.equation { return it }
if it.element.body == none { return it }
if it.element.body.func() != metadata { return it }
// Display correct number, depending on whether sub-numbering was enabled.
let nums = if sub-numbering-state.at(it.element.location()) {
it.element.body.value
} else {
// (3, 1): 3 + 1 - 1 = 3
// (3, 2): 3 + 2 - 1 = 4
(it.element.body.value.first() + it.element.body.value.slice(1).sum(default: 1) - 1,)
}
assert(
it.element.numbering != none,
message: "cannot reference equation without numbering."
)
let num = numbering(
if type(it.element.numbering) == str {
// Trim numbering pattern of prefix and suffix characters.
let counting-symbols = ("1", "a", "A", "i", "I", "äž", "壹", "ã", "ã", "ã¢", "ã€", "×", "ê°", "ã±", "*")
let prefix-end = it.element.numbering.codepoints().position(c => c in counting-symbols)
let suffix-start = it.element.numbering.codepoints().rev().position(c => c in counting-symbols)
it.element.numbering.slice(prefix-end, if suffix-start == 0 { none } else { -suffix-start })
} else {
it.element.numbering
},
..nums
)
let supplement = if it.supplement == auto {
it.element.supplement
} else if type(it.supplement) == function {
(it.supplement)(it.element)
} else {
it.supplement
}
link(it.element.location(), if supplement not in ([], none) [#supplement~#num] else [#num])
}
// Extract lines and trim spaces.
#let to-lines(equation) = {
let lines = if equation.body.func() == sequence {
equation.body.children.split(linebreak())
} else {
((equation.body,),)
}
// Trim spaces at begin and end of line.
let lines = lines.filter(line => line != ()).map(line => {
if line.first() == [ ] and line.last() == [ ] {
line.slice(1, -1)
} else if line.first() == [ ] {
line.slice(1)
} else if line.last() == [ ] {
line.slice(0, -1)
} else {
line
}
})
lines
}
// Layout a single equation line with the given number.
#let layout-line(
number: none,
number-align: none,
number-width: auto,
text-dir: auto,
line
) = context {
let equation(body) = [
#math.equation(
block: true,
numbering: _ => none,
body
) <equate:revoke>
]
// Short circuit if no number has to be added.
if number == none {
return equation(line.join())
}
// Short circuit if number is a counter update.
if type(number) == content and number.func() == counter-update {
return {
number
equation(line.join())
}
}
// Start of equation block.
let x-start = here().position().x
// Resolve number width.
let number-width = if number-width == auto {
measure(number).width
} else {
number-width
}
// Resolve equation alignment in x-direction.
let equation-align = if align.alignment.x in (left, center, right) {
align.alignment.x
} else if text-dir == ltr {
if align.alignment.x == start { left } else { right }
} else if text-dir == rtl {
if align.alignment.x == start { right } else { left }
}
// Add numbers to the equation body, so that they are aligned at their
// respective baselines. If the equation is centered, the number is put
// on both sides of the equation to keep the center alignment.
let num = box(width: number-width, align(number-align, number))
let line-width = measure(equation(line.join())).width
let gap = 0.5em
layout(bounds => {
let space = if equation-align == center {
bounds.width - line-width - 2 * number-width
} else {
bounds.width - line-width - number-width
}
let body = if number-align.x == left {
if equation-align == center {
h(-gap) + num + h(space / 2 + gap) + line.join() + h(space / 2) + hide(num)
} else if equation-align == right {
num + h(space + 2 * gap) + line.join()
} else {
h(-gap) + num + h(gap) + line.join() + h(space + gap)
}
} else {
if equation-align == center {
hide(num) + h(space / 2) + line.join() + h(space / 2 + gap) + num + h(-gap)
} else if equation-align == right {
h(space + gap) + line.join() + h(gap) + num + h(-gap)
} else {
line.join() + h(space + 2 * gap) + num
}
}
equation(body)
})
}
// Replace "fake labels" with a hidden figure that is labelled
// accordingly.
#let replace-labels(
lines,
number-mode,
numbering,
supplement,
has-label
) = {
// Main equation number.
let main-number = counter(math.equation).get()
// Indices of lines that contain a label.
let labelled = lines
.enumerate()
.filter(((i, line)) => {
if line.len() == 0 { return false }
if line.last().func() != raw { return false }
if line.last().lang != "typc" { return false }
if line.last().text.match(regex("^<.+>$")) == none { return false }
return true
})
.map(((i, _)) => i)
// Indices of lines that are marked not to be numbered.
let revoked = lines
.enumerate()
.filter(((i, line)) => {
if i not in labelled { return false }
return line.last().text == "<equate:revoke>"
})
.map(((i, _)) => i)
// The "revoke" label shall not count as a labelled line.
labelled = labelled.filter(i => i not in revoked)
// Indices of numbered lines in this equation.
let numbered = if number-mode == "line" {
range(lines.len()).filter(i => i not in revoked)
} else if labelled.len() == 0 and has-label {
// Only outer label, so number all lines.
range(lines.len()).filter(i => i not in revoked)
} else {
labelled
}
(
numbered,
lines.enumerate()
.map(((i, line)) => {
if i in revoked {
// Remove "revoke" label and space and return line.
line.remove(-1)
if line.at(-2, default: none) == [ ] { line.remove(-2) }
return line
}
if i not in labelled { return line }
// Remove trailing spacing (before label).
if line.at(-2, default: none) == [ ] { line.remove(-2) }
// Append sub-numbering only if there are multiple numbered lines.
let nums = main-number + if numbered.len() > 1 {
(numbered.position(n => n == i) + 1,)
}
// We use a figure with kind "equation" to make the sub-equation
// referenceable with the correct supplement. The numbering is stored
// in the figure body as metadata, as a counter would only show a
// single number.
line.at(-1) = [#figure(
metadata(nums),
kind: math.equation,
numbering: numbering,
supplement: supplement
)#label(line.last().text.slice(1, -1))]
return line
})
)
}
// Remove labels from lines, so that they don't interfere when measuring.
#let remove-labels(lines) = {
lines.map(line => {
if line.len() == 0 { return line }
if line.last().func() != raw { return line }
if line.last().lang != "typc" { return line }
if line.last().text.match(regex("^<.+>$")) == none { return line }
line.remove(-1)
if line.at(-1, default: none) == [ ] { line.remove(-1) }
return line
})
}
// Splitting an equation into multiple lines breaks the inbuilt alignment
// with alignment points, so it is emulated here by adding spacers manually.
#let realign(lines) = {
// Utility shorthand for unnumbered block equation.
let equation(body) = [
#math.equation(
block: true,
numbering: none,
body
) <equate:revoke>
]
// Consider lines of other equations in shared alignment block.
let extra-lines = if share-align-state.get() > 0 {
let num = counter("equate/align/counter").get().first()
let align-state = state("equate/align/" + str(num), ())
remove-labels(align-state.final())
} else {
()
}
let lines = extra-lines + lines
// Short-circuit if no alignment points.
if lines.all(line => align-point() not in line) {
return lines.slice(extra-lines.len())
}
// Store widths of each part between alignment points.
let part-widths = lines.map(line => {
line.split(align-point())
.map(part => measure(equation(part.join())).width)
})
// Get maximum width of each part.
let part-widths = for i in range(calc.max(..part-widths.map(points => points.len()))) {
(calc.max(..part-widths.map(line => line.at(i, default: 0pt))), )
}
// Get maximum width of each slice of parts.
let max-slice-widths = array.zip(..lines.map(line => range(part-widths.len()).map(i => {
let parts = line.split(align-point()).map(array.join)
if i >= parts.len() {
0pt
} else {
let slice = parts.slice(0, i + 1).join()
measure(equation(slice)).width
}
}))).map(widths => calc.max(..widths))
// Add spacers for each part, so that the part widths are the same for all lines.
let lines = lines.map(line => {
line.split(align-point())
.enumerate()
.map(((i, part)) => {
// Add spacer to make part the correct width.
let width-diff = part-widths.at(i) - measure(equation(part.join())).width
let spacing = if width-diff > 0pt { h(0pt) + box(fill: yellow, width: width-diff) + h(0pt) }
if calc.even(i) {
spacing + part.join() // Right align.
} else {
part.join() + spacing // Left align.
}
})
.intersperse(align-point())
})
// Update maximum slice widths to include spacers.
let max-slice-widths = array.zip(..lines.map(line => range(part-widths.len()).map(i => {
let parts = line.split(align-point()).map(array.join)
if i >= parts.len() {
0pt
} else {
let slice = parts.slice(0, i + 1).join()
calc.max(max-slice-widths.at(i), measure(equation(slice)).width)
}
}))).map(widths => calc.max(..widths))
// Add spacers between parts to ensure correct spacing with combined parts.
lines = for line in lines {
let parts = line.split(align-point()).map(array.join)
for i in range(max-slice-widths.len()) {
if i >= parts.len() {
break
}
let slice = parts.slice(0, i).join() + h(0pt) + parts.at(i)
let slice-width = measure(equation(slice)).width
if slice-width < max-slice-widths.at(i) {
parts.at(i) = h(0pt) + box(fill: green, width: max-slice-widths.at(i) - slice-width) + h(0pt) + parts.at(i)
}
}
(parts,)
}
// Append remaining spacers at the end for lines that have less align points.
let line-widths = lines.map(line => measure(equation(line.join())).width)
let max-line-width = calc.max(..line-widths)
lines = lines.zip(line-widths).map(((line, line-width)) => {
if line-width < max-line-width {
line.push(h(0pt) + box(fill: red, width: max-line-width - line-width))
}
line
})
lines.slice(extra-lines.len())
}
// Any block equation inside this block will share alignment points, thus
// allowing equations to be interrupted by text and still be aligned.
//
// Sub-numbering is not (yet) continued across equations in this block, so
// each new equation will get a new main number. Equations with a revoke label
// will not share alignment with other equations in this block.
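//
// Example usage (sketch):
// #share-align[
//   $ a &= b + c $
//   Some interrupting text.
//   $ x &= y + z $
// ]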
#let share-align(body) = {
context assert(
equate-state.get() > 0,
message: "shared alignment block requires equate to be enabled."
)
share-align-state.update(n => {
assert.eq(n, 0, message: "nested shared alignment blocks are not supported.")
n + 1
})
let align-counter = counter("equate/align/counter")
align-counter.step()
show math.equation.where(block: true): it => {
if it.has("label") and it.label == <equate:revoke> {
return it
}
let align-state = state("equate/align/" + str(align-counter.get().first()), ())
align-state.update(lines => lines + to-lines(it))
it
}
body
share-align-state.update(n => n - 1)
}
// Applies show rules to the given body, so that block equations can span over
// page boundaries while retaining alignment. The equation number is stepped
// and displayed at every line, optionally with sub-numbering.
//
// Parameters:
// - breakable: Whether to allow page breaks within the equation.
// - sub-numbering: Whether to add sub-numbering to the equation.
// - number-mode: Whether to number all lines or only lines containing a label.
// Must be either "line" or "label".
// - debug: Whether to show alignment spacers for debugging.
//
// Returns: The body with the applied show rules.
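//
// Example usage (sketch, assuming this file is imported as `equate`):
// #show: equate.with(breakable: true, sub-numbering: true)
// $ E &= m c^2 \
//     &= p c $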
#let equate(
breakable: auto,
sub-numbering: false,
number-mode: "line",
debug: false,
body
) = {
// Validate parameters.
assert(
breakable == auto or type(breakable) == bool,
message: "expected boolean or auto for breakable, found " + repr(breakable)
)
assert(
type(sub-numbering) == bool,
message: "expected boolean for sub-numbering, found " + repr(sub-numbering)
)
assert(
number-mode in ("line", "label"),
message: "expected \"line\" or \"label\" for number-mode, found " + repr(number-mode)
)
assert(
type(debug) == bool,
message: "expected boolean for debug, found " + repr(debug)
)
// This function was applied to a reference or label, so apply the reference
// rule instead of the equation rule.
if type(body) == label {
return {
show ref: equate-ref
ref(body)
}
} else if body.func() == ref {
return {
show ref: equate-ref
body
}
}
show math.equation.where(block: true): set block(breakable: breakable) if type(breakable) == bool
show math.equation.where(block: true): it => {
// Don't apply show rule in a nested equations.
if nested-state.get() > 0 {
return it
}
// Allow a way to make default equations.
if it.has("label") and it.label == <equate:revoke> {
return it
}
// Make spacers visible in debug mode.
show box.where(body: none): it => {
if debug { it } else { hide(it) }
}
show box.where(body: none): set box(
height: 0.4em,
stroke: 0.4pt,
) if debug
// Prevent show rules on figures from messing with replaced labels.
show figure.where(kind: math.equation): it => {
if it.body == none { return it }
if it.body.func() != metadata { return it }
none
}
// Main equation number.
let main-number = counter(math.equation).get().first()
// Resolve text direction.
let text-dir = if text.dir == auto {
if text.lang in (
"ar", "dv", "fa", "he", "ks", "pa",
"ps", "sd", "ug", "ur", "yi",
) { rtl } else { ltr }
} else {
text.dir
}
// Resolve number position in x-direction.
let number-align = if it.number-align.x in (left, right) {
it.number-align.x
} else if text-dir == ltr {
if it.number-align.x == start { left } else { right }
} else if text-dir == rtl {
if it.number-align.x == start { right } else { left }
}
let (numbered, lines) = replace-labels(
to-lines(it),
number-mode,
it.numbering,
it.supplement,
it.has("label")
)
// Short-circuit for single-line equations.
if lines.len() == 1 and share-align-state.get() == 0 {
if it.numbering == none { return it }
if numbering(it.numbering, 1) == none { return it }
let number = if numbered.len() > 0 {
numbering(it.numbering, main-number)
} else {
// Step back counter as this equation should not be counted.
counter(math.equation).update(n => n - 1)
}
return {
// Update state to allow correct referencing.
sub-numbering-state.update(_ => sub-numbering)
layout-line(
lines.first(),
number: number,
number-align: number-align,
text-dir: text-dir
)
// Step back counter as we introducted an additional equation
// that increased the counter by one.
counter(math.equation).update(n => n - 1)
}
}
// Calculate maximum width of all numberings in this equation.
let max-number-width = if it.numbering == none { 0pt } else {
calc.max(0pt, ..range(numbered.len()).map(i => {
let nums = if sub-numbering and numbered.len() > 1 {
(main-number, i + 1)}
else {
(main-number + i,)
}
measure(numbering(it.numbering, ..nums)).width
}))
}
// Update state to allow correct referencing.
sub-numbering-state.update(_ => sub-numbering)
// Layout equation as grid to allow page breaks.
block(grid(
columns: 1,
row-gutter: par.leading,
..realign(lines).enumerate().map(((i, line)) => {
let sub-number = numbered.position(n => n == i)
let number = if it.numbering == none {
none
} else if sub-number == none {
// Step back counter as this equation should not be counted.
counter(math.equation).update(n => n - 1)
} else if sub-numbering and numbered.len() > 1 {
numbering(it.numbering, main-number, sub-number + 1)
} else {
numbering(it.numbering, main-number + sub-number)
}
layout-line(
line,
number: number,
number-align: number-align,
number-width: max-number-width,
text-dir: text-dir
)
})
))
// Revert equation counter step(s).
if it.numbering == none {
// We converted a non-numbered equation into multiple empty-
// numbered ones and thus increased the counter at every line.
counter(math.equation).update(n => n - lines.len())
} else {
// Each line stepped the equation counter, but it should only
// have been stepped once (when using sub-numbering). We also
// always introduced an additional numbered equation that
// stepped the counter.
counter(math.equation).update(n => {
n - if sub-numbering and numbered.len() > 1 { numbered.len() } else { 1 }
})
}
}
// Apply this show rule first to update nested state.
// This works because the context provided by the other show rule does not
// yet include the updated state, so it can be retrieved correctly.
show math.equation.where(block: true): it => {
// Turn off numbering for nested equations, as it's usually unwanted.
// Workaround for https://github.com/typst/typst/issues/5263.
set math.equation(numbering: none)
nested-state.update(n => n + 1)
it
nested-state.update(n => n - 1)
}
// Add show rule for referencing equation lines.
show ref: equate-ref
equate-state.update(n => n + 1)
body
equate-state.update(n => n - 1)
}
|
https://github.com/ls1intum/thesis-template-typst | https://raw.githubusercontent.com/ls1intum/thesis-template-typst/main/utils/todo.typ | typst | MIT License | #let TODO(body, color: yellow, width: 100%, breakable: true) = {
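  // Renders a colored, padded callout block for marking open TODOs.
  // Example usage (sketch): #TODO(color: orange)[Revise this paragraph.]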
block(
width: width,
radius: 3pt,
stroke: 0.5pt,
fill: color,
inset: 10pt,
breakable: breakable,
)[
#body
]
} |
https://github.com/juruoHBr/typst_xdutemplate | https://raw.githubusercontent.com/juruoHBr/typst_xdutemplate/main/template.typ | typst |
#import "@preview/cuti:0.2.1": show-cn-fakebold
#import "utils.typ": *
// ---------------------------å
šå±åé-------------------------
#let headings = state("headings",())
#let frontmattercnt = counter("frontmattercnt")
#let mainpagecnt = counter("mainpagecnt")
// ----------------------------èŸ
å©åœæ°-------------------------
#let get-fonts(fonts) = {
let font-dict = (
en-main-font: "Times New Roman",
en-heading-font: "Times New Roman",
ch-main-font: "SimSun",
ch-heading-font: "SimHei",
) + fonts
let title-font = (font-dict.en-heading-font, font-dict.ch-heading-font)
let main-font = (font-dict.en-main-font,font-dict.ch-main-font)
return (title-font,main-font)
}
// è·åheaderçææ¬
#let getheaderinfo(loc,title) = {
if(calc.odd(loc.page())){
let headings_array = headings.final(loc)
let headertext =none
for page_heading in headings_array{
if int(loc.page()) < page_heading.at(0){
return headertext
}
headertext = page_heading
}
return headings_array.last()
}
else {
return (0,-1,title)
}
}
// æçheader
#let header-fun(numberformat: "1",cnt: counter(page),title:[]) = {
let headercontext = {
locate(loc=>{
h(1fr)
let headerinfo = getheaderinfo(loc,title)
if(headerinfo.at(1)==-1){
headerinfo.at(2)
}
else {
let header-format = headerinfo.at(1)
if header-format != none{
numbering(header-format,..headerinfo.at(3)) + " "
}
headerinfo.at(2)
}
h(1fr)
})
}
block(width: 100%, height: 100%, stroke: (bottom:1pt), inset: 0.5em)[
#headercontext
#locate(loc =>
if(calc.odd( loc.page() )){
place(right+bottom)[#cnt.display(numberformat)]
}
else {
place(left + bottom)[#cnt.display(numberformat)]
}
)
]
}
// ----------------------------ææ¡£åœæ°-------------------------
#let abstract(
ch-abstract,
en-abstract
) = {
heading(level: 1, outlined: false)[æèŠ]
ch-abstract
heading(level: 1, outlined: false)[Abstract]
en-abstract
}
#let cover(
info: none
) = [
#set par(first-line-indent: 0em)
#set table(stroke: (x, _) => if x == 1 { (bottom: 0.8pt) })
#set page(margin: (top: 2.54cm, bottom: 2.54cm) )
#set align(center)
#{
set text(size: 12pt)
place(top+right,dx: -1.5cm)[
#table(
columns: (3em,7em),
align: bottom + center,
column-gutter: 0.5em,
[*ç~~级*], [*#info.class*],
[*åŠ~~å·*], [*#info.number*],
)
]
}
#v(2.5cm)
#image("figures/æ ¡å.jpg", width: 7.73cm, height: 1.27cm, fit: "stretch")
#v(0.9cm)
#text(font: "SimHei", size: 42pt, tracking: 5pt)[æ¬ç§æ¯äžè®Ÿè®¡è®ºæ]
#v(1.2cm)
#image("figures/æ ¡åŸœ.png",width: 4.39cm, height: 4.18cm, fit: "stretch")
#v(1cm)
#{
set text(size: 15pt)
let titles = info.covertitle.split("\n")
move(dx : -15pt,
table(
columns: (5em,15em),
align: bottom,
row-gutter: 1.8em,
column-gutter: 1em,
[*é¢~~~~~~~~ç®*], hei[#titles.at(0)],
[],hei[#titles.at(1)],
[*åŠ~~~~~~~~é¢*], [#info.school],
[*äž~~~~~~~~äž*], [#info.major],
[*åŠçå§å*], [#info.name],
[*富åžå§å*], [#info.tutor],
)
)
}
]
#let thesis-contents() ={
pagebreak()
outline(indent: auto)
}
#let front-matter(doc,title: []) = {
//页é¢è®Ÿçœ®
set page(
header: frontmattercnt.update(it=>it+1)+ header-fun(numberformat: "i",cnt: frontmattercnt,title:title),
footer: []
)
doc
}
#let mainbody(title:[],doc) = {
// æ é¢è®Ÿçœ®
show heading: set heading(numbering: "1.1")
show heading.where(level: 1): set heading(numbering: "第äžç« ")
set page(
header: mainpagecnt.update(it=> it+1)+header-fun(numberformat: "1",cnt: mainpagecnt, title: title),
footer: []
)
show math.equation: set text(font: ("New Computer Modern Math", "SimHei"))
set math.equation(numbering: it=>{
set text(font: ("Times New Roman","SimSun"))
"åŒ(" + context str(counter(heading).get().first() )+ "-" + str(it) +")"
})
set figure(numbering: it=>{
context str(counter(heading).get().first()) + "." + str(it)
})
counter(page).update(1)
doc
}
#let after-matter(title:[],doc) = {
set page(
header: mainpagecnt.update(it=> it+1)+header-fun(numberformat: "1",cnt: mainpagecnt, title: title),
footer: []
)
doc
}
#let appendix(doc) = {
set figure(numbering: it=>{
context numbering("A",counter(heading).get().first()) + str(it)
})
set math.equation(numbering: it=>{
"åŒ(" + context numbering("A",counter(heading).get().first() )+ "-" + str(it) +")"
})
show heading.where(level: 1): set heading(numbering: "éåœA")
counter(heading).update(0)
doc
}
//----------------------------è®ºææš¡æ¿--------------------
#let xdudoc(
fonts: (:),
fontsize : 12pt,
factor: 1.5,
doc
) = {
//å
šå±è®Ÿçœ®
show strong: show-cn-fakebold
set text(lang: "zh", region: "cn")
set pagebreak(to: "odd", weak: true)
let (title-font,main-font) = get-fonts(fonts)
set text(
font: main-font,
size: fontsize,
lang: "zh",
)
set par(
leading: 15.6pt * factor - 0.7em,
first-line-indent: 2em,
justify: true,
)
set page(margin: (top:3cm, bottom: 2cm, inside: 3cm, outside: 2cm))
set block(above: 15.6pt * factor - 0.7em, below: 15.6pt * factor - 0.7em)
//ç« èæ é¢è®Ÿçœ®
show heading: set text(font: title-font)
show heading: it =>{it + fake-par}
show heading.where(level:1) : it => {
pagebreak()
set text(size: 16pt)
align(center)[#it]
locate(loc=>{
let pagenum = int(loc.page())
headings.update(
headings => headings + ( (pagenum,it.numbering,it.body,counter(heading).at((loc) )), )
)
})
counter(figure.where(kind: table)).update(0)
counter(figure.where(kind: image)).update(0)
counter(math.equation).update(0)
}
show heading.where(level: 2) : set text(size: 14pt)
set math.equation(supplement: none)
show figure.where(kind: table): set figure.caption(position: top)
show figure.caption : set text(size: 10pt)
doc
} |
|
https://github.com/dark-flames/apollo-typst | https://raw.githubusercontent.com/dark-flames/apollo-typst/main/packages/typst-apollo/theme.typ | typst | Apache License 2.0 | #import "@preview/shiroa:0.1.0": target
// Theme (Colors)
#let theme-target = if target.contains("-") { target.split("-").at(1) } else { "light" }
#let theme-style = toml("theme-style.toml").at(theme-target)
#let is-dark-theme = theme-style.at("color-scheme") == "dark"
#let is-light-theme = not is-dark-theme
#let main-color = rgb(theme-style.at("main-color"))
#let dash-color = rgb(theme-style.at("dash-color"))
#let main-font = (
"Charter",
"Source Han Serif SC",
"Source Han Serif TC",
"Linux Libertine",
)
#let code-font = (
"BlexMono Nerd Font Mono",
"DejaVu Sans Mono",
)
#let code-theme-file = theme-style.at("code-theme")
#let code-extra-colors = if code-theme-file.len() > 0 {
let data = xml(theme-style.at("code-theme")).at(1)
let find-child(elem, tag) = {
elem.children.find(e => "tag" in e and e.tag == tag)
}
let find-kv(elem, key, tag) = {
let idx = elem.children.position(e => "tag" in e and e.tag == "key" and e.children.first() == key)
elem.children.slice(idx).find(e => "tag" in e and e.tag == tag)
}
let plist-dict = find-child(data, "dict")
let plist-array = find-child(plist-dict, "array")
let theme-setting = find-child(plist-array, "dict")
let theme-setting-items = find-kv(theme-setting, "settings", "dict")
let background-setting = find-kv(theme-setting-items, "background", "string")
let foreground-setting = find-kv(theme-setting-items, "foreground", "string")
(
bg: rgb(background-setting.children.first()),
fg: rgb(foreground-setting.children.first()),
)
} else {
(
bg: rgb(239, 241, 243),
fg: none,
)
} |
https://github.com/DashieTM/nix-introduction | https://raw.githubusercontent.com/DashieTM/nix-introduction/main/topics/nixos.typ | typst | #import "../utils.typ": *
#polylux-slide[
== NixOS
#v(15pt)
- a GNU/Linux distribution
- fundamentally different file system design
- nix store
- otherwise just like any penguin variant
- only configures and installs system wide programs
- use home-manager for user-based configuration
#pdfpc.speaker-note(```md
```)
]
|
|
https://github.com/ice1000/website | https://raw.githubusercontent.com/ice1000/website/main/dtt-dev/wip.typ | typst | #import "config.typ": *
#show: dtt.with(title: "WIP")
// Cartesian coproduct
#definition("Sum")[
We say a type theory has _sum type_ if it has the following constructions:
+ $ (Îâ¢A #h(2em) Îâ¢B)/(Π⢠A + B) $
The _formation rule_,
+ $ (Π⢠a:A)/(Π⢠inl(a) : A + B) \ (Π⢠b:B)/(Π⢠inr(b) : A + B)
$
The _introduction rules_,
+ $ (Π⢠s : A + B \ Î, x:A ⢠u : C #h(2em) Î, y:B ⢠v : C)/
(Π⢠elim_+(s, x. u, y. v) : C)
$
The _elimination rule_;
such that the following rules are derivable:
+ $Π⢠(A + B)[Ï] â¡ A[Ï] + B[Ï]$ the fact that sum is preserved by substitution action,
+ $ (Π⢠a:A)/(Π⢠elim_+(inl(a), x. u, y. v) ⡠u[a slash x] : C) \
(Π⢠b:B)/(Π⢠elim_+(inr(b), x. u, y. v) ⡠v[b slash y] : C)
$
The $β$-rules,
+ $ (Î, x:A+B ⢠u : C \
u_1 := u[inl(y) slash x] #h(2em)
u_2 := u[inr(y) slash x]
)/
(Î, x:A+B ⢠u â¡ elim_+(x, y. u_1, y. u_2) : C)
$
The $η$-law.
]
#definition("Raw extensional equality")[
Given $Π⢠a:A$ and $Π⢠b:A$.
A _raw extensional equality_ consists of the following data:
+ A type $뉢X$,
+ The equality reflection rule, namely
$ (Îâ¢p:X)/(Îâ¢aâ¡b:A) $
]
Then, $a=_A b$ is an instance of such a raw extensional equality,
which can be characterized as follows:
#definition("Extensional equality")[
The extensional equality $a=_A b$ is a raw extensional equality such that
for every other raw extensional equality $X$, there exists a _unique_ term,
called the _reflexivity principle_:
+ $ Π⢠h : a =_A b $
such that:
]
|
|
https://github.com/ckunte/resume | https://raw.githubusercontent.com/ckunte/resume/master/ckunte-resume.typ | typst | // <NAME>'s resume
#import "/inc/preamble.typ": resume
#show: doc => resume(doc)
// meta info
#let auth_name = "<NAME>"
#let auth_mail = "<EMAIL>"
#let res_title = auth_name + " - resumé"
#let start_year = 1995 // year of beginning my career
#let curr_year = int(datetime.today().display("[year]")) // current year in int.
#let tot_exp = calc.abs(curr_year - start_year) // total exp. in years
#let op_exp = calc.abs(curr_year - start_year - 16) // oper. exp. in years
//
#set document(
title: res_title,
author: auth_name,
)
#set page(header: context {
if counter(page).get().first() > 1 [
~
#h(1fr)
_ #auth_name _
]
})
// title + subtitle
#align(center, text(18pt)[
*#auth_name*
])
#align(center, [
#auth_mail
])
//
\
Offshore structures engineer with #tot_exp years of a proven track record in engineering, fabrication, and installation of fixed and floating offshore systems, and #op_exp years in supporting and troubleshooting for operating units./*Designated technical authority for fixed structures, and mooring systems.*/ Experienced in managing projects and engineering teams, inter-discipline coordination, cost control, and stakeholder management. Self-driven professional, striving for safety and quality in all undertaken activities without compromising schedule or cost. Strives to help deliver challenging projects and manage assets efficiently---both as a team leader and as an individual contributor. Generates value by employing competitive scoping, standardisation, automation where practicable, and by delivering via others.
= Education
/ 2021: Wind energy, Technical University of Denmark (DTU, DK) // https://www.coursera.org/account/accomplishments/certificate/Y9CRZSXUSTWB
/ 2017: Project management framework, Shell Academy, IN
/ 2016: P299 Layout course, Shell Academy, IN
/ 2013: Managing upstream business, Shell Academy, NL
/ 2012: Advanced analysis with USFOS, MY
/ 2010: Assessment of maritime (hull) structures, DNV, NL
/ 2008: Facilities integration and production engineering, Shell Academy, NL
/ 2008: Developing Exploration and Production business skills, Shell Academy, NL
/ 2007: Advanced knowledge management, Shell Academy, NL
/ 1994: M.Eng., Structural, Karnatak University, IN
= Professional accreditations
/ 2015: Associate member of the Royal Institution of Naval Architects
/ 2011: Member of American Society of Mechanical Engineers
= Honours
/ 2023: For successfully leading the delivery and assurance of the engineering design of the Crux substructure to completion, _by Shell Australia_.
/ 2021: Awarded Distinguished Talent Residency for contributions to Australian and international projects in the energy sector, _by Government of Australia_.
/ 2020: For developing a criteria for reliability for a manned fixed platform in HI field offshore West Africa, _by Shell Nigeria_.
/ 2019: Service Recognition Award for enabling the concept of a fixed offshore structure in 170m in challenging design conditions (calcareous soils) and prevailing geo-hazards, with robustness to pass Decision Gate 3, _by Shell Australia_.
/ 2018: For successfully leading the delivery and assurance of the engineering design, construction, and installation of the CALM buoy replacement, _by Brunei Shell Petroleum_.
/ 2017: For championing new technology, product development and maturation (viz., employing smoothed particle hydrodynamics in remote flare buoy design; flaring, station-keeping) in extreme low wind conditions, _by Qatar Shell_.
/ 2016: For developing a lean new emergency power generation host within PAA platform's highly congested layout, and designing it to withstand blast over-pressures, while minimising impact to the supporting box girder deck, saving 67% tonnage, and simplifying offshore construction activity, _by Sakhalin Energy_.
/ 2013: Service recognition award (drilling, conductor repair) for engineering and execution of novel and low cost conductor repairs at SFDPA platform, _by Shell Malaysia_.
/ 2012: For mitigating fabrication challenges in eight lift safety critical welding joints in E8K and F13K compression modules, 1,800t each, _by Shell Malaysia_.
/ 2011: For (a) the maturation of floating LNG's weather-vaning turret-mooring concept, and response-based mooring design for application in harsh cyclonic environment, (b) turret size optimisation, and (c) developing a qualification process for the largest polyester rope for station-keeping for ultra deep water applications, jointly by _Shell Projects and Technology_ and _Shell Australia_.
/ 2005: For developing and designing Kikeh topsides for the catamaran tow to mate with Asia's first deep water truss SPAR, _by Murphy Oil Malaysia_.
/ 2001: For engineering and execution of two lean fixed offshore platforms, Bintang A and B, _by ExxonMobil Malaysia_.
/ 2000: For evaluating and mitigating foundation reliability from well-blowout event at EWE platform, _by Unocal Thailand_.
= Positions
/ 2024-date: Principal engineer, Kent Australia
/ 2019-24: Substructure lead, Crux project, Shell Australia Pty Ltd
/ 2016-18: Principal technical expert, Shell India Markets Pvt Ltd
/ 2012-15: Team lead offshore structures, Shell Malaysia Exploration \& Production
/ 2006-11: Snr. research engineer, Shell Intl. Exploration \& Production, The Netherlands
/ 2000-06: Lead offshore structures engineer, Technip Malaysia
/ 1996-00: Offshore structures engineer, <NAME>ers India
/ 1995-96: Junior engineer, Gammon India
= Monograph
/ 2024: #link("https://github.com/ckunte/m-one/releases/")[m-one] --- a collection of notes and code from engineering offshore structures.
= Technical notes and papers
== Development
/ 2023: <NAME>, <NAME>, <NAME>, _Development of time load histories for cyclic pile loading for Crux platform_, Crux project library.
/ 2022: <NAME>, <NAME>, _Crux reliability accounting for wave-in-deck loading_, Crux project library.
/ 2022: <NAME>, _A review of changes in ISO 19902:2020 standard and their relevance to CRUX fixed steel offshore platform design_, Crux project library.
/ 2021: <NAME>, _Structural steel grade_, Crux project library.
/ 2021: <NAME>, _Crux Topsides: Review of Weights_, Crux project library.
/ 2021: <NAME>, _A review of helideck and its support structure weights_, Crux project library.
/ 2021: <NAME>, _Review and selection of offshore pedestal-mounted crane standard_, Crux project library.
/ 2020: <NAME>, <NAME>, _Offshore West Africa: $gamma_E$ and $R_m$_ for exposure levels, Shell P&T library.
/ 2020: <NAME>, _Jacket in-service performance assessment from proposed revisions to horizontal frame elevations_, Crux project library.
/ 2020: <NAME>, _Riser span assessment_, Crux project library.
/ 2020: <NAME>, _J-tube design assessment_, Crux project library.
/ 2020: <NAME>, _A review of tenderer-submitted information in support of steel mill capabilities in the context of requirements for offshore structural steel materials for Crux project_, Crux project library.
/ 2019: <NAME>, _Crux topsides elevation considerations_, Crux project library.
/ 2019: <NAME>, _Review and selection of offshore pedestal-mounted crane standard for the Crux project_, Crux project library.
/ 2018: <NAME>, <NAME>, <NAME>, _Determination of exposure level for the Crux fixed offshore platform_, Shell P&T library.
/ 2018: <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, _Feasibility for remote flare deployment offshore Qatar_, Shell P&T library.
/ 2017: <NAME>, <NAME>, <NAME>, Crux fixed platform -- _Substructure request for information_, Shell P&T library.
/ 2017: <NAME>, <NAME>, <NAME>, _Chinese GB steel feasibility for fixed platform concept in Crux field development_, Shell P&T library.
/ 2017: <NAME>, Crux fixed platform -- _Influence of wider topsides layout and potential future cantilever module on substructure design_, Shell P&T library.
/ 2016: <NAME>, <NAME>, <NAME>, et al., Crux fixed platform -- _Influence of soil sensitivity on platform foundation_, Shell P&T library.
/ 2014: <NAME>, <NAME>, <NAME>, _A minimum facility, low cost substructure concept for SF-LCD project_, Shell Malaysia library.
/ 2010: <NAME>, <NAME>, _FLNG Lean turret mooring feasibility_, Shell P&T library.
/ 2010: <NAME>, <NAME>, <NAME>, _Generic FLNG mooring and hybrid riser tower design -- Feasibility study for Tupi field_, Shell P&T library.
/ 2010: <NAME>, <NAME>, <NAME>, _Qualification of polyester mooring ropes_, Shell P&T library.
/ 2009: <NAME>, <NAME>, <NAME>, _Generic FLNG mooring and hybrid riser tower design -- Feasibility study for Carnarvon basin_, Shell P&T library.
/ 2009: <NAME>, <NAME>, _Generic FLNG mooring design feasibility for Carnarvon basin_, Shell P&T library.
/ 2009: <NAME>, <NAME>, _Shell FLSO Concept -- External turret mooring and riser analysis report_, Shell P&T library.
== Drilling support
/ 2017: <NAME>, Crux fixed platform -- _Temporary drilling deck_, Shell P&T library.
/ 2016: <NAME>, <NAME>, _Review of Crux fixed platform for jack-up access_, Shell P&T library.
/ 2016: <NAME>, <NAME>, Crux fixed platform -- _Preliminary structural feasibility of early TAD drilling over jacket substructure_, Shell P&T library.
/ 2014: <NAME>, _Feasbility of BTMPB platform for well intervention activities at BT-206_, Shell Malaysia library.
/ 2014: <NAME>, <NAME>, _Feasibility of SFDP-A platform for pre-drill and well intervention activities_, Shell Malaysia library.
== Installation
/ 2021: <NAME>, _Grouting_, Crux project library.
/ 2021: <NAME>, _Assessment of Buoyancy and Flotation Tanks for Jacket Installation_, Crux project library.
/ 2021: C Kunte, _Criteria for jacket on-bottom stability and storm safety_, Crux project library.
/ 2021: C Kunte, _Drop object impact assessment of drill pipe over drilling template_, Crux project library.
/ 2020: C Kunte, _Integrity assessment of piles during sea-transportation_, Crux project library.
/ 2019: <NAME>, _Drill caisson design_, Crux project library.
/ 2018: <NAME>, <NAME>, <NAME>, <NAME>, Crux fixed platform -- _Feasibility of jacket installation over pre-drilled wells subsea_, Shell P&T library.
/ 2017: <NAME>, Crux fixed platform -- _Structural integrity assessment of topsides during sea-transportation_, Shell P&T library.
/ 2015: <NAME>, <NAME>, _Barge motion responses in the South China Sea_, Shell Malaysia library.
== Asset integrity management
/ 2023: <NAME>, _A review of 2022 underwater inspection reports for strength and serviceability of MLJ1 and MLJ2 platforms offshore Brunei_, Shell P&T library.
/ 2016: <NAME>, <NAME>, _Loading platform pile integrity check during fender replacement at BLNG jetty_, Shell P&T library.
/ 2016: <NAME>, <NAME>, _Integrity assessment of breasting dolphin structures during air block fender change out stages, BLNG jetty_, Shell P&T library.
/ 2016: <NAME>, P Sarathi, _Cell fender feasibility for breasting dolphins, BLNG jetty_, Shell P&T library.
/ 2016: <NAME>, <NAME>, <NAME>, <NAME>, _Sakhalin LNG plant -- Structural adequacy of 125m and 60m flare derricks for increased wind speeds_, Shell P&T library.
/ 2016: <NAME>, <NAME>, _PAA offshore platform -- Review of the new proposed emergency power generation module structure for accidental exposure to blast over-pressures_, Shell P&T library.
/ 2014: <NAME>, <NAME>, _A review of SJQA platform's structural integrity and its feasibility as tie-in host_, Shell Malaysia library.
/ 2013: <NAME>, <NAME>, <NAME>, <NAME>, _North Sabah ADP -- Structural feasibility of platforms For the chemical delivery system_, Shell Malaysia library.
/ 2013: <NAME>, _Design corrosion allowances_, Shell Malaysia library.
= Software development
/ typst-snippets-st: Offers useful text-expanding snippets for the Sublime Text editor for authoring papers, notes, reports, and viewgraphs effortlessly in Typst. (#link("https://github.com/ckunte/typst-snippets-st")[Repository], MIT license, open source, freeware)
/ typst-snippets-vim: Offers useful text-expanding snippets for the Vim and Neovim text editors for authoring papers, notes, reports, and viewgraphs effortlessly in Typst. (#link("https://github.com/ckunte/typst-snippets-vim")[Repository], MIT license, open source, freeware)
/ csv2sacs: Python scripts to convert Metocean data into readily usable SACS seastate input file(s). (#link("https://github.com/ckunte/csv2sacs")[Repository], MIT license, open source, freeware)
/ latex-snippets-st: Offers useful text-expanding snippets for Sublime Text editor in authoring papers, notes, and reports effortlessly in LaTeX. (#link("https://github.com/ckunte/latex-snippets-st")[Repository], MIT license, open source, freeware)
/ latex-snippets-vim: Offers useful text-expanding snippets for Vim and Neovim text editors in authoring papers, notes, and reports effortlessly in LaTeX. (#link("https://github.com/ckunte/latex-snippets-vim")[Repository], MIT license, open source, freeware)
/ sacs_st: A cross-platform syntax highlighting plug-in for Sublime Text editor that colour codes various SACS input parameters to help break the monotony of text and improve readability of model input files. (#link("https://github.com/ckunte/sacs_st")[Repository], MIT license, open source, freeware)
/ usfos_st: A cross-platform syntax highlighting plug-in for Sublime Text editor that colour codes various USFOS input parameters to help break the monotony of text and improve readability of model input files. (#link("https://github.com/ckunte/usfos_st")[Repository], MIT license, open source, freeware)
/ chisel: A lean, static website generator publishing software written for python with many useful features. (#link("https://github.com/ckunte/chisel")[Repository], MIT license, open source, freeware)
= Training material
/ offshore-lifts: Educational presentation pack covering offshore lifts (#link("https://github.com/ckunte/offshore-lifts")[source])
/ structural-dynamics: Educational presentation pack covering historical underpinnings and introduction to structural dynamics (#link("https://github.com/ckunte/structural-dynamics")[source])
/ git-talk: Education presentation pack covering version control for engineers (using git) for model management (#link("https://github.com/ckunte/git-talk")[source])
#v(1fr)
#align(center, text(9pt)[
_Last updated: #datetime.today().display()_
])
|
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/for-00.typ | typst | Other | // Ref: true
// Empty array.
#for x in () [Nope]
// Dictionary is traversed in insertion order.
// Should output `Name: Typst. Age: 2.`.
#for (k, v) in (Name: "Typst", Age: 2) [
#k: #v.
]
// Block body.
// Should output `[1st, 2nd, 3rd, 4th]`.
#{
"["
for v in (1, 2, 3, 4) {
if v > 1 [, ]
[#v]
if v == 1 [st]
if v == 2 [nd]
if v == 3 [rd]
if v >= 4 [th]
}
"]"
}
// Content block body.
// Should output `2345`.
#for v in (1, 2, 3, 4, 5, 6, 7) [#if v >= 2 and v <= 5 { repr(v) }]
// Map captured arguments.
#let f1(..args) = args.pos().map(repr)
#let f2(..args) = args.named().pairs().map(p => repr(p.first()) + ": " + repr(p.last()))
#let f(..args) = (f1(..args) + f2(..args)).join(", ")
#f(1, a: 2)
|
https://github.com/giZoes/justsit-thesis-typst-template | https://raw.githubusercontent.com/giZoes/justsit-thesis-typst-template/main/resources/lib.typ | typst | MIT License | // å京倧åŠåŠäœè®ºææš¡æ¿ modern-nju-thesis
// Author: https://github.com/OrangeX4
// Repo: https://github.com/nju-lug/modern-nju-thesis
// åšçº¿æš¡æ¿å¯èœäžäŒæŽæ°åŸåŸåæ¶ïŒåŠæéèŠææ°çæ¬ïŒè¯·å
³æ³š Repo
#import "@preview/anti-matter:0.0.2": anti-inner-end as mainmatter-end
#import "layouts/doc.typ": doc
#import "layouts/preface.typ": preface
#import "layouts/mainmatter.typ": mainmatter
#import "layouts/appendix.typ": appendix
#import "pages/fonts-display-page.typ": fonts-display-page
#import "pages/bachelor-cover.typ": bachelor-cover
#import "pages/bachelor-title-page.typ": bachelor-titlepage
#import "pages/bachelor-decl-page.typ": bachelor-decl-page
#import "pages/bachelor-abstract.typ": bachelor-abstract
#import "pages/bachelor-abstract-en.typ": bachelor-abstract-en
#import "pages/bachelor-outline-page.typ": bachelor-outline-page
#import "pages/list-of-figures.typ": list-of-figures
#import "pages/list-of-tables.typ": list-of-tables
#import "pages/notation.typ": notation
#import "layouts/conclusion.typ": conclusion
#import "pages/acknowledgement.typ": acknowledgement
#import "utils/custom-cuti.typ": *
#import "utils/textcricled.typ": onum
#import "utils/bilingual-bibliography.typ": bilingual-bibliography
#import "utils/custom-numbering.typ": custom-numbering
#import "utils/custom-heading.typ": heading-display, active-heading, current-heading
#import "utils/indent.typ": indent, fake-par
#import "@preview/i-figured:0.2.4": show-figure, show-equation
#import "utils/style.typ": åäœ, åå·
// 䜿çšåœæ°éå
ç¹æ§ïŒéè¿ `documentclass` åœæ°ç±»è¿è¡å
šå±ä¿¡æ¯é
眮ïŒç¶åæŽé²åºæ¥æäºå
šå±é
眮çãå
·äœç `layouts` å `templates` å
éšåœæ°ã
#let documentclass(
doctype: "bachelor", // "bachelor" | "master" | "doctor" | "postdoc"ïŒææ¡£ç±»åïŒé»è®€äžºæ¬ç§ç bachelor
degree: "academic", // "academic" | "professional"ïŒåŠäœç±»åïŒé»è®€äžºåŠæ¯å academic
nl-cover: false, // TODO: æ¯åŠäœ¿çšåœå®¶åŸä¹ŠéŠå°é¢ïŒé»è®€å
³é
twoside: false, // å颿š¡åŒïŒäŒå å
¥ç©ºçœé¡µïŒäŸ¿äºæå°
// need-assignment: false,
anonymous: false, // ç²å®¡æš¡åŒ
bibliography: none, // 忥çåèæç®åœæ°
fonts: (:), // åäœïŒåºäŒ å
¥ãå®äœãããé»äœãããæ¥·äœããã仿å®ãããç宜ã
info: (:),
) = {
// é»è®€åæ°
fonts = åäœ + fonts
info = (
title: "åºäº Typst çå京倧åŠåŠäœè®ºæ",
title-en: "NJU Thesis Template for Typst",
grade: "20XX",
student-id: "1234567890",
author: "åŒ äž",
author-en: "<NAME>",
department: "æåŠé¢",
department-en: "XX Department",
major: "æäžäž",
major-en: "XX Major",
field: "ææ¹å",
field-en: "XX Field",
supervisor: ("æå", "ææ"),
supervisor-en: "<NAME>",
supervisor-ii: (),
supervisor-ii-en: "",
submit-date: datetime.today(),
) + info
(
// Re-export the passed-in parameters
doctype: doctype,
degree: degree,
nl-cover: nl-cover,
twoside: twoside,
anonymous: anonymous,
fonts: fonts,
info: info,
onum: onum,
// need-assignment: need-assignment,
// Page layouts
doc: (..args) => {
doc(
..args,
info: info + args.named().at("info", default: (:)),
)
},
preface: (..args) => {
preface(
twoside: twoside,
..args,
)
},
mainmatter: (..args) => {
mainmatter(
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
)
},
mainmatter-end: (..args) => {
mainmatter-end(
..args,
)
},
appendix: (..args) => {
appendix(
..args,
)
},
// Fonts display page
fonts-display-page: (..args) => {
fonts-display-page(
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
)
},
// Cover page: dispatch to different functions by type
cover: (..args) => {
bachelor-cover(
anonymous: anonymous,
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
info: info + args.named().at("info", default: (:)),
)
},
// Title page
title: (..args) => {
bachelor-titlepage(
anonymous: anonymous,
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
info: info + args.named().at("info", default: (:)),
)
},
// Declaration page: dispatch to different functions by type
decl-page: (..args) => {
bachelor-decl-page(
anonymous: anonymous,
twoside: twoside,
// need-assignment: need-assignment,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
info: info + args.named().at("info", default: (:)),
)
},
// Chinese abstract page: dispatch to different functions by type
abstract: (..args) => {
bachelor-abstract(
anonymous: anonymous,
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
info: info + args.named().at("info", default: (:)),
)
},
// English abstract page: dispatch to different functions by type
abstract-en: (..args) => {
bachelor-abstract-en(
anonymous: anonymous,
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
info: info + args.named().at("info", default: (:)),
)
},
// Table of contents page
outline-page: (..args) => {
bachelor-outline-page(
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
)
},
// List of figures page
list-of-figures: (..args) => {
list-of-figures(
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
)
},
// List of tables page
list-of-tables: (..args) => {
list-of-tables(
twoside: twoside,
..args,
fonts: fonts + args.named().at("fonts", default: (:)),
)
},
// 笊å·è¡šé¡µ
notation: (..args) => {
notation(
twoside: twoside,
..args,
)
},
// Bibliography page
bilingual-bibliography: (..args) => {
bilingual-bibliography(
bibliography: bibliography,
..args,
)
},
// Acknowledgement page
acknowledgement: (..args) => {
acknowledgement(
anonymous: anonymous,
twoside: twoside,
..args,
)
},
// Conclusion page
conclusion: (..args) => {
conclusion(
..args,
)
},
)
}
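// For orientation, a minimal entry file might use the returned closures roughly
// as follows. This is a sketch: the destructured names simply mirror the keys
// returned above, while the import path and the info values are placeholders.
//
//   #import "template.typ": documentclass
//
//   #let (doc, cover, outline-page, mainmatter) = documentclass(
//     doctype: "bachelor",
//     info: (title: "åºäº Typst çå京倧åŠåŠäœè®ºæ"),
//   )
//
//   #show: doc
//   #cover()
//   #outline-page()
//   #show: mainmatter
//
//   = 绪论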
|
https://github.com/xrarch/books | https://raw.githubusercontent.com/xrarch/books/main/documents/a4xmanual/chapbooting.typ | typst | #import "@preview/tablex:0.0.6": tablex, cellx, colspanx, rowspanx
#box([
= Booting
This section describes the boot protocol used by the *A4X* firmware. The old A3X boot protocol is also supported via an embedded A3X firmware which is chain-loaded when a legacy operating system is selected, but will not be documented here.
Note that all of the client-facing structures and services described here (in general, everything prefixed with `Fw`) can be found in the `Headers/a4xClient.hjk` header file, which should be included in order to access them from programs written in Jackal.
A partition is bootable if it contains a valid OS record at an offset of 1 sector from the partition base (bytes 512-1023 within the partition). The OS record sector has the following layout:
```
STRUCT AptOsRecord
// The 32-bit magic number must read 0x796D6173.
Magic : ULONG,
// A 15-character, null-terminated label for the installed
// operating system.
OsName : UBYTE[16],
// The sector offset within the partition, at which the bootstrap
// program begins.
BootstrapSector : ULONG,
// The count of sectors in the bootstrap program.
BootstrapCount : ULONG,
END
```
])
If a valid OS record is found, the partition is assumed to be bootable. In the following sector (sector 2), a 64x64 monochrome bitmap is located. This is used as an icon in the boot picker.
== The Bootstrap Program
When booting from a partition, the bootstrap sectors are loaded in sequence off the disk into physical memory beginning at address 0x3000. The first 32 bits of the bootstrap must be 0x676F646E in order to be considered valid. Control is then transferred to address 0x3004 through the Jackal function pointer with the following signature:
#box([
```
FNPTR FwBootstrapEntrypoint (
IN devicedatabase : ^FwDeviceDatabaseRecord,
IN apitable : ^FwApiTableRecord,
IN bootpartition : ^VOID,
IN args : ^UBYTE,
) : UWORD
```
])
That is, as per the Jackal ABI for XR/17032, a pointer to the *DeviceDatabase* is supplied in register `a0`, a pointer to the *ApiTable* is supplied in register `a1`, a handle to the boot partition is supplied in register `a2`, and an argument string is supplied in register `a3`. The bootstrap program can return a value in register `a3`.
Note that memory in the range of 0x0 through 0x2FFF should _not_ be modified while *A4X* services may still be called, as this region is used to store its runtime data (such as the initial stack). After this region is trashed, *A4X* may only be re-entered through a system reset (which can be accomplished by jumping to physical address 0xFFFE1000 with virtual addressing disabled).
== The Device Database
The *DeviceDatabase* is a simple structure constructed in low memory by the firmware. It contains information about all of the devices that were detected. It has the following layout:
```
STRUCT FwDeviceDatabaseRecord
// 32-bit count of the total RAM detected in the system.
TotalRamBytes : ULONG,
// The number of processors detected.
ProcessorCount : UBYTE,
// The number of bootable partitions found.
BootableCount : UBYTE,
Padding : UBYTE[2],
// A table of information about all of the RAM slots.
Ram : FwRamRecord[FW_RAM_MAX],
// A table of information about all of the physical disks.
Dks : FwDksInfoRecord[FW_DISK_MAX],
// A table of information about devices attached to the Amtsu
// peripheral bus.
Amtsu : FwAmtsuInfoRecord[FW_AMTSU_MAX],
// A table of information about the boards attached to the EBUS
// expansion slots.
Boards : FwBoardInfoRecord[FW_BOARD_MAX],
// A table of information about each processor detected in the
// system.
Processors : FwProcessorInfoRecord[FW_PROCESSOR_MAX],
// Information about the boot framebuffer, or lack thereof.
Framebuffer : FwFramebufferInfoRecord,
// Information about the boot keyboard, or lack thereof.
Keyboard : FwKeyboardInfoRecord,
// The machine type --
// XR_STATION, XR_MP, or XR_FRAME.
MachineType : FwMachineType,
END
```
Note that the `Headers/a4xClient.hjk` header file should be used to access the device database and other *A4X* structures - this incomplete information is only provided here for quick reference.
#box([
== The API Table
A pointer to an API table is passed to the bootstrap program. The API table consists of function pointers that can be called to receive services from the firmware. The currently defined APIs follow:
```
STRUCT FwApiTableRecord
PutCharacter : FwApiPutCharacterF,
GetCharacter : FwApiGetCharacterF,
ReadDisk : FwApiReadDiskF,
PutString : FwApiPutStringF,
KickProcessor : FwApiKickProcessorF,
END
```
])
=== PutCharacter
```FNPTR FwApiPutCharacterF (
IN byte : UWORD,
)```
Puts a single character to the firmware console.
=== GetCharacter
```FNPTR FwApiGetCharacterF () : UWORD```
Returns a single byte from the firmware console. This is non-blocking and returns -1 (0xFFFFFFFF) if no bytes are available.
=== ReadDisk
```
FNPTR FwApiReadDiskF (
IN partition : ^VOID,
IN buffer : ^VOID,
IN sector : ULONG,
IN count : ULONG,
) : UWORD
```
Reads a number of sectors from the specified partition handle into the buffer. The base address of the buffer _must_ be aligned to a sector size boundary. Returns *TRUE* (non-zero) if successful, *FALSE* (zero) otherwise.
=== PutString
```
FNPTR FwApiPutStringF (
IN str : ^UBYTE,
)
```
Puts a null-terminated string of bytes to the firmware console. Could be easily synthesized from *PutCharacter* but is provided for convenience to small boot sectors written in assembly language.
=== KickProcessor
```
FNPTR FwApiKickProcessorF (
IN number : UWORD,
IN context : ^VOID,
IN callback : FwKickProcessorCallbackF,
)
```
Causes the processor with the specified number to execute the provided callback. The callback routine is called with the opaque context value and with the number of the processor. Its signature follows:
```
FNPTR FwKickProcessorCallbackF (
IN number : UWORD,
IN context : ^VOID,
)
```
*KickProcessor* does not wait for the callback to be executed; execution continues on both processors asynchronously. If synchronization is required, it must be implemented manually.
If the processor with the given number (equivalent to its index in the *DeviceDatabase*) does not exist, the results are undefined.
Also note that the firmware does not contain multiprocessor synchronization, so if firmware services may be invoked by multiple processors concurrently, locking must be provided by the user.
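Since synchronization is left to the user, the simplest handshake is for the callback to set a shared flag that the kicking processor polls. The sketch below is illustrative C-style pseudocode, not Jackal, and all names are hypothetical:

```c
#include <stdint.h>

/* Hypothetical completion flag, polled by the processor that called
   KickProcessor. It must be reset before each kick. */
static volatile uint32_t CallbackDone = 0;

/* Shaped like FwKickProcessorCallbackF: receives the processor number
   and the opaque context value. */
void MyCallback(uintptr_t number, void *context) {
    /* ... per-processor work ... */
    *(volatile uint32_t *)context = 1; /* signal completion */
}

void KickAndWait(void) {
    CallbackDone = 0;
    /* ApiTable->KickProcessor(1, (void *)&CallbackDone, &MyCallback); */
    while (CallbackDone == 0) {
        /* spin until the recipient processor signals completion */
    }
}
```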
The size of the initial stack provided by the firmware for processors other than the boot processor is not explicitly defined here, but is quite small, so the depth of the call stack during execution of the callback must either be very small, or the recipient processor must switch quickly to a new stack. |
|
https://github.com/drupol/master-thesis | https://raw.githubusercontent.com/drupol/master-thesis/main/src/thesis/3-tools.typ | typst | Other | #import "imports/preamble.typ": *
#import "theme/template.typ": *
#import "theme/common/titlepage.typ": *
#import "theme/common/metadata.typ": *
#import "theme/disclaimer.typ": *
#import "theme/leftblank.typ": *
#import "theme/acknowledgement.typ": *
#import "theme/abstract.typ": *
#import "theme/infos.typ": *
#import "theme/definition.typ": *
#chapterquote(
title: "Software evaluation",
ref: "chapter3",
quoteAttribution: <clarke1973>,
quoteText: [
Any sufficiently advanced technology is indistinguishable from magic.
],
)
This chapter explores the pivotal role of tooling in achieving reproducibility
within #gls("SE"), highlighting the importance of environment consistency,
dependency management, and process isolation.
Reproducibility in #gls("SE") is not merely a desirable attribute but a
cornerstone of trustworthy, reliable, and verifiable software development
practices. As software systems grow increasingly complex and integral to every
facet of the modern world, from critical infrastructure to personal devices, the
stakes for ensuring their reproducibility have never been higher. This chapter
introduces and examines four distinct methods for building software, each with
its unique approach:
- Bare compilation
It is the most rudimentary method; it depends on the operating system's
compilers and libraries to build the software.
- Compilation with Docker
Using containerization technology, Docker encapsulates not just the software and its
dependencies but also the entire runtime environment.
- Compilation with Nix
Nix uses a unique store for packages built in isolation, each with a unique
identifier that includes dependencies, preventing conflicts and ensuring
reproducible environments.
- Compilation with Guix
Inspired by Nix, Guix offers a transactional package management system that
isolates dependencies to ensure consistent and reproducible software
environments through specific version-linked profiles.
The four methods chosen for detailed evaluation in the context of
reproducibility represent a wide range of approaches to managing software build
environments, each addressing different aspects of reproducibility. Bare
compilation was selected to provide a baseline, demonstrating the fundamental
challenges encountered without the aid of advanced tooling, such as
environmental inconsistencies and dependency conflicts. This method serves as a
contrast to the more sophisticated techniques that follow. Docker is included
for its widespread adoption and popularity, as well as its approach to
encapsulating the runtime environment, which significantly mitigates issues
arising from system variability. Guix and Nix are examined due to their unique
approach to dependency management and environment isolation, employing a package
management model that is based on the functional paradigm
(@def-functional-package-management) to ensure exact reproducibility of
environments across different systems. The chapter aims to cover a spectrum from
the most basic to the most advanced strategies.
#definition(
term: "Functional package management",
name: "def-functional-package-management",
)[
From @10-1007-978-3-319-27308-2_47, functional package management is a
discipline that transcribes the functional programming paradigm to software
deployment: build and installation processes are viewed as pure functions
(@def-pure-function) in the mathematical sense whose result depends
exclusively on the inputs (@def-inputs-outputs), and their result is a value
that is, an immutable directory.
]
This chapter aims to provide readers with an understanding of how these methods
contribute to the broader goal of reproducible #gls("SE"). Through a detailed
exploration of each approach, readers will gain insight into the strengths,
weaknesses, and applicability of Bare compilation, Docker, Guix and Nix in
various software development scenarios.
== Methodology
Our primary objective is to assess the reproducibility of a software build using
four different methods: Bare compilation, Docker, Guix, and Nix. By compiling a
C program (@datetime.c) with each tool, we can evaluate reproducibility both
over space and time (@reproducibility).
The study uses a quantitative research design, focusing on the comparison of
binary files generated from compiling identical source code with different
methods, on the same environment. This approach allows for an empirical
assessment of reproducibility challenges inherent to each compilation
tool and environment.
=== Evaluation Criteria
We will consider three primary criteria.
Firstly, *reproducibility in time* assesses whether the outputs of builds are
identical across repeated compilations in the same environment. This criterion
involves compiling the same source code twice with a few seconds of interval
between compilations. By comparing the outputs of these compilations, we can
determine if the build process produces consistent results over time.
Secondly, *reproducibility in space* focuses on the consistency of build outputs
across different environments. To evaluate this, the same source code is
compiled in various environments. This process helps to ensure that the software
build process is not dependent on specific environmental factors and can produce
identical outputs regardless of where it is compiled.
Lastly, the *reproducibility of the build environment* evaluates the stability
and consistency of the environment itself, including the dependencies required
for building the output. This criterion ensures that the environment, which
encompasses all necessary tools and libraries, remains stable and consistent
across different instances and setups.
=== Tools And Technologies
The evaluation of reproducibility tools in this study encompasses several
approaches to software compilation and package management, each with its unique
methodology.
In @ch3-tool1, the bare compilation method involves direct compilation on the
host system without the use of containerization or package management tools. This
approach relies on the default tools and libraries installed in the operating
system, providing a straightforward but less controlled environment for building
software. This method is assessed to understand the baseline reproducibility and
potential variability introduced by the host system's native environment.
In @ch3-tool2, Docker is used to provide a containerized environment for
software compilation. Using Docker containers ensures that the build process
occurs in a consistent and isolated environment, independent of the host
system's configuration. This method helps in evaluating how containerization can
enhance reproducibility by encapsulating all necessary dependencies and tools
within a controlled and replicable environment.
In @ch3-tool3, the Guix package ecosystem is employed to manage the software
build process. Guix focuses on providing a reproducible and declarative approach
to package management, ensuring that the build environment and dependencies are
precisely defined and versioned. This approach is examined for its ability to
maintain consistency and reproducibility across different systems and
environments by leveraging Guix's robust package management features.
In @ch3-tool4, the Nix package ecosystem is used to manage and build software.
Similar to Guix, Nix offers a declarative and reproducible package management
system, allowing for precise control over the build environment and
dependencies. The evaluation of Nix focuses on its capability to provide a
reproducible build environment that can be consistently replicated across
various systems, enhancing the reliability and stability of the software
development process.
=== Scenarios
Our examples and builds focus on custom-made scenarios to highlight the
differences in reproducibility across the four tools. There are multiple
scenarios being evaluated:
In the first scenario, using @ch3-tool1, a C program is built using the host
default C compiler. The second scenario involves @ch3-tool2, where a C program
is built in a Docker container utilizing the C compiler. The third scenario,
with @ch3-tool3, involves building a C program using Guix. Finally, there are
two scenarios for @ch3-tool4: one involves building a C program using Nix legacy
(not flake), and the other uses Nix flake to build the same program.
=== Compilation And Execution <ch3-compilation-execution>
A trivial C program (@datetime.c) has been chosen for its straightforwardness,
allowing the focus to remain on the build process and output rather than
software complexity.
Each method will compile the same C program (@datetime.c) twice. Detailed steps
for compilation and execution within each environment will be documented,
ensuring transparency and reproducibility of the process itself by the readers.
Each compilation's resulting output will be executed to verify functionality,
although the correctness of the execution's output will not be evaluated.
=== Environment Setup
To ensure the robustness and universality of our reproducibility assessment, all
test scenarios described in this chapter are executed through GitHub Actions
#cite(<ghActions>, form: "normal"). GitHub Actions is an automation platform
that enables #gls("CICD"), allowing builds to be performed, tested, and deployed
across various machines and architectures directly from GitHub repositories
#cite(<9978190>,form:"normal").
Our testing environments supports three distinct architectures:
- `x86_64-linux`: This represents the widely used Linux operating systems on
Intel and AMD processors. To ensure a thorough evaluation, two instances, each
running the different versions of Ubuntu (`20.04` and `22.04`), are employed.
- `x86_64-darwin`: Dedicated to supporting macOS on Intel processors.
- `aarch64-darwin`: Addressing the latest generation of macOS powered by the
ARM-based Apple Silicon processors.
This selection encompasses both `x86` and `ARM` architectures, as well as Linux
and macOS operating systems, providing a comprehensive view of reproducibility
across the most commonly used development platforms in #gls("SE"). The choice of
these architectures ensures the results are relevant to a broad spectrum of
development environments and application targets.
Each of our scenarios is streamlined through the use of a `Makefile`. A
`Makefile` as seen in @ch3-example-makefile is a text file that contains a set
of directives used by the GNU `make` #cite(form: "normal", <gnumake>) utility to
automate the build process of software projects. These directives contain
specific shell commands.
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../resources/sourcecode/example-makefile"),
)
},
caption: [An example of `Makefile`],
) <ch3-example-makefile>
Each scenario's `Makefile` contains only four essential steps:
- `clean`: Removes the build artefact of a build process, if any.
- `build`: Executes a build process, generating an output artefact.
- `check`: Prints the checksum of the build artefact.
- `run`: Execute the artefact
#info-box[
All source code and scenario files are available for reference under the
`lib/` directory of this master's thesis source code
#cite(<PolMasterThesis>, form:"normal"). Each scenario's directory contains
its own `Makefile`. These makefiles can be used to locally reproduce the
commands and results outlined in this document.
]
Our GitHub Actions workflows #cite(<r13yBuildScenarios>,form:"normal") use
these Makefiles, automating the execution of each scenario and ensuring
consistency and repeatability in the process. By doing so, they empower users to
locally reproduce the steps outlined in this document with full transparency.
This approach aligns with best practices in #gls("SE") for reproducibility, and
extends those principles to broader scientific research practices.
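A minimal job of this kind might look as follows; this is a sketch rather than
the repository's actual workflow, and the runner labels and paths are
assumptions:

```yaml
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-20.04, ubuntu-22.04, macos-13, macos-14]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      # Build the scenario twice and print both checksums for comparison.
      - run: make -C lib/scenario-1 build check clean build check
```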
=== Output Comparison
To compare the results, we will compare the checksums of the resulting outputs.
We exclusively use the `nix hash path` command provided by the Nix package
manager to compute the hash of a path.
#info-box(kind: "important")[
The `nix hash path` command is provided by Nix, a tool we will explore in this
chapter. Nix provides this command as part of its suite, but it can be applied
anywhere, not just to files within the Nix ecosystem. This command
distinguishes itself by its capacity to hash directories in addition to files.
An alternative to this approach could have been the use of a
#gls("SWHID") #cite(<hal-01865790>,form:"normal").
]
The `nix` command is available on systems with Nix installed. The difference
with a traditional `sha256sum` is that the former computes the hash of the path,
which includes the content and the metadata while the latter computes the hash
of the content only. Another advantage of using that command is its ability to
create a hash prefixed by the algorithm used, similar to #gls("SRI")
#cite(<sri>, form: "normal") hashes.
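For instance, hashing a hypothetical `./result` artefact yields an
#gls("SRI")-style digest; the value shown below is a placeholder:

```
$ nix hash path ./result
sha256-0cDN0RCtVTtLM2qYiCEBEWTz+QDG5Rx/DHFQxxRfm8s=
```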
=== Expected Outcomes
At the opposite of the previous more theoretical chapters, this practical
chapter aims to empirically compare the differences in reproducibility
achievable with Bare compilation, Docker, Guix, and Nix. Insights into the
challenges and benefits of each method will inform best practices in #gls("SE")
for achieving reproducible builds.
== Evaluation 1 - Bare compilation <ch3-tool1>
This method is the most rudimentary approach to software compilation, relying on
the host system's installed compilers and libraries to build software. This
build method correspond to Scenario 1, with the corresponding `Makefile` in
@ch3-makefile-scenario1, that can be executed on any system, with the commands:
`make build` to compile, `make check` to print the checksum, #raw("make run") to
run the compiled binary. As explained in @ch3-compilation-execution, we notice
that the steps are executed twice and in @ch3-tool1-build, the steps to build,
check and run the build are detailed.
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../lib/scenario-1/Makefile"),
)
},
caption: [`Makefile` of Scenario 1],
) <ch3-makefile-scenario1>
#figure(
{
shell(read("../../resources/sourcecode/scenario-1.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Terminal log of the steps to build, check and run Scenario 1],
) <ch3-tool1-build>
At lines 4 and 9 of @ch3-tool1-build, we notice that the `make check` step
prints two different checksums, indicating that the output of the two builds is
different at each run. As a result, this build is not reproducible. This
discrepancy in the output is likely caused by the dynamic replacement of the
`__DATE__` and `__TIME__` macros in the source code, which are replaced with the
current date and time at the moment of compilation.
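The chapter does not list @datetime.c itself, but a minimal program with the
same property might look like the following sketch; any binary embedding these
macros changes at every compilation:

```c
#include <stdio.h>

int main(void) {
    /* __DATE__ and __TIME__ expand at compile time, so every build
       embeds a different string and the resulting binaries differ. */
    printf("built on %s at %s\n", __DATE__, __TIME__);
    return 0;
}
```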
#heading(outlined: false, level: 3, "Reproducibility In Time")
This method involves directly compiling source code on a system with only the
essential compilers and libraries available on the host. This method's primary
advantage lies in its simplicity and direct control over the build process,
allowing for a clear understanding of dependencies and compilation steps.
However, it lacks isolation from the system environment, leading to potential
#emph[it works on my machine] issues due to variations in system configurations.
Additionally, the lack of encapsulation and dependency management can lead to
difficulties in achieving consistent and reproducible builds across different
environments. This method is therefore classified as non-reproducible in time.
#heading(outlined: false, level: 3, "Reproducibility In Space")
This method is not reproducible in time, therefore we will consider it as not
reproducible in space either. Technically it would be possible to reproduce the
same output in another environment, but it would be practically impossible to
run the build at exactly the same time. This method is therefore classified as
non-reproducible in space.
#heading(outlined: false, level: 3, "Reproducibility Of The Build Environment")
The virtual machines used on GitHub Actions are versioned. However, the software
installed on the images is not. From one build to another, we can get
a different version of `gcc` or any other software installed on the image.
Therefore, we have absolutely no control over the build environment and it is
very complicated to reproduce the same environment on another machine.
Therefore, reproducibility of the build environment is not guaranteed.
== Evaluation 2 - Docker <ch3-tool2>
Docker #cite(<docker>,form:"normal") has revolutionised software deployment by
encapsulating applications in containers, ensuring they run consistently across
any environment. Unlike traditional virtual machines, Docker containers are
lightweight, share the host's kernel, and bundle applications with their
dependencies, promoting the principle of #emph["build once, run anywhere"]. This
approach streamlines development, testing, and production workflows,
significantly reducing compatibility issues and, to some extent, simplifying
scalability.
Central to Docker's appeal is its contribution to the #gls("DevOps") movement,
fostering better collaboration between development and operations teams by
eliminating the #emph["it works on my machine"] problem. Docker's ecosystem,
including the Docker Hub #cite(<dockerhub>, form: "normal"), offers a vast
repository of container images, facilitating reuse and collaboration across the
software community.
Docker uses the #gls("OCI") standard for its container images, ensuring
interoperability across different containerization technologies, including
@podman and @kubernetes. The #gls("OCI") specification outlines a format for
container images and a runtime environment, aiming to create a standard that
supports portability and consistency across various platforms and cloud
environments.
Due to its popularity #cite(<9403875>, form: "normal", supplement: [p. 9]),
Docker is a key player in modern software development, enabling efficient,
consistent, and scalable applications through containerization, supporting agile
and #gls("DevOps") practices, and accelerating the transition from development
to production.
#figure(
sourcefile(
file: "Dockerfile",
lang: "dockerfile",
read("../../lib/scenario-2/Dockerfile"),
),
caption: [From Scenario 2, the `dockerfile` used by Docker],
) <ch3-dockerfile>
This method involves creating an #gls("OCI") image through a `Dockerfile` that
compiles @datetime.c and sets the compiled binary as the default command, as
shown in @ch3-dockerfile. This ensures that each time the image is executed, the
compiled executable runs within the container. However, instead of printing only
the checksum of the resulting binary, the `check` step also outputs the checksum
of the image.
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../lib/scenario-2/Makefile"),
)
},
caption: [`Makefile` of Scenario 2],
) <ch3-makefile-scenario2>
#figure(
shell(read("../../resources/sourcecode/scenario-2.log")),
supplement: "Terminal session",
kind: "terminal",
caption: [Terminal log of the steps to build, check and run Scenario 2],
) <ch3-docker-build>
#heading(outlined: false, level: 3, "Reproducibility In Time")
In @ch3-docker-build, it is observed on lines 5 and 12 that building the image
twice and extracting the resulting binary produces different checksums.
Additionally, on lines 6 and 13, it is evident that the checksums of the images
are inevitably different. Consequently, this method is classified as
non-reproducible over time.
#heading(outlined: false, level: 3, "Reproducibility In Space")
This scenario was executed on various machines and architectures, resulting in
different binaries and images. Therefore, this method is classified as
non-reproducible in space as well.
#heading(
outlined: false,
level: 3,
"Reproducibility Of The Build Environment",
) <ch3-docker-build-env>
The reproducibility of build environments in Docker images, while generally
reliable in the short term, can face challenges over time. Docker images are
built on layers, often starting from base images provided by specific vendors.
These base images can receive updates that alter their contents, meaning a
`Dockerfile` that successfully built an image at one time might not produce an
identical image later due to changes in its base layers
#cite(<9403875>, form: "normal", supplement: [p. 1]). Additionally, not
pinning specific versions of base images and external dependencies in the
`Dockerfile` can lead to inconsistencies, making the exact reproduction of a
Docker environment challenging if not managed carefully. Therefore, while Docker
simplifies the consistency of deployment environments, ensuring long-term exact
reproducibility requires careful management of image sources and dependencies.
Docker is intrinsically designed to facilitate reproducible builds, with the
capability to generate identical containers across multiple executions. However,
the challenge to reproducibility arises not from Docker's fundamental features
but from the use of specific base images within Docker containers. A significant
illustration of this problem is shown in @ch3-docker-build, where rebuilding the
image results in different containers even though the base image version has
been pinned to a specific digest at lines 1 and 7.
#info-box(kind: "info")[
"Pinning" refers to the practice of specifying exact versions of software,
base images, or dependencies to use when building a Docker container. This
practice helps ensure that the build environment remains consistent and
predictable over time, despite updates or changes to those dependencies.
Pinning is crucial for maintaining consistency as it prevents the build
environment from changing unexpectedly due to updates in dependencies. It also
enhances reproducibility, allowing developers to recreate the same environment
at a later date, which is vital for debugging and development. Moreover, it
enhances reliability by reducing the likelihood of encountering unexpected
issues or conflicts caused by differing versions of dependencies.
For example, specifying `FROM alpine:3.19.1` in a `Dockerfile` instead of
`FROM alpine` ensures that Alpine image at version 3.19.1 is always used,
providing stability. Additionally, to minimize the risk of variation, the
`build-base` package used in the `Dockerfile` (@ch3-dockerfile) is pinned to
version `0.5-r3`. This mechanism applies similarly across different
programming language ecosystems. However, it is important to note that version
tags, like `3.19.1` or `0.5-r3`, can be replaced or updated by the
maintainers, without users' awareness, potentially altering the contents of a
"pinned" version and impacting reproducibility.
To mitigate this issue, using digests can ensure images are anchored to a
specific snapshot, offering a stronger guarantee of immutability. For
instance, specifying `FROM alpine@sha256:c5b1261d6d3e43071626931fc004f70149baeba2c8ec672bd4f27761f8e1ad6b`,
as shown in @ch3-dockerfile, ensures that the exact same image is used
consistently, regardless of any upstream updates. While using a digest to pin
the base image ensures immutability, the `apk` package manager does not
support a similar mechanism, only tags are supported. It's important to be
aware of the limitations of the tools #eg[the `apk` package manager] used
in the base image, as even with precautions, variability in the build process
may still be introduced.
]
Docker's containerization technology offers a way to create consistent software
environments across various systems by encapsulating both the software and its
dependencies within containers. This encapsulation aids in ensuring a uniform
deployment process. However, the approach's reliance on base images and the
package managers they use brings forth challenges in maintaining
reproducibility. This is primarily because base images might not be strictly
version-controlled, and the package managers used within these images can result
in the installation of varying dependency versions over time.
For example, traditional package managers like `apt` (used in Debian-based
#glspl("OS")) or `yum` (used in RedHat-based #glspl("OS")) do not
inherently guarantee the installation of the exact same version of a software
package across space and time. Typically, this variability stems from updates in
the package repositories, where an `apt-get install` command might fetch a newer
version of a library than was originally used. Such updates could potentially
introduce unexpected behaviour or incompatibilities.
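To illustrate, `apt` does allow requesting an exact package version, which
mitigates, without eliminating, this drift; the image tag and version string
below are illustrative only:

```dockerfile
FROM debian:bookworm-20240211
# Pin an exact compiler version; apt fails if it is no longer in the repository.
RUN apt-get update && apt-get install -y gcc-12=12.2.0-14
```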
Docker and similar containerization technologies act as sophisticated
assemblers, piecing together the diverse components required to create a
container. This process, while streamlined and efficient, is not immune to the
introduction of variability at any stage of the assembly line. Whether due to
updates in base images, fluctuations in package versions, or differences in
underlying infrastructure, these variables can compromise the reproducibility of
the resulting container (@def-deterministic-build). Recognising this, it becomes
crucial for developers and researchers to approach container creation with a
keen awareness of these potential pitfalls. By meticulously managing base
images, employing reliable package managers, and adhering to best practices in
`Dockerfile` construction, one can mitigate the risks of variability and move
closer to achieving true reproducibility in containerised environments.
== Evaluation 3 - Guix <ch3-tool3>
@guixwebsite is an advanced package manager, designed to provide reproducible,
user-controlled, and transparent package management. It leverages functional
programming concepts to ensure reproducibility and reliability, using the GNU
Guile #cite(<guile>, form:"normal") programming language for its core daemon,
package definitions and system configurations #cite(<courtes2013functional>,form:"normal").
Central to Guix's philosophy is the concept of reproducible builds and
environments. This ensures that software can be built in a deterministic manner,
enabling exact replication of software environments at any point in space and
time. Guix achieves this by capturing all dependencies, including the toolchain
and libraries, in a way that they can be precisely recreated. It supports
transactional package upgrades and rollbacks, making system modifications
risk-free by allowing users to revert to previous states easily.
Guix uses GNU Guile #cite(<guile>,form:"normal"), a Scheme
#cite(<scheme>,form:"normal") implementation, allowing for more expressive and
programmable package definitions. This choice reflects Guix's emphasis on
customization and alignment with the @fsfwebsite project's philosophy, rejecting
proprietary blobs and
aiming for complete software freedom, which may limit hardware compatibility but
enhance long-term reproducibility #cite(<9403875>,form:"normal"). Nonetheless,
users have the liberty to extend Guix with custom packages, whether free or not,
without compromising the tool's reproducibility capabilities. In case of
disappearing upstream sources, Guix can leverage Software Heritage
#cite(<swh>, form:"normal") to retrieve source code, ensuring long-term
accessibility even if the original source disappears. While Guix's reliance on a
general-purpose functional programming language may present a steep learning
curve, it offers extensive flexibility for those proficient in Lisp-like
languages.
#info-box(kind: "note", ref: "info-box-proprietary-software")[
Proprietary software does not expose its source code to the public, which may
seem counter-intuitive to the principles of reproducibility. Proprietary
software "typically cannot be distributed, inspected, or modified by others.
It is, thus, reliant on a single supplier and prone to proprietary
obsolescence" #cite(<9403875>, form: "normal", supplement: [p. 3]).
Ensuring the reproducibility of such software is challenging, as users lack
access to the build process and the software's lifespan is often limited due
to its proprietary nature. Pre-built binaries will work only as long as there
are no breaking changes in dependencies like the GNU C library, making their
reproducibility capabilities time-limited.
Being aware of the broader implications of using proprietary software is
crucial but does not necessarily compromise reproducibility at short term.
However, relying on proprietary software for long-term reproducibility is
risky due to the lack of transparency and control over the software's
evolution.
]
Guix is committed to ensuring reproducibility and reliability, based on the
functional deployment model first introduced by @Dolstra2006. It assures
reproducible builds by treating software environments as immutable entities,
thereby minimising variability across different systems. Guix's approach to
software building and package management, grounded in the principles of
functional programming and transactional package upgrades, places a strong
emphasis on reproducibility. However, this functional paradigm
(@def-functional-package-management) introduces a learning curve and
necessitates a shift from traditional imperative package management methods.
Additionally, the adoption of Guix might be further complicated by the absence
of non-free software availability, marking a significant consideration for teams
considering Guix.
#figure(
{
sourcefile(
file: "guix.scm",
lang: "Lisp",
read("../../lib/scenario-3/guix.scm"),
)
},
caption: [From Scenario 3, the Guix build file (`guix.scm`)],
) <ch3-default-guix>
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../lib/scenario-3/Makefile"),
)
},
caption: [`Makefile` of Scenario 3],
) <ch3-makefile-scenario3>
#figure(
{
shell(read("../../resources/sourcecode/scenario-3.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Building the C sourcecode from the Guix build file of Scenario 3],
) <ch3-guix-build>
#heading(outlined: false, level: 3, "Reproducibility In time")
In @ch3-guix-build, we notice on lines 5 and 11 that the output hashes are the
same. This is therefore classified as reproducible in time.
#heading(outlined: false, level: 3, "Reproducibility In Space")
Building the program in a different environment with the same architecture
(`x86_64-linux`) resulted in identical output. Compiling the source code on
another architecture (`aarch64-darwin`) also produced consistent results, though
different from those obtained on `x86_64-linux`. Therefore, we can conclude that
the program is reproducible across different environments, #emph[modulo] the
hardware architecture.
#heading(outlined: false, level: 3, "Reproducibility Of The Build Environment")
The reproducibility of the build environment is heavily controlled when using
Guix. The dependencies are locked and pinned; it is simply not possible to
create a different build environment.
== Evaluation 4 - Nix <ch3-tool4>
@nix is a revolutionary package management system that dramatically reshapes the
landscape of software construction, consumption, deployment and management. Its
distinctive methodology, grounded in the principles introduced in @Dolstra2006,
marked its inception, setting a new standard for handling software packages.
At Nix's core is its use of the Nix language, a domain-specific
Turing-complete language that facilitates the description of software packages,
their dependencies, and the environments in which they operate.
#info-box(ref: "def-turing-complete")[
The term "Turing-complete" is named after the British mathematician and
logician Alan Turing, who introduced the concept of a Turing machine as a
fundamental model of computation. A Turing-complete language is a programming
language that can simulate a Turing machine, a theoretical device that can
solve any computation that can be described algorithmically. Turing
completeness is a fundamental property of any programming language that can
perform any computation that a Turing machine can, given enough time and
memory. This property allows a language to express any algorithm or
computation, making it a powerful tool for software development. Examples of
Turing-complete languages include: Python, PHP, C++ and JavaScript. On the
other hand, non-Turing-complete languages, which are limited in their
computational capabilities, include: SQL, Regex and HTML.
]
This language enables Nix to implement a functional deployment model, ensuring
reproducibility, reliability, and portability across different systems by
treating packages as functions of their inputs, which results in deterministic
builds.
Nix emphasises a deterministic build environment, allowing developers to specify
and isolate dependencies explicitly. This method significantly mitigates
#emph["it works on my machine"] issues by providing a high degree of control over
the build environment. Nix's strength in ensuring reproducibility comes with the
need to embrace its unique approach to system configuration and package
management, representing a paradigm shift for new users.
#info-box(kind: "conclusion")[
Nix essentially modifies the #gls("POSIX", long: false) standard by installing
software in unique locations rather than following the shared file structure
described by the #gls("FHS"). This seemingly minor change brings about several
advantageous properties, such as software composition, immutability,
configuration rollback, caching and reproducibility.
]
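For instance, a package built by Nix lives under a store path whose leading
component is a hash derived from all of its inputs (the path below is a
placeholder):

```
/nix/store/caw6mzbm5qg7cbcmbi298zqkya76b33x-hello-2.12.1/bin/hello
```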
Nix provides two principal methodologies that are not mutually exclusive: the
legacy method (\u{00B1}2006) and the relatively newer #emph[Flake]
(\u{00B1}2020) approaches.
=== Nix legacy method
The legacy way of using Nix involves defining a `default.nix` file that is
similar to a function definition in the Nix programming language. This file
contains a set of inputs, specifies dependencies, the build command and its
output. By default, this method does not enable pure evaluation mode, meaning
the hermeticity of the build process is not guaranteed. As a result, potential
uncontrolled side effects may occur during the build process. For instance, as
demonstrated in @ch3-default-nix at line 2, we manually enforce a very specific
version of the `pkgs` variable, a specific snapshot of the Nix package
repository that fixes the versions of all packages and libraries. Similarly to
the process outlined in @ch3-docker-build-env for Docker, this approach, known
as #emph[dependency pinning], ensures consistency and reproducibility in the
build environment.
#figure(
{
set text(size: .85em)
sourcefile(
file: "default.nix",
lang: "nix",
read("../../lib/scenario-4/default.nix"),
)
},
caption: [The Nix build file (`default.nix`) from Scenario 4],
) <ch3-default-nix>
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../lib/scenario-4/Makefile"),
)
},
caption: [`Makefile` of Scenario 4],
) <ch3-makefile-scenario4>
#figure(
{
shell(read("../../resources/sourcecode/scenario-4.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Building the C sourcecode with Nix in Scenario 4],
) <ch3-default-nix-build>
=== Nix Flake
Nix #emph[Flake] introduces a structured approach to managing Nix projects,
focusing on reproducibility and ease of use. Currently in an experimental phase,
Flake is anticipated to transition to a stable feature soon due to increasing
community endorsement (@ch3-flake-vs-legacy) and the tangible reproducibility
advantages it offers.
#figure(
image("../../resources/images/flake-vs-legacy.jpg"),
caption: [On the left, new repositories containing a `flake.nix` file, and on
the right, containing a `default.nix` file
(#link("https://x.com/DeterminateSys/status/1794394407266910626")[Determinate System])
],
) <ch3-flake-vs-legacy>
Flakes aim to simplify and enhance the Nix experience by providing an immutable,
version-controlled way to manage packages, resulting in significant improvements
in reproducibility and build isolation. Flakes manage project dependencies
through a single, top-level `flake.lock` file, which is automatically generated
to precisely pin the versions of all dependencies, including transitive ones, as
specified in the `flake.nix` file. This file ensures project consistency and
reproducibility across different environments.
In addition to altering the Nix command-line syntax, Flakes enforce a specific
structure and entry point for Nix expressions, standardising project setup and
evaluation. They enable pure evaluation mode by default, which enhances the
purity and isolation of evaluations, making builds more consistent and reducing
side effects. For instance, making external requests during a build is not
possible with Flakes, ensuring that every dependency must be explicitly
declared. Flakes require changes to be tracked through `git`, enabling the exact
revision of the project to be pinned in the `flake.lock` file.
The files `flake.nix` and `flake.lock` are complementary and central to the
locking mechanism that ensures reproducibility. Together, when committed in a
project, they guarantee that every user of a Flake, regardless of when they
build or deploy the project, will use the exact same versions of dependencies,
thereby ensuring that the project is built consistently every time. However, it
is possible to have only a `flake.nix` file without a `flake.lock` file. In
such cases, having a reproducible build environment is not guaranteed since
dependencies could drift to newer versions.
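For illustration, each entry of a `flake.lock` pins an input to an exact
revision and content hash, roughly as in the following fragment, in which all
values are placeholders:

```json
{
  "nodes": {
    "nixpkgs": {
      "locked": {
        "type": "github",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "0000000000000000000000000000000000000000",
        "narHash": "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
      }
    }
  },
  "version": 7
}
```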
#figure(
{
sourcefile(
file: "flake.nix",
lang: "nix",
read("../../lib/scenario-5/flake.nix"),
)
},
caption: [The Nix Flake file (`flake.nix`) from Scenario 5],
) <ch3-flake-nix>
#figure(
{
sourcefile(
file: "Makefile",
lang: "Makefile",
read("../../lib/scenario-5/Makefile"),
)
},
caption: [`Makefile` of Scenario 5],
) <ch3-makefile-scenario5>
#figure(
{
shell(read("../../resources/sourcecode/scenario-5.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Building the C sourcecode with Nix flake in Scenario 5],
) <ch3-nix-flake-build>
#heading(outlined: false, level: 3, "Reproducibility In Time")
In @ch3-default-nix-build, we notice on lines 5 and 11 that building the source
code twice using Nix's legacy method produces the same output. In
@ch3-nix-flake-build, on lines 4 and 9, we notice the same thing.
This is therefore classified as reproducible in time.
#heading(outlined: false, level: 3, "Reproducibility In Space")
Just like Guix, building the program in a different environment with the same
architecture (`x86_64-linux`) resulted in identical output. Compiling the source
code on another architecture (`aarch64-darwin`) also produced consistent
results, though different from those obtained on `x86_64-linux`. Therefore, we
can conclude that the program is reproducible across different environments,
#emph[modulo] the hardware architecture.
#heading(outlined: false, level: 3, "Reproducibility Of The Build Environment")
The reproducibility of the build environment is heavily controlled. The
dependencies are locked and pinned; it is simply not possible to
create a different build environment.
=== Dealing With Variability
This section will focus on how Nix deals with unstable outputs, highlighting how
they have abstracted this issue behind the scenes. The scenarios that will be
used are:
- Scenario 6: Building an #gls("OCI") image with Nix
- Scenario 7: Compiling a Typst document to a PDF file
- Scenario 8: Compiling a Typst document to a PDF file with Nix, showing how Nix
abstracts the issue of non-deterministic builds.
- Scenario 9: Compiling a Typst document with Nix, fixing the issue of
non-deterministic builds.
#info-box[
Typst #cite(<typst>, form: "normal") is an advanced markup-based typesetting
language that compiles to #gls("PDF") or #gls("SVG"). It was initiated in 2019
at the Technical University of Berlin by <NAME> and <NAME>.
Developed in Rust, this programmable markup language for typesetting became
the subject of their master's theses, which they wrote in 2022. After several
years of closed-source development, Typst was open-sourced and released to the
public in 2023. Despite being relatively recent and lacking a stable version,
Typst's maturity has allowed it to be used for writing this master's thesis.
]
Building #gls("OCI") images using Docker is a common use case in the software
development process. However, the output of the build can be non-deterministic
due to the nature of the build process. In scenario 6, we will build an
#gls("OCI") image using Nix only.
#figure(
{
sourcefile(
file: "flake.nix",
lang: "nix",
read("../../lib/scenario-6/flake.nix"),
)
},
caption: [
The Nix Flake file (`flake.nix`) to build an OCI image in Scenario 6
],
) <ch3-flake-nix-container>
#figure(
{
shell(read("../../resources/sourcecode/scenario-6.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Building an #gls("OCI") image with Nix],
) <ch3-nix-flake-container-build>
In @ch3-nix-flake-container-build, lines 5 and 11, we notice that building
an #gls("OCI") image twice using Nix produces the same output. The Flake file in
@ch3-flake-nix-container shows that it is possible to create reproducible
#gls("OCI") containers with Nix, in a simple and declarative way.
In scenario 7, we will compile a trivial Typst document.
Consider the following Typst document on the left, and its rendering on the
right:
#grid(
columns: 2,
rows: 1,
column-gutter: 1em,
align: bottom,
figure(
{
sourcefile(
file: "hello-world.typst",
lang: "typst",
read("../../lib/scenario-7/src/hello-world.typst"),
)
},
caption: [Typst document],
),
[
#figure(
box(stroke: .6pt + luma(200), radius: 3pt)[
#image("../../resources/images/hello-world.svg")
],
caption: [Rendering of the Typst document],
) <typst-hello-world-rendered>],
)
@ch3-hello-world-typst-build-log shows that manually compiling the same document
twice yields different resulting files.
#figure(
{
shell(read("../../resources/sourcecode/scenario-7.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Manually compiling a Typst document to a #gls("PDF") document in Scenario 7],
) <ch3-hello-world-typst-build-log>
While viewing the resulting #gls("PDF") files side by side, we notice that they
appear totally identical to @typst-hello-world-rendered. However, the checksums
of those files are different. This discrepancy is common, where the same input
can produce different outputs due to non-deterministic behaviour in the build
process. Even if the resulting outputs are identical, there can be internal
differences. Therefore, given an arbitrary build output, it is impossible to
determine if a build is valid or not. It is important to acknowledge that tools
like Guix or Nix address this issue by ensuring that only the build environment
is consistent and reproducible. In @ch3-nix-typst-flake, we will show how to
compile the same Typst document using Nix and how to eventually fix the
discrepancy.
#figure(
{
sourcefile(
file: "flake.nix",
lang: "nix",
read("../../lib/scenario-8/flake.nix"),
)
},
caption: [
The Nix `flake.nix` file to build a Typst document to a PDF in Scenario 8
],
) <ch3-nix-typst-flake>
Compile it twice and observe the outcome:
#figure(
{
shell(read("../../resources/sourcecode/scenario-8.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Building a Typst document in Scenario 8],
) <ch3-hello-world-typst-build>
At lines 4 and 7 of @ch3-hello-world-typst-build, we notice that compiling a
Typst document twice with Nix produces two different #gls("PDF") files: their
respective checksums differ. While the visual output appears identical,
the underlying files are not. At line 3 of @ch3-hello-world-typst-rebuild, we
leverage a command with specific flags to verify if a build output is
reproducible.
#figure(
{
shell(read("../../resources/sourcecode/scenario-8-rebuild.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Checking if a build output is reproducible],
) <ch3-hello-world-typst-rebuild>
Nix will build the document once (line 2), then a second time (line 3) and then
compare the output hashes. Thanks to the `--keep-failed` argument, we inform Nix
to keep the failed builds so we can do a more introspective analysis of the
issue and try to find the root cause of the discrepancy, for example, using
`diffoscope` #cite(<diffoscope>, form: "normal") in
@ch3-hello-world-typst-rebuild-diffoscope.
#figure(
{
shell(read("../../resources/sourcecode/scenario-8-diffoscope.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Checking discrepancies between two builds using `diffoscope`],
) <ch3-hello-world-typst-rebuild-diffoscope>
#figure(
image("../../resources/images/diffoscope-typst.svg"),
caption: [
A visual comparison with `diffoscope` of two #gls("PDF") files generated
from the same Typst document
],
) <ch3-nix-typst-diff>
`diffoscope` visually compares the discrepancy between the two #gls("PDF")
files. From the report in @ch3-nix-typst-diff, the highlighted difference seems
to be the creation date metadata. Doing a quick search on @typstdoc confirms
that Typst is able to change the creation date of the output file.
@ch3-nix-typst-flake-fixed implements the trivial change at line 1:
#figure(
{
sourcefile(
file: "hello-world.typst",
lang: "typst",
read("../../lib/scenario-9/src/hello-world.typst"),
)
},
caption: [On line 1, the Typst document date is now set to `none`],
) <ch3-nix-typst-flake-fixed>
#figure(
{
shell(read("../../resources/sourcecode/scenario-9-rebuild.log"))
},
supplement: "Terminal session",
kind: "terminal",
caption: [Checking if compiled Typst document is reproducible in Scenario 9],
) <ch3-hello-world-typst-fixed-log>
Now we notice that running the command to check if the output is reproducible
returns nothing, meaning that the output is fully reproducible.
#info-box[
Often, raising an issue with the upstream project is the most effective method
for informing the authors about a problem and monitoring its resolution. In
the case of Typst, an issue
#cite(form: "normal", <typstReproducibleBuildIssue1>) was documented to
describe the problem, and in less than two weeks, it had been addressed and
resolved. Consequently, the discrepancy in @ch3-hello-world-typst-build is no
longer applicable for Typst versions newer than `0.11.0`.
]
== Conclusion
In this concluding section of the chapter, a summary of the reproducibility
assessment can be found in @ch3-table-conclusion. Following the table, this
section provides a detailed explanation of our categorization process, outlining
the specific criteria used for classification. Each classification is justified
based on the results obtained from our comprehensive empirical evaluation
process.
#figure(
include "../../resources/typst/ch3-table-conclusion.typ",
caption: [Software evaluation comparison],
kind: "table",
supplement: [Table],
) <ch3-table-conclusion>
In evaluating the reproducibility of various tools and methodologies within, a
particular focus has been set on the bare compilation method (@ch3-tool1). This
approach, characterised by its reliance on the host operating system's installed
software for compiling source code into executable programs, presents a nuanced
challenge to reproducibility. Theoretically, bare compilation allows for a
straightforward reproduction of computational results, assuming a static and
uniform environment across different computational setups. However, the
practical application of this method exposes inherent vulnerabilities to
environmental variability. The reliance on the host's installed software means
that the exact version of compilers, libraries, and other dependencies can
significantly impact the outcome of the compilation process. These elements are
seldom identical across different systems or even over time on the same system,
given updates and changes to the software environment. Consequently, the
reproducibility promised by Bare compilation is compromised by these external
variables, which are often not documented with sufficient rigor or are outside
the user's control. Acknowledging these challenges, we categorise the bare
compilation (@ch3-tool1) as non-reproducible by default, reflecting a practical
assessment rather than a theoretical limitation. The classification underscores
the significant effort required to document and manage the dependencies on the
host's software to achieve a reproducible build process. This perspective is
supported by the literature #cite(<Schwab2000>, form: "normal"), which advocates
for standardising and simplifying the management of computational research
artefacts. The classification of method 1 (@ch3-tool1) as *non-reproducible*
is a pragmatic acknowledgment of the difficulties presented by the dependency on
the computational environment.
Docker and similar containerization technologies (@ch3-tool2) can facilitate
reproducible environments. The reason is that while they provide a high degree
of isolation from the host system, they are still subject to variability due to
the base images and package managers used within the containers. This
variability, however, can be effectively managed with low effort. By
meticulously selecting and managing base images and dependencies, it is indeed
feasible to elevate Docker from partially to fully reproducible. For these
reasons, they are categorised as *partially reproducible*.
Guix (@ch3-tool3) and Nix (@ch3-tool4) provide a high level of control over
the build environment and dependencies, facilitating deterministic and
reproducible builds across different systems. By capturing all dependencies and
environment specifics in a declarative manner, Nix and Guix offer a reliable and
transparent approach to software development. The functional deployment model
implemented by Guix, Nix and their forks (like @lix), along with their
transactional package upgrades and rollbacks, further enhances reproducibility
by enabling exact replication of software environments within the same
architecture at any point in space and time.Under the hood, they introduces a
novel approach to addressing the challenges of reproducibility. By using a very
specific storage model, they ensures that the resulting output directory is
determined by the hash of all inputs. This model, while not guaranteeing bitwise
identical binaries across all scenarios, especially across different hardware
architectures, ensures that the process and environment for building the
software are reproducible. Nix and Guix's model represents a significant step
forward in mitigating reproducibility challenges within #gls("SE"). By ensuring
that every build can be traced back to its exact dependencies and build
environment, it enhances the reliability of software deployments. This approach
is particularly beneficial in #gls("CICD") pipelines, where consistency and
reliability are paramount. Achieving reproducibility in #gls("SE") is filled
with challenges, from architecture dependencies to non-determinism in compilers.
These solutions offers a compelling solution by ensuring reproducible build
environments. The exploration of the concepts used in Guix and Nix, and its
methodologies provides valuable insights into the complexities of software
reproducibility and the necessity for continued research and development in this
field. They both are categorised as *reproducible*.
|
https://github.com/Coekjan/parallel-programming-learning | https://raw.githubusercontent.com/Coekjan/parallel-programming-learning/master/ex-4/report.typ | typst | #import "../template.typ": *
#import "@preview/cetz:0.2.2" as cetz
#import "@preview/codelst:2.0.1" as codelst
#show: project.with(
title: "å¹¶è¡çšåºè®Ÿè®¡ç¬¬ 4 次äœäžïŒCUDA çŒçšïŒ",
authors: (
(name: "å¶ç¯ä»", email: "<EMAIL>", affiliation: "ACT, SCSE"),
),
)
#let data = toml("data.toml")
#let lineref = codelst.lineref.with(supplement: "Code line")
#let sourcecode = codelst.sourcecode.with(
label-regex: regex("//!\s*(line:[\w-]+)$"),
highlight-labels: true,
highlight-color: lime.lighten(50%),
)
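// Average the three measured runs for each threads-per-block setting.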
#let data-time(raw-data) = raw-data.pairs().map(pair => {
let (tpb, data) = pair
(str(tpb), data.sum() / data.len())
})
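// Draw a column chart of average running time versus threads per block.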
#let data-chart(raw-data, width, height, time-max) = cetz.canvas({
cetz.chart.columnchart(
size: (width, height),
data-time(raw-data),
y-max: time-max,
    x-label: [_Threads per block_],
    y-label: [_Average running time (seconds)_],
bar-style: none,
)
})
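// Tabulate the raw measurements (three runs per threads-per-block setting).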
#let data-table(raw-data) = table(
columns: (auto, 1fr, 1fr, 1fr),
  table.header([*Threads per block*], table.cell([*Running time (seconds)*], colspan: 3)),
..raw-data.pairs().map(pair => {
let (tpb, data) = pair
(str(tpb), data.map(str))
}).flatten()
)
= Experiment: Matrix Multiplication

== Contents and Methods

The NVIDIA CUDA heterogeneous programming interface is used to accelerate
matrix multiplication in parallel; the program is run under different thread
configurations, and the running times are recorded and analysed.

- Matrix size: 8192 #sym.times 8192
- CUDA configuration (let $B$ be the number of thread blocks in the grid and $T$ the number of threads per block):
  - Threads and thread blocks are laid out in two dimensions
  - Ensure that $B times T = 8192$
  - Vary the number of threads per block: 2 \~ 32

The program is built around the following key points:

+ The FMAD instruction on the GPU is disabled to avoid floating-point error;
+ The number of threads per block is determined by the environment variable `THREADS_PER_BLOCK`;
+ A ```c cudaCheck()``` macro checks CUDA API calls for errors;
+ The POSIX ```c gettimeofday()``` function is used to record wall-clock time;
+ The result of the matrix multiplication (a single-precision floating-point array) is recorded concisely as a fingerprint computed with OpenSSL's SHA1 algorithm.

The code is shown in @code:matmul-code, where:

- #lineref(<line:cuda-kernel>) defines the CUDA kernel for matrix multiplication;
- #lineref(<line:cuda-blk-th-1>) and #lineref(<line:cuda-blk-th-2>) use the CUDA API to obtain the coordinates of the thread block and the thread;
- #lineref(<line:cuda-malloc-1>), #lineref(<line:cuda-malloc-2>) and #lineref(<line:cuda-malloc-3>) allocate memory on the GPU; #lineref(<line:cuda-free-1>), #lineref(<line:cuda-free-2>) and #lineref(<line:cuda-free-3>) free the memory on the GPU;
- #lineref(<line:cuda-memcpy-1>), #lineref(<line:cuda-memcpy-2>), #lineref(<line:cuda-memcpy-3>), #lineref(<line:cuda-memcpy-4>), #lineref(<line:cuda-memcpy-5>) and #lineref(<line:cuda-memcpy-6>) transfer data between the CPU and the GPU;
- #lineref(<line:cuda-tpb>), #lineref(<line:cuda-bpg-1>) and #lineref(<line:cuda-bpg-2>) partition the thread blocks and threads;
- #lineref(<line:cuda-matmul-1>), #lineref(<line:cuda-matmul-2>), #lineref(<line:cuda-matmul-3>) and #lineref(<line:cuda-matmul-4>) invoke the CUDA kernel to perform the matrix multiplication;
- #lineref(<line:cuda-check-last-err>) checks the CUDA calls for errors;
- #lineref(<line:cuda-sync>) synchronises the kernel execution.
#figure(
sourcecode(
raw(read("matmul/matmul.cu"), lang: "cpp"),
),
caption: "å¹¶è¡ç©éµä¹æ³ MPI å®ç°ä»£ç ",
) <code:matmul-code>
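
For reference, the sketch below shows one plausible shape for the ```c cudaCheck()``` macro and the matrix multiplication kernel outlined above. The identifiers and signatures are illustrative assumptions only; the authoritative code is the full listing in @code:matmul-code.

#figure(
  sourcecode(
    ```cpp
    // Illustrative sketch, not the project's actual implementation.
    #define cudaCheck(call)                                                  \
      do {                                                                   \
        cudaError_t err_ = (call);                                           \
        if (err_ != cudaSuccess) {                                           \
          fprintf(stderr, "CUDA error at %s:%d: %s\n", __FILE__, __LINE__,   \
                  cudaGetErrorString(err_));                                 \
          exit(EXIT_FAILURE);                                                \
        }                                                                    \
      } while (0)

    // Each thread computes one element of C = A * B (n x n, row-major).
    __global__ void matmul(const float *a, const float *b, float *c, int n) {
      int row = blockIdx.y * blockDim.y + threadIdx.y;
      int col = blockIdx.x * blockDim.x + threadIdx.x;
      if (row < n && col < n) {
        float sum = 0.0f;
        for (int k = 0; k < n; k++)
          sum += a[row * n + k] * b[k * n + col];
        c[row * n + col] = sum;
      }
    }
    ```
  ),
  caption: "Illustrative sketch of the error-checking macro and CUDA kernel",
)

With such a macro, every CUDA API call can be wrapped as, for example, ```c cudaCheck(cudaMalloc(&dev_a, bytes))```, so that any failure aborts with a precise source location.
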
== Experimental Procedure

The experiments were run on the platform described in @chapter:platform-info,
using 2, 4, 8, 16 and 32 threads per block. Each configuration was measured 3
times and the running times were averaged; the raw data are shown in
@table:matmul-raw-data.

== Results and Analysis

The running times measured in the matrix multiplication experiment are shown
in @figure:matmul-chart.
When the number of threads per block is small, the overhead of GPU block
scheduling and synchronisation dominates and CUDA's parallel computing
capability cannot be fully exploited, so performance degrades noticeably; as
the number of threads per block grows, performance gradually improves and
levels off.
#figure(
data-chart(data.matmul, 12, 8, 100),
caption: "ç©éµä¹æ³è¿è¡æ¶éŽ",
) <figure:matmul-chart>
The raw data from the matrix multiplication experiment are listed in @table:matmul-raw-data.
#figure(
data-table(data.matmul),
caption: "ç©éµä¹æ³å®éªåå§æ°æ®",
) <table:matmul-raw-data>
= Remarks

== Compilation and Execution

The code depends on the NVIDIA CUDA library (and the corresponding driver) and
the OpenSSL library; if these libraries are not installed, install them first.
Once the dependencies are ready, the following commands compile and run the
program:

- Compile: ```sh make```;
  - The `nvcc` command-line option `--fmad=false` is added to disable the FMAD instruction on the GPU;
- Run: ```sh make run```;
  - The environment variable ```THREADS_PER_BLOCK``` selects the number of threads per block, e.g. ```sh THREADS_PER_BLOCK=32 make run```;
  - If the run ends with an error (a fingerprint mismatch was detected), the result is incorrect; the rough logic of this check is given by the Makefile code in @code:makefile-fingerprint;
#figure(
sourcecode(
```make
# The fingerprint of the result
FINGERPRINT := 00 11 22 33 44 55 66 77 88 99 99 88 77 66 55 44 33 22 11 00
# Run the program `app` and check the fingerprint
.PHONY: run
run:
	exec 3>&1; stdbuf -o0 ./app | tee >(cat - >&3) | grep -q "$(FINGERPRINT)"
```
),
caption: "Makefile äžçæçº¹æ£æµä»£ç "
) <code:makefile-fingerprint>
- Clean up: ```sh make clean```.
== Experimental Platform Information <chapter:platform-info>
#figure(
table(
columns: (auto, 1fr),
    table.header([*Item*], [*Information*]),
    [CPU], [11th Gen Intel Core i7-11800H \@ 16x 4.6GHz],
    [GPU], [NVIDIA GeForce RTX 3060 Laptop GPU],
    [Memory], [DDR4 32 GB],
    [VRAM], [6 GB],
    [Operating system], [Manjaro 24.0.1 Wynsdey (Linux 6.6.32)],
    [CUDA], [12.4],
),
caption: "å®éªå¹³å°ä¿¡æ¯",
) <table:platform-info>
|
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/import-06.typ | typst | Other | // Usual importing syntax also works for function scopes
#import enum
#let d = (e: enum)
#import d.e
#import d.e: item
#item(2)[a]
|
https://github.com/piepert/typst-seminar | https://raw.githubusercontent.com/piepert/typst-seminar/main/Beispiele/Hausarbeit/outline-template.typ | typst | #let outline(title: "Inhaltsverzeichnis", depth: none, indent: true, fill: " . ") = {
heading(title, numbering: none)
locate(it => {
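    // Gather every heading that follows this outline's position in the document.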
let elements = query(selector(heading).after(it), it)
for (i, e) in elements.enumerate() {
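      // Skip headings excluded from the outline or deeper than the requested depth.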
      if e.outlined == false or (depth != none and e.level > depth) { continue }
let number = if e.numbering != none {
numbering(e.numbering, ..counter(heading).at(e.location()))
" "
}
let line = {
if indent {
h(1em * (e.level - 1 ))
}
if e.level == 1 {
v(weak: true, 0.5em)
set text(weight: "bold")
number
e.body
} else {
number
e.body
}
// Filler dots
box(width: 1fr, h(3pt) + box(width: 1fr, repeat(fill)) + h(3pt))
// Page number
let page_number = counter(page).at(e.location()).first()
str(page_number)
linebreak()
}
link(e.location(), line)
}
})
}
|
|
https://github.com/han0126/MCM-test | https://raw.githubusercontent.com/han0126/MCM-test/main/2024æ ¡èµtypst/chapter/chapter3.typ | typst | = æš¡åå讟
åšå»ºæš¡çè¿çšäžïŒäžºç®åé®é¢äžæ¹äŸ¿å»ºæš¡ïŒæä»¬åšäžåœ±åæš¡åçå¯é æ§åæææ§çåæäžïŒååºä»¥äžå讟ïŒ
ïŒ1ïŒå讟ç«èµè¡šç°äŒåŒçåŠçåŸåŸä¹åšåŠäžäžæèŸå¥œç衚ç°ãå æ€ïŒç«èµæç»©äžåŠççåŠäžæç»©ä¹éŽååšäžå®çæ£çžå
³å
³ç³»ã
ïŒ2ïŒå讟积æåäžç«èµå¯ä»¥ä¿è¿åŠçç绌åçŽ èŽšæåïŒå
æ¬äœäžéäºäžäžç¥è¯ãå¢éåäœèœåãåæ°æç»Žãæ²éèœåçæ¹é¢ãéè¿åäžç«èµïŒåŠçå¯ä»¥åšå®è·µäžäžææåèªèº«çèœååçŽ èŽšæ°Žå¹³ã
ïŒ3ïŒå讟åäžç«èµå¹¶è·åŸäžå®æç»©çåŠçïŒå
¶æªæ¥çåŠä¹ åèäžåå±å¯èœäŒåå°ç§¯æåœ±åãç«èµç»éªå¯ä»¥äžºåŠççååŠãå°±äžåç§ç çæ¹é¢æäŸé¢å€çå å项ïŒå¢åŒºå
¶ç«äºåã
ïŒ4ïŒå讟åèµåŠçæå
¥æŽå€çæ¶éŽå粟ååå€ç«èµïŒåŸåŸèœååŸæŽå¥œçæç»©ãå æ€ïŒç«èµæç»©äžåŠçåå€ç«èµçæ¶éŽæå
¥ä¹éŽååšäžå®çæ£çžå
³å
³ç³»ã
|
|
https://github.com/Sematre/typst-kit-thesis-template | https://raw.githubusercontent.com/Sematre/typst-kit-thesis-template/main/sections/03_introduction.typ | typst | = Introduction
This is the template for Bachelor's and Master's theses at SDQ.
For more information on the formatting of theses at SDQ, please refer to
#link("https://sdq.kastel.kit.edu/wiki/Ausarbeitungshinweise") or to your advisor.
== Pacing and indentation
To separate parts of text in Typst, please use two line breaks in your source code.
They will then be set with correct indentation.
Do _not_ use:
- ```typ #parbreak()```
- ```typ #v()```
or other commands to manually insert spaces, since they break the layout of this template.
== Example: Citation
A citation: @becker2008a
== Example: Figures
A reference: The SDQ logo is displayed in @sdqlogo.
(Use ```typ @``` for easy referencing.)
#figure(
image("/assets/logo-sdq.svg", width: 4cm),
caption: "SDQ logo"
) <sdqlogo>
== Example: Tables
Typst offers nicely typeset tables, as in @atable.
#figure(
table(
columns: 2,
[abc], [def],
[ghi], [jkl],
[123], [456],
[789], [0AB]
),
caption: "A table"
) <atable>
== Example: Formula
$
  f(x) = Omega(g(x)) (x arrow infinity)
arrow.l.r.double
limsup_(x arrow infinity) |f(x) / g(x)| > 0
$ |
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/grotesk-cv/0.1.0/content/profile.typ | typst | Apache License 2.0 | #import "../lib.typ": *
#import "../metadata.typ"
#import "@preview/fontawesome:0.2.1": *
== #fa-icon("id-card") #h(5pt) #get-header-by-language("Summary", "Resumen")
#v(5pt)
#if is-english() [
Experienced Software Engineer specializing in artificial intelligence, machine learning, and robotics. Proficient in C++, Python, and Java, with a knack for developing sentient AI systems capable of complex decision-making. Passionate about ethical AI development and eager to contribute to groundbreaking projects in dynamic environments.
] else if is-spanish() [
Ingeniero de Software experimentado especializado en inteligencia artificial, aprendizaje automático y robótica. Competente en C++, Python y Java, con un talento para desarrollar sistemas de IA conscientes capaces de tomar decisiones complejas. Apasionado por el desarrollo ético de la IA y ansioso por contribuir a proyectos innovadores en entornos dinámicos.
]
|