repo | file | language | license | content
---|---|---|---|---
https://github.com/freundTech/typst-matryoshka | https://raw.githubusercontent.com/freundTech/typst-matryoshka/main/tests/template.typ | typst | MIT License | #import "/lib.typ": compile
#set page(fill: gray)
#compile("")
|
https://github.com/xiongyaohua/godot-typst | https://raw.githubusercontent.com/xiongyaohua/godot-typst/main/doc/Design.typ | typst | MIT License | = Introduction
The intention of this project is to use Godot as a presentation tool with the help of Typst. Godot, as a game engine, is good at showing motion graphics and 3D models and at providing interactivity. On the other hand, Typst is good at typesetting text, especially text mixed with math. So they are a perfect match.
= How it should work
- This plugin provides a `TypstPages` node.
- Write `.typ` files for the slides.
- Add a `TypstPages` instance in the scene tree.
- Attach `.typ` files to it.
- Attach other Godot scenes as children of the `TypstPages` instance.
- The `TypstPages` node handles page changes and shows/hides child scenes accordingly.
= How to bind
Two bindings are considered:
+ `gdrust`
+ `godot-cpp` + `cxx`
The second one was chosen. While gdrust is a direct binding for the Rust language, using godot-cpp is preferable for the following reasons:
+ `godot-cpp` is officially supported by the Godot developers. As a result, it is more mature and follows new Godot versions more closely.
+ The `godot-cpp` API closely mimics Godot's internal API. This similarity leaves the door open to integrating Typst directly into the engine as a module, achieving better performance.
+ `cxx` is used instead of `cbindgen` to reduce the chance of type mistakes.
= Other Considerations
- Use only containers provided by `godot-cpp` when possible, to ease potential porting to module.
= TODO
- Set up project files and check that all parts work (`cxx`, `scons`, `godot-cpp`)
- Add `TypstPages`
|
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024 | https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/entries/tournament-cardinal/entry.typ | typst | Creative Commons Attribution Share Alike 4.0 International | #import "/packages.typ": notebookinator, diagraph
#import notebookinator: *
#import themes.radial.components: *
#import themes.radial.colors: *
#import "/utils.typ": tournament-from-csv
#import diagraph: *
#show: create-body-entry.with(
title: "Tournament: Cardinal Classic",
type: "test",
date: datetime(year: 2024, month: 2, day: 10),
author: "<NAME>",
witness: "<NAME>",
)
= Qualifications
#let qual-data = tournament-from-csv(
read("./RE-VRC-23-1735-Default Division-Results-2024-02-13.csv"),
team-name: "53E",
section: "qualifications",
)
#(qual-data.at(3).awp = true)
#(qual-data.at(4).awp = true)
#tournament(..qual-data)
= Eliminations
We were ranked 23rd going into alliance selection, so we didn't have many options.
We wanted to prioritize getting picked by another team over making a pick
ourselves, in an attempt to move up as a seed.
#raw-render[```dot
digraph {
rankdir=LR;
start->"We get picked by 1727Y"
"We get picked by 1727Y"->"We pick 53C" [label = "no"]
"We get picked by 1727Y"->"end" [label = "yes"]
"We pick 53C" ->"end"[label = "yes"]
start[shape=Mdiamond]
end[shape=Msquare]
}
```]
Unfortunately 1727Y got picked by 11555A, and we got picked by 21078A before we
could pick 53C.
#let elims-data = tournament-from-csv(
read("./RE-VRC-23-1735-Default Division-Results-2024-02-13.csv"),
team-name: "53E",
section: "eliminations",
)
#tournament(..elims-data)
#colbreak()
= Reflection
== Overall Performance
#grid(
columns: (1fr, 1fr),
pie-chart(
(value: 4, color: green, name: "wins"),
(value: 3, color: red, name: "losses"),
),
[
Our win-loss ratio was not where we wanted it to be, but we were able to score the
AWP twice, and almost a third time during Q83. Each time we got the AWP we massively
advanced in the rankings, but the loss in our final qualification match left us
without many options during eliminations.
],
)
== Robot Performance
After this tournament we'll have some time before the state tournament to
address issues. Therefore it's worth doing an overview of our robot's
subsystems' performance to see where we need improvement.
=== Wings
#grid(
columns: (1fr, 1fr),
gutter: 20pt, //
[
Overall the wings were probably the best performing part of our robot, and work
great, even with the changes in strategy we've made. There are a few minor
issues, but these are easily fixable.
],
pro-con(pros: [
- Work reliably
- Consistently push triballs into the goal
], cons: [
- Sometimes don't fully retract if rubber bands break
]),
)
=== Intake
#grid(
columns: (1fr, 1fr),
gutter: 20pt, //
[
The intake is the biggest problem point on our robot currently. We haven't
changed the design in a very long time, and the optimal design has changed a
lot. Not only that, but the intake has some severe structural problems that will
only get worse if not addressed.
],
pro-con(pros: [
- Consistently grabs the triballs
], cons: [
- Rubber bands easily break
- C-channel is bending, putting stress on the axle
- The axle is low strength, making it very easy to bend
]),
)
#colbreak()
=== Flywheel
#grid(
columns: (1fr, 1fr),
gutter: 20pt, //
[
The flywheel is another problem point on our robot, but we're not sure if we
will have enough time to fix it before the state competition. Having a shooting
mechanism is becoming less and less important during match play, but it's still
very useful during skills.
],
pro-con(pros: [
- Can be loaded very quickly
- Elevation makes it hard to block
], cons: [
- Extremely inconsistent spread
- Bad mounting to the lift gives it more friction than it should
]),
)
=== Lift
#grid(
columns: (1fr, 1fr),
gutter: 20pt, //
[
The lift has a few problems, but its performance is good enough that we probably
won't address many issues with it.
],
pro-con(pros: [
- Is typically able to elevate in under 5 seconds
], cons: [
- Only lifts us to A tier
- Has trouble lifting without rubber bands
- Has trouble actuating multiple times in a match
]),
)
== Strategy Breakdown
The strategy we used at this tournament was very different from the
strategy we used at Gateway. We've become more comfortable with the field
starvation strategies, and have become better at knowing when to apply them and
when not to. Here is the decision-making process we used to choose our strategy:
#raw-render[```dot
digraph {
rankdir=LR;
start->"Do our opponents have angled wings?"
"Do our opponents have angled wings?" ->"Field Starvation"[label = "yes"]
"Do our opponents have angled wings?" ->"Shooting"[label = "no"]
start[shape=Mdiamond]
}
```]
The main deciding factor is how fast they can return triballs to our side of the
field, and the fastest way we've seen to do that is angled wings.
#grid(
columns: (1fr, 1fr),
gutter: 20pt,
pie-chart(
(value: 5, color: red, name: "Field Starvation"),
(value: 2, color: orange, name: "Shooting"),
), //
[
Overall we ran field starvation strategies more often than we did shooting,
mainly because even if we were facing an alliance that couldn't relocate
triballs easily, they had some kind of shooting mechanism we could capitalize
off of. Overall we found it more effective to score triballs introduced by other
teams rather than introduce them ourselves.
],
)
|
https://github.com/Its-Alex/resume | https://raw.githubusercontent.com/Its-Alex/resume/master/lib/education.typ | typst | MIT License | #import "components/title.typ": customTitle
#import "components/link_with_icon.typ": linkWithIcon
#let education(title, education) = [
#customTitle(title)
#grid(
columns: (100%),
gutter: 0pt,
row-gutter: 1.5em,
..education.map((educationItem) => [
#block(breakable: false)[
#grid(
columns: (40%, 60%),
gutter: 0pt,
{
grid(
columns: 100%,
gutter: 5pt,
{
text(weight: 600)[#educationItem.name]
}
)
},
{
grid.cell(align: right)[
#text(weight: 400, rgb("BCBABF"))[(#educationItem.dates)]
]
}
)
#text()[#eval(educationItem.description, mode: "markup")]
#if type(educationItem.link) == str [
#linkWithIcon(
"link.svg",
educationItem.link,
educationItem.link
)
]
]
])
)
] |
https://github.com/LEXUGE/poincare | https://raw.githubusercontent.com/LEXUGE/poincare/main/src/reports/artiq/main.typ | typst | MIT License | #import "@preview/physica:0.9.0": *
#import "@preview/gentle-clues:0.4.0": *
#import "@lexuge/templates:0.1.0": *
#import "@preview/fletcher:0.5.1" as fletcher: diagram, node, edge
#import shorthands: *
#show: simple.with(
title: "ARTIQ Diagnostic Streaming Project Report",
authors: ((name: "<NAME>", email: "<EMAIL>"),),
disp_content: true,
)
#show figure: set block(breakable: true)
#pagebreak()
= Introduction
#link("https://github.com/m-labs/artiq/")[ARTIQ] is an open source control system used for quantum information experiments. #link("https://sinara-hw.github.io/")[Sinara] is the open source hardware project used by ARTIQ system.
The streamer prototype we implemented in this project is capable of streaming samples from the Sinara "Sampler" ADC (analogue-to-digital converter) to the ARTIQ Dashboard. It is designed to be compatible with the messages used by the ARTIQ analyzer, hence through extension it can in principle stream a broad range of messages.
Users should note that the performance of the prototype isn't entirely desirable, and some understanding of the design/implementation is useful to achieve a desirable outcome.
This document is written for the git revision `c6a9057a07950ccaa01adf7515938f03fc865bfb`.
= Usage
The basic setup is as follows.
#let colors = (maroon, olive, eastern)
#figure(
diagram(
edge-stroke: 1pt,
node-corner-radius: 3pt,
mark-scale: 80%,
node((0,-0.5), [Kasli], fill: colors.at(0)),
node((0,0), [$dots.v$]),
node((0,0.5), [Kasli], fill: colors.at(0)),
edge("-|>", label: "UDP"),
edge((0,-0.5), (2, 0), "-|>", label: "UDP"),
node((2, 0), [`artiq_streamer` \ publisher], fill: colors.at(1)),
edge("-|>", label: "ZMQ"),
node((4,0.5), [ARTIQ Dashboard], fill: colors.at(2)),
node((4,0), [$dots.v$]),
edge((2,0), (4, -0.5), "-|>", label: "ZMQ"),
node((4,-0.5), [ARTIQ Dashboard], fill: colors.at(2)),
),
caption: [Basic setup]
)<fig:basic-setup>
#note[Currently, only the `master` variant of Kasli is supported.]
== Basic Idea<sec:basic-idea>
#figure(
diagram(
edge-stroke: 1pt,
edge((0,0), (5,0), "-|>", label: $t$, label-anchor: "north", label-sep: -0.5em),
edge((1,0), (4,0), "|-|", label: [Streaming Window]),
edge((0.5, 2), (0.5, 0), "..*", label: [`trigger`], label-angle: auto, label-side: left),
edge((1, 1), (1, 0), "..*", label: [`start`], label-angle: auto, label-side: right),
edge((4, 1), (4, 0), "..*", label: [`end`], label-angle: auto, label-side: right)
),
caption: [Triggering Setup]
)<fig:trigger>
In a usual setup, we set `start` and `end` manually (via `artiq_streamer set`) or in experiment code via ```python self.streamer.set_window(start, end)```, where `start, end` are in machine units (nanoseconds) relative to the trigger timestamp.
After the relevant setup, we call ```python self.streamer.set_trigger()``` in the experiment code, which sets the trigger timestamp. Streaming is then enabled while we are inside the _streaming window_.
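As a concrete illustration of this timing arithmetic, here is a plain-Python sketch (not kernel code; all numeric values are hypothetical):

```python
# Hypothetical numbers illustrating how the streaming window relates to the trigger.
# start/end are in machine units (ns) relative to the trigger timestamp.
trigger_ts = 1_000_000                  # ns, conceptually set by set_trigger()
delta_start, delta_end = 100, 10_100    # ns, conceptually set by set_window(start, end)

window_start = trigger_ts + delta_start
window_end = trigger_ts + delta_end

def in_window(t):
    """Streaming is enabled only for timestamps inside the window."""
    return window_start <= t <= window_end

assert in_window(1_005_000)
assert not in_window(999_999)
```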
== Kasli Setup
=== Building the gateware<sec:kasli-target-setup>
The ```python eem.SUServo.add_std()``` function has been patched to return a tuple ```python suservo, sample_tap``` instead of `suservo` only.
To use the ```python sample_tap```, the streamer gateware needs to be created in the target file:
```python
# Create the streamer using sample_tap as the source.
self.submodules.streamer = ClockDomainsRenamer("rio_phy")(
streamer.Streamer(
self.rtio_tsc,
self.get_native_sdram_if(),
[sample_tap],
cpu_dw=self.cpu_dw,
)
)
# Firmware needs to set up buffer addresses etc. via CSR.
self.csr_devices.append("streamer")
```
Another submodule, `artiq.gateware.rtio.phy.servo.StreamerTrigger`, is needed to control the streamer (set the streaming window, enable/disable the streamer). It is thus also needed in the target:
```python
# RTIO component used to control the streamer.
self.submodules.streamer_ctrl = rtservo.StreamerTrigger(
self.rtio_tsc, self.streamer
)
self.rtio_channels.append(rtio.Channel.from_phy(self.streamer_ctrl))
```
See also `artiq.gateware.targets.kasli_stream_test` as an example of a simple modified target.
=== Configure the publisher endpoint<sec:pub-endpoint>
Kasli needs to know where the `artiq_streamer` publisher (@fig:basic-setup) is in order to send data. This is configured using `flash_storage.img`.
Specifically, we create the flash storage image with
```bash
artiq_mkfs -s mac KASLI_MAC -s ip KASLI_STATIC_IP -s streaming_publisher 10.255.6.1:5005 flash_storage_streaming.img
```
where `10.255.6.1:5005` should be substituted with the address you plan to use for the publisher.
And we flash it to Kasli using
```bash
sudo artiq_flash -f flash_storage_streaming.img storage start
```
== Publisher Setup<sec:publisher-setup>
`artiq_streamer` is a standalone tool which can run the publisher, set the streaming window on the fly, etc.
To use it as a publisher via Nix:
```bash
git clone https://github.com/LEXUGE/artiq_streamer
nix run .#artiq_streamer -- publisher "tcp://0.0.0.0:5006" "0.0.0.0:5005"
```
where `0.0.0.0:5005` should be in accordance with `10.255.6.1:5005` above (@sec:pub-endpoint). The address `tcp://0.0.0.0:5006` is where the publisher binds to send data to dashboards via ZMQ (see also @fig:basic-setup).
Alternatively, if you have a Rust toolchain installed, you can compile and run `artiq_streamer` directly with `cargo run --release`.
For more detail, see the help by ```bash
$ nix run .#artiq_streamer -- --help
Usage: artiq_streamer publish [PUBLISHER_ADDR] [UDP_ADDR]
Arguments:
[PUBLISHER_ADDR] [default: tcp://0.0.0.0:5006]
[UDP_ADDR] [default: 0.0.0.0:5005]
Options:
-h, --help Print help
```
== Experiment and Dashboard Setup
=== `device_db`
The `device_db` has to include the following two RTIO components:
1. a `StreamerTap` module which controls the prescaling/enable of the `sample_tap`.
2. a `Streamer` module which controls the enable/streaming window of the streamer.
```python
"sample_tap": {
"type": "local",
"module": "artiq.coredevice.suservo",
"class": "StreamerTap",
"arguments": {"channel": SAMPLE_TAP_CHANNEL },
}
"streamer": {
"type": "local",
"module": "artiq.coredevice.suservo",
"class": "Streamer",
"arguments": {"channel": STREAMER_CHANNEL },
}
```
The `SAMPLE_TAP_CHANNEL` has to be calculated manually; it is added as the last RTIO component in ```python eem.SUServo.add_std()```. See the function code for more detail.
The `STREAMER_CHANNEL` also needs to be calculated manually, and it depends on where you added `StreamerTrigger` in your target code (as in @sec:kasli-target-setup).
For example, see `artiq/examples/streaming` for a simple setup.
#warning[
As of now, ARTIQ master may have some issues with updating `device_db.py`: newly added core devices may not show up. If necessary, restart the ARTIQ master.
]
=== Experiment/Kernel code
Refer to @sec:basic-idea first.
See also ```python artiq.coredevice.suservo``` for details on the RTIO methods available in the ARTIQ kernel, and `artiq/examples/streaming` for a simple experiment.
=== Dashboard Setup
There is a `Streamer` tab in the dashboard, and different channels can be selected in that tab. When an experiment is running, the plot is automatically updated in accordance with the trigger/streaming-window settings.
#warning[
Currently the publisher address that the dashboard listens to (`tcp://0.0.0.0:5006` in @sec:publisher-setup) is _hardcoded_ in `artiq/dashboard/streamer.py`. You need to manually change it to `tcp://PUBLISHER_IP:PUBLISHER_PORT` in the code.
]
The result looks like this in dashboard:
#figure(
image("imgs/sinwave.png", width: 100%),
caption: [
Plot of streamed ADC samples with a connected function generator, viewed in dashboard
],
)
#figure(
image("imgs/single-qubit-gate.png", width: 100%),
caption: [
Plot of streamed ADC samples about laser pulses in a single qubit gate experiment, viewed in dashboard
],
)
== Setting the Streaming Window on the Fly
Instead of setting the streaming window using
```python
self.streamer.set_window(delta_start=START, delta_end=END)
```
in experiment code, we can also set it on the fly using `artiq_streamer`:
```bash
nix run .#artiq_streamer -- set KASLI_IP:2001 START END
```
So if the IP address of Kasli is `10.255.6.177`, we would run
```bash
nix run .#artiq_streamer -- set 10.255.6.177:2001 START END
```
Note that the port `2001` is static and set in the firmware (`artiq/firmware/runtime/stream.rs`).
#pagebreak()
= Design
The prototype consists of
- A module in ARTIQ gateware used to intercept ("tap") the samples.
- A module in ARTIQ firmware to send tapped samples over Ethernet to PC.
- A standalone `artiq_streamer` command-line tool used as a publisher (see @fig:basic-setup).
- A module in ARTIQ dashboard used to plot and display data.
We will focus on the design of how firmware and gateware work together.
== Basic Data Structure
The idea is to implement a simple ring buffer with the gateware as producer and the firmware as consumer. The producer writes `message`s (32-byte payloads) and the consumer reads them. In practice, there are two types of messages:
1. `SampleMessage`
2. `StopMessage`
and the `StopMessage` will be appended automatically at the end of the streaming window (@fig:trigger). See `artiq/gateware/streamer.py` for the exact message format.
Every `SEGMENT_SIZE` consecutive `message`s form a `segment`. The buffer size (in units of `message`) is `buffer.len() = NUM_SEGMENT * SEGMENT_SIZE`.
Let `base_address` and `last_address` be the start and end addresses of the buffer, respectively. For simplicity of illustration, *we treat each address as if it corresponds to one `message`-long memory cell.* In other words, the machine is `message`-addressing.
The firmware places a guard while it is reading, and the gateware falls back to protect the segment in use from being overwritten.
In Rust-style pseudo-code, the design looks like
#figure(
```rust
let mut guard = 0;
// Consumer (Firmware)
// The number of segment that we have already read.
// 0 represents nothing read yet. The pointer can wrap around and be larger than or equal to NUM_SEGMENT.
// Since we label the segment from 0, this also represents (after mod NUM_SEGMENT) the segment that we are _about_ to read.
let mut consumer_pointer: usize = 0;
loop {
// Get the number of segments the gateware has written completely.
let producer_pointer = addr / SEGMENT_SIZE;
// If there are still some completely-written segments not yet read.
if consumer_pointer < producer_pointer {
// As commented before, consumer_pointer also represents the segment we are _about_ to read after % NUM_SEGMENT.
let wrapped_pointer = consumer_pointer % NUM_SEGMENT;
// Place the guard so that if the gateware is about to run into the segment we are currently reading, it will stop.
guard = wrapped_pointer;
// Read the `wrapped_pointer` segment in the buffer. And each segment is sent in a single UDP packet to the publisher.
read_and_send_segment(wrapped_pointer);
// We have read another segment, increment the pointer by definition.
consumer_pointer += 1;
}
}
// Producer (Gateware)
let mut addr = base_address;
// sample_tap_fifo is where sample tap feeds messages into the gateware.
while !sample_tap_fifo.is_empty() {
write_message_to_memory(sample_tap_fifo.get_message());
// ((addr + 1) / SEGMENT_SIZE) % NUM_SEGMENT: which segment our next message will be in.
// If the next message will be in the segment which is currently guarded (i.e. currently being read)
if ((addr + 1) / SEGMENT_SIZE) % NUM_SEGMENT == guard {
// Fallback to the beginning of the current segment.
// The next message will then start overwriting the current segment.
addr -= (SEGMENT_SIZE - 1);
} else {
// Otherwise we are safe.
// Increment the address, when reaching the end, start from the beginning.
if addr == last_address {
addr = base_address;
} else {
addr += 1;
}
}
}
```,
caption: [Pseudo-code of the firmware/gateware logic]
)<data-structure>
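To sanity-check the guard logic above, here is a small, runnable Python simulation of the ring buffer (sizes and names are illustrative, not taken from the real firmware):

```python
# Minimal model of the guarded ring buffer from the pseudo-code above.
# All names and sizes are illustrative, not from the actual firmware.
SEGMENT_SIZE = 4
NUM_SEGMENT = 3
BUF_LEN = SEGMENT_SIZE * NUM_SEGMENT

buf = [None] * BUF_LEN   # message-addressed buffer
addr = 0                 # producer write address (base_address = 0)
guard = 1                # segment 1 is currently "being read" by the consumer

def produce(msg):
    """Write one message, falling back if the next write would enter the guarded segment."""
    global addr
    buf[addr] = msg
    # Which segment the *next* message would land in:
    if ((addr + 1) // SEGMENT_SIZE) % NUM_SEGMENT == guard:
        addr -= SEGMENT_SIZE - 1   # fall back to the start of the current segment
    else:
        addr = 0 if addr == BUF_LEN - 1 else addr + 1

for i in range(20):
    produce(i)

# The guarded segment was never written:
guarded = buf[guard * SEGMENT_SIZE:(guard + 1) * SEGMENT_SIZE]
assert guarded == [None] * SEGMENT_SIZE
```

With an immobile guard, the producer keeps overwriting its current segment, which also illustrates why `StopMessage`s can be lost (see @sec:existing-problems).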
== Publisher and Dashboard
The simple gateware#footnote[It corresponds to the `DMAWriter` logic in `artiq/gateware/streamer.py`.] presented in @data-structure doesn't care about what messages it's writing.
Each segment is sent in a single UDP packet to the publisher. The publisher parses the packet and the messages within it, and publishes each `message` as a ZMQ message based on the RTIO channel contained within it#footnote[`StopMessage` doesn't have an RTIO channel, and is sent on a ZMQ channel of its own: `STOP_CHANNEL`.].
The dashboard subscribes to the corresponding ZMQ channel, based on the `device_db` provided to the dashboard, when the user opens a new waveform. The dashboard then parses the `message`s received. When a `StopMessage` is received, the waveform is refreshed (*this is not a desirable implementation; see Future Improvements*).
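The per-`message` fan-out can be sketched as follows (plain Python; the topic names and message layout are illustrative, not the actual `artiq_streamer` API):

```python
# Sketch of the publisher's per-message routing (illustrative names only).
STOP_CHANNEL = "stop"  # hypothetical topic name for StopMessage

def topic_for(message):
    """Pick the ZMQ topic a parsed message is published on."""
    if message["kind"] == "StopMessage":      # StopMessage carries no RTIO channel
        return STOP_CHANNEL
    return f"channel-{message['rtio_channel']}"

assert topic_for({"kind": "SampleMessage", "rtio_channel": 3}) == "channel-3"
assert topic_for({"kind": "StopMessage"}) == "stop"
```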
== Correctness and Existing Problems<sec:existing-problems>
In the end, we want to prioritize these goals for correctness:
#task[
1. `message`s' timestamps received by the publisher should be in strictly increasing order. In particular:
1. timestamps of `message`s within each segment MUST be increasing;
2. even between different segments, the timestamps should still be in order.
2. The timestamp difference between every `SampleMessage` and its next nearest `StopMessage` received by the publisher should be smaller than the streaming window length (see @fig:trigger).
$ #text[`NextStopMessage.time`] - #text[`SampleMessage.time`] lt.eq #text[Streaming window length] $
In other words, *the streaming window user sees on the dashboard should be strictly within the streaming window they set.*
3. The ratio
$ #text[Number of messages contributing to a complete streaming window]/#text[Total number of messages] $
should be as high as possible while maintaining reasonable throughput and delay.
]
The first goal is in practice achieved by our current design: we only read complete segments, and each complete segment by definition contains consecutive messages. The guard protects us from data races and corruption.
Moreover, even though the UDP protocol doesn't guarantee packet ordering, in our lab environment we haven't encountered any out-of-order issues between segments under prolonged streaming (more than one hour).
The second goal is *not* achieved by the current design due to the following counterexample:
#eg[`StopMessage` period is not guaranteed][
If `guard` is currently at 3, and the gateware has written a `StopMessage(time=t1)` in segment 2, then in the next streaming window, if the guard doesn't move and the streaming window is long enough, the `StopMessage(t1)` will be overwritten.
The user will then see a prolonged streaming window due to the missing `StopMessage(t1)`.
]
In fact, the second goal is in theory unlikely to be completely achievable, because it boils down to preserving all `StopMessage`s, and given, in the extreme case, a limited buffer size and an immobile guard, the gateware cannot preserve all `StopMessage`s.
The last goal is not what our design is optimized for. This is evident as
#eg[`message` sent are not optimized for complete segment][
If the gateware runs into the guard and the last segment is overwritten, the current streaming window is already incomplete.
However, the firmware will still read the previous segments of the current streaming window, leading to wasted bandwidth/throughput.
In the extreme case of a long streaming window and a small buffer, we will get frequent but short (very short compared to the streaming window set) waveforms, which is not useful.
It is more useful to stream less frequent but complete streaming windows.
]<eg:incomplete-window>
An example of @eg:incomplete-window is
#figure(
image("imgs/incomplete-waveform.png"),
caption: [An example of incomplete waveform streamed, the streaming window set is 1 second.]
)<fig:incomplete-waveform>
Empirically, the limit for a snappy response seems to be around 8000 messages/second:
- 100 ms of headroom with a 10 ms streaming window seems to give no glitches.
- Longer headroom seems to adversely affect responsiveness.
- Setting tap window + headroom = streaming window will cause trouble.
There are two parts that could cause delay:
- When there is very little headroom, the firmware can have trouble catching up, causing delay in displaying data.
- When updates are very frequent, the dashboard has trouble flushing them in time.
= Implementation Details
This section covers details of the implementation of the design and its existing problems.
== Gateware
The gateware part of the streamer prototype lies in:
- `artiq/gateware/streamer.py`: `Streamer` and `message` encoding and buffer writing.
- `artiq/gateware/rtio/phy/servo.py`: `StreamerTrigger` and `TriggeredSampleTap` used for controlling streamer and sample tap via RTIO.
- `artiq/gateware/suservo/adc_ser.py`: `DownSampler` which can downsample and enable/disable tapping for ADC.
- `artiq/coredevice/suservo.py`: Contains modules used in kernel code to interface with `Streamer` and `DownSampler`.
#figure(
diagram(
node((0,0), [`Streamer` \ `streamer.py`], name: <streamer>),
node((2,0), [`DownSampler` \ `suservo/adc_ser.py`], name: <sampler>),
node((0,-2), [`StreamerTrigger` \ `rtio/phy/servo.py`], name: <streamer-ctrl>),
node((2,-2), [`TriggeredSampleTap` \ `rtio/phy/servo.py`], name: <sampler-ctrl>),
edge(<streamer-ctrl>, <streamer>, "-|>", label: [Controls]),
edge(<sampler-ctrl>, <sampler>, "-|>", label: [Controls]),
edge(<sampler>, <streamer>, "-|>", label: [Provides Data]),
node((0,-4), [`Streamer` \ `coredevice/suservo.py`], name: <streamer-kernel-mod>),
node((2,-4), [`StreamerTap` \ `coredevice/suservo.py`], name: <tap-kernel-mod>),
edge(<streamer-kernel-mod>, <streamer-ctrl>, "-|>", label: [Issues RTIO output]),
edge(<tap-kernel-mod>, <sampler-ctrl>, "-|>", label: [Issues RTIO output]),
node((1, -5), [Kernel], name: <kernel>),
edge(<kernel>, <streamer-kernel-mod>, "-|>", label: [Calls]),
edge(<kernel>, <tap-kernel-mod>, "-|>", label: [Calls]),
),
caption: [Gateware architecture, all paths under `artiq/gateware` except `coredevice/suservo.py`]
)<fig:gateware-architecture>
Specific details can be found in the respective source code.
And the streamer contains various different submodules,
#figure(
diagram(
node((-1, -5), [`DownSampler`], name: <sampler1>),
node((1, -5), [`DownSampler`], name: <sampler2>),
edge(<sampler1>, <multi-tap>, "-|>"),
edge(<sampler2>, <multi-tap>, "-|>"),
node(enclose: (<dma>, <msg-encoder>, <multi-tap>), // a node spanning multiple centers
inset: 20pt, stroke: teal, fill: teal.lighten(90%), name: <streamer>),
node((0,0), [`DMAWriter`], name: <dma>),
node((0,-2), [`MessageEncoder`], name: <msg-encoder>),
edge(<msg-encoder>, <dma>, "-|>", label: [Feeds into \ `FIFO` \ `Convertor` \ finally]),
node((0,-4), [`MultiSampleTap`], name: <multi-tap>),
edge(<multi-tap>, <msg-encoder>, "-|>"),
),
caption: [`streamer.py` architecture, `Streamer` basically aggregates the submodules in blue.]
)
Notably, we slightly adapted the logic in @data-structure because:
1. The memory is not actually `message`-addressable.
2. The gateware doesn't support modulo operation well.
=== MiSoC patch
An important patch, `misoc_wishbone_add_uncached.patch`, is needed for the gateware-firmware pair to function properly. The patch allows us to disable the CPU L2 cache for a user-specified memory region; in our case, this region is our buffer.
This ensures we always read the correct data instead of outdated data from the L2 cache. *This patch is needed because SDRAM (where our buffer lives) isn't really designed to be used by components other than the CPU.*
== Firmware
We implemented UDP support using `smoltcp` in `artiq/firmware/runtime/sched.rs`; the only other change is `artiq/firmware/runtime/stream.rs`, where logic very similar to @data-structure is implemented.
`stream.rs` also implements the ```rust ctrl_thread()``` function, which spawns the control logic that allows `artiq_streamer` (or any other tool) to set the streaming window on the fly.
== `artiq_streamer`
`artiq_streamer` uses `nom`, a parser-combinator library, to parse the `message`s and segments received from Kasli. The parsing logic is located in `src/parser.rs`, where the relevant unit tests also reside.
To support additional messages, additional parsers should be written. The simple parser contained within doesn't currently support skipping unknown messages.
== Dashboard
The main changes are:
- `artiq/dashboard/streamer.py`: where waveform plotting and the ZMQ subscriber are handled.
- `artiq/dashboard/streamer_helper.py`: where decoding and `device_db` handling are done.
Both parts are rather hacky and would benefit a lot from future improvements.
#pagebreak()
= Future Improvements
Many future improvements could be done. As for design problems, refer to @sec:existing-problems.
== Firmware
- The `artiq/firmware/runtime/stream.rs` module should be feature-gated so that the streamer can be disabled at compile time for minimal overhead.
- Empirical observation shows that the firmware consumes segments in a spiky pattern: it blocks for a certain time and then consumes a lot in a short interval. This could be due to a scheduling problem.
Solving this could potentially give us fewer incomplete waveforms as seen in @fig:incomplete-waveform, as even a very low (~300 messages per second) throughput demand suffers from this issue.
More specifically, setting the streaming window to `10ms` with a trigger every `20ms` and a downsampling rate of $16$ (i.e. keep one out of every 16 samples) will produce incomplete waveforms. Assuming a sampling rate of $10^6$ samples per second, this parameter set has a throughput demand of
$ 10^6 times 1/16 times 1 / 2 times 1/ 32 = 976.5625 #text[packets per second] $
which is completely within the firmware performance limit of `~1400 pps`. Thus it should be a scheduling issue limiting us from getting complete traces.
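The back-of-envelope calculation above can be reproduced directly (the $10^6$ sampling rate is the assumption stated above; the final factor of $1/32$ is the number of messages per segment/packet implied by the formula):

```python
# Back-of-envelope check of the throughput demand quoted above.
sample_rate = 1e6       # samples per second (assumed)
downsample = 16         # keep one sample out of every 16
duty = 10 / 20          # 10 ms streaming window per 20 ms trigger period
msgs_per_packet = 32    # messages per segment/UDP packet (implied by the formula)

pps = sample_rate / downsample * duty / msgs_per_packet
assert pps == 976.5625  # well under the ~1400 pps firmware limit
```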
- A very simple modification is to disable the streamer when the user doesn't include the `streaming_publisher` key in their config.
- Similar to `artiq_streamer`'s ability to set the streaming window, the capability to trigger on the fly could also be implemented.
== Gateware
- The `MultiSampleTap` will lose samples if multiple samples arrive at the same time.
- The `message_count` in `DMAWriter` has a hardcoded relation regarding whether `message_len` or `cpu_dw` is longer, which is not robust.
- A gateware-based consumer would be desirable, as it should significantly improve performance. Gateware-based UDP has already been implemented in the latest version of #link("https://github.com/enjoy-digital/liteeth")[LiteETH].
== Dashboard
- The plotting code in `artiq/dashboard/streamer.py` has bad performance, probably due to autoscaling on every refresh. In particular, the function ```python on_dump_receive(self, dump)``` should be fixed.
- The plotting can crash the dashboard, and updates have significant delay when the streaming window is small (i.e. the dashboard refresh rate is high).
- The publisher address should not be hardcoded.
= Some Measurements
#figure(
image("imgs/wireshark-udp-1-byte.png"),
caption: [UDP throughput of Kasli over Ethernet with 1 byte of payload.]
)
#figure(
image("imgs/wireshark-udp-1024-bytes.png"),
caption: [UDP throughput of Kasli over Ethernet with 1024 bytes of payload.]
)
#figure(
image("imgs/maximum-streaming-throughput.png"),
caption: [Streaming throughput of Kasli over Ethernet with 1024 bytes long segment/packet.]
)
|
https://github.com/tinnamchoi/resumes | https://raw.githubusercontent.com/tinnamchoi/resumes/master/src/preamble.typ | typst | #import "template/template.typ": *
#let name = "<NAME>"
#let links = (
github: "tinnamchoi",
email: "<EMAIL>",
website: "tinnamchoi.github.io",
linkedin: "tinnamchoi",
)
|
https://github.com/EunTilofy/Compiler2024 | https://raw.githubusercontent.com/EunTilofy/Compiler2024/main/lab3/Report_of_Lab3.typ | typst | #import "../template.typ": *
#show: project.with(
course: "编译原理",
title: "Compilers Principals - Lab3",
date: "2024.5.11",
authors: "<NAME>, 3210106357",
has_cover: false
)
= Lab Overview
In this lab, building on the semantic analysis from lab2,
we implemented the translation of the SysY language into intermediate code.
Starting from the previously constructed syntax tree and symbol table, we recursively generate the corresponding intermediate code for each node of the syntax tree. A variable is uniquely identified by its name together with its position in the symbol table, which resolves conflicts between identically named variables. Via:
```
make compiler
./compiler <input file> [output file]
```
one can check the syntax and semantics of the input sy file.
If the syntax tree is parsed correctly and the type and array checks pass, the program exits normally with return value 0 and writes the generated intermediate code to output file (ir.out if not specified),
while printing to the error stream:
```
Parse success!
```
Otherwise, the program reports errors; parsing an erroneous program produces output such as:
```
DEBUG: type error at src/semantic.hpp:219
DEBUG: type error at src/semantic.hpp:105
DEBUG: type error at src/semantic.hpp:39
```
The error messages indicate where in the source program the semantic analysis failed.
We have not implemented user-facing error messages here; these are only for our own debugging.
= Implementation
== Main Entry Point
On top of lab2, main.cc adds an intermediate code generation stage:
```cpp
string IR_OUT = "ir.out";
if(argc >= 3) IR_OUT = string(argv[2]);
IR ir(&checker, Root);
ofstream ir_out(IR_OUT);
ir.print(ir_out);
std::cerr << "\nParse success !" << std::endl;
```
== Intermediate Code Representation
=== Class IR
We treat a single line as the smallest unit of intermediate code. Class IR stores a segment of intermediate code, which may contain one or more lines. Its data members (member functions omitted) are defined as follows:
```cpp
string type;
unique_ptr<IR_info> info;
vector<IR> child;
IR() : info(nullptr) { type = "NULL"; }
IR(IR_info *o) : type(o->type), info(unique_ptr<IR_info>(o)) { }
IR(const IR& o) : type(o.type), info(o.info ? o.info->clone() : nullptr), child(o.child) {}
```
Here, type denotes the kind of this code segment; if the segment spans multiple lines, the type is "NULL". child holds the intermediate code of each line, and info carries the concrete information of a single line (nullptr for multi-line segments). We store this information behind a pointer and maintain deep-copy semantics for it.
=== Class IR_info
IR_info describes the concrete information needed when printing one line of code. It declares a virtual function print():
```cpp
virtual void print(ostream &OUT) const;
```
We define a separate subclass with its own print function for each kind of intermediate code. For example, for an assignment whose right-hand side is a binary operation:
```cpp
class info_assign_binary : public IR_info
{
public:
IR_info* clone() const { return new info_assign_binary(*this); }
string lv, v1, op, v2;
info_assign_binary(string lv, string v1, string op, string v2) : lv(lv), v1(v1), op(op), v2(v2) { IR_info::type = "ASSIGN"; }
void print(ostream &OUT) const { OUT << lv << " = " << (v1+" "+upd_op(op)+" "+v2) << "\n"; }
};
```
Therefore, printing the intermediate code only requires starting from the IR root node and invoking the print function of each child in turn.
== Intermediate Code Generation
=== Naming Conventions
Temporary variables, global and local variables from the source code, and labels in the intermediate code are named according to the following functions:
```cpp
string get_label() { static int tot = 0; ++tot; return "L"+to_string(tot); }
string get_tmp() { static int tot = 0; ++tot; return "irt3mP"+to_string(tot); }
string get_name(Node* o) { return "irVar_xxx"+to_string(abs(o->val))+"_"+o->text; }
```
=== Selected Implementation Details
- Variable declarations

Global and local variables translate into different intermediate code, so an extra parameter \_inline is passed to indicate whether the variable is local.
```cpp
IR from_decl(Node *o, bool _inline = 0);
```
- IR node construction

The following macros are used to create a new IR node:
```cpp
#define INFO(type, ...) (new info_##type(__VA_ARGS__))
#define _IR(type, ...) IR(new info_##type(__VA_ARGS__))
```
This makes it much more convenient to define an IR code segment; for example, for an IfElse node:
```cpp
string l1 = get_label(), l2 = get_label(), l3 = get_label();
rt._with(from_condition(o->child[0], l1, l2), _IR(label, l1),
from_stmt(o->child[1]), _IR(_goto, l3),
_IR(label, l2), from_stmt(o->child[2]), _IR(label, l3));
```
- exp nodes

An expression must be staged in a temporary variable, so when translating an exp node we also pass a parameter naming the variable in which the translated code stores its result:
```cpp
IR from_exp(Node* o, string tmp);
```
- Variable accesses

For array variables and global variables, an access first takes the address and then Loads the value stored there, whereas local variables can be assigned directly with an ordinary Assign statement.
```cpp
if(/*It is a local variable*/)
rt.merge(_IR(assign, tmp, get_name(id)));
else
{
string t = get_tmp();
rt._with(_IR(assign_addr, t, get_name(id)), _IR(assign_load, tmp, t));
}
```
= Test Results
```
python3 test.py ./compiler lab3 -l
```
All test cases under tests pass:
#figure(
image("1.png", width: 45%),
caption: [
All tests passed!
],
)
https://github.com/dark-flames/resume | https://raw.githubusercontent.com/dark-flames/resume/main/main.typ | typst | MIT License | #import "resume.typ": *
#resume((
x-lang: sys.inputs.at("x-lang", default: "en"),
x-version: sys.inputs.at("x-version", default: "resume")
))
https://github.com/metamuffin/typst | https://raw.githubusercontent.com/metamuffin/typst/main/tests/typ/layout/par-justify.typ | typst | Apache License 2.0 |
---
#set page(width: 180pt)
#set block(spacing: 5pt)
#set par(justify: true, first-line-indent: 14pt, leading: 5pt)
This text is justified, meaning that spaces are stretched so that the text
forms a "block" with flush edges at both sides.
First line indents and hyphenation play nicely with justified text.
---
// Test that lines with hard breaks aren't justified.
#set par(justify: true)
A B C \
D
---
// Test forced justification with justified break.
A B C #linebreak(justify: true)
D E F #linebreak(justify: true)
---
// Test that there are no hiccups with justification enabled and
// basically empty paragraph.
#set par(justify: true)
#""
---
// Test that the last line can be shrunk
#set page(width: 155pt)
#set par(justify: true)
This text can be fitted in one line.
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/057%20-%20Bloomburrow/004_Episode%204%3A%20Soothsaying%20and%20Stormcalling.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"Episode 4: Soothsaying and Stormcalling",
set_name: "Bloomburrow",
story_date: datetime(day: 05, month: 07, year: 2024),
author: "<NAME>",
doc
)
#strong[Helga]
Bruised and exhausted from being tossed about in the Long River, then crawling ashore to find a temporary refuge in a cavern full of bones, Helga could only stare blankly at the circle of weapons and ratfolk surrounding her and the others. She'd nearly died for the third time in as many days, and now she was being threatened yet again. Any spark of defiance or fear had been snuffed out, leaving only ashes.
"You may not have noticed," she said, "but we had little control over our arrival here, wherever we are."
"Irrelevant," the ratfolk snapped, brandishing his dagger. "You will leave, or your corpses will feed the beetles."
Hugs rolled to his feet with surprising grace, growling. Finneas tensed as if readying to leap away, while Gev's eyes flared orange.
"Let's not be hasty," Mabel said, holding up her empty paws.
"I'm prepared to engage in some targeted haste," Ral said, lightning twining around his bracer.
Shuffling footsteps approached from the mouth of the cavern. The ratfolk stepped backward, revealing a wizened figure in a shell-adorned cloak. His fur, if it had ever been darker, had faded to a pale gray, and he leaned on a snail-handled cane, moving slowly as if his bones ached. Mismatched eyes regarded Helga from beneath his hood, one black and the other red.
"These strangers are not like the others," he said, a slight wheeze in his soft voice. "The River brought them to us. We must offer them hospitality."
They sheathed their weapons, and one of the ratfolk offered Helga a paw to help her rise. She hesitated, then took it.
"Come," the elder said. "I am Coffey, and you must rest and eat before you proceed. Perhaps we might even assist in your quest. Yes?"
Mabel's whiskers twitched, but her posture relaxed. "We'd be most grateful."
Coffey gestured with his cane, and a ribbed mussel shell was shifted to the side, revealing the opening to a tunnel large enough for Hugs to fit inside without hunching. The scents of mildew and mushrooms emanated from the darkness; Helga reconsidered whether following these strangers might not be a grave mistake.
"Seriously?" Ral thumped his tail against the mud, then winced and glared at the appendage as if it had offended him. Still, he fell in behind Mabel.
Matching Coffey's pace, the ratfolk guided them through a maze of limestone, hints of shell patterns, curves and whorls sculpting grim histories on the pockmarked surface. The walls bore no obvious signs that the ratfolk had carved them, though the floors seemed worn smooth and slightly concave by the passage of countless feet over time. Enchanted lights in woven sconces brightened as they approached and faded as they passed, occasionally revealing side tunnels or unadorned rooms lined with bare beds.
"What is this place?" Ral asked.
"Our village," the elder said. "These were once ancient burrows, created by enormous insects from ages gone, now long forgotten. By all but us, of course."
Helga shivered, not merely because the air cooled as they descended deeper into the rock.
"What do you do here?" Mabel asked.
"We're lorekeepers," Coffey replied, breathing like a leaky bellows. "We preserve the history of Bloomburrow and Valley within it. Every tale we find, every scrap of legend that washes up on our shores, we store in our ossuary for generations to come. The past defines the present, and thus the future."
After enough turns that Helga had no notion how to get out, they reached a wooden door, which Coffey opened. He gestured everyone inside as their ratfolk escort silently returned to whatever hidey-holes they'd emerged from.
To Helga's surprise, the room was spacious yet homey. A light basket hung from the ceiling, filled with faintly glowing pearls. Comfortable-looking chairs sat near a wall entirely taken up with shelves, all the way to the ceiling, some accessible only by ascending a spiral staircase. Rows and stacks of books shared space with assorted knickknacks—iridescent blue beetle-shell sculptures, a box covered in mothwing scales, a painted driftwood half-mask. Rugs and pillows softened the stone floors, and an ironsap stove warmed the room, the peat inside apparently spelled to burn without smoke. Scents of seaweed and wet soil mingled with freshly steeped chamomile tea. An umber molefolk stood over the teapot, wearing a quilted jacket patched at the elbows, a pair of spectacles perched on his nose.
Coffey eased into one of the chairs, resting his cane nearby. A small pill bug uncurled itself from a basket in the corner and scuttled over to him, settling at his feet.
"Would anyone care for a drink?" Coffey asked.
"Yes, please," Mabel said. "For all of us, I think."
Coffey inclined his head at the molefolk. "Tucker, if you'd oblige us."
Tucker produced a motley collection of mugs and served the party, then set to slicing pieces of seedcake. Helga gratefully sipped her tea, relaxing into the pillow she'd claimed. Her limbs ached from clinging to Hugs for so long, and she sported new bruises from detritus striking her in the tumultuous river. No doubt the others were in similar condition, with the badgerfolk likely feeling the worst of it. All of them sat, too, except for Ral, who perused book spines, occasionally pulling one out and peering at the contents.
"My apologies for the hostile greeting," Coffey said. "We have few visitors, and you're the second group of strangers to pass through in so many days. The first, alas, caused a stir, and our guard has been raised ever since."
"What strangers?" Mabel asked, leaning forward, her dark eyes keen as her blade.
"Mercenaries." He raised his cane and drew circles in the air, a spiral of blue-violet magic rising from the shell handle. At its center, a shimmering image took shape: a weaselfolk wearing a hooded red tailcoat, one gloved paw brandishing a rapier. A wicked scar covered his right eye, three diagonal lines as if a claw had raked his face.
"This is Cruelclaw," Coffey continued. "He led his band through the swamp, pilfering what supplies he pleased, the Night Owl wreaking havoc in his wake."
"We're looking for him, too!" Helga exclaimed, then covered her mouth with her hands. Outbursts weren't polite, as her parents had often reminded her.
Mabel didn't scold. "We found two of his necromancers in a village devastated by the Night Owl. They seemed to know something about the why of it, so we followed them, but lost their trail when the docklands of Three Tree City were set upon by the Flood Gar."
"The Flood Gar?" Tucker squeaked, spilling his tea. Mabel rose and helped him sop it up with a cloth.
"Cruelclaw must be stopped," Coffey intoned, "before more Calamity Beasts join this madness."
"But how could he cause these attacks?" Helga asked, bewildered. "No one has power over the Calamity Beasts, not even the great weavers from the time of the Order of the Holly Leaf."
"I believe Cruelclaw stole something that may give him, or whoever he's working for, such power." Coffey closed his eyes as if pained. "Our scouts reported that he had in his possession a Calamity Beast egg."
Helga gasped, feeling as if all her blood had drained out like water from a cracked teapot.
"The egg is … important?" Ral asked, his gray-blue eyes narrowed below the goggles perched on his forehead.
"From it," Tucker said, "a new Calamity Beast will someday be born. Who knows what magical potential exists in such a thing?"
"Could be none at all," Mabel mused, retrieving her tea and sitting. "The egg may only be valued for the creature that will come from it. We must find whoever is giving Cruelclaw his orders."
"And if you do?" Coffey asked, his mismatched eyes like coals, one lit and one dark. "What then?"
Mabel's voice was resolute. "Then we return the egg to its rightful place."
Finneas leaped to his feet. "Mabel, no! What could you be thinking? I know Oliver went on about you being a hero and all, but this takes it a bit far. I'm just a farmer, and Zoraline is a local cleric, and who even knows about Gev and Hugs—"
"Excuse you?" Gev hissed, his tail brightening. "The Striped Rapscallions fear no mercenaries or Calamity Beasts."
Ral choked on a laugh, and Mabel sent him a quelling look. Zoraline stirred, stretching her wings and peering around the room upside down.
"The theft of the egg has upset the balance of the world," she said. "The music of the stars will remain discordant unless we can restore harmony."
Helga wasn't sure precisely what that meant, but she agreed that no one should be wandering about with a Calamity Beast egg. Especially not if it meant the Night Owl was chasing them, sowing chaos. But like Finneas, she wondered if she could really do anything to help.
"For what it's worth," Ral said, "I don't know any of you, but so long as Helga is my only link to Beleren, I'll be sure to keep you all alive. Even if that means fighting an owl or a giant fish or whatever." He paused, then added as if to himself, "Maybe I'll even set up a relay tower here."
Helga wanted to protest that she knew no more than she'd already told him, but she held her peace.
Mabel put her mug down on the floor and stood. "Our knowledge of this problem makes it our responsibility. If we don't find the egg, who knows what harm might come to Valley? To our families and friends? To strangers who are themselves someone's family, someone's friend?" Her gaze swept the room, lighting on each of them in turn. "You're brave, and clever, and quick, and strong, and kind. We've come this far, worked together, fought together, and we can solve this problem together, too."
Helga thought of her failed cantrip, nearly spilling Mabel to the ground in the heat of battle. Huddling in a heap as everyone around her fought. She wasn't brave, or clever, or quick, or strong. #emph[Was she kind? ] She hoped so,#emph[ but what could kindness accomplish?]
"Even if we want to stop Cruelclaw," Finneas protested, "we have no notion of where he is, or where he's going. We can't foil him if we can't find him."
"Helga may be able to solve that problem," Mabel said.
"Me?" Helga pressed a hand to her chest. "What can I do?"
"Use your augury skills to find Cruelclaw."
Protests dying unspoken, Helga favored Mabel with a nervous smile. No one had ever asked her to do such a thing. Almost no one even believed she could. Her parents, her siblings, her neighbors … only her grandparents had supported her, and she'd always suspected they were humoring her out of love.
Mabel trusted her, though. Mabel believed in her. And hadn't Ral, a complete stranger, seen enough truth in her drawings to pledge himself as her protector?
"I can try," Helga said slowly. "I'll need a large bowl filled with water."
Tucker produced a pitcher and basin formed from a polished snail's shell, resting it on the floor near Helga. He poured fresh water from the one into the other, spilling not a drop. Coffey gestured, and the light pearls in the basket dimmed, casting only the faintest glimmer across the basin's pale pink surface.
Helga opened her waterproof pack and pulled out her journal, relieved it hadn't been soaked to ruin by the Long River. She turned to the most recent drawing, of the strange Hawk, made just before the Night Owl's attack. #emph[Were they connected somehow?]
"What is that?" Ral asked, peering over her shoulder.
"I'm not certain," she replied. "I've never seen anything like it."
"There's something familiar about the head and wings," he murmured. "It'll come to me. I suppose I should leave you to your business."
Helga found a blank page, set pencil to paper. A hush fell over the room, the loudest sound Coffey's gentle wheeze. She stared into the water, clear and still, and tried to calm her mind, to reach for the place within her whence visions sprung.
Nothing happened.
Distractions inundated her. The shelves thrown into shadow, the scents of peat and soggy clothing, the rustle of fabric, the twitch of ear or whisker. A headache formed behind her left eye as she pressed the point of her pencil into the page. She was trying too hard, she knew. She must relax. Only, this needed to work. So much depended on her seeing some clue to Cruelclaw's location or destination. If she failed again, they would all fail. #emph[And then what? More Night Owl attacks? More villages destroyed? Something worse that she couldn't even imagine?] Her chest tightened and her breathing came in furtive sips, and still the water showed her nothing.
Zoraline startled her with a gentle touch. "The light is inside you," the batfolk murmured. "You need not force it to glow; you need only uncover it."
Helga almost reflexively dismissed the cryptic words, but instead she made herself consider them. Truly, her augury had never been something she could compel like other weavers did. She couldn't control when her attention focused rather than scattered like dropped beads. What she could do was the same as she'd done that dark day at the side of the pond: sit with her journal and doodle.
With a deep, slow breath, she loosened her grip on her pencil. Drew an aimless spiral. Turned it into a snail's shell. Slid her gaze back to the bowl. The color wasn't uniform, nor was the surface perfectly smooth, but it had clearly been polished. She'd seen soapworts that pale, though they tended toward a darker pink. Had they started blooming yet? If not now, then soon …
Time blurred like wet ink. Someone tugged Helga's journal from her hand. She lifted her gaze to Mabel, who turned the book around to show everyone—what? Helga blinked, dazed, until she saw her own handiwork.
A massive, three-tiered fountain rose from a pond dotted in lily-pads, a graceful spire of water at its peak. Helga fancied herself a decent artist, but even if her hasty sketch had been less capably executed, anyone in Valley could easily recognize what she'd rendered.
"They've gone to Fountainport," Helga whispered.
"And so shall we," Mabel said. "Well done."
Others echoed that sentiment. Helga only wished her flush of triumph wasn't tainted by the knowledge that they would be returning to the domain of <NAME> and the site of her greatest failure.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
#strong[Mabel]
Having profusely thanked Coffey and Tucker for their hospitality, the party took their leave to resume their journey. Loath as Mabel was to continue rather than resting longer in the comfortable den, Cruelclaw and his mercenaries had not only a head start, but shoulders and chest.
"We wish you every success," Coffey told her. "We had sent some of our own to tail your quarry. If you meet with them, perhaps you can assist each other."
Mabel would welcome the help. She'd no notion of how many mercenaries were in Cruelclaw's band, but the necromancers were an army unto themselves.
Tucker led them through a different set of tunnels, judging by the scents—more grass than silt—and eventually they emerged into late afternoon sunshine at the edge of the marshland. A forest of juniper and oak trees rose in the distance to the north and west, while to the east, craggy boulders loomed like discarded children's toys, the gaps between them dotted with clusters of red ferns and firethorn.
"Which way is Fountainport?" Mabel asked.
"East-northeast," Tucker replied, regarding her over the tops of his spectacles. "You could go north, then east, find some villages in the woods to shelter and provision you on the way. Only a few lizardfolk homes in the hills if you take that route, though the way is straighter as the birdfolk flies."
Mabel scrutinized her star map. Once Zoraline awoke for the night, she could assist, but until then …
"We know a shortcut," Hugs rumbled.
"The dandelion field?" Gev asked. Hugs inclined his head, and the lizardfolk hissed a sigh.
"Is something amiss with this shortcut?" Mabel asked.
"Full of hedge parsley," Gev said. "It takes an age to pick the so-sticky seeds off Hugs's fur."
Sticky seeds were a simple problem compared to what they'd faced thus far. Unfortunately, Tucker chimed in as well.
"That path could be dangerous at present," the molefolk said. "A great storm some days past brought with it a terrible creature. Very dangerous."
"What kind of creature?" Finneas asked, gripping his bow, ears angling backward.
Tucker adjusted his spectacles. "I haven't seen it myself. I'm told it's not a Calamity Beast, but it's very like one. We have no record of anything similar in our histories."
Helga croaked nervously.
"Perhaps it's from another plane," Ral muttered.
"Another what?" Finneas asked.
"Nothing."
Mabel regarded the otterfolk curiously, but it wasn't the time to pry. "If the field is faster, then the field it shall be. We can always hew east or north as necessary."
Tucker stood at the entrance to the tunnels, receding behind them as they walked away. Helga waved one last time, and he returned the gesture solemnly.
The sun hadn't moved substantially across the sky before they reached the dandelion field Hugs promised. He and Gev led the way, pushing between flower stalks interspersed with wiry grasses and the more delicate blossoms of the hedge parsley. Bright yellow petals nodded in the breeze, white puffs occasionally bursting from a stronger gust or the brush of Hugs's shoulder, sending their seeds floating across the landscape. There was no sign of the mercenaries, nor of any storms, the only clouds wispy as candle smoke.
Eventually, the silence seemed to unnerve Finneas enough to outweigh his dismals. He began to loose questions like arrows at Ral, albeit with more forced cheer than he'd shown with Helga.
"May I ask where you hail from?" Finneas asked.
"Far away," Ral replied.
"The Outer Woods?"
"Farther."
Finneas skipped around a pebble. "Any family waiting for you back home?"
For some reason, that question gave Ral pause. "My husband, Tomik," he said brusquely. He whacked a flower with his tail, seemingly on purpose rather than by accident, and whatever question Finneas might have aimed next stayed in his quiver. He sped up to walk just behind Hugs, while Ral lagged, putting distance between himself and the others.
Mabel matched her pace to Ral's, rosettes of dandelion leaves rustling beneath their feet. A ladybug crawled up a stem of grass, then flew away in a hum of wings. A line of ants marched past a mound of earthworm castings, on their own inscrutable mission. Idly, Mabel pulled out the wedge of seedcake Tucker had pressed on her before they departed, breaking off a piece and tasting it. Delicious, with just the right amount of caraway. She offered some to Ral, whose nose wrinkled.
"It's good," Mabel said. "I would know. I'm a baker by trade."
"Are you?" Ral asked incredulously, eyeing her sword. "What are you doing all the way out here, then, looking for trouble?"
"Why are you so far from home searching for your friend?"
"Touché." Ral took the bit of seedcake and tossed it in his mouth.
"I miss my husband," Mabel said. "And my littles. How they smell, their voices, their sweet hugs …"
Ral was quiet for a time, then said, "I miss my husband, too." He sounded almost surprised by the admission as he absently rubbed the white cloth tied to his wrist. "I've never really had anyone to miss before. And I've been so focused on finding Beleren that I've been able to ignore it."
Mabel patted his arm. "I'm sure you'll be back together in no time at all. You'll appreciate him even more for the absence, and he you."
"I wish I had your optimism," Ral muttered. "Beleren is slippery as a damn eel, and I have no idea what he's planning." He struck a dandelion stem with his bracer, sending the seeds flying.
Mabel didn't know what an eel was, and her fur briefly rose as she had the oddest sense of something immense and inscrutable beyond her ken, like the stars fixed in the firmament. #emph[Should she be worried about this Beleren and his allies, or even Ral himself?]
Gev appeared between them as if he'd always been there. "I miss my long-ago home in the Valley's rim. So warm, the stones there. Though not as warm as the Ever-Burning Oak."
"You've been to the Ever-Burning Oak?" Helga asked.
"But of course," Gev said, bobbing his head rapidly. "The Striped Rapscallions have roamed over all of Bloomburrow."
Whatever else he might have said was lost as a vast shadow passed over them. Mabel's sword was in her paw a heartbeat after as she scanned the sky. Beside her, Ral tensed, lightning sparking in his blue-gray eyes as he pulled down his goggles. The others, too, halted their procession.
"Whatever that was," Ral said grimly, "it's summoning a storm."
True to his assertion, dark clouds massed above. Instead of rolling in like a plow crossing a field, they swirled like water circling a drain, their center difficult to discern. The wind picked up, blowing hard enough that the tops of the flowers bowed. With her line of sight less obstructed, Mabel finally saw the source of the wild magic.
A giant creature hovered in the air, purple lightning crackling across its body. It resembled the Sun Hawk in that both were birdlike. This monster, however, had four wings instead of two, the ends webbed like a batfolk's, a crest on its head like some she'd seen on lizardfolk. The feathers on its back were mud-colored, as were its coverts, but the primaries and secondaries were white, the tail striped. Its sharp talons looked large enough to carry off even Hugs with little difficulty.
#figure(image("004_Episode 4: Soothsaying and Stormcalling/01.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"What is that thing?" Finneas whispered, his voice trembling as he hugged the ground.
"It's a dragon," Ral whispered back. "Animal-shifted, which I suppose is to be expected."
"What are you talking about?" Helga asked.
"Hush and listen," Ral replied, quiet but intense. "That creature is not from Bloomburrow. It's extremely dangerous, like your Calamity Beasts. If we attract its attention, we'll be in immense trouble."
Of that, Mabel had no doubt. "We'll continue, low and slow as poured treacle. Hide anything reflective. No talking. Gev, take the lead. Everyone else, stay close to Hugs. I'll bring up the rear."
Gev acknowledged the order with a blink of his inner eyelids, then vanished into the green. Hugs's bulky form was harder to hide, but he proceeded at a pace that made snails seem quick. Mabel hoped that, from above, he resembled a small boulder or similar, nothing that would interest the dragon-hawk. Finneas crouched next to him, ears pinned back, and even Helga moved with surprising stealth. Ral kept his bracer tucked against his chest so it wouldn't catch the light, and Mabel sheathed her sword.
Despite the care he took, Hugs struggled to remain unobtrusive. Some dandelion puffs, pressed sideways by every gust, struck him and burst, their seeds careening into the air and leaving a trail. Mabel hoped against hope it might provide cover for their movement rather than making their location more obvious to the creature, which continued to circle in the darkening sky.
A dandelion hit Hugs's shoulder, the white tufts brushing Zoraline's face. To Mabel's horror, the sleeping batfolk sneezed awake, the sound jarringly loud amid the shushing of the wind-blown flowers and grass. Everyone froze. Zoraline rubbed her nose and stretched her wings, peering around in confusion.
"Where are we?" Zoraline asked. "Why is everyone so quiet?"
Gev slapped both hands over Zoraline's mouth, but the damage was done.
The dragon-hawk's cruelly curved beak opened, revealing a long, forked purple tongue. Lightning flashed and crackled in that cavernous maw. It screamed, surprisingly deep and harsh, more roar than shriek. It swooped toward the party, who alternately ducked or scattered. Talons sharp as death closed over empty air just above Hugs, and the creature flew up and away for another pass. The coiling clouds of the storm thickened like gravy, obscuring the sun and stealing its warmth from the air.
Ral slid next to Mabel and bared his teeth. "Lucky for you, storms are my purview. Unfortunately, dragons aren't, and I'm not sure how well my magic works here."
"Any help you might provide would be welcome," Mabel said. "For all our sakes."
"I might not be able to kill it, but I think I can make it unhappy." Ral tapped his bracer. "I need a lightning rod—something to conduct electricity."
Finneas pulled an arrow from his satchel wound with copper wire. "I'll give it something to chew on."
Ral nodded. "Tell everyone to run on my signal."
The dragon-hawk dove again, swerving away from Hugs as Finneas hit it in the mouth with the copper arrow. It veered up and away, snarling in annoyance—the arrow was stuck fast. Meanwhile, Mabel searched for a shelter that might protect them; the forest was well to the northwest, but the eastern hills remained near enough to reach with a long, hard run.
Another pass, and this time Zoraline vocalized a haunting glissando that seemed to scramble the creature's senses. It reeled as if dizzy, climbing back into the air, the arrow still lodged next to that curving beak.
Ral's blue-gray eyes crackled with power that rippled along the length of his black fur, collecting in his bracer. Up in the whirling clouds, a flash of light was chased by an ominous rumble.
#figure(image("004_Episode 4: Soothsaying and Stormcalling/02.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"You like a little lightning, don't you? Well, let's check your battery capacity." Ral raised his bracer-clad arm and shouted, #emph["] #emph[Run!"]
"To me!" Mabel darted toward the boulders, ensuring the others followed. Gev quickly outpaced her, Finneas close behind. Hugs's relative slowness, encumbered as he was by Zoraline, was mitigated by his longer stride. He kept close to Helga, who gripped a wand in one hand as if it were a protective talisman.
A barrage of lightning ripped through the blanket of gray above, so bright it might have been a second sun. Mabel resisted the urge to simply stand and watch in awe at the sheer power of Ral's magic. The wild, natural energy tore from the clouds, striking the head of the dragon-hawk before coalescing on the otterfolk. She wasn't certain, but she thought she heard Ral's laughter beneath the overwhelming drumbeats of thunder.
The dragon-hawk writhed and roared, eyes sparking wildly, a storm of energy under the surface of its skin—but it didn't fall. Mabel grimaced; #emph[would it retaliate?]
No. Its four wings beat the air in a rush of purple-tinged power, and soon it disappeared into the bank of clouds that still lay thick across the once blue canvas of afternoon. Even so, Mabel continued to run, as fast as her weary legs could carry her, toward the boulders in the foothills.
After a handful of tense minutes that felt like an hour, everyone but Ral huddled beneath a vast stone tipped diagonally like a stray book on a shelf. The skies opened, scattering rain. Murmuring softly, Helga raised her wand, which glowed a faint blue at its beaded tip. The falling drops solidified into a thin barrier perpendicular to the rock, deflecting the worst of the wet. Mabel squinted in the direction from which they'd fled.
The curtain of water eventually parted to reveal Ral, trudging toward them. None of the fluid seemed to touch him, as if he, like Helga, could bend it away to keep himself dry.
Before he reached the shelter, he waved a paw at the sky as if brushing away gnats. The rain lessened from a steady rush to a light mist. Clouds thinned and parted, a shaft of sunlight stealing through. Of the dragon-hawk, no sign remained but the labored breathing of tired animalfolk.
"That sure was some fancy weaving," Finneas said, his ears slowly rising and angling forward.
Ral nodded at him. "Good shot, yourself."
Gev, clinging to the underside of the stone, licked his snout indignantly. "And where is my compliment for my so-excellent stealth and not attracting the monster's attention, hmm?"
"You did a wonderful job, Gev," Mabel said.
The lizardfolk raised his lower eyelids but seemed mollified. "It is as well, because now I have the job to remove the so-many stickers from the fur of my friend Hugs."
Hugs snorted, whether in amusement or derision, Mabel didn't know. A laugh swelled in her belly like a yeasted bun, sweet and light.
Soon, the rain subsided entirely, and Helga dismissed her cantrip like a popped bubble. The scent of petrichor permeated the landscape, soothing despite its promise of an impending trek across muddy earth. Still, they were alive and whole; any amount of accumulated muck paled in comparison to the dark depths of what might have been.
"On we go, then," Mabel said, brushing moisture from her cloak. "Fountainport awaits."
A faint whistling sound drew everyone's attention to Hugs's back. Zoraline snored softly, not a care in the world, a wispy dandelion seed stuck to the outside of her ear.
|
|
https://github.com/Nerixyz/icu-typ | https://raw.githubusercontent.com/Nerixyz/icu-typ/main/docs/docs/fmt-date.md | markdown | MIT License | # `fmt-date`
```typst-code
let fmt-date(
dt,
locale: "en",
length: "full"
)
```
Formats a date in some [`locale`](#locale). Dates are assumed to be ISO dates.
## Arguments
### `dt`
The date to format. This can be a [`datetime`][datetime] or a dictionary with `year`, `month`, `day`.
example{
```typst +preview
#fmt-date(datetime(
year: 2024,
month: 5,
day: 31,
)) \
#fmt-date(( // (1)!
year: 2020,
month: 8,
day: 14,
), locale: "fr")
```
1. Date passed as a dictionary
}example
### `locale`
The locale to use when formatting the date. A [Unicode Locale Identifier]. Notably, this can be used to set the calendar by setting `ca` to a [bcp47 calendar name](https://github.com/unicode-org/cldr/blob/main/common/bcp47/calendar.xml).
example{
```typst +preview(vertical) linenums="1"
#let date = datetime(
year: 2024,
month: 5,
day: 31,
)
#set enum(start: 8)
+ #fmt-date(date, locale: "en")
+ #fmt-date(date, locale: "ka")
+ #fmt-date(date, locale: "en-u-ca-buddhist")
+ #fmt-date(date, locale: "en-u-ca-chinese")
+ #fmt-date(date, locale: "en-u-ca-coptic")
+ #fmt-date(date, locale: "en-u-ca-dangi")
+ #fmt-date(date, locale: "en-u-ca-ethioaa")
+ #fmt-date(date, locale: "en-u-ca-hebrew")
+ #fmt-date(date, locale: "en-u-ca-indian")
+ #fmt-date(date, locale: "en-u-ca-islamic")
+ #fmt-date(date, locale: "en-u-ca-iso8601")
+ #fmt-date(date, locale: "en-u-ca-japanese")
+ #fmt-date(date, locale: "en-u-ca-persian")
+ #fmt-date(date, locale: "en-u-ca-roc")
```
}example
### `length`
The length of the formatted date (`#!typst-code "full"` (default), `#!typst-code "long"`, `#!typst-code "medium"`, or `#!typst-code "short"`).
example{
```typst +preview(vertical)
#let date = datetime(
year: 2024,
month: 5,
day: 31,
)
*Full*
- #fmt-date(date, length: "full")
- #fmt-date(date, length: "full", locale: "is")
- #fmt-date(date, length: "full", locale: "hi")
*Long*
- #fmt-date(date, length: "long")
- #fmt-date(date, length: "long", locale: "lo")
- #fmt-date(date, length: "long", locale: "pt")
*Medium*
- #fmt-date(date, length: "medium")
- #fmt-date(date, length: "medium", locale: "mn")
- #fmt-date(date, length: "medium", locale: "hy")
*Short*
- #fmt-date(date, length: "short")
- #fmt-date(date, length: "short", locale: "es")
- #fmt-date(date, length: "short", locale: "sk")
```
}example
[datetime]: https://typst.app/docs/reference/foundations/datetime/
[Unicode Locale Identifier]: https://unicode.org/reports/tr35/tr35.html#Unicode_locale_identifier
|
https://github.com/kotfind/hse-se-2-notes | https://raw.githubusercontent.com/kotfind/hse-se-2-notes/master/cpp/lectures/2024-09-13.typ | typst | = Introduction
Препод: <NAME>
Оценка:
$ "Итог"
= 0.5 dot (0.1 dot "А" + 0.2 dot "Дз1" + 0.35 dot "Дз2" + 0.35 dot "Дз3")
+ 0.5 dot (0.3 dot "Кр" + 0.7 dot "Экз") $
Будет:
- ООП
- Параллельное и конкурентное программирование
- Функциональное программирование
- Всякое
= Классы
Классы --- исторически первое отличие C++ от C
```cpp
class Matrix {
private:
size_t n_rows_;
size_t n_cols_;
double *data_;
public:
Matrix(size_t n_rows, size_t n_cols);
Matrix(const Matrx& other);
Matrix() = delete; // Явно удаляем default-ный конструктор,
// хотя он, и так, не создается
int rank() const;
size_t n_rows() const { return n_rows_; }
}
int main() {
Matrix m(10, 10);
m.rank();
}
```
"Программы надо писать для людей"
Методы, реализованные внутри объявления класса, часто становятся inline-овыми.
Инкапсуляция --- скрытие внутреннего состояния (private в классах). Позволяет:
+ меньше косячить в программах
+ отделять реализацию от интерфейса
#figure(caption: "Инкапсуляция в Си")[
`public.h`:
```c
typedef void* Matrix;
Matrix matrix_create();
int matrix_rank(Matrix m);
```
`public.h`:
```c
Matrix matrix_create() { ... }
int matrix_rank(Matrix m) {
struct MatrixData* = (MatrixData*)m;
...
}
```
]
const --- после называния метода, значит метод не меняет экземпляр
```cpp
Matrix m; // default-ный конструктор
Matrix m(1, 1); // конструктор
Matrix m(); // объявление функции
Matrix m{}; // default-ный консруктор
Matrix m2 = m; // конструктор копирования
```
Удалять, как создавали:
```cpp
Matrix* pm = new Matrix(1, 1);
Matrix* a = new Matrix[100];
delete pm;
delete[] a;
```
New по уже выделенной памяти:
```cpp
void* addr = malloc(...);
new (addr) Matrix(1, 1);
a.~Matrix();
```
Если у полей нет default-ного конструктора или поля константы или ссылки, то делать так:
```cpp
class X {...};
X::X(int y) : a(y) { ... }
```
Поля инициализирются в том порядке, в котором указаны в классе
`X&&` --- r-value
```cpp
X::X(X&& other) {...}
```
Нельзя перегрузить оператор внутри класса, если первый аргумент другого типа
Правило трех:
- *TODO*
- *TODO*
- *TODO*
Правило пяти:
- .. правило трех
- *TODO*
- *TODO*
Не стоит бросать exception в деструкторе тк exception во время обработки
exception-а --- плохо
exception в конструкторе --- можно
Хорошо делать exception только с типами, унаследованными от std::exception
|
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compiler/array-10.typ | typst | Other | // The the `first` and `last` methods.
#test((1,).first(), 1)
#test((2,).last(), 2)
#test((1, 2, 3).first(), 1)
#test((1, 2, 3).last(), 3)
|
https://github.com/Matt-Tansley/quarto-typst-landscape | https://raw.githubusercontent.com/Matt-Tansley/quarto-typst-landscape/main/_extensions/quarto-typst-landscape/typst-template.typ | typst |
#let PrettyTypst(
// Default document title.
// 'title' in your .qmd file will override this.
title: "quarto-typst-landscape",
// Logo in top right corner.
typst-logo: none,
// The document content.
body
) = {
// Set document metadata.
set document(title: title)
// Configure pages.
set page(
flipped: true, // landscape
paper: "a4",
margin: (left: 2cm, right: 2cm, top: 4cm, bottom: 2cm),
numbering: "1",
number-align: right,
background: place(left + top, rect(
fill: rgb("#E6E6FA"),
height: 3cm,
width: 100%,
))
)
// Set the body font.
set text(10pt, font: "Ubuntu")
// Configure headings.
show heading.where(level: 1): set block(above: 0.6cm, below: 0.6cm)
show heading.where(level: 2): set block(above: 0.6cm, below: 0.6cm)
// Links should be purple.
show link: set text(rgb("#800080"))
// Draw an offset colored frame behind figures.
show figure: it => block({
move(dx: -3%, dy: 1.5%, rect(
fill: rgb("FF7D79"),
inset: 0pt,
move(dx: 3%, dy: -1.5%, it.body)
))
})
// Single-column grid for title and body
grid(
columns: (1fr),
// Title.
pad(bottom: 1cm, text(font: "Ubuntu", 20pt, weight: 800, upper(title))),
// The main body text.
{
set par(justify: true)
body
v(1fr)
},
)
}
|
|
https://github.com/hei-templates/hevs-typsttemplate-thesis | https://raw.githubusercontent.com/hei-templates/hevs-typsttemplate-thesis/main/02-main/06-validation.typ | typst | MIT License | #import "../00-templates/helpers.typ": *
#pagebreak()
= Validation <sec:validation>
#lorem(50)
#minitoc(after:<sec:validation>, before:<sec:conclusion>)
#pagebreak()
== Section 1
#lorem(50)
== Section 2
#lorem(50)
== Discussion
#lorem(50)
|
https://github.com/FrightenedFoxCN/typst-math-chinese | https://raw.githubusercontent.com/FrightenedFoxCN/typst-math-chinese/main/template-en.typ | typst | #import "@preview/showybox:2.0.2": *
#let change_footer_style(content, emphcolor, leading, qed) = if content != none {block(width: 100%, )[
#align(left)[
#set list(marker: (strong[•]))
#set text(
font: ("Times New Roman"),
)
#[
#set text(emphcolor)
#leading
]
#content
]
#if qed {
align(right)[#sym.qed]
}
]} else {
""
}
#let chapter_numbering(..nums) = {
if (nums.pos().len() == 1) {
"Lecture " + numbering("1", ..nums) + ": "
} else {
numbering("1.1.1", ..nums)
}
}
#let appendix_numbering(..nums) = {
if (nums.pos().len() == 1) {
"Appendix " + numbering("A", ..nums) + ": "
} else {
numbering("A.1.1", ..nums)
}
}
#let change_body_style(counter, emphcolor, leading, supplement, heading) = block(width: 100%)[
#if counter != none {
counter.step()
}
#set list(marker: (text(emphcolor)[•]))
#text(
font: ("Times New Roman"),
weight: "bold",
emphcolor)[
#leading
#if counter != none [
#context counter.display()
]
#if supplement != none [
(#supplement)
]
]
#heading
]
// Math environment block
#let mathenv(
  term, // body content
  supplement, // supplementary note (the text inside the parentheses)
  counter, // counter, must be defined in the template
  blockcolor, // block color, i.e. the background color
  emphcolor, // emphasis color, used for the supplement and the numbering
  leading // text written before the number
) = showybox(
frame: (
body-color: blockcolor,
radius: 0pt,
border-color: blockcolor,
),
breakable: true,
change_body_style(counter, emphcolor, leading, supplement, term),
)
// Supplementary math environment block, e.g. the example block
#let compmathenv(
term,
supplement,
counter,
emphcolor,
leading
) = showybox(
frame: (
body-color: white,
radius: 0pt,
border-color: emphcolor,
dash: "dashed",
thickness: (y: 1pt),
inset: (x: 0em, y: 1em)
),
breakable: true,
)[
#set text(
font: ("Times New Roman"),
)
#change_body_style(counter, emphcolor, leading, supplement, term)
]
// Environment block with a proof/solution
#let mathenvWithCompanion(
  heading, // body content
  supplement,
  counter,
  emphcolor,
  blockcolor,
  leading1, // numbering label of the first block
  leading2, // numbering label of the second block
  content, // content of the second block
  qed: false // whether a qed marker is needed
) = showybox(
frame: (
body-color: blockcolor,
radius: 0pt,
border-color: blockcolor,
footer-color: white
),
footer-style: (
color: black
),
breakable: true,
footer: change_footer_style(content, emphcolor, leading2, qed),
change_body_style(counter, emphcolor, leading1, supplement, heading)
)
// Some predefined environment blocks follow
#let defcounter = counter("def")
#let defblockcolor = rgb(220, 227, 248)
#let defemphcolor = rgb(31, 119, 184)
#let def(term, supplement: none, counter: defcounter) = mathenv(term, supplement, defcounter, defblockcolor, defemphcolor, "Definition")
#let rmcounter = counter("rm")
#let rmblockcolor = rgb(255, 237, 193)
#let rmemphcolor = rgb(215, 94, 106)
#let rm(term, supplement: none, counter: rmcounter) = mathenv(term, supplement, rmcounter, rmblockcolor, rmemphcolor, "Remark")
#let conjcounter = counter("conj")
#let conjblockcolor = rgb(255, 213, 206)
#let conjemphcolor = rgb(233, 66, 66)
#let conj(term, supplement: none, counter: conjcounter) = mathenv(term, supplement, conjcounter, conjblockcolor, conjemphcolor, "Conjecture")
#let egcounter = counter("eg")
#let egemphcolor = rgb(130, 110, 217)
#let eg(term, supplement: none, counter: egcounter) = compmathenv(term, supplement, egcounter, egemphcolor, "Example")
#let thmcounter = counter("thm")
#let thmblockcolor = rgb(209, 255, 226)
#let thmemphcolor = rgb(0, 134, 24)
#let thm(heading, proof: none, supplement: none) = mathenvWithCompanion(heading, supplement, thmcounter, thmemphcolor, thmblockcolor, "Theorem", "Pf.", proof, qed: true)
#let coro(heading, proof: none, supplement: none) = mathenvWithCompanion(heading, supplement, thmcounter, thmemphcolor, thmblockcolor, "Corollary", "Pf.", proof, qed: true)
#let prop(heading, proof: none, supplement: none) = mathenvWithCompanion(heading, supplement, thmcounter, thmemphcolor, thmblockcolor, "Proposition", "Pf.", proof, qed: true)
#let lemma(heading, proof: none, supplement: none) = mathenvWithCompanion(heading, supplement, thmcounter, thmemphcolor, thmblockcolor, "Lemma", "Pf.", proof, qed: true)
#let excounter = counter("ex")
#let exemphcolor = rgb(35, 155, 171)
#let exblockcolor = rgb(161, 255, 238)
#let ex(heading, solution: none, supplement: none) = mathenvWithCompanion(heading, supplement, excounter, exemphcolor, exblockcolor, "Exercise", "Sol.", solution)
#let endofchapter() = {
// clearing all counters
defcounter.update(0)
rmcounter.update(0)
egcounter.update(0)
thmcounter.update(0)
excounter.update(0)
[#pagebreak()]
}
#let conf(doc) = {
set heading (
numbering: chapter_numbering
)
doc
}
#let set-appendix(doc) = {
counter(heading).update(0)
set heading (
numbering: appendix_numbering
)
doc
} |
|
https://github.com/RiccardoTonioloDev/Bachelor-Thesis | https://raw.githubusercontent.com/RiccardoTonioloDev/Bachelor-Thesis/main/chapters/pydnet.typ | typst | Other | #import "@preview/codly:1.0.0": *
#import "../config/functions.typ": *
#pagebreak(to: "odd")
= PyDNet <ch:pydnet>
PyDNet (*Py*\ramidal *D*\epth *Net*\work) is a family of models made up of two versions @PyDNetV1@PyDNetV2 (referred to as PDV1 and PDV2 for convenience), which try to solve the @MDE problem through an unsupervised approach, with about 2 million parameters in PDV1 and about 700,000 in PDV2.

Indeed, an interesting property of these models, for deployment in @embedded systems, is their extremely low number of parameters.

The goal of these models is in fact to be light enough to be executed directly on a processor, without the support of a graphics card, as happens for example on mobile phones @MDEInWild.

This property makes them very interesting as a starting point for experimenting with innovative techniques or for modifying their blocks so as to improve their performance.

However, being relatively old models, they were written in a now-deprecated version of @Tensorflow, which makes them no longer easy to use.

Consequently, the following subsections serve the following purposes:
- @arc:pydnet: describes the architecture of the model and of its sub-components;
- @fun:pydnet: describes how the model tries to solve the @MDE problem, analyzing its loss functions and training procedure;
- @conf:pydnet: describes the whole procedure used to configure the environment needed to run the original PDV1 code;
- @val:pydnet: describes the validation process of the published code, to verify that it actually leads to results similar to those of the _paper_;
- @mig:pydnet: describes the migration of PDV1 from @Tensorflow to @PyTorch, with its subsequent validation on all the scenarios proposed by the _paper_;
- @plus: first describes experiments conducted on the hyperparameters of PDV1, analyzing their consequences on the evaluation metrics, and then describes PDV2, to verify how this model behaves with respect to its previous version under the same training regime.
#block([== Model architecture <arc:pydnet>
PDV1 is a deep convolutional network structured on six levels, where each level receives its _input_ from the level above (except for the first level, which receives the input image) and processes the received _input_ through an @encoder, which returns an _output_ that becomes the _input_ of the level below (except for the last level).
],breakable: false,width: 100%)

#block([The _output_ of the @encoder is then concatenated with the _output_ of the level below along the channel dimension, and passed to a @decoder, whose _output_ is:
- Passed through the sigmoid activation function, whose _output_ corresponds to the @disparità map of the level under consideration;
],breakable: false,width: 100%)
- Passed through a transposed convolution with a $2 times 2$ @kernel and @stride 2, which doubles the height and width of the incoming tensor, whose _output_ is passed to the level above (except for the first level, whose only _output_ is the @disparità map of the level).

#block([The architecture is therefore the following:
#figure(
  image("../images/architectures/PyDNetV1.drawio.png",width: 250pt),
  caption: [Architecture of the PDV1 model]
)
],breakable: false,width: 100%)

The @encoder consists of a convolution with a $3 times 3$ @kernel and @stride 2, which therefore halves the height and width of the input tensor, followed by a convolution with a $3 times 3$ @kernel.

The @encoder of level $i, forall i in {1,2,3,4,5,6}$ therefore outputs a tensor whose height and width are $1 / 2^i$ of the dimensions of the initial _input_.

Moreover, going from the first to the sixth level, the output channels produced by the second convolution of the encoder are 16, 32, 64, 96, 128, 196.

#block([The architecture of the @encoder is therefore the following:
#figure(
  image("../images/architectures/PyDNetV1-encoder.drawio.png",width: 250pt),
  caption: [#link(<encoder>)[Encoder] of the PDV1 model.]
)
],breakable: false,width: 100%)

The @decoder instead consists of a sequence of four convolutions with a $3 times 3$ @kernel and @stride 1, which respectively produce @fmap with 96, 64, 32 and 8 channels, while keeping the height and width of the _input_.

#block([The architecture of the @decoder is therefore the following:
#figure(
  image("../images/architectures/PyDNetV1-decoder.drawio.png",width: 250pt),
  caption: [#link(<decoder>)[Decoder] of the PDV1 model.]
)
],breakable: false,width: 100%)

After every convolution, except for the last one of the @decoder, the _leaky ReLU_ activation function is applied, with a slope of 0.2 for the negative part.
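The encoder structure described above can be sketched in a few lines of @PyTorch code. This is an illustrative sketch, not the original implementation: the function name and the `padding=1` choice (which keeps the $3 times 3$ convolutions size-preserving) are assumptions of mine.

```python
import torch
import torch.nn as nn

# Sketch of a single PDV1 pyramid-level encoder: a stride-2 3x3 convolution
# that halves height and width, followed by a stride-1 3x3 convolution,
# each followed by a leaky ReLU with negative slope 0.2.
# padding=1 is an assumption to keep the 3x3 kernels size-preserving.
def make_encoder(in_ch: int, out_ch: int) -> nn.Sequential:
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1),
        nn.LeakyReLU(0.2),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, stride=1, padding=1),
        nn.LeakyReLU(0.2),
    )
```

For the first level, `make_encoder(3, 16)` maps a 3-channel image to a 16-channel tensor with halved spatial dimensions.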
#block([== How the model works <fun:pydnet>
The purpose of PDV1 and PDV2 is to carry out the task of @MDE. To do so they learn, given two stereo images of the same scene, to apply a shift to every pixel of the left image so as to make it as similar as possible to the corresponding right image, and a shift to every pixel of the right image so as to make it as similar as possible to the corresponding left image.
],breakable: false,width: 100%)

However, this only happens during training, because during inference every _input_ image is fed to the model as a left image. Indeed, only the first channel of the @fmap coming out of the sigmoid of each level is taken as the model's _output_, and it corresponds to the @disparità map for the left images.

This strategy works because, progressively during training, an alignment between the predictions for the left images and those for the right images is enforced, which makes the model's _output_ on the left channel interchangeable with the one on the right channel.
#block([
=== Loss functions
_*Image error loss*_ ($cal(L)_"ap"$):
#figure($ cal(L)_"ap"^l = 1/N sum_(i,j) alpha (1-"SSIM"(I^l_(i,j),tilde(I)^l_(i,j)))/2 + (1-alpha)norm(I^l_(i,j) - tilde(I)^l_(i,j))_1 $,caption: [_Image error loss_ computed for the left image.])
],breakable: false,width: 100%)

This is the loss function that penalizes how much the original left image $I^l$ differs from the right image with the shift applied, $tilde(I)^l$ (applied precisely to turn it into the left image).

The first part of the summation, through the SSIM function, measures the structural similarity between the two images, while the second part, through the L1 norm, measures the distance between the corresponding pixels of the two images.

The parameter $alpha$ is used to balance the weight of the first part against the second; it is set to 0.85, thus giving much more importance to the first one.
#block([_*Disparity smoothness loss*_ ($cal(L)_"ds"$):
#figure($ cal(L)_"ds"^l = 1/N sum_(i,j) abs(delta_x d^l_(i,j))e^(-norm(delta_x I^l_(i,j))) + abs(delta_y d^l_(i,j))e^(-norm(delta_y I^l_(i,j))) $,caption: [_Disparity smoothness loss_ computed for the left image.])
],breakable: false,width: 100%)

This loss function instead discourages depth discontinuities, computed through the L1 norm, unless a discontinuity is present in the image gradient.

The first part of the summation analyzes discontinuities along the horizontal axis, while the second part analyzes those along the vertical axis.
#block([_*Left-right consistency loss*_ ($cal(L)_"lr"$):
#figure($ cal(L)^l_"lr" = 1/N sum_(i,j)abs(d^l_(i,j)-d^r_(i,j+d^l_(i,j))) $,caption: [_Left-right consistency loss_ computed for the left image.])
],breakable: false,width: 100%)

Finally, the last loss function, a well-known formula in the field of stereo algorithms, enforces consistency between the right @disparità predictions $d^r$ and the left ones $d^l$. It does so by penalizing differences between $d^l_(i,j)$ and $d^r_(i,j)$, the latter taken with a horizontal shift of $d^l_(i,j)$ applied (i.e., the same shift that is applied to the right images to make them as similar as possible to the left images).
#block([*Full loss function* ($cal(L)_"s"$):

Each of the previous loss functions is also computed for the right image; the terms are then combined as follows, producing the full loss function $cal(L)_"s"$:

$ cal(L)_"s" = alpha_"ap" (cal(L)^l_"ap"+cal(L)^r_"ap") + alpha_"ds" (cal(L)^l_"ds"+cal(L)^r_"ds") + alpha_"lr" (cal(L)^l_"lr"+cal(L)^r_"lr") $
],breakable: false,width: 100%)

#block([The weights of the terms of the full function are set as follows:
- $alpha_"ap" = 1$;
],breakable: false,width: 100%)
- $alpha_"lr" = 1$;
- $alpha_"ds" = 1/r$ where $r$ is the scale factor at each resolution level.
#block([=== Training
For training, the @adam optimizer is used with the following parameters: $beta_1=0.9$, $beta_2=0.999$ and $epsilon=10^(-8)$.
],breakable: false,width: 100%)

The _learning rate_ starts at $10^(-4)$ for the first 60% of the epochs and is halved for each subsequent 20%.
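The schedule can be written down explicitly. A sketch of the rule, assuming the halving happens exactly at the 60% and 80% marks (the function name is mine):

```python
# Learning-rate schedule used for PDV1: 1e-4 for the first 60% of the
# epochs, then halved for each subsequent 20% block of epochs.
def learning_rate(epoch: int, total_epochs: int, base_lr: float = 1e-4) -> float:
    progress = epoch / total_epochs
    if progress < 0.6:
        return base_lr
    # number of halvings: one at the 60% mark, another one at 80%
    halvings = int((progress - 0.6) / 0.2) + 1
    return base_lr / (2 ** halvings)
```

With 50 epochs this yields $10^(-4)$ for epochs 0-29, $5 dot 10^(-5)$ for epochs 30-39 and $2.5 dot 10^(-5)$ for epochs 40-49.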
#block([Finally, the following _data augmentation_ operations are applied, each with a 50% probability:
- Horizontal flipping of the images;
],breakable: false,width: 100%)
- Image transformations:
  - Gamma correction;
  - Brightness correction;
  - Color shifting.

The _dataset_ is divided into _batches_ of 8 images, and a total of 50 training epochs are performed.
#block([== Environment configuration <conf:pydnet>
The @Tensorflow version used by the code of the PDV1 and PDV2 models is `1.8`, deprecated for years now and no longer downloadable through _package managers_ such as @pip or @Anaconda.
],breakable: false,width: 100%)

A version that is backward compatible with `1.8` and still downloadable via @pip is `1.13.2`, which however depends on a version of the `protobuf` package that is no longer available. Fortunately, version `3.20` of `protobuf` is still downloadable from @pip and compatible with @Tensorflow `1.13.2`.

The code also relied on a deprecated version of the `scipy` package, easily replaceable with version `1.2`, which is still available through @pip.

The last configuration step needed to run the code is a @Python version that is still downloadable and compatible with all the packages mentioned above and their dependencies. Thanks to @Anaconda it is possible to download version `3.7`, which can be used for this purpose.
#block([The terminal commands to obtain the configuration described above, after a proper installation of @pip and @Anaconda, are:
#codly(languages: (
  bash: (name: "Bash", color: gray)
),number-format: none, zebra-fill: none)
```bash
# Create the Anaconda environment (use whatever name you prefer)
conda create -n <nomeAmbiente> python=3.7
# Activate the created Anaconda environment
conda activate <nomeAmbiente>
# Install the required packages
pip install protobuf==3.20 tensorflow_gpu==1.13.2 scipy==1.2 matplotlib wandb
```
],breakable: false,width: 100%)
Among the packages installed via @pip there is also @Wandb, a system that makes it possible to record, manage and catalogue the _plots_ of the _loss functions_ for the various experiments that will be conducted.

The code is technically runnable only if a graphics card is available in the machine. However, the cluster of the mathematics department has versions of the @CUDA drivers and of the @cuDNN library that are too recent to be usable by @Tensorflow `1.13.2`.

I therefore tracked down the suitable versions (for @CUDA, version `10.0`, downloadable by following the instructions in the #link("https://developer.nvidia.com/cuda-toolkit-archive","CUDA archive"), and for @cuDNN, version `7.4.2`, downloadable by following the instructions in the #link("https://developer.nvidia.com/rdp/cudnn-archive","cuDNN archive")), and proceeded to configure a machine with those libraries and _drivers_.

#block([Finally, the KITTI _dataset_ must be downloaded; it will be used for training and evaluating the model. This is done with the following command:
```bash
wget -i utils/kitti_archives_to_download.txt -P ~/my/output/folder/
```
],breakable: false,width: 100%)
#block([Afterwards, all the compressed folders must be unzipped and all the images converted from `.png` to `.jpg`, using the following commands:
```bash
cd <pathCartellaDataset>
find <pathCartellaDataset> -name '*.zip' | parallel 'unzip -d {.} {}'
find <pathCartellaDataset> -name '*.png' | parallel 'convert {.}.png {.}.jpg && rm {}'
```
Where `<pathCartellaDataset>` is the _path_ leading to the folder with the previously downloaded `.zip` archives.
],breakable: false,width: 100%)
#block([== Validation <val:pydnet>
By following the documentation available in the PDV1 and Monodepth@monodepth repositories, the instructions to run training, _testing_ and the subsequent evaluation can be retrieved.
],breakable: false,width: 100%)

Therefore, once a _codebase_ has been set up as described in the PDV1 documentation, the following commands can be used.

#block([*For training*:
#codly(languages: (
  bash: (name: "Bash", color: gray)
),number-format: none, zebra-fill: none)
```bash
conda activate <nomeAmbiente>
python3 <pathFileEseguibile>/monodepth_main.py \
 --mode train \
 --model_name pydnet_v1 \
 --data_path <datasetPath> \
 --filenames_file <fileNamesDatasetPath>/eigen_train_files.txt \
 --log_directory <outputFilesPath>
```
Where:
- `<nomeAmbiente>`: is the name of the @Anaconda environment to activate;
],breakable: false,width: 100%)
- `<pathFileEseguibile>`: is the _path_ leading to the `monodepth_main.py` _file_, the _file_ to execute in order to run the training;
- `<datasetPath>`: is the _path_ leading to the folder containing the _dataset_;
- `<fileNamesDatasetPath>`: is the _path_ leading to the `eigen_train_files.txt` _file_;
- `<outputFilesPath>`: is the _path_ leading to the folder where all the _output_ _files_ produced by the training procedure will be saved.

This procedure produces _checkpoint_ _files_, which can be found in the `<outputFilesPath>` folder.
#block([*For _testing_*:
```bash
conda activate <nomeAmbiente>
python3 <pathFileEseguibile>/experiments.py \
 --datapath <datasetPath> \
 --filenames <fileNamesDatasetPath>/eigen_test_files.txt \
 --output_directory <outputFilesPath> \
 --checkpoint_dir <checkpointPath>
```
Where:
- `<nomeAmbiente>`: is the name of the @Anaconda environment to activate;
],breakable: false,width: 100%)
- `<pathFileEseguibile>`: is the _path_ leading to the `experiments.py` _file_, the _file_ to execute in order to compute and generate the `disparities.npy` _file_;
- `<datasetPath>`: is the _path_ leading to the folder containing the _dataset_;
- `<fileNamesDatasetPath>`: is the _path_ leading to the `eigen_test_files.txt` _file_;
- `<outputFilesPath>`: is the _path_ leading to the folder where the `disparities.npy` _file_ will be saved;
- `<checkpointPath>`: is the _path_ leading to the folder containing the _checkpoint_ to use to set the weights of the model, previously created by the _training_ phase.

This procedure produces a `disparities.npy` _file_, containing all the @disparità produced by the model when fed the images belonging to the _test set_.
#block([*For the evaluation*:
```bash
conda activate <nomeAmbiente>
python3 <pathFileEseguibile>/evaluate_kitti.py \
 --split eigen \
 --gt_path <datasetPath> \
 --filenames_path <fileNamesDatasetPath> \
 --predicted_disp_path <disparitiesPath>/disparities.npy
```
Where:
- `<nomeAmbiente>`: is the name of the @Anaconda environment to activate;
],breakable: false,width: 100%)
- `<pathFileEseguibile>`: is the _path_ leading to the `evaluate_kitti.py` _file_, the _file_ to execute in order to evaluate the `disparities.npy` _file_ previously created by the _testing_ phase;
- `<datasetPath>`: is the _path_ leading to the folder containing the _dataset_;
- `<fileNamesDatasetPath>`: is the _path_ leading to the folder containing `eigen_test_files.txt`;
- `<disparitiesPath>`: is the _path_ leading to the folder containing the `disparities.npy` _file_.

This procedure prints to the terminal the values computed for each evaluation metric of the model.
#block([Once this procedure was completed, I obtained the following results:
#eval_table(
  (
    (name: [PDV1 _paper_], vals: (0.163,1.399,6.253,0.262,0.759,0.911,0.961)),
    (name: [PDV1 retrained], vals: (0.164,1.427,6.369,0.266,0.757,0.908,0.960))
  ),
  1,
  [PDV1 vs. retrained PDV1]
)
],breakable: false,width: 100%)

As can be seen, the results are extremely close; consequently, the _paper_ @PyDNetV1 has been shown to be valid.
#block([== Migrazione da Tensorflow a PyTorch <mig:pydnet>
Verificati i risultati ottenuti nel _paper_, si può quindi partire con la migrazione dell'intera _codebase_ da @Tensorflow a @PyTorch, standard del mondo della ricerca nel campo del _machine learning_, che ci permetterà successivamente di integrare in PDV2 tecniche innovative, altrimenti impossibili da sperimentare.
],breakable: false,width: 100%)
#block([=== The _dataset_
The migration started with the entity that governs the supply of images to the training procedure, used to train the model.
],breakable: false,width: 100%)
In @PyTorch this entity is called `Dataset` and can be implemented through the interface of the same name.
#block([The interface exposes the following two abstract methods:
- `__len__(self)`: which must return the length of the _dataset_;
],breakable: false,width: 100%)
- `__getitem__(self, i: int)`: which, given an index, must return the element or elements of the _dataset_ corresponding to it.
Since the names of the files to retrieve for the _dataset_ are listed in dedicated text files (specifically `eigen_train_files.txt` for training and `eigen_test_files.txt` for testing, according to the _split_ presented in @eigen), organized in a `.csv`-like format, I chose to rely on the _Pandas_ library for this entity, as it is commonly used to analyze large `.csv` files efficiently.
Moreover, thanks to the _Pandas_ _API_ it is very easy to obtain the size of the _dataset_ (each row of the text file holds the _paths_ of the stereo image pair of the same scene) and, given an index, to retrieve the _paths_ of the corresponding stereo images.
#linebreak()
Relying then on the _Pillow_ library (the standard for efficient image reading in the @Python ecosystem) and on @PyTorch, I handled reading the images selected through _Pandas_, their subsequent conversion into tensors, and the application of optional _data augmentation_ to them before they are returned by the `__getitem__` method.
The `Dataset` was built so that it returns a tuple of tensors $(T_"sx",T_"dx")$ when in _training_ mode, while in _testing_ mode it returns only the left tensor $T_"sx"$.
Finally, I implemented a utility method that builds a `DataLoader` from the `Dataset`; the former directly consumes the latter to feed the training procedure with the proper image _batches_.
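The interface described above can be sketched in a framework-free way as follows (a minimal stand-in: the row contents and the augmentation hook are placeholders; the real implementation parses the split files with Pandas and returns PyTorch tensors loaded via Pillow):

```python
class StereoDataset:
    """Simplified stand-in for a torch.utils.data.Dataset.

    Each row holds the paths of a stereo pair; the real implementation
    reads the images with Pillow and converts them into tensors.
    """

    def __init__(self, rows, training=True, augment=None):
        # rows: list of (left_path, right_path) pairs, e.g. parsed
        # from eigen_train_files.txt; passed in directly here.
        self.rows = rows
        self.training = training
        self.augment = augment  # optional data-augmentation callable

    def __len__(self):
        # Length of the dataset = number of stereo pairs.
        return len(self.rows)

    def __getitem__(self, i):
        left, right = self.rows[i]
        if self.augment is not None:
            left, right = self.augment(left, right)
        # Training mode yields the (left, right) pair; testing only the left.
        return (left, right) if self.training else left

rows = [("l0.png", "r0.png"), ("l1.png", "r1.png")]
ds = StereoDataset(rows)
print(len(ds), ds[0])                       # 2 ('l0.png', 'r0.png')
print(StereoDataset(rows, training=False)[1])  # l1.png
```

The same two-method contract is what `DataLoader` consumes to assemble batches.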
#block([=== The models <models>
The PDV1 and PDV2 models were recreated with a 1:1 correspondence to what can be found in the original _codebase_ (only the implementation syntax changes, due to the API differences between @Tensorflow and @PyTorch), since both sides must represent the same mathematical models.
],breakable: false,width: 100%)
#block([Nevertheless, I took advantage of the various methods, interfaces and classes that @PyTorch offers to:
- Create modules, by implementing the `torch.nn.Module` interface, so as to build the @encoder and the @decoder as two standalone modules, later integrated as sub-modules of the models, making the code more readable and compartmentalized;
],breakable: false,width: 100%)
- Create sequences of blocks or _layers_, by employing `torch.nn.Sequential` objects, making the code simpler and more sequential, thereby improving its readability.
#block([=== The configuration
The original code makes heavy use of command-line arguments to define the various execution settings of the program, whereas the migrated code uses configuration files written in @Python, so that the types of the various settings can be specified and the @Python @linter can offer suggestions about the settings while writing code.
],breakable: false,width: 100%)
In my case I wrote a configuration file `ConfigHomeLab.py` to run the program on my home computer, and a configuration file `ConfigCluster.py` to run it on the department cluster.
#block([Consequently, the #link(<training>)[_training_], #link(<utilizzo>)[usage] and #link(<valutazione>)[evaluation] phases all need two command-line arguments:
- `--mode`: which specifies the execution mode of the code (training, usage or evaluation);
],breakable: false,width: 100%)
- `--env`: which specifies the configuration to use (in my case either `ConfigHomeLab`, the default choice if nothing is provided, or `ConfigCluster`).
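The two arguments above can be parsed with a sketch like the following (the flag names come from the text; treating the mode and configuration values as fixed choice lists is an assumption):

```python
import argparse

def parse_cli(argv=None):
    # --mode selects the phase, --env selects the configuration class.
    parser = argparse.ArgumentParser()
    parser.add_argument("--mode", required=True,
                        choices=["train", "test", "eval", "use", "webcam"])
    parser.add_argument("--env", default="ConfigHomeLab",  # default when omitted
                        choices=["ConfigHomeLab", "ConfigCluster"])
    return parser.parse_args(argv)

args = parse_cli(["--mode=train"])
print(args.mode, args.env)  # train ConfigHomeLab
```

`main.py` can then dispatch on `args.mode` and import the configuration module named by `args.env`.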
#block([=== The _training_ procedure <training>
All the _training_ code lives in a dedicated file `training.py`, which is invoked when needed by the `main.py` file.
],breakable: false,width: 100%)
#block([The _training_ procedure, too, has a 1:1 correspondence with the original _codebase_, since obtaining the same results requires the model to follow the same training; however, the following stylistic and organizational choices were applied:
- Each loss function is implemented in its own @Python function. The complete loss function then calls all the others, mirroring the corresponding mathematical formula, so as to make the code more understandable and compartmentalized;
],breakable: false,width: 100%)
- Regarding _checkpoint_ saving, I chose to save both the latest _checkpoint_ and that of the model version with the best evaluation on the _test set_. This is possible because after each epoch the model is evaluated on the _test set_.
Subsequently, as previously done for the original _codebase_, @Wandb was added to log the loss functions of each experiment.
#block([=== The usage procedure <utilizzo>
All the usage code lives in a dedicated file `using.py`, which is invoked when needed by the `main.py` file.
],breakable: false,width: 100%)
#block([As in the original repository, the models can be used in the following ways:
- If `--mode=use` is set, a second argument `--img_path` can be provided, specifying the _path_ of the image whose @disparità map is desired. A disparity map named after the _input_ file is then generated and placed in the same folder as the _input_ file;
],breakable: false,width: 100%)
- If one wishes to use the model through the computer's built-in _webcam_, `--mode=webcam` must be set. One must however make sure the `ffmpeg` command is available from the terminal.
Moreover, in case the model needs to be integrated into another program, the `use()` function was created: given as parameters the model to use, the image as a _Pillow_ image or @PyTorch tensor, the image size accepted by the model, the original image size, and the device on which to run the model (`cuda` or `cpu`), it returns as _output_ a @PyTorch tensor representing the corresponding @disparità map.
#block([=== The evaluation procedure <valutazione>
All the evaluation and _testing_ code lives in the corresponding files `evaluating.py` and `testing.py`, which are invoked when needed by the `main.py` file.
],breakable: false,width: 100%)
#block([The evaluation procedure is split into two parts:
- _testing_: the _testing_ phase produces the predictions for all the images of the _test set_ and saves them in a file called `disparities.npy`;
],breakable: false,width: 100%)
- evaluation: the evaluation phase analyzes the `disparities.npy` file in order to produce evaluations of the metrics presented in @eigen.
The testing procedure was completely rewritten, again with a 1:1 correspondence to the original _codebase_, so that the same evaluation procedures could then be reused. Indeed, the evaluation procedures, being written in @Python using only _Numpy_, do not depend on a specific _machine learning_ framework and were therefore kept as they are rather than migrated.
#block([=== Migration results
To run the code one first needs an @Anaconda environment with all the required dependencies, created by running the following terminal commands:
```bash
conda create -n <nomeAmbiente>
conda activate <nomeAmbiente>
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
pip install wandb pandas matplotlib Pillow
```
Where `<nomeAmbiente>` is the name given to the @Anaconda environment that will then be used. If your computer uses @CUDA `11.8`, replace `pytorch-cuda=12.1` with `pytorch-cuda=11.8` in the commands above.
],breakable: false,width: 100%)
Next, the chosen configuration file must be set up, providing all the required settings (both files carry comments under each setting explaining which values to put there).
#block([Once done, we can train the model with the following command:
```bash
python3 main.py --mode=train --env=<configurazioneUsata>
```
Where `<configurazioneUsata>` is the name of the configuration we decide to use.
],breakable: false,width: 100%)
After training, two _checkpoint_ files will have been generated in the directory specified inside the configuration file.
Still in the configuration file, we now have to specify which _checkpoint_ to use.
#block([Once the _checkpoint_ to use has been specified, we test the model with the following command:
```bash
python3 main.py --mode=test --env=<configurazioneUsata>
```
This procedure will have generated a `disparities.npy` file in the directory specified inside the configuration file.
],breakable: false,width: 100%)
#block([We can now evaluate the model _output_ (i.e. the _output_ of the previous phase) with the following command:
```bash
python3 main.py --mode=eval --env=<configurazioneUsata>
```
],breakable: false,width: 100%)
#block([Following these steps, I obtained the following results:
#eval_table(
(
(name: [PDV1], vals: (0.163,1.399,6.253,0.262,0.759,0.911,0.961)),
(name: [PDV1 in @PyTorch], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964))
),
1,
[PDV1 vs. PDV1 in @PyTorch (_training_ on _KITTI_, 50 epochs)]
)
],breakable: false,width: 100%)
#block([I then trained the model on the _CityScapes_ _dataset_ and subsequently performed _ @finetune _ on the _KITTI_ _dataset_, obtaining the following results:
#eval_table(
(
(name: [PDV1], vals: (0.148,1.318,5.932,0.244,0.8,0.925,0.967)),
(name: [PDV1 in @PyTorch], vals: (0.147,1.378,5.91,0.242,0.804,0.927,0.967))
),
1,
[PDV1 vs. PDV1 in @PyTorch (_training_ on _CityScapes_+_KITTI_, 50 epochs)]
)
],breakable: false,width: 100%)
#block([Finally, I trained the model for 200 epochs, obtaining the following results:
#eval_table(
(
(name: [PDV1], vals: (0.153,1.363,6.03,0.252,0.789,0.918,0.963)),
(name: [PDV1 in @PyTorch], vals: (0.153,1.473,6.23,0.251,0.789,0.918,0.964)),
),
1,
[PDV1 vs. PDV1 in @PyTorch (_training_ on _KITTI_, 200 epochs)]
)
],breakable: false,width: 100%)
We can therefore conclude that the migration to @PyTorch was a success, bringing us within the neighborhood of the results stated in @PyDNetV1 in all the proposed experiments.
#block([== Hyperparameters and PyDNetV2 <plus>
=== Hyperparameter exploration
An investigation was conducted into how changing the hyperparameters affects the model's _performance_, in order to understand how the training procedure could later be improved so as to obtain, for the same architecture, a model with a better evaluation.
],breakable: false,width: 100%)
#block([
The following situations were therefore analyzed, given the following hypotheses:
- What happens if the _dataset_ is in black and white? This hypothesis checks how simplifying the _dataset_ impacts the model's performance. I use the term simplify because the image representation goes (following the convention $"numChannels"times"imageHeight"times"imageWidth"$) from $3times H times W$ to $1times H times W$, with that single channel representing only the luminance of the _pixel_\. The results are the following:
#eval_table(
(
(name: [PDV1], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV1 _Luminance_ _dataset_], vals: (0.164,1.553,6.423,0.263,0.763,0.906,0.959))
),
1,
[In @PyTorch: PDV1 vs. PDV1 (_Luminance_ _dataset_)]
)
],breakable: false,width: 100%)
As can be seen, the results worsen, meaning that the color channels did in fact carry meaningful information.
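The $3times H times W arrow 1times H times W$ reduction mentioned above can be sketched as a per-pixel weighted sum of the three channels (the exact weighting used in the experiment is not stated in the text; the ITU-R 601 luma coefficients below are an assumption):

```python
def to_luminance(image):
    # image: 3 x H x W nested lists (R, G, B planes) -> 1 x H x W.
    r, g, b = image
    lum = [[0.299 * r[i][j] + 0.587 * g[i][j] + 0.114 * b[i][j]
            for j in range(len(r[0]))]
           for i in range(len(r))]
    return [lum]  # single channel: pixel luminance only

img = [[[1.0]], [[1.0]], [[1.0]]]  # 3 x 1 x 1 white pixel
print(to_luminance(img))           # one channel, value ~1.0
```

In practice the same effect is obtained with an image-library grayscale conversion applied in the dataset pipeline.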
#block([- What happens if the _dataset_ is in _HSV_ format? The _HSV_ format is a color representation that also relies on three channels but, unlike the _RGB_ representation, uses them to represent hue, saturation and value. We thus try to understand whether a different color representation can benefit the model's _performance_. The results are the following:
#eval_table(
(
(name: [PDV1], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV1 _HSV_ _dataset_], vals:(0.237,2.451,8.259,0.377,0.605,0.798,0.899))
),
1,
[In @PyTorch: PDV1 vs. PDV1 (_HSV_ _dataset_)]
)
],breakable: false,width: 100%)
The results thus show that the best color representation for training the model well is _RGB_.
#block([- What happens if a vertical flip is applied to the images? Here we check how the model's performance is affected by adding vertical flips to the images. The underlying hypothesis is that the model may generalize better if it is also trained on unlikely scenarios: for instance, getting used to the sky always being in the upper part of the images creates a strong bias in the model. The results are the following:
#eval_table(
(
(name: [PDV1], vals:(0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV1 _VFlip_], vals: (0.159,1.403,6.277,0.26,0.765,0.909,0.961))
),
1,
[In @PyTorch: PDV1 vs. PDV1 (with _vertical flip_)]
)
],breakable: false,width: 100%)
Note that, although the first two evaluation metrics improve slightly, performance degrades significantly everywhere else.
#block([- What happens if a vertical flip is applied to the images while removing the horizontal flip? Wanting to see the consequences of vertical flips alone, thus removing the horizontal ones, I obtained the following results:
#eval_table(
(
(name: [PDV1], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV1 _VFlip_ no _HFlip_], vals: (0.173,1.636,6.536,0.269,0.746,0.9,0.957))
),
1,
[In @PyTorch: PDV1 vs. PDV1 (with _vertical flip_, without _horizontal flip_)],
)
],breakable: false,width: 100%)
We can thus observe that putting the model in unlikely conditions does not help it generalize better, but only introduces more confusion.
#block([- What happens as the size of the _input_ images changes? Here we check how the evaluation metrics of the model change as the size of the images it is trained on is gradually increased. The results for each _input_ resolution tried are reported below:
#eval_table(
(
(name: [PDV1 $512times 256$], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV1 $640times 192$], vals: (0.163,1.525,6.203,0.251,0.778,0.916,0.963)),
(name: [PDV1 $1024times 320$], vals: (0.151,1.373,5.919,0.245,0.794,0.923,0.966)),
(name: [PDV1 $1280times 384$], vals: (0.139,1.249,5.742,0.234,0.816,0.932,0.969))
),
1,
[In @PyTorch: a comparison of several _input_ resolutions for PDV1],
)
],breakable: false,width: 100%)
Note how the evaluation metrics improve considerably as the size of the _input_ images increases. One remark is however in order.
Larger _input_ images entail much worse _performance_ in terms of inference time and memory consumption; it is therefore preferable to focus on building a better model for an _input_ of acceptable size than to use models that are light yet costly because of the _input_ size they process.
#block([=== PyDNet V2
PDV2 is an even lighter model than the previous version: it goes from 2 million parameters for PDV1 to roughly 700,000 parameters for PDV2. #block([This was made possible by removing only the last two levels of the pyramid, without changing either the @encoder or the @decoder, resulting in the following architecture:
],breakable: false,width: 100%)
#figure(
image("../images/architectures/PyDNetV2.drawio.png",width: 250pt),
caption: [Architecture of the PDV2 model]
)
],breakable: false,width: 100%)
#block([
Although the training procedure of PDV2 differs from that of PDV1, we wanted to verify, under the same training environment, the performance of the PDV2 model, itself also migrated to @PyTorch (as mentioned #link(<models>)[previously]), thus obtaining the following results:
#eval_table(
(
(name: [PDV1], vals: (0.16,1.52,6.229,0.253,0.782,0.916,0.964)),
(name: [PDV2], vals: (0.157,1.487,6.167,0.254,0.783,0.917,0.964)),
),
1,
[In @PyTorch: PDV1 vs. PDV2]
)
],breakable: false,width: 100%)
From the results we can infer that having six levels instead of four, and hence roughly 1.3 million more parameters, perhaps makes the model learn more noise than useful information.
https://github.com/sysu/better-thesis | https://raw.githubusercontent.com/sysu/better-thesis/main/specifications/bachelor/abstract.typ | typst | MIT License | // 利用 state 捕获摘要参数,并通过 context 传递给渲染函数
#import "/utils/style.typ": 字号, 字体
#import "/utils/indent.typ": fake-par
#let abstract-keywords = state("keywords", ("中山大学", "论文", "学位论文", "规范"))
#let abstract-content = state("abstract", [
摘要应概括论文的主要信息,应具有独立性和自含性,即不阅读论文的全文,就能获得必要的信息。
摘要内容一般应包括研究目的、内容、方法、成果和结论,要突出论文的创造性成果或新见解,不要
与绪论相混淆。语言力求精练、准确,以300-500字为宜。关键词是供检索用的主题词条,应体现论
文特色,具有语义性,在论文中有明确的出处,并应尽量采用《汉语主题词表》或各专业主题词表提
供的规范词。关键词与摘要应在同一页,在摘要的下方另起一行注明,一般列3-5个,按词条的外延
层次排列(外延大的排在前面)。
])
#let abstract(
keywords: (),
body,
) = {
context abstract-keywords.update(keywords)
context abstract-content.update(body)
}
// Chinese abstract page
#let abstract-page() = {
  // Abstract body: 宋体 (SimSun), size 小四
  set text(font: 字体.宋体, size: 字号.小四)
  // Abstract heading: 黑体 (SimHei), size 三号, centered
  show heading.where(level: 1): set text(font: 字体.黑体, size: 字号.三号)
  // Do not number the abstract heading
  show heading.where(level: 1): set heading(numbering: none)
  // Insert a fake paragraph to fix [the first paragraph of a chapter not being indented](https://github.com/typst/typst/issues/311)
show heading.where(level: 1): it => {
it
fake-par
}
[
= 摘要
#set par(first-line-indent: 2em)
#context abstract-content.final()
#v(1em)
    // Below the abstract body, print the "关键词" (keywords) entry flush left on a new line,
    // followed by a colon; separate multiple keywords with commas. (The label is bold.)
#set par(first-line-indent: 0em)
*关键词:*#context abstract-keywords.final().join(",")
]
}
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/meta/bibliography_04.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
//
// // Error: 15-55 duplicate bibliography keys: netwok, issue201, arrgh, quark, distress, glacier-melt, tolkien54, sharing, restful, mcintosh_anxiety, psychology25
// #bibliography(("/assets/files/works.bib", "/assets/files/works.bib")) |
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/lovelace/0.1.0/examples/euclid-algorithm.typ | typst | Apache License 2.0 | #set page(width: 30em, height: auto, margin: 1em)
#import "../lib.typ": *
#show: setup-lovelace
#algorithm(
caption: [The Euclidean algorithm],
pseudocode(
no-number,
[*input:* integers $a$ and $b$],
no-number,
[*output:* greatest common divisor of $a$ and $b$],
[*while* $a != b$ *do*], ind,
[*if* $a > b$ *then*], ind,
$a <- a - b$, ded,
[*else*], ind,
$b <- b - a$, ded,
[*end*], ded,
[*end*],
[*return* $a$]
)
)
|
https://github.com/EpicEricEE/typst-based | https://raw.githubusercontent.com/EpicEricEE/typst-based/master/src/base64.typ | typst | MIT License | #import "coder.typ"
#let alphabet-64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
#let alphabet-64-url = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"
/// Encodes the given data in base64 format.
///
/// Arguments:
/// - data: The data to encode. Must be of type array, bytes, or string.
/// - pad: Whether to pad the output with "=" characters.
/// - url: Whether to use the URL safe alphabet.
///
/// Returns: The encoded string.
#let encode(data, pad: true, url: false) = {
let alphabet = if url { alphabet-64-url } else { alphabet-64 }
coder.encode(data, alphabet, pad: pad)
}
/// Decodes the given base64 string.
///
/// URL safe characters are automatically converted to their standard
/// counterparts. Invalid characters are ignored.
///
/// Arguments:
/// - string: The string to decode.
///
/// Returns: The decoded bytes.
#let decode(string) = {
string = string.replace("-", "+").replace("_", "/")
coder.decode(string, alphabet-64)
}
|
https://github.com/raygo0312/Typst_template | https://raw.githubusercontent.com/raygo0312/Typst_template/main/README.md | markdown | # Typstの日本語テンプレート
# Japanese template for Typst
I have created a Japanese template for Typst. If anything needs fixing, please open an issue or send a pull request.
## Installing Typst
This section describes how to install using VSCode. For other methods, see:
- If you want to use the web version: [official page](https://typst.app/)
- If you want to use it from the terminal: [GitHub](https://github.com/typst/typst)
Install one of the following extensions in VSCode.
- Typst LSP (official)
- Tinymist Typst (recommended)
To use this template, you need to install the following fonts. A restart may be required.
- [Harano Aji Fonts](https://github.com/trueroad/HaranoAjiFonts)
- [COMPUTER MODERN](https://www.fontsquirrel.com/fonts/computer-modern)
- [katex-fonts](https://github.com/KaTeX/katex-fonts/tree/master)
For concrete usage, see the [official documentation](https://typst.app/docs/) or explanatory blog posts.
## Template file
Since an absolute path cannot be specified when importing the template file, it is published as a package. See [here](https://github.com/typst/packages?tab=readme-ov-file#local-packages) for details.
1. Create `typst/packages/local/japanese-template/0.1.0/typst.toml` in the data directory.
1. Enter the following code.
1. In the same directory as `typst.toml`, create a symbolic link to this repository named `template`.
```toml
[package]
name = "japanese-template"
version = "0.1.0"
entrypoint = "template/template.typ"
authors = ["raygo"]
```
## Using Tinymist
See [here](https://marketplace.visualstudio.com/items?itemName=myriad-dreamin.tinymist) for details.
- Clicking the magnifying-glass icon at the top right opens a preview.
- Clicking `Export PDF` at the top of a typst file converts it to PDF.
- Writing `"tinymist.formatterMode": "typstyle"` into the settings file enables formatting.
## Creating a document
See `main.typ` for reference.
Enter the following code at the beginning, then write the content you want to output below it.
```Typst
#import "@local/japanese-template:0.1.0": *
#show: it => jarticle(
title: "タイトル",
author: "名前",
it,
)
```
The `jarticle` function accepts the following arguments.
### titlepage
Selects whether to create a title page.\
Default: `false`
### title
The title.\
Default: `""`
### office
The author's affiliation.\
Default: `""`
### author
The author's name.\
Default: `""`
### date
Selects whether to include the creation date.\
Default: `false`
## Creating slides
See `slide.typ` for reference.
Enter the following code at the beginning, then write the content you want to output below it.
```Typst
#import "@local/japanese-template:0.1.0": *
#show: it => slide-style(it)
```
Create the title slide as follows.
```Typst
#title-slide(
title: "タイトル",
author: "名前",
)
```
Create slides as follows.
```Typst
#slide(
title: "タイトル"
)[
内容
]
```
The `slide` function accepts the following arguments.
### title
The title.\
Default: `""`
### title-bgcolor
The background color of the title as a color code, e.g. `rgb("#789ABC")`.\
Default: `rgb("#DDDDFF")`
### verticaly
Changes the placement of the content: `top` for top alignment, `horizon` for centered, `bottom` for bottom alignment.\
Default: `horizon`
https://github.com/denkspuren/typst_programming | https://raw.githubusercontent.com/denkspuren/typst_programming/main/extractText/extractText.typ | typst | // see https://github.com/typst/templates/issues/12#issuecomment-1793845765
#let extractText(element) = {
if type(element) == content {
if element == [ ] { return " " }
return extractText(element.fields()).trim() }
if type(element) == dictionary { return extractText(element.values()) }
if type(element) == array {
return element.fold("", (res, item) => res + extractText(item))
}
if type(element) == bool { return "" }
return str(element)
}
#assert.eq(extractText("hey"), "hey")
#assert.eq(extractText(12), "12")
#assert.eq(extractText(12.0), "12")
#assert.eq(extractText(12.1), "12.1")
#assert.eq(extractText(false), "")
#assert.eq(extractText(version(1,2,3)), "1.2.3")
#assert.eq(extractText((1,2,3)), "123")
#assert.eq(extractText((4,(1,"Hey",12.0),(hey: 2))), "41Hey122")
#assert.eq(extractText([This is some text.]), "This is some text.")
#assert.eq(extractText([This is _some_ text.]), "This is some text.")
#assert.eq(extractText([ Is $x^2$ an _even_ Function? ]), "Is x2 an even Function?")
#assert.eq(extractText([Is $x^2$ an _even_ Function?]), "Is x2 an even Function?")
#assert.eq(extractText([[Hey] [you]]), "[Hey] [you]")
/*
#let title = [ Is $x^2$ an _even_ Function? ]
#title
#extractText(title)
*/
|
|
https://github.com/tingerrr/hydra | https://raw.githubusercontent.com/tingerrr/hydra/main/doc/chapters/3-reference.typ | typst | MIT License | #import "@preview/tidy:0.2.0"
#import "/doc/util.typ": bbox
#let stable(is) = if is {
bbox(fill: green.lighten(50%), `stable`)
} else {
bbox(fill: yellow.lighten(50%), `unstable`)
}
== Stability
The following stability guarantees are made, this package tries to adhere to semantic versioning.
#table(columns: 2, gutter: 0.25em, align: (right, left),
stable(false), [API may change with any version bump.],
stable(true), [
API will not change without a major version bump or a minor version bump before `1.0.0`, if such
a change occures it is a bug and unintended.
],
)
== Custom Types
#set heading(outlined: false)
The following custom types are used to pass around information easily:
=== `sanitized-selector` #stable(true)
Defines a selector for an ancestor or primary element.
```typc
(
target: queryable,
filter: ((context, candidates) => bool) | none,
)
```
=== `hydra-selector` #stable(true)
Defines a pair of primary and ancestor element selectors.
```typc
(
primary: sanitized-selector,
ancestors: sanitized-selector | none,
)
```
=== `candidates` #stable(true)
Defines the candidates that have been found in a specific context.
```typc
(
primary: (prev: content | none, next: content | none, last: content | none),
ancestor: (prev: content | none, next: content | none),
)
```
=== `context` #stable(false)
Defines the options passed to hydra and the resolved contextual information needed for querying and
displaying.
```typc
(
prev-filter: (context, candidates) => bool,
next-filter: (context, candidates) => bool,
display: (context, content) => content,
skip-starting: bool,
use-last: bool,
book: bool,
anchor: label | none,
anchor-loc: location,
primary: sanitized-selector,
ancestors: sanitized-selector,
)
```
#pagebreak()
#set heading(numbering: none)
#let mods = (
(`hydra`, "/src/lib.typ", true, [
The package entry point. All functions validate their inputs and panic using error messages
directed at the end user.
]),
(`core`, "/src/core.typ", false, [
The core logic module. Some functions may return results with error messages that can be used to
panic or recover from instead of panicking themselves.
]),
(`selectors`, "/src/selectors.typ", true, [
Contains functions used for creating custom selectors.
]),
(`util`, "/src/util.typ", false, [
    Utility functions and values.
]),
(`util/core`, "/src/util/core.typ", false, [
    Utility functions.
]),
(`util/assert`, "/src/util/assert.typ", false, [
Assertions used for input and state validation.
]),
)
#let render-module(title, path, is-stable, description, style: tidy.styles.minimal) = [
== #title #stable(is-stable)
#description
#tidy.show-module(tidy.parse-module(read(path)), style: style)
]
#mods.map(x => render-module(..x)).join(pagebreak(weak: true))
|
https://github.com/CreakZ/mirea-algorithms | https://raw.githubusercontent.com/CreakZ/mirea-algorithms/master/reports/work_4/work_4.typ | typst | #import "../title_page_template.typ": title_page
#import "../layouts.typ": un_heading, head1, indent, head2
#set page(
paper: "a4",
margin: (left: 2cm, right: 2cm, top: 2cm, bottom: 2cm)
)
#set heading(numbering: "1.1.1.")
#set text(font: "New Computer Modern", size: 14pt, kerning: false)
#set figure(supplement: [Figure])
#set figure.caption(separator: [ -- ])
#show table: it => {align(center, it)}
#title_page(4, [Multidimensional arrays])
#un_heading([Table of contents])
#outline(
title: none,
indent: none
)
#pagebreak()
#set page(numbering: "1")
#head1([= Problem statement])
#indent The tasks of my personal variant (No. 22):
#let task(num, desc) = {
par(justify: true, [#indent *#num*. #desc])
}
#task([Task 1], [Design an ADT for the variant's task of managing multidimensional data and implement it on a static multidimensional array.])
#task([Task 2], [Design an ADT for the variant's task of managing multidimensional data and implement it on a dynamic multidimensional array.])
#task([Task 3], [Develop a program solving the variant's task of managing multidimensional data, implemented using the #raw("<vector>", lang: "cpp") template of the STL library.])
#head1([= Problem formulation])
#task([Tasks 1 and 2], [Given a square matrix, find its determinant using the Gauss method.])
#task([Task 3], [A set of points with integer coordinates is given on the plane. Find the number of segments with the following properties: \
#indent 1. Both endpoints of the segment belong to the given set; \
#indent 2. Neither endpoint of the segment lies on a coordinate axis; \
#indent 3. The segment crosses exactly one coordinate axis. \
#indent Write a program that is efficient in both time and memory to solve this problem.])
#pagebreak()
#head1([= Mathematical model])
#head2([== Tasks 1 and 2.])
#indent Suppose we are given a matrix of the following form: \
#align(
center, [
$ A =
mat(
1, 3, 3, 7;
3, 4, 3, 4;
5, 6, 7, 8;
6, 9, 6, 9;
) $
]
)
#par(justify: true, [#indent According to the Gauss method, to compute the determinant one must first reduce the matrix to triangular form and then take the product of the elements on the main diagonal. The value of this product (call it $Delta$) is the determinant of the matrix. \
#indent Let us reduce the matrix $A$ to triangular form:])
#align(center, [$ Delta =
mat(
1, 3, 3, 7;
3, 4, 3, 4;
5, 6, 7, 8;
6, 9, 6, 9;
delim: "|"
) =
mat(
1, 3, 3, 7;
0, -5, -6, -17;
0, -9, -8, -27;
0, -9, -12, -33;
delim: "|"
) =
mat(
1, 3, 3, 7;
0, -5, -6, -17;
0, 0, 14/5, 18/5;
0, 0, -6/5, -12/5;
delim: "|"
) =
mat(
1, 3, 3, 7;
0, -5, -6, -17;
0, 0, 14/5, 18/5;
0, 0, 0, -6/7;
delim: "|"
) $])
#indent In this case $Delta = 1 times (-5) times 14/5 times (-6/7) = 12 $.
#par(justify: true, [#indent As the example shows, reduction to triangular form amounts to $quote.angle.l.double$zeroing out$quote.angle.r.double$ the elements standing below the main diagonal. To achieve this in the general case, one must subtract from the current row the row located above it, multiplied by some coefficient (call it $lambda$). \
])
#indent In our example $lambda_1 = a_21/a_11 = 3$,
$lambda_2 = a_32/a_22 = 9/5$, $lambda_3 = a_43/a_33 = -6/14$.
#indent In the general case, for the $n$-th row ($n > 1, n in NN$) we therefore obtain: \
#align(center, [$ lambda_n = a_((n+1)n) / a_(n n) $])
#indent The element of the $n$-th row and $m$-th column then becomes: $ a_(n m) = a_(n m) - lambda a_((n-1) m) $
#indent The determinant in general form is then found by the following formula: $ Delta = attach(Pi, t: n, b: 1) a_(n n) $
#head2([== Task 3])
#par(justify: true, [
#indent Let us build the mathematical model of the problem by examining an example (see Fig. 1).
#figure(
image("images/plot.svg", width: 60%),
caption: [An example of the mutual arrangement of points on the coordinate plane]
)
#indent As Fig. 1 shows, points lying in adjacent quadrants form segments that satisfy the problem's criteria ($A B$, $A C$, $B D$, $C D$), while points lying in opposite quadrants form no such segments ($A D$, $B C$). Indeed, among the corresponding coordinates of the endpoints of a valid segment, the product of exactly one pair is negative (for example, for points $A$ and $B:$ $y_A times y_B = 2 times (-3) = -6 < 0$, but $x_A times x_B = 2 times 5 = 10 > 0$), whereas for the remaining segments the products of both pairs are simultaneously either positive or negative. #linebreak()
#indent Hence we obtain the following function:
$ F(x_1, x_2, y_1, y_2) = (x_1 times x_2 < 0) plus.circle (y_1 times y_2 < 0), $ #indent where $x_1, x_2, y_1, y_2$ are the coordinates of the segment's endpoints. #linebreak()
#indent The value of the function $F$ is always 1 for segments crossing exactly one coordinate axis, and 0 for all others.
])
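The elimination just described can be sketched as follows (pure Python, without the pivot selection and zero-pivot handling a complete implementation would need):

```python
def det_gauss(a):
    # Reduce a copy of the square matrix to upper-triangular form,
    # then multiply the elements on the main diagonal.
    m = [row[:] for row in a]
    n = len(m)
    for k in range(n - 1):            # pivot row
        for i in range(k + 1, n):     # rows below the pivot
            lam = m[i][k] / m[k][k]   # lambda coefficient
            for j in range(k, n):
                m[i][j] -= lam * m[k][j]
    d = 1.0
    for k in range(n):
        d *= m[k][k]                  # product of the diagonal
    return d

a = [[1, 3, 3, 7],
     [3, 4, 3, 4],
     [5, 6, 7, 8],
     [6, 9, 6, 9]]
print(det_gauss(a))  # 12, up to floating-point error
```

Running it on the example matrix reproduces the hand-computed value $Delta = 12$.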
#head1([= Task ADTs])
#head2([== ADT for Tasks 1 and 2])
#let tab(theme) = {
set par(first-line-indent: 1.25cm, hanging-indent: 1.25cm)
text([#theme #linebreak()])
}
#let method(num, head, pre, post, header) = {
par(hanging-indent: 1.25cm, first-line-indent: 1.25cm, [
#num #head #linebreak()
    Precondition: #pre #linebreak()
    Postcondition: #post #linebreak()
    #align(left, [Header: #header #linebreak()])
])
}
#par(justify: true, [ADT matrix \ { \
#tab([_Data_ (description of the properties of the task's data structure)])
#tab([MAX_SIZE - the maximum dimension of the matrix])
#tab([size - the dimension of the current array])
#tab([array - a two-dimensional array of real numbers])
#tab([_Operations_ (operation declarations)])
#method(
  [1.],
  [Method that fills the array from the keyboard],
  [array, size],
  [array filled with real numbers],
  [#raw("fillManually()")]
)
#method(
  [2.],
  [Method that fills the array with random values],
  [array, size],
  [the array filled with random values],
  [#raw("fillRandomly()")]
)
#method(
  [3.],
  [Method that prints the current array values],
  [array, size],
  [the array elements printed separated by spaces],
  [#raw("print()")]
)
#method(
  [4.],
  [Method that returns the value of the matrix determinant],
  [array, size],
  [a real number - the value of the matrix determinant],
  [#raw("det()")])
}
])
#pagebreak()
#head2([== ADT for Task 3])
#par(justify: true, [ADT Set \ { \
#tab([_Data_ (description of the properties of the task's data structure)])
#tab([points - a vector of pairs of integers (point coordinates)])
#method([1.],
  [Method that fills the vector from the keyboard],
  [points],
  [points filled with integer coordinate pairs],
  [#raw("fillManually()")]
)
#method([2.],
  [Method that fills the vector with random values],
  [points],
  [the points vector filled with random values],
  [#raw("fillRandomly()")]
)
#method([3.],
  [Method that prints the current vector values],
  [points],
  [the vector elements printed separated by spaces],
  [#raw("print()")]
)
#method([4.],
  [Method returning #raw("true") if at least one endpoint of the segment lies on a coordinate axis, otherwise #raw("false")],
  [the coordinates of the segment's endpoints],
  [#raw("true")/#raw("false")],
  [#raw("onAxis(std::pair<int, int>& point1, std::pair<int, int>& point2)")])
#method([5.],
  [Method returning #raw("true") if the segment crosses exactly one coordinate axis, otherwise #raw("false")],
  [the coordinates of the segment's endpoints],
  [#raw("true")/#raw("false")],
  [#raw("singleAxisIntersection(std::pair<int, int>& point1, std::pair<int, int>& point2)")])
#method([6.],
  [Method returning a randomly generated integer in the range [-100, 100]],
  [_strongly depends on the particular programming language_],
  [an integer from the range [-100, 100]],
  [#raw("generate(std::uniform_int_distribution<>& dis, std::mt19937& generator)")]
)
#method([7.],
  [Method returning the number of segments satisfying the condition of task 3],
  [points],
  [an integer - the number of segments],
  [#raw("validSegments()")])
}
])
#head1([= Design and Implementation of the Task])
#head2([== Program Code])
#head2([=== Task 1])
#text([1. Code of file #raw("matrixStatic.h") #linebreak()])
#let static_h = read("../../src/work_4/static_array/headers/matrix_static_array.h")
#raw(static_h, lang: "cpp")
#text([2. Code of file #raw("matrixStatic.cpp") #linebreak()])
#let static_cpp = read("../../src/work_4/static_array/source/matrix_static_array.cpp")
#raw(static_cpp, lang: "cpp")
#text([3. Code of file #raw("main.cpp") #linebreak()])
#let static_main = read("../../src/work_4/static_array/main.cpp")
#raw(static_main, lang: "cpp")
#head2([=== Task 2])
#text([1. Code of file #raw("matrixDynamic.h") #linebreak()])
#let dynamic_h = read("../../src/work_4/dynamic_array/headers/matrix_dynamic_array.h")
#raw(dynamic_h, lang: "cpp")
#text([2. Code of file #raw("matrixDynamic.cpp") #linebreak()])
#let dynamic_cpp = read("../../src/work_4/dynamic_array/source/matrix_dynamic_array.cpp")
#raw(dynamic_cpp, lang: "cpp")
#text([3. Code of file #raw("main.cpp") #linebreak()])
#let dynamic_main = read("../../src/work_4/dynamic_array/main.cpp")
#raw(dynamic_main, lang: "cpp")
#head2([=== Task 3])
#text([1. Code of file #raw("matrixVector.h")])
#let vector_h = read("../../src/work_4/vector/headers/matrix_vector.h")
#raw(vector_h, lang: "cpp")
#text([2. Code of file #raw("matrixVector.cpp")])
#let vector_cpp = read("../../src/work_4/vector/source/matrix_vector.cpp")
#raw(vector_cpp, lang: "cpp")
#text([3. Code of file #raw("main.cpp")])
#let vector_main = read("../../src/work_4/vector/main.cpp")
#raw(vector_main, lang: "cpp")
#head2([== Test Set])
#head2([=== Tasks 1 and 2])
Table 1 -- Tests for tasks 1 and 2
#table(
columns: 3,
align: (center, center, center),
table.header(
    [No.], [Input data], [Expected output]
),
[1], [
size = 3 \
array = {{1, 2, 3}, \ {4, 5, 6}, {7, 8, 9}}
], [0],
[2], [
size = 4 \
array = {{1, 3, 3, 7}, {3, 4, 3, 4}, {5, 6, 7, 8}, {6, 9, 6, 9}}
  ], [12]
)
#head2([=== Task 3])
Table 2 -- Tests for task 3
#table(
columns: 3,
align: (center, center, center),
table.header(
    [No.], [Input data], [Expected output]
),
[1], [points = {{1,2}, {3,4}, {5,-2}, {-3,-5}}], [3],
[2], [points = {{1,0}, {0,1}, {0,0}, {6,9}}], [0]
)
#pagebreak()
#head2([== Testing Results])
#figure(
image("images/test12_1.png", width: 60%),
  caption: [Results of test No. 1 for tasks 1 and 2]
)
#figure(
image("images/test12_2.png", width: 60%),
  caption: [Results of test No. 2 for tasks 1 and 2]
)
#figure(
image("images/test3_1.png", width: 60%),
  caption: [Results of test No. 1 for task 3]
)
#figure(
image("images/test3_1.png", width: 60%),
  caption: [Results of test No. 2 for task 3]
)
#head1([= Conclusion])
#par(first-line-indent: 1.25cm, justify: true,
[#indent As a result of this work, skills were gained in defining multidimensional static and dynamic arrays in a program and representing them in main memory, in choosing a data structure for storing the task's data and implementing it most efficiently, and in designing algorithms for operations on a multidimensional (two-dimensional) array in accordance with the task.]
)
) |
|
https://github.com/morrisfeist/cvgen | https://raw.githubusercontent.com/morrisfeist/cvgen/master/template/template.typ | typst | MIT License | #import "theme.typ": theme
#set page(margin: 0pt, fill: theme.base)
#set text(font: "Liberation Sans", size: 10pt, fill: theme.text);
#show heading: set text(theme.primary)
#grid(columns: (30fr, 70fr), include "sidebar.typ", include "body.typ")
|
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024 | https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/entries/decide-drivetrain-config.typ | typst | Creative Commons Attribution Share Alike 4.0 International | #import "/packages.typ": notebookinator
#import notebookinator: *
#import themes.radial.components: *
#show: create-body-entry.with(
title: "Decide: Drivetrain Configuration",
type: "decide",
date: datetime(year: 2023, month: 6, day: 19),
author: "<NAME>",
witness: "Violet Ridge",
)
We then rated each configuration for its speed, strength, and maneuverability
on a scale of 0 to 5.
#decision-matrix(
properties: ((name: "Speed"), (name: "Strength"), (name: "Maneuverability")),
("6:3, 4\" wheels, green cartridges", 5, 2, 3),
("3:5, 3.25\" wheels, blue cartridges", 3, 4, 2),
("3:5, 4\" wheels, blue cartridges", 0, 0, 0),
("4:7, 4\" wheels, blue cartridges", 3.5, 4, 3),
)
#admonition(
type: "note",
[
    The 3:5, 4", blue cartridge config was not rated due to its infeasibility.
],
)
#admonition(
type: "decision",
[
We settled on a drivetrain with a 4:7 gear ratio, 4" wheels, and blue
cartridges, leaving us with a final RPM of 342, and a final speed of 5.96
feet/second.
],
)
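As a sanity check on the numbers above, the output RPM and ground speed follow from the gear ratio and wheel size. The helpers below are hypothetical, not part of our robot code; they assume 600 RPM blue cartridge motors:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical helper: output RPM after a 4:7 gear reduction.
double outputRpm(double motorRpm) {
    return motorRpm * 4.0 / 7.0; // 600 RPM * 4/7 ~= 342.9 RPM
}

// Hypothetical helper: linear speed in feet/second for a wheel of the
// given diameter (inches) spinning at the given RPM.
double speedFtPerSec(double rpm, double wheelDiameterIn) {
    const double kPi = 3.14159265358979;
    // revolutions per second * wheel circumference in feet
    return rpm / 60.0 * kPi * (wheelDiameterIn / 12.0);
}
```

With 4" wheels this gives roughly the 342 RPM and ~5.96 ft/s quoted in the decision.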
= Final Drivetrain Design
#grid(
columns: (1fr, 1fr),
gutter: 20pt,
image("/assets/drivetrain/drivetrain-cad-side.png"),
image("/assets/drivetrain/drivetrain-cad-top.png"),
)
#colbreak()
//TODO: remove item number, or make the item numbers mean something
#align(center, [
#image("/assets/drivetrain/part-drawings/1.png")
#image("/assets/drivetrain/part-drawings/2.png")
#image("/assets/drivetrain/part-drawings/3.png")
])
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/layout/table-04.typ | typst | Other | // Error: 14-19 expected color, none, array, or function, found string
#table(fill: "hey")
|
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/048%20-%20Dominaria%20United/002_Episode%202%3A%20Sand%20in%20the%20Hourglass.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"Episode 2: Sand in the Hourglass",
set_name: "Dominaria United",
story_date: datetime(day: 11, month: 08, year: 2022),
author: "<NAME>",
doc
)
#figure(image("002_Episode 2: Sand in the Hourglass/01.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
Time trickled away more slowly than the grains of sand settling between the rocks. The fine particles sifted into Karn's joints. He didn't know how long he had lain there, pinned in the dark. Was it days or weeks that had passed? What if months had flown away, like a small and startled bird? What if it was longer? Years, decades, eons—
No, he could not think of it.
No one would miss him. No one knew where he had gone. He should have told someone. He should have at least told Jhoira. Or Jaya. If he had told them, they would have known where to search and either freed him or seen the Phyrexians themselves.
What if the Phyrexians were the ones to find him? Would it be worse if no one found him? He might wait alone, forever in the darkness. In the silence.
Sand trickled down. A scrabbling noise. Maybe claws grating on rough stone.
A weight lifted away from his hand, exposing it to the chill air currents. He could move his fingers. Relief shot through him, a pang more powerful than the Blind Eternities. He stretched his fingers, marveling at the freedom of this small movement, the ability to make #emph[any] movement. Something warm and soft touched his fingertips. Organic, impenetrable to his senses. Not Phyrexian. Gentle. Thoughtful.
He was found.
The warmth left his fingertips. Had his rescuer departed?
The scratching quickened. Rocks grated. Pebbles cascaded. Clunks as large rocks tossed away landed. The burden on him lightened. Karn strained. The material around him budged, shifting at the pressure of his enormous strength. Karn exercised the powerful mechanisms in his torso, pushing himself upright. Rocks poured away. He heaved himself up slowly. He wanted to take care not to hurt his rescuer with any stray stones.
As his efforts increased, the scratching noise ended. Footfalls retreated as his rescuer stepped clear. Karn would have to trust that they had moved to a safe distance.
Karn hauled himself to his feet. Stone poured off him, and he was free. The warm air caressed his body. He rolled his shoulders, reveling in their movement. The tumbling rock kicked up a gray haze. He shook the fine particles from his body and wiped clean his eyes.
Ajani stood in the tunnel, his fur a striking white in the torchlight. The pupil of his unscarred pale-blue eye glinted with the nocturnal hue of a nighttime predator. His shoulders had a proud set to them, like he was pleased he'd found Karn. He granted Karn a friendly, close-lipped smile.
Karn nodded, tentative. He'd only met Ajani a few times. For Ajani's species, baring teeth was a hostile action, so this small smile was friendlier than a broad human grin.
"How did you find me?" Karn cleared his throat. It, too, felt dusty. The mechanism inside it clicked uncomfortably. "I told no one I was here."
Ajani coughed, awkward, deep in his chest. "After you didn't answer the letters, Jhoira became~ worried about you. She asked Raff to place a tracking spell on the letters, one that would only activate when you—and only you—opened the envelope. That's how I located your camp."
Karn stilled, embarrassed. Had Jhoira known every time he'd read a letter and left it unanswered? Every time he'd pushed aside the paper heaps on his worktop to make room for a new project? Had Ajani seen the chaotic clutter that populated his workspace? Karn would have never let his camp descend to such a state if he'd expected a visitor.
He evaded Ajani's gaze and investigated the joints in his body for damage. Ah, the spearhead. He'd forgotten he'd left that lodged in him.
"Every time you moved the letters, Jhoira knew you were alive," Ajani said, "and didn't want to talk. She was determined to give you the time you needed, and not to press you. She knows how private you can be when you are~ upset."
Karn worked at the spearhead, trying to remove it from his body. The rockfall had jammed it even deeper into him.
"But when you stopped shuffling the letters around," Ajani continued, "she grew concerned. And here I am."
Karn grunted. He wiggled the spearhead back and forth, trying to loosen it from between his torso plates. His blunt fingers, though capable of the most detailed work, couldn't dig deep enough. He still couldn't believe Jhoira had known how often he'd looked at those letters, considered replying, and then set them aside. Too many times. "Jhoira is well?"
"She is in her workshop on the Mana Rig." Ajani shrugged.
"And the #emph[Weatherlight] ?"
"She returned the #emph[Weatherlight] to its rightful owner," Ajani said. "Shanna captains it."
"Ah, good. Shanna will rise to the task." Karn had served with Sisay and was pleased to see the airship in her descendant's hands. "Do you mind if I~ ?" Ajani nodded at the spearpoint.
Karn shrugged.
Ajani was not as tall as Karn, but he was tall enough that he had to bend his head to inspect the spearhead. He inserted his claws into Karn's joints with surprising delicacy. "You know, every Planeswalker goes through phases like this. We withdraw, especially if we have played a role in changing a plane's fate. I have seen it time and time again. After a great hunt, you feast, and you sleep. It's natural, and there is no shame in that."
"I do not feast, and I do not sleep," Karn said.
"That does not mean you don't need to recuperate." Ajani eased the spearhead from Karn's body.
Karn had never been permitted to "recuperate" when Urza had loosed him as a war machine. Urza had explained it was unnecessary and, indifferent to Karn's weariness, had turned his attention to other, more interesting projects.
Ajani examined the spearpoint. Its metal glimmered a sickly green in the dimness. "You encountered more than a rockfall. What happened here, my friend?"
Karn didn't wish to answer the question—not until he knew whether he could trust Ajani. The vision he'd had upon touching Sheoldred still thrummed within him—Phyrexian agents everywhere, hidden across Dominaria. Waiting. "How long have I been buried?"
"A few months," Ajani admitted. "It took time to locate you."
Months lost. Months that could have been spent preparing.
Sheoldred's segmented parts had skittered along his paralyzed arms, down his back; spiderlike, they had poured over him. She would have had ample time to reassemble herself. Rona as well, he was sure.
"You have damaged yourself." Karn nodded at Ajani, whose claw had torn at the cuticle, a wound that had most likely occurred when he had dug Karn out from the rockslide. "Let us return to my camp for supplies. I must also check the sensitive equipment there to verify that it is still functional."
Karn did not voice what he feared most: did he still have the sylex and the clay tablet?
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
In the months Karn had been buried, his campsite had remained undisturbed but not unchanged; the small tents had gone dingy with mildew.
Ajani hunched his shoulders. He had a Planeswalker's distaste for these caves. Even if one could not directly sense the interplanar technologies, their way of warping time made the space claustrophobic. Karn, too, could feel the pressure.
Karn led Ajani through his cluttered camp, then ducked into his main tent. The box holding the sylex and the tablet remained where he'd left it and appeared to be locked. Karn ignored it, conscious of Ajani's eyes on him.
Karn located a barrel with water—potable, though he normally used it for cleaning purposes—and a rag. He handed the rag to Ajani, to wash and wrap his wound.
"Why were you here, Karn?" Ajani rinsed his hand, working out the grit that had lodged in his wound.
Karn inventoried his equipment for damage as he replied. "Searching for artifacts. Due to the unique properties of the Caves of Koilos, not even the most entrepreneurial archeologists or enthusiastic researchers have plundered them." He made his way in a circuit around the tent, toward the box where he'd hidden the sylex and its tablet. Casually. The box seemed intact, but he dared not open it. He reached out with his special senses. The tablet felt like mere clay, a combination of aluminum, silicon, magnesium, sodium, and other trace elements. The sylex buzzed at him: present but indecipherable due to its powerful magic.
Karn set the box aside. He faced Ajani and related all he had seen.
"Sheoldred escaped?" Ajani paced in the tent's confines. "Karn, we must warn—"
"I have tried," Karn said. "Many times."
"Now you have seen Sheoldred."
Karn wished he could trust Ajani, but he shook his head. "The caves where I discovered the Phyrexian staging ground are no longer accessible. I have no proof that the Phyrexians have returned to Dominaria."
"Don't we?" Ajani held out the spearhead. "Karn, there is a peace summit between the Keldons and the Benalish. If any nations will take the Phyrexian return seriously, it's those two. I propose we speak to their leaders."
Ajani was right. Of all the nations in Dominaria, Keld and New Benalia were likeliest to listen to Karn's warning. Radha, the leader of the Keldons, had reforged that rugged nation of warriors into a devastating military force. <NAME> led New Benalia's knights, whose passion for justice made each one worth a dozen fighters. "Let me gather my finds and sensitive equipment before we go."
Ajani tapped an amulet hanging from his belt. "Jhoira gave me a summoning device for the #emph[Weatherlight] before she sent me. Shanna will honor it."
"The #emph[Weatherlight] may be a speedy ship, but she is not quick enough." Karn stacked several devices on the chest that held the sylex and the tablet and loaded everything into a rucksack. "I propose that we planeswalk."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Karn did not know how other Planeswalkers perceived the Blind Eternities, but to him the interminable space felt like crushed velvet, its lukewarm prickle sometimes verging on pain. The vertigo plunging through Karn contrasted with the sense that he wasn't moving at all, which was at odds with the feeling that he pulled himself along a cord to an unknown destination. He burst through a silken gash into cool air.
Karn stood knee-deep in wild grasses, orange poppies, and purple-flowering thistles. Inland, the farms seemed young, recently cleared land holding fields yellow with blooming canola. The homesteads bled into mountains, the misty temperate rainforests punctuated with emerald alpine meadows.
If he had been human, he would have taken a single shuddering breath.
To his other side, a large stone statue protected a seaport whose buildings and streets were carved into white chalk cliffs. Eons ago, a Phyrexian portal ship must have decapitated the statue, and the decrepit hulk lay atop the statue's neck. Overgrown with honeysuckle, it shadowed the city's colorful awnings. In the center of the bay, a worn-smooth island protruded from the water—the statue's head, now home to seabirds.
Ajani led Karn down narrow paths past modest homes carved into the chalk. These seemed small and well worn in contrast to the newly sculpted city hall, which had large but chunky dimensions, wide windows, and balconies framed with ornate columns.
"Do you know where the peace talks are occurring?" Karn asked.
Ajani paused and cocked his head. "Follow the sound of arguing, I suppose."
Karn could hear nothing. The leonin's senses had to be spectacular.
Ajani led Karn through a grand but empty reception area, then up a narrow set of stairs. The corridors linking the rooms felt claustrophobic, lit only by torches. They pushed between brass-bound double doors into a light-filled room dominated by a long granite table. A broad balcony overlooked the sea, and a male varied thrush—orange breast with a black collar, black mask, and black cap, a beautiful creature—perched on its railing.
To one side sat the representatives of House Capashen of Benalia. The nobleman at the table—<NAME>en, a middle-aged man with light ochre skin—had a proud tower with seven colored windows embroidered onto his silks. The knights arrayed behind him, their silver armor chased with gold, their stained-glass shields held at the ready, possessed the same gilded motif on their breastplates.
To the other side lounged the large, gray-skinned Keldon warriors with their heavy leather armor and their heavier weapons. Their warlord—Radha—sat opposite the Capashen nobleman. She had a Keldon's ash-colored skin, black mane, and bulky muscles, but the pointed ears and blue markings of a Skyshroud elf.
Other officials, led by a man from New Argive who had fair skin and a black goatee, lined the sides of the granite negotiation table between the two conflicting parties.
Ajani and Karn must have arrived as the negotiations were set to begin, because only a moment later both Jodah and Jaya arrived. Jodah portaled in, stepping through a door his magic sliced into the air. His office, cluttered with books and bric-a-brac, vanished when the portal slid closed. Jaya planeswalked into the room, appearing with a flash and the smell of charcoal.
"It's been a while, old man." Jaya gave Jodah a friendly embrace.
With his boyish features and shaggy brunette hair, Jodah could have been Jaya's grandson, even though he was thousands of years her senior. "Come for the family silver?"
"Oh, there's nothing silver here that I like enough to keep—except for my hair. I've already checked your pockets. Thought about taking up lint farming?"
Jodah smiled. "I'm not worried. Your tongue is quicker than your fingertips."
Jaya's gaze fell on Karn and Ajani. "Well, this is a surprise. Are you two here to work on the negotiations as well?"
Ajani regarded Jaya, solemn. "We must speak to you regarding what Karn has seen in the Caves of Koilos. The Phyrexians have returned to Dominaria."
The idle small talk at the negotiation table dropped into shocked silence. Jodah and Jaya exchanged looks, then turned their attention to Karn. The Keldons, Benalish, and Argivians broke into argument, the overlapping dialects and accents turning their fears into babble. Only the Benalish knights remained at their posts, their rigid posture a testament to their discipline.
Jaya had paled. "It hardly seems possible."
"I have walked this plane for millennia," Jodah said, "and I have read the stories, examined the histories. I have visited the ruins: I tell you this not to boast but so that you know I speak the truth—the Phyrexians cannot traverse the Blind Eternities."
"Sheoldred has traveled between planes—" Karn said.
"Only Planeswalkers can do that now." Jodah pinched the bridge of his nose. "If I recall, Karn, that is a reality you helped usher in." His age—similar to Urza's, when Urza had created Karn—overwrote his young features with exhaustion. Karn could not believe Jodah would deny the truth, not when Karn had seen Sheoldred. Perhaps most Phyrexians could not survive the journey through the Blind Eternities, but Sheoldred had: even if it had burned away her organic materials, even if it had damaged and weakened her, somehow she had succeeded.
<NAME> stood and paced. He seemed agitated. "The Phyrexians are ancient history. I cannot see what you would have to gain by asserting this."
"I located a staging ground for a new invasion," Karn said, "led by one of New Phyrexia's leaders, a praetor named Sheoldred. The Society of Mishra serves her, and the Phyrexians are compleating dozens of ordinary citizens. We cannot know how many Phyrexians are stationed throughout Dominaria's nations. They may even be among us now."
"Have I not been warning you of this?" The young nobleman from New Argive stood. Based on his gold-embroidered and fur-lined finery, he had to be an important official. "Phyrexian sleeper agents will permeate every layer of society if we do not act now. For all we know, they already have!"
#figure(image("002_Episode 2: Sand in the Hourglass/02.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"Stenn, your alarmist tendencies are not helping," Jodah said. "Karn, where are the Phyrexians now?"
The varied thrush fixed a beady eye on him as if curious about his answer.
Karn did not have an answer. "They evacuated while I was incapacitated. I do not know."
Jodah sighed. "The diplomatic situation is too sensitive to halt negotiations now. If you knew where they were, this would be a different matter, but without stronger information, like a location, how could we act to root them out?"
"And even if the Phyrexians were on Dominaria," Jaya said, "historically they've divided before they conquered. If we leave this conflict between Benalia and the Keldons unresolved, we'd play right into their hands."
The thrush hopped along the railing.
"Karn, are you listening to me?" Jodah asked.
Karn returned his attention to Jodah. He placed the spearhead on the table. "I am."
"I have seen weapons from the Society of Mishra before," Jodah said, mild.
"When has Karn ever lied?" Ajani growled. "If he says he saw Sheoldred compleating people, then we are in danger."
"I believe you," Aron said. "But I cannot send my soldiers chasing whispers and rumors across Dominaria. Between the hostilities with the Keldons and fighting off the Cabal's resurgence, I don't have the fighters."
"His troops have the same engagements as mine." Radha laughed, a short bark. "I suppose we have found common ground in that."
Jodah glanced between Radha and Aron. "The Phyrexians haven't been a threat for centuries. I know that your memory is long, Karn. As is mine. If we address today's issue—the conflict between the Capashens and the Keldons—we can then discuss redeploying those same soldiers to fight the Phyrexians."
So many people had screamed in Sheoldred's lair, their voices thin and their pain sharp beneath the ecstatic orisons to her glory. "What of the lives Sheoldred now takes?"
Jodah placed his hand on Karn's shoulder. "We may not be talking about something as grand as an interplanar invasion, but lives are being lost to this conflict. They matter, too."
"We will go with you, Karn," said Jaya. "We will search for them. But now? Let us focus on the task at hand."
Karn could feel the room's attention swing back to the table, and the negotiations.
The thrush flew away.
"Stenn," Jaya said, "have someone show Karn and Ajani to the guest quarters."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Karn's room was simple, its furniture basic but well crafted: a bed, a large table with two chairs, and a washstand with a porcelain basin in it. Karn pushed the bed to one side and moved the table into the room's center. He unloaded his backpack, taking care to ensure the sylex, still in its case, was secure.
"The most coherent argument Jodah and Jaya had against assisting us," Karn said, "was that we do not know where the Phyrexians are. If we can determine their location, then we will be able to persuade Jodah and Jaya to help."
"And perhaps the others as well." Ajani paused, his powerful body coiled. "How?"
"A scrying device." Karn lifted his hand above the tabletop. He generated first the viewing plane, a copper sheet covered in crystal. He filled the narrow layer between the two materials with liquid. The device's remainder, a complex assemblage of mechanical parts, required his concentration. His body buzzed with the magic coursing through him.
Ajani watched him, the pale blue of his unscarred eye attentive. "What is that?"
"It is for viewing remote locations." Karn let pride seep into his voice. He'd developed the plan for it himself, and he knew of no other device that could perform similarly. Karn focused on Jhoira. Not on her face. Not on her physical presence, but on her essence, the qualities that made her #emph[Jhoira] . How she always saw through a person's circumstances to their essence. How she was willing to give everyone the benefit of a doubt.
The Mana Rig resolved within the crystal. At first fuzzy, the image filled with depth, then color. Perched upon a cliff's edge in Shiv's brutal desert, the metal structure had the size and complexity of a small city. The image tightened into a single location, a workshop with Jhoira in it. She sat at a workbench, her head bent, bronze hair bound and falling between her shoulder blades. She flipped a disconnected toggle back and forth as if thinking.
"Can you view Sheoldred?" Ajani asked.
All too easily could Karn visualize Sheoldred: her humanoid torso rising from her scorpion-like body; her voice, intimate and resonant inside his head. #emph[Karn] ~#emph[ such plans.]
The scryer's image dissolved into mistiness. Karn leaned back on his heels. Ajani glanced at Karn. "They must have protections in place to prevent us from scrying them."
"A sensible precaution." Unfortunately.
Ajani pulled the amulet from his belt that could summon the #emph[Weatherlight]. He placed it in Karn's palm. "You'll need this."
Karn examined the amulet. It seemed a straightforward device. "I can twin this."
Ajani smiled, his lips closed. "Even better."
Karn extended his senses into the amulet. He reproduced it, the metal coiling up from his fingertips to form an identical amulet. Ajani clipped the original to his belt while Karn manufactured a chain for his copy. Karn hung the amulet from his neck, feeling odd about the adornment. Normally he eschewed such things.
A varied thrush perched on Karn's windowsill, behind Ajani's shoulder.
If Karn could draw the Phyrexians out, he would not need to find them. He'd know where they were. The Phyrexians wanted to neutralize Dominaria's most powerful weapons. That included the sylex. He would use news of its presence to lure them into the open. But first he had to hide the sylex somewhere safe.
"Perhaps if we could speak to Jaya alone," Ajani suggested, "we could persuade her. She is no diplomat at heart."
Karn stared at the varied thrush, so still, so attentive. "Perhaps."
#figure(image("002_Episode 2: Sand in the Hourglass/03.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Karn let himself into the negotiations. Stenn was setting an inkwell on the granite table as Jodah and Jaya gave both Radha and Aron Capashen quill pens. He did not wish to interrupt before they signed. The sea breeze poured over the balcony, cool with springtime's edge.
"You are an impressive leader," Aron said. "I am proud to enter this new era with you."
Radha smiled. "You do like to talk fancy."
"And you like to be mistaken for a brute," <NAME> said. "Anyone who takes you for a simple warrior must soon regret it."
Jodah smiled. "Radha, Aron will bring this agreement to the other houses to present for ratification. I will accompany him to ensure that this process is accomplished within the next few months, during which all hostilities in the Ice Rime Hills will cease."
Radha put up her hands, conceding. "Yes, yes. The sacred sites aren't worth any more war—no matter what artifacts they might contain."
A breeze stirred the room as a pale-blue bead of light formed midair. The light whorled outward into a disc that brightened into azure as Teferi stepped through the vortex. He'd aged well: his shoulders had broadened with middle age, gray threaded his hair, and his umber skin had health's warm blush to it.
"Another Planeswalker?" Aron sat back in his chair, exasperated.
"It must be a sign of interesting times," Radha said.
Jodah stood. "What's happened?"
"It's the Phyrexians—they were on Kamigawa." Teferi closed his eyes and shook his head. "Given what Kaya told me of what she saw on Kaldheim—"
"They can travel between planes," Jaya said, lips drawn tight.
After a moment, Jodah said, "That is alarming, to say the least."
Had not Karn explained this to both Jaya and Jodah? He had seen this, with his own eyes. He felt Sheoldred's touch on his body, in his mind. Yet Teferi had arrived, bearing secondhand news, and Jodah and Jaya believed his assertions? Where were their requests for "proof of a location" now?
Karn might as well have been a statue for all the regard they had given him. And the threat Teferi had warned them of wasn't even #emph[on] Dominaria.
But none of that mattered. Only one fact remained relevant: "If the Phyrexians have traveled between multiple planes, then their invasion plans are much more widespread and well-coordinated than we anticipated."
Radha tensed. "Then we must fight."
Aron shook his head. His knights seemed restless, hands twitching toward their swords as if they expected to launch into action. "I never would have thought I'd live to see another Phyrexian invasion."
"The true Twilight has come," one of Radha's warriors hissed. "How can we battle such creatures?"
"However bad it is," Stenn said, "what will come is worse."
Jodah gave Jaya his calmest "help me" expression. Jaya flapped her hand at Karn and Ajani, as if asking them to remove Teferi, the origin of this disruption. Radha and Aron had not signed—and this made it seem like they wouldn't. Jodah looked like he'd bitten down on a charged piece of aluminum.
"I have the feeling that my timing was less than immaculate," Teferi said.
"You don't say," Jaya said, and gave them a meaningful look.
"I am not certain about this mutual protection clause—" Aron began.
"It might be best to look to our own shores, our own peoples—" Radha said.
Karn ushered Teferi away toward the door. Teferi let him.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
As planeswalking had exhausted Teferi, Karn and Ajani led him to the suite adjacent to theirs.
Outside, spring rain pattered down against the cliffside. The rosemary plants growing from cracks in the stone scented the air that wafted through the unglazed windows. Did rosemary's scent please Karn because #emph[he ] liked it? Or because Urza had designed him to like it? Karn would never know.
Teferi always made Karn consider his origins. Not always comfortably.
"How is Niambi?" Karn asked.
"She's providing medical aid to the nomadic tribes in Jamuraa." Teferi's pride in his daughter radiated from him. "And Jhoira?"
"I have not spoken to Jhoira in some time." Karn wished that Urza had made his face with a human's mobility and its subtlety in micro-expressions so that it would be easier for him to signal to Teferi that he did not wish to speak about this.
Ajani glanced between Teferi and Karn as if the awkward silence stretching between them were visible, a piece of string tight enough to twang. "Something else is troubling you."
"I did not wish to say this before the Keldons and Benalish," Teferi admitted, "but they took Tamiyo. Even Planeswalkers might be vulnerable to them now~ We waited too long, Ajani."
Ajani froze, shock plain across his face. "Tamiyo?"
Teferi nodded wearily. "We can discuss it after I've gotten some rest."
Karn watched as Ajani's hands curled into tight fists, anger and sorrow crossing his friend's face. He hadn't known they were close.
"I should rest as well," said the leonin after a moment.
Karn accepted this as his cue to depart. Back in his room, he opened the case with the sylex and the tablet in it. He removed the tablet, relocked the case, and set it on the table. He'd keep this here, to research it. But the sylex—that he needed to rehome.
Somewhere safe. And he knew just the place.
Karn pressed his palms to the scryer's crystal-covered copper. Jhoira's image appeared. She was no longer in her workshop but sleeping, her face crunched into her pillow, her reddish-brown hair lying in a messy braid across one cheek. Karn let her image fade.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Evading the Oyster Bay guards was simple: the people here may have once made great pirates, but they had not adopted the organized banality of guard duty. Karn, large in the shadows, avoided any light that would glint from his body. He slipped through the town's carved streets, sticking to the darkness, up and around to the top of the cliff.
He hiked along the Phyrexian portal ship's spine, its degraded metal softened with wildflowers like purple asters and goldenrod, toward a hill blanketed in young vine maples. Ferns rustled at Karn's shins, and the damp air condensed on his body.
Now a sufficient distance to avoid tweaking Jaya and Jodah's senses, Karn stepped through the searing Blind Eternities, tearing a wound in it. The edges fluttered against his body. He passed through it to Shiv and the Mana Rig, straight into Jhoira's workshop. It held a breathless silence, like every instrument in it waited for Jhoira to wake.
Karn located a supply closet. He stowed the sylex and its case on the lowest shelf behind lengths of pipe whose dust promised that Jhoira had not needed them recently. He generated two devices: one alarm that would register if the pipes moved, and another weight-sensitive alarm that would notify him if anyone moved the box itself. There. The sylex was safe. Or as safe as it could be. Karn stepped back into the Blind Eternities.
#figure(image("002_Episode 2: Sand in the Hourglass/04.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
Back on the forest hill, Karn wound his way downhill toward Oyster Bay. A light glimmered between the pale slender-trunked birch trees. A silhouetted person held aloft a lamp. Karn paused, but the lamp had glinted from his body. He had been seen. The figure moved closer. Stenn, the New Argivian noble from the negotiation table.
A nightjar called, its low warble traveling between the trees.
What if his precautions had not been enough?
"Out for a walk?" Stenn called.
"Yes," Karn said. "I do not sleep. You are awake late?"
"No, up early." As Stenn neared, his features grew clearer. His beard was trimmed and his hair tidy. "Dawn is the only time I truly feel safe. At peace. With the smell of baking bread wafting over the city, with the citizens starting to wake, I can imagine we aren't at war."
Morning had begun to whiten the sky. The air tasted like dew and cinnamon.
"I overheard the other Planeswalkers saying that you're immune to Phyrexian influence?"
"Yes."
"This means you may be the only Planeswalker who can be trusted." Stenn's sable mantle beaded with water. "You aren't the only one who can read the signs of invasion. King Darien has tasked me with discovering Phyrexian agents. Obviously, this is not common knowledge."
"What will you do after you discover such an agent?" Karn asked.
"What must be done," Stenn said. "The only thing that can be done. Once someone is compleated—they are lost, whether they know it or not."
"They don't know themselves?"
"No," Stenn said. "I think they are more useful to the Phyrexians—and harder to discover—if they themselves do not know."
It made sense that those forced to act against their own interests, their families, and their very plane, would be kept oblivious of their own actions. The Phyrexians had to be inserting these unknowing sleeper agents everywhere. Yet to kill such people, people who had already been so wronged~ King Darien must've selected Stenn for his ruthlessness.
"Have you ever caught such an agent?" Karn asked.
"No. Not yet." Stenn gazed at the dawn-glazed sea. Fishing boats skidded along the waves, tan sails belling. "Teferi's news frightened them."
Karn nodded. "They should be frightened. Do you think Benalia and Keld will unify?"
"I don't know," Stenn admitted, "but I do know that I can promise this: New Argive will mobilize. We will stand with you in defense of Dominaria."
Karn nodded, relieved that someone had taken him as a credible source. He had found his first ally willing to provide military support. "We can discuss the details later."
In town, few seemed wakeful—only bakers tucking yeasty loaves into ovens and children milking goats and feeding chickens. Sometimes, Karn imagined their pains: losing a pet rooster to the dining table, spilling a much-needed bucket of milk. Long after these people had died, Karn would continue to ponder their lives.
He felt old. Old, and tired. And the children's beautiful brevity seemed an unbearable tragedy in this still morning.
When Karn reached the city hall, Ajani was awake, pacing between banks of ropey wisteria vines. Ajani paused, his body quivering with tension, and his tail lashed once. Karn suspected this was not a voluntary gesture. He had seen how the leonin seemed to smother his non-human mannerisms when near humans. Ajani's blue eye caught the light in the dimness, pupil glinting a predator's green.
"Karn. Do you think the humans are awake yet?" Ajani asked. "Jodah and Jaya will sit the representatives down at the negotiation table once more today."
Karn could summon no patience for how Jodah continued to prioritize this small human conflict before the Phyrexian threat. "Some are. I encountered Stenn this morning, and he has pledged New Argive's forces."
"Then let us speak to Jaya," Ajani said, "before negotiations recommence."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
"You two would be much more compassionate toward me right now if you had heard of this substance called 'caffeine,'" Jaya muttered.
"I have heard of it," Karn said.
"It is vile," Ajani said.
Teferi entered the room and opened the doors to the balcony. The chilly sea breeze freshened the room, bringing with it springtime's rising birdsong. A seagull landed on the balcony and cocked its head, looking at Teferi's bread roll in a meaningful way. A varied thrush landed on the rail, then jumped along it. Could it be the same bird as yesterday? How could such a shy woodland bird, with its orange breast, tolerate a seagull?
"It isn't important who can charge what taxes at which border," Ajani said. "We should be prioritizing the fight against the Phyrexians."
"Correct." Karn eyed the thrush. "And keeping the sylex from Phyrexian hands."
"The sylex?" Ajani started. "You have it with you?"
"I had it in my possession," Karn said, "as I planned to deploy it on New Phyrexia and eradicate the Phyrexian threat at its source once I determined its workings."
"Karn, we agreed to handle that together. You can't go there alone," said Teferi, serious.
"You said yourself that we waited too long. All of you promised me your help, and then you told me to be patient. No longer," said Karn.
The thrush was not even pretending to peck at invisible crumbs.
Karn seized the bird. "I know what you are."
"Karn—" Jaya said.
The bird's chest peeled open and cables shot out. The cables, slick with blood and slime, wrapped themselves around Karn's head. Goo slipped down his skin and a maw at the tentacle's core searched along Karn's cheek for purchase, its teeth scraping down the smooth metal. Karn readjusted his grip around the creature's slippery body, trying to pull it from his face. But its wires had wrapped all the way around his head, locking together in a thick tangle at the nape of his neck. The creature's teeth caught on Karn's lip. It jabbed needle-like protuberances into him, like it wanted to inject him with some substance, and the needles snapped.
"It's too close to Karn," Jaya shouted. "I can't blast it."
"Let me—" Ajani said.
Slime sloughed off the creature and sizzled on Karn's skin, corroding his metal. It hurt. The creature snaked its tentacles between the joints on Karn's neck and around his collar, as if trying to prize him apart. Karn grunted and squeezed his fingers between the creature's slippery body and his face. He forced it off him, flinging it across the room where it smacked against the opposite wall and slid down. The creature caterpillar-crawled toward the door.
Teferi raised his hands, slowing the creature within a blurring field to prevent its rapid escape. Ajani lunged forward and pierced the creature with his claws, pinning it to the floor. It shrieked and writhed. Acid spurted from the wound.
Karn, face still steaming from the creature's corrosive slime, held out both hands, one over the other. He generated a bird cage, building it upwards until the bars united into a dome. Ajani ripped the monstrosity from the floor and flung it into the cage.
It rattled the bars, screeching.
Jaya crossed her arms. "Turns out Jodah has bigger things to worry about than taxes."
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Karn placed the Phyrexian bird on the granite negotiation table. Jodah leaned toward it, his eyes widening. The creature in the cage hissed at him. <NAME> looked sick. His Benalish knights had not moved, their discipline ironclad. Radha stared at it, eyes glittering. Her warriors had broken out into muttered prayers. Stenn's lips had thinned in satisfaction that his point was made.
"They're here," Jodah murmured. "Among us."
"I told you—" Stenn said.
Three of the Benalish knights exploded outward from their armor. Their eyes burst open in a shower of glistening black oil and their jaws distended, metal teeth emerging from their flesh to stud their gaping maws. Metallic fibers wriggled out from between the gaps in their armor. One of the creatures swung toward the granite table, his clawed hands drawn together in a double fist. It slammed its hands down on the granite table, cracking it in two.
"The negotiations are over," it said.
Its comrade seized Aron with its writhing tentacles, bundling him up like a spider would a fly.
Karn strode forward, Teferi and Ajani flanking him. Jaya held up her hands, summoning fire into her palms. Jodah gathered energy, distorting the air around him with ribbons of color, and then solidified it into a forcefield to protect the unchanged Benalish soldiers from the Phyrexians.
#figure(image("002_Episode 2: Sand in the Hourglass/05.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none)
"For Gerrard," one woman bellowed, lifting her sword. She dodged past Jodah's barrier to charge her ex-comrades. The Phyrexian knight avoided her blow by splitting itself in two: it slid apart into two meaty pieces, legs sprouting from what had once been glistening internal organs. Both halves attacked.
"The first wind of ascension is Forger," Radha called, backing toward the door. She—like Aron—had come to the negotiation table unarmed.
"Burning away impurity!" her warriors bellowed, forming up around her to protect her. They fought off the lashing tentacles that reached to seize her, chopping off the Phyrexians' limbs. But any appendage that hit the ground seemed to gain life of its own, sprouting legs and teeth, squirming toward the retreating Keldons.
The Argivians fell back, joining the Keldons, fighting with their rapiers, the weapons of nobles who had never seen battle and never expected to. Stenn himself only wielded a dagger. Separated from his people, he backed away between the shards of the broken table until he reached Jaya's wreath of protective flame. Karn had almost reached Aron.
The Phyrexian holding him released a low laugh like an opening steam valve. It rolled its body around Aron and bounded onto a neighboring balcony. Ajani snarled with frustration and launched himself after it.
Ajani! Karn could not follow—the balconies would break under his weight if he tried to leap after light-footed Ajani. Karn made a noise, low in his chest, with frustration, and took a step back. Teferi cursed.
"I can't jump that distance," Teferi said.
The Keldons had reached the door.
"I do not wish to leave you, Archmage," Radha shouted. "Keld stands with Dominaria—for Dominarians. We will battle this blasphemy alongside you, to defend all peoples."
"Go," Jodah shouted. "We will fight together another day!"
"There are too many of them," Karn said. "Blockade them in this room!"
Radha nodded once.
The brass double doors slammed shut, locking the Planeswalkers and the mage into the room with the Phyrexians.
Jaya swirled her hands up and around, blocking off the Phyrexians from Jodah, protecting him. Her flame burned white with a vicious heat. Karn had no doubt that Jaya's magic could defeat even this. He pushed himself through the heat. It seared the tentacles trying to wriggle into the joints of his body, ending them.
"Much as I'd love to do this all day," Jaya said, throwing a fireball at a writhing hunk of metal and flesh, "Jodah?"
"I have summoned the energy." Jodah's eyes were aglow with it, his skin incandescent. "But I need to know where to direct it to create the portal. A secure location."
"Argivia," Stenn gasped. He flicked a piece of tentacle off him with his blade and stomped on it. Blood and oil spurted beneath his boot, and he turned toward the next encroaching tentacle and speared it through. "New Argive's watchtower."
"It is as safe a place as any." Karn retreated toward Jodah, Teferi at his side.
Jodah's portal spun into existence behind him. It opened like a doorway cut into the air itself, revealing a small circular room.
Jodah retreated through it to sustain it from the other side.
"I'll hold them off," Jaya said, searing the writhing cables with her fire. "If you can get through the portal, I'll blast this room with such fire that not a single piece of Phyrexian will remain. Go!"
"My thanks," Stenn said. He backed through the portal as well.
"And mine as well," Teferi said, and vanished through the whirling vortex.
Jaya grinned in triumph as she raised her hands in a blaze of fire and set the whole room alight. The screams of the Phyrexians, wet and unnatural, whistled.
Karn stepped through the portal. The magic tingled across his skin and swallowed him, depositing him on the other side. A shape, airborne, flicked past him. Karn turned to search for it. He couldn't see any movement in the small room but for those who had arrived with him: Stenn, Jodah, Jaya, Teferi, and himself.
Jaya, last through the portal, joined Karn at his side.
Jodah closed the portal and collapsed, sagging to the ground. Transporting so many people was no easy feat—even for Jodah.
The humans all sat on the floor, sweating, panting, and bleeding, while Karn remained standing. He searched the room for the flickering shadow. The tower room had tiny arched windows ringing it and was empty but for a pedestal in the center, which seemed to have a control panel on it. Overhead a golden light shimmered through a crystal—no, not a crystal: a powerstone.
A shadow flitted across the powerstone's face.
"One followed us," Karn said.
"We must not let it escape. It could wreak havoc on the city." Stenn flipped a toggle on the central control panel. The watchtower thumped as gears ground to life. The walls' interiors echoed with the rattle of moving chains. Steel shutters and blast doors slammed shut, blocking out all light. The room instantly felt stuffier, more claustrophobic. Stenn handed Karn the key. "You are the only one who's incorruptible, so it's only right that you have it."
Jaya bumped her shoulder into Jodah's. "You never get tired of being right, do you?"
"The millennia may wear on, but no. No, I do not." Jodah's smile faded, and he turned to Karn.
"Nothing and no one may leave while the tower is in lockdown," Stenn said.
Teferi eyed the steel shutters. "We must capture and destroy the Phyrexian trapped here. And we must determine if any among us has been compromised. We need to know who we can trust before we can plan how to defeat them."
"Agreed," Jodah said.
The group checked the room. The small Phyrexian thing that had come with them had escaped the chamber. Karn surmised it must have squirreled itself away through some crack in the stone. He strung the key from the same chain he used to hang the scryer and the beacon to summon the #emph[Weatherlight] and turned to face his companions. A frisson of unease traveled through his body, as if an electric current arced through him. Jodah, Jaya, Teferi, Stenn~ How could he determine who he could trust?
If the Phyrexians were already on Dominaria, who could anyone trust?
|
|
https://github.com/metamuffin/typst | https://raw.githubusercontent.com/metamuffin/typst/main/tests/typ/meta/state.typ | typst | Apache License 2.0 | // Test state.
---
#let s = state("hey", "a")
#let double(it) = 2 * it
#s.update(double)
#s.update(double)
$ 2 + 3 $
#s.update(double)
Is: #s.display(),
Was: #locate(location => {
let it = query(math.equation, location).first()
s.at(it.location())
}).
---
#set page(width: 200pt)
#set text(8pt)
#let ls = state("lorem", lorem(1000).split("."))
#let loremum(count) = {
ls.display(list => list.slice(0, count).join(".").trim() + ".")
ls.update(list => list.slice(count))
}
#let fs = state("fader", red)
#let trait(title) = block[
#fs.display(color => text(fill: color)[
*#title:* #loremum(1)
])
#fs.update(color => color.lighten(30%))
]
#trait[Boldness]
#trait[Adventure]
#trait[Fear]
#trait[Anger]
|
https://github.com/Dideldumm/realistic-polygons | https://raw.githubusercontent.com/Dideldumm/realistic-polygons/main/parcio-typst/bachelor-thesis/chapters/merge_convex_hulls/merge_convex_hulls.typ | typst | = Merge Convex Hulls Algorithm
The main idea of this algorithm is that vertices that are almost collinear should end up a connected polygonal chain.
To achieve this the algorithm recursively builds up convex hulls of the points.
The result is a set of layers of convex hulls, similar to an onion.
Then the convex hulls are merged to build a polygon that consists of all the given points.
== Pseudocode
```c
function merge_convex_hulls(PointSet point_set) -> Polygon {
List convex_hulls := create_convex_hulls(point_set);
Polygon polygon = convex_hulls.pop_front();
while (convex_hulls is not empty) {
Polygon new_hull = convex_hulls.pop_front();
polygon = merge(polygon, new_hull);
}
}
/**
* Iteratively creates a convex hull for the given point set and
* removes the points from the set.
* Repeats this process until the set is empty.
* Returns a list of the created convex hulls
* The hulls in the list are sorted from most outside to most inside.
*/
function create_convex_hulls(PointSet point_set) -> ListOfConvexHulls {
List convex_hulls := {};
while (point_set is not empty) {
Polygon new_hull = create_convex_hull(point_set);
convex_hulls.add(new_hull);
point_set.remove_all(new_hull.get_vertices());
}
return convex_hulls;
}
function merge(Polygon polygon, ConvexPolygon new_hull) -> Polygon {
// TODO add implementation
for (Point vertex : new_hull.vertices()) {
Segment nearest_segment = find_nearest_segment(polygon.edges(), point);
polygon.insert_point_at_segment(nearest_segment, point);
}
return polygon;
}
```
bla |
|
https://github.com/Vortezz/fiches-mp2i-maths | https://raw.githubusercontent.com/Vortezz/fiches-mp2i-maths/main/chapter_4.typ | typst | #set page(header: box(width: 100%, grid(
columns: (100%),
rows: (20pt, 8pt),
  align(right, text("CHAPTER 4. SUMS")),
line(length: 100%),
)), footer: box(width: 100%, grid(
columns: (50%, 50%),
rows: (8pt, 20pt),
line(length: 100%),
line(length: 100%),
align(left, text("<NAME> - MP2I")),
align(right, text("<NAME> - 2023/2024")),
)))
#set heading(numbering: "I.1")
#set math.vec(delim:"(")
#let titleBox(title) = align(center, block(below: 50pt, box(height: auto, fill: rgb("#eeeeee"), width: auto, inset: 40pt, text(title, size: 20pt, weight: "bold"))))
#titleBox("Sums")
= Working with the signs $sum$ and $product$
== Definition of the notation
Let $I$ be a finite set and $(a_i)_(i in I)$ a family of real or complex numbers.
- We write $sum_(i in I) a_i$ for the sum of the $a_i$ over $i in I$.
- We write $product_(i in I) a_i$ for the product of the $a_i$ over $i in I$.
- When $I = [|n,m|]$, with $n <= m$, we write $sum_(i = n)^m a_i$ for the sum of the $a_i$ over $i in [|n,m|]$.
We say that $i$ is a dummy variable: it can be replaced by any other letter, but not by a letter already used in the sum.
If $I = emptyset$, then by convention $sum_(i in I) a_i = 0$ and $product_(i in I) a_i = 1$.
The *factorial* of $n$ is defined by $n! = product_(k = 1)^n k$.
== Change of index
Let $I$ and $J$ be two finite sets and $f : I ->^(approx) J$ a bijection; then $sum_(j in J) a_j = sum_(i in I) a_(f(i))$. ($sum$ may be replaced by $product$.)
The index can also be shifted, i.e. $i$ replaced by $i - l$: we then have $sum_(i = n)^m a_i = sum_(i = n - l)^(m-l) a_(i + l)$.
== Summation by grouping of terms
Suppose $I = I_1 union.plus I_2$, with $I$ finite; then $sum_(i in I) a_i = sum_(i in I_1) a_i + sum_(i in I_2) a_i$.
This generalizes to $n$ sets $I_1, I_2, ..., I_n$ with $I = I_1 union.plus I_2 union.plus ... union.plus I_n$: we then have $sum_(i in I) a_i = sum_(i in I_1) a_i + sum_(i in I_2) a_i + ... + sum_(i in I_n) a_i = sum_(k = 1)^n sum_(i in I_k) a_i$.
== Linearity
Let $lambda$ and $mu$ be two real or complex numbers; then $sum_(i in I)a_i + sum_(i in I)b_i = sum_(i in I) (a_i + b_i)$ and $lambda sum_(i in I) a_i = sum_(i in I) lambda a_i$.
It follows that $sum_(i in I) (lambda a_i + mu b_i) = lambda sum_(i in I) a_i + mu sum_(i in I) b_i$.
For $E$ a finite set and $a$ a real or complex number, $sum_(i in E) a = |E| times a$.
== Telescoping sums
The sum $sum_(k=0)^n a_k$ is called telescoping if $a_k = b_(k+1) - b_k$.
We then have $sum_(k=0)^n a_k = sum_(k=0)^n (b_(k+1) - b_k) = b_(n+1) - b_0$.
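A quick numerical sanity check of the telescoping identity (my own illustration, not part of the original sheet; any sequence $b$ works):

```python
def telescoping_sum(b, n):
    # sum of a_k = b(k+1) - b(k) for k = 0..n, computed term by term
    return sum(b(k + 1) - b(k) for k in range(n + 1))

b = lambda k: k * k
n = 10
lhs = telescoping_sum(b, n)
rhs = b(n + 1) - b(0)  # the collapsed closed form
```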
== The case of products
- If $I_1 sect I_2 = emptyset$, then $product_(i in I_1) a_i times product_(i in I_2) a_i = product_(i in I_1 union.plus I_2) a_i$
- $(product_(i in I) a_i)^lambda (product_(i in I) b_i)^mu = product_(i in I) (a_i)^lambda (b_i)^mu$
- $product_(i in I) a = a^(|I|)$
The product $product_(k=0)^n a_k$ is called telescoping if $a_k = b_(k+1) / b_k$.
We then have $product_(k=0)^n a_k = product_(k=0)^n (b_(k+1) / b_k) = b_(n+1) / b_0$.
== Multiple sums
Some sums are indexed over a Cartesian product.
Let $K subset I times J$;
- for $i in I$, the *slice of $K$ along $i$* is $K_(i,circle.filled.small) = {j in J | (i,j) in K}$;
- for $j in J$, the *slice of $K$ along $j$* is $K_(circle.filled.small,j) = {i in I | (i,j) in K}$.
We also define $K'_(i,circle.filled.small) = {(i,j) | j in K_(i,circle.filled.small)}$ and $K'_(circle.filled.small,j) = {(i,j) | i in K_(circle.filled.small,j)}$.
Exchanging the order of summation then gives:
$ sum_((i,j) in K) a_(i,j) = sum_(i in I) sum_(j in K_(i,circle.filled.small)) a_(i,j) = sum_(i in I) sum_((i,j) in K'_(i,circle.filled.small)) a_(i,j) = sum_(j in J) sum_(i in K_(circle.filled.small,j)) a_(i,j) = sum_(j in J) sum_((i,j) in K'_(circle.filled.small,j)) a_(i,j) $
If $K = I times J$, then $K_(i,circle.filled.small) = J$ and $K_(circle.filled.small,j) = I$, so $ sum_((i,j) in I times J) a_(i,j) = sum_(i in I) sum_(j in J) a_(i,j) = sum_(j in J) sum_(i in I) a_(i,j) "(sum over a rectangle)" $
We also have $sum_(i=0)^(n) sum_(j=i)^(n) a_(i,j) = sum_(j=0)^n sum_(i=0)^j a_(i,j) "(sum over a triangle)" $
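The triangle identity can likewise be checked numerically (illustration only, with arbitrary coefficients):

```python
import random

n = 6
a = {(i, j): random.randint(-9, 9) for i in range(n + 1) for j in range(n + 1)}

# sum over the triangle 0 <= i <= j <= n, in both orders
row_first = sum(a[(i, j)] for i in range(n + 1) for j in range(i, n + 1))
col_first = sum(a[(i, j)] for j in range(n + 1) for i in range(j + 1))
```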
== Products of sums
We have $(sum_(i in I) a_i) (sum_(j in J) b_j) = sum_(i in I) sum_(j in J) a_i b_j = sum_((i,j) in I times J) a_i b_j$
#emoji.warning It is important to make the indices independent, as explained above.
*Generalized distributivity theorem*: We have: $ product_(k = 1)^n (sum_(i=1)^(m_k) a_(k,i)) = sum_((i_1,...,i_n) in [|1, m_1|] times ... times [|1,m_n|]) a_(1,i_1) a_(2,i_2) ... a_(n,i_n) $
= Classic sums to know
== Sums of powers of integers
- $sum_(k=1)^n k^0 = sum_(k=1)^n 1 = n$
- $sum_(k=1)^n k = (n(n+1))/2$
- $sum_(k=1)^n k^2 = (n(n+1)(2n+1))/6$
- $sum_(k=1)^n k^3 = ((n(n+1))/2)^2$
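These four closed forms are easy to verify numerically (my own check, not part of the sheet):

```python
def check_power_sums(n):
    # verify the four closed forms for sums of k^0, k, k^2, k^3
    ks = range(1, n + 1)
    assert sum(k ** 0 for k in ks) == n
    assert sum(ks) == n * (n + 1) // 2
    assert sum(k ** 2 for k in ks) == n * (n + 1) * (2 * n + 1) // 6
    assert sum(k ** 3 for k in ks) == (n * (n + 1) // 2) ** 2
    return True
```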
== Geometric sums
Let $a$ and $b$ be two real or complex numbers and $n in NN$; then:
- $a^n - b^n = (a-b) sum_(k=0)^(n-1) a^(n-1-k) b^k$
- If $b=1$: $a^n - 1 = (a-1) sum_(k=0)^(n-1) a^k$
- If $n$ is odd: $a^n + b^n = (a+b) sum_(k=0)^(n-1) (-1)^k a^(n-1-k) b^k$
Let $x$ be a real or complex number; then $sum_(k=0)^n x^k = cases(
	n+1 "if" x = 1,
	(1 - x^(n+1)) / (1-x) "otherwise"
)$ |
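The case distinction can be checked exactly with rational arithmetic (illustration only; exact fractions avoid floating-point error):

```python
from fractions import Fraction

def geometric_sum(x, n):
    # closed form for sum_{k=0}^{n} x^k
    if x == 1:
        return n + 1
    return (1 - x ** (n + 1)) / (1 - x)

x = Fraction(2, 3)
direct = sum(x ** k for k in range(9))   # n = 8, computed term by term
closed = geometric_sum(x, 8)
```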
|
https://github.com/Gchism94/PrettyTypst | https://raw.githubusercontent.com/Gchism94/PrettyTypst/main/_extensions/PrettyTypst/typst-show.typ | typst | Creative Commons Zero v1.0 Universal | #show: PrettyTypst.with(
$if(title)$
title: "$title$",
$endif$
$if(typst-logo)$
typst-logo: (
path: "$typst-logo.path$",
caption: [$typst-logo.caption$]
),
$endif$
)
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/references/cross_file_ref_label.typ | typst | Apache License 2.0 | // path: base2.typ
== Base 2 <b2>
Ref to b1 @b1
Ref to b2 @b2
Ref to b1 @b1 again
-----
// path: base1.typ
== Base 1 <b1>
Ref to b1 @b1
Ref to b2 @b2
-----
// compile:true
#set heading(numbering: "1.")
= Test Ref Label
#include "base1.typ"
#include "base2.typ"
Ref to b1 /* position after */ @b1
Ref to b2 @b2 |
https://github.com/matthiasbeyer/typst-template-paper | https://raw.githubusercontent.com/matthiasbeyer/typst-template-paper/master/README.md | markdown | MIT License | # typst-template-paper
A `typst` template repository **FOR PERSONAL USE**.
This is intended as _template repository_ not as _typst template_.
This means I use this repository to kickstart other repositories. I do not use
this repository as template for a typst document.
This is mainly for personal use and to discover/learn how typst works, but maybe
someone finds this interesting/useful.
Feature requests will be ignored if not accompanied by a PR implementing them -
and then only if I find it useful.
## License
MIT.
See [LICENSE].
(c) 2024 <NAME>
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/jurz/0.1.0/jurz.typ | typst | Apache License 2.0 | #let init-jurz = (
two-sided: false,
gap: 1em,
supplement: "Rz.",
reset-level: 0,
body
) => {
// Reset the Rz. in each heading "bigger" (i.e., with a smaller level) than the reset-level
show heading: it => {
it
if (it.level <= reset-level) {
counter("rz").update(0)
}
}
// Setup basic settings
show heading.where(level: 99): set heading(numbering: (..nums) => {
counter("rz").display()
}, supplement: supplement, outlined: false)
// Set up rendering outside the text flow
show heading.where(level: 99): it => {
counter("rz").step()
context {
// in one-sided layouts, every page is considered to be an even page
let isOddPage = here().page().bit-and(1) > 0 and two-sided;
let position = if isOddPage { right } else { left }
let inner-position = if isOddPage { left } else { right }
let dx = gap * if isOddPage { 1 } else { -1 }
place(
position,
dx: dx,
place(inner-position, it.body)
)
}
}
// Display the body
body
}
#let rz = heading(level: 99, counter("rz").display())
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/meta/figure-01.typ | typst | Other |
// Testing figures with tables.
#figure(
table(
columns: 2,
[Second cylinder],
image("test/assets/files/cylinder.svg"),
),
caption: "A table containing images."
) <fig-image-in-table>
|
https://github.com/Tran-Thu-Le/typst-collection | https://raw.githubusercontent.com/Tran-Thu-Le/typst-collection/main/multi-files-reference/readme.md | markdown | # References on multiple Typst files
- As of the current version of Typst (05/06/2024), one cannot cite references across different Typst files. This folder shows how to work around this using the file `multi-ref.typ`.
This folder provides a minimal example of how to use multiple references across different Typst files.
- This is based on https://stackoverflow.com/questions/77913564/add-a-citation-reference-without-a-bibliography-entry-in-typst
|
https://github.com/OverflowCat/BUAA-Digital-Image-Processing-Sp2024 | https://raw.githubusercontent.com/OverflowCat/BUAA-Digital-Image-Processing-Sp2024/master/chap11/chain.typ | typst | #let colored(x) = text(fill: red, $#x$)
#let calcChain(chain) = {
let chain- = ()
let content = []
let first = int(chain.at(0)) - int(chain.at(chain.len()-1))
if first < 0 {
first += 4
}
  content += [0. #colored($ #chain.at(0) - #chain.at(chain.len()-1) = first;$)]
for i in range(chain.len() - 1) {
let pre = chain.at(i)
let post = chain.at(i + 1)
let delta = int(post) - int(pre)
content += [ + $post - pre = #delta#if delta < 0 [$, quad #delta + 4 = #(delta+4)$] #if i == chain.len() - 2 {"."} else {";"}$ ]
if delta < 0 {
delta += 4
}
chain-.push(delta)
}
(
content: content,
first: first,
res: chain-.map(str).join(""),
)
}
#let chain = "0110233210332322111"
#let (res, content, first) = calcChain(chain)
#import "util.typ": problem
#problem[
    Compute the first difference of the chain code #chain.
]
#content
Hence the first difference of the chain code $chain$ is $res$; computed as a circular first-difference chain code, the first difference is $colored(first)res$.
#let lShift(s, k) = {
s.slice(k) + s.slice(0, k)
}
#let calcShape(chain) = {
let min = chain
for i in range(1, chain.len()) {
let shifted = lShift(chain, i)
if shifted < min {
min = shifted
}
}
min
}
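// A hedged usage sketch (not in the original file): `calcShape` returns the
// lexicographically smallest rotation of a code string, i.e. the shape
// number of the chain code. Something like the following could be appended:
//
// #let shape = calcShape(res)
// The shape number of the first difference $res$ is $shape$.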
|
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/math/matrix-alignment_00.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test alternating alignment in a vector.
$ vec(
"a" & "a a a" & "a a",
"a a" & "a a" & "a",
"a a a" & "a" & "a a a",
) $
|
https://github.com/platformer/typst-algorithms | https://raw.githubusercontent.com/platformer/typst-algorithms/main/README.md | markdown | MIT License | # Algo
A Typst library for writing algorithms. On Typst v0.6.0+ you can import the `algo` package:
```typst
#import "@preview/algo:0.3.3": algo, i, d, comment, code
```
Otherwise, add the `algo.typ` file to your project and import it as normal:
```typst
#import "algo.typ": algo, i, d, comment, code
```
Use the `algo` function for writing pseudocode and the `code` function for writing code blocks with line numbers. Check out the [examples](#examples) below for a quick overview. See the [usage](#usage) section to read about all the options each function has.
## Examples
Here's a basic use of `algo`:
```typst
#algo(
title: "Fib",
parameters: ("n",)
)[
if $n < 0$:#i\ // use #i to indent the following lines
return null#d\ // use #d to dedent the following lines
if $n = 0$ or $n = 1$:#i #comment[you can also]\
return $n$#d #comment[add comments!]\
return #smallcaps("Fib")$(n-1) +$ #smallcaps("Fib")$(n-2)$
]
```
<img src="https://user-images.githubusercontent.com/40146328/235323240-e59ed7e2-ebb6-4b80-8742-eb171dd3721e.png" width="400px" />
<br />
Here's a use of `algo` without a title, parameters, line numbers, or syntax highlighting:
```typst
#algo(
line-numbers: false,
strong-keywords: false
)[
if $n < 0$:#i\
return null#d\
if $n = 0$ or $n = 1$:#i\
return $n$#d\
\
let $x <- 0$\
let $y <- 1$\
for $i <- 2$ to $n-1$:#i #comment[so dynamic!]\
let $z <- x+y$\
$x <- y$\
$y <- z$#d\
\
return $x+y$
]
```
<img src="https://user-images.githubusercontent.com/40146328/235323261-d6e7a42c-ffb7-4c3a-bd2a-4c8fc2df5f36.png" width="300px" />
<br />
And here's `algo` with more styling options:
```typst
#algo(
title: [ // note that title and parameters
#set text(size: 15pt) // can be content
#emph(smallcaps("Fib"))
],
parameters: ([#math.italic("n")],),
comment-prefix: [#sym.triangle.stroked.r ],
comment-styles: (fill: rgb(100%, 0%, 0%)),
indent-size: 15pt,
indent-guides: 1pt + gray,
row-gutter: 5pt,
column-gutter: 5pt,
inset: 5pt,
stroke: 2pt + black,
fill: none,
)[
if $n < 0$:#i\
return null#d\
if $n = 0$ or $n = 1$:#i\
return $n$#d\
\
let $x <- 0$\
let $y <- 1$\
for $i <- 2$ to $n-1$:#i #comment[so dynamic!]\
let $z <- x+y$\
$x <- y$\
$y <- z$#d\
\
return $x+y$
]
```
<img src="https://github.com/platformer/typst-algorithms/assets/40146328/89f80b5d-bdb2-420a-935d-24f43ca597d8" width="300px" />
Here's a basic use of `code`:
````typst
#code()[
```py
def fib(n):
if n < 0:
return None
if n == 0 or n == 1: # this comment is
return n # normal raw text
return fib(n-1) + fib(n-2)
```
]
````
<img src="https://user-images.githubusercontent.com/40146328/235324088-a3596e0b-af90-4da3-b326-2de11158baac.png" width="400px"/>
<br />
And here's `code` with some styling options:
````typst
#code(
indent-guides: 1pt + gray,
row-gutter: 5pt,
column-gutter: 5pt,
inset: 5pt,
stroke: 2pt + black,
fill: none,
)[
```py
def fib(n):
if n < 0:
return None
if n == 0 or n == 1: # this comment is
return n # normal raw text
return fib(n-1) + fib(n-2)
```
]
````
<img src="https://github.com/platformer/typst-algorithms/assets/40146328/c091ac43-6861-40bc-8046-03ea285712c3" width="400px"/>
## Usage
### `algo`
Makes a pseudocode element.
```typst
algo(
body,
header: none,
title: none,
parameters: (),
line-numbers: true,
strong-keywords: true,
keywords: _algo-default-keywords, // see below
comment-prefix: "// ",
indent-size: 20pt,
indent-guides: none,
indent-guides-offset: 0pt,
row-gutter: 10pt,
column-gutter: 10pt,
inset: 10pt,
fill: rgb(98%, 98%, 98%),
stroke: 1pt + rgb(50%, 50%, 50%),
radius: 0pt,
breakable: false,
block-align: center,
main-text-styles: (:),
comment-styles: (fill: rgb(45%, 45%, 45%)),
line-number-styles: (:)
)
```
**Parameters:**
* `body`: `content` — Main algorithm content.
* `header`: `content` — Algorithm header. If specified, `title` and `parameters` are ignored.
* `title`: `string` or `content` — Algorithm title. Ignored if `header` is specified.
* `parameters`: `array` — List of algorithm parameters. Elements can be `string` or `content` values. `string` values will automatically be displayed in math mode. Ignored if `header` is specified.
* `line-numbers`: `boolean` — Whether to display line numbers.
* `strong-keywords`: `boolean` — Whether to strongly emphasize keywords.
* `keywords`: `array` — List of terms to receive strong emphasis. Elements must be `string` values. Ignored if `strong-keywords` is `false`.
The default list of keywords is stored in `_algo-default-keywords`. This list contains the following terms:
```
("if", "else", "then", "while", "for",
"repeat", "do", "until", ":", "end",
"and", "or", "not", "in", "to",
"down", "let", "return", "goto")
```
Note that for each of the above terms, `_algo-default-keywords` also contains the uppercase form of the term (e.g. "for" and "For").
* `comment-prefix`: `content` — What to prepend comments with.
* `indent-size`: `length` — Size of line indentations.
* `indent-guides`: `stroke` — Stroke for indent guides.
* `indent-guides-offset`: `length` — Horizontal offset of indent guides.
* `row-gutter`: `length` — Space between lines.
* `column-gutter`: `length` — Space between line numbers, text, and comments.
* `inset`: `length` — Size of inner padding.
* `fill`: `color` — Fill color.
* `stroke`: `stroke` — Stroke for the element's border.
* `radius`: `length` — Corner radius.
* `breakable`: `boolean` — Whether the element can break across pages. WARNING: indent guides may look off when broken across pages.
* `block-align`: `none` or `alignment` or `2d alignment` — Alignment of the `algo` on the page. Using `none` will cause the internal `block` element to be returned as-is.
* `main-text-styles`: `dictionary` — Styling options for the main algorithm text. Supports all parameters in Typst's native `text` function.
* `comment-styles`: `dictionary` — Styling options for comment text. Supports all parameters in Typst's native `text` function.
* `line-number-styles`: `dictionary` — Styling options for line numbers. Supports all parameters in Typst's native `text` function.
### `i` and `d`
For use in an `algo` body. `#i` indents all following lines and `#d` dedents all following lines.
### `comment`
For use in an `algo` body. Adds a comment to the line in which it's placed.
```typst
comment(
body,
inline: false
)
```
**Parameters:**
* `body`: `content` — Comment content.
* `inline`: `boolean` — If true, the comment is displayed in place rather than on the right side.
NOTE: inline comments will respect both `main-text-styles` and `comment-styles`, preferring `comment-styles` when the two conflict.
NOTE: to make inline comments insensitive to `strong-keywords`, strong emphasis is disabled within them. This can be circumvented via the `text` function:
```typst
#comment(inline: true)[#text(weight: 700)[...]]
```
### `no-emph`
For use in an `algo` body. Prevents the passed content from being strongly emphasized. If a word appears in your algorithm both as a keyword and as normal text, you may escape the non-keyword usages via this function.
```typst
no-emph(
body
)
```
**Parameters:**
* `body`: `content` — Content to display without emphasis.
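For instance, here is a small sketch (not from the original docs; it assumes `no-emph` is imported alongside the other functions):

```typst
#algo(
  title: "Find",
  parameters: ("x", "A")
)[
  for each $a$ in $A$:#i\
    check #no-emph[if] $a = x$#d\
  return null
]
```

Here "for", "in", "if", and "return" receive strong emphasis by default, while the `if` inside `#no-emph[...]` is rendered as plain text.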
### `code`
Makes a code block element.
```typst
code(
body,
line-numbers: true,
indent-guides: none,
indent-guides-offset: 0pt,
tab-size: auto,
row-gutter: 10pt,
column-gutter: 10pt,
inset: 10pt,
fill: rgb(98%, 98%, 98%),
stroke: 1pt + rgb(50%, 50%, 50%),
radius: 0pt,
breakable: false,
block-align: center,
main-text-styles: (:),
line-number-styles: (:)
)
```
**Parameters:**
* `body`: `content` — Main content. Expects `raw` text.
* `line-numbers`: `boolean` — Whether to display line numbers.
* `indent-guides`: `stroke` — Stroke for indent guides.
* `indent-guides-offset`: `length` — Horizontal offset of indent guides.
* `tab-size`: `integer` — Amount of spaces that should be considered an indent. If unspecified, the tab size is determined automatically from the first instance of starting whitespace.
* `row-gutter`: `length` — Space between lines.
* `column-gutter`: `length` — Space between line numbers and text.
* `inset`: `length` — Size of inner padding.
* `fill`: `color` — Fill color.
* `stroke`: `stroke` — Stroke for the element's border.
* `radius`: `length` — Corner radius.
* `breakable`: `boolean` — Whether the element can break across pages. WARNING: indent guides may look off when broken across pages.
* `block-align`: `none` or `alignment` or `2d alignment` — Alignment of the `code` on the page. Using `none` will cause the internal `block` element to be returned as-is.
* `main-text-styles`: `dictionary` — Styling options for the main raw text. Supports all parameters in Typst's native `text` function.
* `line-number-styles`: `dictionary` — Styling options for line numbers. Supports all parameters in Typst's native `text` function.
## Contributing
PRs are welcome! And if you encounter any bugs or have any requests/ideas, feel free to open an issue.
|
https://github.com/Skimmeroni/Appunti | https://raw.githubusercontent.com/Skimmeroni/Appunti/main/Metodi%20Algebrici/Codici/Duale.typ | typst | Creative Commons Zero v1.0 Universal | #import "../Metodi_defs.typ": *
Let $C$ be a code in $ZZ_(p)^(n)$ of dimension $k$. The set
$C^(perp) subset.eq ZZ_(p)^(n)$ containing all vectors orthogonal to every
vector of $C$ is called the *dual code* of $C$. In particular, if
$C = C^(perp)$, the code $C$ is said to be *self-dual*.
$ C^(perp) = {x in ZZ_(p)^(n) : x dot c = 0, space forall c in C} $
#example[
	Let $C in ZZ_(2)^(4) = {0000, 1110, 1011, 0101}$ be a code. We want to
	construct its dual code:
$ C^(perp) =
cases(
0 dot A + 0 dot B + 0 dot C + 0 dot D = 0,
1 dot A + 1 dot B + 1 dot C + 0 dot D = 0,
1 dot A + 0 dot B + 1 dot C + 1 dot D = 0,
0 dot A + 1 dot B + 0 dot C + 1 dot D = 0
) =
cases(
0 = 0,
A + B + C = 0,
A + C + D = 0,
B + D = 0
) =
cases(
A + C - D = 0,
A + C + D = 0,
B = -D
) $
	Recalling that $[a]_(2) = [-a]_(2)$ for any $a$, we have
	$C^(perp) = {x in ZZ_(2)^(4) : x = (A, B, A + B, B)}$. Since
	$ZZ_(2)^(4)$ is a finite set, the dual code of $C$ can be written
	explicitly as $C^(perp) = {0000, 0111, 1010, 1101}$.
]
#lemma[
	Let $C$ be a code in $ZZ_(p)^(n)$. Then $C^(perp)$ is a vector
	subspace of $ZZ_(p)^(n)$.
]
#proof[
	First of all, note that the zero word necessarily belongs to
	$C^(perp)$. Indeed:
$ cases(
0 dot x_(1, 1) + 0 dot x_(2, 1) + dots + 0 dot x_(n, 1) = 0,
0 dot x_(1, 2) + 0 dot x_(2, 2) + dots + 0 dot x_(n, 2) = 0,
dots.v,
0 dot x_(1, k) + 0 dot x_(2, k) + dots + 0 dot x_(n, k) = 0
) $
	Moreover, for $x, y in C^(perp)$ and $lambda in ZZ_(p)$ we have
$ cases(
(x + y) dot c = x dot c + y dot c = 0,
(lambda x) dot c = lambda (x dot c) = 0
) space forall c in C $
	Therefore, $x + y in C^(perp)$ and $lambda x in C^(perp)$.
]
#lemma[
	Let $C$ be a code in $ZZ_(p)^(n)$. Then $(C^(perp))^(perp) = C$.
] <Double-perp-is-none>
// #proof[
// Provable; proof to be added
// ]
#theorem[
	Let $C in ZZ_(p)^(n)$ be a linear code of dimension $k$ with
	generator matrix $G$. A vector $x in ZZ_(p)^(n)$ belongs to
	$C^(perp)$ if and only if $x$ is orthogonal to every row vector
	of $G$, that is, if and only if the matrix product $x (G^(t))$
	is the zero vector.
] <Perp-matrix-product-null>
#proof[
	Let $cal(B)_(C) = {b_(1), b_(2), ..., b_(k)}$ be any basis of $C$
	and $cal(B) = {e_(1), e_(2), ..., e_(n)}$ any basis of
	$ZZ_(p)^(n)$. The entries of the matrix $G$ are the coefficients
	of the linear combinations used to express the vectors of the
	basis $cal(B)_(C)$ in terms of the basis $cal(B)$:
#grid(
columns: (0.65fr, 0.35fr),
		[$ b_(i) = sum_(j = 1)^(n) lambda_(i, j) e_(j)
		space "with" space lambda_(i, j) in ZZ_(p) space
		forall i in {1, ..., k}, space j in {1, ..., n} $],
[$ G = mat(
lambda_(1, 1), lambda_(1, 2), dots, lambda_(1, n);
lambda_(2, 1), lambda_(2, 2), dots, lambda_(2, n);
dots.v, dots, dots.down, dots.v;
lambda_(k, 1), lambda_(k, 2), dots, lambda_(k, n);
) $]
)
	It is clear that $x = (x_(1), dots, x_(n)) in ZZ_(p)^(n)$ belongs
	to $C^(perp)$ if and only if it is orthogonal to the vectors of
	$cal(B)_(C)$. We have:
$ x(G^(t)) = (x_(1), dots, x_(n))
mat(
lambda_(1, 1), lambda_(2, 1), dots, lambda_(k, 1);
lambda_(1, 2), lambda_(2, 2), dots, lambda_(k, 2);
dots.v, dots, dots.down, dots.v;
lambda_(1, n), lambda_(2, n), dots, lambda_(k, n)
) = (x_(1) lambda_(1, 1) + dots + x_(n) lambda_(1, n), dots,
x_(1) lambda_(k, 1) + dots + x_(n) lambda_(k, n))
$
	Moreover:
$ cases(
x_(1) lambda_(1, 1) + dots + x_(n) lambda_(1, n) =
(x_(1), dots, x_(n)) dot (lambda_(1, 1), dots, lambda_(1, n)) =
x dot b_(1),
dots.v,
x_(1) lambda_(k, 1) + dots + x_(n) lambda_(k, n) =
(x_(1), dots, x_(n)) dot (lambda_(k, 1), dots, lambda_(k, n)) =
x dot b_(k)
) $
	Combining the two results, we obtain:
$ x(G^(t)) = (x dot b_(1), dots, x dot b_(k)) $
]
#corollary[
	If $C$ is a linear code in $ZZ_(p)^(n)$ of dimension $k$, then
	$C^(perp)$ is a linear code in $ZZ_(p)^(n)$ of dimension $n − k$.
] <Perp-dimension>
#proof[
	By @Perp-matrix-product-null, $x = (x_(1), dots, x_(n)) in
	ZZ_(p)^(n)$ belongs to $C^(perp)$ if and only if $x(G^(t)) = 0$.
	Hence the vectors of $C^(perp)$ are exactly the solutions of the
	homogeneous linear system $x(G^(t)) = 0$ in the unknowns
	$x_(1), dots, x_(n)$. Since the rows of $G$ form a basis of $C$,
	the matrix $G$ (and hence $G^(t)$) has full rank $k$, so the
	solution space has dimension $n − k$.
]
Let $C$ be a linear code in $ZZ_(p)^(n)$ of dimension $k$. A *parity-check
matrix* for $C$ is any matrix $H$ that generates $C^(perp)$.
#theorem[
	Let $C$ be a code and $H$ a parity-check matrix for it. A vector
	$x = (x_(1), dots, x_(n)) in ZZ_(p)^(n)$ belongs to $C$ if and
	only if the matrix product $x(H^(t))$ is the zero vector.
] <Control-matrix-product-null>
#proof[
	If $H$ is a parity-check matrix for $C$, then it is a generator
	matrix for $C^(perp)$. By @Perp-matrix-product-null, $x$ belongs
	to $(C^(perp))^(perp)$ if and only if the matrix product
	$x (H^(t))$ is the zero vector. But by @Double-perp-is-none,
	$C = (C^(perp))^(perp)$, so $x$ belongs to $C$ if and only if
	the matrix product $x (H^(t))$ is the zero vector.
]
Let $C$ be a code and $H$ a parity-check matrix for it.
The @Control-matrix-product-null provides a method for
determining whether an element $x in ZZ_(p)^(n)$ belongs
to $C$.
#lemma[
	Let $C$ be a code of dimension $k$, and let $S = (I_(k) | A)$ be a
	generator matrix of $C$ in standard form. Then the matrix
	$H = (−A^(t) | I_(n − k))$ is a parity-check matrix for $C$.
]
//#proof[
// $ H S^(t) = (−A^(t) | I_(n − k)) (I_(k) | A)^(t) =
// (−A^(t) | I_(n − k)) (frac(I_(k), (A^(t))^(-1))) =
// -A^(t) + A^(t) = underline(0) $
//]
#theorem[
	Let $C in ZZ_(p)^(n)$ be a code of dimension $k$ and let $H$ be a
	parity-check matrix for it. The minimum distance of $C$ equals the
	smallest size of a linearly dependent set of columns of $H$. In
	particular, if $d(C)$ is the minimum distance of $C$, then every
	set of $d(C) - 1$ columns of $H$ is linearly independent.
]
//#proof[
// Provable; proof to be added
//]
#example[
	Let $C in ZZ_(3)^(5)$ be a linear code of dimension $k$, and let $H$
	be the parity-check matrix for $C$ defined as follows:
$ H = mat(
2, 0, 0, 1, 1;
0, 2, 0, 0, 2;
0, 0, 1, 2, 0) $
	We want to determine $k = dim(C)$. By @Perp-dimension, we have
	$dim(H) = n - dim(C)$. Since $H$ is a $3 times 5$ matrix, we get
	$dim(C) = n - dim(H) = 5 - 3 = 2$.
	We want to determine $d(C)$. The columns of $H$ are pairwise
	linearly independent, while columns $1$, $2$, and $5$ are linearly
	dependent. Therefore, $d(C) = 3$.
]
|
https://github.com/jgm/typst-hs | https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compute/foundations-22.typ | typst | Other | // Error: 7-12 expected semicolon or line break
#eval("1 2")
|
https://github.com/kotatsuyaki/canonical-nthu-thesis | https://raw.githubusercontent.com/kotatsuyaki/canonical-nthu-thesis/main/pages/outlines.typ | typst | MIT License | #let outline-pages(
outline-figures: true,
outline-tables: true,
) = {
outline(indent: auto)
pagebreak()
if outline-figures {
outline(title: "List of Figures", target: figure.where(kind: image))
pagebreak()
}
if outline-tables {
outline(title: "List of Tables", target: figure.where(kind: table))
pagebreak()
}
}
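// A hedged usage sketch (not part of the original file; the import path is
// an assumption based on the repository layout):
//
// #import "pages/outlines.typ": outline-pages
//
// // After the abstract / front matter:
// #outline-pages(outline-figures: true, outline-tables: false)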
|
https://github.com/MultisampledNight/flow | https://raw.githubusercontent.com/MultisampledNight/flow/main/src/doc/playground.typ | typst | MIT License | #import "../lib.typ" as flow: *
#show: note
- #track.event(
summary: "exist",
start: datetime(year: 2024, month: 9, day: 10),
)
|
https://github.com/polarkac/MTG-Stories | https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/002%20-%20Return%20to%20Ravnica/010_The%20Seven%20Bells%2C%20Part%201.typ | typst | #import "@local/mtgstory:0.2.0": conf
#show: doc => conf(
"The Seven Bells, Part 1",
set_name: "Return to Ravnica",
story_date: datetime(day: 07, month: 11, year: 2012),
author: "<NAME>",
doc
)
== Council of the Izmagus
=== Report of Record
#emph[Micas Vay: We’re protecting you—for now. But the Azorius are demanding justice.]
#emph[Bori Andon: Justice for a broken window? I wasn’t aware cathedrals had such tender feelings. ]
#emph[Micas Vay: This is no laughing matter. ]
#emph[Bori Andon: I’m pursuing research given to me personally by Niv-Mizzet. ]
#emph[Micas Vay: Look around you. This chamber is filled with the best and brightest of Ravnica. Your pursuits are no grander than any of ours. Just more destructive. ]
#emph[Bori Andon: You’re not worried about a few damaged buildings. You’re threatened by my success. Such base emotions are bad for our cause, Prime Izmagnus. ]
#emph[Micas Vay: You must stop these dramatic displays. You must stop drawing the eye of the lawmakers. Bori Andon: I have followed your directives! ]
#emph[Micas Vay: Lower your voice, Andon. By order of this council, your current projects are placed on hold. The Firemind has assigned you a special task. It must be completed before you can resume your personal research. ]
#emph[Bori Andon: What task? ]
#emph[Micas Vay: Solve the Theorem of Simultaneous Discordance. Various: [Shouting, arguing…]]
#emph[Bori Andon: You can’t derail my research. I am on the brink of— ]
#emph[Micas Vay: Sit down or you will be removed from chambers! ]
#emph[Bori Andon: This is absurd! ]
#emph[<NAME>: It’s the so-called unsolvable theorem. Are you familiar with it, Andon? ]
#emph[<NAME>: Of course… ]
#emph[<NAME>: The question is this: Can a single person simultaneously ring all seven bells in the seven great bell towers? Many great minds have tried, but each has failed. They say it’s impossible. Are you clever enough to prove them wrong? ]
#emph[Bori Andon: You don’t have the authority! ]
#emph[<NAME>: This comes directly from the Firemind. If you don’t like it, you may sever your ties with the Izzet. ]
#emph[<NAME>: You’re a bastard, Vay! This is your doing. Well, I’ll prove you’re a— ]
#emph[<NAME>: Guards! Remove this man from chambers. ]
#emph[Various: [Shouting, arguing…guards escort Andon from the chamber]]
#emph[<NAME> (addressing council): Well, he’s gone. His ego will get the best of him, and he’ll try to solve it. At least it will keep him out of trouble, for now. And what’s the worst that could happen? All seven bells will ring. ]
#emph[Various: [Laughing…]]
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
#figure(image("010_The Seven Bells, Part 1/02.jpg", width: 100%), caption: [Firemind’s Foresight | Art by <NAME>], supplement: none, numbering: none)
== The Journal of <NAME>: Day One
I awoke to the sound of chimes. From my bedroom window, I can see two of the seven great bell towers of the Kalnika Quarter. I grew up here and I’ve heard the bells every morning since I was born. At dawn, the bells share a harmony, but each rope is pulled by a different bell ringer. How could one man ring all seven at once? This is the council’s attempt to humiliate me. They task me with an "unsolvable" problem. Well, I’ll show them. Nothing is unsolvable.
Despite the grayness of my outlook, the bells are startlingly beautiful. In my half sleep, I imagined myself in an immense room filled with staircases that lead nowhere. I must have dozed again. In my dreams, I paced blank corridors with no end. I turned a corner and recoiled at the sight of massive sentinel seated on a silver throne. His eyes followed me, no matter what direction I ran.
The dream is an omen. They’ll be watching my every move.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
Subject never left his flat
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== The Journal of Bori Andon: Day Two
I traveled across two districts to visit an old friend, Zaba, who makes the best maps in all of Ravnica. As I hoped, he had exquisite maps of the Seven Great Bell Towers. His work is so finely detailed that I could practically see the rat holes in the walls. Suddenly, I can make sense of the confusing tangle of elevated walkways, the bridges over nonexistent rivers, and the multi-level streets that I knew by heart but could never have quantified.
Over cups of tea, Zaba entertained me with the legend of Kalnika, a great paladin for whom my quarter is named. I felt like a child at my grandfather’s knee as Zaba told a tale of ancient Ravnica, when a lich king tyrannized the people. The paladin taught the peasants a series of codes rung on the bells. When they heard the right sequence, they knew it was time to rise up #emph[en masse] and kill the tyrant. According to Zaba, the towers have a particular order. There is only one route through the quarter that allows you to visit the towers in proper sequence. He gave me a toothless smile and told me if I could find the right path, my true love would be waiting for me around the corner.
When I told him of my trouble with the council, Zaba gave me the maps at no cost. He said something about the meaning being in the journey itself and we parted ways. On the street, a man with black hair trailed me for three blocks. It doesn’t surprise me that Vay sent one of his thugs to follow me. Well, he can watch all he likes. I have nothing to hide.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
Our interrogation of the mapmaker yielded little more than we already knew. Subject continues to search for solution to the Theorem of Simultaneous Discordance.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== The Journal of Bori Andon: Day Three
I wore a hole through my shoe leather, but I’ve done it. Zaba’s silly comment about the "true path" gave me an idea. I discovered the singular route that takes me to each of the bell towers once and only once. It was a fascinating exercise in the geometry of place, and the interconnectedness has left me both weary and hopeful. Ugh, it feels like rats are eating the backs of my eyeballs. I must sleep.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
Walking. Walking. And more walking. The subject talks to himself. He seems confused and often concerns passersby with his flapping and scurrying about. We really don’t think the Theorem is in danger of being solved. I request reassignment.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== The Journal of Bori Andon: Day Five
My Hypno-Imager is complete. I’ve devised a way to transfer the "true path" in three-dimensional space while accommodating natural obstacles and the historical information I received from the mapmaker. This will allow me to find the perfect center where I will be within equidistance to each bell. I have calibrated the imager so I can transmit sonus-ripples, which will strike the bells and cause them to ring simultaneously. I can’t wait to see the look on Vay’s face.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
The subject exited his flat carrying a brass box with cords attached to large glass vessels strapped to his back. The vessels contained some bluish liquid and fog. The subject made his way to the center of the Kalnika Open Market. You know that enormous statue of a centaur? Well, he climbed up on top of it and sat down like he was about to ride a horse. Then he fiddled with his contraption for ages. People were staring at him like he’d lost his mind (I think he’s lost his mind). At one point, blue light shot out of the device, but nothing else happened.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== The Journal of Bori Andon: Day Eight
When will this headache cease? Even the slightest noise feels like an assault. My Hypno-Imager failed, but I know why. In my next attempt, I must use elements already existing in the environment. The air itself is the answer! Wind will be my invisible accomplice. I have successfully recalibrated the imager. My new device, the Hypno-Explusor, will suck a massive volume of air inward and then expulse it in a radial pattern, causing the bells to chime. I’m slightly concerned about the displacement of this volume of air. I anticipate that residents will feel a slight to moderate breeze, and that is all. I can’t spare time to run tests. The Firemind is waiting.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
Suspicions about subject’s sanity are confirmed. He’s sitting on the centaur again. And he has a new device. It still has the brass box and the glass vessels. But now there’s a hat-like component. And by hat, I mean a towering pile of copper wire and pipes. He’s fiddling with the box. There’s a strange whooshing sound. Huh, the rubbish in the gutters has just started hovering. I have a bad feeling about this.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== Azorius Incident Report
Just after the clocks struck nine, residents of the Kalnika Quarter reported a strong breeze on an otherwise calm day. The breeze intensified and shattered nearly every window in the quarter. The broken glass and debris coalesced near the Dome of the Black Dove. Abruptly, it transformed into a cylindrical cyclone of glass shards and continued to gather strength. At its peak, the glass-storm was taller than the Dome itself. Residents in a three-block perimeter were evacuated. A joint force of battlemages worked to contain and disperse this massive threat. Their efforts were successful, and there were no deaths reported in the incident. The grounds of the Dome are littered with broken glass, but the structure itself has been saved.
#figure(image("010_The Seven Bells, Part 1/04.jpg", width: 100%), caption: [Cyclonic Rift | Art by <NAME>], supplement: none, numbering: none)
== The Journal of Bori Andon: Day Ten
Wrong, wrong, wrong. That approach proved fruitless, and the bells did not chime. To complicate matters, the man with black hair was watching me again. He’s probably reporting my failure to the council and they’re all having a good laugh at my expense. Also, birds keep flying overhead. I’ve heard that pigeons can be trained as spies. I see now that wind was too prosaic of a solution. I must convert the energy of people’s thoughts and propel it to the Great Bell Towers, thereby ringing the bells. How much is the weight of thought? How much energy does a person’s brain emit?
I have recalibrated the imager again. This device—the Hypno-Oblatrix—will collect and condense all thoughts in the vicinity of the bell towers. I’ll then transmit the converted energy of those thoughts directly to the bell tower thereby ringing all the bells simultaneously. Those affected will feel a tingling along in the forebrain… oh, scratch that. I have no idea what they’ll experience.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
Watcher’s Report
They haven’t even finished cleaning up the glass yet, and he’s back. He’s carrying another device. I can tell it’s new because the hat is even taller. Judging by the faces on the passersby, it’s emitting foul-smelling fumes. Request permission to leave post. I’m going down below to watch from a safe distance.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== Azorius Incident Report
A mass mind attack has just been launched on the Kalnika Quarter. There have been widespread reports of memory loss, disorientation, and bleeding out of ears. Suspect or suspects still at large.
#v(0.35em)
#line(length: 100%, stroke: rgb(90%, 90%, 90%))
#v(0.35em)
== The Journal of Bori Andon: Day Ten, Addendum
The Hypno-Oblatrix failed to ring the bells. But something amazing happened instead. I activated the device, and people fell to the ground like ragdolls. As planned, I transmitted the collected energy toward the bell towers. #emph[Unexpectedly] , I saw hundreds of glowing lines crisscrossing the air like metaphysical threads. Because of my location, I could see that each line intersected with the great bell towers. What could they be? Borderlines? Conduits? Sensors? What is this madness?
I’m terrified of the implications. I’ve uncovered some kind of secret Æther channels running throughout the quarter. The bell towers are nexus points. Maybe Niv-Mizzet tasked me with the theorem not as a punishment, but to uncover this baffling secret. I was meant to find this, but what did I find?
To be continued…
|
|
https://github.com/lkndl/typst-bioinfo-thesis | https://raw.githubusercontent.com/lkndl/typst-bioinfo-thesis/main/modules/back-matter.typ | typst | #import "styles.typ": *
#let references(
bibliography-file: none,
style: "apa", lang: "en") = {
// pass a pre-installed citation style or the path to a .csl file here. Download `.csl`s from https://github.com/citation-style-language/styles
// also check https://www.ub.tum.de/en/citation-guide
if bibliography-file != none {
//pagebreak()
//v(-space)
let bib-title = if lang == "en" [References] else [Literaturverzeichnis]
heading(level: 1, outlined: true, numbering: none, bib-title)
//show bibliography: set text(8pt)
set block(spacing: .65em)
bibliography(bibliography-file,
title: none,
style: style)
}
}
#let list-of(thing, title, left-width: 8mm, show-supplement: false) = {
// this is used for list-of-figures and list-of-tables
// use the state
show outline: it => {
in-outline.update(true)
it
in-outline.update(false)
}
show outline.entry: it => {
let c = it.element.caption
let (title, suppl) = (c.body, c.supplement)
let number = numbering(c.numbering, ..c.counter.at(it.element.location()))
let loc = it.element.location()
let left-thing = {
if show-supplement [#suppl ]
number
if show-supplement [: ]
}
box(width: 100%, stack(dir: ltr,
// the number of the "thing"
box(width: left-width, link(loc, left-thing)),
// the title, i.e. short part of the caption, may still be multi-line
box(width: 100% - page-num-width - left-width, [
#link(loc, title)
// the gap characters
#box(width: 1fr, align(right, typst-repeat(justify: false, box(width: quantum, "."))))
]),
// the page number, bottom-aligned for long figure titles
align(bottom + right,
box(width: page-num-width, link(loc, it.page)))
))
}
locate(loc => {
let figures = figure.where(kind: thing)
if query(figures, loc).len() == 0 {
return // break if none found
}
//pagebreak()
//v(-space)
heading(level: 1, outlined: true, numbering: none, title)
outline(
title: none,
target: figures,
)
})
}
#let list-of-figures(lang) = {
let lof-title = if lang == "en" [List of Figures] else [Abbildungsverzeichnis]
list-of(image, lof-title)
}
#let list-of-tables(lang) = {
let lot-title = if lang == "en" [List of Tables] else [Tabellenverzeichnis]
list-of(table, lot-title)
} |
|
https://github.com/mariunaise/HDA-Thesis | https://raw.githubusercontent.com/mariunaise/HDA-Thesis/master/pseudocode/offsets.typ | typst | #import "@preview/lovelace:0.3.0": *
#pseudocode-list(booktabs: true, numbered-title: [Find all offsets $phi$])[
+ *input* $Phi, S$
+ *list* offsets $phi$
+ *if* $S$ is odd
    + $S = S - 1$
+ *append* 0 *to list* offsets
  + *for* $i = 1, dots, S/2$
    + *append* $+(i dot Phi)$ *to list* offsets
    + *append* $-(i dot Phi)$ *to list* offsets
+ *sort list* offsets in ascending order
+ *return* offsets
+ *end*
]
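
For reference, a minimal Python sketch of the procedure above (the function name `find_offsets` and treating $Phi$ as a plain float step width are illustrative assumptions, not part of the thesis sources):

```python
def find_offsets(phi, s):
    """Return the s offsets, spaced phi apart and centred on zero."""
    offsets = []
    if s % 2 == 1:          # an odd number of offsets includes zero itself
        s -= 1
        offsets.append(0.0)
    for i in range(1, s // 2 + 1):
        offsets.append(i * phi)   # offset on the positive side
        offsets.append(-i * phi)  # mirrored offset on the negative side
    return sorted(offsets)        # ascending order, as in the pseudocode
```

For example, `find_offsets(0.5, 4)` returns `[-1.0, -0.5, 0.5, 1.0]` and `find_offsets(1.0, 3)` returns `[-1.0, 0.0, 1.0]`.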
|
|
https://github.com/donghoony/KUPC-2023 | https://raw.githubusercontent.com/donghoony/KUPC-2023/master/abstractions.typ | typst | #import "colors.typ" : *
#let cell = rect.with(
height: 25pt,
inset: 12pt,
width: 130%,
stroke: none,
)
#let column_width = (5%, 5%, 35%, 15%, 25%)
#let row_header_content(_text, size:1.3em) = {
cell(
fill: none,
text(weight: "semibold", size:size, fill: black)[#_text],
)
}
#let pick_color(tier: "") = {
let c = black
if (tier == "b") {c = AC_BRONZE}
if (tier == "s") {c = AC_SILVER}
if (tier == "g") {c = AC_GOLD}
if (tier == "p") {c = AC_PLATINUM}
if (tier == "d") {c = AC_DIAMOND}
if (tier == "r") {c = AC_RUBY}
return c
}
#let row_content(_text, tier: "", mono: false) = {
let c = pick_color(tier: tier)
let w = "regular"
if (tier != "") {w = "bold"}
cell(
if (mono == true) {text(font: "Pretendard", size: 15pt, fill: c)[#_text]}
else {text(weight: w, size: 15pt, fill: c)[#_text]}
)
}
#let row_header(args) = {
align(center)[
#grid(
columns: column_width,
row_header_content("🦆", size:2.5em),
row_header_content("🪿", size:2.5em),
row_header_content("문제"),
row_header_content("의도한 난이도"),
row_header_content("출제자")
)
]
line(length: 100%)
v(1em)
}
#let row_contents(problem) = {
align(center)[
#grid(
columns: column_width,
row_content(problem.d2),
row_content(problem.d1),
align(left)[#row_content(problem.title)],
align(left)[#row_content(problem.difftext, tier: problem.diff)],
align(left)[#row_content(h(2em) + problem.setter.map(setter => {
setter.at(0)
}).join(", "), mono: true)],
)
]
}
#let abstract_page(problems: ()) = {
align(horizon)[
#grid(
columns: (100%),
row_header(("🪿", "🦆", "문제", "의도한 난이도", "출제자")),
..problems.map(problem => {
row_contents(problem)
})
)
]
pagebreak(weak: true)
}
|
|
https://github.com/Quaternijkon/Typst_FLOW | https://raw.githubusercontent.com/Quaternijkon/Typst_FLOW/main/content.typ | typst | #import "src/exports.typ": *
#import "theme.typ": *
= #smallcaps("CPU-oriented Optimizations")
== Background
===
#lorem(200)
== How Milvus Addresses These?
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
#side-by-side[A][B][C]
== Cache-aware Design in Milvus
= #smallcaps("GPU-oriented Optimizations")
== Supporting bigger k in GPU kernel
== Supporting multi-GPU devices
= #smallcaps("GPU and CPU Co-design")
== The Limitations
== Addressing the first limitation.
== Addressing the second limitation.
// = #smallcaps("CPU-oriented Optimizations")
// == Background
// == How Milvus Addresses These?
// == Cache-aware Design in Milvus
// = #smallcaps("GPU-oriented Optimizations")
// == Supporting bigger k in GPU kernel
// == Supporting multi-GPU devices
// = #smallcaps("GPU and CPU Co-design")
// == The Limitations
// == Addressing the first limitation.
// == Addressing the second limitation. |
|
https://github.com/ufodauge/master_thesis | https://raw.githubusercontent.com/ufodauge/master_thesis/main/src/template/components/common/route.typ | typst | MIT License | #let Route(
font : "<NAME>",
font-strong: "<NAME>",
body
) = [
#show regex("、"): ","
#show regex("。"): "."
#set page(
paper : "a4",
margin: (
bottom: 119.077pt,
top : 108.5pt,
left : 82.5pt,
right : 88.5pt,
)
)
#set text(font: font, size: 12pt)
#show strong: set text(
weight: "extralight",
font : font-strong,
)
#set par(
first-line-indent: 1em,
justify : true
)
#body
] |
https://github.com/jneug/schule-typst | https://raw.githubusercontent.com/jneug/schule-typst/main/src/util/types.typ | typst | MIT License | /// Wrapper module for valkyrie to add some missing types and augment existing
/// ones.
#import "typst.typ"
#import "@preview/valkyrie:0.2.0": *
// Aliases for the original valkyrie types.
#let _choice = choice
#let _content = content
/// Augmented #cmd-[choice] function that adds an #arg[aliases] argument similar
/// to the @cmd-[dictionary] type.
///
/// - aliases (dictionary): A dictionary of alias to option mappings.
/// - ..args (any): Same arguments as for valkyries #cmd-[choice].
/// -> dictionary
#let choice(..args, aliases: (:)) = {
let pre-transform = (self, it) => aliases.at(it, default: it)
if "pre-transform" in args.named() {
pre-transform = (self, it) => args.named().pre-transform(self, pre-transform(self, it))
}
_choice(..args, pre-transform: pre-transform)
}
/// Schema for a field that will always be set to a constant value, no matter
/// the value supplied for that key. The value is taken from the supplied
/// type's default value.
///
/// - type (dictionary): Any of the valkyrie types with a default value set.
/// -> dictionary
#let constant(type) = {
type.optional = true
type.pre-transform = (..) => type.default
type
}
/// Augments the content type to include #dtype("symbol") as an allowed type.
#let content = base-type.with(name: "content", types: (typst.content, str, symbol))
/// Type for the Typst built-in #dtype("version").
#let version = base-type.with(name: "version", types: (version,))
/// Type for the Typst built-in #dtype("symbol").
#let symbol = base-type.with(name: "symbol", types: (symbol,))
/// Type for the Typst built-in #dtype("label").
#let label = base-type.with(name: "label", types: (label,))
/// Type for the Typst built-in #dtype("auto"). (Named `aut0` because `auto` is a reserved keyword.)
#let aut0 = base-type.with(name: "auto", types: (type(auto),))
|
https://github.com/xiaodong-hu/typst-note-template | https://raw.githubusercontent.com/xiaodong-hu/typst-note-template/main/note_template.typ | typst | MIT License | #let mycolor = (
celestial_blue: rgb(74, 150, 209), // `Celestial blue: \definecolor{celestialblue}{rgb}{0.29, 0.59, 0.82}` from `https://latexcolor.com/`
amber_orange: rgb(255, 128, 0), // `Amber (SAE/ECE): \definecolor{amber(sae/ece)}{rgb}{1.0, 0.75, 0.0}` from `https://latexcolor.com/`
cerise_pink: rgb(237, 58, 130), // `Cerise pink: \definecolor{cerisepink}{rgb}{0.93, 0.23, 0.51}` from `https://latexcolor.com/`
coral_red: rgb(255, 64, 64) // `Coral red: \definecolor{coralred}{rgb}{1.0, 0.25, 0.25}` from `https://latexcolor.com/`
)
#let note(
title: none,
  authors: (), // array of authors; each author is a dictionary of `name: str` and `affiliations: array of int`
intitutions: (), // empty array for multiple affiliations
// date: none,
abstract: none,
show_contents: false,
two_columns: false,
document,
bib_filename: (),
show_references: false,
) = {
set page(paper: "us-letter", margin: (x: 2cm, y: 2cm),)
set par(justify:true, first-line-indent: 1em)
set text(
// font: "stix2",
size: 10pt
)
/// maketitle and abstract
align(center)[
#text(12pt, weight: "bold")[#title]
#linebreak()
#v(3pt)
#let n_author = 1;
#for author in authors {
author.name
let n_affiliation = 1;
for affiliation in author.affiliations {
      assert(affiliation <= intitutions.len(), message: "affiliation label does NOT match the number of institutions!")
text(blue)[$""^#affiliation$]
if n_affiliation != author.affiliations.len() {
text(black)[$""^,$]
}
n_affiliation += 1
}
if n_author != authors.len() {
[, ]
}
n_author += 1
h(5pt)
}
#linebreak()
#for i in range(0, intitutions.len()) {
text(blue)[$""^#(i+1)$]
text(black)[_#intitutions.at(i)_]
linebreak()
}
(Dated: #datetime.today().display()) // default date
#v(10pt,weak: true)
#box(
width: 40em,
// first-line-indent: 1em,
[
// *Abstract*
#align(left)[
#h(1em) // manual indent
#abstract
]
]
)
// abstract
#v(15pt)
]
/// customize the appearance
set align(left)
set heading(numbering: "I.A.") // customize heading numbering
show heading: self => [
#align(center)[
#text(fill: mycolor.celestial_blue, 10pt)[
#v(25pt,weak: true)
#self
#v(15pt,weak: true)
] // customize heading style
]
]
show link: set text(fill: mycolor.cerise_pink) // customize link color
show ref: set text(fill: mycolor.cerise_pink) // customize reference color
show emph: set text(fill: mycolor.cerise_pink) // customize emphasis style
show strong: set text(fill: mycolor.cerise_pink) // customize strong style
/// table-of-contents
{
set par(justify:true, first-line-indent: 0em) // turn off first-line-indent for table-of-contents
if show_contents {
show outline.entry: it => {
// we will customize the outline using the original definitions. To avoid infinite recursion, we will use a tag to mark the modified outline entries. The trick is used in https://stackoverflow.com/questions/77031078/how-to-remove-numbers-from-outline
let outline_contents = if it.at("label", default: none) == <modified-tag> {
it // just return itself if is already modified
} else {
[
#outline.entry(
it.level,
it.element,
it.body,
[], // remove fill
it.page
)<modified-tag>
#linebreak()
#v(-7.5pt, weak: true)
]
}
if it.level == 1 {
v(10pt,weak: true)
}
if it.level == 1 {
text(fill: mycolor.celestial_blue, weight: "bold")[#outline_contents]
} else {
text(fill: mycolor.celestial_blue)[#outline_contents]
}
}
outline(
title: [Contents],
indent: 1em,
// depth: 2, // infinited depth
)
}
v(15pt)
}
/// begin document
// set par(justify:true, first-line-indent: 1em) // recover the first-line-indent
if two_columns {
columns(2,gutter: 20pt, document)
} else {
document
}
/// references
if show_references {
bibliography(bib_filename, title: "References")
}
} |
https://github.com/dark-flames/apollo-typst | https://raw.githubusercontent.com/dark-flames/apollo-typst/main/content/posts/multi-version-short.md | markdown | Apache License 2.0 | +++
title = "Ancestors: Multi Version(Short)"
date = "2024-07-09"
[taxonomies]
tags=["documentation"]
[extra]
typst = "intro/short"
+++
|
https://github.com/BreakingLead/note | https://raw.githubusercontent.com/BreakingLead/note/main/Math/uncategorized/schwarz_beamer.typ | typst | #import "@preview/touying:0.2.0": *
|
|
https://github.com/PorterLu/Typst | https://raw.githubusercontent.com/PorterLu/Typst/main/advance/advanced_styling.typ | typst | #set par(justify: true)
#set heading(numbering: "1.1.1.1")
#set page(
paper: "us-letter",
header: align(right)[
Advanced Styling
],
numbering: "1"
)
#set text(
font: "Linux Libertine",
size: 11pt,
)
#align(center, text(17pt)[*Advanced Styling*])
#grid(
columns: (1fr, 3fr, 1fr, 3fr, 1fr),
align(center)[],
align(center)[
KunLu \
Southern University of Science and Technology \
#link("<EMAIL>")
],
align(center)[],
align(center)[
KunLu \
Southern University of Science and Technology \
#link("<EMAIL>")
],
align(center)[]
)
#align(center)[
#set par(justify: false)
*Abstract* \
#lorem(64)
]
#show: rest => columns(2, rest)
#show heading: it => [
#set align(center)
#set text(12pt, weight: "regular")
#block(smallcaps(it.body))
]
#show heading.where(
level: 1
): it => block(width: 100%)[
#set align(center)
#set text(12pt, weight: "regular")
#smallcaps(it.body)
]
#show heading.where(
level: 2
): it => text(
size: 11pt,
weight: "regular",
style: "italic",
it.body + [.],
)
= Introduction
== Motivation
#h(2em)#lorem(300)
== Contribution
#h(2em)#lorem(300)
= Related Work
#h(2em)#lorem(300)
|
|
https://github.com/silent-dxx/typst-color-emoji | https://raw.githubusercontent.com/silent-dxx/typst-color-emoji/main/examples/simple.typ | typst | MIT License | #import "../cm.typ"
#let udl0 = underline.with(
stroke: 1.5pt + red,
offset: 2pt,
)
#show raw.where(block: false): box.with(
fill: aqua.lighten(80%),
inset: (x: 3pt, y: 0pt),
outset: (y: 3pt),
radius: 2pt,
)
#text(2.0em)[
= Typst Color Emoji Simple Demo
]
\
////////////////////////////////////
#[
#show:udl0
#text(size: 2.0em)[
\1. 2.0em fonts
]
]
#text(2.0em)[
Hello
#cm.o("cat.face")
#cm.o("heart.beat")
#cm.o("face.grin")
#cm.o("hand.ok")
#cm.o("panda")
]
\
////////////////////////////////////
#[
#show:udl0
#text(size: 2.0em)[
\2. 3.0em fonts
]
]
#text(3.0em)[
World
#cm.t("cat.face")
#cm.t("heart.beat")
#cm.t("face.grin")
#cm.t("hand.ok")
#cm.t("panda")
]
#cm.t("cat.face")
\
////////////////////////////////////
#[
#show:udl0
#text(size: 2.0em)[
\3. default fonts
]
]
#lorem(10)
#cm.t("cat.face")
#cm.t("heart.beat")
#cm.t("face.grin")
#cm.t("hand.ok")
#cm.t("panda")
\
////////////////////////////////////
#[
#show:udl0
#text(size: 2.0em)[
  \4. \#emoji + \#cm.t + \#cm.o
]
]
`face.grin:` #emoji.face.grin #cm.t("face.grin") #cm.o("face.grin")
`face.joy :` #emoji.face.joy #cm.t("face.joy") #cm.o("face.joy")
////////////////////////////////////
#pagebreak()
#set text(size: 2.0em)
= Emoji List
#let line = ()
#for (i, v) in cm.cm-dic {
cm.t(i)
line.push(i)
if line.len() == 20 {
for i2 in line {
cm.o(i2)
}
line = ()
}
}
#parbreak()
#for i2 in line {
cm.o(i2)
}
|
https://github.com/rangerjo/tutor | https://raw.githubusercontent.com/rangerjo/tutor/main/example/main.typ | typst | MIT License |
#import "@local/tutor:0.6.1": totalpoints, lines, default-config
#import "src/ex1/ex.typ" as ex1
#import "src/ex2/ex.typ" as ex2
#let cfg = toml("tutor.toml")
#if "tutor_sol" in sys.inputs {
if sys.inputs.tutor_sol == "true" {
(cfg.sol = true)
} else if sys.inputs.tutor_sol == "false" {
(cfg.sol = false)
}
}
#set heading(numbering: "1.1")
#text(16pt)[
Name: $underline(#h(15cm))$
#v(3mm)
#grid(
columns: (1fr, 1fr),
rows: 10mm,
gutter: 5mm,
// align: left + horizon,
[Points: $underline(#h(4cm))$ / #totalpoints(cfg)],
[Grade: $underline(#h(6cm))$],
)
]
#outline()
#ex1.exercise(cfg)
#ex2.exercise(cfg)
|
https://github.com/ryuryu-ymj/mannot | https://raw.githubusercontent.com/ryuryu-ymj/mannot/main/tests/test-mark.typ | typst | MIT License | #import "/src/mark.typ": mark
// #set page(width: 12cm, height: 16cm, margin: (x: 24pt, y: 24pt))
#let cell-pat = pattern(size: (20pt, 20pt))[
#place(line(start: (0%, 0%), end: (0%, 100%), stroke: silver))
#place(line(start: (0%, 0%), end: (1000%, 0%), stroke: silver))
]
#set page(fill: cell-pat)
#set heading(numbering: "1.")
// #show math.equation: set text(20pt)
#let rmark(body) = mark(body, color: red)
#let gmark(body) = mark(body, color: green)
#let bmark(body) = mark(body, color: blue)
#let boxmark(body) = mark(body, fill: none, stroke: .5pt)
// #show: mannot-init
= Color
#let color-list = (
black,
gray,
silver,
navy,
blue,
aqua,
teal,
eastern,
purple,
fuchsia,
maroon,
red,
orange,
yellow,
olive,
green,
lime,
)
#let mark-list = color-list.map(c => mark(math.text("x"), color: c)).sum()
#text(20pt)[ $ #mark-list $ ]
= Fill / Stroke
$
mark(x, fill: #none, stroke: #.5pt)
mark(x, fill: #none, stroke: #(1pt + red), radius: #100%)
mark(x, fill: #none, stroke: #(bottom: 2pt + blue))
$
= Padding
$
mark(x, padding: #0pt) quad
mark(x, padding: #(x: 8pt, y: 4pt)) quad
mark(x, padding: #(left: 4pt, right: 2pt, top: 6pt, bottom: 8pt)) quad
mark(x, padding: #(right: 0pt, rest: 4pt))
$
= Size / position
== Block
$
mark(x, padding: #0pt) x_mark(x, padding:#0pt) x_x_mark(x, padding:#0pt)
$
$
mark(x y T) \
mark(x + y) \
mark(x + integral x dif x) \
mark(x + vec(1, 2, delim: "[")) \
mark(x + vec(1, 2, delim: "[") + vec(1, 2, 3, delim: "{")) \
mark(x + mat(1, 2; 3, 4; delim: "[")) \
mark(x_(1+p)^T) x_rmark(i + 1)_bmark(j + 1)_gmark(k + 1) x^rmark(i + 1)^bmark(j + 1)^gmark(k + 1) \
$
== Inline
$mark(x)$
$mark(y)$
$mark(T)$
$mark(x y T)$
$mark(x + y)$
$mark(x + integral x dif x)$
$mark(x + vec(1, 2, delim: "["))$
$mark(x + vec(1, 2, delim: "[") + vec(1, 2, 3, delim: "{"))$
$mark(x + mat(1, 2; 3, 4; delim: "["))$
$mark(x_(1+p)^T) x_rmark(i + 1)_bmark(j + 1)_gmark(k + 1) x^rmark(i + 1)^bmark(j + 1)^gmark(k + 1)$
$mark(#{
for i in range(30) {
$x + $
}
$x$
})$
#lorem(30)
$mark(x + y + T)$
#lorem(20)
= Nesting
$
mark(gmark(x) + bmark(y) + z) \
$
// $
// boxmark(boxmark(x) + boxmark(integral boxmark(x) dif x) + boxmark(y)) \
// $
= Layout
$
x \
x \
mark(x) \
$
$
x + 1 \
mark(x + 1) \
mark(x) + 1 \
x + mark(1) \
mark(x) + mark(1) \
$
$
x + integral x dif x \
mark(x + integral x dif x) \
$
$
x + vec(1, 2, delim: "[") + vec(1, 2, 3, delim: "{") \
mark(x + vec(1, 2, delim: "[") + vec(1, 2, 3, delim: "{")) \
$
$
x < integral_0^1 x dif x \
mark(x < integral_0^1 x dif x) \
$
$
x + x_1 + x_1 \
mark(x + x_1 + x_1)
$
$
x y z \
mark(x y z) \
x mark(y) z \
rmark(x) gmark(y) bmark(z) \
$
$
x #metadata(none) y \
x #place(none) y \
x #{box(place(none)) + sym.wj} y \
x #{sym.wj + box(place(none))} y \
x #{sym.wj + box(place(none)) + sym.wj} y \
$
$
2 x + 3 y = 4 \
mark(2 x + 3 y = 4) \
mark(2) gmark(x) + rmark(3) bmark(y) = mark(4)
$
$
x dif x \
mark(x dif x) \
// rmark(x) dif bmark(x) \
rmark(x) dif bmark(x) \
rmark(x) gmark(dif) bmark(x) \
$
Align-point:
$
2 x &+ 3 y &+ 4 z &= 5 \
mark(2 x &+ 3 y &+ 4 z &= 5) \
rmark(2 x) &+ gmark(3 y) &+ bmark(4 z) &= mark(5) \
22 x &+ 33 y & &= 55 \
mark(22 x &+ 33 y & &= 55) \
rmark(22 x) &+ gmark(33 y) & &= bmark(55) \
x & &+ 44 z &= 5 \
mark(x & &+ 44 z &= 5) \
rmark(x) & &+ gmark(44 z) &= bmark(5) \
$
User-defined function:
#let dfrac(a, b) = $( dif#a ) / ( dif#b )$
$
1 + dfrac(a, b) \
mark(1 + dfrac(a, b)) \
1 + dfrac(rmark(a), gmark(b))
$
Horizontal spacing:
$
x #h(1em, weak: true) y #h(1em, weak: true) x \
x mark(#h(1em, weak: true) y #h(1em, weak: true)) x \
x mark(#h(1em) y #h(1em)) x \
mark(x #h(1em, weak: true) y #h(1em, weak: true) x) \
$
|
https://github.com/jmigual/typst-efilrst | https://raw.githubusercontent.com/jmigual/typst-efilrst/main/examples/basic.typ | typst | MIT License | #import "../src/lib.typ" as efilrst
#show ref: efilrst.show-rule
#efilrst.reflist(
[My cool constraint A],<c:a>,
[My also cool constraint B],<c:b>,
  [My non-referenceable constraint C],
list-style: "C1)",
ref-style: "C1",
name: "Constraint"
)
See how my @c:a is better than @c:b.
|
https://github.com/dadn-dream-home/documents | https://raw.githubusercontent.com/dadn-dream-home/documents/main/contents/07-thiet-ke-kien-truc/2-frontend/index.typ | typst | == Mô hình MVC ở frontend
Do ở frontend có _quá nhiều_ class nên nhóm xin phép không thêm vào báo cáo. Thay
vào đó, nhóm xin phép trình bày sơ lược hướng tiếp cận.
Nhóm sử dụng *Dart / Flutter* để xây dựng ứng dụng di động. Trong đó, nhóm sử dụng
mô hình MVC để xây dựng ứng dụng, sử dụng hướng dẫn từ
#cite("andrea0", "andrea1", "andrea2").
- Các class *View* sẽ là các Widget từ Flutter, có trách nhiệm nhận dữ liệu và
hiển thị, gọi các callback được truyền vào khi có tương tác người dùng.
- Các class *Controller* sẽ cung cấp các callback cho *View*, và gọi các hàm từ
các controller khác để thực hiện các logic.
- Các class *Model* sẽ là các class đại diện cho các đối tượng trong ứng dụng,
có thể là các đối tượng từ backend, hoặc các đối tượng được tạo ra để phục vụ
cho việc hiển thị. Hầu hết các class này được tạo ra bởi `gRPC`. |
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/syntaxes/textmate/tests/unit/bugs/field.typ | typst | Apache License 2.0 | #figure(caption: [test])[].caption |
https://github.com/Its-Alex/resume | https://raw.githubusercontent.com/Its-Alex/resume/master/README.md | markdown | MIT License | # Resume
Repository to create and build a PDF of my resume.
You can see the current version in [`lastest release`](https://github.com/Its-Alex/resume/releases/tag/latest).
## Requirements
- [`mise`](https://mise.jdx.dev/)
- [`direnv`](https://direnv.net/)
```sh
$ mise plugins install typst https://github.com/stephane-klein/asdf-typst
$ mise install
$ direnv allow
```
## Getting started
To compile resumes, you can use:
```bash
$ ./scripts/compile.sh
```
You can also watch a single resume while editing it, for example the French one:
```bash
$ ./scripts/watch.sh fr
```
All data comes from a `yaml` file; for example, the French resume uses
[datas/fr.yaml](/datas/fr.yaml).

For now only the `french` resume is available.
## License
[MIT](/LICENSE) |
https://github.com/RubixDev/typst-i-figured | https://raw.githubusercontent.com/RubixDev/typst-i-figured/main/examples/basic.typ | typst | MIT License | #import "../i-figured.typ"
#set page(width: 15cm, height: auto, margin: 1.5cm)
// set up heading numbering
#set heading(numbering: "1.")
// this resets all figure counters at every level 1 heading.
// custom figure kinds must be added here.
#show heading: i-figured.reset-counters.with(extra-kinds: ("atom",))
// this show rule is the main logic, custom prefixes for custom figure kinds
// can optionally be added here.
#show figure: i-figured.show-figure.with(extra-prefixes: (atom: "atom:"))
// a similar function exists for math equations
#show math.equation: i-figured.show-equation
// show outlines for all kinds of figures
#i-figured.outline()
#i-figured.outline(target-kind: table, title: [List of Tables])
#i-figured.outline(target-kind: raw, title: [List of Listings])
#i-figured.outline(target-kind: "atom", title: [List of Atoms])
// and equations
#outline(target: math.equation, title: [List of Equations])
#figure([x], caption: [This is a figure before the first heading.])
= Introduction
// references to figures must be prefixed with the respective prefix
Below are @fig:my-figure, @tbl:my-table, @lst:my-listing, @atom:my-atom, and also @eqt:my-equation.
Also see @fig:my-second-figure, @eqt:my-second-equation, and @fig:my-third-figure.
#figure([a], caption: [This is a figure.]) <my-figure>
#figure(table([a]), caption: [This is a table.]) <my-table>
#figure(```rust fn main() {}```, caption: [This is a code listing.]) <my-listing>
#figure(circle(radius: 10pt), caption: [A curious atom.], kind: "atom", supplement: "Atom") <my-atom>
$ phi.alt := (1 + sqrt(5)) / 2 $ <my-equation>
= Background
#figure([b], caption: [This is another figure.]) <my-second-figure>
$ F_n = floor(1 / sqrt(5) phi.alt^n) $ <my-second-equation>
== Some History
#figure([c], caption: [This is the third figure.]) <my-third-figure>
== Hello World
#figure([d], caption: [Guess what? This is also a figure.])
#figure([e], caption: [This is the final figure.])
|
https://github.com/Jollywatt/typst-fletcher | https://raw.githubusercontent.com/Jollywatt/typst-fletcher/master/docs/readme-examples/first-isomorphism-theorem.typ | typst | MIT License | #diagram(cell-size: 15mm,/*<*/
edge-stroke: fg,
crossing-fill: none,/*>*/ $
G edge(f, ->) edge("d", pi, ->>) & im(f) \
G slash ker(f) edge("ur", tilde(f), "hook-->")
$) |
https://github.com/typst-community/typst-install | https://raw.githubusercontent.com/typst-community/typst-install/main/test.typ | typst | MIT License | + The climate
- Temperature
- Precipitation
+ The topography
+ The geology
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/hover/param.typ | typst | Apache License 2.0 |
/// Test
///
/// - param (int): The `parameter`.
#let f(/* ident after */ param: 1) = 1;
|
https://github.com/Gekkio/gb-ctr | https://raw.githubusercontent.com/Gekkio/gb-ctr/main/chapter/cartridges/mbc7.typ | typst | Creative Commons Attribution Share Alike 4.0 International | #import "../../common.typ": *
== MBC7
TODO.
|
https://github.com/jbirnick/typst-headcount | https://raw.githubusercontent.com/jbirnick/typst-headcount/master/README.md | markdown | MIT License | > [!NOTE]
> This is a [Typst](https://typst.app/) package. Click [here](https://typst.app/universe/package/headcount/) to find it in the Typst Universe.
# `headcount`
This package allows you to make **counters depend on the current chapter/section number**.
The advantage compared to [rich-counter](https://typst.app/universe/package/rich-counters/) is that it is more efficient and you stick with native `counter`s.
## Showcase
In the following example, `mycounter` inherits the first level from headings (but not deeper levels).
```typ
#import "@preview/headcount:0.1.0": *
#import "@preview/great-theorems:0.1.0": *
#show: great-theorems-init
#set heading(numbering: "1.1")
// construct theorem environment with counter that inherits 2 levels from heading
#let thmcounter = counter("hello")
#let theorem = mathblock(
blocktitle: [Theorem],
counter: thmcounter,
numbering: dependent-numbering("1.1", levels: 2)
)
#show heading: reset-counter(thmcounter, levels: 2)
// set figure counter so that it inherits 1 level from heading
#set figure(numbering: dependent-numbering("1.1"))
#show heading: reset-counter(counter(figure.where(kind: image)))
= First heading
The theorems inherit 2 levels from the headings and the figures inherit 1 level from the headings.
#theorem[Some theorem.]
#theorem[Some theorem.]
#figure([SOME FIGURE], caption: [some figure])
#figure([SOME FIGURE], caption: [some figure])
== Subheading
#theorem[Some theorem.]
#figure([SOME FIGURE], caption: [some figure])
#figure([SOME FIGURE], caption: [some figure])
= Second heading
#theorem[Some theorem.]
#figure([SOME FIGURE], caption: [some figure])
#theorem[Some theorem.]
```

## Usage
To make another `counter` inherit from the heading counter, you have to do **two** things.
1. For the numbering of your counter, you have to use `dependent-numbering(...)`.
    - `dependent-numbering(style, levels: 1)` (needs `context`)
      Is a replacement for the `numbering` function, with the difference that it precedes any counter value with `levels` many values of the heading counter.
```typ
#import "@preview/headcount:0.1.0": *
#set heading(numbering: "1.1")
#let mycounter = counter("hello")
= First heading
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
= Second heading
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
```
    This displays the desired number of levels of the heading counter in front of the actual counter.
However, as you can see in the code above, our actual counter does not yet reset in each section.
2. For resetting the counter at the appropriate places, you need to equip `heading` with the `show` rule that `reset-counter(...)` returns.
    - `reset-counter(counter, levels: 1)` (needs `context`)
      Returns a function that should be used as a `show` rule for `heading`. It will reset `counter` if the level of the heading is less than or equal to `levels`.
```typ
#import "@preview/headcount:0.1.0": *
#set heading(numbering: "1.1")
#let mycounter = counter("hello")
#show heading: reset-counter(mycounter, levels: 1)
= First heading
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
= Second heading
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
#context mycounter.step()
#context mycounter.display(dependent-numbering("1.1"))
```
**Note:** The `levels` value that you pass to `dependent-numbering(...)` and the `levels` value that you pass to `reset-counter(...)` must be the _same_.
## Limitations
Due to current Typst limitations, there is no way to detect manual updates or steps of the heading counter, like `counter(heading).update(...)` or `counter(heading).step(...)`.
Only occurrences of actual `heading`s can be detected.
So make sure that after you call e.g. `counter(heading).update(...)`, you place a heading directly after it, before you use any counters that depend on the heading counter.
|
https://github.com/Myriad-Dreamin/tinymist | https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/docs/tinymist/frontend/sublime-text.typ | typst | Apache License 2.0 |
#import "/docs/tinymist/frontend/mod.typ": *
#show: book-page.with(title: "Tinymist Sublime Support for Typst")
Follow the instructions in the #link("https://github.com/sublimelsp/LSP/blob/main/docs/src/language_servers.md#tinymist")[sublimelsp documentation] to make it work.
|
https://github.com/lphoogenboom/typstThesisDCSC | https://raw.githubusercontent.com/lphoogenboom/typstThesisDCSC/master/typFiles/specialChapter.typ | typst | // !!!!
// STUDENTS, DO NOT EDIT THIS FILE!
// !!!!
#import "../projectInfo.typ": student, report
#let specialChapter(
content: lorem(40),
chapterTitle: "Special Chapter",
studentName: "<NAME>",
showInOutline: true,
body
) = {
set align(top)
let topMargin = 2.5cm+1.35cm
set par(justify: true, linebreaks: "optimized")
set text(size: 10pt,font: "New Computer Modern Math", weight: 500)
set page(
numbering: "i",
margin: (top:topMargin, bottom: 3.16cm),
header:
[
#locate(loc => {
let titlePage = counter(page).at(label(lower(chapterTitle))).first()
let thisPage = counter(page).at(loc).first()
set align(if calc.rem(thisPage, 2) == 0 { left } else { right })
if thisPage != titlePage {counter(page).display("i"); v(-9pt) ; line(length: 100%, stroke: 0.5pt)} else {}
})
],
header-ascent: 21.4%,
footer-descent: 9%,
footer:
[
#locate(loc => {
let n = counter(page).at(loc).first()
if calc.rem(n,2) == 0 {
[
#stack(dir: ltr,
text(font: "New Computer Modern Sans")[#student.name],
align(right)[#text(font: "New Computer Modern Sans")[#report.type]])
]
}
else {
[
#stack(dir: ltr,
text(font: "New Computer Modern Sans")[#report.type],
align(right)[#text(font: "New Computer Modern Sans")[#student.name]])
]
}
})
]
)
v(116pt-topMargin+2.5cm)
line(length: 100%, stroke: 2pt)
v(-10pt)
align(
right,
//[#text(size: 24.4pt, font: "New Computer Modern Sans", weight: "bold")[#chapterTitle ]#label(lower(chapterTitle))]
[#text(size: 24.4pt, font: "New Computer Modern Sans")[#heading(outlined: showInOutline)[#chapterTitle]]#label(lower(chapterTitle))]
)
v(87pt)
set text(size: 10.5pt,font: "New Computer Modern Math", weight: 500)
[#content]
pagebreak(to: "even", weak: false)
body
} |
|
https://github.com/Enter-tainer/wavy | https://raw.githubusercontent.com/Enter-tainer/wavy/master/README.md | markdown | MIT License | # [Wavy](https://github.com/Enter-tainer/wavy)
Draw digital timing diagrams in Typst using [Wavedrom](https://wavedrom.com/).

````typ
#import "@preview/wavy:0.1.1"
#set page(height: auto, width: auto, fill: black, margin: 2em)
#set text(fill: white)
#show raw.where(lang: "wavy"): it => wavy.render(it.text)
= Wavy
Typst, now with waves.
```wavy
{
signal:
[
{name:'clk',wave:'p......'},
{name:'bus',wave:'x.34.5x',data:'head body tail'},
{name:'wire',wave:'0.1..0.'}
]
}
```
```js
{
signal:
[
{name:'clk',wave:'p......'},
{name:'bus',wave:'x.34.5x',data:'head body tail'},
{name:'wire',wave:'0.1..0.'}
]
}
```
````
## Documentation
### `render`
Renders a Wavedrom JSON5 string to an image.
#### Arguments
* `src`: `str` - the Wavedrom JSON5 source
* All other arguments are passed through to `image.decode`, so you can customize the image size
#### Returns
The image, of type `content`
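
Since the extra arguments are forwarded to `image.decode`, the rendered size can be pinned directly. A small sketch (the `width` parameter belongs to `image.decode`, not to wavy itself):

```typ
#import "@preview/wavy:0.1.1"

// Render a one-signal diagram at half the line width;
// `width` is passed through to image.decode.
#wavy.render(
  "{signal: [{name: 'clk', wave: 'p....'}]}",
  width: 50%,
)
```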
|
https://github.com/profetia/me | https://raw.githubusercontent.com/profetia/me/main/src/option_ext.typ | typst | #import "option.typ": declare, option
#option("lang", "en")
#let en = declare("lang", "en")
#let zh = declare("lang", "zh")
|
|
https://github.com/piepert/logik-tutorium-wise2024-2025 | https://raw.githubusercontent.com/piepert/logik-tutorium-wise2024-2025/main/src/raw-plan.typ | typst | Creative Commons Zero v1.0 Universal | //#DONT_COMPILE_TO_PDF
#import "/src/packages/goals.typ": *
#import "/src/templates/exercise.typ": *
#show: ref-goals
#state("tut-dates").update((
// datetime(year: 2024, month: 10, day: 16), // starts only in the second week
datetime(year: 2024, month: 10, day: 23),
datetime(year: 2024, month: 10, day: 30),
datetime(year: 2024, month: 11, day: 6),
datetime(year: 2024, month: 11, day: 13),
datetime(year: 2024, month: 11, day: 20),
datetime(year: 2024, month: 11, day: 27),
datetime(year: 2024, month: 12, day: 4),
datetime(year: 2024, month: 12, day: 11),
datetime(year: 2024, month: 12, day: 18),
datetime(year: 2025, month: 1, day: 8),
datetime(year: 2025, month: 1, day: 15),
datetime(year: 2025, month: 1, day: 22),
datetime(year: 2025, month: 1, day: 29),
))
#let plan-sequence(content) = (
table.cell(colspan: 3, fill: none, v(-1em)),
table.hline(stroke: purple + 1pt),
table.cell(colspan: 3, fill: purple, align(center,
strong(
text(fill: white,
counter("table-sequence").step() +
[Section #counter("table-sequence").display() - #content])))),
table.hline(stroke: purple + 1pt)
)
#let cell-meeting = align(center + top,
counter("plan-table").step() + locate(loc => [
*Session #counter("plan-table").at(loc).first()* \
#if state("tut-dates").at(loc).len() > (counter("plan-table").at(loc).first() - 1) {
state("tut-dates").at(loc).at(counter("plan-table").at(loc).first() - 1).display("[day].[month].[year]")
} else {
[N/A]
}
])
)
#show table.cell: set par(justify: false)
#table(
columns: (15%, auto, auto),
stroke: none,
fill: (col, row) => (
purple,
blue.lighten(75%),
none,
).at(
if row == 0 {
row
} else {
1 + calc.rem(row, 2)
}
),
// map-cells: cell => {
// if cell.x == 0 and cell.y >= 1 and cell.colspan == 1 {
// cell.content = align(center + top, counter("plan-table").step() + locate(loc => [
// *Session #counter("plan-table").at(loc).first()* \
// #if state("tut-dates").at(loc).len() > (counter("plan-table").at(loc).first() - 1) {
// state("tut-dates").at(loc).at(counter("plan-table").at(loc).first() - 1).display("[day].[month].[year]")
// } else {
// [N/A]
// }
// ]))
// }
// return cell
// },
// map-rows: (row, cells) => {
// let index = 0
// while index < cells.len() {
// if cells.at(index) == none {
// index += 1
// continue
// }
// cells.at(index).content = [
// #set par(justify: false)
// #set text(size: 0.75em)
// #if row <= 0 {
// v(0.25em)
// cells.at(index).content
// v(0.25em)
// } else {
// v(0.5em)
// cells.at(index).content
// v(0.5em)
// }
// ]
// index += 1
// }
// return cells
// },
// Nr., Datum, Thema + organisatorisches (Lernevaluation?), Lektüre, Aufgabenblatt
text(fill: white, strong[Session]),
table.vline(stroke: purple),
text(fill: white, strong[Content, Materials]),
table.vline(stroke: purple),
text(fill: white, strong[Goals]),
..plan-sequence[Logical Foundations],
cell-meeting, [
*Organizational Matters*
- introductions and expectations
*Introduction to Logic*
- introduction to and motivation for logical analysis
- philosophical arguments and their quality criteria
*Materials:*
- exercise series #counter("plan-table").display()
], [
// - I can define the term "logic".
// - I can explain the structure of a philosophical argument.
// - I can define the term "argument".
// - I can name the quality criteria of philosophical arguments.
@definieren-logik[I can define the term "logic".]
#multi-goal-ref([I know what a philosophical argument is and how it is structured.], "wissen-phil-argumente", "erkennen-phil-argumente")
@definieren-argument[I can define the term "argument".]
#multi-goal-ref([I can name the quality criteria of philosophical arguments.], "definieren-gültigkeit", "definieren-schlüssigkeit")
],
cell-meeting, [
*Inference and Proving Entailment*
- the quality criteria in more depth
- logical entailment
- simple proofs
*Materials:*
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
@identifizieren-aussagesätze[I can identify declarative sentences.]
#multi-goal-ref([I can define the quality criteria of philosophical arguments and distinguish them from one another.], "definieren-gültigkeit", "definieren-schlüssigkeit")
@definieren-logische-folgerung[I can define "logical entailment".]
@kennen-aufbau-beweis[I can set up a proof correctly.]
// @beweise-metasprache[I can carry out a simple indirect proof.]
],
..plan-sequence[Propositional Logic],
cell-meeting, [
*Foundations of Formalization*
- propositional connections in natural language
- propositional sentence components of natural language
- necessary and sufficient conditions
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
], [
// - I can motivate the formalization of valid inferences.
@identifizieren-al-strukturen[I can identify the propositional structure of the German language.]
@identifizieren-hinr-notw-bed[I can determine the sufficient and the necessary condition in an if-then sentence.]
],
cell-meeting, [
*Syntax of Propositional Logic, AL Formalization*
- schemata and pattern recognition
- syntax of propositional logic
- the propositional connectives
- formalizing natural-language expressions into the language AL
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
#multi-goal-ref([I can recognize whether an expression is formed syntactically correctly according to the rules of AL.], "entwickeln-gefühl-al-syntax", "erkennen-suchen-schemata")
#multi-goal-ref([I can form syntactically correct expressions according to the formation rules of AL.], "entwickeln-gefühl-al-syntax", "bilden-ausdrücke-schemata")
#multi-goal-ref([I can recognize the propositional connectives in natural language and formalize them correctly.], "identifizieren-junktoren", "formalisieren-al")
@formalisieren-wd-gdw-nur[I can correctly formalize the phenomena "only" and "if and only if" in if-then and if-and-only-if sentences.]
],
..plan-sequence[Truth Tables],
cell-meeting, [
*Semantics of Propositional Logic*
- semantics of the connectives
- logical truth, logical falsity
- logical entailment and logical equivalence
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
@bilden-notw-hinr-äquivalenz[I can form equivalent natural-language sentences for if-then sentences, especially in connection with "only" and the contraposition of the conditional.]
@definieren-semantik-junktoren[I can state the truth conditions of the connectives in natural language.]
@darstellen-junktoren-wahrheitstabelle[I can represent the truth conditions of the connectives with a truth table.]
@auswerten-ausdrücke-wahrheitstabelle[I can evaluate AL expressions with a truth table.]
#multi-goal-ref([I can prove "logical truth", "logical falsity" and "logical equivalence" with a truth table.], "beweisen-logische-wahrheit", "beweisen-logische-folgerung", "beweisen-logische-äquivalenz")
],
..plan-sequence[The Natural Deduction Calculus (KdnS)],
cell-meeting, [
*Deriving in the KdnS*
- introduction of the KdnS
- the rules: DS, KM, KP, $not$-elim. and $not$-intro.
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
], [
@aufbauen-kdns[I can set up the KdnS correctly.]
#multi-goal-ref([I can recognize and apply schemata for derivation rules in the KdnS.], "verstehen-direkte-regeln", "beweise-kdns-einfach")
@beweise-kdns-einfach[I can carry out simple to moderately complex proofs in the natural deduction calculus.]
],
cell-meeting, [
*Proofs with Additional Assumptions*
- the rules: $and$-elim., $and$-intro., $or$-intro., MP, MT
- the left proof column
- the rule of $->$-introduction
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
], [
@bilden-linke-beweisspalte[I can produce the left proof column correctly and read off the dependencies of a line from it.]
@erkennen-konditionalisierung[I can recognize when a $->$-intro. is called for.]
@prüfen-abhängigkeiten[I know when and how I have to check the dependencies of my derived conclusion.]
],
cell-meeting, [
*Reductio ad Absurdum, Branching Proofs*
- the rules: DM, $<->$-elim., $<->$-intro., $->$-repl. and $->$-intro.
- the rule of reductio ad absurdum (RAA)
- branching proofs
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
#multi-goal-ref([I can correctly carry out a proof with the RAA rule in the KdnS.], "wissen-raa-verfahren", "beweisen-mittels-raa")
@beweisen-verzweigt[I can carry out a simple branching proof.]
],
..plan-sequence[Predicate Logic],
cell-meeting, [
*Motivation and Syntax of Predicate Logic, Predicate-Logical Formalization*
- syllogisms, predication and model theory
- syntax of predicate logic
- formalization of unquantified examples
- formalization of quantified examples
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
], [
@formalisieren-pl-unquantifiziert[I can formalize simple to moderately complex unquantified predicate-logical statements.]
@formalisieren-pl-quantifiziert[I can formalize simple quantified predicate-logical statements.]
],
cell-meeting, [
*Quantifiers and the Square of Opposition*
- the square of opposition
- formalization of quantified sentences
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
@benennen-begriffe-log-quad[I can name the notions of the square of opposition.]
@finden-beispiele-log-quad[Given a sentence in the square of opposition, I can form further sentences for the open positions of the square.]
@umrechnen-quantoren[I can remove the negation sign in front of quantifiers by converting them.]
],
..plan-sequence[The Natural Deduction Calculus for Predicate Logic],
cell-meeting, [
*Unrestricted Predicate-Logical Derivation Rules*
- the rules: $forall$-elim., $exists$-intro. and QT
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
], [
@variablen-spezialisieren[I can correctly specialize universally quantified sentences with $forall$-elim.]
@konstanten-generalisieren[I can correctly generalize unquantified sentences with $exists$-intro.]
],
cell-meeting, [
*Restricted Predicate-Logical Derivation Rules*
- the rules: $exists$-elim., $forall$-intro. and PKS
*Materials:*
- script p.
- exercise series #counter("plan-table").display()
- LEV #counter("table-sequence").display()
], [
#multi-goal-ref([I can correctly generalize unquantified sentences with $forall$-intro., taking the restrictions into account.], "kennen-einschränkung-allq-einf", "konstanten-generalisieren")
#multi-goal-ref([I can correctly specialize existentially quantified sentences with $exists$-elim., taking the restrictions into account.], "kennen-einschränkung-exq-bes", "variablen-spezialisieren")
#multi-goal-ref([I can correctly check the conditions of $exists$-elim. and $forall$-intro. in my derivation.], "kennen-einschränkung-allq-einf", "kennen-einschränkung-exq-bes")
],
..plan-sequence[Reserve],
cell-meeting, table.cell(align: center + horizon, [
*Reserve*
]), [],
// [], table.cell(align: center + horizon, [
// *Reserve*
// ]), [],
)
|
https://github.com/tomowang/typst-twentysecondcv | https://raw.githubusercontent.com/tomowang/typst-twentysecondcv/main/README.md | markdown | MIT License | # Twenty Seconds CV/Resume in Typst

[](https://typst.app/)
This Typst CV template is inspired by LaTeX template
[Twenty Seconds Resume/CV](https://www.latextemplates.com/template/twenty-seconds-resumecv).
## How to Use
This project has the following files:
- `twentysecondcv.typ` - The Typst template file
- `example.typ` - The example resume
- `fonts` - Font Awesome 6.4.2 for Desktop. Downloaded from [https://fontawesome.com/download](https://fontawesome.com/download)
This template uses Font Awesome for icons. You need to import the Font Awesome fonts
in your Typst project for the icons to show up properly. On macOS, just double-click
the .otf font file to install.
> ~~Note: If you use an icon with a dash sign like `chess-queen`, it may not work properly.
> See https://github.com/typst/typst/issues/2578.~~
>
> ~~You can use unicode instead. For example, `chess-queen` can be replaced with `"\u{f445}"`.
> You can find the unicode [here](https://github.com/typst/packages/blob/main/packages/preview/fontawesome/0.2.1/lib-gen.typ)~~
>
> According to https://github.com/duskmoon314/typst-fontawesome/issues/5,
> the ligature issue seems to be fixed and you can use icon names with dashes.
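
With that fix in place, icons can be referenced by name directly. A minimal sketch using the typst-fontawesome package linked above (the icon choice is arbitrary):

```typ
#import "@preview/fontawesome:0.2.1": *

// Dash-separated icon names now resolve correctly:
#fa-icon("chess-queen")
```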
Use `typst compile example.typ` to compile the example.typ into PDF.
This command will download the `@preview/fontawesome:0.2.1` package automatically.
If you want to use this template in [https://typst.app](https://typst.app),
just upload font awesome font files and `twentysecondcv.typ` template file.
You can see online example at [https://typst.app/project/rjQNvqRtLwNI1bfWz1iwKt](https://typst.app/project/rjQNvqRtLwNI1bfWz1iwKt)

## Todo
- [ ] smartdiagram for skills
|
https://github.com/Enter-tainer/m-jaxon | https://raw.githubusercontent.com/Enter-tainer/m-jaxon/master/test.typ | typst | MIT License | #import "./typst-package/lib.typ" as m-jaxon
// Uncomment the following line to use the m-jaxon from the official package registry
// #import "@preview/m-jaxon:0.1.1"
#m-jaxon.render(`E`, inline: true) #m-jaxon.render("E = mc^2", inline: true)
|
https://github.com/kacper-uminski/math-notes | https://raw.githubusercontent.com/kacper-uminski/math-notes/main/tams11/book.typ | typst | Creative Commons Zero v1.0 Universal | #let book_template(
title: none,
course: none,
author: "<NAME>",
doc
) = {
set text(size: 10pt, font: "New Computer Modern")
set outline(depth: 2)
set math.equation(supplement: it => [Eq.#it])
show outline.entry.where(level : 1): it => {
v(12pt, weak: true)
strong(it)
}
show heading.where(level: 1): it => {
set text(size: 25pt)
pagebreak(weak: true)
it
v(1cm)
}
set heading(
numbering: (..nums) => {
let (section, ..subsections) = nums.pos()
if subsections.len() == 0 {
[Chapter #section: ]
} else {
numbering("1.1", ..nums.pos())
}
},
)
show math.integral: math.limits.with(inline: false)
align(center, text(18pt)[
#course - #title
])
show sym.emptyset: sym.phi.alt
align(center, text(15pt)[
#smallcaps(author)
])
pagebreak()
doc
}
#let titled_block(title, txt) = [
#counter(heading).step(level: 3)
*#title #context numbering("1.", ..counter(heading).get())*
#txt
]
#let example(txt) = titled_block("Example", emph(txt))
#let definition(txt) = titled_block("Definition", txt)
#let ncr(all, choice) = $vec(all,choice)$
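
A hypothetical notes file using this template could look as follows (the chapter content is made up; the function names match the definitions above):

```typ
#import "book.typ": *

#show: doc => book_template(
  title: "Lecture Notes",
  course: "TAMS11",
  doc,
)

= Combinatorics
#definition[The number of $k$-element subsets of an $n$-element set is $ncr(n, k)$.]
#example[There are $ncr(5, 2) = 10$ two-element subsets of a five-element set.]
```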
|
https://github.com/EricWay1024/Homological-Algebra-Notes | https://raw.githubusercontent.com/EricWay1024/Homological-Algebra-Notes/master/ha/0-module.typ | typst | #import "../libs/template.typ": *
= Module Theory Recap
<module-recap>
#definition[
Let $R$ be a ring. A *left $R$-module* $M$ is an abelian group with maps $R times M -> M$ (called multiplication), denoted as $(r, m) |-> r dot m = r m$, which satisfies:
$ r(m_1 + m_2) &= r m_1 + r m_2, \
(r_1 + r_2) m &= r_1 m + r_2 m, \
(r_1 r_2) m &= r_1 (r_2 m), \
1_R dot m &= m. $
A *right $R$-module* is defined similarly, but with multiplication on the right, namely $m r$.
If $R$ is a commutative ring, then left and right $R$-modules are the same, and we call them *$R$-modules*.
]
// Another way to understand the definition is to think of $R$ acting on an abelian group $M$, where for each $r in R$ we define a group homomorphism $M -> M$, denoted as $m |-> r dot m = r m$.
// Obviously a left $R$-module is the same as a right $R^op$-module.
#definition[
Let $M$ be a #lrm. A *submodule* $N$ of $M$ satisfies:
- $N$ is a subgroup of $(M, +)$;
- $r n in N$ for all $r in R$ and $n in N$.
In this case we denote $N subset M$.
]
#definition[
Let $R$ be a ring. Let $M_1, M_2$ be left $R$-modules. A map $phi : M_1 -> M_2$ is a *module homomorphism* if it satisfies:
$
phi(x + y) &= phi(x) + phi(y), \
phi(r x) &= r phi(x).
$
for all $x, y in M_1$ and $r in R$.
]
// Compositions of module homomorphisms are still module homomorphisms, and hence we obtain the category of left $R$-modules, denoted as $RMod$. Similarly, we have the category of right $R$-modules, denoted as $ModR$.
// For all $M, M' in RMod$, we see that $ homr(M, M')$ is an abelian group.
// [Remark 6.1.8 Li].
// Also, $endr(M)$ is a ring, where the multiplication is defined as composition. Therefore any right $R$-module is also a left $D := endr(M)$-module.
// [p. 205, Li].
#definition[
The *kernel* of a module homomorphism $phi : M_1 -> M_2$ is defined as
$ Ker(phi) := {x in M_1 : phi(x) = 0}. $
The *image* of $phi$ is defined as
$ IM(phi) := {phi(x) : x in M_1}. $
It can be shown that $Ker(phi) subset M_1$ and $IM(phi) subset M_2$.
]
#definition[
Let $N subset M$ be #lrms. Define a #lrm on the quotient group $M over N$ with
$ r(x + N) = r x + N $
for all $r in R$ and $x in M$.
Then the *quotient map* $M -> M over N$ is a module homomorphism and $M over N$ is
a *quotient module*.
]
#definition[
Let $X$ be a set. The *free module* with basis $X$ is defined as $ R^(ds X) = plus.circle.big_(x in X) R x. $
We have the inclusion map $i: X -> R^(ds X)$ between sets: $ i(x) = 1_R dot x. $
An element $m in R^(ds X) $ can be written as
$ m = sum_(x in X) a_x x, $
where only finitely many $a_x in R$ is non-zero.
]
#proposition[
For any $R$-module $M$ and map between sets $phi.alt: X -> M$, there exists a unique module homomorphism $phi : R^(ds X) -> M$ that makes the following diagram commute:
// #align(center,image("../imgs/2023-10-28-21-16-04.png",width:30%))
// https://t.yw.je/#N4Igdg9gJgpgziAXAbVABwnAlgFyxMJZABgBpiBdUkANwEMAbAVxiRAA0QBfU9TXfIRQBGclVqMWbAEoA9ABQAPCACcABOwCU3XiAzY8BIqOHj6zVohABZbuJhQA5vCKgAZiogBbJGRA4IJGEed08fRD8ApAAmanMpKzQACywAOkYcHVDvIOooxFiQBiwwSxAoOjgkhxA4yTKYRSw4HDgAQjU1Oy4gA
#align(center, commutative-diagram(
node-padding: (50pt, 50pt),
node((0, 0), [$X$]),
node((0, 1), [$R^(xor X)$]),
node((1, 1), [$M$]),
arr((0, 0), (0, 1), []),
arr((0, 0), (1, 1), [$phi.alt$]),
arr((0, 1), (1, 1), [$exists! phi $], "dashed"),
))
]
#definition[
Let $X$ be a subset of $R$-module $M$ and let $i: X->M$ be the inclusion map. We have the corresponding map $sigma: R^(ds X) -> M$. We say
- $X$ is *linearly independent* or *free* if $sigma$ is injective, and $X$ is *linearly dependent* otherwise;
- $X$ spans or generates $M$ if $sigma$ is surjective, in which case $X$ is a *generating set* of $M$. A module with a finite generating subset is called a *finitely generated module*.
A linearly independent generating subset of $M$ is called a *basis* of $M$, and a module with a basis is called a *free module*.
// #align(center,image("imgs/2023-10-28-21-19-38.png",width:100%))
]
#corollary[
Any $R$-module $M$ is isomorphic to a quotient of a free module.
]
<module-generator>
// #proof[
// Take some subset $X$ of $M$ and inclusion map $i : X -> M$, we have the corresponding homomorphism $sigma: R^(ds X) -> M$ with $im(sigma) iso R^(ds X) over ker(sigma)$. If we take $X = M$ (or any generating set of $M$), then $im(sigma) = M$.
// ]
// #remark[
// This means we have the exact sequence:
// $ 0 -> ker(sigma) -> R^(ds X) rgt(sigma) M -> 0 $
// ]
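
A standard concrete instance of this corollary (added here for illustration, not from the original notes), written out in LaTeX:

```latex
% Z/nZ as a quotient of the free Z-module Z: take X = {x}, so R^{(X)} = Z.
\[
  \sigma \colon \mathbb{Z} \twoheadrightarrow \mathbb{Z}/n\mathbb{Z},
  \qquad \sigma(k) = k + n\mathbb{Z},
  \qquad \ker \sigma = n\mathbb{Z},
\]
% hence Z/nZ \cong Z / \ker\sigma, exhibiting Z/nZ as a quotient of a free module.
```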
#proposition[
Any submodule of a free module over a PID is free.
]
<sub-pid>
// #proof[
// TODO
// ]
// Remark: Categorification. If we discuss a category $(C, ds, tp)$ then this is similar to categorifying some underlying structure of a ring. Active field of research.
// We were discussing $V tpk W$ for $k$ a field. To generalise, for a ring $R$ and left $R$-modules $M, N$, we can define $M tpr N$.
// Note: if $R$ is a non-commutative ring, $M$ is a right $R$-module and $N$ is a left $R$-module, then $M tpr N$ is (only) an abelian group. There would be a problem moving the scalars $r in R$ from side to side in the definition (using free modules), i.e. we can only have things like $m r tp n - m tp r n$.
// = Introduction
// #definition[A ring $(R, +, dot)$ satisfies:
// - $R$ is an abelian group under addition;
// - Multiplication is associative;
// - Distributive: $a(b+c)= a b + a c, (a + b) c = a c + b c$.
// Optionally,
// - Multiplication can be commutative $=>$ commutative rings;
// - Multiplication can have an identity.
// ]
// == Analogies between groups and rings
// Groups act on sets. Rings act on modules.
// #definition[
// Let $M$ be an abelian group. A module is when $R$ acts on $M$, satisfying:
// - $r (m + n) = r m + r n$ (distributive for addition in $M$);
// - $(r + s) m = r m + s m$ (distributive for addition in $R$);
// - $(r s) m = r (s m)$ (associative for multiplication in $R$);
// - $1_R m = m$ if $R$ has $1_R$.
// ]
// When $R=ZZ$ the module is an abelian group.
// We have left, right, and 2-sided group actions (where the left and right actions commute). Similarly, we have left, right, and 2-sided modules. If the ring is commutative, then left and right modules are the same.
// We have disjoint union for sets. We also have direct sum for modules:
// #definition[
// Let $M, N$ be $R$-modules. The direct sum $M plus.circle N$ is the $R$-module formed by $(m, n)$ where $m in M, n in N$.
// ]
// We have the Cartesian product for sets. We also have the tensor product for modules. $abs(S times T) = abs(S) times abs(T)$ and similarly $dim (V times.circle W) = dim V times dim W$. Caution: the inclusion-exclusion principle does not work for modules when there are more than $2$ modules.
// Cayley theorem claims that every group $G$ is the symmetry of the set $G$ acted on the right by $G$; hence the symmetry group is $G$ acting on the left. Every ring is the set of endomorphisms of some abelian group. We take $M = R$ with a right action of $R$. The endomorphism is just $R$ acting on the left.
// Homomorphisms of groups. Homomorphisms of rings. Caution: $ZZ \/ 6ZZ tilde.equiv ZZ \/ 2 ZZ plus.circle ZZ \/ 3ZZ$ by the Chinese remainder theorem but $ZZ \/ 2 ZZ$ is not a subring of $ZZ \/ 6 ZZ$, because it does not take $1$ to $1$.
// We have maps of $G$-sets that preserve the action of $G$. We also have homomorphism of modules (linear transformation). Notice that for a left module, a homomorphism $f: M-> N$ of modules should be written on the right: $m f in N$ so that $(r m) f = r (m f)$.
// We have subgroups. We also have subrings. Normal subgroups; ideals. But we have left, right and 2-sided ideals. A 2-sided ideal is a kernel of homomorphism of rings.
// #definition[An ideal is closed under addition and for any $i in I, r in R$, we have $i r , r i in I$.]
// We also have left or right submodules. A left ideal is just a submodule of $R$ considered as a left module.
// We have symmetric groups $S_n$. We also have symmetric groups of free modules $R^n = R plus.circle R plus.circle ...$ and the set of linear transformations of $R^n$ which is $M_n (R)$, $n times n$ matrices.
// = Group rings
// Given a group $G$ and a ring $R$ |
|
https://github.com/jneug/schule-typst | https://raw.githubusercontent.com/jneug/schule-typst/main/src/core/document.typ | typst | MIT License | #import "../util/types.typ" as t
#let _author-schema = t.dictionary(
(
name: t.string(),
email: t.email(optional: true),
abbr: t.string(optional: true),
),
pre-transform: t.coerce.dictionary(it => (name: it)),
aliases: (
kuerzel: "abbr",
abbreviation: "abbr",
),
)
#let _doc-schema = t.dictionary(
(
type: t.string(),
type-long: t.string(),
title: t.content(),
topic: t.content(optional: true),
subject: t.content(optional: true),
number: t.string(
optional: true,
pre-transform: (_, v) => if v != none {
str(v)
},
),
class: t.string(optional: true),
author: t.array(
_author-schema,
pre-transform: t.coerce.array,
optional: true,
),
date: t.date(
pre-transform: (_, it) => if type(it) == str {
let parts
if it.contains(".") {
parts = it.split(".")
return datetime(
year: int(parts.at(2)),
month: int(parts.at(1)),
day: int(parts.at(0)),
)
} else if it.contains("-") {
parts = it.split("-")
return datetime(
year: int(parts.at(0)),
month: int(parts.at(1)),
day: int(parts.at(2)),
)
}
} else {
it
},
optional: true,
),
license: t.string(default: "cc-by-sa"),
version: t.string(
optional: true,
pre-transform: (_, it) => {
if type(it) == datetime {
it = it.display()
}
if it != none {
it = str(it)
}
it
},
),
variant: t.string(
optional: true,
pre-transform: (_, it) => if "variant" in sys.inputs {
sys.inputs.variant
} else if it != none {
str(it)
},
),
solutions: t.choice(
("none", "page", "here", "after"),
default: "seite",
aliases: (
"seite": "page",
"keine": "none",
"sofort": "here",
"folgend": "after",
),
),
preferred-theme: t.string(default: "default"),
),
aliases: (
"typ": "type",
"typ-lang": "type-long",
"titel": "title",
"reihe": "topic",
"thema": "topic",
"fach": "subject",
"klasse": "class",
"kurs": "class",
"nummer": "number",
"no": "number",
"nr": "number",
"datum": "date",
"authors": "author",
"autor": "author",
"autoren": "author",
"vari": "variant",
"variante": "variant",
"ver": "version",
"lizenz": "license",
"solution": "solutions",
"loesungen": "solutions",
"theme": "preferred-theme",
),
)
#let create(..args, options: (:), aliases: (:)) = {
let schema = _doc-schema
schema.dictionary-schema += options
if aliases != (:) {
let _pre-transform = schema.pre-transform
schema.pre-transform = (self, it) => {
it = _pre-transform(self, it)
for (src, dst) in aliases {
let value = it.at(src, default: none)
if (value != none) {
it.insert(dst, value)
let _ = it.remove(src)
}
}
return it
}
}
let doc = t.parse(args.named(), schema)
doc.insert("_debug", sys.inputs.at("debug", default: false) in ("1", "true", 1, true))
// add some utility function
doc += (
author-abbr: (sep: ", ", suffix: "(", prefix: ")") => {
let abbr = doc.author.filter(a => "abbr" in a)
if abbr != () {
suffix + abbr.map(a => a.at("abbr", default: none)).join(", ") + prefix
}
},
)
doc
}
#let _state-document = state("schule.document", (:))
#let update(func) = _state-document.update(doc => func(doc))
#let update-value(key, func) = _state-document.update(doc => {
doc.insert(key, func(doc.at(key, default: none)))
doc
})
#let get() = _state-document.get()
#let final() = _state-document.final()
#let use(func) = context func(_state-document.get())
#let get-value(key, default: none) = _state-document.get().at(key, default: default)
#let use-value(key, func, default: none) = context func(get-value(key, default: default))
#let save(doc) = _state-document.update(doc)
#let save-meta(doc) = context {
[#metadata(final())<schule-document>]
}
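
A hypothetical call site for the helpers above (the field values are invented; the German keys resolve through the alias table in the schema):

```typ
#import "document.typ"

#let doc = document.create(
  typ: "AB",                // alias for `type`
  typ-lang: "Arbeitsblatt", // alias for `type-long`
  titel: [Bruchrechnung],   // alias for `title`
  datum: "24.06.2024",      // parsed into a datetime by the date pre-transform
)
#document.save(doc)
```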
|
https://github.com/rabotaem-incorporated/probability-theory-notes | https://raw.githubusercontent.com/rabotaem-incorporated/probability-theory-notes/master/sections/02-general/03-joint-distribution.typ | typst | #import "../../utils/core.typ": *
== Joint Distribution
#ticket[Joint density of a distribution. Distribution functions and densities of independent random variables.]
#def[
A vector
$ arrow(xi) = (xi_1, xi_2, ..., xi_n): Omega --> RR^n $
is a _joint_ or _multivariate_ distribution, where
$P_(arrow(xi)) = P_(xi_1, xi_2, ..., xi_n)$ is the measure on Borel subsets of $RR^n$ given by $P_(arrow(xi)) (A) = P(arrow(xi) in A)$.
]
#notice[
$P_arrow(xi)$ uniquely determines $P_xi_1, P_xi_2$, ..., $P_xi_n$, but not the other way around.
It determines them because, taking a Borel set $B in RR$, we have $P_xi_1 (B) = P_(arrow(xi)) (B times RR^(n - 1))$.
Not the other way around: for example, consider $xi_1, xi_2: Omega --> {0, 1}$, tosses of a fair coin. Suppose $(xi_1, xi_2): Omega --> {(0, 0), (0, 1), (1, 0), (1, 1)}$ takes each value with probability $1/4$; then the tosses are independent, and $P_xi_1 = 1/2$, $P_xi_2 = 1/2$. Now suppose $xi_1 = xi_2$, that is, there is a single toss. Then again $P_xi_1 = 1/2$, $P_xi_2 = 1/2$. So $arrow(xi)$ cannot be recovered.
]
#def[
Random variables $xi_1$, $xi_2$, ..., $xi_n$ are independent if, for any Borel subsets $A_1$, $A_2$, ..., $A_n$ of $RR$,
$
P(xi_1 in A_1, xi_2 in A_2, ..., xi_n in A_n) = P(xi_1 in A_1) P(xi_2 in A_2) ... P(xi_n in A_n).
$
Equivalently, the events $xi_1 in A_1$, $xi_2 in A_2$, ..., $xi_n in A_n$ are independent. Since we may choose the $A_i$ however we like, we do not need the usual caveat that this must also hold for every subset of the indices.
]
#th[
$xi_1$, $xi_2$, ..., $xi_n$ are independent if and only if $P_(arrow(xi)) = P_xi_1 P_xi_2 ... P_xi_n$.
]
#proof[
- "$==>$": $P(xi_1 in A_1, ..., xi_n in A_n) = P(xi_1 in A_1) P(xi_2 in A_2) ... P(xi_n in A_n)$. Moreover,
$
P(xi_1 in A_1, ..., xi_n in A_n) =
P_(arrow(xi)) (A_1 times ... times A_n).
$
and
$
P(xi_1 in A_1) dot ... dot P(xi_n in A_n) =
P_xi_1 (A_1) dot ... dot P_xi_n (A_n).
$
So the measures $P_(arrow(xi))$ and $P_xi_1 P_xi_2 ... P_xi_n$ agree on measurable rectangles, and hence they agree on the whole space.
- "$<==$": Equality of the measures implies their equality on rectangles, which is exactly independence.
]
#def[
_Joint (multivariate) distribution function_:
$
F_(arrow(xi))&: RR^n --> [0, 1],\
F_(arrow(xi)) (arrow(x)) &:= P(xi_1 <= x_1, xi_2 <= x_2, ..., xi_n <= x_n).
$
]
#props[
1. $0 <= F_(arrow(xi)) <= 1$
2. $lim_(x_k -> -oo) F_(arrow(xi)) (arrow(x)) = 0$
3. $lim_(x_1, ..., x_n -> +oo) F_(arrow(xi)) (arrow(x)) = 1$
4. If $x_1 <= y_1$, ..., $x_n <= y_n$, then $F_(arrow(xi)) (arrow(x)) <= F_(arrow(xi)) (arrow(y))$.
]
#proof[
1. Obvious.
2. We check along sequences:
$x_1^((m)) -->_(m -> oo) -oo$.
$
F_(arrow(xi)) (x_1^((m)), x_2, ..., x_n) =
P(xi_1 <= x_1^((m)), xi_2 <= x_2, ..., xi_n <= x_n)
newline(-->^"continuity of\nmeasure from above")
P_(arrow(xi)) (Sect_(m = 1)^oo {xi_1 <= x_1^((m)), xi_2 <= x_2, ..., xi_n <= x_n}) = P(nothing) = 0.
$
3. Similarly.
4. Obvious.
]
#ticket[Joint density of a distribution. Distribution functions and densities of independent random variables.]
#def[
A _joint (multivariate) density_ is a function $p_arrow(xi) (t) >= 0$ such that
$
F_(arrow(xi)) (arrow(x)) = integral_oo^(x_1) ... integral_oo^(x_n) p_arrow(xi) (t_1, ..., t_n) dif t_n ... dif t_1.
$
]
#follow(name: "of the theorem")[
1. $xi_1$, $xi_2$, ..., $xi_n$ are independent if and only if $F_(arrow(xi)) (arrow(x)) = F_(xi_1) (x_1) ... F_(xi_n) (x_n)$.
2. Suppose $xi_1$, ..., $xi_n$ are absolutely continuous. Then
$
xi_1, ..., xi_n "are independent" <==> p_(arrow(xi)) (t) = p_(xi_1) (t_1) ... p_(xi_n) (t_n),
$
and, moreover, absolute continuity together with independence guarantees that a joint density exists.
]
#proof[
1.
- "$==>$":
$
F_(arrow(xi)) =
P_arrow(xi) ((-oo, x_1] times (-oo, x_2] times ... times (-oo, x_n]) newline(=)
P_xi_1 (-oo, x_1] dot ... dot P_(xi_n) (-oo, x_n] =
F_xi_1 (x_1) dot ... dot F_(xi_n) (x_n).
$
- "$<==$":
We know that the measures coincide on sets $(-oo, x_1] times ... times (-oo, x_n]$, and we want to show that they coincide on cells. For simplicity, let $n = 2$. Then
$
P_(xi, eta) ((a, b] times (c, d]) &newline(=)
P_(xi, eta) ((-oo, b] times (-oo, d]) -
P_(xi, eta) ((-oo, b] times (-oo, c]) &-\
P_(xi, eta) ((-oo, a] times (-oo, d]) +
P_(xi, eta) ((-oo, a] times (-oo, c]) newline(=)
F_(xi, eta) (b, d) - F_(xi, eta) (b, c) - F_(xi, eta) (a, d) + F_(xi, eta) (a, c) &newline(=)
F_xi (b) F_eta (d) - F_xi (b) F_eta (c) - F_xi (a) F_eta (d) + F_xi (a) F_eta (c) &newline(=)
(F_xi (b) - F_xi (a)) (F_eta (d) - F_eta (c)) &newline(=)
P_xi (a, b] P_eta (c, d].
$
Hence the measures coincide on cells, which gives independence.
The general case ($n > 2$) is done in the same way, just with more notation; in any case, it is not hard.
2.
- "$==>$": $F_arrow(xi) (arrow(x)) = F_(xi_1) (x_1) ... F_(xi_n) (x_n)$.
Let us prove that $p(arrow(t)) = p_(xi_1) (t_1) ... p_(xi_n) (t_n)$ is a joint density.
$
integral_(-oo)^(x_1) ... integral_(-oo)^(x_n) p_arrow(xi) (t_1, ..., t_n) dif t_n ... dif t_1 =
integral_(-oo)^(x_1) ... integral_(-oo)^(x_n) p_(xi_1) (t_1) dot ... dot p_(xi_n) (t_n) dif t_n ... dif t_1 newline(=)
integral_(-oo)^(x_1) p_(xi_1) (t_1) dif t_1 dot ... dot integral_(-oo)^(x_n) p_(xi_n) (t_n) dif t_n =
F_(xi_1) (x_1) dot ... dot F_(xi_n) (x_n) =^"independence" F_(arrow(xi)) (arrow(x)).
$
- "$<==$": the same way, with the same equalities.
]
=== A digression into measure theory. Convolution of measures.
#ticket[Convolutions of measures. Convolutions of measures that have densities.]
#def[
Let $mu$, $nu$ be finite measures on the Borel subsets of $RR$. The _convolution_ of the measures $mu$ and $nu$ is the measure $mu * nu$ on $RR$ such that
$
mu * nu (A) = integral_RR mu (A - y) dif nu (y).
$
]
#props[
1. $mu * nu (A) = integral_(RR^2) bb(1)_A (x + y) dif mu(x) dif nu(y)$.
2. $mu_1 * mu_2 * ... * mu_n (A) = integral_(RR^n) bb(1)_A (x_1 + ... + x_n) dif mu_1 (x_1) ... dif mu_n (x_n)$.
3. $mu * nu = nu * mu$.
4. $mu_1 * (mu_2 * mu_3) = (mu_1 * mu_2) * mu_3$.
5. $(c mu) * nu = c dot mu * nu$
6. $(mu_1 + mu_2) * nu = mu_1 * nu + mu_2 * nu$.
7. $mu * delta_0 = mu$, where $delta_0$ is the measure with _unit mass at zero_, that is,
$
delta_0 (A) = cases(0 "if" 0 in.not A, 1 "if" 0 in A).
$
]
#proof[
1.
$
mu * nu (A) =
integral_RR mu (A - y) dif nu (y) =
integral_RR integral_RR bb(1)_(A - y) (x) dif mu (x) dif nu (y) =
integral_(RR^2) bb(1)_A (x + y) dif mu(x) dif nu(y).
$
2. Analogous, plus induction.
3. Obvious.
4. Obvious.
5. Obvious.
6. Obvious.
7.
$
mu * delta_0 (A) = delta_0 * mu (A) = integral_RR delta_0 (A - y) dif mu(y) = integral_RR bb(1)_A (y) dif mu (y) = mu (A).
$
]
#th[
Let $mu$ and $nu$ be measures with densities $p_mu$ and $p_nu$ with respect to Lebesgue measure. Then $mu * nu$ is a measure with density
$
p(t) := integral_RR p_mu (t - s) dot p_nu (s) dif s.
$
]
#proof[
We need to prove that $mu * nu (A) = integral_A p(t) dif t$, that is,
$
mu * nu (A) =
integral_(RR^2) bb(1)_A (x + y) dif mu(x) dif nu(y) =^?
integral_(RR^2) bb(1)_A (t) dot p_mu (t - s) dot p_nu (s) dif s dif t newline(=)
integral_A integral_RR p_mu (t - s) dot p_nu (s) dif s dif t =
integral_A p(t) dif t.
$
Let us transform the second integral:
$
integral_(RR^2) bb(1)_A (t) dot p_mu (t - s) dot p_nu (s) dif s dif t =
integral_(RR^2) bb(1)_A (t) dot p_mu (t - s) dot p_nu (s) dif t dif s newline(=^(u = t - s))
integral_(RR^2) bb(1)_A (u + s) dot p_mu (u) dot p_nu (s) dif u dif s =
integral_(RR^2) bb(1)_A (u + s) dif mu(u) dif nu(s).
$
]
=== The digression is over; back to probability theory
#ticket[Distribution of a sum of independent random variables. Examples.]
#th[
If $xi$ and $eta$ are independent, then $P_(xi + eta) = P_(xi) * P_(eta)$.
]
#proof[
Let $B subset RR^2$ be such that $(x, y) in B$ if and only if $x + y in A$.
$
P_(xi + eta) (A) = P(xi + eta in A) = P((xi, eta) in B) = P_(xi, eta) (B) = integral_(RR^2) bb(1)_B (x, y) dif P_(xi, eta) (x, y) newline(=)
integral_(RR^2) bb(1)_A (x + y) dif P_(xi) (x) dif P_(eta) (y) = P_(xi) * P_(eta) (A).
$
]
#examples[
1. Convolution with a discrete distribution:
let
$
delta_x (A) = cases(0 "if" x in.not A, 1 "if" x in A),
$
and $nu = sum c_k delta_(x_k)$ be a discrete measure. Then
$
mu * nu (A) = integral_RR (mu (A - y)) dif nu (y) = sum c_k mu (A - x_k).
$
2. Let $xi_i sim op("Poisson") (lambda_i)$, with $xi_1$ and $xi_2$ independent. Then
$
P_(xi_i) = sum_(k = 0)^oo (lambda_i^k dot e^(-lambda_i))/(k!) delta_k,
$
and
$
P_(xi_1 + xi_2)({n}) = P_(xi_1) * P_(xi_2) ({n}) =
sum_(k = 0)^oo P_(xi_1) ({n - k}) dot (lambda_2^k e^(-lambda_2))/(k!) =
sum_(k = 0)^n (lambda_1^(n - k) e^(-lambda_1))/((n - k)!) dot (lambda_2^k e^(-lambda_2))/(k!) newline(=)
(e^(-lambda_1 - lambda_2))/(n!) sum_(k = 0)^n C_n^k lambda_1^(n - k) lambda_2^k =
(e^(-lambda_1 - lambda_2))/(n!) (lambda_1 + lambda_2)^n.
$
]
#exercise(plural: true)[
1. Let $xi_1, xi_2 sim op("Exp") (1)$ be independent. Find the distribution of $xi_1 + xi_2$.
2. Let $xi_i sim Nn(a_i, sigma_i^2)$ be independent. Prove that $xi_1 + xi_2 sim Nn (a_1 + a_2, sigma_1^2 + sigma_2^2)$.
]
Source: https://github.com/jakobjpeters/Typstry.jl (docs/source/guides/package_interoperability.md, MIT License)
# Package Interoperability
This guide illustrates how to use Typstry.jl in compatible notebooks and packages.
## Notebooks
IJulia.jl, Pluto.jl, and QuartoNotebookRunner.jl each [`render`](@ref) [`Typst`](@ref)s and [`TypstText`](@ref)s.
Pluto.jl and QuartoNotebookRunner.jl also `render` [`TypstString`](@ref)s,
whereas IJulia.jl will support them in its next feature release.
## Packages
### MakieTeX.jl
!!! note
This package re-exports [`@typst_str`](@ref) and [`TypstString`](@ref).
`````@eval
using Markdown: Markdown
Markdown.parse("""```julia-repl
julia> using CairoMakie, MakieTeX
julia> f = Figure(; size = (100, 100))
julia> LTeX(f[1, 1], TypstDocument(typst"\$ 1 / x \$"))
julia> save("makie_tex.svg", f)
```""")
`````
### TypstJlyfish.jl
`````@eval
using Markdown: Markdown
using Typstry: preamble
Markdown.parse("""```typst
$preamble#import "@preview/jlyfish:0.1.0": *
#read-julia-output(json("typst_jlyfish.json"))
#jl-pkg("Typstry")
#jl(`using Typstry; typst"\$1 / x\$"`)
```
```julia-repl
julia> using TypstJlyfish, Typstry
julia> TypstJlyfish.compile("typst_jlyfish.typ";
evaluation_file = "typst_jlyfish.json",
typst_compile_args = "--format=svg --font-path=\$julia_mono"
)
```""")
`````
Source: https://github.com/drupol/master-thesis (src/thesis/2-reproducibility.typ, Typst)
#import "theme/template.typ": *
#import "theme/common/titlepage.typ": *
#import "theme/common/metadata.typ": *
#import "theme/disclaimer.typ": *
#import "theme/leftblank.typ": *
#import "theme/acknowledgement.typ": *
#import "theme/abstract.typ": *
#import "theme/infos.typ": *
#import "theme/definition.typ": *
#chapterquote(
title: "Reproducibility",
ref: "chapter2",
quoteAttribution: <cacioppo2015social>,
quoteText: [
Reproducibility is a minimum necessary condition for a finding to be
believable and informative.
],
)
== Reproducibility in Science
#info-box(kind: "cite", footer: [@kpopper1934])[
No serious physicist would offer for publication, as a scientific discovery,
one for whose reproduction he could give no instructions.
]
The concept of reproducibility lies at the heart of scientific inquiry, serving
as a critical benchmark for the validation and acceptance of research findings.
It is a principle that transcends scientific disciplines, insisting that the
results of an experiment or study must be consistently replicable under
identical conditions by different researchers. This aspect of the scientific
method ensures the reliability and integrity of scientific knowledge. It
establishes a framework where hypotheses are not just tested but also subjected
to repeated verification, underpinning the trust and credibility that society
places in scientific discoveries. The journey of reproducibility, originating
from the earliest scientific endeavours, has evolved to adapt to the
complexities and nuances of modern research methodologies. This evolution
mirrors the progression of scientific thought and technology, from rudimentary
experiments to sophisticated, computer-assisted analyses.
The first traces of this concept can be glimpsed in @kpopper1934.
Reproducibility is far from new: it has been a cornerstone of the sciences for
centuries, underpinning their aim to explain natural phenomena in an objective
and repeatable manner.
According to @Castillo1669, the scientific method (@scientificmethod), a
formalised and widely-adopted process for exploring observations and answering
questions, is inherently designed to be repeatable. However, this does not
guarantee that the results of all experiments conducted using the scientific
method will be reproducible. When results cannot be replicated, it raises
questions about the validity of the experiment and the credibility of the
researcher.
#info-box[
In the realm of scientific research, #emph[repeatable] and #emph[reproducible]
are terms often used interchangeably, yet they hold distinct meanings.
#emph[Repeatable research] refers to the ability of a study or experiment to
yield the same results when conducted again under the same conditions with the
same materials and methods by the same researchers. It primarily focuses on
the consistency and reliability of results within the original research
context.
On the other hand, #emph[reproducible research] emphasises the ability of an
independent researcher to arrive at the same findings and conclusions using
the original study's raw data and following the same methodologies, but
possibly under different conditions and with different tools. Reproducibility
extends the validation process beyond the original researchers, ensuring that
the results hold up under scrutiny and can be reliably used as a foundation
for further study.
Together, repeatability and reproducibility are foundational to the integrity
and advancement of scientific knowledge, allowing for a deeper trust and
understanding of research findings.
]
While reproducibility can be considered closely aligned with the scientific
method, it is not an intrinsic part of it. The scientific method is a procedural
approach for conducting experiments, whereas reproducibility is a quality
attribute of the experimental results (@scientificmethodwithreproducibility).
As of 2016, some of the basic terms surrounding reproducibility were not
standardised. This diverse nomenclature has led to confusion, both conceptual
and operational, about what kind of confirmation is needed to trust a given
scientific result.
#grid(
columns: (1fr, 1fr),
[
#figure(
include "../../resources/typst/scientific-method.typ",
caption: [Scientific method],
) <scientificmethod>
],
[
#figure(
include "../../resources/typst/scientific-method-w-r13y.typ",
caption: [Scientific method with reproducibility],
) <scientificmethodwithreproducibility>
],
)
Reproducibility in research is a major factor in establishing the credibility
of research studies. It means obtaining consistent results using the same data
and protocol as the original study. For example, researchers confirm the
validity of a new discovery by repeating the experiments that produced the
original results. Moreover, other researchers in the field should be able to
repeat the same experiments and obtain results consistent with the original.
=== Reproducibility Levels <ch2-r13y-levels>
According to @ESSAWY2020104753, reproducibility is organised in four levels:
- *Repeatability*: Achieved upon obtaining consistent results using the same
input data, computational steps, methods, and code on the original
researcher's machine. This level is normally achieved in scientific papers.
- *Runnability*: Achieved when the author of the research can obtain consistent
results using the same input data, computational steps, methods, code and
conditions of analysis on a new machine.
- *Reproducibility*: Achieved when a new researcher, not an original author of
the analysis, is able to reproduce the analysis in their own computational
environment #cite(<NAP25303>, form: "normal").
- *Replicability*: Achieved by obtaining consistent results across studies aimed
at answering the same scientific question, each of which has obtained its own
data #cite(<NAP25303>, form: "normal"). Replicability also allows scientists
not involved in the original study to build from and expand on research once
they are first able to reproduce that research.
#figure(
include "../../resources/typst/essawy-table.typ",
caption: [The four levels of reproducibility and their requirements.],
kind: "table",
supplement: [Table],
) <table-levels-of-reproducibility>
#info-box[
It's crucial to understand that these levels are interconnected and not
isolated. Achieving the reproducibility level means that the criteria for both
the repeatability and runnability levels have already been met.
]
@table-levels-of-reproducibility delineates four levels of reproducibility, each
with specific prerequisites. It's important to acknowledge that these levels are
organised in ascending order of difficulty to attain, starting from the simplest
to the most challenging. Consequently, progressing through these levels
necessitates an incremental investment of resources, time, and effort.
=== Formalisation
#definition(term: "Experiment")[
An experiment $e$ conducted with a set of parameters and conditions $p$ where
$r(e,p)$ represents the experiment results, is #emph[reproducible] if and only
if:
#box[
$
& forall e in E, forall e' in R(e), forall p in "par"(e), quad r(
e, p
) eq r(e', p)
$
]
where
- $E$ is the set of all possible experiments
- $"par"$ is a function defined as $"par": E -> cal(P)(P)$ where $cal(P)(P)$ is the
powerset of $P$, the set of all possible parameters of all possible
experiments
- $R(e)$ is a function defined as $R: E -> cal(P)(E)$ where $cal(P)(E)$ is a powerset
of $E$ that gives for each experiment $e in E$, its set of independent
replications
] <def-reproducible-experiment>
== Reproducibility in Computer Science
#info-box(kind: "cite", footer: [@Barba2018])[
In their vision of reproducible research, readers should be able to rebuild
published results using the author's underlying programs and raw data.
Implicitly, they are advocating for open code and data.
]
As we shift our focus from general scientific domains to the realm of
#gls("CS"), the principles of reproducibility undergo a unique transformation.
In our digital era, where computations and algorithms form the backbone of
research, reproducibility challenges and solutions take on new dimensions. The
intricate interplay of software, hardware, and data in #gls("CS") demands a
re-examination and adaptation of traditional reproducibility concepts. This is
where the principles established in the broader scientific community intersect
with the specificities of computing, leading to a distinct and crucial discourse
on reproducibility in the field of #gls("CS").
The initial recorded use of the term #emph[reproducible research] in an academic
paper is believed to have occurred in 1992, in a presentation by
@Claerbout1992's team at Stanford, during the Society of Exploration Geophysics
conference. @Schwab2000, the same group of researchers updated their definition
of #emph[reproducibility in computationally oriented research].
In @Donoho2009, it is stated that reproducibility depends on open code and data.
The authors define reproducible computational research as that
#emph[
in which all details of computations, code and data, are made conveniently
available to others.
]
#definition(
term: "Repeatability (Same team, same experimental setup)",
name: "acm_repeatability",
)[
The measurement can be obtained with stated precision by the same team using
the same measurement procedure, the same measuring system, under the same
operating conditions, in the same location on multiple trials. For
computational experiments, this means that a researcher can reliably repeat
her own computation.
]
@Goodman2016 acknowledge the lack of standardisation in foundational terms like
reproducibility, replicability, reliability, robustness, and generalizability.
To address this, they suggest a new lexicon: #emph[Methods Reproducibility] to
align with the original concept of reproducibility as defined by @Claerbout1992
and @Donoho2009, #emph[Results Reproducibility] corresponding to what @Peng2009
refers to as replicability, and #emph[Inferential Reproducibility] to denote a
distinct category.
#definition(
term: "Reproducibility (Different team, different experimental setup)",
name: "acm_reproducibility",
)[
The measurement can be obtained with stated precision by a different team
using the same measurement procedure, the same measuring system, under the
same operating conditions, in the same or a different location on multiple
trials. For computational experiments, this means that an independent group
can obtain the same result using the author's own artefacts.
]
The term #emph[reproducibility] in the context of #gls("CS") has been refined
and explored in many subsequent works, and identifying a single #emph[first]
definition can be challenging due to the evolution of the concept over time.
According to @Barba2018, who conducted a detailed article on the terminology
history, the most appropriate terminology (@acm_repeatability,
@acm_reproducibility, @acm_replicability) to describe reproducibility in the
context of #gls("CS") is the definitions derived from
@acm_artifact_review_badging[Artifact Review and Badging].
#definition(
term: "Replicability (Different team, same experimental setup)",
name: "acm_replicability",
)[
The measurement can be obtained with stated precision by a different team, a
different measuring system, in a different location on multiple trials. For
computational experiments, this means that an independent group can obtain the
same result using artefacts which they develop completely independently.
]
In the context of this document, @reproducibility is the definition of
#emph[reproducibility] that we will use when referring to the concept of
reproducibility in #gls("CS").
#definition(term: [Reproducibility], name: "reproducibility")[
Reproducibility is the ability to consistently obtain identical results across
multiple runs of a computer task when using the same methods and data,
regardless of method, space and time. Note that this does not necessarily
imply that the outputs are correct or the desired outputs.
]
#info-box(kind: "important")[
#emph[Space] and #emph[Time] are terms borrowed from physics. In the context
of reproducibility in #gls("SE"), space refers to different systems, while
time refers to different moments in time
#cite(<malka-hal-04430009>,form:"normal"). (more about that in
@def-deterministic-build).
]
=== Scope
In this master thesis, the exploration of reproducibility will focus on a
specific aspect: the reproducibility of building source code, so that the
resulting application behaves consistently. This area is paramount in the field
of #gls("SE"), where the ability to consistently recreate identical software
artefacts from a given source code under varying conditions and environments
stands as a central concern. The intent is not to undermine the importance of
the other facets of reproducibility, but rather to concentrate the efforts and
analysis on the reproducibility of source code building and compilation.
#info-box[
We acknowledge that languages like JavaScript, PHP, Python are not compiled
but merely interpreted by their respective interpreter. Often, these scripting
languages require dependencies provided by their respective package manager as
well. Ensuring the availability of these dependencies is an integral part of
the software build process and, to some extent, corresponds to the compilation
in non-compiled languages. Consequently, in the context of this thesis,
"compiling source code" is applicable to both types of programming languages.
]
The concept of reproducibility can be divided into multiple phases:
#emph[reproducibility at build time]
(@def-reproducibility-build-time) and #emph[reproducibility at run time]
(@def-reproducibility-run-time). It is important to note that these phases are
not mutually exclusive and can be combined to achieve a higher level of
reproducibility.
#definition(
name: "def-reproducibility-build-time",
term: "Reproducibility at build time",
)[
Reproducibility at build time refers to the ability to consistently generate
the same executable or software artefact from a given source code across
different builds, in different environments, across space and time.
This aspect is crucial in ensuring that the software compilation process is
deterministic and immune to variances in development environments, compiler
versions, or build tools. It involves a meticulous standardisation and
documentation of the build environment and dependencies to guarantee that the
same executable is produced regardless of when or where the build occurs.
]
#definition(
name: "def-reproducibility-run-time",
term: "Reproducibility at run time",
)[
Reproducibility at run time addresses the consistency of software behaviour
and output when the software is executed in different environments or under
varying conditions. This type of reproducibility focuses on ensuring that the
software performs identically and produces the same results regardless of the
#gls("OS"), underlying hardware, or external dependencies it interacts
with during execution.
]
To illustrate these phases, the C source code in @montecarlo-pi.c implements the
Monte Carlo method to approximate the value of π. This is an example of
reproducibility at build time, but not at run time.
#figure(
{
sourcefile(
file: "montecarlo-pi.c",
lang: "c",
read("../../resources/sourcecode/montecarlo-pi.c"),
)
},
caption: [`montecarlo-pi.c`],
) <montecarlo-pi.c>
#figure(
{
shell(read("../../resources/sourcecode/montecarlo-pi-compilation.log"))
},
caption: [
Building the same source code multiple times always yields the same binary
executable
],
supplement: "Terminal session",
kind: "terminal",
)
The Monte Carlo algorithm is inherently stochastic, it uses random sampling or
probabilistic simulation as a core part of its computation. This randomness is
intrinsic to the algorithm's design and purpose.
The distinction between build time and run time reproducibility for the Monte
Carlo algorithm arises from its usage of a random source. While the algorithm
can be reliably built into a consistent binary (build time reproducibility), its
outputs can vary on different executions under the same conditions due to its
inherent randomness (lack of runtime reproducibility). This does not undermine
the validity of the algorithm but rather is a characteristic of its
probabilistic approach to problem-solving.
#figure(
shell(read("../../resources/sourcecode/montecarlo-pi.c.log")),
caption: [
Running the binary multiple times does not always yields the same result
],
supplement: "Terminal session",
kind: "terminal",
)
In practice, for certain applications, runtime reproducibility can be attained
by controlling the random number generator, specifically by setting a fixed seed
as an input parameter.
In the next example, the source code is not reproducible at build time and we
might erroneously think that the program is reproducible at run time.
We observed that compiling the same source code multiple times results in
different binaries. This variation occurs because the source code includes the
macros `__TIME__` and `__DATE__`, which are substituted with the current time
and date during compilation. As a result, we cannot achieve reproducibility at
build time.
#figure(
{
sourcefile(
file: "datetime.c",
lang: "c",
read("../../resources/sourcecode/datetime.c"),
)
},
caption: [Sourcecode of `datetime.c`, a C program with macros],
) <datetime.c>
Upon executing the produced binary, the outcome appears consistent. This might
suggest that the binary is reproducible at run time, however, this assumption is
incorrect. Consider a scenario where a different user compiles the same source
code. In such a case, runtime reproducibility between the original and another
user is not assured.
#figure(
shell(read("../../resources/sourcecode/datetime.c.log")),
caption: [
An example of a program that is reproducible neither at build time nor at
run time.
],
supplement: "Terminal session",
kind: "terminal",
)
=== Quantifying Reproducibility
Quantifying reproducibility is traditionally viewed as a binary state: it is
either reproducible or not. However, this perspective oversimplifies the
complexity of software environments. In reality, reproducibility exists on a
spectrum, where the focus shifts from a mere #emph[yes-or-no] assessment to
evaluating the extent and conditions under which a computation is reproducible.
#figure(
include "../../resources/typst/reproducibility-rule.typ",
caption: "Reproducibility states",
) <reproducibility-rule>
We will explore this concept with Docker images as a primary example. Docker, a
popular containerization platform, uses Dockerfiles (@dockerfile-example).
Basically, a `Dockerfile` is a script with a set of instructions to build
images. These images are then used to run software in a consistent environment.
However, many images on the Docker Hub #cite(<dockerhub>, form:"normal") present
challenges to reproducibility. The reasons vary: some Dockerfiles are not
publicly available, and, more importantly, most of them introduce significant
variability into their build processes, making exact replication of the images
practically impossible. We will consider these challenges in more detail in
@chapter4.
#figure(
sourcefile(
file: "nodejs.dockerfile",
lang: "dockerfile",
read("../../resources/sourcecode/nodejs.dockerfile"),
),
caption: [An example of `Dockerfile`],
) <dockerfile-example>
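One common mitigation for this build-time variability is to pin the image's inputs. The sketch below (the digest shown is a placeholder, not a real image digest) references the base image by its immutable content digest rather than a mutable tag, and installs dependencies exactly as recorded in a lockfile; this removes two common sources of variability, although it does not by itself make the build fully reproducible:

```dockerfile
# Pin the base image by content digest instead of a mutable tag
# (placeholder digest, shown for illustration only).
FROM node@sha256:0000000000000000000000000000000000000000000000000000000000000000

WORKDIR /app

# Install dependencies exactly as recorded in the lockfile.
COPY package.json package-lock.json ./
RUN npm ci

COPY . .
CMD ["node", "index.js"]
```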
Determining a precise value for a Docker image's temporal reproducibility is
complex. Thus, for the purposes of this thesis, we simplify the classification
into three broad categories as outlined in @reproducibility-rule:
#emph[Not reproducible], #emph[Partially reproducible], #emph[Reproducible].
While more nuanced classifications are possible, this simplified tripartite
model provides a sufficient basis in this thesis.
Despite these challenges, Docker images are widely used. They remain static
between updates, creating a window during which their environment is consistent.
This period (the interval between two successive updates) can serve as an
indirect measure of reproducibility. Essentially, the longer the time between
updates, the more stable and, by extension, reproducible the image is
considered. In considering the temporal dimension of reproducibility, it is
essential to recognise that software artefacts are not unchanging entities; they
offer a predictable environment for a finite period. Imagine a scale where 0
represents non-reproducibility and 1 indicates full reproducibility. On this
scale, the temporal reproducibility of Docker images would be positioned between
0 and 1, acknowledging the nuanced nature of this concept.
However, this reproducibility is inherently dynamic due to the nature of
software evolution. Each update to a Docker image might introduce changes
affecting the software's behaviour, thereby impacting its reproducibility over
time. Understanding this aspect of temporal reproducibility is crucial for
managing software environments in a continuously advancing technological
landscape.
The Docker use case can be classified under the #emph[Partially reproducible]
class. This is because, while Docker images ensure reproducibility at run time
by providing a consistent execution environment, they often fall short of
reproducibility at build time due to the variability inherent in their
Dockerfiles. This dichotomy highlights the spectrum of reproducibility, where
Docker images occupy an intermediate position. They are neither fully
reproducible (due to build-time variability) nor completely irreproducible
(thanks to their runtime stability). This categorisation not only helps in
understanding the reproducibility status of Docker images but also underscores
the need for a nuanced approach to classifying software reproducibility,
acknowledging the various shades that exist between the #emph[Not reproducible]
and #emph[Reproducible] boundaries.
=== Open Source
#info-box(kind: "cite", footer: [@Donoho2009])[
If everyone on a research team knows that everything they do is going to
someday be published for reproducibility, they'll behave differently from day
one. Striving for reproducibility imposes a discipline that leads to better
work.
]
Open Source refers to a type of software whose source code is freely available
for anyone to view, modify, and distribute. This model encourages collaborative
development and sharing, allowing users and developers to improve the software
and adapt it to their needs.
Open-source software development, known for managing complex projects with high
quality, significantly enhances reproducibility by fostering professionalism and
transparency. Making open-source software reproducible offers numerous
advantages: it streamlines the onboarding of new contributors, improves testing
and feature implementation, ensures transparent build processes, facilitates
security audits, and quickens response times in the dependency supply chain in
case of issues.
Reproducibility, intrinsically linked with Open Source, is fundamentally an
activity that builds trust (@barba2022definingroleopensource), making it a
leading method for ensuring software can be reliably built and verified by a
diverse global community.
A direct correspondence can be established with the taxonomy of
@ESSAWY2020104753 when considering the process of building software from source
code. This process can be analogised to a scientific experiment. In this
context, the act of an individual building the source code of another developer
on their own machine mirrors the concept of #emph["Reproducibility"] as
described in @table-levels-of-reproducibility. This signifies that the software,
when compiled by different users from its source code, consistently results in
the same executable or software artefact. The transparency inherent in
open-source software is a foundational advantage. Since the source code is
publicly available, it allows researchers to scrutinise how the software
operates, understand how results are generated, and validate the reliability and
accuracy of the software. This level of openness is crucial for replicability
and trust in scientific research.
Furthermore, open-source software promotes a culture of collaboration and
community involvement. Active communities that grow around open-source projects
contribute to the software's continual improvement. This community-driven
development leads to the identification and resolution of bugs, thereby
enhancing the software's reliability and, consequently, the reproducibility
outcomes that depend on it.
A key feature of open-source software is the permissive nature of its licencing,
which, depending on the specific licence, facilitates the reuse and modification
of software without legal or technical barriers. This flexibility is vital for
verifying and replicating studies, as researchers can adapt the software for
their specific needs without restrictions, though some licences may impose
certain conditions. Additionally, open-source development tools provide
excellent record-keeping capabilities, like version control systems #eg[
`git`, Mercurial, Pijul], enabling researchers to track changes and understand
the context of each update. This aspect is essential for reproducing and
validating research findings.
Lastly, the open source approach aligns well with the scientific values of
openness and sharing, promoting a culture that values transparency and
reproducibility in scientific inquiry. Moreover, the community-driven nature of
open-source software reduces the risk of obsolescence, ensuring that research
tools remain accessible and up-to-date for future replication efforts.
Open-source software embodies a framework that is not only conducive to the
scientific pursuit of knowledge but also reinforces the integrity and
sustainability of #gls("SE") through its emphasis on transparency,
collaboration, and adaptability. Therefore, open source is key for facilitating
reproducibility as highlighted in @hinsenKonradFramework2020's framework on
reproducible scientific computations. This framework provides a structured
approach to understanding and addressing reproducibility. He proposes four
essential possibilities, which can only be achieved with open-source software:
- Inspectability: the possibility to inspect all input data and source code
- Executability: the possibility to run the code on a suitable computer to
verify the results
- Explorability: the possibility to explore the behaviour of the code by
inspecting intermediate results, making small modifications, or using code
analysis tools
- Verifiability: the possibility to verify that the published executable
versions correspond to the available source code
Open-source development, by its nature of allowing anyone to build, verify and
use software, stands out as an effective, if not the best, approach to
bolstering both confidence and safety in software systems. This widespread
participation and verification process inherent in open-source development
contributes significantly to the reliability and security of the software.
=== Terminology
Establishing formal definitions and terminology is crucial for aligning
researchers, practitioners, and readers on the same wavelength. By articulating
a clear and precise mathematical representation, we facilitate a universal
understanding of what it means for a computation to be reproducible.
The following section is dedicated to constructing such a formal definitions, a
balance between the rigor required by the academic community and the clarity
needed for widespread adoption.
==== Computation
A computation is a process that involves the execution of algorithms or a series
of operations to obtain a result, usually performed by a computer. It can be
complex, involving multiple steps, conditions, and data manipulations
(@inputs-computation-outputs). The formal definition of computation takes into
account the computational environment variable, reflecting the specific context
where the computation occurs.
#figure(
include "../../resources/typst/inputs-computation-outputs.typ",
caption: "Inputs, computation, outputs",
) <inputs-computation-outputs>
In the context of #gls("CS"), defining a computation involves considering the
broader scope of activities and processes that a computer performs, extending
beyond the traditional mathematical abstraction of a function. A computation can
be understood as a sequence of steps or operations performed by a computer to
transform input data into output data. This process can involve various types of
functions, algorithms and data manipulations. Essentially, a computation can be
depicted as an abstraction involving one or multiple functions.
Examples of computations could be: a program build, a compilation, a program
execution, a data analysis, a data transformation.
- When source code is compiled, the input is the source code and the output is
the binary executable.
- When a program is executed, the input is the binary and the output is the
result of the program.
- When making a data analysis, the input is the raw data and the output is the
analysis.
- When evaluating a function, method, or procedure in any programming language,
the input consists of the function itself along with its parameters. The
output is the result of the function applied to these parameters, including
any potential side effects #eg[changes in the program's state].
#definition(term: "Computation", name: "def-computation")[
A computation $c$ is a set of one or more functions $f:I times E → R$.
where
- $I$ is the set of all possible arguments or inputs the computation needs
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
- $R$ is the set of all possible outputs
]
It is crucial to distinguish between functions as understood in Mathematics and
in #gls("CS"), both of which are integral to their respective realms. In
Mathematics, a function is a deterministic
construct defining a specific relationship between sets of inputs and outputs,
mapping each input to exactly one output. It acts as a fundamental building
block within computations to describe how values are transformed. In #gls("CS"),
functions are similar but can be classified as pure (@section-pure-function) or
impure (@section-impure-function), with pure functions having no side effects
and impure functions potentially affecting the state or relying on external
variables. While a function provides the rules for individual transformations
within a computation, the computation itself represents the broader and more
dynamic process of achieving a result, often involving the execution of complex
algorithms, data handling, and the application of multiple functions and
operations.
#grid(
columns: (1fr, 1fr),
gutter: 1em,
[
#figure(
{
set text(font: "Virgil 3 YOFF")
cetz.canvas({
import cetz.draw: *
rect((-3, 3), (3, -3))
circle((0, 0), radius: 2)
content((0, -2.25), "Functions")
content((0, -3.25), "Mathematical environment")
})
},
caption: [Functions in the context of Mathematics],
) <functions-in-mathematics>
],
[
#figure(
{
set text(font: "Virgil 3 YOFF")
cetz.canvas({
import cetz.draw: *
rect((-3, 3), (3, -3))
circle((0, 0), radius: 2)
line((0, 2), (0, -2))
content((1, 0), "pure")
content((-1, 0), "impure")
content((0, -2.25), "Functions")
content((0, -3.25), "Computational environment")
})
},
caption: [Functions in the context of #gls("CS")],
) <functions-in-cs>
],
)
In #gls("CS") (@functions-in-cs), a function necessitates an environment in
which it will be evaluated, effectively making this environment an extra input
parameter in its own right. This computational environment, which encompasses
the hardware #eg[filesystem, memory, #gls("CPU", long: false)], software
#eg[#gls("OS", long: false)] and temporal context #eg[the current date and
time], may influence the function's behaviour and output. Consequently,
functions in #gls("CS") are inherently designed to interact with and adapt to
their environment, thereby making them dynamic and versatile but also
potentially non-deterministic.
Conversely, in Mathematics (@functions-in-mathematics), a function is evaluated
independently of any environment, or with the environment variable effectively
set to null, ensuring its behaviour is entirely predictable and self-contained.
This
distinction highlights the adaptability and complexity of functions in
computational contexts, compared to their more stable and defined mathematical
equivalents.
==== Inputs and Outputs
An input is the data provided to a computation. The output is the result of a
computation or any other changes made to the environment the computation is
being evaluated in.
#definition(term: "inputs and outputs", name: "def-inputs-outputs")[
The function $f: I -> R$ is a function mapping the domain input set $I$ on the
codomain output set $R$.
]
Inputs and outputs can vary widely, ranging from user interactions and network
connections to files and directories. The nature of these inputs and outputs
significantly impacts the reproducibility of computational processes.
Consider user interactions, such as mouse or eye movements. These are
inherently challenging to replicate precisely due to their dynamic and
unpredictable nature. For instance, reproducing the exact trajectory of a mouse
movement is virtually impossible due to the minute variations in human actions.
However, a more reproducible approach would be to capture these interactions in
a structured format, as in `eyeScrollR` (@Larigaldie2024). Recording the
coordinates of mouse movements over time in a file creates a detailed log that
can be replayed. This method transforms a non-reproducible user
interaction into a reproducible set of data.
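To make this idea concrete, consider the following illustrative Python sketch.
The function names and sampling format are hypothetical and not part of
`eyeScrollR`; the point is only that recording timestamped coordinates turns an
ephemeral interaction into data that can be replayed deterministically.

```python
import csv
import io

def record_movements(samples):
    """Serialise (time, x, y) samples to CSV text so they can be replayed."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["t", "x", "y"])
    for t, x, y in samples:
        writer.writerow([t, x, y])
    return buf.getvalue()

def replay_movements(csv_text):
    """Read the log back: the replay is a deterministic function of the file."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip header row
    return [(float(t), int(x), int(y)) for t, x, y in reader]

samples = [(0.00, 10, 20), (0.05, 12, 21), (0.10, 15, 25)]
log = record_movements(samples)
assert replay_movements(log) == samples  # the interaction is now reproducible
```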
For a computation to be considered reproducible, its inputs and outputs must be
storable and retrievable. Typically, the most feasible types for such storage
are files or directories, primarily due to the ubiquity and accessibility of
file systems in computing environments. Files and directories offer a stable and
widely accessible medium to store and retrieve data.
In this thesis, the focus will be on scenarios where inputs and outputs are in
the form of files, unless specified otherwise. This assumption aligns with the
common practices in computational processes and aids in maintaining the
reproducibility of the computations discussed.
In the context of software compilation, an output is correct when it faithfully
reflects the state of its transitive inputs. That is, the output represents
all direct and indirect dependencies used in the build process.
"Transitive inputs" refer to not only the direct inputs #eg[source code] but
also to the inputs of those inputs #eg[libraries, frameworks, compilers, data
resources].
From the point of view of the software build process as shown in
@inputs-outputs-part1, the inputs are all the source code files, configuration
files, and dependencies required to build the software.
#figure(
include "../../resources/typst/inputs-and-outputs-part1.typ",
caption: "Inputs, computation, outputs",
) <inputs-outputs-part1>
In @inputs-outputs-part2, the process has been refined from the perspective of
the user running the software, where the input is now composed of the program
and its parameters. This distinction is crucial as it highlights the dynamic
nature of computational processes. The user's interaction with the software,
such as providing parameters or executing commands, is integral to the inputs
and can significantly influence the output.
#figure(
include "../../resources/typst/inputs-and-outputs-part2.typ",
caption: "The input is now composed of the program and its parameters",
) <inputs-outputs-part2>
In @inputs-outputs-part3, the environment where the computation is evaluated is
added to the input. This environment includes the hardware, software, space, and
time in which the computation is executed. This addition further refines the
definition of inputs and outputs, emphasising the dynamic and context-dependent
nature of computational processes.
#figure(
include "../../resources/typst/inputs-and-outputs-part3.typ",
caption: [
The input is now composed of the program and its parameters and the
environment where it is going to be evaluated.
],
) <inputs-outputs-part3>
We could break down the environment further. However, as we delve deeper into
segmenting the components essential for a computation, the process becomes
increasingly subjective (@hinsenKonrad2020guix).
Reproducibility implies comparing outputs to determine whether they are
equivalent.
According to @Acm2018[p.5], there are multiple equivalence classes:
#figure(
include "../../resources/typst/equivalence-classes-of-reproducibility.typ",
caption: [Classes of reproducibility],
kind: "table",
supplement: "Table",
)
- Two natural phenomena could be observed by human experts and considered as the
same.
- Two results could be statistically equivalent, in that the numeric values are
different, but they both convey the same statistical interpretation.
- Two results could be the same data in the sense that they encode the same
numeric contents, but differ in some irrelevant detail. For example, an output
file might incidentally contain the system time and the name of the user who
ran the program.
- Two results could be equivalent, in every way, bit-per-bit. This is the
strictest form of equivalence.
In the context of this thesis, we will assume that two results are equivalent if
they are the same, bit-per-bit.
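As an illustrative Python sketch of this strictest equivalence class (the
outputs shown are hypothetical), two results are bit-per-bit equivalent exactly
when their bytes, and hence their cryptographic digests, match:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of raw bytes; identical bytes yield identical digests."""
    return hashlib.sha256(data).hexdigest()

run_1 = b"result: 42\n"
run_2 = b"result: 42\n"
# Same numeric content, but an irrelevant detail (user and time) was appended.
run_3 = b"result: 42\n# built by alice at 2024-01-01\n"

assert digest(run_1) == digest(run_2)  # bit-per-bit equivalent
assert digest(run_1) != digest(run_3)  # same data, not bit-per-bit equivalent
```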
#info-box(kind: "important")[
It is important to clarify that in the context of reproducibility, the time
taken to compute the output is not typically considered. This means that two
results can be deemed equivalent or reproducible even if the computational
time to achieve these results varies. For instance, consider a situation where
a piece of code is refactored: if the output data remains unchanged, the
process is considered reproducible from a data consistency perspective.
Nonetheless, even if the refactored code requires significantly more time and
resources to execute, it is still classified as reproducible as long as the
output remains consistent with the original.
This distinction underscores that reproducibility focuses on the consistency
of the output data rather than the performance or efficiency of the
computational process. This aspect is particularly relevant in environments
where hardware or system efficiencies may differ, yet the integrity and
equivalence of the output data remain the primary concern. While this might
provoke debate regarding resource efficiency and computational time, for the
purposes of this thesis, it is assumed that the temporal and resource aspects
of computing the output are secondary to the consistency of the results
themselves.
]
==== Evaluation of a Computation
The evaluation of a computation is the process of determining the resulting
output of a function for a given set of arguments. It involves applying, in a
specific computational environment, the function's defined operations to the
inputs to produce an output. This does not necessarily imply that the outputs
are correct. Note that #emph[correct] here means that the evaluation has
completed successfully without errors.
#figure(
include "../../resources/typst/inputs-and-outputs-part4.typ",
caption: [
The evaluation of inputs into outputs where the input is composed of the
program and its parameters and the environment where it is going to be
evaluated.
],
) <inputs-outputs-part4>
#definition(term: "Evaluation", name: "def-evaluation")[
$"eval": (F, I, E) -> R$ is a function that evaluates a function $f$ and its
parameters $i$ in a specific computational environment $e$ to produce a
result, an output.
$forall f in F, forall i in I, forall e in E, quad "eval"(f,i,e) eq f(i, e)$
where
- $F$ is the set of all possible computations
- $I$ is the set of all possible arguments the computation needs
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
- $R$ is the set of all possible outputs
]
In the realm of mathematics, a function is typically isolated, operating solely
on its provided arguments, with no external environmental factors influencing
its output. By contrast, in #gls("CS"), it is quite common for a computation to
interact with, and be influenced by, its surrounding environment during
evaluation, which necessitates @def-evaluation.
==== Pure Function <section-pure-function>
As seen in @functions-in-mathematics, the concept of a #emph[pure function] as
defined in @def-pure-function does not explicitly exist in Mathematics. This is
because functions are always inherently considered to be deterministic and
side-effect free; functions in Mathematics are #emph[by default] pure. Any
mathematical function evaluates under the assumption that given the same inputs,
the output will always be the same, and the evaluation of the function does not
alter any external state or variable.
#definition(term: "Pure function", name: "def-pure-function")[
A pure function can be defined as a function where the same input always
yields the same output.
Let $f: I times E → R$ be a function. Then $f$ is *pure* if and only if:
$forall i in I, forall e_1, e_2 in E, quad "eval"(f,i,e_1) eq "eval"(f,i,e_2)$
where
- $I$ is the set of all possible inputs arguments
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
A bridge can be drawn between the mathematical definition of a function
$f: I -> R$ and this definition by considering the environment variable $E$ as
an empty set, making the function independent of any external state or
variable. This effectively reduces the definition of a pure function in
#gls("CS") to the mathematical definition of a function.
]
However, in #gls("CS"), it makes sense to define pure and impure functions
because a function might behave differently depending on the environment in
which it is executed. The purity of a function in the context of #gls("CS") is
therefore vital for understanding and managing side effects and state in
software; it is a distinction that does not apply in the static, deterministic
realm of pure mathematics.
This distinction highlights how the same term can have different implications in
different disciplines, reflecting the unique nature of challenges and concepts
in programming versus pure mathematics. However, we can still try to define such
a function in a theoretical #gls("CS") context.
A pure function is a specific type of function in programming characterised by
the following properties:
- Deterministic: for a given set of inputs, a pure function always returns the
same output. This means the function's output is solely determined by its input
values and does not rely on any external state or data.
- No side effects: A pure function does not cause any observable side effects in
the system. This means it does not modify any external state, global variables,
or data outside its scope. It also does not produce outputs other than its
return value, such as printing to the console or altering the state of the
program beyond the scope of the function.
#info-box[
A checksum (@checksum) is an example of a pure function: it consistently
returns the same output for a given input.
]
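The following minimal Python sketch (an illustration, with hypothetical names)
shows a pure function: its output is determined solely by its arguments, and no
external state is read or modified.

```python
def scale(values, factor):
    """Pure: the output depends only on the inputs; no external state is touched."""
    return [v * factor for v in values]

# The same input always yields the same output, regardless of when or where
# the function is evaluated.
assert scale([1, 2, 3], 2) == [2, 4, 6]
assert scale([1, 2, 3], 2) == scale([1, 2, 3], 2)
```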
==== Impure Function <section-impure-function>
An impure function is the opposite of the above definition of a pure function.
#definition(term: "Impure function", name: "def-impure-function")[
An impure function is a function that does not always yield the same output
for a given input. This can be formally expressed as:
Let $f: I times E → R$ be a function. Then $f$ is *impure* if and only if:
$forall i in I, exists e_1, e_2 in E, quad "eval"(f,i,e_1) eq.not "eval"(f,i,e_2)$
where
- $I$ is the set of all possible inputs arguments
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
]
It is a specific type of function in programming characterised by the following
properties:
- Non-deterministic: the function can yield different outputs for the same set
of input values at different times, depending on the state of the system or
environment in which it is executed.
- Side effects: the function performs actions that modify some state outside its
local environment or has observable interactions with the outside world. This
can include altering global variables, modifying input arguments, I/O
operations, or calling other impure functions.
As seen in @functions-in-cs, this concept only exists in programming, as it is a
direct consequence of the mutable nature of the state in programming. In pure
mathematics, functions are conceptualised as mappings from elements of one set
(the domain) to elements of another set (the codomain), without any side effects
or external dependencies. This distinction highlights the difference between the
theoretical framework of mathematics and the practical aspects of programming,
where functions often interact with a mutable state or environment.
Given this, we will allow ourselves to define such a function within the
theoretical context of #gls("CS"). Doing so requires an additional parameter
capturing the environment, including the current time, which #emph[must] be
passed to the function. This parameter corresponds to the parameter $E$ in
@def-impure-function.
#info-box[
An example of an impure function is one that returns the current date and
time, as its output depends on external state and can vary with each call.
]
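The following minimal Python sketch (with hypothetical names) illustrates two
common sources of impurity: reading the system clock, which is part of the
environment $E$, and mutating state outside the function's own scope.

```python
import datetime

def timestamped_greeting(name):
    """Impure: the result depends on the system clock, i.e. the environment."""
    now = datetime.datetime.now()
    return f"Hello {name}, it is {now.isoformat()}"

counter = 0

def next_id():
    """Impure: mutates state outside its own scope (a side effect)."""
    global counter
    counter += 1
    return counter

# The same call yields different outputs on successive evaluations.
assert next_id() != next_id()
```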
==== Checksum <checksum>
Although the concept of a checksum is not essential for understanding the
definitions above, it is crucial to define it due to its recurring presence in
the following chapters.
A checksum is the result of a computation. It is a one-way pure function which
takes an input of an arbitrary size and returns a string of a fixed size,
depending on the checksum algorithm in use. For example when using a `git`, each
commit ID is a checksum of the current commit's content and the previous
commit's ID.
#figure(
include "../../resources/typst/figure-checksum.typ",
caption: "Inputs, checksum, string output",
) <inputs-checksum-string>
A one-way function is easy to compute but is practically impossible to reverse.
This is mostly due to the fixed size output, the number of possible inputs
(#emph[domain]) exceeds the number of possible outputs (#emph[codomain]). The
time complexity of such a function is usually linear, which means that the time
it takes to compute the checksum is proportional to the size of the input,
therefore $cal(O)(n)$.
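As a brief illustration using Python's standard `hashlib` library, a checksum
maps input of arbitrary size to a fixed-size hexadecimal string:

```python
import hashlib

def checksum(data: bytes, algorithm: str = "sha256") -> str:
    """Fixed-size hash of arbitrary-size input; computation time is O(n)."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

# Same input, same hash (a pure function); a one-character change in the input
# yields a radically different hash.
assert checksum(b"hello") == checksum(b"hello")
assert checksum(b"hello") != checksum(b"hellp")
assert len(checksum(b"hello")) == 64        # SHA-256: 256 bits = 64 hex chars
assert len(checksum(b"", "sha512")) == 128  # SHA-512: 512 bits = 128 hex chars
```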
A checksum is a function that returns a string called a #emph[hash], which is
supposedly unique for a given input. Checksum algorithms are designed to
produce a unique hash for each unique input. However, the term "supposedly
unique" is used because, in theory, it is possible for two different inputs to
produce the same hash, an occurrence known as a #emph[collision]. The ability to
find collisions undermines the security of the algorithm. There are different
types of algorithms to calculate a checksum
#eg[#gls("MD5", long: false), #gls("SHA1", long: false),
#gls("SHA2", long: false)]. Older algorithms like #gls("MD5", long: false) have
known vulnerabilities that allow collision attacks while more modern algorithms
like SHA-256 (#gls("SHA2", long: false)) are currently considered
computationally infeasible to break.
// TODO: Add bibtex ref in glossary for checksum algorithm
While the mathematical theory allows for the possibility of collisions in
checksum hashes, the reality of their application in modern checksum algorithms
is substantially different. The sophisticated design of these algorithms
significantly reduces the likelihood of such occurrences. This ensures a high
level of trust in their effectiveness for generating distinct and reliable
representations of data, despite the theoretical potential for identical hashes
of different inputs.
#info-box(kind: "info")[
Choosing an appropriate checksum algorithm is paramount due to the rapid
evolution of computational power as described by Moore's Law
#cite(<4785860>,form:"normal"), which leads to previously secure algorithms
becoming vulnerable as computing capabilities expand.
For instance, #gls("MD5") checksums, once deemed secure for storing passwords,
are now easily compromised through brute force attacks. This underlines the
need for an adaptable approach to checksums, continually updating them to stay
ahead of advancements in computational attack strategies. According to
@courtes_2022_6581453[Notes on SHA-1, p.16], the SHA-1 algorithm family is now
approaching end of life.
To ensure the highest level of security and adaptability to future
computational capabilities, it is advisable to use SHA-2 algorithm family such
as SHA-384 or SHA-512. These algorithms provide a longer bit length, offering
enhanced security and a lower risk of collisions, making them well-suited for
securing sensitive data in the face of evolving technological threats.
]
==== Reproducibility
The concept of reproducibility can be applied in many situations; this thesis
will concentrate on a particular application area, thus narrowing its scope. In
this thesis, a computation will typically refer to the process of compiling
source code into a binary file, except in cases where it is explicitly defined
differently.
Reproducibility is a property of a computation. It is the ability to
consistently obtain identical results across multiple runs of a computation.
#definition(term: "Reproducibility", name: "def-reproducibility")[
Reproducibility is a property of a computation satisfying the following
condition:
#box[
$
& forall c in C, forall i in I, forall e_1, e_2 in E, quad "eval"(
c, i, e_1
) eq "eval"(c, i, e_2)
$
]
where
- $C$ is the set of all possible computations
- $I$ is the set of all possible inputs arguments
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
Once that condition is met, the computation is considered to be reproducible.
]
The sets $I$ and $E$, respectively representing all possible inputs and the
hardware and software environments including the date and time, are also
considered abstractions. In reality, these sets are complex and intricate
entities that could potentially be composed of many interdependent components.
However, for the purpose of this definition, they are treated as atomic.
We could consider expanding the list of arguments to achieve greater
specificity, delving deeper into the intricate details that influence
reproducibility. However, the objective here is to provide the reader with an
initial understanding of reproducibility through a formal definition. This
approach is about finding a balance between comprehensive detail and conceptual
clarity, thereby offering a foundational glimpse into the formalism that
underpins reproducible computational processes without becoming mired in
excessive complexity.
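In practice, the condition in @def-reproducibility can be checked empirically by
evaluating the same computation several times and comparing the outputs
bit-per-bit. The following Python sketch is a simplified stand-in: the `build`
function is hypothetical and replaces a real compiler invocation.

```python
import hashlib

def build(source: str) -> bytes:
    """Stand-in for a deterministic build step: same source, same artefact."""
    return ("compiled:" + source).encode("utf-8")

def is_reproducible(computation, inputs, runs: int = 2) -> bool:
    """Evaluate the computation several times and require identical digests."""
    digests = {
        hashlib.sha256(computation(inputs)).hexdigest() for _ in range(runs)
    }
    return len(digests) == 1  # one unique digest means bit-per-bit equal outputs

assert is_reproducible(build, "int main(){return 0;}")
```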
The definition of reproducibility (@def-reproducibility) closely matches the
definition of pure function (@def-pure-function) and, inherently, mathematical
functions. However, as seen in @table-function-computation, understanding the
nuances between theoretical functions and practical computations is essential.
Theoretically, mathematical functions are conceptualised as $I -> R$, reflecting
the abstract nature of mathematics where the function's result $R$ is purely
dependent on its inputs $I$ and external factors are considered non-existent. In
the practical world, this theoretical construct is transposed through an
evaluation function (@def-evaluation). For mathematical functions, this
environment parameter is known and intentionally left empty, symbolising the
deliberate exclusion of external influences and striving to maintain the purity
of the theoretical definition. This is in contrast to practical computations in
programming, where the environment parameter $E$ is often filled with various
real-world parameters and factors, reflecting the nature of computations where
outcomes are influenced by the environment variable.
#figure(
include "../../resources/typst/functions-vs-computations.typ",
caption: [Nuances between functions and computations],
kind: "table",
supplement: "Table",
) <table-function-computation>
This fundamental distinction underscores the challenges of achieving
reproducibility and predictability in the practical realm, necessitating
robustness and adaptability to manage the variability and complexity of
real-world conditions. Together, these definitions provide a comprehensive
paradigm for understanding the interplay between the idealised theoretical
constructs and their practical applications, emphasising the importance of
environmental control in ensuring the computations' reproducibility. The concept
of reproducibility, a computational property, underscores the ability to
replicate results across different environments within $E$, serving as a
cornerstone for verifying and validating scientific work.
The process of controlling the computational environment $E$ underscores a
fundamental challenge in #gls("SE"): achieving reproducibility through
environment standardisation. The environment often encompasses factors such as
hardware and software configurations (#gls("CPU"), #gls("OS"), library
versions, and runtime conditions), which can significantly impact a function's
behaviour and output. The Monte Carlo simulation algorithm (@montecarlo-pi.c),
exemplifies this challenge: it may be reproducible at build time but can exhibit
variance at run time due to environmental factors.
This singularity highlights the essence of reproducibility: the need to
meticulously control or normalise the environment in which computations occur.
By ensuring that, ideally, the environment remains constant, we can more
closely approximate the behaviour of pure computations in practical software
systems. This approach aims to simplify the computational model and serves as a
strategic endeavour to minimise the unpredictability introduced by varying
environments.
In conclusion, while the formalism of computations' purity and reproducibility
provides the basis of a theoretical framework, the practical application in
#gls("SE") involves the intricate task of environment management. It is through
this lens that we understand reproducibility not just as a characteristic of the
function itself, but as a holistic property of the entire computational
ecosystem, encompassing both the function and its operating environment. This
broader view acknowledges that while pure functions offer a paradigm for
reproducibility, achieving this in complex, real-world systems often
necessitates rigorous control and standardisation of the computational
environment, which is virtually impossible to deliver in full.
=== Software Security
The concept of reproducibility is pivotal in software security for several
reasons. Reproducibility ensures that software can be consistently recreated or
regenerated from its source code, guaranteeing that the software's behaviour
remains unchanged across different builds. This consistency is crucial for
verifying the security of software systems. If a software build is reproducible,
security experts can confidently assess that the build has not been tampered
with or altered to include malicious code. This becomes increasingly important
in an era where cybersecurity threats are both sophisticated and prevalent.
In the context of software security, reproducibility also aids in the
traceability and verification of software components. It allows for the thorough
examination and validation of all parts of the software, ensuring they are
exactly as intended and free from vulnerabilities or unauthorised alterations.
This traceability is particularly relevant in light of the executive order
14028, #emph[Improving the Nation's Cybersecurity], issued by
@Executive-Order-14028. This document underscores the importance of enhancing
cybersecurity across federal agencies and emphasises the integrity of the
software supply chain.
The European counterpart, the #gls("CRA") issued by the European Union,
reinforces these efforts by setting cybersecurity requirements for software. It
is the first
European regulation imposing minimum cybersecurity requirements on all
interconnected products put on the European market
#footnote[https://ccb.belgium.be/en/cyber-resilience-act-cra], to make them more
secure. This act aims to reduce vulnerabilities in software products, enhancing
security throughout their lifecycle. Software must come with clear information
on their features and instructions for secure installation, operation, and
maintenance. This strategy reflects a commitment to producing and using
reproducible software.
==== Software Bill Of Materials
The #gls("SBOM") is an essential element, acting as a detailed inventory of all
the components required to build and operate a piece of software, including all
applied patches and licensing information in a structured and well-known format.
There are multiple existing formats and standards, the most common ones are:
- #gls("SPDX", long: true): A comprehensive standard maintained by the Linux
Foundation, designed to facilitate license compliance, security, and broader
software component analysis through a detailed documentation approach,
supporting multiple formats like RDF, JSON, and YAML. It caters to a wide
range of stakeholders, including software companies, legal teams, and
open-source projects, with a particular strength in granular licensing
details.
- #gls("CycloneDX", long: false): A lightweight #gls("SBOM") standard aimed at
enhancing application security and managing software supply chain risks. It
emphasises simplicity and efficiency, supporting formats such as XML, JSON,
and ProtoBuf, and is particularly tailored towards the identification of
software components, their vulnerabilities, and risk assessments, making it a
favourite in the application security and #gls("DevSecOps") communities.
The key differences between the #gls("SPDX") and #gls("CycloneDX") formats lie
primarily in their focus, structure, and community support. The choice between
#gls("SPDX") and #gls("CycloneDX") should be guided by an organisation's
specific needs, whether the focus is on extensive licensing compliance or
streamlined security and risk management within the software supply chain.
The #gls("CRA") #cite(<CRA>, form:"normal") mandates the incorporation of a
#gls("SBOM") in software products, highlighting its important role in bolstering
software security and transparency. This requirement marks a significant
advancement in enhancing the integrity and security of software, ensuring that
all components are meticulously documented and traceable throughout the software
lifecycle. While the #gls("CRA") encompasses various provisions, much of it will
become enforceable three years after its passage, likely in early 2027.
Specifically, regarding #gls("SBOM"), the following applies to products with
digital elements available: #quote[identify and document vulnerabilities and
components contained in products with digital elements, including by drawing up
a software bill of materials in a commonly used and machine-readable format
covering at the very least the top-level dependencies of the products]
#cite(<CRA>, supplement: "Annex I, Part II (1)", form:"normal").
==== Supply Chain <ch2-supply-chain>
A software application is composed of many components, each of which is
developed by different teams or organisations. These components are then
composed together into a final product, which is the software application
itself. This process is known as the
#emph[software supply chain] #cite(<malka-hal-04482192>, form: "normal").
Contemporary software development leverages the concepts of composability and
reusability, preferring the integration and reuse of existing libraries over
developing new functionalities from scratch. This methodology enhances
productivity and contributes to the creation of more reliable software by
allowing each component to concentrate on executing a specific function
effectively. Nevertheless, this reliance on external components leads to the
accumulation of both direct and indirect dependencies, complicating the software
supply chain significantly. The build environments, which encompass all
necessary components and their precise versions for software compilation, become
intricate and difficult to replicate across different systems and over time.
This growing complexity, "politely called #emph[dependency management]"
#cite(<8509170>, form:"normal") but more colloquially known as
#emph[dependency hell], is a phenomenon that developers have become all too
familiar with. While Semantic Versioning (@package-managers) offers a strategy
to mitigate these issues, it alone is insufficient to ensure reproducibility
#cite(<TSE2019>, form: "normal", supplement: [p.11]).
To illustrate this concept, the graph in
@python-runtime-dependencies-graph-with-flaw acts as a simplified #gls("SBOM")
for "My App" version `1.2.3`, highlighting its runtime dependencies essential
for the application's operation. This visualisation selectively excludes the
build-time dependencies required for the application's compilation to maintain
conciseness. A vulnerability has been identified in `xz` (marked in red), a
critical runtime dependency. Consequently, this vulnerability could potentially
compromise its dependent components (marked in orange), including the
application itself, underscoring the interconnected risk within the software's
dependency graph. This scenario, while being a simplified representation,
mirrors the recent CVE-2024-3094 #cite(<CVE-2024-3094>, form: "normal") in the
`xz` project #cite(<xz>, form: "normal"), which affected numerous software
applications and highlighted the criticality of managing software supply chain
risks.
#figure(
include "../../resources/typst/my-app-graph-not-ok.typ",
caption: [
Dependency graph of `my-app` version `1.2.3`, where a flaw has been detected
in `xz` dependency
],
) <python-runtime-dependencies-graph-with-flaw>
These issues are known as #emph[supply chain attacks], a type of cyber attack
that targets vulnerabilities in the supply chain of software or hardware
products, with the aim of compromising the final product by infiltrating its
development or distribution process. This can involve tampering with the
production of components, the assembly of systems, or the delivery of software
updates, thereby infecting end users who trust these sources. One particular
aspect of supply chain attacks is that even the original authors of the software
may be unaware that their product has been compromised, as the malicious
alterations often occur downstream in the supply chain. Although not as frequent
as direct attacks on software or systems, supply chain attacks are becoming
increasingly common due to their potential for widespread impact. Gartner
predicts that by 2025, 45% of organisations worldwide will have experienced
attacks on their software supply chains, a three-fold increase from 2021
#footnote[
https://www.gartner.com/en/newsroom/press-releases/2022-03-07-gartner-identifies-top-security-and-risk-management-trends-for-2022
] while Cybersecurity Ventures predicts that the global cost of software supply
chain attacks to businesses will reach nearly \$138 billion by 2031
#footnote[https://go.snyk.io/2023-supply-chain-attacks-report.html].
Notable examples include the #emph[Stuxnet] worm in 2010
#cite(<mueller2012stuxnet>, form: "normal"),
the #emph[Heartbleed] bug discovered in 2014
#cite(<Heartbleed101>, form: "normal"), and the #emph[SolarWinds] breach in 2020
#cite(<solarwinds-9579611>, form: "normal"). These incidents highlight the
exploitation of interconnectedness and inherent trust within the supply chain,
making supply chain attacks particularly insidious and effective methods of
cyber warfare that can simultaneously affect a large number of users or
organisations.
==== Reproducibility And Security
Reproducibility is a fundamental aspect of software security, particularly in
the context of the software supply chain. It ensures that software can be
reliably and consistently regenerated from its source code, thereby safeguarding
against malicious alterations or tampering. This is particularly relevant in the
context of supply chain attacks, where the integrity of the software supply
chain is compromised, potentially leading to widespread security breaches.
It is paramount to have a clear understanding that having something reproducible
does not mean that it is secure. It is a necessary condition but not a
sufficient one. If a compiler is flawed, it might produce reproducible builds
that could also be potentially insecure.
#figure(
{
set text(font: "Virgil 3 YOFF")
image("../../resources/images/security-independent-builds.svg")
},
caption: [
The reproducible builds approach to increasing trust in executables built by
untrusted third parties.
],
) <security-independent-builds>
In @security-independent-builds, inspired by @abs-2104-06020, end-users should
disregard the binary artefact supplied by their software vendor if its checksum
(`806e7...9c271`) diverges from those generated by independent third parties
(`4e14e...4c0a9`). The security of software is deemed more robust when its
reproducibility is confirmed across multiple environments. It is the consensus
among these environments that contributes to the perception of security. The
premise here is not merely the reproducibility, but the uniformity of this
reproducibility across space and time, which strengthens the trust in the
software's integrity and security.
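To make this consensus notion concrete, the following minimal Python sketch (the
helper name `consensus_digest` is ours and not part of any cited tooling)
compares the checksums of artefacts produced by independent builders:

```python
import hashlib
from collections import Counter

def consensus_digest(artefacts):
    """Compare build artefacts produced by independent builders.

    Returns the most common SHA-256 digest and a boolean indicating
    whether every builder produced a bit-identical artefact.
    """
    digests = [hashlib.sha256(a).hexdigest() for a in artefacts]
    counts = Counter(digests)
    most_common, _ = counts.most_common(1)[0]
    return most_common, len(counts) == 1
```

A vendor's artefact would then only be trusted when the agreement flag is true,
mirroring the scenario depicted in @security-independent-builds.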
As cyber threats evolve, ensuring that software can be reliably and consistently
bit-per-bit reproduced from its source code becomes a cornerstone for
maintaining security integrity. Reproducibility not only facilitates the
verification of software for tampering or malicious alterations but also
strengthens trust in software systems amidst the growing complexity of cyber
threats. Therefore, integrating reproducibility into software development and
distribution processes is a crucial step towards enhancing overall cybersecurity
resilience and safeguarding against the ever-increasing sophistication of cyber
attacks.
=== Reproducibility Utopia <ch2-r13y-utopia>
Reproducibility in #gls("SE") is often considered a utopia. The exact
replication of a piece of software poses a significant challenge. Thus, while striving
for reproducibility is essential, achieving absolute reproducibility is
frequently unattainable in practice.
One of the primary impediments in achieving reproducibility lies in the
dependency on hardware architecture. Software compiled for different
architectures, such as `x86` and `ARM`, inherently produces disparate binaries #cite(<patterson2013>,form:"normal").
These differences stem from the instruction sets and optimisations that are
specific to each platform, leading to divergent outputs despite using identical
source code. This variance highlights a significant reproducibility challenge,
as achieving bitwise identical results across architectures is *not feasible* as
of today.
Compilers #eg[GCC, Rustc, #LaTeX, Typst] also play a role in software
development, transforming high-level code into machine-level instructions.
However, not all compilers operate deterministically. In this context,
non-determinism refers to the phenomenon where compilers produce different
outputs given the same input source code across different compilations. Factors
contributing to this non-determinism include variations in memory allocation,
inclusion of timestamps, and embedding of file paths in the binary output. These
variances pose challenges to achieving consistent, reproducible builds.
#info-box[
A compiler is essentially an application that transforms input into output.
Tools like GCC are referred to as compilers because they convert high-level
code into machine-level instructions. However, the term #emph[compiler] is not
limited to programming languages alone. For example, #LaTeX is a compiler that
transforms a `.tex` file into a `.pdf` file, rustc compiles a `.rs` file into
a binary file, and Typst compiles a `.typ` file into a `.pdf` file. Typically,
compilers convert human-readable files into machine-readable files.
]
In @chapter3, acknowledging the reality that full reproducibility may not be
entirely achievable, we will delve deeper into these challenges by exploring the
impact of non-deterministic compilers and the strategies to mitigate these
challenges using different methods.
== Deterministic Builds And Environments
In this section, we will explore the concept of deterministic builds, and the
potential sources of non-determinism in software builds.
The concept of deterministic builds is essential for ensuring reproducibility. A
build is termed #emph[deterministic] when it consistently generates identical
outputs from a given set of inputs, irrespective of the environment or time of
execution. This predictability is central to software reproducibility, yet
several sources of non-determinism frequently challenge its realisation. One
single non-deterministic component in a build process can render the entire
build non-deterministic. Therefore, it is crucial to identify and understand
these sources of non-determinism to ensure reproducibility. Many of these
sources of non-determinism are related to the environment in which the build
occurs. This environment encompasses the hardware, software, and runtime
conditions in which the build process is executed. These factors can
significantly influence the build process, thereby impacting the stability of
the output.
#definition(term: "Deterministic build", name: "def-deterministic-build")[
Let $B$ be a build process defined as a function:
$B: I times E -> O$
where
- $I$ is the set of all possible input arguments
- $E$ is the set of all possible execution environments (hardware, software,
space, time)
then the build $B$ is deterministic if $I times E$ is deterministic:
$"Determinism"(I times E) -> "Determinism"(B)$
where `Determinism` is a function asserting that its argument is
deterministic.
]
According to @abs-2104-06020, a reproducible build environment is essential for
achieving deterministic and reproducible builds. It ensures consistency in the
software building process by providing a controlled and predictable set of
conditions under which the software can be built. @malka-hal-04430009[p.1]
further elaborate that a build environment is reproducible in both space and
time when it is possible to replicate the same build environment on any machine
and at any point in the past or future.
#info-box[
When a process exhibits a lack of reproducibility over time, it indicates a
fundamental instability within the process. While it would be technically
feasible to replicate the same output in a different environment, within the
same architecture, achieving exact temporal replication of the build process
is practically impossible. This temporal variability serves as a critical
indicator of potential difficulties in ensuring reproducibility across diverse
environments or machines.
]
=== Sources Of Non-Determinism
In this section we will explore the sources of non-determinism in software
builds and usage. The list is not exhaustive, it just includes the most common
sources of non-determinism. The list is created from @abs-2104-06020's paper
and information of the @ReproducibleBuildsOrg project, a global initiative
aiming at improving reproducible builds in software development.
==== Randomness
Using random data in a computation is a common source of non-determinism and
must be avoided. When random data is required, the solution is to use a
predetermined value acting as a seed to the pseudo-random number generator.
Using a predetermined value as a seed ensures that the same random data is
generated each time the computation is executed, thereby guaranteeing
reproducibility.
Hardcoding the seed in the source code would be nonsensical because it would not
be random any more; instead, the seed should be passed as a parameter to the
computation.
This parameter can be passed as a command-line argument, an environment
variable, or a configuration file, leaving the responsibility to the user to
provide a seed.
==== Build Paths
Build paths are paths used by the source code to locate files and resources.
Sometimes, it can happen that absolute paths are used in the source code, which
means that the build will only be reproducible on the same machine where it was
built.
To avoid this, relative paths should be used instead of absolute paths and
sometimes post-processing is required to remove the build path or to normalise
it with a predefined value.
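The normalisation step can be sketched as follows (a hypothetical helper; real
toolchains achieve the same effect with options along the lines of GCC's
`-ffile-prefix-map`):

```python
import os

def normalise_path(path, build_dir):
    """Replace the machine-specific build prefix with a fixed placeholder."""
    absolute = os.path.abspath(path)
    prefix = os.path.abspath(build_dir)
    if absolute.startswith(prefix + os.sep):
        # e.g. /home/alice/project/src/main.c -> /build/src/main.c
        return "/build" + absolute[len(prefix):]
    return absolute
```

Every machine then embeds the same `/build/...` path in its output, regardless
of where the source tree was checked out.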
==== Volatile Inputs
Volatile inputs are inputs that can change over time. For example, the current
date and time are volatile inputs, network streams as well. Dealing with date
and time will be done in @timestamps. For network streams, the solution is to
never rely on remote data while building. Instead, the data should be downloaded
beforehand and stored locally.
This is a common issue in the context of software compilation, where the build
process might download dependencies from the internet during the build.
==== Package Managers <package-managers>
Package managers are tools that automate the process of installing, upgrading,
configuring, and removing packages, typically from a central repository or
package registry. They are widely used in software development to manage
dependencies and facilitate the build process. For example, `Cargo` for Rust,
`Composer` for PHP, `NPM` for NodeJS, `Dune` for OCaml, `tlmgr` for #LaTeX.
Package managers are also used to manage software at the operating system level,
such as `apt` in Debian-based distributions, `pacman` in Arch Linux, `dnf` in
Fedora, `brew` in macOS, `chocolatey` in Windows
#cite(<9403875>, form: "normal", supplement: [p. 10]).
Package managers can inadvertently introduce non-determinism by automatically
downloading or updating dependencies to their latest versions. This process can
lead to inconsistencies, particularly when a newer version of a package includes
changes that are incompatible with the project's codebase. To mitigate this, the
#gls("SemVer") scheme is widely adopted, offering a structured version naming
convention that aids dependency management. However, while packages may declare
#gls("SemVer") compliance, adherence levels vary, with some strictly following
#gls("SemVer") principles and others adopting them more leniently
#cite(<TSE2019>, form: "normal", supplement: [p.5]). Notably, there has been a
trend towards increasing adoption and stricter adherence to #gls("SemVer")
principles by package managers over time
#cite(<TSE2019>, form: "normal", supplement: [p.13]). It provides a structured
version naming convention designed to convey the nature of changes between
releases, thereby aiding in the management of dependencies with a syntax that
succinctly specifies version constraints. While this mechanism greatly
facilitates dependency resolution by leveraging a minimalistic syntax, it
inherently permits variability over time, potentially compromising
reproducibility.
#figure(
sourcefile(
file: "composer.json",
lang: "json",
read("../../resources/sourcecode/composer.json"),
),
caption: [A `composer.json` file, used by the PHP package manager, Composer],
) <composer-json>
In @composer-json, the dependency `foo/http` is specified with version `^1`,
where the caret symbol (`^`) indicates that Composer should install the latest
minor version within the major version `1`. In contrast, the dependency
`foo/bar` is locked to version `1.2.3`, signalling that Composer must install
that specific version, regardless of newer releases. This distinction
underscores the importance of using package managers judiciously to achieve
determinism. For Composer, determinism is further ensured by including a
`composer.lock` file in the project, which explicitly pins each dependency to a
particular version, thus facilitating reproducibility. The decision to require
this file varies by project and is not within the scope of this master thesis.
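To make the caret semantics concrete, the sketch below (deliberately
simplified for illustration: it ignores pre-release tags and the special
treatment that real resolvers give to `0.x` majors) checks whether a version
satisfies a `^`-constraint:

```python
def satisfies_caret(version, constraint):
    """Check a version against a caret constraint such as '^1.2.3'.

    Simplified SemVer semantics: the major component must match and the
    version must not be older than the constraint base.
    """
    base = tuple(int(p) for p in constraint.lstrip("^").split("."))
    base = base + (0,) * (3 - len(base))  # pad '^1' to (1, 0, 0)
    ver = tuple(int(p) for p in version.split("."))
    return ver[0] == base[0] and ver >= base
```

Under these rules, `1.4.0` satisfies `^1.2.3` while `2.0.0` does not, which is
precisely the variability over time that a lock file eliminates.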
Ensuring reproducibility in the context of package managers is particularly
challenging due to the number of different ecosystems and the lack of
standardisation. For example, in the Python realm#footnote[
https://linuxfr.org/news/l-installation-et-la-distribution-de-paquets-python-1-4
], there have been and there are still multiple package manager ecosystems:
`distutils`, `setuptools`, `pip`, `pypi`, `venv`, `conda`, `anaconda`, `poetry`,
`hatch`, `rye`, `uv`. Each of these has its own configuration file format, which
can be used to specify the version of each dependency. However, there is no
standardisation which makes it difficult to ensure reproducibility. The same
issue applies to operating system package managers. For example, in
Debian-based distributions, there are multiple package managers: `apt`, `aptitude`,
`dpkg`.
A potential solution would be to use a universal package manager that would work
across all Linux distributions and programming languages. Tools like `AppImage`,
`snap` and `flatpak` aim to address this challenge, albeit at the operating
system level. These tools simplify the process of transferring a single piece of
software and its pre-built dependencies to various systems. However, they do not
include the C library, leading to potential compatibility issues on newer or
older systems, depending on the version on which the software was originally
built #cite(<9403875>, form: "normal", supplement: [Appendix A.B.2)]).
Furthermore, while these tools represent a step in the right direction, they
introduce their own set of challenges, such as a lack of standardisation among
them, limited adoption, and insufficient support from major distributions.
There are also package managers such as `Nix` and `Guix` that tackle the issue
by being independent and universal. These can be installed and used on GNU/Linux
operating systems, with Nix additionally supporting macOS and FreeBSD. These
package managers offer a method to build and install packages within a sandboxed
environment, thereby isolating them from the rest of the system during build
time #cite(<9403875>, form: "normal", supplement: [Appendix A.B.3)]). This
approach significantly enhances reproducibility. We will explore these package
managers further in @chapter4.
==== Version Information
Version information like commit identifiers can be used to precisely identify
the source code used to build a program.
#figure(
shell(read("../../resources/sourcecode/listing-typst-version.log")),
caption: [Example of program including a commit ID],
) <listing-typst-version>
As illustrated in @listing-typst-version, incorporating specific version
information, such as a commit ID, helps reproduce a build by facilitating the
retrieval of an identical source code version. Nevertheless, the efficacy of
commit IDs as reproducibility anchors remains debatable. These identifiers may
frequently be unavailable at the time of build. It is essential to recognise
that in `git`, a distributed version control system designed to handle
everything from small to very large projects with speed and efficiency,
metadata such as commit IDs is not an intrinsic element of the source code;
instead, it is part of the version control system in use. `git` allows
multiple developers to work
together on the same project simultaneously, providing a robust system for
tracking changes, version history, and collaboration. However, the potential for
easy substitution of one version control system for another renders reliance on
such ephemeral metadata a precarious foundation for software reproducibility.
In scenarios where a version number is necessary, it can be derived from a
dedicated file, such as a changelog, or alternatively provided through an
environment variable. This approach decouples the versioning process from the
underlying version control system, potentially offering a more stable and
reliable method for software version identification.
==== File Order
It is important to ensure that multiple files are always processed in a stable,
consistent order.
Listing files relies on the low-level #gls("POSIX", long: false) call `readdir`, which itself is
dependent on the filesystem in use and therefore does not guarantee any
consistent ordering.
#info-box(kind: "info")[
According to @LibCManual[p.415]: The order in which files appear in a
directory tends to be fairly random. A more useful program would sort the
entries before printing them.
In @tlpi[p.354]: The filenames returned by `readdir()` are not in sorted
order, but rather in the order in which they happen to occur in the directory.
This depends on the order in which the file system adds files to the directory
and how it fills gaps in the directory list after files are removed.
]
There are numerous situations where relying on an existing list of files can
result in non-determinism. For instance, when generating an archive from the
contents of a directory, many file systems do not provide consistent ordering
when listing files within that directory. Consequently, the arrangement of files
in the archive may differ between builds, causing unpredictable archives.
Although these archives might contain identical content, they could have been
compressed with varying file orders.
To address this, one could enforce a stable order by explicitly sorting the
inputs before processing them. This can be done by sorting the list of files in
the directory based on a specific criterion, such as their names or modification
timestamps.
#figure(
shell(read("../../resources/sourcecode/tar-sort-name-flag.log")),
caption: [
Use of `--sort=name` flag to ensure a stable order of files in an archive
],
) <listing-tar-sort-flag>
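The same principle can be sketched in a few lines of Python (the helper is
hypothetical and not part of any cited tool): sorting directory entries before
hashing them yields a digest that no longer depends on filesystem ordering:

```python
import hashlib
import os

def digest_directory(directory):
    """Hash file names and contents in a sorted, filesystem-independent order."""
    h = hashlib.sha256()
    for name in sorted(os.listdir(directory)):  # enforce a stable order
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            h.update(name.encode())
            with open(path, "rb") as f:
                h.update(f.read())
    return h.hexdigest()
```

Without the `sorted` call, two runs over the same directory could visit files
in different orders and report different digests for identical content.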
==== Timestamps <timestamps>
Timestamps are among the biggest sources of non-determinism in software builds,
as they can lead to differences due to changing times between builds. Since
reproducibility checks the content of the output and its metadata, building
multiple times some source code will create output artefacts with possibly the
same content but with different metadata, like file timestamps, making them
irreproducible.
Often, timestamps are used to approximate which version of the source was
built. Since file timestamps are volatile, the source code needs to be tracked
more accurately than just a timestamp. Just like for version information, the
solution would be to extract the date from a dedicated file like a changelog, or
a specific commit #cite(<nixpkgs-pull-256270> ,form: "normal").
To circumvent this issue, `SOURCE_DATE_EPOCH` is a specific environment variable
convention for pinning timestamps to a specific value that has been introduced
by the `reproducible-builds.org` community and it is now widely used by many
compilers and build tools.
Another option is to use `libfaketime`, a library that intercepts system
function calls retrieving the current time of day and replies with a predefined
date and time instead.
When none of these options are viable, using a tool like `strip-nondeterminism`
#cite(<strip-nondeterminism>, form: "normal") is a temporary workaround for
stripping non-deterministic information such as timestamps and filesystem
ordering from various file and archive formats.
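Honouring this convention requires only a few lines in a build tool; the sketch
below (assuming Python; the function name is ours) falls back to the
non-reproducible wall clock only when the variable is unset:

```python
import os
import time

def build_timestamp():
    """Return the timestamp to embed in build artefacts.

    When SOURCE_DATE_EPOCH is set, repeated builds agree on the same
    date; otherwise we fall back to the (non-reproducible) wall clock.
    """
    value = os.environ.get("SOURCE_DATE_EPOCH")
    if value is not None:
        return int(value)
    return int(time.time())
```

Two builds run with the same `SOURCE_DATE_EPOCH` value will embed identical
timestamps, removing this source of non-determinism.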
==== Locale Environment Variables
#figure(
shell(read("../../resources/sourcecode/date-format-flags.log")),
caption: [Use `LC_ALL` and `-u` flags to configure the date format],
) <listing-date-format-flags>
`LC_ALL` is a locale environment variable that can modify various aspects of an
application's behaviour. It can change the date format, string collation order,
and character encoding. Although each parameter can be set individually,
`LC_ALL` enables you to configure them all simultaneously and override any other
locale environment variables.
In @listing-date-format-flags, we apply flags such as `-u` together with the
`LC_ALL` environment variable to the `date` command. This approach ensures that
the output we receive is predictable and consistent,
regardless of the underlying system configuration.
=== Comparing Builds
In the quest for software reproducibility, identifying and understanding the
differences between two builds of the same software becomes paramount,
especially when those builds are not identical. This section introduces a tool
designed specifically for this purpose.
Developed under the umbrella of the @ReproducibleBuildsOrg effort, `diffoscope`
#cite(<diffoscope>, form:"normal") is a comprehensive, open-source tool that
excels in comparing files and directories. Its unique capability to recursively
unpack archives of various types and transform binary formats into a
human-readable form makes it an indispensable tool for software comparison. It
seeks to simplify the process of identifying discrepancies between software
builds. This functionality is crucial for developers and researchers striving to
pinpoint and resolve the causes of non-reproducibility. An online version of the
tool is also available#footnote[https://try.diffoscope.org/].
To demonstrate the effectiveness of `diffoscope` in identifying differences
between non-reproducible builds, @bash-gcc-not-reproducible-builds considers
the hypothetical example of a simple program that outputs the current date and
time. Due to its nature, compiling this program twice, even with the same source
code, will inherently produce two different builds.
First, we compile the source code twice, creating `build1` and `build2`:
#figure(
shell(
read("../../resources/sourcecode/bash/bash-gcc-not-reproducible-builds.log"),
),
caption: [
Compilation of non-reproducible programs and the use of their checksums for
comparison
],
) <bash-gcc-not-reproducible-builds>
Then, we use `diffoscope` to compare these builds:
#figure(
shell(read("../../resources/sourcecode/bash/bash-diffoscope-comparison.log")),
) <bash-diffoscope-comparison>
The tool will generate a detailed report (@diffoscope-report) highlighting the
differences between `build1` and `build2`. In this hypothetical example,
differences might include timestamps or other build-specific metadata embedded
within the binary.
#figure(
{
image("../../resources/images/diffoscope-report.svg")
},
caption: [A `diffoscope` report using HTML format],
) <diffoscope-report>
=== Fixing Builds
In this subsection, we delve into strategies for addressing non-reproducible
builds, acknowledging the vast array of potential causes and the impossibility
of covering every solution comprehensively.
Previously in @bash-gcc-not-reproducible-builds, we encountered an issue where
compiling the source code (@datetime.c) twice resulted in different binaries.
Using `diffoscope`, we identified, as shown in @diffoscope-report, the source of
variability as date and time strings embedded within the binaries.
A solution has been proposed in @timestamps, we can leverage the
`SOURCE_DATE_EPOCH` environment variable to address this specific challenge in
achieving reproducible builds. This approach standardises the date and time used
during the build process, ensuring consistency across compilations and thus
contributing to reproducibility.
#figure(
shell(read("../../resources/sourcecode/bash/bash-fixing-builds.log")),
caption: [Fix builds using an environment variable],
) <bash-fixing-builds>
== Conclusion
This chapter embarked upon a detailed journey through the landscape of
reproducibility, focusing particularly on its pivotal role within the realms of
science and, more specifically, #gls("CS") and #gls("SE"). Through rigorous
analysis, we unveiled the multifaceted nature of reproducibility.
We dissected the concept of reproducibility, from its foundational elements in
science to its intricate implications in computer science, delineating the
essential terminology that frames our discussion: computations, pure and impure
functions, inputs, outputs, and the environmental variables that intertwine to
influence reproducibility. The exploration into deterministic builds and the
sources of non-determinism not only highlights the inherent challenges but also
sets the stage for the subsequent focus on the tools and methodologies designed
to tame these complexities.
As we pivot toward the next chapter, our narrative will transition from the
theoretical underpinnings to the practical arsenal at our disposal for enhancing
reproducibility in #gls("SE"). While the groundwork laid in this chapter paves
the way for an in-depth exploration, it is important to acknowledge the vast
landscape of tools and methodologies available in this domain. Given the scope
of this thesis, we will focus on four evaluation methods using three key tools,
with the understanding that this selection is not exhaustive but rather
representative of the broader ecosystem. Through the lens of real-world
applications and case studies, we will explore how these chosen tools are used
to mitigate the challenges identified herein and to foster an ecosystem where
reproducible research and development are not merely aspirational goals but
operational norms.
In fine, this chapter serves as both a foundation and a bridge. It offers a
comprehensive understanding of reproducibility that is critical for appreciating
the significance of the solutions and methodologies discussed in the chapters
that follow. It is within this framework that we continue our quest to demystify
reproducibility, moving from conceptual clarity to practical application, with
the ultimate aim of enhancing the reliability, security, and transparency of
#gls("SE") practices.
|
https://github.com/justmejulian/typst-documentation-template | https://raw.githubusercontent.com/justmejulian/typst-documentation-template/main/utils/getCurrentHeading.typ | typst | #let getCurrentHeading() = {
locate(loc => {
// Find if there is a level 1 heading on the current page
let nextMainHeading = query(selector(heading).after(loc, inclusive: false), loc).find(headIt => {
headIt.location().page() == loc.page() and headIt.level == 1
})
if (nextMainHeading != none) {
// If there is a level 1 heading on the current page, don't show in header
return " "
}
// Find the last previous level 1 heading
let lastMainHeading = query(selector(heading).before(loc), loc).filter(headIt => {
headIt.level == 1
}).last()
return lastMainHeading.body
})
}
|
|
https://github.com/Hobr/njust_thesis_typst_template | https://raw.githubusercontent.com/Hobr/njust_thesis_typst_template/main/util/font.typ | typst | MIT License | // 字体
#let fonts = (
zh_宋体: "SimSun",
zh_楷体: "KaiTi",
zh_黑体: "SimHei",
zh_等线: "DengXian",
en: "Times New Roman",
math: "Cambria Math",
jp: "MS Mincho",
)
// 字号
#let fontSize = (
一号: 26pt,
小一: 24pt,
二号: 22pt,
小二: 18pt,
三号: 16pt,
小三: 15pt,
四号: 14pt,
小四: 12pt,
五号: 10.5pt,
小五: 9pt,
六号: 7.5pt,
小六: 6.5pt,
七号: 5.5pt,
)
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/compiler/ops-prec.typ | typst | Apache License 2.0 | // Test operator precedence.
// Ref: false
---
// Multiplication binds stronger than addition.
#test(1+2*-3, -5)
// Subtraction binds stronger than comparison.
#test(3 == 5 - 2, true)
// Boolean operations bind stronger than '=='.
#test("a" == "a" and 2 < 3, true)
#test(not "b" == "b", false)
---
// Assignment binds stronger than boolean operations.
// Error: 2:3-2:8 cannot mutate a temporary value
#let x = false
#(not x = "a")
---
// Precedence doesn't matter for chained unary operators.
// Error: 3-12 cannot apply '-' to boolean
#(-not true)
---
// Not in handles precedence.
#test(-1 not in (1, 2, 3), true)
---
// Parentheses override precedence.
#test((1), 1)
#test((1+2)*-3, -9)
// Error: 8-9 unclosed delimiter
#test({(1 + 1}, 2)
|
https://github.com/dainbow/MatGos | https://raw.githubusercontent.com/dainbow/MatGos/master/themes/38.typ | typst | #import "../conf.typ": *
= Residues. Computing integrals over a closed contour using residues

#definition[
  Let $f$ be holomorphic in $dot(O_r (a)), a in CC$; then the residue is defined as
  #eq[$res_a f = 1 / (2 pi i) integral_(gamma_rho) f(z) d z $]
]

#lemma[Residues are well defined (they do not depend on $gamma$)]
#proof[
  Let $f = sum_(-oo)^(+oo) c_n (z-a)^n, space z in dot(O)_r(a)$; then
  #eq[$
    1/(2 pi i) integral_(gamma_rho) f(z) d z = sum_(-oo)^(+oo) c_n 1/(2 pi i) integral_(gamma_rho) (z-a)^n = c_(-1)
  $]
  which does not depend on $gamma$.
]

#theorem[_Cauchy's residue theorem_ (a.k.a. _computing integrals over a closed contour using residues_)

  Let $D$ be bounded by the cycle $Gamma = gamma_0 - gamma_1 - gamma_2 - dots - gamma_n$.
  Let $A = {a_1, a_2, a_3, dots , a_N} subset.eq D$, and let
  $f$ be holomorphic in $D' without A$, where $D' supset D$. Then

  #eq[$ 1 /(2 pi i) integral_Gamma f(z) d z = sum^N res_(a_i) f$]
]
#proof[
  Surround each singular point with a circle of radius $R$. Add these circles ($delta_i$) to
  $Gamma$ and subtract them again. The part with the minus signs gives a new cycle $tilde(Gamma) = Gamma - sum delta_i$
  in which $f$ is holomorphic.

  We check that $tilde(Gamma) space ~ space 0 (mod tilde(D))$:
  - at points outside $D$ the index remains 0;
  - at the new points (inside $delta_i$), $ 1 - 1 = 0$.

  Hence the integral over $tilde(Gamma)$ equals 0, and the remaining part is $sum integral_(delta_i) f d z= 2 pi i sum res_(a_i) f$.
]
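A worked example of the theorem (added for illustration): for $f(z) = 1/(z(z-2))$ integrated over the circle $|z| = 1$, only the pole at $z = 0$ lies inside the contour, so

```latex
\oint_{|z|=1} \frac{dz}{z(z-2)}
  = 2\pi i \,\operatorname{Res}_{z=0}\frac{1}{z(z-2)}
  = 2\pi i \cdot \lim_{z \to 0}\frac{1}{z-2}
  = 2\pi i \cdot \left(-\tfrac{1}{2}\right)
  = -\pi i .
```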
|
|
https://github.com/dashuai009/dashuai009.github.io | https://raw.githubusercontent.com/dashuai009/dashuai009.github.io/main/src/content/blog/029.typ | typst |
#let date = datetime(
year: 2022,
month: 6,
day: 20,
)
#metadata((
title: "wsl2-arch",
subtitle: [wsl2,archlinux],
author: "dashuai009",
  abstract: "A personal quirk: all my software has to be the latest! Arch hits that spot perfectly. ArchWSL! Starting this blog to record what needs to be installed on a new machine.",
  description: "This is a more modern way of organizing C++ code. Modules are groups of source files that are compiled independently of the translation units that import them.",
pubDate: date.display(),
))<frontmatter>
#import "../__template/style.typ": conf
#show: conf
#outline()
== ArchWSL
<archwsl>
#link("https://github.com/yuk7/ArchWSL")[ArchWSL]
!!!
|
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-4E00.typ | typst | Apache License 2.0 | #let data = (
"0": ("<CJK Ideograph, First>", "Lo", 0),
"51ff": ("<CJK Ideograph, Last>", "Lo", 0),
)
|
https://github.com/Tiggax/zakljucna_naloga | https://raw.githubusercontent.com/Tiggax/zakljucna_naloga/main/src/sec/4rezultati.typ | typst | #import "/src/additional.typ": todo
#set heading(offset: 1)
#import "/src/figures/mod.typ": constants_diff_fig
= Default system
Simulating the default system with variables seen in @constants, we get the following graphs seen in @default and in @default-fac.
Volume starts to rise with day 2, as at that time bioreactor feeding initiates.
At day 3 a temperature shift has been made, that changes the growth rate of @VCD.
@DO levels drop from 80% to 25% within the five days, when the @PID controller starts adding oxygen to keep the minimum level of 25%.
It can be seen that the maximum values of the @PID controller increase at the same rate as @VCD grows.
@VCD concentration increases until some time before day 10, when glutamine runs out and causes it to stop growing.
@VCD then slowly drops until around day 12, when glucose runs out, which prompts it to drop faster.
#figure(
caption: [Graph of the default system],
image("/figures/default.png")
)<default>
#figure(
caption: [Faceted Graph of the default system],
image("/figures/default-facet.png")
)<default-fac>
= Variations of the constants
== Inoculation values
In @vcd, a graph of @VCD can be seen, graphing three different initial inoculations with inoculum of $0.4$, $0.5$ (default) and $0.6$.
All three inoculations had a temperature shift at day 3.
All three values reach roughly the same peak, with larger inoculations reaching somewhat higher peaks, and faster.
Towards the last day it can be seen that the biggest inoculation starts to drop off the fastest.
#figure(
caption: [Graph of viable cell density over time based on the changing initial $"VCD"$ value],
image("/figures/vcd.png")
)<vcd>
@vcd-fac shows the whole system for each of the values in relation to time.
It can be seen that the glucose and glutamine levels drop faster with a higher inoculation value, as does the oxygen.
As oxygen drops faster, @PID control starts earlier, but does not max out its regulation.
With lower levels of inoculation, it can be seen that on the last day, the product concentration is lower.
It can be seen that the concentration of product extracted increases with a higher inoculation value.
This means that larger inoculations can potentially produce more product in the same time-frame, but require more oxygen, glucose and glutamine during their run.
#figure(
  caption: [Facet graph of the system over time based on the changing initial $"VCD"$ value],
image("/figures/vcd-facet.png")
)<vcd-fac>
== $mu_"max"$ Variations
In @mu_max it can be seen that changing the $mu_"max"$ constant changes the rate at which the cells multiply.
This means that higher values translate to faster growth of cells in the same medium.
This faster growth also means that @VCD reaches its maximum density faster.
#figure(
caption: [Graph of viable cell density over time based on the changing values of $mu_max$],
image("/figures/mu_max.png")
)<mu_max>
Looking at other values in @mu_max-fac, it can be seen that higher values of $mu_"max"$ also accelerate the consumption of glucose, glutamine and oxygen.
As more cells are generated, the concentration of product also rises.
Bigger consumption also strains the @PID control, as oxygen needs to be pumped into the system sooner and in bigger quantities for cells to not suffocate.
#figure(
  caption: [Facet graph of the system over time based on the changing values of $mu_max$],
image("/figures/mu_max-facet.png")
)<mu_max-fac>
== Feed rate
The results of changing the feed rate can be seen in @feed.
The red line represents a bioreactor with no feed, i.e. a normal batch bioreactor.
It can be noted that since the feed rate is 0%, cell concentration stays the same after reaching the maximum of the system.
This is because of the assumption in @vcd-sec, that 100% of cells in the bioreactor are alive at every increment.
It can be seen that as the feed rate increases, the concentration increases more slowly and then starts to fall off when the cells stop multiplying because of a lack of nutrients.
This happens, since volume steadily increases, and cells start to get diluted.
#figure(
caption: [Graph of viable cell density over time based on the changing values of feed rate],
image("/figures/feed.png")
)<feed>
In @feed-fac we can see feed rate compared to other values.
Glucose and glutamine last longer, as they are added with the feed substrate. The bigger the feed rate, the more substrate gets added per minute, meaning the volume of substrate in the bioreactor increases faster.
In @PID control a small decrease of needed oxygen can be seen, as more oxygen can get dissolved in larger volume.
The highest concentration of product occurs in the batch bioreactor, while bioreactors with feed contain lower concentration.
#figure(
  caption: [Facet graph of the system over time based on the changing values of feed rate],
image("/figures/feed-facet.png")
)<feed-fac>
The bioreactor was also simulated with feed values large enough that at least one of the substrate nutrients does not get depleted, using the constants in @process-const.
#figure(
  caption: [The different feed rates in @process and @process-fac were simulated with the following constants, which differ from the default values in @constants],
constants_diff_fig("feed_process0")
)<process-const>
The simulation made four runs with different feed rates in each iteration.
The facet graph of the four runs can be seen in @process-fac.
With growing feed rate, the @VCD needs more time to reach the capacity of the system, with concentrations of different runs staying in similar ranges.
As glucose and glutamine are being fed into the system, larger feed rate values extend the time of cell growth.
As the values are in concentrations, and the volume is not constant, the values can be deceiving.
Looking at @process, we can see the number of cells and theoretical mass values of each of the nutrients, as well as the wanted product, at the last day of the process.
It can be seen that increasing the feed rate results in an increase of viable cells, but somewhere between 10 % and 15 % this correlation starts to die off.
Looking at glucose and glutamine mass, it can be seen that they follow a similar pattern.
Similarly the mass of product at the end of the process is the biggest at 10 % feed rate.
#figure(
caption: [Facet of the different feed rates ranging from 0% to 15 % in 5 % increments.],
image("/figures/feed_process-facet.png")
)<process-fac>
#figure(
caption: [Last day cell count and mass of substrate and product for different feed rates ranging from 0% to 15 % with 5 % increments],
image("/figures/feed_process.png")
)<process>
== Glucose and glutamine
Graphing glucose in @glucose and @glucose-fac and glutamine in @glutamin and @glutamin-fac, it can be seen that the glucose and glutamine constants mainly affect the consumption of glucose and glutamine, with no major changes to the rest of the system.
#figure(
caption: [Graph of glucose concentration over time based on the changing glucose constant],
image("/figures/glucose.png")
)<glucose>
#figure(
caption: [Graph of glucose concentration over time based on the changing glucose constant],
image("/figures/glucose-facet.png")
)<glucose-fac>
#figure(
caption: [Graph of glutamine concentration over time based on the changing glutamine constant],
image("/figures/glutamine.png")
)<glutamin>
#figure(
caption: [Graph of glutamine concentration over time based on the changing glutamine constant],
image("/figures/glutamine-facet.png")
)<glutamin-fac>
= Fitting to the Data
The system seen in @fit was fitted to the data values from @data_figure.
The solution to the system differs from the default system (seen in @constants) with values seen in @fit-const.
The system was first fitted as closely as possible by manually adjusting values, and then `n_vcd` and `mu max` were fitted using the @VCD data points.
#figure(
caption: [The constants used to simulate the approximate solution],
constants_diff_fig("data-fit")
)<fit-const>
#figure(
caption: [facet graph of the system],
image("/figures/data-fit.png")
)<fit>
The @VCD values in @fit correspond closely to the simulated model, with the values from around day 10 starting to fall off.
This could be due to the model assuming 100% viability, whereas in experimental bioreactors cell death occurs.
Glucose and glutamine data was fitted to the points manually, as fitting with the Nelder-Mead method failed.
While fitting the data, only the @VCD points were used as the minimization target.
Trying to fit the model on glucose or glutamine resulted in the model identifying a value of 0 as the best result.
This could be attributed to the minimization function used.
The function used was a simple sum of all squared distances from the data points to the model value at that time.
At points close to 0, the minimization function can incorrectly identify 0 as the best option when the constants are small numbers.
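The loss described above can be sketched with a toy example (an added illustration, not the thesis code: the model, variable names, and the grid search standing in for Nelder-Mead are all assumptions):

```python
import math

def model(t, mu, x0=0.5):
    # Toy exponential-growth model standing in for one model variable.
    return x0 * math.exp(mu * t)

def sse(mu, data):
    # Sum of squared distances between the data points and the model
    # value at each point in time (the minimization function above).
    return sum((model(t, mu) - x) ** 2 for t, x in data)

# Synthetic, noise-free "measurements" generated with mu = 0.8.
data = [(t / 4, model(t / 4, 0.8)) for t in range(21)]

# A crude grid search stands in for Nelder-Mead here; the shape of the
# loss surface is what matters, not the optimizer.
best_mu = min((sse(mu / 100, data), mu / 100) for mu in range(0, 201))[1]
print(best_mu)  # 0.8
```

With noisy data near zero and small constants, this same loss surface becomes flat enough around 0 for the optimizer to settle there, which matches the behaviour described above.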
= Suggestions for further work
== Viability model
The model assumes that @VCD viability is 100%, meaning that the cells never die.
This is not the case in bioreactor processes, as a lack of nutrients and other substances needed for cell metabolism can induce apoptosis.
== Better product simulation
The model makes the production of product proportional to the density of cells, with a conversion constant.
In biopharmaceuticals, the production of metabolites can depend on the concentrations of other molecules in the medium.
Production can also be affected by temperature or by the stage the cells are in.
The metabolites can also be secondary as opposed to primary, in which case they are released after cell apoptosis, meaning the growth of the concentration would depend on the cells that die each minute.
As some metabolites are complex molecules, they can be unstable and prone to degradation over time, so simulating product loss over time could be beneficial for bioreactor optimization.
A more detailed simulation of product production could therefore be worthwhile.
== Oxygen simulation
=== Henry's constant
The model uses a single value of Henry's constant to calculate the amount of dissolved oxygen using Henry's law.
The constant depends on temperature, which means that Henry's constant should take different values before and after the temperature shift.
Modeling this detail could improve simulation accuracy.
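The idea can be sketched as follows (an added illustration; the constants are textbook-style approximations for oxygen in water, not the values used in the model):

```python
import math

KH_REF = 1.3e-3   # mol/(L*atm), Henry's solubility constant for O2 near 25 degC
T_REF = 298.15    # K, reference temperature
VANT_HOFF = 1700  # K, approximate temperature-dependence coefficient for O2

def henry_constant(temp_k):
    # van 't Hoff correction: solubility drops as temperature rises.
    return KH_REF * math.exp(VANT_HOFF * (1.0 / temp_k - 1.0 / T_REF))

def dissolved_o2(p_o2_atm, temp_k):
    # Henry's law: C = kH(T) * p
    return henry_constant(temp_k) * p_o2_atm

pre_shift = dissolved_o2(0.21, 37 + 273.15)   # before the temperature shift
post_shift = dissolved_o2(0.21, 31 + 273.15)  # after the shift to 31 degC
print(pre_shift, post_shift)  # more O2 dissolves at the lower temperature
```

Using a single constant for both phases over- or under-estimates one of them, which is exactly the inaccuracy described above.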
=== PID control
The PID controller was implemented as a simple P control.
Derivative and integral parts could help optimize oxygen consumption and could be modeled accordingly.
The integration of PID control with the Runge-Kutta simulation could also be improved upon, as the current implementation calculates the average of the non-zero input values.
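A minimal discrete PID step shows how the derivative and integral terms would extend the P-only control used here (a textbook sketch; the gains and setpoint are placeholders, not tuned values from the model):

```python
class PID:
    """Textbook discrete PID controller; the model's current control
    corresponds to kp > 0 with ki = kd = 0."""

    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.dt = dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, measured):
        error = self.setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to [0, 1]: oxygen can only be added, up to a maximum rate.
        return min(max(out, 0.0), 1.0)

# DO at 20% with a 25% setpoint -> the controller commands some oxygen.
pid = PID(kp=0.05, ki=0.01, kd=0.0, setpoint=25.0, dt=1.0)
u = pid.step(20.0)
print(u)  # positive control signal, at most 1.0
```

The integral term would remove the steady-state offset a pure P control leaves below the setpoint, at the cost of having to handle integrator wind-up when the output saturates.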
=== Volumetric dissolution
The volumetric mass transfer coefficient ($k L a$) is used to calculate the increase of the oxygen concentration in the bioreactor.
The coefficient depends on the volume and the interfacial area, which together determine the effective dissolution of the oxygen.
In the model, @kLa was calculated based on the default volume of the bioreactor and then scaled proportionally to the increased volume.
Simulations using a more detailed dissolution model could be made in the future.
== Data point fitting
=== Minimization function
The model implemented, as the minimization function for the Nelder-Mead method, a simple sum of the squared distances between the data points and the value calculated by the model at each point in time.
This is fast and simple, but it created some problems when trying to calculate the best fit for glutamine and glucose, as the "best" value would always become 0.
Further study of point-distance evaluation for data fitting would be beneficial for better fits to experimental data.
One possible solution would be to evaluate the best simplex for a given constant, since the model used non-optimized values for its simplex.
=== Machine learning
A complete model could be developed, and then used to fit experimental data.
The data about the model state could then be used to train an artificial neural network on similarities between runs using the same @CHO strain, bioreactor, product, etc., which could be used to predict future bioreactor runs.
Machine learning could also prove useful in the search for the optimal simplex for data fitting. |
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/tiaoma/0.2.0/README.md | markdown | Apache License 2.0 | # tiaoma
[tiaoma(条码)](https://github.com/enter-tainer/zint-wasi) is a barcode generator for typst. It compiles [zint](https://github.com/zint/zint) to wasm and uses it to generate barcodes. It supports nearly all common barcode types. For a complete list of supported barcode types, see [zint's documentation](https://zint.org.uk/).
## Example
```typ
#import "@preview/tiaoma:0.2.0"
#set page(width: auto, height: auto)
= tiáo mǎ
#tiaoma.ean("1234567890128")
```

## Manual
Please refer to [manual](./manual.pdf) for more details.
## Alternatives
There are other barcode/qrcode packages for typst such as:
- https://github.com/jneug/typst-codetastic
- https://github.com/Midbin/cades
Packages differ in provided customization options for generated barcodes. This package is limited by zint functionality, which focuses more on coverage than customization (e.g. inserting graphics into QR codes). Patching upstream zint code is (currently) outside of the scope of this package - if it doesn't provide functionality you need, check the rest of the typst ecosystem to see if it's available elsewhere or request it [upstream](https://github.com/zint/zint) and [notify us](https://github.com/Enter-tainer/zint-wasi/issues) when it's been merged.
### Pros
1. Support for far greater number of barcode types (all provided by zint library)
2. Should be faster, as it uses a WASM plugin that bundles zint code written in C; others are written in pure typst or javascript.
### Cons
1. While most if not all of zint functionality is covered, it's hard to guarantee there's no overlooked functionality.
2. This package uses the typst plugin system and has a WASM backend written in Rust, which makes it less welcoming for new contributors.
|
https://github.com/coljac/typst-cas-thesis | https://raw.githubusercontent.com/coljac/typst-cas-thesis/main/README.md | markdown | The Unlicense | # Typst CAS thesis
A Typst PhD thesis template for Swinburne University and the Centre for Astrophysics and Supercomputing specifically.
This is a work in progress. It will be added to the official template repository when that's ready.
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/layout/cjk-punctuation-adjustment.typ | typst | Apache License 2.0 | #set page(width: 15em)
// In the following example, the space between 》! and ? should be squeezed.
// because zh-CN follows GB style
#set text(lang: "zh", region: "CN", font: "Noto Serif CJK SC")
原来,你也玩《原神》!?
// However, in the following example, the space between 》! and ? should not be squeezed.
// because zh-TW does not follow GB style
#set text(lang: "zh", region: "TW", font: "Noto Serif CJK TC")
原來,你也玩《原神》! ?
---
#set text(lang: "zh", region: "CN", font: "Noto Serif CJK SC")
《书名〈章节〉》 // the space between 〉 and 》 should be squeezed
〔茸毛〕:很细的毛 // the space between 〕 and : should be squeezed
---
#set page(width: 21em)
#set text(lang: "zh", region: "CN", font: "Noto Serif CJK SC")
// These examples contain extensive use of Chinese punctuation marks,
// from 《Which parentheses should be used when applying parentheses?》.
// link: https://archive.md/2bb1N
(〔中〕医、〔中〕药、技)系列评审
(长三角[长江三角洲])(GB/T 16159—2012《汉语拼音正词法基本规则》)
【爱因斯坦(Albert Einstein)】物理学家
〔(2009)民申字第1622号〕
“江南海北长相忆,浅水深山独掩扉。”([唐]刘长卿《会赦后酬主簿所问》)
参看1378页〖象形文字〗。(《现代汉语词典》修订本)
|
https://github.com/crd2333/crd2333.github.io | https://raw.githubusercontent.com/crd2333/crd2333.github.io/main/src/docs/Reading/跟李沐学AI(论文)/GAN.typ | typst | // ---
// order: 6
// ---
#import "/src/components/TypstTemplate/lib.typ": *
#show: project.with(
title: "d2l_paper",
lang: "zh",
)
= Generative Adversarial Nets
== Abstract & Introduction & Related Work
- Abstract
  - Two ways to write one
    - Novel work (this paper): make clear what you are
    - Extension work: differences from and improvements over others
  - Proposes a framework (weigh the importance of that word), i.e. a Generator and a Discriminator
  - $G$'s goal is to make $D$ err, rather than the usual approach of directly fitting the data distribution
  - $D$'s goal is to tell whether the input comes from $G$ or from the real data
  - Similar to a minimax two-player game; ultimately we hope to find a solution in function space such that $G$ approximates the real data and $D$ cannot tell the difference
  - Trained with MLPs --> error backpropagation, with no need for Markov chains or unrolled approximate inference networks
- Introduction
  - Deep learning does not equal deep neural networks; instead of approximating the likelihood function with a DNN, other methods can give computationally better models
  - An analogy is drawn between $G$, $D$ and a counterfeiter and the police
  - $G$'s input is random noise (usually Gaussian), mapped to an arbitrary distribution. Both models are MLPs, so they can be optimized via backpropagation
- Related work
  - This is the NIPS final version; the early arXiv version essentially had no related-work section, perhaps because the authors really did come up with it entirely on their own, but in fact there was some similar prior work
  - To learn a data distribution, one idea is to assume what the distribution is and learn its parameters; another is to directly learn a model that approximates the distribution (the downside being that even once learned, you still do not know what the distribution actually is); the latter gradually became mainstream
  - Related work: VAEs
  - Related work: NCE (noise-contrastive estimation)
  - Related work: PM (predictability minimization); GAN is almost its reverse, which is also an amusing anecdote
  - An easily confused concept: adversarial examples, used to test the robustness of algorithms
== Method
- The adversarial modeling framework is simplest and most direct when $G$ and $D$ are both MLPs
  - Let the data be $bx$ with distribution $p_"data" (x)$, and define a noise variable with distribution $p_z (bz)$
  - $G$ learns a distribution $p_g=G(bz; th_g)$ over $bx$, where $G$ is an MLP function parameterized by $th_g$, so that $p_g$ approximates $p_"data"$
  - Define $D(bx; th_d)$, an MLP function parameterized by $th_d$, whose scalar output is the probability that the input is real data ($1$ means real)
- Define the loss function and optimization objective
  $ min_G max_D V(D,G) = EE_(bx wave p_"data" (bx)) [log D(x)] + EE_(bz wave p_z (bz)) [log (1 - D(G(bz)))] $
- A game example
  - A 4k-resolution image (8M pixels) on a monitor. Each pixel is a random vector controlled by the game program $p_"data"$. $bx$ is an 8M-dimensional multivariate random variable
  - $G$'s learning goal: generate images identical to the game's. How does the game generate an image? Say the 4k image is controlled by $100$ variables
    + Disassemble the game code and find the generation logic, which is hard
    + Give up on the underlying logic and directly construct a vector of about $100$ dimensions (this is the $bz$ here, a prior on input), using an MLP to brute-force fit what the final image looks like
      - The advantage is cheap computation; the disadvantage is no real understanding of the code. Given an image, it is hard to find the corresponding $bz$; one can only go the other way: pick a random $bz$ and generate a plausible image
  - $D$'s learning goal: judge whether an image was generated by the game
- $D$ maximizes $V(D,G)$. For a perfect $D$, the first term classifies real as real and the second fake as fake, i.e. $D(bx) = 1, ~~ D(G(bz)) = 0$, and the expression above reaches its maximum $V(D,G) = 0$
- $G$ minimizes $V(D,G)$. For a perfect $G$, $D(G(bz)) = 1$, and the expression above reaches its minimum $V(D,G) = -infty$
- Pseudocode
#algo(title: [*Algorithm 1*: Pseudocode])[
+ *for* number of training iterations *do*
+ *for* $k$ steps *do*
+ Sample minibatch of m noise samples ${bz^((1)), . . . , bz^((m))}$ from noise prior pg(z).
+ Sample minibatch of m examples ${bx^((1)), . . . , bx^((m))}$ from data generating distribution $p_"data" (x)$.
+ Update the discriminator by ascending its stochastic gradient:
- $ nabla_(th_d) 1 / m sum_(i=1)^m [log D(bx^((i))) + log (1 - D(G(bz^((i)))))] $
+ *end for*
+ Sample minibatch of m noise samples ${bz^((1)), . . . , bz^((m))}$ from noise prior $p_g(bz)$.
+ Update the generator by descending its stochastic gradient:
- $ nabla_(th_d) 1 / m sum_(i=1)^m log (1 - D(G(bz^((i))))) $
+ *end for*
+ The gradient-based updates can use any standard gradient-based learning rule. We used momentum in our experiments.
]
- GAN convergence is particularly unstable, because $G$ and $D$ must be kept evenly matched; this became the improvement direction of much follow-up work
  - Another small issue: early on, the discriminator $D$ easily becomes too strong, so that $log (1 - D(G(bz))) == 0$ and $G$'s gradient vanishes and it cannot learn; $G$'s objective can temporarily be changed to $max_G log D(G(bz))$
- Theoretical proofs
  + With $G$ fixed, the optimal $D$ is $D(x) = frac(p_"data" (bx), p_"data" (bx) + p_g (bx))$
  + $G$ reaches the global optimum if and only if $p_g = p_"data"$ (proved using KL divergence)
  + If $G$ and $D$ have enough capacity, and every iteration of the algorithm brings $D$ to its current optimum (which in practice is not guaranteed, as only $k$ optimization steps are taken), then $G$ is guaranteed to converge to the optimum (proved using functional analysis)
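These theoretical results can be sanity-checked numerically on a toy discrete distribution (an added illustration, not from the original notes): pointwise the objective $a log y + b log(1-y)$ is maximized at $y = a\/(a+b)$, and with $p_g = p_"data"$ the value at the optimal discriminator is $-log 4$.

```python
import math

p_data = [0.2, 0.5, 0.3]  # toy distribution over three outcomes
p_g    = [0.2, 0.5, 0.3]  # generator already matches the data

def value(d, p_data, p_g):
    # V(D, G) = E_{x~p_data}[log D(x)] + E_{x~p_g}[log(1 - D(x))]
    return sum(a * math.log(d[i]) + b * math.log(1 - d[i])
               for i, (a, b) in enumerate(zip(p_data, p_g)))

# Optimal discriminator for fixed G: D*(x) = p_data(x) / (p_data(x) + p_g(x))
d_star = [a / (a + b) for a, b in zip(p_data, p_g)]
best = value(d_star, p_data, p_g)

# No small perturbation of D* should increase V.
for i in range(3):
    for eps in (-0.01, 0.01):
        d = list(d_star)
        d[i] += eps
        assert value(d, p_data, p_g) <= best

print(best, -math.log(4))  # equal when p_g == p_data
```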
== Experiments & Evaluation
- The experimental results are, relatively speaking, not that good, but this also left many opportunities for later researchers
- Strengths and weaknesses
  - The authors claim an advantage is that the generator never looks at the training data directly and can therefore produce fairly sharp edges (?), but later people found this is not the case
  - The drawback is that training is difficult and convergence is poor
- Future work
  + conditional GAN
  + the learned approximate inference can be distilled with an arbitrary model
  + by training a family of conditional models with shared parameters, all conditionals can be approximately modeled (?)
  + semi-supervised learning
  + efficiency improvements
- Li Mu's comments
  + Unsupervised learning, no labeled data needed. Labels and data come from real samples + generator fitting
  + Uses $D$ to train an unsupervised task with a supervised loss, which is relatively efficient; also an inspiration for self-supervised learning, i.e. BERT
|
|
https://github.com/TypstApp-team/typst | https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/layout/pad.typ | typst | Apache License 2.0 | // Test the `pad` function.
---
// Use for indentation.
#pad(left: 10pt, [Indented!])
// All sides together.
#set rect(inset: 0pt)
#rect(fill: conifer,
pad(10pt, right: 20pt,
rect(width: 20pt, height: 20pt, fill: rgb("eb5278"))
)
)
Hi #box(pad(left: 10pt)[A]) there
---
// Pad can grow.
#pad(left: 10pt, right: 10pt)[PL #h(1fr) PR]
---
// Test that the pad element doesn't consume the whole region.
#set page(height: 6cm)
#align(left)[Before]
#pad(10pt, image("/files/tiger.jpg"))
#align(right)[After]
---
// Test that padding adding up to 100% does not panic.
#pad(50%)[]
|
https://github.com/ukihot/igonna | https://raw.githubusercontent.com/ukihot/igonna/main/articles/algo/dp.typ | == Maximum sum problem
== Knapsack problem
== Subset sum problem
== Minimum-count subset sum problem
== Longest common subsequence problem (Longest Common Subsequence)
== Levenshtein distance |
|
https://github.com/danilasar/conspectuses-3sem | https://raw.githubusercontent.com/danilasar/conspectuses-3sem/master/Дискра/240912.typ | = Binary relations
$A, B eq.not emptyset$
The Cartesian product is
$A times B = {(a, b) | a in A and b in B}$
$S subset.eq A times B$
$] A, B eq.not emptyset$; then a *binary relation* is a subset of their Cartesian product.
$emptyset$ --- the empty binary relation
/ Cartesian product: the universal binary relation between A and B
$Rho(A times B)$
$|A| = n, |B| = m => |Rho(A times B)| = 2^(n m) $
$(a, b) in rho$ --- the pair belongs to the binary relation $rho$.\
$a rho b <=> (a, b) in rho$
$rho subset.eq A times A$ --- a binary relation on the set $A$.
$Rho(A times A)$ --- the set of all binary relations, and $|Rho(A times A)| = 2^(n^2)$, where $n = |A|$.
Consider $NN$:
$(a, b) in = <=> a = b$
$(a, b) in <= <=> a <= b$
$(a, b) in | <=> a | b$ ($a$ is a divisor of $b$)
$(a, b) in eq.triple_alpha <=> a eq.triple_alpha b$
== Operations on binary relations
$A, B != emptyset$
$rho, sigma subset.eq A times B$
+ $overline(rho) = {(a, b) in A times B | (a, b) in.not rho}$
+ $rho union sigma = {(a, b) in A times B | (a, b) in rho or (a, b) in sigma}$
+ $rho sect sigma = {(a, b) in A times B | (a, b) in rho and (a, b) in sigma}$
+ $rho^(-1) = {(b, a) in B times A | (a, b) in rho}$
+ $rho subset.eq A times B, sigma subset.eq B times C$ \ $rho circle.small sigma = {(a, c) in A times C | exists_(b in B) (a, b) in rho and (b, c) in sigma}$
$A = {a, b, c}, B = { 1, 2 }, C = { +, -, !, ?}$
$rho = {(a, 2), (b, 1), (b, 2), (c, 1)}$ \
$sigma = {(a, 1), (b, 1), (c, 2)}$ \
$tau = {(1, +), (1, !), (2, ?)}$
$overline(rho) = {(a, 1), (c, 2)}$ \
$sigma = A times B$
=== Associativity of composition
$rho circle.small (sigma circle.small tau) = ... $
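A small executable sketch of composition and its associativity (an added illustration; `rho` and `tau` follow the example above, while `upsilon` is a made-up third relation needed to check the law):

```python
def compose(r, s):
    # (a, c) is in r . s  iff  there exists b with (a, b) in r and (b, c) in s
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

rho = {("a", 2), ("b", 1), ("b", 2), ("c", 1)}  # relation A -> B
tau = {(1, "+"), (1, "!"), (2, "?")}            # relation B -> C
upsilon = {("+", 0), ("!", 1)}                  # made-up relation C -> D

rho_tau = compose(rho, tau)
print(sorted(rho_tau))

# Associativity: (rho . tau) . upsilon == rho . (tau . upsilon)
assert compose(rho_tau, upsilon) == compose(rho, compose(tau, upsilon))
```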
== Ways of specifying binary relations between finite sets
$] A eq.not emptyset, rho subset.eq A times B, A = {a_1, dots, a_n}, B = {b_1, dots, b_n}$
$L$ with a small arrow on top (to be clarified) --- a directed graph.
$A union B$ --- the vertices of the directed graph. The edges go from vertices of the set A to vertices of the set B.
#image("image.png")
$rho = {(a, 1), (b, 1), (b, 2), (c, 1)}$
#image(width: 50%, "Снимок экрана от 2024-09-12 11-16-16.png")
$M, N in M_(...)$
$M(rho)$ --- the matrix
$(M(rho))_(i j) = cases(1\, text("if") (a_i, b_j) in rho, 0\, text("if") (a_i, b_j) in.not rho)$ |
|
https://github.com/Myriad-Dreamin/typst.ts | https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/visualize/shape-square_01.typ | typst | Apache License 2.0 |
#import "/contrib/templates/std-tests/preset.typ": *
#show: test-page
// Test auto-sized square.
#square(fill: eastern)[
#set text(fill: white, weight: "bold")
Typst
]
|
https://github.com/Jeomhps/datify | https://raw.githubusercontent.com/Jeomhps/datify/main/src/translations.typ | typst | MIT License | #import "utils.typ": first-letter-to-upper
#import "config.typ": default-date-lang
#let day-names = toml("translations/day_name.toml")
#let month-names = toml("translations/month_name.toml")
#let day-name = (weekday, ..args) => {
let lang = default-date-lang
let upper = false
for arg in args.pos() {
if type(arg) == "string" {
lang = arg
} else if type(arg) == "boolean" {
upper = arg
}
}
let weekday-to-str = str(weekday)
let name = day-names.at(lang).at(weekday-to-str)
if upper {
return first-letter-to-upper(name)
} else {
return name
}
}
#let month-name = (month, ..args) => {
let lang = default-date-lang
let upper = false
for arg in args.pos() {
if type(arg) == "string" {
lang = arg
} else if type(arg) == "boolean" {
upper = arg
}
}
let month-to-str = str(month)
let name = month-names.at(lang).at(month-to-str)
if upper {
return first-letter-to-upper(name)
} else {
return name
}
}
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/fletcher/0.5.2/src/deps.typ | typst | Apache License 2.0 | #import "@preview/cetz:0.3.1"
|
https://github.com/typst/packages | https://raw.githubusercontent.com/typst/packages/main/packages/preview/rivet/0.1.0/CHANGELOG.md | markdown | Apache License 2.0 | # Changelog
## [v0.1.0] - 2024-10-02
- prepared for publication in Typst Universe
## [v0.0.2] - 2024-06-15
### Added
- `width` parameter to `schema.render` for easier integration
- `all-bit-i` config option
- colored ranges
- format specification in the manual
## [v0.0.1] - 2024-05-19
- initial version
- ported all features from the [python package](https://git.kb28.ch/HEL/rivet/)
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.