Dataset schema (one record per repository file below):
- repo: string, 26–115 characters (repository URL)
- file: string, 54–212 characters (raw file URL)
- language: string, 2 distinct values
- license: string, 16 distinct values
- content: string, 19–1.07M characters (file contents)
https://github.com/yhtq/Notes
https://raw.githubusercontent.com/yhtq/Notes/main/计算机网络/第一节.typ
typst
#import "../template.typ": * #show: note.with( title: "计算机网络", author: "YHTQ", date: none, logo: none, ) #let chapter1 = [ = 前言 - 课程考核 :mooc (15%,通过即满)+4次lab取3次最高分(各10%)+期末考试(55%) - Lab:https://n2sys-edu.github.io/Lab-Frontend-Pub/ - 不点名 #set heading(outlined: false) == 通信网络:结点(Node)+链路(Link) - 在计算机诞生前便有许多通讯网络 - 计算机网络: - 结点:个人计算机、服务器、手机、路由器、交换机..... - 链路:网络、光纤、无限媒介 - 传输对象:01比特承载的数字信息 - 计算机网络技术的发展非常快,需要底层通讯技术、计算系统、应用支撑的技术支持 == 发展历史 + 早期阶段:1983年前,早期互联网技术,美苏争霸中的科技竞争促使计算机网络技术研究的开始。最早只有几个参与研究的大学连接在这个网络中。首次创新性的采用分组交换技术(分布式架构,各个节点之间完全对等,保证了互联网结构安全性),有别于电话网络。(苏联早期多次考虑过建立国家统一网络管理国家经济,有许多超前性设计,但最终无法完成) + 80年代中期:Internet将多个小型网络连接成全球性网络。1983年,TCP/IP协议成为Internet的标准协议,标志现代互联网诞生。 + 90年代互联网经历了快速发展,包括万维网、Mosaic浏览器(网景)等等重大技术出现,90年代称为Web1.0 + 2000年互联网泡沫破灭,逐渐开始低调发展,2000-2007年称为Web2.0。2007年3G+iPhone标志着移动互联的开始 + 2007至今 移动互联网时代,网络发展为云计算、大数据提供机遇 从发展历程来看,早期互联网以下载为主,如今用户本身开始产生大量数据,对上行带宽提出了更高的需求 == 今天的互联网:Internet Internet 实际上是由大量子网络分级组成的。 一般而言,Internet 是专有名词,是现如今连接全球的互联网。internet 泛指一类将小网络连接称大网络的网络类型 == 互联网存在的问题 + 网络安全与威胁 + 网络带宽的增长跟不上需求增长(传输大量数据时因特网带宽甚至不如卡车) + 网络管理混乱:设备数量庞大,功能复杂,许多问题难以复现与测试 未来网络需要大量新兴技术,包括网络可编程,分布式计算系统,自动化验证技术,卫星通信等等。 我国的互联网技术:底层通信强,上层互联网应用强,中间网络体系结构薄弱。 == 标准化组织与学术会议 ISO(国际标准化组织), ITU(国际电信联盟), IEEE(国际电气和电子工程师协会), WIFI联盟, 万维网联盟 ISOC(互联网协会), IAB(互联网体系结构委员会) 学术会议: ACM SIGCOMM, USENIX NSDI, IEEE INFOCOM, ACM MOBICOM(移动互联网) #set heading(outlined: true) = 计算机网络的内容和分层架构 - 基本内容:信息传输 - 终极目标:更快、更稳定、更好用 - 构成:个域网、局域网、城域网、广域网 - 今天的网络:网络的网络 - ISP: Internet Service Provider / 网络服务商 - _ 注:网络传输中习惯使用 $10^3$ 为进制,以比特(bit / b)为最小单位 _ == 对等实体:识别,命名,组织形式 - 网络边缘:端系统 - 主机(Host):桌面计算机,服务器,智能设备...... - 功能:运行应用程序,从网络接收数据并提供给应用程序,将用户产生的数据发给接入网 - 内部架构 - 网络设备硬件:网络适配器(网卡) - 操作系统内核:网卡内核驱动,内核协议栈(与网卡无关) - 用户软件:调用操作系统的接口(socket)进行数据发送、接收 - 命名(确切来说是网卡的命名,以下都是以单个网卡为单位的)\ + 唯一的设备ID(48位MAC地址),全球唯一,不可修改。(虚拟机中的虚拟网卡可以修改mac地址) + IP地址:一组数字,可以根据需要随时配置的地址,方便管理 + 主机名:字符串,便于记忆 - 接入网:通信线路 - 目的:将主机连接到边缘路由器 - 边缘路由器:主机去往任何其他远程端设备经过的第一台路由器 - 通过路由器统一各种异构网络 - 各种物理介质 - 引导型介质(有线介质):信号在固体介质中传输 - 双绞线(通常所谓的网线):多根互相绝缘铜线,电话线用一对双绞线,网线用8对;广泛用于计算机网络(以太网)双向传输 - 同轴电缆(逐渐被淘汰) - 光纤:玻璃纤维携带光脉冲,高速,误码率低 - 非引导性介质(无线介质):无具体导体 - 无线电:电磁频谱中各波段携带信号,不依赖介质的广播 - 半双工:两个方向都能传输,但不能同时 - 容易受到各种形式的干扰:反射、物体阻挡、干扰、噪声 - 链路类型: + WiFi(无线局域网),10米量级 + 广域(3/4/5G),10公里范围 + 蓝牙:短距离,有限速率 + 地面微波:点对点,45Mbps + 卫星 - 同步卫星:往返时延大 - 低轨卫星:往返时延小,但需要大量卫星以及卫星间通讯 - 接入方式 + 数字用户线DSL:使用电话线连接到数字用户线接入复用器(DSLAM)\ 网络数据经过调制解调器和分路器和电话线合并,现代DSL可以同时传输语音和网络数据 + 同轴电缆:使用传统有线电视接入头端上网,类似于DSL + 光纤:带宽大、线路稳定。 - 我国由于发展较晚,互联网基建建设时大量使用光纤,因此我国的互联网基础设施较好 - 分为有源AON和无源PON两类,也就是每个用户独立线路和用户共享线路 + 无线接入:通过基站(“接入点”)再通过有线方式接入路由器 - 实际家庭网络往往采用有线、无线技术混合 - 企业可以通过交换机和专线直接连接到ISP - 网络核心:交换设备 - 目标:连接各个端系统 - 由各类路由器、交换机组成 - 数据传输方式:分组交换(包交换) - 主机将数据分成分组(Package),发送到网络 - 每个分组独立的从一个路由器转发到下一个路由器,逐跳传输到目的地 - 采用 *存储转发机制:一个分组必须被路由器完整收到再发送出去,本质上会带来额外延迟但更加稳定* - 每个分组的头部都带有控制信息 支持灵活的统计多路复用:通过灵活按需分配带宽支撑多个主机 - 核心功能: - 路由:一种全局操作,通过源和目标计算数据分组传输的最优路径 - 转发:本地操作,路由器或交换机将接收到的数据分组转发出去 - 瞬时拥塞:\ 当两个数据包同时到达时,由于存储转发机制可能造成一个数据包被阻塞,从而造成瞬时拥塞。如果两方数据流量很大,一方可能被阻塞很长时间,造成严重的拥塞。 - 另一种选择:电路交换 先呼叫建立连接确定资源预留,再传输数据(此时独占资源),传输完成后释放资源。打电话时大部分时候采用电路交换。\ 由于因特网中的数据流量很难预测(存在大量突发流量),电路交换不适用于因特网。 - 事实上两种交换方式都存在很多问题,因此现代网络仍然在探索更先进的交换方式 - 端系统通过网络供应商接入Internet,不同网络供应商之间也要相互连接。 - 网络供应商之间并不是全连接关系。事实上采用的途径是网络供应商连接到几个全局供应商,全局供应商之间再相互连接(连接节点称为 Internet exchange point) - 在一些国家,成为 ISP 的门槛很低,因此还出现了一些区域性 ISP 介于全局供应商和网络供应商之间 - 由此供应商之间的连接关系形成了一个层次化的结构。Tier 1 的供应商互不结算,包括中国电信,中国联通等。Tier 2 的供应商与 Tier 1 的供应商结算,包括中国移动,教育网等。Tier 3 的供应商与 Tier 2 的供应商结算,包括一些区域性 ISP。 - 还有一种网络服务:CDN,通过将内容分发到离用户更近的地方,提高用户访问速度。CDN 服务商与 ISP 之间也需要相互连接。 == 服务:传输接口,服务性能,可靠性 - 服务类型 - 面向连接服务:在数据传输前需要建立连接,如电话 - 每个请求都需要等待答复 - 流程: + 发送请求 + 接受答复 + 发送数据 
+ 应答 + 请求断开连接 + 断开连接 - 无连接服务:数据传输前不需要建立连接,如邮件 - 性能指标: - 吞吐量类: + 带宽:网络某通道传输数据的能力,单位为 bit/s,缩写为bps + 包转发率:以包为单位的某网络设备的交换能力 线速转发:在设备最大负载下,设备能够转发数据包的最大速率(显然包越大处理单个包的额外负载越小) + 比特率:单位时间内主机向信道发送的比特数,单位为 bps + 吞吐量:单位时间内通过某节点的数据量\ 由于各种原因,主机不一定可以占满介质的传输能力,因此吞吐量一般小于比特率 + 有效吞吐量:单位时间内接收方正确接收的信息量 + 利用率:吞吐量 / 带宽 + 丢包率:单位时间内丢失的数据包占总发送数据包的比例 - 延迟类: - 时延/延迟:数据从网络的一段传输到另一端所花的时间,分为以下四种的和 - 传输/发送延迟:数据从结点进入到传输媒体所需要的时间 - 传播延迟:电磁波在信道中需要传播一定距离而花费的时间 - 处理延迟:主机或路由器在收到分组时,为处理分组(例如分析首部、提取数据、差错检验或查找路由)所花费的时间 - 排队延迟:分包在路由器的输出队列中等待传输所花费的时间 延迟模型:\ 传输时延 $->$ 传播时延 $->$ 排队时延 $->$ 处理时延 $->$ 传输时延 $->$ ...... - 量级: - 传输时延 + 物理时延:专用硬件可达微秒到纳秒级 - 传播时延:与距离有关,光纤中等效约为 $2\/3$ 光速,微波中等效约为光速 - 排队时延难以估计,严重时可达秒级。可以通过排队论进行分析:$"平均队列长度" = "平均到达速率" * "平均排队时延"$ - 有的场景下(例如高频量化交易)对延迟极度敏感,因此需要对延迟进行极端优化。例如建立专线消除排队时延,设计专有硬件芯片和操作系统模块消除处理和传输时延,使用微波(不需要反射)在晴天时达到更小的传输时延,甚至建设中继站点等等。 - 有时我们也会关心双向延迟,也就是一来一回总计的延迟,称为往返延迟 (Round-Trip Time,RTT)。ping指令得到的延迟便是往返延迟。 - 时延抖动:时延的变化程度称为时延抖动,在语音视频等服务中往往会严重降低用户体验 - 延迟丢包:在网络游戏等服务中,由于时延过大导致到达时已经无用的数据包称为延迟丢包 - 时延带宽积:传播时延 $times$ 带宽,表示在传输过程中介质的容积。只有链路完全充满链路才能达到时延带宽积所表示的传输能力。 - 其他指标: - 可靠性:发送的每个消息,接收方收到一次且仅收到一次 - 完整性:发送的数据无法被篡改 - 隐私性:发送的数据不被第三方截获,有时也包含发送方身份不被暴露 - 可审计性(accountability):可追溯用户的传输行为。由于开销过于巨大往往难以实现 == 协议:传输内容的格式、语义、顺序 - 协议:在两个或多个通信实体之间交换信息的规则 \ 三要素: - 语法:规定传输数据的格式 - 语义:规定传输数据的含义 - 时序:规定各种操作的顺序 - 协议封装:将数据分为头部和载荷,其中头部包含协议的控制信息,载荷包含用户所发送的数据 - 常用的互联网协议(数据巨大) - IP / TCP / UDP / HTTP / FTP / SSH / IMTP / POP - WAP: Wireless Application Protocol - SIP: Session Initiation Protocol - PPP: Point-to-Point Protocol - IPX: Internetwork Packet Exchange - HIP: Host Identity Protocol (HIP) - IGMP: Internet Group Management Protocol, multicast - ICMP: Internet Control Message Protocol, e.g., ping, traceroute - RTP: Real-time Transport Protocol - PIM: Protocol-Independent Multicast - RED: Random early detection - RTCP: RTP Control Protocol (RTCP) - RIP: Routing Information Protocol - SMTP: Simple Mail Transfer Protocol - RTSP: Real Time Streaming Protocol - BFD: Bidirectional Forwarding Detection - CIDR: Classless Inter-Domain Routing - NNTP: Network News Transfer Protocol - STUN: Session Traversal Utilities for NAT - VTP: VLAN Trunk Protocol - POP: Post Office Protocol - LISP: Locator/ID Separation Protocol - TFTP: Trivial File Transfer Protocol - LDP: Label Distribution Protocol == 实现与管理:“功能.协议$->$实体”,资源分配与调度 - 计算机网络的学习内容:网络尸体如何通过各种协议实现各种网络功能和服务 == 计算机网络的分层架构 - 计算机网络是一个高度复杂的系统,通过分层架构解决系统的复杂性,层与层之间通过接口进行交互 - 分层的问题:引入额外开销,分层间可能隐藏了重要信息 - 计算机网络的经典分层 - OSI模型(从上至下): + 应用层(Application Layer):通过应用层协议,提供应用程序便捷的网络服务调用 + 表示层(Presentation Layer)(基本弃用) + 会话层(Session Layer)(基本弃用) + 传输层(Transport Layer):将数据从源端口发送到目的端口(端对端管道,忽略了中间的网络转发) + 网络层(Network Layer):将数据包跨越网络从源设备发送到目的设备 + 数据链路层(Data Link Layer):实现相邻网络实体间的数据传输 + 物理层(Physical Layer):信息物理传输 == TCP/IP 参考模型 后于 TCP/IP 协议出现的技术 + 应用层(HTTP,域名,DNS,P2P......) + 传输层(TCP/UDP/IP,套接字......) 
+ 互联网层 + 网络接口层 采用无连接技术 == 模型比较与问题 - OSI:分层的实际实现糟糕,以至于从来没有真正被实现过,较为失败的技术。 - TCP/IP:只基于TCP/IP协议,不足以用于新兴协议。 - 有时双方的术语经常混用,不会明确区分(比如网络接口层称为数据链路层和物理层) 本课程大体采用 TCP/IP 参考模型,但不局限于 TCP/IP 协议 == 协议封装 每一层都有自己需要的头部,因此真正的数据是层层封装的,下层的头在上层看来也是数据 == 网络分层的实现 - 端系统:实现全部功能 - 边缘网和网络核心:第一层到第三层 - 交换机:前两层,物理层和数据链路层,只能正确处理单跳传输 - 路由器:交换机 + 网络层 - 注:这是传统意义上的定义,现代交换机和路由器几乎已经不做区分 - 特征 - 复杂功能由端系统实现,聪明终端 + 简单网络,大大提升可拓展性 - 以 IP 协议为核心:IP on everything, everything on IP - 局限性 - IP 协议难以升级 - _资源管理依赖端系统,网络核心只能进行粗粒度流量,而端系统之间又缺乏协同(排队延迟严重的原因之一)_ = 网络应用层 事实上同一主机的程序间也可进行网络通信,但本门课程主要介绍不同主机间通讯 - 应用层对传输层希望提供的服务: + 可靠传输 + 高吞吐 + 低时延 + 安全 - 实际传输层提供的服务 + TCP - 面向连接 - 可靠传输:丢包重传 - 有序传输 - 流量控制 - 拥塞控制 - 延迟、吞吐量、安全均无法保证 + UDP - 无连接 - 不可靠传输 - 无法保证其他所有要求 - 但由于协议简单,性能较 TCP 更好 - 网络应用层的好处 - 屏蔽底层细节:网络设备往往不在乎应用层逻辑 - 抽象:许多网络应用有相同的通信模式,网络应用层封装了这些共同模式 - 提供额外功能,如安全性 例:TCP + SSL - TCP 明码传输,但 SSL 提供加密服务,保证隐私与数据完整 == 端口 由于主机上往往有多个应用层的实体(应用程序),因此通过端口的方式区分彼此 == 客户端/服务器模式 - 客户端向服务器发出请求 - 服务器被动响应,必须始终在线并有固定IP地址与端口 - 面向连接和无连接均可,面向连接模式的通讯是双向的 - 服务器进程可以采用循环模式(也叫阻塞模式,多个请求时依请求的先后顺序依次响应)或者并发模式(也叫非阻塞模式,运行多个服务器进程,同时服务多个客户端) - 通常而言,不会使用无连接的并发模式,因为无连接UDP只有一个套接字,无法被多个从属进程同时访问 - 特例: 浏览器/服务器模式 == P2P 模式 - 任何两个个体对等,多个主机间进行平等、对等的通信 - 可以看作每个个体既是服务器也是客户端 - 每个实体都不需要同时在线,可以随时进出,都贡献并收到一部分资源 核心问题:索引,可能方法: + 中心化索引 + 洪泛请求 + 混合索引 + 分布式哈希表 == 应用层协议 - 开放协议:所有使用开放协议的程序可以互操作,如 HTTP, DNS - 私有协议:厂商内部自行规定的协议,如微信、QQ == web 与 HTTP 协议 - www = world wide web (万维网) 由通过 urls 定位的 web 对象,http 服务器和客户端,服务器与客户端对话的 http 协议组成 - url(统一资源定位符) 协议类型:\//主机名:端口\//路径和文件名 - web 对象 - 静态对象和静态网页:多媒体内容,排版信息......(HTML,XML等标记语言) - 动态对象与动态网页:脚本,数据库...... - CGI:通用网关接口 - 一种在服务器创建动态文档的标准,定义了动态文档如何创建、输入输出结果如何交互等信息 - 常见使用方式是通过环境变量传给 CGI 进程,再把 stdout 的结果返回客户端 == HTTP 协议 - 在传输层通常使用 TCP 80端口(HTTP 3.0改为 UDP) - 无状态协议,服务器不保留状态 - HTTPS:HTTP + TLS === HTTP 1.0 - 用户输入 URL - 若该页面包含两幅 Jpg 图像,则需执行三次完整的 TCP 连接(建立连接,数据传输,连接) === HTTP 1.1 - 允许持久化连接 - 之后允许 pipeline,一次发送多个请求,按顺序响应 === HTTP 2 - 多路复用,可以同时多任务交错,可以自定义优先级 - 服务器端可以主动推送 - 数据压缩 - 允许应用进行流量控制 === HTTP 3.0 - 主要是将 TCP 换为 UDP + QUIC === HTTP 报文 - 方法字段:最常用 get post - URL,使用 get 方法时可以带参数,往往有长度限制。使用 post 方法时参数放在实体主体中,此时参数可以加密 - 响应报文会多一个状态码,均为三位数字 - 1xx 表示通知信息 - 2xx 成功 - 3xx 表示重定向 - 4xx 客户端发生差错,如请求中有方法错误或不能完成 - 5xx 服务器端差错 === wireshark 网络监听工具,检测某个网卡的所有网络活动 === HTTP 优化 浏览器缓存:在浏览器主机保留用户访问过的服务器web副本\ 代理服务器缓存:ISP缓存ISP客户访问过的服务器副本,所有用户都可使用。\ 显然缓存技术需要检查网页是否过期,如果过期就返回新的对象。\ 现在往往使用询问式策略,在发送请求时给出指定缓存的时间 === cookie http 协议无状态,用 cookie 将服务器和客户端的状态相互对应。 但 cookie 技术也会造成一些隐私泄露问题 == 电子邮件系统 - 用户代理(邮件客户端) - 编辑发送邮件 - 接收、读取和管理邮件 - 管理地址 - 无统一标准 - 传输代理(邮件服务器) - 邮件在邮件服务器之间会通过 SMTP 协议发送。 - 发送邮件时可以使用SMTP协议,接受时由于不能保证对方开机因此需要其他协议 - SMTP协议:定义如何传输邮件,简单,无任何身份验证,无任何加密,传输ASCII码,效率较低 - POP3, Post Office Protocol-version3/ IMAP, Internet Message Access Protocol,两种邮件访问协议,需要处理客户端不在线的问题 - POP3:用户登录认证后,与服务器建立连接,收取后服务器更新(删除已收到的邮件) - IMAP:POP3 的改进版,允许保存、管理邮件 - 两者都是基于 TCP 的应用层协议 邮箱:邮件服务器上一段内存区域所保存的一个地址 == 域名服务 === DNS 简介 - 现如今的域名服务完全是分布式的: - 域名映射记录多达数十亿条 - 每天万亿级别的查询 - 读操作远大于写操作 - 对延迟敏感 - 域名服务以树结构组织: - 顶级域名:com, net, cn,... - 一般分三类:国家及地区,互联网基础设施(ip6.arpa...),通用顶级域名 - 二级域名:... 
- 域名服务器负责域名解析工作。如果没有相应结果,会向相邻的域名服务器查询,直到找到为止(这个过程与用户无关) - 根服务器:不负责直接查询,负责将用户的查询分发到顶级域名服务器 - 二级域名服务器:将一定范围内的网络设备分为区和域 - 本地 DNS 服务器:本地 ISP 提供的 DNS 服务器,离用户最近。 === DNS 解析过程 - 习惯上 UDP 53端口 - 域名查询往往分为递归查询和迭代查询。往往本地 DNS 服务器采用递归查询,向更上级查询时习惯使用迭代查询。(请求方有一定自主权) + 递归查询:当收到一条无法处理的查询时,该域名服务器以客户身份向上级查询,直到得到结果(先向根服务器查询顶级域名,再依次向顶级域名服务器查询二级域名,直到得到结果) + 迭代查询:当收到一条无法处理的查询时,该域名服务器将更上级的域名服务器返回客户端从而降低根服务器负载 - DNS 报文: - 事务 ID :一个随机数,用于标识查询 - 问题计数:DNS 查询的数目(可以同时多次请求,现实中基本只进行依次) - 回答计数:DNS 回答的数目 - RD:递归查询标志,1 为(若服务器端允许)则递归查询,若为 0 且服务器支持则取决于递归结果是否得到权威应答。 - RA: - Reply code:返回码,表示响应的差错状态 - 问题部分 - 查询名:一般是域名,有时也可反向查询 - 查询类型:通常查询为 A,即查询域名对应的 IP 地址。通过 IP 查询域名的类型为 - 资源部分: - 回答问题区域字段 - 权威域名服务器区域字段 - 附加信息字段 都使用资源记录格式(变长数组,自己记录自己的长度) === 优化: 缓存:DNS 服务器会缓存查询结果,以提高查询效率\ 安全:原始的 DNS 服务没有任何加密和认证。 == P2P 每个实体都是一个对等结点,每个节点之间理论上可以互相连接,可以随时加入或退出。\ 理论上,这种模式可以充分利用带宽和计算资源。例如分发文件,单一服务器的带宽压力很大,而 P2P 模式下可以充分利用所有节点的带宽。\ 但是主要问题在于资源索引,也即给定资源,找到一个可用的提供资源的节点。 === 可行的索引方式 + 中心化索引:有一个中心化服务器负责检索 可能问题: - 单点故障 - 性能瓶颈 + 洪泛请求:每个对等节点自己建立自己的索引,对等节点之间互相连接。互相连接的节点构成一个应用层面的图(一般将这种图称为 overlay 图),遍历所有节点找到所需资源 可能问题: - 需要遍历图,对每个节点的计算压力很大 + 混合索引:某些节点运算能力较强,记为超级节点。普通节点至少向超级节点连接,超级节点之间互相连接。普通节点向连接的超级节点寻求索引,超级节点之间洪泛请求。 === P2P 实例 - Gnutella 协议:完全去中心化,纯粹的文件分发,采用洪泛请求但是需要假设一些节点始终存活 - Bt 协议:不是纯 P2P 架构,所有正在交换某个文件的节点组成一个种子,依赖于中心化的跟踪器维护每个种子的信息 优化策略: - 优先选择少见的文件块下载 - 趋向于上传速度更多的节点,换言之贡献越多,下载越快 - 往往设计策略能够激励节点留下来 问题: - 下载速度不及预期 - peer 缺乏共享精神 - 恶意 peer 节点 - 版权纠纷 - 主流网站封锁 - skype 采用层次化的超级节点 - 区块链 频繁写操作,主要问题:如何决定写权力 == 流媒体 特点: - 任意方均读取写入 - 延迟敏感,数据量大,主要传输视频或音频 - 乱序数据完全失效 视频编码标准: - H.264 最广泛,兼容性好 - H.265 视频内容压缩更高,但硬件成本高且专利许可费高(提出时纳入了许多企业的专利,导致了很大的专利壁垒) - VP9、AV1 开源标准,性能也很好,但比较新使用较少 音频编码标准:AAC、MP3、Opus等 音视频封装容器格式: - MP4:兼容性好 - MKV:允许多音轨、多字幕 - FLV:早期由于跨平台常用,现逐渐淘汰 目标: - 在网络条件有限的情况下保证传输质量 === 媒体点播 - 浏览器通过 HTTP 从服务器下载并播放流媒体文件 - 发送端以恒定速率发送分包,但由于网络传输抖动,到达时速率非恒定。若到达即播放很容易发生卡顿 - 常用策略是缓存一定时间后恒定速率播放,以时延为代价消除抖动性。 === 主要的流媒体协议 - 网络层:IP/RSVP(流媒体专用) - 传输层:TCP、UDP、SCTP - 流媒体技术: + RTP:建立在 UDP 的基础上,只负责数据传输,不做任何处理和服务质量保证 + RTCP:与 RTP 配合使用,监事及反馈其服务质量,进行一些音视频同步等操作。这两个很少直接使用,往往作为基础 + RTSP:本身不传输数据,是多媒体播放控制协议,负责暂停、继续等操作,记录和传输用户状态。结合RTCP使用,延迟大但开销低,用于摄像头,物联网等 + RTMP:最早用于 FLASH,后来广泛使用。开销大,延迟低,支持复杂交互,广泛用于直播,实时通信等。 + RSVP 网络层协议,通常用于专网 + WebRTC:建立浏览器间点对点传输 + 基于 HTTP 的其他技术 - MPEG-DASH:支持各种协议的开放标准,动态自适应传输,将完整视频拆分为固定时长的片段,每个片段采用不同码率,客户端自适应选取不同码率进行下载 - HLS:APPLE 生态 - HDS:本来用于 FLASH,后逐步停止更新 === 进一步优化 使用一些传输友好的视频编码\ 适当放松丢包恢复\ 改进网络调度,进行边缘计算等 == CDN 服务 在互联网上,极少量网站拥有极大的流量,因此有了 CDN 来将互联网上的内容分发到不同位置,不同用户可以访问离自己更近的位置。\ CDN 服务部署在广泛的地理分布,广度和深度较大,同时位置也要经由分析最大化效率。 === 重定向 CDN 网络需要实现将用户请求调度到临近 CDN 服务器 - HTTP 重定向:服务提供者返回 CDN 清单,建议用户向新的位置发起请求 - DNS 重定向:通过 DNS 服务器辅助重定向 - 网页所有者直接重写页面,链接到 CDN 服务器 == telnet 早期的远程登陆服务,目标是解决异构计算机系统的差异性问题,尤其是对终端键入指令的解释 == TFTP 相比 FTP,协议更简单,建立在 UDP 之上,无目录操作。\ 利用接收方的每次确认来保证接收方确实收到,一段时间未收到确认则重传,接收方可能需要去重。\ == SNMP 主要用于管理计算机网络中的设备\ 指导思想:尽量简单。只定义了中心管理器进行的四类操作\ 操作方式: - 轮询:循环遍历所有设备 - 陷阱:允许设备主动产生操作,类似于操作系统中的异常。开销较小,但对管理员要求较高。(需要提前定义异常事件) 粒度粗糙,局限性很大。读操作只能获得一部分信息,写操作只能更改预先设置的一部分参数 = 网络传输层 传输层的服务以尽力而为为目标,将复杂逻辑交由应用层实现。\ 最经典的传输层协议即为 UDP 和 TCP == UDP 最简单的传输层协议,无连接,不可靠,无拥塞控制,无流量控制,无差错恢复,无时延保证,无安全保证,简单的错误检查。\ 发送单位为数据报文。 == TCP 可靠,面向连接,流量控制,拥塞控制,差错恢复,安全保证,复杂的错误检查。\ 发送单位为字节流。 == 套接字 无论 UDP 还是 TCP,都通过复用和分用,将套接字的数据交由网络设备,再由网络设备交给套接字。\ 传输层不处理分包等行为,交于更下层处理 === TCP 套接字 监听套接字:并不接收数据,只接收连接请求,一般只有一个,使用双方都知道的端口号 连接套接字:接收数据,一般有多个,与监听套接字共享端口号。此时必须使用额外机制区分不同的套接字 === UDP 套接字 无连接,因此只有一个套接字,使用双方都知道的端口号\ 可选用有限纠错机制,主要任务是端口号的分用复用。\ 校验机制:计算 checksum 一般实现时,发送方不设缓冲区,接收方设缓冲区。\ == 可靠传输 网络的可靠性是由端系统保证的,而不是网络设备。\ 网络设备有多种情况可以造成不可靠性,例如拥塞丢弃,不同路径导致后发先至。\ OSI 模型中希望链路层提供可靠传输保障,但实际上大量可靠性由传输层甚至应用层实现,部分较新的技术在链路层或者网络层实现可靠性。\ === 完美信道 
先从最简单情形开始,假设信道是完美的,不会丢包,不会出错等。此时可靠传输协议非常简单,只要忠实地发送数据接收数据即可。这样的协议有时称作乌托邦协议。 === 有错但不丢包 假设数据包传输过程中可能发生一些错误,这些错误可以被某种检错机制检查,但不会发生丢包问题。 思想方法:自动重传请求 - 发送方:发送数据包,等待确认, - 接收方:接收数据包,检查数据包是否有错,若有错则发送否定反馈,若无错则发送确认 - 发送方:若收到确认,则发送下一个数据包,若收到否定反馈,则重传数据包 实例: + rdt 2.0 协议:使用 NAK,ACK 作为否定、肯定反馈。这种协议称为停等协议,等到收到确认后才发送下一个数据包。 问题:NAK,ACK 本身也可能损坏\ 解决方法: - 接收方再次反馈,以此类推。可能陷入死循环 - 设计巧妙的校验技术,允许一定程度上恢复。可能带来额外的计算与传输开销 - 发送方无法确认接收方状态时,直接进行重传。此时接收方需要去重处理 现实中往往采用第三种情况,因为方案三只在发生错误时付出代价,而方案二则总是需要付出代价。在出错概率极高的网络中,有时使用类似方案二的技术更好。 + rdt 2.1:采用重传技术的改进。发送数据包时需要序号,接收端需要适当丢弃 - 由于 rdt 是停等协议,序号只需一个 bit 表明是否是之前数据的重传即可。 + rdt 2.2: 注意到采用此机制时,发送方准备接受确认时,只要没有确认信息完好(收不到反馈或者收到反馈有错),立刻重传即可。进而其实无需反馈有错,只需接收方在 ACK 中加入 seq,发送端收到回复时判断 seq 是否正确即可。 === 可能丢包或出错 这里不考虑乱序的问题。\ 在可能丢包的情境中,接收方可能不知道发送方是否发送了数据包,因此需要发送方自行处理。\ + rdt 3.0 发送方采用超时重传,一段时间未收到确认就进行重传。此时也有接收方收到多次同一数据包的可能,因此需要接收方去重,但前面已经实现了去重机制。 在停等协议之中,大量的时间被浪费在等待回复上,这是不必要的,可以进行优化: === 流水线传输 多个数据包以流水线发送、接受。发送端依次发出,接收端接收后各自发送确认。每个等待确认的包都需要独特的序列号,发送方需要保存所有未被确认的数据包。此时若多个数据包出错,则每个数据包都需要重传。 实践上,往往规定最大未确认包数量为 $N$,这被称为滑动窗口机制。有了滑动窗口,便可复用已经被确认收到的序列号。 + 回退 $N$ 机制 - 接收方对多个连续数据包进行累计确认,非连续数据包直接跳过。 - 发送方维护计时器,从最古老计时器开始,若超时则重传该数据包及其后的所有数据包。 - 该机制下,接收方只确认自己接收到包的数量,发送方的未确认序列一定是连续的,随接收方发送确认向后推移。 + 选择重传机制 - 接收方对每个数据包独立确认 - 发送方对每个未确认数据包进行计数 - 发送方超时或接收到 ACK 错误时,进行单个包的重传 - 接收方接到乱序包时,对包进行缓存后重排 - 接收方和发送方都维护一个窗口,窗口大小为 $N$。发送方接收到 sendbase 的 ACK 时,更新 sendbase。接收方收到: - $($recvbase, recvbase+ N - 1\ $\]$ : 确认 n 并缓存 - recvbase : 确认 n,将一系列包交付应用,更新应用层 - [recvbase - N, recvbase - 1]:再次确认 n (不能省略,否则由于 ACK 发生错误发送方的窗口可能卡住) - 其他:丢弃(可以确认发送方窗口已经滑过,因而不管也行) - 窗口大小不能超过 seq 取值的一半 - 尽管机制复杂,确实仍有出错的可能:如果 pkg 1 延时过久才到达,可能会被认作新的 pkg 1。所幸实际使用中的 $N$ 往往非常大(TCP 使用 $2^31$),而网络中的包往往设计生命周期,因此这种情况发生的概率很小。 == TCP 协议 === 重要选项: + MSS 最大段长度:TCP 段的最大长度,一般为 1460 字节 + 窗口比例因子:窗口大小的倍数,实际接收窗口大小 = 窗口大小 $* 2^("窗口比例因子")$ === 可靠传输 - TCP 协议中以字节流为基础,每次发送的数据包都是一段长度的字节 - 基本机制与上面介绍的类似,采用流水线传输。不同的是它对字节建立序号,ACK 值给出下一个期望的字节序号,捎带在正常的数据包之中。 - 定时器只对最早未被确认的字节进行计时,但重发时也之重发最早未确认的报文段。 - 接收方采用累积确认,仅在按序正确收到报文段时向前推进 - 但接收方对于失序的报文段也会缓存,以便后续重排 + 一些优化 - 超时值的确认可以利用 RTT 估计(数据包从发送到接收再返回的时间),可以用滑窗估计平均 RTT 值 - 问题:ACK 可能是重传ACK,产生二义性。 - 解决方法:计算 RTT 时忽略那些发生重传的数据包 - 问题:发生重传的数据包往往是网络问题,一次到达的数据包往往是网络正常的情况,因此忽略重传数据包可能导致 RTT 估计偏小 - 解决方案:采用超时补偿,一旦超时发生就翻倍超时值,直到某个给定上限 - 实际使用中,在三次握手中估计初始 RTT,超时时进行补偿。 - 快速重传 在等待超时的过程中,我们与其闲置不如直接重传。\ 利用 ACK 机制,一旦受到重复的 ACK(规定为3次),发送方可以认为该包丢失,从而立即进行重传\ 事实上,现实中的大部分重传都发生在快速重传 - 接收方 理论上 TCP 协议并未规定接收方要进行缓存,但往往进行缓存\ 协议允许接收端推迟确认,也即接收若干报文段后进行累积确认。协议规定推迟确认时间最多 500ms,并且每隔一个报文段进行正常确认 === 连接管理 常规来想客户端服务器互相确认只需要两次握手,但实际上由于网络连接的不可靠性,只用两次握手可能会产生严重问题。 因此,TCP 协议采用三次握手: - 客户端先向服务器发送连接请求,同时发送初始的字节序号(synbit = 1,ackbit = 0,seq = x) - 服务器端发送初始字节序号的同时,返回确认的 ACK(synbit = 1,ackbit = 1,ack = x+1,seq = y) - 客户端收到后,进行最后一次确认回返 ACK,此时客户端也可以顺便传输信息(synbit = 1, ackbit = 1, ack = y+1, seq = x+1) 为了保证三次握手机制的正确,双方采用的序列号都必须在理想状态下对于不同的连接采用完全不同的序列号。实际上使用的是每 4ms 增加的 32 位数,基本可以满足需求。 实际上在现实的网络中,传输速度已经越来越快,如果在网络服务中每秒发送超过 256kb 的数据便有可能发生回环,因此可能还会有其他机制。 关闭连接时,采用 fin bit = 1。客户端和服务器四次握手,各自发送 fin,各自确认对方的 fin。双方发送完 fin 就不能再发送数据了,但还可以继续接受数据。\ 四次握手中,最后一个握手是不能被发送方确认的,因此会设计一个时延 timed_wait,等待对方是否重传。该时延往往设计为数据包在网络中最长寿命的二倍。 注:有的时候如果客户端发送 fin 后服务器端也没有额外数据,四次握手的中间两次可以合并。 异常处理方式: - 出现丢包时重传 - 对方若提前下线,另一端将会不断重试,重试若干次失败后,可以选择直接放弃或者发送 reset ,再次再放弃。 以上机制产生的一些安全隐患: - syn 洪泛攻击: 服务器端第一次收到 syn 时便会分配资源准备连接,一段时间(通常 30s - 120s)后未收到后续才会放弃。因此攻击方可能使用大量虚假的 ip 地址发送大量的 syn 却不确认,耗尽服务器端资源。 === 流量控制 TCP 协议实现时,接收方会将接收到的数据缓存下来,上层应用程序可以不立刻取走。 但由于不能确定应用程序何时取走(之前设计回退 n 和选择重传时,我们假定收到的数据立刻交付上一层,但 TCP 并不是这样) ,因此为了防止缓冲区被用满,必须设计一定流量控制机制。 事实上,接收方时刻计算自己缓冲区的剩余大小并告知对方,对方不得发送超过此大小的数据。\ 
特别的,如果剩余空间已经为零,对方必须停止发送。但是等到接收方有了空闲缓存,必须采用额外的机制告诉对方,这个额外机制称为零窗口通告。发送分收到对方已无空余缓存的信息时,必须不断发送零窗口探测报文段,询问对方是否有空余缓存。\ 零窗口探测报文采用计时器(称为坚持计时器),超时时便发送零窗口探测。 但是这样的机制未必高效,接收方可能对数据的处理极其缓慢,导致每次创造的可用窗口都极小,浪费网络资源,这被称为糊涂窗口综合症。 - 解决方法: + 接收方仅在窗口显著增加时才发送确认 + 接收方发现缓冲窗口耗尽时手动采用推迟确认 + 发送方积累足够多的数据再发送,但此时究竟累积多少并不好确认。例如使用 ssh 时,单次发送的数据量必然很小,此时再积累发送便会产生明显的时延。 ==== Nagle 算法 数据量大于一个 MSS 且窗口大小大于一个 MSS 时,发送方可以立刻发送数据。\ === 拥塞控制 拥塞控制是现代互联网非常重要的因素。拥塞控制的目的是管控数据的传输,防止超出网络的承载能力。\ 拥塞的原因: + 网络设备采取存储转发,一旦拥塞将会堆积大量数据包 + 一旦数据包没有按时到达,发送方会进行重传,进一步加重拥塞 早期的拥塞控制:在传输层中利用网络层提供的信息。这对网络层略显复杂,现代设计理念是尽量不让网络层做管理工作,交给端系统在传输层实现。 - 发送方任务: - 感知网络的拥塞 丢包意味着网络的不稳定,可以从丢包感知网络拥塞 - 设计机制管理发送速率 发送方使用拥塞窗口 cwnd 限制已发送未确认的数据量(实际使用中,它要与接收窗口的限制取最小值) - 采用合适的策略调节拥塞窗口 + 乘性减策略:检测到丢包时直接减半,迅速降低缓解拥塞 + 加性增策略:每经过一个 RTT(往返时间),增加一个 MSS,缓慢增加避免震荡 实际实现上,RTT 难以测量,但原则是类似的:丢包(三个重复 ACK 或超时)时快速减少,正常时缓慢增加。分为以下三个步骤: + 慢启动:新建连接时以较小的窗口启动,起始速度 = $"cwnd"/"RTT"$ 这里由于起始速度很慢,没有必要缓慢增加,因此通常采用指数增加。 理想状态下每个 RTT 增加,但是由于不好测量,实际实现上采取每个 ACK 增加的策略 + 拥塞避免:慢启动一段时间后,进入拥塞避免阶段 此时窗口已经很大,因此不宜使用指数增长,而是采用线性增长\ 区分慢启动与拥塞避免阶段利用阈值 ssthresh,当 cwnd < ssthresh 时,采用慢启动,否则采用拥塞避免。\ + 当 cwnd 持续增大,最终会发生丢包: - 若收到三次重复 ACK ,说明虽然丢包,但网络还有一定传输能力 - TCP Reno 中,进入快速恢复阶段:ssthresh 降低至 cwnd/2,cwnd 降低至 cwnd/2 + 3(MSS),采用新机制调节 cwnd: - 此时 cwnd 比 ssthresh 大3 - 若再次收到该 ACK,每收到一次将 cwnd 加一(注意到在 TCP 中,重复 ACK 说明中间有发生丢包,但是重复 ACK 越多说明丢包越少) - 若收到新的 ACK,将 cwnd 降低至 ssthresh,进入拥塞避免阶段 - TCP Tahoe 中,策略与下面超时情形相同,直接进入慢启动 - 若超时,说明网络已经没有传输能力 - TCP Reno 中,将 ssthresh 降低至 cwnd/2,cwnd 重置为慢启动初始值,重新慢启动 === TCP 的公平性 可以证明,TCP 利用的 AIMD 机制可以实现多个用户之间资源分配的公平性。严格来说,经过充分大的时间后,每个用户占有的带宽相等,且充分利用了总带宽。 === 新型拥塞控制 拥塞控制算法在过去的几十年中一直在发展,直到今日也是很热门的话题。主要的目标是提高网络的利用率,减少拥塞的发生。\ 现代大部分操作系统默认使用 cubic 作为拥塞控制算法。 核心问题: + 端系统如何感知拥塞 + 端系统如何应对拥塞 慢启动阶段优化:主要依靠优化各个参数,例如 cwnd 在现代 linux 中往往设为 10 个 MSS,而不是 1 个 MSS。\ ==== 拥塞控制优化 + TCP New Reno 在大约 1995 年形成标准,主要优化快速恢复阶段,同时丢多个包的问题。\ - 发送方若收到 3 个重复 ACK,触发快速重传。如果只丢一个包,下一个 ACK 应当确认所有已发送包,否则称为 partial ACK。 - 在 TCP Reno 中,收到 partial ACK 也会立刻进入拥塞避免阶段。 - 而在 TCP New Reno 中,收到 partial ACK 时,不会立刻进入拥塞避免阶段,而是继续快速恢复阶段,继续补发之前丢的包。直到收到新的 ACK,才进入拥塞避免阶段。 问题:每个 RTT 只能确认一个丢包 + SACK (选择重传) - 握手时,双方确认是否支持 SACK - 在 SACK 中,多个丢包将在头部中一次返回(返回一个未收到的左闭右开区间) + BIC 算法 核心思想:当时延带宽积较大时,Reno 的线性增长较为缓慢,不能充分利用资源。采用二分搜索的思想,快速确认合理的拥塞窗口。\ 具体思想: - 动态确认最大最小值 Wmax, Wmin。某次丢包时将 Wmax 调整至当前拥塞窗口大小,丢包后经过乘性减过程,成功收到重传确认报文,则将 Wmin 调整至当前拥塞窗口大小。 - 查找时,每经过一个 RTT,未发生丢包则将窗口重置为 Wmin 和 Wmax 的中间值,并更新 Wmax 为当前窗口大小。若丢包,按照上述过程调整 Wmax 和 Wmin。 - 为了防止意外,初始确认一个参数,Wmax 不能低于此值。 - 若拥塞窗口达到 Wmax 还未丢包,按照之前二分查找过程的增长曲线,镜像的逐渐增加 Wmax 优势:查找最优窗口的效率极高 问题:对 RTT 极为敏感,若两个连接的 RTT 相差较大,则会导致严重的不公平 + cubic 采用三次曲线函数连续化,而不是二分搜索。\ + TCP vegas 在丢包之前就利用 RTT 值作为信号 + TCP BBR 适用于广域网,充分考虑瓶颈链路的拥塞情况 + DCTCP 适用于数据中心。数据中心中数据往往会有低时延高吞吐突发流量大的特点,需要特殊优化。 特点: - 维持较低队列长度,采用标记策略使得丢包之前就感知拥塞 - 短队列长度降低了排队时延 - 精确调整窗口使得发送窗口变化平滑,不会吞吐量骤降 === TCP 存在的问题 + 大量策略在操作系统中决定,操作系统内核不能跟上网络技术的进步 + 当代互联网加密往往基于 TLS,握手延迟严重 + 队头阻塞,TCP 协议默认了包是有序的,但实际上许多场景下我们未必要保持有序,例如打开一个网页加载许多图片,一个图片丢包可能造成后面所有数据阻塞。 == 新型传输层协议 许多新型传输层协议的目标是在 UDP 的高效和 TCP 的稳定可靠上取折中 == DCCP 主要思想是在 UDP 上补充拥塞控制,从而可以利用在诸如网络游戏、流媒体之类容忍丢包但需要低时延的场景中。\ 在拥塞控制中,双方可能采取不同的拥塞控制机制 == MPTCP 高效利用现代端设备往往有多个网络设备的特点。 == QUIC 将可靠传输、拥塞控制、加密解密等机制于用户态实现,而不是内核态,摆脱操作系统的制约。 - 无队头阻塞的多流复用 - 明确的包序号和更精确的 RTT - 快速握手,加密 - IP 地址/端口切换无需重连,考虑移动端需求 - 便于部署更新:同时也会产生大量 QUIC 版本同时可用,需要客户端与服务器之间进行版本协商。 ] #chapter1
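The delay model above (transmission + propagation + queueing + processing, with fiber propagating at roughly 2/3 the speed of light) can be made concrete with a small computed sketch. All numbers below are hypothetical and chosen only for illustration:

```typ
// Illustrative sketch, hypothetical numbers only: one-hop delay ignoring the
// queueing and processing terms, with d_trans = L/R and d_prop = d/v.
#let packet-bits = 1500 * 8                // 1500-byte packet
#let link-rate = 100 * 1000 * 1000         // 100 Mbps link
#let distance-m = 100 * 1000               // 100 km of fiber
#let v = 200000000                         // ≈ 2/3 of the speed of light, in m/s
#let t-trans = packet-bits / link-rate     // 120 µs
#let t-prop = distance-m / v               // 500 µs
$ d_"trans" + d_"prop" approx #int(calc.round((t-trans + t-prop) * 1000000)) space mu s $
```

With these values the propagation term (500 µs) dominates the transmission term (120 µs), which is why the notes stress attacking propagation and queueing delay for latency-sensitive traffic.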
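Similarly, the AIMD rule described in the congestion-control part (add one MSS per RTT, halve the window on loss) is easy to trace with a tiny sketch; the loss rounds below are hypothetical inputs:

```typ
// Minimal AIMD sketch: additive increase of 1 MSS per RTT, multiplicative
// decrease (halving) when a loss is detected. Loss rounds are hypothetical.
#let aimd(rounds, loss-rounds) = {
  let cwnd = 1.0
  let trace = ()
  for r in range(rounds) {
    trace.push(cwnd)
    cwnd = if r in loss-rounds { calc.max(cwnd / 2, 1.0) } else { cwnd + 1.0 }
  }
  trace
}
// cwnd (in MSS) over 12 RTTs, with losses detected after rounds 6 and 10:
#aimd(12, (6, 10)).map(str).join(", ")
```

The resulting sawtooth is the behaviour the fairness discussion above refers to.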
https://github.com/mismorgano/UG-FunctionalAnalyisis-24
https://raw.githubusercontent.com/mismorgano/UG-FunctionalAnalyisis-24/main/README.md
markdown
# UG-FunctionalAnalyisis-24 Notes and assignments for Functional Analysis using [typst](https://typst.app/) instead of $\LaTeX$ ## Template All my assignments (tareas) and exams (exámenes) use [this](config.typ) template according to [this](https://typst.app/docs/tutorial/making-a-template/) tutorial.
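A minimal sketch of how such a template is typically applied in an assignment file, assuming config.typ follows the linked tutorial (the function name `conf` and its parameters come from that tutorial and may differ from the actual template):

```typ
// Sketch only: `conf` and its parameters are assumptions taken from the linked
// tutorial; the real template in config.typ may expose different names.
#import "config.typ": conf
#show: conf.with(
  title: [Tarea 1],
  author: "Nombre Apellido",
)
= Problema 1
...
```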
https://github.com/LDemetrios/Typst4k
https://raw.githubusercontent.com/LDemetrios/Typst4k/master/src/test/resources/suite/layout/grid/grid.typ
typst
// Test grid layouts. --- grid-columns-sizings-rect --- #let cell(width, color) = rect(width: width, height: 2cm, fill: color) #set page(width: 100pt, height: 140pt) #grid( columns: (auto, 1fr, 3fr, 0.25cm, 3%, 2mm + 10%), cell(0.5cm, rgb("2a631a")), cell(100%, forest), cell(100%, conifer), cell(100%, rgb("ff0000")), cell(100%, rgb("00ff00")), cell(80%, rgb("00faf0")), cell(1cm, rgb("00ff00")), cell(0.5cm, rgb("2a631a")), cell(100%, forest), cell(100%, conifer), cell(100%, rgb("ff0000")), cell(100%, rgb("00ff00")), ) --- grid-gutter-fr --- #set rect(inset: 0pt) #grid( columns: (auto, auto, 40%), column-gutter: 1fr, row-gutter: 1fr, rect(fill: eastern)[dddaa aaa aaa], rect(fill: conifer)[ccc], rect(fill: rgb("dddddd"))[aaa], ) --- grid-row-sizing-manual-align --- #set page(height: 3cm, margin: 0pt) #grid( columns: (1fr,), rows: (1fr, auto, 2fr), [], align(center)[A bit more to the top], [], ) --- grid-finance --- // Test using the `grid` function to create a finance table. #set page(width: 11cm, height: 2.5cm) #grid( columns: 5, column-gutter: (2fr, 1fr, 1fr), row-gutter: 6pt, [*Quarter*], [Expenditure], [External Revenue], [Financial ROI], [_total_], [*Q1*], [173,472.57 \$], [472,860.91 \$], [51,286.84 \$], [_350,675.18 \$_], [*Q2*], [93,382.12 \$], [439,382.85 \$], [-1,134.30 \$], [_344,866.43 \$_], [*Q3*], [96,421.49 \$], [238,583.54 \$], [3,497.12 \$], [_145,659.17 \$_], ) // Test grid cells that overflow to the next region. --- grid-cell-breaking --- #set page(width: 5cm, height: 3cm) #grid( columns: 2, row-gutter: 8pt, [Lorem ipsum dolor sit amet. Aenean commodo ligula eget dolor. Aenean massa. Penatibus et magnis.], [Text that is rather short], [Fireflies], [Critical], [Decorum], [Rampage], ) --- grid-consecutive-rows-breaking --- // Test a column that starts overflowing right after another row/column did // that. #set page(width: 5cm, height: 2cm) #grid( columns: 4 * (1fr,), row-gutter: 10pt, column-gutter: (0pt, 10%), align(top, image("/assets/images/rhino.png")), align(top, rect(inset: 0pt, fill: eastern, align(right)[LoL])), [rofl], [\ A] * 3, [Ha!\ ] * 3, ) --- grid-same-row-multiple-columns-breaking --- // Test two columns in the same row overflowing by a different amount. #set page(width: 5cm, height: 2cm) #grid( columns: 3 * (1fr,), row-gutter: 8pt, column-gutter: (0pt, 10%), [A], [B], [C], [Ha!\ ] * 6, [rofl], [\ A] * 3, [hello], [darkness], [my old] ) --- grid-nested-breaking --- // Test grid within a grid, overflowing. #set page(width: 5cm, height: 2.25cm) #grid( columns: 4 * (1fr,), row-gutter: 10pt, column-gutter: (0pt, 10%), [A], [B], [C], [D], grid(columns: 2, [A], [B], [C\ ]*3, [D]), align(top, rect(inset: 0pt, fill: eastern, align(right)[LoL])), [rofl], [E\ ]*4, ) --- grid-column-sizing-auto-base --- // Test that auto and relative columns use the correct base. #grid( columns: (auto, 60%), rows: (auto, auto), rect(width: 50%, height: 0.5cm, fill: conifer), rect(width: 100%, height: 0.5cm, fill: eastern), rect(width: 50%, height: 0.5cm, fill: forest), ) --- grid-column-sizing-fr-base --- // Test that fr columns use the correct base. #grid( columns: (1fr,) * 4, rows: (1cm,), rect(width: 50%, fill: conifer), rect(width: 50%, fill: forest), rect(width: 50%, fill: conifer), rect(width: 50%, fill: forest), ) --- grid-column-sizing-mixed-base --- // Test that all three kinds of rows use the correct bases. 
#set page(height: 4cm, margin: 0cm) #grid( rows: (1cm, 1fr, 1fr, auto), rect(height: 50%, width: 100%, fill: conifer), rect(height: 50%, width: 100%, fill: forest), rect(height: 50%, width: 100%, fill: conifer), rect(height: 25%, width: 100%, fill: forest), ) --- grid-trailing-linebreak-region-overflow --- // Test that trailing linebreak doesn't overflow the region. #set page(height: 2cm) #grid[ Hello \ Hello \ Hello \ World ] --- grid-breaking-expand-vertically --- // Test that broken cell expands vertically. #set page(height: 2.25cm) #grid( columns: 2, gutter: 10pt, align(bottom)[A], [ Top #align(bottom)[ Bottom \ Bottom Top ] ], align(top)[B], ) --- grid-complete-rows --- // Ensure grids expand enough for the given rows. #grid( columns: (2em, 2em), rows: (2em,) * 4, fill: red, stroke: aqua, [a] ) --- grid-auto-shrink --- // Test iterative auto column shrinking. #set page(width: 210mm - 2 * 2.5cm + 2 * 10pt) #set text(11pt) #table( columns: 4, [Hello!], [Hello there, my friend!], [Hello there, my friends! Hi!], [Hello there, my friends! Hi! What is going on right now?], ) --- issue-grid-base-auto-row --- // Test that grid base for auto rows makes sense. #set page(height: 150pt) #table( columns: (1.5cm, auto), rows: (auto, auto), rect(width: 100%, fill: red), rect(width: 100%, fill: blue), rect(width: 100%, height: 50%, fill: green), ) --- issue-grid-base-auto-row-list --- #rect(width: 100%, height: 1em) - #rect(width: 100%, height: 1em) - #rect(width: 100%, height: 1em) --- issue-grid-skip --- // Grid now skips a remaining region when one of the cells // doesn't fit into it at all. #set page(height: 100pt) #grid( columns: (2cm, auto), rows: (auto, auto), rect(width: 100%, fill: red), rect(width: 100%, fill: blue), rect(width: 100%, height: 80%, fill: green), [hello \ darkness #parbreak() my \ old \ friend \ I], rect(width: 100%, height: 20%, fill: blue), polygon(fill: red, (0%, 0%), (100%, 0%), (100%, 20%)) ) --- issue-grid-skip-list --- #set page(height: 60pt) #lines(2) - #lines(2) --- issue-grid-double-skip --- // Ensure that the list does not jump to the third page. #set page(height: 70pt) #v(40pt) The following: + A + B --- issue-grid-gutter-skip --- // Ensure gutter rows at the top or bottom of a region are skipped. #set page(height: 10em) #table( row-gutter: 1.5em, inset: 0pt, rows: (1fr, auto), [a], [], [], [f], [e\ e], [], [a] ) --- issue-3917-grid-with-infinite-width --- // https://github.com/typst/typst/issues/1918 #set page(width: auto) #context layout(available => { let infinite-length = available.width // Error: 3-50 cannot create grid with infinite width grid(gutter: infinite-length, columns: 2)[A][B] })
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/minerva-report-fcfm/0.2.0/minerva-report-fcfm.typ
typst
Apache License 2.0
/** minerva-article.typ * * Este archivo contiene la estructura y estilos usados por el template. * Para definir tus propios comandos agrégalos a preamble.typ. * **/ /****************************************************************************** * Estado * * Variables de estado utilizadas por el template. *****************************************************************************/ #import "state.typ" as state #let state = state /****************************************************************************** * Funciones base * * La idea es que si necesitas extender personalizar el template, * sobrescribir el header o crear una portada, etc. Deberías extender * su respectivo base. La razón de esto es que ciertas funcionalidades * del template requieren estado. Además estos base settean estilos * que ayudan a que haya coherencia visual. *****************************************************************************/ #let base-header(it) = { metadata((marker: "PAGE-START")) set block(spacing: 0pt, clip: false) set par(leading: 0.4em) it } #let base-footer(it) = { set block(spacing: 0pt, clip: false) set par(leading: 0.4em) it metadata((marker: "PAGE-END")) } #let base-front-page(it, ..args) = { return page(..args, { state.is-main.update(true) it counter(page).update(0) }) } /****************************************************************************** * Localización * * Las siguientes son funciones de utilidad general para mejor * soporte del español. ******************************************************************************/ /// Arreglo con los nombres de meses en español. #let meses = ("Enero", "Febrero", "Marzo", "Abril", "Mayo", "Junio", "Julio", "Agosto", "Septiembre", "Octubre", "Noviembre", "Diciembre") /// Aplica el formato "[day] de [month repr: long] del [year]" en español /// /// fecha (datetime): fecha a dar formato. /// -> str #let formato-fecha(fecha) = { return str(fecha.day()) + " de " + meses.at(fecha.month()-1) + " de " + str(fecha.year()) } /// Show rule que cambia el formato de los números para usar coma decimal. /// /// doc (content): documento a aplicar reglas /// -> content #let formato-numeros-es(doc) = { // https://github.com/typst/typst/issues/1093#issuecomment-1536620129 show math.equation: it => { show regex("\d+.\d+"): it => { show ".": {","+h(0pt)} it } it } doc } /// Esta show rule cambia los operadores definidos por Typst para /// que estén en español. /// /// - doc (content): Contenido a aplicar las reglas. /// -> content #let operadores-es(doc) = { show math.op.where(text: [#"inf"]): it => { show "inf": "ínf" it } show math.op.where(text: [#"lim"]): it => { show "lim": "lím" it } show math.op.where(text: [#"lim\u{2009}inf"]): it => { show "lim\u{2009}inf": "lím\u{2009}ínf" it } show math.op.where(text: [#"lim\u{2009}sup"]): it => { show "lim\u{2009}sup": "lím\u{2009}sup" it } show math.op.where(text: [#"max"]): it => { show "max": "máx" it } show math.op.where(text: [#"min"]): it => { show "min": "mín" it } doc } /****************************************************************************** * Componentes * * Estas funciones generan componentes para usarse como parte del documento. ******************************************************************************/ /// Repite un contenido `count` veces con `padding` entre ellos. /// Si `count` es `auto`, se utiliza tanto espacio como haya disponible. /// - body (content, str, int): Contenido a repetir. /// - padding (length) : Espaciado entre repeticiones. /// - count (int, auto) : Cantidad de repeticiones. 
/// -> content #let rep(padding: 0pt, count: auto, body) = context { if count != auto { stack(dir: ltr, spacing: padding, ..range(count + 1).map(_ => body)) } else { layout(size => context { let width = measure(body).width let real-count = calc.quo((size.width - width).pt(), (width + padding).pt()) stack(dir: ltr, spacing: padding, ..range(count-real + 1).map(_ => body)) }) } } /// Crea un resumen que no aparece en el índice (outline). /// /// - body (content): Cuerpo del resumen. /// -> content #let resumen(body) = [ #heading(level: 1, numbering: none, outlined: false)[Resumen] #body #pagebreak(weak: true) ] #let abstract = resumen /****************************************************************************** * Portadas * ******************************************************************************/ /// Diseño de portada básico, perfecto para informes y tareas. /// /// - meta (dictionary): Contenidos del archivo **meta.typ** /// - titulo-centrado (bool): Si es que el título debería ir centrado respecto /// a la página. Por defecto `false`. /// -> content #let portada1( meta, titulo-centrado: false, ) = { let miembros = (:) if type(meta.autores) == "string" { miembros.insert("Integrante", meta.autores) } else if meta.autores.len() > 0 { miembros.insert( if meta.autores.len() == 1 { "Integrante" } else { "Integrantes" }, meta.autores ) } miembros = miembros + meta.equipo-docente let header = base-header[ #grid(columns: (auto, 1fr), rows: auto)[ #set align(left + bottom) #for nombre in meta.departamento.nombre [#nombre \ ] ][ #set align(right + bottom) #if meta.departamento.logo != none { image.decode(meta.departamento.logo, height: 50pt) } ] #v(8pt) #line(length: 100%, stroke: 0.4pt) ] let member-table-args = () for (categoria, nombres) in miembros { member-table-args.push[#categoria:] member-table-args.push[ #if type(nombres) == array { for nombre in nombres [#nombre \ ] } else { nombres } ] } let titulo = align(center, { set text(size: 25pt) if meta.titulo != none { meta.titulo linebreak() } if meta.subtitulo != none { meta.subtitulo linebreak() } if meta.tema != none { meta.tema } }) let member-table = grid(columns: (1fr, auto), rows: auto)[][ #grid(columns: 2, rows: auto, row-gutter: 10pt, column-gutter: 5pt, ..member-table-args) #for (nombre, fecha) in meta.fechas [ Fecha de #nombre: #fecha \ ] #meta.lugar ]; let member-table-wrapper = { if titulo-centrado { (it) => place(bottom+right, align(top+left, it)) } else { (it) => it } } return base-front-page(header: header)[ #v(1fr) #titulo #v(1fr) #member-table-wrapper(grid(columns: (1fr, auto), rows: auto, [], member-table)) ] } /****************************************************************************** * Headers * ******************************************************************************/ /// El header por defecto. /// - meta (dictionary): Contenidos del archivo **meta.typ** /// - romano-hasta-primer-header (bool): Si es true, las páginas antes del /// primer heading con numbering utilizan números romanos en minúsculas. /// Por defecto es `true`. /// -> content #let header1( meta, romano-hasta-primer-heading: true ) = base-header[ #set text(weight: 1) // typst bug? 
#grid(columns: (auto, 1fr), rows: auto)[ #set align(left + bottom) #context { let loc = here() let post-headings = query(selector(heading.where(level: 1, outlined: true)).after(loc), loc) let heading-found = none if post-headings != () and post-headings.first().location().page() == loc.page() { heading-found = post-headings.first() } else { let prev-headings = query(selector(heading.where(level: 1, outlined: true)).before(loc), loc) if prev-headings != () { heading-found = prev-headings.last() } } if heading-found != none and heading-found.numbering != none { heading-found.body } } ][ #set align(right + bottom) #context { let headings = query(heading.where(outlined: true)) let first-numbered-heading = headings.at(0, default: none) let numbering = "i" if first-numbered-heading != none { if here().page() == first-numbered-heading.location().page() { counter(page).update(1) } if first-numbered-heading.location().page() <= here().page() { numbering = "1" } } context { counter(page).display(numbering) } } ] #v(8pt) #line(length: 100%, stroke: 0.4pt) ] /****************************************************************************** * Footers * ******************************************************************************/ /// El footer por defecto. /// - meta (dictionary): Contenido del archivo **meta.typ**' /// -> content #let footer1(meta) = base-footer[ #set text(style: "italic", weight: 1) #line(length: 100%, stroke: 0.4pt) #v(8pt) #grid(columns: (auto, 1fr), rows: auto)[ #set align(left + top) #meta.curso ][ #set align(right + top) #meta.titulo ] ] /****************************************************************************** * Show rules * * Las siguientes funciones están pensadas para utilizarse como show rules de * la forma `show: funcion` * *****************************************************************************/ /// Hace que el primer heading con numbering esté en una página nueva. Esta /// show rule es aplicada por defecto en el template. Puede ser desactivada /// usando el parámetro `showrules: false` en la show rule del template. /// Puede ser reactivada agregando esta línea: /// ```typ /// show: primer-heading-en-nueva-pag /// ``` /// /// - doc (content): Documento a aplicar la regla. /// -> content #let primer-heading-en-nueva-pag(doc) = { show heading: it => context { if counter(heading).get() == (1,) { pagebreak(weak: true) it } else { it } } doc } /// Permite que el documento compile aún si hay referencias rotas, /// mostrando un mensaje en lugar de la referencia. /// /// - mensaje (content): Mensaje a mostrar. /// - doc (content): Documento a aplicar la regla. /// -> content #let permite-ref-rotas(mensaje: text(fill: red, "<ref>"), doc) = { show ref: it => { if it.element == none { mensaje } } doc } /// Permite que el documento compile aún si hay referencias rotas, /// pero solo si el archivo que se está compilando no es el `main.typ`. /// Esto es validado utilizando el estado `state("minerva.is-main")`. /// /// - mensaje (content): Mensaje a mostrar. /// - doc (content): Documento a aplicar la regla. /// -> content #let permite-ref-rotas-fuera-de-main(mensaje: text(fill: red, "<ref>"), doc) = { show ref: it => context { if state.is-main.get() { return it } if it.element == none { mensaje } } doc } /// Aplica los estilos por defecto a las figuras con alguno de los `kind` /// especificados. /// /// kind-target (array): Lista de strings con los kind a afectar. /// doc (content): Documento a aplicar las reglas. 
/// -> content #let estilos-figure(kind-target: ("image", "table"), doc) = { let style-acc = (it) => it for kind in kind-target { style-acc = (it) => { show figure.where(kind: kind): set block(width: 80%) it } } style-acc(doc) } /****************************************************************************** * Departamentos * * Se define en otro archivo por limpieza, y se importa como `module` para * tener autocompletado. ******************************************************************************/ #import "departamentos.typ" as departamentos #let departamentos = departamentos /****************************************************************************** * Template * *****************************************************************************/ /// Función que aplica los estilos del template para infromes. /// /// - meta (dictionary, module): Archivo `meta.typ`. /// - portada (function): Portada a usar. /// - header (function): Header a usar. /// - footer (function): Footer a usar. /// - margenes-portada (dictionary): Márgenes de la portada. /// - margenes (dictionary): Márgenes del documento. /// - showrules (bool): Si es `true` se aplicarán showrules irreversibles. /// Si se requiere más personalización se recomiendo desactivar. /// - doc (content): Documento a aplicar el template. /// -> content #let report( meta, portada: portada1, header: header1, footer: footer1, margenes-portada: (top: 3.5cm), margenes: (top: 3.5cm), showrules: true, doc ) = { let portada-set-extra = (:) if margenes-portada != (:) { portada-set-extra.insert("margin", margenes-portada) } set document(title: meta.titulo, author: meta.autores, date: datetime.today()) set page(header: header(meta), footer: footer(meta), margin: margenes) set text(lang: "es", region: "cl", hyphenate: true) set heading(numbering: "1.") set par(leading: 0.5em, justify: true, linebreaks: "optimized") set math.equation(numbering: "(1)") show figure.where(kind: table): set block(width: 80%) show figure.where(kind: image): set block(width: 80%) if portada != none { set page(header: [], footer: [], ..portada-set-extra) portada(meta) } set page(numbering: "1") if showrules { show: primer-heading-en-nueva-pag show: operadores-es doc } } /// Esta show rule es para utiliza en archivos que no sean `main.typ`, /// con la idea es permitir que estos archivos sean compilables por separado. /// Esto es útil para mantener el proyecto ordenado, como también si el documento /// es demasiado grande como para que la webapp compile en tiempo real. /// Esta show rules no tienen ningún efecto si el archivo a compilar es `main.typ`. /// /// - doc (content): Documento a aplicar la regla. /// -> content #let subfile(doc) = { show: permite-ref-rotas-fuera-de-main doc }
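A minimal usage sketch for the template above. The import path and every `meta` value are hypothetical; the field names are the ones the template actually reads (titulo, subtitulo, tema, autores, equipo-docente, departamento, curso, fechas, lugar):

```typ
// Usage sketch; all values are placeholders and the import path is an assumption.
#import "minerva-report-fcfm.typ" as minerva
#let meta = (
  titulo: "Informe de ejemplo",
  subtitulo: none,
  tema: none,
  autores: ("Nombre Apellido",),
  equipo-docente: (Docente: "Nombre Docente"),
  departamento: (
    nombre: ("Universidad de Ejemplo", "Facultad de Ejemplo"),
    logo: none,
  ),
  curso: "CC0000 Curso de Ejemplo",
  fechas: (entrega: "1 de Enero de 2024"),
  lugar: "Santiago, Chile",
)
#show: minerva.report.with(meta)
= Introducción
...
```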
https://github.com/ayoubelmhamdi/typst-phd-AI-Medical
https://raw.githubusercontent.com/ayoubelmhamdi/typst-phd-AI-Medical/master/chapters/ch03-mrd.typ
typst
MIT License
#import "../functions.typ": heading_center, images, italic,linkb #let finchapiter = text(fill:rgb("#1E045B"),[■]) /* * * Methode Result Dissusion 03 * */ = DETECTING LUNG CANCER NODULES. // = MÉTHODES, RÉSULTATS ET DISCUSSION == Introduction. Radiotherapy is a common treatment for brain tumors [Khan 2014]. It uses ionizing radiation to kill or stop the division of cancer cells by damaging their DNA. External beam radiotherapy is the most common type, where the radiation comes from outside the patient's body. Automatic segmentation is a particularly important application for radiotherapy planning. The goal of radiotherapy planning is to calculate optimal radiation doses, i.e. to deliver radiation that kills tumor cells while sparing healthy tissues. Identifying malignant tumors is difficult even for professional specialists. It typically takes several hours per patient for an experienced clinician. This results in considerable cost and potential delay in therapy. Automating this process will help to deal with difficult scenarios where problem-solving is challenging. Deep learning can automate the process, but it will be more demanding and require a structured approach to succeed. Detecting lung cancer early is essential for increasing the patient's survival rate, but it's tough to do manually, especially on a large scale. The problem space of lung tumor detection is important because it is an active research area with promising results. The objective of this report is to propose a method for lung cancer detection, based on the *LUNA dataset* #link("https://luna16.grand-challenge.org/Description")[luna16.grand-challenge.org]. This dataset contains CT scans of patients with lung nodules, which are small growths in the lungs that may indicate cancer. The dataset is part of a Grand Challenge, which is a competition among researchers to develop and test methods for nodule detection and classification. The dataset is open and publicly available. Nodule segmentation poses many challenges, as nodules may vary in size, shape, location, and image intensity[6]. #images( filename:"images/frameworkv1.png", caption:[The framework of DeepLung. first employs 3D Faster R-CNN to generate candidate nodules. Then extract deep features from the detected and cropped nodules. Lastly, detected nodule size, and raw pixels is employed for classification.], width: 100% // ref: ) The aim of this model is to classify CT scan images as benign or malignant. == Related Work Nodule detection is a challenging task that requires identifying small and diverse nodules in large volumes of CT scans. Traditional methods rely on manually designed features or descriptors @lopez2015large that often fail to handle the nodule variability. To overcome this limitation, deep learning methods have been proposed that automatically learn features from data and outperform hand-crafted features. Some approaches use multi-view ConvNets @setio2016pulmonarymultiview or 3D ConvNets @dou2017automated to reduce false positives. Others use Faster R-CNN @ding2017accurate,liao2017evaluate to generate candidates and 3D ConvNets to refine them. We present a novel method that. Nodule classification is another important task that predicts the nodule malignancy from their appearance and characteristics. Traditional methods segment the nodules @el20113d and design manual features @aerts2014decoding, such as contour, shape and texture @way2006computer. These features, however, may miss the subtle differences between benign and malignant nodules. 
Deep learning methods have improved nodule classification by using artificial neural networks @suzuki2005computer, multi-scale ConvNets @shen2015, deep transfer learning and multi-instance learning @zhu2017deep, and 3D ConvNets @yan2016classification. == Method === Datasets LUNA16 is a subset of LIDC-IDRI, the largest public dataset for pulmonary nodules @armato2011lung@setio2016pulmonary. Unlike LIDC-IDRI, LUNA16 only includes detection annotations and excludes CTs with slice thickness greater than 3mm, inconsistent slice spacing or missing slices. It also provides a patient-level 10-fold cross validation split of the data. LUNA16 contains _1,186 lung nodules_ in _888 CT scans_. It does not include nodules smaller than 3mm. We classify nodules based on different doctors' diagnoses. We remove nodules with an average score of 3 (uncertain malignancy) and label nodules with a score above 3 as positive (malignant) and below 3 as negative (benign). Since anonymous doctors annotated the CT slides, we cannot match their identities across scans. We call them 'simulated' doctors. The LUNA dataset has two tracks: nodule detection and false positive reduction. === Data preprocessing The first step is to load and process the raw data files into 3D arrays: CT scan data and annotation data from LUNA with nodule coordinates and malignancy flags. The dataset includes all lumps that resemble nodules, regardless of their nature. This ensures a representative range of nodule sizes in the training and validation data. The second step is to convert the raw data into PyTorch *Tensors*. This reduces the data size from 32 million voxels to a relevant crop of the CT scan. The third step is to segment the image for potential tumors. We use thresholding, a simple and common method that selects a pixel value (the threshold) to separate the foreground (the region of interest) from the background. For example, to segment the bone from a CT scan, we choose a threshold that matches the intensity of bone pixels and ignore the rest. The fourth step is to group voxels into candidates. The candidate center data is in millimeters, not voxels. We convert our coordinates from $(X, Y, Z)$ in millimeters to $(I, R, C)$ in voxels. The patient coordinate system defines positive $X$ as patient left, positive $Y$ as patient behind, and positive $Z$ as patient head. The fifth step is to classify the nodules with a classification model. === Data Augmentation Data augmentation prevents overfitting by modifying individual samples with synthetic alterations. This creates a new dataset with more effective samples. We use five data augmentation techniques: mirroring, shifting, scaling, rotating and adding noise. === Model Architecture We use convolutional and downsampling layers to reduce resolution. The project requires a GPU with at least *8 GB* of RAM or *220 GB* of free disk space for raw training data, cached data and trained models. The model is based on convolutional neural networks (CNNs) for image recognition. It has three components: a tail for preprocessing, a backbone with convolutional blocks and a head for output. It takes a crop of a CT scan with a candidate nodule from the LUNA dataset as input and outputs a binary classification of benign or malignant nodules. Radiologists annotated 888 CT scans in the LUNA dataset for nodule localization and malignancy classification. The dataset has training, validation and test sets to prevent overfitting and evaluate the model. 
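The millimetre-to-voxel conversion described in the preprocessing steps above boils down to subtracting the scan origin and dividing by the voxel spacing, then reordering the axes from (X, Y, Z) to (I, R, C). A small sketch of just that arithmetic (origin, spacing and the example point are hypothetical, and a real pipeline would also apply the scan's direction matrix):

```typ
// Sketch of the (X, Y, Z) mm -> (I, R, C) voxel conversion; all numbers are
// hypothetical and the scan is assumed axis-aligned (no direction matrix).
#let xyz-to-irc(xyz, origin, spacing) = {
  // per axis: (coordinate - origin) / spacing, then reverse (C, R, I) -> (I, R, C)
  xyz.zip(origin, spacing).map(((c, o, s)) => calc.round((c - o) / s)).rev()
}
#xyz-to-irc((-120.5, 30.0, -200.0), (-180.0, -180.0, -300.0), (0.7, 0.7, 2.5))
```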
We use recall and precision metrics to measure the model's performance in identifying relevant nodules and avoiding false positives. We graph the results for easy interpretation and analysis. == Results === Performance metrics Using the FROC metric, we evaluate our model's average recall rate at different false positive rates per scan. The LUNA16 dataset @setio2016pulmonary uses this metric officially. Compared to a baseline model that uses a deep 3D residual network as the encoder part, our model performs better with fewer parameters. Accuracy measures how well our model classifies nodules into benign or malignant. Our model outperforms several existing methods that use different features and classifiers. It also surpasses the average performance of four experienced doctors on their confident nodules. On both the detection true positive (TP) set and detection false positive (FP) set, we diagnose nodules as benign or malignant using our nodule classification model. Our model achieves high accuracy on both sets and eliminates most of the FP nodules. We compare our model with four experienced doctors on their confident CT scans. Our model matches their performance and agrees with the ground truth. === Comparison with other methods On the LUNA16 and LIDC-IDRI datasets, we compare DeepLung with other methods for nodule detection and classification. We analyze how well DeepLung agrees with experienced doctors on their confident nodules and CT scans. DeepLung achieves state-of-the-art results and provides reliable and consistent diagnosis for lung cancer. == Discussion === Interpretation of the results On both internal and external datasets, the proposed deep learning model for lung nodule detection on CT images performs well. The internal dataset consists of 10,000 CT scans from a Chinese hospital. The external dataset is the LUNA16 public dataset that contains 888 CT scans from different sources¹. The model achieves _0.912_ FROC score on the internal dataset and _0.885_ FROC score on the external dataset. The proposed model detects lung nodules with high accuracy and robustness across different data sources. It outperforms several state-of-the-art methods for the LUNA16 dataset, such as 3D Faster R-CNN², 3D dual path network³, and multi-scale attention network. It also surpasses the average performance of four experienced radiologists who annotated the internal dataset. The proposed model reduces the false positive rate and increases the sensitivity of lung nodule detection. Compared to the radiologists' annotations on the internal dataset, it reduces the false positive rate by 75%. Compared to the radiologists' annotations on both internal and external datasets, it increases the sensitivity by 10%. These improvements are significant for lung cancer screening. They can reduce unnecessary follow-up examinations and increase early detection of malignant nodules. Deep learning can be a powerful tool for lung nodule detection on CT images. The proposed model demonstrates this. It can assist radiologists in improving their diagnostic accuracy and efficiency. It can potentially save lives by detecting lung cancer at an early stage. // (1) A systematic approach to deep learning-based nodule detection ... - Nature. https://www.nature.com/articles/s41598-023-37270-2. // (2) Development and clinical application of deep learning model for lung .... https://www.nature.com/articles/s41598-020-70629-3. // (3) Pulmonary nodules detection based on multi-scale attention networks. 
https://www.nature.com/articles/s41598-022-05372-y. === Limitations and future work The proposed deep learning model for lung nodule detection on CT images has some limitations that future work should address. First, it was trained and tested on a single hospital dataset, which may limit its generalizability to other data sources and populations. More data from different hospitals and regions are needed to evaluate the model's robustness and transferability. Second, it was not benchmarked with other existing methods for lung nodule detection on CT images, such as segmentation-based methods² or deep learning-based algorithms³ . A comprehensive comparison with other methods is necessary to assess the model's relative strengths and weaknesses. Third, it was not validated in a clinical setting, where it could face various challenges such as noise, artifacts, and variability in imaging protocols. A clinical validation study is needed to measure the model's impact on radiologists' workflow and diagnostic performance. Fourth, it was only designed to detect lung nodules, not to classify them into benign or malignant. A classification component is needed to provide more information for lung cancer diagnosis and treatment planning. // (1) Development and clinical application of deep learning model for lung .... https://www.nature.com/articles/s41598-020-70629-3. // (2) Deep learning-based algorithm for lung cancer detection on chest .... https://www.nature.com/articles/s41598-021-04667-w. // (3) Development and performance evaluation of a deep learning lung nodule .... https://bmcmedimaging.biomedcentral.com/articles/10.1186/s12880-022-00938-8. == Conclusion
https://github.com/tianyaochou/resume
https://raw.githubusercontent.com/tianyaochou/resume/master/template.typ
typst
// The project function defines how your document looks. // It takes your content and some metadata and formats it. // Go ahead and customize it to your liking! #let resume(name: "<NAME>", email: "<EMAIL>", phone: "", github: "", linkedin: "", body) = { // Set the document's basic properties. set document(author: name, title: name) set page(margin: (top: 4em, bottom: 4em, left: 4em, right: 4em)) set text(font: "Iosevka", weight: "light", size: 10pt, lang: "en") show strong: set text(font: "<NAME>") set underline(offset: 1pt) show link: it => { set text (fill: blue) underline(offset: 2pt, stroke: blue, it) } // Title row. align(center)[ #block(strong(text(font: "<NAME>", weight: "light", 1.75em, name))) ] // Author information. pad( grid( columns: (1fr,), gutter: 1em, align(center)[ #box(image("icons/envelope-solid.svg", fit: "contain"), height: 1em, baseline: 0.2em) #link("mailto:" + email)[#email] $dot$ #box(image("icons/phone-solid.svg", fit: "contain"), height: 1em, baseline: 0.1em) #phone $dot$ #box(image("icons/github.svg", fit: "contain"), height: 1em, baseline: 0.15em) #link("https://github.com/" + github)[#github] $dot$ #box(image("icons/linkedin.svg", fit: "contain"), height: 1em, baseline: 0.15em) #link("https://www.linkedin.com/in/" + linkedin)[#linkedin] ], ), ) // Main body. set par(justify: true, leading: 0.5em) body } #let section(icon: "", body) = { set text(font: "<NAME>", size: 12pt) stack( if icon == "" { strong(body) } else { [#box(image(icon, fit: "contain"), height: 1em, baseline: 0.1em) #strong(body)] }, v(0.5em), line(length: 100%) ) } #let datedItem(item: "", subitem: "", start: "", end: "", url: "", body) = { let mainItem = if url == "" { [#strong(item)] } else { [#link(url)[#strong(item)]] } let itemElement = if subitem == "" { mainItem } else { [#mainItem, #subitem] } let period = if end == "" { start } else { [#start --- #end] } block( breakable: false, { style(styles => grid( columns: (100% - measure(period, styles).width, auto), column-gutter: auto, itemElement, period ) ) body } ) } #let education(institute: none, degree: none, start: none, end: none, finished: true, description) = { let end = if finished {end} else {end + "(Expected)"}; datedItem(item: institute, start: start, end: end)[ #degree\ #description ] } #let experience(role: none, place: none, start: none, end: none, description) = { let end = if end != none { end } else { "Present" }; datedItem(item: role, subitem: place, start: start, end: end, description) } #let project(name: none, tech: none, url: none, description) = { datedItem(item: name, subitem: tech, url: url, description) }
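A short usage sketch for the functions defined above (all personal data is placeholder; the import path follows the repository layout):

```typ
// Usage sketch; every value below is a placeholder.
#import "template.typ": resume, section, education, experience
#show: resume.with(
  name: "Ada Lovelace",
  email: "ada@example.org",
  phone: "+00 000 0000",
  github: "ada",
  linkedin: "ada",
)
#section[Education]
#education(
  institute: "University of Example",
  degree: "BSc in Computer Science",
  start: "2019",
  end: "2023",
)[Relevant coursework: algorithms, typesetting systems.]
#section[Experience]
#experience(
  role: "Software Engineering Intern",
  place: "Example Corp",
  start: "Jun 2023",
  end: "Sep 2023",
)[Built internal tooling.]
```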
https://github.com/Blezz-tech/math-typst
https://raw.githubusercontent.com/Blezz-tech/math-typst/main/Картинки/Демо вариант 2024/Задание 02.1-1.typ
typst
#import "@preview/cetz:0.1.2" #import "/lib/my_cetz.typ": defaultStyle #set align(center) #cetz.canvas(length: 0.5cm, { import cetz.draw: * import cetz.vector: add, div set-style(..defaultStyle) let (Ox, Ox1, Oy, Oy1) = (-2, 14, -2, 11) let (X, Y) = ((Ox1, 0), (0, Oy1)) // Сетка grid((Ox,Oy), (Ox1,Oy1), help-lines: true, stroke: 0.1pt) // Ось X line((Ox - 1, 0) , (Ox1 + 1, 0), mark: (end: ">")) content((), $ x $, anchor: "top-left", mark: (size: 6pt)) // Ось Y line((0, Oy - 1) , (0, Oy1 + 1), mark: (end: ">")) content((), $ y $, anchor: "bottom-right") // Центр let O = (0,0) circle(O, radius: 1pt, fill: black) content(O, $ 0 $, anchor: "top-right") // Вектор A let (A, A1) = ((1,2), (5,8)) line(A, A1, mark: (end: ">")) content(div(add(A, A1), 2), $ arrow(a) $, anchor: "bottom-right") // Вектор B let (B, B1) = ((5,5), (11,3)) line(B, B1, mark: (end: ">")) content(div(add(B, B1), 2), $ arrow(b) $, anchor: "bottom-left") })
https://github.com/mismorgano/UG-DifferentialGeometry23
https://raw.githubusercontent.com/mismorgano/UG-DifferentialGeometry23/main/Tareas/Tarea-06/Tarea-06.typ
typst
#import "@preview/lemmify:0.1.4": * #let title = [ Geometría Diferencial\ Tarea 6 ] #let author = [ <NAME> ] #let book = [ Differential Geometry of Curves and Surfaces ] #let ( theorem, lemma, corollary, remark, proposition, example, proof, rules: thm-rules ) = default-theorems("thm-group", thm-numbering: thm-numbering-heading) #show: thm-rules #set text(font: "New Computer Modern", size: 12pt) #let eps = $epsilon$ #let x = $mono(bold(x))$ #let circ = $ space circle.stroked.small space$ #set heading(numbering: "1.") #align(center, text(17pt)[ *#title*\ #author ]) Del libro *#book*. == Problema 1 Muestra que la ecuación del plano tangente a una superficie que es la gráfica de una función diferenciable $z = f(x, y)$, en el punto $p_0 = (x_0, y_0)$ esta dada por $ z = f(x_0, y_0) + f_x (x_0, y_0)(x-x_0) + f_y (x_0, y_0)(y-y_0). $ Recuerda la definición de la diferencial $d f$ de una función $f: RR^2 -> RR$ y muestra que el plano tangente es la gráfica de la diferencial $d f_p.$ *Solución:* Recordemos que la ecuación general de un plano esta dada por $ a(x -x_0) + b (y - y_0) + c(z-z_0) = 0, $ donde $(a, b, c)$ es un vector ortogonal al plano. // Si $p= (x, y, z) in RR^3$ es otro punto en el plano tangente Sea $sigma(x ,y) = (x, y, f(x, y))$ una parametrización de $S$ Sabemos que el plano tangente esta dado por una base de vectores: $ sigma_u (p_0) = (1, 0, f_u (x_0, y_0)) quad sigma_v (p_0) = (0, 1, f_v (x_0, y_0)), $ y queremos encontrar un vector ortogonal al plano generado por ellos, el cual sabemos esta dado por $ sigma_u (p_0) times sigma_v (p_0) = (-f_u (x_0, y_0), - f_v (x_0, y_0), 1), $ entonces la ecuación del plano esta dada por $ -f_u (x_0, y_0) (x - x_0) - f_v (x_0, y_0) (y - y_0) + (z - z_0) = 0, $ recordando que $z_0 = f(x_0, y_0)$ y desarrollando obtenemos que $ 0 &= -f_u (x_0, y_0) (x - x_0) - f_v (x_0, y_0) (y - y_0) + (z - f(x_0, y_0)) \ => z &= f(x_0, y_0) + f_u (x_0, y_0) (x - x_0) + f_v (x_0, y_0) (y - y_0), $ como queremos. == Problema 2 Un _punto critico_ de una función diferenciable $f:S ->RR$ definida en una superficie regular $S$ es un punto $p in S$ tal que $d f_p = 0$. + Sea $f:S -> RR$ dado por $f(p) = abs(p-p_0)$, $p in S$, $p_0 in.not S$. Muestra que $p in S$ es un punto critico de $f$ si y solo si la linea que une $p$ con $p_0$ es normal a $S$ en $p$. + Sea $h:S -> RR$ dada por $h(p) = p dot v$, donde $v in RR^3$ es un vector unitario. Muestra que $p in S$ es un punto critico de $h$ si y solo si $v$ es un vector normal a $S$ en $p$. *Solución:* + Supongamos que $p in S$ es un punto critico de $f$, entonces $d f_p(w) = 0$ para todo $w in T_p S$. Sea $w in T_p S$ y $alpha: (-eps, eps)-> S$ una curva diferenciable con $alpha(0) = p$, $alpha'(0) = w $, notemos que $f(alpha(t)) = abs(alpha(t) - p_0)$, dado que $p_0 in.not S$ tenemos que $alpha(t) != p_0$ para todo $t$ y por tanto $f$ es diferenciable, luego $ d f_p(w) &= d/(d t) f(alpha(t))|_(t = 0) = d/(d t) abs(alpha(t) - p_0)|_(t = 0) \ &= (2 angle.l alpha'(t), alpha(t) - p_0 angle.r)/(2abs(alpha(t) - p_0)) |_(t=0) \ &= ( angle.l w, p - p_0 angle.r)/(abs(p - p_0)). $ Dado que $p$ es punto critico tenemos que $d f_p(w) = 0$, lo cual implica que $angle.l w, p - p_0 angle.r = 0$, como lo anterior es valido para todo $w in T_p S$ tenemos que $p - p_0$ es normal a $T_p S$. 
Ahora, si $p - p_0$ es normal a $S$ en $p$, se cumple que $angle.l w, p - p_0 angle.r = 0$ para todo $w in T_p S$ y por lo notado anteriormente tenemos que $ d f_p(w) = (angle.l w, p - p_0 angle.r)/(abs(p - p_0)) = 0, $ para todo $w in T_p S$, como queremos. + Supongamos que $p in S$ es un punto critico de $h$, entonces $d h_p(w) = 0$, para todo $w in T_p S$. Sea $w in T_p S$ notemos que dada $alpha: (-eps, eps) -> S$ con $alpha(0) = p$, $alpha'(0) = w$, como $h(alpha(t)) = alpha(t) dot v$ y por tanto $ d h_p (w) = d/(d t) h(alpha(t))|_(t = 0) = alpha'(0) dot v = w dot v, $ como lo anterior es valido para todo $w in T_p S$ tenemos que $v$ es normal a $T_p S$, como queremos. Supongamos ahora que $v$ es un vector normal a $S$ en $p$, entonces tenemos que $w dot v = 0$ para todo $w in T_p S$, de manera similar tenemos que $ d h_p (w) = w dot v = 0$ para todo $w in T_p S$ y por tanto $p$ es punto critico de $h$. == Problema 3 Muestra que si todas las normales a una superficie conexa pasan por un punto, la superficie esta contenida en una esfera. *Solución:* Primero probaremos el siguiente resultado más general, #proposition[ Sea $f: S -> RR$ una función diferenciable en una superficie regular conexa $S$. Si $D_p f = 0$ para todo $p in S$ entonces $f$ es constante ] #proof[ Dado $p in S$, sea $#x: U subset RR^2 -> #x (U) subset S$ con $p in #x (U)$ un difeomorfismo tal que $#x (U)$ es conexo por caminos, como $f$ es diferenciable tenemos que $f circ #x: U -> RR$ es diferenciable. Primero veamos que $f$ en $#x (U)$ es constante. Definamos $a:= #x^(-1)(p) in U$ y sea $b$ un punto en una bola contenida en $U$, entonces podemos unir $a$ y $b$ por una linea recta $beta:[0, 1] -> U$ dada por $beta(t) = t a + (1-t)b$. Como $U$ es abierto podemos extender $beta$ a $(-eps, 1+eps)$, entonces $f circ #x circ beta:(-eps, 1+eps) -> RR$, esta definida en un intervalo abierto y se cumple que $ d (f circ #x circ beta) = (d_p f circ d #x circ d beta)_t = 0, $ para todo $ t in (-eps, 1+eps)$ pues $d_p f$ es identicamente cero. Se sigue que $f circ #x circ beta$ es constante y por tanto $ f(#x (beta(0))) = f(#x (b)) = f(#x (a)) = f(#x (beta(1)))$. Como $b$ fue arbitrario tenemos que $f$ es constante en una bola contenida en $U$. Como $#x (U)$ es conexo por caminos entonces $U = #x^(-1) (#x (U))$ es conexo por caminos, pues $#x^(-1)$ es un homeomorfismo. Luego si $r$ es otro punto en $U$, existe una curva continua $alpha:[0, 1] -> U$ tal que $alpha(0) = a$ y $alpha(1) = r$, la función $f circ #x circ alpha: [0, 1] -> RR$ es continua en $[0, 1]$. Notemos que para todo $t in [0, 1]$ tenemos un punto $alpha(t)$ en $U$, y por lo notado al principio existe un entorno contenido en $U$ donde $f circ #x$ es constante, luego existe un intervalo $I_t$ abierto en $[0, 1]$ donde $f circ #x circ alpha$ es constante. De lo anterior $[0, 1] = union.big_t I_t$ es una cubierta abierta, por lo cual existe una subcubierta finita $I_1, dots, I_k$ tal que $[0, 1] = union.big_i I_i$ para $i = 1, dots, k$. Como $I_i$ son intervalos abiertos tenemos que se intersectan, sin perdida de generalidad podemos suponer que los intervalos consecutivos se intersectan, entonces $f circ #x circ alpha$ es constante en la union de intervalos consecutivos, se sigue que $f circ #x circ alpha$ es constante en $[0, 1]$, en especial $ f(#x (alpha(0))) = f(#x (a)) = f(#x (r)) = f(#x (alpha(1)))$. Lo anterior nos dice que $f circ #x$ es constante en todo $U$, como $#x$ es un difeomorfismo tenemos que $f$ es constante en $#x (U)$.
Veamos ahora que $f$ es constante en todo $S$. Sean $p, q in S$, como $S$ es conexa por caminos existe una curva continua $gamma:[0, 1] -> S$ tal que $gamma(0) = p$ y $gamma(1) = q$ y simplemente notemos que para todo $t in [0, 1]$ por lo notado anteriormente existe un entorno de $gamma(t) in S$ donde $f$ es constante y por tanto obtenemos un intervalo abierto $I_t$ donde $f circ gamma$ es constante, el resultado se concluye de manera similar a lo hecho anteriormente. ] Ahora, siguiendo con la demostración, dada $S$ una superficie regular sea $p_0$ el punto de intersección de las normales y consideremos $f:S -> RR$ dado por $f(p) = abs(p-p_0)$, donde $p_0 in.not S$. Por hipótesis tenemos que para todo $p in S$ la linea que une a $p$ con $p_0$ es normal, por el ejercicio anterior tenemos que $p$ es un punto critico de $f$, por lo cual $D_p f = 0$. La proposición anterior nos dice que $f$ es constante, es decir $abs(p -p_0) = r$ para todo $p in S$ y algún $r in RR$. Lo anterior nos dice que $S$ esta contenida en la esfera centrada en $p_0$ de radio $r$, como queremos.
https://github.com/Quaternijkon/notebook
https://raw.githubusercontent.com/Quaternijkon/notebook/main/content/数据结构与算法/.chapter-数据结构/字符串/字符串匹配.typ
typst
#import "../../../../lib.typ":* === #Title( title: [字符串匹配], reflink: "https://leetcode.cn/problems/find-the-index-of-the-first-occurrence-in-a-string/description/", level: 2, )<字符串匹配> #note( title: [ 找出字符串中第一个匹配项的下标 ], description: [ 给你两个字符串 haystack 和 needle ,请你在 haystack 字符串中找出 needle 字符串的第一个匹配项的下标(下标从 0 开始)。如果 needle 不是 haystack 的一部分,则返回 -1 。 ], examples: ([ 输入:haystack = "sadbutsad", needle = "sad" 输出:0 解释:"sad" 在下标 0 和 6 处匹配。 第一个匹配项的下标是 0 ,所以返回 0 。 ],[ 输入:haystack = "leetcode", needle = "leeto" 输出:-1 解释:"leeto" 没有在 "leetcode" 中出现,所以返回 -1 。 ] ), tips: [ - $1 <= "haystack.length, needle.length" <= 10^4$ - haystack 和 needle 仅由小写英文字符组成 ], solutions: ( ( name:[KMP算法], text:[ #figure( image("./img/kmp1.png") ) #linebreak() #figure( image("./img/kmp1.png") ) #figure( image("./img/kmp1.png") ) #linebreak() #figure( image("./img/kmp1.png") ) ],code:[ ```cpp class Solution { public: // KMP int strStr(string haystack, string needle) { int n = haystack.size(), m = needle.size(); if (m == 0) return 0; vector<int> next(m, 0); // 构建next数组 for (int i = 1, j = 0; i < m; i++) { while (j > 0 && needle[i] != needle[j]) j = next[j - 1]; if (needle[i] == needle[j]) j++; next[i] = j; } // 匹配 for (int i = 0, j = 0; i < n; i++) { while (j > 0 && haystack[i] != needle[j]) j = next[j - 1]; if (haystack[i] == needle[j]) j++; if (j == m) return i - m + 1; } return -1; } }; ``` ]), ), gain:none, )
https://github.com/DrGo/typst-tips
https://raw.githubusercontent.com/DrGo/typst-tips/main/refs/samples/tufte-handout/tufte-handout.typ
typst
// Size of the left "margin" (note area) #let margin-size = 23.5% // Spacer so that main content and notes don't rub up against each other #let margin-space = 0.2in /* * Inserts a margin note containing `content` * `dy` can be used to adjust the note content vertically */ #let margin-note(dy: -1em, content) = { place( right, dx: margin-size + margin-space, dy: dy, block(width: margin-size, { set text(size: 0.75em) set align(left) content }) ) } /* * Renders `content` with the module's text styling. This is useful for content * that is outside of the `template` container but which should be visually consistent. */ #let apply-text-styles(content) = { set text( font: ("TeX Gyre Pagella") ) show heading.where(level: 1): it => text( size: 12pt, weight: "extralight", style: "italic", { v(1em) it.body } ) show heading.where(level: 2): it => text( size: 10pt, style: "italic", it.body ) content } /* Call to wrap `doc` in the handout layout * `title` will be rendered in the page header * `wrapper` should be either `none` or a function that takes `doc` and returns * content. This can be used to inject custom styles. */ #let template( title: none, wrapper: apply-text-styles, doc, ) = { set page( header: { set text(size: 7pt, weight: "semibold", tracking: 1.25pt) h(1fr) upper(title) } ) grid( columns: (100% - margin-size, margin-size), if type(wrapper) == "function" { wrapper(doc) } else { doc } ) }
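A minimal usage sketch for the two entry points defined above, `template` and `margin-note`; the import path and the sample text are assumptions, not part of the original file:

```typst
// Hypothetical usage; assumes this file is saved as `tufte-handout.typ`
// next to the importing document.
#import "tufte-handout.typ": template, margin-note

#show: template.with(title: "Reading Notes")

= Introduction
Main-column text goes here.
#margin-note[This remark is placed in the reserved note column on the right.]
```

Because `template` lays the page out as a two-column grid, `margin-note` relies on `place(right, ...)` with a horizontal offset to float its content into the note area rather than interrupting the main flow.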
https://github.com/Servostar/dhbw-abb-typst-template
https://raw.githubusercontent.com/Servostar/dhbw-abb-typst-template/main/src/pages/appendix.typ
typst
MIT License
// .--------------------------------------------------------------------------. // | Appendix | // '--------------------------------------------------------------------------' // Author: <NAME> // Edited: 28.06.2024 // License: MIT #let show-appendix(config: dictionary) = ( context { counter(heading).update(0) let title = if text.lang == "en" { "Appendix" } else { "Anhang" } if "appendices" in config.thesis { pagebreak(weak: true) // appendix will be invisible on the appendecies page // but still listed in the ToC show heading: it => [] heading(level: 1, numbering: none, title) // APA style appendix show heading: it => { let number = if it.numbering != none { counter(heading).display(it.numbering) } block()[ #title #number - #it.body ] } show heading.where(level: 1): it => v(2em) + it + v(1em) show heading.where(level: 2): it => v(1em) + it + v(0.5em) show heading.where(level: 3): it => v(0.5em) + it + v(0.25em) set heading(numbering: "A.1", supplement: title) config.thesis.appendices } } )
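A minimal usage sketch for `show-appendix`; the shape of `config` is inferred from the field accesses above (`config.thesis.appendices`), so the concrete keys and the sample content here are assumptions:

```typst
// Hypothetical usage; assumes the function above lives in `appendix.typ`.
#import "appendix.typ": show-appendix

#set text(lang: "en")

#show-appendix(config: (
  thesis: (
    appendices: [
      == Measurement Protocol
      #lorem(30)
    ],
  ),
))
```

The `context` wrapper lets the function pick the "Appendix"/"Anhang" title from `text.lang`, and the inner show rules re-label every heading inside the appendix content in APA style.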
https://github.com/Robotechnic/diagraph
https://raw.githubusercontent.com/Robotechnic/diagraph/main/examples/simplegraph.typ
typst
MIT License
#import "@preview/diagraph:0.3.0": * #set heading(numbering: (..nums) => [Graph #numbering("1", ..nums):]) #let render-example(dot, ..args) = style(styles => { let code = raw(dot.text, lang: "dot") let graph = render(dot.text, ..args) let side-by-side = measure(code, styles).width + measure(graph, styles).width < 20cm let columns = if side-by-side { (auto, auto) } else { (auto,) } grid( columns: columns, gutter: 1cm, raw(dot.text, lang: "dot"), graph, ) }) = State Machine #render-example( ``` digraph finite_state_machine { rankdir=LR size="8,5" node [shape=doublecircle] LR_0 LR_3 LR_4 LR_8 node [shape=circle] LR_0 -> LR_2 [label="SS(B)"] LR_0 -> LR_1 [label="SS(S)"] LR_1 -> LR_3 [label="S($end)"] LR_2 -> LR_6 [label="SS(b)"] LR_2 -> LR_5 [label="SS(a)"] LR_2 -> LR_4 [label="S(A)"] LR_5 -> LR_7 [label="S(b)"] LR_5 -> LR_5 [label="S(a)"] LR_6 -> LR_6 [label="S(b)"] LR_6 -> LR_5 [label="S(a)"] LR_7 -> LR_8 [label="S(b)"] LR_7 -> LR_5 [label="S(a)"] LR_8 -> LR_6 [label="S(b)"] LR_8 -> LR_5 [label="S(a)"] } ```, labels: ( "LR_0": $"LR"_0$, "LR_1": $"LR"_1$, "LR_2": $"LR"_2$, "LR_3": $"LR"_3$, "LR_4": $"LR"_4$, "LR_5": $"LR"_5$, "LR_6": $"LR"_6$, "LR_7": $"LR"_7$, "LR_8": $"LR"_8$, ), ) #pagebreak() = Clustering See http: www.graphviz.org/content/cluster. #render-example(``` digraph G { fontname="Helvetica,Arial,sans-serif" node [fontname="Helvetica,Arial,sans-serif"] edge [fontname="Helvetica,Arial,sans-serif"] subgraph cluster_0 { style=filled; color=lightgrey; node [style=filled,color=white]; a0 -> a1 -> a2 -> a3; label = "process #1"; } subgraph cluster_1 { node [style=filled]; b0 -> b1 -> b2 -> b3; label = "process #2"; color=blue } start -> a0; start -> b0; a1 -> b3; b2 -> a3; a3 -> a0; a3 -> end; b3 -> end; start [shape=Mdiamond]; end [shape=Msquare]; } } ```) = HTML #render-example(``` digraph structs { node [shape=plaintext] struct1 [label=< <TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0"> <TR><TD>left</TD><TD PORT="f1">mid dle</TD><TD PORT="f2">right</TD></TR> </TABLE>>]; struct2 [label=< <TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0"> <TR><TD PORT="f0">one</TD><TD>two</TD></TR> </TABLE>>]; struct3 [label=< <TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0" CELLPADDING="4"> <TR> <TD ROWSPAN="3">hello<BR/>world</TD> <TD COLSPAN="3">b</TD> <TD ROWSPAN="3">g</TD> <TD ROWSPAN="3">h</TD> </TR> <TR> <TD>c</TD><TD PORT="here">d</TD><TD>e</TD> </TR> <TR> <TD COLSPAN="3">f</TD> </TR> </TABLE>>]; struct1:f1 -> struct2:f0; struct1:f2 -> struct3:here; } ```) = Overridden labels Labels for nodes `big` and `sum` are overridden. #render-example( ``` digraph { rankdir=LR node[shape=circle] Hmm -> a_0 Hmm -> big a_0 -> "a'" -> big [style="dashed"] big -> sum } ```, labels: ( big: [_some_#text(2em)[ big ]*text*], sum: $ sum_(i=0)^n 1 / i $, ), ) #render-example( ``` graph { simplexlabel[xlabel="simple"] simplexlabel -- limitxlabel simplexlabel -- longxlabel longxlabel[xlabel="long xlabel --------------------------------------"] "alpha xlabel"[xlabel="alpha"] simplexlabel -- "alpha xlabel" limitxlabel[xlabel="limit"] formulaxlabel -- "alpha xlabel" } ```, xlabels: ( formulaxlabel: $ sum_(i=0)^n 1 / i $, ), ) #pagebreak() = Automatic math labels #render-example(``` digraph { a -> alpha phi -> rho rho -> a tau -> omega phi -> a_8 a_8 -> alpha a_8 -> omega alpha_8 -> omega } ```)
https://github.com/howardlau1999/sysu-thesis-typst
https://raw.githubusercontent.com/howardlau1999/sysu-thesis-typst/master/functions/underline.typ
typst
MIT License
#let chineseunderline(s, width: 300pt, bold: false) = { let chars = s.clusters() let n = chars.len() style(styles => { let i = 0 let now = "" let ret = () while i < n { let c = chars.at(i) let nxt = now + c if measure(nxt, styles).width > width or c == "\n" { if bold { ret.push(strong(now)) } else { ret.push(now) } ret.push(v(-1em)) ret.push(line(length: 100%)) if c == "\n" { now = "" } else { now = c } } else { now = nxt } i = i + 1 } if now.len() > 0 { if bold { ret.push(strong(now)) } else { ret.push(now) } ret.push(v(-0.9em)) ret.push(line(length: 100%)) } ret.join() }) }
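A short usage sketch for `chineseunderline`; the import path and the sample string are assumptions:

```typst
// Hypothetical usage; assumes the function above is saved in `underline.typ`.
#import "underline.typ": chineseunderline

// Breaks the string into underlined segments no wider than 200pt each.
#chineseunderline("这是一段需要自动换行并加下划线的中文文字", width: 200pt, bold: true)
```

The function re-measures the accumulated string on every iteration, which is fine for short strings but becomes quadratic for very long input.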
https://github.com/kdog3682/typkit
https://raw.githubusercontent.com/kdog3682/typkit/main/0.1.0/src/patterns.typ
typst
#let criss-cross = pattern(size: (3pt, 3pt), { place(line(start: (0%, 0%), end: (100%, 100%))) place(line(start: (0%, 100%), end: (100%, 0%))) }) #let dots = pattern(size: (10pt, 10pt), { circle(fill: black, radius: 0.5pt) }) #let stripes() = { let pat = pattern(size: (10pt, 10pt), place(line(start: (0%, 0%), end: (100%, 100%), stroke: 0.5pt ) )) pat } #let cross = pattern(size: (3pt, 6pt), { place(line(start: (0%, 0%), end: (100%, 100%))) place(line(start: (0%, 100%), end: (100%, 0%))) }) #let circles = pattern(size: (3pt, 3pt), { circle(radius: 1pt, fill: gray) })
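A brief usage sketch that applies the patterns above as fills; the rectangle sizes and the import path are assumptions:

```typst
// Hypothetical usage; assumes the definitions above are in `patterns.typ`.
#import "patterns.typ": criss-cross, dots, stripes

#stack(
  dir: ltr,
  spacing: 6pt,
  rect(width: 3cm, height: 1cm, fill: criss-cross),
  rect(width: 3cm, height: 1cm, fill: dots),
  rect(width: 3cm, height: 1cm, fill: stripes()),
)
```

Note that `stripes` is a function and has to be called, while the other patterns are plain values.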
https://github.com/frectonz/the-pg-book
https://raw.githubusercontent.com/frectonz/the-pg-book/main/book/130.%20seesv.html.typ
typst
seesv.html Where to See Silicon Valley Want to start a startup? Get funded by Y Combinator. October 2010Silicon Valley proper is mostly suburban sprawl. At first glance it doesn't seem there's anything to see. It's not the sort of place that has conspicuous monuments. But if you look, there are subtle signs you're in a place that's different from other places.1. Stanford UniversityStanford is a strange place. Structurally it is to an ordinary university what suburbia is to a city. It's enormously spread out, and feels surprisingly empty much of the time. But notice the weather. It's probably perfect. And notice the beautiful mountains to the west. And though you can't see it, cosmopolitan San Francisco is 40 minutes to the north. That combination is much of the reason Silicon Valley grew up around this university and not some other one.2. University AveA surprising amount of the work of the Valley is done in the cafes on or just off University Ave in Palo Alto. If you visit on a weekday between 10 and 5, you'll often see founders pitching investors. In case you can't tell, the founders are the ones leaning forward eagerly, and the investors are the ones sitting back with slightly pained expressions.3. The Lucky OfficeThe office at 165 University Ave was Google's first. Then it was Paypal's. (Now it's Wepay's.) The interesting thing about it is the location. It's a smart move to put a startup in a place with restaurants and people walking around instead of in an office park, because then the people who work there want to stay there, instead of fleeing as soon as conventional working hours end. They go out for dinner together, talk about ideas, and then come back and implement them.It's important to realize that Google's current location in an office park is not where they started; it's just where they were forced to move when they needed more space. Facebook was till recently across the street, till they too had to move because they needed more space.4. Old Palo AltoPalo Alto was not originally a suburb. For the first 100 years or so of its existence, it was a college town out in the countryside. Then in the mid 1950s it was engulfed in a wave of suburbia that raced down the peninsula. But Palo Alto north of Oregon expressway still feels noticeably different from the area around it. It's one of the nicest places in the Valley. The buildings are old (though increasingly they are being torn down and replaced with generic McMansions) and the trees are tall. But houses are very expensive—around $1000 per square foot. This is post-exit Silicon Valley. 5. Sand Hill RoadIt's interesting to see the VCs' offices on the north side of Sand Hill Road precisely because they're so boringly uniform. The buildings are all more or less the same, their exteriors express very little, and they are arranged in a confusing maze. (I've been visiting them for years and I still occasionally get lost.) It's not a coincidence. These buildings are a pretty accurate reflection of the VC business.If you go on a weekday you may see groups of founders there to meet VCs. But mostly you won't see anyone; bustling is the last word you'd use to describe the atmos. Visiting Sand Hill Road reminds you that the opposite of "down and dirty" would be "up and clean."6. Castro StreetIt's a tossup whether Castro Street or University Ave should be considered the heart of the Valley now. University Ave would have been 10 years ago. But Palo Alto is getting expensive. 
Increasingly startups are located in Mountain View, and Palo Alto is a place they come to meet investors. Palo Alto has a lot of different cafes, but there is one that clearly dominates in Mountain View: Red Rock.7. GoogleGoogle spread out from its first building in Mountain View to a lot of the surrounding ones. But because the buildings were built at different times by different people, the place doesn't have the sterile, walled-off feel that a typical large company's headquarters have. It definitely has a flavor of its own though. You sense there is something afoot. The general atmos is vaguely utopian; there are lots of Priuses, and people who look like they drive them.You can't get into Google unless you know someone there. It's very much worth seeing inside if you can, though. Ditto for Facebook, at the end of California Ave in Palo Alto, though there is nothing to see outside.8. Skyline DriveSkyline Drive runs along the crest of the Santa Cruz mountains. On one side is the Valley, and on the other is the sea—which because it's cold and foggy and has few harbors, plays surprisingly little role in the lives of people in the Valley, considering how close it is. Along some parts of Skyline the dominant trees are huge redwoods, and in others they're live oaks. Redwoods mean those are the parts where the fog off the coast comes in at night; redwoods condense rain out of fog. The MROSD manages a collection of great walking trails off Skyline.9. 280Silicon Valley has two highways running the length of it: 101, which is pretty ugly, and 280, which is one of the more beautiful highways in the world. I always take 280 when I have a choice. Notice the long narrow lake to the west? That's the San Andreas Fault. It runs along the base of the hills, then heads uphill through Portola Valley. One of the MROSD trails runs right along the fault. A string of rich neighborhoods runs along the foothills to the west of 280: Woodside, Portola Valley, Los Altos Hills, Saratoga, Los Gatos.SLAC goes right under 280 a little bit south of Sand Hill Road. And a couple miles south of that is the Valley's equivalent of the "Welcome to Las Vegas" sign: The Dish. NotesI skipped the Computer History Museum because this is a list of where to see the Valley itself, not where to see artifacts from it. I also skipped San Jose. San Jose calls itself the capital of Silicon Valley, but when people in the Valley use the phrase "the city," they mean San Francisco. San Jose is a dotted line on a map.Thanks to <NAME>, <NAME>, <NAME>, and <NAME> for reading drafts of this.
https://github.com/profetia/me
https://raw.githubusercontent.com/profetia/me/main/src/option.typ
typst
#let __conf = state("lib.option.__conf", (:)) #let __declare(pred, content) = context { if pred(__conf.get()) { content } else { [] } } #let option(key, value) = { __conf.update((dict) => { if dict.at(key, default: none) == none { dict.insert(key, value) } else { dict.at(key) = value } dict }) } #let configure(dict) = { for (key, value) in dict { option(key, value) } } #let declare(key, value) = { (content) => __declare(__conf => __conf.at(key) == value, content) }
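A minimal usage sketch for the `configure`/`declare` helpers above; the option names are made up for illustration:

```typst
// Hypothetical usage; assumes the helpers above live in `option.typ`.
#import "option.typ": configure, declare

// Record option values once, near the top of the document.
#configure(("lang": "en", "variant": "print"))

// Build a guard that only renders its argument when "lang" equals "en".
#let english = declare("lang", "en")

#english[This paragraph appears because "lang" was configured as "en".]
```

Because the options live in a `state`, the guard produced by `declare` reads the value through `context`, so it sees whatever `configure` has recorded earlier in the document.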
https://github.com/herbhuang/utdallas-thesis-template-typst
https://raw.githubusercontent.com/herbhuang/utdallas-thesis-template-typst/main/README.md
markdown
MIT License
# thesis-template-typst This repository provides a comprehensive Typst template for writing your Bachelor's or Master's thesis at the CIT School of TUM (Technical University of Munich). It includes two types of documents: a proposal template and a thesis template, both specifically designed for students in the field of Informatics. For more information about writing a thesis at the CIT School, please visit the [official CIT website](https://www.cit.tum.de/en/cit/studies/students/thesis-completing-your-studies/informatics/). **Note:** This is only a template. You have to adapt the template to your thesis and discuss the structure of your thesis with your supervisor! --- ## Guidelines __Please thoroughly read our guidelines and hints on [confluence](https://confluence.ase.in.tum.de/display/EduResStud/How+to+thesis)!__ (TUM Login Required) --- ## Installation For detailed installation instructions, please refer to the [official installation guide](https://github.com/typst/typst). Here, we provide basic steps for installing Typst's CLI: - You can get sources and pre-built binaries from the [releases page](https://github.com/typst/typst/releases). - Use package managers like `brew` or `pacman` to install Typst. Be aware that the versions in the package managers might lag behind the latest release. - If you have a [Rust](https://rustup.rs/) toolchain installed, you can also install the latest development version. Nix and Docker users, please refer to the official installation guide for detailed instructions. ## Usage ### Set thesis metadata Fill in your thesis details in the [`metadata.typ`](/metadata.typ) file: * Degree (Bachelor or Master) * Your study program * English and German title * Advisor and supervisor * Your name (without e-mail address or matriculation number) * The start and submission date ### Write your thesis For the actual content of your thesis, there is a dedicated folder named [`/content`](/content) which includes all the chapters and sections of your thesis. This applies to the proposal as well as the thesis (see [`/content/proposal`](/content/proposal) for proposal content). You can add or remove chapters as needed (adapt the [`thesis.typ`](/thesis.typ) with the `#include(...)` accordingly). If you need to customize the layout of the template, you can do so by modifying the corresponding file in the [`layout`](/layout) directory. ### Build PDFs locally Once you have installed Typst, you can use it like this: ```sh # Creates `thesis.pdf` in working directory. typst compile thesis.typ # Creates `proposal.pdf` in working directory. typst compile proposal.typ # Creates PDF file at the desired path. typst compile thesis.typ path/to/output.pdf ``` You can also watch source files and automatically recompile on changes. This is faster than compiling from scratch each time because Typst has incremental compilation. ```sh # Watches source files and recompiles on changes. typst watch thesis.typ ``` ### Updating Your Repository to the Latest Template Version If you have created your thesis repository using the Typst Thesis Template, you might want to update your repository to incorporate the latest changes from the template. Follow these steps to sync your repository with the latest version of the template. **Steps to Update:** 1. Add the Template Repository as a Remote First, navigate to your repository in the terminal and add the original template repository as a new remote: ```sh git remote add template https://github.com/ls1intum/thesis-template-typst.git ``` 2.
Fetch the latest updates from the template repository: ```sh git fetch template ``` 3. Merge the Changes into Your Repository Merge the changes from the template's main branch into your current branch. This might require resolving merge conflicts if there are any differences between your customizations and the template's updates: ```sh git merge template/main --allow-unrelated-histories ``` 4. Resolve Merge Conflicts If there are any merge conflicts, git will notify you. Open the conflicting files, resolve the conflicts, and then add the resolved files: ```sh git add <resolved-file> ``` 5. Commit the Merge After resolving conflicts and adding the resolved files, commit the merge: ```sh git commit -m "Merge updates from Typst Thesis Template" ``` 6. Push the Changes to Your Repository Finally, push the merged changes to your repository: ```sh git push origin <branch-name> ``` ## Working with the Typst Web Editor If you prefer an integrated IDE-like experience with autocompletion and instant preview, the Typst web editor allows you to import files directly into a new or existing document. Here's how you can do this: 1. Navigate to the [Typst Web Editor](https://typst.app/). 2. Create a new blank document. 3. Click on "File" on the top left menu, then "Upload File". 4. Select all .typ and .bib files along with the figures provided in this template repository. **Note:** You can select multiple files to import. The editor will import and arrange all the files accordingly. Always ensure you have all the necessary .typ, .bib, and figure files you need for your document. ## Working with VS Code If you prefer to have a more integrated experience with your favorite code editor, you can use the Typst VS Code extension. The extension provides syntax highlighting, autocompletion, and error checking for Typst files. You can install the extension from the [VS Code Marketplace](https://marketplace.visualstudio.com/items?itemName=nvarner.typst-lsp). 1. Open your project in VS Code 2. Set the correct file (`thesis.typ` or `proposal.typ`) as the main file. This can be done by opening the respective file and running the command `Typst: Pin the main file to the currently opened document`. Just hit `CMD + Shift + P` and search for the command. --- ## Further Resources - [Typst Documentation](https://typst.app/docs/) - [Typst Guide for LaTeX Users](https://typst.app/docs/guides/guide-for-latex-users/) - [Typst VS Code Extension (unofficial)](https://marketplace.visualstudio.com/items?itemName=nvarner.typst-lsp)
https://github.com/polarkac/MTG-Stories
https://raw.githubusercontent.com/polarkac/MTG-Stories/master/stories/035%20-%20Core%202019/001_Chronicle%20of%20Bolas%3A%20The%20Twins.typ
typst
#import "@local/mtgstory:0.2.0": conf #show: doc => conf( "Chronicle of Bolas: The Twins", set_name: "Core 2019", story_date: datetime(day: 13, month: 06, year: 2018), author: "<NAME>", doc ) "Did you hear that?" The glissade of ticking and popping sounds had been faint, almost inaudible. If it hadn't been such a still day, Naiva would have thought it a trick of the breeze caught in the branches of a nearby stand of stunted juniper trees. Spear in hand, she studied the snow-draped land. A steep slope above them tilted dizzily toward the monstrous white summit of the mountain called Eternal Ice. The deep slash of a valley led down to where their large hunting party had been encamped since the new moon. All around, the high peaks of the Qal Sisma cut into the sky like so many jagged teeth. Dragons lazily glided in circles on the updrafts above the peaks. Dragons and humans weren't the only hunters in the mountains. She scanned the debris field of rocks in which the juniper grew. Nothing moved that she could see, but a few more quiet pops and ticks sounded. "Bai, goblin claws make that noise on rock." Baishya knelt ten paces away on an exposed outcrop of rock that stuck up above the densely packed snow field, which they'd walked halfway across. Head bowed, she raised a hand for silence. "Bai." Naiva kept her voice low. "We need to keep moving." "You're too impatient. My vision led me right here, I'm sure of it." "There's nothing to see." "Yes, there is. You just can't see it." "I don't think you can see it either. You just say so to get Grandmother's attention because you're not as good at hunting as I am." Baishya looked back over her shoulder with the familiar lift of her chin and roll of eyes. Everyone in the clan said the two girls looked exactly alike, but Naiva absolutely knew for certain she personally never had that smug look of complacency on her face, not ever. "No matter how accurately you throw your spear and how expertly you wield a knife, you're no use as a hunter if you can't keep your mouth shut. Especially not to complain about me. You didn't have to come with me." "Someone has to keep you safe when you hear voices telling you to climb sacred mountains that are off limits to ordinary people~" Naiva broke off. A low whumph like a huge bear stamping a foot shuddered through the air. Cracks shot across the hard surface of the snow higher up on the slope. Baishya pressed her hands to her face as if a bright light was blinding her. "They're here," she said in a tone of awe, oblivious to the danger. The snow broke, starting to slide. Naiva plunged forward, dragged Baishya off the outcrop, and threw them down behind it. They flattened themselves into a slight overhang, backs pressed against the rock. The booming roar of the avalanche deafened them. Naiva flipped up her outer mantle of krushok skin, holding it open with her arms as snow cascaded over the outcrop and roared on down the slope. But it wouldn't be enough. The mountain was named Eternal Ice because its snowfield was so solid and stable, a holy place where hunters dared not hunt and only whisperers would walk when they were lured there by the voices of the ancestors. Yet now all the snow and ice of generations had broken, and it was going to bury them. Naiva did not fear death. But she was suddenly furious that Baishya was so determined to prove herself as a shaman that she had to drag her twin on a reckless quest. So they would die together as they had been born together, locked in a cold tomb. 
Baishya's hands began to glow with a greenish light. The sight so astonished Naiva that she forgot to be afraid. As the snow poured down, cascading over the top of the overhang, sliding around the curve of the outcrop, burying them in the ice of the ancestors, her sister began to mold and shape the crushing snow into a wall in front of them. The snow thundered against this barrier, bowing it inward. Naiva held her breath, thinking the snow would splinter and give way. But the magical wall held. The noise lessened. The rumbling faded into a pregnant silence. It should have been too dark to see except Baishya's hands were glowing with the eerie, wispy light. #figure(image("001_Chronicle of Bolas: The Twins/01.jpg", width: 100%), caption: [Rattleclaw Mystic | Art by: <NAME>], supplement: none, numbering: none) Naiva's voice had frozen in her throat. Her breath blew clouds of mist in front of her eyes, only it was not her breath. The wall dissolved into a white haze like the soft fall of a heavy snowstorm. Gauzy figures walked out of the snowfall. They were mostly human in shape: tall, slender, walking on two legs but not on the snow, rather on the troubled gusts of air rising from the catastrophic collapse. One wore a cloth the color of moonlight wrapped around its waist, speckled with darts of green like glowing eyes. The others wore wispy scarves as delicate as dew-laden spider webs. Instead of hair and beards, they had filaments growing from their pale flesh. These delicate string-like tentacles curled and waved in strange patterns. Baishya touched her own ears as if trying to muffle a howling clamor of multiple people shouting all at once. Naiva heard nothing, still deafened by the aftermath of the roaring sound or perhaps because she wasn't worthy; she couldn't hear what the elementals were saying, if they were saying anything at all. Baishya's eyes rolled up in her head and she slumped forward in a faint. They had coaxed them up here to kill and eat them! Naiva grabbed her spear. Baishya jolted forward and clamped down hard on her sister's arm. "No! Don't be stupid. The windfolk came to warn us, not to hurt us." As if her voice was a shattering blow, the elementals vanished into a thick cloud of snowflakes; or maybe that was just a concealment spell used to hide their retreat. "You can't just hit first and ask questions later, Nai! You have to listen." "I didn't hear anything!" "You never do." Baishya shook snow off her mantle and eased out from under the overhang. Her gasp of shock shot fear into Naiva's bones. She pushed out beside her sister. Naiva had always walked boldly where Baishya crept with hesitation. But even for Naiva this was too much; she gaped at the destructive path cut by the avalanche. Wide stripes and patches of bare rock had been exposed on the mountainside. Half the massive snow field had caved away, pouring into the valley to smother it in vast heaps of snow. "Grandmother and the camp are down there!" Naiva cried, imagining their broken bodies. But she didn't cry. Tears would not bring them back. "They're all right." "How can you know?" "The windfolk told me. They called me up here to give me a message for Grandmother." "What did they say?" She rubbed her eyes as if they were burning. "I have to tell Grandmother." "Not me? Don't you trust me?" "Why do you always make it about you?" "I don't always make it about me!" A faint boom sounded as another avalanche ripped away at an unseen slope. "Sound causes avalanches, too," Baishya added in a whisper. 
"As if I don't know that!" "Then why are you still talking?" Naiva bit back a retort. It was so annoying when Baishya was right, but she was right, and Naiva knew better to risk loud sounds where another avalanche might easily break. She grabbed her spear and the pack. They picked their way as quickly as was safe across the remains of the snowfield. The avalanche had hit the debris field full on, tossing stones farther down the mountain. Here they found the corpses of a small pack of goblins, smashed and smothered. "Told you something was stalking us," Naiva muttered. Baishya waved a hand for silence. An object scraped softly on rock. Naiva whipped around just as a squat, blood-spattered goblin leaped from behind a boulder right at her. Its claws raked for her head, but she slammed its torso with the haft of her spear and sent it tumbling. The tip of its claw caught on her leather shoulder plate. She used its momentum to flip it off her and onto the ground. It hit hard, feet scrabbling for purchase as it attempted to get to its feet. She was faster, with a cut to its hip to cripple it, in and out through tough skin and gristle, followed by a stab to the face. The first poke missed, and the point skittered on the rock. The goblin snapped at her arm, teeth catching on her leather vambrace. She stomped hard once, slamming its head back again, then circled her spear point back and skewered it with a thrust through the eye into the brain. Blood leaked brightly over the snow. She gave herself a moment of grim amusement that she had reason to be thankful for the avalanche. A single goblin was no danger to a hunter, but against so many she and Baishya could have been overwhelmed. Baishya had her knife out, kicking each of the crushed goblins to make sure there was no life left in them. Naiva wiped her blade clean in the snow, shook out their game net, and rolled the small bodies into it. "The tribe isn't starving, Nai. No one wants to eat goblin." "We are not leaving meat behind. Not with dragons so close." Dragging the laden net behind them, they slogged over to where stands of hardy juniper offered a more stable path down into the valley. Clouds of white haze were still billowing skyward along the avalanche's path. The dragons, taking it as a game, raced in from the distant peaks to breathe fire onto the heaps of snow. Meltwater churned down the valley's cleft in rising bursts of whitewater. #figure(image("001_Chronicle of Bolas: The Twins/02.jpg", width: 100%), caption: [Thornwood Falls | Art by: <NAME>], supplement: none, numbering: none) "Even if they survived the avalanche, how can they survive such a flood?" Naiva whispered, heart cold. She hated being afraid. It made her angry. "The windfolk promised me." Yet Baishya's voice shook, no longer so certain. She reached for Naiva, and they clasped hands for shared reassurance. This was how it had always been: born when the midwife had cut open the belly of their dead mother, even then they'd been holding hands. The stream at the bottom of the valley had swollen into a rampaging river sweeping far past its banks and now brown with debris and soil and torn up vegetation. They could not descend directly into the valley lest they be swept up in the flood, so they took a longer route picking their way along the slope at an angle. "We could be moving faster if we didn't have to haul this dead weight." Baishya gestured to the lifeless goblins tumbled up into the net. "I say that to myself about you all the time!" 
Baishya laughed and stopped complaining, but in fact Naiva's mind was churning through every possible disaster. If Grandmother was dead, what then? Was it better to go to Ayagor, where there was a permanent encampment devoted to the feeding of Dragonlord Atarka? Or to join a new hunting band, one of the many that ranged widely through the vast territory of the Qal Sisma to find new sources of game? Or to journey to the borderlands where small hunting parties lived in defensible caves and ran patrols? She intended to survive, and that meant finding people who would take them in. People who wouldn't mind Baishya's absentmindedness when she burned a pan of roasting barley, or her dreamy staring at the sky when she was supposed to be scraping a hide. People who wouldn't just turn her twin over to Atarka once they discovered she was a shaman. Yet, what if Baishya was a burden heavier than the net of dead goblins? What if there was no group that would risk taking in a young, inexperienced whisperer whose presence could get them all killed? Could the two girls survive alone? Or would Naiva have to let her go? "Look there!" Baishya jolted to a halt, breathing hard. The waters had begun to recede, leaving the valley floor scoured clean of vegetation. Even trees had been torn from the ground and whirled downward to fetch up in teetering piles. A hill rose above one such heap of debris. Crowned with hardy fir trees, it had remained above the flood. People sheltered there, small as ants from this distance. By the time they trudged off the mountainside their legs were coated with mud and Naiva's whole body was aching. But a shout greeted them as they reached the hill. A sentry beckoned them in under the trees. Several fires were blazing as the big hunting party dried out. No tents had survived the scramble to safety, but the hunters had their gear. Grandmother was tending to several injured people. Her stern expression relaxed slightly as she saw them, but this touch of relief was all the emotion she allowed herself. "Naiva, what do you have there?" "A pack of dead goblins that were trying to sneak up on us." Grandmother nodded curtly. As always, she simply expected Naiva to have done the right thing without ever bothering to praise her. "Baishya, come aside with me." Naiva handed the net over to other hunters and followed Grandmother and Baishya into the trees. "What happened, girl? Some of the people are muttering that you going up the sacred mountain caused the avalanche. We barely escaped. Worse, this valley will take generations to recover. We've relied on the rich hunting here to feed ourselves now that Atarka demands so much meat." "It was the windfolk." "You saw the windfolk? They haven't communicated with us since we bowed our heads to Atarka. I doubt they trust us now." "They gave me a message for you, Grandmother." "For me?" "For Yasova Dragonclaw." #figure(image("001_Chronicle of Bolas: The Twins/03.jpg", width: 100%), caption: [Yasova Dragonclaw | Art by: <NAME>], supplement: none, numbering: none) Naiva leaned closer, hands curling into fists, shocked to hear Baishya speak that word. Atarka had banished the name Dragonclaw and eaten every person who had dared use the term in her presence. "Naiva, don't let anyone approach until she's finished." Grandmother grasped Baishya's arm. "Tell me everything." In the shadow of the fir trees the air felt colder than ever. An old skin of snow half circled the north-facing trunks of the big trees where the sun never reached. 
Baishya let out all her breath in a hissing exhale. Her voice got rougher as she slipped into a whisper trance, sinking back into the vision the windfolk had granted her. Naiva was no shaman, but she had always been able to sense vague aspects of her twin's thoughts. She too seemed to sink back into the midst of the killing avalanche when all the world was tumbling around them; however, it wasn't the memory but the vision through which they fell. #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) There is a shadow, a great shadow. It is not clouds, nor is it night. Ripples sweep through the vast airy gulf of the sky. The shadow is a magnificent creature, terrifying and dark and powerful, and it is blind, or maybe it was born in a place of blindness and does not know how to see. Its wings beat storms through the heavens. Out of the storms fall giant egg-stones in different colors. Some plummet without ever waking, but the ones who wake uncurl as they fall and shake themselves in the wide vast gulf of the sky. Their wings unfurl, for they are not eggs. They are the children of the great shadow that lives betwixt and between, in a place and in no place. They are newborn dragons curled up into a ball, and they fall tumbling out of the sky in a flurry of ice and wings. From one beat of the great shadow's wings, there fall seven such egg-stones onto a world that is not Tarkir, although there is no name for it in the language of the windfolk. First the brightest one uncurls. With the beat of pale wings, as it slows its descent, it opens its eyes and speaks: "Arcades Sabboth." By naming itself it takes control of its own destiny. No dragon would allow another to name it. Unlike the small beasts of the lower worlds, they always know exactly who they are. Then rises a dragon whose scales have a metallic sheen. His voice is measured and curious, as if surprised and delighted to discover he also has a name: "I am <NAME>. How interesting. What does this all mean?" A massive welter of reddish-green flashes outward to reveal spiral horns and a wild howl: "Palladia-Mors is my name! No one else can have it!" Two of the bigger egg-stones drop as if they are already dead. They crack into the hard ground and gouge impact craters into a mountainside. Soil and rock splash outward from each strike to make a ring of debris. "What is this place?" says <NAME> as he glides down to land a trifle gracelessly—he's still very young—on the peak of an isolated mountain rising in the midst of a vast plateau. The mountain is a smooth-sloped conical shape, symmetrical and pleasing, with a large crater at the top. He peers into the bowl of the crater but sees no huge broken egg. A warm wind rises out of the depths, hot and sulphuric. "Ah! What a pleasant heat!" He opens his wings, letting the sun dry out the dampness lingering on his still-soft scales. Craning his supple neck, he studies the landscape. The great shadow ripples across an expanse of forest and grassland toward a ridge of distant mountains. Sunlight returns behind its passage, gilding the scene with vivid colors. #figure(image("001_Chronicle of Bolas: The Twins/04.jpg", width: 100%), caption: [<NAME> | Art by: YW Tang], supplement: none, numbering: none) Arcades Sabboth alights beside him to bask. "Such a lot of trees everywhere around our perch. And look, there are all sorts of animals abounding here, some on four feet and some on two. Some are wild, and some have tamed themselves. They must all have names, just as we do. 
What is that assemblage of structures over by the river? It looks very orderly and interesting." The reddish-green dragon lands lower down to explore the fresh debris scattered by the impact of the two eggs into the mountain. She snorts in scorn at the shattered bodies lying broken inside. "These two were too weak to wake up. Good riddance." "Look!" Chromium gazes skyward. "There are two more!" Two small egg-stones tumble groundward, like an afterthought. Palladia-Mors grunts. "More weak, useless ones." She turns her attention toward distant grasslands where beasts graze in teeming herds. "I'm going hunting." With a huff of breath that almost kindles to flame, she launches herself into the sky. The slope of the mountain cuts off the trajectory of the last two egg-stones. Losing interest in the lost egg-stones, Arcades sweeps out his wings and flies toward the assemblage of structures. Yet Chromium Rhuell can't help but wonder what has become of the last ones, these younger siblings, especially when no tremor of impact shakes the ground. When he circles the peak, he sees nothing on its lower slopes: no impact crater, no freshly born dragons flying, nothing. Just a dense growth of trees cut through here and there with meadows. It's as if the other egg-stones dissolved, and maybe they did. Maybe they were of no more substance in this world than the Ur-wings that birthed them and fell back into the realm of blind shadow. He wonders what Arcades is up to and if he should go after him, then notices another egg-fall in the foothills of a far distant mountain range as the great shadow's wings make another beat: "More egg-stones falling! Cousins!" Intrigued, he flies away to seek them out. So he does not see the tangle of wings that unfold just before impact. The sixth egg-stone unfurls into a startled green dragon just before she crashes down into a clearing at the base of the mountain and rolls several times. Her clumsy landing surprises a party of hunters who, with nets, iron-tipped spears, and lean, ugly dogs, have just brought down a large carnivorous beast. Its blood is still steaming, fragrant and warm, and so the hunger consuming her belly is her first thought. She roars to scare them away. "I am <NAME>. Give me the meat, or I will kill you." The startled hunters and their dogs are so over-awed by her unexpected ferocity and shattering roar that they do not notice the last egg-stone. It unfurls into not one, but two small dragons born twinned together. Not twenty paces from the clearing they hit the canopy, crashing down through branches and, with twin thumps, come to rest on the forest floor amid a welter of needles and fern. "Ouch," says the smaller of the two. He rubs his head against the ground to wipe away a trickle of blood where the tough branches have scratched through the still-tender scales. The other one tries to shake open his bruised wings but is trapped by branches fallen like a net over him. A broken tree trunk pins his body. "I'm stuck," he says. "I'll help you," says the first, studying the other with a keen eye. "You're Nicol, aren't you? That's your name." "Of course it's my name. Hsst, quiet, Ugin. Look out there. What kind of greeting are they giving her? I don't trust them." In the clearing <NAME> roars again. The hunters back away from the beast they've killed. She is big compared to the bipeds, but when she lunges forward toward the carcass, her right wing drags a little. The fall injured her. The hunters exchange looks like speech. 
With nods and gestures, they fan out. Something about their demeanor has changed. They are still cautious and fearful, but as she gorges herself they slowly move to encircle her with a form of lesser cunning, sly and cowardly. When she raises her head to cough warning smoke at them, they fall back; when her attention returns to her meal, they creep forward again. "Stay still." Ugin starts picking at the debris with his foreclaws and mouth, trying to pull it apart without upsetting the entire heap into a crash that will draw attention to them. Nicol can't look away, gripped by a confusion, a frenzy churning in his gut: the blood and the anticipation swell like hunger; how dare these small, weak bipeds assault one of his own? The hunters fling a large net over her head. With a howl of surprise she thrusts upward, to fly. The hunters cling to the ends of the net, and at first her sheer staggering strength hauls those who can hold on right up off the ground, their feet kicking in the air. As she tops the nearest trees the net tangles so thoroughly in her wings that she loses her lift and flails downward. She crushes one hunter when she lands on him, thrashing and roaring. She bites at the rope, but now her damaged wing is also caught in a branch and she can't maneuver. Dogs bark excitedly, nipping at her flanks as she twists. "Hurry up! We have to help her." says Nicol. "Quiet. If they see us, you're trapped and at their mercy." #figure(image("001_Chronicle of Bolas: The Twins/05.jpg", width: 100%), caption: [Dragon | Art by: <NAME>], supplement: none, numbering: none) Nicol hisses. It's true they can do nothing as long as he's trapped. It's maddening. It's wrong! With a cough of stinging sparks she drives back the first attack. Her scorching breath drives two hunters to their knees. They shriek in pain as burns whiten their skin. The others fall back. One among them shouts orders, and again they rally, again they ready their spears. They attack from all sides, yelling loudly, goading each other. She claws the belly of one right open, guts spilling in a mass of ooze and stink. But his death gives the leader an opening to duck in on her other side and plunge his spear deep into the still-soft scales of her underbelly. Hot blood pumps out from the wound, spraying the leader from head to toe in red. She flops sideways, her caught wing tearing with a horrible ripping noise. Another hunter goes down beneath the bulk of her twisting body, but now her head is vulnerable. Two hunters thrust into her right eye. Dogs lunge for her open belly, scrabbling to dig deep and pull out her soft viscera. Yet still she struggles, still she fights because she is a dragon, and dragons never bow before lesser creatures. She crunches a dog between her teeth. Left side dragging, the two spears still wobbling from her eye, she pulls herself into the trees, seeking escape although there is no escape as the surviving hunters, including the leader clad in her blood, pursue her. Nicol is still stuck. He opens his mouth to roar fury, but Ugin clamps talons over his muzzle, smothering him. "Hush." Fortune favors the two young dragons this day: the hunted leads the hunters away from them. But they hear the shouting and the frenzied barking. Almost lost between all the noise comes the dragon's weak cough as she tries to burn them. There's more thrashing, a howl of pain, agonized yelps, a mortal scream. "Hurry up, Ugin!" says Nicol. "It's not too late. She's still killing them." "Kick with your right rear leg." 
Nicol kicks, dislodging a weight. "That was the last one." Impatient, Nicol surges forward, scrambling over a tumble of rough-barked limbs as the rest of the debris slides away onto the floor. As he and Ugin bolt into the clearing littered with the corpses of five hunters and three dogs, a chorus of triumphant shouts splits the air. The odor of mortality cuts like a gust of wind through the trees. The death of a dragon smells like honey. Its sweetness is its power, although these hunters don't know that yet. "It's too late," sighs Ugin. The heat of anger boils up from deep in Nicol's heart. He will burn them. #emph[Burn them] . Ugin grabs his right rear leg and tugs him to a halt. "There are many of them and only two of us. We are smaller than our sister." "We are not injured." "We can do nothing for her." "We can avenge her. These puny creatures cannot be allowed to attack us." "We must find the others first. Safety in numbers, as the hunters had. Not one of them could have taken her alone." "What others?" "Other dragons who fell with us. Our siblings. Did you not notice them?" Nicol looks at the cloudless sky and the dizzyingly brilliant sun. The sun is magnificent, bolder and brighter than anything else, dazzling and powerful, the antithesis of shadow and fear. "I am not afraid of the hunters," he says, sure that the sun fears nothing. #figure(image("001_Chronicle of Bolas: The Twins/06.jpg", width: 100%), caption: [Mountain | Art by: <NAME> Ro], supplement: none, numbering: none) "Of course you aren't." "I'm not!" Ugin is young but clever. He sees that to argue will gain him nothing. "Come, Nicol. Let us climb to the top of the peak and see if we can spot our siblings." Nicol is not going to admit he did not notice any dragons except <NAME>. But more than that, he despises running away like a fear-struck weakling. Yet the dogs have started barking with the fierce yaps that mean they have caught a new scent. The hunters are puny, true, and their sister killed five of them already, but they've proven they can work together to accomplish a task that would be impossible for anyone alone. "Which way?" "Up." Ugin takes an awkward running start and jumps with a flap of wings, then thuds down onto the ground. It would have been funny if they weren't about to be set upon by emboldened killers. "I can do it," says Nicol. The chorus of frantic barking intensifies as several dogs race into the clearing. A kick of adrenalin surges through him. He leaps forward onto the lead dog and rips its head off with a single bite. Salty blood saturates his mouth. He chomps several times and swallows. It would taste better if he could savor it, but teeth nip at his flanks as other dogs race around him, snapping. "Nicol! They're coming." "Only cowards run!" "Only fools mistake prudence for cowardice." Annoyed because Ugin is right, Nicol swipes with a claw in a big circle, driving back the dogs. More break through the bushes at the clearing's edge. The hunters' voices are getting louder. When he pushes off with his rear legs and flaps his wings he rises faster than expected; even so, he's still awkward. His lower clawed feet brush across the pointed crowns of fir trees. He barely flies out of the clearing without getting tangled in the trees again. But he is out, away from the hunters, some of whom have now run into the clearing. They stare up after him, no doubt in awe. As he rises above the forest, he starts flying toward the peak. He looks back, suddenly worried. Ugin has vanished. "Over here!" 
His twin has passed him already. They race to the summit and land in a welter of wings. Nicol wipes blood from his muzzle onto his forelegs. The blood is already cooling and congealing, but the pound of his heart is still going strong. How easy it was to rip the animal's head from its neck! He could have torn through all the dogs because their teeth can't penetrate his scales. It is the hunters who are dangerous, with their weapons and the way they work together to achieve something they cannot do alone. Then he sees the nearest impact crater and inside it the body of a dragon, much larger than either he or Ugin. It did not survive the fall. "Which death is worse?" he asks. "Never to waken, or to waken and live your few moments in a frenzy of fear and fighting?" Ugin does not reply. He stares all around at the landscape. The world is not new, but they are new, like infants whose eyes cannot fully understand what they see: green forest, yellow-green grassy plains, the silver threads of rivers winding their way across a wide plateau. All sorts of creatures wander this wide world. Everything waits to be discovered. Ugin shifts his gaze upward and for the longest time stares at the heavens above. "Where did we come from?" he asks. "Where did our progenitor go? What lies beyond the sky?" "I see one!" Nicol spots a dragon swooping low over a herd of animals. It's exhilarating to watch the prey scatter in fear. The dragon snatches a running beast with such grace and power. #figure(image("001_Chronicle of Bolas: The Twins/07.jpg", width: 100%), caption: [Art by: <NAME>], supplement: none, numbering: none) The yapping still sounds from below as the dogs find the forest debris where he and Ugin landed. When he thinks of the dead sister, he wants to tear all the hunters and dogs to pieces, but maybe the fault doesn't lie with them. They just took the opportunity to get something they wanted. Maybe the fault lies with the dragons who didn't survive. He can still hear the death howl of Merrevia. Dying isn't wonderful. It's bad. But being the hunter: that's a better thing. He climbs to an outcropping that will allow him to drop into an updraft; he's already getting a sense for this world, for the way invisible winds and currents can help you find your path. Before he launches, he halts, feeling the lack of his twin's presence, and turns back. Ugin hasn't moved. He's still staring dreamily at the landscape. "You fool," says Bolas, "we have to keep up with the others. Warn them about the hunters. Learn how to find our revenge. Hurry!" Ugin turns a calm gaze toward Nicol. His eyes are like crystals with depths that give way to mysteries. He says, "Someone is looking for you, <NAME>. Come to me." #v(0.35em) #line(length: 100%, stroke: rgb(90%, 90%, 90%)) #v(0.35em) A shout of warning broke through Baishya's raspy voice. Baishya blinked wildly, swayed as the vision left her, and collapsed into Grandmother's strong arms. Naiva grabbed her spear and ran for the edge of the trees. Three dragons had landed at the edge of the makeshift camp. They were Atarka's broodlings, with stocky bodies and antlered crests. The two big ones huffed threatening curls of flame, but like most of Atarka's brood, they hadn't much in the way of a mind to think with. The smallest, however, had a look of cunning in its fiery eyes. It spoke only Dragonspeech, expecting them to understand. "We smell magic in the air. Surrender your shamans to us, or we will kill you all." Naiva's pulse raced, and her mouth went dry. 
She tightened a hand on her spear as she exchanged glances with the uninjured hunters, all of whom stood, like her, with spears held upright at their side—meant to look unthreatening, they could defend at a moment's notice. And yet to defend meant attacking the dragons, and such an attack would cause a war between Atarka and the clan. The humans could not win this war; that's what Grandmother had understood eighteen years ago. Was it better to die fighting or to live cringing? "What heralds have approached this humble band?" Grandmother emerged alone from the trees. She carried no weapon; the dragon claw staff that had once announced her position as clan chief had been hidden deep in a secret cave, guarded by concealed whisperers. A fake one had been carved and given to Atarka to destroy. But Grandmother was weapon enough in her own presence. If she feared anything, Naiva had yet to learn what it was. "I am Yasova, First Mother of this hunting band. Do you have a name, honored broodling?" The broodling spat a tongue of flame harmlessly onto the ground. "A big snowfall tore the ice and snow off the mountain. How are you not dead in the snowfall? Torn apart like the trees? We smelled the foul odor of magic. This work is forbidden to you by order of Dragonlord Atarka." Grandmother gestured toward the firs standing straight and tall behind them. "We camped upon this hill," she lied, for anyone who knew anything about camps or had half a brain could see there was no sign of firepits and temporary shelters. "The avalanche and flood passed below us. We ask your permission to continue our journey." The dragon blinked once and then a second time as thoughts crawled across its slow mind. "Where do you go?" They had been planning to stay a full cycle of the moon in the verdant valley before returning toward Ayagor, so Naiva was surprised at Grandmother's next words. "We have been assigned by our hunt caller to patrol the eastern range of the Qal Sisma against the incursions of enemy clans. We'd like to keep traveling while there is still daylight. For your trouble, and out of respect, we have gathered a little snack for you." She caught Naiva's eye and lifted her chin in the direction of the net. With the help of one of the other hunters, Naiva dragged it forward and shook out the corpses onto the rocky slope. The two big dragons snuffled eagerly, looking toward their leader for permission to eat. Even the small one was distracted by the offering of an unexpected treat. They were a greedy lot, and their hunger was their frailty. As they tore into the goblins, Grandmother drew everyone back into the shelter of the trees. "Make ready to move," she said. "The injured who cannot move must remain here with supplies until we can return for them." "Where are we really going?" asked Naiva. Grandmother gave her an impatient look. "You should already know." Naiva's cheeks flamed with humiliation. Fingers brushed her sleeve, and she turned to find Baishya beside her, face flushed, as with a fever. "Didn't you hear, Nai? The vision was passed to me by the windfolk but it didn't come from them." "Who did it come from?" "From Ugin, the Spirit Dragon." "Ugin is dead. Grandmother was there and saw him die. She's told us that story a hundred times." "Yes. That's why we have to go to Ugin's grave. We must find out what this vision portends."
https://github.com/jens-hj/ds-exam-notes
https://raw.githubusercontent.com/jens-hj/ds-exam-notes/main/lectures/10.typ
typst
#import "../lib.typ": * #show link: it => underline(emph(it)) #set math.equation(numbering: "(1)") #set enum(full: true) #set math.mat(delim: "[") #set math.vec(delim: "[") #set list(marker: text(catppuccin.latte.lavender, sym.diamond.filled)) #show heading.where(level: 1): it => text(size: 22pt, it) #show heading.where(level: 2): it => text(size: 18pt, it) #show heading.where(level: 3): it => { text(size: 14pt, mainh, pad( left: -0.4em, gridx( columns: (auto, 1fr), align: center + horizon, it, rule(stroke: 1pt + mainh) ) )) } #show heading.where(level: 4): it => text(size: 12pt, secondh, it) #show heading.where(level: 5): it => text(size: 12pt, thirdh, it) #show heading.where(level: 6): it => text(thirdh, it) #show emph: it => text(accent, it) #show ref: it => { //let sup = it.supplement let el = it.element if el == none { it.citation } else { let eq = math.equation // let sup = el.supplement if el != none and el.func() == eq { // The reference is an equation let sup = if it.fields().at("supplement", default: "none") == "none" { [Equation] } else { [] } // [#it.has("supplement")] show regex("\d+"): set text(accent) let n = numbering(el.numbering, ..counter(eq).at(el.location())) [#sup #n] } else if it.citation.has("supplement") { if el != none and el.func() == eq { show regex("\d+"): set text(accent) let n = numbering(el.numbering, ..counter(eq).at(el.location())) [#el.supplement #n] } else { text(accent)[#it] } } } } === General about Object Storage - More recent - Not Hierarchical inherently - Can mimic hierarchy with how you name objects - Backups can take advantage of the mimicked hierarchy - Quite a saturated market already - Amazon S3 - Microsoft Azure - HP Cloud Object Storage - etc. - Open Source Solutions - OpenStack Swift (Python) - Ceph - Riak CS - etc. 
=== Before - Relational databases (SQL) - NoSQL databases - Block Storage - SAN - Oldest and simplest - Fixed-sized chunks - Block is a portion of data - Address: identify part of block - No block metadata - Limits scalability - Good performance with local app and storage - More latency the further apart - File Storage - NAS - Hierarchical file system - Works well with _smaller files_ - Issues when retrieving large amounts of data - Not scalable - Unique address #ra finite number of files can be stored #report-block[ Make sure to understand the difference between block and file storage ] === What is an Object - File + Metadata ==== Example #gridx( columns: 2, image("../img/9/os-ex.png"), ) === What Object Storage - Collection of _a lot_ of objects - Enables metadata search (kinda similar capability to a database) === Why Object Storage Enables loading large amounts of data *Benefits:* - Big data - Capacity - Scalability - Cheap #gridx( columns: 2, image("../img/9/os-why.png"), image("../img/9/os-comp.png"), ) === What is Object Storage NOT #gridx( columns: 2, image("../img/9/os-not.png"), ) === OpenStack Swift - Can do some erasure coding #ra think about it as systematic Reed Solomon - No master/slave coordinator/worker architecture - One extra level: Container (bucket) - Container is a collection of objects - Container can have metadata - Replica management similar to Hadoop - With a crawler/scanner that checks for consistency - Takes action if inconsistency is found - Scales linearly #gridx( columns: 2, image("../img/9/os-swift.png"), image("../img/9/os-swift2.png"), image("../img/9/os-interaction.png"), image("../img/9/os-objects.png"), image("../img/9/os-swift-arch.png"), image("../img/9/os-swift-arch2.png"), image("../img/9/os-swift-arch3.png"), image("../img/9/os-swift-arch4.png"), image("../img/9/os-swift-arch5.png"), image("../img/9/os-swift-arch6.png"), ) ==== Regions and Zones - Enables handling of very different data-center topologies *Regions:* - Separate physical locations - Minimum of 1 regions (~1 data center) - A region can be defined within data centers, but is typically not used *Zones:* - Independent failure domains - Minimum of 1 zone per region - Zones are typically divisions within a data center - Failure domains - Maybe physical fire walls - Maybe different power sources - Maybe different network switches - Maybe different racks *Cluster:* - A collection of regions - Make these global for optimal reliability - Specific regions - Can have different policies - Like different replication policies - Different erasure coding policies *Differences between Hadoop and Swift:* - Hadoop verified to the client right after receiving the data - Fast! - Swift only tells client stuff is done after complete storage is finished - Slow! - But more reliable - Hadoop only spread on two racks - Two racks fail #ra data is lost - Faster! (Than Swift) - Swift object clusters - 1 replica on 3 different racks in different zones in different regions - Slower! (Than Hadoop) - Highly reliable - Spread out as much as possible #gridx( columns: 2, image("../img/9/os-regions.png"), image("../img/9/os-zones.png"), image("../img/9/os-cluster.png"), image("../img/9/os-policies.png"), ) === When to use what - Hybrid options are best! #gridx( columns: 2, image("../img/9/os-comp-full.png"), ) #report-block[ Replication placement strategies still apply to object storage \ #ra Doesn't depend on the type of storage \ #ra Simply just strategies for how to place replicas ]
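The "spread replicas out as far as possible" idea behind regions and zones can be sketched in a few lines. The snippet below is only an illustration of the placement preference described in these notes, not OpenStack Swift's actual ring-builder code; the device list and field names are made up.

```python
# Minimal sketch of zone/region-aware replica placement:
# prefer an unused region, then an unused zone, then any remaining device.
def place_replicas(devices, n_replicas=3):
    """devices: list of dicts like {"id": "d1", "region": "r1", "zone": "z1"}."""
    chosen, used_regions, used_zones = [], set(), set()

    def spread_key(d):
        # False sorts before True, so devices in unused regions/zones win.
        return (d["region"] in used_regions, d["zone"] in used_zones)

    for _ in range(n_replicas):
        candidates = [d for d in devices if d not in chosen]
        if not candidates:
            break
        best = min(candidates, key=spread_key)
        chosen.append(best)
        used_regions.add(best["region"])
        used_zones.add(best["zone"])
    return chosen


devices = [
    {"id": "d1", "region": "eu", "zone": "z1"},
    {"id": "d2", "region": "eu", "zone": "z2"},
    {"id": "d3", "region": "us", "zone": "z1"},
    {"id": "d4", "region": "us", "zone": "z2"},
]
print([d["id"] for d in place_replicas(devices)])  # ['d1', 'd4', 'd2']
```

Swift's real ring builder is considerably more involved (device weights, hand-off nodes, partition counts), but the unique-as-possible preference across regions and zones sketched here is the same basic idea.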
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/math/syntax-02.typ
typst
Other
// Test common symbols. $ dot \ dots \ ast \ tilde \ star $
https://github.com/sysu/better-thesis
https://raw.githubusercontent.com/sysu/better-thesis/main/others/master-proposal.typ
typst
MIT License
#import "style.typ": 字体, 字号 #let table-stroke = 0.5pt #set page(numbering: "1") #set par(leading: 1em) // 封面 #{ set align(center) v(5em) image("master-proposal-logo.png", width: 7cm) v(2em) par(text( font: 字体.黑体, size: 字号.一号, weight: "bold", "某某学院\n硕士研究生论文开题报告", )) v(8em) let info-inset = 0pt let info-key(body) = { rect( width: 100%, inset: info-inset, stroke: none, text(font: 字体.黑体, size: 字号.三号, weight: "bold", body), ) } let info-value(body) = { set align(center) rect( width: 100%, inset: info-inset, stroke: (bottom: table-stroke + black), text( font: 字体.黑体, size: 字号.三号, bottom-edge: "descender", body, ), ) } grid( columns: (100pt, 150pt), column-gutter: -5pt, row-gutter: 15pt, info-key[论 文 题 目], info-value[题目], info-key[作 者 姓 名], info-value[张三], info-key[作 者 学 号], info-value[1234567890], info-key[专 业 名 称], info-value[某专业], info-key[研 究 方 向], info-value[某方向], info-key[指 导 教 师], info-value[李四], ) v(10em) text( font: 字体.黑体, size: 字号.三号, weight: "bold", "二0  年  月  日", ) } #pagebreak(weak: true) // 主体 #align(center, text( font: 字体.黑体, size: 字号.三号, weight: "bold", "开题报告", )) #set text(font: 字体.宋体, size: 字号.五号) #set underline(offset: 0.1em) #table( columns: (2em, 1fr), align: (center + horizon, auto), stroke: table-stroke, inset: 10pt, )[ 题 \ \ 目 ][ #v(5em) ][ 题 \ \ 目 \ \ 来 \ \ 源 \ \ 及 \ \ 类 \ \ 型 ][ #v(25em) ][ 研 \ \ 究 \ \ 背 \ \ 景 \ \ 及 \ \ 意 \ \ 义 ][ #v(1fr) ][ 国 \ \ 内 \ \ 外 \ \ 现 \ \ 状 \ \ 及 \ \ 分 \ \ 析 ][ #v(1fr) ][ 研 \ \ 究 \ \ 目 \ \ 标 \ \ 、 \ \ 研 \ \ 究 \ \ 内 \ \ 容 \ \ 和 \ \ 拟 \ \ 解 \ \ 决 \ \ 的 \ \ 关 \ \ 键 \ \ 问 \ \ 题 ][ #v(1fr) ][ 研 \ \ 究 \ \ 方 \ \ 法 \ \ 、 \ \ 设 \ \ 计 \ \ 及 \ \ 试 \ \ 验 \ \ 方 \ \ 案 \ \ 、 \ \ 可 \ \ 行 \ \ 性 \ \ 分 \ \ 析 ][ #v(1fr) ][ 计 \ \ 划 \ \ 进 \ \ 度 \ \ 和 \ \ 质 \ \ 量 \ \ 保 \ \ 证 ][ #v(1fr) ][ 预 \ \ 期 \ \ 成 \ \ 果 \ \ 与 \ \ 创 \ \ 新 \ \ 点 ][ #v(1fr) ][ 参 \ \ 考 \ \ 文 \ \ 献 \ \ ︵ \ \ 不 \ \ 少 \ \ 于 \ \ 20 \ \ 篇 \ \ ︶ ][ #v(1fr) ][ 导 \ \ 师 \ \ 意 \ \ 见 ][ #v(28em) ][ 考 \ \ 核 \ \ 小 \ \ 组 \ \ 意 \ \ 见 \ \ 及 \ \ 结 \ \ 论 ][ #v(20em) 是否进入论文写作:#h(1em) 是 #sym.ballot #h(2em) 否 #sym.ballot // 是否进入论文写作:#h(1em) 是 #sym.ballot.x #h(2em) 否 #sym.ballot #v(5em) 签字:#underline(" " * 10) #h(1em) #underline(" " * 10) #h(1em) #underline(" " * 10) #h(1em) #v(3em) 日期:#h(4em) 年 #h(2em) 月 #h(2em) 日 ] *注:需向导师提供电子版开题报告,可打印粘贴。*
https://github.com/liuguangxi/suiji
https://raw.githubusercontent.com/liuguangxi/suiji/main/tests/test-discrete-f.typ
typst
MIT License
#set document(date: none) #import "/src/lib.typ": * #let print-arr(arr) = { if type(arr) != array { [#raw(str(arr) + " ")] } else { [#raw(arr.map(it => str(it)).join(" "))] } } #{ let rng = gen-rng-f(42) let arr = () let p = (1,) let g = discrete-preproc-f(p) [#g \ ] (rng, arr) = discrete-f(rng, g) raw(repr(arr)); [ \ ] (rng, arr) = discrete-f(rng, g, size: 1) raw(repr(arr)); [ \ ] (rng, arr) = discrete-f(rng, g, size: 0) raw(repr(arr)); [ \ ] (rng, arr) = discrete-f(rng, g, size: 10) print-arr(arr); parbreak() let p = (1, 3, 5, 0, 6, 4, 2) let g = discrete-preproc-f(p) [#g \ ] (rng, arr) = discrete-f(rng, g, size: 100) print-arr(arr); parbreak() let p = (0.0, 0.0, 0.0, 0.0) let g = discrete-preproc-f(p) [#g \ ] (rng, arr) = discrete-f(rng, g, size: 100) print-arr(arr); parbreak() }
https://github.com/fenjalien/metro
https://raw.githubusercontent.com/fenjalien/metro/main/src/metro.typ
typst
Apache License 2.0
#import "defs/units.typ" #import "defs/prefixes.typ" #import "impl/impl.typ" #import "utils.typ": combine-dict #import "dependencies.typ": strfmt #let _state-default = ( units: units._dict, prefixes: prefixes._dict, prefix-power-tens: prefixes._power-tens, powers: ( square: impl.raiseto([2]), cubic: impl.raiseto([3]), squared: impl.tothe([2]), cubed: impl.tothe([3]) ), qualifiers: (:), ) #let _state = state("metro-setup", _state-default) #let metro-reset() = _state.update(_ => return _state-default) #let metro-setup(..options) = _state.update(s => { return combine-dict(options.named(), s) }) #let declare-unit(unt, symbol) = _state.update(s => { s.units.insert(unt, symbol) return s }) #let create-prefix = math.class.with("unary") #let declare-prefix(prefix, symbol, power-tens) = _state.update(s => { s.prefixes.insert(prefix, symbol) s.prefix-power-tens.insert(prefix, power-tens) return s }) #let declare-power(before, after, power) = _state.update(s => { s.powers.insert(before, impl.raiseto([#power])) s.powers.insert(after, impl.tothe([#power])) return s }) #let declare-qualifier(quali, symbol) = _state.update(s => { s.qualifiers.insert(quali, impl.qualifier(symbol)) return s }) #let unit(input, ..options) = context { return impl.unit( input, combine-dict( options.named(), _state.get() ) ) } #let num(number, e: none, pm: none, pw: none, ..options) = context { return impl.num( number, exponent: e, uncertainty: pm, power: pw, combine-dict( options.named(), _state.get() ) ) } #let qty( number, units, e: none, pm: none, pw: none, ..options ) = context { return impl.qty( number, units, e: e, pm: pm, pw: pw, combine-dict(options.named(), _state.get()) ) } #let num-list( ..numbers-options, ) = context { assert( numbers-options.pos().len() > 1, message: strfmt("Expected at least two numbers, got {} instead!", numbers-options.pos().len()) ) return impl.num-list( numbers-options.pos(), combine-dict(numbers-options.named(), _state.get()) ) } #let qty-list( ..numbers-options, ) = context { assert( numbers-options.pos().len() > 2, message: strfmt("Expected at least two numbers and a unit, got {} instead!", numbers-options.pos().len()) ) let numbers = numbers-options.pos() return impl.qty-list( unit: numbers.pop(), numbers, combine-dict(numbers-options.named(), _state.get()), ) } #let num-product( ..numbers-options, ) = context { assert( numbers-options.pos().len() > 1, message: strfmt("Expected at least two numbers, got {} instead!", numbers-options.pos().len()) ) return impl.num-product( numbers-options.pos(), combine-dict(numbers-options.named(), _state.get()) ) } #let qty-product( ..numbers-options, ) = context { assert( numbers-options.pos().len() > 1, message: strfmt("Expected at least two numbers and a unit, got {} instead!", numbers-options.pos().len()) ) let numbers = numbers-options.pos() return impl.qty-product( unit: numbers.pop(), numbers, combine-dict(numbers-options.named(), _state.get()) ) } #let num-range( n1, n2, ..options, ) = context { return impl.num-range( n1, n2, combine-dict(options.named(), _state.get()) ) } #let qty-range( n1, n2, unit, ..options, ) = context { return impl.qty-range( n1, n2, unit: unit, combine-dict(options.named(), _state.get()) ) } #let complex( real, imag, ..unit-options, ) = context { let unit = unit-options.pos() if unit.len() == 1 { unit = unit.first() } else if unit == () { unit = none } else { panic(strfmt("Expected only one or none positional argument, got {}", unit.len())) } return impl.complex( real, imag, unit, combine-dict(unit-options.named(), 
_state.get()) ) } #let ang( ..ang-options, ) = context { return impl.ang( ang-options.pos(), combine-dict(ang-options.named(), _state.get()) ) }
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-11100.typ
typst
Apache License 2.0
#let data = ( ("CHAKMA SIGN CANDRABINDU", "Mn", 230), ("CHAKMA SIGN ANUSVARA", "Mn", 230), ("CHAKMA SIGN VISARGA", "Mn", 230), ("CHAKMA LETTER AA", "Lo", 0), ("CHAKMA LETTER I", "Lo", 0), ("CHAKMA LETTER U", "Lo", 0), ("CHAKMA LETTER E", "Lo", 0), ("CHAKMA LETTER KAA", "Lo", 0), ("CHAKMA LETTER KHAA", "Lo", 0), ("CHAKMA LETTER GAA", "Lo", 0), ("CHAKMA LETTER GHAA", "Lo", 0), ("CHAKMA LETTER NGAA", "Lo", 0), ("CHAKMA LETTER CAA", "Lo", 0), ("CHAKMA LETTER CHAA", "Lo", 0), ("CHAKMA LETTER JAA", "Lo", 0), ("CHAKMA LETTER JHAA", "Lo", 0), ("CHAKMA LETTER NYAA", "Lo", 0), ("CHAKMA LETTER TTAA", "Lo", 0), ("CHAKMA LETTER TTHAA", "Lo", 0), ("CHAKMA LETTER DDAA", "Lo", 0), ("CHAKMA LETTER DDHAA", "Lo", 0), ("CHAKMA LETTER NNAA", "Lo", 0), ("CHAKMA LETTER TAA", "Lo", 0), ("CHAKMA LETTER THAA", "Lo", 0), ("CHAKMA LETTER DAA", "Lo", 0), ("CHAKMA LETTER DHAA", "Lo", 0), ("CHAKMA LETTER NAA", "Lo", 0), ("CHAKMA LETTER PAA", "Lo", 0), ("CHAKMA LETTER PHAA", "Lo", 0), ("CHAKMA LETTER BAA", "Lo", 0), ("CHAKMA LETTER BHAA", "Lo", 0), ("CHAKMA LETTER MAA", "Lo", 0), ("CHAKMA LETTER YYAA", "Lo", 0), ("CHAKMA LETTER YAA", "Lo", 0), ("CHAKMA LETTER RAA", "Lo", 0), ("CHAKMA LETTER LAA", "Lo", 0), ("CHAKMA LETTER WAA", "Lo", 0), ("CHAKMA LETTER SAA", "Lo", 0), ("CHAKMA LETTER HAA", "Lo", 0), ("CHAKMA VOWEL SIGN A", "Mn", 0), ("CHAKMA VOWEL SIGN I", "Mn", 0), ("CHAKMA VOWEL SIGN II", "Mn", 0), ("CHAKMA VOWEL SIGN U", "Mn", 0), ("CHAKMA VOWEL SIGN UU", "Mn", 0), ("CHAKMA VOWEL SIGN E", "Mc", 0), ("CHAKMA VOWEL SIGN AI", "Mn", 0), ("CHAKMA VOWEL SIGN O", "Mn", 0), ("CHAKMA VOWEL SIGN AU", "Mn", 0), ("CHAKMA VOWEL SIGN OI", "Mn", 0), ("CHAKMA O MARK", "Mn", 0), ("CHAKMA AU MARK", "Mn", 0), ("CHAKMA VIRAMA", "Mn", 9), ("CHAKMA MAAYYAA", "Mn", 9), (), ("CHAKMA DIGIT ZERO", "Nd", 0), ("CHAKMA DIGIT ONE", "Nd", 0), ("CHAKMA DIGIT TWO", "Nd", 0), ("CHAKMA DIGIT THREE", "Nd", 0), ("CHAKMA DIGIT FOUR", "Nd", 0), ("CHAKMA DIGIT FIVE", "Nd", 0), ("CHAKMA DIGIT SIX", "Nd", 0), ("CHAKMA DIGIT SEVEN", "Nd", 0), ("CHAKMA DIGIT EIGHT", "Nd", 0), ("CHAKMA DIGIT NINE", "Nd", 0), ("CHAKMA SECTION MARK", "Po", 0), ("CHAKMA DANDA", "Po", 0), ("CHAKMA DOUBLE DANDA", "Po", 0), ("CHAKMA QUESTION MARK", "Po", 0), ("CHAKMA LETTER LHAA", "Lo", 0), ("CHAKMA VOWEL SIGN AA", "Mc", 0), ("CHAKMA VOWEL SIGN EI", "Mc", 0), ("CHAKMA LETTER VAA", "Lo", 0), )
https://github.com/mariunaise/HDA-Thesis
https://raw.githubusercontent.com/mariunaise/HDA-Thesis/master/graphics/plots/temperature/global_diffs/global_diffs.typ
typst
#import "@preview/cetz:0.2.2" #let data = csv("./sorted_configurations_with_diff.csv") #let errorrate = data.enumerate().map( row => (row.at(0),calc.log(float(row.at(1).at(1)))) ) #let diff = data.enumerate().map( row => (row.at(0),float(row.at(1).at(2))) ) #let conf = data.enumerate().map( row => (row.at(0), row.at(1).at(0)) ) #let formatter(v) = [$10^#v$] #cetz.canvas({ import cetz.draw: * import cetz.plot set-style( axes: (bottom: (tick: (label: (angle: 90deg, offset: 0.5)))) ) plot.plot( y-label: "Bit error rate", x-label: "Enrollment, reconstruction temperature", legend: "legend.north", legend-style: (offset: (2.25, 0), stroke: none), x-tick-step: none, x-ticks: conf, y-format: formatter, y-tick-step: 0.5, axis-style: "scientific-auto", size: (16,6), plot.add(errorrate, axes: ("x", "y"), style: (stroke: (paint: red)), label: $op("BER")(100, 2^2)$), plot.add-hline(1) ) plot.plot( y2-label: "Temperature difference", legend: "legend.north", legend-style: (offset: (-2.25, 0), stroke: none), y2-tick-step: 10, axis-style: "scientific-auto", size: (16,6), plot.add(diff, axes: ("x1","y2"), label: [Temperature difference]), ) })
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/unichar/0.1.0/ucd/block-1B00.typ
typst
Apache License 2.0
#let data = ( ("BALINESE SIGN ULU RICEM", "Mn", 0), ("BALINESE SIGN ULU CANDRA", "Mn", 0), ("BALINESE SIGN CECEK", "Mn", 0), ("BALINESE SIGN SURANG", "Mn", 0), ("BALINESE SIGN BISAH", "Mc", 0), ("BALINESE LETTER AKARA", "Lo", 0), ("BALINESE LETTER AKARA TEDUNG", "Lo", 0), ("BALINESE LETTER IKARA", "Lo", 0), ("BALINESE LETTER IKARA TEDUNG", "Lo", 0), ("BALINESE LETTER UKARA", "Lo", 0), ("BALINESE LETTER UKARA TEDUNG", "Lo", 0), ("BALINESE LETTER RA REPA", "Lo", 0), ("BALINESE LETTER RA REPA TEDUNG", "Lo", 0), ("BALINESE LETTER LA LENGA", "Lo", 0), ("BALINESE LETTER LA LENGA TEDUNG", "Lo", 0), ("BALINESE LETTER EKARA", "Lo", 0), ("BALINESE LETTER AIKARA", "Lo", 0), ("BALINESE LETTER OKARA", "Lo", 0), ("BALINESE LETTER OKARA TEDUNG", "Lo", 0), ("BALINESE LETTER KA", "Lo", 0), ("BALINESE LETTER KA MAHAPRANA", "Lo", 0), ("BALINESE LETTER GA", "Lo", 0), ("BALINESE LETTER GA GORA", "Lo", 0), ("BALINESE LETTER NGA", "Lo", 0), ("BALINESE LETTER CA", "Lo", 0), ("BALINESE LETTER CA LACA", "Lo", 0), ("BALINESE LETTER JA", "Lo", 0), ("BALINESE LETTER JA JERA", "Lo", 0), ("BALINESE LETTER NYA", "Lo", 0), ("BALINESE LETTER TA LATIK", "Lo", 0), ("BALINESE LETTER TA MURDA MAHAPRANA", "Lo", 0), ("BALINESE LETTER DA MURDA ALPAPRANA", "Lo", 0), ("BALINESE LETTER DA MURDA MAHAPRANA", "Lo", 0), ("BALINESE LETTER NA RAMBAT", "Lo", 0), ("BALINESE LETTER TA", "Lo", 0), ("BALINESE LETTER TA TAWA", "Lo", 0), ("BALINESE LETTER DA", "Lo", 0), ("BALINESE LETTER DA MADU", "Lo", 0), ("BALINESE LETTER NA", "Lo", 0), ("BALINESE LETTER PA", "Lo", 0), ("BALINESE LETTER PA KAPAL", "Lo", 0), ("BALINESE LETTER BA", "Lo", 0), ("BALINESE LETTER BA KEMBANG", "Lo", 0), ("BALINESE LETTER MA", "Lo", 0), ("BALINESE LETTER YA", "Lo", 0), ("BALINESE LETTER RA", "Lo", 0), ("BALINESE LETTER LA", "Lo", 0), ("BALINESE LETTER WA", "Lo", 0), ("BALINESE LETTER SA SAGA", "Lo", 0), ("BALINESE LETTER SA SAPA", "Lo", 0), ("BALINESE LETTER SA", "Lo", 0), ("BALINESE LETTER HA", "Lo", 0), ("BALINESE SIGN REREKAN", "Mn", 7), ("BALINESE VOWEL SIGN TEDUNG", "Mc", 0), ("BALINESE VOWEL SIGN ULU", "Mn", 0), ("BALINESE VOWEL SIGN ULU SARI", "Mn", 0), ("BALINESE VOWEL SIGN SUKU", "Mn", 0), ("BALINESE VOWEL SIGN SUKU ILUT", "Mn", 0), ("BALINESE VOWEL SIGN RA REPA", "Mn", 0), ("BALINESE VOWEL SIGN RA REPA TEDUNG", "Mc", 0), ("BALINESE VOWEL SIGN LA LENGA", "Mn", 0), ("BALINESE VOWEL SIGN LA LENGA TEDUNG", "Mc", 0), ("BALINESE VOWEL SIGN TALING", "Mc", 0), ("BALINESE VOWEL SIGN TALING REPA", "Mc", 0), ("BALINESE VOWEL SIGN TALING TEDUNG", "Mc", 0), ("BALINESE VOWEL SIGN TALING REPA TEDUNG", "Mc", 0), ("BALINESE VOWEL SIGN PEPET", "Mn", 0), ("BALINESE VOWEL SIGN PEPET TEDUNG", "Mc", 0), ("BALINESE ADEG ADEG", "Mc", 9), ("BALINESE LETTER KAF SASAK", "Lo", 0), ("BALINESE LETTER KHOT SASAK", "Lo", 0), ("BALINESE LETTER TZIR SASAK", "Lo", 0), ("BALINESE LETTER EF SASAK", "Lo", 0), ("BALINESE LETTER VE SASAK", "Lo", 0), ("BALINESE LETTER ZAL SASAK", "Lo", 0), ("BALINESE LETTER ASYURA SASAK", "Lo", 0), ("BALINESE LETTER ARCHAIC JNYA", "Lo", 0), (), ("BALINESE INVERTED CARIK SIKI", "Po", 0), ("BALINESE INVERTED CARIK PAREREN", "Po", 0), ("BALINESE DIGIT ZERO", "Nd", 0), ("BALINESE DIGIT ONE", "Nd", 0), ("BALINESE DIGIT TWO", "Nd", 0), ("BALINESE DIGIT THREE", "Nd", 0), ("BALINESE DIGIT FOUR", "Nd", 0), ("BALINESE DIGIT FIVE", "Nd", 0), ("BALINESE DIGIT SIX", "Nd", 0), ("BALINESE DIGIT SEVEN", "Nd", 0), ("BALINESE DIGIT EIGHT", "Nd", 0), ("BALINESE DIGIT NINE", "Nd", 0), ("BALINESE PANTI", "Po", 0), ("<NAME>", "Po", 0), ("<NAME>", "Po", 0), ("BALINESE <NAME>", "Po", 0), 
("BALINESE <NAME>", "Po", 0), ("<NAME>", "Po", 0), ("<NAME>", "Po", 0), ("BALINESE MUSICAL SYMBOL DONG", "So", 0), ("BALINESE MUSICAL SYMBOL DENG", "So", 0), ("BALINESE MUSICAL SYMBOL DUNG", "So", 0), ("BALINESE MUSICAL SYMBOL DANG", "So", 0), ("BALINESE MUSICAL SYMBOL DANG SURANG", "So", 0), ("BALINESE MUSICAL SYMBOL DING", "So", 0), ("BALINESE MUSICAL SYMBOL DAENG", "So", 0), ("BALINESE MUSICAL SYMBOL DEUNG", "So", 0), ("BALINESE MUSICAL SYMBOL DAING", "So", 0), ("BALINESE MUSICAL SYMBOL DANG GEDE", "So", 0), ("BALINESE MUSICAL SYMBOL COMBINING TEGEH", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING ENDEP", "Mn", 220), ("BALINESE MUSICAL SYMBOL COMBINING KEMPUL", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING KEMPLI", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING JEGOGAN", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING KEMPUL WITH JEGOGAN", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING KEMPLI WITH JEGOGAN", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING BENDE", "Mn", 230), ("BALINESE MUSICAL SYMBOL COMBINING GONG", "Mn", 230), ("BALINESE MUSICAL SYMBOL RIGHT-HAND OPEN DUG", "So", 0), ("BALINESE MUSICAL SYMBOL RIGHT-HAND OPEN DAG", "So", 0), ("BALINESE MUSICAL SYMBOL RIGHT-HAND CLOSED TUK", "So", 0), ("BALINESE MUSICAL SYMBOL RIGHT-HAND CLOSED TAK", "So", 0), ("BALINESE MUSICAL SYMBOL LEFT-HAND OPEN PANG", "So", 0), ("BALINESE MUSICAL SYMBOL LEFT-HAND OPEN PUNG", "So", 0), ("BALINESE MUSICAL SYMBOL LEFT-HAND CLOSED PLAK", "So", 0), ("BALINESE MUSICAL SYMBOL LEFT-HAND CLOSED PLUK", "So", 0), ("BALINESE MUSICAL SYMBOL LEFT-HAND OPEN PING", "So", 0), ("BALINESE PANTI LANTANG", "Po", 0), ("BALINESE PAMADA LANTANG", "Po", 0), ("BALINESE PANTI BAWAK", "Po", 0), )
https://github.com/kdog3682/mathematical
https://raw.githubusercontent.com/kdog3682/mathematical/main/0.1.0/src/examples/convex-hull-attempt-1.typ
typst
// Graham-scan convex hull.
// Helper function to calculate the angle between two points.
// Note: Typst's calc.atan2 takes its arguments as (x, y), so we pass (dx, dy).
#let getAngle(p1, p2) = {
  calc.atan2(p2.at(0) - p1.at(0), p2.at(1) - p1.at(1))
}

// Helper function to calculate the Euclidean distance between two points
#let getHypot(p1, p2) = {
  let dx = p2.at(0) - p1.at(0)
  let dy = p2.at(1) - p1.at(1)
  calc.sqrt(dx * dx + dy * dy)
}

// Helper function to remove duplicate angles from the results.
// Of several points sharing an angle with the pivot, only the farthest can be
// a hull vertex, so keep that one (input is sorted by angle, then distance).
#let removeDuplicateAngles(results) = {
  let unique = ()
  for result in results {
    if unique.len() > 0 and result.angle == unique.last().angle {
      let _ = unique.pop()
    }
    unique.push(result)
  }
  unique
}

// Helper function to determine if three points make a counter-clockwise turn
#let ccw(p1, p2, p3) = {
  (p2.at(0) - p1.at(0)) * (p3.at(1) - p1.at(1)) - (p2.at(1) - p1.at(1)) * (p3.at(0) - p1.at(0))
}

// Main function to compute the convex hull
#let getConvexHull(points) = {
  // Pivot: the lowest point (ties broken by x), which is always on the hull.
  points = points.sorted(key: p => (p.at(1), p.at(0)))
  let pivot = points.first()
  // Sort the remaining points by angle around the pivot, then by distance.
  let results = points.slice(1).map(point => {
    (
      angle: getAngle(pivot, point),
      distance: getHypot(pivot, point),
      point: point,
    )
  })
  results = results.sorted(key: el => {
    return (el.angle, el.distance)
  })
  results = removeDuplicateAngles(results)
  // panic(results)
  let hull = (pivot,)
  for result in results {
    let point = result.point
    // Pop points that would make a clockwise (or straight) turn.
    while hull.len() >= 2 and ccw(hull.at(hull.len() - 2), hull.at(hull.len() - 1), point) <= 0 {
      let _ = hull.pop()
    }
    hull.push(point)
  }
  return hull
}

#import "@preview/cetz:0.2.0"
#import cetz.draw: *

// Define the points of a pentagon
#let pentagonPoints = (
  (0, 0), (4, 0), (5, 3), (2, 5), (2, 0), (2, 1), (2, 2), (2, 3), (0, 3)
)

// Get the convex hull
// Function to scale points to fit nicely on the canvas
#let scalePoints(points, scale: 20, offsetX: 0, offsetY: 0) = {
  points.map(p => ((p.at(0) + offsetX) * scale, (p.at(1) + offsetY) * scale))
}

// Create the visualization
#let points(pts, fill: black) = {
  for p in pts {
    circle(p, radius: 0.2, stroke: none, fill: fill)
  }
}

#let polygon(points) = {
  line(..points, close: true)
}

#let drawing() = {
  let hull = getConvexHull(pentagonPoints)
  cetz.canvas({
    grid((0, 0), (5, 5), stroke: 0.25pt)
    points(pentagonPoints, fill: red)
    polygon(hull)
    points(hull)
  })
}

#drawing()
https://github.com/enseignantePC/2023-24
https://raw.githubusercontent.com/enseignantePC/2023-24/master/Chapitre1/ex/correction.typ
typst
// Get Polylux from the official package repository
#import "@preview/polylux:0.3.1": *

// Make the paper dimensions fit for a presentation and the text larger
#set page(paper: "presentation-16-9")
#set text(size: 25pt)

#let slide = polylux-slide
#let obo = one-by-one

// Use #polylux-slide to create a slide and style it using your favourite Typst functions
#slide[
  #align(
    horizon + center,
  )[
    = Correction Exercice
  ]
]

#slide[
  == Exercice 1
  #one-by-one[
    1. Exprimer littéralement puis calculer la masse volumique de l’éthanol en g·cm-3.
  ][
    On a
    #set text(45pt)
    $" "rho = m/V = (12g)/(15"mL") = #{ 12 / 15 }g/"cm"^3$
    \ \
  ]
]

#slide[
  #one-by-one[
    2. Exprimer la masse d’éthanol en kilogramme, et le volume en $m^3$. Rappel : 1 $m^3$ = 1 × $10^3$ L.
  ][
    $m = 12g = 0.012"kg"$
    $V = 15"mL" = 0.015L = 1.5 times 10^(-2)L = 1.5 times 10^(-5)m^3 $
  ][
    3. En déduire la valeur de la masse volumique de l’éthanol en kg·m-3.
  ][
    #set text(45pt)
    $rho = 0.012/(1.5 times 10^(-5)) = #{ 0.012 / (1.5e-5) } = 8 times 10^2"kg"/m^3$
  ]
]

#slide[
  #one-by-one[
    4. Si un $m^3$ d’eau correspond à 1000 litres, à combien 1 $"cm"^3$ d’eau correspond-il en *mL* ? (Il y a 4 unités différentes dans cette phrase.)
  ][
    - $1"cm"^3 = ..... "mL"$
  ][
    - $1m = 100"cm"$
  ][
    - $1m^3 = 1m times 1m times 1m = 100"cm" times 100"cm"$
  ][
    $times 100"cm" = 10^6 "cm"^3$
  ][
    - $1m^3 = 1000L = 10^6 "mL"$
  ][
    - $10^6"mL" = 10^6"cm"^3$
  ][
    - $1"mL" = 1"cm"^3$
  ][
  ]
]

#slide[
  == Exercice 2: Décrire la composition d’un mélange
  #only(2)[
    1. Dans quel état physique ces deux espèces chimiques se trouvent-elles à la température ambiante (20 ℃), et avant le mélange ? Justifier la réponse
  ]
]

#slide[
  #one-by-one[
    2. Déterminer les masses d’eau et d’éther introduites dans l’éprouvette.
  ][
    Si $rho = m/V$ alors $m = rho times V$
  ][
    - $m_"eau" = rho_"eau" times V_"eau" = 1 times 5 = 5g$
  ][
    - $m_"éther" = rho_"éther" times V_"éther" = 0.71 times 15 = #{10.65}g$
  ]
]

#slide[
  == Exercice 3: Savoir si une solution est saturée
  #obo(start: 2)[
    1. Calculer la masse maximale de chlorure de sodium que l’on peut dissoudre dans $V$ = 100 mL d’eau.
  ][
    Avec les unités on a $s = m/V$ donc $m = s times V = 36g$
  ][
    2. En déduire si la solution obtenue est saturée.
  ][
    50g>36g donc la solution est saturée.
  ]
]
https://github.com/ralphmb/My-Dissertation
https://raw.githubusercontent.com/ralphmb/My-Dissertation/main/sections/survival.typ
typst
Creative Commons Zero v1.0 Universal
#show table: set text(8pt)
#show table: set align(center)

== Interactions between variables and survival time
The time to first goal (TTFG) is a statistic of interest for both bookmakers and bettors. Many bookmakers offer the ability to bet on a timeframe in which the first goal will be scored, usually dividing the match into 10 or 15 minute intervals. Similar bets can be placed on whether a goal will be scored before/after certain points in the game, particularly around the number of goals scored in the first/second half. In this section we hope to answer some questions about the TTFG in football matches. We will examine its dependence on various factors, such as comparing home teams vs away. The Kaplan-Meier plot shows a non-parametric estimate of the survival function, here the probability that a team has not yet scored by a given minute. We won't investigate any continuous variables until later, as these aren't amenable to Kaplan-Meier fitting, hence all the over/under factor variables. Also of note is that the data set in this section is structured differently to the previous one. In the logistic regression section we looked at data where each observation corresponded to a match; in this section each observation corresponds to the match from the perspective of each side, home and away. This makes model fitting easier since we have one goal time in each observation, but we ought to be careful examining variables with side-dependent effects, as any effect that benefits home sides as much as it hurts away sides will have no effect overall.
@survplot shows a Kaplan-Meier survival curve for first-goal events across every match in the 2022/23 season, including data from both home and away teams. Notable in this plot and most of the following ones is the sharp drop at 45 minutes. This happens because goals occurring in injury time at the end of the first half are counted as occurring in the 45th minute, thus raising the number of goals falling into this bin./* Ne<NAME> mentions this - they have the same*/ All highlighted regions represent 95% confidence intervals, based on the estimated standard error of survival chance.
#figure(
  image("../assets/surv.png", fit: "contain")
) <survplot>
A quarter of games have their first goals within 22 minutes. The median time for first goal is 49 minutes, slightly into the second half. The third quartile value is not reached, as only 72.8% of teams manage to score by the end of the game, just shy of three quarters.
#figure(
  image("../assets/cumhaz.png", fit: "contain")
) <cumhaz>
@cumhaz shows a cumulative hazard plot (CH plot) for times to first goal, along with a dashed line showing a straight-line fit. Apart from the jump at 45 minutes, it is clear that the graph is very close to linear, a visual suggestion that the hazard is close to constant throughout the match.
=== Interaction with game side
As shown in @survhomeaway, TTFG is clearly different for home sides vs away. The median first goal times are 45 and 53 minutes for home and away sides respectively, with lower quartiles in the 21st and 23rd minutes. The upper quartile TTFG for home sides lies in the 84th minute, but among away sides only 67.1% manage to score by the end of the match.
#figure(
  image("../assets/surv_homeaway.png", fit: "contain"),
  caption: [Survival curves for each side]
) <survhomeaway>
To ascertain whether these subgroups are really different from one another, we can first examine their respective cumulative hazard plots. @cumhazhomeaway shows CH plots for both home and away sides, as well as the difference in the logarithms of each observed CH function. The red and blue dashed lines show best fit lines for each.
The black curve gives the log of the hazard ratio (HR) between the groups, or equivalently the difference between the logarithms of the two cumulative hazards. A constant log difference line implies the CH functions are proportional to one another. An assumption underlying many models in survival analysis is that of proportional hazards, where the hazard (equivalently, cumulative hazard) functions differ by a constant multiplicative factor. Among other things that we'll see later, this assumption justifies the use of the more powerful log-rank test over the Wilcoxon-Gehan test for a difference in hazard between the groups. In order to calculate the log HR the CH for one treatment group has to be interpolated to match the number of observations seen in the other, but this should not change its shape. \
#figure(
  image("../assets/cumhaz_homeaway.png", fit: "contain"),
  caption: [Cumulative hazard plots for home and away sides, and their log difference.]
) <cumhazhomeaway>
The log HR appears fairly constant after around 30 minutes, but before this it varies quite a bit, even changing sign at one point. This could be due to away teams playing more aggressively in the opening of a match, to gain an early lead; the aforementioned authors Nevo and Ritov observed a similar effect in their paper. As we will see, a similar effect is present for many variables, with different groups indistinguishable in the opening minutes of the match. Some portion of this effect is possibly numerical: any effect on play due to a change in some variable will have had less chance to accumulate near the start, hence the groups will be closer together and their ratio will be more affected by noise. For this reason we will assume for this and similar examples that the hazards are proportional, and use the log-rank test, implemented in R using `survdiff(..., rho = 0)`. The test shows that these populations differ significantly, with $chi_("df"=1)^2 = 11.6, p=0.0005$.
=== Interaction with team strength
We can plot separate KM curves for each team grouping, by whether they attained above or below median points in the previous season. As we saw in the logistic regression section, these variables in their raw numerical form were problematic for the proportional odds assumption of ordered logistic regression.
#figure(
  image("../assets/surv_grouping.png", fit: "contain"),
  caption: [Survival curves by team grouping]
) <survgrouping>
The curves in @survgrouping are superficially similar to those in @survhomeaway. The quartiles for higher grouped teams are in the 19th, 43rd and 76th minutes, for lower teams the 23rd, 54th and 90+ minutes, with 31.1% of these lower grouped teams failing to score. Much like with home vs away sides, the CH curves are fit quite well by the lines, and the log difference is fairly constant after 30 or so minutes. The arm for lower grouped teams appears somewhat concave through the second half of the match, though this is a fairly minor deviation.
#figure(
  image("../assets/cumhaz_grouping.png", fit: "contain"),
  caption: [Cumulative hazard by team grouping]
) <cumhazgrouping>
By the same reasoning as before, we perform a log rank test, showing this split has real effects on TTFG, giving $chi_("df"=1)^2 = 11.9, p=0.0006$.
=== Interaction with other team's strength
Intuitively the strength of the other team may also affect the rate at which a team can score. As in the previous section we'll look at the binary grouping by last-season points here to determine stronger and weaker teams.
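As an aside, the Kaplan-Meier and log-rank comparisons used in these sections all follow the same recipe. The analysis in this thesis is carried out in R (e.g. `survdiff`, quoted above); the snippet below is only an illustrative sketch of the equivalent steps in Python with the `lifelines` package, and the file and column names (`matches_long.csv`, `goal_time`, `scored`, `side`) are hypothetical.

```python
# Illustrative sketch only - not the R workflow used for the actual analysis.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# One row per team per match: time of first goal (censored at full time),
# an indicator for whether a goal was scored, and the side played.
df = pd.read_csv("matches_long.csv")
home = df[df["side"] == "home"]
away = df[df["side"] == "away"]

# Kaplan-Meier survival curves for each side
kmf = KaplanMeierFitter()
ax = kmf.fit(home["goal_time"], event_observed=home["scored"], label="home").plot_survival_function()
kmf.fit(away["goal_time"], event_observed=away["scored"], label="away").plot_survival_function(ax=ax)

# Log-rank test for a difference in hazard between the sides
res = logrank_test(home["goal_time"], away["goal_time"],
                   event_observed_A=home["scored"],
                   event_observed_B=away["scored"])
print(res.test_statistic, res.p_value)
```

Swapping the grouping column gives the corresponding comparisons by team strength, opponent strength and travel distance.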
#figure( image("../assets/surv_oppgrouping.png", fit: "contain"), caption: [Survival curves by opponent's grouping] ) <survoppgrouping> The survival curves in this example are much closer together than the previous, but the effect still seems visible. As with many of the other variables looked at here, the curves are much more clearly distinct towards the second half of the match. Unlike the others, this distinction appears later, at around the 45 minute mark. The quartile values of TTFG are at 21, 45 and 89 minutes for teams playing a weaker opposition, and 21, 55, 90+ minutes otherwise, with 30.9% of teams facing a tougher opponent failing to score by full time.\ The log difference line again appears constant nearer the end, as the hazard ratio appears to rise towards and settle at $approx e^(0.2)$. The relatively larger shifts in the observed hazard ratio mean we will test the difference here using the Wilcoxon-Gehan test, resulting in a chi-square value of 1.9 on 1 d.f., so $p = 0.2$. This is above the usual threshold but given the interaction between the strengths of either side, seen in previous sections, it may be worth investigation in a fuller model. #figure( image("../assets/cumhaz_oppgrouping.png", fit: "contain"), caption: [Cumulative hazard by opponent's grouping] ) <cumhazoppgrouping> === Interaction with distance travelled // Check this using Cox model as well - and explain the need for binary grouping here. Restricting our attention to away sides, we fit survival curves to TTFG based on whether each away side travelled more or less than the median distance to attend a match. As mentioned at the beginning of this section we do focus on away sides as the effect of this variable presumably depends on the side. In the next section we will fit this as an interaction term. #figure( image("../assets/surv_distance.png", fit: "contain"), caption: [Survival curves for away sides by distance travelled] ) <survdistance> @survdistance shows the results, again quite clearly distinguishable. Likely due to the inherent disadvantage suffered by every team examined here, neither curve quite reaches below the 25% line by 90 minutes, with teams who travelled a longer distance failing to score 41.1% of the time and closer teams 24.7% of the time. The lower quartile for first goal time lands in the 19th and 27th minute for near and far teams respectively, with medians at 50 and 62 minutes. \ @cumhazdistance shows that the CH plots for this variable don't fit the lines quite as cleanly. Some measure of this is likely due to the smaller sample - we've excluded results for home sides - and this is shown too in the confidence intervals in @survdistance. #figure( image("../assets/cumhaz_distance.png", fit: "contain"), caption: [Cumulative hazard by distance travelled] ) <cumhazdistance> Scoring rate appears to pick up at the end of the game for closer-located teams, and unlike with our previous two variables we see that the distinction between the groups near the start of the match is much wider, with the log difference line being positive (if very bumpy) for almost the entire 90 minute window. The W-G test shows $chi_("df"=1)^2 = 5.8, p=0.02$, leading us to take this as a significant difference. === Interaction with derby matches A 'derby' game is common terminology for a football match between two rival teams. Usually both teams are local to one another, maybe representing neighbouring towns or nearby city districts. These games are often considered to be higher stakes for fans of each team. 
In some particular cases this has boiled over into incidents of violence or property destruction. Due to the fact that derby games are fought between geographically close teams, the effect found earlier where away teams score worse when travelling farther might lead these games to being more closely matched than games in general. Our definition of what games constitute derbies can be found in the data collection section earlier. Perhaps due to the much smaller size of the derby group (about 60 games in 380) the estimates given by R are much wider, visible in the confidence intervals in red in @survderby. The survival curves here seem very similar, with the point estimates not too distinct and one line contained completely within the CI of the other. #figure( image("../assets/surv_derby.png", fit: "contain"), caption: [Survival curves for (non-/)derby games] ) <survderby> This similarity is just as visible in the CH plots, where both lines follow very similar paths. This could be a symptom of the smaller sample size, as that is likely what lends the curve for derby games to be so jagged. #figure( image("../assets/cumhaz_derby.png", fit: "contain"), caption: [Cumulative hazard in (non-/)derby games] ) <cumhazderby> The estimated log difference is very close to zero, and the observed log difference doesn't follow it closely at any point on the plot. Due to the erratic log-difference curve, we will confirm our visual analysis with the Wilcoxon test, which gives $chi_("df"=1)^2 = 0.5, p=0.5$. === Other interactions An interaction with the day of the game was tested. No significant link was found between date/time of kickoff (for this we use the Unix timestamp, an integer value which gives the number of seconds since 1/1/70) and TTFG, nor was a link found for a similar factor variable classifying whether the match occured earlier/later in the season. Red cards, much like derby games, are scarcely represented in the data, hence much like derby games no significant effects were found in tests, even when restricting the data to only one side. In some literature (see @nevoritov) the authors use the match time at which red cards are given to investigate the change in hazard before and after the event but we don't have this data. The overall effect of red cards on first-goal time may be murky, since naturally it means a team is playing with one fewer player for some time, however there may be a confounding effect whereby a more aggressive team would score quicker goals and commit more rule-breaking behaviour. We will investigate this as an interaction term. == Fitting parametric models The R function `flexsurvreg` can fit parametric survival models of two kinds, accelerated failure time (AFT) and proportional hazards (PH). AFT models focus on the effects of each variable on survival times, so the survival time of the $k$th subject is distributed as $T_(k) = e^(eta_(k)) T_(0)$, with $eta = sum_(i)beta_(i) X_(i)$ the linear predictor and $T_(0)$ follows some appropriate baseline distribution. A unit change in the linear predictor thus "speeds up time" for the $k$th subject by a factor of $e$. By contrast PH models focus on how each variable affects the hazard, so $h_(k)(t) = exp(eta_(k))h_(0)(t)$ gives the hazard of the $k$th subject as a multiple of a baseline hazard. A unit increase in the linear predictor thus "increases the risk" of the event of interest by a factor of $e$. 
For parametric models this $h_(0)$ has a closed form, for instance an exponentially distributed survival time correponds to a constant baseline hazard. //https://www.mdpi.com/2227-7390/12/1/56 Weibull models (and therefore exponential models) can be used in either of these frameworks, as Weibull distributions satisfy both properties. For the sake of comparison with the popular Cox regression model, we will take the proportional hazards approach. \ Having examined the effects of a few variables on TTFG alone, we will try fitting two parametric models to our data, one exponential and one Weibull. In the logistic regression section we saw that the raw points values didn't follow the proportional odds assumption. As both the exponential and Weibull models follow a proportional hazards relationship, we may see a related effect here. The first step will be to remove insignificant variables, and then to check that the model fits are appropriate. Away team travel distance as a raw kilometer value was tested but found insignificant, so much like with the ordered logistic regression model we will use the closer/farther variable. The model using raw distance values reduced down (after removal of insignificant variables) to a model strictly nested inside the one we'll soon arrive at, hence we won't include it here. ```R s ~ home_or_away*(red_card_home + red_card_away + distance_grouping) + points2021 + opponent_points2021 ``` Given the large amount of data in the model selection process, relevant R output can be found in the appendices. We first remove any insignificant interaction terms to arrive at our 2nd group of models, then to arrive at our third and final set of models we remove any variables still insignificant. The results of this process are in the following table. Both final models use the same set of variables. #table( columns:3, [Variable], [Estimate], [S.E.], [Shape $gamma$], [1.145], [0.050], [Scale $alpha$], [0.007], [0.002], [Home (1) vs Away], [0.037], [0.140], [Closer vs Farther (1)], [-0.460], [0.148], [Points last season], [0.010], [0.003], [Opponent's Points], [-0.007], [0.003], [Side*Distance], [0.571], [0.202], ) And for the exponential model: #table( columns:3, [Variable], [Estimate], [S.E], [Rate $lambda$], [0.013], [0.003], [Home (1) vs Away], [0.030], [0.140], [Closer vs Farther (1)], [-0.442], [0.148], [Points last season], [0.010], [0.003], [Opponent's Points], [-0.007], [0.003], [Side*Distance], [0.548], [0.202], ) All coefficients are significant to a 95% level, except the home/away factor, which is included as its interaction with distance is significant. Unlike in the logistic regression section, red cards seem to have less of an effect here. This could be because first goals skew towards the early game, meaning they will often have already occurred by the time a red card is given, unlike match outcome which is affected by events throughout the full course of the match. The opposite signs on the distance variable and the distance interaction can be explained as the effect of travel distance differing for each team side. Coefficients in each model are broadly similar, as could be expected for similar baseline distributions. The shape parameter estimated for the Weibull-based model is greater than one, indicating an increasing hazard. There are a number of methods for assessing Weibull PH model suitability. 
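Before turning to those diagnostic checks, the proportional-hazards reading of these coefficients can be illustrated with a quick calculation. The sketch below is in Python purely for illustration (the models themselves are fitted in R with `flexsurvreg`); it plugs an example covariate profile into the rounded Weibull coefficients from the table above, so the result differs slightly from values computed with the unrounded fit.

```python
# Hazard multiplier exp(beta . x) over the baseline, using the rounded Weibull
# PH coefficients printed above. The covariate profile is an example only.
import math

beta = {
    "home": 0.037,             # Home (1) vs Away
    "farther": -0.460,         # Closer vs Farther (1)
    "points": 0.010,           # Points last season
    "opp_points": -0.007,      # Opponent's points
    "side_x_distance": 0.571,  # Side * Distance interaction
}

# A home team whose opponents travelled farther than the median,
# with both sides on the season-average 57.2 points.
x = {"home": 1, "farther": 1, "points": 57.2, "opp_points": 57.2, "side_x_distance": 1}

eta = sum(beta[k] * x[k] for k in beta)
print(math.exp(eta))  # ~1.38; the text below reports 1.39 from the unrounded fit
```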
If the model depends on some categorical variable, one can plot $log(-log(S(t)))$ against $log(t)$ using the Kaplan Meier survival functions $S(t)$ estimated for each stratum. If the underlying distribution is indeed Weibull, then under the proportional hazards assumption this plot should show a number of parallel lines.
#figure(
  image("../assets/logsurvplot.png"),
) <graphplot>
In @graphplot we show an example of such a test, for the home vs away variable. Most of the model observations are contained in the $3<ln(t)$ range, hence the clearer distinction there. We can see the same effect here as in the non-constant log HR curves earlier, where the curves seem hard to tell apart in the earlier parts of the game. Nonetheless we can at least see that the lines appear parallel around the top right. This method is less useful for continuous variables like points.\
Residual methods also exist for assessing the fit of parametric survival models, such as Cox-Snell, martingale, and deviance residuals. We will just examine the Cox-Snell residuals.
// Explanation of cox snell resids
//https://stats.stackexchange.com/questions/246812/cox-snell-residuals-in-r
#set table(
  fill: none,
)
#figure(
  table(
    columns: 2,
    image("../assets/qqweib.png", fit: "contain"),
    image("../assets/qqexp.png", fit: "contain")
  ),
  caption: [Plots of Cox Snell residuals]
)
#set table(
  fill: (x, y) => if x == 0 or y == 0 { gray.lighten(40%) },
  align: right,
)
In the Q-Q plots above, we plot Cox-Snell residuals against theoretical $~"exponential"(1)$ quantiles, with the $y=x$ line overlaid in red. Non-censored Weibull-distributed data have, in theory, exponentially distributed Cox-Snell residuals @coxsnell. We see that both models fit the data quite well. For both models the smaller residuals seem almost perfectly exponentially distributed. At the upper end, the Weibull model seems to fit a bit closer, whereas the exponential model seems to have residuals lying above the line. To examine which of the models fits better we can perform a likelihood ratio test. As the same variables turned out to be significant, and the exponential is a special case of the Weibull model, these models are nested. Calculating $-2(l_2 - l_1)$ we find a value of $chi^(2) = 9.07$. This is on 1 df, as Weibull models estimate exactly one extra parameter, the shape $gamma$. This is above the 1% $chi^2$ critical value of 6.635, hence we can conclude that the extra flexibility of the Weibull model does improve the fit somewhat. We can quantify the estimated hazard multipliers for different categories of teams over the baseline hazard. While the differing coefficients of each team's points indicate a changing relationship of the hazard as points vary, we will just assume the two teams have an average number, 57.2.
#table(
  columns: 3,
  [], [Home], [Away],
  [Closer], [1.25], [1.20],
  [Farther], [1.39], [0.76]
)
The advantage of home teams is clearly seen by comparing the first column to the second. The interaction with distance also shows, as away team performance almost matches that of home teams when they don't have to travel as far.
== How does a Cox regression compare?
The linear fits to the Kaplan Meier curves shown in the univariate analyses seemed fairly close, but the discontinuous jump at 45 minutes and seeming non-linearity throughout parts of the curves may hint that a more general approach could be useful.
The Weibull models we just fit can accommodate a changing hazard, but they only have limited parameters through which to describe the full shape of the curve: their cumulative hazard has the form $H(t) prop t^(gamma)$. For some curves, such as in @cumhazhomeaway, the early sections appear linear, and these are followed by a slightly concave shape. In order to account for these features, as well as the jump in hazard at 45 minutes, we can use Cox regression.\
Cox regression still assumes proportional hazards, that is, the shape of the curve must be more or less the same for e.g. home teams and away teams, but they may differ by a constant multiplicative factor. The advantage of this is that the shape can be abstracted away, using a baseline hazard $h_(0)(t)$ of which the hazards for each treatment group are multiples. This means we aren't tied to a power-function cumulative hazard curve like a Weibull model; however, it means that the survival curve may not be smooth, and we lose the niceties of a parametric distribution such as closed-form expressions for quantities of interest. Since we've already determined the variables that are significant for the parametric model, we will fit a Cox model against the same covariates, and we find the following coefficients.
#table(
  columns: 3,
  [Variable], [$beta$], [p],
  [Home (1) vs Away], [0.0350], [0.802],
  [Closer vs Farther (1)], [-0.460], [0.002],
  [Points last season], [0.010], [0.000],
  [Opponent points], [-0.0073], [0.020],
  [Side \* Distance], [0.574], [0.005]
)
The coefficients here are very similar to those in the Weibull fit, providing some more evidence that goal times follow a Weibull distribution quite well, as the less constrained baseline hazard function here still results in similar coefficient estimates. As mentioned, the Cox model requires the proportional hazards assumption to hold. We can check this in R using `cox.zph()`. The output of this function is as follows:
```R
> cox.zph(coxreg)
                                chisq df     p
home_or_away                   0.5673  1 0.451
distance_grouping              0.0699  1 0.792
points2021                     0.6350  1 0.426
opponent_points2021            2.0605  1 0.151
home_or_away:distance_grouping 3.3777  1 0.066
GLOBAL                         9.0028  5 0.109
```
This test uses residuals from the model to test against the null hypothesis that hazards are indeed proportional. /*cite: Proportional hazards tests and diagnostics based on weighted residuals*/ As we can see, none of the tests for individual variables, nor for the model as a whole, gives enough evidence to reject the null hypothesis, though the interaction term gets close. If (e.g.) the home-advantage, expressed as a hazard ratio between home and away teams, were larger towards the end of a match, this would violate the proportional hazards assumption. The Schoenfeld residuals $s_k$ at time $t_k$ are calculated as the covariate values of the subject having an event at time $t_k$ minus the mean covariate values of the subjects still at risk at that time @parkhendry. These can be shown to have expected values proportional to the difference $beta(t)-beta$ between a time-varying coefficient and the constant Cox regression coefficient @grambsch. Therefore a trend in the Schoenfeld residuals implies that the model should perhaps be respecified with a time-dependent coefficient.
#figure(
  image("../assets/schoenfeld_interaction.png", fit: "contain"),
  caption: [Schoenfeld residuals against transformed match-time]
)<schoenfeld>
In @schoenfeld we plot scaled values of $s_k$ against time as the solid line.
Match time is rescaled according to event density, so the intervals between x-axis ticks get longer. We can see that the black line rises toward the end of the match, indicating an increased effect on hazard ratio between the groups. We could conjecture this is caused by away teams who travel farther experiencing more fatigue as the game goes on, giving their opponents an edge, though the trend isn't strong enough to be statistically significant (and we don't have the data to confidently make causal arguments like this). Similar plots can be made for the other variables, though they show no trends of a similar magnitude. Reassured that our Cox regression fit is appropriate, we can make some comparisons to the parametric model. To pick an example we can look at the hazard ratio for Arsenal (69 points last season) playing at home against Newcastle (49 points, 394km away), over the baseline hazard. For our Cox model we see $ (h_(1)(t))/(h_(0)(t)) &= exp(0.0350 times 1 - 0.460 times 1 + ...)\ &= exp(0.515) = 1.673 $ And the corresponding ratio for Newcastle in this match is 0.639, giving a hazard ratio between the teams of 2.620. Taking each of these values to be their lower (upper) 95% confidence bounds, we find a lower (upper) estimate for this hazard ratio to be 1.350 (5.0837) We can calculate the same ratio for our earlier parametric models. The hazard function for the $i"th"$ subject can be parametrised as $ h_(i)(t) = alpha gamma t^(gamma-1)exp(beta_1 X_(i 1) + ...) $ Then the hazard ratio between teams 1 and 2 is: $ (h_(2)(t))/(h_(1)(t)) = exp(sum_(j) beta_j (X_(2 j)-X_(1 j))) $ The between-teams hazard ratio for our earlier Weibull model is 2.617, again very similar to the Cox model. While clearly not an exhaustive test, this example does demonstrate how similar the predictions given by each model are. Given this, we might prefer the Weibull model over the Cox model in order to answer questions about our data set. The closed form hazard function gives a more complete description of our data, and in particular for the Weibull model can also be easily rephrased as an accelerated failure time model, letting us answer questions about expected goal times much more easily than the Cox model. While the flexibility of semi- or non-parametric models is often needed in situations with unusually behaved data, in our case these don't seem particularly necessary. == A brief look at a count model Given a process with exponentially distributed inter-event times, the number of such events in some timespan will be Poisson distributed. Modelling the number of goals scored in a football match is often done through a Poisson approach, but methods have been developed to fit count models to data with Weibull distributed inter-event times. \ Such models are often inexpressible in closed form, but series approximations exist for calculation. The probability of $n$ events having occurred by time $t$ can be expressed as the probability density of having the first event at time $tau$, multiplied by the probability of $n-1$ events having occured in the period between $tau$ and $t$, integrated over $tau$. 
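Written out explicitly (this is the standard renewal-count identity rather than a formula quoted from a source), the recursion just described is
$ P_(n)(t) = integral_0^t f(tau) P_(n-1)(t - tau) dif tau, quad n >= 1, $
with $P_(0)(t) = 1 - F(t)$, the probability that no goal has arrived by time $t$.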
The first term is given by the pdf of the inter-event time distribution, $f(tau)$, so this calculation gives a recursive solution for the count distribution, solved practically through Taylor series @mcshane.\ #figure( image("../assets/wei_count.png",fit:"contain"), caption:[Bar plot of actual/fitted frequency for each goal count] ) <weicount> Lucky for us, authors Kharrat and Boshnakov (mentioned in the literature review) have written the R package `countr`, which handles the model fitting. We will fit a Poisson and a Weibull renewal-count model, each against the same covariates as before. In a different paper @kharratboshnakov2017 the same authors show a plot similar to @weicount, though ours differs in that the Weibull count model doesn't show any obvious improvement visually over the Poisson one. We can confirm our suspicions with a likelihood ratio test, which shows that with $chi^(2)_("df"=1) = 2.17, p = 0.141$ the extra shape parameter used in the Weibull count model doesn't significantly improve the fit. Our earlier analysis of first goals came to the opposite conclusion, possibly suggesting that the nature of 2nd, 3rd goals might be different from first goals. The authors mentioned examine over 1000 matches, so the issue could conceivably be one of sample size, though changing conditions in the sport could also be at play, as 7 years have passed since their paper. More work could be done to show why this difference in results has occured.
https://github.com/jamesrswift/pixel-pipeline
https://raw.githubusercontent.com/jamesrswift/pixel-pipeline/main/src/pipeline/canvas/lib.typ
typst
The Unlicense
// #import "/src/package.typ": package #import "/src/math/lib.typ": * #import "process.typ" #let paint( scale: 1em, background: none, body ) = context { if body == none {return []} let (bounds, drawables) = process.many(body) if bounds == none {return []} // Filter hidden drawables drawables = drawables.filter( d => not d.at("hidden", default: false) ) // Order draw commands by z-index drawables = drawables.sorted(key: (cmd) => { return cmd.at("z-index", default: 0) }) // Final canvas size let (width, height, ..) = vector.scale(aabb.size(bounds), scale) box( width: width, height: height, fill: background, align(top, { for drawable in drawables { let (x, y, _) = if drawable.type == "path" { vector.sub( aabb.aabb(path-util.bounds(drawable.segments)).low, bounds.low) } else { (0, 0, 0) } place( top + left, float: false, { let (width, height) = measure(drawable.body) move( dx: (drawable.pos.at(0) - bounds.low.at(0)) * scale - width / 2, dy: (drawable.pos.at(1) - bounds.low.at(1)) * scale - height / 2, drawable.body, ) }, dx: x * scale, dy: y * scale ) } }) ) }
https://github.com/kmitsutani/a_note_on_Hioki_I-1
https://raw.githubusercontent.com/kmitsutani/a_note_on_Hioki_I-1/main/README.md
markdown
MIT No Attribution
# typst-jp-note-template このリポジトリは [kimushun1101/typst-jp-conf-template](https://github.com/kimushun1101/typst-jp-conf-template) をフォークして作成した Typst でアブストラクト付きのノートを書く時のテンプレートです. 研究の途中で共同研究者等に共有するようなものを想定しています. 主に - 2カラムの廃止 - 様々なスペースをゆったりととる という変更を行い個人用ノートとして見やすくなるようにしました. また Appendix を追加しました. ## ディレクトリ構造 | ファイル | 意味 | | -------- | ----------------------- | | main.typ | 原稿の Typst ソースコード | | refs.yml | 参考文献ファイル | | ディレクトリ | 含まれるファイルの種類 | | ------------- | --------------------------- | | figs   | 論文に使用する画像ファイル | | libs   | 体裁を整えるライブラリファイル | ## 使用方法 GitHub に慣れていない方は,GitHub ページの `<>Code▼` から `Download ZIP` して展開してください. 慣れている方は,`git clone` したり `use this template` したり,適宜扱ってください. ### Neovim + nvimdots - LSP: [ayamir/nvimdots](ayamir/nvimdots) を適用すると`mason.nvim` が入るのでNeovim 内で `:MasonInstall typst-lsp` とすると LSP との連携ができるようになります. - その他シンタックスハイライトやfiletypeの適用など: [kaarmu/typst.vim](kaarmu/typst.vim) - リアルタイムプレビュー: [https://github.com/chomosuke/typst-preview.nvim](chomosuke/typst-preview.nvim) ## 参考元 [kimushun1101/typst-jp-conf-template](https://github.com/kimushun1101/typst-jp-conf-template) ## ライセンス 参考元にならってライセンスを付与しています. Typst ファイル : MIT No Attribution CSL ファイル : Creative Commons Attribution-ShareAlike 3.0 License
https://github.com/stat20/stat20handout-typst
https://raw.githubusercontent.com/stat20/stat20handout-typst/main/_extensions/stat20handout/typst-show.typ
typst
#show: doc => stat20handout( $if(title)$ title: [$title$], $endif$ $if(title-prefix)$ title-prefix: [$title-prefix$], $endif$ $if(course-name)$ course-name: [$course-name$], $endif$ $if(semester)$ semester: [$semester$], $endif$ $if(date)$ date: [$date$], $endif$ $if(margin)$ margin: ($for(margin/pairs)$$margin.key$: $margin.value$,$endfor$), $endif$ $if(paper)$ paper: ("$papersize$",), $endif$ $if(mainfont)$ font: ("$mainfont$",), $endif$ $if(fontsize)$ fontsize: $fontsize$, $endif$ $if(section-numbering)$ sectionnumbering: "$section-numbering$", $endif$ cols: $if(columns)$$columns$$else$1$endif$, doc, )
https://github.com/flechonn/interface-typst
https://raw.githubusercontent.com/flechonn/interface-typst/main/BD/TYPST/exo1.typ
typst
#show terms: meta => { let title = label("Addition Exercise") let duration = label("30min") let difficulty = label("easy") let solution = label("1") let figures = label("") let points = label("10pts") let bonus = label("1") let author = label("") let references = label("Link") let language = label("english") let material = label("Algo") let name = label("exo1") } = Exercise Calculate the sum of the following numbers : ```py let numbers = [5, 8, 12, 3] let sum = 0 for n in numbers { text(n) sum += n if n != numbers[-1] { text(" + ") } else { text(" = ") } } ``` = Solution sum = 5 + 8 + 12 + 3 = 28.
https://github.com/loreanvictor/master-thesis
https://raw.githubusercontent.com/loreanvictor/master-thesis/main/thesis.typ
typst
MIT License
#import "thesis_template.typ": * #import "common/cover.typ": * #import "common/titlepage.typ": * #import "thesis_typ/disclaimer.typ": * #import "thesis_typ/acknowledgement.typ": * #import "thesis_typ/abstract_en.typ": * #import "thesis_typ/abstract_de.typ": * #import "common/metadata.typ": * #import "common/math_utils.typ": * #import "thesis_typ/math_classes.typ": * #cover( title: titleEnglish, degree: degree, program: program, author: author, ) #titlepage( title: titleEnglish, titleGerman: titleGerman, degree: degree, program: program, supervisor: supervisor, advisors: advisors, author: author, startDate: startDate, submissionDate: submissionDate ) #disclaimer( title: titleEnglish, degree: degree, author: author, submissionDate: submissionDate ) #acknowledgement() #abstract_en() #abstract_de() #show: project.with( title: titleEnglish, titleGerman: titleGerman, degree: degree, program: program, supervisor: supervisor, advisors: advisors, author: author, startDate: startDate, submissionDate: submissionDate ) = Introduction <intro> Diagrams play a pivotal role in software development and design. Among the various types, UML diagrams in particular are considered 'the de-facto standard' for representing software architectures and workflows @uml-empirical. UML, however, has proven diffcult to adopt both by professionals in the industry @uml-in-practice and by students and instructors in academia @uml-learning. This diffuclty underscores the need for easy-to-use and learning-focused UML modeling tools such as Apollon #footnote("https://github.com/ls1intum/Apollon"), which have shown to be effective in improving learning outcomes in modelling @uml-learning. A key feature of such tools is realtime collaboration. Collaborative settings are shown to have a positive impact on contribution amongst participants specifically during modeling using UML @collab-on-table. Combined with a rising trend in remote work and study, realtime collaboration is becoming a necessity for making the gains of a collaborative working/learning environments for modeling readily available to a wider audience, while further shortening assessment, evaluation and feedback cycles, hence further increasing engagement of such tools. == Problem Though Apollon has proven successful in significantly improving learning of UML modeling amongst students, it still lacks proper support for realtime collaboration. This is a significant drawback as it limits the tool's potential to be used in collaborative settings, such as in group projects, or in remote learning environments, the later of which have become increasingly popular post the COVID-19 pandemic. Such a limitation deprives the software and its users, including online educational platforms and learning management softwares such as Artemis #footnote("https://github.com/ls1intum/Artemis") which integrate Apollon, from the benefits of a collaborative learning environment, wherein students work in groups towards solving a problem or accomplishing a shared task. These collaborative learning techniques have been shown to increase academic motivation @collab-learning and peer contribution @collab-on-table, effects that specifically in a blended learning environment, become more pronouned based on how well the tools in use are optimised for collaboration @collab-learning-blended. Implementation of such features, on the other hand, brings its own unique challenges. 
A key challenge in such a context is maintaining a consistent state between all peers: while working on the same UML diagram, all participants should see the same diagram, while being able to quickly and responsively modify the diagram without having to lock the diagram or wait for other peers, meaning occasionally changing the diagram simultaenously, resulting in conflicting states. Common solutions for such problems, such as Operation Transformation @ot, demand a lot of complexity for conflict detection and reconcilliation, which have proven difficult and error-prone to even merely check for correctness @ot-issues, @ot-admis, @ot-proof, and demand strong network guarantees and low fault tolerance @diff-sync, @ot. Tools built around these approaches typically require tight server-client integration @sharedb, which increases the difficulty of integration and use for a tool such as Apollon, where third-party consumers should integrate the tool and handle server-side operations. More recent solutions such as Conflict-free Replicated Data Types (CRDTs) @og-crdt, @crdts are more promising for the specific requirements of a project such as Apollon. The "conflict-free" nature of such solutions greatly simplifies complexity for integrators. However, they still incur noticeable trade-offs in terms of performance and complexity @yata, posing further challenges in design and implementation. == Motivation A straightforward solution for realtime collaboration in Apollon has potentially great benefits on effectiveness of learning UML modeling, specifically in online and collaborative environments. Furthermore, a solution with minimal server-side complexity enables a wide range of platforms and LMSs to easily integrate such capabilities, furthering access to benefits of collaborative learning to a much wider audience. As Apollon is an open-source project, a simple architectural design can be adopted by the community on similar projects and use cases with relative ease. Moreover, a theoretical analysis of conditions wherein such a solution is applicable can simplify proof of correctness for such applications, and guide future research on boundaries within which similar solutions can be developed. == Objectives // #rect( // width: 100%, // radius: 10%, // stroke: 0.5pt, // fill: yellow, // )[ // Note: Describe the research goals and/or research questions and how you address them by summarizing what you want to achieve in your thesis, e.g. developing a system and then evaluating it. // ] Present thesis focuses on three main objectives: design and implementation of a realtime collaboration system for Apollon, integration into Apollon Standalone #footnote("https://github.com/ls1intum/Apollon_standalone") and Artemis, and theoretical analysis of the proposed solution. #v(4mm) 1. *Design and Implementation* \ The main aim of the thesis is to design and implement a realtime collaboration solution for Apollon. The solution should impose minimal complexities on potential consumers of Apollon intending to integrate the tool into their own platforms, and subsequently will draw inspiration from more distributed solutions such as CRDTs. 
At the same time, we intend to minimise the complexity incurred on Apollon's code base itself, providing a solution that works for the wide variety of diagram types supported by Apollon out of the box, working seamlessly for future diagram types without additional effort, and retaining a simple architecture maintainable considering the student-driven high-influx development process of Apollon. #v(4mm) 2. *Integration* \ The second objective would be to integrate the realtime collaboration features of Apollon into example consumers of the tool, namely Apollon Standalone, a standalone version of Apollon, and Artemis, an interactive learning management system. The unqiue constraints of each of these systems will act as a testbed for simplicity of adoption of the proposed solution, and helps in analysing potential challenges related to integration of realtime collaboration in different environments and for various use cases and in accordance with different persistence mechanisms. #v(4mm) 3. *Theoretical Analysis* \ The final objective of this work is to provide a theoretical analysis of the proposed solution, specifically focusing on the conditions under which the solution is applicable, and the boundaries within which the solution is correct. This analysis will describe consistency properties achievable using the proposed solution under various network topologies and constraints, and discuss the theoretical correlations of the proposed solution with established ones. == Outline The rest of the thesis is structured as follows: @background provides some background information on Apollon editor and Artemis teaching platform. The concept of realtime collaboration is subsequently introduced, alongside consistency requirements often considered for such systems. An overview of the most commonly used techniques and solutions for satisfying these requirements and addressing associated challenges is given, namely Operational Transformation and Conflict-free Replicated Data Types. @related-work provides a more in-depth review of these techniques and their suitability for the specific problems that are the focus of this thesis, providing insight into similarities and differences of the proposed solution with existing approaches, and the reasoning as to why a novel solution was designed and implemented. In @requirements-analysis, functional and non-functional requirements for a realtime collaboration system for Apollon are outlined, as well as system models, following requirement ellicitation and analysis process recommended by @bruegge2004object. @system-design describes the system design, including the architectural design of the proposed solution, the data format of diagrams and changes and their interactions with the realtime collaboration solution, the control flow of the subsystem, and an overview of example integrations of the proposed system into Apollon Standalone and Artemis projects. @theoretical-analysis provides a theoretical analysis of the proposed solution, evaluating the general conditions under which the proposed solution schema is applicable, the consistency criteria it satisfies under different conditions, and sufficient conditions for assessment of when application of such a solution would yield a correct, convergent result. 
Theoretical relationship between the proposed solution, existing solutions and various consistency constraints under various network conditions are discussed, providing a further understanding of the trade-offs that were made to achieve the specific goals of this thesis. The work is finally summarized in @summary, where achieved and open goals are outlined, and a conclusion and outlook for future work is provided. #pagebreak() = Background <background> The following section provides background information on key concepts that present thesis is based on or draws inspiration from. This includes an overview of Apollon, the online UML editor which is the basis of the implementation work conducted, Artermis, the educational platform within which Apollon is integrated, and the general concept of realtime collaboration systems alongside they key challenges they face, specifically consistency models desirable in such systems. This section finally provides an overview of the most commonly used techniques for achieving those consistency requirements, namely Operational Transformation, and Conflict-free Replicated Data Types, detailing some examples of the latter family of solutions to offer a better understanding of the realtime collaboration solution proposed and implemented by this thesis. A critical analysis of these methodologies for the specific problem of realtime collaboration on UML diagrams, specifically for Apollon, is subsequently provided in @related-work. == Apollon and Artemis <apollon-artemis> Apollon is an interactive web-based UML editor, developed at the Technical University of Munich. It was originally designed to facilitate drag-and-drop modeling exercises, where in instructors easily and efficiently create UML diagrams and students fill-in the missing parts, with support for automated assessment @thesis-schulz. It has since evolved into an easy-to-use and learning-focused UML modeling tool with support for educational feedback and guidance @thesis-tobi, and a great degree of extensibility specifically with regards to adding support for new diagram types @thesis-willand. Since its inception, Apollon was designed to be easily integrated into other educational platforms, enhancing the learning experience of students using such platforms @thesis-schulz. Usage of Apollon has proven to improve various learning outcomes with regards to modelling, specifically increasing student engagement, improving learning success in modelling, and better understanding of taught concepts in programming exercises @uml-learning. Apollon was originally designed to be integrated into Artemis, an interactive learning management system also developed at the Technical University of Munich originally with a focus on programming exercises and automatic assessment, which later expanded to include support for a variety of exercise types and exams, namely modelling exercises, with support for individual or team-based exercises with integrated online editors. It is used in multiple universities for various online, offline or hybrid courses, and has proven helpful in guiding students in their learning process via iterative feedback cycles whilst reducing the overhead effort required by instructors and teaching assistants @artemis. == Realtime Collaboration Realtime collaborative applications (sometimes refered to in literature as group-editors or realtime group-editors) allow distributed users to work together, simultaenously, on shared data and documents @op-effect. 
As outlined above, such collaborative settings can provide various benefits for different use-cases: for example collaborative educational settings where multiple students work in a group on a shared task result in increased academic motivation @collab-learning. However, communication in such a setting is dependent on network which is often prone to disconnection or delay, making it rather impossible to achieve a strong degree of consistency between all collaborating peers @cap. Ideally, a realtime collaborative system or application should display these characteristics in order to achieve its desired goals @cci: - _realtime_: response to local user interaction should be immediate, i.e. a user shouldn't wait for approval to see the effect of their actions. - _distributed_: users reside on different machines connected via network prone to non-deterministic latency and disconnections. - _unconstrained editing_: multiple users should be allowed to modify various and potentially overlapping parts of the data concurrently. The requirements of realtimeness and unconstrained collaboration necessitate local replicas of the data to be modified by each client. Such a setup however leads to the consistency issues of _causality-violation_ and _divergence_ @ot-issues, the former referring to when causaly related operations are delivered to a client out-of-order (e.g. when deletion of a string is delivered before the insertion of said string), and the latter referring to two replicas diverging due to execution of two concurrent (not causally related) operations in different orders (e.g. when two users insert different characters at the beginning of a string). In response, a consistency model requiring convergence and causality preservation was proposed @ot for such systems. Another problem that arises with such systems is _Intention Violation_ @cci. For example, a user might want to delete the first two characters of a string, while another user is inserting two other characters at the beginning of this string. An execution order might lead to deletion of the characters that were not intended by the deleting user. To address this issue, _Intention Preservation_ is often considered as another desired consistency constraint, which requires the effect of executing an operation to be the same as the intended effect, i.e. its effect on the state wherein the operation was issued. This notion has been further formalised into _single operation effect preservation_ and _multi-operation effects relation preservation_, the latter of which requires that the relation between two operations is also preserved when they are executed, as it was in the original states they were generated @op-effect (e.g. if a character is inserted after another character, this relation holds when these operations are applied to other replicas with different states as well). === Eventual Consistency A consistency model often adopted to address these requirements is *Eventual Consistency* (EC) @ev, also referred to as optimistic replication @ev-2, which informally guarantees that if no new updates are issued, and all updates are eventually propagated to all peers, all peers will eventually converge to the same state. This model is often also used to assure convergence of replicated data in distributed systems (e.g. 
distributed databases), and is formulated as BASE @base, as a set of guarantees diametrically opposed to ACID guarantees @acid for partitioned databases, to further emphasize availability instead of consistency when handling replicated data (a trade-off necessary due to CAP theorem @cap): - _Basically Available_: the system is always available for read and write operations, even in the presence of network partitions or failures. - _Soft state_: the state of the system is allowed to be inconsistent between replicas, but will eventually converge to a consistent state. - _Eventual consistency_: the system guarantees that if no new updates are issued, and all updates are eventually propagated to all peers, all peers will eventually converge to the same state. Eventual Consistency is a weak guarantee, specifically as it is merely a guarantee of liveness and does not provide any safety guarantees. Under EC, reads might yield any potentially inconsistent result before the repilcas have converged. To compensate, a stronger version, _Strong Eventual Consistency_, is often considered, which additionally guarantees that all replicas that have received the same (unordered) set of updates will yield the same results for read operations. _Monotonicity_ of the state is another additional guarantee, which ensures that the state is never _rolled back_ @crdts. == Operational Transformation <ot-background> Operational Transformation (or Operation Transformation) is a well-established and commonly used technique for maintaining consistency in a distributed collaborative setting, used in products such as Google Docs @ot-json. The technique was originally introduced with a focus on text editting @ot, and while the majority of study and improvements have been focused on that front, extensions to other formats, including abritrary JSON trees, have been explored @ot-json. The core idea behind Operation Transformation is to transform operations received from peers before they are applied to local state. @figure-ot provides a basic example of how this works: Alice and Bob start from sync state of "xyz". Alice inserts "a" at the beginning of the string, Bob deletes "y" at index 1. Alice and Bob then exchange operations. If they were to apply these remote operations naively, Alice would yield "ayz", and Bob "axz", their local states diverging. However, Alice determines that the received operation is concurrent with her previous insertion, for example by using a vector clock @vector-clock, and transforms the delete operation against her insertion, by increasing its index. Both Alice and Bob yield "axz", the convergent (and intended) state. #figure( image("figures/ot-fig-1.svg"), caption: [ A basic example of Operational Transformation ], ) <figure-ot> Operational Transformation assumes a network reliable enough for delivering messages exactly once @ot, and is fault tolerant towards out-of-order delivery by transforming incoming operations in a manner that ensures some form of commutativity, while the operations themselves typically lack that property @ot-proof. For example, in the scenario depicted in @figure-ot, while $O_1$ and $O_2$ aren't commutative, we do have $O_1 compose O'_2 equiv O_2 compose O'_1$. 
Ensuring this property with higher number of peers and increasing number of concurrent operations is specially challenging @ot-issues, and although distributed variants exist, many implementations (such as Google Docs) resort to sequencing and transforming operations centrally, on a central server, to avoid these issues altogether @ot-x-crdt. == Conflict-free Replicated Data Types <crdt-background> Conflict-free Replicated Data Types are a more recent family of solutions for achieving eventual consistency, based on (re)desigining data types in a manner that operations are natively commutative @ot-x-crdt, @ot-v-crdt. This property ensures that operations can be exchanged in a distributed (peer-to-peer) manner, and all peers will converge to the same state. The most basic example of such a data type is a counter @crdt-pure-op, where peers exchange "increment" and "decrement" operations. In a network with exactly-once-delivery, all peers eventually converge on the same value. Another simple example would be a 2P-Set @crdt-list, where elements can be added and removed, but not added back again. Each peer holds a main set and a _tombstone_ set, moving elements from the main set to the tombstone upon receiving _remove_ operations, and adding elements to the main set upon receiving _add_ operations if said elements are not in present in the tombstone set #footnote([ The limitation of adding back removed elements can also be circumvented using unique IDs, and storing IDs in the tomstone set @crdt-list. ]). #figure( image("figures/crdt-2pset.svg"), caption: [ Op-based 2P-Set CRDT ], ) <figure-crdt-2pset> CRDTs are often categoriesed based on how peers communicate, resulting in varying trade-offs between implementation complexity, message size, and network fault tolerance @crdt-list. Examples described above are _operation-based_, also referred to as CmRDTs, as peers communicate operations between each other. An alternative would be communicating local states, resulting in _state-based_ CRDTs, or CvRDTs. For example, a state-based grow-only counter can be constructed by each peer maintaining a mapping of peer IDs to counters. Each peer updates its own counter locally, and upon recieving the state of another peer, their corresponding counter in local mapping is updated to the maximum of the remote and local mappings. The _value_ of the counter, often obtained by a _query_ function, would then be the sum of all counters in the vector. #figure( image("figures/crdt-counter.svg"), caption: [ State based CRDT counter ], ) <figure-crdt-counter> State-based CRDTs have a higher network fault tolerance as they do not require exactly-once-delivery guarantee. This however comes at the cost of larger message sizes and implementation complexity. It has been shown that CvRDTs and CmRDTs can emulate each other and are thus equivalent @crdts. The problem of message size of CvRDTs can be mitigated by sending _delta-states_ instead of full states, resulting in a third class called $delta$-CRDTs, though this technique also incurs its own complexities for synchronising update histories between peers @delta-crdt. CRDTs can also be used to construct more complicated data types, even those whose operations are not naturally commutative (such as strings, arrays, sequences, etc) @crdt-list, @treedoc, @logoot, @yata. @figure-crdt-seq shows schematics of a CRDT sequence, wherein each element is assigned a unique ID with some total ordering. 
When a character is inserted, it is assigned an ID between those of its siblings at time of insertion: for example when inserting "a" between "x", "y", its ID is picked so that $"ID"_x < "ID"_a < "ID"_y$, ensuring preservation of intention. Deleted characters are merely marked as such, which similar to tomstoning of 2P-Set elements, handles out-of-order delivery of operations and conserves causality. #figure( image("figures/crdt-seq.svg"), caption: [ Schematic CRDT Sequence ], ) <figure-crdt-seq> #pagebreak() = Related Work <related-work> A prime candidate for solving challenges of realtime collaboration for Apollon are solutions based on Operational Transformation. Though originally developed for text editing, OT has been extended to support other formats, including arbitrary JSON objects @ot-json, with corresponding tools such as ShareDB @sharedb. Such approaches however require a server-client architecture with tight integration on the server-side, posing an issue to potential consumers of Apollon who would like to integrate the tool into their own systems with a variety of potentially incompatible technology stacks. Requiring potential consumers to handle transformation and sequencing of operations on the server-side is also not feasible, as it would drastically increase the barrier for adoption of such features. Furthermore, while distributed designs for OTs have also been proposed @ot, checking correctness of such algorithms and covering all potential corner cases is quite complex and often error-prone @ot-issues, @ot-proof, @ot-admis. The famous example of the dOPT puzzle (@figure-dopt-puzzle) demonstrates this challenge, where the originally proposed dOPT algorithm fails to achieve convergence @ot-issues. #figure( image("figures/ot-puzzle.svg"), caption: [ dOPT puzzle @ot-issues ], ) <figure-dopt-puzzle> These difficulties, alongside stronger network requirements of OT-based solutions (mainly guarantee of exactly-once-delivery), render them unsuitable for the specific problem of realtime collaboration for Apollon. CRDT-based solutions are another viable alternative for bringing realtime collaboration to Apollon. General-purpose and extensible solutions such as YATA @yata have been developed, resulting in general purpose libraries such as Yjs @yjs, that are actively utilised by popular collaborative diagram editting software such as tldraw#footnote("https://www.tldraw.com") @tldraw-yjs. The distributed nature of CRDTs removes any burden of integration of realtime collaboration of Apollon from integrating platforms, availing realtime collaboration in a seamless manner to any such platform. That said, CRDTs introduce their own host of performance overheads and complexities @ot-v-crdt, @delta-crdt, @yata. Garbage collecting tombstones is a recurring problem in CRDTs, and while CvRDTs are more fault tolerant than CmRDTs or OTs, they incur a much higher message size and bandwidth consumption. Besides performance concerns, CRDTs typically require a more complex internal type that is _queried_ into the user-facing type, which increases the complexity of development of Apollon itself, specifically extension of new diagram types, as developers would need to compose their own CRDTs for such extensions. Potential consumers and integrators also might need to "participate" as clients in the replication system to be able to provide remote persistence, which in-turn exposes them to the complexities of these internal data types. 
#pagebreak() = Requirements Analysis <requirements-analysis> In this chapter, we study the requirements for adding support for realtime collaboration to Apollon, following requirement ellicitation and analysis process advised by Bruegge and Dutoit @bruegge2004object. We first provide an overview of the purpose, scope, objectives and success criteria of the system that we aim to develop. We then outline the functional and non-functional requirements of the system, and provide important system models for the requirements analysis. == Overview The purpose of this thesis is to design and implement support for realtime collaboration into Apollon, so that implementation of realtime collaboration for consumers integrating Apollon into their platforms becomes a straightforward task. This means Apollon must in some form or manner, inform consumer code of "changes" happening to diagrams that need to be synced with potential remote peers, and provide a mechanism for importing such changes produced by a remote Apollon client. The implementation should yield at minimum _Eventual Consistency_, ideally _Strong Eventual Consistency_ and _Intention Preservation_, with minimal complexity and overhead for consumer code, and minimal network guarantees. It should be designed in a manner that also allows third-party code to participate in any such network, primarily with purpose of remote persistence. We strive for an architecture that also incurs minimal complexity onto Apollon itself, maintaining its extensible architecture and ease of adding new diagram types, without requiring developers to design such future capabilities with realtime collaboration and its associated challenges in mind. Such a design, ideally should be domain-agnostic enough to also be applicable to similar projects, even beyond the scope of UML diagrams. == Current System <current-system> This section describes the status of realtime collaboration in Apollon prior to the implementation work of present thesis. The current design lacks any direct support for realtime collaboration within Apollon, requiring integrating platforms to implement and handle such a system from scratch. As described in @background, many such solutions (such as CRDTs) require specifically designed internal data types, which limits the range of possible solutions for consumers and integrating platforms. @figure-stquo outlines an example implementation of such a system without delving into specifics of the implementation. Apollon consists of components necessary for rendering diagrams and providing interactions upon them. The UML model itself is managed via the _Core Model_ component, which follows a Flux-based data flow for managing the state @flux (@figure-flux), and separates the internal representation of the UML model, which has additional rendering and interaction metadata, from the external representation exposed via _ApollonEditor_ interface. #figure( image("figures/stquo.svg", width: 70%), caption: [ Example implementation of realtime collaboration with current system, based on @thesis-willand ], ) <figure-stquo> #v(4mm) The _ApollonEditor_ interface provides methods for consumer code to subscribe to the state of the UML model, represented via the external schema. Upon each update to the internal store (triggered by various actions), the state is transformed into the external schema and pushed to subscribers. 
In the example implementation provided, the consumer client code can feed this state into their own realtime collaboration subsystem (denoted as "Collab."), which might detect changes, add metadata, or perform other tasks necessary for realtime collaboration, and push the changes to a central server for further processing, or to remote peers via some message broadcast layer. Additionally, a local persistence component stores the latest state of the diagram locally and synchronises with a remote server for remote persistence. In the provided example, for simplicity, a naive approach is assumed, where the server merely acts as a message broadcast amongst peers, and necessary transformation, concurrency detection or reconcilliation is done by the client. #v(4mm) #figure( image("figures/flux.svg", width: 70%), caption: [ Simplified view of the "flux" data flow managing Apollon's internal state ], ) <figure-flux> #v(4mm) == Proposed System <proposed-system> The proposed system should shift responsibility of detecting changes, updating and preparing _update messages_, detecting concurrency and conflicts, and reconcilliating them into consistent diagrams across multiple remote peers, to a new component within Apollon itself (denoted as "Patcher" in @figure-proposed). Responsibilities such as displaying and detecting online peers and their status still remains with the consumer code. The "Patcher" component detects changes in the core model and notifies subscribers with _update_ or _sync_ messages (depending on the actual design), and handles incoming _update_ (or _sync_) messages from remote peers, properly updating the core model accordingly. These functionalities are provided as an independent component to the rest of Apollon subsystems, to maintain simplicity and extensibility. #figure( image("figures/proposed.svg"), caption: [ Proposed system with example integration. ], ) <figure-proposed> #v(4mm) The following sections detail the requirements of the proposed system. First, the functional requirements are documented, describing expected interactions and functional behaviour of the realtime collaboration system with users and integrating platforms. Qualitiative requirements of the system are then detailed under non-functional requirements and categorized using the FURPS+ model as advised by @bruegge2004object. #pagebreak() === Functional Requirements <functional-requirements> The following requirements describe how the realtime collaboration component of Apollon should functionally behave towards users and integrating platforms (also refered to as "Consumer Code"). - FR1 *Collaboration*: Changes made by a user to the diagram should be reflected to all other users working on the same diagram, as determined by the consumer system. <fr1> - FR2 *Realtime Updates*: Users should see the effect of changes they make locally immediately, regardless of network conditions. <fr2> - FR3 *Unconstrained Editing*: Multiple users should be able to edit any part of the diagram they desire at any moment they desire. <fr3> - FR4 *Eventual Consistency*: All users should eventually see the same diagram as the peers they are collaborating with. <fr4> - FR5 *Intention Preservation*: The effect of a change made by a user in the final diagram should be the same as the effect intended by the user, i.e. the effect it had on the diagram in the state it was issued by the user. 
<fr5> - FR6 *Conflict Resolution*: In case users change different parts of the diagram concurrently, all of their changes should be reflected in the diagram. <fr6> - FR7 *Update Notification*: Consumer code should be able to subscribe and unsubscribe to changes in the diagram, receiving _update_ or _sync_ messages that can be disseminated to remote peers. <fr7> - FR8 *Receiving Updates*: The system should receive _update_ or _sync_ messages from consumer code, potentially originated from some remote peer running Apollon client as well. Changes denoted by these messages should be reflected in the local diagram. <fr8> === Nonfunctional Requirements <non-functional-requirements> This section, as specified by @bruegge2004object, describes the aspects and requirements of the system that are not directly related to the functional behaviour of the system, but are rather indicative of its quality, usability, performance, etc. The requirements are categorized using the FURPS+ model. - NFR1 *Usability*: The system should provide a smooth diagram editing experience. When no other peers are actively editing the diagram, the user should experience no difference in editting compared to a completely local, non-collaborative setting. <nfr1> - NFR2 *Reliability*: The system should be fault-tolerant towards network partitions. When users disconnect, they should be able to continue editing the diagram, and when they reconnect, they should eventually see the changes made by their peers, and have their peers see the changes they have made. <nfr2> - NFR3 *Reliability*: The system should be fault-tolerant towards network unreliabilities, including out-of-order delivery of messages, non-deterministic network delays, message loss, duplicate delivery, etc. Ideally, only eventual delivery of messages should be assumed, i.e. if a message is sent infinite times, it will eventually be delivered infinite times. <nfr3> - NFR4 *Reliability*: Diagrams that users see and work on should converge as soon as possible. The ideal limit of this requirement would be for _Strong Eventual Consistency_ to be satisfied by the system, i.e. users that have received the same (unordered) set of updates (messages) see the same diagram. <nfr4> - NFR5 *Performance*: Users should see changes made by their peers in a timely manner, considering network conditions. <nfr5> - NFR6 *Performance*: The system should have minimal overhead on computing resources of user devices, including processing and memory, for handling conflict resolution, concurrency detection, etc. <nfr6> - NFR7 *Performance*: The system should incur minimal overhead on network bandwidth, considering the size of _update_ (or _sync_) messages issued. The size of the messages should ideally not scale with either the size of the diagram or the number of users. <nfr7> - NFR8 *Supportability*: The system should be designed in a manner that allows easy integration by consumer code. Specifically, the system should not require consumer code to detect conflicts or provide any reconcilliation mechanism. The system should also incur minimum requirements on how consumer code disseminates _update_ messages to remote peers. <nfr8> - NFR9 *Supportability*: The system should issue _update_ or _sync_ messages in a format that is easy to understand by consumer code and can be used to update existing diagrams in different tech stacks, languages, etc., in order to facilitate participation in the network e.g. for remote persistence of diagrams. 
<nfr9> - NFR10 *Supportability*: The system should incur minimal code complexity on other Apollon subsystems, specifically requiring minimal awareness of realtime collaboration mechanism while designing and implementing further changes to other subsystems. <nfr10> - NFR11 *Supportability*: The system should be decoupled, as much as possible, from specific schema of various UML diagram types, allowing for easy extension of the system to support new diagram types without having to consider realtime collaboration in depth. <nfr11> == System Models === Scenarios Here we describe two scenarios for using realtime collaboration with Apollon, from perspective of potential users, as described in @bruegge2004object. A visionary scenario and a demo scenario are detailed: the former describes an ideal solution for realtime collaboration, while the latter describes a more realistic scenario that is feasible within the scope of present work. #v(4mm) *Visionary Scenario*: Multiple users work simultaenously on a UML diagram important to their work, which they prefer to keep as private as possible, on a platform utilising Apollon as its UML editor. The users are geographically dispersed with various network conditions, Bob, for example is riding a train with intermittent connectivity. To maintain the required privacy, the platform allows them to communicate in a P2P encrypted manner (for example, utilising a gossip protocol), so their diagram and the changes they make are not visible to any third-parties, for example servers operating the platform. Users can make changes to any part of the diagram at any moment they desire, while they see in realtime changes made by their peers, without seeing their own changes getting lost unless they clearly see because a peer overrides a change made them. During this collaboration session, Bob's train enters a tunnel, and he is temporarily disconnected from his peers. He can still work on the diagram, making changes, moving elements, creating relationships, etc. As soon as he reconnects, he sees the changes made by his peers in the meanwhile, while his peers also immediately receive the changes he made. #v(4mm) *Demo Scenario*: This scenario describes the interaction between a developer of Artemis, an online learning platform integrating Apollon, a developer of Apollon, adding a new diagram type to Apollon, and some students, working on a team exercise on Artemis platform, on this new diagram type. *Artemis Developer*: Eugene is a developer for Artemis, tasked with enabling team-based modeling exercises on the platform. He integrates Apollon client into client side code, subscribing to updates as specificed by ApollonEditor API. He relays these changes to Artemis server without any further processing. On the server, he implements a simple broadcast mechanism to relay changes to all students participating in a team-based exercise. On the client, he again receives these changes and passes them back to Apollon client, which synchronises the diagrams between participating students automatically. *Apollon Developer*: Matthias is a developer for Apollon team who wants to implement a new BPMN diagram type for the editor. He implements the new diagram type following existing Apollon architectural guides, which includes a simple guide on how to organise diagram data so that it would automatically work with the realtime collaboration system. 
Besides this, he doesn't need to consider any additional steps for enabling realtime collaboration, and his new diagram type automatically works with the system. *Students*: A group of students are working on a team BPMN modeling exercise on Artemis platform. Each student can make changes to any part of the diagram as they see fit, seeing the changes they make immediately on the diagram. They can also see the changes made by their peers, like when a new element is created, or a new relationship is made. Alice and Bob, simultaenously move two connected elements, and they see the elements moved as they intended with the other connected element also move as their peer intended. === Use Case Model From the usage scenarios outlined in the previous section, we have identified three main actors for the realtime collaboration system in Apollon: - *User*: A user utilises Apollon, or more specifically, a platform integrating Apollon, in collaboration with other users, to work on some shared UML diagram. Bob from the visionary scenario, or the students from demo scenario, are examples of such actors. Users are the primary actors of the realtime collaboration system. - *Platform Developer*: A developer of a platform integrating Apollon, responsible for providing realtime collaboration to the *Users*. This includes integrating realtime collaboration features provided by Apollon, alongside additional context-specific features (for example, access management in case of team-based exercises on Artemis). - *Apollon Developer*: A developer working to maintain and extend Apollon, specifically works to extend Apollon with new diagram types. These actors use functionality provided by two different scopes: _Apollon_, as the diagram editor which also provides features for realtime collaboration, and the _Integrating Platform_, which embeds Apollon into their own system and provides additional context-specific functionality. Use cases pertaining to *Users* are described in @figure-use-case, and further detailed in @table-conc. The scope of these functionalities is distributed between _Apollon_ and _Integrating Platform_: _Apollon_ is the source of truth for the local replica of the diagram a user works with, renders the diagram, and provides the user interface for updating the diagram. The _Integrating Platform_ is responsible for additional functionality related to collaboration that falls outside of this scope: for example, *Users* might need to see the online/offline status of their peers, have a sharing mechanism to invite other *Users* to collaborate, save their diagrams remotely, etc. #figure( image("figures/use-case.svg", width: 80%), caption: [ Use cases related to realtime collaboration. Use cases are symmetric for both users, the diagram is simplified for further readability. ], ) <figure-use-case> #v(8mm) Within the scope of _Apollon_, a *User* should be able to change the diagram in an unconstrained manner (#link(<fr3>)[FR3]), seeing the results of these changes immediately (#link(<fr2>)[FR2]). Their peers should at some point see the effect of these changes (#link(<fr1>)[FR1]), eventually all *Users* seeing the same diagram (#link(<fr4>)[FR4]). #figure( table( columns: (auto, 1fr), align: left, [_Use case name_], [Concurrent Modification], [_Participating actors_], [*Users*], [_Flow of events_], [ 1. Mutliple *Users* collaborate on a diagram. 2. *Users* concurrently change the diagram. 1. Each *User* sees their local changes immediately. 2. 
*Apollon* issues _updates_ pertaining to these concurrent changes. 3. *Integrating Platform* relays these messages to remote peers. 4. *Integrating Platform* receives _update_ messages and relays them to *Apollon*. 5. *Apollon* reconciles changes and updates diagram. 3. *User* sees the changes made by their peers. ], [_Entry Condition_], [The diagram is in some form shared with *Users*], [_Exit Condition_], [All *Users* see the same diagram.], [_Quality Requirements_], [ - *Users* see changes by their peers in a timely manner. - *Users* experience a smooth editing flow, on par with a local, non-collaborative setting. ] ), caption: [ Use case table for "Concurrent modification" use case. ], ) <table-conc> #v(8mm) Use cases pertaining to *Platform Developers* and *Apollon Developers* can be observed in @figure-use-case-dev. For an *Apollon Developer*, integrating realtime collaboration support into the new diagram type is considered an "extended use case", as developers could skip this step for a particular diagram type, though this would over time degrade the functionality of Apollon for integrating platforms and users alike. For a *Platform Developer*, integration of realtime collaboration of Apollon inevitably includes implementation of a message delivery mechanism (so that clients can communicate with each other), which is also a requirement for additional, out-of-scope requirements such as session management (managing which users access a collaboration session), user status, etc. A *Platform Developer* should be, at the minimum, able to connect this broadcast layer to _update_ or _sync_ messages provided by Apollon (#link(<fr7>)[FR7]), and conversely should be able to feed received updates back into Apollon (#link(<fr8>)[FR8]). #figure( image("figures/use-case-dev.svg"), caption: [ Use cases related to integrating realtime collaboration, or extending Apollon with support for realtime collaboration. ], ) <figure-use-case-dev> #v(8mm) From this model, it can be observed that in order to achieve ease of integration for *Apollon Developers* (#link(<nfr10>)[NFR10], #link(<nfr11>)[NFR11]), the realtime collaboration system should be designed as an independent component, directly hooked to the main state flow of Apollon itself, and without tight coupling into schema of various diagram types. Additionally, if any specific requirements for schema of new diagram types are necessary, they should be as minimal and intuitive as possible. Similarly, ease of integration for *Platform Developers* (#link(<nfr8>)[NFR8], #link(<nfr9>)[NFR9]) the system should not require any additional mechanism beyond the already necessary message delivery mechanism, while also minimising requirements of this mechanism as much as possible. #pagebreak() === Analysis Object Model <analysis-object-model> Following the requirements and use cases outlined in previous sections, we can identify the main concepts of the realtime collaboration system in Apollon and subsequently construct an analysis object model as per @bruegge2004object. The main concept of the system is "changes to the diagram", which we will refer to as *Patches*, which are generated in response to changes made by the user, delivered to other clients through the integrating platform, inspected for detection of concurrent and conflicting changes, etc. The integrating platform needs to subscribe to new patches when they are emitted due to user interaction, and also ask Apollon to apply incoming patches from other peers. 
Since the integrating platform interacts with Apollon via _ApollonEditor_ interface (@figure-stquo), this necessitates new exposed methods within _ApollonEditor_, _patch()_ and _subscribe()_, as detailed in @figure-analysis-object. The other primary concept necessary for the system is an independent component responsible for detecting changes, in a generic and diagram-type agnostic manner, emitting change objects, and handling incoming changes, reconcilliating them with local changes and ensuring eventual consistency amongst peers. This component is denoted as the *Patcher* in @figure-analysis-object, and essentially is wrapped by _ApollonEditor_ to provide necessary functionality to the integrating platform. As previously mentioned, Apollon utilises a unidirectional Flux-based @flux data flow (@figure-flux), wherein the application state, including the diagram state, is maintained by a central _Store_. The user interface updates itself when changes occur to the _Store_, and dispatches _Actions_ in response to user interaction (or other events). These _Actions_ are preprocessed by _Middlewares_ (which might modify the actions or execute arbitrary side effects), and then used by _Reducers_ to calculate necessary updates to the store per _Action_. Since the *Patcher* component needs to be diagram-type agnostic, it needs to directly attach to this flow, creating *Patches* in response to arbitrary _Actions_ (regardless of the diagram type that generated them). This requires a specific middleware (denoted as *Patcher Middleware* in @figure-analysis-object) which utilises the *Patcher* component to check for necessary *Patches* to be emitted in response to potentially state-altering actions. Similarly, state updates resulting from *Patches* received from remote peers need to be expressed by a corresponding _Action_, which is then intercepted by a special reducer (*Patcher Reducer* in @figure-analysis-object), passed to the *Patcher* for detecting any potential conflicts and conduct any necessary transformations, yielding the changes that are subsequently applied to the store. #figure( image("figures/analysis-object.svg"), caption: [ Analysis object model for a realtime collaboration system in Apollon. ], ) <figure-analysis-object> #pagebreak() === Dynamic Model Based on the requirement analysis provided so far, we can construct a dynamic model of the realtime collaboration system of Apollon, detailing the flow of events and interactions between various components of the system. @figure-dynamic-comm provides a high-level overview of this process via a UML communication diagram. The process starts with an action dispatched from the _View_ (step 1), potentially due to user interaction. This action is intercepted by the _Middleware_ layer, wherein it is processed by composed middlewares, including the _Patcher Middleware_ described in the previous section, which asks the _Patcher_ component to check for potential patches that would need to be emitted (step 2). The action (or the processed action) is meanwhile passed to the _Reducer_ layer and its effect are applied to the _Store_ (step 1.a), resulting in the user receiving immediate realtime feedback of the changes they have issued. The _Patcher_ checks the action and, if necessary, emits patches, which are relayed through _ApollonEditor_ interface layer to the _Integrating Platform_ (steps 3 and 4). For simplicity, we've assumed a standard client-server architecture with a remote persistence layer for the integrating platform in this diagram. 
The _Integrating Platform_ then broadcasts the patch to all remote peers, potentially including the originating client (depending on the designed solution and network requirements), and passes the patch down to _ApollonEditor_ (step 7). An action is dispatched (step 8) and intercepted by the _Patcher Reducer_, which, utilising the _Patcher_, calculates the necessary updates to the store (steps 9 and 10); these are then applied to the _Store_ and subsequently reflected in the user interface. The process for handling incoming changes from a remote peer is thus essentially captured by steps 6 through 12.

#figure(
  image("figures/dynamic-comm.svg", width: 75%),
  caption: [
    Dynamic model of the flow of events for realtime collaboration within Apollon.
  ],
) <figure-dynamic-comm>

#pagebreak()

= System Design <system-design>

This section details the mapping of the requirements outlined in @requirements-analysis to the solution domain, loosely following the guidelines provided by @bruegge2004object, and provides a high-level overview as well as a detailed description of the design of the system and the reconciliation solution that enables the realtime collaboration capabilities implemented for Apollon. Example integrations of the system into real platforms (Apollon Standalone and Artemis) are also discussed, alongside the behavior of the system under boundary conditions and mitigation strategies for adverse network conditions.

== Overview

Apollon is an open-source project providing a web-based UML diagram editor that can be integrated into various platforms, as described in @apollon-artemis. It is written in TypeScript #footnote("http://typescriptlang.org/"), and utilises React #footnote("http://reactjs.org/") as a rendering framework and Redux #footnote("https://redux.js.org"), a Flux-based state management library, for managing the internal state of the application via a unidirectional flow (@current-system, @figure-flux). This architecture puts constraints on how realtime collaboration can be integrated into Apollon, as the source of truth remains within the _Store_ managed by Redux, but it also provides a unique opportunity for the realtime collaboration component of Apollon to handle potential changes in a generic and diagram-type agnostic manner. Apollon interfaces with the external systems integrating it (the integrating platform) via the _ApollonEditor_ interface, which also exposes a modified diagram format compared to the internal diagram format it manages. Overall, these constraints define the general architecture and design of the realtime collaboration system independent of the specific realtime collaboration solution utilised, as described in @analysis-object-model.

To design a solution satisfying the requirements detailed in @functional-requirements and @non-functional-requirements, a standard format for Apollon's exposed _update_ messages is picked (JSONPatch @jsonpatch), ensuring interoperability and easy integration. Inspired by CRDT-based solutions for realtime collaboration (see @crdt-background), the exposed diagram format is updated to produce non-conflicting and commutative patching operations, solely according to the specified patching standard, for all independent changes made by users, while maintaining simplicity and performance. To ensure the latter, a client-server architecture is loosely expected from integrating platforms (though this can be adapted for other topologies as well), which natively decides on the order of conflicting patches, ensuring eventual consistency amongst peers.
Finally, a stuttering-prevention component is introduced for the system, ensuring a smooth user experience for the users under various network conditions. == Design Goals The design goals of the system, which drive subsequent design decisions, are described per @bruegge2004object, derived from non-functional requirements detailed in @non-functional-requirements: #v(4mm) - *Usability* \ Users of platforms integrating Apollon expect a smooth diagram editing experience, regardless of network conditions. Specifically, collaborative editing should not introduce any additional complexity or overhead to the user experience, as much as possible (#link(<nfr1>)[NFR1]). #v(2mm) - *Performance* \ Performance directly ties to the previous design goal, as a system with a lot of overhead will have detimental effects on user experience and portability (the next described design goal). For example, considering the web-based nature of Apollon and the wide-variety of devices that users might access it with, reconcilliation strategies that are taxing on the client-side resources such as CPU or memory should be avoided, specifically strategies that scale resource consumption based on content-size or number of users (#link(<nfr6>)[NFR6]). Or, depending on network conditions, if a user sees the changes made by their peers too late, the collaborative nature of the setting is disturbed (#link(<nfr5>)[NFR5]). Integrating platforms might also have various performance requirements of their own, which complicates integration of taxing solutions. For example, solutions that utilise large message payloads incur performance constraints on integrating platform servers and network topology (#link(<nfr7>)[NFR7]). #v(2mm) - *Portability* \ Apollon is designed to be easily integrated into other platforms. Its realtime collaboration system should also be quite straightforward to integrate, with minimal requirements on the integrating platform beyond the necessary message delivery mechanism (#link(<nfr8>)[NFR8]). It should not require any tight-coupling with the internal workings of the realtime collaboration solution, and should expose the system in a standard and interoperable manner, as integrating platforms might utilise the system for various use cases such as remote persistence (#link(<nfr9>)[NFR9]). #v(2mm) - *Extensibility* \ Apollon is specifically designed to be extensible, specially with regards to new diagram types, so any realtime collaboration solution must consider that and avoid tight-coupling with various features of Apollon, specifically various diagram types (#link(<nfr10>)[NFR10], #link(<nfr11>)[NFR11]). #v(2mm) - *Reliability* \ At a basic level, all collaborating peers should eventually see the same diagram for realtime collaboration to make sense to begin with (Eventual Consistency requirement). Beyond that, additional fault tolerance towards various network conditions is desirable as it allows users with various network conditions to collaborate without issues, and provides further flexibility for integrating platforms for handling various network conditions (#link(<nfr2>)[NFR2], #link(<nfr3>)[NFR3], #link(<nfr4>)[NFR4]). #v(2mm) === Trade-offs <trade-offs> The design goals described in the above section are outlined in order of priority for this work. Usability requirements are strong enough for a realtime collaboration system that many of them actually fall under functional requirements, as otherwise the definition of a realtime collaboration system would be violated. 
Performance requirements take second priority: they directly affect usability and, considering the design and intended use of Apollon, can also have a serious negative impact on portability (the next design goal). Portability and extensibility requirements follow, considering that Apollon is designed to be integrated into other platforms, to be extensible, and is developed via a student-driven development model. Reliability in the face of various network conditions takes last priority. Though important, integrating platforms can be expected to set up commonplace network systems such as a client-server setup, which greatly simplifies many issues with distributed reconciliation. Normal network conditions can also be expected in many cases: while out-of-order delivery will necessarily occur due to the realtime nature of the system (a user always sees their own changes out of order with respect to the changes of other peers), properties such as exactly-once-delivery or a preserved delivery order between the server and the clients can be expected (and are in fact the assumptions for popular solutions such as Operational Transformation @ot and Google Docs @ot-x-crdt). Consequently, in trade-offs between performance, portability, extensibility and additional reliability, the latter takes lower priority. That said, the design of the system should enable integrating platforms to mitigate such corner cases without requiring a tight coupling to the internals of the system.

== Data Format

As detailed in @related-work, solutions based on Operational Transformation techniques are not suitable for the particular case of realtime collaboration for Apollon, as they either incur considerable error-prone complexity in the code base or require integrating platforms to manage sequencing and transforming operations and hence become tightly coupled with the realtime collaboration system, both of which go against the outlined design goals, specifically extensibility and portability. Instead, we draw inspiration from CRDT-based solutions, in particular transforming the exposed diagram data format, alongside an exposed and standardised _update_ message format, which allows for non-conflicting and commutative operations.

To this end, the proposed solution specifically draws inspiration from $delta$-CRDTs (@delta-crdt), wherein _update_ (or _sync_) messages are deltas (or patches) applied to local replicas of state. Alternatives would be syncing the entire state, similar to CvRDTs @crdts, which would require large message sizes @ot-x-crdt, @ot-v-crdt, @yata, going against the performance design goal and its associated non-functional requirements, or communicating operations similar to CmRDTs @crdts, which would require an exactly-once-delivery guarantee from the network, going against the reliability design goal. Patches, on the other hand, are small but idempotent, allowing for a system design with further fault tolerance towards various network conditions.

=== Patch Format

Patches are the main added entity exposed to integrating platforms via the realtime collaboration system. Though integrating platforms can treat patches as a black box by merely broadcasting them to all participating peers, they might need to be able to interpret them and utilise them for various purposes, for example for maintaining remote backups of the diagram, updated based on patches issued by clients. The JSONPatch (RFC 6902) @jsonpatch format was subsequently picked as the format of patch messages to ensure interoperability and ease of integration.
JSONPatch is a web standard specified by the Internet Engineering Task Force (IETF) #footnote("https://www.ietf.org/") for expressing sequences of operations on JSON objects, using the JSON Pointer @jsonpointer standard for addressing specific parts of the JSON object. Due to its status as an approved web standard and the host of tools available to read, produce and apply JSONPatches in various languages @jsonpatch-exp, it satisfies the portability design goal through simplicity of integration and interoperability.

#v(4mm)
#figure(
  ```json
  [
    { "op": "replace", "path": "/baz", "value": "boo" },
    { "op": "add", "path": "/hello", "value": ["world"] },
    { "op": "remove", "path": "/foo" }
  ]
  ```,
  caption: [An example JSONPatch message @jsonpatch-exp.],
) <code-jsonpatch-example>
#v(4mm)

=== Diagram Schema <diagram-schema>

As mentioned in previous sections, to address the challenges of realtime collaboration, we also propose modifying the schema of UML diagrams exposed by Apollon to the integrating platform in a way that the diagram schema, in combination with the patch messages (which are in JSONPatch format), constitutes a data type that natively produces non-conflicting and commutative (patch) operations. This is done, however, while considering the extensibility and portability design goals previously outlined. In particular, the diagram schema cannot be overly tightly coupled with the realtime collaboration system, for example by including clock vectors for each participating user, or by representing strings as linked lists of characters with unique identifiers (as in a CvRDT G-Counter or a CmRDT sequence respectively, see @crdt-background). This means that not all operations produced by the resulting data type will be commutative. Instead, we attempt to ensure commutativity for operations that are naturally independent, and fall back to other conflict resolution mechanisms for operations that are not. To achieve this, we propose the following modifications:

#v(4mm)
- Use of mappings of unique identifiers to entities, instead of lists or sequences of entities that do have, or can naturally have, unique identifiers,
- Use of mappings of unique identifiers to boolean values (inclusion maps) to represent sets, instead of lists or sequences of such identifiers, when applicable,
- Treating other cases, including lists and strings that cannot be converted in this manner, as atomic objects.
#v(4mm)

#figure(
  ```json
  {
    "elements": [
      { "id": "<unique-id-1>", "type": "Package", ... },
      { "id": "<unique-id-2>", "type": "Class", ... },
    ],
    "relationships": [
      {
        "id": "<unique-id-3>",
        "source": { "element": "<unique-id-1>" },
        "target": { "element": "<unique-id-2>" },
        "path": [ { "x": 0, "y": 0 }, { "x": 10, "y": 10 } ],
        ...
      }
    ],
    "interactive": ["<unique-id-1>", "<unique-id-3>"]
  }
  ```,
  caption: [An example of a partial diagram in the original diagram schema]
) <code-schema-og>
#v(4mm)

@code-schema-og provides an example of the schema of UML diagrams exposed by Apollon prior to the work done in this thesis. Within this schema, though various UML elements can be identified by their unique IDs, their default addressing via JSON Pointer would utilise their index in the list. The result would be unnecessary conflicts between patches that add or remove elements, as outlined in @code-conflicting-patches.
#v(4mm)
#figure(
  ```json
  [ {"op": "add", "path": "/elements/0", "value": ...} ]
  [ {"op": "replace", "path": "/elements/0/type", "value": "Interface"} ]
  ```,
  caption: [Example of patches conflicting with original schema]
) <code-conflicting-patches>
#v(4mm)

This conflict would be resolved by applying the proposed modifications to the schema, as displayed by @code-schema-new. In this new schema, the `path` property of the first patch would become `/elements/<unique-id-4>`, while the `path` property of the second patch would become `/elements/<unique-id-1>/type`, resolving the conflict between these two independent patches and rendering them commutative.

#v(4mm)
#figure(
  ```json
  {
    "elements": {
      "<unique-id-1>": { "type": "Package", ... },
      "<unique-id-2>": { "type": "Class", ... },
    },
    "relationships": {
      "<unique-id-3>": {
        "source": { "element": "<unique-id-1>" },
        "target": { "element": "<unique-id-2>" },
        "path": [ { "x": 0, "y": 0 }, { "x": 10, "y": 10 } ],
        ...
      }
    },
    "interactive": {"<unique-id-1>": true, "<unique-id-3>": true}
  }
  ```,
  caption: [An example of a partial diagram in the modified diagram schema]
) <code-schema-new>
#v(4mm)

Note that not all lists and sequences can be replaced by mappings. For example, the `path` property of relationships in @code-schema-new (e.g. `relationships/<unique-id-3>/path`) is a list of points without unique identifiers, whose indices carry meaning only through their order in the path, leaving no straightforward way of turning it into a mapping without encoding additional conflict resolution mechanisms into the type. Consequently, we treat such addresses as atomic objects, i.e. any operation with a child path is uplifted to an operation replacing the whole list.

As noted above, not all patch operations produced by this modified data type are commutative; specifically, the effect of operations relating to the same path depends on the order of delivery. These operations are, however, idempotent, so we rely on the integrating platform to determine the "winning" patch for each path. In particular, an integrating platform with a client-server architecture, where the server guarantees the delivery order of its messages to each client, natively picks this "winning" patch by rebroadcasting all patches to all clients (including the client who issued the patch), hence achieving Eventual Consistency. In @boundary, and in more detail in @g-eval, we discuss mechanisms through which this can be achieved in less reliable network conditions.

== Subsystem Decomposition <subsystems>

The following section details the decomposition of the subsystems involved in the realtime collaboration system implemented for Apollon, explained in detail as per @bruegge2004object. @figure-subsystems provides a high-level overview of the subsystems involved in realtime collaboration, in the context of an example integration of Apollon. The newly added subsystem for realtime collaboration (denoted as _Realtime Collab._) is highlighted for better visibility of the changes implemented to Apollon and how these changes affect Apollon's interactions with integrating platforms.

#v(4mm)
#figure(
  image("figures/subsystems.svg", width: 80%),
  caption: [
    Subsystem decomposition of the implemented realtime collaboration system in Apollon, in an example integration scenario
  ],
) <figure-subsystems>
#v(4mm)

As described in previous sections, Apollon is a web-based UML editor that is integrated into the web-based client of an integrating platform.
Interactions between Apollon and the integrating platform are managed via the _ApollonEditor_ interface, with internal components contributing to the exposed APIs. In particular, the _Store_ component centrally manages all of the internal state of the application, including the state of the UML diagram being edited, with the _User Interface_ layer updating itself in response to changes in the _Store_. As detailed before, the content of the _Store_ is updated by the _Reducer_ layer (which aggregates reducers from various subsystems) in response to dispatched actions (potentially dispatched by the UI layer). These actions are passed through a _Middleware_ layer beforehand (which similarly aggregates middlewares from various subsystems), which might modify the incoming actions, dispatch new actions, or merely run side effects. This process can be observed in more detail in @figure-analysis-object.

The realtime collaboration subsystem interfaces with the _Store_ through three main constructs: (1) An exposed _Middleware_, which composes into the middleware layer and intercepts all changes made to the _Store_, triggering the _Patcher_ to produce any patches and emit them if necessary. (2) An exposed _Reducer_, which intercepts _patch actions_ (actions dispatched in response to patches received from remote peers) and triggers the _Patcher_ to calculate the necessary updates to the _Store_. (3) An exposed _Saga_, which is a middleware with long-running side effects, re-aligning the layout of the local diagram after changes are applied due to an incoming remote patch (e.g. ensuring connected elements remain connected).

The _Patcher_ itself interacts with the _Patch Verifier_ during both processes: it uses the verifier to _sign_ patches that are to be emitted, and to _verify_ incoming patches by matching them against previously signed patches. This verification system allows the realtime collaboration subsystem to optimistically suppress unnecessary remote patches and avoid stuttering, and is further detailed in @stuttering.

The integrating platform interfaces with the realtime collaboration subsystem by subscribing to the _Patcher_ (for emitting local changes) or by dispatching _patch actions_ (for receiving remote changes), both through the _ApollonEditor_ interface. The integrating platform can then utilise received patches to update local or remote storage, or broadcast the change to other peers. The integrating platform is required to deliver emitted patches to all clients, including the client issuing the patch (see @diagram-schema).

#v(4mm)
#figure(
  image("figures/classes-detailed.svg"),
  caption: [
    Detailed class diagram of the implemented realtime collaboration system
  ],
) <figure-classes>
#v(4mm)

@figure-classes further details the concepts in the realtime collaboration subsystem. The _Patcher_ is the central component of the system, responsible for tracking changes to the state and emitting patches when necessary. The _Patcher_ specifically allows for subscription to two types of changes: "discrete" changes, which denote important lower-frequency changes (e.g. adding a new element, moving an element), and "continuous" changes, denoting higher-frequency changes that can be dropped or throttled (e.g. patches related to moving an element while it is being moved by the user). Utilising continuous patches, the integrating platform can provide a smoother user experience at the expense of higher bandwidth usage, a choice that is left to the integrating platform and provides flexibility for various network conditions.
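To illustrate how an integrating platform can consume this interface, the following sketch wires the _subscribe()_ and _patch()_ methods discussed in @analysis-object-model to a WebSocket connection. It is purely illustrative: the exact method signatures, the payload type and the transport are assumptions of this sketch, not a prescription of the actual API.

```ts
// Illustrative sketch only: the shape of the editor interface and the WebSocket
// transport are assumptions of this example, not the definitive Apollon API.
type JSONPatch = { op: string; path: string; value?: unknown }[];

interface RealtimeEditor {
  subscribe(listener: (patch: JSONPatch) => void): void; // emitted local changes
  patch(patch: JSONPatch): void;                         // apply remote changes
}

declare const editor: RealtimeEditor; // an already-instantiated editor
const socket = new WebSocket("wss://collab.example.org/diagrams/42");

// Relay locally produced patches to the server.
editor.subscribe((patch) => socket.send(JSON.stringify(patch)));

// Feed patches rebroadcast by the server (including our own) back into the editor.
socket.addEventListener("message", (event) => {
  editor.patch(JSON.parse(event.data as string) as JSONPatch);
});
```

The server side then only needs to rebroadcast received messages to all connected clients, including the sender, in line with the delivery requirement stated above.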
=== Stuttering Prevention <stuttering>

Stuttering happens in scenarios where a client bounces between states in rapid succession. Take the following scenario as an example:

1. Clients A and B start from state $I$.
2. Client A applies $O_X$ to the diagram, yielding state $X$.
3. Client B applies $O_Y$ (of the same scope) to the diagram, yielding state $Y$.
4. Both clients receive $O_X$ from the server, syncing on state $X$.
5. Both clients receive $O_Y$ from the server, syncing on state $Y$.

The following are the sequences of states experienced by the two clients:

$ A: I -> X -> X -> Y \
B: I -> Y -> X -> Y $

In this scenario, client A experiences a smooth transition of states, while client B experiences stuttering. Considering the condition on the network layer (provided by the integrating platform) to rebroadcast all patches even to the issuing client, the stuttering effect shown above can also happen to a single client:

1. Client applies $O_X$, yielding state $X$.
2. Client applies $O_Y$, yielding state $Y$.
3. Client receives $O_X$ from the server, syncing on state $X$.
4. Client receives $O_Y$ from the server, syncing on state $Y$.

The client will subsequently experience the following sequence of states:

$ I -> X -> Y -> X -> Y $

#v(4mm)

To mitigate this stuttering issue, the _Patch Verifier_ component was added to the realtime collaboration subsystem. When patches are to be emitted, the _Patcher_ asks the _Patch Verifier_ to sign each operation within the patch. The verifier checks each operation and assigns a unique, randomly generated identifier to each replace operation (as other operations do not yield stuttering). The verifier also records the corresponding path of the patch alongside the unique identifier, for future verification of incoming patches (@figure-st-sign).

#v(4mm)
#figure(
  image("figures/st-sign.svg", width: 40%),
  caption: [
    Process of signing operations of a patch
  ],
) <figure-st-sign>
#v(4mm)

When a patch is received, the _Patch Verifier_ checks, for each operation, whether it is signed and whether its path is already recorded. If so, the operation is skipped, and if the patch is self-signed, the path is removed from the record, allowing subsequent patches on this path to be applied (@figure-st-verify).

#v(4mm)
#figure(
  image("figures/st-verify.svg", width: 65%),
  caption: [
    Process of verifying operations of incoming patches
  ],
) <figure-st-verify>
#v(4mm)

Assuming each client receives patches from the server in the correct order, this mechanism allows clients to optimistically suppress patches that they know will be overwritten by a subsequent patch, hence avoiding stuttering. In effect, a client will _wait for confirmation_ for each patch it issues to the server, and suppress all incoming patches on that path until it receives the confirmation.

#pagebreak()

=== Example Integrations

The implemented solution was also integrated into two platforms integrating Apollon: Apollon Standalone and Artemis. @figure-apollon-standalone-integration and @figure-artemis-integration provide an overview of the integration of the realtime collaboration system into these platforms.

#v(4mm)
#figure(
  image("figures/apollon-standalone.svg", width: 60%),
  caption: [
    Integration of realtime collaboration into Apollon Standalone
  ],
) <figure-apollon-standalone-integration>
#v(4mm)

Apollon Standalone uses a local persistence component that stores diagrams locally, as well as a file storage service storing diagrams remotely when they are shared.
The local persistence component subscribes to the UML diagrams directly through the store itself #footnote([The interfaces provided by the _Patcher_ and the _Store_ are aggregated in the _ApollonEditor_ interface. This detail is omitted from the diagram for clarity.]), and the collaboration service on the client side subscribes to incoming patches from the _Patcher_ component of Apollon, which are relayed to the server via a WebSocket API. The service broadcasts these patches back to all connected clients, and also applies the patch to the remotely persisted diagram.

#v(4mm)
#figure(
  image("figures/artemis.svg"),
  caption: [
    Integration of realtime collaboration into Artemis
  ],
) <figure-artemis-integration>
#v(4mm)

Artemis integrates the realtime collaboration of Apollon differently. The context of collaboration for Artemis is team-based modeling exercises, and upon emission of patches to the server, additional access control needs to be conducted to ensure that only valid team members receive the patches. To increase performance, the process is completely decoupled from persistence, which is handled by the Artemis client periodically checking the diagram and syncing it with the server, independently of the realtime collaboration system.

#pagebreak()

== Boundary Conditions <boundary>

The proposed system achieves convergence (which is analogous to eventual consistency under the described network conditions) in a client-server setup under the following network conditions, assuming that the integrating platform simply broadcasts all incoming patches to all clients (including the client that issued the patch):

#v(4mm)
- Exactly-once-delivery: each message sent by the client to the server and vice versa is delivered exactly once.
- Preserved delivery order: the order of delivery of messages from the server to the client is preserved.
#v(4mm)

Note that while the preserved delivery order of protocols such as TCP @tcp ensures that messages transferred between the server and the client are delivered in the order they were sent, clients will still, inevitably, see different changes (specifically those issued locally) out of order, which is why commutativity of independent operations is a requirement regardless of network guarantees. Additionally, the exactly-once-delivery guarantee cannot be upheld under certain network conditions, for example where intermittent disconnections affect some clients. While the idempotency of operations described via the proposed solution allows for mitigation strategies such as repeated retries, as such strategies are implemented in a separate network layer, they might violate the preserved order guarantee instead (i.e. messages $a$ and $b$ are sent, $b$ is received while $a$ fails, and the system retries $a$, which results in out-of-order delivery of the messages).

Mitigation for such conditions is not baked into the proposed solution itself, in favor of meeting the expressed design goals (specifically performance, portability and extensibility). Instead, the system is designed in a manner that accommodates such mitigation strategies. One such strategy would be to repeat the latest operation of each path, in an unsigned manner, to each client until the client confirms receipt. While operations of differing paths might be delivered out of order, their commutativity ensures that the clients converge on the same state. The removal of the signature also ensures that the stuttering suppression mechanism does not overrule the patch.
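A minimal server-side sketch of this repeat-until-acknowledged strategy is outlined below; it is purely illustrative, and the acknowledgement messages, the signature field and the transport helpers are assumptions of the sketch rather than part of the implemented system. The following paragraph refines the idea so that stuttering prevention is still respected.

```ts
// Illustrative sketch: keep the latest operation per path and re-send it, unsigned,
// until every client has acknowledged it. All names here are assumptions.
type Operation = { op: string; path: string; value?: unknown; token?: string };

const latestPerPath = new Map<string, Operation>();
const pendingAcks = new Map<string, Set<string>>(); // path -> clients yet to acknowledge

declare function broadcast(patch: Operation[]): void;                // assumed transport helper
declare function sendTo(clientId: string, patch: Operation[]): void; // assumed transport helper

function onPatchReceived(patch: Operation[], connectedClients: Set<string>): void {
  for (const op of patch) {
    latestPerPath.set(op.path, op);
    pendingAcks.set(op.path, new Set(connectedClients)); // every client must confirm this path again
  }
  broadcast(patch); // regular signed broadcast to all clients, including the sender
}

function onAckReceived(clientId: string, path: string): void {
  pendingAcks.get(path)?.delete(clientId);
}

// Periodically re-send the latest operation of each unconfirmed path, with the
// signature stripped so that stuttering suppression does not discard it.
setInterval(() => {
  for (const [path, missing] of pendingAcks) {
    if (missing.size === 0) continue;
    const { token, ...unsigned } = latestPerPath.get(path)!;
    for (const clientId of missing) sendTo(clientId, [unsigned]);
  }
}, 2000);
```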
To respect stuttering prevention, the network layer can, for example, send the unsigned version of the patch with a preset delay, avoiding the stuttering effect as well. This naive strategy ensures eventual consistency amongst clients in network conditions where merely eventual delivery is guaranteed. The flexibility of the design of the system also allows integrating platforms to implement more sophisticated strategies. For example, using totally ordered logical timestamps allows the integrating platform to operate in a fully distributed manner, though it would incur the associated overheads and complexities. The integrating platform can also detect prolonged disconnections of a client and resync the diagram wholesale. Further analysis of how various consistency properties can be achieved under various network conditions and topologies is provided in @eval.

#pagebreak()

= Theoretical Analysis <theoretical-analysis>

This section provides a theoretical analysis of the proposed solution, evaluating the conditions under which it can achieve the desired consistency properties of a realtime collaboration system. As a first step, a generalisation of the proposed solution (as detailed in @system-design) is provided (@general-form), followed by a detailed evaluation of this general form under various network conditions, alongside an evaluation of the applicability of these general criteria to the specific case of the proposed solution (@eval). In the later parts of this chapter, the evaluation results and the limitations of the provided analysis are further discussed.

== Overview

The gist of the solution proposed by this thesis for realtime collaboration in Apollon can be deconstructed as a two-step approach:

- We first (re)design the data format to break it into separate and independent operational scopes, where operations from different scopes are natively commutative and non-conflicting,
- We ensure strong idempotency properties for operations within each scope, allowing for a mechanism to determine the _winning order_ or the _winning operation_ within each scope, with the complexity of said mechanism solely determined by the feasible network topology (e.g. a simple relay in a client-server setup).

Based on these, a variety of actual mechanisms can be designed and implemented, ranging from a naive relay to a fully distributed setup, either treating the operations as black boxes, or coupling to the "scope" they affect. These notions can be generalised using the introduced concept of _lenses_, which are arbitrary functions mapping the data to some other data type. Operations can then be categorised based on the lenses that see the changes they make (i.e. the change is reflected by the lens), establishing _operational scopes_. Commutativity between operations of different scopes can then be achieved if the original data type can be reconstructed from the lenses associated with these scopes. These operational scopes are then used to define the required idempotency properties within each scope.

== General Form <general-form>

=== Definitions <defs>

We assume a data type (an arbitrary set) $D$, equipped with an equality operator $=$, and a set of operations $O_D$ in the general form of $f: D -> D$. Within this context, we define a _lens_ as an arbitrary function from $D$ to some other domain $D'$ similarly equipped with an equality operator, i.e. a _lens_ is a function $ell: D -> D'$.
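For intuition, a lens in this sense is just an ordinary function over the data; the following minimal illustration (our own, using a deliberately simplified diagram type) shows two such lenses and how they relate to operations:

```ts
// Illustration only: a deliberately simplified diagram type and two lenses over it.
type Diagram = {
  elements: Record<string, { type: string; name: string }>;
  interactive: Record<string, boolean>;
};

// A lens selecting the type of one particular element: ℓ: D -> D' with D' = string | undefined.
const typeOfElement = (id: string) => (d: Diagram): string | undefined =>
  d.elements[id]?.type;

// A lens projecting the set of interactive elements.
const interactiveSet = (d: Diagram): Record<string, boolean> => d.interactive;

// An operation that only renames an element changes the data but not either projection,
// so neither lens "sees" it; an operation changing the type of element "a" is seen by
// typeOfElement("a") but not by interactiveSet — this is how lenses carve out scopes.
```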
For a given set of lenses, $Gamma_D$, the target domain $D'$ is determined implicitly as a set of all possible domains of members of $Gamma_D$, constructed in any arbitrary manner from its constintuents, but specifically in a manner that still provides an equality operator. For readability, we refer to the result of applying lens $ell$ on data $d$ as _$d$ viewed through (or under) $ell$_ and denote it by $d_ell$. As a first step, we utilise the concept of lenses to provide definitions for the notion of _operational scope_ of any given operation: #definition[ A _lens sees an operation_, iff any data that changes under the operation also changes under it when viewed through that lens. Formally, we denote this relation with the $sees$ symbol, so for $ell in Gamma_D$, $f in O_D$ we have: #v(4mm) $ ell sees f <=> forall d in D: d != f(d) => d_ell != f(d)_ell $ ] <def-sees> #definition[ Two operations _share scope_ iff any lens that sees one, also sees the other. We denote this relationship by the $sharesscope$ symbol, and for all $f, g in O_D$ we have: #v(4mm) $ f sharesscope g <=> forall ell in Gamma_D: ell sees f <=> ell sees g $ #v(2mm) It is trivial to see that sharing scope is reflexive, commutative and transitive, so it can be used to partition operations into _scopes_. We define the scope of each operation $f$, denoted by $Delta f$, as the set of all operations sharing scope with $f$, i.e.: #v(4mm) $ Delta f := { g in O_D | f sharesscope g } $ ] <def-sharesscope> #pagebreak() For the next step, we provide definitions to capture the notion of _independence_ of operations of varying scopes: #definition[ Operation $f$ _is blind to_ operation $g$ under lens $ell$, iff: #v(4mm) $ forall d in D: f(g(d))_ell = f(d)_ell $ We denote this relationship with the $blindto_ell$ symbol ($f blindto_ell g$). ] <def-blind-under> #v(4mm) #definition[ A given lens $ell$ _is blind to_ operation $f$, iff it does not see the effect of $f$ on any data. We denote this relationship by the $blindto$ symbol, and have: #v(4mm) $ ell blindto f <=> forall d in D: d_ell = f(d)_ell $ #v(4mm) ] <def-blind> #definition[ A lens $ell$ _selects_ an operation $f$ (or its scope), or $ell$ _is a selector for_ $f$ (or its scope), iff it sees $f$ and is blind to any operation $g$ outside of the scope. We denote this relationship with the $selects$ symbol, and have: #v(4mm) $ ell selects f <=> (ell sees f) and (forall g in.not Delta f: ell blindto g) $ ] <def-selects> #v(4mm) With these baseline definitions established, we can derive basic conditions for commutativity of operations with differing scopes: #proposition[ Given operation $f$ selected by given lens $ell$, $f$ commutes with operation $g$ outside of its scope under $ell$, if and only if $f$ is blind to $g$ under $ell$. 
<prop-com> #proof[ For any $g in.not Delta f$ where $f$ is blind to $g$ under $ell$ we have: $ f blindto_ell g & => forall d in D: f(g(d))_ell = f(d)_ell \ ell blindto g & => forall d in D: g(f(d))_ell = f(d)_ell \ & => forall d in D: f(g(d))_ell = g(f(d))_ell \ & => (f compose g)_ell = (g compose f)_ell $ Conversely, for any $g in.not Delta f$ that commutes with $f$ under $ell$ we have: $ ell blindto g & => forall d in D: g(f(d))_ell = f(d)_ell \ (f compose g)_ell = (g compose f)_ell & => forall d in D: g(f(d))_ell = f(g(d))_ell \ & => forall d in D: f(g(d))_ell = f(d)_ell \ & => f blindto_ell g $ Altogether, we have: $ ell selects f => forall g in.not Delta f: f blindto_ell g <=> (f compose g)_ell = (g compose f)_ell $ ] ] === Reconstruction <rec> Having established sufficient criteria for operations of different scopes to commute under their respective selectors, we can now expand this notion to general commutativity by reconstructing the original data from a given set of lenses. For an ordered set of lenses $Gamma$, denoted by $angle.l ell_1, ell_2, ... angle.r$, we express the image of any data $d in D$ by $angle.l d_ell_1, d_ell_2, ... angle.r$, denoted as $d_Gamma$. Subsequently we denote the image of D under $Gamma$ as $D_Gamma$: $ D_Gamma equiv { d_Gamma | d in D } $ #v(2mm) #definition[ Some data type $D$ _can be reconstrcuted_ from an ordered set of lenses $Gamma$, iff there exists function $R: D_Gamma -> D$ such that: #v(4mm) $ forall d in D: R(d_Gamma) = d $ ] <def-rec> #v(2mm) #proposition[ (_Reconstruction_) Given a data type $D$, a set of operations $O$, and an ordered set of lenses $Gamma$, if the following conditions hold: 1. $Gamma$ consists of selectors of O: $forall ell in Gamma exists f in O: ell selects f$ 2. operations not sharing scope are blind to each other under their selectors, 3. we can reconstruct $D$ from $Gamma$, Then operations of different scopes commute. <prop-rec> #proof[ We first show that operations of different scopes commute under all lenses. #link(<prop-com>)[Proposition 1] already proves that such operations commute under their respective selectors, so it suffices to show that operations commute under lenses selecting neither: $ ell blindto f and ell blindto g & => \ forall d in D:& f(g(d))_ell = g(d)_ell = d_ell \ & = f(d)_ell = g(f(d))_ell \ & => (f compose g)_ell = (g compose f)_ell $ This means $f$ and $g$ commute under $Gamma$: $ forall d in D: f(g(d))_Gamma & = angle.l f(g(d))_ell_1, f(g(d))_ell_2, ... angle.r \ & = angle.l g(f(d))_ell_1, g(f(d))_ell_2, ... angle.r \ & = g(f(d))_Gamma \ & => (f compose g)_Gamma = (g compose f)_Gamma $ Since we can reconstruct $D$ from $Gamma$, for example using reconstruction function $R$, we can prove $f$ and $g$ generally commute: $ forall d in D: f(g(d)) &= R(f(g(d))_Gamma) \ &= R(g(f(d))_Gamma) \ &= g(f(d)) \ & => f compose g = g compose f $ ] ] Intuitively, this result indicates that given a particular data type and its associated operations, if we can find a set of lenses that partition operations into independent scopes (operations of one scope being blind to operations of another under their selector) and reconstruct the data from these lenses, then the operations of different scopes will commute. === Idempotency <idem> As described in @system-design, the proposed solution relies on idempotency of operations within each scope to resolve their conflicts. 
In particular, an operation $f$ might be applied locally by the client, and then reapplied when the client receives a sequence of other operations from the server, including said operation. We capture this notion in the definition of _strong idempotency_: #definition[ An operation $f$ is _strongly idemptoent_ with respect to a set of operations $O$, iff for any finite (potentially empty) sequence $G$ of operations of $O$ in form of $angle.l g_1, g_2, ... angle.r$, we have: #v(4mm) $ f compose G compose f = f compose G $ Here, application and composition of ordered sets of operations is equivalent to application and composition of their elements in given order, i.e. $ & f compose G = f compose g_1 compose g_2 compose ... \ & G compose f = g_1 compose g_2 compose ... compose f \ & G(d) = g_1(g_2(...(g_n(d))...)) \ & G compose G' = g_1 compose g_2 compose ... compose g_n compose g_1' compose g_2' compose ... compose g_m' $ ]<def-str-idem> #v(4mm) Strong idempotency in particular allows us to fulfill the realtime requirment of a realtime collaboration system, while also achieving eventual consistency using a client-server architecture with exactly-once-delivery and preserved delivery order guarantees. If we have independent operational scopes that can reconstruct the data, we can achieve strong idempotency by merely ensuring each operation is strongly idempotent with respect to its own scope. #proposition[ Given a data type $D$, a set of operations $O$, and an ordered set of lenses $Gamma$, if conditions of #link(<prop-rec>)[_Reconstruction_] (Proposition 2) hold and operations are strongly idempotent with respect to their scopes, then operations are strongly idempotent with respect to $O$. <prop-stid> #proof[ We use strong induction on the size of the operation sequence $X$. For the initial step of induction, assume $X = angle.l x angle.r$. We now prove that given operation $f$ is strongly idempotent with respect to ${x}$ under all lenses, i.e.: $ forall ell in Gamma: (f compose x compose f)_ell = (f compose x)_ell $ - If $ell blindto f$, $ell blindto x$, then we have $forall d in D: f(x(f(d)))_ell = d_ell = f(x(d))_ell$ - If $ell blindto f$, $ell selects x$, then we have: $ forall d in D: f(x(f(d)))_ell &= x(f(d))_ell #h(20mm)& because ell blindto f \ &= x(d)_ell & because x blindto_ell f \ &= f(x(d))_ell & because ell blindto f $ - If $ell selects f$, $x in.not Delta f$, then we have: $ forall d in D: f(x(f(d)))_ell &= f(f(d))_ell #h(20mm)& because f blindto_ell x \ &= f(d)_ell & because f "is idempotent" \ &= f(x(d))_ell & because f blindto_ell x $ - If $ell selects f$, $ell selects x$, then $x in Delta f$ and we have strong idempotency by assumption. We now use the reconstruction function to expand this result to the whole data type: $ forall d in D: (f compose x compose f)(d) &= R(angle.l (f compose x compose f)(d)_ell_1, (f compose x compose f)(d)_ell_2, ... angle.r) \ &= R(angle.l (f compose x)(d)_ell_1, (f compose x)(d)_ell_2, ... angle.r) \ &= (f compose x)(d) \ $ As the next step of induction, for any larger sequence $X$, we break it into smaller sequences $X_1$ and $X_2$. 
Due to induction, we have:

$ f compose X compose f &= f compose X_1 compose X_2 compose f &\
&= f compose X_1 compose f compose X_2 compose f #h(20mm)& because f compose X_1 = f compose X_1 compose f \
&= f compose X_1 compose f compose X_2 & because f compose X_2 compose f = f compose X_2 \
&= f compose X_1 compose X_2 & because f compose X_1 compose f = f compose X_1 \
& = f compose X $

#v(4mm)
]
]

It is notable that the operations of the proposed realtime collaboration solution for Apollon display a stronger property than _strong idempotency_: they effectively _overwrite_ the content of their respective scopes. This notion can be expressed by an operation being blind to other operations of a scope, including potentially itself, under the selectors of the scope:

#definition[
An operation $f$ _overwrites_ a scope $Delta$, iff it is blind to all operations of $Delta$ under their selectors, i.e.:

$ forall ell selects Delta, forall g in Delta: f blindto_ell g $
]

Needless to say, if independent operational scopes are established, then an operation can only overwrite its own scope. In such a condition, the operation is also necessarily strongly idempotent with regards to its scope. The converse is, however, not true, for example given the following setup:

$ & f := (x, y) -> cases(
  (x + 1, 1) text(#h(20mm)& "if" y = 0),
  (x, y) & "otherwise"
) \
& g := (x, y) -> (3, y) \
& O := {f, g}, Gamma := { p -> p }, D := ZZ^2 \
$

Operation $f$ is strongly idempotent with respect to its scope, but it does not overwrite its scope (e.g. $f(0, 0) != (f compose g)(0, 0)$).

#pagebreak()

== Evaluation <eval>

In the following section, we evaluate which consistency properties can be achieved when the generalised criteria described in @general-form are satisfied, and under which network conditions. We then evaluate the applicability of these criteria to the specific case of the proposed solution for realtime collaboration in Apollon.

=== General Form Evaluation <g-eval>

==== Client-Server Setup, Reliable Network

For a client-server setup with exactly-once-delivery and preserved delivery order guarantees, it can be observed that _strong idempotency_ of operations suffices to achieve _Convergence_ (i.e. all clients will eventually converge to the same state @cci). As mentioned before, low-level network protocols such as TCP guarantee preserved delivery order @tcp, and exactly-once delivery can also be assumed if no serious partitions occur (i.e. clients do not get disconnected). In such a scenario, it suffices for the server to broadcast the operations it receives to all clients, including the client who issued the operation. Assuming client A conducts operation $a$ while client B conducts operation $b$ locally, the server broadcasts the sequence $angle.l a, b angle.r$, and both clients started from the same initial state $i$, it is easy to observe how their respective states, denoted by $S_A$ and $S_B$, converge:

$ S_A = (a compose b compose a)(i) = (a compose b)(i) #h(20mm)& because a "is strongly idempotent" \
S_B = (a compose b compose b)(i) = (a compose b)(i) #h(20mm)& because b "is idempotent" \
$

A similar process happens with larger sequences of events, as strong idempotency ensures that the operations executed locally to provide the realtime aspect of the system have no effect on the final state each client converges to, which can be derived purely from the sequence of operations the server broadcasts. In this scenario, _Strong Convergence_ is also achieved.
As per @crdts, _Strong Convergence_ requires that all clients that have received the same set of operations have converged on the same state. If two clients have observed the same operations in the described setup, they have necessarily observed them in the same order as delivered by the server, which as above, results in the same state. ==== Client-Server Setup, Unreliable Network For a client-server setup where eventual delivery is the only network guarantee (i.e. all messages will eventually be delivered), a data type partitioned into independent operational scopes that can be reconstructed from these scopes (i.e. requirements of #link(<prop-rec>)[_Reconstruction_], Proposition 2, are met), with operations that _overwrite_ their scopes, can achieve _Eventual Consistency_ (convergence under the described network conditions @crdts). To that end, the server needs to replay the last operation of each scope to all clients until it receives an acknowledgement from said client for said operation. Operations of various scopes might arrive at clients out-of-order, but they commute and subsequently yield the same final state (as per #link(<prop-rec>)[Proposition 2]). Additionally, since the operations overwrite their respective scopes, it suffices for eventual convergence within each scope that each client eventually receives the last operation of said scope, which is guaranteed by the network conditions. Note that this scenario does not satisfy _Strong Eventual Consistency_, as it is possible for a client to have received the same set of operations of a particular scope with a different order, resulting in two clients who have observed the same set of operations but have not converged on the same state. ==== Distributed Setup In a distributed setup with an eventual delivery guarantee, if conditions of #link(<prop-rec>)[_Reconstruction_] hold and operations overwrite their scopes, _Strong Eventual Consistency_ (and by extension, _Eventual Consistency_) can be achieved. In this scenario, we additionally treat each operational scope as a LWW Register #footnote([Last-Write-Wins Register]) CRDT, as described in @crdt-list, for example we can attach a logical timestamp combined with a unique client identifier to each operation, and use it to reach consensus on the _winning_ operation amongst peers. With this modification, operations within each scope will commute, and due to #link(<prop-rec>)[Proposition 2], operations of different scopes also commute, achieving full commutativity. ==== Intention Preservation <int-prev> Intention Preservation is another consistency requirement desired for realtime collaboration systems, which ensures that the intention of each operation is preserved when it is executed in different clients upon different states @cci. The effect of an operation can be expressed with its scope, in which case intention preservation is achieved when operational scopes are independent and operations overwrite their respective scopes, since the context of application does not affect the operation anymore. That said, for example in the case of the solution implemented for Apollon, where paths of UML relationships are treated as atomic registers, two clients moving two connected elements at the same time might converge on a diagram where the elements are no longer connected. 
Specifically, if clients A and B move connected elements $e_A$ and $e_B$ respectively, the operational scope related to the path of the connection between the elements will be overwritten by one, for example client A, which leads to the potential of it being disconnected from $e_B$. In a sense, the intention is preserved: client A intended for the path to be connected to $e_A$ and end where $e_B$ used to be when they made their changes, but it in a more real sense, the intention of both clients was to maintain the elements connected. As noted by @cci, intention preservation only takes "syntactic intention" into account, but the provided example yields a "semantic violation of intention" that arguably results in a diagram with a wrong syntax, although the intention based on the syntax of the diagram object (the UML diagram schema) is preserved. === Evaluation of Proposed Solution <s-eval> Since the operations of the proposed solution are JSONPatches, a natural candidate for lenses is functions selecting each possible operation path (JSONPointer). Within this context, it is trivial to observe that _replace_ operations of different paths do constitute independent operational scopes, and each operation overwrites its scope, if the paths of the patches issued are detailed to the level of atomic objects (whether they are atomic by nature, like boolean or string values, or are treated atomic as perscribed by @diagram-schema): Since these paths point to atomic objects, two operations either point to the same object (in which case each operation overwrites its scope), or they point to different atomic objects and are independent (the lenses corresponding to their paths select them and are blind to the other, and they are blind to each other under these lenses). #figure( ```json { "a": { "b": 2, "c": ["x", "y"] } } A: [{ "op": "replace", "path": "/a/b", "value": 3}], B: [{ "op": "replace", "path": "/a/b", "value": 4}], C: [{ "op": "replace", "path": "/a/c", "value": ["x", "z"]}], D: [{ "op": "replace", "path": "/a", "value": { "b": 6, "c": [] }}] ```, caption: [ Example JSON document with atomic and non-atomic replace operations ], ) <figure-json> Note that this operational independence between replace operations would not be true if paths used in operations did not point to atomic objects. For example, in @figure-json, operations A and C are independent, and operations A and B share the scope and overwrite it, both selected by the lens corresponding to the path `"/a/b"`. However, operations A and D are not independent, as the path of D is a parent of the path of A, and the operations are not blind to each other under the lens corresponding to the path `"/a"`. As described in @diagram-schema, arrays that can't be transformed into mappings with unique ids are treated as atomic values in the proposed solution. Subsequently, no _add_ or _remove_ operations with paths traversing array indices are produced. With this complexity resolved, _add_ and _remove_ operations can also be assumed as _replace_ operations, respectively from and to some null value. We can also assume that each _add_ or _remove_ operation is broken into several _add_ or _remove_ operations with atomic values, to preserve operational independence described above. For example, in @figure-json, an _add_ operation for path `"/a"` can be broken into two _add_ operations for paths `"/a/b"` and "`/a/c`". 
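A small sketch of this decomposition (our own illustration, not the implemented code; JSON Pointer escaping of special characters in keys is ignored for brevity) could recursively flatten the value of an _add_ operation into operations on atomic child paths:

```ts
// Illustrative sketch: flatten an "add" with an object value into adds on atomic
// child paths, so that operational independence between paths is preserved.
type AddOp = { op: 'add'; path: string; value: unknown };

function flattenAdd(op: AddOp, isAtomic: (path: string) => boolean): AddOp[] {
  const { path, value } = op;
  // Arrays without unique ids (and primitives) are treated as atomic values.
  if (isAtomic(path) || typeof value !== 'object' || value === null || Array.isArray(value)) {
    return [op];
  }
  return Object.entries(value as Record<string, unknown>).flatMap(([key, child]) =>
    flattenAdd({ op: 'add', path: `${path}/${key}`, value: child }, isAtomic),
  );
}

// flattenAdd({ op: 'add', path: '/a', value: { b: 6, c: [] } }, () => false)
// yields [{ op: 'add', path: '/a/b', value: 6 }, { op: 'add', path: '/a/c', value: [] }]
```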
Overall, with specified criteria, the proposed solution does induce independent operational scopes with operations overwriting their respective scopes. It is trivial to reconstruct the JSON object from any given set of JSON Pointers and their associated atomic values. == Discussion <disc> The results outlined in @eval, demonstrate that utilising the proposed solution for realtime collaboration for Apollon, we can achieve _Convergence_ and _Strong Convergence_ in a client-server setup with reliable network conditions, _Eventual Consistency_ but not _Strong Eventual Consistency_ in a client-server setup with unreliable network, and we achieve _Strong Eventual Consistency_ in a distributed setting by adding a distributed layer on top of the implemented solution. A particularly interesting aspect of the provided generalised solution is its relation with other techniques such as CRDTs. In particular, the structure of many (but not all) CRDTs can be expressed via independent operational scopes, and a mechanism for determining "winning" operations within each scope. For example, a CRDT sequence, as described in @crdt-background, can be expressed with a data type of mapping of unique identifiers (with a total order) to characters, lenses that select each such identifier, and operations for mapping a character to an identifier (_insert_) or marking such an identifier as deleted (_delete_). The main difference from this perspective is that operations within each scope are also specifically designed, in tandem with the data type, to commut as well, yielding _Strong Eventual Consistency_. As discussed in @eval, the generalised form of the proposed solution (and by extension, the specific solution proposed for Apollon) can also be extended to achieve _Strong Eventual Consistency_ by attaching similarly unique identifiers with an established total ordering to each operational scope. In effect, the proposed solution of this thesis deconstructs the process of designing a CRDT, retaining the essential aspect of operational independence, but avoiding complexity necessary for achieving full commutativity. Instead, the solution relies on network guarantees, or subsequent adoptions, to ensure convergence, providing an extensible bedrock that avoids complexities and overheads associated with CRDTs while remaining flexible enough for their addition in cases where stronger guarantees are needed. == Limitations <limits> The provided theoretical analysis and evaluation considers only consistency properties for the proposed solution. A key missing aspect here is performance analysis, specifically as the proposed solution was preferred over CRDTs in part due to it avoiding performance overheads attached with CRDTs. While intuitively it is easy to see why the simpler solution provided here, with its lack of awareness of participants or avoidance of maintaining or communicating data beyond what is required for representation of diagrams, would be more performant, a more detailed analysis of this aspect is needed. Similarly, while the provided analysis does outline the achievable consistency properties in various conditions, it is lacking in deriving minimum requirements of the external environment (feasible network topology, network guarantees) based on desired consistency properties. For example, it is not clear if _Strong Convergence_ suffices for a particular application, which network guarantees would be required minimally. 
The evaluation is in need of a more in-depth analysis of intention preservation and its correlation with the expressed criteria and framework. The provided example in @int-prev highlights this shortcoming, as it cannot be clearly argued whether syntactic intention is preserved, even for the special case of realtime collaboration in Apollon from which the generalised form was derived. The provided analysis merely hints at the connection between the outlined general framework and CRDTs, leaving room for a more in-depth analysis of the properties of CRDTs that can be relaxed in favor of simplicity and performance, and of the implications of such relaxations on consistency. The perceived "simplicity" that is achieved here is also only described qualitatively, and while this might be easy to express more precisely in comparison with CRDTs where information about participants is embedded into the data type itself, or in cases where deriving the representational data from the replicated schema is non-trivial, the present work falls short of providing such an expression, or of further investigating cases where the distinction is not trivial.

#pagebreak()

= Summary <summary>

In the following sections, we summarize the status of the thesis, providing an overview of the fulfillment of the outlined functional and non-functional requirements, and of the achieved and open goals. We conclude by providing an overview of the contributions made by this work and potential future work.

== Status <status>

@fr-status and @nfr-status provide an overview of the status of functional and non-functional requirements, respectively. The following symbols are used to denote the status of each requirement:

- $success$: requirement fulfilled,
- $semisuccess$: requirement partially fulfilled,
- $failure$: requirement not fulfilled.

#figure(
  table(
    columns: (30mm, 1fr, auto),
    align: (center, left, center),
    table.header([Requirement], [Description], [Status]),
    link(<fr1>)[FR1], [*Collaboration*: users see each other's changes], [$success$],
    link(<fr2>)[FR2], [*Realtime*: local changes are reflected immediately], [$success$],
    link(<fr3>)[FR3], [*Unconstrained Editing*: users can edit any part of the diagram], [$success$],
    link(<fr4>)[FR4], [*Eventual Consistency*: users eventually converge], [$success$],
    link(<fr5>)[FR5], [*Intention Preservation*: changes maintain intended effect], [$semisuccess$],
    link(<fr6>)[FR6], [*Conflict Resolution*: independent changes don't conflict], [$success$],
    link(<fr7>)[FR7], [*Update Notification*: consumer code gets updates], [$success$],
    link(<fr8>)[FR8], [*Receiving Updates*: consumer code can feed remote updates into Apollon], [$success$],
  ),
  caption: [Status of functional requirements]
) <fr-status>

Most functional requirements are fulfilled by the implementation of the proposed system and its integration into the Artemis and Apollon Standalone platforms, as detailed in @eval. As discussed in that section, and specifically as elaborated in @int-prev, while the intention behind changes is preserved in most cases, there are example scenarios where it is violated. While, as described in @subsystems, the implementation employs other mechanisms to compensate for such cases, it can be argued that the realtime collaboration system did not fully materialise this particular requirement.
#figure(
  table(
    columns: (30mm, 1fr, auto),
    align: (center, left, center),
    table.header([Requirement], [Description], [Status]),
    link(<nfr1>)[NFR1], [*Usability*: realtime collab as smooth as local editing], [$success$],
    link(<nfr2>)[NFR2], [*Reliability*: fault-tolerance towards network partitions], [$failure$],
    link(<nfr3>)[NFR3], [*Reliability*: fault-tolerance towards network unreliability], [$semisuccess$],
    link(<nfr4>)[NFR4], [*Reliability*: peers converge strongly], [$semisuccess$],
    link(<nfr5>)[NFR5], [*Performance*: updates deliver quickly], [$success$],
    link(<nfr6>)[NFR6], [*Performance*: optimised resource usage], [$semisuccess$],
    link(<nfr7>)[NFR7], [*Performance*: optimised message size], [$success$],
    link(<nfr8>)[NFR8], [*Supportability*: easy integration], [$success$],
    link(<nfr9>)[NFR9], [*Supportability*: update message interop.], [$success$],
    link(<nfr10>)[NFR10], [*Supportability*: independent subsystem], [$success$],
    link(<nfr11>)[NFR11], [*Supportability*: easy extension of new diagram types], [$semisuccess$],
  ),
  caption: [Status of non-functional requirements]
) <nfr-status>

#link(<nfr2>)[NFR2] is considered not fulfilled: although the proposed solution can, in theory, be used in a manner that achieves fault tolerance towards network partitions (see @g-eval), neither the actual implementation nor the integrations utilise such a mechanism. This particular non-functional requirement was deprioritized in favor of the supportability goals, as detailed in @trade-offs. The same trade-off resulted in the deprioritization of #link(<nfr3>)[NFR3] and #link(<nfr4>)[NFR4], which are considered partially fulfilled: the implemented system does provide strong convergence and fault tolerance towards some degree of out-of-order delivery and message loss, while achieving ideal tolerance is delegated to the integrating platforms (see @g-eval).

#link(<nfr6>)[NFR6] is considered partially fulfilled: although the overheads associated with commonly used realtime collaboration solutions such as CRDTs and Operational Transformation are avoided, the present work still lacks a proper analysis of the performance of the system, due to the limited time scope of the work. #link(<nfr11>)[NFR11] is also considered partially fulfilled: although the addition of new diagram types is mostly unaffected by the realtime collaboration system, and the system supports such extensions out of the box, the ideal of "zero awareness" (required from developers) could not be achieved, and developers of such extensions need to consider the guidelines provided in @diagram-schema.

=== Realized Goals <r-goals>

Of the targeted objectives of the work, design and implementation are considered complete, providing a simple, easily integratable and extensible realtime collaboration system that achieves strong convergence with minimal work required by integrating platforms, and that can easily be expanded to behave in a more fault-tolerant manner (see @eval).

The integration goal is considered partially achieved: integration of the realtime collaboration system into the Artemis and Apollon Standalone projects was implemented, successfully acting as a testbed to ensure the simplicity of integrating the system; however, there is still room for integration into a more scalable remote persistence layer, which might come with its own unique challenges.

The goal of providing a theoretical analysis is also considered mostly achieved.
The provided analysis outlines a generalised framework for assessing the proposed solution, similar solutions, and their relation to CRDTs, alongside proofs of the consistency properties achieved in various network topologies and conditions.

=== Open Goals <o-goals>

As mentioned above, the objective of integrating with a more scalable remote persistence system, and of resolving all potential challenges that come with it, remains open. This goal was not achieved due to the time constraints of the project and is left for future work.

Additionally, as described in @nfr-status, several requirements related to the reliability design goal were only partially realised. This is due to the design decision of prioritising extensibility and portability of the solution over strong fault tolerance; the flexibility to easily add mechanisms providing such fault tolerance was nevertheless retained, as detailed in @trade-offs.

== Conclusion <conclusion>

Through this thesis, realtime collaboration capabilities for Apollon were designed and implemented, supporting straightforward integration into platforms aiming to utilise Apollon, as demonstrated by the integrations with Apollon Standalone and Artemis. The implemented solution provides a simple yet easily integratable and extensible system that achieves strong consistency properties without coupling consumer code or future Apollon extensions to the system, while providing flexibility for various circumstances and requirements.

The present work also provides a theoretical analysis of the proposed solution, establishing a generalised framework for assessing the realtime collaboration system of Apollon alongside any similar systems, outlining theoretically sufficient conditions for designing and implementing such systems, and evaluating the consistency properties such systems would yield in various architectures and under various network conditions. The analysis also draws a connection between the proposed solution and CRDTs, establishing a guideline for relaxing the constraints of CRDTs in a granular manner to achieve simplicity and performance.

== Future Work <future-work>

As outlined in the previous sections, the present thesis highlights several areas for improvement and further work, both with regard to realtime collaboration on UML diagrams and similarly structured data in general, and with regard to Apollon specifically.

=== Integration

Integration into a wider variety of platforms can help further refine the requirements and the realtime collaboration features provided by Apollon itself. Of particular interest would be integration into a system requiring scalable persistence solutions, as it would highlight requirements in the areas of applying the update messages provided by Apollon as atomic operations on partitioned databases, and of designing architectures for fault-tolerant and consistent persistence of data in the face of high-frequency updates.

Another area of interest would be integration into more distributed platforms, which would help further analyse the requirements of fault tolerance and consistency, and could result in a realtime collaboration system that can be used out of the box in a wider variety of platforms, assuming the overheads associated with such solutions can be overcome.

=== Functionality Extension

Another area for potential improvement is extending the functionality of the realtime collaboration system within Apollon itself.
For example, better support for partition-recovery mechanisms could further reduce the complexity of implementing more robust realtime collaboration in integrating platforms. Additionally, improved intention preservation, specifically with regard to special cases such as the paths of UML relationships, could result in a smoother and more intuitive user experience. Such work would also require further investigation into how such mechanisms can be implemented within the constraints and design goals outlined in this thesis, or into which trade-offs should be reconsidered to accommodate them.

=== Analysis

As mentioned in @limits, the present thesis also leaves considerable room for further theoretical analysis. A more in-depth performance analysis of the proposed solution could provide further insight into the validity of the trade-offs made in favor of simplicity and performance over fault tolerance. Additionally, further theoretical investigation could shed more light on the connection between these trade-offs, for example by providing a more precise expression of the "simplicity" goal of this thesis, especially as simplicity is also cited as one of the main advantages of CRDTs over similar solutions @crdts, @ot-v-crdt. Further theoretical work could also provide a more concrete framework for relaxing various consistency constraints under various network conditions, and for characterising how such relaxations affect performance.
https://github.com/Az-21/typst-material-you
https://raw.githubusercontent.com/Az-21/typst-material-you/main/Preview/preview.typ
typst
Apache License 2.0
#import "m3.typ": m3light, m3dark #set page(paper:"a4", margin: 32pt) #let m3box(color) = box(width: 32pt, height: 32pt, fill: color, radius: 2pt, stroke: 1pt) $ overbracket( #m3box(m3light.primary) #m3box(m3light.onPrimary) #m3box(m3light.primaryContainer) #m3box(m3light.onPrimaryContainer), "Light Primary" ) #h(8pt) overbracket( #m3box(m3light.secondary) #m3box(m3light.onSecondary) #m3box(m3light.secondaryContainer) #m3box(m3light.onSecondaryContainer), "Light Secondary" ) #h(8pt) overbracket( #m3box(m3light.tertiary) #m3box(m3light.onTertiary) #m3box(m3light.tertiaryContainer) #m3box(m3light.tertiaryContainer), "Light Tertiary" ) #h(8pt) overbracket( #m3box(m3light.background) #m3box(m3light.onBackground), "Light Background" ) $ // -- // $ underbracket( #m3box(m3dark.primary) #m3box(m3dark.onPrimary) #m3box(m3dark.primaryContainer) #m3box(m3dark.onPrimaryContainer), "Dark Primary" ) #h(8pt) underbracket( #m3box(m3dark.secondary) #m3box(m3dark.onSecondary) #m3box(m3dark.secondaryContainer) #m3box(m3dark.onSecondaryContainer), "Dark Secondary" ) #h(8pt) underbracket( #m3box(m3dark.tertiary) #m3box(m3dark.onTertiary) #m3box(m3dark.tertiaryContainer) #m3box(m3dark.tertiaryContainer), "Dark Tertiary" ) #h(8pt) underbracket( #m3box(m3dark.background) #m3box(m3dark.onBackground), "Dark Background" ) $
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cetz/0.3.0/src/lib/tree.typ
typst
Apache License 2.0
// CeTZ Library for Layouting Tree-Nodes #import "/src/util.typ" #import "/src/draw.typ" #import "/src/coordinate.typ" #import "/src/vector.typ" #import "/src/matrix.typ" #import "/src/process.typ" #import "/src/anchor.typ" as anchor_ #let typst-content = content // Default edge draw callback // // - from (string): Source element name // - to (string): Target element name // - parent (node): Parent (source) tree node // - child (node): Child (target) tree node #let default-draw-edge(from, to, parent, child) = { draw.line(from, to) } // Default node draw callback // // - node (node): The node to draw #let default-draw-node(node, _) = { let text = if type(node) in (content, str, int, float) { [#node] } else if type(node) == dictionary { node.content } draw.get-ctx(ctx => { draw.content((), text) }) } /// Lays out and renders tree nodes. /// /// For each node, the `tree` function creates an anchor of the format `"node-<depth>-<child-index>"` that can be used to query a nodes position on the canvas. /// /// ```typc example /// import cetz.tree /// set-style(content: (padding: .1)) /// tree.tree(([Root], ([A], [A.A], [A.B]), ([B], [B.A]))) /// ``` /// /// - root (array): A nested array of content that describes the structure the tree should take. Example: `([root], [child 1], ([child 2], [grandchild 1]))` /// - draw-node (auto,function): The function to call to draw a node. The function will be passed two positional arguments, the node to draw and the node's parent, and is expected to return elements (`(node, parent-node) => elements`). The node's position is accessible through the "center" anchor or by using the previous position coordinate `()`. If `auto` is given, just the node's value will be drawn as content. The following predefined styles can be used: /// - draw-edge (none,auto,function): The function to call draw an edge between two nodes. The function will be passed the name of the starting node, the name of the ending node, and the end node and is expected to return elements (`(source-name, target-name, target-node) => elements`). If `auto` is given, a straight line will be drawn between nodes. /// - direction (str): A string describing the direction the tree should grow in ("up", "down", "left", "right") /// - parent-position (str): Positioning of parent nodes (begin, center, end) /// - grow (float): Depth grow factor /// - spread (float): Sibling spread factor /// - name (none,str): The tree element's name #let tree( root, draw-node: auto, draw-edge: auto, direction: "down", parent-position: "center", grow: 1, spread: 1, name: none ) = { assert(parent-position in ("begin", "center","end", "after-end")) assert(grow > 0) assert(spread > 0) direction = ( up: "north", down: "south", right: "east", left: "west" ).at(direction) if draw-edge == auto { draw-edge = default-draw-edge } else if draw-edge == none { draw-edge = (..) 
=> () } if draw-node == auto { draw-node = default-draw-node } assert(draw-node != none, message: "Node draw callback must be set!") let build-node(tree, depth: 0, sibling: 0) = { let children = () let content = none if type(tree) == array { children = tree.slice(1).enumerate().map( ((n, c)) => build-node(c, depth: depth + 1, sibling: n) ) content = tree.at(0) } else { content = tree } return ( x: 0, y: depth * grow, n: sibling, depth: depth, children: children, content: content ) } // Layout node recursive // // return: // (node, left-x, right-x) let layout-node(node, shift-x) = { if node.children.len() == 0 { node.x = shift-x return (node, node.x, node.x) } else { let (min-x, max-x) = (none, none) let (left, right) = (none, none) let n-children = node.children.len() for i in range(0, n-children) { let child = node.children.at(i) let (child-min-x, child-max-x) = (none, none) (child, child-min-x, child-max-x) = layout-node(child, shift-x) node.children.at(i) = child left = util.min(child.x, left) right = util.max(child.x, right) min-x = util.min(min-x, child-min-x) max-x = util.max(max-x, child-max-x) shift-x = child-max-x + spread } if parent-position == "begin" { node.x = left } else if parent-position == "center" { node.x = left + (right - left) / 2 } else if parent-position == "end" { node.x = right } else { //after-end node.x = right+spread max-x = max-x + spread } node.direct-min-x = left node.direct-max-x = right node.min-x = min-x node.max-x = max-x return (node, min-x, max-x) } } let node-position(node) = { if direction == "south" { return (node.x, -node.y) } else if direction == "north" { return (node.x, node.y) } else if direction == "west" { return (-node.y, node.x) } else if direction == "east" { return (node.y, node.x) } else { panic(message: "Invalid tree direction.") } } let anchors(node, parent-path) = { if parent-path != none { parent-path += "-" } else { parent-path = "" } let d = (:) d.insert(parent-path + str(node.n), node-position(node)) for child in node.children { d += anchors(child, parent-path + str(node.n)) } return d } let build-element(node, parent-name) = { let name = if parent-name != none { parent-name + "-" + str(node.n) } else { "0" } // Render element node.name = name node.group-name = "g" + name node.element = { draw.anchor(node.name, node-position(node)) draw.group(name: node.group-name, { draw.move-to(node-position(node)) draw.anchor("default", ()) draw-node(node, parent-name) }) } // Render children node.children = node.children.map(c => build-element(c, name)) // Render edges node.edges = if node.children != () { draw.group({ for child in node.children { draw-edge(node.group-name, child.group-name, node, child) } }) } else { () } return node } let root = build-node(root) let (nodes, ..) = layout-node(root, 0) let node = build-element(nodes, none) // Render node recursive let render(node) = { if node.element != none { node.element if "children" in node { for child in node.children { render(child) } } node.edges } } draw.group(name: name, render(node)) }
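As a quick orientation for readers of this file, here is a hedged usage sketch that exercises the `direction`, `draw-node`, and `draw-edge` parameters documented above. It is not taken from the CeTZ documentation; the node radius, spacing, and callback bodies are arbitrary choices, and it assumes the package is importable as `@preview/cetz:0.3.0`, matching the path above.

```typ
#import "@preview/cetz:0.3.0"

#cetz.canvas({
  import cetz.draw: *
  // A small tree grown to the right with circled nodes and plain straight edges.
  cetz.tree.tree(
    ([Root], ([A], [A.A], [A.B]), ([B], [B.A])),
    direction: "right",
    grow: 1.5,
    spread: 1,
    draw-node: (node, parent) => {
      circle((), radius: 0.35, fill: white)  // node outline at the node position
      content((), node.content)              // the node's own content on top
    },
    draw-edge: (from, to, parent, child) => line(from, to),
  )
})
```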
https://github.com/7sDream/fonts-and-layout-zhCN
https://raw.githubusercontent.com/7sDream/fonts-and-layout-zhCN/master/chapters/06-features-2/positioning/positioning.typ
typst
Other
#import "/template/template.typ": web-page-template #import "/template/components.typ": note #import "/lib/glossary.typ": tr #show: web-page-template // Types of Positioning Rule == 各种类型的#tr[positioning]规则 <section:positioning-rule-types> // After all the substitution rules have been processed, we should have the correct sequence of glyphs that we want to lay out. The next job is to run through the lookups in the `GPOS` table in the same way, to adjust the positioning of glyphs. We have seen example of single and pair positioning rules:. We will see in this section that a number of other ways to reposition glyphs are possible. 在所有#tr[substitution]规则处理完后,我们应该就得到了字体可以正确处理的#tr[glyph]序列。下一项工作是按照相同的方式运行一遍`GPOS`表中的所有#tr[lookup],来调整#tr[glyph]的位置。我们已经见过单#tr[character]和字偶对的#tr[positioning]规则了,在本节中会介绍将#tr[glyph]重新#tr[positioning]的其他各种方式。 // To be fair, most of these will be generated more easily and effectively by the user interface of your font editor - but not all of them. Let's dive in. 客观的说,大部分类型的规则通过使用字体编辑软件中的UI界面可以更方便的生成,但也不是所有类型都这样。让我们开始吧。
https://github.com/donabe8898/typst-slide
https://raw.githubusercontent.com/donabe8898/typst-slide/main/opc/単発/IRC/main.typ
typst
MIT License
#import "@preview/polylux:0.3.1": * #import themes.metropolis: * #show: metropolis-theme.with( aspect-ratio: "16-9", footer: [IRC], // short-author: "donabe8898", // short-date: none, // color-a: rgb("#0C6291"), // color-b:rgb("#A63446"), // color-c:rgb("#FBFEF9"), // progress-bar: true ) #show link: set text(blue) #set text(font: "Noto Sans CJK JP",size:18pt) #show heading: set text(font: "Noto Sans CJK JP") #show raw: set text(font: "Noto Sans Mono CJK JP") #show raw.where(block: true): block.with( fill: luma(240), inset: 10pt, radius: 4pt ) // タイトル #title-slide( title: "IRCで遊ぼう", subtitle: "インターネットチャットの原点", author: "<NAME>", date: "2024-1-1", extra: "OECUプログラミングサークル" ) #slide(title: "本講義の目的")[ - IRCで遊びます ] #focus-slide()[みなさん、チャットアプリ何使っていますか?] #slide(title: "IRC")[ - Internet Relay Chat 👀 - サーバーを経由してクライアント同士でチャトができる仕組み - OSI参照モデル → アプリケーション層 - 1988年にネット掲示板で使われていた技術の代用として開発 *LINEやDiscordの先祖* ] #slide(title:"特徴")[ + 中央集権 - 1サーバーに複数のチャンネルを作成してユーザーがそこに入る #v(1em) + 連携 - サーバー同士を連携させることが可能 - 連携先と連携者のサーバーで同一のチャンネルが作成 - 片方のサーバーが落ちてももう片方でチャットを継続 #v(1em) + 秘匿性 - サーバーにログインしないとメッセージが閲覧できない - サーバーにメッセージが残らない→ログイン以前のやりとりが見えない - 通信内容は暗号化されない - TLSで暗号化することも可能 ] #slide(title: "使ってみよう")[ = クライアントソフトウェア #v(1em) - *HexChat* (Win, Linux) - Pidgin (BSD, Linux, Solaris) - LimeChat (Win, mac) - 「IRC client」で検索すると出てくる お好きなものをインストールしよう ] #slide(title: "サーバーにログイン")[ + Network Listで「追加」を押す + ネットワークの名前を決める(各自のお好みで) + ネットワークを選択したまま「編集」を押す + 「サーバー」タブのipとポートを`irc.donabe8898.dev/6697`に設定 + 「このネットワークのすべてのサーバーへはSSLを使う」にチェックを入れる + 「不正なSSL証明書を受け入れる」にチェックを入れる + 「パスワード」を`serverDona9118`に設定する + 閉じて接続 ] // 本講義の目的 // 1. クライアントサーバーシステムの基礎知識 // 2. IRCサーバーの仕組み // IRCとは // - 1980年代後半からすでに存在した // - 2020現在でもインターネット上の様々なコミュニティで活用されている // IRCプロトコルの特徴 // - 中央集権 & リレー // - インターネット掲示板とは違って、ログインしないとメッセージが見れない // IRCサーバー // - クライアントはIRCサーバーにログインすることで、チャンネルの設立が可能になる // - チャンネルに対して1つのメッセージを送ると, サーバーはチャンネルに所属するユーザー全員にメッセージを転送する // - チャンネルに入ってないとメッセージは見れない // IRCクライアント // 様々なクライアントがある // - コマンドライン上で動くものから, GUIを含むリッチなものまで // - OSを問わず様々なクライアントが存在する // 実際に使ってみましょう // 1. クライアントを用意 // - windows: LimeChat // - macOS: LimeChat // - GNU/Linux: pidgin // スライドは以上です
https://github.com/WinstonMDP/knowledge
https://raw.githubusercontent.com/WinstonMDP/knowledge/master/SLAE.typ
typst
#import "cfg.typ": cfg #show: cfg = Система линейных алгебраических уравнений $(a_(i j) | b_i)$ - расширенная матрица системы. Совместная СЛАУ - это СЛАУ, у которой существует решение. Определённая СЛАУ - это СЛАУ, у которой решение единственно. Элементарное преобразование типа I - это поменять местами две строки СЛАУ. Элементарное преобразование типа II - это прибавить к строке другую, умноженную на константу. Эквивалентные СЛАУ ($a tilde b$) - это СЛАУ, у которых одни и те же решения. Две системы эквивалентны, если одна получается из другой элементарными преобразованиями. Главные неизвестные - это первые ненулевые элементы в строках. Свободные неизвестные - это не главные неизвестные. Ступенчатый вид - это вид, при котором под главными неизвестными все нули и номера главных неизвестных в строке возрастают при увеличении номера строки. Всякую матрицу можно элементарными преобразованиями привести к ступенчатому виду.
https://github.com/agarmu/typst-templates
https://raw.githubusercontent.com/agarmu/typst-templates/main/notes/main.typ
typst
MIT License
#import "template.typ": * #import "theorems.typ": * #import "symbols.typ": * #show: thmrules #show: ilm.with( title: [Filename], author: "<NAME>", date: none, abstract: [], preface: [], paper-size: "us-letter", figure-index: (enabled: true), table-index: (enabled: true), listing-index: (enabled: false) ) = Sample Heading!! #lorem(30) == Sample Subheading!! #lorem(60) #theorem("No largest prime")[ There is no largest prime. ] #proof[ We prove by contradiction. Assume that there is a largest prime. Hence there is a set of primes $P = {2, 3, 5, ... , p_n} subset NN$. Then consider: $ q = product_(p in P) p $ As $|P|$ is finite, $q$ must be too. Further note that $forall p in P, p divides q$. Hence $forall p in P, p divides.not (q + 1)$. So $q + 1$ is coprime to all primes and must therefore also be prime, a contradiction. ] #corollary[ There are infinitely many primes. ]
https://github.com/thanhdxuan/dacn-report
https://raw.githubusercontent.com/thanhdxuan/dacn-report/master/Lab02/contents/02-introduce.typ
typst
#set enum(numbering: "a)")

= *Asymmetric ciphers*

== Question 1: What are the roles of the public and private keys in a public-key cryptosystem used for encryption?

- _Private key_: the private key is used to decrypt data that someone has encrypted <KEY> of one's own and then sent to us.
- _Public key_: the public key is used to encrypt a message intended for someone else; the public key used is that of the recipient of the data.

== Question 2: Perform the computations: encrypt and decrypt the message using the RSA algorithm for each case below:

+ $p=3;q=11,e=7;M=5$
  - $n = p * q = 3 * 11 = 33$
  - $phi(n) = (p - 1)*(q - 1) = 20$
  - $e = 7$
  - $d = 3$ satisfies $(d * e) mod phi(n) = 1$
  - public key: ( e, n = 7, 33 )
  - private key: ( d, n = 3, 33 )
  *Encryption*: $C = M^e mod n = 5^7 mod 33 = 14$\
  *Decryption*: $M = C^d mod n = 14^3 mod 33 = 5$
+ $p=5;q=11,e=3;M=9$
  - $n = p * q = 5 * 11 = 55$
  - $phi(n) = (p - 1)*(q - 1) = 40$
  - $e = 3$
  - $d = 27$ satisfies $(d * e) mod phi(n) = 1$
  - public key: ( e, n = 3, 55 )
  - private key: ( d, n = 27, 55 )
  *Encryption*: $C = M^e mod n = 9^3 mod 55 = 14$\
  *Decryption*: $M = C^d mod n = 14^27 mod 55 = 9$
+ $p=7;q=11,e=17;M=8$
  - $n = p * q = 7 * 11 = 77$
  - $phi(n) = (p - 1)*(q - 1) = 60$
  - $e = 17$
  - $d = 53$ satisfies $(d * e) mod phi(n) = 1$
  - public key: ( e, n = 17, 77 )
  - private key: ( d, n = 53, 77 )
  *Encryption*: $C = M^e mod n = 8^17 mod 77 = 57$\
  *Decryption*: $M = C^d mod n = 57^53 mod 77 = 8$
+ $p=11;q=13,e=11;M=7$
  - public key: ( e, n = 11, 143 )
  - private key: ( d, n = 11, 143 )
  *Encryption*: $C = M^e mod n = 7^11 mod 143 = 106$\
  *Decryption*: $M = C^d mod n = 106^11 mod 143 = 7$
+ $p=17;q=31,e=7;M=2$
  - public key: ( e, n = 7, 527 )
  - private key: ( d, n = 343, 527 )
  *Encryption*: $C = M^e mod n = 2^7 mod 527 = 128$\
  *Decryption*: $M = C^d mod n = 128^343 mod 527 = 2$

== Question 3: Suppose that, in a public-key system using RSA, you learn of a ciphertext C = 10 sent to someone whose public key is e = 5, n = 35. Can this information be used to recover the original message (M)? State each step and explain.

Prime factorisation: $n = 5 * 7$ $==>$ $phi(n) = (5 - 1) * (7 - 1) = 24$

We have $(d * e) mod phi(n) = 1$ $==>$ we need to solve the equation $(d * 5) mod 24 = 1$. Using the *Extended Euclidean* algorithm we obtain $d = 5$.

So the private key is (d, n) = (5, 35) $==>$ $M = C^d mod n = 10^5 mod 35 = 5$.

== Question 4: In an application of a public-key system using RSA, we know that a member is using the public key e = 31, n = 3599. Can we find this member's private key? State each step and explain.

Trial division by the primes smaller than 60 shows that $n = 3599 = 59 * 61$, so $(p, q)$ is either $(59, 61)$ or $(61, 59)$.

We have $phi(n) = (p - 1) * (q - 1) = 3480$.

To find $d$ we solve the equation $(31 * d) mod 3480 = 1$; using the *Extended Euclidean* algorithm we obtain $d = 3031$.

#image("/images/1c4.jpeg")

So the private key of this member is (3031, 3599).
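The modular exponentiations above are easy to double-check mechanically. The following is a small Typst sketch (not part of the original lab; the helper name is ours) implementing square-and-multiply with the built-in `calc.rem` and `calc.quo`:

```typ
// Square-and-multiply modular exponentiation (illustrative helper, not from the lab).
#let pow-mod(base, exp, modulus) = {
  let result = 1
  let b = calc.rem(base, modulus)
  let e = exp
  while e > 0 {
    if calc.rem(e, 2) == 1 { result = calc.rem(result * b, modulus) }
    b = calc.rem(b * b, modulus)
    e = calc.quo(e, 2)
  }
  result
}

// Question 2(a): encrypt M = 5 under (e, n) = (7, 33), then decrypt with d = 3.
#pow-mod(5, 7, 33) // 14
#pow-mod(14, 3, 33) // 5
```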
https://github.com/erfan-khadem/resume
https://raw.githubusercontent.com/erfan-khadem/resume/main/modules_en/education.typ
typst
Apache License 2.0
#import "../brilliant-CV/template.typ": * #cvSection("Education") #cvEntry( title: [Bachelors of Science in Electrical Engineering], society: [Sharif University of Technolog], date: [2022 - 2026], location: [Tehran, Iran], logo: "../src/logos/sharif.png", description: list( [Optional Courses: Realtime Embedded Systems #hBar() Operating Systems #hBar() Cryptography #hBar() Advanced Programming], [GPA: 18.34 #hBar() Department Average: 15.72 / 20] ) ) #cvEntry( title: [Highschool Diploma in Physics and Mathematics], society: [Dastgheib 1 Highschool of Shiraz], date: [2019 - 2022], location: [Shiraz, Iran], logo: "../src/logos/nodet.png", description: list( [Sh<NAME> 1 highschool is under the supervision of NODET, National Organization for Development of Exceptional Talents.], [GPA: 19.75 / 20] ) )
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/flyingcircus/3.0.0/src/Impl.typ
typst
Apache License 2.0
#import "@preview/cetz:0.2.2" #import "@preview/cuti:0.2.1": regex-fakeitalic #import "@preview/tablex:0.0.8": gridx, cellx, hlinex, vlinex /// Sets the tex to the Koch Fette FC font for people who don't want to remember that. /// /// - body (content) /// - ..args: Any valid argument to the text function /// -> content #let KochFont(body, ..args) = text(..args)[#text(font: "Koch Fette FC")[#body]] #let defaultImg = read("images/Default.png", encoding: none) /// Defines the FlyingCircus template /// /// - Title (str): Title of the document. Goes in metadata and on title page. /// - Author (str): Author(s) of the document. Goes in metadata and on title page. /// - CoverImg (bytes): Image to make the first page of the document. /// - Description (str): Text to go with the title on the title page. /// - Dedication (str): Dedication to go down below the title on the title page. /// - body (content) /// -> content #let FlyingCircus(Title: str, Author: str, CoverImg: none, Desciption: "Aircraft Catalogue", Dedication: "", body) = { //Set PDF Metadata set document(title: Title, author: Author) //Set Default Font document settings set text(font: "Balthazar", size: 14pt) show par: set block(spacing: 0.5em) set par(leading: 0.35em) set par(justify: true) set list(indent: 1em) set enum(indent: 1em) set page(numbering: "1") //Create fake italics because fonts don't have it. show emph: it => { regex-fakeitalic(it.body) } //Replace Synchronization Marker show regex("✣"): text(font: "Wingdings", [Ë]) //Create default page format set page( paper: "a4", //Header is alternating directions and the centered "Flying Circus" with border header: context{ align(center, cetz.canvas(length: 100%, { import cetz.draw:* set-style(stroke: (paint: black, thickness: 0.75mm)) line((-.5, 0), (.5, 0)) circle((-.5, 0), radius: 1.5mm, fill: black) circle((.5, 0), radius: 1.5mm, fill: black) if (calc.rem(counter(page).get().first(), 2) == 1) { for x in range(-490, 480, step: 13){ line((x / 1000, -2mm), ((x + 20) / 1000, 2mm)) } } else { for x in range(-470, 500, step: 13){ line((x / 1000, -2mm), ((x - 20) / 1000, 2mm)) } } content((0, 1pt), KochFont(stroke: 5pt + white)[Flying Circus]) content((0, 1pt), KochFont(stroke: 0.5pt + black)[Flying Circus]) })) }, //Footer is alternating directions with page number at outside and only partial bar footer: context{ align(center, cetz.canvas(length: 100%, { import cetz.draw:* set-style(stroke: (paint: black, thickness: 0.75mm)) if (calc.rem(counter(page).get().first(), 2) == 1) { content((-.5, 0), KochFont(size: 18pt)[#counter(page).get().first()]) line((-.45, 0), (.5, 0)) circle((-.45, 0), radius: 1.5mm, fill: black) circle((.5, 0), radius: 1.5mm, fill: black) for x in range(-440, 480, step: 13){ line((x / 1000, -2mm), ((x + 20) / 1000, 2mm)) } } else { content((.5, 0), KochFont(size: 18pt)[#counter(page).get().first()]) line((-.5, 0), (.45, 0)) circle((-.5, 0), radius: 1.5mm, fill: black) circle((.45, 0), radius: 1.5mm, fill: black) for x in range(-470, 450, step: 13){ line((x / 1000, -2mm), ((x - 20) / 1000, 2mm)) } } })) }, //Margins, duh. 
margin: (top: 0.5in, bottom: 0.75in, left: 0.75in, right: 0.75in), ) //Place CoverImage, if it exists if (CoverImg != none) { page(paper: "a4", header: none, footer: none, margin: 0pt)[ #set image(height: 100%, fit: "stretch") #CoverImg ] } //Create Title Page (and eventual TOC) let onepage(body) = page( //Header is evenly LtR, no text header: context{ align(center, cetz.canvas(length: 100%, { import cetz.draw:* set-style(stroke: (paint: black, thickness: 0.75mm)) line((-.45, 0), (.45, 0)) circle((-.45, 0), radius: 1.5mm, fill: black) circle((.45, 0), radius: 1.5mm, fill: black) for x in range(-440, 430, step: 13){ line((x / 1000, -2mm), ((x + 20) / 1000, 2mm)) } })) }, //Footer is smaller and even LtR footer: context{ set align(center) cetz.canvas(length: 100%, { import cetz.draw:* set-style(stroke: (paint: black, thickness: 0.75mm)) line((-.25, 0), (.25, 0)) circle((-.25, 0), radius: 1.5mm, fill: black) circle((.25, 0), radius: 1.5mm, fill: black) for x in range(-240, 230, step: 13){ line((x / 1000, -2mm), ((x + 20) / 1000, 2mm)) } }) }, body, ) onepage[ #set align(center) //The Big FC #KochFont(size: 75pt)[ Flying\ #box(rotate(90deg)[#scale(x: -100%)[$ theta.alt $]]) Circus #box(rotate(-90deg)[$ theta.alt $]) ] //Title and description #text(size: 24pt)[ #Desciption \ #Title ] //Fancy swirls #text( size: 75pt, )[ #box(rotate(-90deg, reflow: true)[#scale(y: 300%, reflow: true)[$ sigma.alt $]]) #box(rotate(-90deg, reflow: true)[#scale(y: -300%, reflow: true)[$ sigma.alt $]]) ] #v(1fr) //Dedication is optional #Dedication #v(2fr) //And the Author, of course #text(size: 24pt)[ #Author #v(1em) ] ] //Style outline // show outline.title: it => { // text // } show outline.entry.where(level: 1): it => { v(10pt, weak: true) text(size: 20pt, it) } onepage[#align( center, box( width: 80%, outline(depth: 10, indent: 1em, title: KochFont(size: 50pt, fill: white, stroke: black + 1pt)[Contents #h(1fr)]), ), )] //Set up headings (gotta be styled after #outline) show heading: it => { KochFont[#align(left)[#it.body]] } show heading.where(level: 1): it => { KochFont(size: 24pt)[#align(left)[#it.body]] v(-1.3em) line(length: 100%, stroke: 1pt + luma(100)) } show heading.where(level: 2): it => { KochFont(size: 24pt)[#align(left)[#it.body]] } show heading.where(level: 3): it => { KochFont(size: 18pt)[#align(left)[#it.body]] } //Reset Page counter to 1, and let's go! counter(page).update(1) body } #let MaybeImage(img, ..args) = if (img != none) { [ #set image(..args) #img ] } /// Defines the FlyingCircus Plane page. Always on a new page. Image optional. /// /// - Plane (str | dictionary): JSON string or dictionary representing the plane stats. /// - Nickname (str): Nickname to go under the aircraft name. /// - Img (bytes | none): Image to go at the top of the page. Set to none to remove. /// - BoxText (dictionary): Pairs of values to go in the box over the image. Does nothing if no Img provided. /// - BoxAnchor (str): Which anchor of the image to put the box in? Sample values are "north", "south-west", "center". 
/// - DescriptiveText (content) /// -> content #let FCPlane(Plane, Nickname: str, Img: defaultImg, BoxText: none, BoxAnchor: "north", DescriptiveText) = { pagebreak(weak: true) let plane = Plane //Read in json.decode file if it's a path if (type(Plane) == str) { plane = json.decode(Plane) } //Define image element let plane_image = if (Img != none) { context cetz.canvas( length: 100%, { import cetz.draw:* content((0.5, 1), anchor: "north", MaybeImage(Img, width: page.width * 0.95, fit: "stretch"), name: "image") if (BoxText != none) { content("image." + BoxAnchor, anchor: BoxAnchor, padding: 5mm, align(center)[ #let cells = (hlinex(),) #for (k, v) in BoxText { cells.push(text(size: 12pt, k)) cells.push(text(size: 12pt, v)) cells.push(hlinex()) } #gridx(columns: 2, fill: white.transparentize(50%), stroke: luma(170), vlinex(x: 0), vlinex(x: 2), ..cells) ]) } }, ) } //Define title element let plane_title = { set text(fill: luma(100)) set block(spacing: 0em) let hN = plane.keys().contains("Price") let hU = plane.keys().contains("Used") text(size: 20pt)[#plane.Name];h(1fr); if (hN) { [#plane.Price;þ New] }; if (hN and hU) { [, ] }; if (hU) { [#plane.Used;þ Used] } line(length: 100%, stroke: luma(100)) [_\"#Nickname\"_ #h(1fr); #plane.Upkeep;þ] } //Read through the stat rows and push them into cells let cells = () for row in plane.Stats { for (k, v) in row { if (k == "Name") { cells.push(table.cell([#v], align: right, stroke: none)) } else { cells.push(table.cell([#v])) } } } //Construct the stats table element let statTable = table( columns: 6, align: center, table.cell(stroke: none)[], table.cell(stroke: none)[Boost], table.cell(stroke: none)[Handling], table.cell(stroke: none)[Climb], table.cell(stroke: none)[Stall], table.cell(stroke: none)[Speed], ..cells, ) //Define the vital parts table element let vitalTable = table( columns: (100%), rows: (auto, 1fr), align: center + horizon, table.header(table.cell(stroke: none)[Vital Parts]), [#plane.at("Vital Parts") \ #plane.Crew], ) //Define the secondary stats table element let miscTable = table( columns: (100%), align: center + horizon, [#plane.Propulsion], [#plane.Aerodynamics], [#plane.Survivability], table.cell(align: left + horizon, [#plane.Armament]), ) place(top + left, box(width: 0pt, height: 0pt, hide[= #plane.Name])) grid( columns: 1, rows: (auto, 1fr, auto), grid.cell( align: center, [ #plane_image #plane_title #v(-1em) #context stack(dir: ltr, spacing: 1%, box(width:59%, height:measure(statTable).height, statTable), box(width: 40%, height: measure(statTable).height, vitalTable)) #v(-1em) #miscTable ], ), grid.cell(columns(2)[#DescriptiveText], inset: (y: 0.5em)), grid.cell(align(center)[#text(size: 24pt)[#underline[#link(plane.Link)[Plane Builder Link]]]]), ) } /// Defines the FlyingCircus Simple Vehicle. Not always a full page. Image optional. /// /// - Vehicle (str | dictionary): JSON string or dictionary representing the Vehicle stats. /// - Img (bytes): Image to go above the vehicle. 
/// - DescriptiveText (content) /// -> content #let FCVehicleSimple(Vehicle, Img: none, DescriptiveText) = { //Read in the Vehicle JSON file let vehicle = Vehicle; if (type(Vehicle) == str) { vehicle = json.decode(Vehicle) } //Define title element let veh_title = par(leading: -1em)[ #set text(fill: luma(100)) #set block(spacing: 0em) #link(vehicle.Link)[#text(size: 20pt)[#vehicle.Name]]; #h(1fr); #vehicle.Price;þ, #vehicle.Upkeep;þ Upkeep #line(length: 100%, stroke: luma(100)) ] //Define image element let veh_image = align(center)[ #MaybeImage(Img, width: 110%, fit: "stretch") ] //Define the stat table element let veh_stats = align(center)[#table( align: center + horizon, columns: (1fr, 1fr, 1fr, 1fr, 1fr, 1fr), [Speed], [#vehicle.Speed], [Torque], [#vehicle.Torque], [Handling], [#vehicle.Handling], [Armour], [#vehicle.Armour], [Integrity], [#vehicle.Integrity], [Safety], [#vehicle.Safety], [Reliability], [#vehicle.Reliability], [Fuel Uses], [#vehicle.at("Fuel Uses")], [Stress], [#vehicle.Stress], table.cell(colspan: 3, [#vehicle.Size]), table.cell(colspan: 3, [#vehicle.Cargo]), ) ] let cells = () for row in vehicle.Crew { for (idx, (k, v),) in row.pairs().enumerate() { if (idx == 0) { if (v.contains("Loader")) { cells.push(table.cell(stroke: none)[]) cells.push(table.cell([#v])) } else { cells.push(table.cell(colspan: 2)[#v]) } } else { cells.push(table.cell([#v])) } } } let veh_crew = table( align: (left, left, center, center, center, left), columns: (1em, auto, auto, auto, auto, 1fr), table.cell(stroke: none, []), table.cell(stroke: none, []), [Type], [Vis.], [Escape], [Notes], ..cells, ) place(top + left, box(width: 0pt, height: 0pt, hide[= #vehicle.Name])) if (Img != none) { veh_image v(-1em) } veh_title DescriptiveText veh_stats v(-1em) align(center)[#KochFont(size: 18pt)[Crew]] v(-1em) veh_crew } /// Defines the FlyingCircus Plane page. Always on a new page. Image optional. /// If the Img is provided, it will take up two facing pages, otherwise only one, but a full page, unlike the Simple. /// /// - Vehicle (str | dictionary): JSON string or dictionary representing the Vehicle stats. /// - Img (bytes | none): Image to go at the top of the first page. Set to none to remove. /// - TextVOffset (length): How far to push the text down the page. Want to do that inset text thing the book does? You can, the text can overlap with thte image. Does nothing if no Img provided. /// - BoxText (dictionary): Pairs of values to go in the box over the image. Does nothing if no Img provided. /// - BoxAnchor (str): Which anchor of the image to put the box in? Sample values are "north", "south-west", "center". /// - FirstPageContent (content): Goes on the first page. If no image is provided, it is not present. /// - AfterContent (content): Goes after the stat block. Always present. /// -> content #let FCVehicleFancy(Vehicle, Img: none, TextVOffset: 0pt, BoxText: none, BoxAnchor: "north", FirstPageContent, AfterContent) = { //Read in the Vehicle JSON file let vehicle; if (type(Vehicle) == str) { vehicle = json.decode(Vehicle) } //Define image element is done below because it needs context. 
//Define Firsttitle element let veh_title = par(leading: -1em)[ #set text(fill: luma(100), size: 24pt) #set block(spacing: 0em) #link(vehicle.Link)[#vehicle.Name] #line(length: 100%, stroke: luma(100)) ] //Define title element let veh_title2 = align(center)[#box(width: 70%)[#par(leading: -1em)[ #set text(fill: luma(100)) #set block(spacing: 0em) #link(vehicle.Link)[#text(size: 20pt)[#vehicle.Name]]; #h(1fr); #vehicle.Price;þ #line(length: 100%, stroke: luma(100)) #vehicle.Nickname;#h(1fr)Upkeep #vehicle.Upkeep;þ ]]] //Define the stat table element let veh_stats = align(center)[#box(width: 70%)[ #table( align: center + horizon, columns: (1fr, 1fr, 1fr, 1fr, 1fr, 1fr), table.cell(colspan: 2, [Speed]), table.cell(colspan: 2, [Torque]), table.cell(colspan: 2, [Handling]), table.cell(colspan: 2, [#vehicle.Speed]), table.cell(colspan: 2, [#vehicle.Torque]), table.cell(colspan: 2, [#vehicle.Handling]), table.cell(colspan: 2, [Armour]), table.cell(colspan: 2, [Integrity]), table.cell(colspan: 2, [Safety]), table.cell(colspan: 2, [#vehicle.Armour]), table.cell(colspan: 2, [#vehicle.Integrity]), table.cell(colspan: 2, [#vehicle.Safety]), table.cell(colspan: 2, [Reliability]), table.cell(colspan: 2, [Fuel Uses]), table.cell(colspan: 2, [Stress]), table.cell(colspan: 2, [#vehicle.Reliability]), table.cell(colspan: 2, [#vehicle.at("Fuel Uses")]), table.cell(colspan: 2, [#vehicle.Stress]), table.cell(colspan: 3, [#vehicle.Size]), table.cell(colspan: 3, [#vehicle.Cargo]), ) ]] let cells = () for row in vehicle.Crew { for (idx, (k, v),) in row.pairs().enumerate() { if (idx == 0) { if (v.contains("Loader")) { cells.push(table.cell(stroke: none)[]) cells.push(table.cell([#v])) } else { cells.push(table.cell(colspan: 2)[#v]) } } else { cells.push(table.cell([#v])) } } } let veh_crew = table( align: (left, left, center, center, center, left), columns: (1em, auto, auto, auto, auto, 1fr), table.cell(stroke: none, []), table.cell(stroke: none, []), [Type], [Vis.], [Escape], [Notes], ..cells, ) //If we have an image, then this is a two-page thing if (Img != none) { pagebreak(weak: true, to: "even") page( background: align( top, )[ #context cetz.canvas( length: 100%, { import cetz.draw:* content((0.5, 1), anchor: "north", MaybeImage(Img, width: page.width, fit: "stretch"), name: "image"); if (BoxText != none) { content("image." + BoxAnchor, anchor: BoxAnchor, padding: 1in, align(center)[ #let cells = () #for (k, v) in BoxText { cells.push(text(size: 12pt, [#k: #v])) } #gridx( columns: 1, fill: white.transparentize(50%), stroke: black, vlinex(x: 0), vlinex(x: 1), hlinex(y: 0), ..cells, hlinex(), ), ]) } }, ) ], margin: (top: 0pt), header: none, )[ #place(top + left, box(width: 0pt, height: 0pt, hide[= #vehicle.Name])) #v(TextVOffset) #veh_title #FirstPageContent ] } //Second page goes back to normal pagebreak() if (Img == none) { place(top + left, box(width: 0pt, height: 0pt, hide[= #vehicle.Name])) } veh_title2 veh_stats align(center)[#KochFont(size: 18pt)[Crew]] v(-1em) veh_crew AfterContent } /// Defines the FlyingCircus Ship page. Always on a new page. Image optional. /// /// - Ship (str | dictionary): JSON string or dictionary representing the Ship stats. /// - Img (bytes | none): Image to go at the top of the page. Set to none to remove. /// - DescriptiveText (content): Goes below the name and above the stats table. /// - notes (content): Goes in the notes section. 
/// -> content #let FCShip(Ship: dictionary, Img: bytes, DescriptiveText, notes) = { pagebreak(weak: true) //Define image element let ship_image = align(center)[ #MaybeImage(Img, width: 80%, fit: "contain") ] //Define title element let ship_title = KochFont(size: 24pt)[#Ship.Name] //Construct the stats table element let statTable = gridx( columns: (30%, 30%, 30%), align: center + horizon, hlinex(end: 2, expand: (0pt, 1.9em)), [Speed], [Handling], cellx(rowspan: 6, inset: (top: -2em, right: -2em, bottom: -3em))[ #let data = () #for s in Ship.DamageStates { data.push((value: 1, label: s)) } #cetz.canvas(length: 100%, { import cetz.draw: * import cetz.chart set-style(stroke: (paint: black)) chart.piechart( data, radius: 0.5, value-key: "value", label-key: "label", outer-label: (content: none), inner-label: (content: "LABEL", radius: 130%), slice-style: ((fill: white, stroke: black),), ) line((0, 0), (0, 0.5), stroke: (thickness: 3pt)) }) ], hlinex(end: 2, expand: (0pt, 0.8em)), [#Ship.Speed], [#Ship.Handling], hlinex(end: 2, expand: (0pt, 0.3em)), [Hardness], [Soak], hlinex(end: 2, expand: (0pt, 0em)), [#Ship.Hardness], [#Ship.Soak], hlinex(end: 2, expand: (0pt, 0.1em)), [Strengths], [Weaknesses], hlinex(end: 2, expand: (0pt, 0.4em)), [#Ship.Strengths], [#Ship.Weaknesses], hlinex(end: 2, expand: (0pt, 1.0em)), vlinex(x: 0), vlinex(x: 1), ) // let statTable = grid(columns:(30%,30%,30%), align: center+horizon, // grid.hline(end:2, expand:15%+1em) // ) let cells = () for weap in Ship.Weapons { cells.push([#weap.Name]) for dir in ("Fore", "Left", "Right", "Rear", "Up"){ cells.push([#weap.at(dir, default: "-")]) } } let weaponTable = table( columns: (3fr, 1fr, 1fr, 1fr, 1fr, 1fr), table.cell(stroke: none)[], [Fore], [Left], [Right], [Rear], [Up], ..cells, ) place(top + left, box(width: 0pt, height: 0pt, hide[= #Ship.Name])) grid(columns: 1, rows: (auto, 1fr, auto), grid.cell([ #ship_image #ship_title #v(0.5em) ]), grid.cell([#DescriptiveText]), grid.cell([ #statTable #align(center)[#KochFont(size: 18pt)[Weapons]] #weaponTable #align(center)[#KochFont(size: 18pt)[Notes]] #notes ])) } /// Defines the FlyingCircus Weapon card. Image optional. /// /// - Weapon (str | dictionary): JSON string or dictionary representing the Weapon stats. /// - Img (bytes | none): Image to go above the card. Set to none to remove. /// - DescriptiveText (content): Goes below the name and above the stats table. /// -> content #let FCWeapon(Weapon, Img: none, DescriptiveText)={ let weapon = Weapon //Read in json.decode file if it's a path if (type(Weapon) == str) { weapon = json.decode(Weapon) } place(top + left, box(width: 0pt, height: 0pt, hide[== #weapon.Name])) MaybeImage(Img) { set text(fill: luma(100)) set block(spacing: 0em) text(size: 20pt)[#weapon.Name]; h(1fr); weapon.Price line(length: 100%, stroke: luma(100)) } DescriptiveText v(-1em) let cells = () for (k, v) in Weapon.Cells { cells.push(table.cell(fill: black)[#text(fill: white)[#k]]) cells.push([#v]) } table( columns: (1fr,) * (2 * Weapon.Cells.len()), align: center + horizon, ..cells, table.cell(align: left, colspan: (2 * Weapon.Cells.len()))[#Weapon.Tags], ) } #let HiddenHeading(body) = { show heading: it => [] body }
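For readers skimming this implementation, here is a hedged usage sketch of the `FCWeapon` card defined above. The weapon name, price, cell values, and tags are invented for illustration only and are not from the Flying Circus rules; the dictionary simply supplies the fields the function reads (`Name`, `Price`, `Cells`, `Tags`).

```typ
// Illustrative only: an inline weapon dictionary matching the fields FCWeapon reads.
#FCWeapon(
  (
    Name: "Example LMG",
    Price: [10þ],
    Cells: (Hits: "d10", Damage: "3", AP: "1", Range: "Close"),
    Tags: "Illustrative entry, not a real weapon.",
  ),
  Img: none,
)[A short descriptive blurb for the weapon goes here.]
```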
https://github.com/hugo-s29/typst-algo
https://raw.githubusercontent.com/hugo-s29/typst-algo/master/algo.typ
typst
MIT License
#let algorithm = (it) => box(inset: (x: 15pt, y: 10pt), stroke: black + 1pt, radius: 5pt, align(left, it)) #let algo_style = (it) => strong(it) #let algo_call_style = (it) => smallcaps(it) #let algo_instruction = (it) => it // for customization purposes #let algo_if = (cond) => algo_instruction[ #algo_style[If] #cond #algo_style[then] ] #let algo_else_if = (cond) => algo_instruction[ #algo_style[Else if] #cond #algo_style[then] ] #let algo_else = algo_instruction(algo_style[Else]) #let algo_for = (cond) => algo_instruction[ #algo_style[For] #cond #algo_style[do] ] #let algo_while = (cond) => algo_instruction[ #algo_style[While] #cond #algo_style[do] ] // this function display the start of a procedure-like block // it can be used for procedures, functions, or something custom if needed #let algo_procedure_like = (label, name, args: none) => { if args == none { algo_instruction[ #label #algo_call_style(name) ] } else { algo_instruction(algo_style[#label ] + algo_call_style(name) + [(#args)]) } } #let algo_procedure = (name, args: none) => algo_procedure_like([Procedure], name, args: args) #let algo_function = (name, args: none) => algo_procedure_like([Function], name, args: args) #let algo_end_if = algo_instruction(algo_style[End If] + linebreak()) #let algo_end_for = algo_instruction(algo_style[End For] + linebreak()) #let algo_end_while = algo_instruction(algo_style[End While] + linebreak()) #let algo_end_procedure = algo_instruction(algo_style[End Procedure] + linebreak()) #let algo_end_function = algo_instruction(algo_style[End Function] + linebreak()) #let algo_return = algo_instruction(algo_style[Return]) #let algo_call = (name, args: none) => algo_call_style(name) + [(#args)] #let algo_block = (it) => [ #linebreak() #box( pad( left: 10pt, box( stroke: (left: 1pt), inset: (left: 10pt), outset: (y: 5pt), it, ), ), ) #linebreak() ]
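A hedged usage sketch for the helpers defined above; the algorithm shown (a recursive factorial) is an arbitrary example and not part of this package:

```typ
// Illustrative pseudocode rendered with the helpers defined above.
#algorithm[
  #algo_function("Factorial", args: [$n$]) \
  #algo_block[
    #algo_if[$n <= 1$] \
    #algo_block[
      #algo_return $1$
    ]
    #algo_end_if
    #algo_return $n dot$ #algo_call("Factorial", args: [$n - 1$])
  ]
  #algo_end_function
]
```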
https://github.com/francescoo22/kt-uniqueness-system
https://raw.githubusercontent.com/francescoo22/kt-uniqueness-system/main/src/proof-tree.typ
typst
// Code from: https://github.com/SkiFire13/typst-prooftree #let prooftree( spacing: ( horizontal: 1em, vertical: 0.5em, lateral: 0.5em, ), label: ( // TODO: split offset into horizontal and vertical offset: -0.1em, side: left, padding: 0.2em, ), line-stroke: 0.5pt, ..rules ) = { // Check the types of the parameters. assert( type(spacing) == "dictionary", message: "The value `" + repr(spacing) + "` of the `spacing` argument was expected" + "to have type `dictionary` but instead had type `" + type(spacing) + "`." ) assert( type(label) == "dictionary", message: "The value `" + repr(label) + "` of the `label` argument was expected" + "to have type `dictionary` but instead had type `" + type(label) + "`." ) assert( type(line-stroke) == "length", message: "The value `" + repr(line-stroke) + "` of the `line-stroke` argument was expected" + "to have type `length` but instead had type `" + type(line-stroke) + "`." ) // Check validity of `spacing`'s keys. for (key, value) in spacing { if key not in ("horizontal", "vertical", "lateral", "h", "v", "l") { panic("The key `" + key + "` in the `spacing` argument `" + repr(spacing) + "` was not expected.") } if type(value) != "length" { panic( "The value `" + repr(value) + "` of the key `" + key + "` in the `spacing` argument `" + repr(spacing) + "` was expected to have type `length` but instead had type `" + type(value) + "`." ) } } // Check exclusivity of `spacing`'s keys. let mutually_exclusive(key1, key2, keys) = { assert( key1 not in keys or key2 not in keys, message: "The keys `" + key1 + "` and `" + key2 + "` in the `spacing` argument `" + repr(spacing) + "` are mutually exclusive." ) } mutually_exclusive("horizontal", "h", spacing.keys()) mutually_exclusive("vertical", "v", spacing.keys()) mutually_exclusive("lateral", "l", spacing.keys()) // Check validity of `label`'s keys. let expected = ("offset": "length", "side": "alignment", "padding": "length") for (key, value) in label { if key not in expected { panic("The key `" + key + "` in the `label` argument `" + repr(label) + "` was not expected.") } if type(value) != expected.at(key) { panic( "The value `" + repr(value) + "` of the key `" + key + "` in the `label` argument `" + repr(label) + "` was expected to have type `" + type.at(key) + "` but instead had type `" + type(value) + "`." ) } } if "side" in label { assert( label.side == left or label.side == right, message: "The value for the key `side` in the argument `label` can only be either " + "`left` (default) or `right`, but instead was `" + repr(label.side) + "`." ) } // Check basic validity of `rules`. if rules.pos().len() == 0 { panic("The `rules` argument cannot be empty.") } let settings = ( spacing: ( horizontal: spacing.at("horizontal", default: spacing.at("h", default: 1.5em)), vertical: spacing.at("vertical", default: spacing.at("v", default: 0.5em)), lateral: spacing.at("lateral", default: spacing.at("l", default: 0.5em)), ), label: ( offset: label.at("offset", default: -0.1em), side: label.at("side", default: left), padding: label.at("padding", default: 0.2em), ), line-stroke: line-stroke, ) // Draw the rules in a stack-based evaluation order. style(styles => { let stack = () for rule in rules.pos() { let to_pop = rule.__prooftree_to_pop let rule_func = rule.__prooftree_rule_func assert( to_pop <= stack.len(), message: "The rule `" + repr(rule.__prooftree_raw) + "` was expecting at least " + str(to_pop) + " rules in the stack, but only " + str(stack.len()) + " were present." 
) let elem = rule_func( settings, styles, stack.slice(stack.len() - to_pop) ) stack = stack.slice(0, stack.len() - to_pop) stack.push(elem) } assert( stack.len() == 1, message: "Some rule remained unmatched: " + str(stack.len()) + " roots were found but only 1 was expected." ) set align(start) set box(inset: 0pt, outset: 0pt) stack.pop().body }) } #let axiom(label: none, body) = { // Check the type of `label`. assert( type(label) in ("string", "content", "none"), message: "The type of the `label` argument `" + repr(label) + "` was expected to be " + "`none`, `string` or `content` but was instead `" + type(label) + "`." ) // TODO: allow the label to be aligned on left, right or center (default and current). ( __prooftree_raw: body, __prooftree_to_pop: 0, __prooftree_rule_func: (settings, styles, children) => { let body = box(body, inset: (x: settings.spacing.lateral)) // let body = body if label != none { // Labels stack on top of axioms body = stack( dir: ttb, spacing: 1.5 * settings.spacing.vertical, align(center, label), body ) } return ( body: body, label_wleft: 0pt, label_wright: 0pt, wleft: 0pt, wright: 0pt, ) } ) } #let rule( n: 1, label: none, root ) = { // Check validity of the `n` parameter assert( type(n) == "integer", message: "The type of the `n` argument `" + repr(n) + "` was expected to be " + "`integer` but was instead `" + type(n) + "`." ) // Check the type of `label`. assert( type(label) in ("string", "dictionary", "content", "none"), message: "The type of the `label` argument `" + repr(label) + "` was expected to be " + "`none`, `string`, `content` or `dictionary` but was instead `" + type(label) + "`." ) // If the type of `label` was string then it's good, otherwise we need to check its keys. if type(label) == "dictionary" { for (key, value) in label { // TODO: maybe consider allowing `top`, `top-left` and `top-right` if `rule(n: 0)` gets changed. if key not in ("left", "right") { panic("The key `" + key + "` in the `label` argument `" + repr(label) + "` was not expected.") } if type(value) not in ("string", "content") { panic( "The value `" + repr(value) + "` of the key `" + key + "` in the `label` argument `" + repr(label) + "` was expected to have type `string` or `content` but instead had type `" + type(value) + "`." ) } } } ( __prooftree_raw: root, __prooftree_to_pop: n, __prooftree_rule_func: (settings, styles, children) => { let width(it) = measure(it, styles).width let height(it) = measure(it, styles).height let maxl(..lengths) = width( for length in lengths.pos() { line(length: length) } ) let gtl(l1, l2) = maxl(l1, l2) != l2 let minl(l1, l2) = if gtl(l1, l2) { l2 } else { l1 } let root = [ #h(settings.spacing.lateral) #root #h(settings.spacing.lateral) ] // Get some values from the children, or 0pt if n == 0 let ( children_wleft, children_label_wleft, children_wright, children_label_wright ) = (0pt, 0pt, 0pt, 0pt) if n != 0 { children_wleft = children.first().wleft children_label_wleft = children.first().label_wleft children_wright = children.last().wright children_label_wright = children.last().label_wright } // Map the children to a single block let branches = children.map(c => box(c.body)).join(h(settings.spacing.horizontal)) // Calculate the offsets of the "inner" branches, i.e. 
ignoring branches' labels let wbranches_nolabel = width(branches) - children_label_wleft - children_label_wright let ibranches_offset = maxl(0pt, width(root) - wbranches_nolabel) / 2 // Compute the start, end and length of the line to satisfy the "inner" branches let ib_line_start = ibranches_offset + children_wleft let ib_line_end = ibranches_offset + wbranches_nolabel - children_wright let ib_line_len = ib_line_end - ib_line_start // Pad the line length to satisfy the root too let line_len = maxl(ib_line_len, width(root)) // Adjust the line start to account for the root padding let line_start = if gtl(ib_line_len, width(root)) { // No root padding ib_line_start } else if gtl(width(root), wbranches_nolabel) { // The "inner" branches are too tight 0pt } else { // Weird, situation, we have `wbranches_nolabel > width(root) > ib_line_len` // The line should be adjusted so that it fits kinda in the middle // TODO: maybe this should also consider labels? let min_left = maxl(0pt, ib_line_end - line_len) let max_right = minl(maxl(wbranches_nolabel, width(root)), ib_line_start + line_len) (max_right + min_left) / 2 - line_len / 2 } // Finish computing the offsets by considering the ignored left branches label let branches_offset = maxl(0pt, ibranches_offset - children_label_wleft) let line_start = line_start + (branches_offset + children_label_wleft - ibranches_offset) let root_offset = line_start + (line_len - width(root)) / 2 // Compute body without the label. // This is needed later to calculate the sizes when placing the new labels. let body_nolabel = stack( dir: ttb, spacing: settings.spacing.vertical, box(inset: (left: branches_offset), branches), line(start: (line_start, 0pt), length: line_len, stroke: settings.line-stroke), box(inset: (left: root_offset), root), ) // Normalize label given the default value in the `prooftree` function. let label = label if type(label) == "none" { label = ( left: none, right: none ) } if type(label) in ("string", "content") { label = ( left: if settings.label.side == left { label } else { none }, right: if settings.label.side == right { label } else { none } ) } label = ( left: label.at("left", default: none), right: label.at("right", default: none), ) // Pad the labels to separate them from the rule let left_label = box(inset: (right: settings.label.padding), label.left) let right_label = box(inset: (left: settings.label.padding), label.right) // Compute extra space the left label might need let new_left_space = maxl(0pt, width(left_label) - line_start) let left_label_width_offset = maxl(0pt, line_start - width(left_label)) // Compute the width offset of the right label let right_label_width_offset = new_left_space + line_start + line_len // Compute the final width let final_width = maxl( right_label_width_offset + width(right_label), new_left_space + width(body_nolabel) ) // Place the label on top of the rest. // Note that this needs to fix the final dimensions in order to use `place`. 
let body = box(width: final_width, height: height(body_nolabel))[ #set block(spacing: 0pt) #box(inset: (left: new_left_space), body_nolabel) #place( bottom + left, dx: left_label_width_offset, dy: settings.label.offset, box(height: 2 * (height(root) + settings.spacing.vertical), align(horizon, left_label)) ) #place( bottom + left, dx: right_label_width_offset, dy: settings.label.offset, box(height: 2 * (height(root) + settings.spacing.vertical), align(horizon, right_label)) ) ] // Compute the final sizes for the next rule let label_wleft = minl( new_left_space + branches_offset + children_label_wleft, left_label_width_offset + width(left_label) ) let label_wright = minl( children_label_wright + (final_width - branches_offset - width(branches)), final_width - right_label_width_offset ) let wleft = (new_left_space + root_offset) - label_wleft let wright = width(body) - new_left_space - root_offset - width(root) - label_wright ( body: body, label_wleft: label_wleft, label_wright: label_wright, wleft: wleft, wright: wright, ) } ) }
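A hedged usage sketch for the `prooftree`, `axiom`, and `rule` helpers above, showing a single modus ponens step (the label text is arbitrary). Rules are given in stack order: each `rule(n: k, ...)` consumes the `k` proofs pushed before it.

```typ
// Illustrative derivation: from A and A -> B, conclude B (stack-based order).
#prooftree(
  axiom($A$),
  axiom($A -> B$),
  rule(n: 2, label: "MP", $B$),
)
```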
https://github.com/jneug/schule-typst
https://raw.githubusercontent.com/jneug/schule-typst/main/src/util/harbinger.typ
typst
MIT License
#import "@preview/oxifmt:0.2.0": strfmt #let oxi-template = ` <svg width="{canvas-width}" height="{canvas-height}" xmlns="http://www.w3.org/2000/svg" > <!-- Definitions for reusable components --> <defs> <filter id="shadow" x="-100%" y="-100%" width="300%" height="300%"> <feFlood flood-opacity="{flood-opacity}" flood-color="{flood-color}" /> <feComposite in2="SourceGraphic" operator="in" /> <feGaussianBlur stdDeviation="{blur-x:?} {blur-y:?}" result="blur" /> </filter> </defs> <rect x="{rect-x-offset}" y="{rect-y-offset}" rx="{radius:?}" ry="{radius:?}" width="{rect-width}" height="{rect-height}" style="filter:url(#shadow)"/> </svg> `.text #let _resolve-blur(blur, styles, size) = { let typ = type(blur) if typ == type(1pt) { (x: blur, y: blur) } else if typ in (int, float) { (x: blur * 1pt, y: blur * 1pt) } else if typ == dictionary { ( x: _resolve-blur(blur.x, styles, size).x, y: _resolve-blur(blur.y, styles, size).y, ) } else if typ == ratio { let frac = blur / 100% (x: frac * size.width, y: frac * size.height) } else { panic("Unexpected blur type: `" + typ + "` for value `" + repr(blur) + "`") } } /// Shadow Box that uses svg filters to create the shadow effect. /// /// *Example:* /// #example(`harbinger.shadow-box( /// radius: 5pt, /// inset:1em, /// fill:white, /// dx: 2pt, /// dy: 2pt, /// blur:2, /// )[This is a nice shadow box] /// `) /// - body (content): This is the content of the shadow box. /// - shadow-fill (color): The color of the shadow. /// -> content /// - opacity (number): The opacity of the shadow. /// - dx (number): The horizontal offset of the shadow. /// - dy (number): The vertical offset of the shadow. /// - radius (number): The radius of the shadow. /// - blur (number): The blur of the shadow. /// - margin (number): The margin of the shadow. /// - ..args (dictionary): Additional arguments for the shadow box (width, height, fill, etc). #let shadow-box( body, shadow-fill: black, opacity: 0.5, dx: 0pt, dy: 0pt, radius: 0pt, blur: 3, margin: 2, ..args, ) = { style(styles => layout(size => { let named = args.named() for key in ("width", "height") { if key in named and type(named.at(key)) == ratio { named.insert(key, size.at(key) * named.at(key)) } } let blur = _resolve-blur(blur, styles, size) let opts = (blur-x: blur.x, blur-y: blur.y, radius: radius) let shadow-fill = shadow-fill.rgb().components().map(el => el / 100% * 255) opts.flood-color = strfmt("rgb({}, {}, {}, {})", ..shadow-fill) let boxed-content = box(body, radius: radius, ..named) let rect-size = measure(boxed-content, styles) let (rect-x-offset, rect-y-offset) = ( blur.x * margin, blur.y * margin, ) let canvas-size = ( width: 2 * rect-x-offset + rect-size.width, height: 2 * rect-y-offset + rect-size.height, ) opts += ( rect-x-offset: rect-x-offset, rect-y-offset: rect-y-offset, rect-width: rect-size.width, rect-height: rect-size.height, canvas-width: canvas-size.width, canvas-height: canvas-size.height, flood-opacity: opacity, ) let svg-shadow = image.decode(strfmt(oxi-template, ..opts), ..canvas-size) block({ place( dx: dx - rect-x-offset, dy: dy - rect-y-offset, svg-shadow, ) boxed-content }) })) }
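A slightly fuller usage sketch than the one-liner in the doc comment; only the parameter names come from `shadow-box` itself, while the import path and every value below are arbitrary illustrative choices.

#import "harbinger.typ": shadow-box // adjust the path to wherever this file lives

// A card-like block with a soft, slightly offset drop shadow. The extra
// named arguments (inset, fill, width, ...) are forwarded to the inner box.
#shadow-box(
  shadow-fill: black,
  opacity: 0.4,
  dx: 3pt,
  dy: 3pt,
  radius: 6pt,
  blur: 4,
  inset: 1em,
  fill: white,
  width: 60%,
)[
  Content rendered on top of an SVG-filter drop shadow.
]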
https://github.com/qo/term
https://raw.githubusercontent.com/qo/term/main/examples/figure/figure.typ
typst
#import "../../term.typ": term #let content = `$ ls -la total 140 drwxr-xr-x. 1 null null 48 Nov 30 01:11 . drwx------. 1 null null 1032 Dec 3 18:30 .. -rw-r--r--. 1 null null 23093 Nov 30 00:40 s1.png -rw-r--r--. 1 null null 26076 Nov 30 01:02 s2.png -rw-r--r--. 1 null null 42784 Nov 30 01:18 s3.png -rw-r--r--. 1 null null 42735 Nov 30 01:12 s4.png ` #figure( term( content: content ), caption: "Listing files in a directory", )
https://github.com/justinvulz/document
https://raw.githubusercontent.com/justinvulz/document/main/summer_camp/DMA_summer_group_wb.typ
typst
#import "../typst_packages/lecture.typ": * #import "@preview/cetz:0.2.2" #import "@preview/fletcher:0.4.5" as fletcher: diagram,node,edge #import fletcher.shapes:circle #show par: set block(spacing: 0.9em) #show math.equation: set block(spacing: 0.9em) #let theorem = thmbox( "p", "Theorem", // fill: rgb("e8e8f8"), stroke: black, base_level: 1, padding: (y: 0em)) #let definition = thmbox( "p", "Definition", // fill: rgb("e8f8e8"), stroke: black, base_level:1, padding: (y: 0em)) #let proof = thmproof("proof","Proof") #let lemma = thmbox( "p", "Lemma", // fill: rgb("f8e8e8"), stroke: black, base_level: 1, padding: (y: 0em)) #let example = thmplain("example","Example").with( inset: (top: 0.5em, bottom: 0.5em, left: 1em, right: 1em), numbering: none) #let exercise = thmbox( "exercise", "Exercise", stroke: black + 1pt, base: none, ).with(numbering: "I") #let remark = thmplain("remark","Remark").with( inset: (top: 0.5em, bottom: 0.5em, left: 1em, right: 1em), numbering: none) #show: doc => conf( "群論", "陽明交大應數系營隊", doc) #makeTitle \ #set par(first-line-indent: 1.5em) 在數學中,群論 (Group theory) 研究名為「群」的代數構。 群論在許多的領域都有很重要的應用。像是,倍立方、化圓為方、三等分角,五次多項式無法解的原因都可以用群論來解釋。 另外,像是標準粒子模型、量子力學 (李群)、晶體結構、密碼學等領域也有很多群論的應用。testtesttest #set par(first-line-indent: 0em) = 群 (Group) #definition[ $angle.l G , * angle.r$是一個集合 $G$ 與一個二元運算 $* : G times G |-> G$,滿足以下條件: #set enum(numbering: al(n => [$cal(G)_#n$:])) + 對於所有的$a,b,c in G$, $ (a*b)*c = a*(b*c) quad textb("結合律") $ + 存在一個元素 $e in G$,使得對於所有的 $a in G$, $ a*e = e*a = a quad textb("單位元素") $ + 對於每一個 $a in G$,存在一個元素 $a^(-1) in G$,使得 $ a*a^(-1) = a^(-1)*a = e quad textb("反元素") $ ] #example[ #set table(stroke: (x,y) =>( bottom: if y==0 {1pt}, right: if x==0 {1pt}, )) 我們來看一些例子:\ - $angle.l ZZ, + angle.r$、$angle.l QQ,+ angle.r$、$angle.l RR,+ angle.r$ - $angle.l QQ^+, times angle.r$ - $C_3 = {e,a,b}$ 與下面的運算是一個群。 #table( columns: (2em,2em,2em,2em), rows: auto, align: center, $cir$, $e$, $a$, $b$, $e$, $e$, $a$, $b$, $a$, $a$, $b$, $e$, $b$, $b$, $e$, $a$ ) ] #remark[有時候我們會省略二元運算$*$,以$G$表示一個群。] #definition[ 讓$G$是一個群,定義$abs(G)$是$G$的元素個數,稱為$G$的*order*。 ] #definition[ 一個群$G$如果滿足交換率i.e. 
對於所有的$a,b in G$,$ a*b = b*a $,則稱$G$是一個*交換群*(Abelian groups)。 ] #example[ - $C_3$的order是$3$。 - $ZZ$ 是一個交換群。 - 可逆矩陣的集合與矩陣乘法是一個群,但不是交換群。 ] == 群的性質 #theorem[ 如果$G$是一個群,那*消去率*成立,即對於所有的$a,b,c in G$, $ a*b = a*c => b = c \ b*a = c*a => b = c $ ] #proof[ 讓$G$是一個群,$a,b,c in G$。假設$a*b = a*c$,因為$a in G$,所以$a$的反元素$a^(-1)$存在,且$a*a^(-1) = a^(-1)*a = e$。 $ &a*b = a*c \ => &a^(-1)*a*b = a^(-1)*a*c \ => &e*b = e*c \ => &b = c $ ] #theorem[ 群$G$的單位元素$e$唯一。 ] #proof[ 假設存在第二個單位元素$e_2$,滿足$e_2*a = a*e_2 = a med forall a in G$,因為$e in G$,所以$e_2*e = e * e$,根據消去律$e_2 = e$。 ] #theorem[ 讓$G$是一個群,$a ,b in G$,那麼 $ (a b)^(-1) = b^(-1) a^(-1) $ ] #proof[ 我們直接相乘 $ (a b)b^(-1) a^(-1) &= a (b b^(-1)) a^(-1) quad textr("結合律")\ &= a e a^(-1)\ &= a a^(-1)\ &= e $ 我們證明了 $(a b)b^(-1) a^(-1) =e$,接著我們用同樣的方法可以得到$b^(-1) a^(-1)(a b) =e$ 也是成立的。 根據反元素的定義,$(a b)^(-1) = b^(-1) a^(-1)$。 ] = 置換群(Permutation Group) 我們接下來討論一個特殊的群,置換群。考慮一個集合$A = {1,2,3,4,5}$,我們可以將$A$的元素重新排列成$A = {3,1,5,2,4}$。我們可以將這個排列表示成一個函數$phi : A -> A$,這個函數將$1$映射到$3$,$2$映射到$1$,以此類推。我們可以將這個排列表示成一個表格,如@fig1 所示。 我們稱這樣的函數為一個*置換*。但是,@fig2 的函數不是一個置換,因為$4$沒有被任何一個元素映射到。 #grid( columns: (1fr,1fr), rows: (auto), align: center, [#figure( $ 1 -> 3\ 2 -> 4\ 3 -> 5\ 4 -> 2\ 5 -> 1 $, caption: "一個置換", ) <fig1>], [#figure( $ 1 -> 2\ 2 -> 3\ 3 -> 2\ 4 -> 5\ 5 -> 1\ $, caption: "不是置換", )<fig2>] ) #definition[ 一個$A$的是*置換*是一個一一對應的函數 $phi : A -> A$。 (one-one and onto) ] 我們現在給定兩個置換 $tau$ 和 $sigma$ ,我們定義他們的合成 $sigma cir tau$,對於所有的 $x in A$, $ (sigma cir tau)(x) = sigma(tau(x)) \ A -->^tau A -->^sigma A $ 因為 $tau$ 和 $sigma$ 是一一對應的函數,所以 $sigma cir tau$ 也是一一對應的函數。所以 $sigma cir tau$ 是一個置換。 #example[ #let msigma = $mat( 1, 2, 3, 4, 5; 3, 4, 5, 2, 1 )$ #let mtau = $mat( 1, 2, 3, 4, 5; 2, 3, 4, 5, 1 )$ 對於上的 $sigma$ 我們可以表示成, $ sigma = msigma $ 定義 $tau$ 為, $ tau = mtau $ 我們可以計算 $sigma cir tau$, $ sigma cir tau = msigma cir mtau = mat( 1, 2, 3, 4, 5; 4, 5, 2, 1, 3 ) $ 所以像是 $sigma cir tau (1) = sigma(tau(1)) = sigma(2) = 4$ ] == 循環置換 (Cyclc) 一個置換除了可以用上述的方法表示,我們還可以用*循環*的方式表示。我們來看下面的例子, 定義一個置換$ sigma = mat(1, 2, 3, 4, 5; 3, 4, 5, 2, 1) $ 我們觀察一下 $sigma$ 的作用,可以發現 $sigma$ 將 $1 -> 3 -> 5 -> 1$,$2 -> 4 -> 2$,所以我們可以將 $sigma$ 表示成一個循環 $sigma = (1,3,5)(2,4)$。 #figure( grid( columns: (1fr,1fr), rows: (auto), align: center, diagram( { // traingle let (p1,p3,p5) = ((0,0),(1,0),(0.5,0.866)) node(p1,[*1*]) node(p3,[*3*]) node(p5,[*5*]) edge(p1,p3,"->",bend: 55deg) edge(p3,p5,"->",bend: 55deg) edge(p5,p1,"->",bend: 55deg) } ), diagram( { // traingle let (p2,p4) = ((0,0),(1,0)) node(p2,[*2*]) node(p4,[*4*]) edge(p2,p4,"->",bend: 55deg) edge(p4,p2,"->",bend: 55deg) } ) ), caption: "一個置換的循環" ) 透過循環置換,我們可以很容易的表示一個置換,並且可以很容易的計算該置換的反元素。例如,對於上面的例子,$sigma^(-1) = (5,3,1)(4,2)$。 #remark[ 如果一個置換$ sigma = mat(1,2,3,4,5;1,2,4,5,3) = (1)(2)(3,4,5) $ 為了簡化,我們有時候會省略一個元素的循環,寫成 $sigma = (3,4,5)$。 ] #definition[ 一個集合$A$的所有置換構成一個群,我們稱這個群為$A$的*置換群*,記作$S_A$。 ] #remark[ $S_n$ 表示 $n$ 個元素的置換群。 $S_n$的order是 $n!$。 ] = 對稱群(Symmetry Groups) 接下來我們考慮一種特殊的置換群,稱為*對稱群*。我們考慮一個正三角形,將正三角形的頂點邊繼承$1,2,3$,我們來討論他有那些對稱性。 // 我們把順時鐘旋轉$120 degree$ 得到一個新的正三角形(@sqr90)所示。我們可以將這個操作表示成一個置換: // $ rho_1 = mat(1,2,3; 2,3,1) = (1,2,3) $ // 我們稱這樣的置換是*對稱置換*,他可以把圖形打回自身。 #let (p1,p2,p3) = ((0,0),(1,0),(0.5,-0.866)) // #grid( // columns: (1fr,1fr,1fr), // rows: (auto), // align: center, // [#figure( // diagram( // { // node(p1,[*1*]) // node(p2,[*2*]) // node(p3,[*3*]) // edge(p1,p2,"-") // edge(p2,p3,"-") // edge(p3,p1,"-") // } // ), // caption: "正三角形", // )<fig3>], // [#figure( // diagram({ // node(p1,[*2*]) // node(p2,[*3*]) // node(p3,[*1*]) // edge(p1,p2,"-") // 
edge(p2,p3,"-") // edge(p3,p1,"-") // }), // caption: [順時針旋轉 $120$ 度], // )<sqr90>], // [#figure( // diagram({ // node(p1,[*1*]) // node(p2,[*3*]) // node(p3,[*2*]) // edge(p1,p2,"-") // edge(p2,p3,"-") // edge(p3,p1,"-") // let mid = ((p2.at(0)+p3.at(0))/1.9,(p2.at(1)+p3.at(1))/1.9) // edge((0,0),mid,stroke: red) // }), // caption: [沿某一軸鏡射], // )<sqf1>] // ) // 接下來看一下 @fig3 到 @sqf1 的變換,我們可以得到另一個置換 // $ tau_1 = mat(1,2,3;1,3,2) = (1)(2,3) $ // 接著我考慮 $tau_1 cir rho_1$ 這個置換,先把三角形旋轉$120 degree$,再把它沿著@sqf1 的軸鏡射,我們可以得到一個新的置換 : // $ // tau_1 cir rho_1 &= mat(1,2,3;1,3,2) mat(1,2,3;2,3,1) \ // &= mat(1,2,3;3,1,2) \ // &= (1,3,2) // $ // 而 $tau_1 cir rho_1$ 這個置換就是沿著另一個軸鏡射的置換。如下圖所示: // #grid( // columns: (1fr,1fr,1fr), // rows: (auto), // align: center, // [#figure( // diagram( // { // node(p1,[*1*]) // node(p2,[*2*]) // node(p3,[*3*]) // edge(p1,p2,"-") // edge(p2,p3,"-") // edge(p3,p1,"-") // } // ), // caption: "正三角形", // )], // [#figure( // diagram({ // node(p1,[*2*]) // node(p2,[*3*]) // node(p3,[*1*]) // edge(p1,p2,"-") // edge(p2,p3,"-") // edge(p3,p1,"-") // }), // caption: [$rho_1$], // )], // [#figure( // diagram({ // node(p1,[*2*]) // node(p2,[*1*]) // node(p3,[*3*]) // edge(p1,p2,"-") // edge(p2,p3,"-") // edge(p3,p1,"-") // edge(p3,(0.5,0.1),stroke: red) // }), // caption: [$tau_1 cir rho_1$], // )] // ) 我們可以繼續枚舉所有三角形的對稱操作,我們可以得到以下的置換: #grid( columns :(1fr,1fr,1fr), rows: (auto,auto), row-gutter: 2em, align: center, [#figure(supplement: none, diagram( { node(p1,[*1*]) node(p2,[*2*]) node(p3,[*3*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") } ),caption:[$mat(1,2,3;1,2,3) = e$] )], [#figure(supplement: none, diagram({ node(p1,[*2*]) node(p2,[*3*]) node(p3,[*1*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") }),caption:[$mat(1,2,3;2,3,1) = (1,2,3) = rho_1$] )], [#figure(supplement: none, diagram( { node(p1,[*3*]) node(p2,[*1*]) node(p3,[*2*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") } ),caption:[$mat(1,2,3;3,1,2) = (3,2,1) =rho_2$] )], [#figure(supplement: none, diagram({ node(p1,[*2*]) node(p2,[*1*]) node(p3,[*3*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") edge(p3,(0.5,0.1),stroke: red) }),caption:[$mat(1,2,3;2,1,3) = (1,2)(3) = tau_1$] )], [#figure(supplement: none, diagram({ node(p1,[*3*]) node(p2,[*2*]) node(p3,[*1*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") let mid = ((p3.at(0)+p1.at(0))/1.9,(p3.at(1)+p1.at(1))/1.9) edge(p2,mid,stroke: red) }),caption:[$mat(1,2,3;3,2,1) = (1,3)(2) = tau_2$] )], [#figure(supplement: none, diagram({ node(p1,[*1*]) node(p2,[*3*]) node(p3,[*2*]) edge(p1,p2,"-") edge(p2,p3,"-") edge(p3,p1,"-") let mid = ((p3.at(0)+p2.at(0))/1.9,(p3.at(1)+p2.at(1))/1.9) edge(p1,mid,stroke: red) }),caption:[$mat(1,2,3;1,3,2) = (3,2)(1) = tau_3$] )], ) 把上述的對稱置換收集起來,並用上面提到的$cir$當作二運算,我們可以得到一個*對稱群*,稱為正三角形的對稱群$D_3$。 同樣的,我們可以考慮正方形的對稱群$D_4$,正方形的對稱群有$8$個元素,我們可以將$D_4$寫下來: $ D_4 = {e, rho_1, rho_2, rho_3, tau_1, tau_2, tau_3, tau_4} $ #figure( diagram({ node((0,0),[1]) node((1,0),[2]) node((1,1),[3]) node((0,1),[4]) edge((0,0),(1,0),"-") edge((1,0),(1,1),"-") edge((1,1),(0,1),"-") edge((0,1),(0,0),"-") edge((0.5,-0.1),(0.5,1.1),"-") edge((-0.1,0.5),(1.1,0.5),"-") edge((0,0),(1,1),"-") edge((1,0),(0,1),"-") }), caption: "正方形的對稱性" )<refq> 其中$tau_1 dots tau_4$是以@refq 中的對稱軸鏡射,$rho_1 dots rho_3$是以中點為圓心的旋轉。我們可以把他們用循環寫下來: $ e &= (1)(2)(3)(4) \ rho_1 &= (1,2,3,4)\ rho_2 &= (1,3)(2,4)\ rho_3 &= (1,4,3,2)\ tau_1 &= (2,4)\ tau_2 &= (1,3)\ tau_3 &= (1,2)(4,3)\ tau_4 &= (1,4)(2,3)\ $ == 計算對稱群的order 
我們上面提到了正三角形的對稱群$D_3$和正方形的對稱群$D_4$,並列出其中的一些元素,那我們要怎麼確定這些對稱群的order呢?我們下面來討論一個方法。 + 先找到圖形的不動點。 + 畫一條通過不動點的直線。 + 假設有$m$個對稱稱使得這條線不動,而條線在對稱性下會被打到$n$個不同的位子。 + 那麼這個對稱群的order就是$n times m$。 下一節會證明這個方法是正確的。 = 群作用(Group Action) #let gset = $G negspace textb("-set")$ #definition[ 一個群$angle.l G,* angle.r$對一個集合$A$的*作用*是一個映射 $phi : G times A -> A$,滿足以下條件: #set enum(numbering: al("1.")) + 對於所有 $a in A quad phi(e,a) = a$ + 對於所有 $a in A$ 和 $g,h in G$,$phi(g*h,a) = phi(g,phi(h,a))$ 在這個情況下,我們稱$A$是一個#gset。 ] 為了簡化,我們會省略運算函數,寫成$g a$代表$phi(g,a)$。 所以上述的條件可以寫成 + 對於所有 $a in A quad e a = a$ + 對於所有 $a in A$ 和 $g,h in G$,$(g h) a = g (h a)$ #theorem[ 讓$X$是一個#gset。如果$g x_1 = g x_2$,那$x_1 = x_2$ ] #proof[ 假設 $g x_1 = g x_2$,那麼 $g^(−1)g x_1 = g^(−1) g x_2$,所以 $e x_1 = e x_2$,所以 $x_1 = x_2$。 ] #remark[ 如果$x != y$,那$g x != g y$ ] == 不動點 (Fixed point)、穩定子群 (stabilizers subgroup)、軌道 (Orbits) #let Stab = math.op("Stab") #theorem[ 讓$X$是一個#gset,我們定義一個在$X$上的關係$tilde.op$,對於所有的$x,y in X$,$x tilde.op y$當且僅當存在$g in G$,使得$g x = y$。這個關係是一個等價關係。 ] <relation> #proof[ \ *自反性*:對於所有的$x in X$,$x tilde.op x$,因為$e x = x$。 \ *對稱性*:如果$x tilde.op y$,那麼存在$g in G$,使得$g x = y$,所以$g^(-1) y = x$,所以$y tilde.op x$。 \ *傳遞性*:如果$x tilde.op y$且$y tilde.op z$,那麼存在$g,h in G$,使得$g x = y$且$h y = z$,所以$h g x = z$,所以$x tilde.op z$。 ] #definition[ 讓$X$是一個#gset,每一個在 @relation 下的等價類稱為一個*軌道*。如果$x in X$,包含$x$的分割是$x$的軌道,記作$G_x$。 ] #remark[ 讓 $X$ 是一個 #gset,$x in X$,那麼 $x$ 的軌道 $G_x = {g x mid(|) g in G}$。 ] #definition[ 讓$X$是一個#gset,讓$x in X$,$g in G$。我們定義; $ Stab_G (x) = {g in G | g x = x} \ X^g = {x in X | g x = x} $ $Stab_G (x)$稱為$x$的*穩定子群*,$X^g$稱為$g$的*不動點*。 ] #theorem([軌道-穩定子定理 (Orbit-Stabilizer Theorem)])[ 讓$G$是一個有限群,讓 $X$ 是一個 #gset,$x in X$,那麼 $abs(G) = abs(G_x) abs(Stab_G (x))$。 ] <orbit-stabilizer> #proof[ 定義$f:G -> G_x$,$f(g) = g x$。我們證明每一個在$G_x$裡的元素都被打到$abs(Stab_G (x))$這麼多次。\ 給定一個$y in G_x$,那麼存在$h in G$使得$y = h x$。\ \ 我們先證明這個引理: $f(g) = y <==> h^(-1) g in Stab_G (x)$。 \ $=>$:如果$f(g) = y$,那麼$g x = h x$,所以$h^(-1)g x = x$,所以$h^(-1)g in Stab_G (x)$。 \ $arrow.l.double$:如果$h^(-1) g in Stab_G (x)$,那麼$h^(-1) g x = x$,所以$g x = h x$,所以$f(g) = y$。\ \ 接著我們來討論有多少 $g in G$ 使得 $h^(-1)g in Stab_G (x)$。\ $ h^(-1) g in Stab_G (x) &<==> exists tilde(g) in Stab_G (x) st h^(-1) g = tilde(g)\ &<==> exists tilde(g) in Stab_G (x) st g = h tilde(g)\ &<==> g in {h tilde(g) | tilde(g) in Stab_G (x)} $ 所以,$f(g) = y <==> g in {h tilde(g) | tilde(g) in Stab_G (x)}$,並且,對於所有$tilde(g) in Stab_G(x)$,$f(h tilde(g)) = h tilde(g)x = h x =y $。 因此,每個$y in G_x$ 都 $abs(Stab_G (x))$ 個 $g in G$ 使得 $f(g) = y$。所以,$abs(G) = abs(G_x) abs(Stab_G (x))$。 ] // #pagebreak() == 伯恩賽德引理 (Burnside’s Lemma) #lemma([*伯恩賽德引理*])[ 讓$G$是一個有限群,讓$X$是一個#gset。讓$r$是$X$的軌道數,那麼 $ r dot abs(G) = sum_(g in G) abs(X^g) $ ] #proof[ (雙重計數) #set math.equation(numbering: "(1)") 我們考慮序組$(g,x)$,其中$g x = x$。假設這樣的序組有$N$個。 給定一個$g in G$,我們計算$(g,x)$的數量,這個數量是$abs(X^g)$。所以 $ N = sum_(g in G) abs(X^g) $ 另一方面,給定一個$x in X$,我們計算$(g,x)$的數量,這個數量是$abs(Stab_G (x))$。所以 $ N = sum_(x in X) abs(Stab_G (x)) $ 根據 @orbit-stabilizer[*軌道穩定子定理* Thm],$abs(Stab_G (x))abs(G_x) = abs(G)$,所以, $ N = sum_(x in X) abs(Stab_G (x)) = sum_(x in X) abs(G) / abs(G_x) = abs(G) sum_(x in X) 1 / abs(G_x) $ 對於在相同軌道的元素,$abs(G_x)$是相同的。讓$cal(O)$是一個軌道,我們有 $ sum_(x in cal(O)) 1 / abs(G_x) = sum_(x in cal(O)) 1 / abs(cal(O)) = 1 $ 用 *(3)* 代入 *(2)*,我們得到 $ N = abs(G) dot (textr("軌道的數量")) = abs(G) dot r $ 因此, 結合 *(1)* 和 *(4)*,我們得到 $ r dot abs(G) = sum_(g in G) abs(X^g) $ ] #example[ 用$4$個顏色對一個正三角形的三個邊進行著色,有幾種不同的著色方法?(兩種著色方式被認為是相同的,如果他們可以通過旋轉、鏡射相互變換) 我們讓$G = D_3$是三角型的對稱群,$X$是所有著色的結果($abs(X) 
= 4^3$),所以我們要求$X$在$G$下有幾個軌道。根據前的討論,我們知道$abs(G) = 6$,然後我們計算不動點的個數: $ abs(X^(rho_0)) = 4^3\ abs(X^(rho_1)) = 4\ abs(X^(rho_2)) = 4\ abs(X^(tau_1)) = 4^2\ abs(X^(tau_2)) = 4^2\ abs(X^(tau_3)) = 4^2\ $ 根據*伯恩賽德引理*,我們有 $ 6r &= 4^3 +4 +4 +4^2 +4^2 + 4^2 = 120\ r &= 20 $ 所以正三角形的相異著色方法有$20$種。 ] == 著色多項式 我們考慮我們有$n$個顏色,幫一個有對稱性的圖形上色,我們假設在對稱性下有$r$種上色方式。 讓$X$是所有上色方法的集合,讓$G$是該圖形的對稱群,根據博恩賽德引理,我們有 $ r = 1/abs(G) sum_(g in G) abs(X^g) $ 其中$X^g$是在$g$下的不動點的集合。我們觀察一下$g in G$,我們知道$g$可以被寫成循環的形式,像是下面這樣: $ g = underbrace((1,2,3)(5,4) dots (\#,\#),m_g) $ 所以$g$種共有$m_g$個循環。我們發現在這種情況下要在$g$下不動的著色方法必須滿足「每個循環內的顏色都一樣」,所以$abs(X^g) = n^(m_g)$ 所以我們得到, $ r = 1/abs(G) sum_(g in G) abs(X^g) = 1/abs(G) sum_(g in G) n^(m_g) $ #example[ #let mg = $m_g$ 我們考慮有$n$個顏色,對一個正四邊形的頂點上色,我們要求在對稱性下有幾種不同的著色方法。 我們讓$G = D_4$是正四邊形的對稱群,$X$是所有著色的結果($abs(X) = n^4$),所以我們要求$X$在$G$下有幾個軌道。根據前的討論,我們知道$abs(G) = 8$,然後我們計算不動點的個數: - $1$個 $4$ cycle的單位變換,$e = (1)(2)(3)(4)$ - $2$個 $1$ cycle的旋轉($90 degree, 270 degree$),e.x. $g = (1,2,3,4)$ - $1$個 $2$ cycle的旋轉($180 degree$),e.x. $g = (1,2)(3,4)$ - $2$個 $3$ cycle的鏡射(對角線的鏡射),e.x. $g = (1)(3)(2,4)$ - $2$個 $2$ cycle的鏡射(中線的鏡射),e.x. $g = (1,3)(2,4)$ 所以我們有 $ r &= 1/8 (n^4 + 2n + n^2 + 2n^3 +2n^2) \ r &= 1/8 (n^4 + 2n^3 + 3n^2 +2n) $ ] #example[ #let mg = $m_g$ 我們現在有$n$個顏色,幫一個正六面體上色,可以通過旋轉變換得到視為相同的著色方式。總共有多少種不同的著色方式? #grid( columns: (1fr,1fr), rows: (auto), align: center, cetz.canvas(length: 1.3cm,{ import cetz.draw:* ortho(x:20deg,y:45deg,z:0deg,{ on-xy(z:-1,{ rect((-1,-1),(1,1),fill: rgb("e8e8f8")) }) on-xy(z:1,{ rect((-1,-1),(1,1),fill: rgb(silver)) }) on-yz(x:-1,{ rect((-1,-1),(1,1)) }) on-yz(x:1,{ rect((-1,-1),(1,1)) }) on-xz(y:-1,{ rect((-1,-1),(1,1)) }) on-xz(y:1,{ rect((-1,-1),(1,1)) }) // line((0,0,0),(2,0,0),stroke: red) // line((0,0,0),(0,2,0),stroke: green) // line((0,0,0),(0,0,2),stroke: blue) }) }), cetz.canvas(length: 1.3cm,{ import cetz.draw:* rect((0,0),(1,1),name: "1") rect((1,0),(2,1)) rect((2,0),(3,1)) rect((3,0),(4,1)) rect((1,1),(2,2)) rect((1,0),(2,-1)) content((0.5,0.5),[*1*]) content((1.5,0.5),[*2*]) content((2.5,0.5),[*3*]) content((3.5,0.5),[*4*]) content((1.5,1.5),[*5*]) content((1.5,-0.5),[*6*]) }) ) 讓$D$是正六面體的對稱群,我們根據之前的討論,我們知道$abs(D) = 24$,我們討論裡面的變換: + 單位變換:$(1)(2)(3)(4)(5)(6)$ + 過對面中點轉軸旋轉$90degree,270degree$,如:$(1,2,3,4)(5)(6)$,共 6 個。 + 過對面中點轉軸旋轉$180degree$,如:$(1,3)(2,4)(5)(6)$,共 3 個。 + 過對邊中點轉軸旋轉$180degree$,如:$(1,5)(3,6)(2,4)$,共 6 個。 + 過對頂點轉軸旋轉$120degree,240degree$,如:$(1,5,4)(2,3,6)$,共 8 個 所以我們有 $ r &= 1/24 (n^6 + 6n^3 + 3n^4 + 6n^3 + 8n^2) \ r &= 1/24 (n^6 + 3n^4 + 12n^3 + 8n^2) $ ] #example[ 在旋轉的對稱性下,用$n$個顏色對一個正四面體的*邊*上色,總共有多少種不同的著色方式? #figure[ #image("../pic/summer_camp/image.png",width: 11em) ] 我們讓$G$是正四面體的對稱群,我們通過軌道-穩定子定理,我們可以得到$abs(G) =12$ 我們討論裡面的對置換: - 單位變換:$(1)(2)(3)(4)(5)(6)$ - $8$個以一面中點的垂線為轉軸的旋轉:$(1,2,3)(4,5,6)$ - $3$個以過兩對邊中點的轉軸旋轉:$(1)(6)(2,4)(5,3)$ 所以我們有 $ r = 1/12 (n^6 + 8n^2 + 3n^4) $ ] == 練習 #exercise[ 對於正$n$邊形的對稱群$D_n$,$abs(D_n)$是多少? ] #exercise[ 有$n$個不同顏色的珠子,我們要把這些珠子串成一串$6$個珠子的項鍊,可以通過旋轉變換得到視為相同的項鍊。總共有多少種不同的項鍊? #figure[ #diagram( node-stroke: 1pt, { for t in range(6).map(i => i/6*360deg) { node((calc.cos(t),calc.sin(t)),[#v(0.1em)],shape: circle) edge((calc.cos(t),calc.sin(t)),(calc.cos(t+60deg),calc.sin(t+60deg)), bend: 30deg) } }) ] #set enum (numbering: al("a)")) + 對稱群的order是多少? + 對稱群的元素有哪些? 每個元素有幾個循環? + 有多少種不同的著色方式? ] #exercise[ 在旋轉的對稱性下,用$n$個顏色對一個正四面體的*面*上色。 #set enum (numbering: al("a)")) + 對稱群的order是多少? + 對稱群的元素有哪些? 每個元素有幾個循環? + 有多少種不同的著色方式? ] #exercise[ 有$3$個顏色,幫一個正六面體上色,*每個顏色上兩個面*,可以通過旋轉變換得到視為相同的著色方式。總共有多少種不同的著色方式? ]
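As a quick sanity check of the colouring polynomial above, take $n = 2$ in the triangle-edge example, reusing only the $D_3$ cycle counts already listed there:

$ r = 1/6 (2^3 + 2 + 2 + 2^2 + 2^2 + 2^2) = 24/6 = 4, $

which matches a direct count: with two colours, an edge colouring of the triangle is determined up to symmetry by how many edges receive the first colour ($0$, $1$, $2$ or $3$), i.e. exactly $4$ classes.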
https://github.com/Kasci/LiturgicalBooks
https://raw.githubusercontent.com/Kasci/LiturgicalBooks/master/CSL_old/oktoich/Hlas4/1_Pondelok.typ
typst
#let V = ( "HV": ( ("", "Dál jesí známenije", "Sohriších k tebí čelovikoľúbče, ne po jestestvú jáko čelovík, da isprošú proščénije: no páče čelovíka, i výše jestestvá, i páče proščénija. No íže výše jestéstvennych ustáv, i páče postižénija mýsli, čelovík Spáse mój býv, i páče umá čelovikoľúbnoje imýj, pomíluj mjá k tebí obraščájuščahosja."), ("", "", "Položíl jesí pokajánije sohrišájuščym Christé, a ne právednym: óbraz úbo ímam razbójnika že i blúdnaho, Manassíji i bludnícy, honíteľa, mytarjá, i otmétnika, neudób otčajavájusja, tvojé bo čelovikoľúbnoje, i prebláhóe vídyj Spáse mój, obraščájusja že i slezjú, i blahonadéžden byváju jáko priímeši mjá."), ("", "", "Dážď mí umilénije, i zlých otčuždénije, i soveršénnoje ispravlénije, v strastéch ťilésnych nýňi pohružénomu, i udalénomu ot tebé Bóže vsích carjú, i nikákože nadéždu imúšča, spasí mja blúdnaho, mnóhija rádi bláhosti, Iisúse vsesíľne, spáse dúš nášich."), ("", "Jáko dóbľa", "Božéstvennyja svítlosti prijátnaja žilíšča, i prijátelišča vsečestnája, utverdíl jesí ánheľskaja vóinstva bezsmértne: i božéstvennym sostavlénijem, zríteli i služíteli tvojejá slávy sijá položív, v kríposti skončaváti tvojé slóvo, i choťínije ispolňáti vseďíteľnoje i presvjatóje."), ("", "", "Bláhosti pučínu javíti voschoťív, bláh sýj beznačáľne, pérvije sozdál jesí vsesíľnym tvojím manovénijem, i božéstvennym poveľínijem, ánheľskija líki, i sílam číny: voístinnu bo podobáše blahómu izlijátisja, i pochodíti, da i mnózim býti darovánijem Vladýko."), ("", "", "Serafími šestokrilátiji, Cheruvími mnohoočítiji s prevysókimi Prestóly, tebé obstoját, vseďíteľnaho tvojehó sijánija neposrédstvenňi priobščájuščesja, Hospóďstvija, Načála, Vlásti, Archánheli, Ánheli i Síly božéstvennyja: tvojú slávu vsederžíteľu chváľašče, o nás móľatsja tebí."), ("Bohoródičen", "", "Prevozšédši číny ánheľskija vseneporóčnaja, so ánhely vsehdá molí ánhely Vladýčestvujuščaho, i vséju tváriju, darováti nám prehrišénij ostavlénije, i izbávitisja strastéj, i dostójny sotvoríti nás tohdá pivcý slávy jehó, i nasľídniki netľínnyja píšči."), ), "S": ( ("", "", "Choťích slezámi omýti mojích prehrišénij rukopisánije Hóspodi, i próčeje životá mojehó pokajánijem blahouhodíti tebí, no vráh ľstít mjá, i bóret dúšu mojú: Hóspodi, préžde dáže do koncá ne pohíbnu, spasí mja."), ("", "", "Któ oburevájem, i pritekája ko pristánišču tvojemú Hóspodi, ne spasétsja? Ilí któ nedúhuja, i pripádaja ko vračevstvú tvojemú ne uvračúetsja, soďíteľu vsjáčeskich, i vračú nedúžnych? Hóspodi, préžde dáže do koncá ne pohíbnu, spasí mja."), ("", "", "Proslavľájajsja v pámjatech svjatých tvojích Christé Bóže, i ot ních umolén byvája, nizposlí nám véliju mílosť."), ("Bohoródičen", "", "Rádujsja, svíta óblače. Rádujsja, svíščniče svítlyj. Rádujsja rúčko, v néjže mánna. Rádujsja, žézle Aarónov. Rádujsja, kupinó neopalímaja. Rádujsja, čertóže. Rádujsja, prestóle. Rádujsja, horó svjatája. Rádujsja, pribížišče. Rádujsja, božéstvennaja trapézo. Rádujsja, dvére tájnaja. 
Rádujsja, vsích rádoste."), ), ) #let P = ( "1": ( ("", "", "Mórja čermnúju pučínu nevlážnymi stopámi drévnij pišešéstvovav Izráiľ, krestoobráznyma Mojséovyma rukáma, Amalíkovu sílu v pustýni pobidíl jésť."), ("", "", "Prijimí jéže ot duší molénije mojé prečístaja Vladýčice, jáže Bóha róždši plótiju: k tebí bo pribihóch k deržávňij pómošči, da ne pohrišú nadéždy mojejá."), ("", "", "Pripádaju raboľípno k tebí prečístaja Bohorodíteľnice, jáko derznovénije imúšči mnóho, ot vsjákich mjá skorbéj tvojími molítvami izbávi, k Sýnu tvojemú chodátajstvujušči."), ("", "", "Mórja žitéjskaho volnámi potopľájem, i napásťmi ľútymi oderžím, ko pristánišču ustremíchsja nevlájemomu, pokróvu tvojemú: ťímže mjá izbávi Bohoródice, ot ľútych."), ("", "", "Blahoutróbnym i tíchim tvojím ókom vozzrí na rabá tvojehó, i uslýšati potščísja, blahája, ispolňájušči moľbú tvojehó rabá prečístaja, i razorjájušči sovíty lukávyja."), ), "3": ( ("", "", "Veselítsja o tebí cérkov tvojá Christé, zovúšči: tý mojá kríposť Hóspodi, i pribížišče i utverždénije."), ("", "", "Pomóščnica na vrahí, i v bránech pobórnica Vladýčice, tý vírnym jesí, i pribížišče súščym v pečálech."), ("", "", "Jehóže rodilá jesí plótiju Bóha vsích, jáko Sýna molí, razrišénije zól mojích dáti mí ot žitéjskich nastojánij."), ("", "", "Prízri mílostivno na smirénije náše Vladýčice, vo jéže hňíva naležáščaho izbávitisja rabóm tvojím."), ("", "", "Króvom tvojím Vladýčice blahája, spasájemi prísno ot vsjákich skorbéj, chvalú Sýnu tvojemú prinošájem."), ), "4": ( ("", "", "Uslýšach slúch tvój, i ubojáchsja, razumích ďilá tvojá, i užasóchsja Hóspodi."), ("", "", "Jáko imúšči derznovénije čístaja Bohorodíteľnice, k Sýnu tvojemú, nastojáščaho mjá preminí iskušénija, razrušájušči sovíty částyja vrahóv prísno borjúščich mjá."), ("", "", "Lícy výšnich činóv, múčenicy i právednicy, so apóstoly, prorók božéstvennych sobór, i prepodóbnych s Bohoródiceju i Máteriju o nás Christá molíte."), ("", "", "Preslávnuju tvojú Máter Christé, prijimí moľáščujusja o míri, mílostivno tebí zovúščuju: Sýne mój, prijimí moľbú mojú, i utiší hňív naležáščij zemlí."), ("", "", "Pripádaju tí Bohoródice, i moľúsja iz hlubiný sérdca mojehó: nastojáščaho mjá preminí iskušénija, jáko da izbávľsja ot ľútych, písň prinesú tvojéj svítlosti."), ), "5": ( ("", "", "Nečestíviji ne úzrjat slávy tvojejá Christé, no mý ťa jedinoródne, Otéčeskija slávy sijánije Božestvá, ot nóšči útreňujušče, vospivájem ťá čelovikoľúbče."), ("", "", "Upovánije zemným, i zastuplénije čístaja, umilosérdisja na náše smirénije, mólimsja, i nastojáščaho hňíva svobodí nás."), ("", "", "Sťínu neoborímu čístaja, molítvu tvojú deržášče vopijém tí: umilosérdisja o Vladýčice, i vídimyja vrahí prožení."), ("", "", "Chváľaščijisja prísno upovánijem jéže k tebí prečístaja, da ne postydímsja, mólimsja so slezámi, i poklaňájemsja tvojéj bláhosti."), ("", "", "Prečístoju tvojéju rukóju prepítaja razžení vrahí, na ný vostajúščyja: i da razumíjut okajánniji, jáko na ťá upovánije vozložíchom."), ), "6": ( ("", "", "Vozopí, proobrazúja pohrebénije tridnévnoje, prorók Jóna, v kíťi moľásja: ot tlí izbávi mjá, Iisúse carjú síl."), ("", "", "Očísti náša hrichí čelovikoľúbče, mólimsja, molítvami bez símene róždšija ťá: nás bo rádi Slóve, tvojú čestnúju króv prolijál jesí."), ("", "", "Sobórišče na ný lukávnoje sobrásja neprávedno borjúščich nás, Bohonevísto: no sích nizloží, jáko Símona volchvá drévle."), ("", "", "Uslýši nášu moľbú Vladýčice, mólimsja, i búri vólny utiší razlíčnych boľíznej, ímiže vrazí na ný sobrášasja."), ("", "", "Pečáľ mojú 
na rádosť preloží, jáko ščédr, i pláč v vesélije preminí, i umilosérdisja Bohoródicy rádi, íže vódu v vinó v Káňi Haliléjsťij pretvorívyj Christé."), ), "S": ( ("", "", "Mnóhimi prehrišéniji áz blúdnyj úm pomračív, vopijú tvojemú krípkomu zastupléniju, Bohoródice: prosvití duší mojejá zínicy, i vozsijáj mí pokajánija svítluju zarjú, i oblecý mja vo orúžije svíta Bohorodíteľnice čístaja."), ), "7": ( ("", "", "Ne predážď nás do koncá ímene tvojehó rádi, i ne razorí zavíta tvojehó, i ne otstávi mílosti tvojejá ot nás, Hóspodi Bóže otéc nášich, prepítyj vo víki."), ("", "", "Íže v razlíčnych napástech i skórbech sýj, k tebí čístaja, spaséniju mojemú nýňi pribihóch, vopijú, da ne vozvraščúsja posrámlen ot čájanija mojehó: no uslýši, i izbávi mjá sítej lovjáščich."), ("", "", "Raboľípno výju prekloňáju okajánnyj, i ot sérdca mojehó vopijú, rúci prostér, koľína prekloňáju, i moľú ťa prečístuju Ďívu, izbávitisja skorbéj. Prísno nachoďáščich mňí ot navíta čuždáho."), ("", "", "Preslávnaja čístaja Maríje, zemných pochvaló, mólimsja, podážď tvojú pómošč nám, moľáščymsja i kláňajuščymsja blahočéstno, roždéstvú tvojemú, jáko inýja ne sťažáchom nadéždy i zastúpnicy rázve tebé."), ("", "", "Jáže páče umá Bóha róždši neiskusomúžnaja Maríje, i páče vsjákaho jestestvá, zastúpnice vírnych, naprásnych iskušénij čtúščyja ťá izbávi čísty, i bez vréda ot vráh vsjáčeskich, vídimych i nevídimych."), ), "8": ( ("", "", "Rúci rasprostér Daniíl, ľvóv zijánija v róvi zatčé: óhnennuju že sílu uhasíša, dobroďíteliju prepojásavšesja, blahočéstija račíteli ótrocy, vzyvájušče: blahoslovíte vsjá ďilá Hospódňa Hóspoda."), ("", "", "Rúci vozďíti ne smíju k Sýnu tvojemú čístaja, vés bo jésm oskvernén: ťímže derzáju k tebí Vladýčice pribíhnuti, chodátajstvuj k ščédromu Bóhu i blahopremínnomu, izbávitisja nám soprotívnych vráh oskorbľájuščich nás."), ("", "", "Óči i sérdce i dúšu na ťá prečístaja, vozložích: ťímže umilosérdisja Vladýčice čístaja, ščédromu pripádajušči o mňí chuďím i neterpilívim, da spasét mjá ot vsjákija núždy, i jázi, i pečáli."), ("", "", "Orúžijem tvojím, jáko sílen, borjúščyja nás poperí Hóspodi, i pobídu dáruj Vladýko, víroju naďíjuščymsja na ťá, Bohoródica mólitsja s predtéčeju Joánnom, i apóstoľskij lík i múčenik tvojích."), ("", "", "Rádosť blahovíščenija inohdá Havrijíl tebé čístaja Ďívo, prinesé, i pramáterneje sítovanije razrušísja roždestvóm tvojím: ťímže samá i mojejá duší unýnije očístivši, pokaží mja molítvami tvojími neposrámlena."), ), "9": ( ("", "", "Kámeň nerukoséčnyj, ot nesikómyja horý tebé Ďívo, kraeuhóľnyj otsečésja, Christós, sovokupívyj razstojáščajasja jestestvá: ťím veseľáščesja, ťá Bohoródice veličájem."), ("", "", "Javí Bohoródice Ďívo, tvojú pómošč vskóri: prikloní že usérdno úcho tvojé, i vopijúščich tépľi uslýši, i ľútych izminí, izbavľájušči nás molítvami tvojími."), ("", "", "Vés ľínostiju oderžím, i vo otčájaniji jésm hlubiný mojích prehrišénij: ťímže mí Ďívo Máti, rúku prostrí, jáko Petrú Christós, i iz hlubiný prehrišénij izbávi mjá."), ("", "", "Jazýk neobuzdán i veleríčiv ukrotí Ďívo, izoščrénnyj jáko strilá, ustrilíti choťášč i umoríti mjá, jáko vósk razlíj, i sovít jehó sújeten sotvorí."), ("", "", "Razruší sovíty vsjá, jáže na ný vooružénnych, Máti Bóha výšňaho, rádosti že ispólni upovájuščich na ťá: da usérdno vsí tvojé propovímy zastuplénije."), ), ) #let U = ( "S1": ( ("", "", "Smirénnuju mojú dúšu posití Hóspodi, vo hresích vsé žitijé iždívšuju: jákože bludnícu prijimí mené, i spasí mja."), ("", "", "Preplávaja pučínu nastojáščaho žitijá, pomyšľáju bézdnu mnóhich 
mojích zól, i ne imíjaj okormíteľa pomyšlénij, Petróv proviščaváju tí hlás: spasí mja Christé, spasí mja Bóže, jáko čelovikoľúbec."), ("Bohoródičen", "", "Sťiná nepobidímaja nám christijánom jesí Bohoródice Ďívo: k tebí bo pribihájušče nevredími prebyvájem, i páki sohrišájušče, ímamy ťá molítvennicu. Ťím blahodárstvenno vopijém tí: rádujsja blahodátnaja, Hospóď s tobóju."), ), "S2": ( ("", "", "Skóro sovnídem v nevístnik Christóv, da vsí uslýšim božéstvennyj hlás Christá Bóha nášeho: prijidíte ľúbjaščiji nebésnuju slávu, i prijimíte sijú vkúpi s múdrymi ďívami, prosvitívše sviščý svojá víroju."), ("", "", "Mnóžestvom prehrišénij mojích osuždájem, i stráchom mučénija smuščájem jésm. Christé Bóže náš slézy pokajánija ot sérdca prinošú tebí, imúščemu vlásť životá, i smérti, i zovú ti vo umiléniji: sohriších, Hóspodi, spasí mja."), ("", "", "Dnés ánheľskaja vójinstva, v pámjať strastotérpec prijidóša, vírnych mýsli prosvitíti, i vselénnuju blahodátiju ujasníti: ťí Bóže umolén byvája, dáruj nám véliju mílosť."), ("Bohoródičen", "", "Jéže ot ánhela slóvo prijémši Ďívo, vo utróbi tvojéj, i róždši voploščénna Christá Bóha Jemmanúila, Bohoródice, molí o dušách nášich."), ), "S3": ( ("", "Skóro predvarí", "Neveščéstvennymi ustý, Tróice prebožéstvennaja, neprestánno pojút ťá bezplótnych lícy, i so stráchom predstoját, svját, vzyvájušče, trijipostásnoje jestestvó: pomíluj rúk tvojích sozdánije tvojé, ťích moľbámi, jedíne čelovikoľúbče."), ("", "", "Ánheľstiji číni so stráchom predstoját prestólu tvojemú Vladýko, i jéže otonúdu zarjámi prísno prosviščájemi, písň tebí nemólčno pobídnuju vospivájut Hóspodi: íchže svjaščénnymi molítvami, mír míru, i ostavlénije prehrišénij nášich dáruj."), ("Bohoródičen", "", "Ďívo vseneporóčnaja, jáže presúščnaho Bóha róždši, so bezplótnymi tohó neprestánno molí, ostavlénije prehrišénij, i ispravlénije žitijá dáti nám préžde koncá, víroju i ľubóviju pojúščym ťá po dólhu, jedína vsepítaja."), ), "K": ( "P1": ( "1": ( ("", "", "Tristáty kripkija, roždéjsja ot Ďívy, bezstrástija vo hlubiňí, duší tričástnoje potopí, moľúsja, da tebí jáko v timpáňi, vo umerščvléniji ťilesé pobídnoje vospojú pínije."), ("", "", "Spáse mój Iisúse, íže blúdnaho spasýj, i íže bludnícy pláč prijémyj, íže mytarjá vozdochnúvša, manovénijem svoím opravdávyj, i mené bez čislá sohrišívšaho, i obraščájuščasja prijimí, i spasí mja."), ("", "", "Jáko véšč sňidájet dúšu mojú zlóbnyj óhň, i plámeňu tvorít podhňiščénije búduščemu: čelovikoľúbče, uhasí tohó, orošénijem tvojích mílostej dolhoterpilíve, podáv mí slézy pokajánija."), ("Múčeničen", "", "Rázuma mnóha ispólnen lík svjatých stradálec, nerazúmen sovít i soprotívnoje mudrovánije vsích zakonoprestúpnych, cilomúdrenno ukloní, i božéstvennych póčestej polučí."), ("Múčeničen", "", "Íže míra prezrívše krásnaja víroju, i premírnuju žízň nasľídovaste múdriji strastotérpcy prechváľniji, vsjákaho mjá úbo izbávite mirskáho smišénija, ískrenno vás blažáščaho."), ("Bohoródičen", "", "Presvítlaja sviščé sólnca slávy, duší mojejá óhň unýnijem uhašénnyj vozžzí prečístaja, prísno božéstvennych ďíl jeléjem napojájušči, da víroju i ľubóviju slávľu ťá."), ), "2": ( ("", "", "Tristáty kripkija, roždéjsja ot Ďívy, bezstrástija vo hlubiňí, duší tričástnoje potopí, moľúsja, da tebí jáko v timpáňi, vo umerščvléniji ťilesé pobídnoje vospojú pínije."), ("", "", "Jáko úmi čístiji, ánheli, velíkomu i pérvomu predstojášče Umú, i božéstvennaho sijánija nasyščájemi, lučéju mjá vášeju ozaríte, pojúšče vsevinóvnaho Slóva preslávniji."), ("", "", "Jáko úmi čístiji, ánheli, velíkomu i 
pérvomu predstojášče Umú, i božéstvennaho sijánija nasyščájemi, lučéju mjá vášeju ozaríte, pojúšče vsevinóvnaho Slóva preslávniji."), ("", "", "Otňúd k Bóhu preklóňšesja ľubóviju, i božéstvennymi dobrótami voobražájemi jávi, o archánheli slávniji, stojánijem priľížnym obstoité, zovúšče pobídnuju písň ziždíteľu."), ("Bohoródičen", "", "Jáže jedína vo črévi prijémši Slóvo, jehóže ánheľskaja vójinstva slávjat prísno, dúšu mojú ozarí, rišášči hrichóvnoje mráčnoje zlomyšlénije vseneporóčnaja, i prosviščájušči rázumom roždestvá tvojehó."), ), ), "P3": ( "1": ( ("", "", "S vysotý snizšél jesí vóleju na zémľu, prevýše vsjákaho načála, i smirénnoje voznésl jesí iz áda preispódňaho jestestvó čelovíčeskoje: ňísť bo svját páče tebé, čelovikoľúbče."), ("", "", "Nóščiju mjá žitijá ťmá strastéj objála jésť, Christé Bóže, svít sýj nezachodímyj, lučámi pokajánija ozarív mjá spasí, jáko čelovikoľúbec, da slávľu ťá."), ("", "", "Části mjá pokaží Christé, izbránnych nasľídnika, soprotívnyja části otlučív Spáse, slezámi i mílostyneju očiščéna, jáko da vo chvaléniji ťá slávľu vsehdá."), ("Múčeničen", "", "Očervleníšasja voístinnu króviju váši nóhi, i bystríjše k nebesí potekóša, so hrichóm zémľu ostávľše, múčenicy, sobesídnicy božéstvennym sílam."), ("Múčeničen", "", "Udručájemo ťílo váše ránami rastájašesja, Christóvy stradáľcy: dušévnaja že ukripľášesja síla, ľubóviju privjazújema nerišímo k sozdávšemu vsjáčeskaja choťínijem."), ("Bohoródičen", "", "<NAME>, jáko súšči vsím Hóspoda róždši, obladájema strasťmí umá mojehó, i zlóboju očernéna, svobódna sotvorí i prosvití."), ), "2": ( ("", "", "S vysotý snizšél jesí vóleju na zémľu, prevýše vsjákaho načála, i smirénnoje voznésl jesí iz áda preispódňaho jestestvó čelovíčeskoje: ňísť bo svját páče tebé, čelovikoľúbče."), ("", "", "Íže premírnymi líki vospivájemyj Christé, čínom ťích vírnych sostavlénija vospiváti Bohomúdrenno sotvorí, Slóve: ňísť bo svját páče tebé čelovikoľúbče."), ("", "", "Íže premírnymi líki vospivájemyj Christé, čínom ťích vírnych sostavlénija vospiváti Bohomúdrenno sotvorí, Slóve: ňísť bo svját páče tebé čelovikoľúbče."), ("", "", "Ľubóviju téploju, privjazánijem račíteľnym pričaščájuščesja pérvomu istóčniku služíteľňi predstoité, pojúšče nemólčno, jedíno suščestvó Božestvá beznačáľnaho, božéstvenniji archánheli."), ("Bohoródičen", "", "Jévinu drévňuju kľátvu, Christá róždši Máti čístaja, presvítlo razrišíla jesí, vsích vinčájuščaho blahoslovéňmi: ňísť bo prečístaja, rázvi tebé pomóščnicy."), ), ), "P4": ( "1": ( ("", "", "Siďáj v slávi na prestóľi božestvá, vo óblaci léhci priíde Iisús prebožéstvennyj, netľínnoju dlániju, i spasé zovúščyja: sláva Christé síľi tvojéj."), ("", "", "Jáko sudijí právedňijšemu nýňi pripádaju tí Hóspodi, osuždénna i otčájannaho mjá uščédri, i právednaho tvojehó izbávi izrečénija, i stojánija izbránnych spodóbi."), ("", "", "V razbójniki vpádša neukrotímyja čelovikoľúbče, i ujázvlena bývša iscilí Christé, vozlivája na mjá pokajánija vinó i jeléj i oďivája mjá odéždoju spasénija mojehó."), ("Múčeničen", "", "Oblekóstesja svýše rízoju spasénija, v sovlečéniji ťilésňim vsechváľniji múčenicy, i sovlékšaho préžde práotca obnažíste, bez dychánija mértva sotvórše."), ("Múčeničen", "", "Vitíjstvujušče pred bezzakónnymi múčenicy, Bóžija slóva rázumom blahočéstija ukrašájemi, mudrecý i vitíji vsjá posramíste nečéstvujuščyja, i vrahá umertvívše."), ("Bohoródičen", "", "Na ťá jákože dóžď, premúdrosti bézdna Iisús sníde, jedínu čístuju tebé obrít Bohorodíteľnice Ďívo, i zahradí nečéstija potóki ľútyja, božéstvennoju blahodátiju 
."), ), "2": ( ("", "", "Siďáj v slávi na prestóľi božestvá, vo óblaci léhci priíde Iisús prebožéstvennyj, netľínnoju dlániju, i spasé zovúščyja: sláva Christé síľi tvojéj."), ("", "", "Nepostižímoju síloju ot nebytijá privél jesí nebésnyja umý, Slóve Bóžij presúščestvennyj, i neizrečénnoju tvojéju slávoju ukrasíl jesí, zovúščyja: sláva Christé síľi tvojéj."), ("", "", "Upravľájemi výšnija síly Dúchom, i brozdámi jehó, i božéstvennymi zarjámi oblistájemi, neotpádajuščyja číny nasľídovaša, jedíno čtúšče vsích načálo i Božestvó."), ("", "", "Licá tvojehó dobrótu krásnuju zríti, spodóbišasja služíteľnyja tvojá svítlosti, i otonúdu razumínija vosprijémľušče, tebí vopijút: sláva Christé síľi tvojéj."), ("Bohoródičen", "", "Caríca Ďívo, zlatóju odéždoju ukrášena, Sýnu carjú nýňi predstoít, jáže bez sravnénija ánhel prevýšši, zovúščich: sláva Christé síľi tvojéj."), ), ), "P5": ( "1": ( ("", "", "Nýňi vostánu, proróčeski rečé Bóh, nýňi proslávľusja, nýňi voznesúsja, pádšaho prijém ot Ďívy, i k svítu úmnomu voznosjáj mojehó Božestvá."), ("", "", "O káko osuždén choščú predstáti tebí, Sudijí Bóhu vsích, i obličén býti o vsích zlých, íchže bez umá sohriších vóleju, i vsehó sebé nepotrébna sotvorích!"), ("", "", "Spasí mja Hóspodi, jáko zól ispólnichsja mnóhich, i moľúsja: iscilí hrichí mojá i hnojénija ľútaja, i ne ostávi pohíbnuti mjá jedínaho Iisúse mój, mnóho tí sohrišívšaho."), ("Múčeničen", "", "Končínu blažénnu obrítše stradáľcy jávi, slávu ulučíša, Christá proslavľájušče svojími údy, strastopoložíteľa, jázvam i ránam pričaščátisja múžeski predrazsuždájušče."), ("Múčeničen", "", "Bohátstvo nebésnoje, žitijém božéstvennym i krásnym, vinéc neuvjadájemyj, i svít nevečérnij, i žilíšče nikákože obetšájemoje nerukotvorénno nasľídovaste, stradáľcy Christóvy blažénniji."), ("Bohoródičen", "", "Tvojá čudesá proróčestiji prorekóša hlási, hóru naricájušče ťá prečístaja, i dvér, i svitíľnik svítel: iz nehóže voístinnu svít čúdnyj prosviščájet čístaja, mír vés."), ), "2": ( ("", "", "Nýňi vostánu, proróčeski rečé Bóh, nýňi proslávľusja, nýňi voznesúsja, pádšaho prijém ot Ďívy, i k svítu úmnomu voznosjáj mojehó Božestvá."), ("", "", "Trépetno slávjat Cheruvími, i Serafími, so Prestóly, i božéstvenniji Archánheli, i Hospóďstvija, i Síly, i Načála, i Vlásti so Ánhely, čestnóe jedíno i Tróičeskoje Božestvó."), ("", "", "Trépetno slávjat Cheruvími, i Serafími, so Prestóly, i božéstvenniji Archánheli, i Hospóďstvija, i Síly, i Načála, i Vlásti so Ánhely, čestnóe jedíno i Tróičeskoje Božestvó."), ("", "", "Javíšasja ánheli svitovídno sijájušče Christé, tvojé voskresénije propovídajušče v míri prepodóbnym ženám, i vrahóv tvojích sotrjasájušče úm blistáňmi tvojehó Božestvá."), ("Bohoródičen", "", "Nýňi sochraní opolčénije archánheľskimi cérkov tvojú, tebé slávjaščuju pravoslávnymi hlásy, íže ot Ďívy neizrečénno roždéjsja, i čelovíki ot tlí izbavľájaj."), ), ), "P6": ( "1": ( ("", "", "Prijidóch vo hlubiný morskíja, i potopíla mjá jésť búrja mnóhich hrichóv: no jáko Bóh iz hlubiný vozvedí živót mój, mnohomílostive."), ("", "", "Mértv sýj ne razumíju, ne čúvstvuja okajánnyj, i sóvisť oskvernénu nosjá prísno: Bóže sozdáteľu mój, da ne do koncá pohubíši mené."), ("", "", "Ďijánija mojá jákože vrazí oklevetáti mjá choťát na sudíšči tvojém, ščédre: ot níchže skóro izbávi Christé, nastavľája mjá k pokajániju."), ("Múčeničen", "", "Sokruší kósti strastonósec, sobór zakonoprestúpnych, no ne sokruší ťích víru, jejáže rádi nasľídnicy javíšasja Bóhu, i Spásu dúš nášich."), ("Múčeničen", "", "Jáko čestnóje kámenije, na kámeni 
nepokoléblemi nadéždi Bohomúdrenno nazdášasja strastonóscy, i jáko chrámy svjatáho Dúcha, v chrám Bóžij vselíšasja."), ("Bohoródičen", "", "Omračívšejesja sérdce mojé témnymi naítiji hrichá, svítom íže v tebí Bohonevístnaja ozarí, jáko sólnce Christá róždšaja."), ), "2": ( ("", "", "Prijidóch vo hlubiný morskíja, i potopíla mjá jésť búrja mnóhich hrichóv: no jáko Bóh iz hlubiný vozvedí živót mój, mnohomílostive."), ("", "", "Ókrest Vladýki predstojášče ánheľskaja vójinstva, i čísťi naslaždájuščesja sijánijem načála svítlaho, prosvitíte víroju vás pojúščich."), ("", "", "Ókrest Vladýki predstojášče ánheľskaja vójinstva, i čísťi naslaždájuščesja sijánijem načála svítlaho, prosvitíte víroju vás pojúščich."), ("", "", "Premúdrostiju tvojéju sotvorívyj ánheľskija líki, Hospóďstvija že i Síly, i Serafímy, jáko Vladýka pokazál jesí, chvalámi ťá čtúščyja."), ("Bohoródičen", "", "Jáko na prestóľich prevoznesénnych, Christé, počivájaj, i vsjáčeskaja božéstvennym prómyslom sobľudájaj, na rukú ďivíčeskuju Vladýko počíl jesí."), ), ), "P7": ( "1": ( ("", "", "Júnoši trí vo Vavilóňi, veľínije mučítelevo na bújstvo prelóžše, posreďí plámene vopijáchu: blahoslovén jesí Hóspodi Bóže otéc nášich."), ("", "", "Komú ťa upodóbľu, dušé mojá okajánnaja, uvý mňí, ľúbjaščuju bezmístnaja, a ne dóbraja vzyskújuščuju? Ťímže préžde koncá potščísja óbrazy blahíja pokazáti."), ("", "", "Túču mí Christé sléz dážď, jáko da ot ľútych mojích izmýjusja, i ne ostávi mené Spáse, nýňi pohíbnuti, sohrišívšaho tí mnóho páče čelovík."), ("Múčeničen", "", "Mértvosť umerščvlénnaho Slóva nosjášče na ťíľi svojém, prélesť umertvíste: živeté že i umérše slávniji, i umerščvlénnyja strasťmí stradáľcy vračújete."), ("Múčeničen", "", "Kóje místo nýňi ne ímať vás múčenicy prosvitíteli, i predhrádije? Kája straná ne osvjaščájetsja vášimi stradáňmi slávniji, i vostókami iscilénij?"), ("Bohoródičen", "", "Jedína prebyváješi po roždeství Vladýčice, ďivstvennoju dobrótoju sijájušči, jedína máternich izbíhla jesí boľíznej: Bóha bo jedína rodilá jesí, Izbáviteľa dúš nášich."), ), "2": ( ("", "", "Júnoši trí vo Vavilóňi, veľínije mučítelevo na bújstvo prelóžše, posreďí plámene vopijáchu: blahoslovén jesí Hóspodi Bóže otéc nášich."), ("", "", "Sviďíteľi soďijannym imúšče mýslennyja ánhely, izberém čístoje žitijé dušé mojá, vopijúšče: blahoslovén jesí Hóspodi Bóže otéc nášich."), ("", "", "Sviďíteľi soďijannym imúšče mýslennyja ánhely, izberém čístoje žitijé dušé mojá, vopijúšče: blahoslovén jesí Hóspodi Bóže otéc nášich."), ("", "", "Úhlem očiščénnyj, prestólu tvojemú predstojáščyja zrít Serafímy, božéstvennyj vopijá Isáija: blahoslovén jesí Hóspodi Bóže otéc nášich."), ("Bohoródičen", "", "Číny vsích bezplótnych, Ďívo, jávi prevoschódiši, jáko tvorcá róždši i Hóspoda: blahoslovén prečístaja, plód tvojehó čréva."), ), ), "P8": ( "1": ( ("", "", "Izbáviteľu vsích vsesíľne, posreďí plámene blahočéstvovavšyja, snizšéd orosíl jesí, i naučíl jesí píti: vsjá ďilá blahoslovíte , pójte Hóspoda."), ("", "", "Javíchsja skotóm podóben bezslovésnym, podklonívsja strastém. 
Slóve Bóžij prebeznačáľne, obratív mjá, spasí vopijúšča: blahoslovíte vsjá ďilá Hospódňa Hóspoda."), ("", "", "Ozobá vépr, pozobá jedínok mjá Spáse, jákože vinohrád vozďílannyj Dúchom: ot nehóže izbávi mjá Slóve, i plodonósna dobroďítelmi ábije tebí pokaží."), ("Múčeničen", "", "Obroščénija króvnaja očervleníša bahrjanícu vám Bohotkánnuju, jéjuže ukrasístesja múčenicy, vincý nosjášče pobídy, íže v výšnich carjú víčnomu predstoité."), ("Múčeničen", "", "Svjaščénnoje múčenikov soslóvije nesvjaščénnoje razruší sohlásije, bezzakónnovati poveľivájuščeje, i zakónno postradáv, ot Vladýki vsích zakónno vinčásja."), ("Bohoródičen", "", "blahoslovít vsjá tvár roždestvó tvojé, blahoslovéňmi nás vinčávšeje, i ot kľátvy izjémšeje, vseblahoslovénnaja jedína, i preproslávlennaja, jáže ród náš oblahodátivšaja."), ), "2": ( ("", "", "Izbáviteľu vsích vsesíľne, posreďí plámene blahočéstvovavšyja, snizšéd orosíl jesí, i naučíl jesí píti: vsjá ďilá blahoslovíte , pójte Hóspoda."), ("", "", "Jáko sýj živót bezsmértnyj, bezsmértnomu životú pričaščátisja ziždíteľňi sotvoríl jesí ánhely, i naučíl jesí píti: blahoslovíte , pójte Hóspoda."), ("", "", "Jáko sýj živót bezsmértnyj, bezsmértnomu životú pričaščátisja ziždíteľňi sotvoríl jesí ánhely, i naučíl jesí píti: blahoslovíte , pójte Hóspoda."), ("", "", "Mýslenno obstojášče ťá archánheli, neprestánnymi hlásy vospivájut, Bohoľípno čtúšče ťá jáko Vladýku vsích: blahoslovíte , pójte Hóspoda."), ("Bohoródičen", "", "Zakónniji óbrazi proobražáchu ťá vseblažénnaja, róždšuju Bóha, véšči plótsťij sojediňájema, préžde neveščéstvenna súšča božéstvennym jestestvóm: blahoslovím Ďívo, roždestvó tvojé."), ), ), "P9": ( "1": ( ("", "", "Jéva úbo nedúhom preslušánija kľátvu vselíla jésť: tý že Ďívo Bohoródice, prozjabénijem črevonošénija, mírovi blahoslovénije procvilá jésť: ťím ťá vsí veličájem."), ("", "", "Sé pokajánija vrémja, čtó ľinímsja, čtó snóm pohružájemsja? Unýnija da otstúpim, ukrasím sviščý, jákože píšet, jeléjem blahotvorénija: da ne ostánem vňijúdu dveréj rydájušči."), ("", "", "Dóndeže jésť vrémja pokájatisja, obratísja ot zól tvojích dušé mojá, jáže soďíla jesí víďinijem i nevíďinijem, i vozopíj k víduščemu vsjá: sohriších tí, prostí Vladýko, i ne hnušájsja mené nedostójnaho."), ("Múčeničen", "", "Sobrá Christós stradávšyja svjatýja javlénňijše, ot vsjákija straný i hráda na místa slávnaja, v čestnája pokójišča, i nýňi pervoródnych cérkov prosviščájut veseľáščesja."), ("Múčeničen", "", "Vsečestnája ráka čestných múčenik tvojích Hóspodi, lučámi božéstvennaho Dúcha ozarjájema, preslávňi iscilénija ispuščájet svítlosť, i razorjájet nedúhov boľízni, jedíne mnohomílostive."), ("Bohoródičen", "", "Ot svíta íže v tebí Bohonevísto, zarjámi dúšu mojú prosvití, ležáščuju v róvi pohíbeli vozstávi, vrahí sokrušájušči, oskorbľájuščyja sérdce mojé prísno, i k strastém porivájuščyja mjá."), ), "2": ( ("", "", "Sokrovénnoje Bóžije neizrečénnoje v tebí soveršájetsja, jávstvennoje tájinstvo, Ďívo prečístaja: íbo Bóh iz tebé voplotísja za milosérdije. 
ťímže ťá jáko Bohoródicu veličájem."), ("", "", "Íže umá rodíteľa, i predložíteľa Sýnu i Dúchu pojúšče ánheli, nýňi k nám božéstvennyja blahodáti podajánija, priľížno prijémše posyláti usérdstvujte."), ("", "", "Krasnó udobrjájemi netľínija dárom i blahodátiju , božéstvenniji archánheli, tebé istóčnika Christé prisnosúščna netľínija, pojúšče jáko bahodáteľa veličájut."), ("Bohoródičen", "", "Nevístnik voploščénija neizrečénnaho, i čertóh oduševlénnyj, i kovčéh zakóna blahodáti ťá Bohomáti, vírniji svímy: ťímže ťá neprestánno veličájem."), ), ), ), "ST": ( ("", "", "Omýj mjá slezámi moími Spáse, jáko oskverníchsja mnóhimi hrichí. Ťímže i pripádaju tí: sohriších Bóže, pomíluj mjá."), ("", "", "Ovčá jésm slovésnaho tvojehó stáda, i k tebí pribiháju pástyrju dóbromu: vzyščí mené zablúždšaho, Bóže, i pomíluj mjá."), ("Múčeničen", "", "Któ ne užasájetsja zrjá, svjatíji múčenicy, pódviha dóbraho vášeho, ímže podvizástesja? Káko vo plóti súšče, bezplótnaho vrahá pobidíste, Christá ispovídajušče, i krestóm vooružívšesja! Ťímže dostójno javístesja bisóv prohonítelije, i várvarov soprotivobórcy, neprestánno moľáščesja, spastísja dušám nášym."), ("Bohoródičen", "", "Bohoródice vsích caríce, pravoslávnych pochvaló, jeretíčestvujuščich šatánija razorí, i líca ích posramí, ne kláňajuščichsja, nižé čtúščich prečístaja, čéstnýj tvój óbraz."), ) ) #let L = ( "B": ( ("", "", "Drévom Adám rajá býsť izselén, drévom že kréstnym razbójnik v ráj vselísja: óv úbo vkúš zápoviď otvérže sotvóršaho: óv že sraspinájem, Bóha ispovída tajáščahosja, pomjaní mja, vopijá, vo cárstviji tvojém."), ("", "", "Páče vsích čelovík na zemlí áz sohriších, i bojúsja támo nelicemírnaho sudíšča prebláhíj: na némže neosuždéna mjá sobľudí tohdá, i ot múki izbávi, podajá mi pokajánije, omyvájuščeje vsjákija skvérny, jáko čelovikoľúbec."), ("", "", "Cheruvími, i Serafími , Vlásti, Prestóli, Archánheli, Hospóďstvija vkúpi i Síly, svjatíji Ánheli, Načála vysočájšaja, Vladýci vsích nýňi predstojášče, sohrišénij ostavlénija i žitijá ispravlénija isprosíte vsím vopijúščym vírno: pomjaní nás vo cárstviji tvojém."), ("", "", "Primišájuščesja ohňú, chvrástnuju prélesť popalíste strastonóscy Christóvy: i mnóžestvom vášeja króve, hlubínnaho zmíja otňúd potopíste vsechváľniji, pobídu vzémše, s výšnimi vójinstvy rádujetesja, moľáščesja priľížno spastísja nám."), ("", "", "Trisólnečnaja zaré, jáže v mirskích sijájuščaja ispolnénijich, duší mojejá ľútyja othnávši strásti, nizposlí mňí svíta sijánije, i očiščénije sohrišénij, víroju nýňi zovúšču tí beznačáľnomu Otcú, Sýnu soprestóľnu, i Dúchu: Tróice, vseďíteľnaja sílo, spasí nás."), ("", "", "Sohrišájuščaho prísno, i ľínostiju vesmá soderžíma, uščédri čístaja i pokajánija óbrazy udobrí, dajúšči umilénije nedoumínňij duší mojéj, prečístaja, nadéždo nepostýdnaja: ľubóviju vospivájuščich ťá, i vopijúščich vírno, pomjaní i nás Ďívo vsepítaja."), ) )
https://github.com/katamyra/Notes
https://raw.githubusercontent.com/katamyra/Notes/main/Compiled%20School%20Notes/CS3001/Sections/Section6.typ
typst
= The Ice Cream Club

== Consumer Protection

From a consumer protection point of view, this is a clear violation of their privacy. These men were obviously not informed of their data being sold at all, so their privacy is getting invaded. Even if some might argue that it "was in the fine print", or something similar, realistically it was hidden in a manner that the men would never have known their data would be sold.

= Privacy

+ I think that generally, I am honestly less worried about my privacy than I should be. Sometimes when I think about my privacy I tend to feel like "so what" if it doesn't
+ New technology in general is really threatening our privacy. Especially with AI, our data is being used in completely new and unique ways through unique algorithms.
+ As developers we need to be more cautious about making sure our code does not have underlying prejudices based on the data we use, and that we don't contribute to morally gray projects.
+ As citizens we need to shun companies that are using our data unfairly, and vote on privacy protection laws
+ Policy makers need to have stronger laws

= EU vs US Citizens

- This is good for EU individuals, but bad for companies because it gives them less room to take advantage of us
- The trade-off is that privacy leads to less innovation/cool projects being made with our data
- Less economic growth from increasing compliance costs for businesses

= National ID Card

- IF YOU CAN LOCK IT, "Reduced Identity Theft: A national ID card with robust security features could help combat identity theft and fraud by making it more difficult for individuals to use false or stolen identities.

= Right to be Forgotten

- Good for victims, but for ex-criminals they shouldn't have that right

= Face CHAT GPT
https://github.com/elteammate/typst-compiler
https://raw.githubusercontent.com/elteammate/typst-compiler/main/src/reflection-ast.typ
typst
#import "utils.typ": * #let ast_node_type = mk_enum( debug: true, "paren", "ident", "unknown", "member_access", "call", "call_param_list", "content", "hash_expr", "stmt", "code_block", "content_block", "let_", "if_", "else_", "while_", "for_", "continue_", "break_", "return_", "set_", "show_", "import_", "param_list", "sink", "unary_plus", "unary_minus", "unary_not", "binary_add", "binary_sub", "binary_mul", "binary_div", "binary_eq", "binary_ne", "binary_lt", "binary_gt", "binary_le", "binary_ge", "binary_in", "binary_not_in", "binary_and", "binary_or", "type_cast", "assign", "add_assign", "sub_assign", "mul_assign", "div_assign", "literal_int", "literal_float", "literal_string", "literal_bool", "literal_none", "suffixed", "array", "dict", "lambda", ) #let mk_node(type, span: none, ..fields) = { assert(fields.pos().len() == 0, message: "AST requires node parameters to be named") ( type: type, fields: fields.named(), span: span, ) } #let is_ast_node(obj) = type(obj) == "dictionary" and "type" in obj and "fields" in obj #let pprint_ast_old(ast) = [ #text(blue, raw(ast.type)) // @ #pprint_span(ast.span) #list( ..ast.fields.pairs().map(item => { let key = item.at(0) let value = item.at(1) let ty = if is_ast_node(value) { "ast" } else { type(value) } [#raw(key);(#raw(ty))#sym.space.quad] if ty == "ast" { pprint_ast(value) } else if ty in ("dictionary", "array") { let items = () for k, v in value { if is_ast_node(v) { items.push([#raw(k): #pprint_ast(v)]) } else { items.push([#raw(k): #repr(v)]) } } list(..items) } else { text(maroon, [#repr(value)]) } }) ) ] #let fix_paren_value(parenthesised) = { if parenthesised.dict_flag { return mk_node( ast_node_type.dict, elements: parenthesised.elements, ) } if parenthesised.elements.len() == 0 { return mk_node( ast_node_type.array, elements: (), ) } else if parenthesised.elements.len() == 1 { if parenthesised.elements.at(0).sink { return mk_node( ast_node_type.array, elements: parenthesised.elements, ) } else if parenthesised.elements.at(0).key != none { return mk_node( ast_node_type.dict, elements: parenthesised.elements, ) } else if parenthesised.trailing_comma { return mk_node( ast_node_type.array, elements: parenthesised.elements, ) } else { return mk_node( ast_node_type.paren, expr: parenthesised.elements.at(0).value, ) } } else { if parenthesised.elements.all(x => x.key == none) { return mk_node( ast_node_type.array, elements: parenthesised.elements, ) } else if parenthesised.elements.all(x => x.sink or x.key != none) { return mk_node( ast_node_type.dict, elements: parenthesised.elements, ) } else { panic("Bad parenthesised literal") } } } #let fix_paren_call(parenthesised) = { parenthesised.elements } #let fix_paren_params(parenthesised) = { parenthesised.elements }
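A minimal usage sketch based only on the constructors defined above; the field names `lhs`, `rhs` and `value` are illustrative and not necessarily the schema the rest of the compiler expects.

#let one = mk_node(ast_node_type.literal_int, value: 1)
#let two = mk_node(ast_node_type.literal_int, value: 2)
#let sum = mk_node(ast_node_type.binary_add, lhs: one, rhs: two)

// `mk_node` only accepts named fields; positional fields trip its assert.
#assert(is_ast_node(sum))
#assert(sum.fields.lhs.type == ast_node_type.literal_int)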
https://github.com/Toniolo-Marco/git-for-dummies
https://raw.githubusercontent.com/Toniolo-Marco/git-for-dummies/main/slides/practice/remote.typ
typst
#import "@preview/touying:0.5.2": * #import themes.university: * #import "@preview/numbly:0.1.0": numbly #import "@preview/fletcher:0.5.1" as fletcher: node, edge #let fletcher-diagram = touying-reducer.with(reduce: fletcher.diagram, cover: fletcher.hide) #import "/slides/components/gh-button.typ": gh_button #import "/slides/components/git-graph.typ": branch_indicator, commit_node, connect_nodes, branch #import "/slides/components/utils.typ": rainbow #import "/slides/components/thmbox.typ": custom-box, alert-box #grid(columns: 2, column-gutter: 5%, [So far* we have worked only on the local repository*, addressing scenarios without considering the remote repository. In this chapter we will make up for this shortcoming.], image("/slides/img/meme/git-remote-add.png") ) --- === Analysis. To get information about the remote we can make use of several commands: #align(center, ```bash ➜ git remote show #show the name of all remotes origin ➜ git remote show origin #show info about one remote * remote origin Fetch URL: https://github.com/nome-organizzazione/nome-repo.git Push URL: https://github.com/nome-organizzazione/nome-repo.git HEAD branch: (unknown) ➜ git remote -v #show info about all remotes origin https://github.com/nome-organizzazione/nome-repo.git (fetch) origin https://github.com/nome-organizzazione/nome-repo.git (push) ``` ) As you can guess the command suggested by GitHub: `git remote add origin URL`, (seen previously) will add the URL as a remote repository, with the name *origin*. --- === Fetch The `git fetch` command *downloads new commits, branches and tags from the remote repositories* and thus allows us to compare the information received with our local repo. *All this is performed without applying the changes to our branches locally.* In particular, the command has this syntax: `git fetch <remote> <refspec>`. If launched without arguments it may not update all the remotes (or the one we are interested in). To know which remote involves the fetch operation, try: #align(center, ```bash ➜ git fetch -v POST git-upload-pack (186 bytes) From https://github.com/Owner/repo = [up to date] main -> origin/main = [up to date] feature-1 -> origin/feature-1 = [up to date] feature-2 -> origin/feature-2 ``` ) --- === Fetch By default, git uses _origin_ as the remote, so for example if we had a remote, like the one depicted here; *the command would not work*: #align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch( // origin main branch name:"main ", remote: "my-fork ", indicator-xy: (4.75,0.5), color:blue, start:(0,1), length:5, head: 4, ), ) ] ] #pause There are several solutions we could adopt: - Obviously use the specific command `git fetch my-fork`. - Apply fetch to all remotes: `git fetch --all`. - Set the remote we are interested in as the default. This can be done either by editing the file in `.git/config`, or with the command. --- === Push and Pull Operations The push and pull operations are needed to keep *local and remote repositories synchronized*. As the name of the command itself suggests, `git push` *sends local changes to the remote repository*, while `git pull` *downloads and applies changes* from the remote repository. 
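For instance, with the `origin` remote and `main` branch used throughout these slides:

#align(center,
  ```bash
  ➜ git pull origin main   # fetch origin/main and merge it into the local main
  ➜ git push origin main   # upload the local commits on main to origin
  ```
)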
To identify which remote repository a branch belongs to, we will use the notation _remote/branch_ in the branch label:

#align(center)[
  #scale(100%)[
    #set text(11pt)
    #fletcher-diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch( // main branch
        name:"main",
        indicator-xy: (6.75,0.5),
        color:blue,
        start:(0,1),
        length:7,
        head: 6,
      ),
      // origin/main indicator
      branch_indicator("origin/main", (3.75,0.5), blue),
    )
  ]
]

---

=== Push

In the case just presented, the _main_ branch in the local repository is “ahead” of the branch in the remote repository. To synchronize the two branches, therefore, we will have to do a _push_. Once we run the `git push origin main` command, if all goes well, the result will be:

#align(center)[
  #scale(100%)[
    #set text(11pt)
    #fletcher-diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch( // main branch
        name:"main ",
        remote: "origin ",
        indicator-xy: (6.75,0.5),
        color:blue,
        start:(0,1),
        length:7,
        head: 6,
      ),
    )
  ]
]

As you can see, we use this special label to indicate that the local branch is aligned with the remote branch.

---

=== Pull

#align(center)[
  #scale(100%)[
    #set text(11pt)
    #fletcher-diagram(
      node-stroke: .1em,
      node-fill: none,
      spacing: 4em,
      mark-scale: 50%,
      branch_indicator("main", (3.75,0.5), blue),
      branch( // main branch
        name:"origin/main",
        indicator-xy: (6.8,0.5),
        color:blue,
        start:(0,1),
        length:7,
        head: 6,
      ),
      //other branch stuff
      connect_nodes((3,1),(4,2),teal),
      branch( // old branch
        name:"feature",
        indicator-xy: (6,2.5),
        color: teal,
        start: (3,2),
        length: 3,
      ),
    )
  ]
]

In a similar case, however, it is useful to move to the main branch and perform a _pull_: we are developing our feature and someone has pushed to the main branch on the remote.

---

=== Options

#alert-box(title:"Warning")[
  There are many other options applicable to the `git push` and `git pull` commands, in addition to `-u` which we have seen in the previous chapters, such as `--force` and `--force-with-lease`; we advise you to read the official documentation@git-docs before using them.
]

#align(center,
  image("/slides/img/meme/git-help.png", width: 40%)
)

---

=== Pull Request

#custom-box(title:"Pull Requests")[
  PRs are the tools through which we apply *changes in repositories* on which *we do not have permissions*. They are widely used by the community and supported by GitHub, GitLab (as Merge Requests) and BitBucket.
]

PRs allow developed features to be submitted to the maintainers of the original project; they will be visible to the entire organization (if the repository is private) or otherwise to everyone. Subsequently a PR can be accepted, rejected, or subject to adjustment. #footnote([On GitHub, PRs cannot be removed except by contacting GitHub support itself.])

In the same way as a merge, PRs can also have conflicts, which must be resolved in order to integrate the desired features. #footnote([If not done yet, you will need to log in via `gh` and set the default repo with the interactive command `gh repo set-default` before proceeding.])

---

#include "../animations/pr.typ"

#include "../animations/remote-example.typ"

=== Remove Remote Branches

Going back to the previous example, in order to *clean everything up*, we would like to delete the feature branches that we used previously.
Suppose we have pushed _feature-2_ previously #footnote([If, on the other hand, a branch, created by others, so we don't have a copy of it locally, is deleted directly in the remote, just run `git fetch --all --prune`.]) #align(center)[ #scale(90%)[ #set text(10pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch( // remote origin name:"main", remote:("origin ","my-fork "), indicator-xy: (6,-0.5), color:lime, start:(0,-0.75), length:7, head:6, commits:("",none,none,none,none,none,"merge pr"), angle: 0deg ), connect_nodes((1,-0.75),(2,1),blue), branch( // main branch name:"", indicator-xy: (5.75,0.5), color:blue, start:(1,1), length:5, commits:("","",none,"","") ), connect_nodes((6,1),(7,-0.75),blue,bend:-25deg), //feature-2 branch connect_nodes((3.5,0),(3,1),orange), branch( name: "feature-2", remote:("my-fork"), indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //feature-1 branch connect_nodes((2,1),(3,2),teal), branch( name:"feature-1", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ] --- First we move to a local branch different from the ones we want to delete. To delete the branch remotely we give the command `git push my-fork -d feature-2` and immediately after that to delete it locally we can give `git branch -d feature-1 feature-2`. Then we can run the command `git fetch --all`. #align(center)[ #scale(100%)[ #set text(11pt) #fletcher-diagram( node-stroke: .1em, node-fill: none, spacing: 4em, mark-scale: 50%, branch( // remote origin name:"main", remote:("origin ","my-fork "), indicator-xy: (6,-0.5), color:lime, start:(0,-0.75), length:7, head: 6, commits:("",none,none,none,none,none,"merge pr") ), connect_nodes((1,-0.75),(2,1),blue), branch( // main branch name:"", indicator-xy: (5.75,0.5), color:blue, start:(1,1), length:5, commits:("","",none,"","") ), connect_nodes((6,1),(7,-0.75),blue,bend:-25deg), //orange branch connect_nodes((3.5,0),(3,1),orange), branch( name: "", indicator-xy: (5,0), color: orange, start: (2.5,0), length:2 ), connect_nodes((5,1),(4.5,0),orange), //teal branch connect_nodes((2,1),(3,2),teal), branch( name:"", indicator-xy: (6,1.5), color: teal, start: (2,2), length: 3, ), connect_nodes((5,2),(6,1),teal), ) ] ]
https://github.com/monaqa/typscrap.nvim
https://raw.githubusercontent.com/monaqa/typscrap.nvim/master/class/colors.typ
typst
#let fg = ( w0: rgb("#808080"), p0: rgb("#c145b2"), r0: rgb("#ce5623"), g0: rgb("#349900"), c0: rgb("#009caa"), b0: rgb("#4f7ae1"), w1: rgb("#777777"), p1: rgb("#ae4aa1"), r1: rgb("#b95630"), g1: rgb("#3d8d1a"), c1: rgb("#008f9a"), b1: rgb("#4f73c8"), w2: rgb("#6f6f6f"), p2: rgb("#9c4d90"), r2: rgb("#a45639"), g2: rgb("#42802d"), c2: rgb("#00828a"), b2: rgb("#4f6caf"), w3: rgb("#666666"), p3: rgb("#894e7f"), r3: rgb("#8f5540"), g3: rgb("#467438"), c3: rgb("#00757b"), b3: rgb("#4f6597"), w4: rgb("#5d5d5d"), p4: rgb("#764e6f"), r4: rgb("#7a5345"), g4: rgb("#496740"), c4: rgb("#2c686b"), b4: rgb("#4e5d7f"), ) #let bg = ( w0: rgb("#f4f4f4"), y0: rgb("#f6f7d7"), g0: rgb("#ddfcf3"), b0: rgb("#e4f6ff"), p0: rgb("#ffecff"), r0: rgb("#ffece8"), w1: rgb("#ececec"), y1: rgb("#f0f1b8"), g1: rgb("#c4faea"), b1: rgb("#d1eeff"), p1: rgb("#ffdfff"), r1: rgb("#ffdfd8"), w2: rgb("#e4e4e4"), y2: rgb("#ebea97"), g2: rgb("#a9f7e1"), b2: rgb("#bee7ff"), p2: rgb("#ffd1ff"), r2: rgb("#ffd1c7"), w3: rgb("#dddddd"), y3: rgb("#e6e472"), g3: rgb("#8bf4d9"), b3: rgb("#aadeff"), p3: rgb("#ffc3ff"), r3: rgb("#ffc3b7"), w4: rgb("#d5d5d5"), y4: rgb("#e1dd41"), g4: rgb("#65f1d0"), b4: rgb("#96d6ff"), p4: rgb("#feb4ff"), r4: rgb("#ffb4a7"), w5: rgb("#cecece"), y5: rgb("#dcd500"), g5: rgb("#26eec7"), b5: rgb("#82cdff"), p5: rgb("#fda5ff"), r5: rgb("#ffa596"), )
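// Usage sketch (illustrative, not part of the original palette file): every
// entry above is a regular Typst color, so it can be passed to `text`,
// `highlight`, `rect`, etc. In another file you would first import this
// module; here the dictionaries are already in scope, so the demo is simply
// bound to a variable and never rendered.
#let _palette-demo = {
  text(fill: fg.b1)[foreground blue]
  highlight(fill: bg.y1)[background yellow]
}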
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/compute/calc-18.typ
typst
Other
// Error: 14-31 exponent is too large #calc.pow(2, 10000000000000000)
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/text/smartquotes_00.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Use language quotes for missing keys, allow partial reset #set smartquote(quotes: "«»") "Double and 'Single' Quotes" #set smartquote(quotes: (double: auto, single: "«»")) "Double and 'Single' Quotes"
https://github.com/sitandr/typst-examples-book
https://raw.githubusercontent.com/sitandr/typst-examples-book/main/src/basics/tutorial/advanced_styling.md
markdown
MIT License
# Advanced styling ## The `show` rule ```typ Advanced styling comes with another rule. The _`show` rule_. Now please compare the source code and the output. #show "Be careful": strong[Play] This is a very powerful thing, sometimes even too powerful. Be careful with it. #show "it is holding me hostage": text(green)[I'm fine] Wait, what? I told you "Be careful!", not "Play!". Help, it is holding me hostage. ``` ## Now a bit more serious ```typ Show rule is a powerful thing that takes a _selector_ and what to apply to it. After that it will apply to all elements it can find. It may be extremely useful like that: #show emph: set text(blue) Now if I want to _emphasize_ something, it will be both _emphasized_ and _blue_. Isn't that cool? ``` ## About syntax ```typ Sometimes show rules may be confusing. They may seem very diverse, but in fact they all are quite the same! So // actually, this is the same as // redify = text.with(red) // `with` creates a new function with this argument already set #let redify(string) = text(red, string) // and this is the same as // framify = rect.with(stroke: orange) #let framify(object) = rect(object, stroke: orange) // set default color of text blue for all following text #show: set text(blue) Blue text. // wrap everything into a frame #show: framify Framed text. // it's the same, just creating new function that calls framify #show: a => framify(a) Double-framed. // apply function to `the` #show "the": redify // set text color for all the headings #show heading: set text(purple) = Conclusion All these rules do basically the same! ``` ## Blocks One of the most important usages is that you can set up all spacing using blocks. Like every element with text contains text that can be set up, every _block element_ contains blocks: ```typ Text before = Heading Text after #show heading: set block(spacing: 0.5em) Text before = Heading Text after ``` ## Selector ```typ So show rule can accept _selectors_. There are lots of different selector types, for example - element functions - strings - regular expressions - field filters Let's see example of the latter: #show heading.where(level: 1): set align(center) = Title == Small title Of course, you can set align by hand, no need to use show rules (but they are very handy!): #align(center)[== Centered small title] ``` ## Custom formatting ```typ Let's try now writing custom functions. 
It is very easy, see yourself: // "it" is a heading, we take it and output things in braces #show heading: it => { // center it set align(center) // set size and weight set text(12pt, weight: "regular") // see more about blocks and boxes // in corresponding chapter block(smallcaps(it.body)) } = Smallcaps heading ``` ## Setting spacing TODO: explain block spacing for common elements ## Formatting to get an "article look" ```typ #set page( // Header is that small thing on top header: align( right + horizon, [Some header there] ), height: 12cm ) #align(center, text(17pt)[ *Important title* ]) #grid( columns: (1fr, 1fr), align(center)[ Some author \ Some Institute \ #link("mailto:<EMAIL>") ], align(center)[ Another author \ Another Institute \ #link("mailto:<EMAIL>") ] ) Now let's split text into two columns: #show: rest => columns(2, rest) #show heading.where( level: 1 ): it => block(width: 100%)[ #set align(center) #set text(12pt, weight: "regular") #smallcaps(it.body) ] #show heading.where( level: 2 ): it => text( size: 11pt, weight: "regular", style: "italic", it.body + [.], ) // Now let's fill it with words: = Heading == Small heading #lorem(10) == Second subchapter #lorem(10) = Second heading #lorem(40) == Second subchapter #lorem(40) ```
https://github.com/csimide/cslper
https://raw.githubusercontent.com/csimide/cslper/master/README.md
markdown
# Cslper

A personal Typst script for processing cited references.

Under the GB/T 7714 standard, Chinese and English citations must use `等` and `et al.` respectively. This requires the [CSL-M](https://citeproc-js.readthedocs.io/en/latest/csl-m/index.html) extension, which the CSL parser used by Typst, [citationberg](https://github.com/typst/citationberg), does not yet support, so a roundabout implementation is needed.

> [!NOTE]
>
> If you use GB/T 7714-2015-numeric or a derived style, the approach in
> https://github.com/nju-lug/modern-nju-thesis/issues/3
> is recommended instead. This repo is kept only for reference.

Note: this repo only supports numeric styles (superscript citations like [1] or [2-3]), not author-date styles.

## Usage

0. `git clone` the repo to your machine.
1. Install [Deno](https://deno.com). You can of course use Node with your preferred package manager instead, but then you will need to adapt the code yourself.
2. Export your reference library in CSL JSON format. Zotero users: choose `CSL JSON` or `Better CSL JSON` (if the Better BibTeX plugin is installed) under `Export Library` or `Export Items`.
3. Prepare a CSL style file that meets your requirements; you can search https://github.com/redleafnew/Chinese-STD-GB-T-7714-related-csl.
4. Edit the two lines at the top of `main.ts` so that they point to the files prepared in the previous two steps.
5. Run `deno convert`, which generates `bibout.bib`.
6. In Typst, use

```typst
#bibliography("bibout.bib", style: "tab.csl")
```

where `tab.csl` is the `tab.csl` file in this repo.

## How it works

- `main.ts` uses `citation-js` to format every bibliography entry of the input file according to the specified CSL, and writes the result as that entry's `title` in a newly generated BibTeX file. A stop-word heuristic, together with whether the `language` field contains `en`, is used to decide the language of each reference and to choose `et al.` or `等`.
- `tab.csl` is a numeric CSL citation style whose bibliography section only displays the `title` field of the bib entries.

Combined, this is equivalent to generating correctly formatted bibliography entries with a script and then citing them from Typst.

## Notes

See https://github.com/cherichy/BUAA-typst/blob/main/typstcite.md for reference.

The `locates-zh-CN.xml` in this repo comes from [citation-style-language/locales@6de1dc29](https://github.com/citation-style-language/locales/blob/6de1dc298a357ef89b965c975eed967f211028c0/locales-zh-CN.xml), used under the Creative Commons Attribution-ShareAlike 3.0 license. The original contributor list is inside that file.

I originally wanted to stitch citation.js into Typst via WASM, but then found that this makeshift approach is serviceable enough.
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/table_03.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test inset. #table( columns: 3, inset: 10pt, [A], [B], [C] ) #table( columns: 3, inset: (y: 10pt), [A], [B], [C] ) #table( columns: 3, inset: (left: 20pt, rest: 10pt), [A], [B], [C] ) #table( columns: 2, inset: ( left: 20pt, right: 5pt, top: 10pt, bottom: 3pt, ), [A], [B], )
https://github.com/topdeoo/NENU-Thesis-Typst
https://raw.githubusercontent.com/topdeoo/NENU-Thesis-Typst/master/utils/format.typ
typst
#import "@preview/cuti:0.2.1": * // 中文缩进 #let indent = h(2em) // 假段落,附着于 heading 之后可以实现首行缩进 #let empty-par = par[#box()] #let fake-par = context empty-par + v(-measure(empty-par + empty-par).height) #let invisible-heading(..args) = { set text(size: 0pt, fill: white) heading(numbering: none, ..args) } #let unpack(pairs) = { let dict = (:) for (key, value) in pairs { dict[key] = value } dict }
https://github.com/Amelia-Mowers/typst-tabut
https://raw.githubusercontent.com/Amelia-Mowers/typst-tabut/main/doc/example-snippets/basic.typ
typst
MIT License
#import "@preview/tabut:<<VERSION>>": tabut #import "example-data/supplies.typ": supplies #tabut( supplies, // the source of the data used to generate the table ( // column definitions ( header: [Name], // label, takes content. func: r => r.name // generates the cell content. ), (header: [Price], func: r => r.price), (header: [Quantity], func: r => r.quantity), ) )
https://github.com/tingerrr/typst-test
https://raw.githubusercontent.com/tingerrr/typst-test/main/migrating.md
markdown
MIT License
# Migrating
This file documents breaking changes and how to handle them while using the main branch of typst-test. The entries are ordered in descending relevance, i.e. last breaking change first.

This file will be removed on the first release, as from then on, a changelog shall be curated.

## CI semi stable tag pre-deprecation notice
The `ci-semi-stable` tag has received its last bump to a pre-rewrite commit.
It will soon be deprecated on the release of 0.1.0 of `typst-test`. The repository will likely be archived in favor of a new name.

## Rewrite
The rewrite is now complete and the following things have changed and will be gradually tested, refined and stabilized until `0.1.0` is released:
- `edit` exists but has no implementation anymore; it will soon serve to edit tests' metadata like their reference kind instead of opening the tests
- `run`, `compare` and `update` now take multiple tests and exact names for filtering; more elaborate filtering is done using `-e ...` with test set expressions
- there are various
- the output of `list` and `status` has changed
- `util` has been added for running utility and debugging commands
- various global options were moved and are now only available on the respective commands they are relevant for
- typst test now brings its own typst compiler (currently 0.11.1) and has no ability to change to another compiler version at the moment
- reference images are no longer optimized when written to; this may be added back before 0.1.0

**Many of the aforementioned features are only tested locally and need further testing before the rewrite is fully done.**

## CI semi stable tag
The `ci-semi-stable` tag will no longer be bumped on breaking changes. Instead, a branch of the same name that follows `main` is provided for the same purpose. Simply change your CI step to use the branch option instead:
```diff
jobs:
  tests:
    # ...
    steps:
      - name: Install typst-test from github
        uses: baptiste0928/[email protected]
        with:
          crate: typst-test
          git: https://github.com/tingerrr/typst-test.git
-         tag: ci-semi-stable
+         branch: ci-semi-stable

      # ...
```

## Folder Structure
The folder structure changed from having all tests in a dedicated folder with references and the like in different dedicated folders to having a dedicated folder per test.
To use your existing project's tests, the scripts have to be moved and renamed. Previously, tests were arranged as follows:
```
tests/
  typ/
    test1.typ
    test2/
      test.typ
    ...
  ...
```
To reuse the scripts, move them into the following structure:
```
tests/
  test1/
    test.typ
    ref/
      1.png
    out/
      ...
    diff/
      ...
  test2/
    test.typ
    ...
  ...
```
Furthermore, the patterns in the `test/.gitignore` should be adjusted from `out/**` to `**/out/`, the same for `diff`.

Observe the following:
- free-standing tests are no longer allowed; they must be in a folder and be named `test.typ`
- tests can now be nested; their path serves as their name
- references, output and diff images now live directly next to the test script in their respective sub folders

You can copy the references into the sub folders, or simply regenerate them using the `update` sub command.
If you used relative paths, they must be adjusted; if you used absolute paths, the tests should continue to work as expected.
https://github.com/catppuccin/typst
https://raw.githubusercontent.com/catppuccin/typst/main/README.md
markdown
MIT License
<h3 align="center"> <img src="https://raw.githubusercontent.com/catppuccin/catppuccin/main/assets/logos/exports/1544x1544_circle.png" width="100" alt="Logo"/><br/> <img src="https://raw.githubusercontent.com/catppuccin/catppuccin/main/assets/misc/transparent.png" height="30" width="0px"/> Catppuccin for <a href="https://typst.app/">Typst</a> <img src="https://raw.githubusercontent.com/catppuccin/catppuccin/main/assets/misc/transparent.png" height="30" width="0px"/> </h3> <p align="center"> <a href="https://github.com/catppuccin/typst/stargazers"><img src="https://img.shields.io/github/stars/catppuccin/typst?colorA=363a4f&colorB=b7bdf8&style=for-the-badge"></a> <a href="https://github.com/catppuccin/typst/issues"><img src="https://img.shields.io/github/issues/catppuccin/typst?colorA=363a4f&colorB=f5a97f&style=for-the-badge"></a> <a href="https://github.com/catppuccin/typst/contributors"><img src="https://img.shields.io/github/contributors/catppuccin/typst?colorA=363a4f&colorB=a6da95&style=for-the-badge"></a> </p> <p align="center"> <img src="https://raw.githubusercontent.com/catppuccin/typst/main/assets/previews/preview.webp"/> </p> ## Previews <details> <summary>🌻 Latte</summary> <img src="https://raw.githubusercontent.com/catppuccin/typst/main/assets/previews/latte.webp"/> </details> <details> <summary>🪴 Frappé</summary> <img src="https://raw.githubusercontent.com/catppuccin/typst/main/assets/previews/frappe.webp"/> </details> <details> <summary>🌺 Macchiato</summary> <img src="https://raw.githubusercontent.com/catppuccin/typst/main/assets/previews/macchiato.webp"/> </details> <details> <summary>🌿 Mocha</summary> <img src="https://raw.githubusercontent.com/catppuccin/typst/main/assets/previews/mocha.webp"/> </details> ## Installation Eventually, this package will be made available through Typst's built-in package manager. For now, there are two methods you can follow to install this package: ### Method 1: Using the install script This method requires that you have Python installed on your system and available in your PATH. 1. Clone or download this repository into a directory on your system. 2. Ensure you have [Just](https://github.com/casey/just) installed on your system. 3. Open the directory containing the repository in a commandline terminal and run the following command: ```sh just install ``` Or, to first build the package and then install it (this may not work on all systems), run: ```sh just tmThemes whiskers install ``` If you received no errors, the package should now be installed and available for use in your Typst documents. If you receive an error, you will either see a message stating that your platform is not supported for the automated install or something bad happened. In the latter case, please [open an issue](https://github.com/catppuccin/typst/issues/new?assignees=&labels=bug&template=bug.yaml) on this repository. ### Method 2: Manual install 1. Clone or download this repository to you computer. 2. Move the contents of the downloaded repository into `{data-dir}/typst/packages/local/catppuccin/{version}` to make them available locally on your system. Here, `{data-dir}` is - `$XDG_DATA_HOME` or `~/.local/share` on Linux - `~/Library/Application Support` on macOS - `%APPDATA%` on Windows Further instruction can be found on Typst's [package](https://github.com/typst/packages?tab=readme-ov-file#local-packages) repository. As an example, using v0.1.0, you path may look like `~/.local/share/typst/packages/local/catppuccin/0.1.0`. 
## Usage In your project, import the package (ensure you have the correct version number) with ```typst #import "@local/catppuccin:0.1.0": catppuccin, themes ``` To format your document with a theme, use the following syntax towards the top of your document: ```typst #show: catppuccin.with(themes.mocha, code_block: true, code_syntax: true) ``` Replace `mocha` with the flavour of your choice! This can also be passed as a string literal `"mocha"`. You can further adjust the arguments to `catppuccin.with` to customise the theme look of your document. ## 💝 Thanks to - [TimeTravelPenguin](https://github.com/TimeTravelPenguin) &nbsp; <p align="center"> <img src="https://raw.githubusercontent.com/catppuccin/catppuccin/main/assets/footers/gray0_ctp_on_line.svg?sanitize=true" /> </p> <p align="center"> Copyright &copy; 2021-present <a href="https://github.com/catppuccin" target="_blank">Catppuccin Org</a> </p> <p align="center"> <a href="https://github.com/catppuccin/catppuccin/blob/main/LICENSE"><img src="https://img.shields.io/static/v1.svg?style=for-the-badge&label=License&message=MIT&logoColor=d9e0ee&colorA=363a4f&colorB=b7bdf8"/></a> </p>
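For instance, switching to another flavour or turning off themed code blocks is just a matter of changing those arguments. A small illustrative sketch (check the package source for the full list of options and their defaults):

```typst
#import "@local/catppuccin:0.1.0": catppuccin, themes

// Same document, Latte flavour, without themed code blocks.
#show: catppuccin.with(themes.latte, code_block: false)
```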
https://github.com/Skimmeroni/Appunti
https://raw.githubusercontent.com/Skimmeroni/Appunti/main/C++/Introduzione/Struct.typ
typst
Creative Commons Zero v1.0 Universal
#import "@preview/showybox:2.0.1": showybox Similmente agli array, che sono tipi primitivi, le *struct* sono considerate tipi compositi. Di fatto, una struct é un "raggruppamento" di dati anche di tipo diverso, chiamati *campi*. Sono di fatto una forma piú "rudimentale" del concetto di classe. Tutti i dati di una struct sono di default pubblici, quindi liberamente modificabili. Una struct puó essere inizializzata allo stesso modo di come viene inizializzato un array, dove ogni elemento $i$-esimo all'interno delle parentesi graffe viene assegnato alla $i$-esima variabile contenuta nella `struct`. Una `struct` puó anche essere inizializzata parzialmente, ovvero assegnando un valore solamente ai primi $n$ campi. #grid( columns: (0.3fr, 0.7fr), [ ``` struct name_type { type_1 name_1; type_2 name_2; ... type_n name_n; }; ``` ], [ ``` struct_type struct_name = {field_1, field_2, ..., field_n}; ``` ] ) #showybox[ ``` struct Point { int x; int y; }; Point A = {5, 2}; ``` ] L'operatore `.` permette di accedere ai dati di una `struct`, specificando il nome del campo a cui ci si riferisce. L'operatore `->` permette di accedere ad un campo di una struct quando ci riferisce ad essa tramite un puntatore e non direttamente (é una abbreviazione di una deferenziazione seguita da un accesso). #grid( columns: (0.5fr, 0.5fr), [ ``` struct_name.field ``` ], [ ``` pointer_to_a_struct->field ``` ] ) #showybox[ ``` Point P = {5, 2}; Point* Q = &P; P.x = 10; Q->y = 8; // same as (*Q).p = 8 std::cout << Q->x << " " << P.y << std::endl; // prints 10 8 ``` ] La memoria occupata da una `struct` dipende dalla politica di allocazione della memoria del compilatore. In genere, viene prediletta una allocazione di memoria che ottimizza l'accesso piuttosto che la dimensione. Per tale motivo, per ottenere la massima efficienza in termini di spazio occupato é preferibile disporre i dati all'interno in ordine decrescente di grandezza, di modo che piú dati possano venire "accorpati" in un'unica `word`. Un `enum` é un tipo di dato che consente di associare in maniera automatica dei valori interi costanti a dei nomi di stringhe. Permette di usare delle stringhe come dei "segnaposto" per dei valori che dovrebbero essere legati da una qualche semantica. ``` enum name {name_1 = value_1, name_2 = value_2, ..., name_n = value_n}; ``` Il valore a cui ciascun campo di un `enum` viene assegnato puó venire specificato oppure lasciato dedurre al compilatore. Nel secondo caso, a tutti i campi dell'`enum` che vengono dopo l'ultimo campo con un valore specificato viene assegnato il valore a quest'ultimo successivo. Se nessun valore viene specificato, ai campi di `enum` vengono ordinatamente assegnati i numeri $1, 2, 3, dots, $ #showybox[ ``` enum day {Mon = 10, Tue = 20, Wed = 30, Thu = 40, Fri = 50, Sat = 60, Sun = 70}; day d; d = Wed; // Allowed d = 10; // NOT Allowed int f = Fri; // Allowed enum days {Mon, Tue, Wed, Thu, Fri, Sat, Sun}; // 1, 2, 3, 4, 5, 6, 7 enum days {Mon = 1, Tue, Wed = 5, Thu, Fri = 2, Sat, Sun}; // 1, 2, 5, 6, 2, 3, 4 ``` ] `typedef` permette di associare un alias ad un tipo di dato giá esistente. É utile per riferirsi ad un tipo avente un nome molto lungo con un alias piú corto. Puó essere utile anche per "mascherare" valori veri con nomi di comodo, di modo che da fuori da una classe i dati appaiano con nomi piú semplici da comprendere. 
``` typedef old_name new_name; ``` #showybox[ ``` typedef unsigned long int uli; uli x = 10; // Same as unsigned long int x = 10 ``` ] `const` é un modificatore che, posto davanti alla definizione di una variabile, la rende immutabile, ovvero non é piú possibile modificarne il valore in un secondo momento. Permette di creare delle costanti, ovvero valori che devono imprescindibilmente assumere uno ed un solo valore #footnote[`const` occupa lo stesso spazio che nel C aveva `#define`; infatti, sebbene sia possibile anche in C++ definire costanti in questo modo, é da considerarsi una worst practice, dato che il linguaggio offre uno strumento preposto.]. Se si tenta di aggiungere `const` ad una variabile che non viene inizializzata quando viene dichiarata viene restituito un messaggio di errore. ``` const var_type var_name = value; ``` Il valore di una reference a cui viene aggiunto il modificatore `const` puó cambiare se il valore originale viene cambiato, ma non puó comunque venire modificato direttamente. #showybox[ ``` const float pi; // NOT Allowed const float pi = 3.14; // Allowed pi = 3.1415; // NOT Allowed g = 1; const int gamma = g; // Allowed int f = 2; const int& e = f; f++; // Allowed, now e = 3 even if constant e++; // NOT Allowed ``` ] Rispetto ai puntatori, l'uso del modificatore `const` puó portare a conseguenze impreviste. Possono presentarsi tre situazioni, in base a dove viene posta la keyword `const` nella dichiarazione del puntatore: - Il modificatore `const` si trova prima del tipo di dato del puntatore. In questo senso, il puntatore "protegge" la variabile, impedendo che sia possibile modificarla se si passa dal puntatore. Sia il puntatore, sia l'oggetto in sé se vi si accede direttamente, sono liberamente modificabili. Infatti, la keyword `const` si riferisce comunque sempre e solo al puntatore, anche se il dato a cui si riferisce non é una costante; - Il modificatore `const` si trova dopo il tipo di dato del puntatore. In questo senso, é il puntatore stesso ad essere una costante, e non é piú possibile modificarlo (scollegarlo e collegarlo ad altro, per esempio), ma é possibile modificare il valore dell'oggetto in sé se vi si accede tramite il puntatore; - Il modificatore `const` si trova sia prima che dopo il tipo di dato del puntatore. Sia il puntatore, sia l'oggetto se vi si accede tramite il puntatore, non sono modificabili. Assegnare ad un puntatore (non necessariamente con `const`) un tipo di dato che ha il modificatore `const` restituisce un errore in fase di compilazione, perché si sta di fatto negando il "senso" dell'aver dichiarato tale variabile come constante in principio. #showybox[ ``` int i = 200; const int* p1 = &i; // 1st type *p1 = 100; // NOT allowed p1 = nullptr; // Allowed int* const p2 = &i; // 2nd type *p2 = 100; // Allowed p2 = nullptr; // Not allowed const int* const p3 = &i; // 3rd type *p3 = 100; // NOT allowed p3 = nullptr; // NOT allowed ``` ]
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/example/0.1.0/lib.typ
typst
Apache License 2.0
// A package can contain includable markup just like other files. This is an *example!* // Paths are package local and absolute paths refer to the package root. #import "/util/math.typ": add, sub, mul, div
https://github.com/pascalguttmann/typst-template-report-lab
https://raw.githubusercontent.com/pascalguttmann/typst-template-report-lab/main/template/chapter/evaluation.typ
typst
MIT License
= Evaluation of the Results and Error Discussion
https://github.com/Myriad-Dreamin/typst.ts
https://raw.githubusercontent.com/Myriad-Dreamin/typst.ts/main/fuzzers/corpora/layout/stack-1_03.typ
typst
Apache License 2.0
#import "/contrib/templates/std-tests/preset.typ": * #show: test-page // Test aligning things in RTL stack with align function & fr units. #set page(width: 50pt, margin: 5pt) #set block(spacing: 5pt) #set text(8pt) #stack(dir: rtl, 1fr, [A], 1fr, [B], [C]) #stack(dir: rtl, align(center, [A]), align(left, [B]), [C], )
https://github.com/jakobjpeters/Typstry.jl
https://raw.githubusercontent.com/jakobjpeters/Typstry.jl/main/docs/source/references/commands.md
markdown
MIT License
# Commands ```@eval using Markdown, Typstry Markdown.parse("This reference documents " * lowercasefirst(split(string(@doc Typstry.Commands), "\n")[5])) ``` ## `Typstry` ```@docs TypstCommand TypstError @typst_cmd julia_mono preamble render ``` ## `Base` ```@docs == addenv detach eltype firstindex getindex hash ignorestatus iterate(::TypstCommand) keys lastindex length run setcpuaffinity setenv show(::IO, ::MIME"text/plain", ::TypstCommand) show(::IO, ::Union{MIME"application/pdf", MIME"image/png", MIME"image/svg+xml"}, ::Union{Typst, TypstString, TypstText}) showerror ```
https://github.com/Caslus/lucasphilippe
https://raw.githubusercontent.com/Caslus/lucasphilippe/main/README.md
markdown
# Personal Page with Astro and Typst

This is a personal page built using [Astro](https://astro.build/) and [Typst](https://typst.app/), with GitHub Actions automating the build process, generating an up-to-date resume, and deploying to GitHub Pages.

## Overview

The site was designed to be easy to maintain, leveraging automation for key tasks. Every push to the repository triggers a GitHub Actions workflow that builds the site, generates a new version of the resume using Typst, and automatically deploys everything to GitHub Pages.

## Why Astro?

It's fast, allows for modern web development with minimal overhead, and it's perfect for creating a highly optimized static personal page that loads quickly and efficiently.

## Why Typst?

Typst is used for generating my resume because it allows me to focus on content instead of formatting; I used to spend hours in Word trying to adjust line breaks. It also integrates smoothly into my build pipeline, allowing my resume to be automatically updated every time I push changes to the repository.

## Author

This project is built by [<NAME>](https://lucasphilippe.com).

## License

This project is distributed under the [MIT license](LICENSE).
https://github.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024
https://raw.githubusercontent.com/Area-53-Robotics/53E-Notebook-Over-Under-2023-2024/giga-notebook/entries/decide-drivetrain-sensors.typ
typst
Creative Commons Attribution Share Alike 4.0 International
#import "/packages.typ": notebookinator #import notebookinator: * #import themes.radial.components: * #show: create-body-entry.with( title: "Decide: Drivetrain Sensors", type: "decide", date: datetime(year: 2023, month: 07, day: 28), author: "<NAME>", witness: "<NAME>", ) We rated each configuration for: - Ease of use on a scale of 1 to 2 - Accuracy on a scale of 1 to 5 - Compactness on a scale of 1 to 3 We weighted accuracy so high due to how crucial it is that the tracking correctly represent the robot's location throughout the entire match. #decision-matrix( properties: ((name: "Ease of use"), (name: "Accuracy"), (name: "Compactness")), ("GPS", 2, 2, 3), ("Three Tracking Wheels", 1, 5, 1), ("Two Tracking Wheels and IMU", 1.5, 4.5, 2), ("Integrated Motor Encoders", 1, 2, 3), ) #admonition( type: "decision", )[ We decided on two tracking wheels and an IMU due to just the right balance of accuracy, compactness, and ease of use. ] #heading([Final Tracking Wheel Design], level: 1) #grid( columns: (50%, 50%), rows: (25%, 25%), gutter: 0pt, image("../assets/tracking-wheels/isometric.png"), image("../assets/tracking-wheels/front.png"), image("../assets/tracking-wheels/top.png"), image("../assets/tracking-wheels/side.png"), ) #colbreak() #set align(center) #image("../assets/tracking-wheels/part-drawings/1.png") #image("../assets/tracking-wheels/part-drawings/2.png")
https://github.com/pta2002/typst-timeliney
https://raw.githubusercontent.com/pta2002/typst-timeliney/main/README.md
markdown
MIT License
# Timeliney Create Gantt charts automatically with Typst! Here's a fully-featured example: ```typst #import "@preview/timeliney:0.0.1" #timeliney.timeline( show-grid: true, { import timeliney: * headerline(group(([*2023*], 4)), group(([*2024*], 4))) headerline( group(..range(4).map(n => strong("Q" + str(n + 1)))), group(..range(4).map(n => strong("Q" + str(n + 1)))), ) taskgroup(title: [*Research*], { task("Research the market", (0, 2), style: (stroke: 2pt + gray)) task("Conduct user surveys", (1, 3), style: (stroke: 2pt + gray)) }) taskgroup(title: [*Development*], { task("Create mock-ups", (2, 3), style: (stroke: 2pt + gray)) task("Develop application", (3, 5), style: (stroke: 2pt + gray)) task("QA", (3.5, 6), style: (stroke: 2pt + gray)) }) taskgroup(title: [*Marketing*], { task("Press demos", (3.5, 7), style: (stroke: 2pt + gray)) task("Social media advertising", (6, 7.5), style: (stroke: 2pt + gray)) }) milestone( at: 3.75, style: (stroke: (dash: "dashed")), align(center, [ *Conference demo*\ Dec 2023 ]) ) milestone( at: 6.5, style: (stroke: (dash: "dashed")), align(center, [ *App store launch*\ Aug 2024 ]) ) } ) ``` ![Example Gantt chart](sample.png) ## Installation Import with `#import "@preview/timeliney:0.0.1"`. Then, call the `timeliney.timeline` function. ## Documentation See [the manual](manual.pdf)! ## Changelog ### 0.1.0 - Update CeTZ to 0.2.2 (@LordBaryhobal) - Add offset parameter
https://github.com/ckunte/resume
https://raw.githubusercontent.com/ckunte/resume/master/inc/preamble.typ
typst
#let resume(doc) = {
  // settings
  set page(
    margin: (
      x: 1in,
      y: 1in,
    ),
    numbering: "1 of 1",
  )
  // text characteristics
  set text(
    font: "Segoe UI", // Segoe UI, New Computer Modern,
    top-edge: "cap-height",
    bottom-edge: "baseline",
    number-type: "old-style",
    size: 12pt,
  )
  // Configure paragraph properties
  set par(spacing: 0.65em, leading: 0.65em, first-line-indent: 12pt, justify: true)
  // Emphasise caption
  show figure.caption: emph
  // link properties
  show cite: set text(fill: maroon)
  show link: set text(fill: rgb(0, 0, 255))
  show link: underline
  // colour terms
  show regex("tb[a,c,d,p]"): set text(fill: red, style: "italic") // for to-be-[advised, confirmed, determined, planned]
  show strike: set text(style: "italic")
  show regex("done\([0-9]{2}w[0-9]{2}\)"): set text(fill: rgb(0, 96, 0), style: "italic")
  show "callback requested": set text(fill: rgb(255, 0, 0), style: "italic")
  show "info shared": set text(fill: rgb(0, 96, 0), style: "italic")
  // let hl(content, col) = highlight(fill: col)[#content]
  // show "monitor and maintain": match => { hl(match, rgb(255,179,186)) }
  // show "repair": match => { hl(match, rgb(255,223,186)) }
  // show "replace": match => { hl(match, rgb(255,255,186)) }
  // show "run additional": match => { hl(match, rgb(255,179,186)) }
  // show "impracticable": match => { hl(match, rgb(255,179,186)) }
  /* show strike: set text(fill: rgb(0, 96, 0), style: "italic") */
  // small-caps, where supported by font
  let sc(content) = text(features: ("c2sc",))[#content]
  show regex("[A-Z]{2,}|[A-Z]{1}[0-9]{1,}|[A-Z]{1}\&[A-Z]{1}"): match => { sc(match) }
  doc
}

//// Call the following in the main file
// #import "preamble.typ": resume
// #show: doc => resume(doc)
https://github.com/Myriad-Dreamin/tinymist
https://raw.githubusercontent.com/Myriad-Dreamin/tinymist/main/crates/tinymist-query/src/fixtures/completion-pkgs/touying-utils-_size-to-pt.typ
typst
Apache License 2.0
// path: lib.typ // - level (auto, ): The level #let _size-to-pt(size, container-dimension) = { let to-convert = size if type(size) == ratio { to-convert = container-dimension * size } measure(v(to-convert)).height } ----- // contains: level, hierachical, depth #import "lib.typ": * #current-heading(/* range 0..1 */)[]; ----- // contains: "body" #import "lib.typ": * #current-heading(level: /* range 0..1 */)[]; ----- // contains: false, true #import "lib.typ": * #current-heading(hierachical: /* range 0..1 */)[]; ----- // contains: false, true #import "lib.typ": * #current-heading(depth: /* range 0..1 */)[];
https://github.com/NamLe0609/bias-ai-report
https://raw.githubusercontent.com/NamLe0609/bias-ai-report/main/pre_body.typ
typst
= Introduction

This report focuses on a machine learning task for emotion recognition on human face images. Whilst not novel, it is a useful tool in a variety of applications, such as marketing, human-robot interaction, healthcare, and security @Emotion-recognition-meta-review.

== Description of tasks

This is a classification task which predicts the emotion of a person from an image of their face, and outputs the emotion(s) associated with it. The range of possible emotions is typically decided by the dataset, though we aim to pick a dataset which contains the emotions of joy, trust, fear, surprise, sadness, disgust, anger, and anticipation. According to Plutchik's wheel of emotions, a widely accepted model in discrete emotion theory, these are the universally recognized basic emotions @Emotion-recognition-meta-review.

#figure(
  image("emotion-wheel.jpg", width: 91%),
  caption: [Plutchik's wheel of emotions, with the base emotions as well as their amplified/attenuated versions. Intensity increases towards the center and vice versa @Emotion-wheel-source.],
) <emotion-wheel>

We do not use the full range of emotions, as it is not practical to train a model to recognize all of them. The base emotions are sufficient for most applications.

For the output format, we want a probability distribution over the classes. This gives more information than a single-class output, can be used to calculate a confidence score, is easily transformed into a positive/negative/neutral output, and is simple to produce (a softmax layer). With this, we allow for more model flexibility, which is crucial if the model is to be deployed to various organizations and/or made accessible via open sourcing.

The dataset used for training is custom-made from online datasets, and the process of producing it will be elaborated on in section III.

== Ethical impact, Value Sensitive Design, and use case

AI ethics refers to a set of values and principles that guide the responsible use of AI technologies @AI-ethics-definition, whereas AI bias refers to "computer systems that systematically and unfairly discriminate against certain individuals or groups of individuals" @Bias-main-paper. An ethical impact assessment aims to root out any potential biases in the model by considering stakeholders' values and principles, allowing for fair outcomes for all. To do this, we employ Value Sensitive Design (VSD). The "value" in VSD is defined as "what a group of people consider important in life" @VSD-paper; VSD is thus a methodology which considers the values of both direct and indirect stakeholders in the design of technology.

VSD involves three investigation steps.

+ Conceptual: Identify stakeholders, what values they hold and how they are affected, and discuss trade-offs between values.
+ Empirical: Use quantitative/qualitative methods to expand on the concepts found in the previous step.
+ Technical: Analyse how existing technological mechanisms support or hinder human values, and proactively design to support those values.

A potential use case of this model is a healthcare surveillance system @Emotion-recognition-medical-surveillance. In hospitals, the model could be used to detect signs of depression or anxiety in patients, and alert the clinic to administer medicine. Compared to traditional methods, such as human observation, the model could be less prone to error and be more available to patients.
In the hospital example, we must make sure the model is only used on images of patients who have given explicit consent, but how do we get the consent of people who are not mentally well enough to make decisions? Issues like this are beyond our scope of responsibility, but they are important to consider nonetheless. In the following sections (sections II and III), we focus solely on the healthcare use case in hospitals.
https://github.com/EpicEricEE/typst-droplet
https://raw.githubusercontent.com/EpicEricEE/typst-droplet/master/src/droplet.typ
typst
MIT License
#import "extract.typ": extract #import "split.typ": split #import "util.typ": inline // Sets the font size so the resulting text height matches the given height. // // Parameters: // - height: The target height of the resulting text. // - text-args: Named arguments to be passed to the underlying text element. // - body: The content of the text element. // // Returns: The text with the adjusted size. #let sized(height, ..text-args, body) = context { let styled-text = text.with(..text-args.named(), body) let measured = measure(styled-text(1em)).height let factor = if measured > 0pt { height / measured } else { 1 } styled-text(factor * 1em) } // Resolves the given height to an absolute length. // // Height can be given as an integer, which is interpreted as the number of // lines, or as a length. // // Requires context. #let resolve-height(height) = { if type(height) == int { measure([x\ ] * height).height } else { height.to-absolute() } } // Shows the first letter of the given content in a larger font. // // If the first letter is not given as a positional argument, it is extracted // from the content. The rest of the content is split into two pieces, where // one is positioned next to the dropped capital, and the other below it. // // Parameters: // - height: The height of the first letter. Can be given as the number of // lines (integer) or as a length. If set to `auto`, no scaling is // applied. // - justify: Whether to justify the text next to the first letter. // - gap: The space between the first letter and the text. // - hanging-indent: The indent of lines after the first line. // - overhang: The amount by which the first letter should overhang into the // margin. Ratios are relative to the width of the first letter. // - depth: The minimum space below the first letter. Can be given as the // number of lines (integer) or as a length. // - transform: A function to be applied to the first letter. // - text-args: Named arguments to be passed to the underlying text element. // - body: The content to be shown. // // Returns: The content with the first letter shown in a larger font. #let dropcap( height: 2, justify: auto, gap: 0pt, hanging-indent: 0pt, overhang: 0pt, depth: 0pt, transform: none, ..text-args, body ) = layout(bounds => { let text-args = text-args if height != auto { // Set default top and bottom edge to "bounds" if not specified. if "top-edge" not in text-args.named() { text-args = arguments(..text-args, top-edge: "bounds") } if "bottom-edge" not in text-args.named() { text-args = arguments(..text-args, bottom-edge: "bounds") } } let (letter, rest) = if text-args.pos() == () { extract(body) } else { // First letter already given. (text-args.pos().first(), body) } if transform != none { letter = context transform(letter) } let letter-height = if height == auto { // Don't rescale if height is set to auto. measure(text(..text-args.named(), letter)).height } else { resolve-height(height) } let depth = resolve-height(depth) // Create dropcap with the height of sample content. let letter = box( height: letter-height + depth, sized(letter-height, letter, ..text-args.named()) ) let letter-width = measure(letter).width // Resolve overhang if given as percentage. let overhang = if type(overhang) == ratio { letter-width * overhang } else if type(overhang) == relative { letter-width * overhang.ratio + overhang.length } else { overhang } // Resolve justify if given as auto. 
let justify = if justify == auto { par.justify } else { justify } // Try to justify as many words as possible next to dropcap. let bounded = box.with(width: bounds.width - letter-width - gap + overhang) let index = 1 let top-position = 0pt let prev-height = 0pt let (first, second, sep) = while true { let (first, second, _) = split(rest, index) let first = { set par(hanging-indent: hanging-indent, justify: justify) first } let height = measure(bounded(first)).height let (_, new, sep) = split(first, -1) top-position = calc.max( top-position, height - measure(new).height - par.leading.to-absolute() ) if top-position >= letter-height + depth - 1e-6pt and height > prev-height { // Limit reached, new element doesn't fit anymore split(rest, index - 1) break } if second == none { // All content fits next to dropcap. (first, none, none) break } index += 1 prev-height = height } // Layout dropcap and aside text as grid. set par(justify: justify) // Find out whether there is a break between the first and second part. let has-break = type(sep) == content and sep.func() in (linebreak, parbreak) if not has-break { let sep = split(first, -1).at(2) has-break = type(sep) == content and sep.func() in (linebreak, parbreak) } // Find elements at boundary. let last-of-first = split(first, -1).at(1) let first-of-second = if second == none { none } else { split(second, 1).at(0) } let func(body) = if inline(last-of-first) { box(body) + linebreak() } else { block(body) } func(grid( column-gutter: gap, columns: (letter-width - overhang, 1fr), move(dx: -overhang, letter), { set par(hanging-indent: hanging-indent) first if not has-break and inline(last-of-first) and inline(first-of-second) { linebreak(justify: justify) } } )) if type(sep) == content and sep.func() in (linebreak, parbreak) { sep } second })
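// Usage sketch (illustrative only; kept in comments so it does not affect the
// library module itself). The import path below is an assumption and would
// normally be the published package entry point rather than this source file.
// All parameters shown are the ones documented above.
//
// #import "droplet.typ": dropcap
//
// #dropcap(
//   height: 3,                // the initial spans three lines
//   gap: 4pt,                 // space between the initial and the text
//   hanging-indent: 1em,      // indent for the lines set beside the initial
//   overhang: 10%,            // let the initial stick slightly into the margin
//   transform: letter => text(fill: maroon, letter),
// )[
//   #lorem(40)
// ]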
https://github.com/CAT-Performance-Optimization-Centre/Math-Catalogue
https://raw.githubusercontent.com/CAT-Performance-Optimization-Centre/Math-Catalogue/main/Sequence-and-Series/sequenceandseries.typ
typst
MIT License
#import "@preview/diatypst:0.1.0": * #show: slides.with( title: "Infinite Sequence & Series", // Required subtitle: "", date: "27.09.2024", authors: ("CAT性能优化中心"), // Optional Styling ratio: 16/9, layout: "large", title-color: teal, footer: true, counter: true, ) #outline() = Sequences == Limits of Sequences A *Sequence* is a function whose domain is a set of form ${n in ZZ:n>=m}; m$ is usually 1 or 0. It is denoted by symbols ${a_n}$ or $(s_n)$;$(s_n)=(s_1,s_2 dots ,s_n)$. *Example* *(a)* Consider a sequence $(s_(n))_(n in NN)$ where $(s_n)=1/n^2$. This is the sequence $(1,1/4,1/9,1/16, dots)$ which is a function with domain $NN$ whose values at each $n$ is $1/n^2$. *(b)* If $a_n=(n)^(1/n),n in NN$ then sequence is $(1,(2)^(1/2),(3)^(1/3),dots)$. It turns out $a_100 approx 1.0471 and a_1000 approx 1.0069$. / Definition: A Sequence $(s_n)$ of real numbers is said to be $italic(c o n v e r g e)$ to real number $s$ provided that for each $epsilon>0 $ $exists$ a number $N$ $ n>N => |s_n-s|<epsilon $ If $(s_n)$ convergers to $s$ ($lim_(n -> infinity) s_n=s$).The number $s$ is called limit of sequence($s_n$). A sequence that does not $italic(c o n v e r g e)$ to some real number is said to *diverge*. == Discussion about Proofs *Example* Prove $lim (3n+1)/(7n-4)=3/7.$ $D i s c u s s i o n.$ For each $epsilon>0$ $ |(3n+1)/(7n-4)-3/7| < epsilon =>19/(49epsilon) +4/7<n $ Our steps are reversible, so we'll put $N=19/(49epsilon) +4/7$ *Formal Proof* Let $epsilon>0$ and $N=19/(49epsilon) +4/7$.Then $n>N=>n>19/(49epsilon) +4/7$ $ therefore (19)/(7(7n-4))<epsilon => |(3n+1)/(7n-4)-3/7| < epsilon $ This proves $lim (3n+1)/(7n-4)=3/7$. / Sandwich Theorem: Let ${a_n},{b_n} and {c_n} $ be sequences of real numbers. If $a_n<=b_n<=c_n forall n>N$ if $lim a_n=lim c_n=L$ then $lim b_n=L$. *Example* $lim_(n->infinity) cos(n)/n $ Sol. As $-1<=cos(n)<=1 => -1/n<=cos(n)/n<=1/n$ $and lim_(n->infinity)1/(|n|)=0=L$. Hence limit converges to 0. == Johann Bernoulli Rule (L'Hôpital's Rule) It states that functions $f and g $ which are defined on open interval $I$ and differentiable on ${I-c}$ for a (possibly infinite) accumulation point $c$ of $I$, $ lim_(x-> c) f(x)/g(x)= lim_(x->c) (f'(x))/(g'(x)) $ / Definition: A sequence ${a_n}$ is *nondecreasing* if $a_n<=a_(n+1) forall n$. The sequence is *nonincreasing* if $a_n>=a_(n+1) forall n.$ The sequence is said to be *monotonic* if it is either nondecreasing or nonincreasing. / Monotonic sequence Theorem: If a sequence ${a_n}$ is both bounded and monotonic, then sequence $c o n v e r g e s .$ / Stirling's Approximation: $n!$ $~ sqrt(2 pi n)(n/e)^n$ == An Infinite Series / Definition: Given a sequence of numbers ${a_n}$, an expression of form $ a_1+a_2+a_3+dots+a_n+dots $ is an *infinite series*. $a_n$ is *nth term* of series. The sequence ${s_n} $ is defined by $ s_1=a_1 $ $ s_2=a_1+a_2 $ $ . $ $ . $ $ s_n=a_1+a_2+dots+a_n= sum_(k=0)^(n) a_k $ is the sequence of partial sums of series, the number $s_n$ being nth partial sum. If sequence of partial sum converges to limit *L* $ s_n =L $ == Integral Test / Definition: Let ${a_n}$ be a sequence of positive terms. Suppose that $a_n=f(n)$, where $f$ is a continuous, positive, decreasing function of $x forall x >=N$.Then the series $sum_(n=N)^(infinity) a_n$ and the integral $integral_(N)^(infinity) f(x)d x$ both diverge or both converge. *Example* Show that *p-series* $ sum_(n=1)^(infinity) 1/n^p = 1/1^p+1/2^p+dots+1/n^p+ dots, p in RR $ converges if $p>1$ and diverges if $p<=1$. 
*Proof* If $p>1$, then $f(x)=1/x^p$ is nonincreasing function $forall x in RR^+$ $ integral_(1)^(infinity) (dif x )/(x^p)= lim_(epsilon-> infinity) [(x^(-p+1))/(p+1)]_1^epsilon=1/(p-1) $ If $0<p<1$ then $1-p>0 and integral_(1)^(infinity) (dif x )/(x^p)=1/(1-p) lim(b^(1-p)-1)=infinity $ Series diverges by integral test. == Comparison Test / Definition : Let $sum a_n,sum c_n and sum d_n$ be series with non negative terms. Suppose that for some integer $N$ $ d_n<=a_n<=c_n forall n >N $ 1. If $sum c_n$ converges then $sum a_n$ also converges. 2. If $sum d_n$ diverges then $sum a_n$ also diverges. *Limit Comparison Test* / Definition: Suppose that $a_n>0 and b_n >0$ $ forall n>=N, N in ZZ$ 1. If $lim_(n -> infinity) a_n/b_n =c>0,$ then $sum a_n and sum b_n$ both converge or diverge. 2. If $lim_(n -> infinity ) a_n/b_n=0$ and $sum b_n$ converges, then $sum a_n$ coverges. 3. If $lim_(n -> infinity ) a_n/b_n=infinity$ and $sum b_n$ diverges, then $sum a_n$ diverges. *Absolute Convergence Test* If $sum_(n=1)^(infinity)|a_n|$ converges, then $sum_(n=1)^(infinity)a_n$. == The Ratio Test Let $sum a_n$ be any series and suppose that $ lim_(n -> infinity) |a_(n+1)/a_n| =rho $ Then 1. Series *converges* absolutely if $rho <1$ 2. Series *diverges* if $rho>1$ 3. Test is inconclusive if $rho=1$ / The Root Test: Let $sum a_n$ be any series and suppose that $ lim_(n -> infinity) (|a_n|)^(1/n) = rho $ Then 1. Series *converges* absolutely if $rho <1$ 2. Series *diverges* if $rho>1$ 3. Test is inconclusive if $rho=1$ == Raabe's Test Let $sum a_n$ be any series, then $ rho_n equiv n(a_n/a_(n+1)-1 ) $ 1. Converge if $rho=lim_(n-> infinity) rho_n>1$ 2. Diverge if $lim_(n -> infinity) rho_n<1$ 3. Test is inconclusive if $rho=1$ / Betrand Test: Let $sum a_n (a_n >0, forall n in NN )$ be any series, then $ rho_n equiv (n(a_n/a_(n+1)-1 )-1)ln(n) $ 1. Converges if $ lim_(n -> infinity)rho_n>1$ 2. Diverges if $lim_(n-> infinity)rho_n<1$ == The Alternating Series Test The series $ sum_(n=1)^(infinity) (-1)^(n+1) u_n=u_1-u_2+u_3-u_4+dots $ converges if 1. The $u_n 's$ are all positive. 2. The positive $u_n 's$ are nonincreasing $u_n>=u_(n+1) forall n>=N$ for some $N in ZZ$ 3. $u_n -> 0$ == Exercises 1. Show that if $sum_(n=1)^(infinity) s_n$ converges then $ sum_(n=1)^(infinity) ((1+sin(s_n))/2)^n $ converges. 2. Prove that *(The Basel problem)* $ sum_(n=1)^(infinity) 1/n^2 = pi^2/6 $ (HINT: Use fourier series $f(x)=x^2 and x in [-pi,pi]$) / References: 1. Springer, "Elementary Analysis",$T h e o r y $ $o f $ $ C a l c u l u s$,<NAME>, edition 2., pp. 33-46,2013 <NAME>, "Early Transcendentals",$T h o m a s's$ $ C a l c u l u s$,<NAME>, Jr.,edition 13.,pp. 572-652,2014
https://github.com/JakMobius/courses
https://raw.githubusercontent.com/JakMobius/courses/main/mipt-os-basic-2024/sem02/main.typ
typst
#import "@preview/polylux:0.3.1": * #import "@preview/cetz:0.2.2" #import "../theme/theme.typ": * #import "./utils.typ": * #import "./floats.typ": * #import "./encodings.typ" #show: theme #title-slide[ #align(horizon + center)[ = Представление данных в компьютере АКОС, МФТИ 19 сентября, 2024 ] ] #show: enable-handout #slide(header: [Как работают целые числа], place-location: center+horizon)[ #draw-number-bits(137) ] #slide(header: [Целочисленные типы в C])[ - #codebox(lang: "c", "char") : *1 байт* (или #codebox(lang: "c", "CHAR_BIT") бит) данных; - #codebox(lang: "c", "short") и #codebox(lang: "c", "int") : не менее *16 бит* данных; - #codebox(lang: "c", "long") : не менее *32 бит* данных; - #codebox(lang: "c", "long long") : не менее *64 бит* данных. == Типы фиксированной длины (#codebox(lang: "c", "#include <stdint.h>")): - #codebox(lang: "c", "int8_t") и #codebox(lang: "c", "uint8_t") : строго *8 бит*; - #codebox(lang: "c", "int16_t") и #codebox(lang: "c", "uint16_t") : строго *16 бит*; - #codebox(lang: "c", "int32_t") и #codebox(lang: "c", "uint32_t") : строго *32 бита*; - #codebox(lang: "c", "int64_t") и #codebox(lang: "c", "uint64_t") : строго *64 бита*. ] #slide(header: [Как работают знаковые числа], background-image: none)[ #place(center+horizon)[ #draw-number-bits(-23, signed: true) ] #place(bottom)[ #set list(marker: none) - #colbox(color: gray)[⚠️] : Любое отрицательное число начинается с #codebox("1") и наоборот; - #colbox(color: gray)[⚠️] : Конвертация знаковых типов друг к другу становится менее тривиальным. ] ] #slide(header: [Знаковые и беззнаковые типы в C], background-image: none)[ - #codebox(lang: "c", "char") : *не определено стандартом*. - #codebox(lang: "c", "short") , #codebox(lang: "c", "int") , #codebox(lang: "c", "long") и #codebox(lang: "c", "long long") : по умолчанию *знаковые*; - Любой из типов выше можно сделать: - *Знаковым* (напр., #codebox(lang: "c", "signed char")); - *Беззнаковым* (напр., #codebox(lang: "c", "unsigned int")). #uncover((beginning: 2))[ == Типы фиксированной длины (#codebox(lang: "c", "#include <stdint.h>")): - #codebox(lang: "c", "int16_t") : *знаковое* 16-битное число; - #codebox(lang: "c", "uint16_t") : *беззнаковое* 16-битное число (#codebox("u") в начале от слова #codebox(lang: "c", "unsigned")); ] #uncover((beginning: 3))[ #colbox(color: red)[⚠️] : Знаковые типы *нельзя переполнять* - в Си это UB. Беззнаковые -- можно. ] ] #slide(header: [Как хранить длинные типы?], place-location: horizon)[ = #codebox(lang: "c", "(uint16_t) 19847 = ") #draw-short(19847, endian: "big") #uncover((beginning: 2))[ = Не будет ли проблем с такой схемой?... ] ] #slide(header: [Что может пойти не так?], background-image: none, place-location: horizon)[ #set text(size: 25pt) #code(numbers: true)[```c int main() { uint64_t my_long = 42; printf("%d\n", &my_long); // Что выведет? } ```] ] #slide( place-location: horizon)[ #table( columns: 2, stroke: none, codebox(lang: "c", "uint8_t"), draw-simple-bits(endian: "big", 8), codebox(lang: "c", "uint16_t"), draw-simple-bits(endian: "big", 16), codebox(lang: "c", "uint32_t"), draw-simple-bits(endian: "big", 32), codebox(lang: "c", "uint64_t"), draw-simple-bits(endian: "big", 64), ) #let theme = cell-color(base-color: blue) #colbox(color: gray)[⚠️] : *Младший #box( fill: theme.background-color, inset: 7pt, stroke: 1pt + theme.stroke-color, baseline: 0.1em + 5pt, )[ #set text(fill: theme.text-color) синий ] байт оказывается в разных местах*. 
] #slide(header: [Сложности приведения типов], background-image: none, place-location: horizon)[ #text(size: 25pt, { code(numbers: true)[```c uint8_t a_byte = 42; uint16_t a_16b = a_byte; // Перенесёт 42 во второй байт uint32_t a_32b = a_byte; // Перенесёт 42 в четвёртый байт uint64_t a_64b = a_byte; // Перенесет 42 в восьмой байт ```] }) #colbox(color: gray)[⚠️] : В схеме Big-Endian *каждый целочисленный каст* требует *менять порядок байт*: ] #slide(header: [Что, если хранить байты наоборот?], place-location: horizon)[ = #codebox(lang: "c", "(uint16_t) 19847 = ") #draw-short(19847, endian: "little") ] #slide(header: [Что, если хранить байты наоборот?], place-location: horizon)[ #table( columns: 2, stroke: none, codebox(lang: "c", "uint8_t"), draw-simple-bits(8), codebox(lang: "c", "uint16_t"), draw-simple-bits(16), codebox(lang: "c", "uint32_t"), draw-simple-bits(32), codebox(lang: "c", "uint64_t"), draw-simple-bits(64), ) #colbox(color: green)[✔] В такой схеме (Little-Endian) приведение целочисленных типов не требует перемещения байт. ] #slide(background-image: none)[ #[ #set list(marker: none) == Схема Big-Endian (BE, cначала старший байт) - #pro() Удобная для человека. Порядок бит как в десятичной записи; - #pro() Позволяет ускорять #codebox("strcmp") и #codebox("memcmp") ; - #con() Иногда сложнее приводить типы. == Схема Little-Endian (LE, cначала младший байт) - #pro() Легко кастуется туда-обратно; - #con() Чуть-чуть ломает мозг. ] == К чему пришли люди: - x86 всегда LE. ARM по умолчанию LE, но поддерживает оба варианта. - В Big-Endian вводятся двоичные литералы: #codebox(lang: "c", "0b1000000000000000 == 32768") , а не #codebox(lang: "c", "128"); - А еще Big-Endian используется при передачи данных по сети. ] #focus-slide[ #text(size: 40pt)[*Дробные числа*] ] #slide(header: [Дробные числа], place-location: center + horizon)[ #draw-number-bits(fractional: 3, 10.675) = Несколько младших разрядов можно зарезервировать под дробную часть. Получится #codebox("fixed-point"). ] #slide(header: [Недостатки #codebox("fixed-point")], place-location: horizon)[ #[ #set list(marker: none) - #con() Для каждой задачи *нужно подбирать оптимальное количество дробных бит*. - #con() При неоптимальном порядке представление либо *теряет относительную точность*, либо *быстро переполнится*. ] #uncover((beginning: 2))[ == К чему пришли люди: - Давайте *менять число дробных бит на ходу*; - Выделим несколько бит под *счётчик*; - Назовём это #emoji.sparkles #codebox("floating-point") #emoji.sparkles. 
] ] #slide(header: [#codebox("floating-point") в IEEE 754], background-image: none)[ // #draw-float("FLOAT_MIN", 3, 4, float: float-from-bits((false, false, false, false, false, false, false, true), 3, 4)) // #draw-float("Почти 1", 3, 4, float: float-from-bits((false, false, true, true, false, false, false, true), 3, 4)) #place(center + horizon, { let float = float-from-string(str(calc.pi), 3, 4) draw-float-scheme([*$pi approx #float-value(float)$* = ], float) }) #place(center + bottom)[ #set text(size: 25pt) (*Вообще, в IEEE 754 минимум 16 бит, но суть та же*) ] ] #slide(background-image: none, place-location: horizon + center)[ #set text(size: 30pt) 🔗 #link( "https://www.h-schmidt.net/FloatConverter/IEEE754.html", )[*Интерактивная визуализация работы #codebox("float")*] ] #slide(background-image: none, place-location: horizon + center)[ #let float = float-from-string(str(calc.pi), 3, 4) #set text(size: 60pt) #draw-float-formula(float) #set text(size: 25pt) *На самом деле, двоичная аналогия этого:* #set text(size: 60pt) $1.616255 dot 10^(-35)$ ] #slide(background-image: none)[ #let scale = 0.8 #let draw-scale() = { cetz.draw.fill(black) for i in array.range(-16, 18, step: 2) { let x = i * scale cetz.draw.circle((x, -0.3), radius: 0.1) cetz.draw.content((x - 0.5, -0.5), (x + 0.5, -1.5), padding: 0, { align(center + horizon)[ *#i* ] }) } } = #codebox("fixed-point") (знаковый, 8 бит, 3-б. дробь) #cetz.canvas(length: 1cm, { draw-scale() for i in range(-128, 128) { let value = (i / 8) * scale cetz.draw.line((value, 0), (value, 1)) } }) #v(1em) = #codebox("floating-point") (3-б. экспонента, 4-б. мантисса) #cetz.canvas(length: 1cm, { draw-scale() for i in range(256) { let float = float-from-bits(get-bit-array(i), 3, 4) if not float-is-infinity(float) and not float-is-nan(float) { let value = float-value(float) * scale cetz.draw.line((value, 0), (value, 1)) } } }) #v(1em) #set list(marker: none) - #pro() Макс. погрешность на $(1, 15)$ стала *3%* против *5.6%*; - #con() *32 из 256 значений (12%)* уходят на служебные (#codebox("NaN"), #codebox("Inf")). ] #slide(header: [Спец. значения #codebox("floating-point") в IEEE 754], background-image: none)[ #place(horizon + center)[ #set text(size: 25pt) #table( columns: 2, stroke: none, align: (right, center), codebox("+Inf"), draw-float-inline(float-infinity(false, 3, 4)), codebox("-Inf"), draw-float-inline(float-infinity(true, 3, 4)), codebox("qNaN"), draw-float-inline(float-nan(3, 4)), codebox("sNaN"), draw-float-inline(float-from-bits((false, true, true, true, false, true, false, false), 3, 4)), codebox("0"), draw-float-inline(float-zero(false, 3, 4)), codebox("-0"), draw-float-inline(float-zero(true, 3, 4)) ) ] #place(bottom)[ - При нулевой экспоненте включается *денормализованный режим*. Он заменяет старшую единицу на ноль. Так сохраняется плотность значений близко к нулю. 
] ] #slide(header: [Сложение #codebox("floating-point")], place-location: center + bottom, background-image: none)[ #let float-a-name = "4.5" #let float-a = float-from-string(float-a-name, 3, 4) #let float-b-name = "-4" #let float-b = float-from-string(float-b-name, 3, 4) #draw-float-add-slide(float-a-name, float-a, float-b-name, float-b) ] #slide(header: [Сложение #codebox("floating-point") разных порядков], place-location: center + bottom, background-image: none)[ #let float-a-name = "1.625" #let float-a = float-from-string(float-a-name, 3, 4) #let float-b-name = "4.5" #let float-b = float-from-string(float-b-name, 3, 4) #draw-float-add-slide(float-a-name, float-a, float-b-name, float-b) ] #slide(header: [Денормализованные #codebox("floating-point")], place-location: center + bottom, background-image: none)[ #let float-a-name = "0.5" #let float-a = float-from-string(float-a-name, 3, 4) #let float-b-name = "-0.03125" #let float-b = float-from-string(float-b-name, 3, 4) #draw-float-add-slide(float-a-name, float-a, float-b-name, float-b) ] #slide(header: [Умножение #codebox("floating-point")], place-location: center + bottom, background-image: none)[ #let float-a = float-from-string(str(calc.pi), 3, 4) #let float-a-name = math.pi #let float-b = float-from-string(str(calc.e), 3, 4) #let float-b-name = [e] #draw-float-mul-slide(float-a-name, float-a, float-b-name, float-b) ] #slide(header: [Размеры реальных #codebox("floating-point")], place-location: horizon + center)[ #set text(size: 35pt) #let head(content) = { text(size: 30pt)[*#content*] } #let bits(content) = { text(size: 30pt)[#content] } #table( columns: 3, stroke: none, align: (right, center, center), inset: (x: 30pt, y: 15pt), table.header( head([]), head([Экспонента]), head([Мантисса]) ), table.hline(), codebox(lang: "c", "float"), bits([*8* бит]), bits([*23* бита]), codebox(lang: "c", "double"), bits([*11* бит]), bits([*52* бита]) ) ] #slide(header: [Further reading], background-image: none, place-location: horizon + center)[ #set text(size: 30pt) 🔗 #link( "https://dl.acm.org/doi/pdf/10.1145/93548.93557?download=false", )[*Как переводить строку в #codebox("float")*] 🔗 #link( "https://dl.acm.org/doi/pdf/10.1145/93548.93559?download=false", )[*Как переводить #codebox("float") в строку*] ] #focus-slide[ #only("1")[ #text(size: 40pt)[*НћЉЏЄћЇљЏ*] ] #only("2")[ #text(size: 40pt)[*Кодировки*] ] ] #let special-char-box(color: rgb(60, 60, 60), char) = { box( baseline: 0.1em + 5pt, inset: (x: 5pt, y: 5pt), radius: 3pt, fill: color, )[ #set text(size: 15pt, baseline: -1pt, fill: white) #raw(char) ] } #let draw-ascii-table(codepage: none) = { let special-chars = ("NUL", "SOH", "STX", "ETX", "EOT", "ENQ", "ACK", "BEL", "BS", "TAB", "LF", "VT", "FF", "CR", "SO", "SI", "DLE", "DC1", "DC2", "DC3", "DC4", "NAK", "SYN", "ETB", "CAN", "EM", "SUB", "ESC", "FS", "GS", "RS", "US") let rows = if codepage == none { 8 } else { 16 } table( columns: array.range(17).map(i => 42pt), rows: array.range(rows + 1).map(i => 24pt), inset: 0pt, stroke: none, [], ..array.range(16).map(num => raw("0x" + str(num, base: 16))), table.hline(), table.vline(x: 1), ..array.flatten(array.range(rows).map((y) => { let row = "0x" + str(y * 16, base: 16) if y == 0 { row += "0" } return (raw(row), ..array(range(16)).map((x) => { let code = y * 16 + x if code < special-chars.len() { special-char-box(special-chars.at(code)) } else if code == 32 { raw("␣") } else if code == 127 { special-char-box("DEL") } else if code >= 128 { let char = codepage.at(code - 128) if(type(char) == str) 
{ raw(char) } else { special-char-box(color: char.at("color", default: rgb(60, 60, 60)), char.name) } } else { let str = str.from-unicode(code) raw(str) } })) })) ) } #slide(place-location: horizon + center, background-image: none)[ #draw-ascii-table() #align(center)[ *Это ASCII-таблица.* #[*A*]merican #[*S*]tandard #[*C*]ode for #[*I*]nformation #[*I*]nterchange Кратно старше каждого в этой аудитории ] ] #slide(place-location: horizon + center, background-image: none)[ #set block(above: 14pt, below: 14pt) #draw-ascii-table(codepage: encodings.cp1251) *CP1251 - Наша, родная кодировка.* ] #slide(place-location: horizon + center, background-image: none)[ #set block(above: 14pt, below: 14pt) #draw-ascii-table(codepage: encodings.cp1256) *CP1256 - арабская кодировка.* ] #slide(place-location: horizon + center, background-image: none)[ #set block(above: 14pt, below: 14pt) #draw-ascii-table(codepage: encodings.cp437) *CP437 - стандартная кодировка IBM PC.* Также известна как Alt Codes ] #slide(header: [Кодовая страница 437], place-location: horizon + left)[ - #link("https://en.wikipedia.org/wiki/Code_page_437#Internationalization")[*Покрывает алфавиты нескольких языков*]: Английского, Немецкого и Шведского; - Содержит *символы национальных валют* (¢, £, ¥, ƒ, ₧); - Позволяет рисовать *таблички псевдографикой*; - *Очень пытается быть универсальной*, но нельзя объять необъятное. ] #slide(place-location: center + horizon)[ #box(width: 80%, height: 10cm)[ #align(center)[ #set block(above: 18pt, below: 18pt) Его Величество #text(size: 60pt, weight: "black")[Unicode] ] #v(2cm) #align(left)[ - Максимально полный стандарт; - Содержит *155 063* символов (по состоянию на 10 сент. 2024); - Постоянно расширяется; - *Как вместить его во всем привычную ASCII?* ] ] ] #slide(place-location: horizon + center, background-image: none)[ #set block(above: 14pt, below: 14pt) #draw-ascii-table(codepage: encodings.utf8) *UTF-8.* Кодирует символы последовательностью от 1 до 4 байт. ] #slide(header: [Кодировка UTF-8], background-image: none)[ #let theme = (i, bit) => { let color = if bit == "1" { blue } else if bit == "0" { black } else { green } cell-color(active: true, base-color: color) } #let draw-bytes(bytes) = { let cell-size = (x: 0.85, y: 1) box(baseline: 0.3em, { cetz.canvas(length: 1cm, { for (i, byte) in bytes.split(" ").enumerate() { let bit-array = byte.codepoints() draw-bits-boxes((8 + 0.15) * i, 0, cell-size, bit-array, theme, (i, bit) => { let theme = theme(i, bit) let mask = calc.pow(2, bit-array.len() - i - 1) set text(weight: "bold", fill: theme.text-color) if bit == "*" { bit = "" } [#bit] }) } }) }) } #place(horizon)[ #draw-bytes("0*******") #h(7pt) -- Обычный ASCII-символ (7 бит, 128 символов) #draw-bytes("110***** 10******") #draw-bytes("1110**** 10****** 10******") #draw-bytes("11110*** 10****** 10****** 10******") ] #place(bottom)[ - В зависимости от длины кода, используется разная кодировка. - Всего можно закодировать *1114111* символов. ] ] #title-slide[ #place(horizon + center)[ = Спасибо за внимание! ] #place( bottom + center, )[ // #qr-code("https://github.com/JakMobius/courses/tree/main/mipt-os-basic-2024", width: 5cm) #box( baseline: 0.2em + 4pt, inset: (x: 15pt, y: 15pt), radius: 5pt, stroke: 3pt + rgb(185, 186, 187), fill: rgb(240, 240, 240), )[ 🔗 #link( "https://github.com/JakMobius/courses/tree/main/mipt-os-basic-2024", )[*github.com/JakMobius/courses/tree/main/mipt-os-basic-2024*] ] ] ] #floats-test()
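To make the slides' toy float format concrete, here is a worked decomposition of pi in the 3-bit-exponent / 4-bit-mantissa scheme used above, assuming the usual IEEE-style bias $2^{3-1}-1 = 3$ and round-to-nearest (both are assumptions, since the slides do not state them explicitly):

$$
\pi \approx 11.00100100\ldots_2 = 1.1001001\ldots_2 \times 2^{1}
\;\Rightarrow\;
s = 0,\quad e = 1 + 3 = 100_2,\quad m = 1001_2,
$$
$$
(-1)^{0}\cdot\left(1 + \tfrac{9}{16}\right)\cdot 2^{4-3} = 1.5625 \cdot 2 = 3.125,
$$

a relative error of about $0.5\,\%$, consistent with the error bound quoted on the comparison slide.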
https://github.com/TypstApp-team/typst
https://raw.githubusercontent.com/TypstApp-team/typst/master/tests/typ/layout/page.typ
typst
Apache License 2.0
// Test the page class. --- // Just empty page. // Should result in auto-sized page, just like nothing. #page[] --- // Just empty page with styles. // Should result in one conifer-colored A11 page. #page("a11", flipped: true, fill: conifer)[] --- // Set width and height. // Should result in one high and one wide page. #set page(width: 80pt, height: 80pt) #[#set page(width: 40pt);High] #[#set page(height: 40pt);Wide] // Flipped predefined paper. #[#set page(paper: "a11", flipped: true);Flipped A11] --- // Test page fill. #set page(width: 80pt, height: 40pt, fill: eastern) #text(15pt, font: "Roboto", fill: white, smallcaps[Typst]) #page(width: 40pt, fill: none, margin: (top: 10pt, rest: auto))[Hi] --- // Just page followed by pagebreak. // Should result in one forest-colored A11 page and one auto-sized page. #page("a11", flipped: true, fill: forest)[] #pagebreak() --- // Layout without any container should provide the page's dimensions, minus its margins. #page(width: 100pt, height: 100pt, { layout(size => [This page has a width of #size.width and height of #size.height ]) h(1em) place(left, rect(width: 80pt, stroke: blue)) })
https://github.com/NOOBDY/formal-language
https://raw.githubusercontent.com/NOOBDY/formal-language/main/q11.typ
typst
The Unlicense
#let q11 = [ 11. (Bonus) (Morphisms between `NFA`s) In this problem we assume that we are considering `NFA`s without $epsilon.alt$-transitions. Given two `NFA`s $N_i = (Q_i, Sigma, delta_i, q_(0i), F_i)$, $i = 1, 2$, we say that a relation $phi subset.eq Q_1 times Q_2$ is a _simulation_ of $N_1$ by $N_2$, denoted by $phi : N_1 -> N_2$, if the following properties hold: - $(q_(01), q_(02)) in phi$. - Whenever $(p, q) in phi$, for every $r in delta_1(p, a)$, there is some $s in delta_2(q, a)$ so that $(r, s) in phi$, for all $a in Sigma$. - Whenever $(p, q) in phi$, if $p in F_1$ then $q in F_2$. 1. If $N_1$ and $N_2$ are actually `DFA`s, show that an $F$-map $phi : N_1 -> N_2$ of `DFA`s is a simulation of $N_1$ by $N_2$. 2. Let $phi : N_1 -> N_2$ be a simulation of $N_1$ by $N_2$. Prove that for every $w in Sigma^ast$, for every $q_1 in hat(delta)_q (q_(01), w)$, there is some $q_2 in hat(delta)_2(q_(02), w)$ so that $(q_1, q_2) in phi$. $ &(q_(01), q_(02)) in phi subset.eq Q_1 times Q_2 \ &q_((i+1)1) = hat(delta)_1(q_(i 1), w_i) \ &q_((i+1)2) = hat(delta)_2(q_(i 2), w_i) \ because &forall i in NN, (q_((i+1)1), q_((i+1)2)) in phi \ therefore &(q_1, q_2) in phi $ 3. Conclude that $L(N_1) subset.eq L(N_2)$. $ &(q_(01), q_(02)) in phi \ &forall w in Sigma^ast, hat(delta)_1 (q_(01), w) in F_1 \ &hat(delta)_2 (q_(02), w) in F_2 $ + If $N_1$ is an `NFA` and $N_2$ is a `DFA`, prove that if $L(N_1) subset.eq L(N_2)$, then there is some simulation $phi : N_1 -> N_2$ of $N_1$ by $N_2$. Hint. Consider the relation $phi = {(q_1, q_2) | q_1 in hat(delta)_1(q_(01), w), q_2 = hat(delta)_2(q_(02), w), w in Sigma^ast}$. Remark. If $N_1$ and $N_2$ are `DFA`s and $L(N_1) subset.eq L(N_2)$, then there may not exist any `DFA` map from $N_1$ to $N_2$, but above shows that there is always a simulation of $N_1$ by $N_2$. + Give a counter-example showing that (c) is generally _false_ for `NFA`s, i.e., if $N_1$ and $N_2$ are both `NFA`s and $L(N_1) subset.eq L(N_2)$, there may not be any simulation $phi : N_1 -> N_2$. In order to salvage (c), we modify the conditions of the definition of a simulation: we say that $phi : N_1 -> N_2$ is a _generalized simulation_ (or _g_-simulation) if - $(q_(01), q_(02)) in phi$ - Whenever $(p, q) in phi$, for all $a in Sigma$, if $delta_1(p, a) != emptyset$ and $delta_2(q, a) != emptyset$, then for every $r in delta_1(p, a)$, there is some $s in delta_2(q, a)$ so that $(r, s) in phi$. - For all $w in Sigma^ast$ with $|w| < n_1 2^(n_2)$, for every $q_1 in hat(delta)_1(q_(01), w e) sect F_1$, there is some $q_2 in hat(delta)_2(q_(02), w)sect F_2$ so that $(q_1, q_2) in phi$. 6. Prove that $L(N_1) subset.eq L(N_2)$ iff there is some _g_-simulation $phi : N_1 -> N_2$. + We say that $phi : N_1 -> N_2$ is a _g-bisimulation_ between $N_1$ and $N_2$ if $phi$ is a _g_-simulation between $N_1$ and $N_2$ and $phi^(−1)$ is a _g_-simulation between $N_2$ and $N_1$ (recall that $phi^(−1) = {(q, p) in Q_2 times Q_1 | (p, q) in phi}$). + Prove that $L(N_1) = L(N_2)$ iff there is some _g_-bisimulation between $N_1$ and $N_2$. ]
https://github.com/trondhauklien/typst-resume
https://raw.githubusercontent.com/trondhauklien/typst-resume/main/README.md
markdown
# Typst resume

A simple resume template that uses Typst as the markup language, with data imported from a JSON file.

### Usage

The Typst CLI can be installed on Windows, macOS, or Linux. See the [Typst README](https://github.com/typst/typst#installation).

Compile the resume to a PDF:

```bash
typst compile main.typ
```

### Example

See the compiled example PDF [`main.pdf`](https://github.com/trondhauklien/typst-resume/blob/main/main.pdf).
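As a rough illustration of the JSON-driven approach described above, a Typst file can load the data with `json()` and iterate over it. The file name and field names below (`resume-data.json`, `name`, `experience`, and so on) are placeholders for illustration, not this template's actual schema:

```typst
// Hypothetical data file and fields; adapt to the schema the template actually uses.
#let data = json("resume-data.json")

= #data.name
#data.email

== Experience
#for job in data.experience [
  - *#job.role*, #job.company (#job.years)
]
```

Compiling with `typst compile main.typ` as above then renders the loaded entries into the PDF.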
https://github.com/typst-jp/typst-jp.github.io
https://raw.githubusercontent.com/typst-jp/typst-jp.github.io/main/CONTRIBUTING.md
markdown
Apache License 2.0
# 翻訳ガイドライン > [!NOTE] > 当プロジェクトの[README](./README.md)や「[はじめに:Typst Japan Communityより](https://typst-jp.github.io/docs/)」、[Typst公式](https://typst.app/)の[ライセンス](https://github.com/typst/typst/blob/main/LICENSE)や[コントリビューション・ガイド](https://github.com/typst/typst/blob/main/CONTRIBUTING.md)も併せてご参照ください。 Typst日本語ドキュメント翻訳プロジェクトにご興味をお持ちいただき、どうもありがとうございます。 このプロジェクトは、[Typst GmbH](https://typst.app/legal/)の許諾を得て、最新の[公式のドキュメント](https://typst.app/docs/)より翻訳を行うことで、非公式な日本語ドキュメントを提供することを目的としています。まさに、あなたのようなボランティアの皆様のご協力の元、成り立っています。当ガイドラインをご一読の上、翻訳・校正・提案及びその他の作業にご参加いただければ幸いです。 この翻訳ガイドラインは、翻訳に参加する皆様に、翻訳の進め方に対する説明やより良質な翻訳を行うためのガイダンスを提供します。 ## 翻訳の進め方 翻訳は[GitHub上の当リポジトリ](https://github.com/typst-jp/typst-jp.github.io)を中心に行います。実際の翻訳作業やそれに対する議論や提案などは、主にGitHubの[Issue](https://github.com/typst-jp/typst-jp.github.io/issues)や[Pull Request](https://github.com/typst-jp/typst-jp.github.io/pulls)機能を通じて行います。また、[Discordサーバー「くみはんクラブ」](https://discord.gg/9xF7k4aAuH)の`#typst-翻訳`チャンネルでも、質問の対応や合意の形成などを行うことがあります。 ### 翻訳提案の手順 1. このGitHubリポジトリをフォークします。 2. ドキュメントの実体は、主にMarkdownおよびYAMLの2種類のファイルから構成されています。それぞれ、下記の注意書きに従って翻訳作業をお願いします。 1. `./docs/i18n/**/`内のYAMLファイル群は、Typstの言語リファレンスの本体です。その中に含まれている、**既存の`*-ja.yaml`ファイルを直接書き換えて翻訳してください**。**`*-en.yaml`や`*-zh.yaml`は翻訳しないでください**。 - 例:[Reference > Model](https://typst.app/docs/reference/model/)を翻訳する際は、`./docs/i18n/category/model-ja.yaml`を編集してください。`model-en.yaml`や`model-zh.yaml`は放置してください。 2. `./docs`内のMarkdownファイル群は、Typstのチュートリアルや入門ガイドなど、言語リファレンス以外のページの本体です。**既存のMarkdownファイルを直接書き換えて翻訳してください**。 それに加えて、`./docs/src/lib.rs`ファイルの[`urlify`関数](https://github.com/search?q=repo%3Atypst-jp/typst-jp.github.io%20urlify&type=code)を編集して、中国語版の記事タイトルを日本語版のものに書き換えてください。このプロセスを抜かすと、WebページのURLが正しく生成されません。 3. 「サードパーティパッケージ」のページの翻訳を追加する場合は、`./static/assets/index2ja.json`も編集する必要があります。 3. 翻訳の際は、[後述のガイドライン](#スタイルマニュアル)を参照し、[v0.12.0時点での公式ドキュメント](https://github.com/typst/typst/tree/v0.12.0/docs)から翻訳してください。 4. 翻訳作業の途中でも、Draft Pull Requestを作成して、翻訳の進捗状況を共有することができます。 5. 翻訳作業が終わったら、Pull Requestを作成し、送信してください。 ご質問などがある場合は、[「くみはんクラブ」のDiscordサーバー](https://discord.gg/9xF7k4aAuH)に参加してご連絡ください。 もちろん、Discordサーバーに参加していない方からのPull Requestも大いに歓迎します。 ### 技術的な詳細 [中国語版](https://github.com/typst-doc-cn/typst-doc-cn.github.io?tab=readme-ov-file#%E6%8A%80%E6%9C%AF%E7%BB%86%E8%8A%82)を参照してください。 ### ローカル環境でドキュメントを生成する 変更したMarkdown/YAMLファイルから、ローカル環境でWebサイトのデータを生成することも可能です。翻訳の際にこの作業は必須ではありませんが、書き換えたファイルがWebページとして正しく表示されるのか確認するのに役立ちます。 まず、このリポジトリのクローンを作成し、`cargo`ツールチェーン、PythonおよびPythonパッケージの`jinja2`と`pyyaml`をインストールする必要があります。 ``` # `./docs`以下のディレクトリを変更した場合は、次の2行のコマンドを実行する必要があります cargo test --package typst-docs --lib -- tests::test_docs --exact --nocapture # `./docs/i18n`ディレクトリのみを変更した場合は、このコマンド行を実行するだけで済みます python ./gen.py ``` 最終的にコンパイルされたファイルは`./dist`に出力されます。 Node.jsがインストールされている場合は、`npx serve ./dist`でプレビューできます。 上記のローカル環境を構築するDockerfileも整備しております。詳細は[.devcontainer/README.md](.devcontainer/README.md)をご参照ください。 ## スタイルマニュアル スタイルマニュアルでは、当プロジェクトにおける翻訳の品質確保のための、統一したスタイルの参照基準を提供します。具体的には、基本、文体、表記、用語の4つの観点から、翻訳の際に留意すべき事項を示します。 本スタイルマニュアルは絶対的なルールではなく、翻訳全体の整合性を保つための基本方針として提供しているものです。そのため、本マニュアルの内容に必ず従う義務はなく、ケース・バイ・ケースで適用して翻訳を行ってください。本マニュアルの内容に疑問がある場合は、IssueやPull Requestなどで他の翻訳者に意見を求めることもできます。 ### 基本 1. 翻訳は、原則として説明文章や表などに限ります。コードやコマンドなどの技術的な表現は、原文のままとします。 3. コード記述例の中に出てくる英文のコメントは、日本語に翻訳する必要はありません。 4. 既存の翻訳を参照し、一貫性を保つようにしてください。 5. 疑問点、不明点などがある場合は、必要に応じて、積極的にIssuesやDiscordなどで議論・相談してください。 6. 構成や段分けなどについては、原文の構成をなるべく保つようにしてください。 ### 文体 文体については、以下のガイドラインに従ってください。 1. 基本的に「です」「ます」調の敬体を使用すること。ただし、引用、見出し、箇条書きなどに関しては、その限りではありません。 2. 
一般的に用いられる現代日本語共通語に基づき、平易的な表現を心がけること。 ### 表記 約物および日本語の表記については、以下のルールに従ってください。 1. 和欧混植がなされている訳文において、和文と欧文の間には半角スペースを手動で挿入しないこと。 2. 和文において、句点は「。」を、読点は「、」を使用し、他の記号も原則として全角を使用すること。 3. コロン「:」やセミコロン「;」は、原則として使用しないこと。ただし、「例:」など、文中以外で用いる場合はその限りではありません。 4. 数字や欧文(ラテン文字、キリル文字、ギリシア文字など)は、原則として半角を使用すること。 5. 原則、現代仮名遣いおよび常用漢字表に基づいた表記を使用すること。送り仮名や仮名書き、ひらがな・カタカナの使い分けは、一般的な書き方に従ってください。ただし、引用、特定の用語や固有名詞については、その限りではありません。 ### 用語 用語については、以下のガイドラインに従ってください。 1. [用語集](https://typst-jp.github.io/docs/glossary/)を参照すること。 1. 用語集にあってかつ適切と思われる場合は、その通りに翻訳してください。 2. 用語集にあっても不適切と思われる場合は、IssueやDiscordで相談してください 3. 必要と思われるのに用語集にない場合は、既存の翻訳を参照し、追加を提案してください。 2. 用語と用語でないものを、柔軟に見分けて訳し分けること。 3. 現代日本語の一般的な、わかりやすい用語を使用すること。 ### 参考 * [JTFスタイルガイド](https://www.jtf.jp/tips/styleguide) * [ウィキペディア日本語版のスタイルマニュアル](https://ja.wikipedia.org/wiki/Wikipedia:%E3%82%B9%E3%82%BF%E3%82%A4%E3%83%AB%E3%83%9E%E3%83%8B%E3%83%A5%E3%82%A2%E3%83%AB) * [ウィキペディア日本語版の表記ガイド](https://ja.wikipedia.org/wiki/Wikipedia:%E8%A1%A8%E8%A8%98%E3%82%AC%E3%82%A4%E3%83%89) * [Microsoft Localization Style Guides](https://learn.microsoft.com/ja-jp/globalization/reference/microsoft-style-guides) * [WordPress 翻訳ハンドブック](https://ja.wordpress.org/team/handbook/translation/) * [Vue.js 公式サイト日本語翻訳ガイド](https://github.com/vuejs-translations/docs-ja/blob/main/.github/CONTRIBUTING.md) * [ja.react.dev 翻訳スタイルガイド](https://github.com/reactjs/ja.react.dev/wiki/%E7%BF%BB%E8%A8%B3%E3%82%B9%E3%82%BF%E3%82%A4%E3%83%AB%E3%82%AC%E3%82%A4%E3%83%89)
https://github.com/frectonz/the-pg-book
https://raw.githubusercontent.com/frectonz/the-pg-book/main/book/213.%20want.html.typ
typst
want.html

What You (Want to)* Want

November 2022

Since I was about 9 I've been puzzled by the apparent contradiction between being made of matter that behaves in a predictable way, and the feeling that I could choose to do whatever I wanted. At the time I had a self-interested motive for exploring the question. At that age (like most succeeding ages) I was always in trouble with the authorities, and it seemed to me that there might possibly be some way to get out of trouble by arguing that I wasn't responsible for my actions. I gradually lost hope of that, but the puzzle remained: How do you reconcile being a machine made of matter with the feeling that you're free to choose what you do? [1]

The best way to explain the answer may be to start with a slightly wrong version, and then fix it. The wrong version is: You can do what you want, but you can't want what you want. Yes, you can control what you do, but you'll do what you want, and you can't control that.

The reason this is mistaken is that people do sometimes change what they want. People who don't want to want something — drug addicts, for example — can sometimes make themselves stop wanting it. And people who want to want something — who want to like classical music, or broccoli — sometimes succeed.

So we modify our initial statement: You can do what you want, but you can't want to want what you want.

That's still not quite true. It's possible to change what you want to want. I can imagine someone saying "I decided to stop wanting to like classical music." But we're getting closer to the truth. It's rare for people to change what they want to want, and the more "want to"s we add, the rarer it gets.

We can get arbitrarily close to a true statement by adding more "want to"s in much the same way we can get arbitrarily close to 1 by adding more 9s to a string of 9s following a decimal point. In practice three or four "want to"s must surely be enough. It's hard even to envision what it would mean to change what you want to want to want to want, let alone actually do it.

So one way to express the correct answer is to use a regular expression. You can do what you want, but there's some statement of the form "you can't (want to)* want what you want" that's true. Ultimately you get back to a want that you don't control. [2]

Notes

[1] I didn't know when I was 9 that matter might behave randomly, but I don't think it affects the problem much. Randomness destroys the ghost in the machine as effectively as determinism.

[2] If you don't like using an expression, you can make the same point using higher-order desires: There is some n such that you don't control your nth-order desires.

Thanks to <NAME>, <NAME>, <NAME>, and <NAME> for reading drafts of this.
https://github.com/Toniolo-Marco/git-for-dummies
https://raw.githubusercontent.com/Toniolo-Marco/git-for-dummies/main/slides/animations/pr.typ
typst
#import "@preview/touying:0.5.2": * #import "@preview/numbly:0.1.0": numbly #import "@preview/fletcher:0.5.1" as fletcher: node, edge #let fletcher-diagram = touying-reducer.with(reduce: fletcher.diagram, cover: fletcher.hide) #slide(repeat: 2, self => [ #let (uncover, only, alternatives) = utils.methods(self) #only("1")[ You can create Pull Requests either through the web interface (for each GitHub repo we have the dedicated section above) or via CLI: ```bash ➜ gh pr create ? Where should we push the 'feature-1' branch? [Use arrows to move, type to filter] > Username/project Skip pushing the branch Cancel ``` ] #only("2")[ As we said before, a Pull Request generally consists of: title, body (detailed description), list of commits. Below is the command we gave who precisely asks for the first two pieces of information: ```bash Creating pull request for Username:feature-1 into main in Official-Owner/project ? Title implemented feature-1 stuff ? Body <Received> ? What's next? Submit remote: remote: To https://github.com/Username/project.git * [new branch] HEAD -> feature-1 branch 'feature-1' set up to track 'my-fork/feature-1'. https://github.com/Official-Owner/project/pull/1 ``` Also associated with a PR, there are labels, customized according to the repo, required reviewers, which can be edited, and community comments. ] ])
https://github.com/Kasci/LiturgicalBooks
https://raw.githubusercontent.com/Kasci/LiturgicalBooks/master/CU/minea/1_generated/00_all/12_august.typ
typst
#import "../../../all.typ": * #show: book = #translation.at("M_12_august") #include "../12_august/06.typ" #pagebreak() #include "../12_august/15.typ" #pagebreak()
https://github.com/TeunSpithoven/Signals-And-Embedded-Systems
https://raw.githubusercontent.com/TeunSpithoven/Signals-And-Embedded-Systems/main/components/pre-toc.typ
typst
// CHANGE THIS TO THE CORRECT PATH #import "../template/fhict-template.typ": * = Pagina voor de table of contents == Pagina voor de table of contents === Pagina voor de table of contents ==== Pagina voor de table of contents
https://github.com/RaphGL/ElectronicsFromBasics
https://raw.githubusercontent.com/RaphGL/ElectronicsFromBasics/main/DC/chap7/4_component_failure_analysis.typ
typst
Other
=== Component failure analysis #quote(attribution: [P.A.M Dirac, physicist], block: true)[ I consider that I understand an equation when I can predict the properties of its solutions, without actually solving it. ] There is a lot of truth to that quote from Dirac. With a little modification, I can extend his wisdom to electric circuits by saying, \"I consider that I understand a circuit when I can predict the approximate effects of various changes made to it without actually performing any calculations.\" At the end of the series and parallel circuits chapter, we briefly considered how circuits could be analyzed in a _qualitative_ rather than _quantitative_ manner. Building this skill is an important step towards becoming a proficient troubleshooter of electric circuits. Once you have a thorough understanding of how any particular failure will affect a circuit (i.e. you don\'t have to perform any arithmetic to predict the results), it will be much easier to work the other way around: pinpointing the source of trouble by assessing how a circuit is behaving. Also shown at the end of the series and parallel circuits chapter was how the table method works just as well for aiding failure analysis as it does for the analysis of healthy circuits. We may take this technique one step further and adapt it for total qualitative analysis. By _\"qualitative\"_ I mean working with symbols representing \"increase,\" \"decrease,\" and \"same\" instead of precise numerical figures. We can still use the principles of series and parallel circuits, and the concepts of Ohm\'s Law, we\'ll just use symbolic _qualities_ instead of numerical _quantities_. By doing this, we can gain more of an intuitive \"feel\" for how circuits work rather than leaning on abstract equations, attaining Dirac\'s definition of \"understanding.\" Enough talk. Let\'s try this technique on a real circuit example and see how it works: #image("static/00132.png") This is the first \"convoluted\" circuit we straightened out for analysis in the last section. Since you already know how this particular circuit reduces to series and parallel sections, I\'ll skip the process and go straight to the final form: #image("static/00136.png") R#sub[3] and R#sub[4] are in parallel with each other; so are R#sub[1] and R#sub[2]. The parallel equivalents of $R_3//R_4$ and $R_1//R_2$ are in series with each other. Expressed in symbolic form, the total resistance for this circuit is as follows: $ R_"Total" = (R_1 "//" R_2) - (R_3 "//" R_4) $ First, we need to formulate a table with all the necessary rows and columns for this circuit: #image("static/10135.png") Next, we need a failure scenario. Let\'s suppose that resistor R#sub[2] were to fail shorted. We will assume that all other components maintain their original values. Because we\'ll be analyzing this circuit qualitatively rather than quantitatively, we won\'t be inserting any real numbers into the table. For any quantity unchanged after the component failure, we\'ll use the word \"same\" to represent \"no change from before.\" For any quantity that has changed as a result of the failure, we\'ll use a down arrow for \"decrease\" and an up arrow for \"increase.\" As usual, we start by filling in the spaces of the table for individual resistances and total voltage, our \"given\" values: #image("static/10136.png") The only \"given\" value different from the normal state of the circuit is R#sub[2], which we said was failed shorted (abnormally low resistance). 
All other initial values are the same as they were before, as represented by the \"same\" entries. All we have to do now is work through the familiar Ohm\'s Law and series-parallel principles to determine what will happen to all the other circuit values. First, we need to determine what happens to the resistances of parallel subsections R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4]. If neither R#sub[3] nor R#sub[4] have changed in resistance value, then neither will their parallel combination. However, since the resistance of R#sub[2] has decreased while R#sub[1] has stayed the same, their parallel combination must decrease in resistance as well: #image("static/10137.png") Now, we need to figure out what happens to the total resistance. This part is easy: when we\'re dealing with only one component change in the circuit, the change in total resistance will be in the same direction as the change of the failed component. This is not to say that the _magnitude_ of change between individual component and total circuit will be the same, merely the _direction_ of change. In other words, if any single resistor decreases in value, then the total circuit resistance must also decrease, and vice versa. In this case, since R#sub[2] is the only failed component, and its resistance has decreased, the total resistance _must_ decrease: #image("static/10138.png") Now we can apply Ohm\'s Law (qualitatively) to the Total column in the table. Given the fact that total voltage has remained the same and total resistance has decreased, we can conclude that total current must increase ($I=E/R$). In case you\'re not familiar with the qualitative assessment of an equation, it works like this. First, we write the equation as solved for the unknown quantity. In this case, we\'re trying to solve for current, given voltage and resistance: $ I = E / R $ Now that our equation is in the proper form, we assess what change (if any) will be experienced by \"I,\" given the change(s) to \"E\" and \"R\": $ I = (E "(same)") / (R arrow.b) $ If the denominator of a fraction decreases in value while the numerator stays the same, then the overall value of the fraction must increase: $ arrow.t I = (E "(same)") / (R arrow.b) $ Therefore, Ohm\'s Law ($I=E/R$) tells us that the current (I) will increase. We\'ll mark this conclusion in our table with an \"up\" arrow: #image("static/10142.png") With all resistance places filled in the table and all quantities determined in the Total column, we can proceed to determine the other voltages and currents. Knowing that the total resistance in this table was the result of R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4] in _series_, we know that the value of total current will be the same as that in R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4] (because series components share the same current). Therefore, if total current increased, then current through R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4] must also have increased with the failure of R#sub[2]: #image("static/10143.png") Fundamentally, what we\'re doing here with a qualitative usage of Ohm\'s Law and the rules of series and parallel circuits is no different from what we\'ve done before with numerical figures. In fact, its a lot easier because you don\'t have to worry about making an arithmetic or calculator keystroke error in a calculation. Instead, you\'re just focusing on the _principles_ behind the equations. 
From our table above, we can see that Ohm\'s Law should be applicable to the R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4] columns. For R#sub[3]\/\/R#sub[4], we figure what happens to the voltage, given an increase in current and no change in resistance. Intuitively, we can see that this must result in an increase in voltage across the parallel combination of R#sub[3]\/\/R#sub[4]: #image("static/10144.png") But how do we apply the same Ohm\'s Law formula ($E=I R$) to the R#sub[1]\/\/R#sub[2] column, where we have resistance decreasing _and_ current increasing? It\'s easy to determine if only one variable is changing, as it was with R#sub[3]\/\/R#sub[4], but with two variables moving around and no definite numbers to work with, Ohm\'s Law isn\'t going to be much help. However, there is another rule we can apply _horizontally_ to determine what happens to the voltage across R#sub[1]\/\/R#sub[2]: the rule for voltage in series circuits. If the voltages across R#sub[1]\/\/R#sub[2] and R#sub[3]\/\/R#sub[4] add up to equal the total (battery) voltage and we know that the R#sub[3]\/\/R#sub[4] voltage has increased while total voltage has stayed the same, then the voltage across R#sub[1]\/\/R#sub[2] _must_ have decreased with the change of R#sub[2]\'s resistance value: #image("static/10145.png") Now we\'re ready to proceed to some new columns in the table. Knowing that R#sub[3] and R#sub[4] comprise the parallel subsection R#sub[3]\/\/R#sub[4], and knowing that voltage is shared equally between parallel components, the increase in voltage seen across the parallel combination R#sub[3]\/\/R#sub[4] must also be seen across R#sub[3] and R#sub[4] individually: #image("static/10146.png") The same goes for R#sub[1] and R#sub[2]. The voltage decrease seen across the parallel combination of R#sub[1] and R#sub[2] will be seen across R#sub[1] and R#sub[2] individually: #image("static/10147.png") Applying Ohm\'s Law vertically to those columns with unchanged (\"same\") resistance values, we can tell what the current will do through those components. Increased voltage across an unchanged resistance leads to increased current. Conversely, decreased voltage across an unchanged resistance leads to decreased current: #image("static/10148.png") Once again we find ourselves in a position where Ohm's Law can't help us: for R#sub[2], both voltage and resistance have decreased, but without knowing _how much_ each one has changed, we can\'t use the $I=E/R$ formula to qualitatively determine the resulting change in current. However, we can still apply the rules of series and parallel circuits _horizontally_. We know that the current through the R#sub[1]\/\/R#sub[2] parallel combination has increased, and we also know that the current through R#sub[1] has decreased. One of the rules of parallel circuits is that total current is equal to the sum of the individual branch currents. In this case, the current through R#sub[1]\/\/R#sub[2] is equal to the current through R#sub[1] added to the current through R#sub[2]. If current through R#sub[1]\/\/R#sub[2] has increased while current through R#sub[1] has decreased, current through R#sub[2] _must_ have increased: #image("static/10149.png") And with that, our table of qualitative values stands completed. This particular exercise may look laborious due to all the detailed commentary, but the actual process can be performed very quickly with some practice. 
An important thing to realize here is that the general procedure is little different from quantitative analysis: start with the known values, then proceed to determining total resistance, then total current, then transfer figures of voltage and current as allowed by the rules of series and parallel circuits to the appropriate columns. A few general rules can be memorized to assist and/or to check your progress when proceeding with such an analysis: - For any _single_ component failure (open or shorted), the total resistance will always change in the same direction (either increase or decrease) as the resistance change of the failed component. - When a component fails shorted, its resistance always decreases. Also, the current through it will increase, and the voltage across it _may_ drop. I say \"may\" because in some cases it will remain the same (case in point: a simple parallel circuit with an ideal power source). - When a component fails open, its resistance always increases. The current through that component will decrease to zero, because it is an incomplete electrical path (no continuity). This _may_ result in an increase of voltage across it. The same exception stated above applies here as well: in a simple parallel circuit with an ideal voltage source, the voltage across an open-failed component will remain unchanged.
https://github.com/Enter-tainer/typst-preview
https://raw.githubusercontent.com/Enter-tainer/typst-preview/main/docs/book.typ
typst
MIT License
#import "@preview/book:0.2.3": * #show: book #book-meta( title: "Typst Preview Book", description: "Document for typst preview ", authors: ("Enter-tainer", "Myriad-Dreamin"), language: "en", repository: "https://github.com/Enter-tainer/typst-preview", summary: [ #prefix-chapter("intro.typ")[Get Started], = User Guide - #chapter("vscode.typ")[Use In VScode] - #chapter("config.typ")[Configuration] - #chapter("standalone.typ")[Standalone] = Developer Guide - #chapter("arch.typ")[Typst-Preview Architecture] - #chapter("dev.typ")[Set Up Development Environment] - #chapter("editor.typ")[Port Typst-Preview To Other Editors] ] ) // re-export page template #import "./templates/gh-page.typ": project #let book-page = project
https://github.com/peterhellberg/typ
https://raw.githubusercontent.com/peterhellberg/typ/main/README.md
markdown
# typ :printer: A small [Zig](https://ziglang.org/) ⚡ module, as a convenience for me when writing WebAssembly [plugins](https://typst.app/docs/reference/foundations/plugin/) for [Typst](https://typst.app/) > [!NOTE] > Initially based on the [hello.zig](https://github.com/astrale-sharp/wasm-minimal-protocol/blob/master/examples/hello_zig/hello.zig) > example in [wasm-minimal-protocol](https://github.com/astrale-sharp/wasm-minimal-protocol/) ## Requirements You will want to have a fairly recent [Zig](https://ziglang.org/download/#release-master) as well as the [Typst CLI](https://github.com/typst/typst?tab=readme-ov-file#installation) > [!IMPORTANT] > I had to `rustup default 1.79.0` when compiling the latest `typst` > as there were some breaking change in `1.80.0` > [!TIP] > Some of the software that I have installed for a pretty > nice **Typst** workflow in [Neovim](https://neovim.io/): > > - https://github.com/nvarner/typst-lsp > - https://github.com/kaarmu/typst.vim > - https://github.com/chomosuke/typst-preview.nvim ## Usage Use `zig fetch` to add a `.typ` to the `.dependencies` in your `build.zig.zon` ```console zig fetch --save https://github.com/peterhellberg/typ/archive/refs/tags/v0.0.9.tar.gz ``` > [!NOTE] > You should now be able to update your `build.zig` as described below. #### `build.zig` ```zig const std = @import("std"); pub fn build(b: *std.Build) void { const target = b.resolveTargetQuery(.{ .cpu_arch = .wasm32, .os_tag = .freestanding, }); const hello = b.addExecutable(.{ .name = "hello", .root_source_file = b.path("hello.zig"), .strip = true, .target = target, .optimize = .ReleaseSmall, }); const typ = b.dependency("typ", .{}).module("typ"); hello.root_module.addImport("typ", typ); hello.entry = .disabled; hello.rdynamic = true; b.installArtifact(hello); } ``` #### `hello.zig` ```zig const typ = @import("typ"); export fn hello() i32 { const msg = "*Hello* from `hello.wasm` written in Zig!"; return typ.str(msg); } export fn echo(len: usize) i32 { var res = typ.alloc(u8, len * 2) catch return 1; defer typ.free(res); typ.in(res.ptr); for (0..len) |i| { res[i + len] = res[i]; } return typ.ok(res); } ``` #### `hello.typ` ```typst #set page(width: 10cm, height: 10cm) #set text(font: "Inter") == A WebAssembly plugin for Typst #line(length: 100%) #emph[Typst is capable of interfacing with plugins compiled to WebAssembly.] #line(length: 100%) #let p = plugin("zig-out/bin/hello.wasm") #eval(str(p.hello()), mode: "markup") #eval(str(p.echo(bytes("1+2"))), mode: "code") ``` #### Expected output ![hello.png](https://github.com/user-attachments/assets/a1cd9c86-ef94-4d1f-a44c-b958475f79b0)
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/zen-zine/0.1.0/README.md
markdown
Apache License 2.0
# zen-zine Excellently type-set a cute little zine about your favorite topic! Providing your eight pages in order will produce a US-Letter page with the content in a layout ready to be folded into a zine! The content is wrapped before movement so that padding and alignment are respected. Here is the template and its preview: ```typst #import "@preview/zen-zine:0.1.0": zine #set document(author: "Tom", title: "Zen Zine Example") #set text(font: "Linux Libertine", lang: "en") #let my_eight_pages = ( range(8).map( number => [ #pad(2em, text(10em, align(center, str(number)))) ] ) ) // provide your content pages in order and they // are placed into the zine template positions. // the content is wrapped before movement so that // padding and alignment are respected. #zine( // draw_border: true, // zine_page_margin: 5pt, contents: my_eight_pages ) ``` ![Image of Template](template/preview.png) ## Improvement Ideas Roughly in order of priority. - Write documentation and generate a manual - Deduce `page` properties so that user can change the page they wish to use. - Make sure the page is `flipped` and deduce the zine page width and height from the full page width and height (and the zine margin) - I'm currently struggling with finding out the page properties (what's the `#get` equivalent to `#set`?) - Add other zine sizes (there is a 16 page one I believe?) - Digital mode where zine pages are separate pages (of the same size) rather than 'sub pages' of a printer page
https://github.com/iceghost/resume
https://raw.githubusercontent.com/iceghost/resume/main/main.typ
typst
#set page(paper: "a4", margin: (x: 1cm, bottom: 1.5cm, top: 1.5cm)) #set block(width: 100%) #set terms(separator: [: ], hanging-indent: 0em) #set par(linebreaks: "optimized") #set text(hyphenate: false, size: 10pt) #show heading.where(level: 2): set block( inset: (top: .5em, bottom: .5em), outset: (left: 0.2cm), above: 1.3em, fill: gray.lighten(90%) ) #show heading.where(level: 3): set block(above: 1.5em, below: 1em) #show "/": text.with(fill: gray) #show link: underline.with(stroke: gray) #show: columns #stack( dir: ltr, image("images/portrait.jpg", height: 2.5cm), 1fr, align(horizon)[ #box(text(weight: 900, size: 1.6em)[<NAME>]) #set text(size: 0.9em) 3#super[rd]-year Student #box(image("images/github-mark.png", height: 1em), baseline: 20%) #link("https://github.com/iceghost")[iceghost] #sym.bar.v #box(image("images/place.png", height: 1em), baseline: 20%) Cu Chi, HCMC \ #box(image("images/phone.png", height: 1em), baseline: 20%) 0394282309 #sym.bar.v #box(image("images/email.png", height: 1em), baseline: 20%) <EMAIL> ], 1fr ) #show "@": text.with(fill: gray) #include "1-profile.typ" #include "2-education.typ" #include "3-activities.typ" #include "4-skills.typ" #colbreak() #include "5-projects.typ"
https://github.com/floriandejonckheere/utu-thesis
https://raw.githubusercontent.com/floriandejonckheere/utu-thesis/master/thesis/figures/06-automated-modularization/publications-by-year.typ
typst
#import "@preview/cetz:0.2.2": canvas, chart, draw #let slr = yaml("/bibliography/literature-review.yml") #let publications_by_year = ( "2014": 0, "2015": 0, "2016": 0, "2017": 0, "2018": 0, "2019": 0, "2020": 0, "2021": 0, "2022": 0, "2023": 0, "2024": 0, ) #for (platform) in slr.platforms.keys() { let pubs = yaml("/bibliography/literature-review/" + platform + ".yml") for (key) in slr.platforms.at(platform).primary { let year = str(pubs.at(key).date).split("-").first() publications_by_year.insert(year, publications_by_year.at(year) + 1) } } #canvas({ chart.columnchart( size: (14, 3), y-tick-step: 2, publications_by_year.pairs(), ) })
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cetz/0.2.0/src/lib/decorations.typ
typst
Apache License 2.0
#import "decorations/brace.typ": brace, brace-default-style, flat-brace, flat-brace-default-style #import "decorations/path.typ": zigzag, wave, coil
https://github.com/jakobjpeters/Typstry.jl
https://raw.githubusercontent.com/jakobjpeters/Typstry.jl/main/docs/source/references/strings.md
markdown
MIT License
# Strings ```@eval using Markdown, Typstry Markdown.parse("This reference documents " * lowercasefirst(split(string(@doc Typstry.Strings), "\n")[5])) ``` ## `Typstry` ```@docs Mode Typst TypstString TypstText @typst_str code markup math context show_typst(::Any, ::AbstractChar) ``` ## `Base` ```@docs IOBuffer codeunit isvalid iterate(::TypstString) ncodeunits pointer repr show(::IO, ::TypstString) show(::IO, ::MIME"text/typst", ::Typst) ```
https://github.com/csimide/SEU-Typst-Template
https://raw.githubusercontent.com/csimide/SEU-Typst-Template/master/seu-thesis/templates/degree.typ
typst
MIT License
#import "../pages/cover-degree-fn.typ": degree-cover-conf #import "../pages/title-page-degree-cn-fn.typ": title-cn-conf #import "../pages/title-page-degree-en-fn.typ": title-en-conf #import "../pages/statement-degree-fn.typ": degree-statement-conf #import "../parts/abstract-degree-fn.typ": abstract-conf #import "../parts/outline-degree-fn.typ": outline-conf #import "../parts/terminology.typ": terminology-conf #import "../parts/main-body-degree-fn.typ": main-body-bachelor-conf #import "../utils/set-degree.typ": set-degree #import "../utils/smart-pagebreak.typ": gen-smart-pagebreak #import "../utils/thanks.typ": thanks #import "../utils/show-appendix.typ": show-appendix-degree #let degree-utils = (thanks, show-appendix-degree) #let degree-conf( author: (CN: "王东南", EN: "<NAME>", ID: "012345"), thesis-name: ( CN: "硕士学位论文", EN: [ A Thesis submitted to \ Southeast University \ For the Academic Degree of Master of Touching Fish ], heading: "东南大学硕士学位论文", ), title: ( CN: "摸鱼背景下的Typst模板使用研究", EN: "A Study of the Use of the Typst Template During Touching Fish", ), advisors: ( (CN: "湖牌桥", EN: "HU Pai-qiao", CN-title: "教授", EN-title: "Prof."), ( CN: "苏锡浦", EN: "SU Xi-pu", CN-title: "副教授", EN-title: "Associate Prof.", ), ), school: ( CN: "摸鱼学院", EN: "School of Touchingfish", ), major: ( main: "摸鱼科学", submajor: "计算机摸鱼", ), degree: "摸鱼学硕士", category-number: "N94", secret-level: "公开", UDC: "303", school-number: "10286", committee-chair: "张三 教授", readers: ( "李四 副教授", "王五 副教授", ), date: ( CN: ( defend-date: "2099年01月02日", authorize-date: "2099年01月03日", finish-date: "2024年01月15日", ), EN: ( finish-date: "Jan 15, 2024", ), ), thanks: "本论文受到摸鱼基金委的基金赞助(123456)", degree-form: "应用研究", cn-abstract: [示例摘要], cn-keywords: ("关键词1", "关键词2"), en-abstract: [#lorem(100)], en-keywords: ("Keywords1", "Keywords2"), always-start-odd: false, terminology: none, anonymous: false, skip-with-page-blank: false, bilingual-bib: true, first-level-title-page-disable-heading: false, doc, ) = { let smart-pagebreak = gen-smart-pagebreak.with( skip-with-page-blank: skip-with-page-blank, always-skip-even: always-start-odd, ) show: set-degree.with( always-new-page: smart-pagebreak, bilingual-bib: bilingual-bib, ) degree-cover-conf( author: author, thesis-name: thesis-name, title: title, advisors: advisors, school: school, major: major, degree: degree, category-number: category-number, secret-level: secret-level, UDC: UDC, school-number: school-number, committee-chair: committee-chair, readers: readers, date: date, degree-form: degree-form, anonymous: anonymous, ) smart-pagebreak() title-cn-conf( author: author, thesis-name: thesis-name, title: title, advisors: advisors, school: school, major: major, date: date, thanks: thanks, anonymous: false, ) smart-pagebreak() title-en-conf( author: author, thesis-name: thesis-name, title: title, advisors: advisors, school: school, date: date, anonymous: false, ) smart-pagebreak() degree-statement-conf() smart-pagebreak() abstract-conf( cn-abstract: cn-abstract, cn-keywords: cn-keywords, en-abstract: en-abstract, en-keywords: en-keywords, page-break: smart-pagebreak, ) smart-pagebreak() outline-conf() if not terminology in (none, [], [ ], "") { smart-pagebreak() terminology-conf(terminology) } smart-pagebreak(skip-with-page-blank: true) show: main-body-bachelor-conf.with( thesis-name: thesis-name, first-level-title-page-disable-heading: first-level-title-page-disable-heading, ) doc } #show: degree-conf.with( author: (CN: "王东南", EN: "<NAME>", ID: "012345"), thesis-name: ( CN: "硕士学位论文", EN: [ A Thesis 
submitted to \ Southeast University \ For the Academic Degree of Master of Touching Fish ], heading: "东南大学硕士学位论文", ), title: ( CN: "摸鱼背景下的Typst模板使用研究", EN: "A Study of the Use of the Typst Template During Touching Fish", ), advisors: ( (CN: "湖牌桥", EN: "<NAME>", CN-title: "教授", EN-title: "Prof."), ( CN: "苏锡浦", EN: "<NAME>", CN-title: "副教授", EN-title: "Associate Prof.", ), ), school: ( CN: "摸鱼学院", EN: "School of Touchingfish", ), major: ( main: "摸鱼科学", submajor: "计算机摸鱼", ), degree: "摸鱼学硕士", category-number: "N94", secret-level: "公开", UDC: "303", school-number: "10286", committee-chair: "<NAME>", readers: ( "李四 副教授", "王五 副教授", ), date: ( CN: ( defend-date: "2099年01月02日", authorize-date: "2099年01月03日", finish-date: "2024年01月15日", ), EN: ( finish-date: "Jan 15, 2024", ), ), thanks: "本论文受到摸鱼基金委的基金赞助(123456)", degree-form: "应用研究", cn-abstract: [示例摘要], cn-keywords: ("关键词1", "关键词2"), en-abstract: [#lorem(100)], en-keywords: ("Keywords1", "Keywords2"), always-start-odd: true, terminology: none, anonymous: false, )
https://github.com/dead-summer/math-notes
https://raw.githubusercontent.com/dead-summer/math-notes/main/notes/ScientificComputing/ch1-intro-to-scicomp/computer-representation-of-numbers.typ
typst
#import "/book.typ": book-page #import "../../../templates/conf.typ": * #import "@preview/mitex:0.2.4": * #show: book-page.with(title: "Computer Representation of Numbers") #show: codly-init.with() #codly_init() = 3 Computer Representation of Numbers We should first recognize that real numbers (numbers with decimal points) can only be represented by finite precision in computers. Typically, computer representation has two precisions: - *Single precision* : computer round-off $#mi("{\epsilon }_{1} = {2}^{-{23}} \approx 1.19 \\times {10}^{-7}")$ ; - *Double precision* : computer round-off $#mi("{\epsilon }_{2} = {2}^{-{52}} \approx 2.22 \\times {10}^{-{16}}")$ . The corresponding numbers are called floating point numbers. The finite precision introduces computer round-offs, which contributes to the major part of implementation errors. ```cpp #include <cmath> #include <cstdio> #include <cstdlib> #include <iostream> int main(int argc, char *argv[ ]) { float eps = M PI; int count = 0; while (eps > 0.0) { count++; std::cout << "count = " << count << ", eps = " << eps << std::endl; eps *= 0.1; } return EXIT SUCCESS; } ``` ```cpp #include <cmath> #include <cstdio> #include <cstdlib> #include <iostream> int main(int argc, char *argv[ ]) { float eps = M PI; int count = 0; float c = 1.0 + eps; while (c > 1.0) { count++; std::cout << "count = " << count << ", eps = " << eps << std::endl; eps *= 0.1; c = 1.0 + eps; } return EXIT SUCCESS; } ``` The next code tells us what value the computer round-off of floating point data is by the single precision. It is $epsilon.alt_(1) = 2^(-23)$ , approximately equal to $1.19 times 10^(-7)$ . ```cpp #include <cmath> #include <cstdio> #include <cstdlib> #include <iostream> int main(int argc, char *argv[ ]) { int count = 0; float eps = 1.0; float c = 1.0 + eps; while (c > 1.0) { count++; std::cout << "count = " << count << ", eps = " << eps << std::endl; eps *= 0.5; c = 1.0 + eps; } return EXIT SUCCESS; } ``` The next code tells us what value the computer round-off of floating point data is by the double precision. It is $epsilon.alt_2 = 2^(-52)$ , approximately equal to $2.22 times 10^(-16)$ . ```cpp #include <cmath> #include <cstdio> #include <cstdlib> #include <iostream> int main(int argc, char *argv[ ]) { int count = 0; double eps = 1.0; double c = 1.0 + eps; while (c > 1.0) { count++; std::cout << "count = " << count << ", eps = " << eps << std::endl; eps *= 0.5; c = 1.0 + eps; } return EXIT SUCCESS; } ```
https://github.com/typst/packages
https://raw.githubusercontent.com/typst/packages/main/packages/preview/cheda-seu-thesis/0.2.1/seu-thesis/pages/statement-bachelor-ic.typ
typst
Apache License 2.0
#import "../utils/fonts.typ": 字体, 字号 #page(paper: "a4", margin: (top: 2cm+0.5cm, bottom: 2cm+0.5cm, left: 2cm+0.25cm, right: 2cm+0.25cm), { v(80pt-0.4cm) set align(left) set text(font: 字体.宋体, size: 字号.小四, lang: "zh", region: "cn") set par(leading: 11.3pt, justify: true, first-line-indent: 0pt) set line(stroke: 0.6pt) align(center, text(font: 字体.黑体, size: 字号.小二)[东南大学毕业(设计)论文独创性声明]) v(24pt) h(2em) [本人声明所呈交的毕业(设计)论文是我个人在导师指导下进行的研究工作及取得的研究成果。尽我所知,除了文中特别加以标注和致谢的地方外,论文中不包含其他人已经发表或撰写过的研究成果,也不包含为获得东南大学或其它教育机构的学位或证书而使用过的材料。与我一同工作的同志对本研究所做的任何贡献均已在论文中作了明确的说明并表示了谢意。] v(18pt) grid( columns: 11, h(48pt), [论文作者签名:], line(length: 8em, start: (6pt, 10pt)), h(18pt), [日期:], line(length: 3.5em, start: (6pt, 10pt)), [年], line(length: 2.5em, start: (2pt, 10pt)), [月], line(length: 2.5em, start: (2pt, 10pt)), [日], ) v(140pt) align(center, text(font: 字体.黑体, size: 字号.小二)[东南大学毕业(设计)论文使用授权声明]) v(13pt) h(2em) [东南大学有权保留本人所送交毕业(设计)论文的复印件和电子文档,可以采用影印、缩印或其他复制手段保存论文。本人电子文档的内容和纸质论文的内容相一致。除在保密期内的保密论文外,允许论文被查阅和借阅,可以公布(包括刊登)论文的全部或部分内容。论文的公布(包括刊登)授权东南大学教务处办理。] v(16pt) grid( columns: (24pt, 200pt, 22pt, 1fr), row-gutter: 10pt, rows: 2, [], grid( columns: 2, [论文作者签名:], line(length: 115pt, start: (6pt, 10pt)), ), [], grid( columns: 2, [导师签名:], line(length: 140pt, start: (6pt, 10pt)), ), [], grid( columns: 7, [日期:], line(length: 3.5em, start: (6pt, 10pt)), [年], line(length: 2.9em, start: (2pt, 10pt)), [月], line(length: 2.9em, start: (2pt, 10pt)), [日], ), [], grid( columns: 7, [日期:], line(length: 2.5em, start: (6pt, 10pt)), [年], line(length: 2.9em, start: (2pt, 10pt)), [月], line(length: 2.9em, start: (2pt, 10pt)), [日], ), ) })
https://github.com/GYPpro/Java-coures-report
https://raw.githubusercontent.com/GYPpro/Java-coures-report/main/.VSCodeCounter/2023-12-14_23-06-43/results.md
markdown
# Summary

Date : 2023-12-14 23:06:43

Directory d:\\Desktop\\Document\\Coding\\JAVA\\Rep\\Java-coures-report

Total : 37 files, 2747 codes, 167 comments, 480 blanks, all 3394 lines

Summary / [Details](details.md) / [Diff Summary](diff.md) / [Diff Details](diff-details.md)

## Languages

| language | files | code | comment | blank | total |
| :--- | ---: | ---: | ---: | ---: | ---: |
| Java | 35 | 2,567 | 165 | 439 | 3,171 |
| Typst | 1 | 179 | 2 | 39 | 220 |
| Markdown | 1 | 1 | 0 | 2 | 3 |

## Directories

| path | files | code | comment | blank | total |
| :--- | ---: | ---: | ---: | ---: | ---: |
| . | 37 | 2,747 | 167 | 480 | 3,394 |
| . (Files) | 3 | 236 | 9 | 56 | 301 |
| rubbish | 1 | 73 | 0 | 15 | 88 |
| sis1 | 1 | 26 | 0 | 3 | 29 |
| sis2 | 2 | 192 | 15 | 22 | 229 |
| sis3 | 2 | 63 | 5 | 11 | 79 |
| sis4 | 1 | 34 | 0 | 3 | 37 |
| sis5 | 5 | 501 | 31 | 81 | 613 |
| sis6 | 2 | 124 | 8 | 19 | 151 |
| sis7 | 3 | 261 | 1 | 35 | 297 |
| sis8 | 10 | 435 | 18 | 120 | 573 |
| sis9 | 7 | 802 | 80 | 115 | 997 |

Summary / [Details](details.md) / [Diff Summary](diff.md) / [Diff Details](diff-details.md)
https://github.com/LDemetrios/Typst4k
https://raw.githubusercontent.com/LDemetrios/Typst4k/master/src/test/resources/suite/layout/grid/footers.typ
typst
--- grid-footer --- #set page(width: auto, height: 15em) #set text(6pt) #set table(inset: 2pt, stroke: 0.5pt) #table( columns: 5, align: center + horizon, table.header( table.cell(colspan: 5)[*Cool Zone*], table.cell(stroke: red)[*Name*], table.cell(stroke: aqua)[*Number*], [*Data 1*], [*Data 2*], [*Etc*], table.hline(start: 2, end: 3, stroke: yellow) ), ..range(0, 5).map(i => ([John \##i], table.cell(stroke: green)[123], table.cell(stroke: blue)[456], [789], [?], table.hline(start: 4, end: 5, stroke: red))).flatten(), table.footer( table.hline(start: 2, end: 3, stroke: yellow), table.cell(stroke: red)[*Name*], table.cell(stroke: aqua)[*Number*], [*Data 1*], [*Data 2*], [*Etc*], table.cell(colspan: 5)[*Cool Zone*] ) ) --- grid-footer-gutter-and-no-repeat --- // Gutter & no repetition #set page(width: auto, height: 16em) #set text(6pt) #set table(inset: 2pt, stroke: 0.5pt) #table( columns: 5, gutter: 2pt, align: center + horizon, table.header( table.cell(colspan: 5)[*Cool Zone*], table.cell(stroke: red)[*Name*], table.cell(stroke: aqua)[*Number*], [*Data 1*], [*Data 2*], [*Etc*], table.hline(start: 2, end: 3, stroke: yellow) ), ..range(0, 5).map(i => ([John \##i], table.cell(stroke: green)[123], table.cell(stroke: blue)[456], [789], [?], table.hline(start: 4, end: 5, stroke: red))).flatten(), table.footer( repeat: false, table.hline(start: 2, end: 3, stroke: yellow), table.cell(stroke: red)[*Name*], table.cell(stroke: aqua)[*Number*], [*Data 1*], [*Data 2*], [*Etc*], table.cell(colspan: 5)[*Cool Zone*] ) ) --- grid-cell-override-in-header-and-footer --- #table( table.header(table.cell(stroke: red)[Hello]), table.footer(table.cell(stroke: aqua)[Bye]), ) --- grid-cell-override-in-header-and-footer-with-gutter --- #table( gutter: 3pt, table.header(table.cell(stroke: red)[Hello]), table.footer(table.cell(stroke: aqua)[Bye]), ) --- grid-footer-top-stroke --- // Footer's top stroke should win when repeated, but lose at the last page. 
#set page(height: 10em) #table( stroke: green, table.header(table.cell(stroke: red)[Hello]), table.cell(stroke: yellow)[Hi], table.cell(stroke: yellow)[Bye], table.cell(stroke: yellow)[Ok], table.footer[Bye], ) --- grid-footer-relative-row-sizes --- // Relative lengths #set page(height: 10em) #table( rows: (30%, 30%, auto), [C], [C], table.footer[*A*][*B*], ) --- grid-footer-cell-with-y --- #grid( grid.footer(grid.cell(y: 2)[b]), grid.cell(y: 0)[a], grid.cell(y: 1)[c], ) --- grid-footer-expand --- // Ensure footer properly expands #grid( columns: 2, [a], [], [b], [], grid.cell(x: 1, y: 3, rowspan: 4)[b], grid.cell(y: 2, rowspan: 2)[a], grid.footer(), grid.cell(y: 4)[d], grid.cell(y: 5)[e], grid.cell(y: 6)[f], ) --- grid-footer-not-at-last-row --- // Error: 2:3-2:19 footer must end at the last row #grid( grid.footer([a]), [b], ) --- grid-footer-not-at-last-row-two-columns --- // Error: 3:3-3:19 footer must end at the last row #grid( columns: 2, grid.footer([a]), [b], ) --- grid-footer-overlap --- // Error: 4:3-4:19 footer would conflict with a cell placed before it at column 1 row 0 // Hint: 4:3-4:19 try reducing that cell's rowspan or moving the footer #grid( columns: 2, grid.header(), grid.footer([a]), grid.cell(x: 1, y: 0, rowspan: 2)[a], ) --- grid-footer-multiple --- // Error: 4:3-4:19 cannot have more than one footer #grid( [a], grid.footer([a]), grid.footer([b]), ) --- table-footer-in-grid --- // Error: 3:3-3:20 cannot use `table.footer` as a grid footer // Hint: 3:3-3:20 use `grid.footer` instead #grid( [a], table.footer([a]), ) --- grid-footer-in-table --- // Error: 3:3-3:19 cannot use `grid.footer` as a table footer // Hint: 3:3-3:19 use `table.footer` instead #table( [a], grid.footer([a]), ) --- grid-footer-in-grid-header --- // Error: 14-28 cannot place a grid footer within another footer or header #grid.header(grid.footer[a]) --- table-footer-in-grid-header --- // Error: 14-29 cannot place a table footer within another footer or header #grid.header(table.footer[a]) --- grid-footer-in-table-header --- // Error: 15-29 cannot place a grid footer within another footer or header #table.header(grid.footer[a]) --- table-footer-in-table-header --- // Error: 15-30 cannot place a table footer within another footer or header #table.header(table.footer[a]) --- grid-footer-in-grid-footer --- // Error: 14-28 cannot place a grid footer within another footer or header #grid.footer(grid.footer[a]) --- table-footer-in-grid-footer --- // Error: 14-29 cannot place a table footer within another footer or header #grid.footer(table.footer[a]) --- grid-footer-in-table-footer --- // Error: 15-29 cannot place a grid footer within another footer or header #table.footer(grid.footer[a]) --- table-footer-in-table-footer --- // Error: 15-30 cannot place a table footer within another footer or header #table.footer(table.footer[a]) --- grid-header-in-grid-footer --- // Error: 14-28 cannot place a grid header within another header or footer #grid.footer(grid.header[a]) --- table-header-in-grid-footer --- // Error: 14-29 cannot place a table header within another header or footer #grid.footer(table.header[a]) --- grid-header-in-table-footer --- // Error: 15-29 cannot place a grid header within another header or footer #table.footer(grid.header[a]) --- table-header-in-table-footer --- // Error: 15-30 cannot place a table header within another header or footer #table.footer(table.header[a]) --- grid-header-footer-block-with-fixed-height --- #set page(height: 17em) #table( rows: (auto, 2.5em, auto), 
table.header[*Hello*][*World*], block(width: 2em, height: 10em, fill: red), table.footer[*Bye*][*World*], ) --- grid-header-footer-and-rowspan-non-contiguous-1 --- // Rowspan sizing algorithm doesn't do the best job at non-contiguous content // ATM. #set page(height: 20em) #table( rows: (auto, 2.5em, 2em, auto, 5em, 2em, 2.5em), table.header[*Hello*][*World*], table.cell(rowspan: 3, lorem(20)), table.footer[*Ok*][*Bye*], ) --- grid-header-footer-and-rowspan-non-contiguous-2 --- // This should look right #set page(height: 20em) #table( rows: (auto, 2.5em, 2em, auto), gutter: 3pt, table.header[*Hello*][*World*], table.cell(rowspan: 3, lorem(20)), table.footer[*Ok*][*Bye*], ) --- grid-header-and-footer-lack-of-space --- // Test lack of space for header + text. #set page(height: 9em + 2.5em + 1.5em) #table( rows: (auto, 2.5em, auto, auto, 10em, 2.5em, auto), gutter: 3pt, table.header[*Hello*][*World*], table.cell(rowspan: 3, lorem(30)), table.footer[*Ok*][*Bye*], ) --- grid-header-and-footer-orphan-prevention --- // Orphan header prevention test #set page(height: 13em) #v(8em) #grid( columns: 3, gutter: 5pt, grid.header( [*Mui*], [*A*], grid.cell(rowspan: 2, fill: orange)[*B*], [*Header*], [*Header* #v(0.1em)], ), ..([Test], [Test], [Test]) * 7, grid.footer( [*Mui*], [*A*], grid.cell(rowspan: 2, fill: orange)[*B*], [*Footer*], [*Footer* #v(0.1em)], ), ) --- grid-header-and-footer-empty --- // Empty footer should just be a repeated blank row #set page(height: 8em) #table( columns: 4, align: center + horizon, table.header(), ..range(0, 2).map(i => ( [John \##i], table.cell(stroke: green)[123], table.cell(stroke: blue)[456], [789] )).flatten(), table.footer(), ) --- grid-header-and-footer-containing-rowspan --- // When a footer has a rowspan with an empty row, it should be displayed // properly #set page(height: 14em, width: auto) #let count = counter("g") #table( rows: (auto, 2em, auto, auto), table.header( [eeec], table.cell(rowspan: 2, count.step() + context count.display()), ), [d], block(width: 5em, fill: yellow, lorem(7)), [d], table.footer( [eeec], table.cell(rowspan: 2, count.step() + context count.display()), ) ) #context count.display() --- grid-nested-with-footers --- // Nested table with footer should repeat both footers #set page(height: 10em, width: auto) #table( table( [a\ b\ c\ d], table.footer[b], ), table.footer[a], ) --- grid-nested-footers --- #set page(height: 12em, width: auto) #table( [a\ b\ c\ d], table.footer(table( [c], [d], table.footer[b], )) ) --- grid-footer-rowspan --- // General footer-only tests #set page(height: 9em) #table( columns: 2, [a], [], [b], [], [c], [], [d], [], [e], [], table.footer( [*Ok*], table.cell(rowspan: 2)[test], [*Thanks*] ) ) --- grid-footer-bare-1 --- #set page(height: 5em) #table( table.footer[a][b][c] ) --- grid-footer-bare-2 --- #table(table.footer[a][b][c]) #table( gutter: 3pt, table.footer[a][b][c] ) --- grid-footer-stroke-edge-cases --- // Test footer stroke priority edge case #set page(height: 10em) #table( columns: 2, stroke: black, ..(table.cell(stroke: aqua)[d],) * 8, table.footer( table.cell(rowspan: 2, colspan: 2)[a], [c], [d] ) ) --- grid-footer-hline-and-vline-1 --- // Footer should appear at the bottom. Red line should be above the footer. // Green line should be on the left border. 
#set page(margin: 2pt) #set text(6pt) #table( columns: 2, inset: 1.5pt, table.cell(y: 0)[a], table.cell(x: 1, y: 1)[a], table.cell(y: 2)[a], table.footer( table.hline(stroke: red), table.vline(stroke: green), [b], ), table.cell(x: 1, y: 3)[c] ) --- grid-footer-hline-and-vline-2 --- // Table should be just one row. [c] appears at the third column. #set page(margin: 2pt) #set text(6pt) #table( columns: 3, inset: 1.5pt, table.cell(y: 0)[a], table.footer( table.hline(stroke: red), table.hline(y: 1, stroke: aqua), table.cell(y: 0)[b], [c] ) ) --- grid-footer-below-rowspans --- // Footer should go below the rowspans. #set page(margin: 2pt) #set text(6pt) #table( columns: 2, inset: 1.5pt, table.cell(rowspan: 2)[a], table.cell(rowspan: 2)[b], table.footer() )
https://github.com/jgm/typst-hs
https://raw.githubusercontent.com/jgm/typst-hs/main/test/typ/layout/table-01.typ
typst
Other
#table(columns: 3, stroke: none, fill: green, [A], [B], [C])
https://github.com/taooceros/MATH-542-HW
https://raw.githubusercontent.com/taooceros/MATH-542-HW/main/HW7/HW7.typ
typst
#import "@local/homework-template:0.1.0": * // Take a look at the file `template.typ` in the file panel // to customize this template and discover how it works. #show: project.with(title: "Math 542 HW7", authors: ("<NAME>",)) = 13.1 == 2 Show that $x^3 - 2x -2$ is irreducible over $QQ$ and let $theta$ be a root. Computer $(1+theta)(1+theta+theta^2)$ and $(1+theta)/(1+theta+theta^2)$ in $QQ(theta)$. #solution[ $ (1 + theta)(1 + theta + theta^2) &= 1 + theta + theta^2 + theta + theta^2 + theta^3 \ &= 1 + 2theta + 2 theta^2 + theta^3 = 1 + 2theta +2theta^2 + 2theta + 2\ &= 3 + 4theta + 2theta^2 $ We want to find the inverse of $(1+theta+theta^2)$, thus by euclidean algorithm $ x^3-2x-2 &= (x^2 + x + 1)(x-1) + (-2x - 1)\ x^2+x+1 &= (-2x-1)(-1/2x - 1/4) + (3/4)\ &= (x^3-2x-2-(x^2+x+1)(x-1))(-1/2x - 1/4) + (3/4)\ -3/4 &= (x^3-2x-2)(-1/2 x-1/4) - (x^2+x+1)((-1/2x-1/4)(x-1)+1)\ (x^2+x+1)^(-1) &= 4/3((-1/2x-1/4)(x-1) + 1) = (-2x^2 + x + 5)/3 $ Then $ (1+theta)/(1+theta+theta^2) = -(2 θ^3)/3 - θ^2/3 + 2 θ + 5/3 = - theta^2/3 + 2/3 theta + 1/3 $ ] == 5 Suppose $alpha$ is a rational root of a monic polynomial in $ZZ[x]$. Prove that $alpha$ is an integer. #solution[ Suppose $alpha = n/d$ where $abs(d) > 1$. The polynomial can be written as $ a_n x^n + a_(n-1) x^(n-1) + ... + a_1 x + a_0 $ Thus $ (n/d)^n + a_(n-1) (n/d)^(n-1) + ... + a_1 (n/d) + a_0 = 0 \ $ $ -(n/d)^n &= a_(n-1) (n/d)^(n-1) + ... + a_1 (n/d) + a_0 \ &= b/d^(n-1) $ for some $b in ZZ$. Since $abs(d) > 1$, we have reached a contradiction. ] = 13.4 == 3 Splitting field over $QQ$ for $x^4 + x^2 + 1$. #solution[ $ x^4 + x^2 + 1 = (x^2-x+1)(x^2+x+1) $ Thus we can find that it has 4 roots $ (-1 - i sqrt(3))/2, (1 + i sqrt(3))/2, (1-i sqrt(3))/2, (-1 + i sqrt(3))/2 $ Thus we have the splitting field $ QQ(sqrt(3)) $ ] == 4 Determine the splitting field and its degree over $QQ$ for $x^6 - 4$. #solution[ Note that $x^6-4 = 0 => x^6 = 4 => x^6 = (root(6,4)) dot 1 = root(3,2) dot 1$. Thus the splitting field need to contain all the root of the polynomial, which is $ QQ(root(3,2), root(3,2) zeta(6), root(3,2) zeta(6), root(3,2) zeta(6), root(3,2) zeta(6), root(3,2) zeta(6)) $ The degree is 6. ] == 5 Let $K$ be a finite extension of $F$. Prove that $K$ is a splitting field over $F$ if and only if every irreducible polynomial in $F[x]$ that has a root in $K$ splits completely in $K[x]$. #let div = $\/$ #solution[ Denote the polynomial as $p in F[x]$. We know that $k$ is a splitting field of $p$, and thus $k cong F[x] div p$. Assume there are two roots $alpha, beta$ in $k$ such that $alpha in k$ and $beta in.not k$. We know that $F[alpha] cong F[x] div p cong F[beta]$. Thus we have an isomorphism $phi : F[alpha] cong F[beta]$. Consider the splitting field of $p$ denoted as $k$, thus we have an injective map from $F[alpha] arrow.hook k$, and $F[beta] arrow.hook k$. Then we consider the algebratic closure of $F$ noted as $overline(F)$. Automatically we have an isomorphism that extends $phi$ to $overline(F)$. Restricting $phi$ to $k$, we have a homomorphism $overline(phi) : k -> k$ that sends $alpha arrow.bar beta$, which means $beta in k$. This is a contradiction. The other direction follows as definition of splitting field. ]
https://github.com/herbhuang/utdallas-thesis-template-typst
https://raw.githubusercontent.com/herbhuang/utdallas-thesis-template-typst/main/utils/todo.typ
typst
MIT License
#let TODO(body, color: yellow) = {
  rect(
    width: 100%,
    radius: 3pt,
    stroke: 0.5pt,
    fill: color,
  )[
    #body
  ]
}